Azure Cloud Data Engineer | Hybrid

Location: Makati · Schedule: Full-time
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience), with demonstrated proficiency in programming fundamentals.
  • Proven experience as a Data Engineer, or in a similar role working with data and ETL processes.
  • Strong knowledge of Microsoft Azure services, including Azure Data Factory, Azure Synapse, Azure Databricks, Azure Blob Storage, and Azure Data Lake Storage Gen2.
  • Experience using SQL DML to query modern relational databases efficiently (e.g., SQL Server, PostgreSQL).
  • Strong understanding of Software Engineering principles and how they apply to Data Engineering (e.g., CI/CD, version control, testing).
  • Experience with big data technologies (e.g., Spark).
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
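To illustrate the SQL requirement above, here is a minimal sketch of set-based querying; the table and column names are hypothetical, and the standard-library sqlite3 module stands in for SQL Server or PostgreSQL:

```python
import sqlite3

# In-memory database standing in for a production RDBMS (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 200.0)],
)

# Set-based DML: aggregate in one round trip rather than looping row by row
# in application code.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 200.0), ('EMEA', 200.0)]
```

Pushing the aggregation into the database engine, as above, is what "querying in an efficient manner" typically means in practice.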
Preferred Qualifications:
  • Learning agility
  • Technical leadership
  • Consulting on and managing business needs
  • Strong experience in Python is preferred, but experience in other languages (e.g., Scala, Java, C#) is also accepted.
  • Experience building Spark applications using PySpark.
  • Experience with file formats such as Parquet, Delta, and Avro.
  • Experience efficiently querying API endpoints as a data source.
  • Understanding of the Azure environment and related services such as subscriptions, resource groups, etc.
  • Understanding of Git workflows in software development.
  • Experience using Azure DevOps Pipelines and Repos to deploy and maintain solutions.
  • Understanding of Ansible and how to use it in Azure DevOps pipelines.
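As a point of reference for the Azure DevOps qualifications above, a minimal pipeline definition might look like the following sketch; the service connection, storage account, and paths are hypothetical:

```yaml
# azure-pipelines.yml (sketch; names below are illustrative, not from the posting)
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  # Run the project's test suite before any deployment step.
  - script: pip install -r requirements.txt && pytest tests/
    displayName: Run unit tests

  # Deploy pipeline artifacts to a storage account via the Azure CLI task.
  - task: AzureCLI@2
    displayName: Deploy data pipeline artifacts
    inputs:
      azureSubscription: my-service-connection   # hypothetical service connection
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az storage blob upload-batch \
          --destination artifacts \
          --account-name mystorageaccount \
          --source ./dist
```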
Job Description Summary:
Applies software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a wide variety of storage and compute technologies, handling diverse data types and volumes in support of data architecture design.

A Data Engineer designs data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost-effective.

As a Data Engineer, you will:
  • Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Microsoft Fabric).
  • Utilize Azure storage accounts (e.g., Azure Data Lake Storage Gen2, Azure Blob Storage) to organize and maintain data pipeline outputs.
  • Collaborate with data scientists, data analysts, data architects and other stakeholders to understand data requirements and deliver high-quality data solutions.
  • Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
  • Ensure data quality and integrity through data validation techniques and frameworks.
  • Develop and maintain documentation for data processes, configurations, and best practices.
  • Monitor and troubleshoot data pipeline issues to ensure timely resolution.
  • Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
  • Manage the CI/CD process for deploying and maintaining data solutions.
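The data-quality responsibility above can be sketched with a simple row-level validation check; the column names and rules are illustrative, and dedicated frameworks (e.g., Great Expectations) provide the same idea at scale:

```python
# Minimal data-validation sketch (hypothetical rules and column names).
def validate_rows(rows):
    """Return a list of (row_index, reason) for rows failing basic checks."""
    failures = []
    for i, row in enumerate(rows):
        # Integrity check: every record needs a primary-key value.
        if row.get("id") is None:
            failures.append((i, "missing id"))
        # Quality check: amounts must be numeric and non-negative.
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            failures.append((i, "invalid amount"))
    return failures

records = [
    {"id": 1, "amount": 10.5},
    {"id": None, "amount": 3.0},
    {"id": 3, "amount": -1},
]
print(validate_rows(records))  # [(1, 'missing id'), (2, 'invalid amount')]
```

Running such checks between pipeline stages lets bad records be quarantined before they reach downstream consumers.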