Data Engineer for Enterprise Data

Converge ICT Solutions · Quezon City · Full-time

The Data Engineer supports the Assurance Pillar by building, operating, and continuously improving enterprise data pipelines and reporting datasets that enable reliable, timely, and auditable insights across business units. The role focuses on ETL/ELT automation, data quality controls, issue resolution, scheduled execution and monitoring of scripts for periodic reports required by internal stakeholders, and operations on the AWS data platform.

Reports to: Head, Enterprise Data (Assurance) / Enterprise Data Lead (as applicable)

Key Responsibilities
  1. Data Pipeline & ETL/ELT Delivery
  • Develop, enhance, and maintain data pipelines (batch and/or near-real-time) that ingest data from source systems (e.g., BSS/OSS, CRM, finance, network systems) into reporting layers (DWH, data lake, reporting DB, or marts).
  • Implement reusable ETL/ELT frameworks, parameterized jobs, and standardized transformations to reduce manual processing.
  • Optimize pipelines for performance, scalability, and cost (query tuning, indexing/partitioning, incremental loads).
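The incremental-load pattern mentioned above can be sketched as follows. This is an illustrative example only: the tables (`src_orders`, `stg_orders`, `load_watermark`) and the watermark column are hypothetical, standing in for whatever source and staging structures the actual pipelines use.

```python
import sqlite3

# In-memory SQLite stands in for a real source DB / staging layer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE stg_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE load_watermark (job TEXT PRIMARY KEY, last_ts TEXT);
    INSERT INTO src_orders VALUES
        (1, 100.0, '2024-01-01'), (2, 250.0, '2024-01-02'), (3, 75.0, '2024-01-03');
    INSERT INTO load_watermark VALUES ('orders', '2024-01-01');
""")

def incremental_load(conn, job="orders"):
    """Copy only rows newer than the stored watermark, then advance it."""
    (last_ts,) = conn.execute(
        "SELECT last_ts FROM load_watermark WHERE job = ?", (job,)).fetchone()
    conn.execute("""
        INSERT OR REPLACE INTO stg_orders
        SELECT id, amount, updated_at FROM src_orders WHERE updated_at > ?
    """, (last_ts,))
    conn.execute("""
        UPDATE load_watermark
        SET last_ts = (SELECT MAX(updated_at) FROM src_orders)
        WHERE job = ?
    """, (job,))
    conn.commit()

incremental_load(conn)
loaded = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
print(loaded)  # only rows past the watermark are moved: 2
```

The watermark table is what makes reruns safe: a repeated execution moves nothing new, which matters for the rerun-and-document-exceptions duties described below.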
  2. Periodic Reports Execution & Operations (Critical)
  • Run, schedule, and monitor scripts/jobs for periodic reports (daily/weekly/monthly) required by business units (e.g., Finance, Operations, Sales, CX, Network, Collections).
  • Validate successful completion of scheduled runs; perform reruns as needed and document exceptions.
  • Conduct completeness and accuracy checks before releasing outputs (row counts, reconciliation vs. source totals, threshold checks).
  • Maintain runbooks and job calendars (report owners, cut-off times, dependencies, and distribution lists).
  • Ensure report outputs are stored and shared through approved channels with proper access controls.
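A pre-release completeness check of the kind described above (row counts and threshold checks against source totals) might look like this minimal sketch; the counts and the 1% tolerance are illustrative assumptions, not values from any actual report.

```python
def release_check(source_rows, output_rows, tolerance=0.01):
    """Pass only if the output row count is within `tolerance` of the source count."""
    if source_rows == 0:
        return output_rows == 0
    drift = abs(source_rows - output_rows) / source_rows
    return drift <= tolerance

assert release_check(10_000, 10_000) is True   # exact match
assert release_check(10_000, 9_950) is True    # within the 1% threshold
assert release_check(10_000, 9_000) is False   # fails reconciliation; hold release
```

In practice a failed check would block distribution and be logged as an exception in the runbook rather than silently released.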
  3. AWS Data Platform Operations & Enablement
  • Operate and support data workloads hosted on AWS, including provisioning/maintenance activities aligned with internal policies and IT controls.
  • Build and maintain cloud-based data pipelines and storage patterns (e.g., landing/bronze-silver-gold layers) using AWS-native and/or approved tools.
  • Perform day-to-day activities such as:
      • Managing data storage and access (e.g., Amazon S3 structures, lifecycle rules, encryption, and access permissions via IAM policies/roles).
      • Supporting orchestration and scheduled jobs (e.g., AWS Glue, Step Functions, MWAA/Airflow, Lambda, or equivalent approved schedulers).
      • Supporting analytics/warehouse workloads (e.g., Amazon Redshift, Athena, EMR, or equivalent), including performance tuning and cost controls.
      • Monitoring pipeline health (e.g., CloudWatch logs/metrics/alarms), investigating failures, and implementing alerting.
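As one concrete example of the S3 lifecycle management listed above, a lifecycle rule can be expressed as the dict that boto3's `put_bucket_lifecycle_configuration` expects. The bucket, prefix, and retention periods here are hypothetical, and no AWS call is made in this sketch.

```python
# Lifecycle rule for a hypothetical landing/raw prefix: move objects to
# infrequent-access storage after 30 days, expire them after a year.
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-landing-raw",
            "Filter": {"Prefix": "landing/raw/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
            ],
            "Expiration": {"Days": 365},
        }
    ]
}

# With a boto3 S3 client (not created here) this would be applied as:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-data-lake", LifecycleConfiguration=lifecycle_config)
```

Rules like this keep landing-zone storage costs bounded without manual cleanup, which ties into the cost-control responsibilities above.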
  4. Data Quality & Controls (Assurance-Focused)
  • Implement data quality rules (uniqueness, validity, timeliness, consistency) and automated alerts for anomalies.
  • Perform root-cause analysis for data issues and coordinate fixes with IT, application owners, and source system teams.
  • Support auditability: maintain logs, version control, and evidence of execution for critical reports.
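The rule categories above (uniqueness, validity, timeliness) can be sketched as simple checks that emit an issue list for the exception log. Field names, sample values, and the expected reporting date are all illustrative assumptions.

```python
from datetime import date

# Hypothetical report rows; not from any actual dataset.
rows = [
    {"account_id": "A1", "amount": 120.5, "report_date": date(2024, 1, 31)},
    {"account_id": "A2", "amount": -5.0,  "report_date": date(2024, 1, 31)},
    {"account_id": "A2", "amount": 80.0,  "report_date": date(2024, 1, 30)},
]

def run_quality_checks(rows, expected_date):
    """Return a list of failed-rule descriptions for the exception log."""
    issues = []
    ids = [r["account_id"] for r in rows]
    if len(ids) != len(set(ids)):
        issues.append("uniqueness: duplicate account_id values")
    if any(r["amount"] < 0 for r in rows):
        issues.append("validity: negative amount")
    if any(r["report_date"] != expected_date for r in rows):
        issues.append("timeliness: stale report_date")
    return issues

issues = run_quality_checks(rows, expected_date=date(2024, 1, 31))
print(issues)  # the sample data trips all three rules
```

Each returned issue would feed the automated alerts and root-cause workflow described above, with the check output retained as audit evidence.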
  5. Reporting Data Enablement & Stakeholder Support
  • Partner with Data & Performance Analytics / CX Analytics and business report owners to translate report requirements into datasets and automated processes.
  • Support Tableau/dashboard consumption by ensuring reporting tables/views are stable, well-documented, and refreshed on schedule.
  • Triage and resolve tickets related to data extracts, refresh failures, and access requests within agreed SLAs.
  6. Documentation, Governance & Continuous Improvement
  • Maintain data dictionaries, pipeline documentation, lineage notes, and technical specifications.
  • Follow naming standards, change management, and deployment practices across environments (DEV/SIT/UAT/PROD).
  • Propose automation and process improvements (reduce manual steps, improve monitoring, add validations).
Key Deliverables
  • Production-grade ETL/ELT pipelines and datasets for enterprise reporting use cases
  • Automated periodic report scripts/jobs with monitoring + runbooks
  • Data quality checks and exception logs with documented resolution
  • Standardized, well-documented reporting tables/views for BI tools
  • Ticket closure evidence and stakeholder communications

Qualifications

Required Qualifications
  • Bachelor’s degree in Computer Science, Engineering, IT, Mathematics, Statistics, or equivalent experience
  • Strong SQL skills (advanced joins, window functions, performance tuning)
  • Experience building and operating ETL/ELT pipelines (any relevant tools/stack)
  • Proficiency in at least one scripting language (Python preferred; or Shell or PL/SQL)
  • Familiarity with data warehousing concepts (dimensional modeling, marts, incremental loads)
  • Hands-on experience with job scheduling and monitoring (cron/schedulers/orchestrators)
  • Strong troubleshooting mindset and comfort working with production run support
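To illustrate the window-function skill level expected above, here is a running-total query executed via Python's sqlite3 module (SQLite 3.25+ supports window functions); the table and data are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE daily_sales (region TEXT, day TEXT, amount REAL);
    INSERT INTO daily_sales VALUES
        ('NCR', '2024-01-01', 100), ('NCR', '2024-01-02', 150),
        ('VIS', '2024-01-01', 80),  ('VIS', '2024-01-02', 60);
""")

# Per-region running total: a typical reporting-layer window function.
rows = conn.execute("""
    SELECT region, day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM daily_sales
    ORDER BY region, day
""").fetchall()
for r in rows:
    print(r)
```

The same pattern (PARTITION BY a business key, ORDER BY a date) underlies many of the periodic-report datasets this role maintains.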
Preferred Qualifications
  • Experience with telecom data domains (BSS/OSS) and high-volume datasets
  • Exposure to cloud data platforms (e.g., BigQuery, Snowflake, Redshift, Synapse) or on-prem DWH
  • Experience supporting Tableau refresh patterns and optimizing extracts/views
  • Knowledge of data governance, lineage, and controls in an assurance/audit context
  • Experience with CI/CD, Git, and environment promotions
Key Competencies
  • Operational discipline (accuracy, timeliness, evidence-based execution)
  • Data quality and control mindset (pre/post checks, reconciliations, traceability)
  • Stakeholder management (clear updates, expectation setting, SLA adherence)
  • Problem-solving and root-cause analysis
  • Documentation and process rigor
Success Metrics (Examples)
  • % of periodic reports executed on time (daily/weekly/monthly)
  • Reduction in manual steps via automation (scripted + scheduled runs)
  • Data pipeline reliability (job success rate, MTTR for failures)
  • Data accuracy metrics (reconciliation pass rate, defect leakage)
  • Ticket SLA compliance and stakeholder satisfaction
  • Coverage of monitoring/alerts and completeness of runbooks