Data Engineer

Posted 8 Days Ago
Kuala Lumpur, Wilayah Persekutuan Kuala Lumpur
Mid level
Financial Services
The Role
The Data Engineer will manage ELT processes, perform data integration using Azure tools, optimize data pipelines, and ensure data security and compliance. Responsibilities include designing data pipelines, managing infrastructure, and automating tasks with scripting. They will collaborate with various teams to drive data strategy and governance initiatives.

Prudential’s purpose is to be partners for every life and protectors for every future. Our purpose informs everything we do, creating a culture in which diversity is celebrated and inclusion is assured for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.

The Data Engineer is responsible for Extract, Load and Transform (ELT) processes, as well as managing Change Data Capture (CDC) using Qlik Replicate to ingest data into the Data Lake, Data Models, Data Marts and the 7-Sisters Master Data Management platform, helping establish data-driven decision-making across the organization.
This is a new Data Team (Tribe) under the IT Department that will drive the Data Strategy for the organization. The team will work with the respective Data Owners to develop the data models, data marts and data analytics for the whole PruBSN organization, following the regional Data COE approach to maintain good data governance and develop the 7-Sisters Master Data Management platform. The Data Strategy for PruBSN will be executed via a ‘Hub & Spoke’ approach: the ‘Hub’ is this core Data Team under IT, which collaborates with the respective department Data Stewards and the PruBSN Data Heroes (Power BI superusers), i.e. the ‘Spokes’.
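To make the CDC ingestion idea above concrete, here is a minimal, illustrative sketch of applying change records to a keyed target table. The record shape (`op`, `key`, `row`) and the field names are assumptions for illustration only, not the actual Qlik Replicate output format:

```python
# Minimal sketch of applying Change Data Capture (CDC) records to a
# target table -- the core idea behind a Qlik Replicate-style flow.
# The change-record shape ("op", "key", "row") is an illustrative assumption.

def apply_cdc(target: dict, changes: list) -> dict:
    """Apply insert/update/delete change records to a keyed target table."""
    for change in changes:
        op, key = change["op"], change["key"]
        if op in ("insert", "update"):
            target[key] = change["row"]   # upsert the latest row image
        elif op == "delete":
            target.pop(key, None)         # remove the row if present
    return target

# Example: one insert, one update, then an insert followed by a delete
table = {}
apply_cdc(table, [
    {"op": "insert", "key": 1, "row": {"policy": "P-001", "status": "active"}},
    {"op": "update", "key": 1, "row": {"policy": "P-001", "status": "lapsed"}},
    {"op": "insert", "key": 2, "row": {"policy": "P-002", "status": "active"}},
    {"op": "delete", "key": 2},
])
```

In a real pipeline the same upsert/delete semantics would be expressed as a merge into Delta tables rather than an in-memory dictionary.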

Key Accountabilities:

  • Data Integration and Orchestration: Designs and implements data pipelines using Azure Data Factory to orchestrate data movement from various sources (databases, APIs, etc.) into Azure storage solutions (e.g., Azure Data Lake Storage, Azure Blob Storage).
  • ETL Processing and Transformation: Develops and manages ETL (Extract, Transform, Load) processes within Azure Databricks clusters, utilizing Spark (PySpark/Scala) for efficient data transformation.
  • Performance Optimization: Continuously optimizes data pipelines for efficiency and scalability using Spark techniques within Databricks, and tunes storage solutions for performance.
  • Data Pipeline CI/CD: Designs, implements, maintains and enhances CI/CD pipelines for data pipelines using Azure DevOps services (e.g., Azure Pipelines, Azure Repos). This includes automating testing, deployment, and configuration management for data infrastructure on Azure.
  • Azure Infrastructure Management: Provisions, configures, and manages infrastructure for data pipelines using Azure services (e.g., Azure Data Factory, Azure Databricks VMs, Azure Functions).
  • Version Control and Collaboration: Utilizes GitHub, Azure DevOps services (e.g., Azure Repos) to manage code changes for data pipelines and collaborates with other engineers on infrastructure development.
  • Security and Compliance: Ensures secure deployment of data pipelines on Azure, adhering to data privacy and compliance regulations using Azure Security Center and other security tools. Implements data quality and governance processes to ensure the accuracy, consistency, and reliability of data.
  • Monitoring and Troubleshooting: Sets up Azure Monitor to track data pipeline health and performance, and implements alerting for potential issues through Azure Alerts. Performs troubleshooting to identify and resolve issues with data pipelines and infrastructure.
  • Scripting: Writes scripts using Azure Functions or the Azure CLI to automate tasks related to data pipeline deployment and management.
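As a rough illustration of the transformation step named in the accountabilities above, the sketch below uses plain Python standing in for the PySpark/Databricks code the role would actually involve; the record fields and cleaning rules are made-up examples, not PruBSN's actual data model:

```python
# Toy ELT transformation step: normalise raw source records before loading.
# Plain Python stands in for PySpark here; in Databricks this would be a
# DataFrame transformation. Field names and rules are illustrative only.

def transform(raw_rows: list) -> list:
    """Clean raw records: trim/titlecase names, drop rows missing a key."""
    cleaned = []
    for row in raw_rows:
        if not row.get("customer_id"):
            continue  # simple data-quality rule: reject rows without a key
        cleaned.append({
            "customer_id": row["customer_id"],
            "name": row.get("name", "").strip().title(),
        })
    return cleaned

rows = transform([
    {"customer_id": "C1", "name": "  alice tan "},
    {"customer_id": None, "name": "orphan row"},   # dropped: no key
    {"customer_id": "C2", "name": "bob LIM"},
])
```

The equivalent PySpark version would express the same filter and column cleanup as DataFrame operations (`filter`, `withColumn`) so Spark can run them in parallel across the cluster.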

Additional Skills:

  • Experience with Azure DevOps, GitHub tools and methodologies.
  • Understanding of cloud security, governance best practices in Azure.

Qualification:

  • Bachelor’s degree in IT, computer science, or a relevant field.

  • 2-5 years of experience in the data engineering/analyst domain.
  • Able to work independently in a fast-paced and complex environment.
  • Good communication skills (both verbal and written) – English is mandatory.
  • Expert knowledge in data, data engineering and other related subdomains.
  • Background in cloud, cloud native applications and data flows.
  • Working knowledge of MS SQL, PostgreSQL, CDC, data streaming applications (e.g. Apache Kafka, Qlik Replicate) and other related systems.


Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability, part-time / fixed-term work, or any other status protected by applicable law. We expect the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.

Top Skills

Azure
PostgreSQL
PySpark
Scala
SQL

The Company
HQ: Central, Hong Kong
52,292 Employees
On-site Workplace

What We Do

In Asia and Africa, Prudential has been providing familiar, trusted financial security to people for 100 years. Today, headquartered in Hong Kong and London, we are ranked top three in 12 Asian markets with 18 million customers, around 68,000 average monthly active agents and access to over 27,000 bank branches in the region.

Prudential is focused on opportunities in the most exciting growth markets in Asia and Africa. With access to over 4 billion people in both these regions, we are investing in broadening our presence and building our leadership in the life and asset management markets.

We are committed to making a positive impact on our customers, our employees and our communities by delivering the best savings, health and protection solutions to people so they can get the most out of life. Visit our websites for more information:

Prudential plc: https://www.prudentialplc.com/
Prudence Foundation: https://www.prudentialplc.com/en/prudence-foundation
