Prudential’s purpose is to be partners for every life and protectors for every future. Our purpose guides everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people’s career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.
Responsibilities:
- Current Context:
- Design, develop, and maintain robust data models and schemas within our BigQuery lakehouse environment.
- Optimize data storage, retrieval, and processing for analytical workloads, ensuring high performance and efficiency.
- Implement and manage relational (RDB) and NoSQL databases as needed for specific use cases.
- Develop and maintain data APIs and services to enable data access for various applications and consumers.
- Implement data security and access control policies, adhering to best practices and regulatory requirements.
- Troubleshoot and resolve data backend issues, ensuring data integrity and availability.
- Collaborate with Data Engineers and other stakeholders to ensure the data backend meets evolving business needs.
- Partner with cross-functional teams globally, communicating platform updates effectively.
- Role in Brief: Data Modeling, Data Lakehouse (BigQuery), RDB, NoSQL, Data API, Performance Optimization, Data Security
- Expectations for the First Three Months: Become familiar with our existing technology stacks, not only within your specific role but across the broader data platform ecosystem.
- Expectations Within the First Year: Ensure the stability, scalability, security, and performance of the data backend. Serve as a data modeling and optimization expert, driving architectural improvements. Specific contributions can be discussed.
Who We're Looking For:
- Non-Technical Skills & Mindset:
- Impact-Driven & Results Focused:
- Value-Oriented: Focused on delivering solutions that generate significant business value (multi-million USD impact).
- Impact Conscious: Prioritizes work with the greatest technical and business impact. A focus on enabling data consumption through API creation is a plus.
- Growth & Learning Mindset:
- Cross-Functional Learner: Eager to learn and understand cross-functional knowledge beyond core expertise.
- Technology Agnostic Learner: Willing to learn new technologies and adapt to evolving landscapes.
- Efficient Learner: Able to leverage AI tools to maximize productivity and accelerate learning.
- Best Practice Pragmatist: Loves to follow best practices but understands trade-offs and works around limitations when necessary. Demonstrated pro-activeness through contributions to open-source projects is highly valued.
- Collaborative & Global Communicator:
- Team Player: Collaborates effectively in global teams; adaptable and comfortable working in an Agile environment.
- Excellent Communicator (English & Chinese): Fluent in both English and Chinese (Mandarin) to effectively communicate with global teams and stakeholders.
- Impact-Driven & Results Focused:
- Technical Concepts: We're looking for candidates with a strong grasp of:
- Fundamental computer science knowledge
- Root-cause analysis methodologies
- Systematic/architectural thinking
- Clean code/clean architecture principles and an aversion to over-design
- Technical Skills:
- Python: Proficient in Python for data backend scripting and automation.
- SQL: Expert in SQL, with extensive experience in database design and optimization.
- Cloud Development: Hands-on experience with GCP, including hybrid environments with on-premises data centers. Experience with AWS or Azure is also acceptable.
- RDB and NoSQL: Solid understanding of relational and NoSQL databases.
- Tech Stack:
- Compute & Hosting: GKE & GCE (RedHat), GCP Cloud Run & Cloud Functions
- Data Orchestration: GCP Cloud Composer (Airflow)
- Data Lakehouse: BigQuery
- Data Streaming: Kafka Ecosystem (Confluent Cloud, Debezium, Qlik)
- Monitoring & Observability: GCP Monitoring/Logging/Metrics, OpenTelemetry
- CI/CD: GitHub Actions, Jenkins
- Infrastructure as Code: Terraform
- Security: VPC SC & Policy Tags, Customer-Managed Encryption Keys (CMEK), Vault
- Containers: Docker, Kubernetes
- Data Governance: Collibra
- Data Visualization: Power BI
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability, part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also make reasonable adjustments to support people with individual physical or mental health requirements.
What We Do
In Asia and Africa, Prudential has been providing familiar, trusted financial security to people for 100 years. Today, headquartered in Hong Kong and London, we are ranked top three in 12 Asian markets with 18 million customers, around 68,000 average monthly active agents and access to over 27,000 bank branches in the region.
Prudential is focused on opportunities in the most exciting growth markets in Asia and Africa. With access to over 4 billion people in both these regions, we are investing in broadening our presence and building our leadership in the life and asset management markets.
We are committed to making a positive impact on our customers, our employees and our communities by delivering the best savings, health and protection solutions to people so they can get the most out of life. Visit our websites for more information:
Prudential plc: https://www.prudentialplc.com/
Prudence Foundation: https://www.prudentialplc.com/en/prudence-foundation