Capgemini is seeking a highly motivated and detail-oriented Data Information Architect for a top 10 US Insurance Carrier.
This role is for a skilled Data Information Architect to design and implement data solutions supporting Servicing Ops Analytics, Reporting, and AI. The architect is responsible for the definition, creation, and maintenance of architecture artifacts for data solutions, along with the definition of Data Architecture processes across Data & Analytics in Service Ops. The Information Architect is expected to understand current and future architecture, reporting, and warehousing technologies to effectively architect future target-state solutions for Service Ops data and analytics.
The ideal candidate will have experience with Snowflake, Python, Azure, and AWS, along with a strong foundation in Big Data and Cloud Platforms. You will be responsible for developing scalable, efficient data architectures that enable personalized customer experiences, BI, and advanced analytics.
- Work towards a data strategy that maximizes the value of data as an asset, along with a migration strategy from our current state to our future state architecture.
- Responsible for the definition, creation, and maintenance of architecture artifacts for data solutions, including data architecture, data flow diagrams, conceptual/logical data models, data dictionaries, and database design.
- Design and optimize data pipelines that integrate various data sources (1st party, 3rd party, operational) to support business intelligence and advanced analytics.
- Develop data models and data flows that enable personalized customer experiences and support omnichannel marketing and customer engagement.
- Implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs.
- Support both real-time and batch data integration, ensuring data is accessible for actionable insights and decision-making.
Our Client is one of the United States’ largest insurers, providing a wide range of insurance and financial services products with gross written premium well over US$25 Billion (P&C). They proudly serve more than 10 million U.S. households with more than 19 million individual policies across all 50 states through the efforts of over 48,000 exclusive and independent agents and nearly 18,500 employees. Finally, our Client is part of one of the largest Insurance Groups in the world.
- 4+ years of experience in Data Architecture or Data Engineering, with expertise in technologies such as AWS, Snowflake, and Azure.
- Strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks.
- Minimum 2 years of experience designing scalable data architectures for customer analytics and operations.
- Expertise with cloud data platforms (AWS preferred) and Big Data technologies for large-scale data processing.
- Hands-on experience with Python for data engineering tasks and scripting.
- Proven track record of building and managing data pipelines and data warehousing solutions on platforms such as Snowflake and AWS.
- Familiarity with Customer Data Platforms (CDP), Master Data Management (MDM), and Customer 360 architectures.
- Strong problem-solving skills and ability to work with cross-functional teams to translate business requirements into scalable data solutions.
- Proficient in modern data architectures that support advanced analytics, including Snowflake, Azure, etc.
- Self-directed and comfortable supporting the data needs of multiple teams, systems and products.
- Expert knowledge of data modeling, documentation and governance tools and techniques.
- English Proficiency: Fluent
Other Critical Skills
- Experience in Data Architecture or Data Engineering, with expertise in technologies such as AWS, Snowflake, and Azure. - Advanced
- Strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks. - Advanced
- Expertise with cloud data platforms (AWS preferred) and Big Data technologies for large-scale data processing. - Advanced
- Hands-on experience with Python for data engineering tasks and scripting. - Advanced
- Expert knowledge of data modeling, documentation and governance tools and techniques. - Advanced
Competitive compensation and benefits package:
- Competitive salary and performance-based bonuses
- Comprehensive benefits package
- Career development and training opportunities
- Flexible work arrangements (remote and/or office-based)
- Dynamic and inclusive work culture within a globally renowned group
- Private Health Insurance
- Pension Plan
- Paid Time Off
- Training & Development
- Performance Bonus
What We Do
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of 270,000 team members in nearly 50 countries. With its strong 50-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2020 global revenues of €16 billion.