Senior Data Engineer (Snowflake + Kafka)

Pune, Mahārāshtra
Mid level
The Role

Allata is a fast-growing technology strategy and data development consulting firm delivering scalable solutions to our enterprise clients. Our mission is to inspire our clients to achieve their most strategic goals through uncompromised delivery, active listening, and personal accountability. We thrive in fast-paced environments, tackling complex problems, continually learning, and working alongside colleagues to be better together.


We are seeking a skilled Data Engineer with expertise in Snowflake and Apache Kafka to guide and drive the evolution of our client's data ecosystem.


IMRIEL (an Allata company) is looking for a talented Data Engineer with experience in Snowflake and Apache Kafka to design, develop, and optimize our data pipelines. In this role, you will be responsible for building scalable, high-performance data systems that integrate with various data sources and contribute to the overall data architecture. You will collaborate with cross-functional teams to ensure seamless data flow and support advanced analytics initiatives.


What you'll be doing:


• Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.

• Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.

• Create and optimize ETL/ELT workflows using tools like dbt, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.

• Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake’s Query Profiling, Resource Monitors, and other diagnostic tools.

• Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.

• Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.

• Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.

• Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.

• Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
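As an illustration of the Kafka-to-Snowflake ingestion pattern mentioned above, a minimal Snowpipe sketch might look like the following. All database, schema, stage, and bucket names here are hypothetical placeholders, not part of any actual client environment:

```sql
-- Hypothetical external stage pointing at the cloud storage location
-- where a Kafka connector lands raw JSON event files.
CREATE OR REPLACE STAGE raw_db.events.kafka_stage
  URL = 's3://example-bucket/kafka-events/'
  FILE_FORMAT = (TYPE = 'JSON');

-- Snowpipe that auto-ingests newly arriving files from the stage
-- into a landing table for downstream transformation.
CREATE OR REPLACE PIPE raw_db.events.kafka_events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.events.kafka_events_raw
  FROM @raw_db.events.kafka_stage
  FILE_FORMAT = (TYPE = 'JSON');
```

In practice, AUTO_INGEST relies on cloud storage event notifications (e.g., S3 event notifications to SQS) being configured so Snowpipe is triggered as files arrive.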


What you need:

Basic Skills:


• 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.

• Strong experience with Apache Kafka for stream processing and real-time data integration.

• Proficiency in SQL and ETL/ELT processes.

• Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.

• Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.

• Familiarity with tools like dbt, Airflow, or similar orchestration platforms.

• Knowledge of data governance, security, and compliance best practices.

• Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.

• Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.


Responsibilities:


• Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.

• Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, dbt, or custom scripts to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.

• Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.

• Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.

• Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.

• Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.

• Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.

• Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.

• Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.

• Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
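The governance responsibilities above (RBAC, data masking, auditing) can be sketched in Snowflake SQL along these lines. Role, schema, and column names are illustrative assumptions only:

```sql
-- Mask email addresses for every role except a privileged analyst role.
CREATE OR REPLACE MASKING POLICY governance.policies.email_mask
  AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
    ELSE '*** MASKED ***'
  END;

-- Attach the policy to a sensitive column.
ALTER TABLE analytics.crm.customers
  MODIFY COLUMN email SET MASKING POLICY governance.policies.email_mask;

-- Grant read access through role-based access control (RBAC).
GRANT SELECT ON TABLE analytics.crm.customers TO ROLE reporting_reader;
```

A design like this keeps masking logic centralized in policies rather than scattered across views, so compliance rules can be audited and updated in one place.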


Good To Have:


• Hands-on experience with dbt for transformations and data modeling techniques like dimensional modeling.

• Familiarity with cloud platforms (AWS, Azure, GCP) and tools like AWS Glue or Azure Data Factory.

• Knowledge of CI/CD pipelines, containerization (Docker), and stream processing frameworks (e.g., Spark, Flink).

• Certifications in Snowflake, Kafka, or cloud data engineering technologies are a plus.


Personal Attributes:


• Ability to identify, troubleshoot, and resolve complex data issues effectively.

• Strong teamwork, communication skills, and intellectual curiosity to collaborate effectively with cross-functional teams.

• Commitment to delivering high-quality, accurate, and reliable data solutions.

• Willingness to embrace new tools, technologies, and methodologies.

• Innovative thinker with a proactive approach to overcoming challenges.



At Allata, we value differences.


Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.


Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability or any other legally protected category.


This policy applies to all terms and conditions of employment, including but not limited to, recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

Top Skills

Airflow
Apache Kafka
AWS
Azure
dbt
ELT
ETL
GCP
Grafana
Prometheus
Python
Shell
Snowflake
SQL

The Company
HQ: Dallas, Texas
233 Employees
On-site Workplace

What We Do

Allata (pronounced a-ley-ta) is a strategy, architecture and enterprise-level application development company focused on helping clients enhance or scale business opportunities, create efficiencies and automate processes through custom technologies.

We are building a different kind of firm – focused on doing exciting, transformational work for great clients and bringing caring, dedicated people together to make our clients' goals a reality. Our vision is to build an energized group of talented professionals who can stand strong on their own but work better as a networked team.

We enable business agility at the intersection of people, process, and technology. We provide solutions and expert services to assist businesses to become more nimble, transformative, and disruptive in their respective industries. We define vision, strategy, and value creation models for shaping strategic product designs, managing, and transforming enterprise delivery.

Just as strongly as we care about our clients, we feel that it is important to give back to the community and non-profits that we are passionate about. Every month, Allata donates 2% of our net income to a charitable cause our team believes in.

We live by our mantra:
Family comes first, clients are king, we take great care of our people.
