Data Engineer

Posted 24 Days Ago
Argentina, Aguirre, Santiago del Estero
Mid level
Information Technology • Software • Consulting • App development • Generative AI • Big Data Analytics • Automation
The Role
The Data Engineer will design, build, and maintain data pipelines and architectures, focusing on data integration and management, especially with Salesforce. Responsibilities include ETL/ELT development, database management, optimizing performance, ensuring data quality, and collaborating with teams for data solutions.
Summary Generated by Built In

Job Title: Data Engineer

Job Description

We are seeking a skilled and detail-oriented Data Engineer to join our team. The Data Engineer will be responsible for designing, building, and maintaining efficient data pipelines and architectures that enable advanced data analytics and business intelligence. The ideal candidate will have hands-on experience with data integration, data warehousing, and cloud-based platforms, as well as a passion for optimizing data systems to drive data-driven decision-making.

Key Responsibilities:

  • Data Pipeline Development:
    Design, build, and manage scalable, reliable, and efficient ETL/ELT data pipelines to ingest, process, and store large datasets from various data sources, including Salesforce.

     
  • Data Integration:
    Integrate data from diverse sources such as Salesforce, APIs, relational databases, NoSQL databases, flat files, and streaming data into centralized data lakes or data warehouses.

     
  • Salesforce Data Management:
    Leverage Salesforce data models, APIs, and connectors to extract, transform, and integrate Salesforce data with other enterprise data sources for analytics and reporting.

     
  • Database Management & Optimization:
    Implement and maintain data storage solutions such as relational databases (e.g., PostgreSQL, MySQL), NoSQL databases (e.g., MongoDB, Cassandra), and cloud-based data warehouses (e.g., Amazon Redshift, Google BigQuery).

     
  • Data Quality & Validation:
    Ensure data quality and integrity through the design and implementation of data validation, cleansing, and enrichment processes. 
    Address data discrepancies and inconsistencies, particularly within Salesforce data.
     
  • Collaboration with Teams:
    Collaborate with data scientists, analysts, and software engineers to understand data requirements, including Salesforce data, and deliver data solutions that meet analytical and operational needs.

     
  • Performance Tuning:
    Optimize data pipelines and data storage systems for maximum efficiency, scalability, and performance. 
    Proactively monitor system performance and troubleshoot issues.
     
  • Data Governance & Security:
    Work closely with the data governance and security teams to ensure data solutions comply with organizational standards, privacy laws (e.g., GDPR, CCPA), and security policies, particularly with sensitive Salesforce data.

     
  • Automation & Scheduling:
    Automate data workflows and implement scheduling tools (e.g., Airflow, Cron) to ensure timely data delivery for reports, dashboards, and analytics, including Salesforce-based insights.

     
  • Documentation:
    Create and maintain technical documentation for data pipelines, data models, and data management processes, including Salesforce data integration processes, to ensure knowledge sharing and reproducibility.

Requirements:

  • Education:
    Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related field.

     
  • Experience:
    • Minimum of 3-5 years of experience in data engineering, ETL development, or database management.
    • Experience with Salesforce data integration, including working with Salesforce APIs, data models, and connectors.
    • Experience with cloud platforms (AWS, Azure, Google Cloud) and cloud-native data processing tools.
       
  • Skills & Expertise:
    • Proficiency with SQL for querying and manipulating data, including Salesforce data.
    • Experience with data integration tools (e.g., Apache Airflow, Talend, Informatica) and ETL/ELT processes, including Salesforce integration.
    • Strong programming skills in Python, Java, or Scala for data processing.
    • Knowledge of big data technologies (e.g., Hadoop, Spark, Kafka).
    • Experience with data warehousing solutions, especially when integrating Salesforce data.

#ConcentrixCatalyst

Location:

ARG Work-at-Home

If you are a California resident, by submitting your information, you acknowledge that you have read and have access to the Job Applicant Privacy Notice for California Residents.

Top Skills:

SQL