The Role
The Big Data Developer will design and manage large-scale enterprise big data platforms using tools like Spark, Python, and SQL. This role requires strong experience in data engineering and business intelligence delivery, and the ability to work with large datasets and collaborate in an agile team environment.
Role – Big Data Developer (w/ Java)
Location – Phoenix, AZ
Duration – 12+ Months
Skills:
- Java, Spark, Python, GCP
- Strong experience with Spark and DataFrames
- BigQuery background, with experience processing large data volumes
- Strong Python background
Qualifications:
- Degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field (or equivalent work experience)
- 6+ years of experience as a big data engineer (required); must be able to articulate the use cases supported and the outcomes driven
- Knowledge of the following (candidates are expected to demonstrate these skills live during the interview): PySpark, Spark, Python, Scala, Hive, Pig, and MapReduce
- Experience with AWS and GCP
- Experience with SQL
- Large scale data engineering and business intelligence delivery experience
- Design of large-scale enterprise level big data platforms
- Experience working with and performing analysis using large data sets
- Demonstrated experience working on a mature, self-organizing agile team
Top Skills
Java
Python
Scala
The Company
What We Do
We empower and transform customers’ businesses through the use of digital technologies.
Our core focus areas are Big Data, Cloud, Analytics (AI, ML), Blockchain, Automation & Mobility.
We guide several Fortune 1000 clients in the USA, Canada, UK & India through their digital transformation.
NucleusTeq is a software services, solutions & products company empowering and transforming customers’ businesses through digital technologies such as Big Data, Analytics (AI, ML), Cloud, Enterprise Automation, Blockchain, Mobility, CRM & ERP.