Position: Data Engineer – AWS/GCP

Required Skills: AWS/GCP, Java/Scala/Python, Spark, Airflow, Snowflake

Years of experience: 1-2 Years

Locations: Indore, Raipur

We are seeking skilled Data Engineers with experience in AWS or GCP to join our dynamic team. You will apply your expertise in programming languages and data engineering tools to support data migration and transformation initiatives.

Key Responsibilities:

  • Develop data processing applications using Python, Java, or Scala with a strong focus on Spark.
  • Support the team lead in gathering requirements and delivering data solutions that meet business needs.
  • Optimize data workflows for performance, scalability, and reliability.
  • Implement data governance, security, and compliance best practices within cloud environments.
  • Document architecture, data models, and workflows to facilitate knowledge sharing within the team.

Skills and Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 6+ months of experience as a Data Engineer.
  • Proficiency in programming languages: Java, Scala, and/or Python.
  • Experience with SQL and NoSQL databases.
  • Ability to work independently and collaboratively in a fast-paced environment.
  • Problem-solving skills and attention to detail.
  • Keen interest in learning and adapting.

What We Offer:

NucleusTeq culture - Our positive and supportive culture encourages our associates to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices to help them be healthy, centered, confident, and aware. We offer well-being programs and continuously seek new ways to maintain a culture where our people excel and lead healthy, happy lives.

We bring to the table:

  • Competitive salary and benefits package.
  • Hybrid work model.
  • Opportunity to work with cutting-edge technologies.
  • Collaborative and inclusive work environment.
  • Professional development and growth opportunities.