Role Description:

Experience: 8+ years

Location: Indore / Bangalore / Hyderabad

Technical Requirements:

Skills needed: Python, Spark, AWS with EMR, Iceberg, Git, Airflow, dbt, Snowflake, Trino, Linux/Unix

Project Scope: Understand the existing code base built on AWS services and SQL, and convert it to a tech stack based primarily on Airflow, Iceberg, and Python/SQL.
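For context, a minimal sketch of the kind of orchestration pattern the target stack implies (an Airflow DAG submitting a PySpark job that writes to an Iceberg table). The DAG name, connection ID, script path, and catalog settings below are illustrative placeholders, not project specifics:

    # Illustrative only: Airflow DAG triggering a Spark job that writes to Iceberg.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    with DAG(
        dag_id="daily_orders_to_iceberg",   # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        load_orders = SparkSubmitOperator(
            task_id="load_orders",
            application="s3://example-bucket/jobs/load_orders.py",  # placeholder PySpark script
            conn_id="spark_emr",                                    # assumed Spark-on-EMR connection
            conf={
                # Enable Iceberg support so the Spark job can write to an Iceberg catalog
                "spark.sql.extensions": "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
                "spark.sql.catalog.glue": "org.apache.iceberg.spark.SparkCatalog",
                "spark.sql.catalog.glue.catalog-impl": "org.apache.iceberg.aws.glue.GlueCatalog",
            },
        )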


Qualifications:

  • Designing and building data models to support business requirements
  • Developing and maintaining data ingestion and processing systems
  • Implementing data storage solutions (databases and data lakes)
  • Ensuring data consistency and accuracy through data validation and cleansing techniques
  • Collaborating with cross-functional teams to identify and address data-related issues
  • Proficiency in Python, with big data experience in Spark and orchestration experience with Airflow
  • AWS experience is required.
  • Knowledge of database management systems (e.g., MySQL)
  • Strong problem-solving and analytical skills
  • Excellent communication and collaboration abilities

NucleusTeq culture: Our positive and supportive culture encourages our associates to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives.