Role: Big Data Developer
Location: Phoenix, AZ
Duration: 12+ Months

Skills:

  • PySpark, Python, GCP
  • Strong experience with Spark and DataFrames
  • BigQuery background, including data processing for large volumes (see the illustrative sketch after this list)
  • Strong Python background
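
As a hedged illustration of how these tools typically come together (this is a minimal sketch only; the project, dataset, table, column, and bucket names are placeholders, and it assumes the spark-bigquery-connector is available on the cluster, for example on GCP Dataproc):

    # Minimal illustrative sketch: read a BigQuery table into a Spark DataFrame,
    # process it, and write the result back. All names below are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bigquery-dataframe-example").getOrCreate()

    # Read a (hypothetical) BigQuery table into a Spark DataFrame.
    events = (
        spark.read.format("bigquery")
        .option("table", "my_project.my_dataset.events")
        .load()
    )

    # Typical large-volume processing: filter, then aggregate by day.
    daily_counts = (
        events.filter(F.col("event_type") == "purchase")
        .groupBy(F.to_date("event_ts").alias("event_date"))
        .count()
    )

    # Write the result back to BigQuery via a temporary GCS staging bucket.
    (
        daily_counts.write.format("bigquery")
        .option("table", "my_project.my_dataset.daily_purchase_counts")
        .option("temporaryGcsBucket", "my-temp-bucket")
        .mode("overwrite")
        .save()
    )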

Qualifications:

  • Degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field (or equivalent work experience)
  • 6+ years of experience working as a big data engineer (required); must be able to articulate the use cases supported and the outcomes driven.
  • Knowledge of the following (candidates are expected to demonstrate these skills live during the interview): PySpark, Spark, Python, Scala, Hive, Pig, and MapReduce.
  • Experience with AWS and GCP.
  • Experience in SQL.
  • Large-scale data engineering and business intelligence delivery experience
  • Experience designing large-scale, enterprise-level big data platforms
  • Experience working with and performing analysis using large data sets
  • Demonstrated experience working on a mature, self-organizing agile team