What You’ll Do

● Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions using full-stack development tools and technologies

● Partner with business teams and communicate with product managers to automate business processes, build visualizations on key performance indicators, and provide real-time business insights to improve business performance

● Perform ad hoc analysis on data and provide recommendations on ways to refine existing data models and improve both management and governance of data in cooperation with backend teams

● Utilize relational databases such as PostgreSQL, NoSQL databases, and cloud-based database and data warehousing services such as Amazon RDS, Redshift, and Snowflake

● Write code in programming languages like Python, backed by high-quality unit tests, and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Basic Qualifications

● Data Engineering Experience

○ Previous project experience with big data technologies

○ Knowledge of Agile engineering practices

○ PostgreSQL 11.3 or later

● Application Development Experience

○ UNIX/Linux including basic commands and shell scripting

○ Strong programming fundamentals using Python 3

○ Working knowledge of Python libraries like pandas, NumPy, SQLAlchemy, and Dask

● AWS Cloud Experience

○ Experience with visualization tools like Tableau and Amazon QuickSight

○ Data warehousing experience (Redshift or Snowflake)

Preferred Qualifications

● Certifications

○ AWS Certified Developer – Associate

○ AWS Certified Big Data – Specialty

● Prior Projects

○ Hands-on experience building end-to-end data pipelines for operational dashboards (an illustrative sketch of this kind of pipeline follows this list)

○ Hands-on experience building interactive dashboards for operational reporting and decision making
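
For illustration only, below is a minimal Python sketch of the kind of end-to-end pipeline referenced in the prior-projects bullets: pulling operational data out of PostgreSQL with SQLAlchemy and pandas, then writing a daily KPI aggregate that a Tableau or QuickSight dashboard could read. The connection string, the orders table, and every column name are hypothetical placeholders, not details from this posting.

```python
# Illustrative sketch only: pull operational data from PostgreSQL and build a
# daily KPI table that a dashboard could read. The connection string, table,
# and column names below are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical RDS-hosted PostgreSQL instance.
ENGINE = create_engine(
    "postgresql+psycopg2://user:password@example-host:5432/analytics"
)


def load_daily_kpis(engine) -> pd.DataFrame:
    """Read raw order events and aggregate them into daily, per-region KPIs."""
    raw = pd.read_sql(
        "SELECT order_id, order_ts, region, amount FROM orders",  # hypothetical schema
        con=engine,
        parse_dates=["order_ts"],
    )
    return (
        raw.assign(order_date=raw["order_ts"].dt.date)
        .groupby(["order_date", "region"], as_index=False)
        .agg(order_count=("order_id", "count"), revenue=("amount", "sum"))
    )


if __name__ == "__main__":
    daily = load_daily_kpis(ENGINE)
    # Land the aggregate where a Tableau or QuickSight dashboard could pick it up.
    daily.to_sql("daily_order_kpis", con=ENGINE, if_exists="replace", index=False)
    print(daily.head())
```

In practice, a job like this would typically run on a schedule and land its output in a warehouse such as Redshift or Snowflake rather than back in the source database.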