**WORK TO BE PERFORMED:**
* Participate in Requirements Gathering: work with key business partner groups and other Data Engineering personnel to understand department-level data requirements for the analytics platform.
* Design Data Pipelines: work with other Data Engineering personnel on an overall design for moving data from various internal and external sources into the analytics platform.
* Build Data Pipelines from scratch with Python and SQL: leverage the standard toolset and develop ETL/ELT code to move data from various internal and external sources into the analytics platform (see the sketch after this list).
* Support Data Quality Program: work with the Data QA Engineer to identify automated QA checks and associated monitoring.
* Key contributor to defining, implementing, and supporting:
  * Data Services
  * Data Dictionary
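
As a rough illustration of the ETL/ELT work described above, here is a minimal Python-and-SQL sketch of the extract/transform/load pattern. All file, table, and column names are hypothetical, and SQLite stands in for the actual analytics platform; a real pipeline on this team would target the GCP services listed below.

```python
import csv
import sqlite3

# Hypothetical source file and target database for illustration only.
SOURCE_CSV = "orders_export.csv"
TARGET_DB = "analytics.db"

def extract(path):
    """Read raw rows from a source extract."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Normalize types and skip rows that fail basic validation."""
    for row in rows:
        try:
            yield (row["order_id"], row["customer_id"], float(row["amount"]))
        except (KeyError, ValueError):
            # A production pipeline would route bad rows to a reject table
            # and surface them through the data quality monitoring.
            continue

def load(records, db_path):
    """Idempotent load into the target table using plain SQL."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders ("
        "order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", records)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)
```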
**SKILLS AND EXPERIENCE REQUIRED:**
* Strong ETL/ELT designer/developer
* Expert-level SQL and Python
* Performance tuning with SQL
* Structured & unstructured data expertise
* Cloud environment development & operations experience (GCP, AWS)
* Preference for candidates experienced with:
  * Google Cloud Platform (GCP) and associated services, e.g., BigQuery, GCS, Cloud Composer, Dataproc, Dataflow, Dataprep, Cloud Pub/Sub, Metadata DB, Data Studio, Datalab, and other tools such as Apache Airflow (scheduler) and Bitbucket (see the DAG sketch after this list)
  * Real-time data replication/streaming tools
  * Data Modeling
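
For context on the Cloud Composer / Apache Airflow item above, here is a minimal sketch of how a scheduled pipeline might be defined as an Airflow DAG. The DAG id, task id, and load function are hypothetical; scheduling, ownership, and the actual load step (e.g., a BigQuery load job) would follow the team's standards.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_warehouse():
    # Placeholder for the actual extract/load step.
    print("loading batch into the analytics platform")

# Hypothetical DAG: one daily task that runs the load callable.
with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_orders",
        python_callable=load_to_warehouse,
    )
```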
**ONBOARDING REQUIREMENT**
Successful completion of a background check.
**INTERVIEW PROCESS**
* One phone interview followed by one MS Teams interview. Candidates must be local to Chicago, IL, as onsite work will be required post-COVID-19.