We are seeking a highly skilled and experienced Data Engineer to join our
team. As a Data Engineer, you will play a crucial role in managing and
optimizing our data systems and infrastructure. You will be responsible for
ensuring a scalable data architecture, synchronizing data across various
platforms, developing and maintaining the reporting platform, and implementing
efficient data-loading practices. Your expertise in relational and document
databases, as well as in data architecture, will be key to maintaining the
integrity and performance of our data systems.
**Key Responsibilities**
* Design and implement efficient ETL (Extract, Transform, Load) processes that move data between systems while ensuring high performance and data quality
* Develop and curate aggregated datasets for reporting, analysis, and model training
* Design, implement, and manage processes for data synchronization across different systems and platforms
* Ensure data is stored in formats that facilitate quick retrieval and efficient analysis (e.g., using columnar storage formats like Parquet or ORC)
* Develop and maintain a scalable and efficient data reporting platform to provide ad hoc and standardized reporting
* Ensure compliance with data protection regulations like GDPR, CCPA, HIPAA, or industry-specific guidelines
* Design and implement robust data-loading processes in live transactional systems
* Collaborate with other teams and stakeholders to understand data needs
* Develop and optimize data models (both relational and non-relational) that support the organization’s data requirements and analytical goals
**Required Skills and Qualifications**
* Bachelor’s degree in computer science or equivalent experience
* 5 years of relevant experience as a Data Engineer or in a similar role
* Strong knowledge of MS SQL and experience with relational databases
* Knowledge of NoSQL document databases
* Experience building and curating datasets to facilitate reporting, analysis, and model training
* Experience working in a regulated industry such as finance or healthcare
* Familiarity with data integration and ETL (Extract, Transform, Load) tools and experience implementing data pipelines from scratch
* Experience with data warehousing or data lake solutions
* Excellent analytical and problem-solving skills
* Strong communication and collaboration abilities
* Experience with cloud-based data solutions (e.g., AWS, Azure, GCP)
* Knowledge of programming languages such as Python or C#
* Familiarity with reporting tools and data visualization
This individual’s success will be measured by their ability to show leadership
in setting a strategic direction for the data platform, to mentor development
staff in implementing successful data access patterns, and to complete
assigned strategic objectives.