Offer summary
Qualifications:
- 5-8 years of software/data engineering experience
- Intermediate to advanced SQL and Python skills
- Experience with data APIs, ETL, and distributed computing
- Bachelor's degree in Computer Science or a related field
- Familiarity with Databricks and AWS preferred
Key responsibilities:
- Design and implement scalable data ingestion flows
- Analyze complex data flows and design data models
- Lead implementation of data transformation orchestration layer
- Develop systems for data quality and exception handling
- Research new tools for database maintenance and efficiency