Offer summary
Qualifications:
- Minimum of 3 years of experience as a data engineer
- At least 2 years of data model design and pipeline building
- Skilled in dbt, SQL, Python, and PySpark
- Knowledgeable in Azure data services and/or AWS
- Familiarity with Databricks Lakehouse is a plus
Key responsibilities:
- Design and build data ingestion and transformation pipelines
- Communicate effectively with stakeholders
- Participate in the full project lifecycle, from design to implementation
- Investigate new technologies and share knowledge
- Drive multiple projects within the Big Data Community