Offer summary
Qualifications:
- Bachelor's degree in IT or a related field.
- Advanced knowledge of programming (Python, Spark) and SQL.
- Experience with cloud data ecosystems (Azure Data Factory, Databricks).
- Familiarity with relational databases (Oracle, SQL Server, DB2).
- Knowledge of data ingestion and architecture.
Key responsibilities:
- Develop data pipelines ensuring efficiency and high-quality delivery.
- Define data transformation flows and storage structures in Delta Lake (see the sketch after this list).
- Ensure data integrity and quality through controls and validations.
- Establish standards and best practices using modern data engineering techniques.
- Monitor and optimize performance and scalability of workflows.
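For illustration only, here is a minimal PySpark sketch of the kind of pipeline this role involves, assuming a Databricks-style environment where Delta Lake support is preconfigured; the source path, table name, and validation rules are hypothetical examples, not part of the actual role description.

```python
# Minimal sketch of an ingest -> validate -> persist pipeline.
# Paths, table names, and quality rules below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw data (placeholder source path).
raw = spark.read.format("json").load("/mnt/raw/orders")

# Integrity controls: keep rows passing basic validations,
# set aside the rest for review.
valid = raw.filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
rejected = raw.subtract(valid)

# Transform and persist as a Delta Lake table.
(valid
 .withColumn("ingested_at", F.current_timestamp())
 .write.format("delta")
 .mode("append")
 .saveAsTable("curated.orders"))
```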