Offer summary
Qualifications:
- 8+ years of experience
- Expertise in AWS, Spark, Python, SQL, PySpark, and ETL

Key responsibilities:
- Collaborate with business analysts and stakeholders
- Design and develop Airflow DAGs for ETL workflows (a minimal DAG sketch follows this list)
- Write PySpark scripts and custom Python functions (a PySpark transform sketch follows this list)
- Monitor and troubleshoot ETL pipelines to ensure smooth operation
- Stay updated on new data engineering trends
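
To illustrate the Airflow responsibility, here is a minimal sketch of a daily ETL DAG, assuming Airflow 2.x; the DAG id, schedule, and the extract/transform callables are hypothetical placeholders, not part of the offer.

```python
# Minimal sketch of a daily ETL DAG (Airflow 2.x assumed); names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw data from a source system (e.g. an S3 landing bucket).
    pass


def transform(**context):
    # Placeholder: kick off a PySpark job that cleans and aggregates the raw data.
    pass


with DAG(
    dag_id="example_daily_etl",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task     # run extract before transform
```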
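
For the PySpark responsibility, a short transform sketch of the kind such a script might contain; the S3 paths and column names (customer_id, event_ts, amount) are assumptions for illustration only.

```python
# Minimal PySpark transform sketch; paths and column names are assumed examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Read raw events (hypothetical S3 path), keep valid rows, aggregate per customer/day.
raw = spark.read.parquet("s3://example-bucket/raw/events/")
daily_totals = (
    raw.filter(F.col("amount") > 0)
       .groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
       .agg(F.sum("amount").alias("total_amount"))
)

# Write the curated output back to a hypothetical curated zone.
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")
```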