Offer summary
Qualifications:
- Proficiency in Python
- Experience with PySpark
- Advanced experience with NoSQL and relational databases
- Strong SQL skills for Data Lake/Lakehouse environments
- Intermediate experience with cloud platforms, preferably AWS
Key responsibilities:
- Participate in building the largest freight data platform in Latin America
- Develop and support data pipelines (ETL/ELT/EL)
- Automate processes in AWS cloud environment
- Support Data Community teams in Analytics flows
- Ensure code compliance with engineering standards