Offer summary
Qualifications:
- Experience in data engineering with Scala
- Knowledge of Apache Spark
- Familiarity with Java and microservices
- Experience with the Parquet storage format

Key responsibilities:
- Write and maintain efficient code in Scala
- Develop large-scale data processing solutions using Apache Spark
- Implement and optimize data storage in the Parquet format (see the sketch after this list)
- Collaborate with multidisciplinary teams on data integration
- Analyze and solve complex data processing issues
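For context on the kind of work involved, below is a minimal, illustrative Scala sketch of a Spark job that filters a dataset and stores the result in Parquet format. The file paths, column name, and object name are hypothetical placeholders, not details from the offer.

```scala
// Minimal Spark job sketch in Scala: read CSV input, apply a simple
// transformation, and write the result as Parquet.
// Paths and the "status" column are hypothetical examples.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object EventsToParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("EventsToParquet")
      .getOrCreate()

    // Read raw input; schema inference keeps the example short.
    val events = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("events.csv")

    // Keep only completed events, a stand-in for real business logic.
    val completed = events.filter(col("status") === "completed")

    // Write columnar Parquet output for efficient downstream reads.
    completed.write
      .mode("overwrite")
      .parquet("output/events_parquet")

    spark.stop()
  }
}
```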