Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proficiency in SQL and experience with data warehousing solutions.
Familiarity with programming languages such as Python or Java.
Experience with ETL tools and data pipeline development.
Key responsibilities:
Design and implement scalable data pipelines to support analytics.
Collaborate with data scientists and analysts to understand data needs.
Monitor and optimize data systems for performance and reliability.
Ensure data quality and integrity throughout the data lifecycle.
Optimizing Critical Infrastructures. Our smart solutions for electricity network fault and load management, monitoring, and asset management enable electricity networks to run more smoothly, safely, and sustainably. Together we are engineering better futures.