Bachelor’s degree in computer science or equivalent experience.
4+ years of industry experience in data engineering and building data solutions.
Experience with Azure services including Synapse, Data Factory, and Databricks.
Strong analytic skills and the ability to design and implement well-written code.
Key responsibilities:
Design and maintain scalable data pipelines using Azure tools.
Collaborate with cross-functional teams for data integration and delivery.
Ensure data quality and governance using tools like Power BI.
Troubleshoot and maintain data infrastructure and support analytics applications.
We specialize in the search and selection of IT personnel.
JAVA-.NET-REACT-Node-Mobile-SAP-ORACLE-PHP-Analistas Funcionales-QA-BI-Project Managers
rrhh@dabrein.com
Responsibilities
Design, develop, and maintain robust and scalable data pipelines using Azure Data Factory, Databricks, and other Azure-native tools.
Collaborate with cross-functional teams, including software developers, database architects, and data analysts, to ensure seamless data integration and delivery.
Transform raw data into clean, structured datasets optimized for analysis and reporting.
Develop solutions that enable real-time and batch data processing using Azure Synapse and high-speed indexing technologies.
Implement best practices for CI/CD pipelines, automated testing, version control, and production deployment of data solutions.
Ensure data quality, consistency, and governance by using tools such as Power BI and Microsoft Purview.
Troubleshoot and maintain data infrastructure and support analytics applications and data warehouse environments.
Support ongoing data initiatives by ensuring a consistent and reliable data architecture across projects.
Contribute to architectural decisions and solution design, leveraging expertise in data modeling, distributed systems, and automation.
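The transformation responsibilities above (cleaning raw records, enforcing a business key, deduplicating to the latest version) can be sketched in plain Python. This is an illustrative sketch only; the field names `order_id`, `order_date`, and `amount` are assumptions, not part of the role.

```python
from datetime import datetime

def clean_orders(raw_rows):
    """Clean raw order records: drop rows missing the business key,
    parse types, and keep only the most recent record per order_id.
    Field names are hypothetical examples."""
    latest = {}
    for row in raw_rows:
        order_id = row.get("order_id")
        if order_id is None:
            continue  # the business key is required
        try:
            order_date = datetime.strptime(row.get("order_date", ""), "%Y-%m-%d")
        except (ValueError, TypeError):
            continue  # unparseable dates would go to a quarantine step in practice
        amount = float(row.get("amount") or 0.0)
        prev = latest.get(order_id)
        # Deduplicate on the key, keeping the most recent record
        if prev is None or order_date > prev["order_date"]:
            latest[order_id] = {
                "order_id": order_id,
                "order_date": order_date,
                "amount": amount,
            }
    return sorted(latest.values(), key=lambda r: r["order_id"])
```

In a real Azure pipeline the same logic would typically live in a Databricks notebook or Data Factory data flow rather than plain Python, but the cleansing steps are the same.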
Requirements
4+ years of industry experience in data engineering and building world-class data solutions.
Excellent communication skills and a passion for collaborating with a diverse team, helping foster an environment of effective teamwork, communication, and commitment.
Ensure that data is cleansed, mapped, transformed, and otherwise optimized for storage and use according to business and technical requirements.
Develop and maintain innovative Azure solutions
Solution design using Microsoft Azure services and other tools
The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.)
Load transformed data into storage and reporting structures in destinations including data warehouses, high-speed indexes, real-time reporting systems, and analytics applications.
Build data pipelines that bring together data from multiple sources.
Other responsibilities include extracting data and troubleshooting and maintaining the data warehouse.
Assist software developers, database architects, data analysts, and data scientists with data initiatives, ensuring a consistent data delivery architecture across ongoing projects.
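The "production standard code" item in the list above usually means every transformation ships with unit tests that run in CI. A minimal sketch using Python's built-in unittest; the `normalize_country` helper and its mapping are hypothetical examples, not tools named in this posting.

```python
import unittest

def normalize_country(value):
    """Map free-form country strings to a canonical code.
    The mapping here is illustrative only."""
    mapping = {"usa": "US", "united states": "US", "uk": "GB"}
    return mapping.get(value.strip().lower(), "UNKNOWN")

class TestNormalizeCountry(unittest.TestCase):
    def test_known_alias_is_normalized(self):
        # Whitespace and case are stripped before lookup
        self.assertEqual(normalize_country(" USA "), "US")

    def test_unknown_value_is_flagged(self):
        # Unmapped values get a sentinel rather than passing through silently
        self.assertEqual(normalize_country("atlantis"), "UNKNOWN")

if __name__ == "__main__":
    unittest.main()
```

In an Azure DevOps pipeline, tests like these would typically run on every pull request before the code is promoted to production.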
Qualifications
Bachelor’s degree in computer science or equivalent experience.
Knowledge of data warehouse fact and dimension modeling concepts
Experience with Azure Synapse, Visual Studio, Azure DevOps, CI/CD pipelines, Data Factory, Python, Databricks, and SQL
Experience with data tools such as Power BI and Purview
Expertise with cloud automation tooling such as ARM Templates
Experience with distributed (multi-tiered) systems and databases
Strong analytic skills related to working with unstructured datasets
The ability to design and implement well-written code
Ability to work to tight deadlines
Ability to test data from the source through to the presentation layer
Ability to support/troubleshoot data pipelines
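The source-to-presentation testing qualification above is often implemented as an automated reconciliation check: keys and measure totals in the presentation layer should match the source extract. A minimal sketch in plain Python; the field names `order_id` and `amount` are assumptions for illustration.

```python
def reconcile(source_rows, presentation_rows, key="order_id", measure="amount"):
    """Compare a source extract against the presentation layer:
    every source key should arrive, no extra keys should appear,
    and the measure total should be unchanged."""
    src_keys = {r[key] for r in source_rows}
    dst_keys = {r[key] for r in presentation_rows}
    src_total = sum(r[measure] for r in source_rows)
    dst_total = sum(r[measure] for r in presentation_rows)
    return {
        "missing_keys": sorted(src_keys - dst_keys),      # in source, not presented
        "unexpected_keys": sorted(dst_keys - src_keys),   # presented, not in source
        "measure_delta": dst_total - src_total,           # 0.0 means totals agree
    }
```

A check like this can run after each pipeline load; a non-empty key list or nonzero delta would fail the run before downstream reports consume bad data.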
Required profile
Spoken language(s):
English