
Senior Data Engineer (Azure Data Factory, Ukraine) #14712

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science or a related field
  • Minimum of 5 years of experience in cloud data management
  • 3 years of experience with Azure Data Factory and data pipeline orchestration
  • Proficiency in Databricks, Python, and SQL
  • English proficiency at B2 level or higher

Key responsibilities:

  • Design and manage complex data pipelines using Azure Data Factory
  • Lead data processing and optimization efforts with Databricks, Python, and SQL
  • Collaborate with Product Owners to architect advanced data solutions
  • Automate workflows for data ingestion efficiency
  • Mentor mid-level engineers and enhance team skills
Capgemini Engineering | Information Technology & Services | 10001 Employees | https://www.capgemini.com/

Job description

Purpose Of The Job

We are looking for an experienced Senior Data Engineer with a minimum of 5 years of experience in data ingestion, processing, and management within cloud environments, especially the Azure ecosystem. The ideal candidate will lead and mentor agile teams, driving the development and optimization of our data infrastructure.

Main Tasks And Responsibilities

  • Design, develop, and manage complex data pipelines using Azure Data Factory (ADF).
  • Lead efforts in processing, transforming, and optimizing data with Databricks, Python, and SQL (a brief illustrative sketch follows this list).
  • Collaborate closely with Product Owners and Tech Leaders to architect advanced data solutions in agile sprints.
  • Oversee the storage and organization of data in Azure Storage Accounts.
  • Automate and enhance workflows to maximize data ingestion and processing efficiency.
  • Establish and enforce best practices throughout the data pipelines.
  • Mentor mid-level data engineers and contribute to team skill development.
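
As a rough illustration of the Databricks, Python, and SQL work these responsibilities describe, the sketch below shows a minimal PySpark job that reads Parquet files landed in an Azure Storage container, applies basic cleansing, and writes a curated table. The storage path, column names, and table name are hypothetical placeholders, not details taken from this posting.

from pyspark.sql import SparkSession, functions as F

# Minimal illustrative sketch; the path, columns, and table name are hypothetical.
spark = SparkSession.builder.appName("orders-ingestion").getOrCreate()

# Read raw Parquet files from an Azure Data Lake Storage (ADLS Gen2) container.
raw = spark.read.parquet("abfss://raw@examplestorage.dfs.core.windows.net/orders/")

# Basic cleansing: drop duplicate orders, normalize dates, keep valid amounts.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

# Persist the curated result as a table for downstream SQL consumers.
cleaned.write.mode("overwrite").saveAsTable("curated.orders")

In an Azure Data Factory pipeline, a job of this kind would typically run as a scheduled activity following the ingestion step.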

Education, Skills And Experience

MUST HAVE:

  • Bachelor's degree in Computer Science, Information Systems, or a related field (Master's degree is a plus).
  • Minimum of 3 years of experience with Azure Data Factory (ADF) and complex data pipeline orchestration.
  • Deep proficiency in Databricks, Python, and SQL.
  • Strong experience with Azure DevOps, data formats like Parquet, and data transfer protocols such as SFTP.
  • Proven leadership in agile teams using Scrum methodologies.
  • English proficiency at B2 level or higher.

Nice To Have

  • Advanced experience with data integration or ETL tools such as Informatica PowerCenter, Talend, SSIS, NiFi, or similar platforms.
  • Familiarity with project management tools like Jira, Confluence, and advanced schedulers like Control-M.
  • Strong knowledge of data security practices and data governance in cloud environments.

Additional Skills

  • Exceptional communication and leadership abilities.
  • Proactive in driving innovation and improving data workflows.
  • Ability to collaborate effectively with cross-functional teams.
  • Dedication to continuous learning and staying updated with industry trends.

Required profile

Experience

Level of experience: Senior (5-10 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Leadership
