Data Engineer (Databricks/Python, Ukraine) #14713

Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

  • Bachelor's degree in Computer Science or related field
  • Minimum 2 years of experience with Azure Data Factory
  • Proficiency in Databricks, Python, and SQL
  • Familiarity with Azure DevOps and data formats
  • English proficiency at B2 level or higher

Key responsibilities:

  • Develop and manage data pipelines using ADF
  • Process, transform, and optimize data with Databricks, Python, and SQL
  • Collaborate with teams to implement data solutions during sprints
  • Store and organize data in Azure Storage Accounts
  • Automate workflows to enhance data processing efficiency
Capgemini Engineering (Information Technology & Services, XLarge, 10001 Employees)
https://www.capgemini.com/

Job description

Purpose Of The Job

We are seeking a proactive Mid-Level Data Engineer with a minimum of 3 years of experience in data ingestion, processing, and management within cloud environments, particularly the Azure ecosystem. The ideal candidate will collaborate with agile teams using methodologies like Scrum to develop and optimize our data infrastructure.

Main Tasks And Responsibilities

  • Develop and manage data pipelines using Azure Data Factory (ADF).
  • Process, transform, and optimize data with Databricks, Python, and SQL.
  • Collaborate with Product Owners and Tech Leaders to implement data solutions during agile sprints.
  • Store and organize data in Azure Storage Accounts.
  • Automate workflows to enhance data ingestion and processing efficiency.
  • Document processes and implement best practices across the data pipeline.

Education, Skills And Experience

MUST HAVE:

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • Minimum of 2 years of experience with Azure Data Factory (ADF) for orchestrating data pipelines.
  • Proficiency in Databricks, Python, and SQL.
  • Familiarity with Azure DevOps and data formats like Parquet.
  • Understanding of data transfer protocols such as SFTP.
  • Experience working in agile teams using Scrum methodologies.
  • English proficiency at B2 level or higher.

Nice To Have

  • Experience with data integration or ETL tools such as Informatica PowerCenter, Talend, SSIS, NiFi, or similar platforms.
  • Knowledge of ingesting data from SAP R/3 systems via IDocs.
  • Familiarity with tools like Jira, Confluence, and schedulers like Control-M.
  • Understanding of data security practices in cloud environments.

Additional Skills

  • Strong communication and collaboration abilities.
  • Proactive in documenting and improving data workflows.
  • Team-oriented with a commitment to excellence.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Collaboration
  • Problem Solving
  • Verbal Communication Skills
