Data Engineer (L3) – Program Tech Lead

Remote: Full Remote
Contract:
Work from:

Offer summary

Qualifications:

  • Minimum of 4 years of professional experience in data engineering or a similar role.
  • Proficiency in programming languages such as Python, and expertise in database fundamentals.
  • Over 7 years of experience with ETL orchestration and workflow management tools such as Airflow and Flink.
  • Strong leadership and communication skills to manage a team effectively.

Key responsibilities:

  • Lead the migration of data pipelines from Snowflake to Databricks.
  • Design, develop, and optimize ETL workflows and data pipelines.
  • Collaborate with cross-functional teams to understand database requirements for successful migration.
  • Implement best practices for data engineering to ensure high performance and reliability of data systems.

Grupo Data Portugal · Information Technology & Services Startup
11 - 50 Employees

Job description

Nice to meet you! We are Grupo DATA!

Our purpose is to simplify our clients' lives, and we do this all over the world through our IT solutions.

We are a multinational that operates actively in Portugal, and we are constantly expanding!

We love working with large companies and helping the companies we collaborate with grow.

We are committed to being the change we want to see in the market's large corporations, starting by valuing people and their ideas.

We have learned that there is no single way to make things work, that each of our employees has unique characteristics, and that, to work as a team, we really need to get to know one another.

Who are we looking for?


Description:
Program Tech Lead – Databricks

We are seeking a highly skilled Tech Lead to spearhead the migration of data pipelines from Snowflake to Databricks. The ideal candidate has extensive experience in data engineering, ETL orchestration, and database management, along with strong proficiency in programming and distributed computing.

Key Responsibilities:

Lead the migration of data pipelines from Snowflake to Databricks.
Design, develop, and optimize ETL workflows and data pipelines.
Collaborate with cross-functional teams to understand database requirements and ensure successful migration.
Implement best practices for data engineering and ensure high performance and reliability of data systems.
Identify opportunities to optimize and reduce costs associated with data storage and processing.

Skills:

Very good English - C1
Minimum of 4 years of professional experience in data engineering, business intelligence, or a similar role.
Proficiency in programming languages such as Python.
Over 7 years of experience with ETL orchestration and workflow management tools such as Airflow, Flink, Oozie, and Azkaban on AWS or GCP.
Expertise in database fundamentals, Tableau or SQL, and distributed computing.
At least 4 years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto).
Experience working with Snowflake, Redshift, PostgreSQL, Tableau and/or other DBMS platforms.
Lead and mentor a team of engineers, fostering a collaborative and productive work environment.
Apply Scrum methodologies to manage project workflows and deliverables efficiently.

Very good Tableau / Python / SQL skills – these will be validated with real-time coding tests.
Minimum of 4 years of experience with the technologies of the role.

Note:

Strong leadership and communication skills to manage and guide a team of engineers.

Full remote
Sector: Communication Services

Required profile

Experience

Industry :
Information Technology & Services
Spoken language(s):
Portuguese
Check out the description to know which languages are mandatory.

Other Skills

  • Leadership
  • Communication
