
Databricks Data Engineer | Mid/Senior

Remote: Full Remote

Offer summary

Qualifications:

  • Experience in creating and maintaining ETL pipelines for analytical databases.
  • Proficiency in SQL (DML, DDL, DQL) and SQL databases (Oracle, SQL Server, PostgreSQL).
  • Experience with Python, Spark, PySpark, and Spark SQL.
  • Knowledge of Azure Databricks, Azure Data Factory, and AWS.

Key responsibilities:

  • Design and develop scalable data pipelines using Databricks (PySpark) for processing large datasets.
  • Collaborate with engineering and data science teams to understand requirements and provide efficient solutions.
  • Implement complex data transformations, data cleaning, and aggregations using PySpark.
  • Ensure data quality and integrity by implementing testing and monitoring practices.

Compass.uol
5001 - 10000 Employees

Job description



RESPONSIBILITIES AND ASSIGNMENTS

  • Design and develop scalable data pipelines using Databricks (PySpark) for processing and analyzing large datasets;
  • Collaborate with engineering and data science teams to understand requirements and deliver efficient solutions;
  • Implement complex data transformations, data cleaning, and aggregations using PySpark, and optimize code performance;
  • Use Python's ElementTree for efficient XML data manipulation and integration of heterogeneous data;
  • Develop Python scripts using Pandas to manipulate and analyze structured data;
  • Ensure data quality and integrity by implementing testing and monitoring practices;
  • Collaborate on defining and implementing data engineering and data architecture best practices;
  • Maintain comprehensive technical documentation for the processes and solutions implemented.
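As a hedged illustration of the XML-manipulation duty above, the sketch below uses Python's standard-library ElementTree to flatten a small XML feed into rows that could feed a Pandas DataFrame or a Spark job. The feed layout, element names, and field names are hypothetical, not part of the posting.

```python
import xml.etree.ElementTree as ET

# Hypothetical order feed; element and field names are illustrative only.
xml_doc = """
<orders>
  <order id="1"><customer>Ana</customer><total>120.50</total></order>
  <order id="2"><customer>Bruno</customer><total>80.00</total></order>
</orders>
"""

def parse_orders(xml_text):
    """Flatten an XML order feed into a list of dicts,
    ready to load into a DataFrame or a Spark job."""
    root = ET.fromstring(xml_text)
    rows = []
    for order in root.iter("order"):
        rows.append({
            "id": order.get("id"),
            "customer": order.findtext("customer"),
            "total": float(order.findtext("total")),
        })
    return rows

rows = parse_orders(xml_doc)
print(rows[0]["customer"])  # Ana
```

Flattening to plain dicts keeps the parsing step independent of any one downstream engine, which matters when the same feed must serve both Pandas scripts and PySpark pipelines.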

REQUIREMENTS AND QUALIFICATIONS

  • Experience in building and maintaining ETL pipelines for analytical databases;
  • Proficiency in SQL (DML, DDL, DQL) and SQL databases (Oracle, SQL Server, PostgreSQL);
  • Experience with Python, Spark, PySpark, and Spark SQL;
  • Knowledge of Azure Databricks, Azure Data Factory, Azure Dataflow, and Synapse Analytics;
  • Knowledge of AWS;
  • Experience with data virtualization across diverse sources (SQL, NoSQL, ServiceNow, CSV/JSON files, etc.);
  • Skills in developing ETL processes and workflows;
  • Experience with containers and orchestrators such as Airflow and Control-M;
  • Knowledge of Data Governance and Data Architecture;
  • Experience with monitoring and results-measurement methodologies;
  • Proficiency in Git workflow (Git, GitHub) and scripting;
  • Nice to have: experience ingesting and migrating legacy databases to the cloud (on-premises Oracle/SQL Server/PostgreSQL to cloud).
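To make the DDL/DML/DQL requirement concrete, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for the Oracle/SQL Server/PostgreSQL databases named above; the table and column names are invented for illustration.

```python
import sqlite3

# In-memory SQLite stands in for Oracle/SQL Server/PostgreSQL;
# the sales table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define the target analytical table.
cur.execute("""
    CREATE TABLE sales (
        region TEXT NOT NULL,
        amount REAL NOT NULL
    )
""")

# DML: load a small batch of rows.
cur.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("south", 100.0), ("south", 50.0), ("north", 75.0)],
)

# DQL: aggregate per region, as an ETL summary step might.
cur.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY region
""")
result = cur.fetchall()
print(result)  # [('north', 75.0), ('south', 150.0)]
```

The same three statement classes (CREATE, INSERT, SELECT with GROUP BY) carry over to the production databases listed in the posting with only dialect-level changes.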


Don't meet all the requirements for the role?


That's okay! At Compass UOL, we encourage the continuous development of new talent and turn challenges into opportunities.


ADDITIONAL INFORMATION


#remote

"remote"


DREAM BIG WHEN IT COMES TO TECHNOLOGY. BE A COMPASSER! 🚀

Compass UOL is a global company that is part of AI/R, which drives the transformation of organizations through Artificial Intelligence, Generative AI, and Digital Technologies.


We design and build digitally native platforms using cutting-edge technologies to help companies innovate, transform businesses, and drive success in their markets. With a focus on attracting and developing the best talent, we create opportunities that improve lives and highlight the positive impact of disruptive technologies on society.


That's why our selection process goes beyond technical skills. Our goal is to find unique individuals with the potential to make an extraordinary impact on our clients.


We empower talent without borders and promote knowledge and opportunities in the latest market trends, driving significant results.


Join us and be part of the AI-driven digital revolution in the technology universe.


HOW OUR SELECTION PROCESS WORKS

1. ONLINE APPLICATION
Choose the opportunity that best fits your goals. Remember: having a well-detailed profile with your experiences and knowledge can make all the difference!
2. INTERVIEWS
Learn about our culture and company! During interviews, be present and do your best to share your expertise in a chronological and structured way.
3. EVALUATION
Our tests and assessments focus on finding talent with the cultural and technical fit for the position applied for.
4. FEEDBACK
Wait for our response regardless of the result! We are feedback-certified on the Gupy platform.


Required profile

Experience

Spoken language(s):
Portuguese, English
Check the description to see which languages are mandatory.

Other Skills

  • Collaboration
  • Problem Solving
