ID 3957 – Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of professional experience in data engineering or business intelligence.
  • Proficiency in Python and experience with ETL orchestration tools like Airflow.
  • Expertise in SQL, database fundamentals, and distributed computing.
  • Experience with Snowflake, Redshift, and streaming technologies such as Kafka.

Key responsibilities:

  • Collaborate with business partners to understand data requirements.
  • Design and implement high-performance data models and pipelines for Data Lake and Data Warehouse.
  • Develop data quality checks and conduct QA for monitoring routines.
  • Manage a portfolio of data products ensuring high-quality and trustworthy data.

CONEXIONHR - Recruiting Company
Human Resources, Staffing & Recruiting | SME | https://www.conexion-hr.com/
51 - 200 Employees

Job description

Job Category: Data
Job Location: Argentina

The Data Engineering team builds database solutions for a range of use cases, including reporting, product analytics, marketing optimization, and financial reporting. By implementing pipelines, data structures, and data warehouse architectures, this team serves as the foundation for decision-making in the company.
We’re looking for Data Engineers to build and maintain a large-scale, 24×7 global infrastructure system that powers a 3-sided marketplace of Consumers, Merchants, and Dashers. The technology stack for this role includes Snowflake, Airflow, Python, AWS, Git, and Jira.

Responsibilities and Tasks:
● Work with business partners and stakeholders to understand data requirements.
● Work with engineering, product teams and 3rd parties to collect required data.
● Design, develop and implement large scale, high volume, high performance data models and pipelines for Data Lake and Data Warehouse.
● Develop and implement data quality checks, conduct QA and implement monitoring routines.
● Improve the reliability and scalability of our ETL processes.
● Manage a portfolio of data products that deliver high-quality, trustworthy data.
● Help onboard and support other engineers as they join the team.

Hard Skills / Need to Have:
● 5+ years of professional experience working in data engineering, business intelligence, or a similar role.
● Proficiency in programming languages such as Python.
● 3+ years of experience in ETL orchestration and workflow management tools like Airflow, using AWS.
● Expertise in database fundamentals, SQL, and distributed computing.
● 3+ years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto) and streaming technologies such as Kafka/Flink.
● Experience working with Snowflake, Redshift, PostgreSQL and/or other DBMS platforms.

Perks:
● Referral bonus.
● Tuition Reimbursement.
● English lessons with native teacher.
● Home office + optional coworking space.
● 2 days off per year.
● 3 sabbatical weeks every 3 years with the company.
● 3 weeks vacation.
● Birthday gift.
● Computer plus a 400 USD home office budget per year.

Required profile

Industry: Human Resources, Staffing & Recruiting
Spoken language(s): English