Senior Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in data engineering with a focus on SQL and PostgreSQL.
  • Expertise in ETL pipeline design and data transformation.
  • Proficiency in Python for scripting and automation.
  • Deep knowledge of Apache Airflow for orchestrating data workflows.

Key responsibilities:

  • Unify and optimize data infrastructure for efficiency and maintainability.
  • Redesign ETL workflows for improved scalability and performance.
  • Ensure data integrity and accessibility to support business needs.
  • Act as a subject-matter expert on best practices in data engineering.

MAPEGY
11 - 50 Employees

Job description

Location: Remote
Employment Type: Flexible – Open to freelancers, agencies, or full-time candidates

Why Join Us?
We are a leading provider of Business Intelligence in Innovation, Research & Development. Our platform helps enterprises uncover trends, track technology, and analyze competitors — more easily, faster, and smarter.

✔ We prioritize expertise and results over location or contract type.
✔ Work from anywhere, ideally within European time zones to facilitate collaboration.
✔ Join a highly skilled, passionate team that thrives on data and innovation.
✔ Enjoy fair compensation, professional growth, and an engaging work environment.

About the Role
We’re looking for a Senior Data Engineer to help us optimize SQL-heavy workflows, improve ETL processes, and streamline data pipelines. This is an opportunity for a highly experienced expert, freelancer, or agency to collaborate with us on an impactful level. While we offer flexibility in working arrangements, we are invested in a long-term collaboration with the right professional.

If you have deep technical expertise, a problem-solving mindset, and a proven ability to deliver results, we’d love to hear from you!

What You’ll Do
- Unify and optimize data infrastructure to enhance efficiency and maintainability.
- Redesign and streamline ETL workflows for improved scalability and performance.
- Ensure data integrity, reliability, and accessibility to support business needs.
- Optimize PostgreSQL queries and indexing strategies to improve system performance.
- Automate and orchestrate workflows using Python and Apache Airflow.
- Act as a subject-matter expert, guiding the team on best practices in data engineering.
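The query-tuning responsibility above can be illustrated with a minimal sketch. SQLite (from Python's standard library) stands in for PostgreSQL here, and the `events` table and `idx_events_user` index are invented for the example, but the workflow carries over: inspect the query plan, add an index on the filtered column, and confirm the planner uses it.

```python
import sqlite3

# Hypothetical events table; SQLite stands in for PostgreSQL here.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
cur.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT * FROM events WHERE user_id = 42"

# Without an index, the planner must scan every row.
before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][-1]

# An index on the filtered column turns the full scan into an index seek.
cur.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][-1]

print(before)  # e.g. "SCAN events"
print(after)   # e.g. "SEARCH events USING INDEX idx_events_user (user_id=?)"
```

Against PostgreSQL itself, the same before/after check is done with `EXPLAIN (ANALYZE, BUFFERS)`.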

What We’re Looking For
Must-Have Skills:
✔ Expertise in SQL and PostgreSQL, with a strong focus on query optimization.
✔ Hands-on experience in ETL pipeline design, data transformation, and workflow automation.
✔ Proficiency in Python for scripting and automation.
✔ Deep knowledge of Apache Airflow for orchestrating data workflows.
✔ Strong understanding of data cleaning, deduplication, and integration best practices.
✔ Ability to collaborate effectively within European time zones.

Bonus Skills (Nice to Have):
+ Experience with document databases for handling large-scale data.
+ Strong problem-solving skills and ability to troubleshoot data issues.
+ A good sense of data architecture best practices for efficiency.

Your Background
5+ years of experience in data engineering, working with SQL-heavy workflows, ETL pipelines, and PostgreSQL-based systems. 

If you're a highly skilled expert, freelancer, or agency looking for a long-term collaboration with a cutting-edge company, reach out to us at hr@mapegy.com.

Required profile

Experience

Spoken language(s):
French

Other Skills

  • Collaboration
  • Problem Solving
