
Mid-Level Data Engineer

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

Proficiency in Python, SQL, and data modeling; experience with DBT, Airflow, and DAGs.

Key responsibilities:

  • Develop, test, and deploy data transformation workflows
  • Design, implement, and optimize Directed Acyclic Graphs (DAGs)
  • Collaborate with cross-functional teams
WEX (https://www.wexinc.com/)
5001 - 10000 employees

Job description

About the Team/Role

WEX Inc. is a leading global provider of payment processing services, information management, and fleet card payment solutions. 

We hire individuals who share our passion for continuous innovation, learning, and client service that is unparalleled in the industry.


If you are looking for a rewarding and challenging career, come be part of WEX today!

We are a group dedicated to enabling electronic payments to simplify the business of running a business. Our goal is to use technology to make electronic payments as simple and as seamless as possible.


Our team consists of small, agile development teams focused on building state-of-the-art solutions using the best technology available. We focus on cloud-based microservice solutions that deliver fast, scalable results, allowing our business partners to compete in a crowded marketplace.


Our team of professionals works hard, maintains a continuous-growth mindset, and keeps the work-life balance that is a cultural cornerstone of the WEX organization. We own our results and take pride in everything we do.


How you'll make an impact

  • Problem-solving mindset: Ability to troubleshoot and resolve complex data pipeline issues

  • Strong communication skills: Capable of articulating technical concepts to non-technical stakeholders

  • Collaboration: Experience working in cross-functional teams, including data analysts, data scientists, and software engineers

  • Continuous Learning: Commitment to staying updated on industry trends, best practices, and new technologies

Experience you'll bring

  • DBT (Data Build Tool): Experience developing, testing, and deploying data transformation workflows using DBT

  • Airflow: Solid knowledge of building, scheduling, and maintaining data pipelines using Apache Airflow

  • DAG Management: Proven experience designing, implementing, and optimizing Directed Acyclic Graphs (DAGs) within Airflow (a minimal sketch follows this list)

  • SQL Mastery: Advanced SQL skills, with the ability to write complex queries and optimize performance for large datasets

  • Data Modeling: Experience with data warehousing concepts, dimensional modeling, and ETL processes

  • Cloud Platforms: Hands-on experience with cloud-based data platforms (e.g., AWS, GCP, or Azure) for data storage, processing, and orchestration

  • Programming Skills: Proficiency in Python for scripting and automation, especially within the context of data pipelines and Airflow DAGs

  • Solid experience as a Data Engineer or in a related role, with a focus on data pipeline development and orchestration

  • Proven track record of working with DBT, Airflow, and DAGs in a production environment

  • Experience with version control systems (e.g., Git) for collaborative development and deployment of data transformations

  • Knowledge of CI/CD processes for automated testing and deployment of data pipelines
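
For illustration, here is a minimal sketch of the kind of work these requirements describe: an Airflow 2.x DAG that orchestrates a DBT project by building the models and then running DBT's built-in tests. The DAG id, schedule, and project directory are hypothetical placeholders, not WEX's actual setup.

# A minimal sketch (hypothetical names and paths): an Airflow DAG that
# runs DBT transformations and then validates them with DBT tests.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transformations",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                    # Airflow 2.4+; run once per day
    catchup=False,                        # do not backfill past runs
) as dag:
    # Build the DBT models (the data transformation workflows above).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )

    # Validate the transformed data with DBT's built-in tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    # The directed acyclic dependency: tests run only after the build succeeds.
    dbt_run >> dbt_test

In a production environment like the one described, a DAG of this shape would typically gain retries, alerting, and CI/CD-driven deployment on top of the basic run-then-test dependency.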

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English

Other Skills

  • Verbal Communication Skills
  • Problem Solving
  • Collaboration
