
Mid-Level Software Engineer / Data

Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

  • Experience with DBT and Airflow
  • Advanced SQL skills for data handling
  • Proficiency in Python programming
  • Knowledge of cloud platforms like AWS
  • Experience in data modeling and ETL

Key responsibilities:

  • Troubleshoot data pipeline issues
  • Collaborate with cross-functional teams
  • Develop and deploy data workflows using DBT
  • Optimize and manage data pipelines in Airflow
  • Stay updated on technology trends
WEX (https://www.wexinc.com/)
5001 - 10000 Employees

Job description

About the Team/Role

WEX Inc. is a leading global provider of payment processing services, information management, and fleet card payment solutions. 

We hire individuals who share our passion for continuous innovation, learning, and client service that is unparalleled in the industry.

If you are looking for a rewarding and challenging career, come be part of WEX today!

We are a group dedicated to enabling electronic payments to simplify the business of running a business. Our goal is to use technology to make electronic payments as simple and as seamless as possible.

Our team consists of small, agile development teams focused on building state-of-the-art solutions using the best technology available. We focus on cloud-based microservice solutions that deliver fast, scalable results and allow our business partners to compete in a competitive marketplace.

Our team of professionals works hard, has a continuous growth mindset, and maintains a work-life balance that is a cultural cornerstone of the WEX organization. We own our results and take pride in everything we do.

 

How you'll make an impact

  • Problem-solving mindset: Ability to troubleshoot and resolve complex data pipeline issues

  • Strong communication skills: Capable of articulating technical concepts to non-technical stakeholders

  • Collaboration: Experience working in cross-functional teams, including data analysts, data scientists, and software engineers

  • Continuous Learning: Commitment to staying updated on industry trends, best practices, and new technologies

Experience you'll bring

  • DBT (Data Build Tool): Experience in developing, testing, and deploying data transformation workflows using DBT

  • Airflow: Solid knowledge in building, scheduling, and maintaining data pipelines using Apache Airflow

  • DAG Management: Proven experience in designing, implementing, and optimizing Directed Acyclic Graphs (DAGs) within Airflow

  • SQL Mastery: Advanced SQL skills, with the ability to write complex queries and optimize performance for large datasets.

  • Data Modeling: Experience with data warehousing concepts, dimensional modeling, and ETL processes

  • Cloud Platforms: Hands-on experience with cloud-based data platforms (e.g., AWS, GCP, or Azure) for data storage, processing, and orchestration

  • Programming Skills: Proficiency in Python for scripting and automation, especially within the context of data pipelines and Airflow DAGs.

  • Solid experience as a Data Engineer or in a related role, with a focus on data pipeline development and orchestration

  • Proven track record of working with DBT, Airflow, and DAGs in a production environment

  • Experience with version control systems (e.g., Git) for collaborative development and deployment of data transformations

  • Experience with a modern microservice framework (.NET, Spring, etc.)

  • Knowledge of CI/CD processes for automated testing and deployment of data pipelines.
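To give a flavor of the DAG management skills listed above: the core idea behind an Airflow DAG is a set of tasks ordered by their upstream dependencies. A minimal sketch of that concept, using only the Python standard library (the task names below are hypothetical, not taken from any WEX pipeline):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how Airflow resolves a DAG into an execution order.
pipeline = {
    "extract": set(),                    # no upstream tasks
    "transform": {"extract"},            # runs after extract
    "test": {"transform"},               # dbt-style tests after transform
    "load": {"transform", "test"},       # runs last
}

# static_order() yields tasks so every dependency comes before its dependents.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

In Airflow itself the same ordering would be declared with operators and the `>>` dependency syntax inside a `DAG` definition; the topological-sort view above is just the underlying model.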

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Problem Solving
  • Collaboration
  • Verbal Communication Skills
