Data Engineer at Zeelo

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor’s degree in a quantitative field; advanced degrees are a plus.
  • Minimum of 3 years of data engineering experience in a commercial environment.
  • Proficiency in SQL and experience with SQL-based transformation flows in dbt or similar tools.
  • Good understanding of cloud platforms such as GCP, AWS, or Azure.

Key responsibilities:

  • Design, build, and maintain scalable and reliable data pipelines.
  • Manage Zeelo’s serverless centralized data architecture and support analytical functions across the business.
  • Identify and deliver improvements to data collection, management, and leveraging of internal and external data sources.
  • Educate team members on data and analytics, promoting data use throughout the company.

Zeelo Startup https://zeelo.co/
51 - 200 Employees

Job description

Data Engineer

Remote - UK based

As a Data Engineer, you will lay the groundwork for extracting value from data by developing and maintaining powerful, efficient data infrastructure. You will also help Zeelo’s Data Team become the company’s center of excellence on data, responsible for educating other teams, establishing best practices, and facilitating knowledge sharing on data-related matters across the company.

Zeelo and its clients have many data analytics needs, all supported by Zeelo’s data team. You will support data analysts by engineering ETL pipelines that deliver useful, clean, clear, and timely data ready for analytics use. Zeelo’s data maturity is growing, and you will help incorporate streaming and AI tools into the business effectively. You will also maintain Zeelo’s serverless data architecture and improve its functionality and efficiency.

What we want you to know about Zeelo:

Zeelo is on a mission to make shared transportation more accessible, efficient, and sustainable. We’re scaling fast, and this is a chance to help shape the future of our technology in a role where you’ll have real ownership and impact.

  • Zeelo is a transit-tech company powering bus operators, employers and schools to provide highly efficient, sustainable, and affordable transport programs.
  • Our mission is to empower opportunity through sustainable transportation.
  • Our vision is to build the category leader for employers and schools offering transportation as a benefit.
  • Our culture strives to match that of a high-performing sports team.
  • We are inspired by the “Ubuntu” mindset: I am, because we are.
  • Our model is asset-light: we do not own vehicles or employ drivers; instead, we procure bus operator partners to provide ground transportation.
  • We’re a team of 130+ across 3 offices (London, Barcelona & Boston), and our transit services are live in 2 markets (UK & US).
  • Our values are Trust, Efficiency, and Drive.

What will I be doing?
  • Design, build, and maintain scalable and reliable data pipelines.
  • Manage Zeelo’s serverless centralized data architecture (Fivetran, BigQuery, dbt, and other tools) that supports analytical functions across the business.
  • Design, build, and maintain ETL, ELT, and other data pipelines that support analytics use cases.
  • Identify improvements in how Zeelo collects, manages and leverages internal and external data sources.
  • Identify and deliver improvements to scalability and cost.
  • Optimize queries and pipelines for cost and performance.
  • Develop data infrastructure to power accurate and efficient analytics.
  • Be a champion of data and analytics within the company, educating team members and supporting data use throughout the business.
  • Work with transportation data including location data, scheduling data, ridership data, and financial data.
  • Write clear documentation on the mechanics of the data architecture.
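To give a flavour of the pipeline work described above, here is a minimal, self-contained sketch of an extract-transform step over ridership data. This is illustrative only, not Zeelo’s actual stack or schema: the record layout, route codes, and field names are hypothetical, and in practice this logic would live in dbt/BigQuery or an orchestrated Python task rather than a standalone script.

```python
import json

# Hypothetical raw ridership records, as they might arrive from a booking API.
raw_records = [
    {"rider_id": "r1", "route": "LDN-01", "boarded_at": "2024-05-01T08:05:00"},
    {"rider_id": "r2", "route": "LDN-01", "boarded_at": None},  # incomplete row
    {"rider_id": "r3", "route": "BOS-02", "boarded_at": "2024-05-01T08:30:00"},
]

def transform(records):
    """Clean the raw feed and aggregate ridership per route."""
    counts = {}
    for rec in records:
        if rec["boarded_at"] is None:
            continue  # cleaning step: drop rows missing a boarding timestamp
        counts[rec["route"]] = counts.get(rec["route"], 0) + 1
    return counts

ridership_by_route = transform(raw_records)
print(json.dumps(ridership_by_route))  # prints {"LDN-01": 1, "BOS-02": 1}
```

The same clean-then-aggregate shape is what a dbt model expresses declaratively in SQL; the point of the role is building and maintaining such steps so downstream analysts receive clean, timely tables.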

Skills and experience we’re looking for:
  • Bachelor’s degree in a quantitative field. Advanced degrees are a plus.
  • 3+ years of data engineering experience in a commercial environment.
  • Proficiency in SQL.
  • Experience building SQL-based transformation flows in dbt or similar tools.
  • Good understanding of cloud platforms such as GCP, AWS or Azure.
  • Experience configuring orchestration of SQL and Python via Airflow or similar tools.
  • Experience working with data pipelines, defining problems, crafting and launching solutions, and practicing continuous improvement. Experience with process improvement frameworks and/or project management frameworks is a plus.
  • Experience maintaining a data warehouse including adding features to improve utility and refactoring to reduce costs.
  • Knowledge of data modeling best practices.
  • Experience with REST APIs.
  • Experience building unit tests or working with testing frameworks.
  • Experience with data governance and security.
  • A passion for sustainability, technology, and improving mobility.
  • Experience developing in Python (optional).
  • Experience with transportation systems (optional).




Required profile

Experience

Spoken language(s):
English

Other Skills

  • Teamwork
  • Communication
  • Problem Solving
