Data Platform Engineer

Remote: Full Remote
Salary: 100 - 100K yearly
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

Experience in Data Engineering (ETL, OLAP), Infrastructure and DevOps, Analytics Engineering, and Backend Engineering; proficient in Python and SQL.

Key responsibilities:

  • Develop and manage robust data pipelines
  • Own data infrastructure using IaC and DevOps
  • Maintain data lakehouse system with governance
  • Develop real-time data applications and microservices
  • Manage CI/CD pipelines for code integration
ZFX Financial Services Scaleup https://www.zfx.com/
501 - 1000 Employees

Job description

About Us:

Awarded The Best FinTech Trading Platform of the Year 2019.

The Zeal group of companies (collectively, Zeal Group) is a business portfolio of the parent company Zeal Holdings Limited, comprising regulated financial institutions and fintech companies that specialize in multi-asset liquidity solutions in regulated markets, backed by proprietary technology.

We are a people-focused business, and our team of 500+ professionals globally is dedicated to maximizing the success of our employees and customers. Our headquarters is in the UK, with a global presence in 12 countries across Asia, the Middle East, and Europe, and 22 offices internationally.

Our estimated monthly trading volume averages USD 100B, executed by 100,000 retail investors, professional traders, and financial institution clients.

Our team is focused on building a state-of-the-art data platform for our company. This platform will centralize, process, and analyze vast amounts of data, empowering all employees with the critical insights they need to drive strategic decision-making.

We are a small but dedicated team of five data engineers and analysts. We are highly skilled professionals, each bringing a unique perspective and set of expertise to the table. We operate in a collaborative, agile environment where every member's contribution is valued.

In our team, we treat data as a product, emphasizing high-quality engineering practices, stringent data quality measures, and robust data governance. We are committed to delivering a scalable and reliable data platform that meets the needs of all stakeholders, utilizing best-in-class data tools to ensure optimal results.

Our mission is to empower every employee with the data they need, thereby driving innovation, improving efficiency, and fostering a data-driven culture within the company.

About the Data Team

As we grow, we are seeking a proficient Data Platform Engineer: someone who not only delves deep into data but also amplifies its value by educating and collaborating. If you're passionate about transforming data into actionable business insights and spreading data literacy, we want to meet you.


Requirements

  • Experience in Data Engineering (Data Pipelines, ETL, Data Quality, OLAP, Big data frameworks)
  • Experience in Infrastructure and DevOps (Cloud infrastructure, Terraform, k8s, Docker, Prometheus)
  • Experience in Analytics Engineering (Data Modelling, Data warehouses, Workflow Managers)
  • Experience in Backend Engineering (APIs, microservices, OLTP storage, stream processing)
  • Experience with continuous integration tools (GitLab CI, TeamCity, Jenkins)
  • Experience with Python, SQL


Responsibilities

  • Develop and manage robust data pipelines, ensuring high data quality
  • Develop and own the data infrastructure using IaC and DevOps practices
  • Develop and own the data lakehouse system through effective data modeling and data governance
  • Develop and maintain real-time data applications (microservices and stream processing systems)
  • Develop and maintain CI/CD pipelines for streamlining code integration, testing, and deployment


Nice to have

  • Experience with Google Cloud Platform
  • Experience in Business Intelligence (BI tools, Dashboards, Visualizations)
  • Experience in Data Analytics (Statistical Analysis, Presentation skills)
  • Experience in Machine Learning (ML/DL techniques, Feature engineering, MLOps)


What we offer

  • State-of-the-art technologies (Google Cloud Platform, BigQuery, Dagster, dbt, DataHub, Spark, Python, SQL)
  • A collaborative and friendly team of professionals
  • High engineering standards (with a focus on change management and data quality)
  • Flexible schedule
  • Generous vacation policy

Benefits

  • 21 paid days of holiday per year, plus 10 additional days for national holidays
  • Training opportunities for growth and expansion of knowledge
  • Ability to work remotely
  • Flexible and hybrid schedule – we value work-life balance
  • Referral Bonus Program

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Financial Services
Spoken language(s): English

Other Skills

  • Collaboration
