
Data Platform Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 3+ years of experience in Data Platform Engineering
  • Strong software engineering skills in Python; knowledge of C++, Java, Go, or Rust is a plus
  • Proven expertise in ETL and stream processing tools such as Kafka, Spark, or Airflow
  • Familiarity with cloud services (AWS/GCP) and data science tools such as Jupyter and Pandas

Key responsibilities:

  • Build and operate data platforms using public cloud infrastructure and containers.
  • Develop data science platforms leveraging open-source software and cloud services.
  • Design and run ETL tools and frameworks to onboard data and monitor data quality.
  • Manage mission-critical production services and perform related duties as assigned.

TheoremOne SME https://www.theoremone.co/
201 - 500 Employees

Job description

Formula.Monks, part of Media.Monks and S4 Capital, is a global consulting firm mastering AI-powered transformations for the Fortune 100. We combine long-term strategic thinking, deep enterprise experience, and a human-centered approach to help clients transform business processes and dominate their industries.

About the Role
As a Data Platform Engineer, you’ll build cutting-edge data platforms that ingest, manage, and process data from various businesses. This role supports a wide range of use cases, including customer-facing APIs and large-scale machine learning models.

Responsibilities
  • Build and operate data platforms using public cloud infrastructure (AWS, GCP), Kafka, databases, and containers
  • Develop data science platforms leveraging open-source software and cloud services
  • Design and run ETL tools and frameworks to onboard data, define schemas, create DAG processing pipelines, and monitor data quality
  • Aid in the development of machine learning frameworks and pipelines
  • Manage mission-critical production services
  • Perform other related duties as assigned

Qualifications & Skills
  • 3+ years of relevant experience in a Data Platform Engineering role
  • Strong software engineering skills with Python; additional experience in C++, Java, Go, or Rust is a plus
  • Proven expertise in building ETL and stream processing tools with technologies like Kafka, Spark, Flink, or Airflow/Prefect
  • Proficient in SQL and databases/engines such as MySQL, PostgreSQL, Snowflake, Redshift, or Presto
  • Familiarity with data science tools (e.g., Jupyter, Pandas, Scikit-learn, PyTorch) and frameworks like MLFlow or Kubeflow
  • Hands-on experience with AWS/GCP services, Kubernetes, and Linux in production environments
  • Strong inclination toward automation and DevOps practices
  • Capable of managing increasing data volume, velocity, and variety
  • Agile, self-starter mindset with strong communication skills
  • Ability to navigate ambiguity and prioritize tasks effectively
  • Participation in on-call support beyond standard business hours
  • Strong English skills (B2)

Required profile

    Spoken language(s): English

    Other Skills

    • Communication
    • Time Management
    • Problem Solving
