TheoremOne (now Formula.Monks) is an innovation and engineering company that advises clients on product strategy, engineering, design, and culture, then partners with them to build and launch technology-driven solutions to their most complex problems. Clients choose TheoremOne when results matter most: it becomes the agent of change, driving a transformation that involves not only technology but also people, process, and leadership. Founded in 2007 and headquartered in Los Angeles, Theorem's global team of engineers, designers, technologists, researchers, strategists, and advisors has deep expertise across a broad variety of industries, including consumer electronics, automotive, manufacturing, supply chain, healthcare, finance, and entertainment. Learn more at www.theoremone.co
Formula.Monks, part of Media.Monks and S4 Capital, is a global consulting firm mastering AI-powered transformations for the Fortune 100. We combine long-term strategic thinking, deep enterprise experience, and a human-centered approach to help clients transform business processes and dominate their industries.
About the Role
As a Data Platform Engineer, you’ll build cutting-edge data platforms that ingest, manage, and process data from various businesses. This role supports a wide range of use cases, including customer-facing APIs and large-scale machine learning models.
Responsibilities
Build and operate data platforms using public cloud infrastructure (AWS, GCP), Kafka, databases, and containers
Develop data science platforms leveraging open-source software and cloud services
Design and run ETL tools and frameworks to onboard data, define schemas, create DAG processing pipelines, and monitor data quality
Aid in the development of machine learning frameworks and pipelines
Manage mission-critical production services
Perform other related duties as assigned
Qualifications & Skills
3+ years of relevant experience in a Data Platform Engineering role
Strong software engineering skills with Python; additional experience in C++, Java, Go, or Rust is a plus
Proven expertise in building ETL and stream processing tools with technologies like Kafka, Spark, Flink, or Airflow/Prefect
Proficient in SQL and databases/engines such as MySQL, PostgreSQL, Snowflake, Redshift, or Presto
Familiarity with data science tools (e.g., Jupyter, Pandas, Scikit-learn, PyTorch) and frameworks like MLflow or Kubeflow
Hands-on experience with AWS/GCP services, Kubernetes, and Linux in production environments
Strong inclination toward automation and DevOps practices
Capable of managing increasing data volume, velocity, and variety
Agile, self-starter mindset with strong communication skills
Ability to navigate ambiguity and prioritize tasks effectively
Willingness to participate in on-call support beyond standard business hours
Strong English skills (B2)