Offer summary
Qualifications:
- Experience with a major distributed data processing framework (e.g. Spark, Dask)
- 6+ months working with a streaming dataflow framework (e.g. Flink, Kafka Streams)
- Ability to set up distributed dataflows independently
- Working familiarity with streaming technologies such as Kafka and CDC, and with Kubernetes
- Knowledge of Python, SQL, and innovative tech environments
Key responsibilities:
- Implement data flows from clients' warehouses to Pathway's ingress
- Set up CDC interfaces for change streams and design ETL pipelines (see the sketch after this list)
- Contribute to benchmarks and open-source testing frameworks for streaming data
- Work closely with the CTO, Head of Product, and key developers
- Make a significant contribution to the company's success
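
As an illustration of the CDC work above, here is a minimal sketch of consuming a Debezium-style change stream from Kafka in Python with kafka-python. The broker address, topic name, and record fields are illustrative assumptions, not details from the offer; a production pipeline would hand the transformation off to a streaming framework such as Pathway rather than a hand-rolled loop.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Assumed local broker and hypothetical CDC topic, for illustration only.
consumer = KafkaConsumer(
    "warehouse.public.orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    change = message.value
    # A Debezium envelope carries the row state before and after the change;
    # keep inserts and updates, skip deletes (where "after" is null).
    after = change.get("payload", {}).get("after")
    if after is None:
        continue
    # Trivial ETL step: normalize a (hypothetical) field before forwarding
    # the record downstream, e.g. to Pathway's ingress.
    after["status"] = str(after.get("status", "")).lower()
    print(after)  # stand-in for the downstream sink
```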