
Senior Data Scientist

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 7+ years of experience in data science and machine learning.
  • Strong programming skills in Python, SQL, and Spark.
  • Hands-on experience with MLflow, PyTorch, and FastAPI for model development.
  • Knowledge of deep learning architectures and MLOps best practices.

Key responsibilities:

  • Design the architecture for the data analytics platform.
  • Develop scalable data models and data pipelines.
  • Ensure integration of various data sources and implement modern data platform components.
  • Collaborate with data scientists and engineers to ensure platform usability.

Intellectsoft | Computer Software / SaaS | SME | https://www.intellectsoft.net/
51-200 employees

Job description

Intellectsoft is a software development company delivering innovative solutions since 2007. We operate across North America, Latin America, the Nordic region, the UK, and Europe. We specialize in industries such as Fintech, Healthcare, EdTech, Construction, and Hospitality, partnering with startups, mid-sized businesses, and Fortune 500 companies to drive growth and scalability. Our clients include Jaguar Motors, Universal Pictures, Harley-Davidson, Qualcomm, and the London Stock Exchange. Together, our team delivers solutions that make a difference. Learn more at www.intellectsoft.net.

Requirements

  • 7+ years of experience in data science, machine learning, and statistical modeling.
  • Strong programming skills in Python, SQL, and Spark.
  • Hands-on experience with MLflow, PyTorch, Spark MLlib, and FastAPI for model development and deployment.
  • Understanding of distributed computing and big data processing using Apache Spark and ClickHouse.
  • Proficiency in feature engineering, data preprocessing, and model tuning for large-scale datasets.
  • Experience in building and deploying ML models in production environments using TorchServe, FastAPI, or similar frameworks.
  • Knowledge of deep learning architectures (CNNs, RNNs, transformers) and their practical applications.
  • Strong grasp of MLOps best practices, including CI/CD for ML models, model monitoring, and retraining pipelines.
  • Understanding of real-time analytics and event-driven architectures for processing streaming data.
  • Experience working with SQL and NoSQL databases such as PostgreSQL, ClickHouse, and Delta Lake.
  • Strong ability to collaborate with data engineers, architects, and business analysts to ensure ML models align with business objectives.
  • Knowledge of A/B testing methodologies and causal inference techniques for evaluating model effectiveness.
  • Familiarity with cloud services (AWS, GCP, or Azure) for scalable model training and deployment.

Responsibilities:

  • Design the architecture for the open-source-based data analytics platform.
  • Develop scalable data models, data pipelines, and data lakes.
  • Ensure integration of various data sources, including Kafka, NiFi, Apache Airflow, and Spark.
  • Implement modern data platform components like Apache Iceberg, Delta Lake, ClickHouse, and PostgreSQL.
  • Define and enforce data governance, security, and compliance best practices.
  • Optimize data storage, access, and retrieval for performance and scalability.
  • Collaborate with data scientists, engineers, and business analysts to ensure platform usability.

Benefits

  • 35 absence days per year for work-life balance
  • Udemy courses of your choice
  • English courses with a native speaker
  • Regular soft-skills training sessions
  • Excellence Centers meetups
  • Online and offline team-building events

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry :
Computer Software / SaaS
Spoken language(s):
English

Other Skills

  • Collaboration
  • Problem Solving
