
Senior Data Scientist

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Proficiency in Python and its libraries
  • Experience with R for statistical analysis
  • Knowledge of Adobe Analytics and AWS
  • Familiarity with Snowflake and machine learning tools
  • Understanding of data governance and security

Key responsibilities:

  • Analyze, process, and model data
  • Create actionable insights for decision-making
  • Advance AI and machine learning capabilities
  • Collaborate with cross-functional teams
  • Automate routine tasks using AI
Devsu · Computer Software / SaaS SME · https://www.devsu.com/
51 - 200 Employees

Job description

We are seeking a skilled and passionate Senior Data Scientist to join our dynamic team. You will play a crucial role in leveraging data to drive business insights and strategy, working remotely from anywhere in Latin America.

As a Data Scientist, you will work closely with cross-functional teams to analyze, process, and model data, using various tools and technologies to create actionable insights. You will be integral in advancing our AI and machine learning capabilities, enhancing our data analytics tech stack, and contributing to strategic decision-making processes.

Requirements

  • Python: Essential for data analysis, machine learning, automation scripts, and statistical analysis.
    • Proficiency in Python libraries such as Pandas, NumPy, Scikit-Learn, and TensorFlow.
  • R: Useful for statistical analysis and data visualization.
    • Proficiency in R packages like ggplot2 and dplyr.
  • Adobe Analytics: Integral for tracking and analyzing marketing and web traffic data.
    • Understanding of Adobe Experience Cloud, reporting, and insights generation.
  • Amazon Web Services (AWS): Cloud infrastructure for data storage, processing, and various other computing needs.
    • Proficiency in services such as S3 (for storage), EC2 (for computing), Lambda (serverless functions), Redshift (data warehousing), and AWS Glue (data catalog and ETL).
  • Snowflake: Data warehouse solution utilized for storing and querying large datasets.
    • Leveraging Snowflake for efficient querying, data storage management, and integration with other tools.
  • Machine Learning: Essential for predictive analytics, data modeling, and AI integration.
    • Proficiency in tools such as Scikit-Learn, TensorFlow, and PyTorch, and understanding of ML lifecycle management with tools like MLflow and Kubeflow.
  • Automation and AI Integration: Experience automating routine tasks with AI and machine learning to improve productivity and insights.
  • Data Governance and Security: Implement a robust framework using tools like Apache Ranger and AWS IAM for data security and governance.
  • Skill Development: Commitment to continuous learning via courses and certifications in new technologies, e.g., AWS Certified Data Analytics or the Udacity Machine Learning Nanodegree.
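To give a concrete sense of the core stack listed above, here is a minimal illustrative sketch of a train-and-evaluate workflow using Pandas, NumPy, and Scikit-Learn. The dataset, feature names, and model choice are invented for illustration and are not part of the role description:

```python
# Illustrative sketch only: a minimal modeling loop with the
# Pandas / NumPy / Scikit-Learn portion of the stack.
# The data, feature names, and target are hypothetical.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "sessions": rng.poisson(5, n),              # hypothetical web-traffic feature
    "avg_duration": rng.normal(3.0, 1.0, n),    # hypothetical engagement feature
})
# Synthetic binary target loosely driven by the two features
df["converted"] = (
    df["sessions"] + df["avg_duration"] + rng.normal(0, 1, n) > 8
).astype(int)

# Hold out a test set, then fit a scaled logistic regression pipeline
X_train, X_test, y_train, y_test = train_test_split(
    df[["sessions", "avg_duration"]], df["converted"], random_state=0
)
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
acc = model.score(X_test, y_test)
print(f"holdout accuracy: {acc:.2f}")
```

In practice, the same pattern extends to the rest of the stack: features might be queried from Snowflake or S3 rather than generated, and the fitted pipeline logged with an experiment tracker such as MLflow.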
Nice to Have
  • Data Visualization Tools: Tableau, Power BI, D3.js
    • Creating interactive dashboards, compelling visualizations, and generating insights.
  • Big Data Technologies: Apache Spark, Hadoop Ecosystem (Hive, Pig, HDFS).
    • Processing large datasets efficiently, distributed storage and processing.
  • ETL Tools: Apache NiFi, Talend, Airflow.
    • Designing and managing robust data integration and transformation workflows.
  • Version Control Systems: Git.
    • Collaborative coding, version tracking, and managing changes in code.
  • APIs and Web Development: RESTful APIs, GraphQL, Flask, Django.
    • Data integration, developing services, web applications, and ML model deployment.
  • DevOps and CI/CD: Jenkins, GitLab CI, CircleCI, Docker, Kubernetes.
    • Continuous integration, delivery, containerization, and orchestration.
  • Statistical Analysis: MATLAB, SAS, Stata.
    • Advanced statistical analysis, simulation, and econometrics.
  • Real-Time Data Processing: Kafka, Flink.
    • Managing real-time data streams and processing.

Benefits

  • A stable, long-term contract
  • Continuous training
  • Private health insurance stipend
  • Flexible schedule
  • Work with some of the most talented software engineers in Latin America and the US, doing challenging work and building world-class software for clients in the US and around the world

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Computer Software / SaaS
Spoken language(s): English

Other Skills

  • Training And Development
