Senior Data Analytics Engineer (worldwide remote, work anywhere)

extra holidays - fully flexible
Remote: Full Remote

Offer summary

Qualifications:

  • 4+ years of experience in analytics/data engineering roles, preferably in a B2B SaaS context.
  • Deep knowledge of SQL and data normalization best practices.
  • Hands-on experience with Airflow, dbt, and Snowflake.
  • Excellent communication skills and a proactive problem-solving mindset.

Key responsibilities:

  • Partner with various teams to define and implement accessible data models.
  • Transform raw data into clean, analytics-ready models using dbt.
  • Ensure data consistency and integrity across multiple systems.
  • Collaborate on designing self-serve Looker experiences and debug data issues.

CloudLinux · Information Technology & Services · Scaleup · http://www.cloudlinux.com
51 - 200 Employees

Job description

We’re a newly formed data & analytics team of three. Together, we’ve successfully built the foundation of our modern data platform using Airflow, Snowflake, and Looker. We're actively cleaning, aligning, and integrating multiple data sources — including marketing, finance, and product usage data — into a centralized and governed system.

We’re looking for a Senior Data Analytics Engineer to help us transform fragmented data sources into robust, scalable pipelines and analytical assets that support self-serve reporting and advanced analysis across our product and go-to-market teams.

As our Senior Data Analytics Engineer, you will:

  • Partner with Analysts, Marketing, Product, and Finance teams to define and implement data models that make insights easily accessible.
  • Transform raw data into clean, governed, analytics-ready models using dbt.
  • Ensure data consistency and integrity across systems.
  • Collaborate on designing self-serve Looker experiences while optimizing performance and reducing manual reporting overhead.
  • Debug data issues across multiple source systems and drive efficient resolutions.

Our Current Stack:

  • Data Sources: PostgreSQL, ClickHouse, HubSpot, Chargebee, QuickBooks, Zendesk, PostHog
  • ETL/ELT: Airflow, dbt, Python, custom connectors
  • Data Warehouse: Snowflake
  • Orchestration: Airflow
  • BI & Governance: Looker, OpenMetadata
  • Version Control: GitLab
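
To give a sense of how these pieces fit together, here is a minimal, illustrative Airflow DAG that orchestrates a dbt build and test run against Snowflake. It is a sketch under assumptions, not our actual pipeline: the DAG id, schedule, project path, and dbt target are hypothetical.

    # Minimal sketch: Airflow orchestrating a dbt build on Snowflake (Airflow 2.4+).
    # The DAG id, schedule, project path, and dbt target are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_build",          # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models (staging and marts) in the Snowflake warehouse.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt_project && dbt run --target prod",
        )

        # Test the freshly built models before analysts query them in Looker.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt_project && dbt test --target prod",
        )

        dbt_run >> dbt_test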

Requirements

To be successful in this role, you should have:

  • 4+ years of experience in analytics/data engineering roles, ideally in a B2B SaaS or enterprise context
  • Deep knowledge of SQL, data normalization, and transformation best practices
  • Familiarity with data governance and Looker modeling (LookML)
  • A strong commitment to data quality and accuracy
  • Hands-on experience with Airflow, dbt, and Snowflake
  • A proactive, problem-solving mindset with attention to data accuracy and maintainability
  • Excellent communication skills

Benefits

What's in it for you?

  • A focus on professional development
  • Interesting and challenging projects
  • Fully remote work with flexible working hours, allowing you to schedule your day and work from any location worldwide
  • 24 days of paid vacation per year, 10 national holidays, and unlimited sick leave
  • Compensation for private medical insurance
  • Co-working and gym/sports reimbursement
  • Budget for education
  • The opportunity to receive a reward for the most innovative idea that the company can patent

By applying for this position, you agree to the CloudLinux Privacy Policy (https://cloudlinux.com/privacy-policy) and give us your consent to maintain and process your personal data in this respect.

Required profile

Experience

Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Communication
  • Problem Solving
