Senior Data Engineer, Data Finance

extra holidays - extra parental leave - work from anywhere - fully flexible
Remote: Full Remote
Salary: $150K - $228K yearly
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 8+ years of experience in data engineering
  • Strong expertise in data health metrics
  • Proven experience with Databricks migration
  • Solid programming skills in Python, Scala, or Java

Key responsibilities:

  • Lead data migration to Databricks.
  • Build and integrate data health metrics.

Dropbox (http://www.dropbox.com)
1001 - 5000 Employees

Job description

Role Description

Dropbox is seeking a highly skilled and motivated Senior Data Engineer to join our dynamic Financial Data Engineering team. You will be responsible for building next-generation financial data pipelines that support crucial business decisions across the organization. The ideal candidate will have extensive experience migrating from other platforms to Databricks, a strong sense of innovation and accountability, and expertise in developing data health metrics that integrate with data governance, observability, and quality management tools.

If you enjoy thinking about how businesses can use data and how to build the systems that deliver it, this role is a great fit for you. You should have a solid foundation in test-driven development, experience building scalable data pipelines, familiarity with traditional data warehousing (DW) and ETL architectures, and significant experience with ecosystems such as Databricks, Snowflake, EMR, and Airflow. By collaborating with cross-functional teams, you will have the opportunity to drive substantial business impact, as high data quality and effective tooling are key to achieving significant growth at Dropbox.

Our Engineering Career Framework is publicly viewable and describes what's expected of our engineers at each career level. Check out our blog post on this topic and more here.

Responsibilities
  • Lead data migration from legacy platforms to Databricks and develop scalable, efficient, and cost-optimized data pipelines
  • Build and integrate data health metrics and quality management tools, ensuring robust data governance and consistent standards
  • Design and maintain tools for efficient data investigations, issue detection, and automated mitigation to uphold data quality and consistency
  • Replace outdated infrastructure with modern systems and provide operational support for critical data pipelines
  • Solve complex data integration challenges using optimal ETL patterns, frameworks, and techniques for structured and unstructured data
  • Collaborate with cross-functional teams to meet technical and business needs while fostering a culture of innovation and continuous improvement
  • Define and manage SLAs for high-priority datasets, including those driving critical (P0) business metrics
  • Apply Agile methodologies and industry best practices to ensure consistent delivery and alignment with business objectives

Many teams at Dropbox run services with on-call rotations, which entail being available during both core and non-core business hours. If a team has an on-call rotation, all engineers on the team are expected to participate in the rotation as part of their employment. Applicants are encouraged to ask for more details about the rotations for the roles to which they are applying.

Requirements
  • 8+ years of experience in data engineering or related roles
  • Proven experience with data migration projects, specifically to Databricks
  • Strong expertise in data health metrics, data governance, and quality management, with experience integrating tools like Monte Carlo and Atlan
  • Solid experience in building and maintaining data pipelines and infrastructure
  • Excellent problem-solving skills and the ability to troubleshoot complex data issues
  • Strong programming skills in Python, Scala, or Java
  • Demonstrated ability to innovate and drive accountability within a team
  • Experience with version control systems like Git, as well as test automation and CI/CD
Preferred Qualifications
  • 8+ years of development experience in Python, Java, or Scala
  • 8+ years of SQL experience
  • 8+ years of experience with schema design and dimensional data modeling
Compensation

US Zone 1: This role is not available in Zone 1
US Zone 2: $168,300 - $227,700 USD
US Zone 3: $149,600 - $202,400 USD

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Accountability
  • Collaboration
  • Problem Solving
  • Innovation
