Sr. Data & Analytics Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor’s degree in Information Systems, Computer Science, or a related field.
  • 5 years of experience in Data Engineering or related fields.
  • Proficiency in at least one automation language such as Bash, PowerShell, Python, or NodeJS.
  • Experience with cloud engineering in Microsoft Azure and cloud-based warehousing in Snowflake.

Key responsibilities:

  • Develop and operationalize data pipelines for analytics and data services.
  • Collaborate with product owners and engineering teams to design and automate data pipelines.
  • Manage production data ensuring fault tolerance and redundancy across multiple datasets.
  • Participate in the development and maintenance of Data Warehouses and provide technical assistance to the team.

Ocean Spray Cranberries
Food & Beverages, 1001 - 5000 Employees
https://www.oceanspray.com/

Job description

The Sr. Data & Analytics Engineer is responsible for end-to-end data pipelines that power analytics and data services. This role is focused on data engineering to build and deliver automated data pipelines from a wide variety of internal and external data sources. The Data Engineer will partner with product owners, engineering, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.

This role is responsible for developing and operationalizing data pipelines to make data available for consumption (reports and advanced analytics). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, and orchestration, as well as engaging with DevOps Engineers during CI/CD. The role requires a grounding in programming and SQL, followed by expertise in data storage, modeling, cloud, data warehousing, and data lakes. The Data Engineer works closely with Data Architects, Data Scientists, and BI Engineers to design and maintain scalable data models and pipelines.
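
To ground the pipeline stages described above, the sketch below shows a minimal ingest, transform, validate, and load flow in Python. The source URL, column names, and target table are hypothetical placeholders, and the load step is only a stand-in for a real warehouse write (for example, a Snowflake COPY INTO or write_pandas call); it illustrates the pattern, not this team's actual implementation.

    # Minimal ingest -> transform -> validate -> load sketch.
    # SOURCE_URL, TARGET_TABLE, and the column names are hypothetical placeholders.
    import pandas as pd

    SOURCE_URL = "https://example.com/orders.csv"   # placeholder external source
    TARGET_TABLE = "ANALYTICS.ORDERS_DAILY"         # placeholder warehouse table

    def ingest(url: str) -> pd.DataFrame:
        # Pull raw data from an internal or external source.
        return pd.read_csv(url)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Normalize column names and type the date column for reporting.
        df = df.rename(columns=str.lower)
        df["order_date"] = pd.to_datetime(df["order_date"])
        return df

    def validate(df: pd.DataFrame) -> pd.DataFrame:
        # Basic data-quality checks before anything reaches the warehouse.
        if df.empty:
            raise ValueError("no rows ingested")
        if df["order_id"].duplicated().any():
            raise ValueError("duplicate order_id values found")
        return df

    def load(df: pd.DataFrame, table: str) -> None:
        # Stand-in for the warehouse load step (e.g., Snowflake write_pandas or COPY INTO).
        print(f"would load {len(df)} rows into {table}")

    if __name__ == "__main__":
        load(validate(transform(ingest(SOURCE_URL))), TARGET_TABLE)

In practice a flow like this would run under an orchestrator such as Airflow (named in the requirements below) rather than as a standalone script.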

Specific job duties include the following:

  • Collaborate with the networking team on implementations of VNets/subnets, NSGs, Layer 4 vs. Layer 7, etc.
  • Identify and ensure IAM/RBAC is configured according to security best practices
  • Manage and automate Day 2 operations with enterprise monitoring using agent-based or agentless approaches
  • Develop and design data pipelines to support an end-to-end solution
  • Participate in development and maintenance of Data Warehouses
  • Provide technical design and coding assistance to the team to accomplish the project deliverables as planned and scoped
  • Develop and maintain artifacts (e.g., schemas, data dictionaries, and transforms) related to ETL processes; a small illustrative example follows this list
  • Manage production data within multiple datasets ensuring fault tolerance and redundancy
  • Collaborate with the rest of the data engineering team to design and launch new features, including the coordination and documentation of dataflows, capabilities, etc.
  • Apply analytical skills for cloud consumption and cost optimizations
  • Position is based out of HQ in Lakeville, MA, but the employee may live and work from anywhere in the U.S.
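
As one way of reading the "artifacts" bullet above, the snippet below sketches a data-dictionary entry kept under version control next to the ETL code. The table, columns, and transform name are invented for illustration and are not taken from the posting.

    # Hypothetical data-dictionary entry maintained alongside the ETL code.
    # Table, columns, and transform name are illustrative only.
    ORDERS_DAILY_DICTIONARY = {
        "table": "ANALYTICS.ORDERS_DAILY",
        "description": "One row per order per day, loaded by the daily pipeline.",
        "columns": {
            "order_id": {"type": "NUMBER", "description": "Unique order identifier."},
            "order_date": {"type": "DATE", "description": "Date the order was placed."},
            "net_sales": {"type": "NUMBER(18,2)", "description": "Order value after discounts."},
        },
        "produced_by": "transform_orders_daily",  # transform that builds this table
    }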

Minimum Education Required:

  • Bachelor’s degree in Information Systems, Computer Science, or a related field, or the foreign equivalent

Minimum Other Special Skills or Requirements:

  • 5 years of experience in Data Engineering or a related field, including extracting data from a wide variety of sources and transforming the data as needed
  • 5 years of experience in at least one of the following automation languages: (a) Bash, (b) PowerShell, (c) Python, or (d) NodeJS
  • 3 years of experience with cloud native engineering in Microsoft Azure
  • 3 years of experience with cloud-based warehousing in Snowflake
  • 3 years of experience with databases, APIs, ETLs, and Cron
  • 3 years of experience with Apache Airflow, creating and scheduling DAGs, and backend infrastructure; a minimal DAG sketch follows this list
  • 2 years of experience with data/software engineering Dev(Sec)Ops practices, including CI/CD, Docker builds, container registries, patching Linux base images, and environment deployments
  • 1 year of experience with building and management of Kubernetes architecture, including storage, ingress, cert/cluster issuer, and node pools
  • Must be able to travel to HQ in Lakeville, MA a minimum of two times per year
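
For the Apache Airflow requirement above, the sketch below shows what "creating and scheduling DAGs" typically looks like, assuming Airflow 2.4 or later. The DAG id, cron schedule, and task callables are placeholders, not details from this role.

    # Minimal Airflow DAG sketch; dag_id, schedule, and tasks are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder extract step.
        print("extracting source data")

    def load():
        # Placeholder load step.
        print("loading into the warehouse")

    with DAG(
        dag_id="orders_daily_pipeline",   # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",             # cron syntax: run daily at 06:00
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task         # run extract before load

The schedule uses the same cron syntax named in the databases/APIs/ETLs/Cron requirement; an orchestrator like Airflow simply manages those schedules, retries, and task dependencies centrally.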

Who We Are:

You might have our iconic cranberry juice in your fridge or have gotten into a heated holiday debate about what’s better: canned or fresh cranberry sauce. But did you know that the hardworking people growing the superfruit in our products are 700 family farmers who own our cooperative? They entrust us with what is most precious to them to create new and innovative products that will delight consumers and grow this beloved brand today and into the future.

Team members, farmers, consumers and communities alike--we value what makes us unique and strive to connect our farms to families for a better life by living our values:

  • Grower Mindset – We embrace our grower-owners’ innovative spirit and heritage through confidence, learning, and focus on the future.
  • Sustainable Results – Guided by purpose, we are focused on delivering results for our grower-owners.
  • Integrity Above All – We are ethical, doing the right thing for our grower-owners, customers, consumers, and each other.
  • Inclusive Teamwork – We build diverse and inclusive teams that strengthen our cooperative.

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status.

Required profile

Experience

Industry: Food & Beverages
Spoken language(s): English

Other Skills

  • Collaboration
  • Analytical Skills
