Principal Data and Integration Engineer - Fully Remote

extra parental leave
Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor's degree in Computer and Information Science or related field.
  • Strong experience in architecting and developing large-scale data solutions on AWS.
  • Proficiency in programming languages such as Python and PySpark, and advanced SQL.
  • Experience in Agile methodology and leading teams of Data Engineers.

Key responsibilities:

  • Lead the design and development of scalable data processing frameworks in AWS.
  • Collaborate with data engineers to implement automated data pipelines for data integration.
  • Continuously optimize the performance and cost efficiency of AWS infrastructure.
  • Mentor junior data engineers and provide technical guidance to development teams.

Magellan Health (https://www.magellanhealth.com/)
5001 - 10000 Employees

Job description

About the Role:

Magellan’s Data Engineering department is responsible for ingesting and integrating data and for implementing standardized models and metrics in the enterprise data warehouse to serve analytics reporting for stakeholders and clients. As a Principal Data and Integration Engineer, you will report to the Manager of Data Engineering & Integration and lead a group of junior and mid-level Data Engineers.

This is a fully remote opportunity, allowing you to work from the comfort of your own home anywhere in the US.

This role will enhance our serverless data engineering platform by designing and building robust data pipelines that integrate data from disparate sources (structured, semi-structured, and unstructured) into our enterprise data lake (built on AWS) and by implementing standardized data models in our enterprise data warehouse (built on AWS Redshift) to achieve a “single version of the truth”.

This is a hands-on development role focused on achieving technical excellence: adopting serverless methodology and automation, streamlining development and deployments with CI/CD frameworks, continuously modernizing the data engineering architecture to keep pace with the changing technology landscape, and collaborating with other teams (Data Architecture, Reporting, Operations) toward a consolidated software release each cycle, following Agile/Scrum practices.
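
As a rough illustration of the kind of serverless pipeline described above, the sketch below shows a minimal AWS Glue (PySpark) job that reads semi-structured JSON landed in an S3 data lake and loads a Redshift staging table. The bucket, schema, table, and connection names are hypothetical placeholders, not actual Magellan resources.

```python
# Minimal sketch only; paths, tables, and the "redshift_conn" Glue connection
# are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest semi-structured JSON from the data lake (S3).
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-data-lake/raw/claims/"]},
    format="json",
)

# Light standardization before loading the warehouse staging area.
staged = raw.drop_fields(["_ingest_metadata"])

# Load the Redshift staging table through a pre-defined Glue connection.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=staged,
    catalog_connection="redshift_conn",
    connection_options={"dbtable": "staging.claims", "database": "edw"},
    redshift_tmp_dir="s3://example-data-lake/tmp/redshift/",
)

job.commit()
```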

Key Responsibilities:
  • Architectural Solutioning – Lead the design and development of scalable, high-performance data processing frameworks on the AWS Cloud platform, per the in-scope business requirements and aligned with software best practices.

  • Data Integration – Collaborate with data engineers to design and implement robust automated data pipelines for ingesting, transforming and loading data from various disparate data sources.

  • Performance Optimization – Continuously assess and optimize the performance and cost efficiency of AWS infrastructure and data processing frameworks.

  • Automation – Design scalable automations to eliminate manual effort and technical debt.

  • Technical Leadership – Lead and mentor other data engineers and provide technical guidance to development teams on following cloud best practices.

  • Collaboration – Work closely with cross-functional teams and business stakeholders to understand requirements and translate them into technical solutions.

  • Innovation – Stay up to date with the latest AWS developments and industry trends and continuously upgrade the current architecture for future-proofing.

Minimum Qualifications:

  • Strong experience in architecting and developing large-scale data solutions on the AWS platform, with experience using a variety of cloud services to design ETL and ELT solutions.

  • Strong proficiency in AWS Redshift and data warehousing concepts to perform ELT using advanced SQL transformations.

  • Strong programming knowledge of Python and PySpark to design AWS Lambda functions and Glue jobs that connect to data sources and read/load data and data frames using advanced techniques such as parallel processing and multi-threading (see the brief sketch after this list).

  • Experience in leading and mentoring a pool of Data Engineers.

  • Strong data literacy, with an understanding of the steps involved in moving data from point A to point B.

  • Experience in project planning, such as managing and tracking ongoing project status and communicating holistic updates across all Data Engineering activities.

  • Strong experience in Agile methodology and Scrum development practices and ceremonies, such as backlog refinement/grooming, sprint planning and release, daily scrums, and sprint retrospectives.

  • Strong experience following release management practices and the software release lifecycle, aligned with scheduled release cycles.

  • Strong hands-on experience architecting and designing Continuous Integration and Continuous Delivery (CI/CD) frameworks using GitHub and AWS CodePipeline.

  • Strong commitment to delivering solutions within the agreed deadlines and scope.

  • Excellent communication skills to act as a technical liaison between all stakeholders.
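
As referenced in the Python/PySpark qualification above, the sketch below is one hedged example of a small AWS Lambda ingestion function that pulls pages from a source REST API in parallel threads and lands them as raw JSON in the data lake. The endpoint, bucket, and key prefix are hypothetical examples, not Magellan systems.

```python
# Hedged sketch of a multi-threaded Lambda ingestion pattern.
# SOURCE_URL and BUCKET are hypothetical placeholders.
import json
from concurrent.futures import ThreadPoolExecutor

import boto3
import urllib3

http = urllib3.PoolManager()
s3 = boto3.client("s3")

SOURCE_URL = "https://api.example.com/v1/members"  # placeholder source API
BUCKET = "example-data-lake"                       # placeholder S3 bucket


def fetch_page(page: int) -> dict:
    """Pull one page of records from the source API."""
    resp = http.request("GET", SOURCE_URL, fields={"page": str(page)})
    return json.loads(resp.data)


def handler(event, context):
    # Fetch several pages in parallel threads, then land each page
    # as a raw JSON object in the data lake.
    pages = range(1, event.get("page_count", 4) + 1)
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(fetch_page, pages))

    for page, payload in zip(pages, results):
        s3.put_object(
            Bucket=BUCKET,
            Key=f"raw/members/page={page}/data.json",
            Body=json.dumps(payload).encode("utf-8"),
        )
    return {"pages_landed": len(results)}
```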

Technological skills required:
  • AWS Cloud - S3, IAM, Lambda, Glue Jobs and Workflows, EventBridge, AppFlow, CloudWatch, CloudFormation, CodePipeline, CodeBuild, CodeDeploy, SNS, SQS, SES, EKS

  • Programming - Python, PySpark, SQL, PL/SQL, REST API Services

  • Databases - AWS Redshift, AWS RDS MySQL, AWS DynamoDB, Oracle, SQL Server

  • CI/CD - Git, GitHub / GitLab, Deployments, AWS CodePipeline, Jenkins, CloudFormation

  • Orchestration - Apache Airflow (Highly preferred), DBT (Highly preferred)

  • NoSQL Databases - MongoDB, DynamoDB

  • ITSM and Release Management – JIRA Cloud, ServiceNow

  • Data Replication - Qlik Replicate / Attunity (Optional but highly preferred)

  • Reporting tools - Tableau, Cognos (Optional but highly preferred)

Certifications:
  • AWS intermediate-level (or above) certifications preferred, such as AWS Solutions Architect Associate (SAA), AWS Database Specialty, or AWS Developer Associate.

This position will lead agile software development efforts as a technical leader. The role will respond to audits, contribute to the RFP process, and ensure data governance and best practices are embraced. It is responsible for understanding business and IT strategy to align with outcomes. The incumbent will be a hands-on Data and Integration Engineer who can write quality code and assist with problem solving, root cause analysis, troubleshooting, and coaching, and must understand the big picture from a business standpoint within the context of the application. The role will define improvements to increase system reliability, security, and performance, perform rich data visualizations and presentations to senior management on value adds, and may manage a team.
  • Participate in defining strategic IT objectives and lead subordinates toward that strategic vision for their products.
  • Act as the primary focal point for both internal and external customers for software development tasks, including estimates of feasibility, time, and effort.
  • Provide updates to both the user community and the programmers.
  • Monitor projects, determine potential problems, and guide projects to successful completion.
  • Ensure that all work is accomplished by making assignments and monitoring tasks, including balancing work between programmers, analysts, project managers, supervisors, and managers, and ensuring that the proper policies and procedures are followed.
  • Assist with budget preparation and management.
  • Mentor and evaluate staff performance.
  • Continue to work hands-on doing programming and analysis work.
  • Track all project requests in the functional area and update project status on a regular basis.
  • Assist in estimating work effort associated with new project requests.
  • Assist in planning for the development and support of a functional systems area.
  • Review and evaluate the work of subordinate staff and prepare performance reports.
  • Participate in planning and budgeting.

Other Job Requirements

Responsibilities

6+ years of related experience, including a minimum of 3-4 years designing, building, and maintaining high-quality, secure software in IT.
Agile and Design Thinking (Coursera).
Critical thinker.
Demonstrated problem solving techniques.
Strong verbal and written communication skills.
ServiceNow training.

General Job Information

Title: Principal Data and Integration Engineer - Fully Remote

Grade: 30

Work Experience - Required: IT

Education - Required: A Combination of Education and Work Experience May Be Considered; Bachelor's; Bachelor's - Computer and Information Science

Salary Range

Salary Minimum: $105,230
Salary Maximum: $178,890

This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Actual pay will be adjusted based on an individual's skills, experience, education, and other job-related factors permitted by law.

This position may be eligible for short-term incentives as well as a comprehensive benefits package. Magellan offers a broad range of health, life, voluntary and other benefits and perks that enhance your physical, mental, emotional and financial wellbeing.

Magellan Health, Inc. is proud to be an Equal Opportunity Employer and a Tobacco-free workplace. EOE/M/F/Vet/Disabled.
Every employee must understand, comply with and attest to the security responsibilities and security controls unique to their position; and comply with all applicable legal, regulatory, and contractual requirements and internal policies and procedures.

Required profile

Spoken language(s): English

Other Skills

  • Communication
  • Critical Thinking
  • Problem Solving
  • Mentorship
  • Collaboration
