Magellan’s Data Engineering department is responsible for ingesting and integrating data and for implementing standardized models and metrics in the enterprise data warehouse to serve analytics reporting for stakeholders and clients. As a Principal Data and Integration Engineer, you will report to the Manager of Data Engineering & Integration and lead a group of junior- and mid-level data engineers.
This is a fully remote opportunity, allowing you to work from the comfort of your own home, anywhere in the US.
In this role, you will enhance our serverless data engineering platform by designing and building robust data pipelines that integrate data from disparate sources (structured, semi-structured, and unstructured) into our enterprise data lake (built on AWS) and by implementing standardized data models in our enterprise data warehouse (built on AWS Redshift) to achieve a “single version of truth”.
This is a hands-on development role focused on achieving technical excellence: adopting serverless methodology and automation, streamlining development and deployments using CI/CD frameworks, continuously modernizing the data engineering architecture to keep pace with the changing technology landscape, and collaborating with our other teams (Data Architecture, Reporting, Operations) toward a consolidated software release in every release cycle, following Agile/Scrum practices.
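For a concrete flavor of the pipeline work described above, here is a minimal PySpark sketch of one ingestion step, reading semi-structured data from a raw S3 zone and writing partitioned Parquet to the lake; the bucket paths, column names, and deduplication key are hypothetical placeholders, not Magellan's actual implementation.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Read raw semi-structured JSON from a hypothetical landing zone.
raw = spark.read.json("s3://example-raw-zone/orders/")

# Light standardization: audit column, typed date, deduplication.
standardized = (
    raw.withColumn("ingested_at", F.current_timestamp())
       .withColumn("order_date", F.to_date("order_date"))  # hypothetical column
       .dropDuplicates(["order_id"])                       # hypothetical key
)

# Write partitioned Parquet to a hypothetical curated zone.
(standardized.write
    .mode("append")
    .partitionBy("order_date")
    .parquet("s3://example-lake-zone/orders/"))

A script like this can run as an AWS Glue PySpark job, which is one way the serverless approach described above can be realized.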
Architectural Solutioning – Lead the design and development of scalable, high-performance data processing frameworks on the AWS cloud platform that meet in-scope business requirements and align with software best practices.
Data Integration – Collaborate with data engineers to design and implement robust, automated data pipelines for ingesting, transforming, and loading data from disparate data sources.
Performance Optimization – Continuously assess and optimize the performance and cost efficiency of AWS infrastructure and data processing frameworks.
Automation – Design scalable automations to eliminate manual effort and technical debt.
Technical Leadership – Lead and mentor other data engineers and provide technical guidance to dev teams on cloud best practices.
Collaboration – Work closely with cross-functional teams and business stakeholders to understand requirements and translate them into technical solutions.
Innovation – Stay up to date with the latest AWS developments and industry trends, and continuously evolve the current architecture to keep it future-proof.
Minimum Qualifications:
Strong experience architecting and developing large-scale data solutions on the AWS platform, using a variety of cloud services to design ETL and ELT solutions.
Strong proficiency in AWS Redshift and data warehousing concepts to perform ELT using advanced SQL transformations (a minimal ELT sketch appears after this list).
Strong programming knowledge of Python and PySpark to design AWS Lambda and Glue jobs that connect to data sources and read and load data and data frames, using advanced techniques such as parallel processing and multi-threading (a Lambda sketch also appears after this list).
Experience in leading and mentoring a pool of Data Engineers.
Strong data literacy, with a clear understanding of the steps involved in moving data from A to B.
Experience in project planning, such as managing and tracking ongoing project status and communicating holistic updates on all data engineering activities.
Strong experience with Agile methodology and Scrum practices and ceremonies, such as backlog refinement (grooming), sprint planning and releases, daily scrums, and sprint retrospectives.
Strong experience following release management practices and the lifecycle of software release activities, aligned with scheduled release cycles.
Strong hands-on experience architecting and designing continuous integration and continuous delivery (CI/CD) frameworks using GitHub and AWS CodePipeline.
Strong commitment to delivering solutions within agreed deadlines and scope.
Excellent communication skills to act as a technical liaison between all stakeholders.
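As referenced above, a minimal sketch of the Redshift ELT pattern: stage raw files with COPY, then transform inside the warehouse with SQL. The cluster endpoint, IAM role, and table names are hypothetical placeholders, and in practice credentials would come from a secrets store rather than being hard-coded.

import redshift_connector

# Hypothetical cluster endpoint and credentials.
conn = redshift_connector.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    database="dev",
    user="etl_user",
    password="...",
)
cur = conn.cursor()

# Stage raw Parquet from the lake into a staging table (the "load").
cur.execute("""
    COPY staging.orders
    FROM 's3://example-lake-zone/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET
""")

# Transform with SQL inside the warehouse (the "transform").
cur.execute("""
    INSERT INTO marts.daily_order_totals (order_date, total_amount)
    SELECT order_date, SUM(amount)
    FROM staging.orders
    GROUP BY order_date
""")
conn.commit()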
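And a hedged illustration of the Lambda-side parallelism mentioned in the Python/PySpark bullet: a handler that pulls several source extracts concurrently and stages them to S3. The endpoints and bucket are hypothetical placeholders.

import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

import boto3

s3 = boto3.client("s3")

# Hypothetical source endpoints to ingest in parallel.
SOURCES = {
    "crm": "https://example.com/api/crm/export",
    "billing": "https://example.com/api/billing/export",
}

def _ingest(item):
    name, url = item
    with urlopen(url, timeout=30) as resp:  # pull one source extract
        body = resp.read()
    s3.put_object(Bucket="example-raw-zone", Key=f"{name}/extract.json", Body=body)
    return name

def handler(event, context):
    # Multi-threading: fetch all sources concurrently instead of serially.
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        ingested = list(pool.map(_ingest, SOURCES.items()))
    return {"statusCode": 200, "body": json.dumps({"ingested": ingested})}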
AWS Cloud - S3, IAM, Lambda, Glue Jobs and Workflows, EventBridge, AppFlow, CloudWatch, CloudFormation, CodePipeline, CodeBuild, CodeDeploy, SNS, SQS, SES, EKS
Programming - Python, PySpark, SQL, PL/SQL, REST API Services
Databases - AWS Redshift, AWS RDS MySQL, AWS DynamoDB, Oracle, SQL Server
CI/CD - Git, GitHub / GitLab, Deployments, AWS CodePipeline, Jenkins, CloudFormation (see the CodePipeline sketch below)
Orchestration - Apache Airflow (highly preferred), dbt (highly preferred)
NoSQL Databases - MongoDB, DynamoDB
ITSM and Release Management – JIRA Cloud, ServiceNow
Data Replication - Qlik Replicate / Attunity (Optional but highly preferred)
Reporting tools - Tableau, Cognos (Optional but highly preferred)
AWS intermediate-level (or above) certifications preferred, such as AWS Solutions Architect Associate (SAA), AWS Database Specialty, or AWS Developer Associate.
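As a small illustration of the CodePipeline-based release flow listed above, this boto3 sketch kicks off a pipeline execution and polls it to completion; the pipeline name is a hypothetical placeholder.

import time

import boto3

cp = boto3.client("codepipeline")
PIPELINE = "example-data-platform-release"  # hypothetical pipeline name

def run_release() -> str:
    # Trigger a new execution of the release pipeline.
    eid = cp.start_pipeline_execution(name=PIPELINE)["pipelineExecutionId"]
    # Poll until the execution reaches a terminal status.
    while True:
        status = cp.get_pipeline_execution(
            pipelineName=PIPELINE, pipelineExecutionId=eid
        )["pipelineExecution"]["status"]
        if status not in ("InProgress", "Stopping"):
            return status  # e.g. Succeeded or Failed
        time.sleep(15)

if __name__ == "__main__":
    print(run_release())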
Other Job Requirements
Responsibilities
6+ years related experience including a minimum of 3-4 years of designing, building and maintaining high quality, secure software in IT.
General Job Information
Title: Principal Data and Integration Engineer - Fully Remote
Grade: 30
Work Experience - Required: IT
Work Experience - Preferred:
Education - Required: A Combination of Education and Work Experience May Be Considered, Bachelor's, Bachelor's - Computer and Information Science
Education - Preferred:
License and Certifications - Required:
License and Certifications - Preferred:
Salary Range
Salary Minimum: $105,230
Salary Maximum: $178,890
This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Actual pay will be adjusted based on an individual's skills, experience, education, and other job-related factors permitted by law.
This position may be eligible for short-term incentives as well as a comprehensive benefits package. Magellan offers a broad range of health, life, voluntary and other benefits and perks that enhance your physical, mental, emotional and financial wellbeing.
Magellan Health, Inc. is proud to be an Equal Opportunity Employer and a Tobacco-free workplace. EOE/M/F/Vet/Disabled.
Every employee must understand, comply with and attest to the security responsibilities and security controls unique to their position; and comply with all applicable legal, regulatory, and contractual requirements and internal policies and procedures.