Data Engineer II

Remote: Full Remote

Offer summary

Qualifications:

  • 4+ years of experience in DevOps and data pipeline development.
  • A 4-year degree in computer science, healthcare information technology, or a relevant field.
  • Experience with Agile methodologies and cloud computing, preferably Azure.
  • Familiarity with Python, SQL, and AI tools is essential.

Key responsibilities:

  • Develop new features and systems to meet business and project requirements.
  • Ensure application performance, uptime, and scalability while maintaining code quality.
  • Collaborate with stakeholders to align IT solutions with business needs.
  • Participate in all aspects of software development, including design, implementation, and deployment.

Unified Women's Healthcare (https://unifiedwomenshealthcare.com/)
5001 - 10000 Employees

Job description

Overview

  • Eastern/Central time zone

Unified Women’s Healthcare is a company dedicated to caring for OB-GYN providers who care for others, be they physicians or their support staff. A team of like-minded professionals with significant business and healthcare experience, we operate with a singular mindset - great care needs great care. We take great pride in not just speaking about this but executing it.

As a company, our mission is to be an indispensable source of business knowledge, innovation and support to the practices in our network. We are advocates for our OB-GYN medical affiliates - enabling them to focus solely on the practice of medicine while we focus on the business of medicine.

We are action oriented. We strategize, implement, and execute - on behalf of the practices we serve.

Position Summary

The Data Engineer II role provides both technical and team leadership. This is a hands-on position using modern technologies within a forward-thinking organization dedicated to a great customer experience. This individual will possess an in-depth understanding of system design principles, system scaling, DevOps methodologies, and the integration of AI for automating tasks. The ideal candidate will showcase skills in data workflows, data quality, and cloud computing. Strong communication skills are essential, as the role involves assisting other engineers and conveying complex concepts to non-technical colleagues.

Responsibilities

  • Develop new features and systems to support rapidly emerging business and project requirements.
  • Contribute to design sessions, code reviews, collaborative coding, and research.
  • Ensure application performance, uptime, and scalability while upholding rigorous standards for code quality and application design.
  • Employ agile development methodologies, embracing best practices and pursuing ongoing learning opportunities.
  • Participate in all aspects of software development, including design, implementation, and deployment.
  • Provide and implement solutions to automate internal or customer workflows.
  • Provide and implement solutions for optimizations of system performance.
  • Use AI tools to monitor and analyze various aspects of internal systems, including infrastructure, logs, and DevOps, and report on and implement improvements.
  • Collaborate across time zones via Jira, GitHub, SharePoint, video conferences, and other standard platforms.
  • Work closely with stakeholders to ensure IT solutions align with business needs.
  • Follow all applicable federal/state regulations concerning privacy and security, as well as all compliance policies defined in SOC2, HIPAA, and HITRUST.
  • Other duties as assigned.

Qualifications

Education and Experience

  • 4+ years of experience in DevOps and data pipeline development.
  • Experience in healthcare technology or another heavily regulated industry.
  • A 4-year degree in computer science, healthcare information technology, or a relevant field or equivalent knowledge and skills obtained through a combination of education, training, and experience.
  • Experience with Agile methodologies.
  • Visa sponsorship is not available.

Technologies:

  • Python 3.10+ / PySpark - Required
  • SQL - Required; at least one of Microsoft SQL Server, MySQL, PostgreSQL, Spark SQL
  • Databricks - Preferred
  • Feature Flagging (e.g. LaunchDarkly) - Preferred
  • Cloud Computing - Required, Azure Preferred
  • Event-driven or distributed systems - Required
  • AI tools (e.g. Azure Copilot, GitHub Copilot, Kubiya)

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Team Leadership
  • Collaboration
  • Communication
  • Problem Solving
