Python Developer (GCP)

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 5-7 years of Python programming experience
  • Deep understanding of production-level coding techniques
  • Expertise with big data technologies, including Spark and Hadoop
  • Proficient in SQL, data modeling, and query optimization
  • Experience with data visualization tools

Key responsibilities:

  • Build efficient ETL pipelines using advanced data engineering skills
  • Focus on cloud-native solutions in GCP or Azure
  • Communicate data insights effectively to stakeholders
  • Debug and implement solutions for technical problems
  • Continuously learn new technologies and propose new approaches
NTD Software (startup, 11-50 employees)
https://ntdsoftware.com/

Job description

We are seeking a highly skilled GCP Python Developer with 5-7 years of experience in Python programming and data engineering. The ideal candidate should have a deep understanding of production-level coding techniques, including testing, object-oriented programming (OOP), and code optimization. The role requires strong expertise with big data technologies, such as Spark, PySpark, Hadoop, HIVE, BigQuery, and Pub/Sub, to build scalable ETL pipelines. This is an exciting opportunity for an individual with a passion for cloud-native solutions, data visualization, and a drive to solve complex technical challenges.
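
As a rough illustration of the day-to-day work described above, the sketch below shows a minimal PySpark job that reads a raw table from BigQuery, aggregates it, and writes the result back. It is only a sketch: the project, dataset, table, and bucket names are placeholder assumptions, not details from this posting, and it assumes the spark-bigquery connector is available on the cluster.

```python
# Illustrative sketch only: a minimal PySpark ETL job against BigQuery.
# All project/dataset/table/bucket names below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Extract: read the raw events table via the spark-bigquery connector.
raw = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw_zone.events")  # placeholder table
    .load()
)

# Transform: drop malformed rows and aggregate events per user per day.
daily = (
    raw.filter(F.col("user_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Load: write the aggregate to a curated BigQuery dataset.
(
    daily.write.format("bigquery")
    .option("table", "my-project.curated_zone.daily_user_events")  # placeholder
    .option("temporaryGcsBucket", "my-temp-bucket")                # placeholder
    .mode("overwrite")
    .save()
)
```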

Responsibilities:
  • Utilize advanced data engineering skills, including expert-level SQL, data modeling, and query optimization, to build efficient ETL pipelines.
  • Focus on cloud-native solutions with hands-on experience, preferably in Google Cloud Platform (GCP) or Azure environments.
  • Leverage data visualization and dashboarding techniques to effectively communicate complex data insights to stakeholders.
  • Debug, troubleshoot, and implement solutions for complex technical problems, ensuring high performance and scalability.
  • Continuously learn new technologies, prototype solutions, and propose innovative approaches to optimize data engineering processes.
  • Collaborate with cross-functional teams to integrate data solutions across platforms and services.

Requirements:
  • Proficiency in Python programming, with a strong emphasis on data engineering.
  • Extensive experience with big data technologies: Spark, PySpark, Hadoop, HIVE, BigQuery, and Pub/Sub (a minimal Pub/Sub example is sketched after this list).
  • Expertise in SQL, data modeling, and query optimization for large-scale data processing.
  • Experience in data visualization and dashboarding tools.
  • Strong debugging and problem-solving skills to resolve complex technical issues.
  • Ability to work independently, learn new technologies, and prototype innovative solutions.
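
The streaming piece of the stack called out above (Pub/Sub) might look like the following minimal pull subscriber, shown purely as an illustration; the project and subscription IDs are placeholders, not details from this posting.

```python
# Illustrative sketch only: a minimal Pub/Sub pull subscriber.
# Project and subscription IDs are placeholders.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline the payload would be validated here and handed off
    # to a downstream stage (for example a BigQuery load or a Spark job).
    print(f"Received: {message.data!r}")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        # Block the main thread; stop after 60 seconds for this sketch.
        streaming_pull_future.result(timeout=60)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```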
Required profile

    Experience

    Level of experience: Senior (5-10 years)
    Spoken language(s): English

    Other Skills

    • Collaboration
    • Problem Solving
