
Part-time Cloud Data Architect - Python, SQL & GCP (AU Media, Home-based)

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 3+ years of experience in Cloud Data Architecture
  • Expertise in Python and SQL programming
  • Experience with CI/CD pipelines and data workflows
  • Strong understanding of ETL, data modelling, and warehousing
  • Google Cloud certification required; Databricks certification is a plus

Key responsibilities:

  • Design GCP-based infrastructure for data workflows
  • Lead installation and management of Apache Airflow
  • Provision and manage Databricks environments
  • Implement infrastructure as code with Terraform
  • Monitor performance and optimize cloud services
ConnectOS (https://www.connectos.co/), 1001-5000 employees

Job description

Schedule: Monday – Friday (09:00 AM - 06:00 PM AEST)

What are we looking for?

Skills Required:

  • Experience with machine learning infrastructure and deploying ML models on cloud platforms.
  • Minimum of 3 years of experience in a similar role, including 3+ years of Python and SQL programming.
  • Experience with CI/CD pipelines for infrastructure and data workflows.
  • Strong understanding of data modelling, design, ETL processes, and data warehousing concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI, Looker Studio) and proficiency in dashboard development and reporting.
  • Certification in Google Cloud (e.g., Professional Cloud Architect, Professional Data Engineer); Databricks certification is also advantageous.
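To illustrate the ETL, data-modelling, and warehousing concepts listed above, here is a minimal sketch using Python's built-in sqlite3 module. All table, column, and record names are hypothetical, chosen only to show the extract-transform-load pattern into a small star schema (one dimension table plus one fact table):

```python
import sqlite3

# Illustrative raw extract: (date, product name, quantity, unit price).
raw_sales = [
    ("2024-01-05", "Widget", 3, 9.99),
    ("2024-01-06", "Gadget", 1, 24.50),
    ("2024-01-06", "Widget", 2, 9.99),
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal star schema: a product dimension and a sales fact table.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("""CREATE TABLE fact_sales (
    sale_date TEXT, product_id INTEGER, quantity INTEGER, unit_price REAL)""")

# Transform + load: resolve each product to a surrogate key, then insert the fact row.
for sale_date, product, qty, price in raw_sales:
    cur.execute("INSERT OR IGNORE INTO dim_product (name) VALUES (?)", (product,))
    cur.execute("SELECT product_id FROM dim_product WHERE name = ?", (product,))
    (pid,) = cur.fetchone()
    cur.execute("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                (sale_date, pid, qty, price))

# Typical warehouse-style aggregation: total revenue per product.
cur.execute("""SELECT p.name, SUM(f.quantity * f.unit_price)
               FROM fact_sales f JOIN dim_product p USING (product_id)
               GROUP BY p.name ORDER BY p.name""")
revenue = dict(cur.fetchall())
print(revenue)
```

The same dimensional pattern scales up to BigQuery or Databricks; only the SQL dialect and the loading mechanism change.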

Nice to Have:

  • Familiarity with machine learning and AI concepts.
  • Stakeholder management skills.
  • Knowledge of best practices in data security and compliance.
  • Experience with Docker and Kubernetes.

What will you do?

As a Cloud Data Architect, you will work closely with data engineers and analytics teams to ensure that the infrastructure supports complex data workflows, pipelines, and analytics needs, optimizing for performance, security, and cost-efficiency.

  • Design and Architect Cloud Infrastructure: Lead the design of GCP-based infrastructure to support data pipelines, machine learning, and analytics workloads, ensuring scalability and reliability.
  • Install and Manage Apache Airflow on Kubernetes: Set up and maintain Airflow for orchestrating data workflows in a Kubernetes environment, ensuring seamless scheduling and execution of DAGs.
  • Provision and Manage Databricks Environments: Set up Databricks clusters and integrations with GCP services, ensuring efficient use of resources for data processing and analytics.
  • Implement Infrastructure as Code (IaC): Use tools like Terraform or Cloud Deployment Manager to automate the provisioning and management of cloud infrastructure.
  • Optimize Cloud Data Services: Utilize GCP’s data products, such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage, to build scalable data architectures.
  • Collaborate with Data Engineering Teams: Work closely with data engineers to design efficient data pipelines, optimize data workflows, and support data integration and processing at scale.
  • Ensure Security and Compliance: Implement best practices for securing data infrastructure, including IAM policies, VPC configurations, and data encryption.
  • Performance Monitoring and Optimization: Monitor and optimize the performance of cloud infrastructure, ensuring that it meets the needs of data pipelines and analytical workloads.
  • Cost Management: Implement cost-effective solutions and continuously optimize cloud infrastructure to reduce operational costs without compromising performance.
  • Documentation and Best Practices: Maintain detailed documentation of the cloud architecture and establish best practices for data infrastructure management.
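The orchestration responsibility above centres on Airflow resolving task dependencies in a DAG before executing them. The core idea can be sketched with Python's standard-library graphlib; note this is not the Airflow API, and the task names are purely illustrative:

```python
from graphlib import TopologicalSorter

# Dependency graph in the style of an Airflow DAG: each key maps to the
# set of tasks that must complete before it may run. Names are illustrative.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_bigquery": {"transform"},
    "data_quality_check": {"transform"},
    "refresh_dashboard": {"load_bigquery"},
}

# Resolve one valid execution order, as an orchestrator's scheduler would.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In a real Airflow deployment the same dependencies would be declared with operators and the `>>` operator, and the Kubernetes executor would run each task in its own pod; the topological ordering shown here is what guarantees upstream tasks finish first.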

JOIN CONNECTOS NOW!

ConnectOS is certified as a Great Place to Work and is a top-rated Philippines employer of choice.

Our client is Australia’s largest independent publishing business, with over 100 brands reaching 4.3 million Australians every month and 1,000-5,000 talented employees dedicated to serving their audiences, advertisers and customers. They have world-class print centres with an established client base across both newsprint and heat-set products. Their network includes 14 daily titles, such as The Canberra Times, Newcastle Herald, The Courier in Ballarat and The Examiner in Launceston.

 

#ConnectOS #ConnectOSCareers #TeamConnectOS

Equal Employment Statement

Employment decisions at ConnectOS will be conducted without consideration of factors such as age, race, color, religion, gender, disability status, sexual orientation, gender identity or expression, genetic information, and marital status. ConnectOS ensures the full confidentiality of the data it processes.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English
