
Data Engineer – 6-8 Years – Remote – Codersbrain

Remote: Full Remote

Offer summary

Qualifications:

6-8 years of experience; expertise in SQL, Python, and PySpark.

Key responsibilities:

  • Develop and maintain ETL services
  • Tune and optimize SQL databases
  • Implement Airflow workflows using Python or Scala
  • Utilize Redshift, Snowflake, or Databricks for data warehousing
CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

About us – Coders Brain is a global provider of IT services and digital and business solutions that partners with its clients to simplify, strengthen, and transform their businesses. We ensure the highest levels of certainty and satisfaction through a deep-set commitment to our clients, comprehensive industry expertise, and a global network of innovation and delivery centers. We achieved our success because of how successfully we integrate with our clients.
• Quick Implementation – We offer quick implementation for newly onboarded clients.
• Experienced Team – We’ve built an elite and diverse team that brings its unique blend of talent, expertise, and experience to make you more successful, ensuring our services are customized to your specific needs.
• One Stop Solution – Coders Brain provides end-to-end solutions for businesses at an affordable price, with uninterrupted and effortless services.
• Ease of Use – All of our products are user-friendly and scalable across multiple platforms. Our dedicated team at Coders Brain implements solutions with the interests of both enterprises and users in mind.
• Secure – We treat your security with the utmost importance, blending security and scalability into our implementations with the long-term impact on your business in mind.

Position Name – Data Engineer
Experience Required – 6-8 Years
Salary – As per market standard
Notice period – Immediate joiners

We are currently looking to hire a Data Engineer for a remote opportunity.

Key Skills:

• SQL database tuning and performance optimization
• Airflow implemented using Python or Scala
• Python and PySpark
• Redshift, Snowflake, or Databricks for data warehousing
• ETL services in AWS (e.g., EMR, Glue, S3, Redshift) or similar services in GCP or Azure
• Experience working with datasets of gigabytes or more


Mode: Remote
Time: 3:30 PM to 11:30 PM IST

Experience: 6-8 years

Required profile

Experience

Industry: Management Consulting
Spoken language(s): English
