Senior Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 3 to 7 years of experience in data engineering or related fields.
  • Proficiency in at least four GCP services, such as Dataflow, BigQuery, and Cloud Functions.
  • Hands-on programming skills in Spark/Scala, Python, or Java.
  • Experience with Ascend.io (non-negotiable).

Key responsibilities:

  • Build, automate, and optimize data workloads using Ascend.io.
  • Design and operationalize large-scale data solutions on GCP.
  • Develop and maintain ETL/ELT data pipelines from ingestion to consumption.
  • Collaborate with team members and stakeholders to ensure effective communication.

Cognisol Scaleup http://www.cognisolglobal.com/
11 - 50 Employees

Job description

This role involves building, automating, and optimizing data workloads, and requires a proactive individual with strong problem-solving skills. The position is full-time, requiring 8 hours per day for 4-6 months, with a possible extension.
Key Responsibilities:
  • Data Workload Management: Use Ascend.io to build, automate, and optimize data workloads.
  • Enterprise Data Solutions: Design and operationalize large-scale data solutions using GCP services like Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Composer, and GCS.
  • ETL/ELT Pipelines: Develop and maintain ETL/ELT data pipelines from ingestion to consumption (a minimal sketch follows this list).
  • Programming: Apply hands-on programming skills in Spark/Scala, Python, or Java.
  • Data Engineering: Utilize knowledge of Data Lakes, Data Warehouses (Redshift/Hive/Snowflake), integration, and migration.
  • Communication: Collaborate effectively with team members and stakeholders.
  • Version Control: Manage code using version control tools (Git/Bitbucket/CodeCommit).
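
For illustration only, a minimal sketch of the kind of ingestion-to-consumption pipeline these responsibilities describe, assuming the Apache Beam Python SDK (the usual way to author Dataflow jobs). The project, bucket, and table names are hypothetical placeholders, not details from this posting:

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # All resource names below are hypothetical placeholders.
    options = PipelineOptions(
        runner="DataflowRunner",  # use "DirectRunner" to test locally
        project="example-project",
        region="us-east1",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Ingestion: newline-delimited JSON events landed in GCS
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            # Transformation: keep only the fields the warehouse needs
            | "SelectFields" >> beam.Map(
                lambda e: {"user_id": e["user_id"], "event": e["event"], "ts": e["ts"]}
            )
            # Consumption: append rows into a BigQuery table
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

Switching the runner to "DirectRunner" executes the same pipeline locally, a common way to validate the logic before deploying to Dataflow.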
Required Skills and Experience:
  • Ascend.io Expertise: Experience with Ascend.io (non-negotiable).
  • Work Experience: 3 to 7 years in data engineering or related fields.
  • GCP Services: Proficient with at least four GCP services (Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Composer, GCS).
  • Programming Skills: Hands-on experience in Spark/Scala, Python, or Java.
  • ETL/ELT Pipelines: Proven experience in building ETL/ELT pipelines.
  • Data Engineering Knowledge: Understanding of Data Lakes, Data Warehouses, integration, and migration.
  • Communication: Excellent written and verbal skills.
  • Version Control: Experience with version control tools (Git/Bitbucket/CodeCommit).
Shift Timings:
  • Eastern Standard Time (EST)

Required profile

Spoken language(s): English

Other Skills

  • Communication
  • Problem Solving
