Cognisol is a distinguished ISO 9001:2015 certified company committed to excellence and innovation. With a focus on delivering cutting-edge services, we pride ourselves on staying at the forefront of industry trends and technological advancements. Our commitment to quality management ensures that we consistently meet and exceed our customers' expectations, setting us apart as a reliable and forward-thinking partner in the ever-evolving business landscape.
As a trusted partner, we are dedicated to fostering long-term relationships with our clients by delivering high-quality and futuristic services that contribute to their success.
Our Services: Web Applications, Mobile Applications, Custom Product Applications
Write us at: info@cognisolglobal.com
Sales enquiries: sales@cognisolglobal.com
This role involves building, automating, and optimizing data workloads and requires a proactive individual with strong problem-solving skills. The position is full-time (8 hours per day) for 4-6 months, with a possible extension.
Key Responsibilities:
Data Workload Management: Use Ascend.io to build, automate, and optimize data workloads.
Enterprise Data Solutions: Design and operationalize large-scale data solutions using GCP services such as Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Cloud Composer, and Cloud Storage (GCS).
ETL/ELT Pipelines: Develop and maintain ETL/ELT data pipelines from ingestion to consumption.
Programming: Apply hands-on programming skills in Spark/Scala, Python, or Java.
Data Engineering: Utilize knowledge of Data Lakes, Data Warehouses (Redshift/Hive/Snowflake), integration, and migration.
Communication: Collaborate effectively with team members and stakeholders.
Version Control: Manage code using version control tools (Git/Bitbucket/CodeCommit).
Required Skills and Experience:
Ascend.io Expertise: Experience with Ascend.io (Non-negotiable).
Work Experience: 3 to 7 years in data engineering or related fields.
GCP Services: Proficient with at least four GCP services (Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Cloud Composer, GCS).
Programming Skills: Hands-on experience in Spark/Scala, Python, or Java.
ETL/ELT Pipelines: Proven experience in building ETL/ELT pipelines.
Data Engineering Knowledge: Understanding of Data Lakes, Data Warehouses, integration, and migration.
Communication: Excellent written and verbal communication skills.
Version Control: Experience with version control tools (Git/Bitbucket/CodeCommit).
Shift Timing: Eastern Standard Time (EST)
Spoken language(s): English