Senior Data Engineer - GP

Remote: Full Remote

Offer summary

Qualifications:

  • 7+ years of experience in data engineering and application development.
  • Strong knowledge of ELT/ETL tools and cloud platforms, particularly AWS.
  • Advanced SQL skills and experience with data governance concepts.
  • Familiarity with Big Data infrastructure and event-driven architectures.

Key responsibilities:

  • Act as a liaison between engineering and analytics teams to ensure data quality.
  • Develop and implement ETL processes and build an Enterprise Data Warehouse.
  • Support multiple teams by addressing their data needs and requirements.
  • Contribute to the governance strategy of the data platform and optimize analytics schemas.

Gorilla Logic (SME) | https://www.gorillalogic.com/
501 - 1000 Employees

Job description

Senior Data Engineer

Gorilla Logic is looking for a Sr. Data Engineer who will assist in organizing our client's system data into dashboards and on-demand reports. This is a unique and highly technical role, requiring strong database and reporting experience in delivering leading-edge solutions. Our environment will require you to work effectively with your teammates, of course, but your real success will be measured by how well you couple critical thinking with self-motivation, enthusiasm, and determination.

Responsibilities

* Act as a bridge between engineering and analytics to provide high-fidelity data to the business
* Work directly with the data team to build out a new ETL process
* Work directly with the engineering team to build a new Enterprise Data Warehouse (EDW)
* Help build out the governance strategy for the data platform
* Break down requirements and get clarity on the critical use cases
* Be self-directed and comfortable supporting the data needs of multiple teams

Technical Requirements

* Experience designing and building data pipelines using at least one traditional ELT/ETL tool
* Knowledge of modern data engineering techniques and tools, especially data ingestion, data quality control, curation and enrichment, and distribution
* Experience with traditional RDBMSs
* Experience with at least one cloud platform, particularly AWS
* Experience with tools similar to AWS AppFlow that automate data flows between SaaS applications and AWS services
* Experience designing and building high-performance data pipelines
* Advanced experience with SQL
* Experience with event-driven architectures (Pub/Sub, Kafka, RabbitMQ, AMQP, SNS, etc.)
* Experience with Big Data infrastructure (e.g., Snowflake or Hadoop)
* Experience with overall data governance concepts
* Strong customer focus and the ability to recognize the impact of decisions on end users
* Strong verbal and written communication skills and the ability to interact with team members and external customers
* 7+ years of experience working with application development teams, including:
  * Owning and optimizing analytics schemas: helping design the structure of the data warehouse and implementing optimization techniques to make data accessible and well distributed
  * ETL scheduling: automating tasks and data refreshes while balancing processing/cost requirements with business needs
  * Modeling complex data sets to enable analytics teams to query and distribute business intelligence assets

Bonus Skills

* Experience with data pipeline development for AI/ML is a big plus.

Required profile

Experience

Spoken language(s): English

Other Skills

  • Communication
  • Persistence
  • Critical Thinking
  • Enthusiasm
  • Self-Motivation
