3+ years of development experience with Snowflake or similar data warehouse technology. Proficiency in dbt and modern data stack technologies such as Apache Airflow, Fivetran, and AWS. Extensive experience in writing and optimizing advanced SQL statements. Experience in data modeling, ETL processes, and reporting tools such as Tableau or Looker.
Key responsibilities:
Building and running data pipelines and services to support business functions and reports.
Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts.
Troubleshooting technical issues and improving data pipeline delivery.
Translating business requirements into technical specifications and owning the delivery of data models and reports.
Overture Rede is an ISO 9001:2008 certified group of companies focused on providing end-to-end information and communication services and solutions across the world. We believe in creating value for businesses and our clients through our quality services. Our commitment to delivering quality customer solutions has been the guiding factor in our development of a comprehensive menu of services.
We strive to develop solutions that allow businesses to save time and money. We are dedicated to developing solutions that alleviate problems in IT services, corporate training, competency assessments, and staffing & payroll management, so that customers can focus on more pressing issues.
As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards. We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.
As a Data Engineer you’ll be:
Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts from business functions.
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
Mentoring other junior engineers in the team.
Being a “go-to” expert for data technologies and solutions.
Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges.
Troubleshooting and resolving technical issues as they arise
Looking for ways to improve both what data pipelines the department delivers and how it delivers them.
Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports.
Owning the delivery of data models and reports end to end
Performing exploratory data analysis to identify data quality issues early in the process, and implementing tests to prevent them in the future.
Working with Data Analysts to ensure that all data feeds are optimised and available at the required times. This can include Change Data Capture (CDC) and other “delta loading” approaches.
Discovering, transforming, testing, deploying and documenting data sources
Applying, helping to define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review.
Building Looker dashboards for use cases where required.
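The “delta loading” responsibility above can be illustrated with a minimal high-watermark sketch: each run copies only the rows changed since the last successful load. This is an illustrative example only; the table and column names (`src_orders`, `tgt_orders`, `updated_at`) are hypothetical and not taken from this posting, and it uses SQLite in place of Snowflake/BigQuery purely to keep the sketch self-contained.

```python
import sqlite3

# Hypothetical source and target tables for the sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    INSERT INTO src_orders VALUES
        (1, 10.0, '2024-01-01'),
        (2, 20.0, '2024-01-02'),
        (3, 30.0, '2024-01-03');
""")

def delta_load(cur, watermark):
    """Upsert only source rows changed after `watermark`, then return the new watermark."""
    cur.execute(
        """
        INSERT INTO tgt_orders (id, amount, updated_at)
        SELECT id, amount, updated_at FROM src_orders WHERE updated_at > ?
        ON CONFLICT(id) DO UPDATE SET
            amount = excluded.amount,
            updated_at = excluded.updated_at
        """,
        (watermark,),
    )
    # The new watermark is the latest timestamp now present in the target.
    cur.execute("SELECT MAX(updated_at) FROM tgt_orders")
    return cur.fetchone()[0]

wm = delta_load(cur, '2024-01-01')  # loads rows 2 and 3 only; row 1 is skipped
```

In practice the watermark would be persisted between runs (for example in an Airflow variable or a dbt incremental model's `max(updated_at)`), rather than passed in by hand as here.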
What makes you a great fit:
3+ years of extensive development experience using Snowflake or a similar data warehouse technology.
Working experience with dbt and other modern data stack technologies, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker.
Experience with agile processes such as Scrum.
Extensive experience in writing advanced SQL statements and performance-tuning them.
Experience with data ingestion techniques using custom or SaaS tools like Fivetran.
Experience in data modelling and the ability to optimise existing and new data models.
Experience in data mining, data warehouse solutions, ETL, and using databases in a business environment with large-scale, complex datasets.
Mandatory skills
dbt, Snowflake, and SQL, along with reporting tools like Power BI, Tableau, or Looker.
Salary:
4,900,000
Required profile
Experience
Spoken language(s):
English