
Data Engineer

Remote: Full Remote
Contract:
Experience: Mid-level (2-5 years)
Work from:

Offer summary

Qualifications:

  • Expert knowledge of Databricks and Delta Live Tables
  • Proficient in SQL and Python
  • Experience in designing data pipelines
  • Familiarity with cloud data lakes and warehouses
  • Knowledge of DevOps principles

Key responsibilities:

  • Design and maintain data products using Databricks
  • Ensure data quality with automated testing
  • Collaborate with stakeholders to understand data requirements
  • Develop data infrastructure and integrations
  • Troubleshoot and resolve data-related issues
Allata (Scaleup) · https://allata.com/
201-500 employees

Job description

Allata is an IT company dedicated to strategy, architecture, and enterprise-level application development, with offices in the US, India, and Argentina. We aim to be strategic advisors for our clients, focusing on helping them enhance or scale business opportunities, create efficiencies, automate processes through custom technologies, and find elegant solutions to inefficient processes.

We provide Data Analytics, Advanced Integrations, Product Launch, Experience Design, Support, Cloud, DevOps, and Software Development services, among others. Our agile, centralized development teams, supported by on-site senior leadership, allow us to work with you as a stand-alone group, whether integrating into your in-house dev teams or providing external architectural guidance.



We are actively seeking an experienced Data Engineer with strong proficiency in Databricks and Delta Live Tables to join our team.

The successful candidate will play a vital role in elevating our client's current data ecosystem and solutions. The primary objectives involve migrating existing data pipelines, crafting new data products to support enterprise analytics, and implementing data quality standards tailored to specific business use cases.

Role & Responsibilities:
  • Design, build, and maintain reusable data products with Databricks, Delta Live Tables (DLT), dbt, Python, and SQL (see the sketch after this list).
  • Ensure data accuracy, consistency, timeliness, and completeness with automated testing and reporting.
  • Develop, maintain, and improve data infrastructure, data pipeline architecture, and data integration between various systems, with a clear focus on leveraging Databricks and DLT.
  • Collaborate with different stakeholders to understand their data needs and requirements.
  • Build scalable and efficient data processing solutions to support data-driven applications.
  • Develop and maintain documentation related to data engineering processes and procedures.
  • Ensure data security and compliance with data privacy regulations.
  • Troubleshoot and resolve data-related issues and incidents.
  • Keep up to date with emerging trends and technologies in data engineering and incorporate them into our data processes.
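
As a concrete illustration of the first two responsibilities, here is a minimal Delta Live Tables sketch. It assumes a Databricks DLT runtime (where spark is provided); the table names, columns, and landing-zone path are hypothetical. Rows failing the expectations are dropped, and the drop counts surface in the DLT event log for automated quality reporting.

    import dlt
    from pyspark.sql import functions as F

    # Bronze: raw orders ingested as-is via Auto Loader (path is hypothetical).
    @dlt.table(comment="Raw orders from the landing zone.")
    def orders_bronze():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/orders")  # hypothetical path
        )

    # Silver: validated, typed records; failing rows are dropped and counted.
    @dlt.table(comment="Validated orders with typed columns.")
    @dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
    @dlt.expect_or_drop("non_negative_amount", "amount >= 0")
    def orders_silver():
        return (
            dlt.read_stream("orders_bronze")
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .select("order_id", "customer_id", "order_ts", "amount")
        )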

Hard Skills - Must have:
  • Expert-level knowledge of Databricks and Delta Live Tables, along with SQL and Python, to write complex, highly optimized queries across large volumes of data.
  • Experience designing and building dimensional models (see the gold-layer sketch after this list).
  • Expertise in building and maintaining data pipelines with tools such as Databricks DLT (preferred), dbt, Matillion, AWS Glue, Azure Data Factory, or Fivetran.
  • Experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
  • Experience with the medallion architecture.
  • Expertise in cloud data lakes such as Databricks Lakehouse, Azure Storage, or AWS S3.
  • Expertise in cloud data warehouses like Snowflake (preferred), Redshift, or BigQuery.
  • Knowledge of batch and streaming data processing techniques.
  • Understanding of the Data Lifecycle Management process to collect, access, use, store, transfer, and delete data.
  • Experience applying DevOps principles to data projects and familiarity with tools and concepts such as Git, infrastructure as code, and CI/CD.
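
To make the dimensional-model and medallion items concrete, here is a small gold-layer sketch continuing the hypothetical pipeline above: it derives a star-schema fact table from the validated silver layer. The surrogate date key and column names are illustrative assumptions, not a prescribed design.

    import dlt
    from pyspark.sql import functions as F

    # Gold: one row per order, keyed for joining to a conformed date dimension.
    @dlt.table(comment="Order facts for enterprise analytics.")
    def fact_orders():
        return (
            dlt.read("orders_silver")
            .withColumn(
                "order_date_key",
                F.date_format("order_ts", "yyyyMMdd").cast("int"),
            )
            .select("order_id", "customer_id", "order_date_key", "amount")
        )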

Hard Skills - Nice to have / It's a plus:
  • Knowledge of or experience with architectural best practices for building data lakes.
  • Experience with dbt.
  • Experience with BI tools and with deploying and maintaining data models (Power BI, Tableau, Qlik, etc.).
  • Obsession with service observability, instrumentation, monitoring, and alerting.
  • Experience with big data technologies (Python, Spark, Data Lake, Delta Lake, Hive, Azure Data Lake Storage Gen2).
  • Experience configuring, connecting, and maintaining cloud platforms and services to extend the capabilities of data platforms.

Soft Skills / Business Specific Skills:
  • Experience working with distributed teams and clients.
  • Knowledge of Agile and SDLC concepts, practices, and techniques.
  • Proactive, analytical attitude; strong self-motivation, organization, and attention to detail.
  • Strong English communication skills, both written and spoken.
Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English
