Lean Tech is a rapidly expanding organization situated in Medellín, Colombia. We pride ourselves on possessing one of the most influential networks within software development and IT services for the entertainment, financial, and logistics sectors. Our corporate projections offer a multitude of opportunities for professionals to elevate their careers and experience substantial growth. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries.
We are seeking a Senior Data Engineer to support our data infrastructure using AWS (S3, RDS PostgreSQL), Airflow, Snowflake, and Power BI, while designing transactional databases to meet business needs.
Position Title: Senior Data Engineer
Location: LATAM
What you will be doing:
We are seeking a Senior Data Engineer who will support our data infrastructure by working with technical and business teams. This role requires expertise in AWS (with a focus on S3 and RDS PostgreSQL), data pipeline orchestration (Airflow), cloud data warehouse (Snowflake), and data visualization (Power BI). You will also collaborate closely with cross-functional teams to design transactional databases that support our growing business needs. Your responsibilities will include:
Data Architecture & Pipeline Design
Design and develop scalable data pipelines and workflows using Airflow to orchestrate data ingestion, transformation, and delivery.
Leverage AWS services (S3, RDS PostgreSQL, etc.) to build robust, highly available data solutions.
Collaborate with stakeholders to understand data requirements, ensuring that data models and solutions align with business objectives.
Develop and maintain transactional databases in PostgreSQL on AWS RDS, ensuring optimal performance and reliability.
Work closely with application teams to design and optimize efficient, normalized transactional data structures.
Data Modeling & Transformation
Design data models that support efficient analytics and align with business requirements.
Ensure data quality and consistency across multiple data pipelines and systems.
Cloud Data Warehouse & Analytics
Develop and optimize data schemas to support reporting and data science initiatives.
Performance Optimization & Monitoring
Implement best practices for performance tuning, capacity planning, and cost optimization across AWS, Snowflake, and other data platforms.
Monitor and troubleshoot data pipelines, proactively resolving bottlenecks and ensuring data availability.
Collaborate with cross-functional teams (Data Science, Product, IT, etc.) to ensure seamless data integration and architecture alignment.
Participate in architecture reviews, technical roadmaps, and strategic planning discussions.
Requirements & Qualifications
To excel in this role, you should possess:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in Data Engineering or a similar role.
AWS: Deep experience with S3, RDS (PostgreSQL), IAM, EC2, Lambda, and other related services.
Databases: Strong proficiency in PostgreSQL; ability to design normalized transactional database structures and optimize complex SQL queries.
Orchestration: Practical experience with Airflow for automating complex data pipelines.
Cloud Data Warehouse: Experience designing, developing, and optimizing data models in Snowflake.
BI Tools: Working knowledge of Power BI or other visualization tools for analytics and reporting.
Programming: Proficiency in Python, SQL, and shell scripting.
Nice to have
Data Modeling & Transformation: Familiarity with DBT for building, testing, and maintaining transformation pipelines.
Version Control: Hands-on expertise with Liquibase (or similar) for database version control.
CI/CD & DevOps: Experience with modern software development practices (Git, CI/CD pipelines) is a plus.
Why you will love Lean Tech
Join a powerful tech workforce and help us change the world through technology.
Professional development opportunities with international customers.
Collaborative work environment.
Career path and mentorship programs that will help you reach new levels.
Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will be vital to our continued success. Lean Tech is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Required profile
Industry: Information Technology & Services
Spoken language(s): English