We are a staffing and IT solutions company that provides a platform to find the right fit for your desired job profile.
The employment landscape is constantly shifting, and we are here to help you build a workforce that complements your long-term goals. We understand the struggle of finding talent with the skills and qualifications for specific profiles, talent that fits the culture of your organization.
We aim to be the best available source for young talent to find their dream jobs, helping them narrow down their options to the most suitable roles available in the market.
At RIG, we consistently work on technology that not only simplifies your work processes but also provides the most cost-effective solutions. We work hard to make sure that your business thrives and stays at the top of your field.
Responsibilities:
Collaborate as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation Big Data and Fast Data applications.
Build software and frameworks to automate high-volume, real-time data delivery between our cloud-based data platforms and applications.
Build data APIs and data delivery services that support critical operational and analytical applications for our internal business operations, customers, and partners.
Leverage DevOps techniques and practices such as Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of software, using tools like Jenkins, Maven, Nexus, Chef, Terraform, Ruby, Git, and Docker.
Perform unit testing and conduct reviews with other team members to make sure the code is rigorously designed, elegantly coded, and effectively tuned for performance.
Develop and deploy distributed data applications using Spark or PySpark.
Use programming languages such as Python, NoSQL databases, and cloud-based data warehousing services such as Redshift and Snowflake.
SQL experience
ETL experience
Basic Qualifications:
Bachelor's Degree or military experience
At least 3 years of professional work experience in data engineering
At least 3 years of experience with Python
At least 3 years of experience with SQL
At least 3 years of experience with Spark or Pyspark
At least 3 years of experience with ETL development
At least 2 years of experience working with cloud data capabilities (AWS)
Required profile
Experience
Level of experience: Mid-level (2-5 years)
Spoken language(s):
English