5+ years in software development and data management; expertise in AWS services and SQL databases.
Key responsibilities:
Design, implement and maintain data pipelines using AWS services
Support ETL operational processes; ensure data security and quality
Prototype future state solutions for cost reduction and performance improvement
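As a minimal sketch of the pipeline work described above, the snippet below shows an extract-transform-load step with a basic data-quality check. The function names, the quality rule, and the sample payload are all hypothetical; a production pipeline would run on AWS services such as Glue, Lambda, or Kinesis rather than in-process lists.

```python
import json

# Hypothetical raw event feed; in practice this would arrive from S3 or Kinesis.
RAW_EVENTS = [
    '{"user_id": "u1", "page": "/home", "duration_ms": 1200}',
    '{"user_id": "u2", "page": "/cart", "duration_ms": -5}',   # bad duration
    '{"user_id": "u1", "page": "/checkout", "duration_ms": 800}',
]

def clean_record(raw: str):
    """Parse one JSON event and drop records that fail a basic quality check."""
    rec = json.loads(raw)
    if rec.get("duration_ms", -1) < 0:
        return None  # data-quality rule (illustrative): negative durations are invalid
    return rec

def run_pipeline(events):
    """Extract -> transform -> load (here, 'load' just returns the cleaned batch)."""
    return [r for r in (clean_record(e) for e in events) if r is not None]

cleaned = run_pipeline(RAW_EVENTS)
print(len(cleaned))  # prints 2: one invalid record was filtered out
```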
Advansys Inc. is a global IT services company that provides high-quality software engineering, technology consulting, and outsourcing services to its clients. Incorporated in 1998 by a group of highly skilled and dedicated business and technology experts, Advansys is headquartered in Herndon, VA.
Our core competencies in technology, business, and management, our core values, our impeccable track record of empowering business with innovation and technology, and our commitment to our customers' success have driven our presence across industries such as Financial Services, Insurance, Health Care, Logistics, Communications, and Media & Internet over the past decade.
We have a proven track record of delivering services to Fortune 1000 companies.
Our Core Values that keep us moving forward are:
* Integrity in our dealings.
* Excellence in service.
* Respect for all we serve and our fellow employees.
* Continuous improvement in our work.
* Belief in teamwork.
Our Strengths that put us above the competition are:
* Business domain expertise.
* Innovation.
* Technology.
* Results orientation.
* Proven success.
* Proven performance.
* Award-winning quality.
* Retention of clientele.
Our Motto that has established our presence across various industries is "Our success lies in our client's success."
The mantra that we believe in is "Empowering Business with Innovation and Technology".
Position: Data Engineer
Location: Remote
Contract: Long-term, ongoing
Visa: Open
Notes: The position is for an AWS data engineer; a brief description is attached. Highlights:
* Strong, in-depth experience with AWS data services and technologies
* Strong Python developer
* Strong server scripting
* Strong SQL querying skills
* Big bonus for candidates who have worked with Adobe clickstream as a data feed
* Big bonus for candidates who have also worked with Microsoft Azure, Databricks, etc. (but primary recent experience should be AWS)
Looking for a Cloud Data Engineer to leverage AWS data services to solve complex data problems. You will support analytics & BI teams with database solutions, performing tasks such as data transformation, data integration, creating and maintaining data pipelines, ensuring data quality, and data warehousing. Multiple projects are in the pipeline to ingest data from a variety of sources, including digital interaction logs, marketing campaigns, and guest services & bot logs.
Responsibilities:
* Demonstrate proficiency in choosing the right streaming technologies and ETL tools, and articulate the business and technical reasoning to business leaders.
* Design, plan, create, implement, and document data storage solutions using AWS services to deliver technology solution projects.
* Develop and maintain data pipelines.
* Support ETL operational processes, including but not limited to automation, job scheduling, dependencies, monitoring, maintenance, and administration.
* Create custom integrations between existing legacy databases, AWS, and other database products.
* Implement data security measures; ensure data quality.
* Evaluate current states and prototype future-state solutions that decrease costs and improve performance.
Qualifications:
* 5+ years of software development experience across diverse domains, including at least five years solving problems in data management and distributed systems.
* Understanding of core AWS services, their uses, and AWS architecture best practices.
* Expert knowledge of SQL databases, SQL queries, Amazon Athena, and data formats such as XML, TSV, and JSON.
* Experience with relational database management systems (RDBMS) such as Oracle, Teradata, SQL Server, or Snowflake, and version control systems such as Git, Bitbucket, or GitHub.
* Experience designing and implementing solutions using AWS services, with a strong understanding of at least some core services: EC2, S3, Redshift, Spectrum, Kinesis, Lambda, Athena, Glue, IAM.
* Strong experience in Python development and shell scripting (Bash, AWS CLI, UNIX command line, Z shell).
* Exposure to Azure Data Lake, Azure Databricks, and Google BigQuery a plus.
* Experience with Adobe Clickstream feeds desired.
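The qualifications above mention handling data formats such as TSV and JSON. As an illustrative sketch of that kind of format conversion, the snippet below turns a tab-separated feed into JSON records using only the Python standard library; the sample feed and the `tsv_to_json` helper are invented for illustration.

```python
import csv
import io
import json

# Hypothetical TSV feed; real feeds would typically be read from S3 or a warehouse export.
tsv_feed = "user_id\tpage\thits\nu1\t/home\t3\nu2\t/cart\t1\n"

def tsv_to_json(tsv_text: str) -> str:
    """Read tab-separated rows and emit a JSON array of objects."""
    rows = list(csv.DictReader(io.StringIO(tsv_text), delimiter="\t"))
    for row in rows:
        row["hits"] = int(row["hits"])  # cast numeric columns from strings
    return json.dumps(rows)

print(tsv_to_json(tsv_feed))
```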
Required profile
Experience
Level of experience: Senior (5-10 years)
Industry:
Information Technology & Services
Spoken language(s):
English
Check out the description to know which languages are mandatory.