7-10+ years in Java development; experience with cloud development, AWS, and distributed computing frameworks including Kafka.
Key responsibilities:
Architect, design, and build streaming solutions for financial risk management
Contribute to enterprise transformation into a data-driven organization
The final round of interviews must be in person in Chicago or Dallas.
WORK TO BE PERFORMED: The person hired will architect, design, and build streaming solutions as part of a risk-based set of systems. We need a major contributor to the development of scalable, resilient, hybrid cloud-based distributed computing solutions supporting critical financial risk management activities. You will help transform the enterprise into a data-driven organization. The role is for someone with cloud development experience and the ability to design large-scale, microservices-based streaming solutions. This person should also have hands-on technical skills in creating prototypes and in setting the right standards around software development practices.

II. SKILL AND EXPERIENCE REQUIRED:

Must have:
• 7-10+ years of technical experience building newly designed and configured data-centric software solutions
• Advanced knowledge of Java 8+, with experience using multithreading, collections, the Streams API, and functional programming on real enterprise projects
• Minimum of one year developing cloud-native streaming applications using Kafka, Kafka Streams, and Spring Boot
• Hands-on experience with a high-speed distributed computing framework such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, or Apache Flink
• Hands-on experience with one of these distributed data stores: HBase, Cassandra, MongoDB, or AWS DynamoDB
• Some hands-on experience with a distributed message broker such as Kafka, RabbitMQ, ActiveMQ, or Amazon Kinesis
• Hands-on experience with foundational AWS services such as VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM, plus experience with Big Data architectures and BI solutions
• Comprehensive debugging and troubleshooting skills, resourcefulness, and strong research skills
• Proficiency and demonstrated skill in both oral and written business communication

Additional qualifications (nice to have):
• Intermediate working knowledge of DevOps tools such as Terraform, Ansible, Jenkins, Maven/Gradle, Nexus/Artifactory, and CI/CD pipelines
Required profile
Experience
Level of experience: Senior (5-10 years)
Industry:
Information Technology & Services
Spoken language(s):
English