Azure Kafka_Parthiban_Capgemini

Remote: Full Remote

Offer summary

Qualifications:

  • 7+ years of IT experience in the BI/DW domain with hands-on Azure modern data platform experience.
  • Proficient in data analysis and transformation using Python/R/Scala on Azure Databricks or Apache Spark.
  • Strong understanding of NoSQL data store concepts and distributed processing.
  • Experience with traditional RDBMS and NoSQL databases, along with knowledge of information security principles.

Key responsibilities:

  • Develop and optimize data pipelines using Azure Data Factory, Databricks, and Synapse.
  • Analyze and transform data across structured, semi-structured, and unstructured formats.
  • Collaborate with teams using Agile methodologies and manage code using Git.
  • Administer and develop Kafka Confluent and implement ADB streaming solutions.

CodersBrain SME https://www.codersbrain.com/
201 - 500 Employees

Job description

Skills
 
• 7+ years of relevant IT experience in the BI/DW domain, including hands-on experience with the Azure modern data platform: Data Factory, Databricks, Synapse (Azure SQL DW), and Azure Data Lake
• Meaningful experience in data analysis and transformation using Python/R/Scala on Azure Databricks or Apache Spark
• Well versed in NoSQL data store concepts
• Good knowledge of distributed processing using Databricks (preferred) or Apache Spark
• Ability to debug using tools such as the Ganglia UI; expertise in optimizing Spark jobs
• Ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets
• Expert in creating data structures optimized for storage and various query patterns, e.g. Parquet and Delta Lake
• Meaningful experience in at least one database technology from each segment:
  o Traditional RDBMS (MS SQL Server, Oracle)
  o NoSQL (MongoDB, Cassandra, Neo4j, Cosmos DB, Gremlin)
• Understanding of information security principles to ensure compliant handling and management of data
• Effective communication skills
• Proficient at working with large and complex code bases (GitHub, Gitflow, Fork/Pull model) and real-time data processing
• Working experience in Agile methodologies (Scrum, XP, Kanban)

Hands-on development experience with Data Factory, Databricks, Synapse (Azure SQL DW), Azure Data Lake, and Python/R/Scala. Expertise in traditional and NoSQL databases.

• Kafka Confluent administration and development
• ADB (Azure Databricks) streaming knowledge, as sketched below
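
For illustration only, a minimal sketch of the ADB streaming work described above: consuming a Confluent Kafka topic with Spark Structured Streaming on Azure Databricks and appending it to a Delta table partitioned for date-based query patterns. The broker address, topic name, and storage paths are placeholders, and Confluent SASL authentication settings are omitted.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

# Read the raw stream from Kafka; broker and topic are placeholders.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "<confluent-bootstrap-server>:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings and derive a date
# column so the Delta table can be partitioned for date-range queries.
events = raw.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("value"),
    col("timestamp"),
    col("timestamp").cast("date").alias("event_date"),
)

# Append to a Delta table; checkpoint and table paths are placeholders.
(
    events.writeStream
    .format("delta")
    .partitionBy("event_date")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .outputMode("append")
    .start("/mnt/delta/events")
)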

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Communication
