Bromley Business Centre, 27 Hastings Road, Bromley, Kent BR2 8NA, UK

Kafka Lead

Position: Kafka Lead
Nature of hiring: Contract
Duration: 12 months
Location: Södertälje, Sweden
Rate: Open
Remote: Hybrid (2 days in the office, 3 days WFH) for the next 6 months.

Must have 8+ years of experience, with client-facing and team-handling capabilities.
Excellent communication skills; able to understand client requirements and implement them together with the team.
Should be able to act as the Kafka project lead, preparing project plans and timelines and working with end customers for a successful migration.
Must-have skills: Confluent Kafka, project leadership, project planning, and onboarding/migration experience for end users.
Good-to-have skills: AWS, Kubernetes, DevOps.

Job Description:
· Provide expertise and hands-on experience with Kafka Connect using Schema Registry in a very high-volume environment (~900 million messages).
· Provide expertise in Kafka brokers, ZooKeeper, ksqlDB, Kafka Streams, Connect, Replicator, Cluster Linking, Schema Registry, and Confluent Control Center.
· Provide expertise and hands-on experience with Avro, JSON, and String converters.
· Provide expertise and hands-on experience with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, as well as tasks, workers, converters, and transforms.
· Provide expertise and hands-on experience building custom connectors using Kafka core concepts and APIs.
· Hands-on security setup with different authentication and authorization mechanisms such as SASL, GSSAPI, SSL, and OAuth 2.0; working knowledge of Kafka REST Proxy.
· Ensure optimum performance, high availability, and stability of solutions: create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices.
· Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms such as Java, Python, C, and microservices. Leverage Hadoop-ecosystem knowledge to design and develop solutions using Spark, Scala, Python, Hive, Kafka, and related tools.
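To illustrate the kind of configuration work the bullets above describe, here is a minimal sketch in Python that assembles a SASL_SSL client configuration (keys follow the librdkafka/confluent-kafka naming convention) and a JDBC source connector configuration with Avro converters backed by a Schema Registry. All hostnames, topic prefixes, and credentials are hypothetical placeholders, not values from this posting.

```python
# Illustrative sketch only: builds configuration dictionaries of the kind
# described in the job description. Endpoints and credentials are placeholders.

def sasl_ssl_client_config(bootstrap_servers, username, password,
                           mechanism="SCRAM-SHA-512"):
    """Client-side settings for a SASL_SSL-secured cluster."""
    return {
        "bootstrap.servers": bootstrap_servers,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": mechanism,
        "sasl.username": username,
        "sasl.password": password,
    }

def jdbc_source_connector_config(name, jdbc_url, topic_prefix,
                                 schema_registry_url):
    """Kafka Connect JDBC source connector using Avro converters
    registered against a Schema Registry."""
    return {
        "name": name,
        "config": {
            "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
            "connection.url": jdbc_url,
            "mode": "incrementing",
            "incrementing.column.name": "id",
            "topic.prefix": topic_prefix,
            "key.converter": "io.confluent.connect.avro.AvroConverter",
            "key.converter.schema.registry.url": schema_registry_url,
            "value.converter": "io.confluent.connect.avro.AvroConverter",
            "value.converter.schema.registry.url": schema_registry_url,
            "tasks.max": "1",
        },
    }

client = sasl_ssl_client_config("broker1:9092", "svc-user", "secret")
connector = jdbc_source_connector_config(
    "orders-jdbc-source",
    "jdbc:oracle:thin:@dbhost:1521/ORCL",
    "oracle.",
    "http://schema-registry:8081",
)
```

In practice the connector dictionary would be POSTed to the Kafka Connect REST API, and the client dictionary passed to a producer or consumer constructor.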

· Experience with RDBMS systems, particularly Oracle 11g/12c; use automation tools for provisioning such as Jenkins, GitLab, and Terraform.
· Ability to perform data-related benchmarking, performance analysis, and tuning.
· Strong skills in in-memory applications, database design, and data integration. Being a Confluent Certified Developer for Apache Kafka would be an additional advantage. Monitoring setup knowledge with Grafana, Datadog, and Prometheus.
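As a small illustration of the benchmarking and performance-analysis work mentioned above, the sketch below computes two metrics commonly reported for a Kafka producer run: sustained throughput and tail (p99) latency. It is a generic sketch using only the Python standard library; the numbers fed to it are hypothetical.

```python
import statistics

def throughput_msgs_per_sec(message_count, elapsed_seconds):
    """Sustained throughput of a benchmark run in messages per second."""
    return message_count / elapsed_seconds

def p99_latency_ms(latencies_ms):
    """99th-percentile latency. statistics.quantiles with n=100 returns
    99 cut points; index 98 is the 99th percentile."""
    return statistics.quantiles(latencies_ms, n=100)[98]

# Hypothetical run: 900 million messages produced over 30 minutes.
rate = throughput_msgs_per_sec(900_000_000, 30 * 60)  # 500,000 msgs/sec
```

A real benchmark would collect per-message send latencies from producer delivery callbacks and feed them to `p99_latency_ms`.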