ROLE: Kafka Lead
Duration: 12 months
Rate: RATE OPEN
· Provide expertise and hands-on experience with Kafka Connect and Schema Registry in a very high-volume environment (~900 million messages).
· Provide expertise in Kafka brokers, ZooKeeper, ksqlDB, Kafka Streams, Connect, Replicator, Cluster Linking, Schema Registry, and Confluent Control Center.
· Provide expertise and hands-on experience with Avro, JSON, and String converters.
· Provide expertise and hands-on experience with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, along with tasks, workers, converters, and transforms.
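For illustration, a typical connector of this kind is defined by a JSON config submitted to the Connect REST API. The sketch below shows a JDBC source connector with an Avro value converter and a single transform; the connector name, connection URL, column name, and Schema Registry host are placeholders, not values from this engagement.

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "3",
    "connection.url": "jdbc:oracle:thin:@//db-host:1521/ORCL",
    "mode": "incrementing",
    "incrementing.column.name": "ID",
    "topic.prefix": "oracle-",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081",
    "transforms": "addTs",
    "transforms.addTs.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.addTs.timestamp.field": "ingested_at"
  }
}
```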
· Provide expertise and hands-on experience building custom connectors using core Kafka concepts and the Kafka Connect API.
· Hands-on security setup with different authentication and authorization mechanisms such as SASL (including GSSAPI), SSL/TLS, and OAuth 2.0. Working knowledge of Kafka REST Proxy.
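As a sketch of the kind of security setup described above, a client configured for SASL_SSL with Kerberos (GSSAPI) uses properties along these lines; hostnames, paths, and the truststore password are placeholders.

```properties
# Client-side SASL_SSL with GSSAPI (Kerberos); values are illustrative
bootstrap.servers=broker1:9093
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
ssl.truststore.location=/etc/kafka/secrets/client.truststore.jks
ssl.truststore.password=changeit
```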
· Ensure optimum performance, high availability, and stability of solutions. Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply best practices.
· Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms such as Java, Python, C, and microservices. Leverage Hadoop-ecosystem knowledge to design and develop capabilities using Spark, Scala, Python, Hive, Kafka, and related tools.
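A minimal sketch of the kind of producer/consumer stub that helps onboarding teams test message flow without a cluster: an in-memory fake that mimics the basic send/poll contract of a Kafka client. All class and topic names here are illustrative; no broker or client library is required.

```python
from collections import defaultdict, deque

class InMemoryBroker:
    """Holds per-topic message queues, standing in for a real cluster."""
    def __init__(self):
        self.topics = defaultdict(deque)

class ProducerStub:
    """Mimics the send() contract of a Kafka producer."""
    def __init__(self, broker):
        self.broker = broker

    def send(self, topic, key, value):
        self.broker.topics[topic].append((key, value))

class ConsumerStub:
    """Mimics poll(): returns the next record from one topic, or None."""
    def __init__(self, broker, topic):
        self.broker = broker
        self.topic = topic

    def poll(self):
        queue = self.broker.topics[self.topic]
        return queue.popleft() if queue else None

# Usage: exercise serialization and flow logic with no infrastructure.
broker = InMemoryBroker()
producer = ProducerStub(broker)
producer.send("orders", key="o-1", value={"amount": 42})
consumer = ConsumerStub(broker, "orders")
record = consumer.poll()
```

A stub like this lets application teams in any language validate their message contracts before pointing at a real, secured cluster.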
· Experience with RDBMS systems, particularly Oracle 11g/12c. Use automation and provisioning tools such as Jenkins, GitLab, and Terraform.
· Ability to perform data-related benchmarking, performance analysis, and tuning.
· Strong skills in in-memory applications, database design, and data integration. A Confluent Certified Developer for Apache Kafka credential would be an additional advantage. Monitoring setup knowledge with Grafana, Datadog, and Prometheus.
Job Category: Kafka Lead