Kafka Administrator/Architect 100% remote

Posted: 8 days ago | Location: United States | Salary: undisclosed

Job Description

Hands-on production experience and a deep understanding of Kafka architecture and internals, along with the interplay of its architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams

  • Strong fundamentals in Kafka administration, configuration, and troubleshooting
  • Knowledge of Kafka clustering and its fault-tolerance model supporting HA and DR
  • Practical experience with how to scale Kafka, KStreams, and Connector infrastructures, with the motivation to build efficient platforms
  • Best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to effectively use topics, partitions, and consumer groups to provide optimal routing and QoS support
  • Experience with Kafka Streams / KSQL architecture and associated clustering model
  • Hands-on development experience using the Kafka API to build producer and consumer applications, along with expertise in implementing KStreams components; has developed KStreams pipelines and deployed KStreams clusters
  • Experience developing KSQL queries and best practices for choosing between KSQL and Kafka Streams
  • Strong knowledge of the Kafka Connect framework, with experience using several connector types (HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce) and supporting wire-format translations; knowledge of connectors available from Confluent and the community
  • Hands-on experience designing, writing, and operationalizing new Kafka connectors using the framework
  • Familiarity with the Schema Registry
  • Solid programming proficiency with Java, and best practices in development
  • Experience with monitoring Kafka infrastructure along with related components (Connectors, KStreams, and other producers/consumer apps)
  • Familiarity with Confluent Control Center
  • Working knowledge of Splunk, how it integrates with Kafka, and using it effectively as a Kafka operational tool
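The partition/consumer-group bullet above can be sketched in Java. This is a simplified, dependency-free illustration, not Kafka's actual implementation: the real default partitioner hashes the record key with murmur2, while this sketch substitutes `String.hashCode()`; the class and method names are hypothetical.

```java
import java.util.List;

// Simplified sketch of key-based partition routing. Records with the
// same non-null key always map to the same partition, which is what
// preserves per-key ordering in Kafka. A consumer group then assigns
// each partition to exactly one member, so all events for a given key
// are processed in order by a single consumer.
public class PartitionRoutingSketch {

    // Kafka's default partitioner uses murmur2 on the serialized key;
    // hashCode() is a stand-in here to keep the example self-contained.
    static int partitionFor(String key, int numPartitions) {
        return Math.floorMod(key.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        int numPartitions = 6;
        List<String> keys = List.of("order-42", "order-42", "order-7");
        for (String key : keys) {
            System.out.println(key + " -> partition " + partitionFor(key, numPartitions));
        }
    }
}
```

Choosing a meaningful key (e.g. an order or account ID) is what makes this routing useful for QoS: related events stay ordered without serializing the whole topic.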
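For the Kafka Connect bullets, a minimal standalone-mode source-connector config looks like the sketch below. `FileStreamSourceConnector` is the example connector shipped with Apache Kafka; the connector name, file path, and topic here are illustrative placeholders, not values from this posting.

```properties
# Illustrative standalone Connect source config (assumed values)
name=demo-file-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
file=/tmp/demo-input.txt
topic=demo-lines
# Wire-format translation is handled by converters, e.g.:
# value.converter=org.apache.kafka.connect.json.JsonConverter
```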

Thanks & Regards

Aryan

Delivery Head - US Recruitments

Wafts solutions Inc.
32969 Hamilton Court, Suite 123, Farmington Hills, MI 48334
LinkedIn: linkedin.com/in/jidoshaadhya

- provided by Dice