Manage and operate the Kafka streaming platform using Confluent SaaS, ensuring high availability, scalability, and performance of the messaging infrastructure
Design, implement, and maintain infrastructure as code (IaC) for the provisioning and configuration of Kafka environments using tools such as Terraform or similar
Monitor and troubleshoot Kafka clusters and components
Collaborate with integration platform teams to ensure seamless and secure Kafka topic creation, subscription, and management workflows within the DeveloperHub portal
Develop and maintain automation scripts and pipelines to support Kafka lifecycle management and improve operational efficiency (see the provisioning sketch after this list)
Support customer-facing self-service capabilities
Implement and enforce security policies and access controls
Work closely with development, product, and support teams to gather requirements, design solutions, and provide expert guidance on best practices
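To illustrate the kind of lifecycle automation described above, here is a minimal sketch using the confluent-kafka Python AdminClient; the cluster endpoint, credentials, and topic settings are placeholders, not values tied to this role:

    # Minimal sketch: automated topic provisioning with the confluent-kafka
    # AdminClient. Endpoint, credentials, and topic settings are placeholders.
    from confluent_kafka.admin import AdminClient, NewTopic

    admin = AdminClient({
        "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",  # hypothetical endpoint
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",      # injected from a secret store in practice
        "sasl.password": "<api-secret>",
    })

    # Declare the desired topic; create_topics() returns a dict of futures.
    topic = NewTopic("orders.events", num_partitions=6, replication_factor=3)
    for name, future in admin.create_topics([topic]).items():
        try:
            future.result()  # raises if creation failed
            print(f"created topic {name}")
        except Exception as exc:
            print(f"topic {name} not created: {exc}")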
Qualifications
Must-haves:
Apache Kafka Expertise: Deep understanding of Kafka architecture and internal workings, including topics, partitions, brokers, producers, consumers, and Kafka Streams (see the producer sketch after this list)
Streaming Architecture: Ability to design, implement, and optimize streaming applications using Kafka, AWS, Kubernetes, GitOps, and other relevant technologies and tools
API Development/Integration: Ability to develop and integrate APIs for publishing and subscribing to streaming data. Experience with RESTful service patterns and event-driven architecture
Streaming Data Processing: Understanding of stream processing principles and the ability to design and implement real-time data processing pipelines
Cloud Services Proficiency: Proficiency with cloud platforms such as AWS and Azure, including their managed Kafka and Kafka-compatible services (e.g., Amazon MSK, Azure Event Hubs)
Infrastructure as Code (IaC): Skills in using tools like Terraform or AWS CloudFormation to define and provision the cloud infrastructure in a repeatable and consistent manner
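To make the producer/consumer vocabulary above concrete, a minimal sketch of a keyed producer with a delivery callback, again using the confluent-kafka Python client; the broker address and topic name are hypothetical:

    # Minimal sketch: keyed produce with a delivery callback. Messages with
    # the same key land in the same partition, preserving per-key ordering.
    import json
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "localhost:9092"})  # hypothetical broker

    def on_delivery(err, msg):
        # Called once per message from poll()/flush() with the broker's verdict.
        if err is not None:
            print(f"delivery failed: {err}")
        else:
            print(f"delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")

    for order_id in ("o-1", "o-2", "o-1"):
        producer.produce("orders.events", key=order_id,
                         value=json.dumps({"order_id": order_id}),
                         on_delivery=on_delivery)
        producer.poll(0)  # serve delivery callbacks without blocking

    producer.flush()  # block until all outstanding messages are acknowledged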
Good to have:
Familiarity with the Confluent SaaS offering, including setting up and managing Kafka clusters, Schema Registry, Kafka Connect, and ksqlDB, as well as understanding Confluent's specific tooling and features
Ability to ensure code quality, reliability, and security (e.g., through code review, automated testing, and CI pipelines)
Understanding of security best practices related to streaming data, such as encryption in transit and at rest, role-based access control, and compliance standards
Skills in setting up monitoring, alerting, and logging for Kafka and the applications using it, possibly with tools like Prometheus, Grafana, ELK stack, or cloud-specific monitoring tools
Ability to write scripts in languages like Python, Bash, or PowerShell to automate repetitive tasks and deployments
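As a small example of the scripting called for above, a sketch that flags partitions whose in-sync replica set has shrunk; the broker address is a placeholder:

    # Minimal sketch: a health-check script that warns when a partition's
    # in-sync replica set is smaller than its replica set.
    from confluent_kafka.admin import AdminClient

    admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # hypothetical broker
    metadata = admin.list_topics(timeout=10)

    for topic in metadata.topics.values():
        for pid, partition in topic.partitions.items():
            if len(partition.isrs) < len(partition.replicas):
                print(f"WARN {topic.topic}[{pid}]: "
                      f"{len(partition.isrs)}/{len(partition.replicas)} replicas in sync")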
Things to know before departure:
Start: by arrangement, always on the 1st or 15th of the month
Working hours: full-time (40h); 27 vacation days
Employment contract: CIM, permanent (unlimited term)
Line of work: Consulting
Language skills: Fluency in written and spoken English
Flexibility & willingness to travel
Other: a valid work permit
For more details, salary, and company information, use the apply link