Datamatics Technologies
Senior Kafka Developer
About the job
Job Title: Senior Kafka Developer
Location: Gothenburg, Sweden
Experience: 7-8 Years
Employment Type: Full-Time
Eligibility: Candidates with a valid EU/Sweden work permit can apply

Job Summary
We are seeking an experienced Senior Kafka Developer to join our engineering team in Gothenburg, Sweden. The ideal candidate will have strong expertise in Apache Kafka, event-driven architecture, and real-time data streaming platforms. You will be responsible for designing, developing, and maintaining scalable streaming solutions that support high-volume data pipelines and enterprise integration use cases. The role requires deep technical knowledge of distributed systems, strong programming skills, and experience building resilient data processing architectures.

Key Responsibilities
- Design, develop, and maintain high-performance Kafka-based streaming applications and data pipelines.
- Architect and implement event-driven microservices and real-time data streaming solutions.
- Build and manage Kafka producers, consumers, topics, and stream processing applications.
- Optimize Kafka clusters for scalability, performance, and reliability.
- Implement Kafka security, monitoring, and fault-tolerant architecture.
- Work with cross-functional teams including data engineers, backend developers, and DevOps teams.
- Ensure high availability and low latency in real-time data streaming platforms.
- Troubleshoot production issues and provide performance tuning and optimization.
- Implement data governance, schema management, and data integration frameworks.
- Contribute to architecture discussions and define best practices for event streaming platforms.

Required Skills & Qualifications
- 7-8 years of experience in backend development and distributed systems.
- Strong hands-on experience with Apache Kafka ecosystem.
- Experience with Kafka Streams, Kafka Connect, and Schema Registry.
- Proficiency in Java, Scala, or Python for building streaming applications.
- Experience with microservices architecture and RESTful APIs.
- Good understanding of event-driven architecture and real-time data processing.
- Experience working with containerization technologies such as Docker and Kubernetes.
- Strong knowledge of data pipelines, ETL processes, and message-driven systems.
- Experience with monitoring tools and logging frameworks for distributed systems.
- Experience with Confluent Kafka Platform or similar enterprise streaming platforms.
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with big data technologies such as Spark, Flink, or Hadoop.
- Strong problem-solving skills and ability to work in high-scale production environments.

What We Offer
- Opportunity to work on large-scale real-time data platforms.
- Collaborative and innovative engineering environment.
- Competitive salary and relocation support (if applicable).
- Long-term opportunity in a growing technology team in Sweden.