A library that provides an in-memory Kafka instance to run your tests against.
Inspired by kafka-unit.
embedded-kafka is available on Maven Central, compiled for Scala 2.12 and 2.13. Versions match the version of Kafka they're built against.

Prior to v2.8.0, Kafka core inlined the Scala library, so you couldn't use a different Scala patch version than the one Kafka used to compile its jars!

From v2.8.0 onwards the package name has been updated to reflect the library group id (i.e. `io.github.embeddedkafka`). Aliases to the old package name have been added, along with a one-time Scalafix rule to ensure the smoothest migration.
In your `build.sbt` file add the following dependency (replace `x.x.x` with the appropriate version):

```scala
"io.github.embeddedkafka" %% "embedded-kafka" % "x.x.x" % Test
```
- Have your class extend the `EmbeddedKafka` trait.
- Enclose the code that needs a Kafka broker within the `withRunningKafka` closure.

An example, using ScalaTest:
```scala
class MySpec extends AnyWordSpecLike with Matchers with EmbeddedKafka {

  "runs with embedded kafka" should {
    "work" in {
      withRunningKafka {
        // ... code goes here
      }
    }
  }
}
```
If you don't want to use the `withRunningKafka` method, an `EmbeddedKafka` companion object is provided for usage without extending the `EmbeddedKafka` trait. ZooKeeper and Kafka can be started and stopped in a programmatic way. This is the recommended usage if you have more than one test in your file and you don't want to start and stop Kafka and ZooKeeper on every test.
```scala
class MySpec extends AnyWordSpecLike with Matchers {

  "runs with embedded kafka" should {
    "work" in {
      EmbeddedKafka.start()

      // ... code goes here

      EmbeddedKafka.stop()
    }
  }
}
```
Please note that, in order to avoid Kafka instances not shutting down properly, it's recommended to call `EmbeddedKafka.stop()` in an `after` block or in similar teardown logic.
It's possible to change the ports on which ZooKeeper and Kafka are started by providing an implicit `EmbeddedKafkaConfig`:
```scala
class MySpec extends AnyWordSpecLike with Matchers with EmbeddedKafka {

  "runs with embedded kafka on a specific port" should {
    "work" in {
      implicit val config = EmbeddedKafkaConfig(kafkaPort = 12345)

      withRunningKafka {
        // now a kafka broker is listening on port 12345
      }
    }
  }
}
```
If you want to run ZooKeeper and Kafka on arbitrary available ports, you can
use the withRunningKafkaOnFoundPort
method. This is useful to make tests more
reliable, especially when running tests in parallel or on machines where other
tests or services may be running with port numbers you can't control.
```scala
class MySpec extends AnyWordSpecLike with Matchers with EmbeddedKafka {

  "runs with embedded kafka on arbitrary available ports" should {
    "work" in {
      val userDefinedConfig = EmbeddedKafkaConfig(kafkaPort = 0, zooKeeperPort = 0)

      withRunningKafkaOnFoundPort(userDefinedConfig) { implicit actualConfig =>
        // now a kafka broker is listening on actualConfig.kafkaPort
        publishStringMessageToKafka("topic", "message")
        consumeFirstStringMessageFrom("topic") shouldBe "message"
      }
    }
  }
}
```
The same implicit `EmbeddedKafkaConfig` is used to define custom consumer or producer properties:
```scala
class MySpec extends AnyWordSpecLike with Matchers with EmbeddedKafka {

  "runs with custom producer and consumer properties" should {
    "work" in {
      val customBrokerConfig = Map(
        "replica.fetch.max.bytes" -> "2000000",
        "message.max.bytes" -> "2000000")

      val customProducerConfig = Map("max.request.size" -> "2000000")
      val customConsumerConfig = Map("max.partition.fetch.bytes" -> "2000000")

      implicit val customKafkaConfig = EmbeddedKafkaConfig(
        customBrokerProperties = customBrokerConfig,
        customProducerProperties = customProducerConfig,
        customConsumerProperties = customConsumerConfig)

      withRunningKafka {
        // now a kafka broker is listening with the custom properties applied
      }
    }
  }
}
```
This works for `withRunningKafka`, `withRunningKafkaOnFoundPort`, and `EmbeddedKafka.start()`.
It is also possible to provide custom properties to the broker while starting Kafka. `EmbeddedKafkaConfig` has a `customBrokerProperties` field which can be used to provide extra properties as a `Map[String, String]`. Those properties will be added to the broker configuration. Be careful: some properties are set by the library itself, and in case of conflict the `customBrokerProperties` values will take precedence. Please look at the source code to see what these properties are.
The `EmbeddedKafka` trait also provides some utility methods to interact with the embedded Kafka, in order to set preconditions or verifications in your specs:

```scala
def publishToKafka(topic: String, message: String): Unit
def consumeFirstMessageFrom(topic: String): String
def createCustomTopic(topic: String, topicConfig: Map[String, String], partitions: Int, replicationFactor: Int): Unit
```
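As a sketch of how these helpers might fit together inside a spec (the topic name and message are illustrative, and implicit `String` serializers/deserializers are assumed to be in scope):

```scala
import io.github.embeddedkafka.EmbeddedKafka
import org.scalatest.matchers.should.Matchers
import org.scalatest.wordspec.AnyWordSpecLike

class UtilityMethodsSpec extends AnyWordSpecLike with Matchers with EmbeddedKafka {

  "the utility methods" should {
    "round-trip a message through a custom topic" in {
      withRunningKafka {
        // Create the topic explicitly to control partitioning.
        createCustomTopic("greetings", Map.empty, partitions = 2, replicationFactor = 1)

        // Set the precondition...
        publishToKafka("greetings", "hello")

        // ...and verify it in the same spec.
        consumeFirstMessageFrom("greetings") shouldBe "hello"
      }
    }
  }
}
```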
Given implicit `Serializer`s for each type and an `EmbeddedKafkaConfig`, it is possible to use `withProducer[A, B, R] { your code here }` where `R` is the code's return type.

Given implicit `Deserializer`s for each type and an `EmbeddedKafkaConfig`, it is possible to use `withConsumer[A, B, R] { your code here }` where `R` is the code's return type.

For more information about how to use the utility methods, you can either look at the Scaladocs or at the tests of this project.
A test using the loan methods can be as simple as this:

```scala
import java.util.Collections

import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{Deserializer, Serializer, StringDeserializer, StringSerializer}

import scala.concurrent.duration._
import scala.jdk.CollectionConverters._

implicit val serializer: Serializer[String] = new StringSerializer()
implicit val deserializer: Deserializer[String] = new StringDeserializer()

val key = "key"
val value = "value"
val topic = "loan_method_example"

EmbeddedKafka.withProducer[String, String, Unit](producer =>
  producer.send(new ProducerRecord[String, String](topic, key, value)))

EmbeddedKafka.withConsumer[String, String, Assertion](consumer => {
  consumer.subscribe(Collections.singletonList(topic))

  eventually {
    val records = consumer.poll(java.time.Duration.ofMillis(1.seconds.toMillis)).asScala
    records should have size 1
    records.head.key shouldBe key
    records.head.value shouldBe value
  }
})
```
A library that builds on top of `embedded-kafka` to offer easy testing of Kafka Streams.

It takes care of instantiating and starting your streams as well as closing them after running your test-case code.

In your `build.sbt` file add the following dependency (replace `x.x.x` with the appropriate version):

```scala
"io.github.embeddedkafka" %% "embedded-kafka-streams" % "x.x.x" % Test
```
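A sketch of what a streams test might look like, assuming `runStreams` takes the topics to create plus a `Topology` (the topic names, the pass-through topology, and the spec name are illustrative, not part of the library):

```scala
import io.github.embeddedkafka.streams.EmbeddedKafkaStreams
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.serialization.Serdes._
import org.scalatest.matchers.should.Matchers
import org.scalatest.wordspec.AnyWordSpecLike

class MyStreamsSpec extends AnyWordSpecLike with Matchers with EmbeddedKafkaStreams {

  "a pass-through topology" should {
    "forward messages from input to output" in {
      val builder = new StreamsBuilder
      // Illustrative topology: copy every record from "in" to "out".
      builder.stream[String, String]("in").to("out")

      // runStreams creates the topics, starts the streams,
      // and closes them after the block completes.
      runStreams(Seq("in", "out"), builder.build()) {
        publishToKafka("in", key = "k", message = "v")
        consumeFirstKeyedMessageFrom[String, String]("out") shouldBe ("k" -> "v")
      }
    }
  }
}
```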
- Have your `Spec` extend the `EmbeddedKafkaStreams` trait. This offers both streams management and easy loaning of producers and consumers for asserting resulting messages in output/sink topics.
- Use `EmbeddedKafkaStreams.runStreams` together with `EmbeddedKafka.withConsumer` and `EmbeddedKafka.withProducer`. This allows you to create your own consumers of custom types as seen in the example test.

A library that builds on top of `embedded-kafka` to offer easy testing of Kafka Connect.
It takes care of instantiating and starting a Kafka Connect server as well as closing it after running your test-case code.
In your `build.sbt` file add the following dependency (replace `x.x.x` with the appropriate version):

```scala
"io.github.embeddedkafka" %% "embedded-kafka-connect" % "x.x.x" % Test
```
- Have your `Spec` extend the `EmbeddedKafkaConnect` trait.
- Use `EmbeddedKafkaConnect.startConnect`. This allows you to start a Kafka Connect server to interact with as seen in the example test.