The Provenance Event Stream Library provides an interface for consuming events resulting from activity on the Provenance Blockchain.
This directory contains a collection of examples showing usage of the event stream. The basic structure of Provenance Blockchain events is as follows (an illustrative code sketch of this hierarchy follows the list):
- A block may contain zero or more transactions
- A transaction may contain one or more messages
- A message may result in the emission of zero or more events of various types pertaining to the actions executed as part of that message
- An event may contain zero or more attributes (key-value pairs) containing details about the corresponding event
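To make the hierarchy concrete, the Kotlin data classes below mirror that structure. They are hypothetical and illustrative only; they are not the actual types exposed by the Event Stream Library.

```kotlin
// Illustrative only: these type and field names are hypothetical, not the
// library's actual API. They simply mirror the hierarchy described above.
data class Attribute(val key: String, val value: String)
data class Event(val type: String, val attributes: List<Attribute>)      // zero or more attributes
data class Message(val typeUrl: String, val events: List<Event>)         // zero or more events
data class Transaction(val hash: String, val messages: List<Message>)    // one or more messages
data class Block(val height: Long, val transactions: List<Transaction>)  // zero or more transactions
```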
The goal of the Event Stream Library is to make it easy to consume these events and take action on any that are relevant to the needs of your application. A block is cut approximately every 5 seconds, so under normal circumstances, when listening to live blocks, you should receive roughly one block every 5 seconds. If you are catching up from a historical block height, you can expect to receive a much faster stream of blocks until you reach the live height.
You may find it useful to look at various transactions on Provenance Explorer to understand what events result from transactions you are interested in for your application.
Please see the Event Stream Readme for further documentation.

To build and run these examples, you will need:

- Java JDK 11 (install via an SDK manager, like SdkMan)
- Gradle for building/running the examples
- Docker Compose to spin up the supporting docker-compose setup for the Kafka example
If you make use of Kafka in your system, the Event Stream provides a Kafka Connector that can be used to feed events directly into a Kafka topic, to be consumed downstream. Please see the KafkaConsumerExample for an example usage.
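For downstream consumption, a plain Apache Kafka client can read whatever the connector publishes. The sketch below uses the standard Kafka consumer API directly rather than the library's connector classes; the topic name, bootstrap server, group id, and value format are assumptions for illustration.

```kotlin
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.ByteArrayDeserializer
import org.apache.kafka.common.serialization.StringDeserializer
import java.time.Duration
import java.util.Properties

fun main() {
    // Plain Kafka consumer configuration; these values are assumptions -- see
    // KafkaConsumerExample for the library's actual usage.
    val props = Properties().apply {
        put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
        put(ConsumerConfig.GROUP_ID_CONFIG, "example-group")
        put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer::class.java.name)
        put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer::class.java.name)
    }
    KafkaConsumer<String, ByteArray>(props).use { consumer ->
        consumer.subscribe(listOf("exampleTopic"))
        while (true) {
            // Each record value holds a serialized block; how to deserialize it
            // depends on the serialization configured for the connector.
            consumer.poll(Duration.ofSeconds(1)).forEach { record ->
                println("received block record key=${record.key()}, ${record.value().size} bytes")
            }
        }
    }
}
```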
- SimpleEventStreamListener: a simple example of listening to the event stream and filtering down to blocks relevant to your application. The following environment variables can be included to modify behavior, but are not necessary for the application to run successfully:
  - `NODE_URI`: A connection URI to a Provenance Blockchain node. This value defaults to a connection to the figure.tech testnet node. If running a local node, use `http://localhost:26657`.
  - `START_HEIGHT`: The block height at which to start streaming events. This value defaults to null, which will start listening only to live blocks being cut (not historical blocks).
- KafkaConsumerExample: a simple producer/consumer setup illustrating publishing blocks to and reading blocks from a Kafka topic. Note: there is an included Docker Compose setup to stand up a local single-node Kafka instance for testing purposes, which can be run using the kafka.yml configuration. The following environment variables can be included to modify behavior, but are not necessary for the application to run successfully (see the sketch after this list for one way they might be read):
  - `KAFKA_TOPIC`: The name of the Kafka topic to which events should be published. By default, the topic name will be `exampleTopic`.
  - `KAFKA_BOOTSTRAP_SERVERS_CONFIG`: The address of the Kafka server to which the application connects. By default, this value is set to `localhost:9092` and is intended to communicate with the servers created with the local Kafka docker-compose file.
  - `NODE_URI`: A connection URI to a Provenance Blockchain node. This value defaults to a connection to the figure.tech testnet node. If running a local node, use `http://localhost:26657`.
  - `START_HEIGHT`: The block height at which to start streaming events. This value defaults to null, which will start listening only to live blocks being cut (not historical blocks).
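As a rough illustration, the snippet below shows one way these optional environment variables might be read and defaulted in Kotlin. The actual examples may handle them differently, and the fallback node URI shown is a placeholder rather than the real testnet default.

```kotlin
// Sketch only: reading the optional environment variables described above.
// Defaults mirror the documented behavior where known; the node URI fallback
// is a placeholder, not the example's actual testnet address.
val kafkaTopic: String = System.getenv("KAFKA_TOPIC") ?: "exampleTopic"
val bootstrapServers: String = System.getenv("KAFKA_BOOTSTRAP_SERVERS_CONFIG") ?: "localhost:9092"
val nodeUri: String = System.getenv("NODE_URI") ?: "https://some-testnet-node.example.com" // placeholder
val startHeight: Long? = System.getenv("START_HEIGHT")?.toLongOrNull() // null => stream live blocks only
```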
To start and stop local Kafka nodes in Docker containers for this example, the following commands can be used:

```shell
# Start the Kafka container:
docker compose -f src/main/docker/kafka.yml up -d

# Stop the Kafka container:
docker compose -f src/main/docker/kafka.yml down
```
To run an example, simply start the Gradle application with the desired main class. For example, to start the Kafka consumer example, KafkaConsumerExample, run the following. Make sure to include the `Kt` suffix that the Kotlin compiler auto-generates on the class name.

```shell
./gradlew run -PmainClass=io.provenance.example.KafkaConsumerExampleKt
```
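If you are curious how the `-PmainClass` property can drive which example runs, the fragment below is a minimal sketch of a `build.gradle.kts` using the Gradle application plugin. The examples' actual build script may be organized differently, and the Kotlin plugin version and default class name shown are assumptions.

```kotlin
// build.gradle.kts (sketch): wiring a -PmainClass property into the application plugin.
// The real build script for these examples may differ.
plugins {
    kotlin("jvm") version "1.6.21" // version is an assumption
    application
}

application {
    // Prefer the -PmainClass value when supplied; otherwise fall back to a default example.
    mainClass.set(
        project.findProperty("mainClass")?.toString()
            ?: "io.provenance.example.SimpleEventStreamListenerKt" // assumed default
    )
}
```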
Each example contains various configuration options that can be used to control the node to connect to for the event stream, Kafka connection/topic information, etc.