This project is integrated with BaritoLog: https://github.com/BaritoLog/
It provides a stream processor that processes logs according to the contract (TimberWolf) and publishes the processed logs to a new topic, which is later stored in a database.
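As a rough sketch of the processing step, the snippet below parses an incoming log payload and re-shapes it into the record that would be published to the output topic. The field names (`@timestamp`, `@message`, `app_name`) are illustrative assumptions, not the actual TimberWolf contract:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// timber is a hypothetical shape for an incoming log message;
// the real TimberWolf contract may differ.
type timber struct {
	Timestamp string `json:"@timestamp"`
	Message   string `json:"@message"`
	AppName   string `json:"app_name"`
}

// process turns a raw Kafka message value into the JSON record
// that would be sent to the output topic.
func process(raw []byte) (string, error) {
	var t timber
	if err := json.Unmarshal(raw, &t); err != nil {
		return "", err
	}
	out, err := json.Marshal(map[string]string{
		"app":       t.AppName,
		"timestamp": t.Timestamp,
		"message":   t.Message,
	})
	return string(out), err
}

func main() {
	rec, err := process([]byte(`{"@timestamp":"2018-07-01T00:00:00Z","@message":"hello","app_name":"demo"}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(rec)
	// → {"app":"demo","message":"hello","timestamp":"2018-07-01T00:00:00Z"}
}
```

In the real worker this function would sit between a Kafka consumer and a producer; here it is kept pure so it can be read and tested in isolation.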
- Golang
Using Homebrew:
brew install go
To fetch the project's dependencies, run go get from inside the project directory.
- Kafka
Download Kafka from https://kafka.apache.org/downloads and unzip the archive
- PostgreSQL 10.4
Using Homebrew:
brew install postgresql
To open a PostgreSQL shell:
sudo -u postgres psql
cd $GOPATH/src
git clone git@github.com:go-squads/unclog-worker.git
cd unclog-worker
go install
- Note: To stop a process, press Control+C; don't just close the terminal window.
Run zookeeper (from kafka directory)
./bin/zookeeper-server-start.sh config/zookeeper.properties
Run kafka-cluster (from kafka directory)
./bin/kafka-server-start.sh config/server.properties
Run Unclog Migration (sets up the database)
- Note: Create config.yaml according to your environment, following config.yaml.example
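The sketch below shows the general shape such a file might take; the key names here are purely illustrative guesses, and the authoritative list is in config.yaml.example:

```yaml
# Illustrative only — use config.yaml.example for the real key names
kafka_broker: localhost:9092
consumer_topic: timber
producer_topic: processed
db_host: localhost
db_port: 5432
db_name: unclog
db_user: postgres
db_password: postgres
```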
$GOPATH/bin/unclog-worker migrate
Run Unclog Worker Stream Processor
$GOPATH/bin/unclog-worker sp
Run Unclog Worker Log Count Stream Processor
$GOPATH/bin/unclog-worker splc
If a port is still occupied by a stale process, find and kill it:
lsof -i @localhost:[port]
kill -9 [pid]
To reset Kafka and Zookeeper state, remove their data directories (under /tmp by default):
cd /tmp
rm -rf kafka-logs
rm -rf zookeeper/version-2