- About
- How it works
- Message format
- Configure oxauth-server logging
- External properties
- RESTful API
- Install and run activeMQ
- Database schema
- MySQL
- PostgreSQL
- Building and running
- Enable logging
The goal of this app is to centralize all logs in one place and to provide quick access to logging data by exposing a RESTful API for searching with custom conditions. The roots of this project can be traced to the following issue.
This version uses the ActiveMQ messaging server and a PostgreSQL or MySQL database (your choice) to store logging data.
At startup the application tries to connect to ActiveMQ using the following URL: failover:(tcp://localhost:61616)?timeout=5000
(this can be configured from the application properties).
If the connection to the message broker succeeds, the application starts two asynchronous receivers, which read messages from the oauth2.audit.logging
and oxauth.server.logging
queues (the queue names can be modified in application-{profile}.properties
) and store them in the database. It also exposes a discoverable REST API that lets clients read and search through logging messages.
At the same time, the application starts scheduled tasks that delete old messages from the database. The cron expression and the number of days that messages are kept can be configured from the application properties.
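For example, assuming the app relies on Spring Boot's auto-configured ActiveMQ connection factory, the broker URL can be overridden with the standard Spring Boot property (a sketch; place it in the properties file for your profile):

```properties
# Override the default failover broker URL (sketch; standard Spring Boot property)
spring.activemq.broker-url=failover:(tcp://localhost:61616)?timeout=5000
```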
Messages from the oauth2.audit.logging
queue are expected to be JSON strings with the following properties:
{
"ip" : "",
"action" : "",
"timestamp" : 1480935174312,
"macAddress" : "",
"clientId" : "",
"username" : "",
"scope" : "",
"success" : true
}
Messages from the oxauth.server.logging
queue are expected to be org.apache.log4j.spi.LoggingEvent
objects. A JMSQueueAppender can be used to send them.
To configure oxauth-server to send logging messages via JMS, add a JMS appender to log4j2.xml, e.g.:
<JMS name="jmsQueue"
destinationBindingName="dynamicQueues/oxauth.server.logging"
factoryName="org.apache.activemq.jndi.ActiveMQInitialContextFactory"
factoryBindingName="ConnectionFactory"
providerURL="tcp://localhost:61616"
userName="admin"
password="admin">
</JMS>
and add <AppenderRef ref="jmsQueue"/>
to the Root
tag in the log4j2.xml file.
More about JMSAppender can be read here.
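A minimal Loggers section wiring the appender into the Root tag might look like this (a sketch; merge it into your existing log4j2.xml rather than replacing it, and keep your current root level):

```xml
<Loggers>
    <Root level="info">
        <AppenderRef ref="jmsQueue"/>
    </Root>
</Loggers>
```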
Besides other standard Spring Boot properties, the following can also be customized:
- message-consumer.oauth2-audit.destination - defines the ActiveMQ queue name for oauth2 audit logging
- message-consumer.oauth2-audit.days-after-logs-can-be-deleted - defines how many days the oauth2 audit logging data should be kept
- message-consumer.oauth2-audit.cron-for-log-cleaner - defines the cron expression for the oauth2 audit logging data cleaner
- message-consumer.oxauth-server.destination - defines the ActiveMQ queue name for oxauth server logs
- message-consumer.oxauth-server.days-after-logs-can-be-deleted - defines how many days the oxauth server logging data should be kept
- message-consumer.oxauth-server.cron-for-log-cleaner - defines the cron expression for the oxauth server logging data cleaner
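Put together, a properties fragment using these keys might look like the sketch below. The queue names are the documented defaults; the retention and cron values are illustrative assumptions, not the shipped defaults:

```properties
# Queue names the receivers listen on (documented defaults)
message-consumer.oauth2-audit.destination=oauth2.audit.logging
message-consumer.oxauth-server.destination=oxauth.server.logging
# Keep logs for 90 days (illustrative value)
message-consumer.oauth2-audit.days-after-logs-can-be-deleted=90
message-consumer.oxauth-server.days-after-logs-can-be-deleted=90
# Run the cleaners daily at 03:00 (illustrative cron expression)
message-consumer.oauth2-audit.cron-for-log-cleaner=0 0 3 * * *
message-consumer.oxauth-server.cron-for-log-cleaner=0 0 3 * * *
```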
Important notes:
- To search by date, the date format yyyy-MM-dd HH:mm:ss.SSS
MUST be used, e.g.: /api/oauth2-audit-logs/search/query?fromDate=2016-10-03%2015:53:47.509
- All string params are matched exactly, e.g.: /api/oauth2-audit-logs/search/query?ip=10.0.2.2
, except scope
and formattedMessage
, which are searched using a 'like' query, e.g.: scope like concat('%', :scope, '%')
- The default page size for all requests is 20, and the max page size is 100. These properties can be configured in the application-*.properties file.
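Because fromDate contains a space, it must be percent-encoded before it reaches curl. A small shell sketch (host and port are assumptions; adjust to your deployment):

```shell
# Build a search URL with the required date format; the space must be
# percent-encoded as %20 before passing the URL to curl.
BASE='http://localhost:8080/api/oauth2-audit-logs/search/query'
FROM_DATE='2016-10-03 15:53:47.509'
URL="${BASE}?fromDate=${FROM_DATE// /%20}"
echo "$URL"
# Then fetch it, quoting the URL so '&' in longer queries is not
# interpreted by the shell:
# curl "$URL"
```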
- oauth2 audit logs
- Get all logs:
/api/oauth2-audit-logs{?page,size,sort}
- Get single log:
/api/oauth2-audit-logs/{id}
- Search by custom fields:
/api/oauth2-audit-logs/search/query{?ip,clientId,action,username,scope,success,fromDate,toDate,page,size,sort}
- oxauth server logs
- Get all logs:
/api/oxauth-server-logs{?page,size,sort}
- Get single log:
/api/oxauth-server-logs/{id}
- Search by custom fields:
/api/oxauth-server-logs/search/query{?level,loggerName,formattedMessage,fromDate,toDate,page,size,sort}
Examples:
curl 'http://localhost:8080/api/oauth2-audit-logs/search/query?ip=10.0.2.2&username=admin&scope=openid&size=1'
{
"_embedded" : {
"oauth2-audit-logs" : [ {
"ip" : "10.0.2.2",
"action" : "USER_AUTHORIZATION",
"clientId" : "@!7A06.6C73.B7D4.3983!0001!CFEA.2908!0008!13E4.C749",
"username" : "admin",
"scope" : "openid profile email user_name",
"success" : true,
"timestamp" : "2016-10-03T12:53:47.509+0000",
"_links" : {
"self" : {
"href" : "http://localhost:8080/api/oauth2-audit-logs/3335"
},
"oAuth2AuditLoggingEvent" : {
"href" : "http://localhost:8080/api/oauth2-audit-logs/3335"
}
}
} ]
},
"_links" : {
"first" : {
"href" : "http://localhost:8080/api/oauth2-audit-logs/search/query?ip=10.0.2.2&username=admin&scope=openid&page=0&size=1"
},
"self" : {
"href" : "http://localhost:8080/api/oauth2-audit-logs/search/query?ip=10.0.2.2&username=admin&scope=openid&size=1"
},
"next" : {
"href" : "http://localhost:8080/api/oauth2-audit-logs/search/query?ip=10.0.2.2&username=admin&scope=openid&page=1&size=1"
},
"last" : {
"href" : "http://localhost:8080/api/oauth2-audit-logs/search/query?ip=10.0.2.2&username=admin&scope=openid&page=1&size=1"
}
},
"page" : {
"size" : 1,
"totalElements" : 2,
"totalPages" : 2,
"number" : 0
}
}
curl 'http://127.0.0.1:9339/logger/api/oxauth-server-logs?page=3&size=1'
{
"_embedded" : {
"oxauth-server-logs" : [ {
"timestamp" : "2017-01-14T18:48:17.000+0000",
"formattedMessage" : "Start U2F request clean up",
"loggerName" : "org.xdi.oxauth.service.CleanerTimer",
"level" : "DEBUG",
"exceptions" : [ ],
"_links" : {
"self" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs/4"
},
"oXAuthServerLoggingEvent" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs/4"
}
}
} ]
},
"_links" : {
"first" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs?page=0&size=1"
},
"prev" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs?page=2&size=1"
},
"self" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs"
},
"next" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs?page=4&size=1"
},
"last" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs?page=486&size=1"
},
"profile" : {
"href" : "http://127.0.0.1:9339/logger/api/profile/oxauth-server-logs"
},
"search" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs/search"
}
},
"page" : {
"size" : 1,
"totalElements" : 487,
"totalPages" : 487,
"number" : 3
}
}
http://127.0.0.1:9339/logger/api/oxauth-server-logs/4
{
"timestamp" : "2017-01-14T18:48:17.000+0000",
"formattedMessage" : "Start U2F request clean up",
"loggerName" : "org.xdi.oxauth.service.CleanerTimer",
"level" : "DEBUG",
"exceptions" : [ ],
"_links" : {
"self" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs/4"
},
"oXAuthServerLoggingEvent" : {
"href" : "http://127.0.0.1:9339/logger/api/oxauth-server-logs/4"
}
}
}
http://127.0.0.1:9339/logger/api/oauth2-audit-logs/1
{
"ip" : "10.0.2.2",
"action" : "USER_AUTHORIZATION",
"clientId" : "@!00EA.DF1E.31A5.C287!0001!50C2.44A6!0008!DF32.8FD8",
"macAddress" : "08-00-27-36-17-42",
"username" : null,
"scope" : "openid profile email user_name",
"success" : false,
"timestamp" : "2017-01-14T19:17:49.000+0000",
"_links" : {
"self" : {
"href" : "http://127.0.0.1:9339/logger/api/oauth2-audit-logs/1"
},
"oAuth2AuditLoggingEvent" : {
"href" : "http://127.0.0.1:9339/logger/api/oauth2-audit-logs/1"
}
}
}
Schema | Name | Type | Owner |
---|---|---|---|
public | oauth2_audit_logging_event | table | gluu |
public | oxauth_server_logging_event | table | gluu |
public | oxauth_server_logging_event_exception | table | gluu |
Column | Type | Modifiers | Storage | Stats target | Description |
---|---|---|---|---|---|
id | bigint | not null | plain | | |
action | character varying(255) | | extended | | |
client_id | character varying(255) | | extended | | |
ip | character varying(255) | | extended | | |
mac_address | character varying(255) | | extended | | |
scope | character varying(255) | | extended | | |
success | boolean | | plain | | |
timestamp | timestamp without time zone | | plain | | |
username | character varying(255) | | extended | | |
"oauth2_audit_logging_event_pkey" PRIMARY KEY, btree (id)
"oauth2_audit_logging_event_timestamp" btree ("timestamp")
Column | Type | Modifiers | Storage | Stats target | Description |
---|---|---|---|---|---|
id | bigint | not null | plain | | |
formatted_message | text | | extended | | |
level | character varying(255) | | extended | | |
logger_name | character varying(255) | | extended | | |
timestamp | timestamp without time zone | | plain | | |
"oxauth_server_logging_event_pkey" PRIMARY KEY, btree (id)
"oxauth_server_logging_event_timestamp" btree ("timestamp")
TABLE "oxauth_server_logging_event_exception" CONSTRAINT "fktp5p28uolrsx6vlj6annm7255" FOREIGN KEY (oxauth_server_logging_event_id) REFERENCES oxauth_server_logging_event(id)
Column | Type | Modifiers | Storage | Stats target | Description |
---|---|---|---|---|---|
id | bigint | not null | plain | | |
index | integer | | plain | | |
trace_line | text | | extended | | |
oxauth_server_logging_event_id | bigint | | plain | | |
"oxauth_server_logging_event_exception_pkey" PRIMARY KEY, btree (id)
"fktp5p28uolrsx6vlj6annm7255" FOREIGN KEY (oxauth_server_logging_event_id) REFERENCES oxauth_server_logging_event(id)
To create the schema for MySQL, use mysql_schema.sql:
source ${path_to_file}/mysql_schema.sql
To create the schema for PostgreSQL, use postgresql_schema.sql.
Edit postgresql_schema.sql
to change the database owner and PostgreSQL user. Here is an example of how to create the gluu
user and create the database schema:
CREATE USER gluu WITH password 'root';
\i ${path_to_file}/postgresql_schema.sql
Note: update spring.datasource.username
and spring.datasource.password
in application-prod-postgresql.properties
after creating the new PostgreSQL user.
- Download the ActiveMQ zipped tarball to the Unix machine, using either a browser or a tool such as wget, scp, or ftp (see Download -> "The latest stable release")
- Extract the archive, e.g.:
tar -xvzf apache-activemq-x.x.x-bin.tar.gz
- Edit
apache-activemq-x.x.x/bin/env
to specify the location of your Java installation using JAVA_HOME
- Run ActiveMQ, e.g.:
apache-activemq-x.x.x/bin/activemq start
- Optional: navigate to the ActiveMQ console at http://localhost:8161/.
message-consumer supports four profiles: dev
, prod
, prod-mysql
and prod-postgresql
.
- The dev profile includes an in-memory H2 database and also writes logs to a file (by default the filename is message-consumer.log and the file is written to the directory from which the application was launched). The H2 console can be accessed from the /h2-console URI (the JDBC URL is jdbc:h2:mem:gluu_log;DB_CLOSE_DELAY=-1). The log filename and path can be configured by providing --logging.file and --logging.path as additional console params (e.g.: java -jar target/message-consumer-0.0.1-SNAPSHOT.jar --logging.file=message-consumer.log --logging.path=.).
- The prod profile contains dependencies on both the PostgreSQL and MySQL database connectors. To select the required database, pass the --database property on the command line with one of the following values: postgresql or mysql.
- The prod-mysql profile contains a dependency on only the MySQL database connector.
- The prod-postgresql profile contains a dependency on only the PostgreSQL database connector.
Before running this app, make sure that MySQL/PostgreSQL is running, the schema is created, and ActiveMQ is installed and running. Also check the configuration in application-{profile}.properties
and make sure that the connection properties for ActiveMQ and the database are correct.
Here is an example of how to build the project for both PostgreSQL and MySQL:
git clone https://github.com/GluuFederation/message-consumer.git
cd message-consumer/
sudo mvn -Pprod clean package
Running message-consumer with PostgreSQL:
java -jar target/message-consumer-0.0.1-SNAPSHOT.jar --database=postgresql
Running message-consumer with MySQL:
java -jar target/message-consumer-0.0.1-SNAPSHOT.jar --database=mysql
Here is an example of how to build and run the project for MySQL:
git clone https://github.com/GluuFederation/message-consumer.git
cd message-consumer/
sudo mvn -Pprod-mysql clean package
java -jar target/message-consumer-0.0.1-SNAPSHOT.jar
Here is an example of how to build and run the project for PostgreSQL:
git clone https://github.com/GluuFederation/message-consumer.git
cd message-consumer/
sudo mvn -Pprod-postgresql clean package
java -jar target/message-consumer-0.0.1-SNAPSHOT.jar
message-consumer uses Logback as its logging library. To enable logging to a file, run message-consumer with the following argument: --enable-logging=true
.
By default Logback writes logs to the ./logs
folder and archives them in ./logs/archive
. To configure the logging path, override the message-consumer.logger-path
property, e.g.: --message-consumer.logger-path=/Users/testUser/Desktop
. In this case Logback will write logs to the /Users/testUser/Desktop
folder and archives to /Users/testUser/Desktop/archive
:
java -jar target/message-consumer-0.0.1-SNAPSHOT.jar --database=mysql --enable-logging=true --message-consumer.logger-path=/Users/testUser/Desktop