As part of this tutorial, we will see how to transform a document model into a knowledge graph model using Kolle, without writing a single line of code.
The purpose and details of document databases and knowledge graphs are not covered in this tutorial; for background, ask ChatGPT or look them up on Wikipedia.
Domain: Insurance claim
Source data: policy claim datasets (CSV)
Pipeline: flattened model -> data contract -> knowledge graph model
Producer: flattened model
Consumer: knowledge graph model
Steps:
- Import source models from the policy datasets
- Remove duplicate records from the source data
- Run data profiling on the source data (see the pandas sketch after this list)
- Apply a data contract for data quality, i.e. selection, typecasting, enrichment, reference data integration, etc. (see the contract sketch after this list)
  - Good data moves on to the refined model
  - Bad data moves to the refined error model
- Run data profiling on the refined data
- Convert the refined model to an RDF model as the target model (see the rdflib sketch after this list)
- Visualize the data as a knowledge graph
Technology stack:
- CSV files as the source
- Kafka for event streaming, to ingest and process data in real time (see the producer sketch below)
- Stardog as the target knowledge graph store (see the loading sketch below)
- Kolle as the metabase repository and for automation