Website publishing (#133)
* Website publishing

* No need to upgrade version
samchon authored Feb 8, 2025
1 parent 7f7166f commit bb9d913
Showing 7 changed files with 98 additions and 26 deletions.
26 changes: 26 additions & 0 deletions .github/workflows/website.yml
@@ -0,0 +1,26 @@
name: website
on:
push:
branches:
- master
paths:
- 'src/**'
- 'website/**'
- 'package.json'
jobs:
website:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20.x
- name: Install dependencies
run: npm install
- name: Build TypeDoc
run: npm run typedoc && npm run typedoc -- --json website/public/api/openapi.json
- name: Deploy
        uses: JamesIves/github-pages-deploy-action@v4
with:
branch: gh-pages
folder: ./website/public
1 change: 1 addition & 0 deletions .gitignore
@@ -3,6 +3,7 @@ examples/converted/
lib/
models/
node_modules/
website/public/api/

package-lock.json
pnpm-lock.yaml
48 changes: 24 additions & 24 deletions README.md
@@ -20,7 +20,7 @@ flowchart
[![npm version](https://img.shields.io/npm/v/@samchon/openapi.svg)](https://www.npmjs.com/package/@samchon/openapi)
[![Downloads](https://img.shields.io/npm/dm/@samchon/openapi.svg)](https://www.npmjs.com/package/@samchon/openapi)
[![Build Status](https://github.com/samchon/openapi/workflows/build/badge.svg)](https://github.com/samchon/openapi/actions?query=workflow%3Abuild)
-[![API Documents](https://img.shields.io/badge/API-Documents-forestgreen)](https://nestia.io/api/modules/_samchon_openapi.html)
+[![API Documents](https://img.shields.io/badge/API-Documents-forestgreen)](https://samchon.github.io/api/)
[![Discord Badge](https://img.shields.io/badge/discord-samchon-d91965?style=flat&labelColor=5866f2&logo=discord&logoColor=white&link=https://discord.gg/E94XhzrUCZ)](https://discord.gg/E94XhzrUCZ)

OpenAPI definitions, converters and LLM function calling application composer.
@@ -32,19 +32,19 @@ OpenAPI definitions, converters and LLM function calling application composer.
3. [OpenAPI v3.1](https://github.com/samchon/openapi/blob/master/src/OpenApiV3_1.ts)
4. [**OpenAPI v3.1 emended**](https://github.com/samchon/openapi/blob/master/src/OpenApi.ts)

-`@samchon/openapi` also provides an LLM (Large Language Model) function calling application composer built from the OpenAPI document, with many strategies. With the [`HttpLlm`](https://nestia.io/api/modules/_samchon_openapi.HttpLlm.html) module, you can perform LLM function calling extremely easily, just by delivering the OpenAPI (Swagger) document.
+`@samchon/openapi` also provides an LLM (Large Language Model) function calling application composer built from the OpenAPI document, with many strategies. With the [`HttpLlm`](https://samchon.github.io/api/modules/HttpLlm.html) module, you can perform LLM function calling extremely easily, just by delivering the OpenAPI (Swagger) document.

-- [`HttpLlm.application()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.application.html)
-- [`IHttpLlmApplication<Model>`](https://nestia.io/api/interfaces/_samchon_openapi.IHttpLlmApplication-1.html)
-- [`IHttpLlmFunction<Model>`](https://nestia.io/api/interfaces/_samchon_openapi.IHttpLlmFunction-1.html)
+- [`HttpLlm.application()`](https://samchon.github.io/api/functions/HttpLlm.application.html)
+- [`IHttpLlmApplication<Model>`](https://samchon.github.io/api/interfaces/IHttpLlmApplication-1.html)
+- [`IHttpLlmFunction<Model>`](https://samchon.github.io/api/interfaces/IHttpLlmFunction-1.html)
- Supported schemas
-  - [`IChatGptSchema`](https://nestia.io/api/types/_samchon_openapi.IChatGptSchema-1.html): OpenAI ChatGPT
-  - [`IClaudeSchema`](https://nestia.io/api/types/_samchon_openapi.IClaudeSchema-1.html): Anthropic Claude
-  - [`IGeminiSchema`](https://nestia.io/api/types/_samchon_openapi.IGeminiSchema-1.html): Google Gemini
-  - [`ILlamaSchema`](https://nestia.io/api/types/_samchon_openapi.ILlamaSchema-1.html): Meta Llama
+  - [`IChatGptSchema`](https://samchon.github.io/api/types/IChatGptSchema-1.html): OpenAI ChatGPT
+  - [`IClaudeSchema`](https://samchon.github.io/api/types/IClaudeSchema-1.html): Anthropic Claude
+  - [`IGeminiSchema`](https://samchon.github.io/api/types/IGeminiSchema-1.html): Google Gemini
+  - [`ILlamaSchema`](https://samchon.github.io/api/types/ILlamaSchema-1.html): Meta Llama
- Middle layer schemas
-  - [`ILlmSchemaV3`](https://nestia.io/api/types/_samchon_openapi.ILlmSchemaV3-1.html): middle layer based on OpenAPI v3.0 specification
-  - [`ILlmSchemaV3_1`](https://nestia.io/api/types/_samchon_openapi.ILlmSchemaV3_1-1.html): middle layer based on OpenAPI v3.1 specification
+  - [`ILlmSchemaV3`](https://samchon.github.io/api/types/ILlmSchemaV3-1.html): middle layer based on OpenAPI v3.0 specification
+  - [`ILlmSchemaV3_1`](https://samchon.github.io/api/types/ILlmSchemaV3_1-1.html): middle layer based on OpenAPI v3.1 specification

> https://github.com/user-attachments/assets/01604b53-aca4-41cb-91aa-3faf63549ea6
>
@@ -225,21 +225,21 @@ LLM function calling application from OpenAPI document.

`@samchon/openapi` builds an LLM (Large Language Model) function calling application from the "emended OpenAPI v3.1 document". Therefore, if you have any HTTP backend server and have built an OpenAPI document for it, you can easily make an A.I. chatbot application.

-In the A.I. chatbot, the LLM selects a proper function to call remotely from the conversation with the user, and fills in the function's arguments automatically. If you actually execute the function call through the [`HttpLlm.execute()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.execute.html) function, that is the "LLM function call."
+In the A.I. chatbot, the LLM selects a proper function to call remotely from the conversation with the user, and fills in the function's arguments automatically. If you actually execute the function call through the [`HttpLlm.execute()`](https://samchon.github.io/api/functions/HttpLlm.execute.html) function, that is the "LLM function call."

Let's enjoy the fantastic LLM function calling feature very easily with `@samchon/openapi`.
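The mapping described above — each OpenAPI operation becoming one LLM-callable function schema — can be sketched in a few lines of TypeScript. This is a hand-rolled illustration of the idea only, not the `@samchon/openapi` API: the `Operation` shape and the `toToolSchema()` helper are hypothetical, whereas `HttpLlm.application()` performs this conversion (and much more) over a whole document.

```typescript
// Illustrative sketch: how one OpenAPI operation maps onto an LLM
// function-calling schema. All types and helpers here are hypothetical.
interface Operation {
  operationId: string;
  description: string;
  parameters: { name: string; schema: { type: string }; required: boolean }[];
}

interface LlmFunction {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string }>;
    required: string[];
  };
}

function toToolSchema(op: Operation): LlmFunction {
  return {
    name: op.operationId,
    description: op.description,
    parameters: {
      type: "object",
      properties: Object.fromEntries(
        op.parameters.map((p): [string, { type: string }] => [p.name, p.schema]),
      ),
      // Only parameters flagged as required end up in the required list.
      required: op.parameters.filter((p) => p.required).map((p) => p.name),
    },
  };
}

// A hypothetical "list sales" style operation:
const fn = toToolSchema({
  operationId: "listSales",
  description: "List sales with pagination.",
  parameters: [
    { name: "page", schema: { type: "integer" }, required: true },
    { name: "search", schema: { type: "string" }, required: false },
  ],
});
console.log(fn.parameters.required); // [ 'page' ]
```

In practice you would not write this by hand; you would deliver the whole Swagger/OpenAPI document to `HttpLlm.application()` and receive every function at once.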

- Application
-  - [`HttpLlm.application()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.application.html)
-  - [`IHttpLlmApplication`](https://nestia.io/api/interfaces/_samchon_openapi.IHttpLlmApplication-1.html)
-  - [`IHttpLlmFunction`](https://nestia.io/api/interfaces/_samchon_openapi.IHttpLlmFunction-1.html)
+  - [`HttpLlm.application()`](https://samchon.github.io/api/functions/HttpLlm.application.html)
+  - [`IHttpLlmApplication`](https://samchon.github.io/api/interfaces/IHttpLlmApplication-1.html)
+  - [`IHttpLlmFunction`](https://samchon.github.io/api/interfaces/IHttpLlmFunction-1.html)
- Schemas
-  - [`IChatGptSchema`](https://nestia.io/api/types/_samchon_openapi.IChatGptSchema-1.html): OpenAI ChatGPT
-  - [`IClaudeSchema`](https://nestia.io/api/types/_samchon_openapi.IClaudeSchema-1.html): Anthropic Claude
-  - [`IGeminiSchema`](https://nestia.io/api/types/_samchon_openapi.IGeminiSchema-1.html): Google Gemini
-  - [`ILlamaSchema`](https://nestia.io/api/types/_samchon_openapi.ILlamaSchema-1.html): Meta Llama
-  - [`ILlmSchemaV3`](https://nestia.io/api/types/_samchon_openapi.ILlmSchemaV3-1.html): middle layer based on OpenAPI v3.0 specification
-  - [`ILlmSchemaV3_1`](https://nestia.io/api/types/_samchon_openapi.ILlmSchemaV3_1-1.html): middle layer based on OpenAPI v3.1 specification
+  - [`IChatGptSchema`](https://samchon.github.io/api/types/IChatGptSchema-1.html): OpenAI ChatGPT
+  - [`IClaudeSchema`](https://samchon.github.io/api/types/IClaudeSchema-1.html): Anthropic Claude
+  - [`IGeminiSchema`](https://samchon.github.io/api/types/IGeminiSchema-1.html): Google Gemini
+  - [`ILlamaSchema`](https://samchon.github.io/api/types/ILlamaSchema-1.html): Meta Llama
+  - [`ILlmSchemaV3`](https://samchon.github.io/api/types/ILlmSchemaV3-1.html): middle layer based on OpenAPI v3.0 specification
+  - [`ILlmSchemaV3_1`](https://samchon.github.io/api/types/ILlmSchemaV3_1-1.html): middle layer based on OpenAPI v3.1 specification
- Type Checkers
- [`ChatGptTypeChecker`](https://github.com/samchon/openapi/blob/master/src/utils/ChatGptTypeChecker.ts)
- [`ClaudeTypeChecker`](https://github.com/samchon/openapi/blob/master/src/utils/ClaudeTypeChecker.ts)
@@ -250,7 +250,7 @@ Let's enjoy the fantastic LLM function calling feature very easily with `@samcho

> [!NOTE]
>
-> You can also compose [`ILlmApplication`](https://nestia.io/api/interfaces/_samchon_openapi.ILlmApplication-1.html) from a class type with `typia`.
+> You can also compose [`ILlmApplication`](https://samchon.github.io/api/interfaces/ILlmApplication-1.html) from a class type with `typia`.
>
> https://typia.io/docs/llm/application
>
Expand All @@ -275,7 +275,7 @@ Actual function call execution is by yourself.
LLM (Large Language Model) providers like OpenAI select a proper function to call from the conversation with the user and fill in its arguments. However, the function calling feature supported by LLM providers does not perform the function call execution; the actual execution responsibility is on you.
-In `@samchon/openapi`, you can execute the LLM function call with the [`HttpLlm.execute()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.execute.html) (or [`HttpLlm.propagate()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.propagate.html)) function. Here is example code executing the LLM function call through the [`HttpLlm.execute()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.execute.html) function. As you can see, to execute the LLM function call, you have to deliver this information:
+In `@samchon/openapi`, you can execute the LLM function call with the [`HttpLlm.execute()`](https://samchon.github.io/api/functions/HttpLlm.execute.html) (or [`HttpLlm.propagate()`](https://samchon.github.io/api/functions/HttpLlm.propagate.html)) function. Here is example code executing the LLM function call through the [`HttpLlm.execute()`](https://samchon.github.io/api/functions/HttpLlm.execute.html) function. As you can see, to execute the LLM function call, you have to deliver this information:
- Connection info to the HTTP server
- Application of the LLM function calling
@@ -382,7 +382,7 @@ Arguments from both Human and LLM sides.
When composing parameter arguments through LLM (Large Language Model) function calling, there can be cases where some parameters (or nested properties) must be composed not by the LLM but by a human. File uploading, or sensitive information such as secret keys (passwords), are representative examples.
-In that case, you can configure the LLM function calling schemas to exclude such human-side parameters (or nested properties) with the `IHttpLlmApplication.options.separate` property. You then have to merge the human- and LLM-composed parameters into one by calling [`HttpLlm.mergeParameters()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.mergeParameters.html) before executing the LLM function call with the [`HttpLlm.execute()`](https://nestia.io/api/functions/_samchon_openapi.HttpLlm.execute.html) function.
+In that case, you can configure the LLM function calling schemas to exclude such human-side parameters (or nested properties) with the `IHttpLlmApplication.options.separate` property. You then have to merge the human- and LLM-composed parameters into one by calling [`HttpLlm.mergeParameters()`](https://samchon.github.io/api/functions/HttpLlm.mergeParameters.html) before executing the LLM function call with the [`HttpLlm.execute()`](https://samchon.github.io/api/functions/HttpLlm.execute.html) function.
Here is example code that separates the file-uploading feature from the LLM function calling schema, and combines the human- and LLM-composed parameters into one before executing the LLM function call.
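As a self-contained sketch of that merge step: the `mergeParameters()` below is a hypothetical re-implementation of the idea only, not the library function — the real `HttpLlm.mergeParameters()` is schema-driven and follows the separation plan produced by `IHttpLlmApplication.options.separate`.

```typescript
// Illustrative only: merging LLM-composed and human-composed arguments
// into a single object before execution. A hypothetical stand-in for
// what HttpLlm.mergeParameters() does in @samchon/openapi.
type Args = Record<string, unknown>;

function mergeParameters(llm: Args, human: Args): Args {
  const merged: Args = { ...llm };
  for (const [key, value] of Object.entries(human)) {
    const prev = merged[key];
    // Recurse into plain objects so nested human-side properties
    // (e.g. an uploaded file URL inside a content object) are filled in.
    if (
      prev !== null && typeof prev === "object" && !Array.isArray(prev) &&
      value !== null && typeof value === "object" && !Array.isArray(value)
    ) {
      merged[key] = mergeParameters(prev as Args, value as Args);
    } else {
      merged[key] = value; // human side wins for separated fields
    }
  }
  return merged;
}

// The LLM filled the textual fields; a human supplied the uploaded file URL.
const merged = mergeParameters(
  { title: "Hello", content: { body: "World" } },
  { content: { file: "https://example.com/upload.png" } },
);
// merged.content now carries both body (LLM side) and file (human side).
console.log(merged);
```

The design point is simply that neither side alone has a complete argument set, so a deterministic merge must run before `HttpLlm.execute()`.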
7 changes: 5 additions & 2 deletions package.json
@@ -11,7 +11,8 @@
"build:main": "rimraf lib && tsc && rollup -c",
"build:test": "rimraf bin && tsc -p test/tsconfig.json",
"dev": "npm run build:test -- --watch",
-"test": "node bin/test"
+"test": "node bin/test",
+"typedoc": "typedoc --plugin typedoc-github-theme --theme typedoc-github-theme"
},
"keywords": [
"swagger",
@@ -40,7 +41,7 @@
"bugs": {
"url": "https://github.com/samchon/openapi/issues"
},
-"homepage": "https://nestia.io/api/modules/_samchon_openapi.html",
+"homepage": "https://samchon.github.io/openapi/api",
"devDependencies": {
"@anthropic-ai/sdk": "^0.32.1",
"@google/generative-ai": "^0.21.0",
@@ -73,6 +74,8 @@
"ts-node": "^10.9.2",
"ts-patch": "^3.3.0",
"tstl": "^3.0.0",
"typedoc": "^0.27.6",
"typedoc-github-theme": "^0.2.1",
"typescript": "~5.7.2",
"typescript-transform-paths": "^3.5.2",
"typia": "7.6.0",
1 change: 1 addition & 0 deletions typedoc.json
@@ -2,4 +2,5 @@
"$schema": "https://typedoc.org/schema.json",
"tsconfig": "tsconfig.json",
"entryPoints": ["src/index.ts"],
"out": "./website/public/api",
}
Empty file added website/public/.nojekyll
Empty file.
41 changes: 41 additions & 0 deletions website/public/index.html
@@ -0,0 +1,41 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<title>@samchon/openapi</title>
</head>
<body>
<h2>@samchon/openapi</h2>
<hr/>
<h3>
<a href="https://github.com/samchon/openapi">Github Repository</a>
</h3>
<h3>
<a href="./api/">API Documents</a>
</h3>
<br/>

<h2>Demonstration Cases</h2>
<hr/>
<h3>Shopping Chat</h3>
<iframe
src="https://www.youtube.com/embed/m47p4iJ90Ms?si=cvgfckN25GJhjLTB"
title="Shopping A.I. Chatbot built with Nestia"
width="100%"
height="600"
frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin"
allowfullscreen
></iframe>
<br/><br/>
<h3>BBS Chat</h3>
<iframe
src="https://www.youtube.com/embed/pdsplQyok8k?si=geL7DH5CWcC8qlz_"
title="BBS A.I. Chatbot built with Typia"
width="100%"
height="600"
frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin"
allowfullscreen
></iframe>
</body>
</html>
