Commit b6e7a3c

added README in the fern ignore (#11)

* added README in the fern ignore
* Update ci.yml

adeelehsan authored Jan 22, 2025
1 parent 3668d6e commit b6e7a3c

Showing 2 changed files with 241 additions and 45 deletions.
7 changes: 1 addition & 6 deletions .github/workflows/ci.yml
@@ -26,11 +26,6 @@ jobs:
       - name: Set up node
         uses: actions/setup-node@v3
 
-      - name: Compile
-        env:
-          APIKEY: ${{ secrets.APIKEY }}
-        run: yarn && yarn test
-
   publish:
     needs: [ compile, test ]
     if: github.event_name == 'push' && contains(github.ref, 'refs/tags/')
@@ -56,4 +51,4 @@ jobs:
             npm publish --access public
           fi
         env:
-          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
+          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
279 changes: 240 additions & 39 deletions README.md
@@ -19,44 +19,251 @@ npm i -s vectara

A full reference for this library is available [here](./reference.md).

## Usage

First, create an SDK client.<br />
You can use either an `apiKey` or OAuth (`clientId` and `clientSecret`) for [authentication](https://docs.vectara.com/docs/console-ui/api-access-overview).

```typescript
import { VectaraClient } from "vectara";

// Create the client using an API key
const client = new VectaraClient({
  apiKey: "YOUR_API_KEY",
});

// Or create the client using OAuth credentials
const oauthClient = new VectaraClient({
  clientId: "YOUR_CLIENT_ID",
  clientSecret: "YOUR_CLIENT_SECRET",
});
```

If you don't already have a corpus, you can create it using the SDK:

```typescript
await client.corpora.create({
  key: "my-corpus-key",
  name: "my-corpus",
});
```
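
Corpus keys are unique, so re-running this call against an existing corpus will fail. Here is a minimal sketch of tolerating that case (nothing here assumes a specific error type, just a generic catch):

```typescript
try {
  await client.corpora.create({ key: "my-corpus-key", name: "my-corpus" });
} catch (err) {
  // The corpus may already exist, or the request may have failed for another
  // reason; decide whether to continue or re-throw based on your use case.
  console.warn("Corpus creation failed:", err);
}
```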

### Add a document to a corpus
You can add documents to a corpus in two formats: [structured](https://docs.vectara.com/docs/learn/select-ideal-indexing-api#structured-document-definition) or [core](https://docs.vectara.com/docs/learn/select-ideal-indexing-api#core-document-definition).<br/> For more information, refer to the [Indexing Guide](https://docs.vectara.com/docs/learn/select-ideal-indexing-api).

Here is an example of adding a structured document:
```typescript
const document: StructuredDocument = {
  id: "my-doc-id",
  type: "structured",
  title: "my document",
  description: "test document",
  sections: [
    {
      id: 1,
      title: "A nice title.",
      text: "I'm a nice document section.",
      metadata: { section: "1.1" },
    },
    {
      id: 2,
      title: "Another nice title.",
      text: "I'm another document section on something else.",
      metadata: { section: "1.2" },
    },
  ],
  metadata: { url: "https://example.com" },
};

const response = await client.documents.create("my-corpus-key", { body: document });
```
And here is one with a core document:

```typescript
const document: CoreDocument = {
  id: "my-doc-id",
  type: "core",
  documentParts: [
    {
      text: "I am part of a document.",
    },
  ],
};

const response = await client.documents.create("my-corpus-key", { body: document });
```

### Upload a file to the corpus
In addition to creating a document as shown above (using `StructuredDocument` or `CoreDocument`), you can also upload files (such as PDFs or Word documents) directly to Vectara.
In this case, Vectara parses the files automatically, extracts text and metadata, chunks them, and adds them to the corpus.

Using the SDK, you provide the binary content of the file, the corpus key, and the file name, as follows:

```typescript
import fs from "fs";

const filename = "examples.pdf";
const fileStream = fs.createReadStream(filename);
const response = await client.upload.file(fileStream, "test-upload", { filename: "test-upload.pdf" });
```
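
If you also want to attach document-level metadata at upload time, the options object in the call above may accept a `metadata` field as well; treat that field name as an assumption and confirm it against `reference.md`:

```typescript
import fs from "fs";

// `metadata` as an upload option is assumed here; the rest mirrors the call above.
const stream = fs.createReadStream("examples.pdf");
const uploadResponse = await client.upload.file(stream, "test-upload", {
  filename: "test-upload.pdf",
  metadata: { source: "examples" },
});
```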

### Querying the corpora
With the SDK it's easy to run a query against one or more corpora. For more detailed information, see the [Query API guide](https://docs.vectara.com/docs/api-reference/search-apis/search).

A query uses two important objects:
* The `SearchCorporaParameters` object defines search parameters such as hybrid search, metadata filtering, and reranking.
* The `GenerationParameters` object defines parameters for the generative step.

Here is an example query:

```typescript
const searchParams: SearchCorporaParameters = {
  corpora: [
    {
      corpusKey: "test-search-1",
      metadataFilter: "",
      lexicalInterpolation: 0.05,
    },
    {
      corpusKey: "test-search-2",
      metadataFilter: "",
      lexicalInterpolation: 0.05,
    },
  ],
  contextConfiguration: {
    sentencesBefore: 2,
    sentencesAfter: 2,
  },
  reranker: {
    type: "customer_reranker",
    rerankerId: "rnk_272725719",
  },
};

const generationParams: GenerationParameters = {
  // LLM used for processing. For more details, see
  // https://docs.vectara.com/docs/learn/grounded-generation/select-a-summarizer
  generationPresetName: "vectara-summary-ext-v1.2.0",
  responseLanguage: "eng",
  citations: {
    style: "none",
  },
  enableFactualConsistencyScore: true,
};

const response = await client.query({
  query: "what is vectara?",
  search: searchParams,
  generation: generationParams,
});
```
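
The response bundles the generated summary together with the matching search results. Here is a short sketch of consuming it; the field names (`summary`, `searchResults`, and the per-result `text`/`score`) are assumptions based on the API response shape, so check `reference.md` for the exact types:

```typescript
// Field names below are assumed from the query API response shape.
console.log(response.summary);

for (const result of response.searchResults ?? []) {
  console.log(result.score, result.text);
}
```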

### Using Chat

Vectara [chat](https://docs.vectara.com/docs/api-reference/chat-apis/chat-apis-overview) provides a way to automatically store chat history to support multi-turn conversations.

Here is an example of how to start a chat with the SDK:

```typescript
const searchParams: SearchCorporaParameters = {
  corpora: [
    {
      corpusKey: "test-chat",
      metadataFilter: "",
      lexicalInterpolation: 1,
    },
  ],
  contextConfiguration: {
    sentencesBefore: 2,
    sentencesAfter: 2,
  },
  reranker: {
    type: "customer_reranker",
    rerankerId: "rnk_272725719",
  },
};

const generationParams: GenerationParameters = {
  responseLanguage: "eng",
  citations: {
    style: "none",
  },
  enableFactualConsistencyScore: false,
};

const chatParams: ChatParameters = { store: true };
const requestOptions: RequestOptions = { timeoutInSeconds: 100 };

const session = await client.createChatSession(
  searchParams,
  generationParams,
  chatParams,
  requestOptions
);

const response1 = await session.chat("what is vectara?");
const response2 = await session.chat("is vectara a vector database?");
```
Note that we called `createChatSession` with `ChatParameters` (`store: true`) so that chat history is stored. The resulting session can then be used for turn-by-turn chat, simply by calling the session's `chat()` method.
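
Each turn resolves to a chat response. As a sketch (the `answer`, `chatId`, and `turnId` field names are assumptions based on the chat API and the streaming events shown below), you can read the generated answer and the identifiers that tie the turns together:

```typescript
// `answer`, `chatId`, and `turnId` are assumed field names; see reference.md.
console.log(response1.answer);
console.log(response2.answer);
console.log(response2.chatId, response2.turnId);
```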

### Streaming

The SDK supports streaming responses for both query and chat. When streaming, the response is an async iterable of events that you can consume with `for await`.

Here's an example of streaming the query response with `queryStream`:
```typescript
const searchParams: SearchCorporaParameters = {...};
const generationParams: GenerationParameters = {...};

const responseStream = await client.queryStream({
  query: "what is vectara?",
  search: searchParams,
  generation: generationParams,
});

for await (const event of responseStream) {
  if (event.type === "generation_chunk") {
    console.log(event.generationChunk);
  }
  if (event.type === "search_results") {
    console.log(event.searchResults);
  }
}
```
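
If you want the full generated answer as a single string rather than logging each piece, you can accumulate the `generation_chunk` events while iterating (assuming `generationChunk` is the text fragment, as the logging example above suggests):

```typescript
const stream = await client.queryStream({
  query: "what is vectara?",
  search: searchParams,
  generation: generationParams,
});

// Concatenate the streamed text fragments into one answer string.
let answer = "";
for await (const event of stream) {
  if (event.type === "generation_chunk") {
    answer += event.generationChunk;
  }
}
console.log(answer);
```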

And stream with chat:

```typescript
const searchParams: SearchCorporaParameters = {...};
const generationParams: GenerationParameters = {...};
const chatParams: ChatParameters = { store: true };
const requestOptions: RequestOptions = { timeoutInSeconds: 100 };

const session = await client.createChatSession(
  searchParams,
  generationParams,
  chatParams,
  requestOptions
);

const responseStream = await session.chatStream("Tell me about machine learning");

for await (const event of responseStream) {
  // ChatInfo events contain metadata about the chat session:
  // - chatId: unique identifier for the chat
  // - turnId: identifier for the current turn in the conversation
  if (event.type === "chat_info") {
    console.log(event.chatId);
    console.log(event.turnId);
  }
  // SearchResults events contain the relevant documents:
  // matched text segments, their relevance scores, and metadata.
  if (event.type === "search_results") {
    console.log(event.searchResults);
  }
  // GenerationChunk events contain pieces of the generated response.
  if (event.type === "generation_chunk") {
    console.log(event.generationChunk);
  }
}
```

## Request And Response Types

The SDK exports all request and response types as TypeScript interfaces. Simply import them with the
@@ -113,21 +320,6 @@ while (page.hasNextPage()) {
}
```
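
Returning to the request and response types mentioned above: the parameter interfaces used throughout this README can be imported alongside the client. The exact export surface is generated, so treat the import below as a sketch and confirm the names in `reference.md`:

```typescript
// Assumes these interfaces are exported from the package root, as the
// type annotations in the examples above imply.
import { VectaraClient, SearchCorporaParameters, GenerationParameters } from "vectara";

const search: SearchCorporaParameters = {
  corpora: [{ corpusKey: "my-corpus-key", lexicalInterpolation: 0.05 }],
};

const generation: GenerationParameters = {
  responseLanguage: "eng",
};
```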

## Advanced

### Additional Headers
@@ -210,12 +402,21 @@ const client = new VectaraClient({
});
```
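
For the Additional Headers section above, a minimal sketch: Fern-generated clients typically accept per-request options as a second argument, and a `headers` field there is an assumption to verify against the generated `RequestOptions` type:

```typescript
// `headers` inside the request options is assumed; timeoutInSeconds mirrors
// the RequestOptions usage shown earlier in this README.
const response = await client.query(
  {
    query: "what is vectara?",
    search: searchParams,
    generation: generationParams,
  },
  {
    timeoutInSeconds: 30,
    headers: { "X-Request-Id": "my-trace-id" },
  }
);
```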

## Author

👤 **Vectara**

- Website: https://vectara.com
- Twitter: [@vectara](https://twitter.com/vectara)
- GitHub: [@vectara](https://github.com/vectara)
- LinkedIn: [@vectara](https://www.linkedin.com/company/vectara/)
- Discord: [@vectara](https://discord.gg/GFb8gMz6UH)

## 🤝 Contributing

Contributions, issues and feature requests are welcome!<br/>
Feel free to check [issues page](https://github.com/vectara/python-sdk/issues). You can also take a look at the [contributing guide](./CONTRIBUTING).

While we value open-source contributions to this SDK, this library is generated programmatically.
Additions made directly to this library would have to be moved over to our generation code,
otherwise they would be overwritten upon the next generated release. Feel free to open a PR as
a proof of concept, but know that we will not be able to merge it as-is. We suggest opening
an issue first to discuss with us!
On the other hand, contributions to the README are always very welcome!

## Show your support

Give a ⭐️ if this project helped you!
