update Read me #10
.fernignore
src/Client.ts
src/core/fetcher/Fetcher.ts
tests/integration
src/wrapper
src/index.ts
src/api/resources/auth/client/Client.ts
src/core/fetcher/getRequestBody.ts
contributing.md
Isn't it usually with uppercase: CONTRIBUTING.md?
README.md
citations: {
  style: "none",
},
enableFactualConsistencyScore: false,
Why not have this true in the example?
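For concreteness, a minimal sketch of what that suggestion could look like. The two field names come from the diff above; the enclosing object shape is an assumption, not the SDK's confirmed structure:

```typescript
// Sketch of the suggested README change: enable the factual-consistency score in the
// example so readers see it reflected in the response. Field names are taken from the
// diff above; everything else is assumed for illustration.
const generation = {
  citations: {
    style: "none",
  },
  enableFactualConsistencyScore: true, // suggested: true, so the example demonstrates the score
};
```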
generation: {
};

const generationParams: GenerationParameters = {
In this example, what would be the default prompt/LLM? Perhaps also add that variable to the example to show how it's used, plus a link to the docs for the options to choose from?
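For illustration, a hedged sketch of what adding those variables might look like. Only `GenerationParameters` appears in the diff; the field names below are assumptions, so the README should link to the docs for the real options:

```typescript
// Sketch of the suggested addition: make the prompt/LLM choice explicit in the example.
// promptName, maxUsedSearchResults, and responseLanguage are assumed field names for
// illustration only; check the GenerationParameters type and the docs for actual options.
const generationParams = {
  promptName: "example-prompt-name", // assumed: which prompt/LLM preset to use
  maxUsedSearchResults: 5,           // assumed: how many search results feed generation
  responseLanguage: "eng",           // assumed: language of the generated answer
};
```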
README.md

### Streaming

The SDK supports streaming responses for both query and chat. When using streaming, the response will be a generator that you can iterate.
nit: "... that you can iterate over".
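To show what "iterate over" means in practice, a rough sketch of consuming the stream; the method name `queryStream` and the event shape are placeholders, not the SDK's confirmed API:

```typescript
// Illustration of consuming a streamed query response with for await...of.
// The method name (queryStream) and event fields are assumptions for this sketch.
interface QueryStreamEvent {
  type: string;  // e.g. "generation_chunk"
  text?: string; // partial answer text, when present
}

async function streamQuery(client: { queryStream(q: string): AsyncIterable<QueryStreamEvent> }) {
  for await (const event of client.queryStream("What does the SDK support?")) {
    if (event.text) {
      process.stdout.write(event.text); // print partial answer text as it arrives
    }
  }
}
```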
README.md
}
```

And streaming the chat response:
I would rephrase as "And stream with chat:"
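And, for the chat side, a similar placeholder sketch (again, `chatStream` is an assumed method name, not the SDK's confirmed API):

```typescript
// Illustration of streaming a chat response; chatStream and the event shape are
// assumptions for this sketch.
async function streamChat(client: { chatStream(msg: string): AsyncIterable<{ type: string; text?: string }> }) {
  for await (const event of client.chatStream("Hello!")) {
    if (event.text) {
      process.stdout.write(event.text); // emit the reply text as it streams in
    }
  }
}
```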
generation: generationParams
});

const responseItems = [];
Is there a link in the docs you can refer to that explains the various "types" (chat_info, generation_chunk, etc.)? If not, perhaps explain this a bit.
Also, we call this "chunk" in the loop, but it's not really a chunk anymore (to me, "chunk" suggests a piece of the text). It's more like an "item" or "event"?
I made some comments inline. Approved once addressed.