
Stream support #24

Open

ai opened this issue Apr 16, 2020 · 0 comments

ai (Member) commented Apr 16, 2020

We need to add stream support to protocol processing, as the protocol docs require it.

Pseudocode:

- answers = [get_resend(), get_access(), get_processed()]
- response.send(json: answers)
+ response.write("[")
+ response.write(get_resend())
+ response.write(",")
+ response.write(get_access())
+ response.write(",")
+ response.write(get_processed())
+ response.write("]")

We do not need async support here. The good old sync way is enough.
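A minimal sketch of one way this could look, assuming a Rack-based server; the get_resend, get_access, and get_processed helpers and their payloads below are placeholders for the real protocol steps, not part of this repo. Rack accepts any body object that responds to #each, so each chunk is handed to the client as it is yielded, with no async machinery:

require 'json'

# Body object that streams the answer array piece by piece instead of
# building the whole array in memory and serializing it at the end.
class StreamedAnswers
  def each
    yield '['
    yield get_resend
    yield ','
    yield get_access
    yield ','
    yield get_processed
    yield ']'
  end

  private

  # Placeholder payloads; in the real server these would come from the
  # protocol processing steps named in the pseudocode above.
  def get_resend
    { resend: [] }.to_json
  end

  def get_access
    { access: true }.to_json
  end

  def get_processed
    { processed: true }.to_json
  end
end

# Minimal Rack endpoint returning the streamed JSON array.
app = lambda do |_env|
  [200, { 'content-type' => 'application/json' }, StreamedAnswers.new]
end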
