
add async req/res wrapper in sapper #4

Open
miketang84 opened this issue Jun 16, 2016 · 9 comments

Comments

@miketang84
Owner

add async req/res wrapper in sapper

@miketang84
Owner Author

Waiting for the async/await feature in Rust.

@Xudong-Huang

Xudong-Huang commented Jan 16, 2018

Did you benchmark the current implementation? I'd like to see the data recorded in this issue so we can track it over time. I have a plan to support async I/O based on MAY, but first we need to use the http crate as the interface instead of hyper, and the new HTTP server engine should use the http crate for compatibility.

This would impact the project a lot, so we should discuss some plans for it.

In the meantime, you can evaluate may_minihttp.

@miketang84
Owner Author

miketang84 commented Jan 16, 2018

Do you think hyper (0.10.13) is slow? Alright.
Sapper will get a branch based on may; any good ideas are welcome!

Or we could call it sapper_may.

@Xudong-Huang

Can you create a branch for may_http
and test it on https://github.com/Xudong-Huang/sapper/tree/may?

Once you create the branch, I can open a PR for it.

may_http is usable now, and it is faster than the futures-based hyper. You can test it out.

But the Sapper design is not fully optimized: you don't need to cache the body in an extra Vec buffer; you can write it directly to the response instead. Maybe we can address that in a separate issue.

@Xudong-Huang

Xudong-Huang commented Jan 29, 2018

Test results for examples/tiny.

The may_http version with 2 I/O threads:

$ wrk http://127.0.0.1:1337/foo/ -d 30 -t 2 -c 200
Running 30s test @ http://127.0.0.1:1337/foo/
  2 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.86ms    1.59ms  36.35ms   97.31%
    Req/Sec    58.39k     4.12k   90.16k    82.17%
  3489710 requests in 30.09s, 336.13MB read
Requests/sec: 115974.88
Transfer/sec:     11.17MB
wrk http://127.0.0.1:1337/foo/ -d 30 -t 2 -c 200  12.30s user 31.38s system 145% cpu 30.095 total

The hyper master branch:

$ wrk http://127.0.0.1:1337/foo/ -d 30 -t 2 -c 200
Running 30s test @ http://127.0.0.1:1337/foo/
  2 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    72.68us   58.95us  13.76ms   99.20%
    Req/Sec    50.24k     6.26k   72.91k    88.00%
  1499423 requests in 30.09s, 124.41MB read
Requests/sec:  49824.83
Transfer/sec:      4.13MB
wrk http://127.0.0.1:1337/foo/ -d 30 -t 2 -c 200  3.29s user 15.76s system 63% cpu 30.101 total 

@miketang84
Owner Author

Wow, nice work!

@miketang84
Owner Author

Hi, I have created a new branch, may; you can open a PR now. Thanks.

@danclive

danclive commented Feb 6, 2018

@Xudong-Huang You only tested hello world, which doesn't tell us much. Also, your implementation of the HTTP protocol is not comprehensive.

@Xudong-Huang

Thanks @mitum. This is just an experimental branch. I am not an HTTP expert, so I welcome any suggestions about the project. What do you think I could improve, and are there any failure details I can help fix?
