http: introduce new http Client #34082
Conversation
I started working on this. It is intertwined with a lot of other questions, mainly: how many HTTP/1.1 clients should we have?
cc @nodejs/tsc
There's a lot here. @ronag, would you be willing to jump on a call next week to discuss in depth?
If reviewing the code in detail, please look at https://github.com/mcollina/undici instead of this PR. I'm not keeping the code in this PR up to date at the moment; I'm waiting for feedback and a plan before doing that.
Some updated benchmark results based on recent updates:

Machine: 2.8 GHz AMD EPYC 7402P
Configuration: Node v14.4, HTTP/1.1 without TLS, 100 connections, Linux 5.4.12-1-lts

http - keepalive - pipe x 5,768 ops/sec ±4.17% (71 runs sampled)
undici - pipeline - pipe x 7,151 ops/sec ±2.59% (80 runs sampled)
undici - request - pipe x 11,618 ops/sec ±4.43% (72 runs sampled)
undici - stream - pipe x 12,592 ops/sec ±1.03% (81 runs sampled)
Not sure how the call gets scheduled, but I'm interested in participating when it does happen.
I ran out of time this week to get it scheduled. I'll get something into the calendar next week and will post a comment here on the details! |
Why is the Node.js http client slower than undici?
undici now has pretty much feature parity with Node core. The only thing missing (as far as I'm aware), and currently a won't-fix, is Expect: 100-continue support.
Slightly improved perf, and also a new and faster low-level API:

http - keepalive x 5,634 ops/sec ±2.53% (274 runs sampled)
undici - pipeline x 8,642 ops/sec ±3.08% (276 runs sampled)
undici - request x 12,681 ops/sec ±0.51% (279 runs sampled)
undici - stream x 14,006 ops/sec ±0.53% (280 runs sampled)
undici - dispatch x 15,002 ops/sec ±0.39% (278 runs sampled)

Up to 2.67x faster at the moment. EDIT: Updated 18/8.
With 2.0 + updated benchmarks which make use of pipelining + unix sockets, we are now almost up to 5x faster:

Node 14:

http - keepalive x 6,770 ops/sec ±7.70% (75 runs sampled)
undici - pipeline x 10,627 ops/sec ±5.88% (79 runs sampled)
undici - request x 10,902 ops/sec ±1.28% (85 runs sampled)
undici - stream x 20,144 ops/sec ±0.84% (86 runs sampled)
undici - dispatch x 20,295 ops/sec ±1.00% (83 runs sampled)

Node 15:

http - keepalive x 10,337 ops/sec ±6.17% (71 runs sampled)
undici - pipeline x 30,387 ops/sec ±1.37% (80 runs sampled)
undici - request x 40,117 ops/sec ±3.25% (77 runs sampled)
undici - stream x 40,543 ops/sec ±1.30% (80 runs sampled)
undici - dispatch x 50,434 ops/sec ±2.08% (77 runs sampled)

Not sure what we did in Node 15, but it's significantly faster. I would guess it has to do with streams + somewhat synthetic benchmarks.
Does this support all HTTP versions or just 1.1?

Only 1.1.
I think we should get this working with h2 (and maybe h2c) before merging. |
At the moment there are no plans to add h2 support. h3 was recently briefly discussed but that’s still a year or two away and also would need someone to do the implementation. |
    kRetryTimeout: Symbol('retry timeout'),
    kClient: Symbol('client'),
    kEnqueue: Symbol('enqueue'),
    kMaxAbortedPayload: Symbol('max aborted payload')
Can this follow the standard symbol style please, i.e. identifier and description identical?
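For reference, the standard symbol style the reviewer is asking for is the core convention of making the symbol description identical to the identifier, so inspected output matches the name used in code:

```javascript
// Description identical to the identifier (core convention),
// unlike e.g. Symbol('retry timeout') for kRetryTimeout.
const kRetryTimeout = Symbol('kRetryTimeout');
const kClient = Symbol('kClient');

console.log(kRetryTimeout.description); // prints "kRetryTimeout"
console.log(String(kClient));           // prints "Symbol(kClient)"
```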
    } = primordials;

    const { AsyncResource } = require('async_hooks');
    const EE = require('events');
As in the other file:
    - const EE = require('events');
    + const EventEmitter = require('events');
(and elsewhere below as well, for readability – most Node.js core contributors would know what EE stands for, others might not)
    @@ -0,0 +1,798 @@
    'use strict';

    /* eslint-disable no-restricted-syntax */
Just leaving a comment here to make sure this is resolved at some point :)
(Also, the files should probably all go into lib/internal?)
Quite happy to leave undici outside of core. It gives greater flexibility in development and releases. At the moment I don't see much advantage in merging it into core other than visibility; we already have it under the Node.js organization on GitHub. It would maybe be nice to mention it in the HTTP docs and, in the future, doc-deprecate the current client. Maybe publishing it on npm under @nodejs/undici might be interesting. @mcollina Thoughts? Can I close this PR?
Let's close this for now. We might want to publicize undici a little bit more. |
I'm opening this PR mostly as a forum for discussion and expect it will probably end up being closed. Not sure if this is the right approach, so feel free to point me in a better direction if possible.
During the OpenJS conference session yesterday we discussed the future of HTTP, mostly focused on the server side, though the client side was briefly mentioned. In particular, @jasnell pointed out that in HTTP/2 and HTTP/3 the difference between server and client side is much smaller. Though, we all agreed that any plans in this regard are probably a few years away. We basically ended up with two issues (#1, #2) in the web-server-frameworks working group. Again, this mostly focuses on the server side, so I'm a little unsure where client-side discussions should go. Given the mentioned decrease in difference, maybe they should land in the web-server-frameworks WG at some point?
That being said, I have been looking into implementing a simpler and faster HTTP client and ended up working on @mcollina's undici library. It might or might not make sense to include it in Node core, or possibly as some kind of WG package. The idea is that this client should provide a building block for higher-level client implementations.
I think most of us are familiar with the issues with the current http client in Node core and that it can be a challenge to maintain. How do we get past this?
In this PR I have included undici to discuss whether or not it might make sense to include it in core and, if so, what things are missing, should be changed, or need further consideration.
As a quick summary: the API provides `request`, `pipeline`, `stream`, `upgrade` and `connect`. For further details it's easiest to check the README.
Some challenges:

- Missing 1xx support (in particular 101).
- Missing trailers support.

Some TODOs:
Benchmarks:
Checklist
- `make -j4 test` (UNIX), or `vcbuild test` (Windows) passes