
Conversation

@rluvaton (Member) commented Oct 27, 2023

Benchmarks

Change _events to be a Map:

commit: 249c2b8
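
To make the idea concrete, here is a minimal, hypothetical sketch (not the actual code from commit 249c2b8) of what a Map-backed listener store could look like, including a snapshot-style `_events` getter for back-compat, which is the kind of "polyfill" discussed later in this thread. The `kEvents` symbol and class name are illustrative only.

```js
const kEvents = Symbol('kEvents'); // illustrative name, not necessarily what the PR uses

class MapBackedEmitterSketch {
  constructor() {
    // Listener storage keyed by event name, in a Map instead of a plain object.
    this[kEvents] = new Map();
  }

  on(name, listener) {
    const existing = this[kEvents].get(name);
    if (existing === undefined) {
      this[kEvents].set(name, listener);             // single listener stored directly
    } else if (typeof existing === 'function') {
      this[kEvents].set(name, [existing, listener]); // promote to an array on the second listener
    } else {
      existing.push(listener);
    }
    return this;
  }

  emit(name, ...args) {
    const handler = this[kEvents].get(name);
    if (handler === undefined) return false;
    if (typeof handler === 'function') {
      handler.apply(this, args);
    } else {
      for (const fn of [...handler]) fn.apply(this, args); // copy guards against mutation mid-emit
    }
    return true;
  }

  // Legacy accessor: builds a snapshot object. Writes to this snapshot do NOT
  // reach the Map, which is the compatibility concern raised in this thread.
  get _events() {
    const snapshot = { __proto__: null };
    for (const [name, handler] of this[kEvents]) snapshot[name] = handler;
    return snapshot;
  }
}

// Usage:
const ee = new MapBackedEmitterSketch();
ee.on('data', (chunk) => console.log('got', chunk));
ee.emit('data', 1); // prints "got 1"
```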

Notes:

  • I'm afraid this can break userland implementations of EventEmitter that use Node's EventEmitter class functions (cc @ronag)
  • This also improves creation by a lot (and therefore improves stream creation)

Benchmark URL

confidence improvement accuracy (*) (**) (***)
events/ee-add-remove.js n=1000000 removeListener=0 newListener=0 *** 19.64 % ±1.57% ±2.09% ±2.73%
events/ee-add-remove.js n=1000000 removeListener=0 newListener=1 *** 46.84 % ±2.02% ±2.70% ±3.54%
events/ee-add-remove.js n=1000000 removeListener=1 newListener=0 *** 45.01 % ±2.68% ±3.60% ±4.75%
events/ee-add-remove.js n=1000000 removeListener=1 newListener=1 *** 56.27 % ±2.72% ±3.64% ±4.79%
events/ee-emit.js listeners=1 argc=0 n=2000000 *** 32.42 % ±9.82% ±13.07% ±17.02%
events/ee-emit.js listeners=1 argc=10 n=2000000 *** 36.00 % ±8.08% ±10.75% ±14.00%
events/ee-emit.js listeners=1 argc=2 n=2000000 *** 23.57 % ±9.57% ±12.75% ±16.62%
events/ee-emit.js listeners=1 argc=4 n=2000000 *** 28.33 % ±8.95% ±11.92% ±15.52%
events/ee-emit.js listeners=10 argc=0 n=2000000 4.48 % ±7.03% ±9.35% ±12.17%
events/ee-emit.js listeners=10 argc=10 n=2000000 0.02 % ±4.28% ±5.70% ±7.42%
events/ee-emit.js listeners=10 argc=2 n=2000000 ** -6.22 % ±4.54% ±6.06% ±7.91%
events/ee-emit.js listeners=10 argc=4 n=2000000 ** 6.55 % ±4.59% ±6.13% ±8.00%
events/ee-emit.js listeners=5 argc=0 n=2000000 1.39 % ±9.30% ±12.37% ±16.11%
events/ee-emit.js listeners=5 argc=10 n=2000000 -0.34 % ±7.02% ±9.35% ±12.18%
events/ee-emit.js listeners=5 argc=2 n=2000000 5.45 % ±8.85% ±11.78% ±15.33%
events/ee-emit.js listeners=5 argc=4 n=2000000 0.79 % ±8.60% ±11.44% ±14.89%
events/ee-listen-unique.js n=1000000 events=1 *** 30.56 % ±2.05% ±2.73% ±3.56%
events/ee-listen-unique.js n=1000000 events=10 *** 13.34 % ±1.21% ±1.62% ±2.11%
events/ee-listen-unique.js n=1000000 events=2 *** 18.32 % ±1.99% ±2.67% ±3.51%
events/ee-listen-unique.js n=1000000 events=20 *** 13.37 % ±1.22% ±1.62% ±2.12%
events/ee-listen-unique.js n=1000000 events=3 *** 19.46 % ±1.15% ±1.53% ±2.00%
events/ee-listen-unique.js n=1000000 events=5 *** 15.05 % ±1.35% ±1.80% ±2.35%
events/ee-listener-count-on-prototype.js n=50000000 *** 29.98 % ±5.74% ±7.70% ±10.14%
events/ee-listeners.js raw='false' listeners=5 n=5000000 -2.69 % ±7.61% ±10.13% ±13.18%
events/ee-listeners.js raw='false' listeners=50 n=5000000 0.48 % ±1.82% ±2.42% ±3.15%
events/ee-listeners.js raw='true' listeners=5 n=5000000 6.34 % ±8.35% ±11.12% ±14.49%
events/ee-listeners.js raw='true' listeners=50 n=5000000 -1.34 % ±4.45% ±5.92% ±7.71%
events/ee-once.js argc=0 n=20000000 *** 31.26 % ±1.10% ±1.47% ±1.94%
events/ee-once.js argc=1 n=20000000 *** 26.62 % ±0.60% ±0.80% ±1.04%
events/ee-once.js argc=4 n=20000000 *** 26.93 % ±0.65% ±0.86% ±1.12%
events/ee-once.js argc=5 n=20000000 *** 27.19 % ±0.76% ±1.02% ±1.32%
events/eventtarget-add-remove.js nListener=10 n=1000000 *** 0.92 % ±0.38% ±0.51% ±0.66%
events/eventtarget-add-remove.js nListener=5 n=1000000 * 0.87 % ±0.69% ±0.91% ±1.19%
events/eventtarget-creation.js n=1000000 0.60 % ±5.04% ±6.70% ±8.73%
events/eventtarget.js listeners=1 n=1000000 0.90 % ±6.28% ±8.35% ±10.87%
events/eventtarget.js listeners=10 n=1000000 0.45 % ±3.51% ±4.68% ±6.09%
events/eventtarget.js listeners=5 n=1000000 0.21 % ±4.80% ±6.39% ±8.31%

Be aware that when doing many comparisons the risk of a false-positive result increases. In this case, there are 37 comparisons, you can thus expect the following amount of false-positive results:
1.85 false positives, when considering a 5% risk acceptance (*, **, ***),
0.37 false positives, when considering a 1% risk acceptance (**, ***),
0.04 false positives, when considering a 0.1% risk acceptance (***)

I ran it on streams as well to see the impact, as this was my original intent: url

Change to Map todos:

  • move all our internal code away from _events and use [kEvents] instead
  • I saw undici uses _events, so that needs to be migrated as well

change _events to be without __proto__: null

Commit: eb29e97

I think the performance without __proto__: null is not fully representative, as the event name can be anything, so dictionary mode is not used here.

Benchmark URL

++ Rscript benchmark/compare.R
confidence improvement accuracy (*) (**) (***)
events/ee-add-remove.js n=1000000 removeListener=0 newListener=0 ** -2.22 % ±1.50% ±2.00% ±2.60%
events/ee-add-remove.js n=1000000 removeListener=0 newListener=1 *** 30.34 % ±1.75% ±2.33% ±3.04%
events/ee-add-remove.js n=1000000 removeListener=1 newListener=0 *** 22.74 % ±1.17% ±1.56% ±2.03%
events/ee-add-remove.js n=1000000 removeListener=1 newListener=1 *** 10.04 % ±1.11% ±1.47% ±1.92%
events/ee-emit.js listeners=1 argc=0 n=2000000 *** 159.54 % ±8.72% ±11.64% ±15.20%
events/ee-emit.js listeners=1 argc=10 n=2000000 *** 139.14 % ±12.44% ±16.59% ±21.66%
events/ee-emit.js listeners=1 argc=2 n=2000000 *** 159.56 % ±13.83% ±18.46% ±24.15%
events/ee-emit.js listeners=1 argc=4 n=2000000 *** 148.52 % ±10.04% ±13.37% ±17.43%
events/ee-emit.js listeners=10 argc=0 n=2000000 *** 13.34 % ±7.48% ±9.96% ±12.96%
events/ee-emit.js listeners=10 argc=10 n=2000000 *** 11.22 % ±5.66% ±7.53% ±9.81%
events/ee-emit.js listeners=10 argc=2 n=2000000 3.11 % ±6.21% ±8.27% ±10.77%
events/ee-emit.js listeners=10 argc=4 n=2000000 *** 12.07 % ±6.04% ±8.04% ±10.47%
events/ee-emit.js listeners=5 argc=0 n=2000000 *** 20.76 % ±9.34% ±12.44% ±16.24%
events/ee-emit.js listeners=5 argc=10 n=2000000 *** 17.29 % ±8.71% ±11.59% ±15.10%
events/ee-emit.js listeners=5 argc=2 n=2000000 * 13.10 % ±9.93% ±13.21% ±17.20%
events/ee-emit.js listeners=5 argc=4 n=2000000 * 9.82 % ±9.51% ±12.66% ±16.48%
events/ee-listen-unique.js n=1000000 events=1 0.24 % ±1.32% ±1.76% ±2.29%
events/ee-listen-unique.js n=1000000 events=10 *** -3.02 % ±1.08% ±1.44% ±1.88%
events/ee-listen-unique.js n=1000000 events=2 *** -2.28 % ±0.69% ±0.92% ±1.20%
events/ee-listen-unique.js n=1000000 events=20 ** -2.16 % ±1.56% ±2.09% ±2.73%
events/ee-listen-unique.js n=1000000 events=3 *** -2.03 % ±0.98% ±1.31% ±1.71%
events/ee-listen-unique.js n=1000000 events=5 *** -4.36 % ±0.89% ±1.19% ±1.55%
events/ee-listener-count-on-prototype.js n=50000000 *** -6.40 % ±3.40% ±4.53% ±5.91%
events/ee-listeners.js raw='false' listeners=5 n=5000000 * -8.30 % ±6.34% ±8.45% ±11.02%
events/ee-listeners.js raw='false' listeners=50 n=5000000 -1.08 % ±1.71% ±2.28% ±2.97%
events/ee-listeners.js raw='true' listeners=5 n=5000000 -3.00 % ±5.69% ±7.58% ±9.89%
events/ee-listeners.js raw='true' listeners=50 n=5000000 3.68 % ±4.65% ±6.18% ±8.05%
events/ee-once.js argc=0 n=20000000 -0.21 % ±0.64% ±0.85% ±1.10%
events/ee-once.js argc=1 n=20000000 0.06 % ±0.44% ±0.59% ±0.76%
events/ee-once.js argc=4 n=20000000 -0.13 % ±0.62% ±0.83% ±1.08%
events/ee-once.js argc=5 n=20000000 0.26 % ±0.69% ±0.91% ±1.19%
events/eventtarget-add-remove.js nListener=10 n=1000000 ** 0.71 % ±0.42% ±0.56% ±0.73%
events/eventtarget-add-remove.js nListener=5 n=1000000 0.30 % ±0.70% ±0.94% ±1.22%
events/eventtarget-creation.js n=1000000 1.95 % ±3.80% ±5.09% ±6.72%
events/eventtarget.js listeners=1 n=1000000 -0.66 % ±5.51% ±7.34% ±9.55%
events/eventtarget.js listeners=10 n=1000000 -0.32 % ±3.15% ±4.19% ±5.46%
events/eventtarget.js listeners=5 n=1000000 -0.39 % ±5.18% ±6.90% ±8.98%

Be aware that when doing many comparisons the risk of a false-positive result increases. In this case, there are 37 comparisons, you can thus expect the following amount of false-positive results:
1.85 false positives, when considering a 5% risk acceptance (*, **, ***),
0.37 false positives, when considering a 1% risk acceptance (**, ***),
0.04 false positives, when considering a 0.1% risk acceptance (***)

@nodejs-github-bot added the events and needs-ci labels on Oct 27, 2023
@rluvaton changed the title from "events: improve creation performance" to "events: improve events performance" on Oct 27, 2023
@rluvaton added the performance label on Oct 27, 2023
@rluvaton added the request-ci label on Oct 27, 2023
@rluvaton requested a review from @ronag on October 27, 2023 11:06
@github-actions bot removed the request-ci label on Oct 27, 2023

@rluvaton marked this pull request as draft on October 27, 2023 11:22
@rluvaton removed the request for review from @ronag on October 27, 2023 11:22
@anonrig added the needs-benchmark-ci and needs-citgm labels on Oct 27, 2023
@ronag (Member) commented Oct 27, 2023

This is probably going to break someone somewhere. That being said, maybe there is a way to minimize it by returning a Proxy from the _events getter that more accurately imitates the original behavior.

Also, I'm surprised this is so much faster. Often the same events are used (e.g. for streams), so I would assume that V8 uses a shape rather than a map behind the scenes, which should be faster than a Map instance. @benjamingr wdyt?
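
For reference, a rough sketch of the Proxy idea, assuming a Map-backed store behind an illustrative `kEvents` symbol (this is not code from the PR): reads and writes on the legacy `_events` object are forwarded to the Map, so userland mutation would still be reflected in the emitter.

```js
const kEvents = Symbol('kEvents'); // illustrative; the real internal key may differ

function makeLegacyEventsProxy(emitter) {
  return new Proxy({ __proto__: null }, {
    get(_target, name) {
      return emitter[kEvents].get(name);
    },
    set(_target, name, value) {
      emitter[kEvents].set(name, value); // writes land in the real store
      return true;
    },
    has(_target, name) {
      return emitter[kEvents].has(name);
    },
    deleteProperty(_target, name) {
      return emitter[kEvents].delete(name);
    },
    ownKeys(_target) {
      return [...emitter[kEvents].keys()]; // event names are strings or symbols
    },
    getOwnPropertyDescriptor(_target, name) {
      if (!emitter[kEvents].has(name)) return undefined;
      return {
        value: emitter[kEvents].get(name),
        writable: true,
        enumerable: true,
        configurable: true,
      };
    },
  });
}
```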

@ronag (Member):

I'm actually not that interested in ee-listen-unique and ee-once.

@rluvaton (Member, Author) commented Oct 27, 2023

> Also, I'm surprised this is so much faster. Often the same events are used (e.g. for streams), so I would assume that V8 uses a shape rather than a map behind the scenes, which should be faster than a Map instance.

{ __proto__: null } hints V8 to use dictionary mode (src/runtime/runtime-literals.cc in V8, found by @benjamingr).
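
As an illustration of that hint (not code from the PR): a null-prototype literal is the form EventEmitter currently uses for `_events`, and V8 creates such literals with dictionary-mode (slow) properties, while a plain literal gets "fast" shaped properties but inherits from Object.prototype.

```js
const dictModeEvents = { __proto__: null }; // created in dictionary (slow) mode
const shapedEvents = {};                    // fast properties, inherits from Object.prototype

dictModeEvents.error = () => {};
shapedEvents.error = () => {};

// The null prototype also avoids collisions with inherited names:
console.log('toString' in dictModeEvents); // false
console.log('toString' in shapedEvents);   // true (inherited)
```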

@rluvaton (Member, Author):

> This is probably going to break someone somewhere. That being said, maybe there is a way to minimize it by returning a Proxy from the _events getter that more accurately imitates the original behavior.

> Often the same events are used (e.g. for streams)

You can see the results when removing __proto__: null at the bottom (collapsed).

@rluvaton (Member, Author):

This is concerning:

confidence improvement accuracy (*) (**) (***)
streams/creation.js kind='duplex' n=50000000 ** 1.61 % ±0.93% ±1.23% ±1.61%
streams/creation.js kind='readable' n=50000000 *** -17.07 % ±0.68% ±0.91% ±1.19%
streams/creation.js kind='transform' n=50000000 *** 24.20 % ±1.46% ±1.95% ±2.55%
streams/creation.js kind='writable' n=50000000 0.54 % ±0.92% ±1.23% ±1.60%
streams/destroy.js kind='duplex' n=1000000 ** -3.85 % ±2.25% ±2.99% ±3.90%
streams/destroy.js kind='readable' n=1000000 -0.90 % ±1.66% ±2.21% ±2.88%
streams/destroy.js kind='transform' n=1000000 -1.38 % ±1.91% ±2.54% ±3.31%
streams/destroy.js kind='writable' n=1000000 -2.12 % ±2.19% ±2.91% ±3.80%
streams/pipe-object-mode.js n=5000000 * 6.22 % ±5.34% ±7.11% ±9.26%
streams/pipe.js n=5000000 *** 7.73 % ±3.87% ±5.15% ±6.70%
streams/readable-async-iterator.js sync='no' n=100000 -2.28 % ±6.37% ±8.48% ±11.05%
streams/readable-async-iterator.js sync='yes' n=100000 -0.80 % ±5.49% ±7.30% ±9.51%
streams/readable-bigread.js n=1000 * 3.49 % ±2.92% ±3.92% ±5.17%
streams/readable-bigunevenread.js n=1000 -0.62 % ±3.22% ±4.33% ±5.72%
streams/readable-boundaryread.js type='buffer' n=2000 *** 11.62 % ±1.13% ±1.50% ±1.95%
streams/readable-boundaryread.js type='string' n=2000 2.67 % ±3.46% ±4.61% ±6.03%
streams/readable-from.js type='array' n=10000000 4.68 % ±5.29% ±7.05% ±9.18%
streams/readable-from.js type='async-generator' n=10000000 *** 2.44 % ±0.90% ±1.20% ±1.56%
streams/readable-from.js type='sync-generator-with-async-values' n=10000000 *** 4.27 % ±0.99% ±1.32% ±1.71%
streams/readable-from.js type='sync-generator-with-sync-values' n=10000000 2.93 % ±4.20% ±5.59% ±7.28%
streams/readable-readall.js n=5000 -1.18 % ±6.06% ±8.06% ±10.50%
streams/readable-unevenread.js n=1000 *** 3.97 % ±2.02% ±2.68% ±3.49%
streams/writable-manywrites.js len=1024 callback='no' writev='no' sync='no' n=100000 *** -3.08 % ±1.04% ±1.38% ±1.79%
streams/writable-manywrites.js len=1024 callback='no' writev='no' sync='yes' n=100000 0.94 % ±1.97% ±2.63% ±3.43%
streams/writable-manywrites.js len=1024 callback='no' writev='yes' sync='no' n=100000 *** -6.76 % ±2.49% ±3.34% ±4.39%
streams/writable-manywrites.js len=1024 callback='no' writev='yes' sync='yes' n=100000 -3.66 % ±4.38% ±5.84% ±7.63%
streams/writable-manywrites.js len=1024 callback='yes' writev='no' sync='no' n=100000 -2.13 % ±3.01% ±4.01% ±5.22%
streams/writable-manywrites.js len=1024 callback='yes' writev='no' sync='yes' n=100000 -1.16 % ±2.15% ±2.86% ±3.72%
streams/writable-manywrites.js len=1024 callback='yes' writev='yes' sync='no' n=100000 *** -4.85 % ±2.67% ±3.56% ±4.66%
streams/writable-manywrites.js len=1024 callback='yes' writev='yes' sync='yes' n=100000 -1.77 % ±3.70% ±4.95% ±6.49%
streams/writable-manywrites.js len=32768 callback='no' writev='no' sync='no' n=100000 0.89 % ±3.28% ±4.39% ±5.76%
streams/writable-manywrites.js len=32768 callback='no' writev='no' sync='yes' n=100000 -1.94 % ±3.64% ±4.85% ±6.31%
streams/writable-manywrites.js len=32768 callback='no' writev='yes' sync='no' n=100000 * -1.35 % ±1.11% ±1.47% ±1.92%
streams/writable-manywrites.js len=32768 callback='no' writev='yes' sync='yes' n=100000 -2.63 % ±3.85% ±5.12% ±6.67%
streams/writable-manywrites.js len=32768 callback='yes' writev='no' sync='no' n=100000 1.63 % ±3.13% ±4.19% ±5.53%
streams/writable-manywrites.js len=32768 callback='yes' writev='no' sync='yes' n=100000 -1.37 % ±2.82% ±3.75% ±4.89%
streams/writable-manywrites.js len=32768 callback='yes' writev='yes' sync='no' n=100000 ** -1.50 % ±1.09% ±1.44% ±1.88%
streams/writable-manywrites.js len=32768 callback='yes' writev='yes' sync='yes' n=100000 -1.05 % ±2.24% ±2.98% ±3.88%

Be aware that when doing many comparisons the risk of a false-positive result increases. In this case, there are 38 comparisons, you can thus expect the following amount of false-positive results:
1.90 false positives, when considering a 5% risk acceptance (*, **, ***),
0.38 false positives, when considering a 1% risk acceptance (**, ***),
0.04 false positives, when considering a 0.1% risk acceptance (***)

@benjamingr (Member) left a comment


This is neat but I don't understand how it's faster since objects in dictionary mode and maps have basically the same performance.

Also the "polyfill" for ._eventsis incorrect since if someone sets a value on it it won't get reflected in the emitter.

@ronag (Member):

> { __proto__: null } hints V8 to use dictionary mode

What happens if you remove that hint? Sorry, it's a bit unclear what your last benchmark results are from.

@rluvaton (Member, Author) commented Oct 27, 2023

Also the "polyfill" for ._eventsis incorrect since if someone sets a value on it it won't get reflected in the emitter.

I know I still have a todo there to convert to Proxy so it support everything

This is neat but I don't understand how it's faster since objects in dictionary mode and maps have basically the same performance.

Maybe because it by default go to the dict mode?

{proto: null } hint v8 to use the dictionary mod

What happens if you remove that hint? Sorry it's a bit unclear what your last benchmark results are from.

the benchmarks are from the linked commit


@ronag (Member):

> change _events to be without __proto__: null

Looks great to me!

@ronag (Member):

Also what about something like the following for streams?

diff --git a/lib/internal/streams/duplex.js b/lib/internal/streams/duplex.js
index 834d875be6..2b3fe64df9 100644
--- a/lib/internal/streams/duplex.js
+++ b/lib/internal/streams/duplex.js
@@ -63,6 +63,21 @@ function Duplex(options) {
   if (!(this instanceof Duplex))
     return new Duplex(options);

+  this._events = {
+    close: undefined,
+    error: undefined,
+    prefinish: undefined,
+    finish: undefined,
+    drain: undefined,
+    data: undefined,
+    end: undefined,
+    pause: undefined,
+    resume: undefined,
+    readable: undefined,
+    pipe: undefined,
+    unpipe: undefined,
+  };
+
   this._readableState = new Readable.ReadableState(options, this, true);
   this._writableState = new Writable.WritableState(options, this, true);
diff --git a/lib/internal/streams/readable.js b/lib/internal/streams/readable.js
index 80798a35dc..83f09194a6 100644
--- a/lib/internal/streams/readable.js
+++ b/lib/internal/streams/readable.js
@@ -316,6 +316,18 @@ function Readable(options) {
   if (!(this instanceof Readable))
     return new Readable(options);

+  this._events = {
+    close: undefined,
+    error: undefined,
+    data: undefined,
+    end: undefined,
+    pause: undefined,
+    resume: undefined,
+    readable: undefined,
+    pipe: undefined,
+    unpipe: undefined,
+  };
+
   this._readableState = new ReadableState(options, this, false);

   if (options) {
diff --git a/lib/internal/streams/writable.js b/lib/internal/streams/writable.js
index 74573033ea..69fb2be7f0 100644
--- a/lib/internal/streams/writable.js
+++ b/lib/internal/streams/writable.js
@@ -382,6 +382,14 @@ function Writable(options) {
   if (!(this instanceof Writable))
     return new Writable(options);

+  this._events = {
+    close: undefined,
+    error: undefined,
+    prefinish: undefined,
+    finish: undefined,
+    drain: undefined,
+  };
+
   this._writableState = new WritableState(options, this, false);

   if (options) {
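
A possible reason this kind of pre-shaping helps (an illustration, not a claim about this exact diff): declaring every expected event key up front gives the `_events` object one stable shape, so adding a listener later overwrites a known key instead of adding a new property.

```js
// Illustration only: a pre-shaped listener store.
const events = {
  close: undefined,
  error: undefined,
  data: undefined,
  end: undefined,
};

events.error = () => {};  // overwrites a declared key: no new property is added
events.somethingElse = 1; // an undeclared key adds a property and changes the object's shape
```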

@rluvaton (Member, Author):

Do you think it's representative? I think the reason someone put __proto__: null there was to be in dictionary mode.

@rluvaton (Member, Author):

> Also what about something like the following for streams? […]

I thought about that, but since I changed to a Map I did not do it.

@benjamingr (Member):

Eh, this makes one use case slower and another one faster (changing to no __proto__: null). Making the EventEmitter events shape specific to common use cases for streams could be worth it, with benchmarks.

@lpinca (Member):

Possible duplicate of #44627 and #39939. See also #11930.

@rluvaton (Member, Author):

Superseded by #50428

@rluvaton deleted the improve-ee-creation branch on October 29, 2023 10:21

Labels

events: Issues and PRs related to the events subsystem / EventEmitter.
needs-benchmark-ci: PR that need a benchmark CI run.
needs-ci: PRs that need a full CI run.
needs-citgm: PRs that need a CITGM CI run.
performance: Issues and PRs related to the performance of Node.js.


7 participants: @rluvaton, @nodejs-github-bot, @ronag, @benjamingr, @lpinca, @aduh95, @anonrig