bpo-46841: Move the cache for LOAD_GLOBAL inline. #31575
Conversation
markshannon commented Feb 25, 2022 • edited by bedevere-bot
bedevere-bot commented Feb 25, 2022
🤖 New build scheduled with the buildbot fleet by @markshannon for commit 26d0d70 🤖 If you want to schedule another build, you need to add the ":hammer: test-with-buildbots" label again.
markshannon commented Feb 25, 2022
Performance is underwhelming. What is notable is that the number of …
brandtbucher commented Feb 25, 2022
Yeah, I suspect we may not see a ton of payoff until we convert most/all of the caches.
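For context, "inline" here means the cache entries live directly in the code-unit stream right after the `LOAD_GLOBAL` instruction, rather than in a separate side table keyed by instruction offset. A minimal sketch of the idea, with illustrative names and field layout rather than CPython's exact definitions:

```c
#include <stdint.h>

/* Illustrative only: one 16-bit code unit per cache slot. */
typedef uint16_t codeunit_t;

/* Roughly the kind of per-instruction cache LOAD_GLOBAL needs: an
 * adaptive counter, the cached index into the dict's keys, and version
 * tags used to detect that the globals/builtins dicts have not changed
 * since the cache was filled. */
typedef struct {
    codeunit_t counter;
    codeunit_t index;
    codeunit_t module_keys_version[2];  /* 32-bit value split over 2 units */
    codeunit_t builtin_keys_version;
} load_global_cache_sketch;

/* Because the cache sits inline in the instruction stream, the eval
 * loop can reach it with plain pointer arithmetic from the current
 * instruction (and must skip past it afterwards), instead of doing a
 * lookup in a per-code-object cache array. */
```

The trade-off is better locality and one less indirection on the fast path, at the cost that every consumer of the bytecode (compiler, disassembler, interpreter) has to know how many cache units follow each instruction.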
brandtbucher left a comment
A few things:
- The magic number needs to be bumped every time we do this.
- I thought we decided to use `_Py_CODEUNIT` everywhere you're using `uint16_t` now (it may also be worth adding `_Py_TWOCODEUNITS` and `_Py_FOURCODEUNITS` aliases for `uint32_t` and `uintptr_t`, respectively; see the sketch below).
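A sketch of that suggestion; the two- and four-unit aliases are only what the reviewer is proposing here, not existing CPython names, and the widths shown assume a 64-bit `uintptr_t`:

```c
#include <stdint.h>

/* _Py_CODEUNIT was a plain 16-bit typedef at the time (simplified). */
typedef uint16_t _Py_CODEUNIT;

/* Proposed aliases from the review comment -- hypothetical, not real API. */
typedef uint32_t  _Py_TWOCODEUNITS;   /* two 16-bit code units */
typedef uintptr_t _Py_FOURCODEUNITS;  /* four code units on 64-bit builds */

/* Cache fields would then be declared in code-unit terms, e.g.: */
typedef struct {
    _Py_CODEUNIT counter;
    _Py_CODEUNIT index;
} example_cache;
```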
bedevere-bot commented Feb 25, 2022
When you're done making the requested changes, leave the comment: `I have made the requested changes; please review again`.
markshannon commented Feb 28, 2022
👍
Hmm, I'd prefer not. Versions, for example, are 32-bit values, not 2 code units. If the size of the […] Maybe we should give up on the pretense that the size of the code unit can be easily changed, and replace […]
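To make the 32-bit point concrete: when the cache lives in a stream of 16-bit code units, a 32-bit value such as a dict keys version has to be packed into two adjacent entries. CPython has internal helpers along these lines; the sketch below uses illustrative names:

```c
#include <stdint.h>
#include <string.h>

/* Read/write a 32-bit value stored across two 16-bit cache entries.
 * memcpy avoids unaligned-access undefined behaviour, since the cache
 * is only guaranteed to be 2-byte aligned. */
static inline uint32_t
read_u32_sketch(const uint16_t *p)
{
    uint32_t value;
    memcpy(&value, p, sizeof(value));
    return value;
}

static inline void
write_u32_sketch(uint16_t *p, uint32_t value)
{
    memcpy(p, &value, sizeof(value));
}
```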
# Python 3.11a5 3481 (Use inline CACHE instructions)
# Python 3.11a5 3482 (Use inline caching for UNPACK_SEQUENCE)
# Python 3.11a5 3481 (Use inline cache for BINARY_OP)
# Python 3.11a5 3482 (Use inline caching for UNPACK_SEQUENCE and LOAD_GLOBAL)
brandtbucher Feb 28, 2022 • edited
Hm, looks like the magic number wasn't actually changed... bad merge?
I figured we didn't need more than one version number per day.
#31618 has another version number bump.
Ah, I see what you did now. Thanks for clarifying.
Also removes duplication of the `_PyOpcode_InlineCacheEntries` table.
https://bugs.python.org/issue46841
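For readers unfamiliar with the table being deduplicated: it maps each opcode to the number of inline cache entries that follow it, so the compiler, disassembler, and interpreter all agree on how far to skip. A rough sketch with placeholder opcode numbers and values, not the actual `_PyOpcode_InlineCacheEntries` contents:

```c
#include <stdint.h>

/* Placeholder opcode numbers, standing in for the ones in opcode.h. */
enum { LOAD_GLOBAL = 116, UNPACK_SEQUENCE = 92 };

/* One shared per-opcode table of how many 16-bit cache entries follow
 * each instruction; defining it once removes the risk of duplicate
 * copies drifting out of sync. Values here are illustrative. */
static const uint8_t inline_cache_entries[256] = {
    [LOAD_GLOBAL]     = 5,  /* counter, index, module keys version (2), builtins version */
    [UNPACK_SEQUENCE] = 1,  /* counter */
};
```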