Possible fix for pathological performance in Django 1.11 (widget rendering) #933
Conversation
I should add, there's possibly a middleground where the …
(The only failing build on Travis CI was the flake8 check.)

Hey @kezabelle, thank you, this looks great! I'm still worrying a little bit about nested dictionaries; the … Is the … Anyway, that's just nitpicking, and if nobody else sees something bad I missed I'll merge this in a few days. Thanks again!
Yeah, I nodded to that issue in the 'theory' part of the original comment. I wouldn't think it's stable, but it seemed "good enough" from my testing, and certainly better in at least some workloads than the current situation.
This is… a super good point that I'd entirely overlooked, because I nearly never use …
Codecov Report
@@ Coverage Diff @@
## master #933 +/- ##
==========================================
+ Coverage 83.25% 83.42% +0.17%
==========================================
Files 31 31
Lines 1660 1671 +11
Branches 246 246
==========================================
+ Hits 1382 1394 +12
Misses 199 199
+ Partials 79 78 -1
Continue to review full report at Codecov.
Pushed a new commit on top. This is actually faster again than the previous one (for me), precisely because the Context appears to get re-used a lot. YMMV because of different versions, etc.

Edit: having thought further on it, using the …

Edit 2: thinking on it further, using … I have one remaining idea to try, using a pair of lists to track the data structures and equality checks, but it may be that the stringifying offers the best trade-off of saving some/many/most …
…et rendering, the number of context values which need prettifying has increased drastically, which can (given enough template contexts to render) cause pages using the template panel to become essentially unresponsive due to the sheer amount of data. Instead, when first seeing a new context layer, keep a copy of it plus the formatted version, and when subsequently seeing the exact same layer (by relying on the implicit `__eq__` calls it is expected the list's `__contains__` will do) re-use the existing formatted version. This cuts the number of `pformat` calls at a minimum by half, and ultimately (far) more than that due to re-use of the same one where possible.
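The equality-based caching the commit message describes might be sketched roughly like this. This is a hedged illustration, not the toolbar's actual code: the names `make_layer_cache`, `seen_layers`, and `format_layer` are hypothetical, and the real panel stores its results differently.

```python
from pprint import pformat


def make_layer_cache():
    """Build a formatter that caches pformat() output per context layer.

    Layers are matched by *equality*, not identity: list.index() walks the
    list and calls each stored layer's __eq__ against the new one, so an
    equal-but-distinct dict re-uses the previously formatted string.
    """
    seen_layers = []  # raw context layers we've already formatted
    formatted = []    # pformat() output, parallel to seen_layers

    def format_layer(layer):
        try:
            # Implicit __eq__ calls happen here, as the commit message notes.
            return formatted[seen_layers.index(layer)]
        except ValueError:
            text = pformat(layer)
            seen_layers.append(layer)
            formatted.append(text)
            return text

    return format_layer
```

With this shape, rendering the same context layer for every template in a page costs one `pformat` plus cheap equality checks, rather than one `pformat` per render.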
Right. I've ditched the … For posterity's sake, the previous attempts remain in a separate branch: …
Thanks for working on this!

Thank you! That's awesome.
See #910 for details.
Essentially, try and reduce the number of calls to `pformat` by storing a string representation of a given context layer and only doing a `pformat` when we haven't seen it before. In practice, for me on Python 2.7, this makes a massive difference.
In theory the difference is somewhere between the best case I'm presenting and the current pathological situation, because a context which itself contains nested un-ordered data structures (sets, dictionaries) probably doesn't guarantee its output when converted to a string representation, but there's not a huge amount that can be done about that, I don't think.
Rendering a user change form in the admin with 50 groups, in the current pathological case, will do 1121 `pformat` calls at the point where it attempts to put the `temp_layer` in the `context_list`, and another 2226 to `pformat` an individual value within a context layer. With the proposed changes, the total number of `pformat` calls becomes 175, which is slightly under the number of templates rendered.
These numbers are only my findings using a single test case, one version of Python, and one operating system (OS X)… so if anyone wants to try this patch out to independently verify whether or not it has a real-world effect elsewhere, that'd be nice.