[gql_code_builder] optimize slow code generation to reduce build time to ~10% of previous #413
Conversation
Thanks for your PR. I will take a deeper look on the weekend. IMO a breaking change on this is acceptable for this improvement. If possible without much overhead, I would like to make this configurable though, so users have some time to migrate.
@@ -19,6 +20,9 @@ Library buildDataLibrary(
     generateMaybeWhenExtensionMethod: false,
   ),
 ]) {
+  final fragmentMap = _fragmentMap(docSource);
+  final dataClassAliasMap = _dataClassAliasMap(docSource, fragmentMap);
Okay, to make this configurable, we can simply update this line, so a new gql_build option needs to be added for backward compatibility, e.g.
final dataClassAliasMap = featureEnable ? _dataClassAliasMap(docSource, fragmentMap) : <String, Reference>{};
Looks good to me. Thanks so much for this contribution. I also tried it on the codebase I'm working on and it seems to work fine (although I don't see much improvement in build times, as I don't have many nested, shared fragment spreads). In order to ship this, please add a configuration option. I suggest adding a simple config class, like the one for the when() methods for the inline fragment spreads, in order to have the option to add more parameters later without changing the signature. The default value when the config variable is not passed should be the current behaviour that emits new classes for every instance of the fragment, in order to not break existing users. From that, I can take the steps to ship this.
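For concreteness, a minimal sketch of the kind of config class being suggested here, assuming a hypothetical `DataClassConfig` name and a `reuseFragments` flag (the names and signature are illustrative assumptions, not necessarily the actual gql_code_builder API):

```dart
/// Hypothetical sketch of a builder config object; the class and parameter
/// names are illustrative, not the actual gql_code_builder API.
class DataClassConfig {
  const DataClassConfig({this.reuseFragments = false});

  /// When true, a selection set that consists of a single fragment spread is
  /// aliased to the fragment's data class instead of getting a class of its
  /// own. Defaults to false so existing users keep the current behaviour.
  final bool reuseFragments;
}
```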
Force-pushed from c7fc27f to 4f55b56.
Hi @knaeckeKami, I found a hidden bug and fixed it. I also added the configuration signature as you explained. P.S. Quick question: is there any way to deliver options to …?
ferry does not use gql_build anymore, as this led to configuration overhead and user confusion. But we should still maintain gql_build, as other projects depend on it. So this change also has to be integrated into ferry_generator.
… with a single inline fragment spread as selection (#530)

* feat(ferry_generator): add data_class_config: reuse_fragments option in order to re-use selection sets with only a single inline fragment spread. See gql-dart/gql#413 for details. In order to opt in to this (breaking) change, add

```yaml
data_class_config:
  reuse_fragments: true
```

to the config for your graphql_builder in your build.yaml.
Background
Hi team, and especially @knaeckeKami,
There is an old issue related to this library, gql-dart/ferry#143, first reported in January 2021.
I made this PR to address that issue again and to share a working draft of a possible solution.
There can be many aspects that make a build slow, but here I focused on how `gql_code_builder` generates data models. Code generation for GraphQL type classes and built_value classes gets dramatically slower as the schema and queries grow larger. The main reasons I found are illustrated in the example below.
Example
Assume we have the query and fragments above (contrived as they may seem).

- `GUserFragmentData_location` can be aliased to `GLocationFragmentData`, and the same goes for the moderation field.
- `GUserFragmentData` can be used for `GTestQueryData_currentUser`.
- `UserFragment` already includes both `location {lat, lng}` and `moderation {ratingPlaceholder}`, so here `GUserFragmentData` can be used for `GTestQueryData_currentUser2` too.

Because of this graph-like nature of GraphQL, we can reuse a lot of our model code. However, the current code generator does not apply this reuse logic, so recursively nested fragments generate a huge amount of similar code.
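To make the duplication concrete: judging from the generated class names above (an assumption, since the original query snippet is not shown here), `TestQuery` selects `currentUser` and `currentUser2`, both spreading `UserFragment`, which in turn covers `location {lat, lng}` and `moderation {ratingPlaceholder}`. An illustrative sketch of what the generator emits before and after this change, with assumed field types and without the built_value boilerplate, could look like this:

```dart
// Illustrative sketch only; real generator output uses built_value and is
// more verbose. Field types (double) are assumptions.

// BEFORE: every fragment spread gets its own nested class, even though the
// selection is identical to the fragment's own data class.
abstract class GLocationFragmentData {
  double get lat;
  double get lng;
}

abstract class GUserFragmentData_location {
  // Same selection as GLocationFragmentData, generated again.
  double get lat;
  double get lng;
}

// AFTER: the nested class is aliased to the existing fragment data class, so
// GUserFragmentData_location is no longer emitted and the getter is typed as
// the fragment's own class.
abstract class GUserFragmentData {
  GLocationFragmentData get location;
}
```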
Changes
I created this draft solution, which performs the optimizations for reusable fragment models described in the example above, including support for InlineFragmentNode and the when() extension methods, so it covers all models generated by the current builder.
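As a rough illustration of the alias-map idea from the diff above (the map contents here are assumptions for the example, not the actual implementation), the builder can map would-be duplicate class names to `Reference`s pointing at existing fragment data classes and then skip emitting the aliased classes:

```dart
import 'package:code_builder/code_builder.dart';

// Conceptual sketch: class names that would otherwise be generated are mapped
// to a Reference pointing at a structurally identical fragment data class.
// The generator consults this map and skips emitting the aliased classes.
final dataClassAliasMap = <String, Reference>{
  'GUserFragmentData_location': refer('GLocationFragmentData'),
  'GTestQueryData_currentUser': refer('GUserFragmentData'),
};
```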
I said it is a draft because if a builder with this patch generates code against an existing schema, it will not generate many of the models that were previously generated, and it will also change the type signatures of the reusable getters in many of the former models. This could be a breaking change for users, so for now, I look forward to discussing this approach further.
No regression tests have been written for this patch yet, but I am using the patched builder with my own product and have found no issues so far.
As you can see, there were breaking changes as mentioned above, so I had to fix a few type signatures and model names.
FYI: below are the full errors after using this patched build.
Note that this app is in production with a fair number of features.
Comparison
Here's a quick summary of the results with my own product:
The comparison covers only the `*.data.g.dart` and `*.data.dart` files, which are the ones this patch effectively applies to. I think the results will vary quite a bit depending on the structure of the schema and the complexity of the queries. In my case, fragments are nested up to 7 levels deep and are used pretty much everywhere.

*My product has about 2k lines of `*.graphql` code.
BEFORE
A build took about 20 minutes and generated about 552k lines of "data" Dart code; the total number of characters in the generated code is about 22M.
AFTER
With this patch, a build took about 2 minutes and generated about 68k lines of "data" Dart code; the total number of characters in the generated code is about 2.4M.
Diff
Please also refer to the broken tests below as an example of the effective changes.
Note
Thank you for reviewing it.
I am looking forward to any thoughts on this approach.