Build flatcc for the host #10855
Conversation
cc @zingo I see that the arm build scripts set …
Thanks for this change; we have an extra build and copy of that executable that we can now clean up 🙏
### Summary
I came across this "cleaner" approach when setting up flatcc (#10855). So, let's do it for flatc too.

### Test plan
CI

cc @larryliu0820
### Summary
I forgot to do this in #10855. Since we build flatcc for the host now, we no longer need this variable.

### Test plan
CI

```
$ rg EXECUTORCH_SEPARATE_FLATCC_HOST_PROJECT --hidden -g '!.git/'
```

cc @larryliu0820
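Purely as illustration, a hypothetical sketch of the kind of option and guard that becomes dead once flatcc is always built for the host via ExternalProject; the option name comes from the PR above, but the surrounding code is assumed rather than the actual removed code:

```cmake
# Hypothetical sketch of a now-unneeded guard; the real code may have differed.
option(EXECUTORCH_SEPARATE_FLATCC_HOST_PROJECT
       "Build flatcc for the host in a separate CMake project" ON)

if(EXECUTORCH_SEPARATE_FLATCC_HOST_PROJECT)
  # Previously: configure a second, host-only flatcc build here.
endif()
```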
### Summary
There seems to be a race when building `libflatccrt.a`. The issue has existed for a while (#7300) and was temporarily mitigated in #7570 by reducing build parallelism. In this diff I attempt to fix it properly.

This is only my assumption of what is wrong: flatccrt builds a debug variant with a `_d` suffix, and if some targets don't depend on the conditionally-named target, the order in which the library gets built can race. So for now, always use the non-debug version (see the sketch below).

Since it's a race, I was never able to reproduce the issue locally, so I can't guarantee this is the problem. However, my recent changes in #10855 seem to have increased how often it shows up in CI.

### Test plan
CI

cc @larryliu0820
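A minimal CMake sketch of that workaround, assuming hypothetical variable and consumer target names (`FLATCCRT_LIB`, `etdump`); the real wiring lives in the ExecuTorch and flatcc build files:

```cmake
# Sketch: always consume the release flatccrt runtime instead of switching to
# the "_d"-suffixed debug variant, so every consumer depends on the same
# target and the library is built through a single dependency edge.

# Before (assumed): the name depended on the build type, and targets that
# hardcoded "flatccrt" could race against targets using "flatccrt_d".
#   if(CMAKE_BUILD_TYPE STREQUAL "Debug")
#     set(FLATCCRT_LIB flatccrt_d)
#   else()
#     set(FLATCCRT_LIB flatccrt)
#   endif()

# After: one target name for everyone.
set(FLATCCRT_LIB flatccrt)

# Example consumer; the target name is illustrative.
target_link_libraries(etdump PUBLIC ${FLATCCRT_LIB})
```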
### Summary
- Similar to `flatc`, let's build `flatcc_cli` for the host using ExternalProject (see the CMake sketch after this list)
- Move `flatcc` definitions to be under `third-party/`
- Move `etdump` and `bundled_program` definitions to be under their respective folders
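To illustrate the first bullet, here is a minimal sketch of building flatcc with the host toolchain via ExternalProject. The project name, directories, and flatcc options shown are assumptions for the example, not the actual ExecuTorch CMake:

```cmake
# Sketch only: build flatcc's CLI with the host compiler even when the main
# build is cross-compiling. Names and paths are illustrative.
include(ExternalProject)

set(_flatcc_host_dir ${CMAKE_BINARY_DIR}/flatcc-host)

ExternalProject_Add(
  flatcc_host_project
  SOURCE_DIR ${PROJECT_SOURCE_DIR}/third-party/flatcc
  BINARY_DIR ${_flatcc_host_dir}
  # Intentionally not forwarding CMAKE_TOOLCHAIN_FILE, so this sub-build uses
  # the host toolchain and produces a flatcc binary we can actually execute
  # during the cross build.
  CMAKE_ARGS -DCMAKE_BUILD_TYPE=Release
             -DFLATCC_TEST=OFF
  INSTALL_COMMAND ""
)
```

Targets that need generated headers can then depend on `flatcc_host_project` via `add_dependencies()` and invoke the resulting host binary from a custom command.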
### Test plan
CI
cc @larryliu0820