
Conversation

@kubkon (Member) commented Dec 15, 2022

This turned out to be quite a large change, for which I apologise, but the vast majority of it is in the MachO linker, so reviewing the updates to the link-tests harness should be manageable. Anyhow, here's a summary of the changes.

### MachO

Fixes #11737

MachO now generates a fully reproducible UUID value between runs. It matches the behaviour of Apple's ld64: when calculating the UUID for a binary, we exclude regions in the final binary that are non-deterministic, i.e. that depend on the presence of debug info stabs in the file. These include:

  • __LINKEDIT segment command header
  • LC_SYMTAB and LC_DYSYMTAB command headers
  • LC_CODE_SIGNATURE command header
  • subsection of the actual symbol table containing the debug info symbol stabs
  • subsection of the actual string table containing the strings of the debug info symbol stabs
  • code signature

I would like to point out that lld does not generate the UUID this way, and thus does not match Apple's behavior: D92736.
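The exclusion scheme above can be sketched as follows. This is a minimal, illustrative Python sketch, not the linker's actual (Zig) implementation; the region offsets and the `uuid_md5_excluding` helper are made up for demonstration:

```python
import hashlib

def uuid_md5_excluding(data: bytes, exclusions: list[tuple[int, int]]) -> bytes:
    """MD5-hash `data`, skipping the given (offset, length) regions.

    In the real linker the excluded regions are the non-deterministic ones
    listed above: symtab/dysymtab/code-signature command headers, the debug
    stab subsections of the symbol and string tables, and the signature.
    """
    md5 = hashlib.md5()
    pos = 0
    for offset, length in sorted(exclusions):
        md5.update(data[pos:offset])  # hash everything up to the gap
        pos = offset + length         # then skip the excluded region
    md5.update(data[pos:])            # hash the tail after the last gap
    return md5.digest()

# Two binaries that differ only inside an excluded region hash identically.
blob_stripped = b"HEADER" + b"\x00" * 8 + b"TEXT"
blob_debug    = b"HEADER" + b"STABSTAB" + b"TEXT"
excl = [(6, 8)]  # hypothetical region holding debug stabs
assert uuid_md5_excluding(blob_stripped, excl) == uuid_md5_excluding(blob_debug, excl)
```

Since only bytes outside the excluded regions feed the hash, adding or stripping debug stabs cannot change the resulting UUID.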

With this, running

```console
$ zig cc -shared testlib.c -o testlib.dylib -O2 -g
$ cp testlib.dylib testlib_g.dylib
$ zig cc -shared testlib.c -o testlib.dylib -O2 -s

$ otool -l testlib.dylib | grep uuid
    uuid D780AB99-65F8-30C7-AC8A-BE1001DE24C4

$ otool -l testlib_g.dylib | grep uuid
    uuid D780AB99-65F8-30C7-AC8A-BE1001DE24C4
```

produces two files with identical UUIDs even though one has debug info while the other doesn't.

While at it, I have thoroughly cleaned up how we track different offsets of interest in the final binary.

One thing I am not happy with is that in order to calculate the UUID we have to do streaming MD5 hashing. I tried reusing the thread pool like we do for SHA256 when code signing; however, because the exclusions leave gaps in the file, that is a non-trivial task. I will think about it some more though.

Finally, a deterministic UUID is only enforced for release builds; debug builds hash a random string, which produces a valid MD5 digest that is not reproducible between runs.
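In either case the digest is stamped with RFC 4122 bits so it reads as a version 3 (MD5-based) UUID; the "3" in "30C7" in the otool output above is that version nibble. A small Python sketch of the bit-stamping (illustrative only; `md5_to_uuid` is a hypothetical helper, not a function from this PR):

```python
import hashlib
import uuid

def md5_to_uuid(digest: bytes) -> uuid.UUID:
    """Stamp RFC 4122 version and variant bits onto a 16-byte MD5 digest."""
    b = bytearray(digest)
    b[6] = (b[6] & 0x0F) | 0x30  # version 3 (MD5) in the high nibble of byte 6
    b[8] = (b[8] & 0x3F) | 0x80  # variant bits 10xxxxxx in byte 8
    return uuid.UUID(bytes=bytes(b))

u = md5_to_uuid(hashlib.md5(b"example binary contents").digest())
# The third group of the rendered UUID always starts with '3',
# matching the 30C7 group in the example output above.
```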

### Link tests
My previous solution for selectively grepping and extracting variables from object dumps was unnecessarily complicated, so I took this opportunity to rewrite it a little (and simplify it a lot!). It is now based around creating a tree-like structure of steps which are then resolved in a breadth-first manner.

An example of old syntax:

```zig
const check = exe.checkObject(.macho);
check.checkStart("sectname __text");
check.checkNext("offset {offset}");
check.checkComputeCompare("offset", .{ .op = .gte, .value = .{ .literal = 0x10000 } });
```

now becomes

```zig
const check_exe = exe.checkObject(.macho, .{});
const check = check_exe.root();
check.match("sectname __text");
check.match("offset {offset}");
const offset = check.get("offset");
offset.gte(0x10000);
```

which I think is cleaner. The harness itself is definitely easier to extend now if need be. cc'ing @Luukdegram as you will probably want to take a look at this before it is merged.

EDIT: After further thought, changes to the link test are not warranted in this PR and I have since reverted them back. (cc @Luukdegram, sorry for the noise!)

Fix the path written to `LC_ID_DYLIB` to include the CWD (if any).
By pulling out the parallel hashing setup from `CodeSignature.zig`,
we can now reuse it in different places across the MachO linker (for now;
I can totally see its usefulness beyond MachO, e.g. in COFF or ELF too).
The parallel hasher is generic over the actual hasher, such as Sha256 or MD5.
The implementation is kept as it was.

For UUID calculation, depending on the linking mode:
* incremental - since it only supports debug mode, we don't bother with MD5
  hashing of the contents and populate it with random data, but only once
  per sequence of in-place binary patches
* traditional - in debug, we use a random string (for speed); in release,
  we calculate the hash, using LLVM/LLD's trick: we calculate a series of
  MD5 hashes in parallel and then take a final MD5-of-MD5s hash to
  generate the digest.
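The MD5-of-MD5s trick can be sketched like so. This is an illustrative Python sketch, not the Zig implementation; the chunk size is made up, and the real linker additionally has to handle the excluded regions discussed above:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

CHUNK = 0x4000  # hypothetical chunk size; the real linker picks its own

def parallel_md5(data: bytes) -> bytes:
    """MD5 each fixed-size chunk in parallel, then take a final MD5
    over the concatenated per-chunk digests (LLVM/LLD's scheme)."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    with ThreadPoolExecutor() as pool:
        digests = list(pool.map(lambda c: hashlib.md5(c).digest(), chunks))
    return hashlib.md5(b"".join(digests)).digest()
```

The result differs from a single streaming MD5 over the whole file, but because the chunking is fixed it is just as deterministic, and the per-chunk hashes can be computed concurrently.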
@kubkon (Member, Author) commented Dec 15, 2022

@andrewrk I have applied your patch #13919 (comment), let's see if this is enough to guarantee byte-for-byte reproducibility on macOS :-)


Development

Successfully merging this pull request may close these issues.

macos shared libraries built with MachO are not reproducible
