Conversation
This is a primary motivating use case for this ruleset.
Behavior:
Use case: a developer changes foo.proto to remove a field. They either
1. don't update the foo.pb.go file in the source tree, or
2. update it, but incorrectly (commonly an AI agent might do this).
Their editor doesn't show an issue, but their Bazel tests will fail.
Because the flag `//diff:validate_diffs` is set in `.bazelrc`, attempting to run a test which depends on the protobuf file won't even build:
```
diff.bzl % bazel test examples:foo_usage_test
INFO: Analyzed target //examples:foo_usage_test (168 packages loaded, 12749 targets configured).
ERROR: /Users/alexeagle/Projects/diff.bzl/examples/BUILD:23:5: Action examples/bazel-out/darwin_arm64-fastbuild/bin/examples/go_codegen_diff.exit_code.valid failed: (Exit 1): bash failed: error executing Action command (from diff_rule rule target //examples:go_codegen_diff) /bin/bash -c ... (remaining 1 argument skipped)
Use --sandbox_debug to see verbose messages from the sandbox and retain the sandbox build root for debugging
ERROR: diff command exited with non-zero status.
To accept the diff, run:
patch -d $(bazel info workspace) -p0 < $(bazel info bazel-bin)/examples/go_codegen_diff.patch
Target //examples:foo_usage_test failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 0.952s, Critical Path: 0.04s
INFO: 2 processes: 518 action cache hit, 2 internal.
ERROR: Build did NOT complete successfully
//examples:foo_usage_test FAILED TO BUILD
```
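The printed `patch -d $(bazel info workspace) -p0 < ...` command applies a patch whose paths are workspace-relative, from any working directory. Here is a minimal standalone sketch of the same mechanics, with no Bazel required; the directory layout and file contents below are invented for illustration:

```shell
# Simulate a workspace with a stale generated file, and a separate output
# directory (standing in for $(bazel info bazel-bin)) holding the patch.
workspace=$(mktemp -d)
outdir=$(mktemp -d)

mkdir -p "$workspace/examples"
printf 'stale generated code\n' > "$workspace/examples/foo.pb.go"

# A -p0 patch whose paths are relative to the workspace root, like the
# go_codegen_diff.patch the rule writes.
cat > "$outdir/go_codegen_diff.patch" <<'EOF'
--- examples/foo.pb.go
+++ examples/foo.pb.go
@@ -1 +1 @@
-stale generated code
+fresh generated code
EOF

# Same shape as the command the failing build prints:
#   patch -d $(bazel info workspace) -p0 < $(bazel info bazel-bin)/examples/go_codegen_diff.patch
patch -d "$workspace" -p0 < "$outdir/go_codegen_diff.patch"

cat "$workspace/examples/foo.pb.go"
```

`-d` makes `patch` chdir to the workspace first, and `-p0` keeps the full `examples/foo.pb.go` path from the patch header, which is why the command works regardless of where you run it.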
thesayyn left a comment:
This will work perfectly with AXL; we already have retry and process-BES-events logic over there. Looking at test.sh, it looks like almost the same layering logic as rules_lint.
diff/private/diff.bzl (Outdated)
```
  To accept the diff, run:
-     patch -d \\$(bazel info workspace) -p0 < {patch}
  """.format(patch = ctx.outputs.patch.path)))
+     patch -d \\$(bazel info workspace) -p0 < \\$(bazel info bazel-bin)/{patch}
```
(for my understanding) what case does this fix?
I believe it's when your current working directory is not the repository root.
Perhaps it should still use $(bazel info workspace), since there are multiple bin directories under transitions.
```
permissions:
  contents: write
```
I hope this can't push to main. Would `pull-requests: write` suffice?
I think so, because it needs to create a branch to hold the commits.
I'm really hesitant about including this. There are going to be lots of test cases in this repo that produce patches, but we don't necessarily want the build to fail and a patch to be auto-applied.
I'm also concerned about the security implications of `contents: write`, and I don't fully understand what capabilities it would give to someone submitting a PR from their fork. Do you know?
I think this might be best kept as a documented example of how you can use the diff output groups for automatic patching, and not part of CI.
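For a documented example of that kind, an opt-in workflow could build the patch output group and publish the result on a branch for review rather than pushing to main. This is only a sketch; the workflow name, branch naming, and target pattern below are all hypothetical, and git identity configuration is omitted:

```yaml
# .github/workflows/autopatch.yml -- hypothetical opt-in example, not CI
name: autopatch
on: workflow_dispatch        # run manually; never triggered by fork PRs
permissions:
  contents: write            # needed only to push the patch branch
jobs:
  autopatch:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: bazel build //... --output_groups=diff_bzl__patch || true
      - run: |
          # Apply any produced patches, then push them to a review branch.
          find "$(bazel info bazel-bin)" -name '*.patch' \
            -exec patch -d "$(bazel info workspace)" -p0 -i {} \;
          git checkout -b autopatch/${{ github.run_id }}
          git commit -am "Apply generated diffs" && git push origin HEAD
```

Because the trigger is `workflow_dispatch` and the result lands on a branch, `contents: write` never touches main and fork PRs never get the privileged token.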
```
"--output_groups=diff_bzl__patch"
"--build_event_json_file=$buildevents"
```
Will this capture and patch all diffs, even if we set validate = False because we didn't want the build to fail? Not every patch represents a build error, and some generated patches may be wrong (think of a snapshot test that changed unexpectedly). Is there a way we can separate diffs that we want to be automatically patched from those we don't?
If you now simply run the command printed, it works: