
Conversation

@Fidget-Spinner
Member

Closes #445

No tests, because there are no existing tests for the option configurations in that file :(.

I have manually verified that it works on my system.
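For future reference, a regression test for the option plumbing could look roughly like the sketch below. This is not pyperformance's actual API: the `parse_config` helper, the `[compile_all]` section name, and the `rigorous` key are hypothetical stand-ins for whatever the real config loader exposes; only the shape of the test is the point.

```python
# Hypothetical sketch: check that a compile_all-style config section
# propagates the "rigorous" flag. parse_config and the section/key
# names are illustrative, not pyperformance's real API.
import configparser


def parse_config(text):
    """Toy loader: read [compile_all] options into a dict."""
    cp = configparser.ConfigParser()
    cp.read_string(text)
    return {
        "rigorous": cp.getboolean("compile_all", "rigorous", fallback=False),
    }


def test_rigorous_option_respected():
    # Explicitly enabled in the config file.
    cfg = parse_config("[compile_all]\nrigorous = true\n")
    assert cfg["rigorous"] is True

    # Omitted: should fall back to the default (not rigorous).
    cfg = parse_config("[compile_all]\n")
    assert cfg["rigorous"] is False


test_rigorous_option_respected()
print("ok")
```

A real test would instead load a temporary config file through pyperformance's own loader and assert on the resulting benchmark options.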

Member

@corona10 corona10 left a comment


lgtm!

The test failure does not look like it is related to this PR, but we should check:

def test_run_and_show(self):
    filename = self.resolve_tmp("bench.json")
    # -b all: check that *all* benchmarks work
    #
    # --debug-single-value: benchmark results don't matter, we only
    # check that running benchmarks doesn't fail.
    # XXX Capture and check the output.
    self.run_pyperformance(
        "run",
        "-b",
        "all",
        "--debug-single-value",
        "-o",
        filename,
        capture=None,
    )
    # Display slowest benchmarks
    # XXX Capture and check the output.
    self.run_module("pyperf", "slowest", filename)

@Fidget-Spinner Fidget-Spinner merged commit 4494463 into main Dec 8, 2025
70 of 72 checks passed
@Fidget-Spinner Fidget-Spinner deleted the rigorous-option branch December 8, 2025 15:27


Development

Successfully merging this pull request may close these issues.

compile_all config file does not respect rigorous option
