
infer more completely everything that the optimizer/codegen requires #56565

Merged — 2 commits merged into master on Nov 15, 2024

Conversation

vtjnash
Copy link
Member

@vtjnash vtjnash commented Nov 14, 2024

Inlining wants to know information about every isa_compileable_sig method, as well as everything it might consider inlining (which is almost the same thing). So even if inference could bail out on computing the type because it has already reached the maximum fixed point, it should keep going to get that information. This now uses two loops: one to compute the inferred type information, then a second loop to go back and get coverage of all of the compileable targets (unless that particular target is predicted to be inlined or dropped later).

(system image size contribution seems to be fairly negligible)
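The two-loop structure described above can be sketched abstractly. The following is an illustrative Python analogy, not the actual Julia compiler code; `infer_type`, `is_compileable`, and `will_be_inlined` are hypothetical stand-ins for the real inference queries:

```python
# Hypothetical stand-ins for inference queries (illustration only).
def infer_type(match):
    return match["type"]

def is_compileable(match):
    return match.get("compileable", False)

def will_be_inlined(match):
    return match.get("inlined", False)

def process_matches(matches):
    results = [None] * len(matches)
    reached_fixed_point = False

    # Loop 1: compute type information, bailing out once the result
    # has already widened to the maximum fixed point ("Any" here),
    # since further matches cannot refine it.
    for i, m in enumerate(matches):
        if reached_fixed_point:
            continue  # previously, inference stopped here for good
        results[i] = infer_type(m)
        if results[i] == "Any":
            reached_fixed_point = True

    # Loop 2: go back and cover every compileable target skipped by
    # loop 1, unless it is predicted to be inlined or dropped later.
    for i, m in enumerate(matches):
        if results[i] is None and is_compileable(m) and not will_be_inlined(m):
            results[i] = infer_type(m)
    return results

matches = [
    {"type": "Any", "compileable": True},
    {"type": "Int", "compileable": True},
    {"type": "Float64", "compileable": True, "inlined": True},
]
print(process_matches(matches))
```

In this toy run, the first match widens the result to `Any` so the first loop stops, and the second loop still goes back to infer the second (compileable, not-inlined) target while leaving the inlined one alone.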

@vtjnash vtjnash added the compiler:inference Type inference label Nov 14, 2024
@vtjnash vtjnash requested a review from aviatesk November 14, 2024 21:14
@oscardssmith
Copy link
Member

does this have an effect on compile time (e.g. for OmniPackage)?

@aviatesk
Copy link
Member

@nanosoldier runbenchmarks("inference", vs=":master")

@nanosoldier
Copy link
Collaborator

Your benchmark job has completed - no performance regressions were detected. A full report can be found here.

aviatesk added a commit that referenced this pull request Nov 15, 2024
Might be better to merge after #56565.
aviatesk added a commit that referenced this pull request Nov 15, 2024
Might be better to merge after #56565.
@vtjnash vtjnash merged commit caa2f7d into master Nov 15, 2024
6 of 8 checks passed
@vtjnash vtjnash deleted the jn/completed-infer_compilation_signature-inference branch November 15, 2024 16:44
aviatesk added a commit that referenced this pull request Nov 16, 2024
Might be better to merge after #56565.