
poor inference when calling constant IntrinsicFunction passed as argument #28003

Closed
jrevels opened this issue Jul 9, 2018 · 6 comments
Labels
compiler:inference Type inference performance Must go faster regression Regression in behavior compared to a previous version



jrevels commented Jul 9, 2018

A non-Cassette MWE for JuliaLabs/Cassette.jl#45, and a demonstration of a regression from #26826 (ported to the new optimizer in #26981); see the comment below.

On current master (b6f9244):

julia> g(f, args...) = f(args...)
g (generic function with 1 method)

julia> h(args...) = g(Base.pointerref, args...)
h (generic function with 1 method)

julia> code_typed(h, (Ptr{Float32}, Int, Int); optimize = false)
1-element Array{Any,1}:
 CodeInfo(
1 ─ %1 = :(Base.pointerref)::Core.Compiler.Const(pointerref, false)
│   %2 = Core.tuple(%1)::Core.Compiler.Const((pointerref,), false)
│   %3 = Core._apply(Main.g, %2, %%args)::Any
└──      return %3
) => Any

This was previously working at commit 1d53232 (not an exact bisect):

julia> g(f, args...) = f(args...)
g (generic function with 1 method)

julia> h(args...) = g(Base.pointerref, args...)
h (generic function with 1 method)

julia> code_typed(h, (Ptr{Float32}, Int, Int); optimize = false)
1-element Array{Any,1}:
 CodeInfo(:(begin
      nothing
      Core.SSAValue(0) = Base.pointerref
      Core.SSAValue(1) = (Core.tuple)(Core.SSAValue(0))::Tuple{Core.IntrinsicFunction}
      Core.SSAValue(2) = (Core._apply)(Main.g, Core.SSAValue(1), args)::Float32
      return Core.SSAValue(2)
      6:
  end)) => Float32
@jrevels jrevels added performance Must go faster regression Regression in behavior compared to a previous version compiler:inference Type inference labels Jul 9, 2018

jrevels commented Jul 9, 2018

Argh - I bisected this MWE to 9277d3a (cc @JeffBezanson), but that's too recent for it to be the direct cause of the problem in JuliaLabs/Cassette.jl#45 (commits before 9277d3a exhibit the regression there). Oh well, back to the bug hunt.

@jrevels jrevels changed the title broken varargs constant propagation inference regression potentially related to varargs constant propagation Jul 9, 2018
@JeffBezanson
Member

This is because in g(f::IntrinsicFunction, ...) = f(...), the call to f is dynamic (since the call target isn't known). Previously the inlining cost model said this call has no cost, which is wrong, so this method was marked as inlineable. Inference then used that as a hint that it's worth doing constant propagation.

The only fix that comes to mind is always doing constant propagation if an argument is a known IntrinsicFunction.
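One user-side workaround this explanation suggests (a sketch only; pointerref_w is a hypothetical helper name, not part of the thread): wrap the intrinsic in an ordinary generic function. The wrapper has its own concrete type, so the call through g dispatches statically and no constant propagation is needed:

```julia
# Hypothetical workaround sketch: wrap the intrinsic in a normal
# generic function so the callee has a concrete type
# (typeof(pointerref_w)) instead of the shared Core.IntrinsicFunction.
pointerref_w(p, i, align) = Base.pointerref(p, i, align)

g(f, args...) = f(args...)
h(args...) = g(pointerref_w, args...)

# Inference now sees f::typeof(pointerref_w), so the call resolves
# statically and h infers as returning Float32.
```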


jrevels commented Jul 11, 2018

Right, Cassette'ified calls inferring poorly due to intrinsic functions not constant propagating was the main motivation for me to implement vararg constant propagation in the first place.

> Inference then used that as a hint that it's worth doing constant propagation.
>
> The only fix that comes to mind is always doing constant propagation if an argument is a known IntrinsicFunction.

It seems dangerous to let inlining heuristics determine whether or not constant propagation happens when we're relying on constant propagation to infer basic function calls.

I guess the only alternative besides further special-casing for IntrinsicFunctions is to give each IntrinsicFunction its own type. This would be nice for other reasons as well, but I'm assuming that would result in non-negligible compile-time/memory regressions.
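A rough sketch of the "give each intrinsic its own type" idea, purely illustrative (PointerRef is a hypothetical wrapper; this is not how Base defines intrinsics):

```julia
# Illustrative only: a zero-field singleton type per intrinsic, the way
# closures and generic functions already get their own types. Then
# typeof(PointerRef()) identifies the call target exactly.
struct PointerRef end
(::PointerRef)(p, i, align) = Base.pointerref(p, i, align)

g(f, args...) = f(args...)
h(args...) = g(PointerRef(), args...)

# Inference sees f::PointerRef, a concrete type, so the call
# devirtualizes without any constant propagation.
```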

@jrevels jrevels changed the title inference regression potentially related to varargs constant propagation poor inference when calling constant IntrinsicFunction passed as argument Dec 14, 2018

jrevels commented Dec 19, 2018

@vtjnash @Keno Well whaddayaknow, this is indeed fixed by setting aggressive_constant_propagation=true:

julia> code_typed(h, (Ptr{Float32}, Int, Int))
1-element Array{Any,1}:
 CodeInfo(
1 ─ %1 = (getfield)(args, 1)::Ptr{Float32}
│   %2 = (getfield)(args, 2)::Int64
│   %3 = (getfield)(args, 3)::Int64
│   %4 = (pointerref)(%1, %2, %3)::Float32
└──      return %4
) => Float32

What does that mean for next steps for resolving this issue?


jrevels commented Dec 19, 2018

Ah, but this doesn't resolve my underlying problem, which is why I remembered aggressive_constant_propagation=true not being enough here:

julia> using Cassette: @context, overdub, disablehooks

julia> @context NoOpCtx;

julia> ctx = disablehooks(NoOpCtx());

julia> sig = map(Core.Typeof, (ctx, overdub, ctx, overdub, ctx, Float64, 1));

# using https://github.com/JuliaLang/julia/pull/30451 for the nice kwarg here
julia> p = Core.Compiler.CustomParams(typemax(UInt), aggressive_constant_propagation = true);

julia> code_typed(overdub, sig, params = p, optimize=false)
1-element Array{Any,1}:
 CodeInfo(
1 ─       (overdub@_4 = (Core.getfield)(##overdub_arguments#364@_3, 1))::Const(overdub, false)
│         (##overdub_context#363@_5 = (Core.getfield)(##overdub_arguments#364@_3, 2))::Const(Context{nametype(NoOpCtx),Nothing,Nothing,##PassType#365,Nothing,DisableHooks}(nametype(NoOpCtx)(), nothing, nothing, ##PassType#365(), nothing, DisableHooks()), false)
│   %3  = (Core.getfield)(##overdub_arguments#364@_3, 3)::Const(overdub, false)
│   %4  = (Core.getfield)(##overdub_arguments#364@_3, 4)::Const(Context{nametype(NoOpCtx),Nothing,Nothing,##PassType#365,Nothing,DisableHooks}(nametype(NoOpCtx)(), nothing, nothing, ##PassType#365(), nothing, DisableHooks()), false)
│   %5  = (Core.getfield)(##overdub_arguments#364@_3, 5)::Const(Float64, false)
│   %6  = (Core.getfield)(##overdub_arguments#364@_3, 6)::Int64
│         (##overdub_arguments#364@_6 = (Core.tuple)(%3, %4, %5, %6))::PartialTuple(Tuple{typeof(overdub),Context{nametype(NoOpCtx),Nothing,Nothing,##PassType#365,Nothing,DisableHooks},DataType,Int64}, Any[Const(overdub, false), Const(Context{nametype(NoOpCtx),Nothing,Nothing,##PassType#365,Nothing,DisableHooks}(nametype(NoOpCtx)(), nothing, nothing, ##PassType#365(), nothing, DisableHooks()), false), Const(Float64, false), Int64])
│   %8  = (Cassette.overdub)(##overdub_context#363@_2, Core.getfield, ##overdub_arguments#364@_6, 1)::Const(overdub, false)
│         (overdub@_7 = %8)::Const(overdub, false)
│   %10 = (Cassette.overdub)(##overdub_context#363@_2, Core.getfield, ##overdub_arguments#364@_6, 2)::Const(Context{nametype(NoOpCtx),Nothing,Nothing,##PassType#365,Nothing,DisableHooks}(nametype(NoOpCtx)(), nothing, nothing, ##PassType#365(), nothing, DisableHooks()), false)
│         (##overdub_context#363@_8 = %10)::Const(Context{nametype(NoOpCtx),Nothing,Nothing,##PassType#365,Nothing,DisableHooks}(nametype(NoOpCtx)(), nothing, nothing, ##PassType#365(), nothing, DisableHooks()), false)
│   %12 = (Cassette.overdub)(##overdub_context#363@_2, Core.getfield, ##overdub_arguments#364@_6, 3)::Const(Float64, false)
│   %13 = (Cassette.overdub)(##overdub_context#363@_2, Core.getfield, ##overdub_arguments#364@_6, 4)::Int64
│   %14 = (Cassette.overdub)(##overdub_context#363@_2, Core.tuple, %12, %13)::PartialTuple(Tuple{DataType,Int64}, Any[Const(Float64, false), Int64])
│         (##overdub_arguments#364@_9 = %14)::PartialTuple(Tuple{DataType,Int64}, Any[Const(Float64, false), Int64])
│   %16 = (Cassette.overdub)(##overdub_context#363@_2, Cassette.overdub, ##overdub_context#363@_5, Core.getfield, ##overdub_arguments#364@_9, 1)::Const(Float64, false)
│         (#self# = %16)::Const(Float64, false)
│   %18 = (Cassette.overdub)(##overdub_context#363@_2, Cassette.overdub, ##overdub_context#363@_5, Core.getfield, ##overdub_arguments#364@_9, 2)::Int64
│         (x = %18)::Int64
│   %20 = (Cassette.overdub)(##overdub_context#363@_2, Cassette.overdub, ##overdub_context#363@_5, Cassette.overdub, ##overdub_context#363@_8, Base.sitofp, Float64, x)::Any
└──       return %20
) => Any

Naive guess: the Base.sitofp on the last line is getting shoved into a PartialTuple later on as an IntrinsicFunction rather than as a Const, so by the time a downstream _apply sees it, it's too late.
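For context on why widening the Const is fatal here (my own illustration, not from the thread): every intrinsic shares the single concrete type Core.IntrinsicFunction, so once a Const(sitofp, ...) is widened to that type, the identity of the particular intrinsic is unrecoverable:

```julia
# All intrinsics share one concrete type...
typeof(Base.sitofp) === Core.IntrinsicFunction          # true
typeof(Base.sitofp) === typeof(Base.pointerref)         # true

# ...even though the intrinsics themselves are distinct values,
# so the type alone cannot tell inference which intrinsic is called.
Base.sitofp === Base.pointerref                         # false
```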


vtjnash commented Jan 25, 2022

This can probably be assumed to be improved now.

@vtjnash vtjnash closed this as completed Jan 25, 2022