Status: Closed
Labels: A-inference (Area: Type inference), C-bug (Category: This is a bug.)
Description
We expect this to compile without error (reasoning below):
```rust
let c: &u32 = &5; ((c >> 8) & 0xff) as u8
```

Compile error:

```
error: the trait bound `u32: std::ops::BitAnd<i32>` is not satisfied [--explain E0277]
 --> <anon>:10:27
   |>
10 |>     let c: &u32 = &5; ((c >> 8) & 0xff) as u8
   |>                           ^^^^^^^^^^^^^^^^^
```

Problem: the `0xff` is getting inferred to `i32` instead of `u32`, which would work.
Explanation of my understanding of the code:
- `c` has type `&u32`
- `8` is defaulted to `i32` (perhaps `0xff` is also unhelpfully defaulted to `i32` at this stage?)
- `c >> 8` has type `u32` via `impl<'a> Shr<i32> for &'a u32` (to the best of my knowledge)
- Next we expect `0xff` to infer to `u32` to use the `u32: BitAnd<u32>` impl, but it fails.
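A minimal check of the middle step above, assuming the forwarding impl resolves as described: shifting a `&u32` goes through `impl<'a> Shr<i32> for &'a u32` and yields an owned `u32`, not a reference, which is why the `&` should then resolve against `u32: BitAnd<u32>`:

```rust
fn main() {
    let c: &u32 = &5;
    // The shift uses the forwarding impl `impl<'a> Shr<i32> for &'a u32`,
    // whose `Output = u32`, so `shifted` is an owned u32 by value.
    let shifted: u32 = c >> 8;
    assert_eq!(shifted, 0); // 5 >> 8 == 0
    println!("shifted = {}", shifted);
}
```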
Working examples (each with a slight twist):
```rust
// The most obvious fix is to force the type of 0xff:
let c: &u32 = &5; ((c >> 8) & 0xffu32) as u8

// But that shouldn't be necessary! It works without that if `c: u32` or `*c` is used:
let c: u32 = 5; ((c >> 8) & 0xff) as u8
let c: &u32 = &5; ((*c >> 8) & 0xff) as u8

// Finally, and most oddly, using an identity cast or type ascription
// from `u32` to `u32` also convinces the inference engine:
let c: &u32 = &5; ((c >> 8) as u32 & 0xff) as u8
let c: &u32 = &5; ((c >> 8): u32 & 0xff) as u8
```

Who thought identity casts were useless?
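For reference, a self-contained program exercising the working variants (the type-ascription form is omitted here, since `expr: Type` ascription was never available on stable Rust):

```rust
fn main() {
    let c: &u32 = &5;
    // Forcing the literal's type:
    let a = ((c >> 8) & 0xffu32) as u8;
    // Dereferencing before shifting:
    let b = ((*c >> 8) & 0xff) as u8;
    // Identity cast from u32 to u32:
    let d = ((c >> 8) as u32 & 0xff) as u8;
    // 5 >> 8 == 0, so every masked result is 0.
    assert_eq!((a, b, d), (0, 0, 0));
    println!("a = {}, b = {}, d = {}", a, b, d);
}
```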