Use the size of numeric literals to inform the inferred numeric type #8337

@bstrie

Description

@bstrie

I've annotated the following program with its current output:

fn foo<T: Signed>(x: T, y: T) {
    printfln!("%?, %?", x, y);
}

fn main() {
    foo(1, 8_000_000_000);      // 1, -589934592
    foo(1, 8_000_000_000i64);   // 1, 8000000000
    foo(1i8, 8_000_000_000);    // 1, 0
    //foo(1, 8_000_000_000i8);  // error: literal out of range for its type
}

As shown by the last (commented-out) line, Rust is already smart enough to reject a numeric literal whose value is larger than the maximum value representable by its type, but only when that type is specified by a suffix:

  1. As shown by the first call to foo, it's perfectly willing to infer the default type of int, even when the value is too large to be contained in an int.
  2. As shown by the third call to foo, it's perfectly willing to infer a concrete type from a previously-given numeric suffix, even when that type is too small to hold the value of the later, un-suffixed literal.

This might be naive of me, but it would be wonderful if these issues could be addressed somehow. The first is complicated by the fact that int is a different size on different architectures, and that inferring the type as i64 would require assigning that same type to the prior literal. The second would require recognizing that 1i8 and 8_000_000_000 cannot share a type without overflow, and throwing an error just as the 8_000_000_000i8 line does today.
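The wrapped values in the annotated output are exactly what truncating a 64-bit value to a narrower width produces. As a minimal sketch in present-day Rust (using `as` casts and `try_from`, which are not part of the 2013 compiler this issue was filed against), the mismatches can be reproduced and, with checked conversion, turned into explicit errors:

```rust
use std::convert::TryFrom;

fn main() {
    let big: i64 = 8_000_000_000;

    // Truncating to 32 bits reproduces the first call's output:
    // 8_000_000_000 mod 2^32 = 3_705_032_704, which as a signed
    // 32-bit value is -589_934_592.
    println!("{}", big as i32); // -589934592

    // Truncating to 8 bits reproduces the third call's output:
    // 8_000_000_000 is an exact multiple of 256, so the low byte is 0.
    println!("{}", big as i8); // 0

    // A checked conversion surfaces the overflow instead of wrapping.
    assert!(i32::try_from(big).is_err());
    assert!(i8::try_from(big).is_err());
    assert_eq!(i64::try_from(big), Ok(8_000_000_000));
}
```

This is the behavior the issue asks the type checker to catch at inference time rather than leaving to silent wraparound.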
