Description
I'm using Int32.of_string to parse unsigned integers. It works correctly when compiled natively, but under js_of_ocaml it fails with Fatal error: exception Failure("int_of_string") whenever the input exceeds Int32.max_int.
How To Reproduce
Create the file test.ml with the following contents:
(* in file test.ml *)
let () = Int32.of_string "0u2147483648" |> Printf.sprintf "%ld" |> print_endline
Compile with ocamlfind/ocamlc and js_of_ocaml, then run the compiled output with node:
$ ocamlfind ocamlc -package js_of_ocaml -package js_of_ocaml-ppx -linkpkg -o test.byte test.ml
$ js_of_ocaml test.byte
$ node test.js
Fatal error: exception Failure("int_of_string")
Expected behavior
The documentation of Int32.of_string says:
The 0u prefix reads the input as an unsigned integer in the range [0, 2*Int32.max_int+1]. If the input exceeds Int32.max_int it is converted to the signed integer Int32.min_int + input - Int32.max_int - 1.
Compiling the same file with ocamlc and running the resulting executable shows that the input string 0u2147483648, which represents the unsigned integer 2^31, is accepted and prints -2147483648 (as the documentation describes: -2^31 + 2^31 - (2^31 - 1) - 1 = -2^31). A small sketch of this conversion follows the output below.
$ ocamlc test.ml -o out && ./out
-2147483648
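For reference, here is a minimal OCaml sketch of the documented wraparound. The helper signed_of_unsigned is hypothetical (named by me for illustration) and it assumes a 64-bit platform so the unsigned value fits in a native int:

(* Minimal sketch of the documented 0u wraparound semantics.
   signed_of_unsigned is a hypothetical helper, not part of the stdlib;
   it assumes a 64-bit platform so u fits in a native int. *)
let signed_of_unsigned (u : int) : int32 =
  if u <= Int32.to_int Int32.max_int then Int32.of_int u
  else
    (* values above Int32.max_int wrap to
       Int32.min_int + u - Int32.max_int - 1 *)
    Int32.(add min_int (of_int (u - to_int max_int - 1)))

let () =
  (* prints -2147483648, matching the native ocamlc output above *)
  Printf.printf "%ld\n" (signed_of_unsigned 2147483648)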
After checking the runtime source file ints.js, I'm fairly sure the caml_int32_of_string primitive does not handle the 0u prefix correctly. Int64.of_string appears to have the same issue. If you confirm this is a bug, I'd be happy to help fix it.
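If this is confirmed, a regression test could look roughly like the sketch below. The expected values follow the documented wraparound semantics; the layout is only a suggestion, not the project's existing test structure:

(* Hypothetical regression test for the 0u prefix; asserts the
   documented native wraparound behaviour for Int32 and Int64. *)
let () =
  assert (Int32.of_string "0u2147483648" = Int32.min_int);
  assert (Int32.of_string "0u4294967295" = -1l);
  assert (Int64.of_string "0u9223372036854775808" = Int64.min_int);
  assert (Int64.of_string "0u18446744073709551615" = -1L);
  print_endline "0u prefix: OK"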
Versions
- ocamlc 5.1.0
- js_of_ocaml 5.8.2