Julien Fischer fdb91d5aba Extend the lexer to recognise additional integer literals.
Extend the lexer to recognise uint literals, optional signedness suffixes on
ints and the literals for all of the proposed fixed size integer types.

Fix some of the XXX UINT comments in the library and compiler.

library/lexer.m:
    Uncomment the other alternatives in the integer_size/0 type.

    Handle signedness and size suffixes in integer literals.

library/parser.m
library/term.m:
    Conform to the above changes.

library/stream.string_writer.m:
    Fix an XXX UINT: make write handle uints properly.

library/term_io.m:
    Fix an XXX UINT: output integer signedness and size suffixes for
    integers when appropriate.

compiler/superhomogeneous.m:
    Print an error message if we encounter a fixed size integer literal,
    as the rest of the compiler does not yet support such literals.

compiler/hlds_out_util.m:
compiler/parse_tree_out_info.m:
    Output the 'u' suffix on uint values.

tests/hard_coded/lexer_zero.{m,inp,exp*}:
    Extend this test to cover zeros of varying signedness and size.

    Prefix each line of the output with the input line number of the
    token -- this makes it easier to relate the output back to the
    input.

tests/hard_coded/Mmakefile:
    Add the new test case.

tests/hard_coded/lexer_ints.{m,inp,exp}:
    Test the lexer on non-zero integer literals.
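As a minimal sketch (not part of this commit), a program using the new 'u' suffix might look like the following; the module name is hypothetical, and the exact printed form depends on the term_io and stream.string_writer changes described above:

```mercury
%---------------------------------------------------------------------------%
% Hypothetical example, not part of the commit: exercising the 'u' suffix
% that the extended lexer now recognises on uint literals.
%---------------------------------------------------------------------------%
:- module uint_suffix_example.
:- interface.
:- import_module io.
:- pred main(io::di, io::uo) is det.
:- implementation.

main(!IO) :-
    % 561u is lexed as a uint literal rather than as an int.
    X = 561u,
    % With the stream.string_writer fix, write handles uints properly;
    % with the term_io fix, the value is written with its 'u' suffix.
    io.write_line(X, !IO).
```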
2017-04-26 10:00:45 +10:00


%---------------------------------------------------------------------------%
% vim: ts=4 sw=4 et ft=mercury
%---------------------------------------------------------------------------%
:- module lexer_ints.
:- interface.
:- import_module io.
:- pred main(io::di, io::uo) is det.
%---------------------------------------------------------------------------%
:- implementation.
:- import_module lexer.
%---------------------------------------------------------------------------%
main(!IO) :-
    % Read from the current input stream.
    lexer.get_token_list(Tokens, !IO),
    write_token_list(Tokens, !IO),
    io.nl(!IO),

    % Read from a string.
    io.open_input("lexer_ints.inp", OpenRes, !IO),
    (
        OpenRes = ok(Stream),
        io.read_file_as_string(Stream, ReadRes, !IO),
        (
            ReadRes = ok(String),
            Posn0 = posn(1, 0, 0),
            lexer.string_get_token_list(String, StringTokens, Posn0, _Posn),
            write_token_list(StringTokens, !IO)
        ;
            ReadRes = error(_, Error),
            io.write_line(Error, !IO)
        ),
        io.close_input(Stream, !IO)
    ;
        OpenRes = error(Error),
        io.write_line(Error, !IO)
    ).

:- pred write_token_list(token_list::in, io::di, io::uo) is det.

write_token_list(token_nil, !IO).
write_token_list(token_cons(Token, Context, List), !IO) :-
    io.write_int(Context, !IO),
    io.write_string(": ", !IO),
    io.write_line(Token, !IO),
    write_token_list(List, !IO).
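The lexer_ints.inp file driving this test is not included in this view. Purely as a hypothetical sample (the literal values and the combination of radix prefixes with suffixes are assumptions, not taken from the actual test data), input exercising the new forms could look like:

```
561 561u
127i8 255u8
32767i16 65535u16
2147483647i32 4294967295u32
0x1f 0x1fu 0xffu8
```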