Module pl.lexer
Lexical scanner for creating a sequence of tokens from text.
Functions
cpp (s, filter, options) | create a C/C++ token iterator from a string.
expecting (tok, expected_type, no_skip_ws) | get the next token, which must be of the expected type.
get_separated_list (tok, endtoken, delim) | get a list of parameters separated by a delimiter from a stream.
getline (tok) | get everything in a stream up to a newline.
getrest (tok) | get the rest of the stream.
insert (tok, a1, a2) | insert tokens into a stream.
lua (s, filter, options) | create a Lua token iterator from a string.
scan (s, matches, filter, options) | create a plain token iterator from a string.
skipws (tok) | get the next non-space token from the stream.

Functions
cpp (s, filter, options)
create a C/C++ token iterator from a string. Keywords are recognized and returned with the token type 'keyword'.
Parameters:
  s: the string
  filter: a table of token types to exclude; by default {space=true,comments=true}
  options: a table of options; by default {number=true,string=true}, which means convert numbers and strip string quotes.

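Example (a minimal sketch; the token types shown are indicative, not verified output):

    local lexer = require 'pl.lexer'

    -- whitespace and comments are dropped by the default filter
    for t, v in lexer.cpp('int x = 42; // answer') do
        print(t, v)
    end
    -- expected, roughly:  keyword int,  iden x,  = =,  number 42,  ; ;
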
expecting (tok, expected_type, no_skip_ws)
get the next token, which must be of the expected type. Throws an error if the type does not match.
Parameters:
  tok: the token stream
  expected_type: the token type
  no_skip_ws: if true, do not skip whitespace before reading the token

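Example (a sketch; it assumes expecting returns the matching token's value, and that single-character operators use the operator itself as their token type):

    local lexer = require 'pl.lexer'

    local tok = lexer.lua('x = 10')
    local name  = lexer.expecting(tok, 'iden')    -- "x"
    lexer.expecting(tok, '=')                     -- the assignment operator
    local value = lexer.expecting(tok, 'number')  -- 10, converted since number=true
    print(name, value)
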
get_separated_list (tok, endtoken, delim)
get a list of parameters separated by a delimiter from a stream.
Parameters:
  tok: the token stream
  endtoken: end of list (default ')'). Can be '\n'
  delim: separator (default ',')
Return value:
  a list of token lists.

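Example (a sketch; it assumes the stream is positioned just after the opening '(' of an argument list, and only the first return value is used):

    local lexer = require 'pl.lexer'

    local tok = lexer.lua('a, b + 1, c) rest')
    local args = lexer.get_separated_list(tok)   -- default endtoken ')' and delim ','
    print(#args)                                 -- expected: 3, one token list per argument
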
getline (tok)
get everything in a stream up to a newline.
Parameters:
  tok: a token stream
Return value:
  a string

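Example (a sketch; the exact whitespace kept in the returned text is not verified):

    local lexer = require 'pl.lexer'

    local tok = lexer.lua('local x = 1 -- first line\nlocal y = 2')
    local t, v = tok()               -- consume the first token ('local')
    print(lexer.getline(tok))        -- the rest of the first line, as raw text
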
getrest (tok)
get the rest of the stream.
Parameters:
  tok: a token stream
Return value:
  a string

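Example (a sketch along the same lines as getline, but consuming everything that remains):

    local lexer = require 'pl.lexer'

    local tok = lexer.lua('return {1, 2, 3} -- tail')
    local t, v = tok()               -- consume 'return'
    print(lexer.getrest(tok))        -- the unscanned tail of the string
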
insert (tok, a1, a2)
insert tokens into a stream.
Parameters:
  tok: a token stream
  a1: if a string, the token type; if a table, a token list; if a function, it is assumed to be a token-like iterator (returning type and value)
  a2: the token value, used when a1 is a type string

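Example (a sketch of the common push-back use with a type string and value; the table and iterator forms work analogously):

    local lexer = require 'pl.lexer'

    local tok = lexer.lua('x + 2')
    local t, v = tok()               -- read 'x'
    lexer.insert(tok, t, v)          -- push it back onto the stream
    print(tok())                     -- the next read should see 'x' again
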
lua (s, filter, options)
create a Lua token iterator from a string. Keywords are recognized and returned with the token type 'keyword'.
Parameters:
  s: the string
  filter: a table of token types to exclude; by default {space=true,comments=true}
  options: a table of options; by default {number=true,string=true}, which means convert numbers and strip string quotes.

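Example (a minimal sketch; output is indicative):

    local lexer = require 'pl.lexer'

    for t, v in lexer.lua('for i = 1, n do') do
        print(t, v)
    end
    -- expected, roughly:
    --   keyword  for
    --   iden     i
    --   =        =
    --   number   1
    --   ,        ,
    --   iden     n
    --   keyword  do
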
scan (s, matches, filter, options)
create a plain token iterator from a string.
Parameters:
  s: the string
  matches: an optional match table (set of pattern-action pairs)
  filter: a table of token types to exclude; by default {space=true}
  options: a table of options; by default {number=true,string=true}, which means convert numbers and strip string quotes.

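Example (a sketch using the default match table; unlike lua and cpp, the plain scanner does no keyword recognition, so words are expected back as 'iden'):

    local lexer = require 'pl.lexer'

    for t, v in lexer.scan('for i = 1') do
        print(t, v)
    end
    -- expected, roughly:  iden for,  iden i,  = =,  number 1
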
skipws (tok)
get the next non-space token from the stream.
Parameters:
  tok: the token stream

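Example (a sketch; it assumes passing an explicit empty filter table keeps space tokens in the stream, so skipws has something to skip):

    local lexer = require 'pl.lexer'

    local tok = lexer.lua('   return 42', {})
    local t, v = lexer.skipws(tok)
    print(t, v)                      -- expected: the first non-space token, keyword 'return'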