Lexer
A lexer is a tokenizer: it takes a character-based input stream (read from a file, from a string in memory, or from characters arriving over the network) and recognizes characters and groups of characters as terminal symbols, also called tokens. A minimal hand-written sketch follows.
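To make the idea concrete, here is a small sketch of a hand-written lexer in Python. The token names and regular expressions are illustrative assumptions for a toy language, not drawn from any particular tool: the lexer scans the input one position at a time and groups characters into (token type, lexeme) pairs.

    import re

    # Illustrative token specification (assumed toy grammar, not from any tool).
    TOKEN_SPEC = [
        ("NUMBER", r"\d+"),           # integer literals
        ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers
        ("OP",     r"[+\-*/=]"),      # single-character operators
        ("SKIP",   r"\s+"),           # whitespace: recognized but discarded
    ]
    MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

    def tokenize(text):
        """Yield (token_type, lexeme) pairs from a character stream."""
        pos = 0
        while pos < len(text):
            m = MASTER_RE.match(text, pos)
            if not m:
                raise SyntaxError(f"unrecognized character {text[pos]!r} at position {pos}")
            if m.lastgroup != "SKIP":
                yield (m.lastgroup, m.group())
            pos = m.end()

    print(list(tokenize("count = count + 42")))
    # [('IDENT', 'count'), ('OP', '='), ('IDENT', 'count'), ('OP', '+'), ('NUMBER', '42')]

In practice, lexers like this are rarely written by hand; the table of name/pattern pairs above is essentially what a lexer generator takes as input.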
Tools that can be used to generate lexers from a grammar (a formal specification of how input should be lexed) include: