Tokens are the basic syntactically meaningful portions of an input.
For example, in
print 12+$var;
the tokens are print, 12, +, $var, and ;
Individual characters are not generally meaningful.
Tokenizing is the act of converting a character stream into a token stream.
Also called lexing.
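As a minimal sketch (not from the original slides, and nothing like Perl's real lexer), a regex-based tokenizer for inputs like the one above might look like this; the name tokenize and the exact token classes are illustrative assumptions.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Illustrative tokenizer sketch: turns a character stream into a list
    # of tokens (identifiers, numbers, variables, operators, semicolons).
    sub tokenize {
        my ($input) = @_;
        my @tokens;
        # \G resumes where the previous match left off; \s* skips whitespace
        # between tokens; the alternation defines the token classes.
        while ($input =~ m{ \G \s* ( \$\w+ | \w+ | [-+*/;] ) }gcx) {
            push @tokens, $1;
        }
        return @tokens;
    }

    print join(", ", tokenize('print 12+$var;')), "\n";
    # prints: print, 12, +, $var, ;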