A token is a unit of text, typically a word, number, or punctuation mark, that a lexical analyzer separates out of the input and passes to a parser.
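As an illustration, a minimal lexical analyzer might split source text into tokens with a regular expression. This is only a sketch; the token names and the tiny grammar (numbers, identifiers, a few operators) are illustrative, not drawn from any particular compiler:

    import re

    # Illustrative token grammar: numbers, identifiers, operators, parentheses.
    TOKEN_SPEC = [
        ("NUMBER", r"\d+"),
        ("IDENT",  r"[A-Za-z_]\w*"),
        ("OP",     r"[+\-*/=]"),
        ("LPAREN", r"\("),
        ("RPAREN", r"\)"),
        ("SKIP",   r"\s+"),   # whitespace separates tokens but is not one
    ]
    TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

    def tokenize(text):
        """Yield (kind, value) pairs, as a lexer would hand them to a parser."""
        for match in TOKEN_RE.finditer(text):
            if match.lastgroup != "SKIP":
                yield (match.lastgroup, match.group())

    print(list(tokenize("x = 3 + 41")))
    # [('IDENT', 'x'), ('OP', '='), ('NUMBER', '3'), ('OP', '+'), ('NUMBER', '41')]

A parser would then consume this stream of (kind, value) pairs rather than raw characters.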
Older interpreters would tokenize the program, turning the programmer's instructions into byte-codes that the virtual machine (VM) could process more easily. By doing so they avoided re-scanning the source text on every run, gaining faster execution.
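This pattern is still in use: CPython, for instance, compiles source code to byte-codes executed by its virtual machine, and the standard dis module can display them. A short sketch (the exact opcodes shown vary between Python versions):

    import dis

    def add(a, b):
        return a + b

    # Disassemble the function into the byte-codes the CPython VM executes,
    # e.g. LOAD_FAST instructions followed by an addition and a return.
    dis.dis(add)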
Some examples are O-code, P-code, the Z-machine, and the byte-codes used in various other VM systems.