Token (computing)


For other uses of this term, see Token.

A token, also called a lexical component, is a string of characters that has a coherent meaning in a certain programming language. Examples of tokens are keywords (if, else, while, int, ...), identifiers, numbers, signs, and multi-character operators (for example, := or +=).

They are the most basic elements on which the entire translation of a program is built. They arise in the first phase, called lexical analysis, and are still used in the following phases (syntactic analysis and semantic analysis) before being discarded in the synthesis phase.

Example

Suppose the following line of a program: IF New > MaxNum THEN

The tokens are:

* "IF"
* "New"
* ">"
* "MaxNum"
* "THEN"

Tokens are usually described in two parts, a type or class and a value, like this: Token = (Type, Value)

For the above sequence, the tokens can be described as:

* [Reserved Word, "IF"]
* [Identifier, "New"]
* [Operator, ">"]
* [Identifier, "MaxNum"]
* [Reserved Word, "THEN"]
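The following is a minimal sketch of how a lexical analyzer might produce such (Type, Value) pairs for the example line. The keyword set, token class names, and regular expressions are illustrative assumptions for a toy language, not the behavior of any particular compiler.

import re

# Illustrative keyword set for the toy language in the example.
KEYWORDS = {"IF", "THEN"}

# Each entry is (token class, regular expression); order matters.
TOKEN_SPEC = [
    ("Number",     r"\d+"),                      # integer literals
    ("Identifier", r"[A-Za-z_][A-Za-z0-9_]*"),   # names and keywords
    ("Operator",   r">=|<=|==|[><=+\-*/]"),      # one- and two-character operators
    ("Skip",       r"\s+"),                      # whitespace, discarded
]

def tokenize(line):
    """Yield (type, value) pairs for a single source line."""
    pattern = "|".join(f"(?P<{name}>{regex})" for name, regex in TOKEN_SPEC)
    for match in re.finditer(pattern, line):
        kind, value = match.lastgroup, match.group()
        if kind == "Skip":
            continue
        # Identifiers that appear in the keyword table become reserved words.
        if kind == "Identifier" and value.upper() in KEYWORDS:
            kind = "Reserved Word"
        yield (kind, value)

print(list(tokenize("IF New > MaxNum THEN")))
# [('Reserved Word', 'IF'), ('Identifier', 'New'), ('Operator', '>'),
#  ('Identifier', 'MaxNum'), ('Reserved Word', 'THEN')]

Real lexical analyzers typically add line and column information to each token and report characters that match no pattern as lexical errors; both details are omitted here for brevity.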
