The Free On-line Dictionary of Computing (30 December 2018):
lexical analysis
   (Or "linear analysis", "scanning") The first stage of
   processing a language.  The stream of characters making up
   the source program or other input is read one character at a
   time and grouped into lexemes (or "tokens") - word-like
   pieces such as keywords, identifiers, literals and
   punctuation.  The lexemes are then passed to the parser.
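   The grouping described above can be sketched as a small
   regular-expression-based scanner (a minimal illustration, not
   part of the original entry; the token names and sample input
   are hypothetical):

```python
import re

# Token classes from the definition: keywords, identifiers,
# literals and punctuation.  KEYWORD is listed before IDENTIFIER
# so reserved words are not matched as identifiers.
TOKEN_SPEC = [
    ("KEYWORD",    r"\b(?:if|else|while)\b"),
    ("IDENTIFIER", r"[A-Za-z_]\w*"),
    ("LITERAL",    r"\d+"),
    ("PUNCT",      r"[(){};=<>+-]"),
    ("SKIP",       r"\s+"),          # whitespace separates lexemes
]
MASTER = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def lex(source):
    """Read the character stream and yield (kind, lexeme) pairs."""
    for m in MASTER.finditer(source):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

# The resulting lexemes would then be passed to the parser.
tokens = list(lex("if (x1 < 10) x1 = x1 + 1;"))
```

   For the sample input this yields twelve lexemes, starting with
   the keyword "if" and ending with the punctuation ";".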
   ["Compilers - Principles, Techniques and Tools", by Alfred
   V. Aho, Ravi Sethi and Jeffrey D. Ullman, pp. 4-5]
   (1995-04-05)