
A token may have different attributes, e.g. name, type, size, etc. I am confused about which of these are filled in by the lexical analyser and which are filled in by other phases of the compiler. As different compilers may behave differently, we may take GCC's C compiler as a reference.

  • The lexical analyser only knows the name. It doesn't know the type, and therefore not the size. It doesn't know the scope. It knows exactly nothing except the name. Everything else is filled in during semantic analysis. – user207421 Oct 28 '19 at 01:28

1 Answer


This is somewhat subjective, as it depends on what you are building (compiler, interpreter, etc.), so what follows is a fairly generic description:

Lexical analysis, or scanning, takes your source code and breaks it up into 'tokens' or 'lexemes', but they may not yet be put into the symbol table.
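To make that concrete, here is a minimal, hypothetical scanner sketch in C (the token kinds and function names are illustrative, not taken from GCC). Note that at this stage a token carries only a coarse kind and the raw lexeme text; nothing about type, size, or scope is known yet.

    #include <ctype.h>
    #include <stdio.h>

    enum kind { TK_IDENT, TK_NUMBER, TK_PUNCT, TK_EOF };

    /* At the lexing stage a token is just a kind plus the raw lexeme text. */
    struct token {
        enum kind kind;
        char text[64];
    };

    static struct token next_token(const char **p)
    {
        struct token t = { TK_EOF, "" };
        size_t n = 0;

        while (isspace((unsigned char)**p))
            (*p)++;
        if (**p == '\0')
            return t;

        if (isalpha((unsigned char)**p) || **p == '_') {
            t.kind = TK_IDENT;
            while ((isalnum((unsigned char)**p) || **p == '_') && n < sizeof t.text - 1)
                t.text[n++] = *(*p)++;
        } else if (isdigit((unsigned char)**p)) {
            t.kind = TK_NUMBER;
            while (isdigit((unsigned char)**p) && n < sizeof t.text - 1)
                t.text[n++] = *(*p)++;
        } else {
            t.kind = TK_PUNCT;
            t.text[n++] = *(*p)++;
        }
        t.text[n] = '\0';
        return t;
    }

    int main(void)
    {
        const char *src = "int x = 42;";
        for (struct token t = next_token(&src); t.kind != TK_EOF; t = next_token(&src))
            printf("kind=%d text=\"%s\"\n", t.kind, t.text);
        return 0;
    }

Running this on "int x = 42;" yields five tokens; the lexeme "int" comes out as a plain identifier, since recognizing it as a keyword belongs to a later step.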

Evaluation, if you implement it, is a point at which you may start stropping identifiers (for example, marking one as a keyword), and it is generally the first point at which a token would qualify for entry into a symbol table. Depending on the symbol, its attributes may already be complete at that point.
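As a sketch of that evaluation step (the keyword list and the is_keyword helper are hypothetical, not GCC's), an identifier lexeme might be checked against the reserved words before it qualifies for the symbol table:

    #include <stdio.h>
    #include <string.h>

    static const char *keywords[] = { "int", "char", "if", "while", "return" };

    /* Hypothetical evaluation step: decide whether an identifier lexeme
     * is actually a reserved word before entering it in the symbol table. */
    static int is_keyword(const char *text)
    {
        for (size_t i = 0; i < sizeof keywords / sizeof keywords[0]; i++)
            if (strcmp(text, keywords[i]) == 0)
                return 1;
        return 0;
    }

    int main(void)
    {
        printf("%d %d\n", is_keyword("int"), is_keyword("x")); /* prints: 1 0 */
        return 0;
    }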

Later stages, such as parsing, may reference the symbol and imbue it with additional information such as scoping, internal or external reference, visibility, type, and size.

Code generation would finalize any remaining attributes of the symbol, and at that point you are writing a binary; the sketch below shows which phase would typically fill in each attribute.
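Tying the phases together, one possible shape for a symbol-table entry is sketched here; the field names are illustrative rather than anything from GCC's internals, and each comment notes the phase that would typically fill the field in.

    #include <stddef.h>

    enum type { T_UNKNOWN, T_INT, T_CHAR };

    /* One possible symbol-table entry; each field is annotated with the
     * phase that would typically fill it in. */
    struct symbol {
        const char *name;       /* lexing/evaluation: the lexeme itself     */
        int         scope;      /* parsing: block or file scope             */
        enum type   type;       /* semantic analysis                        */
        size_t      size;       /* semantic analysis: derived from the type */
        int         is_extern;  /* parsing/semantics: internal vs external  */
        long        address;    /* code generation: final location/offset   */
    };

    int main(void)
    {
        /* What "int x;" might look like after all phases have run: */
        struct symbol x = { "x", 0, T_INT, sizeof(int), 0, 0 };
        (void)x;
        return 0;
    }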

Frank C.