I have a huge file that I must parse line by line. Speed is of the essence.
Example of a line:
Token-1 Here-is-the-Next-Token La
I made a lexical analyser based on a state machine (DFA). It is table-driven and pretty fast, but there may well be faster options.
It also depends on the language: a simple language may allow a smarter special-purpose algorithm.
The table is an array of records, each containing two chars and one integer. For each token the lexer walks through the table, starting at position 0:
state := 0;
result := tkNoToken;
while (result = tkNoToken) do begin
  if table[state].c1 > table[state].c2 then
    // Sentinel entry (empty range): value holds the token for this state.
    result := table[state].value
  else if (table[state].c1 <= c) and (c <= table[state].c2) then begin
    // Current char is in the range: consume it and jump to the next state block.
    c := GetNextChar();
    state := table[state].value;
  end else
    // Range did not match: try the next entry of the same state block.
    Inc(state);
end;
It is simple and works like a charm.
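For readers who want to see the table layout spelled out, here is a minimal sketch of how such a table could be declared and filled, assuming that entries for one DFA state are stored consecutively, that an entry with c1 > c2 is the sentinel ending a state block and yielding the token, and that value holds either the index of the next state block or a token id. The names (TTableEntry, tkUnknown, tkIdentifier) and the identifier rule are made up for illustration and are not taken from the actual lexer:

type
  TTableEntry = record
    c1, c2: Char;    // inclusive character range; c1 > c2 marks a sentinel entry
    value: Integer;  // next state block on a match, or a token id in a sentinel
  end;

const
  tkNoToken    = -1;  // "still scanning" marker used by the loop (assumed value)
  tkUnknown    = 0;   // hypothetical: character that starts no known token
  tkIdentifier = 1;   // hypothetical token id

  // Hypothetical table recognising identifiers of the form [A-Za-z][A-Za-z0-9]*.
  // Entries 0..2 form state block 0 (first character), entries 3..6 form block 3 (rest).
  table: array[0..6] of TTableEntry = (
    (c1: 'A'; c2: 'Z'; value: 3),            // letter: go to state block 3
    (c1: 'a'; c2: 'z'; value: 3),
    (c1: 'b'; c2: 'a'; value: tkUnknown),    // sentinel: not an identifier
    (c1: 'A'; c2: 'Z'; value: 3),            // more letters or digits: stay in block 3
    (c1: 'a'; c2: 'z'; value: 3),
    (c1: '0'; c2: '9'; value: 3),
    (c1: 'b'; c2: 'a'; value: tkIdentifier)  // sentinel: identifier complete
  );

With c holding 'x' and the next character being something like '+', the loop skips entry 0, matches entry 1, consumes the '+' and jumps to block 3, fails all three ranges there, and returns tkIdentifier at the sentinel in entry 6, leaving the '+' in c for the next token.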