We already did most of the heavy lifting in the lexer, so our tokenizer module can be simpler than before. We can use the jsonic tokenizer as a template. But instead of having the lexer embedded inside, we import basic-lexer from our "basic/lexer.rkt" module:
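One way the revised module might look — a sketch, not necessarily the final version, assuming `basic-lexer` is what our `"basic/lexer.rkt"` module exports (here required as `"lexer.rkt"` from the same directory):

```racket
#lang br
(require brag/support "lexer.rkt")

;; Sketch: basic-lexer is assumed to come from "lexer.rkt".
;; Details explained below — make-tokenizer takes an optional
;; path so lexer-srcloc can record complete source locations.
(define (make-tokenizer ip [path #f])
  (port-count-lines! ip)    ; enable line & column counting on the port
  (lexer-file-path path)    ; tell the lexer which file it's reading
  (define (next-token)
    (basic-lexer ip))
  next-token)

(provide make-tokenizer)
```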
But since we’re now relying on lexer-srcloc to gather source locations for us, we make one update. One of the fields tracked in a source-location structure is the path to the file containing the source code. But our basic-lexer only gets an input port as an argument, not a path. Without that extra path argument, its source locations will be incomplete.
We modify make-tokenizer to take an optional path argument. Later, when we invoke the tokenizer from the reader, we'll pass it both an input port and a path. For testing, though, we won't always want to supply a path, so we give the argument a default value of #f.
We set the lexer-file-path parameter to this value. A parameter is a special kind of Racket value that approximates a global variable. Unlike a global variable, which holds a simple value, a parameter is a function used to store and retrieve a value. So whatever argument we pass to lexer-file-path “sets” it, and that value can then be retrieved by other functions that rely on it, including lexer-srcloc. True, parameters go somewhat against the grain of functional programming, so they’re used sparingly in Racket. But sometimes they’re the cleanest way to do things.
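To see the store-and-retrieve behavior in isolation, here’s a toy parameter (the name current-path is hypothetical, chosen only to mirror lexer-file-path):

```racket
#lang br

;; Make a parameter with a default value of #f —
;; analogous to how lexer-file-path starts out unset.
(define current-path (make-parameter #f))

(current-path)             ; call with no arguments to retrieve: #f
(current-path "main.rkt")  ; call with one argument to store a new value
(current-path)             ; retrieving again now yields "main.rkt"
```

Calling lexer-file-path with our path argument works the same way: it stores the path so that lexer-srcloc can later retrieve it.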
With those changes, we’re done with the tokenizer.