This is a simple lexical analyzer implemented in Python that uses dictionaries to create tokens from input strings.
- Tokenizes input strings based on specified rules.
- Uses Python dictionaries to map each lexeme to its token type, so classification is an average O(1) lookup (see the sketch below).
- Easy to customize for different languages or tokenization requirements.
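The exact contents of `lexical_analyzer.py` aren't reproduced here, but a minimal sketch of the dictionary-driven approach could look like the following. The dictionary names (`KEYWORDS`, `OPERATORS`), the token labels, and the regex-based splitting are illustrative assumptions, not the repository's actual code:

```python
import re

# Assumed rule dictionaries: each maps a lexeme to its token type.
KEYWORDS = {"int": "Keyword", "float": "Keyword", "if": "Keyword", "while": "Keyword"}
OPERATORS = {"=": "Assignment Operator", "+": "Arithmetic Operator", ";": "Semicolon"}

def tokenize(source):
    """Split the input into lexemes and classify each via dictionary lookups."""
    tokens = []
    # Lexemes are identifiers/keywords, integer literals, or single punctuation marks.
    for lexeme in re.findall(r"[A-Za-z_]\w*|\d+|[^\s\w]", source):
        if lexeme in KEYWORDS:
            tokens.append((lexeme, KEYWORDS[lexeme]))
        elif lexeme in OPERATORS:
            tokens.append((lexeme, OPERATORS[lexeme]))
        elif lexeme.isdigit():
            tokens.append((lexeme, "Integer Constant"))
        else:
            tokens.append((lexeme, "Identifier"))
    return tokens
```

Because the rules live in plain dictionaries, adding or removing a rule is a one-line change rather than a parser rewrite.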
- Clone the repository: `git clone https://github.com/imustitanveer/Lexical_Analyzer_UsingPythonDictionaries.git`
- Run the `lexical_analyzer.py` file: `python lexical_analyzer.py`
- Enter an input string when prompted, and the analyzer will generate tokens based on the specified rules.
For example, if the input string is:

```
int x = 10;
```

the analyzer might generate the following tokens:

```
int : Keyword
x : Identifier
= : Assignment Operator
10 : Integer Constant
; : Semicolon
```
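Continuing the sketch above, a minimal driver that reproduces this prompted interaction might look like this (the prompt text is an assumption):

```python
# Hypothetical driver loop for the prompt-based usage described above;
# it relies on the tokenize() sketch and prints one "lexeme : type" pair per line.
if __name__ == "__main__":
    source = input("Enter an input string: ")
    for lexeme, kind in tokenize(source):
        print(f"{lexeme} : {kind}")
```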
You can customize the tokenization rules by modifying the dictionaries in `lexical_analyzer.py`. For example, you can add new keywords, operators, or constants based on your requirements.
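For instance, under the same assumed dictionary names as in the sketch above, new rules could be added like this:

```python
# Hypothetical customization: new entries take effect on the next tokenize() call.
KEYWORDS["return"] = "Keyword"        # recognize an additional keyword
OPERATORS["%"] = "Modulus Operator"   # recognize an additional single-character operator
```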