Tokenization breaks text into smaller units called tokens (words, subwords, or characters) so that machines can analyze and process human language.
https://www.datacamp.com/blog/what-is-tokenization
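The idea above can be sketched with a minimal word-level tokenizer; this regex-based split is a naive illustration, not how production tokenizers (e.g., subword tokenizers) work:

```python
import re

def tokenize(text: str) -> list[str]:
    # Naive word-level tokenizer: grabs runs of word characters,
    # and treats each remaining punctuation mark as its own token.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization breaks text into smaller parts."))
# → ['Tokenization', 'breaks', 'text', 'into', 'smaller', 'parts', '.']
```

Real systems usually go further, e.g. lowercasing, handling contractions, or splitting words into subword units, but the core step is the same: turn a string into a sequence of discrete tokens.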