In NLP, tokenization is one of the first steps in the pipeline: it breaks the text down into individual words. I don't know about others, but for the project I'm working on, tokenization speed doesn't matter much in the big picture because the steps that follow take much longer.
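
For illustration, here's a minimal sketch of what a word-level tokenizer might look like. This is just a simple regex-based example using Python's standard `re` module (the `tokenize` function name and the pattern are assumptions made for this sketch, not the tokenizer my project actually uses):

```python
import re

def tokenize(text: str) -> list[str]:
    # Lowercase, then pull out runs of word characters.
    # Punctuation is simply dropped in this simple sketch.
    return re.findall(r"\w+", text.lower())

print(tokenize("Tokenization is one of the first steps in the pipeline."))
# ['tokenization', 'is', 'one', 'of', 'the', 'first', 'steps', 'in', 'the', 'pipeline']
```

Something this simple runs in microseconds per sentence, which is why the later, heavier steps dominate the overall runtime.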