Tutorial: How to build a Tokenizer in Spark and Scala


In our earlier blog A Simple Application in Spark and Scala, we explained how to build Spark and make a simple application using it.

In this blog, we will see how to build a fast Tokenizer in Spark & Scala using sbt.

Tokenization is the process of breaking a stream of text up into words, phrases, symbols, or other meaningful elements called tokens. The list of tokens becomes input for further processing such as parsing or text mining. Tokenization can be a slow process, but with the help of Spark we can speed it up by processing the text in parallel chunks.

In the following example, we will see how to tokenize (segregate) the words in a text file and count the number of times each word occurs in the file (i.e., its term frequency).

Before you start building this application, follow the instructions for building an application in Spark given here.

After building the application, we can start building the Tokenizer.

To build the Tokenizer, create a file named TokenizerApp.scala in your application, like this:
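The original listing is not reproduced here, so below is a minimal sketch of what such a TokenizerApp.scala could look like. The input path, the application name, and the regex used for splitting words are assumptions for illustration; the essential parts are reading the file into 2 splits, tokenizing each line, and counting term frequency.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object TokenizerApp {
  def main(args: Array[String]): Unit = {
    // Path to the input text file (this path is an assumption for illustration)
    val logFile = "src/main/resources/sample.txt"

    // Run Spark locally on a single core; see below for using more cores
    val conf = new SparkConf().setAppName("Tokenizer Application").setMaster("local")
    val sc = new SparkContext(conf)

    // Read "logFile" and split it into 2 parts so Spark can process them in parallel
    val textFile = sc.textFile(logFile, 2)

    // Tokenize each line into words, dropping empty strings left by punctuation
    val tokens = textFile.flatMap(line => line.split("""\W+""")).filter(_.nonEmpty)

    // Count how many times each token occurs in the file (term frequency)
    val termFrequency = tokens.map(token => (token, 1)).reduceByKey(_ + _)

    // Print every token along with its count
    termFrequency.collect().foreach { case (token, count) => println(s"$token: $count") }

    sc.stop()
  }
}
```

With the spark-core dependency added to your build.sbt, you can run this with sbt run.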

As you can see, while processing the “textFile” to obtain tokens, we have split the “logFile” into 2 parts.

This makes processing the text file faster, as Spark can work on these 2 parts of the file in parallel to obtain the tokens. So, the greater the number of splits, the faster the execution.

However, if you want to run Spark on 2 or more cores, specify the number of cores in the “local[n]” field of the “SparkContext”, like this:
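A sketch of the change, using the same context creation as in the example above (only the master URL differs; the rest of the application stays the same):

```scala
// "local[2]" asks Spark to run the Tokenizer on 2 local cores instead of 1
val conf = new SparkConf().setAppName("Tokenizer Application").setMaster("local[2]")
val sc = new SparkContext(conf)
```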

This will run the Tokenizer on 2 cores. Note that the number of splits should be a multiple of “n”.
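For example, with the master set to “local[2]”, a split count of 2 or 4 keeps the work evenly distributed across both cores (the count of 4 here is purely illustrative):

```scala
// 4 splits on 2 cores: each core processes 2 partitions of the file
val textFile = sc.textFile(logFile, 4)
```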

To download a Demo Application, click here.

Written by Himanshu Gupta

Himanshu Gupta is a software architect with more than 9 years of experience. He is always keen to learn new technologies. He likes not only programming languages but Data Analytics too. He has sound knowledge of Machine Learning and Pattern Recognition. He believes the best results come when everyone works as a team. He likes coding, listening to music, watching movies, and reading science fiction books in his free time.
