MachineX: A tour of KSAI – Neural Networks

In this blog we will look at how to use KSAI, a machine learning library written purely in Scala that makes use of most of the language’s functional programming features. You can read more about the library at the KSAI Wiki, or alternatively fork the project from here. KSAI has a rich set of algorithms that address some of the vital problems in classification, regression, and clustering, and one of its attractive features is that it uses asynchronous programming internally wherever possible.

KSAI has many algorithms, and discussing all of them would be tedious and beyond the scope of this blog, so here we will focus on how easily KSAI can be used to write our own neural network wrapper for regression or classification.

Before moving ahead, a prerequisite is a basic understanding of how neural networks work; for that you may visit the link, which has several blogs explaining most of the concepts behind neural networks, from basic to advanced. Okay, now let’s move on to the library. Buckle up.

Setting up:

If you are using an sbt project, just add the dependency to build.sbt:

   libraryDependencies += "io.github.knolduslabs" %% "ksai" % "0.0.4"

Or

If you are using a Maven project, add the following to pom.xml:

<dependency>
  <groupId>io.github.knolduslabs</groupId>
  <artifactId>ksai_2.12</artifactId>
  <version>0.0.4</version>
</dependency>

Now, for neural networks, KSAI offers different combinations of error functions and activation functions, listed below (their standard definitions are recalled in the sketch after this list):

  • Error Functions
    • Cross Entropy
    • Least Mean Squares
  • Activation Function
    • Linear
    • Logistic Sigmoid
    • SoftMax
    • TANH
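
For reference, here are the standard textbook definitions of these functions, written as plain Scala. These are illustrative helpers only, not KSAI identifiers.

import scala.math.exp

// Activation functions (standard definitions, for reference only)
val linear:  Double => Double = x => x
val sigmoid: Double => Double = x => 1.0 / (1.0 + exp(-x))
val tanh:    Double => Double = x => math.tanh(x)

// SoftMax turns a vector of scores into probabilities that sum to 1
def softMax(xs: Array[Double]): Array[Double] = {
  val max  = xs.max                        // subtract the max for numerical stability
  val exps = xs.map(x => exp(x - max))
  val sum  = exps.sum
  exps.map(_ / sum)
}

// Error functions, computed over a single sample
def leastMeanSquares(target: Array[Double], output: Array[Double]): Double =
  target.zip(output).map { case (t, o) => 0.5 * (t - o) * (t - o) }.sum

def crossEntropy(target: Array[Double], output: Array[Double]): Double =
  -target.zip(output).map { case (t, o) => t * math.log(o) }.sum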

Using the neural networks provided by KSAI, one can train both types of networks:

  • Feed Forward
  • Back Propagate

KSAI stores the data in a DenseMatrix, a column-major matrix backed by arrays. The DenseMatrix used by KSAI is provided by the breeze library, a rich mathematical library that offers many mathematical functions. If you look into ksai.core.classification.NeuralNetworkTest in the test package you will get a clear insight into what the neural networks implemented in KSAI amount to.
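
For readers new to breeze, here is a minimal example of plain breeze usage (not KSAI code) showing a DenseMatrix and its column-major backing array.

import breeze.linalg.DenseMatrix

// A 2x3 matrix; breeze keeps the elements in a flat, column-major array
val m = DenseMatrix(
  (1.0, 2.0, 3.0),
  (4.0, 5.0, 6.0)
)

m(1, 2)        // element at row 1, column 2 -> 6.0
m.data.toList  // column-major backing array: List(1.0, 4.0, 2.0, 5.0, 3.0, 6.0)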

In the test cases we have used the sample data used by the smile library, both for training and for validating the trained network. We have also implemented our own parser to transform the data into the standard format that is generally used to feed artificial neural networks. Moving towards the core classes, one can see that the code is quite crisp and concise. In the main package, ksai.core.classification.NeuralNetwork, once the network is created with the appropriate layers along with the respective error function and activation function, you simply pass in the data on which the network needs to be trained, and the method learn will train the network and return the trained network to you.

So now let’s move on to the coding ground.

Let’s say we have this file containing the data in the standard format

neural_network_training_data

Since KSAI has parsers implemented and in place, we will use them to format the raw data for the class NeuralNetwork.scala. Once the file is successfully parsed, the data is formatted into the following structure:

case class Delimited[A](
                         labels: List[A] = Nil,
                         data: List[Array[Double]] = Nil,
                         target: List[A] = Nil)

Or

case class AttributeMeta(name: String = "", typ: String = "")


case class ARFF[A](
                    relation: String = "",
                    attributes: List[AttributeMeta] = Nil,
                    labels: List[A] = Nil,
                    isDataPresent: Boolean = false,
                    data: List[Array[Double]] = Nil,
                    target: List[A] = Nil)

In case you are using NeuralNetwork for regression, the input file would instead be one in ARFF format.
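
To make the Delimited structure concrete, here is a small, hypothetical parser sketch; it is an illustration only, not KSAI’s actual parser API. It assumes a whitespace-delimited file whose first column is the class label and whose remaining columns are numeric features.

import scala.io.Source

// Hypothetical sketch only -- NOT KSAI's parser. It fills the Delimited structure shown above.
def parseDelimited(path: String): Delimited[String] = {
  val source = Source.fromFile(path)
  try {
    val rows = source.getLines().filter(_.trim.nonEmpty).toList.map { line =>
      val cols = line.trim.split("\\s+")
      (cols.head, cols.tail.map(_.toDouble))   // (label, feature values)
    }
    Delimited(
      labels = rows.map(_._1).distinct,        // the distinct class labels
      data   = rows.map(_._2),                 // one feature array per sample
      target = rows.map(_._1)                  // the label of each sample
    )
  } finally source.close()
}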

Now that we have our data ready in the specified format, we are good to go to train our network, which we will create next.

case class Layer(
                  units: Int,
                  output: DenseVector[Double],
                  error: DenseVector[Double],
                  weight: DenseMatrix[Double],
                  delta: DenseMatrix[Double]
                )

case class Network(
                    dimension: Int, // Number of features
                    numOfClass: Int,
                    net: Seq[Layer],
                    numUnits: Seq[Int],
                    errorFunction: ErrorFunction,
                    activationFunction: ActivationFunction,
                    learningRate: Double = 0.1,
                    momentum: Double = 0.0,
                    weightDecay: Double = 0.0
                 )

Here we have our Network case class, so it’s quite easy to create a network with the appropriate fields and layers; KSAI has different apply methods defined for different scenarios.

Now we just need to call the method learn() with the features and labels which we got from the parsed data, and on the basis of the attributes set for the network it will start learning from the data:

def learn(features: DenseMatrix[Double], labels: Array[Int]): Network

After the learning it returns the trained network, as can be seen from the signature of the method learn. Once we have the trained network, we can simply use the method predict on it to see how effectively the network has been trained with respect to the test data. A more demonstrative example can be found in this repository; you can simply fork it and see how it works.
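
Putting the pieces together, here is a hedged end-to-end sketch. It reuses the hypothetical parseDelimited helper from the earlier sketch and the learn signature quoted above; the Network construction and the predict call are left as comments because the exact apply overloads and the error/activation function identifiers belong to KSAI and may differ from what is written here.

import breeze.linalg.DenseMatrix

val parsed = parseDelimited("neural_network_training_data")

// One row per sample, and labels encoded as 0-based class indices
val features: DenseMatrix[Double] = DenseMatrix(parsed.data: _*)
val labels: Array[Int] = parsed.target.map(t => parsed.labels.indexOf(t)).toArray

// val network = Network(...)                      // build via one of KSAI's apply methods
// val trained = network.learn(features, labels)   // returns the trained Network
// val outcome = trained.predict(testSample)       // evaluate the trained network on test data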

So that’s all for this blog. I hope I was able to give a brief idea of how to use the neural networks provided by KSAI. KSAI has some more algorithms implemented in it; you can read about them here. Thanks for reading!

Written by 

Shubham Verma is a software consultant. He likes to explore new technologies and trends in the IT world. Shubham is familiar with programming languages such as Java, Scala, C, C++, HTML, and JavaScript, and he is currently working on reactive technologies like Scala, Akka, Spark, and Kafka. His hobbies include playing computer games and watching Hollywood movies.
