The Story So Far…

This started off as a quick look at Linear Regression in spreadsheets and using the findings in Clojure code; that’s all in Part 1. Muggins here decided that wasn’t good enough and rigged up a Neural Network to keep the AI/ML kids happy; that’s all in Part 2.

Darcey, Len, Craig and Bruno haven’t contacted me with a cease and desist, so I’ll carry on where I left off… making this model better. In fact, they seem rather supportive of the whole thing.

[Image: the Strictly Come Dancing judges]

Weka Has Options.

When you create a classifier in Weka there are options available to tweak and refine the model. The Multilayer Perceptron put together in the previous post ran entirely on the defaults. As Weka can automatically build the neural network, I don’t have to worry about how many hidden layers to define; that’s handled for me.

I do, however, want to alter the number of iterations the model runs (epochs), and I want a little more control over the learning rate.

The clj-ml library handles the options as a map.

darceyneuralnetwork.core> (def opts {:learning-rate 0.4 :epochs 10000})
#'darceyneuralnetwork.core/opts
darceyneuralnetwork.core> (classifier/make-classifier-options :neural-network :multilayer-perceptron opts)
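
Under the hood those keys map onto Weka’s MultilayerPerceptron option flags: :learning-rate becomes -L and :epochs becomes -N. If you wanted to sidestep clj-ml’s option map entirely, a minimal sketch using raw Weka interop (assuming Weka is on the classpath) would look something like this:

(import 'weka.classifiers.functions.MultilayerPerceptron)

;; Configure the same model directly against the Weka API.
(def mlp
  (doto (MultilayerPerceptron.)
    (.setLearningRate 0.4)     ; clj-ml's :learning-rate, Weka's -L
    (.setTrainingTime 10000))) ; clj-ml's :epochs, Weka's -N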

The code on GitHub is modified to take those options into account.

(defn train-neural-net [training-data-filename class-index opts]
  (let [instances (load-instance-data training-data-filename)
        ;; make-classifier takes the raw option map and converts it to
        ;; Weka's option flags internally.
        neuralnet (classifier/make-classifier :neural-network :multilayer-perceptron opts)]
    (data/dataset-set-class instances class-index)
    (classifier/classifier-train neuralnet instances)))

(defn build-classifier [training-data-filename output-filename]
  (let [opts {:learning-rate 0.4
              :epochs 10000}
        nnet (train-neural-net training-data-filename 3 opts)]
    (utils/serialize-to-file nnet output-filename)))
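
For completeness, a quick usage sketch; the file names here are placeholders rather than anything from the repo. The serialized network can be pulled back in later with clj-ml’s deserialize-from-file:

;; Hypothetical file names, for illustration only.
(build-classifier "scores.csv" "darcey-nnet.model")

;; Reload the trained network when it's needed again.
(def nnet (utils/deserialize-from-file "darcey-nnet.model"))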

Concluding

There’s not much further I can take this as it stands. The data is robust enough that plain Linear Regression would give the kind of answers we were looking for. You could also argue that a basic decision tree could read Craig’s score and classify Darcey’s; a quick sketch of that follows.
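
A minimal sketch of that decision tree route, reusing the same loading helper and assuming the same class index as before:

;; Train a C4.5 decision tree on the same instance data.
(defn train-decision-tree [training-data-filename class-index]
  (let [instances (load-instance-data training-data-filename)
        tree (classifier/make-classifier :decision-tree :c45)]
    (data/dataset-set-class instances class-index)
    (classifier/classifier-train tree instances)))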

If the scoring were all over the place then something along the lines of an artificial neural network would be worth doing, and using Weka with Clojure makes the whole thing a lot easier. It’s straightforward in Java too, which I covered in my book Machine Learning: Hands-On for Developers and Technical Professionals.

[Image: Machine Learning: Hands-On for Developers and Technical Professionals book cover]

Rest assured this is not the last you’ll see of machine learning on this blog; there’s more to come.