Monday, November 24, 2014

[EZPZ] Artificial Neural Networks with R

ANNs (artificial neural networks) are widely useful for their accurate prediction and modelling capabilities. Fortunately, it is easy to jump-start on ANNs by focusing on the application side of things. I do recommend reading up on the statistical theory behind them, but if it's just for your hobby or interest, you can get by with just the basics, as the interpretation of results is easy to grasp.

In the following tutorial, I'll teach you how to set up and use ANN within an hour. It won't cost you a cent, either. ;)

This is the easiest way I found to start using ANN.

1) First, you need R. It's a cool open-source statistical package. My sister's statistician friends say it's slow as a turtle compared to MATLAB or other premium software, but hey, it does the job!

2) You need some data that you want to model. You can have a classification data set (if you want to, say, guess what kind of car it is based on data such as MPG, manufacture year, color, etc.), or a continuous data set (e.g., if you want to predict the stock prices for tomorrow).
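If you just want to follow along without a data set of your own, here's a minimal sketch of a continuous-valued data set in base R (all the names and values — x1, x2, y, toy — are made up for illustration):

```r
# Toy continuous data set: predict y from two numeric inputs.
set.seed(42)
x1  <- runif(100)
x2  <- runif(100)
y   <- 2 * x1 - x2 + rnorm(100, sd = 0.1)   # y depends on x1 and x2, plus noise
toy <- data.frame(x1, x2, y)
head(toy)
```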

3) Install these packages in R: 'nnet', 'caret' and 'RSNNS'. You can use the code below:
# Handy way of installing and loading all the packages you need
libs <- c('nnet', 'caret', 'RSNNS')
install.packages(libs)
lapply(libs, library, character.only = TRUE)
4) Import your data (make sure the data is delimited by tab ('\t'), space (' ') or comma (',') and specify the delimiter in the "sep" argument), and normalize it.
InputFile <- "[file_path]/[to]/[your_data].txt"
dataset   <- read.table(InputFile, sep = "\t", header = TRUE)
dataset   <- na.omit(dataset)
dataset   <- normalizeData(dataset, type = "0_1")
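For intuition, RSNNS's "0_1" normalization is plain min-max scaling. Here's a base-R sketch of the same idea (minmax is a throwaway helper, not an RSNNS function):

```r
# Min-max scaling to [0, 1] -- the same idea as RSNNS's "0_1" type.
minmax <- function(x) (x - min(x)) / (max(x) - min(x))
minmax(c(2, 4, 6, 10))  # -> 0.00 0.25 0.50 1.00
```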
5) Assign the input and output data, then divide it into train and test sets using the createDataPartition function from the caret package.
trainIndex <- createDataPartition(dataset[,1], p=.75, list=F)
traindata  <- dataset[trainIndex, ]
testdata   <- dataset[-trainIndex, ]

## "dataset[,1]" returns a column vector, which is what the function requires.
## If you try to feed it the whole matrix, it will complain.
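If you'd rather not depend on caret for this step, a rough base-R equivalent of a 75/25 split looks like this (createDataPartition additionally tries to balance the outcome distribution between the two sets, which this sketch does not):

```r
# Plain random 75/25 split in base R (no outcome balancing).
set.seed(1)
n        <- 100                       # pretend we have 100 rows
trainIdx <- sample(n, size = 0.75 * n)
testIdx  <- setdiff(seq_len(n), trainIdx)
length(trainIdx)  # -> 75
```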
6) Set your parameter grid.
tc <- trainControl("repeatedcv", number = 10, repeats = 3, classProbs = FALSE, savePredictions = TRUE)
param.grid <- expand.grid(.decay = c(0.01, 0.001, 1e-04, 1e-05, 0),
     .size = 1:15)

## Repeated 10-fold cross-validation, repeated 3 times.
## This guards against overfitting and reduces the impact of an
## unlucky random weight initialization (local minima) on model selection.
## In our case, the two ANN structural parameters being changed are 
## weight decay rate (decay) and number of hidden neurons (size).
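expand.grid itself is base R, so you can peek at what the tuning grid looks like; with 5 decay values and 15 sizes, the full grid above has 75 rows. A smaller example:

```r
# expand.grid enumerates every combination of its arguments.
small.grid <- expand.grid(.decay = c(0.1, 0.01), .size = 1:3)
nrow(small.grid)  # -> 6 (2 decay values x 3 sizes)
```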

7) You are now ready to train the network! :)
model.fit <- train(traindata[, InputIndices], traindata[, OutputIndices][, n],
  method = "nnet", maxit = 1e4,
  linout = FALSE, trace = FALSE,
  tuneGrid = param.grid,
  trControl = tc)

## Note that the 'train' function only fits one output at a time,
## so you will have to iterate over n in 1:NumOutputNeurons:
#### for (n in 1:NumOutputNeurons) {
####    assign(paste("model.fit", n, sep = "."), train(...)) }
## The 'linout' argument decides whether to apply a linear activation
## function to the output neuron.
## If FALSE, the sigmoid (logistic) function is applied instead.
8) With your trained network, you can make predictions by giving it some input data.
pred.test <- predict(model.fit$finalModel, newdata = testInput)

## The prediction is normalized. You need to denormalize it to make it useful!
9) Denormalize the prediction.
rescaled.pred <- denormalizeData(pred.test, getNormParameters(dataset))
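For intuition, denormalizing a "0_1"-scaled value is just the inverse of min-max scaling. Sketched in base R (denorm is a throwaway helper, not an RSNNS function):

```r
# Inverse of min-max scaling: map a value in [0, 1] back to [lo, hi].
denorm <- function(z, lo, hi) z * (hi - lo) + lo
denorm(0.5, lo = 10, hi = 30)  # -> 20
```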
10) Plot your results!
## A regression plot to compare actual and predicted values
plot(actual.test, rescaled.pred)
abline(0, 1)

Here is an example regression plot:
You can even plot the fitted train object itself!

This shows the training results. RMSE is the root mean squared error, and you can see it generally decreasing to a plateau as the size of the hidden layer increases.

And... that's all you need to know to start using ANN! :)
