Friday, November 28, 2014

Christmas Market in Hannover (Part 1)

For Germans, Oktoberfest and Christmas are two of the biggest celebrations of the year. A month before Christmas, markets open up in the big cities and offer citizens delish Xmas treats and warm wine. Despite the -4 degree Celsius weather, Alvin, Philip, Kinan and I decided to check out the market :3 Even with frozen toes, we were able to enjoy it.


This huge rotating nut-crackerish fan(?) greeted us as we got out of the downtown tram station

 From left: Philip, Alvin, and Kinan (aka Teenager)

 oops


We ate so much!

Three things you MUST try at the market are: Glühwein (mulled wine), Kartoffelpuffer (fried potato pancakes) with apple sauce, and Mandeln (roasted almonds with a sugar-and-spice coating)


Thursday, November 27, 2014

[Jam Sesh] with Alvin and Annalena - Hit The Road Jack

Alvin is an avid jazz guitarist, and although he has had to make do with an acoustic guitar here in Germany, his style and skill level are ace. He met Annalena, an orchestral sax player, through a musician-search website, and he invited me to their jam session three weeks ago after he learned that the room they were renting had a piano in it.

The first jam sesh was definitely a musical ice-breaker. We all had different musical styles: Annalena was more used to playing in an orchestra, Alvin was already at a semi-professional level with his band back in Vancouver, and I was just some guy who knew a few songs on the piano.

So this was our third week jamming together, and I admit, we sound pretty good!

The easiest song to play was Hit the Road Jack, as it essentially repeats the same four descending chords throughout the entire song. That leaves vast room for improvisation for all the instruments involved :P

Here's a video! (Annalena and Alvin were camera shy :))
*Alvin actually broke a string on the guitar during this song :)

The sax solo starts at 1:22, guitar solo at 3:11, and piano solo at 4:14

I feel like I have a LOT of room to improve in my improv :$

Enjoy! :3

Monday, November 24, 2014

[Ramble] Perception

This is something I wrote down on my phone a long time ago:

We can only perceive up to the fourth dimension. 
(Space and time)
Neurological reason? 
(We indeed have found 'spatial' maps and time-keepers in our brains)
We are also limited in how quickly we perceive things, 
(10 to 12 different images per second)
just as how we have a limited field of vision.
Perhaps we merely have the bare necessities for our survival. 
(Is natural selection a limiting condition?)
What if we push the boundaries
of what is 'natural,' 
and let ourselves adapt?


[EZPZ] Artificial Neural Networks with R


ANNs are vastly useful for their accurate prediction and modelling capabilities. Fortunately, it is extremely easy to get a jump-start on ANNs by focusing on the application side of things. I do recommend reading up on the statistics behind ANNs, but if it's just for a hobby or out of interest, you can get by with the basics, as the interpretation of the results is easy to grasp.

In the following tutorial, I'll teach you how to set up and use ANN within an hour. It won't cost you a cent, either. ;)

This is the easiest way I've found to start using ANNs.

1) First, you need R, a cool open-source statistical programming language (free from CRAN: https://cran.r-project.org). My sister's statistician friends say it's slow as a turtle compared to MATLAB or other premium software, but hey, it does the job!

2) You need some data that you want to model. You can have a classification data set (if you want to, say, guess what kind of car something is based on data such as MPG, manufacture year, color, etc.), or a continuous data set (e.g. if you want to predict tomorrow's stock prices).
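If you don't have a data set handy, here's a hypothetical one you can generate to follow along with; the file name 'toy_data.txt' and the columns are made up purely for illustration.
# Toy continuous data set: two inputs, one noisy output
set.seed(42)
x1 <- runif(200)
x2 <- runif(200)
y  <- sin(2 * pi * x1) + 0.5 * x2 + rnorm(200, sd = 0.1)
write.table(data.frame(x1, x2, y), "toy_data.txt", sep = "\t", row.names = FALSE)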

3) Install these packages in R: 'nnet', 'caret' and 'RSNNS'. You can use the code below:
# Handy way of installing and loading all the packages you need
libs <- c('nnet', 'caret', 'RSNNS')
install.packages(libs)
lapply(libs, library, character.only = TRUE)
4) Import your data (make sure the data is delimited by tab ('\t'), space (' ') or comma (',') and specify the delimiter in the "sep" argument), drop any missing values, and normalize it.
InputFile  <- "[file_path]/[to]/[your_data].txt"
dataset    <- read.table(InputFile, sep = "\t", header = TRUE)
dataset    <- na.omit(dataset)                       # drop rows with missing values
dataset    <- normalizeData(dataset, type = "0_1")   # scale every column to [0, 1]
5) Assign the input and output columns, and divide the data into training and test sets using the createDataPartition function from the caret package.
## Column indices of your inputs and outputs (with the toy data above: 1-2 in, 3 out)
InputIndices  <- 1:2
OutputIndices <- 3
trainIndex <- createDataPartition(dataset[,1], p = .75, list = FALSE)
traindata  <- dataset[trainIndex, ]
testdata   <- dataset[-trainIndex, ]

## "dataset[,1]" returns a column vector, which is what the function requires.
## If you try to feed it the whole matrix, it will complain.
6) Set up your cross-validation scheme and your parameter grid.
tc <- trainControl("repeatedcv", number = 10, repeats = 3, classProbs = FALSE, savePred = TRUE)
param.grid <- expand.grid(.decay = c(0.01, 0.001, 0.0001, 0.00001, 0),
                          .size  = 1:15)

## 10-fold cross-validation, repeated 3 times.
## Each repeat retrains the network from fresh random weights, which
## reduces the chance of a run stuck in a bad local minimum skewing the
## results, and also guards against overfitting to one particular split.
## In our case, the two ANN structural parameters being tuned are the
## weight decay rate (decay) and the number of hidden neurons (size).

7) You are now ready to train the network :)
n <- 1  # index of the output column to fit (see the multi-output note below)
train.net <- train(traindata[, InputIndices], traindata[, OutputIndices[n]],
  tuneGrid = param.grid,
  maxit = 1e4,
  method = "nnet", linout = FALSE, trace = FALSE,
  trControl = tc
)

## Note that the 'train' function only fits one output at a time,
## so with multiple outputs you have to iterate over n; see the sketch
## right below.
## The 'linout' argument decides whether to apply a linear activation
## function to the output neuron.
## If FALSE, the sigmoid (logistic) function is applied.
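If you have more than one output neuron, here's a minimal sketch of that loop, reusing the variable names from the steps above. (I collect the fitted networks in a list, which is easier to work with than assign().)
## One network per output column, collected in a list
nets <- list()
for (n in seq_along(OutputIndices)) {
  nets[[n]] <- train(traindata[, InputIndices], traindata[, OutputIndices[n]],
                     tuneGrid = param.grid, maxit = 1e4,
                     method = "nnet", linout = FALSE, trace = FALSE,
                     trControl = tc)
}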
8) With your trained network, you can make predictions by feeding it some input data.
pred.test <- predict(train.net$finalModel, newdata = testdata[, InputIndices])

## The prediction is still normalized. You need to denormalize it to make it useful!
9) Denormalize the prediction.
rescaled.pred <- denormalizeData(pred.test, getNormParameters(dataset))
10) Plot your results!
## For the prediction only
plot(rescaled.pred)

## For a regression plot comparing actual and predicted values;
## actual.test holds the denormalized test-set outputs
actual.test <- denormalizeData(testdata[, OutputIndices, drop = FALSE],
                               getNormParameters(dataset))
plot(actual.test, rescaled.pred)
abline(0, 1)

Here is an example regression plot:
You can even plot the train.net itself!
plot(train.net)

This shows the training results. RMSE is the root mean squared error, and you can see it generally decreasing to a plateau as the size of the hidden layer increases.

And... that's all you need to know to start using ANN! :)


Tuesday, November 18, 2014

Weekend Trip to Prague




One of my favorite things to do is going on weekend trips. It could be somewhere close by, or in a totally different country. Two weeks ago, I was fortunate enough to find car-share rides (using BlaBlaCar) to and from Prague, so my friend Philip and I visited the city for two nights.

Night time in Prague :)
We left on Friday morning and arrived at around 6 pm. The traffic jam was truly horrific; it took us about 30 minutes to go about 1.5 km. I guess we could have just gotten out and walked. S:

During the first night in Prague, we saw the city in its most beautiful form. The nightscape in Prague is amazing, and since most tourists were gone, the eerie emptiness of the path on the way to the castle was quite memorable.
Brick roads are my favorite
There are quite a few things to see in the city, but the castle and Karluv Most (Charles Bridge) are the two main attractions. There are many street vendors on the bridge, and street music is everywhere. The Vltava river divides the city in two, and it's mostly the castle side that has the more 'touristy' attractions.

This magnificent overview of the city is from near the Temple
Czech food is amazing, too! I had about two dishes of Goulash in a single day :D
Don't forget to Czech out their beer, too!
I love panorama <3
I rate Prague 9/10 for its cheap and delicious food, and amazing scenery. The score of 10 is reserved for Venice (<3) for now. :)

Sunday, November 16, 2014

Introduction to Artificial Neural Networks (ANN) for the layperson

Artificial Neural Networks (ANNs) not only have a cool name, but are also applied in a huge variety of fields, from market research to modelling weld properties during materials processing (nudge nudge, that's what I do).

In essence, ANNs are used for multivariate regression (MVR) problems. This means you have two or more input parameters (independent variables) that may or may not affect one or more output variables (response variables), and you want to figure out the functional relationship between the inputs and the outputs. How does an ANN solve this problem? Well, it creates three distinct types of layers, each containing a certain number of 'nodes' (or neurons).


The figure above depicts the three types of layers. I say 'types of layers' because there can only be one input and one output layer, but multiple hidden layers are allowed. Adding hidden layers can improve the model, though typically one or two are enough; the trade-off is an exponential increase in computational time as the number of hidden layers grows.
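To make that concrete, here's a tiny sketch using the 'nnet' package (which builds single-hidden-layer networks) on R's built-in iris data: four input neurons, one hidden layer of five neurons, and three output neurons, one per species. The size of the hidden layer here is just for illustration.
library(nnet)
## 4 inputs (the flower measurements), 5 hidden neurons, 3 outputs (species)
net <- nnet(Species ~ ., data = iris, size = 5, trace = FALSE)
## Compare predicted vs actual species
table(predict(net, iris, type = "class"), iris$Species)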

Multivariate regression problems have been around for quite a while, and before the wonderful inventions we call computers, statisticians used to do everything by hand. Yikes! But that also means alternatives to ANN exist. One really old but still-used method is Response Surface Methodology (RSM), which uses second-degree polynomials to approximate the functional relationship.
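To give a flavour of RSM, here's a sketch of such a second-degree polynomial fit in base R, on made-up data:
## Fake data: two inputs, one response
df <- data.frame(x1 = runif(50), x2 = runif(50))
df$y <- 1 + 2 * df$x1 - df$x2 + 0.5 * df$x1 * df$x2 + rnorm(50, sd = 0.1)
## Second-order response surface: linear, squared, and interaction terms
rsm.fit <- lm(y ~ x1 + x2 + I(x1^2) + I(x2^2) + x1:x2, data = df)
summary(rsm.fit)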

Machine learning (ML) has opened the door to a wide variety of solutions to MVR problems. In tl;dr form, ML uses data sets from experiments or simulations, and can perform either 'supervised' or 'unsupervised' learning.
With supervised learning, the data are labelled, which lets you calculate prediction errors. With unsupervised learning, the data have no labels, so the model can only draw inferences about their structure.
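A quick sketch of the difference, using R's built-in iris data (lm and kmeans stand in here purely for illustration; they aren't ANNs):
## Supervised: outputs are labelled, so prediction error is measurable
fit <- lm(Sepal.Length ~ Sepal.Width + Petal.Length, data = iris)
mean(residuals(fit)^2)            # mean squared training error
## Unsupervised: no labels; k-means just infers groupings from the inputs
cl <- kmeans(iris[, 1:4], centers = 3)
table(cl$cluster, iris$Species)   # compare clusters to species after the fact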

ANNs can be used for both supervised and unsupervised learning.
An alternative to ANN is the Support Vector Machine (SVM), which only supports supervised learning. (I don't know enough about SVMs yet to write more about them, sorry!)

More on the two types of learning for ANN will be covered in my next post, so keep an eye out! :)