Thursday, December 18, 2014

Real Time Controls with ATMEGA1284 using Interrupts

A few months ago, I had to create a real-time signal generator using a microcontroller unit (MCU) to control the trajectory of a multi-kilowatt laser. Since the process is large-scale laser welding, the system had to be highly reliable.
I was given an ATMEGA1284P with an external 16MHz quartz. (To see how to set up the external clock, click here)
The signal being generated was between 5 and 30 Hz (quite slow), so I first tried using time-based sleep() instructions. This resulted in a terribly incorrect frequency.

The solution was to use interrupts. Interrupts are used everywhere: in software, they can be used to handle errors that pop up during program execution; in electronics, a button press can generate an interrupt, leading to a function execution.

In our case, the interrupt is generated from the MCU itself. 
In short, the quartz (a crystal oscillator) 'ticks' (hopefully) at a constant frequency. The MCU is able to count these ticks, and after a certain number of them (which you assign), it can call a function known as an Interrupt Service Routine (ISR).

The benefit of this method is that as long as the quartz is reliable, the MCU will run the ISR at precisely regular intervals.
The biggest drawback -- and something to watch out for -- is that the ISR must be very concise (a loop of any kind is a big no-no).

Example Problem

Say we want to generate a sine signal to an analog-out pin at a certain frequency.

We can use interrupts to output a certain value of the sine wave at a given time. Unfortunately, depending on the MCU (and true for ATMEGA1284P), calculating sin() within the ISR takes a long time -- long enough to cause delays in the signal. Since sine is a periodic function, it is much better to pre-compute the values with a given temporal resolution and call them from an array during the ISR. 


First, we need to declare a few variables.

#define RESOLUTION 250

volatile unsigned int counterMaxValue;
const unsigned long clkspeed = 16000000; // must be a long: 16e6 overflows the AVR's 16-bit int
const int prescaler = 64;
volatile int frequency = 5; // Default freq. is 5Hz

// For the pre-computed sine table.
volatile int table_index = 0;
double sin_table[RESOLUTION];
float increment;

const int outportB = 2; // PortB
const int outportC = 1; // PortC

counterMaxValue is the number of quartz 'ticks' at which the ISR should be called.
clkspeed is the frequency of our quartz, in Hz.
prescaler is the factor by which the tick count is divided before it reaches the timer; it lets us cover a wider range of frequencies. Read more about prescalers here.

The Interrupt

Now, we must configure the hardware. We are using Timer 1 of the 1284P, in the Clear Timer on Compare (CTC) mode. When I was first learning about CTC, I used this resource.
void initInterrupt() {
  cli(); // stop interrupts

  TCCR1A = 0; // set entire TCCR1A register to 0
  TCCR1B = 0; // same for TCCR1B

  /* CTC mode activation:
     The counter value (TCNT1) is incremented from 0 to the value specified in OCR1A.
     Once it hits that value, the interrupt is generated, and the interrupt
     service routine (ISR) is called.
     The setting below for TCCR1B enables this: WGM12 selects CTC mode, and
     CS11 together with CS10 selects the /64 prescaler. See the datasheet for details. */
  TCCR1B |= (1 << WGM12)|(1 << CS11)|(1 << CS10);

  // Set the compare match register so the ISR fires frequency*RESOLUTION times
  // per second (one sine-table entry per interrupt).
  counterMaxValue = (unsigned int) (clkspeed/((unsigned long)frequency*RESOLUTION*prescaler) - 1);
  OCR1A = counterMaxValue; // eg: (16*10^6)/(2000*64)-1 = 124 for a 2kHz interrupt
                           // (must be < 65536, since Timer 1 is a 16-bit counter)

  // initialize counter value to 0
  TCNT1  = 0;
  // enable timer compare interrupt
  TIMSK1 |= (1 << OCIE1A);

  // enable interrupts
  sei();
}
After this function is run, the interrupts will begin. But first, we must set up a pre-computed sine table; and above all, we must define the ISR.

The setup()

void setup() {
  // Pre-compute the sine table over one full period (2*pi).
  float x = 0;
  increment = 6.283185/(RESOLUTION-1);
  for (int index = 0; index < RESOLUTION; index++) {
    sin_table[index] = sin(x);
    x = x + increment;
  }

  initInterrupt();
}


I was using Wiring to program the MCU, and I got an error whenever I tried to use any of the timers. The problem was solved by commenting out the timer definition set by Wiring.
// Had to comment out lines 77-86 of ...\wiring-0100\wiring-0100\cores\AVR8Bit\WHardwareTimer.cpp
// in order to 're-define' TIMER1_COMPA_vect
ISR(TIMER1_COMPA_vect) {
  // Traverse through the table. This is much faster than computing sine every time! :)
  // Equivalent to:
  // y = sin(x)*amplitude+1638+offset; // Based at 5V, up to 4V peak to peak; offset +/- 1V.
  //                                   // 2047 - 409 = 1638; 5V - 1V to account for offset
  // x = x + increment;
  if (table_index > RESOLUTION-1) // loop back to the beginning
    table_index = 0;
  y = sin_table[table_index]*amplitude + 1638 + offset; // amplitude, offset: globals set elsewhere
  table_index++; // advance to the next sample
  output = word(y);
  portWrite(outportB, int(lowByte(output)));
  portWrite(outportC, int(highByte(output)));
}

Just a quick note on outputting the signal: I am using a 12-bit digital-to-analog converter (DAC). Two ports on the MCU are used, each outputting half of the word.
The DAC I used allows two output modes: unipolar (0V to 10V) or bipolar (-5V to 5V). I used the latter, which means that when I send (1111 1111 1111) in binary to the DAC, the output value will be (+Vref)*(2047/2048).
In short, an output value of 409 from the MCU corresponds to about 1V at the DAC output.

Now everything is ready! An empty loop() suffices to run this on the MCU.

The loop()

void loop(){}

After uploading the program to your 1284P, it should start sending the sine signal in digital form to the DAC, which then converts it to an analog signal.


Establishing accurate real-time control on a microcontroller requires timer-based interrupts. There are many benefits to this method, including parallelization of tasks on the MCU.

In other words, the MCU can do other things, such as polling for user input in the main loop(), while the ISR takes care of the timing-critical tasks. This would not be possible using the sleep() function, which simply waits for a certain amount of time before moving on to the next instruction.

Lastly, the most important thing to consider when using timer-based interrupts is to keep the ISR as simple and short as possible!

Monday, December 15, 2014

[Solidworks] Tube Routing without Solidworks Premium

Without a doubt, the SolidWorks Routing package in the Premium version of SW can be quite useful when creating electrical connections, pipes, or tubes. Unfortunately, a lot of us have to make do without such handy tools. In this tutorial, I will cover how to create flexible tubing with just a few loft tools. The example model I am using is a minimal working example (MWE), so please excuse the crude design.


Goal: route a flexible tube between two connections, such that the route updates as the positions of the end connections change.
A flexible tube must be connected between the two highlighted profiles.


1. First, create a cross-sectional sketch of the tube. Save the part.

Cross-sectional sketch of the tube
2. Create a new assembly, and import the assembly with the original parts. This will enable the tube route to be automatically updated. (Do not import by parts!)
3. Import the sketch, and mate it to one end of the connection. Make sure it is fully defined.
Mate the sketch to one end of the tube connection.

4. Go into “Edit Component” mode for the tube, and click on any of the planes.

5. CTRL+DRAG the plane to create a new plane.

Define the plane to be coincident to the other end of the tube connection.

6. Make a sketch on this new plane of the tube cross section. Exit sketch.

7. While still in Edit Component mode, go to Features -> Lofted Boss/Base.
8. First we have to create the outer wall. So, designate the outer closed loop as the end geometries.
It will create the shortest-path connection, which ends up looking quite strange!

9. This can be easily fixed by clicking on Start/End Constraints and setting them to "Normal To Profile." The orthogonal distance to the profile can be edited here as well; remember these values, because we will need them soon.

10. We have to now cut out the inside of the tube. To do this, first make the sketches in loft visible. (Click on the + sign beside Loft, right click on the sketches and click “Show”)

11. Go to Features -> Lofted Cut. Now click on the inner closed loops.
Set the same Start/End Constraints as before (do you remember those numbers? :O)

Now your tube is done!

You can open the original assembly without the tube, change the positions, and the assembly with the tube should update on its own.


Saturday, December 13, 2014

Budapest for 1.5 days

If I could describe Budapest in two words, they would be 'grey' and 'mysterious.' The weather wasn't so great during our 1.5 days: it was rainy, cloudy and foggy -- definitely not the best time to take photographs. Still, they turned out alright after some post-processing. :)

My travel buddy this time was Alvin, and despite our super uncomfortable wet shoes, we tried to make the best of the trip. Alvin's friend, Albert from Canada met up with us and showed us around despite his busy study schedule for med school finals.

The Chain Bridge
The city is divided into two parts by the river Danube; Buda on the west bank and Pest on the east.
Hungarian Parliament Building

Walking around the city, I noticed a lot of Roman-esque buildings. It turns out, Romans declared this city the capital of Pannonia Inferior.

At the centre of Chain Bridge, overlooking the Statue of Liberty on the left and Pest on the right.

Friday, December 12, 2014

[Jam Sesh] Girl from Ipanema

This week, Alvin and I covered a classic bossa nova song called Girl from Ipanema.
For some reason, the piano went completely out of tune on certain notes since last week :(
To spice up the video, I added random pictures/videos from my trips :)

Wednesday, December 10, 2014

Homeless in Dortmund

Alvin and I bought our plane tickets to Budapest, leaving from Dortmund airport, two weeks before the trip. Unfortunately, the only available flights were early in the morning, which meant we had to be in Dortmund the night before.

Well, it turns out Dortmund airport closes between midnight and 3am, and the city has exactly one hostel, and it was all booked out. Hotels were out of our price range and couchsurfing failed us, so we decided to YOLO it. We brainstormed a few ideas:

  1. Bring a sleeping bag, set up a tent near the airport, and camp out for the night. It turns out wild-camping is illegal in Germany. 
  2. Find a 24/7 McDonalds, sleep there. There was no 24/7 McD's in Dortmund.
  3. Sleep on the street. No problem here.
Clearly, the only option we had was sleeping on the street. 
So we arrived in Dortmund around 8pm, had dinner, and set out to see the city for a bit. We found a Christmas market, and Germany's biggest Christmas tree. Impressive!! Except it was about -5 degrees outside.

As we walked around the city, we looked for bushes near big buildings. We found a super nice hidden place, but we had to sleep on concrete floor. It was away from the city centre, and not a lot of people walked by (even if they did, they couldn't see anything). I didn't take a picture of the place, since it was night time :(

After about an hour of trying to fall asleep in freezing conditions, I gave up. Alvin and I decided to just hang out at the Hauptbahnhof (main train station), which is supposedly open 24/7.
Then came our savior, KFC. This KFC by the HBf was the warmest, and the tastiest-smelling place we were at all night. So, along with 10 other travellers cozied up inside, we took a short nap.
At 4am, we headed to DTM airport.

Off to Budapest! <<-- will be linked to a new post soon.

After Budapest, we came back to Dortmund airport, and stayed at the most beautiful McD's I have ever witnessed until our car-share ride home arrived.

The only good thing about Dortmund was this McDonalds. (Partly because I planned my trip terribly) Dortmund gets 5 McBreakfast out of 10. 

Wednesday, December 3, 2014

[R] Caret training error


When I was training a neural network with R using 'nnet' method, I got the following error:
Error in model.frame.default(Terms, newdata, na.action = na.omit, xlev = object$xlevels) : 
  invalid type (list) for variable 'x'


It turns out the predict.train function in the {caret} package uses the model.frame function in the {stats} package to extract the formula from the trained network.

This error appears when you try to train a single-input network (a 1-n-k network).


Unfortunately, I haven't found a way to train single-input networks using caret with nnet. You CAN, however, use nnet by itself to train your network. If you want to optimize the network, you have to iterate over the desired hyperparameters yourself (such as the weight decay and the size of the hidden layer).

I will post an update if it is fixed in the future.

Friday, November 28, 2014

Christmas Market in Hannover (Part 1)

For Germans, Oktoberfest and Christmas are two of the biggest celebrations of the year. Even a month before Christmas, markets open up in the big cities and offer citizens delish xmas treats and warm wine. Despite the -4 degree Celsius weather, Alvin, Philip, Kinan and I decided to check out the market :3 Even with frozen toes, we were able to enjoy it.

This huge rotating nut-crackerish fan(?) greeted us as we got out of the downtown tram station

 From left: Philip, Alvin, and Kinan (aka Teenager)


We ate so much!

Three things you MUST try at the market are: Glühwein (mulled wine), Kartoffelpuffer (fried potato pancakes) with apple sauce, and Mandeln (roasted almonds with a sugar-and-spice coating).

Thursday, November 27, 2014

[Jam Sesh] with Alvin and Annalena - Hit The Road Jack

Alvin is an avid jazz guitar player, and although he has had to make do with an acoustic guitar here in Germany, his style and skill level are ace. He met Annalena, an orchestra sax player, through a musician-search website, and he invited me to their jam session three weeks ago after he learned that the room they are renting has a piano in it.

The first jam sesh was definitely a musical ice-breaker. We all had different musical styles, as Annalena was more used to playing in an orchestra, Alvin was already at a semi-professional level with his band back in Vancouver, and I was just some guy that knows a few songs on the piano.

So this was our third week jamming together, and I admit, we sound pretty good!

The easiest song to play was Hit the Road Jack, as it essentially repeats 4 chords throughout the entire song. The room for improvisation is vast, for all the instruments involved :P

Here's a video! (Annalena and Alvin were camera shy :))
*Alvin actually broke a string on the guitar during this song :)

The sax solo starts at 1:22, guitar solo at 3:11, and piano solo at 4:14

I feel like I have a LOT to improve on my improv :$

Enjoy! :3

Monday, November 24, 2014

[Ramble] Perception

This is something I wrote down on my phone long time ago:

We can only perceive up to the fourth dimension. 
(Space and time)
Neurological reason? 
(We indeed have found 'spatial' maps and time-keepers in our brains)
We are also limited in how quickly we perceive things, 
(10 to 12 different images per second)
just as how we have a limited field of vision.
Perhaps we merely have the bare necessities for our survival. 
(Is natural selection a limiting condition?)
What if we push the boundaries
of what is 'natural,' 
and let ourselves adapt?

[EZPZ] Artificial Neural Networks with R

ANNs are vastly useful for their accurate prediction and modelling capabilities. Fortunately, it is extremely easy to jump-start on ANNs by focusing on the application side of things. I do recommend reading up on the statistical background, but if it's just for a hobby or out of interest, you can get by with the basics, as the interpretation of the results is easy to grasp.

In the following tutorial, I'll teach you how to set up and use ANN within an hour. It won't cost you a cent, either. ;)

This is the easiest way I found to start using ANN.

1) First, you need R. It's a cool open-source statistical software. My sister's statistician friends say it's slow as a turtle compared to MATLAB or any other premium software, but hey, it does the job!

2) You need some data that you want to model. You can have a classification-based data set (if you want to, say, guess what kind of car it is based on data such as MPG, manufactured year, color, etc), or a continuous data set (eg: if you want to predict the stock prices for tomorrow).

3) Install these packages in R: 'nnet', 'caret' and 'RSNNS'. You can use the code below:
#Handy way of loading all packages that you need
libs <- c('nnet', 'caret', 'RSNNS')
#install.packages(libs) # run this once first if the packages are not yet installed
lapply(libs, library, character.only = T)
4) Import your data (make sure the data is delimited by tab ('\t'), space (' ') or comma (',') and specify the delimiter in the "sep" argument), and normalize it.
InputFile  = "[file_path]/[to]/[your_data].txt"
dataset    =  read.table(InputFile, sep="\t", header = TRUE)
dataset    <- na.omit(dataset)
dataset    <- normalizeData(dataset, type = "0_1")
5) Assign the input and output data. Divide the data into training and test sets using the createDataPartition function in the caret package.
trainIndex <- createDataPartition(dataset[,1], p=.75, list=F)
traindata  <- dataset[trainIndex, ]
testdata   <- dataset[-trainIndex, ]

## "dataset[,1]" returns a column vector, which is what the function requires.
## If you try to feed it the whole matrix, it will complain.
6) Set your parameter grid.
tc = trainControl("repeatedcv", number=10, repeats=3, classProbs=FALSE, savePred=T)
param.grid <- expand.grid(.decay = c( 0.01, 0.001, 0.0001, 0, 0.00001), 
     .size = c(1:15))

## Repeated 10-fold cross-validation, repeated 3 times. 
## This reduces the chance of getting stuck at local minima during 
## the error-reduction step as well as overfitting.
## In our case, the two ANN structural parameters being changed are 
## weight decay rate (decay) and number of hidden neurons (size).

7) You are now ready to train the network. :)
model <- train(traindata[,InputIndices], traindata[,OutputIndices][,n],
  maxit = 1e4,
  method = "nnet", linout=F, trace=F,
  na.rm = TRUE,
  trControl = tc,
  tuneGrid = param.grid)

## Note that the 'train' function only allows 1 output at a time.
## So, you will have to iterate this over n in 1:NumOutputNeurons:
#### for (n in 1:NumOutputNeurons) {
####    assign(paste("model", n, sep=""), train(...)) }
## The 'linout' argument decides whether to apply a linear activation
## function on the output neuron.
## If FALSE, the sigmoid (logistic) function is applied.
8) With your trained network, you can make predictions by giving it some input data.
pred.test <- predict(model$finalModel, newdata=testInput)

## The prediction is normalized. You need to denormalize it to make it useful!
9) Denormalize the prediction.
rescaled.pred  <- denormalizeData(pred.test, getNormParameters(dataset))
10) Plot your results!
## For a regression plot comparing actual and predicted values
plot(actual.test, rescaled.pred)
abline(0, 1)

Here is an example regression plot:
You can even plot the model itself!

This shows the training results. RMSE is the root mean squared error, and you can see it generally decrease to a plateau as the size of the hidden layer increases.

And... that's all you need to know to start using ANN! :)

Tuesday, November 18, 2014

Weekend Trip to Prague

One of my favorite things to do is going on weekend trips. It could be somewhere close by, or in a totally different country. Two weeks ago, I was fortunate enough to find car-share rides (using blablacar) to and from Prague, so my friend Philip and I visited the city for two nights.

Night time in Prague :)
We left on Friday morning, and arrived at around 6pm. The traffic jam was truly horrific; it took us about 30 minutes to go about 1.5km. I guess we could have simply walked out of the car, too. S:

During the first night in Prague, we saw the city in its most beautiful form. The nightscape in Prague is amazing, and since most tourists are gone, the eerie emptiness of the path on the way to the castle was quite memorable.
Brick roads are my favorite
There are quite a few things to see in the city, but the castle and Karluv Most (Charles Bridge) are the two main attractions. There are many street vendors on the bridge, and street music is prevalent. The Vltava river divides the city in two, and it's mostly the castle side that has the more 'touristy' attractions.

This magnificent overview of the city is from near the Temple
Czech food is amazing, too! I had about two dishes of Goulash in a single day :D
Don't forget to Czech out their beer, too!
I love panorama <3
I rate Prague 9/10 for its cheap and delicious food, and amazing scenery. The score of 10 is reserved for Venice (<3) for now. :)

Sunday, November 16, 2014

Introduction to Artificial Neural Networks (ANN) for the laypersons

Artificial Neural Networks (ANN) not only have a cool name, but are also applied in a huge variety of fields, from market research to modelling weld properties during materials processing (nudge nudge, that's what I do).

In essence, ANNs are used for multivariate regression (MVR) problems. This means you have two or more input parameters (independent variables) that may or may not affect one or more output variables (response variables), and you want to figure out the functional relationship between the inputs and the outputs. How does an ANN solve this problem? It creates three distinct sets of layers, each containing a certain number of 'nodes' (or neurons).

The figure above depicts the three types of layers. I say 'types of layers' because there can only be one input and one output layer, but multiple hidden layers are allowed. Having more hidden layers can improve the model, but typically one or two are enough; the trade-off is a steep increase in computational time as hidden layers are added.

Multivariate regression (MVR) problems have been around for quite a while, and before the wonderful inventions we call computers, statisticians used to do everything by hand. Yikes! But that also means alternatives to ANNs exist. One really old but still-used method is Response Surface Methodology (RSM), which uses second-degree polynomials to approximate the functional relationship.
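For reference, the second-degree polynomial that RSM fits has this standard textbook form (notation mine, not from this post): y is the response, the x_i are the k input parameters, and the betas are the fitted coefficients:

```latex
y = \beta_0
  + \sum_{i=1}^{k} \beta_i x_i
  + \sum_{i=1}^{k} \beta_{ii} x_i^2
  + \sum_{i<j} \beta_{ij} x_i x_j
  + \varepsilon
```

The cross terms \(\beta_{ij} x_i x_j\) are what let RSM capture simple interactions between inputs, which a plain linear fit cannot.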

Machine learning (ML) has opened the door to a wide variety of solutions to MVR problems. In tl;dr form, ML uses datasets from experiments or simulations and performs either 'supervised' or 'unsupervised' learning.
With supervised learning, the data are labelled, so prediction errors can be calculated. With unsupervised learning, there are no labels, so the algorithm can only draw inferences from the structure of the data.

ANN can be used for both supervised and unsupervised learning.
An alternative to ANNs is the Support Vector Machine (SVM), which only supports supervised learning. (I don't know enough about SVMs yet to write more about them, sorry.)

More on the two types of learning for ANN will be covered in my next post, so keep an eye out! :)