Stop Motion Animation

This week, I created my first stop-motion animation. I’m really proud of the end product, despite the fact that it only runs for 8 seconds and took me over three hours to make. The real reward was the (salutary) learning curve, which was fun and worthwhile.

The length of time it took can be attributed to my cocky incompetence. In my eagerness to complete the task, I completely disregarded all the rules of planning, storyboarding and good practice. Here’s the animation – then let me explain.

I’d been asked to deliver someone else’s PowerPoint presentation entitled ‘New Technology in the Classroom’ to groups of school teachers in Salford: five repeated one-hour sessions. One of those technologies was ‘animation’. As I’ve demonstrated examples of stop-motion in workshops before and I do get the theory behind it, I didn’t feel there would be a problem doing the same thing again. However, because the presentation was very much based on what my employer for that day (RM) was offering to this particular school, I felt the need to take a little ownership of the slideshow and decided that I would also further my own CPD at the same time: I would make my own animation to show the teachers. So far so good.

I quickly decided that the subject matter wasn’t important and that it would be the process that counted. I fetched my Panasonic Lumix TZ10 (camera) from my backpack, selected a few tools on my desktop (my real desktop – not the computer desktop), turned on all my lights and set to work. I took the photos (the snaps) and decided to make the animation on iMovie. What could be hard about that?

Take-1 reflection: Putting theory into practice isn’t as straightforward as it seems. First of all: lighting. I turned on my two desk lamps and my ceiling light (in my eyrie, the ceiling is less than 4” above my head and way behind me) and felt confident that the light would be OK. It wasn’t, but I’ll come back to this. The next problem was the way I held the camera – which was exactly the problem – I HELD THE CAMERA. As I moved each item (actor?) on my stage, I also inadvertently moved the camera, so the resulting snaps were taken from different angles. Those were the two main problems with Take-1, although I also had to overcome iMovie’s default 4-second timing for individual images.

I went and retrieved my camera tripod from the boot of my car, where it had been since the last lot of Pathfinder videos I managed to film for RSC-YH. I set up my camera on this and planned what each of my actors would do on stage. The rudimentary beginnings of a ‘story’board can be seen here. However, this was another failed attempt!

Take-2 reflection: I was taking 20–30 photographs at a time and moving my actors around 1–2 centimetres for each shot. Then, when I’d loaded the photos into iMovie and repeated my changes to the default image timing, I played it back and found the camera angles were still changing. I’d reduced the default image time to 0.4 secs, so the 20–30 images would make a movie/animation of around 8–12 seconds. But it was still rubbish. Back to the drawing board.

I double-checked the tripod and I checked all the camera settings to make sure that the photos were getting as much light as possible, that the flash wasn’t firing (flash throws shadows) and that the camera was set to manual. I even planned a more believable ‘story’.

Take-3 reflection: I’d realised by now that I should have planned this better and earlier. The resulting iMovie animation was no better than the previous two and I was left scratching my head.

So, why did Take-4 work (at least) as well as it did?

  • I’d investigated iMovie’s defaults a little further and realised that it applies a Ken Burns effect to each image! Doh – that’s why the camera angles on Take-2 and Take-3 seemed so erratic. I had to highlight all of the snaps and make sure that iMovie had added no attributes to them. 0.4 secs seemed to be a good length of time though.
  • I’d moved things around on the desktop so that the light was pretty even (again, I’ll come back to this).
  • By now, I’d actually scripted an activity where the pot cat would chase a startled computer mouse from the desk mat. Similarly startled pens would also run away in the background.
  • I didn’t plan to take 20 snaps, but that’s how it turned out.

I am probably lucky that the distance moved between each shot was about right, so the ‘jumps’ between frames are not too big. The end product is a bit short (20 frames @ 0.4 seconds each works out to 8 seconds) but good enough for a first publication 🙂
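The frame arithmetic above generalises neatly. As a minimal sketch of the sums involved (the function names are my own, not anything from iMovie):

```python
import math

def clip_duration(frames: int, seconds_per_frame: float) -> float:
    """Running time of a stop-motion clip, in seconds."""
    return frames * seconds_per_frame

def frames_needed(target_seconds: float, seconds_per_frame: float) -> int:
    """How many snaps to shoot for a clip of at least target_seconds.
    round() guards against floating-point noise before taking the ceiling."""
    return math.ceil(round(target_seconds / seconds_per_frame, 9))

print(clip_duration(20, 0.4))   # the 8-second Take-4 clip
print(frames_needed(30, 0.4))   # snaps required for a 30-second animation
```

Working backwards like this (how many snaps do I need for the length I want?) is exactly the planning step I skipped in Takes 1–3.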


I said that I’d come back to lighting. If you watch the animation you’ll see light coming and going from the top left of the screen – the laptop! Well, that was something I’d overlooked. The mouse (actor) was a Gyro-mouse and I’d neglected to remove the receiver from the laptop. Therefore, every time I moved the mouse, something changed on the laptop screen – which was half closed and which I therefore didn’t notice until the final product was rendered.

Also, because I had (what I now realise was) too little light, different shadows were thrown each time I moved an actor. This doesn’t detract from my clip too much, but it might be an important consideration for others making animations.

Then of course, I looked at the product being supported by RM in the school: I Can Animate. This does all the hard work for you! Grrrr. Thanks too to James Clay for introducing me to iMotion for iPad and iPhone.


Jet lag and jelly

I’m not doing so well on the jet-lag front this week: on Monday morning, I was in my office at 3.30am trying to do something useful, as sleep wouldn’t come my way. Then on Wednesday, after another night of disturbed (almost non-existent!) sleep, I was up at 5.00am to get ready for an early trip to Blackpool, where I’d been asked to deliver a mobile learning workshop at the 6th form college. Initial discussions had taken place via email while I was in America.

I’d initially decided to re-work one of the MoLeNET days Di and I designed, because it involves plenty of activities and related pedagogy; I would just tailor the programme to suit my audience. Then I was told that the college was a Mac college – entirely equipped with Macs and MacBooks, and apparently the only college in the country so equipped. This was a blow, as it meant changing the way the day was delivered: some of the planned activities involve software that won’t run on a Mac. I searched and found CaptureIT, which supposedly does similar things to Cam Studio, but stopped exploring it (I still haven’t explored the video bit) because I got another phone call on Tuesday to say that it wouldn’t be an all-day gig as originally planned, but a two-hour one repeated three times (I’m good value!!). That meant starting again, as there would still be a need to include engaging activities that didn’t appear too rushed. My objective was for the groups to explore mobile learning without my being too didactic.

I felt that the day went well. Each group remained engaged and completed the activities I’d set them, with the resulting discussions going the way I’d planned – without dissent.

One of the activities involves reading a piece of text 300–350 words long (I used a newspaper article in two of the sessions and a piece of Shakespeare in the other – readers think the Shakespeare is “hard”) and then composing an SMS text (up to 160 characters) or ‘Tweet’ (up to 140 characters) to demonstrate their understanding of the piece. Part of the value comes from the concentration required to sift through the prose, pick out the important aspects and then combine them in a very short message. Once all the received messages are shown to the group, the full value is realised in the ensuing discussion of the submitted ‘understandings’. Lilian and I plan to use this in our ALT-C workshop.
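The constraint at the heart of the exercise is easy to state precisely. A toy sketch, using the limits quoted above (the names here are my own invention, nothing the learners would see):

```python
# Character limits from the exercise: SMS 160, Tweet 140 (the limit at the time).
LIMITS = {"sms": 160, "tweet": 140}

def fits(message: str, channel: str) -> bool:
    """True if the summary fits the chosen channel's character limit."""
    return len(message) <= LIMITS[channel]

summary = "Prince feigns madness, stages a play to test the king's guilt."
print(fits(summary, "tweet"))
```

The discipline the exercise teaches is essentially lossy compression by hand: every character spent on one idea is a character unavailable for another.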

Talking of which, Lilian and I met on Thursday for lunch and to catch up on things that had happened over the summer and that needed doing for the upcoming academic year. It was a great afternoon, during which we both felt we had achieved something, and we left each other feeling invigorated. We have pretty much planned our workshop for ALT (Programme), currently scheduled for Wednesday morning at 9.00am.

And today (more later on Twitter and via blog next week I’m sure) we’re setting off for Wembley to see Fartown (the older persons’ term for Huddersfield Giants) play in the Rugby League Cup Final.