In Aberystwyth Robotics Club we run after-school robot-based activities for local schoolkids. This year we’ve just started a new term, and we decided to take in twice as many new starters as before. The idea was to be more open and let more kids have a go; it doesn’t matter if they haven’t been keen enough in the past to try out computing or robotics, what matters is that they’re interested now and want to give it a go. If they drop out at Christmas, that doesn’t matter: they’ve given it a shot.
So we are running two parallel sessions for the complete beginners. Half of them are doing a new “coding taster project” and half are doing the normal robot club starter (soldering, circuits, Arduino basics and a bit of Lego robots), and then we swap. I’ve been running the coding half, and we’ve been hacking around with face-controlled games. The rest of this blog post is about the coding stuff – and what it’s like doing real-time computer vision coding with a bunch of 11-12 year olds.
If you’re just here for the resources to try this yourself they’re linked at the bottom of this post.
What did we do?
I came up with a four-week programme of sessions (2h per session) which uses the computer vision library OpenCV to animate, read the camera, detect things in the camera image and then make a game. Broadly speaking the plan is as follows…
Week 1: animation
The first week was all about simple graphics: drawing dots, squares and lines on a canvas, having a loop and changing things in the loop, and thinking about colours and R, G, B space.
Week 2: video and faces
Next up we looked at video and faces. This involved thinking of a video as a set of still images (get an image from the camera, put it in the window, repeat). Then we tried running the face detector and working out what it can do. This session ran a bit short – we spent some time changing the sensitivity settings and the min/max face size settings, and trying to fool the face detector by drawing pictures of faces. I’ve beefed this session up in the worksheets, so hopefully next time round (in two weeks’ time) it won’t run short.
Week 3: collision detection (when does the ball hit the face?)
This session involved bringing together week 2 (rectangles around faces) and week 1 (balls moving on the screen) and working out when the rectangle and the ball overlap. This is harder than it sounds.
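The cleanest way we found to express the overlap test is the “nearest point” trick: clamp the ball’s centre into the rectangle to find the closest point on the rectangle, then compare that distance with the ball’s radius. A sketch (my own write-up of the idea, not the club code verbatim):

```python
def ball_hits_rect(cx, cy, r, x, y, w, h):
    """True if a ball (centre cx,cy, radius r) overlaps rectangle (x,y,w,h)."""
    # Clamp the ball centre into the rectangle: this gives the nearest
    # point on the rectangle to the ball.
    nearest_x = max(x, min(cx, x + w))
    nearest_y = max(y, min(cy, y + h))
    # If that nearest point is within one radius of the centre, they overlap.
    dx = cx - nearest_x
    dy = cy - nearest_y
    return dx * dx + dy * dy <= r * r
```

Comparing squared distances avoids a square root, but the real win is that the same four-line test handles corners, edges and the centre-inside case without any special cases.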
Week 4: game time
This week we introduced classic game elements: scores, levels, lives and so on. There was also a lot of changing colours, and using graphics to overwrite the face area (e.g. with a smiley face). Customisation was a very popular activity – no two games looked alike.
How did it go?
At the end, everyone had a game working, and a lot of the kids wanted to spend longer on their project. This was not all their own work – we have a lot of volunteers so I think the ratio was about 1:3 helper-to-student, and the project is really ambitious. But I do think all the kids have a fairly solid understanding of how their game fits together, even if they didn’t write all the code themselves.
The code was quite hacky, and the whole project was geared towards getting something working quickly rather than coding properly. A lot of the kids had tried Scratch before, but I think only one or two had looked at any textual programming language, and Python is quite a leap from Scratch. I was keen to keep it fun, so I provided them with a lot of code to edit (each week had a starting code file which they had to develop). That way they got things on the screen and spent their time doing animation rather than doing exercises. Robot club is, after all, supposed to be fun…
I think that all the kids have learned a few fairly key concepts associated with coding: case sensitivity, indentation, if statements, debugging, comments, the idea of iteration, the idea of a variable, and the general way coding best proceeds by small changes and lots of testing. I don’t think any of them could take an empty file and write a Python program, but they could probably look at a simple program with conditionals and so on, have a guess at what it does, and maybe edit some aspects.
In terms of higher-level concepts, they have a good idea of what a face detector is and how it works; they have a grasp of image-based coordinate systems, maths and inequalities in 2D space (collision detection), the idea of a colour being made up of R, G and B, and the idea of a video being a set of images presented in quick succession.
They’re also all able to tell the difference between different quote marks, and can find the colon symbol on a keyboard, although I die a little every time I hear myself refer to “#” as “hashtag”.
If anyone wants to try this at home or indeed anywhere else… we ran the workshop on Linux, in a lab with OpenCV and Python installed. You can find the instructions for the kids here: worksheets for kids on Google Docs, and the code samples broken down by week here: https://github.com/handee/ARCfacegame. Let me know if you use them.