Kinect optical tweezers
Over the summer I have had a project student (BSc Applied Computing student, Matt MacPherson) working on developing a Kinect project to control an optical tweezers experiment. I keep meaning to write something on optical tweezers, but here I want to outline a little bit about this particular project.
In case you don’t know, Microsoft developed the Kinect as a control system for their Xbox 360 games console. The idea is that your body becomes the games controller: by kicking you can kick a football on screen, or by jumping you make your on-screen character jump. The system is pretty nice, and contains some simple optics to give an idea of depth within a scene.
Not long after the Kinect was launched, open-source drivers were released, allowing anyone to hack together code to interface with the system. More recently, Microsoft have made an official SDK available. The range of things people have been using the Kinect for is huge and very impressive, from remote control of model aircraft to a re-creation of the Princess Leia hologram message from Star Wars.
I have always liked to play around with high tech toys, and it seemed that the Kinect offered a really simple and intuitive way to interface with some of our experimental systems, especially our holographic optical tweezers. So I ran this idea as a summer project and we have got some decent results.
The initial ‘simple’ program took input from one hand and turned it into a special kind of hologram, called a kinoform (technically a phase-only hologram), which changed as the user’s hand moved. This was in turn relayed through a microscope system to form a moveable optical trap, which can pick up and manipulate microscopic particles.
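To give a flavour of the kind of computation involved (this is a sketch, not our actual code — the function name and all the parameter values below, such as the wavelength, effective focal length and SLM pixel pitch, are hypothetical), the simplest kinoform for steering a single trap is just a blazed phase grating: a linear phase ramp whose slope sets how far the trap moves in the focal plane.

```python
import numpy as np

def kinoform(trap_x, trap_y, size=512,
             wavelength=1064e-9, focal=2e-3, pitch=15e-6):
    """Blazed-grating kinoform: a linear phase ramp that steers a
    single optical trap laterally by (trap_x, trap_y) metres in the
    focal plane. All parameter values here are illustrative."""
    # Physical (x, y) coordinates of each pixel on the phase modulator
    ys, xs = np.mgrid[0:size, 0:size] * pitch
    # Linear phase ramp; steeper ramp -> larger trap displacement
    phase = 2 * np.pi / (wavelength * focal) * (xs * trap_x + ys * trap_y)
    # Wrap to [0, 2*pi) for display on a phase-only device
    return np.mod(phase, 2 * np.pi)
```

In the project, the hand position reported by the Kinect would simply be mapped to `(trap_x, trap_y)` and the kinoform recomputed every frame.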
Matt plays with the Kinect Tweezers
The second phase of work has been a bit more difficult: getting the Kinect to control multiple particles. The problems have mainly been due to gesture clashes, but we have hopefully sorted these out now, and I hope to shoot the final set of videos we need to show the functionality I originally envisaged next week, if I have time…
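For the multi-particle case, one standard holographic-tweezers trick (again a hedged sketch rather than our actual implementation; names and parameter values are illustrative) is to superpose the complex fields of several single-trap gratings and keep only the phase of the sum, giving one kinoform that produces all the traps at once.

```python
import numpy as np

def multi_trap_kinoform(traps, size=512,
                        wavelength=1064e-9, focal=2e-3, pitch=15e-6):
    """Superposition-of-gratings kinoform for several traps at once.
    `traps` is a list of (x, y) displacements in metres; all other
    parameter values are illustrative."""
    ys, xs = np.mgrid[0:size, 0:size] * pitch
    # Sum the complex field of one blazed grating per trap
    field = np.zeros((size, size), dtype=complex)
    for tx, ty in traps:
        phase = 2 * np.pi / (wavelength * focal) * (xs * tx + ys * ty)
        field += np.exp(1j * phase)
    # Discard the amplitude and keep only the phase, wrapped to [0, 2*pi)
    return np.mod(np.angle(field), 2 * np.pi)
```

With something like this, each tracked hand (or each fingertip gesture) just contributes one entry to `traps`, which is where cleanly separating the gestures matters.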
The plan now is to use the demos for things like open days (we are having a lab open day as part of the Dundee Science Festival) and possibly to develop outreach-type projects at Sensation (but we’ll see).
The question now is whether to try and publish this work. I like a good publication, and as there is an iPad-controlled optical tweezers paper (by my Glaswegian colleagues), there is a precedent. I am torn a little, though. The idea is not nearly as swish as some of the things people have demonstrated in their bedrooms, and in a way it doesn’t quite feel like proper science (although I have had some thoughts on ‘manual’ calibration of optical tweezers using the system). However, I do feel that a paper on this topic might generate some press, and this could help my group’s and my department’s profile. The flip side is that it could be seen in a slightly negative light. It’s an interesting question (at least in my head), but I’ll wait until we have the final ‘data’ before making a decision.
If anyone has any thoughts on what we could do with our Kinect tweezers, leave a comment, and I’ll see if we can try it out…