I just watched this TED talk:
Pattie Maes and Pranav Mistry demo SixthSense
There is no doubt that the student who created this device is a genius. The difficulty of creating algorithms that can read, recognise and make meaning of an image in front of them, beyond a barcode or similar pictorial representation, is incredible.
That said, I'm not too sure about the actual implementation of this product. At the moment you have a camera with a projector hanging around your neck to turn any flat surface nearby into a screen. This is interesting, but in my mind not incredibly useful. Especially since you are forking out the cost of a phone for your phone to do all the computing anyway, without using all the tools that your phone already has.
Take the shopping uses, for example. You have to make sure the camera is pointing in the right direction, make the correct hand gestures, and then you need a surface on which to display the information you're receiving from your projector, also around your neck. Or, you could get out your phone, take a picture with its built-in camera and do exactly the same thing in (I bet) the same time or less, get the same answer, and spend $300 or whatever it was less.
When I'm out and taking photos, I want the photos to be deliberately aimed, zoomed and shot. I want to be able to see exactly what I'm taking before I take it, like with any existing camera. This doesn't allow that. A camera hanging around my neck is precarious, it moves and sways as I move. It turns around. It's not practical.
The software is cool, there's no doubt about that, but the hardware is, in my mind, unconvincing. Johnny Chung Lee has done some amazing things with similar technologies, with what I think are much more useful educational outcomes, although as with this, the implementation is, at the moment, a bit ugly and buggy.
It just seems to me that this is adding hardware to do what I can do already with the hardware in my pocket.
Also, at the end of the talk, as she was leaving the stage, Pattie Maes said that 'Maybe in ten years we'll be here with the ultimate SixthSense brain implant'. If this is what they're aiming for then I'm not at all interested.
As for the implications for the future of designers of interactive multimedia, I'm not sure that this makes, at the moment, any difference. As I stated above, I don't see this technology achieving anything beyond what is already very achievable just with a phone. I'm more than happy to be shown that I am wrong about this.
These are my initial impressions.
What do you think?
After giving it some thought, I totally agree with you. Initially I was blown away by this (and like you, despite the practicality issues, I still am).
Other than computers, I don't pay much attention to many new technologies - my phone, for example, is, on principle (if nothing else), the oldest existing Nokia phone I could find :P
So as I was watching it there were a lot of WOWWWS! and woahhh!!s coming from me, but then when I showed it to a friend, he had a similar reaction to you. Given that this friend ALWAYS argues for argument's sake, I fought him on it - but now, reading this, I can see where he was coming from.
New technology should have the WOW factor and the practical factor if it's going to have any real implications for design. It seems the technologies that work are the ones that actually make life simpler, whereas this invention, at least in its near-future use, has too many steps.
Like you mentioned, the camera isn't practical - I don't even want a camera on my phone for the same reason, and I can't imagine this one being any better. Also, it only works with the coloured markers on your fingers - what if you lose one?
So yeah, very very cool - but I can't see it really impacting design until they streamline it. Possibly they could find a way to build a phone with the same capabilities, projector and all?
I'm also in agreement. I don't think this fulfils the tenet of 'relative benefit'. I really don't see a social use for this - yet - and I don't think the technology is there. Agreed, the software is very interesting, especially the ability to recognise gestures. I've seen new research on facial recognition software, which is also interesting. The facial recognition would be used to help people with autism understand and identify different emotions. This, I think, is a relative benefit. *The article was in the latest New Scientist, hopefully online soon. I'll keep looking.
Just saw your 'feed' directions. Very nice. I'll include the link in my Week 3 post, in case folks don't see it.
Yes, I think the demonstrations were a bit gimmicky and the hardware clunky. I definitely would not like a cloud of 'tags' to appear on my body when approached by someone, or huge amounts of additional information about supermarket products.
But I could imagine the device integrated into a pair of glasses that the user could wear.
I notice that this video is a little old (2009). I would be really interested to know where SixthSense is up to now, two years down the track. Does anyone know?