Disruptive user-interface technology available soon
https://leapmotion.com/
Not Kinect. For $70 you get 0.01mm resolution within an 8 ft^3 space. Uses infrared.
Check out the video on the home page, especially the point clouds of the hands.
This will be big.
Re: Disruptive user-interface technology available soon
DeltaV wrote: https://leapmotion.com/
Not Kinect. For $70 you get 0.01mm resolution within an 8 ft^3 space. Uses infrared.
Check out the video on the home page, especially the point clouds of the hands.
This will be big.
Very cool.
‘What all the wise men promised has not happened, and what all the damned fools said would happen has come to pass.’
— Lord Melbourne —
zapkitty wrote: Dammitall to hell...
... how are we going to get to the point of mech pilots wearing skintight data films when they can just use this gadget instead?!...
... Sorry, Shirow-san, the present has overrun the future again... *sob*...
The skinsuit is a serious contender for an actual pressure suit, so you may get your wish.

When disruptive technologies combine ....
If you combine this ability to 3D scan to 0.01mm in an 8 cubic foot space with appropriate software and 3D printing, some amazing things become possible. The time to create a 3D model of a physical object at very high resolution will drop significantly. This combined with 3D printing would allow the cheap replication of spare parts given an undamaged original part. Damaged parts would of course require cleanup to 'remove' the damage from the scan.
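A quick back-of-envelope shows why "very high resolution" scanning is also a data problem. The 100 cm² part area and the per-point storage format below are my own assumptions for illustration, not Leap specs:

```python
# Back-of-envelope: raw point-cloud size for a full-resolution surface scan.
# Assumptions (not Leap specs): a part with 100 cm^2 of surface area,
# sampled on a uniform 0.01 mm grid, stored as 3 x 32-bit floats per point.

RESOLUTION_M = 0.01e-3        # 0.01 mm in metres
PART_AREA_M2 = 100 * 1e-4     # 100 cm^2 in m^2
BYTES_PER_POINT = 3 * 4       # x, y, z as float32

points = PART_AREA_M2 / RESOLUTION_M**2
size_gb = points * BYTES_PER_POINT / 1e9

print(f"{points:.0e} points, ~{size_gb:.1f} GB raw")  # 1e+08 points, ~1.2 GB raw
```

So even a small part sampled at full resolution yields on the order of a hundred million points; any scan-to-print pipeline would need heavy decimation or meshing before the data becomes practical.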
Another very interesting area would be to use this interface technique to enhance 3D modeling software to allow a more direct manipulation style or even 'air sculpting'.
Wow, just wow.
http://www.technologyreview.com/news/50 ... ntrol-era/
Leap’s founders won’t share exact details of their technology, but Holz says that unlike the Kinect, the Leap doesn’t project a grid of infrared points onto the world that are tracked to figure out what is moving and where (see the pattern produced by the Kinect sensor).
Despite having two cameras, the Leap does not use stereovision techniques to determine depth, says Holz. Instead, the second camera is to provide an extra source of information and prevent errors due to parts of a person’s hand obscuring itself or the other hand.
Cool, but I'm not sure I see this catching on in the long run for desktops (I guess the counter-argument is that desktops are dying, eh).
I honestly think I'd be faster and more accurate with the mouse, plus I gotta think mouse buttons are always going to be more precise and reliable than gesture recognition.
Plus, wouldn't you get tired holding your arms out in front of yourself the whole day? Hey, I know I could use more exercise, but...
I thought I heard years ago that someone was working on a pointer that tracked the direction of your gaze. Combine that with this for auxiliary functions and maybe I'm interested.
Maui wrote: Plus, wouldn't you get tired holding your arms out in front of yourself the whole day? Hey, I know I could use more exercise, but...
I thought I heard years ago that someone was working on a pointer that tracked the direction of your gaze. Combine that with this for auxiliary functions and maybe I'm interested.
As I read it, you can rest your hand on the desk and just move one or two fingers, if need be. The 'gains' (motion scale factors) in the software can be tuned to fit your particular style of use.
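A tunable gain like that could be as simple as the sketch below. The gain value and the dead zone (to swallow small tremors) are illustrative numbers of my own, not anything from the Leap SDK:

```python
# Minimal sketch of a tunable 'gain' (motion scale factor): map small
# fingertip displacements onto larger cursor displacements. The gain and
# dead-zone values are illustrative, not from any real SDK.

def cursor_delta(finger_dx_mm, finger_dy_mm, gain=25.0, dead_zone_mm=0.05):
    """Convert a fingertip displacement (mm) into a cursor displacement (px)."""
    def scale(d):
        if abs(d) < dead_zone_mm:   # ignore tremor below the dead zone
            return 0.0
        return d * gain
    return scale(finger_dx_mm), scale(finger_dy_mm)

# A 2 mm finger movement becomes a 50-pixel cursor movement at gain=25,
# while a 0.02 mm tremor is filtered out entirely.
print(cursor_delta(2.0, 0.02))  # (50.0, 0.0)
```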
The gaze sensor might be problematic, since human eyes move in saccades which the conscious mind is usually not aware of. Not saying a saccade filter could not be developed. Something similar might be needed with Leap for people with Parkinson's, etc.
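A saccade filter along those lines could use the standard velocity-threshold idea (often called I-VT): anything moving faster than some angular-velocity cutoff is a saccade, the rest is fixation. The 30 deg/s threshold and 100 Hz sample rate below are illustrative assumptions:

```python
# Minimal velocity-threshold (I-VT) saccade filter sketch: gaze samples
# whose angular velocity exceeds a threshold are labelled saccades, the
# rest fixations. Threshold (30 deg/s) and rate (100 Hz) are illustrative.

def classify_gaze(angles_deg, sample_rate_hz=100.0, threshold_deg_s=30.0):
    """Label each sample of a 1-D gaze-angle trace 'fixation' or 'saccade'."""
    dt = 1.0 / sample_rate_hz
    labels = ["fixation"]                     # first sample has no velocity
    for prev, cur in zip(angles_deg, angles_deg[1:]):
        velocity = abs(cur - prev) / dt       # deg/s
        labels.append("saccade" if velocity > threshold_deg_s else "fixation")
    return labels

trace = [0.0, 0.01, 0.02, 2.0, 4.0, 4.01, 4.02]   # ~4 deg jump mid-trace
print(classify_gaze(trace))
# ['fixation', 'fixation', 'fixation', 'saccade', 'saccade', 'fixation', 'fixation']
```

A pointer driven by this would only move on fixations, which is exactly the filtering a Parkinson's-friendly Leap mode would need too, just applied to hand position instead of gaze angle.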
DeltaV wrote: I'd like to know how the point cloud points beyond line-of-sight (such as the backs of the fingers) are obtained. Infrared diffraction? But, that is part of the secret sauce...
I suspect something like this:
http://www.technovelgy.com/ct/Science-F ... wsNum=3823
DeltaV wrote: The gaze sensor might be problematic, since human eyes move in saccades which the conscious mind is usually not aware of. Not saying a saccade filter could not be developed. Something similar might be needed with Leap for people with Parkinson's, etc.
OpenCV does gaze tracking:
http://hackaday.com/2012/05/30/opencv-k ... -tracking/
I wonder if that's what they use. Massively useful software. Body part tracking, face id, people location, counting, tracking. And a lot of it requires no more than smartphone style hardware.
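At its simplest, the "people location, counting, tracking" step boils down to matching detections in each new frame to known tracks by nearest centroid. This pure-Python sketch stands in for a real OpenCV pipeline (which would add actual detection, Kalman filtering, etc.); the 50-pixel match radius is an assumed tuning value:

```python
# Simplest form of frame-to-frame people tracking/counting: match new
# detections to known tracks by nearest centroid. A stand-in sketch; real
# pipelines (e.g. OpenCV-based) add detection, Kalman filtering, etc.

import math

def update_tracks(tracks, detections, max_dist=50.0):
    """tracks: {id: (x, y)}; detections: [(x, y)]. Returns updated tracks."""
    next_id = max(tracks, default=-1) + 1
    updated = {}
    unmatched = list(detections)
    for tid, (tx, ty) in tracks.items():
        if not unmatched:
            break                     # leftover tracks have left the frame
        # nearest remaining detection to this track
        d = min(unmatched, key=lambda p: math.dist(p, (tx, ty)))
        if math.dist(d, (tx, ty)) <= max_dist:
            updated[tid] = d
            unmatched.remove(d)
    for d in unmatched:               # new people entering the frame
        updated[next_id] = d
        next_id += 1
    return updated

tracks = update_tracks({}, [(10, 10), (200, 50)])      # two people appear
tracks = update_tracks(tracks, [(14, 12), (205, 55)])  # both move slightly
print(tracks)  # {0: (14, 12), 1: (205, 55)}
```

The people count is just `len(tracks)`, and a stable track id persisting across frames is what makes counting and trajectory logging possible on modest (smartphone-class) hardware.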