Subject: Re: [boost] [rfc] a library for gesture recognition, speech recognition, and synthesis
From: Michael Fawcett (michael.fawcett_at_[hidden])
Date: 2009-10-26 15:05:39
On Sat, Oct 24, 2009 at 5:39 PM, Stjepan Rajko <stjepan.rajko_at_[hidden]> wrote:
>
> Nice video.  Did you work on that project or are you working on
> something similar?
I worked with the software and hardware shown in the video, but I did
not work on that particular demonstration.
> You could use the library in its current state to provide a richer
> set of gestures.  Here are some examples showing a vocabulary of
> gestures being trained and then recognized:
> http://www.youtube.com/watch?v=LAX5qgzYHjU (iPhone gesture classification)
> http://www.youtube.com/watch?v=mjjwhK4Dxt4 (mouse gesture on-line recognition)
>
> You could do something similar in the system shown in your video, at
> least with single-touch gestures (maybe they could be used as
> shortcuts to some of the functions otherwise accessible through the
> menu).
>
> There was also a nuicode GSoC project this year that used the library
> for multi-touch gestures:
> http://nuicode.com/projects/gsoc-gesture-models
> http://code.google.com/p/multitouch-gestr/
>
> The system works pretty well, but I don't think there is much
> documentation at this point.
>
> Would any of this be useful to you?
Yes, I had seen the nuicode project, but I didn't follow it very
closely. It's good to hear that your library would be useful in this
context.
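
For readers unfamiliar with the "train a vocabulary of gestures, then
recognize new input" workflow referenced in the links above, here is a
minimal, self-contained C++ sketch. It uses a naive resample-and-compare
template matcher; GestureVocabulary, train(), and recognize() are
hypothetical names invented for illustration and are NOT the API or the
algorithm of the library under discussion.

// Hypothetical standalone sketch -- not the library's API.
// Illustrates the general train-then-classify workflow with a naive
// nearest-template matcher over 2D strokes.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <limits>
#include <map>
#include <string>
#include <vector>

struct Point { double x, y; };
using Stroke = std::vector<Point>;

// Resample a stroke to a fixed number of points so strokes of different
// lengths can be compared index-by-index (index-based interpolation, a
// simplification of proper arc-length resampling).
Stroke resample(const Stroke& s, std::size_t n = 32)
{
    Stroke out;
    if (s.size() < 2) return Stroke(n, s.empty() ? Point{0, 0} : s.front());
    for (std::size_t i = 0; i < n; ++i) {
        double t = static_cast<double>(i) * (s.size() - 1) / (n - 1);
        std::size_t k = static_cast<std::size_t>(t);
        double f = t - k;
        std::size_t k2 = std::min(k + 1, s.size() - 1);
        out.push_back({ s[k].x + f * (s[k2].x - s[k].x),
                        s[k].y + f * (s[k2].y - s[k].y) });
    }
    return out;
}

// Mean point-to-point distance between two resampled strokes.
double stroke_distance(const Stroke& a, const Stroke& b)
{
    double d = 0;
    for (std::size_t i = 0; i < a.size(); ++i)
        d += std::hypot(a[i].x - b[i].x, a[i].y - b[i].y);
    return d / a.size();
}

class GestureVocabulary {   // hypothetical name, for illustration only
public:
    // "Training" here is just storing one resampled template per label.
    void train(const std::string& label, const Stroke& example)
    {
        templates_[label] = resample(example);
    }

    // Classify a new stroke as the label of the nearest stored template.
    std::string recognize(const Stroke& input) const
    {
        Stroke q = resample(input);
        std::string best;
        double best_d = std::numeric_limits<double>::max();
        for (const auto& kv : templates_) {
            double d = stroke_distance(q, kv.second);
            if (d < best_d) { best_d = d; best = kv.first; }
        }
        return best;
    }

private:
    std::map<std::string, Stroke> templates_;
};

int main()
{
    GestureVocabulary vocab;
    vocab.train("swipe_right", { {0, 0}, {1, 0}, {2, 0} });
    vocab.train("swipe_down",  { {0, 0}, {0, 1}, {0, 2} });

    std::cout << vocab.recognize({ {0, 0}, {0.9, 0.1}, {2.1, 0} }) << "\n";
    // prints: swipe_right
}

A real recognizer would normalize for scale, rotation, and timing, and
would typically fit a statistical model per gesture class rather than
storing a single template, but the overall train/recognize shape of the
workflow is the same.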
Thanks!
--Michael Fawcett