
Neuro Gesture - Leap Motion

The following package is a Neural Network based gesture recognition package. It uses a Feed-Forward Neural Network, which is trained by a Backpropagation learning algorithm.

The package consists of three parts:

  1. Generating a dataset from the Leap Motion for gesture recognition.
  2. Training a Feed-Forward Neural Network on the dataset.
  3. Predicting the gestures given as input.

PyBrain

PyBrain is a Python-based machine learning library centered on Neural Network implementations. It is a modular library with easy-to-use functions and classes for building various kinds of Neural Networks on different kinds of datasets, and it provides many tools for supervised as well as unsupervised learning.

Though not needed to use this package, detailed information about PyBrain can be found at http://pybrain.org/.

So, moving ahead: to build and train the Feed-Forward Neural Network, we will need to download the PyBrain library.

First of all, install SciPy. To do that, run

sudo apt-get install python-scipy

Once that is done, install matplotlib, which is used for PyBrain's plotting functions.

sudo apt-get install python-matplotlib

Once that is done, we have all the prerequisites for the functioning of PyBrain.

Next, clone the repository from GitHub to your desired location:

git clone git://github.com/pybrain/pybrain.git

PyBrain will now be in a pybrain folder inside the directory from which you called git clone.

In the terminal, add the location where PyBrain was cloned to your PYTHONPATH. You will have to do this every time a new terminal is launched.

export PYTHONPATH=~/pybrain:$PYTHONPATH

In my case, PyBrain is cloned into the home directory.

Note: If you don't want to export the PYTHONPATH every time, simply add the export command to your .bashrc file. Open it using

gedit ~/.bashrc

Then add the export command at the end of the file and save it.

OK, now we test PyBrain. To do that, fire up your terminal and type

$ export PYTHONPATH=~/pybrain:$PYTHONPATH
$ python
Python 2.7.3 (default, Feb 27 2014, 19:58:35)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import pybrain
>>>

If there is no error, then everything worked fine.
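If you want a slightly deeper smoke test, the following minimal sketch builds a small Feed-Forward Network and trains it on XOR with backpropagation, using PyBrain's buildNetwork, SupervisedDataSet and BackpropTrainer. This is a toy example, not part of the package itself:

# Toy smoke test: train a small feed-forward network on XOR with backprop.
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(2, 3, 1)        # 2 inputs, 3 hidden units, 1 output
ds = SupervisedDataSet(2, 1)
for inp, target in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]:
    ds.addSample(inp, (target,))
trainer = BackpropTrainer(net, ds)
for _ in range(1000):
    trainer.train()                # one epoch of backprop over the dataset
print(net.activate((0, 1)))        # should now be close to 1.0

If the trained output is close to the targets, PyBrain is fully functional.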

Leap Motion

With PyBrain set up, it's time to set up the ROS package for Leap Motion. Below is a quick walkthrough; detailed instructions can be found at http://wiki.ros.org/leap_motion

Go to https://developer.leapmotion.com/ and download the suitable SDK. Extract the contents of the .tgz file, then double-click the appropriate .deb file to install the driver.

Like earlier, configure the PYTHONPATH so Python can find the extracted SDK. For me, it is in the home directory.

$ export PYTHONPATH=$PYTHONPATH:$HOME/LeapSDK/lib:$HOME/LeapSDK/lib/x64
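Before moving on, you can check that Python finds the bindings. Leap.Controller is part of the SDK's Python API; this is just a quick import test:

# Quick check that the Leap SDK Python bindings are on the PYTHONPATH.
import Leap

controller = Leap.Controller()
print(Leap.__file__)   # shows where the module was loaded from
# controller.frame() returns tracking data once leapd is running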

Then download the ROS package for Leap Motion. Change directory to the src folder of your catkin workspace, then

$ git clone https://github.com/ros-drivers/leap_motion

Once it has been cloned to the src folder, go back to the root of your catkin workspace and run

$ catkin_make

Make sure that the nodes present in the scripts folder of the package are executable:

$ chmod +x sender.py

Then initialize the driver and run the sender node (run each command in its own terminal):

$ leapd
$ roscore
$ rosrun leap_motion sender.py
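To check that tracking data is actually flowing, you can subscribe to the driver's output from Python. The topic and field names below are assumptions based on the ros-drivers/leap_motion driver, so verify them if your version differs:

#!/usr/bin/env python
# Sketch: print the palm position streamed by sender.py.
# Topic and field names assume the ros-drivers/leap_motion driver;
# verify with `rostopic list` and `rosmsg show leap_motion/leapros`.
import rospy
from leap_motion.msg import leapros

def callback(msg):
    rospy.loginfo("palm position: %s", msg.palmpos)

rospy.init_node('leap_echo')
rospy.Subscriber('leapmotion/data', leapros, callback)
rospy.spin()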

Next, we download and install the package for gesture recognition.

Neuro Gesture - Leap

Clone the package into the src folder of your catkin workspace:

$ git clone https://github.com/parhartanvir/neuro_gesture_leap.git
$ cd ..
$ catkin_make

After this, make the two nodes executable:

$ chmod +x get_gesture.py
$ chmod +x gesture_rec.py

Testing

The data_sets folder contains datasets and pretrained Neural Networks for three gestures:

  1. Square
  2. Triangle
  3. Circle

Just run the following

$ roscore

$ rosrun leap_motion sender.py

In another terminal, change the directory to the data_sets folder in the neuro_gesture_leap package. Then run the gesture_rec.py node.

$ rosrun neuro_gesture_leap gesture_rec.py

The recognized gesture is published on the topic /neuro_gest/gesture:

$ rostopic echo /neuro_gest/gesture
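To react to gestures from your own node instead of echoing them, here is a minimal subscriber sketch. It assumes the gesture name arrives as a std_msgs/String; confirm the actual type with rostopic type /neuro_gest/gesture:

#!/usr/bin/env python
# Sketch: react to recognized gestures in your own node.
# Assumes the gesture name arrives as a std_msgs/String; confirm
# with `rostopic type /neuro_gest/gesture`.
import rospy
from std_msgs.msg import String

def on_gesture(msg):
    rospy.loginfo("recognized gesture: %s", msg.data)

rospy.init_node('gesture_consumer')
rospy.Subscriber('/neuro_gest/gesture', String, on_gesture)
rospy.spin()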

Making and Training Datasets

Start by firing up roscore and the sender node (in separate terminals):

$ roscore

$ rosrun leap_motion sender.py

Now run the script for recording gesture data:

$ rosrun neuro_gesture_leap get_gesture.py

The node will ask you to input the path of the folder in which you want to save the dataset. For example, for me it is '/home/tanvir/catkin_ws2/src/neuro_gesture_leap/my_data_set'.

Just keep following the instructions that appear in the terminal. It will ask for the number of gestures, the number of training examples, their names, etc.

Once you are done making the datasets, change the directory in the terminal to point to the dataset folder that was just created. Then it is time to train on the dataset:

$ rosrun neuro_gesture_leap trainer.py
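trainer.py does its work with PyBrain. The snippet below is only an illustrative sketch of the usual PyBrain train-and-save pattern, with made-up feature/layer sizes and random stand-in data; it is not the package's actual code:

# Illustrative sketch of a PyBrain train-and-save cycle (hypothetical
# feature/layer sizes, random stand-in data), not the package's code.
import numpy as np
from pybrain.datasets import ClassificationDataSet
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure.modules import SoftmaxLayer
from pybrain.supervised.trainers import BackpropTrainer
from pybrain.tools.customxml.networkwriter import NetworkWriter

ds = ClassificationDataSet(30, nb_classes=3)   # 30 features, 3 gesture classes
for _ in range(60):                            # random stand-ins for recordings
    label = np.random.randint(3)
    ds.addSample(np.random.randn(30), [label])
ds._convertToOneOfMany()                       # one output unit per class

net = buildNetwork(ds.indim, 20, ds.outdim, outclass=SoftmaxLayer)
trainer = BackpropTrainer(net, ds)
trainer.trainUntilConvergence(maxEpochs=50)    # backprop with validation split
NetworkWriter.writeToFile(net, 'trained_net.xml')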

The trainer will train the Neural Network and save it in the same folder. Once the training is done, we can use it to predict input gestures. Simply run

$ rosrun neuro_gesture_leap gesture_rec.py

The node will publish the name of the gesture to the topic /neuro_gest/gesture. Enjoy!
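If you would rather load a trained network directly in your own Python code, PyBrain's NetworkReader is the counterpart of the NetworkWriter shown earlier. The file name and gesture ordering below are illustrative assumptions:

# Sketch: load a saved network and classify one feature vector.
# The file name and gesture ordering are illustrative assumptions.
import numpy as np
from pybrain.tools.customxml.networkreader import NetworkReader

net = NetworkReader.readFrom('trained_net.xml')
features = np.random.randn(30)    # stand-in for a real recorded gesture
scores = net.activate(features)   # one score per gesture class
gestures = ['square', 'triangle', 'circle']
print(gestures[int(np.argmax(scores))])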

Expected Updates

  1. Multiple Learning Algorithms to choose from.
  2. Use of Recurrent Neural Networks for continuous input.
