“MooseGesture” – Python Mouse Gestures Module

“MooseGesture” is a Python module that implements a basic mouse gesture recognition system. It can identify gestures made up of strokes in the eight cardinal and diagonal directions.

A mouse gesture is made by holding down the mouse button and moving the mouse cursor in a specific pattern to issue a command.

Mouse gestures are a way of dragging with the mouse to draw out a certain pattern. The most mainstream use of mouse gestures in computer software is in web browsers (Chrome, Firefox, Opera, Internet Explorer).

Mouse gestures can provide an interesting game mechanic that you can add to your own programs (similar to what the Wiimote does in a few games).

You can download the module (and a simple Pygame test script that uses it) here:

MooseGestures & Test App (zipped)

moosegesture.py

moosegesturetest.py

Simon Gesture – A game that uses the MooseGesture module.

(The screenshot above shows the test app after entering a mouse gesture. It correctly identifies the gesture as Down Right Up.)

(The above screenshot shows a more complicated gesture: Down, Up, Down, Right, Left, Right.)

Description of the Algorithm

First, some nomenclature (at least how I use it); a short sketch after this list shows these terms as plain Python data:

  • A “point” is a 2D tuple of x, y values. These values can be ints or floats; MooseGesture supports both.
  • A “point pair” is a point and its immediately subsequent point, i.e. two points that are next to each other.
  • A “segment” is two or more ordered points forming a series of lines.
  • A “stroke” is a segment with all its point pairs going in a single direction (one of the 8 cardinal or diagonal directions: up, upright, left, etc.).
  • A “gesture” is one or more strokes in a specific pattern, e.g. up then right then down then left.
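
Here is that sketch. The data parts are just tuples and lists; the call at the end assumes the module exposes a getGesture() function that takes the recorded points and returns one identifier per stroke (the function name and return values here are assumptions based on this description, so check the module itself):

    # A "point" is a 2D tuple of x, y values; ints or floats both work.
    point = (10, 20)

    # A "segment" is an ordered list of points forming a series of lines.
    segment = [(10, 20), (60, 22), (110, 19)]

    # A "stroke" is a segment whose point pairs all head in one direction;
    # every point pair here moves to the right.
    rightward_stroke = [(0, 100), (50, 101), (100, 99), (150, 102), (200, 100)]

    # A "gesture" is one or more strokes in a specific pattern. Assuming the
    # module exposes getGesture(), it identifies the strokes in the recorded
    # points (the exact values it returns may differ from what you expect):
    import moosegesture
    print(moosegesture.getGesture(rightward_stroke))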

The identification of strokes is based on a simple slope calculation between two points. MooseGesture can identify several strokes in a row in the eight cardinal and diagonal directions (up, down, left, right, and the four diagonals in between). MooseGesture does not recognize two strokes in the same direction performed in succession; it sees them as just one stroke in that direction. So up, up, down, down, left, right, left, right would be identified as just up, down, left, right, left, right.
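
As a rough sketch of that merging behavior (illustrative only, not the module's actual code), runs of identical consecutive directions can be collapsed into a single stroke each:

    from itertools import groupby

    def collapse_repeats(directions):
        # Merge each run of identical consecutive directions into one stroke,
        # so two "up" strokes in a row become a single "up".
        return [direction for direction, _run in groupby(directions)]

    print(collapse_repeats(['up', 'up', 'down', 'down', 'left', 'right', 'left', 'right']))
    # ['up', 'down', 'left', 'right', 'left', 'right']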

A slope is calculated between two XY coordinates with the basic “rise over run” calculation: (y2 - y1) / (x2 - x1)
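
In code, that calculation might look like this (a small helper written for illustration; the guard for a vertical point pair, where x2 equals x1, is needed because the plain division would raise an error):

    def slope(point1, point2):
        # Basic "rise over run" between two (x, y) points.
        x1, y1 = point1
        x2, y2 = point2
        if x2 == x1:
            return float('inf')  # straight up or down; plain division would fail
        return (y2 - y1) / (x2 - x1)

    print(slope((0, 0), (10, 5)))  # 0.5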

The slope is then categorized into one of the eight directions by checking which “zone” it falls into. Each of the eight zones spans 45 degrees. To make them easier to remember, I’ve numbered the zones according to the directions that the keys on your keyboard’s number pad point to:
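
One way to do that categorization (a sketch for illustration, not necessarily how the module does it) is to take the angle of each point pair with atan2, which sidesteps the divide-by-zero problem of the raw slope, and map each 45-degree zone to its numpad-style direction. Screen coordinates grow downward in y, so the y difference is flipped to make “up” mean up on the screen.

    import math

    # Numpad-style labels for the eight 45-degree zones.
    _NUMPAD_DIRECTIONS = {
        8: 'up',    9: 'upright',  6: 'right', 3: 'downright',
        2: 'down',  1: 'downleft', 4: 'left',  7: 'upleft',
    }

    def direction_of(point1, point2):
        # Angle of the point pair, in degrees counterclockwise from "straight
        # right". The y difference is negated because screen y grows downward.
        x1, y1 = point1
        x2, y2 = point2
        angle = math.degrees(math.atan2(-(y2 - y1), x2 - x1)) % 360

        # Each zone is 45 degrees wide and centered on its direction, so
        # "right" covers -22.5 to +22.5 degrees, "upright" the next 45, etc.
        zone = int(((angle + 22.5) % 360) // 45)
        numpad = [6, 9, 8, 7, 4, 1, 2, 3][zone]
        return _NUMPAD_DIRECTIONS[numpad]

    print(direction_of((0, 0), (10, -1)))   # 'right' (a little shaky, still in the right zone)
    print(direction_of((0, 0), (10, -10)))  # 'upright'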

The key is to know when one stroke ends and another begins, and to allow for some inaccuracy and imperfection from the user. MooseGesture does this by breaking the user’s gesture up into overlapping “segments” and testing that all the point pairs in each segment share a consistent direction.

Here’s what I mean. Say the user’s gesture (which is just an ordered list of XY points) looks like this:

The user intends to make a simple to-the-right gesture, but because of some natural shakiness in the mouse movement there is a little deviation in it. We want this deviation to be ignored, not identified as an additional upright stroke. The segment consistency checks in our algorithm will ignore these little bumps.
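
Here is a rough sketch of that segment-consistency idea (again, illustrative rather than the module's actual implementation): slide a small overlapping window of consecutive points along the gesture, and only accept a stroke when every point pair in the window agrees on a direction. It reuses the direction_of() helper sketched above.

    def identify_strokes(points, segment_length=4):
        # Slide an overlapping window of `segment_length` points along the
        # gesture. Only when every point pair inside the window agrees on a
        # direction do we accept it as (part of) a stroke.
        strokes = []
        for start in range(len(points) - segment_length + 1):
            segment = points[start:start + segment_length]
            directions = {direction_of(a, b) for a, b in zip(segment, segment[1:])}
            if len(directions) == 1:
                direction = directions.pop()
                # Consecutive windows with the same direction belong to one stroke.
                if not strokes or strokes[-1] != direction:
                    strokes.append(direction)
        return strokes

    # A mostly-rightward gesture with one small shaky bump in the middle:
    wobbly = [(0, 5), (10, 5), (20, 5), (30, 5), (35, 1), (40, 5), (50, 5), (60, 5), (70, 5)]
    print(identify_strokes(wobbly))  # ['right'] -- the bump never forms a stroke of its own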
