HandMade

The final project for @pybae, @MikkelKim, @tsherlock, and @iancai.
Our team name is HandMade!

Our members:

Links:

Instructions

To build the program, run the following command:

make HandMade

To build the test suite, run the following command:

make TestDetect

The program requires a static background and begins by approximating it; during this step, the OpenCV window named "background" is visible.

Once the displayed background looks suitable, press any key to begin the main process.

Press 'q' to terminate the main program.
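
The background-approximation step is not detailed here, but a minimal sketch of one way it could work is shown below. It assumes the background is estimated by averaging a fixed number of camera frames while the "background" window is displayed; all names and constants are illustrative, not the project's actual code.

// background_capture.cpp -- illustrative sketch, not the project's actual code.
// Assumes the static background is approximated by averaging N camera frames.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap(0);                    // default camera
    if (!cap.isOpened()) return 1;

    const int numFrames = 30;                   // assumption: average 30 frames
    cv::Mat frame, acc;

    for (int i = 0; i < numFrames; ++i) {
        cap >> frame;
        if (frame.empty()) break;
        if (acc.empty()) acc = cv::Mat::zeros(frame.size(), CV_32FC3);
        cv::accumulate(frame, acc);             // running sum of frames
        cv::imshow("background", frame);        // the window mentioned above
        cv::waitKey(30);
    }
    if (acc.empty()) return 1;                  // no frames captured

    cv::Mat background;
    acc.convertTo(background, CV_8UC3, 1.0 / numFrames);  // mean frame

    cv::imshow("background", background);
    cv::waitKey(0);                             // press any key to continue
    return 0;
}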

Pitch

There is a growing market for drawing tools such as Paper and Sketchpad.io. These tools don't seek to replace the power of heavyweight applications such as Photoshop; instead, they provide a simple interface that lets users quickly sketch out their ideas.

There have been several burgeoning ideas in this space as well: RocketBoard, which lets users draw on a whiteboard to share thoughts in real time, FlockDraw, and many more.

Our app seeks to do the same, but in a unique way.

Rather than relying on mice or touch surfaces for input, our app will take input from the user via a camera. Using real-time hand tracking, we can follow the motions of the hand to simulate drawing on a virtual whiteboard.

We can come up with a set of gestures to represent each action:

  • An index finger pointed to the camera will represent the drawing state.
  • A palm moving will cause the current white board view to move.
  • A closing fist will represent zooming out.
  • An opening fist will represent zooming in.

These gestures are hypothetical for the moment and will likely change based on user feedback.

Objectives

Objective 1: Preprocessing

  • Compute background
  • Background subtraction
  • Convert to YCrCb
  • Threshold on each channel
  • Morphology (erode and dilate) on each channel
  • Merge channels
  • Bitwise AND with the original frame
  • Threshold final image
  • Morphology (erode and dilate) final image
  • Gaussian Blur
  • Result: A binary image of the hand with most background noise removed (see the sketch below)
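
A minimal sketch of how this pipeline could be wired up with OpenCV is given below. It assumes a precomputed background frame and placeholder threshold values; the function name and constants are illustrative rather than the project's actual implementation.

// preprocess.cpp -- illustrative sketch of the Objective 1 pipeline.
#include <opencv2/opencv.hpp>
#include <vector>

// Returns a binary mask of the hand, given the current frame and a
// precomputed static background (both BGR). Thresholds are placeholders.
cv::Mat preprocess(const cv::Mat& frame, const cv::Mat& background) {
    // Background subtraction.
    cv::Mat diff;
    cv::absdiff(frame, background, diff);

    // Convert to YCrCb, then threshold and erode/dilate each channel.
    cv::Mat ycrcb;
    cv::cvtColor(diff, ycrcb, cv::COLOR_BGR2YCrCb);
    std::vector<cv::Mat> channels;
    cv::split(ycrcb, channels);

    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5));
    for (auto& ch : channels) {
        cv::threshold(ch, ch, 40, 255, cv::THRESH_BINARY);    // placeholder threshold
        cv::erode(ch, ch, kernel);
        cv::dilate(ch, ch, kernel);
    }

    // Merge the channel masks and AND with the original frame.
    cv::Mat merged, masked;
    cv::merge(channels, merged);
    cv::bitwise_and(frame, merged, masked);

    // Threshold the final image, apply morphology, and blur to smooth edges.
    cv::Mat gray, binary;
    cv::cvtColor(masked, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, binary, 30, 255, cv::THRESH_BINARY);  // placeholder threshold
    cv::erode(binary, binary, kernel);
    cv::dilate(binary, binary, kernel);
    cv::GaussianBlur(binary, binary, cv::Size(5, 5), 0);

    return binary;   // binary image of the hand with most background noise removed
}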

Objective 2: Detect hand motion

  • Contour extraction
  • Polygon approximation
  • Max inscribed circle and radius
  • Find region of interest
  • Convex hull
  • Convexity defects
  • K-curvature
  • Min enclosing circle
  • Result: Contours, convexity defects, and circles computed from the hand image produced in Objective 1 (see the sketch below)
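
Below is a minimal sketch of how these features could be extracted with OpenCV, assuming the binary hand image from Objective 1 as input. The K-curvature step is omitted, and the struct and function names are illustrative only.

// detect.cpp -- illustrative sketch of the Objective 2 feature extraction.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <vector>

struct HandFeatures {
    std::vector<cv::Point> contour;    // largest contour (assumed to be the hand)
    std::vector<cv::Point> polygon;    // polygon approximation of the contour
    std::vector<cv::Vec4i> defects;    // convexity defects (candidate finger valleys)
    cv::Point2f palmCenter;            // center of the max inscribed circle
    float palmRadius = 0.f;
    cv::Point2f enclosingCenter;       // min enclosing circle of the whole hand
    float enclosingRadius = 0.f;
};

HandFeatures detectHand(const cv::Mat& binary) {
    HandFeatures f;

    // Contour extraction: keep the largest external contour by area.
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(binary.clone(), contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return f;
    f.contour = *std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });

    // Polygon approximation.
    cv::approxPolyDP(f.contour, f.polygon, 5.0, true);         // epsilon is a placeholder

    // Max inscribed circle: interior point farthest from the contour boundary,
    // searched on a coarse grid over the region of interest for speed.
    cv::Rect roi = cv::boundingRect(f.contour);
    for (int y = roi.y; y < roi.y + roi.height; y += 4) {
        for (int x = roi.x; x < roi.x + roi.width; x += 4) {
            double d = cv::pointPolygonTest(f.contour, cv::Point2f((float)x, (float)y), true);
            if (d > f.palmRadius) {
                f.palmRadius = (float)d;
                f.palmCenter = cv::Point2f((float)x, (float)y);
            }
        }
    }

    // Convex hull (as indices) and convexity defects.
    std::vector<int> hullIdx;
    cv::convexHull(f.contour, hullIdx, false);
    if (hullIdx.size() > 3)
        cv::convexityDefects(f.contour, hullIdx, f.defects);

    // Min enclosing circle.
    cv::minEnclosingCircle(f.contour, f.enclosingCenter, f.enclosingRadius);

    return f;
}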

Objective 3: Gesture Recognition (TBA)

  • Simple heuristics to detect static gestures
  • For example:
    • 0-1 fingers detected, then closed palm
    • 4-5 fingers detected, then open palm
    • 1 finger detected and no thumb, then pointing
  • More research to be done here
  • Result: Coordinates of where the gesture occurs and a flag corresponding to the gesture (see the sketch below)
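
A minimal sketch of these heuristics is shown below. The finger count and thumb flag are assumed to come from the Objective 2 features (e.g. counted from convexity defects); the enum and function names are placeholders.

// gesture.cpp -- illustrative sketch of the Objective 3 heuristics.
#include <opencv2/core.hpp>

enum class Gesture { ClosedPalm, OpenPalm, Pointing, Unknown };

struct GestureResult {
    Gesture gesture = Gesture::Unknown;
    cv::Point2f position;              // e.g. the palm center from Objective 2
};

// fingerCount and thumbDetected are placeholders for outputs of the
// hand-detection stage; the thresholds mirror the heuristics listed above.
GestureResult classify(int fingerCount, bool thumbDetected, cv::Point2f palmCenter) {
    GestureResult r;
    r.position = palmCenter;
    if (fingerCount == 1 && !thumbDetected)
        r.gesture = Gesture::Pointing;      // one finger, no thumb: pointing
    else if (fingerCount <= 1)
        r.gesture = Gesture::ClosedPalm;    // 0-1 fingers: closed palm
    else if (fingerCount >= 4)
        r.gesture = Gesture::OpenPalm;      // 4-5 fingers: open palm
    return r;
}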

Objective 4: OpenGL Whiteboard

  • Design a modular system for gesture recognition
  • Link up the x, y coordinates to drawing on an OpenGL canvas
  • Implement an interface for multiple brushes, colors, and the like
  • Result: A GUI showing an OpenGL whiteboard (built on Qt as a backend); see the sketch below for the underlying model
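
The sketch below covers only the whiteboard model, i.e. how gesture results might be mapped onto strokes and a pan offset; the actual OpenGL/Qt rendering and the modular gesture-recognition interface are omitted. The gesture enum mirrors the hypothetical one from the Objective 3 sketch, and all names are illustrative.

// whiteboard.cpp -- illustrative sketch of the whiteboard state behind Objective 4.
#include <opencv2/core.hpp>
#include <vector>

enum class Gesture { ClosedPalm, OpenPalm, Pointing, Unknown };  // as in the Objective 3 sketch

class Whiteboard {
public:
    // Feed one recognized gesture per frame, in camera coordinates.
    void update(Gesture gesture, cv::Point2f position) {
        switch (gesture) {
        case Gesture::Pointing:                       // index finger: draw
            if (!drawing_) strokes_.emplace_back();   // start a new stroke
            strokes_.back().push_back(toCanvas(position));
            drawing_ = true;
            panning_ = false;
            break;
        case Gesture::OpenPalm:                       // moving palm: pan the view
            if (panning_) offset_ += position - lastPalm_;
            lastPalm_ = position;
            panning_ = true;
            drawing_ = false;
            break;
        default:
            drawing_ = false;
            panning_ = false;
            break;
        }
    }

    const std::vector<std::vector<cv::Point2f>>& strokes() const { return strokes_; }

private:
    // Map camera coordinates into canvas coordinates, accounting for the pan offset.
    cv::Point2f toCanvas(cv::Point2f camera) const { return camera - offset_; }

    std::vector<std::vector<cv::Point2f>> strokes_;   // what the OpenGL canvas would render
    cv::Point2f offset_{0.f, 0.f};
    cv::Point2f lastPalm_{0.f, 0.f};
    bool drawing_ = false;
    bool panning_ = false;
};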

Objective 5: HTML5 Canvas (optional)

  • Link up the OpenGL whiteboard in step 4 to HTML5 Canvas
  • Make the updating real time
  • Optimize the speed
  • Launch on a server (most likely heroku)
  • Result: Real time mirroring to a server of the image

Schedule

Our implementation will likely use C++, OpenCV, and OpenGL. Our goal for the final project is to have a working demo of the whiteboard with the above gestures implemented and, if time permits, to present the whiteboard in real time online.

13 Nov:

  • Objective 1

20 Nov:

  • Objective 2

2 Dec:

  • Objective 3
  • Objective 4
  • Objective 5 (maybe)

Videos
