
threescanner

3D scanning library which creates a point cloud using structured light, a method of scanning involving a projector and a camera.

With this software you can get a three-dimensional point cloud of a real object.

You only need a camera (even a webcam) and a projector.

You can use the provided scanner+projector applications or include the library in your own project using the API.

Projector and scanner run in separate processes and communicate via TCP, so they can live on different computers.
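
The sketch below is only an illustration of that split: a hypothetical scanner process asking a hypothetical projector process to show a pattern over a plain TCP socket. It is not threescanner's actual protocol; the host, port and messages are made up.

// Hypothetical scanner side of a projector/scanner TCP link.
// This is NOT threescanner's wire protocol; it only illustrates why the
// two processes can run on different machines.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <iostream>
#include <string>

int main() {
    int sock = ::socket(AF_INET, SOCK_STREAM, 0);

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(4444);                       // assumed port
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);   // projector host

    if (::connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) != 0) {
        std::cerr << "cannot reach projector\n";
        return 1;
    }

    // Ask the projector to display pattern 0, then wait for an ack before
    // grabbing a frame from the camera.
    std::string cmd = "show 0\n";
    ::send(sock, cmd.data(), cmd.size(), 0);

    char reply[64] = {};
    ::recv(sock, reply, sizeof(reply) - 1, 0);
    std::cout << "projector replied: " << reply;

    ::close(sock);
    return 0;
}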

Scanner and projector are implementation-agnostic, so different algorithms can be used.

At the moment only Kyle McDonald's ThreePhase algorithm is implemented, but it is possible to add other implementations, such as binary codes, Gray codes, etc.
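
For a feel of what a three-phase implementation computes, each pixel's wrapped phase can be recovered from three captures of sinusoidal patterns shifted by 120°, using the standard three-step phase-shifting relation. The sketch below uses OpenCV types for brevity and is an assumption for illustration, not code from this library:

// Sketch of wrapped-phase recovery from three phase-shifted captures.
// The formula is the standard three-step phase-shifting relation; the
// function name and the use of OpenCV are assumptions, not this library's API.
#include <opencv2/core.hpp>
#include <cmath>

// img1..img3: 8-bit grayscale captures of the scene lit by sinusoidal
// patterns shifted by 120 degrees. Returns the wrapped phase in (-pi, pi]
// per pixel; phase unwrapping and depth conversion are separate steps.
cv::Mat wrappedPhase(const cv::Mat& img1, const cv::Mat& img2, const cv::Mat& img3) {
    cv::Mat phase(img1.size(), CV_32F);
    for (int y = 0; y < img1.rows; ++y) {
        for (int x = 0; x < img1.cols; ++x) {
            float i1 = img1.at<uchar>(y, x);
            float i2 = img2.at<uchar>(y, x);
            float i3 = img3.at<uchar>(y, x);
            phase.at<float>(y, x) =
                std::atan2(std::sqrt(3.0f) * (i1 - i3), 2.0f * i2 - i1 - i3);
        }
    }
    return phase;
}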

THIS PROJECT ISN'T USABLE YET

What is still missing for a working scanner:

  • threephase: synchronization between the scanner (camera) and the projector
  • a GUI

Build

Dependencies

To build:

make thirdparty # download and build dependencies
make

If you want to build documentation, tests, examples, etc., you also need:

Then you can build with:

make all

Development

Build debug version:

make DEBUG=1 all

Test scripts are in test/scripts/; automatic tests start with auto_ and can be run with make test.

Unit tests are in test/unit and can be run with make unit_test.
You need to install gmock first.

What is structured light?

Structured light is a method of 3D scanning in which a known pattern is projected onto an unknown surface; by analyzing the deformation (warping) of that pattern, the surface can be reconstructed mathematically.

Imagine a room full of perfectly matte white objects. If you put a projector and a camera in the room, and project a pattern such that every column (or row) has a unique color, then you can create a correspondence between what the projector "sees", and what the camera sees. This correspondence allows you to triangulate the position of every projected pixel and determine its depth. If you project one frame for every frame the camera captures, you can extrapolate 3D information at your camera's framerate.
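
To make the triangulation step concrete, the following small sketch intersects one camera pixel's viewing ray with the plane swept by its matched projector column. The numbers and the simple pinhole setup are made up for the example; this is not threescanner's API.

// Illustrative ray/plane triangulation for one camera pixel matched to a
// projector column. All values and names are assumptions for the example:
// a calibrated pinhole camera at the origin and a projector offset along x.
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

static double dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

int main() {
    // Viewing ray of the camera pixel (camera center is the origin).
    Vec3 rayDir = {0.1, 0.0, 1.0};

    // The matched projector column sweeps a plane through space: it contains
    // the projector center and is spanned by the column's outgoing direction
    // and the vertical axis.
    Vec3 projCenter = {0.2, 0.0, 0.0};
    Vec3 columnDir  = {-0.1, 0.0, 1.0};
    Vec3 normal     = cross(columnDir, Vec3{0.0, 1.0, 0.0});

    // Intersect the camera ray t * rayDir with that plane: n . (x - c) = 0.
    double t = dot(normal, projCenter) / dot(normal, rayDir);
    Vec3 p = {t * rayDir[0], t * rayDir[1], t * rayDir[2]};

    std::printf("triangulated point: (%.3f, %.3f, %.3f)\n", p[0], p[1], p[2]);
    return 0;
}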

Unfortunately, rooms are not generally filled with matte white objects. What's more, camera-projector synchronization can be difficult without dedicated hardware, and most cameras distort the scene to some degree. But there are lots of techniques for projecting more information onto the scene in order to overcome these limitations.

References:
