|Project|Stars|Last Commit|Total Releases|Latest Release|Open Issues|License|Language|Description|
|---|---|---|---|---|---|---|---|---|
|Azure Kinect Sensor SDK|1,281|5 months ago|26|June 16, 2020|291|MIT|C++|A cross-platform (Linux and Windows) user-mode SDK to read data from your Azure Kinect device.|
|iai_kinect2|736|2 years ago|||138|Apache-2.0|C++|Tools for using the Kinect One (Kinect v2) in ROS.|
|LiveScan3D|689|2 months ago|||32|MIT|C++|A system for real-time 3D reconstruction using multiple Azure Kinect or Kinect v2 depth sensors simultaneously.|
|VolumetricCapture|313|2 years ago|||||Python|A multi-sensor capture system for free viewpoint video.|
|pyk4a|248|a month ago|13|May 29, 2022|11|MIT|Python|Python 3 wrapper for Azure-Kinect-Sensor-SDK.|
|ofxKinectV2|185|3 years ago|||6||C++|An addon for the new Kinect For Windows V2 sensor.|
|K4A.Net|154|5 months ago|13|January 24, 2023|8|MIT|C#|Three-in-one .NET library for Azure Kinect devices (also known as Kinect for Azure, K4A, Kinect v4): sensor API, recording and playback API, and body tracking API. Samples for WPF, .NET Core and Unity are included.|
|KinectDressingRoom|102|12 years ago|||||C#|A virtual dressing room in Unity using the Kinect sensor, an adaptable body mesh and full clothing simulation.|
|AR Sandbox|83|4 years ago|||2|other|C#|Augmented sandbox with Unity3D and Kinect.|
|Azure Kinect DK Unity|74|4 years ago|||9||C#|Azure Kinect C# wrapper compatible with both the Sensor SDK and the Body Tracking SDK, and a Unity sample project using it.|
LiveScan3D is a system designed for real-time 3D reconstruction using multiple Azure Kinect or Kinect v2 depth sensors simultaneously. The code for working with Kinect v2 is in the master branch and the v1.x.x releases. If you want to work with Azure Kinect, please use the appropriately named branch.
For both sensors the produced 3D reconstruction is in the form of a coloured point cloud, with points from all of the Kinects placed in the same coordinate system. The point cloud stream can be visualized, recorded or streamed to a HoloLens or any Unity application. The code for streaming to Unity and HoloLens is available in the LiveScan3D-Hololens repository.
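Placing every sensor's points in one coordinate system amounts to applying each Kinect's calibrated pose to its local point cloud. A minimal sketch of that merging step, assuming per-sensor extrinsics `(R, t)` produced by the calibration; this is an illustrative outline, not LiveScan3D's actual code, and any per-point colour values would simply travel alongside the transformed coordinates:

```python
import numpy as np

def merge_point_clouds(clouds, extrinsics):
    """Transform each sensor's points into the shared world frame and concatenate.

    clouds     -- list of (N_i, 3) arrays, one per Kinect, in sensor coordinates
    extrinsics -- list of (R, t) pairs mapping sensor i's frame to the world frame
    """
    merged = []
    for points, (R, t) in zip(clouds, extrinsics):
        # p_world = R @ p_sensor + t, applied to all rows at once
        merged.append(points @ R.T + t)
    return np.vstack(merged)
```

For example, a second sensor rotated 180° about the vertical axis and placed two metres away would map its point `(1, 0, 0)` to `(-1, 0, 2)` in the shared frame.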
You will also find a short video presentation of LiveScan3D on YouTube.
In our system, each sensor is governed by a separate instance of a client app connected to a server. The client apps can run on separate machines or all on the same machine (the latter only for Azure Kinect). The server allows the user to perform calibration, filtering and synchronized frame capture, and to visualize the acquired point cloud live.
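The synchronized-capture round trip described above (the server triggers all clients at once, each client answers with its sensor's frame) can be sketched with plain TCP sockets. This is an illustrative toy under assumed names, not LiveScan3D's actual wire protocol: the `capture` message and the `capture_round` helper are hypothetical.

```python
import socket
import threading

HOST = "127.0.0.1"

def client_loop(port, frame):
    """One per-sensor client: connect, wait for the trigger, reply with a frame."""
    with socket.create_connection((HOST, port)) as sock:
        if sock.recv(16) == b"capture":
            sock.sendall(frame)

def capture_round(n_clients, frames):
    """Server side: accept all clients, then run one synchronized capture."""
    server = socket.socket()
    server.bind((HOST, 0))          # port 0: let the OS pick a free port
    server.listen(n_clients)
    port = server.getsockname()[1]
    threads = [threading.Thread(target=client_loop, args=(port, f))
               for f in frames]
    for t in threads:
        t.start()
    conns = [server.accept()[0] for _ in range(n_clients)]
    for conn in conns:              # broadcast the trigger to every client
        conn.sendall(b"capture")
    results = [conn.recv(1024) for conn in conns]
    for conn in conns:
        conn.close()
    for t in threads:
        t.join()
    server.close()
    return results
```

Broadcasting a single trigger to already-connected clients is what keeps the per-sensor frames aligned in time, rather than having each client stream frames independently.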
To start working with our software you will need a Windows machine and at least one Kinect device. You can either build LiveScan3D from source, which requires Visual Studio 2019, or download the binary release. Both the binary and source distributions contain a manual (in the docs directory) with the steps necessary to start working with our software; setup won't take more than a couple of minutes.
If you have any problems feel free to contact us: Marek Kowalski [email protected], Jacek Naruniec [email protected]. We usually answer emails quickly (our timezone is CET).
For details regarding the methods used in LiveScan3D you can take a look at our article: LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors.
While all of our code is licensed under the MIT license, the third-party libraries we use are distributed under their own licenses.
If you use this software in your research, then please use the following citation:
Kowalski, M.; Naruniec, J.; Daniluk, M.: "LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors". in 3D Vision (3DV), 2015 International Conference on, Lyon, France, 2015