OpenNI Cookbook
- Length: 324 pages
- Edition: 1
- Language: English
- Publisher: Packt Publishing
- Publication Date: 2013-07-26
- ISBN-10: 1849518467
- ISBN-13: 9781849518468
Learn how to write NIUI-based applications and motion-controlled games
Overview
- Learn more about the Natural Interaction features of OpenNI
- Useful for both beginners and professionals, covering concepts from the most basic to the most advanced in OpenNI
- Full of illustrations, examples, and tips for understanding different aspects of each topic, with clear step-by-step instructions to get different parts of OpenNI working for you
In Detail
The release of the Microsoft Kinect, followed by the PrimeSense Sensor and the Asus Xtion, opened new doors for developers to interact with users, redesign their applications' UIs, and make them environment (context) aware. For this purpose, developers need a good framework that provides a complete application programming interface (API), and OpenNI is the first choice in this field. This book introduces the new version of OpenNI.
“OpenNI Cookbook” will show you how to start developing a Natural Interaction UI for your applications or games using high-level APIs, while also accessing raw data from the different sensors of OpenNI-supported hardware using low-level APIs. It also covers expanding OpenNI by writing new modules and extending applications with different OpenNI-compatible middleware, including NiTE.
“OpenNI Cookbook” favors practical examples over plain theory, giving you a more hands-on experience to help you learn. It starts with installing devices and retrieving raw data from them, and then shows how to use this data in applications. Through examples, you will learn how to access a device, read data from it, and display it using OpenGL, or use middleware (especially NiTE) to track and recognize users and hands, and estimate the skeleton of a person in front of the device. You will also learn about more advanced topics, such as how to write a simple module or middleware for OpenNI itself.
“OpenNI Cookbook” shows you how to start and experiment with both NIUI designs and OpenNI itself using examples.
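To give a feel for the low-level side of this workflow, here is a minimal sketch (not taken from the book) that opens a device and reads a single depth frame using the OpenNI 2 C++ API; error handling is kept deliberately brief.

```cpp
#include <OpenNI.h>
#include <cstdio>

int main()
{
    // Initialize OpenNI and open the first available device
    if (openni::OpenNI::initialize() != openni::STATUS_OK)
    {
        printf("Initialization failed: %s\n", openni::OpenNI::getExtendedError());
        return 1;
    }

    openni::Device device;
    if (device.open(openni::ANY_DEVICE) != openni::STATUS_OK)
    {
        printf("Could not open device: %s\n", openni::OpenNI::getExtendedError());
        return 1;
    }

    // Create and start a depth stream on the device
    openni::VideoStream depth;
    depth.create(device, openni::SENSOR_DEPTH);
    depth.start();

    // Read one frame and print the depth value (in millimeters) at the center pixel
    openni::VideoFrameRef frame;
    if (depth.readFrame(&frame) == openni::STATUS_OK)
    {
        const openni::DepthPixel* pixels =
            (const openni::DepthPixel*)frame.getData();
        int center = frame.getHeight() / 2 * frame.getWidth() + frame.getWidth() / 2;
        printf("Depth at center pixel: %d mm\n", pixels[center]);
    }

    depth.stop();
    depth.destroy();
    device.close();
    openni::OpenNI::shutdown();
    return 0;
}
```

Building on this kind of raw access, the recipes then move on to middleware such as NiTE for higher-level tracking.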
What you will learn from this book
- Retrieve and use depth, vision, and audio from compatible devices
- Get basic information about the environment
- Recognize hands, humans, and their skeleton and track their moves
- Customize frames right from the device itself
- Identify basic gestures like pushing or swiping
- Select between devices or use more than one device to read data
- Recognize pre-defined hand gestures and detect user poses
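As a rough illustration of the gesture and hand-tracking items above, here is a minimal sketch (again, not taken from the book) that uses the NiTE 2 C++ API to detect a wave gesture and then track the hand that performed it.

```cpp
#include <NiTE.h>
#include <cstdio>

int main()
{
    // Initialize NiTE (this also initializes OpenNI internally)
    if (nite::NiTE::initialize() != nite::STATUS_OK)
        return 1;

    // Create a hand tracker on the default device and listen for a wave gesture
    nite::HandTracker handTracker;
    if (handTracker.create() != nite::STATUS_OK)
        return 1;
    handTracker.startGestureDetection(nite::GESTURE_WAVE);

    for (int i = 0; i < 300; ++i)   // roughly 10 seconds at 30 fps
    {
        nite::HandTrackerFrameRef frame;
        if (handTracker.readFrame(&frame) != nite::STATUS_OK)
            continue;

        // When a wave completes, start tracking the hand that performed it
        const nite::Array<nite::GestureData>& gestures = frame.getGestures();
        for (int g = 0; g < gestures.getSize(); ++g)
        {
            if (gestures[g].isComplete())
            {
                nite::HandId id;
                handTracker.startHandTracking(gestures[g].getCurrentPosition(), &id);
            }
        }

        // Print the position of every tracked hand
        const nite::Array<nite::HandData>& hands = frame.getHands();
        for (int h = 0; h < hands.getSize(); ++h)
        {
            if (hands[h].isTracking())
            {
                const nite::Point3f& p = hands[h].getPosition();
                printf("Hand %d at (%.0f, %.0f, %.0f)\n",
                       hands[h].getId(), p.x, p.y, p.z);
            }
        }
    }

    nite::NiTE::shutdown();
    return 0;
}
```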
Approach
This is a Cookbook with plenty of practical recipes enriched with explained code and plenty of screenshots to ease your learning curve.
Who this book is written for
If you are a beginner or a professional in NIUI and want to write serious applications or games, then this book is for you. Even OpenNI 1.x programmers who want to move to the new version of OpenNI can use this book as a starting point.
This book uses C++ as the primary language, but there are some examples in C# and Java too, so you need a basic working knowledge of C or C++ in most cases.
Table of Contents
Chapter 1: Getting Started
Chapter 2: OpenNI and C++
Chapter 3: Using Low-level Data
Chapter 4: More about Low-level Outputs
Chapter 5: NiTE and User Tracking
Chapter 6: NiTE and Hand Tracking
Chapter 7: NiTE and Skeleton Tracking