MRPT  1.9.9
COpenNI2_RGBD360.h
/* +------------------------------------------------------------------------+
   |                Mobile Robot Programming Toolkit (MRPT)                 |
   |                         http://www.mrpt.org/                           |
   |                                                                        |
   | Copyright (c) 2005-2018, Individual contributors, see AUTHORS file     |
   | See: http://www.mrpt.org/Authors - All rights reserved.                |
   | Released under BSD License. See details in http://www.mrpt.org/License |
   +------------------------------------------------------------------------+ */
#ifndef mrpt_OPENNI2_RGBD360_H
#define mrpt_OPENNI2_RGBD360_H

#include <mrpt/config/CConfigFileBase.h>
#include <mrpt/gui/CDisplayWindow.h>
#include <mrpt/hwdrivers/CGenericSensor.h>
#include <mrpt/hwdrivers/COpenNI2Generic.h>
#include <mrpt/obs/CObservationRGBD360.h>
#include <mrpt/poses/CPose3D.h>

namespace mrpt::hwdrivers
{
/** A class for grabbing RGBD images from several OpenNI2 sensors. This is
 * used to obtain larger fields of view using a radial configuration of the
 * sensors. The same options (resolution, fps, etc.) are used for every
 * sensor.
 *
 * <h2>Configuration and usage:</h2> <hr>
 * Data is returned as observations of type mrpt::obs::CObservationRGBD360.
 * See that class for documentation on its fields.
 *
 * As with any other CGenericSensor class, the normal sequence of methods to
 * be called is:
 * - CGenericSensor::loadConfig() - Or calls to the individual setXXX() to
 *   configure the sensor parameters.
 * - COpenNI2_RGBD360::initialize() - to start the communication with the
 *   sensor.
 * - call COpenNI2_RGBD360::getNextObservation() for getting the data.
 *
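 * A minimal usage sketch following the sequence above (illustrative only;
 * error handling omitted):
 *
 * \code
 * mrpt::hwdrivers::COpenNI2_RGBD360 cam;
 * // ... call cam.loadConfig(...) or the individual set*() methods ...
 * cam.initialize();
 * bool there_is_obs = true, hardware_error = false;
 * while (there_is_obs && !hardware_error)
 * {
 *     mrpt::obs::CObservationRGBD360 obs;
 *     cam.getNextObservation(obs, there_is_obs, hardware_error);
 *     // ... process obs here ...
 * }
 * \endcode
 *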
 * <h2>Calibration parameters</h2><hr>
 * The reference system for both the depth and RGB images provided by each
 * individual OpenNI2 sensor is that of its RGB camera.
 * The extrinsic parameters of each RGBD sensor are provided from a
 * configuration file. This calibration was obtained using the method
 * reported in [].
 *
 * <h2>Coordinates convention</h2><hr>
 * The origin of coordinates is the focal point of the RGB camera of the
 * first indexed sensor, with the axes oriented as in the diagram shown in
 * mrpt::obs::CObservation3DRangeScan. Notice in that picture that the RGB
 * camera is assumed to have axes as usual in computer vision, which differ
 * from those for the depth camera.
 *
 * The X,Y,Z axes used to report the data from accelerometers coincide with
 * those of the depth camera (e.g. the camera standing on a table would have
 * an ACC_Z = -9.8 m/s2).
 *
 * Notice however that, for consistency with stereo cameras, when loading
 * the calibration parameters from a configuration file, the left-to-right
 * pose increment is expected as if both RGB & IR cameras had their +Z axes
 * pointing forward, +X to the right, +Y downwards (just like the standard
 * in stereo cameras and in the computer vision literature). In other words:
 * the pose stored in this class uses a different axes convention for the
 * depth camera than in a stereo camera, so when a pose L2R is loaded from a
 * calibration file it is actually converted like:
 *
 * L2R(this class convention) = CPose3D(0,0,0,-90deg,0deg,-90deg) (+)
 * L2R(in the config file)
 *
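 * A sketch of that conversion, assuming a CPose3D L2R_from_config_file has
 * already been loaded (the variable names are illustrative; DEG2RAD is
 * MRPT's degrees-to-radians helper):
 *
 * \code
 * using mrpt::poses::CPose3D;
 * // CPose3D ctor: (x, y, z, yaw, pitch, roll), angles in radians
 * const CPose3D axes_change(0, 0, 0, DEG2RAD(-90.0), 0, DEG2RAD(-90.0));
 * const CPose3D L2R_this_class = axes_change + L2R_from_config_file;
 * \endcode
 *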
 * <h2>Some general comments</h2><hr>
 * - Depth is grabbed in 10-bit depth, and a raw range reading N is
 *   converted to meters as: range(m) = 0.1236 * tan(N/2842.5 + 1.1863)
 * - This sensor can also be used from within rawlog-grabber to grab
 *   datasets within a robot with more sensors.
 * - There is no built-in threading support, so if you use this class
 *   manually (not within rawlog-grabber), the ideal would be to create a
 *   thread and continuously request data from that thread (see
 *   mrpt::system::createThread).
 * - The intensity channel defaults to the RGB images, but it can be changed
 *   with setVideoChannel() to read the IR camera images (useful for
 *   calibrating).
 * - There is built-in support for an optional preview of the data in a
 *   window, so you don't even need to worry about creating a window to show
 *   them.
 * - This class relies on an embedded version of libfreenect (you do NOT
 *   need to install it in your system). Thanks guys for the great job!
 *
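 * For reference, the 10-bit raw-to-meters depth conversion listed above can
 * be sketched in plain C++ (the function name is illustrative):
 *
 * \code
 * #include <cmath>
 * // raw: the 10-bit depth reading N
 * double rawDepthToMeters(unsigned raw)
 * {
 *     return 0.1236 * std::tan(raw / 2842.5 + 1.1863);
 * }
 * \endcode
 *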
 * <h2>Converting to 3D point cloud</h2><hr>
 * You can convert the 3D observation into a 3D point cloud with this piece
 * of code:
 *
 * \code
 * mrpt::obs::CObservationRGBD360 obs3D;
 * mrpt::maps::CColouredPointsMap pntsMap;
 * pntsMap.colorScheme.scheme = CColouredPointsMap::cmFromIntensityImage;
 * pntsMap.loadFromRangeScan(obs3D);
 * \endcode
 *
 * Then the point cloud mrpt::maps::CColouredPointsMap can be converted into
 * an OpenGL object for rendering with mrpt::maps::CMetricMap::getAs3DObject()
 * or alternatively with:
 *
 * \code
 * mrpt::opengl::CPointCloudColoured::Ptr gl_points =
 *     mrpt::make_aligned_shared<mrpt::opengl::CPointCloudColoured>();
 * gl_points->loadFromPointsMap(&pntsMap);
 * \endcode
 *
 *
 * <h2>Platform-specific comments</h2><hr>
 * For more details, refer to the <a href="http://openkinect.org/wiki/Main_Page"
 * >libfreenect</a> documentation:
 * - Linux: You'll need root privileges to access the Kinect. Or, install
 *   <code>MRPT/scripts/51-kinect.rules</code> in
 *   <code>/etc/udev/rules.d/</code> to allow access to all users.
 * - Windows:
 *   - Since MRPT 0.9.4 you'll only need to install <a
 *     href="http://sourceforge.net/projects/libusb-win32/files/libusb-win32-releases/"
 *     >libusb-win32</a>: download and extract the latest
 *     libusb-win32-bin-x.x.x.x.zip
 *   - To install the drivers, read this:
 *     http://openkinect.org/wiki/Getting_Started#Windows
 * - MacOS: (write me!)
 *
 *
 * <h2>Format of parameters for loading from a .ini file</h2><hr>
 *
 * \code
 * PARAMETERS IN THE ".INI"-LIKE CONFIGURATION STRINGS:
 * -------------------------------------------------------
 * [supplied_section_name]
 * sensorLabel    = RGBD360   // A text description
 * preview_window = false     // Show a window with a preview of the
 *                            // grabbed data in real-time
 *
 * device_number  = 0         // Device index to open (0:first Kinect,
 *                            // 1:second Kinect,...)
 *
 * grab_image     = true      // Grab the RGB image channel? (Default=true)
 * grab_depth     = true      // Grab the depth channel? (Default=true)
 * grab_3D_points = true      // Grab the 3D point cloud? (Default=true)
 *                            // If disabled, points can be generated later on.
 *
 * video_channel  = VIDEO_CHANNEL_RGB  // Optional. Can be:
 *                            // VIDEO_CHANNEL_RGB (default) or VIDEO_CHANNEL_IR
 *
 * pose_x=0       // Camera position in the robot (meters)
 * pose_y=0
 * pose_z=0
 * pose_yaw=0     // Angles in degrees
 * pose_pitch=0
 * pose_roll=0
 *
 *
 * // Left/Depth camera
 * [supplied_section_name_LEFT]
 * rawlog-grabber-ignore = true  // Instructs rawlog-grabber to ignore this
 *                               // section (it is not a separate device!)
 *
 * resolution = [640 488]
 * cx         = 314.649173
 * cy         = 240.160459
 * fx         = 572.882768
 * fy         = 542.739980
 * dist       = [-4.747169e-03 -4.357976e-03 0.000000e+00 0.000000e+00
 *               0.000000e+00]  // The order is: [K1 K2 T1 T2 K3]
 *
 * // Right/RGB camera
 * [supplied_section_name_RIGHT]
 * rawlog-grabber-ignore = true  // Instructs rawlog-grabber to ignore this
 *                               // section (it is not a separate device!)
 *
 * resolution = [640 480]
 * cx         = 322.515987
 * cy         = 259.055966
 * fx         = 521.179233
 * fy         = 493.033034
 * dist       = [5.858325e-02 3.856792e-02 0.000000e+00 0.000000e+00
 *               0.000000e+00]  // The order is: [K1 K2 T1 T2 K3]
 *
 * // Relative pose of the right camera wrt the left camera:
 * // This assumes that both camera frames are such that +Z points
 * // forwards, and +X and +Y to the right and downwards.
 * // For the actual coordinates employed in 3D observations, see the figure
 * // in mrpt::obs::CObservation3DRangeScan
 * [supplied_section_name_LEFT2RIGHT_POSE]
 * rawlog-grabber-ignore = true  // Instructs rawlog-grabber to ignore this
 *                               // section (it is not a separate device!)
 *
 * pose_quaternion = [0.025575 -0.000609 -0.001462 0.999987 0.002038
 *                    0.004335 -0.001693]
 *
 * \endcode
 *
 * More references to read:
 * - http://RGBD360
 * - http://www.openni.org/
 * \ingroup mrpt_hwdrivers_grp
 */
class COpenNI2_RGBD360 : public mrpt::hwdrivers::CGenericSensor,
						 public mrpt::hwdrivers::COpenNI2Generic
{
	DEFINE_GENERIC_SENSOR(COpenNI2_RGBD360)

   public:
	/** Default ctor */
	COpenNI2_RGBD360();
	/** Default dtor */
	virtual ~COpenNI2_RGBD360();

	/** Initializes the 3D camera - should be invoked after calling
	 * loadConfig() or setting the different parameters with the set*()
	 * methods.
	 * \exception This method must throw an exception with a descriptive
	 * message if some critical error is found.
	 */
	virtual void initialize();

	/** To be called at a high rate (>XX Hz), this method populates the
	 * internal buffer of received observations.
	 * This method is mainly intended for usage within rawlog-grabber or
	 * similar programs.
	 * For an alternative, see getNextObservation()
	 * \exception This method must throw an exception with a descriptive
	 * message if some critical error is found.
	 * \sa getNextObservation
	 */
	virtual void doProcess();

	/** The main data-retrieving function, to be called after calling
	 * loadConfig() and initialize().
	 * \param out_obs The output retrieved observation (only if
	 * there_is_obs=true).
	 * \param there_is_obs If set to false, there was no new observation.
	 * \param hardware_error True on hardware/comms error.
	 *
	 * \sa doProcess
	 */
	void getNextObservation(
		mrpt::obs::CObservationRGBD360& out_obs, bool& there_is_obs,
		bool& hardware_error);

	/** Set the path where to save off-rawlog image files (this class DOES
	 * take into account this path).
	 * An empty string (the default value at construction) means to save
	 * images embedded in the rawlog, instead of in separate files.
	 * \exception std::exception If the directory doesn't exist and cannot
	 * be created.
	 */
	virtual void setPathForExternalImages(const std::string& directory);

	/** @name Sensor parameters (alternative to \a loadConfig ) and manual
		control
		@{ */

	/** Get the maximum range (meters) that can be read in the observation
	 * field "rangeImage" */
	inline double getMaxRange() const { return m_maxRange; }

	/** Enable/disable the grabbing of the RGB channel */
	inline void enableGrabRGB(bool enable = true) { m_grab_rgb = enable; }
	inline bool isGrabRGBEnabled() const { return m_grab_rgb; }

	/** Enable/disable the grabbing of the depth channel */
	inline void enableGrabDepth(bool enable = true) { m_grab_depth = enable; }
	inline bool isGrabDepthEnabled() const { return m_grab_depth; }

	/** Enable/disable the grabbing of the 3D point clouds */
	inline void enableGrab3DPoints(bool enable = true)
	{
		m_grab_3D_points = enable;
	}
	inline bool isGrab3DPointsEnabled() const { return m_grab_3D_points; }

	/** @} */

   protected:
	virtual void loadConfig_sensorSpecific(
		const mrpt::config::CConfigFileBase& configSource,
		const std::string& section);

	mrpt::poses::CPose3D m_sensorPoseOnRobot;

	static const int NUM_SENSORS = 2;

	/** Show preview window while grabbing */
	bool m_preview_window;
	/** If preview is enabled, only show 1 out of N images. */
	size_t m_preview_window_decimation;
	size_t m_preview_decim_counter_range, m_preview_decim_counter_rgb;
	mrpt::gui::CDisplayWindow::Ptr m_win_range[NUM_SENSORS],
		m_win_int[NUM_SENSORS];

	/** Sensor max range (meters) */
	double m_maxRange;

	/** Default: all true */
	bool m_grab_rgb, m_grab_depth, m_grab_3D_points;

};  // End of class
}  // namespace mrpt::hwdrivers
#endif