MRPT  2.0.0
COpenNI2_RGBD360.h
/* +------------------------------------------------------------------------+
   |                Mobile Robot Programming Toolkit (MRPT)                  |
   |                         https://www.mrpt.org/                           |
   |                                                                         |
   | Copyright (c) 2005-2020, Individual contributors, see AUTHORS file      |
   | See: https://www.mrpt.org/Authors - All rights reserved.                |
   | Released under BSD License. See: https://www.mrpt.org/License           |
   +------------------------------------------------------------------------+ */
#pragma once

#include <mrpt/config/CConfigFileBase.h>
#include <mrpt/gui/CDisplayWindow.h>
#include <mrpt/hwdrivers/CGenericSensor.h>
#include <mrpt/hwdrivers/COpenNI2Generic.h>
#include <mrpt/obs/CObservationRGBD360.h>
#include <mrpt/poses/CPose3D.h>

namespace mrpt::hwdrivers
{
/** A class for grabbing RGBD images from several OpenNI2 sensors. This is
 * used to obtain larger fields of view using a radial configuration of the
 * sensors. The same options (resolution, fps, etc.) are used for every
 * sensor.
 *
 * <h2>Configuration and usage:</h2> <hr>
 * Data is returned as observations of type mrpt::obs::CObservationRGBD360.
 * See that class for documentation on its fields.
 *
 * As with any other CGenericSensor class, the normal sequence of methods to
 * be called is:
 * - CGenericSensor::loadConfig() - Or calls to the individual setXXX() to
 *   configure the sensor parameters.
 * - COpenNI2_RGBD360::initialize() - to start the communication with the
 *   sensor.
 * - call COpenNI2_RGBD360::getNextObservation() to get the data.
 *
 * <h2>Calibration parameters</h2><hr>
 * The reference system for both the depth and RGB images provided by each
 * individual OpenNI2 sensor is that of its RGB camera.
 * The extrinsic parameters of each RGBD sensor are provided from a
 * configuration file. This calibration was obtained using the method
 * reported in [].
 *
 * <h2>Coordinates convention</h2><hr>
 * The origin of coordinates is the focal point of the RGB camera of the
 * first indexed sensor, with the axes oriented as in the diagram shown in
 * mrpt::obs::CObservation3DRangeScan. Notice in that picture that the RGB
 * camera is assumed to have axes as usual in computer vision, which differ
 * from those for the depth camera.
 *
 * The X,Y,Z axes used to report the data from accelerometers coincide with
 * those of the depth camera (e.g. the camera standing on a table would have
 * an ACC_Z=-9.8m/s2).
 *
 * Notice however that, for consistency with stereo cameras, when loading
 * the calibration parameters from a configuration file, the left-to-right
 * pose increment is expected as if both RGB & IR cameras had their +Z axes
 * pointing forward, +X to the right, +Y downwards (as is standard in
 * stereo cameras and in the computer vision literature). In other words:
 * the pose stored in this class uses a different axes convention for the
 * depth camera than a stereo camera would, so when a pose L2R is loaded
 * from a calibration file it is actually converted like:
 *
 * L2R(this class convention) = CPose3D(0,0,0,-90deg,0deg,-90deg) (+)
 * L2R(in the config file)
 *
 *
 * <h2>Some general comments</h2><hr>
 * - Depth is grabbed in 10bit resolution, and a raw range reading N is
 *   converted to meters as: range(m) = 0.1236 * tan(N/2842.5 + 1.1863)
 * - This sensor can also be used from within rawlog-grabber to grab
 *   datasets within a robot with more sensors.
 * - There is no built-in threading support, so if you use this class
 *   manually (not within rawlog-grabber), the ideal would be to create a
 *   thread and continuously request data from that thread (see
 *   mrpt::system::createThread).
 * - The intensity channel defaults to the RGB images, but it can be
 *   changed with setVideoChannel() to read the IR camera images (useful
 *   for calibrating).
 * - There is built-in support for an optional preview of the data in a
 *   window, so you don't even need to worry about creating a window to
 *   show them.
 * - This class relies on an embedded version of libfreenect (you do NOT
 *   need to install it in your system). Thanks guys for the great job!
 *
 * <h2>Converting to 3D point cloud </h2><hr>
 * You can convert the 3D observation into a 3D point cloud with this piece
 * of code:
 *
 * \code
 * mrpt::obs::CObservationRGBD360 obs3D;
 * mrpt::maps::CColouredPointsMap pntsMap;
 * pntsMap.colorScheme.scheme = CColouredPointsMap::cmFromIntensityImage;
 * pntsMap.loadFromRangeScan(obs3D);
 * \endcode
 *
 * Then the point cloud mrpt::maps::CColouredPointsMap can be converted into
 * an OpenGL object for rendering with mrpt::maps::CMetricMap::getAs3DObject()
 * or alternatively with:
 *
 * \code
 * mrpt::opengl::CPointCloudColoured::Ptr gl_points =
 *     mrpt::opengl::CPointCloudColoured::Create();
 * gl_points->loadFromPointsMap(&pntsMap);
 * \endcode
 *
 *
 * <h2>Platform-specific comments</h2><hr>
 * For more details, refer to the <a href="http://openkinect.org/wiki/Main_Page"
 * >libfreenect</a> documentation:
 * - Linux: You'll need root privileges to access the Kinect. Alternatively,
 *   install <code>MRPT/scripts/51-kinect.rules</code> in
 *   <code>/etc/udev/rules.d/</code> to allow access to all users.
 * - Windows:
 *   - Since MRPT 0.9.4 you'll only need to install <a
 *     href="http://sourceforge.net/projects/libusb-win32/files/libusb-win32-releases/"
 *     >libusb-win32</a>: download and extract the latest
 *     libusb-win32-bin-x.x.x.x.zip
 *   - To install the drivers, read this:
 *     http://openkinect.org/wiki/Getting_Started#Windows
 * - MacOS: (write me!)
 *
 *
 * <h2>Format of parameters for loading from a .ini file</h2><hr>
 *
 * \code
 * PARAMETERS IN THE ".INI"-LIKE CONFIGURATION STRINGS:
 * -------------------------------------------------------
 * [supplied_section_name]
 * sensorLabel     = RGBD360   // A text description
 * preview_window  = false     // Show a window with a preview of the
 *                             // grabbed data in real-time
 *
 * device_number   = 0         // Device index to open (0:first Kinect,
 *                             // 1:second Kinect,...)
 *
 * grab_image      = true      // Grab the RGB image channel? (Default=true)
 * grab_depth      = true      // Grab the depth channel? (Default=true)
 * grab_3D_points  = true      // Grab the 3D point cloud? (Default=true)
 *                             // If disabled, points can be generated later on.
 *
 * video_channel   = VIDEO_CHANNEL_RGB // Optional. Can be:
 *                             // VIDEO_CHANNEL_RGB (default) or VIDEO_CHANNEL_IR
 *
 * pose_x=0        // Camera position in the robot (meters)
 * pose_y=0
 * pose_z=0
 * pose_yaw=0      // Angles in degrees
 * pose_pitch=0
 * pose_roll=0
 *
 *
 * // Left/Depth camera
 * [supplied_section_name_LEFT]
 * rawlog-grabber-ignore = true // Instructs rawlog-grabber to ignore this
 *                              // section (it is not a separate device!)
 *
 * resolution = [640 488]
 * cx         = 314.649173
 * cy         = 240.160459
 * fx         = 572.882768
 * fy         = 542.739980
 * dist       = [-4.747169e-03 -4.357976e-03 0.000000e+00 0.000000e+00 0.000000e+00]
 *              // The order is: [K1 K2 T1 T2 K3]
 *
 * // Right/RGB camera
 * [supplied_section_name_RIGHT]
 * rawlog-grabber-ignore = true // Instructs rawlog-grabber to ignore this
 *                              // section (it is not a separate device!)
 *
 * resolution = [640 480]
 * cx         = 322.515987
 * cy         = 259.055966
 * fx         = 521.179233
 * fy         = 493.033034
 * dist       = [5.858325e-02 3.856792e-02 0.000000e+00 0.000000e+00 0.000000e+00]
 *              // The order is: [K1 K2 T1 T2 K3]
 *
 * // Relative pose of the right camera wrt the left camera:
 * // This assumes that both camera frames are such that +Z points
 * // forwards, and +X and +Y to the right and downwards.
 * // For the actual coordinates employed in 3D observations, see the
 * // figure in mrpt::obs::CObservation3DRangeScan
 * [supplied_section_name_LEFT2RIGHT_POSE]
 * rawlog-grabber-ignore = true // Instructs rawlog-grabber to ignore this
 *                              // section (it is not a separate device!)
 *
 * pose_quaternion = [0.025575 -0.000609 -0.001462 0.999987 0.002038 0.004335 -0.001693]
 *
 * \endcode
 *
 * More references to read:
 * - http://RGBD360
 * - http://www.openni.org/
 * \ingroup mrpt_hwdrivers_grp
 */
class COpenNI2_RGBD360 : public mrpt::hwdrivers::CGenericSensor,
                         public mrpt::hwdrivers::COpenNI2Generic
{
    DEFINE_GENERIC_SENSOR(COpenNI2_RGBD360)

   public:
    /** Default ctor */
    COpenNI2_RGBD360();
    /** Default dtor */
    ~COpenNI2_RGBD360() override;

    /** Initializes the 3D camera - should be invoked after calling
     * loadConfig() or setting the different parameters with the set*()
     * methods.
     * \exception This method must throw an exception with a descriptive
     * message if some critical error is found.
     */
    void initialize() override;

    /** To be called at a high rate (>XX Hz), this method populates the
     * internal buffer of received observations.
     * This method is mainly intended for usage within rawlog-grabber or
     * similar programs.
     * For an alternative, see getNextObservation()
     * \exception This method must throw an exception with a descriptive
     * message if some critical error is found.
     * \sa getNextObservation
     */
    void doProcess() override;

    /** The main data retrieving function, to be called after calling
     * loadConfig() and initialize().
     * \param out_obs The output retrieved observation (only if
     * there_is_obs=true).
     * \param there_is_obs If set to false, there was no new observation.
     * \param hardware_error True on hardware/comms error.
     *
     * \sa doProcess
     */
    void getNextObservation(
        mrpt::obs::CObservationRGBD360& out_obs, bool& there_is_obs,
        bool& hardware_error);

    /** Set the path where to save off-rawlog image files (this class DOES
     * take into account this path).
     * An empty string (the default value at construction) means to save
     * images embedded in the rawlog, instead of in separate files.
     * \exception std::exception If the directory doesn't exist and cannot
     * be created.
     */
    void setPathForExternalImages(const std::string& directory) override;

    /** @name Sensor parameters (alternative to \a loadConfig ) and manual
        control
        @{ */

    /** Get the maximum range (meters) that can be read in the observation
     * field "rangeImage" */
    inline double getMaxRange() const { return m_maxRange; }
    /** Enable/disable the grabbing of the RGB channel */
    inline void enableGrabRGB(bool enable = true) { m_grab_rgb = enable; }
    inline bool isGrabRGBEnabled() const { return m_grab_rgb; }
    /** Enable/disable the grabbing of the depth channel */
    inline void enableGrabDepth(bool enable = true) { m_grab_depth = enable; }
    inline bool isGrabDepthEnabled() const { return m_grab_depth; }
    /** Enable/disable the grabbing of the 3D point clouds */
    inline void enableGrab3DPoints(bool enable = true)
    {
        m_grab_3D_points = enable;
    }
    inline bool isGrab3DPointsEnabled() const { return m_grab_3D_points; }
    /** @} */

   protected:
    void loadConfig_sensorSpecific(
        const mrpt::config::CConfigFileBase& configSource,
        const std::string& section) override;

    mrpt::poses::CPose3D m_sensorPoseOnRobot;

    static const int NUM_SENSORS = 2;

    /** Show preview window while grabbing */
    bool m_preview_window{false};
    /** If preview is enabled, only show 1 out of N images. */
    size_t m_preview_window_decimation;

    mrpt::gui::CDisplayWindow::Ptr m_win_range[NUM_SENSORS];
    mrpt::gui::CDisplayWindow::Ptr m_win_int[NUM_SENSORS];

    /** Sensor max range (meters) */
    double m_maxRange;

    /** Default: all true */
    bool m_grab_rgb{true}, m_grab_depth{true}, m_grab_3D_points{true};

};  // End of class
}  // namespace mrpt::hwdrivers
Page generated by Doxygen 1.8.14 for MRPT 2.0.0 Git: b38439d21 Tue Mar 31 19:58:06 2020 +0200 at mié abr 1 00:50:30 CEST 2020