Intel® RealSense™ Cross Platform API
In addition to streaming video and other data from devices and sensors, the RealSense SDK provides the ability to record a live streaming session to a file. The recorded files can later be loaded by the SDK to create a device with "read-only" abilities of the recorded device (we will explain what "read-only" abilities mean later on). The SDK records a single device to a single rosbag file, using mostly standard ROS messages. This allows files recorded by the SDK to be replayed using any ROS tool or application.
For example recorded files, please see [Sample Data](../../doc/sample-data.md).
rs2::recorder
:exclamation: If you are not familiar with the basic streaming examples, please follow them before moving on
To enable recording of any device, simply create a rs2::recorder from it and provide a path to the desired output file:
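A minimal sketch, assuming a device is already connected and using an illustrative file name ("my_session.bag"):

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    // Create a context and pick the first connected device
    rs2::context ctx;
    rs2::device dev = ctx.query_devices().front(); // assumes at least one device is connected

    // Wrap the live device with a recorder; ".bag" is the rosbag file extension
    rs2::recorder recorder("my_session.bag", dev);

    // From here on, use 'recorder' exactly as you would use 'dev':
    // query its sensors, start streaming, etc. Streamed frames are written to the file.
    return 0;
}
```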
A recorder has the same functionality as a "real" device, with additional controls for recording, such as pausing and resuming the recording.
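For example, continuing the sketch above, a recording could be paused and resumed like this (a sketch; pausing stops writing newly arriving frames without stopping the sensors):

```cpp
recorder.pause();   // frames arriving while paused are not written to the file
// ... do something that should not be recorded ...
recorder.resume();  // subsequent frames are written to the file again
```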
rs2::playback
:exclamation: If you are not familiar with the basic streaming examples, please follow them before moving on
Recorded files can be loaded and used to create a playback device by simply loading the file into the context:
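A minimal sketch, assuming "my_session.bag" is a file previously recorded by the SDK:

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    rs2::context ctx;

    // Load the recorded file into the context; the returned device is a playback device
    rs2::playback playback = ctx.load_device("my_session.bag");

    // 'playback' can now be used like any other rs2::device (query sensors, start streaming, ...)
    return 0;
}
```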
The above code creates a playback device, which can be used like any other device, but has the obvious limitation of only playing back the recorded streams. Playback devices can be used to query information on the device and its sensors, and can be extended to whichever extensions the "real" device could. A playback device provides additional functionality such as seek, pause, resume, and playback speed.
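For instance, continuing the snippet above, seeking and pausing could look roughly like this (a sketch; names and values are illustrative):

```cpp
#include <chrono>

// Total length of the recording
std::chrono::nanoseconds duration = playback.get_duration();

// Jump to the middle of the recorded session
playback.seek(duration / 2);

// Temporarily stop the playback, then continue from the same position
playback.pause();
playback.resume();
```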
rs2::config with rs2::pipeline
The rs2::pipeline can be configured to record or play back a streaming session by providing it with an rs2::config of your choice:
Recording to file:
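A minimal sketch, assuming "my_session.bag" as the output path:

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    rs2::pipeline pipe;
    rs2::config cfg;

    // Ask the pipeline to record everything it streams to a file
    cfg.enable_record_to_file("my_session.bag");

    pipe.start(cfg); // streaming starts; frames are returned to the app and written to the file

    // Stream for a while; every frame that arrives is also recorded
    for (int i = 0; i < 30; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        // ... use 'frames' as usual ...
    }

    pipe.stop();
    return 0;
}
```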
Playing from file:
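And the matching playback sketch, reading frames back from the same (illustrative) file:

```cpp
#include <librealsense2/rs.hpp>

int main()
{
    rs2::pipeline pipe;
    rs2::config cfg;

    // Ask the pipeline to use the recorded file instead of a live device
    cfg.enable_device_from_file("my_session.bag");

    pipe.start(cfg); // frames are now read from the file

    for (int i = 0; i < 30; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        // ... process the recorded frames as if they came from a live device ...
    }

    pipe.stop();
    return 0;
}
```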
Among its many features, the RealSense Viewer allows recording a device and loading a file for playback.
To record a streaming session, simply click the "bars" icon next to the device name, choose "Record to File...", and select the destination for the file.
After choosing a file destination, a red dot will appear next to the device's name, indicating that it is recording. Starting a stream will save its frames to the file, and once all streams are stopped, recording will complete automatically.
To replay a file, click "Add Source", choose "Load Recorded Sequence", and select the file you want to play. Once you select the file, the Viewer will automatically add it to the list of sources, and a popup should appear stating that the file was loaded.
>Notice that devices that were loaded from file have a "movie" icon next to their name.
After loading the file, you can start streaming its streams, view its controls (with the values at the time of recording), pause the playback, choose the playback speed, and use the seek bar to navigate through frames.
A Device is a container of Sensors with some correlation between them (e.g. all sensors are on a single board, or sensors are mounted on a robot and share calibration information). A Sensor is a data-streaming object that provides one or more Streams. A Stream is a sequence of data items of a single data type, ordered according to their time of creation or arrival. The Sensor provides the Streams' frames to the user.
We call the device's sensors and streams the topology of the device.
Devices and Sensors can have Extensions that provide additional functionality. A Snapshot of an Extension is a snapshot of the data that is available from the extension at some point in time; it is a sort of "read-only" version of the extension. For example, say we have a DigitalClockExtension that can set and show the time. If we take a snapshot of that extension at noon, then whenever we ask the snapshot to show the time it will show "12:00", and trying to set its time will fail.
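The idea can be sketched in C++; DigitalClockExtension is the hypothetical example from the text, and the classes below are purely illustrative, not SDK types:

```cpp
#include <stdexcept>
#include <string>

// Hypothetical "live" extension: the time can be both shown and changed
class DigitalClockExtension
{
public:
    virtual ~DigitalClockExtension() = default;
    virtual std::string show_time() const { return m_time; }
    virtual void set_time(const std::string& time) { m_time = time; }
private:
    std::string m_time = "00:00";
};

// Snapshot of the extension: remembers the data at the moment it was taken, and is read-only
class DigitalClockSnapshot : public DigitalClockExtension
{
public:
    explicit DigitalClockSnapshot(const DigitalClockExtension& live)
        : m_frozen_time(live.show_time()) {}

    std::string show_time() const override { return m_frozen_time; } // always "12:00" if taken at noon
    void set_time(const std::string&) override
    {
        throw std::logic_error("Cannot set the time of a snapshot");
    }
private:
    std::string m_frozen_time;
};
```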
Finally, we will refer to an actual implementation of devices and sensors as "live" or "real" devices and sensors.
A rosbag file consists of data records. Each record written to the file is represented by a topic that identifies the data content, a message that holds the data itself, and a capture time. Any topic can be coupled with any message type; in our file, each topic is coupled with a single message type. However, different topics may share the same message type, such as properties. A bag file consists of a list of records, and each record can be one of the following types:
Each message that we write is stored in a chunk; each chunk holds a number of messages depending on the chunk size. In addition to the message data, the chunk also stores the connections of its messages; a connection describes the topic name and the message definition needed to interpret the message data. The flow to find a message on topic "A":
The SDK depends on ROS packages that are used for recording and playing back the bag file. The dependencies are:
Note: The above dependencies are embedded into the source files; there is no need for additional installations.
The rosbag file requires messages to be associated with Topics.
> Topics are named buses over which nodes exchange messages. [ROS Wiki]
The following table depicts the Topics that are supported by RealSense file format:
Topic | Published Name | Message Type | Description |
---|---|---|---|
File Version | /file_version | std_msgs/UInt32 | Version of the file format. A single message for the entire file |
Device Information | /device_<device_id>/info | diagnostic_msgs/KeyValue | Device information. Many messages to a single topic |
Sensor Information | /device_<device_id>/sensor_<sensor_id>/info | diagnostic_msgs/KeyValue | Sensor information. Many messages to a single topic |
Stream Information | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/info | realsense_msgs/StreamInfo | Generic stream information. A single message to a single topic |
Video Stream Info | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/info/camera_info | sensor_msgs/camera_info | Image information. A single message to a single topic |
IMU Intrinsics | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/info/imu_intrinsic | realsense_msgs/ImuIntrinsic | Intrinsics of a motion stream. A single message to a single topic |
Options | /device_<device_id>/sensor_<sensor_id>/option/<option name>/value <br> /device_<device_id>/sensor_<sensor_id>/option/<option name>/description | std_msgs/Float32 <br> std_msgs/String | Options of a sensor. "value" holds the value of the option, with many messages to a single topic; "description" is a human-readable description of the option, with a single message to a single topic |
Image Data | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/image/data | sensor_msgs/Image | The data of a single image. A single message to a single topic |
Image Information | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/image/metadata | diagnostic_msgs/KeyValue | Additional information for a single image. Many messages to a single topic |
IMU Data | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/imu/data | sensor_msgs/Imu | The data of a single IMU frame. A single message to a single topic |
IMU Information | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/imu/metadata | diagnostic_msgs/KeyValue | Additional information for a single IMU frame. Many messages to a single topic |
Pose Data | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/pose/{transform, accel, twist}/data | geometry_msgs/Transform <br> geometry_msgs/Accel <br> geometry_msgs/Twist | The pose data is split into three messages, each under a different topic: pose/transform holds the translation and rotation of the object, pose/accel holds its angular and linear acceleration, and pose/twist holds its angular and linear velocity |
Pose Information | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/pose/metadata | diagnostic_msgs/KeyValue | Additional information for a single pose frame. Many messages to a single topic |
Occupancy Map Data | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/occupancy_map/data | | |
Occupancy Map Information | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/occupancy_map/metadata | | Additional information for a single occupancy map frame. Many messages to a single topic |
Time Stream | /device_<device_id>/sensor_<sensor_id>/time | | |
Log | /log | | Log messages |
Stream Extrinsics | /device_<device_id>/sensor_<sensor_id>/<stream_type>_<stream_id>/tf/<group_index> | geometry_msgs/Transform | Extrinsic transformation between some point of reference (indexed by group_index) and the stream in the topic |
Additional Info | /additional_info | | Additional information of any kind. Can be useful for applications that require additional metadata on the recorded file (such as program name, version, etc.) |
Sensor Notification | /device_<device_id>/sensor_<sensor_id>/notification/<rs2_notification_category> | | |
ROS uses a simplified message description language for describing the data values (aka messages) that ROS nodes publish. This description makes it easy for ROS tools to automatically generate source code for the message type in several target languages. Message descriptions are stored in .msg files. There are two parts to a .msg file: fields and constants. Fields are the data that is sent inside of the message. Constants define useful values that can be used to interpret those fields (e.g. enum-like constants for an integer value).
In addition to the standard ROS messages, the SDK writes additional proprietary messages (available in the 3rd-party folder) for new data types that are recorded. The following are the new messages created by the SDK:
This message (realsense_msgs/StreamInfo) defines meta-information for a stream. The stream type is expressed in the topic name.
Name | Description | Format |
---|---|---|
is_recommended | Indicates if this stream is recommended by RealSense SDK | bool |
fps | Frames per second value | uint32 |
encoding | Stream's data format. Supported encodings are listed below | string |
For video streams, the supported encoding types can be found in the ROS documentation. Additional supported encodings are listed under rs_sensor.h as the rs2_format enumeration. Note that some of the encodings appear in both locations.
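Based on the fields listed above, the realsense_msgs/StreamInfo description would look roughly as follows (a sketch; the field order is assumed):

```
# realsense_msgs/StreamInfo.msg (sketch reconstructed from the table above)
bool   is_recommended   # whether this stream is recommended by the RealSense SDK
uint32 fps              # frames per second
string encoding         # the stream's data format
```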
This message (realsense_msgs/ImuIntrinsic) defines a motion device's intrinsics: scale, bias, and variances.
Name | Description | Format |
---|---|---|
data | Interpretation of the data array values, as a 3x4 row-major matrix: <br> Scale X, Cross Axis, Cross Axis, Bias X <br> Cross Axis, Scale Y, Cross Axis, Bias Y <br> Cross Axis, Cross Axis, Scale Z, Bias Z | float32[12] |
noise_variances | Variance of noise for the X, Y, and Z axes | float32[3] |
bias_variances | Variance of bias for the X, Y, and Z axes | float32[3] |
This message defines a notification.
Name | Description | Format |
---|---|---|
timestamp | The time of occurrence | |
category | Category of the notification, matching an rs2_notification_category | |
severity | Severity of the notification, matching an rs2_log_severity | |
description | Human-readable description of the notification | |
serialized_data | JSON string with additional data | std_msgs/String |
Each bag file recorded using this SDK should contain a version message. The version number is a single integer indicating the version of the topics and messages that were used to record the file. The above messages and topics reflect the current version. Changes from previous versions will appear at the end of this section.
Current file version: ***3***
Changes from previous version:
Recording is performed at the Device level, meaning that the device, its sensors, their streams' data (frames) and all extensions are saved to file. To allow for a seamless experience when switching between a live device and a record or playback device we save the device's topology and all of the extensions' snapshots to the file, in addition to the streaming frames.
A record device is a wrapper around a real device that delegates actions to the device and its sensors while recording new data in between actions. When a record device is created, a record sensor is created for each sensor of the real device. A record sensor records newly arriving frames for each of its streams, as well as changes to extensions' data (snapshots).
Recording related files are:
A librealsense::record_device is constructed with a "live" device and a device_serializer::writer. At the moment, the only device_serializer::writer we use is a ros_writer, which writes device information to a rosbag file.
When constructing a ros_writer, the requested file is created if it does not exist, and then opened for writing. In addition, a single message containing the RealSense file format version is written to the file. The ros_writer implements the device_serializer::writer interface, which has only 4 functions:
A device description (or device snapshot) is the set of data that is available for any librealsense::device. It contains all of the device's extension snapshots, and a collection of each of the device's sensors' extension snapshots. In addition, it holds a mapping of extrinsic information for all streams of the device.
A single librealsense::device_snapshot instance is enough to understand a device's topology: which extensions it supports, how many sensors it has, which extensions each sensor supports, which streams each sensor provides, and which streams share calibration information (extrinsics).
Each frame in the SDK implements the librealsense::frame_interface interface. This means that frames are polymorphic and represent all types of data that streams provide. Frames are recorded to the file with all of their additional information (such as metadata, timestamp, etc.), together with the time at which they arrived from the backend to the sensor.
Upon creation, the record device goes over all of the extensions of the real device and its sensors, and saves snapshots of those extensions to the file (this is the data that is passed to write_device_description(..)). These snapshots allow the playback device to recreate the topology of the recorded device, and serve as the initial data of their extensions. To allow record sensors to save changes to extensions' data over the lifetime of the program, when a user asks a record sensor for an extension, a record-able extension is provided. A record-able version of an extension holds an action to perform whenever the extension's data changes. This action is provided by the record device (or sensor), and requires extensions to pass a reference of themselves to the device, which will usually create a snapshot from them and record it to the file with the time at which it occurred.
The playback device is an implementation of the device interface which reads from a file to simulate a real device.
The playback device holds playback sensors which simulate real sensors.
When the playback device is created, it reads the initial device snapshot from the file in order to build itself and its sensors in terms of functionality and provided data. When creating each sensor, the device creates the sensor from the sensor's initial snapshot. Each sensor holds a single thread per stream, which is used to raise frames to the user. The playback device holds a single reading thread that reads the next frame in a loop and dispatches the frame to the relevant sensor.