DataPacks are simple objects which wrap around arbitrary data structures, like JSON objects or protobuf messages. They provide the abstract interface that is understood by all components of NRP-Core, while still allowing data to be passed in various formats.
A DataPack consists of two parts:
- a DataPackIdentifier, which uniquely identifies the DataPack, and
- the data itself, an arbitrary data structure wrapped by the DataPack.
DataPacks are mainly used by Transceiver functions to relay data between engines. Each engine type is designed to accept only datapacks of a certain type and structure. To discover which datapacks can be processed by each engine, check out the engine's documentation here.
Every datapack contains a DataPackIdentifier, which uniquely identifies the datapack object and allows for routing of the data between transceiver functions, engine clients and engine servers. A datapack identifier consists of three fields:
- name: the name of the datapack
- type: the type of the data stored by the datapack
- engine_name: the name of the engine the datapack is linked to
These fields can be accessed from the transceiver functions:
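A minimal sketch, assuming the identifier fields are exposed as the attributes name, type and engine_name of the datapack object:

```python
# Inside a Transceiver Function body, for a datapack received as input:
print(datapack.name)         # name of the datapack
print(datapack.type)         # type of the wrapped data
print(datapack.engine_name)  # engine the datapack is linked to
```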
DataPack is a template class with a single template parameter, which specifies the type of the data contained by the DataPack. In principle, this data can be of any type. In practice there are some limitations, since DataPacks, which are C++ objects, must be accessible from TransceiverFunctions, which are written in Python. Therefore the only DataPack data types that can actually be used in NRP-core are those for which Python bindings are provided. These are described below.
In TransceiverFunctions, the DataPack data can always be accessed using the datapack "data" attribute.
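For example (a sketch, for a datapack passed to the function as an input argument):

```python
# Access the wrapped data through the "data" attribute
print(datapack.data)
```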
It is possible for a datapack to contain no data. This is useful for example when an Engine is asked for a certain DataPack but it is not able to provide it. In this case, an Engine can return an empty datapack. This type of datapack contains only a datapack identifier and no data.
Attempting to retrieve the data from an empty DataPack will result in an exception. A method "isEmpty" is provided to check whether a DataPack is empty or not before attempting to access its data:
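A sketch of the check, assuming isEmpty is exposed as a method of the datapack object:

```python
if not datapack.isEmpty():
    # The datapack carries data, so it is safe to access it
    print(datapack.data)
else:
    # Empty datapack: only the identifier is available
    print(datapack.name)
```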
It may happen that certain DataPacks available to Preprocessing and Transceiver Functions are not updated on every simulation iteration; this will be the case, for example, whenever the engines taking part in the simulation run at different frequencies. Processing the same data multiple times may be wasteful. DataPack objects therefore have a flag, called isUpdated, which allows checking whether they contain the most recent data. It is set to true only on the simulation iteration on which the DataPack was created (for example, returned from a Preprocessing Function) or received (for example, from an Engine). The flag can be accessed from Preprocessing, Transceiver, and Status Functions in the following way:
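A sketch, assuming isUpdated is exposed as a read-only attribute of the datapack object:

```python
if datapack.isUpdated:
    # The datapack holds data created or received on this simulation iteration
    print(datapack.data)
```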
DataPacks are both the input and output of TransceiverFunctions. When a datapack is declared as input of a TF, it is always requested from the corresponding Engine when the latter is synchronized. When a datapack is returned by a TF, it is sent to the corresponding Engine after the TF is executed. For more information about the synchronization model used in NRP-core, the reader can refer to these sections:
The subsections below elaborate on the details of how to use DataPacks in TFs.
DataPacks can be declared as TransceiverFunction inputs using the dedicated decorator. After that they can be accessed in TFs as input arguments.
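A minimal sketch, assuming the EngineDataPack and TransceiverFunction decorators and the DataPackIdentifier class are exposed by the nrp_core Python module; the datapack and engine names are placeholders:

```python
from nrp_core import *

# Request the datapack "datapack_name" from engine "engine_1" and pass it
# to the function as the keyword argument "datapack"
@EngineDataPack(keyword='datapack', id=DataPackIdentifier('datapack_name', 'engine_1'))
@TransceiverFunction("engine_2")
def transceiver_function(datapack):
    print(datapack.data)
    return []
```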
By default, all input DataPacks are passed to Transceiver Functions by value, i.e. the function receives a copy of the original DataPack object. This is a safety measure implemented to prevent accidental modifications of DataPacks available to other Functions and Engines. The policy can be changed by setting the DataPackPassingPolicy configuration parameter.
DataPacks can be returned from the transceiver function.
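A sketch of returning a datapack to an Engine, assuming JsonDataPack is available from the nrp_core.data.nrp_json module and using placeholder names:

```python
from nrp_core import *
from nrp_core.data.nrp_json import JsonDataPack

@TransceiverFunction("engine_1")
def transceiver_function():
    datapack = JsonDataPack("datapack_name", "engine_1")
    datapack.data["message"] = "hello"
    # Returned datapacks are sent to their corresponding Engines
    return [datapack]
```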
As commented in the section above, DataPacks are both the input and output of TFs. Therefore, a conversion mechanism between C++ and Python is required for each supported DataPack data type. The types currently supported are nlohmann::json and protobuf messages. The subsections below give details of the Python API provided for each of these types.
The JsonDataPack type wraps around nlohmann::json C++ objects. The Python class wrapping the C++ JSON object is NlohmannJson, which is stored in the JsonDataPack data attribute. NlohmannJson is very flexible and allows most types of data to be passed between engines and transceiver functions without writing any additional code; it can contain all basic Python types. NlohmannJson also has partial support for numpy arrays: it is possible to use 1-dimensional arrays of integers and floats.
To import JsonDataPack:
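A sketch, assuming the bindings live in the nrp_core.data.nrp_json module:

```python
from nrp_core.data.nrp_json import JsonDataPack
```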
To create a JsonDataPack object:
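A sketch, assuming the constructor takes the datapack name and the name of the Engine it is linked to (placeholder names below):

```python
datapack = JsonDataPack("datapack_name", "engine_name")
```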
Inside transceiver functions the data can be accessed like a python dictionary:
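For example, using a datapack created as above:

```python
datapack.data["string"] = "a string"
datapack.data["int"] = 42
datapack.data["float"] = 3.14
datapack.data["bool"] = True
datapack.data["list"] = [1, 2, 3]
datapack.data["null"] = None

print(datapack.data["int"])
```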
Numpy arrays:
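One-dimensional integer and float arrays can be assigned directly (a sketch, assuming they are converted to JSON arrays on assignment):

```python
import numpy as np

datapack.data["array_int"] = np.array([1, 2, 3])
datapack.data["array_float"] = np.array([1.5, 2.5, 3.5])
```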
Printing the content using Python's built-in function str:
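For example:

```python
print(str(datapack.data))
```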
Getting a list of keys:
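A sketch, assuming the wrapper exposes a dictionary-like keys() method:

```python
keys = datapack.data.keys()
```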
Getting length of the object:
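Using Python's built-in len:

```python
length = len(datapack.data)
```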
The above will return the number of keys if data is a JSON object, or the number of elements if it is a JSON array.
In all the examples above it has been assumed that JsonDataPack stores a JSON object. In fact, the data object can contain a JSON object, a JSON array or an empty object, depending on how it is populated. After instantiation, it contains an empty object:
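```python
datapack = JsonDataPack("datapack_name", "engine_name")
print(datapack.data)  # prints an empty object: {}
```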
If data is appended to it, the datapack stores a JSON array:
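A sketch, assuming the wrapper exposes an append method:

```python
datapack.data.append(1)
datapack.data.append("two")
print(datapack.data)  # prints a JSON array: [1,"two"]
```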
If instead a key is assigned, it stores a JSON object:
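```python
datapack.data["key"] = "value"
print(datapack.data)  # prints a JSON object: {"key":"value"}
```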
Please be aware that methods specific to the 'object' type for accessing or setting elements will raise an error when used with an 'array' type, and vice versa.
In contrast with JsonDataPack, which can wrap any nlohmann::json C++ object, a Python wrapper class is generated for each Protobuf message definition. For example, for the Camera message used by the Gazebo Engine, a Python class GazeboCameraDataPack is generated.
This class contains a data attribute which is of type GazeboCamera and gives access to the wrapped datapack data. The generated Python classes match the original Protobuf Python API as described in the protobuf documentation. There are some known limitations with respect to the original Protobuf Python API which are listed below with references to the protobuf documentation:
Finally, these Python wrappers are automatically generated in the NRP-core build process for the Protobuf message definitions used by Engines shipped with NRP-core. These can be found in the folder nrp-core-msgs/protobuf/engine_proto_defs. See this guide for how to compile additional messages so that they become available to Engines and TFs.
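For illustration, a minimal sketch of accessing a generated Protobuf wrapper from a TF; the module path nrp_core.data.nrp_protobuf and the Camera field names used below (imageHeight, imageWidth, imageData) are assumptions and should be checked against the actual message definition:

```python
from nrp_core.data.nrp_protobuf import GazeboCameraDataPack

camera = GazeboCameraDataPack("camera", "gazebo")

# Fields of the wrapped Protobuf message are reached through the data attribute
camera_height = camera.data.imageHeight  # assumed field name
camera_width = camera.data.imageWidth    # assumed field name
raw_image = camera.data.imageData        # assumed field name
```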
A set of generic DataPacks is provided with NRP Core. These DataPacks make it possible to exchange most of the data types available in protobuf, and they can be used to pass arrays of arbitrary size. The data part of the generic DataPacks has the following structure:
With the asterisk replacing concrete types.
Another generic DataPack type that is provided with NRP Core is the image DataPack, based on the NrpGenericProto::Image type:
The definitions of the protobuf messages used by the DataPacks can be found in the messages repository, in protobuf/engine_proto_defs/nrp_generic.proto.
A subset of wrappers that mimic the well-known protobuf types is also available. Unfortunately, at the moment it is not possible to use the official wrappers directly, and one must use the wrappers provided with NRP Core. The messages can be found in the messages repository, in protobuf/engine_proto_defs/wrappers.proto.
An example use of the generic DataPacks, as well as the wrappers, can be found in the examples/generic_proto_test directory.
Similarly to Protobuf datapacks, a Python wrapper class is generated for each ROS msg definition used in NRP-Core. For example, for a message of type Pose from the package geometry_msgs, a Python class PoseDataPack is generated. This class contains a data attribute which is of type Pose and gives access to the wrapped datapack data. The generated Python bindings can be found under the Python module nrp_core.data.nrp_ros. For example, in the case of the Pose message:
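A minimal sketch, using the module path given above; the datapack and engine names are placeholders:

```python
from nrp_core.data.nrp_ros.geometry_msgs import PoseDataPack

pose = PoseDataPack("pose_datapack", "engine_name")
# The data attribute wraps a geometry_msgs/Pose message
pose.data.position.x = 1.0
pose.data.orientation.w = 1.0
```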
By default, Python bindings are generated for all message types in the following ROS packages:
Any message definition contained in these packages can be used in TransceiverFunctions directly. It is also possible to generate Python bindings for messages defined in other ROS packages. More information can be found here.
NOTE: At this moment there is no Engine implementation which can use ROS datapacks. The only way of interacting with other ROS nodes in NRP-Core is by using a Computational Graph in your experiment, instead of Transceiver Functions, to relay data between Engines. See this page for more information in this regard.
All concrete datapack classes should be based on the DataPack class. It is a template class, and its single template argument specifies the type of the data structure that will be held by the class instances.
The DataPack class design is somewhat similar to that of std::unique_ptr. Whenever a datapack object is constructed, it takes ownership of the input data structure. This structure may then be accessed and modified, or the ownership may be released.
The DataPack class inherits from DataPackInterface. This class may also be instantiated, but the object will not carry any data (i.e., it is an empty DataPack).
A DataPack class is considered empty when its data is released. Every instance of the base class, DataPackInterface, is also considered empty, because there is no data stored in it.
In order to be accessible to transceiver functions, a conversion mechanism between C++ and Python must be specified for each DataPack data type. Currently NRP-core provides Python bindings for nlohmann::json and protobuf messages. In case you wished to integrate a different data type, you would have to implement Python bindings for this type and make them available to NRP-core as a Python module.