Subscribing

BLISS makes it possible to run a separate process (at the system level) that retrieves the acquired data in order to save or process it.

Example 1: Save data in HDF5

Note

BLISS already comes with an HDF5 writer. This is just an example. The example script discussed here is provided in the Bliss repository at scripts/external_saving_example/external_saving_example.py. To set up a minimal working Bliss environment, have a look at the installation notes and the test_configuration setup.

Listening to a Bliss session

When running the script

python scripts/external_saving_example/external_saving_example.py

it listens for new scans in the Bliss test_session:

listen_to_session_wait_for_scans("test_session")

It connects to the node 'test_session' in Redis:

session_node = get_session_node(session)

The walk_on_new_events() function is used with include_filter="scan" (limiting the walk to nodes with node.type == "scan") in order to handle new events on scan nodes:

  • NEW_NODE when a new scan is launched
  • END_SCAN when a scan terminates.
# wait for new events on scan
for event_type, node, event_data in session_node.walk_on_new_events(
    include_filter="scan", from_next=True):
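The dispatch logic of this loop can be sketched in plain Python. The event names follow the list above; the event stream and the `Writer` class are hypothetical stand-ins for the tuples yielded by `walk_on_new_events()` and for the `HDF5_Writer` class used in the example script:

```python
# Sketch of the session-level dispatch loop, assuming a stream of
# (event_type, node, event_data) tuples like the one yielded by
# walk_on_new_events(include_filter="scan").
class Writer:
    """Hypothetical stand-in for HDF5_Writer."""

    def __init__(self, scan_node):
        self.scan_node = scan_node
        self.finalized = False

    def finalize(self):
        self.finalized = True


def dispatch(events):
    writers = {}
    for event_type, node, event_data in events:
        if event_type == "NEW_NODE":    # a new scan was launched
            writers[node] = Writer(node)
        elif event_type == "END_SCAN":  # a scan terminated
            writers[node].finalize()
    return writers


# Example event stream with two scans, only one of which has ended:
events = [
    ("NEW_NODE", "scan1", None),
    ("NEW_NODE", "scan2", None),
    ("END_SCAN", "scan1", None),
]
writers = dispatch(events)
```

In the real script the per-scan work happens in a separate greenlet, so the dispatch loop itself never blocks on a single scan.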

Receiving events from a scan

In the example script a new instance of the class HDF5_Writer is created for each scan that is started. Following the initialisation, a gevent greenlet is spawned to run the actual listener in a non-blocking way. Inside def run(self), a second iterator is started, walking through all events emitted by the scan (see the data structure section):

for event_type, node, event_data in self.scan_node.walk_events():
    ...

Hint

Data generated by scans in BLISS is emitted through a structure called channel.

Once an event is received it can be categorized by the event type:

  • NEW_NODE
  • NEW_DATA

and by node type:

  • channel
  • lima

Currently, 0d and 1d data is kept directly in Redis and published through channels (node.type == "channel"). For each new channel a corresponding HDF5 dataset is created and filled with data emitted on a "NEW_DATA" event.

2d data (e.g. lima images) is not saved in Redis itself, but can be retrieved through references (the get_image method of the LimaDataView class). However, for this example we do not want to deal with the 2d data itself, but only resolve the final saving destination.

In this example, data from channels (not lima) is written to the HDF5 file as soon as the event is received, while references to images are only saved in HDF5 once the scan has ended (although nothing prevents doing this on the fly as well).
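This writing policy can be sketched in plain Python, with simple containers standing in for the h5py datasets (class, method, and channel names here are illustrative, not the actual HDF5_Writer API):

```python
# Sketch of the writing policy described above: 0d/1d channel data is
# appended as soon as a NEW_DATA event arrives; lima image references
# are collected and only written out at END_SCAN.
class ScanWriter:
    def __init__(self):
        self.datasets = {}      # channel name -> list of values (stand-in for hdf5 datasets)
        self.image_refs = []    # pending lima references
        self.saved_refs = None  # written out on END_SCAN

    def on_event(self, event_type, node_type, name=None, data=None):
        if event_type == "NEW_NODE" and node_type == "channel":
            # create one dataset per channel
            self.datasets.setdefault(name, [])
        elif event_type == "NEW_DATA" and node_type == "channel":
            # write channel data immediately
            self.datasets[name].extend(data)
        elif node_type == "lima":
            # keep only a reference to the final saving destination
            self.image_refs.append(f"{name}:{data}")
        elif event_type == "END_SCAN":
            # references are saved only once the scan has ended
            self.saved_refs = list(self.image_refs)


w = ScanWriter()
w.on_event("NEW_NODE", "channel", "diode")
w.on_event("NEW_DATA", "channel", "diode", [1.0, 2.0])
w.on_event("NEW_DATA", "lima", "cam1", "/data/scan1.edf")
w.on_event("END_SCAN", None)
```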

Finalizing the scan dataset on “END_SCAN”

Once the END_SCAN event is received, the finalize method of the HDF5_Writer instance is called to

  1) stop listening to events of the corresponding scan,
  2) perform a final synchronization of all datasets of the scan, and
  3) write instrument and meta-data entries to HDF5.
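A minimal sketch of this finalize sequence, with hypothetical attribute names (not the actual HDF5_Writer API):

```python
# Sketch of the three finalize steps triggered by END_SCAN.
class Finalizer:
    def __init__(self, datasets):
        self.listening = True        # listener greenlet active
        self.datasets = datasets     # channel name -> buffered values
        self.synced = None
        self.metadata = None

    def finalize(self, scan_info):
        self.listening = False       # 1) stop listening to scan events
        # 2) final synchronization of all datasets of the scan
        self.synced = {name: len(values) for name, values in self.datasets.items()}
        # 3) write instrument and meta-data entries
        self.metadata = dict(scan_info)


f = Finalizer({"diode": [1, 2, 3]})
f.finalize({"title": "ascan"})
```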

Meta-data and Instrument dataset

Each scan has an attached scan_info structure (a nested dict) which contains, e.g., meta-data to be saved in the HDF5 file. Required Nexus attributes such as "NX_class" will be added automatically when missing.
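Filling in missing "NX_class" entries on a nested scan_info-like dict can be sketched as a simple recursion (the default class name and the example keys are illustrative, not necessarily what the BLISS writer uses):

```python
# Recursively add a default "NX_class" entry to every dict level of a
# nested scan_info-like structure that does not already define one.
def ensure_nx_class(group, default="NXcollection"):
    group.setdefault("NX_class", default)
    for value in group.values():
        if isinstance(value, dict):
            ensure_nx_class(value, default)
    return group


scan_info = {
    "instrument": {"NX_class": "NXinstrument", "slits": {"gap": 0.5}},
    "positioners": {"omega": 90.0},
}
ensure_nx_class(scan_info)
```

Existing attributes (such as "NXinstrument" above) are left untouched; only missing ones are filled in.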

Examples of complex scans

To provide test cases for more demanding scans when working with the presented API, a script file is provided that can be executed inside the Bliss shell:

    TEST_SESSION [1]:   exec(open("scripts/external_saving_example/some_scans.py").read())

The same scans can also be executed running Bliss in library mode (TANGO_HOST to be chosen according to the running server…):

BEACON_HOST=localhost TANGO_HOST=localhost:20000 python scripts/external_saving_example/some_scans_bliss_as_library.py