.. highlight:: rst

.. _sproc2caps:

##########
sproc2caps
##########

**RecordStream data acquisition plugin that applies a filter and/or a
mathematical expression to one or more data streams, forming new streams**

Description
===========

The sproc2caps plugin requests data from a |scname| :cite:t:`recordstream` in
real time or based on :ref:`time windows <sproc-tw>`,
:ref:`filters the data <sproc-filter>` and/or applies
:ref:`mathematical expressions <sproc-expressions>`. The processed data is sent
to a CAPS server or to stdout. Streams can be :ref:`renamed <sproc-rename>`.
Setup
=====

Streams
-------

The plugin reads the streams to subscribe to from a separate stream map file.
The location of the file can either be defined in the plugin configuration or
given as a command-line argument:

.. code-block:: bash

   streams.map = @DATADIR@/sproc2caps/streams.map

Each line of the stream map file defines n input streams and one output stream.
By definition, at least one input and one output stream must be given. The last
entry in a line is the output stream; all other entries are input streams.
Lines beginning with a comment character are ignored.

.. note::

   The map file is required even if the stream codes remain the same. Without
   an entry in the map file the input streams are not processed.

Example map file:

.. code-block:: bash

   # Input 1      Input 2        ...  Output
   XX.TEST1..HHZ  XX.TEST2..HHZ  ...  XX.TEST3..HHZ
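The line format can be split mechanically: the last whitespace-separated token
is the output stream and every preceding token is an input stream. A minimal,
hypothetical Python sketch of this rule (not part of sproc2caps):

.. code-block:: python

   # Hypothetical helper, not part of sproc2caps: split one stream-map
   # line into its input streams and the single output stream.
   def parse_map_line(line):
       line = line.strip()
       if not line or line.startswith("#"):
           return None  # empty or comment line: ignored
       *inputs, output = line.split()
       return inputs, output

   parse_map_line("XX.TEST1..HHZ XX.TEST2..HHZ XX.TEST3..HHZ")
   # → (["XX.TEST1..HHZ", "XX.TEST2..HHZ"], "XX.TEST3..HHZ")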
Each stream entry may contain additional stream options, e.g. for
:ref:`data filtering <sproc-filter>`. Options are indicated by "?".
The following stream options are supported:

====== =================================== ==================
Name   Description                         Example
====== =================================== ==================
filter Filter string                       filter=BW(4,0.7,2)
unit   Output unit                         unit=cm/s
expr   Expression to be used (output only) expr=x1+x2
====== =================================== ==================

Examples of streams with stream options:

.. code-block:: bash

   XX.TEST1..HHZ?filter=BW_HP(4,0.1)
   XX.TEST2..HHZ?filter=BW_HP(4,0.1)
   XX.TEST3..HHZ?filter=BW(4,0.7,2)?unit=cm/s
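The option syntax can likewise be split off at each "?". A small, hypothetical
Python sketch of this parsing (not part of the plugin):

.. code-block:: python

   # Hypothetical helper, not part of sproc2caps: split a stream entry
   # such as "XX.TEST3..HHZ?filter=BW(4,0.7,2)?unit=cm/s" into the
   # stream ID and a dictionary of its options.
   def parse_stream_entry(entry):
       stream_id, *opts = entry.split("?")
       options = {}
       for opt in opts:
           name, _, value = opt.partition("=")
           options[name] = value
       return stream_id, options

   parse_stream_entry("XX.TEST3..HHZ?filter=BW(4,0.7,2)?unit=cm/s")
   # → ("XX.TEST3..HHZ", {"filter": "BW(4,0.7,2)", "unit": "cm/s"})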
For the given example the plugin assigns the following access variables to
the streams. The access variables can be used in the mathematical expression
string. The *unit* option provides an additional description of the stream;
*unit* does not modify the stream.

Access variables for N input streams:

========= ============= ============= === =============
map,input input 1       input 2       ... input N
========= ============= ============= === =============
stream    XX.TEST1..HHZ XX.TEST2..HHZ ... XX.TESTN..HHZ
variable  x1            x2            ... xN
========= ============= ============= === =============

When the mathematical expression is evaluated, each xi is replaced with the
sample of the corresponding stream at the sample time. The maximum number of
input streams is 3.
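Conceptually, the evaluation walks over the aligned samples of all input
streams and substitutes them for x1 ... xN. sproc2caps evaluates expressions in
C++ (see :ref:`sproc-expressions`); the following Python sketch only
illustrates the per-sample substitution and is not the plugin's implementation:

.. code-block:: python

   # Conceptual illustration only: evaluate an expression per sample,
   # binding xi to the sample of input stream i at the same sample time.
   # (sproc2caps evaluates expressions in C++, not with Python eval.)
   def apply_expression(expr, *streams):
       out = []
       for samples in zip(*streams):
           scope = {"x%d" % (i + 1): v for i, v in enumerate(samples)}
           out.append(eval(expr, {"__builtins__": {}}, scope))
       return out

   apply_expression("x1 + x2", [1, 2, 3], [10, 20, 30])
   # → [11, 22, 33]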
.. _sproc-filter:

Filtering
---------

Input data can be filtered before :ref:`mathematical expressions <sproc-expressions>`
are applied. The filter grammar and all filters :cite:p:`filter-grammar` known
from |scname| can be used. By default, input data remain unfiltered.

Example for setting the filter in the map file:

.. code-block:: bash

   XX.TEST1..HHZ?filter=BW(4,0.7,2) XX.TEST2..HHZ XX.TEST3..HHZ
.. _sproc-expressions:

Expressions
-----------

The sproc2caps plugin uses the C++ Mathematical Expression Toolkit Library
(ExprTk) to evaluate mathematical expressions. The library supports a wide
range of mathematical expressions; the complete feature list can be found
here_. The number of input variables depends on the number of input streams.
The variables are numbered consecutively from 1 to n: x1, x2, ..., xn.

Example of how to multiply 3 streams:

- via command line:

  .. code-block:: bash

     --expr="x1*x2*x3"

- via configuration:

  .. code-block:: bash

     streams.expr = x1*x2*x3

- via stream options:

  .. code-block:: bash

     XX.TEST1..HHZ XX.TEST2..HHZ XX.TEST3..HHZ XX.TESTOUT..HHZ?expr=x1*x2*x3

.. _here: http://www.partow.net/programming/exprtk/
.. _sproc-rename:

Rename Streams
--------------

In addition to applying mathematical expressions to streams, the plugin can
also be used to rename streams. The following example shows how to map the
streams **GE.APE..BHE** and **GE.BKNI..BHE** to new stream IDs and store the
output streams in the same CAPS server:

1. Open the plugin configuration and create a clone of the input data stream
   with:

   .. code-block:: bash

      streams.expr = x1

#. Create the mapping file **@DATADIR@/sproc2caps/streams.map** with the
   following content:

   .. code-block:: bash

      # Input        Output
      GE.APE..BHE    AB.APE..BHE
      GE.BKNI..BHE   GE.BKNI2..BHE
.. _sproc-tw:

Time windows
------------

Set the time window using :option:`--begin` and :option:`--end` to define the
start and end times, respectively. When no time window is given, real-time
input data are processed.
Examples
========

#. To map waveform data for a specific time window, reading from a local CAPS
   server on localhost:18002 and sending to the plugin port of the same CAPS
   server on localhost:18003, run:

   .. code-block:: bash

      sproc2caps --begin "2019-01-01 00:00:00" --end "2019-01-01 01:00:00" -I "caps://localhost:18002" -a localhost:18003

   This will create duplicate data on the CAPS server if the map file renames
   the streams. To remove the original streams:

   1. Configure caps to keep the original data for 0 days.
   #. Restart or reload caps.

#. Read real-time data from an external SeedLink server, e.g. as with
   :cite:p:`slink2caps`, but applying the mapping:

   .. code-block:: bash

      sproc2caps -I "slink://host:18000" -a localhost:18003

#. Read data from the file *data.mseed*, resampled to a 10 Hz sample rate by
   the RecordStream, and write the resulting data to stdout. By applying
   :option:`--stop` the processing stops when the data has been read
   completely:

   .. code-block:: bash

      sproc2caps -I dec://file?rate=10/data.mseed -d localhost --gain-in 1 --gain-out 1 --dump-packets --mseed --begin "2000-01-01 00:00:00" --stop > test.mseed

   You may join the command with :cite:t:`capstool` and :cite:t:`scmssort`:

   .. code-block:: bash

      echo "2024,01,01,00,00,00 2024,01,01,00,10,00 * * * *" | capstool -H localhost |\
      sproc2caps -I dec://file?rate=10/- -d localhost --gain-in 1 --gain-out 1 --dump-packets --mseed --begin "2000-01-01 00:00:00" --stop |\
      scmssort -E > test.mseed

.. note::

   A similar action may be executed using :ref:`rs2caps`.
Module Configuration
====================

| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/sproc2caps.cfg`
| :file:`etc/global.cfg`
| :file:`etc/sproc2caps.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/sproc2caps.cfg`

sproc2caps inherits :ref:`global options <global-configuration>`.

.. note::

   Modules/plugins may require a license file. The default path to license
   files is :file:`@DATADIR@/licenses/` which can be overridden by the global
   configuration parameter :confval:`gempa.licensePath`. Example: ::

      gempa.licensePath = @CONFIGDIR@/licenses
.. _journal:

.. confval:: journal.file

   Default: ``@ROOTDIR@/var/run/sproc2caps/journal``

   Type: *string*

   File to store stream states.

.. confval:: journal.flush

   Default: ``10``

   Unit: *s*

   Type: *uint*

   Flush stream states to disk every n seconds.

.. confval:: journal.waitForAck

   Default: ``60``

   Unit: *s*

   Type: *uint*

   Wait when a sync has been forced, up to n seconds.

.. confval:: journal.waitForLastAck

   Default: ``5``

   Unit: *s*

   Type: *uint*

   Wait on shutdown to receive acknowledgement messages, up to n seconds.
.. _streams:

.. note::

   **streams.\***
   *Configure operations applied to input streams and the stream mapping.*

.. confval:: streams.begin

   Type: *string*

   Start time of the data time window, default 'GMT'.

.. confval:: streams.end

   Type: *string*

   End time of the data time window.

.. confval:: streams.filter

   Default: ``self``

   Type: *string*

   Sets the input filter.

.. confval:: streams.expr

   Default: ``x1 + x2``

   Type: *string*

   Sets the mathematical expression.

.. confval:: streams.map

   Default: ``@DATADIR@/sproc2caps/streams.map``

   Type: *string*

   Absolute path to the stream map file. Each line
   holds n input streams and one output stream.

   Example:

   .. code-block:: bash

      CX.PB11..BHZ CX.PB11..BHZ
      CX.PB11..BHZ CX.PB07..BHZ CX.PB11..BBZ
.. _output:

.. note::

   **output.\***
   *Configure the data output.*

.. confval:: output.address

   Default: ``localhost:18003``

   Type: *string*

   Data output URL [[caps\|capss]://][user:pass@]host[:port]. This parameter
   supersedes the host and port parameters of previous versions and takes
   precedence.

.. confval:: output.host

   Default: ``localhost``

   Type: *string*

   Data output host. Deprecated: use output.address instead.

.. confval:: output.port

   Default: ``18003``

   Type: *int*

   Data output port. Deprecated: use output.address instead.

.. confval:: output.bufferSize

   Default: ``1048576``

   Unit: *B*

   Type: *uint*

   Size (bytes) of the packet buffer.

.. confval:: output.backfillingBufferSize

   Default: ``180``

   Unit: *s*

   Type: *uint*

   Length of the backfilling buffer. Whenever a gap is detected, records
   will be held in a buffer and not sent out. Records are flushed from
   front to back if the buffer size is exceeded.

.. _output.mseed:

.. confval:: output.mseed.enable

   Default: ``true``

   Type: *boolean*

   Enable on-the-fly miniSEED encoding. If the encoder does not support the
   input type of a packet, it will be forwarded. Re-encoding of miniSEED
   packets is not supported.

.. confval:: output.mseed.encoding

   Default: ``Steim2``

   Type: *string*

   miniSEED encoding to use (Uncompressed, Steim1 or Steim2).
.. _statusLog:

.. confval:: statusLog.enable

   Default: ``false``

   Type: *boolean*

   Log status information, e.g. max bytes buffered.

.. confval:: statusLog.flush

   Default: ``10``

   Unit: *s*

   Type: *uint*

   Flush status information to disk every n seconds.
Command-Line Options
====================

.. _Generic:

Generic
-------

.. option:: -h, --help

   Show help message.

.. option:: -V, --version

   Show version information.

.. option:: -D, --daemon

   Run as daemon. This means the application will fork itself
   and doesn't need to be started with &.

.. _Verbosity:

Verbosity
---------

.. option:: --verbosity arg

   Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
   4:debug.

.. option:: -v, --v

   Increase verbosity level (may be repeated, e.g. -vv).

.. option:: -q, --quiet

   Quiet mode: no logging output.

.. option:: -s, --syslog

   Use syslog logging backend. The output usually goes to
   /var/lib/messages.

.. option:: -l, --lockfile arg

   Path to lock file.

.. option:: --console arg

   Send log output to stdout.

.. option:: --debug

   Execute in debug mode.
   Equivalent to --verbosity=4 --console=1.

.. option:: --log-file arg

   Use alternative log file.
.. _Records:

Records
-------

.. option:: --record-driver-list

   List all supported record stream drivers.

.. option:: -I, --record-url arg

   The recordstream source URL, format:
   [service://]location[#type].
   "service" is the name of the recordstream driver
   which can be queried with "--record-driver-list".
   If "service" is not given, "file://" is used.

.. option:: --record-file arg

   Specify a file as record source.

.. option:: --record-type arg

   Specify a type for the records being read.
.. _Output:

Output
------

.. option:: -O, --output arg

   Overrides configuration parameter :confval:`output.address`.
   This is the CAPS server which shall receive the data.

.. option:: --agent arg

   Sets the agent string. Allows the server to identify the
   application that sends data.

.. option:: -b, --buffer-size arg

   Size (bytes) of the journal buffer. If the value is
   exceeded, a synchronization of the journal is forced.

.. option:: --backfilling arg

   Default: ``0``

   Buffer size in seconds for backfilling gaps.

.. option:: --mseed

   Enable on-the-fly miniSEED encoding. If the encoder does not
   support the input type of a packet, it will be forwarded.
   Re-encoding of miniSEED packets is not supported.

.. option:: --encoding arg

   miniSEED encoding to use: Uncompressed, Steim1 or Steim2.

.. option:: --rec-len arg

   miniSEED record length expressed as a power of 2.
   A 512-byte record would be 9.
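The record length is thus the base-2 exponent of the record size in bytes,
which a short Python check illustrates:

.. code-block:: python

   # --rec-len is the exponent n in: record size = 2 ** n bytes
   # (e.g. --rec-len 9 yields 512-byte records).
   def record_size_bytes(rec_len):
       return 2 ** rec_len

   record_size_bytes(9)   # → 512
   record_size_bytes(12)  # → 4096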
.. option:: --max-future-endtime arg

   Maximum allowed relative end time for packets. If the packet
   end time is greater than the current time plus this value,
   the packet will be discarded. By default this value is set
   to 120 seconds.

.. option:: --dump-packets

   Dump packets to stdout.
.. _Journal:

Journal
-------

.. option:: -j, --journal arg

   File to store stream states. Use an empty string to log to
   stdout.

.. option:: --flush arg

   Flush stream states to disk every n seconds.

.. option:: --wait-for-ack arg

   Wait when a sync has been forced, up to n seconds.

.. option:: -w, --wait-for-last-ack arg

   Wait on shutdown to receive acknowledgement messages, up to
   the given number of seconds.
.. _Status:

Status
------

.. option:: --status-log

   Log status information, e.g. max bytes buffered.

.. option:: --status-flush arg

   Flush status information to disk every n seconds.

.. option:: --stop

   Stop processing when data acquisition is finished. The
   'finished' signal depends on the data source.

.. _Streams:

Streams
-------

.. option:: --begin arg

   Start time of the data time window.

.. option:: --end arg

   End time of the data time window.

.. option:: --map arg

   Stream map file.

.. option:: --expr arg

   Mathematical expression to be applied.
.. _Test:

Test
----

.. option:: --gain-in arg

   Gain that is applied to the input values.

.. option:: --gain-out arg

   Gain that is applied to the output values.