[seiscomp, scanloc] Install, add .gitignore

2025-10-09 15:07:02 +02:00
commit 20f5301bb1
2848 changed files with 1315858 additions and 0 deletions


@ -0,0 +1,59 @@
.. highlight:: rst
.. _access:
######
access
######
**Access module for FDSNWS.**
Bindings Parameters
===================
.. note::
**access.\***
*Defines access to restricted data. When defined, it will add the listed users to the list of users authorized to access certain restricted data, given the parameters of this binding.*
.. confval:: access.users
Type: *list:string*
List of users \(e\-mail addresses\) allowed to access the restricted data.
.. confval:: access.disableStationCode
Default: ``false``
Type: *boolean*
When the disableStationCode option is set to true, the access entries will be generated only for the network level \(and optionally the stream level\) and no station code will be filled in. This can potentially reduce the number of entries in the access table and save memory on the request handler.
.. confval:: access.streams
Type: *list:string*
List of locations.streams this access rule applies to. The location code is optional and wildcards are allowed.
.. confval:: access.start
Type: *datetime*
Start of validity \(optional\).
.. confval:: access.end
Type: *datetime*
End of validity \(optional\).
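As an illustration, an access binding profile, e.g. a key file such as :file:`etc/key/access/profile_restricted`, could contain the parameters below. This is only a sketch; the profile name, e-mail addresses, stream codes and dates are placeholders:
.. code-block:: properties
access.users = alice@example.org, bob@example.org
access.streams = 00.BH?, 10.HH?
access.start = 2023-01-01 00:00:00
access.disableStationCode = false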


@ -0,0 +1,248 @@
.. highlight:: rst
.. _bindings2cfg:
############
bindings2cfg
############
**Synchronize key files with configuration database or convert them to
configuration XML.**
Description
===========
bindings2cfg dumps the bindings configuration from a specific key directory
to the given database or to a configuration XML file. In this way, the bindings parameters
can be configured in a directory different from $SEISCOMP_ROOT/etc/. From this
non-standard directory the configuration XML can be created without first
writing the bindings to a database and reading them from there using e.g.
:ref:`scxmldump`.
This utility is useful for iterative parameter tuning.
Examples
========
#. Write the bindings configuration from some key directory to a configuration
XML file:
.. code-block:: sh
bindings2cfg --key-dir ./etc/key -o config.xml
#. Write the bindings configuration from some key directory to the seiscomp
database on localhost
.. code-block:: sh
bindings2cfg --key-dir ./etc/key -d mysql://sysop:sysop@localhost/seiscomp
.. _bindings2cfg_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/bindings2cfg.cfg`
| :file:`etc/global.cfg`
| :file:`etc/bindings2cfg.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/bindings2cfg.cfg`
bindings2cfg inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: bindings2cfg
:program:`bindings2cfg [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown when a master module shuts down.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --config-db arg
Load the configuration from the given database or file,
format: [service:\/\/]location .
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Input
-----
.. option:: --key-dir arg
Override the location of the default key directory,
which is \$SEISCOMP_ROOT\/etc\/key .
Output
------
.. option:: -o, --output arg
If given, an output XML file is generated. Use '\-' for
stdout.
.. option:: --create-notifier
If given then a notifier message containing all notifiers
will be written to the output XML. This option only applies
if an output file is given. Notifier creation requires either
an input database or an input config XML as reference.
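For illustration, a notifier XML could be created from the key files using an existing configuration XML as reference. This is only a usage sketch; the file names are placeholders:
.. code-block:: sh
bindings2cfg --key-dir ./etc/key --config-db config.xml --create-notifier -o notifiers.xml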


@ -0,0 +1,66 @@
.. highlight:: rst
.. _diskmon:
#######
diskmon
#######
**Monitors a disk and sends notifications.**
Description
===========
diskmon is a SeisComP init script that checks the filesystem on each call to
:program:`seiscomp check` by running the following command:
.. code-block:: sh
df | awk -v max="%d" '{ if ( $5 > max ) print $0 }'
where "%d" is replaced by the configured threshold. If there are lines in the
output (which means some filesystems exceed the usage threshold), it sends
the output along with a description line to all configured recipients using
the :program:`mail` command.
To make diskmon work it is important that :program:`mail` is working on the shell.
.. _diskmon_configuration:
Module Configuration
====================
.. note::
diskmon is a :term:`standalone module` and does not inherit :ref:`global options <global-configuration>`.
| :file:`etc/defaults/diskmon.cfg`
| :file:`etc/diskmon.cfg`
| :file:`~/.seiscomp/diskmon.cfg`
.. confval:: threshold
Default: ``95``
Type: *int*
Disk usage threshold in percent. Each time the disk usage exceeds this level,
an alert e\-mail is sent to the user. Note that disk usage is only checked when a
cron job for seiscomp check is installed or seiscomp check is called
regularly by other means.
.. confval:: emails
Type: *list:string*
Comma\-separated list of e\-mail addresses to notify when disk usage
threshold is exceeded.
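A minimal :file:`etc/diskmon.cfg` could therefore look like the following sketch; the threshold value and e-mail addresses are placeholders:
.. code-block:: properties
threshold = 90
emails = operator@example.org, admin@example.org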


@ -0,0 +1,212 @@
.. highlight:: rst
.. _dlsv2inv:
########
dlsv2inv
########
**Convert dataless SEED to SeisComP inventory XML.**
Description
===========
dlsv2inv converts dataless `SEED <http://www.iris.edu/data/dataless.htm>`_ to
SeisComP XML (:term:`SCML`). Due to the limitations of dataless SEED, dlsv2inv allows setting
attributes which are not available in dataless SEED, such as network type, network
description and so on.
It basically takes two important parameters:
#. input file
#. output file
where the output file defaults to stdout if not given.
The SeisComP inventory network and station objects have the attribute *archive*
which should contain the local datacenter where the information comes from.
While importing, the :confval:`datacenterID` configuration parameter is read and written into
the archive attribute of all networks and stations available in the dataless.
The datacenterID can be overridden with the ``--dcid`` command-line option.
.. note::
Conversion of inventory in |scname| XML to dataless SEED is provided by :ref:`inv2dlsv`.
Examples
========
#. Convert a given dataless SEED file to SeisComP XML.
.. code-block:: sh
dlsv2inv GE.dataless GE.xml
#. Override the datacenterID and leave it blank in the output.
.. code-block:: sh
dlsv2inv --dcid "" GE.dataless GE.xml
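#. Set network attributes during the conversion using the ArcLink options documented
below. This is only a sketch; the attribute values are examples and not taken from a real network.
.. code-block:: sh
dlsv2inv --net-type SM --net-start 2010-01-01 --restricted GE.dataless GE.xml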
.. _dlsv2inv_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/dlsv2inv.cfg`
| :file:`etc/global.cfg`
| :file:`etc/dlsv2inv.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/dlsv2inv.cfg`
dlsv2inv inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: dlsv2inv
:program:`dlsv2inv [OPTIONS] input [output=stdout]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
ArcLink
-------
.. option:: --dcid arg
Override the datacenter ID which is read from the
datacenterID configuration parameter and written to the
network and station archive attribute.
.. option:: --net-description arg
Set the network description. It supports the following
placeholders: \${code}, \${start}, \${end}, \${class} and
\${archive}.
.. option:: --net-start arg
Set network start time. Format is %Y\-%m\-%d.
.. option:: --net-end arg
Set network end time. Format is %Y\-%m\-%d.
.. option:: --net-type arg
Set the network type \(VBB, SM, etc.\).
.. option:: --temporary
Set the network temporary flag to true.
.. option:: --restricted
Set the network restricted flag to true.
.. option:: --private
Set the network private flag to true.
Convert
-------
.. option:: -f, --formatted
Enable formatted XML output.


@ -0,0 +1,331 @@
.. highlight:: rst
.. _ew2sc:
#####
ew2sc
#####
**Earthworm hypo2000_arc messages importer**
.. _ew2sc_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/ew2sc.cfg`
| :file:`etc/global.cfg`
| :file:`etc/ew2sc.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/ew2sc.cfg`
ew2sc inherits :ref:`global options<global-configuration>`.
.. confval:: ew2sc3.configPath
Type: *string*
Folder to store logs and archives of hypo2000_arc files from Earthworm export_generic.
.. confval:: ew2sc3.senderPort
Type: *int*
Earthworm export_generic's socket to listen to
.. confval:: ew2sc3.modID
Type: *int*
Expected Earthworm Module ID \(ew2sc3 will read the message only if its ModID is correct\).
Set to 0 \(MOD_WILDCARD\) to accept any Earthworm Module ID.
.. confval:: ew2sc3.instID
Type: *int*
Expected Earthworm Institute ID \(ew2sc3 will read the message only if its InstID is correct\)
Set to 0 \(INST_WILDCARD\) to accept any Earthworm Institute ID.
.. confval:: ew2sc3.customAgencyID
Type: *string*
Institute name to use when storing origin into database.
If blank, the origin will have the system AgencyID. If specified, the origin will have the given AgencyID.
.. confval:: ew2sc3.author
Type: *string*
Author name to use when storing origin into database.
.. confval:: ew2sc3.hostname
Type: *string*
Earthworm export_generic IP address to connect to
.. confval:: ew2sc3.defaultLatitude
Type: *string*
Default event latitude to use if hypo2000_arc location is null \(space filled\)
.. confval:: ew2sc3.defaultLongitude
Type: *string*
Default event longitude to use if hypo2000_arc location is null \(space filled\)
.. confval:: ew2sc3.locatorProfile
Type: *string*
Name of the earth model used by Earthworm location process
.. confval:: ew2sc3.enableArchiving
Type: *boolean*
Enable\/Disable hypo2000_arc message file archiving \(useful for troubleshooting\).
.. confval:: ew2sc3.myAliveInt
Type: *int*
Time interval \(in sec\) between two heartbeat messages sent by SeisComP.
This parameter should be lower than the Earthworm export_generic RcvAliveInt parameter.
.. confval:: ew2sc3.senderTimeout
Type: *int*
Maximum time \(in milliseconds\) before the connection to Earthworm export_generic times out.
This parameter has to be set according to the Earthworm export_generic parameters.
.. confval:: ew2sc3.maxMsgSize
Type: *int*
Maximum message size \(in char\) between Earthworm export_generic and ew2sc3.
This parameter has to be set according to the Earthworm export_generic MaxMsgSize parameter.
.. confval:: ew2sc3.myAliveString
Type: *string*
Alive string to send to Earthworm export_generic in order to keep the connection alive.
This string should be identical to the Earthworm export_generic RcvAliveText string.
.. confval:: ew2sc3.senderAliveString
Type: *string*
Alive string expected from Earthworm export_generic in order to keep the connection alive.
This string should be identical to the Earthworm export_generic SendAliveText string.
.. confval:: ew2sc3.enableUncertainties
Type: *boolean*
Enable\/Disable uncertainty conversion.
Earthworm doesn't have uncertainties but weights from 0 to 4.
If enabled, ew2sc3 will convert pick weights to uncertainties following a simple mapping between the weight and the pickerUncertainties list.
.. confval:: ew2sc3.pickerUncertainties
Type: *list:double*
Uncertainty values \(in sec\) to use when enableUncertainties is enabled.
Refer to the scolv documentation for the syntax.
.. confval:: ew2sc3.maxUncertainty
Type: *double*
Maximum weight value from Earthworm corresponding to maximum uncertainty
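Putting some of these parameters together, a hypothetical :file:`etc/ew2sc.cfg` might contain the following sketch. All values are placeholders and have to be adapted to the actual Earthworm export_generic setup:
.. code-block:: properties
ew2sc3.hostname = 192.168.1.10
ew2sc3.senderPort = 16005
ew2sc3.modID = 0
ew2sc3.instID = 0
ew2sc3.locatorProfile = iasp91
ew2sc3.enableUncertainties = true
ew2sc3.pickerUncertainties = 0.05, 0.1, 0.2, 0.5, 1.0
ew2sc3.maxUncertainty = 4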
Command-Line Options
====================
.. program:: ew2sc
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown when a master module shuts down.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all

File diff suppressed because it is too large.


@ -0,0 +1,172 @@
.. highlight:: rst
.. _fdsnxml2inv:
###########
fdsnxml2inv
###########
**Convert station inventory between FDSN StationXML format and
SeisComP XML.**
Description
===========
fdsnxml2inv is an inventory converter. It converts station metadata from
FDSN StationXML format to SeisComP XML (:term:`SCML`) and back, writing the
output to a file, if given, or to the command line (stdout).
Examples
========
#. Convert an inventory file in FDSN StationXML format to SCML with formatted XML.
Redirect the output to a new file:
.. code-block:: sh
fdsnxml2inv -f inventory_fdsn.xml inventory_sc.xml
#. Convert an inventory file in SCML format to FDSN StationXML with formatted XML.
Redirect the output to a new file:
.. code-block:: sh
fdsnxml2inv --to-staxml -f inventory_sc.xml inventory_fdsn.xml
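#. Convert without giving an output file. As described above, the result is then written
to stdout and can be redirected; this is only a usage sketch:
.. code-block:: sh
fdsnxml2inv -f inventory_fdsn.xml > inventory_sc.xml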
.. _fdsnxml2inv_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/fdsnxml2inv.cfg`
| :file:`etc/global.cfg`
| :file:`etc/fdsnxml2inv.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/fdsnxml2inv.cfg`
fdsnxml2inv inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: fdsnxml2inv
:program:`fdsnxml2inv [OPTIONS] input [output]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Convert
-------
.. option:: -f, --formatted
Generate formatted SCML. Otherwise, output XML to a single line.
.. option:: --to-staxml
Convert from SCML to StationXML and expect SCML as input.
If not given, the input is StationXML and the output is SCML.
.. option:: --relaxed-ns-check
Enable relaxed XML namespace checks. This will also accept
tags within a different namespace than defined in the
supported schema.
.. option:: --log-stages
Add more output to stderr for all channel response stages
when converting from StationXML.

File diff suppressed because it is too large.


@ -0,0 +1,280 @@
.. _global_fixedhypocenter:
###############
FixedHypocenter
###############
Locator for re-computing source time with fixed hypocenter
Description
===========
Mining-related events are useful as ground truth events (:cite:t:`bondár-2009a`)
because the epicentre and depth can be constrained by physical inspection.
Unless a local seismograph network with accurate timing also locates the event,
and that information is available, the origin time must be estimated in order
for the event to be useful as ground truth. Existing location algorithms in
|scname|, including :ref:`Hypo71 <global_hypo71>` and :ref:`LOCSAT <global_locsat>`,
do not allow the determination of origin time given a set of arrivals and a
fixed hypocentre. There is a need, then, for a method of fixed hypocentre
origin time determination.
Objectives of this locator are:
* Inversion of arrival times of phase picks for the source time while fixing the hypocenter location.
* Compatibility of the method of fixed-hypocentre origin time determination with
the practice of the Comprehensive Test Ban Treaty Organization (CTBTO).
* Adaptation of a procedure which is compatible with the other locators supported by |scname|.
* Adaptation of a procedure which can reproduce results of legacy locators currently
in use, such as GENLOC :cite:t:`pavlis-2004` and GRL, a
grid-based locator developed at the Canadian Hazards Information Service (CHIS).
The implementation of this locator by :term:`gempa GmbH` was initiated and has received
initial funding from :cite:t:`nrcan`.
Methodology
===========
Given the measured arrival times :math:`t_i^k` of phase :math:`k` at
station :math:`i`, most methods of earthquake hypocentre location involve
minimization of the weighted squared sum of the residuals. That is,
minimization of:
.. math::
|r_w|^2 = \sum_{i=1}^N {w_i^2 [ t_i^k - \tau - T_{model}^k(r_i,x) ]^2}
The residuals are computed by subtracting the expected arrival times
:math:`\tau + T_{model}^k(r_i,x)` based on a velocity model applied at the
coordinates of each station
:math:`r_i`.
Typically the weights can be a combination of the inverse of the
estimated pick uncertainty :math:`1/{\sigma}_i`, a distance term
:math:`d^k(\Delta)` and/or a residual weight term :math:`p(r_i)`.
Alternative weighting schemes can be applied but in this
implementation we weight by pick uncertainty alone: :math:`w_i=\frac{1}{{\sigma}_i}`.
In the general case, the model is a nonlinear function of its inputs, and there
is no analytic solution for the origin time and hypocenter that minimize the
norm. Typically, the solution is found iteratively, based on an initial guess
for the origin time and hypocenter. This is the normal procedure for an earthquake
without an a priori estimate of the hypocentral location.
When the hypocenter is in fact accurately constrained, the modeled travel time
is a constant, so we can project each phase arrival back to an equivalent origin
time
.. math ::
\tau_i^k = t_i^k - T_{model}^k (r_i,x)
so that we only have to find the origin time :math:`\tau` which minimizes:
.. math::
|r_w|^2 = \sum_{i=1}^{N}w_i^2 [\tau_i^k - \tau]^2
The residuals are minimized by:
.. math::
\tau = \frac{\sum_{i=1}^{N}w_i^2 \tau_i^k}{\sum_{i=1}^{N}w_i^2}.
Thus, the origin time is simply the weighted mean of the equivalent origin
times, according to the velocity model, associated with the arrivals.
The standard error of this estimate is:
.. math::
\sigma = \sqrt{\frac{\sum_{i=1}^{N}w_i^2 [\tau_i^k - \tau]^2}{\sum_{i=1}^{N}w_i^2}}.
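As a simple numerical illustration with invented values: two picks with uncertainties of 0.5 s and 1 s have weights :math:`w_1 = 2` and :math:`w_2 = 1`. With equivalent origin times of 10.0 s and 10.3 s the weighted mean becomes
.. math::
\tau = \frac{2^2 \cdot 10.0 + 1^2 \cdot 10.3}{2^2 + 1^2} = 10.06 \text{ s.}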
The methodology for estimating error intervals and ellipses recommended for
standard processing at the CTBTO (:cite:t:`lee-1975`) is that of
:cite:t:`jordan-1981` and is implemented
in LOCSAT (:cite:t:`bratt-1988`).
Uncertainty is represented by a set of points :math:`x_e` around the final estimate
:math:`x_f` satisfying:
.. math::
\kappa_p^2 &= (x_e - x_f)^TC_m(x_e-x_f), \\
\kappa_p^2 &= Ms^2F_p(M,K+N-M), \\
s^2 &= \frac{Ks_K^2+|r_w|^2}{K+N-M}
where:
* :math:`C_m`: Covariance matrix, corresponding to the final hypocentre estimate :math:`x_f`.
* :math:`s^2`: Ratio of actual to assumed data variances.
* :math:`\kappa_p^2`: The “confidence coefficient” at probability :math:`\rho`.
* :math:`F_p(m,n)`: Fisher-Snedecor quantile function (inverse cumulative F-distribution)
for :math:`m` and :math:`n` degrees of freedom of the numerator and denominator sums of squares,
respectively, and probability :math:`p`.
* :math:`p`: Confidence level: the desired probability that the true epicentre should
fall within the uncertainty bounds.
* :math:`N`: Sum of all arrival time, azimuth or slowness estimates. Here, only
arrival times are considered for inversion.
* :math:`M`: Number of fitted parameters:
* 3: error ellipsoid
* 2: error ellipse
* 1: depth or time error bounds.
Here, :math:`M = 1` as we only invert for the time.
* :math:`s_K^2`: A prior estimate of the ratio of actual to assumed data variances; typically set to 1.
* :math:`K`: Number of degrees of freedom in prior estimate :math:`s_K^2`.
:math:`K` can be configured by :confval:`FixedHypocenter.degreesOfFreedom`.
* :math:`r_w`: Vector of weighted residuals.
Although this formulation is complex, it is useful because it allows the analyst to
balance a priori and a posteriori estimates of the ratio of actual to assumed
data variances.
The covariance matrix in the general case is computed from the weighted sensitivity
matrix :math:`A_w`, the row-weighted matrix of partial derivatives of arrival
time with respect to the solution coordinates.
.. math::
C_m = A^T_wA_w
However, when origin time is the only coordinate, the partial derivatives with
respect to origin time are unity, the weighted sensitivity matrix is simply a
row vector of weights, and the time-time covariance
:math:`c_{tt}` is simply the sum of the squares of these weights.
.. math::
c_{tt} = \sum_{i=1}^{N}w_i^2
It is recommended that fixed-hypocentre origin time confidence intervals be
estimated using the method of :cite:t:`jordan-1981` for error ellipsoids,
that is, that the time error bounds be represented using
.. math::
\Delta t_p &= \sqrt{ \frac{\kappa_p^2}{c_{tt}} } \\
&= \sqrt{ \frac{F_p(1,K+N-1)}{K+N-1} \frac{Ks_K^2 + \sum_{i=1}^{N}w_i^2 [\tau_i^k-\tau]^2}{\sum_{i=1}^{N}w_i^2}}.
In addition to recording arrival weights and residuals, distances and azimuths,
and other details of origin quality, the details of a ground-truth-level (GT1)
fixed-hypocentre origin time estimate are recorded as:
* origin.time = :math:`\tau`
* origin.time_errors.uncertainty = :math:`\Delta t_p`
* origin.time_errors.confidence_level = :math:`100p`
* origin.quality.standard_error = :math:`\sigma`
* origin.quality.ground_truth_level = GT1
For the sake of reproducibility, a comment is added to every new :term:`origin`
reporting :math:`K`, :math:`s_K` and :math:`\kappa_p`.
Application
===========
#. Configure the parameters in the section *FixedHypocenter* of the global configuration.
#. When using in :ref:`scolv` the FixedHypocenter locator can be chosen right away
from the available locators.
.. figure:: media/scolv-fixedhypocenter.png
:align: center
:width: 18cm
scolv Location tab with FixedHypocenter selected for relocating.
#. Configure the module, e.g. :ref:`screloc` or :ref:`scolv`, which is to use FixedHypocenter:
* set the locator type / interface: "FixedHypocenter"
* if requested, set the profile as [interface]/[model], e.g.: LOCSAT/iasp91 or libtau/ak135
#. Run the module with FixedHypocenter
Origins created by the FixedHypocenter locator can be identified by the methodID
and the *confidence/description* comment of the origin parameters, e.g.: ::
<origin publicID="Origin/20200102030459.123456.8222">
...
<timeFixed>false</timeFixed>
<epicenterFixed>true</epicenterFixed>
<methodID>FixedHypocenter</methodID>
<earthModelID>iasp91</earthModelID>
...
<comment>
<text>Confidence coefficient: K-weighted ($K$=8, $s_K$=1 s), $\kappa_p$ = 1.6, $n_{eff}$ = 5.0</text>
<id>confidence/description</id>
</comment>
...
</origin>
.. _global_fixedhypocenter_configuration:
Module Configuration
====================
.. note::
**FixedHypocenter.\***
*Locator parameters: FixedHypocenter*
.. confval:: FixedHypocenter.profiles
Default: ``LOCSAT/iasp91,LOCSAT/tab``
Type: *list:string*
Defines a list of available travel time tables. Each item
is a tuple separated by a slash with format \"[interface]\/[model]\".
Built\-in interfaces are \"LOCSAT\" and \"libtau\".
Other interfaces might be added via plugins. Please check their
documentation for the required interface name.
.. confval:: FixedHypocenter.usePickUncertainties
Default: ``false``
Type: *boolean*
Whether to use pick time uncertainties rather than a fixed
time error. If true, then the uncertainties are retrieved from
each individual pick object. If they are not defined, then the
default pick time uncertainty as defined by defaultTimeError
will be used instead.
.. confval:: FixedHypocenter.defaultTimeError
Default: ``1.0``
Type: *double*
Unit: *s*
The default pick time uncertainty if pick uncertainties are
not going to be used or if they are absent.
.. confval:: FixedHypocenter.degreesOfFreedom
Default: ``8``
Type: *int*
Number of degrees of freedom used for error estimate.
.. confval:: FixedHypocenter.confLevel
Default: ``0.9``
Type: *double*
Confidence level between 0.5 and 1.
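For instance, a corresponding snippet in :file:`global.cfg` could look like the following sketch; the profile list simply repeats the default and the remaining values are examples:
.. code-block:: properties
FixedHypocenter.profiles = LOCSAT/iasp91, LOCSAT/tab
FixedHypocenter.usePickUncertainties = true
FixedHypocenter.degreesOfFreedom = 8
FixedHypocenter.confLevel = 0.9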


@ -0,0 +1,45 @@
.. _global_fx-dfx:
######
FX-DFX
######
Implementation of the CTBTO/IDC polarization analysis for
three-component stations.
Description
===========
The feature extraction as implemented at CTBTO IDC for single three-component
stations determines back azimuth (station to origin) and slowness, including the
uncertainties for both of these. In the IDC source code and database, the back
azimuth is referred to simply as azimuth.
Algorithm
=========
The algorithm computes polarization attributes for a three-component station using
a modification to the :cite:t:`jurkevic-1988` algorithm. Some of these attributes are
then used to determine detection azimuth (seazp = P-type azimuth in degrees),
detection slowness and azimuth/slowness uncertainties (inang1 = emergence (incidence)
angle and rect = rectilinearity).
A fixed noise window of 9.5 seconds ([-30s;-20.5s] with respect to trigger time)
and a signal window of 5.5 seconds ([-4s;1.5s] with respect to trigger time)
are used. The signal window is subdivided into intervals of 1.5 s length which
overlap by 50%.
1. De-mean data according to mean of noise window.
2. Apply cosine ramp to noise data and filter the entire data window.
3. Rotate three components into ZNE space.
4. Compute 3x3 covariance matrix for each interval.
5. Extract eigenvalues and compute parameters including rectilinearity.
6. Choose the result set with the largest rectilinearity.
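Step 5 relies on the eigenvalues :math:`\lambda_1 \ge \lambda_2 \ge \lambda_3` of the covariance matrix. A commonly used definition of rectilinearity is given below for illustration only; the exact IDC formulation is not reproduced in this document:
.. math::
rect = 1 - \frac{\lambda_2 + \lambda_3}{2 \lambda_1}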
Picks
=====
In addition to the extracted back azimuth and slowness values the rectilinearity
is added as a comment to the resulting pick. The comment ID is
``DFX:rectilinearity`` and the comment is the value in string representation.

File diff suppressed because it is too large.


@ -0,0 +1,64 @@
.. _global_homogeneous:
###########
homogeneous
###########
Travel-times for a homogeneous velocity model
Description
===========
The travel-time interface *homogeneous* allows predicting travel times for
P and S waves in homogeneous velocity models.
Configuration
=============
The travel-time interface *homogeneous* is controlled by global parameters,
e.g., in :file:`$SEISCOMP_ROOT/etc/global.cfg`:
#. Add a new table profile for homogeneous travel-time tables with some custom
profile name. In :ref:`scconfig` navigate to the section *ttt.homogeneous*
and click on the green button to add a table profile.
#. Set all parameters in the new profile.
#. Register the new profile by adding its name to the list of tables in
:confval:`ttt.homogeneous.tables`.
Example configuration:
.. code-block:: params
# The list of supported model names per interface.
ttt.homogeneous.tables = "5"
# Geographic origin of the region. Expects 2 values: latitude, longitude.
ttt.homogeneous.5.origin = 51, 12
# Radius validity of the region.
ttt.homogeneous.5.radius = 1
# Min Depth validity of the region.
ttt.homogeneous.5.minDepth = 0
# Max Depth validity of the region.
ttt.homogeneous.5.maxDepth = 2
# P wave velocity.
ttt.homogeneous.5.P-velocity = 5
# S wave velocity.
ttt.homogeneous.5.S-velocity = 3
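In a homogeneous model the predicted travel time is simply the hypocentral distance divided by the phase velocity along a straight ray path. With the P velocity of 5 km/s from the example above, a source at 10 km hypocentral distance yields
.. math::
t_P = \frac{10 \text{ km}}{5 \text{ km/s}} = 2 \text{ s.}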
Application
===========
Once the travel-time interface profile is defined and registered, it can be
selected
* interactively in the :ref:`scolv phase picker <scolv-sec-waveform-review>`
or the :ref:`scolv amplitude picker <scolv-sec-amplitude-review>`,
* or used in other modules which allow the configuration of travel-time
interfaces.


@ -0,0 +1,470 @@
.. _global_hypo71:
######
Hypo71
######
The traditional Hypo71PC locator with SeisComP.
Description
===========
The Hypo71 locator algorithm by Fred Klein (:cite:t:`lee-1975`) has
been implemented in |scname| through the plugin mechanism. The plugin *hypo71*
contains the LocatorInterface implementation for Hypo71.
This plugin uses a slightly modified Hypo71 version from Alexandre Nercessian (IPGP)
which allows negative earthquake depths (above sea level) and negative station
altitudes (below sea level - OBS).
The development of this plugin was co-financed by the European Union and
`le Ministère de l'Ecologie, du Développement Durable, des Transports et du Logement
<http://www.developpement-durable.gouv.fr>`_
and developed by :cite:t:`ovsm` / :cite:t:`ipgp`.
How it works
============
When receiving a list of arrivals to locate, the plugin builds a Hypo71 input
file with information from the station inventory and the configured profile.
It then runs Hypo71, reads the output file and sends the results (location,
uncertainties, RMS, pick residuals ...) to |scname|.
If several trial depths are configured, the plugin will run as many Hypo71
runs as there are configured depths.
Then all the results are read, and a decision is made on the best one, based on
location RMS and uncertainty.
A final run is then made with the best result depth as trial depth.
Profiles
========
The plugin allows the user to set up as many profiles as needed.
A profile contains all the information relative to the velocity model and
Hypo71 iteration parameters.
This allows the user to tune the behaviour of Hypo71 to their needs.
If no profiles are set up, the plugin will use the default Hypo71 profile, according
to the example shown in the first Hypo71 publication.
Some of these default Hypo71 parameters have been altered to allow more and finer
iterations, since computer power is now far above what was available in the 1970s.
Error measures
==============
After running Hypo71, the output is converted into a |scname| origin (:term:`SCML`)
object including some error measures. The following table shows how
the Hypo71 error measures are mapped to the |scname| data model:
========================================================= =====================================================
|scname| Hypo71
========================================================= =====================================================
Origin.latitude.uncertainty ERH/sqrt(2)
Origin.longitude.uncertainty ERH/sqrt(2)
Origin.depth.uncertainty ERZ
Origin.originQuality.standardError _
Origin.originQuality.secondaryAzimuthalGap _
Origin.originQuality.usedStationCount usedStationCount
Origin.originQuality.associatedStationCount associatedStationCount
Origin.originQuality.associatedPhaseCount associatedPhaseCount
Origin.originQuality.usedPhaseCount associatedPhaseCount
Origin.originQuality.depthPhaseCount depthPhaseCount
Origin.originQuality.minimumDistance km2deg(Tdist.front)
Origin.originQuality.maximumDistance km2deg(Tdist.back)
Origin.originQuality.medianDistance km2deg(~Tdist)
Origin.originQuality.groundTruthLevel QUALITY
Origin.originUncertainty.horizontalUncertainty _
Origin.originUncertainty.minHorizontalUncertainty _
Origin.originUncertainty.maxHorizontalUncertainty _
Origin.originUncertainty.azimuthMaxHorizontalUncertainty _
ConfidenceEllipsoid.semiMajorAxisLength _
ConfidenceEllipsoid.semiMinorAxisLength _
ConfidenceEllipsoid.semiIntermediateAxisLength _
ConfidenceEllipsoid.majorAxisPlunge _
ConfidenceEllipsoid.majorAxisAzimuth _
ConfidenceEllipsoid.majorAxisRotation _
========================================================= =====================================================
Plugin
======
The *hypo71* plugin is installed under :file:`share/plugins/hypo71.so`.
It provides a new implementation of the LocatorInterface with the name Hypo71.
To add the plugin to a module, add it to the module's configuration, either
:file:`modulename.cfg` or :file:`global.cfg`:
.. code-block:: sh
plugins = ${plugins}, hypo71
Basically it can be used by two |scname| modules: :ref:`screloc` and :ref:`scolv`.
Output
======
All output is stored in the configured :confval:`HYPO71ROOT`.
The following files are stored:
- Input file (input)
- Input configuration (.INP)
- Hypo71 location header (.OUT)
- Hypo71 location results (.PRT)
- Hypo71 ZTR evaluation log (.LOG)
In addition to the native Hypo71 output, a |scname| origin object is created and
returned to the calling instance. Usually this object is then sent via messaging.
In addition, the stdout output of the locator is redirected to |scname| output at
INFO level. Each line is identified by leading "Hypo71PC:", e.g.: ::
12:02:25 [info] Hypo71PC: Date Heure Minute Seconde
12:02:25 [info] Hypo71PC: 90113 22 48 3.78 Nb Iterations : 3
To view this output, add *--debug* to your application when executing it on the
command line. Increasing the logging level of the module which executes the plugin
allows reading the output in the log file: ::
logging.level = 3
Configuration example
=====================
To add the plugin to an application such as scolv or screloc, add the plugin
name to the list of plugins that are loaded (e.g. :file:`scolv.cfg`):
.. code-block:: sh
plugins = ${plugins}, hypo71
Furthermore, add the plugin configuration (e.g. :file:`scolv.cfg`):
.. code-block:: sh
########################################################
############# Hypo71 plugin configuration ##############
########################################################
# Hypo71 input file to process (generated by plugin)
hypo71.inputFile = @DATADIR@/hypo71/HYPO71.INP
# Hypo71 log file to store ZTR calculation and final results
hypo71.logFile = @LOGDIR@/HYPO71.LOG
# Hypo71 output file to read results from (generated by binary)
hypo71.outputFile = @DATADIR@/hypo71/HYPO71.PRT
# Hypo71 script called by plugin
hypo71.hypo71ScriptFile = @DATADIR@/hypo71/run.sh
# Hypo71 default profile
hypo71.defaultControlFile = @DATADIR@/hypo71/profiles/default.hypo71.conf
# Hypo71 origin patternID
hypo71.publicID = Hypo71.@time/%Y%m%d%H%M%S.%f@.@id@
# Should we use the custom patternID ?
hypo71.useHypo71PatternID = false
# Hypo71 custom profiles examples
hypo71.profiles = ModelA
hypo71.profile.ModelA.earthModelID = "My Velocity Model A"
hypo71.profile.ModelA.methodID = Hypo71PC
hypo71.profile.ModelA.controlFile = @DATADIR@/hypo71/profiles/profile.a.conf
Verify that everything is properly set up in the script :file:`@DATADIR@/hypo71/run.sh`
.. code-block:: sh
#!/bin/bash
HYPO71PC_BINARY=Hypo71PC
HYPO71PC_HOME=`dirname $0`
# Jumping into the right directory
cd ${HYPO71PC_HOME}/
# Executing binary with input file as argument
${SEISCOMP_ROOT}/bin/$HYPO71PC_BINARY < input
Verify that everything is properly set up in the file :file:`${SEISCOMP_ROOT}/share/hypo71/input`
.. code-block:: sh
HYPO71.INP
HYPO71.PRT
HYPO71.OUT
.. important ::
There must be 3 blank lines at the end of the :file:`input` file, those are not to be removed.
Finally, set up your Hypo71 profile (e.g. :file:`${SEISCOMP_ROOT}/share/hypo71/profiles/profile.a.conf`):
.. code-block:: sh
############################################################
## HYPO71 SeisComP Plugin ##
## @OVSM-IPGP ##
## ##
############################################################
## This profile is based on Dorel velocity model for French Antilles
## It allows several iterations at different starting depth for deep and crustal earthquakes
##############
# Reset List #
##############
TEST(01) = .1 # sec # cutoff RMS value under which Jeffrey's weighting of residuals is not used
TEST(02) = 50. # km #
TEST(03) = 0.2 # critical F-value for the stepwise multiple regression
TEST(04) = .01 # km # adjustment value under which Geiger's iteration is terminated
TEST(05) = 5. # km # focal-depth value above which DZ is reset to DZ / (K+1)
TEST(06) = 4. # regression TEST(03)/TEST(06) coefficient value if no significant variable is found in the stepwise multiple regression
TEST(10) = 2. # km # coefficient value J = D/TEST(10) used for resetting DX and DY
TEST(11) = 999. # maximum number of iterations in the hypocentral adjustment
TEST(12) = .5 # coefficient value DZ = -Z*TEST(12) used for resetting DZ when hypocenter is placed in the air
TEST(13) = 1. # km # standard error value of hypocentral optionally calculated RMS
## The following values are only available with the Hypo71PC version modified by Alexandre Nercessian (IPGP) which is included with this plugin
TEST(15) = -2. # km # maximum altitude of earthquake in km (down is positive)
TEST(20) = 1. # used altitude = read altitude * TEST(20) - for example, -2500 = -250 * 10
######################
# Crustal Model List #
######################
CRUSTAL_VELOCITY_MODEL = 3.50, 6.00, 7.00, 8.00
CRUSTAL_DEPTH_MODEL = 0.00, 3.00, 15.00, 30.00
################
# Control Card #
################
# MANDATORY
ZTR = 5, 20, 40, 60, 80, 100, 150, 200 # km # trial focus depth, at least two
# MANDATORY
XNEAR = 200. # km # distance from epicenter up to which the distance weighting is 1
# MANDATORY
XFAR = 450. # km # distance from epicenter beyond which the distance weighting is 0
# MANDATORY
POS = 1.76 # ratio of P-velocity to S-velocity
KAZ = 1 # 1 or blank # apply azimuthal weighting of stations ?
KSORT = 1 # 1 or blank # sort stations by distance in the output ?
# Use the position obtained from the best ZTR value ?
USE_TRIAL_POSITION = false
####################
# Instruction Card #
####################
KNST = 1 # use S data ?
INST = 0 # fix depth ?
#####################################
# Optional Weighting Look-Up Table #
#####################################
# Uncomment if you want to disable dynamic weighting and use those uncertainties as
# boundaries for weighting (e.g. a pick with +-0.02 will have a weight of 0)
#WEIGHT_UNCERTAINTY_BOUNDARIES = 0.1, 0.2, 0.5, 1.0
Usage
=====
Locator
-------
The usage of the new Hypo71 plugin is straightforward. Once loaded successfully,
the new locator shows up in the combo box in the lower left corner.
.. figure:: media/hypo71/locator_selection_small.png
:align: center
Select the new Hypo71 locator and a profile from the pre-configured list.
.. figure:: media/hypo71/locator_profile_selection_small.png
:align: center
The Hypo71 implementation doesn't provide a virtual profile automatically but the
plugin ships with some example profiles.
If an origin has been relocated, the method should be set to "Hypo71" and
the earth model contains the name of the profile used to perform this location.
.. figure:: media/hypo71/origin_information.png
:align: center
Settings
--------
The Hypo71 locator implementation supports overriding configured settings or
control parameters for a session. Those changes are not persistent and are lost if
the locator is changed to another one or the profile has been changed.
To open the settings dialog press the button right to the locator selection
combo box.
.. figure:: media/hypo71/locator_settings.png
:align: center
Then the Hypo71 selected profile parameters show up.
.. figure:: media/hypo71/hypo71_settings.png
:align: center
More
====
* Take a look at Fred Klein's HYPOINVERSE Earthquake Location software (:cite:t:`klein-2002`),
* Hypo71PC original manual and binary are available on USGS website (:cite:t:`lee-1975`).
.. _global_hypo71_configuration:
Module Configuration
====================
.. note::
**hypo71.\***
*Locator parameters: Hypo71*
.. confval:: hypo71.logFile
Default: ``@LOGDIR@/HYPO71.LOG``
Type: *string*
Temporary file used by Hypo71 to store calculation logs.
.. confval:: hypo71.inputFile
Default: ``@DATADIR@/hypo71/HYPO71.INP``
Type: *string*
Temporary file to write Hypo71 input data to.
.. confval:: hypo71.outputFile
Default: ``@DATADIR@/hypo71/HYPO71.PRT``
Type: *string*
Temporary output file to read Hypo71 location data from.
.. confval:: hypo71.defaultControlFile
Default: ``@DATADIR@/hypo71/profiles/default.hypo71.conf``
Type: *string*
Hypo71 default profile.
If no custom profile is specified, this profile will be used by the plugin when performing a location.
.. confval:: hypo71.hypo71ScriptFile
Default: ``@DATADIR@/hypo71/run.sh``
Type: *string*
Bash script executed when calling the Hypo71 locator plugin for locating the earthquake.
.. confval:: hypo71.profiles
Type: *list:string*
Hypo71 profile name.
Multiple names may be set, separated by commas.
Each profile can have a different velocity model or parameters.
.. confval:: hypo71.publicID
Type: *string*
Custom patternID to use when generating origin publicID
.. confval:: hypo71.useHypo71PatternID
Type: *boolean*
Specifies if the given publicID pattern should be used for generating the origin publicID.
.. note::
**hypo71.profile.\***
*Profiles containing the profile-specific velocity model and the Hypo71 parameters.*
.. note::
**hypo71.profile.$name.\***
$name is a placeholder for the name to be used and needs to be added to :confval:`hypo71.profiles` to become active.
.. code-block:: sh
hypo71.profiles = a,b
hypo71.profile.a.value1 = ...
hypo71.profile.b.value1 = ...
# c is not active because it has not been added
# to the list of hypo71.profiles
hypo71.profile.c.value1 = ...
.. confval:: hypo71.profile.$name.earthModelID
Type: *string*
Profile's velocity model name.
.. confval:: hypo71.profile.$name.methodID
Default: ``Hypo71``
Type: *string*
Profile's method.
It is generally the locator's name \(Hypo71\).
.. confval:: hypo71.profile.$name.controlFile
Type: *string*
File containing the profile parameters.
.. confval:: hypo71.profile.$name.fixStartDepthOnly
Default: ``false``
Type: *boolean*
If the depth is requested to be fixed \(e.g. by ticking the option
in scolv\) the plugin performs only one location starting at
specified depth but with free depth evaluation. This option
defines whether it should really fix the depth \(false\) or
use this fixed depth only as starting point \(true\).


@ -0,0 +1,837 @@
.. _global_iloc:
####
iLoc
####
Locator in SeisComP implemented by the plugin lociloc.
Description
===========
iLoc is a locator developed by István Bondár which has been integrated into
|scname| by :cite:t:`gempa`. It is invoked by the wrapper plugin *lociloc* - the
interface between |scname| and iLoc.
Read the sections :ref:`iloc-setup` and :ref:`iloc-application` for
configuring and using iLoc in |scname|.
Background
----------
iLoc is a locator tool for locating seismic, hydroacoustic and
infrasound sources
based on :term:`phase picks <pick>`. iLoc is based on the location
algorithm developed by :cite:t:`bondár-2009a` and implemented at the
International Seismological Centre (:cite:t:`isc`, :cite:t:`bondár-2018`)
with numerous new features added (:cite:t:`bondár-2018`).
The stand-alone iLoc code can be downloaded from the :cite:t:`iloc-github`
software repository.
Among the major advantages of using iLoc is that it can
* Use any phases with valid travel-time predictions;
* Use seismic, hydroacoustic and infrasound arrival time, slowness and azimuth
observations in location;
* Use travel-time predictions from a global 3D upper mantle velocity model;
* Use a local 1D velocity model;
* Account for the correlated travel-time prediction error structure due to
unmodeled 3D velocity heterogeneities;
* Check if the data has sufficient resolution to determine the
hypocenter depth;
* Identify ground truth (GT5) candidate events.
History
-------
* Originally developed for U.S. Air Force Research Laboratory, today the standard
at the International Seismological Centre (ISC) replacing previous routines
* Open source, download website: :cite:t:`iloc-github`
* Integrated first in SeisComP3 in 2019
* Basis of the EMSC crowd-source locator, CsLoc since 2019
* EMSC standard as of 2022
iLoc in a nutshell
------------------
* Accounts for correlated travel-time prediction errors
* Initial hypocenter guess from Neighborhood Algorithm search
* Linearised inversion using an a priori estimate of the full data covariance matrix
* Attempts a free-depth solution only if there is depth resolution
* Default depth is derived from historical seismicity
* Seismic, hydroacoustic and infrasound observations
* Arrival time, slowness and azimuth measurements
* Uses most ak135 or iasp91 Earth model phases in locating
* Integrated RSTT travel-time predictions
* RSTT is default for Pn/Sn and Pg/Lg
* Local velocity model and local phase TT predictions for Pg/Sg/Lg, Pb/Sb, Pn/Sn.
Algorithms
----------
This section describes some of the principles. The full description of the applied
algorithms can be found in the iLoc documentation provided along with the package
on the :cite:t:`iloc-github` website.
Neighbourhood algorithm
~~~~~~~~~~~~~~~~~~~~~~~
Linearized inversion algorithms are quite sensitive to the initial guess. In order
to find an initial hypocenter guess for the linearized inversion the Neighbourhood
Algorithm (:cite:t:`sambridge-1999`; :cite:t:`sambridge-2001`) is performed
around the starting hypocentre if :confval:`iLoc.profile.$name.DoGridSearch` is active.
During the NA search, we identify the phases with respect to each trial hypocenter
and calculate the misfit of the trial hypocenter. The misfit is defined as the sum
of the :confval:`iLoc.profile.$name.NAlpNorm` residual and a penalty factor that
penalizes against freakish local minima provided by just a few phases. In the first
iteration :confval:`iLoc.profile.$name.NAinitialSample` hypocenter hypotheses are tested,
while the subsequent iterations consider the best :confval:`iLoc.profile.$name.NAcells`
solutions and resample the search space around them with
:confval:`iLoc.profile.$name.NAnextSample` hypocenter hypotheses. The solution with
the lowest misfit after :confval:`iLoc.profile.$name.NAiterMax` iterations is taken
as the initial hypocenter for the linearized least squares inversion.
A grid search can be performed to obtain a better initial hypocenter
guess. The search is performed around the starting hypocenter.
For a very exhaustive search one can increase :confval:`iLoc.profile.$name.NAinitialSample`,
:confval:`iLoc.profile.$name.NAnextSample` and :confval:`iLoc.profile.$name.NAcells`
values. Note that the maximum value for :confval:`iLoc.profile.$name.NAinitialSample`
is around 3500 before hitting memory limits.
An exhaustive search will
considerably slow iLoc down, especially when RSTT predictions are
enabled (:confval:`iLoc.profile.$name.UseRSTT`, :confval:`iLoc.profile.$name.UseRSTTPnSn`,
:confval:`iLoc.profile.$name.UseRSTTPgLg`).
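As an illustration, the search can be made more exhaustive in a profile configuration like the following sketch; the profile name *test* and all values are placeholders, and the parameters are the ones referenced above:
.. code-block:: properties
iLoc.profile.test.DoGridSearch = true
iLoc.profile.test.NAinitialSample = 2000
iLoc.profile.test.NAnextSample = 200
iLoc.profile.test.NAcells = 50
iLoc.profile.test.NAiterMax = 10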
Depth resolution
~~~~~~~~~~~~~~~~
Depth resolution can be provided by a local network, depth phases, core reflections
and to a lesser extent near-regional secondary phases. iLoc attempts a free-depth
solution if the set of :term:`arrivals <arrival>` meets at least one of the following conditions:
* Number of pairs of defining P and depth phases
:math:`\ge` :confval:`iLoc.profile.$name.MinDepthPhases`
* Number of pairs of defining P and core phases
:math:`\ge` :confval:`iLoc.profile.$name.MinCorePhases`
* Number of pairs of defining P and S phases
:math:`\ge` :confval:`iLoc.profile.$name.MinSPpairs`
within a regional distance of :confval:`iLoc.profile.$name.MaxLocalDistDeg`
degree
* Number of defining P phases
:math:`\ge` :confval:`iLoc.profile.$name.MinLocalStations`
within a local distance of :confval:`iLoc.profile.$name.MinLocalStations`
degree.
If there is insufficient depth resolution provided by the data, or the depth uncertainty
for a free-depth solution exceeds a threshold, the hypocentre depth is set to the depth
from the default depth grid if a grid point for the epicentre location exists; otherwise
it is set to a depth :cite:t:`bolton-2006` assigned to
the corresponding Flinn-Engdahl geographic
region (:cite:t:`young-1996`). The default depth grid (:cite:t:`bondár-2011`)
is defined on a 0.5º x 0.5º grid as the median of all depths in the cell, provided
that there were at least five events in the cell, and the 75-25 percent quartile
range was less than 100 km. The latter constraint is imposed to avoid regions with
both shallow and deep seismicity. Anthropogenic events are fixed to the surface.
Finally, the user can fix the depth to the initial depth.
iLoc reports back how the depth was determined in the FixedDepthType parameter:
* 0 - free depth solution
* 1 - airquake/deepquake, depth fixed to surface/MaxHypocenterDepth
* 2 - depth fixed to depth reported by an agency (not used in |scname|)
* 3 - depth fixed to depth-phase depth
* 4 - anthropogenic event, depth fixed to surface
* 5 - depth fixed to default depth grid depth
* 6 - no default depth grid point exists, fixed to median reported depth
* 7 - no default depth grid point exists, fixed to GRN-dependent depth
* 8 - depth fixed by user provided value
Linearized inversion
~~~~~~~~~~~~~~~~~~~~
Once the Neighbourhood search gets close to the global optimum, iLoc switches
to an iterative linearized least-squares inversion of travel-time, azimuth and
slowness observations (:cite:t:`bondár-2009b`; :cite:t:`bondár-2011`) to obtain the final solution
for the hypocenter.
The convergence test after (:cite:t:`paige-1982`) is
applied after every iteration. Once a convergent solution is obtained, the location
uncertainty is defined by the a posteriori model covariance matrix. The model
covariance matrix yields the four-dimensional error ellipsoid whose projections
provide the two-dimensional error ellipse and one-dimensional errors for depth
and origin time. These uncertainties are scaled to the 90% confidence level
(:cite:t:`jordan-1981`).
The final hypocentre is tested against the
ground truth selection criteria (:cite:t:`bondár-2009a`),
and it is reported as
a GT5 candidate if the solution meets the GT5 criteria.
Some important parameters are:
* :confval:`iLoc.profile.$name.SigmaThreshold`: Residuals that exceed
:math:`abs(Sigmathreshold * PriorMeasError)` are made non-defining.
* :confval:`iLoc.profile.$name.MinNdefPhases`: Minimum number of observations
required to attempt a solution.
If the number of defining arrival times exceeds
:confval:`iLoc.profile.$name.MinNdefPhases`, then slowness observations will not
be used in the location.
Integration into |scname|
-------------------------
* Integration of iLoc into |scname| is provided by an external library of
routines (:cite:t:`iloc-github`).
* |scname| modules call iLoc routines by passing the objects via the plugin
*lociloc* installed in :file:`@DATADIR@/plugins/lociloc.so`.
* iLoc returns objects to |scname| for integration.
* The iLoc implementation in |scname| retains all original iLoc functionalities.
Read the section :ref:`iloc-setup` for the installation of the iLoc library and
the configuration in |scname|.
Velocity models
---------------
iLoc ships with the global models *iasp91* and *ak135* as well as with regional
seismic travel-time tables, RSTT, which, if activated by configuration, replace
the global models in areas where they are defined.
.. _iloc-velocity_global:
Global models
~~~~~~~~~~~~~
The global models *iasp91* and *ak135* and RSTT are available by default without
further configuration.
.. _iloc-velocity_rstt:
RSTT
~~~~
RSTT are available in :file:`@DATADIR@/iloc/RSTTmodels/pdu202009Du.geotess`.
Custom RSTT can be integrated into iLoc and provided to |scname|.
For adding custom RSTT to iLoc read the original iLoc documentation from the
:cite:t:`iloc-github` software repository.
The usage of RSTT is controlled per iLoc profile by global configuration
parameters
* :confval:`iLoc.profile.$name.UseRSTT`
* :confval:`iLoc.profile.$name.UseRSTTPnSn`
* :confval:`iLoc.profile.$name.UseRSTTPgLg`
.. _iloc-velocity_local:
Local velocity models
~~~~~~~~~~~~~~~~~~~~~
Custom local velocity models can be provided by a file in
:file:`@DATADIR@/iloc/localmodels`. Example file
:file:`@DATADIR@/iloc/localmodels/test.localmodel.dat`:
.. code-block:: properties
#
# test
#
# number of layers
4
0.000 5.8000 3.4600 x
20.000 6.5000 3.8500 CONRAD
45.000 8.0400 4.4800 MOHO
77.500 8.0400 4.4800 x
Once added, the velocity model can be configured in |scname| as set out in section
:ref:`iloc-setup`.
Station elevation
-----------------
iLoc considers station elevation. It calculates the elevation correction,
*elevationCorrection*, for a station as
.. math::
elevationCorrection = \frac{\sqrt{1 - (surfVel * p)^2} * elev}{surfVel}
where
* *elev*: elevation of the station
* *p*: the ray parameter (horizontal slowness)
* *surfVel*: P or S velocity of the layer at the surface, depending on the last leg
  of the phase name.
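For illustration only, the correction can be sketched in a few lines of Python.
The unit conventions below (slowness in s/deg, elevation in km, velocity in km/s)
are assumptions made for this example and not prescribed by iLoc:

.. code-block:: python

   import math

   def elevation_correction(elev_km, slowness_s_per_deg, surf_vel_km_s):
       """Sketch of the station elevation correction in seconds."""
       p = slowness_s_per_deg / 111.195      # horizontal slowness in s/km
       return math.sqrt(1.0 - (surf_vel_km_s * p) ** 2) * elev_km / surf_vel_km_s

   # hypothetical values: 0.5 km elevation, P slowness of 8 s/deg, 5.8 km/s at the surface
   print(round(elevation_correction(0.5, 8.0, 5.8), 3))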
.. note ::
iLoc does not allow airquakes or source locations above datum (0 km). If the
depth of an origin becomes negative, iLoc
fixes the depth to 0 km and the depth type of the origin will be "operator
assigned".
.. _sec-iloc-references:
Resources
---------
iLoc has taken advantage of many publications or has been cited therein.
Read the section :ref:`sec-references` for a list.
.. _iloc-setup:
Setup in |scname|
=================
#. Add the plugin *lociloc* to the global configuration, e.g. in
:file:`@SYSTEMCONFIGDIR@/global.cfg`:
.. code-block:: properties
plugins = ${plugins}, lociloc
#. Install the dependencies missing for iLoc. For download, the system variable
*SEISCOMP_ROOT* must be defined which you may wish to test first:
.. code-block:: sh
echo $SEISCOMP_ROOT
In case the variable is undefined, follow the instructions in section
:ref:`getting-started-variables`.
After *$SEISCOMP_ROOT* is defined you may install the software dependencies
for iLoc using the :ref:`install scripts <software_dependencies>` or simply
the :ref:`seiscomp` script:
.. code-block:: sh
seiscomp install-deps iloc
The install scripts will fetch auxiliary files from :cite:t:`iloc-github`
and install them in :file:`@DATADIR@/iloc/iLocAuxDir`. For manual download and
installation read the install scripts located in
:file:`@DATADIR@/deps/[os]/[version]/install-iloc.sh`.
.. note ::
* Check the :cite:t:`iloc-github` website for updates before downloading
the file since the version number, and hence the name of the download file,
may change.
* Instead of generating the :file:`SEISCOMP_ROOT/share/iloc/iLocAuxDir`
directory, you can also manually install the dependencies somewhere else,
create a symbolic link and always maintain the same iLoc version in
|scname| and externally.
#. Add and configure iLoc profiles for the velocity models. The global models
*iasp91* and *ak135* are considered by default with default configuration
parameters even without setting up *iasp91*/*ak135* profiles. You may,
however, create these profiles for their customization.
Create new profiles or consider existing ones for adjusting their
configuration:
* :confval:`iLoc.profile.$name.globalModel`: The name of the
:ref:`global model <iloc-velocity_global>`, e.g. *iasp91* or *ak135*.
* Consider the :ref:`RSTT parameters <iloc-velocity_rstt>`.
* :confval:`iLoc.profile.$name.LocalVmodel`, :confval:`iLoc.profile.$name.UseLocalTT`
and :confval:`iLoc.profile.$name.MaxLocalTTDelta`: The definition of a
:ref:`local velocity model <iloc-velocity_local>`: model file, default
usability, distance range.
* :confval:`iLoc.profile.$name.DoNotRenamePhases`: Renaming seismic phases
automatically
impacts the usability of the origins with other locators and locator profiles.
Activate the parameter to avoid phase renaming.
* Consider the remaining parameters.
.. note ::
Creating the profiles allows using the same global velocity model along
with different local models or RSTT settings in separate profiles.
#. Test the locator using :ref:`scolv` or configure with :ref:`screloc` or other
locator modules.
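For illustration, a hypothetical profile *mymodel* combining the parameters from
the previous step could look as follows in global configuration; the values and
the velocity model path are assumptions and must be adjusted to your setup:

.. code-block:: params

   iLoc.profiles = iasp91, ak135, mymodel
   iLoc.profile.mymodel.globalModel = ak135
   iLoc.profile.mymodel.UseRSTT = false
   iLoc.profile.mymodel.UseLocalTT = true
   iLoc.profile.mymodel.LocalVmodel = @DATADIR@/iloc/localmodels/test.localmodel.dat
   iLoc.profile.mymodel.MaxLocalTTDelta = 3
   iLoc.profile.mymodel.DoNotRenamePhases = true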
.. _iloc-application:
Application in |scname|
=======================
Once the *lociloc* plugin is configured, the iLoc locator can be applied
* Automatically e.g. in :ref:`screloc` or
* Interactively in :ref:`scolv`.
For using iLoc in :ref:`scolv` select it in the locator menu of the Location tab
.. figure:: media/scolv-iloc-locator.png
:align: center
Select iLoc locator
along with a profile:
.. figure:: media/scolv-iloc-profile.png
:align: center
Select iLoc profile
The parameters for iLoc can be adjusted by pressing the wrench button next to the
locator selection combo box
.. figure:: media/scolv-iloc-change.png
:align: center
Start the settings dialog
which opens the iLoc settings dialog:
.. figure:: media/scolv-iloc-settings.png
:align: center
Adjust the settings and click OK to confirm
.. warning ::
By default, automatic phase renaming by iLoc is active. The renaming may
change the phase names, e.g. from P to Pn.
Renaming seismic phases automatically will later impact the usability of
the new origins with other locators and locator
profiles. Activate *DoNotRenamePhases* to avoid phase renaming.
However,
with phase renaming deactivated, iLoc may not provide results if the initial phases do not
exist in the phase table for the given source depth and epicentral distance.
Example: For great source depth and small epicentral distance, the first arrival
phase is p or Pn and not P but |scname| provides P.
After relocating, the iLoc locator and the selected profile are shown in the
:ref:`scolv` Location tab as Method and Earth model, respectively:
.. figure:: media/scolv-iloc-info.png
:align: center
Information in scolv Locator tab
.. _global_iloc_configuration:
Module Configuration
====================
.. note::
**iLoc.\***
*Locator parameters: iLoc*
.. confval:: iLoc.auxDir
Default: ``@DATADIR@/iloc/iLocAuxDir``
Type: *string*
iLoc directory for auxiliary files and directories. Some
of them must be provided from the iLoc website. Read the
documentation for their installation.
.. confval:: iLoc.usePickUncertainties
Default: ``false``
Type: *boolean*
Whether to use pick time uncertainties \(true\) or to use the
default timing error \(false\).
.. confval:: iLoc.defaultTimeError
Default: ``9999999``
Type: *double*
Unit: *s*
The default pick time error forwarded to iLoc if no pick time
uncertainties are set or if using the pick time uncertainties
is disabled.
.. confval:: iLoc.profiles
Default: ``iasp91,ak135``
Type: *list:string*
iLoc profile name.
Multiple names may be set separated by comma.
Each profile can have different velocity or parameters.
.. note::
**iLoc.profile.\***
*Profiles containing the specific locator parameters. For*
*the global models, iasp91 and ak135, profiles are*
*automatically considered with defaults.*
*To adjust the profile parameters the corresponding profile*
*must be created.*
.. note::
**iLoc.profile.$name.\***
$name is a placeholder for the name to be used and needs to be added to :confval:`iLoc.profiles` to become active.
.. code-block:: sh
iLoc.profiles = a,b
iLoc.profile.a.value1 = ...
iLoc.profile.b.value1 = ...
# c is not active because it has not been added
# to the list of iLoc.profiles
iLoc.profile.c.value1 = ...
.. confval:: iLoc.profile.$name.Verbose
Default: ``true``
Type: *boolean*
.. confval:: iLoc.profile.$name.globalModel
Type: *string*
Name of globally applied velocity model
into which RSTT or the local model is integrated. If
unset, the name of the profile is considered instead.
.. confval:: iLoc.profile.$name.UseRSTT
Default: ``false``
Type: *boolean*
Use regional seismic travel\-time tables
.. confval:: iLoc.profile.$name.UseRSTTPnSn
Default: ``true``
Type: *boolean*
Use regional seismic travel\-time tables for Pn and Sn
.. confval:: iLoc.profile.$name.UseRSTTPgLg
Default: ``true``
Type: *boolean*
Use regional seismic travel\-time tables for Pg and Lg
.. confval:: iLoc.profile.$name.UseLocalTT
Default: ``false``
Type: *boolean*
Use local velocity model if defined in LocalVmodel.
.. confval:: iLoc.profile.$name.LocalVmodel
Type: *string*
Full path to a file containing the local velocity model.
Requires: UseLocalTT \= true. Empty string or unset or
UseLocalTT \= false disables using a local model in
this profile.
Example:
\@DATADIR\@\/iloc\/iLocAuxDir\/localmodels\/model.localmodel.dat.
.. confval:: iLoc.profile.$name.MaxLocalTTDelta
Default: ``3``
Type: *float*
Unit: *deg*
Maximum epicentral distance for applying the local
velocity model.
.. confval:: iLoc.profile.$name.DoGridSearch
Default: ``true``
Type: *boolean*
Perform neighbourhood algorithm
.. confval:: iLoc.profile.$name.NAsearchRadius
Default: ``5``
Type: *float*
Unit: *deg*
Neighbourhood Algorithm: Search radius around initial
epicentre
.. confval:: iLoc.profile.$name.NAsearchDepth
Default: ``300``
Type: *float*
Unit: *km*
Neighbourhood Algorithm: Search radius around initial
depth
.. confval:: iLoc.profile.$name.NAsearchOT
Default: ``30``
Type: *float*
Unit: *s*
Neighbourhood Algorithm: Search radius around initial
origin time
.. confval:: iLoc.profile.$name.NAlpNorm
Default: ``1``
Type: *float*
Neighbourhood Algorithm: p\-value for norm to compute
misfit [1,2]
.. confval:: iLoc.profile.$name.NAiterMax
Default: ``5``
Type: *integer*
Neighbourhood Algorithm: Maximum number of iterations
.. confval:: iLoc.profile.$name.NAcells
Default: ``25``
Type: *integer*
Neighbourhood Algorithm: Number of cells to be resampled
at each iteration
.. confval:: iLoc.profile.$name.NAinitialSample
Default: ``1000``
Type: *integer*
Neighbourhood Algorithm: Size of initial sample
.. confval:: iLoc.profile.$name.NAnextSample
Default: ``100``
Type: *integer*
Neighbourhood Algorithm: Size of subsequent samples
.. confval:: iLoc.profile.$name.MinDepthPhases
Default: ``3``
Type: *integer*
Depth resolution: Minimum number of depth phases for depdp
.. confval:: iLoc.profile.$name.MaxLocalDistDeg
Default: ``0.2``
Type: *float*
Unit: *deg*
Depth resolution: Maximum local distance
.. confval:: iLoc.profile.$name.MinLocalStations
Default: ``1``
Type: *integer*
Depth resolution: Minimum number of local defining stations
.. confval:: iLoc.profile.$name.MaxSPDistDeg
Default: ``2.0``
Type: *float*
Unit: *deg*
Depth resolution: Maximum distance for using S\-P travel\-time differences.
.. confval:: iLoc.profile.$name.MinSPpairs
Default: ``3``
Type: *integer*
Depth resolution: Minimum number of defining S\-P phase pairs
.. confval:: iLoc.profile.$name.MinCorePhases
Default: ``3``
Type: *integer*
Depth resolution: Minimum number of defining core reflection phases
.. confval:: iLoc.profile.$name.MaxShallowDepthError
Default: ``30.0``
Type: *float*
Unit: *km*
Depth resolution: Maximum depth error for crustal free\-depth
.. confval:: iLoc.profile.$name.MaxDeepDepthError
Default: ``60.0``
Type: *float*
Unit: *km*
Depth resolution: Maximum depth error for deep free\-depth
.. confval:: iLoc.profile.$name.DoCorrelatedErrors
Default: ``true``
Type: *boolean*
Linearized inversion: Account for correlated errors
.. confval:: iLoc.profile.$name.SigmaThreshold
Default: ``6.0``
Type: *float*
Unit: *s*
Linearized inversion: Used to exclude big residuals from solution
.. confval:: iLoc.profile.$name.AllowDamping
Default: ``true``
Type: *boolean*
Linearized inversion: Allow damping of model vector
.. confval:: iLoc.profile.$name.MinIterations
Default: ``4``
Type: *integer*
Linearized inversion: Minimum number of iterations
.. confval:: iLoc.profile.$name.MaxIterations
Default: ``20``
Type: *integer*
Linearized inversion: Maximum number of iterations
.. confval:: iLoc.profile.$name.MinNdefPhases
Default: ``4``
Type: *integer*
Linearized inversion: Minimum number of defining phases
.. confval:: iLoc.profile.$name.DoNotRenamePhases
Default: ``false``
Type: *boolean*
Linearized inversion: Do not rename phases. Deactivating
this parameter allows the phases to be renamed automatically
for this profile.

View File

@ -0,0 +1,128 @@
.. _global_locext:
######
LocExt
######
Locator which forwards the processing to external scripts
Description
===========
The ExternalLocator implements a wrapper for scripts which do the actual location
process. The input and output are represented as XML and communicated via the
input/output channels of the called process: stdin and stdout.
Plugin
======
To enable the ExternalLocator the plugin ``locext`` must be loaded.
Commandline Parameters
======================
There are several commandline parameters passed to the script depending on
the locator configuration. The following table summarizes them.
========================= ====================================================
Parameter Description
========================= ====================================================
--max-dist=X The cut-off distance if set
--ignore-initial-location Whether to ignore the initial origin location or not
--fixed-depth=X The depth in km to be fixed if enabled
========================= ====================================================
Input
=====
The input document written to stdin of the child process is a valid SeisComP
XML document containing ``EventParameters``. The event parameters hold exactly
one origin to be relocated and all picks referenced by the origin's arrivals.
Example:
.. code:: xml
<?xml version="1.0" encoding="UTF-8"?>
<seiscomp xmlns="http://geofon.gfz-potsdam.de/ns/seiscomp3-schema/0.11" version="0.11">
<EventParameters>
<pick ...>...</pick>
<pick ...>...</pick>
...
<origin ...>
...
<arrival>
...
</arrival>
<arrival>
...
</arrival>
...
</origin>
</EventParameters>
</seiscomp>
Output
======
The output is read from stdout and is expected to be a SeisComP XML document
just containing an origin.
Example:
.. code:: xml
<?xml version="1.0" encoding="UTF-8"?>
<seiscomp xmlns="http://geofon.gfz-potsdam.de/ns/seiscomp3-schema/0.11" version="0.11">
<Origin publicID="...">
</Origin>
</seiscomp>
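The following Python script is a minimal sketch of this protocol. It is not part
of |scname|: instead of relocating, it simply extracts the origin from the input
document and echoes it back as the result; the XML handling with
``xml.etree.ElementTree`` is an assumption made for brevity.

.. code-block:: python

   #!/usr/bin/env python3
   # Minimal external locator sketch: read EventParameters from stdin,
   # write a document containing only an origin to stdout.
   import sys
   import xml.etree.ElementTree as ET

   NS = "http://geofon.gfz-potsdam.de/ns/seiscomp3-schema/0.11"

   def main():
       # Options such as --fixed-depth=X (see the table above) could be parsed here.
       ET.register_namespace("", NS)
       root = ET.parse(sys.stdin).getroot()
       origin = root.find("./{%s}EventParameters/{%s}origin" % (NS, NS))
       if origin is None:
           sys.exit(1)                       # no origin received, signal failure

       # A real script would relocate here; this sketch just echoes the origin.
       origin.tag = "{%s}Origin" % NS        # standalone objects are capitalized
       out = ET.Element("{%s}seiscomp" % NS, {"version": "0.11"})
       out.append(origin)
       sys.stdout.write('<?xml version="1.0" encoding="UTF-8"?>\n')
       sys.stdout.write(ET.tostring(out, encoding="unicode"))

   if __name__ == "__main__":
       main()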
Example Configuration
=====================
#. Define the external locator by global configuration e.g. in :file:`global.cfg`:
.. code::
plugins = ${plugins}, locext
ExternalLocator.profiles = locator1:"python /path/to/locator/script1.py",\
locator2:"/path/to/other/locator/script2.sh"
with
* *locator1*/*locator2*: The names of the profiles as shown in :ref:`scolv`
or used in other modules like :ref:`screloc` for calling the external locator,
* *script1.py*/*script2.sh*: The names of Python/Bash scripts with full path
called by the profile to execute the locator given within the scripts.
#. Once defined, the external locator can be further configured and called
within :ref:`scolv` or by other modules e.g. :ref:`screloc`.
.. _global_locext_configuration:
Module Configuration
====================
.. note::
**ExternalLocator.\***
*Locator parameters: External. This locator requires the plugin*
*"locext" to be loaded.*
.. confval:: ExternalLocator.profiles
Type: *list:string*
A list of profiles defined as tuples of name
and path to a script separated by colon.

View File

@ -0,0 +1,233 @@
.. _global_locrouter:
#########
LocRouter
#########
Meta locator routing location requests of picks and origins to actual
locator implementations.
Description
===========
Router is a meta locator which selects an actual
:ref:`locator <concepts_locators>` based on region profiles configured in
GeoJSON or BNA files.
The locator supports both the initial location based on a pick set and the
relocation based on an existing origin. In case no origin is available an
initial solution is calculated by a configurable locator followed by a
relocation configured through region profiles.
Setup
=====
The Router locator offers configuration by global module parameters.
Plugin
------
Add the plugin ``locrouter`` to :confval:`plugins` for activating the Router
locator. Example:
.. code-block:: sh
plugins = ${plugins},locrouter
Initial locator
---------------
For routing, an initial source location is required. When only picks but no
origins are provided, the initial location is unknown. It can then be determined
by an initial locator which is applied independently of the region. Set
:confval:`RouterLocator.initial.locator` and
:confval:`RouterLocator.initial.profile` for defining the initial locator.
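A minimal sketch of the corresponding global configuration; the region file path
is a placeholder and must point to your own GeoJSON or BNA file:

.. code-block:: params

   RouterLocator.regions = @DATADIR@/spatial/vector/routing-regions.geojson
   RouterLocator.initial.locator = LOCSAT
   RouterLocator.initial.profile = iasp91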
Region Configuration
--------------------
Regions are considered by configuring a polygon file in
:confval:`RouterLocator.regions`. The regions themselves are defined as polygons
in a file in either :ref:`GeoJSON <sec-gui_layers-vector-format-geojson>` or
:ref:`BNA <sec-gui_layers-vector-format-bna>` format. Supported polygon
attributes are:
* name (recommended): Name of polygon. An empty string is assumed if not given.
* locator (mandatory): Name of the locator interface to use.
* profile: Name of the locator-specific profile which must be configured
according to the selected locator.
* minDepth: Minimum depth in km the profile should be applied to.
* maxDepth: Maximum depth in km the profile should be applied to.
The configured features are sorted by rank and area. Larger ranks and smaller
areas are prioritized.
Example :ref:`GeoJSON file<sec-gui_layers-vector-format-geojson>`:
.. code-block:: json
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"properties": {
"name": "Iceland",
"minDepth": 0,
"maxDepth": 30,
"locator": "LOCSAT",
"profile": "iceland"
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[
-24.5469, 63.3967
],
[
-13.4958, 63.3967
],
[
-13.4958, 66.5667
],
[
-24.5469, 66.5667
],
[
-24.5469, 63.3967
]
]
]
}
},
{
"type": "Feature",
"properties": {
"name": "World",
"locator": "LOCSAT",
"profile": "iasp91"
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[
-33, 90
],
[
-180, 90
],
[
-180, -90
],
[
-33, -90
],
[
33, -90
],
[
180, -90
],
[
180, 90
],
[
33, 90
],
[
-33, 90
]
]
]
}
}
]
}
Example :ref:`BNA file<sec-gui_layers-vector-format-bna>`:
.. code-block:: properties
"Iceland", "rank 1", "minDepth: 0, maxDepth: 30, locator: LOCSAT, profile: iceland", 4
-24.5469, 63.3967
-13.4958, 63.3967
-13.4958, 66.5667
-24.5469, 66.5667
"World", "rank 1", "locator: LOCSAT, profile: iasp91", 8
-33, 90
-180, 90
-180, -90
-33, -90
33, -90
180, -90
180, 90
33, 90
Application
===========
Once configured, the Router locator may be used by other |scname| modules such
as :ref:`scolv` or :ref:`screloc`. Refer to the locator as "Router".
.. _global_locrouter_configuration:
Module Configuration
====================
.. note::
**RouterLocator.\***
*Locator parameters: Router. This locator requires the plugin*
*"locrouter" to be loaded.*
.. confval:: RouterLocator.regions
Type: *string*
A GeoJSON or BNA file defining locator profiles by region.
Supported polygon attributes:
name: Name of polygon
locator: Name of the locator interface
profile: Name of the locator specific profile
minDepth: Minimum depth in km
maxDepth: Maximum depth in km
.. note::
**RouterLocator.initial.\***
*Configuration of initial locator used to create an*
*initial solution based on a pick set. If a solution is*
*found, the relocate method of the actual locator configured in the*
*region file is invoked.*
.. confval:: RouterLocator.initial.locator
Type: *string*
Name of the initial locator interface, e.g., LOCSAT.
.. confval:: RouterLocator.initial.profile
Type: *string*
Profile name of the initial locator, e.g., iasp91.

View File

@ -0,0 +1,326 @@
.. _global_locsat:
######
LOCSAT
######
Locator in SeisComP for computing source time and hypocenter
coordinates from phase picks.
Description
===========
LOCSAT is a locator with a travel-time interface in |scname| for computing
source time and hypocenter coordinates from phase picks considering:
* Pick time and pick uncertainty,
* Backazimuth and backazimuth uncertainty,
* Slowness and slowness uncertainty,
* Phase-specific travel-time tables.
The LOCSAT :ref:`locator interface <locsat_li>` implements a wrapper for the
LocSAT locator by :cite:t:`bratt-1991` (according to the README file shipped
with the LocSAT distribution) referred to as **LOCSAT** in |scname|. The LOCSAT
:ref:`travel-time interface <locsat_tti>` provides travel times for specific
phases, epicentral distance, source depth and station elevation.
.. _locsat_li:
Locator Interface
=================
LOCSAT provides the hypocenter parameters through the locator interface.
.. _locsat_tti:
Travel-Time Interface
=====================
LOCSAT provides an interface for computing travel times based on coordinates and
depth. The times are plotted on waveforms, e.g., blue marks in
:ref:`scolv picker window <scolv-sec-waveform-review>`.
Use "LOCSAT" as a value for the travel-time interface when configurable, e.g.,
by :ref:`global_fixedhypocenter`.
.. _locsat_ttt:
Travel-Time Tables
==================
|scname| ships with two sets of predefined travel-time tables which are
made available as the profiles *tab* and *iasp91*.
The default profile is *iasp91*.
LOCSAT travel-time tables are located as plain ASCII files under
:file:`@DATADIR@/locsat/tables/`.
The tables provide the travel times for particular seismic phases at
given depth and epicentral distance in one file per Earth model and seismic
phase. E.g. P-wave arrival times in the iasp91 model are found in
:file:`@DATADIR@/locsat/tables/iasp91.P`. You may easily add your own tables
for any available Earth model and seismic phase by adopting existing ones in new
files which are added by :ref:`configuration <locsat_station_application>` to
your |scname| modules.
Limitations
-----------
#. Only phases for which a travel-time table exists can be considered.
#. LOCSAT currently considers travel-time tables for phases which are hard-coded
* seismic body waves: P, Pg, Pb, Pn, Rg, pP, sP, PKP, PP, PKPab, PKPbc, PKPdf,
SKPdf, PcP,
S, Sg, Sb, Sn, Lg, SKS, SS, ScS,
where P and S are the direct P and S phases, respectively, at all distances
no matter the take-off angle at the source.
* seismic surface waves: LQ, LR.
* infrasound: Is, It, Iw.
#. The maximum number of distance and depth intervals per table file is
currently 210 and 50, respectively.
.. warning::
* Travel-time tables with larger numbers of distance or depth samples are
reported along with command-line error output (stderr). The travel-time
tables should therefore be tested, e.g., with :ref:`scolv` before
unsupervised application.
* Travel times at distance and depth samples exceeding the limits are
ignored. This may lead to undesired behavior during location.
* Phase picks observed outside the distance and depth ranges defined by
travel-time tables may lead to undesired behavior during location.
#. The considered minimum depth is 0 km. Elevations and depths above datum are
not natively considered. The effects of station elevation can be
:ref:`corrected for empirically <locsat_station_elevation>`.
.. _locsat_station_elevation:
Station elevations
------------------
LOCSAT does not natively support corrections of travel-time tables for station
elevations, as can be seen in the code:
.. code-block:: c
sta_cor[i] = 0.0; /* FIX !!!!!!*/
However, the |scname| wrapper adds this feature. It allows defining a
:file:`.stacor` file which provides empirical corrections of observed travel times.
The corrections are provided in seconds and **subtracted** (not added) from
the observation time to be compatible with the NonLinLoc :cite:p:`nonlinloc`
station correction definitions.
Each LOCSAT profile (travel time table) can have one associated station
correction file. E.g. for adding station corrections to the iasp91 tables, the
file :file:`$SEISCOMP_ROOT/share/locsat/tables/iasp91.stacor` needs to be created.
A station correction table takes the form:
.. code-block:: params
# LOCDELAY code phase numReadings delay
LOCDELAY GE.MORC P 1 -0.1
with
- **code** (*string*) station code (after all alias evaluations)
- **phase** (*string*) phase type (any of the available travel time tables)
- **numReadings** (*integer*) number of residuals used to calculate mean residual/delay
(not used by NLLoc, included for compatibility with the format of a summary,
phase statistics file)
- **delay** (*float*) delay in seconds, subtracted from observed time
.. note::
The fourth column (numReadings) is ignored and just provided for compatibility
reasons with :ref:`NonLinLoc <global_nonlinloc>`.
.. _locsat_station_application:
Application and Setup
=====================
LOCSAT is the default and only locator for :ref:`scautoloc` with *iasp91* as the
default profile. However, LOCSAT can be used optionally in other modules such as
:ref:`scolv` or :ref:`screloc`.
.. _locsat_custom-ttt:
Custom travel-time tables
-------------------------
#. Generate your travel-time tables from a custom Earth model, depth and
distance intervals. Use the same format as the default *iasp91*
tables. Tools such as :cite:t:`taup` allow the generation.
#. Add your custom travel-time tables along with station corrections to
:file:`@DATADIR@/locsat/tables/`
#. Add your available custom LOCSAT travel-time tables in global configuration,
e.g., to the list of tables of travel-time interfaces
.. code-block:: params
ttt.LOCSAT.tables = iasp91, tab, custom
and to the list of locator profiles
.. code-block:: params
LOCSAT.profiles = iasp91, tab, custom
and optionally to locators which make use of LOCSAT tables, e.g.,
:ref:`global_fixedhypocenter`.
Application with modules
------------------------
Additional parameters of LOCSAT may be configured in global module configuration
(:confval:`LOCSAT.*`).
* The profiles for locating may be extended or limited by
:confval:`LOCSAT.profiles`.
* When using picks with time uncertainties, consider
:confval:`LOCSAT.usePickUncertainties` and :confval:`LOCSAT.defaultTimeError`.
* Measurements of backazimuth and slowness may be deactivated by
:confval:`LOCSAT.usePickBackazimuth` and :confval:`LOCSAT.usePickSlowness`, respectively.
Such measurements may be obtained from array processing or from feature
extraction using :ref:`scautopick`.
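For illustration, these parameters may be combined in :file:`global.cfg` as
sketched below; the values are examples, not recommendations:

.. code-block:: params

   LOCSAT.profiles = iasp91, tab, custom
   LOCSAT.usePickUncertainties = true
   LOCSAT.defaultTimeError = 1.0
   LOCSAT.usePickBackazimuth = false
   LOCSAT.usePickSlowness = false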
You may also configure some |scname| modules with LOCSAT and a profile.
* :ref:`scautoloc`: Configure a profile for automatic locations,
* :ref:`screloc`: Configure *LOCSAT* along with a profile for automatically
relocating.
* :ref:`scolv`: Configure *LOCSAT* along with a profile as defaults for
interactive locations.
When using LOCSAT in :ref:`scolv` you may interactively adjust some settings. The
changes only apply during runtime.
.. figure:: media/scolv-locsat-settings.png
:align: center
:width: 10cm
scolv Location tab with LOCSAT selected and the settings menu.
.. _global_locsat_configuration:
Module Configuration
====================
.. note::
**LOCSAT.\***
*Locator parameters: LOCSAT*
.. confval:: LOCSAT.profiles
Default: ``iasp91, tab``
Type: *list:string*
Defines a list of available LOCSAT travel\-time tables.
.. confval:: LOCSAT.depthInit
Default: ``20.0``
Type: *double*
Unit: *km*
The initial depth estimate for LOCSAT.
.. confval:: LOCSAT.usePickUncertainties
Default: ``false``
Type: *boolean*
Whether to use pick time uncertainties for arrival deltim rather
than a fixed time error. If true then the uncertainties are
retrieved from each individual pick object. If they are not
defined then the default pick time uncertainty will be used
as fallback.
.. confval:: LOCSAT.defaultTimeError
Default: ``1.0``
Type: *double*
Unit: *s*
The default pick time uncertainty assigned to LOCSAT's arrival deltim
attribute if pick uncertainties are not going to be used or
if they are absent. A time uncertainty of 0 s may result in
errors of the SVD decomposition in LOCSAT.
.. confval:: LOCSAT.usePickBackazimuth
Default: ``true``
Type: *boolean*
Whether to forward pick backazimuth to LOCSAT or not. In an
automatic mode backazimuth measurements might be inaccurate
and disabling their usage in LOCSAT can be controlled with
this parameter.
.. confval:: LOCSAT.usePickSlowness
Default: ``true``
Type: *boolean*
Whether to forward pick horizontal slowness to LOCSAT or not.
In an automatic mode slowness measurements might be
inaccurate and disabling their usage in LOCSAT can be
controlled with this parameter.
.. confval:: LOCSAT.degreesOfFreedom
Default: ``9999``
Type: *int*
Number of degrees of freedom.
.. confval:: LOCSAT.confLevel
Default: ``0.9``
Type: *double*
Confidence level between 0.5 and 1.0.
.. confval:: LOCSAT.enableConfidenceEllipsoid
Default: ``false``
Type: *boolean*
Compute the confidence ellipsoid from covariance matrix in 3D.

View File

@ -0,0 +1,83 @@
.. _global_mb:
##
mb
##
Body wave magnitude at teleseismic distances
Description
===========
mb is the standard body-wave magnitude.
Compare also with the :ref:`mB magnitude <global_mb_bb>`.
Amplitude
---------
mb is defined on the amplitude of the first few cycles of the P-wave,
typically a time window of 20 s - 30 s. Only the first few cycles are used to
minimize the effects of radiation pattern and depth phases, which result in
complicated waveform signatures.
In |scname| mb amplitudes are measured on vertical-component displacement seismograms
in a 30 s time window after simulation of a :term:`WWSSN_SP` short-period
seismometer. Amplitudes are used from stations with epicentral distances between
5° and 105° (configurable). The methods for measuring amplitudes are configurable
in the global bindings.
Station Magnitude
-----------------
The general formula is
.. math::
mb = \log \left(\frac{A}{T}\right) + Q(h,\Delta) - 3.0
with A as the displacement amplitude in micrometers, T as the dominant period of
the signal in seconds, Q as a correction term for depth and distance. mb is
usually determined at periods around 1s in adaptation to the use
of the World-Wide Standard Seismograph Network (WWSSN) short-period stations.
A scatter in the order of +/- 0.3 for the station magnitudes is usual.
Typically, mb is determined for stations with distances larger than 5° to
have a distinct direct P-wave phase. A correction term for the distance has to
be determined empirically, which is quite complicated for distances smaller than 20°.
This reflects the complexity of the body waves that traverse only the upper
mantle. mb saturates at about magnitude 5.5 to 6.0 because the maximum amplitudes of larger
earthquakes occur at lower frequencies than the frequency range between 0.7 Hz - 2 Hz
used for the magnitude calculation.
* Amplitude unit in |scname|: **nanometers** (nm)
* Time window: 30 s
* Default distance range: 5 - 105 deg, configurable: :confval:`magnitudes.mb.minDist`,
:confval:`magnitudes.mb.maxDist`
* Depth range: no limitation, for depth < 0 km, depth = 0 km is assumed
.. note::
In 2013 the IASPEI commission (:cite:t:`iaspei-2013`) recommended a minimum distance of
20 deg. However, :ref:`scautoloc` requires mb amplitudes by default for
considering a pick.
For maintaining consistency, 5 deg is therefore kept as the default
for :confval:`magnitudes.mb.minDist`.
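For a rough illustration of the formula above only, assuming that the ``- 3.0``
term accounts for the conversion of the |scname| amplitude from nanometers to
micrometers and that :math:`Q(h,\Delta)` is taken from the attenuation tables,
the station magnitude computation can be sketched in Python as:

.. code-block:: python

   import math

   def mb_station_magnitude(amp_nm, period_s, q_correction):
       """Sketch of the mb formula with the amplitude in nanometers."""
       return math.log10(amp_nm / period_s) + q_correction - 3.0

   # hypothetical values: 500 nm at 1 s period, Q(h, delta) = 7.0
   print(round(mb_station_magnitude(500.0, 1.0, 7.0), 2))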
Network magnitude
-----------------
By default, the trimmed mean is calculated from the station magnitudes to form
the :term:`network magnitude`. Outliers beyond the outer 12.5% percentiles are
removed before forming the mean.
Configuration
-------------
Adjust the configurable parameters in global bindings in the mb section or use
:file:`global.cfg`
as in :ref:`global_mlv`. Add mb to the list of computed amplitudes and magnitudes
in the configuration of
:ref:`scamp` and :ref:`scmag` and in :ref:`scesv` or :ref:`scolv`
for visibility.

View File

@ -0,0 +1,93 @@
.. _global_mb_bb:
#####
mB_BB
#####
Body wave magnitude at teleseismic distances similar to mb
Description
===========
The **mB**/**mB_BB** magnitude has been recommended by the IASPEI commission
(:cite:t:`bormann-2008`, :cite:t:`bormann-2009`, :cite:t:`iaspei-2013`).
It is based on amplitude measurements of body waves like :ref:`global_mb`, but
with the amplitude measured in a broad frequency range and longer time windows.
Instead of amplitude measurements on displacement data together with the
dominant period, the maximum velocity amplitude Vmax is taken directly from
velocity-proportional records with :math:`V = 2 \pi A/T`. The time window for
the measurement can be determined by the duration of the high-frequency (1-3 Hz)
radiation (:cite:t:`bormann-2008`). This time window usually contains the phases
P, pP, sP, PcP, but not PP. Due to the long time window and broad
frequency range used for the amplitude measurements, mB does not saturate like mb.
.. note::
In |scname| the term **m_B** is a synonym for **mB_BB** which is used
by IASPEI :cite:p:`iaspei-2013`.
Amplitude
---------
mB amplitudes are calculated on vertical-component displacement seismograms
in accordance with :cite:t:`bormann-2008` and similar to :ref:`mb <global_mb>`.
A default time window of 60 s is considered for amplitude measurements
at stations in the distance range of 5° to 105°.
If the epicentral distance is known, the length of the time window after the P wave onset
is
.. math::
t = min(\Delta * 11.5, 60)
where :math:`\Delta` is the epicentral distance. The methods for measuring
amplitudes are configurable in the global bindings.
Station Magnitude
-----------------
The mB station magnitudes are calculated in accordance with :cite:t:`bormann-2008`.
.. math::
mB = \log \left(\frac{A}{2\pi}\right) + Q(h,\Delta) - 3.0
with A as the maximum velocity amplitude :math:`V_{max}` measured as described
above and Q as a correction term for depth and distance.
* Amplitude unit in |scname|: **nanometers/s** (nm/s),
* Time window: 60 s if set by :ref:`scautopick`, otherwise 0 s - 11.5 * distance
(deg) with 60 s minimum
* Default distance range: 5 - 105 deg, configurable: :confval:`magnitudes.mB.minDist`,
:confval:`magnitudes.mB.maxDist`,
* Depth range: no limitation.
.. note::
In 2013 the IASPEI commission (:cite:t:`iaspei-2013`) recommended a minimum
distance of
20 deg. However, the calibration formula (:cite:t:`bormann-2008`) which is
integrated in
|scname| allows the extension down to 5 deg while maintaining consistent magnitudes
at 20 deg and beyond. Therefore, 5 deg is used as the default in
:confval:`magnitudes.mB.minDist`.
Network magnitude
-----------------
By default, the trimmed mean is calculated from the station magnitudes to form
the :term:`network magnitude`. Outliers beyond the outer 12.5% percentiles are
removed before forming the mean.
Configuration
-------------
Adjust the configurable parameters in global bindings in the mB section or use
:file:`global.cfg`
as in :ref:`global_mlv`. Add mB to the list of computed amplitudes and magnitudes
in the configuration of
:ref:`scamp` and :ref:`scmag` and in :ref:`scesv` or :ref:`scolv` for visibility.

View File

@ -0,0 +1,67 @@
.. _global_mb_idc:
######
mb_IDC
######
Body wave magnitude computed at CTBTO/IDC (mb) is calculated for seismic
events from the time-defining primary body waves recorded at seismic
stations at an epicentral distance between 20 and 105 degrees
from the event.
Description
===========
Amplitude
---------
The A5/2 amplitudes are calculated on the vertical component seismograms filtered
between 0.8 and 4.5 Hz and converted to displacement.
Station Magnitude
-----------------
.. math::
mag = \log10(\frac{A}{T}) + Q(\Delta,h)
with
A: amplitude of type A5/2
T: period of the signal in seconds
Q: attenuation correction function of event distance and event depth
h: event depth in km
The attenuation corrections as a function of distance and depth are based on
(Veith, K. F., and Clawson, G. E., 1972). The corrections are tabulated every
degree for distances out to 180 degrees and for depths 0, 15, 40 km, and
100-800 km in steps of 100 km. Bi-cubic splines were used for interpolating the
tables. The tabulated values were adjusted for the fact that the original
(Veith, K. F., and Clawson, G. E., 1972) tables relate to peak-to-peak
amplitudes, whereas the measured amplitudes for mb calculations are half
peak-to-peak. The default corrections are read from a file installed at
:file:`@DATADIR@/magnitudes/IDC/qfvc.mb`. If that file is not present, no magnitude
will be calculated.
Station corrections
-------------------
Station magnitudes can be computed with a station specific correction table
which is configured in the global bindings. The parameter :confval:`magnitudes.mb(IDC).Q`
takes a path and allows to use placeholders for network code (:code:`{net}`),
station code (:code:`{sta}`) and location code (:code:`{loc}`).
Example:
.. code::
magnitudes.mb(IDC).Q = @DATADIR@/magnitudes/IDC/{net}.{sta}.mb
* Amplitude unit in SeisComP: **nanometer** (nm)
* Time window: 5.5 s
* Default distance range: 20 - 105 deg
* Depth range: 0 - 800 km

View File

@ -0,0 +1,78 @@
.. _global_md:
##
Md
##
Duration magnitude plugin
Description
===========
The duration magnitude is based on the coda duration measurement.
It is usually valid for small earthquakes up to magnitude 4 to 5.
First used in 1972 by Lee et al., the duration magnitude (Md), or coda duration
magnitude, estimates the Richter magnitude of local earthquakes from the
signal duration on the vertical components of seismographs.
Estimates are quite stable for local earthquakes ranging from magnitude
Md 0.0 to 5.0.
Amplitude processing
--------------------
Duration magnitude is usually computed on short-period seismometers by searching for
the time at which the signal amplitude falls back close to the pre-earthquake amplitude.
Since it is mainly used for small earthquakes whose signals are at rather high frequencies,
it is useful to highpass filter broadband seismometers (select sismo type 6 and a
Butterworth filter "3,1.5").
A better solution is to deconvolve the signals and reconvolve them with a widely used
short-period instrument: the 1 Hz eigenfrequency L4C (select sismo type 9).
If you have the full responses in your inventory and have activated them
(amplitudes.enableResponses set to true), you will also be able to use accelerometers.
The plugin then searches for the maximum amplitude of the signal, which should be
the S wave, and then computes the mean amplitude of one-second time windows.
As soon as the mean amplitude of a one-second time window relative to the
pre-earthquake amplitude reaches the configured SNR ratio, the process stops.
The middle of this one-second window is assumed to be the end of the coda, and
the time difference between the coda time and the P arrival time is stored as the
coda duration.
Magnitude processing
--------------------
Once the amplitudes are calculated by the AmplitudeProcessor and a coda has been found,
the generic formula is applied and the duration magnitude is computed
for a given station if it fits the criteria (maximum depth, maximum distance).
.. math::
mag = FMA + FMB \times \log10(period) + (FMF \times period) + (FMD \times epidistkm) + (FMZ \times depth) + STACOR
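A minimal Python sketch of this formula; the coefficient values below are
hypothetical placeholders only, the real values come from the plugin
configuration and possible station corrections:

.. code-block:: python

   import math

   def md_station_magnitude(coda_duration_s, epidistkm, depth_km,
                            fma=-0.87, fmb=2.0, fmf=0.0, fmd=0.0035,
                            fmz=0.0, stacor=0.0):
       """Sketch of the Md formula; the period term holds the coda duration."""
       return (fma + fmb * math.log10(coda_duration_s) + fmf * coda_duration_s
               + fmd * epidistkm + fmz * depth_km + stacor)

   # hypothetical example: 40 s coda duration at 15 km epicentral distance, 5 km depth
   print(round(md_station_magnitude(40.0, 15.0, 5.0), 2))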
Plugin
======
The Coda duration magnitude plugin (Md) is installed under :file:`share/plugins/md.so`.
It provides a new implementations of AmplitudeProcessor and MagnitudeProcessor.
To add the plugin to a module add it to the modules configuration, either
:file:`modulename.cfg` or :file:`global.cfg`:
.. code-block:: sh
plugins = ${plugins}, md
Basically it can be used by modules: :ref:`scamp`, :ref:`scmag`, :ref:`scolv`.
More information
----------------
A description of the formula can be found in the Hypo2000 manual on the USGS website.
`<http://earthquake.usgs.gov/research/software/#HYPOINVERSE>`_

View File

@ -0,0 +1,102 @@
.. _global_ml:
##
ML
##
Standard local (Richter) magnitude
Description
===========
ML is the standard local (Richter) magnitude originally designed for
Southern California by :cite:t:`richter-1935`.
General (default) conditions apply:
* Amplitude unit in SeisComP: **millimeter** (mm) by simulation of a :term:`Wood-Anderson seismometer`.
* Time window, configurable: 150 s by :ref:`scautopick` or distance dependent, configurable.
* Distance type: epicentral distance.
* Distance range: 0 - 8 deg, maximum is configurable:
:confval:`magnitudes.ML.maxDistanceKm`,
measurements beyond 8 deg will be strictly ignored.
* Depth range: 0 - 80 km, configurable for amplitude measurements.
Amplitudes
----------
The ML amplitude calculation is similar to the original ML. Waveforms from both
horizontal components are time-windowed and restituted to the Wood-Anderson
seismograph. Within the time window the amplitudes are measured on both
horizontal components and combined. The methods for measuring and combining
amplitudes are configurable in the global bindings.
Station Magnitudes
------------------
The individual station ML is calculated using the following formula:
.. math::
ML = \log10(A) - \log10(A0)
*A* is the measured ML Wood-Anderson amplitude in millimeters. The second term
is the empirical calibration function, which in turn is a function
of the epicentral distance (:cite:t:`richter-1935`). This calibration
function and distance range can be configured globally or per station using
global bindings or the global module configuration variable
module.trunk.global.magnitudes.ML.logA0 in :file:`global.cfg`, e.g.
.. code-block:: params
module.trunk.global.magnitudes.ML.logA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"
module.trunk.global.magnitudes.ML.maxDistanceKm = "-1"
The *logA0* configuration string consists of an arbitrary number of
distance-value pairs separated by comma. Within the pairs, the values are
separated by colon. The distance is epicentral distance in km
and the second value corresponds to the *log10(A0)* term above.
Within each interval the values are computed by linear
interpolation. E.g. for the above default specification, at a
distance of 80 km the *log10(A0)* value would be
.. math::
\log10(A0) &= ((-3.0)-(-2.8))*(80-60)/(100-60)-2.8 \\
&= -2.9
In other words, at 80 km distance the magnitude would be
.. math::
ML &= \log10(A) - (-2.9) \\
&= \log10(A) + 2.9
which is according to the original Richter formula :cite:p:`richter-1935` if the
amplitude is measured in millimeters.
Several distance-value pairs can be configured for different ranges of
epicenter distance.
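A short Python sketch of this piecewise-linear interpolation using the default
*logA0* string from above; it is for illustration only, |scname| performs the
interpolation internally:

.. code-block:: python

   import bisect
   import math

   LOGA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"

   def log_a0(dist_km, spec=LOGA0):
       """Piecewise-linear interpolation of log10(A0) between the configured nodes."""
       nodes = sorted((float(d), float(v)) for d, v in
                      (pair.split(":") for pair in spec.split(",")))
       dists = [d for d, _ in nodes]
       i = max(0, min(bisect.bisect_right(dists, dist_km) - 1, len(nodes) - 2))
       (d0, v0), (d1, v1) = nodes[i], nodes[i + 1]
       return v0 + (v1 - v0) * (dist_km - d0) / (d1 - d0)

   def ml(amp_mm, dist_km):
       """ML = log10(A) - log10(A0) with the Wood-Anderson amplitude A in mm."""
       return math.log10(amp_mm) - log_a0(dist_km)

   print(round(log_a0(80.0), 2))   # -2.9, as in the worked example above
   print(round(ml(1.0, 80.0), 2))  # 2.9 for an amplitude of 1 mm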
Network magnitude
-----------------
By default, the mean is calculated from the station magnitudes to form the
network magnitude.
Configuration
-------------
Set the configuration and calibration parameters in the global bindings similar
to :ref:`global_mlv`.
Instead of configuring lots of global bindings profiles or station bindings, one
line per parameter can be added to the global module configuration
(:file:`global.cfg`).
Add ML to the list of computed amplitudes and magnitudes in the configuration of
:ref:`scamp` and :ref:`scmag` and in :ref:`scesv` or :ref:`scolv` for visibility.

View File

@ -0,0 +1,53 @@
.. _global_ml_idc:
######
ML_IDC
######
CTBTO/IDC local magnitude.
Description
===========
Amplitude
---------
The SBSNR amplitudes are calculated on the vertical component seismograms.
Station Magnitude
-----------------
.. math::
mag = \log10(A) + B(\Delta)
with
A: amplitude of type SBSNR
B: attenuation correction function of epicentral distance in km
The default corrections are read from a file installed at
:file:`@DATADIR@/magnitudes/IDC/global.ml`. If that file is not present no magnitude
will be calculated.
Station corrections
-------------------
Station magnitudes can be computed with a station specific correction table
which is configured in the global bindings. The parameter :confval:`magnitudes.ML(IDC).A`
takes a path and allows to use placeholders for network code (:code:`{net}`),
station code (:code:`{sta}`) and location code (:code:`{loc}`).
Example:
.. code::
magnitudes.ML(IDC).A = @DATADIR@/magnitudes/IDC/{net}.{sta}.ml
* Amplitude unit in SeisComP: **nanometer** (nm)
* Time window: 4.5 s
* Default distance range: 0 - 20 deg
* Depth range: 0 - 40 km

View File

@ -0,0 +1,283 @@
.. _global_mlc:
###
MLc
###
Custom magnitude for local events measured on horizontal components
Description
===========
MLc is a custom magnitude for local events based on :ref:`ML<global_ml>` but
with greater flexibility.
The original implementation is based on specifications by the Hessian Agency for
Nature Conservation, Environment and Geology, Hessian Earthquake Service.
More options have been added allowing the magnitude to be configured with
great flexibility in order to account for many different conditions. The
general procedures for measuring amplitudes and computing magnitudes are
outlined in the :ref:`"Concepts" section on magnitudes <concepts_magnitudes>`.
The MLc magnitude is very similar to the original :ref:`ML<global_ml>`,
except that
* Amplitude pre-filtering is applied by default.
* Wood-Anderson simulation is optionally applied and can be deactivated.
* Measured amplitudes can be scaled accounting for expected units.
* Measured amplitudes are combined by taking the maximum instead of the average.
* A parametric :ref:`magnitude calibration <mlc_station_magnitude>` function
applies by default but a correction in the form log10(A0) can be configured
instead for converting measured amplitudes to station magnitudes.
* Hypocentral instead of epicentral distance is considered by default.
Amplitudes
----------
Some general conditions apply for measuring amplitudes:
* Measured amplitude type: MLc.
* Expected unit of gain-corrected input data: m/s. Activate response correction
in global bindings in case data are provided in acceleration.
* Components used for amplitude measurements: both horizontal components
separately.
The default parameters for measuring MLc amplitudes can be adjusted by global
binding parameters:
* Filtering before instrument simulation: :ref:`BW(3,0.5,12) <filter-bw>`,
configurable by :confval:`amplitudes.MLc.preFilter`.
* :term:`Wood-Anderson seismometer` simulation: yes, can be deactivated by
:confval:`amplitudes.MLc.applyWoodAnderson`.
* Characteristics of :term:`Wood-Anderson seismometer`: according to IASPEI
recommendations. Can be adjusted by :confval:`amplitudes.WoodAnderson.gain`,
:confval:`amplitudes.WoodAnderson.T0`, :confval:`amplitudes.WoodAnderson.h`
in global bindings or global module configuration.
* Amplitude scaling: 1, configure by :confval:`amplitudes.MLc.amplitudeScale`
for considering non-default units by magnitude.
* Method applied for measuring amplitudes: absolute maximum, configurable in
global bindings by :confval:`amplitudes.MLc.measureType`.
* Method for combining amplitude measurements: *max* (maximum from both
horizontal components), configurable in global bindings by
:confval:`amplitudes.MLc.combiner`.
Some additional parameters require you to create an amplitude-type profile for
global binding parameters. Name the profile like the amplitude name, hence MLc:
* Time window for measuring signal amplitudes [s]: P pick time + 150 s by
:ref:`scautopick` or distance [km]/3 km/s + 30 s,
the relevant parameters are: :confval:`amplitudes.MLc.signalBegin`,
:confval:`amplitudes.MLc.signalEnd`. :ref:`Time grammar <time-grammar>` may be
applied for begin and end times.
* Time window for measuring noise amplitudes [s]: 30 s before the P pick,
the relevant parameters are: :confval:`amplitudes.MLc.noiseBegin`,
:confval:`amplitudes.MLc.noiseEnd`. :ref:`Time grammar <time-grammar>` may be
applied for begin and end times.
* Minimum SNR: 0, configurable by :confval:`amplitudes.MLc.minSNR`.
* Distance range: 0 - 8 deg, configurable by :confval:`amplitudes.MLc.minDist`,
:confval:`amplitudes.MLc.maxDist`, stations at distances beyond 8 deg will be strictly
ignored.
* Depth range: <= 80 km, can be adjusted and extended by
:confval:`amplitudes.MLc.minDepth` and :confval:`amplitudes.MLc.maxDepth`.
Most parameters controlling the amplitude measurements are configurable in
global bindings or global module configuration.
The Wood-Anderson simulation will convert input velocity data to ground
displacement in mm. The input data may be of a different unit after applying
:confval:`amplitudes.MLc.preFilter`, e.g. when integration is applied, and / or
when Wood-Anderson simulation is disabled. Configure
:confval:`amplitudes.MLc.amplitudeScale` for converting the unit of the
processed data to the unit expected by the
:ref:`station magnitude calibration <mlc_station_magnitude>` for the measured
amplitude.
.. note::
For comparing MLc amplitudes with :ref:`ML amplitudes <global_ml>` set the
global bindings parameters ::
amplitudes.MLc.preFilter = ""
amplitudes.MLc.combiner = average
.. _mlc_station_magnitude:
Station magnitudes
------------------
Default properties, most parameters are configurable in global bindings:
* Distance type: hypocentral, epicentral can be selected by :confval:`magnitudes.MLc.distMode`.
* Distance range: 0 - 8 deg, configurable by :confval:`magnitudes.MLc.minDist`,
:confval:`magnitudes.MLc.maxDist`, measurements beyond 8 deg will be strictly
ignored.
* Depth range: <= 80 km, can be extended by :confval:`magnitudes.MLc.maxDepth`.
* Expected amplitude type: MLc, configurable by magnitude alias.
* Expected amplitude unit: millimeter (mm), other units can be assumed by
amplitude scaling with :confval:`amplitudes.MLc.amplitudeScale`.
* Magnitude calibration type: parametric, parametric and non-parametric are
available through :confval:`magnitudes.MLc.calibrationType`.
* Calibration function (see below for the equations), configurable by global bindings
depending on the actual calibration type:
* parametric: :confval:`magnitudes.MLc.parametric.c0`,
:confval:`magnitudes.MLc.parametric.c1`,
:confval:`magnitudes.MLc.parametric.c2`,
:confval:`magnitudes.MLc.parametric.c3`,
:confval:`magnitudes.MLc.parametric.c4`,
:confval:`magnitudes.MLc.parametric.c5`
* A0: :confval:`magnitudes.MLc.A0.logA0`
* Station correction: none, configurable by a magnitude-type profile in global
bindings with :confval:`magnitudes.MLc.offset` or the equivalent in global
module configuration as :confval:`module.trunk.NET.STA.magnitudes.MLc.offset`.
The latter is not supported by :ref:`scconfig` but it reduces the amount of
required bindings.
The calibration function is considered in one of the forms
* parametric when :confval:`magnitudes.MLc.calibrationType` = "parametric"`:
.. math::
MLc = \log_{10}(A) + c_3 * \log_{10}(r/c_5) + c_2 * (r + c_4) + c_1 + c_0(station)
where
* *A*: displacement amplitude measured in unit of mm or as per configuration
* *r*: hypocentral (default) or epicentral distance as configured by
  :confval:`magnitudes.MLc.distMode`
* *c1*, *c2*, *c3*, *c4*, *c5*: general calibration parameters
* *c0*: station-specific correction
The default values are valid for SW-Germany (:cite:t:`stange-2006`), c6 and H
have been added for supporting dependency on depth (:cite:t:`rhoades-2020`).
* log10(A0)-based non-parametric when :confval:`magnitudes.MLc.calibrationType` = "A0"`:
.. math::
MLc = \log_{10}(A) - \log_{10}(A_0)
where
* :math:`log_{10}(A_0)`: distance-dependent correction value. Read
:ref:`global_mlv` for the details.
.. note::
The magnitude calibration function can be regionalized by adjusting global
module configuration parameters in MLc region profiles of
:confval:`magnitudes.MLc.region.*` and in a *MLc* Magnitude type profile e.g.
in :file:`global.cfg`.
The flexibility of the amplitude and magnitude processing allows for MLc to be
applied in various use cases. Examples are given below.
* **Default:** Pre-filtered and gain-corrected amplitudes, Wood-Anderson
corrected and measured in mm for Southwestern Germany, :cite:t:`stange-2006`:
.. math::
MLc = \log_{10}(A) + 1.11 * \log_{10}(r) + 0.00095 * r + 0.69 + c_0
* Wood-Anderson-corrected displacement amplitudes measured in mm for
Southern California, :cite:t:`hutton-1987`:
.. math::
MLc = \log_{10}(A) + 1.110 * \log_{10}(r / 100) + 0.00189 * (r - 100) + 3.0
* Pre-filtered velocity amplitudes in units of µm/s (which requires setting
:confval:`amplitudes.MLc.amplitudeScale`), no Wood-Anderson correction,
for West Bohemia, e.g. :cite:t:`hiemer-2012`:
.. math::
MLc = \log_{10}(A) - \log_{10}(2\pi) + 2.1 * \log_{10}(r) - 1.7 + c_0
.. figure:: media/magnitude-calibrations_MLc_s_MLc_hb.png
:align: center
:width: 18cm
MLc magnitudes for measured amplitude of 1 mm with default magnitude
calibration (*MLc_s*, :cite:t:`stange-2006`) and calibration values for Southern
California (*MLc_hb*, :cite:t:`hutton-1987`).
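As an illustration, the parametric calibration with the default coefficients of
:cite:t:`stange-2006` (c1 = 0.69, c2 = 0.00095, c3 = 1.11, with c0 = 0, c4 = 0 and
c5 = 1 as implied by the default example above) can be sketched in Python; the
helper below is not part of |scname|:

.. code-block:: python

   import math

   def mlc_parametric(amp_mm, r_km, c0=0.0, c1=0.69, c2=0.00095,
                      c3=1.11, c4=0.0, c5=1.0):
       """Sketch of the parametric MLc calibration (defaults after Stange, 2006).

       amp_mm -- simulated Wood-Anderson amplitude in mm
       r_km   -- hypocentral (default) or epicentral distance in km
       """
       return (math.log10(amp_mm) + c3 * math.log10(r_km / c5)
               + c2 * (r_km + c4) + c1 + c0)

   # 1 mm amplitude at 100 km hypocentral distance
   print(round(mlc_parametric(1.0, 100.0), 2))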
Network magnitude
-----------------
The network magnitude is computed from station magnitudes automatically by
:ref:`scmag` or interactively by :ref:`scolv`.
Originally the median was computed from all station MLc to form the
:term:`network magnitude` MLc. Here, the trimmed mean is applied. Outliers
beyond the outer 12.5% percentiles are removed before forming the mean. The
method can be adjusted in :ref:`scmag` by :confval:`magnitudes.average`.
Moment magnitude
----------------
MLc can be scaled to a moment magnitude, Mw(MLc), by a magnitude-type profile in
global module configuration. Read the
:ref:`Tutorial on moment magnitudes <tutorials_mags_moment>` for the details.
Magnitude aliases
-----------------
Magnitude aliases can be created by :confval:`magnitudes.aliases` in
global module configuration in order to derive other magnitude types from
original amplitudes and magnitudes. The actual amplitude and magnitude
parameters of the aliases will be configured in global bindings or by
magnitude-type profiles in global module configuration. Read the
:ref:`Tutorial on magnitude aliases <tutorials_magnitude-aliases>` for the
details.
Regionalization
---------------
Regionalization may be achieved by a magnitude-type profile in global module
configuration. Read the
:ref:`Tutorial on regionalization <tutorials_magnitude-region>` for the details.
Setup
=====
#. **Set the configuration and calibration parameters** in the global bindings
similar
to :ref:`global_ml`. Instead of configuring lots of global bindings profiles
or station bindings one line per parameter can be added to the global module
configuration (:file:`global.cfg`) which takes the form
.. code-block:: params
module.trunk.NET.STA.amplitudes.MLc.preFilter = value
module.trunk.NET.STA.magnitudes.MLc.parametric.c0 = value
#. Add MLc to the list of default amplitudes and magnitudes if MLc is to be
computed by automatic modules, e.g. of :ref:`scamp`, :ref:`scmag`.
#. Configure :ref:`scmag` (:confval:`magnitudes.average` in :file:`scmag.cfg`)
for choosing the method to form the
network magnitude from station magnitudes, e.g.
.. code-block:: params
magnitudes.average = MLc:median
#. Add MLc to the list of magnitudes preferred by :ref:`scevent`
(:confval:`eventAssociation.magTypes` in :file:`scevent.cfg`) in order to let
MLc become the preferred magnitude.
#. Set defaults/visibility of MLc in :term:`GUI` modules, e.g. :ref:`scolv`
or :ref:`scesv`.
.. note::
All default values for bindings configuration parameters are from
:cite:t:`stange-2006`.

View File

@ -0,0 +1,60 @@
.. _global_mlh:
###
MLh
###
The MLh plugin (previously MLsed) is designed to compute amplitudes
and magnitudes according to the Swiss Seismological Service (SED)
standards.
Description
===========
Amplitude
---------
The MLh amplitude calculation is very similar to the original :ref:`ML<global_ml>`.
The two differences are:
- It uses the maximum of the two horizontal components (average can be configured if necessary)
- It uses zero-to-peak instead of peak-to-peak values
Zero-to-peak is calculated by simply dividing the peak-to-peak amplitude by two.
This is not exact for asymmetric signals, but that does not matter because the
code internally generates zero-to-peak amplitudes and multiplies them
by two. So in the end real zero-to-peak values are obtained.
Station Magnitude
-----------------
The MLh plugin calculates the individual station magnitude using the following formula:
.. math::
mag = \log_{10}(waampl) + A \times hypdistkm + B
waampl is the amplitude produced by the MLh plugin. Hypdistkm is the distance
from the sensor to the hypocenter in kilometers. A and B are parameters that
can be configured in a config file. Several pairs of A and B can be configured
for different ranges of hypocenter distance.
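A hedged Python sketch of this computation is given below. The (A, B) pairs and their distance ranges are purely illustrative placeholders, not the SED calibration:
.. code-block:: python
import math
# Hypothetical calibration: (maximum distance in km, A, B) per distance range
CALIBRATION = [(60.0, 0.018, 1.77), (1000.0, 0.0038, 2.62)]
def mlh(waampl_mm, hypdistkm):
    # mag = log10(waampl) + A * hypdistkm + B, with (A, B) picked by distance range
    for max_dist, a, b in CALIBRATION:
        if hypdistkm <= max_dist:
            return math.log10(waampl_mm) + a * hypdistkm + b
    raise ValueError("distance outside the configured ranges")
print(round(mlh(1.0, 40.0), 2))  # 2.49 with the illustrative values above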
* Amplitude unit in SeisComP: **millimeter** (mm)
* Time window: 150 s by :ref:`scautopick` or distance dependent
* Distance range: 0 - 20 deg
* Depth range: 0 - 80 km
Network Magnitude
-----------------
To compute the network magnitude from station magnitudes the SED standard is applied
by computing the median value of all contributing station magnitudes without any trimming.
Configuration
-------------
Add the *mlh* plugin to the existing plugins in the global configuration.
Set the calibration parameters in the global bindings to compute MLh.
There is no default configuration. Add MLh to the list of
amplitudes and magnitudes in the configuration of :ref:`scamp` and :ref:`scmag` for computation
and in :ref:`scesv` for visibility.
@ -0,0 +1,75 @@
.. _global_mlr:
###
MLr
###
The GNS/Geonet local magnitude
Description
===========
The MLr magnitude provides a GNS/Geonet local magnitude (:cite:t:`ristau-2016`).
MLr magnitudes are implemented by the *mlr* plugin.
It is a modified version of the gempa ML magnitude developed
at the Liverpool developer meeting (:cite:t:`gempa`) based on the SED
:ref:`MLh<global_mlh>` magnitude.
The *mlr* plugin is designed to use the MLv station amplitudes for computing
MLr magnitudes.
The magnitude uses a station correction term and the hypocentral distance.
Amplitude
=========
The MLr amplitude calculation is that of :ref:`MLv<global_mlv>`.
Station Magnitude
=================
The *mlr* plugin calculates individual MLr station magnitudes from
:ref:`MLv<global_mlv>` amplitudes as:
.. math::
MLr = \log_{10}(waampl) - \log_{10}(waamplRef)
.. math::
\log_{10}(waamplRef) = 0.2869 - 1.272 \times 10^{-3} \times hypdistkm - 1.493 \times \log_{10}(hypdistkm) + StationCorrection
with
* *waampl:* the :ref:`MLv<global_mlv>` amplitude.
* *hypdistkm:* the distance from the sensor to the hypocenter in kilometers.
* *A(station):* Station correction is given by module.trunk.NZ.WEL.MLR.params, A.
Station Correction is set to be distance dependent.
Format: "UpToKilometers A ; UpToNextKilometers A ".
The option "nomag" disables the station magnitude.
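The following Python sketch illustrates the formula and the distance-dependent station correction format described above (the correction string and amplitude are hypothetical; the plugin's actual parsing, including the "nomag" option, is not reproduced here):
.. code-block:: python
import math
def station_correction(param, hypdistkm):
    # Pick the correction for the distance from "UpToKilometers A ; UpToNextKilometers A"
    for entry in param.split(";"):
        up_to_km, value = entry.split()
        if hypdistkm <= float(up_to_km):
            return float(value)
    return 0.0
def mlr(waampl, hypdistkm, correction):
    log_waampl_ref = (0.2869 - 1.272e-3 * hypdistkm
                      - 1.493 * math.log10(hypdistkm) + correction)
    return math.log10(waampl) - log_waampl_ref
# Hypothetical correction string, MLv amplitude of 1 mm at 150 km
corr = station_correction("100 0.05 ; 400 0.1", 150.0)
print(round(mlr(1.0, 150.0, corr), 2))  # 3.05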
General parameters:
* Amplitude unit in SeisComP: **millimeter** (mm) from :ref:`MLv<global_mlv>`
* Time window: 150 s by :ref:`scautopick` or distance dependent
* Distance range: 0 - 20 deg (hypocentral distance, hard-coded)
* Depth range: 0 - 800 km (hard-coded)
Network magnitude
=================
The GNS/Geonet MLr local magnitude uses the default |scname| behaviour for
the automatic network magnitudes.
Hard-coded ranges are 0 - 20 degrees maximum distance and 800 km maximum depth.
Configuration
=============
Add the mlr plugin to the existing plugins in the global configuration.
Set the calibration parameters in the global bindings. Add MLr to the list of
magnitudes in the configuration of :ref:`scamp` and :ref:`scmag` for computation
and in :ref:`scesv` for visibility.
@ -0,0 +1,117 @@
.. _global_mlv:
###
MLv
###
Local (Richter) magnitude measured on the vertical component
Description
===========
MLv is the local (Richter) magnitude (:cite:t:`richter-1935`) computed from amplitudes measured on the
vertical component.
General (default) conditions apply:
* Amplitude unit in SeisComP: **millimeter** (mm) by simulation of a :term:`Wood-Anderson seismometer`.
* Time window: 150 s by :ref:`scautopick` or distance dependent, configurable.
* Default distance range: 0 - 8 deg, maximum is configurable by
:confval:`magnitudes.MLv.maxDistanceKm`; measurements beyond 8 deg will be
strictly ignored.
* Depth range: no limitation.
Amplitudes
----------
The MLv amplitude calculation is very similar to the original :ref:`ML<global_ml>`,
except that the amplitude is measured on the vertical component. The methods
for measuring amplitudes are configurable in the global bindings.
Station Magnitudes
------------------
The individual station MLv is calculated up to the epicentral distance
:confval:`magnitudes.MLv.maxDistanceKm` using the following formula:
.. math::
MLv = \log_{10}(A) - \log_{10}(A0)
A is the MLv Wood-Anderson amplitude in millimeters. The second term
is the empirical calibration function, which in turn is a function
of the epicentral distance (see :cite:t:`richter-1935`). This calibration
function can be configured globally or per station using global
bindings or the global module configuration variable
module.trunk.global.magnitudes.MLv.logA0 in :file:`global.cfg`, e.g. ::
module.trunk.global.magnitudes.MLv.logA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"
module.trunk.global.magnitudes.MLv.maxDistanceKm = "-1"
The logA0 configuration string consists of an arbitrary number of
distance-value pairs separated by commas. The distance is in km
and the value corresponds to the *log10(A0)* term above.
Within each interval the values are computed by linear
interpolation. E.g. for the above default specification, at a
distance of 80 km the *log10(A0)* value would be
.. math::
\log_{10}(A0) &= ((-3.0)-(-2.8)) \times (80-60)/(100-60) - 2.8 \\
&= -2.9
In other words, at 80 km distance the magnitude would be
.. math::
MLv &= \log_{10}(A) - (-2.9) \\
&= \log_{10}(A) + 2.9
which is according to the original Richter formula :cite:p:`richter-1935` if the
amplitude is measured in millimeters.
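The interpolation can be illustrated with a short Python sketch that parses the logA0 string shown above (only an illustration of the interpolation rule, not the |scname| implementation):
.. code-block:: python
import math
LOGA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"
def log_a0(dist_km, spec=LOGA0):
    # Linear interpolation of log10(A0) between the configured distance nodes
    nodes = [(float(d), float(v)) for d, v in
             (pair.split(":") for pair in spec.split(","))]
    for (d1, v1), (d2, v2) in zip(nodes, nodes[1:]):
        if d1 <= dist_km <= d2:
            return v1 + (v2 - v1) * (dist_km - d1) / (d2 - d1)
    raise ValueError("distance outside the configured range")
def mlv(amplitude_mm, dist_km):
    return math.log10(amplitude_mm) - log_a0(dist_km)
print(round(log_a0(80.0), 2))    # -2.9, as in the worked example above
print(round(mlv(1.0, 80.0), 2))  # 2.9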
Network magnitude
-----------------
By default, the trimmed mean is calculated from the station magnitudes to form
the :term:`network magnitude`. Outliers beyond the outer 12.5% percentiles are
removed before forming the mean.
Configuration
-------------
Several distance-value pairs can be configured for different ranges of
epicentral distance.
The calibration function and maximum distance can be configured globally,
per network or per station using the configuration variables. Instead of configuring
many global bindings profiles or station bindings, one line per parameter can be
added to the global module configuration (:file:`global.cfg`), e.g.
global:
.. code-block:: params
module.trunk.global.magnitudes.MLv.logA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"
module.trunk.global.magnitudes.MLv.maxDistanceKm = -1
or per network:
.. code-block:: params
module.trunk.GR.magnitudes.MLv.logA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"
module.trunk.GR.magnitudes.MLv.maxDistanceKm = -1
or per station:
.. code-block:: params
module.trunk.GR.MOX.magnitudes.MLv.logA0 = "0:-1.3,60:-2.8,100:-3.0,400:-4.5,1000:-5.85"
module.trunk.GR.MOX.magnitudes.MLv.maxDistanceKm = -1
Set the configuration and calibration parameters in the global bindings. By
default MLv is computed by :ref:`scautopick` and is visible in GUIs.
@ -0,0 +1,118 @@
.. _global_mn:
##
MN
##
Nuttli magnitude for Canada and other Cratonic regions
Description
===========
MN is the Nuttli magnitude :cite:p:`nuttli-1973` for Canada and other Cratonic
regions. It is implemented by the *nuttli* plugin according to the
Geological Survey of Canada (NRCan).
For measuring AMN amplitudes and for computing MN magnitudes |scname| provides
regionalization.
Amplitude
---------
Amplitude unit in |scname|: **meter/second** (m/s)
Settings
========
Add the *nuttli* plugin to the list of loaded plugins e.g. in the global module configuration:
.. code-block:: sh
plugins = ${plugins},nuttli
Adjust MN-specific global bindings parameters in the amplitude section and set the
region-specific calibration parameters in the global module configuration.
scamp
-----
Add the Nuttli amplitude type, **AMN**, to the range of magnitudes for which the amplitudes are
to be calculated by :ref:`scamp`, e.g.:
.. code-block:: sh
amplitudes = AMN
Adjust MN-specific global bindings parameters in the amplitude section and set the
region-specific calibration parameters in the global module configuration
(amplitude section).
.. note::
Provide *AMN* for computing Nuttli-type amplitudes.
scmag
-----
Add the Nuttli magnitude type, **MN**, to the range of magnitudes to be calculated by
:ref:`scmag`, e.g.:
.. code-block:: sh
magnitudes = MN
Adjust MN-specific global bindings parameters in the magnitude section and define
the region polygons in the global module configuration (magnitude section).
.. _global_mn_configuration:
Module Configuration
====================
.. note::
**amplitudes.MN.\***
*Amplitude control parameters for MN (Nuttli magnitude).*
.. confval:: amplitudes.MN.velocityModel
Default: ``iasp91``
Type: *string*
The travel time table set compiled for LocSAT. The tables
are located in \"share\/locsat\/tables\/[vmodel].\*\".
.. note::
**magnitudes.MN.\***
*Regionalization of MN (Nuttli magnitude).*
.. confval:: magnitudes.MN.region
Default: ``@DATADIR@/magnitudes/MN/MN.bna``
Type: *path*
The path to the BNA file which defines the valid region
for the MN magnitude. Note that the entire path from
source to receiver must lie within the polygon\(s\).
.. confval:: magnitudes.MN.offsetMw
Type: *double*
The offset applied to the MN network magnitude to
estimate Mw\(MN\). If not configured then no Mw estimation
will be applied.
@ -0,0 +1,66 @@
.. _global_ms_20:
#####
Ms_20
#####
Surface wave magnitude measured at around 20 s
Description
===========
Ms_20 is the surface wave magnitude measured on the vertical component at around
20 s period in accordance with the IASPEI standards.
Amplitude
---------
The Ms_20 amplitudes are calculated on vertical-component displacement seismograms
corrected for the instrument response of a :term:`WWSSN_LP` seismograph.
Station Magnitude
-----------------
Ms_20 is the surface-wave magnitude at 20 s period based on the recommendations
by the IASPEI magnitude working group issued on 27 March, 2013 (:cite:t:`iaspei-2013`).
.. math::
M_s = \log \left(\frac{A}{T}\right) + 1.66 \log(\Delta) + 0.3
with
A: :term:`WWSSN_LP` corrected ground displacement in units of nm measured on the vertical-component
seismogram as the maximum absolute trace amplitude of a surface wave at periods between
18 s and 22 s,
T: period of the surface wave in seconds.
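As an illustration, a short Python sketch of this station magnitude formula, assuming base-10 logarithms with A in nm, T in s and the distance in degrees; the measurement values are made up:
.. code-block:: python
import math
def ms_20(amplitude_nm, period_s, distance_deg):
    # Ms_20 = log10(A/T) + 1.66 * log10(distance) + 0.3
    return math.log10(amplitude_nm / period_s) + 1.66 * math.log10(distance_deg) + 0.3
# Hypothetical measurement: 2000 nm at 20 s period and 60 deg distance
print(round(ms_20(2000.0, 20.0, 60.0), 2))  # 5.25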
The term *Ms_20* is chosen in accordance with the IASPEI standard as of 2013 (:cite:t:`iaspei-2013`).
Alternatively, the term *Ms(20)* may be used.
* Amplitude unit in |scname|: **nanometer** (nm)
* Time window: 0 s - distance (km) / 3.5 km/s + 30 s
* Period range: 18 s - 22 s, configurable: :confval:`magnitudes.Ms_20.lowerPeriod`,
:confval:`magnitudes.Ms_20.upperPeriod`
* Default distance range: 20 - 160 deg, configurable: :confval:`magnitudes.Ms_20.minDist`,
:confval:`magnitudes.Ms_20.maxDist`
* Depth range: <= 100 km, configurable: :confval:`magnitudes.Ms_20.maxDepth`
Network magnitude
-----------------
By default, the trimmed mean is calculated from the station magnitudes to form
the :term:`network magnitude`. Outliers below the 12.5% and above the 87.5% percentiles are
removed before the calculation.
Configuration
-------------
Adjust the configurable parameters in global bindings in the Ms_20 section. Add
Ms_20 to the list of computed amplitudes and magnitudes in the configuration of
:ref:`scamp` and :ref:`scmag` and in :ref:`scesv` or :ref:`scolv` for visibility.
@ -0,0 +1,567 @@
.. _global_nonlinloc:
#########
NonLinLoc
#########
NonLinLoc locator wrapper plugin for SeisComP.
NonLinLoc was written by Anthony Lomax (http://alomax.free.fr/nlloc).
Description
===========
Funded by `SED/ETH Zurich <http://www.seismo.ethz.ch/>`_, developed by `gempa GmbH <http://www.gempa.de>`_.
This plugin is available from SeisComP version Release Potsdam 2010 and later.
The `NonLinLoc (NLL) <http://alomax.free.fr/nlloc>`_ locator algorithm has been
implemented into SeisComP through the plugin mechanism. A new plugin locnll
contains the LocatorInterface implementation for NonLinLoc.
The implementation bundles the NonLinLoc source files required to use the library
function calls. The following source files are included:
.. code-block:: sh
GridLib.c
GridLib.h
GridMemLib.c
GridMemLib.h
NLLocLib.h
NLLoc1.c
NLLocLib.c
calc_crust_corr.c
calc_crust_corr.h
crust_corr_model.h
crust_type_key.h
crust_type.h
loclist.c
octtree.h
octtree.c
phaseloclist.h
phaselist.c
geo.c
geo.h
geometry.h
map_project.c
map_project.h
otime_limit.c
otime_limit.h
ran1.c
ran1.h
velmod.c
velmod.h
util.h
util.c
alomax_matrix/alomax_matrix.c
alomax_matrix/alomax_matrix.h
alomax_matrix/alomax_matrix_svd.c
alomax_matrix/alomax_matrix_svd.h
Error measures
==============
After running NonLinLoc the output is converted into a SeisComP (QuakeML) origin
object including all available error measures. The following table shows how
the NLL error measures are mapped to the SeisComP data model:
========================================================= =====================================================
SeisComP NLL
========================================================= =====================================================
Origin.latitude.uncertainty sqrt(phypo->cov.yy)
Origin.longitude.uncertainty sqrt(phypo->cov.xx)
Origin.depth.uncertainty sqrt(phypo->cov.zz)
Origin.originQuality.standardError phypo->rms
Origin.originQuality.secondaryAzimuthalGap phypo->gap_secondary
Origin.originQuality.usedStationCount phypo->usedStationCount
Origin.originQuality.associatedStationCount phypo->associatedStationCount
Origin.originQuality.associatedPhaseCount phypo->associatedPhaseCount
Origin.originQuality.usedPhaseCount phypo->nreadings
Origin.originQuality.depthPhaseCount phypo->depthPhaseCount
Origin.originQuality.minimumDistance km2deg(phypo->minimumDistance)
Origin.originQuality.maximumDistance km2deg(phypo->maximumDistance)
Origin.originQuality.medianDistance km2deg(phypo->medianDistance)
Origin.originQuality.groundTruthLevel phypo->groundTruthLevel
Origin.originUncertainty.horizontalUncertainty phypo->ellipse.len2
Origin.originUncertainty.minHorizontalUncertainty phypo->ellipse.len1
Origin.originUncertainty.maxHorizontalUncertainty phypo->ellipse.len2
Origin.originUncertainty.azimuthMaxHorizontalUncertainty phypo->ellipse.az1 + 90
ConfidenceEllipsoid.semiMajorAxisLength phypo->ellipsoid.len3
ConfidenceEllipsoid.semiMinorAxisLength phypo->ellipsoid.len1
ConfidenceEllipsoid.semiIntermediateAxisLength phypo->ellipsoid.len2
ConfidenceEllipsoid.majorAxisPlunge (phypo->ellipsoid.axis1 x phypo->ellipsoid.axis2).dip
ConfidenceEllipsoid.majorAxisAzimuth (phypo->ellipsoid.axis1 x phypo->ellipsoid.axis2).az
ConfidenceEllipsoid.majorAxisRotation T.B.D.
========================================================= =====================================================
Plugin
======
The NonLinLoc plugin is installed under :file:`share/plugins/locnll.so`.
It provides a new implementation of the LocatorInterface with the name NonLinLoc.
To add the plugin to a module, add it to the module's configuration, either
:file:`modulename.cfg` or :file:`global.cfg`:
.. code-block:: sh
plugins = ${plugins}, locnll
Basically it can be used by two modules: :ref:`screloc` and :ref:`scolv`.
Output
======
All output is stored in the configured :confval:`NonLinLoc.outputPath`.
The file prefix for a location is the originID (:confval:`NonLinLoc.publicID`).
The following files are stored:
- Input observations (.obs)
- Input configuration (.conf)
- NLL location (.loc.hyp)
- NLL 3D grid header (.loc.hdr)
- NLL octree (.loc.octree)
- NLL scatter file (.loc.scat)
In addition to the native NLL output a SeisComP origin object is created and
returned to the calling instance. Usually this object is then sent via messaging.
Profiles
========
The plugin allows specifying multiple configuration profiles (:confval:`NonLinLoc.profiles`).
The profile to use can be selected both in `scolv` and `screloc`; in addition, a
virtual profile `automatic` is provided, which selects the best matching
configured profile based on the initial location. For this reason each profile
contains some configuration parameters that define where the profile is valid
(`transform`, `region`, `origin`, `rotation`). The `transform` profile
configuration parameter supports only `GLOBAL` or `SIMPLE` at the moment: only the
profile has this limitation, not the NonLinLoc control file, which supports
all transformations available in NonLinLoc.
**NOTE**: If a profile `transform` is set as `GLOBAL` and the `region` parameter is
left `empty`, then the plugin adds the line `TRANS GLOBAL` to the control file,
forcing a global transformation.
Configuration example
=====================
To add the plugin to an application such as scolv or screloc, add the plugin
name to the list of plugins that are loaded (e.g. :file:`scolv.cfg`):
.. code-block:: sh
plugins = ${plugins}, locnll
Furthermore, add the plugin configuration (e.g. :file:`scolv.cfg`):
.. code-block:: sh
########################################################
################ NonLinLoc configuration################
########################################################
NLLROOT = ${HOME}/nll/data
NonLinLoc.outputPath = ${NLLROOT}/output/
# Define the default control file if no profile specific
# control file is defined.
NonLinLoc.controlFile = ${NLLROOT}/NLL.default.conf
# Set the default pick error in seconds passed to NonLinLoc
# if no SeisComP pick uncertainty is available.
NonLinLoc.defaultPickError = 0.1
# Define the available NonLinLoc location profiles. The order
# implicitly defines the priority for overlapping regions
#NonLinLoc.profiles = swiss_3d, swiss_1d, global
NonLinLoc.profiles = swiss_3d, global
# The earthModelID is copied to earthModelID attribute of the
# resulting origin
NonLinLoc.profile.swiss_1d.earthModelID = "swiss regional 1D"
# Specify the velocity model table path as used by NonLinLoc
NonLinLoc.profile.swiss_1d.tablePath = ${NLLROOT}/time_1d_regio/regio
# Specify the region valid for this profile
# Without this parameter the plugin will add an additional
# TRANS GLOBAL statement in the NLL control file
NonLinLoc.profile.swiss_1d.region = 41.2, 3.8, 50.1, 16.8
# The NonLinLoc control file to use for this profile
NonLinLoc.profile.swiss_1d.controlFile = ${NLLROOT}/NLL.swiss_1d.conf
# Configure the swiss_3d profile
NonLinLoc.profile.swiss_3d.earthModelID = "swiss regional 3D"
NonLinLoc.profile.swiss_3d.tablePath = ${NLLROOT}/time_3d/ch
NonLinLoc.profile.swiss_3d.region = 45.15, 5.7, 48.3, 11.0
NonLinLoc.profile.swiss_3d.controlFile = ${NLLROOT}/NLL.swiss_3d.conf
# And the global profile
NonLinLoc.profile.global.earthModelID = iaspei91
NonLinLoc.profile.global.tablePath = ${NLLROOT}/iasp91/iasp91
NonLinLoc.profile.global.controlFile = ${NLLROOT}/NLL.global.conf
The following is an example of a NonLinLoc control file configuration that contains
all the required statements; it must be adapted to the specific use case. The
missing statements are generated by the plugin (LOCFILES, LOCHYPOUT, LOCSRCE):
.. code-block:: sh
# -1 = no logs, useful for playback with screloc --ep option
CONTROL -1 123456
# This must be the same TRANS used for generating the grid files
TRANS SDC 46.51036987 8.47575546 0.0
LOCSIG Swiss Seismological Service, ETHZ
LOCCOM location using my local velocity model
LOCSEARCH OCT 20 20 20 0.001 10000 1000
# This grid origin is relative to the TRANS statement. The grid
# must be wholly contained in the grid files
LOCGRID 101 101 101 -0.5 -0.5 -1.8 0.01 0.01 0.01 PROB_DENSITY SAVE
LOCMETH EDT_OT_WT 9999.0 4 -1 -1 -1 0 -1 1
LOCGAU 0.001 0.0
LOCPHASEID P P p G Pn Pg P1
LOCPHASEID S S s G Sn Sg S1
LOCQUAL2ERR 0.025 0.050 0.100 0.200 0.400 99999.9
LOCANGLES ANGLES_YES 5
**NOTE**: The LOCHYPOUT parameter statement is always generated by the plugin.
By default it outputs `LOCHYPOUT NONE`. If `enableSEDParameters` is enabled
or if the original control file contains `CALC_SED_ORIGIN`, it will append
`CALC_SED_ORIGIN`. If the original control file contains `SAVE_NLLOC_EXPECTATION`,
that flag will also be preserved. Currently, only `CALC_SED_ORIGIN` and
`SAVE_NLLOC_EXPECTATION` are supported by the plugin. Any other options are
omitted and will not be forwarded to NonLinLoc.
Station names
==============
When generating the grid files the station names used in the GTSRCE statement
must match the rule set in the plugin configuration. E.g.
.. code-block:: sh
# Format of the station name used to select the right travel time table (grid)
# file for a station. By default only the station code is used (e.g.
# tablePath.P.@STA@.time.*), but that doesn't allow to distinguish between
# multiple network codes or location codes that use the same station code. To
# overcome this limitation this parameter could be set in a more general way,
# for example @NET@_@STA@_@LOC@. In this way NonLinLoc will look for travel
# time table (grid) files of the form: tablePath.P.@NET@_@STA@_@LOC@.time.*
# Where @NET@ @STA@ @LOC@ are just placeholder for the actual codes
NonLinLoc.profile.MYPROFILE.stationNameFormat = @NET@_@STA@_@LOC@
Given the above plugin configuration, the GTSRCE statement should be something
like this:
.. code-block:: sh
GTSRCE CH_STA01_ LATLON 46.519 8.474 0.0 1.295
GTSRCE CH_STA02_01 LATLON 46.456 8.474 0.0 1.323
GTSRCE CH_STA03_AA LATLON 46.784 8.474 0.0 1.292
Alternatively the names could just contain the station code:
.. code-block:: sh
NonLinLoc.profile.MYPROFILE.stationNameFormat = @STA@
.. code-block:: sh
GTSRCE STA01 LATLON 46.519 8.474 0.0 1.295
GTSRCE STA02 LATLON 46.456 8.474 0.0 1.323
GTSRCE STA03 LATLON 46.784 8.474 0.0 1.292
Usage
=====
Locator
-------
The usage of the new NLL plugin is straightforward. Once loaded successfully, the
new locator shows up in the combo box in the lower left corner.
.. figure:: media/nonlinloc/locator_selection_small.png
Select the new NonLinLoc locator and the configured profiles will be loaded into
the combo box right of it.
.. figure:: media/nonlinloc/locator_profile_selection_small.png
The NonLinLoc implementation provides a virtual profile `automatic`. This emulates
the fully automatic case and selects the best matching configured profile
based on the initial location.
If an origin has been relocated, its method is set to "NonLinLoc" and
its earth model is set to the string configured in
NonLinLoc.profile.[name].earthModelID for the selected profile.
.. figure:: media/nonlinloc/origin_information.png
Settings
--------
The NLL locator implementation supports overriding configured settings or
control parameters for a session. Those changes are not persistent and are lost if
the locator is changed to another one or the profile has been changed. However,
this feature is particularly useful when trying different settings on a particular
origin or for enabling the NonLinLoc logs (`CONTROL` statement) which become
visible on the console.
To open the settings dialog press the button right to the locator selection
combo box.
.. figure:: media/nonlinloc/locator_settings.png
Then the NLL specific parameters show up.
.. figure:: media/nonlinloc/NLL_settings.png
Seismicity Viewer
-----------------
scolv provides two additional configurable buttons. To bind
`Seismicity Viewer <http://alomax.free.fr/seismicity>`_ to the first one the
following configuration can be used:
.. code-block:: sh
button0 = "Seismicity Viewer"
scripts.script0 = @CONFIGDIR@/scripts/sv
A small wrapper script sv has been created that calls Seismicity Viewer based
on the origin ID passed to the script.
.. code-block:: sh
#!/bin/sh
FILE=$HOME/nll/data/output/$1.loc.hyp
java -classpath $HOME/nll/bin/SeismicityViewer50.jar \
net.alomax.seismicity.Seismicity $FILE
This examples assumes that Seismicity Viewer has been installed in $HOME/nll/bin.
.. _global_nonlinloc_configuration:
Module Configuration
====================
.. confval:: NonLinLoc.publicID
Default: ``NLL.@time/%Y%m%d%H%M%S.%f@.@id@``
Type: *string*
PublicID creation pattern for an origin created by NonLinLoc.
.. confval:: NonLinLoc.outputPath
Default: ``/tmp/sc3.nll``
Type: *path*
Defines the output path for all native NonLinLoc input and output files.
.. confval:: NonLinLoc.saveInput
Default: ``true``
Type: *boolean*
Save input files \*.obs in outputPath for later processing.
Setting to false reduces file i\/o and saves disk space.
.. confval:: NonLinLoc.saveIntermediateOutput
Default: ``true``
Type: *boolean*
Save output files in outputPath for later processing or
for viewing by the Seismicity Viewer.
Setting to false reduces file i\/o and saves disk space.
.. confval:: NonLinLoc.controlFile
Type: *path*
The default NonLinLoc control file to use.
.. confval:: NonLinLoc.defaultPickError
Default: ``0.5``
Type: *double*
Unit: *s*
The default pick error in seconds passed to NonLinLoc if a SeisComP pick
object does not provide pick time uncertainties.
.. confval:: NonLinLoc.fixedDepthGridSpacing
Default: ``0.1``
Type: *double*
Unit: *km*
Since NLL does not support fixing the depth natively, this
feature is emulated by setting the Z grid very tightly around
the depth to be fixed. This value sets the Z grid spacing.
.. confval:: NonLinLoc.allowMissingStations
Default: ``true``
Type: *boolean*
Picks from stations with missing configuration will be
ignored. The origin will be relocated without that pick
if possible.
If set to false, the plug\-in throws
an exception without locating.
.. confval:: NonLinLoc.profiles
Type: *list:string*
Defines a list of active profiles to be used by the plugin.
.. note::
**NonLinLoc.profile.$name.\***
*Defines a regional profile that is used if a prelocation falls*
*inside the configured region.*
$name is a placeholder for the name to be used and needs to be added to :confval:`NonLinLoc.profiles` to become active.
.. code-block:: sh
NonLinLoc.profiles = a,b
NonLinLoc.profile.a.value1 = ...
NonLinLoc.profile.b.value1 = ...
# c is not active because it has not been added
# to the list of NonLinLoc.profiles
NonLinLoc.profile.c.value1 = ...
.. confval:: NonLinLoc.profile.$name.earthModelID
Type: *string*
earthModelID that is stored in the created origin.
.. confval:: NonLinLoc.profile.$name.methodID
Default: ``NonLinLoc``
Type: *string*
methodID that is stored in the created origin.
.. confval:: NonLinLoc.profile.$name.tablePath
Type: *path*
Path to travel time tables \(grids\).
.. confval:: NonLinLoc.profile.$name.stationNameFormat
Default: ``@STA@``
Type: *string*
Format of the station name used to select the right travel time table \(grid\) file
for a station.
By default only the station code is used \(e.g. tablePath.P.\@STA\@.time.\*\), but
that doesn't allow to distinguish between multiple network codes or location codes
that use the same station code.
To overcome this limitation this parameter could be set in a more general way, for
example \@NET\@_\@STA\@_\@LOC\@. In this way NonLinLoc will look for
travel time table \(grid\) files of the form: tablePath.P.\@NET\@_\@STA\@_\@LOC\@.time.\*
Where \@NET\@ \@STA\@ \@LOC\@ are just placeholder for the actual codes
.. confval:: NonLinLoc.profile.$name.controlFile
Type: *path*
Control file of the current profile. If not set, the default
control file will be used instead.
.. confval:: NonLinLoc.profile.$name.transform
Default: ``GLOBAL``
Type: *string*
Transformation type of the configured region. Supported are
SIMPLE and GLOBAL.
Default: GLOBAL is assumed.
.. confval:: NonLinLoc.profile.$name.region
Type: *list:double*
Defines the 4 corner values of the epicentral region for selecting the profile.
The original epicentre must be within the region.
If transform is GLOBAL: min_lat, min_lon, max_lat, max_lon.
The values define the geographic corner coordinates. Unit is degree.
If transform is SIMPLE: xmin, ymin, xmax, ymax.
The values define the region relative to the configured origin.
Unit is km.
.. confval:: NonLinLoc.profile.$name.origin
Type: *list:double*
Unit: *deg*
Only used for transformation SIMPLE. Expects 2 values: latitude, longitude.
The value define the geographic origin of the area spanned by region.
Unit is degree.
.. confval:: NonLinLoc.profile.$name.rotation
Type: *double*
Unit: *deg*
Only used for transformation SIMPLE. Defines the rotation around the
origin of the defined region.
@ -0,0 +1,522 @@
.. _global_recordstream:
############
RecordStream
############
RecordStream interface for SeisComP.
Description
===========
|scname| applications access waveform data through the RecordStream interface.
The following table lists the available implementations:
.. csv-table::
:header: "Name", "URL Scheme(s)", "Description"
":ref:`rs-arclink`", "``arclink``", "Connects to an ArcLink server"
":ref:`rs-balanced`", "``balanced``", "Distributes requests to multiple proxy streams"
":ref:`rs-routing`", "``routing``", "Distributes requests to multiple proxy streams according to user defined rules"
":ref:`rs-caps`", "``caps``, ``capss``", "Connects to a `gempa CAPS server <https://www.gempa.de/products/caps/>`_"
":ref:`rs-combined`", "``combined``", "Combines archive and real-time stream"
":ref:`rs-dec`", "``dec``", "Decimates (downsamples) a proxy stream"
":ref:`rs-fdsnws`", "``fdsnws``, ``fdsnwss``", "Connects to :ref:`FDSN web service <fdsnws>`"
":ref:`rs-file`", "``file``", "Reads records from file"
":ref:`rs-memory`", "``memory``", "Reads records from memory"
":ref:`rs-resample`", "``resample``", "Resamples (up or down) a proxy stream to a given sampling rate"
":ref:`rs-sdsarchive`", "``sdsarchive``", "Reads records from |scname| archive (:term:`SDS`)"
":ref:`rs-slink`", "``slink``", "Connects to :ref:`SeedLink server <seedlink>`"
Application
===========
The RecordStream parameters considered by an application are provided as a *URL*
in two alternative ways:
* Specification of the *URL* on the command line. Use the option ``-I URL``
* Configuration of the *URL* using the global parameter :confval:`recordstream`.
The URL scheme defines the specific RecordStream implementation. If the scheme
is omitted, the :ref:`rs-file` implementation is used as default.
.. note::
Older SeisComP versions used to split the URL into the parameters
:confval:`recordstream.service` and :confval:`recordstream.source`.
These parameters are not being used anymore.
Implementations
===============
.. _rs-slink:
SeedLink
--------
This RecordStream fetches data from a SeedLink server.
Definition
^^^^^^^^^^
URL: ``slink://[host][:port][?parameter]``
The default host is set to `localhost`, the default port to `18000`. Optional
URL encoded parameters are:
- `timeout` - connection timeout in seconds, default: 300
- `retries` - number of connection retry attempts, default: -1
- `no-batch` - disables BATCH mode to request data, does not take a value
Examples
^^^^^^^^
- ``slink://``
- ``slink://geofon.gfz-potsdam.de?timeout=60&retries=5``
- ``slink://localhost:18000``
.. _rs-arclink:
ArcLink
-------
This RecordStream fetches data from a ArcLink server.
Definition
^^^^^^^^^^
URL: ``arclink://[host][:port][?parameters]``
The default host is set to `localhost`, the default port to `18001`. Optional
URL encoded parameters are:
- `user` - user name required on some servers
- `pwd` - password required on some servers
- `dump` - optional output file for all records being received
Examples
^^^^^^^^
- ``arclink://``
- ``arclink://geofon.gfz-potsdam.de?user=foo&pwd=secret``
- ``arclink://localhost:18042``
- ``arclink://localhost?dump=test.mseed``
.. _rs-fdsnws:
FDSNWS
------
This RecordStream fetches data from a FDSN web service.
Definition
^^^^^^^^^^
URL: ``fdsnws[s]://host[:port][path]``
The host is a mandatory parameter. The default port depends on the URL scheme
used:
- `fdsnws`: `80` (HTTP)
- `fdsnwss`: `443` (HTTPS)
The default path is set to `/fdsnws/dataselect/1/query`. If a path is specified,
it needs to be complete up until the `query` resource.
Authentication via the `queryauth` resource is currently not supported.
Examples
^^^^^^^^
- ``fdsnws://service.iris.edu``
- ``fdsnws://service.iris.edu:80/fdsnws/dataselect/1/query``
- ``fdsnwss://geofon.gfz-potsdam.de``
.. _rs-file:
File
----
This RecordStream reads data from a file.
Definition
^^^^^^^^^^
URL: ``file://path``
The path may be an absolute or relative path to a file. Environment variables
are **not** resolved. If path is set to ``-`` the data is read from `stdin`.
Supported files types are:
* miniSEED
* SAC
* XML
* binary
By default the record type is set to `mseed`. SAC data can be read using the *#sac*
descriptor. If a file name extension is available, then the record type is set as
follows:
========= ===========
Extension Record Type
========= ===========
`*.xml` `xml`
`*.bin` `binary`
`*.mseed` `mseed`
========= ===========
Optional descriptor:
- `sac` - input data are in SAC format.
Examples
^^^^^^^^
- ``file://-``
- ``file:///tmp/input.mseed``
- ``file:///tmp/input.sac#sac``
.. note ::
When defining the File RecordStream on the command line using the option `-I`,
the file name can also be passed without the URL scheme, e.g. ::
-I -
-I /tmp/input.mseed
.. _rs-sdsarchive:
SDSArchive
----------
This RecordStream reads data from one or more |scname| (:term:`SDS`) archives using the
:ref:`rs-file` RecordStream.
Definition
^^^^^^^^^^
URL: ``sdsarchive://[path[,path2[, ...]]]``
The default path is set to `$SEISCOMP_ROOT/var/lib/archive`.
In contrast to a formal URL definition, the URL path is interpreted as a directory path list
separated by commas.
.. note::
When defining multiple directories separated by comma in a configuration
file, please enclose the entire definition (including ``sdsarchive://``) with
double quotes. Otherwise the configuration parser will interpret it as a list
and will only return the first part up to the first comma.
Different SDS archives are not merged but are read sequentially depending on
data existence. If a requested file is missing in the current SDS archive, it is
searched for in the next archive in the list. On success, all remaining files
for the current channel are delivered from that SDS archive; on failure, the
next SDS archive is searched.
This process is repeated for each requested channel individually, always
starting the search from the first given SDS archive to the last one.
Examples
^^^^^^^^
- ``sdsarchive://``
- ``sdsarchive:///home/sysop/seiscomp/var/lib/archive``
- ``sdsarchive:///SDSA,/SDSB,/SDSC``
.. _rs-caps:
CAPS
----
This RecordStream reads data from a
`gempa CAPS server <https://www.gempa.de/products/caps/>`_.
Definition
^^^^^^^^^^
URL: ``caps[s]://[user:pass@][host[:port]][?parameters]``
The default host is set to `localhost`. The default port depends on the URL scheme
used:
- `caps`: `18002`
- `capss`: `18022` (SSL)
Optional URL encoded parameters are:
- `arch` - No parameter. Retrieve only archived data. In this mode the connection
is closed when all available data have been sent. It will not wait for additional
real-time data.
- `ooo` - Allow out-of-order data
- `timeout` - The socket timeout in seconds
- `user` - **Deprecated:** The user name of an authenticated request. Please use
the standard URL userinfo in front of the host instead.
- `pwd` - **Deprecated:** The password of an authenticated request. Please use
the standard URL userinfo in front of the host instead.
- `request-file` - Use the given file to feed the request
Example
^^^^^^^
- ``caps://``
- ``caps://localhost:18002``
- ``capss://localhost:18022``
- ``caps://localhost:18002?arch``
- ``capss://user:mysecret@localhost``
.. _rs-memory:
Memory
------
This RecordStream reads data from memory and is only useful for developing
applications. For instance a record sequence stored in an internal buffer could
be passed to an instance of this RecordStream for reading.
.. _rs-combined:
Combined
--------
This RecordStream combines one archive and one real-time RecordStream, e.g.
:ref:`rs-arclink` and :ref:`rs-slink`. First the archive stream is read up to
the size of the real-time buffer. Then the acquisition is switched to the
real-time stream. The syntax for the source is similar to a URL:
Definition
^^^^^^^^^^
URL-like: ``combined://[real-time-stream];[archive-stream][??parameters]``
By default the real-time stream is set to :ref:`rs-slink` and the
archive-stream is set to :ref:`rs-arclink`. Any other streams may be configured.
The definition of the proxy streams has slightly changed: Scheme and source are
only separated by a slash, e.g. `slink://localhost` needs to be defined as
`slink/localhost`.
The URL parameters of the combined stream are separated by 2 question marks
(`??`) in order to distinguish them from the parameters used in the proxy
streams:
- `slinkMax|rtMax|1stMax` - Buffer size in seconds of the first stream
(typically the real-time stream), default: 3600
Time spans can be configured with an additional and optional suffix:
====== =============
Suffix Multiplicator
====== =============
s 1
m 60
h 3600
d 86400
w 86400*7
====== =============
- `splitTime` - The absolute time of the separation of both sources. The argument
is an ISO time string, e.g. 2018-05-10T12:00:00Z or a year, e.g. 2018, which is
the same as 2018-01-01T00:00:00.000Z.
`splitTime` can be used if the waveform archives are spread over several
directories or hard disks. See also the :ref:`examples<rs_splitTime>`.
The combined record stream may be nested allowing the configuration of a
(theoretically) infinite number of archive streams. The URL syntax for a nested
configuration uses parenthesis:
``combined://real-time-stream;combined/(archive-stream1;archive-stream2??parameters)??parameters``
.. _rs_splitTime:
Examples
^^^^^^^^
.. csv-table::
:header: "URL", "Description"
"``combined://localhost:18000;localhost:18001``", "Seedlink on localhost:18000 combined with Arclink on localhost 18001"
"``combined://slink/localhost:18000;arclink/localhost:18001``", "Same as above"
"``combined://;``", "Same as above"
"``combined://:18042;?user=foo&pwd=secret??rtMax=1800``", "Seedlink on localhost:18042 combined with Arclink on localhost 18001, real-time (SeedLink) buffer size set to 30min"
"``combined://slink/localhost:18000;sdsarchive//home/sysop/seiscomp/var/lib/archive``", Seedlink combined with SDS archive
"``combined://slink/localhost:18000;combined/(arclink/localhost:18001;arclink/localhost:18002??1stMax=30d)??1stMax=1h``", Seedlink combined with a combined record stream using two Arclink sources
"``combined://slink/localhost:18000;combined/(sdsarchive//home/sysop/seiscomp/var/lib/archive;combined/(sdsarchive//home/sysop/seiscomp/var/lib/archive2017;sdsarchive//home/sysop/seiscomp/var/lib/archive2016??splitTime=2017)??splitTime=2018)``", "Seedlink combined with a combined recordStream providing access to 3 different SDS archives separated by time. The first SDS archive contains the most recent archived data. The other two contain the data from 2016 and 2017."
"``combined://slink/localhost:18000;combined/(sdsarchive//home/sysop/seiscomp/var/lib/archive;combined/(sdsarchive//home/sysop/seiscomp/var/lib/archive2017;sdsarchive//home/sysop/seiscomp/var/lib/archive2016??splitTime=2017-06-01T00:00:00Z)??splitTime=2018-06-01T00:00:00Z)``", "Seedlink combined with a combined recordStream providing access to 3 different SDS archives separated by time. The first SDS archive contains the most recent archived data. The other two are separated in mid of 2016."
.. _rs-balanced:
Balanced
--------
This RecordStream distributes requests quasi-equally (but deterministically) to
multiple proxy streams. It can be used for load balancing and to improve failure
tolerance. The algorithm to choose a proxy stream (counting from 0) is based on
station code and can be expressed in Python as follows:
.. code-block:: python
stationCode = "WLF"
nproxies = 2
x = 0
for c in stationCode:
x += ord(c)
print("choosing proxy stream", x % nproxies)
Definition
^^^^^^^^^^
URL-like: ``balanced://proxy-stream[;proxy-stream2[; ...]]``
The definition of the proxy streams has slightly changed: Scheme and source
are only separated by a slash, e.g. `slink://localhost` needs to be defined as
`slink/localhost`.
Examples
^^^^^^^^
.. csv-table::
:header: "URL", "Description"
"``balanced://slink/server1:18000;slink/server2:18000``", "Distribute requests to 2 :ref:`rs-slink` RecordStreams"
"``balanced://combined/(server1:18000;server1:18001);combined/(server2:18000;server2:18001)``", "Distribute requests to 2 :ref:`rs-combined` RecordStreams"
.. _rs-routing:
Routing
--------
This RecordStream distributes requests to multiple proxy streams according to
user-supplied routing rules, which allow routing specific network, station,
location or channel codes to fixed proxy streams.
Definition
^^^^^^^^^^
URL-like: ``routing://proxy-stream??match=pattern[;proxy-stream2??match=pattern[; ...]]``
The definition of the proxy streams has slightly changed: Scheme and source
are only separated by a slash, e.g. `slink://localhost` needs to be defined as
`slink/localhost`.
The URL parameters of the routing stream are separated by 2 question marks
(`??`) in order to distinguish them from the parameters used in the proxy
streams.
`pattern` defines the rule used to route the request to the proxy stream and it is
in `NET.STA.LOC.CHA` format. The special characters `?` `*` `|` `(` `)` are allowed.
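For illustration, a small Python sketch of how such a pattern can be interpreted as a regular expression (purely illustrative; the actual matching is performed inside the RecordStream implementation and may differ in detail):
.. code-block:: python
import re
def pattern_to_regex(pattern):
    # Translate a NET.STA.LOC.CHA routing pattern into a regular expression:
    # '*' -> any sequence, '?' -> any single character, '|' '(' ')' pass through.
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "?":
            regex += "."
        elif ch in "|()":
            regex += ch
        else:
            regex += re.escape(ch)
    return re.compile("^(?:" + regex + ")$")
rule = pattern_to_regex("(NET1|NET2).*.*.*")
print(bool(rule.match("NET1.STA01..HHZ")))  # True
print(bool(rule.match("XX.STA01..HHZ")))    # False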
Examples
^^^^^^^^
.. csv-table::
:header: "URL", "Description"
"``routing://slink/server1:18000??match=(NET1|NET2).*.*.*;slink/server2:18000??match=*.*.*.*``", "Requests for network `NET1` and `NET2` go to server1, all the rest to server2"
"``routing://slink/server1:18000??match=TMP?.*.*.*;slink/server2:18000??match=NET.*.*.*``", "Requests for network `TMPX` go to server1, for network `NET` go to server 2, all the rest are not fulfilled"
"``routing://slink/server1:18000??match=*.*.*.(HH|EH)?;slink/server2:18000??match=*.*.*.*``", "Requests for channels `HH` and `EH` go to server1, all the rest to server2"
"``routing://combined/(server1:18000;server1:18001??rtMax=1800)??match=NET1.*.*.*;combined/(server2:18000;server2:18001??rtMax=1800)??match=NET2.*.*.*``", "Split requests to 2 :ref:`rs-combined` RecordStreams according to the network code `NET1` or `NET2`. Other network codes are not fulfilled"
"``routing://combined/(slink/special-server:18000;sdsarchive//home/sysop/seiscomp/var/lib/special-archive)??match=SP.*.*.*;combined/(slink/default-server:18000;sdsarchive//home/sysop/seiscomp/var/lib/default-archive)??match=*.*.*.*``", "Requests for special network `SP` are fulfilled by seedlink `special-server` and sdsarchive `/home/sysop/seiscomp/var/lib/special-archive`, all the rest are fulfilled by seedlink `default-server` and archive `/home/sysop/seiscomp/var/lib/default-archive`"
.. _rs-dec:
Decimation
----------
This RecordStream decimates (downsamples) a proxy stream, e.g. :ref:`rs-slink`.
Definition
^^^^^^^^^^
URL-like: ``dec://proxy-stream-scheme[?dec-parameters]/[proxy-stream-source]``
The definition of the proxy streams has slightly changed: Scheme and source are
only separated by a slash, e.g. `slink://localhost` needs to be defined as
`slink/localhost`. Also optional decimation parameters directly follow the proxy
stream scheme.
Optional decimation parameters are:
- `rate` - target sampling rate in Hz, default: 1
- `fp` - default: 0.7
- `fs` - default: 0.9
- `cs` - coefficient scale, default: 10
Examples
^^^^^^^^
- ``dec://slink/localhost:18000``
- ``dec://file?rate=2/-``
- ``dec://combined/;``
.. _rs-resample:
Resample
--------
This RecordStream resamples (up or down) a proxy stream, e.g. :ref:`rs-slink`,
to a given sampling rate.
Definition
^^^^^^^^^^
URL-like: ``resample://proxy-stream-scheme[?dec-parameters]/[proxy-stream-source]``
The definition of the proxy streams has slightly changed: Scheme and source are
only separated by a slash, e.g. `slink://localhost` needs to be defined as
`slink/localhost`. Also optional decimation parameters directly follow the proxy
stream scheme.
Optional resample parameters are:
- `rate` - target sampling rate in Hz, default: 1
- `fp` - default: 0.7
- `fs` - default: 0.9
- `cs` - coefficient scale, default: 10
- `lw` - lanczos kernel width, default: 3
- `debug` - enables debug output, default: false
Examples
^^^^^^^^
- ``resample://slink/localhost:18000``
- ``resample://file?rate=2/-``
- ``resample://combined/;``
@ -0,0 +1,455 @@
.. _global_stdloc:
######
StdLoc
######
Generic locator plugin for SeisComP.
Description
===========
StdLoc is a SeisComP locator plugin that combines standard location methods
and was developed with the focus on local seismicity, although the methods
are generic enough to work at larger scales as well.
Plugin
======
To enable StdLoc the plugin ``stdloc`` must be loaded.
How does it work?
=================
The locator can apply a multitude of location methods and it is particularly useful to
combine them to achieve better solutions:
- LeastSquares: this is the classic algorithm that solves the linearized problem of
travel time residual minimization via iterative least squares. However an initial
location estimate is required. This is the intended method to select when StdLoc
is used in combination with a pick associator: it provides the initial location
estimate and StdLoc will improve it. When used in :ref:`scolv` or :ref:`screloc`,
the location of the origin to be relocated is used as starting estimate.
The configuration doesn't require any mandatory parameters:
.. code-block:: params
method = LeastSquares
- GridSearch: finds the source parameters by evaluating the hypocenter probability
of each point in a grid and returning the maximum likelihood hypocenter.
Because the search space is fully evaluated there is no need for an initial
location estimate and the location uncertainty is completely known. However the
method is very slow. It can be used to relocate events in :ref:`scolv` that seem
difficult to locate via other methods or to verify the uncertainty of a solution.
The following example configuration computes a grid search around the average
location of the picked stations. The grid points are spaced 0.5 km apart
horizontally and 2 km apart vertically.
.. code-block:: params
method = GridSearch
GridSearch.center = auto,auto,15
GridSearch.size = 40,40,30
GridSearch.numPoints = 81,81,16
- GridSearch+LeastSquares: this method can be used in very complex networks where
a bad initial location estimate can get LeastSquares stuck in a local minimum.
The method finds a LeastSquares solution for each cell in a (coarse) grid, using
the cell centroid as initial location estimate. It finally returns the maximum
likelihood solution. This method is intended to be used in :ref:`screloc` or
:ref:`scolv` to relocate existing events.
The following example configuration returns the best among the 75 (5x5x3)
LeastSquares solutions, computed for every point in the grid.
.. code-block:: params
method = GridSearch+LeastSquares
GridSearch.center = auto,auto,15
GridSearch.size = 100,100,30
GridSearch.numPoints = 5,5,3
- OctTree: this method produces similar results to GridSearch but is much
faster and follows the NonLinLoc approach. The OctTree search starts by
evaluating the hypocenter probability of each cell in a grid, computed as the
probability density at the cell center coordinates times the cell volume. The
search then continues by repeatedly fetching the cell with highest probability
and splitting it in 8 sub-cells. These 8 cells are then inserted in the pool of
cells to fetch from at next iteration.
The search terminates after either a maximum number of iterations or after
reaching a minimum cell size. At that point the maximum likelihood hypocenter
is selected. Because the algorithm splits only the cells with higher
probability, the search space is sampled very efficiently, which makes
the method much faster than GridSearch.
This method is intended to be used in :ref:`screloc` or :ref:`scolv` to
relocate existing events.
The following example is a plausible configuration for the entire Swiss
network:
.. code-block:: params
method = OctTree
GridSearch.center = 47.0,8.5,50
GridSearch.size = 700,700,100
GridSearch.numPoints = 21,21,11
OctTree.maxIterations = 100000
OctTree.minCellSize = 0.001
However in this example we are at the size limit for a flat earth study
geometry and for bigger regions `GridSearch.center` should be set to
`auto` and `GridSearch.size` to a smaller size.
- OctTree+LeastSquares: this method allows the OctTree search to find the
maximum probability cell in the network and uses that as the initial
location estimate for LeastSquares.
This method is intended to be used in :ref:`screloc` or :ref:`scolv` to
relocate existing events.
The following example is a plausible configuration for the entire Swiss
network:
.. code-block:: params
method = OctTree+LeastSquares
GridSearch.center = 47.0,8.5,50
GridSearch.size = 700,700,100
GridSearch.numPoints = 21,21,11
OctTree.maxIterations = 10000
OctTree.minCellSize = 1.0
However in this example we are at the size limit for a flat earth study
geometry and for bigger regions `GridSearch.center` should be set to
`auto` and `GridSearch.size` to a smaller size.
The algorithms implemented in StdLoc are standard methods described in "Routine Data
Processing in Earthquake Seismology" by Jens Havskov and Lars Ottemoller. The OctTree
search algorithm is based on NonLinLoc by Anthony Lomax.
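A conceptual Python sketch of this OctTree idea follows. It is grossly simplified (a single starting cell and a toy probability density) and only illustrates the split-the-most-probable-cell loop, not the StdLoc implementation:
.. code-block:: python
import heapq
import math
def octtree_search(pdf, center, size, max_iterations=2000, min_cell_size=0.5):
    # A cell's probability ~ density at its center times its volume.
    prob = lambda c, s: pdf(*c) * s[0] * s[1] * s[2]
    heap = [(-prob(center, size), center, size)]   # max-heap via negated values
    best_point, best_density = center, pdf(*center)
    for _ in range(max_iterations):
        _, c, s = heapq.heappop(heap)              # fetch the most probable cell
        if max(s) <= min_cell_size:                # resolution reached
            break
        half = tuple(v / 2.0 for v in s)
        for dx in (-1, 1):                         # split it into 8 sub-cells
            for dy in (-1, 1):
                for dz in (-1, 1):
                    sub = (c[0] + dx * half[0] / 2.0,
                           c[1] + dy * half[1] / 2.0,
                           c[2] + dz * half[2] / 2.0)
                    if pdf(*sub) > best_density:
                        best_point, best_density = sub, pdf(*sub)
                    heapq.heappush(heap, (-prob(sub, half), sub, half))
    return best_point                              # maximum likelihood point among sampled centers
# Toy Gaussian density peaking at x=1, y=2, z=3
pdf = lambda x, y, z: math.exp(-((x - 1) ** 2 + (y - 2) ** 2 + (z - 3) ** 2))
print(octtree_search(pdf, center=(0.0, 0.0, 10.0), size=(40.0, 40.0, 20.0)))  # close to (1, 2, 3)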
Why is stdloc suitable for local seismicity?
============================================
When dealing with very local seismicity (few kilometers or hundreds of meters)
simplifications that are common for regional seismicity have to be removed.
In particular the locator should take into consideration:
- station elevation and even negative elevation (e.g. borehole sensors)
- earthquake location can be above a seismic sensor (e.g. borehole sensors)
- possible negative earthquake depth (above surface)
More importantly the travel time tables used by the locator must be able to take
into consideration all the above too.
Travel Time Table
=================
StdLoc can be configured with any Travel Time Table type available in SeisComP,
however only the `homogeneous` type is able to take into consideration station
elevation, negative source depth and sources happening above stations. For this
reason `homogeneous` should be the preferred choice when working on very local
seismicity and especially with borehole sensors.
.. _global_stdloc_configuration:
Module Configuration
====================
.. note::
**StdLoc.\***
*Locator parameters: StdLoc. This locator requires the plugin*
*"stdloc" to be loaded.*
.. confval:: StdLoc.profiles
Type: *list:string*
Defines a list of profiles to make available to the plugin.
.. note::
**StdLoc.profile.$name.\***
$name is a placeholder for the name to be used and needs to be added to :confval:`StdLoc.profiles` to become active.
.. code-block:: sh
StdLoc.profiles = a,b
StdLoc.profile.a.value1 = ...
StdLoc.profile.b.value1 = ...
# c is not active because it has not been added
# to the list of StdLoc.profiles
StdLoc.profile.c.value1 = ...
.. confval:: StdLoc.profile.$name.method
Default: ``LeastSquares``
Type: *string*
The location method to use: LeastSquares, GridSearch,
OctTree, GridSearch+LeastSquares or OctTree+LeastSquares.
.. confval:: StdLoc.profile.$name.tableType
Default: ``LOCSAT``
Type: *string*
Travel time table format type. Also consider
\"tableModel\"\!
.. confval:: StdLoc.profile.$name.tableModel
Default: ``iasp91``
Type: *string*
The model to be used. The format depends on
\"tableType\".
.. confval:: StdLoc.profile.$name.PSTableOnly
Default: ``true``
Type: *boolean*
If enabled the arrival travel time information are fetched
using 'P' and 'S' tables only and the user selected
specific phase type is not considered \(e.g. Pg, Sg,
PmP, SmS, P1, S1, etc\).
.. confval:: StdLoc.profile.$name.usePickUncertainties
Default: ``false``
Type: *boolean*
Use pick time uncertainties rather than a fixed
time error of XXX s. If true, an arrival weight is
associated according to the uncertainty of the pick
and \"pickUncertaintyClasses\".
.. confval:: StdLoc.profile.$name.pickUncertaintyClasses
Default: ``0.000,0.025,0.050,0.100,0.200,0.400``
Type: *list:string*
Unit: *s*
Comma\-separated list of time limits of uncertainty
classes from which, along with pick time uncertainties,
arrival weights are computed. The first value
defines the lower limit of class 0.
The interval into which a pick time uncertainty falls
defines the index of the uncertainty class starting
with 0.
The corresponding arrival weight is computed as:
weight \= 1 \/ 2\^\(index\).
Example: A pick with a time uncertainty of 0.15 s is
within the 4th interval ranging from 0.1 to 0.2 s.
The class index is then 3.
If pick uncertainty is absent, the highest class index
applies.
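For illustration, a small Python reading of this rule with the default class limits (a sketch, not the plugin code):
.. code-block:: python
def arrival_weight(pick_uncertainty, limits=(0.000, 0.025, 0.050, 0.100, 0.200, 0.400)):
    # Map a pick time uncertainty in seconds to weight = 1 / 2**(class index)
    if pick_uncertainty is None:
        index = len(limits) - 1        # uncertainty absent: highest class index
    else:
        index = len(limits) - 1        # beyond the last limit
        for i in range(len(limits) - 1):
            if limits[i] <= pick_uncertainty < limits[i + 1]:
                index = i
                break
    return 1.0 / 2 ** index
print(arrival_weight(0.15))  # class index 3 -> weight 0.125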
.. confval:: StdLoc.profile.$name.confLevel
Default: ``0.9``
Type: *double*
Confidence level, between 0.5 and 1.0, used in
computing the hypocenter confidence ellipsoid.
.. confval:: StdLoc.profile.$name.enableConfidenceEllipsoid
Default: ``false``
Type: *boolean*
Compute the hypocenter confidence ellipsoid. Disable
this optional parameter to save some computation time.
.. note::
**StdLoc.profile.$name.GridSearch.\***
*Parameters controlling the GridSearch and OctTree methods.*
.. confval:: StdLoc.profile.$name.GridSearch.center
Default: ``auto,auto,20``
Type: *list:string*
Unit: *deg,deg,km*
Grid center defined as: latitude,longitude,depth. The
special value \"auto\" can be used and the corresponding latitude, longitude
and\/or depth will be automatically computed as the average of the arrival
station locations.
.. confval:: StdLoc.profile.$name.GridSearch.size
Default: ``40,40,30``
Type: *list:string*
Unit: *km*
Grid size in km defined as: X,Y,Z
direction extents around the \"GridSearch.center\",
where X is the longitudinal extent, Y the
latitudinal extent and Z the vertical extent.
.. confval:: StdLoc.profile.$name.GridSearch.numPoints
Type: *list:string*
Number of grid points in X, Y, Z
direction. The first and last points are on the
grid boundary unless the number of points is 1,
in which case the point is in the grid center.
Format: numX,numY,numZ.
.. confval:: StdLoc.profile.$name.GridSearch.misfitType
Default: ``L1``
Type: *string*
The type of misfit to use, from which
the likelihood function is derived: L1 or L2 norm.
L1 is less sensitive to outliers and so more
suitable with automatic picks, L2 is the preferred
choice for manual picks.
.. confval:: StdLoc.profile.$name.GridSearch.travelTimeError
Default: ``0.25``
Type: *double*
Unit: *s*
Typical error in seconds for travel times to
stations. The value affects the uncertainty
of the location. In OctTree it also influences
the probability density computation: too
conservative values increase the number of
iterations required by OctTree to converge
to a high resolution solution.
.. note::
**StdLoc.profile.$name.OctTree.\***
*Parameters controlling the OctTree method. OctTree*
*uses the parameters defined in GridSearch, but*
*applies the OctTree search algorithm on the grid.*
*The starting cells of the OctTree search are created by*
*dividing the initial grid in equally sized cells.*
*The grid points becomes the cell vertices.*
*Resulting number of cells in each direction:*
*"GridSearch.numPoints" - 1.*
.. confval:: StdLoc.profile.$name.OctTree.maxIterations
Default: ``50000``
Type: *int*
Maximum number of iterations after which the
search stops. Zero or negative values disable
this stopping criterion.
.. confval:: StdLoc.profile.$name.OctTree.minCellSize
Default: ``0.1``
Type: *double*
Unit: *km*
Minimum cell size generated by the OctTree
search before stopping. A zero or negative value disables
this stopping mechanism.
.. note::
**StdLoc.profile.$name.LeastSquares.\***
*Parameters controlling the LeastSquares method.*
.. confval:: StdLoc.profile.$name.LeastSquares.depthInit
Default: ``20``
Type: *double*
The initial depth estimate when no initial
hypocenter is provided. Used only with
'LeastSquares'.
.. confval:: StdLoc.profile.$name.LeastSquares.iterations
Default: ``20``
Type: *int*
Number of iterations. Each iteration will
use the location and time from the previous
Least Squares solution.
.. confval:: StdLoc.profile.$name.LeastSquares.dampingFactor
Default: ``0.0``
Type: *double*
Damping factor to be used when solving the
system of equations.
0: no damping.
.. confval:: StdLoc.profile.$name.LeastSquares.solverType
Default: ``LSMR``
Type: *string*
Algorithm to use: either LSMR or LSQR.
@ -0,0 +1,201 @@
.. highlight:: rst
.. _import_inv:
##########
import_inv
##########
**Import inventory information from various sources.**
Description
===========
import_inv is a wrapper for inventory converters. Inventory converters convert
an input format such as
.. csv-table::
:widths: 15 15 70
:header: Format, Converter, Conversion
:align: left
:delim: ;
scml; :ref:`scml2inv`; :ref:`SeisComP inventory XML <concepts_inventory>`, schema: :file:`$SEISCOMP_ROOT/share/xml/`
sc3; :ref:`sc32inv`; Alias for scml for backwards compatibility to SeisComP3
arclink; :ref:`arclink2inv`; Arclink inventory XML
dlsv; :ref:`dlsv2inv`; `dataless SEED <http://www.iris.edu/data/dataless.htm>`_
fdsnxml; :ref:`fdsnxml2inv`; `FDSN StationXML <http://www.fdsn.org/xml/station/>`_
to SeisComP inventory XML which is read by the trunk config module to
synchronize the local inventory file pool with the central inventory database.
For printing all available formats call
.. code-block:: sh
$ import_inv help formats
When :program:`import_inv help formats` is called it globs for
:file:`$SEISCOMP_ROOT/bin/*2inv`.
If another format needs to be converted, it is very easy to provide a new
converter.
Converter interface
-------------------
For making a new converter work with import_inv it must implement an interface
on shell level. Furthermore the converter program must be named
:file:`{format}2inv` and must live in :file:`$SEISCOMP_ROOT/bin`.
The converter program must take the input location (file, directory, URL, ...)
as first parameter and the output file (SeisComP XML) as second parameter. The
output file must be optional and default to stdout.
To add a new converter for a new format, e.g. Excel, place the new converter
program at :file:`$SEISCOMP_ROOT/bin/excel2inv`.
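As a minimal sketch of this interface, a hypothetical :file:`excel2inv` wrapper could look as follows. The conversion tool *my_excel_converter* is an assumption and only illustrates the calling convention: input as first argument, optional output as second argument, defaulting to stdout.

.. code-block:: sh

#!/bin/bash
# hypothetical converter sketch: excel2inv <input> [output]
input="$1"
output="${2:--}"   # "-" means stdout
if [ "$output" = "-" ]; then
my_excel_converter "$input"
else
my_excel_converter "$input" > "$output"
fi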
Examples
--------
* Convert inventory file in FDSN StationXML format (fdsnxml) and copy the content to
:file:`$SEISCOMP_ROOT/etc/inventory/inventory.xml`. The call will invoke
:ref:`fdsnxml2inv` for actually making the conversion:
.. code-block:: sh
$ import_inv fdsnxml inventory_fdsn.xml $SEISCOMP_ROOT/etc/inventory/inventory.xml
.. _import_inv_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/import_inv.cfg`
| :file:`etc/global.cfg`
| :file:`etc/import_inv.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/import_inv.cfg`
import_inv inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: import_inv
:program:`import_inv [FORMAT] input [output]`
:program:`import_inv help [topic]`
The first form takes the format as first parameter and the input
and output location. The input location is either a file or directory
depending on the format and its converter. If the output is not
given it defaults to seiscomp\/etc\/inventory\/{input}.xml. To write
the output to stdout, \"\-\" must be used.
The second form provides help on a particular topic. The only topic
currently supported is \"formats\" which prints all available input
formats.
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.

@ -0,0 +1,63 @@
.. highlight:: rst
.. _inv2dlsv:
########
inv2dlsv
########
**Converts SC3 inventory XML to dataless SEED.**
Description
===========
inv2dlsv is a simple filter that converts inventory in |scname| XML (:term:`SCML`)
format from stdin (or a file) to dataless SEED on stdout (or a file). It does
not support processing of input XML such as extraction of networks or channels.
To accomplish this task, combine :program:`inv2dlsv` with :ref:`invextr`.
.. note::
Conversion of dataless SEED to |scname| XML is provided by :ref:`dlsv2inv`.
Examples
========
.. note::
"-" can always be used as filename to refer to the standard input/output channel.
#. Convert an inventory XML file to a dataless SEED file
.. code-block:: sh
inv2dlsv inv.xml inv.seed
#. Convert an inventory XML file to a compressed dataless SEED file
.. code-block:: sh
inv2dlsv inv.xml | gzip > inv.seed.gz
#. Convert a subset of an inventory XML using :ref:`invextr`.
.. code-block:: sh
invextr --chans "*MORC*" inv.xml | inv2dlsv - inv.seed
Command-Line Options
====================
.. program:: inv2dlsv
:program:`inv2dlsv [in_xml [out_dataless]]`
If in_xml is not given, stdin is used. If out_dataless is not given,
stdout is used.

@ -0,0 +1,268 @@
.. highlight:: rst
.. _invextr:
#######
invextr
#######
**Extract channels from inventory.**
Description
===========
invextr reads and modifies inventory XML provided as file or on stdin:
* Extract or remove networks, stations and channels based on
* channel IDs
* geographic region
* time
* Clean inventories from unused objects such as data loggers, sensors or
instrument responses.
The important parameters are:
* Channel ID list (required)
* Input file or stdin
* Output file or stdout
* Region bounding box (optional)
whereas the output file defaults to stdout and the input file to
stdin if not given.
The optional region box will be used to filter the read inventory based on the
coordinates of sensor locations. Only stations with sensor locations within the
region will be considered. All others will be ignored.
A channel ID is a simple string that is matched against the final channel ID
in the inventory. This final channel ID is constructed by joining the codes of
all stages with a dot where the stages are network, station, location and
channel.
The content of the resulting inventory may be listed using :ref:`scinv`.
Examples
--------
Suppose an inventory with network GE, a station MORC and several channels:
.. code-block:: sh
network GE
station MORC
location __
channel BHZ ID: GE.MORC..BHZ
channel BHN ID: GE.MORC..BHN
channel BHE ID: GE.MORC..BHE
channel LHZ ID: GE.MORC..LHZ
channel LHN ID: GE.MORC..LHN
channel LHE ID: GE.MORC..LHE
* The IDs are matched against streams passed with --chans.
.. code-block:: sh
invextr --chans "GE*" inv.xml
All streams are passed and nothing is filtered because GE* matches all
available IDs and region filter is not used. Since :file:`inv.xml` only
contains stations from the GE network the option :option:`--chans` is not
useful here at all.
.. code-block:: sh
invextr -r 0,-180,90,180 inv.xml
All streams located in the northern hemisphere are passed as commanded by the
region bounding box.
* Nothing is filtered again because *MORC* matches all available IDs.
.. code-block:: sh
invextr --chans "*MORC*" inv.xml
* Everything is filtered because GE.MORC does not match with any ID. To make it
work, an asterisk needs to be appended: GE.MORC* or GE.MORC.*.
.. code-block:: sh
invextr --chans "GE.MORC" inv.xml
* To extract all vertical components, use:
.. code-block:: sh
invextr --chans "*Z" inv.xml
* To extract BHN and LHZ, use:
.. code-block:: sh
invextr --chans "*BHN,*LHZ" inv.xml
* To remove all HH and SH channels, use:
.. code-block:: sh
invextr --rm --chans "*HH?,*SH?" inv.xml
.. _invextr_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/invextr.cfg`
| :file:`etc/global.cfg`
| :file:`etc/invextr.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/invextr.cfg`
invextr inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: invextr
:program:`invextr [OPTIONS] [input=stdin] [output=stdout]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Extract
-------
.. option:: --begin arg
Begin time to consider streams. Streams ending at or
before that time will be ignored.
.. option:: --end arg
End time to consider streams. Streams starting after
that time will be ignored.
.. option:: --chans arg
A comma separated list of channel IDs to extract
which can contain wildcards. Default: \*.\*.\*.\* meaning
all streams.
Example: invextr \-\-chans \"GE.\*.\*.BHZ,GE.MORC.\*.\*\" inv.xml
.. option:: --nslc arg
Stream list file to be used for extracting inventory.
Wildcards can be used. \-\-chans is ignored.
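For illustration, a stream list file could contain one stream ID per line; the file name, its content and the one-ID-per-line format are assumptions:

.. code-block:: sh

$ cat streams.txt
GE.MORC..BH?
GE.*..LHZ
$ invextr --nslc streams.txt inv.xml > inv-subset.xml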
.. option:: -r, --region arg
Filter streams by geographic region given as
\"South, East, North, West\". Region is unused by default.
.. option:: --rm arg
Removes all channels given with '\-\-chans' instead of
extracting them.
Example: invextr \-\-rm \-\-chans \"GE.\*\" inv.xml
.. option:: -f, --formatted
Enables formatted XML output.

@ -0,0 +1,80 @@
.. highlight:: rst
.. _kernel:
######
kernel
######
**SeisComP kernel.**
Description
===========
The kernel is the basic configuration for the :command:`seiscomp` tool.
It contains configuration parameters for all init scripts in :file:`etc/init`. Each init script can, but does not
need to, read the kernel parameters and configure itself accordingly. Kernel parameters are not mandatory but
should be taken as (serious) hints. If for example syslog is enabled in the kernel then all init scripts should
configure syslog as logging backend for the programs they start. But if a program does not support syslog it can
also be started without logging to syslog.
.. _kernel_configuration:
Module Configuration
====================
.. note::
kernel is a :term:`standalone module` and does not inherit :ref:`global options <global-configuration>`.
| :file:`etc/defaults/kernel.cfg`
| :file:`etc/kernel.cfg`
| :file:`~/.seiscomp/kernel.cfg`
.. confval:: syslog
Default: ``false``
Type: *boolean*
Sends all logging output to the syslog backend which logs
usually to \/var\/log\/messages.
.. _kernel/messaging:
messaging extension
-------------------
SeisComP messaging component that enables communication of modules over the network.
.. confval:: messaging.enable
Default: ``true``
Type: *boolean*
Enables\/disables local messaging \(scmaster\).
The messaging component is an integral component
of all modules \(except e.g. acquisition modules\).
If you are not sure what to do, enable it.
.. confval:: messaging.bind
Type: *string*
Defines the messaging unencrypted bind address. If
left empty then the configuration file \(see scmaster\)
will be used instead. Use this to override the
unencrypted bind address. The format is
\"[ip:]port\".
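A minimal sketch of :file:`etc/kernel.cfg` combining these parameters; all values are examples, not recommendations:

.. code-block:: sh

syslog = false
messaging.enable = true
# override the unencrypted bind address of scmaster, format "[ip:]port"
messaging.bind = 0.0.0.0:18180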

@ -0,0 +1,201 @@
.. highlight:: rst
.. _msrtsimul:
#########
msrtsimul
#########
**MiniSEED real time playback and simulation**
Description
===========
msrtsimul simulates a real-time data acquisition by injecting miniSEED data from a
file into the seedlink buffer via the mseedfifo plugin for seedlink. It can be
used for simulating real-time conditions in playbacks for whole-system
demonstrations, user training, etc.
The data is played back as if they were recorded at current time. Therefore,
creation times and the actual data times including pick times, event times etc.
will be **obscured**. :ref:`Historic playbacks <sec-msrtsimul-historic>` allow
keeping the actual data times.
.. hint::
* Playbacks on production systems are normally not recommended.
* For real-time playbacks, the data must be sorted by end time. This
requirement may be violated. Use :ref:`scmssort` for sorting the data by
(end) time.
* Stop :ref:`slarchive` before running msrtsimul for avoiding that data with
wrong times are archived.
* Normally, :ref:`seedlink` assumes that the data is provided in records of
512 bytes. msrtsimul issues a warning when detecting a record of other size.
* Data available in other record sizes can be repacked to 512 bytes by
external software such as :program:`msrepack` available with
:cite:t:`libmseed-github`.
* Applications other than standard :ref:`seedlink` in |scname| or
:ref:`seedlink` compiled specifically may accept other record sizes. For
accepting these records use msrtsimul with :option:`--unlimited`.
Non-default seedlink pipes
--------------------------
By default, msrtsimul writes the data into the mseedfifo pipe
*$SEISCOMP_ROOT/var/run/seedlink/mseedfifo*.
If the data is to be written into the pipe of a :program:`seedlink` alias or
into any other pipe, the pipe name must be adjusted. Use the option
* :option:`--seedlink` to replace *seedlink* by another name, e.g. a seedlink instance
created as an alias, **seedlink-test**. This would write into
*$SEISCOMP_ROOT/var/run/seedlink-test/mseedfifo*.
* :option:`--stdout` to write to standard output and then redirect to any other location.
.. _sec-msrtsimul-historic:
Historic playbacks
------------------
You may use msrtsimul with the :option:`-m` *historic* option to maintain the
time of the records,
thus the times of picks, amplitudes, origins, etc. but not the creation times.
Applying :option:`-m` *historic* will feed the data into the seedlink buffer at the time
of the records. The time of the system is untouched. GUI, processing modules, logging,
etc. will run with the current system time. The historic mode allows processing waveforms
with the stream inventory valid at the time when the data were recorded, including
streams closed at the current time.
.. warning ::
When repeating historic playbacks, the waveforms are fed multiple times to the
seedlink buffer and the resulting picks are also repeated with the same pick
times. This may confuse the real-time system. Therefore, seedlink and other modules
creating or processing picks should be
stopped, the seedlink buffer should be cleared and the processing
modules should be restarted to clear the buffers before starting the
historic playbacks. Make sure :ref:`scautopick` is configured or started with
the :option:`--playback` option. Example:
.. code-block:: sh
seiscomp stop
rm -rf $SEISCOMP_ROOT/var/lib/seedlink/buffer
seiscomp start
msrtsimul ...
seedlink setup
--------------
For supporting msrtsimul activate the :confval:`msrtsimul` parameter in the
seedlink module configuration (:file:`seedlink.cfg`), update the configuration
and restart seedlink before running msrtsimul:
.. code-block:: sh
seiscomp update-config seedlink
seiscomp restart seedlink
msrtsimul ...
Examples
--------
1. Playback miniSEED waveforms in real time with verbose output:
.. code-block:: sh
$ msrtsimul -v miniSEED-file
#. Playback miniSEED waveforms in historic mode. This may require :ref:`scautopick`
to be started with the option *playback*:
.. code-block:: sh
msrtsimul -v -m historic miniSEED-file
#. Feed the data into the buffer of a specific seedlink instance, e.g. *seedlink-test*:
.. code-block:: sh
msrtsimul -v --seedlink seedlink-test miniSEED-file
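#. Write the data to standard output and redirect it into the mseedfifo pipe of another seedlink instance. The pipe path is an example and follows the pattern described above:

.. code-block:: sh

msrtsimul -c miniSEED-file > $SEISCOMP_ROOT/var/run/seedlink-test/mseedfifo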
.. _msrtsimul_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/msrtsimul.cfg`
| :file:`etc/global.cfg`
| :file:`etc/msrtsimul.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/msrtsimul.cfg`
msrtsimul inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: msrtsimul
:program:`msrtsimul [OPTION] miniSEED-file`
Verbosity
---------
.. option:: -h, --help
Display this help message.
.. option:: -v, --verbose
Verbose mode.
Playback
--------
.. option:: -c, --stdout
Write to standard output. The output may be redirected to a
specific mseedfifo path.
.. option:: -d, --delays
Add artificial delays.
.. option:: -j, --jump float
Minutes to skip at the beginning.
.. option:: -m, --mode string
Playback mode: choose between 'realtime' and 'historic'
.. option:: --seedlink string
The seedlink module name. Useful if a seedlink alias or
non\-standard names are used. Replaces 'seedlink'
in the standard mseedfifo path.
.. option:: -s, --speed float
Speed factor. 1 is normal speed.
.. option:: --test
Test mode.
.. option:: -u, --unlimited
Allow miniSEED records which are not 512 bytes.

@ -0,0 +1,742 @@
.. highlight:: rst
.. _ql2sc:
#####
ql2sc
#####
**QuakeLink (gempa GmbH) to SeisComP event parameter exchange.**
Description
===========
ql2sc manages the import of SeisComP objects from one or several QuakeLink servers
into a SeisComP system in real time. Like :ref:`scimex` but contrary to
:ref:`scimport` the exchange of the SeisComP objects is event based. This means no
messages will be exchanged until the exporting system has produced an event.
The user may control at various levels which information to import. Whenever
possible server-side filters should be preferred to reduce both the network
bandwidth consumption as well as the CPU and memory utilization on the local
machine.
.. note::
ql2sc does not delete events at the import system although QuakeLink
allows the deletion of events. Deleted events are ignored by ql2sc and kept
in the SeisComP database.
.. _ql2sc_event_filter:
Server-Side Event Filter
========================
QuakeLink provides a filter syntax similar to SQL-WHERE clauses which may be
used to filter interesting events on the server side:
.. code-block:: none
clause := condition[ AND|OR [(]clause[)]]
condition := MAG|DEPTH|LAT|LON|PHASES|DIST(lat,lon) op {float} |
DIST(lat,lon) IN [{float}, {float}] |
UPDATED|OTIME op time |
AGENCY|AUTHOR|STATUS|ESTATUS|EMODE|TYPE|REGION|MAG_T op 'string' |
MAG|DEPTH|LAT|LON|PHASES|OTIME|UPDATED IS [NOT] NULL
op := =|!=|>|>=|<|<=|eq|gt|ge|lt|le
time := %Y,%m,%d[,%H,%M,%S,%f]
E.g., the following filter string would select only those events with a minimum
magnitude of 6, detected by at least 10 stations and which are shallower than
100km:
.. code-block:: sql
MAG >= 6.0 AND PHASES >= 10 AND DEPTH < 100
.. note::
The supported filter commands depend on the specific QuakeLink version. To
list all available options you may connect to the server, e.g., using
`telnet localhost 18010`, and request the help page of the `SELECT` command
using `help select`.
.. _ql2sc_object_filter:
Server-Side Object Filter
=========================
QuakeLink provides a coarse object filter for the most relevant SeisComP objects:
============ ==============================================================
Option Impact
============ ==============================================================
picks include picks
amplitudes include amplitudes
arrivals include origin arrivals
staMags include origin station magnitudes
staMts include moment tensor station contributions and phase settings
preferred include only preferred origin and magnitude information
============ ==============================================================
.. _routing:
Local Object Filter and Routing
===============================
Subsequent to the server-side filters a routing table defines which objects to
import and to which message group to send them. Depending on the |scname| modules
listening to the specified message groups an object may be further processed.
Typically no modules (other than :ref:`scmaster`) are connected to the
``IMPORT_GROUP`` so that objects sent to this group are just stored to the
database. If an object should be discarded, the special group identifier ``NULL``
may be used.
The routing table is defined as a comma-separated list of
``object name:group name`` pairs. Also the routing rules are inherited
recursively within the SeisComP object tree. If no explicit rule exists for an
object, the routing of its parent is evaluated up to the ``EventParameters``
root node.
Examples
--------
.. code-block:: none
EventParameters:IMPORT_GROUP
Imports everything
.. code-block:: none
EventParameters:IMPORT_GROUP,Comment:NULL
Imports everything except comments
.. code-block:: none
Origin:LOCATION,StationMagnitude:MAGNITUDE,Magnitude:MAGNITUDE
Sends origins and their children (arrivals, origin uncertainty) to the ``LOCATION``
group but the magnitude children to the ``MAGNITUDE`` group. Skips picks,
amplitudes, focal mechanisms and events.
Default routing table
---------------------
The default use case of ql2sc is to import earthquake solutions from other data
centers or in-house redundant SeisComP systems. The intention is not to
reprocess the solution but to add them to the local catalog.
By default we route:
* Picks and Amplitudes to the ``IMPORT_GROUP`` group to prevent processing by
the local locator and amplitude processor
* Origins (including their StationMagnitude and Magnitude children) to the
``LOCATION`` group to allow event association.
* FocalMechanisms to the ``FOCMECH`` group to trigger processing by specialized
applications, e.g., graphical user interfaces for strong motion analysis or
tsunami risk assessment.
We don't route events at all. With the help of :ref:`scevent`, locations are
either associated with existing events or new events are created with local
settings.
We don't route StationMagnitudes and Magnitudes to the ``MAGNITUDE`` group
because :ref:`scmag` subscribes to ``LOCATION`` and ``MAGNITUDE``. Separated
groups might lead to duplicated magnitude types in case a manual magnitude
solution is imported. In this case the foreign Origin with its Magnitudes would
be split into at least two messages, the first one containing the Origin, the
second one the Magnitude. The Origin message immediately triggers magnitude
calculation, potentially for a magnitude type which is received with the second
message.
The default routing table is set as given in :confval:`host.$name.routingTable`.
.. _agency_filter:
Agency List Filter
==================
In addition to the local object filter the user may choose to accept only those
objects originating from a set of trusted agencies. If at least one agency is
defined in the ``processing.whitelist.agencies`` or
``processing.blacklist.agencies`` configuration option, then the
``creationInfo.agencyID`` of amplitudes, arrivals, comments, events, focal
mechanisms, magnitudes, moment tensors, origins, picks and station magnitudes is
evaluated. Objects with unmatched or unset agency information are filtered out.
If objects with unset agency ID should match, then empty string ``""`` has to be
added to the white list.
The agency filter is applied on remote as well as local objects. In this way
remote objects may be excluded from import and local objects may be protected
from being overridden or removed. Also the filter is applied recursively. If a parent
object (e.g., an origin) is filtered out, all of its children (e.g., magnitudes)
are also skipped even if they carry a different agency ID.
.. note::
The agency white list filter might be essential to avoid circular event
updates between cross-connected SeisComP systems.
.. _publicID_filter:
PublicID Prefix Filter
======================
In addition to the :ref:`agency filter<agency_filter>` incoming or local objects
can be skipped by checking their publicID prefix. It behaves similar to the
:ref:`agency filter<agency_filter>` but checks the ``publicID`` attribute rather
than the ``creationInfo.agencyID`` attribute.
Prefixes can be configured as white- or blacklist with
``processing.whitelist.publicIDs = ...`` and
``processing.blacklist.publicIDs = ...``.
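A configuration sketch combining both filters; the agency ID and publicID prefix are placeholders:

.. code-block:: sh

# accept only objects from agency XY, plus objects without agency ID
processing.whitelist.agencies = XY, ""
# additionally discard objects whose publicID starts with this prefix
processing.blacklist.publicIDs = SystemB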
Workflow
========
Each event update received from a QuakeLink host is parsed and analyzed for
differences to the local database. The comparison starts at the level of the
top-level elements in the following order: picks, amplitudes, origins, focal
mechanisms, events.
For each top-level element the object tree is traversed in a depth-first search
order. Objects on the same level are processed in the order of their appearance.
The differences are collected as a list of notifier objects with the following
operation types:
====== ===========
Type Description
====== ===========
ADD The object does not exist locally
UPDATE The object does exist locally but differs from the remote one
REMOVE The object exist locally but not remotely
====== ===========
The ``ADD`` and ``REMOVE`` operations always generate notifiers of the same type
for all children of the current object. ``ADD`` notifiers are collected top-down,
``REMOVE`` notifiers are collected bottom-up.
Because the order of child objects is arbitrary, e.g., the arrivals of an origin,
each object on the remote side has to be found in the set of local objects. For
public objects (e.g., origins, magnitudes), the ``publicID``
property is used for comparison. All other objects are compared by looking at
their index properties. For arrivals, e.g., this is the ``pickID`` property, for
comments the ``id`` property.
Once all notifiers are collected, they are sent to the local messaging system.
For performance reasons and because of the processing logic of listening |scname|
modules ql2sc tries to batch as many notifiers as possible into one notifier
message. A separate notifier message is created if the target message group
changes between successive notifiers or if the configurable :confval:`batchSize`
limit is reached.
.. note::
Care must be taken when configuring the ``batchSize`` limit. If the value
is too big the overall message size limit (default: 1MB) may be exceeded,
resulting in an undeliverable message. On the other hand a much too small
value will create unwanted results in the |scname| processing chain. If for
instance picks are routed to the ``PICK`` group and the pick set is split
into several notifier messages the local :ref:`scautoloc` might create
locations based on an incomplete dataset.
Event Attributes
================
It might be desirable to synchronize event attributes set at the source with
the local system. In particular the event type, the type uncertainty, event
descriptions and comments might be of interest. Because it is not advisable
to route events and let :ref:`scevent` associate imported origins it can
happen that the imported event ID is different from the event ID of the local
system. The input host configuration parameter :confval:`syncEventAttributes`
controls that behaviour. It is set to true by default which means that imported
event attributes are going to be imported as well. ql2sc does not update
the attributes directly but instructs scevent to do so in as many cases as
possible. To find the matching local event it takes the first event which
has the currently imported preferred origin associated.
Limitations
-----------
There are limitations to this process to avoid infinite loops when cross
connecting two systems. Prior to sending the commands to scevent to change a
particular attribute ql2sc checks if that attribute has been set already by
another module (via the JournalEntry database table). If not, ql2sc is allowed
to request an attribute change; otherwise it is not. To illustrate the issue take the
following example:
scolv connected to system ``A`` changes the event type to 'earthquake'. ql2sc
of system ``B`` checks if the event type of the local event has been changed
already which is not the case and it requests that change. System ``A``
changes the event type again to 'unset'. ql2sc of system ``B`` notices that
someone has already changed the event type and that it was ql2sc itself. It
requests the change again.
scolv connected to system ``B`` changes the event type to 'earthquake' again.
ql2sc of system ``A`` notices that ``scolv@A`` has already changed the
event type and ignores the request.
That simple case would not create an infinite loop even if ``ql2sc@A`` would
accept the last change. The situation changes immediately if two subsequent
attribute changes are being received by ``ql2sc@B`` while both of them are
already applied on system ``A``. ``ql2sc@B`` would "restore" the old state due
to the first received update and then apply the "final" state due to the
second update. Each update triggers again an update at system ``A`` and the
states start flapping. Without the described check there wouldn't be a well
defined exit condition.
Caveats
=======
Specific combinations of remote and local object filters may result in the loss
of data. If for instance origins are imported from system ``A`` to ``B`` and
additional magnitudes for the received origins are calculated on ``B``, care must
be taken. Without protection a new event update containing the same origin will
``REMOVE`` all newly calculated magnitudes on ``B`` since they are not included
in the magnitude set sent by ``A``.
To avoid losing these local magnitudes one may decide to block magnitudes from
import by routing them to ``NULL``. If magnitudes from ``A`` and from ``B``
should be available, an :ref:`agency filter<agency_filter>` or
:ref:`publicID filter<publicID_filter>` may be defined.
Make sure ``A`` and ``B`` use either distinct agency IDs or distinct publicID
patterns and add the agency ID of ``B`` to ``processing.blacklist.agencies`` or
the publicID prefix of ``B`` to ``processing.blacklist.publicIDs``.
.. _ql2sc_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/ql2sc.cfg`
| :file:`etc/global.cfg`
| :file:`etc/ql2sc.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/ql2sc.cfg`
ql2sc inherits :ref:`global options<global-configuration>`.
.. confval:: backLog
Default: ``1800``
Type: *int*
Unit: *s*
Number of seconds to fetch missed updates on start up.
.. confval:: cacheSize
Default: ``5000``
Type: *int*
Number of public objects to cache.
.. confval:: batchSize
Default: ``2000``
Type: *int*
Maximum number of notifiers to batch in one message. If set
to 0 no size limit is enforced. Make sure not to hit the
overall message size limit of 16MiB which is enforced by
the messaging system.
.. confval:: eventAssociationTimeout
Default: ``10``
Type: *int*
Unit: *s*
If event synchronisation is enabled and an incoming origin
is not yet associated with an event on the target machine,
then this timeout defines the maximum number of seconds to
wait for an association.
.. confval:: hosts
Type: *list:string*
Registration of the host profiles defining the connection
parameters to the QuakeLink hosts.
.. note::
**host.\***
*Definition of host profiles. For each host profile a connection*
*to one QuakeLink server can established. The profiles must be registered*
*in 'hosts' to apply them.*
.. note::
**host.$name.\***
*Provide the connection parameters to one QuakeLink server.*
$name is a placeholder for the name to be used and needs to be added to :confval:`hosts` to become active.
.. code-block:: sh
hosts = a,b
host.a.value1 = ...
host.b.value1 = ...
# c is not active because it has not been added
# to the list of hosts
host.c.value1 = ...
.. confval:: host.$name.url
Default: ``ql://localhost:18010``
Type: *string*
URL of the QuakeLink service, the scheme 'qls' enables SSL.
Format: [ql[s]:\/\/][user:pwd\@][host][:port].
If set to an empty string the application will run without any QuakeLink connection attempt.
.. confval:: host.$name.gzip
Default: ``false``
Type: *boolean*
Enable\/disable GZip \(GNU zip\) compression.
.. confval:: host.$name.native
Default: ``false``
Type: *boolean*
Request native data instead of XML format.
Native data export may be disabled on some hosts.
.. confval:: host.$name.syncEventAttributes
Default: ``true``
Type: *boolean*
Try to update the event attributes of the target event
with the attributes of the source event which includes
event type and event certainty. It will not import
events but tries to find the associated event of the
input preferred origin at the target system and will
update the event attributes via journaling.
.. confval:: host.$name.syncPreferred
Default: ``false``
Type: *boolean*
Synchronize the preferred origin and preferred
magnitude selection if different from the imported
selection. ql2sc will wait for the event association
of an imported origin and check if the preferred origin
or preferred magnitude is different from the imported
Quakelink event. If so it will send a journal to
force selection of the preferred origin and selection
of the preferred magnitude type. These are the same
operations as within scolv to fix an origin and
a particular magnitude type.
.. confval:: host.$name.syncEventDelay
Default: ``0``
Type: *int*
Delays the synchronization of event attributes in seconds
if set to a value greater than zero.
.. confval:: host.$name.keepAlive
Default: ``false``
Type: *boolean*
Request server to send keep alive message every 30s to
prevent connection reset by firewalls on long idle
periods. If activated the client will reset the
connection if no alive message is received within 60s.
.. confval:: host.$name.filter
Type: *string*
SQL like WHERE clause to filter the result set.
clause :\= condition[ AND\|OR [\(]clause[\)]] __
condition :\= MAG\|DEPTH\|LAT\|LON\|PHASES\|OTIME\|UPDATED [op float\|time]\|[IS [NOT] NULL] __
op :\= \=\|>\|>\=\|<\|<\=\|eq\|gt\|ge\|lt\|le __
time :\= %Y,%m,%d[,%H,%M,%S,%f]
.. confval:: host.$name.routingTable
Default: ``Pick:IMPORT_GROUP,Amplitude:IMPORT_GROUP,FocalMechanism:EVENT,Origin:EVENT``
Type: *list:string*
Map datamodel class names to messaging groups. For unmapped objects
the mapping of their parent objects is evaluated recursively. Objects
may be excluded by mapping them to 'NULL'.
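Putting the host parameters together, a sketch of a single host profile could look as follows; the profile name, URL, filter and routing are examples only:

.. code-block:: sh

hosts = ql
host.ql.url = qls://user:pwd@quakelink.example.org:18011
host.ql.filter = MAG >= 4.0 AND AGENCY = 'XY'
host.ql.routingTable = Pick:IMPORT_GROUP,Amplitude:IMPORT_GROUP,Origin:LOCATION,FocalMechanism:FOCMECH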
.. note::
**host.$name.data.\***
*Specify the XML components to fetch.*
*Note: These options are not used if 'native' data is requested.*
.. confval:: host.$name.data.picks
Default: ``true``
Type: *boolean*
Include picks
.. confval:: host.$name.data.amplitudes
Default: ``true``
Type: *boolean*
Include amplitudes
.. confval:: host.$name.data.arrivals
Default: ``true``
Type: *boolean*
Include origin arrivals
.. confval:: host.$name.data.staMags
Default: ``true``
Type: *boolean*
Include origin station magnitudes
.. confval:: host.$name.data.staMts
Default: ``true``
Type: *boolean*
Include moment tensor station contributions and phase settings
.. confval:: host.$name.data.preferred
Default: ``true``
Type: *boolean*
Include only preferred origin and magnitude information
.. confval:: processing.blacklist.publicIDs
Type: *list:string*
Defines a blacklist of publicID prefixes that are
not allowed for processing. Separate items by comma.
.. confval:: processing.whitelist.publicIDs
Type: *list:string*
Defines a whitelist of publicID prefixes that are
allowed for processing. Separate items by comma.
Command-Line Options
====================
.. program:: ql2sc
:program:`ql2sc [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".

@ -0,0 +1,414 @@
.. highlight:: rst
.. _scalert:
#######
scalert
#######
**Real time alert template.**
Description
===========
This module executes custom scripts upon arrival of objects or updates.
It serves as a template for custom modifications and is not a replacement for :ref:`scvoice`.
There are four possible trigger mechanisms for calling scripts:
* Event creation/update,
* Amplitude creation,
* Origin creation (with status = preliminary),
* Pick creation with filter for phase hint.
.. note ::
People started modifying :ref:`scvoice` to send emails or
other alert messages. The name *scvoice* is then just misleading.
If you want to customize :ref:`scvoice`, use scalert instead.
.. _scalert_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scalert.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scalert.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scalert.cfg`
scalert inherits :ref:`global options<global-configuration>`.
.. confval:: firstNew
Default: ``false``
Type: *boolean*
Treat an event as a new event when it is seen for the first time.
.. confval:: agencyIDs
Type: *list:string*
List of agency IDs to consider for picks and origins. The agency ID
is extracted from the pick or the preferred origin of the event
and compared with the configured IDs.
unset \(\=\): use agencyID defined in global, default
empty list \(\=\"\"\): all agencies are allowed
.. confval:: authors
Type: *list:string*
List of authors to consider for picks and origins. The author
is extracted from the pick or the preferred origin of the event
and compared with the configured authors.
empty list \(\=\"\"\): all authors are allowed
.. confval:: poi.message
Type: *string*
The default message string for the event\-script is
\"earthquake, [HOURS] hours [MINS] minutes ago, [DESC],
magnitude [MAG], depth [DEP] kilometers\" whereas [DESC]
is the string given in the event.description attribute. This
string can be overwritten using one of the following options.
There are three placeholders that can be used: \@region\@,
\@dist\@ and \@poi\@.
Example: \"\@region\@, \@dist\@ kilometers from \@poi\@
away\".
.. confval:: poi.maxDist
Default: ``20``
Type: *double*
Unit: *deg*
When using the nearest point of interest \(city\) as part of
the message string, specify the maximum distance in degrees
from the event. Any point of interest farther away will be
ignored.
.. confval:: poi.minPopulation
Default: ``50000``
Type: *double*
Minimum population for a city to become a point of interest.
.. confval:: scripts.pick
Type: *string*
The script to be called when a pick
arrives. Network code, station code and pick publicID are passed
as parameters \$1, \$2, \$3.
.. confval:: scripts.amplitude
Type: *string*
The script to be called when an amplitude
arrives. Network code, station code, amplitude and amplitude
public ID are passed as parameters \$1, \$2, \$3, \$4.
.. confval:: scripts.alert
Type: *string*
The script to be called when a preliminary
origin arrives. Latitude and longitude are passed as
parameters \$1 and \$2.
.. confval:: scripts.event
Type: *string*
The script to be called when an event has been
declared. The message string, a flag \(1\=new event,
0\=update event\), the EventID, the arrival count and the
magnitude \(optional when set\) are passed as parameters
\$1, \$2, \$3, \$4 and \$5.
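A minimal sketch of an event script following this calling convention; the log file path is an assumption and :confval:`scripts.event` would point to the executable file:

.. code-block:: sh

#!/bin/bash
# arguments passed by scalert (scripts.event):
# $1 message, $2 flag (1=new, 0=update), $3 event ID, $4 arrival count, $5 magnitude (optional)
MESSAGE="$1"; FLAG="$2"; EVID="$3"; COUNT="$4"; MAG="${5:-unknown}"
echo "$(date -u +%FT%TZ) flag=$FLAG id=$EVID arrivals=$COUNT mag=$MAG: $MESSAGE" >> /tmp/scalert-events.log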
.. note::
**constraints.\***
*Constraints for executing scripts*
.. confval:: constraints.phaseHints
Default: ``P,S``
Type: *list::string*
Start the pick script only when the phaseHint of the
received pick has one of the value\(s\).
.. confval:: constraints.phaseStreams
Type: *list::string*
Start the pick script only when the stream \(NET.STA.LOC.CHA\)
of the received pick belongs to the list of stream IDs. If empty,
all picks are accepted, otherwise only the ones whose stream ID
matches one of the entry of this comma separated list. Each entry
must follow the NET.STA.LOC.CHA format, but the special
characters ? \* \| \( \) are also accepted.
E.g. \"CH.\*,GR.STA??.\*,\*.\*.\*.HH?,\*.\*.\*.??\(Z\|1\)\"
.. confval:: constraints.phaseNumber
Default: ``1``
Type: *int*
Start the pick script only when a minimum number of phases
'phaseNumber' is received within 'phaseInterval'.
.. confval:: constraints.phaseInterval
Default: ``1``
Type: *int*
Unit: *s*
Start the pick script only when a minimum number of phases
'phaseNumber' is received within 'phaseInterval'.
Command-Line Options
====================
.. program:: scalert
:program:`scalert [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
.. option:: --first-new
Overrides configuration parameter :confval:`firstNew`.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Alert
-----
.. option:: --amp-type arg
Specify the amplitude type to listen to.
.. option:: --amp-script arg
Overrides configuration parameter :confval:`scripts.amplitude`.
.. option:: --alert-script arg
Overrides configuration parameter :confval:`scripts.alert`.
.. option:: --event-script arg
Overrides configuration parameter :confval:`scripts.event`.
Cities
------
.. option:: --max-dist arg
Overrides configuration parameter :confval:`poi.maxDist`.
.. option:: --min-population arg
Overrides configuration parameter :confval:`poi.minPopulation`.
Debug
-----
.. option:: -E, --eventid arg
Specify an event ID to be used for testing. After running the
alert scripts scalert will exit.

@ -0,0 +1,466 @@
.. highlight:: rst
.. _scamp:
#####
scamp
#####
**Calculates amplitudes on basis of incoming origins and the associated picks.**
Description
===========
scamp measures several different kinds of amplitudes from waveform data.
It listens for origins and measures amplitudes in time windows determined
from the origin. Thus, in contrast to amplitudes measured by :ref:`scautopick`
the considered time windows can depend on epicentral distance.
The resulting amplitude objects are sent to the "AMPLITUDE"
messaging group. scamp is the counterpart of :ref:`scmag`. Usually, all
amplitudes are computed at once by scamp and then published.
Only very rarely an amplitude needs to be recomputed if the location of an
origin changes significantly. The amplitude can be reused by :ref:`scmag`, making
magnitude computation and update efficient. Currently, the automatic picker
in SeisComP, scautopick, also measures a small set of amplitudes
(namely "snr" and "mb", the signal-to-noise ratio and the amplitude used in
mb magnitude computation, respectively) for each automatic pick in fixed
time windows. If an amplitude already exists, e.g. one previously determined
by scautopick, scamp will not measure it again for the respective stream.
Amplitudes are also needed, however, for manual picks. scamp does this as well.
Arrivals with weight smaller than 0.5 (default) in the corresponding Origin are
discarded. This minimum weight can be configured with
:confval:`amptool.minimumPickWeight`.
Amplitude Types
===============
Amplitudes of many types are currently computed for their corresponding
magnitudes.
.. note::
In order to be used by scmag, the input amplitude names for the
various magnitude types must typically match exactly. Exceptions:
* :term:`MN <magnitude, Nuttli (MN)>` requires *AMN* amplitudes,
* :term:`MLr <magnitude, local GNS/GEONET (MLr)>` requires *MLv* amplitudes.
Local distances
---------------
:term:`Md <magnitude, duration (Md)>`
Duration magnitude as described in HYPOINVERSE (:cite:t:`klein-2002`).
:term:`Mjma <magnitude, JMA (M_JMA)>`
Mjma is computed on displacement data using body waves of period < 30s.
:term:`ML <magnitude, local (ML)>`
Local (Richter) magnitude calculated on the horizontal components using a
correction term to fit with the standard ML (:cite:t:`richter-1935`).
:term:`MLc <magnitude, local custom (MLc)>`
Local custom magnitude calculated on the horizontal components according to
Hessian Earthquake Service and :cite:t:`stange-2006`
:term:`MLh <magnitude, local horizontal (MLh)>`
Local magnitude calculated on the horizontal components according to SED
specifications.
:term:`MLv <magnitude, local vertical (MLv)>`
Local magnitude calculated on the vertical component using a correction term
to fit with the standard ML.
AMN for :term:`MN <magnitude, Nuttli (MN)>`
Nuttli magnitude for Canada and other Cratonic regions (:cite:t:`nuttli-1973`).
Teleseismic distances
---------------------
:term:`mb <magnitude, body-wave (mb)>`
Narrow band body wave magnitude measured on a WWSSN-SP filtered trace
:term:`mBc <magnitude, cumulative body-wave (mBc)>`
Cumulative body wave magnitude
:term:`mB <magnitude, broadband body-wave (mB)>`
Broad band body wave magnitude after :cite:t:`bormann-2008`
:term:`Mwp <magnitude, broadband P-wave moment (Mwp)>`
The body wave magnitude of :cite:t:`tsuboi-1995`
:term:`Ms_20 <magnitude, surface wave (Ms_20)>`
Surface-wave magnitude at 20 s period
:term:`Ms(BB) <magnitude, broadband surface wave (Ms(BB))>`
Broad band surface-wave magnitude
Acceleration Input Data
=======================
For amplitudes to be computed, the input waveforms are usually given in velocity.
Acceleration data, e.g. from strong-motion instruments, must therefore be transformed
to velocity. The transformation is enabled by activating the response correction.
Activate the correction in the global bindings for all
types or in a new Amplitude type profile for specific types.
Example global binding parameters for computing MLv amplitudes from acceleration
data. Here, the frequency range is limited to 0.5 - 20 Hz: ::
amplitudes.MLv.enableResponses = true
amplitudes.MLv.resp.taper = 5
amplitudes.MLv.resp.minFreq = 0.5
amplitudes.MLv.resp.maxFreq = 20
Re-processing
=============
*scamp* can be used to reprocess and to update amplitudes, e.g. when inventory parameters
had to be changed retrospectively. Updating amplitudes requires waveform access.
The update can be performed
1. In **offline processing** based on XML files (:confval:`--ep`). :confval:`--reprocess<reprocess>`
will replace existing amplitudes. Updated values can be dispatched to the messaging by
:ref:`scdispatch` making them available for further processing, e.g. by :ref:`scmag`.
**Example:**
.. code-block:: sh
$ scamp --ep evtID.xml -d [type]://[host]/[database] --reprocess > evtID_update.xml
$ scdispatch -O merge -H [host] -i evtID_update.xml
#. **With messaging** by setting :confval:`start-time` or :confval:`end-time`.
All parameters are read from the database. :confval:`--commit<commit>` will
send the updated parameters to the messaging system making them available for
further processing, e.g. by :ref:`scmag`. Otherwise, XML output is generated.
**Example:**
.. code-block:: sh
$ scamp -u testuser -H [host] --commit \
--start-time '2016-10-15 00:00:00' --end-time '2016-10-16 19:20:00'
.. _scamp_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scamp.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scamp.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scamp.cfg`
scamp inherits :ref:`global options<global-configuration>`.
.. confval:: amplitudes
Default: ``MLv, mb, mB, Mwp``
Type: *list:string*
Definition of magnitude types for which amplitudes are to be calculated.
.. confval:: amptool.minimumPickWeight
Default: ``0.5``
Type: *double*
The minimum arrival weight within an origin to compute amplitudes for the associated pick.
.. confval:: amptool.streamFromBindings
Default: ``false``
Type: *boolean*
If enabled then global bindings will be used to replace
location code and channel code of a pick with the configured
values of detecLocid and detecStream for amplitude computation.
.. confval:: amptool.initialAcquisitionTimeout
Default: ``30``
Type: *double*
Unit: *s*
Timeout in seconds of the first data packet of waveform data acquisition.
.. confval:: amptool.runningAcquisitionTimeout
Default: ``2``
Type: *double*
Unit: *s*
Timeout in seconds of any subsequent data packet of waveform data acquisition.
Command-Line Options
====================
.. program:: scamp
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
.. option:: -x, --expiry time
Time span in hours after which objects expire.
.. option:: -O, --origin-id publicID
OriginID to calculate amplitudes for and exit.
.. option:: --dump-records
Dumps the filtered traces to ASCII when using \-O.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
.. option:: --test
Test mode where no messages are sent.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
Input
-----
.. option:: --ep file
Defines an event parameters XML file to be read and processed. This
implies offline mode and only processes all origins contained
in that file. It computes amplitudes for all picks associated
with an origin and outputs an XML file that additionally
contains the amplitudes.
.. option:: -p, --picks
Force measuring amplitudes for picks only. Origins are
ignored and time windows are independent of distance. Works
only in combination with \-\-ep.
.. option:: --reprocess
Reprocess and update existing amplitudes. Manual amplitudes
will be skipped. Works only in combination with \-\-ep.
This option can be used, e.g., for reprocessing amplitudes
with new inventory information. Waveform access is required.
Reprocess
---------
.. option:: --force
Forces reprocessing of all amplitudes, even manual ones.
.. option:: --start-time time
.. option:: --end-time time
.. option:: --commit
Send amplitude updates to the messaging; otherwise, an XML
document will be output.
@ -0,0 +1,499 @@
.. highlight:: rst
.. _scardac:
#######
scardac
#######
**Waveform archive data availability collector.**
Description
===========
scardac scans an :term:`SDS waveform archive <SDS>`, e.g.,
created by :ref:`slarchive` or :ref:`scart` for available
:term:`miniSEED <miniSeed>` data. It will collect information about
* ``DataExtents`` -- the earliest and latest times data is available
for a particular channel,
* ``DataAttributeExtents`` -- the earliest and latest times data is available
for a particular channel, quality and sampling rate combination,
* ``DataSegments`` -- continuous data segments sharing the same quality and
sampling rate attributes.
scardac is intended to be executed periodically, e.g., as a cronjob.
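A minimal sketch of such a cron job is given below; the installation path and
the log file are assumptions and need to be adapted to the actual setup:

.. code-block:: sh

   # Hypothetical crontab entry: run scardac once per hour and append its
   # output to a log file
   0 * * * * /home/sysop/seiscomp/bin/seiscomp exec scardac >> /home/sysop/.seiscomp/log/scardac-cron.log 2>&1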
The availability data information is stored in the SeisComP database under the
root element :ref:`DataAvailability <api-datamodel-python>`. Access to the
availability data is provided by the :ref:`fdsnws` module via the services:
* :ref:`/fdsnws/station <sec-station>` (extent information only, see
``matchtimeseries`` and ``includeavailability`` request parameters).
* :ref:`/fdsnws/ext/availability <sec-avail>` (extent and segment information
provided in different formats)
.. _scarcac_non-sds:
Non-SDS archives
----------------
scardac can be extended by plugins to scan non-SDS archives. For example the
``daccaps`` plugin provided by :cite:t:`caps` allows scanning archives generated
by a CAPS server. Plugins are added to the global module configuration, e.g.:
.. code-block:: properties
plugins = ${plugins}, daccaps
.. _scarcac_workflow:
Definitions
-----------
* ``Record`` -- continuous waveform data of same sampling rate and quality bound
by a start and end time. scardac will only read the record's meta data and not
the actual samples.
* ``Chunk`` -- container for records, e.g., a :term:`miniSEED <miniSeed>` file,
with the following properties:
- overall, theoretical time range of records it may contain
- contains at least one record, otherwise it must be absent
- each record of a chunk must fulfill the following conditions:
- `chunk start <= record start < chunk end`
- `chunk start < record end < next chunk end`
- chunks do not overlap, end time of current chunk equals start time of
successive chunk, otherwise a ``chunk gap`` is declared
- records may occur unordered within a chunk or across chunk boundaries,
resulting in `DataSegments` marked as ``outOfOrder``
* ``Jitter`` -- maximum allowed deviation between the end time of the current
  record and the start time of the next record in multiples of the current
  record's sample time. E.g., assuming a sampling rate of 100 Hz and a jitter
  of 0.5, a maximum end-to-start time difference of 5 ms is allowed. If
  exceeded, a new `DataSegment` is created.
* ``Mtime`` -- time the content of a chunk was last modified. It is used to
- decide whether a chunk needs to be read in a subsequent application run
- calculate the ``updated`` time stamp of a `DataSegment`,
`DataAttributeExtent` and `DataExtent`
* ``Scan window`` -- time window limiting the synchronization of the archive
with the database configured via :confval:`filter.time.start` and
:confval:`filter.time.end` respectively :option:`--start` and :option:`--end`.
The scan window is useful to
- reduce the scan time of larger archives. Depending on the size and storage
type of the archive it may take some time to just list available chunks and
their mtime.
- prevent deletion of availability information even though parts of the
archive have been deleted or moved to a different location
* ``Modification window`` -- the mtime of a chunk is compared with this time
  window to decide whether it needs to be read or not. It is configured via
  :confval:`mtime.start` and :confval:`mtime.end` respectively
  :option:`--modified-since` and :option:`--modified-until` (see the usage
  sketch after this list). If no lower bound
  is defined then the ``lastScan`` time stored in the `DataExtent` is used
  instead. The mtime check may be disabled using :confval:`mtime.ignore` or
  :option:`--deep-scan`.
**Note:** Chunks in front or right after a chunk gap are read in any case
regardless of the mtime settings.
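The scan and modification windows translate to the command line as sketched
below; the database URL and the day values are only examples:

.. code-block:: sh

   # limit the archive scan to the last 30 days (scan window)
   scardac -d mysql://sysop:sysop@localhost/seiscomp --start 30

   # only read chunks modified within the last 2 days (modification window)
   scardac -d mysql://sysop:sysop@localhost/seiscomp --modified-since 2

   # ignore chunk modification times entirely (deep scan)
   scardac -d mysql://sysop:sysop@localhost/seiscomp --deep-scan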
Workflow
--------
#. Read existing `DataExtents` from database.
#. Collect a list of available stream IDs either by
* scanning the archive for available IDs or
* reading an ID file defined by :confval:`nslcFile`.
#. Identify extents to add, update or remove respecting `scan window`,
:confval:`filter.nslc.include` and :confval:`filter.nslc.exclude`.
#. Subsequently process the `DataExtents` using :confval:`threads` number of
parallel threads. For each `DataExtent`:
#. Collect all available chunks within `scan window`.
#. If the `DataExtent` is new (no database entry yet), store a new and
empty `DataExtent` to database, else query existing `DataSegments` from
the database:
* count segments outside `scan window`
* create a database iterator for extents within `scan window`
#. Create two in-memory segment lists which collect segments to remove and
segments to add/update
#. For each chunk
* determine the `chunk window` and `mtime`
* decide whether the chunk needs to be read depending on the `mtime`
and a possible `chunk gap`. If necessary, read the chunk and
- create chunk segments by analyzing the chunk records for
gaps/overlaps defined by :confval:`jitter`, sampling rate or quality
changes
- merge chunk segments with database segments and update the in-memory
segment lists.
If not necessary, advance the database segment iterator to the end
of the chunk window.
#. Remove and then add/update the collected segments.
#. Merge segment information into `DataAttributeExtents`
#. Merge `DataAttributeExtents` into overall `DataExtent`
Examples
--------
#. Get command line help or execute scardac with default parameters and informative
debug output:
.. code-block:: sh
scardac -h
scardac --debug
#. Synchronize the availability of waveform data files existing in the standard
:term:`SDS` archive with the seiscomp database and create an XML file using
:ref:`scxmldump`:
.. code-block:: sh
scardac -d mysql://sysop:sysop@localhost/seiscomp -a $SEISCOMP_ROOT/var/lib/archive --debug
scxmldump -Yf -d mysql://sysop:sysop@localhost/seiscomp -o availability.xml
#. Synchronize the availability of waveform data files existing in the standard
:term:`SDS` archive with the seiscomp database. Use :ref:`fdsnws` to fetch a flat file containing a list
of periods of available data from stations of the CX network sharing the same
quality and sampling rate attributes:
.. code-block:: sh
scardac -d mysql://sysop:sysop@localhost/seiscomp -a $SEISCOMP_ROOT/var/lib/archive
wget -O availability.txt 'http://localhost:8080/fdsnws/ext/availability/1/query?network=CX'
.. note::
The |scname| module :ref:`fdsnws` must be running for executing this
example.
.. _scardac_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scardac.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scardac.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scardac.cfg`
scardac inherits :ref:`global options<global-configuration>`.
.. confval:: archive
Default: ``@SEISCOMP_ROOT@/var/lib/archive``
Type: *string*
The URL to the waveform archive where all data is stored.
Format: [service:\/\/]location[#type]
\"service\": The type of the archive. If not given,
\"sds:\/\/\" is implied assuming an SDS archive. The SDS
archive structure is defined as
YEAR\/NET\/STA\/CHA\/NET.STA.LOC.CHA.YEAR.DAYOFYEAR, e.g.
2018\/GE\/APE\/BHZ.D\/GE.APE..BHZ.D.2018.125
Other archive types may be considered by plugins.
.. confval:: threads
Default: ``1``
Type: *int*
Number of threads scanning the archive in parallel.
.. confval:: jitter
Default: ``0.5``
Type: *float*
Acceptable deviation of end time and start time of successive
records in multiples of sample time.
.. confval:: maxSegments
Default: ``1000000``
Type: *int*
Maximum number of segments per stream. If the limit is reached
no more segments are added to the database and the corresponding
extent is flagged as too fragmented. Set this parameter to 0 to
disable any limits.
.. confval:: nslcFile
Type: *string*
Line\-based text file of form NET.STA.LOC.CHA defining available
stream IDs. Depending on the archive type, size and storage
media used this file may offer a significant performance
improvement compared to collecting the available streams on each
startup. Filters defined under `filter.nslc` still apply.
.. note::
**filter.\***
*Parameters of this section limit the data processing to either*

* *reduce the scan time of larger archives or to*
* *prevent deletion of availability information even though parts of the
  archive have been deleted or moved to a different location.*
.. note::
**filter.time.\***
*Limit the processing by record time.*
.. confval:: filter.time.start
Type: *string*
Start of data availability check given as date string or
as number of days before now.
.. confval:: filter.time.end
Type: *string*
End of data availability check given as date string or
as number of days before now.
.. note::
**filter.nslc.\***
*Limit the processing by stream IDs.*
.. confval:: filter.nslc.include
Type: *list:string*
Comma\-separated list of stream IDs to process. If
empty all streams are accepted unless an exclude filter
is defined. The following wildcards are supported: '\*'
and '?'.
.. confval:: filter.nslc.exclude
Type: *list:string*
Comma\-separated list of stream IDs to exclude from
processing. Excludes take precedence over includes. The
following wildcards are supported: '\*' and '?'.
.. note::
**mtime.\***
*Parameters of this section control the rescan of data chunks.*
*By default the last update time of the extent is compared with*
*the record file modification time to read only files modified*
*since the last run.*
.. confval:: mtime.ignore
Default: ``false``
Type: *boolean*
If set to true all data chunks are read independent of their
mtime.
.. confval:: mtime.start
Type: *string*
Only read chunks modified after specific date given as date
string or as number of days before now.
.. confval:: mtime.end
Type: *string*
Only read chunks modified before specific date given as date
string or as number of days before now.
Command-Line Options
====================
.. program:: scardac
:program:`scardac [OPTION]...`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
.. option:: --log-file arg
Use alternative log file.
Collector
---------
.. option:: -a, --archive arg
Overrides configuration parameter :confval:`archive`.
.. option:: --threads arg
Overrides configuration parameter :confval:`threads`.
.. option:: -j, --jitter arg
Overrides configuration parameter :confval:`jitter`.
.. option:: --nslc arg
Overrides configuration parameter :confval:`nslcFile`.
.. option:: --start arg
Overrides configuration parameter :confval:`filter.time.start`.
.. option:: --end arg
Overrides configuration parameter :confval:`filter.time.end`.
.. option:: --include arg
Overrides configuration parameter :confval:`filter.nslc.include`.
.. option:: --exclude arg
Overrides configuration parameter :confval:`filter.nslc.exclude`.
.. option:: --deep-scan
Overrides configuration parameter :confval:`mtime.ignore`.
.. option:: --modified-since arg
Overrides configuration parameter :confval:`mtime.start`.
.. option:: --modified-until arg
Overrides configuration parameter :confval:`mtime.end`.
.. option:: --generate-test-data arg
Do not scan the archive but generate test data for each
stream in the inventory. Format:
days,gaps,gapslen,overlaps,overlaplen. E.g., the following
parameter list would generate test data for 100 days
\(starting from now\(\)\-100\) which includes 150 gaps with a
length of 2.5s followed by 50 overlaps with an overlap of
5s: \-\-generate\-test\-data\=100,150,2.5,50,5
@ -0,0 +1,364 @@
.. highlight:: rst
.. _scart:
#####
scart
#####
**Import/export MiniSEED data to/from SDS archives.**
Description
===========
The archive tool scart reads and writes :term:`SDS` archives and files
in miniSEED format, checks miniSEED archives, and prints stream information.
* **Dump mode:** Create miniSEED files (multiplexed), e.g. for playbacks, from
:term:`SDS` structured data (e.g. created by slarchive).
* **Dump mode:** Play back records directly out of an SDS structure.
* **Import mode:** Import multiplexed miniSEED files into a local SDS waveform
archive.
* **Import mode:** Import data using the :ref:`global_recordstream` interface
into a local SDS waveform archive.
* **Import mode:** Read data from any :ref:`global_recordstream` interface
and dump it to file.
* **Check mode:** Check an archive of miniSEED files for out-of-order records in
files.
.. warning::
* When creating :term:`SDS` archives, scart simply appends the new records to
existing ones. Multiple imports of the same data result in duplication.
* Out-of-order imports of waveforms into a SDS archive result in out-of-order
records which may not be processed. Clean your archive using :ref:`scmssort`.
* **Before importing miniSEED data** into an SDS archive they must be sorted
by time and duplicate records must be removed. Otherwise, the SDS archive
may not be correctly readable by other modules. Therefore, combine scart
with :ref:`scmssort` for multiplexing and removal of duplicates.
.. hint::
In dump and import mode output streams may be filtered by
* Time windows (:option:`-t`),
* Network-station-location-channel (NSLC) lists (:option:`--nslc`) created,
e.g., with
* :ref:`scinv` from inventories,
* :ref:`scdumpcfg` from bindings configuration,
* scart itself from other miniSEED files or archives.
In dump mode output streams may also be filtered by
* Time window - stream lists (:option:`--list`, dump mode) generated by
:ref:`scevtstreams` for particular events.
Time strings may be given in
* ISO time format, e.g., 2023-03-28T15:48:00
* or the 'old' SeisComP time format with a space between date and time, e.g.,
  '2023-03-28 15:48:00'.

When omitting seconds, minutes, or hours, values of zero are implied
(see the sketch after this hint).
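For instance, a dump over a time window given in ISO format where trailing
time fields are omitted may look like this sketch; the placeholder in brackets
is an assumption:

.. code-block:: sh

   scart -dsv -t '2023-03-28T15:48:00~2023-03-28T16:00' [SDS archive] > file.mseed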
.. _scart-config:
Configuration
=============
scart can make use of :ref:`global_recordstream` implementations which are
provided by additional plugins. For loading additional plugins, e.g., the *xyz*
plugin, create and configure :file:`scart.cfg`:

.. code-block:: properties

   plugins = xyz
Examples
========
.. hint::
The usage of wildcards in place of network, station, location or channel code
is allowed in many options (:option:`-n`, :option:`-c`, :option:`-l`,
:option:`--list`, :option:`--nslc`) and follows these rules:
* Import mode: the wildcards are passed to the :ref:`global_recordstream` interface,
  which interprets them. Normally both "*" and "?" are supported by RecordStreams.
* Dump mode: the wildcards are interpreted by the scart command, which supports "*" for
  network, station, location codes and "*", "?", "(", ")", "|" for channel code.
#. Extract data from the default :term:`SDS` archive in :file:`$SEISCOMP_ROOT/var/lib/archive`
or from a local :term:`SDS` archive [SDS archive] into a miniSEED file :file:`file.mseed`
and sort by end time of the records:
.. code-block:: sh
scart -dsvE -t '[start-time]~[end-time]' [SDS archive] > [file.mseed]
scart -dsvE -t '[start-time]~[end-time]' > file.mseed
scart -dsvE -t '[start-time]~[end-time]' -n '[NET1],[NET2]' > file.mseed
scart -dsvE -t '[start-time]~[end-time]' -n '[NET]' -c '(E,H)H(1,2,3)' > file.mseed
scart -dsvE -t '[start-time]~[end-time]' -n '[N1.S1.L1.C1],[N2.S2.L2.C2]' > file.mseed
scart -dsvE -t '[start-time]~[end-time]' --nslc list.file > file.mseed
scart -dsvE --list list.file > file.mseed
It is possible to achieve the same result as the dump mode using a
combination of the import mode and the :ref:`scmssort` command, which allows
reading the input data from any supported :ref:`global_recordstream`,
not only an SDS archive:
.. code-block:: sh
scart -I [record-stream] --list list.file --stdout | scmssort -u -E -v > file.mseed
.. note::
Sorting data is computationally expensive but required for waveform playbacks.
#. Push miniSEED data from file :file:`file.mseed` or standard input
(stdin) into a local :term:`SDS` archive or a file. Additionally, you may
check if the records of archived files are correctly ordered, filter by time
and/or with NSLC list and print the output streams:
.. code-block:: sh
scmssort -u -E [file.mseed] > sorted.mseed
scart -I sorted.mseed --with-filecheck [SDS archive]
scart -I [file.mseed] -t '[start-time]~[end-time]' --print-streams --nslc list.file -o [out.mseed]
cat sorted.mseed | scart -I - [SDS archive]
cat sorted.mseed | scart [SDS archive]
#. Collect data using the :ref:`global_recordstream` interface (e.g. FDSNWS server)
and write to a miniSEED file or import it into a local :term:`SDS` archive. The
data streams and the time spans can be defined in several ways. The data streams
(:option:`--list`) can be automatically generated, e.g., by :ref:`scevtstreams`.
.. code-block:: sh
scart -I fdsnws://[server]:80 --list list.file [SDS archive]
scart -I fdsnws://[server]:80 --list list.file -o file.mseed
scart -I fdsnws://[server]:80 -t '[start-time]~[end-time]' --nslc list.file [SDS archive]
scart -I fdsnws://[server]:80 -t '[start-time]~[end-time]' -n '[NET1],[NET2]' [SDS archive]
scart -I fdsnws://[server]:80 -t '[start-time]~[end-time]' -n '[NET]' -c 'EH?' [SDS archive]
scart -I fdsnws://[server]:80 -t '[start-time]~[end-time]' -n '[N1.S1.L1.C1],[N2.S2.L2.C2]' [SDS archive]
#. Check all files of an SDS archive or other directory structure for
miniSEED files with out-of-order records:
.. code-block:: sh
scart --check [archive]
#. Print stream information from miniSEED files in archives or from
:term:`RecordStream` without actually writing miniSEED data. In dump and
import mode use the :option:`--test` if miniSEED data shall be read but not
written.
.. code-block:: sh
scart --print-streams -I [miniSEED file] --test
scart --print-streams -d -t [time span] --test [SDS archive]
scart --print-streams --check [archive]
The output looks like this:
.. code-block:: sh
# streamID start end records samples samplingRate
GE.RGN..BHZ 2022-12-08T15:34:41.895Z 2022-12-08T15:52:19.145Z 58 21145 20.0
where the header and the text body are printed to stderr.
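#. Dump data from an :term:`SDS` archive for a real-time playback: record times
   are shifted to the current time (:option:`-m`) and records are delayed
   according to real time (:option:`--speed` 1). This is only a sketch; the
   placeholders in brackets and the target of the redirection (e.g. a replay
   pipe of a local seedlink setup) are assumptions:

   .. code-block:: sh

      scart -dmv --speed 1 -t '[start-time]~[end-time]' [SDS archive] > [replay pipe]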
Command-Line Options
====================
.. program:: scart
:program:`scart [options] {archive-dir}`
The last option has to be the archive directory when dump mode is enabled.
When no archive directory is explicitly given,
\$SEISCOMP_ROOT\/var\/lib\/archive or the current directory
is used depending on whether \$SEISCOMP_ROOT has been set or not.
The default operation mode is import. That means that a multiplexed
MiniSEED file or another record source such as ArcLink is used to import
records into an SDS structure.
Verbosity
---------
.. option:: -v, --verbose
Verbose mode.
.. option:: -h, --help
Display a help message.
Mode
----
.. option:: --check
Check mode. Check all files
in the given directory for erroneous miniSEED records.
All sub\-directories are included. If no directory is given,
the default SDS archive is scanned. Checks are only complete
for files containing exactly one stream. More complete checks
are made with scmssort.
.. option:: -d, --dump
Set export \(dump\) mode. Records are retrieved from an archive and
written to standard output.
.. option:: -I
Import mode \(default\): Specify the recordstream URL to read
the data from for archiving. When using any other
recordstream than file, a stream list file is needed.
Specifying \- implies file:\/\/\- \(stdin\). If no mode is
explicitly specified, \-I file:\/\/\- is assumed.
Processing
----------
.. option:: -c channels
Channel filter to be applied to the data streams.
Default for Dump: \"\(B\|E\|H\|M\|S\)\(D\|H\|L\|N\)\(E\|F\|N\|Z\|1\|2\|3\)\"
Default for Import: \"\*\"
.. option:: -E
Dump mode: sort records according to their end time.
Default: start time.
.. option:: --files count
Dump mode: Specify the number of file handles to cache.
Default: 100.
.. option:: -i
Ignore records without data samples.
.. option:: -l, --list file
Import, dump mode: Use a stream list file with time windows instead
of defined networks and channels \(\-n, \-c and \-t are ignored\).
The list can be generated from events by scevtstreams. One
line per stream. Line format: starttime;endtime;streamID
The time format is the same as described in option '\-t'.
Example:
2019\-07\-17 02:00:00;2019\-07\-17 02:10:00;GR.CLL..BH?
.. option:: -m, --modify
Dump mode: Modify the record time for real time playback.
The first record time is NOW. The relative time of
successive records to the first one are kept.
.. option:: -n networks
Import, dump mode: Data stream selection as a comma separated list
\"stream1,stream2,streamX\" where each stream can be NET or NET.STA
or NET.STA.LOC or NET.STA.LOC.CHA.
If CHA is omitted, it defaults to the value of \-c option.
Default: \"\*\"
.. option:: --nslc file
Import, dump mode: Stream list file to be used instead of
defined networks and channels \(\-n and \-c are ignored\)
for filtering the data by the given streams. Dump mode:
Use in combination with \-t\! One line per stream, line
format: NET.STA.LOC.CHA
Example:
GR.CLL..BH?
.. option:: --rename rule
Import, dump mode: Rename stream data according to the provided
rule\(s\). A rule is \"[match\-stream:]rename\-stream\" and match\-stream
is optional. match\-stream and rename\-stream are in the
\"NET.STA.LOC.CHA\" format. match\-stream supports special
charactes \"?\" \"\*\" \"\|\" \"\(\" \"\)\". rename\-stream supports the
special character \"\-\" that can be used in place of NET, STA,
LOC, CHA codes with the meaning of not renaming those.
\"\-\" can also be used as the last character in CHA code.
Multiple rules can be provided as a comma separated list
or by providing multiple \-\-rename options.
.. option:: -s, --sort
Dump mode: Sort records by [start\-]time. To sort records by their
end time use \-E.
.. option:: --speed value
Dump mode: Specify the speed to dump the records. A value of 0 means
no delay otherwise speed is a multiplier of the real time difference
between the records. When feeding the records directly into the replay
pipe a value of 1 \(real time\) is recommended.
.. option:: -t, --time-window timeWindow
Import, dump mode: Specify the time window \(as one properly
quoted string\) to dump records for. Times are UTC and
separated by a tilde \"\~\". To dump one hour of
waveform data between 2008\/01\/01 00:00:00 and 2008\/01\/01
01:00:00 use
<\-t 2008\-01\-01T00:00:00\~2008\-01\-01T01:00:00>.
Output
------
.. option:: -o, --output
Dump, Import mode: Write data to given file instead of creating
a SDS archive. Deactivates \-\-stdout. Deactivated by \-\-test.
.. option:: --print-streams
Print stream information only and exit. Works in import, dump
and check mode. Output: NET.STA.LOC.CHA StartTime EndTime.
.. option:: --stdout
Import mode: Write to stdout instead of creating a SDS archive.
Deactivated by \-\-test and \-\-output.
.. option:: --test
Test input only, deactivate all miniSEED output. This switch is
useful for debugging and printing stream information with
\-\-print\-streams.
.. option:: --with-filecheck
Import mode: Check all accessed files. Unsorted or unreadable
files are reported to stderr. Checks are only complete
for files containing exactly one stream. More complete
checks are made with scmssort.
.. option:: --with-filename
Import mode: Print all accessed files to stderr after import.
File diff suppressed because it is too large.
@ -0,0 +1,942 @@
.. highlight:: rst
.. _scautopick:
##########
scautopick
##########
**Phase detection and picking on waveforms.**
Description
===========
scautopick applies threshold monitoring by searching for waveform anomalies in
the form of changes in amplitudes. It is applied for detecting phase arrivals
creating :term:`phase picks <pick>` and for measuring related features and
:term:`amplitudes <amplitude>`. The picks and associated amplitudes and
features are typically provided to modules like :ref:`scautoloc` for locating
the source.
.. note::
Instead of detecting phase arrivals for source location, scautopick
can also be applied for detecting simple amplitude exceedance by applying filters
like the :py:func:`MAX` filter. Exceedances are reported as picks and can be
processed further, e.g. by :ref:`scalert`.
Phase Detections
================
scautopick detects phase onsets for generating :term:`picks <pick>`. Initially,
it searches for detections on the waveform streams defined by global bindings.
P picks
-------
A primary detector is applied first. When a detection is found, 'P' is by default
assigned as the guess of the phase type (phaseHint); the actual guess can be
configured by :confval:`phaseHint`. By default the primary detector applies a
robust STA/LTA detector (:py:func:`STALTA` filter) to waveforms for making
detections. Other detection filters and filter chains can be chosen from the
:ref:`list of SeisComP filters <filter-grammar>`.
Waveforms are typically :ref:`pre-filtered <filter-grammar>` before the actual
:py:func:`STALTA` filter. Without further configuration a
running-mean highpass, a cosine taper and a Butterworth bandpass filter of
third order with corner frequencies of 0.7 and 2 Hz are applied before the
:py:func:`STALTA` filter. The entire filter sequence is configurable by
:confval:`filter`, module configuration, or :confval:`detecFilter`, binding
configuration.
Once the STA/LTA ratio has reached a configurable threshold (by default 3) for a
particular stream, a :term:`pick` is set to the time when this
threshold is exceeded (pick time) and the picker is set inactive. The picker is
reactivated for this stream once the STA/LTA ratio falls to the value of 1.5 (default).
The trigger thresholds are configurable:
* Trigger on: :confval:`thresholds.triggerOn` in module configuration or
:confval:`trigOn` in binding configuration,
* Trigger off: :confval:`thresholds.triggerOff`, module configuration or :confval:`trigOff`,
binding configuration.
Initial detections can be further adjusted by a second-stage phase re-picker
(post picker) as defined by :confval:`picker`. The re-picker should be tuned
carefully and global bindings parameters :confval:`picker.*` should be
configured accordingly.
After having detected a phase, the re-picker will be inactive and accept no further
detection until
* The amplitudes measured after filtering (:confval:`filter` in module configuration
or :confval:`detecFilter` in binding configuration) fall below the
:confval:`thresholds.triggerOff` (module configuration) or :confval:`trigOff`
(binding configuration) and
* Amplitudes, :math:`A_{trigger}`, measured after filtering reach or
exceed a threshold determined by :math:`T_{minOffset}` (:confval:`thresholds.minAmplOffset`),
:math:`T_{dead}` (:confval:`thresholds.deadTime`) and the amplitude of the
previous pick, :math:`A_{prev}`:
.. math ::
A_{trigger} \ge T_{minOffset} + A_{prev} * exp\left(-(dt/T_{dead})^2\right)
if :math:`T_{dead} > 0`. Otherwise:
.. math ::
A_{trigger} \ge T_{minOffset}
Here, :math:`dt` is the time passed since the last pick.
:math:`T_{minOffset}` (:confval:`thresholds.minAmplOffset`) is typically similar to
the trigger threshold, :confval:`thresholds.triggerOn` (module configuration) or
:confval:`trigOn` (binding configuration).
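A hypothetical module configuration sketch combining these parameters; the
values simply repeat the documented defaults and are not a tuning
recommendation:

.. code-block:: properties

   # etc/scautopick.cfg (sketch)
   filter = "RMHP(10)>>ITAPER(30)>>BW(4,0.7,2)>>STALTA(2,80)"
   thresholds.triggerOn = 3
   thresholds.triggerOff = 1.5
   thresholds.deadTime = 30
   thresholds.minAmplOffset = 3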
S picks
-------
Based on the initial detection or pick, a secondary picker may be applied,
e.g., for picking S phases as defined by :confval:`spicker`. The secondary picker
is halted as soon as new detections are made unless :confval:`killPendingSPickers`
is inactive.
As for the re-picker also the spicker should be tuned carefully and global
bindings parameters :confval:`spicker.*` should be set.
.. csv-table:: Second-stage pickers available by configuration of :confval:`picker` or :confval:`spicker`
:align: center
:delim: ,
:widths: 1 3 1 1 3
:header: "picker name", "phase", "picker", "spicker", "global bindings parameters"
"AIC", "P, configurable: :confval:`phaseHint`", "x", "", "picker.AIC.*"
"BK", "P, configurable: :confval:`phaseHint`", "x", "", "picker.BK.*"
"S-L2", "S", "", "x", "spicker.L2.*"
Feature extraction
------------------
For extracting features related to picks such as polarization parameters
configure :confval:`fx` and the related global bindings parameters :confval:`fx.*`.
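For example, a module configuration sketch enabling a re-picker, a secondary
S picker and feature extraction; choosing "AIC" here is only an example, any
of the documented pickers may be used instead:

.. code-block:: properties

   picker = AIC
   spicker = S-L2
   fx = DFX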
Amplitude Measurements
======================
The second task of scautopick is to calculate amplitudes of a given type for the
corresponding magnitude type (see :ref:`scamp` for a list of amplitude types and
:ref:`scmag` for the magnitude types). Such amplitudes are required by:
* :ref:`scautoloc` for associating phase picks and generating a source location
* EEW (earthquake early warning) systems in order to provide rapid amplitudes for
magnitudes as soon as source locations are available.
The time window for measuring amplitudes starts at the pick time. The window
length is constant and specific to the amplitude type. It can be adjusted in
global bindings. For example mb is calculated
for a fixed time window of 30 s after the pick, mB for a time window of 60 s, and for
MLv a time window of 150 s is estimated to make sure that S-arrivals are inside
this time window. The pre-calculated amplitudes are sent out and received by
the magnitude tool, :ref:`scmag`.
The fixed time window poses a limitation to EEW systems. However, a speed-up is
available with :confval:`amplitudes.enableUpdate`.
Read the :ref:`scamp` documentation for more details on amplitude measurements.
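A sketch of the related module configuration for rapid, EEW-style amplitude
updates; restricting the updates to MLv is only an example:

.. code-block:: properties

   # amplitude types computed by scautopick (documented default)
   amplitudes = MLv, mb, mB
   # update and send MLv amplitudes before the full time window is complete
   amplitudes.enableUpdate = MLv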
Modes of Operation
==================
scautopick usually runs in the background connected to a real-time data source
such as :ref:`Seedlink <seedlink>`. This is referred to as online mode. Another
option to run scautopick is on offline mode with files.
Real-time
---------
In real-time mode the workflow draws like this:
* scautopick reads all of its binding parameters and subscribes to stations
defined by global binding parameters where :confval:`detecEnable` is set to ``true``.
* The data time window requested from the data source is [system-:confval:`leadTime`, NULL]
meaning an open end time that causes :ref:`SeedLink <seedlink>` to stream
real-time data if no more data are in the buffers.
* Each incoming record is filtered according to :confval:`detecFilter`.
* The samples are checked for exceedance of :confval:`trigOn` and in the positive
case either a post picker (:confval:`picker`) is launched or a :term:`Pick <pick>`
object will be sent.
* If :confval:`sendDetections` is set to ``true``, a trigger will be sent in any
case for e.g. debugging.
* After the primary stage has finished (detector only or picker) secondary
pickers will be launched if configured with :confval:`spicker`.
These steps repeat for any incoming record.
To run scautopick in the background as a daemon module enable and start it ::
$ seiscomp enable scautopick
$ seiscomp start scautopick
For executing on the command line simply call it with appropriate options, e.g. ::
$ seiscomp exec scautopick -h
Non-real-time
-------------
.. note::
Due to code changes in the file data source, the command line option
:option:`--playback` is essential for non-real-time operation. Otherwise a
real-time time window is set and all records are most likely filtered out.
To tune scautopick or to do playbacks it is helpful to run scautopick not with
a real-time data source but on a defined data set, e.g. a multiplexed sorted miniSEED
volume. scautopick will apply the same workflow as in online mode but the
acquisition of data records has to change. Once the input data (file) has been
read, scautopick exits. Furthermore, it must not ask for a particular
time window, especially not a real-time time window. To accomplish that,
the command-line parameter :option:`--playback` has to be used. Example:
.. code-block:: sh
$ scautopick --playback -I data.mseed
This call will process all records in :file:`data.mseed` for which bindings
exist and **send the results to the messaging**. If all data records are processed,
scautopick will exit. The processing steps are similar to the online mode.
Use the :option:`--ep` option for offline processing **without messaging**. The results are
printed in :term:`SCML` format. Example:
.. code-block:: sh
$ scautopick --playback -I data.mseed --ep -d [type]://[host]/[database] > picks.xml
.. _scautopick_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scautopick.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scautopick.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scautopick.cfg`
scautopick inherits :ref:`global options<global-configuration>`.
.. confval:: ringBufferSize
Default: ``300``
Type: *int*
Unit: *s*
Defines the record ringbuffer size in seconds.
.. confval:: leadTime
Default: ``60``
Type: *int*
Unit: *s*
The leadTime defines the time in seconds to start picking on
waveforms before current time.
.. confval:: playback
Default: ``false``
Type: *boolean*
If enabled, picks can be made on waveforms which are older than
current time \- \"leadTime\". Current time is the time
when the module was started. This allows to pick
historic data in real\-time playbacks which are preserving the
record times. See e.g. the \"msrtsimul\" module.
This option deactivates \"leadTime\". Activate only for playbacks.
.. confval:: initTime
Default: ``60``
Type: *int*
Unit: *s*
The initTime defines a time span in seconds for that the picker
is blind after initialization. This time is needed to initialize
the filter and depends on it.
.. confval:: gapInterpolation
Default: ``false``
Type: *boolean*
Interpolate gaps linearly? This is valid for gaps shorter
than thresholds.maxGapLength.
.. confval:: useAllStreams
Default: ``true``
Type: *boolean*
If enabled, all streams that are received by the picker are
used for picking. This option has only effect if a
file is used as input which contains more data than the
picker requests. If connected to a waveform server such as
SeedLink, the picker will only receive the data it is
subscribed to.
.. confval:: filter
Default: ``"RMHP(10)>>ITAPER(30)>>BW(4,0.7,2)>>STALTA(2,80)"``
Type: *string*
The default filter used for making detections. Station\-specific
configurations \(bindings\) override this value.
.. confval:: timeCorrection
Default: ``-0.8``
Type: *double*
Unit: *s*
The time correction applied for a pick. Station\-specific
values \(bindings\) override this value.
.. confval:: picker
Type: *string*
The re\-picker to use. By default only simple detections
are emitted as picks. To enable re\-picking on a time window around
the detection, an algorithm \(plugin\) can be defined with this parameter.
Currently available: \"AIC\", \"BK\" or
\"GFZ\".
More options may be available by plugins. Configure related
parameters in global bindings.
.. confval:: phaseHint
Default: ``P``
Type: *string*
Phase hint to be assigned to the pick made by the primary picker.
.. confval:: sendDetections
Default: ``false``
Type: *boolean*
If enabled and \"picker\" is configured, then
initial detections are sent as well. To distinguish between
detections and picks the evaluation status of the pick is
set to \"rejected\". This is meant to be a debug
option which can be used to compare detections and picks by
their evaluation status.
.. confval:: spicker
Type: *string*
The secondary picker to use, e.g., for picking S\-phases.
Currently available is: \"S\-L2\". More options may
be available by plugins. Configure related parameters
in global bindings.
.. confval:: killPendingSPickers
Default: ``true``
Type: *boolean*
If enabled, all secondary pickers that were triggered by
a previous pick will be terminated when a new detection or
pick has been found. This aims to avoid the case where an
S phase is wrongly picked as P but would also be picked as
S by the secondary picker. But suppressing the S pick can
lead to undesired results. It might be better in some
situations to have two picks \(P and S\) instead of only a wrong P.
.. confval:: extraPickComments
Default: ``false``
Type: *boolean*
If enabled and \"picker\" or \"spicker\" is
configured, extra comments will be added to the resulting pick.
Supported comments:
SNR: added if SNR >\= 0, comment id is \"SNR\"
duration: added if the duration has been computed at the time
of the pick creation, which actually requires
\"thresholds.maxDuration\" to be configured
with a non\-negative value.
.. confval:: fx
Type: *string*
Configures the feature extraction type to use. Currently
available: \"DFX\". Configure related parameters
in global bindings.
When configured, the usability of the features for locating
events depends on the used locator, e.g. LOCSAT. Read the
locator's documentation and configuration parameters.
.. confval:: amplitudes
Default: ``MLv, mb, mB``
Type: *list:string*
The amplitude types to be computed by the picker based on
picks.
.. note::
**thresholds.\***
*Threshold parameters for the primary picker.*
.. confval:: thresholds.triggerOn
Default: ``3``
Type: *double*
For which value on the filtered waveforms is a pick
detected. Station specific values override this value.
.. confval:: thresholds.triggerOff
Default: ``1.5``
Type: *double*
The value the filtered waveforms must reach to enable
detection again. Between triggerOn and triggerOff the
picker is blind and does not produce picks. Station
specific values override this value.
.. confval:: thresholds.maxGapLength
Default: ``4.5``
Type: *double*
Unit: *s*
The maximum gap length in seconds to handle.
Gaps larger than this will cause the picker to be reset.
.. confval:: thresholds.amplMaxTimeWindow
Default: ``10``
Type: *double*
Unit: *s*
The time window used to compute a maximum \(snr\) amplitude
on the filtered waveforms.
.. confval:: thresholds.deadTime
Default: ``30``
Type: *double*
Unit: *s*
The time used together with measured amplitude and
`thresholds.minAmplOffset` for scaling the amplitude below which
the picker is inactive after a P pick. Read the documentation\!
.. confval:: thresholds.minAmplOffset
Default: ``3``
Type: *double*
The amplitude used together with measured amplitude and
`thresholds.deadTime` for scaling the amplitude below which
the picker is inactive after a P pick. The value is typically
similar to the trigger threshold. Read the documentation\!
.. confval:: thresholds.minDuration
Default: ``-1``
Type: *double*
The minimum duration to reach. The duration is measured as
the time between trigger on and trigger off. If this value
is configured the detection \(pick\) will be delayed in order
to compute and check the duration.
.. confval:: thresholds.maxDuration
Default: ``-1``
Type: *double*
The maximum duration allowed. The duration is measured as
the time between trigger on and trigger off. If this value
is configured the detection \(pick\) will be delayed in order
to compute and check the duration.
.. confval:: amplitudes.enableUpdate
Type: *list:string*
Configure a list of magnitude types.
Update and send amplitudes for these magnitudes as soon as data are
available. Do not wait for complete time windows.
Only magnitudes computed by scautopick as given by the amplitudes parameter are considered.
This option is for rapid magnitude estimation and EEW.
WARNING: This option increases the load on the system\!
.. confval:: connection.amplitudeGroup
Default: ``AMPLITUDE``
Type: *string*
Message group for sending amplitudes to.
.. note::
**comment.\***
*Properties of a custom comment added to a pick. It requires both*
*ID and text to be configured.*
.. confval:: comment.ID
Type: *string*
The ID of a custom comment.
.. confval:: comment.text
Type: *string*
The text of a custom comment.
Bindings Parameters
===================
.. confval:: detecEnable
Default: ``true``
Type: *boolean*
Enables\/disables picking on a station.
.. confval:: detecFilter
Default: ``"RMHP(10)>>ITAPER(30)>>BW(4,0.7,2)>>STALTA(2,80)"``
Type: *string*
Defines the filter to be used for picking.
.. confval:: trigOn
Default: ``3``
Type: *double*
For which value on the filtered waveform is a pick detected.
.. confval:: trigOff
Default: ``1.5``
Type: *double*
The value the filtered waveform must reach to
enable a detection again.
.. confval:: timeCorr
Default: ``-0.8``
Type: *double*
Unit: *s*
The time correction applied to a pick.
.. confval:: sensitivityCorrection
Default: ``false``
Type: *boolean*
Defines whether the detector applies sensitivity correction
\(applying the gain\) or not in advance to filter the data.
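A hypothetical binding profile sketch, e.g. in
:file:`etc/key/scautopick/profile_default`, repeating the documented defaults:

.. code-block:: properties

   detecEnable = true
   detecFilter = "RMHP(10)>>ITAPER(30)>>BW(4,0.7,2)>>STALTA(2,80)"
   trigOn = 3
   trigOff = 1.5
   timeCorr = -0.8
   sensitivityCorrection = false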
Command-Line Options
====================
.. program:: scautopick
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
Mode
----
.. option:: --offline
Do not connect to a messaging server and do not use the database.
.. option:: --playback
Switches to playback mode which does not request a particular time window from
the input data source. This implies that all records are forwarded to scautopick
if files are being used. Without this option scautopick sets the requested
start time to NOW\-leadTime and therefore would not work anymore with
older datasets in offline mode or when running playbacks.
.. option:: --ep
Outputs an XML event parameters file containing all picks and amplitudes.
This option implies offline.
.. option:: --amplitudes arg
Enables or disables computation of amplitudes.
.. option:: --test
Runs the picker as usual but does not send any messages. This can be useful to
test the picker within a running system.
.. option:: --dump-config
Dumps the current configuration and exits. Station configuration is only read if
the picker connects to the messaging and the database. In offline mode it will
only dump the application specific setting unless a station.conf file is provided.
.. option:: --dump-records
This option only works in combination with :option:`\-\-offline`. It will dump
the data of an amplitude processor if it completed processing successfully
and a new amplitude is available. The output format is a simple ASCII format.
Settings
--------
.. option:: --filter filter
Overrides configuration parameter :confval:`filter`.
.. option:: --time-correction time
Overrides configuration parameter :confval:`timeCorrection`.
.. option:: --buffer-size timespan
Overrides configuration parameter :confval:`ringBufferSize`.
.. option:: --before timespan
Overrides configuration parameter :confval:`leadTime`.
.. option:: --init-time timespan
Overrides configuration parameter :confval:`initTime`.
.. option:: --trigger-on arg
Overrides configuration parameter :confval:`thresholds.triggerOn`.
.. option:: --trigger-off arg
Overrides configuration parameter :confval:`thresholds.triggerOff`.
.. option:: --trigger-dead-time arg
Overrides configuration parameter :confval:`thresholds.deadTime`.
.. option:: --ampl-max-time-window arg
Overrides configuration parameter :confval:`thresholds.amplMaxTimeWindow`.
.. option:: --min-ampl-offset arg
Overrides configuration parameter :confval:`thresholds.minAmplOffset`.
.. option:: --gap-tolerance arg
Overrides configuration parameter :confval:`thresholds.maxGapLength`.
.. option:: --gap-interpolation arg
Overrides configuration parameter :confval:`gapInterpolation`.
.. option:: --any-stream arg
Overrides configuration parameter :confval:`useAllStreams`.
.. option:: --send-detections
Overrides configuration parameter :confval:`sendDetections`.
.. option:: --extra-comments
Overrides configuration parameter :confval:`extraPickComments`.
@ -0,0 +1,337 @@
.. highlight:: rst
.. _scbulletin:
##########
scbulletin
##########
**Create bulletins from SCML.**
Description
===========
scbulletin transforms the parameters of events or origins to various formats.
Currently supported output formats are:
* autoloc1,
* autoloc3,
* fdsnws,
* kml.
Input Modes
===========
Two modes of parameter input are possible:
#. Dump mode: Fetch all necessary information directly from the database. Choose
   one or more event or origin IDs. The database connection must be given
   with :option:`-d`.
#. Input mode: Provide all event and origin information in XML (:term:`SCML`)
format from file or stdin. You may filter further by event or origin ID.
If event IDs are given, the preferred origin will be derived and used for printing
information.
.. hint::
Events and origins are referred to by their public IDs. They can be provided
by :ref:`scevtls` and :ref:`scorgls`, respectively, from graphical tools like
:ref:`scolv` or by database queries supported by :ref:`scquery`. XML files
can be generated by :ref:`scxmldump` or from other formats by :ref:`sccnv`.
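As a sketch, event IDs listed by :ref:`scevtls` can be piped into scbulletin to
create bulletins for a whole time span. The database URL, time span and output
file are assumptions, and the ``--begin``/``--end`` time filters are taken from
the scevtls documentation:

.. code-block:: sh

   # create one bulletin per event of a single day (hypothetical time span and paths)
   for EVID in $(scevtls -d mysql://sysop:sysop@localhost/seiscomp --begin '2022-12-08 00:00:00' --end '2022-12-09 00:00:00'); do
       scbulletin -d mysql://sysop:sysop@localhost/seiscomp -E "$EVID"
   done > bulletins.txt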
Output Modes
============
The generated content is written to stdout or, with option :option:`-o` to a
file. Different output formats are available by command-line options:
* ``-1`` for **autoloc1**: Print one bulletin per event.
* ``-3`` for **autoloc3**: Print one bulletin per event.
* ``-3 -x`` for **extended autoloc3**.
* ``-4`` or ``--fdsnws`` for FDSNWS event text: Print one line per event. Useful
for generating event catalogs. This option offers an alternative to generating
event catalogs by :ref:`fdsnws-event <sec-event>`.
* ``-5`` or ``--kml`` for KML/GIS file format. The output can be viewed, e.g.,
in *Google Earth*.
By default, the output precision of times or coordinates is optimized for events
at teleseismic or regional distances. Use the option :option:`-e` for
**enhanced** output at higher precision: All times and distances are in units
of milliseconds and meters, respectively.
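For instance, a catalog in FDSNWS event text format or a KML file for GIS
viewers could be created as sketched below; the event ID, database URL and
output file names are modeled on the examples in this documentation:

.. code-block:: sh

   scbulletin -d mysql://sysop:sysop@localhost/seiscomp -E gfz2012abcd -4 -o catalog.txt
   scbulletin -d mysql://sysop:sysop@localhost/seiscomp -E gfz2012abcd -5 -o event.kml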
Examples
========
#. Create a bulletin from one or multiple event(s) in database
.. code-block:: sh
scbulletin -d mysql://sysop:sysop@localhost/seiscomp -E gfz2012abcd
scbulletin -d mysql://sysop:sysop@localhost/seiscomp -E gfz2012abcd,gfz2022abcd
#. Convert XML file to bulletin
.. code-block:: sh
scbulletin -i gfz2012abcd.xml
.. code-block:: sh
cat gfz2012abcd.xml | scbulletin
.. code-block:: sh
scbulletin < gfz2012abcd.xml
#. Convert XML file to bulletin but filter by event ID(s)
.. code-block:: sh
scbulletin -i gfz2012abcd.xml -E gempa2022abcd
scbulletin -i gfz2012abcd.xml -E gempa2022abcd,gfz2022abcd
.. note::
When considering a single event XML file containing many events, the
bulletins of all events will be generated unless ``--first-only`` is used.
.. _scbulletin_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scbulletin.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scbulletin.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scbulletin.cfg`
scbulletin inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scbulletin
:program:`scbulletin [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Input
-----
.. option:: -f, --format arg
Input format to use \(xml [default], zxml \(zipped xml\), binary\).
.. option:: -i, --input arg
The input file. Default is stdin.
Dump
----
.. option:: -E, --event arg
ID of event\(s\) that is read from database and transformed into
a bulletin. Separate multiple IDs by comma.
.. option:: -O, --origin arg
ID of origin\(s\) that is read from database and transformed into
a bulletin. Separate multiple IDs by comma.
.. option:: --event-agency-id
Use the agency ID of the event instead of the preferred origin.
.. option:: --first-only
Convert only the first event\/origin to bulletin. Otherwise
all events or origins will be converted to single bulletins
which will be concatenated.
.. option:: -p, --polarities
Dump onset polarities.
.. option:: -w, --weight arg
Weight threshold for printed and counted picks.
.. option:: -x, --extra
Use a specially detailed autoloc3 format. This option works only
in combination with the autoloc3\-flag.
Output
------
.. option:: -1, --autoloc1
Format: Use autoloc1 format for output. This is the default.
.. option:: -3, --autoloc3
Format: Use autoloc3 format for output.
.. option:: -4, --fdsnws
Format: FDSNWS event text, e.g., for generating catalogs.
.. option:: -5, --kml
Format: KML. GIS file format.
.. option:: -e, --enhanced
Enhanced output with higher precision. Times: milliseconds,
distances: meters. Use for bulletins of local earthquakes.
.. option:: -k, --dist-in-km
Print distances in km instead of degrees.
.. option:: -o, --output
Name of output file. If not given, all event parameters are
printed to stdout.
.. option:: -x, --extra
Use a specially detailed autoloc3 format. This option works
only in combination with the autoloc3\-flag.

View File

@ -0,0 +1,75 @@
.. highlight:: rst
.. _scchkcfg:
########
scchkcfg
########
**Checks a module configuration.**
Description
===========
scchkcfg checks for case-sensitivity issues in the parameter names of the
configuration files of a module. It reads all defined configuration stages
(see :ref:`global_modules`) and checks for each parameter read whether it exists
again with a different spelling.

If *standalone* is not given, it checks all 6 configuration
files (including :file:`global.cfg`), otherwise 3.
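A minimal sketch of a standalone check (the module name is only an example):

.. code-block:: sh

   $ scchkcfg scautopick standalone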
Examples
========
.. code-block:: sh
$ scchkcfg scautopick
Read configuration files OK
No possible conflict detected
scchkcfg only checks for possible conflicts since it does not know which parameters
a module will eventually read.
.. code-block:: sh
$ scchkcfg scautopick
Read configuration files OK
Conflict #1
connection.server /home/sysop/seiscomp/etc/global.cfg:8
connection.Server /home/sysop/.seiscomp/scautopick.cfg:1
1 conflict detected
In this case the conflict is real and needs to be fixed:
:confval:`connection.Server` is not a valid parameter name
(but :confval:`connection.server` is) in
:file:`/home/sysop/.seiscomp/scautopick.cfg` and thus will not be used.
.. code-block:: sh
$ scchkcfg scautopick
Read configuration files OK
Conflict #1
module.trunk.global.amplitudes.mb.signalEnd \
/home/sysop/.seiscomp/scautopick.cfg:1
module.trunk.global.amplitudes.mB.signalEnd \
/home/sysop/.seiscomp/scautopick.cfg:2
1 conflict detected
In this case the configuration is OK; the example illustrates why the case-sensitivity
has changed from previous versions: mb != mB. scchkcfg detects a possible
conflict but does not know that this case is well defined. It helps the user to
decide whether a fix is needed or not.
Command-Line Options
====================
.. program:: scchkcfg
:program:`scchkcfg {mod-name} [standalone]`

View File

@ -0,0 +1,188 @@
.. highlight:: rst
.. _sccnv:
#####
sccnv
#####
**Converts data in different formats.**
Description
===========
sccnv reads input given in a supported format, converts the content to another
format and writes the output. Use the command-line option ``--format-list``
for a list of supported formats.
Formats
=======
Different formats are supported for input and output files.
.. csv-table::
:widths: 10, 60, 10, 10
:header: Name, Description, Input, Output
:align: left
arclink , `Arclink XML <https://www.seiscomp.de/seiscomp3/doc/applications/arclink-status-xml.html>`_ , X , X
bson , , X , X
bson-json , , , X
csv , comma-separated values , , X
hyp71sum2k , Hypo71 format , , X
ims10 , , , X
json , `JSON <https://www.json.org/>`_ format , X , X
qml1.2 , :term:`QuakeML` format , \* , X
qml1.2rt , :term:`QuakeML` real time (RT) format , \* , X
scdm0.51 , , X , X
trunk , SeisComP XML (:term:`SCML`) - :ref:`SCML API <api-datamodel-python>` , X , X
**\***: The conversion from files in QuakeML format is not supported by sccnv
but can be realized by system tools. Read section :ref:`sec-sccnv-quakeml` for
details and instructions.
.. _sec-sccnv-quakeml:
QuakeML
-------
:term:`QuakeML` is used in a variety of flavors involving, e.g.,
* Using non-standard objects,
* PublicID references which are not globally unique,
* Missing references to parent objects,
* Missing creationInfo parameters.
The ability to convert from QuakeML to :term:`SCML` is thus limited and it
depends on the parameters provided with the input QuakeML file.
However, XSLT stylesheets are provided for mapping the parameters. The files
are located in :file:`@DATADIR@/xml/[version]/` for different |scname| data schema
versions. The stylesheet files provide information on the mapping and on
limitations as well as examples on their application.
.. note::
You may find out the |scname| data schema version by calling any module along
with the command-line option ``-V``, e.g.,
.. code-block:: sh
$ sccnv -V
The stylesheets can be used with XSLT processors available on your system,
e.g., :program:`xalan` or :program:`xsltproc`. Examples are given
in section :ref:`sec-sccnv-examples`.
.. _sec-sccnv-examples:
Examples
========
* Print the list of supported formats:
.. code-block:: sh
$ sccnv --format-list
* Convert an event parameter file in :term:`SCML` format to :term:`QuakeML` and
store the content in a file:
.. code-block:: sh
$ sccnv -i seiscomp.xml -o qml1.2:quakeml.xml
* Convert an inventory file in Arclink XML format to :term:`SCML` and store the
content in a file:
.. code-block:: sh
$ sccnv -i arclink:Package_inventory.xml -o inventory.sc.xml
* Convert an event parameter file in :term:`SCML` format to ims1.0 and store the
content in a file:
.. code-block:: sh
$ sccnv -i trunk:event.xml -o ims10:event.ims
* Convert QuakeML in version 1.2 to SCML in data schema version 0.12:
.. code-block:: sh
$ xsltproc $SEISCOMP_ROOT/share/xml/0.12/quakeml_1.2__sc3ml_0.12.xsl file.quakeml > file_sc.xml
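* As a follow-up sketch (file names are placeholders), the :term:`SCML` file
  produced by the conversion above can be used as sccnv input like any other,
  e.g., to export a CSV summary:

  .. code-block:: sh

     $ sccnv -i file_sc.xml -o csv:file.csv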
Command-Line Options
====================
.. program:: sccnv
:program:`sccnv -i format:file -o format:file`
sccnv reads the input given in a supported format, converts the content
and writes the output in another format. Use the option `format\-list`
for a list of supported formats.
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
Verbosity
---------
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
Formats
-------
.. option:: --format-list
List all supported formats
Input
-----
.. option:: -i, --input arg
Input stream [format:][file], default: trunk:\-
Output
------
.. option:: -o, --output arg
Output stream [format:][file], default trunk:\-
.. option:: -f, --formatted
Use formatted output
.. option:: --indent arg
Formatted line indent. Default: 2

View File

@ -0,0 +1,521 @@
.. highlight:: rst
.. _scconfig:
########
scconfig
########
**Configuration and system management frontend.**
Description
===========
scconfig is a graphical user interface which allows you to
* Retrieve :ref:`information <scconfig-information>` about the installed |scname|
system,
* :ref:`Control modules <scconfig-system>` (start/stop/check/enable/disable) and
access log files,
* :ref:`Import, check, synchronize and remove <scconfig-inventory>` station meta
data/inventory,
* Configure the :ref:`module configuration <scconfig-modules>` and
:ref:`bindings <scconfig-bindings>` of all SeisComP modules for which descriptions
are provided,
* Access the :ref:`documentation and the changelog <scconfig-documentation>`.
The modules are usually programs that are part of the SeisComP system and have two
distinct types of configuration:
* :ref:`Module configuration <scconfig-modules>`, or just program
configuration, stored in a file like :file:`scautopick.cfg`.
* :ref:`Bindings <scconfig-bindings>`, which are sets of parameters that configure
how the module will treat a certain station, i.e. station-specific
configurations per module. Bindings can be configured using profiles, or
directly per station. A profile is a named set of parameters for a certain
module that can be attributed to more than one station. Using profiles makes
it easier to maintain a large number of station configurations. When two stations
are configured by the same profile, both will have the same parameter set for
a certain module.
scconfig does not know anything about the SeisComP database; the only thing it
can do is read and write the content of files in the :file:`etc/` and
:file:`~/.seiscomp` folders. It allows you to manage this information in an
organized and friendly manner. It also relies on other applications, like the
proper :ref:`seiscomp` tool, to complete the system configuration.
.. _scconfig-first-start:
First start
-----------
If scconfig is started for the first time, it will ask the user to set up
the new installation.
.. figure:: media/scconfig/first-start.*
:align: center
If this has already been done with the :ref:`command line interface <getting-started>`,
this step can be skipped. Whether the setup has already been run is indicated by
the presence of the file :file:`var/run/seiscomp.init`.

If 'Yes' is pressed, the setup wizard will be started and will configure exactly
the same parameters as described in :ref:`getting-started`.
.. figure:: media/scconfig/wizard-start.*
:align: center
.. figure:: media/scconfig/wizard-finish.*
:align: center
Pressing 'Finish' will run the setup and report the progress.
.. figure:: media/scconfig/wizard-done.*
:align: center
Pressing 'Close' will launch the main configuration window.
.. _scconfig-mainwindow:
Main Window
-----------
The layout of the main window is always the same regardless of what panel
is selected.
.. _fig-scconfig-mainwindow:
.. figure:: media/scconfig/mainwindow.*
:align: center
:width: 18cm
Main window of scconfig: mode switch (red), panel selection (yellow),
panel title and description (green),
panel content (blue)
It is divided into 4 areas:
* red: the mode switch (user vs. system)
* yellow: panel switch
* green: title and description of current panel
* blue: the content and interactive screen of the current panel
The Main menu contains two entries: :guilabel:`File` and :guilabel:`Edit`.
The file menu allows to run the setup wizard (:guilabel:`Wizard`), to reload
the configuration (:guilabel:`Reload`), to save the
configuration (:guilabel:`Save`) and to close the configuration (:guilabel:`Quit`).
The Edit menu allows to switch the current configuration mode. Pressing the
switch button in the upper left corner (red box) is a shortcut for this operation.
.. _scconfig-information:
Information panel
-----------------
This panel shows information about the |scname| environment
(see figure :ref:`main window <fig-scconfig-mainwindow>`). All variables
(except PATH) can be used as placeholders in most of the configuration
parameters which define directories or files, e.g.:
.. code-block:: sh
autoloc.grid = @CONFIGDIR@/autoloc/local.grid
.. _scconfig-system:
System panel
------------
The system panel is a graphical frontend for the :ref:`seiscomp <system-management>` script.
.. figure:: media/scconfig/system-overview.*
:align: center
:width: 18cm
It is divided into 3 parts: the toolbar on the top (red), the module list (green)
and the log window (blue).
The log window shows the output of all external programs called such as :program:`seiscomp`.
The standard output is colored black and standard error is colored brown.
.. note::
Due to the buffering of the GUI it can happen that standard output and
standard error logs are not in perfect order.
The toolbar gives access to the available operations. All operations
will affect the currently selected modules (rows). If no row is selected, all
modules are affected and the corresponding call to :program:`seiscomp <arg>` is done
without any module.
*Update*
Updates the current module state by calling :program:`seiscomp --csv status`.
*Start*
Calls :program:`seiscomp start`.
*Stop*
Calls :program:`seiscomp stop`.
*Restart*
Calls :program:`seiscomp restart`.
*Check*
Calls :program:`seiscomp check`.
*Enable module(s)*
Enables all selected modules for autostart.
At least one module must be selected.
*Disable module(s)*
Disables all selected modules for autostart.
At least one module must be selected.
*Update configuration*
Calls :program:`seiscomp update-config`. This is important after the
module configuration or bindings have changed and before restarting the
affected modules.
For applying an action to all modules, deselect any module selection by pressing
:kbd:`ESC` and press the corresponding action button. When one or more
modules are selected, the action is only applied to those.
To open the most recent log files of modules right click on the module name and
select the available log.
.. figure:: media/scconfig/system-start.png
:align: center
:width: 18cm
.. _scconfig-inventory:
Inventory panel
---------------
The inventory panel allows to import, check and synchronize inventory files as
well as to inspect the content or to rename or remove the files. The panel shows
a list of inventory XML files located in folder :file:`etc/inventory`. Only
:term:`SCML` files can be used as source for inventory data but various importers
exist to integrate inventory data from other formats. After the first start
the list is empty and contains only a README file.
.. figure:: media/scconfig/inventory-empty.*
:align: center
:width: 18cm
Importing station meta data is outlined in the
:ref:`tutorial on adding a station <tutorials_addstation>`.
One source of importing inventory information is ArcLink as run at
http://www.webdc.eu. After downloading the inventory XML file from ArcLink it
can be imported into SeisComP by pressing the 'Import' button in the toolbar
on the top.
It will open a popup which allows selecting the input format.
.. figure:: media/scconfig/inventory-import-format.*
:align: center
If ArcLink is selected, the source location should then point to the ArcLink
XML file downloaded before.
.. figure:: media/scconfig/inventory-import-source.*
:align: center
If successfully imported, a window will pop up with the execution result and
the import output.
.. figure:: media/scconfig/inventory-import-finished.*
:align: center
After closing the popup the imported inventory file will show up in the list of
files. Right-clicking a file allows:
* Renaming,
* Deleting,
* Inspecting the content of
the file.
.. figure:: media/scconfig/inventory-arclink.*
:align: center
:width: 18cm
The toolbar supports 4 additional actions:
*Check inventory*
The inventory is checked for issues including inconsistencies which are reported.
The tests are based on :ref:`scinv` and listed in the documentation of this
module. Adjust sensitivity by configuring :ref:`scinv`.
*Sync keys*
This action is part of sync but can also be called standalone. It merges all
inventory XML files and creates key files in :file:`etc/key/station_*` if a
key file does not yet exist. Existing key files are not touched unless the
station is not part of the inventory anymore.
As a result, all stations in inventory will have a corresponding key file and
each key file will have a corresponding station in inventory.
*Test sync*
The inventory XML files are not used directly with SeisComP. They need to
be synchronized with the database first (see :ref:`global-stations`).
Synchronization needs to merge all existing XML files and create differences
against the existing database tables. While merging, conflicts can occur, such
as duplicate stations with different content (e.g. different description).
This action is a dry-run of the actual synchronisation. It performs merging
and creates differences but does not send any update. This action is useful
to test all your existing inventory files before actually modifying the
database.
.. figure:: media/scconfig/inventory-sync-test-passed.*
:align: center
*Sync*
Almost identical to *Test sync* but it does send updates to the database and
additionally synchronizes key files and resource files.
*Sync* and *Sync keys* will cause a reload of the configuration to refresh the
current binding tree (see :ref:`scconfig-bindings`).
.. _scconfig-modules:
Modules panel
-------------
The modules panel allows configuration of all registered modules.
.. figure:: media/scconfig/modules-overview.*
:align: center
:width: 18cm
The left/green part shows the list of available modules grouped by defined
categories and the right/blue part shows the current active module configuration.
The active configuration corresponds to the selected item in the list. See
section :ref:`scconfig-editing` for further information about the content panel.
.. _scconfig-bindings:
Bindings panel
--------------
The binding panel configures a station for a module providing station-specific
configuration such as data acquisition or processing. You may configure station
bindings or binding profiles. The profiles are typically applied to a set of
stations. Any change in the profile parameters applies to all stations bound to it.
.. hint::
Working with :ref:`bindings profiles <scconfig-bindings-profile>` allows to
maintain a single set of binding parameters for one or multiple stations.
:ref:`Station bindings <scconfig-bindings-station>` are useful if a set of
binding parameters are applied only to a single station. Otherwise configure
:ref:`binding profiles <scconfig-bindings-profile>`.
:ref:`Profiles <scconfig-bindings-profile>` are therefore preferred over
:ref:`station bindings <scconfig-bindings-station>` unless only one single
station shall be configured.
.. figure:: media/scconfig/modules-binding.*
:align: center
:width: 18cm
The binding panel is separated into 3 main areas:
* the station tree (red + orange),
* the binding content (green),
* the module tree (blue + magenta).
The station tree (red) shows a tree of all available networks and their
stations. Each station contains nodes of its configured bindings. The lower
view (orange) represents the content of the currently selected item in the
station tree.
The binding content shows the content of a binding and is similar to the
module configuration content. See section :ref:`scconfig-editing` for further
information about this panel.
The module tree contains all modules which can be used along with bindings.
The upper/blue window contains the modules and all available binding profiles
for each module and the lower/magenta part shows all binding profiles of the
currently selected module. This view is used to add new profiles and delete
existing profiles.
.. _scconfig-bindings-profile:
Profiles
^^^^^^^^
Create a profile
~~~~~~~~~~~~~~~~
For creating a binding profile select a module in the module tree (blue area)
and right-click on the module or select the "add" button in the lower (magenta)
panel. Provide a descriptive name. Clicking on the name of the profile opens the
profile, allowing the parameters to be adjusted.
.. figure:: media/scconfig/modules-profiles.png
:align: center
:width: 18cm
Create bindings
~~~~~~~~~~~~~~~
Assigning a binding profile to one or more stations creates one or more bindings.
To assign a binding profile to a single station, a single network including all
stations, or all networks, drag a profile from the right part (blue or magenta)
to the target in the left part (red or orange).
For assigning to a set of stations/networks, select the target first by mouse
click and then drag the profile onto the selection.
It is also possible to drag and drop multiple profiles with one action.
.. _scconfig-bindings-station:
Station bindings
^^^^^^^^^^^^^^^^
To create an exclusive station binding for a module, it must be opened in the
binding view (orange box) by either selecting a station in the station tree
(red) or opening/clicking that station in the binding view (orange). The
binding view will then contain all currently configured bindings.
.. figure:: media/scconfig/modules-bindings-station.*
:align: center
Clicking with the right mouse button into the free area will open a menu which
allows to add a binding for a module which has not yet been added. Adding
a binding will activate it and bring its content into the content panel.
To convert an existing profile into a station binding, right click on the
binding icon and select :menuselection:`Change profile --> None`. The existing
profile will be converted into a station binding and activated for editing.
.. figure:: media/scconfig/modules-bindings-convert.*
:align: center
Applying bindings
^^^^^^^^^^^^^^^^^
The binding parameters must additionally be written to the database or, for a
:term:`standalone module`, converted to the specific module configuration by
updating the configuration. You may update the configuration for all modules or
just a specific one. To this end, change to the
:ref:`System panel <scconfig-system>`, select the specific module or none, and
press the button "*Update configuration*".
Alternatively, execute the :ref:`seiscomp` script on the command line for all or
for a specific module:
.. code-block:: sh
seiscomp update-config
seiscomp update-config module
.. _scconfig-editing:
Editing parameters
------------------
The content panel of a configuration is organized as a tree. Each module/binding
name is a top-level item and all namespaces are titles of collapsible sections.
Namespaces are separated by a dot in the configuration file, e.g.
:file:`scautopick.cfg` which also reads :file:`global.cfg` would end up in a tree
like this:
.. code-block:: sh
+ global
| |
| +-- connection
| | |
| | +-- server (global.cfg: connection.server)
| | |
| | +-- username (global.cfg: connection.username)
| |
| +-- database (global.cfg: database)
|
+ scautopick
|
+-- connection
| |
| +-- server (scautopick.cfg: connection.server)
| |
| +-- username (scautopick.cfg: connection.username)
|
+-- database (scautopick.cfg: database)
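For reference, the parameters in the tree above would be written in the flat
notation of the configuration files as follows (the values are only placeholders):

.. code-block:: sh

   # excerpt of scautopick.cfg; values are placeholders
   connection.server = localhost
   connection.username = scautopick
   database = mysql://sysop:sysop@localhost/seiscomp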
Figure :ref:`fig-scconfig-modules-global` describes each item in the content
panel.
.. _fig-scconfig-modules-global:
.. figure:: media/scconfig/modules-global.*
:align: center
:width: 18cm
Content panel layout
.. figure:: media/scconfig/config-typing.*
:align: right
The content of the input widget (except for boolean types which are mapped
to a simple checkbox) is the raw content of the configuration file without parsing.
While typing a box pops up which contains the parsed and interpreted content as
read by an application. It shows the number of parsed list items, possible
errors and the content of each list item.
Each parameter has a lock icon. If the parameter is locked it is not written
to the configuration file. If it is unlocked, it is written to the configuration
file and editable. Locking is similar to removing the line with a text
editor.
The configuration content that is displayed depends on the current mode. In system
mode :file:`etc/<module>.cfg` is configured while in user mode it is
:file:`~/.seiscomp/<module>.cfg`.
It may happen that a configuration parameter is editable but will not have any
effect on the module configuration. This is caused by the different configuration
stages. If the system configuration is active but a parameter is set in the
user configuration, it cannot be overridden in the system configuration. The user
configuration always has higher priority. scconfig will detect such problems
and color the input widget red in such situations.
.. figure:: media/scconfig/config-warning.*
:align: center
:width: 18cm
The value in the edit widget will show the currently configured value in the
active configuration file but the tooltip will show the evaluated value, the
location of the definition and a warning.
.. _scconfig-documentation:
Documentation and changelog
---------------------------
Access the documentation and the changelog of any installed package from the
Docs panel.
.. figure:: media/scconfig/documentation.png
:align: center
:width: 18cm

View File

@ -0,0 +1,339 @@
.. highlight:: rst
.. _scdb:
####
scdb
####
**Populate a SQL database from XML files or messages.**
Description
===========
A major component of the SeisComP system is the database. Almost all
applications have only read access to the database, but all the processing
results and objects have to be written into the database. This was the task of
scdb. In the very first versions of SeisComP, scdb was the only component that had
write access to the database. Its task is to connect to :ref:`scmaster` and populate
the database with all received notifier messages. Although it worked, it
introduced race conditions caused by the latency of the database backend since
all other clients received the message at the same time. Accessing the database
immediately at this point in time did not guarantee that the object was
written already.
In consequence, the scmaster itself gained write access to the database and
forwards messages to all clients after they are written to the database.
:ref:`scdb` by definition does not check existing objects in the database. It only
generates INSERT/UPDATE/DELETE statements based on the data used and sends
these statements to the database. E.g. if :ref:`scdb` receives a message to
insert a new object into the database and this object exists already, the
database will raise an error because :ref:`scdb` hasn't checked it.
Online mode
-----------
Now scdb can be used to maintain a backup or archive database that is not
part of the real-time processing. When running scdb as a database write daemon, it
can inform a client about the database connection to use. A client sends a
DatabaseRequest message and scdb sends back a DatabaseResponse message containing
the database connection parameters.
For that it connects to a messaging server and writes all received messages to a
configured database, e.g. a backup database.
.. note::
The database connection received from the messaging server during the
handshake is reported to clients requesting a database address. To overwrite
the read-only database, just override the application's database address
(with the '-d' option).
Offline mode
------------
Another important task of :ref:`scdb` is to populate the database with any SeisComP
data model content. In combination with :ref:`scxmldump` it can be used to copy events
from one database to another.
For that it does not connect to a messaging server but reads data from XML
files and writes it to the database. Offline mode will be used if the
'--input/-i' option is provided. Multiple input files can be specified by
providing this option multiple times with separate filenames.
.. warning::
When reading XML files the output database address is not passed
with -o but -d. The application's database address is used.
Examples
--------
#. Connect to a messaging server and write all messages to the output database
`seiscomp` running on the host `db-server`:
.. code-block:: sh
scdb -H [server] -o mysql://sysop:sysop@db-server/seiscomp
#. As above, but with the read-only database connection using the user `sysop`
and the output database connection using the user `writer`:
.. code-block:: sh
scdb -H [server] -d mysql://sysop:sysop@db-server/seiscomp \
-o mysql://writer:12345@db-server/seiscomp
#. Import data from the file :file:`data.xml` and write it to the database
`seiscomp` on the host `db-server`:
.. code-block:: sh
scdb -i data.xml -d mysql://sysop:sysop@db-server/seiscomp
#. Import data from three files at once:
.. code-block:: sh
scdb -i data1.xml -i data2.xml -i data3.xml \
-d mysql://sysop:sysop@db-server/seiscomp
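#. Copy an event from one database to another in combination with
   :ref:`scxmldump`. This is only a sketch: host names and the event ID are
   placeholders, and additional :ref:`scxmldump` options may be needed to
   include all desired child objects:

   .. code-block:: sh

      scxmldump -d mysql://sysop:sysop@source-host/seiscomp -E gfz2012abcd -o event.xml
      scdb -i event.xml -d mysql://sysop:sysop@target-host/seiscomp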
.. _scdb_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scdb.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scdb.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scdb.cfg`
scdb inherits :ref:`global options<global-configuration>`.
.. confval:: connection.requestGroup
Type: *string*
Define the group on scmaster to subscribe for database
requests.
.. confval:: connection.provideGroup
Type: *string*
Define the group on scmaster to send database response
messages to.
.. confval:: output.type
Type: *string*
Define the output database connection type.
.. confval:: output.parameters
Type: *string*
Define the output database connection parameters.
Command-Line Options
====================
.. program:: scdb
:program:`scdb [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
.. option:: -m, --mode arg
scdb can either process a XML file and write it to the
database or collect messages from scmaster.
If connected to scmaster, the mode defines what objects
are handled: none \(no objects at all\), notifier \(notifier
only\) or all \(all objects whereas non\-notifier objects
are INSERTED into the database\).
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --config-db arg
Load the configuration from the given database or file,
format: [service:\/\/]location .
.. option:: -o, --output arg
If connected to scmaster, this flag defines the database
connection to use for writing. The configured application
database connection \(as received from scmaster\) is reported
to clients as part of a database response messages.
Import
------
.. option:: -i, --input arg
Define the import XML file to be written to database.
Can be provided multiple times to import multiple files.

View File

@ -0,0 +1,404 @@
.. highlight:: rst
.. _scdbstrip:
#########
scdbstrip
#########
**Clean up a database from event and waveform quality parameters.**
Description
===========
|scname|'s :ref:`scmaster` is continuously writing to the database. This causes
the database to grow and to occupy much space on the hard disc. scdbstrip tackles
this problem and removes processed objects from the database that are older than a
configurable time span. The time comparison considers the object time, not the
time of their creation.
The parameters which scdbstrip removes are
* Event parameters including events, origins, magnitudes, amplitudes, arrivals, picks,
focal mechanisms, moment tensors
* Waveform quality control (QC) parameters.
scdbstrip will remove all events with an origin time, as well as QC parameters, older or
younger than specified. The default is 'older'. It will also remove all associated
objects such as picks, origins, arrivals, amplitudes and so on.

scdbstrip does not run as a daemon. To remove old objects continuously, scdbstrip
should be added to the list of cron jobs, running e.g. every 30 minutes. The more
often it runs, the fewer objects it has to remove and the faster it will unlock
the database again. The timing and the parameters to be removed are controlled
by the module configuration or command-line options.
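A minimal crontab sketch for such a cron job (the installation path is an
assumption and must be adjusted), running scdbstrip every 30 minutes and keeping
the last 30 days:

.. code-block:: sh

   # hypothetical crontab entry; adjust the path to your SeisComP installation
   */30 * * * * /home/sysop/seiscomp/bin/seiscomp exec scdbstrip --days 30 >/dev/null 2>&1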
.. hint::
* For removing specific parameters and not all in a time range, use
:ref:`scdispatch` along with XML files created by :ref:`scxmldump` and
:ref:`scqueryqc` for event parameters and waveform QC parameters,
respectively.
* For removing data availability parameters use :ref:`scardac`.
Known Issues
============
When running scdbstrip for the first time on a large database, it can happen
that it aborts, in the case of MySQL, with the following error message:
.. code-block:: sh
[ 3%] Delete origin references of old events...08:48:22 [error]
execute("delete Object from Object, OriginReference, old_events where
Object._oid=OriginReference._oid and
OriginReference._parent_oid=old_events._oid") = 1206 (The total number
of locks exceeds the lock table size)
Exception: ERROR: command 'delete Object from Object, OriginReference,
old_events where Object._oid=OriginReference._oid and
OriginReference._parent_oid=old_events._oid' failed
That means your MySQL server cannot hold enough data required for deletion.
There are two solutions to this:
#. Increase the memory pool used by MySQL by changing the configuration. The
minimum is 64 MBytes but modern systems typically have a larger default:
.. code-block:: sh
innodb_buffer_pool_size = 64M
The size of the new buffer depends on the size of the database that should
be cleaned up. Read also the section :ref:`database_configuration`. It
provides more options for optimizing your database server.
#. Run scdbstrip on smaller batches for the first time:
.. code-block:: sh
$ scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --days 1000
$ scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --days 900
...
$ scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --days 100
.. hint::
In the examples, database connection parameters correspond to default values.
You may thus replace ``-d mysql://sysop:sysop@localhost/seiscomp`` by
``-d localhost`` or ``-d mysql://``.
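For example, the first batch of the run above could then simply be written as:

.. code-block:: sh

   $ scdbstrip -d localhost --days 1000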
Examples
========
* Remove event and waveform quality parameters older than 30 days
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --days 30
* Remove event and waveform quality parameters newer than 30 days
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --days 30 -i
* Only remove waveform QC parameters older than 30 days but no others
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --days 30 --qc-only
* Remove event and waveform quality parameters before 2000-01-01 12:00:00
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --datetime 2000-01-01T12:00:00
* Remove event and waveform quality parameters after 2000-01-01 12:00:00
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --datetime 2000-01-01T12:00:00 -i
* Remove event and waveform quality parameters between 2000-01-01 12:00:00 ~ 2000-01-01 14:00:00
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --time-window 2000-01-01T12:00:00~2000-01-01T14:00:00
* Remove event and waveform quality parameters before 2000-01-01 12:00:00 and after 2000-01-01 14:00:00
.. code-block:: sh
scdbstrip -d mysql://sysop:sysop@localhost/seiscomp --time-window 2000-01-01T12:00:00~2000-01-01T14:00:00 -i
.. _scdbstrip_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scdbstrip.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scdbstrip.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scdbstrip.cfg`
scdbstrip inherits :ref:`global options<global-configuration>`.
.. confval:: database.cleanup.invertMode
Default: ``false``
Type: *boolean*
Invert the selection of the specified time period, that is
delete all parameters after the specified time period,
not before. When a date range is specified, then delete all
parameters before and after the time range, not in between.
.. confval:: database.cleanup.eventParameters
Default: ``true``
Type: *boolean*
Strip all event parameters including events, origins,
magnitudes, amplitudes, arrivals, picks, focal mechanisms.
.. confval:: database.cleanup.qualityControl
Default: ``true``
Type: *boolean*
Strip waveform quality control \(QC\) parameters.
.. note::
**database.cleanup.keep.\***
*Parameters controlling the time to keep objects in the database.*
*The time comparison considers the object time, not the time of*
*their creation.*
.. confval:: database.cleanup.keep.days
Default: ``30``
Type: *int*
The number of days to preserve in the database. This
value is added to the whole timespan. Hours
and minutes are configured separately.
.. confval:: database.cleanup.keep.hours
Default: ``0``
Type: *int*
The number of hours to preserve in the database. This
value is added to the whole timespan. Days
and minutes are configured separately.
.. confval:: database.cleanup.keep.minutes
Default: ``0``
Type: *int*
The number of minutes to preserve in the database. This
value is added to the whole timespan. Days
and hours are configured separately.
Command-Line Options
====================
.. program:: scdbstrip
:program:`scdbstrip [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: --first-new
Overrides configuration parameter :confval:`firstNew`.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Mode
----
.. option:: --check
Checks if unreachable objects exist.
.. option:: --clean-unused
Remove all unreachable objects when in checkmode. Default:
off.
Objects
-------
.. option:: -E, --ep-only
Strip only event parameters. Other parameters, like QC, are
ignored.
.. option:: -Q, --qc-only
Strip only waveform quality control \(QC\) parameters. Other
parameters, like event parameters, are ignored.
Overrides 'ep\-only'.
Timespan
--------
.. option:: --days arg
Overrides configuration parameter :confval:`database.cleanup.keep.days`.
.. option:: --hours arg
Overrides configuration parameter :confval:`database.cleanup.keep.hours`.
.. option:: --minutes arg
Overrides configuration parameter :confval:`database.cleanup.keep.minutes`.
.. option:: --datetime arg
Replaces the days:hours:minutes timespan definition by an
arbitrary absolute timestamp in UTC. Format:
%Y\-%m\-%dT%H:%M:%S.
.. option:: -t, --time-window arg
Delete objects in the specified time window.
Replaces the days:hours:minutes timespan definition by an
arbitrary absolute time range in UTC. Format:
startTime\~endTime that is %Y\-%m\-%dT%H:%M:%S\~%Y\-%m\-%dT%H:%M:%S
.. option:: -i, --invert
Overrides configuration parameter :confval:`database.cleanup.invertMode`.
.. option:: --keep-events
IDs of events to keep in the database separated with comma.

View File

@ -0,0 +1,421 @@
.. highlight:: rst
.. _scdispatch:
##########
scdispatch
##########
**Read objects (event, origin, etc) from a SCML file and sends the objects
to the messaging system.**
Description
===========
scdispatch reads an :term:`SCML` file and creates notifier objects for its contents,
which are sent to the :ref:`messaging <concepts_messaging>` and the corresponding
messaging groups (see :option:`--routingtable`). In contrast to :ref:`scdb`,
which writes SCML files directly into the :ref:`database <concepts_database>`,
scdispatch uses the messaging bus. If :ref:`scmaster` is configured with the
database plugin, messages will end up in the database as well.
Modes
-----
scdispatch can work in two modes applying different
:ref:`operations <scdispatch-operations>`:
* *Without database check:* One of the :ref:`operations <scdispatch-operations>`
*add*, *update* or *remove* is selected along with the option :option:`-O`. In
that case all objects in the :term:`SCML` are encapsulated in a notifier with
that specific operation and sent to the messaging. No check is performed if
the object is already in the database or not.
* *With database check:* The option :option:`-O` is not given or the
option is used along with one of the :ref:`operations <scdispatch-operations>`
*merge* or *merge-without-remove*. scdispatch first tries to load the corresponding
objects from the database and calculates differences. It will then create the
corresponding notifiers with operations *add*, *update* or *remove* and sends
them to the messaging. That mode is quite close to a sync operation with the
exception that top level objects (such as origin or event) that are not part
of the input SCML are left untouched in the database. It can be used to
synchronize event information from one system with another.
.. _scdispatch-operations:
Operations
----------
Different operations can be chosen along with the option :option:`-O`.
If :option:`-O` is not given, *merge* is assumed by default.
* *Add*: All objects are sent trying to be added to the database. If they
already exist in the database, they will be rejected and not spread through
the messaging. Modules connected to the messaging will not receive rejected
objects.
* *Remove*: All sent objects with all their attributes and child objects are
removed from the database. Modules connected to the messaging will not receive
any sent object.
* *Update*: All objects are sent trying to be updated to the database along with
all of their child objects and attributes. Sent objects not existing in the
database will be ignored and not received by any module connected to the
messaging. Child objects and attributes existing in the database but not
included in the sent object will be removed as well.
* *Merge* (default): Applies *Add* and *Update* and requires a database
connection.
* *Merge-without-remove*: Applies *Add* and *Update* and requires a database
connection. However, no objects are removed from the database.
.. note::
All |scname| objects along are listed and described along with their child
objects and attributes in the :ref:`API documentation <api-datamodel-python>`.
Examples
--------
#. Send different objects from a :term:`SCML` file for merging (adding or
updating). The option :option:`-O` can be omitted because the default
behavior is to merge:
.. code-block:: sh
scdispatch -i test.xml -O merge
scdispatch -i test.xml
#. Send all objects by ignoring events. When :ref:`scevent` receives origins it
will create new events or associate the origins to existing ones. The ignored
events may be already existing with different IDs. Hence, event duplication
is avoided by ignoring them.
.. code-block:: sh
scdispatch -i test.xml -e
#. Send new objects to be added:
.. code-block:: sh
scdispatch -i test.xml -O add
#. Send an update of objects:
.. code-block:: sh
scdispatch -i test.xml -O update
#. Send objects to be removed:
.. code-block:: sh
scdispatch -i test.xml -O remove
#. Compare new objects with the database content and send the difference (optionally without removing objects):
.. code-block:: sh
scdispatch -i test.xml -O merge
scdispatch -i test.xml -O merge-without-remove
#. Offline mode: all operations can be performed without the messaging system using xml files:
.. code-block:: sh
scdispatch -i test.xml -O operation --create-notifier > notifier.xml
then:
.. code-block:: sh
scdb -i notifier.xml
#. Subsets of SCML Objects
It can be useful to import a subset of QuakeML objects, e.g. Origins from other
agencies and then allow :ref:`scevent` to associate them to existing
events (and possibly prefer them based on the rules in scevent) or create new
events for the origins. If the event objects from a SCML file are not required
to be sent to the messaging then either they should be removed (e.g. using XSLT)
and all the remaining objects in the file added:
.. code-block:: sh
scdispatch -i test.xml -O add
or the **event objects** can be left out of the routing table, e.g.
.. code-block:: sh
scdispatch -i test.xml -O add \
--routingtable Pick:PICK, \
Amplitude:AMPLITUDE, \
Origin:LOCATION,StationMagnitude:MAGNITUDE, \
Magnitude:MAGNITUDE
.. hint::
The option :option:`--no-events` is a wrapper for removing Event:EVENT from
the routing table. With this option no event objects will be sent which may
be useful if just the origins with magnitudes, amplitudes, arrivals, picks, etc.
shall be integrated, e.g. after XML-based playbacks.
#. Testing
For testing it is useful to watch the results of dispatching with :ref:`scolv` or
:ref:`scxmldump`; a dry-run sketch is shown after this list. It is also useful to
clean the database and logs to remove objects from persistent storage and to allow
repeated reloading of a file.
.. note::
The following will clear all events from the database and any other
object persistence. Modify the mysql command to suit your db setup.
.. code-block:: sh
mysql -u root --password='my$q1' -e "DROP DATABASE IF EXISTS seiscomp; \
CREATE DATABASE seiscomp CHARACTER SET utf8 COLLATE utf8_bin; \
GRANT ALL ON seiscomp.* TO 'sysop'@'localhost' IDENTIFIED BY 'sysop'; \
USE seiscomp;source seiscomp/trunk/share/db/mysql.sql;"
seiscomp start
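As referenced above, a dry run with the option :option:`--test` sends nothing to
the messaging and is a quick first check (the input file name is a placeholder):

.. code-block:: sh

   scdispatch -i test.xml --test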
.. _scdispatch_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scdispatch.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scdispatch.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scdispatch.cfg`
scdispatch inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scdispatch
:program:`scdispatch [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Dispatch
--------
.. option:: -e, --no-events
Do not send any event objects; events are ignored when reading
the event parameters.
.. option:: -i, --input arg
File from which the content is dispatched to the messaging.
.. option:: -O, --operation arg
Merge and merge\-without\-remove require a database
connection. Both will read the object corresponding to the
sent object from the database and calculate the differences.
Merge\-without\-remove behaves like merge with the exception
that remove operations will be filtered out and no objects
in the database will be removed.
If add, update or remove is specified, then all objects in
XML are sent with the given operation regardless of their
existence in the database.
.. option:: --print-objects
Print names of routable objects.
.. option:: --print-routingtable
Print routing table.
.. option:: --routingtable arg
Specify routing table as comma separated list of object:group
pairs, e.g. \"Origin:LOCATION,Event:EVENT\". When an
object should be routed to a group and no table entry for
that particular class type is available, all parent objects
are checked for valid routing entries and the first found is
used. E.g. if only \"Origin:LOCATION\" is specified
but the input file contains also Arrivals which are child
objects of Origin then the routing entry of Origin is used
because of the parent\-child relationship between Origin and
Arrival.
.. option:: --test
Test mode. Does not send any object.
.. option:: --create-notifier
Do not send any object. All notifiers will be written to
standard output in XML format.

View File

@ -0,0 +1,253 @@
.. highlight:: rst
.. _scdumpcfg:
#########
scdumpcfg
#########
**Dump bindings or module configurations used by a specific module or global
for particular stations.**
Description
===========
scdumpcfg reads and prints the
:ref:`module or bindings configuration <concepts_configuration>`
for a specific module or for global. It even prints the global bindings for modules
which do not have module bindings, such as :ref:`scmv`.
This command-line utility is therefore useful for debugging configuration parameters.
Instead of printing parameters and values for stations, the option :option:`--nslc`
allows printing the list of channels considering bindings. The output may be
used, e.g., for
* filtering inventory by :ref:`invextr`,
* filtering miniSEED records by :ref:`scart` or :ref:`scmssort`,
* filtering event information by :ref:`scevtstreams`.
Related to :program:`scdumpcfg` is :ref:`bindings2cfg` which dumps the bindings
configuration to :term:`SCML`.
Examples
========
#. Dump the global bindings configuration for all stations which have global bindings:
.. code-block:: sh
scdumpcfg global -d mysql://sysop:sysop@localhost/seiscomp -B
#. Dump the bindings configuration for all stations which have bindings to a
:ref:`scautopick` profile. Additionally use *-G* as :ref:`scautopick` inherits global
bindings:
.. code-block:: sh
scdumpcfg scautopick -d localhost -GB
#. Dump the global module configuration, specifically searching for the map
zoom sensitivity, and output the result in the format of the |scname| module
configuration:
.. code-block:: sh
scdumpcfg global -d localhost --cfg -P map.zoom.sensitivity
#. Dump the module configuration of scautopick and output in the format of the
|scname| module configuration:
.. code-block:: sh
scdumpcfg scautopick -d localhost --cfg
#. Dump the global bindings configuration considered by scmv:
.. code-block:: sh
scdumpcfg scmv -d localhost -BG
#. Dump the channel codes defined by scautopick bindings as a list of NET.STA.LOC.CHA:
.. code-block:: sh
scdumpcfg scautopick -d localhost -B --nslc
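#. Write the channel list considered by the scautopick bindings to a file for
   later use with other modules (the file name is only a placeholder):

   .. code-block:: sh

      scdumpcfg scautopick -d localhost -B --nslc > streams.txt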
.. _scdumpcfg_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scdumpcfg.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scdumpcfg.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scdumpcfg.cfg`
scdumpcfg inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scdumpcfg
:program:`scdumpcfg [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --config-db arg
Load the configuration from the given database or file,
format: [service:\/\/]location .
Dump
----
.. option:: -B, --bindings arg
Dump bindings instead of module configuration.
.. option:: -G, --allow-global arg
Print global bindings if no module binding is available.
.. option:: -P, --param arg
Specify the parameter name\(s\) to filter for. Use comma
separation of multiple parameters.
.. option:: --cfg
Print output in .cfg format. Does not work along with \-B.
.. option:: --nslc
Print the list of channels which have bindings of the given
module. Requires \-B to be set. Can be used by other modules,
e.g., invextr, scart, scmssort, scevtstreams.

View File

@ -0,0 +1,673 @@
.. highlight:: rst
.. _scesv:
#####
scesv
#####
**Event summary view.**
Description
===========
scesv is the summary display of the event parameters. It shows the primary information
about the current event including location, time, strength, type and processing status.
In addition to the current event, older events can also be chosen from the event list in
the Events tab.
The two tabs of scesv are
* Events tab showing the list of loaded events, compare Fig. :ref:`fig-events`
* Summary tab with the details of the selected event, see Fig. :ref:`fig-summary`.
.. _scesv-events-tab:
Events Tab
==========
The Events tab shows the event list for the time span defined at the bottom of
the window. As the Events tab is also available for other GUIs, such as :ref:`scolv`,
it can be configured in the global module configuration.
The :ref:`description of the Events tab in scolv <scolv-events-tab>` provides more
details.
.. _fig-events:
.. figure:: media/scesv/scesv-events.png
:width: 16cm
:align: center
Eventlist tab
Tab1-2: Summary/Events tab, EventList: list of the last events with summarized information,
List options: Show fake events, Reading 1: spinbox to limit timespan of displayed events (in days),
Reading 2: spinboxes to limit timespan of displayed events (specified dates), Status: connection status
.. note::
As for :ref:`scolv`, the event list can be filtered and custom information can be
added to it by configuration. Read the scolv documentation on
:ref:`event filtering <scolv-events-filtering>` and :ref:`custom actions <scolv-custom-actions>`
for the details.
for the details.
.. _scesv-summary-tab:
Summary Tab
===========
The most recent event (default) or the event selected from the event list is shown in
the Summary tab, see Fig. :ref:`fig-summary`.
Here the information is highlighted in four sections:
==================== =============================================================
section              description
==================== =============================================================
Time                 Origin time in UTC and relative to now
Region               A map of the region and location with the event and stations
Magnitude            Different magnitude types, the values and counts
Hypocenter           Origin information with location, depth, azimuthal gap etc.
==================== =============================================================
.. _fig-summary:
.. figure:: media/scesv/scesv-summary.png
:width: 16cm
:align: center
Summary tab
Tab1-2: Summary/Events tab, Origin Time: origin time in UTC and relative to now, Map: map with region
and location and stations, Magnitude: different magnitude types with values and counts,
Origin Info: hypocenter information with position, phase count and azimuthal gap,
Event: earthquake location, Station: station with ray-path, Details: "Show Details" button to open
detailed information in :ref:`scolv`, Event Type: event type combo box to set event type, Status: connection status
Hotkeys
=======
================= =======================================
Hotkey            Description
================= =======================================
:kbd:`F1`         Open |scname| documentation
Shift + :kbd:`F1` Open scesv documentation
:kbd:`F2`         Setup connection dialog
:kbd:`F6`         Show propagation of P and S wave
:kbd:`F7`         Show focal mechanism by beach ball
Shift + :kbd:`F7` Show focal mechanism by beach ball
:kbd:`F8`         Toggle auto update
:kbd:`F9`         Show raypaths and associated stations
:kbd:`F10`        Toggle tabs
:kbd:`F11`        Toggle fullscreen
Mouse wheel       Zoom map in/out
Double click      Center map
================= =======================================
.. _scesv_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scesv.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scesv.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scesv.cfg`
scesv inherits :ref:`global options<global-configuration>`.
.. confval:: loadEventDB
Default: ``1``
Type: *double*
Unit: *days*
Number of days to preload if scesv is started.
.. confval:: showLastAutomaticSolution
Default: ``false``
Type: *boolean*
If enabled, the last automatic solution is displayed next to the
current solution in the summary panel. If the last automatic
solution differs from the current solution it is displayed
in red. If both solutions match it is displayed in gray.
.. confval:: showOnlyMostRecentEvent
Default: ``true``
Type: *boolean*
If enabled, only the most recent event is shown even if an update of
an older event is being received.
.. confval:: recenterMap
Default: ``true``
Type: *boolean*
If enabled, the map is centered around the most recent event.
.. confval:: enableFixAutomaticSolutions
Default: ``false``
Type: *boolean*
If enabled, an additional button is displayed which allows
switching back to the latest automatic solution. This button triggers
a command for scevent to prioritize automatic solutions until
the next manual solution is available.
.. confval:: visibleMagnitudes
Default: ``M,MLv,mb,mB,Mw(mB)``
Type: *list:string*
A list of magnitude types to be displayed.
.. confval:: button0
Type: *string*
Label of button0 which triggers script0.
.. confval:: button1
Type: *string*
Label of button1 which triggers script1.
.. confval:: ignoreOtherEvents
Default: ``true``
Type: *boolean*
If the event type is either \"other\" or
\"not existing\" and this flag is true then the
event will not be shown.
.. confval:: scripts.script0
Type: *string*
Path to a script if button0 is clicked.
.. confval:: scripts.script1
Type: *string*
Path to a script if button1 is clicked.
.. confval:: scripts.script0.exportMap
Default: ``false``
Type: *boolean*
If enabled, the current map is exported to file.
The filename is appended to the parameter list of script0.
The script has to take ownership of the file.
.. confval:: scripts.script0.oldStyle
Default: ``true``
Type: *boolean*
If enabled, the parameter list of script0 is event ID,
arrival count, magnitude, description.
If disabled, the parameter list of script0 is event ID,
preferredOriginID, preferredMagnitudeID,
preferredFocalMechanismID.
.. confval:: scripts.script1.exportMap
Default: ``false``
Type: *boolean*
If enabled, the current map is exported to file.
The filename is appended to the parameter list of script1.
The script has to take ownership of the file.
.. confval:: scripts.script1.oldStyle
Default: ``true``
Type: *boolean*
If enabled, the parameter list of script1 is event ID,
arrivalCount, magnitude, description.
If disabled, the parameter list of script1 is event ID,
preferredOriginID, preferredMagnitudeID,
preferredFocalMechanismID.
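A minimal sketch of how the button and script parameters could be combined in
:file:`scesv.cfg`; the button label and the script path are hypothetical:
.. code-block:: sh
# hypothetical example: wire button0 to a custom alert script
button0 = "Alert"
scripts.script0 = /home/sysop/bin/send-alert.sh
# keep the old-style parameter list: event ID, arrival count, magnitude, description
scripts.script0.oldStyle = true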
.. confval:: summary.borders
Default: ``false``
Type: *boolean*
Draw borders in the summary panel.
.. note::
**display.\***
*Adjust content or display custom information in the Summary tab.*
.. confval:: display.lonmin
Default: ``-180``
Type: *double*
Unit: *deg*
Minimum longitude of initially displayed map region.
.. confval:: display.lonmax
Default: ``180``
Type: *double*
Unit: *deg*
Maximum longitude of initially displayed map region.
.. confval:: display.latmin
Default: ``-90``
Type: *double*
Unit: *deg*
Minimum latitude of initially displayed map region.
.. confval:: display.latmax
Default: ``90``
Type: *double*
Unit: *deg*
Maximum latitude of initially displayed map region.
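As a sketch, the initially displayed map region can be restricted in
:file:`scesv.cfg`; the coordinate values below are arbitrary examples:
.. code-block:: sh
# hypothetical example: initially show only central Europe
display.lonmin = 0
display.lonmax = 20
display.latmin = 40
display.latmax = 55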
.. note::
**display.event.\***
*Event information*
.. confval:: display.event.comment.id
Type: *string*
ID of the event comment to be considered.
.. confval:: display.event.comment.default
Type: *string*
Value to be shown in case no valid event comment is
found.
.. confval:: display.event.comment.label
Type: *string*
Label of the value to be shown.
.. note::
**display.origin.\***
*Origin information*
.. note::
**display.origin.comment.\***
*Display origin comments.*
.. confval:: display.origin.comment.id
Type: *string*
ID of the origin comment to be considered.
.. confval:: display.origin.comment.default
Type: *string*
Value to be shown in case no valid origin comment is
found.
.. confval:: display.origin.comment.label
Type: *string*
Label of the value to be shown.
.. note::
**poi.\***
*Display information related to a point of interest (POI)*
*read from the cities XML file.*
.. confval:: poi.maxDist
Default: ``20``
Type: *double*
Unit: *deg*
Maximum distance in degrees of a POI to be taken into account.
.. confval:: poi.minPopulation
Type: *double*
The minimum population of a POI to be taken into account.
.. confval:: poi.message
Type: *string*
Message conversion string that converts a POI into the text
displayed under the region label. There are different
placeholders that can be used: \@dist\@, \@dir\@, \@poi\@ and \@region\@.
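A minimal sketch of a POI configuration in :file:`scesv.cfg`; the distance,
population threshold and message text are arbitrary examples:
.. code-block:: sh
# hypothetical example: describe the event relative to the nearest larger city
poi.maxDist = 10
poi.minPopulation = 50000
poi.message = "@dist@ km @dir@ of @poi@ (@region@)"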
Command-Line Options
====================
.. program:: scesv
:program:`scesv [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
Cities
------
.. option:: --city-xml arg
The path to the cities XML file. This overrides the default
paths. Compare with the global parameter \"citiesXML\".
User interface
--------------
.. option:: -F, --full-screen
Start the application filling the entire screen.
This only works with GUI applications.
.. option:: -N, --non-interactive
Use non\-interactive presentation mode. This only works with
GUI applications.
Options
-------
.. option:: --script0 arg
Path to the script called when configurable
button0 is pressed; EventID, arrival count, magnitude and
the additional location information string are passed as
parameters \$1, \$2, \$3 and \$4, respectively.
.. option:: --script1 arg
Path to the script called when configurable
button1 is pressed; EventID, arrival count, magnitude and
the additional location information string are passed as
parameters \$1, \$2, \$3 and \$4, respectively.
.. option:: --load-event-db arg
Number of days to load from database.

File diff suppressed because it is too large Load Diff

View File

@ -0,0 +1,99 @@
.. _scevent_eventtype:
#########
EventType
#########
evtype plugin for scevent
Description
===========
The *evtype* plugin sets the type of an event based on comments of picks which
are associated with the preferred origin of this event. The IDs of the comments
from which the comment values are considered must be configured by
:confval:`eventType.pickCommentIDs` and the name of the plugin, *evtype*, must
be added to the :confval:`plugins` parameter. The text of the considered
comments must contain a supported event type.
.. note::
Other criteria for setting the event type may be added later to this plugin.
Example of a pick comment:
.. code-block:: xml
<pick publicID="20160914.075944.29-deepc-AB.XYZ..HHZ">
...
<comment>
<text>earthquake</text>
<id>deepc:eventTypeHint</id>
</comment>
...
</pick>
Example configuration (:file:`scevent.cfg`):
.. code-block:: properties
plugins = evtype
eventType.setEventType = true
eventType.pickCommentIDs = scrttv:eventTypeHint, deepc:eventTypeHint
.. _scevent_eventtype_configuration:
Module Configuration
====================
.. note::
**eventType.\***
*Set the event type based on type comments of picks. Add the*
*plugin "evtype" to the list of plugins in the order*
*of priority to make this feature available.*
.. confval:: eventType.setEventType
Default: ``false``
Type: *boolean*
Allow setting the event type.
The type of events which have manual origins will
not be changed unless configured explicitly by
\"overwriteManual\".
.. confval:: eventType.overwriteEventType
Default: ``true``
Type: *boolean*
Allow overwriting existing event types set by other modules.
.. confval:: eventType.overwriteManual
Default: ``false``
Type: *boolean*
Allow setting the event type if the mode of the preferred
origin is manual or if the event type was set manually.
.. confval:: eventType.pickCommentIDs
Default: ``scrttv:eventTypeHint,deepc:eventTypeHint``
Type: *list:string*
Consider comments of picks which have one of the
given values. An empty list disables setting the type.

View File

@ -0,0 +1,473 @@
.. _scevent_regioncheck:
###########
RegionCheck
###########
evrc plugin for scevent
Description
===========
*evrc* (event region check) is a :term:`plugin` for :ref:`scevent` setting the
event type by comparing the location of the preferred origin with
:ref:`defined regions <sec-evrc-regions>`.
.. note::
Events for which the mode of the preferred origin is "manual" are by default
not considered.
.. _sec-evrc-regions:
Definition of regions
---------------------
The regions are defined by closed polygons provided in
:ref:`GeoJSON or BNA files <sec-gui_layers>`. Configure :confval:`rc.regions` to
consider a region defined by its region name. The name is given either
* As a property of the polygon when given in GeoJSON format,
* Or in the header when given in BNA format.
There exist **positive and negative regions**:
* **Positive region:** All events within the area enclosed by the polygon are
flagged positive, all events not enclosed by the polygon are flagged negative.
* **Negative region:** All events within the area enclosed by the polygon are
flagged negative, all events not enclosed by the polygon are flagged positive.
Regions are negative if the :confval:`name <rc.regions>` of the enclosing polygon
starts with **!** (exclamation mark). Otherwise the region is positive.
If a list of region names is defined, the last matching region in the list takes
priority when treating events.
.. note::
* When regions are defined or configured multiple times by polygons or
:confval:`rc.regions`, respectively, the region is not unique and the
region check is entirely inactive.
* When a region is not defined but configured in :confval:`rc.regions`, the
region check remains active but the region is ignored.
In both cases, error log messages are printed.
Treatment of events
-------------------
When the *evrc* plugin is loaded and configured, the location of the preferred
origin of an event is compared with the defined regions.
Events within a positive and a negative region are flagged positive and
negative, respectively. By default it sets the event type to "outside of network
interest" if the event is flagged negative.
#. When activating :confval:`rc.readEventTypeFromBNA` the type of positive
events is set according to the eventType defined in
:ref:`polygon <sec-evrc-polygon>`.
The type of negative events is set according to :confval:`rc.eventTypeNegative`.
Prepend 'accept' to the list of polygons to unset the type of negative events.
#. When :confval:`rc.readEventTypeFromBNA` is inactive, the event type is set
based on :confval:`rc.eventTypePositive` and :confval:`rc.eventTypeNegative`:
#. by default the type of all negative events (events within negative regions)
is set to "outside of network interest".
Prepend **accept** to :confval:`rc.regions` to unset the event type for
negative events.
#. **positive:** The event type of positive events is set to
:confval:`rc.eventTypePositive`. For empty :confval:`rc.eventTypePositive`
the type is unset.
#. **negative:** The event type of negative events is set to
:confval:`rc.eventTypeNegative`. The default type for negative events is
"outside of network interest".
Evaluation is made based on the order of the region names defined in
:confval:`rc.regions`. The last matching criterion applies.
In this way disjunct and overlapping regions with different behavior can be
defined. If events are *not* within positive regions, their type is set to
"outside of network interest".
.. _fig-evrc-region:
.. figure:: media/regions.png
:align: center
:width: 10cm
Disjunct and overlapping regions in front of a default.
Event types
-----------
The event types are either set based the types configured in
:confval:`rc.eventTypePositive` and :confval:`rc.eventTypeNegative`
or based on the type provided in the polygon files if
:confval:`rc.readEventTypeFromBNA` is active.
Type definition
~~~~~~~~~~~~~~~
For defining the event type, any value defined in :cite:t:`uml` can be used.
The list of valid values can also be found in the Event tab of :ref:`scolv`: Type.
Examples for valid event types:
* earthquake
* quarry blast
* nuclear explosion
* not existing
* ...
Invalid values result in errors or debug messages which are reported depending
on the verbosity level of :ref:`scevent` as given :confval:`logging.level` or
:option:`--verbosity`/:option:`-v`.
.. _sec-evrc-polygon:
Event type from polygon
~~~~~~~~~~~~~~~~~~~~~~~
If :confval:`rc.readEventTypeFromBNA` is active, the event type is read from the
polygon defining a region. Use a key-value pair in double quotes to specify the
type where the key is "eventType" and the value is the event type. The
formatting depends on the file format.
The depth of the event can be tested, as well. For events within a region but
with depth outside a depth range the type is not set. The limits of the depth
range can be added to the polygons using the key words *minDepth* and
*maxDepth*. For considering a polygon, the depth *d* of the preferred
:term:`origin` of an :term:`event` must be within the range
.. math::
minDepth \le d \le maxDepth
The origin depth is only tested if minDepth or maxDepth or both are set and if
:confval:`rc.readEventTypeFromBNA` is active.
.. warning::
* The names of polygons, e.g. coal, are case sensitive and must not contain
commas.
* A hierarchy applies to the reading of GeoJSON/BNA files. Read the section
:ref:`sec-gui_layers-vector` for the details.
**Example polygon in BNA format:**
.. code-block:: properties
"coal","rank 1","eventType: mining explosion, minDepth: -5, maxDepth: 10",6
13.392,50.3002
13.2244,50.4106
13.4744,50.5347
13.6886,50.4945
13.6089,50.358
13.6089,50.358
where the name of the polygon / region is "coal" and the considered event type
is "mining explosion". The name and the rank are mandatory fields. All key-value
pairs for eventType, minDepth and maxDepth are written within one single field
enclosed by double quotes.
**Example polygon in GeoJSON format:**
* Single Feature
For a single Feature and Polygon, eventType, minDepth and maxDepth are added as
key-value pairs to the properties of the feature:
.. code-block:: properties
{
"type": "Feature",
"geometry": {
"type": "Polygon",
"coordinates": [
[
[-77.075, -37.7108], [-76.2196, -21.2587], [-69.0919, -7.10994]
]
]
},
"properties": {
"name": "mines",
"rank": 1,
"eventType": "mining explosion",
"minDepth": -5,
"maxDepth": 10
}
}
* Single Feature and MultiPolygon
For a single Feature and a MultiPolygon, eventType, minDepth and maxDepth are
added as key-value pairs to the properties of the MultiPolygon:
.. code-block:: properties
{
"type": "Feature"
"properties": {
"name": "mines",
"rank" : 1,
"eventType": "mining explosion",
"minDepth": -5,
"maxDepth": 10
},
"geometry": {
"type": "MultiPolygon",
"coordinates": [
[
[
[ 10.0, -25.0 ],
[ 13.0, -25.0 ],
[ 13.0, -22.0 ],
[ 10.0, -25.0 ]
]
], [
[
[ 20.0, -25.0 ],
[ 23.0, -25.0 ],
[ 23.0, -22.0 ],
[ 20.0, -25.0 ]
]
]
]
}
}
* FeatureCollection
For a FeatureCollection, the key-value pairs may be added to the properties of
each individual feature:
.. code-block:: properties
{
"type": "FeatureCollection",
"features": [
{ "type": "Feature",
"properties": {
"name": "Krakatau",
"rank": 1,
"eventType": "mining explosion",
"minDepth": -5,
"maxDepth": 10
},
"geometry": {
"type": "Polygon",
"coordinates": [ ... ]
}
},
{ "type": "Feature",
"properties": {
"name": "Batu Tara",
"rank": 1,
"eventType": "mining explosion",
"minDepth": -5,
"maxDepth": 10
},
"geometry": {
"type": "Polygon",
"coordinates": [ ... ]
}
}
]
}
Setting up the Plugin
======================
Load the *evrc* plugin: Add to the global configuration or to the
global configuration of :ref:`scevent` in the order of priority:
.. code-block:: sh
plugins = ${plugins},evrc
Add GeoJSON or BNA polygons by defining :confval:`rc.regions`.
Use the region name to define positive and negative regions. Names with
leading *!* define negative regions.
.. code-block:: sh
rc.regions = accept,area
.. note::
:ref:`scevent` stops
if the *evrc* plugin is loaded but :confval:`rc.regions` is not defined.
Activate :confval:`rc.readEventTypeFromBNA` and add the eventType key-value pair
to the :ref:`polygons <sec-evrc-polygon>` if the event type
shall be read from GeoJSON or BNA polygon.
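A minimal sketch of the resulting :file:`scevent.cfg` fragment; the polygon name
*quarries* is a hypothetical example:
.. code-block:: sh
plugins = ${plugins},evrc
# accept all events, flag events within the polygon "quarries" as negative
rc.regions = accept,!quarries
# read eventType, minDepth and maxDepth from the polygon properties
rc.readEventTypeFromBNA = true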
**Examples:**
Set type of events within the positive polygon **germany** but do not change the
type outside:
.. code-block:: sh
rc.regions = accept,germany
Accept all events without setting the type, set the type for all events within
the positive polygon **germany**, but flag events within the polygon
**quarries** as negative:
.. code-block:: sh
rc.regions = accept,germany,!quarries
Accept all events without setting the type but consider events within the
negative polygon **germany** and events within the positive polygon **saxony**:
.. code-block:: sh
rc.regions = accept,!germany,saxony
.. _scevent_regioncheck_configuration:
Module Configuration
====================
.. note::
**rc.\***
*Test if events lie within or outside geographic regions defined*
*by polygons.*
*Events within a region are flagged as positive, outside as negative.*
*The event type is set accordingly. Add the*
*plugin "evrc" to the plugins parameter in the*
*order of priority to make this feature available. Read the*
*documentation of the RegionCheck for more details.*
.. confval:: rc.setEventType
Default: ``true``
Type: *boolean*
Allow setting the event type.
The type of events which have manual origins will
not be changed unless configured explicitely by
\"overwriteManual\".
.. confval:: rc.overwriteEventType
Default: ``true``
Type: *boolean*
Allow overwriting existing event types. When disabled, changes
in the source region are not accounted for.
.. confval:: rc.overwriteManual
Default: ``false``
Type: *boolean*
Allow setting the event type if the mode of the preferred
origin is manual or if the event type was set manually.
.. confval:: rc.regions
Default: ``!reject``
Type: *list:string*
The list of closed polygon names defining regions for
flagging events as positive or negative.
A polygon name defines a positive region but names with
prefix \! \(exclamation mark\) define negative regions.
Evaluation is done in the order of the polygons. The last
matching criteria applies and the event type is set
accordingly.
Default: If events are not within positive regions or are within negative
regions, the event type is set to \"outside of network
interest\". Default:
\"\!reject\", use \"accept\" to overwrite
the default.
Examples:
Events are flagged positive within the polygon
\"germany\":
germany
All events are flagged positive but events within the
polygon \"quarries\" are negative:
accept,\!quarries
Events within the polygon \"germany\" are flagged
positive but all other events and events within the polygon
\"quarries\" are negaitve:
germany,\!quarries
All events are flagged positive but events within the
polygon \"germany\" are negative and all events
within the polygon \"saxony\" are positive:
accept,\!germany,saxony
.. confval:: rc.readEventTypeFromBNA
Default: ``false``
Type: *boolean*
Consider the event type, minDepth and maxDepth values from
the polygons defined by GeoJSON or BNA files. Read the
documentation of the RegionCheck plugin for the details.
When eventType is defined in the polygons, the value
supersedes values of 'eventTypePositive' and
'eventTypeNegative'.
If not set, 'eventTypePositive' and 'eventTypeNegative' are
considered.
.. confval:: rc.eventTypePositive
Type: *string*
New type of an event which is flagged positive. Ignored
if 'readEventTypeFromBNA' is active and the polygons
define eventType.
Empty: Do not set type.
.. confval:: rc.eventTypeNegative
Default: ``"outside of network interest"``
Type: *string*
New type of an event which is flagged negative. Ignored
if 'readEventTypeFromBNA' is active and the polygons
define eventType.
Empty means default: \"outside of network interest\"

View File

@ -0,0 +1,250 @@
.. highlight:: rst
.. _scevtlog:
########
scevtlog
########
**Event log preserving the history of updates.**
Description
===========
Running SeisComP causes many database write accesses. Whenever a new
event is created, a new row is inserted in the database table.
When the same event is updated, the corresponding row in the database table is
changed as well. The information about the history of the event is lost because
the database contains only the current event attributes. scevtlog saves the
event history into files. While scevtlog is running it keeps track of all
event updates and stores this information in a directory that can be analyzed
at any time. The stored information is written as plain text in an easily
readable format. Additionally, scevtlog maintains an event summary file for an
overview of the event history.
.. _scevtlog_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scevtlog.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scevtlog.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scevtlog.cfg`
scevtlog inherits :ref:`global options<global-configuration>`.
.. confval:: directory
Default: ``@LOGDIR@/events``
Type: *string*
Specify the output directory. Within this directory the logging
directory structure and the event files are stored.
.. confval:: format
Default: ``xml``
Type: *string*
Specify the output event format. For completeness
it is recommended to use xml as the storage format. The autoloc3
format can easily be reconstructed from XML with scbulletin but not the other
way around.
.. confval:: gzip
Default: ``false``
Type: *boolean*
If format is xml then all XML files will be compressed with gzip
and stored with file extension \".xml.gz\". They are also
valid gzip files and can be used as input to e.g. zgrep.
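A minimal sketch of :file:`scevtlog.cfg` combining these parameters; the values
simply repeat the documented defaults except for gzip:
.. code-block:: sh
# store the event history as gzip-compressed XML files
directory = @LOGDIR@/events
format = xml
gzip = true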
Command-Line Options
====================
.. program:: scevtlog
:program:`scevtlog [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Storage
-------
.. option:: -o, --directory arg
Overrides configuration parameter :confval:`directory`.
.. option:: -f, --format arg
Overrides configuration parameter :confval:`format`.

View File

@ -0,0 +1,223 @@
.. highlight:: rst
.. _scevtls:
#######
scevtls
#######
**List event IDs from database.**
Description
===========
*scevtls* lists the event IDs of all events available in a database or
:term:`SCML` file within a given time span. The list may be filtered by
event type. The IDs are printed to stdout.
Similarly, use :ref:`scorgls` for listing all origin IDs. In extension to
*scevtls* and :ref:`scorgls`, :ref:`scquery` can search for parameters based on
complex custom queries.
Examples
========
* Print all event IDs for the complete year 2012:
.. code-block:: sh
scevtls -d mysql://sysop:sysop@localhost/seiscomp \
--begin "2012-01-01 00:00:00" \
--end "2013-01-01 00:00:00"
* Print all event IDs with event type *quarry blast*:
.. code-block:: sh
scevtls -d mysql://sysop:sysop@localhost/seiscomp \
--event-type "quarry blast"
* Print the IDs of all events provided with the XML file:
.. code-block:: sh
scevtls -i events.xml
* Print all event IDs along with the ID of the preferred origin:
.. code-block:: sh
scevtls -d localhost -p
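* Print all event IDs separated by commas instead of one per line (a sketch
using the documented delimiter option; the delimiter character is arbitrary):
.. code-block:: sh
scevtls -d localhost -D ","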
.. _scevtls_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scevtls.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scevtls.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scevtls.cfg`
scevtls inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scevtls
:program:`scevtls [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Input
-----
.. option:: -i, --input arg
Name of input XML file. Read from stdin if '\-' is given.
Deactivates reading events from database.
Events
------
.. option:: --begin time
Specify the lower bound of the time interval. Format:
2012\-01\-01T00:00:00.
.. option:: --end time
Specify the upper bound of the time interval. Format:
2012\-01\-01T00:00:00.
.. option:: --hours arg
Start searching given hours before now. If set, \-\-begin and
\-\-end are ignored.
.. option:: --modified-after time
Print IDs of events modified after the specified time.
.. option:: --event-type arg
The event type for filtering events.
Use quotes for types with more than one word.
Example: \"mining explosion\".
Output
------
.. option:: -D, --delimiter string
Specify the delimiter of the resulting event IDs.
.. option:: -p, --preferred-origin
Print the ID of the preferred origin along with the event
ID.

View File

@ -0,0 +1,312 @@
.. highlight:: rst
.. _scevtstreams:
############
scevtstreams
############
**Extract stream information with time windows from picks of an event.**
Description
===========
scevtstreams reads all picks of an event and determines the time window between
the first pick and the last pick. In addition, a symmetric or an asymmetric time
margin is added to this time window. It writes the picked streams, including the
determined time window for the event, to stdout. This tool provides suitable input
for :ref:`scart`, :ref:`fdsnws` and :cite:t:`capstool` for the
:cite:t:`caps` server (Common Acquisition Protocol Server by gempa GmbH) to dump
waveforms from archives based on event data.
Output Format
=============
The generated list contains start and end time as well as stream information.
Generic:
.. code-block:: properties
starttime;endtime;stream
Example:
.. code-block:: properties
2019-07-17 02:00:00;2019-07-17 02:10:00;GR.CLL..BHZ
Examples
========
#. Get the time windows for an event in the database:
.. code-block:: sh
scevtstreams -E gfz2012abcd -d mysql://sysop:sysop@localhost/seiscomp
#. Get the asymmetric time windows for an event in an XML file. The time window
starts 120 s before the first pick and ends 500 s after the last pick:
.. code-block:: sh
scevtstreams -E gfz2012abcd -i event.xml -m 120,500
#. Create a playback of an event with a time window of 5 minutes data and
sort the records by end time:
.. code-block:: sh
scevtstreams -E gfz2012abcd -d mysql://sysop:sysop@localhost/seiscomp -m 300 |\
scart -dsvE --list - ~/seiscomp/acquisition/archive > gfz2012abcd-sorted.mseed
#. Download waveforms from Arclink and import into local archive. Include
all stations from the contributing networks:
.. code-block:: sh
scevtstreams -E gfz2012abcd -d mysql://sysop:sysop@localhost/seiscomp -m 300 -R --all-stations |\
scart --list - ./my-archive
#. Create lists compatible with :ref:`fdsnws` or `caps <https://docs.gempa.de/caps/current/apps/capstool.html>`_: ::
scevtstreams -E gfz2012abcd -i event.xml -m 120,500 --fdsnws
scevtstreams -E gfz2012abcd -i event.xml -m 120,500 --caps
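The FDSN POST list can then be submitted to an :ref:`fdsnws` dataselect service,
for instance with curl. This is only a sketch; the host, port and file names are
assumptions:
.. code-block:: sh
scevtstreams -E gfz2012abcd -i event.xml -m 120,500 --fdsnws > post.txt
curl --data-binary @post.txt "http://localhost:8080/fdsnws/dataselect/1/query" -o gfz2012abcd.mseed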
.. _scevtstreams_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scevtstreams.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scevtstreams.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scevtstreams.cfg`
scevtstreams inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scevtstreams
:program:`scevtstreams [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Input
-----
.. option:: -i, --input arg
Input XML file name. Reads event from the XML file instead of
database. Use '\-' to read from stdin.
.. option:: -f, --format arg
Input format to use \(xml [default], zxml \(zipped xml\),
binary\). Only relevant with \-i.
Dump
----
.. option:: -E, --event arg
The ID of the event to consider.
.. option:: --net-sta arg
Filter read picks by network code or network and station
code. Format: NET or NET.STA
.. option:: --nslc arg
Stream list file to be used for filtering read picks by
stream code. '\-\-net\-sta' will be ignored. One line per
stream. Line format: NET.STA.LOC.CHA.
Output
------
.. option:: -m, --margin arg
Time margin around the picked time window, default is 300.
Added before the first and after the last pick,
respectively. Use 2 comma\-separated values \(before,after\)
for asymmetric margins. Example: 120,300.
.. option:: -S, --streams arg
Comma separated list of streams per station to add.
Example: BH,SH,HH.
.. option:: -C, --all-components arg
Specify whether to use all components \(1\) or just the
picked ones \(0\). Default: 1.
.. option:: -L, --all-locations arg
Specify whether to use all location codes \(1\) or just
the picked ones \(0\). Default: 1.
.. option:: --all-stations
Dump all stations from the same network. If unused, just
stations with picks are dumped.
.. option:: --all-networks
Dump all networks. If unused, just networks with picks are
dumped. This option implies \-\-all\-stations, \-\-all\-locations,
\-\-all\-streams, \-\-all\-components and will only provide the
time window.
.. option:: -R, --resolve-wildcards flag
If all components are used, use inventory to resolve stream
components instead of using '?' \(important when Arclink
should be used\).
.. option:: --caps
Dump in capstool format \(Common Acquisition Protocol Server
by gempa GmbH\).
.. option:: --fdsnws flag
Dump in FDSN dataselect webservice POST format.

View File

@ -0,0 +1,663 @@
.. highlight:: rst
.. _scheli:
######
scheli
######
**Real-time helicorder view for one stream.**
Description
===========
:program:`scheli` visualizes waveforms from a single stream or multiple stations
mimicking a drum-recorder plot (see :ref:`fig-scheli`):
* :program:`scheli` plots one configurable trace in helicorder style in the
:term:`GUI` (:ref:`GUI mode <scheli-show>`).
* Configurable GUI: trace colors, visualized time spans, number of rows, data filtering,
amplitude ranges and much more.
* Automatic image capturing: Capture helicorder images at configurable time intervals
of one trace in :ref:`GUI mode<scheli-show>` or a set of multiple channels in
:ref:`capture mode<scheli-capture>`.
The images can be used, e.g. for showing data images on web sites.
.. _fig-scheli:
.. figure:: media/scheli.png
:width: 16cm
:align: center
scheli in GUI mode
Examples
========
.. _scheli-show:
1. **GUI mode - Simple helicorder window:**
* Learn about the many command-line options for :program:`scheli`: ::
scheli -h
* Start :program:`scheli` with the configured values and informative debug output: ::
scheli --debug
* Let :program:`scheli` show data from the CX station PB01 for the previous 5 hours
overriding the configuration by command-line parameters:
.. code-block:: sh
scheli --stream CX.PB01..HHZ --rows 10
* Define the data request window by end time and duration; scale traces to the
maximum amplitude per row: ::
scheli --stream IU.TSUM.00.BHZ --end-time "2021-04-22 14:00:00" --time-span 600 --amp-scaling row
.. _scheli-capture:
2. **Capture mode - Image capturing:**
Capture the helicorder plot for 3 stations in intervals of 10 seconds.
The data is retrieved using seedlink and the plots are stored as PNG images.
The image files are named according to network, station, stream and location codes
of the requested stations. Command-line parameters override the module configuration.
.. code-block:: sh
scheli capture --stream CX.PB01..HHZ --stream CX.PB02..HHZ --stream CX.PB04..HHZ --interval 10 -o "/tmp/heli_%N_%S_%L_%C.png" -H localhost -I slink://localhost
The output file names will be generated based on network code (%N), station code (%S),
location code (%L) and stream code (%C): ::
/tmp/CX.PB01..HHZ.png
/tmp/CX.PB02..HHZ.png
/tmp/CX.PB04..HHZ.png
Setup
=====
Specific :program:`scheli` parameters are adjusted in the :ref:`module configuration <scheli_configuration>`.
Colors of traces etc. can be adjusted by setting the *scheme* parameters in
the global configuration of scheli. For alternating colors between the traces
set the parameters scheme.colors.records.foreground and
scheme.colors.records.alternateForeground in :file:`scheli.cfg`:
.. code-block:: sh
# The general color of records/traces.
scheme.colors.records.foreground = 4286F4
# A general trace color of the alternate trace (eg scheli).
scheme.colors.records.alternateForeground = B72D0E
.. _scheli_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scheli.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scheli.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scheli.cfg`
scheli inherits :ref:`global options<global-configuration>`.
.. confval:: heli.streams
Type: *list:string*
List of stream codes to be plotted \(net.sta.loc.cha\).
If not in capture mode only the first stream is shown.
When using a list, the first entry is considered.
Use commas for separating streams.
Example: GR.MOX..BHZ
.. confval:: heli.filter
Type: *string*
Filter to be applied on the data.
.. confval:: heli.numberOfRows
Default: ``48``
Type: *integer*
Number of rows \(traces\) to display.
.. confval:: heli.rowTimeSpan
Default: ``1800``
Type: *double*
Unit: *s*
Length of data per trace.
.. confval:: heli.timeFormat
Default: ``%F``
Type: *string*
The time format used to print the start and end time of the
whole plot \(upper right corner\). The format specification is
the one used in the strftime function \(man strftime\).
.. confval:: heli.recordTime
Default: ``false``
Type: *boolean*
Set current time to last data sample.
.. confval:: heli.lineWidth
Default: ``1``
Type: *integer*
Unit: *px*
Line width of traces.
.. confval:: heli.colors
Default: ``FF0000, 0000FF``
Type: *list:string*
A list of alternating row colors cycled through for painting
traces.
.. confval:: heli.antialiasing
Default: ``false``
Type: *boolean*
Use anti aliasing to plot the traces. The default uses the
settings from scheme.records.antiAliasing
.. confval:: heli.stream.description
Default: ``true``
Type: *boolean*
Add stream description to traces.
.. note::
**heli.amplitudeRange.\***
*Gain-corrected amplitudes given in units of the sensor.*
*For example: m/s.*
.. confval:: heli.amplitudeRange.scaling
Default: ``minmax``
Type: *string*
Define the method to scale traces within rows. Possible
values are:
minmax: Scale all rows to configured minimum and maximum
amplitudes configured by amplitudeRange.min and
amplitudeRange.max
row: Scale each row to the maximum within this row.
.. confval:: heli.amplitudeRange.min
Default: ``-0.00001``
Type: *double*
Unit: *unit of input data*
Minimum amplitude to show in trace. Requires
amplitudeRange.scaling \= \"minmax\".
.. confval:: heli.amplitudeRange.max
Default: ``0.00001``
Type: *double*
Unit: *unit of input data*
Maximum amplitude to show in trace. Requires
amplitudeRange.scaling \= \"minmax\".
.. note::
**heli.dump.\***
*Control dumping of PNG images.*
*Execute "scheli capture" for image generation in*
*the background without the graphics.*
.. confval:: heli.dump.interval
Default: ``-1``
Type: *integer*
Unit: *s*
Image creation interval. Negative values disable image
dumping.
If enabled, images are generated at the configured
interval.
.. confval:: heli.dump.outputFile
Default: ``/tmp/heli_%N_%S_%L_%C.png``
Type: *string*
Name of output file.
The filename can contain placeholders
that are replaced by the corresponding streamID parts:
%N : network code
%S : station code
%L : location code
%C : channel code
Placeholders are important if more than one stream
is given and capture mode is active.
.. confval:: heli.dump.dpi
Default: ``300``
Type: *integer*
Unit: *dpi*
Image resolution.
.. confval:: heli.dump.xres
Default: ``1024``
Type: *integer*
Unit: *px*
Number of pixels horizontally.
.. confval:: heli.dump.yres
Default: ``768``
Type: *integer*
Unit: *px*
Number of pixels vertically.
.. confval:: scripts.postprocessing
Type: *path*
Define the path to a script that is called whenever an image
has been captured and written to disc. The only parameter is
the path to the generated image.
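A minimal sketch of a :file:`scheli.cfg` fragment for timed image dumps; the
streams, interval, output path and script path are hypothetical:
.. code-block:: sh
# dump a PNG image per stream every 60 s and hand it to a post-processing script
heli.streams = CX.PB01..HHZ, CX.PB02..HHZ
heli.dump.interval = 60
heli.dump.outputFile = /srv/www/heli/%N_%S_%L_%C.png
scripts.postprocessing = /home/sysop/bin/publish-heli.sh
Run :program:`scheli capture` to generate the images in the background without
opening the GUI.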
Command-Line Options
====================
.. program:: scheli
:program:`scheli [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --config-db arg
Load the configuration from the given database or file,
format: [service:\/\/]location .
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
User interface
--------------
.. option:: -F, --full-screen
Start the application filling the entire screen.
This only works with GUI applications.
.. option:: -N, --non-interactive
Use non\-interactive presentation mode. This only works with
GUI applications.
Mode
----
.. option:: --offline
Do not connect to a messaging server and do not use the
database.
.. option:: --end-time arg
Set the acquisition end time, e.g. '2017\-09\-08 13:30:00',
default: 'gmt'
Data
----
.. option:: --stream arg
The record stream that should be displayed. Can be used
multiple times for multiple streams.
Example: GR.MOX..BHZ \(net.sta.loc.cha\)
.. option:: --filter arg
The filter to apply
.. option:: --gain arg
Gain applied to the data before plotting
.. option:: --amp-scaling arg
Amplitude scaling method applied per row. Possible values:
minmax: Scale all rows to configured minimum and maximum
amplitudes.
row: Scale each row to the maximum within this row.
.. option:: --amp-range-min arg
Lower bound of amplitude range per row
.. option:: --amp-range-max arg
Upper bound of amplitude range per row
.. option:: --amp-range arg
Around-zero bound of amplitude range per row.
.. option:: --record-time arg
Whether the last row should always contain the last record received.
Output
------
.. option:: --desc arg
Enable\/disable the display of a station description
.. option:: --rows arg
Configure the number of rows to display
.. option:: --time-span arg
Configure the time\-span \(in secs\) per row. Unit: seconds.
.. option:: --aa arg
Set antialiasing for rendering the traces
.. option:: --xres arg
Output x resolution when generating images. Unit: dpi.
.. option:: --yres arg
Output y resolution when generating images. Unit: dpi.
.. option:: --dpi arg
Output dpi when generating PostScript. Unit: dpi.
.. option:: -o arg
Output filename. Placeholders are %N,%S,%L,%C for network
code, station code, location code, channel code.
.. option:: --interval arg
Snapshot interval \(less than 0 disables timed snapshots\).
Unit: seconds.
.. highlight:: rst
.. _scimex:
######
scimex
######
**SeisComP event exchange between two systems.**
Description
===========
scimex manages the |scname| object exchange between two or more different SeisComP systems in
real time. scimex may import or export the data to one or several systems. In
contrast to :ref:`scimport`, the exchange of the |scname| objects is event-based.
This means that no messages are exchanged until the exporting system has produced
an event.
By default all objects (picks, amplitudes, origins, arrivals, station
magnitudes, magnitudes, magnitude references) are transferred to the other
system. The user can define filters at both the sender and the receiver, to
limit the events for which objects are transferred. Possible filter parameters
are the event location, magnitude, arrival count and agency. scimex supports
two modes: *import* and *export*. In export mode scimex collects all objects
relevant for an event (e.g. picks, amplitudes, origins, magnitudes) from
scmaster's message groups at the source and checks if the filter criteria
match. Once the criteria are fulfilled, the whole package of objects is sent
to the scmaster IMPORT group of the receiving system.
At the receiving |scname| system an instance of scimex runs in import mode. It
fetches the whole event information from its own IMPORT group, checks the local
filter criteria of the system and sends the collected objects to the different
message groups, e.g. Pick, Amplitude, Magnitude, Location. In export mode
several recipients can be defined and for each recipient individual filters
can be set. To run several instances of scimex on one system, aliases have to
be defined, e.g. for import:
.. code-block:: sh
seiscomp alias create scimex_import scimex
and for export:
.. code-block:: sh
seiscomp alias create scimex_export scimex
Then the configuration can be split into scimex_import.cfg and
scimex_export.cfg.
Examples
========
For a push-type configuration, in which the exporting server must be able to
connect to the messaging server on the receiving host. On the receiving host:
scimex_import.cfg
.. code-block:: sh
connection.username = scimexIm
connection.server = localhost
mode = IMPORT
cleanupinterval = 86400
importHosts = import1
criteria.world.longitude = -180:180
criteria.world.latitude = -90:90
criteria.world.magnitude = 1:9
criteria.world.agencyID = ""
criteria.world.arrivalcount = 15
hosts.import1.address = localhost
# The criterion "world" has been defined above
hosts.import1.criteria = world
# optional and true per default
hosts.import1.filter = false
# optional and true per default
hosts.import1.useDefinedRoutingTable = true
hosts.import1.routingtable = Pick:IMPORT,StationAmplitude:IMPORT,
Origin:LOCATION,Arrival:LOCATION,
StationMagnitude:MAGNITUDE,
Magnitude:MAGNITUDE,
StationMagnitudeContribution:MAGNITUDE,
OriginReference:EVENT,Event:EVENT
In this example, Pick and StationAmplitude objects are sent to the
receiving system's IMPORT group to avoid interfering with the receiving system's
picking.
On the sending system, only those events with a high enough magnitude
and enough arrivals, and with the AgencyID "GFZ" are exported:
scimex_export.cfg
.. code-block:: sh
connection.username="scimexEx"
connection.server = localhost
mode = EXPORT
cleanupinterval = 7200
exportHosts = exp1, exp2
# Match everything with magnitude above or equal 5
# and with more than 25 phases which comes from
# agency GFZ.
criteria.globalM5.latitude = -90:90
criteria.globalM5.longitude = -180:180
criteria.globalM5.magnitude = 5:10
criteria.globalM5.arrivalcount = 25
criteria.globalM5.agencyID = GFZ
# Export to a system which still runs a very old version. The
# messages need to be converted.
hosts.exp1.address = 192.168.0.3
hosts.exp1.criteria = globalM5
hosts.exp1.conversion = imexscdm0.51
hosts.exp2.address = 192.168.0.4
hosts.exp2.criteria = globalM5
.. _scimex_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scimex.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scimex.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scimex.cfg`
scimex inherits :ref:`global options<global-configuration>`.
.. confval:: mode
Type: *string*
Mode of operation. Options are IMPORT or EXPORT.
.. confval:: cleanupinterval
Type: *double*
Unit: *s*
Cache lifetime for objects.
.. confval:: subscriptions
Type: *list:string*
Only used in export mode. A list of message groups to subscribe.
.. confval:: conversion
Type: *string*
Used only in import mode. It defines the source format of the
messages that need to be converted. Currently the import of
SeisComP datamodel version 0.51 \(imexscdm0.51\)
is supported which was used in release Barcelona \(2008\).
.. confval:: exportHosts
Type: *list:string*
A list of host profiles to be considered for exporting.
These are used in hosts.\$name directives
\(see below\) to define addresses, filter criteria, etc.
applicable to each recipient.
.. confval:: importHosts
Type: *list:string*
A list of host profiles to be considered for importing.
These are used with hosts.\$name directives similarly to
exportHosts.
.. note::
**criteria.$name.\***
*A definition of an event filter.*
$name is a placeholder for the name to be used.
.. confval:: criteria.$name.latitude
Type: *tuple:double*
Pair of doubles that defines the latitude range.
Example: \-90:90.
.. confval:: criteria.$name.longitude
Type: *tuple:double*
Pair of doubles that defines the longitude range.
Example: \-180:180.
.. confval:: criteria.$name.magnitude
Type: *tuple:double*
Pair of doubles that defines the magnitude range.
Example: 3:10.
.. confval:: criteria.$name.arrivalcount
Type: *int*
Number of minimum arrivals.
.. confval:: criteria.$name.agencyID
Type: *list:string*
White list of AgencyIDs \(the agency identifier which
appears in the objects sent over the export\-import link\).
.. note::
**hosts.$name.\***
*A sink definition used for either import or export.*
$name is a placeholder for the name to be used.
.. confval:: hosts.$name.address
Type: *string*
Address of a sink, as a host name with an optional port
number e.g. 'address \= 192.168.1.1' or
'address \= somewhere.com:4803'
.. confval:: hosts.$name.criteria
Type: *string*
Defining filter criteria name for sink, e.g. criteria \=
world\-xxl. The criteria must be defined in the
criteria.\* configuration lines.
.. confval:: hosts.$name.filter
Default: ``true``
Type: *boolean*
Enable\/disable filtering based on defined criteria.
If set to false, all events will pass, even if one
or more criteria are defined.
.. confval:: hosts.$name.conversion
Type: *string*
Optional target format for export.
.. confval:: hosts.$name.useDefinedRoutingTable
Default: ``false``
Type: *boolean*
Enable\/disable defined routing tables.
.. confval:: hosts.$name.routingTable
Type: *list:string*
Defining routing tables in the meaning of mapping
objects to message groups. Example: Pick:NULL,
StationAmplitude:NULL, Origin:LOCATION,
StationMagnitude: MAGNITUDE, NetworkMagnitude:MAGNITUDE,
MagnitudeReference:MAGNITUDE, OriginReference:EVENT,
Event:EVENT. Specifying NULL for the message group causes
messages to be thrown away\/dropped\/discarded.
Command-Line Options
====================
.. program:: scimex
:program:`scimex [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
SCIMEX
------
.. option:: --print-default-routingtable
Print the default object routing table.
.. highlight:: rst
.. _scimport:
########
scimport
########
**Forward messages across two SeisComP systems.**
Description
===========
scimport is responsible for forwarding messages from one system to another. The
difference to :ref:`scimex` is that scimport does not handle the messages
event-based. scimport supports two different modes. The relay mode does a
simple mapping from GROUP:SYSTEM_A to GROUP:SYSTEM_B. This mode is the default.
In case GROUP is not defined in the second system the message is forwarded to
IMPORT_GROUP. The import mode supports custom mapping and filter functionality.
It is possible to forward GROUP1:SYSTEM_A to GROUP2:SYSTEM_B. In addition the
forwarded objects can be filtered by:
Pick
- Status
- Mode
- Phase
- AgencyID
Amplitude
- Amplitude
Origin
- Location
- Depth
- AgencyID
- Status
- Mode
Event
- Type
StationMagnitude
- Type
Magnitude
- Type
Examples
========
Example scimport.cfg
.. code-block:: sh
# The address of the importing system
sink = sinkAddress
# This option has to be set if the application runs in import mode.
# The routing table has to be defined in the form of source_group:sink_group
routingtable = PICK:PICK
# List of sink groups to subscribe to. If this option is not set the message
# groups will be determined automatically. If this option is set but not
# needed for a setup it can be ignored with the option --ignore-groups
msggroups = GROUP_ONE, GROUP_TWO
# Available filter options
filter.pick.mode = manual
filter.pick.status = confirmed
filter.pick.phase = P
filter.pick.agencyID = GFZ
# Values: eq (==), lt (<=) ,gt (>=), *
filter.amplitude.operator = gt
filter.amplitude.amplitude = 100
# Values: lat0:lat1 (range)
filter.origin.latitude = -90:90
# Values: lon0:lon1 (range)
filter.origin.longitude = -180:180
filter.origin.depth = 0:100
filter.origin.agencyID = GFZ
# Values: automatic, manual
filter.origin.mode = manual
filter.origin.status = confirmed
# Values: earthquake, explosion, quarry blast, chemical explosion,
# nuclear explosion, landslide, debris avalanche, rockslide,
# mine collapse, volcanic eruption, meteor impact, plane crash,
# building collapse, sonic boom, other
filter.event.type = earthquake
# Values: Whatever your magnitudes are named
filter.stationMagnitude.type = MLv
# Values: Whatever your magnitudes are named
filter.magnitude.type = MLv
# Values: latency, delay, timing quality, gaps interval, gaps length,
# spikes interval, spikes amplitude, offset, rms
filter.qc.type = latency
.. _scimport_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scimport.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scimport.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scimport.cfg`
scimport inherits :ref:`global options<global-configuration>`.
.. confval:: sink
Type: *string*
URI of receiving host which runs scmaster. The URI contains
the host name with an optional protocol and port.
Format: protocol:\/\/host:port
Examples:
\- proc
\- scmp:\/\/proc:18180
.. confval:: routingtable
Type: *list:string*
This option has to be set if the application runs in import mode.
The routing table has to be defined in the form of
source_group:sink_group
.. confval:: msggroups
Type: *list:string*
Define a list of message groups of the source system
\(connection.server\). If not specified, the source system is
queried for a list of message groups which are then used to check
each subscription extracted from the routing table.
This parameter allows overriding the source query result.
.. confval:: useFilter
Default: ``true``
Type: *boolean*
*No description available*
.. note::
**filter.\***
*Define filter criteria before sending.*
.. note::
**filter.pick.\***
*Criteria for filtering picks.*
.. confval:: filter.pick.mode
Type: *string*
The mode of picks to filter for. Allowed values:
\"automatic\" and \"manual\".
.. confval:: filter.pick.status
Type: *string*
The status of picks to filter for. Allowed values:
\"preliminary\", \"confirmed\",
\"reviewed\", \"final\",
\"rejected\" and \"reported\".
.. confval:: filter.pick.phase
Type: *string*
The pick phase hint to filter for. Allowed values:
all possible phase codes.
.. confval:: filter.pick.agencyIDs
Type: *list:string*
The pick agencyIDs to filter for. Allowed values:
all possible agency IDs.
.. confval:: filter.pick.networkCode
Type: *string*
The pick network code of the processed waveforms.
Allowed values: all possible network codes.
.. note::
**filter.amplitude.\***
*Criteria for filtering amplitudes*
.. confval:: filter.amplitude.operator
Type: *string*
The amplitude comparison operator. Allowed values:
\"eq\", \"lt\", \"gt\" and \"\*\".
.. confval:: filter.amplitude.amplitude
Type: *double*
The amplitude threshold to filter for. The operator
configured with \"operator\" is used to compare this threshold with
the incoming value. If \"operator\" is \"\*\", then all
values will pass.
.. confval:: filter.amplitude.agencyIDs
Type: *list:string*
The amplitude agencyIDs to filter for. Allowed values:
all possible agency ids.
.. note::
**filter.origin.\***
*Criteria for filtering origins*
.. confval:: filter.origin.latitude
Type: *string*
The latitude range in format [min]:[max].
.. confval:: filter.origin.longitude
Type: *string*
The longitude range in format [min]:[max].
.. confval:: filter.origin.depth
Type: *string*
The depth range in format [min]:[max].
.. confval:: filter.origin.agencyIDs
Type: *list:string*
The origin agencyIDs to filter for. Allowed values:
all possible agency IDs.
.. confval:: filter.origin.mode
Type: *string*
The origin evaluation mode to filter for. Allowed values:
\"automatic\" and \"manual\".
.. confval:: filter.origin.status
Type: *string*
The origin status to filter for. Allowed values:
\"preliminary\", \"confirmed\",
\"reviewed\", \"final\",
\"rejected\" and \"reported\".
.. confval:: filter.origin.arrivalcount
Type: *string*
The minimum number of arrivals of an origin to pass
the filter.
.. note::
**filter.event.\***
*Criteria for filtering events*
.. confval:: filter.event.type
Type: *string*
The event type to filter for, e.g. \"earthquake\",
\"explosion\" ...
.. note::
**filter.stationMagnitude.\***
*Criteria for filtering station magnitudes*
.. confval:: filter.stationMagnitude.type
Type: *string*
The station magnitude type. Allowed values: all possible
magnitude types such as \"MLv\".
.. note::
**filter.magnitude.\***
*Criteria for filtering network magnitudes*
.. confval:: filter.magnitude.type
Type: *string*
The magnitude type. Allowed values: all possible
magnitude types such as \"MLv\".
.. note::
**filter.qc.\***
*Criteria for filtering QC parameters*
.. confval:: filter.qc.type
Type: *string*
The QC parameter type. Allowed values: all possible
types such as \"latency\", \"delay\" ...
Command-Line Options
====================
.. program:: scimport
:program:`scimport [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Import
------
.. option:: -o, --sink
Overrides configuration parameter :confval:`sink`.
.. option:: -i, --import
Switch to import mode \(default is relay\). In import mode the
routing table has to be specified whereas in relay mode the
routing table will be calculated automatically.
.. option:: --no-filter
Disable message filtering and ignore all configured filters.
.. option:: --routeunknowngroup
Route unknown groups to the default group IMPORT_GROUP.
.. option:: --ignore-groups
Ignore user specified groups.
.. option:: --test
Do not send any messages.
.. highlight:: rst
.. _scinv:
#####
scinv
#####
**Inventory database synchronisation.**
Description
===========
scinv merges and tests inventory XML files into a single inventory, synchronises
an inventory with another (most commonly with the database), creates initial
key files and much more ...
scinv is used by :file:`$SEISCOMP_ROOT/etc/init/scinv.py` to synchronise the
inventory from :file:`$SEISCOMP_ROOT/etc/inventory` with the database.
.. code-block:: sh
seiscomp update-config inventory
.. hint::
Inventory files in :term:`SCML` format may be generated or modified by
:cite:t:`smp` or :ref:`invextr`. For conversion from FDSN station XML and
dataless SEED volume to :term:`SCML` use :ref:`fdsnxml2inv` and
:ref:`dlsv2inv`, respectively.
Commands
========
scinv works with different commands:
- :ref:`scinv_ls`: List the content of inventories in XML files,
- :ref:`scinv_check`: Merge and test inventories, check the completeness and
consistency of parameters, report any issue,
- :ref:`scinv_merge`: Merge and test inventory files,
- :ref:`scinv_keys`: Merge and test inventories, generate key files or
remove key files without corresponding inventory,
- :ref:`scinv_sync`: Merge and test inventory files, generate or remove key
files, synchronise the inventory with the database and send updates by
notifiers to the messaging for saving to the database,
- :ref:`scinv_apply`: Read and apply notifiers.
The command **must** be given as **1st**
parameter to the application. All others parameters must follow.
.. code-block:: sh
scinv $command [options] [files]
.. _scinv_sync:
sync
----
Synchronises the application's inventory with a source given as file(s).
It checks the consistency of the inventory using :ref:`scinv_check` before
synchronization.
The application's inventory is either read from the database or given with
:option:`--inventory-db`. As a result all information in the source is written
to the target and the target does not contain any additional information. The
source must hold all information. This works differently from merge. If an output
file is specified with :option:`-o`, no notifiers are generated and sent via
messaging.
This command is used by :file:`etc/init/scinv.py` as follows:
.. code-block:: sh
scinv sync --console=1 -H localhost:$p --filebase "$fb" \
--rc-dir "$rc" --key-dir "$kd"
where
.. code-block:: sh
$p = configured messaging port
$fb = $SEISCOMP_ROOT/etc/inventory
$rc = $SEISCOMP_ROOT/var/lib/rc
$kd = $SEISCOMP_ROOT/etc/key
.. _scinv_merge:
merge
-----
Merges two or more inventories into one inventory checking the consistency
of the inventory by using :ref:`scinv_check`. This command is useful to merge
existing subtrees into a final inventory before synchronization.
.. code-block:: sh
scinv merge net1.xml net2.xml -o inv.xml
.. note::
Merging inventory XML files is also supported by :ref:`scxmlmerge` but
without the full :ref:`consistency checks <scinv_check>`.
.. _scinv_apply:
apply
-----
Applies stored notifiers created with **sync** and option ``--create-notifier``
which are saved in a file (``-o``). The source is the application's inventory read
from the database or given with ``--inventory-db``.
If ``-o`` is passed, no messages are sent but the result is stored in a file.
This is useful to test/debug or to prepare an inventory for offline processing.
.. code-block:: sh
# Synchronise inventory and save the notifiers locally. No messages are sent.
scinv sync -d mysql://sysop:sysop@localhost/seiscomp \
--create-notifier -o sync_patch.xml
# Send the notifiers to the target system
scinv apply -H localhost sync_patch.xml
This operation can be useful to save differences in synchronization for
validation or debugging problems.
.. _scinv_keys:
keys
----
Synchronise station key files with current inventory pool. This command merges
all XML files in the inventory pool (or the given files) and checks if a
corresponding station key file in :file:`$SEISCOMP_ROOT/etc/key` exists. If not,
an empty station key file is created. If a station key file without a
corresponding station in the merged inventory is found, it is deleted.
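For example, assuming the default SeisComP directory layout, the key files can be
synchronised with the complete inventory pool as follows (a sketch; adjust the
directories to your setup):

.. code-block:: sh

   scinv keys --filebase $SEISCOMP_ROOT/etc/inventory \
              --key-dir $SEISCOMP_ROOT/etc/key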
.. _scinv_ls:
ls
--
List contained items up to response level. This command is useful to inspect
an XML file or the complete inventory pool.
.. code-block:: sh
$ scinv ls SK.KOLS.xml
network SK Slovak National Network of Seismic Stations
epoch 1980-01-01
station KOLS Kolonicke sedlo, Slovakia
epoch 2004-09-01
location __
epoch 2004-09-01
channel BHE
epoch 2006-04-25 12:00:00 - 2010-03-24
channel BHN
epoch 2006-04-25 12:00:00 - 2010-03-24
channel BHZ
epoch 2006-04-25 12:00:00 - 2010-03-24
channel EHE
epoch 2004-09-01 - 2006-04-25 10:00:00
channel EHN
epoch 2004-09-01 - 2006-04-25 10:00:00
channel EHZ
epoch 2004-09-01 - 2006-04-25 10:00:00
channel HHE
epoch 2006-04-25 12:00:00 - 2010-03-24
channel HHE
epoch 2010-03-25
channel HHN
epoch 2006-04-25 12:00:00 - 2010-03-24
channel HHN
epoch 2010-03-25
channel HHZ
epoch 2006-04-25 12:00:00 - 2010-03-24
channel HHZ
epoch 2010-03-25
The default level of information printed is *chan*. Available levels are *net*,
*sta*, *chan* and *resp*. The output level is controlled by :option:`--level`.
For checking the available networks and stations in the inventory pool, calling
.. code-block:: sh
scinv ls --level sta
is enough.
.. hint::
Stream lists in NSLC format (NET.STA.LOC.CHA) may be generated when combining
with :option:`--nslc`. Such lists can be used as input for filtering
waveforms, e.g., to :ref:`scmssort` or :ref:`scart`.
.. code-block:: sh
$ scinv ls --nslc inventory.xml
IU.WVT.00.BHZ 2017-11-16
IU.XMAS.00.BH1 2018-07-06 20:00:00
.. _scinv_check:
check
-----
Checks consistency of passed inventory files or a complete filebase. In the
first step the inventory is merged from all files. In the second step several
consistency checks are applied such as:
- Overlapping epochs on each level (network, station, ...),
- Valid epochs (start < end),
- Defined gain in a stream,
- Set gain unit,
- Distance of the sensor location to the station location,
- "Invalid" location 0/0.
When inconsistencies or other relevant issues are found, alerts are printed:
- **!**: Error, user must take an action,
- **C**: Conflict, user should take an action,
- **W**: Warning, user should check if an action is required,
- **I**: Information,
- **D**: Debug,
- **R**: Unresolvable, user should check if an action is required,
- **?**: Question.
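A possible invocation is sketched below; the directory and the distance threshold
are only examples:

.. code-block:: sh

   # Merge and check all inventory files in the inventory pool and report
   # sensor locations more than 2 km away from their station coordinates
   scinv check --filebase $SEISCOMP_ROOT/etc/inventory --distance 2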
.. note::
* Default test tolerances are adopted from typical values for global
networks. Consider adjusting :confval:`check.maxDistance`,
:confval:`check.maxElevationDifference` and :confval:`check.maxSensorDepth`
by configuration or command-line options.
* Errors must be resolved and conflicts and warnings should be resolved for
maintaining a correct inventory.
* :ref:`Merging <scinv_merge>` and :ref:`synchronization <scinv_sync>` stop
when errors are found.
The following table lists checks of objects for deficiencies and the test
results.
* This test matrix may be incomplete. Consider adding more tests and results.
* Please report inventory issues not caught by tests to the SeisComP
development team, e.g. on :cite:t:`seiscomp-github`.
.. csv-table::
:widths: 10, 30, 5, 65
:header: Object, Check description, Alert, Comments
:align: left
:delim: ;
network ; start time after end time ; !;
; network without station ; W;
; empty start time ; ; handled by SeisComP inventory reader: network is ignored
; empty station ; W;
; empty code ; W;
station ; start time after end time ; !;
; empty or no start time ; W; station is ignored
; start time after end time ; !;
; empty code ; W;
; empty latitude ; W;
; empty longitude ; W;
; empty elevation ; W;
; elevation > 8900 ; !;
; elevation < -12000 ; !;
; has no sensor location ; W;
sensorLocation; coordinates far away from station; W; :option:`--distance` and :confval:`check.maxDistance` override default threshold (10 km)
; elevation far away from station ; W; :option:`--max-elevation-difference` and :confval:`check.maxElevationDifference` override default threshold (500 m)
; epoch outside network epochs ; C;
; epoch outside station epochs ; C;
; empty or no start time ; W; sensorLocation is ignored
; empty latitude ; W;
; empty longitude ; W;
; elevation > 8900 ; !;
; elevation < -12000 ; !;
; empty or no elevation ; W;
; has no channel/stream ; W;
stream ; empty or no start time ; ; handled by SeisComP inventory reader: stream is ignored
; empty azimuth ; C;
; epoch outside sensorLocation ; C;
; epoch outside station ; C;
; epoch outside network ; C;
; start time after end time ; C;
; missing gain value ; W; empty value is handled by SeisComP inventory reader
; gain value = 0 ; W;
; gain < 0 and dip > 0 ; W; may result in unexpected behavior, consider positive gain and negative dip
; missing gain unit ; W; empty value is handled by SeisComP inventory reader
; missing gain frequency ; ; empty value is handled by SeisComP inventory reader
; missing sampling rate ; ; empty value is handled by SeisComP inventory reader
; missing depth ; W; empty value is handled by SeisComP inventory reader
; missing azimuth ; W;
; missing dip ; W;
; empty azimuth ; ; handled by SeisComP inventory reader
; empty dip ; ; handled by SeisComP inventory reader
; large depth ; W; :option:`--max-sensor-depth` and :confval:`check.maxSensorDepth` override default threshold (500 m)
; empty sensor ID ; I;
; sensor is unavailable ; R;
; empty data logger ID ; I;
; data logger is unavailable ; R;
; 2 or more than 3 streams exist ; I;
; 3C streams are not orthogonal ; W; differences <= 5 degree are tolerated, applies to seismic sensors with codes G, H, L, N
sensor ; referenced response not available; R;
data logger ; referenced response not available; R;
.. _scinv_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scinv.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scinv.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scinv.cfg`
scinv inherits :ref:`global options<global-configuration>`.
.. confval:: syncKeys
Default: ``true``
Type: *boolean*
Synchronise key files.
.. confval:: purgeKeys
Default: ``true``
Type: *boolean*
Delete key files if a station does not exist in inventory.
.. note::
**check.\***
*Quantities probed when using the check command.*
.. confval:: check.maxDistance
Default: ``10``
Type: *double*
Unit: *km*
Maximum allowed distance between station and sensor location
coordinates.
.. confval:: check.maxElevationDifference
Default: ``500``
Type: *double*
Unit: *m*
Maximum allowed differences between elevation of station and
sensor location.
.. confval:: check.maxSensorDepth
Default: ``500``
Type: *double*
Unit: *m*
Maximum allowed depth of channel \(sensor\). This is the depth
of the sensor below the surface.
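For dense local networks the tolerances may be tightened in :file:`scinv.cfg`.
The values below are only illustrative:

.. code-block:: properties

   check.maxDistance = 1
   check.maxElevationDifference = 100
   check.maxSensorDepth = 10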
Command-Line Options
====================
.. program:: scinv
:program:`scinv command [options] [files]`
Command is one of: sync, merge, apply, keys, ls and check.
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Manager
-------
.. option:: --filebase dir
Directory to check for inventory XML files. If not given,
all XML files passed are checked.
.. option:: --rc-dir dir
If given, rc \(resource\) files will be created in this
directory for each station. The station descriptions will be
from the last available epoch.
.. option:: --key-dir dir
The directory to synchronise key files to. If not given,
\@SYSTEMCONFIGDIR\@\/key is assumed.
.. option:: -o, --output file
Output file for writing inventory XML after merging.
.. option:: --purge-keys
\(default\) Delete key files if a station does not exist in
inventory.
.. option:: --no-purge-keys
Do not delete key files if a station does not exist in
inventory.
Check
-----
.. option:: --distance
Maximum allowed distance between station and location
coordinates when using the check command.
.. option:: --max-elevation-difference
Maximum allowed difference in elevation
between station and sensorlocation in m. Larger differences
will be reported.
.. option:: --max-sensor-depth
Maximum allowed depth of channel \(sensor\). This is the depth
of the sensor below the surface in m. Larger depths will be
reported.
List
----
.. option:: --compact
Enable compact output for ls: each object one line.
.. option:: --level int
Information level reported by ls. One of \"net\", \"sta\",
\"cha\" or \"resp\". Default is \"cha\".
.. option:: --nslc
Enable NSLC output for ls as NET.STA.LOC.CHA. The option
implies level \= cha.
Merge
-----
.. option:: --strip
Remove unreferenced objects \(data loggers, sensors, ...\).
Sync
----
.. option:: --create-notifier
If an output file is given, then all notifiers will be saved
and not the result set itself.
.. option:: --no-keys
Do not synchronise key files.
.. option:: --no-rc
Do not synchronise rc files.
.. option:: --test
Do not send any notifiers and just output resulting
operations and conflicts.
.. highlight:: rst
.. _scm:
###
scm
###
**Process monitor.**
Description
===========
scm monitors client activity. scm connects to a certain master and periodically
processes the status messages sent by the clients.
Each client status is forwarded to the plugins loaded by scm. By default
the :ref:`mncursesplugin <scm_ncurses>` is loaded which presents an interface
similar to the GNU program top.
Filters
=======
Plugins might support filtering client status information. To configure filters
each plugin supports a configuration value :confval:`$name.filter`. This filter
is a string which can be constructed from available status info tags and logical
and numerical operators.
List of tags:
.. code-block:: sh
time
privategroup
hostname
clientname
ips
programname
pid
cpuusage
totalmemory
clientmemoryusage
memoryusage
sentmessages
receivedmessages
messagequeuesize
summedmessagequeuesize
averagemessagequeuesize
summedmessagesize
averagemessagesize
objectcount
uptime
responsetime
A filter might look like this:
.. code-block:: sh
memailplugin.filter = "(cpuusage>100 || totalmemory>1000) && hostname==proc-machine"
Numerical operators
-------------------
Numerical operators are applied to a tag name and a constant value.
======== =================
Operator Description
======== =================
== equal
!= not equal
< less than
> greater than
<= less or equal
>= greater or equal
======== =================
Logical operators
-----------------
Logical operators are applied to a group (might be enclosed in brackets) or
numerical expressions.
======== =================
Operator Description
======== =================
! not
&& and
|| or
======== =================
Multiple instances
==================
To monitor different client sets with different criteria and different plugins
it is common practice to create aliases of scm and to configure each instance
separately:
.. code-block:: sh
seiscomp alias create scm_level1 scm
seiscomp alias create scm_level2 scm
where :program:`scm_level1` could monitor all mandatory clients whereas
:program:`scm_level2` monitors all clients which are not crucial for operation.
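A possible split of the two configurations is sketched below; the plugin and
client lists are only examples:

.. code-block:: sh

   # scm_level1.cfg: watch the mandatory clients and send email notifications
   plugins = ${plugins}, memailplugin
   memailplugin.requiredClients = scautopick, scautoloc, scevent
   memailplugin.sendEmail = true

   # scm_level2.cfg: only write status text files for the remaining clients
   plugins = ${plugins}, mtextplugin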
Plugins
=======
* :ref:`email <scm_email>`
Email plugin for scm which sends emails based on client status.
* :ref:`text <scm_text>`
Text output plugin for scm.
* :ref:`ncurses <scm_ncurses>`
Ncurses output plugin for scm which presents an interactive table of processes.
.. _scm_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scm.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scm.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scm.cfg`
scm inherits :ref:`global options<global-configuration>`.
.. _scm/email:
email extension
---------------
Email plugin for scm which sends emails based on client status.
.. confval:: memailplugin.recipients
Type: *list:string*
Defines a comma separated list of email addresses to send
notifications to.
.. confval:: memailplugin.template
Type: *string*
Configures a custom message text that is appended to each message
when clients passed the filter.
.. confval:: memailplugin.filter
Type: *string*
Defines the filter for each client status. A filter is an expression
that can be constructed with all available status tags \(scm \-\-print\-tags\)
and logical and numerical operators. See scm for more information.
.. confval:: memailplugin.requiredClients
Type: *list:string*
*No description available*
.. confval:: memailplugin.reportSilentClients
Default: ``true``
Type: *boolean*
*No description available*
.. confval:: memailplugin.reportSilentClientsTimeSpan
Default: ``1``
Type: *double*
Unit: *min*
*No description available*
.. confval:: memailplugin.reportRequiredClients
Default: ``10``
Type: *double*
Unit: *min*
*No description available*
.. confval:: memailplugin.filterMeanInterval
Default: ``10``
Type: *double*
Unit: *min*
*No description available*
.. confval:: memailplugin.sendEmail
Default: ``false``
Type: *boolean*
Enables sending of emails using mailx shell command.
.. _scm/text:
text extension
--------------
Text output plugin for scm.
.. confval:: mtextplugin.outputDir
Default: ``@LOGDIR@/scm/``
Type: *string*
Output directory where [client].txt is written to. Additionally
an file description.txt will be created to show the order of
tags used in the client status file.
Command-Line Options
====================
.. program:: scm
:program:`scm [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Monitor
-------
.. option:: -c, --clients list
Comma separated list of clients to monitor.
.. option:: --print-tags
Print available keys for accessing client info data and to
build filter configurations.
.. option:: --no-output-plugins
Do not use output plugins such as mncursesplugin.
.. _scm_email:
#####
email
#####
Email plugin for scm which sends emails based on client status.
Description
===========
The email plugin sends emails to configured recipients if a client status
message passes the configured :confval:`filter <memailplugin.filter>`.
Plugin
======
The email plugin is installed under :file:`share/plugins/monitor/memailplugin.so`.
To add the plugin to :ref:`scm`, add it to the plugin list:
.. code-block:: sh
plugins = ${plugins}, memailplugin
Examples
========
An example configuration looks like this:
.. code-block:: sh
# Send a notification if a client's CPU usage exceeds 100 percent
memailplugin.filter = "cpuusage>100"
# Send emails, yes
memailplugin.sendEmail = true
# Send emails to this address(es)
memailplugin.recipients = operator@my-agency.org, operator2@my-agency.org
memailplugin.reportSilentClients = false
# Minutes before reporting missing clients
memailplugin.reportRequiredClients = 1
# Interval to calculate mean of the message values for (in minutes)
memailplugin.filterMeanInterval = 1
# List of clients we definitely require to be operative
memailplugin.requiredClients = scautopick, scautoloc, scevent, scamp,\
scmag, scqc, scevtlog
.. _scm_email_configuration:
Module Configuration
====================
.. confval:: memailplugin.recipients
Type: *list:string*
Defines a comma separated list of email addresses to send
notifications to.
.. confval:: memailplugin.template
Type: *string*
Configures a custom message text that is appended to each message
when clients passed the filter.
.. confval:: memailplugin.filter
Type: *string*
Defines the filter for each client status. A filter is an expression
that can be constructed with all available status tags \(scm \-\-print\-tags\)
and logical and numerical operators. See scm for more information.
.. confval:: memailplugin.requiredClients
Type: *list:string*
*No description available*
.. confval:: memailplugin.reportSilentClients
Default: ``true``
Type: *boolean*
*No description available*
.. confval:: memailplugin.reportSilentClientsTimeSpan
Default: ``1``
Type: *double*
Unit: *min*
*No description available*
.. confval:: memailplugin.reportRequiredClients
Default: ``10``
Type: *double*
Unit: *min*
*No description available*
.. confval:: memailplugin.filterMeanInterval
Default: ``10``
Type: *double*
Unit: *min*
*No description available*
.. confval:: memailplugin.sendEmail
Default: ``false``
Type: *boolean*
Enables sending of emails using mailx shell command.
.. _scm_ncurses:
#######
ncurses
#######
Ncurses output plugin for scm which presents an interactive table of
processes.
Description
===========
The output is a table where each row contains the information of a
certain connected client. The rows are sorted in descending order based on the
contents of the name column. Other columns can be interactively selected with
the left and right arrow keys. The r key changes the sorting to ascending order.
The displayed parameters are the name of the binary (prog), the name of the
client (name), the name of the host from which the client is connected
(host), the available memory on the client's host in kb (hmem), the client's
memory usage in kb (cmem), the percentage of the client's memory usage (mem),
the CPU usage (cpu), the number of queued messages to be processed by the
client (q), the average number of messages in the queue (mq), the connection
time of the client to the master (uptime) and the elapsed time since the last
client update (resp).
.. figure:: media/scm_curses.png
top like perspective of ncurses plugin
Plugin
======
The Ncurses plugin is installed under :file:`share/plugins/monitor/mncursesplugin.so`.
To add the plugin to :ref:`scm`, add it to the plugin list:
.. code-block:: sh
plugins = ${plugins}, mncursesplugin
.. _scm_text:
####
text
####
Text output plugin for scm.
Description
===========
The text plugin writes for each client a status text file to a
:confval:`configurable <mtextplugin.outputDir>` directory. Each text file
is named after the client with the extension ".txt".
Plugin
======
The text plugin is installed under :file:`share/plugins/monitor/mtextplugin.so`.
To add the plugin to :ref:`scm`, add it to the plugin list:
.. code-block:: sh
plugins = ${plugins}, mtextplugin
.. _scm_text_configuration:
Module Configuration
====================
.. confval:: mtextplugin.outputDir
Default: ``@LOGDIR@/scm/``
Type: *string*
Output directory where [client].txt is written to. Additionally
a file description.txt will be created to show the order of
tags used in the client status file.
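For example, to load the plugin and write the status files to a custom directory
(the path is only an example):

.. code-block:: sh

   plugins = ${plugins}, mtextplugin
   mtextplugin.outputDir = /home/sysop/monitoring/scm/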
.. highlight:: rst
.. _scmag:
#####
scmag
#####
**Calculates magnitudes of different types.**
Description
===========
The purpose of scmag is to compute magnitudes from pre-computed amplitudes.
It takes amplitudes and origins as input and produces StationMagnitudes
and (network) Magnitudes as output.
The resulting magnitudes are sent to the "MAGNITUDE" group. scmag does not access
any waveforms. It only uses amplitudes previously calculated.
The purpose of scmag is the decoupling of magnitude computation from amplitude
measurements. This allows several modules to generate amplitudes concurrently,
like :ref:`scautopick` or :ref:`scamp`. As soon as an origin comes in, the amplitudes related
to the picks are taken either from the memory buffer or the database to compute
the magnitudes.
Relationship between amplitudes and origins
-------------------------------------------
scmag makes use of the fact that origins sent by :ref:`scautoloc`, :ref:`scolv`
or other modules include
the complete set of arrivals, which reference picks used for origin computation.
The picks in turn are referenced by a number of amplitudes, some of which are
relevant for magnitude computation.
Read the :ref:`scamp` documentation for more details on amplitude measurements.
.. _scmag-primaryM:
Primary magnitudes
------------------
Primary magnitudes are computed from amplitudes and station-event distances.
Currently the following primary magnitude types are implemented.
Local distances
---------------
:term:`Md <magnitude, duration (Md)>`
Duration magnitude as described in HYPOINVERSE (:cite:t:`klein-2002`).
:term:`Mjma <magnitude, JMA (M_JMA)>`
Mjma is computed on displacement data using body waves of period < 30s.
:term:`ML <magnitude, local (ML)>`
Local (Richter) magnitude calculated on the horizontal components using a
correction term to fit with the standard ML (:cite:t:`richter-1935`).
:term:`MLc <magnitude, local custom (MLc)>`
Local custom magnitude calculated on the horizontal components according to
Hessian Earthquake Service and :cite:t:`stange-2006`
:term:`MLh <magnitude, local horizontal (MLh)>`
Local magnitude calculated on the horizontal components according to SED
specifications.
:term:`MLv <magnitude, local vertical (MLv)>`
Local magnitude calculated on the vertical component using a correction term
to fit with the standard ML.
:term:`MLr <magnitude, local GNS/GEONET (MLr)>`
Local magnitude calculated from MLv amplitudes based on GNS/GEONET specifications
for New Zealand (:cite:t:`ristau-2016`).
:term:`MN <magnitude, Nuttli (MN)>`
Nuttli magnitude for Canada and other Cratonic regions (:cite:t:`nuttli-1973`).
Teleseismic distances
---------------------
:term:`mb <magnitude, body-wave (mb)>`
Narrow band body wave magnitude measured on a WWSSN-SP filtered trace
:term:`mBc <magnitude, cumulative body-wave (mBc)>`
Cumulative body wave magnitude
:term:`mB <magnitude, broadband body-wave (mB)>`
Broad band body wave magnitude after :cite:t:`bormann-2008`
:term:`Mwp <magnitude, broadband P-wave moment (Mwp)>`
The body wave magnitude of :cite:t:`tsuboi-1995`
:term:`Ms_20 <magnitude, surface wave (Ms_20)>`
Surface-wave magnitude at 20 s period
:term:`Ms(BB) <magnitude, broadband surface wave (Ms(BB))>`
Broad band surface-wave magnitude
Derived magnitudes
------------------
Additionally, scmag derives the following magnitudes from primary magnitudes:
:term:`Mw(mB) <magnitude, derived mB (Mw(mB))>`
Estimation of the moment magnitude Mw based on mB using the Mw vs. mB
regression of :cite:t:`bormann-2008`
:term:`Mw(Mwp) <magnitude, derived Mwp (Mw(Mwp))>`
Estimation of the moment magnitude Mw based on Mwp using the Mw vs. Mwp
regression of :cite:t:`whitmore-2002`
:term:`M <magnitude, summary (M)>`
Summary magnitude, which consists of a weighted average of the individual
magnitudes and attempts to be a best possible compromise between all magnitudes.
See below for configuration and also scevent for how to add the summary magnitude
to the list of possible preferred magnitudes or how to make it always preferred.
More details are given in the :ref:`section Summary magnitude<scmag-summaryM>`.
Mw(avg)
Estimation of the moment magnitude Mw based on a weighted average of other
magnitudes, currently MLv, mb and Mw(mB), in future possibly other magnitudes as
well, especially those suitable for very large events. The purpose of Mw(avg) is
to have, at any stage during the processing, a “best possible” estimation of the
magnitude by combining all available magnitudes into a single, weighted average.
Initially the average will consist of only MLv and/or mb measurements, but as soon
as Mw(mB) measurements become available, these (and in future other large-event
magnitudes) receive progressively more weight in the average.
If an amplitude is updated, the corresponding magnitude is updated as well.
This allows the computation of preliminary, real-time magnitudes even before
the full length of the P coda is available.
.. _scmag-stationM:
Station magnitudes
==================
Station magnitudes of a :ref:`particular magnitude type <scmag-primaryM>` are
calculated based on measured amplitudes considered by this magnitude type and
the distance between the :term:`origin` and the station at which the amplitude
was measured. Typically, epicentral distance is used for distance. Magnitudes
may support configurable distance measures, e.g.,
:term:`MLc <magnitude, local custom (MLc)>`. The relation between measured
amplitudes, distance and station magnitude is given by a calibration function
which is specific to a magnitude type and configurable for some magnitudes.
.. note::
Usually station magnitudes use amplitudes of the same type. However, some magnitudes
consider amplitudes of another type. E.g. :term:`MLr <magnitude, local GNS/GEONET (MLr)>`
uses amplitudes computed for :term:`MLv <magnitude, local vertical (MLv)>`.
Regionalization
---------------
Depending on the geographic region in which events, stations or entire ray paths
are located, different calibration functions and constraints may apply. This is
called "magnitude regionalization". The region is defined by a polygon stored in
a region file. For a particular magnitude, regionalization can be configured by
global parameters, e.g., in :file:`$SEISCOMP_ROOT/etc/global.cfg`.
#. Add magnitude type profile to the magnitudes parameters. The name of the
profile must be the name of the magnitude type.
#. Add the profile-specific parameters.
Example for MLc in :file:`$SEISCOMP_ROOT/etc/global.cfg` using the polygon named
*test* defined in a :ref:`BNA file <sec-gui_layers-vector>`:
.. code-block:: properties
magnitudes.MLc.regionFile = @DATADIR@/spatial/vector/magnitudes/regions.bna
magnitudes.MLc.region.test.enable = true
magnitudes.MLc.region.test.A0.logA0 = 0:-1.3, 60:-2.8, 100:-3.0, 400:-4.5, 1000:-5.85
.. _scmag-networkM:
Network magnitudes
==================
The network magnitude is a magnitude value summarizing several
:ref:`station magnitude <scmag-stationM>` values of one :term:`origin`.
Different methods are available for forming network magnitudes from station
magnitudes:
.. csv-table::
:header: Method, Description
:widths: 20 80
:align: left
:delim: ;
mean; The usual mean value.
trimmed mean value; To stabilize the network magnitudes the smallest and the largest 12.5% of the :term:`station magnitude` values are removed before computing the mean.
median; The usual median value.
median trimmed mean; Removing all station magnitudes with a distance greater than 0.5 (default) from the median of all station magnitudes and computing the mean of all remaining station magnitudes.
Configure the method per magnitude type by :confval:`magnitudes.average`.
If not configured, the default method defined by each magnitude type itself
applies.
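For illustration, a minimal configuration sketch in :file:`scmag.cfg` setting the
averaging method per magnitude type (the chosen types and methods are only examples):

.. code-block:: properties

   # use the default method for all types but the median for MLv
   magnitudes.average = default, MLv:median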
In the :ref:`scolv Magnitudes tab <scolv-sec-magnitude-tab>` the methods, the
station magnitudes and other parameters can be selected interactively.
.. _scmag-summaryM:
Summary magnitude
=================
scmag can compute a summary magnitude as a weighted average from all available
:ref:`network magnitudes <scmag-networkM>`.
This magnitude is typically called **M** as configured in
:confval:`summaryMagnitude.type`.
It is computed as a weighted average over the available magnitudes:
.. math::
M &= \frac{\sum w_{i} * M_{i}}{\sum w_i} \\
w_{i} &= a_i * stationCount(M_{i}) + b_i
The coefficients a and b can be configured per magnitude type by
:confval:`summaryMagnitude.coefficients.a`
and :confval:`summaryMagnitude.coefficients.b`, respectively.
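As an illustration, assuming the default coefficients given in the module
configuration below, an MLv network magnitude with 10 contributing station
magnitudes receives the weight :math:`w = 0 \cdot 10 + 2 = 2` and a Mw(mB) network
magnitude with 6 contributing station magnitudes receives
:math:`w = 0.4 \cdot 6 - 1 = 1.4`, resulting in

.. math::

   M = \frac{2 \cdot MLv + 1.4 \cdot Mw(mB)}{2 + 1.4}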
Furthermore each magnitude type can be specifically added to or excluded from the
summary magnitude calculation
as defined in :confval:`summaryMagnitude.whitelist` or
:confval:`summaryMagnitude.blacklist`, respectively.
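A minimal configuration sketch in :file:`scmag.cfg` (the listed types and values
are only examples, see the module configuration parameters below):

.. code-block:: properties

   summaryMagnitude.enabled = true
   summaryMagnitude.type = M
   # require at least 4 station magnitudes per contributing network magnitude
   summaryMagnitude.minStationCount = 4
   # exclude some types from the summary magnitude
   summaryMagnitude.blacklist = mB, Mwp
   # weight = a * stationCount + b
   summaryMagnitude.coefficients.a = 0, Mw(mB):0.4
   summaryMagnitude.coefficients.b = 1, MLv:2, Mw(mB):-1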
.. note::
While the magnitudes are computed by scmag the decision about the preferred
magnitude of an :term:`event` is made by :ref:`scevent`.
Preferred Magnitude
===================
The preferred magnitude of an :term:`event` is set automatically by :ref:`scevent`
or interactively in :ref:`scolv`. It can be any network magnitude or the summary
magnitude.
.. _scmag_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scmag.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scmag.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scmag.cfg`
scmag inherits :ref:`global options<global-configuration>`.
.. confval:: magnitudes
Default: ``MLv,mb,mB,Mwp``
Type: *list:string*
The magnitude types to be calculated. Station magnitudes are
computed from their amplitudes, network magnitudes from their
station magnitudes.
.. confval:: minimumArrivalWeight
Default: ``0.5``
Type: *double*
The minimum weight of an arrival for an associated amplitude
to be used for calculating a magnitude.
.. note::
**magnitudes.\***
*General parameters for computing magnitudes. Others are configured*
*by global binding parameters for specific magnitude types.*
.. confval:: magnitudes.average
Default: ``default``
Type: *list:string*
The methods for computing the network magnitude
from station magnitudes. Exactly one method per
magnitude can be configured.
To define the averaging method per magnitude type append
the type after colon, e.g.:
\"magnitudes.average \= default, MLv:median\"
default: Compute the mean if less than 4 contributed
station magnitudes exist. Otherwise apply trimmedMean\(25\),
trimmed mean with 25%.
.. confval:: connection.sendInterval
Default: ``1``
Type: *int*
Unit: *s*
Interval between 2 sending processes. The interval controls
how often information is updated.
.. note::
**summaryMagnitude.\***
*The summary magnitude is the weighted average from all*
*defined network magnitude types: Single network magnitude values*
*are multiplied with their magnitude-type specific weight and*
*summed up. The resulting sum is divided by the sum of all weights.*
.. confval:: summaryMagnitude.enabled
Default: ``true``
Type: *boolean*
Enables summary magnitude calculation.
.. confval:: summaryMagnitude.type
Default: ``M``
Type: *string*
Define the type\/name of the summary magnitude.
.. confval:: summaryMagnitude.minStationCount
Default: ``1``
Type: *int*
This is the minimum number of station magnitudes required for any
network magnitude to contribute to the summary magnitude at all. If
this is set to 4, then no magnitude with fewer than 4 station
magnitudes is taken into consideration even if this results
in no summary magnitude at all. For this reason, the default
here is 1, but in a purely automatic system it should be
higher; at least 4 is recommended.
.. confval:: summaryMagnitude.singleton
Default: ``true``
Type: *boolean*
Allow computing the summary magnitude even if only one single
network magnitude meeting the other criteria is available.
Unselecting this parameter will suppress computing summary
magnitudes if only one network magnitude is available.
.. confval:: summaryMagnitude.blacklist
Type: *list:string*
Define the magnitude types to be excluded from the summary
magnitude calculation.
.. confval:: summaryMagnitude.whitelist
Type: *list:string*
Define the magnitude types to be included in the summary
magnitude calculation.
.. note::
**summaryMagnitude.coefficients.\***
*The coefficients defining the weight of network magnitudes*
*for calculating the summary magnitude.*
*Weight = a * magnitudeStationCount + b.*
.. confval:: summaryMagnitude.coefficients.a
Default: ``0, Mw(mB):0.4, Mw(Mwp):0.4``
Type: *list:string*
Define the coefficients a. To define the value per magnitude
type append the type after colon. A value without a
type defines the default value.
.. confval:: summaryMagnitude.coefficients.b
Default: ``1, MLv:2, Mw(mB):-1, Mw(Mwp):-1``
Type: *list:string*
Define the coefficients b. To define the value per magnitude
type append the type after colon. A value without a
type defines the default value.
Command-Line Options
====================
.. program:: scmag
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
.. option:: -x, --expiry time
Time span in hours after which objects expire.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Input
-----
.. option:: --ep file
Defines an event parameters XML file to be read and processed. This
implies offline mode and only processes all origins contained
in that file. It computes station magnitudes for all picks associated
with an origin where amplitudes are available and the corresponding
network magnitudes. Station and network magnitudes having the
evaluation status set are ignored. Use \-\-reprocess to include those
magnitudes. It outputs an XML text adding the station\-
and network magnitudes to the input XML file.
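As a sketch of such an offline run (file names are hypothetical; depending on the
setup, inventory and bindings may have to be provided, e.g. by the database option):

.. code-block:: sh

   scmag --ep origins_with_amplitudes.xml -d mysql://sysop:sysop@localhost/seiscomp > origins_with_magnitudes.xml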
.. option:: --reprocess
Reprocess also station and network magnitudes with an evaluation
status set but do not change original weights. New
contributions are added with weight 0.
Reprocess
---------
.. option:: --static
With that flag all existing station magnitudes are recomputed
based on their associated amplitudes. If an amplitude cannot
be accessed, no station magnitude is updated.
Network magnitudes are recomputed based on their station
magnitude contributions. No new objects will
be created in this mode, it only updates values and weights.
The method to accumulate the station magnitudes to form the network
magnitude will be read from the existing object and replicated.
If it cannot be interpreted, then the configured default for this
magnitude type will be used instead. Weights of station magnitudes
will be changed according to the accumulation method of the
network magnitude.
.. option:: --keep-weights
Keep the original weights in combination with \-\-static.

.. highlight:: rst
.. _scmapcut:
########
scmapcut
########
**Create image files containing maps of specific regions.**
Description
===========
*scmapcut* is a commandline tool to create image files containing maps of specific
regions and for selected events. When plotting events given by their eventID, the
event parameters must be provided in a SeisComP event XML file. The XML file can
be retrieved from the database using :ref:`scxmldump`.
Examples
========
1. Draw a map for the event with event ID <eventID>. Plot a region of at least
3 degrees around the epicentre. The created image has 800x400 px.
.. code-block:: sh
scmapcut -E <eventID> --ep <eventID>.xml -m 3 -d 800x400 -o <eventID>.png
.. _fig-workflow:
.. figure:: media/gempa2017xxxx.png
:align: center
:width: 10cm
Image example.
#. Draw a map for a generic event with magnitude 4. The size of the event shown
on the map scales with magnitude. Plot a region of at least 3 degrees around
the epicentre. The created image has 800x400 px.
.. code-block:: sh
scmapcut --lat 44 --lon 12 --depth 10 --mag 4 -m 0.5 -d 800x400 -o generic.png
.. _fig-workflow_mag4:
.. figure:: media/generic.png
:align: center
:width: 10cm
Generic example.
.. _scmapcut_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scmapcut.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scmapcut.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scmapcut.cfg`
scmapcut inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scmapcut
:program:`scmapcut [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Cities
------
.. option:: --city-xml arg
Path to the cities.xml file. If undefined, the data is
read from CONFIGDIR\@\/cities.xml or \@DATADIR\@\/cities.xml.
Options
-------
.. option:: -r, --region arg
Cut region \([lat_dim]x[lon_dim]+lat0+lon0 or +lat0+lon+lat1+lon1\).
.. option:: -m, --margin arg
Margin in degrees around origin \(margin\|margin_latxmargin_lon\).
.. option:: -d, --dimension arg
Output image dimension \(wxh\).
.. option:: -o, --output arg
Output image, file name.
.. option:: --lat arg
Latitude of symbol.
.. option:: --long arg
Longitude of symbol.
.. option:: --depth arg
Depth of event.
.. option:: --mag arg
Magnitude of event.
.. option:: --layers arg
Draw polygonal layers.
.. option:: --ep arg
Name of XML file containing the event parameters to load.
.. option:: -E, --event-id arg
Event ID of the event to plot on map.
.. option:: --html-area
Print html\/area section.
.. option:: --without-arrivals
Do not render arrivals \(stations\).

.. highlight:: rst
.. _scmaster:
########
scmaster
########
**The messaging system**
Description
===========
scmaster is the implementation of the :ref:`messaging <concepts_messaging>`
mediator.
.. _section-scmaster-groups:
Message Groups
==============
scmaster provides the :ref:`message groups <messaging-groups>`. Configure
* :confval:`defaultGroups`: Add the groups which can be used by all queues.
* :confval:`queues.$name.groups`: Set all groups which are used by the given
queue. You may inherit :confval:`defaultGroups`, e.g.: ::
queues.production.groups = ${defaultGroups},L1PICK
.. warning ::
Setting any value without inheriting :confval:`defaultGroups` ignores all
values of :confval:`defaultGroups`.
Queues
======
scmaster provides *queues* for separating the processing.
Typically, the default queue *production* is used. To add new queues
#. Define a new queue by adding a new profile with some name,
#. Configure the profile parameters :confval:`queues.$name.*`,
#. Register the queue in :confval:`queues`.
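A minimal sketch in :file:`scmaster.cfg` adding a hypothetical queue *l1* next to
the default queues (group and database names are only examples): ::

   queues = production, playback, l1
   queues.l1.groups = ${defaultGroups},L1PICK
   queues.l1.plugins = dbstore
   queues.l1.processors.messages = dbstore
   queues.l1.processors.messages.dbstore.driver = mysql
   queues.l1.processors.messages.dbstore.read = sysop:sysop@localhost/seiscomp
   queues.l1.processors.messages.dbstore.write = sysop:sysop@localhost/seiscomp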
Scheme
======
scmaster provides unsecured and secured connections which are addressed by the
scheme values *scmp* and *scmps*, respectively, in :confval:`connection.server`
when connecting to the messaging.
Read the :ref:`concepts section <messaging-scheme>` for more details. *scmps*
is in use when configuring :confval:`interface.ssl.bind`.
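For example, a client could select the secure scheme in its global configuration
(a sketch only; host, port and queue name are assumptions and the port must match
the one configured in :confval:`interface.ssl.bind`): ::

   connection.server = scmps://computerA:18181/production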
Database Access
===============
scmaster reads from and writes to the database and reports the database connection
to the clients of the messaging system (compare with the :ref:`concepts section <messaging-db>`).
The database is configured per queue.
Single Machine
--------------
When running all |scname| modules on a single machine, the read and write
parameters are typically configured with *localhost* as a *host name*.
Example: ::
queues.production.processors.messages.dbstore.read = sysop:sysop@localhost/seiscomp
queues.production.processors.messages.dbstore.write = sysop:sysop@localhost/seiscomp
Multiple Machines
-----------------
If the clients are located on machines different from the messaging, the
*host name* of the read parameter
must be available on the client machine and the client machine must be able to
connect to the host with its name. If the database is on the same machine as the
messaging, the *host name* of the write connection typically remains *localhost*.
Example for connecting clients on computerB to the messaging on computerA (compare
with the :ref:`concepts section <messaging-distribution>`).
* Configuration of scmaster on computerA: ::
queues.production.processors.messages.dbstore.read = sysop:sysop@computerA/seiscomp
queues.production.processors.messages.dbstore.write = sysop:sysop@localhost/seiscomp
* Global configuration of client on computerB: ::
connection.server = computerA/production
Database Proxy
--------------
scmaster can accept database requests and forward results to clients without
exposing the underlying database. That allows clients to connect to the database
of a particular queue via the Websocket HTTP protocol. No specific database
plugin is required at the client which reduces the complexity of configuration.
Be aware that, because a proxy adds another layer on top of the
actual database connection, the performance is not as high as with direct
database access.
To let scmaster return the proxy address of the database connection, set
.. code::
queues.production.processors.messages.dbstore.proxy = true
in the configuration file.
Access Control
==============
scmaster does not provide any built-in access control to connecting clients.
The only exception is the possibility to verify client certificates against
the server certificate if SSL is enabled.
.. code::
interface.ssl.verifyPeer = true
It is required that the client certificate is signed by the server certificate
otherwise the client connection will be rejected.
.. _scmaster_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scmaster.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scmaster.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scmaster.cfg`
scmaster inherits :ref:`global options<global-configuration>`.
.. confval:: defaultGroups
Default: ``AMPLITUDE, PICK, LOCATION, MAGNITUDE, FOCMECH, EVENT, QC, PUBLICATION, GUI, INVENTORY, CONFIG, LOGGING, SERVICE_REQUEST, SERVICE_PROVIDE, STATUS_GROUP``
Type: *list:string*
The default set of message groups for each queue. Only used
if a queue's group list is unset \(note: empty is not unset\).
.. confval:: queues
Default: ``production, playback``
Type: *list:string*
Enable messaging queues defined as profile in queues. The profile
names are the final queue names.
.. note::
**interface.\***
*Control the messaging interface. The default protocol is*
*"scmp" but "scmps" (secure protocol) is*
*used when valid SSL certificate and key are configured.*
.. confval:: interface.bind
Default: ``0.0.0.0:18180``
Type: *ipbind*
Local bind address and port of the messaging system.
0.0.0.0:18180 accepts connections from all clients,
127.0.0.1:18180 only from localhost.
.. confval:: interface.acl
Type: *list:ipmask*
The IP access control list for clients which are allowed
to connect to the interface. Separate each IP with a space
and put the entire list in double quotes, e.g.
\"127.0.0.1 192.168.1.2 192.168.0.0\/16\".
.. confval:: interface.socketPortReuse
Default: ``true``
Type: *boolean*
SO_REUSEADDR socket option for the TCP listening socket.
.. note::
**interface.ssl.\***
*SSL encryption is used if key and certificate are configured.*
.. confval:: interface.ssl.bind
Default: ``0.0.0.0:-1``
Type: *ipbind*
Additional local bind address and port of the messaging
system in case SSL encryption is active.
.. confval:: interface.ssl.acl
Type: *list:ipmask*
The IP access control list for clients which are allowed
to connect to the interface. See interface.acl for
further details.
.. confval:: interface.ssl.socketPortReuse
Default: ``true``
Type: *boolean*
SO_REUSEADDR socket option for the TCP listening socket.
.. confval:: interface.ssl.key
Type: *path*
.. confval:: interface.ssl.certificate
Type: *path*
.. confval:: interface.ssl.verifyPeer
Default: ``false``
Type: *boolean*
If enabled then the certificate of a connecting client
is verified against the servers certificate. It is
required that the client certificate is signed by the
server certificate otherwise the connection is refused.
.. note::
**queues.\***
*Set the parameters for each messaging queue. The queues are used*
*when listed in the "queues" parameter. Several queues*
*can be used in parallel. For queues without databases leave*
*the processor parameters empty.*
.. note::
**queues.$name.\***
$name is a placeholder for the name to be used and needs to be added to :confval:`queues` to become active.
.. code-block:: sh
queues = a,b
queues.a.value1 = ...
queues.b.value1 = ...
# c is not active because it has not been added
# to the list of queues
queues.c.value1 = ...
.. confval:: queues.$name.groups
Type: *list:string*
Define the list of message groups added to the queue.
If unset, then the defaultGroups will be used.
A queue will always add the default group \"STATUS_GROUP\".
This parameter overrides defaultGroups.
.. confval:: queues.$name.acl
Default: ``0.0.0.0/0``
Type: *list:ipmask*
The IP access control list for clients which are allowed
to join the queue. See interface.acl for further details.
.. confval:: queues.$name.maximumPayloadSize
Default: ``1048576``
Type: *int*
Unit: *B*
The maximum size in bytes of a message to be accepted.
Clients which send larger messages will be disconnected.
The default is 1MB.
.. confval:: queues.$name.plugins
Type: *list:string*
List of plugins required by this queue. This is just a
convenience parameter to improve configurations
readability. The plugins can also be added to the
global list of module plugins.
Example: dbstore
.. confval:: queues.$name.processors.messages
Type: *string*
Interface name. For now, use \"dbstore\" to
use a database.
Use empty for testing or playbacks without a database.
.. note::
**queues.$name.processors.messages.dbstore.\***
*Define the database connection parameters.*
.. confval:: queues.$name.processors.messages.dbstore.driver
Type: *string*
Select the database driver to use.
Database drivers are available through plugins.
The default plugin is dbmysql which supports
the MySQL database server. It is activated
with the core.plugins parameter.
.. confval:: queues.$name.processors.messages.dbstore.read
Type: *string*
Set the database read connection which is
reported to clients that connect to this server.
If a remote setup should be implemented,
ensure that the hostname is reachable from
the remote computer.
.. confval:: queues.$name.processors.messages.dbstore.write
Type: *string*
Set the database write connection which is
private to scmaster.
A separate write connection enables different
permissions on the database level for scmaster
and clients.
.. confval:: queues.$name.processors.messages.dbstore.proxy
Default: ``false``
Type: *boolean*
If enabled then the database connection as configured
in 'read' is not being returned to the client
but the URL \"proxy:\/\/\". This URL
tells the client to open the database via the
websocket proxy at the messaging address,
e.g. http:\/\/localhost\/production\/db. The same
hostname and queue must be used as for the
initial messaging connection.
.. confval:: queues.$name.processors.messages.dbstore.strictVersionMatch
Default: ``true``
Type: *boolean*
If enabled, the plugin will check the database
schema version and refuse to start if the
version doesn't match the latest version.
If disabled and an object needs to be
stored which is incompatible with the
database schema, this object is lost.
Leave this option enabled unless you know
exactly what you are doing and what the
consequences are.
.. confval:: http.filebase
Default: ``@DATADIR@/scmaster/http/``
Type: *path*
The directory served by the http server at staticPath.
.. confval:: http.staticPath
Default: ``/``
Type: *string*
The URL path at which html files and assets are available.
All files under filebase will be served at this URL path.
.. confval:: http.brokerPath
Default: ``/``
Type: *string*
The URL path at which the broker websocket is available.
Command-Line Options
====================
.. program:: scmaster
:program:`scmaster [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Wired
-----
.. option:: --bind arg
The non\-encrypted bind address. Format [ip:]port
.. option:: --sbind arg
The encrypted bind address. Format: [ip:]port

.. highlight:: rst
.. _scmm:
####
scmm
####
**Messaging Monitor**
Description
===========
**scmm** is the messaging monitor, the graphical interface to :ref:`scm`. It allows
viewing the messages from all modules sent around by the SeisComP messaging system.
Therefore, **scmm** can be used to debug configured message groups and module
connections, e.g. in a system with several processing pipelines and specific
message groups.
In addition, **scmm** allows viewing the content of single messages
as well as the memory consumption and other statistics of all modules connected
to the SeisComP messaging system.
.. figure:: media/scmm_messages.png
:width: 8cm
:align: center
View message notifications.
.. figure:: media/scmm_message.png
:width: 8cm
:align: center
View the message content by clicking on individual messages.
.. figure:: media/scmm_clients.png
:width: 8cm
:align: center
View module memory consumptions.
.. figure:: media/scmm_statistics.png
:width: 8cm
:align: center
View module statistics.
.. _scmm_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scmm.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scmm.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scmm.cfg`
scmm inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scmm
:program:`scmm [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
User interface
--------------
.. option:: -F, --full-screen
Start the application in fullscreen
.. option:: -N, --non-interactive
Use non interactive presentation mode

.. highlight:: rst
.. _scmssort:
########
scmssort
########
**Read and manipulate miniSEED records**
Description
===========
scmssort reads unsorted (and possibly multiplexed) miniSEED files and sorts
the individual records by time. This is useful e.g. for simulating data
acquisition and playbacks. Removing duplicate data and trimming the time
window are also supported.
scmssort reads single files and writes to the command line. Cat many files
to read them at the same time. In this way, large amounts of data can be processed
efficiently.
Applications to miniSEED records:
* Sort records by time, e.g., for playbacks.
* Remove duplicate records from files and clean waveform archives.
* Filter data records, i.e. keep or remove them, based on
* time windows,
* stream lists where each line has the format NET.STA.LOC.CHA including regular
expressions. Such stream lists can be generated, e.g., using :ref:`scinv`.
.. hint::
* Combine with :ref:`scart` or :ref:`msrtsimul` to archive data or to make
playbacks with real-time simulations.
* Filter data by stream IDs using NSLC lists which can be generated using
:ref:`scinv`.
Examples
========
#. Read a single miniSEED data file. The records are sorted by endtime and
duplicates are removed.
.. code-block:: sh
scmssort -vuE unsorted.mseed > sorted.mseed
#. Read all files ending with ".mseed" at the same time. The data are trimmed
to a time window and duplicated or empty records are ignored.
.. code-block:: sh
cat *.mseed | scmssort -vuiE -t 2020-03-28T15:48~2020-03-28T16:18 > sorted.mseed
#. Remove streams listed by stream code and sort records by end time. Also ignore
duplicated or empty records. Stream lists can be generated, e.g., by :ref:`scinv`.
.. code-block:: sh
scmssort -vuiE --rm -l stream-list.txt test.mseed > sorted.mseed
#. Extract streams by time and stream code and sort records by end time. Also ignore
duplicated or empty records.
.. code-block:: sh
echo CX.PB01..BH? | scmssort -vuE -t 2007-03-28T15:48~2007-03-28T16:18 -l - test.mseed > sorted.mseed
scmssort -vuiE -t 2007-03-28T15:48~2007-03-28T16:18 -l stream-list.txt test.mseed > sorted.mseed
Command-Line Options
====================
.. program:: scmssort
:program:`scmssort [options] input`
Options
-------
.. option:: -h, --help
Show the command\-line help and exit.
.. option:: -E, --sort-by-end-time
Sort by end time.
.. option:: -i, --ignore
Ignore all records which have no data samples.
.. option:: -l, --list
File with streams to filter the records. One stream per
line. Instead of a file, read from stdin \(\-\). Line
format: NET.STA.LOC.CHA \- wildcards and regular expressions
are considered. Example: CX.\*..BH? .
NSLC lists may be generated, e.g., using scinv.
.. option:: -o, --output
Name of output file for miniSEED data \(default is stdout\).
.. option:: -r, --rm
Remove all traces in stream list given by \-\-list
instead of keeping them.
.. option:: -t, --time-window
Specify time window \(as one \-properly quoted\- string\).
Times are of course UTC and separated by a tilde \~
Example:
2020\-03\-28T15:48\~2020\-03\-28T16:18
.. option:: -u, --uniqueness
Ensure uniqueness of output, i.e. skip duplicate records.
.. option:: -v, --verbose
Run in verbose mode.

.. highlight:: rst
.. _scmv:
####
scmv
####
**Map view**
Description
===========
scmv visualizes the current information regarding earthquakes, trigger status,
ground motion and station quality. This information is visualized on a map
showing also the topography of the region including bathymetry, state borders
and geologic faults. scmv provides two different tabs with maps showing the
ground motion and the network status.
Ground motion status
--------------------
The ground motion tab (fig. :ref:`fig-scmv-gm`) shows the current ground motion
in nm/s at each station, color-coded. Furthermore, the trigger status (new picks)
of the stations is announced by blinking triangles (the same in all
other tabs). In case a new pick is associated to an event, the blinking color
changes from red to green (default). Events are represented by open circles
and the depth is color-coded. A right mouse click on the symbol of a station
shows more detailed information in an information widget (fig. :ref:`fig-scmv-infowidget`).
.. _fig-scmv-gm:
.. figure:: media/scmv/ground-motion.png
:width: 16cm
:align: center
Ground motion tab
Tab 1-2 = Ground Motion, Network Status tab; Station A = station with ground
motion between 0 and 200 nm/s; Station B = triggering (red blinking) station;
Station C = station of undefined status; Station D = triggering and associated
station (green blinking); EQ = location of the earthquake/event; P = spreading
of the P-wave; S = spreading of the S-wave; Legend = ground motion scale.
Network status
--------------
The network status tab (fig. :ref:`fig-scmv-netstat`) visualizes the quality control
parameters of all shown stations. The colors of the triangles represent the
selected parameters of a station, e.g. latency or delay. More detailed information about station quality
can be obtained by opening an information widget (fig. :ref:`fig-scmv-infowidget` left)
with a right mouse click on the symbol.
.. _fig-scmv-netstat:
.. figure:: media/scmv/netstat.png
:width: 16cm
:align: center
Network status tab
Station A = station with up to 20 sec data latency/delay;
Station B = station with up to 10 min data latency/delay and a warning;
Station C = disabled station;
EQ = location of a recent or historic earthquake/event;
Legend = scale of the delay;
Status = connection status.
The visualized QC parameters can be selected in the QC parameters widget of scmv.
.. _fig-scmv-qc-select:
.. figure:: media/scmv/qc-selection.png
:width: 16cm
:align: center
QC parameters selection widget
The station information widget opens by right mouse click on a station symbol
in one of the scmv maps. The widget gives a more detailed list of parameters.
For a station, general information (coordinates, network code, etc.), quality
parameters (latency, gaps, etc.), amplitude information and the actual waveform
data are provided. For an event, information like the location, the number of
stations and the azimuthal gap is shown.
.. _fig-scmv-infowidget:
.. figure:: media/scmv/infowidget-station.png
:width: 8cm
:align: center
Station information widget
The info widget of a station shows station information like network code
and location. The quality of the station can be checked here. The lower
part shows the last 15 minutes of waveforms including picks (if available).
Event information
-----------------
In addition to the current event, scmv also shows all events it has currently
saved in its session as circles. This depends on the configured time span for
which it remembers events. Showing the events can give a convenient overview of the network activity.
If available, focal mechanisms are displayed with dashed lines connecting to their location.
The circles and focal mechanisms are again colored by their depth and sized by
magnitude.
.. _fig-scmv-historic-origins:
.. figure:: media/scmv/historic-origins.png
:width: 16cm
:align: center
Show historic origins
All historic origins currently saved in scmv are shown as circles.
The events shown on the map can be confined by selecting the mode and the status
of preferred origins in the event selection widget.
.. _fig-scmv-event-select:
.. figure:: media/scmv/event-selection.png
:width: 16cm
:align: center
Event selection widget
The event information widget opens by right mouse click on an event symbol.
It shows event details.
Clicking on the "Show Details" button allows opening the preferred origin of
the event in other GUIs such as scolv.
.. _fig-scmv-event-info:
.. figure:: media/scmv/event-info.png
:width: 8cm
:align: center
Event information widget
Hotkeys
=======
===================== ========================================
Hotkey Description
===================== ========================================
:kbd:`F2` Setup connection dialog
:kbd:`F3` Toggle ray paths and associated stations
:kbd:`F6` Hide propagation of P and S waves
:kbd:`F7` Toggle legend
:kbd:`F8` Toggle historic origins
:kbd:`F9` Toggle station annotation
:kbd:`F10` Toggle event list (event tab)
:kbd:`F11` Toggle full screen mode
:kbd:`CTRL + f` Search station
:kbd:`Arrows` Move focus
Mouse wheel Zoom in or out
Double click Center map
Right mouse button Open info widget
Mid mouse button Set preliminary origin
===================== ========================================
Use cases
=========
Get station information
-----------------------
- Position the mouse above a triangle representing a station.
- Click the right mouse button for opening the station info widget.
- Choose one station in case several stations are in the selection range.
Get event information
---------------------
- Position the mouse above a circle representing a location of an event.
- Click the right mouse button for opening the event info widget.
- Choose one event in case several events are in the selection range.
Set preliminary origin
----------------------
- Position the mouse in the map
- Press the middle mouse button
- Set date & time and latitude, longitude & depth
- Press "Create" to open the origin in another GUI, e.g. scolv.
.. image:: media/scmv/artificial-origin.png
:width: 4cm
Search station/network
----------------------
- Press :kbd:`CTRL + f`
- Type station/network name
- Double click on a station in the list to center the map at this location
.. _scmv_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scmv.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scmv.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scmv.cfg`
scmv inherits :ref:`global options<global-configuration>`.
.. confval:: displaymode
Type: *string*
Start scmv in one of the available display modes:
groundmotion or qualitycontrol
and without tabs and menus as walldisplay.
.. confval:: groundMotionRecordLifeSpan
Default: ``300``
Type: *int*
Unit: *s*
Set the maximum latency in seconds of the ground motion records
to be valid.
.. confval:: removeEventDataOlderThan
Default: ``43200``
Type: *double*
Unit: *s*
Set the time span in seconds to keep events.
.. confval:: readEventsNotOlderThan
Default: ``0``
Type: *double*
Unit: *s*
Set the time span in seconds to read events initially from
database.
.. confval:: centerOrigins
Default: ``false``
Type: *boolean*
If a new origin\/event is set\/selected this option defines if the
map is centered or not at the origin location.
.. confval:: eventActivityLifeSpan
Default: ``900``
Type: *double*
Unit: *s*
Time span for which an event is active after origin time to
show blinking associated stations.
.. confval:: expiredEventsInterval
Default: ``0``
Type: *double*
Unit: *s*
A positive value \(greater than zero\) defines the interval to check
for expired events. A negative or zero value disables the interval
check and expired events are only removed when a new event is declared
or an existing event is updated.
.. confval:: annotations
Default: ``false``
Type: *boolean*
Enable\/disable drawing of station annotations at startup.
.. confval:: annotationsWithChannels
Default: ``true``
Type: *boolean*
Enable\/disable drawing of station annotations with
location\/channel codes.
.. confval:: mapLegendPosition
Default: ``topleft``
Type: *string*
Set the location of the map symbol legend \(QC, ground motion\).
Use either: topleft, topright, bottomright or bottomleft.
.. confval:: eventLegendPosition
Default: ``bottomleft``
Type: *string*
Set the location of the event symbol legend. Use either:
topleft, topright, bottomright or bottomleft.
.. confval:: eventTable.visible
Default: ``false``
Type: *boolean*
Whether to show the event table initially or not.
.. confval:: eventTable.columns
Type: *list:string*
The columns that are visible in the table. If nothing
is specified then all columns are visible. Valid column names are:
\"Event\", \"Origin Time\", \"Magnitude\",
\"Magnitude Type\", \"Region\", \"Latitude\",
\"Longitude\", \"Depth\".
.. confval:: stations.groundMotionFilter
Default: ``"RMHP(50)>>ITAPER(20)>>BW(2,0.04,2)"``
Type: *string*
The filter applied to waveforms for measuring ground motion.
.. note::
**display.\***
*Allow to define an initial rectangular region for the map.*
.. confval:: display.latmin
Default: ``-90``
Type: *double*
Unit: *deg*
Minimum latitude in degrees.
.. confval:: display.lonmin
Default: ``-180``
Type: *double*
Unit: *deg*
Minimum longitude in degrees.
.. confval:: display.latmax
Default: ``90``
Type: *double*
Unit: *deg*
Maximum latitude in degrees.
.. confval:: display.lonmax
Default: ``180``
Type: *double*
Unit: *deg*
Maximum longitude in degrees.
Command-Line Options
====================
.. program:: scmv
:program:`scmv [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
Cities
------
.. option:: --city-xml arg
The path to the cities XML file. This overrides the default
paths. Compare with the global parameter \"citiesXML\".
User interface
--------------
.. option:: -F, --full-screen
Start the application filling the entire screen.
This only works with GUI applications.
.. option:: -N, --non-interactive
Use non\-interactive presentation mode. This only works with
GUI applications.
Mapview
-------
.. option:: --displaymode arg
Start scmv as walldisplay.
Modes: groundmotion, qualitycontrol
.. option:: --with-legend
Show the map legend if started as walldisplay.

.. highlight:: rst
.. _scorgls:
#######
scorgls
#######
**List origin IDs from database.**
Description
===========
*scorgls* lists all available origin IDs within a given time range to stdout.
Origins are fetched from the database or read from a :term:`SCML` file.
Similarly, use :ref:`scevtls` for listing all event IDs. Beyond *scorgls* and
:ref:`scevtls`, :ref:`scquery` can search for parameters based on complex
custom queries.
Examples
========
* Print all origin IDs for the complete year 2012.
.. code-block:: sh
scorgls -d mysql://sysop:sysop@localhost/seiscomp \
--begin "2012-01-01 00:00:00" \
--end "2013-01-01 00:00:00"
* Print the IDs of all origins provided with the XML file:
.. code-block:: sh
scorgls -i origins.xml
.. _scorgls_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scorgls.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scorgls.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scorgls.cfg`
scorgls inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scorgls
:program:`scorgls [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Input
-----
.. option:: -i, --input arg
Name of input XML file. Read from stdin if '\-' is given.
Deactivates reading origins from database.
Origins
-------
.. option:: --begin time
The lower bound of the time interval. Format:
2012\-01\-01T00:00:00.
.. option:: --end time
The upper bound of the time interval. Format:
2012\-01\-01T00:00:00.
Output
------
.. option:: -D, --delimiter string
The delimiter of the resulting origin IDs.

@ -0,0 +1,340 @@
.. highlight:: rst
.. _scquery:
#######
scquery
#######
**Read database objects and write them to the command line.**
Description
===========
*scquery* reads objects such as event information from a
:ref:`SeisComP database <concepts_database>` using custom queries. The results
are written to stdout. The module extends :ref:`scevtls` and :ref:`scorgls`
which are limited to searching event and origin IDs, respectively, by time.
scquery requires :ref:`query profiles <scquery_queries>`
for querying the database. The profiles are defined in
* :file:`@SYSTEMCONFIGDIR@/queries.cfg` or
* :file:`@CONFIGDIR@/queries.cfg`
where parameters in the latter take priority. There are no default query profiles,
hence they must be created first.
Module Setup
============
.. _scquery_config:
#. Create the query profiles in :file:`queries.cfg` in :file:`@SYSTEMCONFIGDIR@`
or :file:`@CONFIGDIR@`. The file contains your database queries. Examples for
MariaDB/MySQL and PostgreSQL are found in the section :ref:`scquery_queries`.
#. **Optional:** Add the database connection parameter to the configuration file
:file:`scquery.cfg` or :file:`global.cfg` in @CONFIGDIR@ or to @SYSTEMCONFIGDIR@:
.. code-block:: properties
database = mysql://sysop:sysop@localhost/seiscomp
.. hint ::
If the database connection is configured, the database option
:confval:`-d <database>` in the section :ref:`Examples<scquery_examples>`
can be omitted or used to override the configuration.
.. _scquery_examples:
Examples
========
Choose any query profile defined in the :ref:`queries.cfg<scquery_queries>`.
Provide the required parameters in the same order as in the database request.
The required parameters are indicated by hashes, e.g. ##latMin##.
#. List all available query profiles using the command-line option
:option:`--showqueries`:
.. code-block:: sh
scquery --showqueries
#. Profile **eventFilter**: Fetch all event IDs and event parameters for events
with magnitude ranging from 2.5 to 5 in central Germany between 2014 and 2017:
.. code-block:: sh
scquery -d localhost/seiscomp eventFilter 50 52 11.5 12.5 2.5 5 2014-01-01 2018-01-01 > events_vogtland.txt
#. Profile **eventByAuthor**: Fetch all event IDs where the preferred origin was
provided by a specific author for events 2.5 to 5 with 6 to 20 phases in central
Germany between 2014 and 2017:
.. code-block:: sh
scquery -d localhost/seiscomp eventByAuthor 50 52 11.5 12.5 6 20 2.5 5 2014-01-01 2018-01-01 scautoloc > events_vogtland.txt
#. Profile **eventType**: Fetch all event IDs and event times from events
with the given event type and within the provided time interval:
.. code-block:: sh
scquery -d localhost/seiscomp eventType explosion '2017-11-01 00:00:00' '2018-11-01 00:00:00'
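#. Debug a profile or run an ad-hoc statement. This is only a sketch using the
   command-line options documented below; the SQL statement given with ``-Q`` is
   an illustration and must match your database schema:

   .. code-block:: sh

      # print the fully assembled SQL of a profile and quit
      scquery --print-query-only eventType explosion '2017-11-01 00:00:00' '2018-11-01 00:00:00'

      # execute a custom query instead of a configured profile
      scquery -d localhost/seiscomp -Q "SELECT COUNT(*) FROM Event;"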
.. _scquery_queries:
Queries
=======
Example queries for :ref:`scquery_mariadb` and :ref:`scquery_psql` are given
below.
.. _scquery_mariadb:
MariaDB/MySQL
-------------
**General event/origin queries**
.. code-block:: properties
queries = eventFilter, eventUncertainty, eventByAuthor, eventWithStationCount, eventType, originByAuthor
query.eventFilter.description = "Returns all events (lat, lon, mag, time) that fall into a certain region and a magnitude range"
query.eventFilter = "SELECT PEvent.publicID, Origin.time_value AS OT, Origin.latitude_value,Origin.longitude_value, Origin.depth_value, Magnitude.magnitude_value, Magnitude.type FROM Origin,PublicObject as POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject as PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID=Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.latitude_value >= ##latMin## AND Origin.latitude_value <= ##latMax## AND Origin.longitude_value >= ##lonMin## AND Origin.longitude_value <= ##lonMax## AND Magnitude.magnitude_value >= ##minMag## AND Magnitude.magnitude_value <= ##maxMag## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.eventUncertainty.description = "Returns all events (eventsIDs, time, lat, lat error, lon, lon error, depth, depth error, magnitude, region name) in the form of an event catalog"
query.eventUncertainty = "SELECT PEvent.publicID, Origin.time_value AS OT, ROUND(Origin.latitude_value, 3), ROUND(Origin.latitude_uncertainty, 3), ROUND(Origin.longitude_value, 3), ROUND(Origin.longitude_uncertainty, 3), ROUND(Origin.depth_value, 3), ROUND(Origin.depth_uncertainty, 3), ROUND(Magnitude.magnitude_value, 1), EventDescription.text FROM Event, PublicObject AS PEvent, EventDescription, Origin, PublicObject AS POrigin, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND Event.preferredOriginID = POrigin.publicID AND Event.preferredMagnitudeID = PMagnitude.publicID AND Event._oid = EventDescription._parent_oid AND EventDescription.type = 'region name' AND Event.type = '##type##' AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.eventByAuthor.description = "Get events by preferred origin author etc"
query.eventByAuthor = "SELECT PEvent.publicID, Origin.time_value AS OT, Origin.latitude_value AS lat,Origin.longitude_value AS lon, Origin.depth_value AS dep, Magnitude.magnitude_value AS mag, Magnitude.type AS mtype, Origin.quality_usedPhaseCount AS phases, Event.type AS type, Event.typeCertainty AS certainty, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.latitude_value >= ##latMin## AND Origin.latitude_value <= ##latMax## AND Origin.longitude_value >= ##lonMin## AND Origin.longitude_value <= ##lonMax## AND Origin.quality_usedPhaseCount >= ##minPhases## AND Origin.quality_usedPhaseCount <= ##maxPhases## AND Magnitude.magnitude_value >= ##minMag## AND Magnitude.magnitude_value <= ##maxMag## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##' AND Origin.creationInfo_author like '##author##';"
query.eventWithStationCount.description = "Get events by preferred origin author etc"
query.eventWithStationCount = "SELECT PEvent.publicID, Origin.time_value AS OT, Origin.latitude_value AS lat, Origin.longitude_value AS lon, Origin.depth_value AS dep, Magnitude.magnitude_value AS mag, Magnitude.type AS mtype, Origin.quality_usedStationCount AS stations, Event.type AS type, Event.typeCertainty AS certainty, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.eventType.description = "Returns all eventIDs FROM event WHERE the type is flagged AS 'event type'"
query.eventType = "SELECT pe.publicID, o.time_value AS OT FROM PublicObject pe, PublicObject po, Event e, Origin o WHERE pe._oid = e._oid AND po._oid = o._oid AND e.preferredOriginID = po.publicID AND e.type = '##type##' AND o.time_value >= '##startTime##' AND o.time_value <= '##endTime##';"
query.originByAuthor.description = "Get origins by author"
query.originByAuthor = "SELECT po.publicID, o.time_value AS OT, o.creationInfo_author FROM PublicObject po JOIN Origin o ON po._oid = o._oid WHERE o.creationInfo_author like '##author##' AND o.time_value >= '##startTime##' AND o.time_value <= '##endTime##';"
**More examples and statistics**
.. code-block:: properties
queries = phaseCountPerAuthor, time, mag_time, space_time, all, space_mag_time, event, fm_space_time, picks, stationPicks, assoc_picks, pref_assoc_picks, sta_net_mag, sta_net_mag_type, delta_sta_net_mag, delta_sta_net_mag_type
query.phaseCountPerAuthor.description = "Get phase count per origin author FROM event #EventID#"
query.phaseCountPerAuthor = "SELECT PEvent.publicID, Origin.creationInfo_author, MAX(Origin.quality_usedPhaseCount) FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, OriginReference WHERE Origin._oid = POrigin._oid AND Event._oid = PEvent._oid AND OriginReference._parent_oid = Event._oid AND OriginReference.originID = POrigin.publicID AND PEvent.publicID = '##EventID##' group by Origin.creationInfo_author;"
query.time.description = "Events in time range"
query.time = "SELECT PEvent.publicID, Origin.time_value, ROUND(Origin.latitude_value, 4), ROUND(Origin.longitude_value, 4), ROUND(Origin.depth_value, 1), ROUND(Magnitude.magnitude_value, 1), Magnitude.type, Origin.quality_usedPhaseCount, Origin.quality_usedStationCount, Event.typeCertainty, Event.type, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.mag_time.description = "Events in magnitude-time range"
query.mag_time = "SELECT PEvent.publicID, Origin.time_value, ROUND(Origin.latitude_value, 4), ROUND(Origin.longitude_value, 4), ROUND(Origin.depth_value, 1), ROUND(Magnitude.magnitude_value, 1), Magnitude.type, Origin.quality_usedPhaseCount, Origin.quality_usedStationCount, Event.typeCertainty, Event.type, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Magnitude.magnitude_value >= ##minMag## AND Magnitude.magnitude_value <= ##maxMag## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.space_time.description = "Events in space-time range"
query.space_time = "SELECT PEvent.publicID, Origin.time_value, ROUND(Origin.latitude_value, 4), ROUND(Origin.longitude_value, 4), ROUND(Origin.depth_value, 1), ROUND(Magnitude.magnitude_value, 1), Magnitude.type, Origin.quality_usedPhaseCount, Origin.quality_usedStationCount, Event.typeCertainty, Event.type, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.latitude_value >= ##latMin## AND Origin.latitude_value <= ##latMax## AND Origin.longitude_value >= ##lonMin## AND Origin.longitude_value <= ##lonMax## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.all.description = "Events in space-magnitude-time-quality range by author"
query.all = "SELECT PEvent.publicID, Origin.time_value, ROUND(Origin.latitude_value, 4), ROUND(Origin.longitude_value, 4), ROUND(Origin.depth_value, 1), ROUND(Magnitude.magnitude_value, 1), Magnitude.type, Origin.quality_usedPhaseCount, Origin.quality_usedStationCount, Event.typeCertainty, Event.type, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.latitude_value >= ##latMin## AND Origin.latitude_value <= ##latMax## AND Origin.longitude_value >= ##lonMin## AND Origin.longitude_value <= ##lonMax## AND Origin.quality_usedPhaseCount >= ##minPhases## AND Origin.quality_usedPhaseCount <= ##maxPhases## AND Magnitude.magnitude_value >= ##minMag## AND Magnitude.magnitude_value <= ##maxMag## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##' AND Origin.creationInfo_author like '##author##%';"
query.space_mag_time.description = "Events in space-magnitude-time range"
query.space_mag_time = "SELECT PEvent.publicID, Origin.time_value, ROUND(Origin.latitude_value, 4), ROUND(Origin.longitude_value, 4), ROUND(Origin.depth_value, 1), ROUND(Magnitude.magnitude_value, 1), Magnitude.type, Origin.quality_usedPhaseCount, Origin.quality_usedStationCount, Event.typeCertainty, Event.type, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND POrigin.publicID = Event.preferredOriginID AND Origin.latitude_value >= ##latMin## AND Origin.latitude_value <= ##latMax## AND Origin.longitude_value >= ##lonMin## AND Origin.longitude_value <= ##lonMax## AND Magnitude.magnitude_value >= ##minMag## AND Magnitude.magnitude_value <= ##maxMag## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.fm_space_time.description = "Events with focal mechanisms in space-time range"
query.fm_space_time = "SELECT PEvent.publicID, Origin.time_value, ROUND(Origin.latitude_value, 4), ROUND(Origin.longitude_value, 4), ROUND(Origin.depth_value, 1), ROUND(Magnitude.magnitude_value, 1), Magnitude.type, MomentTensor.doubleCouple, MomentTensor.variance, Event.typeCertainty, Event.type, Origin.creationInfo_author FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude, FocalMechanism, PublicObject AS PFocalMechanism, MomentTensor WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.publicID = Event.preferredMagnitudeID AND FocalMechanism._oid = PFocalMechanism._oid AND PFocalMechanism.publicID = Event.preferredFocalMechanismID AND MomentTensor._parent_oid = FocalMechanism._oid AND POrigin.publicID = Event.preferredOriginID AND Origin.latitude_value >= ##latMin## AND Origin.latitude_value <= ##latMax## AND Origin.longitude_value >= ##lonMin## AND Origin.longitude_value <= ##lonMax## AND Origin.time_value >= '##startTime##' AND Origin.time_value <= '##endTime##';"
query.event.description = "List authors and number of origins for event"
query.event = "SELECT PEvent.publicID, Origin.creationInfo_author, MAX(Origin.quality_usedPhaseCount) FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, OriginReference WHERE Origin._oid = POrigin._oid AND Event._oid = PEvent._oid AND OriginReference._parent_oid = Event._oid AND OriginReference.originID = POrigin.publicID AND PEvent.publicID = '##EventID##' group by Origin.creationInfo_author;"
query.picks.description = "List number of picks per station in a certain timespan"
query.picks = "SELECT waveformID_networkCode AS Network, waveformID_stationCode AS Station, COUNT(_oid) AS Picks, MIN(time_value) AS Start, MAX(time_value) AS End FROM Pick WHERE time_value >= '##startTime##' AND time_value <= '##endTime##' GROUP BY waveformID_networkCode, waveformID_stationCode;"
query.stationPicks.description = "List the picks and phase hints per station in a certain timespan"
query.stationPicks = "SELECT PPick.publicID, Pick.phaseHint_code FROM Pick, PublicObject AS PPick WHERE Pick._oid = PPick._oid AND waveformID_networkCode = '##netCode##' AND waveformID_stationCode = '##staCode##' AND time_value >= '##startTime##' AND time_value <= '##endTime##';"
query.assoc_picks.description = "List number of associated picks per station in a certain time span"
query.assoc_picks = "SELECT Pick.waveformID_networkCode AS Network, Pick.waveformID_stationCode AS Station, COUNT(DISTINCT(Pick._oid)) AS Picks, MIN(Pick.time_value) AS Start, MAX(Pick.time_value) AS End FROM Pick, PublicObject PPick, Arrival WHERE Pick._oid = PPick._oid AND PPick.publicID = Arrival.pickID AND Pick.time_value >= '##startTime##' AND Pick.time_value <= '##endTime##' GROUP BY Pick.waveformID_networkCode, Pick.waveformID_stationCode;"
query.pref_assoc_picks.description = "List number of associated picks of preferred origins per station for certain time span"
query.pref_assoc_picks = "SELECT Pick.waveformID_networkCode AS Network, Pick.waveformID_stationCode AS Station, COUNT(DISTINCT(Pick._oid)) AS Picks, MIN(Pick.time_value) AS Start, MAX(Pick.time_value) AS End FROM Pick, PublicObject PPick, Arrival, Origin, PublicObject POrigin, Event WHERE Event.preferredOriginID = POrigin.publicID AND Origin._oid = POrigin._oid AND Origin._oid = Arrival._parent_oid AND Pick._oid = PPick._oid AND PPick.publicID = Arrival.pickID AND Pick.time_value >= '##startTime##' AND Pick.time_value <= '##endTime##' GROUP BY Pick.waveformID_networkCode, Pick.waveformID_stationCode;"
query.sta_net_mag.description = "Compares station magnitudes of a particular station with the network magnitude in a certain time span"
query.sta_net_mag = "SELECT StationMagnitude.waveformID_networkCode AS Network, StationMagnitude.waveformID_stationCode AS Station, StationMagnitude.magnitude_value AS StaMag, Magnitude.magnitude_value AS NetMag, Magnitude.type AS NetMagType, StationMagnitude.creationInfo_creationTime AS CreationTime FROM StationMagnitude, PublicObject PStationMagnitude, StationMagnitudeContribution, Magnitude WHERE StationMagnitude._oid = PStationMagnitude._oid AND StationMagnitudeContribution.stationMagnitudeID = PStationMagnitude.publicID AND StationMagnitudeContribution._parent_oid = Magnitude._oid AND StationMagnitude.waveformID_networkCode = '##netCode##' AND StationMagnitude.waveformID_stationCode = '##staCode##' AND StationMagnitude.creationInfo_creationTime >= '##startTime##' AND StationMagnitude.creationInfo_creationTime <= '##endTime##' ORDER BY StationMagnitude.creationInfo_creationTime;"
query.sta_net_mag_type.description = "Compares station magnitudes of a particular station with the network magnitude of specific type in a certain time span"
query.sta_net_mag_type = "SELECT StationMagnitude.waveformID_networkCode AS Network, StationMagnitude.waveformID_stationCode AS Station, StationMagnitude.magnitude_value AS StaMag, Magnitude.magnitude_value AS NetMag, Magnitude.type AS NetMagType, StationMagnitude.creationInfo_creationTime AS CreationTime FROM StationMagnitude, PublicObject PStationMagnitude, StationMagnitudeContribution, Magnitude WHERE StationMagnitude._oid = PStationMagnitude._oid AND StationMagnitudeContribution.stationMagnitudeID = PStationMagnitude.publicID AND StationMagnitudeContribution._parent_oid = Magnitude._oid AND StationMagnitude.waveformID_networkCode = '##netCode##' AND StationMagnitude.waveformID_stationCode = '##staCode##' AND StationMagnitude.creationInfo_creationTime >= '##startTime##' AND StationMagnitude.creationInfo_creationTime <= '##endTime##' AND Magnitude.type = '##magType##' ORDER BY StationMagnitude.creationInfo_creationTime;"
query.delta_sta_net_mag.description = "Calculates delta values of station and network magnitudes for all stations in a certain time span"
query.delta_sta_net_mag = "SELECT StationMagnitude.waveformID_networkCode AS Network, StationMagnitude.waveformID_stationCode AS Station, AVG(StationMagnitude.magnitude_value - Magnitude.magnitude_value) AS DeltaAvg, MIN(StationMagnitude.magnitude_value - Magnitude.magnitude_value) AS DeltaMin, MAX(StationMagnitude.magnitude_value - Magnitude.magnitude_value) AS DeltaMax, MIN(StationMagnitude.creationInfo_creationTime) AS Start, MAX(StationMagnitude.creationInfo_creationTime) AS End FROM StationMagnitude, PublicObject PStationMagnitude, StationMagnitudeContribution, Magnitude WHERE StationMagnitude._oid = PStationMagnitude._oid AND StationMagnitudeContribution.stationMagnitudeID = PStationMagnitude.publicID AND StationMagnitudeContribution._parent_oid = Magnitude._oid AND StationMagnitude.creationInfo_creationTime >= '##startTime##' AND StationMagnitude.creationInfo_creationTime <= '##endTime##' GROUP BY StationMagnitude.waveformID_networkCode, StationMagnitude.waveformID_stationCode;"
query.delta_sta_net_mag_type.description = "Calculates delta values of station and network magnitudes for all stations and all magnitude types in a certain time span"
query.delta_sta_net_mag_type = "SELECT StationMagnitude.waveformID_networkCode AS Network, StationMagnitude.waveformID_stationCode AS Station, AVG(StationMagnitude.magnitude_value - Magnitude.magnitude_value) AS DeltaAvg, MIN(StationMagnitude.magnitude_value - Magnitude.magnitude_value) AS DeltaMin, MAX(StationMagnitude.magnitude_value - Magnitude.magnitude_value) AS DeltaMax, Magnitude.type AS NetMagType, MIN(StationMagnitude.creationInfo_creationTime) AS Start, MAX(StationMagnitude.creationInfo_creationTime) AS End FROM StationMagnitude, PublicObject PStationMagnitude, StationMagnitudeContribution, Magnitude WHERE StationMagnitude._oid = PStationMagnitude._oid AND StationMagnitudeContribution.stationMagnitudeID = PStationMagnitude.publicID AND StationMagnitudeContribution._parent_oid = Magnitude._oid AND StationMagnitude.creationInfo_creationTime >= '##startTime##' AND StationMagnitude.creationInfo_creationTime <= '##endTime##' GROUP BY StationMagnitude.waveformID_networkCode, StationMagnitude.waveformID_stationCode, Magnitude.type;"
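The profiles above are invoked like those in the section :ref:`scquery_examples`,
passing the parameters in the order of the hash-marked placeholders. A sketch for
the *phaseCountPerAuthor* profile; the event ID is a placeholder:

.. code-block:: sh

   scquery -d localhost/seiscomp phaseCountPerAuthor <eventID>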
.. _scquery_psql:
PostgreSQL
----------
In contrast to queries for objects in :ref:`MariaDB/MySQL <scquery_mariadb>`, the
prefix ``m_`` must be added to the value and publicID database column names, as
shown below for the query "eventFilter".
.. code-block:: properties
queries = eventFilter
query.eventFilter.description = "Returns all events (lat, lon, mag, time) that fall into a certain region and a magnitude range"
query.eventFilter = "SELECT PEvent.m_publicID, Origin.m_time_value AS OT, Origin.m_latitude_value, Origin.m_longitude_value, Origin.m_depth_value, Magnitude.m_magnitude_value, Magnitude.m_type FROM Origin, PublicObject AS POrigin, Event, PublicObject AS PEvent, Magnitude, PublicObject AS PMagnitude WHERE Event._oid = PEvent._oid AND Origin._oid = POrigin._oid AND Magnitude._oid = PMagnitude._oid AND PMagnitude.m_publicID = Event.m_preferredMagnitudeID AND POrigin.m_publicID = Event.m_preferredOriginID AND Origin.m_latitude_value >= ##latMin## AND Origin.m_latitude_value <= ##latMax## AND Origin.m_longitude_value >= ##lonMin## AND Origin.m_longitude_value <= ##lonMax## AND Magnitude.m_magnitude_value >= ##minMag## AND Magnitude.m_magnitude_value <= ##maxMag## AND Origin.m_time_value >= '##startTime##' AND Origin.m_time_value <= '##endTime##';"
.. _scquery_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scquery.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scquery.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scquery.cfg`
scquery inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scquery
:program:`scquery [OPTIONS] [query name] [query parameters]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
Commands
--------
.. option:: --showqueries
Show the queries defined in queries.cfg.
.. option:: --delimiter arg
Column delimiter. If found, this character will be escaped
in output values. Default: '\|'.
.. option:: --print-column-name
Print the name of each output column in a header.
.. option:: --print-header
Print the query parameters and the query filter description
as a header of the query output.
.. option:: --print-query-only
Only print the full query to stdout and then quit.
.. option:: -Q, --query arg
Execute the given query instead of applying queries
pre\-defined by configuration.

.. highlight:: rst
.. _scqueryqc:
#########
scqueryqc
#########
**Query waveform quality control (QC) parameters from database.**
Description
===========
scqueryqc queries a database for waveform quality control (QC) parameters. The
QC parameters can be provided and written to the database, e.g., by :ref:`scqc`.
.. warning ::
Writing QC parameters to the database by :ref:`scqc` will result in a rapidly
growing database and is therefore not recommended for permanent operation unless
these parameters are regularly stripped from the database!
The database query is done for
* One or multiple streams,
* One or multiple QC parameters. All QC parameters can be requested. Defaults
apply. For reading the defaults use
.. code-block:: sh
scqueryqc -h
* A single time window where the begin time must be provided. Current time is
considered if the end is not given.
Workflow
--------
You should minimize the impact of stored waveform QC parameters on the size of the
database.
#. Compute the QC parameters in real time using :ref:`scqc` and save them in the
|scname| database. Saving the QC parameters in the database requires adjusting
the scqc module configuration parameter
:confval:`plugins.$name.archive.interval` for each plugin.
#. Regularly use scqueryqc for some time span to read the QC parameters from the
database. Save them in XML files.
Example for all QC parameters found for all streams in the inventory before
end time:
.. code-block:: sh
scqueryqc -d [host] -e '[end time]' --streams-from-inventory -o [XML file]
#. Clean the database from QC parameters.
* Either use :ref:`scdispatch` with the parameters saved in XML. You may need
to set the routing table for sending the QualityControl parameters to the
right message group, e.g., QC:
.. code-block:: sh
scdispatch -H [host] -O remove --routingtable QualityControl:QC -i [XML file]
* Alternatively, use :ref:`scdbstrip` with the command-line option
:option:`--qc-only` and remove **all** QC parameters in the time span. Use the same
period for which the QC parameters were retrieved:
.. code-block:: sh
scdbstrip -d [database] -Q --date-time '[end time]'
.. note ::
Considering an end time by :option:`--date-time` has the advantage that no QC
parameters are removed which were measured after scqueryqc was applied with the
same end time value.
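The three steps can be chained in a small maintenance script. The following is
only a sketch assuming a database on localhost and a fixed end time; adjust the
connection strings and file paths to your setup:

.. code-block:: sh

   #!/bin/sh
   # Archive QC parameters up to END, then strip them from the database.
   END='2021-11-20 00:00:00'
   scqueryqc -d localhost -e "$END" --streams-from-inventory -o /tmp/qc.xml
   scdbstrip -d mysql://sysop:sysop@localhost/seiscomp -Q --date-time "$END"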
Examples
--------
* Query rms and delay values for the stream AU.AS18..SHZ,AU.AS19..SHZ before
'2021-11-20 00:00:00'. Write the XML to stdout
.. code-block:: sh
scqueryqc -d localhost -e '2021-11-20 00:00:00' -p rms,delay -i AU.AS18..SHZ,AU.AS19..SHZ
* Query all default QC parameter values for all streams found in the inventory
from '2021-11-20 00:00:00' until current. Write the formatted XML output to
:file:`/tmp/query.xml`
.. code-block:: sh
scqueryqc -d localhost -b '2021-11-20 00:00:00' --streams-from-inventory -f -o /tmp/query.xml
.. _scqueryqc_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scqueryqc.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scqueryqc.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scqueryqc.cfg`
scqueryqc inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scqueryqc
:program:`scqueryqc [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
Output
------
.. option:: -o, --output filename
Name of output XML file. Objects are sent to stderr if none
is given.
.. option:: -f, --formatted
Write formatted XML.
Query
-----
.. option:: -b, --begin time
Begin time of query. Format: 'YYYY\-MM\-DD hh:mm:ss'.
'1900\-01\-01T00:00:00Z' is considered if undefined.
.. option:: -e, --end time
End time of query. Format: 'YYYY\-MM\-DD hh:mm:ss'.
Current time is considered if undefined.
.. option:: -i, --stream-id string
Waveform stream ID to search for QC parameters: net.sta.loc.cha \-
[networkCode].[stationCode].[sensorLocationCode].[channelCode].
Provide a single ID or a comma\-separated list. Overrides
'\-\-streams\-from\-inventory'.
.. option:: -p, --parameter string
QC parameter to output: \(e.g. delay,rms,'gaps count' ...\).
Use quotes if QC parameter has more than 1 word.
Provide a single parameter or a comma\-separated list.
Defaults are used if parameter is undefined.
.. option:: --streams-from-inventory
Read streams from inventory. Superseded by '\-\-stream\-id'.

.. highlight:: rst
.. _screloc:
#######
screloc
#######
**Automatic relocator.**
Description
===========
screloc is an automatic relocator that receives origins from realtime
locators such as scautoloc and relocates them with a configurable locator.
screloc can be conveniently used to test different locators and velocity models
or to relocate events with updated velocity models. Check the
:ref:`Example applications <screloc-example>` for screloc.
screloc processes any incoming automatic origin but does not yet listen to event
information, nor does it skip origins for which a more recent one exists.
To run screloc along with all processing modules add it to the list of
clients in the seiscomp configuration frontend.
.. code-block:: sh
seiscomp enable screloc
seiscomp start screloc
Descriptions of parameters for screloc:
.. code-block:: sh
seiscomp exec screloc -h
Test the performance of screloc and learn from debug output:
.. code-block:: sh
seiscomp exec screloc --debug
Setup
=====
The following example configuration shows a setup of screloc for
:ref:`NonLinLoc <global_nonlinloc>`:
.. code-block:: sh
plugins = ${plugins}, locnll
# Define the locator algorithm to use
reloc.locator = NonLinLoc
# Define a suffix appended to the publicID of the origin to be relocated
# to form the new publicID.
# This helps to identify pairs of origins before and after relocation.
# However, new publicIDs are unrelated to the time of creation.
# If not defined, a new publicID will be generated automatically.
reloc.originIDSuffix = "#relocated"
########################################################
################ NonLinLoc configuration################
########################################################
NLLROOT = ${HOME}/nll/data
NonLinLoc.outputPath = ${NLLROOT}/output/
# Define the default control file if no profile specific
# control file is defined.
NonLinLoc.controlFile = ${NLLROOT}/NLL.default.conf
# Set the default pick error in seconds passed to NonLinLoc
# if no SeisComP pick uncertainty is available.
NonLinLoc.defaultPickError = 0.1
# Define the available NonLinLoc location profiles. The order
# implicitly defines the priority for overlapping regions
#NonLinLoc.profiles = swiss_3d, swiss_1d, global
NonLinLoc.profiles = swiss_3d, global
# The earthModelID is copied to earthModelID attribute of the
# resulting origin
NonLinLoc.profile.swiss_1d.earthModelID = "swiss regional 1D"
# Specify the velocity model table path as used by NonLinLoc
NonLinLoc.profile.swiss_1d.tablePath = ${NLLROOT}/time_1d_regio/regio
# Specify the region valid for this profile
NonLinLoc.profile.swiss_1d.region = 41.2, 3.8, 50.1, 16.8
# The NonLinLoc default control file to use for this profile
NonLinLoc.profile.swiss_1d.controlFile = ${NLLROOT}/NLL.swiss_1d.conf
# Configure the swiss_3d profile
NonLinLoc.profile.swiss_3d.earthModelID = "swiss regional 3D"
NonLinLoc.profile.swiss_3d.tablePath = ${NLLROOT}/time_3d/ch
NonLinLoc.profile.swiss_3d.region = 45.15, 5.7, 48.3, 11.0
NonLinLoc.profile.swiss_3d.controlFile = ${NLLROOT}/NLL.swiss_3d.conf
# And the global profile
NonLinLoc.profile.global.earthModelID = iaspei91
NonLinLoc.profile.global.tablePath = ${NLLROOT}/iasp91/iasp91
NonLinLoc.profile.global.controlFile = ${NLLROOT}/NLL.global.conf
.. _screloc-example:
Examples
========
* Run screloc with a specific velocity model given in a profile by :ref:`NonLinLoc <global_nonlinloc>`.
Use a specific userID and authorID for uniquely recognizing the relocation.
Changing the priority in :ref:`scevent` before running the example, e.g. to
TIME_AUTOMATIC, sets the latest origin (which will be created by screloc) to preferred.
.. code-block:: sh
# set specific velocity profile defined for NonLinLoc
profile=<your_profile>
# set userID
userID="<your_user>"
# set authorID
authorID="<screloc>"
for i in `scevtls -d mysql://sysop:sysop@localhost/seiscomp --begin '2015-01-01 00:00:00' --end '2015-02-01 00:00:00'`; do
orgID=`echo "select preferredOriginID from Event,PublicObject where Event._oid=PublicObject._oid and PublicObject.publicID='$i'" |\
mysql -u sysop -psysop -D seiscomp -h localhost -N`
screloc -O $orgID -d localhost --locator NonLinLoc --profile $profile -u $userID --debug --author=$authorID
done
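* Relocate all origins contained in an event parameters XML file offline and replace
  them in the output. This is only a sketch: the file names are placeholders and it
  is assumed that the resulting XML is written to stdout.

  .. code-block:: sh

     screloc -d localhost --ep origins.xml --locator NonLinLoc --profile <your_profile> --replace > relocated.xml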
.. _screloc_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/screloc.cfg`
| :file:`etc/global.cfg`
| :file:`etc/screloc.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/screloc.cfg`
screloc inherits :ref:`global options<global-configuration>`.
.. confval:: reloc.locator
Type: *string*
Defines the locator to be used such as NonLinLoc.
.. confval:: reloc.profile
Type: *string*
The locator profile to use.
.. confval:: reloc.ignoreRejectedOrigins
Default: ``false``
Type: *boolean*
Ignores origins with status REJECTED.
.. confval:: reloc.allowAnyStatus
Default: ``false``
Type: *boolean*
Triggers processing on origins with any evaluation status.
Overrides \"allowPreliminaryOrigins\" but not
\"ignoreRejectedOrigins\".
.. confval:: reloc.allowPreliminaryOrigins
Default: ``false``
Type: *boolean*
Triggers processing also on origins with status PRELIMINARY.
Origins with any other status are ignored anyway. The parameter
is overridden by \"allowAnyStatus\".
.. confval:: reloc.allowManualOrigins
Default: ``false``
Type: *boolean*
Triggers processing also on origins with evaluation mode
MANUAL. Otherwise only origins with mode AUTOMATIC are
considered. The triggering may be limited due to an evaluation
status, see \"allowPreliminaryOrigins\".
.. confval:: reloc.useWeight
Default: ``false``
Type: *boolean*
Whether to use the weight of the picks associated with the
input origin as defined in the input origin as input for
the locator or not. If false then all picks associated with
the input origin will be forwarded to the locator with full
weight.
.. confval:: reloc.adoptFixedDepth
Default: ``false``
Type: *boolean*
If the input origin's depth is fixed then it will be fixed
during the relocation process as well.
.. confval:: reloc.storeSourceOriginID
Default: ``false``
Type: *boolean*
Whether to store the ID of the input origin as comment in the
relocated origin or not.
.. confval:: reloc.originIDSuffix
Type: *string*
Suffix appended to the publicID of the origin to be relocated
to form the new publicID. This
helps to identify pairs of origins before and after relocation.
However, new publicIDs are unrelated to the time of creation.
If not defined, a new publicID will be generated automatically.
Command-Line Options
====================
.. program:: screloc
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because of a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Mode
----
.. option:: --test
Test mode, do not send any message
Input
-----
.. option:: -O, --origin-id arg
Reprocess the origin and send a message unless test mode is activated
.. option:: --locator arg
The locator type to use
.. option:: --use-weight arg
Use current picks weight
.. option:: --evaluation-mode arg
Set the origin evaluation mode: \"AUTOMATIC\" or \"MANUAL\".
.. option:: --ep file
Defines an event parameters XML file to be read and processed. This
implies offline mode; only the origins contained in that file
are processed. Each relocated origin is appended to the list
of origins unless \-\-replace is given.
.. option:: --replace
Used in combination with \-\-ep. If given, each origin for
which an output has been generated is replaced by the result
of relocation. In other words: two LocSAT origins in, two
NonLinLoc origins out. All other objects are passed through.
Profiling
---------
.. option:: --measure-relocation-time
Measure the time spent in a single relocation
.. option:: --repeated-relocations arg
Improve the measurement of relocation time by running each relocation multiple times. Specify the number of relocations, e.g. 100.

.. highlight:: rst
.. _screpick:
########
screpick
########
**Reads an XML file of picks and repicks them using a post picker. The
picks will be modified in place and written to another XML file.**
.. _screpick_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/screpick.cfg`
| :file:`etc/global.cfg`
| :file:`etc/screpick.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/screpick.cfg`
screpick inherits :ref:`global options<global-configuration>`.
.. confval:: picker
Type: *string*
Picker interface to be used for repicking.
.. confval:: anyPhase
Default: ``false``
Type: *boolean*
If enabled then all picks will be processed. Otherwise only
P phases or picks without a phase hint will be considered.
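A typical offline run reads picks from an XML file, fetches the corresponding
waveforms from a record source and writes the repicked result to stdout, using the
command-line options documented below. This is only a sketch: the picker interface,
the record file and the XML file names are placeholders, and writing the result to
stdout is an assumption.

.. code-block:: sh

   screpick --ep picks.xml -I file:///path/to/data.mseed -P AIC > repicked.xml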
Command-Line Options
====================
.. program:: screpick
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because of a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Records
-------
.. option:: --record-driver-list
List all supported record stream drivers.
.. option:: -I, --record-url arg
The recordstream source URL, format:
[service:\/\/]location[#type].
\"service\" is the name of the recordstream driver
which can be queried with \"\-\-record\-driver\-list\".
If \"service\" is not given, \"file:\/\/\" is
used.
.. option:: --record-file arg
Specify a file as record source.
.. option:: --record-type arg
Specify a type for the records being read.
Input
-----
.. option:: --ep file
Name of input XML file \(SCML\) with all picks for offline
processing.
Picker
------
.. option:: -P, --picker interface
Defines the picker interface to be used. It must be part of
the common libraries or loaded as plugin.
.. option:: -A, --any-phase
Accept any pick regardless of its phase hint.

.. highlight:: rst
.. _scsendjournal:
#############
scsendjournal
#############
**Send journaling information to the messaging to manipulate event parameter objects.**
Description
===========
scsendjournal sends journals to the |scname| messaging system.
Currently, journals can be used to command :ref:`scevent`.
The journals command :ref:`scevent` to manipulate event parameters according to
the :ref:`journal actions <scsendjournal-actions>` which must be known to
:ref:`scevent`.
The actions allow to:
* Create new events,
* Modify event parameters,
* Control the association of origins to events.
.. _scsendjournal-actions:
Actions
=======
There are specific journal actions for handling non-events and events. The documentation
of :ref:`scevent` contains a :ref:`complete list of journals known to scevent <scevent-journals>`.
Parameters used on the command line can also be passed from a file using the option
:option:`-i`.
Non-event specific actions
---------------------------
* **EvNewEvent**: Create a new event from origin with the provided origin ID.
The origin must be known to :ref:`scevent`.
Example: Create a new event from the
origin with given originID. Apply the action in the message system on *localhost*: ::
scsendjournal -H localhost Origin#20170505130954.736019.318 EvNewEvent
Origin association
------------------
* **EvGrabOrg**: Grab origin and move the origin to the event with the given eventID.
If the origin is already associated with another event, remove this reference
in the other event.
* **EvMerge**: Merge events into one event.
Example: Merge all origins from the source event with eventID *eventS* into the
target event with eventID *eventT*. Remove event *eventS*. Apply the action in
the message system on *host*: ::
scsendjournal -H {host} {eventT} EvMerge {eventS}
* **EvSplitOrg**: Split origins into two events.
Event parameters
----------------
* **EvName**: Set *EventDescription* of type *earthquake name*.
Example: Setting the name of the event with
eventID *gempa2021abcd* to *Petrinja* ::
scsendjournal -H localhost gempa2021abcd EvName "Petrinja"
* **EvOpComment**: Set event operator's comment.
* **EvPrefFocMecID**: Set event preferred focal mechanism.
* **EvPrefMagType**: Set preferred magnitude type.
* **EvPrefMw**: Set Mw from focal mechanism as preferred magnitude.
* **EvPrefOrgAutomatic**: Set the preferred mode to *automatic* corresponding to *unfix* in scolv.
* **EvPrefOrgEvalMode**: Set preferred origin by evaluation mode.
* **EvPrefOrgID**: Set preferred origin by ID.
* **EvRefresh**: Select the preferred origin, the preferred magnitude, update
the region. Call processors loaded with plugins, e.g. the
:ref:`evrc <scevent_regioncheck>` plugin for scevent.
Example: ::
scsendjournal -H localhost gempa2021abcd EvRefresh
* **EvType**: Set event type.
Example: Set the type of the event with eventID *gempa2021abcd* to *nuclear explosion*. ::
scsendjournal -H localhost gempa2021abcd EvType "nuclear explosion"
* **EvTypeCertainty**: Set event type certainty.
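  Example: Set the type certainty of the event with eventID *gempa2021abcd* to
  *suspected*. ::

      scsendjournal -H localhost gempa2021abcd EvTypeCertainty "suspected"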
Command-Line Options
====================
.. program:: scsendjournal
:program:`scsendjournal [opts] {objectID} {action} [parameters]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because of a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Input
-----
.. option:: -i, --input arg
Read parameters from given file instead from command line.

.. highlight:: rst
.. _scsendorigin:
############
scsendorigin
############
**Create an artificial origin and send to the messaging.**
Command-Line Options
====================
.. program:: scsendorigin
:program:`scsendorigin [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because of a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, e.g. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/log\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Parameters
----------
.. option:: --time
Time of origin. Use quotes to encapsulate date and time.
.. option:: --coord
Latitude,longitude,depth of origin.
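For instance, an artificial origin at 52.3°N, 13.4°E and 10 km depth could be sent
like this (a sketch; host, time and coordinates are placeholders):

.. code-block:: sh

   scsendorigin -H localhost --time "2024-01-01 12:00:00" --coord 52.3,13.4,10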

.. highlight:: rst
.. _scsohlog:
########
scsohlog
########
**Connect to the messaging and collect all information sent from connected
clients.**
Description
===========
scsohlog connects to the messaging and collects all information sent from connected
clients. It creates an XML file and writes that to disc at a configurable interval.
That output can be read by any consumer and converted to the desired output.
Example
=======
Create an output XML file every 60 seconds and execute a custom script to process
that XML file.
.. code-block:: sh
#!/bin/sh
scsohlog -o stat.xml -i 60 --script process-stat.sh
You can also preconfigure these values:
.. code-block:: sh
monitor.output.file = /path/to/stat.xml
monitor.output.interval = 60
monitor.output.script = /path/to/script.sh
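A minimal sketch of such a processing script is given below. How scsohlog invokes
the script is not assumed here; the script simply archives the configured output
file:

.. code-block:: sh

   #!/bin/sh
   # process-stat.sh: keep a timestamped copy of the latest state-of-health XML.
   STAT=/path/to/stat.xml
   cp "$STAT" "/var/tmp/soh-$(date +%Y%m%d%H%M%S).xml"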
.. _scsohlog_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scsohlog.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scsohlog.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scsohlog.cfg`
scsohlog inherits :ref:`global options<global-configuration>`.
.. confval:: monitor.output.interval
Default: ``60``
Type: *uint*
Unit: *s*
The output interval in seconds.
.. confval:: monitor.output.file
Default: ``@LOGDIR@/server.xml``
Type: *string*
The output XML file.
.. confval:: monitor.output.script
Type: *string*
The script to execute.
Example: \@CONFIGDIR\@\/scripts\/scsohlog_script.sh.
Command-Line Options
====================
.. program:: scsohlog
:program:`scsohlog [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because of a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Output
------
.. option:: -o, --file file
The output XML file.
.. option:: -i, --interval arg
The output interval in seconds.
.. option:: --script arg
The script to execute. Example: path\/scsohlog_script.sh.

.. highlight:: rst
.. _scvoice:
#######
scvoice
#######
**Real time voice alert.**
Description
===========
This tool runs an external script whenever an event has been created or updated.
It can also run a script in case an amplitude of a particular type or a
preliminary origin (heads-up message) has been sent. The common purpose for
this tool is to play a sound or to convert a message to speech using external
tools like festival or espeak.
There are three possible trigger mechanisms for calling scripts:
* Event creation/update
* Amplitude creation
* Origin creation (with status = preliminary)
Although this tool was designed to alert the user acoustically, it can also be
used to send e-mails, SMS or to trigger any other kind of alert. scvoice can only
run one script per call type at a time. A Python template script (:ref:`scalert`) with
more options has been added to |scname| and can be used as a basis for custom notifications.
Examples
========
Event script
------------
The following script is used as event script. It requires
`festival <http://www.cstr.ed.ac.uk/projects/festival/>`_ which should be
available in almost any Linux distribution.
.. important::
When saving the scripts given below do not forget to set the executable
bit, otherwise scvoice cannot call the scripts. In Linux just run:
.. code-block:: sh
chmod +x /path/to/file
#. Save an executable script file, e.g., under :file:`~/.seiscomp/event.sh`:
.. code-block:: sh
#!/bin/sh
if [ "$2" = "1" ]; then
echo " $1" | sed 's/,/, ,/g' | festival --tts;
else
echo "Event updated, $1" | sed 's/,/, ,/g' | festival --tts;
fi
#. Add the file to the configuration of :confval:`scripts.event` in the file
:file:`SEISCOMP_ROOT/etc/scvoice.cfg` or :file:`~/.seiscomp/scvoice.cfg`:
.. code-block:: sh
scripts.event = /home/sysop/.seiscomp/event.sh
Amplitude script
----------------
#. Save an executable script file, e.g., under :file:`~/.seiscomp/amplitude.sh`
.. code-block:: sh
#!/bin/sh
# Play a wav file with a particular volume
# derived from the amplitude itself.
playwave ~/.seiscomp/beep.wav -v $3
#. Add the file to the configuration of :confval:`scripts.amplitude` in the
file :file:`SEISCOMP_ROOT/etc/scvoice.cfg` or :file:`~/.seiscomp/scvoice.cfg`:
.. code-block:: sh
scripts.amplitude = /home/sysop/.seiscomp/amplitude.sh
Alert script
------------
#. Create a sound file :file:`siren.wav` for acoustic alerts.
#. Save an executable script file under, e.g., :file:`~/.seiscomp/alert.sh`:
.. code-block:: sh
#!/bin/sh
playwave /home/sysop/.seiscomp/siren.wav
#. Add the script filename to the configuration of :confval:`scripts.alert` in
the file :file:`SEISCOMP_ROOT/etc/scvoice.cfg` or :file:`~/.seiscomp/scvoice.cfg`.
.. code-block:: sh
scripts.alert = /home/sysop/.seiscomp/alert.sh
.. _scvoice_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scvoice.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scvoice.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scvoice.cfg`
scvoice inherits :ref:`global options<global-configuration>`.
.. confval:: firstNew
Default: ``false``
Type: *boolean*
Treat an event as new event when it is seen the first time.
.. confval:: agencyIDs
Type: *list:string*
Define a white list of agencyIDs for which event alerts are issued. The
agencyID is extracted from the preferred origin of the event.
If this list is empty, all agencies are allowed.
.. confval:: poi.message
Type: *string*
The default message string for the event\-script is
\"earthquake, [HOURS] hours [MINS] minutes ago, [DESC],
magnitude [MAG], depth [DEP] kilometers\" whereas [DESC]
is the string given in the event.description attribute.
This string can be overwritten using one of the following
options. There are three placeholders that can be used:
\@region\@, \@dist\@ and \@poi\@.
Example: \"\@region\@, \@dist\@ kilometers from \@poi\@ away\".
.. confval:: poi.maxDist
Default: ``20``
Type: *double*
Unit: *deg*
When using the nearest point of interest \(city\) as part of the
message string, specify the maximum distance in degrees from the
event. Any point of interest farther away will be ignored.
.. confval:: poi.minPopulation
Default: ``50000``
Type: *double*
Minimum population for a city to become a point of interest.
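The point-of-interest parameters above can be combined in :file:`scvoice.cfg`; the values in the following sketch are purely illustrative:
.. code-block:: sh

# Illustrative values only
poi.message = "@region@, @dist@ kilometers from @poi@ away"
poi.maxDist = 10
poi.minPopulation = 100000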
.. confval:: scripts.amplitude
Type: *string*
Specify the script to be called if an amplitude
arrives, network\-, stationcode and amplitude are passed
as parameters \$1, \$2 and \$3.
.. confval:: scripts.alert
Type: *string*
Specify the script to be called if a preliminary
origin arrives, latitude and longitude are passed as
parameters \$1 and \$2.
.. confval:: scripts.event
Type: *string*
Specify the script to be called when an event has been
declared; the message string, a flag \(1\=new event,
0\=update event\), the EventID, the arrival count and the
magnitude \(optional when set\) are passed as parameter
\$1, \$2, \$3, \$4 and \$5.
Command-Line Options
====================
.. program:: scvoice
:program:`scvoice [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
.. option:: --first-new
Overrides configuration parameter :confval:`firstNew`.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all
Alert
-----
.. option:: --amp-type arg
Specify the amplitude type to listen to.
.. option:: --amp-script arg
Overrides configuration parameter :confval:`scripts.amplitude`.
.. option:: --alert-script arg
Overrides configuration parameter :confval:`scripts.alert`.
.. option:: --event-script arg
Overrides configuration parameter :confval:`scripts.event`.
Cities
------
.. option:: --max-dist arg
Overrides configuration parameter :confval:`poi.maxDist`.
.. option:: --min-population arg
Overrides configuration parameter :confval:`poi.minPopulation`.
Debug
-----
.. option:: -E, --eventid arg
Specify event ID that is used for testing.
After running the alert scripts scvoice will exit.
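For example, a test invocation might look like the following sketch; the event ID is a placeholder for an ID existing in your database:
.. code-block:: sh

scvoice -E gfz2021abcd -d mysql://sysop:sysop@localhost/seiscomp --debug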

.. highlight:: rst
.. _scwfas:
######
scwfas
######
**Waveform archive server**
Description
===========
The waveform archive server is a small application that serves a local
SDS archive via different protocols. Currently there are two implementations:
* :ref:`fdsnws dataselect <sec-dataSelect>`
* dataselect/1/query
* dataselect/1/version
* dataselect/1/application.wadl
* Arclink (deprecated)
This application is meant to share data with trusted computers in a fast and
efficient way. It does not require inventory information and supports wildcards
on each level.
All data are forwarded unrestricted. There are no options to add restriction
checks or user authentication.
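As an illustration, the following sketch starts scwfas from the command line and fetches one hour of data through the dataselect interface. The stream codes, time window and the use of :program:`wget` are assumptions; the query parameters follow the FDSN dataselect convention.
.. code-block:: sh

seiscomp exec scwfas --fdsnws-port 8080 &

wget -O data.mseed "http://localhost:8080/fdsnws/dataselect/1/query?net=GE&sta=APE&cha=BHZ&starttime=2024-01-01T00:00:00&endtime=2024-01-01T01:00:00"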
.. _scwfas_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scwfas.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scwfas.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scwfas.cfg`
scwfas inherits :ref:`global options<global-configuration>`.
.. confval:: handlerSDS
Type: *string*
Defines an alternative SDS archive handler. This is the name
of a RecordStream interface that can be loaded via a plugin.
If not given, an internal implementation will be used.
.. confval:: filebase
Default: ``@ROOTDIR@/var/lib/archive``
Type: *string*
The filebase of the SDS archive. If an alternative archive
handler is defined by \"handlerSDS\", this value serves
as input to setSource\(\).
.. confval:: arclink.port
Default: ``-1``
Type: *int*
The server port for Arclink connections. \-1
deactivates the Arclink server. The standard Arclink port is
18001.
.. confval:: fdsnws.port
Default: ``8080``
Type: *int*
The server port for FDSNWS connections. \-1
deactivates the FDSN Web server.
.. confval:: fdsnws.baseURL
Default: ``http://localhost:8080/fdsnws``
Type: *string*
The base URL of the FDSN Web service that is
given in the WADL document.
.. confval:: fdsnws.maxTimeWindow
Default: ``0``
Type: *int*
Unit: *s*
The aggregated maximum time window \(seconds\)
for all requested streams. A value of 0 will deactivate
any restriction.
Command-Line Options
====================
.. program:: scwfas
:program:`scwfas [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
.. option:: --log-file arg
Use alternative log file.
Server
------
.. option:: --arclink-port int
Overrides configuration parameter :confval:`arclink.port`.
.. option:: --fdsnws-port int
Overrides configuration parameter :confval:`fdsnws.port`.
.. option:: --fdsnws-baseurl string
Overrides configuration parameter :confval:`fdsnws.baseURL`.

.. highlight:: rst
.. _scxmldump:
#########
scxmldump
#########
**Dump database objects to XML files.**
Description
===========
scxmldump reads various parameters from a SeisComP database:
* Availability,
* Config (bindings parameters),
* Event parameters,
* Inventory,
* Journal,
* Routing.
The parameters are sent to stdout or written into an XML (:term:`SCML`) file.
.. note::
Waveform quality control (QC) parameters can be read from databases using
:ref:`scqcquery`.
Event parameters
----------------
Getting event, origin or pick information from the database without using SQL
commands is an important task for the user. :ref:`scxmldump` queries the
database and transforms that information into XML. Events and origins can be
treated further by :ref:`scbulletin` for generating bulletins or conversion
into KML.
Many processing modules, e.g., :ref:`scevent`, support the on-demand processing
of dumped event parameters via the command-line option :option:`--ep`.
Importing event parameters into another database is possible with :ref:`scdb`
and sending them to the SeisComP messaging is provided by :ref:`scdispatch`.
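A sketch of such an offline workflow, assuming :ref:`scevent` is run with :option:`--ep` on a previously dumped file (file names are placeholders):
.. code-block:: sh

scxmldump -fPAMF -E test2012abcd -d mysql://sysop:sysop@localhost/seiscomp > event.xml
scevent --ep event.xml > event-processed.xml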
.. hint::
Events, origins and picks are referred to by their public IDs. IDs of events
and origins can be provided by :ref:`scevtls` and :ref:`scorgls`,
respectively. Event, origin and pick IDs can also be read from graphical
tools like :ref:`scolv` or obtained from database queries assisted by :ref:`scquery`.
Format conversion
-----------------
Conversion of :term:`SCML` into other formats is supported by :ref:`sccnv`.
An XSD schema of the XML output can be found under
:file:`$SEISCOMP_ROOT/share/xml/`.
Examples
--------
Dump inventory
.. code-block:: sh
scxmldump -fI -o inventory.xml -d mysql://sysop:sysop@localhost/seiscomp
Dump config (bindings parameters)
.. code-block:: sh
scxmldump -fC -o config.xml -d mysql://sysop:sysop@localhost/seiscomp
Dump full event data incl. the relevant journal entries
.. code-block:: sh
scxmldump -fPAMFJ -E test2012abcd -o test2012abcd.xml \
-d mysql://sysop:sysop@localhost/seiscomp
Dump summary event data
.. code-block:: sh
scxmldump -fap -E test2012abcd -o test2012abcd.xml \
-d mysql://sysop:sysop@localhost/seiscomp
Create bulletin from an event
.. code-block:: sh
scxmldump -fPAMF -E test2012abcd \
-d mysql://sysop:sysop@localhost/seiscomp | \
scbulletin
Copy event parameters to another database
.. code-block:: sh
scxmldump -fPAMF -E test2012abcd \
-d mysql://sysop:sysop@localhost/seiscomp | \
scdb -i - -d mysql://sysop:sysop@archive-db/seiscomp
Export the entire journal:
.. code-block:: sh
scxmldump -fJ -o journal.xml \
-d mysql://sysop:sysop@localhost/seiscomp
.. _scxmldump_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scxmldump.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scxmldump.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scxmldump.cfg`
scxmldump inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scxmldump
:program:`scxmldump [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --config-db arg
Load the configuration from the given database or file,
format: [service:\/\/]location .
Dump
----
.. option:: --listen
Enable server mode which listens to the message server for
incoming events and dumps each received add\/update.
.. option:: -I, --inventory
Dump the inventory.
.. option:: --without-station-groups
Remove all station groups from inventory.
.. option:: --stations sta-list
If inventory is exported, filter the stations to export.
Wildcards are supported. Format of each item: net[.{sta\|\*}].
.. option:: -C, --config
Dump the configuration \(bindings\).
.. option:: -J, --journal
Dump the journal. In combination with \-E only corresponding
journal entries are included.
.. option:: -R, --routing
Dump the routing.
.. option:: -Y, --availability
Dump data availability information.
.. option:: --with-segments
Dump individual data segments.
.. option:: --pick ID
Pick public ID\(s\) to dump. Multiple IDs may be specified
as a comma\-separated list.
.. option:: -O, --origin ID
Origin public ID\(s\) to dump. Multiple origins may be specified
as a comma\-separated list.
.. option:: -E, --event ID
Event public ID\(s\) to dump. Multiple events may be specified
as a comma\-separated list.
.. option:: -P, --with-picks
Add picks associated to origins.
.. option:: -A, --with-amplitudes
Add amplitudes associated to magnitudes.
.. option:: -M, --with-magnitudes
Add station magnitudes associated to origins.
.. option:: -F, --with-focal-mechanism
Add focal mechanisms associated to events.
.. option:: -a, --ignore-arrivals
Do not dump arrivals as part of origins.
.. option:: --ignore-magnitudes
Do not export \(network\) magnitudes of origins. This
option is most useful in combination with \-O to only
export the location information.
.. option:: -p, --preferred-only
When exporting events only the preferred origin, the preferred
magnitude and the preferred focal mechanism will be dumped.
.. option:: -m, --all-magnitudes
If only the preferred origin is exported, all magnitudes for
this origin will be dumped.
Output
------
.. option:: -f, --formatted
Use formatted XML output. Otherwise all XML is printed
on one line.
.. option:: -o, --output
Name of output file. If not given, output is sent to
stdout.
.. option:: --prepend-datasize
Prepend a line with the length of the XML data.

.. highlight:: rst
.. _scxmlmerge:
##########
scxmlmerge
##########
**Merge the content of multiple XML files in SCML format.**
Description
===========
scxmlmerge reads all |scname| elements from one or more XML files in :term:`SCML`
format. It merges the content and prints the result to standard output. The
input can contain and :ref:`SeisComP element<api-datamodel-python>` and the
content can be filtered to print only some elements such as EventParameters.
The output can be redirected into one single file and used by other applications.
The supported :ref:`SeisComP elements<api-datamodel-python>` are:
* Config
* DataAvailability
* EventParameters
* Inventory
* Journaling
* QualityControl
* Routing
By default all supported elements will be parsed and merged. Duplicates are
removed. Use options to restrict the element types.
There are alternative modules for processing inventory XML files:
* :ref:`scinv`: Merge inventory XML files, extract inventory information.
* :ref:`invextr`: Extract and filter inventory information.
Examples
========
#. Merge all SeisComP elements from two XML files into a single XML file:
.. code-block:: sh
scxmlmerge file1.xml file2.xml > file.xml
#. Merge all EventParameters and Config elements from two XML files into a
single XML file. Other element types will be ignored:
.. code-block:: sh
scxmlmerge -E -C file1.xml file2.xml > file.xml
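#. Merge only the Inventory elements from two XML files (file names are placeholders); other element types will be ignored:
.. code-block:: sh

scxmlmerge -I inventory1.xml inventory2.xml > inventory.xml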
.. _scxmlmerge_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/scxmlmerge.cfg`
| :file:`etc/global.cfg`
| :file:`etc/scxmlmerge.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/scxmlmerge.cfg`
scxmlmerge inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: scxmlmerge
:program:`scxmlmerge [options] inputFiles`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --print-component arg
For each log entry print the component right after the
log level. By default the component output is enabled
for file output but disabled for console output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --trace
Execute in trace mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 \-\-print\-component\=1
\-\-print\-context\=1 .
.. option:: --log-file arg
Use alternative log file.
Dump
----
.. option:: -Y, --availability
Include DataAvailability specifically. If no specific object
type is given, all supported objects are included.
.. option:: -E, --event
Include EventParameters specifically. If no specific object
type is given, all supported objects are included.
.. option:: -I, --inventory
Include Inventory specifically. If no specific object
type is given, all supported objects are included.
.. option:: -J, --journaling
Include Journaling specifically. If no specific object
type is given, all supported objects are included.
.. option:: -C, --config
Include Config specifically. If no specific object
type is given, all supported objects are included.
.. option:: -Q, --quality
Include QualityControl specifically. If no specific object
type is given, all supported objects are included.
.. option:: -R, --routing
Include Routing specifically. If no specific object
type is given, all supported objects are included.
Options
-------
.. option:: --ignore-bad-files
Tolerate empty or corrupted input files and continue without
interruption. Otherwise, the application stops if corrupt
or empty files are processed.

.. highlight:: rst
.. _sczip:
#####
sczip
#####
**Compress or expand files with ZIP algorithm**
Description
===========
sczip can compress and decompress data using the ZIP algorithm (PKZIP). It
is not meant to supersede any available packager but is a small helper to
decompress zipped SeisComPML files. Like GZip, sczip can only handle one file
and does not support archives. It compresses a byte stream and outputs a byte
stream.
Examples
========
Decompress a file
.. code-block:: sh
sczip -d file.xml.zip -o file.xml
.. code-block:: sh
sczip -d < file.xml.zip > file.xml
Compress a file
.. code-block:: sh
sczip file.xml -o file.xml.zip
.. code-block:: sh
sczip < file.xml > file.xml.zip
.. _sczip_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/sczip.cfg`
| :file:`etc/global.cfg`
| :file:`etc/sczip.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/sczip.cfg`
sczip inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: sczip
:program:`sczip file.dat -o file.zip`
:program:`sczip file.zip -d -o file.dat`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Mode
----
.. option:: -d, --decompress
Decompress.
.. option:: -o, --output file
Output file name. Default is stdout.

.. highlight:: rst
.. _seiscomp:
########
seiscomp
########
**SeisComP system control utility**
Description
===========
The tool :program:`seiscomp` allows controlling your |scname| system on the
command line. Like other |scname| modules, it provides
options and commands, e.g., the command :command:`help`. Apply
:program:`seiscomp` to
* Install software dependencies,
* Print
* environment variables of the installed |scname| system,
* internal |scname| variables which can be used in configurations,
* suggestions for timed automatic actions, i.e. "*cronjobs*",
* Make a basic setup including the |scname| database,
* List daemon modules by categories,
* Enable or disable modules in order to start them by default,
* Start, stop, restart or reload single or multiple modules or all default
modules,
* Check the status of modules,
* Execute single or multiple modules or all default modules,
* Print the run status of modules,
* Manage module aliases,
* Update inventory or bindings configurations,
* Manage :ref:`bindings <binding>` by a specific shell.
.. note::
When executing :program:`seiscomp`, all actions refer to the |scname|
installation from within which :program:`seiscomp` is called. This allows
referring to a default but also to any other installed |scname| system and
operating multiple |scname| systems in parallel, e.g., for testing different
versions or for running different projects. In that case, give the full path to the
:program:`seiscomp` tool. Example:
.. code-block:: sh
$HOME/seiscomp-test/bin/seiscomp
Many of these actions are used by :ref:`scconfig`.
.. _sec_seiscomp_help:
Help
====
Use the command :command:`help` for learning about the full set of options and
other commands including examples:
.. code-block:: sh
seiscomp help
For basic help you may also use the option :option:`-h`:
.. code-block:: sh
seiscomp -h
.. _sec_seiscomp_applications:
Applications
============
.. _sec_seiscomp_sw_deps:
Software dependencies
---------------------
Software dependencies should be installed after installation or updates of
|scname|. You may install dependencies on different levels, e.g., *base*,
*gui*, *fdsnws*, *[database]-server*. Examples:
.. code-block:: sh
seiscomp install-deps base
seiscomp install-deps base gui mariadb-server
Alternatively run the shell scripts for your Linux flavor and version located in
:file:`seiscomp/share/deps/`.
.. note::
For making a full installation and setup follow the instructions starting
with section :ref:`installation`.
.. _sec_seiscomp_print:
Print
-----
You may print the environment variables related to your considered |scname|
installation, internal |scname| variables or suggestions for timed automatic
procedures. Examples:
.. code-block:: sh
seiscomp print env
$HOME/seiscomp-test/bin/seiscomp print env
seiscomp print variables
seiscomp print crontab
Add the environment variables to your shell configuration for making them known
user wide. Internal variables are resolved when applying them in user
configurations. For adjusting, adding or removing :program:`crontab` listings
execute:
.. code-block:: sh
man crontab
crontab -e
.. _sec_seiscomp_setup:
Basic setup
-----------
Make a basic setup of your |scname| system interactively after installation.
This will also allow you to generate a database or to configure the connection
to an existing one. Run, e.g.
.. code-block:: sh
seiscomp setup
$HOME/seiscomp-test/bin/seiscomp setup
.. _sec_seiscomp_list:
List
----
List modules which can be started to run as background daemon modules by
categories. Examples:
.. code-block:: sh
seiscomp list modules
seiscomp list enabled
seiscomp list started
.. _sec_seiscomp_enable:
Enable/disable [*]_
-------------------
Enabled modules will be started to run as background daemon modules.
You may enable or disable one or multiple modules. Examples:
.. code-block:: sh
seiscomp enable scautopick
seiscomp enable scautopick scautoloc
seiscomp disable scautopick scautoloc
.. _sec_seiscomp_start:
Start/stop/restart/reload [*]_
------------------------------
Start all enabled modules:
.. code-block:: sh
seiscomp start
Stop all modules and start all enabled modules:
.. code-block:: sh
seiscomp restart
Start/stop/restart specific modules
.. code-block:: sh
seiscomp start scautopick scautoloc
seiscomp stop scautopick scautoloc
seiscomp restart scautopick scautoloc
In order to apply configurations, a module must be (re)started since it reads
any configuration only during startup. Restarts will create downtimes and should
be avoided as much as possible. In order to minimize downtimes, some modules
may apply changes in configuration by reloading during runtime without
restarting. For reloading you may use the command :command:`seiscomp reload`.
Note, however, that reloading is supported only by a limited range of
modules and parameters.
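A minimal sketch for reloading a single module; whether a particular module actually supports reloading has to be checked for that module:
.. code-block:: sh

seiscomp reload scautopick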
.. note::
Graphical modules such as :ref:`scolv` cannot be operated as background
daemon modules. Therefore, they cannot be started but they can
be :ref:`executed <sec_seiscomp_execute>`.
.. _sec_seiscomp_check:
Check [*]_
----------
When modules stop unexpectedly, they are not stopped in a clean way. Such
stopped modules may be detected and started again in order to minimize
downtimes. Apply the :command:`check` command to all or specific modules.
Examples:
.. code-block:: sh
seiscomp check
seiscomp check scautopick
.. _sec_seiscomp_execute:
Execute
-------
Instead of running daemon modules you may execute modules in a terminal and
observe the output, e.g., for debugging or for applying command-line options.
Examples:
.. code-block:: sh
seiscomp exec scolv --debug
seiscomp exec scautopick --debug
.. note::
When all relevant system environment variables point to the same |scname|
installation from where seiscomp is executed, then it is enough to execute
modules by their names replacing the above:
.. code-block:: sh
scolv --debug
scautopick --debug
.. _sec_seiscomp_status:
Status [*]_
-----------
List the status of all, enabled, disabled, started, or specific modules.
Examples:
.. code-block:: sh
seiscomp status
seiscomp status enabled
seiscomp status disabled
seiscomp status started
seiscomp status scautopick
:command:`status` will report modules which terminated due to errors.
.. _sec_seiscomp_aliases:
Module Aliases
--------------
For some |scname| modules, aliases can be generated, allowing separate
execution with specific configurations in parallel to the original module
and even in a separate pipeline with specific message groups.
Using the :command:`alias` command, module aliases can be created or removed.
Examples for creating or removing the alias :program:`l1autopick` for
:ref:`scautopick`:
.. code-block:: sh
seiscomp alias create l1autopick scautopick
seiscomp alias remove l1autopick
When creating aliases, soft links to the original module executable files, the
default configuration and the init files are created. The alias itself is
registered in :file:`SEISCOMP_ROOT/etc/descriptions/aliases`. If a module does
not allow creating aliases a notification is printed. Example:
.. code-block:: sh
seiscomp alias create scolv1 scolv
error: module 'scolv' not found
After creating aliases, they may be configured and operated in the same way as
the original module.
.. warning::
The length of alias names for modules considering
:ref:`bindings<global_bindings_config>` is strictly limited to 20 characters.
When removing aliases, all links and the alias registration are removed but
possibly existing module or binding configurations remain unchanged. The
option :option:`--interactive` allows removing these configurations
interactively.
.. code-block:: sh
seiscomp --interactive alias remove l1autopick
.. _sec_seiscomp_update:
Update configuration [*]_
-------------------------
The command :command:`update-config` allows reading bindings configurations from
the standard :file:`@KEYDIR@` directory as well as inventory from
:file:`@SYSTEMCONFIGDIR@/inventory` and sending them to the messaging for
storing in the database or for generating the configuration of
:term:`standalone modules <standalone module>`:
.. code-block:: sh
seiscomp update-config
Executing :command:`seiscomp update-config` involves:
* Merging inventory,
* Sending inventory updates to the messaging,
* Synchronisation of inventory, key files and bindings,
* Sending any updates of bindings to the messaging,
* Generation of configuration for :term:`standalone modules <standalone
module>`.
The command can therefore be rather time-consuming. To speed it up you may be
more specific:
* Only update global bindings and all :term:`trunk` modules without inventory
.. code-block:: sh
seiscomp update-config trunk
* Update only inventory
.. code-block:: sh
seiscomp update-config inventory
* Update bindings of :ref:`scautopick` only
.. code-block:: sh
seiscomp update-config scautopick
The command may be similarly applied to any other module considering
bindings.
.. note::
Instead of reading bindings configurations from the standard @KEYDIR@
directory, the module :ref:`bindings2cfg` can read bindings from any key
directory and write the Config parameters to :term:`SCML` or send them to
the messaging.
.. _sec_seiscomp_shell:
seiscomp Shell
==============
The seiscomp shell is a special environment, e.g., allowing you to control
:term:`bindings <binding>` of :term:`modules <module>` to stations.
Applications are:
* Create or remove station bindings,
* Create or remove binding profiles,
* Remove binding profiles.
Invoke :program:`seiscomp` along with the :command:`shell` command to start the
shell:
.. code-block:: sh
seiscomp shell
================================================================================
SeisComP shell
================================================================================
Welcome to the SeisComP interactive shell. You can get help about
available commands with 'help'. 'exit' leaves the shell.
$
The full list of shell control commands is printed along with the help of the
seiscomp shell:
.. code-block:: sh
================================================================================
SeisComP shell
================================================================================
Welcome to the SeisComP interactive shell. You can get help about
available commands with 'help'. 'exit' leaves the shell.
$ help
Commands:
list stations
Lists all available stations keys.
list profiles {mod}
Lists all available profiles of a module.
...
.. note::
.. [*] With this command, the flag :option:`--invert` can be used
in order to invert the selection of the specified modules. You may provide
one or more module names. A major application is to
restart most |scname| modules after a change in global bindings. However,
:ref:`seedlink` and :ref:`slarchive` are not affected by global
bindings and any downtime of these modules shall be avoided. Example:
.. code-block:: sh
seiscomp --invert restart seedlink slarchive
The same procedure could be achieved without :option:`--invert` by
explicitly stating all other modules which, however, may result in a
long list of module names.
Command-Line Options
====================
.. program:: seiscomp
:program:`seiscomp [flags] command(s) [argument(s)]`
Flags
-----
.. option:: --asroot
Allow running a command as root.
.. option:: --csv
Print output as CSV in machine\-readable format.
.. option:: -h, --help
Produce this help message.
.. option:: -i, --interactive
Interactive mode: Allow deleting configurations interactively when
removing aliases.
.. option:: --invert
Invert the selection of the specified module names when using any of the
commands: start, stop, check, status, reload, or restart.
.. option:: --wait arg
Define a timeout in seconds for acquiring the seiscomp lock
file, e.g., `seiscomp \-\-wait 10 update\-config`.

.. highlight:: rst
.. _sh2proc:
#######
sh2proc
#######
**Convert SeismicHandler event files to SCML.**
Description
===========
sh2proc converts `Seismic Handler <http://www.seismic-handler.org/>`_ event data to
SeisComP XML format. Data is read from the input file or from `stdin` if no input file is
specified. The result is available on `stdout`.
Code mapping
============
Since Seismic Handler only specifies station and component codes, a mapping to
SeisComP network, location and channel codes is necessary. The script assumes
that the same station code is not used in different networks. In case an
ambiguous ID is found, a warning is printed and the first network code is used.
The channel and stream code is extracted from the detecStream and detecLocid
configured in the global bindings. In case no configuration module is available
the first location and stream is used.
Event parameters
================
* Event types given in Seismic Handler files are mapped to SeisComP event types:
.. csv-table::
:header: "Seismic Handler", "SeisComP"
"teleseismic quake","earthquake"
"regional quake","earthquake"
"local quake","earthquake"
"quarry blast","quarry blast"
"nuclear explosion","nuclear explosion"
"mining event","mining explosion"
* The EventID given in Seismic Handler files is mapped as a comment to the event.
Magnitudes
==========
* Magnitude types given in Seismic Handler files are mapped to SeisComP magnitudes:
.. csv-table::
:header: "Seismic Handler", "SeisComP"
"m","M"
"ml","ML"
"mb","mb"
"ms","Ms(BB)"
"mw","Mw"
"bb","mB"
* ML magnitudes in Seismic Handler files have no corresponding measured amplitudes.
Therefore the ML station magnitudes are converted without referencing the amplitude.
* Seismic Handler uses the phase name "L" for referring to surface waves without
further specification. The phase name is kept unchanged.
Distance calculations
=====================
In Seismic Handler files distances can be given in units of km or degree but in
SeisComP only degree is used. Both representations are considered for conversion.
In case of double posting preference is given to the Seismic Handler values given in km
due to their higher precision.
Beam parameters
===============
Seismic Handler files provide the phase picks with theoretical, measured and corrected
slowness and (back) azimuth but a pick in SeisComP holds only one value.
During conversion highest preference is given to corrected values.
The theoretical values are ignored.
Limitations
===========
The following parameters from Seismic Handler files are not considered:
* Phase Flag
* Location Input Params
* Reference Location Name
* Quality Number
* Ampl&Period Source
* Location Quality
* Reference Latitude
* Reference Longitude
* Amplitude Time
Further processing in SeisComP
===============================
The created XML files can be used in multiple ways, e.g.:
#. By other modules in an XML-based playback
#. Inject into the messaging system by :ref:`scdispatch`
#. Integrate into the database by :ref:`scdb`
Examples
========
#. Convert the Seismic Handler file `shm.evt` and write SCML into the file
`sc.xml`. The database connection to read inventory and configuration
information is fetched from the default messaging connection.
.. code-block:: sh
sh2proc shm.evt > sc.xml
#. Read Seismic Handler data from `stdin`. Inventory and configuration information
is provided through files.
.. code-block:: sh
cat shm.evt | sh2proc --inventory-db=inventory.xml --config-db=config.xml > sc.xml
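#. Convert a Seismic Handler file and send the result to the messaging system; this is a sketch, see :ref:`scdispatch` for the relevant options:
.. code-block:: sh

sh2proc shm.evt > sc.xml
scdispatch -i sc.xml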
shm.evt file format
===================
The list of parameters supported by sh2proc may be incomplete.
Read the original `format and parameter description <http://www.seismic-handler.org/wiki/ShmDocFileEvt>`_
of the SeismicHandler .evt files for providing correct input files.
Example of a SeismicHandler `shm.evt` file with supported parameters:
.. code-block:: sh
Event ID : 1170102002
Station code : VITZ
Onset time : 2-JAN-2017_12:25:40.415
Onset type : emergent
Phase name : Pg
Event Type : mining event
Applied filter : SHM_BP_1HZ_25HZ_3
Component : Z
Quality number : 2
Pick Type : manual
Weight : 4
Theo. Azimuth (deg) : 27.29
Theo. Backazimuth (deg): 207.36
Distance (deg) : 0.122
Distance (km) : 13.572
Magnitude ml : 1.0
Phase Flags : L
--- End of Phase ---
Event ID : 1170102002
Station code : WESF
Onset time : 2-JAN-2017_12:25:53.714
Onset type : emergent
Phase name : Pg
Event Type : mining event
Applied filter : SHM_BP_1HZ_25HZ_3
Component : Z
Quality number : 2
Pick Type : manual
Weight : 4
Theo. Azimuth (deg) : 106.98
Theo. Backazimuth (deg): 287.91
Distance (deg) : 0.807
Distance (km) : 89.708
Magnitude ml : 1.8
Mean Magnitude ml : 1.1
Latitude : +50.779
Longitude : +10.003
Depth (km) : 0.0
Depth type : (g) estimated
Origin time : 2-JAN-2017_12:25:38.273
Region Table : GEO_REG
Region ID : 5326
Source region : Tann, E of Fulda
Velocity Model : deu
Location Input Params : 20
Reference Location Name: CENTRE
--- End of Phase ---
.. _sh2proc_configuration:
Module Configuration
====================
| :file:`etc/defaults/global.cfg`
| :file:`etc/defaults/sh2proc.cfg`
| :file:`etc/global.cfg`
| :file:`etc/sh2proc.cfg`
| :file:`~/.seiscomp/global.cfg`
| :file:`~/.seiscomp/sh2proc.cfg`
sh2proc inherits :ref:`global options<global-configuration>`.
Command-Line Options
====================
.. program:: sh2proc
:program:`sh2proc [options]`
Generic
-------
.. option:: -h, --help
Show help message.
.. option:: -V, --version
Show version information.
.. option:: --config-file arg
Use alternative configuration file. When this option is
used the loading of all stages is disabled. Only the
given configuration file is parsed and used. To use
another name for the configuration create a symbolic
link of the application or copy it. Example:
scautopick \-> scautopick2.
.. option:: --plugins arg
Load given plugins.
.. option:: -D, --daemon
Run as daemon. This means the application will fork itself
and doesn't need to be started with \&.
.. option:: --auto-shutdown arg
Enable\/disable self\-shutdown because a master module shutdown.
This only works when messaging is enabled and the master
module sends a shutdown message \(enabled with \-\-start\-stop\-msg
for the master module\).
.. option:: --shutdown-master-module arg
Set the name of the master\-module used for auto\-shutdown.
This is the application name of the module actually
started. If symlinks are used, then it is the name of
the symlinked application.
.. option:: --shutdown-master-username arg
Set the name of the master\-username of the messaging
used for auto\-shutdown. If \"shutdown\-master\-module\" is
given as well, this parameter is ignored.
Verbosity
---------
.. option:: --verbosity arg
Verbosity level [0..4]. 0:quiet, 1:error, 2:warning, 3:info,
4:debug.
.. option:: -v, --v
Increase verbosity level \(may be repeated, eg. \-vv\).
.. option:: -q, --quiet
Quiet mode: no logging output.
.. option:: --component arg
Limit the logging to a certain component. This option can
be given more than once.
.. option:: -s, --syslog
Use syslog logging backend. The output usually goes to
\/var\/lib\/messages.
.. option:: -l, --lockfile arg
Path to lock file.
.. option:: --console arg
Send log output to stdout.
.. option:: --debug
Execute in debug mode.
Equivalent to \-\-verbosity\=4 \-\-console\=1 .
.. option:: --log-file arg
Use alternative log file.
Messaging
---------
.. option:: -u, --user arg
Overrides configuration parameter :confval:`connection.username`.
.. option:: -H, --host arg
Overrides configuration parameter :confval:`connection.server`.
.. option:: -t, --timeout arg
Overrides configuration parameter :confval:`connection.timeout`.
.. option:: -g, --primary-group arg
Overrides configuration parameter :confval:`connection.primaryGroup`.
.. option:: -S, --subscribe-group arg
A group to subscribe to.
This option can be given more than once.
.. option:: --content-type arg
Overrides configuration parameter :confval:`connection.contentType`.
.. option:: --start-stop-msg arg
Set sending of a start and a stop message.
Database
--------
.. option:: --db-driver-list
List all supported database drivers.
.. option:: -d, --database arg
The database connection string, format:
service:\/\/user:pwd\@host\/database.
\"service\" is the name of the database driver which
can be queried with \"\-\-db\-driver\-list\".
.. option:: --config-module arg
The config module to use.
.. option:: --inventory-db arg
Load the inventory from the given database or file, format:
[service:\/\/]location .
.. option:: --db-disable
Do not use the database at all

.. highlight:: rst
.. _slarchive:
#########
slarchive
#########
**SeedLink client for data stream archiving**
Description
===========
slarchive connects to a SeedLink server, requests data streams and writes received
packets into directory/file structures (archives). The precise layout
of the directories and files is defined in a format string.
The implemented layouts are:
- :ref:`SDS <slarchive-section-sds>`: The SeisComP Data Structure, default in |scname|
- BUD: Buffer of Uniform Data structure
- DLOG: The old SeisComP/datalog structure for backwards compatibility
The duration for which the data are kept in archive is controlled by the bindings
parameter :confval:`keep`. slarchive itself does not clean the archive. For removing
old data execute :file:`$SEISCOMP_ROOT/var/lib/slarchive/purge_datafiles`. A
regular clean-up is suggested by ::
seiscomp print crontab
The resulting line, e.g. ::
20 3 * * * /home/sysop/seiscomp/var/lib/slarchive/purge_datafiles >/dev/null 2>&1
can be adjusted and added to crontab.
Background Execution
====================
When starting slarchive in |scname| as a daemon module in the background, the SDS
layout is used and the packets are written without modification: ::
$ seiscomp start slarchive
Command-Line Execution
======================
Writing to **other layouts** or to **multiple archives** and other options are
supported when executing slarchive on the command line.
E.g., to write to more than one archive, simply specify multiple format definitions
(or presets), as sketched below.
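Writing the same streams to both an SDS and a BUD archive might look like the following sketch; the preset options are assumptions derived from the layouts listed above, check the program help for the exact flags: ::

$ slarchive -SDS /data/sds -BUD /data/bud localhost:18000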
For more command-line options read the help: ::
$ slarchive -h
Multiple Instances
==================
slarchive allows generating aliases, e.g., for running multiple instances with
different module and bindings configurations. For creating/removing aliases use the
:ref:`seiscomp script <sec-management-commands>`, e.g. ::
$ seiscomp alias create slarchive2 slarchive
.. _slarchive-section-sds:
SDS definition
==============
SDS is the basic directory and file layout in |scname| for waveform archives. The
archive base directory is defined by :confval:`archive`. The SDS layout is defined
as:
.. code-block:: sh
<SDSdir>
+ year
+ network code
+ station code
+ channel code
+ one file per day and location, e.g. NET.STA.LOC.CHAN.D.YEAR.DOY
File example: :file:`<SDSdir>/Year/NET/STA/CHAN.TYPE/NET.STA.LOC.CHAN.TYPE.YEAR.DAY`.
+-----------+-----------------------------------------------+
| Field | Description |
+===========+===============================================+
| SDSdir | Arbitrary base directory |
+-----------+-----------------------------------------------+
| YEAR | 4 digit YEAR |
+-----------+-----------------------------------------------+
| NET | Network code/identifier, 1-8 characters, |
| | no spaces |
+-----------+-----------------------------------------------+
| STA | Station code/identifier, 1-8 characters, |
| | no spaces |
+-----------+-----------------------------------------------+
| CHAN | Channel code/identifier, 1-8 characters, |
| | no spaces |
+-----------+-----------------------------------------------+
| TYPE | 1 character, indicating the data type, |
| | provided types are: |
| | |
| | | **D** Waveform data |
| | | **E** Detection data |
| | | **L** Log data |
| | | **T** Timing data |
| | | **C** Calibration data |
| | | **R** Response data |
| | | **O** Opaque data |
| | |
+-----------+-----------------------------------------------+
| LOC | Location identifier, 1-8 characters, |
| | no spaces |
+-----------+-----------------------------------------------+
| DAY | 3 digit day of year, padded with zeros |
+-----------+-----------------------------------------------+
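For illustration, a day file of waveform data (type D) for the hypothetical stream GE.APE..BHZ (empty location code) recorded on day 001 of year 2024 would be stored as: ::

<SDSdir>/2024/GE/APE/BHZ.D/GE.APE..BHZ.D.2024.001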
.. _slarchive_configuration:
Module Configuration
====================
.. note::
slarchive is a :term:`standalone module` and does not inherit :ref:`global options <global-configuration>`.
| :file:`etc/defaults/slarchive.cfg`
| :file:`etc/slarchive.cfg`
| :file:`~/.seiscomp/slarchive.cfg`
.. confval:: address
Default: ``127.0.0.1``
Type: *string*
Host of the SeedLink server to connect to. If the acquisition
is running on the same system nothing needs to be changed.
.. confval:: port
Default: ``18000``
Type: *int*
The port of the SeedLink server to connect to. If the acquisition
is running on the same system this port must match the configured
local SeedLink port.
.. confval:: archive
Default: ``var/lib/archive``
Type: *string*
Path to waveform archive where all data is stored. Relative paths
\(as the default\) are treated relative to the installation
directory \(\$SEISCOMP_ROOT\).
.. confval:: buffer
Default: ``1000``
Type: *int*
Number of records \(512 byte units\) to buffer before flushing to
disk.
.. confval:: delay
Default: ``30``
Type: *int*
Unit: *s*
The network reconnect delay \(in seconds\) for the connection
to the SeedLink server. If the connection breaks for any
reason this will govern how soon a reconnection should be
attempted. The default value is 30 seconds.
.. confval:: networkTimeout
Default: ``900``
Type: *int*
Unit: *s*
The network timeout \(in seconds\) for the connection to the
SeedLink server. If no data or keepalive packets are received
in this time range the connection is closed and re\-established
\(after the reconnect delay has expired\). The default value is
900 seconds. A value of 0 disables the timeout.
.. confval:: idleTimeout
Default: ``300``
Type: *int*
Unit: *s*
Timeout for closing idle data stream files in seconds. The idle
time of the data streams is only checked when some packets have
arrived. If no packets arrive, no idle stream files will be
closed. There is no reason to change this parameter except for
the unusual case where the process is running into an open
file limit. Default is 300 seconds.
.. confval:: keepalive
Default: ``0``
Type: *int*
Unit: *s*
Interval \(in seconds\) at which keepalive \(heartbeat\) packets
are sent to the server. Keepalive packets are only sent if
nothing is received within the interval. This requires a
Seedlink version >\= 3.
.. confval:: validation.certs
Default: ``var/lib/certs``
Type: *string*
Path to the certificate store where all certificates and CRLs are stored. Relative
paths \(as the default\) are treated relative to the installation
directory \(\$SEISCOMP_ROOT\).
If the signature check is enabled slarchive loads all files at start. The store
uses the OpenSSL store format. From the official OpenSSL documentation:
\"The directory should contain one certificate or CRL per file in PEM format,
with a file name of the form hash.N for a certificate, or hash.rN for a CRL.
The .N or .rN suffix is a sequence number that starts at zero, and is incremented
consecutively for each certificate or CRL with the same hash value. Gaps in the
sequence numbers are not supported, it is assumed that there are no more objects
with the same hash beyond the first missing number in the sequence.\"
The hash value can be obtained as follows:
openssl x509 \-hash \-noout \-in <file>
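As a sketch of how a certificate might be installed into the store (file name and the printed hash are examples only, not taken from a real setup):

.. code-block:: sh

   # compute the subject hash of the certificate
   openssl x509 -hash -noout -in ca.pem       # prints e.g. 9d8e66d1
   # copy it into the store under <hash>.0 so slarchive can find it at start
   cp ca.pem $SEISCOMP_ROOT/var/lib/certs/9d8e66d1.0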
.. confval:: validation.mode
Default: ``ignore``
Type: *string*
Signatures are expected to be carried in blockette 2000
as opaque data. Modes:
ignore: Signatures will be ignored and no further actions
will be taken.
warning: Signatures will be checked and all received records
which do not carry a valid signature or no signature
at all will be logged at warning level.
skip: All received records without a valid signature
will be ignored and will not be processed.
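A minimal :file:`etc/slarchive.cfg` sketch combining the parameters above (the values are illustrative, not recommendations) could be:

.. code-block:: sh

   address = 127.0.0.1
   port = 18000
   archive = var/lib/archive
   buffer = 1000
   delay = 30
   networkTimeout = 900
   validation.mode = ignore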
Bindings Parameters
===================
.. confval:: selectors
Type: *list:string*
List of stream selectors. If left empty all available
streams will be requested. See slarchive manpage for
more information.
.. confval:: keep
Default: ``30``
Type: *int*
Unit: *day*
Number of days the data is kept in the archive. This
requires purge_datafile to be run as a cronjob.
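As an illustration, a binding profile (file name and values are hypothetical) could set both parameters, e.g. in :file:`etc/key/slarchive/profile_bh`:

.. code-block:: sh

   selectors = BHZ.D, BHN.D, BHE.D
   keep = 14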
Command-Line Options
====================
.. program:: slarchive
:program:`slarchive [OPTION]... [host][:][port]`
Address \([host][:][port]\) is a required argument. It specifies the address
of the SeedLink server in host:port format. Either the host, port or both
can be omitted. If host is omitted then localhost is assumed,
i.e. ':18000' implies 'localhost:18000'. If the port is omitted
then 18000 is assumed, i.e. 'localhost' implies 'localhost:18000'.
If only ':' is specified 'localhost:18000' is assumed.
.. option:: -V
Print program version and exit.
.. option:: -h
Print program usage and exit.
.. option:: -v
Be more verbose. This flag can be used multiple times \(\"\-v \-v\" or \"\-vv\"\)
for more verbosity. One flag: report basic handshaking \(link configuration\) details and
briefly report each packet received. Two flags: report the details of the handshaking,
each packet received and detailed connection diagnostics.
.. option:: -p
Print details of received Mini\-SEED data records. This flag can be used multiple times
\(\"\-p \-p\" or \"\-pp\"\) for more detail. One flag: a single summary line
for each data packet received. Two flags: details of the Mini\-SEED data records received,
including information from fixed header and 100\/1000\/1001 blockettes.
.. option:: -nd delay
The network reconnect delay \(in seconds\) for the connection to the SeedLink server.
If the connection breaks for any reason this will govern how soon a reconnection should
be attempted. The default value is 30 seconds.
.. option:: -nt timeout
The network timeout \(in seconds\) for the connection to the SeedLink server. If no data
or keepalive packets are received in this time range the connection is closed and
re\-established \(after the reconnect delay has expired\). The default value is 600 seconds.
A value of 0 disables the timeout.
.. option:: -k keepalive
Interval \(in seconds\) at which keepalive \(heartbeat\) packets are sent to the server.
Keepalive packets are only sent if nothing is received within the interval. Requires SeedLink
version >\= 3.
.. option:: -x statefile[:interval]
During client shutdown the last received sequence numbers and time stamps \(start times\)
for each data stream will be saved in this file. If this file exists upon startup the information
will be used to resume the data streams from the point at which they were stopped. In this way the
client can be stopped and started without data loss, assuming the data are still available on the
server. If an interval is specified, the state will be saved after every 'interval' packets
received. Otherwise the state will be saved only on normal program termination.
.. option:: -i timeout
Timeout for closing idle data stream files in seconds. The idle time of the data streams is
only checked when packets have arrived. If no packets arrive, no idle stream files will be
closed. There is no reason to change this parameter except for the unusual cases where the
process is running against an open file number limit. Default is 300 seconds.
.. option:: -d
Configure the connection in \"dial\-up\" mode. The remote server will close the connection when
it has sent all of the data in its buffers for the selected data streams. This is opposed to
the normal behavior of waiting indefinitely for data.
.. option:: -b
Configure the connection in \"batch\" mode.
.. option:: -Fi[:overlap]
Future check initially. Check the last Mini\-SEED data record in an existing archive file
and do not write new data to that file if it is older than a certain overlap. The default
overlap limit is 2 seconds; the overlap can be specified by appending a colon and the desired
overlap limit in seconds to the option. If the overlap is exceeded an error message will be
logged once for each time the file is opened. This option makes sense only for archive formats
where each unique data stream is written to a unique file \(e.g. SDS format\). If a data stream
is closed due to timeout \(see option \-i\) the initial future check will be performed when the
file is re\-opened.
.. option:: -Fc[:overlap]
Future check continuously. Available only for archive Mini\-SEED data records. Check if the
first sample of the record is older than the last sample of the previous record for a given
archive file, within a certain overlap. The default overlap limit is 2 seconds; the overlap
can be specified by appending a colon and the desired overlap limit in seconds to the option.
If the overlap is exceeded an error message will be logged once until either a non\-overlapping
packet is received or a new archive file is used. This option only makes sense for archive
formats where each unique data stream is written to a unique file \(e.g. SDS format\).
.. option:: -A format
If specified, all received packets \(Mini\-SEED records\) will be appended to a directory\/file
structure defined by format. All directories implied in the format string will be created if
necessary. The option may be used multiple times to write received packets to multiple archives.
See the section \"archiving data\".
.. option:: -SDS path
If specified, all received packets \(Mini\-SEED records\) will be saved into a Simple Data
Structure \(SDS\) dir\/file structure starting at the specified directory. This directory and
all subdirectories will be created if necessary. This option is a preset of the '\-A' option.
The SDS dir\/file structure is:
<SDSdir>\/<YEAR>\/<NET>\/<STA>\/<CHAN.TYPE>\/NET.STA.LOC.CHAN.TYPE.YEAR.DAY
See the description of the SDS structure above for details.
.. option:: -BUD path
If specified, all received waveform data packets \(Mini\-SEED data records\) will be saved into
a Buffer of Uniform Data \(BUD\) dir\/file structure starting at the specified directory.
This directory and all subdirectories will be created if necessary. This option is a preset
of the '\-A' option. The BUD dir\/file structure is:
<BUDdir>\/<NET>\/<STA>\/STA.NET.LOC.CHAN.YEAR.DAY
.. option:: -DLOG DLOGdir
If specified, all received packets \(Mini\-SEED data records\) will be saved into an old style
SeisComP\/datalog dir\/file structure starting at the specified directory. This directory and
all subdirectories will be created if necessary. This option is a preset of the '\-A' option.
The DLOG dir\/file structure is:
<DLOGdir>\/<STA>\/[LOC.]<CHAN>.<TYPE>\/STA.NET.CHAN.TYPE.YEAR.DAY.HHMM
.. option:: -l streamfile
The given file contains a list of streams. This option implies multi\-station mode.
The format of the stream list file is given below in the section \"stream list file\".
.. option:: -s selectors
Defining default selectors. If no multi\-station data streams are configured these selectors
will be used for uni\-station mode. Otherwise these selectors will be used when no selectors
are specified for a given stream with the '\-S' or '\-l' options.
.. option:: -S stream[:selectors]
The connection will be configured in multi\-station mode with optional SeedLink selectors
for each station, see examples below. Stream should be provided in NET_STA format. If no
selectors are provided for a given stream, the default selectors will be used, if defined.
Requires SeedLink >\= 2.5.
.. option:: -tw start:[end]
Specifying a time window for the data streams that is applied by the server. The format
for both times is year,month,day,hour,min,sec; for example: \"2002,08,05,14,00,00:2002,08,05,14,15,00\".
The end time is optional but the colon must be present. If no end time is specified the
server will send data indefinitely. This option will override any saved state information.
Warning: time windowing might be disabled on the remote server.
Requires SeedLink >\= 3.
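A typical invocation combining several of these options (host name, state file and stream list file are hypothetical) might look like:

.. code-block:: sh

   slarchive -x slarchive.state:100 -nt 900 -k 120 \
             -SDS /home/sysop/seiscomp/var/lib/archive \
             -l streams.list geofon.example.org:18000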
@ -0,0 +1,390 @@
.. highlight:: rst
.. _slinktool:
#########
slinktool
#########
**SeedLink query interface module**
Description
===========
slinktool connects to a :ref:`seedlink` server and queries the server for information
or requests data using uni-station or multi-station mode and prints information
about the packets received. All received packets can optionally be dumped to a
single file or saved in custom directory and file layouts.
Examples
========
All-station/Uni-station mode
----------------------------
The following would connect to a SeedLink server at slink.host.com port 18000 and
configure the link in all-station/uni-station mode, exactly which data are received
depends on the data being served by the SeedLink server on that particular port.
Additionally, all of the received packets are appended to the file 'data.mseed'
and each packet received is reported on the standard output. ::
slinktool -v -o data.mseed slink.host.com:18000
The '-s' argument could be used to indicate selectors to limit the type of packets
sent by the SeedLink server (without selectors all packet types are sent). The
following would limit this connection to BHZ channel waveform data with a location
code of 10 (see an explanation of SeedLink selectors below). Additionally another
verbose flag is given, causing slinktool to report detailed header information
from data records. ::
slinktool -vv -s 10BHZ.D -o data.mseed slink.host.com:18000
Multi-station mode
------------------
The following example would connect to a SeedLink server on localhost port 18010
and configure the link in multi-station mode. Each station specified with the '-S'
argument will be requested, optionally specifying selectors for each station. ::
slinktool -v -S GE_WLF,MN_AQU:00???,IU_KONO:BHZ.D :18010
This would request all data from the GEOFON station WLF as no selectors were indicated,
MedNet station AQU with location code 00 and all streams and waveform data from the
IU network station KONO from stream BHZ.
A variety of different data selections can be made simultaneously.
Examples:
* Horizontal BH channels, data only: ::
-s 'BHE.D BHN.D' -S 'GE_STU,GE_MALT,GE_WLF'
* Vertical channels only: ::
-s BHZ -S GE_STU,GE_WLF,GE_RUE,GE_EIL
Wildcarding network and station codes
-------------------------------------
Some SeedLink implementations support wildcarding of the network and station codes.
If this is the case, the only two wildcard characters recognized are '\*' for
one or more characters and '?' for any single character.
As an example, all US network data can be requested using the following syntax ::
-S 'US_*'
Seedlink Selectors
==================
SeedLink selectors are used to request specific types of data within a given data
stream, in effect limiting the default action of sending all data types.
A data packet is sent to the client if it matches any positive selector
(without leading "!") and doesn't match any negative selectors (with a leading "!").
The general format of selectors is LLSSS.T, where LL is location, SSS is channel
and T is type (one of [DECOTL] for Data, Event, Calibration, Blockette, Timing,
and Log records). "LL", ".T", and "LLSSS." can be omitted, implying anything in
that field. It is also possible to use "?" in place of L and S as a single character
wildcard. Multiple selectors are separated by space(s).
Examples: ::
BH? - BHZ, BHN, BHE (all record types)
00BH?.D - BHZ, BHN, BHE with location code '00' (data records)
BH? !E - BHZ, BHN, BHE (excluding detection records)
BH? E - BHZ, BHN, BHE & detection records of all channels
!LCQ !LEP - exclude LCQ and LEP channels
!L !T - exclude log and timing records
Archiving Data
==============
Using the '-A format' option received data can be saved in a custom directory and
file structure. The archive format argument is expanded for each packet processed
using the following flags: ::
n : network code, white space removed
s : station code, white space removed
l : location code, white space removed
c : channel code, white space removed
Y : year, 4 digits
y : year, 2 digits zero padded
j : day of year, 3 digits zero padded
H : hour, 2 digits zero padded
M : minute, 2 digits zero padded
S : second, 2 digits zero padded
F : fractional seconds, 4 digits zero padded
% : the percent (%) character
# : the number (#) character
t : single character type code:
D - waveform data packet
E - detection packet
C - calibration packet
T - timing packet
L - log packet
O - opaque data packet
U - unknown/general packet
I - INFO packet
? - unidentifiable packet
The flags are prefaced with either the % or # modifier. The % modifier indicates
a defining flag while the # indicates a non-defining flag. All received packets
with the same set of defining flags will be saved to the same file. Non-defining
flags will be expanded using the values in the first packet received for the
resulting file name.
Time flags are based on the start time of the given packet.
For example, the format string: ::
/archive/%n/%s/%n.%s.%l.%c.%Y.%j
would be expanded to day length files named something like: ::
/archive/NL/HGN/NL.HGN..BHE.2003.055
Using non-defining flags the format string: ::
/data/%n.%s.%Y.%j.%H:#M:#S.miniseed
would be expanded to: ::
/data/NL.HGN.2003.044.14:17:54.miniseed
resulting in hour length files because the minute and second are specified with the non-defining modifier. The minute and second fields are from the first packet in the file.
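For example, a command line that archives all data of station GE_WLF using the first format string above (server address hypothetical) could be:

.. code-block:: sh

   slinktool -S GE_WLF -A '/archive/%n/%s/%n.%s.%l.%c.%Y.%j' slink.host.com:18000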
Stream List File
=================
The stream list file used with the '-l' option is expected to define a data stream
on each line. The format of each line is: ::
Network Station [selectors]
The selectors are optional. If default selectors are also specified (with the '-s' option),
they will be used when no selectors are specified for a given stream.
Example: ::
---- Begin example file -----
# Comment lines begin with a '#' or '*'
# Example stream list file for use with the -l argument of slclient or
# with the sl_read_streamlist() libslink function.
GE ISP BH?.D
NL HGN
MN AQU BH? HH?
---- End example file -----
.. note::
All diagnostic output from slinktool is printed to standard error (stderr).
Exceptions are when
* Printing miniSEED packet details with the *-p* option.
* Printing unpacked samples with the *-u* option.
* Printing the raw or formatted responses to INFO requests.
Author of slinktool
===================
Chad Trabant
ORFEUS Data Center/EC-Project MEREDIAN
IRIS Data Management Center
Original source code: https://github.com/iris-edu/slinktool/tree/master/doc
Command-Line Options
====================
.. program:: slinktool
:program:`slinktool [OPTION]... [host][:][port]`
Address \([host][:][port]\) is a required argument. It specifies the address
of the SeedLink server in host:port format. Either the host, port or both
can be omitted. If host is omitted then localhost is assumed,
i.e. ':18000' implies 'localhost:18000'. If the port is omitted,
then 18000 is assumed, i.e. 'localhost' implies 'localhost:18000'.
If only ':' is specified, 'localhost:18000' is assumed.
General program options
-----------------------
.. option:: -V
Print program version and exit.
.. option:: -h
Print program usage and exit.
.. option:: -v
Be more verbose. This flag can be used multiple times \(\"\-v \-v\" or \"\-vv\"\)
for more verbosity. One flag: report basic handshaking \(link configuration\) details and
briefly report each packet received. Two flags: report the details of the handshaking,
each packet received and detailed connection diagnostics.
.. option:: -P
Ping the server, report the server ID and exit.
.. option:: -p
Print details of received Mini\-SEED data records. This flag can be used multiple times
\(\"\-p \-p\" or \"\-pp\"\) for more detail. One flag: a single summary line
for each data packet received. Two flags: details of the Mini\-SEED data records received,
including information from fixed header and 100\/1000\/1001 blockettes.
.. option:: -u
Print unpacked samples of data packets.
.. option:: -nd delay
The network reconnect delay for the connection to the SeedLink server.
If the connection breaks for any reason this will govern how soon a reconnection should
be attempted.
.. option:: -nt timeout
The network timeout \(in seconds\) for the connection to the SeedLink server. If no data
or keepalive packets are received in this time range the connection is closed and
re\-established \(after the reconnect delay has expired\). The default value is 600 seconds.
A value of 0 disables the timeout.
.. option:: -k interval
Interval at which keepalive \(heartbeat\) packets are sent to the server.
Keepalive packets are only sent if nothing is received within the interval.
.. option:: -x sfile[:interval]
Save\/restore stream state information to this file.
During client shutdown the last received sequence numbers and time stamps \(start times\)
for each data stream will be saved in this file. If this file exists upon startup the information
will be used to resume the data streams from the point at which they were stopped. In this way the
client can be stopped and started without data loss, assuming the data are still available on the
server. If an interval is specified, the state will be saved after every 'interval' packets
received. Otherwise the state will be saved only on normal program termination.
.. option:: -d
Configure the connection in \"dial\-up\" mode. The remote server will close the connection when
it has sent all of the data in its buffers for the selected data streams. This is opposed to
the normal behavior of waiting indefinitely for data.
.. option:: -b
Configure the connection in \"batch\" mode.
Data stream selection
---------------------
.. option:: -s selector
Selectors for uni\-station or default for multi\-station mode
.. option:: -l listfile
Read a stream list from this file for multi\-station mode
.. option:: -S streams
Define a stream list for multi\-station mode.
'streams' \= 'stream1[:selectors1],stream2[:selectors2],...'
'stream' is in NET_STA format, for example:
\-S \"IU_KONO:BHE BHN,GE_WLF,MN_AQU:HH?.D\"
.. option:: -tw begin:[end]
Specify a time window in year,month,day,hour,min,sec format.
Example: \-tw 2002,08,05,14,00,00:2002,08,05,14,15,00
The end time is optional, but the colon must be present.
Data saving options
-------------------
.. option:: -o dumpfile
Write all received records to this file
.. option:: -A format
If specified, all received packets \(Mini\-SEED records\) will be appended to a directory\/file
structure defined by format. All directories implied in the format string will be created if
necessary.
.. option:: -SDS SDSdir
If specified, all received packets \(Mini\-SEED records\) will be saved into a Simple Data
Structure \(SDS\) dir\/file structure starting at the specified directory. This directory and
all subdirectories will be created if necessary. This option is a preset of the '\-A' option.
The SDS dir\/file structure is:
<SDSdir>\/<YEAR>\/<NET>\/<STA>\/<CHAN.TYPE>\/NET.STA.LOC.CHAN.TYPE.YEAR.DAY
.. option:: -BUD BUDdir
If specified, all received waveform data packets \(Mini\-SEED data records\) will be saved into
a Buffer of Uniform Data \(BUD\) dir\/file structure starting at the specified directory.
This directory and all subdirectories will be created if necessary. This option is a preset
of the '\-A' option. The BUD dir\/file structure is:
<BUDdir>\/<NET>\/<STA>\/STA.NET.LOC.CHAN.YEAR.DAY
Data server
-----------
.. option:: -i type
Send info request, type is one of the following:
ID, CAPABILITIES, STATIONS, STREAMS, GAPS, CONNECTIONS, ALL
The returned raw XML is displayed when using this option.
.. option:: -I
Print formatted server id and version
.. option:: -L
Print formatted station list \(if supported by server\)
.. option:: -Q
Print formatted stream list \(if supported by server\)
.. option:: -G
Print formatted gap list \(if supported by server\)
.. option:: -C
Print formatted connection list \(if supported by server\)
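For instance, the following commands (assuming a server on localhost at the default port) query different kinds of server information:

.. code-block:: sh

   slinktool -I :18000            # formatted server ID and version
   slinktool -Q :18000            # formatted stream list
   slinktool -i STREAMS :18000    # raw XML response of the STREAMS info request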
@ -0,0 +1,204 @@
.. highlight:: rst
.. _slmon:
#####
slmon
#####
**SeedLink monitor creating web pages**
Description
===========
*slmon* collects waveform QC parameters from a configured :ref:`seedlink` server
and creates static HTML websites for their visualization. :ref:`Station summaries<fig-slmon>` and
:ref:`per-station channel views<fig-slmon-stat>` are available.
Setup
=====
1. Adjust the module configuration parameters of *slmon* to set the SeedLink server,
the output directory for the created webpages and other parameters.
#. Create and adjust binding profiles and station bindings for *slmon* to activate
the monitoring of the desired networks and stations.
#. Update the configuration of *slmon*
#. Start *slmon*. This step reads the QC parameters and creates the static webpage.
The webpage is not automatically updated.
.. code-block:: sh
seiscomp update-config
seiscomp start slmon
#. Restart *slmon* in order to update the webpage. The restart can be
performed by the regular system check scheduled by a crontab entry.
The example below shows a crontab entry to update the webpage
generated by *slmon* every 3 minutes:
.. code-block:: sh
*/3 * * * * /home/sysop/seiscomp/bin/seiscomp check slmon >/dev/null 2>&1
.. note::
The crontab entry can be generated and added automatically
using the seiscomp script:
.. code-block:: sh
seiscomp print crontab | crontab -
A comprehensive example for monitoring many stations of a large number of networks
is provided by `GEOFON <http://geofon.gfz-potsdam.de/waveform/status/>`_ at
`GFZ Potsdam <http://www.gfz-potsdam.de>`_, Germany. A simple example of a website
created by *slmon* is given in the :ref:`figures below<fig-slmon>`.
.. _fig-slmon:
.. figure:: media/slmon.png
:align: center
:width: 16cm
Example of a website with the station summary created by *slmon*.
.. _fig-slmon-stat:
.. figure:: media/slmon-stat.png
:align: center
:width: 16cm
Example of a website with a per-station channel view created by *slmon*.
.. _slmon_configuration:
Module Configuration
====================
.. note::
slmon is a :term:`standalone module` and does not inherit :ref:`global options <global-configuration>`.
| :file:`etc/defaults/slmon.cfg`
| :file:`etc/slmon.cfg`
| :file:`~/.seiscomp/slmon.cfg`
.. confval:: title
Default: ``"SeedLink Monitor"``
Type: *string*
Title of the web page used as heading.
.. confval:: refresh
Default: ``180``
Type: *double*
Unit: *s*
Refresh interval of the generated web page used by the browser.
The interval should be similar to the interval for starting slmon.
If empty, the web page must be manually refreshed from within
the browser.
.. confval:: address
Default: ``127.0.0.1``
Type: *string*
Host of the Seedlink server to connect to. If the acquisition
is running on one system nothing needs to be changed.
.. confval:: port
Default: ``18000``
Type: *int*
The port of the Seedlink server to connect to. If the acquisition
is running on one system this port must match the configured
local Seedlink port.
.. confval:: email
Type: *string*
e\-mail address added to web pages.
.. confval:: wwwdir
Default: ``@ROOTDIR@/var/run/@NAME@``
Type: *string*
Output directory of the web pages.
.. confval:: icon
Default: ``http://www.gfz-potsdam.de/favicon.ico``
Type: *string*
Favicon URL of the web pages. Not mandatory.
.. confval:: linkname
Default: ``GEOFON``
Type: *string*
Name of Link shown in footer of web pages.
.. confval:: linkurl
Default: ``http://www.gfz-potsdam.de/geofon/``
Type: *string*
URL referred to by linkname in footer of web pages.
.. confval:: liveurl
Default: ``http://geofon.gfz-potsdam.de/waveform/liveseis.php?station=%s``
Type: *string*
URL to live seismograms. %s will be replaced by station name.
Not mandatory.
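A minimal :file:`etc/slmon.cfg` sketch using these parameters (all values are examples only) could be:

.. code-block:: sh

   title = "SeedLink Monitor"
   refresh = 180
   address = 127.0.0.1
   port = 18000
   wwwdir = /home/sysop/public_html/slmon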
Bindings Parameters
===================
.. confval:: group
Default: ``local``
Type: *string*
Defines the group of the station that is monitored.
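A binding profile sketch (profile name and group label are hypothetical) might simply contain:

.. code-block:: sh

   group = "Local stations"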
@ -0,0 +1,76 @@
.. highlight:: rst
.. _tab2inv:
#######
tab2inv
#######
**tab to SeisComp3 inventory converter**
Description
===========
The tab2inv program is part of the nettab package distributed together with the
|scname| package. nettab is a text-based format developed at the
GEOFON data center to describe seismological instrument response information.
The tab2inv program can generate |scname| inventory files by parsing
tab files written in the nettab format.
This program can read a set of tab files in the nettab format,
verify their contents and generate inventory for the stations
described in them.
Command-Line Options
====================
.. program:: tab2inv
:program:`tab2inv [options] tab-file [more-tab-file]`
Options
-------
.. option:: -h, --help
Prints a small help message and exits.
.. option:: -f, --filterf
Indicate the filter folder. The filter folder is where the program will search for the files indicated by the Ff and If lines.
.. option:: -x, --xmlf
Indicate a folder containing a set of XML files that contains the SeisComp3 inventory files that will be used to resolve the station groups. Normally this folder is seiscomp\/etc\/inventory.
.. option:: -D, --database
.. option:: --force
Don't stop on errors of individual files. Try to perform the requested task even if some files contain formatting errors.
.. option:: -g, --generate
This option instructs the program to generate the XML document at the end of processing. Without this option the files are just parsed and loaded into objects in memory.
.. option:: -c, --check
Perform some tests after the files have been loaded. The tests include checking that all instruments can be resolved and that no unused instruments are supplied.
.. option:: -d, --default
Use this option to indicate a default file. A default file normally contains a set of rules for Networks \(Na\), Stations \(Sa\) and Instruments \(Ia\) that are applied to every object created before the attributes specified in the tab files are applied. This option helps to set parameters that should be set on all objects.
.. option:: -o, --output
Use this option to indicate the output filename for the XML file. If not indicated the program will write the output file to STDOUT.
.. option:: -i, --ip
Prefix the instrument \(datalogger or sensor\) name attribute in the generated inventory. This option is normally used when you want to convert many networks that share the same instrumentation in different runs of the program. In each run you can supply the network code and year as a prefix to guarantee that the instruments generated in the different runs have different name values, which are used as keys in the SeisComp3 inventory.
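A sketch of a typical invocation using the options above (file and folder names are hypothetical):

.. code-block:: sh

   tab2inv -c -g -f filters/ -d defaults.tab -o inventory.xml GE.tab KP.tab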
@ -0,0 +1,134 @@
.. highlight:: rst
.. _tabinvmodifier:
##############
tabinvmodifier
##############
**tab-based inventory modifier**
Description
===========
Sometimes small tweaks need to be made to inventory.
The tabinvmodifier program reads a *rules file* (a network tab file without any station lines) and applies network and station attributes to existing inventory. This method can be used to modify attributes in inventory that originate from dataless SEED or other sources. It can change inventory at the network, station, location, and channel level; it can also change sensor and datalogger attributes (Ia lines).
Currently (2016) changes to station groups (virtual networks) aren't supported.
For details of what can go in a tab file, see
`NETTAB File Format Description <http://www.seiscomp.org/wiki/doc/special/nettabv2>`_.
tabinvmodifier can either write directly to the inventory in an SC3 database, or dump its output as an XML file.
If output is as an XML file, typically this would then be moved to ~/seiscomp/etc/inventory, and then loaded into the database with `seiscomp update-config`.
Examples
========
1. Set network-level attributes. Suppose the file `ge.rules` contains
.. code-block:: sh
Nw: GE 1993/001
Na: Description="GEOFON Program, GFZ Potsdam, Germany"
Na: Remark="Access to Libyan stations and Spanish HH streams limited"
Na: Type=VBB
The first line (Nw:) specifies the network, including its start date, that these rules apply to.
The following lines starting with Na: provide values for the description, remark, and type attributes to be written into the new inventory.
Note the capital letter on the attributes Description, Remark, Type, etc.
We can use this rules file to change attributes of the GE network:
.. code-block:: sh
# Apply changes to database directly
$ tabinvmodifier -r ge.rules
# Apply changes to XML file
$ tabinvmodifier -r ge.rules --inventory-db ge.xml -o ge-mod.xml
The resulting inventory now contains:
.. code-block:: xml
<network publicID="Network#20130513163612.389203.2" code="GE">
<start>1993-01-01T00:00:00.0000Z</start>
<description>GEOFON Program, GFZ Potsdam, Germany</description>
<institutions>GFZ/partners</institutions>
<region>euromed global</region>
<type>VBB</type>
<netClass>p</netClass>
<archive>GFZ</archive>
<restricted>false</restricted>
<shared>true</shared>
<remark>access to Libyan stations and Spanish HH streams limited</remark>
<station publicID="Station#20130620185450.488952.190" code="MSBI" archiveNetworkCode="GE">
<start>2013-06-16T00:00:00.0000Z</start>
Other attributes present in inventory are left unchanged.
2. Changing location codes. (Thanks to Andres H. for this example.)
Replace the empty location code of station "KP.UPNV" with location code "00" and set the station's description and place.
The rules file is:
.. code-block:: sh
Nw: KP 1980/001
Sa: Description="GLISN Station Upernavik, Greenland" UPNV
Sa: Place="Upernavik, Greenland" UPNV
Sa: Code="00" UPNV,
The resulting inventory now contains:
.. code-block:: xml
<network publicID="Network#20140603153203.17936.2" code="KP">
<start>1980-01-01T00:00:00.0000Z</start>
...
<station publicID="Station#20140603153203.179738.3" code="UPNV">
<start>2013-08-01T00:00:00.0000Z</start>
<description>GLISN Station Upernavik, Greenland</description>
<latitude>72.7829</latitude>
<longitude>-56.1395</longitude>
<elevation>38</elevation>
<place>Upernavik, Greenland</place>
<affiliation>GLISN</affiliation>
...
<sensorLocation publicID="SensorLocation#20140603153203.181119.4" code="00">
<start>2013-08-01T00:00:00.0000Z</start>
...
</station>
</network>
Command-Line Options
====================
.. program:: tabinvmodifier
:program:`tabinvmodifier { -r | --rules } {rules file} [options]`
Options
-------
.. option:: -r, --rules
Input filename of the rules file. A rules file is mandatory.
.. option:: -e, --relaxed
Relax rules for matching NSLC items
.. option:: -o, --output
Output XML filename. Without an output file, tabinvmodifier will attempt to write to the local SeisComp3 database.
.. option:: --inventory-db
Input file containing inventory. If this option is given, an output file must be provided with `\-\-output`.