GraceDB
The bilby_pipe_gracedb
command line program provides a method
to generate ini files for a GraceDB event. This ini file can then be
used as the input for the other bilby_pipe modules.
In addition to reading the event data from GraceDB, it will attempt to copy the PSD/strain data files to the local machine.
Much of this utility's functionality assumes the user is running on the CIT cluster, e.g. that the ROQ and calibration directories are in their usual places.
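As a sketch, a typical invocation might look like the following; the event id G298936 and output directory are purely illustrative:

```shell
# Illustrative invocation (the event id G298936 is hypothetical):
# generate an ini file and DAG submission files for a GraceDB event.
bilby_pipe_gracedb --gracedb G298936 --outdir outdir_G298936 --output full
```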
Command line interface to create ini file for GraceDB event
usage: bilby_pipe_gracedb [options]
Named Arguments
- --gracedb
GraceDB event id
- --json
Path to json GraceDB file
- --psd-file
Path to ligolw-xml file containing the PSDs for the interferometers.
- --skymap-file
Path to fits file containing distance PDF. This is used to set prior bound of distance
- --disable-skymap-download
If no argument is passed to --skymap-file, the skymap is downloaded from GraceDB to determine the prior maximum of distance. This option disables that download so that the default distance maximum values are used.
Default:
False
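The distance bound derived from the skymap can be sketched as follows; the function name, quantile, and safety margin are illustrative assumptions, not bilby_pipe's actual implementation.

```python
# Hedged sketch: derive a distance prior maximum from a 1D distance PDF
# (as encoded in a skymap). The quantile and safety margin are
# illustrative choices, not the values bilby_pipe actually uses.
def distance_prior_maximum(distances, pdf, quantile=0.999, margin=1.5):
    """distances, pdf: equal-length sequences sampling the distance PDF."""
    total = sum(pdf)
    cumulative = 0.0
    for d, p in zip(distances, pdf):
        cumulative += p / total
        if cumulative >= quantile:
            # scale the quantile distance by a safety margin
            return margin * d
    return margin * distances[-1]
```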
- --convert-to-flat-in-component-mass
Convert a prior file that is flat in chirp mass and mass ratio to flat in component mass during the post-processing. Note, the prior must be uniform in Mc and q with constraints in m1 and m2 for this to work.
Default:
False
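The reweighting behind this conversion follows from the Jacobian between the two mass parametrisations; the sketch below gives the weight up to normalisation (the helper names are illustrative, not bilby's API).

```python
# A prior uniform in (chirp mass Mc, mass ratio q = m2/m1) has density
# |d(Mc, q)/d(m1, m2)| = Mc / m1**2 in component-mass space, so samples
# are reweighted by w ~ m1**2 / Mc (up to normalisation) to become flat
# in (m1, m2). Function names here are illustrative, not bilby's API.
def chirp_mass(m1, m2):
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def flat_in_component_mass_weight(m1, m2):
    return m1 ** 2 / chirp_mass(m1, m2)
```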
- --outdir
Output directory where the ini file and all output is written.
- --output
Possible choices: ini, full, full-local, full-submit
- Flag to create ini, generate directories and/or submit.
- ini : generates ini file
- full : generates ini and dag submission files (default)
- full-local : generates ini and dag submission files and runs locally
- full-submit : generates ini and dag submission files and submits to condor
Default:
'full'
- --gracedb-url
- GraceDB service URL.
- Main page : https://gracedb.ligo.org/api/ (default)
- Playground : https://gracedb-playground.ligo.org/api/
Default:
'https://gracedb.ligo.org/api/'
- --channel-dict
Possible choices: online, o2replay, o3replay, gwosc
- Channel dictionary.
- online : use for main GraceDB page events from the current observing run (default)
- o2replay : use for playground GraceDB page events
- o3replay : use for playground GraceDB page events
- gwosc : use for events where the strain data is publicly available, e.g., previous observing runs
Default:
'online'
- --sampler-kwargs
Dictionary of sampler-kwargs to pass in, e.g., {nlive: 1000}, or the name of a pre-defined set of sampler-kwargs: DynestyDefault, BilbyMCMCDefault, or FastTest.
Default:
'DynestyDefault'
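The two accepted forms can be illustrated with a minimal parser; this is a hedged sketch, not bilby_pipe's own (more general) string-to-dict conversion.

```python
# Hedged sketch of handling the relaxed "{nlive: 1000}" syntax accepted
# on the command line. bilby_pipe's real parser is more general.
import ast
import re

def parse_sampler_kwargs(value):
    presets = {"DynestyDefault", "BilbyMCMCDefault", "FastTest"}
    if value in presets:
        return value  # expanded to a full kwargs set internally
    # quote bare keys so the string becomes a valid Python dict literal
    quoted = re.sub(r"(\w+)\s*:", r"'\1':", value)
    return ast.literal_eval(quoted)
```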
- --cbc-likelihood-mode
Built-in CBC likelihood mode or path to a JSON file containing likelihood settings. The built-in settings include 'phenompv2_bbh_roq', 'lowspin_phenomd_narrowmc_roq', 'lowspin_phenomd_broadmc_roq', 'lowspin_phenomd_fhigh1024_roq', 'lowspin_taylorf2_roq', 'phenompv2_bns_roq', 'phenompv2nrtidalv2_roq', 'low_q_phenompv2_roq', 'phenomxphm_roq', and 'test'.
Default:
'phenompv2_bbh_roq'
- --webdir
- Directory to store summary pages.
If not given, defaults to outdir/results_page
- --settings
JSON file containing extra settings to override the defaults
- --psd-cut
The maximum frequency is set to this value multiplied by the maximum frequency of the PSD contained in coinc.xml. This avoids likelihood overflow caused by the roll-off of the pipeline PSD due to its low-pass filter.
Default:
0.95
- --query-kafka
When fetching the data for analysis, first check whether it is available in Kafka; if not, fall back to querying ifocache. If False, query ifocache (via gwpy TimeSeries.get()) by default.
Default:
True
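The query order this flag controls can be sketched as follows; fetch_from_kafka and fetch_from_ifocache are hypothetical stand-ins, not real bilby_pipe or gwpy functions.

```python
# Hedged sketch of the data-source fallback controlled by --query-kafka.
# fetch_from_kafka / fetch_from_ifocache are hypothetical stand-ins,
# passed in here as callables so the logic is self-contained.
def get_strain(fetch_from_kafka, fetch_from_ifocache, query_kafka=True):
    if query_kafka:
        data = fetch_from_kafka()
        if data is not None:
            return data  # found in Kafka
    # Kafka was skipped, or the data was not there:
    return fetch_from_ifocache()  # e.g. via gwpy TimeSeries.get()
```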