pycbc.events package

Submodules

pycbc.events.coherent module

This module contains functions for calculating and manipulating coherent triggers.

pycbc.events.coherent.coherent_snr(snr_triggers, index, threshold, projection_matrix, coinc_snr=None)[source]

Calculate the coherent SNR for a given set of triggers. See Eq. 2.26 of Harry & Fairhurst (2011) [arXiv:1012.4939].

Parameters:
  • snr_triggers (dict) – Dictionary of the normalised complex snr time series for each ifo

  • index (numpy.ndarray) – Array of the indexes corresponding to triggers

  • threshold (float) – Coherent SNR threshold. Triggers below this are cut

  • projection_matrix (numpy.ndarray) – Matrix that projects the signal onto the network

  • coinc_snr (optional) – The coincident SNR for each trigger (default: None)

Returns:

  • rho_coh (numpy.ndarray) – Array of coherent SNR for the detector network

  • index (numpy.ndarray) – Indexes that survive cuts

  • snrv (dict) – Dictionary of individual detector triggers that survive cuts

  • coinc_snr (list or None (default: None)) – The coincident SNR values for triggers surviving the coherent cut
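
A minimal usage sketch combining get_projection_matrix (documented below) with coherent_snr. All inputs here (SNR series, antenna factors, sensitivities, trigger indexes) are illustrative placeholders, and plain numpy arrays are assumed to be acceptable stand-ins for the per-ifo SNR time series:

    import numpy as np
    from pycbc.events import coherent

    ifos = ['H1', 'L1']
    rng = np.random.default_rng(0)
    # Illustrative normalised complex SNR series for each detector
    snr_triggers = {ifo: rng.normal(size=4096) + 1j * rng.normal(size=4096)
                    for ifo in ifos}
    index = np.array([100, 2000, 3500])   # candidate trigger indexes

    # Illustrative antenna responses and sensitivity weights per detector
    f_plus = {'H1': 0.3, 'L1': 0.4}
    f_cross = {'H1': 0.5, 'L1': 0.2}
    sigma = {'H1': 10.0, 'L1': 12.0}
    proj = coherent.get_projection_matrix(f_plus, f_cross, sigma,
                                          projection='standard')

    rho_coh, index, snrv, coinc = coherent.coherent_snr(
        snr_triggers, index, threshold=4.0, projection_matrix=proj)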

pycbc.events.coherent.coincident_snr(snr_dict, index, threshold, time_delay_idx)[source]

Calculate the coincident SNR for all coincident triggers above threshold

Parameters:
  • snr_dict (dict) – Dictionary of individual detector SNRs

  • index (list) – List of indexes (geocentric) for which to calculate coincident SNR

  • threshold (float) – Coincident SNR threshold. Triggers below this are cut

  • time_delay_idx (dict) – Dictionary of time delay from geocenter in indexes for each detector

Returns:

  • rho_coinc (numpy.ndarray) – Coincident SNR values for surviving triggers

  • index (list) – The subset of input indexes corresponding to triggers that survive the cuts

  • coinc_triggers (dict) – Dictionary of individual detector SNRs for triggers that survive cuts

pycbc.events.coherent.get_coinc_indexes(idx_dict, time_delay_idx)[source]

Return the indexes corresponding to coincident triggers

Parameters:
  • idx_dict (dict) – Dictionary of indexes of triggers above threshold in each detector

  • time_delay_idx (dict) – Dictionary giving time delay index (time_delay*sample_rate) for each ifo

Returns:

coinc_idx – List of indexes for triggers in geocent time that appear in multiple detectors

Return type:

list

pycbc.events.coherent.get_coinc_triggers(snrs, idx, t_delay_idx)[source]

Returns the coincident triggers from the longer SNR timeseries

Parameters:
  • snrs (dict) – Dictionary of single detector SNR time series

  • idx (list) – List of geocentric time indexes of coincident triggers

  • t_delay_idx (dict) – Dictionary of indexes corresponding to light travel time from geocenter for each detector

Returns:

coincs – Dictionary of coincident trigger SNRs in each detector

Return type:

dict

pycbc.events.coherent.get_projection_matrix(f_plus, f_cross, sigma, projection='standard')[source]

Calculate the matrix that projects the signal onto the network. Definitions can be found in Fairhurst (2018) [arXiv:1712.04724]. For the standard projection see Eq. 8, and for left/right circular projections see Eq. 21, with further discussion in Appendix A. See also Williamson et al. (2014) [arXiv:1410.6042] for discussion in context of the GRB search with restricted binary inclination angles.

Parameters:
  • f_plus (dict) – Dictionary containing the plus antenna response factors for each IFO

  • f_cross (dict) – Dictionary containing the cross antenna response factors for each IFO

  • sigma (dict) – Dictionary of the sensitivity weights for each IFO

  • projection (string, optional; default 'standard') – The signal polarization to project. Choice of ‘standard’ (unrestricted; default), ‘right’ or ‘left’ (circular polarizations)

Returns:

projection_matrix – The matrix that projects the signal onto the detector network

Return type:

np.ndarray

pycbc.events.coherent.network_chisq(chisq, chisq_dof, snr_dict)[source]

Calculate the network chi-squared statistic. This is the sum of SNR-weighted individual detector chi-squared values. See Eq. 5.4 of Dorrington (2019) [http://orca.cardiff.ac.uk/id/eprint/128124].

Parameters:
  • chisq (dict) – Dictionary of individual detector chi-squared statistics

  • chisq_dof (dict) – Dictionary of the number of degrees of freedom of the chi-squared statistic

  • snr_dict (dict) – Dictionary of complex individual detector SNRs

Returns:

net_chisq – Network chi-squared values

Return type:

list

pycbc.events.coherent.null_snr(rho_coh, rho_coinc, apply_cut=True, null_min=5.25, null_grad=0.2, null_step=20.0, index=None, snrv=None)[source]

Calculate the null SNR and optionally apply a threshold cut, removing triggers whose null SNR exceeds null_min where the coherent SNR is below null_step, and exceeds (null_grad * rho_coh + null_min) elsewhere. See Eq. 3.1 of Harry & Fairhurst (2011) [arXiv:1012.4939] or Eqs. 11 and 12 of Williamson et al. (2014) [arXiv:1410.6042].

Parameters:
  • rho_coh (numpy.ndarray) – Array of coherent snr triggers

  • rho_coinc (numpy.ndarray) – Array of coincident snr triggers

  • apply_cut (bool) – Apply a cut and downweight on null SNR determined by null_min, null_grad, null_step (default True)

  • null_min (scalar) – Any trigger with null SNR below this is retained

  • null_grad (scalar) – Gradient of null SNR cut where coherent SNR > null_step

  • null_step (scalar) – The threshold in coherent SNR rho_coh above which the null SNR threshold increases as null_grad * rho_coh

  • index (dict or None (optional; default None)) – Indexes of triggers. If given, will remove triggers that fail cuts

  • snrv (dict or None (optional; default None)) – Individual detector SNRs. If given, will remove triggers that fail cuts

Returns:

  • null (numpy.ndarray) – Null SNR for surviving triggers

  • rho_coh (numpy.ndarray) – Coherent SNR for surviving triggers

  • rho_coinc (numpy.ndarray) – Coincident SNR for surviving triggers

  • index (dict) – Indexes for surviving triggers

  • snrv (dict) – Single detector SNRs for surviving triggers

pycbc.events.coherent.reweight_snr_by_null(network_snr, null, coherent, null_min=5.25, null_grad=0.2, null_step=20.0)[source]

Re-weight the detection statistic as a function of the null SNR. See Eq. 16 of Williamson et al. (2014) [arXiv:1410.6042].

Parameters:
  • network_snr (numpy.ndarray) – Array containing SNR statistic to be re-weighted

  • null (numpy.ndarray) – Null SNR array

  • coherent (numpy.ndarray) – Coherent SNR array

Returns:

rw_snr – Re-weighted SNR for each trigger

Return type:

numpy.ndarray
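
A brief sketch of applying the re-weighting, assuming the network, null and coherent SNR arrays have already been computed (the values below are illustrative):

    import numpy as np
    from pycbc.events import coherent

    network_snr = np.array([10.0, 12.0, 15.0])   # statistic to re-weight
    null = np.array([1.0, 4.0, 7.0])             # null SNR per trigger
    rho_coh = np.array([10.0, 12.0, 15.0])       # coherent SNR per trigger

    rw_snr = coherent.reweight_snr_by_null(network_snr, null, rho_coh)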

pycbc.events.coherent.reweightedsnr_cut(rw_snr, rw_snr_threshold)[source]

Perform a cut on reweighted SNR based on a given threshold

Parameters:
  • rw_snr (numpy.ndarray) – Array of reweighted SNR values

  • rw_snr_threshold (float) – Any reweighted SNR below this threshold is set to 0

Returns:

rw_snr – Array of reweighted SNR with values below the threshold set to 0

Return type:

numpy.ndarray
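
A minimal usage sketch with illustrative values:

    import numpy as np
    from pycbc.events import coherent

    rw_snr = np.array([3.0, 5.5, 8.0])
    cut = coherent.reweightedsnr_cut(rw_snr, 5.0)
    # Entries below the threshold are set to 0, e.g. [0., 5.5, 8.]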

pycbc.events.coinc module

This module contains functions for calculating and manipulating coincident triggers.

class pycbc.events.coinc.CoincExpireBuffer(expiration, ifos, initial_size=1048576, dtype=<class 'numpy.float32'>)[source]

Bases: object

Unordered dynamic sized buffer that handles multiple expiration vectors.

add(values, times, ifos)[source]

Add values to the internal buffer

Parameters:
  • values (numpy.ndarray) – Array of elements to add to the internal buffer.

  • times (dict of arrays) – The current time to use for each element being added.

  • ifos (list of strs) – The set of timers to be incremented.

property data

Return the array of elements

increment(ifos)[source]

Increment without adding triggers

property nbytes

Returns the approximate memory usage of self.

num_greater(value)[source]

Return the number of elements larger than ‘value’

remove(num)[source]

Remove the last ‘num’ elements from the buffer

class pycbc.events.coinc.LiveCoincTimeslideBackgroundEstimator(num_templates, analysis_block, background_statistic, sngl_ranking, stat_files, ifos, ifar_limit=100, timeslide_interval=0.035, coinc_threshold=0.002, return_background=False, **kwargs)[source]

Bases: object

Rolling buffer background estimation.

add_singles(results)[source]

Add singles to the background estimate and find candidates

Parameters:

results (dict) – Dictionary of dictionaries indexed by ifo and keys such as ‘snr’, ‘chisq’, etc. The specific format is determined by the LiveBatchMatchedFilter class.

Returns:

coinc_results – A dictionary of arrays containing the coincident results.

Return type:

dict of arrays

property background_time

Return the amount of background time that the buffers contain

backout_last(updated_singles, num_coincs)[source]

Remove the recently added singles and coincs

Parameters:
  • updated_singles (dict of numpy.ndarrays) – Array of indices that have been just updated in the internal buffers of single detector triggers.

  • num_coincs (int) – The number of coincs that were just added to the internal buffer of coincident triggers

classmethod from_cli(args, num_templates, analysis_chunk, ifos)[source]
ifar(coinc_stat)[source]

Map a given value of the coincident ranking statistic to an inverse false-alarm rate (IFAR) using the internally stored background sample.

Parameters:

coinc_stat (float) – Value of the coincident ranking statistic to be converted.

Returns:

  • ifar (float) – Inverse false-alarm rate in units of years.

  • ifar_saturated (bool) – True if coinc_stat is larger than all the available background, in which case ifar is to be considered an upper limit.

static insert_args(parser)[source]
classmethod pick_best_coinc(coinc_results)[source]

Choose the best two-ifo coinc by ifar first, then statistic if needed.

This function picks which of the available double-ifo coincs to use. It chooses the best (highest) ifar. The ranking statistic is used as a tie-breaker. A trials factor is applied if multiple types of coincs are possible at this time given the active ifos.

Parameters:

coinc_results (list of coinc result dicts) – Dictionary by detector pair of coinc result dicts.

Returns:

best – If there is a coinc, this will contain the ‘best’ one. Otherwise it will return the provided dict.

Return type:

coinc results dict

static restore_state(filename)[source]

Restore state of the background buffers from a file

save_state(filename)[source]

Save the current state of the background buffers

set_singles_buffer(results)[source]

Create the singles buffer

This creates the singles buffer for each ifo. The dtype is determined by a representative sample of the single triggers in the results.

Parameters:

results (dict of dict) – Dict indexed by ifo and then trigger column.

static verify_args(args, parser)[source]

Verify that psd-var-related options are consistent

class pycbc.events.coinc.MultiRingBuffer(num_rings, max_time, dtype, min_buffer_size=16, buffer_increment=8, resize_invalid_fraction=0.4)[source]

Bases: object

Dynamic size n-dimensional ring buffer that can expire elements.

add(indices, values)[source]

Add triggers in ‘values’ to the buffers indicated by the indices

advance_time()[source]

Advance the internal time increment by 1, expiring any triggers that are now too old.

check_expired_triggers(buffer_index)[source]

Check if we should free memory for this buffer index.

Check what fraction of triggers are expired in the specified buffer and, if it is more than the allowed fraction (set by self.resize_invalid_fraction), resize the array to remove them.

data(buffer_index)[source]

Return the data vector for a given ring buffer

discard_last(indices)[source]

Discard the triggers added in the latest update

expire_vector(buffer_index)[source]

Return the expiration vector of a given ring buffer

property filled_time
property nbytes
num_elements()[source]
update_valid_start(buffer_index)[source]

Update the valid_start for the given buffer index

valid_slice(buffer_index)[source]

Return the valid slice for this buffer index

pycbc.events.coinc.background_bin_from_string(background_bins, data)[source]

Return template ids for each bin as defined by the format string

Parameters:
  • bins (list of strings) – List of strings which define how a background bin is taken from the list of templates.

  • data (dict of numpy.ndarrays) – Dict with parameter key values and numpy.ndarray values which define the parameters of the template bank to bin up.

Returns:

bins – Dictionary of location indices indexed by a bin name

Return type:

dict

pycbc.events.coinc.cluster_coincs(stat, time1, time2, timeslide_id, slide, window, **kwargs)[source]

Cluster coincident events for each timeslide separately, across templates, based on the ranking statistic

Parameters:
  • stat (numpy.ndarray) – vector of ranking values to maximize

  • time1 (numpy.ndarray) – first time vector

  • time2 (numpy.ndarray) – second time vector

  • timeslide_id (numpy.ndarray) – vector that determines the timeslide offset

  • slide (float) – length of the timeslides offset interval

  • window (float) – length to cluster over

Returns:

cindex – The set of indices corresponding to the surviving coincidences.

Return type:

numpy.ndarray

pycbc.events.coinc.cluster_coincs_multiifo(stat, time_coincs, timeslide_id, slide, window, **kwargs)[source]

Cluster coincident events for each timeslide separately, across templates, based on the ranking statistic

Parameters:
  • stat (numpy.ndarray) – vector of ranking values to maximize

  • time_coincs (tuple of numpy.ndarrays) – trigger times for each ifo, or -1 if an ifo does not participate in a coinc

  • timeslide_id (numpy.ndarray) – vector that determines the timeslide offset

  • slide (float) – length of the timeslides offset interval

  • window (float) – duration of clustering window in seconds

Returns:

cindex – The set of indices corresponding to the surviving coincidences

Return type:

numpy.ndarray

pycbc.events.coinc.cluster_over_time(stat, time, window, method='python', argmax=<function argmax>)[source]

Cluster generalized transient events over time via maximum stat over a symmetric sliding window

Parameters:
  • stat (numpy.ndarray) – vector of ranking values to maximize

  • time (numpy.ndarray) – time to use for clustering

  • window (float) – length to cluster over

  • method (string) – Either “cython” to use the cython implementation, or “python” to use the pure python version.

  • argmax (function) – the function used to calculate the maximum value

Returns:

cindex – The set of indices corresponding to the surviving coincidences.

Return type:

numpy.ndarray
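
For example, the sketch below (illustrative values) keeps only the trigger with the maximum stat within each 1 s sliding window:

    import numpy as np
    from pycbc.events import coinc

    stat = np.array([5.0, 7.0, 6.0, 9.0, 4.0])
    time = np.array([100.0, 100.5, 101.0, 150.0, 150.2])
    cidx = coinc.cluster_over_time(stat, time, window=1.0)
    # cidx indexes the surviving (locally loudest) triggers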

pycbc.events.coinc.mean_if_greater_than_zero(vals)[source]

Calculate mean over numerical values, ignoring values less than zero. E.g. used for mean time over coincident triggers when timestamps are set to -1 for ifos not included in the coincidence.

Parameters:

vals (iterator of numerical values) – values to be mean averaged

Returns:

  • mean (float) – The mean of the values in the original vector which are greater than zero

  • num_above_zero (int) – The number of entries in the vector which are above zero
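
A numpy sketch of the same logic, for illustration only (the library function should be preferred; the treatment of entries exactly equal to zero may differ):

    import numpy as np

    def mean_if_greater_than_zero_sketch(vals):
        vals = np.asarray(list(vals), dtype=float)
        above = vals[vals > 0]            # e.g. ignore times set to -1
        n_above = len(above)
        mean = above.mean() if n_above else np.nan
        return mean, n_above

    # e.g. coincident trigger times with one ifo not participating (-1):
    mean_if_greater_than_zero_sketch([1126259462.40, -1, 1126259462.41])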

pycbc.events.coinc.time_coincidence(t1, t2, window, slide_step=0)[source]

Find coincidences by time window

Parameters:
  • t1 (numpy.ndarray) – Array of trigger times from the first detector

  • t2 (numpy.ndarray) – Array of trigger times from the second detector

  • window (float) – Coincidence window maximum time difference, arbitrary units (usually s)

  • slide_step (float (default 0)) – If calculating background coincidences, the interval between background slides, arbitrary units (usually s)

Returns:

  • idx1 (numpy.ndarray) – Array of indices into the t1 array for coincident triggers

  • idx2 (numpy.ndarray) – Array of indices into the t2 array

  • slide (numpy.ndarray) – Array of slide ids
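
A usage sketch with illustrative trigger times; with slide_step left at its default of 0 this is a zero-lag coincidence test:

    import numpy as np
    from pycbc.events import coinc

    t1 = np.array([100.000, 200.000, 300.000])
    t2 = np.array([100.010, 250.000, 299.995])
    idx1, idx2, slide = coinc.time_coincidence(t1, t2, window=0.02)
    # Pairs (t1[idx1], t2[idx2]) differ by less than the 0.02 s window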

pycbc.events.coinc.time_multi_coincidence(times, slide_step=0, slop=0.003, pivot='H1', fixed='L1')[source]

Find multi detector coincidences.

Parameters:
  • times (dict of numpy.ndarrays) – Dictionary keyed by ifo of single ifo trigger times

  • slide_step (float) – Interval between time slides

  • slop (float) – The amount of time to add to the TOF between detectors for coincidence

  • pivot (str) – The ifo to which time shifts are applied in first stage coincidence

  • fixed (str) – The other ifo used in first stage coincidence, subsequently used as a time reference for additional ifos. All other ifos are not time shifted relative to this ifo

Returns:

  • ids (dict of arrays of int) – Dictionary keyed by ifo with ids of trigger times forming coincidences. Coincidence is tested for every pair of ifos that can be formed from the input dict: only those tuples of times passing all tests are recorded

  • slide (array of int) – Slide ids of coincident triggers in pivot ifo

pycbc.events.coinc.timeslide_durations(start1, start2, end1, end2, timeslide_offsets)[source]

Find the coincident time for each timeslide.

Find the coincident time for each timeslide, where the first time vector is slid to the right by the offset in the given timeslide_offsets vector.

Parameters:
  • start1 (numpy.ndarray) – Array of the start of valid analyzed times for detector 1

  • start2 (numpy.ndarray) – Array of the start of valid analyzed times for detector 2

  • end1 (numpy.ndarray) – Array of the end of valid analyzed times for detector 1

  • end2 (numpy.ndarray) – Array of the end of valid analyzed times for detector 2

  • timeslide_offsets (numpy.ndarray) – Array of offsets (in seconds) for each timeslide

Returns:

durations – Array of coincident time for each timeslide in the offset array

Return type:

numpy.ndarray

pycbc.events.coinc_rate module

This module contains functions for calculating expected rates of noise and signal coincidences.

pycbc.events.coinc_rate.combination_noise_lograte(log_rates, slop)[source]

Calculate the expected rate of noise coincidences for a combination of detectors given log of single detector noise rates

Parameters:
  • log_rates (dict) – Key: ifo string, Value: sequence of log single-detector trigger rates, units assumed to be Hz

  • slop (float) – time added to maximum time-of-flight between detectors to account for timing error

Returns:

Expected log coincidence rate in the combination, units Hz

Return type:

numpy array
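
A usage sketch with illustrative per-template trigger rates. For two detectors the result is roughly log(r1) + log(r2) plus the log of the allowed coincidence window, i.e. twice the time of flight plus slop (see multiifo_noise_coincident_area below):

    import numpy as np
    from pycbc.events import coinc_rate

    # Illustrative log single-detector trigger rates (Hz) for two templates
    log_rates = {'H1': np.log(np.array([1.0e-2, 1.0e-3])),
                 'L1': np.log(np.array([2.0e-2, 5.0e-3]))}
    log_coinc_rate = coinc_rate.combination_noise_lograte(log_rates, slop=0.005)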

pycbc.events.coinc_rate.combination_noise_rate(rates, slop)[source]

Calculate the expected rate of noise coincidences for a combination of detectors WARNING: for high stat values, numerical underflow can occur

Parameters:
  • rates (dict) – Key: ifo string, Value: sequence of single-detector trigger rates, units assumed to be Hz

  • slop (float) – time added to maximum time-of-flight between detectors to account for timing error

Returns:

Expected coincidence rate in the combination, units Hz

Return type:

numpy array

pycbc.events.coinc_rate.multiifo_noise_coincident_area(ifos, slop)[source]

Calculate the total extent of time offset between 2 detectors, or area of the 2d space of time offsets for 3 detectors, for which a coincidence can be generated. Cannot yet handle more than 3 detectors.

Parameters:
  • ifos (list of strings) – list of interferometers

  • slop (float) – extra time to add to maximum time-of-flight for timing error

Returns:

allowed_area – area in units of seconds^(n_ifos-1) that coincident values can fall in

Return type:

float

pycbc.events.coinc_rate.multiifo_noise_lograte(log_rates, slop)[source]

Calculate the expected rate of noise coincidences for multiple combinations of detectors

Parameters:
  • log_rates (dict) – Key: ifo string, Value: sequence of log single-detector trigger rates, units assumed to be Hz

  • slop (float) – time added to maximum time-of-flight between detectors to account for timing error

Returns:

expected_log_rates – Key: ifo combination string Value: expected log coincidence rate in the combination, units log Hz

Return type:

dict

pycbc.events.coinc_rate.multiifo_signal_coincident_area(ifos)[source]

Calculate the area in which signal time differences are physically allowed

Parameters:

ifos (list of strings) – list of interferometers

Returns:

allowed_area – area in units of seconds^(n_ifos-1) that coincident signals will occupy

Return type:

float

pycbc.events.cuts module

This module contains functions for reading in command line options and applying cuts to triggers or templates in the offline search

pycbc.events.cuts.apply_template_cuts(bank, template_cut_dict, template_ids=None, statistic=None, ifos=None)[source]

Fetch/calculate the parameter for the templates, possibly already preselected by template_ids, and then apply the cuts defined in template_cut_dict. As this is used to select templates for use in findtrigs codes, we remove anything which does not pass

Parameters:
  • bank (h5py File object, or a dictionary) – Must contain the usual template bank datasets

  • template_cut_dict (dictionary) – Dictionary with tuples of (parameter, cut_function) as keys, cut_thresholds as values made using ingest_cuts_option_group function

  • template_ids (list of indices, optional) – Indices of templates to consider within the bank, useful if templates have already been down-selected

  • statistic (optional) – A PyCBC ranking statistic instance. Used for the template fit cuts. If fits_by_tid does not exist for each ifo, then template fit cuts will be skipped. If a fit cut has been specified and fits_by_tid does not exist for all ifos, an error will be raised. If not supplied, no template fit cuts will be attempted.

  • ifos (list of strings, optional) – List of IFOS used in this findtrigs instance. Templates must pass cuts in all IFOs. This is important e.g. for template fit parameter cuts.

Returns:

tids_out – Array of template_ids which have passed all cuts

Return type:

numpy array

pycbc.events.cuts.apply_template_fit_cut(statistic, ifos, parameter_cut_function, cut_thresh, template_ids)[source]

Apply cuts to template fit parameters. These need a few more checks, so they are separated out from apply_template_cuts

Parameters:
  • statistic – A PyCBC ranking statistic instance. Used for the template fit cuts. If fits_by_tid does not exist for each ifo, then template fit cuts will be skipped. If a fit cut has been specified and fits_by_tid does not exist for all ifos, an error will be raised.

  • ifos (list of strings) – List of IFOS used in this findtrigs instance. Templates must pass cuts in all IFOs.

  • parameter_cut_function (tuple) – First entry: Which parameter is being used for the cut? Second entry: Cut function

  • cut_thresh (float or int) – Cut threshold to the parameter according to the cut function

  • template_ids (numpy array) – Array of template_ids which have passed previous cuts

Returns:

tids_out – Array of template_ids which have passed this cut

Return type:

numpy array

pycbc.events.cuts.apply_trigger_cuts(triggers, trigger_cut_dict, statistic=None)[source]

Fetch/calculate the parameter for triggers, and then apply the cuts defined in trigger_cut_dict

Parameters:
  • triggers (ReadByTemplate object or dictionary) – The triggers in this particular template. This must have the correct datasets required to calculate the values we cut on.

  • trigger_cut_dict (dictionary) – Dictionary with tuples of (parameter, cut_function) as keys, cut_thresholds as values made using ingest_cuts_option_group function

Returns:

idx_out – An array of the indices which meet the criteria set by the dictionary

Return type:

numpy array

pycbc.events.cuts.check_update_cuts(cut_dict, new_cut)[source]

Update a cuts dictionary, but check whether the cut exists already; if so, warn and apply only the strictest cut

Parameters:
  • cut_dict (dictionary) – Dictionary containing the cuts to be checked, will be updated

  • new_cut (single-entry dictionary) – dictionary to define the new cut which is being considered to add

pycbc.events.cuts.convert_inputstr(inputstr, choices)[source]

Convert the inputstr into a dictionary keyed on parameter with a tuple of the function to be used in the cut, and the float to compare to. Do input checks

pycbc.events.cuts.ingest_cuts_option_group(args)[source]

Return dictionaries for trigger and template cuts.

pycbc.events.cuts.insert_cuts_option_group(parser)[source]

Add options to the parser for cuts to the templates/triggers

pycbc.events.cuts.sigma_multiple_cut_thresh(template_ids, statistic, cut_thresh, ifo)[source]

Apply cuts based on a multiple of the median sigma value for the template

Parameters:
  • template_ids – template_id values for each of the triggers to be considered; these will be used to associate a sigma threshold with each trigger

  • statistic – A PyCBC ranking statistic instance. Used to get the median_sigma value for the cuts. If fits_by_tid does not exist for the specified ifo (where median_sigma lives), an error will be raised.

  • ifo – The IFO for which we want to read median_sigma

  • cut_thresh (int or float) – The multiple of median_sigma to compare triggers to

Returns:

idx_out – An array of the indices of triggers which meet the criteria set by the dictionary

Return type:

numpy array

pycbc.events.eventmgr module

This module defines functions for clustering and thresholding time series to produce event triggers

class pycbc.events.eventmgr.EventManager(opt, column, column_types, **kwds)[source]

Bases: object

add_template_events(columns, vectors)[source]

Add a vector indexed

add_template_params(**kwds)[source]
chisq_threshold(value, num_bins, delta=0)[source]
cluster_template_events(tcolumn, column, window_size)[source]

Cluster the internal events over the named column

consolidate_events(opt, gwstrain=None)[source]
finalize_events()[source]
finalize_template_events()[source]
classmethod from_multi_ifo_interface(opt, ifo, column, column_types, **kwds)[source]

Using this for a single ifo from the multi-ifo interface requires some small fixes to the opt structure; this method does that. As we edit the opt structure, the process_params table will not be correct.

keep_loudest_in_interval(window, num_keep, statname='newsnr', log_chirp_width=None)[source]
keep_near_injection(window, injections)[source]
make_output_dir(outname)[source]
new_template(**kwds)[source]
newsnr_threshold(threshold)[source]

Remove events with newsnr smaller than given threshold

static restore_state(filename)[source]

Restore state of the background buffers from a file

save_performance(ncores, nfilters, ntemplates, run_time, setup_time)[source]

Calls variables from pycbc_inspiral to be used in a timing calculation

save_state(tnum_finished, filename)[source]

Save the current state of the background buffers

write_events(outname)[source]

Write the found events to a file. The only currently supported format is HDF5, indicated by an .hdf or .h5 extension.

write_to_hdf(outname)[source]
class pycbc.events.eventmgr.EventManagerCoherent(opt, ifos, column, column_types, network_column, network_column_types, segments, time_slides, psd=None, **kwargs)[source]

Bases: EventManagerMultiDetBase

add_template_events_to_network(columns, vectors)[source]

Add a vector indexed

add_template_network_events(columns, vectors)[source]

Add a vector indexed

cluster_template_network_events(tcolumn, column, window_size, slide=0)[source]

Cluster the internal events over the named column

Parameters:
  • tcolumn – Indicates which column contains the time.

  • column – The named column to cluster.

  • window_size – The size of the window.

  • slide – Default is 0.

finalize_template_events()[source]
write_to_hdf(outname)[source]
class pycbc.events.eventmgr.EventManagerMultiDet(opt, ifos, column, column_types, psd=None, **kwargs)[source]

Bases: EventManagerMultiDetBase

cluster_template_events_single_ifo(tcolumn, column, window_size, ifo)[source]

Cluster the internal events over the named column

finalize_template_events(perform_coincidence=True, coinc_window=0.0)[source]
write_to_hdf(outname)[source]
class pycbc.events.eventmgr.ThresholdCluster(*args, **kwargs)[source]

Bases: object

Create a threshold and cluster engine

Parameters:

series (complex64) – Input pycbc.types.Array (or subclass); it will be searched for points above threshold that are then clustered

pycbc.events.eventmgr.cluster_reduce(idx, snr, window_size)[source]

Reduce the events by clustering over a window

Parameters:
  • indices (Array) – The list of indices of the SNR values

  • snr (Array) – The list of SNR values

  • window_size (int) – The size of the window in integer samples.

Returns:

  • indices (Array) – The list of indices of the SNR values

  • snr (Array) – The list of SNR values

pycbc.events.eventmgr.findchirp_cluster_over_window(times, values, window_length)[source]

Reduce the events by clustering over a window using the FindChirp clustering algorithm

Parameters:
  • indices (Array) – The list of indices of the SNR values

  • snr (Array) – The list of SNR values

  • window_size (int) – The size of the window in integer samples. Must be positive.

Returns:

indices – The reduced list of indices of the SNR values

Return type:

Array

pycbc.events.eventmgr.threshold(series, value)[source]

Return the list of values and indices of points above threshold in the series.

pycbc.events.eventmgr.threshold_and_cluster(series, threshold, window)[source]

Return the list of values and indices of points above threshold in the series.

pycbc.events.eventmgr.threshold_only(series, value)[source]

Return the list of values and indices of points in the series that are larger (in absolute value) than value

pycbc.events.eventmgr.threshold_real_numpy(series, value)[source]

pycbc.events.eventmgr_cython module

pycbc.events.eventmgr_cython.coincbuffer_expireelements(cbuffer, timer1, timer2, time1, time2, expiration, length)
pycbc.events.eventmgr_cython.coincbuffer_numgreater(cbuffer, length, value)
pycbc.events.eventmgr_cython.findchirp_cluster_over_window_cython(times, absvalues, window_length, indices, tlen)
pycbc.events.eventmgr_cython.logsignalrateinternals_compute2detrate(nbinned0, nbinned1, nbinned2, c0_size, c1_size, c2_size, rate, rtype, sref, two_det_weights, max_penalty, ref_snr, length)
pycbc.events.eventmgr_cython.logsignalrateinternals_computepsignalbins(pdif, tdif, sdif, pbin, tbin, sbin, p, t, s, sig, pref, tref, sref, sigref, shift, rtype, sense, senseref, twidth, pwidth, swidth, to_shift_ref, to_shift_ifo, length)
pycbc.events.eventmgr_cython.timecluster_cython(indices, left, right, stat, leftlen)
pycbc.events.eventmgr_cython.timecoincidence_constructfold(fold1, fold2, t1, t2, slide_step, length1, length2)
pycbc.events.eventmgr_cython.timecoincidence_constructidxs(idx1, idx2, sort1, sort2, left, right, leftlength, sort2length)
pycbc.events.eventmgr_cython.timecoincidence_findidxlen(left, right, leftlength)
pycbc.events.eventmgr_cython.timecoincidence_getslideint(slide, t1, t2, idx1, idx2, slide_step)

pycbc.events.ranking module

This module contains functions for calculating single-ifo ranking statistic values

pycbc.events.ranking.effsnr(snr, reduced_x2, fac=250.0)[source]

Calculate the effective SNR statistic. See (S5y1 paper) for definition.

pycbc.events.ranking.get_newsnr(trigs)[source]

Calculate newsnr (‘reweighted SNR’) for a trigs/dictionary object

Parameters:

trigs (dict of numpy.ndarrays, h5py group (or similar dict-like object)) – Dictionary-like object holding single detector trigger information. ‘chisq_dof’, ‘snr’, and ‘chisq’ are required keys

Returns:

Array of newsnr values

Return type:

numpy.ndarray
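
A minimal usage sketch with illustrative trigger columns; the reduced chi-squared used in the re-weighting is derived internally from the 'chisq' and 'chisq_dof' entries:

    import numpy as np
    from pycbc.events import ranking

    trigs = {'snr': np.array([8.0, 10.0, 12.0]),
             'chisq': np.array([18.0, 90.0, 30.0]),
             'chisq_dof': np.array([10, 10, 10])}
    nsnr = ranking.get_newsnr(trigs)   # one newsnr value per trigger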

pycbc.events.ranking.get_newsnr_sgveto(trigs)[source]

Calculate newsnr re-weighted by the sine-gaussian veto

Parameters:

trigs (dict of numpy.ndarrays, h5py group (or similar dict-like object)) – Dictionary-like object holding single detector trigger information. ‘chisq_dof’, ‘snr’, ‘sg_chisq’ and ‘chisq’ are required keys

Returns:

Array of newsnr values

Return type:

numpy.ndarray

pycbc.events.ranking.get_newsnr_sgveto_psdvar(trigs)[source]

Calculate snr re-weighted by Allen chisq, sine-gaussian veto and psd variation statistic

Parameters:

trigs (dict of numpy.ndarrays) – Dictionary holding single detector trigger information. 'chisq_dof', 'snr', 'chisq' and 'psd_var_val' are required keys

Returns:

Array of newsnr values

Return type:

numpy.ndarray

pycbc.events.ranking.get_newsnr_sgveto_psdvar_scaled(trigs)[source]

Calculate newsnr re-weighted by the sine-gaussian veto and scaled psd variation statistic

Parameters:

trigs (dict of numpy.ndarrays) – Dictionary holding single detector trigger information. 'chisq_dof', 'snr', 'chisq' and 'psd_var_val' are required keys

Returns:

Array of newsnr values

Return type:

numpy.ndarray

pycbc.events.ranking.get_newsnr_sgveto_psdvar_scaled_threshold(trigs)[source]

Calculate newsnr re-weighted by the sine-gaussian veto and scaled psd variation statistic. A further threshold is applied to the reduced chisq.

Parameters:

trigs (dict of numpy.ndarrays) – Dictionary holding single detector trigger information. 'chisq_dof', 'snr', 'chisq' and 'psd_var_val' are required keys

Returns:

Array of newsnr values

Return type:

numpy.ndarray

pycbc.events.ranking.get_newsnr_sgveto_psdvar_threshold(trigs)[source]

Calculate newsnr re-weighted by the sine-gaussian veto and scaled psd variation statistic

Parameters:

trigs (dict of numpy.ndarrays) – Dictionary holding single detector trigger information. 'chisq_dof', 'snr', 'chisq' and 'psd_var_val' are required keys

Returns:

Array of newsnr values

Return type:

numpy.ndarray

pycbc.events.ranking.get_sngls_ranking_from_trigs(trigs, statname, **kwargs)[source]

Return ranking for all trigs given a statname.

Compute the single-detector ranking for a list of input triggers for a specific statname.

pycbc.events.ranking.get_snr(trigs)[source]

Return SNR from a trigs/dictionary object

Parameters:

trigs (dict of numpy.ndarrays, h5py group (or similar dict-like object)) – Dictionary-like object holding single detector trigger information. ‘snr’ is a required key

Returns:

Array of snr values

Return type:

numpy.ndarray

pycbc.events.ranking.newsnr(snr, reduced_x2, q=6.0, n=2.0)[source]

Calculate the re-weighted SNR statistic (‘newSNR’) from given SNR and reduced chi-squared values. See http://arxiv.org/abs/1208.3491 for definition. Previous implementation in glue/ligolw/lsctables.py
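
A sketch of the standard definition from arXiv:1208.3491, assuming the default q=6, n=2 exponents; the library function should be preferred in practice:

    import numpy as np

    def newsnr_sketch(snr, reduced_x2, q=6.0, n=2.0):
        # Leave SNR unchanged where the reduced chi-squared is <= 1,
        # otherwise downweight it by the chi-squared-dependent factor.
        snr = np.asarray(snr, dtype=float)
        reduced_x2 = np.asarray(reduced_x2, dtype=float)
        factor = ((1.0 + reduced_x2 ** (q / n)) / 2.0) ** (-1.0 / q)
        return np.where(reduced_x2 > 1.0, snr * factor, snr)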

pycbc.events.ranking.newsnr_sgveto(snr, brchisq, sgchisq)[source]

Combined SNR derived from NewSNR and Sine-Gaussian Chisq

pycbc.events.ranking.newsnr_sgveto_psdvar(snr, brchisq, sgchisq, psd_var_val, min_expected_psdvar=0.65)[source]

Combined SNR derived from SNR, reduced Allen chisq, sine-Gaussian chisq and PSD variation statistic

pycbc.events.ranking.newsnr_sgveto_psdvar_scaled(snr, brchisq, sgchisq, psd_var_val, scaling=0.33, min_expected_psdvar=0.65)[source]

Combined SNR derived from NewSNR, Sine-Gaussian Chisq and scaled PSD variation statistic.

pycbc.events.ranking.newsnr_sgveto_psdvar_scaled_threshold(snr, bchisq, sgchisq, psd_var_val, threshold=2.0)[source]

Combined SNR derived from NewSNR and Sine-Gaussian Chisq, and scaled psd variation.

pycbc.events.ranking.newsnr_sgveto_psdvar_threshold(snr, brchisq, sgchisq, psd_var_val, min_expected_psdvar=0.65, brchisq_threshold=10.0, psd_var_val_threshold=10.0)[source]

newsnr_sgveto_psdvar with thresholds applied.

This is the newsnr_sgveto_psdvar statistic with additional options to threshold on chi-squared or PSD variation.

pycbc.events.significance module

This module contains functions to calculate the significance through different estimation methods of the background, and functions that read in the associated options to do so.

pycbc.events.significance.apply_far_limit(far, significance_dict, combo=None)[source]

Apply a FAR limit to events according to command line options.

If far_limit in significance_dict is zero, no limit is applied.

Parameters:
  • far (numpy array) –

  • significance_dict (dictionary) – Dictionary containing any limit on FAR (Hz), made by digest_significance_options

  • active_combination (numpy.array or string) – Array of IFO combinations given as utf-8 encoded strings, or a string which defines the IFO combination for all events

Returns:

far_out – FARs with far limit applied as appropriate

Return type:

numpy array

pycbc.events.significance.check_significance_options(args, parser)[source]

Check the significance group options

pycbc.events.significance.count_n_louder(bstat, fstat, dec, **kwargs)[source]

Calculate for each foreground event the number of background events that are louder than it.

Parameters:
  • bstat (numpy.ndarray) – Array of the background statistic values

  • fstat (numpy.ndarray or scalar) – Array of the foreground statistic values or single value

  • dec (numpy.ndarray) – Array of the decimation factors for the background statistics

Returns:

  • cum_back_num (numpy.ndarray) – The cumulative array of background triggers.

  • fore_n_louder (numpy.ndarray) – The number of background triggers above each foreground trigger
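
A minimal usage sketch with illustrative statistic values, using unit decimation factors:

    import numpy as np
    from pycbc.events import significance

    bstat = np.array([5.0, 6.0, 7.0, 9.0])     # background statistics
    fstat = np.array([6.5, 10.0])              # foreground statistics
    dec = np.ones_like(bstat)                  # decimation factors
    cum_back_num, fore_n_louder = significance.count_n_louder(bstat, fstat, dec)
    # fore_n_louder counts the (weighted) background events louder than each
    # foreground event; it should be 2 for the 6.5 candidate here.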

pycbc.events.significance.digest_significance_options(combo_keys, args)[source]

Read in information from the significance option group and ensure it makes sense before putting into a dictionary

Parameters:
  • combo_keys (list of strings) – list of detector combinations for which options are needed

  • args (parsed arguments) – from argparse ArgumentParser parse_args()

Returns:

significance_dict – Dictionary containing method, threshold and function for trigger fits as appropriate, and any limit on FAR (Hz)

Return type:

dictionary

pycbc.events.significance.get_far(back_stat, fore_stat, dec_facs, background_time, method='n_louder', **kwargs)[source]

Return the appropriate FAR given the significance calculation method

If the n_louder method is used, find the IFAR according to Eqs. 17-18 of Usman et al., arXiv:1508.02357. The p-value of a candidate in a search of duration T, with n_bg louder time-shifted events over a total background time T_bg, is p = 1 - exp(-T * (n_bg + 1) / T_bg), corresponding to an effective false alarm rate of (n_bg + 1) / T_bg.

If the trigger_fit method is used, we are extrapolating the background so that the FAR is not limited to 1 / T_bg, and so we do not add 1 to n_bg.

Parameters:
  • See the description in get_n_louder for most parameters.

  • background_time (float) – The amount of time to convert the number of louder events into a FAR
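
A worked sketch of the arithmetic described above, with illustrative numbers (the year conversion assumes a Julian year):

    import numpy as np

    n_bg = 3            # louder time-shifted background events
    T_bg = 1.0e7        # total background time in seconds
    T = 1.0e5           # duration of the search in seconds

    far = (n_bg + 1) / T_bg              # effective false alarm rate (Hz)
    p_value = 1 - np.exp(-T * far)       # p-value of the candidate
    ifar_years = 1.0 / far / 31557600.0  # approximate conversion to years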

pycbc.events.significance.get_n_louder(back_stat, fore_stat, dec_facs, method='n_louder', **kwargs)[source]

Wrapper to find the correct n_louder calculation method using standard inputs

pycbc.events.significance.ifar_opt_to_far_limit(ifar_str)[source]

Convert the string of an IFAR limit in years into a float FAR limit in Hz.

Parameters:

ifar_str (string) – Upper limit on IFAR in years. Zero indicates no upper limit
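
A sketch of the conversion, assuming a Julian year of 31557600 seconds; how a zero (no-limit) input is represented in the returned value is not specified here, so this sketch simply returns None for it:

    def ifar_years_to_far_hz_sketch(ifar_str):
        ifar_years = float(ifar_str)
        if ifar_years == 0:
            return None                          # zero means no upper limit
        return 1.0 / (ifar_years * 31557600.0)   # FAR limit in Hz

    ifar_years_to_far_hz_sketch("100")   # about 3.2e-10 Hz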

pycbc.events.significance.insert_significance_option_group(parser)[source]

Add some options for use when a significance is being estimated from events or event distributions.

pycbc.events.significance.n_louder_from_fit(back_stat, fore_stat, dec_facs, fit_function='exponential', fit_threshold=0, **kwargs)[source]

Use a fit to events in back_stat in order to estimate the distribution for use in recovering the estimated count of louder background events. Below the fit threshold, the n_louder method is used for these triggers

Parameters:
  • back_stat (numpy.ndarray) – Array of the background statistic values

  • fore_stat (numpy.ndarray or scalar) – Array of the foreground statistic values or single value

  • dec_facs (numpy.ndarray) – Array of the decimation factors for the background statistics

  • fit_function (str) – Name of the function to be used for the fit to background statistic values

  • fit_threshold (float) – Threshold above which triggers use the fitted value; below this the counted number of louder events will be used

Returns:

  • bg_n_louder (numpy.ndarray) – The estimated number of background events louder than each background event

  • fg_n_louder (numpy.ndarray) – The estimated number of background events louder than each foreground event

pycbc.events.significance.positive_float(inp)[source]

Wrapper around float conversion which ensures that the float must be positive or zero

pycbc.events.simd_threshold_cython module

pycbc.events.simd_threshold_cython.parallel_thresh_cluster(series, slen, values, locs, thresh, window, segsize)
pycbc.events.simd_threshold_cython.parallel_threshold(N, arr, outv, outl, count, threshold)

pycbc.events.single module

Utilities for assigning FAR to single detector triggers

class pycbc.events.single.LiveSingle(ifo, newsnr_threshold=10.0, reduced_chisq_threshold=5, duration_threshold=0, fit_file=None, sngl_ifar_est_dist=None, fixed_ifar=None)[source]

Bases: object

calculate_ifar(sngl_ranking, duration)[source]
check(trigs, data_reader)[source]

Look for a single detector trigger that passes the thresholds in the current data.

classmethod from_cli(args, ifo)[source]
static insert_args(parser)[source]
static verify_args(args, parser, ifos)[source]

pycbc.events.stat module

This module contains functions for calculating coincident ranking statistic values.

class pycbc.events.stat.DQExpFitFgBgKDEStatistic(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: DQExpFitFgBgNormStatistic

The ExpFitFgBgKDEStatistic with DQ-based reranking.

This is the same as the DQExpFitFgBgNormStatistic except the signal rate is adjusted according to the KDE statistic files

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Inherited, see docstring for ExpFitFgBgKDEStatistic.coinc_lim_for_thresh

kde_ratio()[source]

Inherited, see docstring for ExpFitFgBgKDEStatistic.kde_signalrate

logsignalrate(stats, shift, to_shift)[source]

Inherited, see docstring for ExpFitFgBgKDEStatistic.logsignalrate

rank_stat_single(single_info, **kwargs)[source]

Inherited, see docstring for ExpFitFgBgKDEStatistic.rank_stat_single

class pycbc.events.stat.DQExpFitFgBgNormStatistic(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: ExpFitFgBgNormStatistic

The ExpFitFgBgNormStatistic with DQ-based reranking.

This is the same as the ExpFitFgBgNormStatistic except the likelihood ratio is corrected via estimating relative noise trigger rates based on the DQ time series.

assign_dq_rates(key)[source]

Assign dq values to each time for every bin based on a referenced statistic file.

Parameters:

key (str) – statistic file key string

Returns:

dq_dict – Dictionary containing the mapping between the time and the dq value for each individual bin.

Return type:

dict of {time: dq_value} dicts for each bin

assign_template_bins(key)[source]

Assign bin ID values. Assign each template id to a bin name based on a referenced statistic file.

Parameters:

key (str) – statistic file key string

Returns:

bin_dict – Dictionary containing the bin name for each template id

Return type:

dict of strs

find_dq_noise_rate(trigs, dq_state)[source]

Get dq values for a specific ifo and dq states

find_dq_state_by_time(ifo, times)[source]

Get the dq state for an ifo at times

lognoiserate(trigs)[source]

Calculate the log noise rate density over single-ifo ranking

Read in single trigger information, compute the ranking and rescale by the fitted coefficients alpha and rate

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

lognoiserate – Array of log noise rate density for each input trigger.

Return type:

numpy.array

setup_segments(key)[source]

Check if segment definitions are in the stat file. If they are, we are running offline and need to store them; if they aren't, we are running online.

class pycbc.events.stat.ExpFitBgRateStatistic(sngl_ranking, files=None, ifos=None, benchmark_lograte=-14.6, **kwargs)[source]

Bases: ExpFitStatistic

Detection statistic using an exponential falloff noise model.

Statistic calculates the log noise coinc rate for each template over single-ifo newsnr values.

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s (list) – List of (ifo, single detector statistic) tuples for all detectors except limifo.

  • thresh (float) – The threshold on the coincident statistic.

  • limifo (string) – The ifo for which the limit is to be found.

Returns:

Array of limits on the limifo single statistic to exceed thresh.

Return type:

numpy.ndarray

rank_stat_coinc(s, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

Parameters:
  • sngls_list (list) – List of (ifo, single detector statistic) tuples

  • slide ((unused in this statistic)) –

  • step ((unused in this statistic)) –

  • to_shift (list) – List of integers indicating what multiples of the time shift will be applied (unused in this statistic)

Returns:

Array of coincident ranking statistic values

Return type:

numpy.ndarray

reassign_rate(ifo)[source]

Reassign the rate to be number per time

Reassign the rate to be number per time rather than an arbitrarily normalised number.

Parameters:

ifo (str) – The ifo to consider.

class pycbc.events.stat.ExpFitCombinedSNR(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: ExpFitStatistic

Reworking of ExpFitStatistic designed to resemble network SNR

Use a monotonic function of the negative log noise rate density which approximates combined (new)snr for coincs with similar newsnr in each ifo

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s (list) – List of (ifo, single detector statistic) tuples for all detectors except limifo.

  • thresh (float) – The threshold on the coincident statistic.

  • limifo (string) – The ifo for which the limit is to be found.

Returns:

Array of limits on the limifo single statistic to exceed thresh.

Return type:

numpy.ndarray

rank_stat_coinc(s, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

Parameters:
  • sngls_list (list) – List of (ifo, single detector statistic) tuples

  • slide ((unused in this statistic)) –

  • step ((unused in this statistic)) –

  • to_shift (list) – List of integers indicating what multiples of the time shift will be applied (unused in this statistic)

Returns:

Array of coincident ranking statistic values

Return type:

numpy.ndarray

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for single detector candidates

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

use_alphamax()[source]

Compute the reference alpha from the fit files.

Use the harmonic mean of the maximum individual ifo slopes as the reference value of alpha.

class pycbc.events.stat.ExpFitFgBgKDEStatistic(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: ExpFitFgBgNormStatistic

The ExpFitFgBgNormStatistic with an additional mass and spin weighting factor determined by KDE statistic files.

This is the same as the ExpFitFgBgNormStatistic except the likelihood ratio is multiplied by the ratio of signal KDE to template KDE over some parameters covering the bank.

assign_kdes(kname)[source]

Extract values from KDE files

Parameters:

kname (str) – Used to label the kde files.

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s (list) – List of (ifo, single detector statistic) tuples for all detectors except limifo.

  • thresh (float) – The threshold on the coincident statistic.

  • limifo (string) – The ifo for which the limit is to be found.

Returns:

Array of limits on the limifo single statistic to exceed thresh.

Return type:

numpy.ndarray

find_kdes()[source]

Find which associated files are for the KDE reweighting

kde_ratio()[source]

Calculate the weighting factor according to the ratio of the signal and template KDE lookup tables

logsignalrate(stats, shift, to_shift)[source]

Calculate the normalized log rate density of signals via lookup.

This calls back to the parent class and then applies the ratio_kde weighting factor.

Parameters:
  • stats (list of dicts) – Single-ifo quantities, ordered as self.ifos

  • shift (numpy array of float) – Size of the time shift vector for each coinc to be ranked

  • to_shift (list of int) – Multiple of the time shift to apply, ordered as self.ifos

Returns:

value – Log of coinc signal rate density for the given single-ifo triggers and time shifts

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

class pycbc.events.stat.ExpFitFgBgNormBBHStatistic(sngl_ranking, files=None, ifos=None, max_chirp_mass=None, **kwargs)[source]

Bases: ExpFitFgBgNormStatistic

The ExpFitFgBgNormStatistic with a mass weighting factor.

This is the same as the ExpFitFgBgNormStatistic except the likelihood is multiplied by a signal rate prior modelled as uniform over chirp mass. As templates are distributed roughly according to mchirp^(-11/3) we weight by the inverse of this. This ensures that quiet signals at high mass where template density is sparse are not swamped by events at lower masses where template density is high.

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s (list) – List of (ifo, single detector statistic) tuples for all detectors except limifo.

  • thresh (float) – The threshold on the coincident statistic.

  • limifo (string) – The ifo for which the limit is to be found.

Returns:

Array of limits on the limifo single statistic to exceed thresh.

Return type:

numpy.ndarray

logsignalrate(stats, shift, to_shift)[source]

Calculate the normalized log rate density of signals via lookup

This calls back to the Parent class and then applies the chirp mass weighting factor.

Parameters:
  • stats (list of dicts) – Single-ifo quantities, ordered as self.ifos

  • shift (numpy array of float) – Size of the time shift vector for each coinc to be ranked

  • to_shift (list of int) – Multiple of the time shift to apply, ordered as self.ifos

Returns:

value – Log of coinc signal rate density for the given single-ifo triggers and time shifts

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

This calls back to the Parent class and then applies the chirp mass weighting factor.

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

In this case the ranking is rescaled (see the lognoiserate method here) with the phase, end time, sigma, SNR, template_id and the benchmark_logvol values added in. This also stores the current chirp mass for use when computing the coinc statistic values.

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

class pycbc.events.stat.ExpFitFgBgNormStatistic(sngl_ranking, files=None, ifos=None, reference_ifos='H1,L1', **kwargs)[source]

Bases: PhaseTDStatistic, ExpFitBgRateStatistic

Statistic combining PhaseTD, ExpFitBg and additional foreground info.

assign_median_sigma(ifo)[source]

Read and sort the median_sigma values from input files.

Parameters:

ifo (str) – The ifo to consider.

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s (list) – List of (ifo, single detector statistic) tuples for all detectors except limifo.

  • thresh (float) – The threshold on the coincident statistic.

  • limifo (string) – The ifo for which the limit is to be found.

Returns:

Array of limits on the limifo single statistic to exceed thresh.

Return type:

numpy.ndarray

lognoiserate(trigs, alphabelow=6)[source]

Calculate the log noise rate density over single-ifo ranking

Read in single trigger information, make the newsnr statistic and rescale by the fitted coefficients alpha and rate

Parameters:
  • trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

  • alphabelow (float, default=6) – Use this slope to fit the noise triggers below the point at which fits are present in the input files.

Returns:

lognoisel – Array of log noise rate density for each input trigger.

Return type:

numpy.array

rank_stat_coinc(s, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

Parameters:
  • sngls_list (list) – List of (ifo, single detector statistic) tuples

  • slide ((unused in this statistic)) –

  • step ((unused in this statistic)) –

  • to_shift (list) – List of integers indicating what multiples of the time shift will be applied (unused in this statistic)

Returns:

Array of coincident ranking statistic values

Return type:

numpy.ndarray

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for single detector candidates

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

In this case the ranking is rescaled (see the lognoiserate method here) with the phase, end time, sigma, SNR, template_id and the benchmark_logvol values added in.

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

class pycbc.events.stat.ExpFitStatistic(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: QuadratureSumStatistic

Detection statistic using an exponential falloff noise model.

Statistic approximates the negative log noise coinc rate density per template over single-ifo newsnr values.

assign_fits(ifo)[source]

Extract fits from fit files

Parameters:

ifo (str) – The detector to get fits for.

Returns:

rate_dict – A dictionary containing the fit information in the alpha, rate and thresh keys.

Return type:

dict

coinc_OLD(s0, s1, slide, step)[source]

Calculate the final coinc ranking statistic

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

coinc_lim_for_thresh_OLD(s0, thresh)[source]

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s0 (numpy.ndarray) – Single detector ranking statistic for the first detector.

  • thresh (float) – The threshold on the coincident statistic.

Returns:

Array of limits on the second detector single statistic to exceed thresh.

Return type:

numpy.ndarray

find_fits(trigs)[source]

Get fit coeffs for a specific ifo and template id(s)

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information. The coincidence executable will always call this using a bunch of trigs from a single template, there template_num is stored as an attribute and we just return the single value for all templates. If multiple templates are in play we must return arrays.

Returns:

  • alphai (float or numpy array) – The alpha fit value(s)

  • ratei (float or numpy array) – The rate fit value(s)

  • thresh (float or numpy array) – The thresh fit value(s)

get_ref_vals(ifo)[source]

Get the largest alpha value over all templates for given ifo.

This is stored in self.alphamax[ifo] in the class instance.

Parameters:

ifo (str) – The detector to get fits for.

lognoiserate(trigs)[source]

Calculate the log noise rate density over single-ifo ranking

Read in single trigger information, compute the ranking and rescale by the fitted coefficients alpha and rate

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

lognoisel – Array of log noise rate density for each input trigger.

Return type:

numpy.array
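
As an illustration of this rescaling, the sketch below evaluates a log noise rate density from an exponential fit of the form used by pycbc.events.trigger_fits, i.e. a rate of triggers above threshold times a normalized density alpha * exp(-alpha * (x - thresh)). This is a simplified sketch, not the class's exact implementation.

# Sketch only: log noise rate density for an exponential fit,
# rate_density(x) = rate * alpha * exp(-alpha * (x - thresh))
import numpy

def log_noise_rate_density(ranking, alpha, rate, thresh):
    ranking = numpy.asarray(ranking, dtype=float)
    return numpy.log(rate * alpha) - alpha * (ranking - thresh)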

rank_stat_coinc(s, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

In this case the ranking is rescaled (see the lognoiserate method of this class).

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

class pycbc.events.stat.PhaseTDExpFitStatistic(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: PhaseTDStatistic, ExpFitCombinedSNR

Statistic combining exponential noise model with signal histogram PDF

coinc_OLD(s0, s1, slide, step)[source]

Calculate the final coinc ranking statistic

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest. Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

coinc_lim_for_thresh_OLD(s0, thresh)[source]

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s0 (numpy.ndarray) – Single detector ranking statistic for the first detector.

  • thresh (float) – The threshold on the coincident statistic.

Returns:

numpy.ndarray – Array of limits on the second detector single statistic to exceed thresh.

rank_stat_coinc(s, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

In this case the ranking is rescaled (see the lognoiserate method of this class) with the phase, end time, sigma and SNR values added in.

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

class pycbc.events.stat.PhaseTDStatistic(sngl_ranking, files=None, ifos=None, pregenerate_hist=True, **kwargs)[source]

Bases: QuadratureSumStatistic

Statistic that re-weights combined newsnr using coinc parameters.

The weighting is based on the PDF of time delays, phase differences and amplitude ratios between triggers in different ifos.

coinc_lim_for_thresh(sngls_list, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest. Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

get_hist(ifos=None)[source]

Read in a signal density file for the ifo combination

Parameters:

ifos (list) – The list of ifos. Needed if not given when initializing the class instance.

logsignalrate(stats, shift, to_shift)[source]

Calculate the normalized log rate density of signals via lookup

Parameters:
  • stats (dict of dicts) – Single-detector quantities for each detector

  • shift (numpy array of float) – Time shift vector for each coinc to be ranked

  • to_shift (list of ints) – Multiple of the time shift to apply, ordered as self.ifos

Returns:

value – Log of the coincident signal rate density for the given single-ifo triggers and time shifts
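
Conceptually, the lookup bins the inter-detector time delay (and, in the full statistic, the phase difference and amplitude ratio) and reads off a precomputed log-rate histogram. The sketch below shows only a one-dimensional time-delay lookup with hypothetical arrays hist_bins and hist_values; the real method uses the multi-dimensional histogram loaded by get_hist.

import numpy

def lookup_log_signal_rate(dt, hist_bins, hist_values):
    # Sketch: digitize the time delays into precomputed bins and read
    # the stored log signal rate density (hypothetical arrays).
    idx = numpy.digitize(dt, hist_bins) - 1
    idx = numpy.clip(idx, 0, len(hist_values) - 1)
    return hist_values[idx]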

rank_stat_coinc(sngls_list, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic, defined in Eq 2 of [Nitz et al, 2017](https://doi.org/10.3847/1538-4357/aa8f50).

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

For this statistic this is just passing through the single value, which will be the second entry in the tuple.

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

Here the ranking is computed, as well as the phase, end time and sigma-squared values.

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information. ‘snr’, ‘chisq’, ‘chisq_dof’, ‘coa_phase’, ‘end_time’, and ‘sigmasq’ are required keys.

Returns:

Array of single detector parameter values

Return type:

numpy.ndarray

class pycbc.events.stat.QuadratureSumStatistic(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: Stat

Calculate the quadrature sum coincident detection statistic

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

Parameters:
  • s (list) – List of (ifo, single detector statistic) tuples for all detectors except limifo.

  • thresh (float) – The threshold on the coincident statistic.

  • limifo (string) – The ifo for which the limit is to be found.

Returns:

Array of limits on the limifo single statistic to exceed thresh.

Return type:

numpy.ndarray

rank_stat_coinc(sngls_list, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

Parameters:
  • sngls_list (list) – List of (ifo, single detector statistic) tuples

  • slide ((unused in this statistic)) –

  • step ((unused in this statistic)) –

  • to_shift (list) – List of integers indicating what multiples of the time shift will be applied (unused in this statistic)

Returns:

Array of coincident ranking statistic values

Return type:

numpy.ndarray
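
As the class name suggests, the coincident statistic is the quadrature sum of the single-detector statistics, and coinc_lim_for_thresh inverts that relation. The sketch below illustrates both; the clipping at zero is an assumption made for readability, not necessarily what the class does.

import numpy

def quadrature_sum(sngls_list):
    # sngls_list: list of (ifo, single-detector statistic array) tuples
    return numpy.sqrt(sum(numpy.asarray(s) ** 2 for _, s in sngls_list))

def limit_for_thresh(sngls_list, thresh):
    # Value the remaining detector must exceed for the quadrature sum
    # to reach `thresh`, given the other detectors' statistics.
    s2 = sum(numpy.asarray(s) ** 2 for _, s in sngls_list)
    return numpy.sqrt(numpy.clip(thresh ** 2 - s2, 0.0, None))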

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

For this statistic this is just passing through the single value, which will be the second entry in the tuple.

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

Here just the ranking is computed and returned.

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

class pycbc.events.stat.Stat(sngl_ranking, files=None, ifos=None, **kwargs)[source]

Bases: object

Base class which should be extended to provide a coincident statistic

coinc_lim_for_thresh(s, thresh, limifo, **kwargs)[source]

Optimization function to identify coincs too quiet to be of interest

Calculate the required single detector statistic to exceed the threshold for each of the input triggers.

get_sngl_ranking(trigs)[source]

Returns the ranking for the single detector triggers.

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

rank_stat_coinc(s, slide, step, to_shift, **kwargs)[source]

Calculate the coincident detection statistic.

rank_stat_single(single_info, **kwargs)[source]

Calculate the statistic for a single detector candidate

Parameters:

single_info (tuple) – Tuple containing two values. The first is the ifo (str) and the second is the single detector triggers.

Returns:

The array of single detector statistics

Return type:

numpy.ndarray

single(trigs)[source]

Calculate the necessary single detector information

Parameters:

trigs (dict of numpy.ndarrays, h5py group or similar dict-like object) – Object holding single detector trigger information.

Returns:

The array of single detector values

Return type:

numpy.ndarray

pycbc.events.stat.get_statistic(stat)[source]

Error-handling sugar around dict lookup for coincident statistics

Parameters:

stat (string) – Name of the coincident statistic

Returns:

Subclass of Stat base class

Return type:

class

Raises:

RuntimeError – If the string is not recognized as corresponding to a Stat subclass

pycbc.events.stat.get_statistic_from_opts(opts, ifos)[source]

Return a Stat class from an optparser object.

This assumes that the options in the statistic option group are present and uses them to call stat.get_statistic and initialize the appropriate Stat subclass with the corresponding keyword arguments.

Parameters:
  • opts (optparse.OptParser instance) – The command line options

  • ifos (list) – The list of detector names

Returns:

Subclass of Stat base class

Return type:

class

pycbc.events.stat.insert_statistic_option_group(parser, default_ranking_statistic=None)[source]

Add ranking statistic options to the optparser object.

Adds the options used to initialize a PyCBC Stat class.

Parameters:
  • parser (object) – OptionParser instance.

  • default_ranking_statistic (str) – Allows setting a default statistic for the '--ranking-statistic' option. The option is no longer required if a default is provided.

Returns:

strain_opt_group – The argument group that is added to the parser.

Return type:

optparser.argument_group
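
A hedged sketch of how these helpers are typically combined in a command-line tool (shown with argparse; the statistic name 'quadsum', the detector list and the remaining command-line options are placeholders):

import argparse
from pycbc.events import stat

parser = argparse.ArgumentParser()
stat.insert_statistic_option_group(
    parser, default_ranking_statistic='quadsum')
# Other required options (e.g. the single-detector ranking) are
# expected to be supplied on the command line.
opts = parser.parse_args()

rank_method = stat.get_statistic_from_opts(opts, ['H1', 'L1'])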

pycbc.events.stat.parse_statistic_keywords_opt(stat_kwarg_list)[source]

Parse the list of statistic keywords into an appropriate dictionary.

Take input from the input argument [“KWARG1:VALUE1”, “KWARG2:VALUE2”, “KWARG3:VALUE3”] and convert into a dictionary.

Parameters:

stat_kwarg_list (list) – Statistic keywords in list format

Returns:

stat_kwarg_dict – Statistic keywords in dict format

Return type:

dict
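
A minimal sketch of the KWARG:VALUE parsing described above; it mirrors the intent, not necessarily the exact value handling of the real function.

def parse_keywords_sketch(kwarg_list):
    # ["KWARG1:VALUE1", "KWARG2:VALUE2"] -> {"KWARG1": "VALUE1", "KWARG2": "VALUE2"}
    out = {}
    for item in kwarg_list:
        key, _, value = item.partition(':')
        out[key] = value
    return out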

pycbc.events.threshold_cpu module

class pycbc.events.threshold_cpu.CPUThresholdCluster(series)[source]

Bases: _BaseThresholdCluster

threshold_and_cluster(threshold, window)[source]

Threshold and cluster the time series given at instantiation, using the supplied threshold and window size.

Parameters:
  • threshold (float32) – The minimum absolute value of the series given at object initialization to return when thresholding and clustering.

  • window (uint32) – The size (in number of samples) of the window over which to cluster

Returns:

  • event_vals (complex64) – Numpy array, complex values of the clustered events

  • event_locs (uint32) – Numpy array, indices into series of location of events
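
A hedged numpy sketch of the threshold-and-cluster idea: keep, within each window-sized block of samples, only the loudest point above threshold. The optimized CPU implementation clusters over a sliding window and will differ in detail.

import numpy

def threshold_and_cluster_sketch(series, threshold, window):
    locs = numpy.flatnonzero(numpy.abs(series) > threshold)
    if locs.size == 0:
        return numpy.array([], dtype=series.dtype), locs
    vals = series[locs]
    keep = []
    blocks = locs // window               # fixed block index of each crossing
    for b in numpy.unique(blocks):
        in_block = numpy.flatnonzero(blocks == b)
        keep.append(in_block[numpy.argmax(numpy.abs(vals[in_block]))])
    keep = numpy.asarray(keep)
    return vals[keep], locs[keep]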

pycbc.events.threshold_cpu.threshold(series, value)
pycbc.events.threshold_cpu.threshold_inline(series, value)[source]
pycbc.events.threshold_cpu.threshold_numpy(series, value)[source]
pycbc.events.threshold_cpu.threshold_only(series, value)

pycbc.events.trigger_fits module

Tools for maximum likelihood fits to single trigger statistic values

For some set of values above a threshold, e.g. trigger SNRs, the functions in this module perform maximum likelihood fits with 1-sigma uncertainties to various simple functional forms of PDF, all normalized to 1. You can also obtain the fitted function and its (inverse) CDF and perform a Kolmogorov-Smirnov test.

Usage:

# call the fit function directly if the threshold is known
alpha, sigma_alpha = fit_exponential(snrs, 5.5)

# apply a threshold explicitly
alpha, sigma_alpha = fit_above_thresh('exponential', snrs, thresh=6.25)

# let the code work out the threshold from the smallest value via the default thresh=None
alpha, sigma_alpha = fit_above_thresh('exponential', snrs)

# or only fit the largest N values, i.e. tail fitting
thresh = tail_threshold(snrs, N=500)
alpha, sigma_alpha = fit_above_thresh('exponential', snrs, thresh)

# obtain the fitted function directly
xvals = numpy.linspace(5.5, 10.5, 20)
exponential_fit = expfit(xvals, alpha, thresh)

# or access the function by name
exponential_fit_1 = fit_fn('exponential', xvals, alpha, thresh)

# use weighting factors to e.g. take decimation into account
alpha, sigma_alpha = fit_above_thresh('exponential', snrs, weights=weights)

# get the KS test statistic and p-value - see scipy.stats.kstest
ks_stat, ks_pval = KS_test('exponential', snrs, alpha, thresh)

pycbc.events.trigger_fits.KS_test(distr, vals, alpha, thresh=None)[source]

Perform Kolmogorov-Smirnov test for fitted distribution

Compare the given set of discrete values above a given threshold to the fitted distribution function. If no threshold is specified, the minimum sample value will be used. Returns the KS test statistic and its p-value: lower p means less probable under the hypothesis of a perfect fit

Parameters:
  • distr ({'exponential', 'rayleigh', 'power'}) – Name of distribution

  • vals (sequence of floats) – Values to compare to fit

  • alpha (float) – Fitted distribution parameter

  • thresh (float) – Threshold to apply before fitting; if None, use min(vals)

Returns:

  • D (float) – KS test statistic

  • p-value (float) – p-value, assumed to be two-tailed
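
Equivalently, the test can be run by hand with scipy.stats.kstest, using 1 - cum_fit as the model CDF above threshold. This is a sketch (the snrs array and the threshold value are placeholders), not the module's internal code.

import numpy
from scipy.stats import kstest
from pycbc.events import trigger_fits as trstats

thresh = 6.0                              # example threshold
vals_above = snrs[snrs >= thresh]         # `snrs`: assumed array of trigger SNRs
alpha, _ = trstats.fit_above_thresh('exponential', snrs, thresh)
cdf = lambda x: 1.0 - trstats.cum_fit('exponential', x, alpha, thresh)
ks_stat, ks_pval = kstest(vals_above, cdf)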

pycbc.events.trigger_fits.cum_fit(distr, xvals, alpha, thresh)[source]

Integral of the fitted function above a given value (reverse CDF)

The fitted function is normalized to 1 above threshold

Parameters:
  • distr ({'exponential', 'rayleigh', 'power'}) – Name of distribution

  • xvals (sequence of floats) – Values where the function is to be evaluated

  • alpha (float) – The fitted parameter

  • thresh (float) – Threshold value applied to fitted values

Returns:

cum_fit – Reverse CDF of fitted function at the requested xvals

Return type:

array of floats

pycbc.events.trigger_fits.exponential_fitalpha(vals, thresh, w)[source]

Maximum likelihood estimator for the fit factor for an exponential decrease model

pycbc.events.trigger_fits.fit_above_thresh(distr, vals, thresh=None, weights=None)[source]

Maximum likelihood fit for the coefficient alpha

Fit a distribution of discrete values above a given threshold. The supported forms are:

Exponential: p(x) = alpha * exp(-alpha * (x - x_t))

Rayleigh: p(x) = alpha * x * exp(-alpha * (x**2 - x_t**2) / 2)

Power: p(x) = ((alpha - 1) / x_t) * (x / x_t)**(-alpha)

Values below threshold will be discarded. If no threshold is specified the minimum sample value will be used.

Parameters:
  • distr ({'exponential', 'rayleigh', 'power'}) – Name of distribution

  • vals (sequence of floats) – Values to fit

  • thresh (float) – Threshold to apply before fitting; if None, use min(vals)

  • weights (sequence of floats) – Weighting factors to use for the values when fitting. Default=None - all the same

Returns:

  • alpha (float) – Fitted value

  • sigma_alpha (float) – Standard error in fitted value
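
For the exponential form, the weighted maximum likelihood estimate has a closed form: alpha = sum(w) / sum(w * (x - x_t)), with a 1-sigma uncertainty of roughly alpha / sqrt(sum(w)) from the Fisher information. The sketch below illustrates this and may differ in detail from exponential_fitalpha.

import numpy

def exponential_mle_sketch(vals, thresh, weights=None):
    vals = numpy.asarray(vals, dtype=float)
    keep = vals >= thresh
    w = numpy.ones(keep.sum()) if weights is None \
        else numpy.asarray(weights, dtype=float)[keep]
    alpha = w.sum() / (w * (vals[keep] - thresh)).sum()
    sigma_alpha = alpha / numpy.sqrt(w.sum())
    return alpha, sigma_alpha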

pycbc.events.trigger_fits.fit_fn(distr, xvals, alpha, thresh)[source]

The fitted function normalized to 1 above threshold

To normalize to a given total count multiply by the count.

Parameters:
  • distr ({'exponential', 'rayleigh', 'power'}) – Name of distribution

  • xvals (sequence of floats) – Values where the function is to be evaluated

  • alpha (float) – The fitted parameter

  • thresh (float) – Threshold value applied to fitted values

Returns:

fit – Fitted function at the requested xvals

Return type:

array of floats

pycbc.events.trigger_fits.power_fitalpha(vals, thresh, w)[source]

Maximum likelihood estimator for the fit factor for a power law model

pycbc.events.trigger_fits.rayleigh_fitalpha(vals, thresh, w)[source]

Maximum likelihood estimator for the fit factor for a Rayleigh distribution of events

pycbc.events.trigger_fits.tail_threshold(vals, N=1000)[source]

Determine a threshold above which there are N louder values

pycbc.events.trigger_fits.which_bin(par, minpar, maxpar, nbins, log=False)[source]

Helper function

Returns bin index where a parameter value belongs (from 0 through nbins-1) when dividing the range between minpar and maxpar equally into bins.

Parameters:
  • par (float) – Parameter value being binned

  • minpar (float) – Minimum parameter value

  • maxpar (float) – Maximum parameter value

  • nbins (int) – Number of bins to use

  • log (boolean) – If True, use log spaced bins

Returns:

binind – Bin index

Return type:

int
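
A minimal sketch of the linear/log binning described above; the handling of values exactly at maxpar is an assumption.

import numpy

def which_bin_sketch(par, minpar, maxpar, nbins, log=False):
    if log:
        par, minpar, maxpar = numpy.log(par), numpy.log(minpar), numpy.log(maxpar)
    frac = (par - minpar) / (maxpar - minpar)
    # clip so a value at the upper edge falls into the last bin
    return int(min(max(frac, 0.0) * nbins, nbins - 1))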

pycbc.events.triggers module

This module contains functions for reading single and coincident triggers from the command line.

pycbc.events.triggers.bank_bins_from_cli(opts)[source]

Parses the CLI options related to binning templates in the bank.

Parameters:
opts (object) – Result of parsing the CLI with OptionParser.

Returns:

  • bins_idx (dict) – A dict with bin names as key and an array of their indices as value.

  • bank (dict) – A dict of the datasets from the bank file.

pycbc.events.triggers.get_found_param(injfile, bankfile, trigfile, param, ifo, args=None)[source]

Translates some popular trigger parameters into functions that calculate them from an hdf found injection file

Parameters:
  • injfile (hdf5 File object) – Injection file of format known to ANitz (DOCUMENTME)

  • bankfile (hdf5 File object or None) – Template bank file

  • trigfile (hdf5 File object or None) – Single-detector trigger file

  • param (string) – Parameter to be calculated for the recovered triggers

  • ifo (string or None) – Standard ifo name, ex. ‘L1’

  • args (Namespace object returned from ArgumentParser instance) – Calling code command line options, used for f_lower value

Returns:

[return value] – The calculated parameter values and a Boolean mask indicating which injections were found in the given ifo (if supplied)

Return type:

NumPy array of floats, array of boolean

pycbc.events.triggers.get_inj_param(injfile, param, ifo, args=None)[source]

Translates some popular injection parameters into functions that calculate them from an hdf found injection file

Parameters:
  • injfile (hdf5 File object) – Injection file of format known to ANitz (DOCUMENTME)

  • param (string) – Parameter to be calculated for the injected signals

  • ifo (string) – Standard detector name, ex. ‘L1’

  • args (Namespace object returned from ArgumentParser instance) – Calling code command line options, used for f_lower value

Returns:

[return value] – The calculated parameter values

Return type:

NumPy array of floats

pycbc.events.triggers.get_mass_spin(bank, tid)[source]

Helper function

Parameters:
  • bank (h5py File object) – Bank parameter file

  • tid (integer or array of int) – Indices of the entries to be returned

Returns:

m1, m2, s1z, s2z – Parameter values of the bank entries

Return type:

tuple of floats or arrays of floats

pycbc.events.triggers.get_param(par, args, m1, m2, s1z, s2z)[source]

Helper function

Parameters:
  • par (string) – Name of parameter to calculate

  • args (Namespace object returned from ArgumentParser instance) – Calling code command line options, used for f_lower value

  • m1 (float or array of floats) – First binary component mass (etc.)

Returns:

parvals – Calculated parameter values

Return type:

float or array of floats
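
A hedged example combining these helpers: read component masses and spins for a few template indices and form a derived quantity such as the chirp mass. The bank file name and indices are placeholders, and the chirp mass is computed directly from its standard definition rather than through get_param.

import h5py
import numpy
from pycbc.events import triggers

bank = h5py.File('H1L1-BANK2HDF.hdf', 'r')   # hypothetical bank file
tid = numpy.array([0, 10, 42])               # example template indices
m1, m2, s1z, s2z = triggers.get_mass_spin(bank, tid)

# Chirp mass from the component masses
mchirp = (m1 * m2) ** (3.0 / 5.0) / (m1 + m2) ** (1.0 / 5.0)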

pycbc.events.triggers.insert_bank_bins_option_group(parser)[source]

Add options to the optparser object for selecting templates in bins.

Parameters:

parser (object) – OptionParser instance.

pycbc.events.veto module

This module contains utilities to manipulate trigger lists based on segments.

pycbc.events.veto.get_segment_definer_comments(xml_file, include_version=True)[source]

Returns a dict with the comment column as the value for each segment

pycbc.events.veto.indices_outside_segments(times, segment_files, ifo=None, segment_name=None)[source]

Return the list of indices that are outside the segments in the list of segment files.

Parameters:
  • times (numpy.ndarray of integer type) – Array of gps start times

  • segment_files (string or list of strings) – A string or list of strings that contain the path to xml files that contain a segment table

  • ifo (string, optional) – The ifo to retrieve segments for from the segment files

  • segment_name (str, optional) – name of segment

Returns:

  • indices (numpy.ndarray) – The array of index values outside the segments

  • segmentlist – The segment list corresponding to the selected time.

pycbc.events.veto.indices_outside_times(times, start, end)[source]

Return an index array into times that lie outside the durations defined by the start and end arrays

Parameters:
  • times (numpy.ndarray) – Array of times

  • start (numpy.ndarray) – Array of duration start times

  • end (numpy.ndarray) – Array of duration end times

Returns:

indices – Array of indices into times

Return type:

numpy.ndarray

pycbc.events.veto.indices_within_segments(times, segment_files, ifo=None, segment_name=None)[source]

Return the list of indices that should be vetoed by the segments in the list of segment files.

Parameters:
  • times (numpy.ndarray of integer type) – Array of gps start times

  • segment_files (string or list of strings) – A string or list of strings that contain the path to xml files that contain a segment table

  • ifo (string, optional) – The ifo to retrieve segments for from the segment files

  • segment_name (str, optional) – name of segment

Returns:

  • indices (numpy.ndarray) – The array of index values within the segments

  • segmentlist – The segment list corresponding to the selected time.

pycbc.events.veto.indices_within_times(times, start, end)[source]

Return an index array into times that lie within the durations defined by the start and end arrays

Parameters:
  • times (numpy.ndarray) – Array of times

  • start (numpy.ndarray) – Array of duration start times

  • end (numpy.ndarray) – Array of duration end times

Returns:

indices – Array of indices into times

Return type:

numpy.ndarray
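
The selection can be pictured as a searchsorted over the interval boundaries. The sketch below assumes the start/end arrays are sorted and non-overlapping and uses a half-open [start, end) convention; the real function need not make those assumptions.

import numpy

def indices_within_times_sketch(times, start, end):
    # index of the last interval starting at or before each time
    left = numpy.searchsorted(start, times, side='right') - 1
    # entries with left == -1 are discarded by the first condition
    inside = (left >= 0) & (times < end[left])
    return numpy.flatnonzero(inside)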

pycbc.events.veto.segments_to_start_end(segs)[source]
pycbc.events.veto.select_segments_by_definer(segment_file, segment_name=None, ifo=None)[source]

Return the list of segments that match the segment name

Parameters:
  • segment_file (str) – path to segment xml file

  • segment_name (str) – Name of segment

  • ifo (str, optional) –

Returns:

seg

Return type:

list of segments

pycbc.events.veto.start_end_from_segments(segment_file)[source]

Return the start and end time arrays from a segment file.

Parameters:

segment_file (xml segment file) –

Returns:

  • start (numpy.ndarray)

  • end (numpy.ndarray)

pycbc.events.veto.start_end_to_segments(start, end)[source]
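
These two helpers are inverses of each other. A hedged sketch using the ligo.segments package (assumed here as the segment container; PyCBC's segment handling is typically built on it):

import numpy
from ligo import segments

def start_end_to_segments_sketch(start, end):
    return segments.segmentlist(
        [segments.segment(s, e) for s, e in zip(start, end)])

def segments_to_start_end_sketch(segs):
    return (numpy.array([s[0] for s in segs]),
            numpy.array([s[1] for s in segs]))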

Module contents

This package contains modules for clustering events