
Pygrb offline workflow #4288


Merged: 47 commits, Jul 10, 2023
Changes from 34 commits

Commits (47)
06416d5
Fixed indexing and autochisq argument in pycbc_multi_inspiral
pannarale Feb 5, 2023
1e0713c
Only Hanford is labelled with a 1: avoiding pycbc_pygrb_plot_snr_time…
pannarale Feb 26, 2023
af50dee
Simplified and update pycbc_pygrb_plot_injs_results; adapted pycbc_py…
pannarale Feb 26, 2023
ef79530
Merge remote-tracking branch 'upstream/master' into pygrb_timeslides_…
pannarale Feb 26, 2023
d451505
Forgotten comma
pannarale Feb 26, 2023
611ad0c
Merge branch 'gwastro:master' into pygrb_timeslides_fixes
pannarale Mar 9, 2023
0318a45
Merge branch 'gwastro:master' into pygrb_timeslides_fixes
pannarale Mar 13, 2023
d7dc6bd
Massive commit that stitches together pycbc_make_offline_grb_workflow…
pannarale Mar 13, 2023
05fc4a6
Massive commit that stitches together pycbc_make_offline_grb_workflow…
pannarale Mar 13, 2023
ed44cf0
Code-climate fixes
pannarale Mar 13, 2023
cefe678
More code-climate
pannarale Mar 13, 2023
145c17f
Import error flagged by code-climate
pannarale Mar 13, 2023
08cbb93
Forgotten import fixes
pannarale Mar 13, 2023
1dcf643
Minor cleanup of comments or commented out code lines
pannarale Mar 13, 2023
4099487
Syntax fix
pannarale Mar 14, 2023
fa5b123
Defining 2 empty file lists
pannarale Mar 14, 2023
dc0411e
Merge branch 'gwastro:master' into pygrb_timeslides_fixes
pannarale Mar 15, 2023
32a1f56
First part of PR 4288 review comments
pannarale Mar 15, 2023
7fe1c2e
One more TODO
pannarale Mar 15, 2023
76bd4d2
opt_to_file is now configparser_value_to_file in pycbc/workflow/core.py
pannarale Mar 24, 2023
22533f7
Upgraded configparser_value_to_file with attrs keyword argument
pannarale Mar 24, 2023
9333bd7
Using configparser_value_to_file throughout the PyGRB workflow generator
pannarale Mar 24, 2023
be1aa83
Removing 2 unused PyGRB functions
pannarale Mar 24, 2023
3b48f53
Removed LigolwCBCAlignTotalSpinExecutable, LigolwCBCJitterSkylocExecu…
pannarale Mar 24, 2023
d3ebdec
Codeclimate
pannarale Mar 24, 2023
d3625b5
bank_veto_file variable needs to be a FileList, not a File
pannarale Mar 31, 2023
770e1dc
Merge branch 'gwastro:master' into pygrb_timeslides_fixes
pannarale Jun 25, 2023
dfac88e
Bug with tags for single IFO SNR plots workflow generator
pannarale Jun 26, 2023
983c0d8
Fixing pycbc_pygrb_plot_coh_ifosnr
pannarale Jun 26, 2023
f2d7a74
Fixing layout of chi-square plots and section label for individual de…
pannarale Jun 27, 2023
00bbd2b
Codeclimate
pannarale Jun 28, 2023
08499d0
Codeclimate
pannarale Jun 28, 2023
0d6b4ec
Codeclimate
pannarale Jun 28, 2023
9685852
Removed FIXME for testing purposes
pannarale Jun 28, 2023
cbb62f4
First changes following up on PR review
pannarale Jul 3, 2023
badf31c
Second round of changes following up on PR review
pannarale Jul 3, 2023
177dbec
Flagging with an underscore grb utility functions not used outside th…
pannarale Jul 4, 2023
673e26b
Flagging with an underscore grb utility functions not used outside th…
pannarale Jul 4, 2023
31b666e
Removed _get_antenna_factors from PyGRB utily file
pannarale Jul 4, 2023
e2b8fd8
Removed commented out function from pycbc_pygrb_plot_injs_results
pannarale Jul 4, 2023
7b4e422
Substituted old unused code with a TODO
pannarale Jul 4, 2023
707717b
Adding injection set name as an option in pycbc_pygrb_efficiency
pannarale Jul 4, 2023
0cd748b
extend --> append for single files
pannarale Jul 4, 2023
6c5481d
Documenting setup_pygrb_pp_workflow more appropriately
pannarale Jul 4, 2023
2b10fa8
Reverted name of _get_antenna_responses to get_antenna_responses; cle…
pannarale Jul 5, 2023
908eb83
Line shortened
pannarale Jul 5, 2023
8539b04
Merge branch 'gwastro:master' into pygrb_timeslides_fixes
pannarale Jul 5, 2023
149 changes: 94 additions & 55 deletions bin/pygrb/pycbc_make_offline_grb_workflow
@@ -33,10 +33,11 @@ import os
import argparse
import logging
import pycbc.workflow as _workflow
from pycbc.workflow.core import configparser_value_to_file
from ligo.segments import segment, segmentlist, segmentlistdict
import matplotlib
matplotlib.use('agg')
from pycbc.results.legacy_grb import make_grb_segments_plot
from pycbc.results.pygrb_plotting_utils import make_grb_segments_plot

workflow_name = "pygrb_offline"
logging.basicConfig(format="%(asctime)s:%(levelname)s : %(message)s",
@@ -230,7 +231,7 @@ ifo = ifo_list[0]
ifos = ''.join(ifo_list)
wflow.ifos = ifos

# Is this an IPN GRB?
# Config file consistency check for IPN GRBs
if wflow.cp.has_option("workflow-inspiral", "ipn-search-points") \
and wflow.cp.has_option("workflow-injections", "ipn-sim-points"):
wflow.cp.set("injections", "ipn-gps-time",
@@ -248,25 +249,46 @@ else:
IPN = False

# Get bank_veto_bank.xml if running bank veto
# TODO: Finalize config file location of bank veto bank
if wflow.cp.has_option('workflow-inspiral', 'bank-veto-bank-file'):
bank_veto_file_path = wflow.cp.get('workflow-inspiral', 'bank-veto-bank-file')
bank_veto_file = _workflow.FileList([_workflow.resolve_url_to_file(bank_veto_file_path)])
bank_veto_file = configparser_value_to_file(wflow.cp, 'workflow-inspiral',
'bank-veto-bank-file')
bank_veto_file = _workflow.FileList([bank_veto_file])
datafind_veto_files.extend(bank_veto_file)
# TODO: Finalize location of bank veto bank and make the above automatic
# Below is the snippet of old grb_utils.py code that used to handle this
# if os.getenv("LAL_SRC") is None:
# raise ValueError("The environment variable LAL_SRC must be set to a "
# "location containing the file lalsuite.git")
# else:
# lalDir = os.getenv("LAL_SRC")
# sci_seg = segments.segment(int(cp.get("workflow", "start-time")),
# int(cp.get("workflow", "end-time")))
# file_list = FileList([])
#
# # Bank veto
# shutil.copy("%s/lalapps/src/ring/coh_PTF_config_files/" \
# "bank_veto_bank.xml" % lalDir, "%s" % run_dir)
# bank_veto_url = "file://localhost%s/bank_veto_bank.xml" % run_dir
# bank_veto = File(ifos, "bank_veto_bank", sci_seg,
# file_url=bank_veto_url)
# # FIXME: If this is an input file, use the from_path classmethod
# bank_veto.add_pfn(bank_veto.cache_entry.path, site="local")
# file_list.extend(FileList([bank_veto]))


if IPN:
search_pts_file = _workflow.get_ipn_sky_files(wflow,
wflow.cp.get("workflow-inspiral", "ipn-search-points"),
tags=["SEARCH"])
file_attrs = {
'ifos': wflow.ifos,
'segs': wflow.analysis_time,
'exe_name': "IPN_SKY_POINTS",
'tags': ["SEARCH"]
}
search_pts_file = configparser_value_to_file(wflow.cp,
'workflow-inspiral',
'ipn-search-points',
file_attrs=file_attrs)
datafind_veto_files.extend(_workflow.FileList([search_pts_file]))
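The configparser_value_to_file pattern used in this hunk can be sketched in isolation. The ResolvedFile class and function body below are simplified stand-ins for pycbc.workflow.core's File and configparser_value_to_file, not the actual PyCBC implementation; the config values are hypothetical:

```python
import configparser

class ResolvedFile:
    """Stand-in for a pycbc.workflow File: records a path plus attributes."""
    def __init__(self, path, attrs=None):
        self.path = path
        self.attrs = attrs or {}

def configparser_value_to_file(cp, section, option, file_attrs=None):
    """Simplified sketch: look up a file path in a ConfigParser section
    and wrap it in a File-like object carrying workflow attributes."""
    path = cp.get(section, option)
    return ResolvedFile(path, file_attrs)

cp = configparser.ConfigParser()
cp.read_string("""
[workflow-inspiral]
bank-veto-bank-file = /data/bank_veto_bank.xml
""")

f = configparser_value_to_file(cp, "workflow-inspiral",
                               "bank-veto-bank-file",
                               file_attrs={"tags": ["SEARCH"]})
print(f.path)  # /data/bank_veto_bank.xml
```

The real function additionally resolves the URL to a workflow File node; this sketch only shows the config-lookup-plus-attributes shape that the diff relies on.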

# Make ExtTrig xml file (needed for lalapps_inspinj and summary pages)
# TODO: how will this work in O4?
if wflow.cp.has_option("inspiral", "do-exttrig"):
ext_file = _workflow.make_exttrig_file(wflow.cp, ifos, sciSegs[ifo][0],
baseDir)
all_files.extend(_workflow.FileList([ext_file]))

all_files.extend(datafind_veto_files)

# TEMPLATE BANK AND SPLIT BANK
@@ -291,15 +313,25 @@ if wflow.cp.has_section("workflow-injections"):

# Generate injection files
if IPN:
sim_pts_file = _workflow.get_ipn_sky_files(wflow,
wflow.cp.get("workflow-injections", "ipn-sim-points"),
tags=["SIM"])
# TODO: we used to pass this file to setup_injection_workflow as
# exttrig_file = sim_pts_file to then use it in lalapps_inspinj.
# The code below picks it up but does not pass it: have
# setup_injection_workflow or pycbc_create_injections handle it
# directly via configparser_value_to_file
file_attrs = {
'ifos': wflow.ifos,
'segs': wflow.analysis_time,
'exe_name': "IPN_SKY_POINTS",
'tags': ["SIM"]
}
sim_pts_file = configparser_value_to_file(wflow.cp,
'workflow-inspiral',
'ipn-sim-points',
file_attrs=file_attrs)
all_files.extend(_workflow.FileList([sim_pts_file]))
inj_files, inj_tags = _workflow.setup_injection_workflow(wflow, injDir,
exttrig_file=sim_pts_file)
inj_files, inj_tags = _workflow.setup_injection_workflow(wflow, injDir)
else:
inj_files, inj_tags = _workflow.setup_injection_workflow(wflow, injDir,
exttrig_file=ext_file)
inj_files, inj_tags = _workflow.setup_injection_workflow(wflow, injDir)
all_files.extend(inj_files)
injs = inj_files

@@ -389,9 +421,7 @@ all_files.extend(_workflow.FileList([inspiral_cache]))
inspiral_cache_entries = inspiral_files.convert_to_lal_cache()
inspiral_cache_entries.tofile(open(inspiral_cache.storage_path, "w"))

# LONG TIME SLIDES
# TODO: Eventually remove this once new time slides are finished
# Long slides will likely not be done in the future
# LONG TIME SLIDES: unlikely to be done in the (near) future
long_slides = True if wflow.cp.has_option("workflow", "do-long-slides") else False
if long_slides:
tsDir = os.path.join(currDir, "timeslides")
@@ -418,43 +448,52 @@ ppDir = os.path.join(currDir, "post_processing")
os.makedirs(ppDir)
post_proc_method = wflow.cp.get_opt_tags("workflow-postproc",
"postproc-method", tags)

pp_files = _workflow.FileList([])
results_files = _workflow.FileList([])
if post_proc_method == "PYGRB_OFFLINE":
pp_files = _workflow.setup_pygrb_pp_workflow(wflow, ppDir, segDir,
sciSegs[ifo][0], inspiral_files,
injs, inj_insp_files, inj_tags)

# TODO: Remove all coh_ptf post-processing
if post_proc_method in ["COH_PTF_WORKFLOW", "COH_PTF_OFFLINE",
"COH_PTF_ONLINE"]:
from pylal import pygrb_cohptf_pp
# Add parsed config file so it can be linked from summary page
cp_file_name = workflow_name + ".ini"
cp_file_url = "file://localhost%s/%s" % (runDir, cp_file_name)
cp_file = _workflow.File(ifos, cp_file_name, sciSegs[ifo][0],
file_url=cp_file_url)
cp_file.PFN(cp_file.cache_entry.path, site="local")

# Generate post-processing workflow
html_dir = wflow.cp.get("workflow", "html-dir")
pp_files = pygrb_cohptf_pp.setup_coh_PTF_post_processing(wflow,
inspiral_files, inspiral_cache, ppDir, segDir,
injection_trigger_files=inj_insp_files, injection_files=injs,
injection_trigger_caches=inj_insp_caches,
timeslide_trigger_files=inspiral_ts_files,
injection_caches=inj_caches, config_file=cp_file, web_dir=html_dir,
segments_plot=segs_plot, ifos=ifos, inj_tags=inj_tags)

# Retrieve style files for webpage
summary_files = _workflow.get_coh_PTF_files(wflow.cp, ifos, ppDir,
summary_files=True)

pp_files.extend(_workflow.FileList([cp_file]))
pp_files.extend(summary_files)
# pp_files[1] ALLTIMES_CLUSTERED File
# pp_files[3] ONSOURCE_CLUSTERED File
# pp_files[-2] FOUNDMISSED FileList
# pp_files[-1] FOUNDMISSED-FILTERED FileList
Contributor: Would be good to document all files in pp_files in case (e.g.) someone might want to use pp_files[2].

Contributor: This comment makes me wonder if a different data structure may be beneficial here in terms of code clarity, maybe a dict?

Contributor Author (@pannarale, Jul 3, 2023): I will always need a _workflow.FileList, but I guess I can build a dictionary in parallel, use it as a "living documentation" structure, and go through it when I need individual pp_files elements.

Contributor Author (@pannarale): Okay, thinking about this more, it is probably better to document this in pycbc/workflow/grb_utils.py, where setup_pygrb_pp_workflow is defined, and here place a reminder to look over there.

Contributor: On second thought (not required for this PR, as it's time to merge this, but something to fix going forward):

I think it's a bad idea to assume anything about the order here, especially things like [-1] or [-2]. The FileLists were never intended to have a special order. That said, the case of a job that produces multiple output files was never supported well, and dealing with cases like this is ugly. Suggestions for improvement here are welcome!

However, I would suggest using one of the methods of the FileList class (https://pycbc.org/pycbc/latest/html/_modules/pycbc/workflow/core.html#FileList) to get these files. It's possible that a new method will be needed, as I think you'll want to search by "description" rather than "tag" (but I'm not sure). I think that file_list.categorize_by_attr('description') may produce what @titodalcanton suggests, and would then be easier and clearer to work with.
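The categorize_by_attr suggestion in the review can be illustrated with a toy reimplementation. The File and FileList classes below are minimal stand-ins, and the real PyCBC method's return shape may differ; only the grouping idea is shown:

```python
class File:
    """Toy stand-in for a workflow File with a description attribute."""
    def __init__(self, description, path):
        self.description = description
        self.path = path

class FileList(list):
    def categorize_by_attr(self, attr):
        """Group files into a dict keyed by the given attribute,
        mimicking the spirit of pycbc.workflow FileList grouping."""
        categories = {}
        for f in self:
            categories.setdefault(getattr(f, attr), []).append(f)
        return categories

pp_files = FileList([
    File("ALLTIMES_CLUSTERED", "trigs_all.h5"),
    File("ONSOURCE_CLUSTERED", "trigs_on.h5"),
    File("FOUNDMISSED", "inj1.h5"),
    File("FOUNDMISSED", "inj2.h5"),
])

by_desc = pp_files.categorize_by_attr("description")
print(sorted(by_desc))
# ['ALLTIMES_CLUSTERED', 'FOUNDMISSED', 'ONSOURCE_CLUSTERED']
```

Looking files up by description this way avoids relying on positional indices such as pp_files[1] or pp_files[-2], which is exactly the fragility the reviewer flags.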

sec_name = 'workflow-pygrb_pp_workflow'
if not wflow.cp.has_section(sec_name):
msg = 'No {0} section found in configuration file.'.format(sec_name)
logging.info(msg)
else:
logging.info('Entering results module')
results_files = _workflow.setup_pygrb_results_workflow(wflow, ppDir,
pp_files[1],
pp_files[-2])
logging.info('Leaving results module')

#if post_proc_method in ["COH_PTF_WORKFLOW", "COH_PTF_OFFLINE",
# "COH_PTF_ONLINE"]:
# from pylal import pygrb_cohptf_pp
# # Add parsed config file so it can be linked from summary page
# cp_file_name = workflow_name + ".ini"
# cp_file_url = "file://localhost%s/%s" % (runDir, cp_file_name)
# cp_file = _workflow.File(ifos, cp_file_name, sciSegs[ifo][0],
# file_url=cp_file_url)
# cp_file.PFN(cp_file.cache_entry.path, site="local")
#
# # Generate post-processing workflow
# html_dir = wflow.cp.get("workflow", "html-dir")
# pp_files = pygrb_cohptf_pp.setup_coh_PTF_post_processing(wflow,
# inspiral_files, inspiral_cache, ppDir, segDir,
# injection_trigger_files=inj_insp_files, injection_files=injs,
# injection_trigger_caches=inj_insp_caches,
# timeslide_trigger_files=inspiral_ts_files,
# injection_caches=inj_caches, config_file=cp_file, web_dir=html_dir,
# segments_plot=segs_plot, ifos=ifos, inj_tags=inj_tags)
#
# pp_files.extend(_workflow.FileList([cp_file]))

all_files.extend(pp_files)
all_files.extend(results_files)

# COMPILE WORKFLOW AND WRITE DAX
wflow.save()
logging.info("Written dax.")

16 changes: 4 additions & 12 deletions bin/pygrb/pycbc_pygrb_efficiency
@@ -101,21 +101,14 @@ parser.add_argument("-g", "--glitch-check-factor", action="store",
parser.add_argument("-C", "--cluster-window", action="store", type=float,
default=0.1, help="The cluster window used " +
"to cluster triggers in time.")
ppu.pygrb_add_missed_injs_input_opt(parser)
ppu.pygrb_add_injmc_opts(parser)
ppu.pygrb_add_bestnr_opts(parser)
opts = parser.parse_args()

init_logging(opts.verbose, format="%(asctime)s: %(levelname)s: %(message)s")

# Check options
if (opts.found_file is None) and (opts.missed_file is None):
do_injections = False
elif (opts.found_file) and opts.missed_file:
do_injections = True
else:
err_msg = "Must provide both found and missed file if running injections."
parser.error(err_msg)
do_injections = opts.found_missed_file is not None

if not opts.newsnr_threshold:
opts.newsnr_threshold = opts.snr_threshold
@@ -124,9 +117,8 @@ if not opts.newsnr_threshold:
outdir = os.path.split(os.path.abspath(opts.background_output_file))[0]
trig_file = opts.offsource_file
onsource_file = opts.onsource_file
found_file = opts.found_file
missed_file = opts.missed_file
inj_set_name = os.path.split(os.path.abspath(missed_file))[1].split('INSPINJ')[1].split('_')[1]
found_missed_file = opts.found_missed_file
inj_set_name = os.path.split(os.path.abspath(found_missed_file))[1].split('INSPINJ')[1].split('_')[1]
chisq_index = opts.chisq_index
chisq_nhigh = opts.chisq_nhigh
wf_err = opts.waveform_error
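The inj_set_name extraction above assumes a filename convention of the form <IFOS>-INSPINJ_<SETNAME>_...; the helper name and example path below are hypothetical, chosen only to exercise the same string logic:

```python
import os

def injection_set_name(path):
    """Extract the injection-set tag from a filename assumed to follow
    the <IFOS>-INSPINJ_<SETNAME>_... pattern used in the workflow."""
    fname = os.path.split(os.path.abspath(path))[1]
    # split('INSPINJ')[1] -> "_<SETNAME>_...", so split('_')[1] -> SETNAME
    return fname.split('INSPINJ')[1].split('_')[1]

print(injection_set_name(
    "/run/H1L1-INSPINJ_NSNS_FOUNDMISSED-1126259462-100.h5"))
# NSNS
```

If a found-missed file ever deviates from this naming pattern, the double split raises an IndexError, so the convention is effectively part of the interface.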
@@ -376,7 +368,7 @@ if do_injections:
logging.info("%d found injections analysed.", len(found_injs))

# Missed injections (ones not recovered at all)
missed_injs = ppu.load_injections(missed_file, vetoes, sim_table=True)
missed_injs = ppu.load_injections(found_missed_file, vetoes, sim_table=True)

# Process missed injections 'missed_inj'
missed_inj = {}
9 changes: 5 additions & 4 deletions bin/pygrb/pycbc_pygrb_minifollowups
@@ -35,8 +35,9 @@ import pycbc.workflow.minifollowups as mini
import pycbc.version
import pycbc.events
from pycbc.results import layout
from pycbc.results import pygrb_postprocessing_utils as ppu
from pycbc.results.pygrb_postprocessing_utils import extract_ifos
from pycbc.workflow.plotting import PlotExecutable
from pycbc.workflow.grb_utils import build_veto_filelist, build_segment_filelist

__author__ = "Francesco Pannarale <[email protected]>"
__version__ = pycbc.version.git_verbose_msg
@@ -72,10 +73,10 @@ def make_timeseries_plot(workflow, trig_file, snr_type, central_time,
tags=tags+extra_tags).create_node()
node.add_input_opt('--trig-file', trig_file)
# Pass the veto files
veto_files = ppu.build_veto_filelist(workflow)
veto_files = build_veto_filelist(workflow)
node.add_input_list_opt('--veto-files', veto_files)
# Pass the segment files
seg_files = ppu.build_segment_filelist(workflow)
seg_files = build_segment_filelist(workflow)
node.add_input_list_opt('--seg-files', seg_files)
# Other shared tuning values
for opt in ['chisq-index', 'chisq-nhigh', 'null-snr-threshold',
Expand Down Expand Up @@ -155,7 +156,7 @@ num_events = min(num_events, len(fp['BestNR'][:]))

# Determine ifos used in the analysis
trig_file = resolve_url_to_file(os.path.abspath(args.trig_file))
ifos = ppu.extract_ifos(os.path.abspath(args.trig_file))
ifos = extract_ifos(os.path.abspath(args.trig_file))
num_ifos = len(ifos)

# (Loudest) off/on-source events are on time-slid data so the
52 changes: 41 additions & 11 deletions bin/pygrb/pycbc_pygrb_page_tables
@@ -48,6 +48,38 @@ __program__ = "pycbc_pygrb_page_tables"
# =============================================================================
# Functions
# =============================================================================
# Function to load trigger data
def load_data(input_file, vetoes, ifos, opts, injections=False):
"""Load data from a trigger/injection file"""

# Initialize the dictionary
data = {}
data['trig_time'] = None
data['coherent'] = None
data['reweighted_snr'] = None

if input_file:
if injections:
logging.info("Loading injections...")
# This will eventually become ppu.load_injections()
trigs_or_injs = ppu.load_triggers(input_file, vetoes)
else:
logging.info("Loading triggers...")
trigs_or_injs = ppu.load_triggers(input_file, vetoes)

data['trig_time'] = trigs_or_injs['network/end_time_gc'][:]
num_trigs_or_injs = len(data['trig_time'])
data['coherent'] = trigs_or_injs['network/coherent_snr'][:]
data['reweighted_snr'] = trigs_or_injs['network/reweighted_snr'][:]

if injections:
logging.info("%d injections found.", num_trigs_or_injs)
else:
logging.info("%d triggers found.", num_trigs_or_injs)

return data
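The load_data helper added in this hunk can be exercised with a plain dict standing in for the already-loaded HDF5 trigger group; this simplified version strips the file reading and veto handling, so ppu.load_triggers is not called and the structure shown is the assumption:

```python
import logging

def load_data(trigs_or_injs, injections=False):
    """Simplified sketch of the diff's load_data: trigs_or_injs stands
    in for the HDF5 group that ppu.load_triggers would return."""
    data = {'trig_time': None, 'coherent': None, 'reweighted_snr': None}
    if trigs_or_injs is not None:
        data['trig_time'] = trigs_or_injs['network/end_time_gc'][:]
        data['coherent'] = trigs_or_injs['network/coherent_snr'][:]
        data['reweighted_snr'] = trigs_or_injs['network/reweighted_snr'][:]
        kind = "injections" if injections else "triggers"
        logging.info("%d %s found.", len(data['trig_time']), kind)
    return data

# Hypothetical two-trigger group mimicking the h5py dataset layout
trigs = {
    'network/end_time_gc': [1126259462.0, 1126259463.5],
    'network/coherent_snr': [8.1, 6.7],
    'network/reweighted_snr': [7.9, 5.2],
}
data = load_data(trigs)
print(len(data['trig_time']))  # 2
```

The [:] slices mirror the h5py idiom of reading a dataset fully into memory; on the plain lists used here they simply copy.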


def get_column(table, column_name):
"""Wrapper for glue.ligolw.lsctables.MultiInspiralTable.get_column
method. Easier for h5py switch. Note: Must still replace
@@ -190,7 +222,6 @@ parser.add_argument("-F", "--offsource-file", action="store", required=True,
# As opposed to offsource-file and trig-file, this only contains onsource
parser.add_argument("--onsource-file", action="store", default=None,
help="Location of on-source trigger file.")
ppu.pygrb_add_missed_injs_input_opt(parser)
ppu.pygrb_add_injmc_opts(parser)
ppu.pygrb_add_bestnr_opts(parser)
parser.add_argument("--num-loudest-off-trigs", action="store",
@@ -200,6 +231,7 @@ parser.add_argument("--quiet-found-injs-output-file", default=None, #required=Tr
help="Quiet-found injections html output file.")
parser.add_argument("--missed-found-injs-output-file", default=None, #required=True,
help="Missed-found injections html output file.")
# TODO: group hdf5 files into a single one
parser.add_argument("--quiet-found-injs-h5-output-file", default=None, #required=True,
help="Quiet-found injections h5 output file.")
parser.add_argument("--loudest-offsource-trigs-output-file", default=None, #required=True,
@@ -241,9 +273,9 @@ if output_files.count(None) == len(output_files):
if opts.quiet_found_injs_output_file or opts.missed_found_injs_output_file or\
opts.quiet_found_injs_h5_output_file:
do_injections = True
if (opts.found_file is None) and (opts.missed_file is None):
err_msg = "Must provide both found and missed injections file "
err_msg += "locations if processing injections."
if opts.found_missed_file is None:
err_msg = "Must provide found-missed injections file "
err_msg += "location if processing injections."
parser.error(err_msg)
else:
do_injections = False
@@ -260,8 +292,7 @@ if not opts.newsnr_threshold:
# Store options used multiple times in local variables
trig_file = opts.offsource_file
onsource_file = opts.onsource_file
found_file = opts.found_file
missed_file = opts.missed_file
found_missed_file = opts.found_missed_file
chisq_index = opts.chisq_index
chisq_nhigh = opts.chisq_nhigh
wf_err = opts.waveform_error
@@ -316,8 +347,7 @@ ifos, vetoes = ppu.extract_ifos_and_vetoes(trig_file, opts.veto_files,

# Load triggers, time-slides, and segment dictionary
logging.info("Loading triggers.")
trigs = ppu.load_xml_table(trig_file, lsctables.MultiInspiralTable.tableName)
logging.info("%d triggers loaded.", len(trigs))
trig_data = load_data(trig_file, vetoes, ifos, opts)
logging.info("Loading timeslides.")
slide_dict = ppu.load_time_slides(trig_file)
logging.info("Loading segments.")
@@ -587,8 +617,8 @@ else:
# =======================
if do_injections:

found_trig_table = ppu.load_injections(found_file, vetoes)
found_inj_table = ppu.load_injections(found_file, vetoes, sim_table=True)
found_trig_table = ppu.load_injections(found_missed_file, vetoes)
found_inj_table = ppu.load_injections(found_missed_file, vetoes, sim_table=True)

logging.info("Missed/found injections/triggers loaded.")

@@ -618,7 +648,7 @@ if do_injections:
logging.info("%d found injections analysed.", len(found_injs['mchirp']))

# Missed injections (ones not recovered at all)
missed_inj_table = ppu.load_injections(missed_file, vetoes, sim_table=True)
missed_inj_table = ppu.load_injections(found_missed_file, vetoes, sim_table=True)
missed_injs = lsctable_to_dict(missed_inj_table, ifos, opts)

# Avoids a problem with formatting in the non-static html output file