Parallel Bilby Generation
The Basic Idea
To run efficiently, parallel bilby requires some preparatory work, just as efficient construction work requires some logistical planning. Fortunately, you do not need to do this on your own: you can outsource the task to the `parallel_bilby_generation` command, which will meet your demands provided they are well specified in a configuration file, usually referred to as `config.ini`. This file should specify the non-default arguments taken by `parallel_bilby_generation`; several default values are also inherited from `bilby_pipe`. Below are some recommended settings for your `.ini` file, but see the `parallel_bilby` docs as well.
Data Generation arguments
```ini
trigger_time = 0
```
For real data, this should be set to the GPS time of the event, e.g. 1187008882.43 for GW170817. For injection studies, it will mostly suffice to leave it at 0.
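To make the role of the trigger time concrete, here is a minimal sketch of how it anchors the analysed data segment, assuming the common `bilby_pipe` convention that the segment ends a short buffer after the trigger (the 2 s value below is an assumption; check the defaults of your version):

```python
# Sketch: how trigger_time relates to the analysed data segment.
trigger_time = 1187008882.43  # GPS time of GW170817
duration = 256.0              # segment length in seconds
post_trigger_duration = 2.0   # assumed buffer after the trigger
start_time = trigger_time + post_trigger_duration - duration
end_time = start_time + duration
```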
Detector arguments
```ini
detectors = [ET1, ET2, ET3]
psd_dict = {ET1=bilby/bilby/gw/detector/noise_curves/ET_D_psd.txt,
            ET2=bilby/bilby/gw/detector/noise_curves/ET_D_psd.txt,
            ET3=bilby/bilby/gw/detector/noise_curves/ET_D_psd.txt}
```
The power spectral density (PSD) characterises the assumed noise level in the detectors. It needs to be adapted to the detectors in use, in this example the Einstein Telescope. Other PSDs can be found in the same directory, `bilby/bilby/gw/detector/noise_curves/`. If you already have a signal available, you need to submit it via a `data_dict` instead.
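The PSD files referenced above are plain, headerless two-column text files (frequency and PSD value). A minimal sketch of writing and reading a file in that layout with NumPy; the values are made up for illustration:

```python
import numpy as np

# Toy PSD file in the usual two-column layout: frequency [Hz], PSD [1/Hz].
# The numbers below are illustrative, not a real noise curve.
freqs = np.linspace(10.0, 2048.0, 100)
psd = 1e-47 * (freqs / 100.0) ** -4 + 1e-49
np.savetxt("toy_psd.txt", np.column_stack([freqs, psd]))

data = np.loadtxt("toy_psd.txt")  # the same layout bilby reads back
```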
Likelihood arguments
```ini
distance-marginalization = False
phase-marginalization = True
time-marginalization = False
likelihood-type = ROQGravitationalWaveTransient
roq-folder = [SOME_PATH]/roqs
```
We can marginalise our likelihood over some parameters simply by integrating over their support. This makes sense when we are not really interested in them, e.g. because they have no intrinsic meaning for the system.
Using reduced-order quadratures (ROQs) will speed up your inference significantly. The idea is to reduce the cost of likelihood evaluations by introducing a data-dependent basis for the computation of the likelihood integral. Their use is therefore highly encouraged. However, they require a lot of memory and must be adapted to the expected parameter space, which implies constraints on the prior. As of 2022, constructing advanced ROQs is a global community effort. Some early ROQs can be found here; ask your supervisor for up-to-date repositories.
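The core ROQ trick can be illustrated in a few lines of linear algebra: once data-dependent weights are precomputed against a reduced basis, the inner product between data and any waveform lying in the span of that basis collapses from an O(N) sum over the full frequency grid to an O(n) sum over the basis. This is a toy sketch of the idea, not the actual ROQ construction used in production:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 1000, 5  # full frequency-grid size vs. reduced-basis size

# Orthonormal reduced basis (columns of B), a stand-in for a real ROQ basis.
B = np.linalg.qr(rng.normal(size=(N, n)))[0]
d = rng.normal(size=N)        # "data"
h = B @ rng.normal(size=n)    # a "waveform" lying in the basis span

weights = B.T @ d             # data-dependent ROQ weights, computed once
full = d @ h                  # O(N) inner product over the full grid
roq = weights @ (B.T @ h)     # O(n) sum once the basis coefficients are known
```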
Prior arguments
```ini
prior-file = bilby/bilby/gw/prior_files/aligned_spins_bns.prior
```
You will need to specify the prior from which samples are drawn as `bilby.Prior` objects. Bilby provides several standard priors in the directory `bilby/bilby/gw/prior_files/`. Alternatively, you can add the prior directly to your `.ini` file as a `prior-dict`, e.g. as
```ini
prior-dict = {
    mass_1_source = Constraint(name='mass_1_source', minimum=0.5, maximum=3.),
    mass_2_source = Constraint(name='mass_2_source', minimum=0.5, maximum=3.),
    mass_1 = Constraint(name='mass_1', minimum=0.5, maximum=3.),
    mass_2 = Constraint(name='mass_2', minimum=0.5, maximum=3.),
    mass_ratio = Uniform(name='mass_ratio', minimum=0.125, maximum=1),
    chirp_mass_source = Uniform(name='chirp_mass_source', minimum=1.05, maximum=1.2),
    chirp_mass = Constraint(name='chirp_mass', minimum=1.1, maximum=1.2),
    chi_1 = bilby.gw.prior.AlignedSpin(name='chi_1', a_prior=Uniform(minimum=0, maximum=0.15)),
    chi_2 = bilby.gw.prior.AlignedSpin(name='chi_2', a_prior=Uniform(minimum=0, maximum=0.15)),
    luminosity_distance = bilby.gw.prior.UniformComovingVolume(name='luminosity_distance', minimum=5, maximum=500, unit='Mpc', latex_label='$d_L$'),
    dec = -0.408084,
    ra = 3.44616,
    cos_theta_jn = Uniform(name='cos_theta_jn', minimum=-1, maximum=1),
    psi = Uniform(name='psi', minimum=0, maximum=np.pi, boundary='periodic'),
    phase = Uniform(name='phase', minimum=0, maximum=2 * np.pi, boundary='periodic'),
    lambda_1 = Constraint(name='lambda_1', minimum=0., maximum=5000.),
    lambda_2 = Constraint(name='lambda_2', minimum=0., maximum=5000.)
}
```
This is an optimised prior for a double neutron-star coalescence. Note how the sky localisation has been fixed to the position of GW170817, and multiple constraints have been introduced in the source frame as well as on the tidal deformabilities $\Lambda_{1,2}$. The sampler will not sample over keys that were provided as constraints or fixed values. As a rule of thumb, the prior should be kept as narrow as possible to minimise the computational cost, but the posterior should only touch its edges if logically necessary. In other words: if a posterior distribution obtained from the above prior peaks near a luminosity distance of 500 Mpc, this indicates a poorly chosen prior, while a peak in mass ratio close to 1 is unproblematic, by definition.
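The mix of source-frame and detector-frame mass keys in the prior above reflects the cosmological redshift: detector-frame masses are larger than source-frame masses by a factor of $(1+z)$. A minimal sketch with illustrative numbers:

```python
# Detector-frame mass = source-frame mass * (1 + z).
# The redshift and mass below are illustrative values, not from any event.
z = 0.01          # hypothetical redshift
m1_source = 1.40  # source-frame mass in solar masses
m1_detector = m1_source * (1.0 + z)
```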
EoS Arguments
When using the EoS patches, `parallel_bilby` will be able to additionally sample over a set of neutron-star equations of state (EoSs).
For this to work, your EoSs need to strictly adhere to the following format:
Assume you want to sample over 500 EoSs. Each of these needs to be given by its mass-radius relation and the associated dimensionless tidal deformability $\Lambda$. You need to create 500 files labelled `1.dat`, …, `500.dat` and store them inside a single directory `[EOS_DIR]`.
The files must contain exactly three columns of equal length for, from left to right, radius, mass, and $\Lambda$, without any headers. The entries must be ordered by increasing mass and cover a mass range from about 0.15 solar masses to the TOV limit. About 100 data points should be sufficient, as the programme will interpolate between them. Experienced users might prefer to apply patches of their own to accommodate different needs.
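The required file format can be sketched with NumPy; the mass-radius-$\Lambda$ values below are made up purely to illustrate the three-column, mass-sorted layout:

```python
import numpy as np

# Toy EoS file in the required format: three headerless columns
# (radius [km], mass [M_sun], Lambda), sorted by increasing mass.
mass = np.linspace(0.15, 2.2, 100)       # ~0.15 M_sun up to a toy TOV limit
radius = np.full_like(mass, 12.0)        # hypothetical constant-radius EoS
lam = 5000.0 * (1.0 / mass) ** 5         # hypothetical Lambda(m) falloff
np.savetxt("1.dat", np.column_stack([radius, mass, lam]))

data = np.loadtxt("1.dat")
```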
Once this is completed, add these arguments to your `.ini`:
```ini
eos = true
Neos = [YOUR_NUMBER_OF_EOS]
path-to-eos-data = [EOS_DIR]
```
You likely have reasons to ascribe a higher prior belief to some of the EoSs. This can be expressed using sampling weights. These again need to be stored in a single text file `[YOUR_EOS_WEIGHTS].dat`, where the $i$-th line contains (only) the weight of the $i$-th EoS. Because `bilby` does not know how to handle truly discrete quantities, we have to find a workaround using an interpolation. To minimise the effects of interpolation-increased/reduced weights, you need to reorder the EoSs by increasing/decreasing weight as well, and provide the argument
```ini
path-to-eos-weight = [YOUR_ORDERED_EOS_WEIGHTS.dat/txt]
```
You can find some routines to help you with this task, as well as example EoS and weight files, in this `zip` directory.
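The reordering step can be sketched as follows; the five weights are hypothetical, and in practice the corresponding `i.dat` EoS files must be renamed consistently with the same permutation:

```python
import numpy as np

# Hypothetical sampling weights for 5 EoSs, one per line in the weights file.
weights = np.array([0.1, 0.5, 0.2, 0.15, 0.05])
order = np.argsort(weights)        # permutation sorting by increasing weight
sorted_weights = weights[order]
np.savetxt("ordered_weights.txt", sorted_weights)

# EoS file order[k]+1 -> new label k+1 would need to be applied on disk, too.
```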
Injection arguments
These are required if you do not provide data to be analysed, but rather wish `bilby` to generate it for you based upon some parameters you specify.
```ini
injection = True
gaussian-noise = True
injection-numbers = [0]
n-simulation = 1
injection-file = [YOUR_INJECTION].json
```
The injected parameters in `injection-file` are used to create a waveform model that is superposed with noise. If you are interested in optimal conditions, you might also want to use `zero-noise` instead of `gaussian-noise`. The `injection-numbers` specify the rows to use in your `injection-file`, using the ordinary Python slicing syntax. `n-simulation` must match the total number of injections, that is to say `n-simulation = len(injection-numbers)`.
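The row selection can be sketched as below. Note the exact schema of `bilby_pipe` injection JSON files may differ; the structure here is a simplified assumption for illustration:

```python
import json

# Hypothetical injection file: one parameter dict per injection row.
content = {"injections": [{"chirp_mass": 1.19}, {"chirp_mass": 1.25}]}
with open("toy_injection.json", "w") as f:
    json.dump(content, f)

with open("toy_injection.json") as f:
    injections = json.load(f)["injections"]

injection_numbers = [0]                        # rows to analyse
selected = [injections[i] for i in injection_numbers]
n_simulation = len(injection_numbers)          # must match n-simulation
```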
Waveform arguments
```ini
frequency-domain-source-model = lal_binary_neutron_star
waveform_approximant = IMRPhenomD_NRTidalv2
maximum-frequency = 2048
minimum-frequency = 30
duration = 256
catch-waveform-errors = True
```
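The frequency and duration settings above are linked by standard signal-processing relations, which this small sketch makes explicit:

```python
# Relations between the waveform arguments (standard signal processing):
duration = 256.0            # seconds
maximum_frequency = 2048.0  # Hz
sampling_frequency = 2.0 * maximum_frequency  # Nyquist criterion
delta_f = 1.0 / duration                      # frequency resolution
```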
Sampler arguments
```ini
nact = 30
nlive = 2048
maxmcmc = 10000
sampling-seed = 42
no-plot = True
generation-seed = 500221
```
Job Submission arguments
```ini
label = inj
outdir = outdir
```
You may want to assign unique labels and directory names to quickly identify different runs.