dials.algorithms.scaling

This module provides data structures and algorithms for scaling.

model.model

Definitions of scaling models.

A scaling model is a collection of scaling model components with appropriate methods to define how these are composed into one model.
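
As a hedged illustration (not itself part of the API reference), a model is typically created from data via a classmethod and then inspected through its components. The sketch below assumes params, experiment and reflection_table have already been prepared elsewhere, e.g. by the scaling command-line setup.

    # Illustrative sketch only: `params`, `experiment` and `reflection_table`
    # are assumed to exist already.
    from dials.algorithms.scaling.model.model import PhysicalScalingModel

    model = PhysicalScalingModel.from_data(params, experiment, reflection_table)
    print(model.id_)                           # 'physical'
    print(list(model.components))              # names of the model components
    print(model.consecutive_refinement_order)  # e.g. [['scale', 'decay'], ['absorption']]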

class dials.algorithms.scaling.model.model.ArrayScalingModel(parameters_dict, configdict, is_scaled=False)[source]

Bases: ScalingModelBase

A scaling model for an array-based parameterisation.

__init__(parameters_dict, configdict, is_scaled=False)[source]

Create the array scaling model components.

configure_components(reflection_table, experiment, params)[source]

Add the required reflection table data to the model components.

property consecutive_refinement_order

a nested list of component names to indicate scaling order.

Type:

list

classmethod from_data(params, experiment, reflection_table)[source]

Create an array-based scaling model.

classmethod from_dict(obj)[source]

Create an ArrayScalingModel from a dictionary.

id_ = 'array'
limit_image_range(new_image_range)[source]

Modify the model to be suitable for a reduced image range.

For this model, this involves determining whether the number of parameters should be reduced, and may reduce the number of parameters in the absorption and decay components.

Parameters:

new_image_range (tuple) – The (start, end) of the new image range.

phil_scope = <libtbx.phil.scope object>
plot_model_components(reflection_table=None)[source]

Return a dict of plots for plotting model components with plotly.

update(params)[source]
class dials.algorithms.scaling.model.model.DoseDecay(parameters_dict, configdict, is_scaled=False)[source]

Bases: ScalingModelBase

A model similar to the physical model, combining an exponential decay component with a relative B-factor per sweep, and no absorption surface by default. Most suitable for multi-crystal datasets.

__init__(parameters_dict, configdict, is_scaled=False)[source]

Create the physical scaling model components.

configure_components(reflection_table, experiment, params)[source]

Add the required reflection table data to the model components.

property consecutive_refinement_order

a nested list of component names to indicate scaling order.

Type:

list

fix_initial_parameter(params)[source]

Fix a parameter of the scaling model.

classmethod from_data(params, experiment, reflection_table)[source]

Create the scaling model defined by the params.

classmethod from_dict(obj)[source]

Create a DoseDecay scaling model from a dictionary.

get_shared_components()[source]
id_ = 'dose_decay'
limit_image_range(new_image_range)[source]

Modify the model to be suitable for a reduced image range.

For this model, this involves determining whether the number of parameters should be reduced, and may reduce the number of parameters in the scale and decay components.

Parameters:

new_image_range (tuple) – The (start, end) of the new image range.

phil_scope = <libtbx.phil.scope object>
plot_model_components(reflection_table=None)[source]

Return a dict of plots for plotting model components with plotly.

update(params)[source]
class dials.algorithms.scaling.model.model.KBScalingModel(parameters_dict, configdict, is_scaled=False)[source]

Bases: ScalingModelBase

A scaling model for a KB parameterisation.

__init__(parameters_dict, configdict, is_scaled=False)[source]

Create the KB scaling model components.

configure_components(reflection_table, experiment, params)[source]

Add the required reflection table data to the model components.

property consecutive_refinement_order

a nested list of component names to indicate scaling order.

Type:

list

classmethod from_data(params, experiment, reflection_table)[source]

Create the KBScalingModel from data.

classmethod from_dict(obj)[source]

Create a KBScalingModel from a dictionary.

id_ = 'KB'
phil_scope = <libtbx.phil.scope object>
update(params)[source]
class dials.algorithms.scaling.model.model.PhysicalScalingModel(parameters_dict, configdict, is_scaled=False)[source]

Bases: ScalingModelBase

A scaling model for a physical parameterisation.

__init__(parameters_dict, configdict, is_scaled=False)[source]

Create the physical scaling model components.

configure_components(reflection_table, experiment, params)[source]

Add the required reflection table data to the model components.

property consecutive_refinement_order

a nested list of component names to indicate scaling order.

Type:

list

fix_initial_parameter(params)[source]

Fix a parameter of the scaling model.

classmethod from_data(params, experiment, reflection_table)[source]

Create the scaling model defined by the params.

classmethod from_dict(obj)[source]

Create a PhysicalScalingModel from a dictionary.

get_shared_components()[source]
id_ = 'physical'
limit_image_range(new_image_range)[source]

Modify the model to be suitable for a reduced image range.

For this model, this involves determining whether the number of parameters should be reduced, and may reduce the number of parameters in the scale and decay components.

Parameters:

new_image_range (tuple) – The (start, end) of the new image range.

phil_scope = <libtbx.phil.scope object>
plot_model_components(reflection_table=None)[source]

Return a dict of plots for plotting model components with plotly.

update(params)[source]

Update the model if new options are chosen in the phil scope.

class dials.algorithms.scaling.model.model.ScalingModelBase(configdict, is_scaled=False)[source]

Bases: object

Abstract base class for scaling models.

__init__(configdict, is_scaled=False)[source]

Initialise the model with no components and a configdict.

property components

a dictionary of the model components.

Type:

dict

property configdict

a dictionary of the model configuration parameters.

Type:

dict

configure_components(reflection_table, experiment, params)[source]

Add the required reflection table data to the model components.

consecutive_refinement_order()[source]

list: a nested list of component names.

This list indicates to the scaler the order in which to perform scaling in consecutive scaling mode, e.g. [['scale', 'decay'], ['absorption']] would cause the first cycle to refine scale and decay, with absorption refined in a subsequent cycle.

property error_model

The error model associated with the scaling model.

Type:

error_model

fix_initial_parameter(params)[source]

Fix a parameter of the scaling model.

property fixed_components
classmethod from_data(params, experiment, reflection_table)[source]

Create the model from input data.

classmethod from_dict(obj)[source]

Create a scaling model from a dictionary.

get_shared_components()[source]
id_ = None
property is_scaled

Indicate whether this model has previously been refined.

Type:

bool

limit_image_range(new_image_range)[source]

Modify the model if necessary due to reducing the image range.

Parameters:

new_image_range (tuple) – The (start, end) of the new image range.
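
A minimal usage sketch, assuming model is an existing scaling model created for a wider sweep:

    # Sketch: restrict an existing model to the first 900 images.
    model.limit_image_range((1, 900))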

load_error_model(error_params)[source]
property n_params

the total number of model parameters.

Type:

int

plot_model_components(reflection_table=None)[source]

Return a dict of plots for plotting model components with plotly.
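
A brief usage sketch, assuming model is an existing scaling model instance:

    # Sketch: list the available component plots.
    plots = model.plot_model_components()
    for name in plots:
        print(name)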

record_intensity_combination_Imid(Imid)[source]

Record the intensity combination Imid value.

set_error_model(error_model)[source]

Associate an error model with the dataset.

set_scaling_model_as_scaled()[source]

Set the boolean 'is_scaled' flag to True.

set_scaling_model_as_unscaled()[source]

Set the boolean 'is_scaled' flag to False.

set_valid_image_range(image_range)[source]

Set the valid image range for the model in the configdict.

to_dict()[source]

Serialize the model to a dictionary.

Returns:

A dictionary representation of the model.

Return type:

dict
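
A hedged round-trip sketch using the serialization hooks above, assuming model is an existing PhysicalScalingModel:

    # Sketch: serialize a model and restore it from the dictionary.
    from dials.algorithms.scaling.model.model import PhysicalScalingModel

    d = model.to_dict()
    restored = PhysicalScalingModel.from_dict(d)
    assert restored.id_ == model.id_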

update(model_params)[source]
dials.algorithms.scaling.model.model.calc_n_param_from_bins(value_min, value_max, n_bins)[source]

Return the correct number of bins for initialising the Gaussian smoothers.

dials.algorithms.scaling.model.model.calculate_new_offset(current_image_0, new_image_0, new_norm_fac, n_old_param, n_new_param)[source]

Calculate the parameter offset for the new image range.

Returns:

An offset to apply when selecting the new parameters from the existing parameters.

Return type:

int

dials.algorithms.scaling.model.model.determine_auto_absorption_params(absorption)[source]
dials.algorithms.scaling.model.model.initialise_smooth_input(osc_range, one_osc_width, interval)[source]

Calculate the required smoother parameters.

Using information about the sequence and the chosen parameterisation interval, the required parameters for the smoother are determined.

Parameters:
  • osc_range (tuple) – The (start, stop) of an oscillation in degrees.

  • one_osc_width (float) – The oscillation width of a single image in degrees.

  • interval (float) – The required maximum separation between parameters in degrees.

Returns:

3-element tuple containing:

  • n_params (int): The number of parameters to use.

  • norm_fac (float): The degrees-to-parameter-space normalisation factor.

  • interval (float): The actual interval in degrees between the parameters.

Return type:

tuple
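
A usage sketch with illustrative numbers only:

    # Sketch: smoother set-up for a 0-90 degree sweep of 1-degree images,
    # with parameters at most 15 degrees apart.
    from dials.algorithms.scaling.model.model import initialise_smooth_input

    n_params, norm_fac, interval = initialise_smooth_input((0.0, 90.0), 1.0, 15.0)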

dials.algorithms.scaling.model.model.make_combined_plots(data)[source]

Make any plots that require evaluation of all models.

dials.algorithms.scaling.model.model.plot_scaling_models(model, reflection_table=None)[source]

Return a dict of component plots for the model for plotting with plotly.

outlier_rejection

Definitions of outlier rejection algorithms.

These algorithms use the Ih_table data structures to perform calculations in groups of symmetry-equivalent reflections. Two functions are provided: reject_outliers, which rejects outliers and sets flags given a reflection table and experiment object, and determine_outlier_index_arrays, which takes an Ih_table and returns flex.size_t index arrays of the outlier positions.

class dials.algorithms.scaling.outlier_rejection.NormDevOutlierRejection(Ih_table, zmax)[source]

Bases: OutlierRejectionBase

Algorithm using normalised deviations from the weighted intensity means.

In this case, the weighted mean is calculated from all reflections in the symmetry group excluding the test reflection.

__init__(Ih_table, zmax)[source]

Set up and run the outlier rejection algorithm.

class dials.algorithms.scaling.outlier_rejection.OutlierRejectionBase(Ih_table, zmax)[source]

Bases: object

Base class for outlier rejection algorithms using an IhTable data structure.

Subclasses must implement the _do_outlier_rejection method, which must add the indices of outliers to the _outlier_indices attribute. The algorithms are run upon initialisation and result in the population of the final_outlier_arrays.
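
As a hedged sketch of this contract (the method and attribute names are taken from the description above):

    # Sketch of the subclass contract; deliberately does nothing useful.
    from dials.algorithms.scaling.outlier_rejection import OutlierRejectionBase

    class NullOutlierRejection(OutlierRejectionBase):
        """A do-nothing algorithm that flags no reflections as outliers."""

        def _do_outlier_rejection(self):
            # A real implementation would add flex.size_t indices of
            # outlying reflections to self._outlier_indices here.
            pass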

final_outlier_arrays

A list of flex.size_t arrays of outlier indices w.r.t. the order of the initial reflection tables used to create the Ih_table.

Type:

list

__init__(Ih_table, zmax)[source]

Set up and run the outlier rejection algorithm.

run()[source]

Run the outlier rejection algorithm, implemented by a subclass.

class dials.algorithms.scaling.outlier_rejection.SimpleNormDevOutlierRejection(Ih_table, zmax)[source]

Bases: OutlierRejectionBase

Algorithm using normalised deviations from the weighted intensity means.

In this case, the weighted mean is calculated from all reflections in the symmetry group, including the test reflection.

__init__(Ih_table, zmax)[source]

Set up and run the outlier rejection algorithm.

class dials.algorithms.scaling.outlier_rejection.TargetedOutlierRejection(Ih_table, zmax, target)[source]

Bases: OutlierRejectionBase

Implementation of an outlier rejection algorithm against a target.

This algorithm requires a target Ih_table in addition to an Ih_table for the dataset under investigation. Normalised deviations are calculated from the intensity values in the target table.

__init__(Ih_table, zmax, target)[source]

Set a target Ih_table and run the outlier rejection.

dials.algorithms.scaling.outlier_rejection.determine_Esq_outlier_index_arrays(Ih_table, experiment, emax=10.0)[source]
dials.algorithms.scaling.outlier_rejection.determine_outlier_index_arrays(Ih_table, method='standard', zmax=6.0, target=None)[source]

Run an outlier algorithm and return the outlier indices.

Parameters:
  • Ih_table – A dials.algorithms.scaling.Ih_table.IhTable.

  • method (str) – Name (alias) of outlier rejection algorithm to use. If method=target, then the optional argument target must also be specified. Implemented methods: standard, simple, target.

  • zmax (float) – Normalised deviation threshold for classifying an outlier.

  • target (Optional[IhTable]) – An IhTable used to obtain target Ih values for outlier rejection, if method=target.

Returns:

A list of flex.size_t arrays, with one array per dataset that was used to create the Ih_table. Importantly, the indices are the indices of the reflections in the initial reflection table used to create the Ih_table, not the indices of the data in the Ih_table.

Return type:

outlier_index_arrays (list)

Raises:

ValueError – if an invalid choice is made for the method.
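
A hedged usage sketch, assuming ih_table is an existing IhTable:

    # Sketch: run standard outlier rejection on an existing IhTable.
    from dials.algorithms.scaling.outlier_rejection import determine_outlier_index_arrays

    outlier_index_arrays = determine_outlier_index_arrays(ih_table, method="standard", zmax=6.0)
    for i, indices in enumerate(outlier_index_arrays):
        print(f"dataset {i}: {len(indices)} outliers")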

dials.algorithms.scaling.outlier_rejection.reject_outliers(reflection_table, experiment, method='standard', zmax=6.0)[source]

Run an outlier algorithm on symmetry-equivalent intensities.

This method runs an intensity-based outlier rejection algorithm, comparing the deviations from the weighted mean in groups of symmetry equivalent reflections. The outliers are determined and the outlier_in_scaling flag is set in the reflection table.

The values intensity and variance must be set in the reflection table; these should be corrected but unscaled values, as an inverse_scale_factor will be applied during outlier rejection if this is present in the reflection table. The reflection table should also be prefiltered (e.g. not-integrated reflections should not be present) as no further filtering is done on the input table.

Parameters:
  • reflection_table – A reflection table.

  • experiment – A single experiment object.

  • method (str) – Name (alias) of outlier rejection algorithm to use.

  • zmax (float) – Normalised deviation threshold for classifying an outlier.

Returns:

The input table with the outlier_in_scaling flag set.

Return type:

reflection_table
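
A usage sketch, assuming reflection_table and experiment are already loaded; the flag check below uses the standard reflection-table flag interface rather than anything specific to this module.

    # Sketch: flag outliers, then count the flagged reflections.
    from dials.algorithms.scaling.outlier_rejection import reject_outliers

    reflection_table = reject_outliers(reflection_table, experiment, method="standard", zmax=6.0)
    outliers = reflection_table.get_flags(reflection_table.flags.outlier_in_scaling)
    print(f"{outliers.count(True)} reflections flagged as outliers")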