ReSpect Kernel Programs
The ReSpect kernel is a widely applicable method for fitting a linear model to noisy data. The program generates data reconstructions and, where appropriate, the underlying sharp detail contained in the data irrespective of the type of data being processed. The kernel is applied to a specific problem through a specialised and carefully designed interface program appropriate to the application. Thus, in mass spectrometry for example, both profile and centroided data may be processed, the only difference being the way the interfaces to the kernel are designed to operate.
In many applications (such as deconvolution), the data to be analysed are the experimental measurements themselves - e.g. a spectrum - and the requirement is to provide the Reconstruction and the Deconvolution (the underlying sharp information contained in the data). The input is a peak profile - the Model - that matches the profile of the data peaks. No knowledge of the number of peaks or their positions is required. The aim is to obtain a mathematical reconstruction that fits the actual data to within the noise level and any other constraints. Pictorially, the data may be considered as a bull's-eye surrounded by contours of increasing deviation or misfit. However, different points on the desired contour will provide different results for the same overall misfit. The ReSpect kernel is therefore designed to navigate to the correct point on the contour of interest by self-analysis of its diagnostics as the computation progresses. The program converges naturally without intervention by the user. Quantified peak position and intensity errors are obtained by directly analysing the uncertainty in the data surrounding each feature in the Deconvolution. At convergence, the fully quantified Deconvolution, Reconstruction and Misfit are available.
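The idea of fitting the data "to within the noise level" can be illustrated with a standard noise-normalised (chi-squared) misfit. This is a minimal sketch of the general concept only - the function name and the criterion are illustrative, not ReSpect's actual convergence test:

```python
import numpy as np

def chi_squared_misfit(data, reconstruction, noise_sigma):
    """Mean squared residual between data and reconstruction,
    normalised by the noise standard deviation. A value near 1.0
    means the reconstruction fits the data to within the noise."""
    residual = (data - reconstruction) / noise_sigma
    return float(np.mean(residual ** 2))

# Toy illustration: a reconstruction that differs from the data
# only by noise-sized fluctuations gives a misfit close to 1.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 6, 500))
data = clean + rng.normal(0, 0.1, 500)
misfit = chi_squared_misfit(data, clean, noise_sigma=0.1)
```

As the text notes, many different reconstructions can share the same overall misfit value; selecting among them is where the kernel's self-analysis comes in.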
In other applications, the data to be analysed may themselves have been derived from the experimental data in a prior analysis step - e.g. a peak table, derived by deconvolution or centroiding a mass spectrum, for electrospray charge deconvolution. The second series of kernel programs are designed to analyse the results obtained from deconvolutions. These programs take into account the type of experiment performed and relevant interpretation rules so that unambiguous results may be presented to the user.
PPL's policy is to provide a kernel and an interface that are suited to the problem rather than supply a general purpose program that may have shortcomings for any particular type of data.
Currently available Interfaces
General (for continuous data)
Enchant - Automatic Noise Reduction Algorithm
Enhance - Fast Data Reconstructions
Nadir - Baseline Correction
Sleuth - 1-D Peak Detection/Deconvolutions and Reconstructions
Profile - Automatic Modelling/Remodelling Program
ReView - User Graphics Interface
Discharge - ESIMS Charge Deconvolutions for any Mass
Collapse - Deisotoping - Centroids and Deconvolutions
Evaluate - Quantification with Errors. Significance/Confidence Filtering and Noise Rejection for Centroids and Deconvolutions
Element - ICP-MS Analysis - Profile and Centroid Data
Locator - Fast Reconstruction Centroiding for Peak Identification and Detection
LCMS-HiRes - Analysis Software where Deisotoping is Required to m/z or Zero-Charge
LCMS-LoRes - Analysis Software for ESI Data where Isotopes are Unresolved
LCMS-View - 2D Data/Results Viewer
LCMS-Mapper - Map Comparison Program
Interfaces for ReSpect kernels
Discharge - ESIMS Charge Deconvolutions
Standard algebraic transformation methods have the disadvantage that they introduce artefacts and are not designed to improve resolution. Although earlier reconstruction methods can work well, the quality of the result is often compromised because the computation uses only very limited knowledge of the rules that apply to the experiment. These programs also attempt to reconstruct ALL the intensity in the data even when it is unreasonable to do so, and the presence of even a single solvent signal can seriously degrade the quality of the charge-deconvolved result.
In addition, other reconstruction methods are designed on the assumption that all possible charge states are present in the data. Truncated charge series - typical of peptide map data - may therefore give rise to numerous artefacts, along with the possibility that some major fragments may not be detected. This generally leads to the found zero-charge intensities being in serious error.
These problems have been addressed and the ReSpect algorithm, along with the Discharge interface, first deconvolves or centroids the data to provide a fully quantified peak table. An analysis of the peak table provides all possible zero-charge masses that are then compared with the data to evaluate the evidence for their presence. Features for which there is no evidence are rejected and the intensities and errors for the remaining peaks are recalculated. The final zero-charge result contains only those features for which there is evidence according to the constraints that apply to the experiment. These may be edited to suit the problem and include choice of ionising species, positive and negative ion data and truncated and continuous charge series. Data from small molecules and fragments through to large peptides may be processed. The program is designed for any mass and any output mass range.
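The enumeration of "all possible zero-charge masses" from a peak table rests on the standard ESI relationship for positive-ion protonated species, M = z·(m/z) − z·m(H+). The sketch below shows only this textbook step - the function name, charge range and single ionising species are illustrative assumptions, not Discharge's actual implementation, which also weighs the evidence for each candidate:

```python
PROTON_MASS = 1.007276  # Da, mass of the charge carrier H+ in positive-ion ESI

def zero_charge_candidates(mz, charges=range(1, 31)):
    """For one observed m/z value, return the candidate neutral
    (zero-charge) masses implied by each assumed charge state,
    using M = z * (m/z) - z * m_proton for protonated species."""
    return {z: z * mz - z * PROTON_MASS for z in charges}

# Example: an ion observed at m/z 893.2 could be, among other
# possibilities, the 19+ charge state of a ~16.95 kDa protein.
candidates = zero_charge_candidates(893.2)
```

Each candidate mass would then be tested against the rest of the peak table (do the other charge states it predicts actually appear?), which is the evidence-based filtering step the text describes.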
Collapse - Deisotoping Centroids and Deconvolutions
Collapse is a novel approach to deisotoping MS data. It takes its input from a peak table and it may therefore be applied to Deconvolutions and centroided data. In both cases the fully quantified errors are available. Therefore, deisotoped C12 m/z values or zero-charge masses may be obtained from poorly resolved or unresolved data. Unlike other methods, Collapse does not produce artefacts: it will only reconstruct C12 m/z values or zero-charge masses for which there is evidence in the data.
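One building block of any peak-table deisotoping scheme is inferring the charge state of an isotope cluster from the spacing of its m/z values, which is approximately 1.00335/z Da (the C13-C12 mass difference divided by the charge). The sketch below shows only this generic step under that assumption - it is not Collapse's algorithm:

```python
import numpy as np

ISOTOPE_SPACING = 1.00335  # Da, approximate C13-C12 mass difference

def charge_from_spacing(mz_values, max_charge=12):
    """Infer the charge of an isotope cluster from the mean spacing
    of its peak-table m/z values: spacing ~ 1.00335 / z. Returns the
    integer charge whose predicted spacing is closest to the data."""
    spacings = np.diff(sorted(mz_values))
    mean_spacing = float(np.mean(spacings))
    return min(range(1, max_charge + 1),
               key=lambda z: abs(ISOTOPE_SPACING / z - mean_spacing))

# A 3+ cluster: isotope peaks roughly 0.334 Da apart.
charge = charge_from_spacing([500.000, 500.334, 500.669, 501.003])
```

Given the charge, the C12 (monoisotopic) m/z is the lowest-mass member of the cluster, and the zero-charge mass follows from the charge-deconvolution relation.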
Evaluate™ - Quantification with Errors. Significance/Confidence filtering and Noise Rejection for Centroids and Deconvolutions
Evaluate™ is a completely novel quantification method that forms part of the Sleuth™ and Locator™ programs and provides fully quantified position and intensity errors. Also incorporated are significance and confidence filtering that take into account any variation in the underlying noise level, making conventional thresholding redundant. These new facilities ensure that peaks attain their true significance regardless of the noise level and dynamic range. If the baseline has been incorrectly set, it follows that the significance of reconstructed peaks and features can be in error. The novel noise rejection feature automatically analyses results to detect any background bias and takes this into account when it rejects noise. Therefore, irrespective of the magnitude of any background, the noise level and its variation across the data, signals and noise are efficiently separated.
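The difference between a fixed threshold and significance filtering against a varying noise level can be sketched as follows. This is an illustrative simplification under an assumed local-noise model, not Evaluate's method:

```python
import numpy as np

def significance_filter(positions, intensities, noise_level, min_sigma=3.0):
    """Keep only peaks whose intensity exceeds min_sigma times the
    LOCAL noise level at the peak position, rather than a single
    global threshold. noise_level is an array of the (possibly
    varying) noise estimate at each data position."""
    keep = []
    for pos, inten in zip(positions, intensities):
        if inten >= min_sigma * noise_level[pos]:
            keep.append((pos, inten))
    return keep

# Noise that doubles across the data: the same absolute intensity
# is significant on the quiet side but rejected on the noisy side.
noise = np.linspace(1.0, 2.0, 100)
peaks = significance_filter([10, 90], [4.0, 4.0], noise)
```

A single global threshold would necessarily misclassify one of the two peaks above; filtering on local significance classifies both correctly.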
Element - ICP-MS Analysis - Profile and Centroid Data
Algebraic methods are traditionally used for the analysis of ICP-MS data. Such methods are prone to failure when elements and interferences overlap, the problem being ill-defined. This is because many species can share a specific mass. Element uses the ReSpect data reconstruction program, along with a list of all the elements and interferences that may be present, so that a different model may be generated for each. The program then returns the most plausible fit to the data using these models. If an interference is present that is omitted from the input list of possible elements and species, a serious mismatch appears between the data and the reconstruction. Additional interferences may be added to the list until the misfit is reduced to within the noise level.
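The core idea of fitting a list of candidate component models to overlapping data can be sketched as an ordinary linear least-squares problem. This toy version (plain least squares, noise-free data) illustrates the structure only; it is not Element's fitting engine:

```python
import numpy as np

def fit_components(data, models):
    """Least-squares fit of a set of component models (one profile
    per candidate element or interference) to a measured spectrum.
    Returns the fitted amplitudes and the RMS residual misfit."""
    A = np.column_stack(models)              # each column is one model
    amplitudes, *_ = np.linalg.lstsq(A, data, rcond=None)
    residual = data - A @ amplitudes
    return amplitudes, float(np.sqrt(np.mean(residual ** 2)))

# Two overlapping Gaussian profiles sharing the same mass region.
x = np.linspace(0.0, 10.0, 200)
model_a = np.exp(-0.5 * ((x - 4.0) / 0.5) ** 2)
model_b = np.exp(-0.5 * ((x - 5.0) / 0.5) ** 2)
data = 3.0 * model_a + 1.5 * model_b         # noise-free for clarity
amps, rms = fit_components(data, [model_a, model_b])
```

If a real interference were absent from the model list, no combination of the remaining columns could fit the data and the residual would stay large - which is exactly the diagnostic the text describes.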
LCMS-HiRes Analysis Software where Deisotoping is Required to m/z or Zero-Charge
This is a stand-alone Windows program that is used to process the LC-MS data of peptides or other small molecules off-line on any PC. The program accepts data as a sequential series of text files, one for each scan. Models are then generated from the data, any change in peak width being taken into account so that results are not compromised. Models may be saved for use on other similar runs. Spectra are then co-added to generate "blocks" of data of enhanced S/N, the baselines are corrected and each block is deconvolved using a variable model. Confidence filters are applied to the results peak tables to remove obvious noise and each block is then deisotoped to C12 m/z ions or zero-charge mass using PPL's unique artefact-free methodology. In principle, there is no limit to the number of charges that may be accommodated. For example, the program has already demonstrated that it will robustly deisotope overlapping isotope clusters for Z=1-12 in the same data.
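The co-adding step rests on a standard statistical fact: summing n scans improves S/N by roughly √n when the noise is uncorrelated between scans. A minimal sketch of blockwise co-addition (the function name and the drop-trailing-scans policy are illustrative assumptions, not the program's behaviour):

```python
import numpy as np

def coadd_blocks(scans, block_size):
    """Sum consecutive scans into blocks of block_size scans each.
    For uncorrelated noise, each block's S/N improves by roughly
    sqrt(block_size). Trailing scans that do not fill a complete
    block are simply dropped in this sketch."""
    scans = np.asarray(scans)
    n_blocks = len(scans) // block_size
    trimmed = scans[: n_blocks * block_size]
    return trimmed.reshape(n_blocks, block_size, -1).sum(axis=1)

# Ten identical 8-channel scans in blocks of five: each of the two
# resulting blocks is five times one scan.
scans = [np.ones(8) for _ in range(10)]
blocks = coadd_blocks(scans, 5)
```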
Background ions are identified and rejected without removing genuinely eluting ions that may ride on the top of background ions. The retention time axis is analysed to obtain the elution errors and a full results table of m/z (or mass), RT and intensity, along with all the errors is presented. Confidence filters may be applied to the final results table.
The program is exceptionally fast. In trials the processing time is much faster than the data acquisition time.
LCMS-LoRes Analysis Software for ESI Data where Isotopes are Unresolved
This is a stand-alone Windows program that is used to process the ESI LC-MS data of intact proteins and other large molecules off-line on any PC. The program may also be used for low resolution data - e.g. quadrupole data - on small molecules. The program accepts data as a sequential series of text files, one for each scan. Models are then generated from the data, any change in peak width being taken into account so that results are not compromised. Models may be saved for use on other similar runs. Spectra are then co-added to generate "blocks" of data of enhanced S/N, the baselines are corrected and each block is deconvolved using a variable model. Confidence filters are applied to the results peak tables to remove obvious noise and each block is then charge-deconvolved to zero-charge masses. There is no limit to the masses that may be studied.
Background ions are identified and rejected without removing genuinely eluting ions that may ride on the top of background ions. The retention time axis is analysed to obtain the elution errors and a full results table of mass, RT and intensity, along with all the errors is presented. Confidence filters may be applied to the final results table.
The program is exceptionally fast. In trials the processing time is much faster than the data acquisition time.
LCMS-View 2D Data/Results Viewer
This is a stand-alone Windows program that is used to view 2D LC-MS data. The data may be zoomed, panned, expanded, etc. Thresholds may be applied and intensities displayed linearly or logarithmically using a colour scale. The viewer will also overlay results from the LC-MS Peptide/Small Molecule Processing Programs for visual inspection and comparison of results.
LCMS-Mapper Map Comparison Program
This is a stand-alone Windows program that is used to compare LC-MS maps. The rescaling of the retention time axis to align maps is unnecessary. Instead, the program searches for patterns to identify common features. Mass calibration drifts are automatically identified and corrected both within individual maps and between maps using either a lock mass or identified background ions. Common elutions between maps are identified from their relative retention times and their retention time errors at any user-chosen confidence level. Therefore, any elutions that "swap" position as the result of minor changes in LC conditions or column ageing are still correctly identified. Differences between maps are reported at any chosen significance/confidence level and according to any user-defined constraints - e.g. intensity differences less than x2 are not significant.
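The matching of common elutions by mass plus retention-time error, rather than by RT order alone, can be sketched as below. The tolerances, tuple layout and function name are illustrative assumptions; LCMS-Mapper's actual pattern-search and calibration correction are far more involved:

```python
def match_features(map_a, map_b, mass_tol=0.01, rt_sigma_factor=3.0):
    """Pair features between two LC-MS maps. Two features match when
    their masses agree within mass_tol (Da) and their RT difference
    lies within rt_sigma_factor times the combined RT error, so
    elutions that 'swap' order between runs are still resolved by
    mass. Each feature is a (mass, rt, rt_error) tuple."""
    pairs = []
    for mass_a, rt_a, err_a in map_a:
        for mass_b, rt_b, err_b in map_b:
            rt_window = rt_sigma_factor * (err_a ** 2 + err_b ** 2) ** 0.5
            if abs(mass_a - mass_b) <= mass_tol and abs(rt_a - rt_b) <= rt_window:
                pairs.append(((mass_a, rt_a), (mass_b, rt_b)))
    return pairs

# Two features that swapped elution order between runs still pair
# up correctly because their masses disambiguate them.
map_a = [(500.25, 10.0, 0.2), (600.30, 10.5, 0.2)]
map_b = [(600.30, 10.1, 0.2), (500.25, 10.6, 0.2)]
pairs = match_features(map_a, map_b)
```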
Enchant - Automatic Noise Reduction Algorithm for Continuous Data
Filters are most commonly applied for improving the S/N of data as rapidly as possible. Savitsky-Golay filters are generally used in mass spectrometry and optical techniques whilst Fourier smoothing is traditional in NMR. The problem with ALL filters is that they broaden the peaks of interest and resolution may deteriorate. Furthermore, the user generally needs to make a series of trials in an attempt to obtain a compromise between the loss of information and the reduction of the noise level. These methods are therefore subjective. Although Enchant is a filter, it is a novel, iterative noise correlation method designed to remove high frequencies regardless of the characteristics of the noise in the data. The algorithm is unaffected by the S/N of the data and whether this varies. It is also independent of the dynamic range of the data and the presence of any baseline error. There are no user inputs and the method is fast. For a given improvement in S/N, the peak broadening in the result is much less than that obtained with conventional filters.
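The peak-broadening trade-off of conventional filters is easy to demonstrate with an ordinary Savitzky-Golay filter (this illustrates the conventional approach that Enchant is designed to improve on, not Enchant itself; the window and order are arbitrary choices of the kind a user would have to trial):

```python
import numpy as np
from scipy.signal import savgol_filter

# A narrow Gaussian peak (sigma = 2 samples) plus noise, smoothed
# with a conventional Savitzky-Golay filter. The smoothed peak is
# lower and broader than the original: noise reduction has been
# bought at the cost of resolution.
x = np.arange(200)
peak = np.exp(-0.5 * ((x - 100) / 2.0) ** 2)
rng = np.random.default_rng(1)
noisy = peak + rng.normal(0, 0.05, x.size)
smoothed = savgol_filter(noisy, window_length=21, polyorder=2)
```

Changing `window_length` shifts the compromise between noise suppression and broadening, which is exactly the subjective trial-and-error the text describes.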
Enhance - Fast Data Reconstruction Algorithm for Continuous Data
Data reconstruction methods are designed to provide both a reconstruction of the data and the underlying detail contained in it - the deconvolved result. In order to do this, these iterative methods converge or terminate when the reconstructed data fit the data within the noise level. However, it is generally unnecessary to fit the data as closely as this when the only requirement is to enhance the S/N. Enhance™ has been specifically designed to separate signals from noise efficiently and fast without broadening peaks. The only input is an estimate of the peak width. As with all data reconstruction methods, the program will provide the cleanest results on baseline corrected data.
Nadir- Baseline Correction
There is no generic solution to baseline correction. This is because a baseline hump in one situation may be genuine signal in another. Even so, it is possible to design methods that will almost invariably work very well for specific applications. Of course, it is not possible to offer a guarantee when data can, and occasionally does, surprise the user.
Nadir™ is a very fast, non-linear noise correlation baseline correction method. The only mandatory input is an estimate of the peak width and other inputs are generally left at their computed defaults. The program has been specifically designed to be very tolerant of user inputs, and wide margins of error will still produce excellent corrections. A unique feature allows baseline humps to be treated as signals so that the computed baseline passes beneath humps, rather than following them. Because the method takes into account any variation in the noise level, the computed baseline will be close to the centre of the noise regardless of its amplitude.
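The shape of the baseline problem can be shown with a deliberately crude estimator: a rolling minimum followed by light smoothing. This is a toy sketch only - Nadir's noise-correlation method is far more sophisticated - but it shows how a baseline estimate should track slow drift while passing beneath peaks:

```python
import numpy as np

def rolling_minimum_baseline(y, window):
    """Crude baseline estimate: the rolling minimum of the signal,
    lightly smoothed with a moving average. Peaks (which only add
    intensity) barely affect the local minimum, so the estimate
    follows the slow drift underneath them."""
    n = len(y)
    mins = np.array([y[max(0, i - window): i + window + 1].min()
                     for i in range(n)])
    kernel = np.ones(window) / window
    return np.convolve(mins, kernel, mode="same")

# A narrow peak sitting on a slow linear drift: the estimated
# baseline tracks the drift rather than the peak, so subtracting
# it leaves the peak height roughly intact.
x = np.linspace(0, 1, 300)
drift = 2.0 * x
peak = np.exp(-0.5 * ((x - 0.5) / 0.02) ** 2)
baseline = rolling_minimum_baseline(drift + peak, 30)
corrected = drift + peak - baseline
```

A rolling minimum sits at the bottom edge of the noise rather than its centre and misbehaves near the edges of the data, which is one reason real methods such as Nadir analyse the noise statistics instead.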
Sleuth - 1-D Deconvolutions & Reconstructions for Continuous Data
As with the other PPL interfaces, no attempt is made to reverse any corruption of the data. Instead, the Sleuth interface iteratively reconstructs the data from an estimate of the data peak profile. At the end of each iteration cycle, the current Deconvolution is convolved with the Model to produce a corresponding Reconstruction. The difference between this and the actual data is used to guide the calculation to its conclusion.
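The iterative cycle described above - convolve the current deconvolution with the Model, compare the reconstruction with the data, update - is the common structure of a whole family of methods. The sketch below uses one classical member of that family (Richardson-Lucy); ReSpect's actual update rule, convergence control and error analysis are proprietary and are not represented here:

```python
import numpy as np

def iterative_deconvolve(data, model, iterations=200):
    """Richardson-Lucy-style iteration: each cycle convolves the
    current deconvolution estimate with the model to form a
    reconstruction, compares it with the data, and uses the ratio
    to update the estimate. Illustrative only."""
    model = model / model.sum()
    flipped = model[::-1]                       # correlation kernel
    estimate = np.full_like(data, data.mean())  # flat starting guess
    for _ in range(iterations):
        reconstruction = np.convolve(estimate, model, mode="same")
        ratio = data / np.maximum(reconstruction, 1e-12)
        estimate = estimate * np.convolve(ratio, flipped, mode="same")
    return estimate

# Two blurred, overlapping peaks sharpen back towards two narrow
# features at the positions of the underlying spikes.
kx = np.arange(21)
model = np.exp(-0.5 * ((kx - 10) / 2.0) ** 2)
truth = np.zeros(100)
truth[40], truth[60] = 1.0, 0.5
data = np.convolve(truth, model / model.sum(), mode="same")
deconvolved = iterative_deconvolve(data, model)
```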
Unlike other methods, ReSpect detects weak features very early in the computation and their found intensities are much less compromised by noise. In part, this is made possible by taking into account any variation in the underlying noise level so all features attain their true significance. Conventional thresholds are therefore redundant and peak tables are filtered on the basis of the significance or confidence of each feature. A facility is also provided that allows the input peak profile to be modelled and automatically optimised (Profile). Trial experiments, which are commonly required by other calculation methods and can be extremely time-consuming, are therefore unnecessary.
At the end of the calculation, the user is provided with the Reconstruction of the data and the underlying sharp detail contained in the data - the Deconvolution. The results are fully quantified with error estimates.
Normally, a single Model is used to represent the peak profile of all peaks in the data. For many chromatographic techniques and some mass spectrometry measurements, the peak width varies across the data in a rational way and the use of a single Model is inappropriate. The Sleuth™ interface allows the use of a varying Model so that the best possible results are obtained.
Locator - Peak Identification and Detection for Continuous Data
Locator™ is a novel PPL centroiding program that requires only an estimate of the peak width as its input. The program is very tolerant of this input and only a crude estimate is required. The program first performs a special, very fast reconstruction to remove most of the noise without broadening genuine peaks. The resulting spectrum (or chromatogram) is then centroided. Novel features within the centroiding program provide more reliable peak positions and intensities than conventional valley to valley or valley to baseline methods. The resulting peak table may be used as the input for other applications. A unique feature is the provision of estimates of the position and intensity errors.
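The final centroiding step can be sketched as follows. This is a deliberately simple illustration (local maxima plus an intensity-weighted mean over three points); Locator's actual scheme is more sophisticated and additionally returns position and intensity error estimates:

```python
import numpy as np

def centroid_peaks(x, y, threshold):
    """Very simple centroiding sketch: find local maxima above a
    threshold, then report the intensity-weighted mean position
    over each maximum and its two neighbours, together with the
    summed intensity."""
    peaks = []
    for i in range(1, len(y) - 1):
        if y[i] > threshold and y[i] >= y[i - 1] and y[i] > y[i + 1]:
            w = y[i - 1: i + 2]
            pos = float(np.dot(x[i - 1: i + 2], w) / w.sum())
            peaks.append((pos, float(w.sum())))
    return peaks

# A symmetric three-point peak centroids to its centre channel.
x = np.arange(10, dtype=float)
y = np.array([0, 0, 1, 4, 1, 0, 0, 0, 2, 0], dtype=float)
centroids = centroid_peaks(x, y, threshold=0.5)
```

Weighting by intensity makes the reported position sub-channel accurate for asymmetrically sampled peaks, which a bare argmax is not.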
Profile - Automatic Modelling/Remodelling Program for Continuous Data
Profile is a novel PPL program designed to facilitate the construction and refinement of Models. The user zooms in to a suitable peak or cluster of peaks and the program generates the best parameterised Model within the limitation of the programmed peak shapes. Currently, Profile will provide the best parameters for any peak by smoothly mixing shapes from a square wave through Gaussian and Lorentzian to a Super-Lorentzian, including any asymmetry between the left and right sides of the peak.
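One common way to parameterise such a shape family is a pseudo-Voigt (a linear Gaussian/Lorentzian mix) with independent left and right half-widths for asymmetry. The sketch below shows only that textbook parameterisation; Profile's actual shape family is broader (square wave through Super-Lorentzian) and its parameter names here are invented for illustration:

```python
import math

def mixed_peak(x, centre, width_left, width_right, mixing):
    """Pseudo-Voigt peak value at x: a linear mix of a Gaussian and
    a Lorentzian of the same width (mixing in [0, 1], 0 = pure
    Gaussian, 1 = pure Lorentzian), with independent left and right
    half-widths to allow asymmetry."""
    w = width_left if x < centre else width_right
    t = (x - centre) / w
    gaussian = math.exp(-0.5 * t * t)
    lorentzian = 1.0 / (1.0 + t * t)
    return (1.0 - mixing) * gaussian + mixing * lorentzian

# At the centre both components equal 1; far from the centre the
# Lorentzian component decays much more slowly than the Gaussian.
apex = mixed_peak(0.0, 0.0, 1.0, 2.0, 0.3)
tail_gauss = mixed_peak(5.0, 0.0, 1.0, 1.0, 0.0)
tail_lorentz = mixed_peak(5.0, 0.0, 1.0, 1.0, 1.0)
```

Fitting these few parameters to a zoomed-in data peak, as Profile does automatically, yields a Model without the user hand-drawing a profile.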
Modern high resolution mass spectrometers will often produce partially resolved isotopes for high masses and it is more appropriate to model complete profiles when charge-deconvolving ESI data to zero-charge masses. An option is therefore provided for modelling isotope cluster profiles.
Where there is a significant change in peak width across the data, Profile™ may be used to generate a range of Models from different parts of the data. This reveals the way the Model parameters change across the data, and this change may be taken into account when performing spectrum (or chromatogram) deconvolutions. Results are therefore not compromised by using a single Model.
ReView - User Graphics Interface
ReView is the custom user graphics interface for the Microsoft Windows environment and is the user link between the program interfaces and their respective kernels. This specialised interface allows data to be viewed and, where appropriate, correctly prepared prior to processing. Where Models are required, it allows them to be designed, the data to be processed and the results to be displayed. The interface contains the standard, expected features such as zooming, hiding/showing of traces, file saving, etc. All operations and commands are implemented through the mouse.