Event data will not have been corrected for instrumental effects such as dead time and vignetting. This is done after the binning stage by an instrument-specific correction program (applications running directly on event data will have to make their own arrangements, using the same correction subroutines). However, it will frequently be the case that the instrument interface has already done some of the work required for correction (notably extracting the relevant instrument on-times) in the course of assembling the event dataset. Some of this information can be stored in the dataset so that the subsequent correction stage can be performed more quickly. This is the function of the LIVE_TIME structure:
   LIVE_TIME        <EXTENSION>                          Optional
      ON(N)            <_REAL>
      OFF(N)           <_REAL>
      DURATION(N)      <_REAL>
This would be attached to the event dataset when it is created. It is basically just a list of on and off times (leaving aside the last component for the moment). For spatially resolved event data these times refer to live time for the WHOLE FIELD. Live time for a spatial subset may in general be some subset of this overall live time (depending upon telescope motion). The live time values are referred to the same reference time (specified in the HEADER) as the data timetags.
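To make the layout concrete, here is a minimal Python sketch of the LIVE_TIME idea, assuming a simple in-memory representation (the real structure of course lives inside the event dataset itself; the class and field names here are purely illustrative):

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class LiveTime:
        """On/off times in seconds, offset from the reference time
        given in HEADER (the same origin as the event timetags)."""
        on: List[float]                         # start of each live interval
        off: List[float]                        # end of each live interval
        duration: Optional[List[float]] = None  # true live time per interval
                                                # (non-imaging dead-time case)

        def total_on_time(self) -> float:
            """Elapsed on-time for the whole field."""
            return sum(b - a for a, b in zip(self.on, self.off))

    # e.g. two on-intervals, referred to the HEADER reference time
    lt = LiveTime(on=[0.0, 500.0], off=[100.0, 900.0])
    print(lt.total_on_time())    # 500.0 seconds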
An instrument-specific correction program takes any binned dataset (e.g. an image or time series) and uses the LIVE_TIME information, together with HK and aspect data as required (which will generally be in files referenced in the INSTRUMENT extension), to normalise to count/s, correcting for dead time and vignetting. (It then sets the logical flags in the PROCESSING structure to indicate what has been done.) In the case of an imaging instrument the corrections may vary with position. If the spatial information has already been removed by binning into some other domain (e.g. a time series has been formed), then the correction application should take the spatial position from the FIELD_RA and FIELD_DEC components in HEADER, and perform the corrections as if dealing with a single spatial point.
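As a rough illustration of what such a correction program does, the following Python sketch normalises a binned image to count/s under strong simplifying assumptions: the live fraction is taken as a single number and the vignetting factors as a ready-made per-pixel array, whereas a real instrument-specific program would derive both from the HK and aspect files. All names here are illustrative:

    import numpy as np

    def correct_image(counts, on, off, vignetting_map, live_fraction=1.0):
        """Normalise binned counts to count/s, corrected for dead time
        (via the live fraction) and vignetting (per-pixel factors)."""
        elapsed = sum(b - a for a, b in zip(on, off))   # whole-field on-time
        exposure = elapsed * live_fraction              # dead-time corrected
        return counts / (exposure * vignetting_map)

    # e.g. a 2x2 image, 500 s on-time, 95% live, mild off-axis vignetting
    img = np.array([[100.0, 80.0], [90.0, 70.0]])
    vig = np.array([[1.00, 0.80], [0.90, 0.70]])
    print(correct_image(img, [0.0, 500.0], [100.0, 900.0], vig, 0.95))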
The overall processing scheme for event data looks like this:
   Instrument interface        Outputs event lists (uncorrected)
   (`Event Processing')        with LIVE_TIME for whole field
           |
           |
     EVENT DATASET
           |
           |
   Event selection             Select a subset of the events based on
           |                   spatial, spectral, time etc. criteria
           |
   Smaller EVENT DATASET       (e.g. small region centred on source)
           |
           |
   Binning (EVENTBIN)          Bins data without normalisation, enters
           |                   FIELD_RA/DEC in HEADER if necessary
           |
     BINNED DATASET
           |
           |
   Correction program          Uses LIVE_TIME, field centre, HK and
   (instrument-specific)       aspect files to normalise data,
           |                   correcting for dead time, vignetting etc.
           |
   Normalised, corrected BINNED DATASET
Software operating on event datasets (notably EVSUBSET and EVMERGE) must maintain the components required for the subsequent correction process, i.e. LIVE_TIME must be updated if the time ranges are altered, and FIELD_RA/DEC must be updated if the spatial field is changed.
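For the time-range case, the LIVE_TIME update amounts to intersecting each ON/OFF interval with the selected time ranges. A Python sketch (the representation and names are illustrative only):

    def subset_live_time(on, off, ranges):
        """Intersect (on, off) live intervals with the selected time ranges,
        returning new ON and OFF lists for the subsetted event dataset."""
        new_on, new_off = [], []
        for a, b in zip(on, off):
            for lo, hi in ranges:
                start, end = max(a, lo), min(b, hi)
                if start < end:              # non-empty overlap survives
                    new_on.append(start)
                    new_off.append(end)
        return new_on, new_off

    # keep only live time inside the selected window 50-600 s
    print(subset_live_time([0.0, 500.0], [100.0, 900.0], [(50.0, 600.0)]))
    # ([50.0, 500.0], [100.0, 600.0])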
Binning software simply bins the data up and propagates the LIVE_TIME; it does not attempt to normalise the data in any way.
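A minimal sketch of that behaviour for a time series, assuming timetags in seconds from the reference time; note that LIVE_TIME is simply carried across unchanged:

    import numpy as np

    def bin_time_series(timetags, bin_width, live_time):
        """Histogram event timetags into raw counts; no normalisation."""
        edges = np.arange(timetags.min(), timetags.max() + bin_width,
                          bin_width)
        counts, _ = np.histogram(timetags, bins=edges)
        return counts, edges, live_time    # LIVE_TIME propagated untouched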
Note that software operating on binned data will NOT be expected to maintain LIVE_TIME or FIELD_RA/DEC. This means that if normalisation is required, it should be performed on the freshly binned data; manipulation (e.g. subsetting) of the data after binning but before normalisation could lead to an incorrect normalisation being applied. In general it would be best if binned-data software which affects these components were to delete them, to avoid confusion.
In the simpler case of a non-imaging instrument, the dead-time correction can be handled by including the DURATION component in LIVE_TIME. The correction program then normalises the data by dividing by the collection time, and corrects for dead time by multiplying by (OFF-ON)/DURATION (either summed up, for a spectrum, or as a function of time for a time series).
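A short worked example of this arithmetic in Python, with illustrative numbers: summed over intervals, dividing by the collection time and multiplying by (OFF-ON)/DURATION is equivalent to dividing the counts by the total DURATION (the true live time):

    on       = [0.0, 500.0]
    off      = [100.0, 900.0]
    duration = [95.0, 380.0]       # live seconds within each interval
    counts   = 4750.0              # total counts in a spectrum, say

    collection = sum(b - a for a, b in zip(on, off))   # 500.0 s elapsed
    dead_corr  = collection / sum(duration)            # (OFF-ON)/DURATION
    rate = counts / collection * dead_corr             # = counts / 475.0
    print(rate)                    # 10.0 count/s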