
Next: Method Up: Time series processing Prev: Parameters
VARTEST
Vartest tests a 1-D time series for variability. It takes as input a
source box time series and a background box time series, both
corrected as if they were at the centre of the field of view, and
uses a maximum likelihood method to find the source count rate. It
then calculates the probability that the least likely
background-subtracted point occurred by chance, assuming a Poisson
distribution about the calculated count rate. The output probability
statistic is therefore approximately the probability that the time
series comes from a constant source.
N.B. The output statistic is not strictly a probability and can
exceed one in some cases of very low variability.
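The calculation above can be sketched as follows. This is a minimal
illustrative sketch, not the actual Vartest implementation: the
function names, the use of the mean as the maximum-likelihood constant
rate, and the simple trials correction (multiplying the smallest tail
probability by the number of bins, which is what lets the statistic
exceed one) are all assumptions made for illustration.

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), summed directly from the pmf."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i)
               for i in range(int(k) + 1))

def chance_prob(counts, mu):
    """Probability of a count at least as extreme as `counts` under
    Poisson(mu): upper tail if above the rate, lower tail if below."""
    if counts >= mu:
        return 1.0 - poisson_cdf(counts - 1, mu)
    return poisson_cdf(counts, mu)

def vartest_stat(source, background):
    """Illustrative variability statistic for a 1-D time series.

    source, background: per-bin counts, assumed already corrected
    to the centre of the field of view as described in the text.
    """
    net = [s - b for s, b in zip(source, background)]
    # For Poisson data the maximum-likelihood constant rate is the mean.
    rate = sum(net) / len(net)
    # Chance probability of the least likely background-subtracted point.
    p_min = min(chance_prob(round(x), rate) for x in net)
    # Crude trials correction over the number of bins (an assumption
    # here); the product can exceed one, as the N.B. above warns.
    return p_min * len(net)
```

A steady series yields a statistic near (or above) one, while a series
with a strong flare yields a small value, flagging variability.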
Subtopics:
