1. Introduction

Researching and modelling the Earth's magnetic field usually requires accurate separation of the various field sources that combine to produce the observed magnetic field, and good quality measurements are essential for this. For effective separation of internal (core and lithospheric) and external (ionospheric and magnetospheric current) sources it has long been recognised that continuous, absolute measurements of the magnetic vector at single locations, sustained over tens or even hundreds of years, are vital. Such locations are known as magnetic observatories, and around 170 are operating around the world today (see for example Rasson et al., 2011). Of course, high quality global models require better global coverage than observatories alone can provide, and measurements of the magnetic field from low-Earth-orbiting satellites at altitudes below 1000 km have radically changed the way in which models are produced and what they reveal about Earth processes.

High-precision satellite missions flown over the last decade (CHAMP, Ørsted, SAC-C) have enabled the production of improved core field models (e.g. IGRF-11, Finlay et al., 2010); models of higher spherical harmonic degree (e.g. Maus et al., 2006; Olsen et al., 2006a; Thomson et al., 2010) that include more of the internal signal from the lithosphere, which in turn enables better separation between internal and external sources; as well as comprehensive models that strive to model all sources (e.g. Sabaka et al., 2004). Nonetheless, the production of all such global models requires ground-based observatory data to complement the satellite data, whether in building the models, in validating them, in helping to separate out the external field sources, or all three.

With the launch of the Swarm constellation of satellites (Friis-Christensen et al., 2006), a new era in the study of the Earth's magnetic field will begin. The Swarm mission will enable the separation of sources on a global scale better than ever before (e.g. Olsen et al., 2006b), especially the longer wavelength sources from the lithosphere. Absolute values of the magnetic field from observatories will be an essential component of the mission, both in the validation of the satellite data and in combination with the satellite data for the derivation of Level 2 Swarm products. The importance of observatory data for Swarm is highlighted in Lesur et al. (2006) and Macmillan and Olsen (2013). The latter discusses the current practice of using hourly mean values selected from the holdings of the World Data Centre for Geomagnetism, Edinburgh, as well as potential future requirements for data with higher temporal resolution, such as one-minute and one-second means. Macmillan and Olsen effectively make the case for data, corrected to near absolute level, to be made available more rapidly than definitive data currently are.

In light of the generally slow availability of definitive data, the initiative has been taken, first by INTERMAGNET (an organisation described in Kerridge, 2001) and subsequently by the International Association of Geomagnetism and Aeronomy (IAGA), to establish a new data type, quasi-definitive (QD) data, and to encourage its production by observatory operators. The main driving force has been the need for 'ground truth' data for Swarm research activities and Swarm Level 2 products. A number of other scientific activities may also benefit, such as the opportunity to derive a more rapidly available version of the Dst index, which is commonly used in solar-terrestrial research and in space weather applications.

At the INTERMAGNET meeting in Beijing, 2007, the problem of the delay in publication of definitive observatory data (available only after the calendar year ends, and in some cases several months later) was discussed and the concept of "quasi-real-time baselines" was first introduced. Further discussions took place in Golden, Colorado, 2008, on the possibilities of producing QD data to support specific applications such as Swarm.

IAGA approved resolution number 5 at its 11th Scientific Assembly, Sopron, Hungary, 2009, which states “IAGA, recognizing the importance of prompt baseline-corrected observatory data for the production of geomagnetic indices and geomagnetic models such as the IGRF, noting that several individual users and groups of users, such as the Mission Advisory Group of the upcoming ESA Swarm satellite mission, have expressed their interest in and need for such data, encourages magnetic observatories to produce baseline-corrected quasi-definitive data shortly after their acquisition.” At the INTERMAGNET meeting that followed, also in Sopron, a tentative definition of QD data was coined as “... data corrected using temporary baselines shortly after their acquisition and very near to being the final data of the observatory”. Observatory data types, formats, products and metadata are summarised in Reay et al. (2011).

During the 2010 INTERMAGNET meeting in Paris the QD data type was firmly established with the letter “Q” to be used in INTERMAGNET Geomagnetic Information Node (GIN) dissemination format data files to denote the data type (St-Louis, 2011). A request was made to IAGA to modify the IAGA2002 data format to account for this type. The revisions required were:

  a. in the metadata data type field, the valid values would be extended to include "Quasi Definitive"; and

  b. in the file name convention, the new data type would be denoted by the letter "q" as a valid code for the data type.

These modifications were formally approved at the IAGA V-DAT working group business meeting during the IUGG-2011 General Assembly in Melbourne.

In October 2012 INTERMAGNET Magnetic Observatories (IMOs) were sent, by e-mail, the following standard definition for QD data: "The data should be close to the expected definitive value, but is to be delivered more rapidly than an observatory's annual definitive data. QD data are (H, D, Z) or (X, Y, Z) 1-minute data:

  a. corrected using temporary baselines;

  b. made available less than three months after their acquisition; and

  c. such that the difference between the quasi-definitive and definitive (X, Y, Z) monthly means is less than 5 nT for every month of the year.

Point c can be checked a posteriori by comparing QD and definitive data from the previous year.” This information was also published at http://www.intermagnet.org/faq_e.php.

Initial doubts within the observatory community over whether QD data could be produced within the required time frame were quickly dispelled by two different approaches demonstrated by groups from the Institut de Physique du Globe de Paris (IPGP) (Chulliat et al., 2009) and the British Geological Survey (BGS) (Baillie et al., 2009). Peltier and Chulliat (2010) confirmed that the IPGP method achieved accuracies well within the standard set by INTERMAGNET: they found that the means and standard deviations of differences between simulated QD and definitive data during 2008 were within 0.3 nT for nine observatories.

Over the last year many other groups operating observatories have also changed their procedures to enable the derivation of QD data (e.g. Matzka, 2013). Macmillan and Olsen (2013) reported that QD data were submitted to INTERMAGNET for 44 (out of 125) observatories in 2012. Although most solar-terrestrial research and space weather applications do not necessarily require absolute values of observatory data, the provision of which is the most challenging part of producing QD data, there is no doubt that the efforts across the world to develop systems providing higher quality and more rapidly available observatory data will be of great value to all users.

For more than a decade BGS have operated and developed a programme in geomagnetic observatory instrumentation, data acquisition and processing to enable the production of near absolute data from its observatories in near real-time. Although not labelled as such at the time of production, these data can, to all intents and purposes, be retrospectively regarded as QD data. One of the intentions of the work described in this paper is to test this statement. For clarity and simplicity we use the label "QD data" when referring to the historical data set of retrospectively labelled QD data, or candidate QD data, used in the analysis.

The paper details the processes involved in enabling the production of QD data from the BGS observatories and the evaluation of these data to verify that they meet the standard and quality set. The analysis that has been carried out is described and the results are presented. General problems faced and observatory specific difficulties for the timely production of QD data are also discussed.

2. Observatory Instruments and Measurements

BGS operates three observatories in the UK, Lerwick (LER), Eskdalemuir (ESK) and Hartland (HAD), and five others internationally, Ascension Island (ASC), Port Stanley (PST), King Edward Point (KEP), Jim Carrigan (JCO) and Sable Island (SBL). The three UK plus ASC and PST observatories are IMOs and it is the results from these five that are studied here. Figure 1 shows the locations of the BGS network of observatories, with the five IMOs indicated by triangles.

Fig. 1. Locations of BGS operated magnetic observatories. Current analysis is carried out on those marked by triangles.

At each of the UK observatories three identical systems, each with two different instruments, record the magnetic field direction and magnitude. At ASC and PST there is a single system with vector and scalar instruments. The ability to derive QD data in near real-time relies on the standard of the instruments used and the quality of the raw measurements.

Observatory data collection and transmission is controlled by a Geomagnetic Data Acquisition System (GDAS) (Turbitt and Flower, 2004). GDAS systems were installed at the UK observatories in 2002, becoming operational from January 2003. GDAS incorporates a Danish Meteorological Institute (DMI), now Technical University of Denmark (DTU), FGE tri-axial linear-core fluxgate magnetometer (operated in variometer mode), a Gem Systems GSM-90 Overhauser-effect proton precession magnetometer (PPM), a GPS-referenced time source and a PC running BGS proprietary data acquisition software under the QNX operating system. QNX is a real-time operating system that behaves similarly to UNIX but is particularly suited to embedded systems. The data loggers are connected via internet, telephone and satellite links for reliable communication of data to Edinburgh. The connections are bi-directional, allowing remote monitoring and administration of the GDAS systems.

The FGE fluxgate magnetometers have been tuned to have a dynamic range of ±4000 nT; compensating fields are therefore used to zero the magnetometer output on set-up. Two of the orthogonal sensors are oriented to measure variations in the horizontal (H) and vertical (Z) components of the field. The third sensor is oriented perpendicular to these in an eastward sense, measuring variations proportional to changes in declination (D). Measurements are made at a rate of 1 Hz to a resolution of 0.2 nT. One-minute values are derived at the data processing stage by applying a 61-point cosine filter, as sketched below.
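As a minimal sketch of this filtering step, the Python fragment below derives one-minute means from 1 Hz samples using a 61-point cosine filter centred on the minute. The exact weight formula used by BGS is not given in the text, so a raised-cosine (Hann-type) window is assumed; the same approach with a 7-point window would apply to the 0.1 Hz PPM data described later.

```python
import numpy as np

def cosine_filter_weights(n_points):
    """Raised-cosine (Hann-type) weights, normalised to unit sum.

    Assumption: one plausible realisation of an n-point cosine
    filter; the BGS weight definition may differ in detail.
    """
    k = np.arange(n_points) - (n_points - 1) / 2.0
    w = 1.0 + np.cos(2.0 * np.pi * k / (n_points + 1))
    return w / w.sum()

def one_minute_means(samples_1hz):
    """Derive one-minute values from 1 Hz data with a 61-point filter
    centred on second 0 of each minute (spanning :30 of the previous
    minute to :30 of the current one)."""
    w = cosine_filter_weights(61)
    minutes = []
    for centre in range(60, len(samples_1hz) - 30, 60):
        window = samples_1hz[centre - 30:centre + 31]  # 61 samples
        minutes.append(np.dot(w, window))
    return np.array(minutes)
```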

The DTU FGE fluxgate was chosen due to its proven long term stability, on which the production of accurate and timely QD data relies. To reduce the influence of temperature variation and pier tilt, the three sensors are mounted in a single marble cube held in a gimbaled mount, as described in the FGE technical manual (DMI, 2006). The magnetometer is also located on a stable concrete pier in a temperature-controlled chamber.

The scaling constants of the fluxgate magnetometer are determined at regular intervals (approximately every 4–6 months) by applying a known current through the compensation coils of each sensor. The digitiser is also routinely calibrated against a known voltage. The current and voltage sources are traceable to National Measurement Accreditation Service (NAMAS) standards. The coil constants of the compensation coils were provided by the manufacturer (DTU).

Absolute total field intensity (F) measurements are recorded from the PPM to 0.1 nT resolution (the GSM-90 is capable of measuring to a resolution of 0.01 nT) at a rate of 0.1 Hz. In this case one-minute values are derived by applying a 7-point cosine filter. Since the GSM-90 PPM has a low temperature coefficient there is no need for temperature control and the instrument can be sited at the nearest convenient location. The internal frequency counter of the PPM is routinely calibrated against a NAMAS frequency standard. The difference in total field intensity between the GDAS PPM and the absolute pillar is also measured regularly using a second PPM, so that the GDAS PPM data can be corrected to the absolute pillar.

At each observatory, manual absolute measurements of the direction of the magnetic field are made from a single standard pillar located in an absolute hut, except in the case of Ascension Island, where the pillar is not covered. Measurements of D and inclination (I) are made using a fluxgate-theodolite, which consists of either a Zeiss or a Wild theodolite with a Bartington MAG 01H fluxgate magnetometer attached. At each observatory, absolute values of all geomagnetic elements are referred to this single standard absolute pillar location.

Two absolute observations are made per week at the UK observatories and two per month are made at the international stations, with additional observations being made during annual service visits. The accuracy of the absolute measurements is fundamental to the production of QD data.

Systematic errors associated with each individual instrument exist due to imperfect alignment (collimation errors) and magnetometer offset (see, for example, Kerridge, 1988; Jankowski and Sucksdorff, 1996). These errors are eliminated by the measurement procedure itself. However, the associated parameters, which should remain reasonably constant, are calculated for each observation and plotted over time, providing a valuable aid to the quality control process.

3. A Method for Producing Quasi-Definitive Data

The method, which has been adapted and developed over many years by BGS, is best described in two separate parts: the processes in place for the detection and correction of erroneous values in the raw variometer measurements, and the process of deriving and fitting the baselines.

3.1 Daily data processing and quality control procedures for variometer data

The raw data from the observatory GDAS systems are returned continuously to BGS Edinburgh in near real-time and stored in day files. The automatic data processing is carried out by a series of FORTRAN programs, which filter the one-second variometer and the 10-second PPM data to derive one-minute values, combine the one-minute and one-second values with daily baseline values (see later) and derive various data products and outputs useful for quality control of the data. These include magnetograms showing H, D, Z and F, as well as F-difference plots (often called closing error plots), in which F derived from the baseline-corrected fluxgate H and Z measurements is compared against F measured by the PPM. The one-minute F-difference plot for PST observatory on an example date is shown in Fig. 2. The spikes seen on the difference trace indicated a problem with one or more of the instruments. Further investigation in this case showed that both instruments were affected at slightly different times, and the erroneous values were identified and removed within 24 hours of recording.
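A minimal sketch of the closing error computation follows, assuming baseline-corrected H and Z in nT; the function name and the pillar-correction argument are illustrative, not part of the BGS software.

```python
import numpy as np

def closing_error(h_var, z_var, h_base, z_base, f_ppm, f_pillar_corr=0.0):
    """F derived from baseline-corrected fluxgate H and Z minus F
    measured by the PPM (corrected to the absolute pillar).

    Since H is the horizontal field magnitude, F = sqrt(H^2 + Z^2).
    Spikes in the returned difference flag instrument problems.
    """
    h = np.asarray(h_var, float) + h_base
    z = np.asarray(z_var, float) + z_base
    f_fluxgate = np.hypot(h, z)
    return f_fluxgate - (np.asarray(f_ppm, float) + f_pillar_corr)
```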

Fig. 2. An example quality control plot for PST observatory. One-minute values of F from the PPM (top panel), F derived from the H and Z fluxgate measurements (2nd panel), the difference between them (3rd panel) and measured temperature (bottom panel) on 22nd February 2009 are shown.

Where an observatory has more than one system installed, such as at LER, ESK and HAD, comparison plots between systems for each component are used to identify any corrupt data. Quality control plots and data products are automatically updated in near real-time, or can be regenerated manually as required, and are available to view on an internal website.

During the day and on a next day basis these quality control plots are carefully analysed by the duty processor. Any errors identified in the variometer data are either removed, or, in the case of the UK where backup systems are running, replaced with unaffected data from the most appropriate system. For real-time products, the data processing software will carry out the latter automatically using configuration files that can be manually adjusted. Common observatory data quality control practices are discussed in Reda et al. (2011).

Every morning during normal working days any required adjustments to the variometer data for the previous day (or three days following the weekend) are completed. The original reported data are retained and a separate adjusted day file is created. It is important to note that the one-minute and one-second data are stored separately from the daily baseline values, i.e. as variometer data, and are tied to the baseline values by the software, prior to publishing. Any corrections to data are logged in a diary system detailing the times, type of correction and information on the cause, if known.

3.2 Processing of absolute observation measurements and derivation of baseline values

Absolute observations are recorded and processed using the BGS proprietary Java program 'GDASView' (Turbitt and Shanahan, 2012). All of the relevant information associated with each observation can either be entered directly or read in from a previously saved version. The software also reads in the variometer and PPM data for the same date and time and outputs absolute observations in D, I, F, H and Z and associated collimation errors, plus the differences between the absolute and variometer values, known as spot baseline values. Note that what we refer to as the measured D and H variations are not exactly in the standard geomagnetic reference frame; rather, they are in a cylindrical reference frame defined by the exact orientation of the sensors, and as such are not exact variations in D and H. The calculation of each of the D and H spot baselines therefore includes both of these so-called measured D and H variations, to ensure the sensor reference frame is accounted for. This processing is carried out as soon as possible after the manual measurements are made and received in the Edinburgh office via email. Spot baselines account for instrument offsets and correction to the absolute pillar.
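The sketch below shows the textbook reduction for a variometer with H-aligned and eastward sensors (e.g. Jankowski and Sucksdorff, 1996), in which both the H and eastward variations enter the H and D spot baselines; it is illustrative and not necessarily the exact GDASView formulation.

```python
import numpy as np

def spot_baselines(h_abs, d_abs, z_abs, h_var, e_var, z_var):
    """Spot baseline values from one absolute observation (h_abs,
    z_abs in nT, d_abs in radians) and the simultaneous variometer
    readings in the sensor frame (h_var along the H sensor, e_var
    eastward, z_var vertical, all in nT).

    Assumption: standard formulas for a cylindrical sensor frame,
    not necessarily the exact GDASView implementation.
    """
    # Horizontal magnitude at observation time is sqrt((H0 + h)^2 + e^2),
    # so the H baseline H0 follows as:
    h_base = np.sqrt(h_abs**2 - e_var**2) - h_var
    # The eastward variation rotates the declination:
    d_base = d_abs - np.arctan2(e_var, h_base + h_var)
    z_base = z_abs - z_var
    return h_base, d_base, z_base
```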

A FORTRAN program is used to plot the spot baselines against the current continuous daily baseline values for a given year. The plot includes the daily mean F differences as well as the daily mean temperature in the variometer chamber, with all the panels building up as the year progresses. The F differences in this case compare daily mean values of F derived from the baseline-corrected fluxgate H and Z measurements against those derived from PPM measurements, providing vital information on the longer term stability of the instruments. The LER baseline plot, as it was at the end of September 2010, is shown in Fig. 3.

Fig. 3. Baseline plot showing the spot (markers) and daily (lines) baselines for Lerwick 2010. D is in the top panel, H is in the 2nd panel and Z is in the 3rd panel. Daily mean F differences (4th panel) and temperature (5th panel) are also shown.

Presenting the results in the form of Fig. 3 combines the relevant information required to help decide whether the daily baseline values need to be updated. When new observations are available this plot is assessed to make sure the current baselines (and, in effect, the baselines predicted for the next few weeks) are as accurate as they can be and well within the QD standard. The plot is regenerated after any adjustments to the baseline have been made and is used to visually assess the quality of the polynomial fit of the baseline to the data. At the time shown, the spot baseline values (markers) were available up to the end of September, whereas the daily baselines (lines) were predicted into the future. Note that this plot has been retrospectively created for illustration purposes in this paper and the actual baselines shown may not have been exactly as they were at the time, although they would have been very close.

At least once a month, but more frequently if required, the recent spot baseline values, absolute observation data and associated collimation errors are analysed in more detail using Microsoft Excel. Continuous baselines are fitted to the spot values using a series of piecewise polynomials, with Excel deriving each polynomial by the method of least squares. Figure 4 shows the spot values for the H component at ESK in 2011. The selected sets of observations and the derived polynomial fit for each selected data set are shown, along with the polynomial order and the coefficient of determination, R2, giving an indication of the goodness of fit in each case. In practice the equation of each polynomial is also output, and this is then used to derive the daily values. Polynomials of order 3 or less are most common. Using those of order greater than 4 is not recommended, as following the spot values too closely is unrealistic for the instruments: a smoother baseline is a better approach, and the data processor will have experience of the instrument capabilities and can make educated decisions. The process of choosing the sections does have a subjective element, which depends on the skill and experience of the data processor, although known information on instrument and environmental changes is taken into account. In the example shown, the vertical dashed arrows indicate the final cut-off points between each polynomial used. Before fitting, any outliers in the spot values are removed, based on a set of rules that take into account erroneous collimation errors, spot values falling outside two standard deviations of the mean of the selected data, and any other obvious factors. This method also allows for known steps to be accounted for; an example of a step at ESK is clear in Fig. 4 towards the end of 2011. A sketch of the fitting procedure is given below.
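The fragment below sketches the fit to one manually selected segment, assuming a simple two-standard-deviation outlier rule (a simplification of the rule set described above, which also considers collimation errors); segment boundaries are still chosen by the data processor.

```python
import numpy as np

def fit_baseline_segment(days, spots, order=3):
    """Least-squares polynomial fit to one manually selected segment
    of spot baseline values, returning the coefficients and R^2."""
    days = np.asarray(days, float)
    spots = np.asarray(spots, float)
    # Remove outliers beyond two standard deviations of the segment mean.
    keep = np.abs(spots - spots.mean()) <= 2.0 * spots.std()
    coeffs = np.polyfit(days[keep], spots[keep], order)
    # Coefficient of determination R^2 as a goodness-of-fit indicator.
    fitted = np.polyval(coeffs, days[keep])
    ss_res = np.sum((spots[keep] - fitted) ** 2)
    ss_tot = np.sum((spots[keep] - spots[keep].mean()) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot

# Daily baseline values for the segment are then evaluated from the
# polynomial, e.g. np.polyval(coeffs, np.arange(first_day, last_day + 1)),
# where first_day and last_day are the (hypothetical) segment bounds.
```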

Fig. 4. The piecewise polynomial fitting to the spot baselines to derive the H baseline at ESK during 2011.

Adjustments are made to the piecewise polynomials as required and the baseline is updated. Baseline data are stored in year files with one value per day for each of the components H, D and Z. All previous versions of the baseline file are saved using a simple version control system. Following any revisions to the values, baseline plots of the type shown in Fig. 3 are regenerated, and the constancy of the F difference trace is used to decide whether another iteration of the process is required.

Daily baseline values are created for the full year, which by default includes a projection into the future based on the baselines on the day of computation. The extrapolation is usually constant (order 0), although account can be taken of any current increasing or decreasing linear trend (order 1). Where such trends exist it is even more important to repeat the process with new observations as soon as they are available, thus reducing the number of days of predicted baseline values. A sketch of this extrapolation is shown below.
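A minimal sketch of the projection step, assuming the choice between constant and linear extrapolation is made by the data processor; all names are illustrative.

```python
import numpy as np

def extrapolate_baseline(recent_days, recent_values, future_days, order=0):
    """Project daily baseline values into the future.

    order 0: hold the most recent daily value constant (the default).
    order 1: continue a linear trend fitted to recent daily values.
    """
    future_days = np.asarray(future_days, float)
    if order == 0:
        return np.full(future_days.shape, recent_values[-1])
    slope, intercept = np.polyfit(recent_days, recent_values, 1)
    return intercept + slope * future_days
```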

3.3 Production and delivery of quasi-definitive data

Automated data processing software combines the daily extrapolated baseline values of H, D and Z, derived from the baseline functions, with the H, D and Z variometer data. One-minute data are delivered to the Edinburgh INTERMAGNET GIN in near real-time and on a next day basis. IAGA-2002 type 'v' variometer data (INTERMAGNET type "R" reported) are delivered in real-time, and IAGA-2002 type 'p' provisional data (INTERMAGNET type "A" adjusted), which have baselines applied but are not necessarily fully quality controlled, are delivered next day, shortly after UT midnight. Once the full procedures in Sections 3.1 and 3.2 have been completed, the QD data are prepared and also delivered to the GIN by running the data processing software in manual mode on a next (working) day basis.

Later, if baselines are revised, QD data may be resubmitted to the GIN within the 3-month window. We have included an extra header line "Data file created on" to keep a record of when the QD data are produced. This information is currently lost when data are extracted from the GIN (although it is retained by BGS), so we would encourage INTERMAGNET to give consideration to time stamping or version control of data. For some applications, such as Swarm Level 2 data products, a clear audit trail may be required, and having information on the version or creation date of QD data may be a requirement.

4. Evaluation of Quasi-Definitive Data Accuracy

A statistical evaluation has been carried out using data from 2000 to 2011 (the last year for which definitive data were available at the time of the analysis) for the five INTERMAGNET observatories shown in Fig. 1.

Although publishing data as type QD is a relatively recent activity, as previously discussed, it is suggested that the data published as type "provisional" by BGS over several years have met the criteria now established for QD data. For more than a decade, hourly mean values from LER, ESK and HAD, derived from the provisional, baseline-corrected, one-minute values, have been published online at http://www.geomag.bgs.ac.uk/data_service/data/obsdata/hourly_means.html on a next day basis. In the case of ASC and PST the hourly mean values needed to be computed from provisional data submitted to the INTERMAGNET GIN, also on a next day basis. These hourly values are used here to test the hypothesis that the provisional data of the time can be classed as QD, and that the method developed at BGS for the production of QD data is suitable to meet the criteria established.

The definitive one-minute values, as published on an annual basis, are used to compute definitive hourly mean values that are then compared against the candidate QD hourly means. The hourly differences in the North (X), East (Y) and Vertical (Z) components were computed from 2000 to 2011 for LER, ESK and HAD, from 2004 to 2011 for ASC and from 2005 to 2011 for PST.

The resolution of the original hourly mean values, and thus of the differences, is 1 nT, and the differences are counted in 1 nT bins. The mean difference (μ) and the standard deviation of the differences (σ) are calculated for each observatory. The results for the whole period analysed are presented in Table 1 and the same set of results for the most recent year (2011) is presented in Table 2. A sketch of these calculations is given below.
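The evaluation statistics can be reproduced along the following lines; a sketch with illustrative array names, assuming aligned QD and definitive hourly series for one component.

```python
import numpy as np

def qd_evaluation(qd_hourly, def_hourly):
    """Mean and standard deviation of the (QD - definitive) hourly
    differences, counts in 1 nT bins, and the percentage of hours
    within the 5 nT INTERMAGNET threshold."""
    diff = np.asarray(qd_hourly, float) - np.asarray(def_hourly, float)
    diff = diff[~np.isnan(diff)]  # ignore hours missing in either series
    mu, sigma = diff.mean(), diff.std()
    edges = np.arange(diff.min() - 0.5, diff.max() + 1.5, 1.0)  # 1 nT bins
    counts, _ = np.histogram(diff, bins=edges)
    pct_within_5nT = 100.0 * np.mean(np.abs(diff) <= 5.0)
    return mu, sigma, counts, pct_within_5nT
```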

Table 1. Mean (μ) and standard deviation (σ) of the differences in the X, Y and Z hourly mean values (QD—definitive) at five IMOs for all years analysed.
Table 2. Mean (μ) and standard deviation (σ) of the differences in the X, Y and Z hourly mean values (QD—definitive) at five IMOs for 2011.

In Fig. 5 the distributions of the differences are shown for each observatory and each component over all years, and Fig. 6 shows the same for the year 2011 only. It is well known that geomagnetic data tend not to follow a Gaussian error distribution, and this is verified in the current analysis. Further analysis is therefore carried out by calculating the percentage of time for which the INTERMAGNET accuracy threshold of 5 nT is achieved. These results are presented for the three components together, at each observatory, for each year, in Fig. 7.

Fig. 5. Histograms of the binned differences between QD and definitive X, Y and Z hourly mean values for all years. From top to bottom the results are for LER, ESK, HAD, ASC and PST. The curves are the Gaussian best fit for each distribution.

Fig. 6. Histograms of the binned differences between QD and definitive X, Y and Z hourly mean values for 2011. From top to bottom the results are for LER, ESK, HAD, ASC and PST. The curves are the Gaussian best fit for each distribution.

Fig. 7. The annual percentage of hourly mean QD values falling within the INTERMAGNET defined accuracy of 5 nT of the final definitive hourly mean values at the five IMOs for each year analysed.

5. Discussion of Results

This evaluation of the candidate QD data has shown that they are clearly within 5 nT of the definitive values most of the time. Better results are obtained for the vertical (Z) component than for the two horizontal components (X and Y) at all five observatories, and the poorest results are obtained for the Y component, reflecting the difficulties involved in the accurate measurement of D and the fitting of the D baseline. The values in Table 1 also highlight that the QD data from HAD observatory have been closer to definitive than those from any of the other observatories throughout the analysed period, with the PST results being generally the poorest. The means and standard deviations for all five IMOs are within 5 nT, although the spread of the differences at ASC and PST, as seen in both Figs. 5 and 6, is greater than at the UK observatories. Comparing Figs. 5 and 6, it is clear that there is a reduction in spread for all observatories except ASC in 2011 (Fig. 6) compared with all years (Fig. 5), and the standard deviation values in Table 2 compared with those in Table 1 also demonstrate this improvement. Figures 5 and 6 also provide evidence of bi-modal error distributions in some cases and non-Gaussian error distributions in most.

At ASC and PST the instrument baselines have been more prone to variation, these observatories being located in more challenging operating environments without BGS staff on site to address problems as they arise. The smaller number of absolute observations at these remote sites makes it more difficult to account for drifts, such as those that might be caused by temperature changes or by changes in the site difference (artificial or otherwise). It is not always possible to fit new data and revise the baselines every month at these observatories. The observation quality can sometimes fall outside the acceptance criteria, and the increased uncertainty in the values can result in a decision not to use the new measurement. The next observation is then required before any trends can be evaluated with any degree of confidence. This delay means that predicted baselines at ASC and PST are in use further into the future than at the UK observatories.

In the case of ASC, 2010 and 2011 were years when delays to fitting the baseline were more frequent, for a variety of reasons, and this is highlighted in Fig. 7. Operation of PST has been the most challenging of the five BGS IMOs over the years and this is reflected in the results overall, in particular prior to 2009. The improvement since then has been largely due to improvements in the quality of the absolute observations made, although observatory operations are still not without problems at this site.

Figure 7 highlights a clear drop in the accuracy of QD data at LER during 2008 and, to a lesser extent, in 2009. This was entirely due to the discovery that heaters located in the absolute hut contained magnetic material. The heaters were removed from the hut on 25th March 2008, but it was much later before the problem was fully understood, and dealing with this artificial change at the observatory absolute pillar was left until the production of the final definitive data for 2008. The decision was made to introduce a discontinuity in the final observatory results between 31st December 2007 and 1st January 2008, the effect of which was to create a step at that point, with all measurements since then needing to be reprocessed. This reprocessing was carried out in May 2009, with the consequence that much larger than usual changes to the provisional baselines throughout 2008 and up to the end of May 2009 were required. In the case of the Y component this amounted to an offset of greater than 5 nT being applied retrospectively, which explains the bi-modal distribution in the LER Y errors and the overall poor result for LER in 2008 and into 2009.

One disadvantage of the baseline derivation method is that it has four separate stages using three different sets of software: first, a Java program derives the spot absolute values; second, a FORTRAN program plots the relevant parameters; third, Excel spreadsheets are used to fit the piecewise polynomials; and finally the FORTRAN program is used again to re-plot the combined data sets and enable a final assessment of the quality of the fit. A more streamlined approach is possible and development is under way to make the process less labour intensive. The results presented here are nonetheless unaffected: the principle behind the chosen method, which enables data of QD standard to be produced in near real-time, remains, and the methodology described in this paper will continue to be used for the foreseeable future.

6. Concluding Remarks

The ability to maintain baselines to QD specification is fundamentally a sampling problem: the variometer baseline must not contain signal of the order of the required QD accuracy at periods less than twice the sampling interval of the absolute measurements. An observatory therefore requires stable variometers, good control of the measurement, and regular, precise absolute measurements. The less stable the variometers, or the more problematic the environment, the more frequent the absolute measurements required.

The method developed by BGS was initially driven by real-time demand from users for time varying data that were near to absolute level. Although not labelled QD at the time, it has been shown that the method in place enabled the derivation of data products that were close enough to definitive to meet the current QD accuracy standard most of the time. Hourly mean values published on a next day basis were within ±5 nT of the definitive values published over a year later close to 100% of the time in the majority of years analysed, apart from a few exceptional cases (e.g. LER 2008 and PST prior to 2009).

In a similar study by Peltier and Chulliat (2010) the error in QD data was found to be <0.3 nT, an order of magnitude smaller than that found in the present study (worst case 3.8 nT). This difference is explained in part by the methods used at the two institutes and in part by the differences in the analyses carried out. The IPGP method is a monthly process, which concentrates on obtaining the most accurate results for the recent past. The BGS method, although similar, also attempts to produce next day QD data using predicted baseline values. The analysis carried out for this paper has been on these predicted QD data, as opposed to QD data with a 3-month delay, which in the future could also be analysed for accuracy. Both methods clearly meet the QD data definition set by INTERMAGNET, and each has strengths that will benefit specific users of the data.

We hope that these results provide encouragement to those operators of observatories around the world who have not yet started producing and publishing QD data.