What do Argo floats measure?
Argo floats are deployed all over the world - the aim is to have over 3,000 of them, distributed roughly every 3 degrees (about 300 km). Deployments began in 2000 and continue at a rate of about 800 a year. During one of Dean Roemmich's (Scripps Institution of Oceanography) visits to New Zealand, the challenge of deploying floats in the southern hemisphere was discussed. It was realised that NIWA's Kaharoa, predominantly used for coastal research, would make an ideal deployment platform.
The Kaharoa has completed some very long transects while deploying Argo floats. While she hasn't yet been to the South Atlantic, there are hopes that she will be able to go there in the future.
Overview: Argo floats are instruments able to measure temperature and salinity in the top 2,000 m of our oceans. Scientists make sure the quality of the data is good enough to measure the very small but important changes in temperature and salinity that might be due to climate change.
They check that the data are consistent with data measured by other instruments in the same place, with mathematical models (thanks to Argo floats near the observation point, we have an idea of what measurement to expect there) and with other observations obtained in the region, and of course they rely on their own experience. Within 6 to 12 hours the data are available at operational weather forecasting centres, where they are used in climate forecasting.
Soon after this, the data are also freely available on the Internet. What precious data do scientists receive thanks to Argo floats? Of course, the data that scientists most look forward to are those on ocean temperature and salinity.
This also requires free and unrestricted data sharing in real time, a mandatory rule for all Argo floats. The AIC developed a centralized electronic notification system, which informs Member States of float deployments. Float operators register their deployments (including early plans) in the AIC system, which triggers a notification email including float deployment details, equipped sensors, configuration, contact points, etc.
The availability of data is also checked by the AIC. In this process, each Argo float deployment is assigned a unique identifier, its WMO number, which is also used in the data system. NKE floats are adjusted to maximum buoyancy before deployment (i.e., with the oil in the external bladder). To be able to sink, these floats must reduce their volume (buoyancy) by opening a solenoid valve to bring the oil back into the hull. Activation is typically done just before deployment.
The float then starts a sequence of self-tests, which last a few minutes, with success indicated by an audible signal that marks the start of its volume reduction. This volume reduction (and signal) lasts 30 min, and the user should not deploy the float after the signal has stopped, to avoid too rapid a sinking of the float. After deployment, it can take 30-40 min before the float leaves the surface, possibly longer in cold waters due to increased oil viscosity.
The float then reads its pressure sensor every 2 h. As the float is ballasted low, it sinks passively on its own and typically leaves the surface once the cowling fills with water (2-20 min). Pre-deployment sensor-specific procedures should include the cleaning of optical windows (e.g., of bio-optical sensors). Upon deployment, metadata should be completed with location, time, concurrent observations, etc.
Concurrent reference samples (e.g., for oxygen) should be collected where possible. Experience from some float groups suggests that daily profiles for the first 5 days after deployment, as well as enabling both descending and ascending profile acquisition (only possible on NKE floats), allow the initial behavior of certain sensors to be assessed and to stabilize.
After these 5 days, the float can be configured to a standard Argo mode. This includes profiling to 2,000 m depth at least once per month to allow appropriate salinity data qualification for the core Argo mission. Thanks to two-way communication, mission changes like those above are possible during deployment. This is a very useful capability, but it should be used cautiously, depending on the extent of the modification.
If the level of modification is low and affects only standard parameters (e.g., cycle timing or profile depth), it is straightforward. Other parameters can be very advanced and should be left to expert users. In any case, any modification of parameters must be carefully checked in order to detect any conflict between them. Pre-validated missions should be used or, if available, simulation software that assists the float operator in this task. The new mission parameters are directly tied to the firmware version of the float, i.e., the available parameters and their behavior can differ between firmware versions.
This information is crucial. The capability to change the float mission can become quite time consuming, because the data subsequently transmitted by the float need to be monitored frequently. Float recovery is likewise only possible with two-way communication, which permits a change in mission parameters to set a float into recovery mode. This mode keeps the float at the surface, sending position updates at a higher frequency. BGC-Argo float recovery is becoming a common practice when logistics permit, e.g., in the Mediterranean Sea.
Every 2-3 years, a ship cruise is dedicated to the recovery of old floats and the deployment of refurbished platforms (Taillandier et al.). Recovered floats have been refurbished in the following way, serving as a template for others: visual check of the sensors before cleaning, systematic replacement of batteries and o-rings in the lab, and pre-deployment tests for ballasting, communication, mission configuration and software performance.
In addition, sensors should be recalibrated as a best-practice recommendation. Recovering and redeploying floats is cost effective and eco-friendly, allows for sensor recalibration, and helps to maintain the BGC-Argo array and attain the BGC-Argo science goals (Biogeochemical-Argo Planning Group, 2016). It is thus recommended to check for recovery opportunities wherever possible.
Note, however, that while float recovery and redeployment has many benefits, its feasibility is highly dependent on float location and related field logistics, and it is not mandatory within the Argo system. The NAOS redeployment experience suggests that although the recovered battery voltages would have allowed a prolonged deployment, the bio-optical sensors were covered by a biofilm in most cases (depending on the in situ conditions encountered), which decreased the data quality and became the factor limiting the mission.
In contrast, the optics and electronics of the bio-optical sensors were not critically affected, even with high sampling rates. Overall, the choice to limit the deployment of the platform to 2-3 years (on the order of 100-200 cycles, depending on cycling frequency) provided better data quality and reduced calibration drifts.
Seen from the perspective of maintaining the array, the recovery cruises allowed for a higher level of data quality and maintained the initial seeding plan for the Mediterranean. The primary objective of BGC-Argo data management is to provide calibrated, science-quality biogeochemical data to the global user community, while at the same time preserving raw BGC float data in their original form, such that applied adjustments can be reprocessed as needed.
The pre-existing framework of the core Argo data system supports these aims through the management of specific real-time and delayed-mode file structures. Additionally, associated metadata, trajectory, and technical files provide the supporting framework for storing all pertinent calibration and location information. The Argo data system has been serving core Argo data successfully since the early 2000s and is considered a reputable repository for data from autonomous profiling floats.
Since biogeochemical data are obtained by sensors that are housed within Argo floats and need to be used in conjunction with their temperature (T) and salinity (S) profiles, it is natural to serve the BGC data as an extension of the core Argo data system. The core file stores the pressure, temperature and salinity data.
The b-file stores the pressure and all corresponding BGC data. The pressure axes in both files are identical and are used to merge core and BGC data from a single cycle into various higher-level files served at the GDAC. All profile data are stored as they are telemetered and decoded, thus preserving the data in their rawest form. In this manner, re-processing of BGC data can be carried out with relative ease and integrity. Together, these files preserve the complete float record.
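As a minimal sketch of how the shared pressure axis enables that merge, the snippet below combines a c-file and its b-file; the file names and the DOXY variable are assumptions for illustration, not a prescription of the GDAC layout:

```python
# Merge a core c-file and its b-file on their shared pressure axis.
import xarray as xr

core = xr.open_dataset("R6901866_001.nc")   # hypothetical c-file (PRES, TEMP, PSAL)
bgc = xr.open_dataset("BR6901866_001.nc")   # hypothetical b-file (PRES, DOXY, ...)

# Because both files carry an identical pressure axis, BGC variables can be
# placed directly onto the core profile's levels.
merged = core.copy()
merged["DOXY"] = (bgc["DOXY"].dims, bgc["DOXY"].values)

print(merged[["PRES", "TEMP", "PSAL", "DOXY"]])
```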
Real-time data are aimed at operational users, such as those assimilating Argo float data into numerical weather prediction and other operational models. Data in this stream are subject to real-time quality control checks and are expected to be free of gross outliers.
Only automated quality control and data checks can be applied at this stage. The second data stream, delayed-mode data, is meant to provide the best quality data for science at the present date, including realistic error estimates.
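To illustrate what such an automated check can look like, here is a sketch of a "global range" test that flags gross outliers; the thresholds follow the commonly cited Argo global ranges for temperature and salinity but should be treated as illustrative:

```python
# A toy automated real-time check: flag values outside plausible global ranges.
import numpy as np

GLOBAL_RANGES = {"TEMP": (-2.5, 40.0), "PSAL": (2.0, 41.0)}  # illustrative

def global_range_test(values: np.ndarray, param: str) -> np.ndarray:
    """Return Argo-style QC flags: '1' (good) or '4' (bad) per sample."""
    lo, hi = GLOBAL_RANGES[param]
    flags = np.where((values >= lo) & (values <= hi), "1", "4")
    # Missing values receive flag '9' in the Argo convention.
    return np.where(np.isnan(values), "9", flags)

print(global_range_test(np.array([12.3, 45.0, np.nan]), "TEMP"))  # ['1' '4' '9']
```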
For core data, the DMQC process is typically expected to occur on an annual basis. The same frequency is recommended for BGC parameters, although an initial DMQC, which improves the initial accuracy considerably, should be performed as soon as a sufficient number of profiles have been returned (typically after 5-10 cycles).
During a delayed-mode assessment, for either core or BGC data, data from a float are rigorously examined in a multi-parameter context, which typically includes some level of comparative analysis against regional reference data and climatology. In many cases, any necessary data adjustments (gain, drift, offset) derived by the delayed-mode operator during such an assessment can be fed back into the incoming data stream in real time, producing real-time adjusted data.
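As a hedged sketch of how such DMQC-derived coefficients might be applied to incoming raw data, consider the linear correction below; the function and its coefficient values are invented for illustration and are not the Argo specification:

```python
# Apply DMQC-derived gain/offset coefficients, with an optional slow drift
# in time, to raw incoming data. Numbers are made up for illustration.
import numpy as np

def apply_adjustment(raw: np.ndarray, gain: float = 1.0, offset: float = 0.0,
                     drift_per_year: float = 0.0, years: float = 0.0) -> np.ndarray:
    """adjusted = (gain + drift_per_year * years) * raw + offset"""
    return (gain + drift_per_year * years) * raw + offset

doxy_raw = np.array([210.0, 205.5, 198.2])        # raw oxygen, umol/kg
doxy_adj = apply_adjustment(doxy_raw, gain=1.06)  # hypothetical 6% gain
print(doxy_adj)
```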
If no parameters exist in delayed mode (i.e., DMQC has not yet been performed), the data remain in real-time or real-time adjusted mode. Figure 4. Sequence of quality control and adjustment steps during the lifetime of a float, from deployment to death.
Initial DMQC should be performed soon after deployment (typically after 5-10 cycles). With subsequent DMQC revisits (on an annual basis), adjustments become more reliable (indicated by the green shading). Table 5. Example of the file naming scheme for a float with a given WMO number. The all-cycle file contains all available cycles for the float.
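As a rough illustration of that naming scheme, assuming the conventional Argo prefixes (R/D for real-time/delayed-mode core files, BR/BD for the corresponding b-files) and a per-float synthetic all-cycle file, the snippet below builds the expected names for a made-up WMO number:

```python
# File names for a hypothetical float; WMO number 1234567 is invented.
wmo, cycle = 1234567, 42

core_rt = f"R{wmo}_{cycle:03d}.nc"    # R1234567_042.nc, real-time core profile
core_dm = f"D{wmo}_{cycle:03d}.nc"    # D1234567_042.nc, delayed-mode core profile
bgc_rt  = f"BR{wmo}_{cycle:03d}.nc"   # BR1234567_042.nc, real-time b-file
all_cyc = f"{wmo}_Sprof.nc"           # synthetic all-cycle file for the float

print(core_rt, core_dm, bgc_rt, all_cyc)
```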
These two data streams are reflected in the BGC-Argo data processing organization and documentation. Each processing "cookbook" document is highly comprehensive, often including sample parameter data processing code and metadata population examples for a wide range of BGC sensor types and model configurations. It is highly recommended to take particular care in following the procedures outlined in the Argo data processing documentation, in order to ensure consistency between DACs.
Should questions related to a specific float configuration arise, they are typically directed to the Argo community via the Argo data management mailing list. Users are warned that raw biogeochemical data should be treated with care, and that adjustments are often needed before these data can be used for meaningful scientific applications. The QC manuals go hand in hand with the processing cookbooks; however, their update frequency is driven by scientific evolution.
If not stated otherwise, error estimates represent a 1-sigma uncertainty. Note that raw intermediate parameters are not quality controlled and that only the derived BGC parameters receive QC. Real-time operations, including the reception of float-transmitted files, decoding and conversion to biogeochemical parameters, application of real-time adjustments, as well as the creation of Argo netCDF files and their transfer to the GDACs, are performed at each DAC. Note that both the sharing of data and the provision of delayed-mode quality-controlled data are mandatory for a float running under the Argo label and its associated IOC regulations (Argo Steering Team).
Defining the proper pathway for processing a BGC float (Figure 5) could take different forms, but having a system identified prior to deployment is advised. Suggested steps would include (1) establishing real-time processing and submission through a DAC, and (2) assigning responsibility for DMQC of each parameter. Figure 5. The upper row indicates the real-time data stream, while the lower loop shows the delayed-mode data stream.
Given the richness of parameters, this does not need to be localized at a single laboratory. An operational workflow must be established such that any data adjustments resulting from the DMQC efforts (2) are effectively fed back into the processing and submission scheme (1).
Finally, as highlighted in the previous parts, the BGC-Argo data flow implies many transformations of the data files from real-time to delayed-mode status. A file checker has been developed and made available to the Argo community so that, before submitting modified files to the GDAC, anyone can check that their files are compliant with the Argo rules and documentation.
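The official checker validates files against the full Argo format specification; purely as a loose illustration of the idea, the sketch below performs a much simpler structural sanity check with the netCDF4 library (the required-variable list is a made-up subset, not the actual specification):

```python
# A toy pre-submission sanity check, NOT the official Argo file checker:
# it only verifies that a few expected variables exist in a profile file.
from netCDF4 import Dataset

REQUIRED = ["PRES", "TEMP", "PSAL", "DATA_MODE"]  # illustrative subset

def rough_check(path: str) -> list[str]:
    """Return the list of required variables missing from the file."""
    with Dataset(path) as nc:
        return [v for v in REQUIRED if v not in nc.variables]

missing = rough_check("R1234567_042.nc")  # hypothetical file
print("missing variables:", missing or "none")
```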
There are three types of Argo files on the GDACs: (1) the core- or c-files; (2) the biogeochemical- or b-files; and (3) the simplified- or s-files (Table 5). The c-files contain the float pressure, temperature and salinity data. The b-files contain the pressure and all BGC data, including the raw intermediate sensor data and the computed ocean-state variables.
The c- and b-files record the raw float-transmitted data. Biogeochemical-Argo data come in one of three data modes (Figure 6). Real-time data were converted by the DAC from raw transmitted sensor data to an ocean-state BGC parameter without any further adjustments. While real-time QC is applied to remove gross outliers, these data should be treated with care, as adjustments are often needed before they can be used for meaningful scientific applications.
Real-time adjusted data were converted by the DAC from raw transmitted sensor data to an ocean-state BGC parameter and subsequently received an adjustment in an automated manner. Note that BGC-Argo is an evolving network, which is also true for its data: adjustments may well change when better methods, climatologies, or longer float time series become available. In addition, each sample has a corresponding quality flag.
The QC flags and their meaning are defined in Argo reference table 2 (Wong et al.; see Table 7). The QC flags are an essential part of the data. Data access begins with the search for appropriate floats and profiles.
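As a hedged sketch of how a user might honor both the data mode and the QC flags when reading a profile (variable names follow common Argo conventions; the file name is hypothetical, and the choice of acceptable flags is application dependent):

```python
# Prefer the adjusted parameter when present, then keep only samples whose
# QC flags indicate usable data ('1' good, '2' probably good, here).
import numpy as np
import xarray as xr

ds = xr.open_dataset("BR1234567_042.nc")  # hypothetical b-file

name = "DOXY_ADJUSTED" if "DOXY_ADJUSTED" in ds else "DOXY"
values = ds[name].values
flags = ds[name + "_QC"].values.astype(str)

good = np.isin(flags, ["1", "2"])
usable = np.where(good, values, np.nan)
print(f"{name}: keeping {good.sum()} of {good.size} samples")
```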
To allow traceability of research, the Argo DOI may be sub-referenced to one of the monthly snapshots with its own DOI key. Other sources may aggregate Argo data for specific purposes. However, only sources where the data origin is traceable and transparent, i.e., where the underlying Argo data and their version can be identified, should be used. In many cases, however, the float PI or operator may have more detailed knowledge about a BGC-Argo float or the area of operation.
For regional studies, or studies involving just a limited number of float programs, it is recommended to contact the respective float PIs or programs.
Descriptions of biogeochemical sensors available on BGC-Argo floats are provided below. Note that accompanying temperature, salinity and pressure data are required to support the transformation of raw data returned from a respective BGC sensor into a meaningful BGC quantity.
An outline of the associated core-Argo variable quality control procedures can be found in Wong et al. Such details will not be summarized here, as they are beyond the scope of this paper. The processing capabilities of the data management system will always be superior to on-board capabilities, and raw data can be reprocessed at any time as improved algorithms become available, as long as the calibration information is available.
Figure 8 gives some examples of BGC sensor attachments on different float platforms. Figure 8. Examples of biogeochemical sensor attachments and implementations on different float platforms. Dissolved oxygen (O2) is a key variable of ocean biogeochemistry.
The preferred type of sensor to measure O2 from floats is the oxygen optode, as optodes are robust and have low power consumption (on the order of 1 J per sample). They are based on luminescence quenching of a chemical immobilized in a gas-permeable sensing membrane, which is in contact with seawater. This has two consequences: (1) oxygen optodes sense the seawater oxygen partial pressure, pO2, and (2) they show an oxygen time response due to re-equilibration between the sensing membrane and seawater (Bittig et al.).
Typical response times start at about 5 s and can reach tens of seconds, depending on sensor and setting, and can be reduced if the optode is in a pumped flow (Bittig et al.). The Argo data system is able to store this information alongside the sensor data (Bittig et al.).
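Where the time constant is known, the measured signal can in principle be corrected by inverting a first-order lag model; the sketch below shows the idea (the time constant is a made-up but plausible value, and real corrections, e.g., following Bittig et al., are more careful about sampling and smoothing):

```python
# Inverse first-order response correction: for a sensor obeying
# dy/dt = (x - y) / tau, the ambient signal is x ~ y + tau * dy/dt.
import numpy as np

def correct_lag(t: np.ndarray, y: np.ndarray, tau: float = 25.0) -> np.ndarray:
    """Recover the ambient signal x from lagged measurements y(t)."""
    dydt = np.gradient(y, t)   # finite-difference derivative
    return y + tau * dydt      # amplifies noise; smooth y in practice

t = np.arange(0.0, 100.0, 2.0)                 # seconds
true = 200.0 + 10.0 * (t > 40)                 # step in pO2-like units
meas = 200.0 + 10.0 * (1 - np.exp(-(t - 40) / 25.0)) * (t > 40)
err = np.abs(correct_lag(t, meas) - true)
print(err.max())  # residual error concentrates near the step itself
```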
Foil-batch calibrated optodes, which were dominantly used in the first decade of Argo oxygen measurements, can additionally show differences between the batch calibration and the individual optode, which may require an additional offset next to the factor on pO2 (Takeshita et al.). Even for such optodes, however, the correction should be done in units of partial pressure to match the sensor character (Bittig et al.).
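A minimal sketch of such a partial-pressure-space correction follows; the slope and offset values are hypothetical and would in practice come from in situ referencing (see below):

```python
# Correct optode oxygen in partial-pressure space: a multiplicative factor
# (slope) plus, for foil-batch calibrated optodes, a possible offset.
import numpy as np

def correct_po2(po2_raw: np.ndarray, slope: float, offset: float = 0.0) -> np.ndarray:
    """pO2_corrected = slope * pO2_raw + offset (all in mbar)."""
    return slope * po2_raw + offset

po2 = np.array([205.0, 180.4, 150.9])           # raw optode pO2, mbar
print(correct_po2(po2, slope=1.045, offset=-1.2))  # invented coefficients
```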
Oxygen observations thus require a solid plan for in situ referencing and adjustment. To enable in-air measurements, oxygen optodes should be attached at the top of the float (e.g., on a stalk) so that they are exposed to air when the float is at the surface. As a good practice, at least 2-3 in-air measurements per month should be performed to generate sufficient samples for an in situ drift assessment. The primary goal of the Argo program is to maintain a global array of autonomous profiling floats integrated with other elements of the climate observing system.
Argo Australia, as part of the international collaborative effort, is the second largest contributor to the global array. The data at the GDACs are in netCDF format, with profile, trajectory, technical and meta files available for each float, served by the United States Global Data Access Centre and the French Global Data Access Centre.
The global fleet of Argo floats has provided hundreds of thousands of profiles to date. Many data centres provide gridded products derived from Argo data. The associated reference manuals are available from the Argo Data Management website.