GTSPP Meeting, Brest, 19-20 November 2001 (2001.11.21)

 

The session was opened on Monday, 19 November 2001, at 08:30 at IFREMER Brest, France, by Gérard Riou, Director of Technology and Marine Information at IFREMER. His welcoming remarks are included below.

 

Welcome

 

Mr. Chairman, ladies and gentlemen, I would like to welcome you to this GTSPP meeting. For some of you it's the third meeting in one week in Brest, after the Argo and sea surface salinity meetings. This clearly demonstrates that operational oceanography is moving forward rapidly. It requires broad cooperation between marine science laboratories, data centres and met offices. All these components and competencies cooperate in GTSPP. Seven countries supported the pilot project: Australia, Canada, France, Germany, Japan, Russia and the United States.

 

Canada's Marine Environmental Data Service (MEDS) leads the project, and has the operational responsibility to gather and process the real-time data.

 

The US NODC performs four functions within the GTSPP program.

  • Maintains the global database of temperature and salinity data adding 'real-time' data supplied by MEDS.
  • Processes delayed-mode copies of data by performing the same data quality tests as MEDS, then adds the data to the database.
  • Prepares monthly data sets and transfers them by network to participants in the U.S., Australia and France, as well as to requesters.
  • Maintains GTSPP files on line.

 

In addition to MEDS and US NODC, three science centres participate in the project by independently evaluating the delayed-mode data sets for the Indian, Pacific, and Atlantic Oceans. Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO), the Scripps Institution of Oceanography (SIO), and NOAA's Atlantic Oceanographic & Meteorological Laboratory (AOML) perform this function as Data Assembly Centres for the World Ocean Circulation Experiment (WOCE), which GTSPP supports. IFREMER participates in the GTSPP by injecting data acquired through French sources into the system and by maintaining one of the two delayed-mode global data centres.

 

The project is a real success and has now become a permanent project. This no doubt reflects the quality of work done by the participants, but it also rests on the standardization of methods and data processing, for instance the quality flags and quality control procedures, and on a clear definition of the responsibilities of each participant.

 

We have the experience, the tools and the competencies to operate a permanent system. Its success will rely on the long-term commitment of the participants. I can express the support of IFREMER for this project.

 

Introduction

 

The list of participants is given in Annex 1. Though unable to attend in person, Rick Bailey provided extensive comments by email.

 

UOT CD Preparation

 

The meeting began with a review of preparations for the UOT CD Version 3. The schedule for its production was set at the previous WOCE DPC meeting in March 2001. This meeting reviewed the action items from the previous GTSPP meeting, held just prior to that DPC meeting. A review of progress item by item is provided in Annex 2.

 

In brief, two items have slipped in the schedule. The first concerns assembling all of the 1990-1998 data that have yet to undergo Science Centre QC. The 1998 data have been returned from the Pacific Ocean SC, will shortly be returned from the Atlantic Ocean SC, and will be followed by the Indian Ocean SC. NODC is in the process of switching its database system for GTSPP, and this will cause some delay in the 1990-1998 data being sent to SCs. It is expected that these data will go out in early January. Since the complete set of data does not have to be assembled on the CD until May, this should leave sufficient time for all Science Centres to complete their work.

 

The number of stations in the 1990-1998 collection is between 35,000 and 55,000, depending on the reconciliation with archives at Brest that is presently underway. At the low end, there are about 15,000 stations in the Atlantic, 16,000 in the Pacific and 3,000 in the Indian Ocean.

 

The data from 1999 and 2000 may be placed on the CD but with no Science Centre QC. As instructed at the last DPC meeting, these data will be placed in a separate data directory structure so that users may readily choose to exclude them if they wish. Bob Keeley will consult the DPC chairs for their advice about including these data or not.

 

An email from Rick Bailey suggested that GTSPP should merge the WHPO data into the archives. This is intended to happen after the issuance of the final WOCE CDs, but it will not be done for this CD.

 

The second slip in the schedule was in the creation of an html layout for the next version. A common look was proposed by the IPO, and this has yet to be assembled and shown to GTSPP/UOT partners. Keeley has the necessary templates from the IPO. He expects to have a draft showing the layout and proposed content by early January. He remarked that unless advised otherwise by UOT partners, the content would be largely the same, and organized in the same way.

 

GTSPP Review

 

Keeley presented a brief review of what he saw as accomplishments and weaknesses of GTSPP. His presentation had a number of figures, but most of these were based on the real-time data flow, the area of responsibility for MEDS. Some statistics were presented derived from NODC, but these needed to be updated.

 

The main point of the review was to set the stage for a discussion of the future of GTSPP.  Written comments were received from Rick Bailey.

 

The accomplishments of GTSPP were considered to be the following.

  • Good collaboration between science and data centres.
  • Better monitoring of data collection, flow and quality.
  • Improved duplicates management.
  • Some standardization of QC procedures and reporting.
  • Improvements in data system performance (timeliness, quality, fewer duplications, fewer data losses, better metadata handling).

 

The weaknesses were considered to be the following.

  • Distributed archives result in multiple copies.
  • Duplications are managed, not fixed.
  • Data "state" is unclear.
  • Detailed format content is not standard.
  • Coordination of "PCODES" is missing.
  • Limited advertising of available products.
  • Limited on-line data access.
  • Fall rate corrections not yet applied to archives.
  • Limited application to other variables.

 

A general discussion followed under the broad topics outlined below.

 

Duplicates Control

 

GTSPP spends a substantial amount of effort finding and removing duplications in its archives. These duplicates are created both intentionally and unintentionally. For example, in the real-time data flow, it is necessary to have more than one point on the GTS where profile data are extracted. This is demonstrably necessary because of losses (likely due to problems in bulletin routing tables) in transmission. In delayed-mode submissions, NODC matches real-time to delayed-mode data, and also needs to examine delayed-mode data against its delayed-mode archives, since the same data can arrive from different sources and with differences in content. It is these differences that can make duplicate identification difficult.
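
Purely for illustration, the kind of proximity screening involved might look like the sketch below; the tolerances, field names and simple record structure are assumptions, not GTSPP's actual matching rules.

    # Minimal sketch of near-duplicate screening between a real-time and a
    # delayed-mode profile header. Tolerances and field names are illustrative
    # only; GTSPP's actual matching rules are more elaborate.
    from datetime import datetime, timedelta

    def probable_duplicate(a, b,
                           max_time=timedelta(hours=1),
                           max_degrees=0.05):
        """Return True if two profile headers are close enough in platform,
        time and position to be candidates for duplicate resolution."""
        same_platform = a["callsign"] == b["callsign"]
        close_in_time = abs(a["time"] - b["time"]) <= max_time
        close_in_space = (abs(a["lat"] - b["lat"]) <= max_degrees and
                          abs(a["lon"] - b["lon"]) <= max_degrees)
        return same_platform and close_in_time and close_in_space

    rt = {"callsign": "FNOU", "time": datetime(2001, 11, 1, 12, 0), "lat": 47.50, "lon": -5.20}
    dm = {"callsign": "FNOU", "time": datetime(2001, 11, 1, 12, 20), "lat": 47.51, "lon": -5.19}
    print(probable_duplicate(rt, dm))   # True: flag the pair for resolution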

 

It has been proposed that all data handled within GTSPP be given a unique tag that will be carried with the data and, once created, never altered. It was suggested that we work with data collectors to try to get this tag placed on the data as early after collection as possible. It was suggested that manufacturers, for example of XBTs, might attach such a tag. Such an idea has been proposed by SOOPIP. Indeed, programs such as Argo will be doing this, and it is under serious consideration by operational programs such as SEAS. It was also suggested that all historical data now within GTSPP have a tag created for them as well.

 

Creating these tags will not eliminate duplicates problems, but it starts to address the problem earlier in the data collection process, where it is easier to apply a tag. It is important to start this, to use the tags to the extent possible within GTSPP, and to educate clients to create and preserve them, so that data systems can compare unique tags rather than proxies such as position, time and measurements, which are subject to change because of QC processes.

 

It was suggested that we do not need to tell collectors exactly what the tag should be. Rather, it is better to provide guidelines about how tags can be created, including a preferred character length. The most important attributes of this tagging system are that the tags can be created independently of a central source issuing them, and that they are guaranteed to be unique.
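
As a purely illustrative sketch of what such a guideline permits, a tag could be minted at the point of collection from a random UUID, which requires no central issuing authority; the use of UUIDs and the 32-character length are assumptions, not an agreed GTSPP convention.

    # Illustrative sketch only: minting a globally unique tag at the point of
    # data collection, with no central issuing authority. The use of UUIDs and
    # the 32-character length are assumptions, not an agreed GTSPP convention.
    import uuid

    def make_profile_tag():
        """Return a 32-character hexadecimal tag that is unique with
        overwhelming probability and carries no information about the data."""
        return uuid.uuid4().hex

    tag = make_profile_tag()
    print(tag, len(tag))   # e.g. '6f1c...'; the tag is an opaque identifier only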

 

The level of detail at which the tag is attached depends on the type of data. For example, profile data from a CTD may have the tag attached to each profile. For time series, an arbitrary segment, such as a day, may be chosen as the unit to which the tag is attached. The guidelines mentioned above need to address this issue.

 

The data format must be able to carry the tag, certainly within the GTSPP, and preferably from the point at which the tag is attached to the data. The GTSPP format, depending on the length of the tag, has a place to store such information in the surface codes structure. In order to attach a tag to real-time data, it will be necessary to report the data on the GTS in BUFR, as there is no place in either the BATHY or TESAC formats to do this. GTSPP should recommend such a move for this and other reasons related to the greater flexibility of BUFR.

 

The meeting did not try to suggest, nor discuss at great length, either the explicit ways to create a tag (exploiting originator-provided information, for example) or what should be used (such as database identifiers generated by data centres). Instead, NODC and IFREMER will exchange correspondence on this matter with the object of having a proposal ready in time for the next GTSPP meeting, scheduled for Hobart in March 2002. In addition to the attributes of the tag described earlier, it was noted that though attributes such as date, time and platform identifiers could be used to facilitate the generation of a unique tag, once the tag is assigned no attempt should be made to use the information in the tag to tell a receiver anything about the data. The tag is a unique identifier only, and should not be considered as carrying any other information content.

 

Data State Indicators (DSI)

 

This is a concept being promoted by Keeley and Neville Smith with the support of OOPC. The object is to provide a user with an indication of the state of processing of data when it is received. The intention is not to provide greater detail, but to have some fairly general scheme that is easily found with the data and easily understood. The proposal made by Keeley had been accepted within the Argo program, and he argued that it should also be used within GTSPP.

 

The discussion noted that it might be more difficult to know what DSI should be applied to data collected and reviewed by a PI before being sent to a data centre, as opposed to data coming anonymously from the GTS. Keeley agreed to write a summary of the data flow within GTSPP with a proposed value for the DSI at each stage. Participants will consider this, with results to be finalized at the next GTSPP meeting.
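
For illustration only, such a summary could take the shape sketched below; the stage names and DSI labels are placeholders, not the values Keeley will propose.

    # Hypothetical illustration of a data-flow summary with a DSI at each stage.
    # Stage names and DSI labels are placeholders, not agreed GTSPP values.
    PROPOSED_DSI = {
        "received from GTS, no QC":        "level 0",
        "automated QC applied (MEDS)":     "level 1",
        "delayed-mode data loaded (NODC)": "level 2",
        "science centre QC completed":     "level 3",
    }

    def describe(stage):
        """Return a one-line description of the processing state of the data."""
        return f"{stage}: DSI = {PROPOSED_DSI[stage]}"

    print(describe("science centre QC completed"))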

 

Format

 

In the review of GTSPP it was noted that there was confusion, especially early on, about how to place information into some parts of the GTSPP format. This confusion generated a considerable number of problems for data senders and receivers. Some of the data structures used by GTSPP also appear in the Argo data formats. A "user manual" is being written for the Argo format, and this will explain in as much detail as necessary what the data structures are for, how to place information in them, and how they may be used. GTSPP should adopt these descriptions where they apply.

 

There was also the issue (see the discussion later about data access) that GTSPP should provide data in more than a single format. In particular, it was remarked that some clients would want to combine data collected in the Argo program with other profile data. For this reason, GTSPP should also provide data in the Argo data format so that users can more easily combine the data as they choose. The discussions about data access later explain what is and will be the GTSPP strategy. NODC and IFREMER will exchange correspondence about this issue, with results to be discussed at the next meeting.
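
As a rough sketch of what serving a GTSPP profile in an Argo-style netCDF structure might involve, the example below uses the netCDF4 Python library; the dimension and variable names (N_PROF, N_LEVELS, PRES, TEMP, PSAL) follow common Argo usage but are assumptions here, not the agreed GTSPP-to-Argo mapping.

    # Rough sketch: writing one temperature/salinity profile into an Argo-style
    # netCDF file. Dimension and variable names follow common Argo usage but are
    # assumptions here, not the agreed GTSPP-to-Argo mapping.
    from netCDF4 import Dataset

    def write_profile(path, lat, lon, pres, temp, psal):
        nc = Dataset(path, "w", format="NETCDF3_CLASSIC")
        nc.createDimension("N_PROF", 1)
        nc.createDimension("N_LEVELS", len(pres))
        for name, values, dims in [
            ("LATITUDE",  [lat],  ("N_PROF",)),
            ("LONGITUDE", [lon],  ("N_PROF",)),
            ("PRES",      [pres], ("N_PROF", "N_LEVELS")),
            ("TEMP",      [temp], ("N_PROF", "N_LEVELS")),
            ("PSAL",      [psal], ("N_PROF", "N_LEVELS")),
        ]:
            var = nc.createVariable(name, "f4", dims)
            var[:] = values          # one profile per file in this toy example
        nc.close()

    write_profile("profile.nc", 47.5, -5.2,
                  pres=[0.0, 10.0, 20.0],
                  temp=[14.2, 14.1, 13.8],
                  psal=[35.3, 35.3, 35.4])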

 

Some users are interested in the data only, and so there was some discussion about the minimum amount of metadata that should always be provided even when not specifically requested. For example, since GTSPP rarely modifies bad data, and simply flags them, it is necessary to send the quality flags with the data so that a user can know when the GTSPP programme advises that values be ignored. Keeley will prepare a document that recommends the minimum information and present it at the next meeting.
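
The sketch below illustrates why the flags must travel with the data, assuming the usual GTSPP convention in which a flag of 1 marks a good value and higher values mark suspect or bad ones (an assumption to be checked against the GTSPP documentation).

    # Illustrative sketch: GTSPP flags bad values rather than removing them, so
    # users must apply the flags themselves. Flag semantics assumed here:
    # 1 = good, larger values = suspect or bad.
    def good_values(pairs, flags, accept=(1,)):
        """Return only the (depth, value) pairs whose flag is acceptable."""
        return [(d, v) for (d, v), f in zip(pairs, flags) if f in accept]

    profile = [(0.0, 14.2), (10.0, 14.1), (20.0, 99.9)]   # 99.9 is spurious
    flags   = [1, 1, 4]                                   # third value flagged bad
    print(good_values(profile, flags))                    # [(0.0, 14.2), (10.0, 14.1)]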

 

Data Codes

 

At a previous meeting, it was agreed that a central list of the codes used within GTSPP should be maintained on-line. The production of the UOT CD facilitated the assembly of such lists, but these are inevitably applicable only up to the date of issue. MEDS and NODC had agreed to look into placing such information on-line, but neither had found the resources needed to do so. The meeting reiterated the need and requested that MEDS and NODC cooperate to accomplish this by the next meeting. They agreed to do so.

 

Products

 

The review noted that GTSPP had a number of products, including both data and monitoring tools. What was lacking was a scientific product that could be pointed to as largely or exclusively generated using data that have passed through the GTSPP. The programme was likened to a "wholesaler" in that it provided the infrastructure to move data from collector to user and monitored how well the data flow occurred, but had little to show the "consumer". It was, rather, the clients of GTSPP who showed interesting and useful results using data coming through GTSPP. Though credit is given to GTSPP, this did not raise the level of awareness of the programme among users and international projects.

 

A number of ways to raise the general awareness of GTSPP were discussed. One suggestion was an article for a journal, such as EOS. Another was to work with Coriolis to generate a product. A third option is to approach our science partners and see what assistance they might bring. The meeting also suggested that the PowerPoint review done by Keeley, and partly seen at the meeting, might be a vehicle to explain the accomplishments and advantages of GTSPP.

 

Finally, it was noted that the existing web sites that deal with GTSPP are not particularly good, in that the program is either not well explained or presented only as the infrastructure of other programs. NODC remarked that it was redesigning its web site and that part of this would be to work on the GTSPP pages. The target for installing the new pages is mid 2002. Participants suggested that there was a substantial amount of material on the UOT CD that could be used for new web pages about GTSPP. It is also likely that some new material will need to be written. Participants were requested to forward their ideas to Schnebele. In turn, he will provide a layout of the new pages for the next meeting.

 

It was also remarked that these new pages would be the focus for GTSPP information and, as such, links should be provided to the JCOMMOPS coordinator for appropriate reference on those web pages. Other organizations that should be contacted are WMO and IOC. It was also remarked that it might be a good idea to register GTSPP as a name on the Internet. Carval offered to look into this.

 

Data Access

 

Bailey remarked that he knew users who did not know how to go about getting access to GTSPP data even though they were aware of the programme. Participants agreed that this was a problem. Two strategies were discussed, one for real-time data and another for delayed mode.

 

Currently, MEDS provides real-time data on a subscription basis. That is, by contacting MEDS, clients are able to get data three times a week, either for the whole world or subset by area. The data forwarded to NODC on this schedule are also made available to users through an ftp site at NODC. Both the MEDS and NODC files are in the "MEDS ASCII" format.

 

One of MEDS' more recent subscribers is Coriolis, who take the data and make them available to Mercator users. Coriolis prepares a netCDF file of all of the data less than 30 days old and received in the last week, and posts this to an ftp site once a week on Tuesday evening. The files are maintained on the site for at least 60 days. Its main users are members of Mercator, but Coriolis agreed that there was no reason other users could not also have access. They agreed to allow this and to have their ftp site referenced by the new web pages to be built at NODC. MEDS also agreed to reference this site as an alternate format and download mechanism. In return, it was noted that the new NODC pages should also reference the subscription service provided by MEDS.
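
A minimal sketch of the selection rule described above (profiles observed less than 30 days ago and received within the last week), with the thresholds taken directly from the text and the record fields assumed for illustration:

    # Minimal sketch of the Coriolis weekly selection rule described above:
    # keep profiles observed less than 30 days ago and received within the last
    # week. Record field names are assumed for illustration.
    from datetime import datetime, timedelta

    def select_for_weekly_file(profiles, now):
        return [p for p in profiles
                if now - p["observed"] < timedelta(days=30)
                and now - p["received"] <= timedelta(days=7)]

    profiles = [
        {"id": "A", "observed": datetime(2001, 11, 18), "received": datetime(2001, 11, 19)},
        {"id": "B", "observed": datetime(2001, 10, 1),  "received": datetime(2001, 11, 19)},
    ]
    kept = select_for_weekly_file(profiles, now=datetime(2001, 11, 20))
    print([p["id"] for p in kept])   # ['A']: profile B was observed more than 30 days ago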

 

It should be noted that the netCDF files offered by Coriolis also include full vertical resolution French XBT data. These data are later issued as BATHYs, but the corresponding BATHYs are removed from the files obtained from MEDS so that duplicates do not appear.

 

Coriolis also remarked that they were in the process of developing software to make daily files available, constructed in a similar way to the weekly files, as some clients had expressed a desire for more frequent updates. These files would also be made generally available.

 

Obtaining real-time data via ftp seems to match up well with what most users of such data require. Generally these are modeling groups and a regular file download matches their operational needs.

 

MEDS, NODC and IFREMER agreed to exchange correspondence to standardize how these various real-time data services should be referenced on respective web sites. Such references could be in place (at least at MEDS and IFREMER) very soon and certainly before the next meeting.

 

In the course of acquiring the real-time data from MEDS, Coriolis has made comparisons of the data available from MEDS and from Meteo France, and a paper is in preparation. IFREMER offered to provide a copy of this paper to add to the documentation of GTSPP. The paper should be ready in both French and English by the end of January 2002.

 

Concerning access to delayed-mode data, the most immediate answer would seem to be the provision of a DODS server for this purpose at NODC by mid 2002. The server is presently up, but not yet in public operation. Having such a server meets one of the requests from the WOCE DPC and provides greater access to GTSPP data as well. The file format will be netCDF. MEDS is also in the process of putting up a DODS server for its data holdings. The server has been installed, but it is not generally available yet.
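
For a user, access through such a server could look like the sketch below, assuming a hypothetical server address and variable name, and a netCDF client library built with DODS (OPeNDAP) support.

    # Illustration only: reading from a DODS (OPeNDAP) server with the netCDF4
    # Python library, assuming it was built with DAP support. The URL and the
    # variable name are hypothetical.
    from netCDF4 import Dataset

    url = "http://example.nodc.noaa.gov/dods/gtspp/profiles"   # hypothetical address
    ds = Dataset(url)                  # opens the remote dataset without downloading it all
    temps = ds.variables["TEMP"][:10]  # the subset request is resolved by the server
    print(temps)
    ds.close()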

 

IFREMER noted that it has a web server that allows the public to construct a query based on many attributes and to interactively build the resulting data set. They also noted that at times the size of the resulting query caused problems by overloading the server. NODC also has experience with this from earlier web-based data access software. MEDS likewise remarked that, especially for certain of its data holdings, large requests were more common than small ones. Until some reliable way can be developed to gauge the size of a query before it is executed, putting up such servers remains a problem.
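
One common safeguard, sketched below under assumed table and column names (SQLite is used only to keep the example self-contained), is to run an inexpensive count before the full retrieval and to refuse or defer oversized requests; this is not a description of any centre's actual system.

    # Sketch of one safeguard against oversized queries: estimate the result
    # size with a cheap COUNT(*) before running the full retrieval. Table and
    # column names are assumed; SQLite keeps the example self-contained.
    import sqlite3

    MAX_STATIONS = 50000   # arbitrary illustrative ceiling

    def run_bounded_query(conn, where_clause, params):
        n = conn.execute(f"SELECT COUNT(*) FROM stations WHERE {where_clause}",
                         params).fetchone()[0]
        if n > MAX_STATIONS:
            raise ValueError(f"query would return {n} stations; please narrow the request")
        return conn.execute(f"SELECT * FROM stations WHERE {where_clause}", params).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stations (id TEXT, lat REAL, lon REAL)")
    conn.execute("INSERT INTO stations VALUES ('A', 47.5, -5.2)")
    print(run_bounded_query(conn, "lat BETWEEN ? AND ?", (40.0, 50.0)))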

 

Coriolis serves the contents of its database through a web-based system. At present this covers only T and S data, but they shortly expect to expand it to handle ADCP data. A short-term strategy for GTSPP is that Coriolis could provide the T and S data, while users who also want other kinds of profile data, such as nutrients, oxygen, etc., would be directed to NODC. Both NODC and Coriolis expect to serve their data in the same netCDF structure as the Argo data found on the global Argo servers. There was some question whether all of the information in the GTSPP format could be placed in the Argo formats. NODC and Coriolis agreed to explore this, though the expectation is that the answer is yes.

 

A point brought up in the review of GTSPP was that though GTSPP does seem to have stimulated and improved the flow of real-time data, receipts of delayed-mode data are generally as slow as ever. This is not strictly true, since data from operational programs such as SEAS in the US arrive with only a few months' delay. It is expected this will also be true for the delayed-mode Argo data.

 

There were two strategies discussed to address this problem. The first concerns delayed-mode exchanges within the GTSPP itself. MEDS and NODC have initiated more frequent exchanges (monthly for files going to NODC and bi-monthly for files going to MEDS). Though this is not yet operating as smoothly as it should, it meets MEDS' needs quite well. In this case the files are smaller and so more easily integrated into routine processing, rather than having to deal with a large file once a year (or even less frequently). It also has the advantage of reducing the delay before delayed-mode data from Canada reach the GTSPP archives. NODC and IFREMER agreed to talk about adopting such a strategy as well. The final decision will be made on the basis of how well more frequent exchanges can mesh with operations at both centres.

 

There are also still many data collected by other nations that take substantial time to reach the GTSPP. In this respect, certain data centres can act as "funneling agencies" for groups of countries as well as for their own. Kouznetsov said that RIHMI would be interested in discussing how to connect to GTSPP in a more efficient way. Keeley agreed to draft a letter to go to data centres (recommended centres agreed by discussion among current GTSPP participants) to seek additional help in speeding up the exchange of delayed-mode data to GTSPP and in adopting GTSPP QC procedures more widely. This letter, however, will not go out until a new Project Plan (see below) has been written for GTSPP. It was remarked that data sometimes arrive at data centres with restrictions on how soon they may be made available. It is important to respect these restrictions, but where they do not apply it should be possible to improve the timeliness of delayed-mode data acquisition by GTSPP.

 

It was also suggested that since Argo had revised its target for delivery of delayed-mode data to the Argo data system to five months after data collection, this should also be the target for GTSPP delayed-mode data. Participants agreed to this where possible.

 

High resolution data in real-time

 

Keeley reminded the meeting of certain developments that either already provide, or are shortly expected to provide, full resolution data in the same time frame as the reduced resolution GTS formatted data. Currently, this is the mode of operation for Argo, with nearly 1000 TESACs reported each month from some 300 floats. The US SEAS program is developing software that will permit it to send full resolution XBT data ashore rather than sending only the BATHY and providing full resolution on diskettes some months later. It was noted above that, within Coriolis, IFREMER already handles French XBT and CTD data in this manner. Since XBT data appear on the GTS at reduced resolution, and TESACs from Argo floats have questionable points removed, the more complete data in fact reside elsewhere. MEDS asked why it should continue its real-time operations at the same resource level.

 

Coriolis responded that it was relatively early in its operations and that it wanted MEDS to continue, at least for some time, to ensure that all data that were supposed to be sent to the GTS in fact were. At some future date, the data sets from sources such as the Argo GDACs and MEDS would be reviewed to see if it made sense for MEDS to reduce its operations. The same strategy was suggested for SEAS XBT data as well. Participants were generally in favour of this, keeping in mind that a yearly review would be valuable. In addition, it was remarked that MEDS can use its tests of the quality of data appearing on the GTS as a way to gauge the implementation of agreed automated QC procedures (as in Argo), and the effectiveness of the tests.

 

With respect to Argo data in particular, it was recommended that where data access links are provided, users should be directed to the Argo GDACs as the source of the best copy data from Argo. An explanation of what was available through the GTS should also be provided so that users can make a sensible choice of which data to request.

 

With respect to the SEAS XBT data, NODC will correspond with Coriolis and OAR (Gary Soneira) to get more details of how the software development is proceeding, what QC procedures will be employed, and to discuss how these higher resolution data can be sent to the Coriolis data files for weekly and daily serving to users.

 

Email from Rick Bailey

 

Rick raised a number of issues in his email that were not covered by the above discussions.

 

Concerning the separation of data that have passed through Science Centre QC from those that have not, this is already to be part of the Version 3 CD release of UOT for WOCE. However, NODC remarked that it should build in the capability to select on this criterion in its DODS implementation.

 

On getting greater feedback from users, Rick suggested a few ideas of what might be done. The idea of a survey was not popular, but some interaction, perhaps through a redesigned web site, could be pursued. It was also suggested that a yearly GTSPP report should be generated and used as a kind of newsletter to potential users. It was remarked that the GTSPP brochure is dated and needs revision. IFREMER offered to provide a redraft by the next meeting. NODC agreed to contact the new IPRC in Hawaii to see what their requirements might be and to inform them of what GTSPP might be able to provide to them.

 

It was suggested that a test data set be prepared to verify that software claiming to operate according to GTSPP principles is in fact doing so. Considering the suggested initiative to broaden support for GTSPP among other data centres, such a test set was considered a good idea. Keeley was asked to work with Bailey to prepare such test data.
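
For illustration, such a test set could pair each synthetic profile with the flags a compliant implementation is expected to produce; the single-spike case and the expected flag values below are hypothetical, not an agreed GTSPP test suite.

    # Hypothetical illustration of a QC test case: a synthetic profile with a
    # known defect (a temperature spike) paired with the flags a compliant
    # implementation is expected to produce. Expected flags are illustrative.
    TEST_CASES = [
        {
            "name": "single temperature spike",
            "depths": [0, 10, 20, 30, 40],
            "temps":  [14.2, 14.1, 25.0, 13.9, 13.8],   # 25.0 is the injected spike
            "expected_flags": [1, 1, 4, 1, 1],          # the spike should be flagged bad
        },
    ]

    def check(qc_function):
        """Run a candidate QC routine against the test cases and report mismatches."""
        for case in TEST_CASES:
            got = qc_function(case["depths"], case["temps"])
            status = "ok" if got == case["expected_flags"] else f"MISMATCH: {got}"
            print(f"{case['name']}: {status}")

    def naive_spike_qc(depths, temps):
        """Toy QC routine for demonstration: flag any temperature above 20 C as bad."""
        return [4 if t > 20.0 else 1 for t in temps]

    check(naive_spike_qc)   # prints: single temperature spike: ok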

 

There was a general discussion about the comprehensiveness of the GTSPP archives. Bailey noted that the version of WODB98 which he used for the UOT Network Review contained more data than the GTSPP archives. This is true, but the next release of the WODB will contain the GTSPP archives as well. The general question was whether the GTSPP archive or some other will be the most comprehensive. When GTSPP began, the intention was to deal with the historical data as well as the data from 1990 onward. The accumulation of the historical data was as big a job as building GTSPP, and through Syd Levitus' interests that part became the GODAR project. Syd continues to search out historical data in danger of loss or otherwise not contained within electronic archives. GTSPP should be doing its best to ensure that currently collected data are quickly and routinely brought to the world archives, so that a future GODAR will not again be required. In this respect, GTSPP and GODAR are cooperative projects that both contribute to the comprehensive archive. The main difference is that data are handled differently by Levitus than by GTSPP. It would be a substantial amount of work to reconcile these differences, and it is not clear where the resources to do this could be found.

 

Bailey raised the point of drafting a new Project Plan. This was generally agreed to be important, and a time frame of a draft for the next meeting was agreed. Bailey offered to help, as did Schnebele. Keeley will work with both to get this done. Bailey also remarked that the present plan is not written in a way that he can use in seeking resources. It seems a good strategic idea to change the layout of the plan so that there are sections that science centres can use, and perhaps other sections that data centres can use, when they are seeking resources.

 

The meeting ended at 16:00 on 20 November. The chair thanked those who had attended. The next meeting is scheduled for March 2002 in Hobart, in conjunction with the WOCE DPC meeting.

 


 

Annex 1: List of Participants

 

Bob Keeley, MEDS, Canada

Greg Reed, IOC

Kurt Schnebele, NODC, USA

Alexander Kouznetsov, RIHMI, WDC-B, Russia

Natalia Puzova, RIHMI, WDC-B, Russia

Catherine Maillard, SISMER, France

Thierry Carval, SISMER, France

Loic Petit de la Villeon, SISMER, France

Yvette Raguenes, SISMER, France

Steve Piotrowicz, USA

Etienne Charpentier, JCOMMOPS

Mathieu Belbeoch, JCOMMOPS


 

Annex 2: Version 3 CD Review of Actions from Washington meeting

 

1. From item 2a. Review of scientific QC and data status.

 

Norm will work with Melanie to use the SCP$ parameter in the surface codes group to correctly indicate the actions performed on the data from marginal seas returned from Scripps.

 

Being implemented in new db

 

2. From item 2a. Review of scientific QC and data status.

 

Melanie and Yeun-ho will ensure that profiles that have not passed through AOML QC are suitably identified in the GTSPP archive at NODC.

 

Being implemented in new db

 

3. From item 2c. Review of scientific QC and data status.

 

Melanie agreed to split the real-time from the delayed mode in the Brest submissions and to send them to MEDS. Bob agreed to look at the real-time data to see if there were any that MEDS had not received by other channels.

 

The new db will be changed; it can load the 19,000 profiles and can split the real-time data. Work is underway to split out the real-time data. The plan is to send the real-time data to Bob to check against his inventory. Other profiles will be loaded into the new CMD if not already there.

 

4. From item 2c. Review of scientific QC and data status.

 

Melanie and Loic will investigate profiles originating at NODC and being returned from Brest having fewer depth value pairs than sent.

 

Has to do with the deepest values at the bottom of a profile being removed at Brest. This issue will be worked between IFREMER and NODC. NODC did NOT replace existing profiles with the 'shorter' versions returned from Brest.

 

5. From item 2c. Review of scientific QC and data status.

 

NODC will split the incoming files so that profile types of temperature and salinity will be processed immediately into the GTSPP database, and other profiles will be set aside for inclusion in the new GTSPP archive when it is built.

 

Being implemented in new db

 

6. From item 2d. Status on receipt of data from navies and other sources.

 

Melanie will look into submissions from the Japanese to determine the vertical resolution of the data. In the past some had been very low resolution data rather than the high resolution expected.

 

Syd picked up data from the Far Seas Fisheries, but these are still at standard levels. Rick suggests that higher resolution data exist. Kurt will follow up by checking with Scripps and Levitus to learn more about the sources they have uncovered (if any).

 

7. From item 2d. Status on receipt of data from navies and other sources.

 

Kurt will have someone check into data submissions from NAVOCEANO to see how much of it duplicates data already held at NODC.

 

Found some data not yet seen, but not a lot. We still don't have the complete answer. Some recent US Navy AXBT data are on the GTS and they have assured us that full resolution will come. But we don't yet have a definitive answer on which Navy ships record higher resolution data and whether they can submit them for public use.

 

8. From item 4: Interaction with other programs

 

Kurt agreed that NODC would act quickly to write a proposal on how NODC would act as the long term archive for Argo, so that it could be reviewed by the Argo data management committee and participants of GTSPP, and that it would meet both Argo and GTSPP objectives.

 

Charles, Sylvie and Bob to pursue as part of Argo.

 

9. From item 4e: Others (SEAS, JAFOOS)

 

It was suggested that using the platform's WMO identifier and time stamp was a more widely applicable unique tag in SEAS software. Gary agreed to take this suggestion back to the software developers.

 

No decision yet by SOOP.

 

10. From item 4e: Others (SEAS, JAFOOS)

 

Gary agreed to provide Bob with more details about other kinds of data that could be handled in SEAS software, and Bob would write a proposal about how GTSPP should handle this. The draft should be circulated by September, with finalization as soon as possible after to permit time for software changes if required.

 

Bob to pursue Gary for this.

 

11. From item 4e: Others (SEAS, JAFOOS)

 

Bob agreed to approach JAFOOS to see if they would be willing to host the entire GTSPP data on their server.

 

DODS server for the Indian Ocean only.

 

12. From item 5ai: CSIRO codes are different from WOCE codes

 

It was agreed that NODC should retrieve the data for one or two of the stations noted with the problem as they will for the V3 CD. The results should be sent to Ann to check that all of the relevant information is present.

 

Resolution is part of revamping the GTSPP database and software. CSIRO codes that were in addition to the 'WOCE' codes will be included when the v3 profiles are copied to the CD.

 

13. From item 5aii: Duplicates on CD

 

Melanie confirmed that this was true. NODC will correct this for V3.

 

NODC has this correction as part of new software

 

14. From item 5aiii: Faulty FORTRAN code created

 

We need to be sure that the FORTRAN code generated did meet FORTRAN rules. It may be that some of the variable names in the netCDF structure will need to be changed. Charles will work with Rick to sort this out.

 

A faulty piece of software from UCAR. Nathan to fix this in general. Bob to send email.

 

15. From item 5aiv: Mixture of data on V2 with WOCE QC and without

 

This is true but was not a mistake. NODC will build V3 containing all available data from both WOCE years and after. The data not having passed through scientific QC will be placed in a separate directory so that there could be no confusion about the level of QC carried out. This alternate directory tree will include data from outside the WOCE period and not scientifically QC'ed as well as data from marginal seas (see item 1 above) and any other data that could not meet the cutoff for scientific QC as indicated in the timetable.

 

Being implemented in new CD

 

16. From item 5aiv: Mixture of data on V2 with WOCE QC and without

 

NODC agreed to produce some statistics by year of the number of WOCE QC'ed and non-WOCE QC'ed data on V2. This would also be used on the V3 CD with updated figures.

 

Kurt provided copies at the meeting.

 

17. From item 5av) Fall rate correction

 

DPC was of the opinion that though it may not be possible to be sure about certain corrections needed, if a reasonable guess on the probe type could be made, and this used as the basis of a correction, the resulting profile may still be improved in terms of the uncertainty of the depth. Bob will contact GTSPP science centres for their opinion.

 

Corrections will be made after the QC'd files come back to NODC. Kurt will send an email to be very clear about what we will do and what we will not. Corrections will be applied when profiles are pulled to create the netCDF files for v3.

 

18. From item 5av) Fall rate correction

 

Bob agreed to co-ordinate tracking down probe information using the resources of both science centres and data centres of GTSPP.

 

Melanie is doing this.

 

19. From item 5av) Fall rate correction

 

JAFOOS and CSIRO did not think re-QC of the data would be necessary after the depth correction was made. Bob will contact other science centres of GTSPP about this.

 

Done

 

20. From item 5av) Fall rate correction

 

Bob agreed to look at the incidence of the use of old or new fall rate equations in the real-time data stream since the code change in November 1995.

 

Reported at the meeting.

 

21. From item 5av) Fall rate correction

 

Bob agreed to write a document that lays out the strategy for correcting the fall rates. He will circulate this to GTSPP members for comment.

 

Done

 

22. From item 5avi) Some records on V2 had non-WOCE records generated after WOCE QC was carried out.

 

This will be corrected for V3. Discussions with Ann suggested that she would be willing to modify her software to write the necessary record so that the NODC code need not be altered and the problem is still fixed.

 

We believe this problem is fixed by the new load software, and CSIRO need not modify their procedures.

 

23. From item 5avii) Some profiles showed a zero as the deepest value.

 

Melanie reported that this was true. It was found that null values were present at the last depth for several data sets received from CSIRO. Melanie and Ann will resolve this problem.

 

Found the cause. Will be fixed.

 

24. From item 5aviii) Need an articulation of rules for handling data returned from Science centres

 

NODC will draft rules for deletes/replacements in 2 weeks for circulation

 

NODC is revamping the load software for applying Science Center QC results. In early January we will circulate a list of proposed 'rules' for how to handle certain unusual situations that have come up in the past. Responses to these proposed 'rules' will guide our future update procedure.

 

25. From item 5b: Review V2 CD content

 

Bob will contact science centres directly seeking contributions and comments on the layout of the CD. Bob will assemble a new layout for the CD and circulate this to participants.

 

Bob will do this with the issue of the new CD layout.

 

 

Annex 3: Action List

 

1.      Keeley to consult DPC chairs about inclusion of non-science QC'ed data on CDs.

2.      Keeley to send layout of V3 CD as soon as possible

3.      NODC, Ifremer to provide a plan for a unique GTSPP data tag by March 2002

4.      Keeley to write DSI for GTSPP data flow conditions by March 2002

5.      NODC, Ifremer to discuss data access

6.      Keeley to describe minimum content to be distributed with data.

7.      MEDS and NODC to put up code lists on a website

8.      Participants to send web site ideas to Kurt

9.      Kurt to provide new GTSPP web site proposal

10.  Carval to look into GTSPP as a domain name on the WWW

11.  Coriolis to make weekly and daily data files available from their web site

12.  Other GTSPP web pages to reference Coriolis sites and subscription service of MEDS

13.  Ifremer to make available the document on differences between MEDS and Meteo France GTS files by January 2002

14.  NODC and Ifremer to examine if GTSPP data can be made available in Argo netCDF formats

15.  NODC and Ifremer to look at more frequent data exchanges

16.  Keeley to send letter to data centres asking for delayed mode data help after new PP written

17.  NODC, Ifremer, OAR to discuss how SEAS developments for sending full resolution data on GTS are progressing.

18.  Ifremer to draft revised GTSPP brochure

19.  NODC to contact IPRC to see what interaction is possible with GTSPP

20.  Keeley and Bailey to prepare test data to exercise QC software

21.  Keeley, Schnebele, Bailey to revise project plan