3.1 Near-real-time Data Acquisition
3.2 Delayed-mode Data Acquisition



5.1 Real-time Quality Control
5.2 Data Center Quality Control
5.3 Scientific Quality Control
5.4 End-to-end and Long-term Issues
5.5 Future Plans


6.1 Marine Environmental Data Service (MEDS) of Canada
6.2 Australian Oceanographic Data Center (AODC)
6.3 U.S. National Oceanographic Data Center (NODC)
6.4 Conclusions



8.1 Schedule
8.2 Status of Hardware/Software






The ad hoc meeting was opened on Monday 23 January 1989 by Mr. Gregory Withee, Director of the U.S. National Oceanographic Data Center (NODC). He welcomed the participants (Appendix A), who represented both oceanographic data centers and the oceanographic research community, and then briefly reviewed the background and purpose of the meeting.

The Australian Oceanographic Data Center (AODC), the Marine Environmental Data Service (MEDS) of Canada, and NODC initiated discussions on a global temperature-salinity (T-S) pilot project during an Integrated Global Ocean Services System (IGOSS)-International Oceanographic Data Exchange (IODE) data flow meeting held in Ottawa, Canada, in January 1988. The discussions continued at the June 1988 workshop on ocean data files (Appendix C, citation 1) convened in Washington, D.C., and hosted by NODC and the Environmental Research Laboratory (ERL). The pilot project was formally recognized and endorsed at the November 1988 meeting of IGOSS. It is planned to seek formal endorsement by IODE at the November 1989 meeting in Paris, France.

An objective of this project is to develop, maintain, and make readily available a comprehensive, up-to-date, and quality-assessed data base of oceanographic T-S data through a cooperative effort of the participating centers. It is anticipated that other data centers will join in this effort.

The purposes of the present meeting were (1) to discuss the potential scope (data types, spatial and temporal coverage, etc.) of the project, (2) to assess the present status of capabilities at the participating centers, (3) to decide on a tentative course of action, division of responsibilities, and schedule, and (4) to decide what specific actions were necessary in preparation for the November 1989 IODE meeting.

Mr. Ronald Wilson, Director of MEDS, was unanimously elected chairman of the meeting by the participants. The provisional agenda was revised (Appendix B) and adopted. Background documents (Appendix C) were distributed to the participants.


The Global Temperature-Salinity Pilot Project (GTSPP) is being initiated to promote, improve, and standardize the T-S data management mechanisms which presently exist in both the IODE and IGOSS systems. This is to be accomplished through a cooperative effort to acquire as much of the available T-S data as practical from both real-time and delayed-mode data sources, assess their quality, and to make them available in a timely fashion to the user community. Although such a system is expected to have many local or regional research and non-research applications, this pilot project will focus primarily on satisfying the requirements for T-S data of global climate change research activities. User feedback will be encouraged to aid in evaluation and improvement of the system. Temperature and salinity are the ocean parameters targeted by this project because they are the primary state variables that influence ocean dynamics. However, the GTSPP should serve as a prototype for future systems to manage data on other important ocean parameters, such as nutrients.

(A Proposed IGOSS-IODE Program)

1. To create a timely and complete data and information base of ocean temperature and salinity data of known quality in support of the World Climate Research Program (WCRP) and of national requirements.

2. To improve the performance of the Intergovernmental Oceanographic Commission (IOC)/IODE and World Meteorological Organization (WMO)/IOC IGOSS data exchange systems by actively seeking data sources, exercising the data inventory, data management, and data exchange mechanisms as they are intended to work and recommending changes where necessary to meet national and international requirements.

3. To disseminate, through a widely distributed monitoring report produced on a regular basis, information on the performance of the IODE and IGOSS systems.

4. To improve the state of historical data bases of oceanographic temperature and salinity data by developing and applying improved quality control systems to these data bases, and to improve the completeness of these data bases by digitizing historical data presently in analog or manuscript form and by including digital data not presently at a World Data Center (WDC).

5. To distribute copies of portions of the data base and selected analyses to interested users and researchers.

The T-S data requirements (e.g., time and space scales of interest, data types desired, etc.) of WCRP research efforts should be briefly articulated in a GTSPP background document. This document will serve to guide future deliberations, priorities, and decisions regarding system design and data management policy of the GTSPP. It may also serve to justify the GTSPP to the research community and gain their endorsement. A draft should be completed as soon as possible (by mid-February 1989) because it is logically a prerequisite to most other GTSPP activities. NODC volunteered to prepare the initial draft statement of the data requirements for consideration by the participants and others.

An implementation plan for the GTSPP will help bring into focus the project priorities and will help guide and coordinate specific activities of the participants. The document will highlight critical implementation problems that need resolution and present a comprehensive and logically ordered sequence of steps with dated milestones. The plan is a prerequisite to a detailed system design and initiation of any necessary procurement actions. AODC volunteered to prepare an initial draft (by July 1989) for consideration by the participants and others.

It is anticipated that these documents will help promote the GTSPP, in particular at the IODE meeting in November 1989.


It was agreed that within the GTSPP, MEDS would assume primary responsibility for management of the real-time (i.e., IGOSS) data stream and NODC would assume primary responsibility for incorporating delayed-mode (i.e., IODE) data into, and continuously maintaining, the GTSPP data base. The participants divided into two working groups to discuss the real-time (or near-real-time) and delayed-mode data acquisition, respectively.

3.1 Near-real-time Data Acquisition

The participants noted the sustained contribution by the U.S.S.R. of salinity observations to the IGOSS data stream. It was proposed that the U.S.S.R. be encouraged to participate in the GTSPP through WDC-B, Oceanography.

3.1.1 Data Exchange Links

The meeting agreed that MEDS would initially capture and compare the real-time T-S data streams available from U.S. and Canadian (and possibly Australian) sources (Fig. 1). NODC would continue their existing links to the U.S. real-time data sources and would make these data available to MEDS daily. AODC volunteered to investigate whether an additional IGOSS feed to MEDS from Australia is needed to ensure a complete data base. AODC would monitor IGOSS data received at the Australian Specialized Oceanographic Center (SOC) and would provide this information to MEDS for comparison with their IGOSS data. If additional data were available via the Australian SOC, a data exchange link with MEDS would need to be investigated. In that context, it was agreed that the assistance of other IGOSS responsible national oceanographic data centers (RNODCs) should be solicited as well. As experience is gained, through the IGOSS data monitoring effort of MEDS, about the degree of data duplication among the various sources, one or more of these links may prove to be unnecessary and subsequently may be dropped.

3.1.2 Data Exchange Formats

It was recommended that participating data centers in the GTSPP adopt the Binary Universal Format for data Representation (BUFR) as a standard for exchange of IGOSS operational data with MEDS. This implies that all participating data centers must acquire the capability to read and write BUFR. Since BUFR is relatively new and some details relevant to its use in the transmission of oceanographic T-S data have not yet been resolved, it was the consensus of the participants that a workshop dedicated to the resolution of these format issues is necessary. Such a workshop was tentatively scheduled for July 1989 in Ottawa.

3.1.3 Data Exchange Schedules

It was recommended that MEDS should acquire near-real-time T-S data from the various sources daily. This will require some coordination with, and cooperation from, the operational data sources. MEDS will subject the captured data to standardized quality control (QC) procedures. The resulting quality-controlled data will be made available to NODC on a daily schedule, but with a delay of approximately one week. The purpose of the delay is to allow quality checks in the context of recent history (e.g., the past week). The length of delay is arbitrary and can be modified as required.

The quality-controlled operational data will be incorporated into the GTSPP data base as they are received from MEDS. Delayed-mode data will also be incorporated when they are received. On approximately a monthly cycle, NODC will forward Pacific Ocean data to Scripps Institution of Oceanography (SIO) for scientific QC (including both objective and subjective assessments) and the generation of standard analysis products. Data from other regions will be forwarded to other centers (e.g., to the U.S. Atlantic Oceanographic and Meteorological Laboratory [AOML], to the Commonwealth Scientific and Industrial Research Organization [CSIRO] via AODC, and to the Institute of Ocean Sciences [IOS] via MEDS) for similar assessments and analyses. As above, the schedule for transferring the data to the scientific analysis centers may vary among centers and, as experience is gained, may be modified to optimize and streamline the data flow.

Figure 1. Schematic Diagram of Near-real-time Data Flow

In a timely manner, the scientifically quality-controlled data will be returned to NODC where they will be incorporated into the GTSPP data base. They will displace all previous versions as the "best presently available" version of the data.

Since some delayed-mode T-S data are not inserted into the real-time data stream and arrive at NODC long after they were collected, the GTSPP data base will usually contain some delayed-mode data that have not been subjected to all of the QC steps described above. The participants noted the need for periodic re-analysis of the GTSPP data base by the analysis centers to help ensure that all data are uniformly checked.

3.2 Delayed-mode Data Acquisition

The working group on delayed-mode data acquisition addressed the issues of international participation and responsibilities with regard to acquisition of delayed-mode data for the GTSPP data base. A schematic diagram of the envisioned delayed-mode data flow is presented in Fig. 2.

Contemporary delayed-mode data are given a high priority by the GTSPP because they are sought to displace the real-time versions of the data in the GTSPP data base; some of these data may also fill gaps in both time and space, thereby supplementing the coverage afforded by the operational data. In addition, historical T-S data of interest to global climate research programs that are not already in the IODE centers will be actively sought for incorporation into the GTSPP data base. Some historical data identified by this effort may not be in digital form, and resources may need to be allocated to convert these data to digital form. Avenues for obtaining funding (via an IOC trust fund, etc.) to support this enhanced historical data recruitment effort should be explored.

Figure 2. Schematic Diagram of Delayed-mode Data Flow

The participating NODCs (i.e., Australia, Canada, and U.S.A.) would accept primary responsibility for seeking out and acquiring data from countries in regions of special interest to them (Fig. 3). Benefits to be derived from the availability of analysis products should be emphasized to encourage data submission. Other members of IODE should be encouraged to volunteer to participate in this accelerated data acquisition effort. In this regard, the participants suggested that the U.S.S.R. (through its WDC-B) and Argentina (through its RNODC for the Southern Ocean) should be encouraged to assist in data-sparse areas in the Arctic and Atlantic sector of the Southern Ocean, respectively.

The WDCs should be the lead mechanism for interfacing the GTSPP with major international research programs (e.g., World Ocean Circulation Experiment [WOCE], Tropical Ocean and Global Atmosphere [TOGA] program, and Joint Global Ocean Flux Study [JGOFS], etc.) and with the International Council for the Exploration of the Sea (ICES). An assessment of present data holdings in the WDCs (in the form of data distribution maps, etc.) should be made available as an aid in the data recruitment effort.

NODCs and Designated National Agencies (DNAs) should be the lead mechanisms for interfacing the GTSPP with individual principal investigators (P.I.s), especially those who are not included in one of the major international programs. The endorsement of the Committee on Climatic Changes and the Ocean (CCCO) for these data acquisition activities should be solicited.


The communications infrastructure required to support data and information exchanges within the project was discussed.

The recommendation of the meeting was that initially the Space Physics Analysis Network (SPAN) of the U.S. National Aeronautics and Space Administration (NASA) will be used for the exchange of data (both operational and delayed-mode) and information between U.S. and Canadian centers participating in the project. Efforts to establish a SPAN link between NODC and MEDS are already well underway. The meeting acknowledged that for very large volume data exchanges, it may prove to be more practical to utilize magnetic tape rather than a telecommunications link.

Also, the establishment of a SPAN link to Australia was identified as highly desirable and will be further investigated. In the meantime, data exchanges between Australia and the other partners will be via magnetic tape. It is anticipated that international network connectivity will improve. Opportunities for enhanced connectivity presented by networks other than SPAN, such as INTERNET, will be explored.

BUFR is the recommended exchange format for operational data. General Format 3 (GF3) is the recommended format for submission of delayed-mode data and supporting documentation to the GTSPP, but exchange of delayed-mode data between participating centers may be in any mutually agreeable format. Since coordination between centers on details of formats for exchange of both operational and delayed-mode data will be essential to the efficiency of the GTSPP, a workshop dedicated to formats is planned for July 1989 in Ottawa.

The GTSPP will also require considerable day-to-day exchange of electronic mail, and it was decided to use the commercial OMNET system for this purpose since most present participants already have access to this system.


There was much thoughtful discussion of QC issues. However, since assessment of data quality is to some extent subjective and dependent upon the intended application, and since policy decisions on QC issues can have significant practical consequences with regard to system design and resource requirements, it is not surprising that there was not unanimous agreement on all issues.

Most of the issues discussed applied equally to data from the real-time operational data stream and to those submitted in delayed mode.

There was much discussion about the wisdom and practicality of applying reversible versus irreversible "corrections" to data:

It was argued by some that in unambiguous cases, irreversible corrections should be applied. Others felt that even "experts" can make mistakes regarding the "unambiguous" cases and urged that only reversible corrections or flags be applied to the data.

In the case of suspicious data values that were nonetheless within the limits of possibility, it was agreed that flags should be attached to the data. When practical, the cause of the suspicious value should be determined and corrective measures implemented at the source.

It was also agreed that wherever QC flags are applied to the data, their meaning should be clear and that all GTSPP QC procedures should be documented.

The participants concluded that flags applied at each QC step in the data flow should be retained so that it will be possible to determine (by examining flags in the GTSPP data base) which QC procedures were performed on the data.
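As an illustration only (not part of the project specification), the flag-retention principle described above might be sketched as follows in Python; all field names, identifiers, and flag values are invented for the example:

```python
# Sketch: retaining a flag from every QC step alongside a profile, so that
# examining the record later reveals which procedures were performed.
profile = {
    "platform": "SHIP1234",          # hypothetical identifier
    "position": (45.0, -140.0),      # latitude, longitude
    "temps": [12.1, 11.8, 9.4],
    "depths": [0, 50, 100],
    "qc_history": [],                # one entry per QC step, never overwritten
}

def apply_qc_step(profile, step_name, flag):
    """Append a flag rather than replacing earlier ones, so the full
    history of checks remains recoverable from the data base record."""
    profile["qc_history"].append({"step": step_name, "flag": flag})

apply_qc_step(profile, "range_check", "pass")
apply_qc_step(profile, "climatology_check", "suspect")

# The complete sequence of checks performed is preserved:
steps = [entry["step"] for entry in profile["qc_history"]]
```

The design choice sketched here is append-only: no step erases an earlier step's verdict, which is what makes the QC trail auditable.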

There was some debate, but no agreement, about the degree of detailed information that should be communicated via flags. There was some concern that too many flags, providing information about subtle distinctions, might be more hindrance than help.

To begin to comprehensively identify and resolve important QC issues and to document the resulting project decisions, it was agreed that there should be a QC manual for the GTSPP (See 5.5 Future Plans, below).

5.1 Real-time Quality Control

MEDS presently captures and quality controls IGOSS T-S data from the northern hemisphere. Their QC procedures involve (1) objective checking of IGOSS message structure and ranges of individual field values and (2) subjective checking of parameter profiles to detect suspicious values. It was noted that flags are presently assigned to an entire IGOSS message. For example, in the case of a BATHY message, a quality flag is assigned to an entire profile rather than to the specific offending point(s) in the profile. The question of whether flags should be applied to individual points or to an entire message for the GTSPP remains unresolved.

MEDS anticipates that for the GTSPP, they will continue their present level of QC with three enhancements. First, some "history" will be added to allow for tracking of ships and spotting erroneous positions. (MEDS presently performs this type of check on drifting buoy data.) Second, MEDS will perform duplication checking to identify and remove all duplications within the "history" of the file. Third, the profile values will be checked against climatology and the profile depths will be checked against bathymetry; suspicious values will be flagged.
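The duplication check mentioned above could, in outline, resemble the following sketch; the key fields (call sign, time, position) and the position tolerance are assumptions, not the MEDS procedure:

```python
# Sketch: identifying duplicate messages within a rolling "history" of
# recently received data, keyed on platform, time, and near-identical position.
def is_duplicate(msg, history, tol=0.01):
    """A message is treated as a duplicate if the same platform reports
    from (nearly) the same position at the same time."""
    for old in history:
        if (msg["callsign"] == old["callsign"]
                and msg["time"] == old["time"]
                and abs(msg["lat"] - old["lat"]) <= tol
                and abs(msg["lon"] - old["lon"]) <= tol):
            return True
    return False

history = [{"callsign": "ABCD", "time": "1989-01-23T00:00Z",
            "lat": 45.0, "lon": -140.0}]
new_msg = {"callsign": "ABCD", "time": "1989-01-23T00:00Z",
           "lat": 45.001, "lon": -140.0}
# is_duplicate(new_msg, history) -> True (same platform, time, position)
```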

A potentially valuable data base, containing up-to-date information about U.S. in-situ operational ocean observing platforms (e.g., buoys, oil platforms, ships, etc.) and their suite of sensors, is presently being developed by the U.S. National Ocean Service (NOS). Such a data base might help to link real-time T-S data to information about the data sources (e.g., by helping to resolve ambiguities in platform identification associated with individual IGOSS messages).

5.2 Data Center Quality Control

Delayed-mode data are potentially more comprehensive, accompanied by more supporting documentation, and of higher quality than operational data, since they may have been subjected to very meticulous analysis by a dedicated P.I. However, the procedures applied to delayed-mode data sets vary with the intended application and among different investigators. Therefore, these data should be subjected to a standard set of quality checks to ensure that they are in fact of good quality. If delayed-mode data fail these quality checks, the data center should contact the originator for clarification. Supporting documentation (e.g., project, P.I., sensor type, processing techniques and technician, etc.) needs to be preserved and "linked" to the data set.

All or part of a delayed-mode data set may "duplicate" versions of the data obtained earlier by the GTSPP via other sources, such as the real-time data stream. NODC will identify such duplications (or different versions) so that the delayed-mode version can displace the earlier data as the "best presently available." Although the delayed-mode data are to be merged with near-real-time data in the GTSPP data base, it should always be possible to selectively retrieve complete delayed-mode data sets by various criteria (e.g., project, P.I., etc.). This might be accomplished with a directory or catalog containing the searchable high-level attributes and links to the data in the data base.
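The displacement scheme described above can be sketched, purely for illustration, as a versioned archive in which the newest version is the "best presently available" but earlier versions remain selectable; the station key and record contents are invented:

```python
# Sketch: a station is keyed (here by platform and time); a later
# delayed-mode version displaces the earlier real-time one as "best
# presently available" while both remain retrievable by source.
archive = {}   # key -> list of versions, "best presently available" last

def ingest(key, record, source):
    archive.setdefault(key, []).append({"source": source, "record": record})

def best_available(key):
    return archive[key][-1]["record"]

key = ("SHIP1234", "1989-01-23T12:00Z")
ingest(key, {"temps": [12.1, 11.8]}, source="real-time")
ingest(key, {"temps": [12.10, 11.83], "doc": "P.I. calibrated"},
       source="delayed-mode")

# best_available(key) now returns the delayed-mode version, yet the
# real-time version is still present for selective retrieval by source.
```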

5.3 Scientific Quality Control

Scientific QC within the GTSPP is anticipated to follow the model of the SIO QC activities with regard to upper ocean temperature profiles as part of the Joint Environmental Data Analysis (JEDA) Center. In this model, the data are subjected to a two-step quality assessment which includes both objective and subjective checks. First, the data are objectively compared to climatology and flagged to indicate the degree to which they conform and whether they are accepted or rejected for the next step. Then the accepted data are used to produce timely analysis products (e.g., contoured fields in plan view and vertical section). The initial products are subjectively checked for unrealistic features. The data responsible for such features are subjectively reexamined to determine their validity and flagged accordingly.
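A minimal sketch of the objective first step of this two-step assessment follows; the threshold and all numerical values are invented for illustration and do not represent the SIO/JEDA procedure:

```python
# Sketch: step 1 of the two-step assessment -- an objective comparison to
# climatology that accepts or rejects each profile for the analysis step.
def objective_check(profile_mean, clim_mean, clim_std, n_std=3.0):
    """Accept the profile if its mean anomaly is within n_std
    climatological standard deviations; otherwise reject it."""
    anomaly = abs(profile_mean - clim_mean)
    return "accept" if anomaly <= n_std * clim_std else "reject"

profiles = [14.1, 25.0, 13.8]     # profile-mean temperatures (illustrative)
decisions = [objective_check(p, clim_mean=14.0, clim_std=1.0)
             for p in profiles]
# decisions -> ['accept', 'reject', 'accept']
# Step 2 (subjective) would then re-examine the data behind any
# unrealistic features in products contoured from the accepted profiles.
```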

5.4 End-to-end and Long-term Issues

The meeting noted that QC performed at one step in the data flow might discover that QC performed at an earlier step resulted in the assignment of inappropriate flags. Hence a feedback loop and/or mechanism for modifying flags needs to be agreed upon. (Note: removal of the inappropriate flags applied at an earlier step in the data flow might be interpreted to mean that the data were never subjected to the QC of the earlier step.) In this context, an overall GTSPP data manager, who has end-to-end monitoring and coordination responsibility for the project data flow, was recommended by the participants.

It was also noted that with increasing experience and improved climatology, our initial QC tools and procedures are likely to evolve (improve) with time. This means that early data are likely to be judged and flagged using different criteria (e.g., climatology) than later data. This issue needs to be clearly acknowledged and addressed. Furthermore, the need for periodic re-analyses of the older data should be anticipated.

5.5 Future Plans

MEDS volunteered to take the lead in preparing a first draft of a GTSPP QC manual for consideration by the participants. The manual will document individual QC tests and indicate what actions should be taken when data fail each test. It should address both real-time and delayed-mode data. It was agreed that peer review of the manual by representatives of the research community would be desirable and that reviews by the IODE Task Team on Data Quality and IGOSS Operations and Technical Applications (OTA) will also be solicited.


It was agreed that a comprehensive, up-to-date ocean data resource that is readily accessible, that contains data of known (mostly high) quality with adequate documentation, and that supports diverse and complex queries, is an idea whose time has come. Unless otherwise requested, this system would provide the requester with only the "best presently available" version of the data satisfying his selection criteria. This resource does not yet exist. Parts of such a data management system exist, but they are not integrated and hence there is no easy access or means of maintaining consistency. Since this system does not exist, data that might otherwise be available are not. Furthermore, data are being lost through neglect. Joining the existing pieces together into an integrated system, and supplementing them where necessary with new developments, will not be easy. It will require resources. But once such a system is in place, it will be a vital tool for many applications.

Ideally data would be captured by the system as soon as possible after they are observed, their quality would be assessed, and this early version of the data would be quickly available to all who need them (i.e., not just the operational data users). As superior versions (e.g., more complete documentation, more stringent quality assessment, etc.) of these same data were captured by the system, they would displace the earlier versions as the "best presently available". Various functional modules of the system such as those that manage catalogs, inventories, supporting documentation, quality assessment, search and retrieval, etc., would be integrated so that when new or corrected information becomes available, the additions or modifications to the data and information base would need to be applied only once. This would be efficient in that redundancy would be minimized and it would ensure that the contents of the various modules remain consistent. Queries would be easy to formulate and they would be answered quickly and unambiguously. The results of queries would be easily used to selectively retrieve data and/or information of interest.

The participants believe that some significant steps toward such a system are feasible now and should be aggressively pursued. The ideas and goals expressed here are not new. The participating centers have individually been experimenting with new data base management technologies. Therefore, the participants recognize that the task is not trivial and that resources are limited. Collaboration has the advantages of spreading the work load, tapping the collective expertise, and avoiding unnecessary duplication of effort. Furthermore if such a system is ever to be truly comprehensive, international collaboration is essential. The GTSPP provides a framework to begin that collaboration. The following are brief remarks, prepared during the meeting, by the participating centers relating some of their views and experiences.

6.1 Marine Environmental Data Service (MEDS) of Canada

The volumes of data managed by MEDS have grown so much in the last few years that maintenance of the data archives in a commercial data base management system (DBMS) is no longer possible. The increased volumes have meant poorer response times for data requests and larger demands on on-line storage. MEDS' new archiving strategy will focus on on-line inventories which can respond to area, time, and identification searches in a matter of seconds. Data retrievals will be accomplished by passing pointers from the data identified in inventory searches to retrieval software working against the data archives. We envisage the inventories being managed, possibly within DBMSs, with the archives kept in an indexed sequential file structure. We will also require a system that identifies data across the many archives and points down to the inventory above each archive.

6.2 Australian Oceanographic Data Center (AODC)

The AODC has investigated a number of approaches to oceanographic data management problems. Examination of relational data base management system (RDBMS) technology has revealed that using this method exclusively causes a large number of problems. These problems relate to the inability of an RDBMS to manage variable-length observation records effectively, and to the performance inefficiencies that can result from storing hierarchical data structures in relational files.

6.2.1 Variable Length Records

One method of storing variable-length records in an RDBMS is to break each physical record (observation) into a number of smaller data base records, all linked to the header record with pointers. For example, each temperature-depth pair in an XBT observation would be stored as an individual record. This leads to very large overheads, both in storage and processing: before an individual observation can be processed it must be rebuilt, which requires a number of retrievals from disk to access the complete observation. Storing variable-length records is not generally practical using relational data base technology.
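The decomposition and rebuilding overhead described above can be sketched as follows; the field names and values are invented, and Python dictionaries stand in for the relational rows:

```python
# Sketch: one XBT observation decomposed into a header row plus one
# relational row per temperature-depth pair, then rebuilt on retrieval.
observation = {
    "id": 1, "lat": 45.0, "lon": -140.0,
    "pairs": [(0, 18.2), (50, 16.9), (100, 12.4)],  # (depth m, temp C)
}

header_row = {"obs_id": observation["id"],
              "lat": observation["lat"], "lon": observation["lon"]}
pair_rows = [{"obs_id": observation["id"], "depth": d, "temp": t}
             for d, t in observation["pairs"]]

def rebuild(header, rows):
    """Re-assembling the observation means gathering every pair row --
    the per-observation retrieval overhead the text refers to."""
    pairs = [(r["depth"], r["temp"])
             for r in rows if r["obs_id"] == header["obs_id"]]
    return {"id": header["obs_id"], "pairs": sorted(pairs)}
```

Note that a profile of n levels costs n+1 stored rows and as many reads, which is the storage and processing overhead at issue.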

6.2.2 Virtual Data

In many cases the data acquired from the DBMS are not actually stored in the DBMS and must be "created," or calculated. For example, to produce a T250 contour, all data values at 250 meters must be extracted. However, expendable bathythermographs (XBTs) recorded as inflection points may not have a 250 meter value stored, so this value must be calculated from the stored values above and below 250 meters. This creates a "virtual" data value. This data value, together with the other calculated values, can then be contoured. The calculation is an overhead that must be considered when designing a data base system.
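The "virtual" value calculation described above amounts to interpolation between stored levels; a minimal sketch, assuming linear interpolation and invented inflection-point values, is:

```python
# Sketch: computing a "virtual" 250 m temperature from the inflection
# points actually stored for an XBT profile.
def value_at_depth(depths, temps, target):
    """Return the stored value at `target` depth if one exists; otherwise
    interpolate linearly between the bracketing stored levels."""
    for i, d in enumerate(depths):
        if d == target:
            return temps[i]            # stored value: no calculation needed
        if d > target:                 # first level deeper than target
            d0, d1 = depths[i - 1], d
            t0, t1 = temps[i - 1], temps[i]
            return t0 + (t1 - t0) * (target - d0) / (d1 - d0)
    return None                        # target below the deepest stored level

depths = [0, 100, 300, 500]            # inflection-point depths (illustrative)
temps  = [20.0, 15.0, 9.0, 6.0]
t250 = value_at_depth(depths, temps, 250)
# t250 -> 10.5, interpolated between the 100 m and 300 m inflection points
```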

6.2.3 Performance

Storing a single observation as a number of records, as previously stated, adds to the system overhead which has an obvious effect on system or retrieval performance. In tests undertaken using both a hierarchical and relational DBMS, performance figures were not satisfactory. The overheads associated with the DBMS cause these performance inefficiencies.

One solution examined is to store the fixed-length "header" or management information in the rigid DBMS schema. The "header" would contain all details regarding the data including: position, time/date, ship name, country, etc. The "header" would also contain pointers to the actual data, which would be stored in an indexed sequential file. This results in a maximum of only two disk accesses to retrieve one observation. The DBMS header allows rapid, Structured Query Language (SQL)-type retrievals of information that could be used for management purposes. Linking the DBMS and the indexed sequential file allows retrieval of the actual data according to criteria based on any combination of information maintained in the "header" record stored in the RDBMS.
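The header-plus-pointer design described above can be sketched as follows; in-memory Python structures stand in for the DBMS table and the indexed sequential file, and all identifiers and values are invented:

```python
# Sketch: a queryable fixed-length "header" table holding a pointer into
# a sequential data file, so one header lookup plus one data read (at
# most two disk accesses) retrieves an observation.
headers = {                       # DBMS "header" records, keyed by obs id
    101: {"lat": 45.0, "lon": -140.0, "ship": "HYPOTHETICAL-A", "ptr": 0},
    102: {"lat": -30.0, "lon": 150.0, "ship": "HYPOTHETICAL-B", "ptr": 1},
}
data_file = [                     # stand-in for the indexed sequential file
    {"depths": [0, 50], "temps": [18.2, 16.9]},
    {"depths": [0, 50], "temps": [22.0, 20.5]},
]

def retrieve(obs_id):
    """Access 1: read the header record; access 2: follow its pointer."""
    header = headers[obs_id]
    return data_file[header["ptr"]]

# Queries run against the small header table alone; only the pointers of
# matching observations are followed into the bulky data file.
southern = [obs_id for obs_id, h in headers.items() if h["lat"] < 0]
```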

6.3 U.S. National Oceanographic Data Center (NODC)

The goals and objectives of the GTSPP will require a single, all-inclusive, dynamic data base management capability that will permit frequent (daily) updates of new data, replacement data, quality indicators, and references. The participating national data centers must develop this capability if this pilot project is to succeed.

Conceptually NODC views this continuously updated data base as an integrated data base with minimum redundancy managed by relational data base technology. The primary record for each station would contain the latitude, longitude, date and time with a series of pointers to numerous supporting records such as ship, institution, P.I., quality indicators, documentation, etc. and pointers to the data themselves. The data would reside as direct access records on a mass storage device such as optical disk. The success of this system is dependent upon the design of the tracking and inventory subsystems. Utilizing relational database technology, these subsystems must permit rapid responses to queries, retrievals, and updates. The system should have the capability to format all retrieval data and documentation to a customer's specified output format.

In addition to the queries and updates, the system must address other more mundane functions, such as audits, where system activity and data set status are maintained and various management reports are produced. Security needs to be planned to address standards, access control, update authority, system backups, etc. Other functions that must be supported include accounting; interfacing with other data set catalog and information system development activities of NODC, the U.S. National Environmental Satellite, Data, and Information Service (NESDIS), and NASA; data management system documentation; and finally establishment of criteria for evaluating the success of this effort.

6.4 Conclusions

The tasks of incorporating contemporary data and historical data into the GTSPP data base each present a unique set of challenges. They are different enough that it is logical and expedient to devote a separate planning process to each. The consensus was that the contemporary data should receive early emphasis (phase 1) and historical data would quickly follow (phase 2). In this regard, a meeting devoted to historical data issues was tentatively scheduled for June 1990.


7. Project Products and Information

GTSPP analysis and information products should promote use of the GTSPP data base. They should be standardized and distributed on a regular schedule. The plan is to build on present product generation capabilities.

There are two cycles for scheduled products: monthly and annual. Monthly products will be modeled after those presently produced by SIO and MEDS; however, they will be enhanced to include information on the IGOSS monitoring effort, the acquisition of delayed-mode data, and other standard information reports.

Each analysis center will generate monthly analysis products covering its region of special interest. Present and anticipated analysis centers and their probable regions of special interest are presented in Table 1.

Monthly maps showing the following information were discussed by the participants as potential analysis products:

- Spatial distribution of data used in each analysis field
- Sea surface temperature anomalies
- Sea surface salinity anomalies
- Mixed layer depth anomalies
- Vertically integrated upper ocean heat content anomalies
- Dynamic height anomalies
- 300 m temperature anomalies
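Each of the anomaly maps above follows the same convention: the observed monthly field minus the long-term climatological mean for the same month and grid cell. A minimal sketch, in which the grid resolution and field values are made up for demonstration:

```python
import numpy as np

# Illustrative anomaly computation: observed monthly field minus the
# long-term climatological mean on the same grid. The grid shape and
# the synthetic values below are assumptions for demonstration only.
rng = np.random.default_rng(0)
n_lat, n_lon = 18, 36                  # a coarse 10-degree global grid

# Long-term mean sea surface temperature for this calendar month (deg C)
climatology = 15.0 + rng.normal(0.0, 2.0, (n_lat, n_lon))

# This month's observed/analyzed field
observed = climatology + rng.normal(0.0, 0.5, (n_lat, n_lon))

# Positive anomaly = warmer (or higher) than the climatological normal
anomaly = observed - climatology
```

The same subtraction applies whether the field is temperature, salinity, mixed layer depth, heat content, or dynamic height; only the underlying climatology changes.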

Each participating center will generate annual products covering its regions of special interest. Among these products will be maps showing the spatial distribution of data acquired by the GTSPP during the past year, possibly broken down by sensor type, time period (e.g., season), and mode (e.g., operational or delayed-mode). There should also be a product illustrating the total holdings of the GTSPP. It was recommended that global climatology statistics be updated annually.

Opportunities to distribute GTSPP data and/or products on a compact disk with read-only memory (CD-ROM) will be actively pursued. It is anticipated that an initial project CD-ROM will be produced by 1993.

The participants felt that it would be desirable to distribute monthly and annual products and information in a single GTSPP publication series. This matter is to be discussed further at the November 1989 IODE meeting.


8. Implementation

Implementation of the GTSPP was discussed in terms of scheduling project activities and assessing the present status of hardware and software at the participating centers.

8.1 Schedule

The schedule of major project activities identified by the participants is presented in Table 2.

Table 2. Schedule of GTSPP activities


8.2 Status of Hardware/Software

A summary of the status of hardware and software at the participating centers is presented in Table 3.


9. Closure of the Session

The participants thanked the chairman for his excellent leadership, and the chairman adjourned the ad hoc consultative meeting on the Global Temperature-Salinity Pilot Project at 3 p.m. on 25 January 1989.



Roger Bauer
Compass Systems, Inc.

James Churgin
NODC

Philip Hadsell

Douglas Hamilton

Melanie Hamilton

J. R. Keeley

Sydney Levitus

Charles MacFarland

Ronald Moffatt

Christopher Noe

Steven Patterson

Irving Perlroth

Ben Searle

Robert Stone

Peter Topoly
NODC Telemail/USA

Warren White

J. R. Wilson

Gregory Withee



1. Opening Remarks and Election of Chairperson

2. Global Temperature-Salinity Pilot Project
(A Proposed IGOSS-IODE Program) Overview and Goals

3. Data Flow

3.1 Near-real-time Data Acquisition

3.2 Delayed-mode Data Acquisition

4. Communications Infrastructure

5. Quality Control

5.1 Real-time Quality Control

5.2 Data Center Quality Control

5.3 Scientific Quality Control

5.4 End-to-end and Long-term Issues

5.5 Future Plans

6. The GTSPP Data Base

6.1 Marine Environmental Data Service (MEDS) of Canada

6.2 Australian Oceanographic Data Center (AODC)

6.3 U.S. National Oceanographic Data Center (NODC)

7. Project Products and Information

8. Implementation

8.1 Schedule

8.2 Status of Hardware/Software

9. Closure of the Session



1. Final Report of the NODC/ERL Workshop on Ocean Data Files, June 13-14, 1988, National Oceanographic Data Center, Washington, D.C., September 1988.

2. Data Management, Chapter 3 from World Ocean Circulation Experiment Implementation Plan, Vol. I, WCRP, July 1988.

3. Global Thermal Salinity Data Base.

4. Processing Algorithm for Detecting and Replacing Duplicate Observations.

5. Goals of US-Canada Thermal-Salinity Project.

6. Suggested Contents of Monitoring Report.



Figure 1. Schematic Diagram of Real-time Data Flow

Figure 2. Schematic Diagram of Delayed-mode Data Flow

Figure 3a. Regions of Special Interest of AODC and MEDS for Acquisition of Delayed-mode Oceanographic Temperature and Salinity Data

Figure 3b.



Table 1. Potential GTSPP scientific analysis centers and regions covered by their analysis products

Table 2. Schedule of GTSPP activities

Table 3. Status of hardware and software at participating centers



AODC Australian Oceanographic Data Center
AOML U.S. Atlantic Oceanographic and Meteorological Laboratory
BUFR Binary Universal Form for the Representation of meteorological data
CCCO Committee on Climatic Changes and the Ocean
CD-ROM Compact disk with read-only memory
CSIRO Commonwealth Scientific and Industrial Research Organization
DBMS Data base management system
DNA Designated National Agency
ERL U.S. Environmental Research Laboratory
FY Fiscal year
GF3 General Format 3
GKS Graphics Kernel System
GOFS Global Ocean Flux Study
GTS Global Telecommunications System
GTSPP Global Temperature-Salinity Pilot Project
HP Hewlett-Packard
ICES International Council for the Exploration of the Sea
IGOSS Integrated Global Ocean Services System
IOC Intergovernmental Oceanographic Commission
IODE International Oceanographic Data Exchange
IOS Institute of Ocean Sciences of Canada
JEDA Joint Environmental Data Analysis Center
MEDS Marine Environmental Data Service of Canada
NASA U.S. National Aeronautics and Space Administration
NESDIS U.S. National Environmental Satellite, Data, and Information Service
NODC U.S. National Oceanographic Data Center
NOS U.S. National Ocean Service
OTA Operations and Technical Applications
P.I. Principal investigator
QC Quality control
RDBMS Relational data base management system
RJE Remote Job Entry
RNODC Responsible national oceanographic data center
SCOR Scientific Committee on Oceanic Research
SIO Scripps Institution of Oceanography
SOC Specialized Oceanographic Center
SPAN Space Physics Analysis Network
SQL Structured Query Language
T-S Temperature-salinity
TOGA Tropical Ocean and Global Atmosphere Program
VMS Virtual Memory System
WCRP World Climate Research Program
WDC World Data Center
WMO World Meteorological Organization
WOCE World Ocean Circulation Experiment