Please refer to a more recent report (dated Oct 2006; no newer report exists) on the status of the HDSS sonars. Note that the processing and logistical aspects of getting the data into CODAS have not changed substantially since this blog (below) was written. The report is here
While at sea for a 42-day CLIVAR cruise, I adapted CODAS single-ping processing for use with the HDSS 50kHz and 140kHz sonars on the Revelle. During this process I discovered that
- the forward two beams of the 50kHz instrument are dead
- beam 3 of the 140kHz instrument is essentially dead.
These beams are all reputedly going to be replaced in the fall of 2005.
Data acquisition is done on Macintosh OS9 machines, one per instrument. These computers were unable to share their data in any generally useful way, so an OSX machine was added in late 2005 to bridge the gap between the OS9 Macintoshes and other operating systems. An attempt was made to provide NFS, samba, and network access to the raw data files as they were generated. The raw data include attitude, UTC timestamp, and position, but (as reported below) are poorly documented, and the ancillary data contain bugs.
Matlab routines were provided (by Jody Klymak) which read the raw data, calculate beam velocities, and transform them into earth coordinates.
I took these Matlab routines and altered their output data structures to match the single-ping data structures used in processing other fully-navigated single-ping Doppler current data (specifically, VmDAS ENX files).
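The transformation the Klymak routines perform is the standard Janus ADCP geometry: opposing beam pairs yield horizontal and vertical velocity components, which are then rotated into earth coordinates using the measured attitude. Below is a minimal Python sketch of that standard transformation; the beam angle, beam numbering, and sign conventions are illustrative assumptions, not the actual HDSS geometry or the Matlab code described above.

```python
import numpy as np

BEAM_ANGLE_DEG = 30.0  # typical Janus beam angle; the actual HDSS angle may differ

def beam_to_instrument(b1, b2, b3, b4, theta_deg=BEAM_ANGLE_DEG):
    """Four-beam Janus solution with opposing pairs (1,2) and (3,4).

    u is along the beam 1-2 axis, v along the beam 4-3 axis, w vertical.
    Beam numbering and signs here are illustrative, not the HDSS convention.
    """
    th = np.radians(theta_deg)
    u = (b1 - b2) / (2.0 * np.sin(th))
    v = (b4 - b3) / (2.0 * np.sin(th))
    w = (b1 + b2 + b3 + b4) / (4.0 * np.cos(th))
    return u, v, w

def rotate_to_earth(u, v, heading_deg):
    """Rotate instrument horizontal velocities into east/north components.

    heading is clockwise from north, with v pointing forward; the pitch and
    roll rotations applied in real processing are analogous and omitted here.
    """
    h = np.radians(heading_deg)
    east = u * np.cos(h) + v * np.sin(h)
    north = -u * np.sin(h) + v * np.cos(h)
    return east, north
```

For example, a purely horizontal 1 m/s flow along the beam 1-2 axis produces radial velocities of +/- sin(30°) = +/- 0.5 m/s on that pair and nothing on the other, and the solution recovers u = 1, v = w = 0.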
In the end I was able to run the HDSS data through the CODAS "mill", but the results were discouraging. Because of the two-beam solutions, the horizontal velocities contain large artifacts due to the vertical velocities associated with heave. We think the measured velocities themselves have a bias at low speeds, but this is still a point under discussion.
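To see why heave contaminates the horizontal velocities, note that when a beam's opposing partner is dead, the only way to get a horizontal component from the surviving beam is to assume the vertical velocity is zero; any real w then leaks into the horizontal estimate scaled by 1/tan(theta). The following Python sketch works through the numbers under an assumed 30° beam angle (not necessarily the HDSS geometry):

```python
import numpy as np

THETA = np.radians(30.0)  # assumed beam angle from vertical; illustrative only

def single_beam_u_estimate(b1, theta=THETA):
    """Estimate u from one beam, forced to assume w = 0 because the
    opposing beam is dead. Illustrative geometry, not the HDSS code."""
    return b1 / np.sin(theta)

# True velocities: zero horizontal flow, 0.5 m/s of heave-induced w.
u_true, w_true = 0.0, 0.5

# The radial (along-beam) velocity the surviving beam actually measures:
b1 = u_true * np.sin(THETA) + w_true * np.cos(THETA)

# The w = 0 assumption maps the heave into a spurious horizontal velocity
# of w_true / tan(theta), i.e. about 0.87 m/s for 0.5 m/s of heave.
u_est = single_beam_u_estimate(b1)
```

With a 30° beam angle the contamination factor 1/tan(30°) is about 1.73, so even modest heave produces horizontal artifacts much larger than typical ocean velocities, consistent with the discouraging results described above.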
My conclusion was that the 50kHz instrument could be used for ocean velocities only if the barotropic component came from elsewhere, specifically the NB150.
This web page ("blog") is a short collection of notes, observations, and emails relating to the HDSS sonars, made available for anyone who wants to know what they're getting into (or at least what I got into) if they decide to process HDSS data using CODAS software.
I am adding the software I wrote to the CODAS suite. If you want to process HDSS data with CODAS software, follow the instructions as if you were getting ready to process ENX files, but this is one step harder. This software is experimental, and I don't have the time or the mandate to bring it up to the level of other CODAS processing code. Read the pages here, and the "reports", to get an idea of what to expect.
If you have questions or comments, I will attempt to address them.
Jules Hummon May 12, 2005
Two more formal reports include figures and more concise wording. They discuss
To process HDSS data using CODAS, you must
- know how to process averaged data (calibrations, editing, discerning trends and good vs. bad data)
- know how to process ENX data (fully-navigated single-ping ADCP data)
- be familiar with single-ping ADCP data
- be good with Matlab, and persistent. This won't be easy.
Start with the FAQ, and codas processing information, especially quick_adcp.py.
More will be added as it develops.