1. Three experiments have been designed to test the following assumptions underlying the conventional methods for calculating decompression schedules: (a) that there is a critical limit to the supersaturation of a tissue by a gas, and (b) that no gas is formed in the tissues of a subject who does not develop symptoms of decompression sickness.

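The 'conventional methods' referred to in point 1 are Haldane-style schedule calculations, in which each hypothetical tissue takes up inert gas exponentially and the ascent is constrained so that tissue gas tension never exceeds a fixed multiple of the ambient pressure. The sketch below is only an illustration of that ratio criterion: the 2:1 ratio, the 40 min half-time and the nitrogen fraction are illustrative assumptions, not values taken from this paper.

```python
# Minimal sketch of a Haldanean ratio criterion (assumption (a) above).
# All constants here are illustrative assumptions, not the paper's values.
import math

SURFACE_ATA = 1.0    # surface pressure, atmospheres absolute
FN2 = 0.79           # nitrogen fraction in air
RATIO_LIMIT = 2.0    # assumed critical supersaturation ratio (tissue : ambient)

def tissue_tension(p0, p_ambient, half_time_min, minutes):
    """Exponential approach of tissue N2 tension toward the inspired tension."""
    p_inspired = p_ambient * FN2
    k = math.log(2) / half_time_min
    return p_inspired + (p0 - p_inspired) * math.exp(-k * minutes)

def shallowest_safe_depth_ft(tension):
    """Shallowest depth (ft) at which the assumed ratio limit is not exceeded."""
    min_ambient = tension / RATIO_LIMIT               # ATA
    return max(0.0, (min_ambient - SURFACE_ATA) * 33.0)

# Example: a 40 min half-time tissue after 60 min at 100 ft (about 4.03 ATA).
ambient = SURFACE_ATA + 100 / 33.0
t = tissue_tension(p0=SURFACE_ATA * FN2, p_ambient=ambient,
                   half_time_min=40, minutes=60)
print(f"tissue N2 tension = {t:.2f} ATA; "
      f"first stop no shallower than {shallowest_safe_depth_ft(t):.0f} ft")
```
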
2. When decompression time was titrated by cutting back the time spent at various last stops, it was found that a last stop at a depth of 30 ft was more effective than one at 20 ft, which was, in turn, more effective than the conventional last stop at 10 ft, for each of five schedules. These trials involved fourteen goats and 486 exposures.

3. In a second method, an exposure was selected for each animal such that the ‘titrated’ stop was now the only one required; the results were inconclusive, however, since the interval of uncertain diagnosis was now of the same order of magnitude as the total decompression time.

4. In a third method, symptoms could not be induced by high-intensity ultrasound in three goats which had just completed marginally safe decompressions, even though the ultrasonic energy would be expected to cause the tissue to exceed any hypothetical metastable limit to supersaturation.

5. It was concluded that the quantity of gas separating from solution is far more likely to determine the imminence of decompression sickness than its mere presence, as would be implied by a critical limit to supersaturation.

6. This conclusion is discussed in relation to modifying the diving tables, and in relation to its serious implication: that conventional schedules do not prevent the separation of gas from solution in tissue, but are really treating a gas phase in the tissues which, owing to the remaining compression afforded by the stopping pressures, does not give rise to symptoms.
