Geoscientific Model Development An interactive open-access journal of the European Geosciences Union
Geosci. Model Dev., 10, 413-423, 2017
https://doi.org/10.5194/gmd-10-413-2017
© Author(s) 2017. This work is distributed under
the Creative Commons Attribution 3.0 License.
Development and technical paper
27 Jan 2017
The compression–error trade-off for large gridded data sets
Jeremy D. Silver1 and Charles S. Zender2
1School of Earth Sciences, University of Melbourne, Melbourne, Australia
2Departments of Earth System Science and of Computer Science, University of California, Irvine, CA, USA
Abstract. The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the lossless deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming), and scalar linear packing in terms of compression ratio, accuracy, and speed.
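The core idea of layer-packing described above can be illustrated with a short sketch. The following is a minimal, hypothetical implementation using NumPy (the function names `pack_layers` and `unpack_layers` are illustrative, not from the paper's software): each vertical layer is quantized to 16-bit integers with its own scale/offset pair, so layers that differ by orders of magnitude each retain good relative precision.

```python
import numpy as np

def pack_layers(data, nbits=16):
    """Pack each layer of a (nlev, nlat, nlon) float array into unsigned
    integers, using a separate scale/offset pair per vertical layer."""
    nlev = data.shape[0]
    maxint = 2**nbits - 2  # reserve one code value, e.g., for missing data
    flat = data.reshape(nlev, -1)
    offsets = flat.min(axis=1)
    ranges = flat.max(axis=1) - offsets
    scales = np.where(ranges > 0, ranges / maxint, 1.0)
    packed = np.rint(
        (data - offsets[:, None, None]) / scales[:, None, None]
    ).astype(np.uint16)
    return packed, scales, offsets

def unpack_layers(packed, scales, offsets):
    """Invert pack_layers, up to quantization error."""
    return packed * scales[:, None, None] + offsets[:, None, None]

# Example: a field whose magnitude decays strongly with the vertical level,
# as is typical for, e.g., density or mixing ratios.
rng = np.random.default_rng(0)
field = np.exp(-np.arange(10.0))[:, None, None] * (
    1.0 + 0.1 * rng.random((10, 4, 5))
)
packed, scales, offsets = pack_layers(field)
restored = unpack_layers(packed, scales, offsets)
max_rel_err = np.max(np.abs(restored - field) / np.abs(field))
```

With a single scalar scale/offset for the whole array, the quantization step would be set by the largest layer, and the smallest layers would lose nearly all precision; per-layer parameters avoid this at the cost of storing two small auxiliary arrays.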

When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. The relative performance, in terms of compression and errors, of bit-groomed and layer-packed data was strongly predicted by the entropy of the exponent array, while lossless compression was well predicted by the entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. These compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.
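The entropy measure referred to above can be sketched as follows: treat the 8-bit exponent field of each IEEE-754 float as a discrete symbol and compute its Shannon entropy. This is an illustrative reconstruction, not the paper's code; the function names are hypothetical.

```python
import numpy as np

def shannon_entropy(values):
    """Shannon entropy (bits per symbol) of a discrete array."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def exponent_entropy(data):
    """Entropy of the IEEE-754 exponent field of a float32 array."""
    bits = np.asarray(data, dtype=np.float32).view(np.uint32)
    exponents = (bits >> 23) & 0xFF  # extract the 8-bit exponent field
    return shannon_entropy(exponents)

# A field confined to one binade has a near-zero-entropy exponent array;
# a field spanning many orders of magnitude has a high-entropy one.
rng = np.random.default_rng(1)
narrow = (1.0 + rng.random(10000)).astype(np.float32)   # values in [1, 2)
wide = np.exp(rng.uniform(-20, 20, 10000)).astype(np.float32)
```

Intuitively, a high-entropy exponent array means the data span many binades, which limits how much of each value's bit pattern is redundant and thus how well quantize-then-deflate schemes can compress it.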


Citation: Silver, J. D. and Zender, C. S.: The compression–error trade-off for large gridded data sets, Geosci. Model Dev., 10, 413-423, https://doi.org/10.5194/gmd-10-413-2017, 2017.
Short summary
Many modern scientific research projects generate large amounts of data. Storage space is valuable and may be limited; hence compression is vital. We tested different compression methods for large gridded data sets, assessing the space savings and the amount of precision lost. We found a general trade-off between precision and compression, with compression well-predicted by the entropy of the data set. A method introduced here proved to be a competitive archive format for gridded numerical data.