Task #12274 (closed)
Opened 10 years ago
Closed 9 years ago
Reduce memory usage for very large PNG images
Reported by: | mlinkert | Owned by: | mlinkert |
---|---|---|---|
Priority: | minor | Milestone: | 5.1.0-m3 |
Component: | Bio-Formats | Version: | 5.0.1 |
Keywords: | n.a. | Cc: | james.r.anderson@… |
Resources: | n.a. | Referenced By: | n.a. |
References: | n.a. | Remaining Time: | n.a. |
Sprint: | n.a. |
Description
See https://www.openmicroscopy.org/community/viewtopic.php?f=4&t=7509&p=13958 and corresponding file in data_repo/from_skyking/png/james/.
The PNG format is not designed for efficient random access to images this large: although the image data is stored in blocks, those blocks form a single continuous stream of compressed data, so reading an arbitrary tile requires decoding all of the preceding blocks first.
At the moment we essentially read the entire image into memory to decompress and unfilter it. It is probably worth investigating whether we can build an index of blocks and their (x, y) positions during setId, so that openBytes only needs to decode data up to the requested tile.
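A minimal sketch of the partial-decode idea, using only `java.util.zip` (the class and method names here are hypothetical, not the actual Bio-Formats PNG reader): since a zlib stream can only be decoded front-to-back, inflate just enough of it to cover the last scanline of the requested tile, so rows below the tile are never decompressed.

```java
import java.util.Arrays;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class TileDecode {

  // Inflate a zlib stream only up to and including scanline lastRow.
  // rowBytes is the number of decompressed bytes per scanline.
  // Hypothetical helper; a real reader would also unfilter each row.
  static byte[] inflateUpToRow(byte[] compressed, int rowBytes, int lastRow)
    throws Exception
  {
    Inflater inf = new Inflater();
    inf.setInput(compressed);
    byte[] out = new byte[(lastRow + 1) * rowBytes];
    int off = 0;
    while (off < out.length && !inf.finished()) {
      int n = inf.inflate(out, off, out.length - off);
      if (n == 0 && inf.needsInput()) break;  // truncated stream
      off += n;
    }
    inf.end();
    return out;
  }

  public static void main(String[] args) throws Exception {
    // Fake "image": 100 rows of 8 bytes each, with recognizable contents.
    int rowBytes = 8, rows = 100;
    byte[] image = new byte[rowBytes * rows];
    for (int i = 0; i < image.length; i++) image[i] = (byte) (i % 251);

    // Compress it into a single zlib stream, as PNG IDAT data would be.
    Deflater def = new Deflater();
    def.setInput(image);
    def.finish();
    byte[] buf = new byte[image.length * 2 + 64];
    int clen = def.deflate(buf);
    def.end();
    byte[] compressed = Arrays.copyOf(buf, clen);

    // Decode only rows 0..9; rows 10..99 are never decompressed.
    byte[] partial = inflateUpToRow(compressed, rowBytes, 9);
    for (int i = 0; i < partial.length; i++) {
      if (partial[i] != image[i]) throw new AssertionError("mismatch at " + i);
    }
    System.out.println("decoded " + partial.length + " of " + image.length + " bytes");
  }
}
```

An index built during setId could extend this by recording, for checkpoint rows, how far into the compressed stream the decoder had read, so a later openBytes call can stop early rather than restarting from scratch each time.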
Change History (3)
comment:1 Changed 10 years ago by jamoore
- Cc james.r.anderson@… added
comment:2 Changed 10 years ago by mlinkert
- Milestone changed from 5.0.3 to 5.0.4
comment:3 Changed 9 years ago by mlinkert
- Resolution set to fixed
- Status changed from new to closed
PR opened: https://github.com/openmicroscopy/bioformats/pull/1450