Maximum data size for compression #94
@DrivenByNostalgia Thanks for reporting this. You're right to expect that buffer sizes >= 2^32 should work, or that the limitation should be documented. Can you share a minimal repro of this issue?
No, I'm sorry, I don't have the time to create a reproducer. But since this is an issue that can easily be circumvented in the calling code, it's not a big issue after all.
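For reference, the kind of calling-side workaround meant here could look roughly like the sketch below: split the input into pieces of less than 2^32 bytes and compress each piece separately. This is an untested sketch assuming the nvCOMP 3.x high-level interface; the helper `compress_in_pieces` and its bookkeeping are hypothetical and not part of nvCOMP, and header paths or constructor details may vary between versions.

```cpp
// Untested sketch of a calling-side workaround: compress the input in pieces
// of at most 2^32 - 1 bytes, the largest size reported to work.
// Headers follow nvCOMP's high-level quickstart example; lz4.hpp pulls in the
// manager base class used below.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

#include <cuda_runtime.h>
#include "nvcomp.hpp"
#include "nvcomp/lz4.hpp"

// Hypothetical helper: compresses `input_size` bytes at `device_input` in
// pieces of at most `max_piece` bytes, returning one compressed device buffer
// per piece. `manager` is any high-level manager (LZ4Manager, GdeflateManager, ...).
std::vector<uint8_t*> compress_in_pieces(nvcomp::nvcompManagerBase& manager,
                                         const uint8_t* device_input,
                                         size_t input_size,
                                         size_t max_piece = (size_t{1} << 32) - 1)
{
  std::vector<uint8_t*> compressed_pieces;
  for (size_t offset = 0; offset < input_size; offset += max_piece) {
    const size_t piece_size = std::min(max_piece, input_size - offset);

    // Each piece stays below 2^32 bytes, so configure_compression/compress
    // behave as in the working 2^32 - 1 case.
    nvcomp::CompressionConfig cfg = manager.configure_compression(piece_size);

    uint8_t* comp_buffer = nullptr;
    cudaMalloc(&comp_buffer, cfg.max_compressed_buffer_size);
    manager.compress(device_input + offset, comp_buffer, cfg);

    compressed_pieces.push_back(comp_buffer);
  }
  return compressed_pieces;
}
```

The caller is then responsible for recording the piece boundaries so that each piece can later be decompressed and reassembled in order.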
I have integrated nvCOMP (version nvcomp_3.0.5_windows_12.x) into our system for the compression of scientific data (GIS, simulation, …) and have noticed that compression fails for data of 2^32 bytes or more, while compressing a buffer of size 2^32 - 1 works flawlessly.

Specifically, `nvcompManagerBase::compress` throws the exception "Encountered Cuda Error: 2: 'out of memory'." after `nvcompManagerBase::configure_compression` has been called with a `decomp_buffer_size` >= 2^32. If the chosen compression manager is a `DeflateManager` or `GdeflateManager`, the call to `compress` first prints a warning to std::cerr. Both of the numbers reported in that warning seem wrong to me, but this is not my main issue, as we are not planning to use Deflate/Gdeflate anyway.
Is this an undocumented hard limit for the file size that nvCOMP can handle? I don't mind if there is a limit of 4 GB to the buffer size even with much more available memory. However, in this case, it would be nice to know and query this limit beforehand so to not run into an exception.
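As a stop-gap, the best I can do in calling code seems to be a hard-coded check along these lines; the 4 GB value is an assumption derived from the observed behaviour, not something queried from nvCOMP.

```cpp
#include <cstddef>

// Assumed limit, not exposed by the nvCOMP API as far as I can tell:
// buffers of 2^32 bytes or more fail as described above.
constexpr std::size_t kAssumedMaxCompressInput = (std::size_t{1} << 32) - 1;

// Hypothetical guard used before configure_compression() is ever called.
bool fits_single_compress_call(std::size_t input_size)
{
  return input_size <= kAssumedMaxCompressInput;
}
```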
This has been tested on a system with: