nixnixnix Messages: 415 Registered: February 2007 Location: Kelowna, British Columbia
Senior Member
Hi,
I recently had a user complain that they couldn't save a TIF that they had opened in my software. The TIF is only 120MB on disk, but once decompressed into an Image it is approx 30000x30000 pixels, which at 4 bytes per RGBA pixel is around 3.6GB.
This means that the following function
void Image::Serialize(Stream& s)
{
	int version = 0;
	s / version;
	Size sz = GetSize();
	Point p = GetHotSpot();
	Size dots = GetDots();
	s % sz % p % dots;
	int len = sz.cx * sz.cy;
	if(s.IsLoading())
		if(len) {
			ImageBuffer b(sz);
			if(!s.GetAll(~b, len * sizeof(RGBA)))
				s.SetError();
			b.SetDots(dots);
			b.SetHotSpot(p);
			*this = b;
		}
		else
			Clear();
	else
		s.Put(~*this, len * sizeof(RGBA));
}
needs len to be declared as int64, e.g.
int64 len = int64(sz.cx) * int64(sz.cy);
or something like that. Unfortunately, Stream::Put takes an int rather than an int64, so the byte count wraps around to a negative value and fails the ASSERT(size >= 0) on Stream.h line 89.
I realise that I could re-write Image::Serialize to save the Image in chunks, but you guys are much better at this than me, and it needs to be a permanent solution since we are all eventually moving to 64-bit and large in-memory data. Some of the code in Stream looks like it is already intended for 64-bit use.
I hope my detailed diagnostic helps.
Stream::GetAll will need upgrading as well to take an int64. If no one has time for this I can try to suggest a patch, but I figure my code is unlikely to be the long-term solution.