as the memory turns

Wednesday,  01/16/08  06:21 PM

<rant optional=yes>

You all know my status as a dinosaur; I can remember when all we had were zeros, and how great it was when we first got ones.  (There are 10 kinds of people in the world, those who understand binary, and those who don’t.)

So in the bad old days of 16-bit computing, the biggest programming problem was the size of your address space.   With only 64K to work with, and typically more physical memory than logical address space, you had to page stuff in and out in order to deal with it.  In those days every malloc was surrounded by an if(), because memory allocations could and did fail.
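
For the youngsters, here's a minimal sketch of what that discipline looked like (the names here are made up for illustration):

    #include <cstddef>
    #include <cstdio>
    #include <cstdlib>

    // The old discipline: assume malloc can fail, because it could and did.
    char* grab_buffer(size_t bytes)
    {
        char* p = (char*)malloc(bytes);
        if (p == NULL) {
            // Page something out, shrink the request, or fail gracefully.
            fprintf(stderr, "out of memory asking for %lu bytes\n", (unsigned long)bytes);
            return NULL;
        }
        return p;
    }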

In the good new days of 32-bit computing, the biggest programming problem is the size of physical memory, and avoiding paging.  With 2GB to work with, and typically more logical address space than physical memory, you can allocate virtual storage with impunity.   In these days every new goes unchecked, because memory allocations don’t usually fail.

Well, we're entering some newer new days now, with machines that have more physical memory than logical address space again.   It is quite common to have 4GB on a machine, and yet the address space is “only” 2GB.   (Windows lamely reserves the half with the high-order bit set for the kernel, so you don’t get all 2^32.)   Which means once again you have to page stuff in and out in order to deal with it, and once again you have to check whether a virtual storage allocation has failed.

I suppose soon we'll all be running 64-bit operating systems and applications, and so this is a temporary situation; once we have a 2^64 address space we'll once again be worried about physical memory size and paging, and not about allocating virtual storage.

But for now, this is a problem.

You may know, a little while ago I made the world’s largest TIFF file, containing nearly 3TB of information.   I discovered, 5 hours into an 8-hour compression run, that I had run out of virtual storage.   I was running on a machine with 4GB of RAM, but my address space was “only” 2GB.   And so yes, after a while – a long while – I allocated so much stuff that I hit the address space limit of my local heap, and calls to new began to fail.  And of course my code didn't expect new to fail, so it died a horrible death.  I had to rearchitect the cache I was using to check for virtual storage availability in addition to physical storage availability.  A lot of work for an artificial limit.
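
For what it's worth, the address-space check itself isn't the hard part; on Windows, something like this, using GlobalMemoryStatusEx, tells you how much of your own address space is left.  This is just a sketch of the idea, not the actual cache code:

    #include <windows.h>

    // The kind of check the reworked cache has to make: is there room left
    // in this process's address space, not just in physical RAM?
    bool enough_virtual_left(ULONGLONG bytes_needed)
    {
        MEMORYSTATUSEX ms = { sizeof(ms) };     // dwLength must be set before the call
        if (!GlobalMemoryStatusEx(&ms))
            return false;                       // be pessimistic if the query fails
        return ms.ullAvailVirtual > bytes_needed;
    }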

So, what do we do?

We can surround every new with an if(), and attempt to gracefully handle memory allocation failures.   That is too hard and too ugly to be right.  Anyway, what do you do if one fails?   Most of the memory allocations are little pissant buffers and arrays; it is only the accumulation of literally millions of them that results in an overall failure.  We can catch the exceptions thrown by the C++ runtime when a new fails - that is better than checking every new - but it still leaves the problem of what to do when you catch one.  We can move buffers into shared memory segments - kind of complex - or we can wait for 64-bit computing to be ubiquitous.
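
For reference, the two checking styles look roughly like this – a nothrow new tested at the call site, versus letting plain new throw and catching std::bad_alloc somewhere higher up.  A sketch of the alternatives, not a recommendation:

    #include <cstddef>
    #include <cstdio>
    #include <new>

    void allocate_both_ways(size_t count)
    {
        // Option 1: new (std::nothrow) returns NULL on failure instead of throwing.
        int* a = new (std::nothrow) int[count];
        if (a == NULL) {
            fprintf(stderr, "nothrow new failed\n");
        }

        // Option 2: plain new throws std::bad_alloc on failure.
        try {
            int* b = new int[count];
            delete[] b;
        } catch (const std::bad_alloc&) {
            fprintf(stderr, "new threw bad_alloc\n");
            // ...and now what?  That's the hard part.
        }

        delete[] a;     // deleting NULL is a no-op, so this is safe either way
    }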

I do think that 64-bit will be the final frontier; it is unimaginable that 2^64 wouldn't be a big enough address space for everything.  Remind me I said that :)

</rant>