Brad Wardell's views about technology, politics, religion, world affairs, and all sorts of politically incorrect topics.
Why you run out of memory
Published on September 2, 2007 By Draginol In GalCiv Journals

32-bit gaming is going to come to an end.  Not today. Not tomorrow, but a lot sooner than most people think.

That's because no matter how much memory your PC has, and no matter how much virtual memory you have, a given process on a 32-bit Windows machine only gets 2 gigabytes of memory (if the OS had been better designed, it would have been 4 gigs, but that's another story).

Occasionally you run into people in the forums who say "I got an out of memory error".  And for months we couldn't figure it out.  We don't have any memory leaks that we know of and the people who reported it had plenty of virtual memory.  So what was the deal?

The problem was a basic misunderstanding of how memory in Windows is managed.  We (myself included) thought that each process in Windows might only get 2 gigabytes of memory, but that if it ran out, it would simply swap to the disk drive.  Thus, if a user had a large enough page file, no problem.  But that's not how it works.  Once a process hits 2 gigabytes, the system won't allocate it any more memory. The allocation just fails, and you end up with a crashed game.
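
To make that concrete, here is a minimal sketch (plain C++, not GalCiv code) that grabs fixed-size chunks until the process runs out of address space. On a stock 32-bit Windows machine it gives up somewhere under 2 gigabytes no matter how large the page file is, because the allocation itself fails instead of spilling to disk:

    // Probe how much a single process can actually allocate.
    #include <cstdio>
    #include <new>
    #include <vector>

    int main() {
        const std::size_t chunk = 64 * 1024 * 1024;  // 64 MB per request
        std::vector<char*> blocks;
        std::size_t totalMB = 0;
        for (;;) {
            // The nothrow form returns a null pointer on failure.
            char* p = new (std::nothrow) char[chunk];
            if (!p) break;  // no swapping -- the request simply fails
            blocks.push_back(p);
            totalMB += chunk / (1024 * 1024);
        }
        std::printf("Allocated %zu MB before failure\n", totalMB);
        for (std::size_t i = 0; i < blocks.size(); ++i)
            delete[] blocks[i];
        return 0;
    }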

This is a very significant problem.  In Galactic Civilizations II v1.7, we'll at least be able to address this with more aggressive deallocation routines (which I really hate having to do -- I prefer to keep something around once it's been loaded, for performance's sake; I've always been a proponent of performance over memory use).  But we'll be able to do it here without any noticeable effect on performance.
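
To illustrate what "more aggressive deallocation" can look like in practice, here is a hypothetical sketch -- the names (AssetCache, Texture, loadTextureFromDisk) are invented for illustration, and this is not the actual GalCiv II code. The idea is simply to stop keeping every loaded asset resident forever and instead release entries that haven't been touched in a while:

    #include <map>
    #include <memory>
    #include <string>

    struct Texture { /* pixel data would live here */ };

    // Stub standing in for real disk I/O.
    std::shared_ptr<Texture> loadTextureFromDisk(const std::string&) {
        return std::make_shared<Texture>();
    }

    class AssetCache {
        struct Entry { std::shared_ptr<Texture> tex; unsigned lastUsedTurn; };
        std::map<std::string, Entry> entries_;
    public:
        std::shared_ptr<Texture> get(const std::string& path, unsigned turn) {
            auto it = entries_.find(path);
            if (it == entries_.end())
                it = entries_.emplace(path,
                         Entry{loadTextureFromDisk(path), turn}).first;
            it->second.lastUsedTurn = turn;  // mark as recently used
            return it->second.tex;
        }
        // Run at end of turn: anything idle too long gets released,
        // trading a possible reload later for a smaller footprint now.
        void evictIdle(unsigned turn, unsigned maxIdleTurns) {
            for (auto it = entries_.begin(); it != entries_.end(); )
                if (turn - it->second.lastUsedTurn > maxIdleTurns)
                    it = entries_.erase(it);
                else
                    ++it;
        }
    };

The trade-off is exactly the one described above: an asset evicted too eagerly has to be reloaded from disk, so the idle threshold is what keeps the performance cost unnoticeable.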

No, the real problem is in future games. If 2 gigabytes is the limit and a relatively low impact game like Galactic Civilizations is already running into it (and it's no memory hog), what's coming up next?  How about this -- when your video card runs out of memory for textures, it goes to system memory. And I think (but haven't tested this) that the memory it grabs belongs to the game process. 

Console game developers would simply laugh at our complaints and say that we just need to get better at de-allocating memory.  But that's only a short-term solution.  Gamers, particularly PC gamers, want to see their games get better looking and more sophisticated. 

So at some point, in the next few years, serious gamers and high-end games are going to have to require 64-bit machines to play them.  It'll probably be several years before it becomes common but it's coming. 

The good short-term news for GalCiv players is that we'll be able to have larger sized galaxies in GalCiv II: Twilight of the Arnor without running out of memory and users of GalCiv II will be able to avoid running out of memory once they get v1.7.


Comments
on Sep 02, 2007

I'm not sure I would have expected anything different from the group that (even if slightly misquoted) originally thought that 64K of RAM would be plenty for anyone.   Actually, they were probably talking about 640K, but even that wasn't enough.

The size limits and restrictions that Microsoft and Intel have been using have been problematic for years, but we keep going, and going, and going with backwards compatibility and a stubborn refusal to move from what we have used in the past to some whole new system that would require new versions of every application we could ever imagine running on it.

You well know that is a big part of the reason OS/2 (the, many would say, better operating system) was never able to displace Windows.  Of course, another part of the reason was that application support for any Windows alternative always lagged, and IBM never wanted to pay enough green-mail to developers to move their future work to OS/2.

If people would work within the model Microsoft used with the Xbox 360 -- where only a handful of selected games (applications) were backwards compatible through an emulation system, but basically everyone started over from scratch -- then things might be a whole lot better off now.  Some would say this approach is exactly what is happening with Vista, where you do see 64-bit support and applications being re-written to take advantage of that power -- or they are working within a 32-bit emulation world where their applications may run, but perform poorly or don't quite work as they should because of security models, hardware access restrictions, etc.

on Sep 02, 2007
It's good to see an article that illustrates positive thinking and action. It gets tiring to see tirade after tirade about what might have been, could have been, or should have been, usually ending in a self-seeking "told you so" -- negative hindsight is an easy skill to achieve. I am more interested in people who resolve issues and get on with life. Yet again, full marks to Stardock; I look forward to v1.7 and ongoing developments.
on Sep 02, 2007
I second Zydor's compliment, with the exception of preferring good critical thinking to "positive thinking."

I'm guilty of some of the whinging that Zydor mentions, but it is worth knowing why things went wrong as well as what you're going to do next. For example, the key problem Brad points out could well have been avoided by changes to the pacing of development cycles, or to the influence that sales and marketing units tend to have over external communications. Because such problems can and likely will recur, we'd all be better off if the organizations involved were more open to constructive criticism than is currently popular in this ad-saturated world.
on Sep 02, 2007
I have a 64-bit system and am running XP 64-bit Edition. Will we eventually see 64-bit builds of GalCiv II?
on Sep 02, 2007
Oh, I can whinge with the best of them; I am no paragon of virtue.  It's just nice to see a positive article.
on Sep 02, 2007
WARNING: geek content following...

This is not just a Windows issue. Userspace applications on 32-bit systems have had this limit since virtual memory managers were created. Every process is given 4 GB of virtual address space, the most that can be addressed with 32 bits. Half is given to the kernel and half is given to the process; this is where the 2 GB comes from. But it does not stop there: code space, data space, and stack space all have to come out of the 2 GB that is left. So by the time you are ready to allocate that first set of bytes, there is really not much virtual address space left to do really big things.
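
For anyone who wants to see that split on their own machine, the Win32 GlobalMemoryStatusEx call reports the process's virtual address space directly. A minimal sketch:

    #include <windows.h>
    #include <cstdio>

    int main() {
        MEMORYSTATUSEX ms = {};
        ms.dwLength = sizeof(ms);  // required before the call
        if (GlobalMemoryStatusEx(&ms)) {
            // On stock 32-bit Windows this prints roughly 2048 MB total.
            std::printf("Total virtual address space: %llu MB\n",
                (unsigned long long)(ms.ullTotalVirtual / (1024 * 1024)));
            std::printf("Available virtual address space: %llu MB\n",
                (unsigned long long)(ms.ullAvailVirtual / (1024 * 1024)));
        }
        return 0;
    }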

The good news, though, is that this is a per-process limit, meaning that every process gets a fresh virtual address space to play in. So with interprocess communication (IPC) and shared memory, the 32-bit architecture can keep moving forward despite the 4 GB addressing limitation.
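
As a minimal sketch of that escape hatch: two 32-bit processes can each map a view of the same named file mapping through the Win32 API, so data can live outside any single process's 2 GB ceiling (the mapping name "Local\\GalCivScratch" is invented for illustration):

    #include <windows.h>
    #include <cstdio>

    int main() {
        const DWORD size = 64 * 1024 * 1024;  // 64 MB backed by the page file
        HANDLE h = CreateFileMappingA(INVALID_HANDLE_VALUE, NULL,
                                      PAGE_READWRITE, 0, size,
                                      "Local\\GalCivScratch");
        if (!h) return 1;
        // A second process opening the same name sees the same bytes.
        void* view = MapViewOfFile(h, FILE_MAP_ALL_ACCESS, 0, 0, size);
        if (view) {
            std::printf("Mapped %lu MB of shared memory at %p\n",
                        (unsigned long)(size / (1024 * 1024)), view);
            UnmapViewOfFile(view);
        }
        CloseHandle(h);
        return 0;
    }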

Technologies such as PAE and 4GT have also come along so that 32-bit hardware can stick around and address more than 4 GB of physical memory. Maybe 64-bit will catch on and these limitations will be removed. Only time will tell...
on Sep 02, 2007
Have you brought this up with Microsoft?
on Sep 02, 2007
I stated two years ago that memory was going to be the driving force behind 64-bit computing.  Gone are the days when memory conservation was the way to a top programming job (I do remember those days).  But as coding in general has gotten sloppier (code optimization is a thing of the past), memory usage has skyrocketed.  And I blame Microsoft for that.  Eventually we had to hit the 32-bit wall, and it is now, not in the future.  In two years, only low-end systems will be non-64-bit.
on Sep 02, 2007
I stated two years ago that memory was going to be the driving force behind 64-bit computing. Gone are the days when memory conservation was the way to a top programming job (I do remember those days). But as coding in general has gotten sloppier (code optimization is a thing of the past), memory usage has skyrocketed. And I blame Microsoft for that. Eventually we had to hit the 32-bit wall, and it is now, not in the future. In two years, only low-end systems will be non-64-bit.

I am hitting that same wall with my 64-bit system, although it doesn't crash; it just stops saving.
on Sep 02, 2007
That wall is in the teraoctets (i.e., thousands of GB) for 64-bit OSes, so no. What you hit is the "emulated" 32-bit memory wall, which still affects 64-bit systems running 32-bit apps. That's why, with GalCiv II or any other 32-bit app, using a 64-bit OS makes no difference (except lower performance, of course).
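
One hedged footnote on that (speaking generally, not about GalCiv II specifically): a 32-bit executable linked with the /LARGEADDRESSAWARE flag gets a full 4 GB of user address space when run under 64-bit Windows, instead of 2 GB. The same bit can be flipped on an existing binary with Microsoft's editbin tool:

    editbin /LARGEADDRESSAWARE game.exe

(game.exe is a placeholder name.) Whether a given game copes correctly with pointers above the 2 GB line is another matter, which is presumably why the flag isn't set by default.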
on Sep 02, 2007
64-bit has its own problems on the hardware side, though. How many people here have tried routing a 64-bit bus line on a microprocessor before? This isn't like the "lack of vision" problem Bill Gates had with the 640K memory thing. You pay a significant performance, price, and power penalty when you go 64-bit in the hardware. That's why things like the Intel Itanium were outrageously overpriced, YEARS behind schedule, and found little market. To put it really simply: if you make everything 64-bit, you are more than doubling the size of your chip. And if you double the chip size, it computes at half the speed, at twice the cost, and twice the wattage. Now, there are several tricks we are trying in the industry to make it not that bad, but it's kinda hard when the average guy at Best Buy expects the 64-bit chips to be FASTER than 32-bit.

I *guarantee* you this problem is more easily solved in the software. We may see more 64-bit operating systems in the next few years that do some swizzling on the hardware's 32-bit address lines (which incurs a performance penalty in itself), but if you want 64-bit hardware, you're gonna pay. Understand that you're going to pay more for a CPU that runs slower and hotter, in return for your >4 GB support. Maybe at some point in the distant future the penalty for running 64-bit hardware will be less than emulating 64-bit behavior in software, but it's just that--distant. For now, I highly recommend squeezing all you can out of 32-bit while you can. Sorry.
on Sep 02, 2007
If you've got an app that keeps eating memory, what difference does it make what the limit is?

This just seems like common sense. Even if windows did start swapping more memory, the game would suffer. (and eventually you'd hit a wall there too)

I'm sorry, but if you're going to use dynamic memory, you'd better clean up after yourself. So does this mean the Out of Memory problem may actually get fixed?
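
In the spirit of that comment, the usual C++ discipline is to tie each allocation's lifetime to a scope so cleanup can't be forgotten. A generic sketch, nothing GalCiv-specific:

    #include <memory>
    #include <vector>

    struct Fleet { std::vector<int> shipIds; };

    void processTurn() {
        // unique_ptr frees the Fleet automatically when it leaves scope,
        // even on an early return or an exception, so nothing leaks.
        std::unique_ptr<Fleet> fleet(new Fleet());
        fleet->shipIds.push_back(42);
    }

    int main() {
        processTurn();
        return 0;
    }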
on Sep 03, 2007
So the out of memory error will finally be addressed in 1.7? I certainly hope so. My frustration point reached its limit last week when my well-advanced game would not go more than a turn or two, and I just got so tired of reloading and replaying the same turns over and over again. (Yes, I have the latest NVidia drivers, latest Vista patches, 2 Gig RAM, etc.) I'm afraid I'm simply going to have to quit playing the game for now, and will look forward to 1.7 and a return to GalCiv 2.
on Sep 03, 2007
but if you want 64-bit hardware--you're gonna pay.


Uh, pretty much all consumer-level hardware has been 64-bit for a while now--32-bit died with the Athlon XP and Pentium 4E. Sure, the majority of OSes are still 32-bit, but a typical PC has been capable of well more than that for a couple of years now.
on Sep 03, 2007
Well, I'll be happy with the memory leak addressed. I didn't think SD would leave me high and dry, as their customer support has been top notch, but blaming it all on the OS seems a bit strange. I understand all too well the limitations of certain software and hardware, but the way this game can eat memory is out of this world. I'm not on the forefront of the 32/64-bit information line, so I'll let you computer guys work it out, and I'll just pay the difference for another good computer later. But my good computer now, which runs every game I own at full (or near-full) settings, can't run GC2 without crashing. I know that different games allocate memory differently, so it's nearly apples and oranges, but it seems to me that if a Supreme Commander match with a near-equal map size and an equal number of enemies (while not nearly as intelligent as the GC2 AI) runs smoother, longer, and completely without fail, while GC2 will run a few turns and then CTD, without exception, then something is amiss.

I'm not sure what it was, but GC2:DA worked great when I bought it -- it reinvigorated the experience for me and I played countless games -- and now it doesn't. The sooner it's fixed, the happier I'll be, because I'm not willing to buy another expansion with the last one still broken (for me). Here's hoping 1.7 and 2.0 squash it. Or they'll re-release the last patch that didn't have the error.