Brad Wardell's views about technology, politics, religion, world affairs, and all sorts of politically incorrect topics.
Why you run out of memory
Published on September 2, 2007 By Draginol In GalCiv Journals

32-bit gaming is going to come to an end.  Not today. Not tomorrow, but a lot sooner than most people think.

That's because no matter how much memory your PC has, and no matter how much virtual memory you have, a given process on a 32-bit Windows machine only gets 2 gigabytes of address space to work with (if the OS had been better designed it would have been 4 gigs, but that's another story).

Occasionally you run into people in the forums who say "I got an out of memory error".  And for months we couldn't figure it out.  We don't have any memory leaks that we know of and the people who reported it had plenty of virtual memory.  So what was the deal?

The problem was a basic misunderstanding of how memory in Windows is managed.  We (myself included) thought that each process in Windows might only get 2 gigabytes of memory, but that if it ran out, it would simply swap to the disk drive.  Thus, if a user had a large enough page file, no problem.  But that's not how it works.  Once a process has used up its 2 gigabytes, the system won't allocate it any more memory.  The allocation just fails, and you end up with a crashed game.
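
To make that concrete, here's a tiny sketch (just an illustration I'm putting here, not GalCiv code) of what a 32-bit process runs into: allocations succeed until the roughly 2 gigabytes of user address space are used up, and then they simply fail, page file or no page file.

    // Illustration only: on 32-bit Windows this loop stops at roughly 2 GB,
    // no matter how much RAM or page file the machine has.
    #include <windows.h>
    #include <cstdio>

    int main()
    {
        SIZE_T total = 0;
        const SIZE_T chunk = 16 * 1024 * 1024;   // allocate in 16 MB pieces

        for (;;)
        {
            // Reserve and commit address space; this is the call that eventually fails.
            void* p = VirtualAlloc(NULL, chunk, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
            if (p == NULL)
            {
                printf("Allocation failed after ~%u MB -- address space, not RAM, ran out.\n",
                       (unsigned)(total / (1024 * 1024)));
                return 0;
            }
            total += chunk;
        }
    }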

This is a very significant problem.  In Galactic Civilizations II v1.7, we'll at least be able to address this with more aggressive deallocation routines (which I really hate having to do -- once something has been loaded, I prefer to keep it around for performance; I've always been a proponent of performance over memory use).  But we'll be able to do it here without any noticeable effect on performance.
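
For what it's worth, here's the general shape such routines can take (a hypothetical sketch, not the actual v1.7 code): keep loaded assets around for speed, but be able to hand the memory back before the process bumps into the 2 gigabyte ceiling.

    // Hypothetical "keep it around, but give it back under pressure" cache.
    #include <map>
    #include <string>
    #include <vector>

    struct Texture { std::vector<unsigned char> pixels; };

    class TextureCache
    {
    public:
        Texture* Get(const std::string& name)
        {
            std::map<std::string, Texture*>::iterator it = m_cache.find(name);
            if (it != m_cache.end())
                return it->second;            // fast path: reuse what was already loaded
            Texture* tex = new Texture();     // stand-in for the real disk load
            m_cache[name] = tex;
            return tex;
        }

        // Called when the process creeps toward the address-space limit:
        // release everything instead of waiting for an allocation to fail.
        void EvictAll()
        {
            for (std::map<std::string, Texture*>::iterator it = m_cache.begin();
                 it != m_cache.end(); ++it)
                delete it->second;
            m_cache.clear();
        }

    private:
        std::map<std::string, Texture*> m_cache;
    };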

No, the real problem is in future games. If 2 gigabytes is the limit and a relatively low impact game like Galactic Civilizations is already running into it (and it's no memory hog), what's coming up next?  How about this -- when your video card runs out of memory for textures, it goes to system memory. And I think (but haven't tested this) that the memory it grabs belongs to the game process. 

Console game developers would simply laugh at our complaints and say that we just need to get better at de-allocating memory.  But that's only a short-term solution.  Gamers, particularly PC gamers, want to see their games get better looking and more sophisticated. 

So at some point in the next few years, high-end games are going to start requiring 64-bit machines, and serious gamers are going to need them.  It'll probably be several years before it becomes common, but it's coming. 

The good short-term news for GalCiv players is that we'll be able to have larger galaxies in GalCiv II: Twilight of the Arnor without running out of memory, and users of GalCiv II will be able to avoid running out of memory once they get v1.7.


Comments (Page 3)
on Sep 16, 2007
on the rare occasion i've reported bugs i stick to a phenomenological approach: which is to say, i describe what i see and how it is a problem for me. i don't presume a thing about the underlying process; i know it's way over my head. it's unfortunate we have a culture that encourages people to want to sound like they know what they're talking about at all times.


Too bad the tools available to most users report numbers that have absolutely no relevance, or that are just plain wrong.

Example: Windows' own Task Manager and its memory usage column for the running processes.

I've been a developer for over a decade and still don't know what that number actually represents. Try this: start some program and write down its memory usage. Minimize its window and restore it. Compare memory usage.
-> Usually A HUGE difference

Example: calc.exe, part of Windows.
Startup: 6.1MB. After minimize + restore cycle: 2.3MB, after yet another cycle down to 2MB even. Makes no sense.

Yet people scream bloody leak if the mem usage column rises. But is it really a leak or just some memory pages not yet reclaimed by the operating system?
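
For the curious: the column Task Manager shows by default is the working set, which Windows trims when a window gets minimized; the commit charge is a separate counter, and that's the one a real leak would drive up. A minimal sketch (assuming the standard psapi.h API) that prints both for the current process:

    // Working set vs. commit charge for the current process.
    // Link against psapi.lib.
    #include <windows.h>
    #include <psapi.h>
    #include <cstdio>

    int main()
    {
        PROCESS_MEMORY_COUNTERS pmc;
        pmc.cb = sizeof(pmc);
        if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
        {
            // This is what Task Manager's default column reports; it drops when
            // the window is minimized because the working set gets trimmed.
            printf("Working set:   %u KB\n", (unsigned)(pmc.WorkingSetSize / 1024));
            // This is the committed memory the process actually holds on to.
            printf("Commit charge: %u KB\n", (unsigned)(pmc.PagefileUsage / 1024));
        }
        return 0;
    }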

Another example would be Firefox. People have been crying over its memory consumption, saying that it leaks memory and so on. I'm using Firefox too, and I usually have long-running instances; sometimes I don't close Firefox for days. I've never had problems with it taking too much memory.

In summary, most users have no clue: they have broken tools and interpret the numbers reported by those tools in the wrong way.

Real-life analogy: users cannot distinguish between a leaky bucket and a sieve, but try to measure the leakiness by the amount of light that shines through it. Guess Windows is very leaky
on Sep 16, 2007
In summary, most users have no clue: they have broken tools and interpret the numbers reported by those tools in the wrong way.

Real-life analogy: users cannot distinguish between a leaky bucket and a sieve, but try to measure the leakiness by the amount of light that shines through it. Guess Windows is very leaky


nice analogies! the sad thing is, i'm a really smart guy. it's a real tough question. 'digital literacy' is part of the general education program in the college i work for. but i realize it's kind of an exercise in futility. sure, we can teach kids to understand computers in 2007. but in 2011, everything we taught will be out of date. only the students working in the field, and the rare few super-power-users, will keep themselves up to date.

shakespeare and einstein won't really become out of date. but how do you teach people about something that shifts radically every few years? i dunno, as a 'really smart guy' i resent that Microsoft makes cute little metaphors to describe what's really happening. but i realize most people would be put off by the level of detail i want.

...i just rented a movie at a friend's recommendation, Idiocracy (dir. Mike Judge, 2006). it was great, and this line of lamentation fits right in with the theme.
on Sep 17, 2007
nice analogies! the sad thing is, i'm a really smart guy. it's a real tough question. 'digital literacy' is part of the general education program in the college i work for. but i realize it's kind of an exercise in futility. sure, we can teach kids to understand computers in 2007. but in 2011, everything we taught will be out of date. only the students working in the field, and the rare few super-power-users, will keep themselves up to date.


Really understanding? Or just being able to use the current reincarnation of Windows + Office? Two different things.

shakespeare and einstein won't really become out of date. but how do you teach people about something that shifts radically every few years? i dunno, as a 'really smart guy' i resent that Microsoft makes cute little metaphors to describe what's really happening. but i realize most people would be put off by the level of detail i want.


Has it really shifted that fast? The basics are still roughly the same as they were 50 years ago. We still have CPUs, RAM, and secondary storage (be it magnetic tapes or modern hard disks, the purpose is the same). Pick up an old book and everything in it is still relevant if you abstract away from the specific hardware.

Explaining all this to others shouldn't be that hard. They need to be interested, though; no point in trying to teach Bob the builder ^^.
After explaining the basic components, you explain how to talk to them at a very low level, but abstracted from the current platform: some theoretical assembler language on some theoretical architecture. Next, give a brief overview of a high-level language.
That covers the hardware and bridges over to the software. Now explain what an operating system does, file systems, libraries and so on.
As a last step, explain the modern metaphors which get in the way of a clear view of what's really going on inside a recent computer.

...i just rented a movie at a friend's recommendation, Idiocracy (dir. Mike Judge, 2006). it was great, and this line of lamentation fits right in with the theme.


Great movie.

"If you have one bucket with 5 gallons and another bucket with 2 gallons, how many buckets do you have?"

I think in such a world I would perish.
on Sep 17, 2007


"If you have one bucket with 5 gallons and another bucket with 2 gallons, how many buckets do you have?"

I think in such a world I would perish.


Are we discounting alternate timelines and parallel universes?

on Sep 17, 2007
Really understanding? Or just being able to use the current reincarnation of Windows + Office? Two different things.


a little bit of real understanding, and then on to production (mostly Adobe products rather than MS, actually - they do some digital film/image/music editing, sometimes learn Dreamweaver, and occasionally we let them get away with PowerPoint).

Has it really shifted that fast? The basics are still roughly the same as they were 50 years ago. We still have CPUs, RAM, and secondary storage (be it magnetic tapes or modern hard disks, the purpose is the same).


true. and even past the level of components, a logic gate is still a logic gate - at least until quantum computing becomes a reality (if and when).
on Sep 17, 2007
I don't think quantum computing will ever enter the realm of consumer products. And if it does, then only in the form of a co-processor or the like.
The algorithms are just too different; a quantum computer is unsuited for everyday tasks. Only a very narrow range of algorithms can work on quantum computers.
And on top of that, it's not even certain yet that quantum computers are viable at all; there's still the possibility that a quantum computer won't have more power than a traditional computer.
on Sep 17, 2007
You know, I think a big part of the problem is that people are used to things both coming easily and not needing to learn or think.

I've dealt with training a significant number of people on the basics of computer support over a number of years. There is always a point where you tell someone that they already have the tools to figure out their question themselves, and it's time to start teaching them how to ask the right questions to draw on their own knowledge. A typical success will say they don't know where to start. I respond with, "Well, what do you know about X?" and keep working down until they do know where to start, or until we can correct the knowledge deficit. Probably 30-40% simply will not try... They want the spoon-fed answer, not the ability to find the answer themselves. Mind you, most of these are people drawing a salary while this is happening, and they need to learn this to hold their jobs!

It explains so much about our society when you realize that such a large group of people never learn or think!

When I interview people now, I always ask at least one question that requires them to think to respond. I don't even care if they get the answer right; I just want to know that they can think about a problem... As expected, this winnows out a good, solid one third of the candidates.
on Sep 17, 2007
I'm not skeptical about quantum computers at all. It's not necessarily about the performance; it's about the power consumption and area usage. This could mean "flash" drives with more memory than today's hard drives, and mobile CPUs and video cards with 1/10th the battery drain. There are just several key technology breakthroughs researchers need to find: a) how to get subatomic particles to "remember" states indefinitely (i.e. store bits), b) how to create switches analogous to today's transistors, c) how to create basic logic gate circuitry analogous to today's CMOS logic -- not just that, but how to create it in a self-contained manner such that you can build blocks of logic that all behave the same way regardless of what circuits they are interfacing with, and d) how to do all this in a practical, cost-effective manner.

This all sounds way off, but the reality is, our current integrated circuit technology is really reaching its limits. Transistors are getting down into the 32nm range, which is smaller than the wavelengths of the light which etches them. And when you get that small, transistors start to really burn a lot of power just sitting there. I remember when Intel was first researching 1GHz processors, and at the time that sounded all sci-fi and crazy. Well, we're at 3GHz now. Pretty soon you'll get 1GHz on your cell phone. We might be able to squeeze out a few more GHz, but we simply are not able to go 10GHz and above with our current family of technology.
on Sep 17, 2007
Wouldn't it be possible to store rarely used objects in some kind of database without direct memory mapping?
on Sep 17, 2007

Wouldn't it be possible to store rarely used objects in some kind of database without direct memory mapping?


That would have the same effect as a swapfile. Unless you can predict when you'll need some data and load it back before it's needed, you'll suffer from the low disk performance.
Which also raises the question of how complex such a predictor would have to be.
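
Roughly what such a "database" would boil down to (just a sketch, the names are made up):

    // Objects are written out explicitly when evicted and read back on demand.
    #include <cstddef>
    #include <cstdio>
    #include <string>
    #include <vector>

    static std::string PathFor(int id)
    {
        char buf[64];
        std::sprintf(buf, "object_%d.bin", id);   // hypothetical on-disk layout
        return buf;
    }

    // Write the object out so the caller can free the in-memory copy.
    void Evict(int id, const std::vector<char>& data)
    {
        std::FILE* f = std::fopen(PathFor(id).c_str(), "wb");
        if (f) { std::fwrite(&data[0], 1, data.size(), f); std::fclose(f); }
    }

    // Synchronous reload: the disk latency is paid right here, which is why
    // this behaves like a page fault unless the object was prefetched earlier.
    std::vector<char> Fetch(int id, std::size_t size)
    {
        std::vector<char> data(size);
        std::FILE* f = std::fopen(PathFor(id).c_str(), "rb");
        if (f) { std::fread(&data[0], 1, size, f); std::fclose(f); }
        return data;
    }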
on Sep 17, 2007
I'm not skeptical about quantum computers at all. It's not necessarily about the performance; it's about the power consumption and area usage. This could mean "flash" drives with more memory than today's hard drives, and mobile CPUs and video cards with 1/10th the battery drain. There are just several key technology breakthroughs researchers need to find: a) how to get subatomic particles to "remember" states indefinitely (i.e. store bits), b) how to create switches analogous to today's transistors, c) how to create basic logic gate circuitry analogous to today's CMOS logic -- not just that, but how to create it in a self-contained manner such that you can build blocks of logic that all behave the same way regardless of what circuits they are interfacing with, and d) how to do all this in a practical, cost-effective manner.


Isn't that more like spintronics and optical computers? Quantum computers, however, are an entirely different concept. Wikipedia has some great articles about this.


on Sep 17, 2007
I don't think quantum computing will ever enter the realm of consumer products. And if it does, then only in the form of a co-processor or the like.
The algorithms are just too different; a quantum computer is unsuited for everyday tasks. Only a very narrow range of algorithms can work on quantum computers.
And on top of that, it's not even certain yet that quantum computers are viable at all; there's still the possibility that a quantum computer won't have more power than a traditional computer.


i'd tend to agree with you, but my conclusion stems more from the uncertainty principle than from an understanding of computation.

i think a more significant breakthrough will come when optical computers become commonplace and affordable. i have a fairly interesting link about some recent research along these lines, but it's in my yahoo mail, which is being wonky at the moment, so i'll have to post it later.
on Sep 17, 2007
Optical computers aren't the same thing as quantum computing--yet--and I'm more skeptical of optical computing. You have to fab optical components on Gallium Arsenide wafers instead of silicon, and the power consumption is way up there. Basically all your wires are lasers; you can imagine the power that takes. It would be nice if you could use optical components for just your long transmission lines (such as on circuit boards), but the technology to fuse GaAs with silicon doesn't exist yet.

On the other hand, if you started dealing on the tens of photons level, now you're talking--but now you're getting back into quantum computing.
on Sep 17, 2007
I'm more skeptical of optical computing


ahem, here's that link: WWW Link

gotta love skepticism
on Oct 05, 2007


as far as the 64 bit issue more broadly goes... it just depresses me, the whole subject. i love gaming, but i hate Windows, and i hate Apple more. it's not some fancy technological argument i have. since the original iMacs, Apple has designed its products (both hardware casing and GUI) as if its consumers were kindergartners. but now Windows is going that way too, and i hate it. i miss the days of the DOS prompt. i'm just whining, i guess.


If you hate Microsoft AND Apple you can always use a UNIX clone. If you REALLY like abusing yourself you could do a stage 1 install of Gentoo Linux (full OS compile) or get one of the DOS clones such as FreeDOS (i think that's the correct name). And if it's the chipset that is the problem (Intel/AMD or Apple) you can always make the switch over to one of the other architectures such as SPARC or Alpha CPUs.