foobar on computers, software and the rest of the world

The GPU, your personal desktop super computer

Posted: 6-Feb-2009 07:19

There once was a time (the 70s, 80s and early 90s) when a lot of interesting research was done in the field of computer architectures. Researchers and specialised companies developed MIMD, SIMD, array, vector, pipeline and other exciting and innovative computing architectures in order to win the race to the top and claim the 'fastest computer in the world' title (usually measured in FLOPS, Floating Point Operations per Second). The money was there to buy those multi-million dollar machines; universities and many government agencies were the customers. The names of the companies that created these computing behemoths still resonate: Thinking Machines, nCUBE, MasPar, Cray, and a smattering of even smaller ones, together with some of the usual suspects, such as Intel, IBM and NEC.

Interesting architectures were needed because of the von Neumann bottleneck: ultimately, your performance is limited by the transfer between memory and CPU, which is relatively slow compared to the fast CPUs. So, massively parallel architectures were developed, sometimes with thousands of CPUs, and very often with highly distributed memory (Thinking Machines, nCUBE, Intel, IBM). Or innovative ways of organising your workload were developed, as in vector or pipeline processing, for example (Cray). Many hybrid approaches were tried as well. This didn't necessarily work on all algorithms, and wasn't efficient for all workloads. But the kind of work these computers needed to do - such as simulating exploding nuclear weapons with simulations requiring massive data sets - was highly parallelisable and really benefited from those architectures.
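The appeal of those massively parallel, distributed-memory machines can be sketched with a toy example: a workload whose data set splits into independent chunks, each processed with no communication between workers, with the partial results combined at the end. This is just an illustration of the pattern (names like `simulate_chunk` are my own, and Python threads stand in for what would really be thousands of separate processors):

```python
# Toy illustration of an "embarrassingly parallel" workload, the kind
# that suited massively parallel supercomputers: split the data into
# independent chunks, process each with no inter-worker communication,
# then combine the partial results (a map/reduce pattern).
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(cells):
    # Stand-in for one node's share of a big numerical simulation:
    # each cell is updated independently of all the others.
    return sum(c * c for c in cells)

def simulate(cells, workers=4):
    # Split the "grid" into one chunk per worker and fan out.
    # Real machines ran each chunk on a separate processor with its
    # own local memory; threads here just demonstrate the structure.
    chunks = [cells[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

grid = list(range(1000))
total = simulate(grid)
print(total)  # same result as the plain sequential sum of squares
```

The interesting engineering was precisely in the part this sketch hand-waves away: moving the chunks to the processors and the results back, which is why those machines had such exotic interconnection networks.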

That magical time of flush funding and research was called 'the cold war'. Things were predictable, money was available, research was interesting, times were good ... in their own strange sort of way.

But alas, it couldn't go on forever. The cold war ended. Worse yet, the nuclear arms race ended as well. So, while the world rejoiced in a new-found illusion of peace and security, the militaries slowly realised that future wars would most likely not be fought with nuclear weapons, but with bombs strapped to bodies and placed by roadsides, and other partisan tactics. The demand for high-end supercomputers dropped. The makers of these machines either folded or tried to find salvation in the commercial world. Some architectures that were designed to pump massive data sets to thousands of CPUs turned out to be quite adept at handling great IO loads as well, so the vendors tried to retool their machines to become database or video servers.

Nevertheless, in the end those specialised manufacturers couldn't compete in sheer $/TPS value with servers built from off-the-shelf components. Many of those specialised supercomputer companies had their intellectual capital not only in the computer architecture, but also in custom CPU designs - for example, CPUs with special on-board hardware to help route data through innovative grids, hyper-cubes, fat-trees and other fascinating interconnection networks. The CPUs were good at doing this, but those small design shops simply couldn't keep up with the raw performance of the mainstream CPUs, especially since Intel and AMD had just started their MHz (soon GHz) race. I mean, seriously, who can make a 100 MHz CPU? That is so ... fast!

Pretty quickly it became clear that those specialised shops couldn't compete on CPU speed in the free market, without the support of the cold-war funding frenzy. And the specialised interconnection features of those CPUs weren't much use either: the workloads in the enterprise were different from those in the scientific world, and larger CPU caches alleviated the von Neumann bottleneck sufficiently. Standard servers, as we still know them today, were simply 'good enough' and too cheap to argue with. Today we still see supercomputers, but the architectures are far from innovative, it seems. It's all just heavily parallelised, often simply consisting of clusters of off-the-shelf servers. It's a pretty sad state of affairs.

Architecture research has moved on in one particular area, though: the GPU (Graphics Processing Unit). Or in other words: your humble graphics card. More and more, these highly specialised processors - paired with very fast memory - can perform the complex calculations needed for fast rendering of physically realistic scenes in games, for example. So, not only the actual graphics, but also the physics of the world that is being shown can be processed here. And because these units are produced in large numbers for the mass market, they are unbeatably cheap. A few hundred dollars gives you computing power that could only be had for tens of millions of dollars just ten years ago.
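What makes a GPU so fast for this kind of work is its programming model: one small "kernel" applied independently to every element of a large data set, with thousands of such invocations running concurrently. Here is a minimal sketch of that model in plain Python (sequential, of course - the function names are my own, and a real GPU kernel would be written in something like CUDA or a shader language):

```python
# A toy illustration of the data-parallel style that GPUs exploit:
# one small kernel function applied independently to every element
# of a large array. On a real GPU, thousands of these kernel
# invocations execute concurrently; here we just run them in a loop.

def saxpy_kernel(a, x, y):
    # The classic SAXPY operation, a*x + y, computed per element.
    # Each invocation touches only its own x and y - no dependencies.
    return a * x + y

def launch(kernel, a, xs, ys):
    # Stand-in for launching a kernel over N "threads": one logical
    # thread per index. The independence of the indices is exactly
    # what makes the workload GPU-friendly.
    return [kernel(a, x, y) for x, y in zip(xs, ys)]

xs = [1.0, 2.0, 3.0]
ys = [10.0, 20.0, 30.0]
result = launch(saxpy_kernel, 2.0, xs, ys)
print(result)  # [12.0, 24.0, 36.0]
```

Graphics rendering, game physics and molecular dynamics all fit this per-element shape, which is why the same cheap mass-market hardware serves all of them.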

For some time now there has been active research into using the inherent power of modern GPUs for general scientific computing. One of the more astonishing pieces of software in this field is OpenMM, which is used for molecular modeling. I'm not a chemist, but supposedly it's really cool. Now take a look at this article here. Some quotes:
In the past, researchers needed either supercomputers or large computer clusters to run simulations. Or they had to be content to run only a tiny fraction of the process on their desktop computers. But a new open-source software package developed at Stanford University is making it possible to do complex simulations of molecular motion on desktop computers at much faster speeds than has been previously possible.

"Simulations that used to take three years can now be completed in a few days." ...

OpenMM is a collaborative project between Pande's lab and Simbios, the National Center for Physics-based Simulation of Biological Structures at Stanford. ...

The key to the accelerated simulations OpenMM makes possible is the advantage it takes of current graphics processors (GPUs), which cost just a few hundred dollars. At its core, OpenMM makes use of GPU acceleration, a set of advanced hardware and software technologies that enable GPUs, working in concert with the system's central processor (CPU), to accelerate applications beyond just creating or manipulating graphics.

The icing on the molecular-simulation cake is that the software has no allegiance to any particular brand of GPU, meaning it is, as computer geeks like to say, "brand agnostic." OpenMM will enable molecular dynamics (MD) simulations to work on most of the high-end GPUs used today in laptop and desktop computers.

Do in days what used to take years? I don't need no stinkin' Cray. I've got an nVidia! :-) The times we live in...



Comment by boby55, on 6-Feb-2009 09:57

I think it would be fair to say,

Do in milliseconds what used to be done in days?

I remember back at school we used to program the computers to count to 1000000 and it would take all day to finish; now it's done in the blink of an eye.

Comment by GeekAdviser, on 6-Feb-2009 10:19

It would be cool to have an old Cray Supercomputer, just for the nostalgic value. Although, it might take a large room to house it.

Comment by robscovell, on 17-Jul-2009 08:31

The good old days of the cold war eh!
Kids these days don't know how lucky they are ... my generation in the UK lived believing that nuclear holocaust was somewhere between highly likely and inevitable. The apocalyptic movies of my teenage years featured mass airbursts of high megaton nukes. 'Jericho' with its pathetic terrorist ground burst suitcase bombs hardly compares with 'Threads' or 'The Day After'. Bad but not MAD!
Funny how nostalgic I now am about the cold war stuff -- listening to shortwave broadcasts of propaganda, thinking about what I would do when the sirens went off (grab a girl, lose our virginity and die happy was what most of us teenage lads came up with!). We all wanted to fry (remember the Tom Lehrer song?) rather than 'survive' and linger on, cancer-ridden and traumatised.
But back on topic -- I am following quantum computing research with interest ... wonder how long before that gets into the desktop?


foobar's profile

New Zealand

  • Who I am: Software developer and consultant.
  • What I do: System level programming, Linux/Unix. C, C++, Java, Python, and a long time ago even Assembler.
  • What I like: I'm a big fan of free and open source software. I'm Windows-free, running Ubuntu on my laptop. To a somewhat lesser degree, I also follow the SaaS industry.
  • Where I have been: Here and there, all over the place.
