ATI has finally unveiled what its 'stream computing' initiative is all about. It looks mighty powerful, and you might be able to benefit from it yourself within a day.
As mentioned previously, news surrounding ATI's stream computing has been fairly hazy. Well, the company has now officially launched the initiative.
This is ATI kicking off support for a broad range of applications that can run on its graphics cards, beyond graphics. It's exactly as we thought it would be and, yes, it involves damn tasty performance treats.
Called 'stream computing', this technique involves running calculations that aren't necessarily graphically intensive, yet can take advantage of the extremely powerful vector processor and pixel shaders found on modern graphics cards.
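To make the idea concrete, here is a minimal sketch of the stream-programming model in Python. The kernel function and names are hypothetical illustrations, not ATI's actual API: the point is that one small computation is applied independently to every element of a data stream, which is exactly the shape of work a GPU's many shader processors can run in parallel.

```python
def kernel(x):
    # Per-element computation with no dependency on neighbouring
    # elements, so each one could run on a separate shader unit.
    return 3.0 * x * x + 2.0 * x + 1.0

def run_stream(stream):
    # On a CPU this is a sequential loop; a stream processor would
    # dispatch the elements across its shader units simultaneously.
    return [kernel(x) for x in stream]

print(run_stream([0.0, 1.0, 2.0]))  # each element transformed independently
```

The absence of dependencies between elements is what lets dozens of pixel shader processors chew through the stream at once, and it is why workloads like protein folding and rigid-body physics map onto GPUs so well.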
The distributed computing project, Folding@Home, is one area in which you can already see this in action. ATI says that in comparison to a Pentium 4 2.8GHz processor, running F@H on one of its stream computing-enabled graphics cards produces a forty-fold speed boost.
In other words, this means outputting results in one month for what would ordinarily take three years on a CPU. Stream computing also enables Stanford researchers to dig for answers to questions that were previously impossible to "tackle computationally."
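The arithmetic behind that claim checks out, as a quick back-of-the-envelope calculation shows:

```python
# Sanity-checking ATI's figures: a forty-fold speed-up turns roughly
# three years of CPU work into roughly one month on the GPU.
cpu_months = 3 * 12           # three years of CPU time, in months
gpu_months = cpu_months / 40  # the claimed forty-fold speed boost
print(gpu_months)             # about 0.9 months, i.e. roughly one month
```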
Later today or early tomorrow you will be able to download an application for F@H from the project's website that can run on your graphics card. ATI will also have its new stream computing-enabled graphics driver ready (Catalyst 6.10). The caveat to running this application is that your ATI RADEON card must be X19**-flavoured. We suspect this has to do with the 36 to 48 onboard pixel shader processors.
Physics processing is another prominent area in which stream computing excels. As such, Havok was also present at the announcement. "Realistic physics is the future of videogames and together we're making it possible," said Havok's VP of marketing, Jeff Yates, continuing: "Just as real-time lighting and shadows are standard in today's games, there will come a time when no game is without this level of immersive, true-to-life physics."
This may mark another nail in the coffin for a certain physics processor, especially considering new GPUs with such powerful capabilities are more easily introduced into the market than an entirely new genre of processor.
No mention was made of plans for AI support in games and I would at least bat an eyelid or three if it were to be announced in the future.
Yes, I'd be vaguely surprised. There's only so much you can do with the available bandwidth between a graphics card and the rest of the system. That, and whether logic-based AI has any computational advantage on vector-based GPUs is questionable.
With Vista using the GPU for its graphical interface, and whatever else the GPU ends up doing, this official support from ATI marks a major shift in the way we'll look at GPUs in the very near future.
Even with a more generalised use for the mighty GPU, this isn't the end of the CPU. The pressure on it to be ever more powerful, however, is likely to ease -- and not solely for gamers, but across a great many other fields for both academic and enterprise purposes, including scientific research, defence, finance, and even the oil and gas industry.
Assuming ATI gets its way, graphics cards soon won't just be graphics cards.