AMD explains its strategy for coming back

APC: So where does AMD fit in today’s world where Intel seems to have the upper hand in the top performing CPUs?

Intel will make a lot of the fact that, in their view, you need to be spending a thousand dollars on a CPU and that everybody ought to drive the Ferrari, without giving a lot of thought to usage scenarios and the most commonplace price points that people are purchasing a computer at today. I guess I would equate our story to this: we’re bringing to market a Honda Accord or Toyota Camry. These are the two most popular cars on the planet – nobody’s ever criticised the Accord or Camry for not having Ferrari performance. People want a mix of fuel efficiency, performance, design and a hint of luxury – and I think that is what we’re bringing to market with “Cartwheel”: the triple-core Phenom and 780G chipset and, if you like, ATI hybrid graphics technology.

Why is that great? Well, you look at a triple-core CPU for $US145, a 780G motherboard plus a Radeon HD 3450 – our entry-level discrete card – and that gives you a basis to build a $500–$550 BluRay-playing PC with that beautiful balance of multitasking performance, connectivity, energy efficiency, quietness… and I think that as consumers get more sophisticated they start to ask what they’re really getting for their investment – is that extra money spent on the CPU really worth it in terms of return?

We’re really paying attention to usage scenarios and industry trends – what the consumer is using their PC for today. We’re bringing platform offerings to market that address that in all of its facets. We hope people look at Cartwheel the same way they look at popular, high-quality mainstream cars, because it’s not just about pure benchmark-level theoretical performance – it’s about solving the challenges and usage models that consumers have on a daily basis.

APC: You mentioned in your presentation earlier that AMD is the only company that supports GPU-accelerated decode of BluRay discs mastered with Microsoft’s VC-1 CODEC – and showed a demo of how BluRay VC-1 playback goes into jerky slideshow mode on a PC running your competitor’s chipset. Why is VC-1 so difficult to support?

Well, it’s not – our main competitor has made a mistake in placing their bet on H.264 and MPEG-2, but you’ve got a situation now where 25% of BluRay discs and a much larger proportion of HD-DVDs (especially important in China, where “China HD-DVD” is still very much in use) use VC-1 encoding. Discs bought in stores don’t say what CODEC is used, so people will often buy a BluRay disc and find their CPU is crapping out at 100%. It is obviously a lot more power efficient to run your decoding on a power-efficient graphics chipset than on a very power-hungry general-purpose CPU. A CPU is not very well designed for video decoding… so you end up burning a lot of power, and you probably won’t have enough battery life on your non-VC-1-enabled laptop to watch a full movie – very frustrating when your computer dies at the good part. (And then you won’t have the ability to charge up and sync your iPod before the plane lands so you have something to watch in the taxi on the way to the hotel.)

Is it hard to support VC-1? No – but not every vendor has succeeded in doing it, and that has created a wheel-of-fortune situation for people who buy our competitors’ products and want to buy BluRay discs and have the BluRay experience.

APC: You must have been pretty delighted when those results came out of the Microsoft lawsuit showing NVIDIA is responsible for 28% of Windows Vista crashes reported by users. What effect has this had on AMD’s standing in the marketplace?

Well, I think we’ve had a very good reputation for drivers since 2002, when we realised our drivers were not very highly regarded for stability. The Catalyst program was intended to create the world’s most stable drivers, including a WHQL program committing us to release a new Microsoft-certified driver every month. And it’s for those reasons that OEMs continue to purchase overwhelmingly from AMD, because the last thing they want is for their profit margins to be cut in half, or by three quarters, by what can be very expensive support calls to resolve complicated graphics-related crash issues.

By and large I don’t think our customers were surprised – they have a pretty good idea of where we have a key advantage in stability. What was unique was that we always had that data, but we were limited by Microsoft in our ability to share it because it is proprietary data. Microsoft collects the information to help make the experience better, not for companies to market with.

So when they put out the information, as part of their lawsuit, that NVIDIA was more than twice as likely as AMD to create a crash, it felt really good – based on all the work we had done and the engineering resources we’d put in, it was great to see we were being recognised.

APC: Did it boost sales?

Certainly in terms of OEMs it’s very important. It’s very important for anybody purchasing PCs in a commercial environment. And it’s very important for consumers too – if you want to have the best home computer, having stable drivers is absolutely key.

To broaden the discussion about drivers, something that people often don’t realise is that we routinely offer performance increases of 40% or more a year after they’ve bought their cards. We’re not only committed to producing stable drivers but also to continuously improving performance. I think, frankly, we should be making more noise about it. If we were to stand up and say that anyone who bought last year’s card would get this year’s card for free, that would be a sensation. But a driver update can actually give that amount of performance advantage too. In some applications it can mean an 80% boost in performance for free – just by updating your driver.

APC: You’re the first graphics vendor to fully support DirectX 10.1 – what does this mean for people and why has it been so hard for the industry to get to it?

You look historically at what’s happened – at how much time elapses between a new DirectX version becoming available and the first games appearing in the marketplace. When we brought DirectX 9 to the marketplace, we announced our capability in July 2002 but we didn’t start to see games appear until six months after that. So there’s always a lag – you have to code for the hardware, and if you don’t have the hardware you can’t test your code.

We brought DirectX 10.1 capability to market this January and here in April we’re already seeing the first game title to leverage that – Assassin’s Creed. Usually it takes longer than that – we’re getting games a little sooner than we had expected. So AMD was fortunate to be ahead of the rest of the industry.

In terms of the benefit you see from DirectX 10.1-enabled titles, certain types of game code can run faster on DirectX 10.1 – in the case of Assassin’s Creed you get about a 20% performance advantage (which is ironic, since it is an NVIDIA-sponsored game and only AMD hardware can offer the DirectX 10.1 capability). You also have a broader palette for game developers to create better imagery.

APC: Do you think Intel’s massive quad-core price drops are a deliberate spoiler for the release of the tri-core Phenom?

I think when you have a monopoly in the marketplace – which they had at the quad-core level for a period of time – basic economics dictates you can charge more. But over time price drops will happen as competitive tensions heighten in the marketplace. I would categorise it as a competitive move.

APC: Is it harder to market tri-core Phenoms now that Intel has dropped its quad-core pricing so much?

I think everyone likes the idea of getting an extra core and additional multitasking capabilities at a low price point. You’re getting a tremendous amount of extra flexibility in the $US150 price range. We’re seeing great demand for it – in the consumer space and in the business space. This is, of course, the anchor for AMD Business Class. So I don’t think we’re seeing lack-of-demand issues related to other quad-core offerings in the marketplace.

APC: Is it true that tri-core Phenoms are actually quad-core chips where one of the cores is faulty?

It’s been commonplace in the semiconductor industry to disable parts of a chip so you can sell it at various price points – we’ve been doing it in the graphics industry for as long as I can remember. It would be a mistake to assume our tri-core chips are being produced only because one of the cores is broken or not working – it has more to do with adjusting a product to make it fit at a particular price point. Certainly there will be instances where a particular core does not work, but in other cases we will have disabled a working core so we can sell the chip at a particular price point. It’s nothing new in the semiconductor space – it’s only new in the CPU space.

APC: You’ve just released the updated Opteron and Phenom CPUs that fix the chip error that became such a show stopper upon their original release. Has that been a rough time for the company?

I think what was particularly disheartening about it was that we were dealing with errata 298, which was not an issue for regular client PCs. Even when you disabled the errata workaround in the BIOS, unless you were running highly specific, highly virtualised server-environment software on your desktop (which practically nobody does on a client device), you didn’t see the error.

There was a perception in the marketplace that these were buggy chips when in fact the issue we’d identified would not have created a problem in the client scenario.

Also, it’s nothing new for a chip coming off the production line to have errata that require workarounds. This was errata 298, after all – it doesn’t take a rocket scientist to tell that there were 297 other errata before it that were addressed with workarounds. It happens with every processor, no matter the manufacturer – it just happens that occasionally the tech community seizes on a particular erratum as a news item.

This one happened to get some bad publicity, but to date we’ve not had a single person come to us and say they saw a problem in their client device related to this erratum. It was only in a highly specific server environment that this would come to bear.

APC: Your latest GPUs are using a 55nm process which you say is the densest process currently used in the GPU industry – what will this translate to in real terms?

It obviously creates a better situation. We’ve got an advantage in process technology – smaller chips, less power, more energy efficiency, less heat, less cooling required… and you couple that with other features like clock gating, which allows us to turn off sections of the chip when they’re not used. It allows us to throttle down parts of the chip to provide a very robust power-saving story for notebooks.

That extends to the mobile version of the 780G chipset – less than 1W at idle, but we still have 40 stream processors: all the power of a discrete mobile GPU, but in IGP format.

APC: You showed a truly incredible graphics demo at your Singapore technology update event running on your current hardware. If this is where we’re at now, can you tell me what will be on the table with the upcoming R700 series of graphics processors?

Well, we’re moving towards a level of digital visual realism that we’ve never seen before. In terms of the little bits and pieces I’ve seen so far, this really is a vast step forward in the kinds of interactive experiences you can enjoy on a PC – and the industry will see that.

During the Spider launch we worked very closely with a middleware developer by the name of Jules Urbach, whose company creates software that allows movie assets to be rendered in real time on a GPU rather than taking 30 hours per frame to render (he threw out that number).

There is still a pretty big delta in terms of what you can see in Hollywood 3D environments – e.g. in I Am Legend there’s a very compelling post-industrial scene where you are looking at 3D special effects: what New York City would look like if it were hit by plague and weeds were growing out of the road. What you see in Hollywood is one level of digital realism. The problem is you can’t have that in real time – you have to wait 30 hours for every frame to render, and you can’t enjoy that in a game. You can’t interact with graphics of that quality.

In a video game environment you don’t have that same level of digital realism – but you look at something like our Toy Shop demo and you’re starting to see that level of visual realism creep into games. When you try to get that level of digital realism, though, you generally sacrifice the ability to interact with it in real time.

Over time we’ve been espousing the view that convergence is finally taking place. Graphics hardware is increasing in performance at a very rapid rate – it’s starting to catch up with cinema. I think in the next year you are going to start to see companies like AMD begin to render Hollywood-level visual realism and special effects in real time on the GPU.

Based on what I’ve seen of what we are going to bring to market next, we are quickly moving into a realm where we can create demos that are – for some people – difficult to tell apart from what you see in Hollywood.

The fact that we can demonstrate it on the GPU doesn’t mean you’ll immediately see it in games – there’s always a four or five year lag between when you can demonstrate it in a canned demo and when you can play it in video games because you need to wait for the high-end technology to trickle down to enough end users for developers to see that it’s worthwhile to develop for it.

We’ll see little bits and pieces of true cinematic realism rendered in real time this year though. And soon we’ll be able to do more and more of that.

When we talk about the ultimate visual experience, that can be interpreted as creating video games that are every bit as compelling and realistic as cinema, but with the ability to interact with them.

APC: Will AMD become a leader in overall notebook performance?

Certainly we’ll be a leader in notebook performance – depending on what aspect of performance you’re talking about and what exactly you’re trying to process. But equally important to performance is battery life, and you can certainly get notebooks today that have eight hours of battery life, but they may not have a form factor you want – they might be big and clunky and heavy. Or they may not have the features you want.

The trick is to have that useful balance: you want the battery life, you want all your applications, and you also want all the features. It’s finding the balance. I wouldn’t necessarily say it’s just about performance – it’s also about features, flexibility and battery life.

John Taylor was talking about the volume of design wins – there’s huge market acceptance for Puma. The market is latching on in a way it hasn’t to our previous mobile platform solutions. We talk about being a smarter-choice company, but when you see volumes of design wins, that’s the proof point.

APC: Thanks for your time, Chris.

Dan Warne travelled to AMD Tech Update Singapore as a guest of AMD.