60 Comments
deathwalker - Wednesday, April 18, 2007 - link
So, what's the word on the Ultra version of the 8600? Has that fallen by the wayside?
crystal clear - Wednesday, April 18, 2007 - link
Interview: NVIDIA's Keita Iida
The future of Direct X, Crysis and PS3 under the spotlight.
Keita Iida, Director of Content Management at NVIDIA sat down with IGN AU to discuss all things Direct X 10 and the evolution of their Geforce graphics cards. Iida goes into detail on the differences between developing for the PS3's RSX graphics processor, and the latest development tools to hit the scene.
quote:
Selected portions of the interview-
IGN AU: What are your thoughts on Microsoft effectively forcing gamers to upgrade to Vista in order to run Direct X 10 - when there's no real reason why it can't run on Windows XP?
Keita Iida: It's a business and marketing decision.
IGN AU: Can you comment on what happened with NVIDIA's Vista drivers? You guys have had access to Vista for years to build drivers and at the launch of Vista there were no drivers. The ones that are out now are still basically crippled. Why did this happen?
Keita Iida: On a high level, we had to prioritise. In our case, we have DX9, DX10, multiple APIs, Vista and XP - the driver models are completely different, and the DX9 and 10 drivers are completely different. Then you have single- and multi-card SLI - there are many variables to consider. Given that we were so far ahead with DX10 hardware, we've had to make sure that the drivers, although not necessarily available to a wide degree, or not stable, were good enough from a development standpoint.
If you compare our situation to our competitor's, we have double the variables to consider when we write the drivers; they have much more time to optimise and make sure their drivers work well on their DX10 hardware when it comes out. We've had to balance our priorities between making sure we have proper DX10 feature-supported drivers to facilitate development of DX10 content, but also make sure that the end user will have a good experience on Vista. To some degree, I think that we may have underestimated how many resources were necessary to have a stable Vista driver off the bat. I can assure you and your readers that our first priority right now is not performance, not anything else; it's stability and all the features supported on Vista.
IGN AU: So what kind of timeline are we looking at until the end user can be comfortable with Vista drivers? With DX9 drivers that work as stably and quickly as they do with XP?
Keita Iida: We're ramping up the frequency of our Vista driver releases. Users will probably understand that we release a number of beta drivers on our site, so we're making incremental progress. We believe that, in a very short time we will have addressed the vast majority, if not all of the issues. We've had teams who were working on other projects who have mobilised to make sure that as quickly as possible we have the drivers fixed. I'm not going to give you an exact timeframe, but it's going to be very soon. We're disappointed that we couldn't do it right off the bat, but we hear what everyone is saying and we're willing to fix it.
http://pc.ign.com/articles/780/780314p1.html
xpose - Tuesday, April 17, 2007 - link
This next-gen PureVideo stuff sounds amazing. I thought I was gonna have to get a new motherboard and dual-core CPU to play some HD-DVD content smoothly. Please, do try and rush testing the PureVideo stuff ASAP. Blu-ray and HD-DVD are growing...
shabby - Tuesday, April 17, 2007 - link
128-bit/256MB for $200? Gimme a break.
Sunrise089 - Tuesday, April 17, 2007 - link
Unless these cards are magically fast under DX10 (and we all know they won't be; they will play Crysis, but not quickly) they offer less performance than even midrange parts from the last gen.
Anyone remember how a 6600GT offered 9800 Pro-beating performance, and how nVidia sold millions of them? I don't see that happening here. What I do see is a wait-and-see attitude. Does anyone else think it's VERY suspicious that there are no 64-shader cards? Here is what may happen: nVidia waits for the midrange AMD cards to emerge. If they offer better performance, nVidia slashes prices of these and releases an 8800GS with 64 shaders for $200. I won't be surprised at all if that's what we have in 3 months.
JarredWalton - Tuesday, April 17, 2007 - link
We've got 128 SP on the GTX, 96 on the GTS... and then 32 on the G84. I'd say there's definitely room for 64 SP from NVIDIA, and possibly 48 SP as well. Will they go that route, though? Unless they've already been working on it, doing a new chip will cost quite a bit of time and effort. I was expecting 8600 to be 64 SP and 8300 to be 32 SP before we had any details, but then the 8600 probably would have been too close to the 8800.
kilkennycat - Tuesday, April 17, 2007 - link
Er, wait (not too long) for nVidia's re-roll of the 8xxx-series on 65nm... You might just get your wish. I believe that nV is copying Intel's 'tick-tock' process strategy - debut the architecture in production on a mature process (80nm half-node), then transfer and refine the implementation on the new process. Note the interesting and important tweaks in the implementation of the 8600 vs 8800... which gives a glimpse of the future 65nm 9xxx(??)-family architecture, but with higher numbers of stream processors and high-precision math processing for the expected GPGPU applications.
nVidia has already hinted that the successor to the 8800 will be available before the end of 2007, and no doubt it will be on 65nm for the obvious cost and yield reasons. If the R600 turns out to be a true contender for the 8800 "crown" in the same price range, then I fully expect nV to accelerate the appearance of the 8800 successor. No doubt the design was started long before the 8800 itself was production-available.
Toebot - Tuesday, April 17, 2007 - link
No, nothing to sneeze at, just something to blow my nose on! Utter wretch. This card is NVidia's attempt to milk the Vista market, nothing more.
DerekWilson - Tuesday, April 17, 2007 - link
We should at least wait and see what DX10 performance looks like first.
AdamK47 - Tuesday, April 17, 2007 - link
With what software?
shabby - Tuesday, April 17, 2007 - link
3dmark vista edition of course!
munky - Tuesday, April 17, 2007 - link
Why did you not include the x1950xt in the test lineup? It can also be had for about $200 now, like the 7950gt. You didn't want to make the 8600 series look worse than they already do, or what?
DerekWilson - Tuesday, April 17, 2007 - link
Wow, sorry for the omission -- I was trying to include specific comparison points -- 3 from AMD and 3 from NVIDIA, but this one just slipped through the cracks. Sorry. It will be included in my performance update.
Elwe - Tuesday, April 17, 2007 - link
Now, now guys. True that these cards are not going to be what many of you want (there are some good reasons to stay with what you have, considering the performance differential of several of the last-generation cards). And it is clear that these cards will not touch the 8800 cards (from what I can tell, the only thing these do better is 100% PureVideo HD on the card, which I guess is because these might be paired with not-so-great CPUs).
But for some of us, they might work. I recently bought a Dell 390 workstation. I packed it with fast drives, a QX6700 CPU, and 4GB of RAM. There were very few BTO graphics choices, and most centered around the Pro market (this is a "workstation" after all). This is a new machine, and quite powerful! I want to work and play on this box. Because of the relatively weak power supply (rated at 375 watts or something like that) and because I need both available non-graphics PCIe slots (if you put in an 8800 GTS, even if you changed the power supply, this type of dual-slot card will cover one of those slots), I have been waiting for something reasonably powerful to come along (again, I am not going to just work on this box; I would like to play UT2k7, too:). Since I run Linux, I was trying to stick with the Nvidia line (my experience is that they have better drivers for this platform, but perhaps ATI has stepped up in the last half year or so). I could have gone with the 79xx line (single slot), but I wanted to see what the new generation would bring. Depending on what you need/want, I think either a slightly-used 7950GT OC or an 8600 GTS would work just fine for me. It does not seem unreasonable to me that in some things the older higher-end card is faster than the newer mid-range card, and vice versa. But I did not see any benchmark where the 79xx line whooped the 8600 GTS thoroughly (like what happened in several benchmarks comparing the 8800 and 8600).
I would say that the only immediate problem I might have with using the 8600 GTS is for gaming at high resolutions. I have a Dell 2407, and Anandtech's benchmarks make it clear I should not be gaming at that high a resolution. Bummer. The 7950 GT OC might very well be the better option here.
In an ideal world, I really would like the power of an 8800 (and, fortunately, I can pay for it). But I really need the PCIe slot more, and changing out the power supply would add even more cost. I could have gotten another Dell model (like the XPS 710 or the Precision 490)--and I am thinking about just that. But I got the 390 for what I considered good reasons (a damned sight cheaper than the 490, and I have no need of another cpu socket when I can have 4 cores in one socket), and the XPS 710 did not have BTO storage options that I wanted (not sure why they could not design that thing to have more than two internal drives--the thing is big enough; maybe most games do not need it, as that is what the machine was designed for). I bet I am not the only one.
GhandiInstinct - Tuesday, April 17, 2007 - link
Masses would include AGP cards...I see no AGP DX10 cards...
aka1nas - Tuesday, April 17, 2007 - link
The "masses" don't build their own computers, and thus have long since stopped purchasing machines with AGP slots.GhandiInstinct - Tuesday, April 17, 2007 - link
The "masses" also don't go hunting for DX10 cards to add FPS to their hardcore Dell and Gateway gaming rigs.Be honest with yourself, the people going for these cards are custom riggers.
AGP DX10 please, theres hundreds of thousands with Pentium 3.4 Northwoods that know their processors will run BioShock well, but they need DX10 without paying for a new motherboard, DDR2, and everything else, including Vista!!!
JarredWalton - Tuesday, April 17, 2007 - link
Actually, I don't think anyone "knows" whether or not any current system will run BioShock well. Let's wait for the game to appear at least. We're still at least four months away (assuming they hit the current release date).
While I can understand people complaining about the lack of AGP cards, let's be honest: why should either company invest a lot of money in an old platform? It takes time to make the AGP cards and more time to make sure the drivers all work right. At some point, the old tech has to be left behind. The cost to transition from an AGP setup to a PCIe setup is often under $100, so if the AGP cards had a $50 price premium you'd only save yourself $50 and still be stuck with the older platform.
I figure AMD/ATI and NVIDIA basically ignored the complaints with X1900/7900 class hardware (the best was several notches below what was available on PCIe), and at this point I think they're done. I'd even go so far as to say we're probably now at the point where an AGP platform would start to be a bottleneck with current hardware - maybe not midrange stuff, but certainly the high-end offerings.
Let's put it another way: why can't I get something faster than a single-core 2.4GHz 1MB-cache Athlon 64 3700+ for socket 754? Why can't I get a Pentium D or Core 2 Duo for socket 478? Why do we need new motherboards for Core 2 Duo when socket 775 is used with 915/925? Intel and AMD have forced transitions on users for years, and after a long run it's tough to say that AGP hasn't fulfilled its purpose. Such is the life of PC hardware.
GhandiInstinct - Tuesday, April 17, 2007 - link
Good points, I agree with them all.
Basically, I feel that my 3.2 Northwood with 2GB of RAM is worth salvaging for BioShock and Hellgate - obviously not Crysis, but it's convenient that it will be released in '08.
I figure I can hold out 8 more months, save up during this time, and switch to quad and DDR3.
I service hundreds of clients a week in tech support who have AGP setups, and I don't think Nvidia and ATi will abandon AGP with DX10, especially since there is speculation that they will be releasing these cards in the future: http://www.theinquirer.net/default.aspx?article=37...
:)
LoneWolf15 - Tuesday, April 17, 2007 - link
quote: While it would be nice to have this hardware in NVIDIA's higher end offerings, this technology arguably makes more sense in mainstream parts. High end, expensive graphics cards are usually paired with high end expensive CPUs and lots of RAM. The decode assistance that these higher end cards offer is more than enough to enable a high end CPU to handle the hardest hitting HD videos. With mainstream graphics hardware providing a huge amount of decode assistance, the lower end CPUs that people pair with this hardware will benefit greatly.
IMO, this is absolute bollocks.
If I'm paying for nVidia's high-end stuff, I expect high-end everything. And this is at least the second time nVidia has only improved video on their second-round or midrange parts (anybody remember NV40/45?).
I game some, and I want good performance for that. But, I also have a 1920x1200 display, and I want the best video playback experience I can get on it. I also want the lower CPU-usage so I can playback video while my system is left to do other processor-intensive tasks in the background.
Once again, nVidia has really disappointed me in this area. In comparison, ATI seems to be much better at making sure their full range of cards supports their best video technologies. This (along with nVidia's driver development) continues to make the G80 seem like a "rushed-out-the-door" product.
kilkennycat - Tuesday, April 17, 2007 - link
(As of 8AM Pacific Time, April 17)
See:
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...
Chadder007 - Tuesday, April 17, 2007 - link
That's really not too bad for a DX10 part. I just wish we actually had some DX10 games to see how it performs, though...
bob4432 - Tuesday, April 17, 2007 - link
That performance is horrible. Everyone here is pretty dead on - this is strictly for marketing to the non-educated gamer. Too bad they will be disappointed and probably return such a piece of sh!t item. What a joke.
Come on ATI, this kind of performance should be in the low-end cards; this is not a mid-range card. Maybe if nvidia sold them for $100-$140 they might end up in somebody's HTPC, but that is about all they are good for.
Glad I have a 360 to ride out this phase of cards while my x1800xt still works fine for my duties.
If I were the upper management at nvidia, people would be fired over this horrible performance, but sadly the upper management is more than likely the cause of this joke of a release.
AdamK47 - Tuesday, April 17, 2007 - link
AdamK47 - Tuesday, April 17, 2007 - link
nVidia needs to have people with actual product knowledge dictate what the specifications of future products will be. This disappointing lineup has marketing written all over it. They need to wise up or they will end up like Intel and its failed, marketing-derived NetBurst architecture.
wingless - Tuesday, April 17, 2007 - link
In the article they talk about the PureVideo features as if they are brand new. Does this mean they ARE NOT implemented in the 8800 series? The article talked about how 100% of the video decoding process is on the GPU, but it did not mention the 8800 core, which worries the heck outta me. Also, does the G84 have CUDA capabilities?
DerekWilson - Tuesday, April 17, 2007 - link
CUDA is supported.
DerekWilson - Tuesday, April 17, 2007 - link
The 8800 series supports PureVideo HD the same way the GeForce 7 series does -- through VP1 hardware.
The 8600 and below support PureVideo HD through VP2 hardware, the BSP, and other enhancements which allow 100% offload of decode.
While the 8800 is able to offload much of the process, it's not 100% like the 8600/8500. Both support PureVideo HD, but G84 does it with lower CPU usage.
wingless - Tuesday, April 17, 2007 - link
I just checked NVIDIA's website and it appears only the 8600 and 8500 series support PureVideo HD, which sucks balls. I want 8800GTS performance with PureVideo HD support. Guess I'll have to wait a few more months, or go ATI, but ATI's future isn't stable these days.
defter - Tuesday, April 17, 2007 - link
Why do you want 8800GTS performance with improved PureVideo HD support? Are you going to pair an 8800GTS with a $40 Celeron? The 8800GTS has more than enough power to decode H.264 at HD resolutions as long as you pair it with a modern CPU: http://www.anandtech.com/printarticle.aspx?i=2886
This improved PureVideo HD is aimed at low-end systems that are using a low-end CPU. That's why this feature is important for low/mid-range GPUs.
wingless - Tuesday, April 17, 2007 - link
If I'm going to spend this kind of money for an 8800 series card then I want VP2 100% hardware decoding. Is that too much to ask? I want all the extra bells and whistles. Damn, I may have to go ATI for the first time since 1987 when I had that EGA Wonder.
JarredWalton - Tuesday, April 17, 2007 - link
It's not surprising that G84 has some enhancements relative to G80. I mean, G80 was done six months ago. I'd expect VP2 is one of the areas they worked on improving a lot after comments post-8800 launch. Now, should they kill the current G80 and make a new G80 v1.1 with VP2? That's up for debate, but you can't whine that older hardware doesn't have newer features. "Why doesn't my Core 2 Duo support SSE4?" It's almost the same thing. I wouldn't be at all surprised to see a new high-end card from NVIDIA in the future with VP2, but when that will be... dunno.
harshw - Tuesday, April 17, 2007 - link
quote: It is important to emphasize the fact that HDCP is supported over dual-link DVI, allowing 8600 and 8500 hardware to play HDCP protected content at its full resolution on any monitor capable of displaying 1920x1080
So ... to confirm, the card *does* let you watch HDCP content on a Dell 3007WFP at 2560x1600? Of course, the card would probably scale the stream to the panel resolution ...
DerekWilson - Tuesday, April 17, 2007 - link
The card will let you watch HDCP protected content at the content's native resolution -- 1920x1080 progressive at max ...
Currently if you want to watch HDCP protected content on a Dell 30", you need to drop your screen resolution to 1280x800 and watch at that res -- the video is downscaled from 1920x1080. Higher resolutions on the panel require dual-link DVI, and now HDCP protected content over a dual-link connection is here.
AnnonymousCoward - Tuesday, April 17, 2007 - link
Maybe I'm in the minority, but I don't care about this HDCP business. The players are still ultra expensive, and the resolution benefit doesn't really change how much I enjoy a movie. Also, a 30" screen is pretty small to be able to notice a difference between HD and DVD, if you're sitting at any typical movie-watching distance from the screen. Well, I would guess so at least.
Spoelie - Wednesday, April 18, 2007 - link
We're talking about 30" LCD monitors with humongous resolutions, not old 30" LCD TVs with 1366x768 or something.
Or do you really not see any difference between
http://www.imagehosting.com/out.php/i433150_BasicR... and http://www.imagehosting.com/out.php/i433192_HDDVD....
or
http://www.imagehosting.com/out.php/i433157_BasicR... and http://www.imagehosting.com/out.php/i433198_HDDVD....
Myrandex - Tuesday, April 17, 2007 - link
I love how the two 8600 cards listed 256MB of memory only, while the 8500 card showed 256MB / 512MB. Gotta love marketing attempting to grab the masses' attention by throwing more RAM into a situation where it doesn't really help...
Jason
KhoiFather - Tuesday, April 17, 2007 - link
Horrible, horrible performance. I'm so disappointed it's not even funny! I'm so waiting for ATI to release their mid-range cards and blow Nvidia out of the water and into space.
jay401 - Tuesday, April 17, 2007 - link
quote: We haven't done any Windows Vista testing this time around, as we still care about maximum performance and testing in the environment most people will be using their hardware. This is not to say that we are ignoring Vista: we will be looking into DX10 benchmarks in the very near future. Right now, there is just no reason to move our testing to a new platform.
Very true, and not only because the vast majority of gamers are still running XP, but also because no games out to this point gain anything from DX10/Vista (aside from one or two that add a few graphical tweaks here and there in DX10).
When there are enough popular, well-reviewed DX10/Vista focused games available that demonstrate appreciable performance improvement when running in that environment, such that you can create a test suite around those games, then it would be time to transition to that sort of test setup for GPUs.
Griswold - Tuesday, April 17, 2007 - link
The real reason would be that nobody wants to go through the nightmare of dealing with nvidia drivers under vista. ;)
jay401 - Tuesday, April 17, 2007 - link
Derek, you should add the specs of the 8800GTS 320MB to the spec chart on page 2 - unless of course NVidia forbids you to do that because it would make it too obvious how many stream processors and how much bus width they've cut from these new cards.
Now what they'll do is end production of the 7950GTs to ensure folks can't continue to pick them up cheaper and will be forced to move to the 8600GTS that doesn't yet offer superior performance.
gg neutering these cards so much that they lose to your own previous generation hardware, NVidia.
poohbear - Tuesday, April 17, 2007 - link
Sweet review on new tech! Thanks for the bar graphs this time! Good to know my 512MB X1900XTX still kicks mainstream butt. :)
tuteja1986 - Tuesday, April 17, 2007 - link
Total disappointment :( ... Can't even beat an X1950 Pro. They really need to sell at $150; otherwise you would be better off buying an X1950GT or 7900GS for $150 to $160. At the current $200 to $230 price you could get an X1950XT 256MB which would destroy it, but that GPU needs a good power supply. The only things going for the 8600 GTS are the DX10 support and full H.264, VC-1 and MPEG-4 support, but that can be found on other cards too.
Staples - Tuesday, April 17, 2007 - link
I have been waiting several months for these cards and boy am I disappointed. I figured this month I would get a new PC, since the prices of C2D also drop April 22. My idea was to get a C2D 6600 and an 8600GTS, but after their lackluster performance, my only option is an 8800GTS which is $50+ more. Not a huge difference, but I am very compelled to wait until the refresh comes out and then maybe I can get a better deal. I really hate this scenario where ATI is down, AMD is down, and no competition is leading to high prices and crappy performance.
Hopefully in another 6 months, AMD will be up to par on both their CPUs and GPUs. I will be holding on to my current system until then. I found it disappointing that these cards do not come with 512MB of memory, but their performance is actually even more disappointing.
JarredWalton - Tuesday, April 17, 2007 - link
Well, the R6xx stuff from AMD should be out soon, so that's going to be the real determining factor. Hopefully the drivers do well (in Vista as well as XP), and as the conclusion states NVIDIA has certainly left the door open for AMD to take the lead. Preliminary reports surfacing around the 'net show that R600 looks very promising on the high end, and features and specs on the midrange parts look promising as well. GDDR4 could offer more bandwidth making the 128-bit bus feasible on the upper midrange parts as well. Should be interesting, so let's see if AMD can capitalize on NVIDIA's current shortcomings....
PrinceGaz - Tuesday, April 17, 2007 - link
It might be a good idea to read more reviews before writing off the 8600 range.
Over at HardOCP, they found the 8600GTS easily beats the X1950Pro, and even though the models they tested were factory overclocked, one of them had only a 5% core overclock and no memory overclock and was still well ahead of the X1950Pro. At worst the two cards were roughly even, but in many tests the 8600GTS (with just a 5% core o/c) was considerably faster. As they say in the linked article:
quote: New GeForce 8600 GTS Shines
If you have been waiting for a DX10 video card capable of playing today's games, your wait is over. The GeForce 8600 GTS series GPU destroys ATI's current X1950 series for right around $220.
So who do you believe? I guess I'll need to read several more reviews to see what's really going on.
Spanki - Tuesday, April 17, 2007 - link
I agree that that review paints a different picture, but I have no reason to disbelieve it. I mean, they do their testing differently (they try to find the best playable settings for each card), but it is what it is... I mean, the tables show the differences between the cards at those (varying) settings, and so they draw the conclusions they draw based on that (with card X, I can enable option Y at these frame rates, but not on card Z - at least at the same resolutions). It's not like they tried to hide the settings they used or the frame rates they got with each setup. I found it an interesting perspective. ~shrug~
Anyway, my personal opinion is that they neutered this chipset too much. There looks to be a substantial gap between 7900/8600 and 8800 level performance, and the sweet spot for this price point would have been right in the middle of that gap... maybe they're planning an 8700 series chipset?
Griswold - Tuesday, April 17, 2007 - link
That's a funny review. I'll stick to the other 90% that say this fish smells.
GoatMonkey - Tuesday, April 17, 2007 - link
yacoub - Tuesday, April 17, 2007 - link
HardOCP's review disagrees with almost everyone else's results and also reads like a marketing advertisement for the product. I wouldn't give their review the time of day.
PrinceGaz - Wednesday, April 18, 2007 - link
They're normally very good, which is why, after reading HardOCP's review immediately after AT's (which I read first), I thought I should mention that not everyone found the 8600GTS to be slower than the X1950Pro.
However, after reading several more reviews on other usually reliable websites, the consensus seems to be that the 8600GTS is well behind the X1950Pro, which does make HardOCP's finding seem very odd.
I get the feeling that we're going to have to wait until nVidia gets the drivers for this card sorted out, as I suspect they are not all they could be. Hopefully that will be done by the time the HD2xxx series is launched; then the 8600/8500 cards can be retested and compared with their true competition.
erwos - Tuesday, April 17, 2007 - link
I'm wondering if I can fix the disappearing text problem.
PrinceGaz - Tuesday, April 17, 2007 - link
Please remove or edit my above post to remove the (H) bit which caused a problem, I'd do it myself but we have no edit facility.JarredWalton - Tuesday, April 17, 2007 - link
That should hopefully fix it - you just need to turn off highlighting using {/h} (with brackets instead of braces).
defter - Tuesday, April 17, 2007 - link
You need to take into account that the 7900GS will soon be discontinued, and the X1900 series will face the same fate as soon as ATI releases RV630 cards.
Cards based on previous high-end products like the 7900 and X1900 are great for consumers, but bad for ATI/NVidia since they have large die sizes and a 256-bit memory bus (= high board manufacturing costs).
hubajube - Tuesday, April 17, 2007 - link
I wouldn't replace my 7800GT with these, but it would be fantastic for an HTPC.
PICBoy - Tuesday, April 17, 2007 - link
I think a lot of people are waiting really badly to see some DX10 benchmarks, because that's what makes G80 and G84 special.
If the 8600 GTS can't run Crysis at AT LEAST 45 FPS at 1280x1024 with full details and a moderate 4xAA then it's not worth it, in my own humble opinion.
Same for the 8800 GTS 320MB, if it can't run Crysis at 60 FPS with 1280x1024 with full details and full 16xCSAA then it sucks...
BTW 8800 GTS 320MB gets near double the performance at 50% higher price and when 4xAA is enabled a little over double. Think about that everyone ;-)
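For what it's worth, here is a quick back-of-the-envelope check of that last claim - a minimal sketch only, where the ~$220 and ~$330 prices and the "near double" / "a little over double" performance ratios are assumptions taken from the numbers quoted in this thread, not measured figures:
```python
# Rough performance-per-dollar comparison of the 8600 GTS vs. the 8800 GTS 320MB,
# using the figures claimed in the comment above (assumed, not measured).

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance divided by price in US dollars."""
    return relative_perf / price_usd

baseline = perf_per_dollar(1.0, 220.0)      # 8600 GTS: performance normalized to 1.0, ~$220
gts320_noaa = perf_per_dollar(1.9, 330.0)   # 8800 GTS 320MB: "near double" perf, ~50% higher price
gts320_4xaa = perf_per_dollar(2.1, 330.0)   # "a little over double" with 4xAA enabled

print(f"8600 GTS           : {baseline:.4f} perf/$")
print(f"8800 GTS 320 (noAA): {gts320_noaa:.4f} perf/$ ({gts320_noaa / baseline - 1:+.0%} vs 8600 GTS)")
print(f"8800 GTS 320 (4xAA): {gts320_4xaa:.4f} perf/$ ({gts320_4xaa / baseline - 1:+.0%} vs 8600 GTS)")
```
On those assumed numbers, the 8800 GTS 320MB works out to roughly 25-40% more performance per dollar, which is presumably the point being made.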
Staples - Tuesday, April 17, 2007 - link
My reaction to that: do you play PC games? Very few games can be run at 60fps with full detail even with top-of-the-line hardware. I expect the 8600GTS to get about 20fps in Crysis.
PICBoy - Tuesday, April 17, 2007 - link
The only games that I don't see getting that amount of fps at 1280x1024 with current mainstream hardware (7900GS) are Black & White 2, Oblivion and of course Rainbow Six Vegas. The rest of the games get 60 or more, except for Splinter Cell which gets 52, but that's almost 60 to me. Only 3 games, gentlemen, and I'm taking this info from Anandtech. If $200 can't get me decent performance at good quality in DX10 then I don't think it's worth it, and the XFX 7900GS XXX would rock!
DerekWilson - Wednesday, April 18, 2007 - link
The issue is still one of the direction the industry is going. Games are going to get more graphically intense in the future, and different techniques will scale better on different hardware.
Rainbow Six: Vegas is very important, as it is an Unreal Engine 3 game -- and Epic usually does very well with licensing their engine ... It's possible many games could be based on this same code in the future, though we can't say for certain.
It's not only a question of DX10, but of future DX9 games as well -- how will they be implemented, and whether more shader-intensive DX9 code lends itself better to the G8x architecture or not.
gramboh - Tuesday, April 17, 2007 - link
Are you joking? The 8800GTS 320 in Crysis with max details and 16x AA at 60+ FPS?
I'm not expecting more than 40fps on my system at 1920x1200, less-than-max details, no AA/AF (E6600 3.4GHz, 2GB RAM, 8800GTS 640MB at 600/1900)