65 Comments
Hrel - Wednesday, November 12, 2008 - link
If it's just a matter of enabling a TPC, can't you just buy a GTX 260 and enable the two disabled TPCs yourself to make it a GTX 280?

gtotheb1 - Saturday, October 18, 2008 - link
If this Core 216 replaces the old one, what happens if I want to run SLI with the old GTX 260 later on? Would I just be screwed, or is this product simply going to supplement the old GTX 260?

Shadowmaster625 - Wednesday, September 24, 2008 - link
Good grief, why didn't they just name the product based on the number of stream processors: GTX216, GTX192, etc.? No wonder they are losing so much money... Can you imagine how much money they spend coming up with these model numbers? That's who they should be firing...

DigitalFreak - Saturday, September 20, 2008 - link
http://www.fudzilla.com/index.php?option=com_conte...

MadBoris - Thursday, September 18, 2008 - link
I think this card was a knee-jerk reaction that NVIDIA should have passed on. It was a waste of resources; they had plenty of time to blow AMD out of the water since the 8800 but fell asleep at the wheel. This move seems like a petty attempt to put their card on equal footing at best. How about HDMI, DisplayPort, DDR4, something significant, etc.?

I like NVIDIA, but they have to step up, not waste resources or shelf space on something that can only even the playing field, and not even at a better price point.

I think this product was a poor decision and a waste of resources when all the focus could have been put into proper future products.

This review was decent enough; I don't think much more effort needed to be spent reviewing a product that didn't bring much to the table.

Maybe someone will find a way to hack the card to enable that last TPC, perhaps via a BIOS or hardware hack.
yyrkoon - Thursday, September 18, 2008 - link
"I think this card was a knee jerk reaction that NVIDIA should have passed on. It was a waste of resources, they had plenty of time to blow AMD out of the water since the 8800 but fell asleep at the wheel. This move seems like a petty attempt to put their card on equal footing at best. How about HDMI, display port, DDR4, something significant, etc. "You're kidding right ? So what you're saying is that nVidia should just toss out their defective 280 GPUs to "not waste resources" ? Of course I am guessing that this is what is actually going on, but it makes perfect sense to me that NV try and recover what they can cost wise.
"think this card was a knee jerk reaction that NVIDIA should have passed on. It was a waste of resources, they had plenty of time to blow AMD out of the water since the 8800 but fell asleep at the wheel. This move seems like a petty attempt to put their card on equal footing at best. How about HDMI, display port, DDR4, something significant, etc. "
Like I said above: if my assumption is correct nVidia is doing the right thing by them and their stock holders.
a1yet - Wednesday, September 17, 2008 - link
Why are there no longer video reviews of these cards? Like a subjective review of how they do playing video, CPU usage, and such?

I miss seeing this info!

Please start including it again, as it is a factor in my purchases.

Peace
geok1ng - Wednesday, September 17, 2008 - link
The 260 Core 216 (OMG, what a long and senseless name!) competes with the 1GB version of the 4870. I am really inclined to believe that ATI is paying sites to review the 1GB version of the 4870 at 1920x1200 or lower resolutions to avoid losing 4870X2 sales.

Congratulations to AT on another great card review, but it is a shame that no 1GB 4870 numbers at 2560x1600 were in the roundup.
strikeback03 - Thursday, September 18, 2008 - link
Bottom of page 1 of the comments: Anand says they have not received 1GB test parts yet.

7Enigma - Wednesday, September 17, 2008 - link
Guys, I know you want to stay short and sweet with the summaries before each game, but please try to actually summarize the data accurately. Enemy Territory: Quake Wars is a glaring example. For the summary you say:

"With our ET:QW bench, at 2560x1600, the Core 216 adds enough horsepower to pull the NVIDIA card up to a tie with the 4870. Yes, there is a tenth of a frame difference here, but that's well within margin of error."

But at both lower resolutions of 1680x1050 and 1920x1200, the 216 falls behind the 4870 by 13% and 9% respectively. Yes, once you get to the ungodly resolution of 2560x1600 (which, I may ask, why? These are not high-end parts?!?), they pull even, but up until that point it's a pretty sound beating for the 216.

I would ask that you change the summary to something to the effect of:

"With our ET:QW bench, the 216 falls behind the 4870 by double digits at lower resolutions, but at 2560x1600 the Core 216 adds enough horsepower to pull the NVIDIA card up to a tie with the 4870."
That is a fully accurate summary that allows someone to get the gist of the data without having to trace all the broken-line graphs.
One other little favor to ask. Could you please, PLEASE, keep the colors consistent between the line and bar graphs? On several occasions I was following the wrong line because the green bar is a different card than the green line. I'm sure it would take only a couple minutes, but it would really help the reader quickly go from chart to chart to see how a particular card fares at the reader's native resolution.
Thanks again!
AnnonymousCoward - Wednesday, September 17, 2008 - link
> 2560x1600 (which, I may ask, why? These are not high-end parts?!?)

Uhh, yeah they are... the 216 is only the second-fastest card in the world. Plus, that ultra-high resolution can help expose things like memory size and memory bandwidth, with less CPU influence.
helldrell666 - Friday, September 19, 2008 - link
The new GTX 260 Core 216 is obviously slower than the 4870. It loses to the 4870 in 4 out of 6 games at 2560 res. and 5 out of 6 games at 1920 res.

At 2560 res. they should've used the 1GB version of the 4870.
AnnonymousCoward - Saturday, September 20, 2008 - link
Looking at all the bar graphs in this review, the 216 beats the 4870 6 out of 10 times.

strikeback03 - Wednesday, September 17, 2008 - link
Because the difference between 96.8 and 111.2 is rather irrelevant? Both cards can rock those resolutions, so the difference might lie elsewhere in the test systems. Since they aren't reporting minimum frame rates, both of the above are well into the playable range.
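To put a number on the size of that gap, here is a minimal Python sketch of the relative-difference arithmetic. The 96.8 and 111.2 FPS figures are the ones quoted in this thread; treating them as the Core 216 and 4870 results at one of the lower resolutions is only an assumption for illustration.

```python
# Minimal sketch: how far one card's frame rate trails another's, in percent.
# The 96.8 / 111.2 figures are quoted in the comment above; mapping them to the
# Core 216 and 4870 at a specific resolution is an assumption for illustration.

def relative_gap(slower_fps: float, faster_fps: float) -> float:
    """Percentage by which slower_fps trails faster_fps."""
    return (faster_fps - slower_fps) / faster_fps * 100.0

core216_fps = 96.8
hd4870_fps = 111.2

print(f"Gap: {relative_gap(core216_fps, hd4870_fps):.1f}%")  # ~12.9%, i.e. a double-digit gap
```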
7Enigma - Friday, September 19, 2008 - link

Not to me it isn't. Those numbers today could be halved tomorrow with the latest game. I'd love to see the percentage of people purchasing these cards with a monitor capable of the highest resolution benchmarked; I think 1% is a safe bet. That 1% benefits from the game summary, while the remaining 99% of potential buyers with lower-res monitors who don't trace the broken-line graph and instead just read the game summary are given bad/incomplete information.

Trust me, I'm not saying Anand is conspiring to put the NVIDIA card in a better light, just that the summary as it stands is very misleading.
Jedi2155 - Wednesday, September 17, 2008 - link
Or maybe it is the lack of memory in the frame buffer of the 4870 that prevents it from scaling its performance all the way to 2560.

I believe it is a very important fact that the 4870 is faster at the lower resolutions than the Core 216; this omission shows a lack of attention to detail in the summary :-/
Stupido - Wednesday, September 17, 2008 - link
A few days ago I got my new 24'' monitor... So I'm curious and would like to know your opinion:

Currently I own an Asus 8800GTS 512 but want to move to a Sapphire HD4870 1GB... Is it worth doing so?

P.S.

I have a Vista machine used mainly (90% of the time) for gaming (TF2, COD4 & Crysis, and I'm planning to buy Far Cry 2 and Crysis Warhead). It is a Q6600 @ 2.4GHz (planning to OC) on a Gigabyte P35-DS3R with 4GB DDR2-800.
AnnonymousCoward - Wednesday, September 17, 2008 - link
GT200 gives very poor performance! It has double the transistors of the previous generation for marginal gains. The GX2, with the same total transistor count as GT200, blows it away.

Since the GeForce 256, every new series has basically doubled performance, but this trend stopped from the 8 series to the 9 series, and again from 9 to GT200. The 8800 GTX is nearly two years old and is still in the same league.
CollectorZ - Wednesday, September 17, 2008 - link
Perhaps if Nvidia spent a little less money on marketing defective 280s and got on with the 55nm parts...

Post-exams, October 25 would be a nice time to replace my 8800GT...
araczynski - Wednesday, September 17, 2008 - link
It'll be quite a long while before nVidia competes again in the bang/$ game. I also got in on the two-4850s-for-under-$300 deals when they first came out...

nVidia needs to go back to the drawing board on their current chipset offerings.
Jedi2155 - Tuesday, September 16, 2008 - link
Why is it that in almost every single Crysis benchmark I've seen here, it's always run without AA?

I think that is an important setting to measure, as it puts further strain on memory bandwidth and shows potential weaknesses of an architecture. Crysis, as I've seen on some other sites, seems to show the limits of the 512 MB frame buffer on the 4870 versus the GTX 260's, and I would like it if AnandTech could confirm this :)
Casper42 - Tuesday, September 16, 2008 - link
I had heard that the 55nm versions of the 260 and 280 were going to be called the 270 and 290, which would explain why they decided not to use 270.

Overall, though, I agree that they need to come up with a better naming convention. Either do the same Generation/Family/Variant thing as AMD, or else go back to the old naming convention and make the number tell you how many SP cores, the type of memory used, and so on, and then put GS/GT/GTX on the end to signify the clock-speed level of the card within the family.

Here, nVidia, I will do it for you.
The current cards are now named:
??? = whatever the hell you like. nVCore or GeForce or CoreForce or whatever marketing name you decide to spend way too much money thinking up.
??? 208 GTX
??? 209 GTX
??? 210 GTX
Your 55nm replacements will be the:
??? 308/309/310 GTX (assuming they use the same memory config; if you switch to a narrower memory bus, it had better be GDDR5 then)
2 is your family code for the current gen, and 3 will be for 55nm.
08-10 = number of SP cores
GTX = high-end card
So a mid-range card might be something like the ??? 206 GT.
And a cheap card could be the ??? 203 GS.
Don't even have to change the internal design of the SP cores.
And if you Tick/Tock like Intel, then the next generation after this one can...
Casper42 - Tuesday, September 16, 2008 - link
The next generation after 55nm can be the 4 series, and then the 45nm variant of that can be the 5 series, and you now have a naming convention for the next four years at least.
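To make the proposed convention concrete, here is a minimal Python sketch of how such names could be assembled. The function and the example tiers are hypothetical illustrations of the scheme suggested above, not anything NVIDIA actually uses.

```python
# Hypothetical sketch of the naming scheme proposed above:
# <family digit for the process generation> + <two digits for SP clusters> + <tier suffix>.
# This is an illustration only; it does not reflect NVIDIA's real naming rules.

def model_name(family: int, sp_clusters: int, tier: str) -> str:
    """Build a name like '210 GTX' from a family code, cluster count, and tier suffix."""
    return f"{family}{sp_clusters:02d} {tier}"

print(model_name(2, 10, "GTX"))  # current-gen full part      -> 210 GTX
print(model_name(2, 9, "GTX"))   # current-gen Core 216 part  -> 209 GTX
print(model_name(3, 6, "GT"))    # 55nm mid-range example     -> 306 GT
```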
pauldovi - Tuesday, September 16, 2008 - link

Why would you have one FPS at 60.9 and the other at 61? It should either be 61 and 61, or 60.9 and 61.0. If 0.1 is within the margin of error, you should not report FPS to this accuracy.

Learn a little about significant digits!
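For what it's worth, forcing consistent precision is a one-line formatting change; a minimal sketch, using the two values quoted above:

```python
# Report every result to the same number of decimals so 60.9 and 61 don't look inconsistent.
results = {"Card A": 60.9, "Card B": 61.0}  # the two FPS values from the comment above

for card, fps in results.items():
    print(f"{card}: {fps:.1f} fps")  # always one decimal place: 60.9 and 61.0
```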
AnnonymousCoward - Wednesday, September 17, 2008 - link
"!?!?"? Is it that big of a deal? I actually prefer dropping the ".0" for the sake of simplicity, and I'm an engineer.strikeback03 - Wednesday, September 17, 2008 - link
Could be Excel automatically throwing away trailing zeros.

drebo - Tuesday, September 16, 2008 - link
The main thing keeping me from upgrading my 7900GT right now is how damn many video cards there are available. I can't tell which is better: the 9600GT, 9600GSO, 9800GT, 8800GT, etc., etc., as they're all within about a $40 spread and all seem to be the same damn card.

It's frustrating. I wish nVidia would stop doing this. Choose a set of price points and release four to six cards for those price points: high end ($300+), mid range ($200-$300), low end ($100-$200), multimedia ($50-$100), extreme low end ($30). We don't need a card spaced every $5 through the spectrum; $50 price differences should more than suffice.
aeternitas - Wednesday, September 17, 2008 - link
It's not really that bad. All you need to do is a little research about each card and you can put them in an easy list. But yes, they do need to work on naming...

Also, they come out with a handful of cards every 9-18 months; you can't blame them for places still selling cards from the last two generations and for you getting confused! Today's high end will be tomorrow's midrange, and guess what? The prices will be similar. It's your job to investigate your investment.
The power... IS YOURS!
MrSpadge - Tuesday, September 16, 2008 - link
Just get a 4850 :)

(... seriously saying this while having bought a 9800GTX+ a few weeks ago, and I really like it)
MrS
helldrell666 - Tuesday, September 16, 2008 - link
At 1920x1200 the 4870 beats the new GTX 260 in all the tested games. The 4870 has only 512MB of RAM, so it's not fair to test at 2560x1600.

MrSpadge - Tuesday, September 16, 2008 - link
And if AMD had put only 128 MB on that card, people should only test it at 1024x768, because anything else is not fair?

Come on, the lower resolutions are included in the detailed charts, and the conclusion clearly says that currently the 4870 is better.
MrS
helldrell666 - Tuesday, September 16, 2008 - link
Well, it's not fair to put the 2560 res. results on top. They could've put the 1920 res. results on top.

I think it's intentional, to show the 4870 in a worse light.

I know anandtech. They are AMD fanatics; simply, they don't like AMD.

Anyways, I'm waiting for the review of this card on techreport.com.
theoflow - Tuesday, September 16, 2008 - link
BUT CAN WE PLEASE GET A SYSTEM BUILDER GUIDE NOW???

Thank you.
=)
JarredWalton - Tuesday, September 16, 2008 - link
I'll see about addressing that as soon as I finish my current article. I've been meaning to do one for months, but you know what they say about good intentions...

Chaotic42 - Tuesday, September 16, 2008 - link
I would love to see one too, but if you're really going to do a new one, and I hope you do, please start doing them regularly. Maybe once per season or fiscal quarter? Monthly might be a bit much, but those system guides are fantastic.

theoflow - Tuesday, September 16, 2008 - link
Cool, thanks man. I'm itching to build a new rig, but I've been too busy to keep up with all the changes over the past year or so.

MrSpadge - Tuesday, September 16, 2008 - link
Hi guys,

I read somewhere that the first batches of 48x0 cards had a bug in their BIOS which prevented PowerPlay from working properly. This is supposed to have been fixed for some time now, and idle power draw should be decreased significantly.

I'd say contact AMD or a card manufacturer. If it's true, they should be more than happy to assist you in obtaining updated numbers. The current numbers are just plain horrible and may keep people from buying the Radeons.
Regards, MrS
Mr Roboto - Tuesday, September 16, 2008 - link
PowerPlay works fine on my reference VisionTek 4870 512MB.

MrSpadge - Wednesday, September 17, 2008 - link
That's nice for you, but it still looks like it's not working on AT's card.

Derek, did you hear me?
MrS
helldrell666 - Tuesday, September 16, 2008 - link
This is an overclocked new GTX 260, because the stock one has the same core clock and shader frequency as the original GTX 260.

You should have included a 4870 TOP or an XOC 4870 in this test.
strikeback03 - Wednesday, September 17, 2008 - link
If you had actually read the article, you would see in multiple places that they ran it at both stock clocks and overclocked (as received) and showed both results.

After all the complaining the AMD fanbois did when they showed a 9600GSO in the 4670 article, why would they bring in a new overclocked AMD card and hear the same thing from the NVIDIA fanbois?
toyota - Tuesday, September 16, 2008 - link
What are you talking about? Those are the same clocks as the standard GTX 260.

Staples - Tuesday, September 16, 2008 - link
The ridiculous amount of idle power draw has been going on too long. I have a 4850, and my system consumes 30W more than it did before with a 7950GT. Most people do not pay attention to this number, but I sure do. I am glad to see that NVIDIA has done more than just bump up the chips inside this one; there is significantly less power draw when it is idle.

And here's hoping that ATI can actually come out with some better drivers this month. The 8.8s cause all kinds of trouble with a 780G chipset (in Vista 32) and a 4850 (in XP). Amazing, but I have to run 8.7 on both computers because the 8.8 drivers are really problematic.
Vidmo - Wednesday, September 17, 2008 - link
Power savings??? Where? 160-200 watts for sitting there? Give me a break. These GPUs are a massive waste of power. ATI/nVidia should be ashamed of themselves.

MrSpadge - Wednesday, September 17, 2008 - link
Take a calm look at the power consumption of actual cards, e.g. here:

http://www.xbitlabs.com/articles/video/display/zot...
You'll see that the 4870 draws 65 W at idle (it seems PowerPlay doesn't work there either). Assuming 80% power supply efficiency, that means a draw of 81 W at the wall. Therefore AT's system draws 122.5 W without the GPU, and the NV cards consume about 36 W from the wall and 29 W for the cards themselves. That's way better than previous-generation NV cards, which consumed 40-60 W at idle. That's what the previous poster meant by "power savings".

(It seems like PowerMizer is not working for Xbit's NV cards, whereas for AT's it works.)
MrS
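For anyone who wants to retrace that arithmetic, here is a minimal Python sketch. The 65 W, 80%, 122.5 W, and 36 W figures are the ones quoted above; nothing here is a new measurement, and the 80% PSU efficiency is the same assumption made in the comment.

```python
# Retrace the idle-power arithmetic from the comment above.
# Every input figure is taken from that comment; 80% PSU efficiency is the same assumption.

PSU_EFFICIENCY = 0.80

def card_to_wall(card_watts: float) -> float:
    """Power drawn by the card, expressed as draw at the wall."""
    return card_watts / PSU_EFFICIENCY

def wall_to_card(extra_wall_watts: float) -> float:
    """Extra wall draw attributed to the card, converted back to card power."""
    return extra_wall_watts * PSU_EFFICIENCY

hd4870_idle_card = 65.0                            # W, Xbit Labs idle figure quoted above
hd4870_idle_wall = card_to_wall(hd4870_idle_card)  # ~81 W at the wall

nv_extra_wall = 36.0                               # W the NV cards add at the wall, per the comment
nv_idle_card = wall_to_card(nv_extra_wall)         # ~29 W consumed by the cards themselves

print(f"4870 idle draw at the wall: {hd4870_idle_wall:.1f} W")
print(f"NV idle draw on the card:   {nv_idle_card:.1f} W")
```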
bespoke - Tuesday, September 16, 2008 - link
I know you were reviewing the chip and not really the EVGA product, but looking at the product images on Newegg, I see all the GTX 260s look exactly the same, so the fan noise of this card should be fairly representative of other GTX 260 Core 216s. (Wow, that name is a mouthful.)

piroroadkill - Tuesday, September 16, 2008 - link
Why didn't they just call it the GeForce GTX 270?

cabul - Thursday, May 21, 2009 - link
I read somewhere else recently that it is because the 260 and the 260 Core 216 can still be SLIed together. If they called it anything other than a 260, they felt that consumers would be confused.

It makes perfect sense to me now.
gaiden2k5 - Tuesday, September 16, 2008 - link
Or GTX 265, since it's a variant of the 260 card.

Mr Roboto - Tuesday, September 16, 2008 - link
No, no, that's not confusing enough. The GTX260 Core 216 fits perfectly with the 8600GT/GTS, 8800GTS 640/512/320, 9600GT/GSO and the 9800GTX+. Can't wait to hear the naming scheme for the 55nm GTX280s.

Boushh - Wednesday, September 17, 2008 - link
How about the GTX260 55nm Core 216 :)

yyrkoon - Thursday, September 18, 2008 - link
How about... "a failed GTX 280" core ;)

SiliconDoc - Sunday, October 5, 2008 - link
Gosh, sorry, I have to vent here. lol -

I know you were being a wise guy - so take it with a grain of salt.

You don't really believe that excuse, do you? I mean, the corpo planners have you eating their goat cheese like a baby.

"Oh lookie here, the exact parts we need failed, failed again. Wow, just a shader region and one mem spew bank, man, we get lucky a lot! OK, just cut that line on the core cap, and stamp it a 260!"

(I mean, really...)

Yes, of course they can grade-bin something like an E8400 / E3110 for voltage, hence decency, but I've never believed they just whack out a 280 to make it a 260, or the other endless derivations of the basic deal...

They control multipliers IN MANUFACTURE, fine. If they stamp on a different name, fine.

But having this "lucky failed chip" story taken to every extreme - I simply don't believe it.

280/260 "same core" - yeah, fine, they PLAN on the 260 core reductions, then produce them as such.

Same as with all their other crap...

We've found MANY times before that their locked chips, when unlocked, did just as well as the higher versions, sometimes even better.

Anyway.
crimson117 - Tuesday, September 16, 2008 - link
Or GTX 280 GSO.

silversound - Tuesday, September 16, 2008 - link
AnandTech just keeps getting better; Tom's Hardware just plain sucks & is biased now.

Any reviews on the new 4850X2?
Anand Lal Shimpi - Tuesday, September 16, 2008 - link
We're still waiting on review samples of the 4850X2 as well as the new 1GB RV770 cards. As soon as we get some in for review we'll get on them :)

-A
mmntech - Tuesday, September 16, 2008 - link
When has Tom's ever not been biased? They used to be very pro-Intel, pro-ATI if I remember correctly.

I'm impressed with nVidia's numbers, since I had figured they had abandoned the mid-range market again with the release of the GTX series, which was double the price of what ATI was offering. Good frame rates, fair price, and lower power consumption than the HD 4870. Not a bad buy. I wish I had stuck with nVidia rather than getting an HD3850 earlier this year. While it's a good card, it has driver trouble with older games like KotOR (low frame rates, missing effects), and ATI's Linux compatibility sucks.
Gannon - Wednesday, September 17, 2008 - link
Tom's was OK back in the day, and sometimes for the CPU/VGA charts, but for most everything else they were just not very good. They are a dumbed-down version of AnandTech for the masses.

JarredWalton - Tuesday, September 16, 2008 - link
I only wish I could run SLI on an X38 chipset (without the silly nForce 100/200 bridge). Until that happens, I'll use ATI hardware for this generation. Hopefully with the Nehalem stuff, I can pick up an X58 board with support for both GPU platforms. Though I have to say, my dual 3870 cards are starting to look awfully sad. :(

aguilpa1 - Tuesday, September 16, 2008 - link
I'm still holding on to my dual 8800GTXs. I am still surprised how competitive they are against all but the uber-high-end 280 SLI or 4-way CrossFire setups. At my native resolution of 1920x1080, they still tear everything up.

PSMR - Tuesday, September 16, 2008 - link
What about the system specs or drivers that were used?

Clauzii - Tuesday, September 16, 2008 - link
It's all on page 2 :)

PSMR - Wednesday, September 17, 2008 - link
Ha, that was added after my comment. Wow, good reporting, AnandTech.

BPB - Tuesday, September 16, 2008 - link
I would have liked to have seen 4850s in Crossfire, since I paid well under $300 for my two on sale, and $300 seems to be the price point this article goes after.