yacoub - Friday, March 10, 2006 - link
Any idea when we'll see a comparo showing the 7900GT against the following cards?
X800XL
X1800XL
X1800XT
7800GT
It is important for people running cards like those right now to know how much gain they will see going with a 7900GT versus going with a 7900GTX. Clearly they can see the difference between the 7900GT and 7900GTX on this review, but no one knows what improvement the 7900GT would have today (with today's drivers and games) over the cards many people are still using such as the X800XL or 7800GT.
It's important to know if the 7900GT offers enough gain for such users over their current cards, or whether they should step all the way up to the 7900GTX.
Thanks.
spinportal - Friday, March 10, 2006 - link
I definitely like the review and the presentation, Derek. There are definite tradeoffs for price, power load and performance.
With all this talk of HDCP, DX10 (with Vista) and HD (1080p) PureVideo vs. AVIVO (shame on you nVidia for asking more from the consumer when ATI bundles such goodies) just around the corner, I'm still on the wait-and-see list before making the plunge to PCIe (besides AMD's M2 chipset, and Intel's DuoCore refresh to spank the PentiumD).
What I don't get is how drastic ATI's ability to do AA with HDR (how many games truly support this? FarCry? but Splinter Cell can't? Half-Life2 engine?) shines above nVidia's lack. Is this the only feature ATI has an exclusive win over nVidia?
Also, there was a preface of the 7900GT being marginally faster than the 7800 GTX-256, with a nice price advantage going to the 7900GT (as well as lower power load), killing off the 7800 line for newcomers. So where is ATI's high-mid or low-high $300 competing part? Along with the 1900XTX being a gratuitous, weak "ultra" offering, since Crossfire only paces the 1900XT, "wasting" the XTX's 5% extra power, except to prove the King-of-the-Hill mentality. More power to ATI's customers paying a cost premium.
The 7900GT SLI might be penny-wise & pound-foolish, as two of these cards cost substantially more than a single 7900GTX (350x2 vs. 550 = ~$150 / 20% more), draw more power (hence the hidden cost of a beefier UPS upgrade), for roughly ~15% gain (YMMV with oc'ing, or manufacturer tweaks).
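(A quick back-of-the-envelope check of that premium, treating the ballpark prices and the ~15% gain above as rough assumptions rather than exact figures:)

    # Rough SLI-vs-single-GTX cost comparison; prices are the approximate
    # figures quoted above ($350 per 7900 GT, $550 per 7900 GTX), not exact.
    gt_price, gtx_price = 350, 550
    sli_cost = 2 * gt_price                     # ~$700 for two 7900 GTs
    extra_cost = sli_cost - gtx_price           # ~$150 more than one GTX
    premium_pct = 100 * extra_cost / sli_cost   # ~21% of the SLI outlay
    claimed_gain_pct = 15                       # ~15% gain, as claimed above
    print(f"SLI: +${extra_cost} (+{premium_pct:.0f}%) for ~{claimed_gain_pct}% more performance")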
And sure, the ATI X1800GTO squeaks a victory from the nV 7600GT, with est. end of March '06 MSRP of 249 vs. ~180-200. For the non-graphic fanatic, cost-conscious WoW player, the 7600GT is a nice target for newer PCIe. For the gung-ho FPS shooter, for a bit more, why not aim for the 7900GT instead of the X1800GTO?
For curiosity's sake, Derek, could you downclock a 7900GTX to GT core/mem clock speeds and see how much of a difference the extra 256MB makes? If the GT has good OC headroom, it can be a better bargain. With the same basic core, how much can the clock be pushed on the GT? As for memory, how much is 256MB more of GDDR3 RAM worth (over the same bus), and how far can the GT's bandwidth get pushed to help the benchmarks? Is nVidia able to push a GT to a GTX due to better active cooling? This might be what tweakers look for to justify their purchase. We see that the 7800GTX-512 had definitive victories of about ~20% over a 7800GTX-256/7900GT, which does bring nearly-playable frame rates up to the 60 fps point. Maybe an enterprising third party might offer a $400 7900GT-512 with a slightly higher mem clock; there is room for opportunistic pricing there.
yacoub - Friday, March 10, 2006 - link
I like what XFX and eVGA are offering: http://www.firingsquad.com/hardware/nvidia_geforce...
yacoub - Friday, March 10, 2006 - link
Good article, good conclusion.
7900GTX @ $475 is perfect competition for the X1900XT/XTX.
7900GT @ $300 is a great price for 7800GTX performance.
A 7900GT with a quieter, better cooling solution and accompanying overclock for around ~$350 will be my next purchase as soon as Asus or Gigabyte or someone releases such a card at such a price.
Leper Messiah - Friday, March 10, 2006 - link
Too bad the only 7900GTX I've seen is $559.00. Man, I thought I read somewhere in Video that these things were supposed to be uber cheap. Guess that was just rumor. :(
yacoub - Friday, March 10, 2006 - link
Really? I saw ones as low as $499 in the RealTime Pricing results.
KHysiek - Friday, March 10, 2006 - link
I know it can be used to beat the world record for fastest graphics card, but who are these tests targeted at? I think they are useless for 99.99999% of readers. Athlon FXs and SLI all over the place, and almost no non-SLI setups of currently available cards. How is a typical user, whose card is 6-18 months old, supposed to evaluate the speed of new cards and the value of upgrading? I think that's the main task of such tests - convincing people to upgrade. Who has a system like your testbed, and who uses SLI in real life? 0.00001% of these readers. Maybe even less.
yacoub - Friday, March 10, 2006 - link
some of us have been making this request for months now but it routinely falls on deaf ears. it appears most anandtech readers would prefer to read what is essentially technical advertising for GPU performance as tested in ubersystems 99% of us will never own.
Egglick - Friday, March 10, 2006 - link
Trying to compare cards with all those SLI and Crossfire scores everywhere can get really irritating.
bigboxes - Thursday, March 9, 2006 - link
Can we please have separate charts for SLI/Crossfire setups and single cards that normal people will end up using? That way we can easily compare apples to apples. I'm sure that the 1% of you that use an SLI/Crossfire setup will like the articles, but the rest of us normal people will appreciate a direct comparison between the various single cards.
Regs - Thursday, March 9, 2006 - link
I can agree with that. SLI is clearly still a segregated part of the market.
It's welcome...but...segregated.
Ozenmacher - Thursday, March 9, 2006 - link
Yes, good point. Or at least use a different color or formatted bar so it is easy to distinguish one from another.
smitty3268 - Thursday, March 9, 2006 - link
Yes, that is by far the biggest complaint I have about recent AT GPU reviews. Please, please put SLI/CF tests in a separate graph from single card tests.
Zoomer - Monday, March 13, 2006 - link
Ditto. SLI is really a retarded marketing move and I hate nVidia for starting this.
Look at all the recent mobo designs. Due to some weird fascination with SLI, almost all of them have weird slot designs with few PCI slots.
PrinceGaz - Friday, March 10, 2006 - link
I agree too, SLI/Crossfire results should be in a separate graph.
JNo - Monday, March 13, 2006 - link
Me too
Ozenmacher - Thursday, March 9, 2006 - link
Oh, is there any way you can allow post editing so I can change my bone-headed spelling errors ^^
frostyrox - Thursday, March 9, 2006 - link
Derek, all these new cards say on the box "Built for Windows Vista", but I don't see anywhere that they support HDCP (High-bandwidth Digital Content Protection). If I'm not mistaken, HD-DVD and Blu-ray are going to be upon us soon, and they will both require HDCP-compliant components, specifically your monitor and video card. Seeing as these cards just came out and they're boasting of Windows Vista compliance, could you possibly elaborate on this for me? I seem to be missing something, thanks.
Zoomer - Monday, March 13, 2006 - link
Vista doesn't require HDCP. These cards, and many previous cards, will work just fine with Vista.
Only playing HD content with stock Windows, stock everything, etc. will require HDCP.
Phantronius - Thursday, March 9, 2006 - link
I see no reason to part with my 7800GTX OC 256meg from BFG just yet.
Regs - Thursday, March 9, 2006 - link
Or am I going to have to look for myself? I'm too lazy, Derek.
DerekWilson - Thursday, March 9, 2006 - link
well... since the 7900 GT performs the same as the 7800 GTX ... and the 7800 GT performance relative to the 7800 GTX is well documented :-)
You're right though, it might have been good to make this more clear.
Regs - Thursday, March 9, 2006 - link
So I suspect a 5-10% difference going from a 7800GT to a 7900GT.
Phantronius - Thursday, March 9, 2006 - link
Derek, why are only the first graphs highlighting the 7900 series in orange while the rest are totally blue? It makes it hard to compare them to the 7800 series.
DerekWilson - Thursday, March 9, 2006 - link
sorry, we've had some problems with our graphing engine today -- I will make sure to update the colors on the rest of the graphs so they are more readable.
my plan is to make the new single cards orange and their sli counterparts green.
I understand that it is a lot of data in one place, but I hope this helps.
Thanks,
Derek Wilson
coldpower27 - Thursday, March 9, 2006 - link
I thought sites reported that the die size of the R580 is supposed to be 313mm2, not 353mm2 as stated in the article???
APKasten - Thursday, March 9, 2006 - link
Man, I've been reading this site for like two years now and they always find a way to make me feel like a noob. Can anyone tell me what 'the IC' is? Didn't see the long form in the article.
DerekWilson - Thursday, March 9, 2006 - link
Sorry ...
Integrated Circuit.
It is the term for what is commonly referred to as a "chip".
APKasten - Thursday, March 9, 2006 - link
Oh man, that's embarrassing... for some reason I just couldn't figure that. Sometimes the brain just doesn't work right.
4AcesIII - Thursday, March 9, 2006 - link
Not an ATI fan anyways, but it does seem that these boys favor ATI in almost every review. Now, having had experience with ATI, it makes me wonder how they can sleep at night knowing what sort of reputation, and consistent reputation, ATI has for absolute crap software/drivers. Some of the TV cards they've put out aren't supported by themselves, left to 3rd party software, and PowerVCR at that. Both ATI and Nvidia have good hardware; the huge difference between them is implementation via drivers and software. Nvidia can do it, ATI can't, and they've proven it over time. Nvidia drivers are compatible with more of their older cards, until you go back to 2mb TNT cards. ATI wasn't able to do this. I don't find Anandtech impartial anymore; they don't put out anywhere near the amount or quality of articles they used to, and there are some plagiarism claims about them floating around the web. Because of all this I only keep this link for amusement; it's not considered a serious source of info anymore.
Z3RoC00L - Thursday, March 9, 2006 - link
Anandtech doesn't favor ATi over nVIDIA. Have you checked out the majority of reviews? The only site that's giving nVIDIA a decisive win is HardOCP. If you want fanboism and retardation (yes, new word I invented) please feel free to visit http://www.HardOCP.com. But if you want solid benchmarks, only a few places offer them: Beyond3D, Anandtech and FiringSquad. You can also check Techreport & Hothardware. Want a list?
- Anandtech (GeForce 7600 and 7900 series)
- Beyond 3D (GeForce 7600 series)
- Bjorn 3D (GeForce 7600 and 7900 series)
- ExtremeTech (GeForce 7600 and 7900 series)
- Firing Squad (GeForce 7900 series)
- Firing Squad (GeForce 7600 series)
- Guru 3D (GeForce 7600 and 7900 series)
- Hard OCP (GeForce 7900 series)
- Hardware Zone (ASUS GeForce 7900 GT)
- HEXUS (GeForce 7600 and 7900 series)
- Hot Hardware (GeForce 7600 and 7900 series)
- Legit Reviews (XFX GeForce 7900 GTX XXX Edition)
- NV News (eVGA GeForce 7900 GT CO)
- PC Perspective (GeForce 7600 and 7900 series)
- PenStar Systems (eVGA GeForce 7600 CO)
- The Tech Report (GeForce 7600 and 7900 series)
- Tom's Hardware Guide (GeForce 7600 and 7900 series)
- Tweak Town (BFG GeForce 7900 GTX)
- Club IC (French) (GeForce 7900 GT)
- iXBT (Russian) (GeForce 7600 and 7900 series)
- Hardware.FR (GeForce 7900 series)
- Hardware.FR (GeForce 7600 series)
All in all the x1900XTX comes out the winner in the high end segment when HIGH END features are used (AA and AF) and when heavy Shaders are used as well. But it's not a clear victory. Results go both ways and much like the x800XT PE vs. 6800 Ultra (with roles reversed) there will never be a clear winner between these two cards.
I for one prefer the X1900XTX, I like the fact that it will last a tad longer and offer me better Shader performance, better performance under HDR, Adaptive AA, High Quality AF, HDR + AA, AVIVO and the AVIVO converter tool. But that's just my opinion.
Fenixgoon - Thursday, March 9, 2006 - link
Fenixgoon - Thursday, March 9, 2006 - link
You do realize that the x1900 XT and XTX beat the 7800 series, right? That's all Nvidia has had until now. I'm glad to see the 7900 take the lead (albeit the few frames it gains generally don't matter). What concerns me is the budget market. I'd like to see both ATI and Nvidia do some more work in producing better budget cards. My x800pro is still an awesome mid-range card that can hang with many of these new series cards, minus SM3 (I bought it some months ago as a final AGP upgrade). In the end, of course, stiff competition = better price/performance for us.
Spoonbender - Thursday, March 9, 2006 - link
Been living under a rock for the last 3 years? ATI's drivers are fine these days. I still prefer NVidia's drivers, but that's a matter of preference mainly. Quality-wise, there's only the slightest difference these days.
And NVidia isn't all that compatible either. They've ditched support for everything up to (and including) Geforce 2 in their newer drivers. But really, who cares? I doubt you'd get much more performance out of a GF2 by using newer drivers.
As for the bias, I'm surprised NVidia does so well in this test. I was expecting them to take a beating performance-wise.
But geez, what you're saying is really "I don't know anything about ATI, but the fact that AT includes their cards in benchmarks means they must be evil liars..."
Spinne - Thursday, March 9, 2006 - link
Spinne - Thursday, March 9, 2006 - link
If you've never had experience with an ATI GPU, how qualified are you to judge their software? I've used cards made by both companies, and I wouldn't badmouth ATI's drivers anymore. Ever since the Catalyst series came out, their drivers have been pretty decent. The 'Driver Gap' is highly overrated and untrue to the best of my experience, at least under Windows. Under Linux, my apartment mate tells me ATI's drivers suck, but then again, he's never used them, but I'd give some weight to his opinion. In any case, there's no point in buying a high end card like this for a Linux box.
rgsaunders - Thursday, March 9, 2006 - link
First of all, let me say that Anandtech is usually the first place I visit when looking for information on new hardware; however, I find that your video card reviews seem to have fallen prey to the same pattern as other review sites. Although it's nice to know how these cards perform for gaming, the vast majority of users do more than game with their machines. It would be very beneficial to those of us looking for a new video card to see results of comparative video quality for text use and photo editing as well as the normal gaming tests. In the past, I have returned video cards because of their extremely poor text quality, even though they were good for gaming. The gaming community is a vocal minority online; however, the vast majority of users spend a lot of time using their machines for textual processing or photo editing, etc., and a small portion of their time gaming.
Please include the requested tests in upcoming video card reviews so as to provide a balanced, professional review of these products and stand out from all the other review sites that seem to concentrate primarily on gaming.
Spinne - Thursday, March 9, 2006 - link
Can you specify what cards you've had to return due to poor text quality? As far as I know, no cards have had problems with 2D in a very, very long time. In any case, you'd have to be insane and very rich to splurge money on a G71 or R580 class card for Photoshop or 2D desktop performance. It's like buying a '70 Dodge Challenger for driving to work in. I do however feel that AT needs to talk about image quality in 3D some. With all the different modes of AF and AA out there, and the cores themselves performing so well, IQ becomes a large factor in the decision making process.
rgsaunders - Thursday, March 9, 2006 - link
In the past I have had to return Asus and Abit GeForce based cards due to their dubious text/2D quality. There are differences between the various cards, ATI and nVidia, dependent upon the actual manufacturer and their filter designs. This has a noticeable effect at times on the quality of the text. I agree that IQ in 3D is important; however, I do think that text and 2D IQ are also important. A G71 or R580 class card may be overkill if all you were doing with your computer is Photoshop or MS Office; however, for some of us, the computer is a multipurpose device, used for the full gamut of applications, including occasional gaming. In the main, I usually stay a step behind the bleeding edge of video performance, as do many others. Today's bleeding edge is tomorrow's mainstream card, and unless you review everything the first time, there is no information wrt text and 2D IQ.
Zoomer - Monday, March 13, 2006 - link
These are most likely reference cards, and reference cards from nvidia have in the past proven to output a much better signal than what will be produced later on, esp. when the price cutting starts.
Zoomer - Monday, March 13, 2006 - link
One more thing.
Derek, why don't you guys take the time required to produce a nice review? Is it really necessary to get that article up and running on the day of the launch? If you got the cards late, bash the company for it. And take all the time you need to do a proper review like those AT has done in the past.
Reviews with just benchmarks and paraphrased press release info are REALLY boring and a turn-off. For example, I couldn't bear to look at the graphs as they weren't relevant. I skipped right to the end.
Whatever happened to overclocking investigations? Testing for core/mem bottlenecks by tweaking the frequency? Such information is USEFUL, as it means everyone with the same card out there does not have to repeat it for themselves. Recall AT's TNT/GF2 era articles. If my memory is correct, there were pages of such investigation, and a final recommendation was made to clock up the mem clock to the limit, and then clock up the core.
Image quality comparisons like those done for the Radeon 32 DDR, R200, etc. are almost absent.
Quality of components used? Granted, this is moot for engineering sample cards, but an investigation of the cooling solution would be good. Reliability and noise of the cooling solution should be included. Are these ultra-fine fins dust traps? Is that small, high-RPM screamer a possible candidate for early failure?
Performance is only one small part of the whole picture. Everyone and their dog publishes graphs. However, only a select few go beyond that, and even fewer are from those that have the trust of many.
Questar - Thursday, March 9, 2006 - link
According to HardOCP, the 7900 has horrible texture shimmering issues.
DigitalFreak - Thursday, March 9, 2006 - link
I saw that as well. Any comments, Derek?
DerekWilson - Thursday, March 9, 2006 - link
I did not see any texture shimmering during testing, but I will make sure to look very closely during our follow up testing.
Thanks,
Derek Wilson
jmke - Thursday, March 9, 2006 - link
just dropped by to say that you did a great job here, plenty of info, good benchmarks, nice "load/idle" tests. not many people here know how stressful benchmarking against the clock can be. keep up the good work. Looking forward to the follow-up!
Spinne - Thursday, March 9, 2006 - link
I think it's safe to say that at least for now, there is no clear winner, with a slight advantage to ATI. From the benchmarks, it seems that the 7900GTX performs on par with the X1900XT, with the X1900XTX a few fps higher (not a huge diff IMO). The future will therefore be decided by the drivers and the games. The drivers are still pretty young and I bet we'll see performance improvements in the future as both sets of drivers mature. The article says ATI has the more comprehensive graphics solution (sorta like the R420 vs. the NV40 situation in reverse?), so if code developers decide to take advantage of the greater functionality offered by ATI (most coders will probably aim for the lowest common denominator to increase sales, while a few may have 'ATI only' type features), then that may tilt the balance in ATI's favor. What's more important is the longevity of the R580 & G71 families. With VISTA set to appear towards the end of the year, how long can ATI and NVIDIA push these DX9 parts? I'm sure both companies will have a new family ready for VISTA, though the new family may just be a more refined version of the R580 and G71 architectures (much as the R420 was based off the R300 family). In terms of raw power, I think we're already VISTA games ready.
The real question is, what does DX10 bring to the table from the perspective of the end user? There were certain features unique to DX9 that a DX8 part just could not render. Are there things that a DX10 card will be able to do that a DX9 card just can't do? As I understand it, the main difference between DX9 and 10 is that DX10 will unify Pixel Shaders and Vertex Shaders, but I don't see how this will let a DX10 card render something that a DX9 card can't. Can anyone clarify?
Lastly, one great benefit of Crossfire and SLI will be that I can buy a high end X1900XT for gaming right now and then add a low end card or a HD accelerator card (like the MPEG accelerator cards a few years ago) when it is clear if HDCP support will be necessary to play HD content and when I can afford to add a HDCP compliant monitor.
yacoub - Friday, March 10, 2006 - link
yacoub - Friday, March 10, 2006 - link
And price, bro, and price. Two cards at the same performance level from two different companies = great time for a price war. Especially when one has the die shrink advantage as an incentive to drop the price to squeeze out the other's profits.
bob661 - Thursday, March 9, 2006 - link
I only buy based on games I play anyways, but it's good to see them close in performance.
DerekWilson - Thursday, March 9, 2006 - link
The major difference that DX9 parts will "just not be able to do" is vertex generation and "geometry shading" in hardware. Currently a vertex shader program can only manipulate existing data, while in the future it will be possible to adaptively create or destroy vertices.
Programmatically, the transition from DX9 to DX10 will be one of the largest we have seen in a while. Or so we have been told. Some form of the DX10 SDK (not sure if it was a beta or not) was recently released, so I may look into that for more details if people are interested.
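(As a rough illustration of that difference - a conceptual sketch in Python rather than real HLSL/Direct3D 10 code, with the vertex layout and the midpoint-subdivision scheme assumed purely for illustration:)

    from typing import List, Tuple

    Vertex = Tuple[float, float, float]
    Triangle = Tuple[Vertex, Vertex, Vertex]

    def vertex_shader(v: Vertex) -> Vertex:
        # DX9-style vertex shader: strictly one vertex in, one vertex out.
        # It can transform a vertex but never create or destroy geometry.
        x, y, z = v
        return (x * 2.0, y * 2.0, z)

    def midpoint(p: Vertex, q: Vertex) -> Vertex:
        return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2, (p[2] + q[2]) / 2)

    def geometry_shader(tri: Triangle) -> List[Triangle]:
        # DX10-style geometry shader: one primitive in, zero-to-many out.
        # Returning [] would destroy the triangle; here we instead create
        # geometry by splitting one triangle into four.
        a, b, c = tri
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]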
feraltoad - Friday, March 10, 2006 - link
I too would be very interested to learn more about DX10. I have looked online but I haven't really seen anything beyond the unification you mentioned.
Also, Unreal 2007 does look ungodly, and I didn't even think to wonder if it was DX9 or 10 like the other poster. Will it be comparable to games that will run on 8.1 hardware sans DX9 effects? That engine will make them big bux when they license it out. Sidenote - I read they were running demos of it with a quad SLI setup to showcase the game. I wonder what it will need to run it at full tilt?
BTW Derek, I think you do a very good job at AT; I always find your articles full of good common sense advice. When you did a review on the 3000+ budget gaming platform, I jumped on the A64 bandwagon (I had to get an AsrockDual tho, instead of an NF4, cuz I wanted to keep my AGP 6600gt, and that's sad now considering the 7900 GT performance/price in SLI compared to a 7900GTX) and I've been really happy with my 3000+ at 2250; it runs noticeably better than my XP2400M oc'd to 2.2. I'm just one example of someone you & AT have made more satisfied with their PC experience. So don't let disparaging comments get you down. Your thorough commitment to the accuracy of your work shows in how you accept criticism with grace and correct mistakes swiftly. I think the only thing "slipping" around here are peoples' manners.
Spinne - Thursday, March 9, 2006 - link
Yes, please do! So if you can actually generate vertices, the impact would be that you'd be able to do stuff like the character's hair flying apart in a light breeze without having to create the hair as a high poly model, right? What about the Unreal3 engine? Is it a DX9 or DX10 engine?
Rock Hydra - Thursday, March 9, 2006 - link
I didn't read all of that, but I'm glad it's close because the consumer becomes the real winner.
redlotus - Thursday, March 9, 2006 - link
Where the heck is the X3: Reunion rolling demo benchmark? I was all geeked when AT reviewed it and said "it will make a fine addition to our round of benchmarks." Well then, when the heck are you going to start using it? I have yet to see it being used for any of the articles posted since the review.
DerekWilson - Thursday, March 9, 2006 - link
We really will be including X3 in our benchmarks ^_^;;
The benchmark does take quite a long time and we needed to optimize our performance testing in order to make sure we could get the article up for the launch.
As I have mentioned, we will be doing a follow up article, and I will look into including the X3 demo.
Thanks,
Derek Wilson
5150Joker - Thursday, March 9, 2006 - link
Check out these discrepancies with Anandtech's review; boy, has this site been going downhill lately.
From your older review:
http://images.anandtech.com/graphs/ati%20radeon%20...
Then today's review:
http://images.anandtech.com/graphs/7900%20and%2076...
How did the XTX Crossfire lose 11 FPS with a very mild bump in resolution? Worse yet, their editors didn't even mention which drivers they used for their review.
Cygni - Friday, March 10, 2006 - link
Wow, it's like numbers change with different motherboards, chipsets, and driver revisions. ALERT THE PRESS!
Spinne - Thursday, March 9, 2006 - link
That is really odd. I'd expect the numbers to swing a little, but 11 fps is 25% of 44 fps. Could they be using different benchmarks? At least they aren't simply using the numbers from the X1900 review and are actually retesting stuff.
DerekWilson - Thursday, March 9, 2006 - link
We retested with an updated motherboard (RD580) and an updated driver (CAT 6.2).
We used the same test for F.E.A.R. (the built-in performance test).
I'm not sure why performance would drop in this case.
DerekWilson - Thursday, March 9, 2006 - link
I've been looking into this, and we also are now using F.E.A.R. 1.03 rather than 1.02 which we used last time.
I retested the x1900 xtx crossfire and got the same results. I'm really not sure what happened with this, but I'll keep poking around.
munky - Thursday, March 9, 2006 - link
FEAR is one game where the x1900's have had a big lead over the 7800's, and your results from today just don't make sense. How does an x1900xtx get 59fps at 1280x1024, when the gtx512 also gets 59 and the 7900gtx gets 63? Compare it to the results from another site - http://www.techreport.com/reviews/2006q1/geforce-7... At 1280x960 they place the xtx at 57fps, the 7900gtx at 46, and the gtx512 at 44, which are more in line with the results I have seen before.
DigitalFreak - Thursday, March 9, 2006 - link
There is a known bug in the current drivers that causes a performance drop with the 7900GTX in FEAR. Check out HardOCP's preview, where they use the updated driver from Nvidia. FEAR scores are the same or higher than the 1900XT(X).
DerekWilson - Thursday, March 9, 2006 - link
We went back and updated our performance numbers with the aforementioned driver fix.
NVIDIA released it to the press late in the weekend, but we felt the performance increase was important enough to retest with the new driver.
I haven't read Scott's article at the Tech Report yet, so I don't know what driver he used.
5150Joker - Thursday, March 9, 2006 - link
This article didn't state which drivers were used either - you'd think after having the cards for a few weeks your editors wouldn't have such obvious oversights.
Cygni - Friday, March 10, 2006 - link
Ya, cause you paid good money to read this review!
Griswold - Friday, March 10, 2006 - link
That one is getting pretty old; it's not really an excuse for a site with such high standards.
Besides that, the ads on this page cost me bandwidth. ;)
DerekWilson - Thursday, March 9, 2006 - link
I've updated the article with the drivers used. I apologize for the omission.
I absolutely do not mean this as an excuse -- drivers should not have been omitted no matter what the case -- but we have had the cards for less than a week. Again, not an excuse, just correcting your assumption.
dab - Thursday, March 9, 2006 - link
So will EVGA send me a 7900gt in the step up program to replace my 6800GS?
allometry - Thursday, March 9, 2006 - link
Is there any word as to when these cards will hit stores?
DerekWilson - Thursday, March 9, 2006 - link
check now :-)
allometry - Thursday, March 9, 2006 - link
Right on! I didn't see any posts for the card earlier, so I figured there might be a week delay.
Too bad NewEgg already lost its stock of the eVGA 7900GTs :(
inthell - Thursday, March 9, 2006 - link
so NEWEGG shows some of the XFX and EVGA cards with a 256bit mem interface?
inthell - Thursday, March 9, 2006 - link
why would anyone buy the 128bit version, and how come anand didn't test or say anything about this :confused:
DerekWilson - Thursday, March 9, 2006 - link
there is not a 256bit version of the 7600 gt as far as we know
inthell - Friday, March 10, 2006 - link
hmmm, must be a misprint on NEWEGG; they have an XFX and an EVGA with 256bit
supafly - Thursday, March 9, 2006 - link
Could you do some benchmarks for COD2? If you have the time... thanks :)
vaystrem - Thursday, March 9, 2006 - link
It would be really helpful if you had SLI data and single card data separate on each benchmark. If a single card is beating an SLI setup, I can figure that out for myself. In the current charts a lot of cards that should be there, or could be there, aren't because you're running out of space.
I mean, the high performance benchmarks include nothing lower end from ATI than a 1900XT? You are also only reporting on the SLI performance of the 7600GT in the first 4 benchmarks. Most of us aren't going to go out and drop the cash on an SLI setup right off the bat but instead use it as an incremental upgrade.
Just a thought.
shabby - Thursday, March 9, 2006 - link
I agree they should be split up, it just looks too cluttered.
R3MF - Thursday, March 9, 2006 - link
hoorah for the 7900GT, just dropped £244 on one including p&p.
Teetu - Thursday, March 9, 2006 - link
they mention they didn't run all the tests they usually do... why not? time?
DerekWilson - Thursday, March 9, 2006 - link
There were a lot of factors that went into our decision to cut down the number of tests for this article. Our testing is not over, as we are planning a follow up article as well.
5150Joker - Thursday, March 9, 2006 - link
You mention the X1900 XTX costs $580-$650, yet you can find it at a popular e-tailer like Newegg for as low as $509 for an OEM XTX (http://www.newegg.com/Product/Product.asp?Item=N82...) and $531.99 for a retail XTX with MIR (http://www.newegg.com/Product/Product.asp?Item=N82...).
classy - Thursday, March 9, 2006 - link
I never really noticed how bad Crossfire sucks. While a 1900 xtx is about as fast and in many cases faster than a 7900, when put in Crossfire vs SLI it just blows.
DigitalFreak - Thursday, March 9, 2006 - link
One of the reasons I just picked up 2 7900GTX boards. That, and I've been hearing of more and more problems with Crossfire.
Rogue 2 - Thursday, March 9, 2006 - link
On "The Competition" page, they're showing the 7600GT w/ 256 bit memory bus, and the 6800GS w/ 128 bit memory bus. Isn't it exactly the opposite?
DerekWilson - Thursday, March 9, 2006 - link
yes -- fixed
DigitalFreak - Thursday, March 9, 2006 - link
Power test info is missing.
DarthPierce - Thursday, March 9, 2006 - link
Yep... I see no graphs :)
DerekWilson - Thursday, March 9, 2006 - link
fixed
DarthPierce - Thursday, March 9, 2006 - link
I think the 7900GTX vs 7800 GTX512 graph is also missing.
DerekWilson - Thursday, March 9, 2006 - link
Do you mean 7900 GT vs. 7800 GTX? We didn't do a separate 7800 GTX 512 vs. 7900 GTX comparison.
BigLan - Thursday, March 9, 2006 - link
"Today marks the launch of NVIDIA's newest graphics cards: the 7900 GTX, 7900 GT and the 7900 GT."Should that be 7900 GT and the 7600 GT ?
SpaceRanger - Thursday, March 9, 2006 - link
That was the first thing I noticed too... Pretty obvious typo that should have been caught.
Cygni - Friday, March 10, 2006 - link
Ya, you paid good money to read this review!
DerekWilson - Thursday, March 9, 2006 - link
fixed
aryaba - Thursday, March 9, 2006 - link
Huh?
Dfere - Thursday, March 9, 2006 - link
Wow. I guess I am getting old. Just what are they trying to do with all the GS, GT, XL, XLT nomenclature? And I build my own systems.....
tuteja1986 - Thursday, March 9, 2006 - link
lol :)
I just saw the x1800gto scores; they beat the 7600GT in almost every test. http://www.guru3d.com/article/Videocards/325/
Now it's time to see the price ;) Let's see how far the prices of these cards can go :)
coldpower27 - Thursday, March 9, 2006 - link
And oh, by the way, it looks like the 7600 GT does quite well against the X1800 GTO, with less than half the die size to boot and only a 128-bit memory interface.
http://www.firingsquad.com/hardware/nvidia_geforce...
coldpower27 - Thursday, March 9, 2006 - link
Can you please show me what you saw? I don't see the 7600 GT tested against the X1800 GTO, period. It's only tested against the X1800 XL.