Mid Range GPU Roundup - Summer 2006

by Derek Wilson on 8/10/2006 3:30 AM EST

  • jcbennett - Wednesday, September 6, 2006 - link

    I've been unable to find these cheap prices for an X1900GT (nor can I find the card being sold in many places). The cheapest I see anywhere is on Newegg for open-box products, ~$220. For new products, their prices are ~$300. The 7900GT, on the other hand, I've found at TigerDirect for $250 or less, including overclocked versions for ~$10 more.
  • VooDooAddict - Saturday, August 12, 2006 - link

    It's nice to see that really any of the new "midrange budget" solutions would work well for someone. Decisions can be made more on the details than on raw speed. Most people would be very happy with a 7600GT or better. None of the cards being pushed in this price range are really lemons. (Unlike the GeForce FX 5xxx series.)

    Shader Model 3 is also supported across both the X1xxx and 7xxx series lines.
  • blondeguy08 - Friday, August 11, 2006 - link

    Since AMD has acquired ATI, it is pointless to get a video card from them, especially high end, because AMD has stomped out the ATI name along with some of its name-brand technologies, meaning no support for the old... hello, NVIDIA is the only way to go at this day and time. Maybe not tomorrow, because AMD might potentially create a duo of the two companies' products that could smoke Intel's relations with NVIDIA, since they haven't merged in retaliation to AMD's move...
  • arturnowp - Friday, August 11, 2006 - link

    AMD said they won't discontinue the ATI and Radeon brands...
  • Josh7289 - Friday, August 11, 2006 - link

    Yeah, and there aren't going to be any real products of this takeover until 2008 or so.
  • arturnowp - Friday, August 11, 2006 - link

    I think the reason the 6600GT stands out in Quake 4 is its memory amount - it has only 128MB, which isn't enough for Q4/D3. This card should be tested at medium quality. And even though Doom 3 gives a nice average framerate with the 6600GT, hiccups occur with high-quality textures.
  • arturnowp - Friday, August 11, 2006 - link

    I wonder why you chose those resolutions, because midrange gamers mostly use 1280x1024 and equivalents.
  • JarredWalton - Friday, August 11, 2006 - link

    We also show the various lower/higher resolutions, and basically chose a top resolution that shows how the cards begin to separate as the GPU is stressed more. At 1280x1024, some games begin to become CPU limited. It's also worth mentioning that 1600x1200 is relatively close to 1680x1050 in terms of GPU requirements, and 1920x1400 is close to 1920x1200 - the WS resolution will typically be ~10-20% faster in both instances (more at 19x12, less at 16x10). I would say a lot of people are moving to 1680x1050 these days, even in the mid-range.
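The raw pixel counts roughly bear out that ~10-20% estimate. A quick back-of-the-envelope sketch (this assumes the "1920x1400" above refers to the 4:3 resolution 1920x1440):

```python
# Pixel-count comparison behind the "WS resolution is ~10-20% faster" estimate.
# Assumption: "1920x1400" in the comment means the 4:3 resolution 1920x1440.
pairs = [((1600, 1200), (1680, 1050)),
         ((1920, 1440), (1920, 1200))]
for (w4, h4), (ww, wh) in pairs:
    full, wide = w4 * h4, ww * wh
    saving = (full - wide) / full
    print(f"{ww}x{wh} pushes {saving:.0%} fewer pixels than {w4}x{h4}")
```

The ~8% and ~17% pixel savings line up with "more at 19x12, less at 16x10", since frame rate scales roughly with pixels pushed once a game is GPU limited.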
  • DerekWilson - Saturday, August 19, 2006 - link

    also, if you just want to play at 1280x1024, I'd recommend going with the 7600 gt at this point ... the very low end of midrange cards can handle 12x9 and 12x10 resolutions.
  • Egglick - Friday, August 11, 2006 - link

    Where the heck is the 256MB X1800XT?? You can get it for only $199 (http://www.newegg.com/Product/Product.asp?Item=N82...) and it offers equal or better performance than the X1900GT.

    Why do review sites continually ignore this card??
  • gmallen - Friday, August 11, 2006 - link

    Most of the PC enthusiast population interested in mid-range cards is still running AGP motherboards (this is based on sales of PCIe motherboards vs. AGP motherboards). Where are these cards?
  • Josh7289 - Friday, August 11, 2006 - link

    quote:

    Where are these cards?


    They don't exist.
  • arturnowp - Friday, August 11, 2006 - link

    Hi

    It's written that all cards in Oblivion were tested with HDR lighting, which the X800GTO doesn't support. I think your results are misleading. The same with SC: Chaos Theory...

    BTW: Who plays Oblivion with Actor Fade at 20%, Item Fade at 10%, and Object Fade at 25%? You get better graphics and performance setting those options to 50-60% and turning off grass, which consumes a lot of power and doesn't look good. In foliage it's better to see your enemies from a greater distance, say while on a horse ;-)

  • arturnowp - Friday, August 11, 2006 - link

    OK, there is a note about SC: Chaos Theory, but all in all the conclusions are misleading: "Owners of the X800 GTO may have a little more life left in their card depending on how overclocked the card is, but even at stock clocks, it might be wise to hang on for another product cycle if possible" - when the GeForce 6600GT performs on par with the X800GTO. It would be better to exclude the X800GTO from the charts or mark it as an SM 2.0 card. Better yet, the GeForce 6600GT should be tested in SM 2.0 mode...
  • nv40 - Friday, August 11, 2006 - link

    Don't know why:
    http://www.xbitlabs.com/articles/video/display/pow...
    Some of the differences between tests are so large that they almost shocked me.
    For instance:
    A 7900GT on 84.21 with an FX-60 runs 54 FPS avg at 1600x1200 with 4xAA/16xAF at X-bit Labs.
    A 7900GT on 91.33 with an X6800 manages just 35 FPS avg at 1600x1200 with only 4xAA at AnandTech.
    A problem with 91.33? The Intel 975X? The X6800? NVIDIA?
    That's more than a 40% performance difference, despite the X6800 being far superior to the FX-60.
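For what it's worth, the size of that gap depends on which result is taken as the baseline; a quick sketch using the two FPS figures quoted above:

```python
# Compare the two reported 7900GT results from the comment above.
xbit_fps = 54.0   # X-bit Labs: 84.21 driver, FX-60, 4xAA/16xAF
anand_fps = 35.0  # AnandTech: 91.33 driver, X6800, 4xAA
vs_anand = (xbit_fps - anand_fps) / anand_fps  # gap relative to the slower result
vs_xbit = (xbit_fps - anand_fps) / xbit_fps    # gap relative to the faster result
print(f"X-bit's figure is {vs_anand:.0%} higher; Anand's is {vs_xbit:.0%} lower")
```

So "more than 40%" holds when measured against the slower AnandTech number (~54%), though it is ~35% measured against the X-bit number.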

  • coldpower27 - Friday, August 11, 2006 - link

    They probably aren't running the same timedemo sequences.
  • nv40 - Friday, August 11, 2006 - link

    Maybe... but there's only a 9% difference for the X1900GT (41 vs 38).
    And the 7900GT definitely performed much worse at AnandTech than at X-bit Labs in general.
    There's no telling which is correct, but if both are right, the conclusions can probably be drawn like below:
    1. Driver problem: 91.33 is much slower than 84.21 (an NV cheat, or a 91.33 problem)
    2. CPU problem: the X6800 is much inferior to the FX-60 in games (ridiculous, and far from true in every test)
    3. Platform problem: NVIDIA cards perform much worse on an Intel chipset (975X)
  • Sharky974 - Friday, August 11, 2006 - link

    I agree. I clearly remember Xbit declaring the 7900GT to win the vast majority of benches vs the X1900GT.

    In fact, overall the X1900GT wasn't warmly received. I really feel this deserves some looking into.

    For example, I'll have to go look, but I think FiringSquad also showed the X1900GT as inferior to the 7900GT.

    As it stands now, it's like Anand's platforms are somehow ATI biased; on the other hand, I believe Xbit's platform is NVIDIA biased. Xbit reviews nearly always show NVIDIA winning.
  • Sharky974 - Friday, August 11, 2006 - link

    http://www.firingsquad.com/hardware/sapphire_radeo...

    I started on the first page of benches.

    As one glaring example:

    FiringSquad: Quake 4, 1280x1024, 4xAA/8xAF: 7900GT - 87.2, X1900GT - 60.6

    http://www.firingsquad.com/hardware/sapphire_radeo...

    Anand: Quake 4, 1280x1024, 4xAA: 7900GT - 45.1, X1900GT - 49.8

    http://images.anandtech.com/reviews/video/roundups...

    With similar settings, FS has the 7900GT getting nearly double the frames Anand does. The X1900GT also gets significantly more in the FS review, from 49 to 60 FPS, but nowhere near the change the 7900GT sees, with the net effect that the X1900GT ekes out a win at Anand but loses by nearly 27 FPS at FS.

    The X1900GT is definitely a better card than I had remembered, even in the FS benches, though.

    Also, FS was using an FX-57, Anand a much more powerful CPU, making the results all the more puzzling.

    In addition to some of the other suggestions, I'd question drivers. FS was using older drivers on both since it is an older review. Perhaps NVIDIA drivers have seen a large performance decrease, or ATI's a similar increase? This seems fairly unlikely, though, as I don't think you normally get huge differences from driver to driver.

    Unless NVIDIA really was cheating re: 16-bit filtering as the INQ claimed a while back, and they fixed it, causing a massive performance decrease? :) Again though, that suggestion is made half-jokingly.

    This definitely needs a lot of looking into, I feel. Anand's results are quite different from others around the web at first blush.
  • JarredWalton - Friday, August 11, 2006 - link

    Levels can make a huge difference in performance. For example, Far Cry has segments that get about 80 FPS max on any current CPU (maybe higher with a Core 2 Extreme overclocked...), but other areas of the game run at 150+ FPS on even a moderate CPU like a 3500+. I don't have a problem providing our demo files, but some of them are quite large (Q4 is about 130 MB, if I recall). SCCT, FEAR, and X3 provide a reference that anyone can compare to, if they want. The only other thing is that ATI driver improvements are certainly not unlikely, especially in Quake 4.
  • Sharky974 - Friday, August 11, 2006 - link

    I tried comparing numbers for SCCT, FEAR, and X3; the problem is Anand didn't bench any of these with AA in this mid-range test, and other sites all use 4xAA as the default. So, in other words, no direct numbers comparison on those three games is possible with those two Xbit/FS articles, at least.

    Although the settings are different, both FS and Anand showed FEAR as a tossup, though.

    It does appear other sites are confirming Anand's results more than I thought though.

    And the X1900GT for $230 is a kickass card.
  • JarredWalton - Friday, August 11, 2006 - link

    The real problem is that virtually every level of a game can offer higher/lower performance relative to the average, and you also get levels that use effects that work better on ATI or NV hardware. Some people like to make a point about providing "real world" gaming benchmarks, but the simple fact of the matter is that any benchmark is inherently different from actually sitting down and playing a game - unless you happen to be playing the exact segment benchmarked, or perhaps the extremely rare game where performance is nearly identical throughout the entire game. (I'm not even sure what an example of that would be - Pacman?)

    Stock clockspeed 7900GT cards are almost uncommon these days, since the cards are so easy to overclock. Standard clocks are actually supposed to be 450/1360 IIRC, and most cards are at least slightly overclocked in one or both areas. Throw in all the variables, plus things like whether or not antialiasing is enabled, and it becomes difficult to compare articles between any two sources. I tend to think of it as providing various snapshots of performance, as no one site can provide everything. So if we determine X1900 GT is a bit faster overall than 7900 GT and another site determines the reverse, the truth is that the cards are very similar, with some games doing better on one architecture and other games on the other arch.

    My last thought is that it's important to look at where each GPU manages to excel. If for example (and I'm just pulling numbers out of the hat rather than referring to any particular benchmarks) the 7900 GT is 20% faster in Half-Life 2 but the X1900 GT still manages frame rates of over 100 FPS, but then the X1900 GT is faster in Oblivion by 20% and frame rates are closer to 40 FPS, I would definitely wait to Oblivion figures as being more important. Especially if you run on LCDs, super high frame rates become virtually meaningless. If you can average well over 60 frames per second, I would strongly recommend enabling VSYNC on any LCD. Of course, down the road we are guaranteed to encounter games that require more GPU power, but predicting what game engine is most representative of the future requires a far better crystal ball than what we have available.

    For what it's worth, I would still personally purchase an overclocked 7900 GT over an X1900 GT for a few reasons, provided the price difference isn't more than ~$20. First, SLI is a real possibility, whereas CrossFire with an X1900 GT is not (as far as I know). Second, I simply prefer NVIDIA's drivers -- the old-style, not the new "Vista compatible" design. Third, I find that NVIDIA always seems to do a bit better on brand new games, while ATI seems to need a patch or a new driver release to address performance issues -- not always, but at least that's my general impression; I'm sure there are exceptions to this statement. ATI cards are still good, and at the current price points it's definitely hard to pick a clear winner. Plus you have stuff like the reduced prices on X1800 cards, and in another month or so we will likely have new hardware in all of the price points. It's a never ending rat race, and as always people should upgrade only when they find that the current level of performance they had is unacceptable from their perspective.
  • arturnowp - Friday, August 11, 2006 - link

    I think another advantage of the 7900GT over the X1900GT is power consumption. I haven't checked the numbers on this, so I am not 100% sure.
  • coldpower27 - Saturday, August 12, 2006 - link


    Yes, this is completely true, going by Xbit Labs' numbers.

    Stock 7900 GT: 48W
    eVGA SC 7900 GT: 54W
    Stock X1900 GT: 75W
  • JarredWalton - Friday, August 11, 2006 - link

    Speech-recognition + lack of proofing = lots of typos

    "... out of a hat..."
    "I would definitely weight..."
    "... level of performance they have is..."

    Okay, so there were only three typos that I saw, but I was feeling anal retentive.
  • Sharky974 - Friday, August 11, 2006 - link

    Not to beat this to death, but at FS the X1900GT vs 7900GT benchmarks:

    X1900GT:

    Wins: BF2, Call of Duty 2 (barely)

    Loses: Quake 4, Lock On: Modern Air Combat, FEAR (barely)

    Toss-ups: Oblivion (FS runs two benches, foliage/mountains, and the cards split them), Far Cry w/HDR (the X1900 takes the two lower-res benches, the 7900 GT takes the two higher-res benches)

    From Xbit's X1900 GT vs 7900 GT conclusion:


    "The Radeon X1900 GT generally provides a high enough performance in today’s games. However, it is only in 4 tests out of 19 that it enjoyed a confident victory over its market opponent and in 4 tests more equals the performance of the GeForce 7900 GT. These 8 tests are Battlefield 2, Far Cry (except in the HDR mode), Half-Life 2, TES IV: Oblivion, Splinter Cell: Chaos Theory, X3: Reunion and both 3DMarks. As you see, Half-Life 2 is the only game in the list that doesn’t use mathematics-heavy shaders. In other cases the new solution from ATI was hamstringed by its having too few texture-mapping units as we’ve repeatedly said throughout this review."

    Xbit review: http://www.xbitlabs.com/articles/video/display/pow...
  • Geraldo8022 - Thursday, August 10, 2006 - link

    I wish you would do a similar article covering video cards for HDTV and HDCP. It is very confusing. Even though certain cards might state they are HDCP-capable, it is not enabled.
  • tjpark1111 - Thursday, August 10, 2006 - link

    The X1800XT is only $200 shipped; why not include that card? If the X1900GT outperforms it, then ignore my comment (I've been out of the game for a while).
  • LumbergTech - Thursday, August 10, 2006 - link

    So you want to test the cheaper GPUs for those who don't want to spend quite as much... OK... well, why are you using the CPU you chose, then? That isn't exactly in the affordable segment for the average PC user at this point.
  • PrinceGaz - Thursday, August 10, 2006 - link

    Did you even bother reading the article, or did you just skim through it and look at the graphs and conclusion? May I suggest you read page 3 of the review, or in case that is too much trouble, read the relevant excerpt:

    quote:

    With the recent launch of Intel's Core 2 Duo, affordable CPU power isn't much of an object. While the midrange GPUs we will be testing will more than likely be paired with a midrange CPU, we will be testing with high end hardware. Yes, this is a point of much contention, as has always been the case. The arguments on both sides of the aisle have valid points, and there are places for system level reviews and component level reviews. The major factor is that the reviewer and readers must be very careful to understand what the tests are really testing and what the numbers mean.

    For this article, one of the major goals is to determine which midrange cards offers the best quality and performance for the money at stock clock speeds at this point in time. If we test with a well aged 2.8GHz Netburst era Celeron CPU, much of our testing would show every card performing the same until games got very graphics limited. Of course, it would be nice to know how a graphics card would perform in a common midrange PC, but this doesn't always help us get to the bottom of the value of a card.

    For instance, if we are faced with 2 midrange graphics cards which cost the same and perform nearly the same on a midrange CPU, does it really matter which one we recommend? In our minds, it absolutely does matter. Value doesn't end with what performance the average person will get from the card when they plug it into a system. What if the user wants to upgrade to a faster CPU before the next GPU upgrade? What about reselling the card when it's time to buy something faster? We feel that it is necessary to test with high end platforms in order to offer the most complete analysis of which graphics solutions are actually the best in their class. As this is our goal, our test system reflects the latest in high end performance.
  • augiem - Thursday, August 10, 2006 - link

    I wonder which of these cards would accelerate Maya's 3D viewport performance the most...
  • PrinceGaz - Thursday, August 10, 2006 - link

    If you're a casual Maya user, then look at the OpenGL performance (Quake 4) for a rough guide. I'm tempted to think though that the GeForce cards should still have the edge in most OpenGL situations so Quake 4 might not be representative.

    If you use Maya professionally, then none of the cards looked at are for you. A good Quadro or FireGL card will render scenes far faster than any consumer card, and as time is money, will more than pay for itself despite their high cost if that is what you do for a living.
  • Calin - Friday, August 11, 2006 - link

    There was a time when it was possible (although not very easy) to mod a Radeon 9700 into the corresponding FireGL card. This would have been great for you (but nowadays a FireGL based on the 9700 would be slower than consumer cards).
  • PrinceGaz - Thursday, August 10, 2006 - link

    I've only read the first two pages of the article, up to and including the list of prices for the various cards at the bottom of the second page, and haven't read any comments here, but it seems pretty clear already that the X1900GT is going to be the obvious winner in terms of value for money.

    I'll be back in half an hour or so after I've read the rest of it.
  • Gondorff - Friday, August 11, 2006 - link

    Indeed, the X1900GT looks very good... which makes me very happy because I just bought it a week or so ago (damned slow shipping, though...). For those who do care about rebates, the X1900GT can be had on Newegg for $200 right now (a Connect3D one). I was lucky and got it at $175 before they raised the price... for $15 more than the 7600GT I was going to get otherwise, that's pretty damn good if I may say so myself.

    Anyway... excellent article; if only it were out earlier so I could worry less about a slightly blind choice... but c'est la vie and it turned out well anyway :).
  • Kougar - Thursday, August 17, 2006 - link

    Good grief, I just found it for $199... and it was previously $175!? Incredible... :(
  • PrinceGaz - Thursday, August 10, 2006 - link

    Yep, pretty much as I suspected- the X1900GT is best at stock speeds. Things become a little blurred when factory-overclocked 7900GTs are brought into the picture but while they're faster, they're also more expensive by a similar amount. Both offer great value for money if you need to buy a card now.

    One thing the article seemed to overlook is that many people who visit sites like this will overclock cards themselves, factory overclocked or not, and this is likely to reduce the advantage of already overclocked cards like the 7900GTs you recommend. I imagine there is a bit more headroom in a stock X1900GT than a factory overclocked 7900GT (especially a 7900GT with a core clock of 580 like you used). Those of us willing to take a chance on how much extra a card has available may well find a user-overclocked X1900GT to be a match for what an overclocked (user or factory) 7900GT can achieve.
  • coldpower27 - Friday, August 11, 2006 - link


    The problem with this is that you're comparing assumed performance vs. the guaranteed performance of factory-overclocked units, so they aren't comparable.

    The point provided is something to keep in mind, but it shouldn't be recommended for anyone other than those who know what they are doing. Not to mention voiding the warranty when you do what you suggest.
  • DerekWilson - Friday, August 11, 2006 - link

    Also, if you look around, increasing voltage and cooling for 7900 GT cards can yield results better than a 7900 GTX. Buying a factory overclocked 7900 GT gives you a card that a manufacturer binned as a part that is able to hit higher than stock clocks at stock voltage and temperature. So you should get a more easily overclockable card if you really want to push it to its limits.
  • Genx87 - Thursday, August 10, 2006 - link

    2nd from the top for ATI is considered mid grade?

    Guess that 7950GX2 is pushing them down from the top.
  • coldpower27 - Friday, August 11, 2006 - link


    Well, it wasn't too long ago that the X1900 XT still had pricing over $400 US.

    It wasn't until ATI started doing some price slashing in preparation for the X1950 that the prices fell a lot, fairly recently.
  • JarredWalton - Thursday, August 10, 2006 - link

    It's more based on price than performance, and obviously at $330 we're very close to the high end.
  • Powermoloch - Thursday, August 10, 2006 - link

    Why was it not listed? These days they can be found for under $150.00.
  • kalrith - Friday, August 11, 2006 - link

    Actually, it's $126 shipped from Newegg right now (http://www.newegg.com/Product/Product.asp?Item=N82...), and that's BEFORE a $30 MIR. It should keep up with (or beat) the 7600GT, so I think it deserves to be on there as well.
  • Jedi2155 - Sunday, August 13, 2006 - link

    Although it is plenty fast, I think DX 9.0c has shown enough benefits over 9.0b to seriously consider the 7600 GT over the X850 XT.
  • Zebo - Thursday, August 10, 2006 - link

    Nice review but there should only be two choices in the sub $300 field:

    The 7900GT: not only can it be had for $224, not the $275 the review implies, it is virtually guaranteed to overclock to 7900GTX levels, meaning it trades punches with a $359 X1900XT.

    The card missing from this review is the $220 X1900 All-in-Wonder: not only is it faster than a stock 7900GT and has way more features, it can also be overclocked to X1900XT levels.
  • Zebo - Thursday, August 10, 2006 - link

    http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...

    Looks like they raised the price since last week... it really was $224 :)
    http://www.newegg.com/Product/Product.asp?Item=N82...
  • AmbroseAthan - Thursday, August 10, 2006 - link

    Was kind of surprised to see it not in this mix, given you can get one for ~$200: Sapphire X1800XT, OEM (http://www.newegg.com/Product/Product.asp?item=N82...). Retail is $250ish.

    I assume it runs faster than the X1800GTO, but how does it rank against the 7800GT and 7900GT?
  • mpc7488 - Thursday, August 10, 2006 - link

    quote:

    but with the recent price cuts pushing the X1900 GT down to about $230, the added performance gain of the 7900 GT might not be worth the money in this case


    The 7900GT is consistently around $240 after rebates. There are 3 cards at that price from 3 different manufacturers at Newegg right now (eVGA, XFX, and MSI). In fact, the overclocked version (520 core/1540 memory) is $244.

    Maybe rebates aren't really looked at in the price engine, but the fact remains that you can easily find a 7900GT for under $250.
  • DerekWilson - Thursday, August 10, 2006 - link

    Good point. We didn't include rebates as they can change without warning, not everyone follows through on them, and they take some time to receive.

    But, obviously, they can make a difference. I'll add a bit to the conclusion about it.

    Thanks.
  • rcc - Monday, August 14, 2006 - link

    For my past sins, I did a brief stint at Fry's Electronics. Only 5% of customers send in rebates and follow up. The stores count on this.

    So, I think I'd ignore them too. Unless just to note that rebates may be available on some items, but that pretty much applies to anything at any time.

  • Josh7289 - Thursday, August 10, 2006 - link

    On Page 5, Black and White 2 Performance, this is written:

    quote:

    But with cards like the 6600 GT, 6800 GT, X800 GTO and X1600 XT, the game would look much better if some settings were turned down in favor of enabling some antialiasing or a higher resolution.


    Obviously, "6800 GT" should be "6800 GS". ;)
  • DerekWilson - Thursday, August 10, 2006 - link

    Obviously :-)
  • Nelsieus - Thursday, August 10, 2006 - link

    It was probably really hard making final conclusions as you noted (with price cuts and factory OCs, etc.), but I think overall you came up with some excellent choices. The review was very fair and balanced, in-depth, and overall covered all the bases.

    Another great article, Derek. Thanks for keeping an eye out for us midrange buyers. :)
  • saiku - Thursday, August 10, 2006 - link

    amen, thanks for remembering the guys in the "middle".
  • DerekWilson - Thursday, August 10, 2006 - link

    Thanks! We did want to do even more with it, but we were afraid if we worked any longer on it we'd have to deal with another price cut before it got published :-)
  • Josh7289 - Thursday, August 10, 2006 - link

    I have a question. When looking at performance for games at 1600x1200 no AA, could I compare that to what I would get with 1280x1024 with AA on? Thanks.
  • Gigahertz19 - Thursday, August 10, 2006 - link

    You overclock the 7900 GT and it gets a great boost in performance. I would like to know how well the X1900 XT overclocks.

    I think you should have overclocked the top midrange ATI X1900 XT to see how well it could outperform an overclocked 7900GT or a stock 7900 GTX...

    Or maybe compared an overclocked 7900 GT to a stock-clocked 7900 GTX, then compared an overclocked ATI X1900 XT to a stock-clocked ATI X1900 XTX.

    Nice article, by the way; this comes at a perfect time when I'm about to build a new computer in a few weeks. Going to wait until September for the NVIDIA 590 chipsets for Conroe, see what else comes out by that time, then buy :)
  • DerekWilson - Thursday, August 10, 2006 - link

    There are no factory overclocked X1900 XT cards for sale. The clock speed difference between the X1900 XT and the X1900 XTX essentially means that an overclocked X1900 XT would *be* an X1900 XTX.

    We tested the NVIDIA cards at higher clock speeds because they are sold at higher clock speeds. We weren't trying to snub ATI; it's just that people can actually get this performance out of the box.
  • yacoub - Thursday, August 10, 2006 - link

    Top of the Final Words page, first sentence:

    While this has been quite a lot of information to absorb, but we will do our best to sort it all out.

    Remove the "While" and capitalize the 't' in "this", or remove "but". =)
  • Gigahertz19 - Thursday, August 10, 2006 - link

    I can't stand people who always have to correct every damn thing they read. Who cares if the authors of these articles make little mistakes? As long as these articles are readable and understandable, who gives a shit? I don't think anybody has the right to complain about something that is free for us to read... now, if we were paying to read this material, it would be a different story.

    I can understand correcting big mistakes, like when the author uses the incorrect name for something or is wrong about a fact - that should be corrected - but little grammatical errors and sentence structure should be left alone unless it's completely butchered. If you're so interested in these small mistakes, go teach high school English.

    And yes I know some ass on here will find an error in my above comments and correct it, go for it :).
  • yacoub - Thursday, August 10, 2006 - link

    Actually, the authors generally appreciate it and fix it, at least in my experience. It makes for a more professional site to have solid grammar in articles. As for "who gives a s#!t", generally adults do.
  • Netopia - Friday, August 11, 2006 - link

    And to support his position, take a look at the sentence now... they fixed it!

    Joe
  • JarredWalton - Friday, August 11, 2006 - link

    Yup.

    Derek was working on this late at night and so I went and made my typical corrections after the fact. There were plenty of other minor typos, and we do our best to correct them whether we spot them or someone else does. We certainly don't mind people pointing them out, as long as it's not the "OMFG you misspelled two words on the first page so I stopped reading - you guys are teh lamez0rz!?1!" type of comment. ;)
  • CKDragon - Thursday, August 10, 2006 - link

    I have my 7900GT voltmodded & overclocked to 640/820. I know you didn't show voltmod overclocked benchmarks, but seeing that just a core bump up to 580 brings it close to or better than the X1900XT at stock is a nice reference mark to have.
  • Frackal - Thursday, August 10, 2006 - link

    I doubt that, considering a 7900GTX with higher core/memory clocks than that usually gets beaten by an X1900XT at stock. (Not to mention, to make that fair they'd have to OC the X1900XT too.)

    This review was incomplete in a relevant way, IMO, because it did not show the huge difference between an X1900XT and a 7900GT with AA/AF on.
  • yacoub - Thursday, August 10, 2006 - link

    Nor the huge difference in audible noise levels, for that matter. My 7900GT is practically silent except when in 3D games, and even then it's not a jet engine.
  • yacoub - Thursday, August 10, 2006 - link

    I recently upgraded from an X800XL to a 7900GT (eVGA N584 model - hsf is copper and covers the RAM chips). I run the 91.33 drivers.

    I am extremely pleased with this upgrade choice. The card is actually quieter than my Sapphire X800XL Ultra was (it had the Zalman hsf on it stock but the fan was ball-bearing and made a bit of noise).

    My rig:
    3200+ Venice
    1GB DDR RAM dual-channel
    A8N-SLI Premium

    Games:
    CS:Source
    Homeworld 2

    Haven't reinstalled other games yet but considering the great improvement I noticed in CS:S, I imagine FEAR, NFSMW, and the other games I own but don't currently have installed would also see a large jump in performance. Not only did I gain fps and eliminate the big dips I experienced in busy scenes with the X800XL, I'm also at max graphical settings (everything High) and anywhere from 2xAA and 4xAF up to 4xS AA and 8xAF, and this is at 1680x1050 (20" widescreen).

    Very satisfied with the purchase. This cost me less than the X800XL did nine months ago and performs probably 40-60% better, if not more considering the improved graphical settings on top of the fps gain.
  • vailr - Thursday, August 10, 2006 - link

    When are the DX10 cards going to be available?
    And what new assortment of ATI or NVIDIA GPUs will be on the DX10 cards?
    Will there be cheap [<$150] DX10 cards?
  • Warder45 - Thursday, August 10, 2006 - link

    I don't see the 7600GT OC 600/750 listed in the charts on the page talking about the 7600GT OC. Lots of 7900GT models though.
  • DerekWilson - Thursday, August 10, 2006 - link

    look again :-) It should be fixed.
  • pervisanathema - Thursday, August 10, 2006 - link

    You post hard-to-read line graphs of the benchmarks that show the X1900XT crushing the 7900GT with AA/AF enabled.

    Then you post easy-to-read bar charts of an OCed 7900GT barely eking out a victory over the X1900XT in some benchmarks, and you forget to turn on AA/AF.

    I am not accusing you guys of bias, but you make it very easy to draw that conclusion.
  • yyrkoon - Sunday, August 13, 2006 - link

    Well, I cannot speak for the rest of the benchmarks, but owning a 7600GT AND Oblivion, I find the Oblivion benchmarks inaccurate.

    My system:

    Asrock AM2NF4G-SATA2
    AMD AM2 3800+
    2GB Corsair DDR2 6400 (4-4-4-12)
    eVGA 7600GT KO

    The rest is pretty much irrelevant. With this system, I play at 1440x900 with high settings, similar to the benchmark settings, and the lowest I get is 29 FPS under heavy combat (lots of NPCs on screen attacking me). Average FPS: 44 in town, 44 in the wilderness, 110 in dungeons. I'd also like to note that compared to my AMD 3200+ XP / 6600GT system, the game is much more fluid and playable.

    Anyhow, keep up the good work guys; I just find your benchmarks wrong from my perspective.
  • Warder45 - Thursday, August 10, 2006 - link

    The type of chart used just depends on whether they tested multiple resolutions vs. a single resolution.

    Similar to your complaint, I could say they are biased towards ATI by showing how the X1900XT had better marks across all resolutions tested, yet only testing the 7900GT OC at one resolution, not giving it the chance to prove itself.
