Comments Locked

93 Comments


  • ajbird - Friday, September 12, 2008 - link

    I am not sure I agree with this:

    "Along with such a title comes a general requirement: if you're dropping over $500 on a graphics card on a somewhat regular basis, you had better have a good monitor - one of many 30" displays comes to mind. Without a monitor that can handle 2560x1600, especially with 2x 4870 X2 cards in CrossFire, all that hard earned money spent on graphics hardware is just wasted."

    I have just built a new rig and will not be building anything else for a long time to come. I want something that will play new games now at 1680x1050 with all the eye candy switched on and will still be able to play games in 3 years time at a decent level of IQ. I paid £330 for this card and think it was worth every penny.

    I have always splashed out on top-end cards (I build a new PC about every 3-4 years) and have always been happy with their lifespan.
  • 4g63 - Monday, August 18, 2008 - link

    Cars and cards are two completely different things. I think that the guy [author] was just humoring you with the comment about energy costs. Why is this even an issue? If you want to burn a few hundred extra watts on your gaming rig, there is no logical reason not to. If you want to help this place with energy, support nuke power. Nuke power is clean, safe, plentiful and reliable. Oil prices have soared because of skeptics who have bought into an unsubstantiated construct of social panic. One guy was basically drawing connections between the kilowatt-hours on his power meter [due to the use of a gaming rig!] and a conglomeration of natural phenomena. Come on. Think for Yourself and Research the Truth if you are going to preach about how we should give up our right to play games as fast and as clearly as we possibly can! God Bless Capitalism.
  • Zak - Thursday, August 21, 2008 - link

    I'd like to see 9800GX2 scores too for comparison. I have been quite disappointed with it: a sudden performance dropoff above 1680x1050, due to low bandwidth and only 512MB of memory, I guess, and it runs extremely hot. I'm thinking about getting this new AMD/ATI card, as the GeForce 200 series is a joke.

    Z.
  • X1REME - Friday, August 15, 2008 - link

    I used to look to this site to see what is good to buy and what not to buy. Last time, I purchased the ASUS P5B Deluxe/WiFi-AP AiLifestyle Series motherboard recommended by this site. I have looked at many reviews but have not come across one to this date which recommends something from AMD. I don't know why; maybe they just have not made anything good enough, or maybe everyone expects more from AMD (and that's a good thing). So if someone can show me personally a review that favours AMD outright, as with Intel and nVidia all the time, that would be a start.
  • StormEffect - Friday, August 15, 2008 - link

    Over the past couple of years I've become more and more of an AMD/ATI fanboy, even though I've tried to stop. Yet I bought a MacBook Pro (by Apple, with Intel and Nvidia parts), and all of my desktops are primarily Intel and Nvidia.

    But somehow the underdog status has me totally charmed, and this new 4000 series has added to the affection I have for AMD/ATI. I even recently built a new desktop with a 780G chipset and an 8750 tricore Phenom (no GPU except the integrated HD3200, which rocks).

    I REALLY <3 AMD/ATI right now. So I am QUITE surprised that everyone here seems so adamant that there is some serious bias in this article. It states the facts, this card is as fast as it gets, but it isn't perfect. So what's wrong with that? When you are at the top you deserve to be looked at critically.

    Is everyone just oversensitive at this point? I believe in being nice to others, but does the writer of this review have to drool over this card to validate it? I LOVE AMD/ATI. If there is bias here I'd be freaking out and getting out the picket signs, but I have reread it and I STILL don't see where you are all coming from.

    When they reviewed the GTX280 they used words with possibly unkind connotations to describe the massive core, does that make them Nvidia haters too?

    I don't see it, will someone point it out to me?
  • Mr Roboto - Saturday, August 16, 2008 - link

    I generally agree with what you're saying, except the stopgap garbage 9800GX2 card was not looked at nearly as critically when it was reviewed, and that card hit its EOL only 3 months after the release date. Also, Nvidia did not add anything new to that card compared to the 7950GX2 from 3 years ago. Why does the already obsolete 9800GX2 get a pass?

    It's right on the money to state "hey, let's get this going at the hardware level with these single-slot dual-GPU cards" instead of relying on software profiles that waste money by requiring a dedicated team of programmers to get any significant improvement in performance. Beyond the time and money aspect, like Anand said, it's not going to last; it was supposed to be a temporary thing but has become the only way either side has shown any progress. But Nvidia is doing the same thing. The bias is noticeable, though I have gotten used to it over the last few years while ATI has been out of the game. Nvidia's aggressive "marketing", if you want to call it that, has corrupted nearly all of the major online hardware publications' points of view, IMO. AMD is definitely the underdog, but with a different sort of negative twist.
  • far327 - Friday, August 15, 2008 - link

    http://phoenix.craigslist.org/evl/sys/790600009.ht...

    Don't miss out on a steal.
  • far327 - Friday, August 15, 2008 - link

    I'm sorry, boys & girls, but this is where my maturity and passion for gaming collide... It is just getting too damned excessive to be able to play a PC game at an HD resolution. I think my PC gaming days are done until Nvidia & AMD decide to work on better cooling methods and lower power consumption. Doesn't anyone here realize the world is in the middle of an energy crisis that is causing food and energy prices to soar??? Video cards today are like muscle cars from the '70s. I am running an E8400 with two 8800 GT Akimbo 1024MB in SLI off a 550-watt PSU. I refuse to invest in a market that is more or less careless towards the environment. Waiting on green solutions!!!
  • Ezareth - Friday, August 15, 2008 - link

    That is fine with us. The rest of us who can think for ourselves will continue to advance while you revert back to the stone age. "Green" is a marketing ploy, much like "organic" food, global warming, etc. If we need more electricity, you and your kind need to give up your opposition to nuclear power. We have enough uranium in the US to power all of our electricity needs for the next few centuries.

    Not everyone buys into that, but computers and graphics cards will continue to consume more and more electricity until some technology breakthrough comes along that doesn't involve the use of transistors (like IBM's spintronics research).

    If you are so concerned about being "green" go live in the woods somewhere, and let the rest of us enjoy our advanced lifestyles.
  • far327 - Friday, August 15, 2008 - link

    It must be nice to completely ignore reality. I suppose you think $5.00 per gallon of gas is just fine too? Nuclear power can't fix that, bro. Nor can nuclear power fix the flood that hit the Mississippi, California's massive wildfires, or Katrina. The recent surge in China's economy has allowed 1.8 billion people to drive automobiles. Think that might have a slight effect on our atmosphere? The population of the USA increases by 400,000 yearly. Think of all the consumption done by each person every single day! And our population continues to increase. The difference between you and me is that when you look outside, you see a tree. When I look outside, I see a forest. The world is bigger than your computer screen.
  • M1KEO - Saturday, August 16, 2008 - link

    Buying a high-end video card has little to no effect on the price of gasoline, seeing as very few power plants run off of oil. And are you relating electricity usage to forest fires and floods, which are all natural disasters and have been happening for millennia? Look at what scientists are saying, and realize temperatures were actually warmer in the 1980's than they are now, and that plants even flourish with more CO2 in the atmosphere, because that is what they use to make oxygen.
  • far327 - Sunday, August 17, 2008 - link

    Whatever makes you sleep better at night. Your approach is as if energy, regardless of how it is produced or distributed, is an endless commodity, whereas I am trying to take a more conservative approach, treating energy as a valuable resource because of the ways we import and produce it. Now, if energy were made via solar or wind, I would loosen up a bit with my energy spending habits, because it would then be renewable energy. I'm just saying: don't feed the pig if it's already overweight. Eventually that pig will not be able to walk, and the meat will spoil. We as a country need to completely change the way we think about our energy spending habits. If we buy these power-hog cards, we create a viable market for Nvidia and AMD to invest in year after year, and the exorbitant, careless energy spending cycle continues... We are feeding that pig until it eventually collapses. WAKE UP AND SMELL THE NEWS, PEOPLE!! Global warming is not even debatable anymore! It is a very real threat to our existence as a people. I am done with this childish debate, and I'm sure all of you will be happy I leave the board, but don't say you weren't all warned.
  • BenPope - Thursday, August 14, 2008 - link

    I guess SidePort will become useful on 4-way plus... in much the same way as two or more HyperTransport links scale in 4- and 8-way Opteron CPUs.

    So if you have 4 GPUs, the SidePorts could connect diagonal corners to reduce the two-hop latency and increase bandwidth.
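
    If the speculation holds, the benefit is easy to check with a quick hop count. Here's a minimal sketch (my own toy model of the topology idea, assuming four GPUs on a plain ring as the baseline; nothing here is confirmed about SidePort itself):

        # Compare worst-case hop counts for 4 GPUs on a plain ring
        # versus a ring plus the two diagonal links suggested above.
        from collections import deque

        def max_hops(links, n=4):
            # Worst-case shortest-path length between any two of n nodes.
            adj = {i: set() for i in range(n)}
            for a, b in links:
                adj[a].add(b)
                adj[b].add(a)
            worst = 0
            for src in range(n):
                dist = {src: 0}
                queue = deque([src])
                while queue:
                    u = queue.popleft()
                    for v in adj[u]:
                        if v not in dist:
                            dist[v] = dist[u] + 1
                            queue.append(v)
                worst = max(worst, max(dist.values()))
            return worst

        ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
        diagonals = [(0, 2), (1, 3)]
        print(max_hops(ring))              # 2: opposite corners are two hops apart
        print(max_hops(ring + diagonals))  # 1: every pair directly linked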
  • Barack Obama - Thursday, August 14, 2008 - link

    :)
  • oldhoss - Thursday, August 14, 2008 - link

    Uh oh...Bedwetting tree huggin liberal alert! ;-P
  • Hrel - Thursday, August 14, 2008 - link

    How the heck did you not include the 9800GX2 in your testing? I mean, that's Nvidia's only comparable card, and you said yourself it outperforms the GTX 280. When you factor in that it only costs $285 on Newegg, it's a great buy. I'm actually amazed and sincerely confused as to why that card wasn't included in this review. Big mistake, Anandtech; not a small oversight but a complete disregard for common sense.
  • jeffrey - Thursday, August 14, 2008 - link

    Usually, NDA dates are known well in advance for the latest and greatest tech. That means that many people are excited and looking forward to insight on release day.

    I was happy to see the 4870 X2 posted when I opened the site. I was even happier to see the authors of the review were Anand and Derek. This to me usually means a well-thought out unbiased article that would have unique industry insights.

    The article seemed rushed, incomplete, and unbalanced. What a disappointment! ATI released the current performance king in the 4870 X2, a mid-level 4850 X2, AND refreshed the 4870 and 4850 by doubling the RAM!

    So much time and effort was wasted in the article whining about AMD/ATI not using the Sideport that driver versions and system specs weren't even included.

    This post probably sounds like a broken record now that I'm number 70 something giving feedback that is not very positive. I just want this site to stay the best and I felt I owed it to you Anand and Derek to try and push you to do better. Thanks for all the great work that you have done over the years.
  • Bezado11 - Wednesday, August 13, 2008 - link

    I loved the article, and it shows that the new king of cards is the 4870X2; however, I think you're doing a bit of extra work for a benchmark nobody will use. AoC is tanking hard; not sure if you guys are aware of that game's overall lack of integrity. Since AoC is not going to be a well-played or well-viewed game, why use it as a benchmark standard? I mean, we won't care one bit about it sooner or later, because the game is in its death stages.

    Just a heads up on that. I think taking the AoC benchmark out of future reviews would be advisable. Stick to what we know best and what stresses the hardware the most, like Crysis etc. AoC, for heaven's sake, doesn't even support DX10 yet.
  • Griswold - Thursday, August 14, 2008 - link

    While I don't play AoC or plan on doing so, you just showed what a foolish idiot you are by claiming its imminent demise. It has been the fastest-selling MMO launch in history; I think "some" people will stick with it, and even more will return when the content problem has been solved. Just because you don't like it doesn't mean it's not a good benchmark.

    I mean, I couldn't care less about all these "Quake Wars" and "Assassin's Creeds" that are, in my opinion, played by dumbass kids such as you, but hell, I won't complain about them being used as a benchmark.
  • Scour - Wednesday, August 13, 2008 - link

    This article is way too negative about AMD/ATI's cards. It looks like the reviewer hates ATI, dunno why.

    First the negative article about the 790GX chipset, now this :(
  • helldrell666 - Wednesday, August 13, 2008 - link

    Anandtech hates DAAMIT. Have you checked the review of the 4870/X2 cards at techreport.com?
    The cards scored much better than here.
    I mean, in Assassin's Creed it's well known that ATI cards do much better than NVIDIA's.
    It seems that some sites - anandtech, tweaktown ("nvidiatown"), guru3d, hexus... - do have some good relations with NVIDIA.
    It seems that marketing these days is turning into fraud.
  • Odeen - Wednesday, August 13, 2008 - link

    With the majority of the gaming population still running 32-bit operating systems and bound by the 4GB address-space limitation, it seems that a 2GB video card (which leaves AT MOST 2GB of system RAM addressable, and in some cases only 1.25-1.5GB) causes more problems than it solves.

    Are there tangible benefits to having 1GB of RAM per GPU in modern gaming, or does the GPU bog down before textures require such a gargantuan amount of memory? Wouldn't it really be more sensible to make the 4870 X2 a 2x512MB card, which is more compatible with 32-bit OSes?
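
    (A quick sanity check on those numbers, under the assumption - mine, since drivers and chipsets vary in how much of the card's memory and MMIO they actually map - that the full 2GB of video memory plus roughly 0.5-0.75GB of other device apertures land in the 32-bit space: 4GB - 2GB - 0.5 to 0.75GB leaves about 1.25-1.5GB of addressable system RAM, which matches the worst case above.)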

  • BikeDude - Wednesday, August 13, 2008 - link

    Because you can't be bothered upgrading to a 64-bit OS, the rest of the world should stop evolving?

    A 64-bit setup used to be a challenge; most hardware comes with 64-bit drivers now. The question now is: why bother installing a 32-bit OS on new hardware? Do you have lots of Win16 apps around that you run on a daily basis?
  • Odeen - Thursday, August 14, 2008 - link

    Actually, no. However, a significant percentage of "enthusiast" gamers at whom this card is aimed run Windows XP (with higher performance and less memory usage than Vista), for which 64-bit support is lackluster.

    Vista 64-bit does not allow unsigned non-WHQL drivers to be installed. That means that you cannot use beta drivers, or patched drivers released to deal with the bug-of-the-week.

    Since a lot of "enthusiast" gamers update their video (and possibly sound) card drivers on a regular basis and cannot wait until the latest drivers get Microsoft's blessing, 64-bit OSes are not an option for them.

    I'm not saying that the world should stop evolving, but I am looking forward to a single 64-bit codebase for Windows, where the driver signing restriction can be lifted, since ALL drivers will be designed for 64-bit.
  • rhog - Wednesday, August 13, 2008 - link

    Poor Nvidia,
    DT and Anandtech have their heads in the sand if they don't see the writing on the wall for Nvidia. The 4870X2 is the fastest video card out there, the 4870 is excellent in its price range, and the 4850 is the same in its price range. The AMD chipsets are excellent (now that the SB750 is out), and Intel chipsets have always been a cut above; also, they really only support CrossFire, not SLI. Why would anyone buy Nvidia? (This is why they lost a bunch of money last quarter; no surprise.) For example, to get a 280 SLI setup you have to buy an Nvidia chipset for either the AMD or Intel processors (the exception may be Skulltrail for Intel?). Neither Nvidia chipset platform is really better than the equivalents from Intel or AMD, so why would you buy them? Along with this, Nvidia is currently having issues with their chips dying. Again, why would you buy Nvidia? I feel that the writing is on the wall: Nvidia needs to do something quick to survive. What I also find funny is that many people on this site and on others said AMD was stupid for buying ATI, but in the end it seems that Nvidia is the one who will suffer the most. Give Nvidia a biased review; they need all the help they can get!
  • helldrell666 - Wednesday, August 13, 2008 - link

    AMD didn't get over 40% of the x86 market share even when they had the best CPUs (Athlon 64/X2). AMD knew back then that beating Intel (getting over 50% of the x86 market share) wouldn't happen just by having the best product. Now Intel has the better CPUs and 86% of the CPU market. So, to fight such a beast with such huge power, you have to change the battleground. AMD bought ATI to get the parallel processing technology. Why? To get a new market where there's no Intel. Actually, that's not the exact reason.

    Lately NVIDIA introduced CUDA ("parallel processing for general processing"), and as we saw, parallel processing is much faster than x86 processing in some tasks. In transcoding, for instance, the 280GTX, with 933 GFLOPS of processing power (processing power being the number of floating-point operations a GPU can execute per second), was 14 times faster than a QX9770 clocked at 4GHz. NVIDIA claims that there are many more areas where parallel processing can take over easily. So we have two types of processing, and each one has its advantages over the other.

    What I meant by changing the battleground wasn't the GPU market. AMD is working at this very moment on the first parallel+x86 processor: a processor that will include x86 and parallel cores working together to handle everything much faster than a pure x86 processor, at least in some tasks. The x86 cores will handle the tasks they are faster at, and the parallel cores will handle the tasks they're faster at. Now, Intel claims that geometry can be handled better via x86 processing. You can see this as a battle between Intel and NVIDIA, but it's actually where AMD can win. I think we're going to see not only x86+parallel CPUs but also x86+parallel GPUs: simply put as much processing power of each type as needed into a GPU or a CPU. I think AMD is going to change the microprocessor industry to where it can win.
  • lee1210mk2 - Wednesday, August 13, 2008 - link

    Fastest card out - all that matters! - #1
  • Ezareth - Wednesday, August 13, 2008 - link

    I wouldn't be surprised to see the test setup done on a P45, much like TweakTown did for their 4870X2 CF setup. Doesn't anyone realize that 2x PCIe x8 is not the same as 2x PCIe x16? That is the only thing that really explains the low scoring of the CF setups here.
  • Crassus - Wednesday, August 13, 2008 - link

    I think this is actually a positive sign when viewed from a little further away. Remember all the hoopla about "native quad core" with AMD's Phenom? They stuck with it, and they're barely catching up with Intel (and probably losing out big on yield).

    Here Sideport apparently doesn't bring the expected benefits - so they cut it out and moved on. No complaints from me here - at the end of the day the performance counts, not how you get there. And if disabling it lowers the power requirements a bit, with the power draw Anand measured I don't think it's an unreasonable choice to disable it. And if it makes the board cheaper, again, I don't mind paying less. :D

    And if AMD/ATI chooses to enable it one or two years down the road - by then we've probably moved on by one or two generations, and the gain is negligible compared to just replacing them.

    [rant]
    At any rate, I'm happy with my 7900 GT SLI - and I can run the whole setup with a Socket 939 4200+ on a 350 W PSU. If power requirements continue to go up like this, I see the power grid going down if someone hosts a LAN party on my block. We already had brownouts this summer with multiple ACs kicking in at the same time, and it looks like PC PSUs are moving into the same power-draw ballpark. R&D seriously needs to look into GPU power efficiency.
    [/rant]

    My $.02
  • drank12quartsstrohsbeer - Wednesday, August 13, 2008 - link

    My guess (before the reviews came out) was that the sideport would be used with the unified framebuffer memory. When the unified memory feature didn't work out, there was no need for it.

    I wonder if the non-functioning unified memory was due to technical problems, or if it was disabled for strategic reasons... i.e., since this card already beats Nvidia's, why use it? This way they can make it a feature of the FireGL and GPGPU cards only.
  • Greene - Wednesday, August 13, 2008 - link

    Wow. Lots of this and that in here :-)

    No Hardware Info...
    No Driver Info...

    Did we lose a page?

    I'm also curious why Assassin's Creed wasn't tested with the different versions.
    There was such a big stink back in 99/2000 when ATI fudged drivers to get better FPS scores, as well as the stink back when Nvidia did the same with 3DMark (what was it, '05?).
    And here the "Creed" developers drop some sort of support for ATI,
    and the authors skip over it and leave the different versions out of the test.

    Did you guys draft this article 2 weeks ago and forget to revise it?

    Did you hire Fox News editors?

    I've really trusted and valued Anandtech's articles in the past.

    This just seems sloppy, incomplete and rushed... and I dropped out of college! :-)
  • Arbie - Wednesday, August 13, 2008 - link

    Every bar graph has the cards in a different order. This makes it impossible to scan the graphs and see how a card does overall, across a range of games. And there is no compensating benefit. If I want to know which card is fastest in Crysis, I can clearly see which bar is longer! It DOESN'T HAVE TO BE THE TOP BAR ON THE GRAPH.

    So... you won't do that again.

    Next: everyone should just go out and buy a 4850. It will do all you want for now. Let all these X2 kludges and 65nm dinosaurs pound each other into landfill. Check back again in 6-8 months.

    Arbie
  • hooflung - Wednesday, August 13, 2008 - link

    The numbers were not bad; they speak for themselves. However, the tone of this review was horrible. It is the fastest card in your review and has exactly what people want out of a multi-GPU setup: one slot, a full gig of RAM, it smashes the competition's closest competitor that costs more, it only costs $100 above the best single-GPU solution, and it doesn't require a new motherboard.

    Yet Nvidia can't do any wrong. ATI decides its sideport isn't needed and disables it, which is a cardinal sin, it seems. It still costs $100 LESS than Nvidia's GTX 280 did when it first came out.

    The mixed signals coming from this review could make a cake if baked.
  • drank12quartsstrohsbeer - Wednesday, August 13, 2008 - link

    This article had the feel that the authors were annoyed they had to write it. I certainly feel annoyed after reading it...
  • just4U - Wednesday, August 13, 2008 - link

    From my perspective this was a very valid and honest review that zeroes in on key issues that affect the majority of our GPU-buying decisions. Yeah, they're getting some tough-love feedback from it, but that's to be expected as well.
  • Keldor314 - Wednesday, August 13, 2008 - link

    750 watts for the X2 in CrossFire?! You'd better think about having an electrician come by and upgrade your home's power grid! Seriously, though: in my house, I can't run a single 8800 GTX at the same time as a space heater without tripping the circuit breakers in the garage. True, the heater in question is rated at 1500 watts. The total wattage to trip the circuit breaker is thus probably less than 2000 watts, since I've also seen the heater trip it when only accompanied by a lamp (no computer on). Given that the X2 CF will probably, after counting the rest of the computer, push energy usage over 1000W at load, there's a very real chance that such a computer would periodically cause your power to go out, especially if, God forbid, someone tried to turn on the room's lights.

    Upgrading a power supply is cheap. Rewiring your house to handle the higher wattage is not.
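
    (For what it's worth, that trip-point guess is consistent with a standard breaker, assuming a typical North American 15A, 120V branch circuit - an assumption on my part, since the post doesn't specify: 15A x 120V = 1800W, so a 1500W heater leaves only about 300W of headroom before the breaker goes.)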
  • CK804 - Sunday, August 17, 2008 - link

    Actually, the power consumption numbers are for the entire system, not just the graphics cards alone. Still, it's amazing how much power these cards draw. My jaw dropped when I saw that the power consumption of a system with these cards under load exceeded 700 watts. When X-bit Labs did a roundup of 1000-watt power supplies, the first thing they concluded was that there was no need for power supplies over 600-700 watts for any setup unless some sort of exotic cooling was to be used. I can attest to that statement: I had 4 first-gen 74GB Raptors in RAID 0 coupled with 2 7900GTs in SLI and an AMD X2 4800+ running on a Zalman 460-watt PSU.
  • animaniac2k8 - Wednesday, August 13, 2008 - link

    I've been a reader of AnandTech's articles for many years, and I have owned exclusively Nvidia cards since 2001.

    This is easily one of the worst and most biased articles I've ever read on AnandTech. Very disappointed to have wasted my time reading this. I'll be looking elsewhere for quality reviews from now on.
  • CyberHawk - Wednesday, August 13, 2008 - link

    Same here. Reader since 2001, registered later.

    I always liked the articles here. English is my second language, and I liked that from time to time I found a new word that made me look into the dictionary.

    But this article is a bunch of bull. One more like this and I am out of here. Not that this means the end of AnandTech, but anyway.
  • helldrell666 - Wednesday, August 13, 2008 - link

    Where's the system setup?
    Why does the poster hate AMD that much?
    This is the worst review of the 4870X2 I've checked yet.

    The review at techreport.com is much better.
  • random2 - Wednesday, August 13, 2008 - link

    I didn't really notice much in the line of bias; however, I'm beginning to wonder if this card didn't quite meet their expectations... or maybe, after years of running similar types of benchmarks on video cards (which are all basically the same), having to spend countless hours trying to jam a review together in time for deadlines isn't quite as much fun as it used to be. We, on the other hand, can hardly keep our bodily fluids in place when we hear there is a new card review almost ready for release. Now I'm not sure just who the crazy ones are here... Is it Anand and Derek, for wasting years of their lives trying to appease us unappeasables, or is it us, because even at my advanced age I get a woody just thinking about a dual-GPU ATI card... Mmmmm... Maybe I should talk to someone about this...
    Anyhow, thank you Anand and Derek for your efforts, and more late nights no doubt.
    What I really find interesting is the bit of backlash we are starting to see against Nvidia. Not that I am a fanboy of anything... ok, I've looked at a few girls... but when I think of Nvidia's pricing strategies over the years, while competition from ATI was in the realm of little to none, I cannot help but feel much the same as a choir boy in the preacher's shower stall being asked to bend over and grab that soap. I know, no one says I have to, but if I want to play the latest games and become the lead singer, it might help.
    I like hardware... I like it functional, fast, reliable and affordable. Till now Nvidia has been kind of sticking it to us, so to speak: providing only the basics at more reasonable pricing and forcing us people of average incomes to sell drugs or our bodies in order to afford their high-end products.
    Thank you AMD/ATI for saving me from a life on the street, for giving me the opportunity to turn away from a life of corruption, evil and self-loathing. Never again will I feel the degradation and the compromising of my moral values in the search for higher frame rates. Bless you...
    Down with Nvidia! Which, I am now sure, has become a key player in the evil axis... somewhere... or is that axis of evil?
    I wonder if I have time to stop into NCIX tomorrow before therapy?

    Denny Crane
  • DarthAgitated - Tuesday, August 12, 2008 - link

    I am bothered by the fact that the test system specs are not shown which seems rather silly.

    I am even bothered more however by the ridiculous reactions to the article.

    Please look over the previous reviews of the 4850 and 4870, which state that both cards are the best at their price points. Or even the 9800GTX review, where they praise the 4850 over the 9800GTX.

    Also take into account that the GTX 260 and 280 reviews happened when ATI had no new product out there, and here was this brand new architecture being released.

    In this review he points out that the Nvidia price drop is due to AMD's influence, as they did in the 4850 and 4870 review: “You can either look at it as AMD giving you a bargain or NVIDIA charging too much, either way it's healthy competition in the graphics industry once again (after far too long of a hiatus).”

    So please stop with your “this site is pro-whatever doesn't fit with what I like” bullshit. Try actually reading the articles instead of doing what most people on the internet do and skimming through everything. Here you have a very fast video card, which is shown to win in a bunch of benchmarks. And when he points out the issues with CrossFire and SLI implementation and support for game titles, he is speaking a valid truth. And like the majority of X2 cards, it's hard to get excited over something that can be achieved by simply plugging in a second video card.

    Please grasp the concept of this card being a niche product: a product that is not reaching its full potential unless you're pushing a 30” LCD (or an HDTV, I suppose) and an 800+ watt power supply, playing specific games that are properly supported by the CrossFire drivers, and you happen to have $559.00 (OMG, you didn't automatically account for shipping and my local sales tax when posting that!!!) burning a hole in your wallet.

    If you want your pro-whatever review, then do a search on a website that contains the name of the company you want to win, find their review, and enjoy what I'm sure is an unbiased “this is fastest evarrr!” review.

    I for one appreciate this site and the fact that it contains well-thought-out and technical articles on new stuff like CPU or GPU architectures. But don't expect them to go off on a product that's an updated version of an existing product, that contains two other existing products, and offers nothing new.

    Please do put up the test system specs, though.

    Thanks.
  • Zaitsev - Wednesday, August 13, 2008 - link

    I couldn't agree more, DarthAgitated. In the article, I thought credit was given when it was deserved and nothing more. Someone's panties did seem to be mildly bunched over the Sideport issue, however. ;)
  • DRSoul - Wednesday, August 13, 2008 - link

    What's happening?? This article is not up to the usual AnandTech standard. I am writing this because in the past 4-5 years I have never read such a sober review on this site of such a high-performing card. Might just be that they are getting so much new stuff that it's becoming boring...
  • AnnoyedGrunt - Wednesday, August 13, 2008 - link

    I thought the review was very good. I didn't detect bias, just a good mixture of praise for the performance but concern that the CrossFire support could not always be counted on.

    In general, the feeling (at least on Anandtech) towards SLI and CF solutions at this time seems to be similar to the original single-core Hyper-Threaded Intel chips. Sometimes Hyper-Threading helped, and sometimes it didn't. It wasn't until true dual-core processors arrived (along with multi-threaded software) that dual CPUs really became good competition for higher-clocked single-core CPUs.

    I agree with the concern that both SLI and CF solutions are not ready for "prime-time" and are instead best for the early-adopters and enthusiasts that like to tinker and don't mind getting frequent driver updates and browsing forums for optimal driver configurations.

    I liked that the article provided some factual commentary regarding the tradeoffs between single and multi-GPU setups.

    Aside from that, I didn't notice a discussion regarding fan noise. Did I miss it somewhere, or is it not included in the article?

    -D'oh!
  • ZootyGray - Tuesday, August 12, 2008 - link

    I can be very sensitive to some strange things sometimes.

    While reading this, I had to stop. I was sensing great rage. That's pretty dark. And it was taking over me! Like evil something. I don't read reviews to have experiences like that.

    Anyone else get a feeling like that? Maybe just me.

    I met a guy once who was about 8 months clean from crack cocaine. He was full of rage. I had to stop talking to him too. It was like he wasn't even there. This just reminded me of that, which makes no sense, and the whole thing was pretty insane. But I wonder why I sense such anger in a hardware review that's supposedly unbiased. I always thought I would enjoy such work - must be stressful. Passing stress to an audience is not right either.

    I don't know much about why; but my feelings don't lie. Anyone else get a sense of pain or anger? Maybe I got this from a different source. Maybe I am getting stressed from all the flame games. Maybe time to just let this go. Time to move on.
  • anonymous x - Tuesday, August 12, 2008 - link

    I don't see any bias in this article - you can clearly see the performance benchmark scores: hardly any improvement over the other cards.
  • Dark Legion - Wednesday, August 13, 2008 - link

    If you don't see any bias in the article, you clearly only looked at the pretty pictures and not the actual writing.

    Also, why not include the test setup? For all we know, there could have been another limiting factor, especially since we saw very little gains from having a pair of 4870 X2's at such a high resolution. I remember back when the GX2 was coming out and you reviewed it, you used an Intel Skulltrail board because you could use both Crossfire and SLI in it, and you weren't limited by anything else. THAT is unbiased, and you can truly compare the cards that you're reviewing. Now you review the best performing competing card, and you don't even tell us what setup you used. Oh, and you don't even include the GX2, which is Nvidia's best performing card, and also happens to be a multi-GPU single card solution (which takes up as much energy as the 4870 X2, so don't only put AMD down for that). Did it not stack up well enough compared to this card for you? This review was horrible. I have come to expect this from DT (*cough cough Jason *cough cough), but not so much from Anand.
  • glynor - Tuesday, August 12, 2008 - link

    I find it very interesting that the one solitary example of a game that doesn't scale well on the 4870 X2 is Assassin's Creed with the patch. Also, I didn't see a detailed explanation of what settings and setup were used to test the different games (were these custom timedemos? FRAPS? built-in benches?).

    Either way... It is interesting to say the least. Assassin's Creed posted numbers well under the GTX 280 in the Anandtech results. However, if you look at Tech Report's numbers which were generated using the pre-patch (DirectX 10.1 supporting) version of the game from before they took Nvidia money, the picture is dramatically different. Instead of losing to the GTX 280 (not to mention the GTX 260 SLI), the 4870 X2 easily bests the GTX 280 SLI (x2) setup in both average FPS and in Median Low FPS.

    Just seems awful fishy to me. Overall this is good information. I'm particularly interested in the info on Sideport being disabled, as I'm not seeing similar information reflected in other reviews out there. I'm sure this is simply a case of Anand asking the right questions of the right people, but it'd be nice to see some independent confirmation.

    Overall, it seems to me that this review does seem to have a slight (not over the top, but slight) anti-dual-GPU bias. It feels like more stock is put in the "failing" shown by Assassin's Creed, which is dubious at best, and no other evidence is shown to back this up. Surely, any dual GPU product may suffer optimization problems with new games, but wouldn't this apply to SLI equally (if not more -- most results I've seen show Crossfire scaling better than SLI more often than not)? I guess I just feel that the conclusions are being drawn based on results "not in evidence".

    Discounting the outlier and contradicted Assassin's Creed results, I fail to see how the GTX 280 is in the same league at all as the new ATI dual-GPU card.
  • JarredWalton - Tuesday, August 12, 2008 - link

    The games tested impact the results. As someone who has been running CrossFire HD 3870 for the past year or so (well, maybe more like 9 months?), plus someone who ran X1900/X1950 CrossFire before that, I can attest to the fact that CF support for new games is terrible. Basically, you get support in all major titles, but it's usually about two months after a game comes out. I've taken to not rushing to purchase new games, but that's okay since I'm busy of late.

    As for Assassin's Creed, the lack of performance with 4870X2 is odd and indicates perhaps a remaining driver issue for the new architecture. The game is definitely demanding of your CPU, but it should be running much faster. Maybe forcing on 4xAA (the game doesn't support 4xAA above 1680x1050) made the results worse than you would normally expect.

    Personally, I am very cautious about recommending dual-GPU configurations for gamers - they're much better for benchmarks. Or at least, I would only recommend them for gamers that don't immediately buy the latest games and want top performance. GRID required updated drivers for CF, as did Mass Effect, Assassin's Creed, and pretty much every game I recall purchasing in the last two years.
  • xsilver - Tuesday, August 12, 2008 - link

    Is anyone still waiting for ATI to get off their butts and fix/enhance their Avivo encoder?

    It's incredibly fast, but having the ability to encode high-quality videos would be nice.
    If ATI aren't willing to develop it, why don't they just open up the source code so that others can develop it?
  • TheJian - Tuesday, August 12, 2008 - link

    For one, I haven't seen a 4870X2 for less than $600-650 and they don't exist now. Don't expect these to go for $550 for a while. You should print what they are being SOLD for at the time of the article.

    GTX260 pricing (check Newegg) is off also. You can get one for $245, and it's an OC Edition (MSI)! They have quite a few at $269 retail. So where the heck do you get $300? Again, quit printing suggested retail prices (or whatever they are) and PRINT ACTUAL PRICING! In the case of your GTX260 SLI price, that would drop it $110, and they are both overclocked!

    GTX280 isn't $450. Newegg has them for $399 if you want the cheapest, and most are $410-429. Do you guys even check the pricing before putting up your stories? You do this every time.

    You state this for AMD "At the same time, AMD's literally hot GPUs have seen their prices fall; the Radeon HD 4870 is now a $270 - $280 GPU, slightly down from $299 and the Radeon HD 4850 is a $170 - $180 card. These are very slight changes in price, but at least they are in the right direction."

    But you conveniently leave out that Nvidia's cards don't run $450 (GTX280, your price) or $299 (GTX260, your price). These are FAR from reality. A suggested retail price doesn't matter; what matters is WHAT I WILL PAY if I buy it today! These prices haven't changed in about a week, so you've had plenty of time to fact-check before printing. Also, the 4870 is $250/259 if you want the cheapest at Newegg, so you even got the ATI pricing wrong. Newegg has some 4870X2s listed at $559+, but they won't stay that way by the time their auto-price-upping machine gets done with them (still higher than the $550 you state... which won't happen for a month or more). They'll hit $600 next week before one even sells... LOL. You need to fix the pricing in the article to reflect REALITY.
  • Ezareth - Wednesday, August 13, 2008 - link

    I just bought a Sapphire 4870X2 from Newegg for $559 an hour ago, so the pricing is correct. Obviously on launch day they are going to be sold out, as people like me have been waiting for them for months now... the same was true of the 280GTXs as well. It will take a couple of weeks before they become readily available, and then the price will start coming down, eventually to around $500.00.
  • Aberforth - Tuesday, August 12, 2008 - link

    I am studying the business models of different tech companies, from core architectures to marketing. It helps me understand how these products actually sell; basically, the crux of the tech industry.

    We've seen the GTX 280, which was priced at $650 when it was released; even after months of research, NV came up with a GPU that is just too large and bloated - redesign is what they actually do. Yet it gets a good review and market hype. When you look at some of the early 260 reviews, they actually say "it's reasonably priced".

    So these self-proclaimed geeks write articles based on comparison - which is a childish thing to do. You cannot compare one design with another, nor can you judge its merits by looking only at the performance factor. There cannot be one single outcome of a review; there are different types of customers with different requirements. So if, at the end of the review, someone favors either AMD or NV, they are biased. An unbiased review cannot contain suggestive material that hampers the customer's decision; instead, it should contain information on how the product affects customers with different requirements.
  • ZootyGray - Tuesday, August 12, 2008 - link

    Thanx for that - good post.

    I have been thinking similar thoughts, but my ability to say it as you have done was clouded by my own reaction of disgust at this shit-on-a-stick so-called report.

    This site is seriously biased against AMD.

    I used to think it was thorough testing - but issues found here are not reported elsewhere, and are simply not experienced by users who post to forums. I expect config issues with new releases - but this site uses any excuse.

    Anandtech BIAS is out of the bag.

    I thought I was the only one seeing this; but I am glad to see general rejection of bullshit by many others.

    Very unprofessional that we have no idea what 486-box you used to skew these results - YOUR RESULTS DON'T MATCH OTHER SITES' - and they aren't shuffling DX9 and DX10 to fudge it all.

    If you have to change the game midstream,
    u r cheating.
  • GmTrix - Tuesday, August 12, 2008 - link

    I'm surprised that the 9800 GX2 wasn't included in the benchmarks, seeing that the 4870 X2 is ATI's most powerful single-card solution and, as this article states, the 9800 GX2 is still Nvidia's most powerful single-card solution. Not to mention they are very similarly priced, and they are both dual-GPU cards...
  • Mr Roboto - Thursday, August 14, 2008 - link

    The 9800GX2 would get smoked at higher resolutions with its much lower bandwidth and smaller frame buffer. Anything above 1600x1200, especially with AA, and the GX2 would choke. Above those resolutions is where the 4870 X2 really shines. That's why.
  • Spacecomber - Tuesday, August 12, 2008 - link

    I was wondering the same thing: why leave out Nvidia's dual-GPU card? I know that it is not as powerful as the 4870 X2, but it is Nvidia's most powerful single-card solution. And the pricing seems to make it a fairly competitive option: it is selling for less than $300, which positions it well against the single-GPU 4870 as well as the GTX 280.

    It's not a big deal, but it has struck me as odd that this card hasn't been included in recent video card reviews. I know that not every available card can be rounded up for benchmarking; however, I think this is one that many consumers would be interested in knowing about, especially if they are thinking of spending around $300 on a video card.
  • techguy2k5 - Tuesday, August 12, 2008 - link

    Mr. Anand Lal Shimpi:

    you have a writer working for you that has a bias towards a particular IHV. How do I know this? Because he makes it apparent in EVERY piece he writes. Derek Wilson, the constant pot-shots against ATi are pathetic. You are incapable of keeping your bias out of your articles, and therefore should not be writing. I will not read another Derek Wilson article again. In fact, I will not read another Anandtech article until something is done about this matter.

    I'm not the only person aware of Mr. Wilson's bias, all of my tech enthusiast friends are aware and feel the same way. It is sad what has become of Anandtech in recent years. It used to be easy to trust Anandtech and take your writers' word on any issue, but no more. Derek Wilson is dragging this site's name through the mud.

    Dismayed,
    -techguy
  • ZootyGray - Tuesday, August 12, 2008 - link

    Keys also goes overboard to find trouble.

    And the lack of follow-up articles - as promised - where are they? The trouble with the 780G, reported only here, has not been followed up on as we were told it would be - since last May. But the damage was done. Do we believe it or not?

    Credibility? nah.

    Bias? haha

    the ntel fanboys seem to like it - or employboys.

    And the ultimate: recently, 16 pages of praise for the non-existent, theoretical, two-years-out Larrabee rehash trash. Preceded by "we don't know anything about it at all", as dictated by the ntel BS div.

    Biased much? haha

    This is dangerous market influence. How far does this go? How long has this been happening?

    Violating trust is pretty sad.
  • pattycake0147 - Tuesday, August 12, 2008 - link

    I've noticed the same bias recently. I've only been a member for a little over a year now, and even in that short time the site has gone downhill.
  • sweetsauce - Tuesday, August 12, 2008 - link

    Translation: I like ATI and you don't, so I'm going to bitch. Even though my name is tech guy, I obviously have ovaries. I'm going to go cry now on ATI's behalf.
  • jnmfox - Tuesday, August 12, 2008 - link

    Get over yourself. Pointing out facts isn't taking pot-shots.

    This is just what I was looking for in a review of the X2. The numbers tell the story: in the majority of cases the X2 isn't worth it, and until AMD & NVIDIA get proper hardware implementations of multi-GPU solutions, that will most likely continue to be the case.

    Too little performance increase for the large increase in cost.
  • skiboysteve - Tuesday, August 12, 2008 - link

    I completely agree with Anand on this article. The lack of innovation from a company supposedly focusing on multi-chip solutions is stupid.

    Although, yes, it is really fast.

    And why can't they clock it lower at idle?
  • astrodemoniac - Tuesday, August 12, 2008 - link

    ... reviews I have ever seen here @ Anand's. I am extremely disappointed with this so-called "review"... hell, I have seen PREVIEWS that would put it to shame.

    Oh, and what in the hell did AMD do to you that you're so obviously pissed off at them?... Are you annoyed they didn't give you preferential treatment to release the review earlier? Man, just go back to the unbiased reviews; we're buying graphics cards, not brands.

    It's like the guys writing the reviews are not gamers any more o_0

    /rant
  • Halley - Wednesday, August 13, 2008 - link

    It's no secret that AnandTech is "managed by Intel", as a user put it. Of course everyone must have some source of income to support their families and themselves, but it's pathetic to show such blatant bias.
  • TheDoc9 - Tuesday, August 12, 2008 - link

    Anandtech isn't about gaming anymore; it's about photography and home theater. And the occasional newest Intel Extreme CPU.

    I think DailyTech and the forums carry Anandtech these days...
  • DigitalFreak - Tuesday, August 12, 2008 - link

    "AMD decided that since there's relatively no performance increase yet there's an increase in power consumption and board costs that it would make more sense to leave the feature disabled. "

    In other words, it's broken in hardware and we couldn't get it working, so we "disabled" it.
  • NullSubroutine - Tuesday, August 12, 2008 - link

    You didn't even include test system specs or driver versions.
  • CreasianDevaili - Tuesday, August 12, 2008 - link

    I wanted to know why you didn't retest the 4870 CF setup when you obviously had some issues with it before in GRID. I noticed the 280GTX setup was retested, which resulted in higher FPS. After running the game at 2560x1600 on my FW900, and from other reviews, I feel you had an issue with CrossFire not working at that resolution. The single 4870 shouldn't be getting better FPS by that degree at 2560x1600, because it also has 512MB of VRAM.

    So I just wanted to know why the 280GTX was special enough to retest when this review was about the 4870X2. If it is to show a good comparison, then why wasn't the 4870 CF, which many have and want to see, retested as well?
  • Spoelie - Tuesday, August 12, 2008 - link

    How come 3dfx was able to have a transparent multi-GPU solution back in the '90s - granted, memory still was not shared - when it seems impossible for everyone else these days?

    Shader functionality problems? Too much integration (a single-card Voodoo2 was a 3-chip solution to begin with)?
  • Calin - Tuesday, August 12, 2008 - link

    The SLI from 3dfx used scan-line interleaving (Scan-Line Interleave is what the acronym stood for). The new SLI still has scan-line interleaving, amongst other modes.
    The reason 3dfx was able to use this is that the graphics library used was their own, and it was built specifically for the task. Microsoft's DirectX is not built for this SLI thing, and it shows (see the CrossFire profiles, selected for the best performance depending on the game).

    Also, 3dfx's SLI had a dongle feeding the video signal from the second card (slave) into the first card (master), and the video from the two cards was interleaved. This uses lots of bandwidth, and I don't think DirectX is able to generate scenes in "only even/odd lines"; also, much of the geometry work must be done by both cards (so if your game engine is geometry-bound, SLI doesn't help you).
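
    To illustrate that last point, here's a minimal sketch (my own toy illustration of the general idea, not 3dfx's actual code) of an even/odd scan-line split, and why it leaves the geometry work duplicated:

        FRAME_HEIGHT = 480

        def lines_for_gpu(gpu_index, height=FRAME_HEIGHT):
            # Even scan lines go to the master card, odd lines to the slave.
            return [y for y in range(height) if y % 2 == gpu_index]

        master_lines = lines_for_gpu(0)
        slave_lines = lines_for_gpu(1)
        print(len(master_lines), len(slave_lines))  # 240 240: fill work halves

        # But any triangle can touch both even and odd lines, so BOTH cards
        # still have to transform and set up every triangle - which is why a
        # geometry-bound engine sees little gain from scan-line interleaving.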
  • mlambert890 - Friday, August 15, 2008 - link

    Great post... Odd that people seem to remember 3DFX and don't remember GLIDE or how it worked. I'm guessing they're too young to have actually owned the original 3D cards (I still have my dedicated 12MB Voodoo cards in a closet), and they just hear something on the web about how "great" 3DFX was.

    It was a different era, and there was no real unified 3D API. Back then we used to argue about OpenGL vs GLIDE, and the same types of malcontents would rant and rave about how "evil" MSFT was for daring to think of creating DirectX.

    Today a new generation of ill-informed malcontents continues to rant and rave about Direct3D and slam NVidia for "screwing up" 3DFX, when the reality is that time moves on and NVidia used the IP from 3DFX that made sense to use (OBVIOUSLY - sometimes the people spending hundreds of millions and billions have SOME clue what they're buying/doing, and actually have CS PhDs rather than just "forum posting cred").
  • Zoomer - Wednesday, August 13, 2008 - link

    Ah, I remember wanting to get a Voodoo5 5000, but ultimately decided on the Radeon 32MB DDR instead.

    Yes, 32MB DDR framebuffer!
  • JarredWalton - Tuesday, August 12, 2008 - link

    Actually, current SLI stands for "Scalable Link Interface" and has nothing to do with the original SLI other than the name. Note also that 3dfx didn't support anti-aliasing with SLI, and they had issues going beyond the Voodoo2... which is why they're gone.
  • CyberHawk - Tuesday, August 12, 2008 - link

    nVidia bought them... and is now incapable of taking advantage of the technology :D
  • StevoLincolnite - Tuesday, August 12, 2008 - link

    They could have at least included support for 3DFX GLIDE, so all those GLIDE-only games would continue to function.

    Also, ATI had a "dual GPU" card (the Rage Fury MAXX) many years before nVidia released one.
  • TonyB - Tuesday, August 12, 2008 - link

    Can it play Crysis, though?

    Two of my friends' computers died while playing it.
  • Spoelie - Tuesday, August 12, 2008 - link

    No it can't; the Crysis benchmarks are just made up.

    Stop with the bearded comments already.
  • MamiyaOtaru - Wednesday, August 13, 2008 - link

    Dude was joking. And it was funny.

    It's apparently pretty dangerous to joke around here. Two of my friends died from it.
  • CyberHawk - Tuesday, August 12, 2008 - link

    ... but I find the response a bit cold.

    It's the fastest card, for God's sake!
  • Samus - Wednesday, August 13, 2008 - link

    It was pretty negative. There really isn't anything negative about this card. Price and power consumption (the only arguably negative things about it) are in line with what nVidia would have offered had they made a product to compete against this.
  • Finally - Tuesday, August 12, 2008 - link

    And what is this?!

    "When you start pushing up over $450 and into multi-GPU solutions, you do have to be prepared for even more diminished returns on your investment, and the 4870 X2 is no exception."

    Man! This is a bullshit card for bullshit buyers, sry, I meant: ENTHUSIASTS... What the heck do you expect? Low power consumption and reasonable price-to-power ratios? I totally don't get it...
    Isn't this the "we like power supplies only if they can assure us that they will kill the rain forest single-handedly" site?
    Where is the bullshit, sry again: enthusiasm?
  • Finally - Tuesday, August 12, 2008 - link

    Is it just my eyes, or did they actually put the following heading on page 2?

    "NVIDIA Strikes Back"

    *sound of a Vegas-style slot machine paying out big coin*
    If there was a prize for a totally out-of-order title... this one would take ranks 1 through 3...
  • Finally - Tuesday, August 12, 2008 - link

    You are right; I got that perception, too...

    Although I would never buy a dual-chip card monster like this one (same goes for SLI or CF...), I actually love how they manage to take an article about an AMD product and turn it around till you don't know whether it was about the new HD4870X2 or the lackluster 280...
  • drisie - Tuesday, August 12, 2008 - link

    In my opinion this review is way too negative. It is solutions like this that have caused Nvidia to drop prices and increased competition between the competitors. It's the best card money can buy, ffs.
  • formulav8 - Tuesday, August 12, 2008 - link

    Yep, this is one of the worst reviews Anand himself has ever done. He continues to praise nVidia, who just a month or two ago was charging $600 for their cards.

    Give credit where credit is due. He even harps on the sideport feature, which doesn't mean much now; AMD says it didn't provide any real benefit even when it was enabled.

    I've been a member of this site since 2000 and am disappointed at how bad the reviews here are getting, especially when they have a biased tone to them.

    Of course, this is only my opinion.

    Jason
  • BikeDude - Wednesday, August 13, 2008 - link

    I think Anand's initial comments has to be viewed in the light of his conclusion:

    "I keep getting the impression that multi-GPU is great for marketing but not particularly important when it comes to actually investing R&D dollars into design. With every generation, especially from AMD, I expect to see a much more seamless use of multiple GPUs, but instead we're given the same old solution - we rely on software profiles to ensure that multiple GPUs work well in a system rather than having a hardware solution where two GPUs truly appear, behave and act as one to the software."

    I wholeheartedly agree. The software profile solution has baffled me for years. Why are they messing about with this? It was supposed to be a temporary thing. Creating unique profiles for every game title is not feasible. At the very least give the developers an API that will help them do this themselves.

    Instead of messing about with the power hungry sideport nonsense, AMD should have invested some R&D time on how to get rid of software profiles.
  • Locutus465 - Thursday, August 14, 2008 - link

    Probably because, from a design perspective, it works... and based on the benchmark results, it works very well indeed. Additionally, we know from the early days of SLI that not all games respond well to all modes, so it seems that at a hardware level the task of getting the card to automatically perform amazingly with multiple GPUs is going to be difficult, to say the least (perhaps futile?).
  • EglsFly - Tuesday, August 12, 2008 - link

    I agree!

    It is because of ATI/AMD's release of the 4000-series cards that Nvidia had to dramatically drop the prices of its GPUs.

    AMD brought great performance to the masses at an affordable price point. Here they up the ante with an even higher-performing solution, and what do we get in return from Anandtech? A biased review so full of negativity that it looks like it was written by somebody from Nvidia.
  • CyberHawk - Tuesday, August 12, 2008 - link

    That's the kind of message I was hoping to get from the review... and it kind of didn't happen.
  • BRDiger - Tuesday, August 12, 2008 - link

    I just wondered if you used the 8.8 Catalysts... The testing rig's specs would be nice for comparison of the benchies...
  • nubie - Wednesday, August 13, 2008 - link

    This is interesting, and thanks for the hints about a 1GB model, but Guru3D ran an article two weeks ago on a 2GB 4850, so I believe that is trumped.

    I too was hoping for more enthusiasm. The 9800GTX is $200 and the just-released GTX 260 is under $300?? Stop the presses, nVidia is no longer on top!!

    AMD has wrested back the performance crown with a vengeance, and their mainstream products are totally playable in recent games.

    Meanwhile, nVidia is trying to plug every price point with the 8800GS, 9600GSO and 9600GT, not to mention the 9800GTX+; this is freaking ridiculous.

    You need to paint a more realistic picture. This is one of the rare times that mainstream games can be played for $170 while decimating the competition's products that cost $250, and the high end is owned by the same company, with a working dual-chip card holding the performance crown and using electricity more efficiently than the competition.

    If nVidia comes out with a GTX 260 x2 or a GTX 280 x2 I am going to look very carefully to see how glowing THAT review is.

    I want SLI and Crossfire to die. There is no reason to allow only 2 displays (or, worse, just one) on a multi-output machine. Worse still, a machine with 2x PCI-E x16 slots (even in dual x8 mode) should be allowed to run any hardware that fits in them.

    This software hampering of a completely standard PCI-E interface is stupid and childish; they should just drop it.
