40 Comments
t3h l337 n3wb - Wednesday, December 21, 2005 - link
The only place you can get one is eBay, where there are 2 listings, and they're like $700+...

DjDiff - Wednesday, December 21, 2005 - link
I'm curious whether CrossFire would increase AVIVO performance or not. If not, will there be drivers in the future that will benefit from CrossFire when using AVIVO?

dualblade - Friday, December 23, 2005 - link
Referring to playback, or the hardware encoding feature?

Playback is already at 1080p with a single X1800 of any sort, so I don't think that needs improvement. CrossFire hardware-assisted encoding might be a really good thing. I imagine a dual-core CrossFire setup could become a real encoding/rendering powerhouse.
Scarceas - Wednesday, December 21, 2005 - link
bleh no product... Why is it so hard to launch? Just don't announce your product until you've already shipped it. DURRR!!!

Thalyn - Tuesday, December 20, 2005 - link
While I'm still here, I thought I'd point out what seems to be a strange anomaly in the Quake 4 benches, to see if someone can provide an answer.

Under 4x FSAA, the GTX 512 cards are listed as performing better at 1920x1440 than at 1600x1200. Oddly enough, the results are almost right in the middle of the 1280x1024 and 1600x1200 scores, though if you re-plot the graph with the 1600 and 1920 results reversed it doesn't match the trends set by any other hardware in the list.
Is this a typo, or something more sinister? And, more curiously, why didn't Derek make any mention of it at all?
-Jak
Leper Messiah - Tuesday, December 20, 2005 - link
Yeah, I mentioned this a bit higher... haven't gotten an answer yet...

Thalyn - Tuesday, December 20, 2005 - link
One thing I would be curious to see is how the ATi cards fare with a small tweak done under B&W2. There's a setting which can be changed in one of the .INI files which makes the game run dramatically better on most hardware I've seen it "trying" to run on - including my own X800 Pro AGP, plus two mates' 6600GT AGP and 5800 Ultra AGP.

I believe the file is called "graphics.ini" in the data subdirectory - change the detail settings to be 3 1 3 instead of 3 0 3. It does disable two of the options in the in-game graphics menu (and I have heard it can result in "squares" under fields and such), but the performance increase is substantial, to say the least. Oddly enough, just disabling those two options on their own doesn't make anywhere near as much of a difference.
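If you'd rather script the change than hunt through the file by hand, a minimal sketch like this works - the install path is a guess for your system, and it assumes the values appear space-separated exactly as described above, so keep the backup it makes:

    # Hedged sketch: flip B&W2's detail settings from "3 0 3" to "3 1 3".
    # The install path below is an assumption - point it at your own copy.
    from pathlib import Path

    ini = Path(r"C:\Program Files\Black & White 2\data\graphics.ini")
    text = ini.read_text()
    ini.with_suffix(".ini.bak").write_text(text)       # keep a backup first
    ini.write_text(text.replace("3 0 3", "3 1 3", 1))  # the actual tweak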
Sadly, once it's running well you quickly find out that it wasn't worth all the effort, but I would still be curious to see the results from tests under such conditions. NVidia apparently fixed this bug with one of their post-release drivers (hence the disparity of scores), and there's also a 1.2 patch being prepared as we speak which will hopefully level things off somewhat, but in the meantime this is the best we've got.
-Jak
Beenthere - Tuesday, December 20, 2005 - link
...card-of-the-week mentality. So I finally decided to do some research to see what the B.F.D. was with having one or more $700+ video cards in a PC. I went out and bought the LANParty UT SLI mobo, (2) FX-57s so I could find the fastest-overclocking one, (2) Asus 7800 GTX 512s, (2) 520W OCZ PowerStream PSUs, 2 x 1024MB OCZ EB Platinum 4800 modules, a Corsair ice-water cooling system for the FX-57 and Nvidia chipset (until I get to vapor cooling), an Antec P160 Performance case and an HP L2335 23" display.

Everything went together fine, and I spent several days overclocking the two FX-57s until I was able to run almost stable at 3.9GHz @ 1.625V with 34° water. And to my surprise, my 3DMark 2005 showed an incredible 18,240 score!!! WOW, I was just blown away. I was starting to understand what the enthusiasm was all about for the latest-greatest-trick-of-the-week PC hardware. After several weeks of tweaking I now have my system stable most of the time, and it simply flies!!! Not only that, but the blue LEDs look so cool at night, and my friends are impressed as H*LL that for less than $6,000 I have a PC that will cook my breakfast, bring in the newspaper, make the utility company rich, heat my house, make Nvidia rich, clean my car, wash my clothes and even do word processing. I can even log on to the Net .00000000000000001 seconds faster than my old dumbazz Athlon 939 3000+ that I spent $1,000 on total and which runs rock stable at 2.4GHz. And at a resolution of 1920x1200 I'm able to get a frame rate of at least 60 in any video game. This allows me to sit 6'-8' away from my monitor to minimize eye strain when I play video games for 18 hours or more at a time.
Without a doubt I am one broke but very happy camper. NOW - now I understand the point of spending $700 or more on a vid card, $1,000 on a CPU, and hundreds on memory, PSUs, trick PC cases, etc. And my friends think I am the coolest guy they know 'cause I got this BLING machine. Whatta life!!! If only I had known years ago...
AdamK47 3DS - Wednesday, December 21, 2005 - link
You forgot the sarcasm tags <sarcasm> </sarcasm>

dali71 - Tuesday, December 20, 2005 - link
Really? And exactly WHERE can I find this mythical $1400 setup?
Vol2005 - Tuesday, December 20, 2005 - link
Dunno about 512GTX SLI, but a single one is no longer "the best of the best" (http://www.pcpop.com/doc/0/121/121711_5.shtml), since the Asus "EAX1800XT TOP" beat it in most D3D benchies (not to mention its price is only some $20-50 more than the standard XT's).

Fenixgoon - Tuesday, December 20, 2005 - link
D3D benches are different from real-world performance - and for just about everything (if not everything; correct me if I'm wrong), the GTX 512 blows away the GTX 256 and X1800XT. The X1800 XT PE, or whatever's next in line, is *supposed* to compete with the GTX 512. Almost seems like Nvidia caught ATI flat-footed on this one.

Vol2005 - Wednesday, December 21, 2005 - link
Sorry, maybe I was a bit unclear, but the thing is that the Asus X1800XT TOP IS the X1800 XT PE, indeed. And, as you've just said, that's the real competitor to the GTX 512, according to the article I referred to.

As to real-world performance, it's still unclear to me what you mean.

Maybe I'm wrong, but don't the majority of modern games use D3D? Even if not, I think these results are proof enough that the GTX is no longer the fastest.

Of course, this has to be confirmed by other reviewers.
bob661 - Tuesday, December 20, 2005 - link
You should've bought it when it was released. It was available then. Nothing mystical here.

Leper Messiah - Tuesday, December 20, 2005 - link
A 6 frames-per-second increase at 1920x1440? Eh?

Tanclearas - Tuesday, December 20, 2005 - link
Come on! There are over 100 people in EVGA's step-up queue waiting for the 7800GTX 512MB, but you have a problem with ATI's availability?!

Nvidia got LUCKY with the 256MB 7800GTX in that it was ready to launch with no real competition, so Nvidia was able to sit on it until sufficient quantities were ready. ATI (sort of) launches the X1800XT, and Nvidia falls back to the same old launch tricks. If you're going to hold one company accountable, you have to hold them all accountable!
bob661 - Tuesday, December 20, 2005 - link
LOL! That's some retarded logic you got there, pal.

Tanclearas - Tuesday, December 20, 2005 - link
So you're saying Nvidia should not be held accountable for supply issues, but ATI should be? Please tell me you were being sarcastic.

bob661 - Tuesday, December 20, 2005 - link
I'm saying it's retarded because when Nvidia released their cards, you could buy them that day - unlike ATI, which even SAYS it will be different this time around and STILL fails to deliver. If neither company produces enough to meet demand, then they underestimated demand, and that's something different entirely.

Tanclearas - Tuesday, December 20, 2005 - link
So ATI could have shipped about five cards to the top 10 retailers, and ATI would have completely fulfilled your expectations. They just would have "underestimated demand".

That's a huge load of crap you're shovelling there. Both companies are still more interested in appearing to be in a leadership position than they are in actually ensuring they are making deliverable products. I just can't understand why so many "journalists" have their heads shoved so far up Nvidia's ass they can count their fillings.
almvtb - Tuesday, December 20, 2005 - link
Has anyone ever compared SLI and CrossFire performance using a dual-core versus just a single-core CPU? I mean, if there is enough overhead for SLI or CrossFire, a dual-core chip could improve performance.

kristof007 - Tuesday, December 20, 2005 - link
I don't know if that dual-core thing would work. I mean, it might, but the two slower cores would not help, in my opinion. Games are single-threaded, so the second core wouldn't take off the overhead... at least that's my understanding of it.

almvtb - Tuesday, December 20, 2005 - link
See, I thought that was a big deal with one of the latest Nvidia driver releases: that it was made multithreaded, so that in a situation where you have SLI or any other kind of driver overhead, it would be taken care of by a second core if one existed. I don't know - it was just a thought that I had never seen discussed, so I thought I would ask.

bob661 - Tuesday, December 20, 2005 - link
That was an ATI driver release that had the multithreading stuff, I think.

kilkennycat - Tuesday, December 20, 2005 - link
We shall soon find out whether Crossfire is serious or just an ATi marketing straw-grabbing ploy to get some suckers (er, "enthusiasts") not to buy SLI. If the compositor is fully integrated into EVERY R580 GPU (thus never requiring a master board, and implementing the board communications via a passive bridge a la nVidia), then we shall finally know that ATI is serious about Crossfire. It was probably a stupid cheese-paring management decision not to integrate the Crossfire functionality fully into the R520 GPU - or else Crossfire does not have enthusiastic support from ATI engineering and is purely an ATi marketing ploy anyway. The R580 details will reveal the truth.

Spacecomber - Tuesday, December 20, 2005 - link
What changed since the Battlefield 2 GPU Performance Analysis article (http://www.anandtech.com/video/showdoc.aspx?i=2466...)? It seemed like you were able to demonstrate the advantages of SLI in those benchmarks.

Space
bob661 - Tuesday, December 20, 2005 - link
I think AT has a different benchmark now for BF2.

Spacecomber - Thursday, December 22, 2005 - link
As far as I know, the only things that have changed along the way are the addition of BF2 patches (according to the overclocking-the-Athlon-X2 article, they are up to the 1.03 patch) and newer Nvidia drivers. I believe they are still creating a demo and running it with the timedemo option. With BF2 being such a popular game, it seems like it would be worthwhile to confirm whether SLI/CrossFire does or does not offer significant improvements for it.

ViRGE - Wednesday, December 21, 2005 - link
Ya, DICE seems to screw up demos with new BF2 patches.

ElFenix - Tuesday, December 20, 2005 - link
I wonder if you can change B&W2's executable name to make the score go up as well (i.e., rename the .exe so the driver's per-application detection no longer recognizes the game). Maybe there is poor optimization going on in the Catalyst AI?

tuteja1986 - Tuesday, December 20, 2005 - link
I meant to say BF2 :( I know I am an idiot >>> Anyway, please forgive me. And I have a 7800GTX, so don't call me an ATI fanboy - I can even take a screenshot if you want!

tfranzese - Tuesday, December 20, 2005 - link
"So were back here today with the CrossFire solution that should have been: the ATI Radeon X1800 CrossFire Edition."

we're
tfranzese - Tuesday, December 20, 2005 - link
So let's all throw a shit fit about every company that ever announced a product only to have availability weeks to years from that announcement.

Anandtech staff is just as bad as its two-year-old readers who tie emotions in with silicon. Grow up and learn some patience - if you can't wait, buy someone else's product and stop your whining.
tuteja1986 - Tuesday, December 20, 2005 - link
CrossFire does great in DoD: Source and Black and White 2, and it beats Nvidia 7800GTX 512MB SLI. However, it doesn't do great on the Doom 3 engine. In FEAR and Chaos Theory it manages to defeat 7800GTX 256MB SLI easily, but at the high-quality settings in Chaos Theory it only keeps up with the 7800GTX 512MB. X1800XT CrossFire has only one problem, and that's that it sucks in the Doom-engine benchmark, but overall it's great.

jonny13 - Tuesday, December 20, 2005 - link
"crossfire does great in DOD source , Black and White 2 and it beats Nvidia 7800GTX 512MB SLI "You obviously didn't read the article if you thought Crossfire did great in Black and White 2. It failed to get above 12 fps and was over 50% slower than a single x1800XT at times.
BigLan - Tuesday, December 20, 2005 - link
No X1800 master cards showing on Newegg or CompUSA's website as of 10 AM EST. I really hope ATi makes good on their availability promises this time.

michaelpatrick33 - Tuesday, December 20, 2005 - link
I think the R580 chipset and video card will be the real CrossFire shot at SLI. This feels a little like early-generation SLI to me, and they seriously need to get rid of the dongle, LOL. I can't wait to see the R580 in relation to the 7800GTX 512 market edition card.

radekhulan - Wednesday, December 21, 2005 - link
The problem is that the current ATI Crossfire setup is way inferior to NVIDIA SLI, despite the big heading "ATI Multi-GPU done right", and I do not believe this will change much with the hyped R580.

With NVIDIA I can:
- mix any similar cards, be it 6800GS/GT, 7800GT, 7800GTX, or 7800GTX-512, thus saving costs
- get much better drivers
- get support for user-defined game profiles, so when I purchase a new game, I do not have to wait a month or two for the card maker to come up with new drivers supporting that game (i.e. with ATI I will play new games using SuperTiling with 0% to -50% "improvement", while with NVIDIA I will create a new game profile using e.g. AFR2 and get an immediate 90% improvement - see the toy numbers after this list)
- there is no huge external dongle, and I guess picture quality should be better as well with SLI
- there is much better availability of nForce4 SLI chipsets/motherboards
- I can connect up to 4 monitors to NVIDIA SLI, but I cannot with Crossfire
- I can switch NVIDIA SLI on/off without restart
- I get better performance with NVIDIA SLI
ATI Crossfire still seems like an afterthought, nothing else, while NVIDIA SLI is a technology incorporated into the core. If I want to game at 1600x1200, the only real option is to get NVIDIA SLI.

I understand that Anandtech cannot bash Crossfire too much, to keep good relations with ATI - free products, shows, trips, etc. - but I believe that the superiority of NVIDIA SLI is very clear here.
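To put toy numbers on the profile point above (every figure here is an assumption for the sake of the example, not a measurement): AFR-style modes alternate whole frames between the two GPUs, so a GPU-bound game can approach 2x throughput, while a tile-split mode like SuperTiling only helps as much as the per-frame work actually divides.

    # Toy illustration of AFR vs. tile-split scaling; all numbers are
    # made-up assumptions, not benchmark results.
    frame_ms = 20.0                      # assumed per-frame render time on one GPU
    single_fps = 1000 / frame_ms         # 50 fps on a single card
    afr_fps = 1000 / (frame_ms / 2)      # frames alternate between GPUs: ~2x
    tiled_fps = 1000 / (frame_ms * 0.9)  # tiling that only trims 10% of the work
    print(single_fps, afr_fps, tiled_fps)  # 50.0, 100.0, ~55.6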
Visual - Thursday, December 22, 2005 - link
I don't know what you're talking about.

- you can't mix any of the cards you listed with each other, except a GTX with a GTX 512 (you've got to use only 256MB from each, and run the 512 model at the normal GTX speeds);
- don't see how the drivers are better
- dunno if a profile tool/editor for ATI exists or is in the making, so I can't comment on this; but you're overestimating the improvement on NVIDIA
- it's just a cable; your guess about the picture quality comes from what grounds? Guess what - you guessed wrong.
- point taken, unless you go Intel; but I seriously hope both ATI and nVidia will release unlocked drivers that work on any dual-x16 board
- last I checked, you can't use the outputs on the secondary card in SLI mode - you have to switch SLI off first; I might be counting on outdated information, but I doubt it.
- yeah, and you do that how often? I bet it's much fun and enjoyment... no, seriously, you're right that this is a convenient feature, as occasionally you'd have to switch it, but it doesn't seem too important, does it? Plus there's no telling that CrossFire won't be able to do the same eventually.
- again, an unfounded claim

Indeed, CrossFire seems like an afterthought to me too, and nVidia's superiority here is (at least was, until recently) clear. But that's the good part of competition - nVidia forced ATI to "afterthink" something out, and it will get better with time. CrossFire is just as serious an option as SLI, and if you stick with your "the only real option is to get NVIDIA SLI" line, you're just purposely closing your eyes.

Think what you want; I don't really care. But don't preach unfounded fanboyism without expecting to face differing opinions.
coldpower27 - Tuesday, December 20, 2005 - link
However, the 7800 GTX 512 is not the competitor for the R580; that competitor is the G71, which is supposed to be due in early February, as opposed to the R580's late January.