I’m still a little confused about this 60MHz limit at 1600x1200.
Can someone please explain what I am looking at (in the graphs) when every 1600x1200 resolution that is tested is above 60MHz. If crossfire really is limited to 1600x1200@60MHz then what do these numbers represent? Wouldn’t this also make the entire benchmarking section of this article pointless?
Framerate is separate from the physical refresh (or redraw, in the case of LCDs) rate of your monitor. The cards can produce higher than 60fps at 16x12, but you'll only see at most 60 of those frames in a given second. It's the same principle as with single cards today. Your frame rate might be 120fps, but if your monitor's refreshing at 85Hz, you aren't going to see all those frames. They're just--for lack of a better term--thrown away.
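A minimal way to picture that (toy numbers of my own, not anything measured in the review): the monitor can only show one frame per refresh cycle, so anything rendered beyond the refresh rate never reaches the screen.

```python
# Toy illustration (not from the article): frames actually shown per second can
# never exceed the monitor's refresh rate, however fast the cards render.
def visible_fps(rendered_fps: float, refresh_hz: float) -> float:
    return min(rendered_fps, refresh_hz)

print(visible_fps(120.0, 85.0))  # 85.0 -> ~35 rendered frames per second are discarded
print(visible_fps(95.0, 60.0))   # 60.0 -> the CrossFire 1600x1200@60Hz case
print(visible_fps(45.0, 60.0))   # 45.0 -> below the refresh rate you see every frame
```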
Thanks Pete, but I do understand the difference between a card's frame rate and a monitor's refresh rate.
My confusion is in regards to this quote from the article “The problem is that ATI is currently fixing maximum CrossFire resolution to 1600x1200@60Hz”, and then we see every game test run at 1600x1200 over 60Hz (BTW thanks for ignoring my typo of MHz instead of Hz in my earlier post).
If the numbers in the game tests are the actual frame rates then why limit the resolution? Furthermore, can the card's driver limit the monitor's refresh rate? And again, looking at the scores, why would they want to do this?
I feel as if I am missing a vital piece of the puzzle.
Where does it say anything about the games being run at over 60Hz refresh?
The cards manage significantly higher frame rates than 60fps when rendering
the games, but I've not spotted the monitor setup being mentioned *at all*.
Maybe I'm going blind. The graphs just show frame rate - no mention of refresh.
How do you mean "why limit the resolution"? Who's limiting it to what (in this case)?
The driver is responsible for programming the video outputs on the card, and
for reporting the available modes to the Display Properties dialogue. If the
driver won't let you set a higher resolution, then you can't (without using
something like PowerStrip) even if the card could physically do it. So yes,
the driver could limit it. The modes presented are the result of negotiation
between the monitor and the driver, unless they're manually overridden. ATi's
drivers have been known not to present all the modes the monitor/card can handle
unless persuaded by a registry hack, but I don't think that's what's going on here.
In case of further confusion, it's up to the card to tell the monitor how to refresh
- the card "pushes", the monitor doesn't "pull". Some monitors have an internal
buffer to allow their screens to be updated separately from the frame buffer, and
obviously the refresh rate affects TFTs less than CRTs (a TFT won't fade to black
a fraction of a second after you last updated it), but the rate at which the video
gets from the card's frame buffer to the monitor is determined by the driver's
decision of how to program the card.
As for why ATi(?) would want to do this, I'm sure they don't. In theory Crossfire
should support higher single-link resolutions (at lower refreshes) - 1920x1200 at
50Hz, for example - and I believe it actually does, with the right monitor attached.
However, the X800/X850 series start to run out of performance above 1600x1200, when
their hierarchical Z buffer doesn't fit any more (according to an article on higher
resolution rendering with 7800GTXs; I could be wrong), so 1600x1200 would be a
reasonable limit anyway; also few LCD monitors support more than that without being
widescreened, and many games don't like a wide screen configuration. What you can't
do is exceed the single link bandwidth, so the card isn't physically capable of
sending 1600x1200 at 85Hz for a CRT. 1600x1200 on an LCD should be fine, because
60Hz won't flicker; it's only CRT owners and those (few) with dual link LCDs who
have a problem.
I would guess this test was performed with a 20" LCD, on the basis that a 60Hz
1600x1200 CRT would drive the reviewer nuts quickly, and they're more common than
1920x1200 panels; hence no test above this. Any specific details of refresh will
presumably be the result of the driver talking to the monitor, and may not correspond
to what you'd get with a *different* monitor. Almost all TFTs will be within the
limits, but if you've got a decent CRT then it's definitely a problem. Nice of ATi
to buy us all 2405FPWs, isn't it?
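To put rough numbers on the single-link point, here's a back-of-envelope sketch; the blanking overheads are my own approximations of typical timings, not ATI's published figures.

```python
# Back-of-envelope check of the single-link DVI limit. Single-link TMDS tops out
# at a 165 MHz pixel clock, and the pixel clock is the active resolution plus
# blanking, multiplied by the refresh rate. Blanking overheads below are rough
# assumptions (~25% for CRT-style timings, ~10% for reduced-blanking LCD timings).
SINGLE_LINK_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead):
    total_pixels = width * height * (1.0 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

for w, h, hz, blank in [(1600, 1200, 60, 0.25),   # ~144 MHz -> fits
                        (1600, 1200, 85, 0.25),   # ~204 MHz -> too much for one link
                        (1920, 1200, 50, 0.25),   # ~144 MHz -> fits, as suggested above
                        (1920, 1200, 60, 0.10)]:  # ~152 MHz -> fits with reduced blanking
    mhz = pixel_clock_mhz(w, h, hz, blank)
    status = "OK" if mhz <= SINGLE_LINK_MHZ else "over"
    print(f"{w}x{h}@{hz}Hz ~ {mhz:.0f} MHz ({status} for single link)")
```

The exact figures depend on the timing standard in use, but the 1600x1200@85Hz case lands well past what one TMDS link can carry, which is the CRT owners' complaint.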
even though their overall sli performance is lower than nvidia, if you compare the percentage increase from ati single to sli with the percentage increase from nvidia single to sli...ati comes out on top.
which basically tells us when ati comes out with a 7800gtx killer, they will have much higher sli numbers than nvidia.
the limitations, I'm sure they will get ironed out in future implementations of crossfire.
I hope ati comes out with all-in-wonder editions of the sli cards...can you say 2 channels at the same time without any lag? :D
... umm ... did you read/understand the tables at the bottom of page 6? Quoted directly from the review -- "the 4xAA modes show that SLI provides better scaling" ... meaning that SLI gets closer to the theoretical doubling of performance (ie, 100%) than CrossFire does. In fact, the only situation in which a move from single-card to dual-card scales better on the ATI side than the nVidia side is Doom3 at 16x12/noAA (41.3% vs 34.0%)...
This tells us nothing substantial about performance improvements when ATI comes out with R520, but in my opinion it would seem that nVidia has the stronger dual-card algorithms. If I had the choice of spending an additional $425 (cost of a master card, estimated) for an average performance improvement of ~37% with noAA or ~56% with 4xAA for ATI's CrossFire, versus spending an additional $450 (cost of another 7800GTX) for an average performance improvement of 48.5% with noAA or 79.8% with 4xAA for nVidia's SLI, I think I would definitely go with nVidia's solution, as that additional $450 for another 7800GTX offers more bang-for-buck than for one of ATI's master cards. Combine this with the fact that 7800GTX SLI offers better performance overall anyway, and it's a no-brainer. The situation's a little different if you've already got a 6800Ultra (but then why are you looking at CrossFire?) or an x850xt (but then why are you looking at SLI?), but those situations have no bearing. Simply put, nVidia has hands-down won this "round" of multi-card rendering, even ignoring current availability.
-TIM
PS -- I would be curious what would be possible with two AIW's hooked up in CrossFire ... that could certainly have some interesting situations open up ...
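For anyone who wants to redo the scaling arithmetic being argued here, a small sketch of it; the frame rates and prices below are placeholders, not the review's data.

```python
# Illustrative only: how "scaling" and rough bang-for-buck are being computed in
# this thread. The fps and price figures are made up for the example.
def scaling_pct(single_fps, dual_fps):
    """Percent gained by adding the second card."""
    return (dual_fps / single_fps - 1.0) * 100.0

def dollars_per_pct(extra_cost, single_fps, dual_fps):
    """Cost of the second card per percentage point of extra performance."""
    return extra_cost / scaling_pct(single_fps, dual_fps)

print(round(scaling_pct(60.0, 84.8), 1))            # 41.3 -> better *relative* scaling
print(round(dollars_per_pct(425, 60.0, 84.8), 2))   # ~10.28 dollars per percent gained
print(round(dollars_per_pct(450, 80.0, 107.2), 2))  # ~13.24 -> worse ratio, but from a higher base fps
```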
quote: With AA enabled, CrossFire performs similarly well. The 7800 GTX SLI beats it in Halflife 2 this time around (even though both parts are still essentially CPU limited),
So that would denote that AA "costs" more on the ATI setup than it does on the NVidia setup, right?
The ATI setup could have performed better because the way HL2's code runs on ATI hardware allows a higher maximum performance under a similar CPU limitation. It could be a driver efficiency issue. It could be a number of things ...
It's too hard to make a statement about anything when we are bouncing hard off of CPU limited performance.
The CrossFire master card is basically an X850 XT with the addition of a Xilinx FPGA (for the compositing engine) and a TMDS receiver for taking input from the slave card. Instead of 2 DVI-D ports, the CrossFire master card makes use of a high speed DMS port. This connects to one port of the CrossFire dongle and takes the slave card input as well as providing the output to the monitor.
We will be posting an in-depth review of the ATI Crossfire AMD motherboard tomorrow. One thing no site has really talked about is what a great overclocker the ATI Crossfire AMD has become. The Reference Board has the best Enthusiast level options and controls I have EVER seen on a Reference Board, the overclocking performance is outstanding, and DFI promises they will deliver the same or better in the next few weeks in their own Crossfire AMD board.
Another unmentioned biggie for me is the fact that the Crossfire X850 Master Card works just fine as a standalone X850 graphics card. If you have to buy a Master Card for Crossfire I think it's important to know you can use it ALONE as an X850 card when the next hot generation of graphics comes along - it's not an investment you just have to throw away.
Derek talks a bit about how the next generation graphics fixes some of his concerns about Crossfire as a TODAY purchase. His comments should make more sense in that light, and they will certainly be clearer in the next couple of weeks when X1800 launches. Just keep in mind that the same Crossfire AMD motherboard will be used with X1800 and that the motherboard also works great with an nVidia 7800GTX and any other single nVidia or ATI graphics card right now.
i have to say nice numbers by ati, but
i mean who in their right mind is buying a 500 dollar x850 card and a 150-200 dollar mobo
just to get crossfire, when in a week or so the x520 is supposed to come out - which, by the way, does not appear to have any mention of crossfire support. and while it's true you do not have to buy the same 2 cards for crossfire to work, the 2nd (master) card you buy costs you 100 bucks more. phhff. sorry. the whole sli/crossfire thing really doesn't make much sense unless you are always buying at the release of the tech, i.e. when the gtx gets released you buy 2.
but why would you buy into a crossfire setup now, when the next ati cards are coming out? if they're similar to the nvidia ones, getting 2 x800s is roughly like spending the money on 2 6600gts, when you could buy the newer and overall better 1-card solution in a 7800gtx for the same money.
or a new x520, for the same money. i'd pass on getting old tech up to speed when new stuff is/will be here soon.
also if you do compare the 6800ultra vs the x850, the only game the ati card wins in is hl2. go figure - a game they basically wrote for their hardware with the help of gabe 'i will delay this game till ati is ready'
heh ... ATI was very unhappy with Valve's launch slip. They sank a lot of money into the HL2 bundle and launch efforts for a time frame that was a year before the game actually came along.
So, like most people on this forum, I read both here and at Tom's Hardware...and it is amazing to me how differently the two articles read. I was hoping you could clarify.
1. Are they just bought out by ATI, or are you guys just bought out by NVIDIA?
1a. The conclusions drawn seem to fly in the face of each other. You are dead set against ATI's crossfire, and they seem to be for it. This isn't fashion, or poetry, or a film, where you just go by what you like. There are frame rates, quality of picture, etc....why are they so different?
2. It appears that the benchmarks on Tom's are slightly different in comparison to yours, in that they show the X850 crossfire doing slightly better than the 7800 SLI configuration in several instances.
3. Tom's focused more on the notion that you don't need two of the exact same card. Based on your article, is it your opinion that you go 7800 SLI, or single with hopes of 7800 SLI later, rather than x850 now and then the addition of R520 later? It would be helpful to have some benchies of mixed card situations.
I realize that this seems attacking, and really I posted here because I like Anandtech better than Tom's, but I respect both as the best resources on the net for technology reviews; it just seems odd that the two top sites out there come down on completely different sides of the fence.
Well, it seems THG has in the last two years gone from being a "not-so-good" hardware site (please buy it back, Thomas) to currently being an "outright-bought" hardware site. Just look at the major articles they have published in the last year and you will see.
Tom's Hardware articles seem to be trending toward less interesting content and lower quality over time.
I still read it, but most of the time I find I prefer the AnandTech version of similar subjects.
One thing that is terrible about Tom's - why no direct link from articles to a discussion topic?
Not only does it make it easier to discuss, but I consider it a huge benefit that the AnandTech authors read and participate in the discussion of the articles.
wow, that comment sure brought a lot of attention.
I would agree that Tom's appears to be "outright-bought" in this article, as they don't post the card's limitation of 1600x1200 @ 60Hz, or at least not as clearly as Anand, but they do make some valid comments that Anand's article didn't.
Perhaps they are targeting different audiences? No idea, but why are their numbers on par with the 7800 SLI in many situations? That sounds fishy somewhere.
Adding an X8xx card to an R520 CrossFire card would either not perform well at all or would not work.
Personally, dual-GPU as an upgrade solution is not really a plus unless you are just deferring purchase for a couple months while prices drop and your wallet heals from the first GPU purchase. If your personal upgrade cycle is a year or more, you'd be much better off just buying a new card from a new architecture.
Everyone else has pretty well hit it on the head. The 1600x1200 limit is a killer. We also have no availability and we are butted right up against the launch of ATI's next gen parts.
I wouldn't recommend SLI either -- as I said -- unless you want absolute maximum performance. My recommendation may change to CrossFire after the R520 comes along. But who knows what the results of that comparison will be :-)
Mixed modes would perform slightly lower than the dual x850 xt setup and still at most 1600x1200@60 ... Yes, the X850 XT CrossFire does well in performance, but if I'm not going to recommend the X850 XT CrossFire, I'm certainly not going to recommend a mixed solution that will perform worse.
I wasn't saying that you would recommend the other, but it would have been interesting for the readers (who attempt to be cognizant, autonomous beings, and only act at the whim and will of god Anand!) to be able to compare for themselves, perchance to see what a mixed solution looks like, as that is a selling point of ATI over Nvidia.
I don't think SLI from NVIDIA is much of a solution. If you have $1k to shell out for graphics out of the starting gate, great. But you have to get the same manufacturer, the same card, and then the motherboard to match. But I won't be buying an ATI Crossfire setup just yet either.
1) If the 1600x1200@60 Hz problem doesn't bother you, then continue. For me, it's a deal-breaker.
2) Do you already own an X8xx card of some form?
3) Do you have a motherboard with two PCIe X16 slots?
4) Using the ATI Crossfire chipset?
If all of those are true, X800 Crossfire is worth consideration. Personally, 1, 3, and 4 eliminate it from contention. That said, this is current X8xx Crossfire we're looking at. We're not reviewing R520 Crossfire yet, and it will address at least the first point in some fashion.
Yeah, accusing a site of corruption could seem like an attack.
1600 res limited to 60 hz is a deal-breaker for me right away. I can't tolerate playing at 60 hz for any length of time. I'm not a snob about this, it makes me physically feel ill. Pairing two powerful and pricey cards in one system should offer better, easily. I would not plunk down that kind of money to not be able to play at 16x12 when others do it at liveable refresh rates. Nvidia has better price-to-performance single and dual card solutions available right now. ATI should have the same shortly. The mega AA sounds great, but if as soon as you turn it on you say "I need to upgrade now," then what's the point? While the reviewed Crossfire is certainly nice in performance, there's better to be had for the money from both GPU suppliers. It would be irresponsible to recommend it at this time at the current price.
Now if there's a confluence of specifics where this setup makes gaming and financial sense to a handful of people out there, more power to them, they should enjoy this. For the teeming masses of readers though, you shouldn't be surprised by the lack of recommendation.
LOL... attack an enthusiast site? Say it isn't so!
How about the THG article? They review the platform as a whole, so that's a bit different. I won't comment much on their review, but consider a few points.
They have a page entitled, "Advantages Of CrossFire Over SLI" that reads like marketing hype, and yet they make no mention of the resolution limitation or "Advantages of SLI Over Crossfire". Clearly, they know the limitation exists - ATI hasn't tried to hide this fact, and the lack of any benches at higher than 1600x1200 is telling in and of itself. You do the rest of the math. (Also, some of the results are at best suspect.)
IMO, the writing of the THG article was a bit higher quality, but the content was far more suspect. They come off making everything sound rosy for ATI, and only a fool or a marketing department would believe that. ATI isn't dead yet by any means, but Crossfire is doing little for me right now. Did you realize that it's still not available for purchase at retail? Hmmmm.....
Oh yeah, Catalyst Control Center is pure garbage. Slow, clunky UI, memory hog, and causes as many problems as it fixes. Anyone that tries to tell me how great CCC is (i.e. THG) is immediately under suspicion.
Oh, I wasn't making a blanket statement that a review site couldn't be accused of corruption. Just that if you're going to do it, don't then wuss out and pretend it's not an attack :)
My experience with THG is mostly second-hand, so I don't say much about it. That they chose to not mention the 16x12 refresh rate limitation, especially with all the debate over it before today, well... that's scandalous.
The TechReport review was the best. Anand seems to give poor graphics reviews, and they're definitely not up to par with their CPU, motherboard and memory reviews. It must be the authors' differences of opinion.
Could you clarify why Anand's video reviews suck? Just because Anand doesn't benchmark and show 100 diff. graphs of the cards based on the same architecture doesn't mean AnandTech's video reviews suck. TR is quite redundant from one article to another, if you didn't notice.
quote: CrossFire is limited to a peak resolution of 1600x1200 at a 60Hz refresh rate. CrossFire relies on the single-link DVI output of existing Radeon X800-family graphics cards, and that connection tops out at 1600x1200 at 60Hz.
Well, the 1600x1200@60Hz is only a limitation because of the existing x800 family of cards, not the x850 family, and definitely not Crossfire itself.
It's a limitation of the X8xx series, as they all feature single-link TMDS transmitters, and so the Master cards have single-link TMDS receivers. They should be good for more than 16x12@72Hz, per DVI spec; hopefully future drivers will up this a bit.
I really like how concise and readable the first few pages are.
I do agree that comparing XF directly to the "2nd gen" SLI of the 7800 is a little unfair, but it's still potentially useful to some people, and you obviously left in XF's direct competitors, 6800 SLI and a single 7800. This does take the article in the 'too much info' direction, as opposed to the first few pages' 'just enough' method.
I have a few suggestions and corrections, if you don't mind.
* Perhaps you could elaborate on how XF will remove the res/refresh limitation with the R520 lineup's dual-link TMDS transmitters? This is appropriate in terms of the 7800 SLI comparison, although who knows when X1800 XF will show up.
* On that note, I've read elsewhere that SuperAA is so unbelievably slow because XF is actually using PCIe (bandwidth- and latency-limited) lanes and then the "master" GPU (for inter-GPU communication and then to composite the image, respectively), and not the dongle and CE (as with "normal" XF operation). This will supposedly be corrected in a future driver, but (IMO) it's as big a shortcoming (however temporary) as the (permanent, hardware-imposed) resolution limit. And I'm quite skeptical about future driver fixes, though it seems essential that ATI solve this one.
* p.6, you write "pre" instead of "per."
* p.7, "worth" instead of "worthy."
Will you be examining these issues at Ibiza, or will you have time before packing your sunscreen? :D
(And no, I'm not ignoring you, I'm just an incredibly slow and unimaginative thinker at times.)
I did mention that ATI's next gen part should remove the limitations of the single-link TMDS somewhere in there ... I am unable to go into detail at this time.
I'll have to follow up on the PCIe rather than TMDS angle. That would make some sense to me though. All the subsamples from a single pixel may need to be in the same framebuffer in order for ATI to perform proper AA on them. It may be that the gamma adjustment causes some problems with doing a straight blend between the two scenes. Of course, that's speculation about speculation, so I wouldn't put much stock in my musings :-) As I said though, I'll follow up on this.
I fixed my typos. Thanks.
Glad you liked the article. And where I'm going next sunscreen won't be of much use. :-(
Also, I didn't think you were ignoring me. I've actually been pretty busy myself lately, so I completely understand.
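On the gamma point above, this is the general problem with a straight blend of gamma-encoded samples; it's a generic illustration, not a claim about how SuperAA actually composites.

```python
# Why a "straight blend" can clash with gamma adjustment (toy example, my own
# illustration of the general issue, not ATI's actual SuperAA path).
GAMMA = 2.2

def blend_gamma_space(a, b):   # naive average of gamma-encoded values
    return (a + b) / 2

def blend_linear_space(a, b):  # decode to linear light, average, re-encode
    lin = ((a ** GAMMA) + (b ** GAMMA)) / 2
    return lin ** (1 / GAMMA)

# Averaging a black and a white subsample (values normalised to 0..1):
print(blend_gamma_space(0.0, 1.0))   # 0.5   -> displays too dark
print(blend_linear_space(0.0, 1.0))  # ~0.73 -> the gamma-correct result
```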
Good point. Why are we comparing previous gen ATI to current gen nVidia?
----------------------------------------------------------------------------------------
Previous gen? I can buy something better than an x850 xt pe from ati? Where?
Crossfire is compared to the 6800 Ultra SLI and more. More is better me thinks.
If they hadn’t tested it against the 7800GTX we would not have known that SLI got beat in HL2, now would we? I think the choice on which cards to test was great.
It's not AT's fault ATI didn’t give them any newer cards to run. You will see those benches next week.
With the first comparison images between the various levels of AA, I can see the straight picture to be badly stairstepped. The 4AA seems to correct pretty much all of the problems. I can't see a real improvement in any of the others.
I wonder how many will notice any of it in an actual game situation.
It would be easier to show the advantages if I could show motion. Higher AA not only helps the stair steps, but it also helps keep objects that are well antialiased in a single frame consistent across multiple frames of motion.
Derek, I've noticed something odd in the last few articles regarding splinter cell:chaos theory and the 6800u benchmarks. If you compare the results of 1600x1200 no aa and 1600x1200 4x aa, the performance hit for the 6800u is .1 of a frame? How is this possible? In my own experience, when I enable 4xaa on my 6800u for sc:ct, I notice no difference in image quality. Is this perhaps a driver problem?
power draw is measured before the PSU -- so yes, the dissipated power of the supply itself is included. And I do know that power draw at the wall is not exactly linear with respect to power supplied to the computer. At the same time, watts pulled from the wall are what we pay for right? :-)
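A quick sketch of the wall-power point; the efficiency figures are assumptions for illustration, not measurements from this test setup.

```python
# Wall draw includes the PSU's own losses, and efficiency changes with load,
# which is why wall watts don't track component watts linearly.
# Efficiency numbers below are assumed for illustration only.
def dc_power(wall_watts, efficiency):
    """Approximate power actually delivered to the components."""
    return wall_watts * efficiency

print(dc_power(180, 0.72))  # lighter load, poorer efficiency -> ~130 W to the system
print(dc_power(320, 0.80))  # heavier load, better efficiency -> 256 W to the system
```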
You do realise that without proper homework you are just perpetuating sensationalism, right? The TMDS receiver has nothing to do with actual framerate or screen refresh, you do know that, right? You are aware that 1600x1200@60Hz in the TMDS translates into about 2500x1500@100Hz when doing SLI (or Crossfire, as is the case), right? Now go do your homework and correct the article (you're liable to be sued for libel by ATI btw).
LOL, you are such an idiot. AnandTech can't be sued for libel; did you ever take business 101? Apparently not. Please keep your mouth shut on the things you have no clue about.
sorry m8, don't know where you got your info, but regardless of whether it is possible for the output to run at a higher resolution than the TMDS receiver allows, it doesn't matter when the product manager of CrossFire at ATI states that the output of CrossFire will be locked to 1600x1200@60Hz *because* of the single link TMDS receiver.
I'm sorry if I didn't make it completely clear that ATI could decouple their output from the receiver, but they have chosen not to.
The TMDS receiver is specced to the TMDS transmitter on the slave card, so it is indirectly tied to the screen refresh. I don't recall Derek saying it limits the frame rate, but obviously you can't see more frames than screen updates, so it can potentially limit the visible framerate, too. Yes, that applies to anything with a different frame rate than refresh rate, but XFire is fairly limited at 16x12@60Hz.
And, no, you don't double up TMDS rates with XF, as the CE (Compositing Engine) doesn't have a buffer to accommodate a refresh rate independent of the TMDS rate (which is, again, limited to 16x12@60Hz).
It is late and I can't be bothered to look for the techie article on this particular problem and why it's been blown out of proportion, but I will do it tomorrow and post a link here. My apologies to Derek if I offended him, it really is late. Link upcoming.
Dangher, you won't find an article to support your claims. It was speculated (in many a forum and possibly by Josh at Penstarsys) that AFR could double XF's single-link TMDS refresh rate or resolution by interleaving frames, but that's been ruled out, as apparently the RAMDAC must run at the TMDS engine's rate, and the CE doesn't have buffer enough to support RAMDAC refresh rates independent of the TMDS engine.
So, I'd be surprised if you do.
And Derek won't be sued for libel unless he intentionally published false info. I'm sure much of his info came from ATI themselves, as well as hands-on experience (which shows a 16x12@60Hz limit across the review-site board).
I think there has been speculation about what could be done with additional low-level hardware and driver tweaks. For now, X8xx Crossfire does not appear to have any support for anything beyond 1600x1200@60 Hz. That's terrible, in my opinion. I have a 9 year old 21" CRT that can run 1600x1200@75Hz. Anyone that has the money to buy Crossfire is highly likely to have a better monitor than that. Meanwhile, my 2405FPW may only run at 60Hz, but lack of 1920x1200 output makes X850 Crossfire a definite no.
My only hope is that ATI has spent more effort on R520 Crossfire and will manage to support at least 2048x1536@85 Hz. That's about where top quality CRTs max out, and there are far more 22" CRT owners than Apple 30" Cinema Display owners. :|
I'm surprised that any single-link resolution isn't possible (so a digitally driven
2405FPW ought to work), but it's clear that there's a problem with CRTs. The R520's
dual-link outputs would appear to solve the problem with reasonable headroom, coincidentally supporting dual link monitors.
Dangher's post *could* make sense - by interleaving pixels one could, in theory, take
two single-link images and produce a dual-link one. But the chips aren't really set
up to render like that - it's certainly not one of the announced Crossfire modes.
It would probably also be slower than the existing modes.
AFAIK there's very little intelligence in the CE (or in the SLi combiner) - the
chip not producing output for the relevant bit of screen just outputs black, and
the CE/SLi combiner just ORs the values from the two heads together. There's a bit
of genlock involved and the DVI receiver and transmitter, but the amount of actual
logic is tiny. Unless I'm wrong about how it works, but I don't see the need for
more (except for the multi-card antialiasing, which presumably needs some blending
support - I was a bit surprised that nVidia could retrofit this for that reason).
You could do all kinds of clever things if the SLi bridge/Crossfire connection
was actually a general-purpose high bandwidth link between the two cards, but to
the best of my knowledge, it's not: it's video only, so you're limited to what
the cards can drive on their digital video outputs when it comes to displaying
the result, and uneven splitting won't help you - it's the peak rate of output
which matters, not the average throughput.
On the plus side, with enough supersampling 1280x1024 on a CRT might not look
much worse than 1600x1200 with less...
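A toy model of the compositing described above (my reading of the "output black and OR the two heads together" idea, not ATI's actual hardware):

```python
# Each card renders only its share of the screen and outputs black (0) elsewhere;
# the compositing engine then just ORs the two video streams pixel by pixel.
def composite_or(frame_a, frame_b):
    return [[a | b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

top    = [[0xFF0000, 0x00FF00],   # card A rendered the top half of the screen...
          [0x000000, 0x000000]]   # ...and sends black for the bottom
bottom = [[0x000000, 0x000000],   # card B did the opposite
          [0x0000FF, 0xFFFFFF]]
print(composite_or(top, bottom))  # -> [[16711680, 65280], [255, 16777215]], the full frame
```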
I've belatedly picked up on something. Sorry if I'm being slow, but to confirm:
The multi-card supersampling mode... is the frame from the secondary card sent
over the PCI-e bus, rather than over the Crossfire link? If so, this would
explain a large performance drop as it's implemented, but also explain how
nVidia could implement the equivalent mode without having built blending
directly into their SLi combiner in the first place (and also suggest that
the Crossfire combiner doesn't need to be clever enough to blend). It might
also explain why nVidia's implementation coincided with a bridgeless SLi
capability (once you've done the work in the driver...)
If they *do* this, there's no reason for it to be limited to 1600x1200 (or
single-link bandwidth), other than that the PCI-e bus will be limiting the
refresh at some point.
Just wondering, and curious whether I'm imagining it.
I don't get offended easily. I'm certainly the first person who wants to know if I got something wrong. At the same time, it is my responsibility to get the point across in the clearest way possible, so I'm also concerned when it doesn't seem that I have communicated the facts clearly enough.
If this had been released 6 months ago, it would be good. Right now with one 7800GTX beating it in some benchies, and SLi GTs and GTX raping it, this just doesn't cut it. Hopefully ATi has something amazing with the R520, otherwise they are heading back to the days of pre-R300.
Okay, I don't get this. I'm running a 24" widescreen monitor at 1920x1200@60Hz using single link DVI. The limit for single-link DVI at 60Hz is said to be 2.6 megapixels, which is quite a bit higher than 1600x1200.
That's because all the video card vendors allow higher resolutions by reducing the video blanking period. This gives the card more time to send data, resulting in a higher available resolution.
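Roughly what that looks like in numbers; the timings are recalled from the CVT standard, so treat them as approximate rather than vendor-verified.

```python
# Pixel clock = total pixels per frame (active + blanking) x refresh rate.
# Cutting the blanking is what squeezes 1920x1200@60 under the 165 MHz single-link cap.
def pclk_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

print(pclk_mhz(2592, 1245, 60))  # ~193.6 MHz with conventional CVT blanking -> over the limit
print(pclk_mhz(2080, 1235, 60))  # ~154.1 MHz with CVT reduced blanking -> fits on single link
```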
I'm assuming the article is brand new and yet to be fixed, but in case no one has noticed, the charts on page 4 show the crossfire consuming no power. While I'm sure that would be everyone's goal, I don't think it's right somehow.
Derek, I've noticed something odd in the last few articles regarding splinter cell:chaos theory and the 6800u benchmarks. If you compare the results of 1600x1200 no aa and 1600x1200 4x aa, the performance hit for the 6800u is .1 of a frame? How is this possible? In my own experience, when I enable 4xaa on my 6800u for sc:ct, I notice no difference in image quality. Is this perhaps a driver problem?Jojo7 - Monday, September 26, 2005 - link
Err. I made a mistake. Actually the scores are identical (40.2) with and without 4xAA. Something has to be wrong here.
DerekWilson - Monday, September 26, 2005 - link
Something is wrong there ... I'm removing the 6800 Ultra numbers from SC3 -- thanks for pointing out the problem.
erinlegault - Monday, September 26, 2005 - link
Were you in a little rush to get this article out the door?
DerekWilson - Tuesday, September 27, 2005 - link
Actually, we took our time. Copied a few numbers down incorrectly. Sorry about that.
Googer - Monday, September 26, 2005 - link
Why hasn't anyone tested these on an nForce 4 motherboard yet? ATI Crossfire on a DFI SLI motherboard -- will it work?
OvErHeAtInG - Monday, September 26, 2005 - link
Um, no. In the future, with different drivers? Who knows. But nVidia is unlikely to provide good nF4 drivers for people who are buying ATI cards.
Live - Monday, September 26, 2005 - link
How is the total power draw calculated, before or after the PSU?
DerekWilson - Monday, September 26, 2005 - link
Sorry I didn't explain -- I'll add the info. Power draw is measured before the PSU -- so yes, the dissipated power of the supply itself is included. And I do know that power draw at the wall is not exactly linear with respect to power supplied to the computer. At the same time, watts pulled from the wall are what we pay for, right? :-)
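If it helps put the wall numbers in perspective, here's a quick back-of-envelope sketch; the ~75% PSU efficiency in it is just an assumed round number for illustration, not something we measured:

# Rough sketch: wall-socket draw vs. an estimate of the power actually
# delivered to the components, using an assumed (not measured) ~75% PSU efficiency.
def estimated_dc_watts(wall_watts, psu_efficiency=0.75):
    return wall_watts * psu_efficiency

for label, wall in (("idle", 150), ("load", 326)):
    print(f"{label}: {wall} W at the wall -> roughly {estimated_dc_watts(wall):.0f} W to the system")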
Dangher - Monday, September 26, 2005 - link
You do realise that without proper homework you are just perpetuating sensationalism, right? The TMDS receiver has nothing to do with actual framerate or screen refresh, you do know that, right? You are aware that 1600x1200@60Hz in the TMDS translates into about 2500x1500@100Hz when doing SLI (or Crossfire, as is the case), right? Now go do your homework and correct the article (you're liable to be sued for libel by ATI, btw).
overclockingoodness - Monday, September 26, 2005 - link
LOL, you are such an idiot. AnandTech can't be sued for libel; did you ever take Business 101? Apparently not. Please keep your mouth shut on the things you have no clue about.
DerekWilson - Monday, September 26, 2005 - link
Sorry m8, I don't know where you got your info, but regardless of the fact that the output could theoretically run at a higher resolution than the TMDS receiver supports, it doesn't matter when the product manager of CrossFire at ATI states that the output of CrossFire will be locked to 1600x1200@60Hz *because* of the single-link TMDS receiver. I'm sorry if I didn't make it completely clear that ATI could decouple their output from the receiver, but they have chosen not to.
Pete - Monday, September 26, 2005 - link
The TMDS receiver is specced to the TMDS transmitter on the slave card, so it is indirectly tied to the screen refresh. I don't recall Derek saying it limits the frame rate, but obviously you can't see more frames than screen updates, so it can potentially limit the visible framerate, too. Yes, that applies to anything with a frame rate different from its refresh rate, but XFire is fairly limited at 16x12@60Hz.
And, no, you don't double up TMDS rates with XF, as the CE (Compositing Engine) doesn't have a buffer to accommodate a refresh rate independent of the TMDS rate (which is, again, limited to 16x12@60Hz).
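If anyone wants to sanity-check the single-link limit, the arithmetic is simple. I believe the totals below are the standard VESA timings, but treat the exact figures as approximate:

# Pixel clock for a mode = total pixels per frame (active + blanking) x refresh rate.
# Single-link TMDS tops out at a 165 MHz pixel clock.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(2160, 1250, 60))  # 1600x1200@60Hz: 162.0 MHz -- just fits under 165 MHz
print(pixel_clock_mhz(2160, 1250, 75))  # 1600x1200@75Hz: 202.5 MHz -- too fast for a single link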
Dangher - Monday, September 26, 2005 - link
It is late and I can't be bothered to look for the techie article on this particular problem and why it's been blown out of proportion, but I will do it tomorrow and post a link here. My apologies to Derek if I offended him; it really is late. Link upcoming.
Pete - Monday, September 26, 2005 - link
Dangher, you won't find an article to support your claims. It was speculated (in many a forum, and possibly by Josh at Penstarsys) that AFR could double XF's single-link TMDS refresh rate or resolution by interleaving frames, but that's been ruled out, as apparently the RAMDAC must run at the TMDS engine's rate, and the CE doesn't have enough of a buffer to support RAMDAC refresh rates independent of the TMDS engine. So, I'd be surprised if you do.
And Derek won't be sued for libel unless he intentionally published false info. I'm sure much of his info came from ATI themselves, as well as hands-on experience (which shows a 16x12@60Hz limit across the review-site board).
JarredWalton - Monday, September 26, 2005 - link
I think there has been speculation about what could be done with additional low-level hardware and driver tweaks. For now, X8xx Crossfire does not appear to have any support for anything beyond 1600x1200@60Hz. That's terrible, in my opinion. I have a 9-year-old 21" CRT that can run 1600x1200@75Hz. Anyone who has the money to buy Crossfire is highly likely to have a better monitor than that. Meanwhile, my 2405FPW may only run at 60Hz, but the lack of 1920x1200 output makes X850 Crossfire a definite no.
My only hope is that ATI has spent more effort on R520 Crossfire and will manage to support at least 2048x1536@85Hz. That's about where top-quality CRTs max out, and there are far more 22" CRT owners than Apple 30" Cinema Display owners. :|
Fluppeteer - Tuesday, September 27, 2005 - link
I'm surprised that any single-link resolution isn't possible (so a digitally driven 2405FPW ought to work), but it's clear that there's a problem with CRTs. The R520's
dual-link outputs would appear to solve the problem with reasonable headroom, coincidentally supporting dual link monitors.
Dangher's post *could* make sense - by interleaving pixels one could, in theory, take
two single-link images and produce a dual-link one. But the chips aren't really set
up to render like that - it's certainly not one of the announced Crossfire modes.
It would probably also be slower than the existing modes.
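Just to illustrate the idea (and only the idea - as I say, it's not an announced mode), the interleaving would look something like this:

# Purely illustrative: merging two single-link streams, each carrying alternate
# pixel columns, into one double-width scanline. Not an actual Crossfire mode.
def interleave_columns(even_cols, odd_cols):
    merged = []
    for even_px, odd_px in zip(even_cols, odd_cols):
        merged.extend([even_px, odd_px])
    return merged

print(interleave_columns(["e0", "e1", "e2"], ["o0", "o1", "o2"]))
# -> ['e0', 'o0', 'e1', 'o1', 'e2', 'o2']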
AFAIK there's very little intelligence in the CE (or in the SLi combiner) - the
chip not producing output for the relevant bit of screen just outputs black, and
the CE/SLi combiner just ORs the values from the two heads together. There's a bit
of genlock involved, plus the DVI receiver and transmitter, but the amount of actual
logic is tiny. I could be wrong about how it works, but I don't see the need for
more (except for the multi-card antialiasing, which presumably needs some blending
support - I was a bit surprised that nVidia could retrofit this for that reason).
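A toy model of that combining step, as I understand it (my own sketch, not anything from ATI's documentation):

# Each card renders only its share of the screen and outputs black (all-zero
# pixels) elsewhere; the compositing engine just ORs the two streams together.
def or_combine(scanline_a, scanline_b):
    return [a | b for a, b in zip(scanline_a, scanline_b)]

card_a = [0xFF8040, 0xFF8040, 0x000000, 0x000000]  # renders the left half, black on the right
card_b = [0x000000, 0x000000, 0x204080, 0x204080]  # renders the right half, black on the left
print([hex(p) for p in or_combine(card_a, card_b)])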
You could do all kinds of clever things if the SLi bridge/Crossfire connection
was actually a general-purpose high bandwidth link between the two cards, but to
the best of my knowledge, it's not: it's video only, so you're limited to what
the cards can drive on their digital video outputs when it comes to displaying
the result, and uneven splitting won't help you - it's the peak rate of output
which matters, not the average throughput.
On the plus side, with enough supersampling 1280x1024 on a CRT might not look
much worse than 1600x1200 with less...
Fluppeteer - Thursday, September 29, 2005 - link
I've belatedly picked up on something. Sorry if I'm being slow, but to confirm:
The multi-card supersampling mode... is the frame from the secondary card sent
over the PCI-e bus, rather than over the Crossfire link? If so, this would
explain a large performance drop as it's implemented, but also explain how
nVidia could implement the equivalent mode without having built blending
directly into their SLi combiner in the first place (and also suggest that
the Crossfire combiner doesn't need to be clever enough to blend). It might
also explain why nVidia's implementation coincided with a bridgeless SLi
capability (once you've done the work in the driver...)
If they *do* this, there's no reason for it to be limited to 1600x1200 (or
single-link bandwidth), other than that the PCI-e bus will be limiting the
refresh at some point.
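For a rough sense of the traffic involved (my own numbers, not anything from the article):

# Approximate bandwidth needed to ship the secondary card's finished frames
# across PCI Express, vs. a nominal ~4 GB/s each way for a PCIe x16 link.
def frame_traffic_mb_per_s(width, height, fps, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * fps / 1e6

print(frame_traffic_mb_per_s(1600, 1200, 60))  # ~461 MB/s
print(frame_traffic_mb_per_s(1920, 1200, 60))  # ~553 MB/s -- well under the bus's peak,
                                               # but it competes with normal rendering traffic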
Just wondering, and curious whether I'm imagining it.
--
Fluppeteer
DerekWilson - Monday, September 26, 2005 - link
:-) I don't get offended easily. I'm certainly the first person who wants to know if I got something wrong. At the same time, it is my responsibility to get the facts across in the clearest way possible, so I'm also concerned when it doesn't seem that I have communicated them clearly enough.
Derek Wilson
erinlegault - Monday, September 26, 2005 - link
All of the reviews I've been reading today on Crossfire have been saying the same thing. Can you tell us how they are all wrong?
Leper Messiah - Monday, September 26, 2005 - link
The last table on the last page is missing; there's just a [table] tag.
DerekWilson - Monday, September 26, 2005 - link
Sorry again ... I'll drop it in in a second.
Leper Messiah - Monday, September 26, 2005 - link
If this had been released 6 months ago, it would be good. Right now, with one 7800GTX beating it in some benchies, and SLI'd GTs and GTXs thrashing it, this just doesn't cut it. Hopefully ATi has something amazing with the R520, otherwise they are heading back to the days of pre-R300.
sxr7171 - Monday, September 26, 2005 - link
Okay, I don't get this. I'm running a 24" widescreen monitor at 1920x1200@60Hz using single-link DVI. The limit for single-link DVI at 60Hz is said to be 2.6 megapixels, which is quite a bit higher than 1600x1200.
Questar - Monday, September 26, 2005 - link
That's because all the video card vendors allow higher resolutions by reducing the video blanking period. This gives the card more time to send pixel data, resulting in a higher available resolution.
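Roughly (the totals below are approximately the standard CVT and reduced-blanking timings, so treat the exact numbers as ballpark figures):

# 1920x1200@60Hz needs more than single-link DVI's 165 MHz pixel clock with
# conventional blanking, but fits comfortably once the blanking is reduced.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(2592, 1245, 60))  # conventional blanking: ~194 MHz -- over the limit
print(pixel_clock_mhz(2080, 1235, 60))  # reduced blanking: ~154 MHz -- fits single-link DVI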
Questar - Monday, September 26, 2005 - link
Crossfire cards only use 5 watts of power!
DerekWilson - Monday, September 26, 2005 - link
Again ... I apologize ... I forgot to hit the update button after I entered the power consumption numbers.
idle: 150W
load: 326W
MrSmurf - Monday, September 26, 2005 - link
That limited resolution and refresh rate is going to be the Achilles' heel of Crossfire.
bobsmith1492 - Monday, September 26, 2005 - link
I'm assuming the article is brand new and has yet to be fixed, but in case no one has noticed, the charts on page 4 show the Crossfire consuming no power. While I'm sure that would be everyone's goal, I don't think it's right somehow.
DerekWilson - Monday, September 26, 2005 - link
Very sorry -- forgot to hit update after I filled in the info.