"As TV shows transition to HD, we will likely see 1080i as the choice format due to the fact that this is the format in which most HDTV channels are broadcast (over-the-air and otherwise), 720p being the other option."
I would like to point out that 1080i has become a popular broadcast standard because of it's lower broadcast bandwidth requirements. TV shows are generally mastered on 1080p, then 1080i dubs are pulled from those masters and delivered to broadcasters (although some networks still don't work with HD at all, MTV for instance who take all deliveries on Digital Beta Cam). Pretty much the only people shooting and mastering in 1080i are live sports, some talk shows, reality TV and the evening news.
Probably 90% of TV and film related blu-rays will be 1080p.
Hint: They didn't. What anandtech isn't telling you is that NO nvidia card supports HDCP over dual-DVI, so yeah, you know that hot and fancy 30" LCD with gorgeous 2560x1600 res? You need to drop it down to 1280x800 to get it to work with an nvidia solution.
This is a very significant problem, and I for one applaud ATI for including HDCP over dual-DVI.
Does the HD video acceleration work with other programs, and with non blueray/hddvd sources? For example if I wanted to watch a h.264 encoded .mkv file would I still see the performance and image enhancements.
Well, what annoys me is that there used to be all-in-wonder video cards for this kinda stuff. I do not mind a product line that has TV tuners and HD playback codecs, but not at the expense of 3d performance.
It is a mistake for ATI and Nvidia to try to include this stuff on all video cards. The current 2XXX and 8XXX generation of video cards might not been as pathetic had the two GPU giants focused on actually making a GPU good instead of adding features that not everyone wants.
I am sure lots of people watch movies on their computer. I do not. I don't want a GPU with those features. I want a GPU that is good at playing games.
All in wonder cards are a totally different beast. The all in wonder card was simply a combination of a TV tuner card (and a rather poor one) and a normal graphics chip. The TV tuner simply records TV and has nothing to do with playback. ATI no longer sells All in wonder cards because the TV tuner card did not go obsolete quickly, while the graphics core in the AIW card went obsolete quickly, requiring the buyer to buy another expensive AIW card when only the graphics part was obsolete. A separate tuner card made so much more sense.
Playback of video is a totally different thing and the AIW cards performed exactly the same as regular video cards based on the same chip. At the time, playing video on the PC was more rare and the video playback of all cards was essentially the same because no cards offered hardware deinterlacing on their video cards. Now, video on the PC is abundant and is the new Killer App (besides graphics) which drives PC performance, storage, and internet speed. Nvidia was first to the party offering Purevideo support, which did hardware deinterlacing for DVDs and SD TV on the video card instead of in software. It was far superior to any software solution at the time (save a few diehard fans of Dscaler with IVTC) and came out at exactly the right time, with the introduction of media center and cheap TV tuner cards and HD video. Now, Purevideo 2 and AVIVO HD introduce the same high quality deinterlacing to HD video for mpeg2 (7600GT and up could do HD mpeg2 deinterlacing) as well as VC-1 and H.264 content. If you don't think this is important, remember that all new satelite HD broadcasts coming online are in 1080i h264, requiring deinterlacing to look its best, and new products are coming and exist already if you are willing to work for it, that allow you to record this content on your computer. Also, new TV series are likely to be released in 1080i on HD discs because that is their most common broadcast format. If you don't need this fine, but they sell a lot of cards to people who do.
Oh, I forgot to mention that only the video decode acceleration requires extra transistors, the deinterlacing calculations are done on the programable shaders of the cards requiring no additional hardware, just extra code in the drivers to work. The faster the video card, the better your deinterlacing, which explains why the 2400 and the 8500 cannot get perfect scores on the HQV tests. You can verify this on HD 2X00 cards by watching the GPU% in Riva Tuner while forcing different adaptive deinterlacing in CCC. This only works in XP btw.
You confirmed my suspicions all along. I always wondered if the true motive for SLI and Crossfire was to double the benefits of GPU processing rather than separate the graphics performance of 3D and video acceleration. In my eyes, I see SLI and Crossfire being a "bridge" for 3D graphics and Video accleration cards. What I am referring to is the PCIex16(16 lane) slot been for high powered 3D GPUs and the PCIex16(8 lane) slot being for video accleration GPUs.
It is obvious between the HD2900XT and the HD2600XT that one is great at rendering 3D game graphics while the other is great at acceleration motion picture movies.
Personally, this is an okay tactic by the card manufacturers. It segments the performance a little bit better. I do not game the least bit, so the high end cards are something I don't want. But, my taste are different than others that do want it. But those that desire both, can have their cake and eat it too, but using a dual PCIex16 motherboard and installing each type of card.
Overall, good article. You enlightened my purchasing decision. With all the talk about futureproofing that was going around for a while, buying a dual PCIex16 motherboard makes a lot of sense now.
I don't think you understand the point of the cards.
If you buy the 2900 and a high end processor, you will not have any problems with HD playback, that's the whole point. You don't need a 2600 to go with it. The number of people that buy something as expensive as the 2900XT and a low end processor that is incapable of playing back HD is very, very low to the point where ATI decided it was a mistake to buy it.
So, no, you wouldn't get a 2600 to go with it, you'd get a good processor and the 2900 and that's all you'd need to have the best of both worlds.
Yes, if by "best" you mean:
- Higher CPU utilization when viewing any HD video content, compared to 8800
- Generally lower price/performance in games compared to 8800
- More flaky (in my experience) drivers than 8800 (though I believe AMD might actually be better on Vista - not that I really care at this point)
Don't pat AMD on the back for skipping UVD on R600. NVIDIA didn't bother to work on VP2 for G80, and yet no one is congratulating them on the decision. I agree that the omission is not the end of the world, mostly because I don't think people running 8800/X2900 cards are really all that concerned with H.264 video. If I were looking to go Blu-ray or HD-DVD, I'd be looking at a set-top box to hook up to my HDTV.
My PC is connected to a 24" LCD that I use for work, not watching movies, and trying to put it next to the TV is more effort than it's worth. Unless H.264 suddenly makes a difference for YouTube and the like (hey - I'd love to see higher quality videos online), I can't say I'm all that concerned. Seems to me there's just a vocal minority whining about the latest features that are used by less than 10% of people.
UVD, PureVideo HD, and a partridge in a pear tree: it's all moot to me!
OK, do you understand the meaning of the word "context"?
I'm not going into the merits of Nvidia and ATI. I have used both, I consider Nvidia junk, and I do not buy them. If you have had better luck, then go with them. That's not the point, but anyone with any reading comprehension should have figured that out.
He was talking about putting a 2600 and a 2900 on the same motherboard to get the best of both worlds, meaning having all the performance of the 2900 yet getting the HD capabilities of the 2900. Do you understand that?
My point is you don't need the 2600 to get "the best of both worlds", you just need a good processor and you will not miss that feature. I think Nvidia made the right choice too. Most people are morons, and they want just because they want, and they fail to realize nothing is free. Including useless features at a cost is a bad idea, and ATI did the right thing not to, even though you'll have the idiots that think they are missing out on something. Yes, you are, you're missing out on additional cost, additional electricity use, and additional heat dissipation. You don't need it if you buy a reasonable processor for the item. That's the point. Try to understand context better, and realize what he meant by the best of both worlds.
Assuming your "good processor" falls somewhere between the two tested C2D processors, dropping UVD boosts average processor usage around 42% in Transporter2, 44% in Yozakura, and 24% in Serenity. So which uses more electricity and generates more heat - the additional transistors needed for UVD on the 2900, or moving your CPU off idle to do the work?
For someone trying to act superior, you need to take a look in the mirror (and the dictionary) for a moment. I agree it's silly to use something like a 2600 and 2900 in the same system. However, if you want the "best of both worlds", let's consider for a minute what that means:
Best (courtesy of Mirriam Webster):
1 : excelling all others
2 : most productive of good: offering or producing the greatest advantage, utility, or satisfaction
So, if you truly want the best of both worlds, what you really want is:
UVD from ATI RV630
3D from NVIDIA G80
Anything less than that is not the "best" anymore (though I'm sure some ATI fans would put R600 3D above G80 for various reasons).
Try ditching the superlatives instead of copping an attitude and constantly defending ever post you make. If you want to say that R600 with a fast CPU is more than sufficient for H.264 playback as well as providing good 3D performance, you're right. The same goes for G80. If you want to argue that it may have been difficult and not entirely necessary to cram UVD into R600, you can do that, but others will disagree.
Since they were at something like 700 million transistors, they may have been out of room. That seems very bloated (especially considering the final performance), but how many transistors are required for UVD? I'd say it was certainly possible to get UVD in there, but would the benefit be worth the cost? Given the delays, it probably was best to scrap UVD. However, the resulting product certainly isn't able to offer the best possible feature set in every area. In fact, I'd say it's second in practically every area to other GPUs (G80/86 and RV630, depending on the feature). As others have pointed out in the past, that's a lot like the NV30 launch.
quote: While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...
Dont you mean ...
quote: While the R600 based Radeon HD 2900 XT only supports the the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...
No. The 2400 and 2600 have support for Avivo HD feature set even with VC-1 decoding, while the G84 and G86 don't so their quote is correct. If a little confusing, since Avivo is ATI terminology. Nevertheless, it is basically equivalent to the NVIDIA hardware.
<blockquote>While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", <b>G84 and G86<\b> based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...<\blockquote>
Dont you mean ...
<blockquote>the features listed as "Avivo", <b>HD 2400 and HD 2600</b> based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...<\blockquote>
quote: We have to stress here that, in spite of the fact that NVIDIA and AMD expect the inclusion of video decode hardware on their low end hardware to provide significant value to end users, we absolutely cannot recommend current low end graphics card for use in systems where video decode is important. In our eyes, with the inability to provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are all only suitable for use in business class or casual computing systems where neither games nor HD video play a part in the system's purpose.
May be i am the only one who doesn't understand why would they not recommend a Geforce 8500 for Low end machine?
NVIDIA PureVideo HD still doesn't support Windows XP, correct? That would be the deciding factor for many people (instead of a noise reduction score of 15% versus 25% etc.)
this man hit the nail on the head. A couple months ago i was on the verge of buying a new video card for my htpc with h.264 acceleration, but upon learning that those features were only enabled for vista (bleh) I decided not to upgrade at all.
Any ideas as to why the HQV scores are almost totally opposite of what http://techreport.com/reviews/2007q3/radeon-hd-240...">The Techreport came up with? I'd trust AT's review more, but it seems strange that the scores are so different.
quote: Also, even on the 8600 GTS, Nvidia's noise reduction filter isn't anywhere near ready for prime-time. This routine may produce a solid score in HQV, but it introduces visible color banding during HD movie playback. AMD's algorithms quite clearly perform better.
I'm wondering if they ran with the noise filter at over 75% in their test. As Derek mentioned, higher than 75% produced banding. I also noticed that Derek used 163.x drivers, while TR used 162.x.
Honestly, I wish there was an 8600 GT/GTS with HDMI out. Would really love to avoid running two cables to my receiver.
quote: As Derek mentioned, higher than 75% produced banding.
Indeed, which makes it strange that he gave the nvidia cards 100% scores! Sure manual control on the noise filter is nice, but 100% is 100% Derek. It working badly when set above 75% makes for a less than perfect HQV score IMHO. Personally I would have gone with knocking off 5 points from the nvidia card's noise scores for this.
I would have cut points back too, but not because at 100% the image quality goes down. There's no sense in providing a slider if every position on the slider gives the same perfect image, doesn't it?
Giving a slider, however, isn't very user-friendly, from an average Joe's perspective. I want to dump my movie in the player and listen to it, and I want it to look great. I do not want to move a slider around for every movie to get a good picture quality. Makes me think about the Tracking on old VHS. Quite annoying.
From a technological POV, yes, NVidia's implementation enables players to be great. From a consumer's POV, it doesn't. I wanna listen to a movie not fine tune my player.
It's all about the drivers, people! TechReport did their review with older drivers (at least on the NVIDIA side). So in the past two weeks, NVIDIA apparently addressed some problems and AT took a look at the current results. Probably delayed the article a couple times to rerun tests as well, I bet!
As for the above comment about the slider, what you're failing to realize is that noise reduction impacts the final output. I believe Sin City used a lot of noise intentionally, so if you watch that on ATI hardware the result will NOT be what the director wanted. A slider is a bit of a pain, but then being a videophile is also a pain at times. With an imperfect format and imperfect content, we will always have to deal with imperfect solutions. I'd take NVIDIA here as well, unless/until ATI offers the ability to shut off NR.
Hi Derek,
Nice article, although I've just noticed a major omission: you didn't bench any AGP cards! There are AGP versions of the 2600 and 2400 cards and I think these are very attractive upgrades for AGP HTPC owners who are probably lacking the CPU power for full HD. The big question is whether the unidirectional AGP bus is up to the HD decode task. The previous generation ATi X1900 AGP cards reportedly had problems with HD playback.
Hopefully you'll be able to look into this, as AFAIK no-one else has yet.
Thanks so much for the insightful article. I’ve been waiting on it for about a month now, I guess. You or some reader could help me out with a couple of embellishments, if you would.
1.How much power do the ATI Radeon HD 2600 XT, Radeon HD 2600 Pro, Nvidia GeForce 6800 GTS and GeForce 6800 GT graphics cards burn?
2.Do all four of the above mentioned graphics cards provide HDCP for their DVI output? Do they provide simultaneous HDCP for dual DVI outputs?
3.Do you recommend CyberLink’s Power DVD video playing software, only?
15% cpu utilization looks great until.... you find that a e4300 takes so little power that to use 50% of it to decode is only 25 watts of power. It is nice seeing things offloaded from the cpu.... IF the video card isnt cranking up alot of heat and power.
Just my opinion, but I would save money on the Power DVD if you are buying ATI and just use theirs. Power DVD is not cheap, and I personally do not like it is much, but I am sure others do. He has to use it, of course, because how else would he be able to test Nvidia and ATI on the same software. But it's not a trivial expense, and the ATI stuff works well enough that it seems, to me, an unnecessary expense. You might be happier with spending that money on hardware instead of Power DVD. Again, all this assumes an ATI card purchase.
Choosing a Pentium 4 560 is a really strange choice, do you think there are a lot of them out there with PCI-E waiting to upgrade to one of these cards. It's a minor point, but I think a Pentium D 805 would have been an excellent choice, since a lot of people bought these and it would be a much more interesting data point, and many of them on PCI-E based motherboards.
My next point is the expectation of the 2900 XT. I totally disagree this is something they needed to add, because what they are saying is absolutely true. Someone who will buy this item will almost certainly do it with a very capable CPU. Since high end processors are dual cores, it is not as if you can not do something else if the CPU is assisting with it. It's not free, you pay for it with cost, and you pay for it with power use, and you pay for it to heat, and it's going to be a waste the vast majority of time. Considering the power use of the 2900 is appalling already, adding to this is highly undesirable considering the very questionable usefulness of it.
I think they should be congratulated for using intelligent feature targeting for their products, rather than bloating a product with useless features and making people pay for it.
Clearly, the point was to get a single-core point of reference. While admittedly that exact CPU would be a slightly rare case, it's a simple matter to benchmark it since it fits the same 775 mainboard as the two Core2 chips. A PD805 wouldn't be much use to compare, as it would simply be a bit slower than the E4300... so what? The P4 560 makes a reasonable proxy for the variety of good performing single-core P4's and Athlon64's out there, while the E4300 stands in for all the X2's.
The Pentium D 805 is a very popular chip and widely used, and represents an entirely different architecture. It would be an extremely valid data point because it's a popular item. It's not "a little slower", it has completely different performance characteristics.
A Pentium 560 owner will probably never buy this card, and many of these owners are not even on a PCI-E platform. I wouldn't even have had a problem if they sold a single core Sempron, but a Pentium 560 makes no sense at all. People are still buying the 805, in fact, and you don't think the idea of popping one of these cards with an 805, while waiting for the Penryn to come out, is not something people think about? Or a similar Pentium D? Except, they'll not know how it performs. Luckily, though, they'll know how the Pentium 560 performs, because, I'm sure, that is their next choice.
Seeing as this is an article concerning media decoding with an emphasis towards HD media playback, shouldn't Anandtech be applying some pressure on Nvidia to support open drivers for linux? mythTV and XBMC are promising HTPC options, perfectly suited towards this test scenario.
Why should h.264 offloading be exclusive to users of Microsoft operating systems?
Linux doesn't have a framework to support H.264 or VC-1 acceleration yet. When that happens, I would expect the binary drivers to catch up fairly quickly.
Actually, it does. The problem is that it is open source, while the MS equivalent is closed. ATI/NVIDIA don't want to share their specs in an open manner and never came up with a suitable API to make public.
Well, gstreamer allows for closed source plug-ins since it's licensed under LGPL. Fluendo has already implemented a lot of proprietary (patented) codecs in gstreamer. With the required features exposed through the driver, it shouldn't be too hard for the IHVs to do the same with hardware accelerated H.264/VC-1.
Most drivers only support it with MPEG-2, but that doesn't mean it isn't capable of more. Looking again, I'm a little unclear about how much work would be required to get it working. I'm not sure if it is completely done and just requires support from the hardware vendors or if it also needs some additional work before that happens.
Hi, it would be really interesting to see similar tests done in Linux also
For example how cheap of a HTPC rig can you build, with free software too, and still provide betters features than any of the commercial solutions.
I think we are many that have some old hardware laying around. And when seeing this article it brings up ideas. Pairing the old computer with a (AGP?) ATI 2600 card would provide an ideal solution in a nice HTPC chassi under the TV perhaps?
However a HTPC can still be built to be a player for satellite data for example, granted configuring all that up with a subscription card will not be for the faint of heart. But then again the Dreambox 8000 is not available yet, only a new decoder from Kathrein UFS910 with no decent software (yet)
good review. However, based on a review of the german written magazine C't I have some suggestions and additions:
PowerDVD patch 2911, Catalyst 7.6, Nvidia 158.24
- the Geforce G84/85 miss not only VC-1 but also MPEG-2 bitstream processing.
- the HD 2400 does not have MPEG-2 bitstream processing, frequency transform and pixel prediction or it is not activated.
- A single core Athlon is significantly worse than a single core Pentium IV. The reson is AACS. Decryption puts a hudge load on the CPU and is optimized for Intel CPUs (9%->39% H.264, Pentium IV, Casino Royale). Perhaps later patches made the situation better (like your Yozakura shows?)
- VC-1 on the Radeons and Geforces showed picture distortions, but based on your review this seems to be fixed now
Combinations of Athlon 3500+, X2 6000+, Pentium IV 3,2 GHz, Pentium E2160 and HD 2400/2600, Geforce 8600 GTS which resulted in lagging in MPEG-2 or VC-1 or H.264
3500+ + 690G/2400/2600/8600
6000+ + 690G
Pentium IV + 8600
Why run with older drivers? If these features are important to you, you will need to stay on top of the driver game. Would have been interesting to see AMD chips in there, but then that would require a different motherboard as well. I think the use of a P4 560 was perfectly acceptable - it's a low-end CPU and if it can handle playback with the 2600/8600 then Athlons will be fine as well.
but, while i usually think anandtech conclusions are insightful and spot on,
it seems odd not to give props to the 2600xt which dominated the benchmarks.
for the occasional gamer who often likes watching videos, it seems the 2600xt is a great choice, better than the 8600gts.
for example for VC1, on a low end c2duo the difference between 7% and 19.2% matters, esp if the person likes watching a video while working or browsing or whatever...
can amd add noise reduction options later w/ a driver update?
quote: for example for VC1, on a low end c2duo the difference between 7% and 19.2% matters, esp if the person likes watching a video while working or browsing or whatever...
How can that matter? Even in worst case you have 80% of idle CPU time.
Besides, how can you "work" while watching video at the same time? And don't try to tell me that a web browser takes over 80% of CPU time with Core2 Duo system...
Wozza - Monday, March 17, 2008 - link
"As TV shows transition to HD, we will likely see 1080i as the choice format due to the fact that this is the format in which most HDTV channels are broadcast (over-the-air and otherwise), 720p being the other option."I would like to point out that 1080i has become a popular broadcast standard because of it's lower broadcast bandwidth requirements. TV shows are generally mastered on 1080p, then 1080i dubs are pulled from those masters and delivered to broadcasters (although some networks still don't work with HD at all, MTV for instance who take all deliveries on Digital Beta Cam). Pretty much the only people shooting and mastering in 1080i are live sports, some talk shows, reality TV and the evening news.
Probably 90% of TV and film related blu-rays will be 1080p.
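As a rough illustration of the bandwidth point above, the sketch below compares the raw (uncompressed) pixel rates of the common HD formats. Actual broadcast bitrates depend on the MPEG-2/H.264 encoder, so the numbers are only indicative; the point is that interlacing halves the pixel rate relative to progressive scan at the same refresh rate.

```python
# Raw pixel-rate comparison of common HD formats (illustrative only; real
# broadcast bitrates depend heavily on MPEG-2/H.264 encoder settings).
# 1080i60 carries 60 half-height (1920x540) fields per second, so its raw
# pixel rate is half that of 1080p60.
formats = {
    "720p60":  1280 * 720 * 60,    # progressive, 60 full frames per second
    "1080i60": 1920 * 540 * 60,    # interlaced, 60 half-height fields per second
    "1080p60": 1920 * 1080 * 60,   # progressive, 60 full frames per second
}

for name, pixels_per_sec in sorted(formats.items()):
    print("%-8s %6.1f Mpixels/s" % (name, pixels_per_sec / 1e6))
```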
redpriest_ - Monday, July 23, 2007 - link
Hint: They didn't. What AnandTech isn't telling you is that NO NVIDIA card supports HDCP over dual-link DVI. So yeah, you know that hot and fancy 30" LCD with its gorgeous 2560x1600 resolution? You need to drop it down to 1280x800 to get protected content working with an NVIDIA solution.

This is a very significant problem, and I for one applaud ATI for including HDCP over dual-link DVI.
DigitalFreak - Wednesday, July 25, 2007 - link
Pwnd!
defter - Tuesday, July 24, 2007 - link
You are wrong. Check Anand's 8600 review; it clearly states that the 8600/8500 cards support HDCP over dual-link DVI.
DigitalFreak - Monday, July 23, 2007 - link
http://guru3d.com/article/Videocards/443/5/
Chadder007 - Monday, July 23, 2007 - link
I see the ATI cards' lower CPU usage, but how do the power readings look when the GPU is doing the work compared to the CPU?
chris92314 - Monday, July 23, 2007 - link
Does the HD video acceleration work with other programs, and with non-Blu-ray/HD DVD sources? For example, if I wanted to watch an H.264-encoded .mkv file, would I still see the performance and image enhancements?
GPett - Monday, July 23, 2007 - link
Well, what annoys me is that there used to be All-in-Wonder video cards for this kind of stuff. I do not mind a product line that has TV tuners and HD playback codecs, but not at the expense of 3D performance.

It is a mistake for ATI and NVIDIA to try to include this stuff on all video cards. The current 2xxx and 8xxx generations of video cards might not have been as pathetic had the two GPU giants focused on actually making a good GPU instead of adding features that not everyone wants.
I am sure lots of people watch movies on their computer. I do not. I don't want a GPU with those features. I want a GPU that is good at playing games.
autoboy - Wednesday, July 25, 2007 - link
All-in-Wonder cards are a totally different beast. The All-in-Wonder card was simply a combination of a TV tuner card (and a rather poor one) and a normal graphics chip. The TV tuner simply records TV and has nothing to do with playback. ATI no longer sells All-in-Wonder cards because the TV tuner did not go obsolete quickly, while the graphics core in the AIW card did, requiring the buyer to purchase another expensive AIW card when only the graphics part was obsolete. A separate tuner card made much more sense.

Playback of video is a totally different thing, and the AIW cards performed exactly the same as regular video cards based on the same chip. At the time, playing video on the PC was rarer, and video playback was essentially the same on all cards because none offered hardware deinterlacing. Now video on the PC is abundant and is the new killer app (besides graphics) driving PC performance, storage, and internet speed. NVIDIA was first to the party with PureVideo, which did hardware deinterlacing for DVDs and SD TV on the video card instead of in software. It was far superior to any software solution at the time (save a few diehard fans of DScaler with IVTC) and came out at exactly the right time, alongside the introduction of Media Center, cheap TV tuner cards, and HD video. Now PureVideo 2 and Avivo HD bring the same high-quality deinterlacing to HD video for MPEG-2 (the 7600 GT and up could do HD MPEG-2 deinterlacing) as well as VC-1 and H.264 content. If you don't think this is important, remember that all the new satellite HD broadcasts coming online are 1080i H.264, which requires deinterlacing to look its best, and products already exist (if you are willing to work for it) that let you record this content on your computer. Also, new TV series are likely to be released in 1080i on HD discs because that is their most common broadcast format. If you don't need this, fine, but they sell a lot of cards to people who do.
autoboy - Wednesday, July 25, 2007 - link
Oh, I forgot to mention that only the video decode acceleration requires extra transistors; the deinterlacing calculations are done on the programmable shaders of the cards, requiring no additional hardware, just extra code in the drivers. The faster the video card, the better your deinterlacing, which explains why the 2400 and the 8500 cannot get perfect scores on the HQV tests. You can verify this on HD 2x00 cards by watching the GPU % in RivaTuner while forcing different adaptive deinterlacing settings in CCC. This only works in XP, by the way.
DigitalFreak - Monday, July 23, 2007 - link
Based on Derek's results, I ordered the parts for my new HTPC:

C2D E6850 (Newegg are bastards for pricing this at $325, but I didn't want to wait)
Intel P35 board
2GB PC2-800
Gigabyte 8600GTS (passive cooling)
Wished there was an 8600 board with HDMI out, but oh well...
SunAngel - Monday, July 23, 2007 - link
You confirmed my suspicions all along. I always wondered if the true motive for SLI and CrossFire was to double the benefits of GPU processing rather than to separate the graphics performance of 3D and video acceleration. In my eyes, SLI and CrossFire are a "bridge" for 3D graphics and video acceleration cards. What I am referring to is the PCIe x16 (16-lane) slot being for high-powered 3D GPUs and the PCIe x16 (8-lane) slot being for video acceleration GPUs.

It is obvious, between the HD 2900 XT and the HD 2600 XT, that one is great at rendering 3D game graphics while the other is great at accelerating motion picture playback.
Personally, this is an okay tactic by the card manufacturers. It segments the performance a little bit better. I do not game the least bit, so the high-end cards are something I don't want. But my tastes are different from those of others who do want them. Those that desire both can have their cake and eat it too by using a dual PCIe x16 motherboard and installing one card of each type.
Overall, good article. You've informed my purchasing decision. With all the talk about future-proofing that was going around for a while, buying a dual PCIe x16 motherboard makes a lot of sense now.
TA152H - Monday, July 23, 2007 - link
I don't think you understand the point of the cards.

If you buy the 2900 and a high-end processor, you will not have any problems with HD playback; that's the whole point. You don't need a 2600 to go with it. The number of people who would buy something as expensive as the 2900 XT and pair it with a low-end processor that is incapable of playing back HD is so low that ATI decided catering to them was a mistake.

So, no, you wouldn't get a 2600 to go with it; you'd get a good processor and the 2900, and that's all you'd need to have the best of both worlds.
Chunga29 - Monday, July 23, 2007 - link
Yes, if by "best" you mean:- Higher CPU utilization when viewing any HD video content, compared to 8800
- Generally lower price/performance in games compared to 8800
- More flaky (in my experience) drivers than 8800 (though I believe AMD might actually be better on Vista - not that I really care at this point)
Don't pat AMD on the back for skipping UVD on R600. NVIDIA didn't bother to work on VP2 for G80, and yet no one is congratulating them on the decision. I agree that the omission is not the end of the world, mostly because I don't think people running 8800/X2900 cards are really all that concerned with H.264 video. If I were looking to go Blu-ray or HD-DVD, I'd be looking at a set-top box to hook up to my HDTV.
My PC is connected to a 24" LCD that I use for work, not watching movies, and trying to put it next to the TV is more effort than it's worth. Unless H.264 suddenly makes a difference for YouTube and the like (hey - I'd love to see higher quality videos online), I can't say I'm all that concerned. Seems to me there's just a vocal minority whining about the latest features that are used by less than 10% of people.
UVD, PureVideo HD, and a partridge in a pear tree: it's all moot to me!
TA152H - Tuesday, July 24, 2007 - link
OK, do you understand the meaning of the word "context"?

I'm not going into the merits of NVIDIA and ATI. I have used both, I consider NVIDIA junk, and I do not buy them. If you have had better luck, then go with them. That's not the point, but anyone with any reading comprehension should have figured that out.
He was talking about putting a 2600 and a 2900 on the same motherboard to get the best of both worlds, meaning having all the performance of the 2900 yet getting the HD capabilities of the 2900. Do you understand that?
My point is you don't need the 2600 to get "the best of both worlds"; you just need a good processor and you will not miss that feature. I think NVIDIA made the right choice too. Most people are morons: they want features just because they want them, and they fail to realize nothing is free. Including useless features at a cost is a bad idea, and ATI did the right thing not to, even though you'll have the idiots who think they are missing out on something. Yes, you are: you're missing out on additional cost, additional electricity use, and additional heat dissipation. You don't need it if you buy a reasonable processor for the item. That's the point. Try to understand context better, and realize what he meant by the best of both worlds.
strikeback03 - Wednesday, July 25, 2007 - link
Assuming your "good processor" falls somewhere between the two tested C2D processors, dropping UVD boosts average processor usage around 42% in Transporter2, 44% in Yozakura, and 24% in Serenity. So which uses more electricity and generates more heat - the additional transistors needed for UVD on the 2900, or moving your CPU off idle to do the work?TA152H - Tuesday, July 24, 2007 - link
Previous post should have said "HD capability of the 2600".
Chunga29 - Tuesday, July 24, 2007 - link
For someone trying to act superior, you need to take a look in the mirror (and the dictionary) for a moment. I agree it's silly to use something like a 2600 and a 2900 in the same system. However, if you want the "best of both worlds", let's consider for a minute what that means:

Best (courtesy of Merriam-Webster):
1 : excelling all others
2 : most productive of good: offering or producing the greatest advantage, utility, or satisfaction
So, if you truly want the best of both worlds, what you really want is:
UVD from ATI RV630
3D from NVIDIA G80
Anything less than that is not the "best" anymore (though I'm sure some ATI fans would put R600 3D above G80 for various reasons).
Try ditching the superlatives instead of copping an attitude and constantly defending every post you make. If you want to say that R600 with a fast CPU is more than sufficient for H.264 playback as well as providing good 3D performance, you're right. The same goes for G80. If you want to argue that it may have been difficult and not entirely necessary to cram UVD into R600, you can do that, but others will disagree.
Since they were at something like 700 million transistors, they may have been out of room. That seems very bloated (especially considering the final performance), but how many transistors are required for UVD? I'd say it was certainly possible to get UVD in there, but would the benefit be worth the cost? Given the delays, it probably was best to scrap UVD. However, the resulting product certainly isn't able to offer the best possible feature set in every area. In fact, I'd say it's second in practically every area to other GPUs (G80/86 and RV630, depending on the feature). As others have pointed out in the past, that's a lot like the NV30 launch.
autoboy - Monday, July 23, 2007 - link
What??
scosta - Monday, July 23, 2007 - link
I think this sentence on page 1 is wrong:

quote: While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...

Don't you mean ...

quote: While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ...
Regards
smitty3268 - Monday, July 23, 2007 - link
No. The 2400 and 2600 support the Avivo HD feature set even for VC-1 decoding, while the G84 and G86 don't, so the quote is correct, if a little confusing, since Avivo is ATI terminology. Nevertheless, it is basically equivalent to the NVIDIA hardware.
iwodo - Monday, July 23, 2007 - link
quote: We have to stress here that, in spite of the fact that NVIDIA and AMD expect the inclusion of video decode hardware on their low end hardware to provide significant value to end users, we absolutely cannot recommend current low end graphics card for use in systems where video decode is important. In our eyes, with the inability to provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lower end hardware are all only suitable for use in business class or casual computing systems where neither games nor HD video play a part in the system's purpose.

Maybe I am the only one who doesn't understand: why would they not recommend a GeForce 8500 for a low-end machine?
Chunga29 - Monday, July 23, 2007 - link
I believe it was mentioned that the NVIDIA 8500 drivers are not currently working with PureVideo HD.
ssiu - Monday, July 23, 2007 - link
NVIDIA PureVideo HD still doesn't support Windows XP, correct? That would be the deciding factor for many people (instead of a noise reduction score of 15% versus 25%, etc.)
legoman666 - Monday, July 23, 2007 - link
This man hit the nail on the head. A couple of months ago I was on the verge of buying a new video card for my HTPC with H.264 acceleration, but upon learning that those features were only enabled for Vista (bleh), I decided not to upgrade at all.
DigitalFreak - Monday, July 23, 2007 - link
Any ideas as to why the HQV scores are almost totally opposite of what The Tech Report (http://techreport.com/reviews/2007q3/radeon-hd-240...) came up with? I'd trust AT's review more, but it seems strange that the scores are so different.
phusg - Monday, July 23, 2007 - link
Yes, very interesting! FTA:

quote: Also, even on the 8600 GTS, Nvidia's noise reduction filter isn't anywhere near ready for prime-time. This routine may produce a solid score in HQV, but it introduces visible color banding during HD movie playback. AMD's algorithms quite clearly perform better.

DigitalFreak - Monday, July 23, 2007 - link
I'm wondering if they ran with the noise filter at over 75% in their test. As Derek mentioned, higher than 75% produced banding. I also noticed that Derek used 163.x drivers, while TR used 162.x.

Honestly, I wish there was an 8600 GT/GTS with HDMI out. Would really love to avoid running two cables to my receiver.
Gary Key - Monday, July 23, 2007 - link
There will be in about 60 days, hardware is sampling now. ;)
bpt8056 - Monday, July 23, 2007 - link
Does it have HDMI 1.3??
phusg - Monday, July 23, 2007 - link
quote: As Derek mentioned, higher than 75% produced banding.

Indeed, which makes it strange that he gave the NVIDIA cards 100% scores! Sure, manual control over the noise filter is nice, but 100% is 100%, Derek. It working badly when set above 75% makes for a less than perfect HQV score, IMHO. Personally, I would have gone with knocking 5 points off the NVIDIA cards' noise scores for this.
Scrogneugneu - Monday, July 23, 2007 - link
I would have cut points back too, but not because the image quality goes down at 100%. There's no sense in providing a slider if every position on the slider gives the same perfect image, is there?

Providing a slider, however, isn't very user-friendly from an average Joe's perspective. I want to drop my movie in the player and watch it, and I want it to look great. I do not want to move a slider around for every movie to get good picture quality. It makes me think of the tracking adjustment on old VHS decks. Quite annoying.

From a technological POV, yes, NVIDIA's implementation enables playback to be great. From a consumer's POV, it doesn't. I want to watch a movie, not fine-tune my player.
Chunga29 - Monday, July 23, 2007 - link
It's all about the drivers, people! TechReport did their review with older drivers (at least on the NVIDIA side). So in the past two weeks, NVIDIA apparently addressed some problems and AT took a look at the current results. Probably delayed the article a couple times to rerun tests as well, I bet!

As for the above comment about the slider, what you're failing to realize is that noise reduction impacts the final output. I believe Sin City used a lot of noise intentionally, so if you watch that on ATI hardware the result will NOT be what the director wanted. A slider is a bit of a pain, but then being a videophile is also a pain at times. With an imperfect format and imperfect content, we will always have to deal with imperfect solutions. I'd take NVIDIA here as well, unless/until ATI offers the ability to shut off NR.
phusg - Monday, July 23, 2007 - link
Hi Derek,

Nice article, although I've just noticed a major omission: you didn't bench any AGP cards! There are AGP versions of the 2600 and 2400 cards and I think these are very attractive upgrades for AGP HTPC owners who are probably lacking the CPU power for full HD. The big question is whether the unidirectional AGP bus is up to the HD decode task. The previous generation ATI X1900 AGP cards reportedly had problems with HD playback.
Hopefully you'll be able to look into this, as AFAIK no-one else has yet.
Regards, Pete
ericeash - Monday, July 23, 2007 - link
I would really like to see these tests done on an AMD X2 processor. The Core 2 Duos don't need as much offloading as we do.
Orville - Monday, July 23, 2007 - link
Derek,

Thanks so much for the insightful article. I've been waiting on it for about a month now, I guess. You or some reader could help me out with a couple of embellishments, if you would.
1. How much power do the ATI Radeon HD 2600 XT, Radeon HD 2600 Pro, NVIDIA GeForce 8600 GTS, and GeForce 8600 GT graphics cards burn?
2. Do all four of the above-mentioned graphics cards provide HDCP for their DVI output? Do they provide simultaneous HDCP for dual DVI outputs?
3. Do you recommend only CyberLink's PowerDVD video playback software?
Regards,
Orville
DerekWilson - Monday, July 23, 2007 - link
We'll add power numbers tonight ... sorry for the omission.

All had HDCP support; not all had HDCP over dual-link DVI support.

PowerDVD and WinDVD are good solutions, but PowerDVD is currently further along. We don't recommend it exclusively, but it is a good solution.
phusg - Wednesday, July 25, 2007 - link
I still can't see them; have they been added? Thanks.
GlassHouse69 - Monday, July 23, 2007 - link
I agree here, good points.

15% CPU utilization looks great until... you find that an E4300 takes so little power that using 50% of it to decode is only about 25 watts. It is nice seeing things offloaded from the CPU... IF the video card isn't cranking out a lot of heat and power itself.
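For what it's worth, here is a quick back-of-the-envelope sketch of that tradeoff. The wattage figures below are assumptions chosen for illustration (not measurements from the article), so treat the output as a sanity check of the reasoning rather than real data.

```python
# Back-of-the-envelope estimate of the CPU power cost of software HD decode.
# All wattages are assumed, illustrative values -- not measured numbers.
cpu_active_power_w = 50.0      # assumed load-minus-idle draw of an E4300-class CPU
decode_cpu_fraction = 0.50     # share of the CPU spent decoding (from the comment above)

extra_cpu_watts = cpu_active_power_w * decode_cpu_fraction
print("Approx. extra CPU power for software decode: %.1f W" % extra_cpu_watts)

# Compare against an assumed figure for a GPU's dedicated decode block,
# which is what the offload question really hinges on.
gpu_decode_block_watts = 5.0   # assumed; dedicated decode logic is relatively small
print("Net difference if the GPU block draws ~%.0f W: %.1f W"
      % (gpu_decode_block_watts, extra_cpu_watts - gpu_decode_block_watts))
```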
TA152H - Monday, July 23, 2007 - link
Just my opinion, but I would save the money on PowerDVD if you are buying ATI and just use theirs. PowerDVD is not cheap, and I personally do not like it as much, but I am sure others do. He has to use it, of course, because how else would he be able to test NVIDIA and ATI on the same software? But it's not a trivial expense, and the ATI software works well enough that it seems, to me, an unnecessary expense. You might be happier spending that money on hardware instead of PowerDVD. Again, all this assumes an ATI card purchase.
phusg - Monday, July 23, 2007 - link
Good questions. From what I've seen, the 2600 Pro is the least power-hungry card at under 50W. Any chance you could shed some light, Derek?
TA152H - Monday, July 23, 2007 - link
Choosing a Pentium 4 560 is a really strange choice; do you think there are a lot of them out there with PCI-E waiting to upgrade to one of these cards? It's a minor point, but I think a Pentium D 805 would have been an excellent choice, since a lot of people bought these, many of them on PCI-E based motherboards, and it would be a much more interesting data point.

My next point is the expectation placed on the 2900 XT. I totally disagree that this is something they needed to add, because what they are saying is absolutely true. Someone who buys this item will almost certainly do so with a very capable CPU. Since high-end processors are dual cores, it is not as if you cannot do something else while the CPU assists with decoding. It's not free: you pay for it with cost, you pay for it with power use, you pay for it with heat, and it's going to be wasted the vast majority of the time. Considering the power use of the 2900 is appalling already, adding to it is highly undesirable given the very questionable usefulness.
I think they should be congratulated for using intelligent feature targeting for their products, rather than bloating a product with useless features and making people pay for it.
johnsonx - Tuesday, July 24, 2007 - link
Clearly, the point was to get a single-core point of reference. While admittedly that exact CPU would be a slightly rare case, it's a simple matter to benchmark it since it fits the same Socket 775 mainboard as the two Core 2 chips. A PD 805 wouldn't be much use to compare, as it would simply be a bit slower than the E4300... so what? The P4 560 makes a reasonable proxy for the variety of good-performing single-core P4s and Athlon 64s out there, while the E4300 stands in for all the X2s.
TA152H - Tuesday, July 24, 2007 - link
Are you crazy?

The Pentium D 805 is a very popular chip, widely used, and it represents an entirely different architecture. It would be an extremely valid data point because it's a popular item. It's not "a little slower"; it has completely different performance characteristics.
A Pentium 560 owner will probably never buy this card, and many of these owners are not even on a PCI-E platform. I wouldn't even have had a problem if they had used a single-core Sempron, but a Pentium 560 makes no sense at all. People are still buying the 805, in fact, and you don't think popping one of these cards in with an 805 while waiting for Penryn to come out is something people think about? Or with a similar Pentium D? Except they won't know how it performs. Luckily, though, they'll know how the Pentium 560 performs, because, I'm sure, that is their next choice.
100proof - Monday, July 23, 2007 - link
Derek,

Seeing as this is an article concerning media decoding with an emphasis on HD media playback, shouldn't AnandTech be applying some pressure on NVIDIA to support open drivers for Linux? MythTV and XBMC are promising HTPC options, perfectly suited to this test scenario.

Why should H.264 offloading be exclusive to users of Microsoft operating systems?
100proof - Monday, July 23, 2007 - link
This complaint applies to ATI/AMD as well.
erwos - Monday, July 23, 2007 - link
Linux doesn't have a framework to support H.264 or VC-1 acceleration yet. When that happens, I would expect the binary drivers to catch up fairly quickly.
smitty3268 - Monday, July 23, 2007 - link
Actually, it does. The problem is that it is open source, while the MS equivalent is closed. ATI/NVIDIA don't want to share their specs in an open manner and never came up with a suitable API to make public.
wien - Monday, July 23, 2007 - link
Well, GStreamer allows for closed-source plug-ins since it's licensed under the LGPL. Fluendo has already implemented a lot of proprietary (patented) codecs in GStreamer. With the required features exposed through the driver, it shouldn't be too hard for the IHVs to do the same with hardware-accelerated H.264/VC-1.

It's probably not worth their time yet though...
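To make that concrete, below is a minimal GStreamer 0.10 playback sketch using the Python bindings. The relevant property is that decodebin auto-plugs the highest-ranked decoder element installed on the system, so a closed-source, hardware-accelerated H.264 plugin from an IHV could be picked up without any application changes. This is only a sketch under that assumption: no such vendor element is implied to exist, and sample.mkv is just a placeholder file name.

```python
#!/usr/bin/env python
# Minimal GStreamer 0.10 playback pipeline (pygst). decodebin automatically
# selects the highest-ranked decoder plugin available, which is how a
# closed-source hardware H.264 decoder element could slot in transparently
# if an IHV shipped one. "sample.mkv" is a placeholder path.
import gobject
import pygst
pygst.require("0.10")
import gst

gobject.threads_init()
pipeline = gst.parse_launch(
    "filesrc location=sample.mkv ! decodebin ! ffmpegcolorspace ! autovideosink")
pipeline.set_state(gst.STATE_PLAYING)

# Keep the process alive while the pipeline runs.
gobject.MainLoop().run()
```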
erwos - Monday, July 23, 2007 - link
Does it? Because I thought that was only for MPEG-2. Link?
smitty3268 - Monday, July 23, 2007 - link
Most drivers only support it with MPEG-2, but that doesn't mean it isn't capable of more. Looking again, I'm a little unclear about how much work would be required to get it working. I'm not sure if it is completely done and just requires support from the hardware vendors or if it also needs some additional work before that happens.

http://www.mythtv.org/wiki/index.php/XvMC
http://en.wikipedia.org/wiki/X-Video_Motion_Compen...
Per Hansson - Monday, July 23, 2007 - link
Hi, it would be really interesting to see similar tests done on Linux also. For example, how cheap an HTPC rig can you build, with free software too, that still provides better features than any of the commercial solutions?

I think many of us have some old hardware lying around, and seeing this article brings up ideas. Pairing an old computer with an (AGP?) ATI 2600 card would provide an ideal solution in a nice HTPC chassis under the TV, perhaps?
jojo4u - Monday, July 23, 2007 - link
Linux is not practical. You would have to crack AACS and dump the disc first.
Per Hansson - Monday, July 23, 2007 - link
Hmm, I did not realize that.

However, an HTPC can still be built to be a player for satellite data, for example, granted that configuring all that with a subscription card will not be for the faint of heart. But then again, the Dreambox 8000 is not available yet; the only new decoder is the Kathrein UFS910, with no decent software (yet).
jojo4u - Monday, July 23, 2007 - link
Hi Derek,

Good review. However, based on a review in the German magazine c't (test setup: PowerDVD patch 2911, Catalyst 7.6, NVIDIA 158.24), I have some suggestions and additions:

- The GeForce G84/G86 lack not only VC-1 but also MPEG-2 bitstream processing.
- The HD 2400 does not have MPEG-2 bitstream processing, frequency transform, or pixel prediction, or else it is not activated.
- A single-core Athlon is significantly worse off than a single-core Pentium 4. The reason is AACS: decryption puts a huge load on the CPU and is optimized for Intel CPUs (9% -> 39% for H.264, Pentium 4, Casino Royale). Perhaps later patches improved the situation (as your Yozakura results suggest?).
- VC-1 on the Radeons and GeForces showed picture distortions, but based on your review this seems to be fixed now.

Combinations of an Athlon 3500+, X2 6000+, Pentium 4 3.2 GHz, or Pentium E2160 with a 690G, HD 2400/2600, or GeForce 8600 GTS that resulted in lagging in MPEG-2, VC-1, or H.264:

3500+ + 690G/2400/2600/8600
6000+ + 690G
Pentium 4 + 8600
Chunga29 - Monday, July 23, 2007 - link
Why run with older drivers? If these features are important to you, you will need to stay on top of the driver game. Would have been interesting to see AMD chips in there, but then that would require a different motherboard as well. I think the use of a P4 560 was perfectly acceptable - it's a low-end CPU, and if it can handle playback with the 2600/8600 then Athlons will be fine as well.
8steve8 - Monday, July 23, 2007 - link
Nice article. But while I usually think AnandTech's conclusions are insightful and spot-on, it seems odd not to give props to the 2600 XT, which dominated the benchmarks.

For the occasional gamer who often likes watching videos, the 2600 XT seems like a great choice, better than the 8600 GTS. For example, for VC-1 on a low-end Core 2 Duo, the difference between 7% and 19.2% matters, especially if the person likes watching a video while working or browsing or whatever...

Can AMD add noise reduction options later with a driver update?
defter - Tuesday, July 24, 2007 - link
quote: For example, for VC-1 on a low-end Core 2 Duo, the difference between 7% and 19.2% matters, especially if the person likes watching a video while working or browsing or whatever...

How can that matter? Even in the worst case you have 80% of your CPU time idle.

Besides, how can you "work" while watching video at the same time? And don't try to tell me that a web browser takes over 80% of the CPU time on a Core 2 Duo system...
drebo - Monday, July 23, 2007 - link
We all know why this is.
I'll give you a hint: look at the overwhelming presence of Intel advertising on this site.
It doesn't take a genius to figure it out. That's why I don't take the video and CPU reviews on this site seriously anymore.
DigitalFreak - Monday, July 23, 2007 - link
/ignore
drebo - Monday, July 23, 2007 - link
Willful ignorance, rose-colored glasses, selective blindness, et cetera.

You have fun with that.
johnsonx - Tuesday, July 24, 2007 - link
Please go back to your fanboi sites then and don't bother us here.