Maybe I'm the opposite of most people here, but I'm glad ATI/AMD and Nvidia both produced mid-range cards that suck. Maybe we will finally get the game developers to slow down and produce tighter code, or not waste GPU/CPU cycles on eye-candy and actually produce better gameplay. While I understand that most game companies write games that play acceptably on the $400 flagship video cards, I for one am not one of those people. It's not that I can't afford to buy a $400 card once in a while - it's having to spend that every year that ticks me off. I'd much rather upgrade my card every year to keep up with the times if said card was $120.
Most game developers already do that. If you don't have the power to run the shaders and enable d3d10 features you can run in d3d9 mode. If your card still doesn't have the power for that you can run in pixel shader 1 mode.
Take a game like Half Life 2 for example. Turn everything up and it was too much for most high end cards when it shipped. But you can turn it down so it looks like a typical d3d8 or d3d7 era game and play it just fine on your old hardware.
If you're hoping that developers can somehow make things run just as well on a Pentium III as on a Core 2 Duo, you're hoping for the impossible. The 2600 only has about 1/3 the processing power of a 2900. The 2400 has about 10% of the power! Think about underclocking your CPU to 10% speed and seeing how your applications run ;)
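For a rough sanity check of those ratios, here is a back-of-envelope sketch using the commonly quoted stream processor counts and core clocks for these parts; the spec figures are assumptions pulled from memory rather than measurements, and raw shader rate is only one factor in real performance.

```python
# Back-of-envelope shader throughput: stream processors x core clock (MHz).
# The spec figures below are the commonly quoted ones and are assumptions
# here; memory bandwidth, ROPs, texturing, and drivers all matter too.
cards = {
    "HD 2900 XT": (320, 742),
    "HD 2600 XT": (120, 800),
    "HD 2400 XT": (40, 700),
}

baseline = cards["HD 2900 XT"][0] * cards["HD 2900 XT"][1]
for name, (sps, mhz) in cards.items():
    print(f"{name}: {sps * mhz / baseline:.0%} of the 2900 XT's raw shader rate")
# Roughly 100%, 40%, and 12%, which lands in the same ballpark as the
# "about 1/3" and "about 10%" figures above.
```

The gap in games can be even wider than the raw numbers suggest, since the smaller parts also give up memory bus width.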
I would like to apologize for not catching this before publication, but the 8600 GTS we used had a slight overclock, resulting in performance about 5% higher across the board than we should have seen.
We have re-run the tests on our stock clocked 8600 GTS and edited all the graphs in our article to reflect the changes. The overall results were not significantly affected, but we are very interested in being as fair and accurate as possible.
We have also added idle and load power numbers to The Test page.
Again, I am very sorry for the error, and I will do my best to make sure this doesn't happen again.
Meh, thanks Derek, but if you already have factory overclocked results it would be nice to leave them in, as they are fair game if Nvidia's partners are selling them in those configurations. This is of course in addition to the Nvidia reference clock rates.
Yeah, but you have all kinds of GPUs on the chart from many different price points anyway; the 7950 GT is not close to $150 either, and neither is the 8800 GTS 320.
I think people would be quite aware that the factory OC cards, if included, are indeed priced higher, but if you already have the results, leave them in alongside the Nvidia reference clock designs.
For God's sake, isn't it obvious by now that running DX10 games on these cards will result in LOWER performance, not HIGHER? If you are averaging 30fps @ 1280x1024 in DX9 games, it's only gonna get worse in DX10!
To TA152H:
Hi. It should've been painfully obvious about 10 comments ago that nobody here agrees with or, well, understands anything you're saying. Can you please stop commenting on hardware articles when you don't know what you're talking about? To say that DX9 benchmarks aren't important or, heck, not the most important aspect of these cards makes 0 sense. These cards might be DX10 capable, but they obviously haven't been given the hardware or raw horsepower to even handle DX9 (even at common resolutions). It also makes 0 sense whatsoever to suggest these cards will somehow magically perform drastically differently in pure DX10 games.
Furthermore...
How does it make any sense for AMD/ATI to have a card that's over 400 dollars (2900XT) that trades blows with the 320 and 640MB 8800GTS (which are cheaper), but then have nothing between that card and a heavily hardware-castrated 2600XT for 150 dollars (a ~250 dollar difference)? Also consider the fact that Nvidia's DX10 mid/low-end cards have matured, so even if you were frugal and wanted the cheapest possible clownshoes video card around, you should just go Nvidia. Don't even bother calling me a fanboy unless you feel like making me laugh. I currently own a Radeon X800XL. I'm just being honest. It's about time the rest of you did the same. Rant over.
Crossfire always gets better results than SLI... two 2600XT cards in Crossfire would take MUCH less wattage than any other option around for their framerate. At least some Asian websites are noticing.
All of you guys posting "wait for the DX10 benchmarks": do you seriously think the FPS is gonna double from DX9? These cards are a joke, and meant for OEM systems. They are not gonna release a good midrange card to creep up on the 2900XT and take sales away from it. And they will make far more money selling these cards to OEMs than to the average Joe Blow. The people who are gonna suffer from this are the fools who buy PCs at Best Buy and Future Shop believing they are getting good gaming cards.
Although Anandtech hasn't posted it yet, it looks as if the lower end 2000-series parts are quite good at HD decode, to the point where CPU utilization goes from 100% to 5%. At least that's according to a cumbersome Chinese review I read a week ago.
Granted, my needs don't apply to practically anyone but the HTPC crowd, but I play games at the native resolution of my 50" panel, which is 1366x768, and I don't use AA, so the 2600 XT would be nice to pick up, in addition to finally being able to send the output to my receiver. For us in the HTPC community, this card will be a godsend, being quiet and low-power.
I look forward to seeing what Anandtech says about the UVD aspects of these cards, as that's what I'm interested in.
Maybe I completely missed it in the reviews, but can these cards be used in Crossfire mode? That could be one (albeit very clumsy) way to get closer to midrange performance in the $200-$250 range...
Wow. Just wow. I haven't seen so much bashing in a long time. However, through all the nVidia and ATI bashing, I'm not surprised that the author left out a very important point. The 2600 XT consumes a mere 45W and the 2400 Pro a mere 25W. That is incredible. There is no need for external power, as one might expect on low-end parts, though I think nVidia does require external power on the high-end 8600. The ATI cards are made using a 65nm process, which explains the low power consumption.
For a less insulting and less biased review, go here
They are as expected. Considering the HD 2600 XT is clocked at 800MHz with 390 million transistors, the fact that it consumes about the same power as the 289-million-transistor G84 at 675MHz shows, for what it's worth, that the improvements of the 65nm process are real.
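A crude way to put numbers on that point, treating dynamic power as roughly proportional to transistor count times clock (a big simplification that ignores voltage, leakage, and how much of each chip is actually switching):

```python
# Rough switching-activity comparison: transistors (millions) x core MHz.
# This ignores voltage, leakage, and utilization, so it is a sketch of the
# argument above rather than a power model.
rv630 = 390 * 800   # HD 2600 XT, 65nm
g84 = 289 * 675     # 8600 GTS, 80nm

print(f"relative transistor-clock product: {rv630 / g84:.2f}x")
# ~1.6x the transistor-clock product at roughly the same measured power,
# which is the sense in which the 65nm process is "showing itself".
```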
The 8600GTS could easily do without an external power connector. So could a 7900GT, for that matter. It's about the situation in SLI and making sure it's a clean supply.
I was really hoping that AMD would pull a rabbit out of the hat and release something competitive (read: faster) with the 8600GTS. Clearly, they didn't.
Now I've got to decide between an 8600GTS and an 8800GTS for my new build. I like the PureVideo features in the 8600GTS, but I'm not sure I'll really need them if I've got a Q6600. Then again, I'm not sure I'll really need the full gaming performance of the 8800GTS either. Bleh.
Maybe I'll just stick with an 8600GTS for now, and upgrade to the inevitable 8900GTS.
Since we all know these cards suck for games, please make the UVD article really complete. I know you are going to be doing CPU tests, and you are going to test them with Core 2 Duos, but I beg you to test these systems for what they are made for: allowing crappy systems to play HD video. Try testing these cards with a Sempron @ 1.6GHz. You could also try finding the lowest possible CPU speed at which HD video still plays smoothly by adjusting the multiplier (a rough sketch of that kind of CPU logging follows this comment). That could be pretty interesting and help out people who don't overbuild their HTPCs.
Also, please try to run HQV benchmarks for both DVD and HD DVD for all the cards. We all know the 2600XT will get good scores, but the 2400 Pro will most likely be the best card for HTPC use (because nobody will ever play games with these crappy cards), and reviewers usually don't review the low-end models for HQV scores. Many times they don't score the same as their big brothers. If you can't get a 2400 Pro, you could underclock a 2400XT.
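A minimal sketch of that kind of test, assuming Python with the third-party psutil package installed: start the HD clip in your player of choice, run the logger for the sample window, and compare the averages with hardware decode on and off.

```python
# Minimal CPU-utilization logger for a playback test. Assumes Python with
# the psutil package installed; start playback manually before running.
import psutil

samples = []
for _ in range(60):                       # one sample per second for a minute
    samples.append(psutil.cpu_percent(interval=1.0))

print(f"average CPU load: {sum(samples) / len(samples):.1f}%")
print(f"peak CPU load:    {max(samples):.1f}%")
# Dropped frames are the other half of the story; most players report them
# in a stats overlay, so note those alongside the CPU numbers.
```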
A terrific suggestion. It is now very obvious that all of the current sub-$200 DX10 cards from both nVidia and AMD/ATi are really targeted at HTPCs and the "casual" gamer, the bulk of the PC add-on market. Not all of Anandtech's readers are bleeding-edge gaming "enthusiasts".
(Derek, I hope you take note of this little thread)
I almost agree... just don't listen to that BS about a Sempron CPU! Seriously, are you people running 1.6 GHz Sempron chips with $100 GPUs? I doubt that any single core can handle H.264, even with a good GPU helping out (though it would be somewhat interesting to see if I'm wrong). Considering X2 3600+ chips start at a whopping $63 and the 3800+ is only $5 more, I think that would be a far better choice. Those are the somewhat lower power 65nm chips as well, and the dual cores mean you might actually be able to manage video encoding on your HTPC.
What, you don't encode video with your HTPC!? I've got an E6600 in mine, because Windows MCE 2005 sucks up about 3.5GB per hour of high-quality analog video. I can turn those clips into 700MB DivX video with no discernible quality loss, or I can even go to 350MB per hour and still have nearly the same quality. Doing so on a single core Sempron, though? Don't make me laugh! You'd end up spending five hours to encode a thirty-minute show. If you record more than two hours of video per day, you could never catch up!
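To put rough numbers on that "never catch up" claim, here is a quick sketch using only the rates quoted above (five hours of encoding per half-hour show, i.e. roughly 0.1x realtime); they are the commenter's figures, not measurements.

```python
# Backlog check using the rates quoted above: 5 hours of encoding per
# 0.5 hours of recorded video, i.e. about 10 hours of CPU time per hour.
encode_hours_per_recorded_hour = 5 / 0.5   # = 10

for recorded_per_day in (1.0, 2.0, 2.5, 3.0):
    needed = recorded_per_day * encode_hours_per_recorded_hour
    verdict = "fits in a day" if needed <= 24 else "backlog grows forever"
    print(f"{recorded_per_day:.1f} h recorded/day -> "
          f"{needed:.0f} h encoding/day ({verdict})")
# Past about 2.4 recorded hours per day the machine can never catch up,
# which is in line with the two-hours-a-day figure above.
```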
I am perfectly happy with my Sempron 1.6GHz. I have no problem with OTA HD MPEG-2, and I can play any downloaded file I've found. It just keeps chugging along at 1.1V, using less than 20W at full load, allowing me to put my HTPC in a nearly enclosed space and run the fans at a low 500rpm. I can't upgrade to dual core on a Socket 754 board, and I'm not about to upgrade an entire system when this little gem of a $50 graphics card will allow me to run the one thing my CPU can't handle: HD-DVDs.
Also, why would I want to re-encode my TV shows when 500GB hard drives are only $100? For the rare times I do encode, I use my dual core office PC or my gaming rig, or I could just start it at night and come back tomorrow. I've never been in a big hurry to re-encode old episodes of America's Got Talent.
Also, you are wrong about the Sempron handling H.264. Mine can handle downloaded 720p content already, and a Chinese site has already confirmed that UVD can easily run on a Sempron 1.6 with lots of CPU to spare.
There you go. Personally, I want to see next week's review of UVD vs. Purevideo. I seriously hope that they include 2400s and 2600s in the review along with 8600s and 8500s. THAT sort of information is what will form the basis for my decision on my next Vid Card. My C2D isn't a gaming machine, but a HTPC. If the 2400 series is as good at video as the 2600, then silent wins - big time.
nVidia is well into development of the 8xxx-family successors. If you don't like any of the current Dx10 offerings, keep your wallets in your pockets till late this year or very early next year. Double-precision floating-point compute paths (think a full-fledged GPGPU, fully capable of mixing and matching GPU-functionality and compute horsepower for particle-physics etc.) with HD-decode hardware-assist integrated in all versions. Likely all on 65nm. And no doubt finally filling in the performance-gap around $200 to quiet the current laments and wailings from all sides.
Crysis is likely to run just fine in DX9 on your current high-end DX9 cards. Enjoy the game, upgrading your CPU/Motherboard if Crysis and other next-gen games make good use of multiple cores. Defer the expenditure on prettier graphics to a more-opportune, higher-performance (and less expensive) time. Do you really, really want to invest in a first-generation Dx10 card (unless you want the HD-decode for your HTPC)? For high-end graphics cards the 8800-series is getting long-in-the-tooth, and the prices are not likely to fall much further due to the very high manufacturing cost of the giant G80 die, plus the 2900XT is not an adequate answer. All of the major upcoming game titles are fully-compatible with Dx9. Some developers may be bribed(?) by Microsoft to make their games Vista-only to push Vista's lagging sales, but Vista-only or not, no current game is restricted to Dx10-only... that would be true commercial suicide with the current tiny market-penetration of Dx10 hardware.
Looks like ATI is giving up the high end again. The 2600XT/Pro is priced against the 8600GT/8500GT with the price drop, and the 2400Pro is well below them.
It will work with the OEMs, but not with game developers and players.
I guess we will see a cut-down 2900GT or something like that to fill the $150-$350 bracket where they have no DX10 products.
Just when I thought things were getting better. These past 6-12 months have just been one long disappointment.
Mid- to low-range cards that sometimes perform worse than last generation?
All these guys are selling now is hardware with a different name. I've never seen such ridiculous stuff in my life. I hope AMD didn't spend too much money on producing these cards. How much money do you have to spend to make a card perform worse than last generation's lineup? Complete lack of innovation and a complete lack of any sense. I just can't make any sense at all out of this.
I think a 7900GS or an X1800 is the way to go for midrange this year. Though to tell you the truth, I wouldn't give AMD any money right now; hopefully then they will get rid of their CEO, who seems to not be pulling his weight.
I don't agree with all your reasons, but I agree with Hector Ruiz going. This ass-clown has been plaguing the company for too long, and he has no vision and only a penchant for whining about Intel's anti-competitive practices.
quote: How much money do you have to spend to make a card perform worse than last generation's lineup? Complete lack of innovation and a complete lack of any sense. I just can't make any sense at all out of this.
They have the same problem that NVidia had with GeForce FX. They spent a lot of money on an exotic new architecture that turned out to be very inefficient in terms of performance per transistor.
All of this lacks relevance anyway... there are no DirectX 10 games out. By now we've all been burned by the "future-proofing our PCs" con so many times that we don't fall for it any more. Yes, at some point in the next 1-4 years, most new games will require DX10, and THEN maybe these video cards will be important. For now and for the next year or so, if you can't afford an 8800 you should just get something from the DX9 generation.
Company of Heroes was the world's first D3D10 game, and it's been out for a month. Pretty sweet game too: Game of the Year 2006 and the highest-rated RTS ever on GameRankings and Metacritic.
Lost Planet, I think, was a week behind CoH. And now you can even download Call of Juarez from the game's website.
quote: both NVIDIA and AMD have seen fit to leave a huge gap in performance between their lower high end part and upper low end parts. We saw this with the 8600 GTS falling way short of the 8800 series, and we will see it again with the HD 2600 XT not even getting close to the 2900 XT.
AMD's price gap will be even larger than NVIDIA's, leaving a hole between $150 and $400 with nothing to fill it.
What an absolute joke the videocard industry has become in the last six months or so.
If you judge by performance in games, there is no mid-range aside from the 320MB 8800GTS and yet that has not dropped far enough in price fast enough to hit the $250 mark without needing rebates. The 640MB 8800GTS remains very near its MSRP, which is even more frustrating because it has been out since last November and yet still hovers around $400 (particularly because AMD didn't release a strong enough competitor but we'll get to that in a moment).
Then the 8600GTS came out without enough stream processors or bus bandwidth to perform the way a proper mid-range card should. Who loses? Only the consumer, because AMD is only just now getting its competing product to store shelves, so NVidia simply doesn't care to offer a worthwhile midrange card.
Then AMD released the HD 2900XT, and since it didn't match the 8800GTX it didn't force a price drop on the 8800GTSes from NVidia, so the consumer lost out yet again.
Now AMD releases another crap mid-range card to compete with NVidia's crap mid-range card, even though this was the perfect opportunity to release a STRONG mid-range card that threatened the 320MB GTS in at least some games, while offering a proper 256-bit bus and enough stream processors to handle modern games at the average 1600x and 1680x resolutions that most of us with 19" and 20" monitors run.
Really, for $250 a gamer should be able to buy a card that can run your average game at your average resolution with your average AA and AF settings.
My example would be that a modern FPS like Stalker should run at full graphical settings at 1600x1200 or 1680x1050 with 4xAA and 8xAF at a solid 45fps or better.
(As opposed to 'high-end' which is a larger display at 1920x1200 or 2560x1600 with 8xAA and 16xAF.)
I believe the purpose of these cards (2400/2600/8500/8600) is to let OEMs meet a basic checklist of features for the upcoming back-to-school and holiday seasons. They provide excellent video playback features for media centers and "appear" on paper to have all of the features you need to play games in your spare time. For the Dell/Best Buy/HP online crowds, the marketing speak will look impressive, and I bet the upcharge to go from Intel X3100/NV 7050/ATI X1250 on-board graphics will be about the same as, if not more than, just buying the card itself.
Unfortunately, in my testing for the m-ATX roundup (which finally starts on Monday), the 2400/8500 series did not provide any noticeable improvements in most cases over the ATI X1250 and NV 7050 based boards, while being "better" than the G965, sometimes up to 100% better. Of course, going from 2fps to 4fps in STALKER is not exactly setting the world on fire, but those percentages look impressive. ;-)
I think the true mid-range has been or is in the process of being dropped by AMD/NV in order to improve profits. I believe they think the low end cards will suffice for the Home Office / Media Center crowds and that gamers/enthusiasts will pay up for the high end cards. Considering the dearth of competition at the high end, NV sees no point in reducing prices or rolling out new cards (thus taking current cards downstream) at this time.
I expected AMD to reduce the price of the 2900XT 512 to $309 or less once the 1GB cards arrived, in order to place some pricing pressure on NV, especially after the card's performance letdown, but even that is not happening. It surprises me, because they could really jump-start sales of the card and gain some market share if they priced it in the $269-$309 range, a point at which I think people would feel the card had decent price-to-performance value in the market. That leads me to believe the sweet spot ($200-$300) performance market is dead for now, at least with new products.
Maybe, maybe not. However, introduction of new technology has historically been high end and low end, with midrange only coming around the refresh.
For example, at initial release:
- 7600GT < $199, 7900GT > $300
- X1600XT < $169, X1900XT > $400
About 2 months later, a 256MB memory configuration of the X1900XT was used to bring the 7900GT some competition at the $300 price point, but yet again there were no new introductions in the $200-300 range, leaving it open for yesterday's high end (X1800, 7800, X800).
At the refresh:
- the 7900GS was introduced, and the 7900GT dropped in price and was replaced by the 7950GT
- the X1950 Pro was introduced at $200, with the X1950XT 256/512 and X1950XTX 512 at $250/$300/$350 respectively
At this point we had some decent midrange, bliss for enthusiasts.
With a little bit of luck, we'll see some nice midrange parts around autumn/winter.
I agree that the midrange is dead with the current generation of video cards on both sides of the fence (ATI & NV). The closest thing we have to midrange is the GTS 320, and it's still overpriced for that category.
There is a clear lack of video cards in the $200-300 range. It's easier to find a better performer in the last generation of cards than it is 7-8 months into the current generation.
Where's the 9800 Pro/X850 Pro of the current generation?
(And please give it a rest on the 2900XT vs GTX. You know it's been said about a billion times that the 2900XT's competitor is the GTS. Let the pre-production hype die already, will ya?)
Yet another weird choice for testing. You test a low-end AMD card with a top-of-the-line Intel CPU? Yes, a lot of people will be going for that configuration. At least .01%. Well, close to it anyway. And then, the most interesting part of all, you don't test the 2400 Pro, which doesn't have a noisy fan and thus at least has something attractive about it.
Granted, a high end processor exaggerates the differences in cards, but it's more academic than meaningful, since most people will not have that configuration. As you go to more mainstream processors, the delta shrinks, although in this case it's so dramatic as to be startling.
But, DX9 tests aren't particularly meaningful within the context of what these cards are. Their claim to fame is to be DX10 cards, and one is fanless, so it's going to be attractive to some people. If the DX10 performance is so miserable though, maybe it won't have appeal in that market. That's why I'm confused why you'd even bother with the DX9 article first, instead of going straight to DX10 and testing DX9 later. Clearly, DX10 is the future; it looks better, and was the focus of the product. If it performs relatively well on DX10, DX9 performance (or lack thereof), becomes almost irrelevant since it is not the focus of the card. The reverse would not be true, if DX10 performance were miserable, and DX9 performance good, it would still be a nearly useless card. The DX10 article should have come first.
IIRC there were faster fanless cards last generation. I know a fanless 7600GT is still available, and the 7600GT finished ahead of the 2400 XT here, so no doubt it's better than the 2400 Pro.
With the DX9 performance of these cards, AMD has certainly not given anyone a reason to upgrade. Assuming DX10 performance is better, users will wait until more DX10 games are available. Why lose performance in an upgrade on the games you currently play?
While I see your points, I don't agree entirely. Does the 7600GT beat the 2400 in DX10? Well, no, it's not for that. So, I'd agree you wouldn't upgrade to the 2400 for DX9 performance, you might for DX10. But they didn't publish any results for DX10, so we don't know if it's even worthwhile for that.
You make a good point with timing on an upgrade, but I'll offer a few good reasons. Sometimes, you NEED something right now, either because your old card died, or because you're building a new machine, or whatever. So, it's not always about an upgrade. And, in these instances, I think most people will be very interested in DX10 performance, since we're already seeing titles for it coming out, and obviously DX9 will cease to be relevant in the future as DX10 takes over. The advantages of DX10 are huge and meaningful, and DX9 can't die fast enough. But it will take a while, because XP and old computers represent a huge installed base, and software companies will have to keep making DX9 versions along with DX10. But who will want to run them degraded?
Also, Vista itself will be pushing demand for DX10, and XP is pretty much dead now. I don't think you'll see too many new machines with that OS on them, but Vista of course will continue to grow. Would you want a DX9 card for Vista? I wouldn't, and most people wouldn't, so they make these cards that at least nominally support DX10. Maybe that's all this is: products made to fill a need for the company to say they have inexpensive DX10 cards to run Vista on. I don't know, because they didn't put out DX10 results and instead wasted the review on DX9, which isn't why these cards were put out. And naturally, most of the idiots who post here eat it up without understanding that DX9 wasn't the point of this product. I won't be surprised if DX10 sucks too, and these were just first attempts that fell short, but I'd like to see results before I condemn the cards for being poor at something that wasn't their focus.
How can XP be dead? Most offices will probably be using it for the next 5-8 years, and for home users it's going to take 2-3 years before they upgrade, if they even bother. If it works, it works.
I'd love to use Vista, but the drivers for it suck. M$ should demand fully working drivers before granting WHQL certification, not just "it works" and slap a WHQL cert on it (see Nvidia's crappy chipset drivers and Creative's poor driver support, though that's M$'s fault for changing the driver model).
Most normal home users do not care what video card is in their PC, so the low-end video parts wouldn't bother them.
quote: I'd agree you wouldn't upgrade to the 2400 for DX9 performance, you might for DX10.
ROFL. If 2400 XT gets 9 fps in a "DX9" game at 1280x1024, do you expect that any real "DX10" game will be playable on it at any reasonable resolution?
It's quite clear by now that in order to really use those DX10 features (i.e. run games with settings where there is a noticeable image quality difference compared to the DX9 codepath), you need an 8800/2900 class card. The DX10 performance of the 2600 series, let alone the 2400 series, is quite irrelevant.
Have you even tried any DX10 hardware? Because I have to say that it sounds like you're talking out of your ass! The reason I say that is because the current DX10 titles (which are really just DX9 titles with hacked in DX10 support) perform atrociously! Company of Heroes, for example, has performance cut in half or more when you enable DirectX 10 mode -- and that's running on a mighty 8800 GTX! (Yes, I own one.) It looks fractionally better (improved shadows for the most part) but the performance hit absolutely isn't worthwhile.
Company of Heroes isn't the only example of games that perform poorly under DirectX 10. Call of Juarez and Lost Planet also perform worse in DirectX 10 mode than they do in DirectX 9 on the same hardware (all this testing having been done on HD 2900 XT and 8800 series hardware by several reputable websites - see FiringSquad among others). You don't actually believe that crippled hardware like the 8500/8600 or 2400/2600 will somehow magically perform better in DirectX 10 mode than the more powerful high-end hardware, do you?
Extrapolate those performance losses over to hardware that is already about one third as powerful as the 8800/2900 GPUs, and this low-end DirectX 10 hardware looks like a pure marketing exercise. If it's slower in DX9 mode and the high-end stuff already suffers in DX10, why would having fewer ROPs, shader units, etc. help matters?
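As a rough illustration of that extrapolation (the frame rates below are invented round numbers, not benchmark results), assume a high-end card that loses half its performance going from DX9 to DX10 and a midrange part with about a third of its throughput:

```python
# Illustrative only: made-up round numbers, assuming performance scales
# roughly linearly with GPU throughput and the DX10 path costs ~50% on
# both cards.
highend_dx9_fps = 60      # hypothetical 8800 GTX-class DX9 result
dx10_penalty = 0.5        # "cut in half or more"
midrange_scale = 1 / 3    # "about one third as powerful"

highend_dx10 = highend_dx9_fps * dx10_penalty
midrange_dx9 = highend_dx9_fps * midrange_scale
midrange_dx10 = midrange_dx9 * dx10_penalty

print(f"high end: {highend_dx9_fps:.0f} fps DX9 -> {highend_dx10:.0f} fps DX10")
print(f"midrange: {midrange_dx9:.0f} fps DX9 -> {midrange_dx10:.0f} fps DX10")
# Around 10 fps for the midrange card under DX10, which is why expecting
# these parts to suddenly shine in DX10 mode doesn't add up.
```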
This is blatant marketing -- marketing that people like you apparently buy into? "The 2600 isn't intended to perform well in DirectX 9. That's why it has DirectX 10 support! Wait until you see how great it performs in DirectX 10 -- something cards like the 7600/7900 and X1950 type cards can't run at all because they lack the feature. Blah de blah blah...." Don't make me laugh! (Too late.)
Radeon 9700 was the first DirectX 9 hardware to hit the planet, followed by the (insert derogatory adjectives) GeForce FX cards several months later. The funny thing is that we're seeing the exact same thing in reverse this time: NVIDIA's 8800 hardware appears to be pretty fast all told, and almost 9 months after the hardware became available, ATI's response can't even keep up! Anyway, how long did it take for us to finally get software that required DirectX 9 support? I would say Half-Life 2 is about as early as you can go, and that came out in November 2004. That was more than two years after the first DirectX 9 hardware (as well as the DirectX 9 SDK) was released -- August 2002 for the Radeon 9700, I believe.
Sure, there were a few titles that tried to hack in rudimentary DirectX 9 support, but it wasn't until 2005 that we really started to see a significant number of games requiring DirectX 9 hardware for the higher-quality rendering modes. Given that DirectX 10 requires Windows Vista, I expect the transition to be even slower! I have a shiny little copy of Windows Vista sitting around my house. I installed it, played around with it, tried out a few DirectX 10 patched titles... and I promptly uninstalled. (Okay, truth be told I installed it on a spare 80GB hard drive, so technically I can still dual-boot if I want to.) Vista is a questionable investment at best, high-end DirectX 10 hardware is expensive as well, and the nail in the coffin right now is that we have yet to see any compelling DirectX 10 stuff. Thanks, but I'll pass on any claims of DX10 being useful for now.
d3d9 is basically d3d8, but with SM2.0 instead of SM1. Just because Oblivion doesn't support SM1 doesn't mean it is any more or less of a d3d9 title than one that also supported sm1 or OpenGL.
With any technology, the initial software and drivers are going to be sub-optimal, and not everyone is going to be playing at the same resolutions or configurations. But these cards aren't made for the ultimate gaming experience; they are low-end cards meant to perform less demanding tasks well enough for the mainstream user.
What you don't seem to understand is that not everyone is a dork who plays games all day. Some people actually have lives, and they don't need high-end hardware to shoot aliens. For you, sure, get the high-end stuff, but that's not what you compare these cards to. A $60 card versus $800? Duh. Why would you even make the comparison? You judge them by how well they perform for their target, which is a low-end, feature-rich card. Will it work well with Vista and mainstream types of apps? Does the feature set and performance justify the cost?
Only an idiot would use this type of card for a high end gaming computer, and only the same type of person would even compare it to that.
quote: With any technology, the initial software and drivers are going to be sub-optimal, and not everyone is going to be playing at the same resolutions or configurations. But these cards aren't made for the ultimate gaming experience; they are low-end cards meant to perform less demanding tasks well enough for the mainstream user.
Yes, and by the time the technology and drivers (in this case, DirectX 10) ARE optimal, these cards will be replaced by a completely new generation, making them a pretty poor purchase at this time. Remember the timetable for going from DirectX 8 to DirectX 9?
Derek is completely right at this point. If you're buying a card, you should buy based on what kind of DirectX 9 performance you're going to get, because by the time REAL DX10 games come out (and I'm going to bet that will be a minimum of 12 months from now, and I mean good ones, not some DX9 game with additional hacks), there will be product refreshes or new designs from both ATI and nVidia. Buying a card solely for DirectX 10 performance, especially a low-to-midrange one, is completely silly. If it's outclassed now, it will be far worse in a year. It really makes sense to either a) buy at minimum a 320MB GeForce 8800GTS, or b) stick with upper-middle to high-end last-gen hardware (read: Radeon X19xx, GeForce 79xx) until there's a reason to upgrade.
quote: And then, the most interesting part of all, you don't test the 2400 Pro
Yes, it would be interesting to know the performance of the 2400 Pro. After all, the faster 2400 XT got a whopping 8.9 fps at 1280x1024 (no FSAA, no AF) in Rainbow Six and 9.3 fps in Stalker at the same settings. I wonder if the 2400 Pro gets more than 6 fps?
I actually don't see the point of the 2400 XT. If somebody just needs multimedia functionality and doesn't care about gaming, then the $50 2400 Pro/8400 GS is a cheaper choice. If somebody cares about gaming, then the 2400 XT is useless, since the 8600GT/2600 Pro are much faster and only slightly more expensive.
Because not everyone is going to run Rainbow Six, duh!!!!!!
For some people these cards would be fine, because they aren't running all the titles here, or are willing to run them at lower resolutions so they don't have to hear some damn egg beater in their computer. Resolution isn't important to everyone; not everyone is some jackass kid who thinks blowing up space aliens at the highest resolution is what life is all about, and some would be willing to sacrifice a bit of that for something quieter and cooler. I would have thought that much was obvious.
We would have tested the 2400 Pro if we had been able to get a hold of one. AMD was not able to send us a 2400 Pro, so we'll have to wait until we can get one from one of their board partners.
DX9 is much more important in these tests. How many people used a 9500 or an FX 5600 to play any serious DX9 games (read hl2 or better)? And how long did they have to wait for it when it finally mattered?
The reason we do real world tests is because we want to evaluate how the card will behave in normal use. To the customer, the hardware is only as good as the software that runs on it. And right now the software that runs on these parts is almost exclusively DX9.
It'll be at least a year or so before we see any real meaningful DX10 titles. Remember TRAOD, Tron 2.0 and Halo? Not the best DX9 implementations even if they were among the first.
DX10 tests are certainly interesting, and definitely relevant. But I think DX9 is much more important right now.
Yes, but you miss the point that these cards were made for DX10. There are already some titles out, and they will become more and more popular, although initially, without all the features. It obviously wasn't the focus of the product at all, so why make it yours?
Let me ask you a simple question. If you were buying a card, even today, would you buy it for the performance of DX9, or DX10? If you had the choice of two cards, one that had obscenely bad DX9 performance, but good DX10, and the other the reverse, which would you choose? I'd choose the one that performs well on DX10, because that's where things are going, and I'd put up with poor DX9 performance while new titles came out. However, these might suck on DX10 too, that's what we need to know.
Well, Radeon 9700 didn't have too much trouble rocking DirectX 8 games. Nor did GeForce FX (hell that's all it was really good for). G80 slaughters other cards at DirectX 9 games. I highly, highly doubt that these new cards are optimized for DirectX 10. How can they be? The first cards of each generation are usually disappointments for the new APIs.
You're missing the point, I'm not saying it will, I'm saying let's see.
But, let's be realistic, at the price of these cards, they aren't going to be extremely powerful, but they have a great feature set for the price. For a lot of people, these are going to be good cards.
Having said that, I'm inclined to agree they probably will not have great DX10 performance, but they didn't even test it. Strange, to say the least. Some of their decisions are baffling, and you wonder how much thought they actually put into them, if any.
I also agree the first generation for a feature set isn't great. I'm not expecting much, but I'll withhold criticism until I see the results. Besides, in the case of the 2400, wouldn't you think that with this type of feature set, for $60 or so, it would be a very good product for a lot of people running Vista? It's not going to be for the alien blasters, of course, but don't you think it's got some market?
You make it sound like you'd never even play any of the games tested in this review. Wouldn't you be mad that your "midrange" card performed this awfully on older-technology games?
I don't understand why anyone WOULDN'T care about DX9 performance when there are so many good DX9 games out there...
And before you rip me apart for bringing up 9700 and telling me how awesome it was for DX9, remember the mid-range 6600 GT beat it handily. Both are designed for the same API.
You're comparing apples to oranges here. Remember, the 9700 was the FIRST DX9 part available from ATI. The 6600GT was the second-gen DX9 part from NVidia. I WILL say that the 9700 was light-years ahead of nVidia's competing DX9 part, the FX 5800.
Your statement is more or less the same as saying that the 9700 was crap, because the 7600GT handily beat it (ok, I'm slightly exaggerating here...)
The point is that this is a reversal of the DX9 situation. The 9700 did handily beat the 5800 in DX8 generation games. In this case, the 8800GTX handily beats the 2900XT (the jury's still out on the 8800GTS).
I view this more as the 2600 being similar to the horribly performing (in DX9) GeForce FX 5600. At least the 5600 did reasonably well in DX8 games...
No, I agree that the HD2600 and 2400 are reminiscent of the FX 5600 and FX5200. They are pretty awful. And I'm not going to sit here and dreamily imagine 3x the performance when they are running more complex DX10 shader code. I think these cards are flops and that's all they will really ever be. For non-gamers and HD video people, the only people who should buy these, they will of course be fine.
If you want to play games, don't jump to DX10 dreaming. How many years did it take for DX9 to become the only API in use? Years. DX9 arrived in 2002 and only a couple of years ago at best was it becoming the primary API. UT2004, for example, is basically a DX7 engine. Guild Wars arrived with a DX8 renderer.
DX9 had multiple OS's backing it. DX10 is Vista only. Its adoption rate is likely to really be slowed down due to this and the fact that the only cards with remotely decent DX10 performance are $300+.
I brought up 9700 and 6600GT just to say that the first generation of cards for a new API is never very good at that API.
I agree that we need to know dx10 performance, which is why we're doing a followup.
I would think it would be clear that, if I were buying a card now, I'd buy a card that performed well under dx9.
All the games I play are dx9, all the games I'll play over the next 6 months will have a dx9 codepath, and we don't have dx10 tests that really help indicate what performance will be like on games designed strictly around dx10.
We always recommend people buy hardware to suit their current needs, because these are the needs we can talk about through our testing better.
OK, that recommendation part is a little scary. You should be balancing the two, because as you know, the future does come. DX9 will exist for the next six months, but there are already games using DX10 that look better than DX9. Plus, Vista surely loves DX10.
But we can agree to disagree on what's more important. I think this site's backward-looking style is obvious, and while I fundamentally disagree with it, at least you guys are consistent in your love for dying technology. Then again, I still prefer Win 2K over XP, so I guess I'm guilty of it too, but in this case my primary concern would be DX10. It's better, noticeably so. But the main thing is, you're judging something for what it's not made for. AMD's announcement made it very clear that DX10 and HD visual effects were the main point. Yet you chose to test neither and condemn the hardware over legacy code. Read the announcement, and judge it on what it's supposed to be for. Would you condemn a Toyota Celica because it's not as fast as a Porsche? Or a Corvette because it's got bad fuel economy? I doubt it, because that's not why they were made. Why condemn this part without testing it for what it was made for? I didn't see DX9 mentioned anywhere in their announcement. Maybe that was a hint?
A Celica compared to a Porsche?! Dude, that analogy is way off. How is a Toyota Celica supposed to represent DX9 and a Porsche DX10? With a Porsche you can see instant results and enjoy it instantly; there's nothing out right now for DX10, and I don't think even in 3 years the DX10 API will encompass the difference between a Celica and a Porsche. Get over yourself.
Wow, what worthless cards! Like does ATI really think people are going to buy this crap? Maybe for a media box and that's about it but for us mid-range gamers, it's worthless! All this hype and wait for nothing I tell ya!
Yeah, WTF? They are all sometimes WORSE than the X1650XT! What is going on? According to the specs they should be better; could it still be driver issues?
I don't think drivers alone will help much... besides, ATI has never really been known for magically putting strong numbers out through driver updates.
Personally, I'd say the 2xxx line from AMD/ATI has just sunk into the deep abyss. First it was months late, and the performance is light years behind, all the while the price is just, well, not right.
As much as I hate saying this, it seems we'll have to wait until Intel dips its giant feet into the graphics industry before nVidia and (especially) AMD/ATI wake up and think carefully about their next products (that is, if Intel can bring a competitive product), especially in the mainstream and value market.
Very easy to guess what is happening here. Both camps are targeting the high-end gamers who switch to Vista and can afford the high-end cards, plus the OEM deals, so they can push Vista on people again. Neither company wants to lower its high-end sales by releasing a mid-level part. I just wonder if the cards are more expensive to make for DX10. I don't see a reason they would be; maybe I'm wrong.
Until Vista is used by more gamers, my guess is they will not release a midrange card.
Early adopters of software are getting screwed.
Agreed. I couldn't help laughing when I read the Final Words section. Kinda like "The Nvidia 86xx/85xx cards suck, and the ATI 26xx/24xx suck worse!"
WTF happened this generation? The only cards worth their salt are the 88xx series. Nvidia dropped the ball with their low end stuff, and AMD.... well, AMD never really showed up for the game.
I think it's clear that with these low end cards, ATI and NVIDIA both came to the conclusion that they could either spend their transistor budget implementing the DX10 spec or adding performance, and they both went with DX10. Probably so they could be marketed as Vista compatible, or whatever. It's still a mystery why they didn't choose to make any midrange cards, as they tend to sell fairly well AFAIK. Perhaps these were meant to be midrange cards and ATI/NVIDIA were just shocked by how badly performance scaled downwards in their current designs, and were forced to reposition them as cheaper cards.
There just wasn't much choice. 390 million transistors for a midrange part on ATi's side that performs worse than Nvidia's 289 million transistor part is quite a sorry state of affairs.
It's too bad this generation was so expensive on the feature front that barely any transistor budget was left for performance, and we're left with hardware that performs only marginally faster, if that, than the previous generation's products.
I am quite disappointed that ATi parts are currently slower despite having a larger transistor budget and higher core clock.
Maybe because they weren't designed for DX9 performance, to state the obvious. They are DX10 parts, and should be judged on how well they perform on that.
DX10 sucks on both the 8600GT/GTS and 2600XT, unless playing at 5-8 FPS is your thing.
A 2900XT/8800GTS/GTX is needed for DX10. And better yet, SLI/CF or the next generation.
DX10 on these midrange nVidia and AMD GPUs is 100% useless.
And for what reason do you think they will perform magically better in DX10? The 2900XT didn't over the 8800. And there is no reason why it should be better.
I didn't say it would perform better, or worse. We'll see how well it performs when they do the proper tests. Until then, stop the whining. Afterwards, if it sucks, I'll whine with you.
From what I know, all DX10 games or applications out there right now were developed for DX9 and received DX10 features as an afterthought. For REAL DX10 we will have to wait for Crysis.
I've seen Crysis on an 8800GTX. Don't expect to play it well on less, unless the game devs perform some serious miracles. And I wouldn't bet on that. :)
quote: Just read some of the other sites that tested DX10.
I was replying to that. There is no REAL review or even preview of DX10 right now (i.e. of a game that was developed for it from the start). I know very well that you will need a very good video card to play Crysis in its full glory.
If these cards suck that badly in DX9, they are bound to suck even harder in DX10. Don't give me this... oh, they will do better in DX10... pffff. I'm going to hold off and buy a DX10 card once the games come out; that way I will know what performs best, and by then the GeForce 8900 series will be out (this Q3), making prices drop even further on the 8800 line.
You're obviously not very bright, I never said they'd perform better or worse. I said it makes more sense to wait until the results are in before passing judgment. Don't put words in my mouth.
I have to agree but disagree about these cards.
I agree that they will suck for gaming.
But, I think they can be fantastic in the right application.
I would love a 2600pro in a family pc.
1. Gets rid of onboard video that eats system RAM
2. 128-bit path to its own onboard RAM
3. Hardware built in to offload multimedia from the cpu
4. Low power requirements
5. Cheap
6. Drop to low res and an occasional game will function
A person may want a very fast modern pc but not be a gamer.
These cards are great for that small market and oems.
valnar - Friday, June 29, 2007 - link
Maybe I'm the opposite of most people here, but I'm glad ATI/AMD and Nvidia both produced mid-range cards that suck. Maybe we will finally get the game developers to slow down and produce tighter code or not waste GPU/CPU cycles on eye-candy and actually produce better game play. While I understand that most game companies write games that play acceptably on the $400 flagship video cards, I for one am not one of those people. It's not that I can't afford to buy a $400 card once in a awhile - it's having to spend that every year that ticks me off. I'd much rather upgrade my card every year to keep up the times if said card was $120.titan7 - Saturday, June 30, 2007 - link
Most game developers already do that. If you don't have the power to run the shaders and enable d3d10 features you can run in d3d9 mode. If your card still doesn't have the power for that you can run in pixel shader 1 mode.Take a game like Half Life 2 for example. Turn everything up and it was too much for most high end cards when it shipped. But you can turn it down so it looks like a typical d3d8 or d3d7 era game and play it just fine on your old hardware.
If you're hoping that developers can somehow make things run just as well on a PentiumIII as on a core2 duo you're hoping for the impossible. The 2600 only has about 1/3 the processing power as a 2900. The 2400 has about 10% of the power! Think about underclocking your CPU to 10% speed and seeing how your applications run ;)
Thank goodness we can disable features.
DerekWilson - Friday, June 29, 2007 - link
PLEASE READ THE UPDATE AT THE BOTTOM OF PAGE 1I would like to apologize for not catching this before publication, but the 8600 GTS we used had a slight overclock resulting in about 5% across the board higher performance than we should have seen.
We have re-run the tests on our stock clocked 8600 GTS and edited all the graphs in our article to reflect the changes. The overall results were not significantly affected, but we are very interested in being as fair and acurate as possible.
We have also added idle and load power numbers to The Test page.
Again, I am very sorry for the error, and I will do my best to make sure this doesn't happen again.
coldpower27 - Friday, June 29, 2007 - link
Meh, thanks Derek, but if you already have Factory overclocked results it may as well be lovely to leave them in as they are fair game, if Nvidia's partners are selling them in those configurations. This is of course in addition to the Nvidia reference clock rates.DerekWilson - Friday, June 29, 2007 - link
the issue is that overclocked 8600 GTS parts generally go for closer to $200, putting them well out of the price range the 2600 XT is expected to hit.it's not a fair comparison to make at this point (expecting the 2600 XT to come in at <= $150 anyway.
coldpower27 - Saturday, June 30, 2007 - link
Yeah, but you have all kinds of GPUs on the chart anyway from many different price points, 7950 GT is not close to $150 either, and neither is the 8800 GTS 320.I think people would be quite aware that the Factory OC cards if placed are indeed priced higher but if you already have the results leave them in, in addition to the Nvidia reference clock designs.
dm0r - Friday, June 29, 2007 - link
And please, keep us informed about performance with new drivers because im really interested in midrange video cards :)harpoon84 - Friday, June 29, 2007 - link
For gods sake, isn't it obvious by now that running DX10 games on these cards will result in LOWER performance, not HIGHER? If you are averaging 30fps @ 1280x1024 on DX9 games, it's only gonna get worse in DX10!http://www.extremetech.com/article2/0,1697,2151677...">http://www.extremetech.com/article2/0,1697,2151677...
Company of Heroes DX10 - SINGLE DIGIT FRAMERATES!!!
Yes, the 2600 cards are twice as fast as the 8600 cards, but we are talking totally unplayable framerates of 5 - 9 FPS!
Yeah, designed for DX10 alright! /SARCASM
Wheres TA152H now huh?
frostyrox - Friday, June 29, 2007 - link
To TA152H:Hi. It should've been painfully obvious about 10 comments ago that nobody here agrees with or, well, understands anything your saying. Can you please stop commenting on hardware articles when you don't know what you're talking about? To say that dx9 benchmarks aren't important or, heck, not the most important aspect of these cards makes 0 sense. These cards might be dx10 capable, but they obviously haven't even been given the hardware or raw horsepower to even handle dx9 (even at common resolutions). It also makes 0 sense whatsoever to even suggest these cards will somehow magically perform drastically different in pure dx10 games.
Furthermore...
How does it make any sense for AMD/ATI to have a card thats over 400 dollars (2900xt) that trades blows with the 320 and 640mb 8800gts (which are cheaper), but then have nothing between that card and a heavily hardware-castrated x2600xt for 150 dollars (a 250~ dollar difference). Also consider the fact that Nvidias dx10 mid-low end cards have matured, so even if you were froogle and wanted the cheapest possible clownshoes videocard around you should just go Nvidia. Don't even bother calling me a fanboy unless you feel like making me laugh. I currently own a radeon x800xl. I'm just being honest. It's about time the rest of you do the same. Rant over.
GlassHouse69 - Thursday, June 28, 2007 - link
Crossfire gets better results than SLI always.... two crossfire 2600XT cards would take MUCH less wattage than any other option around for its framerate. At least some azn websites are noticing.Makaveli - Thursday, June 28, 2007 - link
All of you guys posting wait for the DX 10 benchmarks do u seriously think the FPS is gonna double from DX9. These cards are a joke, and ment for OEM systems. They are not gonna release a good midrange card to creep up on the 2900XT and take sales away from it. And they will make far more money selling these cards to OEM's than the average joe blow. The people who are gonna suffer from this is the fools who buy pc's at Best buy and futureshop, that believe they are getting good gaming cards.All I gotta say is you get what you pay for.
Hugs my X1950Pro 512MB AGP!
guste - Thursday, June 28, 2007 - link
Although Anandtech hasn't posted it yet, it looks as if the lower end 2000-series parts are quite good at HD decode, to the point where CPU utilization goes from 100% to 5%. At least this according to a cumbersome Chinese review I read a week ago.Granted my needs don't apply to practically anyone but the HTPC crowd, but I play games at the native resouloution of my 50" panel, which is 1366x768 and I don't use AA, so the 2600 XT would be nice to pick up, in addition to finally being able to send the output to my receiver. For us in the HTPC community, this card will be a godsend, being quiet and low-power.
I look forward to seeing what Anandtech says about the UVD aspects of thse cards, as that's what I'm interested in.
florrv - Thursday, June 28, 2007 - link
Maybe I completely missed it in the reviews, but can these cards be used in Crossfire mode? That could be one way (albeit very clumsy) way to get you closer to midrange performance for the $200-$250 range...strikeback03 - Friday, June 29, 2007 - link
Just looking at the pictures, it would appear the 2400XT and 2600XT cave the connectors.DavenJ - Thursday, June 28, 2007 - link
Wow. Just wow. I haven't seen so much bashing in a long time. However, through all the nVidia and ATI bashing I'm not surprised that the author left out a very important point. The 2600 XT consumes a mere 45W and the 2400 Pro a mere 25W. That is incredible. There is no need for external power as one might expect on low end parts except I think nVidia has external power on the high end 8600. The ATI cards are made using a 65 nm process which explains the low power consumption.For a less insulting and less bias review, go here
http://www.techpowerup.com/reviews/ATI/HD_2600_XT">http://www.techpowerup.com/reviews/ATI/HD_2600_XT
Have a good day!
DerekWilson - Friday, June 29, 2007 - link
i added power numbers on the test page ...the power performance of the new radeon HD cards is not that great.
coldpower27 - Saturday, June 30, 2007 - link
They are as expected, considering the HD 2600 XT is clocked at 800MHZ with 390 Million Transistors the fact that it consumes equal power as compared to the 289 Million Transistor G84 at 675MHZ I would say for what it's worth the improvements of the 65nm process are showing themselves.coldpower27 - Thursday, June 28, 2007 - link
As you can see from the reviews here the HD 2600 XT and HD 2600 Pro don't consume that much less then the cards from the Nvidia camp.http://www.firingsquad.com/hardware/radeon_hd_2600...">http://www.firingsquad.com/hardware/rad...hd_2600_...
Shintai - Thursday, June 28, 2007 - link
The 8600GTS could easily do without an external power connector. So could a 7900GT for that matter. It´s about the situation in SLI and making sure its a clean supply.DerekWilson - Thursday, June 28, 2007 - link
Who am I biased against? Both NVIDIA and AMD have made terrible mainstream parts.While the 86 GTS does require external power, the 86 GT and lower do not.
IKeelU - Thursday, June 28, 2007 - link
Wow, now I feel even better about my 8800GTS 320MB purchase.LionD - Thursday, June 28, 2007 - link
This article scores Radeon X1950Pro approximately 1.5 times lower than iXBT. Why is it so?OCedHrt - Thursday, June 28, 2007 - link
Are these drivers newer than 7.6?DerekWilson - Thursday, June 28, 2007 - link
these drivers are beta 7.7erwos - Thursday, June 28, 2007 - link
I was really hoping that AMD would pull a rabbit out of the hat and release something competitive (read: faster) with the 8600GTS. Clearly, they didn't.Now I've got to decide between an 8600GTS and an 8800GTS for my new build. I like the PureVideo features in the 8600GTS, but I'm not sure I'll really need them if I've got a Q6600. Then again, I'm not sure I'll really need the full gaming performance of the 8800GTS either. Bleh.
Maybe I'll just stick with an 8600GTS for now, and upgrade to the inevitable 8900GTS.
autoboy - Thursday, June 28, 2007 - link
Since we all know these cards suck for games, please make the UVD article really complete. I know you are going to be doing CPU tests, and you are going to test them with core2duos, but I beg you to test these systems for what they are made for, allowing crappy systems to play HD video. Try testing these cards with a sempron @ 1.6Ghz. You could also try finding the lowest possible cpu speed while HD video still plays smoothly by adjusting the multiplier. That could be pretty interesting and help people out who don't overbuild their HTPCs.Also, please try to run HQV benchmarks for both DVD and HD DVD for all the cards. We all know the 2600XT will get good scores, but the 2400pro will most likely be the best card for HTPC use (because nobody will ever play games with these crappy cards) and reviewers ussually don't review the low end models for HQV scores. Many times they don't score the same as their big brothers. If you can't get a 2400pro, you could underclock a 2400XT.
kilkennycat - Thursday, June 28, 2007 - link
A terrific suggestion. It is now very obvious that all of the current sub-$200 DX10 cards from both nVidia and AMD/ATI are really targeted at HTPCs and the "casual" gamer -- the bulk of the PC add-on market. Not all of Anandtech's readers are bleeding-edge gaming "enthusiasts". (Derek, I hope you take note of this little thread.)
Frumious1 - Thursday, June 28, 2007 - link
I almost agree... just don't listen to that BS about a Sempron CPU! Seriously, are you people running 1.6 GHz Sempron chips with $100 GPUs? I doubt that any single core can handle H.264, even with a good GPU helping out (though it would be somewhat interesting to see if I'm wrong). X2 3600+ chips start at a whopping $63 and the 3800+ is only $5 more, so I think that would be a far better choice. Those are the somewhat lower power 65nm chips as well, and the dual cores mean you might actually be able to manage video encoding on your HTPC. What, you don't encode video with your HTPC!? I've got an E6600 in mine, because Windows MCE 2005 sucks up about 3.5GB per hour of high-quality analog video. I can turn those clips into 700MB DivX video with no discernible quality loss, or I can even go to 350MB per hour and still have nearly the same quality. Doing so on a single-core Sempron, though? Don't make me laugh! You'd end up spending five hours to encode a thirty-minute show. If you record more than two hours of video per day, you could never catch up!
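The rough arithmetic behind those figures, as a minimal Python sketch; the encode speed at the end just restates the commenter's five-hours-per-thirty-minutes claim and is an assumption, not a measurement:

def average_mbps(gb_per_hour):
    # average bitrate in megabits per second for a given GB-per-hour file size
    return gb_per_hour * 8 * 1024 / 3600

print(f"MCE analog capture : {average_mbps(3.5):.1f} Mbps")   # ~8.0 Mbps
print(f"700MB/hour DivX    : {average_mbps(0.7):.1f} Mbps")   # ~1.6 Mbps
print(f"350MB/hour DivX    : {average_mbps(0.35):.1f} Mbps")  # ~0.8 Mbps

show_minutes = 30
assumed_encode_speed = 0.1  # fraction of real time, per the 5-hours-per-30-minute-show claim
print(f"Encode time: {show_minutes / assumed_encode_speed / 60:.1f} hours")  # 5.0 hours

So the argument is about squeezing a roughly 8 Mbps capture down to a 1-2 Mbps DivX file; whether a given CPU can do that faster than real time is exactly what the replies below dispute.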
autoboy - Thursday, June 28, 2007 - link
I am perfectly happy with my Sempron 1.6GHz. I have no problem with OTA HD MPEG-2, and I can play any downloaded file I've found. It just keeps chugging along at 1.1V using less than 20W at full load, allowing me to put my HTPC in a nearly enclosed space and run the fans at a low 500rpm. I can't upgrade to dual core on a Socket 754 board, and I'm not about to upgrade an entire system when this little gem of a $50 graphics card will let me run the one thing my CPU can't handle: HD-DVDs. Also, why would I want to re-encode my TV shows when 500GB hard drives are only $100? For the rare times I do encode, I use my dual-core office PC or my gaming rig, or I could just start it at night and come back tomorrow. I've never been in a big hurry to re-encode old episodes of America's Got Talent.
Also, you are wrong about the Sempron handling H.264. Mine can handle downloaded 720p content already, and a Chinese site has already confirmed that UVD can easily run on a Sempron 1.6 with lots of CPU to spare.
lumbergeek - Thursday, June 28, 2007 - link
There you go. Personally, I want to see next week's review of UVD vs. PureVideo. I seriously hope that they include 2400s and 2600s in the review along with 8600s and 8500s. THAT sort of information is what will form the basis for my decision on my next video card. My C2D isn't a gaming machine but an HTPC. If the 2400 series is as good at video as the 2600, then silent wins - big time.
kilkennycat - Thursday, June 28, 2007 - link
nVidia is well into development of the 8xxx-family successors. If you don't like any of the current DX10 offerings, keep your wallets in your pockets till late this year or very early next year. Double-precision floating-point compute paths (think a full-fledged GPGPU, fully capable of mixing and matching GPU functionality and compute horsepower for particle physics etc.) with HD-decode hardware assist integrated in all versions. Likely all on 65nm. And no doubt finally filling in the performance gap around $200 to quiet the current laments and wailings from all sides.
Crysis is likely to run just fine in DX9 on your current high-end DX9 cards. Enjoy the game, upgrading your CPU/motherboard if Crysis and other next-gen games make good use of multiple cores. Defer the expenditure on prettier graphics to a more opportune, higher-performance (and less expensive) time. Do you really, really want to invest in a first-generation DX10 card (unless you want the HD decode for your HTPC)? For high-end graphics cards the 8800 series is getting long in the tooth, and the prices are not likely to fall much further due to the very high manufacturing cost of the giant G80 die, plus the 2900XT is not an adequate answer. All of the major upcoming game titles are fully compatible with DX9. Some developers may be bribed(?) by Microsoft to make their games Vista-only to push Vista's lagging sales, but Vista-only or not, no current game is restricted to DX10 only... that would be true commercial suicide given the current tiny market penetration of DX10 hardware.
Slaimus - Thursday, June 28, 2007 - link
Looks like ATI is giving up the high end again. The 2600XT/Pro is priced against the 8600GT/8500GT with the price drop, and the 2400Pro is well below them. It will work with the OEMs, but not with game developers and players.
I guess we will see a cut-down 2900GT or something like that to fill the $150-$350 bracket where they have no DX10 products.
Goty - Thursday, June 28, 2007 - link
Why are there no power consumption tests? I thought AT was all over this performance-per-watt nonsense?
smitty3268 - Thursday, June 28, 2007 - link
Especially after the article made a point of saying that these cards were built to maximize power efficiency rather than speed.
avaughan - Thursday, June 28, 2007 - link
Also missing are noise levels.
SandmanWN - Thursday, June 28, 2007 - link
And overclocking...
Regs - Thursday, June 28, 2007 - link
Just when I thought things were getting better. These last 6-12 months have been one long disappointment. Mid-to-low range cards that sometimes perform worse than last generation?
All these guys are selling now is hardware with a different name. I've never seen such ridiculous stuff in my life. I hope AMD didn't spend too much money on producing these cards. How much money do you have to spend to make a card perform worse than last generation's lineup? Complete lack of innovation and a complete lack of any sense. I just can't make any sense at all out of this.
I think a 7900GS or an X1800 is the way to go for mid-range this year. Though to tell you the truth, I wouldn't give AMD any money right now, and hopefully then they will get rid of their CEO, who doesn't seem to be pulling his weight.
TA152H - Thursday, June 28, 2007 - link
I don't agree with all your reasons, but I agree with Hector Ruiz going. This ass-clown has been plaguing the company for too long, and he has no vision and only a penchant for whining about Intel's anti-competitive practices. He really needs to go. Now!
defter - Thursday, June 28, 2007 - link
They have the same problem that NVidia had with the GeForce FX. They spent a lot of money on an exotic new architecture that turned out to be very inefficient in terms of performance per transistor.
DerekWilson - Thursday, June 28, 2007 - link
Except that this is their second generation of a unified shader architecture. The first incarnation is the Xbox 360's Xenos.
Slaimus - Thursday, June 28, 2007 - link
Maybe that was the problem. They built their architecture for maximum theoretical performance rather than practical performance. Console hardware is very picky about the type of code it runs. On the 360, developers can organize their shaders in a way the VLIW units like.
Computer games are designed to run on a variety of hardware, so Nvidia's more flexible approach will be faster for more games.
EODetroit - Thursday, June 28, 2007 - link
All of this lacks relevance anyway... there are no DirectX 10 games out. By now we've all been burned by the "future-proofing our PCs" con so many times that we don't fall for it any more. Yes, at some point in the next 1-4 years, most new games will require DX10, and THEN maybe these video cards will be important. For now and for the next year or so, if you can't afford an 8800 you should just get something from the DX9 generation.
titan7 - Saturday, June 30, 2007 - link
Company of Heroes was the world's first d3d10 game and it's been out for a month. Pretty sweet game too: Game of the Year 2006 and the highest-rated RTS ever on GameRankings and Metacritic. Lost Planet was about a week behind CoH, I think. And now you can even download Call of Juarez from the game's website.
d3d10 is here.
yacoub - Thursday, June 28, 2007 - link
What an absolute joke the videocard industry has become in the last six months or so.
If you judge by performance in games, there is no mid-range aside from the 320MB 8800GTS and yet that has not dropped far enough in price fast enough to hit the $250 mark without needing rebates. The 640MB 8800GTS remains very near its MSRP, which is even more frustrating because it has been out since last November and yet still hovers around $400 (particularly because AMD didn't release a strong enough competitor but we'll get to that in a moment).
Then the 8600GTS came out without enough stream processors or bus bandwidth to perform the way a proper mid-range card should. Who loses? Only the consumer, because AMD is only just now getting their competing product to store shelves, so NVidia simply doesn't care to offer a worthwhile midrange card.
Then AMD released the HD 2900XT, and it didn't match the 8800GTX, so it didn't force a price drop on the 8800GTSes from NVidia, and the consumer lost out yet again.
Now AMD releases another crap mid-range card to compete with NVidia's crap mid-range card, even though this was the perfect opportunity to release a STRONG mid-range card that threatened the 320MB GTS in at least some games while offering the proper 256-bit bus and enough shaders and stream processors to handle modern games at your average 1600x and 1680x resolutions that most of us with 19" and 20" monitors run.
Really, for $250 a gamer should be able to buy a card that can run your average game at your average resolution with your average AA and AF settings.
My example would be that a modern FPS like Stalker should be able to run at full graphical settings at 1600x1200 or 1680x1050 with 4xAA and 8xAF at a solid 45fps or better.
(As opposed to 'high-end' which is a larger display at 1920x1200 or 2560x1600 with 8xAA and 16xAF.)
Thoughts?
Gary Key - Thursday, June 28, 2007 - link
I believe the purpose of these cards (2400/2600/8500/8600) is for the OEMs to meet a basic checklist of features for the upcoming back-to-school and holiday season. They provide excellent video playback features for media centers and "appear" on paper to have all of the features you need to play games in your spare time. For the Dell/Best Buy/HP online crowds, the marketing speak will look impressive, and I bet the upcharge to go from iX3100/nv7050/atiX1250 on-board graphics will be about the same as, if not more than, just buying the card itself.
Unfortunately, in my testing for the m-ATX roundup (which starts on Monday, finally), the 2400/8500 series did not provide any noticeable improvements in most cases over the atiX1250/nv7050 based boards, while being "better" than the G965 - sometimes up to 100% better, but of course going from 2fps to 4fps in STALKER is not exactly setting the world on fire, even if those percentages look impressive. ;-)
I think the true mid-range has been or is in the process of being dropped by AMD/NV in order to improve profits. I believe they think the low end cards will suffice for the Home Office / Media Center crowds and that gamers/enthusiasts will pay up for the high end cards. Considering the dearth of competition at the high end, NV sees no point in reducing prices or rolling out new cards (thus taking current cards downstream) at this time.
I expected AMD to reduce the cost of the 2900XT 512 down to $309 or less with the 1GB cards arriving, in order to place some pricing pressure on NV, especially after the performance letdown of the card, but even that is not happening. It surprises me, because they could really jump-start sales of the card and gain some market share if they priced it in the $269~$309 range, a point at which I think people would feel the card had decent price-to-performance value. That leads me to believe the sweet spot ($200~$300) performance market is dead for now, at least with new products.
Just some of my thoughts....
Spoelie - Thursday, June 28, 2007 - link
Maybe, maybe not. However, new technology has historically been introduced at the high end and the low end, with midrange only coming around at the refresh. For example:
initial release
7600GT < $199, 7900GT > $300
x1600xt < $169, x1900xt > $400
~2 months later, a 256MB memory configuration was used on the x1900xt to give the 7900gt some competition at the $300 price point, but yet again there were no new introductions in the $200-300 price range, leaving it open to sell yesterday's high end (x1800, 7800, x800)
refresh
introduction of the 7900gs; the 7900gt's price dropped and it was replaced by the 7950gt
x1950pro introduction at $200; x1950xt 256/512 and x1950xtx 512 at $250/$300/$350 respectively
At this point we have some decent mid-range, bliss for enthusiasts.
With a little bit of luck, we'll see some nice midrange parts around autumn/winter.
SandmanWN - Thursday, June 28, 2007 - link
I agree that the mid-range is dead with the current generation of video cards on both sides of the fence (ATI & NV). The closest thing we have to midrange is the GTS 320, and it's still overpriced for that category. There is a clear lack of video cards in the $200-300 range. It's easier to find a better performer in the last generation of cards than it is 7-8 months into the current generation.
Where's the 9800 Pro/x850 Pro of the current generation????
(And please give it a rest on the 2900XT vs GTX. You know it's been said about a billion times that the 2900XT's competitor is the GTS. Let the pre-production hype die already, will ya?)
theprodigalrebel - Thursday, June 28, 2007 - link
The mid-range isn't dead. X1950Pro prices are awesome at the moment. :-)
TA152H - Thursday, June 28, 2007 - link
Yet another weird choice for testing. You test a low-end AMD card with a top-of-the-line Intel CPU? Yes, a lot of people will be going for that configuration. At least 0.01%. Well, close to it anyway. And then, the most interesting part of all, you don't test the 2400 Pro, which doesn't have a noisy fan and thus at least has something attractive about it. Granted, a high-end processor exaggerates the differences between cards, but it's more academic than meaningful, since most people will not have that configuration. As you go to more mainstream processors, the delta shrinks, although in this case it's so dramatic as to be startling.
But DX9 tests aren't particularly meaningful within the context of what these cards are. Their claim to fame is being DX10 cards, and one is fanless, so it's going to be attractive to some people. If the DX10 performance is so miserable though, maybe it won't have appeal in that market. That's why I'm confused why you'd even bother with the DX9 article first, instead of going straight to DX10 and testing DX9 later. Clearly, DX10 is the future; it looks better, and it was the focus of the product. If it performs relatively well on DX10, DX9 performance (or lack thereof) becomes almost irrelevant, since that is not the focus of the card. The reverse would not be true: if DX10 performance were miserable and DX9 performance good, it would still be a nearly useless card. The DX10 article should have come first.
strikeback03 - Thursday, June 28, 2007 - link
IIRC there were faster fanless cards last generation. I know a fanless 7600GT is still available, and the 7600GT finished ahead of the 2400 XT here, so no doubt it's better than the 2400 Pro.
With the DX9 performance of these cards, AMD has certainly not given anyone a reason to upgrade. Assuming DX10 performance is better, users will wait until more DX10 games are available. Why lose performance in an upgrade on the games you currently play?
TA152H - Thursday, June 28, 2007 - link
While I see your points, I don't agree entirely. Does the 7600GT beat the 2400 in DX10? Well, no, it's not made for that. So I'd agree you wouldn't upgrade to the 2400 for DX9 performance, but you might for DX10. But they didn't publish any results for DX10, so we don't know if it's even worthwhile for that.
You make a good point about the timing of an upgrade, but I'll offer a few good reasons. Sometimes you NEED something right now, either because your old card died, or because you're building a new machine, or whatever. So it's not always about an upgrade. And in these instances, I think most people will be very interested in DX10 performance, since we're already seeing titles for it coming out, and obviously DX9 will cease to be relevant in the future as DX10 takes over. The advantages of DX10 are huge and meaningful, and DX9 can't die fast enough. But it will take a while, because XP and old computers represent a huge installed base, and software companies will have to keep making DX9 versions along with DX10. But who will want to run them degraded? Also, Vista itself will be pushing demand for DX10, and XP is pretty much dead now. I don't think you'll see too many new machines with this OS on them, but Vista of course will continue to grow. Would you want a DX9 card for Vista? I wouldn't, and most people wouldn't, so they make these DX10 cards that at least nominally support DX10. Maybe that's all this is: products made so the company can say it has inexpensive DX10 cards to run Vista on. I don't know, because they didn't put out DX10 results, and instead wasted the review on DX9, which isn't why these cards were put out. And naturally, most of the idiots who post here eat it up without understanding that DX9 wasn't the point of this product. I won't be surprised if DX10 sucks too, and these were just first attempts that fell short, but I'd like to see results before I condemn the cards for being poor at something that wasn't their focus.
leexgx - Thursday, June 28, 2007 - link
How can XP be dead? Most offices will probably be using it for the next 5-8 years, and for home users it's going to take 2-3 years before they upgrade, if they even bother - if it works, it works.
I'd love to use Vista, but the drivers for it suck. MS should demand fully working drivers before granting WHQL certification, not just check that "it works" and slap a WHQL cert on it (nVidia's crappy chipset drivers and Creative's poor driver support - though that's partly MS's fault for changing the driver model...).
Most normal home users do not care what video card is in their PC, so the low-end video parts wouldn't bother them.
defter - Thursday, June 28, 2007 - link
ROFL. If the 2400 XT gets 9 fps in a "DX9" game at 1280x1024, do you expect any real "DX10" game to be playable on it at any reasonable resolution?
It's quite clear by now that in order to really use those DX10 features (i.e. run games with settings where there is a noticeable image quality difference compared to the DX9 codepath), you need an 8800/2900-class card. The DX10 performance of the 2600 series, let alone the 2400 series, is quite irrelevant.
Frumious1 - Thursday, June 28, 2007 - link
Have you even tried any DX10 hardware? Because I have to say that it sounds like you're talking out of your ass! The reason I say that is because the current DX10 titles (which are really just DX9 titles with hacked-in DX10 support) perform atrociously! Company of Heroes, for example, has performance cut in half or more when you enable DirectX 10 mode -- and that's running on a mighty 8800 GTX! (Yes, I own one.) It looks fractionally better (improved shadows for the most part), but the performance hit absolutely isn't worthwhile.
Company of Heroes isn't the only example of a game that performs poorly under DirectX 10. Call of Juarez and Lost Planet also perform worse in DirectX 10 mode than they do in DirectX 9 on the same hardware (all this testing having been done on HD 2900 XT and 8800 series hardware by several reputable websites - see FiringSquad among others). You don't actually believe that crippled hardware like the 8500/8600 or 2400/2600 will somehow magically perform better in DirectX 10 mode than the more powerful high-end hardware, do you?
Extrapolate those performance losses over to hardware that is already about one third as powerful as the 8800/2900 GPUs, and this low-end DirectX 10 hardware is simply a marketing exercise (a rough back-of-envelope version of that extrapolation is sketched after this comment). If it's slower in DX9 mode and the high-end stuff suffers in performance, why would having fewer ROPs, shader units, etc. help matters?
This is blatant marketing -- marketing that people like you apparently buy into. "The 2600 isn't intended to perform well in DirectX 9. That's why it has DirectX 10 support! Wait until you see how great it performs in DirectX 10 -- something the 7600/7900 and X1950 type cards can't run at all because they lack the feature. Blah de blah blah...." Don't make me laugh! (Too late.)
The Radeon 9700 was the first DirectX 9 hardware to hit the planet, followed by the (insert derogatory adjectives) GeForce FX cards several months later. The irony is that we're seeing the exact same thing in reverse this time: NVIDIA's 8800 hardware appears to be pretty fast all told, and almost 9 months after that hardware became available, ATI's response can't even keep up! Anyway, how long did it take for us to finally get software that required DirectX 9 support? I would say Half-Life 2 is about as early as you can go, and that came out in November 2004. That was more than two years after the first DirectX 9 hardware (as well as the DirectX 9 SDK) was released -- August 2002 for the Radeon 9700, I believe.
Sure, there were a few titles that tried to hack in rudimentary DirectX 9 support, but it wasn't until 2005 that we really started to see a significant number of games requiring DirectX 9 hardware for the higher-quality rendering modes. Given that DirectX 10 requires Windows Vista, I expect the transition to be even slower! I have a shiny little copy of Windows Vista sitting around my house. I installed it, played around with it, tried out a few DirectX 10 patched titles... and I promptly uninstalled. (Okay, truth be told I installed it on a spare 80GB hard drive, so technically I can still dual-boot if I want to.) Vista is a questionable investment at best, high-end DirectX 10 hardware is expensive as well, and the nail in the coffin right now is that we have yet to see any compelling DirectX 10 stuff. Thanks, but I'll pass on any claims of DX10 being useful for now.
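A minimal back-of-envelope sketch of the extrapolation Frumious1 is making; the starting frame rate is an illustrative guess, and the two scaling factors simply restate the roughly one-third horsepower and roughly 50% DX10 hit quoted in the comments above, so none of these are benchmark results:

highend_dx9_fps = 60       # assumed 8800-class DX9 frame rate (illustrative)
midrange_scaling = 1 / 3   # mid-range parts at roughly a third the horsepower
dx10_mode_scaling = 0.5    # DX10 path roughly halving frame rates on high-end cards

midrange_dx9_fps = highend_dx9_fps * midrange_scaling
midrange_dx10_fps = midrange_dx9_fps * dx10_mode_scaling

print(f"Mid-range DX9 estimate : {midrange_dx9_fps:.0f} fps")   # ~20 fps
print(f"Mid-range DX10 estimate: {midrange_dx10_fps:.0f} fps")  # ~10 fps

Under those assumptions a mid-range part lands around 10 fps in DX10 mode, which is the thrust of the argument that DX10 support on these cards is a checkbox feature rather than a playable one.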
Jedi2155 - Friday, June 29, 2007 - link
Far Cry was February '04, and that was definitely full DX9 support.
coldpower27 - Friday, June 29, 2007 - link
That was largely DX8 with some DX9 shaders mixed in; the majority of the code still ran in DX8 mode. We're talking about native DX9 games, which would be something on the order of Oblivion or Neverwinter Nights 2.
titan7 - Saturday, June 30, 2007 - link
d3d9 is basically d3d8, but with SM2.0 instead of SM1. Just because Oblivion doesn't support SM1 doesn't mean it is any more or less of a d3d9 title than one that also supported SM1 or OpenGL.
TA152H - Thursday, June 28, 2007 - link
With any technology, the initial software and drivers are going to be sub-optimal, and not everyone is going to be playing at the same resolutions or configurations. But these cards aren't made for the ultimate gaming experience; they are low-end cards meant to perform less demanding tasks well enough for the mainstream user.
What you don't seem to understand is, not everyone is a dork that plays games all day. Some people actually have lives, and they don't need high-end hardware to shoot aliens. For you, sure, get the high-end stuff, but that's not what you compare these cards to. A $60 card versus $800? Duh. Why would you even make the comparison? You make the comparison of how well they perform for their target, which is a low-end, feature-rich card. Will it work well with Vista and mainstream types of apps? Does the feature set and performance justify the cost?
Only an idiot would use this type of card for a high end gaming computer, and only the same type of person would even compare it to that.
LoneWolf15 - Thursday, June 28, 2007 - link
"With any technology, the initial software and drivers are going to be sub-optimal, and not everyone is going to be playing at the same resolutions or configurations. But these cards aren't made for the ultimate gaming experience; they are low-end cards meant to perform less demanding tasks well enough for the mainstream user."
Yes, and by the time the technology and drivers (in this case, DirectX 10) ARE optimal, these cards will be replaced by a completely new generation, making them a pretty poor purchase at this time. Remember the timetable for going from DirectX 8 to DirectX 9?
Derek is completely right at this point. If you're buying a card, you should buy based on what kind of DirectX 9 performance you're going to get, because by the time REAL DX10 games come out (and I'm going to bet that will be a minimum of 12 months from now, and I mean for good ones, not some DX9 game with additional hacks) there will be product refreshes or new designs from both ATI and nVidia. Buying a card solely for DirectX 10 performance, especially a low-to-midrange one, is completely silly. If it's outclassed now, it will be far worse in a year. It really makes sense to either a) buy a minimum of a 320MB GeForce 8800GTS, or b) stick with upper-middle to high-end last-gen hardware (read: Radeon X19xx, GeForce 79xx) until there's a reason to upgrade.
defter - Thursday, June 28, 2007 - link
Yes, it would be interesting to know the performance of the 2400 Pro. After all, the faster 2400 XT got a whopping 8.9 fps at 1280x1024 (no FSAA, no AF) in Rainbow Six and 9.3 fps in Stalker with the same settings. I wonder if the 2400 Pro gets more than 6 fps?
I actually don't see the point of the 2400 XT. If somebody just needs multimedia functionality and doesn't care about gaming, then the $50 2400 Pro/8400 GS is a cheaper choice. If somebody cares about gaming, then the 2400 XT is useless, since the 8600GT/2600 Pro are much faster and only slightly more expensive.
TA152H - Thursday, June 28, 2007 - link
Because not everyone is going to run Rainbow Six, duh!!!!!!
For some people these cards would be fine, because they aren't running all the titles here, or are willing to run them at lower resolutions so they don't have to hear some damn egg beater in their computer. Resolution isn't important to everyone; not everyone is some jackass kid who thinks blowing up space aliens at the highest resolution is what life is all about, and some would be willing to sacrifice a bit of that for something quieter and cooler. I would have thought that much was obvious.
DerekWilson - Thursday, June 28, 2007 - link
We would have tested the 2400 Pro if we had been able to get a hold of one. AMD was not able to send us a 2400 Pro, so we'll have to wait until we can get one from one of their board partners.
DerekWilson - Thursday, June 28, 2007 - link
I'm gonna disagree. DX9 is much more important in these tests. How many people used a 9500 or an FX 5600 to play any serious DX9 games (read: HL2 or better)? And how long did they have to wait before that finally mattered?
The reason we do real world tests is because we want to evaluate how the card will behave in normal use. To the customer, the hardware is only as good as the software that runs on it. And right now the software that runs on these parts is almost exclusively DX9.
It'll be at least a year or so before we see any real meaningful DX10 titles. Remember TRAOD, Tron 2.0 and Halo? Not the best DX9 implementations even if they were among the first.
DX10 tests are certainly interesting, and definitely relevant. But I think DX9 is much more important right now.
TA152H - Thursday, June 28, 2007 - link
Yes, but you miss the point that these cards were made for DX10. There are already some titles out, and they will become more and more popular, even if the early ones don't use all the features. DX9 obviously wasn't the focus of the product at all, so why make it yours?
Let me ask you a simple question. If you were buying a card, even today, would you buy it for DX9 performance, or DX10? If you had the choice of two cards, one with obscenely bad DX9 performance but good DX10, and the other the reverse, which would you choose? I'd choose the one that performs well on DX10, because that's where things are going, and I'd put up with poor DX9 performance while new titles came out. However, these might suck on DX10 too; that's what we need to know.
swaaye - Thursday, June 28, 2007 - link
Well, the Radeon 9700 didn't have too much trouble rocking DirectX 8 games. Nor did the GeForce FX (hell, that's all it was really good for). The G80 slaughters other cards at DirectX 9 games. I highly, highly doubt that these new cards are optimized for DirectX 10. How can they be? The first cards of each generation are usually disappointments for the new APIs.
TA152H - Thursday, June 28, 2007 - link
You're missing the point. I'm not saying it will perform better or worse; I'm saying let's see. But let's be realistic: at the price of these cards, they aren't going to be extremely powerful, but they have a great feature set for the price. For a lot of people, these are going to be good cards.
Having said that, I'm inclined to agree they probably will not have great DX10 performance, but they didn't even test it. Strange, to say the least. Some of their decisions are baffling, and you wonder how much thought they actually put into them, if any.
I also agree the first generation for a feature set isn't great. I'm not expecting much, but I'll withhold criticism until I see the results. Besides, in the case of the 2400, wouldn't you think that with this type of feature set, for $60 or so, it would be a very good product for a lot of people running Vista? It's not going to be for the alien blasters, of course, but don't you think it's got some market?
Tamale - Thursday, June 28, 2007 - link
You make it sound like you'd never even play any of the games tested in this review. Wouldn't you be mad that your "midrange" card performed this badly on OLDER-technology games? I don't understand why anyone WOULDN'T care about DX9 performance when there are so many good DX9 games out there...
swaaye - Thursday, June 28, 2007 - link
And before you rip me apart for bringing up 9700 and telling me how awesome it was for DX9, remember the mid-range 6600 GT beat it handily. Both are designed for the same API.
erple2 - Thursday, June 28, 2007 - link
You're comparing apples to oranges here. Remember, the 9700 was the FIRST DX9 part available from ATI. The 6600GT was a second-gen DX9 part from NVidia. I WILL say that the 9700 was light-years ahead of the competing nVidia DX9 part, the FX 5800. Your statement is more or less the same as saying the 9700 was crap because the 7600GT handily beat it (ok, I'm slightly exaggerating here...).
The point is that this is a reversal of the DX9 situation. The 9700 did handily beat the 5800 in DX8 generation games. In this case, the 8800GTX handily beats the 2900XT (the jury's still out on the 8800GTS).
I view it more like this: the 2600 appears similar to the horribly performing (in DX9) GeForce 5600. At least the 5600 did reasonably well in DX8 games...
swaaye - Friday, June 29, 2007 - link
No, I agree that the HD2600 and 2400 are reminiscent of the FX 5600 and FX5200. They are pretty awful. And I'm not going to sit here and dreamily imagine 3x the performance when they are running more complex DX10 shader code. I think these cards are flops and that's all they will really ever be. For non-gamers and HD video people, the only people who should buy these, they will of course be fine.
If you want to play games, don't jump to DX10 dreaming. How many years did it take for DX9 to become the only API in use? Years. DX9 arrived in 2002 and only a couple of years ago at best was it becoming the primary API. UT2004, for example, is basically a DX7 engine. Guild Wars arrived with a DX8 renderer.
DX9 had multiple OS's backing it. DX10 is Vista only. Its adoption rate is likely to really be slowed down due to this and the fact that the only cards with remotely decent DX10 performance are $300+.
I brought up 9700 and 6600GT just to say that the first generation of cards for a new API is never very good at that API.
DerekWilson - Thursday, June 28, 2007 - link
I agree that we need to know DX10 performance, which is why we're doing a followup. I would think it would be clear that, if I were buying a card now, I'd buy a card that performed well under DX9.
All the games I play are DX9, all the games I'll play over the next 6 months will have a DX9 codepath, and we don't have DX10 tests that really help indicate what performance will be like on games designed strictly around DX10.
We always recommend people buy hardware to suit their current needs, because these are the needs we can talk about through our testing better.
TA152H - Thursday, June 28, 2007 - link
OK, that recommendation part is a little scary. You should be balancing the two, because, as you know, the future does come. DX9 will exist for the next six months, but there are already games using DX10 that look better than DX9. Plus, Vista surely loves DX10.
But we can agree to disagree on what's more important. I think this site's backward-looking style is obvious, and while I fundamentally disagree with it, at least you guys are consistent in your love for dying technology. Then again, I still prefer Win 2K over XP, so I guess I'm guilty of it too, but in this case my primary concern would be DX10. It's better, noticeably so. But the main thing is, you're judging something by what it's not made for. AMD's announcement made it very clear that DX10 was the main point, along with HD visual effects. Yet you chose to test neither and condemn the hardware for legacy code. Read the announcement, and judge it on what it's supposed to be for. Would you condemn a Toyota Celica because it's not as fast as a Porsche? Or a Corvette because it's got bad fuel economy? I doubt it, because that's not why they were made. Why condemn this part without testing it for what it was made for? I didn't see DX9 mentioned anywhere in their announcement. Maybe that was a hint?
Chaotic42 - Thursday, June 28, 2007 - link
Yes, but how many people are going to purchase low-to-mid range cards to play games that aren't coming out for several months?
poohbear - Thursday, June 28, 2007 - link
A Celica compared to a Porsche?!?! Dude, that analogy is waayyyyyy off. How the hell is a Toyota Celica supposed to represent DX9 and a Porsche DX10?!?! With a Porsche you can see instant results and enjoy it instantly; there's nothing out right now on DX10, and I don't think even in 3 years the DX10 API would ever encompass the difference between a Celica and a Porsche. Get over yourself.
KhoiFather - Thursday, June 28, 2007 - link
Wow, what worthless cards! Like does ATI really think people are going to buy this crap? Maybe for a media box and that's about it but for us mid-range gamers, it's worthless! All this hype and wait for nothing I tell ya!
Chadder007 - Thursday, June 28, 2007 - link
Yeah, WTF?? They are all sometimes WORSE than the X1650XT!!! What is going on? According to the specs they should be better; could it still be driver issues?
tungtung - Thursday, June 28, 2007 - link
I don't think drivers alone will help much... besides, ATI has never really been known for magically putting strong numbers out through driver updates. Personally, I'd say the 2xxx line from AMD/ATI has just sunk into the deep abyss. First it was months late, and the performance is light years behind... all the while the price is just, well, not right.
As much as I hate saying this... it seems we'll have to wait till Intel dips its giant feet into the graphics industry before nVidia and (especially) AMD/ATI wake up and think carefully about their next products (that is, if Intel can bring a competitive product)... especially in the mainstream and value market.
OrSin - Thursday, June 28, 2007 - link
Very easy to guess what is happening here. Both camps are targeting the high-end gamers who switch to Vista and can afford the high-end cards, plus the OEM deals, so they can push Vista on people again. Neither company wants to lower their high-end sales by releasing a mid-level part. I just wonder if the cards are more expensive to make for DX10. I don't see a reason they would be, but maybe I'm wrong. Until Vista is used by more gamers, my guess is they will not release a mid-range card.
Early adopters of the software are getting screwed.
DigitalFreak - Thursday, June 28, 2007 - link
Agreed. I couldn't help laughing when I read the Final Words section. Kinda like "The Nvidia 86xx/85xx cards suck, and the ATI 26xx/24xx suck worse!"
WTF happened this generation? The only cards worth their salt are the 88xx series. Nvidia dropped the ball with their low end stuff, and AMD.... well, AMD never really showed up for the game.
smitty3268 - Thursday, June 28, 2007 - link
I think it's clear that with these low end cards, ATI and NVIDIA both came to the conclusion that they could either spend their transistor budget implementing the DX10 spec or adding performance, and they both went with DX10. Probably so they could be marketed as Vista compatible, or whatever. It's still a mystery why they didn't choose to make any midrange cards, as they tend to sell fairly well AFAIK. Perhaps these were meant to be midrange cards and ATI/NVIDIA were just shocked by how badly performance scaled downwards in their current designs, and were forced to reposition them as cheaper cards.
Spoelie - Thursday, June 28, 2007 - link
Think about the fact that the x1950xt has fewer transistors than an HD2600XT, and this is even more disappointing.
coldpower27 - Thursday, June 28, 2007 - link
There just wasn't much choice: 390 million transistors for a midrange part on ATI's side that performs worse than Nvidia's 289 million transistor part is quite a sorry state of affairs. It's too bad this generation was so expensive on the feature front that barely any transistor budget was left for performance, and we're left with hardware that only performs marginally faster, if that, than the previous generation's products.
I am quite disappointed that ATi parts are currently slower despite having a larger transistor budget and higher core clock.
TA152H - Thursday, June 28, 2007 - link
Maybe because they weren't designed for DX9 performance, to state the obvious. They are DX10 parts, and should be judged on how well they perform on that.
Shintai - Thursday, June 28, 2007 - link
DX10 sucks on both the 8600GT/S and the 2600XT, unless playing at 5-8 fps is your thing. A 2900XT/8800GTS/GTX is needed for DX10. And better yet, SLI/CF or the next generation.
DX10 on these midrange nVidia and AMD GPUs is 100% useless.
And for what reason do you think they will perform magically better in DX10? The 2900XT didn't over the 8800. And there is no reason why these should be any different.
TA152H - Thursday, June 28, 2007 - link
Another person that can't read. I didn't say it would perform better, or worse. We'll see how well it performs when they do the proper tests. Until then, stop the whining. Afterwards, if it sucks, I'll whine with you.
Shintai - Thursday, June 28, 2007 - link
Just read some of the other sites that tested DX10.
Le Québécois - Thursday, June 28, 2007 - link
From what I know, all the DX10 games and applications out there right now were developed for DX9 and received DX10 features as an afterthought. For REAL DX10 we will have to wait for Crysis.
titan7 - Saturday, June 30, 2007 - link
Company of Heroes was designed for d3d10 from the start. It's as much a real d3d10 game as Crysis will be.
coldpower27 - Thursday, June 28, 2007 - link
There won't be any "REAL" DX10 for some time to come; it takes ages to develop native-API games.
swaaye - Thursday, June 28, 2007 - link
I've seen Crysis on an 8800GTX. Don't expect to play it well on less, unless the game devs perform some serious miracles. And I wouldn't bet on that. :)
Le Québécois - Thursday, June 28, 2007 - link
I was replying to that. There is no REAL review or even preview of DX10 performance right now, meaning from a game developed for it from the start. I know very well that you will need a very good video card to play Crysis in its full glory.
gigahertz20 - Thursday, June 28, 2007 - link
If these cards suck this badly in DX9, they are bound to suck even harder in DX10. Don't give me this "oh, they will do better in DX10"... pffff. I'm going to hold off and buy a DX10 card once the games come out; that way I will know what performs best, and by then the GeForce 8900 series will be out in Q3, making prices drop even further on the 8800 line.
TA152H - Thursday, June 28, 2007 - link
You're obviously not very bright. I never said they'd perform better or worse. I said it makes more sense to wait until the results are in before passing judgment. Don't put words in my mouth.
PrinceGaz - Thursday, June 28, 2007 - link
First post! :)
nameisfake - Sunday, July 1, 2007 - link
I have to both agree and disagree about these cards. I agree that they will suck for gaming.
But, I think they can be fantastic in the right application.
I would love a 2600pro in a family pc.
1. Gets rid of onboard video that eats system RAM
2. 128-bit path to its own onboard RAM
3. Hardware built in to offload multimedia from the CPU
4. Low power requirements
5. Cheap
6. Drop to low res and an occasional game will function
A person may want a very fast, modern PC but not be a gamer.
These cards are great for that small market and oems.
My 2 cents.
DigitalFreak - Thursday, June 28, 2007 - link
Dude, that shit died years ago...