56 Comments
Pjotr - Tuesday, December 6, 2005 - link
Why not simply show a screenshot of Task Manager after each benchmark? Then we can see approximately how much of the second core is used by each benchmark.
SemiconductorSlave - Tuesday, December 6, 2005 - link
If you follow the Amdahl's law link in this article and then, at the bottom of the page, follow the link “Reevaluating Amdahl’s law”, you see the author state that Amdahl's law contains ”. . . the assumption that p is independent of N, which is virtually never the case. One does not take a fixed-size problem and run it on various numbers of processors except when doing academic research; in practice, the problem size scales with the number of processors. When given a more powerful processor, the problem generally expands to make use of the increased facilities.”
Isn’t this also what we have seen with video games, that they have always expanded to make use of the increased facilities?
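(For reference, Amdahl's law in its usual form: with a parallelizable fraction p of the work and N processors, the speedup is S(N) = 1 / ((1 - p) + p/N), so for fixed p the speedup can never exceed 1/(1 - p) no matter how many cores are added. The quoted objection is that in practice p itself grows with N.)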
SemiconductorSlave - Tuesday, December 6, 2005 - link
Derek's article mentions in its conclusion, “The real benefit will come in when game developers start working on parallelizing their code as much as possible.” This article is very forgiving of ATI's new driver, as there are already significant benefits to dual-core processors in Quake 4 and Serious Sam 2 if you are using Nvidia 81.xx drivers.
In this article, using the ATI X1800 XL with the Catalyst drivers, Quake 4 showed “no performance difference or issue” in the single-core to dual-core tests. In an article on www.xbitlabs.com titled Contemporary CPUs and New Games: No Way to Delusions!, on page 6, Quake 4 is shown to run much faster on dual core using the Nvidia 81.xx drivers: the X2 3800+ clocked at 2 GHz gets 101.6 fps, while the 3500+, clocked higher at 2.2 GHz, achieves only 98.6 fps. The 3200+, which is clocked evenly with the X2 3800+ at 2 GHz, achieved only 93.4 fps, which means the dual core produced an 8% gain.
Also, on page 5, Serious Sam 2 is shown running faster on dual cores: the X2 3800+ clocked at 2 GHz achieves 103.1 fps, beating out the 3800+ clocked at 2.4 GHz, which achieved 99.2! The 3200+, clocked evenly with the X2 3800+ at 2 GHz, achieved only 84.8 fps. This indicates the dual core produced a 17.7% gain! The author notes on this test, “I have to point out NVIDIA drivers also started supporting dual-core architectures. ForceWare version 81.xx allows enjoying the advantages of dual-core technology in DirectX as well as in OpenGL.”
And to rule out that having twice the cache on an X2 is the reason for the performance difference, you can directly compare the equally clocked 4000+ to the X2 4600+. Each is clocked at 2.4 GHz and has a total of 1 MB of cache. In Quake 4 the X2 gets 110.2 fps while the 4000+ achieved only 103.2, which is still a 6.35% difference.
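(The gains quoted above are computed against the dual-core score: for Quake 4, (101.6 - 93.4) / 101.6 ≈ 8%, and for Serious Sam 2, (103.1 - 84.8) / 103.1 ≈ 17.7%.)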
I think it would be great if this article were appended or these facts included in the update. The end user should know what dual-core performance is available to them now, not think that, because ATI wrote a less-than-successful driver, we have to wait for the game developers before we see any significant dual-core benefits!
Thanks,
Semiconductor Manufacturer and Anandtech fan.
porkster - Monday, December 5, 2005 - link
Ok, just read the article. I thought the test was to be scheduled. Anyway, why are you only testing on an AMD system, and why is there no sign of Black & White 2, one of the new era of games that will push the bus, the system, and the visual experience? Also, Intel are the kings of multitasking and bus bandwidth, so why test dual-core drivers and speed differences on an AMD?
Surely a driver that has two CPU threads going will require more bandwidth on the bus when it is expected to move up to twice the amount of graphics data to the gfx card.
Can you please test a modern game like B&W2 in the future, and at high res? Compare the benefits between the market products. Show results on bandwidth demand and its effect while multitasking the system.
It seems Anandtech is falling behind the times due to bias towards AMD. It may be OK to fool the majority of system owners who still have low PC-memory ratings and legacy AMD stuff into thinking they're modern, but you portray your site as a leading tech reviewer. Please show the best of, not the best frame rates for non-multitasking environments running a technologically old game.
DrZoidberg - Tuesday, December 6, 2005 - link
quote: It may be OK to fool the majority of system owners who still have low PC-memory ratings and legacy AMD stuff into thinking they're modern, but you portray your site as a leading tech reviewer.
Most of the readers here that have dual-core systems have AMD X2 processors. AMD is faster at games than Intel, so it is totally appropriate for AT to test using AMD first. AMD is not legacy; it's more that the general public still views Intel as the performance leader, based on its processors of several years ago, and is still clueless that AMD has now taken over.
Sure, AT should now test using Intel processors just for knowledge's sake, but I bet the performance gain is similar to AMD's, in the 1%-10% range.
Quake 4 just came out recently, DoD: Source is recent, and Battlefield 2 is a popular game that many people play. B&W2 would be nice, but COD2 is the game really missing here.
porkster - Tuesday, December 6, 2005 - link
Actually you may be wrong about AMD being best for gaming; as an example, compare the results of this test to http://www.anandtech.com/mb/showdoc.aspx?i=2631&am...
DrZoidberg - Tuesday, December 6, 2005 - link
Why have you provided a link showing motherboards for Intel at games and no AMD motherboards?
What does that prove? That one particular Intel chipset motherboard is better than another Intel motherboard?
This is a more appropriate article, with benchmarks: http://www.anandtech.com/cpuchipsets/showdoc.aspx?...
Battlefield 2
Pentium D 830 (3 GHz) 92.1 fps
Intel Yonah (upcoming Intel processor) 103.3 fps
AMD X2 4200+ 120.8 fps
porkster - Wednesday, December 7, 2005 - link
Compare the Far Cry results from this test to the ones in the article I linked to.
The Intel system is far better, yet it's not using the dual-core-style gfx drivers yet.
I'm starting to think Anandtech doesn't want to compare the latest games between the best of AMD and Intel, as Intel will show it's better, especially at maintaining FPS during multitasking.
SemiconductorSlave - Wednesday, December 7, 2005 - link
Like in these 4 benchmarks? http://www.anandtech.com/cpuchipsets/showdoc.aspx?...
"We have Firefox loaded with all 13 tabs from our new suite test, iTunes is running and playing a playlist, and Newsleecher is downloading headers. We kept Newsleecher in this test simply because it's the best way for us to be able to have a fairly CPU/disk intensive downloading task running in the background while still maintaining some semblance of repeatability." --Anand Lal Shimpi
porkster - Thursday, December 8, 2005 - link
TALK ABOUT OLD REVIEW. Sorry, be relative.
DrZoidberg - Wednesday, December 7, 2005 - link
lol, totally owned. Great link, semiconductorslave.
Here's another link that took 5 secs to find, since almost all the websites show AMD is better:
http://www.xbitlabs.com/articles/cpu/display/28cpu...
porkster - Tuesday, December 6, 2005 - link
You are failing to see that the majority of AMD CPUs are legacy, as they have poor ability with modern software. Only AMD's top-end CPUs are anything to consider; all the others aren't suitable. AMD only performs well in single tasks. Intel are the KINGS of multitasking and bandwidth across the whole of their range.
Games like DoD2 and Doom2 are not pushing the gfx routines and the system; Black and White 2 does. B&W2 is a front-line game, whereas the games in this test are safe bets for all processors. The point is, this test was to measure performance for a dual-core device driver, so you would expect to see the best of software.
SemiconductorSlave - Tuesday, December 6, 2005 - link
Well, I'll agree that HyperThreading benefits the P4 for multitasking, but let's not consider legacy; comparing each company's latest dual cores: the Intel Pentium Processor Extreme Edition 840, $1029 at Newegg, and the Athlon X2 4800+, $787 at Newegg.
From Hardocp.com: http://www.hardocp.com/article.html?art=NzY2
AMD uses what they call Direct Connect architecture. Instead of two processor cores being saddled to one bus and run to a single memory controller, as we see with Intel dual-core technology, we have to remember that AMD Athlon 64 processors have the memory controller on the CPU die itself, and therefore no “front side bus” is needed. So each CPU on our dual-core X2 has a much quicker route to the memory controller, as with current Athlon 64 processors. So still the biggest benefit of the entire K8 core system is shining through in AMD's Athlon X2 line, in the ways of HyperTransport and its extremely wide bus width when compared to Intel’s dual-core 800 MHz bus.
Sandra Memory Bandwidth
Intel EE 840 Dual Core 4331
AMD X2 4800+ 5801
From same article
"On the dual core front, when you look at AMD’s flagship Athlon 64 X2 side by side with Intel’s flagship Pentium Extreme Edition 840 with HyperThreading, the obvious HyperThreading advantages seemingly disappear. In comparing single threaded applications, the Athlon 64 X2 4800+ shines over Intel’s 840 in our benchmarks."
porkster - Tuesday, December 6, 2005 - link
First off, sorry for the spelling and typos in my last post. I rely on EDIT a lot, which this commenting system doesn't have, hehe.
OK, now, I'm not bagging the X2 or AMD's top range, as they are capable of doing the task required, but for this test it's important to have Intel as the main test bed, since it's got a higher threshold for the things tested.
Like, I can't imagine games use all the sustained bandwidth yet, but a game playing while multitasking should put enough strain on it to show degradation in the memory data available for game textures, etc.
If I'm correct from quick calcs, an AMD system with top-range DDR1 memory running a game at high refresh rates may only have 30 MB of bandwidth per game frame to play with. On the Intel that is about 55 MB per frame. Now these sound like high values, but when you consider multitasking, instantaneous demands, hi-res game textures, etc., you start seeing the limitations.
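(Presumably the quick calc is peak memory bandwidth divided by frame rate; for example, dual-channel DDR400 at a theoretical 6.4 GB/s and 200 frames per second would leave about 32 MB of bandwidth per frame. The exact platform figures behind the poster's 30 MB and 55 MB numbers aren't shown.)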
This test fails to test multitasking, it fails to place the strain of FRONTLINE games on the bus, and it fails to compare the best CPUs for the situation.
porkster - Monday, December 5, 2005 - link
If gfx drivers are going to start using multi-core CPUs and their threads, then that will surely increase instantaneous bandwidth demand on the memory bus, something where Intel trumps AMD.
I hope the new dual-core driver test compares the difference in bandwidth use, and the effect that has, between AMD and Intel based systems.
If you own an AMD system without top-notch expensive RAM, then you may be trapped in a past era, as new games demand more bandwidth and drivers are starting to use more too.
yacoub - Monday, December 5, 2005 - link
So after looking at the charts, it looks like if you use single core, stay away. Also, no word on whether they fixed the issue with FEAR that was mentioned last month as NOT being fixed in 5.11.
On a side note: Anyone wonder if this is how they will start to push people from single to dual core? (That is, offering improvements for DC at the expense of SC performance.)
wien - Monday, December 5, 2005 - link
There's no other way of doing it, really. Multi-threaded code will always run slower than the equivalent single-threaded code on a single-core CPU. (As long as you count threads waiting for disk I/O and things like that out of it, that is.) If apps are ever to go the multi-threaded route, single-core performance will suffer...
stephenbrooks - Monday, December 5, 2005 - link
There's a thing called an "if" statement :) You can write "if (nprocessors>1) {/* Multithreaded code */} else {/* Single-threaded code */}".
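A minimal sketch of that idea in C on Windows (the platform these drivers target), using GetSystemInfo() to read the logical processor count; the two submit_* functions are hypothetical stand-ins for the driver's real work:

#include <windows.h>
#include <stdio.h>

/* Hypothetical work paths, standing in for whatever the driver actually does. */
static void submit_singlethreaded(void) { puts("single-threaded path"); }
static void submit_multithreaded(void)  { puts("multi-threaded path"); }

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);  /* dwNumberOfProcessors = number of logical processors */

    if (si.dwNumberOfProcessors > 1)
        submit_multithreaded();   /* push some driver work onto a second core */
    else
        submit_singlethreaded();  /* skip the threading overhead on one core */

    return 0;
}

One caveat: a HyperThreaded single core also reports two logical processors here, so a real driver would want a smarter check than a raw count before taking the parallel path.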
Questar - Monday, December 5, 2005 - link
BS. There are many apps that are multithreaded that don't take a perf hit on a single CPU.
yacoub - Monday, December 5, 2005 - link
Is this the new Cat driver that fixes the FEAR.EXE bug?
bldckstark - Monday, December 5, 2005 - link
The gentlemen at Sandia National Labs don't see the parallelization "problem" the same way as most of the computing community, and take exception to Mr. Amdahl's observation. They have shown that the possible multiple-CPU parallelization improvement can exceed Amdahl's theoretical maximum. They state that as the number of processors increases, the size of the problem introduced to the system increases, allowing speeds above those previously thought possible. Here's a link to the horribly boring article:
http://www.scl.ameslab.gov/Publications/Gus/Amdahl... ("Extremely Boring Paper With Too Much Math")
Thank God we got that cleared up. Now maybe I'll be able to sleep tonight. 8>}
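(The scaled-speedup formula from that paper, now usually called Gustafson's law, is S(N) = N - s(N - 1), where s is the serial fraction of the scaled workload. With s = 0.2 and N = 2 it gives 1.8x, whereas Amdahl's fixed-size formula with an 80% parallel fraction, 1/(0.2 + 0.8/2), allows only about 1.67x.)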
phusg - Monday, December 5, 2005 - link
Interesting review, although disappointing performance.
Just commenting to say don't forget about us SMPers! All the talk is about DC, although not everyone with dual CPUs has them on the same die!!! I have a dual 760MPX board with a Radeon 9600 Pro and am hoping these driver improvements will really help me out.
Humble Magii - Monday, December 5, 2005 - link
This article is silly....
Seriously, I have been a long-time Anandtech reader, and this past year I have noticed a huge dropoff in quality! Whatever happened to Anand's in-depth reviews of CPUs and their architecture? I don't see that type of enthusiasm or detail here anymore.
Are the specific reviewers for certain articles biased? It seems that whoever reviews a certain product on this site nowadays is biased. The new ATI drivers are crap, not to mention still in beta.....
Who the hell cares about a driver update? I mean, seriously, unless it was a huge change for the better or worse, why report it? I don't see reviews of ForceWare changes, or an AT article on the dual-core drivers for Nvidia. I suppose there are no pro-Nvidia people at AT, or at least maybe Nvidia doesn't pony up to AT's reviewer here?
Who has problems with Nvidia's newer drivers? I sure as hell haven't, and I have a 4800+ with two 512 MB 7800 GTXs. I have never had problems with Nvidia's drivers. Everyone has a different system, and just because you yourself have a problem when the majority do not does not mean the product sucks. I have had issues with a BIOS update utility due to beta Creative drivers for an SB Audigy; I mean, please, people, I would never have thought of that, but most problems happen because of what other software you have installed.
Who cares about the ATI X1K series? I don't, and most of the world doesn't, especially since they are supposedly replacing it soon next year, so why bother? They aren't even cost-effective performance solutions compared to Nvidia. Oh, I would love to try a CrossFire solution; too bad there are zero around, so guess what: the only performance in graphics today is Nvidia. That's what we are stuck with now.
I swear, no matter what trash ATI puts out, this site will post it and put it in a better light. The forums here are even worse, with all the newbs and their uneducated hate and bias against other solutions.
The site is dying. Anand, please replace Derek Wilson and some other members and start taking control of the reviews again; everyone sorely misses you :(.
heulenwolf - Monday, December 5, 2005 - link
I have to agree on the point that, since this driver update provides limited improvement, it may not really warrant all the work Anandtech is putting into it. I own graphics cards from both ATI and NVIDIA and tend to go for midrange cards, so I can see some relevance for similar users who happen to own an X1000 series card. At the same time, as the previous poster points out, the advantage from a change of cards is far greater. So why so much testing that it has to be broken up into multiple reviews, for a single driver providing such a small performance gain to users of a single line of cards who run at low resolutions and only play certain games? The benefit just seems small for so much work, unless Anandtech can substantially tie it to a more general problem. My recommendation to the author, then, is to spend a little more time up front describing why you did all the testing - what it tells us - and tie it to a more general problem users may have.
bob661 - Monday, December 5, 2005 - link
So the WE, the user, can know that this driver only provides "small performance gains". How else would you know?
bob661 - Monday, December 5, 2005 - link
Damn it..... So THAT we...
heulenwolf - Tuesday, December 6, 2005 - link
So, now we know. Where's the need for a follow-up?
hondaman - Monday, December 5, 2005 - link
I think this review is quite interesting. There is a lot of chatter in the channels about how DC is useless for gaming. It's reviews like this that give us guys with dual-core CPUs hope that our CPUs are good for more than just playing MP3s and encoding movies all at the same time.
DrZoidberg - Monday, December 5, 2005 - link
Well, after reading this article I still feel DC is only slightly useful when you are solely gaming. I mean, why pay $200 extra for a 2nd core to get a 5% benefit at low res, when at high res the benefit drops to like 2%? The X2 3800+ and 3200+ are both 2 GHz and about $200 apart, and with these dual-core drivers the 3800+ wins by 5%.
Rather, put the extra $200 toward the video card if the main purpose of the comp is to game, and the improvement will be more like 30%+, like going from a 6600GT to a 7800GT. However, multitasking is a totally different story. Encoding a movie and gaming at the same time: then dual core is very worth it.
porkster - Monday, December 5, 2005 - link
You may only get 5% as the games are not taking advantage of the new features.
mbhame - Sunday, December 4, 2005 - link
Who wrote this article?
stephenbrooks - Monday, December 5, 2005 - link
Derek Wilson
PrinceGaz - Sunday, December 4, 2005 - link
I have an X2 4400+ and, like many other people, have been forced to revert to the 7x.xx ForceWare drivers, because the new dual-core drivers cause certain well-known OpenGL applications (3DS Max and Paint Shop Pro, for instance) to hang when trying to start them. If you haven't heard of this problem, just try googling and you'll get plenty of hits.
I'd rather have nVidia fix bugs before adding new performance-enhancing features, but sadly it is all about getting a few extra percent over ATI in the latest games, it seems.
hondaman - Monday, December 5, 2005 - link
Nvidia claims that their drivers have DC optimisations, although I haven't seen any review that shows one way or the other whether they really do.
I personally found this "review" to be quite interesting, and hope Anandtech does the same for Nvidia and their newest drivers.
mmp121 - Sunday, December 4, 2005 - link
Derek,
Do the drivers show any improvement while using a single-core CPU with HT enabled? Are they supposed to? How do they affect previous-generation hardware? Are the tweaks only good for the X1000 hardware? You asked for suggestions, I gave some. Hope to see some of 'em answered.
stephenbrooks - Monday, December 5, 2005 - link
^^^ above are good questions
johnsonx - Sunday, December 4, 2005 - link
Seems to me ATI had best get to the bottom of the single-core performance deficit in these 5.12 drivers before they come out of beta. All the fanbois would get their panties in a wad if the new driver hurt performance in top-end FX-57 gaming rigs. If nothing else, they could include regular and DC-optimized versions of the key driver files and install them based on detecting 1 or 2(+) cores.
Actually, what might be even better from a marketing point of view is if they had a 'regular' driver that works fine for all systems, and a separate 'dual-core optimized' driver. Nothing gives users the warm fuzzies like being told 'oh, for YOU we have a special, better driver.' Later on, once dual-core is almost universal in new systems, they could just unify the driver again.
wien - Sunday, December 4, 2005 - link
Though it's a good idea, I fear the changes they have made to "parallelize" the driver can't be plugged in and out that easily. And if they can't, ATI would have to keep two separate code trees (single-core and dual-core) for their drivers, and update them both every time they come up with an improvement. What would probably end up happening is that the single-core version would be more or less stagnant in terms of development (but with version numbers increasing, of course), and the DC version would get the actual improvements. (Or the other way around... for now at least.)
Pannenkoek - Sunday, December 4, 2005 - link
The effort to optimize their dual-core drivers to mitigate the single-core performance loss is far less than keeping two parallel branches of their drivers in development. This is beta software; it's not as tuned as it can be. We won't know how the performance will be until the driver actually gets released.
mlittl3 - Sunday, December 4, 2005 - link
That's a good idea.
huges84 - Sunday, December 4, 2005 - link
How much RAM did the test system have?
Furen - Sunday, December 4, 2005 - link
Since integrated graphics cards are the ones that (currently) lack things like vertex shaders, they probably will get a much more "dramatic" performance increase from dual-core drivers.
Cybercat - Sunday, December 4, 2005 - link
amazing improvements! At 800x600...
Pannenkoek - Sunday, December 4, 2005 - link
Indeed. I guess this article has only been posted because someone at Anandtech worked on it, and didn't want to have wasted his time entirely on some beta drivers no one cares about.
Cygni - Sunday, December 4, 2005 - link
Ya, I'm sure all those people with dual-core rigs, and all the people that will have dual-core rigs by the end of this year (probably everybody on this board), don't care about dual-core driver improvements.
In other news, I hope that post was a joke...
Pannenkoek - Sunday, December 4, 2005 - link
This is a _beta_ driver, not a released driver. Anandtech could have waited for the actual release. Now we won't be seeing any article on the real thing when it comes out, is my guess.
Yeah, I'm sure all those people with bleeding-edge dual-core processors and newest-generation ATi cards will rejoice at their 5 extra frames per second at the lowest humanly tolerable screen resolution in this age.
Besides, it can only be pathetic if they actually get real performance improvements. At higher resolutions, a gain would only show how lousy the current drivers are, if they use so much CPU power that off-loading to another core makes an impact in benchmarks. And at lower resolutions they apparently stall their rendering pipeline with the current drivers. Thumbs up.
Andyvan - Sunday, December 4, 2005 - link
Are you seriously saying that it would be better not to know this? I was also under the impression that they were going to post a follow-up with more tests.
-- Andyvan
Pannenkoek - Sunday, December 4, 2005 - link
I never said it would be better not to know this, whatever "this" may be. The article states explicitly that there'll be a follow-up; I was a bit too cynical, perhaps. ;-)
The point is, there's a lot of fluff about some beta driver which "takes advantage of dual core". Earth-shattering. It's the least any graphics card driver developer could do. And I welcome any comprehensive tests and articles by Anandtech on real products. I just don't see any point in this article, especially not in the light of an upcoming "complete" article. (I see one: self-advertisement.) Good effort by the Anandtech crowd, but it annoys me after all the other prerelease & beta & exclusive vapourware reviews.
Ryan Smith - Sunday, December 4, 2005 - link
Keep in mind that ATI's drivers seldom change once they hit beta. Once they're in beta, ATI is usually done with any coding and internal testing, and they're simply handing the drivers out to their partners for testing, in case a problem crops up in the unlikely event. So I certainly wouldn't consider these drivers vaporware, since they will be in everyone's hands in another week or so.
As for why we did this article instead of waiting for the complete article: we felt it was more important for you to be able to see what ATI's dual-core changes are capable of now, on the best hardware, rather than wait longer for a full-comparison article. With the need to swap CPUs on top of everything else, it's going to take us some time to finish the full article. We could certainly wait until we have every last benchmark done, but we'd rather show what we have now and get feedback from you guys than keep you in the dark any longer. =)
bob661 - Monday, December 5, 2005 - link
Don't mind him, Ryan. Some people won't shop for a Ferrari until they can afford one.
wien - Sunday, December 4, 2005 - link
Way to talk for everyone... I care, so there.
Jep4444 - Sunday, December 4, 2005 - link
It's not like games these days are CPU-bottlenecked; that's why we really only see improvements at 800x600. nVidia doesn't gain much in the higher resolutions either.
porkster - Tuesday, December 6, 2005 - link
Obviously you don't multitask? Like, do you run a BitTorrent client downloading off ADSL2 whilst playing a game, or run an IIS server in the background, or run other apps?
The days of the single-taskable computer are gone, as most users want multitasking due to their better understanding and use of their machines.
keitaro - Sunday, December 4, 2005 - link
That's odd. I thought they were going to use either the X2 4800, the 4400, or the 3800 CPU for the test... I'm a little surprised that they'd go for the 4600 to benchmark this.
johnsonx - Sunday, December 4, 2005 - link
What difference does it make? It's a dual-core CPU. For this sort of test, it makes no difference whether a 4600 is the most popular to buy or not (which I agree it isn't).
Shimmishim - Sunday, December 4, 2005 - link
first post!
Looks promising for ATI.