I thought the bit-tech review (http://www.bit-tech.net/gaming/2005/10/24/fear/1.h...) was more relevant: they actually sat down and played the game, proving that the built-in graphics test doesn't bear any resemblance to real gameplay. Here's a good quote:
quote: To give an idea of how intense our manual run-through is in comparison to the built-in "stress" test, the average and minimum frame rates for the NVIDIA GeForce 7800 GTX at 1280x960 2xAA 8xAF were 41 fps and 11 fps respectively. In contrast, the stress test reported a 55 fps average and a 35 fps minimum frame rate, with only 17% of the frames below 40 frames per second. We think it is fair to say that the stress test isn't really much of a "stress" test after all.
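To make the distinction concrete, here is a minimal sketch of how a manual run-through gets summarized into those three numbers from a per-frame time log (such as a FRAPS frametimes dump). This is not bit-tech's actual tooling; the log format and values are hypothetical:

```python
# Minimal sketch: summarize a per-frame time log (milliseconds per frame)
# into average fps, minimum fps, and the share of frames below 40 fps.
# The input numbers below are made up for illustration.

def summarize(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms if t > 0]
    avg = sum(fps) / len(fps)
    below_40 = sum(1 for f in fps if f < 40) / len(fps)
    return avg, min(fps), below_40

# e.g. mostly ~24 ms frames with one 90 ms hitch during a firefight
avg, low, below = summarize([24, 25, 23, 90, 24, 26, 22, 25])
print(f"avg {avg:.1f} fps, min {low:.1f} fps, {below:.0%} of frames below 40 fps")
```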
Having to upgrade your hardware to the latest and greatest to get good-looking games is crazy. In the last two years it's spun out of control.
My 9700 Pro lasted the longest, then I bought a 6800GT for $359... my next purchase is an Xbox 360 for $399. I am sure if they make a version of FEAR for it, it will look great on my 50-inch HD Sony TV. Ahhh, and it probably won't ship full of bugs.
Damn, if it were not for PC games my system would be a 1GHz P3 with 512MB of RAM!
I'm a little reluctant to upgrade this year too. The worst thing about it is that when they make a good, optimized graphics engine, like HL2's or Far Cry's, it doesn't seem to last very long. I expected at least a few other developers to make good-quality games with them, but that never happened. So the end result is that you're upgrading your PC for 2 or 3 titles a year. id's OpenGL-based engine was thankfully the only real "big" seller, with Riddick and Quake 4. Plus there's the problem that if a game turns out not to be to your liking, you're stuck with a thousand dollars' worth of useless hardware.
I got the game, installed the new patch, and have been running @ 1024x768 just fine with the majority of the goodies on. Granted, no soft shadows, and 4x rather than 8x or 16x, but it looks beautiful to me. That's on a 2.4GHz Northwood @ 3GHz, 1GB of PC3200, a 36GB Raptor, and an AGP 6600GT @ 550/1.1. Try it before you bash it, ya know?
You make reference to how the soft shadows are implemented in Riddick compared to FEAR's, yet I searched the site and there are no benchmarks or IQ comparisons of Riddick. If you ask me, that's a major problem, considering you have no published evidence to back up your own statement.
This should be a game review... not a GPU review. Review the game, play the game how you'd actually play it... with sound enabled. THEN show us the FPS measurements.
A large number of Anandtech readers do not comprehend anything other than "GPU review" so you will likely not see a true game review anytime soon on a realistic rig. It's always only ever a GPU test with an FX-55. =/
You include the 6600GT and 6800GT but not the X800XL and X800XT, the two comparable cards. Stop with the 1800-series nonsense and post the BUYABLE ATI cards as well please! Would be nice for those of us considering upgrading to an X800XL or 6800GT to see how they stand up in FEAR. :(
But games like this make up for the lower settings.
My friend came over last night and we played FEAR online for 6 hours.
He has a comp I built him with a 6600GT, and it ran great on some custom settings and didn't look at all subpar. It didn't lag ONCE all night. The test program in the game is really cool too, so now I don't have to sit there with FRAPS and stuff running forever.
In multiplayer, the gameplay is so fast that most of the time there is NO time for you to admire the scenery anyway.
It says the GeForce 7800 is basically the only card that can run it at the highest end. I just ordered an Alienware about a week ago with dual GeForce 7800 GTX KOs, a 19" LCD monitor, 4GB of DDR2 RAM, and a 3.2GHz dual-core Pentium D... my question is, do I have anything to worry about in the upcoming months/year graphically?
Since you are probably limited to 1280x1024 on your 19" LCD, you are fairly safe. However, the 840D will have issues providing enough data to a 7800 GTX SLI setup (the GPUs will have wait states) at higher resolutions such as 1600x1200, in case you decide to change monitors. I have found that a 7800 GTX SLI setup with an 840EE runs FEAR at 1280x1024 (960) without too much of an issue.
The answer to this question is always yes. You just bought an excellent system to play today's games, not tomorrow's.
Is there any way to poll AT forum readers and establish which cards get tested on bleeding-edge software? Wouldn't that help us see data that pertains to the majority of us? I understand that you can't evaluate every possible card/model/resolution variation, but a current reader-based poll might help.
Post in the forums.
This is one of the reasons why I don't think the 7800 GTX or X1000s are worth buying. I feel sorry for the people who paid over 600 dollars for them when they can't even play FEAR @ 1280x1024 with AA.
FEAR is quite clearly poorly optimized. However, claiming that people pay over $600 for a GTX without being able to play at 1280x1024 with AA is totally wrong. It is easily playable, as the review shows, for less than $600 (price-wise, at least for the GTX). Not to mention, you can even kick the resolution up to 1600x1200 and get only slightly unusable FPS.
Specifically, at 1280x1024 with all settings maxed except soft shadows, the GTX gets a playable 39 fps. ATI is off the mark, but NVIDIA is okay. As for the cost of the 7800 GTX, it is (as of now, off Newegg) available in $20 intervals from $460-$500, with the $500 version including BF2, plus one $580 version. Clearly, you can get the GTX for over $100 less than your "$600" price. And no, exaggerating by over $100 is not negligible, not at all.
Note: by "slightly unusable" I don't mean slightly problematic, but rather that it is in fact unplayable yet misses the playability mark by only a little.
quote: This is one of the reasons why I don't think the 7800 GTX or X1000s are worth buying. I feel sorry for the people who paid over 600 dollars for them when they can't even play FEAR @ 1280x1024 with AA.
I would argue that if anything, F.E.A.R. is likely just poorly optimized, and that is the more likely explanation. I've seen screenshots, and so far I'm not impressed enough to put down the money. The graphics don't seem anywhere near as good as the hype has claimed (previous Half-Life 2 shots look far better, IMO; perhaps I have to play it to see). Add the fact that there's already a 1.01 patch on the day of the game's release, and I think that's a symptom of a game that needs more under-the-hood work. I'll wait to see the results of testing with more games; one is not enough.
P.S. To all who said this review should have had more ATI cards: you were right on the money. This review has the GeForce 6600GT and 6800GT, and doesn't even include the ATI counterparts to them (read: X800GT, X800XL)? That's poor.
I really do think developers have either reached the limit in optimizing their code, or they are too lazy to do so. Or perhaps it's a conspiracy between ATI/NVIDIA and developers? The fact is, you shouldn't NEED a $600 video card to run some of the games coming out today. The sheer lack of performance shown here on a high-dollar card shows us that something is wrong in the industry.
Anyone notice a trend here? Supposedly the GPU power of the cards keeps increasing; the X800 claims to be two times faster than an "old" 9800 Pro. Yet the game engines being written today can't crank out more than 40 fps at a measly resolution of 1280x1024? Something is wrong in the industry. As someone else said in another post, something has got to give.
The problem is simple: PC game developers have no limits to speak of. They know there is always something new coming that will run their game perfectly. That's not the case in the console market. Since console developers are going to be "stuck" with the same hardware for 4-5 years, they HAVE to optimize their code. That's why you see games on the same system (the GameCube, for example) with graphics twice as beautiful as older games running on the SAME hardware.
Take RE4, for example: nobody even thought that level of graphics could be achieved on a GC... but it was.
I'd say this was a fairly good performance review except for the choice of graphics cards.
The current nVidia cards were an excellent choice, including both 7800 models and the popular GF6 cards (6800GT and 6600GT), from which the performance of other 6800/6600 variants can be extrapolated. Given the use of a PCIe platform, the only cards I would add would be a standard 6200 (not TC) and a PCX5900; the PCX5900 would give FX5900 owners a good idea of how their card would perform and be a guide to general GF5-series performance. A 7800GTX SLI setup is also needed to show what benefit it offers, but I wouldn't bother testing anything slower in SLI, as it is not a viable upgrade.
The ATI X1000-series cards included were also an excellent choice, but using only an X800GT from the previous generation was woefully inadequate. Ideally an X850XT, X800XL, and X700Pro would also be added to give more complete information. For the generation before that, just as a PCX5900 could stand in for nVidia, an X600Pro/XT could stand in for ATI, as it is equivalent to a 9600Pro/XT. It's a pity there isn't a PCIe version of the 9800Pro, but a 9600Pro/XT would be the next best thing. Until you can set up a Crossfire X1800XT, there is no point including any Crossfire tests.
So my recommended graphics-card selection is: nVidia 7800GTX SLI, 7800GTX, 7800GT, 6800GT, 6600GT, 6200, PCX5900; ATI X1800XT, X1800XL, X1600XT, X1300Pro, X850XT, X800XL, X800GT, X700Pro, X600Pro/XT. That may seem a daunting list, but it is only 16 cards instead of 10, so it is not overwhelming. All the cards are PCIe, so you only need one test box, and it includes a good selection of old and new cards.
The only other thing I'd change is the test system. The FX-55 processor is fine, though an FX-57 would be even better; people who suggest using something slower when testing slower video cards are missing the point of a video-card review. I would up the memory to 2GB (2x 1GB), though, just to keep stuttering from affecting the results, even if that means slowing the timings slightly to 2-3-2.
Oh, and your selection of video cards seems pretty good to me :P since people with a 9800 Pro should perform close to the X700 Pro.
The fastest CPU is good if you want to know exactly how well a GPU does in a game, but that still doesn't reflect the majority of the people who will run the game; that's why a slower CPU could be nice. If the idea behind this review was to show people how well their hardware will do in this game, then only using the best of the best is not the best way to achieve that goal.
The aim of video card reviews is to show, as well as possible, what the video card is capable of when all other variables (such as CPU limitations) are removed from the equation. That's why even an AGP GeForce2 GTS would preferably be tested with a high-end FX-57 processor, so that the performance is determined entirely by the graphics card.
If you use slower CPUs with slower graphics cards, it is difficult to say for sure whether the CPU or the graphics card is the limiting factor. All a review that mixes and matches CPUs and graphics cards can say is, "this combination went this fast, but we have no idea whether the CPU or the graphics card was the limiting factor, so we don't know if you should buy a faster CPU or a faster graphics card."
And that's why they should do at least a couple of tests with a slower CPU, to see whether it affects the FPS or not. I'm not saying screw the FX-57; I'm just saying, why not run a couple of tests with a slower CPU to see if it makes a difference?
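A minimal sketch of the inference both sides are circling here, assuming you rerun the same timedemo on the same graphics card with two different CPUs (the numbers and the 5% tolerance are hypothetical choices, not anything from the review):

```python
# If fps barely moves when the CPU gets much faster, the graphics card
# (or something else downstream) is the limit; if fps scales up, the CPU was.

def likely_bottleneck(fps_slow_cpu: float, fps_fast_cpu: float,
                      tolerance: float = 0.05) -> str:
    if fps_slow_cpu <= 0:
        raise ValueError("fps must be positive")
    gain = (fps_fast_cpu - fps_slow_cpu) / fps_slow_cpu
    return "CPU-limited" if gain > tolerance else "GPU-limited"

# e.g. the Athlon 64 3200+ vs FX-57 case mentioned later in the thread,
# where the two reportedly run FEAR within 1-2 fps of each other:
print(likely_bottleneck(38.0, 39.5))  # -> GPU-limited
```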
Good article, but I really, really hoped you would have compared a 9800 Pro. It's a very, very popular card that many of us still have, and we'd love to know how it performs!
I've seen a trend in GPU reviews lately: fewer old cards are used. This review says "see if you need to upgrade," but how can I tell when my old card is not there and I have no point of reference, even against the X800 GT? I would like to see more old cards. The 9800 Pro is probably a card a LOT of people have; it would be great to include it and perhaps 2-3 more from around that generation.
"why would anyone make a game with a no name legacy that has no ability to be played in full with a system that would cost 3000 dollars? "
This is a huge exaggeration. I have a new, just-under-$2000 system with an X2 4200+, a 7800GT, and 2GB of memory. I can play FEAR at 1280x1024 (the highest resolution my monitor supports) with all graphics settings turned on (except soft shadows, I believe) with no noticeable issues. So while my system is high-end, zero playability on a $3000 system is a MASSIVE exaggeration.
In fact, for ~$2400 (the cost for me to upgrade my system to 7800GTX SLI), I could play the game flawlessly at 1280x1024 with all settings turned on. The SLI is probably overkill in any case, meaning that for ~$1900 you can play FEAR easily. Not to mention the graphics are OUTSTANDING at this level. The gameplay is amazing, and the physics are to die for (I've spent hours mesmerizing friends during one part of the demo where you can flip an enemy head over heels with the bolt-gun thing). So while FEAR is INCREDIBLY demanding, the game is incredibly amazing.
"sick of this assinine increase in resolutions. I bet the game actually sux after playing it for 2 days like most do."
Like Derek said, this game is outstanding. It's not absolutely and utterly captivating like HL2, nor is it as practical as HL2, but the game is still awesome.
"with a no name legacy"
Companies have to make a name for themselves. Great gameplay is one way, groundbreaking graphics is another, and a combination of the two is even better. FEAR has both. Additionally, Monolith has come out with great games in the past that have been rejected by the market. If you want me to come up with some examples, I'll dig up some games and give you some.
Granted, these games aren't "uberly leet" like HL2, but you can't expect every game to get ratings as high as 95+.
Basically, don't post irrational posts that exaggerate the truth. This isn't the time to slander FEAR or Monolith or anything else, really. And there's no reason to create "flame wars." Post calmly. Post decently. And post rationally... and I probably have some irrational points in here too, so just point them out and I'll attempt to fix them. If I can't, then you win. Happy?
I also agree with what seems to be the popular opinion that this review is lacking info. It doesn't show the incredibly out-of-reach SLI setups, which makes the review more user-tailored and less FEAR-bragging tailored, but few (if any) AGP cards are shown either. This makes the review tailored towards the "high-end" gaming community instead of the mainstream community.
I would be happier with more lower-end cards shown, more varied graphics settings chosen, and far fewer top-of-the-line cards. For example, instead of both the 7800GTX and the 7800GT, I would have been okay with just the GTX or just the GT. While this leaves out a very nice card, for purposes of practicality it might be better to let users extrapolate the fps for the other card and instead show some mid-range cards as well.
If time doesn't allow for this, then perhaps make it clearer that you intend to release another, more comprehensive review soon, so we get an overview in the first rush review and a more detailed review after a few weeks.
On a side note, I heard rumors that the release of FEAR is multi-threaded. Is this true?
Well, testing it on an X2 vs. an FX-55 would make an interesting benchmark, but all the reviews I have read so far use FX-55/57 processors... And BTW, when CPU speed is taken into consideration, the FPS in FEAR doesn't change much: an Athlon 64 3200+ and an FX-57 both run the game at almost the same speed (give or take 1-2 fps).
Why would anyone make a game with a no-name legacy that has no ability to be played in full with a system that would cost 3000 dollars?
For that, I'd go to Europe and rent someone named Fear to hang out with or something.
I'm sick of this asinine increase in resolutions. I bet the game actually sux after playing it for 2 days like most do.
Now a game like Half-Life 2, that was a game you would consider getting a new system for; or Final Fantasy, or Doom III (even though it bugged out too many people and died fast).
They should make a new game that requires two overclocked, liquid-cooled 7800GTXs in SLI mode and gets 4 frames at 640 resolution. That would really help out the industry! Yeah!
FEAR is a top-10 game. It will become a benchmark for games to follow. Imagine a developer that took some of their favorite levels from current games and worked those ideas into a believable environment. Then took a horror movie and tied it to a game, having it play on your mind more than in your eyes. Then toss in some enemy AI that does a lot more than predictably pop out from the same side of a crate; an AI that can actually flank, hide, work together, and corner the player. Then they actually playtested this game with real gamers and adjusted the difficulty to make sure there wasn't something stupid like a flood of Combine attacking you in prison when you had no health and little defense. Now wrap this game up in cutting-edge visuals.
I have the FEAR Director's Edition DVD, and highly recommend it. And I'm only playing on a P4 2.8 with 1 Gig and a 6800GT. Everything is maxed at 1024, and it looks stunning.
The game is actually fun and has enemies that are interesting. Half-Life 2 was bad enough with its enemy AI, but Doom 3 had every enemy of a certain type doing exactly the same thing after its initial jump-out-and-scare-you routine. It was really boring killing enemies like that.
FEAR actually has enemies that can do different and interesting things depending on the current landscape, and it seems they work together better in this game than in others.
I'd say that as a single-player shooter, FEAR has better playability than many other games out there.
And, again, lower settings run at higher frame rates.
We tested three different settings combinations for this game, where we normally only test two. I agree that it would have been nice to include a test with settings that allowed the midrange cards to achieve smooth framerates at high resolutions. We did test with and without the setting that has the single largest impact on framerate (soft shadows).
It is not possible to run the game with antialiasing and soft shadows enabled at the same time. If AA is enabled in the control panel and soft shadows are enabled in the game, Monolith notes that rendering problems will occur. AA + soft shadows is omitted because we could not include it.
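For what it's worth, the constraint is easy to encode when enumerating a test matrix. A rough sketch (this is not AnandTech's actual harness; the resolutions and AA modes are placeholders):

```python
# Enumerate benchmark settings while skipping the invalid AA + soft-shadows
# combination that Monolith warns will cause rendering problems.
from itertools import product

resolutions = ["1024x768", "1280x960", "1600x1200"]
aa_modes = ["off", "4x"]
soft_shadows = [False, True]

valid_runs = [
    (res, aa, ss)
    for res, aa, ss in product(resolutions, aa_modes, soft_shadows)
    if not (aa != "off" and ss)  # AA and soft shadows cannot be combined
]

for run in valid_runs:
    print(run)  # 9 of the 12 combinations survive
```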
The only SLI option we currently recommend is the high-end combination of 2x 7800 GTX. Rather than doubling any other product, it is a better option to upgrade to a higher-end solution and sell the lower-performance part. Testing SLI and Crossfire combinations of every card, and including other X800- and 9800-series solutions, would have doubled our test load and our time to publication. We tried to choose a smaller sample of cards that would fully represent what happens in the mid-to-high-end space.
And after reading the comments on this article, it is quite apparent that we chose poorly. In the future, we will include at least an X800 XT or X850 XT and a 7800 GTX SLI test.
It just isn't possible to test every setup imaginable, but rest assured that we will absolutely listen to the feedback and include at least a couple more cards and tests in future articles of this nature.
Don't forget the X800XL... For a long time this card has been the best bang for the buck you could get, so it's probably a card that many Anand readers have.
I absolutely understand the point now about not being able to run soft shadows and AA together; sorry for calling you out on that. For settings, I meant maybe testing at Medium or High rather than whatever the max setting is called. Thank you for caring to read my feedback; it is appreciated.
How hard is it to incorporate min/max FPS in addition to average FPS in your benchmarks? I see this info on other sites, and it really helps to see how much the framerate dips throughout a timedemo. I think it gives a better representation of whether a game will have some, or a lot of, hiccups.
We will work on ways to include this data effectively. There are always more numbers to add, and not always good ways to represent that data. But we will absolutely look into it. Any suggestions on how you would like to see this data represented?
And please, please, please consider using a system that is a closer representation of what the average READER might have (http://www.steampowered.com/status/survey.html). Your dream gamer rigs are absurd and do not offer any representation of what we READERS can expect from the game. If you're reviewing GPUs, toss in some absurdly low-end ones along with some of your high-end cards, ***only if they can be found in local stores and/or are available through your advertisers***. This gives us READERS a better idea of what some new hardware may do for our systems. If the games run like crap on older hardware, maybe these developers will learn how to write better code!? Finding the most amazing performance isn't always important. Know what I mean?
I would be fine with two charts, one for minimum and one for maximum. Integrating the two would probably make things too cluttered. As far as specifics go, I like how you set up your charts right now, so simply duplicating the format and changing the content would be great for me.
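Something like this rough sketch of that layout, with made-up numbers: the same card list rendered once per metric, mirroring the existing chart format instead of merging everything into one cluttered graph:

```python
results = {  # card -> (min_fps, avg_fps); all values hypothetical
    "7800 GTX": (25, 48),
    "X1800 XT": (22, 45),
    "6800 GT":  (14, 29),
}

for metric, idx in (("Minimum FPS", 0), ("Average FPS", 1)):
    print(f"--- {metric} ---")
    ranked = sorted(results.items(), key=lambda kv: -kv[1][idx])
    for card, vals in ranked:
        print(f"{card:<9} {'#' * vals[idx]} {vals[idx]}")
```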
I could have expected this kind of performance. Kudos for a decent article.
The very best from both ATI and NVIDIA are not even up to the task of 1600x1200 with 8xAF and soft shadows. I wouldn't even want to imagine what 1920x1200 would look like (I don't know if FEAR can do that res or not). But it's clear to me that the R580 and the G80 are obviously needed for the next generation. People often argue that today's best GPUs are overkill; clearly, even the very best of today can be brought to its knees by a shipping game, let alone what may come in the next year.
Let's hope those newer chips don't take a year to get to us.
I'm the only person on the face of the planet who's going to be able to play this game at 1600x1200 with everything set to max, with min 60 fps, and enjoy every dpi of its beauty... a year from now that is.
Personally, I don't think the game looks all that great, and once I set it so it is playable, it looks pretty bad IMO. Quake 4 looks better and runs smoother at a higher rez with more options.
P4 3.4GHz, 1GB RAM, 6600GT: I need to run it at 8x6 with everything on medium and no AA. It's fugly, man.
I'm sick of reviews with only the highest-end gear. The 6600GT numbers mean almost nothing to an actual owner, because who has an FX processor and "only" a 6600?! Please start using TWO machines for these tests: one super mega rig for getting absolute numbers, and one average machine so users can see what they will REALLY get.
I have to agree with what others have said. Where are the X800 Pro/XL and X850 XT? Why test with ATI's new (unavailable) cards at the expense of the currently available ones, and then dismiss them as options because of availability issues? If you feel so strongly about it, refuse to test with them until they become available. Then we can all complain about the absence of testing with forthcoming cards! :) Sucks to be a reviewer and have to test 15 different cards to please most of us.
Which brings me to my industry issue: how long can NV and ATI realistically continue to crank out new architectures every 6 months? Something has got to give. I think the worst-case scenario is ending up in a single-manufacturer situation. I keep hoping ATI pulls something out of its hat, just for competition's sake.
Are the days of passively cooled cards over?! I haven't even gotten around to picking up Gigabyte's passively cooled X800XL, and it's already becoming outdated :(
>>>
I have to agree with what others have said. Where are the X800 Pro/XL and X850 XT? Why test with ATI's new (unavailable) cards at the expense of the currently available ones, and then dismiss them as options because of availability issues? If you feel so strongly about it, refuse to test with them until they become available.
>>>
I totally agree. I just quickly skimmed the article and the charts, and in the AA/AF tests (which are the ones that count) the X1800 XT CLEARLY comes out a TAD faster than the 7800 GTX.
THEN, below, I read: "We can only recommend 'saving up for a 7800 GTX'."
a) If you recommend "saving for a 7800 GTX," then I don't understand why you don't mention that the X1800 XT might be available by the time this person has saved up for a GTX; and then the XT would be faster.
b) Having the XT in the charts and then dismissing it in the recommendations because of availability is WEIRD. I UNDERSTAND, and we're all frustrated by ATI's paper launches and non-existent fantasy cards... but still, I THINK you would have done better to wait a bit longer until the XT is an available product, instead of showing it in the graphs and then forgetting about it because it might take a few more weeks until cards are available.
ALSO, you as testers HAD an XT... so it doesn't make sense, because you HAD the product in hand and compared it, and this was a real product which will be available soon (ehrm, I hope :) ), not some calculated "benchies" based on totally different hardware.
c) X850 PE: for sure. I miss the numbers because I have one.
d) I think it would be worth mentioning that, amid all the hype, an engine which BARELY manages 40 fps on super-duper high-end cards really "does not make much sense," especially if opinions are split on whether its graphics are REALLY *that* groundbreaking. MAYBE this game engine is just really BAD and inefficient. Sorry, but we're talking about high-end machines here, with 2GB of RAM and top-notch graphics cards in the $500 range, and at a mediocre resolution like 1280x it doesn't get better than 40 fps??? Not really a reason to rave.
And as some have said, there are similar titles out with (subjectively) on-par or even better graphics that run WAY faster.
Why are reviewers impressed when games need more graphics power? Can't we be impressed by good graphics with lower requirements? It's a shame when a $50 game needs a $500 video card. And what's worse is that in a year, another game will need the next $500 card. And all of that buys maybe 5 FPS at most over the next year.
Don't play the game or turn your settings down! It's pretty simple. I've been doing it for a long time as until last year, I couldn't even afford a midrange video card.
That's a great idea, except I just purchased an LCD, and mine doesn't play well with anything other than its native resolution.
Seemed like it was more of an X1800 XT vs. 7800 GTX article; all the other stuff mentioned was just a bonus. Maybe that's why they didn't include SLI in the mix.
They use the X800 XL at FiringSquad (http://www.firingsquad.com/hardware/fear_performan...). Not only that, but when they review a game they do it in two different articles: one for mainstream and another for high-end. I would like to see AnandTech do the same.
How is it that the X800XL is consistently left out of the benchmarks? You have multiple ATI cards in the test that aren't even available, but leave out one of ATI's best sellers, in this and plenty of other reviews.
I would not declare 30 fps playable when your game settings have sound disabled. No one is going to play with sound off, and thus their framerates will be even lower. This game runs like ass. I hope I can get my copy to run at all on my 6600 @ 400/700.
Very good read, AT! New software benches make me all gimpy inside :)
I know I'm probably beating a dead horse here, but I was actually looking forward to purchasing an X1800 XT for my main computer (building one with a 7800 GT for my son) :-) Now I either have to settle for the XL, jump on the 7800 GTX bandwagon, or wait until mid-November :/
As much as I love ATI products, I think they might have lost me, as well as other customers who are tired of playing the waiting game.
I usually don't trust GameSpot for their hardware testing, but until AnandTech comes up with a more complete test, you can find more information at GameSpot: http://hardware.gamespot.com/Story-ST-x-2661-x-x-x
They are testing different CPU speeds, graphics settings, and RAM sizes.
You are correct. There is no excuse for not including the X850 PE. Judging from GameSpot's review, the X850 did well. Come on guys, let's see numbers for the X850! I have one and am an ATI fanboi for the moment. LOL
I would complain to ATI; they are the ones pushing the heck out of new products they don't even have for sale. It's only natural that this makes people more interested in the X1000 line.
OK, so the highest graphics settings in FEAR are completely unplayable at any decent resolution for most of us, much like the "Ultra" quality setting in Doom 3 when it came out.
What about all the other settings? I suspect the "highest" settings make little difference to the visuals but seriously cut the framerate versus the "high" settings.
At least a couple of benchmarks and screenshots comparing the medium/high/highest settings would be nice.
I would think the complaint should be against the beta ATI drivers which are a press sample that is completely unavailable to the public in any form. At least people can download and install the 81.85 drivers from NVIDIA.
In all honesty, we used unavailable FEAR enhanced drivers for ATI because NVIDIA simply performed better and we didn't want to see complaints about the 81.85 driver... But I guess you can't always get what you want. :-)
I have FEAR, and have been playing it for the past day or so ("sick day" from work).
I can't believe AnandTech would consider it good-looking on non-cutting-edge hardware where you have to turn the details down. Have you actually played the game for more than 5 minutes? Performance and graphics quality in the later levels are CRAP if you're using mostly medium settings, which is a NECESSITY if you're using a slower X800 part or anything worse (think X800XL).
For the level of graphics you get, the performance of FEAR is unacceptable. Chronicles of Riddick looked much better, and performed slightly better, on my system. That's an OpenGL game on ATI hardware! Significant, no?
BTW, I also just tried Quake 4... much, much better performance than FEAR, and the indoor sequences look better by comparison (since I can afford to increase details in Q4, because the D3 engine actually runs pretty decently on ATI hardware with the most up-to-date drivers and Catalyst AI enabled).
LOL. He thinks the X800XL is "slow"! A few months ago, everybody here was raving about the X800XL as the best price/performance card, one that actually beat a lot of higher-end NVIDIA cards. Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say medium textures are ugly in FEAR.
Some people just won't be satisfied. It's people like you, who pay $400-600 for a graphics card, that are causing prices to inflate. I really can't wait until silicon hits the limit where they can't shrink the process anymore and Moore's Law goes obsolete. Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power. I'm really sick of upgrading, and a lot of my friends stopped upgrading their systems two generations ago. They just gave up.
Tell me something... everyone sure talks big on here about wanting to upgrade their cards. But why is it that when I go to a game store, there are barely any PC games on the shelves? I don't think a lot of people are buying PC games today, even though ATI and NVIDIA would like to say otherwise. The shelves are totally full of console games instead.
Yes, compared to the 7800GTX, which apparently is what you NEED for FEAR to run at a decent rate AND look good, the X800XL is slow.
"Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say Medium textures are ugly on Fear."
No need to go back 10 years, chico. Anyone who has played Doom 3 (August '04), Half-Life 2 (November '04), or even Far Cry (March '04) will agree that medium textures are "ugly" in FEAR, although some may not use such strong language.
"It's people like you that pay $400-600 for a graphics card, that is causing prices to inflate."
Uh, no genius, I paid less than $200 for my X800Pro at a fire sale. And then I overclocked the sh!t out of it, so now it's a little bit faster than a stock X800XL.
"Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power."
Uh, that's precisely the point of my post, sorta. FEAR is a horribly unoptimized, perhaps even poorly written, engine. In my opinion, it is unacceptable that an X800XL-class card should have so much trouble with it.
So, what exactly was the point of your post, anyway?
On the aesthetics of FEAR: it would have been kind of the article's writers to include screenshots to support their judgement. As someone who has seen Unreal 2 rendered by a lousy GF2, I can understand the parent poster's point. Also, thanks for listening to my request for the absence of subjective opinions on the "playability" of a game in benchmarks...
I played through the demo and thought the graphics were pretty good (considering my setup: 9700 Pro, Athlon 2500+). But more to your point... I have never been a big fan of the Monolith LithTech engine; every game or demo I have played that used it feels clunky, the controls always seem "off," and the engine performance is generally not on par with the other 3D game engines available. To be fair, I haven't played enough of this game to bash the current engine that much, and once the game goes on sale I will probably pick it up. But not until after I buy a new system. :)
Tried the demo with my 9800 Pro 128MB, a 2.2GHz 64-bit proc, and 1GB of DDR. I ran the game at 1024x768, no soft shadows, no AF/AA, and medium textures, and it still ran great on my system. It also still looked great with medium textures and ran smoothly.
I'm not sure why the X800 GT got such a low framerate. Because of high textures? Maybe I'll try that on my card tonight.
At first I was very skeptical that my friend's system could handle it, but it worked great and was perfectly smooth @ 1024x768 with medium details. And he's only running an AXP 1800+ with a Radeon 9800 Pro. So it can still work pretty well on old systems.
High-end textures absolutely kill the 9800 Pro; they killed mine, anyway ;)
If you let it autodetect the settings, it should run smoothly. All the tests here were on max settings, except for the aforementioned soft shadows and AA/aniso.
With your settings in the demo (I had basically the same setup), the frames were good, but it still hitches when there are a lot of enemies and action going on.
It would just be good for AT to test it, to compare apples with apples :)
Like many people said, it would have been nice to see older-generation hardware, especially on the ATI side of things, since most of the cards tested here are nowhere to be found on the market.
Seeing performance with the X800XL and the X850XT would have been nice.
I also hope you'll do some CPU testing in the future, since I doubt you'll see many people out there with an AMD FX-55, especially paired up with the likes of an X1300... :)
It is a significant error that SLI numbers were left out of the article, since it seems to be about how fast current video card technologies can play the game:
"Those who want to play FEAR at the highest resolution and settings with AA enabled (without soft shadows) will basically have to use the 7800 GTX, as no other card available gets playable framerates at those settings, and the 7800 GTX does just barely (if uncomfortably)." ...unless you have an SLI setup, I assume. Does AnandTech feel that SLI is not a viable graphics technology, or am I missing something?
And then there's Crossfire... while it STILL isn't available, it would have been interesting to see some performance numbers alongside SLI tests.
It would be nice if you could update the article with dual-card framerates.
PC Perspective has already beaten AnandTech to the punch on this subject, and their results show that SLI has a SIGNIFICANT impact on playability, even without any driver optimizations...
Exactly! I love this Land of Make-Believe. It's a good thing that I have an AMD Athlon 64 FX-55 2.6GHz processor in my desktop, laptop, and PDA. And I'm loving it, because after an unreal CPU like that, I would still have hundreds of dollars left to burn on make-believe GPUs. Because if I were only a regular Joe Anand-reader with a middle-tier Pentium 4 and an old-school AGP graphics port, I would be quite upset that the author is targeting his reviews at the well-connected Beverly Hills posh.
Just who is Josh writing his articles for, anyway?! I'm going back to surfing pr0n, because I have a far better chance of dating a porn* than owning a system like the one he's showing scores on.
Well, thanks for supporting the thread I started in the Video forum section last week addressing that very issue. All the idiots came out of the woodwork to do their best to misinterpret and misread the post, and very few actually bothered to support my suggestion that a test be done with a REAL-WORLD system most of us own, not an FX-55 setup with a 7800GTX that few people own.
I'd LOVE to see how modern games perform on a system I'm actually thinking of buying, not an imaginary supersystem.
You know, it's simply come to the point where I don't know how the average gamer can keep up. If you are not willing to spend $300-$500 every 6-12 months or so, you just cannot keep up with the demands that games are putting on computer hardware. This is stupid. I mean, who the hell is dragging this industry along? Do they develop new and more powerful hardware so that more demanding software can be created, or do they develop more demanding software, making it a necessity to develop more powerful hardware? Is all this crap really needed to have a decent gaming experience? I guess I'm just gonna have to starve the cat for a couple of months so I can toss out my POS 6800GT and get some new whizbang graphics card the industry wants me to buy. This has become a never-ending process that is wearing thin on me.
I think this is an EXTREMELY bad review.
What card do you own?
I know I own an ATI 9600XT, bought 12 months ago, and it runs BF2 really well at medium-high.
But why don't articles like this include that info??
Either these sites have lost the plot,
or ATI and NVIDIA don't want us to know that older/cheaper cards are still capable.
quote: latest drivers from ATI (press sample 8.183.1017 which should be available in catalyst soon)
Yes, because we all just happen to be playing FEAR with drivers that aren't available yet.
quote: NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GT
NVIDIA GeForce 6800 GT
NVIDIA GeForce 6600 GT
ATI Radeon X1800 XT (not yet available)
ATI Radeon X1800 XL
ATI Radeon X1600 XT (not yet available)
ATI Radeon X1300 Pro
ATI Radeon X800 GT
And what is wrong with this list?
A lot, at first glance. For starters, the ATI Radeon X1800 XT (not yet available) and ATI Radeon X1600 XT (not yet available) don't exist on the market yet. So yes, we all just happen to be running those with FEAR already.
"This has become a never ending process that is wearing thin on me."
Amen. If it won't run on what I have now, I simply won't buy it. The software/hardware gouging can continue without me. At least with a console, you know the games you buy are going to run on your machine.
The games will run fine if you turn off the maximum detail settings. There still isn't a card that can run EQ2 in extreme quality mode.
I see this as a good thing, because games are finally making use of the high-end hardware some people have invested in. Until this half of the year, there really hasn't been much out that could make use of high-end hardware.
That is quite different from requiring high-end hardware.
You should have well known that the computer industry moves very fast.
If you want a budget gaming experience, I suggest a PS2/Xbox...
No one is telling you to toss your 6800GT; it's just that if you WANT to run high resolutions with AA/aniso enabled, then you need the latest and greatest card. It's ALWAYS been like that.
xsliver... I fully understand all of what you are saying. I'm 58 years old and have been building custom systems for about 12 years, and I have by and large kept up with new technology at all of my upgrade intervals. Perhaps in my position, and at my age, the payback just isn't what it used to be.
Sounds like you aren't having fun with today's games. I choose to stick with the old stuff until I see a game I like; then I'll switch. I don't play new games just because they're new. I play BF2, UT2004 (the most fun of these three), and sometimes COD (and probably COD2 when I have a chance to play the demo). I don't play anything else because I don't like anything else. Also, my hardware upgrade path is dictated solely by the games I play.
I was thinking about the RAM issue too. I used 1GB for the demo, then upgraded to 1.5GB. It removed a lot of stuttering and felt a whole lot smoother.
This was the demo, of course.
Why are 6800GTs used and not Ultras? I've found this recent trend a little puzzling.
6800GTs are high-midrange cards, whereas the Ultras are not good value price-wise... plus, not many people have them.
The cards that are missing are the 16-pipe last-generation ATI cards, the X800 Pro/XT etc.
Could those cards be added, please?
Also, people might want a point of reference for old systems, to see their cards splutter on this game (9800 Pro / FX 5900). It would be great to see whether those cards are still playable, since they use PS 2.0 and generally older tech.
This game is very GPU-limited, as you can tell by how steeply the resolution scaling graphs drop off. The game won't run above 1600x1200 without a little hacking.
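A quick sketch of that scaling check, with hypothetical numbers: if measured fps falls roughly in proportion to pixel count as resolution rises, the card's pixel throughput is the limit:

```python
runs = {  # (width, height) -> measured average fps; numbers are made up
    (1024, 768): 58.0,
    (1280, 960): 39.0,
    (1600, 1200): 25.0,
}

base_pixels = 1024 * 768
base_fps = runs[(1024, 768)]
for (w, h), fps in sorted(runs.items()):
    predicted = base_fps * base_pixels / (w * h)  # perfect GPU-limited scaling
    print(f"{w}x{h}: measured {fps:.0f} fps, GPU-bound prediction {predicted:.0f} fps")
```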
We will look into testing with more RAM, but our initial thought is that performance (especially at higher resolutions or with AA enabled) will not be incredibly affected by RAM. We will update the article if we find anything.
I am starting to think technology sites are forgetting that they are supposed to be reviewing the game, not video cards. So what is "(not yet available)" doing in an article meant to help us decide whether to buy a game? We want to know whether the game will run on what people already own:
GeForce Tis
ATI 9800XT
ATI Radeon X1300 Pro
ATI Radeon X800 GT
NVIDIA GeForce 7800 GTX
NVIDIA GeForce 7800 GT
NVIDIA GeForce 6800 GT
NVIDIA GeForce 6600 GT
etc.
That is what mainstream people already own. We want to know if it'll run on our own computers!! Simple as that. How many of the millions of copies a game sells get played on $400 current-generation cards? Probably only a small percentage; most people I know who bought BF2 use cards ranging from six months to two years old.
Any chance this will actually happen? http://forums.anandtech.com/messageview.aspx?catid...
Yeah, the X1800 looks like a flop for the most part. And it doesn't even exist yet.
I have a 7800 GTX at 490/1300, a gig of RAM, and a 3200+ Athlon 64. I tested the game on the MINIMUM settings (DirectX 7 and such; it looked like Duke Nukem) and got a max of 60, a min of 58, and an average of 59. Everything else runs HORRIBLY!
One more thing: I tried to get the new drivers for my 7800GTX, and it's telling me I don't have the right drivers for my hardware?!? Anyway, I uninstalled them all and reinstalled, so now I have the old drivers, and the game runs like normal: "high" and "high" settings got me a min of 54 and a max of 214.
OK, just tried it again: max settings (1024x768) with soft shadows and 4x/16x got me an average of 23 fps. So I lowered the settings (soft shadows off, 2x/8x) and got the same results...
I can't really complain, as the 6800GT was included in the article. Good read; I enjoyed it.
The fastest CPU is good if you want to know exactly how well a GPU do in a game...but that still doesn't refelct the majority of peoples who will run the game...that's why a slower CPU could be nice. If hte idea behind this review was to show peoples how well their HW will do in this game...only using the best of the best is not the best way to achive that goal.PrinceGaz - Friday, October 21, 2005 - link
The aim of video-card reviews is to show as best as possible what the video-card is capable of when all other variables (such as CPU limitations) are removed from the equation. That's why even testing an AGP GeForce 2GTS with a high-end FX-57 processor would be preferable as the performance is determined entirely by the graphics-card.If you use slower CPUs with slower graphics-cards, it is difficult to say for sure whether it is the CPU or the graphics-card that is the limiting factor. All a review which tries to mix and match CPUs and graphics cards is saying is "this combination went this fast, but we have no idea if it was the CPU or the graphics-card that was the limiting factor, so we don't know if you should buy a faster CPU or a faster graphics-card".
Le Québécois - Friday, October 21, 2005 - link
And That's why they just do at least a couple of test with slower CPU to see if it affect the FPS or not...I dont say screw the FX57...I juste say..why dont you do a couple a test with a slower CPU to see if it makes a difference....Hardtarget - Friday, October 21, 2005 - link
Good article, but I really really hoped you would have compared a 9800Pro. It's a very very popular card that many of us still have, and we'd love to know how it performs!
Pjotr - Friday, October 21, 2005 - link
I've seen a trend in GPU reviews lately, in that fewer old cards are used. This review says "See if you need to upgrade", but how can I tell when my old card is not there and I have no reference point, not even to the X800 GT? I would like to see more old cards. The 9800 Pro is probably a card a LOT of people have; it would be great to include this card and perhaps 2-3 more from around that generation.
fogeyman - Friday, October 21, 2005 - link
"why would anyone make a game with a no name legacy that has no ability to be played in full with a system that would cost 3000 dollars? "This is a huge exaggeration. I have a new, just under $2000 system with an x2 4200+, 7800gt, and 2 gb of memory. I can also play FEAR on 1280x1024 (that's the highest resolution my monitor can support) with all graphics settings turned on (except for soft shadows, I believe) with no noticeable issues. So while my system is a high-end system, zero playability on a $3000 system is a MASSIVE exaggeration.
In fact, for ~$2400 (the cost for me to upgrade my system to 7800GTX SLI), I could play the game flawlessly at 1280x1024 with all settings turned on. The SLI is probably overkill in any case, meaning for ~$1900 you can play FEAR easily. Not to mention the graphics are OUTSTANDING at this level. The gameplay is amazing and the physics are to die for (I've spent hours mesmerizing friends during one part of the demo where you can flip an enemy head over heels with the bolt gun thing). So while FEAR is INCREDIBLY demanding, the game is incredibly amazing.
"sick of this assinine increase in resolutions. I bet the game actually sux after playing it for 2 days like most do."
Like Derek said, this game is outstanding. It's not absolutely and utterly captivating like HL2, nor is it as practical as HL2, but the game is still awesome.
"with a no name legacy"
Companies have to make a name for themselves. Great gameplay is one way, groundbreaking graphics is another, and a combination of the two is even better. FEAR has both. Additionally, Monolith has come out with great games in the past that have been rejected by the market. If you want me to come up with some examples, I'll dig up some games and give you some.
Granted, these games aren't "uberly leet" like HL2, but you can't expect every game to get ratings as high as 95+.
Basically, don't post irrational posts that exaggerate the truth. This isn't the time to slander FEAR or monolith or anything else, really. And there's no reason to create "flame wars." Post calm. Post decently. And post rationally...and I probably have some irrational points in here too, so just point them out and I'll attempt to fix it. If I can't, then you win. Happy?
FPSnut - Saturday, October 22, 2005 - link
Fogeyman, how did you get the game to play at 1280x1024? I only see an option for 1280x960...
Thanks in advance!
fogeyman - Friday, October 21, 2005 - link
Forgot to add this: I also agree with what seems to be a popular opinion that this review is lacking info. It doesn't show us the incredibly out-of-reach SLI setups, which makes the review more user-tailored and less FEAR-bragging tailored, but few (if any) AGP cards are shown either. This makes the review tailored towards the "high" gaming community instead of the mainstream community.
I would be happier with more lower-end cards shown, more varied graphics settings chosen, and far fewer top-of-the-line cards. For example, instead of both the 7800GTX and the 7800GT, I would have been okay with just the GTX or just the GT. While this leaves out a very nice card, for purposes of practicality it might be better to let users extrapolate the fps for the other card and instead show some mid-range cards as well.
If time doesn't allow for this, then perhaps make it clearer that you intend to release another, more comprehensive review soon, so we get an overview in the first rush review and a more detailed review after a few weeks.
On a side note, I heard rumors that the release of FEAR is multi-threaded. Is this true?
Le Québécois - Friday, October 21, 2005 - link
Well... testing it on an X2 vs. an FX-55 would make an interesting benchmark, but all the reviews I have read so far use FX-55/57 processors... And BTW... when CPU speed is taken into consideration... the FPS in FEAR doesn't change much... an AMD 64 3200+ and an FX-57 both run the game at almost the same speed (give or take 1-2 fps).
ElJefe - Friday, October 21, 2005 - link
the game can fear "this" *grabs a lower organ*
why would anyone make a game with a no name legacy that has no ability to be played in full with a system that would cost 3000 dollars?
for that I'd go to Europe and rent someone named Fear to hang out with or something.
sick of this asinine increase in resolutions. I bet the game actually sux after playing it for 2 days like most do.
Now a game like Half-Life 2, that was a game you would consider getting a new system for, or for Final Fantasy, or for Doom III (even though it bugged out too many people and died fast).
They should make a new game that requires two overclocked, liquid-cooled-only 7800GTXs in SLI mode and gets 4 frames at 640 resolution. That would really help out the industry! Yeah!
9nails - Saturday, October 22, 2005 - link
FEAR is a top-10 game. It will become a benchmark for games to follow. Imagine a developer that took some of their favorite levels from current games and worked those ideas into a believable environment. Then took a horror movie and tied it to a game, and had it play on your mind more than in your eyes. Then tossed in some enemy AI that does a lot more than predictably pop out from the same side of a crate; an AI that can actually flank, hide, work together, and corner the player. Then they actually play-tested this game with real gamers and adjusted the difficulty to make sure there wasn't something stupid like a flood of Combine attacking you in prison when you had no health and little defense. Now wrap this game up in cutting-edge visuals.
I have the FEAR Director's Edition DVD, and highly recommend it. And I'm only playing on a P4 2.8 with 1 gig and a 6800GT. Everything is maxed at 1024, and it looks stunning.
DerekWilson - Friday, October 21, 2005 - link
The game is actually fun and has enemies that are interesting. Half-Life 2 was bad enough with its enemy AI, but Doom 3 had every enemy of a certain type doing exactly the same thing after its initial jump-out-and-scare-you routine. It was really boring killing enemies like that.
FEAR actually has enemies that can do different and interesting things depending on the current landscape, and it seems they work together better in this game than in others.
I'd say that as a single-player shooter, FEAR has better playability than many other games out there.
And, again, lower settings run at higher frame rates.
Sunrise089 - Thursday, October 20, 2005 - link
Lack of testing at different graphics settings - bad
Lack of soft shadows + AA testing - sort of bad
Lack of SLI testing - quite bad
Lack of an older card like a 9800pro even if only to see how badly it plays - sort of bad
Testing two unavailable ATI cards while not testing ANY previous gen ATI cards - terrible
DerekWilson - Thursday, October 20, 2005 - link
We tested three different settings combinations for this game where we normally only test two. I agree that it would have been nice to include a test with settings that allowed the midrange cards to achieve smooth framerates at high resolutions. We did test with and without the setting that has the single largest impact on framerates (soft shadows).
It is not possible to run the game with antialiasing and soft shadows enabled at the same time. If AA is enabled in the control panel and soft shadows are enabled in the game, Monolith notes that rendering problems will happen. AA + soft shadows is omitted because we could not include it.
The only SLI option we currently recommend is the high-end combination of 2x 7800 GTX. Rather than doubling any other product, it is a better option to upgrade to a higher-end solution and sell the lower-performance part. Testing SLI and Crossfire combinations of every card and including other X800 and 9800 series solutions would have ended up doubling our test load and our time to publication. We tried to choose a small sample of cards that would fully represent what happens in the mid-to-high-end space.
And after reading the comments on this article, it is quite apparent that we chose poorly. In the future, we will include at least an X800 XT or X850 XT and a 7800 GTX SLI test.
It just isn't possible to test every setup imaginable, but rest assured that we will absolutely listen to the feedback and include at least a couple more cards and tests in future articles of this nature.
Thanks very much for your feedback,
Derek Wilson
Le Québécois - Friday, October 21, 2005 - link
Don't forget the X800XL... For a long time this card has been the best bang for the buck you could get, so it's probably a card that many Anand readers have.
Sunrise089 - Thursday, October 20, 2005 - link
Absolutely understand the point now about not being able to run soft shadows and AA together, sorry for calling you out on that. For settings, I meant maybe testing at Medium or High rather than whatever the Max setting is called. Thank you for caring to read my feedback, it is appreciated.
dashrendar - Thursday, October 20, 2005 - link
Hey Derek, how hard is it to incorporate min/max FPS in addition to average FPS in your benchmarks? I see this info on other sites, and it really helps to see how much the framerate dips throughout a timedemo. I think it gives a better representation of whether a game will have some or a lot of hiccups.
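For instance, something like this - a rough sketch with made-up frametimes, nothing from your benches - already says a lot more than the average alone:

```python
# Sketch: pulling min/avg/max fps out of a per-frame time log (values in ms).
# The frametimes below are invented purely for illustration.

frametimes_ms = [16.1, 17.0, 15.8, 40.2, 18.5, 90.7, 16.4, 15.9, 22.3, 17.7]

fps_per_frame = [1000.0 / ft for ft in frametimes_ms]  # instantaneous fps

# Proper average fps is total frames over total time, not the mean of the
# per-frame fps values (long frames must weigh more, not less).
avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
min_fps = min(fps_per_frame)  # the worst single frame -- where hiccups live
max_fps = max(fps_per_frame)

print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, max {max_fps:.1f} fps")
```

Two runs with the same average can feel completely different if one of them hides a few 90 ms spikes.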
Thanks
DerekWilson - Thursday, October 20, 2005 - link
We will work on ways to include this data effectively. There are always more numbers to add, and not always good ways to represent that data. But we will absolutely look into it. Any suggestions on how you would like to see this data represented?
9nails - Saturday, October 22, 2005 - link
I'm not saying run out and steal these graphs, but HardOCP has pretty good charts that show the frame rates over the timedemo. Pretty coolio: http://www.hardocp.com/article.html?art=ODU1LDM=
And, please please please consider using a system that is a closer representation of what the average READER might have!!! (http://www.steampowered.com/status/survey.html) Your dream gamer rigs are absurd and do not offer any representation of what we READERS can expect from the game. If you're reviewing GPUs, toss some absurdly low-end ones in and maybe some of your high-end cards, ***only if they can be found in local stores and/or available through your advertisers***. This gives us READERS a better idea of what some new hardware may do for our systems. If the games run like crap on older hardware, maybe these developers will learn how to write better code!? Trying to find the most amazing performance isn't always important. Know what I mean?
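A chart like that is cheap to produce, too; here's a minimal sketch (invented log values, matplotlib assumed) of plotting fps over the course of a timedemo:

```python
# Minimal sketch of an fps-over-time chart from a frametime log.
# The log values are invented; a real timedemo dump would drop straight in.
import matplotlib.pyplot as plt

frametimes_ms = [16.5, 17.1, 18.0, 35.2, 60.3, 22.1, 16.8, 15.9, 17.4, 19.0]

elapsed_s, t = [], 0.0
for ft in frametimes_ms:
    t += ft / 1000.0          # running position within the timedemo
    elapsed_s.append(t)
fps = [1000.0 / ft for ft in frametimes_ms]

plt.plot(elapsed_s, fps)
plt.xlabel("timedemo position (s)")
plt.ylabel("fps")
plt.title("Frame rate over the run")
plt.show()
```

The dips that an average hides show up immediately as valleys in the line.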
fogeyman - Friday, October 21, 2005 - link
I would be fine with two charts, one for minimum and one for max. Integrating the two charts would probably make things too cluttered. As far as the specifics go, I like how you set up your charts right now, so simply duplicating the format and changing the content is great for me.
Icehawk - Thursday, October 20, 2005 - link
Oh, and it hiccups like crazy on my PC...
Anemone - Thursday, October 20, 2005 - link
I could have expected this kind of performance. Kudos for a decent article.
The very best of both ATI and Nvidia are not even up to the task of 1600x1200 with 8xAF and soft shadows. I wouldn't even want to imagine what 1920x1200 would look like (I don't know if FEAR can do that res or not). But it's clear to me the R580 and the G80 are obviously needed for the next generation. People often argue that the best GPUs of today are overkill. Clearly we can see that even the very best of today can be brought to its knees by a shipping game, let alone what may come in the next year.
Let's hope those newer chips don't take a year to get to us.
:)
dashrendar - Thursday, October 20, 2005 - link
I'm the only person on the face of the planet who's going to be able to play this game at 1600x1200 with everything set to max, with min 60 fps, and enjoy every dpi of its beauty... a year from now, that is.
Icehawk - Thursday, October 20, 2005 - link
Personally I don't think the game looks all that great, and once I set it so it is playable it looks pretty bad IMO - Quake 4 looks better and runs smoother at higher rez with more options.
P4 3.4GHz, 1GB RAM, 6600GT - I need to run it at 8x6 with everything on medium, no AA - it's fugly man.
I'm sick of reviews with only the highest-end gear - the 6600GT numbers mean almost nothing to an actual owner; who has an FX processor and "only" a 6600?! Please start using TWO machines for these tests: one super mega rig for getting absolute numbers, and one average machine so users can see what they will REALLY get.
mostlyprudent - Thursday, October 20, 2005 - link
I have to agree with what others have said. Where are the X800 Pro/XL and X850 XT? Why test with ATI's new (unavailable) cards at the expense of the currently available ones, and then dismiss them as options because of availability issues? If you feel so strongly about it, refuse to test with them until they become available. Then we can all complain about the absence of testing with forthcoming cards! :) Sucks to be a reviewer and have to test 15 different cards to please most of us.
Which brings me to my industry issue: how long can NV and ATI realistically continue to crank out new architectures every 6 months? Something has got to give. I think the worst-case scenario is ending up in a single-manufacturer situation. I keep hoping ATI pulls something out of their hat just for competition's sake.
Are the days of passively cooled cards over?! I haven't even gotten around to picking up Gigabyte's passively cooled X800XL and it's already becoming outdated :(
flexy - Thursday, October 20, 2005 - link
>>>I have to agree with what others have said. Where are the X800 Pro/XL and X850 XT? Why test with ATI's new (unavailable) cards at the expense of the currently available ones and then dismiss tham as options because of availability issues? If you feel so strongly about it, refuse to test with them until they become available.
>>>
I totally agree. I just quickly skimmed the article and the charts, where in the AA/AF tests (which are the ones that count) the X1800XT CLEARLY comes out a TAD faster than the 7800GTX.
THEN - below, I read: "We can only recommend 'saving up for a 7800GTX'."
a) If you recommend 'saving for a 7800GTX', then I don't understand why you don't mention that the X1800XT might be available by the time this person has saved up their money for a GTX - and then the XT would be faster.
b) Having the XT in the charts and then dismissing it in the recommendations because of availability is WEIRD. I UNDERSTAND, and we're all frustrated by ATI's paper launches and non-existing fantasy cards... but still, I THINK you would have done better if you'd waited a bit longer 'til the XT is an available product... instead of showing it in the graphs and then forgetting about it because it might take a few more weeks 'til they're available.
ALSO - you as testers HAD one (XT)... so it doesn't make sense, because you HAD the product in hand and compared it - and this was a real product which will be available soon (ehrm, I hope :) ) and not some calculated "benchies" based on totally different hardware.
c) X850PE: for sure. I miss the numbers because I have one.
d) I think it would be worth mentioning that, amongst all the hype, an engine which runs at BARELY 40 FPS on super-duper high-end cards really "does not make much sense" - especially if opinions are split on whether the gfx in it are REALLY *that* groundbreaking. MAYBE - maybe this game engine is just really BAD and inefficient. Sorry... we're talking about high-end machines here with 2GB RAM and top-notch gfx cards in the $500 range... and a mediocre resolution like 1280x doesn't get better than 40 FPS??? Not really a reason to rave.
And as some said, there are similar titles out with (subjectively seen) on-par (or even better) graphics which run WAY faster.
OrSin - Thursday, October 20, 2005 - link
Why are reviewers impressed when games need more graphics power? Can we be impressed with good graphics and lower requirements? It's a shame when a $50 game needs a $500 video card. And what's worse is that in a year another game will need the next $500 card. And all this to gain maybe 5 FPS at most in the next year.
bob661 - Thursday, October 20, 2005 - link
Don't play the game or turn your settings down! It's pretty simple. I've been doing it for a long time, as until last year I couldn't even afford a midrange video card.
Pythias - Thursday, October 20, 2005 - link
"Don't play the game or turn your settings down! It's pretty simple. I've been doing it for a long time, as until last year I couldn't even afford a midrange video card."
That's a great idea, except I just purchased an LCD. Mine doesn't play well with anything other than its native resolution.
bob661 - Friday, October 21, 2005 - link
Well then you need to get crackin' on that new video card! :)
antiprnt - Thursday, October 20, 2005 - link
Seemed like it was more of an X1800XT vs. 7800GTX article... all the other stuff mentioned was just a bonus. Maybe that's why they didn't include SLI in the mix...
latino666 - Thursday, October 20, 2005 - link
FiringSquad uses the X800 XL: http://www.firingsquad.com/hardware/fear_performan... Not only that, but when they review a game they do it in two different articles: one for mainstream and another for high-end. I would like to see Anandtech do the same.
dev0lution - Thursday, October 20, 2005 - link
How is it that the X800XL is consistently left out of the benchmarks? You have multiple ATI cards in the test that aren't even available, but leave out one of their best sellers in this and plenty of other reviews.
Avalon - Thursday, October 20, 2005 - link
I would not declare 30fps playable when your game settings involved sound disabled. No one is going to play with sound off, and thus their framerates will be even lower. This game runs like ass. I hope I can get my copy to run at all on my 6600 @ 400/700.
Leper Messiah - Thursday, October 20, 2005 - link
Fo' sho'. C'mon AT, where are the SLI benchies? Can two 7800GTXs run this game at 1600x1200 with AA/AF and sound?
aldamon - Thursday, October 20, 2005 - link
For the NVIDIA cards, was Forceware set to Quality or High Quality? Were Transparency AA and Gamma Correct AA turned on?
If Transparency AA was turned on, was Multisampling or Supersampling used?
Ender17 - Thursday, October 20, 2005 - link
Is AnandTech ever going to get with the times and use 1920x1200 for all us widescreen users?
DerekWilson - Thursday, October 20, 2005 - link
In our most recent graphics performance article we did include 1920x1200. For FEAR, the resolution is not an option (physically) and would have been too difficult to hack into existence.
The game does not run at widescreen resolutions. Check tweakguide for more details.
Le Québécois - Thursday, October 20, 2005 - link
Normally I would agree, but with FEAR why bother with 1920 when 1600 is barely an option?
lexmark - Thursday, October 20, 2005 - link
Very good read, AT! New software benches make me all gimpy inside :)
I know I'm probably beating a dead horse here, but I was actually looking forward to purchasing an X1800XT for my main computer (building one with a 7800GT for my son) :-) Now I either have to settle for the XL, jump on the 7800GTX bandwagon, or wait until mid-November :/
As much as I love ATI products, I think they might have lost me as well as other customers who are tired of playing the waiting game.
lexmark - Thursday, October 20, 2005 - link
purchasing a card to play F.E.A.R., that is.
lexmark - Thursday, October 20, 2005 - link
Jeez, I need a break. >< "I'M" purchasing a card to play F.E.A.R., that is.
Le Québécois - Thursday, October 20, 2005 - link
I usually don't trust Gamespot for their hardware testing, but until Anandtech comes up with a more complete test you can find more information at Gamespot: http://hardware.gamespot.com/Story-ST-x-2661-x-x-x
They are testing different CPU speeds, graphics settings, and RAM sizes.
smaky - Thursday, October 20, 2005 - link
You are correct. There is no excuse for not including the X850 PE. Judging from Gamespot's review, the X850 did well. Come on guys, let's see numbers for the X850! I have one and am an ATI fanboi for the moment. LOL
photoguy99 - Thursday, October 20, 2005 - link
>lets see numbers for the x850!I would complain to ATI they are the ones pushing the heck out of new products they don't even have for sale. It's only natural this makes people more interested in X1000 line.
peldor - Thursday, October 20, 2005 - link
OK, so the highest graphics settings in FEAR are completely unplayable at any decent resolution for most of us, much like the 'Ultra' quality settings in Doom 3 when it came out.
What about all the other settings? I suspect the 'highest' settings make little difference to the visuals, but seriously cut the framerate versus the 'high' setting.
At least a couple of benchmarks and screenshots to compare the medium/high/highest settings would be nice.
poohbear - Thursday, October 20, 2005 - link
Why are u guys using nvidia beta drivers? Shouldn't u test w/ only official drivers?
DerekWilson - Thursday, October 20, 2005 - link
I would think the complaint should be against the beta ATI drivers, which are a press sample that is completely unavailable to the public in any form. At least people can download and install the 81.85 drivers from NVIDIA.
In all honesty, we used unavailable FEAR-enhanced drivers for ATI because NVIDIA simply performed better and we didn't want to see complaints about the 81.85 driver... But I guess you can't always get what you want. :-)
Le Québécois - Thursday, October 20, 2005 - link
Any chance you could e-mail me those press sample drivers for ATI? :P
Le Québécois - Thursday, October 20, 2005 - link
Oops... you read my mind, Derek.
DerekWilson - Thursday, October 20, 2005 - link
I've an update -- the driver we used is available here: http://support.ati.com/ics/support/DLRedirect.asp?...
and was listed as a fix for Serious Sam II. It's the 5.10a driver and was posted yesterday for public consumption.
Bingo13 - Thursday, October 20, 2005 - link
The 81.85 drivers will be WHQL approved and on Nvidia's website later today.
LocutusX - Thursday, October 20, 2005 - link
I have FEAR, and have been playing it for the past day or so ("sick day" from work).
I can't believe AnandTech would consider it good-looking on non-cutting-edge hardware where you have to turn the details down. Have you actually played the game for more than 5 minutes? Performance and graphics quality in the later levels are CRAP if you're using mostly medium settings, which is a NECESSITY if you're using a slow X800 part or anything worse (think X800XL).
For the level of graphics you get, the performance of FEAR is unacceptable. Chronicles of Riddick looked much better, and performed slightly better, on my system. That's an OpenGL game on ATI hardware! Significant, no?
BTW I also just tried Quake4... much much better performance than FEAR, and the indoor sequences look better by comparison (since I can afford to increase details in Q4, because the D3 engine actually runs pretty decently on ATI hardware with the most up-to-date drivers and CatalystAI enabled).
Jackyl - Thursday, October 20, 2005 - link
LOL. He thinks the X800XL is "slow"! A few months ago, everybody here was raving about the X800XL as being the best price/performance card, one that actually beat a lot of higher-end Nvidia cards. Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say medium textures are ugly in FEAR.
Some people just won't be satisfied. It's people like you who pay $400-600 for a graphics card that are causing prices to inflate. I really can't wait until silicon hits the limit where they can't reduce the process size anymore and Moore's Law goes obsolete. Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power. I'm really sick of upgrading, and a lot of my friends have already stopped upgrading their systems, two generations ago. They just gave up.
Tell me something... Everyone sure talks big on here, wanting to upgrade their cards. But why is it when I go to a game store, there are barely any PC games available on the shelves? I don't think a lot of people are buying PC games today, even though ATI and Nvidia would like to say otherwise. The shelves are totally full of console games instead.
LocutusX - Thursday, October 20, 2005 - link
Jackyl, thanks for your totally useless post:
"LOL. He thinks the X800XL is "slow"!"
Yes, compared to the 7800GTX - which apparently is what you NEED for Fear to run at a decent rate AND look good - the X800XL is slow.
"Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say Medium textures are ugly on Fear."
Not necessary to go back 10 years, chico. Anyone who has played Doom 3 (August 04), Half-Life 2 (December 04) or even Far Cry (March 04) will agree that Medium textures are "ugly" on Fear, although some may not use such strong language.
"It's people like you that pay $400-600 for a graphics card, that is causing prices to inflate."
Uh, no genius, I paid less than $200 for my X800Pro at a fire sale. And then I overclocked the sh!t out of it, so now it's a little bit faster than a stock X800XL.
"Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power."
Uh, that's precisely the point of my post - sorta. FEAR is a horribly unoptimized, perhaps even poorly-written, engine. In my opinion, it is unacceptable that an X800XL-class card should have so much trouble with it.
So, what exactly was the point of your post anyways?
Pannenkoek - Thursday, October 20, 2005 - link
On the aesthetics of FEAR: it would have been kind of the article writers to include screenshots to underline their judgement. As one who has seen Unreal 2 rendered by a lousy GF2, I can understand the parent poster's point. Also, thanks for listening to my request for the absence of subjective opinions on the "playability" of a game in benchmarks...
Kegh - Thursday, October 20, 2005 - link
I played through the demo and thought the graphics were pretty good (considering my setup - 9700 Pro, AMD 2500+). But more to your point... I have never been a big fan of the Monolith LithTech engine; every game or demo I have played which used it always feels clunky, the controls always seem "off", and the engine performance is generally not on par with the other 3D game engines available. To be fair, I haven't played enough of this game to bash the current engine that much, and once the game goes on sale I will probably pick it up. But not until after I buy a new system. :)
michal1980 - Thursday, October 20, 2005 - link
C'mon now, high end with no SLI? Maybe the 7800GT/GTX x2 could really shine for the first time?
Jackyl - Thursday, October 20, 2005 - link
Tried the demo with my 9800 Pro 128MB, 2.2GHz 64-bit proc, 1GB DDR. I ran the game at 1024x768, no soft shadows and no AF/AA, medium textures, and it still ran great on my system. It also still looked great with medium textures and ran smooth.
I'm not sure why the X800 GT got such a low framerate. Because of high textures? Maybe I'll try that on my card too tonight.
Jedi2155 - Friday, October 21, 2005 - link
At first I was very skeptical that my friend's system could handle it, but it worked great and was perfectly smooth @ 1024x768 with medium details. And he's only running an AXP 1800+ with a Radeon 9800 Pro. So it can still work pretty well on old systems.
High-end textures absolutely kill the 9800 Pro - killed mine anyways ;)
If you let it autodetect the settings it should run smooth; all the tests here were on max settings except for the aforementioned soft shadows and AA/aniso.
With your settings in the demo (I had basically the same setup), while the frames were good, it still hitches when there are a lot of enemies/action going on.
it would just be good for AT to test it to compare apples with apples :)
Jackyl - Thursday, October 20, 2005 - link
Actually I had ATI's 2x AF turned on in the drivers.Le Québécois - Thursday, October 20, 2005 - link
Like many people said, it would have been nice to see older-generation hardware... especially on the ATI side of things, since most of the cards tested here are nowhere to be found on the market. Seeing performance with the X800XL and the X850XT would have been nice.
I also hope you'll do some CPU testing in the future, since I doubt you'll see many people out there with an AMD FX-55... especially paired up with the likes of an X1300... :)
Kogan - Thursday, October 20, 2005 - link
Since the max upgrade for AGP users on the ATI side is an X800XT/X850XT, it would have been nice to have seen one of them included.
ballero - Thursday, October 20, 2005 - link
I'm looking forward to the SLI numbers.
Abecedaria - Thursday, October 20, 2005 - link
It is a significant error that SLI numbers were left out of the article, since it seems to be about how fast current video card technologies can play the game:
"Those who want to play FEAR at the highest resolution and settings with AA enabled (without soft shadows) will basically have to use the 7800 GTX, as no other card available gets playable framerates at those settings, and the 7800 GTX does just barely (if uncomfortably)."
...unless you have an SLI setup, I assume. Does Anandtech feel that SLI is not a viable graphics technology, or am I missing something?
And then there's Crossfire... while it STILL isn't available yet, it would have been interesting to see some performance numbers along with SLI tests.
It would be nice if you could update the article with dual-card framerates.
abc
Abecedaria - Thursday, October 20, 2005 - link
Oh wait!!!! PC Perspective has already beaten Anandtech to the punch on this subject, and the results show that SLI has a SIGNIFICANT impact on playability, even without any driver optimizations...
http://www.pcper.com/article.php?aid=175&type=...
abc
Ender17 - Thursday, October 20, 2005 - link
I agree. Can we get some SLI benchmarks?
Kyanzes - Thursday, October 20, 2005 - link
...to see a card performing on top when it's not even available...
9nails - Saturday, October 22, 2005 - link
Exactly! I love this Land of Make Believe. It's a good thing that I have an AMD Athlon 64 FX-55 2.6 GHz processor in my desktop, laptop, and PDA. And I'm loving it, because after an unreal CPU like that, I would still have hundreds of dollars left to burn on make-believe GPUs. Because if I was only a regular Joe Anand-reader with a middle-tier Pentium 4 and an old-school AGP graphics port, I would be quite upset that the author is targeting his reviews at the well-connected Beverly Hills posh.
Just who is Josh writing his articles for anyway?! I'm going back to surfing pr0n. Because I have a far better chance at dating a porn* than owning a system like the one that he's showing scores on.
yacoub - Saturday, October 22, 2005 - link
Well, thanks for supporting the thread I started in the Video forum section last week addressing that very issue. All the idiots came out of the woodwork to do their best to misinterpret and misread the post, and very few actually bothered to support my suggestion that a test be done with a REAL WORLD system most of us own, not an FX-55 setup with a 7800GTX that few people own.
I'd LOVE to see how modern games perform on a system I'm actually thinking of buying, not an imaginary supersystem.
deathwalker - Thursday, October 20, 2005 - link
You know... it's simply come to the point where I don't know how the average gamer can keep up. It's come to the point where if you are not willing to spend $300-$500 every 6-12 months or so, you just can not keep up with the demands that games are putting on computer hardware. This is stupid... I mean, who the hell is dragging this industry along? Do they develop new and more powerful hardware so more demanding software can be created, or do they develop more demanding software, making it a necessity to develop more powerful hardware? Is all this crap really needed to have a decent gaming experience? I guess I'm just gonna have to starve the cat for a couple of months so I can toss out my POS 6800GT and get some new whizbang graphics cards the industry wants me to buy. This has become a never-ending process that is wearing thin on me.
carl0ski - Sunday, October 23, 2005 - link
I think this is an EXTREMELY bad review. What card do you own?
I know I own an ATI 9600XT, bought 12 months ago, and it runs BF2 really well at medium-high.
But why don't articles like this include that info??
Either these sites have lost the plot,
or ATI and Nvidia don't want us to know that older/cheaper cards are still capable.
Yes, because we all just happen to be playing FEAR with drivers not yet available.
And what is wrong with this list?
A lot, at first glance. For starters:
ATI Radeon X1800 XT (not yet available)
ATI Radeon X1600 XT (not yet available)
These don't exist on the market yet. So yes, we all just happen to be running those in FEAR already.
This article is to sell video cards, not FEAR.
Pythias - Thursday, October 20, 2005 - link
"This has become a never ending process that is wearing thin on me."Amen. If it wont run on whay I have now, I simply wont buy it. The software/hardware gouging can continue on without me. At least with a console, you know the games you buy are going to run on your machine.
DerekWilson - Friday, October 21, 2005 - link
The games will run fine if you turn off the maximum detail settings. There still isn't a card that can run EQ2 in extreme quality mode.
I see this as a good thing, because games out there are finally making use of the high-end hardware some people have invested in. Until this half of the year there really hasn't been much out that could really make use of high-end hardware.
This is quite different than requiring high end hardware.
xsilver - Thursday, October 20, 2005 - link
You should have known full well that the computer industry moves very fast. If you want a budget gaming experience, I suggest a PS2/Xbox...
No one is telling you to toss your 6800GT; it's just that if you WANT to run high resolutions with AA/aniso enabled then you need the latest/greatest card. It's ALWAYS been like that.
deathwalker - Thursday, October 20, 2005 - link
xsilver... I fully understand all of what you are saying... I'm 58 yrs old and have been building custom systems for about 12 years... and... I "have" by and large kept up with new technology at all of my upgrade intervals. Perhaps in my position and at my age the payback just isn't what it used to be.
bob661 - Thursday, October 20, 2005 - link
Sounds like you aren't having fun with today's games. I choose to stick with the old stuff until I see a game I like, then I'll switch. I don't play new games just because they're new. I play BF2, UT2004 (the funnest game of these 3) and sometimes COD (and probably COD2 when I have a chance to play the demo). I don't play anything else because I don't like anything else. Also, my hardware upgrade path is solely dictated by the games I play.
I agree the X800XT/XL should be included; I can't understand why they would be left out.
ChronoReverse - Thursday, October 20, 2005 - link
I must have missed it, but what were the other settings used for each card? I'm particularly curious about the shader level used and the texture detail level.
Le Québécois - Thursday, October 20, 2005 - link
Everything was set to maximum except for the soft shadows, AA, and AF.
I was thinking about the RAM issue too. I used 1GB for the demo, then I upgraded to 1.5GB. It removed a lot of stuttering and felt a whole lot smoother. This was the demo, of course.
Why are 6800GTs used and not Ultras? I've found this recent trend a little puzzling.
xsilver - Thursday, October 20, 2005 - link
6800GTs are high-midrange cards, whereas the Ultras are not good value pricewise... plus not many people have them.
The cards that are missing are the 16-pipe last-gen ATI cards, X800 Pro/XT etc...
Could those cards be added please?
Also, people might want a point of reference for old systems, to see their card splutter on this game (9800 Pro / FX5900) -- it would be great to see if these cards are still playable, since they use PS2.0 and generally older tech.
ZobarStyl - Thursday, October 20, 2005 - link
Because more people have GTs than Ultras, and it's not too terribly hard to extrapolate the change between the two.
Bingo13 - Thursday, October 20, 2005 - link
Very good article. Would 2GB of RAM help in this game as it does in BF2?
DerekWilson - Thursday, October 20, 2005 - link
This game is very GPU-limited, as you can tell by how steeply the resolution scaling graphs drop off. The game won't run above 1600x1200 without a little hacking.
We will look into testing with more RAM, but our initial thought is that performance (especially at higher resolutions or with AA enabled) will not be greatly affected by RAM. We will update the article if we find anything.
Z3dd - Friday, October 21, 2005 - link
What about this issue: http://www.digit-life.com/articles2/video/giga-1.h... ? Scroll down to the analysis of local video memory usage in F.E.A.R.
Though their conclusion is that F.E.A.R. is so taxing on the GPU that you won't notice that your card runs out of local memory.
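For a back-of-the-envelope feel for why local memory fills up so fast at high resolutions with AA, here's a rough sketch that counts only the framebuffer (textures, shadow maps, and driver overhead all come on top, and the exact buffer layout is an assumption):

```python
# Rough framebuffer-only memory estimate (a sketch; real games also keep
# textures, shadow maps, and other render targets in local video memory).

def framebuffer_mb(width, height, aa_samples=1, color_bytes=4, depth_bytes=4):
    # One front buffer, plus a back buffer and a depth/stencil buffer that
    # both scale with the multisample count.
    front = width * height * color_bytes
    back = width * height * color_bytes * aa_samples
    depth = width * height * depth_bytes * aa_samples
    return (front + back + depth) / (1024.0 * 1024.0)

for (w, h), aa in [((1024, 768), 1), ((1280, 960), 4), ((1600, 1200), 4)]:
    print(f"{w}x{h} {aa}xAA -> ~{framebuffer_mb(w, h, aa):.0f} MB")
```

On a 256 MB card that growth leaves less and less room for textures as you push the resolution and sample count up.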
Thatguy97 - Wednesday, May 27, 2015 - link
lol I can get 80 fps at 10x7 on integrated now
Thatguy97 - Wednesday, June 24, 2015 - link
this game shit all over my x800 xl :((((((