Gabbe_Master:
But Far Cry has the best graphics today; how can you possibly play that game with a card that is three and a half years old? That seems odd to me...
You didn't account for how badly nVidia's budget cards sucked at the time Far Cry was being developed -- cards Crytek had to keep in mind throughout development, or else suffer the shame of not bearing that pretty little nVidia logo. An overclocked GeForce 4 Ti 4200 could outperform every one of the mainstream GeForce FX cards running at default clocks.
Combine that with the fact that Far Cry has a perfectly decent DirectX 8 codepath, and you have your explanation for 32 fps in a game with close to the best graphics of its time, on a video card manufactured more than three years earlier.
I have heard that you need at least a Radeon 9800 Pro to play Far Cry.
That's patently ludicrous. The actual video requirement for Far Cry is a card with 64 MB of RAM and DirectX-capable drivers -- and if those drivers happen to be the nVidia magic drivers, the hardware requirement drops down to a 486DX with 4MB of GDDR3 duct-taped on.
MajKrAzAm:
These are the requirements:
Video Card: 128 MB GeForce 4 to GeForce FX 5950; ATI Radeon 9500 to 9800 XT
Those are most likely the recommended specs.