6800

Well, that depends. Once you factor in overhead and the lack of an optimized codepath, it'd take a decent system. By that I don't mean anything with a $300 GPU, just something better than an off-the-shelf $400 OEM machine.

Regardless, as Mal said, you don't need a top-of-the-line card to play games. If you want to play at high resolutions on your 21" monitor @ 100Hz with heavy use of AA and aniso, sure. Otherwise, my overclocked GF3 Ti 200 runs pretty damn well. But that doesn't stop many people from feeling envy and wanting something faster. It's the same thing with cars: how much horsepower do you need to coast in traffic with one foot hovering over the brake? Ack, traffic here sucks lately...
 
Originally posted by Curtis@Apr 24, 2004 @ 01:09 AM

There seems to be some evidence that nVidia are fudging the benchmark figures with this card as they have done in the past with other cards. I'm not really sure how significant this is...

Ah yes, somebody posted that at another site yesterday. It's true; however, they're comparing it to a modern Radeon. Compare it to a GF FX and you'll see that they have made progress. Is it wrong? Sure. But it's a step forward, and it is a beta driver. They've started offering a higher-quality mode in their drivers, too. ATI does some optimizations as well, though nothing this noticeable.
 
Ah, today I just OCed my processor to 2.4 GHz, and boy did that make a hell of a difference in my games. They all run smooth as hell. I think I'll stick with my 95pro for a while, but I have another question.

What's better: the 9800 Pro 128MB or the FX 5900 Ultra 128MB?
 
Root of all evil or not, if they had PERFECT IQ but only so-so performance, they'd be torched. Again, it is still an improvement, or have you not seen the FXs in action? They are far worse.

Gallstaff: The 9800 Pro is a better buy.
 
Root of all evil or not, if they had PERFECT IQ but only so-so performance, they'd be torched.

Yes, but if the IQ ends up visibly suffering for the sake of performance, they'll be torched too (cf. Voodoo3). Anyone can draw a crappy image fast. I suspect that the problem is that they haven't had time to optimize their driver properly, so they threw in a handful of ugly hacks to get "indicative" benchmark scores, or something along those lines, and that the problem will go away after a couple retail revisions if not sooner.

Again, it is still an improvement, or have you not seen the FXs in action? They are far worse.

It's not enough that it be an improvement, it must be enough of an improvement to compete with the current state of the art in both IQ and speed. Winning on one and losing on the other isn't a win at all.
 
They've struck a reasonable balance. Even the harsh critics tend to note that when you're playing a game, you won't notice it. They got burned with the FXs, and they don't seem to be making such a drastic mistake again. Shoot, I'm not even an Nvidia fan, but the IQ of the 6800 isn't bad. It isn't as good as the recent Radeons, you'll get no argument from me there. I prefer Radeons these days.
 
Originally posted by ExCyber@Apr 26, 2004 @ 07:12 AM

Yes, but if the IQ ends up visibly suffering for the sake of performance, they'll be torched too (cf. Voodoo3). Anyone can draw a crappy image fast. I suspect that the problem is that they haven't had time to optimize their driver properly, so they threw in a handful of ugly hacks to get "indicative" benchmark scores, or something along those lines, and that the problem will go away after a couple retail revisions if not sooner.

V3's 16-bit looks better than any other card's 16-bit except Kyro II's.
 
Originally posted by it290@Apr 28, 2004 @ 01:38 AM

V3's 16-bit looks better than any other card's 16-bit except Kyro II's.

Maybe so, but 3dfx stuck with 16-bit color/optimization for way too long...

Thing is, no game until around Quake3's time really USED anything above 16-bit colour.

Up to that point, most games were even still using 8-bit CLUT textures... for which 32-bit would've done nothing.

The only benefit to running 32-bit until at or around Quake3's release was in alpha effects, and the V3 had a trick to help with those.
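For anyone who never dealt with this stuff, here's a rough sketch (made-up code, not from any actual driver or game) of what 16-bit colour and 8-bit CLUT textures boil down to, and where the hit from alpha effects comes in:

```c
#include <stdint.h>

/* An 8-bit CLUT texture stores one palette index per texel, so the
 * source data is only ever 256 colours; a 32-bit framebuffer doesn't
 * add any information to that. Palette entries are packed to 16-bit
 * here just for the sketch. */

/* Pack an 8-bit-per-channel colour into 16-bit RGB565 (the usual
 * 16-bit framebuffer format of the era). */
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Look up a texel from an 8-bit CLUT texture: just an index into a
 * small table of pre-packed colours. */
uint16_t clut_texel(const uint8_t *indices, const uint16_t *palette,
                    int x, int y, int width)
{
    return palette[indices[y * width + x]];
}

/* Where 32-bit *does* help: blending (alpha effects). In a 16-bit
 * buffer the result gets re-quantised to 5/6/5 bits after every
 * blend, which is where the banding in layered smoke/glow comes from. */
uint16_t blend_over_rgb565(uint16_t dst, uint8_t sr, uint8_t sg, uint8_t sb,
                           uint8_t alpha)
{
    uint8_t dr = (uint8_t)(((dst >> 11) & 0x1F) << 3);
    uint8_t dg = (uint8_t)(((dst >>  5) & 0x3F) << 2);
    uint8_t db = (uint8_t)( (dst        & 0x1F) << 3);

    uint8_t r = (uint8_t)((sr * alpha + dr * (255 - alpha)) / 255);
    uint8_t g = (uint8_t)((sg * alpha + dg * (255 - alpha)) / 255);
    uint8_t b = (uint8_t)((sb * alpha + db * (255 - alpha)) / 255);

    return pack_rgb565(r, g, b); /* precision is lost again right here */
}
```

That last step is the part the V3's post-filtering trick was meant to smooth over.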
 
Yeah, but Q3A killed the Voodoo 3 for a lot of people. It actually runs the game pretty well, but I remember a lot of people bitching about it when the game was released (not that they had any right to-- I was running the game on a bloody ATi Rage Pro for a while!). The original GeForce came out about the same time as Q3, and that was one of the things that pretty much nailed 3dfx's coffin shut, IMHO. It always pissed me off that games were still coming out after Q3 that were optimized for Glide (mostly Unreal engine games, Deus Ex being one).
 
Originally posted by it290@Apr 28, 2004 @ 11:59 PM

Yeah, but Q3A killed the Voodoo 3 for a lot of people. It actually runs the game pretty well, but I remember a lot of people bitching about it when the game was released (not that they had any right to-- I was running the game on a bloody ATi Rage Pro for a while!). The original GeForce came out about the same time as Q3, and that was one of the things that pretty much nailed 3dfx's coffin shut, IMHO. It always pissed me off that games were still coming out after Q3 that were optimized for Glide (mostly Unreal engine games, Deus Ex being one).

Why did it piss you off? Would you rather every game had required use of GeForce256-only features like hardware T&L while you were stuck with your Rage?

There's this thing called "catering to the lowest common denominator"... combined with "catering to the largest installed user base".

The Voodoo3 was, for its time, THE most successful graphics card. Period. A HUGE percentage of gaming systems at the time had a Voodoo3 in them.

Even more people had the stunningly successful Voodoo2.

Why not make the game for Glide?
 
Originally posted by ExCyber@May 3, 2004 @ 02:20 AM

[cue OpenGL vs. DirectX flamefest]

No no no... Let's not do that. Why not talk about ATI's new chips? I hear they bumped the speed up a bit after seeing tests of the 6800 Ultra.
 
Originally posted by it290@May 2, 2004 @ 06:38 PM

Because proprietary APIs suck, especially if they're hardware-specific.

When you get right down to it... (not bashing here!) DirectX technically isn't even an API because it only applies to Windows. Where's my DirectX API for Solaris?
 
When you get right down to it... (not bashing here!) DirectX technically isn't even an API because it only applies to Windows.

Being platform-specific doesn't make something not an API. All that is really necessary for something to be called an API is that it embody a well-defined source level interface between applications and some collection of functions that is independent of the implementation of those functions. The DirectX APIs do this, even if they're only officially implemented on Windows operating systems. I suppose you could argue, however, that since they depend heavily on Win32 functions and data types, it would be more appropriate to say that they are extensions to Win32 rather than being APIs themselves. Is that what you meant to begin with? :p
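To put it another way, here's a toy sketch (nothing to do with any real DirectX or Win32 header): the API is just the declared contract, and any implementation can sit behind it.

```c
#include <stdlib.h>

/* --- The "API": a source-level contract (imagine this is blitlib.h) --- */
typedef struct blit_surface blit_surface;      /* opaque to callers */
blit_surface *blit_create(int width, int height);
void          blit_destroy(blit_surface *s);

/* --- One implementation of that contract. This one is a plain-malloc
 *     stand-in, but a Win32-backed or Solaris-backed version could
 *     satisfy the exact same declarations; programs written against the
 *     header above wouldn't change a line. --- */
struct blit_surface { int width, height; unsigned int *pixels; };

blit_surface *blit_create(int width, int height)
{
    blit_surface *s = malloc(sizeof *s);
    if (!s)
        return NULL;
    s->width  = width;
    s->height = height;
    s->pixels = calloc((size_t)width * (size_t)height, sizeof *s->pixels);
    if (!s->pixels) {
        free(s);
        return NULL;
    }
    return s;
}

void blit_destroy(blit_surface *s)
{
    if (s) {
        free(s->pixels);
        free(s);
    }
}
```

DirectX is the same idea: the interface definitions are the API; it just happens that the only official implementation is built on top of Win32.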

Where's my DirectX API for Solaris?

Try here. :devil
 