Qhimm.com Forums
Miscellaneous Forums => General Discussion => Topic started by: atzn on 2002-11-02 20:10:12
-
Nayoung sent me this link, thought u guys might wanna test this out...
http://www.playonline.com/Download/FFXiBench.zip
I got a score of 2072 ..... pretty crap.
The graphics in the Benchmark don't look too good either... I wonder if there's any way to crank it up.. or probably it's set like that.
I'm not gonna get FFXI anyway.. but I thought u guys might just wanna test it out.
-
Aye, I got a score of a little over 3500 IIRC.
I felt it looked very blurred, which says to me that they are trying to implement a kinda soft-FSAA, which will probs be disableable (quite a mouthful there)
-
and the linux version is where? ;_;
Actually, FFXI will probably not be in my inventory of Square games. I never really got into MMORPGs myself. I'll just call FFX-2 "FFXI" (it's not really X-2 but X+1 anyway)
-
I downloaded and ran it - I have 512 megs of RAM with the page file disabled, and after it had been running for a couple minutes, it complained about being out of memory and shut off :P
I was only using 208 megs of RAM before I ran it, too... memory hog?
Not to mention that the FSAA it uses looks pretty bad and is probably causing a considerable framerate drop...
-
You whining bastards. I got a score of 1835, but you don't hear me complaining about it, do you? No. So there. :-?
-
yeah I realised that it is poorly optimized.. gee.. don't tell me that they are going to run the final game at 640x480 resolution...
And I also heard that Square is working with nVidia to optimize the graphics... so what happened? :evil:
-
Grrr!
Ok, first I download the program with the intention of seeing how well my V3 would run it, and to show up any nVidiots here (:P), and what happens when I launch it?
Screen turns black, then dumps me to the desktop.
*scratches head*
What API is the benchmark using? DX8.1? Or did nVidia pull another one of their secret "screw 3Dfx users" subroutines in this game?
-
Quite simple: It needs H/W T&L to run.
-
Alright.
Thank God for Geometry Assist, then. ;)
I've had it disabled for a while, so that could be it.
However, by the same token, this also means Kyro users are screwed as well, unless they try running it with the "3DAnalyse" program.
-
you guys ever see the Warcraft 3 Benchmark program? That looked like utter crap! And the game turned out well. don't judge a game by a small benchmarking program
-
You've got a good point, smurgen.
Anyway, me gots a status update:
Geometry Assist failed to fool the benchmark program. Considering G. Assist is part of the 3Dfx driver code, and that nVidia has that code, and nVidia is involved with Square in the gfx code behind the game, I wouldn't be surprised if nVidia added some code that allows the benchmark to detect the fact that G. Assist is trying to lie to the program.
-
i got 4141....
jari: why is yours so high?
-
restarted and got 5033 :)
-
Kunnnaejundaaa wuwaaa *blink blink* o.O
unnng you have to have G4 or better 3D-card
"5487"
Under G3, 512MB DDR, 2.2GHz...,
but I couldn't test under SiS 650_740 (Y_Y)
i got 4141....
jari: why is yours so high?
"GF4 Ti4400" really stop whining -_-;
FFXI is based on DX8.1 or higher...
also game is based on nVidia (I will re-read the homepage again)!!!
But I hope US Squaresoft will release US version of FFXI U.U
-
I think Square is going to release it in English; it would be really dumb of them not to.
-
Kunnnaejundaaa wuwaaa *blink blink* o.O
unnng you have to have G4 or better 3D-card
Uh, if that was addressed to me, I don't speak/understand/read Japanese (if that was what the first part of that sentence was), so I'm lost on the meaning of that.
A G4? Since when did Square go to the Macintosh side?! :erm:
Geforce4 my foot! You do know that I'm the resident 3Dfx fanboy here, NaYoung? If not, I won't hold it against you if you didn't know that, since it's kinda a running gag of sorts around here that started before your time. :roll:
However, I usually find it annoying when people immediately recommend getting something of the nVidia species when something doesn't work with my vid card, because it's kinda unfair in that the only thing holding back my card is a little discrepancy in my drivers that says something like:
IsDX8.1Compliant=False
So, I'll try runnin' it with 3DAnalyze, as it is reputed to handle D3D8.1 calls better than G. Assist can when it deals with stuff that requires HW TnL capability to function.
Oh, and while we're on the subject of 3Dfx (no thanks to me ;)), I'm happy to announce that that V3 3000 AGP that came with the system that was supposedly fried by a power surge is undamaged and fully functional. So, YAY for me!! :P
-
nayoung wrote:
Kunnnaejundaaa wuwaaa *blink blink* o.O
unnng you have to have G4 or better 3D-card
Uh, if that was addressed to me, I don't speak/understand/read Japanese (if that was what the first part of that sentence was), so I'm lost on the meaning of that.
PS: Not Japanese... Korean ;)
WOW cool *blink blink* o.O
Yup you have to have G4 or better 3D-card
Hey says who *tsk tsk* I really like 3Dfx as well =__=
*I really like V5 5500 AGP with Windows 98...*
Cool thing about 3Dfx is that you don't have to
update card-driver very often!!!!
-
I couldn't get it to download. The first time it took about a minute before some error table or other appeared (from their side), and the second time some scrambled text appeared in IE. :-?
-
Kunnnaejundaaa wuwaaa *blink blink* o.O
unnng you have to have G4 or better 3D-card
"5487"
Under G3, 512MB DDR, 2.2GHz...,
but I couldn't test under SiS 650_740 (Y_Y)
i got 4141....
jari: why is yours so high?
"GF4 Ti4400" really stop whining -_-;
FFXI is based on DX8.1 or higher...
also game is based on nVidia (I will re-read the homepage again)!!!
But I hope US Squaresoft will release US version of FFXI U.U
eh. i have a 4600 :)
-
*chuckles* Why bother? There's no threat whatsoever from 3Dfx these days.
Are you familiar with Occam's Razor? In other words, did you consider that it might not work just because your Voodoo 3 is an old piece of junk? Which will fail if the program demands something as simple as 32 bit color depth with 3D acceleration. But hey, if you think that the only problem with your card is the lack of DX8.1 support in the drivers, good for you.
That "piece of junk" still has some perfectly good hardware acceleration equipment on it. Physically, the hardware can handle the games (at least in my case, with my P4 1.5 Ghz backing it up), its only the drivers that's preventing it from running those games. I don't mind the fact that it can't do 32bit color, although i wouldn't mind having that capability in the first place.
Though it is true that 3Dfx doesn't really threaten the market, it still seems as though nVidia is either afraid of letting these cards have full support, or they really do think that 3Dfx constitues a threat to them.
Why? They seem adamant about keeping the 3Dfx driver code away from us. Yes, I can see if they need SOME code for the 3Dfx tech that they may or may not be using in the nv30, but most of that tech is coming from the Rampage architecture (which probably would have used a totally different driver set/code than the V3 or VSA-100 cards), and so therefore the Voodoo driver code really isn't mission critical for them, which makes it seem silly to not open-source it, so those of us who *like* our cards and don't have money to buy a new one can at least squeeze a few more games' worth of playability out of it.
Unless, of course, it really IS their intent to completely erase 3Dfx from the planet......
-
That "piece of junk" still has some perfectly good hardware acceleration equipment on it. Physically, the hardware can handle the games (at least in my case, with my P4 1.5 Ghz backing it up), its only the drivers that's preventing it from running those games. I don't mind the fact that it can't do 32bit color, although i wouldn't mind having that capability in the first place.
Even so, I personally feel that the Voodoo3 series are aging... that's what I think at least.
-
That "piece of junk" still has some perfectly good hardware acceleration equipment on it. Physically, the hardware can handle the games (at least in my case, with my P4 1.5 Ghz backing it up), its only the drivers that's preventing it from running those games. I don't mind the fact that it can't do 32bit color, although i wouldn't mind having that capability in the first place.
Unless, of course, the games actually NEED
-32-bit colour
-Textures larger than 256x256 (technically known as 'postage stamp' size in the industry...)
-More than 2 texturing units
-More than 2 projection matrices (this is *really* silly, it costs nothing in terms of memory to support more matrices, the card just wasn't built to do it; even old Rage cards had 10 matrices...)
-Texture combining (needed to really make USE of more than one texture)
in which case the hardware CAN'T physically handle the games.
I should mention that I haven't picked out any features there which are 'new' and have only been around for a few years. My old Savage 4 - which, I might add, was around at the same time as the V3; had nearly the same performance; and cost far less - does most of those and *more*, such as texture compression. And yes, it was also compatible with old games such as FF7. So the V3 really isn't a good card in any sense of the word...
-
I can testify that the only thing the V3 was good at was glide, which no one uses anymore. I have a V3 in naru here and even in Linux, the hacker's OS, it simply blows. The people who made the DRI (Direct Rendering Infrastructure) for Linux were able to hack in the ability to render OpenGL in a window using off-screen rendering and filling in the blanks with software. I get 5 FPS running Doom Legacy.
The Voodoo3 is actually a Banshee; they changed the name in the development cycle as 3DFX was already going down the poop chute. They took brand recognition over a good product.
Heck, I can't even do 32 bit color in 1280x1024. I *so* need a new graphics card....
-
Why can't they follow the example set by the "Nature" scene in 3DMark 2001SE? My R8500 showed a quite playable framerate there, but NOT in the FF XI bench.
Bah! All the elements shown in FF XI are outdated. :evil:
Er, speaking of evil... this brings up another upgrade strategy... uuah!
-
They do. It's called "Evolution" and "Survival of the fittest".
I bet the Tyrannosaur was a really cool animal, but being cool doesn't guarantee you'll be around forever.
-
Goku7: Hey, I kinda know how you feel. I held onto a Viper2 card for waaaaay too long. Hell, the thing couldn't display textures correctly to save its own butt until driver release 9.51.13 (which was well after the card's release). It was okay for its time (note I said okay, not great) but quickly became outdated. S3 drivers have always been crap. It would be wonderful for companies to continue to write/develop/release drivers for cards that have passed their prime, but it's just not profitable. We all know what drives companies. It'd be nice to see continued development for our favorite 3d chipsets, but it's just not going to happen. I *finally* upgraded to a GF2ti, which was the best thing I could've done. I can enjoy games at decent framerates and have them display correctly. And get driver updates.
Halkun: Have you tried running the demo with wine/winex? I've had very little luck with wine or that transgaming crap, although my mandrake box isn't exactly what you'd call ready for prime time with gaming.
-
The Voodoo 3 was a nice card. Yes, it's damned fast in glide, but it is missing features that are needed in today's games. I think the real reason nVidia holds back the code is so that people buy new cards, so that the industry can move forward rather than being rooted in its past.
Don't get me wrong, 3DFX was a great company, I still have a Voodoo 5 sitting in my machine. Was is the operative word here though.
-
Wine....
I used to help develop for wine, and the system is not as mature as you think.
First, my OpenGL does not allow the use of threads. So the DirectX layer will not work on my box.
Second, there are many kernel functions that are missing in wine. Wine will never ever be able to use Kernel32.dll and gdi.dll even though it has a DLL loader. They run in ring0 and the Linux kernel will never allow that. These systems are only partially implemented.
Funny story:
I once tried to get bleem to run under Linux. It would crash as soon as I tried to start it. After a little research, I found out how bleem worked. To run as fast as it does, it kicks out the Windows9x kernel and replaces it with itself. In ring 0 it can do whatever it wants. When bleem tried to do that with the Linux kernel, not only did the Linux kernel shut down the bleem process, but it also killed the wineserver for being such an idiot for allowing that kind of operation to even be recognized.
-
Goku: Quite frankly, I think they aren't supporting it because 3DFX is a dead technology. Yeah, people still use it, but in all reality it's a dead technology, and I think Square simply doesn't want to spend time supporting a dead technology.
BTW: My GF2 sucks, but I got a decent score. 2603 first run, 23xx second run.
Time to get a new Vid Card...
-
I got 3892..... I quite enjoyed that little fly around the scenery... was light relief from a day of maths :weep:
-
Hal: No misconceptions about the limitations of wine here. That's why there's a windoze box in the other room. I've played with it enough to know it's not worth messing around with (not yet, anyway). I'm not really a heavy gamer, so it's not like it's a big hassle. I use vmware at work, so I can run our billing software when I have to. It's really a shame there aren't more games/ports developed for linux, though.
-
Unless, of course, the games actually NEED
-32-bit colour
-Textures larger than 256x256 (technically known as 'postage stamp' size in the industry...)
-More than 2 texturing units
-More than 2 projection matrices (this is *really* silly, it costs nothing in terms of memory to support more matrices, the card just wasn't built to do it; even old Rage cards had 10 matrices...)
-Texture combining (needed to really make USE of more than one texture)
in which case the hardware CAN'T physically handle the games.
I should mention that I haven't picked out any features there which are 'new' and have only been around for a few years. My old Savage 4 - which, I might add, was around at the same time as the V3; had nearly the same performance; and cost far less - does most of those and *more*, such as texture compression. And yes, it was also compatible with old games such as FF7. So the V3 really isn't a good card in any sense of the word...
Ok, yes, those are all very valid points. However, 3Dfx tech isn't exactly dead. IIRC, nVidia is incorporating some stuff from the original 3Dfx Rampage design, so not ALL of their stuff is bad.
The main cause of their downfall was the change of management that happened somewhere after the Voodoo2 was released, and unfortunately the new management wasn't too smart when it came to financial matters. The tech people were definitely top notch, if not ambitious; if you remember, Rampage (which was supposed to have hardware capabilities similar to the GF3) was originally scheduled to be the next card released after the Voodoo2.
Also, yes, you are correct in that compared to hardware standards of today, the V3 is not exactly the best of cards. Personally, the only reasons (other than a lack of money) that I'm staying with the V3 are that the performance it's currently giving (framerate-wise) is still enough that I can enjoy the games I play; that I'm having a little contest over how long the card can last (speed-wise) in the games that come out (in order to really get a fair "trial", you'd have to assume that it does have drivers that support DX8.1, and I'm still hearing reports on the X-3Dfx.com forums that the 3DHQ team is still working on that); and that I haven't found a Voodoo5 to use (OK, so driver-wise I'd still be in the same boat, but I would at least have support for some stuff, like texture size and 32bit color), if only to see its reputed FSAA quality in action.
Yes, I know I'm stubborn, and may even be acting like an idiot by not upgrading, but if (and when) I do decide to get a newer card that's actually made by a still-existing company, I'd probably go for ATi, if only because they're not nVidia. :P
Oh, and besides, if I didn't keep carrying the 3Dfx fanboy torch around here, there'd be nobody to argue with about things like this! :roll:
-
On a slightly related note.
I was thinking of buying an ATI card, but I have decided to boycott the company because of poor business practices. (Actually, I boycott other companies too. The other two are Disney and Microsoft.)
ATI is on my sh*tlist because they have a blatant disregard for non-disclosure agreements.
A few days ago a leaked alpha of Doom3 was released. ATI was the one who leaked it, but it doesn't stop there...
A few weeks ago id Software received a development laptop from ATI on which they forgot to erase an alpha of the next Unreal that's coming out. The guys at id erased the disk, but that's not something you should be giving away, especially to rival software houses.
As a programmer, this is a very nasty thing to do. It's Nvidia for me.
-
Well, what can I say? They're Canadian. Go figure. :P
-
heh, i've got 803 with the bench :love:
-
Why is Disney on your boycott list then?
-
Well, at 11mb I can't be arsed to download it unless I'm willing to stay online for more than an hour. What can I expect from my system:
Athlon XP 1700+
512MB DDR
GeForce 2 Pro @ 250mhz core & 490mhz 64mb DDR
Suddenly I realise how many people have better computers than me and think to myself "Oh shit, I need more money."
EDIT: ::Prays to the 3dfx Interactive gods:: Please, just one more...
-
Disney is on my list for the following reasons.
1) They have paid off the U.S. Congress to create a copyright law that is unconstitutional, granting them a de facto unlimited copyright term, when under Article I, Section 8 of the Constitution copyrights can only be held for a limited time. This has been challenged in Eldred v. Ashcroft. A Supreme Court ruling is due in March.
2) They have used their power to stifle the growth of animation as an art form in the U.S.
3) So as not to have competition from Japan, they bought the worldwide distribution rights to Studio Ghibli (often called "the Disney of Japan") with the promise of doing theatrical releases, only to sit on the works and not make a wide release. This one gets me the most.
For example, "Sen to Chihiro no Kamikakushi" (Spirited Away) broke all kinds of records in Japan and Hong Kong. Disney has the release rights for this work worldwide (except in Asia), and here in the US they released it in only *26* movie theaters...... That's it. Even though it has had *rave* reviews. They did the same thing to "Princess Mononoke".
Even Japan has noticed that there is something shifty going on.
Miyazaki (the director/producer of Studio Ghibli's works) is getting old and will probably be retiring soon. Disney is waiting for this to happen so they can knock him out as a threat to their animation monopoly.
Funny story:
When Disney bought the rights to Ghibli, they decided to hold Anime Expo at the Disneyland Hotel. They didn't realize they would have people dressing up in half-nude costumes or that the dealer's room would be selling adult material.. They had security tell anyone selling "objectionable material" to leave and kicked out some legitimate vendors..
AX will not be at Disneyland anymore...
-
dgp9999: I scored a 2375 with the following setup:
Athlon XP1800
512mb SDRAM
GF2ti (200mhz)
Not exactly like yours, but with the DDR and faster card I expect you wouldn't do any worse. BTW, that's on a Windoze 2000 box.
-
I never thought I'd see the WinNT kernel performing better at games. <sigh> Goodbye Win9x, you lasted a long time. Still, my computer can't handle anything more than Creative WaveEditor and Cubase, so I'm not trying to install any NT-based OS again. Whenever I open most MIDIs in Cubase VST/24 my system freezes; what is this?
-
I am the other Voodoo 3 owner on this board.. probably only what, 2 of us.. I don't plan on staying this way for long either.. oh btw.. it won't run on mine either.. black screen, gets all confused.. i thought i heard some birds chirping.. could have been my video card imploding.. -_-
I have hated being stuck without 32 bit color.. god when i get a new card.. i am gonna play all my damn games over in 32 bit color.. even if it makes no difference.. and emulation in 32 bit color..
I was a proud Voodoo 3 owner.. but now it is time to let it RIP.. i will probably wait until i get a new motherboard before i decide on whether to go with ATI or Nvidia..
-
I have hated being stuck without 32 bit color.. god when i get a new card.. i am gonna play all my damn games over in 32 bit color.. even if it makes no difference.. and emulation in 32 bit color..
Believe me, once you start gaming in 32-bit colour you'd never want to see 16-bit colour again.
-
About the FF11 PC benchmark.
This benchmark is terrible. The graphics look like they're in 320x200 resolution.
The most shimmering textures and aliased polygons I have ever seen.
I do not know if this benchmark is even using 3D hardware acceleration.
And this folder called "Rom"... is this some kind of piece of PlayStation 2 software emulated on PC? I think that this game was not programmed for PC from the ground up, but was ported in some way from the PlayStation 2, so maybe there are files similar to the PS2 version. I think that if this game had been programmed specially for PC by experienced native PC programmers, it would run faster. Seeing FF7 and FF8 for PC, and now this demo, I think that Square has no PC programming experience.
I have not seen the full FF11 for PC so I may be wrong about that game. We shall see in time.
-
I scored 2315 on the test.
I'm using a 1.2Ghz Athlon XP (underclocked due to m/b limit).
512MB of SDRAM.
32MB Geforce 2 GTS
I think the benchmark score depends on the amount of free raw cpu cycles.
Therefore, I think the score is
raw cpu power - cpu power used by video card.
-
You scored much higher than me, so that's got to be the case.
256 megs of SDRAM
731 Mhz Duron (Mobo doesn't offer good overclock...)
64 meg DDR Geforce 2 Pro with a custom hacked BIOS
...and I make about 1K. Sucks to be me. I think this is a mostly CPU-dependent thing, myself. It would make sense considering the blurriness.
-
Wow, another GeForce2 Pro user. That's a rarity. I've only seen one GeForce2 Pro in my lifetime.
-
Wow, another GeForce2 Pro user. That's a rarity. I've only seen one GeForce2 Pro in my lifetime.
I'm also using a GeForce2 Pro.... :P
-
I scored 2315 on the test.
I'm using a 1.2Ghz Athlon XP (underclocked due to m/b limit).
512MB of SDRAM.
32MB Geforce 2 GTS
I get about that score, sometimes less, sometimes more and I'm on:
1.3GHz Athlon (not XP)
512MB SDRAM
64MB Geforce 3 (not Ti or anything)
Windows 2000
I tried to run it on 98 as games and the like tend to be faster there, but it didn't even begin running. :-?
-
16bit and 32bit color actually have no difference to me.. I see both of them as many, many colors, so it doesn't register as much different to my eyes. I'll have to admit that 16-bit looks a little fuzzy half the time. Since I play most of my games on my GeForce2's TV-out feature, the screen is so fuzzy that there is pretty much no difference.
I would never sacrifice looks for performance..ever. It's just me. I know for a fact that our current gaming and 3d worlds will never become similar to "ours". So, why NOT make it look fake?!
-
I would never sacrifice looks for performance..ever. It's just me. I know for a fact that our current gaming and 3d worlds will never become similar to "ours". So, why NOT make it look fake?!
Really? The Kaya project (http://www.vetorzero.com/kaya/) would beg to differ. I admit that at this moment in time, no, but I wouldn't be surprised to see FF-quality gameplay on home PCs in the very near future.
Every day is a step closer to the goal of photorealistic rendering.
-
Of course, I for one think it would be ironic as heck if the first company to produce a fully-working holodeck had the name 3Dfx (obviously, this would have to be sometime after the copyright nVidia has on it had run out). :p
-
16-bit and 32-bit are like night and day to me. The banding is so obvious (even on a TV) it's insane!
Hehe...I wonder how 10-bit would look to my eyes?
-
You can always take a 16- or 32-bit screenshot, go to Paint Shop Pro, and reduce it to 256 colors (8-bit) using the "Standard" set of colors (it'll use just a normal 256-color palette) and you get... really crappy graphics! Heh.
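If you'd rather script that comparison than click through Paint Shop Pro, here's a rough sketch in Python using the Pillow library. This is my own illustration, not anything the benchmark or Paint Shop Pro provides; the filename is just a placeholder, and the constant names assume a recent Pillow version.

from PIL import Image

original = Image.open("shot.bmp").convert("RGB")   # placeholder filename

# Map to the fixed 216-colour "web safe" palette with no dithering, which is
# roughly what a paint program's standard-palette reduction does.
# (Older Pillow versions spell these constants Image.WEB / Image.NONE.)
crappy = original.convert("P", palette=Image.Palette.WEB, dither=Image.Dither.NONE)

original.show()                        # the full-colour version
crappy.convert("RGB").show()           # the 8-bit version, banding and all
crappy.save("shot_8bit.bmp")

Turning dithering off exaggerates the banding, which is pretty much the point of the exercise.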
-
Hmm......I wonder if screenshots output from my V3 are actually 16bit, considering that they use a 22bit post filter, so in theory any framebuffer data being captured in a screenshot might be interpreted as a 24bit image.........
:roll:
-
I think it's supposed to capture what's on the screen, minus the mouse cursor [and sometimes a few other things], so I think it would be at whatever color depth the screen is set to at the time of the screenshot.
-
Well, the problem with that comes when the screenshot is created by grabbing the info directly out of the V3's framebuffer. They've been saying over at the x3dfx forums that everything in the framebuffer is being rendered at 22bit, and if that really is the case, then the image you get should be at least a 22bit image, even when you've got it set to do 16bit color.
See why I'm confused? :P
-
So... you could just try it and find out :P
-
I did. Remember that screenshot of FF8 I posted a while ago that had a map of the real world in it?
Obviously, the game only runs in 16bit modes, so you would expect the image to be reported as using 16bit colors, right?
Well, the .bmp file had been made using 24bit colors, or so it says. Go figure. :P
-
After a processor upgrade from a PII 350MHz to a PIII 1302MHz ("02"?... :p), my FFXI bench score beamed up 3.5x higher! (28xx-something)
-
i don't know much about the good ol' 24bit color bitmap.. but i believe this is something like the file format.. all bitmaps have the ability to be 24 bit color.. but that doesn't mean they are.. that sounds screwy.. hang on..
uh.. ok
in photoshop for example you have the option to set bmp pictures to 24bit.. actually you have something like 1, 4, 8 or 24bit.. but just because the option is there doesn't mean that it is.. i think.. i can't find the proper words i am looking for here.. somebody... little help here.. :)
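For what it's worth, the bit depth a .bmp reports is just a number in its file header, and it says nothing about how many distinct colours the pixels actually use. Here's a rough Python sketch of reading that field straight out of the file (the filename is a placeholder):

import struct

# The declared bits-per-pixel (biBitCount) of a BMP lives in its header:
# 14 bytes of BITMAPFILEHEADER, then 14 bytes into BITMAPINFOHEADER,
# i.e. a 2-byte little-endian value at file offset 28.
def bmp_bit_depth(path):
    with open(path, "rb") as f:
        header = f.read(30)
    if header[:2] != b"BM":
        raise ValueError("not a BMP file")
    (bit_count,) = struct.unpack_from("<H", header, 28)
    return bit_count

print(bmp_bit_depth("ff8_screenshot.bmp"))   # placeholder filename -> 24, 16, 8, ...

So a shot grabbed from a game running in 16-bit mode can still come out labelled 24-bit: whatever did the grab simply wrote a 24-bit file, regardless of the source precision.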
-
16-bit and 32-bit are like night and day to me. The banding is so obvious (even on a TV) it's insane!
Hehe...I wonder how 10-bit would look to my eyes?
Yes they are quite different, but I suppose it has something to do with the TV that I am playing on. I'm using a 20-year-old 25 inch tv (or something similar) and I have to squint to read text at 640x480 resolution. When I set it to 800x600 (which is the highest it goes), I can't read ANYTHING.
I played warcraft 3 and gta3 on the tv and I noticed no difference between 16 and 32 bit. Once I went back to using my monitor, I was like "what is this 16-bit crap?". I bet that if I played it on my new(er) tv, it would definitely look different.
-
Of course, I for one think it would be ironic as heck if the first company to produce a fully-working holodeck had the name 3Dfx (obviously, this would have to be sometime after the copyright nVidia has on it had run out). :p
So in more than 70 years' time then, according to international copyright law. Aside from the fact that nVidia may actually renew the copyright.
Hmm......I wonder if screenshots output from my V3 are actually 16bit, considering that they use a 22bit post filter, so in theory any framebuffer data being captured in a screenshot might be interpreted as a 24bit image.........
IIRC the shots are taken from the framebuffer, which is indeed 22-bit. There is however the gamma issue that is associated with ripping from the 3dFX framebuffer, IIRC. The Voodoo 5's had the problem that the shot in the framebuffer was not actually antialiased, leading to misinterpreted screenshots.
-
IIRC some review sites mentioned that the Voodoo3's 16-bit colour actually looked much better than TNT/TNT2's 16-bit colour...
-
That's accurate. The Voodoo's had the best quality 16-bit colour there was.
-
Hmm I re-ran the Benchmark and I got a score of 2504.... :P
1.2GHz Athlon (Thunderbird)
1 GB SDRAM
64mb GeForce2 Pro
WinXP Home Ed
-
That's accurate. The Voodoo's had the best quality 16-bit colour there was.
But, the ONLY reason that is the way it is, is because of the 22bit post filter it does on its output. Take away that, and it would look like every other card's 16bit output, correct?
-
But, the ONLY reason that is the way it is, is because of the 22bit post filter it does on its output. Take away that, and it would look like every other card's 16bit output, correct?
I don't know about this filter stuff, but I think you're right.
Still, kinda surprising how fast Voodoo3 was in 16bit vs TNT2....
-
I don't know about this filter stuff, but I think you're right.
Still, kinda surprising how fast Voodoo3 was in 16bit vs TNT2....
Well, that could be attributed to the 256x256 texture size limit. The TNT2 could do the standard 2000x2000 size textures, but it would do them dog-slow....
-
They also had excellent dithering, but pretty much, yeah.
-
I agree with LordKane: V3s had the best 16bit colour. I remember running my V3 3000 on ePSXe w/ dithering and it looked almost as good as 32bit on my GeForce 2. Without dithering it still looked pretty poor compared to today's 32bit cards. Keep in mind V3s can't do hardware Alpha Multipass, Alpha something-or-other, the one in Pete's GPU options; that's gotta affect something.
-
Which is why us V3 users prefer Lewpy's plugin. Due to it using Glide for its graphics processing, it gets a nice performance boost compared to using Pete's plugins.
Not to mention, it seems to do the FF9 screen swirl thing better, both speed and graphic-wise. :P
Oh, and V3's can do Alpha Blending, at least standard Alpha Blending. They just can't do "Advanced Blending", IIRC.
-
Hrm... benchmark gave me 4675...
Athlon XP 2400+
512 MB RAM
GeForce4 TI 4600
-
Impressive, aaron.
I came across a V5 a "long" while ago... is that card still worth it?
If not, I feel very sorry for him spending $300+ back then.
-
Impressive, aaron.
I came across a V5 a "long" while ago... is that card still worth it?
If not, I feel very sorry for him spending $300+ back then.
Voodoo 5's are only really worthwhile now for old games (The FSAA is superb for older games), and for N64 and PSX emulators.
-
Yeah, if/when I do upgrade (probably more of a "when" than an "if" though, lol), I'll still keep my 3Dfx cards with me, mainly for glide support, and the fact that it's nice to have a piece of computer history with me. :P
However, I'll still jump at the chance to acquire a Voodoo5 if I can (that is, if I can swing it and still have something left in my bank account, lol).
-
dude.. u can probably get a voodoo 5 for dirt cheap.. unless they have become a collectors item now :wink:
-
dude.. u can probably get a voodoo 5 for dirt cheap.. unless they have become a collectors item now :wink:
They are something of a collectors item now, but you should be able to get a PCI one for less than GB£70
-
I see an AGP one on Pricewatch for just over $100...
-
I was talking about eBay, like where I got mine. New ones are few and far between, and are probs expensive for their rarity value.
-
Yeah, I know all that. What I plan on doing is seeing if my parents will be willing to pay for it for Christmas, LOL, if I find one at a good enough price during that time. :D
-edit-
Of course, that prompts the next question: V5 5500 or a working V5 6000? :P The answer to that is an easy one as well: I'd get the 6k, as long as it was one that actually works.
-
Yeah, I know all that. What I plan on doing is seeing if my parents will be willing to pay for it for Christmas, LOL, if I find one at a good enough price during that time. :D
-edit-
Of course, that prompts the next question: V5 5500 or a working V5 6000? :P The answer to that is an easy one as well: I'd get the 6k, as long as it was one that actually works.
Lol, that's a bit silly IMO. The Voodoo 5 6000s are rarity items and frequently sell on eBay for upwards of $700. There were very few of these boards ever made....
-
That last part I wrote was tongue-in-cheek, Lord Kane. Yes, you are correct on both counts: working V5 6k's are RARE, and are truly expensive. Not to mention they are about a foot and a half long; they're so big some people can't fit 'em in their comp's case!
-
Anyone notice FF 11 is out for Windows in Japan
yet it ain't even out here on PS2
dammit
http://www.watch.impress.co.jp/akiba/hotline/20021109/etc_ffxi.html
If anyone can import/export this for me I'd be gladly in debt to u,
and I'll send u a huge batch of anime on CD.
BTW I got a Voodoo 5 5500 PCI for sale if anyone's interested, at £65. It's a steal, believe me; I've looked for this on eBay and it always gets to the £90+ mark.
-
Well... At least it's posting the required specs...
Min Required: P3 800MHz, 128 megs RAM, & a GeForce card w/ 32 megs of mem. (They don't specify what kind.)
Recommended: P4 (no speed listed), 256 megs of RAM & a GeForce 3 or 4 Ti with at least 64 megs of mem.
It looks like they are recommending the GeForce4 Ti4200 as the card to use. (I can't really read Japanese, but I can get the gist of it.)
Besides, if anyone DID import it, they'd have to connect to the Japanese servers. That'd make a lot of problems in terms of connection speed...
-
When you are going across the world though, you start to run into the speed of light problem.
Say I'm in New York, and I want to play a server in Tokyo. That's 6760 miles. The speed of light is 186,000 miles a second. In ideal conditions, your signal has to go from your computer to the server, and back. Say that this travels at the speed of light (it does not - it is slower due to overhead, A/D conversions, electricity travels a bit slower than the speed of light, etc.)
So, we have to go 6760 miles there, and 6760 miles back. That's 13520 miles. At the speed of light, there and back it takes 72.688 ms (((13520 miles) / 186000 miles/sec) * 1000ms/sec). So, at best you can have a 72 ms ping. And that's using grossly bad estimates. (Not taking into account time lost by routing hops, A/D conversion, network overhead, latency, etc.)
You really can't defeat the speed of light.
Taken from http://www.kaillera.com/
Hee hee
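To make that arithmetic concrete, here's a tiny Python sketch of the same round-trip figure. The 6760-mile distance and the 186,000 miles/sec value are the ones quoted above; everything else (names, structure) is my own placeholder.

# Best-case round trip if the signal travelled at the speed of light.
# Real pings are worse: routing hops, A/D conversion, protocol overhead, etc.
SPEED_OF_LIGHT_MPS = 186_000    # miles per second, figure quoted above
ONE_WAY_MILES = 6_760           # New York to Tokyo, figure quoted above

def best_case_ping_ms(one_way_miles):
    round_trip_miles = 2 * one_way_miles
    return round_trip_miles / SPEED_OF_LIGHT_MPS * 1000   # seconds -> ms

print(f"{best_case_ping_ms(ONE_WAY_MILES):.3f} ms")       # prints 72.688 ms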
-
Disney is on my list for the following reasons.
Heh. I knew they showed the last few Ghibli films in theaters in the US but I didn't know it was that bad. Bastards. Disney are idiots for even thinking their animation comes close to Japanese animation.
Anime >>>> Disney
.........
Wtf? When I try to run this benchmark I get the error
"Windows cannot access the specified device, path, or file. You may not have the appropriate permissions to access the item."
Eh???
Seen. Made a copy of the main exe and it worked fine.
Scored 3650 on:
Athlon 1.6GHz
Elsa GeForce 3
-
>Disney is on my list for the following reasons.
You know they also apparently stole the plot for Monsters, Inc. too.... at least according to an article on Slashdot (http://www.slashdot.com)
-
look fellas, i don't care about the online aspect.
i want it for the packaging!! damn, it looks good for a pc game :)
just an expensive Japanese memento that i can show off with :)
-
You guys wanna see the difference between Nadia and Disney's Atlantis?
Read the following link. You would get disgusted with Disney pretty quick too.
http://www.newgrounds.com/lit/atlantis.html
Hmmm....
-
lol Ahhh how classic, as well as Leo (I think) & The Lion King lol
-
http://real.amazon.usa.speedera.net/ramgen/real.amazon.usa/games/squa/b00006la8u160300.rm
There, you might be able to see some parts of the FFXI opening!!!
PS: If you didn't know...
http://real.amazon.usa.speedera.net/ramgen/real.amazon.usa/games/capc/b00006d2dl160300.rm
-
Looks cool, but thanks to the Real Player format it looks really choppy on my PC (I cannot be bothered to update it....)