My Loverlly GFX card
I've got a GainWard GFX card
So what do I do? I use ExpertBIOS, which lets me update my GFX BIOS while in Windows. What they didn't tell me is that the latest version causes my GFX card to crash, so now I can't use it. And here's what's even worse: the updates they send me to fix it give me errors saying they can't be used. So I guess I'm not going to buy a GainWard anymore...
Re: My Loverlly GFX card
Moved from the RealBot forum to offtopic, although not being able to play RealBot with a decent GFX card is a pity.
Re: My Loverlly GFX card
With a decent one? LOL, I have to use an S3 Trio 32, which can't do OpenGL or D3D, only software rendering, and its max resolution is 800x600x16. Using that in CS gets 1-2 fps max.
Re: My Loverlly GFX card
ouch that hurts instantly
Re: My Loverlly GFX card
That hurts, yes... also, why would you need to update a GFX BIOS? :D
I have an integrated GeForce4 MX, never touched it. Works well. :) My other comp has a GeForce2, never touched it either. Not a single problem :) </mocking ;)>
Re: My Loverlly GFX card
Certain updates fix things like the shaders. Mine were screwed before I updated the first time; I went to update again, and this time my entire card got screwed.
Re: My Loverlly GFX card
.. so the second update was playing roulette, Russian *ouch*
Re: My Loverlly GFX card
They have sent me 3 different files now to try and fix it, and it's still fucked.
Re: My Loverlly GFX card
LOL do they at least know what to send you ??? :D :D
Re: My Loverlly GFX card
I've been in contact with the following countries about it:
Taiwan = HQ of GainWard
Sweden = only place that would pick up the phone for 2 days
Germany = European Tech HQ
England = GainWard England HQ
and they all can't figure out why it's fucked.
Re: My Loverlly GFX card
The only thing you can do then, if you know someone who has the same card, is to turn on Video BIOS shadowing in the PC BIOS setup (so that his computer won't need to read the video BIOS code directly from the BIOS chip), and then hotswap the two BIOS ROMs, flashing the fucked one on the shadowed PC. Barbaric, but it works :)
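If you go that route, it can't hurt to sanity-check the ROM image they sent you before flashing it onto the hot-swapped card. A minimal Python sketch, assuming the update is a raw VGA option ROM dump (the filename below is only a placeholder); it checks nothing but the standard 0x55 0xAA signature, the declared image length, and the 8-bit checksum that legacy option ROMs carry:

```python
# Minimal structural check of a legacy VGA option ROM image before flashing.
# Assumes a raw ROM dump; "update.rom" is only a placeholder name.
import sys

def check_option_rom(path):
    with open(path, "rb") as f:
        data = f.read()

    # Legacy option ROMs start with the signature bytes 0x55 0xAA.
    if len(data) < 3 or data[0] != 0x55 or data[1] != 0xAA:
        return "no 0x55AA signature - probably not a raw option ROM dump"

    # Byte 2 gives the image length in 512-byte blocks.
    declared = data[2] * 512
    if declared == 0 or declared > len(data):
        return f"declared length {declared} bytes does not fit the {len(data)}-byte file"

    # All bytes of the image must sum to zero modulo 256.
    if sum(data[:declared]) % 256 != 0:
        return "checksum mismatch - the image looks truncated or corrupted"

    return None

if __name__ == "__main__":
    problem = check_option_rom(sys.argv[1] if len(sys.argv) > 1 else "update.rom")
    print(problem or "looks like a structurally intact option ROM image")
```

It can't tell whether the image is the right one for the card, only whether it is structurally intact.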
Re: My Loverlly GFX card
LOL, bit of a shame then that mine reads its BIOS before the CMOS setup even kicks into action, but I'll give it a go.
// nope, didn't work
Re: My Loverlly GFX card
Hmmm... odd. Are you sure you enabled both caching AND shadowing for your video BIOS?
Normally, if the BIOS is duplicated into system RAM or video RAM, there should be no reason for the card to read bytes from the chip. Or maybe this card does "sanity checks" which prevent it from working when you unplug a chip...
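One rough way to see whether a shadow copy is actually being served (after the chip has been pulled on the donor machine) is to read the legacy video BIOS region at physical address 0xC0000 and look for a valid option ROM image there: shadowed reads come from RAM, while non-shadowed reads would hit the missing chip. A sketch under big assumptions: Linux, root privileges, and a kernel that still lets you read that range through /dev/mem:

```python
# Peek at the legacy video BIOS region (physical address 0xC0000) and check
# for the 0x55 0xAA option ROM signature. Assumes Linux, root privileges,
# and a kernel that allows reading this range through /dev/mem.
VIDEO_BIOS_BASE = 0xC0000

with open("/dev/mem", "rb") as mem:
    mem.seek(VIDEO_BIOS_BASE)
    header = mem.read(3)

if len(header) == 3 and header[0] == 0x55 and header[1] == 0xAA:
    print(f"video BIOS image present at 0xC0000, {header[2] * 512} bytes declared")
else:
    print("no option ROM signature at 0xC0000")
```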
Re: My Loverlly GFX card
Hmmm, I can have a look around for an engineering BIOS (welcome to OC heaven), but I need to know the card's name... FULL NAME.
Btw, I run a Creative GeForce4 Ti 4600, overclocked and modded to a Quadro4 (hey, I am a hardware weirdo, so I do what I can to have fun with it). Now, if I had an XP 1800, I would have been in the extreme division. Let me know the name of the card and we could get it fixed for ya. The FXs aren't the best in the backyard; ATi just raced past them and is keeping the lead.
Re: My Loverlly GFX card
I use a GeForce FX 5200 by BFG Tech. I picked it up when I needed a card with more memory; however, I got what I paid for when I handed the cashier 100 USD.
Sometimes I feel like my video card is the scrawny kid on the playground who gets beat up often.
Re: My Loverlly GFX card
I'll see what I can find around that card...
Not sure if I can get a BIOS for an FX; I will check, not many use them.
Re: My Loverlly GFX card
Hehe, I just got my hardware supplier to give me a new card rather than me having to send it to Germany to get fixed. Lol, what he didn't know was that I only had an FX 5400 (yes, it really was a 5400, not a 5200 or a 5600),
and he gave me a 5900 ULTRA, lol, the leaf blower kind. But with the power of the fans on my system, you can't even tell it's on.
Re: My Loverlly GFX card
People say the best card in the FX series is the 5900 Ultra; however, I don't think that's saying much, since the card with the fastest processor is naturally going to perform the best, and who really needs the best anyway?
I know I've never needed more than 60 FPS in any of my games, so why would I spend more money to go above that?
Re: My Loverlly GFX card
Actually, the only relevant criteria are the number of polygons the card can handle each second, and the availability of this or that special effect directly in hardware, like fog, texture mapping and filtering functions. The human eye samples its world 15 times a second; you could conceivably play a game at 20 fps and not notice any lag, or very little.
I remember when I was developing my bots on my old computer (500 MHz), most of the time CS was running at 15-20 fps and it was still playable.
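To put numbers on the frame rates being argued about here, the frame budget is just the reciprocal of the rate; a tiny Python sketch:

```python
# Time available per frame, in milliseconds, for frame rates mentioned in this thread.
for fps in (15, 20, 25, 30, 60, 100):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```

So the jump from 20 fps to 100 fps is the difference between a 50 ms and a 10 ms frame budget; whether the slower one still feels smooth is exactly what the rest of the thread disagrees about.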
Re: My Loverlly GFX card
Lol, the lowest fps I've ever played at was 1, and that wasn't with this card, it was with a GeForce 1 plus my gfx settings.
Everything I do must look the best no matter what card I'm on. At one point I had the fps so low you couldn't tell what you were doing at all.
Re: My Loverlly GFX card
But the more fps you get, the more fluent your game is. It's very difficult to play at 25 fps, much easier at 100 fps :)
Re: My Loverlly GFX card
No, I think Pierre is right with 15-20; television is at 25 full pictures per second. But you cannot compare those values. On TV you have motion blur and effects like that, therefore we don't need 60 fps. With a video game it is different: you have no blur there, just sharp pictures. To have smooth movement, the human eye needs more frames per second. And I guess nobody doubts that 60 fps looks better than 30 or even 20 :D
Re: My Loverlly GFX card
It's the Shannon theorem...
Any digitally encoded information that is to be used without loss has to be sampled at TWICE the sampling rate of the system using it. This is to prevent signal loss due to filtering. That's why our digital audio is sampled at 44.1 kHz, even though the human ear rarely goes beyond 20 kHz. That's also why the PAL TV standard runs at 30 fps while the eye samples at 15... On some LCD monitors the liquid crystal latency induces a form of filtering between two frames; that's why I have been able to play CS at 15 fps without noticing much frame skipping. On CRT monitors though it is different, and I would probably have needed 30 fps to play correctly. Anyhow, anything above this value is pure luxury...
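For reference, the sampling theorem is usually stated against the highest frequency present in the signal rather than against the receiver, but the arithmetic behind the audio figure quoted above is this:

```latex
% Nyquist-Shannon sampling criterion, with the CD-audio numbers quoted above.
% f_s is the sample rate, f_max the highest frequency present in the signal.
\[
  f_s \;\ge\; 2 f_{\max},
  \qquad
  2 \times 20\,\mathrm{kHz} = 40\,\mathrm{kHz} \;\le\; 44.1\,\mathrm{kHz}\ \text{(CD audio)}.
\]
```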
Re: My Loverlly GFX card
Hi,
it is 10 frames per sec.; PAL is 25, so 15 would not work (cinema movies are at 24, and one MPEG standard is defined at 23.89...). Also, TV uses interlacing, which "enables" the human filter in your brain that puts those half frames together. That's why watching TV makes you (your eyes) tired faster. When you get tired it gets slower, closer to 6 frames/sec. You can test it with AVIs: before you go to bed you will be able to watch movies at 18-20 frames/sec without noticing. This also leads to "how long is a moment"; according to studies, one moment is about 1.5 seconds when you are fit, and it goes up to 3-4 seconds when you get tired. But that is psychology... :-)
Re: My Loverlly GFX card
the human eye samples at 120+ fps, but due to the size of our brains we are able to blend the images.
Re: My Loverlly GFX card
That could be possible.. since you can see when the fps is low.. yeah, I know, digital.. but anyway..