*2nd UPDATE* Geforce 2 can't play a single game - Computer freezing! - HELP!


Mem

Lifer
Apr 23, 2000
AnitaPeterson, well, we are getting somewhere, very slowly. I forgot to say: install the latest monitor drivers as well. And with Quake 3, do you have all the latest updates too?

You can try going
here to the GeForce FAQs. I must admit I'm running out of ideas, unless you want to upgrade to WinXP ;).
 

Assimilator1

Elite Member
Nov 4, 1999
What motherboard have you got? I'm somewhat confused by your sig link saying Creative. Is that right?

Btw, what about the latest VIA 4-in-1 drivers?
 

rogue1979

Diamond Member
Mar 14, 2001
Windows 98 is still arguably the fastest OS for gaming. Is there a reason why you couldn't go back to 98 and solve your problems?
 

Assimilator1

Elite Member
Nov 4, 1999
She wanted Win2k for vid editing

Anita
Dual-booting with Win98 would be the ideal answer, though it would involve more hassle, and FAT32 would have to be used too.
Btw, I don't see how incorrect monitor drivers would cause system lock-ups.

Have you tried the latest vid card drivers from Nvidia?
Have you checked with your m/brd maker to see if there is a later BIOS that solves any Win2k problems?
 

jamesbond007

Diamond Member
Dec 21, 2000
Anita, tough break on your situation. However, regarding your VIVO problems, I've found that you can use the nVidia WDM Driver (Personal Cinema), which is available on this page. Upon reboot, Windows will detect and install the appropriate drivers for the VIVO features on your Asus Deluxe model video card. I've used the nVidia reference drivers paired with the Personal Cinema drivers ever since they've been available on their website. I haven't had one problem with them. I've tested and used this driver combo on the 7700 Deluxe and the 8200 Deluxe models from Asus.

About your driver/gaming problems, I suggest a full format again, using FAT32. After the initial install of Win2k, install the SP2 update, DirectX, your video drivers, and your sound drivers, in this order. (This is what I do with all of the machines I build and repair, and it has yielded me no problems yet.)

Good luck!
 

Bozo Galora

Diamond Member
Oct 28, 1999
MASS:


You would be wasting your time asking anyone in the industry about your problem. There are six strobe lines in AGP 4X that must be correct to within .06V or 4X AGP won't initialize, and there are few mobos with voltage distribution sufficient or sophisticated enough to adhere to Intel's 4X AGP specification, because if a mobo maker built a board that could do it, they would price themselves out of the market.

Abit is making an effort on their newest motherboards with a 4-phase (yes, 4) power implementation.

Intel-based boards also fail in 3D, but not nearly as much, because Intel wrote the AGP specs but gives no help to VIA or AMD, who have to reverse-engineer their 4X support.

Windows protects the hardware it's running on when you ask it to do something it can't in one of four ways: dropping from the game back to the desktop, a hard freeze requiring a reboot, a black screen (no video), or a BSOD with a loop message. When you switch to a game or to the internet, you are changing from cruising in 2D to 3D acceleration.

Your best bet, since you are probably not involved enough to spend months tweaking your system parameters, is to reinstall 2K with 2X set in the BIOS and load the "standard" 4-in-1 AGP driver (1X - 2X), or no VIA 4-in-1 AGP driver at all, which means your system will use the Intel GART supplied by the Detonators.

AGP 8X is also an effort at solving this problem, since the signalling voltage will be .8V.
 

AnitaPeterson

Diamond Member
Apr 24, 2001
Jeez, what a hassle! Weren't these stupid pieces of hardware and software supposed to be compatible without us killing our minds over tweaking and finding hacks and going around specs?

I mean, it could be fun, too, but sometimes I need a break!

I have decided that the safest - and probably least painful, not to mention cheapest! - course of action would be for me to forget about gaming on my main, video-editing machine.

I will use the weaker, secondary computer of the house for anything related to gaming.

Thank you all for trying to help, for your goodwill and knowledge.

*!%+&^)_# very much, ASUS!
 

Assimilator1

Elite Member
Nov 4, 1999
I don't think you can blame Asus for this one. Intel ought to give more info to VIA & AMD... not that I believe that will ever happen!
 

Bozo Galora

Diamond Member
Oct 28, 1999

The following is long, but it's the best explanation I've ever seen of why your computer won't work at 4X and crashes or freezes or loops:

The peeking begins...

If there is one topic guaranteed to cause arguments and controversy amongst computer enthusiasts, it has got to be the subject of the AGP port. Go and visit any popular discussion board, and read the vast number of posts from people trying to get their system to work at 4x speed or with sideband addressing. The whole area of this data transfer bus seems to be shrouded in mystery and laden with a great deal of randomness. How is it that one machine works fine with every dongle switched on, and another (seemingly identical one) spits the dummy out with anything other than 1x speed?

This article is aimed at explaining what all those features are, plus whether or not they'll give you an increase in performance on your system. Hopefully at the end of it you'll have a much clearer idea about what's going on - plus you'll be able to thrash your mates at "Geek Question Time". Sit back and read on...

Who, what and when?

First off, some terminology and explanations. The AGP (accelerated graphics port) was invented by Intel and released to the world market as a specification in 1997. It's important to understand that the specifications are rather like cooking instructions - they tell you how it should be made and how it's all supposed to work, but the final design can be as individual as each manufacturer that makes them. Fortunately though, the specification is very precise so variances between the makes are minimal.

PCI = Slow (boo), AGP = Fast (yay!)

In 1997, Intel's 440LX chipset (later followed by the BX chipset) showed the way to go with the first AGP slot, meeting specification 1.0. This slot had a few nifty tricks up its sleeve.

You see, another problem with the PCI bus is that it can't request more data until the preceding request has finished. Every data request requires a response first from the Northbridge chip or CPU, so if it's busy then the PCI device will have to wait until it gets an answer back before requesting something else (although it gets around this a little bit by using bursting, whereby lots of chunks of data are sent with just one request). The AGP bus instead uses pipelining - the bus just makes requests for data as and when it needs to, without having to wait for the last one to be responded to. Nifty, huh?

To free up the AGP bus for just data transfers, there is an additional little bus called the sideband addressing port (SBA from now on). It's only 8 bits in width, but all it has travelling up/down it are address lines for where in memory the data is going to/from, and requests for data. The latter part is especially useful. Remember, the PCI bus has to go through a rather odd cycle before it can get data: (1) a PCI device asks the memory controller "Can I have the data bus for a bit so I can get some data?", (2) to which the memory controller replies "Yup" if it can. (3) The device then makes a request for the data, (4) waits until that request has finished, and then (5) waits until the PCI buffer releases the data. By which time, we've all fallen asleep....*thud*.....
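
To make the pipelining difference concrete, here's a toy back-of-the-envelope Python sketch (the cycle counts are my own invented illustrative numbers, not anything from the spec):

[code]
# Toy model: serialized PCI-style requests vs. pipelined AGP-style requests.
# LATENCY and TRANSFER are invented numbers, purely for illustration.

REQUESTS = 8      # data requests to service
LATENCY = 5       # cycles the memory controller takes to answer a request
TRANSFER = 2      # cycles to move the data itself

# PCI-style: each request must fully complete before the next one starts.
pci_cycles = REQUESTS * (LATENCY + TRANSFER)

# AGP-style: requests queue up back-to-back (via the SBA), so the
# answer latency is only paid once up front; transfers then stream out.
agp_cycles = LATENCY + REQUESTS * TRANSFER

print(f"PCI-style serialized: {pci_cycles} cycles")  # 56 cycles
print(f"AGP-style pipelined:  {agp_cycles} cycles")  # 21 cycles
[/code]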

Hurrah for the AGP port! It doesn't need to ask permission as to whether it can use the AGP bus, because it's the only thing on the bus (something 3dfx obviously never noticed when they designed the V5...tee hee). Along with the SBA, there can be an almost continuous stream of data between the RAM in the motherboard and the video card, provided the CPU doesn't choke and die.

There was another reason why the AGP bus was given this direct link into the system RAM though. Video RAM (VRAM) was very expensive at the time, as was all memory - you'd be lucky to see more than 4Mb on a video card in a top PC back then. A single video frame with a resolution of 800 * 600 in 32 bit colour with plenty of polygons weighs in at several Mb. Add in a few more Mb of memory to store the textures for the 3D bits, a bit more for the z-buffer, a wee bit for the RAMDAC too and....oh dear...you've got more than 4Mb. The AGP solution was to allow a software designer to keep the textures and buffers stored in the system RAM and let the video card access them when it needed to. Windows has a funny habit though of putting things from the system RAM into a file on the hard drive (called the virtual memory or swap file) when it needs the room. A game designer wouldn't want their textures being shifted (the game would effectively stop whilst it waited for the hard disk to find the textures), so Intel included the AGP Aperture in the specification.
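
To put rough numbers on that memory squeeze, here's a quick Python calculation (the buffer mix is my own assumption - say a double-buffered frame plus a 16-bit z-buffer):

[code]
# Back-of-the-envelope video memory budget for a late-90s card.
width, height = 800, 600
bytes_per_pixel = 4                         # 32-bit colour

frame = width * height * bytes_per_pixel    # one frame buffer
z_buffer = width * height * 2               # 16-bit z-buffer
total = 2 * frame + z_buffer                # front + back buffer + z

print(f"one frame:    {frame / 2**20:.2f} MB")   # ~1.83 MB
print(f"buffers only: {total / 2**20:.2f} MB")   # ~4.58 MB
# Past 4Mb before a single texture is stored -- hence AGP texturing
# out of system RAM via the aperture.
[/code]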

The aperture is a setting, nothing more. Think of it as being an address book for the system RAM which gets used by applications that access your video card. The aperture setting specifies the size of the address book, if you like. Nothing special in that, except for two things: the location of the data within the memory is always known, and the contents of the aperture can never be paged (or swapped) to the hard disk. The default setting for this is typically 64Mb, but this does not mean that 64Mb of your system RAM is now the AGP aperture. In fact none, some or all of it could be - it just depends on the application using the video card. If it doesn't need to use any of the aperture, then all of the system RAM is free to be paged away.

Just to really kick the poor old PCI slot while it was on the floor, in waded the AGP port again with its higher clock speed. As well as transferring data at 66MHz, the port could be made to transfer data twice per clock cycle (it's often called DDR these days, or double data rate). This gives an effective clock speed of 133MHz. Better known as AGP 2x speed, it stuffed the miserable 133MB/sec bandwidth offered by the PCI bus with a steaming 533MB/sec. However, this technology didn't come without a price.
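
Those bandwidth figures fall straight out of clock rate x transfers per clock x bus width; a quick sketch to verify (mine, not the article's):

[code]
# Peak bandwidth = clock rate x transfers per clock x bus width.
# Both PCI and AGP use a 32-bit (4-byte) data path.
BUS_WIDTH = 4

buses = {
    "PCI":    (33.3e6, 1),   # 33 MHz, 1 transfer per clock
    "AGP 1x": (66.6e6, 1),   # 66 MHz, 1 transfer per clock
    "AGP 2x": (66.6e6, 2),   # 66 MHz, 2 transfers per clock
}

for name, (clock, per_clock) in buses.items():
    print(f"{name}: {clock * per_clock * BUS_WIDTH / 1e6:.0f} MB/sec")
# PCI: 133 MB/sec, AGP 1x: 266 MB/sec, AGP 2x: 533 MB/sec
[/code]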

To get around potential timing problems with the data flow, the AGP bus uses separate data lines from the normal motherboard ones. Better known as strobe lines, there are 3 in use when the card operates at AGP 2x speed. Two of them are used for the actual data transfers (in two 16-bit signals, one for high-priority threads and the other for low ones), whilst the third is used for better grounding (which helps to remove signal noise). More importantly, the strobe lines are not generated by the PCI signalling chip; instead they are created by the graphics card and the Northbridge chip in the motherboard. Although these strobes are synchronised with the main AGP bus clock, they are required to be very precise, and this precision has to come from 2 areas: the chips making the signals and the power supply unit providing the voltage for the strobes.

Specification 2.0 - it's 2 so it has to be better, yes?

By the end of 1997, the AGP slot was already becoming king. Every new video card released onto the market was primarily an AGP one (with the exception of the mighty Voodoo 2), but by then the rest of the technology world was progressing very quickly. RAM prices were falling and games were getting a lot more complex too.

Id software's release of Quake II in December of that year showed everyone just precisely where the future was heading - polygons and lots of them too, with very nice textures all over the place. Although it wasn't in response to this, Intel released specification 2.0 for the AGP port in 1998. It promised much - perhaps too much as all video card manufacturers and motherboard makers raised their eyebrows and only delivered small parts of the new instructions. First to use the new information were the video card makers. They released onto the world AGP 4x compliant graphics cards.

The only problem was that there weren't any motherboards that provided AGP 4x speed, with the exception of Intel, whose 820 chipset was the first to offer it. But that got released late too....*sigh*....

[Image: Intel's 820 chipset - http://www.xtremepcuk.com/articles/agpunc1/820chip.gif]

AGP 4x was more than just a speed increase - it was a distinct hardware change over the last specification. The AGP data strobe lines had their clock speeds doubled to get the increase, so within one cycle of the AGP bus clock there are 4 cycles in the data strobes. To cope with the even tighter precision required, Intel popped in another 3 strobes too.

These 3 new lines work in parallel with the other 3, but they alternate 180º out-of-phase. In terms of clock cycles, it means that the two sets of data lines cross over each other during a single clock cycle. Since this happens 4 times within one AGP bus cycle, each cross-over phase marks the moment at which data is sent or received. Northbridge chips at that time weren't as complex as graphics chips or CPUs, so to expect them to suddenly cope with a data rate of 266MHz (when the next nearest is 100MHz for the memory bus) was a little naive. In addition to the extra strobe lines, the new specification required a change in the signal voltage. For normal use, the PCI and AGP slots use a 3.3 volt line for signalling. However, at AGP 4x speed, the strobes operate at 1.5 volts; the reason for this is the difficulty of switching a large voltage at high frequencies. Unfortunately, this change alone can cause problems in signal stability. On the good side, the increase in effective bus speed raised the bandwidth to 1066MB/sec - on a par with all but the fastest system memory speeds.
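
The same arithmetic, extended to 4x, shows both the headline bandwidth and the strobe rate the Northbridge has to keep in lockstep (again my own quick sketch):

[code]
# AGP 4x: the 66 MHz base clock now carries four transfers per cycle.
base_clock = 66.6e6    # AGP bus clock in Hz
bus_width = 4          # bytes per transfer (32-bit bus)

strobe_rate = base_clock * 4                 # effective data rate
bandwidth = strobe_rate * bus_width

print(f"strobe data rate: {strobe_rate / 1e6:.0f} MHz")   # ~266 MHz
print(f"AGP 4x peak:      {bandwidth / 1e6:.0f} MB/sec")  # ~1066 MB/sec
[/code]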

Also sprinkled into this new specification was another term - fast writes. The idea behind this is to allow the processor (or another device in the PC) to write data to the video card's memory without having to be cached through the main system RAM first. This removes a potential bottleneck where the system RAM could bung up the performance. The data from the CPU is written directly to the video memory, but there are restrictions: namely, it can only operate at 2x or 4x speeds.



All in all, everything was looking very rosy for the AGP slot. NVIDIA had their TNT2 Ultra chip out by the middle of 1999, and several manufacturers were using this chip along with 4x speed. What could possibly go wrong?

The GeForce 256 - blame it for everything?

History is littered with geniuses pointing out flaws in other people's work. Einstein showed Newton the real way with gravity. Freud with every mother/son relationship on the planet. Quayle teaching the youth of America how to spell....maybe not. Oh, and NVIDIA versus the rest of the world.

Well ok, not quite the entire world - just the motherboard and power supply unit manufacturers. When their GeForce 256 graphics chip came out in 1999, NVIDIA forced the computer world to sit up and take notice of the fact that far too many PCs weren't ready yet for the full deal, as spelt out in the second AGP specification.

They were helped by an ugly duckling that had all grown up - AMD. They'd released their Athlon chip and 750 chipset at around the same time that the GeForce chip was let loose. Within a few weeks, people had noticed that despite the fact that both the 750 chipset and the GeForce card met the first AGP specification with ease, the video card would cause such a system to lock up when the AGP port was set to 2x speed.

The official word from both companies was that the GeForce was just particularly sensitive to "line noise" in the data transfer strobes. AMD admitted that the Northbridge chip in their 750 chipset was "slightly" less than ideal concerning electrical noise, although they did seem to fix it with later revisions of the chip, as not all 750 motherboards have this problem.

Something else seemed to be amiss with the NVIDIA chip though. By default, it had the SBA disabled (although some manufacturers offered an easy way to reactivate it). Why would this be so? Although many of the early problems concerning stability could be attributed to immature drivers, there remains the fact that specific hardware configurations caused real problems. Just what precisely is it about AGP 2x and 4x, SBA and Fast Writes that produces the instabilities that computer enthusiasts regularly complain about? Well, it primarily revolves around the issue of tolerance.

Any device that uses an electrical signal to transfer and receive data expects the voltage that drives the strobe line to be within a fixed range of values. For example, all AGP devices that use 3.3v signalling also expect the Vref line (the reference voltage against which signal levels are judged) to be within the range of 1.287 volts to 1.353 volts - a gap of 0.066v between the two. At AGP 4x speed, the data strobes use 1.5v lines and require the Vref to stay within 0.72 to 0.78v - a gap of 0.06 volts. Now these figures may seem very small, and indeed they are at 60 millivolts, but in the world of high-speed microelectronics it is a chasm to be crossed.
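
Working those windows out from the article's own numbers (the Vddq ratios below are computed from them, not quoted from the spec):

[code]
# The two Vref windows quoted above, relative to the signalling voltage.
windows = [
    ("3.3v", 3.3, 1.287, 1.353),   # signalling, Vref low, Vref high
    ("1.5v", 1.5, 0.72, 0.78),
]

for name, vddq, lo, hi in windows:
    mid = (lo + hi) / 2
    print(f"{name}: Vref {mid:.2f} V ({mid / vddq:.2f}x Vddq), "
          f"window {(hi - lo) * 1000:.0f} mV")
# 3.3v: Vref 1.32 V (0.40x Vddq), window 66 mV
# 1.5v: Vref 0.75 V (0.50x Vddq), window 60 mV
[/code]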

Which is why voltages rule in computers. They determine exactly when and how data will be moved around. If it all boils down to timings (and the features within AGP specification 2.0 need exceptionally good voltage timings), somebody has to set the standard of build quality and tolerance. So who sorts this out then? The graphics chip designers, the makers of motherboards or the people who churn out the power supply units? Or are none of these people to blame? Are we just expecting too much from PC suppliers?

Quality counts but does it have to cost too?

Shortly after AMD released the 750 slabs of silicon, VIA came up with an alternative that fully met the latest AGP specifications: the Apollo 133 chipset.

Used in both Intel and AMD guises, it was another alternative for motherboard makers to use in their products. The Northbridge chip in some of the Apollo chipsets offered the full AGP 2.0 deal - SBA, 4x speed and Fast Writes. VIA promised that it would be a superior solution to the AMD-750 chipset and at least equal to anything produced by Intel. Don't you just love marketing hype!

However, for many people it simply wasn't what they expected. What was more of a concern, though, was that it worked just fine for others. How was it that one set-up could operate at 4x speed with SBA or FW, and another seemingly identical one could not? How much variation is there in the quality of a motherboard or graphics card? The answer may be hidden from the average user, but it's blatantly obvious to the overclocker.

A trip to MadOnion to look at the submitted scores of 3DMark 2001 will show you that no two graphics cards are the same - every card is different and produces surprising variations in the limit to overclocking. The same applies to CPUs. The likes of Intel and AMD do not produce 10 different speeds of chip. Instead, one silicon wafer produces dozens of the same design. Each chip that is taken from the wafer is temperature and voltage tested for a given clock speed. Those that make it are labelled accordingly. Some will only meet the targets at lower clock speeds and so are set, badged and sold at this slower speed.

So if there are problems with production variances, why can't the makers remove the "failing" chips and not use them? One word: profit. Intel and others don't waste much money these days - if a chip runs slow, then it is simply sold as a slow chip. If a Northbridge chip runs with a poor level of tolerance though, it will still be used provided it is not out of specification. Should the market suddenly demand that manufacturers veto every single component that goes into their products, then the problem of production variation will disappear, but until then we are forced to accept that not all products in the world of computers are equal. So does this let the motherboard and graphics card manufacturers off the hook then?

Well, there is the question of design. ATi's Radeon cards seem to have the odd concern with AGP 4x speed as well, but SBA is activated by default and works just dandy. Matrox's G450 video card operates rather nicely at the higher AGP speed, with none of the major worries that seem to afflict the entire GeForce range. For the future though, I think it is safe to assume that the 3 main graphics card designers will ensure that their products can use all of the features offered by the AGP port, but without making such a song and dance about it. Who else, then, can we point the finger of blame at?

Power Prevents Problems!

If you cannot control the quality of the chips in your motherboard, then at least you have the option of dealing with the other potential source of poor tolerances - the power supply unit (PSU for short). This is the device that takes the 110/240v mains supply from your wall socket and steps it down to fixed voltages of 12v, 5v and 3.3 volts. From these voltage lines, the motherboard then has to produce all the strobe lines and current supplies for every device in it. The tolerances have to be good, but not especially so - something like the core voltage to the CPU must not change by more than around 5% (e.g. a vcore of 1.8v should stay within 1.71 to 1.89 volts). 5% sounds quite reasonable, doesn't it?

The manufacturers of PSUs are notoriously shy about publishing realistic figures for the tolerances on any given voltage line. Enermax do, though - they give values of around 4 to 5% tolerance, with ripple voltages (small variations) of around 0.05v for the 3.3v line. The tolerance sounds about right, and a variation of 50 millivolts sounds great. So what could be wrong with these low figures?

Go back to the Vref lines. AGP 4x speed uses 1.5v lines - a 5% tolerance in that would give a variance of 0.075 volts. The Vref is supposed to allow a minimum of 0.72v and a maximum of 0.78v, but with the 5% tolerance in the main line itself, the Vref could actually vary from 0.68v to 0.82v. AGP 4x is sensitive to such changes, so it's not surprising that such a small change in one place can have considerable consequences elsewhere.
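
That worked example as a two-line check (same numbers as above, rounded the way the article rounds them):

[code]
# A 5% tolerance on the 1.5v strobe line, applied to the 0.75 V Vref.
supply, tolerance = 1.5, 0.05
swing = supply * tolerance               # 0.075 V of possible drift
vref = 0.75

print("spec window: 0.72 .. 0.78 V")
print(f"drift range: {vref - swing:.3f} .. {vref + swing:.3f} V")
# 0.675 .. 0.825 V -- the article's rounded 0.68 .. 0.82, comfortably
# outside the allowed window.
[/code]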

In fact, most of the problems with AGP 4x stem from the fact that it uses two 1.5v strobe lines to transfer data. Since the signals are not as "big" as those on a 3.3v signalling line, signal integrity needs to be more closely controlled. A decent power supply unit will always help out here, and at the end of the day you get what you pay for. However, you don't have to pay a lot for decent equipment. Both overclockers.co.uk and theoverclockingstore.co.uk sell top-brand power supply units that don't cost a fortune. You can spend lots though, if you so wish; I once coughed up £140 for a 550W Enermax, although my problem wasn't stability - I had a squeaking Radeon (don't ask).

This is not to say that all of the problems with AGP stability lie with the power supply unit; it is just one part of a more complex overall picture. However, it is the only part that you have any real control over, so in the quest for that holy grail of The-Workingeth-AGP, help yourself out by swapping the generic supply that almost certainly resides in your PC right now for a higher quality one.

You might ask, though, "Well, why don't the motherboard designers build something into their products to smooth out the power supply?" It's already been done - the current spread of Athlon/Duron motherboards bristles with smoothing capacitors, there to reduce fluctuations to an absolute minimum. But they can only do so much, and too many of them affect high-speed signalling anyway, which is precisely what you don't want on an AGP port.

Intel has already released a draft specification about what the next generation AGP bus will entail - AGP 3.0. In their words, it's being done because certain games/apps are hitting the port bandwidth all the time. Future games will be pushing huge amounts of data to the graphics card and the bus will be struggling to cope under the current specification.

Intel plans to have the data strobe lines' clock frequencies doubled to 533MHz, and they're also removing 3.3v signalling, so users will have to put up with 1.5v signalling. In addition to this, 0.8v signalling will also be available. That means that all the other manufacturers - video card, chipset, motherboard and PSU makers - will have to really tighten up their tolerances. Failure to do so will see another round of unstable setups with motherboards and video cards meeting this new specification - and we've all had enough of the current bout, thank you very much!
 

Antos

Member
Apr 12, 2000
OK, I stumbled upon this thread late.

Try this before giving up. (And btw, I know why UT works and Q3 doesn't... because UT doesn't use AGP features, IIRC.)

Use Zagptool (?) or PowerStrip to force your AGP to 1x. Then try some games.

Do they work now?

Then check if sideband addressing is enabled. If it is, disable it in PowerStrip.

You can also use RivaTuner to force SBA off, but you may or may not be able to set AGP 1x in the registry with RivaTuner under the new Detonators... they seem to ignore the registry key for the AGP rate, at least on my system, where the 12.90 drivers were perfect.

If your Quake3 is working now, try enabling SBA again, but keep AGP at 1x.
Working?