KyroII and no T&L. What does that really mean?


uknemesis

Senior member
Jun 18, 2001
384
0
0
People keep saying that the Kyro 2 will be limited by not having a T&L GPU. I'm pretty sure I read somewhere (yes, it's one of those) that with a fast CPU, disabling T&L on the video card can increase performance, because a top-of-the-range CPU is a hell of a lot faster than any GPU?

Have any of you with a top-of-the-range CPU tried disabling the GPU's T&L and using your CPU to process T&L instead? What impact did it have?

Nemesis
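
For anyone wondering what handing T&L to the CPU actually involves, here is a minimal, purely illustrative C++ sketch of the transform step; the type and function names are invented for the example and are not taken from DirectX, OpenGL, or any real driver.

// Rough shape of the per-vertex work that "software T&L" pushes onto the CPU.
#include <cstddef>

struct Vec4    { float x, y, z, w; };
struct Matrix4 { float m[4][4]; };   // combined world/view/projection, row-major

// v' = M * v : 16 multiplies and 12 adds per vertex
Vec4 transform(const Matrix4& M, const Vec4& v) {
    Vec4 r;
    r.x = M.m[0][0]*v.x + M.m[0][1]*v.y + M.m[0][2]*v.z + M.m[0][3]*v.w;
    r.y = M.m[1][0]*v.x + M.m[1][1]*v.y + M.m[1][2]*v.z + M.m[1][3]*v.w;
    r.z = M.m[2][0]*v.x + M.m[2][1]*v.y + M.m[2][2]*v.z + M.m[2][3]*v.w;
    r.w = M.m[3][0]*v.x + M.m[3][1]*v.y + M.m[3][2]*v.z + M.m[3][3]*v.w;
    return r;
}

// Every vertex in the scene goes through this, every frame. A hardware T&L
// unit absorbs this loop; otherwise the CPU spends its own cycles on it.
void transform_all(const Matrix4& M, const Vec4* in, Vec4* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = transform(M, in[i]);
}

Whether doing this on the CPU is a win depends on how many spare cycles the CPU has left after game logic, physics, and everything else.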
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< People keep saying that the Kyro 2 will be limited by not having a T&L GPU. I'm pretty sure I read somewhere (yes, it's one of those) that with a fast CPU, disabling T&L on the video card can increase performance, because a top-of-the-range CPU is a hell of a lot faster than any GPU? >>



Read this
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"It's mainly only the RAM that's the issue. If you overclock an MX's RAM, you see a boost in T&L situations."

Well, the MX is more pixel-bandwidth limited than it is transform limited, so overclocking the RAM really gains performance by allowing greater pixel throughput.

However, this is again part of the point I'm making: a higher-bandwidth MX is not available; if it were, it would need more expensive RAM, which would push the price up. Does the cost justify the gain? Evidently not, or that is what NVIDIA would have done.

"The retail pricing of the Kyro2 is around ~$150, right? That's about the same as the GTS."

Well, retail pricing of 32MB KYROIIs is less than that; that's what I was talking about with UK prices: they are pegged against the MX rather than the GTS. Manufacturers elsewhere are using the KYROII's performance advantage over its competitors to price it higher (at GTS level) and take better margins on it (unfortunately for us).
 

saintp

Junior Member
Jun 26, 2001
1
0
0


<< PotNoodle:

The retail pricing of the Kyro2 is around ~$150, right? That's about the same as the GTS.
>>




Actually, the retail price of the Hercules 3D Prophet 4500 is well under $150. It can easily be found for $110-120 (I got mine for $118). Not to mention that the prices of KyroII cards from other manufacturers are even lower. So, my point is that the KyroII is a good bargain considering the excellent performance of the card, and it will deal with future games quite well, at least for the next 6-8 months. The architecture of the chip is just enough to compensate for the lack of hardware T&L for now. Then, if all goes well, we'll have the Kyro III.
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
For those too lazy to read the excellent article that Mingon has kindly linked to, I'll post a couple of statements from it (in short: hardware T&L rules, and will continue to whoop ass on software T&L when T&L is left up to the CPU to handle):


<< Gary was right on both counts. Multitexturing, virtually unheard of until Quake 1, is now standard in all games and video cards. And T&L is well on its way there. >>



<< Here's the lesson to take away from this: no matter how fast you think your CPU is, dedicated hardware will always be faster at the same task. This principle applies to T&L just as much as it does to basic 3D rendering, which is what I'll show you next. >>



<< How about pitting the 800MHz Athlon using GeForce hardware T&L, against the 1.4GHz Athlon using software T&L? ... The other two games achieve a level of performance parity that's so close, it's almost eerie >>



As to pricing, now that the GF3 has been released, GTS cards can be had for the same price as Kyro2 cards, or less, and many pay as little as $75 for new GF2 "GTS" cards. The same place that sells OEM Kyro2 cards for $110 (Newegg.com) sells a full retail GF2 GTS card with TV-out for $118, and was selling 64MB GF2 Pro cards for $125 shipped! Check out the Hot Deals forum here at AnandTech regularly for deals like these.
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"For those too lazy to read the excellent article that Mingon has kindly linked to, I'll post a couple of statements from it (in short: hardware T&L rules, and will continue to whoop ass on software T&L when T&L is left up to the CPU to handle)"

That's your interpretation, not mine. I was actually startled at how close newer CPUs are, especially considering the cost of the likes of a 1.4GHz Athlon. Granted, the T&L units have got faster in the GTS, but the traditional hardware T&L unit has remained unchanged in its functions from GF256 to GF3.

As for "a whole can of whoop ass", I assume you are overlooking the multiple-lights cases, which open up a "whole can of suckage".

As for the GF3, it contains both a traditional T&L unit and the "vertex shader", which is what NVIDIA wants developers to use. The reason 3D pipelines are so much faster than general-purpose CPUs is that they are "hardcoded" to do the operations they need (as Gary's explanation at the start shows). Vertex shaders are useful because they are "programmable", which breaks some of those hardcoded operations and in essence moves back towards a more CPU-like way of working. For that reason they are slower in raw polygon throughput, which lets CPUs catch back up (vertex shaders can be implemented at full speed in software, and this is what DX8 does).

"As to pricing, now that the GF3 has been released, GTS cards can be had for the same price as Kyro2 cards, or less, and many pay as little as $75 for new GF2 "GTS" cards."

Prices as low as these generally tend to imply their supply is greater than the demand...
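
On the "multiple lights" point above, here is a minimal illustrative C++ sketch (not any vendor's actual pipeline) of per-vertex diffuse lighting; it simply shows why the per-vertex cost grows roughly linearly with the number of lights, whichever processor runs it.

#include <algorithm>
#include <cstddef>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x*b.x + a.y*b.y + a.z*b.z;
}

// Diffuse term for a single vertex, summed over all directional lights.
// The inner loop runs once per light, so doubling the light count roughly
// doubles the per-vertex lighting cost -- in hardware or in software.
float diffuse(const Vec3& normal,
              const Vec3* light_dir, const float* light_intensity,
              std::size_t num_lights) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < num_lights; ++i)
        sum += std::max(0.0f, dot(normal, light_dir[i])) * light_intensity[i];
    return sum;
}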
 

Teasy

Senior member
Oct 4, 2000
589
0
0
That HW T&L vs SW T&L article was a bit dodgy. The 3DMark2001 high-poly tests were totally messed up: 7 Mtriangles/s looks about right for an 800MHz Athlon using SW T&L, but 6.8 Mtriangles/s with a 1.4GHz Athlon using SW T&L???? That's obviously very wrong.
 

matheusber

Senior member
Jun 12, 2001
380
5
81
Since you are talking about the K2, I have a question...

What is this that I saw at aceshardware.com:

AGP Interface 1-2x DIME

This is for the IMG Tech Kyro II...
Is it like the Voodoo5 not being AGP 4x?

Well, I have a V3 and I know it doesn't make use of those AGP features that would make it faster. In 3DMark, a 16MB texture on it gets about 2 fps, while an older Viper 550 (TNT1), also with 16MB, gets about 40!!!
The TNT2 M64 does the same at 32MB, so NVIDIA's chips are good at the limit and 3dfx's aren't, and I put this down to that AGP difference. (Whenever the memory isn't the limit, all the 3dfx chips rock the others with at least double the framerate... :)) )

Now I see this on the K2, so... why move from one card without the AGP goodies to another? Or does this 1-2x DIME perform the same as 4x?
As a Voodoo fan I was planning on a V5, but since the K2 isn't NVIDIA it could be a deal... but neither the V5 nor the K2 have this AGP stuff, so why the K2???

So... is the K2 just AGP 2x, like my V3?

matheus
 

ThExorcist

Member
Jun 22, 2001
53
0
0
Just thought I would give all Voodoo owners and any future owners a heads-up about this, since the Voodoo 5 was brought up....
Bad news

The Voodoo 5 may be an inexpensive card to get now, but it may not be worth it, since Microsoft cut support thanks to Nvidia.
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
What the heck are you talking about???
The guy says "My guest would be nVidia woudln't give MS the source code for the 3dfx cards". Aside from that being his best guess, he can't even figure out what a spell checker is.

For your information, MS has been abandoning 3dfx since long ago, even before 3dfx announced that they were closing up shop. A good example of this is the MS Links 2001 golf game. One of the most popular 3D card lines, the V3 series (and V2 series), would not work with this game, as it required a card that supported 32-bit color for hardware acceleration. I think 3dfx saw the end coming for a long time, and found a friend in nVidia, which would give them some cash and hire some of the employees. Why must you bash nVidia every chance you get??
 

PotNoodle

Senior member
May 18, 2001
450
0
0
"I think 3dfx saw the end coming for a long time, and found a friend in nVidia, which would give them some cash and hire some of the employees"

'Friends'???

I think the last thing Scott Sellers et al. wanted to do was to sell to NVIDIA; they had no option in the end, though. What they wanted was some way of getting their technology out of the door, which meant a sale to another company. S3/VIA was the most likely candidate, right up until the day before the announcement; unfortunately they just couldn't afford it.
 

pidge

Banned
Oct 10, 1999
1,519
0
0


<< Just thought I would give all Voodoo owners and any future owners a heads-up about this, since the Voodoo 5 was brought up....
Bad news

The Voodoo 5 may be an inexpensive card to get now, but it may not be worth it, since Microsoft cut support thanks to Nvidia.
>>



I like the KyroII. I really do. I am just getting sick and tired of you "KyroII rules/die NVIDIA die" morons who keep looking for things to bash NVIDIA over the head for, even if they are just rumours or baseless attacks. Do you know the process for WHQL certification and for actually getting a driver included in a Microsoft OS release? It is a painstaking process. This would have been the responsibility of 3dfx and Microsoft, not NVIDIA. In the end, Microsoft was probably not happy with the results and didn't care much for Voodoo, since 3dfx's Glide competed with their Direct3D, so they canned XP support for the Voodoo cards. I don't think Microsoft would have gone through all the trouble of developing Windows XP drivers themselves for Voodoo products. That is not Microsoft's job in the first place. Personally, if I had a Voodoo card, I wouldn't care much because I won't be upgrading to XP anytime soon. If you have a KyroII board, enjoy it while it still performs well. Stop looking for ways to bash NVIDIA. Let NVIDIA owners enjoy their video cards too. This forum is getting lame lately with all these KyroII owners trying to find ways to screw over NVIDIA. Get a life. :|
 

Teasy

Senior member
Oct 4, 2000
589
0
0
It's certainly not their job to make Voodoo drivers, but let's look at this for a moment. There's a huge number of people out there with ageing Voodoo cards that are all either out of date or soon will be. All of these people are going to be moving on to other cards soon. A lot of those people will hate Nvidia, rightly or wrongly, so their next card will be the best card that's not from Nvidia. Yet if Nvidia had simply tried to be nice to these people and supported Voodoo products driver-wise, even in a very small way, they'd win over a lot of them, which means that when they finally upgrade they'd consider Nvidia and not just ATI and IMGTEC. They had a very good opportunity to hurt IMGTEC and ATI and help themselves, but they missed it. I'm not bashing Nvidia here or saying they owed Voodoo customers driver support; I'm just saying they could have upped their popularity tenfold if they'd supported Voodoo customers and got a good share of the current Voodoo user base, most of whom will be looking for new cards soon.
 

ThExorcist

Member
Jun 22, 2001
53
0
0
I am not bashing Nvidia or saying Nvidia is evil. If those of you who are upset would calm down and just read the posts (which I found on nV News, by the way), it states that Microsoft wanted the source code for the Voodoo series drivers to qualify them for XP. Nvidia said they wouldn't release the code. So Microsoft is cutting support for the Voodoo series out of XP.

What I was trying to say is that 3dfx is no more. There is no threat to Nvidia in releasing the code; it is for support purposes only. S3 got bought out by VIA, and they don't have a problem releasing driver code for S3 cards to Microsoft. It is only for operating system support. Nothing more...

Now all the Voodoo owners out there will be forced to upgrade their cards if they want to use XP with guaranteed support. I am sure there are many people who still have old cards in their systems, excluding the majority of people on these forums.

(But we only represent a very, very, very small portion of computer users compared to the casual users; otherwise your GF2s and Radeons would be getting fully exploited.)

No bashing was initially implied, Pidge.

Thanks Teasy for pointing out another perspective.
 

pidge

Banned
Oct 10, 1999
1,519
0
0
Teasy,
Do you know how big NVIDIA is worldwide? They are about 1000 employees. That includes their R&D, driver development, QA, sales, marketing, legal, HR, everything. Do you expect NVIDIA to hire new people just to cover the resources required to develop updated drivers for 3dfx products? In the past, NVIDIA has been able to get away with minimal staff because of their unified drivers. 3dfx did not have unified drivers, so developing updated drivers for those products would require more resources. It would have been nice, but in the real world it is really unrealistic. Not trying to bash your post. I am just stating that it is not as easy as it sounds.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< There's a huge amount of people out there with ageing Voodoo cards that are all either out or soon to be out of date >>



Teasy, sorry, but if there were that many people out there, 3dfx wouldn't have gone out of business.
 

Teasy

Senior member
Oct 4, 2000
589
0
0
<<<Teasy, sorry, but if there were that many people out there, 3dfx wouldn't have gone out of business.>>>

There's a massive number of people out there with 3dfx cards, from Voodoo1 to Voodoo5. Yes, that's right, some people do still have a Voodoo1; for instance, I know two people with Voodoo1s and three with Voodoo2s (and these aren't people I know over the net either, but my friends and family), and all are looking for new cards. Granted, not many will still have a Voodoo1 or 2, but there are loads of people with Voodoo3s and quite a few with Voodoo5s. I'm not saying there are as many Voodoo owners as GeForce owners, but there doesn't have to be. The fact is, thousands of people out there have Voodoo cards and are looking to upgrade either now or very soon, and a lot of them hate Nvidia. You only have to look around a lot of the 3dfx fan sites to see that many of them are moving on to ATI and IMGTEC and won't even consider Nvidia, because in their eyes Nvidia killed their favourite graphics card company.
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
Unlike many buyouts, such as S3 by VIA, nVidia did NOT buy out 3dfx.
They bought some stuff, but not the company, and definitely NOT the drivers or support.
Don't know how many times this needs to be explained. Even 3dfx has stated this.
From the beginning, 3dfx has said that 3dfx, and NOT nVidia, would be responsible for drivers. They also said that they would stop everything at a later date. How on earth would any company, especially Microsoft, want to support a product that its own manufacturer gave up on???
 

jbirney

Member
Jul 24, 2000
188
0
0
Rob,

Not true. From voodsource:


--------------------------------------------------------------------------------

WinXP 3dfx Support
Posted June 26, 2001 @ 14:57 by Wipeout


I was cruising around the forums over at NT Compatible and noticed a disturbing but not surprising post.


WindowsXP RC1 will not be getting any Voodoo3/4/5 drivers; they have been pulled from WindowsXP build 2499.
Please remember that this is not really Nvidia's doing. They only bought the core assets of 3dfx; they did not buy any of their current product line or support for it. As a matter of fact, I think they even passed on the remaining inventory. I guess M$ couldn't fix whatever bugs were in there and is planning on ganking the whole thing. Bummer.....

Update: Big wet one goes out to Craig Lisowski for firing me an excerpt from an email with one of the developers from Metabyte. I will let it speak for itself.
I don't suppose there's any chance of Wicked3D producing a full set of drivers [D3D too] for 3dfx cards?

Craig,

Probably not. nVidia has the rights to most of 3dfx's intellectual property now, including the DirectX drivers' source code.

Your Wicked3D product has been shipped.

Sincerely,

Jerry Betti,

Quality Assurance Specialist,

Metabyte, Inc



It would be no problem for them to give MS the source, as they have had to do with their own GF/TNT cards. They just don't want to do it. I remember reading a press release from one of the PR reps saying that they will not be supporting the legacy products and that all Voodoo users should no longer be using legacy products.

i.e. they want more of your money...
 

pidge

Banned
Oct 10, 1999
1,519
0
0
Microsoft is really not into producing drivers for any company. They take drivers from companies that have passed their WHQL tests and requirements and include them in their cat files. To ask Microsoft to develop Windows XP drivers for Voodoo cards is a real stretch, since they don't do this for any other company. They may do the testing, but developing the drivers is a different story. Microsoft probably saw too many problems with the Voodoo drivers and decided to can them from the installation disk. Remember, this doesn't mean you can't just use the Windows 2000 drivers for the Voodoo cards anyway. I am sure the drivers included in previous XP releases were not that much different.

Microsoft and NVIDIA have a pretty good relationship. I am sure if Microsoft really wanted to provide basic driver support for Voodoo products, NVIDIA would provide it much like they did for Sun for TNT2 support under Solaris. I just think Microsoft didn't want to bother with Voodoo drivers. If Microsoft won't bother with providing USB 2.0 support, why do you think they would bother with Voodoo support? :eek:
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
I guess some just hate to read.
From the screwed-up posts at NT Compatible:

<< Cost is not the issue; 3dfx had signed NDAs with several other companies for parts of the 3dfx driver source code. 3dfx did not own all of the copyrights to the code. NVIDIA cannot release the source without permission from the other companies. That is why the source was, and never will be, available to ANYONE. This was explained to me by a 3dfx driver developer a while back.

NVIDIA owns the source but not all the copyrights to the code in it. If they were to "pass the code" they would be guilty of copyright infringement and end up in "legal hot water".
>>

 

jbirney

Member
Jul 24, 2000
188
0
0
<<NVIDIA owns the source but not all the copyrights to the code in it. If they were to "pass the code" they would be guilty of copyright infringement and end up in "legal hot water". >>

From my sources:


This is not true. nVidia holds all the cards and could give the source to MS if they wanted to...
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Does T&L Help?

OK, here is a better situation to test it. Pit a GeForce 2 MX card against a Kyro II in a T&L-enabled game. Run them at 1600x1200. Who wins? I think the answer is obvious. Yes, hardware T&L might be a player in the future, but at the moment it is for the most part a non-factor. The only reason anyone bought a hardware T&L card was for future implementation. I have yet to see a worthwhile implementation of T&L, and how long has it been since the first hardware T&L card was released?

It would be nice if developers had games out running faster and looking better because your video card has hardware T&L, but they don't. MDK2 is the only one I can think of, and I turn it off and on with my Radeon and honestly don't notice any difference.

This is the main problem I have always had with Nvidia. They charge you for features that you never use. Yes, it is also true that if no one implements them then no software developer will use them. But tell me, all you who purchased a GeForce 1 card: how does it feel to be Nvidia's guinea pig? At least other companies are implementing features that can be used: FSAA, TruForm, tile-based rendering, etc.
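
To make the 1600x1200 scenario above concrete, here is a rough back-of-envelope sketch in C++; every constant in it (overdraw, frame rate, triangle count, the MX's fill rate) is an illustrative assumption for a 2001-era game, not measured benchmark data.

#include <cstdio>

int main() {
    // Illustrative assumptions only -- not measured figures.
    const double width          = 1600, height = 1200;
    const double overdraw       = 3.0;     // assumed average depth complexity
    const double target_fps     = 60.0;
    const double tris_per_frame = 20000;   // assumed scene load for a 2001 game

    const double pixels_per_sec = width * height * overdraw * target_fps;
    const double tris_per_sec   = tris_per_frame * target_fps;

    std::printf("fill rate needed: %.0f Mpixels/s\n", pixels_per_sec / 1e6); // ~346
    std::printf("triangle rate   : %.1f Mtris/s\n",   tris_per_sec / 1e6);   // ~1.2
    // The pixel demand is in the same ballpark as a GeForce2 MX's quoted fill
    // rate, while the triangle demand is far below what either a fast CPU
    // (software T&L) or a hardware T&L unit can push -- so at 1600x1200 the
    // bottleneck tends to be the rasteriser and memory, not where T&L runs.
    return 0;
}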
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
<<"Pit a GeForce 2 MX card against a Kyro II in a T&L-enabled game. Run them at 1600x1200. Who wins? I think the answer is obvious.">> Not at all. The MX would win in low- to mid-range systems, and the Kyro2 should win in high-end systems.

When you do this, make sure you use a level playing field, such as a low-end CPU of around 600MHz, so that you can see what the "VIDEO CARDS" do, not the CPU. The whole theory of the Kyro2 is that it needs a fast CPU to handle the T&L. This is so simple it is funny. With a high-end CPU doing the T&L, you are using those extra CPU cycles, and apparently a lot of them, which is why you need a high-end CPU in the first place. On the other hand, if your video card can handle T&L, then the CPU is freed up to use all its cycles for other things.

BTW, why use an MX vs. a Kyro2 when they are in different classes? The MX is bargain-basement priced, while the Kyro2 and GF2 GTS are midrange priced. Why not use a GF2 GTS for this comparison, as it is in the same price ballpark as the Kyro2, and is actually a little cheaper (under $99 US)?

<<"The only reason anyone bought a hardware T&L card was for future implementation. I have yet to see a worthwhile implementation of T&L, and how long has it been since the first hardware T&L card was released?">>

Huh? When I bought my GF2, I couldn't have cared less about T&L. I'm sure no one bought the card for that reason. It was simply the fastest in its price range, with the best all-around support, when I bought mine. T&L was a bonus whose benefit I didn't realize until recently.

<<"It would be nice if developers had games out running faster and looking better because your video card has hardware T&L, but they don't. MDK2 is the only one I can think of, and I turn it off and on with my Radeon and honestly don't notice any difference.">>

T&L not used? Near the top of the gaming charts, Nascar Racing 4 is one of those new DX8 games that use T&L, and it makes a huge difference. Don't look to the old DX7 games for T&L, but instead to the newer games designed for DX8 that incorporate T&L. DX8 may well decide all of this: Kyro2 owners without hardware T&L keep saying how great the card will do with the next version of DirectX, and T&L users know that as more DX8 games appear, so will more games that feature T&L. Don't forget that Microsoft and nVidia have an excellent working relationship, and sort of depend on each other for certain things. This could easily bleed over into DX8 features that would give nVidia an edge over the other cards.

One last little point about driver updates. The Radeon and V5 were faster than GF2 GTS cards. Then nVidia released the Detonator 3 drivers and changed all that by becoming the fastest. Do you think they are done? Hardly. Proof of this is the latest 12.90 DX8 drivers, which gave me a good 10%-15% performance increase over the 12.60 drivers, which gave me an increase over the previous ones, and so on. All this without compromising visuals or stability. They will not just lie there and be overtaken, which is one reason why many of us nVidia owners feel as though we made the right choice, and are in great hands.