Tick Tock goes the graphics clock!

Rubycon

Madame President
Aug 10, 2005
Do GPUs (particularly Nvidia) follow the tick tock "rule" of microprocessors?
 

tviceman

Diamond Member
Mar 25, 2008
They try to, but nvidia has slowed down because Fermi is behind schedule. AMD/ATI has done a fairly good job since the 3850 of maintaining a 6-8 month release window for new products.
 

evolucion8

Platinum Member
Jun 17, 2005
You are way wrong on this one.

Speaking strictly on nvidia's last three generations:

7800 GTX ---> 7900 GTX ---> 7950 GX2
8800 GTX ---> 8800 Ultra ---> 9800 GX2
GTX 280 ----> GTX 285 ---> GTX 295

The 7900 GTX uses the same manufacturing process as the 7950 GX2, and the same goes for the 8800 GTX and Ultra versions, so they don't follow Intel's tick-tock model. A tick-tock model would look like this:

Tock: 8800GTX Tick: 9800GTX (We all know that the 9800GTX doesn't deserve the 9-series name)

Tock: HD 2900XT Tick: HD 3870 (The same goes here)

In the end, no GPU vendor follows tick-tock closely. Where does the GTX 260+ fit after the 9800GTX? Both use the same 65nm manufacturing process. And where does the HD 4870 fit after the HD 3870? It also uses the same manufacturing process as the HD 3870.
 

Idontcare

Elite Member
Oct 10, 1999
tick-tock just means the first GPU product on a new node would not be a brand new architecture but rather a shrink/refresh of a pre-existing architecture.

Like GT200b or RV740.

Then a new architecture (if one is going to be created) would debut at a later date on the same node.

I haven't followed GPUs long enough to know if this is what has been going on, but for 40nm it does appear to be the case.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Idontcare said:
tick-tock just means the first GPU product on a new node would not be a brand new architecture but rather a shrink/refresh of a pre-existing architecture.

Like GT200b or RV740.

Then a new architecture (if one is going to be created) would debut at a later date on the same node.

I haven't followed GPUs long enough to know if this is what has been going on, but for 40nm it does appear to be the case.
You can get there, but it's a bit of a stretch when compared to Tick Tock in the Intel sense. RV740 was a low-volume part, so it's not like Intel where there is a long period of producing an old design on a new node.

NVIDIA was even worse on 55nm; they didn't produce any new designs there, everything was GT200 or G9x variants. 40nm did go a bit differently though, since they have the GT210-240 which are mostly rehashed designs, although NVIDIA did add GDDR5, DX10.1, and VP4.
 

DominionSeraph

Diamond Member
Jul 22, 2009
You are way wrong on this one.

Riva 128: .35 micron
TNT: .35 micron
TNT2: .25 micron
GeForce 256: .22 micron
GeForce 2: .18 micron
GeForce 3: .15 micron
GeForce 4: .15 micron
GeForce 5: .13 micron, refresh also on .13 micron
GeForce 6: the 6800s debuted on .13 micron, mainstream parts at .11 micron. I believe the vanilla 6800 transitioned to .11 micron with the PCIe versions.

Hardly anything in there really tick-tocked.
So historically, no.
 

scooterlibby

Senior member
Feb 28, 2009
Wasn't tick-tock a marketing term invented by Intel? Wait, isn't tick-tock a term referring to the passing of seconds? No no no, tick-tock means different amounts of microns, right?
 

Asianman

Junior Member
Mar 29, 2001
Actually, AMD has followed tick-tock amazingly closely. From the original tock, the HD 2900, it ticked in about 7 months to the HD 3800s, which tocked in 8 months to the HD 4800s. The HD 4700s came out in April on the new process, so that's 8-9 months, and the HD 5800s in September with the new architecture. Going forward, I honestly doubt the 5800s will tick again in 8-9 months; if they do, great, and if they don't, that's to be expected.

NV's a little different. The 8800s came out in Nov 2006; the 65nm 8800 GTs in October '07; the GTX 280s in June '08; the 55nm 9800 GTX+ in July '08; 40nm parts earlier this year; and a full tock with Fermi later.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
Historically, no. New products debuted on new processes, with no refreshes.

4870 -> 4890 -> 5870

Tick -> tock -> tick

GT8xxx series -> GT9xxx series -> GT2xx series.

Tick -> tock -> again Tock.
 

Rubycon

Madame President
Aug 10, 2005
Since the 5870 was such a disappointment, I'm anxiously waiting for nV.

EDIT:
Let me update this before all the fanATIcs jump me!

For me it was not good. It was fast (gaming), was quiet even loaded, but I do more than surf and game. I use professional AV content creation and editing software and the hardware just does not like it. Flickering, toolbars shaking, etc. I just could not deal with it so I gave the card back and put my trusty 2GB 285GTX back in. I do like some of the features in video playback however.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
aigomorla said:
4870 -> 4890 -> 5870

Tick -> tock -> tick

GT8xxx series -> GT9xxx series -> GT2xx series.

Tick -> tock -> again Tock.

-No.

Tick -> Tock -> Tick -> Tock looks as follows:

Old Arch/New Process -> New Arch/Old Process -> Old Arch/New Process -> New Arch/Old Process.

This is how Intel does it:

Old Pentium 4/New 65nm process -> New Core arch/Old 65 nm process -> Old Core arch/New 45nm process -> New Nehalem arch/ Old 45nm process etc...

The 4870 reworked the architecture but stayed on the exact same 55nm process as the 3870, and the 4890 used the exact same arch and process as the 4870. The 5870 does modify the arch a bit, and is on a different process. So it's really got nothing to do with Tick/Tock.
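As an idle illustration, the cadence described here can be sketched as a tiny classifier. The function and the release list below are hypothetical, just to make the rule concrete (tick = old arch on a new process, tock = new arch on the old process):

```python
# Hypothetical sketch of the tick-tock rule, not anyone's actual roadmap tool.
# Each release is (name, architecture, process node in nm).

def label_cadence(releases):
    """Label each release after the first as Tick (same arch, new process),
    Tock (new arch, same process), a refresh (neither changed), or
    off-cadence (both changed at once, which breaks tick-tock)."""
    labels = []
    prev_arch, prev_proc = releases[0][1], releases[0][2]
    for name, arch, proc in releases[1:]:
        new_arch, new_proc = arch != prev_arch, proc != prev_proc
        if new_proc and not new_arch:
            labels.append((name, "Tick"))
        elif new_arch and not new_proc:
            labels.append((name, "Tock"))
        elif new_arch and new_proc:
            labels.append((name, "both (off-cadence)"))
        else:
            labels.append((name, "refresh"))
        prev_arch, prev_proc = arch, proc
    return labels

# Intel's sequence as given above:
intel = [
    ("Pentium 4 (90nm)", "NetBurst", 90),
    ("Pentium 4 (65nm)", "NetBurst", 65),  # shrink of old arch -> Tick
    ("Core 2 (65nm)",    "Core",     65),  # new arch, old node -> Tock
    ("Core 2 (45nm)",    "Core",     45),  # Tick
    ("Nehalem (45nm)",   "Nehalem",  45),  # Tock
]
print(label_cadence(intel))
```

Feed the GPU lineups in this thread through the same function and most entries come out "refresh" or "both (off-cadence)", which is the whole point being argued here.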

Nvidia's advancement is just as messed up thanks to their product nomenclature.

GPU's do not use the Tick/Tock strategy as defined by Intel. ATI has recently announced that it plans to adopt the strategy with its GPUs from here on out.
 

Voo

Golden Member
Feb 27, 2009
Rubycon said:
For me it was not good. It was fast (gaming), was quiet even loaded, but I do more than surf and game. I use professional AV content creation and editing software and the hardware just does not like it. Flickering, toolbars shaking, etc. I just could not deal with it so I gave the card back and put my trusty 2GB 285GTX back in. I do like some of the features in video playback however.
Sounds just like the typical driver problems with a new card. You may or may not have better luck next time, whomever you buy a new card from.
 

v8envy

Platinum Member
Sep 7, 2002
Actually, sounds like what happens if you try to use the firegl driver with ATI cards on Linux. Doesn't matter what generation the card is from, professional, OpenGL and desktop eye candy apps definitely exhibit their share of funk.

Rubycon, you weren't trying to use an ATI card on a Linux workstation were you?
 

Rubycon

Madame President
Aug 10, 2005
v8envy said:
Actually, sounds like what happens if you try to use the firegl driver with ATI cards on Linux. Doesn't matter what generation the card is from, professional, OpenGL and desktop eye candy apps definitely exhibit their share of funk.

Rubycon, you weren't trying to use an ATI card on a Linux workstation were you?

Nope. Win7 x64.

Found someone else who had the same issue; they RMA'd the card and it was fixed. :(

So I may have to try a 5970. I miss my 295GTX.
 

cbn

Lifer
Mar 27, 2009
evolucion8 said:
The 7900 GTX uses the same manufacturing process as the 7950 GX2, and the same goes for the 8800 GTX and Ultra versions,

Wasn't the 7800 GTX on a larger process than the 7900 GTX?

I also think G80 was on a larger process than G92?
 

Spike

Diamond Member
Aug 27, 2001
I know I'm not adding anything to the thread but I have to say... this is the first time Rubycon has posted something that I have actually understood. Are you feeling ok Ruby?
 

evolucion8

Platinum Member
Jun 17, 2005
Wasn't 7800 GTX on a larger process than 7900 GTX?

I also think g80 was on a larger process than g92?

The comparisons were within the same families; I never said the 7900 GTX uses the same manufacturing process as the G92 or anything similar. I didn't even mention the G92. I was talking about the G80 as a whole: 8800GTX, Ultra, GTS 320 and GTS 640. The G92 was hardly a revolution, just a die shrink with minor tweaks, the old architecture on a newer manufacturing process.

I never mentioned the 7800 GTX; I mentioned the 7900 GTX and 7950 GX2, both of which are built on the 90nm process.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
Lets see what Nvidia did:

(7800GTX)G70/110nm ->
(7900GTX)G71/90nm ->
(8800GTX)G80/90nm ->
(8800GT/GTS & 9800/GT/GTX)G92/65nm ->
(GTX280)GT200/65nm ->
(GTX285)GT206/55nm & (GTS250)G92/55nm ->
(GT240)GT21x/40nm ->
(Fermi)GF100/40nm...

Nvidia's progression is just as much a mess as ATI's in terms of tick/tock. The G80 -> G92 jump was essentially a die shrink of the same architecture, the GT200 arch has stayed the same across process nodes, and the G92 was die-shrunk again after the new architecture was released.
 

evolucion8

Platinum Member
Jun 17, 2005
True, and if we compare them against their ATi counterparts, it would look like this:

X1800XT/X19x0XT - 90nm - Tock (sort of) - New arch and new process
X1950PRO - 80nm - Tick - Old arch, new process
HD 2900XT - 80nm - Tock - New arch, old process
HD 2600XT - 65nm - New arch, new process
HD 3870 - 55nm - New arch (sort of), new process
HD 4870 - 55nm - New arch, old process
HD 4770 - 40nm - Die shrink, new process
HD 5870 - 40nm - New arch, old process

It is even more of a mess than nVidia's. Counting nVidia's cards since the 6800 series, they used 130nm, 110nm, 90nm, 65nm, 55nm and 40nm; over the same era, starting from the X1800 series, ATi used 130nm, 110nm, 90nm, 80nm, 65nm, 55nm and 40nm, sharing the 130nm, 90nm and 80nm nodes between old and new architectures, while nVidia only shared the 90nm and the 65nm. I wonder why nVidia skipped the 80nm node for its high-end chips.
 

Munky

Diamond Member
Feb 5, 2005
Who the hell cares? Sometimes they do, sometimes not. Nvidia since the fx5800 flop has generally avoided introducing a new architecture on a new process, but that's put them at a disadvantage more than once, so it's likely to change in the future. AMD on the other hand has on several occasions done both, with various degrees of success (r520, rv770)
 

evolucion8

Platinum Member
Jun 17, 2005
Munky said:
Who the hell cares? Sometimes they do, sometimes not. Nvidia since the fx5800 flop has generally avoided introducing a new architecture on a new process, but that's put them at a disadvantage more than once, so it's likely to change in the future. AMD on the other hand has on several occasions done both, with various degrees of success (r520, rv770)

Wut? RV770 uses the same 55nm process as the HD 3870. And if you mean the HD 5800 series, it uses the same 40nm process as the ill-fated HD 4770.