What are Gigaflops?

cremator

Senior member
Sep 21, 2001
643
0
0
Well, I recently bought a GeForce3 Ti-500 Deluxe, and I noticed the box said 78 Gigaflops. I've heard the term before, but only from Jeff K., so it sounded fake. Can someone give me an in-depth explanation of what they are and what they do?
 

MustPost

Golden Member
May 30, 2001
1,923
0
0
I believe a flop is a FLoating point operation. 78 billion of them in one second is a very, very large number.
 

Sohcan

Platinum Member
Oct 10, 1999
2,127
0
0
Flops alone is too arbitrary a number to judge performance by, as is MIPS (millions of instructions per second, which my comp arch professor used to call the Meaningless Indicator of Performance). First of all, with superscalar architectures, companies define their MIPS/Flops numbers differently... usually it's the peak fetch rate * clock speed, but sometimes it's the peak issue or dispatch rate in place of the fetch rate.

Secondly, MIPS/Flops can't adequately judge performance. As an example, consider CPU A and CPU B, both running at 1GHz, executing a program comprising only floating-point square-root operations. CPU A has special hardware to handle square roots, such that a single 10-cycle instruction does one operation. CPU B, on the other hand, has to issue 100 single-cycle instructions to do a single operation. CPU A runs at 100 megaflops and executes 100 million square-root operations per second. CPU B runs at 1000 megaflops (10 times greater than CPU A), yet executes only 10 million square-root operations per second (10 times fewer than CPU A). This is an over-simplified example that ignores a number of factors, but it gives you a good idea why no one seriously uses MIPS/Flops anymore except for marketing.
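To put numbers on it (these are hypothetical CPUs, obviously, but the arithmetic is the point), here's the example above sketched in Python:

```python
# Peak FLOPS vs. useful work for the two hypothetical 1GHz CPUs above.
CLOCK_HZ = 1_000_000_000  # both CPUs run at 1 GHz

# CPU A: one 10-cycle instruction per square root
a_instr_per_sec = CLOCK_HZ / 10     # 100 million instructions/s -> 100 MFLOPS
a_sqrts_per_sec = a_instr_per_sec   # 1 instruction = 1 square root

# CPU B: 100 single-cycle instructions per square root
b_instr_per_sec = CLOCK_HZ          # 1 billion instructions/s -> 1000 MFLOPS
b_sqrts_per_sec = b_instr_per_sec / 100

print(f"CPU A: {a_instr_per_sec / 1e6:.0f} MFLOPS, "
      f"{a_sqrts_per_sec / 1e6:.0f} million sqrts/s")
print(f"CPU B: {b_instr_per_sec / 1e6:.0f} MFLOPS, "
      f"{b_sqrts_per_sec / 1e6:.0f} million sqrts/s")
# CPU B "wins" the MFLOPS race by 10x while doing 10x less useful work.
```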
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Just remind me, is 1 Gigaflop 1,000,000,000 FLOPS or 1,073,741,824 FLOPS?

damn HD makers got the whole thing confusing me :(
 

Superdoopercooper

Golden Member
Jan 15, 2001
1,252
0
0


<< Just remind me, is 1 Gigaflop 1,000,000,000 FLOPS or 1,073,741,824 FLOPS?

damn HD makers got the whole thing confusing me :(
>>




I think the only time the 1000 = 1024 thing applies is to MEMORY/HDD space.... you know... 1MB actually = 1,000,000 * 1.024 bytes, or whatever that conversion is. With speed (1MHz = 1,000,000 Hz) and FLOPS, it's a straight number with no multiplying factors.

Yeh... the memory thing is totally dumb. Not sure why that ever happened.
 

Killbat

Diamond Member
Jan 9, 2000
6,641
1
0
"Yeh... the memory thing is totally dumb. Not sure why that ever happened."

What, you mean the number 1024? It makes perfect sense in a binary system. 2^10 = 1024, which is close enough to 1000 to just say "kilo". kilo^2 = mega, kilo^3 = giga, kilo^4 = tera, etc.

Maybe when we start talking about everyday data in terms of exabytes someone will come up with an easier way, but I don't think it really matters.
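If you want to see how "close enough" holds up as the prefixes climb, here's a quick Python sketch (nothing clever, just the powers):

```python
# Binary vs. decimal prefixes: 2^(10n) vs. 10^(3n)
for n, name in enumerate(["kilo", "mega", "giga", "tera"], start=1):
    binary = 2 ** (10 * n)
    decimal = 10 ** (3 * n)
    gap = (binary / decimal - 1) * 100
    print(f"{name}: 2^{10 * n} = {binary:,} vs 10^{3 * n} = {decimal:,} "
          f"({gap:.1f}% apart)")
# kilo is 2.4% off, mega 4.9%, giga 7.4%, tera 10.0% -- the gap keeps growing.
```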
 

Moohooya

Senior member
Oct 10, 1999
677
0
0
The only foolish convention is the HD megabyte being 1,024,000 bytes (1024 x 1000). Where the heck did they come up with that?

A kilobyte being 1024 bytes is a perfectly rational number. (No pun intended.)

Edit

A FLOP is a FLoating point Operation Per Second. (Everyone seems to be missing out the per-second part in their definition, although correctly stating that 78 Gigaflops is 78 billion of these suckers in a second.)

Floating point operations take different amounts of time depending on the numbers and the operation. I would read this as 78 billion very trivial operations that the hardware is highly optimised to do. (Not sure if load/store counts or not?)
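Since we're counting bytes three different ways in this thread, here's a rough sketch of the competing "megabytes" side by side (the 40 GB drive is just an example size):

```python
# Three incompatible "megabytes" in circulation:
MB_DECIMAL = 10 ** 6       # 1,000,000 bytes (strict SI)
MB_HYBRID  = 1024 * 1000   # 1,024,000 bytes (the odd HD megabyte)
MB_BINARY  = 2 ** 20       # 1,048,576 bytes (what RAM and most software mean)

disk_bytes = 40 * 10 ** 9  # a "40 GB" drive as the label counts it
for label, mb in [("decimal", MB_DECIMAL),
                  ("hybrid", MB_HYBRID),
                  ("binary", MB_BINARY)]:
    print(f"{label}: {disk_bytes / mb:,.0f} MB")
# Same drive: 40,000 MB, 39,062 MB, or 38,147 MB depending on your megabyte.
```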
 

Superdoopercooper

Golden Member
Jan 15, 2001
1,252
0
0
Yeh... I know that 2^10 is 1024. I understand where they got that from...

but where do they get that 640kB = 655,360 Bytes or whatever it is? That is just stupid... IMO... b/c a kilo is 10^3. So, 640 * 10^3 = 640,000. But for some reason... only for bytes... kilo now means 1024. Hmmm....

So does a kilo of crack weigh 2.2lbs... or 2.2 * 1.024 lbs?? :D
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0


<<

A FLOP is a FLoating point Operation Per Second. (Everyone seems to be missing out the per-second part in their definition, although correctly stating that 78 Gigaflops is 78 billion of these suckers in a second.)

Floating point operations take different amounts of time depending on the numbers and the operation. I would read this as 78 billion very trivial operations that the hardware is highly optimised to do. (Not sure if load/store counts or not?)
>>



well, just to be pedantic, a FLOPS is a floating point operation per second.

also, according to Dell anyway, a Gigabyte is 1,000,000,000 bytes, which would make a Megabyte 1,000,000 bytes and not 1,024,000 bytes, which means you're getting short-changed even more. I'm pretty sure a gigabyte of RAM, however, is 1024^3 bytes, as the memory count thingie on boot counts in KBytes, and 65,536 KBytes is 64 MB.

Superdoopercooper: When working with things in base 2, each progressive prefix denotes another factor of 2^10. When working with stuff that isn't base 2, each progressive prefix is 10^3. The obvious exception being HD size. With the lower prefixes (2^10)^n ~ (10^3)^n, but the difference increases with size. Only things that work inherently in base 2 use 2^10... i.e. not crack cocaine :)

The reason being, quite often you have objects whose sizes are exact integer multiples of 2^10, which makes it a lot easier to say 64MB rather than 65.536 MB.

the IEC came up with a new convention of kibi, mebi, gibi, etc., but I guess it didn't catch on
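(Sketched in Python, using the IEC names and their two-letter symbols; the byte counts are just examples:)

```python
# The IEC binary prefixes that were supposed to end the ambiguity.
IEC = [("TiB", 2 ** 40), ("GiB", 2 ** 30), ("MiB", 2 ** 20), ("KiB", 2 ** 10)]

def pretty(n_bytes):
    """Render a byte count with an unambiguous binary prefix."""
    for unit, size in IEC:
        if n_bytes >= size:
            return f"{n_bytes / size:g} {unit}"
    return f"{n_bytes} B"

print(pretty(65_536 * 1024))  # "64 MiB" -- no argument over what that means
print(pretty(40 * 10 ** 9))   # "37.2529 GiB" -- the marketing "40 GB" drive
```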

BTW: on to another controversy, but who the hell invented the whole MB/s vs Mb/s thing?
 

MustPost

Golden Member
May 30, 2001
1,923
0
0
I read on The Register that the Système International, SI or whoever it is that runs the metric system, has their own new standard for computer sizes. I'll try to find it.
 

Moohooya

Senior member
Oct 10, 1999
677
0
0
<< well, just to be pedantic, a FLOPS is a floating point operation per second. >>

Well, I meant that. I just missed out the S in FLOP but did manage to capitalise it in the definition!

It would make a whole lot more sense for an HD megabyte to be 1,000,000 bytes rather than 1,024,000 (assuming it couldn't be 2^20). However, I thought that at one point (and perhaps they've changed this) it was 1,024,000 bytes. This dates from the early days when storage media weren't even a megabyte in size, and so were always measured in kilobytes (1024 bytes). Then when they finally introduced megabyte storage, the megabyte was 1000 kilobytes, or 1,024,000 bytes! I guess marketing has since required that they display as many gigs as possible, so now it all uses 10^3 rather than 2^10.

MB vs Mb has been around for decades. Well, the B vs b distinction has been around that long, but it started off as B vs b and KB vs Kb.

Then there is the nybble: 4 bits to a nybble, 2 nybbles to a byte. And then there's the 9-bit byte (I believe that was IBM's creation).
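In code terms (a throwaway Python sketch), a byte splits into its high and low nybbles with a shift and a mask:

```python
# Splitting a byte into its two nybbles (4 bits each)
def nybbles(byte):
    """Return the (high, low) 4-bit halves of an 8-bit value."""
    assert 0 <= byte <= 0xFF
    return (byte >> 4) & 0xF, byte & 0xF

hi, lo = nybbles(0xA7)
print(hex(hi), hex(lo))  # 0xa 0x7 -- two nybbles back to a byte
```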