
Fermi benchmarked? [Fake]

I think most people can safely assume that AMD is sitting on/tweaking designs

I hope Fermi causes ATI to unleash improved versions of HD58xx and HD5970.

Faster and denser GDDR5 modules would do the trick. 2 GB of 7 Gbps GDDR5 on a 256-bit bus might be warranted.

I'd be willing to bet testing at higher resolutions like 5760x1080 would yield improvements.
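Just to put a number on the configuration mentioned above, the peak bandwidth works out with a quick back-of-the-envelope calculation (a sketch in Python, only illustrating the arithmetic):

```python
# Peak theoretical memory bandwidth:
# per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# 7 Gbps GDDR5 on a 256-bit bus:
print(mem_bandwidth_gbs(7, 256))  # 224.0 (GB/s)
```

224 GB/s would be a healthy jump over the HD 5870's stock configuration, which is roughly the point being made.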
 
The performance indicated in these slides is shocking. It'd be like the second coming of G80.

Disclaimer: I'm not saying the slides are real, or anything of the sort.
 
The performance indicated in these slides is shocking. It'd be like the second coming of G80.

Disclaimer: I'm not saying the slides are real, or anything of the sort.

That's pretty much my thoughts.

Ah well, it's winter and speculation is entertainment.
 
I think it's very funny seeing the ATi owners getting upset and calling the slides all sorts of names... PMSL... how upsetting that nV might have a card coming out that the slides show spanking the almighty 5xxx series...

Fake or not, a good laugh!
 
I am thankful this will cause video card prices to drop eventually.

However, if I were Nvidia I might think twice about making such a huge die again. (Or maybe this doesn't even cross their minds, as the HPC market is where they make 2/3 of their profit.)
 
I think it's very funny seeing the ATi owners getting upset and calling the slides all sorts of names... PMSL... how upsetting that nV might have a card coming out that the slides show spanking the almighty 5xxx series...

Fake or not, a good laugh!

Ya know, a few months ago, nvidia owners got upset by the AMD PR slides showing the 5870 walking all over the GTX 295, and cried foul, saying they were just PR and not accurate. And you know what?

They were completely right.
 
Come on people. This is the kind of crap that gets threads locked and people banned. Take it to PFI.

Back on topic:

I am thankful this will cause video card prices to drop eventually.

However, if I were Nvidia I might think twice about making such a huge die again. (Or maybe this doesn't even cross their minds, as the HPC market is where they make 2/3 of their profit.)

I agree. These huge dies seem very expensive to make. Nvidia might one day need an entirely separate GPU design for the HPC market, and something smaller and cheaper for the gaming market.
 
I agree. These huge dies seem very expensive to make. Nvidia might one day need an entirely separate GPU design for the HPC market, and something smaller and cheaper for the gaming market.


If they could make their architecture more modular, they could have a bunch of gaming widgets and a handful of GPGPU twockers for their GTX card, and a handful of gaming widgets and a bunch of GPGPU twockers for their HPC card. No wasted GPGPU die space on gamer cards, no wasted gaming die space on HPC cards. Now THAT would be a huge win, though I don't think it'd be very feasible.
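The mix-and-match idea above can be sketched as two SKUs built from the same building blocks in different ratios. This is a toy illustration only; the unit names and counts are invented, not anything Nvidia has described:

```python
# Toy sketch of a modular GPU product line: same building blocks,
# different ratios per market. All names and numbers are made up.
from dataclasses import dataclass

@dataclass
class GpuSku:
    name: str
    gaming_units: int   # "gaming widgets"
    compute_units: int  # "GPGPU twockers"

    def total_units(self) -> int:
        return self.gaming_units + self.compute_units

gamer_card = GpuSku("GTX-class", gaming_units=14, compute_units=2)
hpc_card = GpuSku("Tesla-class", gaming_units=2, compute_units=14)

# Same total die budget, different mix per market:
assert gamer_card.total_units() == hpc_card.total_units() == 16
```

The catch, as the post says, is feasibility: validating and manufacturing two different unit mixes on one architecture is exactly the kind of thing that makes this easier said than done.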
 
Ya know, a few months ago, nvidia owners got upset by the AMD PR slides showing the 5870 walking all over the GTX 295, and cried foul, saying they were just PR and not accurate. And you know what?

They were completely right.

Those slides showed the 5870 outperforming (but not by much, certainly not "walking all over") the GTX 295. They were definitely misleading, but not complete nonsense. There are plenty of scenarios where the Radeon HD 5870 is faster than the GTX 295. The nitpick is that this is not all scenarios, and not even the majority of them. (The slides only showed benchmarks that the 5870 won and omitted the others, giving the impression that it was faster in everything.)
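The cherry-picking described above is easy to demonstrate numerically: average only the benchmarks one card wins and it looks faster across the board. A toy sketch in Python, with invented numbers (not real 5870 vs. GTX 295 results):

```python
# Relative performance of card A vs. card B across benchmarks.
# All figures are made up, purely to show the selection-bias effect.
results = {
    "Game 1": 1.15,
    "Game 2": 1.08,
    "Game 3": 0.90,
    "Game 4": 0.85,
    "Game 5": 1.05,
}

def geomean(xs):
    """Geometric mean, the usual way to average performance ratios."""
    xs = list(xs)
    prod = 1.0
    for x in xs:
        prod *= x
    return prod ** (1 / len(xs))

all_games = geomean(results.values())                      # full picture
wins_only = geomean(r for r in results.values() if r > 1)  # the "slide"

print(f"all games: {all_games:.3f}, wins only: {wins_only:.3f}")
```

Over the full set the two cards are roughly even, but the wins-only average shows a comfortable lead, which is exactly the impression a PR slide can create without printing a single false number.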
 
Because you are an annoying troll who adds little to this msgboard.

1. It's always funny to see comments like this from people with such useful posts 😛

2. It's a forum, not a message board; no need to go 'old school' when it's incorrect. (BTW, I was most likely already posting on BBSes when you got your first computer.)
 
Those slides showed the 5870 outperforming (but not by much, certainly not "walking all over") the GTX 295. They were definitely misleading, but not complete nonsense. There are plenty of scenarios where the Radeon HD 5870 is faster than the GTX 295. The nitpick is that this is not all scenarios, and not even the majority of them. (The slides only showed benchmarks that the 5870 won and omitted the others, giving the impression that it was faster in everything.)

Correct. There are several situations where a 5870 beats up a 295, and with newer and newer drivers it is only going to get worse and worse for the 295...
 
Anyway, from the XS thread:
"These graphs were originally posted in a forum by a guy with the user name of "successful troll". I'm surprised guru3d made an article of this crap."
by Redsand426
 
Those benchmarks show the 380 as having MORE THAN TWICE THE PERFORMANCE OF THE 285.

Which is clearly impossible, given the die size and every rumored spec.

Also impossible simply because of the GPU trend: nVidia would not release such a card, as it's completely unnecessary.
 
What is the problem?

Wait for real numbers. New cards will eventually be tested by trusted reviewers.

This is NOT the time to buy, unless you have to, for whatever reason (e.g. a new rig).

If you have to buy, then go the value route (4890, 260) and then wait.

I have a 512 MB 4870 and I always wanted to upgrade to 1 GB, but this is NOT the time. I will pay neither AMD nor Nvidia a penny in this situation.

AMD has a good chip, but it is not widely available due to the TSMC f***-up.

Nvidia will probably have a good chip, but it is not there yet. People who have to buy (e.g. a new rig) are paying premiums because of the short supply.

Again: this is NOT the time to buy, and it is NOT the time to discuss leaked/faked benchmarks. It is time to focus on Christmas and WAIT.
 
Just because A3 is reportedly "in the oven" does not mean there are no successful A2 cores being used in house. A respin could simply be called for to improve yields. While these graphs are fake, there isn't any reason why they couldn't have been legit at this point in time. I'm not talking about the scores, but the data is probably available in house at NV right now.

As others have said, there isn't any reason why good-yield A2 stepping cores couldn't have been run at A3 stepping clock rates for testing. It's like cherry-picking some A2 cores that could, or did, meet those clocks, or near them, with adequate cooling.

Anyway, this thread, along with the others at Guru3D and XS, is much ado about nothing. As we get closer to the time when GF100 becomes public, I'm sure we'll see some people with pre-release samples leaking out real numbers.
 
I am waiting until Fermi has been reviewed by Anandtech, because that's a review I would trust. I keep thinking back to October. Does anyone else remember:

"Nvidia has confirmed that the Fermi card held up by Jen-Hsun on stage wasn't real." 😀
 
These numbers would seem believable "if" the GTX 360 were the top-end single-GPU card and the GTX 380 were a dual-GPU (down-clocked) card. Now, if NV are hoping to release GF100 in the February/March time frame, then I'm sure they have some cards running by now. But these numbers look too good to be true, unless NV found a way to clock GF100 over 800 MHz, which I think is highly doubtful in their case.
 