[techeye] High end kepler -- 2013?

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
http://news.techeye.net/chips/nvidias-kepler-suffers-wobbly-perturbations#ixzz1l4sdeNee

More industry whispers suggest the reason Nvidia's upcoming hope, the GK104, is late to the table is all the redesigns it has had to go through. Parts designed for 32nm had to be redesigned for 28nm, but along the way there were challenges with packaging and interconnects. Some industry watchers suggest that Nvidia gave up a lot of space on its chip, trying to buff up Kepler by bringing Ageia to the hardware.

But the murmurs suggest Nvidia has been dedicating a lot of resources to get physics and fluid dynamics operating properly, which has so far, allegedly, taken half of its gaming engineers and six months to get right.

One industry watcher said to TechEye the company is in "holy s**t" mode - having been confident that the GK104 would fight off and trounce the competition, but the timing is out of whack. When Nvidia does get its high-end Kepler chip out in the second half of the year, the competition is going to be ready with something else.

Interesting info. I wonder if Kepler will suffer Fermi-type delays or if it will make April? I'm even more confused by the part nomenclature now, too.
 
Last edited by a moderator:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Will be an epic fail if Nvidia can only manage their mid-range card before late this year. I was starting to hope they might get them out before, or close to when, custom 7970s and the full-cover water blocks to go with them become available.

Couldn't care less about a mid-range 28nm card myself. This could explain all the far-fetched, too-good-to-be-true rumours about GK104, though.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I hope not; I planned to hold off on upgrading to a custom 7970 until I saw what NV had up their sleeves :\
 
Feb 19, 2009
10,457
10
76
If they could paper launch it, they would.

Frankly, the fact that they aren't even hinting at a launch in the foreseeable future and are keeping a tight lid on things suggests Kepler is very far off.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
This article leaves a lot of questions -- IF TRUE (not sure, obviously) -- will the GK104 leave Ageia off? Will only the high-end GK110 have Ageia? And will GK104 make an April-May release?

I wonder how NV will shore up support for it. Seems difficult since so many games are multi-platform.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
This is a bummer.

Especially if PhysX is a possible reason for the delay. If that's why they are delaying, then PhysX could have cost them an awful lot, and I'm not sure how much it has gained them over the years.
 

thilanliyan

Lifer
Jun 21, 2005
12,057
2,272
126
SECOND HALF of 2012? I hope that's not true. 79XX prices probably won't come down much any time soon if that's the case.

So they're trying to do PhysX on a dedicated chip? How reliable is "Techeye" anyway?! :D
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Then why hasn't NVIDIA already shipped it for revenue? After you ship for revenue it takes one to two months for a hard launch.

Three months to launch is the minimum it'll take, and I fail to see how that is "imminent". Information on Enthusiast Kepler is scarce, engineering samples have not been supplied to anyone, and if they had working samples they would've demonstrated them at recent events. That alone should tell you a lot about how "imminent" Kepler's release is.

Seems pretty obvious to me that high-end Kepler doesn't physically exist. There's no engineering samples or product demos to speak of.

You don't have to be a genius to figure out that NVIDIA hasn't demonstrated it because, simply put, there's no working silicon for now. This summer is when these cards will launch at the earliest, and AMD can easily take advantage of improved yields and clock speeds by then to release a new card with higher clocks and more execution units.
 

Subyman

Moderator - VC&G Forum
Mar 18, 2005
7,876
32
86
Seems pretty obvious to me that high-end Kepler doesn't physically exist. There's no engineering samples or product demos to speak of.

You don't have to be a genius to figure out that NVIDIA hasn't demonstrated it because, simply put, there's no working silicon for now. This summer is when these cards will launch at the earliest, and AMD can easily take advantage of improved yields and clock speeds by then to release a new card with higher clocks and more execution units.

With AMD lining up improvements to Bulldozer as well, it may be the year of AMD at this point.
 

mak360

Member
Jan 23, 2012
130
0
0
This new toolkit backs that rumour

CUDA Toolkit 4.1



"LLVM's modular design allows third-party software tool developers to provide a custom LLVM solution for non-NVIDIA processor architectures, enabling CUDA applications to run across other vendors" (AMD)

http://developer.nvidia.com/cuda-toolkit-41
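
To put the quote in concrete terms: from 4.1 onwards the device-code front end of nvcc is built on LLVM, so ordinary CUDA source like the sketch below gets lowered through LLVM IR before PTX comes out the other end. This is just a minimal illustrative kernel I'm using as an example -- the file name and compile lines are my own assumptions, not anything from NVIDIA's page.

// Minimal sketch: ordinary CUDA device code of the kind the LLVM-based
// compiler in Toolkit 4.1 lowers to LLVM IR and then to PTX.
// File name (saxpy.cu) and compile lines are illustrative assumptions.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        y[i] = a * x[i] + y[i];                     // y = a*x + y
}

int main()
{
    const int n = 1 << 20;
    float *x = 0, *y = 0;
    cudaMalloc((void **)&x, n * sizeof(float));     // device buffers
    cudaMalloc((void **)&y, n * sizeof(float));
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y); // one thread per element
    cudaDeviceSynchronize();                        // wait for the kernel
    cudaFree(x);
    cudaFree(y);
    printf("kernel finished\n");
    return 0;
}

// Compile (illustrative):
//   nvcc -arch=sm_20 saxpy.cu -o saxpy
//   nvcc -ptx saxpy.cu   # dump the PTX the LLVM-based front end produces

The point of the quoted blurb is that, because the front end is now LLVM, a third party could in principle attach a different back end and target non-NVIDIA hardware with the same source.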
 
Last edited:

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
How reliable is "Techeye" anyway?! :D

First I've heard of them. Looks to be a British website, and this "Hector Dish" seems to be reporting mostly verifiable news items rather than rumors (which raises the question of how good his rumor sources are, since rumors don't seem to be what he does most).
 

amenx

Diamond Member
Dec 17, 2004
4,521
2,857
136
Regardless, it is still a fail even if a proper, good-performing high-end Kepler comes out late in the 2nd quarter as expected. That sort of delay is too much in the GPU business. New products from the competition could be just around the corner when it arrives.
 

Quantos

Senior member
Dec 23, 2011
386
0
76
[Assuming that's true, at least in part]

Well, NVidia really cannot fall into the trap of releasing a late, unoptimized product again (à la GTX 480). The more time goes by, the higher the expectations for Kepler get. At the same time, the more time goes by, the more AMD can prepare its next move. Of course, it's still quite early to go crazy. AMD is launching its new products at the moment and, hopefully for them, profiting from the lack of new competition. But again, AMD's engineers definitely aren't just sitting around.

From NVidia's business point of view, I simply cannot see how that pattern can be profitable in the long run. At the moment the 560/570 are probably still very profitable, but when AMD releases the rest of the 7xxx line, that could change. Even if AMD's offerings are not an incredible increase in performance at that point, I just don't see how the 560/570 could remain good values. If that's the case, and Kepler is SO far away, that could be disastrous.

[/Assuming that's true, at least in part]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Regardless, it is still a fail even if a proper, good-performing high-end Kepler comes out late in the 2nd quarter as expected. That sort of delay is too much in the GPU business. New products from the competition could be just around the corner when it arrives.

The same way that the entire Fermi generation was a fail by being 6 months late: bringing working 3D gaming, unparalleled tessellation performance (about 2 years ahead of AMD), while achieving 60% market share in the discrete GPU space and giving us awesome price/performance in GTX460 and GTX560Ti cards, and at the same time putting pricing pressure on AMD, resulting in such gems as the HD6950 2GB at sub $299? You think AMD would have focused on Tessellation performance improvements in HD6900 and HD7900 if NV wasn't dominant in this area before? Let's consider that competition can often improve the landscape for all of us, even if the competitor does arrive late.

In the current PC era characterized by console ports/game engines that lack realism, it would be amazing to see a GPU maker trying to add some unique features that may encourage game developers to produce more realistic games. Sure, thus far physX has not worked out, but that doesn't mean a company should just give up. I'd rather take more realistic physics effects (regardless if NV or AMD brings that advantage) than 80-120 FPS in DX9 game engines and/or 20-30 fps in DX11 games with extreme levels of Tessellation (arguably the only standout DX11 feature).

At the end of the day, HD7900 series did not revolutionize much aside from Zero Power core technology. The new cards still cannot hope to cope with heavy Tessellation in games like Metro 2033, struggle in demanding games like Dragon Age 2 and are still more or less unable to pull off Eyefinity with a single card on 3 monitors. That more or less leaves us having to resort to inefficient deferred AA settings in modern game engines that more often than not result in blur fest and/or massive performance hits in games such as BF3 to extract some value. Are we at a point of diminishing returns though? Fundamentally, games barely look better or feel any more realistic than Crysis 1 from 2007. Maybe I am getting older, but I am no longer impressed by 40% performance increases. I want more realistic PC games. Sure, having a card powerful enough to game on a 30 inch 2560x1600 monitor is a nice bonus, or being able to play on 3 screens gives us gamers more flexibility, but that's not actually improving realism of gaming imho.

If NV is 6 or even 9 months late but actually adds something that has a long-term potential to improve the gaming experience, it might be worth the investment for them and for all of us gamers (as it might encourage AMD to focus on physics for once). The unfortunate side effect of being late is the lack of pricing pressure on AMD.

However, personally, I don't care about Eyefinity due to bezels, or about 120 Hz PC gaming @ 120 fps. As such, it would be a breath of fresh air if any company actually incorporated some useful features that help improve the realism in games. After the incredible hype behind BF3's graphics/realism, the game is actually laughable. Five years after Crysis 1, and judging by BF3, the PC gaming industry has barely moved, aside from more fluid character animations ripped off from EA Sports game engines. That's supposed to be impressive? Textures still look no better than they did in 2007-2008 videogames.

Maybe it's time games started to focus more on realistic physics and worry less about obscene levels of AA and high resolutions.

Softimage Lagoa ICE - Mousetrap 1080p HD

and

Real-Time Grass Rendering

and

Physically Guided Animation of Trees

GRAW with Ageia PhysX (not bad)

Or do people want 5 monitor gaming at super high resolutions and 128x AA filters with the same ragdoll and physics effects from 2008?
 
Last edited:

xp0c

Member
Jan 20, 2008
91
0
0
The real truth is that Nvidia waited until they got a 7970 to test before actually doing much of anything. AMD is actually the leader in GPUs; Nvidia is a follower.

Pretty much the same thing as when the 5870 was launched.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
NVIDIA is on the ropes. They might not last the year!

:hmm:

http://www.google.com/finance?q=NASDAQ:NVDA&fstype=ii

Not sure if serious. You might consider taking some courses in finance, accounting and strategy.

The real truth is that Nvidia waited until they got a 7970 to test before actually doing much of anything.

Exactly. Why spend years on a new GPU architecture when you can wait until your competitor puts its foot forward, then get to work and, in a quick six months, design and manufacture a new GPU from scratch that beats your competitor.....

AMD is actually the leader in GPU's, Nvidia is a follower.

Definitely. Like AMD invented the use of multiple-GPUs in CF and NV copied with their SLI technology, how AMD introduced SM3.0 first with X1800 series and NV copied with their 6800 series, how AMD had the world's first DX10 GPU in 2900XT and G80 came out months later with the same feature set, how AMD had the most advanced Tessellation design in HD5800 series and NV copied it with Fermi, how AMD developed the world's first GPGPU scalar architecture and NV copied, how AMD provided Analytical Anti-Aliasing and Super Sample Anti-Aliasing for DX10+ games first, how AMD provided custom game settings/profiles and custom resolutions in driver control panel, how AMD was the first to have working surround 3D gaming....etc. :thumbsup:
 
Last edited:

bryanW1995

Lifer
May 22, 2007
11,144
32
91
You need to change the thread title to reflect the source.

With AMD lining up improvements to Bulldozer as well, it may be the year of AMD at this point.

Really? AMD put all of their best engineers on Brazos a year ago. They have 1/1,000,000 the resources of Intel (approximately). Intel is still going full speed ahead on their next next-gen offerings. Oh, and Intel underclocks their current high end by ~20%. Something tells me that Bulldozer will never EVER be competitive with anything from Intel. Well, except maybe Atom.
 
Last edited:

Subyman

Moderator - VC&G Forum
Mar 18, 2005
7,876
32
86
Take that to the frickin' BANK, I'm buying some more AMD stock when I get done here. Just a couple thou worth because times are tight, but they seem to be hitting on all cylinders with the first 8 core CPUs setting OCing records, and the 7970 and 7950 kicking arse and takin' names.....:wub:

Might be worth it. I bought AMD stock a few years ago when it was at 3 dollars a share. Rode it up to around 7 within a few months and dumped it. I don't really see there being big profits to be gained at 6-7 dollars a share. They haven't been above 10 in 4 years.
 

tincart

Senior member
Apr 15, 2010
630
1
0
It's pretty obvious NVIDIA is heading for a big, big fall, Quantos. What with AMD launching the 5870 first, and totally beating the GTX 580 in the all-important frames-per-dollar and decibel categories, NVIDIA is on the ropes. They might not last the year!

nVidia is in great financial shape; they're not going anywhere.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
The same way that the entire Fermi generation was a fail by being 6 months late: bringing working 3D gaming, unparalleled tessellation performance (about 2 years ahead of AMD), while achieving 60% market share in the discrete GPU space and giving us awesome price/performance in GTX460 and GTX560Ti cards, and at the same time putting pricing pressure on AMD, resulting in such gems as the HD6950 2GB at sub $299? You think AMD would have focused on Tessellation performance improvements in HD6900 and HD7900 if NV wasn't dominant in this area before? Let's consider that competition can often improve the landscape for all of us, even if the competitor does arrive late.

In the current PC era characterized by console ports/game engines that lack realism, it would be amazing to see a GPU maker trying to add some unique features that may encourage game developers to produce more realistic games. Sure, thus far physX has not worked out, but that doesn't mean a company should just give up. I'd rather take more realistic physics effects (regardless if NV or AMD brings that advantage) than 80-120 FPS in DX9 game engines and/or 20-30 fps in DX11 games with extreme levels of Tessellation (arguably the only standout DX11 feature).

At the end of the day, HD7900 series did not revolutionize much aside from Zero Power core technology. The new cards still cannot hope to cope with heavy Tessellation in games like Metro 2033, struggle in demanding games like Dragon Age 2 and are still more or less unable to pull off Eyefinity with a single card on 3 monitors. That more or less leaves us having to resort to inefficient deferred AA settings in modern game engines that more often than not result in blur fest and/or massive performance hits in games such as BF3 to extract some value. Are we at a point of diminishing returns though? Fundamentally, games barely look better or feel any more realistic than Crysis 1 from 2007. Maybe I am getting older, but I am no longer impressed by 40% performance increases. I want more realistic PC games. Sure, having a card powerful enough to game on a 30 inch 2560x1600 monitor is a nice option, or being able to play on 3 screens is decent, but that's not improving realism of gaming whatsoever.

If NV is 6 or even 9 months late but actually adds something that has a long-term potential to improve the gaming experience, it might be worth the investment for them and for all of us gamers (as it might encourage AMD to focus on physics for once). The unfortunate side effect of being late is the lack of pricing pressure on AMD.

However, personally, I don't care about Eyefinity due to bezels or 120 Hz PC gaming @ 120 fps. As such, it would be a breath of fresh air if any company actually incorporated some useful features that help improve the realism in games. After the incredible hype behind BF3's graphics/realism, the game is a laughing stock. Five years after Crysis 1, the PC gaming industry hasn't moved an inch, aside from more fluid character animations ripped off from EA Sports game engines. Sad, really.

Maybe it's time games started to focus on physics more and worry less about AA and high resolutions so much.

Softimage Lagoa ICE - Mousetrap 1080p HD

and

Real-Time Grass Rendering

and

Physically Guided Animation of Trees

GRAW with Ageia PhysX (not bad)


Or do people want 5 monitor gaming at super high resolutions and 128x AA filters with the same ragdoll and physics effects from 2008?

I hope to god that NV doesn't launch Kepler in September, if only because there would be approximately zero pricing pressure on AMD if the delay is that long. Even stretching it out to late Q2 would be extremely bad for us, regardless of the cool new features that NV provides.

Btw, I think that game physics has potential, but the cards released up to now haven't been beefy enough to handle it. I'd be pleasantly surprised if Kepler can. Maybe by Maxwell we'll start seeing actual playable/enjoyable physics in games. Of course, it would be enormously easier if AMD/NV could agree on some sort of reasonable physics standard.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
It's pretty obvious NVIDIA is heading for a big, big fall, Quantos. What with AMD launching the 5870 first, and totally beating the GTX 580 in the all-important frames-per-dollar and decibel categories, NVIDIA is on the ropes. They might not last the year!

You forgot the /sarcasm tag...
 

xp0c

Member
Jan 20, 2008
91
0
0
I wish ATI had stayed away from AMD. AMD does great in the GPU department, but you always have to wonder what happens if Intel shuts AMD down.