
[Rumor] NVIDIA Maxwell, Denver CPU, Shield 2 and Tegra 5 Announcement 01/06 @ CES

Did they state what clocks it was running at for those demos? If it's like 900MHz to 1GHz that's nice, but there is no way they will hit those clocks at 5W.
 
this is super exciting, but does anyone here actually believe this will only use 5W at the chip level while pumping out ~360 GFLOPS on the same process?
It sounds too good to be true to me as well.
Possibly they have some kind of dynamic clocking, and the 5W is measured in the low-power state while the 360 GFLOPS is measured in the high-power state.
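For what it's worth, the ~360 GFLOPS figure is at least arithmetically consistent with 192 Kepler-style CUDA cores at a high clock. A quick back-of-envelope sketch (the ~950MHz clock is an assumption for illustration, not a confirmed spec; each core is counted as one FMA, i.e. 2 FLOPs, per cycle):

```python
# Rough theoretical peak: cores * 2 FLOPs (one fused multiply-add) per cycle * clock.
def peak_gflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz

# 192 Kepler cores at an assumed ~950 MHz land right around the claimed figure.
print(peak_gflops(192, 0.95))  # ~364.8 GFLOPS
```

Whether that clock is sustainable inside 5W, rather than just reachable in a burst, is exactly the open question.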
 
Interested in Maxwell if it delivers. The first decent 28nm part from anyone will probably get my dollar, as I need to build this summer.
 
1. Comparing real-world gaming performance between AMD and NV strictly based on GFLOPS? Total fail. Explained below.

2. The top-of-the-line Kaveri APU will have 512 GCN SPs. Its performance should come very close to the HD7750. On the other hand, the GPU in K1 only has 192 CUDA cores, half as many as the GT640: GK107 features a single graphics processing cluster containing dual SMX shader multiprocessors, double that of K1's GPU.

3. HD7750 pummels GT640 into the ground, which means Kaveri's GPU should be at least 3.5x faster than Tegra 5.



4. You didn't even talk about the massive difference in CPU power between four Steamroller cores at 4.0GHz and the Denver CPU cores. Obviously this comparison is unfair since Kaveri has a TDP of 95W, but you are the one who went there, comparing paper GFLOPS between Kaveri and K1.
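The point running through 1-4, that theoretical GFLOPS don't track real-world gaming performance across architectures, is easy to illustrate numerically. A rough sketch, assuming reference clocks of 800MHz for the HD7750 and ~900MHz for the GT640 (both clocks are assumptions for illustration):

```python
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical single-precision peak: shaders * 2 FLOPs (one FMA) per cycle."""
    return shaders * 2 * clock_ghz

# Assumed reference clocks: HD7750 at 800 MHz, GT640 at ~900 MHz.
hd7750 = peak_gflops(512, 0.80)  # ~819 GFLOPS
gt640 = peak_gflops(384, 0.90)   # ~691 GFLOPS

# On paper the HD7750 leads by under 20%, yet in real games the gap is far
# larger: which is exactly why cross-architecture GFLOPS comparisons fail.
print(f"paper advantage: {hd7750 / gt640:.2f}x")
```

The paper ratio understates the real gap because GFLOPS says nothing about ROPs, memory bandwidth, driver quality, or how well the shaders are actually utilized.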



Exact same mistake as before: comparing GPU performance across different architectures based on GFLOPS. The GPU in the Xbox One has 768 Stream Processors and is very similar in performance to the HD7790. As such, it will trounce the K1 chip by 4x in gaming performance, considering the HD7790 is at least 2x faster than the GT640.

You again didn't even bring up the power of 8 Jaguar cores vs. 2 Denver cores. Sure, it is impressive for a 5W chip, but Tegra 5 is far from the power of the Xbox One and PS4. There is no question that NV has massively improved Tegra 5 over Tegra 4.

NV compares K1's GPU power to Cyclone in the A7, but that's "old generation in tech terms" since the 28nm SoC ship has sailed. This year, the A8 is expected to be made on 20nm, and Tegra 5 will find itself competing against newer chips on smaller nodes. If NV had introduced Tegra 5 on 28nm before the A7 debuted, it would have been uber impressive. Instead, NV seems to have an excellent architecture but a process-node battle on their hands, since very soon K1 will find itself competing against 20nm chips.

Also, I'm not even sure why you conveniently chose to use the Xbox One, which costs $100 more than the PS4, when the GPU in the PS4 is nearly 50% faster than in the XB1. If we are going to compare modern SoCs to current-generation consoles, we might as well use the PS4 as the benchmark. It will likely take 3-4 years before an SoC has the power of the PS4, by which time the PS4 will be nearly halfway through its life cycle.

More importantly, even if you had a smartphone that had 100x the power of PS4, the end user experience is so vastly different, the comparison isn't really relevant from a consumer's point of view. My smartphone can produce graphics far superior to NES, SNES and N64 but as a gaming device, compared to those consoles, it's trash.

The GT 640 is faster than the 8800GS? It seems K1 will have the GPU power of an 8800GS/8600GTS. BTW, Kaveri can match the 7750 in situations where memory bandwidth doesn't destroy IGP performance...
 
If OpenGL offered what Mantle offers, do you really think developers would be asking AMD to create Mantle?

I seriously doubt developers asked AMD to create a brand-new API that they would have to code for separately. Developers would rather have an open-source API than have to make a game work for most hardware with one API and then make it work again with a niche API.

More than likely, developers just complained about DirectX and its complexities when porting games to PC, so AMD created Mantle and is paying devs to try to leverage their console designs. I'm not falling for that caped-crusader hero gibberish one bit.
 
Not for all of us... I am more interested in pure performance than perf/watt.
Hence why I game on a PC... and not a crapbox or mobile...

Top end GPUs are limited by performance/W. There is a TDP ceiling on what you can fit in the PCIe spec, and the 290X and 780Ti are both hard up against it. If you improve performance/W while keeping W constant, then you get higher performance. Basic maths...
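The arithmetic behind that "basic maths" is worth making explicit: with wattage pinned at the PCIe/TDP ceiling, every perf/W gain converts one-for-one into a performance gain. A toy sketch (the efficiency numbers are made up purely for illustration):

```python
# At a fixed power ceiling, performance = (performance per watt) * watts,
# so improving perf/W at constant W raises performance by the same factor.
def performance(perf_per_watt: float, watts: float) -> float:
    return perf_per_watt * watts

TDP_CEILING = 250.0  # watts; roughly where 290X/780Ti-class cards sit

old_gen = performance(perf_per_watt=20.0, watts=TDP_CEILING)  # arbitrary units
new_gen = performance(perf_per_watt=26.0, watts=TDP_CEILING)  # +30% efficiency

print(f"speedup at the same power: {new_gen / old_gen:.2f}x")  # 1.30x
```

The one-for-one conversion only holds while both generations actually sit at the ceiling, which is the situation the post describes for the 290X and 780Ti.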
 
Top end GPUs are limited by performance/W. There is a TDP ceiling on what you can fit in the PCIe spec, and the 290X and 780Ti are both hard up against it. If you improve performance/W while keeping W constant, then you get higher performance. Basic maths...

I guess you have been absent for the Haswell launch... the focus on mobile stalled pure performance in favor of perf/watt... sorry, but reality just broke your "basic maths"... try again.
 
Again, why is this so exciting? Is it for small mobile stuff? I'm not crapping on you guys, I just want to know why people are flipping out over this thing. What will it do for you, personally? I get all excited about new GPUs because it means better graphics with higher FPS, but what does this chip do? Better looking mobile games?
 
Wondering that myself. Who cares about this thing? They hyped Tegra 3 and Tegra 4 a lot too, as far as I remember from tech news and forum posts. Both of those bombed hard from everything I read: they released much later than promised, hardly any device makers were interested in using them, and they didn't deliver the promised performance.

What does this thing do exactly? Let me play Quake 3 on my phone? 😀 Aren't the giants like Qualcomm, Samsung and the huge gorilla Intel going to lap them again before they manage to release it?

I just don't get the interest in these chips. Even if it performs like an Xbox 360... that is horrible and does nothing for me as a gamer, and I don't think most smartphone users couldn't care less about that anyway. They just want to SMS, do email, check the internet and play some Farmville/Bejeweled 😀
 
Wondering that myself. Who cares about this thing? They hyped Tegra 3 and Tegra 4 a lot too, as far as I remember from tech news and forum posts. Both of those bombed hard from everything I read: they released much later than promised, hardly any device makers were interested in using them, and they didn't deliver the promised performance.

What does this thing do exactly? Let me play Quake 3 on my phone? 😀 Aren't the giants like Qualcomm, Samsung and the huge gorilla Intel going to lap them again before they manage to release it?

I just don't get the interest in these chips. Even if it performs like an Xbox 360... that is horrible and does nothing for me as a gamer, and I don't think most smartphone users couldn't care less about that anyway. They just want to SMS, do email, check the internet and play some Farmville/Bejeweled 😀

I concur. These things obviously have their place or they wouldn't spend money developing them, but for gamers who get at all serious and involved in gaming, I don't think a phone is their preferred platform. So this seems like it's geared toward casual gamers who are constantly on the go. Like, they must travel so often that they simply can't manage an hour of downtime in front of a TV or PC. But if I were cursed with such a chaotic lifestyle, then I'd be thrilled to play Quake 3 on my iPhone while driving to my next appointment.
 
Again, why is this so exciting? Is it for small mobile stuff? I'm not crapping on you guys, I just want to know why people are flipping out over this thing. What will it do for you, personally? I get all excited about new GPUs because it means better graphics with higher FPS, but what does this chip do? Better looking mobile games?

It breaks through the stagnation of the x86 market. It gives us competition and advancement.

Tegra K1v1 delivers 40% more performance in the same power range as Tegra 4. Dozens of companies are developing their own 64-bit ARM processors. For the first time since 2006, we have a real CPU race to the top.
 
I guess you have been absent for the Haswell launch... the focus on mobile stalled pure performance in favor of perf/watt... sorry, but reality just broke your "basic maths"... try again.

Haswell gave a respectable increase in CPU performance and an excellent increase in GPU performance. If you only care about CPU performance, then look at the CPU-only server parts, where Ivy Bridge gives a ~50% improvement in performance by fitting 12 cores into the TDP of 8 Sandy Bridge cores.
 
It breaks through the stagnation of the x86 market. It gives us competition and advancement.

Tegra K1v1 delivers 40% more performance in the same power range as Tegra 4. Dozens of companies are developing their own 64-bit ARM processors. For the first time since 2006, we have a real CPU race to the top.

Stagnation? I would call Bay Trail or Kabini stagnant; they are definitely reacting to the onslaught of high-performance ARM cores.
 
If nvidia delivers on its 2010 promises for Maxwell performance, then we are in good hands.


http://static.vizworld.com/wp-content/uploads/2010/09/cuda-gpu-roadmap-595x444.jpg

Not for all of us... I am more interested in pure performance than perf/watt.
Hence why I game on a PC... and not a crapbox or mobile...

I guess you have been absent for the Haswell launch... the focus on mobile stalled pure performance in favor of perf/watt... sorry, but reality just broke your "basic maths"... try again.

Haswell still runs extremely cool by Intel's own design. I don't know if there is a relevant, non-obsolete ATX spec that limits power to a processor; if Intel wanted to make a 200W monster, there isn't a spec that prevents them from doing so. Remember that their processors today use about the same power as the processors from a few years ago, yet their performance is better; the perf/watt approach isn't a total failure, it's somewhat successful.

Like the other poster noted, there are limits on what the PCIe spec can provide for power, and AMD and NV are pushing against them already. Scaling a design up from low to high power isn't going to be perfectly 1:1 linear, and sometimes it can be downright horrible, but it gives a better chance that the next generation of high-end cards will deliver better performance at the same power (assuming the high-end cards are just beefed-up, scaled-up versions of the low-power units).
 
Grooveriding said:
Wondering that myself. Who cares about this thing

The people who care about this are people who are interested in ultra-mobile technology at the bleeding edge of performance and efficiency. Tegra will be the building block (so to speak) for future GeForce GPU architectures, and the power-efficiency gains in Tegra will translate directly to higher-end GeForce GPUs (including the upcoming Maxwell GPUs).
 
The people who care about this are people who are interested in ultra-mobile technology at the bleeding edge of performance and efficiency. Tegra will be the building block (so to speak) for future GeForce GPU architectures, and the power-efficiency gains in Tegra will translate directly to higher-end GeForce GPUs (including the upcoming Maxwell GPUs).

So a minuscule portion of phone and tablet users who frequent tech forums?

Meanwhile, 99% of smartphone and tablet buyers will be happily using their SGS5, iPhone 6 and iPads powered by Samsung and Apple chips, and quite likely Intel, who is the player best poised to, and in the unique position of being able to, deliver massive perf/W to mobile that no one else can. While this Tegra 5 thing winds up in Shield 2... so a few people can play old games with near-360/PS3 image quality...

I just do not find anything exciting about performance in mobile parts; it's always terrible, and the form factor is not suited to interactive entertainment. Better battery life is great, but it's Intel that is going to deliver the real massive gains there.
 
Grooveriding said:
I just do not find anything exciting about performance in mobile parts

You are entitled to your opinion, but this thread is about ultra mobile tech 🙂

FWIW, Tegra goes way beyond just smartphones and tablets. Tegra makes headway as Android makes headway, and Android will be in smartphones, tablets, tablet convertibles, TVs, game consoles, cars, etc.
 
Hopefully WP 8.1 devices embrace Tegra K1; both are underdogs in the market and would make for a nice alternative. I've already got a Lumia 1520, which I absolutely love, so I'd jump at the chance for another 6" phone by MS that uses the K1 SoC with Denver.
 
Looking at CES and the notebooks being debuted, it seems like NVIDIA is going to launch the 8xxM series for sure in the next couple of months.

Whether that's Maxwell or not is still up in the air, but it looks likely.
 