
Discussion: Nvidia Blackwell in Q1-2025

It will be roughly an Nvidia L4 combined with a 20-core ARM CPU, just with the latest gen-5 CUDA cores instead of gen 4. The L4 does 30 TFLOPs FP32, while the GB10 is said to do 31 TFLOPs FP32, and they both have the same memory bandwidth. The difference is that the GB10 may have up to 128GB of unified memory, versus the L4's 24GB of dedicated memory (the unified memory is shared between the ARM CPU cores and the GPU).
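Just to put the quoted numbers side by side, a quick sketch (these are only the figures from the post above, not independently verified specs):

```python
# Figures from the post above, side by side; nothing here is a verified spec.
specs = {
    "L4":   {"fp32_tflops": 30, "memory_gb": 24,  "memory": "dedicated"},
    "GB10": {"fp32_tflops": 31, "memory_gb": 128, "memory": "unified (CPU+GPU)"},
}

for name, s in specs.items():
    print(f"{name:5} {s['fp32_tflops']:>3} TFLOPs FP32, {s['memory_gb']:>3} GB {s['memory']}")

ratio = specs["GB10"]["memory_gb"] / specs["L4"]["memory_gb"]
print(f"Compute is basically a wash; the GB10 has ~{ratio:.1f}x the memory capacity.")
```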
 
In 2025, even thin 13'' laptops can sustain 140W total without throttling.

The 16'', 2kg class can do upwards of 200W.

Doesn't sound like anything I'd want to put on my lap. 😛

It also sounds like battery capacity should be specified in watt-minutes instead of watt-hours (Wh).
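For what it's worth, a quick back-of-the-envelope sketch of why: assuming a 99 Wh pack (a common airline-limit capacity; the wattages are the sustained figures quoted above), runtime on battery really is measured in minutes:

```python
# Back-of-the-envelope runtime check with illustrative figures only:
# a 99 Wh pack is an assumption; real laptops also burn power on the display etc.
def runtime_minutes(capacity_wh: float, draw_w: float) -> float:
    """Runtime in minutes for a given battery capacity and constant draw."""
    return capacity_wh / draw_w * 60

for draw in (140, 200):
    print(f"{draw} W sustained -> ~{runtime_minutes(99, draw):.0f} min on a 99 Wh battery")
# 140 W -> ~42 min, 200 W -> ~30 min, hence the watt-minutes quip.
```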
 
It will be roughly an Nvidia L4 combined with a 20-core ARM CPU, just with the latest gen-5 CUDA cores instead of gen 4. The L4 does 30 TFLOPs FP32, while the GB10 is said to do 31 TFLOPs FP32, and they both have the same memory bandwidth. The difference is that the GB10 may have up to 128GB of unified memory, versus the L4's 24GB of dedicated memory (the unified memory is shared between the ARM CPU cores and the GPU).

The raw specs are similar to the 5070, even though they claimed it was using the GB100 version.

Without the memory bandwidth of course.
 
ComputerBase's take on RTX Hair:
Looks like Sh*t with it on.
[attached comparison screenshot]
 
A company trying to find ways to justify the high price tag of its overpriced unreliable GPU with buggy drivers. Yeah guys. Keep enjoying staring at the hair instead of actually playing the game and oh, don't worry if you smell something weird. It's part of the experience. If something burns, take it as an opportunity to reward this company for making more innovative products and buy the GPU again. Remember, you need to trust them and keep letting them try, try again. One day they will make a true banger of a product with zero issues!
 
Looks like Sh*t with it on.
[attached comparison screenshot]

I'm actually playing Indiana Jones maxed out on an RTX 5090, and enabling RTX Hair didn't make it look like the picture on the left at all.
Though I have to say I didn't notice any substantial difference in how the hair looks, either.


If anything, the biggest change I noticed with the RTX Hair update is that it introduced a bug in the denoising pipeline that creates noisy backgrounds whenever the game enters a cutscene. Now that's super frustrating.


A company trying to find ways to justify the high price tag of its overpriced unreliable GPU with buggy drivers. Yeah guys. Keep enjoying staring at the hair instead of actually playing the game and oh, don't worry if you smell something weird. It's part of the experience. If something burns, take it as an opportunity to reward this company for making more innovative products and buy the GPU again. Remember, you need to trust them and keep letting them try, try again. One day they will make a true banger of a product with zero issues!
Agreed. The only reason I own a 5090 is that its purchase is justifiable as AI processing hardware. There's no such alternative from AMD, unfortunately.
 
Agreed. The only reason I own a 5090 is that its purchase is justifiable as AI processing hardware. There's no such alternative from AMD, unfortunately.
That's true. AMD can't even be bothered to do Vulkan properly for RDNA4. It's not detected in LM Studio, yet both Intel Arc and Nvidia cards (even the 1080 Ti) have no issue being detected by LM Studio.
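If anyone wants to check whether the card is even being exposed at the Vulkan level (as opposed to LM Studio filtering it out), here's a minimal sketch; it assumes the stock vulkaninfo tool from vulkan-tools / the Vulkan SDK is installed, and the exact summary field names may vary by version:

```python
# Quick sanity check: does the OS/driver expose the GPU as a Vulkan device at all?
# Shells out to the stock `vulkaninfo` tool and prints the reported device names;
# if the RDNA4 card doesn't show up here, an app has nothing to enumerate either.
import subprocess

def vulkan_device_names() -> list[str]:
    """Return the deviceName values reported by `vulkaninfo --summary`."""
    out = subprocess.run(
        ["vulkaninfo", "--summary"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.split("=", 1)[1].strip()
            for line in out.splitlines()
            if "deviceName" in line and "=" in line]

if __name__ == "__main__":
    for name in vulkan_device_names():
        print(name)
```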
 
I didn’t see a Rubin thread, so I’ll post this here:

It looks like Nvidia are pre-announcing what I’m guessing is the professional version of what will likely be the same die as the RTX 6090, especially since it employs GDDR7 memory. I also count what appears to be a 512-bit memory interface:
[attached die shot]
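For context, a rough sketch of what a 512-bit GDDR7 bus would imply for bandwidth; the bus width is the figure from the post above, while the per-pin data rates are assumptions (GDDR7 launched around 28-32 Gbps, with faster bins expected):

```python
# Rough peak-bandwidth estimate for a 512-bit GDDR7 bus.
# Per-pin data rates below are assumptions, not announced specs.
BUS_WIDTH_BITS = 512

def bandwidth_gbs(data_rate_gbps: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak bandwidth in GB/s for a given per-pin data rate (Gbps)."""
    return data_rate_gbps * bus_bits / 8

for rate in (28, 32, 36):
    print(f"{rate} Gbps x {BUS_WIDTH_BITS}-bit -> {bandwidth_gbs(rate):.0f} GB/s")
# 28 -> 1792 GB/s, 32 -> 2048 GB/s, 36 -> 2304 GB/s
```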
 
I didn’t see a Rubin thread, so I’ll post this here:

It looks like Nvidia are pre-announcing what I’m guessing is the professional version of what will likely be the same die as the RTX 6090, especially since it employs GDDR7 memory. I also count what appears to be a 512-bit memory interface:
[attached die shot]
288 SMs? Looks like an additional row on each side vs GB202.
[attached die comparison]
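Taking the 288-SM count at face value, a rough comparison against full GB202 (192 SMs); the 128 FP32 lanes per SM is the Blackwell layout, and it's an assumption that Rubin keeps it:

```python
# If the die shot really shows 288 SMs, a quick comparison against full GB202.
# 128 FP32 lanes ("CUDA cores") per SM is the Blackwell figure; assuming Rubin
# keeps the same per-SM layout is speculation.
LANES_PER_SM = 128

def fp32_lanes(sm_count: int) -> int:
    """Total FP32 lanes for a given SM count."""
    return sm_count * LANES_PER_SM

rubin_guess, gb202_full = 288, 192
print(f"Rubin guess: {fp32_lanes(rubin_guess)} lanes")   # 36864
print(f"GB202 full:  {fp32_lanes(gb202_full)} lanes")    # 24576
print(f"Ratio: {rubin_guess / gb202_full:.2f}x")          # 1.50x
```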
 
I bet it won't prevent it from being the best selling Nvidia flagship, courtesy of some new exclusive feature that Nvidia's developers are secretly inserting into future games as we speak.
Features are whatever; it's just that SM scaling might be bad outside of doing GEMM.
 