Insert_Nickname
Diamond Member
Larry's Law - "Every GPU thread of sufficient length, will devolve into discussions about crypto..."
Sorry Larry, but I'm expropriating that one...
Already been defined.
OK, back on topic... when are they going to release these &*#@ things already?
Shortly after they fix all the driver bugs... so some time in 2032?
Are the drivers really THAT bad?
I mean, all they need to be able to do to have a winning card is to beat the price/performance numbers for a Radeon 6500 XT while not crashing every 30 minutes. That's a really LOW bar to hurdle... it can't be THAT hard, can it?
Intel's IGPs are mostly focused on media playback capabilities, so I think their dGPU game-ready drivers will take time to be on par with AMD's or NVIDIA's.
Intel has been making integrated graphics drivers for over a decade now, they should know what they're doing.
I'm curious too. What is nVidia's secret to getting GPU drivers right? They put their best programmers on it? Keep it at the top of their priority list? Or is it because if their drivers suck, then the whole company suffers because they are fundamentally a GPU company. I suppose that's the answer. AMD and Intel are first and foremost CPU companies so GPU has to compete with the elder CPU sibling for the parent's love, and gets neglected.
Remember, these hardware companies sometimes have to write substitute shader code when their hardware has a problem displaying the intended result. To jump into this from a relaxed position to full-on competition is a really, really big task. It's not just who's best, but how many jobs you can do simultaneously. I have a suspicion that the hardware is the easier of the two.
I have used both Nvidia and AMD GPUs over the years.
Intel has plenty of money, but driver quality just hasn't been a focus for them for a long time, which means that even if they spend the money now, there is a lot of ground to make up, and at some point more money doesn't speed things up.
"But lots of their 'driver issues' as perceived by people have been due to their hardware."
Super agree. Like for DG2, leaked benchmarks suggest that they probably blew a good portion of their transistor budget on maximizing the higher-precision compute power that is needed more in the data center than in gaming. They are trying to hit two birds with one stone. Consequently, their gaming performance may not be as good as if they had focused solely on gaming in their GPU's design.
Maybe that's just Raja's influence that they're making compute-oriented GPUs. I like gaming and data-center GPUs kept separate, like AMD did with RDNA and CDNA, and as Nvidia does with its GeForce and Quadro lines. It helps differentiate the product stacks and target audiences.
But aren't they separate, Xe-HPG and Xe-HPC?
"Jensen is turning his old leather jackets into leather whips and hits them if they do anything stupid! +100 boss imbued damage! xD"
I don't doubt he has a dungeon...
Jensen sees Nvidia as a software company first. Source from over a decade ago: