
Massive security hole in CPUs incoming? Official Meltdown/Spectre Discussion Thread

Page 40

richaron

Golden Member
Mar 27, 2012
1,349
328
136
Seems like all I can do is wait. Next gen Zen is coming soon, so if you want to go Ryzen, might as well wait, and Intel says fixed chips are coming "later this year", whatever that means.
Yeah, that would totes be appropriate if AMD Ryzen chips weren't already immune to the worst of the exploits. It's only Intel chips that have the worst bugs.

So since current Ryzen chips have dropped in price I can only assume you agree they are a good buy?
 

TempAcc99

Member
Aug 30, 2017
60
13
51
Is that why you think everybody is scrambling to patch operating systems, drivers and browsers? Just so they can watch how the patches have no significant impact on system security anyway?

Mozilla patched Firefox since they feared exploits could be found to take advantage of these vulnerabilities: just JS code from a compromised web server, no initial malware deployment necessary.
Impact on whom? Cloud providers and servers, yes. Consumers? Much less, exactly because browsers have been patched, and I use a script blocker anyway.

Perhaps we need an explanation of why home users need to worry about this?
Other than applying whatever patches come out, and maintaining our anti-virus software, I'm not sure what we can do about it anyway?

I'm 100% certain I will get nothing other than the MS W10 patch.

I will be shocked if I get a BIOS update out of ASUS.

I doubt I will get a microcode update, although that may have been with the MS W10 patch. I have no idea, really.
First off, security experts advise against using anti-virus software because these programs are attack vectors themselves. They are far from secure.

Most antivirus software is like giving yourself cancer so you don't catch a cold.
Why should home users not worry? Because there are thousands of simpler methods to get your money, directly or by stealing passwords. And money is all the hackers really care about: cast a wide, simple net and something will stick in it.

For example, read this story. Yeah, it's fictional, but it's plausible and doesn't require deep CPU architecture knowledge. It's pretty simple and effective. I mean, the guy that found Spectre had been working on this since 2005, i.e. 13 years. He is basically the sole expert on this.

And the main point applies: if you follow safe practices, you avoid getting malware in the first place. If I can install malware on your PC, I can install a key-logger like any kid could and get your passwords. Once you are at the point that someone can install malware / run their code on your system, all bets are off.

Yes, the fact that this could be done via JavaScript (browser) is a huge issue, but that has been fixed. To be affected by this you need the malware installed, and the chances of catching ransomware or a keylogger are just a lot higher.
 

goldstone77

Senior member
Dec 12, 2017
217
93
61
No, that mention of 2005 is simply about when speculative vulnerabilities in hardware, in general, were being tossed about. I don't think the statement suggests anything about that being related to Spectre- or Meltdown-type vulnerabilities. He didn't start looking at this type of vulnerability in 2005, just at the potential to use hardware design as an attack vector.
He has been working on security in this "area" since at least as far back as 1995. https://www.google.com/patents/US5899987
BACKGROUND OF THE INVENTION

Many operating systems (OS) today do not include a built in mechanism, called user exits, to divert program control from the operating system or related services to user supplied functions. In many programming instances it is desirable to divert or intercept system calls, issued from a user application, and execute user supplied code instead. The code supplied by the user might bypass the entire original system function call or it might perform a function in conjunction with the original system function call. One application of such a system might include a security system whereby operating system calls issued by a user application are not permitted to execute unless the calling process has the requisite authority or privileges.
 

zinfamous

No Lifer
Jul 12, 2006
102,187
16,416
136
Seems like all I can do is wait. Next gen Zen is coming soon, so if you want to go Ryzen, might as well wait, and Intel says fixed chips are coming "later this year", whatever that means.
They're going to dig out all of their NetBurst chips that were buried next to the E.T. video game cartridges, clean them off, put a "Core gen 9" sticker on them, and ship them out.
 

LTC8K6

Lifer
Mar 10, 2004
28,523
1,569
126
Yeah, that would totes be appropriate if AMD Ryzen chips weren't already immune to the worst of the exploits. It's only Intel chips that have the worst bugs.

So since current Ryzen chips have dropped in price I can only assume you agree they are a good buy?
No, if I am going to buy a new Ryzen system, I might as well wait a bit and get the next gen, and the new chipset, yes? It launches in March.

Edit: Looks like the 12nm Zen launch has moved to April, but that's still not much of a wait.
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,523
1,569
126
They're going to dig out all of their NetBurst chips that were buried next to the E.T. video game cartridges, clean them off, put a "Core gen 9" sticker on them, and ship them out.
Those ET games were dug up...

I don't know what Intel is going to do, but I can't wait to see the outcome. :D
 

DrMrLordX

Lifer
Apr 27, 2000
16,387
5,287
136
13 fps isn't great, though. If I had gone out to buy a top-of-the-line GPU to try to run my AAA games @ 144 Hz on a 1080p monitor, like they seemed to be doing in the test, losing 13 fps like that would have me pissed.

It looks like a lot of end users are going to try to justify to themselves running without the patch and/or microcode update while trying to convince themselves that their systems are secure.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,071
1,363
126
-9.4% ain't nothing. A 139 average would put you pretty safely above 120, whereas 126 means you will be bouncing well below 120 quite a bit.
Average FPS doesn't tell the whole picture. Without reviews showing minimum-FPS data, it's really hard to get a grasp on the performance hit at this time.
 

Roger Wilco

Senior member
Mar 20, 2017
352
106
86
Are there any game benchmark comparisons where single-core performance is almost always the bottleneck, regardless of multi-core or GPU performance?

For example:
Cities Skylines
Planet Coaster
Starcraft 2
Total War (anything in the last decade)
MMOs in general
 

LTC8K6

Lifer
Mar 10, 2004
28,523
1,569
126
13 fps isn't great, though. If I had gone out to buy a top-of-the-line GPU to try to run my AAA games @ 144 Hz on a 1080p monitor, like they seemed to be doing in the test, losing 13 fps like that would have me pissed.

It looks like a lot of end users are going to try to justify to themselves running without the patch and/or microcode update while trying to convince themselves that their systems are secure.
If you left a little overclock in reserve, you can probably get that 10% back.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,071
1,363
126
If you left a little overclock in reserve, you can probably get that 10% back.
I guess. But with the thermal dynamics of Intel's latest and greatest CPUs, it might not really be cost-effective in the end. Many overclocks, I'd imagine, are set by one's thermal threshold for pain, lack of nerve to delid, funds, etc.
 

realibrad

Lifer
Oct 18, 2013
12,337
894
126
Average FPS doesn't tell the whole picture. Without reviews showing minimum-FPS data, it's really hard to get a grasp on the performance hit at this time.
I really doubt that the mins got better given the issue. I would imagine the mins got worse if anything.
 

LTC8K6

Lifer
Mar 10, 2004
28,523
1,569
126
I guess. But with the thermal dynamics of Intel's latest and greatest CPUs, it might not really be cost-effective in the end. Many overclocks, I'd imagine, are set by one's thermal threshold for pain, lack of nerve to delid, funds, etc.
Then again, if a 10% CPU loss drops you out of the comfort zone, then you probably needed a faster CPU/system anyway.
 

SPBHM

Diamond Member
Sep 12, 2012
4,961
335
126
13 fps isn't great, though. If I had gone out to buy a top-of-the-line GPU to try to run my AAA games @ 144 Hz on a 1080p monitor, like they seemed to be doing in the test, losing 13 fps like that would have me pissed.

It looks like a lot of end users are going to try to justify to themselves running without the patch and/or microcode update while trying to convince themselves that their systems are secure.
13 FPS in isolation sounds like more than it actually is. It's a 10% drop with both fixes applied, which is significant, but this is the worst case they managed to find, and average/min FPS are still really high. On average the loss is a lot less significant, under 5%.
 

Stuka87

Diamond Member
Dec 10, 2010
5,035
781
126

24601

Golden Member
Jun 10, 2007
1,683
38
86
Is your compute workload bottle-necked by swapping data constantly between your system RAM and video RAM?

If no, then it has 0 effect on any consumer workload.

The "hit" in that Witcher benchmark is due to streaming data from RAM to VRAM constantly, which thrashes the cache.

People also need to be reminded that performance hits will affect every single processor ever made that has out-of-order processing capability.
 

realibrad

Lifer
Oct 18, 2013
12,337
894
126
Is your compute workload bottle-necked by swapping data constantly between your system RAM and video RAM?

If no, then it has 0 effect on any consumer workload.

The "hit" in that Witcher benchmark is due to streaming data from RAM to VRAM constantly, which thrashes the cache.

People also need to be reminded that performance hits will affect every single processor ever made that has out-of-order processing capability.
Are you saying that the patch that is slowing things down is for both Meltdown and Spectre?
 

24601

Golden Member
Jun 10, 2007
1,683
38
86
Are you saying that the patch that is slowing things down is for both Meltdown and Spectre?
That's what they are saying, from what I can tell. I obviously am not an expert on the subject (apparently literally only one expert exists on this subject in the entire world).
 

traderjay

Member
Sep 24, 2015
197
142
116
Is your compute workload bottle-necked by swapping data constantly between your system RAM and video RAM?

If no, then it has 0 effect on any consumer workload.

The "hit" in that Witcher benchmark is due to streaming data from RAM to VRAM constantly, which thrashes the cache.

People also need to be reminded that performance hits will affect every single processor ever made that has out-of-order processing capability.
Yes, my compute work is heavily dependent on RAM and VRAM. In machine vision applications, everything resides in VRAM.
 

goldstone77

Senior member
Dec 12, 2017
217
93
61
Is your compute workload bottle-necked by swapping data constantly between your system RAM and video RAM?

If no, then it has 0 effect on any consumer workload.

The "hit" in that Witcher benchmark is due to streaming data from RAM to VRAM constantly, which thrashes the cache.

People also need to be reminded that performance hits will affect every single processor ever made that has out-of-order processing capability.
It will only affect those receiving patches for both the OS and microcode (BIOS), for both Meltdown and Spectre! Ryzen/Epyc is excluded from Meltdown. Intel has no fix for variant 2 of Spectre on older systems, which can only be "fixed" by microcode. There will be a lot of systems with holes out there! Also, no reviews on this yet, but I think it's likely streamers will be affected by this.

Edit: also, since Nvidia claims to be affected, I wonder what the ramifications of that will be.
 
