
Zen 6 Speculation Thread

Page 397 of the Zen 6 Speculation Thread on the AnandTech forums.
You're seriously underestimating the cost of design, and even more the cost of validation, and the impact on how you can let an ISA evolve with such 50-year-old dead weight.
It's small potatoes legacy stuff - it's already been done, no new changes are made, no way this increases complexity by more than 1%, which is a fine cost to pay for 100% backwards compatibility rather than anything less.
 
It's small potatoes legacy stuff - it's already been done, no new changes are made, no way this increases complexity by more than 1%, which is a fine cost to pay for 100% backwards compatibility rather than anything less.
It's never done, believe me 🙂 Any change to anything that is shared by obsolete paths and more modern paths will require validation. These things don't live in complete isolation from the rest of the CPU; a trivial example is the sharing of the memory hierarchy.

Out of curiosity, do you really run 16-bit code?
 
It's never done, believe me 🙂 Any change to anything that is shared by obsolete paths and more modern paths will require validation. These things don't live in complete isolation from the rest of the CPU; a trivial example is the sharing of the memory hierarchy.

Out of curiosity, do you really run 16-bit code?
Awww. The good old days!

void (far *funcPtr)(void);  /* a 16-bit far pointer: segment:offset, pre-Win32 */

Haven't seen or used it for maybe 20 years. I doubt that 16-bit is needed; however, 32-bit is a VERY different story, as are a number of extensions.

Backwards compatibility is definitely a real thing in PCs.
 
I doubt that 16-bit is needed; however, 32-bit is a VERY different story, as are a number of extensions.

Backwards compatibility is definitely a real thing in PCs.
I agree that killing 32-bit is premature. My point is that the much-praised compatibility is an illusion. As I previously wrote, running >20-year-old 32-bit programs is much easier through emulation than by tweaking a modern Windows and hunting down the needed DLLs, which might fail to work anyway. It took me less than an hour to build from scratch an 86Box setup that could run a game requiring Glide (for 3dfx Voodoo), and I had never used 86Box before. Back when I had a machine with Windows 7, I never succeeded at that (that might be me, of course, but I couldn't find any relevant help to get around my issues).

Are all current Windows apps 64-bit now?

This is getting way off-topic and the rest of my thoughts belong more to the iBOT thread 🙂
 
no they ain't.
You'll struggle to find non-amd64 builds anywhere now (outside of older video games; curse you, Total War Attila).
Mmm, surprising. I really thought most apps were still stupidly compiled for ancient targets (386, to exaggerate 🤣).
That’s good news.

We’re still obliged to use the 32-bit version of 365 ’cos a stupidly old work-related app is based on Access with a compiled module. It’s impossible to run it with 64-bit Access.
Yes, that’s prehistoric (both Access and 32-bit).

IMO a lot of niche software used in industry, and certainly what’s used in schools (industrial sections), is so old that it requires both 32-bit and Windows 7 so as not to break. I tend to laugh, but it really isn’t that funny …
 
Mmm, surprising. I really thought most apps were still stupidly compiled for ancient targets (386, to exaggerate 🤣).
That’s good news.

We’re still obliged to use the 32-bit version of 365 ’cos a stupidly old work-related app is based on Access with a compiled module. It’s impossible to run it with 64-bit Access.
Yes, that’s prehistoric (both Access and 32-bit).

IMO a lot of niche software used in industry, and certainly what’s used in schools (industrial sections), is so old that it requires both 32-bit and Windows 7 so as not to break. I tend to laugh, but it really isn’t that funny …
Well, we have a customer where one production line is controlled by at most a 75 MHz Pentium (and DOS, of course). Any faster than 75 MHz and the good old compiled Pascal code goes crazy 😅
 
It's never done, believe me 🙂 Any change to anything that is shared by obsolete paths and more modern paths will require validation. These things don't live in complete isolation from the rest of the CPU; a trivial example is the sharing of the memory hierarchy.

Out of curiosity, do you really run 16-bit code?
I think 16-bit usermode has been deprecated since Zen 1; it's only supported for legacy boot stuff.
 
Intel disabled the loop stream detector on Skylake & sons because it crashed when using partial registers (AH, BH, CH, DH) with Hyper-Threading.
That's such an obscure thing to validate that it's understandable Intel didn't catch it (anyone who lived through the Pentium Pro learned to avoid partial registers like the plague), yet it still managed to affect someone.
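For anyone who never hit it: a write to AH only touches bits 8..15 of RAX, so a later full-width read forces the core to merge the one-byte write with the stale upper bits. A minimal sketch of that pattern (GCC/Clang inline asm, x86-64 only; `partial_reg_merge` is just an illustrative name, not anything from a real codebase):

```c
#include <stdint.h>

/* Write AH (a partial register), then read the full RAX.
 * The full-width read right after the 1-byte write is exactly the
 * merge case that partial-register renaming has to get right. */
static uint64_t partial_reg_merge(uint64_t x) {
    __asm__ volatile(
        "movb $0x12, %%ah\n\t"  /* partial write: only bits 8..15 of RAX */
        "addq $0, %%rax\n\t"    /* full-width read: forces the merge */
        : "+a"(x)
        :
        : "cc");
    return x;
}
```

Calling `partial_reg_merge(0)` gives 0x1200: the AH write landed in bits 8..15 and every other bit survived the merge.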
 
Haven't seen or used it for maybe 20 years. I doubt that 16bit is needed; however, 32 bit is a VERY different story as are a number of extensions.
The big problem for 16-bit was InstallShield. It's a proprietary install packager that half the world used for most of the 32-bit Windows generation, because it was cheap, unobtrusive and worked well. It's mostly 32-bit, but installers created with old versions of it start with a 16-bit loader that checks whether you are running in 32-bit mode and, if not, exits with an error message telling you that a 32-bit OS is needed. This became a huge problem when 16-bit support was dropped, because no one was updating their installer-creation program, and so something like half of all 32-bit software could not be installed because of it. Microsoft fixed this before 64-bit Windows became common by binary-patching any InstallShield installers it could detect to jump over the 16-bit code.

This then became a problem because some people modded the installers for programs they created to strip off all InstallShield branding, and in the process made them undetectable to Windows. So now every copy of Windows running anywhere has a database of some '90s and early '00s programs, with entry points to jump to so the 16-bit parts are skipped. That's the cost of compatibility.
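The detection itself doesn't need the database, by the way: the 16-bit stub is visible right in the file. An EXE starts with an MZ DOS header whose little-endian dword at offset 0x3C points at the "new" header, which begins with `NE` for a 16-bit New Executable (the old InstallShield stubs) and `PE\0\0` for 32/64-bit images. A hedged sketch of that signature check (this follows the documented header layout, not Microsoft's actual shim code; `exe_kind` is a made-up name):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Classify an executable image by its headers: "NE" = 16-bit New
 * Executable, "PE" = 32/64-bit Portable Executable, "MZ" = plain DOS,
 * "??" = not a recognizable EXE. */
static const char *exe_kind(const unsigned char *img, size_t len) {
    if (len < 0x40 || img[0] != 'M' || img[1] != 'Z')
        return "??";
    /* e_lfanew: little-endian offset of the "new" header, stored at 0x3C */
    uint32_t e_lfanew = (uint32_t)img[0x3C]
                      | ((uint32_t)img[0x3D] << 8)
                      | ((uint32_t)img[0x3E] << 16)
                      | ((uint32_t)img[0x3F] << 24);
    if (e_lfanew > len - 4)
        return "MZ";                      /* no new header: plain DOS program */
    if (img[e_lfanew] == 'N' && img[e_lfanew + 1] == 'E')
        return "NE";                      /* 16-bit stub: the problem case */
    if (memcmp(img + e_lfanew, "PE\0\0", 4) == 0)
        return "PE";
    return "MZ";
}
```

A loader (or a compat shim) only has to read a handful of bytes to know whether a 16-bit stub is in front of it.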
 
So now every copy of Windows running anywhere has a database of some '90s and early '00s programs, with entry points to jump to so the 16-bit parts are skipped. That's the cost of compatibility.
Meanwhile, this is what it takes to get the Windows version of SimCity 2000 to run in the modern day.


I honestly don't imagine there is much going on in terms of compatibility these days anyway. Everything old that people want to run needs some form of patch regardless. The real compatibility comes from the registry and other underlying systems being close enough that software can be fixed to run relatively quickly, without even needing source code. If community fixes weren't possible, IMO Windows would have been dead 10 years ago.
 
I agree that killing 32-bit is premature. My point is that the much-praised compatibility is an illusion. As I previously wrote, running >20-year-old 32-bit programs is much easier through emulation than by tweaking a modern Windows and hunting down the needed DLLs, which might fail to work anyway. It took me less than an hour to build from scratch an 86Box setup that could run a game requiring Glide (for 3dfx Voodoo), and I had never used 86Box before. Back when I had a machine with Windows 7, I never succeeded at that (that might be me, of course, but I couldn't find any relevant help to get around my issues).

Are all current Windows apps 64-bit now?

This is getting way off-topic and the rest of my thoughts belong more to the iBOT thread 🙂

All new games are 64-bit; a lot of old games are 32-bit or DOS (16-bit).

Most new productivity software and browsers are 64-bit.
 
But could it be translated? Of course it can. That’s the future of computing. It’s foolish to keep native 32-bit support forever.
And get killed in reviews that will point out that classics like Borderlands 2 run slower? There is very little to gain from such non-trivial changes, and a lot to lose. There is plenty of modern 32-bit code around; Steam only recently got a 64-bit version, some 20 years after amd64 came out!
 
And get killed in reviews that will point out that classics like Borderlands 2 run slower? There is very little to gain from such non-trivial changes, and a lot to lose. There is plenty of modern 32-bit code around; Steam only recently got a 64-bit version, some 20 years after amd64 came out!

I think the unstated assumption is that dropping native 32-bit support would allow "new ISA" 64-bit code to run faster. I don't know what x86 could gain from a clean 64-bit-only design (including repurposing the 16- and 32-bit prefixes to improve code density for code compiled to the new ISA, while still supporting the current, less dense 64-bit encoding) that relied on Rosetta 2-style static translation for 32-bit code.

I would guess low single digits, which wouldn't be enough to make it all worth it. I think the worries about 32 bit code would likely be unfounded. Is anyone going to care if Borderlands 2 runs a few percent slower due to the translation, given that it would still be running much faster than it ever ran on contemporary CPUs? And I'm not sure there would be ANY penalty for running "translated" 32 bit code, if the translation worked as well as Rosetta 2's does.

Given a choice between a CPU that runs 64-bit code, say, 3% faster but 32-bit code 3% slower, and one with full 32/64-bit support that forgoes the 3% gain but avoids the 3% loss, are people really going to pick the native-32-bit one just to avoid that 3% hit? Is there no such thing as "fast enough" for gaming, especially on decade-old titles that are probably already at crazy fps with modern GPUs?

A bigger worry, I think, is that the translation might break anti-cheat-type mechanisms. Losing a few percent of performance in older titles should be a non-issue, but having them not run at all would be a different matter.
 
We’re still obliged to use the 32-bit version of 365 ’cos a stupidly old work-related app is based on Access with a compiled module. It’s impossible to run it with 64-bit Access.
Yes, that’s prehistoric (both Access and 32-bit).
Just spent 2 hours trying to get Office 365 to load on my desktop computer (I'd been using LibreOffice as a workaround). After a metric CRAP TON of investigation into why the installer just up and BLINKS OFF with no log at all to tell you why, I eventually installed the MS installer, made an install config.xml, and tried to load it that way.

FINALLY got a message indicating that a couple of old Access database installs were the problem. Got rid of them and everything is fine.

Still, it is a reminder of how important backwards compatibility is... and of how incompetent MS is with its installers! Can't believe such an easy thing took someone like ME so long to figure out. I suspect the average computer user would have given up long before.
 
Can't believe such an easy thing took someone like ME so long to figure out. I suspect the average computer user would have given up long before.
That's not an easy thing and I definitely would be one of those who would've quit. Time is too precious to waste on Microsoft stuff. I haven't even tried Office 21 and 24 to see if it's better or faster. No point. The moment they introduced their subscription based Office 365, that was the end of any interest in Office. Not interested in paying a yearly tax and not interested in paying for the perpetual licensed product that will definitely be inferior to the subscription based one.
 
That's not an easy thing and I definitely would be one of those who would've quit. Time is too precious to waste on Microsoft stuff. I haven't even tried Office 21 and 24 to see if it's better or faster. No point. The moment they introduced their subscription based Office 365, that was the end of any interest in Office. Not interested in paying a yearly tax and not interested in paying for the perpetual licensed product that will definitely be inferior to the subscription based one.

I don't think the subscription price is terrible for me, for example, since I use it extensively at work.
 
I would guess low single digits, which wouldn't be enough to make it all worth it.
It wouldn't even be that. The reason AArch64 is faster than AArch32 is that they completely threw out the old encoding and all the warts of the ISA and redid it as a modern RISC. AMD did not do that when they designed AMD64. Because of that, just dropping 32-bit support gives you 0% more speed on x86: the existence of the 32-bit stuff doesn't really hurt you when you're not using it. What hurts you is that the 64-bit ISA still uses the same sucky instruction encoding, expanded from the original 8086 design, just with more prefixes.
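Concretely (and assuming I've remembered the bytes right): the 64-bit form of an instruction is usually just the 32-bit form with a REX prefix bolted on the front. `add eax, ebx` encodes as `01 D8`; `add rax, rbx` is `48 01 D8`, i.e. a REX.W byte plus the identical opcode and ModRM. A tiny self-check of that claim (`is_prefixed_form` is a made-up helper name):

```c
#include <stddef.h>
#include <string.h>

/* Hand-assembled x86 machine code for the same logical operation:
 *   add eax, ebx  ->  01 D8     (opcode 01, ModRM D8)
 *   add rax, rbx  ->  48 01 D8  (REX.W prefix + the same two bytes) */
static const unsigned char add32[] = { 0x01, 0xD8 };
static const unsigned char add64[] = { 0x48, 0x01, 0xD8 };

/* True when the wide form is the narrow form behind a single prefix byte. */
static int is_prefixed_form(const unsigned char *wide, size_t wlen,
                            const unsigned char *narrow, size_t nlen) {
    return wlen == nlen + 1 && memcmp(wide + 1, narrow, nlen) == 0;
}
```

So going 64-bit didn't buy x86 a cleaner encoding; it just spent another prefix byte. AArch64, by contrast, is a from-scratch fixed-width encoding.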
 
Given a choice between a CPU that runs 64-bit code, say, 3% faster but 32-bit code 3% slower
3% extra isn't worth the risks of dropping even the 16-bit stuff, never mind 32-bit, which is still very actively used. MAYBE 30% would have been justifiable, but there is no way dropping that stuff gains you more than very little: internally it must all map onto existing hardware anyway, and a simple remap can't be very expensive.

The main thing you'd gain is a bunch of short opcodes that could be repurposed for new stuff, but is decoding really the biggest problem right now?

Anyway, it (dropping the 32-bit stuff) is all academic - this ain't going to happen in x86, like EVER.
 