
Windows 64 bit OS security question--speculation.

oop. hit the go button too early...

The only thing that would hold back an internally known exploit is limited resources. It may get delayed while something with higher priority gets done (a known, likely-to-be-exploited bug, for instance), but it will be fixed as quickly as possible and immediately released. I suppose poor timing might burn the PR guys, but that's what they get paid to deal with.
 
In this Washington Post article (7th paragraph) Microsoft chief executive Steven A. Ballmer said that,

"since most virus and worm attacks come only after vulnerabilities have been disclosed by the company or by security researchers, Microsoft is working with computer-security firms to make sure that they do not announce vulnerabilities before Microsoft has designed a fix." [and chooses to release it.]

and (8th paragraph):

"I wish those people just would be quiet," he said of computer researchers who publish vulnerabilities in Microsoft's products. "It would be best for the world."
[Thus, QED, MS will not announce or release patches to previously unknown <outside MS> security holes.]

In Ballmer's light, I think it logically follows that MS's Public Relations/Marketing are, indeed, driving the release and timing of security-related information according to their own strategy and reasoning.

What is best for the world? How's that for a thread title!
 
I think you're reading what you want into it:

Your interpretation: "In Ballmer's light, I think it logically follows that MS's Public Relations/Marketing are, indeed, driving the release and timing of security-related information according to their own strategy and reasoning"

Based on:
"Microsoft is working with computer-security firms to make sure that they do not announce vulnerabilities before Microsoft has designed a fix." [and chooses to release it.]

Note the [and chooses to release it.] is not in the quote but your own wording in an attempt to slant the interpretation.

I read it as "hey guys, STFU about it until our programmers get this plugged or you're going to make a big mess out there". They will release the fix 1.) As soon as it's done and 2.) When they are as sure as they can be it won't break something else.


 
Originally posted by: Nothinman
It's not true. Promise.

The beauty of closed development =) "We really, really promise that's not true!"

Imagine in your head how a process might happen internally for something like this: a report of a problem comes in, or someone tasked with finding them discovers one. Some manager evaluates it and drops it into the workload or "to do" list. Some team or an individual starts typing away at it. yada yada - marketing just doesn't fit into the picture. I'm sure they coordinate when it comes to projects and timetables for new products, but fixing stuff? Bah. That stuff is all handled by geeks, not suits.

But the release of security announcements probably goes through PR just like all press releases; it's possible they could have a quota of non-critical releases. If there's a known exploit or even a post on BugTraq they probably consider the release a must, but I could definitely see them holding back on things they found internally that aren't known outside yet to help their image.

It's not so much an image thing as it is a resource thing.

MS would love to have a perfect product. Really. Conspiracy theories aside, that's what MS wants.

But there are only so many hours in a day, and only so many developers. Bugs get triaged. The ones with the greatest impact are fixed first.
 
To reply to the original question:

Buffer overflow exploits are largely a result of the x86 architecture, where local variables are allocated on the stack right next to the function's saved return address. Sending huge amounts of data can overwrite that return address, which allows a clever hacker to point the return address at his code and run it in the context of the currently running process.

I believe AMD's 64 bit architecture has the same design.

I believe that Itanium allocates local variables separately from the thread's context record and therefore stack overflow exploits are not possible. I believe the same is true of the Alpha chip.

I may, of course, be wrong.
 
Originally posted by: Smilin
If your point was that Nothinman is a pleasant and intelligent guy who's fun to have a conversation with, and that you're a complete freak, you've made it
:rolleyes:

Uh huh, and you can go on thinking that Microsoft has your best interests at heart. That's why in every agreement you sign, you agree that Microsoft is free from any responsibility for any financial losses from using their products.

Get a grip. It's all marketing. Everything. That's the whole point of the dog and pony show. You can go on quoting Ballmer and think that maybe you can glean some sort of truth or insight into the internal workings of Microsoft.

You have to realise that it's one big commercial and everything they say has been carefully rehearsed and rehashed so that any sort of interpretation of their comments will show them in a good light.

Sorry, I go by history and what the company has done in the past. Apart from a few press releases, MS hasn't shown any difference in their behavior.

I tried to explain myself and if you think that is so bad, sorry for you.
 
Originally posted by: Nothinman


But the release of security announcements probably goes through PR just like all press releases; it's possible they could have a quota of non-critical releases. If there's a known exploit or even a post on BugTraq they probably consider the release a must, but I could definitely see them holding back on things they found internally that aren't known outside yet to help their image.
I don't want to get involved in a flamewar, but I thought I could shed some light on this debate just because I happen to work at said company, in the OS division, as a person who writes code.

The fixes for critical vulnerabilities that we are made aware of (whether from external reporting, or internal testing) are usually posted to the latest Windows source tree within hours. Security vulnerabilities are top priority, and they simply do not exist in our bug tracking database for long because they get everyone's attention. Many are resolved before I even notice them. I can't say for sure that all of them are fixed that quickly, but from what I've seen it's a pretty good estimate.

Sometimes these fixes are made easier by fantastic customers who have narrowed down the problem to such an extent that a single e-mail isolates the flaw. These customers are some of the best people around, because they have not only taken it upon themselves to notify us of the flaw, but have agreed to work with Microsoft to help solve the problem, rather than going public before a patch is ready to get publicity. It warms my heart immensely to know that good people like this still exist on our planet. Whenever I am saddened by news that some twisted hacker has caused $billions of damage by releasing an exploit, I try to think of how many more attacks have been thwarted by these good souls.

Unfortunately, fixing the vulnerability is the easy part. The much harder problem is solving how to distribute the fix to the customer in the most painless way possible. Hundreds of factors come into play. Take a few hypothetical scenarios:

Is the vulnerability susceptible to a remotely exploitable elevation-of-privilege attack? Is an exploit for this vulnerability already public? If so, such a flaw would probably warrant an immediate critical update patch. Such patches are the most painful to customers because they could be announced at any moment, and they must be installed immediately to prevent further outbreak.

Or perhaps the vulnerability was reported by a concerned customer through Product Support Services. If the customer kept the vulnerability confidential and there is no known public exploit for it, then MS has more time to make the patches less painful to customers. One way to do this is to periodically release several of these patches in a bundle, to ease the burden on system administrators, who are obviously least efficient when applying patches one at a time on irregular schedules.

Among the other legion factors in patch release management is regression testing: MS wants to do its best to make sure that these patches, when installed, will not break software components unnecessarily. When you consider the sheer number of possible combinations of OS version (9x, Me, NT4, NT4 Server, 2k, 2k Server, 2k AS, XP, 2003, etc), hardware configuration (cpu, video, ram, etc), and software configuration (drivers, system utilities, anti-virus, firewall, etc), you can imagine how difficult regression testing can be. Nevertheless, such tests are extremely important to ensure these patches meet the quality bar.

I could go on, but I think you get the idea. Obviously I don't know everything that happens inside MS. But from what I've seen, MS makes every effort possible to address critical vulnerabilities in a timely fashion. Most of the patches you've seen recently are entirely pre-emptive (i.e., there was no public knowledge of the vulnerability before the Microsoft announcement). And I know of no critical vulnerability that we are refusing to patch "because of the bad PR it might cause."

And please, please don't take this post the wrong way: I am not intending to upset anyone. I'm just trying to shed light on the issue from the MS perspective.

Just to be clear lest lawyers somehow twist my words: this post represents my own opinion, and in no way claims to represent the views of Microsoft Corporation. It should not be construed as an official statement or press release in any way, shape, or fashion.
 
Originally posted by: kylef
And please, please don't take this post the wrong way: I am not intending to upset anyone. I'm just trying to shed light on the issue from the MS perspective.

🙂

 
Look, you may be a little correct. Seems like MS has most everything covered.

There were only a few serious vulnerabilities lying around that remain unpatched.

Windows RPC Universal Exploit
I am not sure if that has been patched yet or not. I don't think it has. It has a patch, but I am a little confused as to whether or not this is a workaround to that patch, and I don't have a Windows 2000 machine with me right now to test it out on. 🙂

Proof of Concept for Windows Messenger Service Overflow

I know this hasn't been patched. Edit: (oops it had, I R stopid)

MSMQ Heap Overflow (Exploit)

You have to upgrade to Service Pack 4 to be safe from that one in Windows 2000. So I guess that is considered patched through that.

IE 6 XML Patch Bypass

This one is kinda interesting. It has been patched against the vulnerabilities in the XML stuff, but it is still vulnerable through the same exploit using JavaScript.


All in all I am surprised to find so few new exploits that haven't been patched yet. I suppose I could have found more, but I don't have all night to look for them. 😉

All in all it's pretty decent for Windows, as long as you keep it updated a couple of times every week.
 
I got bored, so sue me.

Here's one that hasn't been patched.

Listen, don't get me wrong here. I am not trying to bash Microsoft or anything. EVERY company is guilty of this from time to time. It's human nature to try to hide your mistakes, and when there are millions of dollars and your company's reputation on the line, you're going to make bad judgements from time to time.

Just be careful, that's all. It's politics.
 
Originally posted by: drag
I got bored, so sue me.

Here's one that hasn't been patched.

Isn't that the same issue as addressed here

I also disagree about it being politics; politicians change their stance on a regular basis in reaction and response to what is perceived as outside demands.

Businesses follow a consistent pattern of behavior that attempts to cover all potential opportunities, until they either find a new area of opportunities or can be convinced that a change in behavior is more beneficial than their current course of action.

If you want to treat it as wholly political, you have the right to your opinion. But the rest of us don't see the situation as quite as black and white as your comments seem to paint it.




 
Originally posted by: franguinho
as far as i know longhorn has been coded from scratch but i could be wrong

Last I heard, about a year ago, Longhorn is still using the NT kernel. And we won't be seeing a new kernel from Microsoft till 2008-2010.
 
Last I heard, about a year ago, Longhorn is still using the NT kernel. And we won't be seeing a new kernel from Microsoft till 2008-2010.

I doubt we'll ever see a completely new kernel; it would be a waste to rewrite all that code for no good reason.
 
Originally posted by: Nothinman
Last I heard, about a year ago, Longhorn is still using the NT kernel. And we won't be seeing a new kernel from Microsoft till 2008-2010.

I doubt we'll ever see a completely new kernel; it would be a waste to rewrite all that code for no good reason.


Well, I'm not sure exactly how Windows is designed (kernel a complete standalone piece and everything plugs into it, sort of deal, or if the OS is one big program), but currently there is no way to really secure Windows the way we would like it to be. It's too big and too complex. They can't just keep building on what's there; it really has to be redesigned. Although I'm sure much of the code could be reused, since a lot of the NT kernel itself is pretty sweet.
 
but currently there is no way to really secure Windows the way we would like it to be. It's too big and too complex

The problem isn't the size; the problem is the fact that everyone at MS is a local administrator on their machine, so MS assumes everyone will be running as local admin and things don't get designed to work with lowered privileges.

And certain things were just designed with no flexibility in mind. For example, you can't enable the SMB server and bind it to only one IP; it listens on all or none.
 
kernel a complete standalone piece and everything plugs into it, sort of deal, or if the OS is one big program

The NT kernel is a microkernel-style design (strictly speaking, a hybrid).

Unlike Linux, which is a monolithic kernel.

The difference is that everything is compiled into a monolithic kernel: the drivers, system calls, blah blah. Everything is integrated into the kernel, although Linux is getting more and more modular. Everything operates in kernel code. This is the traditional method. The GPL licence aside, Linux is very conservatively designed.

The NT kernel is designed just to handle system calls. When a driver wants to talk to the hardware, it has its own address space and sends messages to the kernel, and the kernel then sends that message on to the module for that hardware.

Basically, the microkernel design is supposed to be superior from a technical standpoint. It makes it easier to utilize new hardware, and a bad driver isn't supposed to directly affect the kernel (i.e., cause a kernel panic). If a driver compiled into Linux is badly designed or has a flaw, it can cause a crash and a kernel panic.

The major disadvantages of the microkernel design are speed and size (memory footprint). Memory isn't much of an issue anymore, and as software technology improves, microkernels go faster and faster. But then again, so do monolithic kernels.

But most of that is academic.


edit: Another example of a microkernel is the Mach kernel used by Apple in OS X.
 
Originally posted by: CQuinn
Originally posted by: drag
I got bored, so sue me.

Here's one that hasn't been patched.

Isn't that the same issue as addressed here

I also disagree about it being politics; politicians change their stance on a regular basis in reaction and response to what is perceived as outside demands.

Businesses follow a consistent pattern of behavior that attempts to cover all potential opportunities, until they either find a new area of opportunities or can be convinced that a change in behavior is more beneficial than their current course of action.

If you want to treat it as wholly political, you have the right to your opinion. But the rest of us don't see the situation as quite as black and white as your comments seem to paint it.



If you don't think that politics exists much in the business world, then I have to suspect that you don't have much experience dealing with upper management. (No insult intended. I do try to listen to others, even if I am crass at times.)
 
Originally posted by: Nothinman
but currently there is no way to really secure Windows the way we would like it to be. It's too big and too complex

The problem isn't the size; the problem is the fact that everyone at MS is a local administrator on their machine, so MS assumes everyone will be running as local admin and things don't get designed to work with lowered privileges.

And certain things were just designed with no flexibility in mind. For example, you can't enable the SMB server and bind it to only one IP; it listens on all or none.

The problem with Windows as I see it is that it's much too interrelated. Makes it easier to use to a certain extent, but much harder to make work properly and do new and weird things.

To understand, you have to look (I am sure that you understand this, Noth) at something like the TCP/IP protocol stack, which most of us are at least somewhat familiar with. TCP/IP was created by the same people that developed Unix (the BSD guys, actually) and follows along the same lines.

Each layer of the protocol has a specific job to do. It accepts input from the layers above and below it, and processes the data. Then it sends the processed data to the next layer. All each layer cares about in relation to the others is making sure the inputs and outputs follow the standards.

This layered idea, where one program/tool does one thing and does it well, creates a situation where you don't have to care what the hell any other program does, as long as your input/output is what it is supposed to be. Every program is a filter, sort of thing.

Ideally, the internet browser (for example) doesn't have to give a damn about what sort of hardware it runs on, or whether you're on dial-up or going over a LAN. It's not its job. Porting it from PowerPC to AMD64 to Alpha doesn't matter (as long as it compiles on that platform). Whether it is run from a local hard drive, an NFS-mounted partition, or over the network running on X is immaterial. It doesn't care. It doesn't matter. It's not its job to know. Only to work.

If I patch the Mozilla browser, I don't have to worry about breaking the e-mailer or screwing up the desktop icons. If I patch the lower parts of the TCP/IP protocol stack, I don't have to worry about breaking the Samba daemon. It doesn't matter. It doesn't care. So on and so forth. If it worked before, and is designed properly, it will work no matter what.

Compare that to Explorer. If it breaks, you can't access system configuration, files, or open programs. Everything is broken and the computer is useless. It is embedded deep into the system and provides ways for normal users to access and control some aspects of the root system. To do this it has to have the ability to act as an administrator.

The security is on the top layer. It decides what you have rights to access on the fly, depending on a complex set of system configurations and circumstances.

Most MS programs are designed like that.

If I design a virus to break Explorer or Outlook, I can have a regular user run it, and if it succeeds I have the ability to access the system configuration.

That's how most e-mail viruses and worms work. Linux, by its design, is mostly immune to this sort of attack, and that's generally why it's not a problem.

If a user is stupid enough to open an e-mail virus in Linux (and if there were one active in the wild), then the solution would be "rm -r /home/luser". Problem solved.

I am not saying that Linux is a god-like OS or anything. It's just different, and different sorts of attacks and cracker strategies are needed to break it.
 
Originally posted by: CQuinn
Originally posted by: drag
I got bored, so sue me.

Here's one that hasn't been patched.

Isn't that the same issue as addressed here

No, I don't think it is. Microsoft tried several times to fix the RPC stuff; they have like 3 or 4 different patches for it, and people just keep on finding new ways to break it.

The Microsoft bulletin addresses a buffer overrun used to run arbitrary code on the server. VERY SERIOUS.
The one described in my link is a race condition used to create a DoS attack. Not so serious, but it can make your computer vulnerable to other types of attacks.
 
If I patch the Mozilla browser, I don't have to worry about breaking the e-mailer or screwing up the desktop icons.

Technically you do because the Mozilla mail client uses parts of the Mozilla browser. But as long as you don't touch how the interfaces work and your patch isn't broken, it should be fine and your point is valid.

Compare that to Explorer. If it breaks, you can't access system configuration, files, or open programs. Everything is broken and the computer is useless. It is embedded deep into the system and provides ways for normal users to access and control some aspects of the root system. To do this it has to have the ability to act as an administrator.

You can still do all of those things, you just have to know how. For instance, you can launch control panels from the command line if somehow your shell got broken. And IE runs with the same credentials as the user (although this is normally admin because that's the default), so it can only do damage to things the user already has access to. If some other program that runs as administrator or SYSTEM uses MSHTML for something and breaks, that's not necessarily IE's fault.

If I design a virus to break Explorer or Outlook, I can have a regular user run it, and if it succeeds I have the ability to access the system configuration.

Again, only if the user is already a local administrator or you use something like the RPC exploit to elevate your privileges.
 
Originally posted by: drag
I got bored, so sue me.

Here's one that hasn't been patched.

Listen, don't get me wrong here. I am not trying to bash Microsoft or anything. EVERY company is guilty of this from time to time. It's human nature to try to hide your mistakes, and when there are millions of dollars and your company's reputation on the line, you're going to make bad judgements from time to time.

Just be careful, that's all. It's politics.

Did you read the discussion? It's a denial of service attack.

Yeah, it should be fixed, but it doesn't allow remote code execution.
 
Originally posted by: ZapZilla
I mean, if 32-bit stuff has lots of security problems, then won't Win-64 and 64-bit apps offer a larger "bitscape" to plunder, resulting in even more horrendous security nightmares?

Bwahahahahaha, man these forums are fun sometimes.

 
Originally posted by: NogginBoink
Originally posted by: drag
I got bored, so sue me.

Here's one that hasn't been patched.

Listen, don't get me wrong here. I am not trying to bash Microsoft or anything. EVERY company is guilty of this from time to time. It's human nature to try to hide your mistakes, and when there are millions of dollars and your company's reputation on the line, you're going to make bad judgements from time to time.

Just be careful, that's all. It's politics.

Did you read the discussion? It's a denial of service attack.

Yeah, it should be fixed, but it doesn't allow remote code execution.


From 2 posts earlier:

Originally posted by: drag
Originally posted by: CQuinn
Originally posted by: drag
I got bored, so sue me.

Here's one that hasn't been patched.

Isn't that the same issue as addressed here

No, I don't think it is. Microsoft tried several times to fix the RPC stuff; they have like 3 or 4 different patches for it, and people just keep on finding new ways to break it.

The Microsoft bulletin addresses a buffer overrun used to run arbitrary code on the server. VERY SERIOUS.
The one described in my link is a race condition used to create a DoS attack. Not so serious, but it can make your computer vulnerable to other types of attacks.

 
Originally posted by: Nothinman
If I patch the Mozilla browser, I don't have to worry about breaking the e-mailer or screwing up the desktop icons.

Technically you do because the Mozilla mail client uses parts of the Mozilla browser. But as long as you don't touch how the interfaces work and your patch isn't broken, it should be fine and your point is valid.

hehe, forgot about Mozilla e-mail. I use Mozilla Firebird for browsing and Ximian Evolution for e-mail.

Compare that to Explorer. If it breaks, you can't access system configuration, files, or open programs. Everything is broken and the computer is useless. It is embedded deep into the system and provides ways for normal users to access and control some aspects of the root system. To do this it has to have the ability to act as an administrator.

You can still do all of those things, you just have to know how. For instance, you can launch control panels from the command line if somehow your shell got broken. And IE runs with the same credentials as the user (although this is normally admin because that's the default), so it can only do damage to things the user already has access to. If some other program that runs as administrator or SYSTEM uses MSHTML for something and breaks, that's not necessarily IE's fault.

If I design a virus to break Explorer or Outlook, I can have a regular user run it, and if it succeeds I have the ability to access the system configuration.

Again, only if the user is already a local administrator or you use something like the RPC exploit to elevate your privileges.

hmm... I thought Explorer (regular Explorer, not IE, that's what I meant) was something that had rights to the system stuff all the time.

 