MS Visual C++ (.NET) vs C++

Page 2 - AnandTech Forums

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: kamper
Originally posted by: mAdMaLuDaWg
C# is Windows-only (yes, there is Mono, but it's not truly cross-platform as of yet and won't fully be without official support from MS). So when you learn C#, you are not going to be able to use it on any other platform.
There are lots of apps popping up in the Linux community written entirely in C#. I even heard tell that Mono implemented parts of ASP.NET 2.0 from spec before Microsoft did. I have my doubts about whether Mono in general will ever be certifiably compatible with Microsoft's .NET, but I bet they'll get pretty close, and the skills will most definitely be transferable between platforms.

I think learning something lower-level like C/C++ isn't too bad of an idea to start on, because it'll force you to see some of the nasty stuff like pointers. Moving up to Java is nice then, and learning C# after Java is kinda cool because it gives you all kinds of insights into how a language is created. Basically, C# is Java with a bunch of syntactic sugar which generally encourages good programming habits, if a bit verbose. But almost all of it could be implemented in Java with compiler tricks (or a preprocessor) without any VM modifications.


I don't think the skills will ever get a chance to be transferable. I doubt that many people will even think of building enterprise-level apps for non-MS platforms with .NET, regardless of the progress of third-party implementations for other platforms. It's the specification that's open while Microsoft controls .NET... people aren't and won't be comfortable with that at all. Microsoft could put a stop to any third-party initiative any time it wants... now, if MS officially endorses making .NET truly cross-platform, then I would agree.

I think everyone should learn about pointers... it helped me immensely to fully understand what memory, scope, and variables were all about. I know you probably wouldn't use them in the real world, but they give you a very good foundation to build on.

I know we are going way off topic from the OP's question... I hope he found a book. If not, then just search on Amazon and read the reviews. Personally, I started out with a "21 Days" book for C++, but some people don't like that type of book. I'd also recommend hitting up a library or a Borders and checking out the first few chapters of a book to see if you are able to catch on.
 

beggerking

Golden Member
Jan 15, 2006

Hmm... I never thought of it that way. Are you sure the CLR hasn't been upgraded with successive versions of .NET? For example, .NET 1.0/1.1 didn't have built-in support for generics while 2.0 does... I'd assume that would mean the Common Type System would have been upgraded (or maybe not)...

Logically, I'd assume that the MSIL is fully backward compatible, but I'm wondering whether that is actually true...

The Common Type System doesn't get upgraded... in order to maintain type compatibility.

MSIL is like bytecode, which is like assembly code... it's more along the lines of machine instructions. Going from .NET 1.0/1.1 to .NET 2.0 is more of a JIT compiler performance upgrade, etc.
 

beggerking

Golden Member
Jan 15, 2006
Originally posted by: Nothinman
I remember hearing a while ago that C++ .NET ran unmanaged. Or maybe I understood wrong; all I remember is that James Gosling got all in a flap about it. Can someone confirm?

I believe you can do either; in fact, I think you can mix managed and unmanaged code if you really want.

You are correct. All .NET languages are equally powerful, and .NET supports both managed and unmanaged code.
 

Nothinman

Elite Member
Sep 14, 2001
I doubt that many people will even think of building enterprise-level apps for non-MS platforms

People were saying the same thing about desktop stuff years ago...

Microsoft could put a stop to any third party initiative any time it wants...

Only certain parts of it. As I understand it, they've submitted the majority of the libraries for standards consideration, so there's no way they could stop someone from implementing them.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: beggerking

Hmm... I never thought of it that way. Are you sure the CLR hasn't been upgraded with successive versions of .NET? For example, .NET 1.0/1.1 didn't have built-in support for generics while 2.0 does... I'd assume that would mean the Common Type System would have been upgraded (or maybe not)...

Logically, I'd assume that the MSIL is fully backward compatible, but I'm wondering whether that is actually true...

The Common Type System doesn't get upgraded... in order to maintain type compatibility.

MSIL is like bytecode, which is like assembly code... it's more along the lines of machine instructions. Going from .NET 1.0/1.1 to .NET 2.0 is more of a JIT compiler performance upgrade, etc.

Sorry, but I wouldn't even begin to compare MSIL to assembly code. I've coded a bit of IL using the System.Reflection namespace, and it's still, at heart, an object-oriented language. It's still very high level and can't be compared to machine instructions. Actually, just fire up ILDASM after you compile a program and you'd be able to follow it quite easily.

I'm still not convinced that the actual IL specification has not been upgraded with .NET 2.0. After all, why would .NET 2.0 apps require the .NET 2.0 framework and fail to run on the 1.0/1.1 framework? I'm pretty sure that aside from compiler performance, the specification has changed quite a bit as well.

 

kamper

Diamond Member
Mar 18, 2003
Originally posted by: beggerking
Originally posted by: Nothinman
I remember hearing a while ago that C++ .NET ran unmanaged. Or maybe I understood wrong; all I remember is that James Gosling got all in a flap about it. Can someone confirm?
I believe you can do either; in fact, I think you can mix managed and unmanaged code if you really want.
You are correct. All .NET languages are equally powerful, and .NET supports both managed and unmanaged code.
So can you run stuff written in c# unmanaged?
 

beggerking

Golden Member
Jan 15, 2006
Sorry, but I wouldn't even begin to compare MSIL to assembly code. I've coded a bit of IL using the System.Reflection namespace, and it's still, at heart, an object-oriented language. It's still very high level and can't be compared to machine instructions. Actually, just fire up ILDASM after you compile a program and you'd be able to follow it quite easily.

I'm still not convinced that the actual IL specification has not been upgraded with .NET 2.0. After all, why would .NET 2.0 apps require the .NET 2.0 framework and fail to run on the 1.0/1.1 framework? I'm pretty sure that aside from compiler performance, the specification has changed quite a bit as well.


ILDASM is a disassembler... the stuff you see there is not the actual IL code. The actual MSIL is closer to the level of assembly coding.

Unmanaged code can be called from .NET with marshalling, but I don't think you can create it.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: Nothinman
I doubt that many people will even think of building enterprise-level apps for non-MS platforms

People were saying the same thing about desktop stuff years ago...
Not a fair comparison... when you have languages that are cross-platform straight from the first-party vendor, you wouldn't choose a language that's cross-platform only via a third party. People simply won't have faith unless the first party throws its weight behind it... especially when that first party specializes in closed-source apps and has an agenda to make sure its operating system remains the dominant one in the market.
Microsoft could put a stop to any third party initiative any time it wants...

Only certain parts of it, as I understand it they've submitted the majority of the libraries for standards consideration so there's no way they could stop someone from implementing them.

You do have a point with the standards... but if you read the Mono developer blogs, they say that while MS isn't officially supporting the project, the MS .NET engineers have been extremely supportive and willing to answer any questions... now, if that stops, I'd assume it would slow down the progress considerably. Besides, isn't C# the only language that was submitted to ECMA? Plus, Microsoft has patented MSIL, if I remember reading correctly.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: kamper
Originally posted by: beggerking
Originally posted by: Nothinman
I remember hearing a while ago that C++ .NET ran unmanaged. Or maybe I understood wrong; all I remember is that James Gosling got all in a flap about it. Can someone confirm?
I believe you can do either; in fact, I think you can mix managed and unmanaged code if you really want.
You are correct. All .NET languages are equally powerful, and .NET supports both managed and unmanaged code.
So can you run stuff written in c# unmanaged?

Yep... but you basically forsake the advantages of the .NET framework once you run unmanaged code, and it slows down the performance considerably. It's strongly recommended against, but in some cases you have no choice.
 

Nothinman

Elite Member
Sep 14, 2001
Not a fair comparison... when you have languages that are cross-platform straight from the first-party vendor, you wouldn't choose a language that's cross-platform only via a third party. People simply won't have faith unless the first party throws its weight behind it... especially when that first party specializes in closed-source apps and has an agenda to make sure its operating system remains the dominant one in the market.

If it's a real standard it won't matter, and GNOME is investing more and more in Mono. I think Novell is putting a lot of work into Mono projects as well, so it's not like there's no backing at all. And if MS tries to thwart their attempts, I bet they'd be facing another antitrust lawsuit.

You do have a point with the standards... but if you read the Mono developer blogs, they say that while MS isn't officially supporting the project, the MS .NET engineers have been extremely supportive and willing to answer any questions... now, if that stops, I'd assume it would slow down the progress considerably. Besides, isn't C# the only language that was submitted to ECMA? Plus, Microsoft has patented MSIL, if I remember reading correctly.

So? MS hasn't been helpful in the past with things like SMB, and the Samba team has done pretty well. Sure, things could move faster, but it's probably better to have a clean-room implementation anyway. If MS ever orders its employees to stop helping the Mono devs, I'm sure things will slow down, but I highly doubt it'll kill the project. Hell, it probably won't even hurt it badly, since it's already in a state where it's usable for full desktop apps.

As for MS patenting MSIL, I don't know, but I would assume that the Mono people know about that and have considered its implications already.
 

beggerking

Golden Member
Jan 15, 2006
Yep... but you basically forsake the advantages of the .NET framework once you run unmanaged code, and it slows down the performance considerably. It's strongly recommended against, but in some cases you have no choice.

:thumbsup:
 

kamper

Diamond Member
Mar 18, 2003
Originally posted by: mAdMaLuDaWg
Originally posted by: kamper
Originally posted by: beggerking
Originally posted by: Nothinman
I remember hearing a while ago that C++ .NET ran unmanaged. Or maybe I understood wrong; all I remember is that James Gosling got all in a flap about it. Can someone confirm?
I believe you can do either; in fact, I think you can mix managed and unmanaged code if you really want.
You are correct. All .NET languages are equally powerful, and .NET supports both managed and unmanaged code.
So can you run stuff written in c# unmanaged?

Yep... but you basically forsake the advantages of the .NET framework once you run unmanaged code, and it slows down the performance considerably. It's strongly recommended against, but in some cases you have no choice.
Well, obviously I have no desire to do so :p I just wanted to know how stupid it would let you be. Slower, though? I mean, if you're constantly switching between the two modes, then sure, but JITs haven't passed native code in performance yet. Unless the C# -> native compiler really blows.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: beggerking

Sorry, but I wouldn't even begin to compare MSIL to assembly code. I've coded a bit of IL using the System.Reflection namespace, and it's still, at heart, an object-oriented language. It's still very high level and can't be compared to machine instructions. Actually, just fire up ILDASM after you compile a program and you'd be able to follow it quite easily.

I'm still not convinced that the actual IL specification has not been upgraded with .NET 2.0. After all, why would .NET 2.0 apps require the .NET 2.0 framework and fail to run on the 1.0/1.1 framework? I'm pretty sure that aside from compiler performance, the specification has changed quite a bit as well.

ILDASM is a disassembler... the stuff you see there is not the actual IL code. The actual MSIL is closer to the level of assembly coding.

Whoa... where are you getting that information from??? Everything I've read says the contrary. What happens is that the compiler compiles the language to a universal intermediate language (MSIL in this case), and then it is converted to native machine code, at which point it isn't called MSIL... it is then truly low-level, machine-dependent code. The 'platform independence' stops at MSIL.

EDIT: Yep, my memory serves me correctly... here is the documentation straight from Microsoft: http://msdn.microsoft.com/netframework/programming/clr/
To make a long story short, after MSIL the VES (Virtual Execution System) takes over and is largely responsible for executing the MSIL.

 

beggerking

Golden Member
Jan 15, 2006
Originally posted by: mAdMaLuDaWg

Sorry, but I wouldn't even begin to compare MSIL to assembly code. I've coded a bit of IL using the System.Reflection namespace, and it's still, at heart, an object-oriented language. It's still very high level and can't be compared to machine instructions. Actually, just fire up ILDASM after you compile a program and you'd be able to follow it quite easily.


Whoa... where are you getting that information from??? Everything I've read says the contrary. What happens is that the compiler compiles the language to a universal intermediate language (MSIL in this case), and then it is converted to native machine code, at which point it isn't called MSIL... it is then truly low-level, machine-dependent code. The 'platform independence' stops at MSIL.


The bolded part is what I'm talking about... :) when you fire up ILDASM, what you see is the actual .NET code... not MSIL code.

MSIL code is much closer to assembly code.

Here is how it goes:
.NET code -> MSIL -> binary
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: kamper
Originally posted by: mAdMaLuDaWg
Originally posted by: kamper
Originally posted by: beggerking
Originally posted by: Nothinman
I remember hearing a while ago that C++ .NET ran unmanaged. Or maybe I understood wrong; all I remember is that James Gosling got all in a flap about it. Can someone confirm?
I believe you can do either; in fact, I think you can mix managed and unmanaged code if you really want.
You are correct. All .NET languages are equally powerful, and .NET supports both managed and unmanaged code.
So can you run stuff written in c# unmanaged?

Yep... but you basically forsake the advantages of the .NET framework once you run unmanaged code, and it slows down the performance considerably. It's strongly recommended against, but in some cases you have no choice.
Well, obviously I have no desire to do so :p I just wanted to know how stupid it would let you be. Slower, though? I mean, if you're constantly switching between the two modes, then sure, but JITs haven't passed native code in performance yet. Unless the C# -> native compiler really blows.

Actually, I remember some benchmarks that ran .NET versions of intense mathematical algorithms against C++ versions, and in some cases the .NET code was faster than native code (if by native you mean non-.NET languages), and in every other case performance was roughly equal (I know a few algorithms is not a real test, but it had some good insights)
... it was a slashdot story, I'll see if I can dig it up later.

Basically, with unmanaged code you can be as stupid as you want to be. No thread safety, .NET security, garbage collection, etc. You're on your own with unmanaged code.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: beggerking
Originally posted by: mAdMaLuDaWg

Sorry, but I wouldn't even begin to compare MSIL to assembly code. I've coded a bit of IL using the System.Reflection namespace, and it's still, at heart, an object-oriented language. It's still very high level and can't be compared to machine instructions. Actually, just fire up ILDASM after you compile a program and you'd be able to follow it quite easily.


Whoa... where are you getting that information from??? Everything I've read says the contrary. What happens is that the compiler compiles the language to a universal intermediate language (MSIL in this case), and then it is converted to native machine code, at which point it isn't called MSIL... it is then truly low-level, machine-dependent code. The 'platform independence' stops at MSIL.


The bolded part is what I'm talking about... :) when you fire up ILDASM, what you see is the actual .NET code... not MSIL code.

MSIL code is much closer to assembly code.

Here is how it goes:
.NET code -> MSIL -> binary


Sorry, I'm not getting you. When I fire up ILDASM, I don't get any specific .NET code... in other words, I can't just pop that code into the IDE and expect output (unless I use the Reflection namespace, of course).

So what you are getting at is that the code in ILDASM is not MSIL but an Intermediate to MSIL?
 

kamper

Diamond Member
Mar 18, 2003
Originally posted by: mAdMaLuDaWg
Actually, I remember some benchmarks that ran .NET versions of intense mathematical algorithms against C++ versions, and in some cases the .NET code was faster than native code (if by native you mean non-.NET languages), and in every other case performance was roughly equal (I know a few algorithms is not a real test, but it had some good insights)
... it was a slashdot story, I'll see if I can dig it up later.

Basically, with unmanaged code you can be as stupid as you want to be. No thread safety, .NET security, garbage collection, etc. You're on your own with unmanaged code.
I know the difference between managed and unmanaged code, and that JITed code can challenge native code for performance under certain conditions :) I just thought the assumption that unmanaged code would slow things down 'considerably' was odd ;)
 

beggerking

Golden Member
Jan 15, 2006
Originally posted by: mAdMaLuDaWg

Sorry, I'm not getting you. When I fire up ILDASM, I don't get any specific .NET code... in other words, I can't just pop that code into the IDE and expect output (unless I use the Reflection namespace, of course).

So what you are getting at is that the code in ILDASM is not MSIL but an Intermediate to MSIL?

Hi mAdMaLuDaWg,

This link (Text) describes ILDASM. As described in the article, what you see there is not the actual MSIL code.

A good thing about .NET running unmanaged code is that it is possible to "be safe" by marshalling unmanaged code, which would solve some problems with garbage collection etc., I believe.

Hard to believe .NET code would be faster than native... :) but oh well.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: beggerking
Originally posted by: mAdMaLuDaWg

Sorry, I'm not getting you. When I fire up ILDASM, I don't get any specific .NET code... in other words, I can't just pop that code into the IDE and expect output (unless I use the Reflection namespace, of course).

So what you are getting at is that the code in ILDASM is not MSIL but an Intermediate to MSIL?

Hi mAdMaLuDaWg,

This link (Text) describes ILDASM. As described in the article, what you see there is not the actual MSIL code.

A good thing about .NET running unmanaged code is that it is possible to "be safe" by marshalling unmanaged code, which would solve some problems with garbage collection etc., I believe.

Hard to believe .NET code would be faster than native... :) but oh well.
Oh... OK, I see what you were trying to say... that is, you couldn't just open up the file in Notepad and see the MSIL code; it would have to be disassembled first.
Anyway, back to the original question... even if the assembled MSIL specification is standard and didn't change, what benefit does that bring? You couldn't use a .NET 2.0 library in .NET 1.0/1.1 directly, as there is no way to import the assembled IL directly without disassembling first, AFAIK.

Yeah, you can use unsafe code directly or marshal an object, which isn't always as straightforward, especially when the object is extremely complex.

EDIT: The closest documentation I could find about this issue from Microsoft is here:
http://www.gotdotnet.com/team/changeinfo/
Forward compatibility means an application written for the .NET Framework version 1.1 can execute on version 1.0. Although forward compatibility is supported by the .NET Framework, an application that uses a type or member specific to version 1.1 will never run properly on version 1.0. This is not a forward incompatibility because the application can never be expected to work. If you want your application to run properly on both versions of the .NET Framework, then your application should only use types and members in version 1.0.

It seems that this would imply that the Common Type System has indeed been upgraded.
 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Originally posted by: kamper
Originally posted by: mAdMaLuDaWg
Actually, I remember some benchmarks that ran .NET versions of intense mathematical algorithms against C++ versions, and in some cases the .NET code was faster than native code (if by native you mean non-.NET languages), and in every other case performance was roughly equal (I know a few algorithms is not a real test, but it had some good insights)
... it was a slashdot story, I'll see if I can dig it up later.

Basically, with unmanaged code you can be as stupid as you want to be. No thread safety, .NET security, garbage collection, etc. You're on your own with unmanaged code.
I know the difference between managed and unmanaged code, and that JITed code can challenge native code for performance under certain conditions :) I just thought the assumption that unmanaged code would slow things down 'considerably' was odd ;)


I was always under the impression that .NET code would always be slower because the .NET framework technically runs on top of the Win32 API. I never imagined that it could be faster. I also remember how MS was touting WinFX and that .NET would be the future of the Windows platform... too bad they dropped the ball on that.

Hey, and I found the website that had the performance tests: Text
It has an excellent analysis of managed/unmanaged code and gives you a good explanation... here is my favorite part:
Think of .NET in these terms. The .NET compiler (managed C++ in this case, but the same can be said for the other compilers) is essentially the equivalent of the parsing engine in the unmanaged C++ compiler. That is, the compiler will generate tables of the types and methods and perform some optimizations based on high level aspects like how loops and branches are handled. Think of the .NET JIT compiler as the back end of the unmanaged compiler: this is the part that really knows about generating code because it has to generate the low level x86 code that will be executed. The combination of a .NET compiler and the JIT compiler is an equivalent entity to the unmanaged C++ compiler, the only difference is that it is split into two components meaning that the compilation is split over time. In fact, since the JIT compilation occurs at the time of execution the JIT compiler can take advantage of 'local knowledge' of the machine that will execute the code, and the state of that machine at that particular time, to optimize the code to a degree that is not possible with the unmanaged C++ compiler run on the developer's machine. The results shows that the optimization switches in managed C++ and C# have relatively small effects, and that there is only a 2% difference between managed and unmanaged code. Significantly, C# code is as good, or better than managed C++ or C++/CLI which means that your choice to use a managed version of C++ should be based on the language features rather than a perceived idea that C++ will produce 'more optimized code'.

There is nothing in .NET that means that it should automatically be much slower than native code, indeed, as these results have shown there are cases when managed code is quicker than unmanaged code. Anyone who tells you that .NET should be slower has not thought through the issues.
 

beggerking

Golden Member
Jan 15, 2006
Originally posted by: mAdMaLuDaWg
It seems that this would imply that the Common Type System has indeed been upgraded.

interesting...
I'd interpret that as 1.0 MSIL code being laid out differently than 1.1 and 2.0 code... or there may be new libraries added in each new version of the .NET framework which .NET 1.0 does not include...

The important thing is, each newer version of the .NET framework can run .NET code from prior versions.

In regards to performance... I've always had the impression that C++ compilers produce the fastest / most efficient code. With optimization, it is possible to produce code faster than hand-written assembly. .NET, on the other hand, produces medium-speed code that should be close to or slower than native C++ code, but faster than interpreted code... very much like Java bytecode.

Thanks for the above quote, that was very interesting... umm... hard to believe it's faster... but I guess the world has changed.

 

mAdMaLuDaWg

Platinum Member
Feb 15, 2003
Well, what's interesting from the gotdotnet link, though, is that the MS teams are going to be primarily pushing side-by-side execution instead of compatibility execution. It seems they aren't confident enough in compatibility with older versions, because there are even backward-compatibility-breaking changes (that is, a 1.0 app could break when running on 1.1), so our machines are going to have every single framework installed... just imagine the amount of bloat that will have to be on developer machines 4-5 years from now...
lol... they've gotten rid of DLL hell and introduced version-bloat hell.
 

beggerking

Golden Member
Jan 15, 2006
And that is typical Microsoft practice... bloating everything up to maintain compatibility, but never looking back to optimize...
 

kamper

Diamond Member
Mar 18, 2003
Originally posted by: beggerking
In regards to performance... I've always had the impression that C++ compilers produce the fastest / most efficient code. With optimization, it is possible to produce code faster than hand-written assembly. .NET, on the other hand, produces medium-speed code that should be close to or slower than native C++ code, but faster than interpreted code... very much like Java bytecode.
That doesn't really make much sense. A C++ compiler essentially produces assembly (machine code ultimately, but that's a direct translation, no optimization). There is no reason why .NET or any JITed platform can't produce binary code that is just as fast as pre-compiled languages. The reasons they are often slower are 1) they have to spend time compiling the bytecode to native code, and 2) in order to save time on step 1, they often take shortcuts (a given piece of bytecode may be compiled numerous times, successively more optimized, as the VM realizes that it will be run more often).

The reason JITing can be faster is that much more aggressive optimizations can be made (as touched on in mAdMaLuDaWg's quote above). However, the technology generally isn't at the point yet where virtualization is consistently faster. In particular, there are often very long initial loading times (Sun's Java is quite bad for this), which really contributes to people's impression of slowness. Garbage collection is another issue that gets tough from a performance perspective as you get into highly parallelized applications (still a good tradeoff against the risks of memory leaks or referencing freed memory :p).