C#.....wtf

gittyup

Diamond Member
Nov 7, 2000
5,036
0
0
What's one of the most annoying things about working in C++? It's gotta be remembering when to use the -> operator for a pointer, when to use :: for scope resolution, and when to use the plain dot.

Is it really that hard to remember this? This is so typical Microsoft.
 

Descartes

Lifer
Oct 10, 1999
13,968
2
0
Pretender: So, for a language to be powerful, it needs to be inherently complex?

C++ has undergone many evolutions to accommodate transitions in modern methodology, and is by no means the way a language would be designed today. C++ was designed for monolithic applications, not interface-based component-oriented systems.

I've come to the conclusion that when it comes to development, in almost any language, AT people are absolutely clueless (w/ the exception of a few that I know)... Quite frequently someone posts something clueless about how they feel something MS does is inappropriate. *sigh*



 

poop

Senior member
Oct 21, 1999
827
0
0
I think of Java as what C++ wanted to be.

Java = pure OO
C++ = hybrid
C# = mutant beast

My main fear is that C# will ONLY run on MS approved crapola. It looks pretty cool as a language, but the implementation will almost definitely be messed up. It looks like MS is trying to get Java developers to jump ship for C#.

Descartes: By power, it all depends on what you want to do. For GUI based apps, I would go with Java. For Server-side apps, still Java, as you get more freedom with your server hardware. Java programs are easy to build on and extend in the future.

For smaller, one-time jobs, C++ is ok. But you are right, it gets entirely too messy over time. Its semi-OO nature tends to allow programmers to write messier code. Though it is not impossible with Java, it is less encouraged.

And then comes my true love, C. God, I love C. Then again, I love extremely low-level coding (firmware, system testing code, drivers, etc.). C is still the best language for talking directly with hardware, IMHO. I would only use assembly if EXACT timing were required (getting proper DRAM burst cycles, etc.). Or maybe if I were in a masochistic mood.

Does this qualify me as even slightly knowledgeable? :) Or am I a moron? :(
 

Bignate603

Lifer
Sep 5, 2000
13,897
1
0
coding comes in two flavors
simple = doesn't do much
powerful = does a lot, if you can figure out how to use the freakin' thing!
 

Pretender

Banned
Mar 14, 2000
7,192
0
0
I'm not saying that for something to be powerful it has to be complex; what I'm saying is that the stuff they're tossing into the article (and into C#), which supposedly makes C# simple and powerful, is crap.

For example:


<< The third most annoying problem that you run across in C and C++ is integers being used as Booleans, causing assignment errors when you confuse = and ==. >>


Uh, morons, if you confuse = and ==, you're gonna have tons of problems regardless of whether or not you're dealing with bools or any other data type.



<< If you wrote code like this in C++:
int i;
if (i) . . .


You need to convert that into something like this for C#:
int i;
if (i != 0) . . .
>>

Now this just irks me. One of the things I liked about the way the language treats things is that there was a seamless equivalence between true & 1 (or any non-zero value) and false & 0, because to the system they are the same thing. I find it infinitely easier at times to simply type if (i) than if (i != 0), because it makes sense to those who've been programming for some time that (i) simply means (i != 0), and if the VBers don't like it, well, too bad. My main point isn't simply that they're making me type more, but that they're doing it simply to appease people who couldn't go look in a book to understand the basic concept, the same people who generally don't last long as programmers anyway.



To further prove my point that the purpose of C# is to attract more idiots to make more Microsoft-monopoly-only code...


<< Although some power users would disagree with me, type safety promotes robust programs. Several features that promote proper code execution (and more robust programs) in Visual Basic have been included in C#. For example, all dynamically allocated objects and arrays are initialized to zero. Although C# doesn't automatically initialize local variables, the compiler will warn you if you use one before you initialize it. When you access an array, it is automatically range checked. Unlike C and C++, you can't overwrite unallocated memory. >>

Ya know what this reminds me of? The College Board AP classes. I've ranted about these ad nauseam to the people in my class, and I will do so until I make the final AP test my bitch. Why? If you're going to teach us programming, don't dumb it down. Don't take away the fun of learning that if you create an array with 200 elements, array[200] will give you a crash and a kick in the mouth. Any programming language, teacher, or program that takes away the fun of having a program crash for hours on end over one minor syntactical or logical bug, until you realize it was a stupid mistake you will never make again, is a bad one and should be avoided.
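To make that concrete, here's a quick sketch of my own (not from the article) of exactly the kind of lesson I mean:

int main()
{
    int scores[200];     // valid indices are 0 through 199
    scores[200] = 1;     // one past the end: undefined behavior in C/C++;
                         // maybe a crash, maybe silent corruption you hunt for hours
}

// In C#, the same out-of-range access on a 200-element array throws an
// IndexOutOfRangeException at that exact line, which is precisely the
// hand-holding being argued about here.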
 

poop

Senior member
Oct 21, 1999
827
0
0
Not necessarily. I feel Java is WAY simpler than C, yet I find it more powerful in some respects. If I were making anything with a GUI, Java is the simplest approach I know of. Smalltalk may be a bit simpler, but agh! Who wants to subject themselves to that?
 

Descartes

Lifer
Oct 10, 1999
13,968
2
0
Pretender, I find your naivete quite intriguing. You're a lone soul in the world of informed individuals.

The ambiguity between the assignment operator ('=') and the equality operator ('==') in C'esque languages is just that... ambiguous. You act as though it's a simple issue of remembrance. Surely in a polymorphic world, one can make the assignment operator as polymorphic as any other. I've seen very talented programmers make a mistake (commonly in complex boolean logic expressions) between assignment and equality, and this is not indicative of their poor memory or poor workmanship.
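A minimal sketch of the kind of slip I mean (DONE and cleanup() are just hypothetical names):

const int DONE = 2;
void cleanup();                  // hypothetical

void check(int status)
{
    if (status = DONE)           // meant '==', typed '='; this assigns DONE to status,
        cleanup();               // and the branch runs no matter what status was,
                                 // because the value of the assignment (2) is non-zero
}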



<< Now this just irks me. One of the things I liked about the way the language treats things is that there was a seamless equivalence between true & 1 (or any non-zero value) and false & 0, because to the system they are the same thing. I find it infinitely easier at times to simply type if (i) than if (i != 0), because it makes sense to those who've been programming for some time >>



Now you've given me a headache. The _simple_ idea here is that the "shortcut" way of C (which has been widely adopted) of treating any expression != 0 as true is simply bad code.



<< that (i) simply means (i != 0) >>



Oh really? Let's look at a few examples.

int i = 0;
if (i) doSomething();       // integral value used directly as a condition

int *pi = NULL;
if (pi) doSomething();      // pointer tested directly for non-NULL

True, 0 in a pointer context will "decompose" into NULL, but the idea here is simple. If you're testing a variable in an integral context, explicitly do so. If you're testing a pointer in an expression for validity (!= NULL), explicitly do so. This is good code; the former is not. I do, however, think C# went overboard on this.
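In other words, the explicit forms I'm talking about look like this (same hypothetical doSomething() as above):

int i = 0;
if (i != 0) doSomething();       // integral context: compare against 0 explicitly

int *pi = NULL;
if (pi != NULL) doSomething();   // pointer context: test for NULL explicitly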



<< and if the VBers don't like it, well, too bad >>



Now you went all out and commenced a full manifestation of ignorance. I therefore draw the conclusion that you've probably spent more time reading books than actually writing code. Your logic is absurd, your arguments are way off base, and you seem to somehow hold these antediluvian programming practices as some sort of esoteric idiom that we should maintain. You clearly don't know how VB handles expressions, and I see no point in explaining it. Is it different than C'esque languages? Sure. Does this make it a bad thing? Only if your development practices are purely academic, and hardly that.

Sorry for the rant, bro. I like to toss in my $.02 when I see the pot is being passed around. Don't take offense to anything, but instead, try to see the good in what I've said... :)

 

Pretender

Banned
Mar 14, 2000
7,192
0
0


<< Pretender, I find your naivete quite intriguing. You're a lone soul in the world of informed individuals. >>

Actually, I'm an argumentative soul in a world of sheep. Call me naive if you wish, but just remember, Baaaaaaaaa! :)



<< The ambiguity between the assignment operator ('=') and the equality operator ('==') in C'esque languages is just that... ambiguous. You act as though it's a simple issue of remembrance. Surely in a polymorphic world, one can make the assignment operator as polymorphic as any other. I've seen very talented programmers make a mistake (commonly in complex boolean logic expressions) between assignment and equality, and this is not indicative of their poor memory or poor workmanship. >>

I wholeheartedly disagree; they couldn't be further from ambiguity if they tried. In fact, it's rather clear and straightforward: 1 '=': assignment, 2 '='s: equality testing. And I'm not saying that experts at the language are immune to making the mistake; I was making the point that the subject they were talking about (confusion of booleans vs. other data types) had little relevance to equal signs, and bringing it up at that point in the article seemed to serve only one purpose: to pander to the idiots. I didn't intend to sound like I never make that mistake (maybe not as much as I used to, but it still happens on occasion), but I got the feeling that they were trying to convince people who maybe used C a few times, didn't like it because they kept screwing up on syntax, and decided to revert to something simpler.




<< The _simple_ idea here is that the "shortcut" way of C (which has been widely adopted) of treating any expression != 0 as true is simply bad code. >>

Of course I disagree, but to argue back and forth over a topic as narrow as this would be pointless. It's mostly a matter of who will be reading the code, and how the coder prefers it. Microsoft uses it in a few things they release (the DirectX SDK, for one), but they're allowed to be hypocrites I suppose.




<< True, 0 in a pointer context will "decompose" into NULL, but the idea here is simple. If you're testing a variable in an integral context, explicitly do so. If you're testing a pointer in an expression for validity (!= NULL), explicitly do so. This is good code; the former is not. I do, however, think C# went overboard on this. >>

Once again this is one of those things that all depends on who's reading, writing, and working with the code, or as you like to superfluously label things: "good code". To the compiler or the computer executing the program, there's no difference. To the source code, it adds on 6, maybe 7, useless bytes. Where's the benefit?




<< You clearly don't know how VB handles expressions, and I see no point in explaining it. Is it different than C'esque languages? Sure. Does this make it a bad thing? Only if your development practices are purely academic, and hardly that. >>

You're 100% right on this one; I used VB for less than 6 months, and that was quite a while ago.






<< << and if the VBers don't like it, well, too bad >>



Now you went all out and commenced a full manifestation of ignorance. I therefore draw the conclusion that you've probably spent more time reading books than actually writing code. Your logic is absurd, your arguments are way off base, and you seem to somehow hold these antediluvian programming practices as some sort of esoteric idiom that we should maintain. You clearly don't know how VB handles expressions, and I see no point in explaining it. Is it different than C'esque languages? Sure. Does this make it a bad thing? Only if your development practices are purely academic, and hardly that.
>>


I see somebody's bought a thesaurus :) Regardless, you're wrong on pretty much all counts. Granted, I've only been writing code for 3 and a half years (not including the time I've spent with VB, and QBasic if that's even considered a language ;)), plus reading books in my spare time along the way. Then again, I've only had a computer for 5, and only been alive for about 16. I suppose one of those traits must've caused me to embark on the mission of complete and utter ignorance that you mentioned. As for the "antediluvian programming practices" that I supposedly believe the world should maintain, you're highly overreacting. I'm not saying there should be no improvements; I'm saying there should be no reductions made solely for the sake of dumbing the language down. That means don't get rid of if (i) simply because it confuses a few people; if I want to use an int as a bool and a long as an int, I should be able to waste memory as I wish; and when I forget to use == instead of =, I should be forced to debug the damn thing for an hour wondering wtf went wrong until I go through the damn piece of crap line by line and find it on line 3642 at 3:03 AM on a Wednesday night before I have to go to school. Yeah, all these problems could go away and we could all make compiler-error-free programs without the worry, but unless you plan on working in C# your whole life, it won't teach you the discipline and mastery of the language that you'll need when you move on to something more realistic and less programmer-friendly. Like, say, C, or assembly.

 

Descartes

Lifer
Oct 10, 1999
13,968
2
0
I know this is a narrow topic, but I'll respond anyway.



<< I wholeheartedly disagree - they couldn't be further from ambiguity if they tried, in fact it's rather clear and straightforward. 1 '=': assignment, 2 '='s: equality testing. And I'm not saying that experts at the language are immune to making the mistake............. >>



This is a simple issue of "psychological distance": two operations/operators/names/etc. being so similar perpetuates bugs. I don't remember the last time I made the mistake, but that doesn't mean anything. If we were in Perl, we'd have == and eq, != and ne, etc. Why? Sure, it's clear they're different, but it's absurd.



<< Of course I disagree, but to argue back and forth over a topic as narrow as this would be pointless. It's mostly a matter of who will be reading the code, and how the coder prefers it. Microsoft uses it in a few things they release (the DirectX SDK, for one), but they're allowed to be hypocrites I suppose. >>



I agree.



<< Once again this is one of those things that all depends on who's reading, writing, and working with the code, or as you like to superfluously label things: "good code". >>



I agree. The idea of "good code" is far too subjective to make it a black and white issue. However, the idea of "bad code" is not.



<< To the compiler or the computer executing the program, there's no difference. To the source code, it adds on 6, maybe 7, useless bytes. Where's the benefit? >>



True, but any decent compiler will optimize the operations appropriately. Thus, although the source code will be increased in size (negligible), the resulting executable will not be affected. The idea of working in a language as an abstraction from an underlying language is readability, productivity, and maintainability. If we take information hiding to the extreme and start hiding your basic integral comparison operations, readability goes to sh1te.



<< I see somebody's bought a thesaurus >>



Nope, no thesaurus. I'm just interested in etymology.



<< Regardless, you're wrong on pretty much all counts. Granted, I've only been writing code for 3 and a half years (not including the time I've spent with VB, and QBasic if that's even considered a language), plus reading books in my spare time along the way. Then again, I've only had a computer for 5, and only been alive for about 16. I suppose one of those traits must've caused me to embark on the mission of complete and utter ignorance that you mentioned. As for the "antediluvian programming practices" that I supposedly believe the world should maintain, you're highly overreacting. I'm not saying there should be no improvements; I'm saying there should be no reductions made solely for the sake of dumbing the language down. That means don't get rid of if (i) simply because it confuses a few people; if I want to use an int as a bool and a long as an int, I should be able to waste memory as I wish; and when I forget to use == instead of =, I should be forced to debug the damn thing for an hour wondering wtf went wrong until I go through the damn piece of crap line by line and find it on line 3642 at 3:03 AM on a Wednesday night before I have to go to school. Yeah, all these problems could go away and we could all make compiler-error-free programs without the worry, but unless you plan on working in C# your whole life, it won't teach you the discipline and mastery of the language that you'll need when you move on to something more realistic and less programmer-friendly. Like, say, C, or assembly. >>



My opinion is wrong? I think you need to make a more lucid distinction between "dumbing it down" and readability. If you think these enhancements are for "dumbing it down", then what do you think of automatic type conversion, compile-time type checking, etc.? These are simply benefits of the platform that facilitate a more bug-free environment. No programming language is simple, and the higher the level of abstraction, the lower the bug count in the deliverable (in theory). I'm always annoyed by people who seem to think that for a language to be employable, it must be syntactically abstruse.
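For instance, a trivial sketch of my own of what I mean by those two things (pay() is a hypothetical function):

void pay(double amount);         // hypothetical

int main()
{
    pay(5);          // automatic type conversion: the int 5 quietly becomes the double 5.0
    pay("five");     // compile-time type checking: this call is rejected before the program ever runs
}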

I can tell you that, after sitting in a production environment wading through thousands upon thousands of lines of crap, I am 100% for the amelioration of modern languages (within reason).

Btw, I won't be using C# (at least not in the foreseeable future) :)