<<
Pretender, I find your naivete quite intriguing. You're a lone soul in the world of informed individuals. >>
Actually, I'm an argumentative soul in a world of sheep. Call me naive if you wish, but just remember: Baaaaaaaaa!
<<
The ambiguity between the assignment operator ('=') and the equality operator ('==') in C'esque languages is just that... ambiguous. You act as though it's a simple issue of remembrance. Surely in a polymorphic world, one can make the assignment operator as polymorphic as any other. I've seen very talented programmers make a mistake (commonly in complex boolean logic expressions) between assignment and equality, and this is not indicative of their poor memory, or poor workmanship. >>
I wholeheartedly disagree - they couldn't be further from ambiguous if they tried; in fact, it's rather clear and straightforward: one '=' means assignment, two '='s mean equality testing. And I'm not saying that experts in the language are immune to making the mistake; I was making the point that the subject they were talking about (confusion of booleans vs. other data types) had little relevance to equal signs, and bringing it up at that point in the article seemed irrelevant, serving only one purpose: to pander to the idiots. I didn't intend to sound like I never make that mistake (maybe not as much as I used to, but it still happens on occasion), but I got the feeling they were trying to convince people who maybe used C a few times, didn't like it because they kept screwing up the syntax, and decided to revert to something simpler.
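For anyone following along, the mistake being described looks something like this (a minimal C sketch; the variable name is invented for illustration):

    #include <stdio.h>

    int main(void)
    {
        int retries = 3;

        /* Intended test: "have we run out of retries?"  The missing '='
           turns the comparison into an assignment: (retries = 0)
           evaluates to 0, so the branch never fires and retries is
           silently clobbered. */
        if (retries = 0)                 /* bug: should be (retries == 0) */
            printf("out of retries\n");

        printf("retries is now %d\n", retries);   /* prints 0, not 3 */
        return 0;
    }

Most compilers will at least warn about that pattern when warnings are enabled (gcc's -Wall does), which is part of why I'd call it a discipline problem rather than an ambiguity in the language.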
<<
The _simple_ idea here is that the "shortcut" way of C (which has been adopted) of treating any expression != 0 as true is simply bad code. >>
Of course I disagree, but to argue back and forth over a topic as narrow as this would be pointless. It's mostly a matter of who will be reading the code and how the coder prefers it. Microsoft uses the shortcut in a few things they release (the DirectX SDK, for one), but they're allowed to be hypocrites, I suppose.
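To make the disagreement concrete, here are the two styles side by side (a small C sketch; the function and names are made up for illustration). The compiler generates the same test either way:

    #include <stdio.h>
    #include <string.h>

    static void report(const char *name, int error_count)
    {
        /* The C "shortcut": any non-zero value counts as true. */
        if (error_count)
            printf("%s: %d errors\n", name, error_count);

        /* The explicit style being advocated above. */
        if (error_count != 0)
            printf("%s: %d errors\n", name, error_count);

        /* The shortcut applied to a library call: strcmp() returns 0
           on a match, so this branch reads "if the strings differ". */
        if (strcmp(name, "core"))
            printf("%s is not the core module\n", name);
    }

    int main(void)
    {
        report("parser", 2);
        return 0;
    }

Which of those reads better is exactly the who's-reading-it question; neither is wrong as far as the language is concerned.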
<<
True, 0 in a pointer context will "decompose" into NULL, but the idea here is simple. If you're testing a variable in an integral context, explicitly do so. If you're testing a pointer in an expression for validity (!= NULL), explicitly do so. This is good code, the former idea is not. I do, however, think C# went overboard on this. >>
Once again, this is one of those things that depends entirely on who's reading, writing, and working with the code, or, as you like to superfluously label things, "good code". To the compiler, or the computer executing the program, there's no difference. In the source code, it adds 6, maybe 7, useless bytes. Where's the benefit?
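For the record, the "6, maybe 7, useless bytes" are just the extra characters of != NULL; the two forms below compile to the same check (a minimal C sketch, where the malloc() is only there to have a pointer to test):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        char *buf = malloc(64);

        /* Explicit pointer test, as recommended above. */
        if (buf != NULL)
            printf("allocation succeeded\n");

        /* Implicit test: a null pointer is false, anything else is true.
           Both forms produce the same object code. */
        if (buf)
            printf("allocation succeeded\n");

        free(buf);
        return 0;
    }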
<<
You clearly don't know how VB handles expressions, and I see no point in explaining it. Is it different than C'esque languages? Sure. Does this make it a bad thing? Only if your development practices are purely academic, and hardly that. >>
You're 100% right on this one; I used VB for less than six months, and that was quite a while ago.
<<
<< and if the VBers don't like it, well, too bad >>
Now you went all out, and commenced a full manifestation of ignorance. I therefore draw the conclusion, that you've probably spent more time reading books than actually writing code. Your logic is absurd, your arguments are way off base, and you seem to somehow hold these antediluvian programming practices as some sort of esoteric idiom that we should maintain. You clearly don't know how VB handles expressions, and I see no point in explaining it. Is it different than C'esque languages? Sure. Does this make it a bad thing? Only if your development practices are purely academic, and hardly that. >>
I see somebody's bought a thesaurus.

Regardless, you're wrong on pretty much all counts. Granted, I've only been writing code for three and a half years (not counting the time I spent with VB, and QBasic, if that's even considered a language), and reading books in my spare time within that time. Then again, I've only had a computer for five, and have only been alive for about 16. I suppose one of these traits must've caused me to embark on the mission of complete and utter ignorance that you mentioned. As for the "antediluvian programming practices" that I supposedly believe the world should maintain, you're highly overreacting. I'm not saying there should be no improvements; I'm saying there should be no reductions made solely for the sake of dumbing the language down.

That means don't get rid of if (i) simply because it confuses a few people. If I want to use an int as a bool and a long as an int, I should be able to waste memory as I wish; and when I forget to use == instead of =, I should be forced to debug the damn thing for an hour, wondering wtf went wrong, until I go through the damn piece of crap line by line and find it on line 3642 at 3:03 AM on a Wednesday night before I have to go to school. Yeah, all those problems could go away and we could all make compiler-error-free programs without the worry, but unless you plan on working in C# your whole life, that won't teach you the discipline and mastery of the language that you'll need when you move on to something more realistic and less programmer-friendly. Like, say, C or assembly.
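To put that in concrete terms, here's the kind of C that compiles fine but that C# would reject, or at least force you to spell out with casts and comparisons (the names are invented for illustration):

    #include <stdio.h>

    int main(void)
    {
        long big   = 100000L;
        int  count = big;       /* C narrows silently; C# would demand a cast */
        int  done  = 0;         /* an int doing duty as a bool */

        while (!done) {         /* C# would insist on a real bool here */
            count--;
            if (count == 99990)
                done = 1;       /* non-zero means "true" */
        }

        printf("stopped at %d\n", count);
        return 0;
    }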