Well?!
Originally posted by: slugg
C#, like VIAN said, is a copy of Java. It compiles into bytecode which is read by a virtual machine or interpreter. It shares a similarity with C++ in that it's object oriented (OOP, or Object Oriented Programming), which is basically the "big thing" in programming now. If you can program in C#, you can program in Java and vice versa. For what it's designed to do, it's actually a very good language and I'd recommend it as a learning language (Java might be better since you can find more info on it... but it's really the same thing). C# and Java syntax is SIMILAR (but not the same) in many ways to that of C++ and C, so if you can program in C#, you shouldn't have too much of a problem reading and editing C or C++ code. For example, I started on Java and I picked up C++ in no time at all. I don't know about C#, but Java is definitely cross platform. You can create a Java application and have it run on Windows, Linux, Mac, you name it.

Java is more common than you'd expect. Java powers JSP (JavaServer Pages) technology, which is Sun's version of PHP or ASP (eBay, for example, is powered by JSP). Java is also used as a base for many scripting languages in games (for example, UScript, which powers the Unreal engines, is based on Java), mini applets for cell phones, and other random things. IBM also uses a lot of Java technologies with their servers. It's really a great, but underappreciated, language. Since Java and C# are slightly simplified and run within specific parameters, they're considered "high level" languages. Other "high level" languages are Visual Basic and Perl.
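For anyone who wants to see that compile-to-bytecode step for themselves, here is a minimal sketch in Java. The class name and the numbers are arbitrary; the commands in the comments are the standard JDK tools (javac, java, javap).

// HelloVM.java - a tiny program used only to illustrate the bytecode step.
public class HelloVM {
    public static void main(String[] args) {
        int sum = 0;
        for (int i = 1; i <= 10; i++) {   // C-style loop syntax, familiar from C/C++
            sum += i;
        }
        System.out.println("Sum 1..10 = " + sum);
    }
}

// Compile to platform-neutral bytecode:  javac HelloVM.java   (produces HelloVM.class)
// Run it on any machine with a JVM:      java HelloVM
// Inspect the bytecode the VM executes:  javap -c HelloVM

The same HelloVM.class file runs unchanged on Windows, Linux or Mac, which is the cross-platform point being made above.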
Unlike C# and Java, C and C++ compile into binaries which are executed natively. The pros of this are speed and power. Programs written in C++ and C are definitely faster than their C#/Java counterparts, and could potentially do more since they run on the native machine instead of a virtual machine. The downside to this is that BECAUSE these programs run on the native machine, they're potentially hazardous and could crash your computer if you wrote something incorrectly (like if you had a stack-based data structure and blew the stack... you'd probably crash your whole computer). C# and Java programs would simply crash the instance of the virtual machine, but your computer is left unharmed. Due to the speed and power, more demanding applications are typically written in C++ (or even partly in assembly, which is even faster). An example would be a PC game. Half-Life 2 is written in C++, and part of its graphics engine is written in assembly. It's obvious why a program like that would need every bit of speed achievable. Also, C and C++ tend to be more complicated than C# and Java since they're lower-level languages than C# and Java. Assembly, though, is an even LOWER level, so it's even harder.
And BASICALLY, the difference between C++ and C is that C++ is object oriented, and C is not. Yes, there are other differences, but that's the main one.
As always, there are exceptions to every rule, but in a NUTSHELL, that should answer your question! If you have any more, I'd be glad to help!
Originally posted by: screw3d
C+ is not a serious/real programming language.
Originally posted by: Sunner
Originally posted by: screw3d
C+ is not a serious/real programming language.

My first thought was "WTF?", then I noticed the lack of the second plus!
Originally posted by: Rangoric
I don't consider the JITed versions of C# and Java to be interpreted.
This includes all .NET versions of C# and most versions of Java.
Originally posted by: slugg
^^ But isn't Visual Basic practically English? To me, at least. LOL
Originally posted by: slugg
C#, like VIAN said, is a copy of Java. It compiles into bytecode which is read by a virtual machine or interpreter. It shares a similarity with C++ in that it's object oriented (OOP, or Object Oriented Programming), which is basically the "big thing" in programming now. If you can program in C#, you can program in Java and vice versa. For what it's designed to do, it's actually a very good language and I'd recommend it as a learning language (Java might be better since you can find more info on it... but it's really the same thing). C# and Java syntax is SIMILAR (but not the same) in many ways to that of C++ and C, so if you can program in C#, you shouldn't have too much of a problem reading and editing C or C++ code. For example, I started on Java and I picked up C++ in no time at all. I don't know about C#, but Java is definitely cross platform. You can create a Java application and have it run on Windows, Linux, Mac, you name it.

Java support is still a big pain on many platforms. That will change soon enough, at least for sparc and i386 platforms, as Sun open-sources Java, but we're still a long way from anything practical, and more obscure architectures still won't have good JITs.

Originally posted by: slugg
Java is more common than you'd expect. Java powers JSP (JavaServer Pages) technology, which is Sun's version of PHP or ASP

That's a stretch. asp.net is closer.

Originally posted by: slugg
(eBay, for example, is powered by JSP). Java is also used as a base for many scripting languages in games (for example, UScript, which powers the Unreal engines, is based on Java),

Are you implying that the Unreal engines require a JVM, or just that UScript is syntactically similar to Java?

Originally posted by: slugg
mini applets for cell phones, and other random things. IBM also uses a lot of Java technologies with their servers. It's really a great, but underappreciated, language. Since Java and C# are slightly simplified and run within specific parameters, they're considered "high level" languages. Other "high level" languages are Visual Basic and Perl.
Unlike C# and Java, C and C++ compile into binaries which are executed natively. The pros of this are speed and power. Programs written in C++ and C are definitely faster than their C#/Java counterparts, and could potentially do more since they run on the native machine instead of a virtual machine.

I'm probably going to get yelled at for this, but the 'faster' part is not nearly so simple. JITs can do a lot of optimizations that static compilers can't and, as the technology improves, JIT languages will become faster.

Originally posted by: slugg
The downside to this is that BECAUSE these programs run on the native machine, they're potentially hazardous and could crash your computer if you wrote something incorrectly (like if you had a stack-based data structure and blew the stack... you'd probably crash your whole computer).

Of course you won't crash your computer. The worst you could do is crash your program. Unless you're using an operating system from like the 80s or something.

Originally posted by: slugg
C# and Java programs would simply crash the instance of the virtual machine, but your computer is left unharmed.

Again, untrue. In C# and Java you can't do anything to the stack, nor can you do anything to the virtual machine (provided it doesn't have glaring bugs).
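To make the stack point concrete, here is a minimal Java sketch (the class and field names are made up for illustration). Blowing the stack in Java surfaces as a StackOverflowError that the program, or failing that the JVM, deals with; it never turns into the machine-crashing scenario described above.

// StackDemo.java - what "blowing the stack" looks like inside the JVM.
public class StackDemo {
    static long depth = 0;

    static void recurse() {
        depth++;
        recurse();              // unbounded recursion, never returns normally
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            System.out.println("Stack blew at depth " + depth
                    + ", but the process and the OS are fine.");
        }
    }
}

A C or C++ program doing the equivalent would be killed by the operating system (a segfault or similar), which still only takes down that one process, which is kamper's point.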
Originally posted by: slugg
Due to the speed and power, more demanding applications are typically written in C++ (or even partly in assembly, which is even faster).

Only if you really, really know what you're doing. There's nothing inherently faster about assembly language unless you know how to take shortcuts that a C compiler won't. For that to work you need to know a lot about processor architecture (pipelining and such), but it'd still never scale to the point that you could write much of significant size.
Originally posted by: Aikouka
C is a procedural language ( http://en.wikipedia.org/wiki/Procedural_programming ) that was created to serve as a low-level system programming language.

C was a high-level language when they created it.
Originally posted by: slugg
C# and Java are probably the most object oriented languages to date,

Not by a long shot. Look at Smalltalk and other such things. Heck, Python's more OO than C#/Java. Also, to nitpick, C# is more OO than Java because its native types are directly backed by objects, whereas Java treats them as true natives. And in C# there's a limited ability to treat methods like objects, for delegates and stuff. In Java you have to hack it up with nested classes and whatnot.
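For reference, here is a short sketch of the "hack it up with nested classes" approach mentioned above. The Callback interface and the class names are invented for illustration; the pattern is simply an anonymous inner class standing in for what C# would express as a delegate.

// DelegateStyle.java - passing behaviour around in (pre-closures) Java.
interface Callback {
    void run(String message);
}

public class DelegateStyle {
    static void fireEvent(Callback cb) {
        cb.run("event happened");
    }

    public static void main(String[] args) {
        // The anonymous inner class below plays roughly the role a C# delegate would.
        fireEvent(new Callback() {
            public void run(String message) {
                System.out.println("Handled: " + message);
            }
        });
    }
}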
Originally posted by: slugg
and this is good. The future is OOP (object oriented programming)

It's one path to the future but I doubt it's the most important. I personally think functional programming (or at least something related to that paradigm) is more interesting because of the possibilities for seamlessly using multiple processors.
Originally posted by: Aikouka
I fixed that description as they're not interpreted languages (no idea why I left out the fact that they're Object Oriented). They are, however, still interpreted, whether it be before the program commences (like Java and C# do) or on demand (like VBScript). The proper term for their interpreted-ness is Dynamic Translation or JIT (Just-In-Time Compilation, for anyone who's not familiar and like you mentioned), so I should probably use that.

You're confusing things big time. Interpreted != jitted.
Interpreting means you read the bytecode, figure out what it does, execute those commands on behalf of the bytecode and throw it away to be interpreted next time it executes. JIT compilation means you read the bytecode, compile it into machine code and then execute the machine code as well as storing it in memory for the next time the code executes.
Originally posted by: slugg
Christ... talk about tearing us apart. Aikouka and I were answering the original poster's question in a simple form... I especially mentioned a few times that I was speaking in generalities and that YES, there are exceptions to the rule... Calling me and Aikouka straight-up wrong because we left out minor details, just because we were trying to explain the gist of the whole concept to someone in a way they'd understand it, was not only counterproductive but a bit childish. I mean, think about it: you just shoved what we said in our faces without offering anything useful to the thread. I guess if "showing us up" was that important to you, then go ahead and take your 15 minutes.
Java used to be much slower than it is now, but for the most efficient code, assembler/C is still the best. It also does not help that the JVM eats a larger amount of memory to run than compiled C/C++ code.

Originally posted by: Templeton
Originally posted by: slugg
Christ... talk about tearing us apart. Aikouka and I were answering the original poster's question in a simple form... I especially mentioned a few times that I was speaking in generalities and that YES, there are exceptions to the rule... Calling me and Aikouka straight-up wrong because we left out minor details, just because we were trying to explain the gist of the whole concept to someone in a way they'd understand it, was not only counterproductive but a bit childish. I mean, think about it: you just shoved what we said in our faces without offering anything useful to the thread. I guess if "showing us up" was that important to you, then go ahead and take your 15 minutes.

Oh please. You may well have been speaking in generalities, but in a number of cases those generalities were flat out wrong. What's the harm in kamper providing accurate information? You're not doing the OP any favors by swaying him away from a set of languages with inaccurate info (Java / C# are interpreted and will be slower). You should try taking the above posts as a chance to improve your knowledge and the knowledge of the greater community rather than as a personal attack; you might see the usefulness in them then.
Originally posted by: kamper
C was a high-level language when they created it.

Heck, when I took my first C class they introduced it as a high-level language (and I'm still in school).
Originally posted by: kamper
You're confusing things big time. Interpreted != jitted.
Interpreting means you read the bytecode, figure out what it does, execute those commands on behalf of the bytecode and throw it away to be interpreted next time it executes. JIT compilation means you read the bytecode, compile it into machine code and then execute the machine code as well as storing it in memory for the next time the code executes.
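To make that distinction concrete, here is a toy sketch of the interpreter half of that definition. The "bytecode" format and opcodes are invented for illustration; the point is that every instruction is decoded again on every execution, whereas a JIT would translate the sequence to machine code once and reuse it.

// TinyInterpreter.java - a dispatch loop for a made-up stack machine.
public class TinyInterpreter {
    static final int PUSH = 0, ADD = 1, PRINT = 2, HALT = 3;

    static void interpret(int[] code) {
        int[] stack = new int[16];
        int sp = 0, pc = 0;
        while (true) {
            switch (code[pc++]) {            // decode work repeated on every run
                case PUSH:  stack[sp++] = code[pc++]; break;
                case ADD:   stack[sp - 2] += stack[sp - 1]; sp--; break;
                case PRINT: System.out.println(stack[--sp]); break;
                case HALT:  return;
            }
        }
    }

    public static void main(String[] args) {
        // equivalent to: print(2 + 3)
        interpret(new int[] { PUSH, 2, PUSH, 3, ADD, PRINT, HALT });
    }
}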
Originally posted by: Templeton
You may well have been speaking in generalities, but in a number of cases those generalities were flat out wrong.
Originally posted by: slugg
Christ... talk about tearing us apart. Aikouka and I were answering the original poster's question in a simple form... I especially mentioned a few times that I was speaking in generalities and that YES, there are exceptions to the rule... Calling me and Aikouka straight-up wrong because we left out minor details, just because we were trying to explain the gist of the whole concept to someone in a way they'd understand it, was not only counterproductive but a bit childish. I mean, think about it: you just shoved what we said in our faces without offering anything useful to the thread. I guess if "showing us up" was that important to you, then go ahead and take your 15 minutes.

Relax man, you said a bunch of stuff that was wrong and I pointed it out, along with supplying more useful answers. If that's not offering anything useful, I don't know what is. If you've got rebuttals to any of the above, I'd love to hear it.
Originally posted by: Aikouka
Originally posted by: kamper
C was a high-level language when they created it.
Heck, when I took my first C class they introduced it as a high-level language (and I'm still in school). I was referring to low-level as the scope to which the language is used. C is high order in the scope of all languages as it exists in a higher plane than Machine Code and Assembly; however, when it comes to the scope of where C is used, it has a lower order than you'd see Java programs at. That was the point I was trying to convey, although it seems I worded it poorly.

Yep, as you can tell, I was just in a really pedantic mood.
Originally posted by: Aikouka
Originally posted by: kamper
You're confusing things big time. Interpreted != jitted.
Interpreting means you read the bytecode, figure out what it does, execute those commands on behalf of the bytecode and throw it away to be interpreted next time it executes. JIT compilation means you read the bytecode, compile it into machine code and then execute the machine code as well as storing it in memory for the next time the code executes.
Where did I say that being interpreted means it's only compiled Just-in-Time? Generally speaking, interpreting is a basic part of translating languages into their lower-order form and that was the point I was trying to convey. I was by no means referring to these languages as being part of the Interpreted Language paradigm, as I fixed that gross error when I edited my previous post. Hence my use of the word (made-up or not) "interpreted-ness."

What I was getting at was that it seems like you were implying that interpretation and jitting were the same thing ("The proper term for their interpreted-ness is Dynamic Translation or JIT (Just-In-Time Compilation)"). The term 'interpret' is mutually exclusive with the term 'jit'. If you do a JIT compilation, you are not interpreting, or at least that's how someone who writes virtual machines would understand it. The reason is just that the two ways of doing it are very different and you need a term to refer specifically to the non-JIT way. So yeah, you could describe the process of jitting as first interpreting the code and then producing machine code, but it'd be an abuse of the technical term. Anyway, just getting pedantic again, I'm sure you already understood what I'm trying to say.

Originally posted by: Aikouka
Also, not all interpreted languages use bytecode. Being precompiled into bytecode isn't a requirement of an interpreted language and the only purpose of doing so is to try to speed up processing.

I think there are more purposes to going with bytecode. For instance, it's probably easier to sign and/or obfuscate bytecode. It's also smaller, so it's easier to transmit. You can also use other language syntaxes and compile them to the same bytecode. But those are all fringe benefits; of course speed is primary.