I'd say C++ is arguably the most useful language to learn (a TON of other languages borrow C++-style syntax and concepts that C++ pioneered, and the language itself is still one of the go-to choices for many applications). That said, it's also one of the most complex from a beginner's perspective: it involves direct memory allocation/management and other highly technical, messy concepts, and some of its syntax is quite arcane.
I'd recommend starting with Python. It's a much more modern language, has automatic memory management, cleaner syntax (though built on similar core concepts; if you can read Python, you can follow the gist of most C++), and is overall much easier to learn and use. Don't mistake this for training wheels: Python is a widely used, mature language at this point, created by Guido van Rossum (who later spent years at Google, a company that uses Python heavily).
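To give you a taste of what I mean by cleaner syntax, here's a tiny sketch (the sentence and names are just made up for illustration); notice there are no type declarations, no memory management, and no compile step:

```python
# Count how often each word appears in a sentence.
# Python reads almost like pseudocode.
sentence = "the quick brown fox jumps over the lazy dog the end"

counts = {}  # a dictionary; it grows automatically as we add entries
for word in sentence.split():
    counts[word] = counts.get(word, 0) + 1

for word, count in sorted(counts.items()):
    print(word, count)
```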
Once you get the essential programming concepts down with Python, you can move on to the messier topics of manual memory management and the like.
I'd recommend finding a community college class to take, since it's very helpful to have an instructor; but if that's not an option, this book seems to be held in high regard:
http://www.amazon.com/Python-Program...on+programming
I've never read it, so take that recommendation with a grain of salt, but from all appearances it looks good.
And just to explain the differences between Python and C++ a bit more clearly:
There are two distinct varieties of high-level programming language (high-level meaning the code is written in words a human can actually read): compiled languages and interpreted languages.
Compiled languages (like C/C++) start out as written code, which then gets run through a piece of software called a "compiler" that converts it, all at once, into a binary file of ones and zeros that the computer's processor can execute directly. These languages usually offer the most power from a hardware perspective, since you can manipulate individual bytes of memory and tweak your program to technical perfection. They also run the fastest, since the binary is executed directly by the hardware.
The downside is that a compiled program will ONLY work on the hardware (and operating system) it was compiled for, and the power of manual memory manipulation comes with the responsibility to use that power wisely. Memory errors are an everlasting bitch for someone who isn't used to them: your algorithm can be logically perfect, but if you forget to deallocate one byte of memory in its implementation, your whole program can spontaneously fail, and you won't even know where to look at first. At least not until you've made enough such errors to get used to them.
Interpreted languages (like Python or Ruby; Java is a middle ground, since it compiles to bytecode that runs on a virtual machine) came about to address the problems of compiled languages. They rely on a piece of software known as an "interpreter", which translates code into ones and zeros on the fly, sometimes even line by line (as with Python). The advantages: such code is portable, since it can run on any machine with the interpreter installed, and interpreters typically include automatic memory management, freeing the developer from chasing down near-invisible memory errors.
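Just to make that automatic memory management concrete, here's a toy sketch (the Blob class is made up for the example): in Python you never free anything yourself; the interpreter reclaims an object once nothing refers to it anymore.

```python
import weakref

class Blob:
    """Stands in for some big chunk of allocated memory."""
    pass

blob = Blob()
watcher = weakref.ref(blob)  # observes the object without keeping it alive

print(watcher())  # <__main__.Blob object at 0x...> : still alive
blob = None       # drop the only real reference; no delete/free call needed
print(watcher())  # None : the interpreter reclaimed the object on its own
```

(In the standard CPython interpreter this happens immediately, thanks to reference counting; other interpreters may reclaim the object a little later.)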
The downside is a loss of performance, since each line or chunk of code must be read, interpreted, and run in turn; in effect, all the time a compiled language spends compiling up front gets folded into the running time of the actual program. Automatic memory management (usually via a "garbage collector") also requires intelligent algorithms of its own, which run alongside your program and slow it down further.
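If you want to see that overhead for yourself, here's a quick sketch using Python's standard timeit module; exact numbers will vary by machine, but the gap between a loop interpreted step by step and the same work done inside the interpreter's compiled internals should be obvious:

```python
import timeit

# A plain Python for-loop pays interpreter overhead on every single iteration...
slow = timeit.timeit("total = 0\nfor i in range(1000):\n    total += i",
                     number=10_000)

# ...while the built-in sum() does the same arithmetic inside the
# interpreter's compiled C code, skipping most of that per-step overhead.
fast = timeit.timeit("sum(range(1000))", number=10_000)

print(f"interpreted loop: {slow:.2f}s   built-in sum: {fast:.2f}s")
```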
This disadvantage (and, correspondingly, the performance advantage of compiled languages) has in many cases been mitigated by the sheer power of modern hardware. But there are still TONS of applications (satellites, Mars rovers, cars, smartphones, even fighter jets) where the available hardware is not the most powerful, and more complex projects where the sheer weight of computation demands the best performance the hardware can give. There, compiled languages reign. In more general-purpose applications where portability is a must (the cloud, data centers, social networking, much of the internet in general), or where speed of development matters more than absolute performance, interpreted languages reign.
Anyways, hope this helped, and I wish you many productive hours of banging your head against your monitor until you figure things out. (You'll be doing that a lot, but it'll be worth it.)
