Future Programming Trends

CanadianCoder

Junior Member
Aug 2, 2016
8
0
36
One of the things I've always heard as a programmer is that we need to 'keep our skills up to date' throughout our career. Right now I'm fresh out of school and so pretty much up to date, but am looking forward to what skills I might want to invest in, in the near future to the next ten years.

Some quick google searching pulls up some of the buzzwords like Javascript frameworks, Rails, and Data Science, but I'm wondering if anyone can point me to any really great reading or articles predicting where the software world might be going.
 

purbeast0

No Lifer
Sep 13, 2001
52,856
5,728
126
After being in the industry for almost 13 years, imo it basically means to just do some kind of relevant programming on the side and to tinker. But don't tinker around in stuff like asm unless you want a job doing it. Tinker around in stuff that is current and uses current technologies. Find out what is popular and play around with it.

If you do this you will gain a skill that is invaluable - being able to pick up any language/tech to get the job done. Once you are at that point and understand the concepts you can basically get any gig.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
^ That's good advice: learning to learn on your own, outside the structured environment of school.

Have you worked with Amazon Web Services?

That's a good place to tinker since companies as large as Netflix use AWS for their web content. You can sign up and learn the console, identity management, setting up server images, etc. with some of it free and the rest fairly cheap (pennies an hour for a server instance). Just remember to shut down servers and services with hourly billing so they don't run 24x7x365.
 

Gryz

Golden Member
Aug 28, 2010
1,551
203
106
Languages don't matter much. Most of them are very similar. Some of them have a shorter way of writing something down. But it's still the same under the hood. So hunting down the latest "hip" language won't buy you much.

In my humble opinion, there is only one topic that is relevant for now and for the next ten years.

And that topic is: parallel programming.

CPUs will not become faster any more. We've got that behind us. The main improvement, and maybe the only improvement, we'll see in the next decade is that CPUs will have more cores. Most CPUs these days have 4 cores. You can buy CPUs with 16 or 64 cores already, but they are not mainstream yet. But that will happen. Ten years from now it might be common to have a few dozen cores in your PC, your laptop, and your phone. And larger systems (web servers, database servers, boxes for scientific research, core routers, etc.) might have a thousand cores or more.

But those extra cores will not be useful unless programmers write software that makes use of them. Most people don't appreciate this properly, but parallel programming is really complicated. It requires a certain way of thinking. It requires that you understand all the potential problems. Not only correctness problems, but also performance problems. And it requires that you understand all the solutions, and when to pick which solution where. Not easy. I believe very few programmers really understand parallel programming today.
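To make the correctness part concrete, here is a minimal C# sketch (the names are mine, purely for illustration) of the classic lost-update race and one way to fix it:

Code:
using System;
using System.Threading;
using System.Threading.Tasks;

class RaceDemo
{
    static void Main()
    {
        const int iterations = 1_000_000;

        // Buggy: every worker increments the same variable with no
        // synchronization, so updates get lost (a classic data race).
        int unsafeCounter = 0;
        Parallel.For(0, iterations, _ => unsafeCounter++);

        // Correct: Interlocked.Increment makes each update atomic.
        int safeCounter = 0;
        Parallel.For(0, iterations, _ => Interlocked.Increment(ref safeCounter));

        Console.WriteLine($"unsafe: {unsafeCounter} (usually less than {iterations})");
        Console.WriteLine($"safe:   {safeCounter} (always {iterations})");
    }
}

The broken version usually prints a different number every run, which is exactly why these bugs are so hard to track down.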

So if you ask me: what should I learn to stay up to date with modern software, my answer would be: parallel programming. I think it is the only thing that matters during the coming decade.
 
Last edited:
  • Like
Reactions: Ken g6

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
^ that depends a bit on what you want to do.

I mostly work on Windows desktop applications that do things on their own and also connect to our Amazon services. For many business and work applications there is no need for "real" parallel programming, just some threading: asynchronous UI and event handlers, a worker thread here and there, but each of those pieces is single-threaded. You need to understand what a race condition is and take some simple steps to avoid two asynchronous blocks of code fighting over some global state, but not much more than that.
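As a rough sketch of what I mean (illustrative only, not code from our product), the "simple steps" usually amount to one worker task plus one lock around the shared state:

Code:
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class WorkerThreadDemo
{
    // Shared state touched by both the worker and the "UI" side;
    // a single lock object keeps the two from fighting over it.
    static readonly object _sync = new object();
    static readonly List<string> _results = new List<string>();

    static async Task Main()
    {
        // The slow job runs on a worker; the worker itself is plain
        // single-threaded code.
        Task worker = Task.Run(async () =>
        {
            for (int i = 0; i < 5; i++)
            {
                await Task.Delay(100);                        // pretend this is slow work
                lock (_sync) { _results.Add($"record {i}"); }
            }
        });

        // The "UI" side keeps doing its own thing while the worker runs.
        while (!worker.IsCompleted)
        {
            lock (_sync) { Console.WriteLine($"progress: {_results.Count} items"); }
            await Task.Delay(50);
        }

        await worker;                                          // surface any worker exception
        lock (_sync) { Console.WriteLine($"done: {_results.Count} items"); }
    }
}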

For cloud services, parallel is important but the programming is often distributed rather than single-process parallel. You deal with high workloads and machine failures by creating multiple servers, but the tasks those servers work on often run multiple copies of single-threaded code.
 

Gryz

Golden Member
Aug 28, 2010
1,551
203
106
DaveSimmons, thanks for making my point. :)

First you say many programmers work on the GUI, and therefore don't need to know a lot about parallel programming. It is true that a lot of programming work is in the GUI and the "front-end". But can't you use multiple cores to make even the front-end go faster? E.g. loading resources from disk, or from the net, etc. If you dismiss parallel programming in all your front-end programs, that means your front-end programs will never improve in performance anymore, because each core will not (or hardly) become faster in the future. You're basically admitting defeat, and accepting that there will be no technological progress there anymore.
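Even something as simple as loading a handful of resources concurrently instead of one after another helps. A rough C# sketch (the file names are made up, just to show the shape of it; needs .NET Core for ReadAllTextAsync):

Code:
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

class ParallelLoadDemo
{
    static async Task Main()
    {
        // Hypothetical resources a front-end might need at startup.
        string[] files = { "strings.json", "layout.json", "theme.json" };

        // Kick off all the loads at once instead of awaiting them one by one.
        Task<string>[] loads = files
            .Select(f => File.ReadAllTextAsync(f))
            .ToArray();

        string[] contents = await Task.WhenAll(loads);   // all loads run concurrently

        Console.WriteLine($"loaded {contents.Sum(c => c.Length)} characters total");
    }
}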

The second thing you say is: "you need to understand what a race condition is, and put some semaphores around your critical sections". Yep, that's it, in a nutshell. Except for the fact that real life is a lot more complex. If all you can do is put semaphores around your critical sections, then you will run into performance problems very quickly. Maybe not in toy programs, but certainly in real-world performance-critical applications.
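Here is a small C# illustration of what I mean (using lock as the critical-section primitive; the numbers are arbitrary). Both versions are correct, but the first serializes all the workers on a single lock, while the second lets each worker run independently and synchronizes only once per worker:

Code:
using System;
using System.Threading;
using System.Threading.Tasks;

class LockContentionDemo
{
    static readonly object _gate = new object();

    static void Main()
    {
        const int perTask = 1_000_000;

        // "Just put a lock around it": correct, but every single increment
        // serializes the workers, so the extra cores buy very little.
        long coarse = 0;
        Parallel.For(0, 8, _ =>
        {
            for (int i = 0; i < perTask; i++)
                lock (_gate) { coarse++; }
        });

        // Same result, but each worker accumulates privately and only
        // synchronizes once at the end, so the cores actually run in parallel.
        long scalable = 0;
        Parallel.For(0, 8,
            () => 0L,                                    // per-worker local total
            (i, state, local) =>
            {
                for (int j = 0; j < perTask; j++) local++;
                return local;
            },
            local => Interlocked.Add(ref scalable, local));

        Console.WriteLine($"coarse = {coarse}, scalable = {scalable}");
    }
}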

Yes, distributed programming will also become a lot more important. But that is a form of parallel programming too. Just not with shared memory. You're more likely to use message-passing. Which might not make things easier. And your simple "just use semaphores" will get a whole different meaning in a distributed system without shared memory (and thus without semaphores).
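For the message-passing flavor, a rough C# sketch using a producer/consumer queue (BlockingCollection here, just as an example): the two sides never touch shared variables directly, they only exchange messages.

Code:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class MessagePassingDemo
{
    static void Main()
    {
        // The queue is the only thing the two sides share; there are no
        // locks or semaphores in the application code, just messages.
        using var mailbox = new BlockingCollection<string>(boundedCapacity: 16);

        Task producer = Task.Run(() =>
        {
            for (int i = 0; i < 5; i++)
                mailbox.Add($"job {i}");              // send a message
            mailbox.CompleteAdding();                  // signal: no more messages
        });

        Task consumer = Task.Run(() =>
        {
            foreach (string job in mailbox.GetConsumingEnumerable())
                Console.WriteLine($"handled {job}");   // receive and process
        });

        Task.WaitAll(producer, consumer);
    }
}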

So I think my point still stands: parallel programming will be the most important thing to learn in the next decade.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I still say it's important but only at a basic level for many developers. For tinkering I'd tell someone cloud infrastructure including client-server, databases and security is more important.

Many business applications spend most of their time either waiting on the user or waiting for a remote server. There are a lot more of us working on business applications than in scientific programming or on games. Creating extra worker threads just because you can is often counter-productive because it makes the application harder to QA and to maintain.

Many server applications expose services that are 100% single-threaded code. Yes, that code services multiple requests in parallel, but it lets the database engine or other external services take care of any shared data.

FYI, the application I spend the most time working on is licensed by over 1,500 companies and has over 3 million users so I do have a little real-world experience :)
 

CanadianCoder

Junior Member
Aug 2, 2016
8
0
36
Thanks for the tips!

I should add that I've been programming out in the working world for about three years, including a few internships, so the 'get the general concepts in order to adapt to any language' thing is pretty much where I'm at. That's also partly due to a really good college program that provided a good amount of breadth. Nonetheless, it's interesting to hear everyone's perspectives. I like the advice above to 'tinker around in what's popular here and there'; it might really be as simple as that.

After giving it some thought myself, one thing I've noticed is that a lot of people who started out in the mainframe days never really made the shift into object-oriented programming. So maybe something could be said for keeping an eye out for major paradigm shifts like COBOL -> C-based stuff, if that will even happen in the near future.

For a while there I think there was a bit of movement from web dev -> mobile, a bit of a mini-shift, but these days it seems like a lot of people are making their web presence adapted to mobile, rather than creating stand-alone apps.

Outside of that there seem to be a lot of buzzwords out there, but the bread and butter in 2016 looks mostly like web frameworks.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
I don't see parallel programming becoming any more "mainstream" unless we come up with a new, easier model to work with it. Dealing directly with threads, synchronization primitives, etc. is incredibly difficult even for programmers who are good and experienced. Introducing things like race conditions into your code is very easy, but the bugs caused by them are typically a huge pain to debug. In many (most, really) cases parallel programming isn't even necessary or gives little to no benefit.

Take the GUI example - GUIs are already multithreaded, and have been for a long time. At least, all of the decent ones that don't lock up every time you click a button have been. It's fairly rare for a GUI to need to do long-running, CPU-intensive work. Threading in this context is simply so that the UI stays responsive even though your file load or network request may take a second or two. In those situations it's usually much easier to use something like C#'s async/await rather than "true" multithreading. Yet even async/await has pitfalls and edge cases that I'd say the vast majority of C# developers aren't aware of.
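The typical shape of it (a toy sketch, with a placeholder URL) is just an async method awaiting the slow call, with no explicit threads anywhere:

Code:
using System;
using System.Net.Http;
using System.Threading.Tasks;

class AsyncIoDemo
{
    static readonly HttpClient _http = new HttpClient();

    // In a real GUI this would be a button-click handler; the await keeps
    // the calling (UI) thread free while the request is in flight, without
    // the handler creating or managing any threads itself.
    static async Task LoadPageAsync(string url)
    {
        string body = await _http.GetStringAsync(url);   // non-blocking wait
        Console.WriteLine($"{url}: {body.Length} characters");
    }

    static async Task Main()
    {
        await LoadPageAsync("https://example.com");
    }
}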

Another example, since we're talking C#. Microsoft made a big deal a few years ago when they added async/await support for ASP.NET MVC/WebAPI applications. I guess it does work for making highly scalable applications (think Stack Overflow), but I've always read that if your web application is hitting a single database server (which is the great majority of business apps, I'd wager), async/await gives basically no benefit because the database will become a bottleneck before the web server's thread pool does.

Anyway, I sort of wandered off topic there. I think the biggest shift that's underway is the integration of functional programming practices into mainstream programming. I really don't see a language like Haskell ever taking off for mainstream development, but a lot of their ideas (pure methods, avoid modifying state, etc.) are already leaking over to languages like C++ and C#.
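What that looks like in practice in C# (a toy sketch, nothing more): pure helper methods plus LINQ pipelines that build new values instead of mutating existing ones.

Code:
using System;
using System.Collections.Generic;
using System.Linq;

class FunctionalStyleDemo
{
    // A "pure" method: the result depends only on the inputs, and nothing
    // outside the method is read or modified.
    static decimal ApplyDiscount(decimal price, decimal rate) => price * (1 - rate);

    static void Main()
    {
        IReadOnlyList<decimal> prices = new List<decimal> { 10m, 25m, 40m };

        // LINQ builds a new sequence instead of mutating the original list,
        // which is exactly the "avoid modifying state" idea leaking over
        // from functional languages.
        List<decimal> discounted = prices
            .Select(p => ApplyDiscount(p, 0.10m))
            .ToList();

        Console.WriteLine(string.Join(", ", discounted));   // 9.00, 22.50, 36.00
    }
}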