In direct response to the topic, I do think that Moore's Law will end within the next decade, unless some particularly massive changes are made to the economic side of the equation.
First, let me point out how small Intel is. Yes, you read that correctly. Their annual revenue is about $53 billion; Walmart and Exxon each sit around the $450 billion mark.
Relatively speaking, Intel is a large corporation, seeing as they're ranked 54th in the Fortune 500. But they're still tiny compared to the bigger corporations out there, and they're absolutely minuscule in comparison to the US government (which collects about $1.5 trillion a year in income tax revenue).
Moore's Law could certainly continue, or could even be beaten -- it's all a question of money.
Fun hypothetical scenario: if the US government had spent $1 trillion on semiconductors instead of the Iraq War, we'd have 7nm chips in our hands right now (or better).
I'm not intending for this to be a political debate, but consider that we could be 8+ years ahead with a "simple" shift in priority. In reality, such a move is anything but simple, but if you were to stretch the truth, you could say that the Iraq War cost the US 8 years of technological progress. Was the Iraq War worth it? Perhaps, but that's a debate best avoided in this forum.
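To put a rough number on that "8 years" claim: Moore's Law is usually stated as transistor density doubling every ~2 years, which shrinks linear feature size by a factor of √2 per doubling. Here's a quick back-of-the-envelope sketch; the 22nm starting node and the 2-year cadence are assumptions for illustration, not hard data:

```python
import math

# Back-of-the-envelope Moore's Law arithmetic.
# Assumptions (not hard data): transistor density doubles every ~2 years,
# so linear feature size shrinks by sqrt(2) per doubling.
DOUBLING_PERIOD_YEARS = 2.0   # assumed Moore's Law cadence
START_NODE_NM = 22.0          # assumed current process node

years_ahead = 8
doublings = years_ahead / DOUBLING_PERIOD_YEARS        # 4 doublings
projected_node_nm = START_NODE_NM / math.sqrt(2) ** doublings

print(f"{doublings:.0f} doublings -> ~{projected_node_nm:.1f}nm")
# 4 doublings -> ~5.5nm, comfortably past the 7nm figure mentioned above
```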
For Moore's Law to continue, Intel's revenues and profits will have to increase dramatically, and more funding will need to flow into photolithography firms like ASML.
What about new materials? We've been building chips in pretty much the same way since integrated circuits were invented some 50 years ago. Sure, the techniques are incredibly refined now compared to early microchips, but there must be a better way...
New materials suffer from the same problem: money. There
are better ways... we just don't have the money to put a graphene/[insert supermaterial here] SoC in everybody's hands.