Still remember VMs and the terminals used to connect to mainframe computers? Well, guess what: I predict they will return by 2023, that is, 10 years from now. Here is my reasoning.
Moore's law predicts that the number of transistors on integrated circuits roughly doubles every two years. If we take a processor's clock rate (the frequency at which a CPU runs) as a proxy for processing speed and assume it doubles at the same pace, then let's look at what happens in the very near future.
The standard CPU frequency today is 3 GHz. Doubling every two years gives 6 GHz after two years, 12 GHz after four years, 24 GHz after six years, 48 GHz after eight years, 96 GHz after ten years, and so on. According to Moore's law, after a few decades we should have computers that process information at the scale of individual atoms [doi:10.1038/35023282] - a scenario that probably will not happen :) An even more interesting and amusing conclusion is drawn by Lawrence M. Krauss et al.: "Our estimate for the total information processing capability of any system in our Universe implies an ultimate limit on the processing capability of any system in the future, independent of its physical manifestation and implies that Moore’s Law cannot continue unabated for more than 600 years for any technological civilization" [arXiv:astro-ph/0404510].
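For the curious, the doubling arithmetic above fits in a few lines of Python (the 3 GHz starting point and the two-year doubling period are just the assumptions stated above, nothing more):

```python
# Toy extrapolation of the numbers above: clock rate doubling every
# two years, starting from today's 3 GHz. Illustrative only.
base_ghz = 3.0
doubling_period_years = 2

for years in range(0, 12, 2):
    freq = base_ghz * 2 ** (years / doubling_period_years)
    print(f"after {years:2d} years: {freq:6.1f} GHz")
```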
To keep up with the predictions of Moore's law, a qualitative step in technology is necessary, and very soon. But there is no novel technology today that could substitute the good old field-effect transistor. Whether it's silicon or graphene, I believe it will still be FETs that rule the world, for some time at least :)
So, what will happen in the next decades? I believe parallel computing will happen. The computational power will increase due to an increased number of processors working in parallel: 8-core processors, 16-core, 32-core, 64-core, 128-core, 256-core, ..., 65536-core CPUs - where is the end? The interesting thing, though, is that from this point processors would start to grow in size: they will get larger, bulkier, heavier and, worst of all, more power hungry.
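To illustrate where that extra power comes from, here is a minimal Python sketch that spreads an embarrassingly parallel toy workload across all available cores. Real workloads are rarely this perfectly divisible, so actual speedups saturate well below the ideal one-per-core scaling:

```python
# Minimal sketch: the same workload split into one strided chunk
# per core; ideally the wall-clock time drops by ~N for N cores.
from multiprocessing import Pool, cpu_count

def heavy(chunk):
    # stand-in for any compute-bound task
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    n = cpu_count()
    total_items = 10_000_000
    chunks = [range(k, total_items, n) for k in range(n)]
    with Pool(processes=n) as pool:
        partial = pool.map(heavy, chunks)  # runs on all cores at once
    print(f"computed {sum(partial)} using {n} cores")
```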
Bad news for mobile devices, no? Not necessarily, if high-speed data communication protocols improve. Why bother carrying the extra mass and volume of batteries if we can do the computation on the mainframe and transfer the results to the mobile device at high speed? It seems that the near future will bring a lot of focus and attention to better networks - cellular, long- and short-range wireless, cable, and all other types of networks.
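As a toy illustration of that mainframe/terminal split (not a proposal for a real protocol - the hostname and port are made up), here is what offloading a computation could look like with Python's standard XML-RPC machinery:

```python
# --- mainframe side (server) ---
# The heavy lifting runs here; the device only ships inputs and results.
from xmlrpc.server import SimpleXMLRPCServer

def heavy_computation(n):
    # stand-in for work a battery-powered device would rather offload;
    # returned as float to stay within XML-RPC's integer limits
    return float(sum(i * i for i in range(n)))

server = SimpleXMLRPCServer(("0.0.0.0", 8000))
server.register_function(heavy_computation)
server.serve_forever()
```

And on the device, the "terminal" is just a remote call:

```python
# --- terminal side (client) ---
# "mainframe.example" is a made-up hostname for illustration.
from xmlrpc.client import ServerProxy

mainframe = ServerProxy("http://mainframe.example:8000")
print(mainframe.heavy_computation(10_000_000))
```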
But can wireless networks sustain such a large amount of data transfer without local interference? What is the theoretical speed limit of a wireless link? And what data rate would future devices actually require? Let's say we will have fancy-pancy phones in the near future with screen resolutions of 2560x1440 pixels - about 3.7e6 pixels, or roughly 8.8e7 bits per frame at 24-bit colour - which have to be transferred at a 50 Hz refresh rate to make that next-level shooter game of yours feel so real :) This implies about 4.4 Gbps (8.8e7*50) of raw data for a single device, though video compression should bring the requirement down to the order of 100-200 Mbps.
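Here is the back-of-the-envelope calculation with all the assumed numbers (24-bit colour, 50 Hz refresh, and an optimistic ~50x codec compression ratio) spelled out:

```python
# Back-of-the-envelope bandwidth for streaming a rendered screen.
# All figures are ballpark assumptions, not measurements.
width, height = 2560, 1440
bits_per_pixel = 24   # assumed colour depth
refresh_hz = 50       # assumed refresh rate
compression = 50      # assumed video codec compression ratio

raw_bps = width * height * bits_per_pixel * refresh_hz
print(f"uncompressed: {raw_bps / 1e9:.1f} Gbps")                # ~4.4 Gbps
print(f"compressed:   {raw_bps / compression / 1e6:.0f} Mbps")  # ~88 Mbps
```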
An interesting article about the fundamental limits of computation, which are not that far away...
http://www.nature.com/nature/journal/v406/n6799/full/4061047a0.html