First, the software engineering field is still very immature. The tools and terminology in use are still primitive. Studies that try to identify best practices through evidence are few and far between. Worse yet, the known best practices that do have substantial evidence behind them are widely ignored across the industry. I believe there is still a lot of room for new techniques, and new models of software engineering, which will guide developers for generations.
Second, we are at a critical juncture in programming languages. A lot of new languages are gaining popularity, and many older languages are seeing substantial change. The fundamental paradigms used by the average programmer are shifting (generally becoming more functional). Meanwhile, even more advanced concepts are on the horizon (monads, for example) that are still not accessible to most programmers. There's a huge opportunity for the next generation of languages to make those advanced techniques accessible. Additionally, we're in the midst of a revolution in compiler technology: modular compilers, advanced runtimes capable of remarkable speed improvements, cutting-edge garbage collectors, and the potential for massive parallelism on the desktop. Again, there's huge potential for some very fundamental innovations.
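To give a taste of the kind of concept I mean, here is a minimal sketch of one of those "advanced" ideas, the Maybe monad in Haskell. This example is my own illustration, not something from any particular language proposal; it shows how a monad lets you chain computations that can fail without writing an explicit error check after every step.

    -- A small illustrative sketch: chaining fallible steps with the Maybe monad.
    import Text.Read (readMaybe)

    -- Parse a string as a number, then take its reciprocal.
    -- Either step can fail; the monad threads the failure through for us.
    safeReciprocal :: String -> Maybe Double
    safeReciprocal s = do
      n <- readMaybe s        -- Nothing if s isn't a number
      if n == 0
        then Nothing          -- Nothing on division by zero
        else Just (1 / n)

    main :: IO ()
    main = do
      print (safeReciprocal "4")    -- Just 0.25
      print (safeReciprocal "0")    -- Nothing
      print (safeReciprocal "oops") -- Nothing

The point isn't this particular example; it's that a mainstream programmer shouldn't need to learn a new vocabulary of category theory to get this kind of safety, and future languages have room to package it far more approachably.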
Third, we are nowhere near the end of the line in computing hardware. Over the next few decades we could see some wild things: 3D microchips putting trillions of transistors into every CPU; even more massive parallelization than anything we've seen so far; RSFQ processors that could put petaflop-scale computing power on a single chip; MRAM married to processor cores, leading to breakthroughs in performance; non-von Neumann architectures enabled by mixing memory and processing together; FPGAs with billions of gates, able to operate and reconfigure themselves at multi-gigahertz speeds; asynchronous CPUs.
And these are just the things at a very fundamental level. There's still huge potential for innovation in the software and hardware space closer to the end user. There's plenty of room for more icons in the future.