
At any given time, a technology or two captures the zeitgeist. A few years ago it was social media and mobile that everybody was talking about. These days it's machine learning and blockchain. Everywhere you look, consulting firms are issuing reports, conferences are being held and new "experts" are being anointed.

In a sense, there’s nothing wrong with that. Social media and mobile computing really did change the world and, clearly, the impact of artificial intelligence and distributed database architectures will be substantial. Every enterprise needs to understand these technologies and how they will impact its business.

Still, it's worth remembering that we always get disrupted by what we can't see. The truth is that the next big thing always starts out looking like nothing at all. That's why it's so disruptive. If we could easily see it coming, it wouldn't be. So here are three technologies you may not have heard of, but should start paying attention to. The fate of your business may depend on it.

New Computing Architectures

In the April 19, 1965 issue of Electronics, Intel co-founder Gordon Moore published an article observing that the number of transistors on a silicon chip was doubling roughly every two years.

Over the past half century, that consistent doubling of computing power, now known as Moore’s Law, has driven the digital revolution.
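
To see why that doubling matters, consider the arithmetic: anything that doubles every two years grows about a thousandfold every two decades. Here is a back-of-the-envelope sketch in Python. The starting chip and count (the Intel 4004's roughly 2,300 transistors in 1971) are our illustrative baseline, not figures from Moore's paper:

```python
# Back-of-the-envelope Moore's Law: transistor count doubles every two years.
# Baseline (Intel 4004, ~2,300 transistors, 1971) is illustrative.

def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Run it and the projection climbs from thousands to billions of transistors in fifty years, which is roughly the trajectory real chips followed.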

Today, however, that process has slowed, and it will soon come to a complete halt. There are only so many transistors you can cram onto a silicon wafer before subatomic effects come into play and make it impossible for the technology to function.

Experts disagree on exactly when this will happen, but it’s pretty clear that it will take place sometime within the next five years.


There are, of course, a number of ways to improve chip performance other than increasing the number of transistors, such as FPGAs, ASICs and 3D stacking.

Yet those are merely stopgaps and are unlikely to take us more than a decade or so into the future.

To continue to advance technology over the next 50 years, we need fundamentally new architectures like quantum computing and neuromorphic chips.

The good news is that these architectures are well along in development, and we should start seeing commercial impact within five to ten years.

The bad news is that, because these are fundamentally new architectures, nobody really knows how to use them yet. We are, in a sense, back to the early days of computing, with tons of potential but little idea how to actualize it.
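
To get a sense of how unfamiliar these models are, here is a minimal sketch of a two-qubit entangling circuit written with the open-source Qiskit library. The library choice is ours for illustration; the article doesn't prescribe any toolkit. Instead of deterministic logic, the program manipulates superposition and entanglement:

```python
# A minimal quantum circuit sketch using Qiskit (pip install qiskit).
# Illustrative only; the article names no specific quantum toolkit.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition (Hadamard gate)
qc.cx(0, 1)                  # entangle qubits 0 and 1 (controlled-NOT)
qc.measure([0, 1], [0, 1])   # measurement collapses both qubits together

print(qc)                    # renders the circuit as ASCII art
```

Nothing here resembles a conventional branch-and-loop program, which is exactly the point: the hardware may arrive well before the programming models mature.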
