Written by Tim Huckaby, Chairman & Founder

The End of Touchscreens as We Know Them

I love making technology industry predictions; I have been doing it for over 25 years. At the end of each year we always see publications full of predictions for the coming year, so clearly I'm not the only one who makes them. But I rarely see people admit when they were wrong on a prediction. I'd love to see an end-of-year roundup: "Here are all the predictions from last year that were wrong." That would be super interesting because of the explanations of why those predictions did not happen. To me, the reasons why are a lot more interesting than the prediction itself.

For over 20 years now, I have been asked to make technology predictions. I have most certainly been wrong a few times over the years; totally wrong. The wrong ones make for great stories, and I love to tell a good story. However, more often than not, my predictions have been spot on; a portion of those have been picked up and elaborated on by the press. I'm not interested (and my guess is neither are you) in cataloging all the correct predictions I have made throughout the years. In this article I am just going to make a prediction for 2021. And at the end, I will sidebar an elaboration of my favorite wrong prediction.

My prediction for 2021

Well, here is my prediction for 2021 and it is not too much of a stretch:

Because of COVID-19, we have reached the end of the era of touching screens as we know it. Touchscreens will be replaced by other interactive technologies quickly. Other interfaces to computing systems, like voice, gesture, and our own mobile phones, will go on a steep adoption curve.

Think about it. Are you ever going to touch a screen in a public place again without thinking about it first? Are you going to touch anything in a public place without thinking about it first? Or will you touch the screen and then immediately sanitize your hands? We are already making a conscious shift in the way we use our hands in public. We have all heard the horrific stories about bacteria on touchscreen kiosks in the fast food industry; those kiosks will become dinosaurs now. And the large form factor touchscreens in malls and retail: no way would you or I touch one of those right now or in the foreseeable future.

DOOH (digital out-of-home) advertising is not going anywhere, meaning it is not going to disappear. It is the way we get immersive and interactive content from those large screens that will change.

Since studies have shown that COVID-19 can survive quite nicely on the touchscreen of your phone, you may already be sanitizing your phone after going out in public. If you are not, you will be; you should be. That would be smart.

Bacteria- and virus-resistant touchscreens exist today. They are required in many parts of the healthcare industry in the US, but they are very expensive compared to their low-cost commercial alternatives. We may just see this market explode; certainly, the world could use low-cost germ-, bacteria-, and virus-resistant touchscreens in a variety of form factors.

Interestingly enough, it was Microsoft that was the first major technology vendor to introduce touch support, in November of 2006 with the release of the Windows Vista OS. How do I know this? Well, I was the one who did the flagship killer demo standing next to Steve Ballmer during his keynote at the launch event. But Vista did not include the user interaction design needed to use touch effectively. It was hard enough to touch the Start button; hitting the little "X" in the top right-hand corner of a window was practically impossible. It wasn't until the iPad was released a few years later, in 2010, that touch became a prevalent and commonly accepted way to use a computer.

Since touching a screen has become so prevalent in the way we use computers and large form factor screens to get information, what are we to do now that we cannot safely touch them?

Natural User Interfaces

At InterKnowlogy we were building "natural user interfaces" for large form factor screens over a decade ago, predominantly via voice and gesture recognition with 3D cameras like the Microsoft Kinect and the Intel RealSense. We were way ahead of our time. You can see much of that work on The InterKnowlogy Vimeo Site or specifically here and here. Now, with the pandemic and the risk of catching the virus by touch, the norms by which we consume content in interactive and immersive ways will change. It is not much of a stretch to imagine people talking, waving, and pointing mobile phones at large form factor digital screens to consume content in immersive ways. Retail, healthcare, hospitality, events, and restaurants are the industries that will seemingly have to lead this paradigm shift.

I believe 3D cameras (LiDAR, depth, infrared) like the Intel RealSense are going to explode into the market for immersive and interactive experiences that don't require touch. Shoot, like many people across the world, I don't even type my password to get into Windows anymore; my Microsoft Surface Book securely authenticates me with facial recognition.

Content Creation and Consumption

Eventually, voice recognition will take over from the keyboard. For the input and creation of content in the form of text, nothing really matches a keyboard...yet. I'm a machine gun at the keyboard. That comes from the mandatory sophomore-year typing class at Crespi, an all-boys Catholic high school in Encino, CA, 40+ years ago. Still, I can talk faster than I can type.

But for the consumption of content, voice recognition is legit. So, you ask, "Why does Siri still suck?" Well, the answer is not in software; the answer is in computing power. Given the right amount of computing power, today's voice recognition software is simply spectacular. Voice recognition from companies like Microsoft, Amazon, and others, when running on powerful computers, achieves simply amazing accuracy. It's just that with so much on my iPhone competing for compute resources, there is not enough left over for Siri to be legitimately effective (yes, I realize my iPhone 8 pales in comparison to the accuracy of newer iPhone models, because they have more compute power). This is exactly why you have seen voice recognition, whether in phones or Alexa or in your car, get better and better: companies like Intel have consistently increased the compute power of their CPUs. In fact, that is the really good news: Moore's Law is still on track, and as a result some predict that our CPUs will calculate at the speed of the human brain by 2025, which is a whole different article in itself that I'll have to write.
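The Moore's Law arithmetic behind projections like this is easy to sketch. Here is a minimal back-of-envelope helper, assuming compute roughly doubles on a fixed period; the chip and brain figures below are placeholder estimates I chose for illustration, not numbers from this article:

```python
import math

def years_until(target_ops, current_ops, doubling_period=2.0):
    """Years of exponential doubling needed for current_ops to reach target_ops."""
    doublings = math.log2(target_ops / current_ops)  # how many doublings are required
    return max(0.0, doublings * doubling_period)    # already there -> 0 years

# Hypothetical figures: a ~1e14 ops/s chip today versus a commonly
# cited ~1e16 ops/s ballpark for the human brain.
print(round(years_until(1e16, 1e14), 1))
```

The answer is entirely a function of the assumed baseline, target, and doubling period, which is exactly why such predictions swing so widely depending on whose figures you start from.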


The pandemic has caused a lot of business chaos and, worse yet, sickness, death, and fear. But there is opportunity that will grow from the ashes. One of those opportunities is pushing technology beyond commonly accepted norms in DOOH and vertical industries like healthcare, retail, and hospitality. There is huge opportunity to safely create new patterns for interfacing with digital systems without requiring touch: opportunity in what we at InterKnowlogy call User Interaction Design, the way humans interact with digital computing systems.

Epilogue: My Favorite Prediction that was Wrong

My favorite wrong prediction was a double whammy, and I love telling the story. I fear only those with grey hair will get it, though. It was a long time ago: I was a fairly confident young dev lead on an architecture team in the late nineties, working on a server product at Microsoft in Building 25 in Redmond, WA. This was right at the time when Bill Gates wrote his famous email about Microsoft being late to the internet and the need to pivot quickly. I cut my programming teeth on beautiful binary protocols that were efficient, fast, and easy to secure. Well, I'm far from a network engineer, but in the late 90s I first looked at the "interweb" as a toy. I was not impressed with it as the network for a software development platform and was confused about why we were going to hop on its bandwagon. I was specifically not impressed with two major parts:

  •  TCP, the transport layer, and IP, the internet protocol, commonly referred to together as TCP/IP. TCP/IP was:
    • Slow: the ridiculous routing hops made it ridiculously slow.
    • Inefficient: unlike binary protocols, it is stateless. It is also heavyweight.
    • Not secure: at the time there was no real way to secure the wire.
  •  HTML. I can remember vividly saying to my team:
    • "This isn't a programming language. There isn't even recursion."
    • "And what the hell is that form submit awfulness?"
    • "I could be more effective in markup with HP-GL, for God's sake."

So, yeah, I predicted the death of the internet in 1997.

What I didn't realize is that Cisco would make a bad thing worse (for my prediction, anyway) by shipping its first 2500 Series router. The industry installed a gazillion of those routers, overcoming the route-hopping weaknesses and making the internet "fast." The World Wide Web was also free, which was a biggie; in fact, huge. Most binary networks came with a cost. Because the internet, and specifically the World Wide Web, was legitimized by Cisco, supporting technology emerged and standards bodies formed that overcame the weaknesses. ISAPI filters paved the way for real high-level software development in Java, classic ASP, and then .NET. Years later, the roads were also paved for client-side libraries built in JavaScript. Security layers surfaced on and off the wire to support the growing demand for secure enterprise web applications. Now we are "stuck" with the "inter-web" forever.