Jeff Hebert
Vice President of Engineering

Automation, AR, and Jobs: How AR Will Train and Assist Workers in the Context of Automation

Automation replacing workers has been a persistent news headline lately.  Advances in technology have changed the economics enough that the return-on-investment equation now pencils out for more and more applications.  An industry like fast food, for example, is seeing the encroachment of automation as wages increase and the cost of technology falls—Wendy’s is adding self-service ordering kiosks to 1,000 (15%) of its stores in 2017 alone.  Similarly, Amazon is seeking to fully automate brick-and-mortar checkout with its Go grocery store concept.  The predictions can sound pretty bleak.

While technology advancements in areas like machine learning and robotics are major factors enabling wider application of automation, the same advancements are also fueling a revolution in human augmentation.  An explosion of artificial intelligence and augmented reality (AR) technologies is shedding light on the future capabilities of augmented humans.  While automation will take over high-volume, repetitive tasks where the high cost of infrastructure can be more easily recouped over time, lower-volume tasks, especially those requiring more flexibility and adaptation, will be better served by augmented workers than by complete automation.

What is Augmented Reality?

AR means the real-time use of information, delivered to the user via graphics, audio, text, and other methods, to augment human senses, knowledge, capabilities, and experiences.  At a high level, it represents contextual computing and the future of human-machine interfaces.

AR Embodiments

Most people think of head-mounted displays (HMDs) when AR is mentioned, Google Glass being an early example.  While HMDs will be a common way to deliver AR (as they are coupled with the user’s changing field of view and are hands-free), not all use cases lend themselves to HMDs.  For the lowest-cost and least-immersive AR experiences, mobile phones and tablets are proving to be a powerful platform.  Apple recently incorporated ARKit into iOS 11, enabling developers to use the onboard camera, inertial sensors, and display to deliver AR experiences through app development alone.  For use cases requiring custom sensing or alternative displays, non-HMD form factors are more appropriate, such as heads-up displays in automobiles and infrared or chemical sensors for industrial workers.

How AR and Automation Will Coexist

It makes the most sense to replace jobs with automation when those jobs involve highly repetitive tasks—especially those which don’t require complex perception, manipulation, creativity, or social intelligence.  Whether it’s repetitive assembly, scanning barcodes, or sorting packages, automation pencils out when a system can perform a predictable task over and over again.  This enables engineers to implement relatively simple algorithms, robotics, and fixtures while avoiding higher-level artificial intelligence and robotic flexibility.

But many jobs and tasks aren’t so repetitive.  An auto mechanic, for example, must perform hundreds of procedures on different models and different areas of cars using many different tools.  Developing artificial intelligence sophisticated enough, and articulation and actuation flexible enough, to fully take over such tasks from humans would be a very significant undertaking.  Given this, we will see a proliferation of AR systems (HMDs, non-HMDs, and mobile devices) that augment workers with the contextual information necessary to perform and check tasks as they work.  The auto mechanic (or physician, for that matter) will benefit from sensors feeding machine learning algorithms that aid diagnosis and check procedures for accuracy and completeness, but will not lose their job to automation any time soon.

AR promises a way to train and assist workers that will remain economically favorable for certain tasks and industries for some time to come, despite continued advancements in machine learning and robotic automation.  Workers displaced by automation will give employers additional economic incentive to retrain people through AR in a way that wasn’t possible during previous periods of job loss driven by technical innovation.  At Synapse, we’re investing in our machine learning, sensing, and robotics capabilities to support both automation and AR in this rapidly changing landscape.

Main image via Adobe Stock
