WALL-E Just Wants to Help: A Future of Industrial Robotics Safety Retrofits
When I’m between projects my mind wanders a bit, and I tend to think about technologies that seem unique to a single industry and why that may be. Recently my mind keeps coming back to SteamVR™ Tracking, specifically the tracking of objects in space. This technology is currently limited in its application because most engineers haven’t been exposed to how object tracking works in VR. I think it could easily be used in other applications, especially given the promise of 5G communications and the Internet of Things. One industry where I see trackable objects being extremely useful is industrial manufacturing and the expanding market of collaborative robotics (cobotics).
Creating Awareness and Consideration
In the future, individual robots will be expected to perform a larger variety of tasks — from operating in professional kitchens, to training athletes, to performing multiple unique operations on an airplane assembly line — which is why it’s important to look ahead at how interaction between humans and robots will develop. As industrial settings evolve, companies will actively weigh the productivity and safety of humans within these systems. Inserting humans into a rigidly configured industrial environment usually cannot be justified due to the safety risk, and significant opportunity exists in making these systems adaptable to unpredictable operators and environmental influences.
Imagine if a technician could enter an environment of industrial robots and exoskeletons without any of the neighboring equipment shutting down, because the system could dynamically adapt to the individual's presence. Solutions of this nature could actively keep people safe and maintain uptime for manufacturing lines. During service and maintenance of neighboring equipment, the individual could remain in the working space of an operating robot or machine; the system's active awareness of the individual would let the rest of the line keep running, avoiding downtime.
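As a sketch of how that dynamic adaptation might work, here is a minimal Python example. The radii and the linear slowdown policy are my own illustrative assumptions, not part of any real safety standard or product; the idea is simply that the distance to the nearest tracked person maps onto a speed scale for a machine:

```python
def speed_factor(distance_m, stop_radius=0.5, slow_radius=2.0):
    """Map the distance (meters) to the nearest tracked person onto a
    speed scale for a machine.

    Inside stop_radius the machine halts; between stop_radius and
    slow_radius it slows down linearly; beyond slow_radius it runs at
    full speed. The radii here are illustrative placeholders.
    """
    if distance_m <= stop_radius:
        return 0.0
    if distance_m >= slow_radius:
        return 1.0
    return (distance_m - stop_radius) / (slow_radius - stop_radius)
```

With these placeholder radii, a technician 0.3 m away halts the machine, one 3 m away leaves it at full speed, and anyone in between slows it proportionally — so neighboring equipment degrades gracefully instead of shutting the whole line down.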
So why don’t large manufacturers with fully or semi-automated assembly lines already have advanced capabilities like this? One reason might be the development cost and time burden for each equipment manufacturer to implement unique perception systems on a robot-by-robot basis. Another is the cost to operators of replacing perfectly functional installed equipment in order to realize this capability. It is cheaper in the short-term to power down a line for service than to develop and implement a fully-custom, robust system of this kind. Even if that is the case, the long-term prospect of having minimal downtime with this level of combined flexibility and safety is highly desirable. Consider an aftermarket retrofit solution to this problem — while low-cost retrofits aren’t typically associated with high-value capability expansion and robust safety, is it possible to deliver both with mature and rapidly maturing technologies?
Current State of the Art
Robots and cars currently tend to have complex sensor suites, and the data is processed by an onboard computer to characterize the environment. This is not unlike human stereo vision, hearing, and pressure and heat sensing, all of which our brains perceive and process to allow us to comprehend our environment from a distance. Humans and many robots are limited to a first-person perspective of the environment in which they operate.
What opportunities could we unlock if we had a third-person perspective of a complex environment? Technologies currently used to track objects in VR systems and relate a user’s motion to what they see in VR provide exactly this third-person perspective. SteamVR’s trackable objects, like the VIVE Tracker, require only strategically placed infrared (IR) sensors and two IR beacons (referred to as lighthouses) at the corners of the work space, so that each object knows its position relative to the beacons. With this system, we can determine the current position of an object in space to within a centimeter of its actual position, and we can define the volume within which a piece of equipment must operate. Combined, these technologies would give industrial robots a third-person point of view of their surroundings and position in space. Because the lighthouses fix the volume of operation, a system of this type is best suited to equipment that stays within a few cubic meters for extended periods.
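To make the "defined operating volume" idea concrete, here is a minimal Python sketch. The axis-aligned box and the `Zone`/`contains` names are hypothetical illustrations of my own; a real system would work in the calibrated lighthouse coordinate frame and likely support more complex volumes:

```python
from dataclasses import dataclass

@dataclass
class Zone:
    """Axis-aligned operating volume in the tracking coordinate frame (meters)."""
    min_corner: tuple  # (x, y, z) lower bound
    max_corner: tuple  # (x, y, z) upper bound

    def contains(self, point, margin=0.0):
        """True if `point` lies inside the zone, expanded by `margin` on
        every face (a positive margin gives a safety buffer)."""
        return all(lo - margin <= p <= hi + margin
                   for p, lo, hi in zip(point, self.min_corner, self.max_corner))
```

Each tracked pose coming off the system could then be checked against the equipment's assigned zone — for example, a 2 m cube at the origin contains the point (1, 1, 1) but not (3, 1, 1) — and a violation could trigger the kind of adaptive slowdown described above.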
How This Might Work
Currently, VR systems are aware of wired (headsets) and wireless (handheld devices) trackable objects in a fixed volume, and show the user via optical displays where the objects are in space. In the industrial robotics scenario, such a system only requires spatial data, removing the need to transmit video and thereby significantly relieving the communication load. Coupled with advancements in low-latency 5G wireless technology, this opens up the possibility of a low-cost retrofit for industrial robots that helps keep workers safer. At a minimum, I see an opportunity that opens up all kinds of manufacturing scenarios: legacy machinery working with current machinery, machines working within each other's zones of motion, and humans working in tandem with new and old robots.
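To illustrate just how lightweight spatial data is compared to video, here is a sketch of a pose message. The wire format is entirely my own invention for illustration, but the point holds for any reasonable encoding: a device ID, a timestamp, a 3D position, and an orientation quaternion fit in a few dozen bytes, so even a 100 Hz stream per tracker is kilobytes per second, orders of magnitude below a compressed video feed:

```python
import struct

# Hypothetical wire format (little-endian, no padding):
# uint16 device id, uint32 millisecond timestamp,
# float32 x/y/z position, float32 w/x/y/z orientation quaternion.
POSE_FORMAT = "<HIfffffff"

def pack_pose(device_id, t_ms, position, quaternion):
    """Serialize one tracker pose into a compact binary message."""
    return struct.pack(POSE_FORMAT, device_id, t_ms, *position, *quaternion)
```

Under this format each message is 34 bytes (2 + 4 + 7 × 4), so a tracker reporting at 100 Hz generates roughly 3.4 kB/s — comfortably within the latency and bandwidth budgets 5G promises.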
A Glance Into the Future
Two things come to mind when I imagine how a system like this could perform: the drumming performance in the opening ceremony of the 2008 Beijing Olympics, a beautiful, highly synchronized display of routine; and the murmuration of starlings (start at t=0:45), a highly complex, naturally adaptive system. One moment our system of robotic arms is performing in concert like a well-oiled machine in its standard fashion, and then all of a sudden a chaotic event occurs that forces all of the machines to adapt their pace and movements to it. Possibilities like this make me excited for new and upcoming technology like 5G, which has the potential to unlock compelling applications of available tech such as VR trackable devices.