Studies indicate that reliable interaction between human and robot team members requires more than a single mode of communication. One emerging technology that may meet this multi-modal communication requirement is the tactile display. But is a new language based on touch feasible? Dr. Daniel Barber, a research scientist at the Institute for Simulation and Training's ACTIVE Lab with extensive experience in unmanned and intelligent systems, will discuss the results of experiments intended to measure our ability to interpret "vibrotactile" displays of communication from robotic team members.