Since its introduction in the late 1990s, the concept of ambient intelligence has evolved into a multi-disciplinary field of study spanning computer science, mobile computing, interaction design, and cognitive science. Ambient intelligence, a branch of artificial intelligence, focuses on imbuing computers with the ability to perceive and interact in ways that are natural for humans but have traditionally been out of reach computationally, such as common sense, serendipity, analogy, insight, sensory fusion, anticipation, aesthetics, and emotion. At CyLab's Visual Intelligence Studio, research in this dynamic and evolving field focuses more specifically on instinctive computing as the foundation for ambient intelligence. Instinctive computing is founded on the premise that the fundamental difference between living beings and computers is instinct, which profoundly influences how people look, feel, think, and act. By developing ways to computationally simulate biological and cognitive human instincts such as recognition and understanding, as well as more primitive instincts, computers can be made genuinely intelligent and able to interact more naturally with people. These concepts are central to the work of Dr. Yang Cai, CyLab Senior Systems Scientist and Visual Intelligence Studio Director.
Recent work by Cai and other researchers at the Visual Intelligence Studio has focused on the development of Visual Thinking Agents (VTAs), embeddable software tools for visual intelligence designed to address a range of problems. For example, the ever-increasing volume of visual content flowing through our phones, computers, and networks raises new security and privacy concerns. As the flow of images and video grows at a rate that exceeds the capacity for human attention and vigilance, VTAs can help reduce the overflow of visual information by doing some of the monitoring for us. Cai and his research team have also developed VTAs to aid with pattern detection in areas such as anomalous network events, the mobility of wireless users, and harmful algal blooms seen in satellite imagery. VTA-enabled pattern detection technology has even been used locally in Pittsburgh at Old St. Luke's Episcopal Church to detect 3-D patterns and help decipher identifying information on worn, unreadable tombstones. Government agencies such as NASA and NOAA are already employing CyLab-developed VTA technology in their work as well.
The Visual Intelligence Studio is just one of several research centers at CyLab dedicated to revolutionizing next-generation computing and networks through the development of cutting-edge technology with real-world applications. Learn more by visiting the Visual Intelligence Studio website.