It is often taken for granted that family and friends recognize us when we enter a room. They can also read our facial expressions to determine if we are happy, sad, excited, worried, or frustrated. They may even smile and roll their eyes when we say something outlandish or exaggerated.
However, advances in artificial intelligence and related technologies like computer vision, which helps computers understand images and video, have allowed robots to perform many of the same tasks, though they lack empathy and true emotional understanding and are not sentient.
Despite this lack of consciousness, the technology is still relevant today. These machines, often referred to as “social robots,” provide companionship as well as emotional and learning support for children and adults, who can play with them, talk to them, and even snuggle them like a pet. Social robots also work alongside people in warehouses and factories, transporting goods. And they serve as research and development platforms, giving researchers the opportunity to study how humans interact with robots and make further advances in the field.
What is a Social Robot?
A social robot is one that can interact with humans and with other robots. Built on artificial intelligence, social robots are often equipped with cameras, microphones, and other sensors, along with technologies like computer vision, that let them respond to touch, sound, and visual cues much as humans do.
Social robots come in many shapes and sizes, from human-like faces mounted on static pedestals to furry, tail-wagging dogs. While their aesthetic design is important for encouraging interaction and human engagement, it is often what is inside that matters most.
When people think about a social robot, the first thing they picture is a robot capable of communicating and understanding intent and mood, much like humans do, according to Alexander Kernbaum, interim director of the Robotics Laboratory at SRI International, the research institute that created Apple’s Siri and the first telerobotic surgical system, da Vinci.
Robots have many ways to do this. They can read facial expressions and respond with a smile, or track people with their eyes to show they are paying attention. In an effort to improve pedestrian safety around autonomous vehicles in Japan, researchers placed a pair of large googly eyes on top of a self-driving golf cart and observed how pedestrians responded.
If you cross in front of the car, its eyes follow you, Kernbaum explained to Built In, so you know the car has seen you. It’s a form of communication.
Kernbaum said it is not clear whether people would accept a car with googly-eye stickers on its front. Still, those googly eyes increase human engagement, adding a layer of interaction that humans usually enjoy with each other.
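As a toy illustration of the gaze-following idea described above (this is not the Japanese researchers’ actual system, just a sketch under assumed coordinates), a quadrant-aware arctangent is all it takes to work out where a pair of eyes should point to track a pedestrian:

```python
import math

def eye_pan_angle(eye_pos, target_pos):
    """Return the pan angle (degrees) that points an eye at a target.

    Positions are (x, y) in metres in an assumed car-centered frame:
    +x is forward, +y is to the car's left. 0 degrees means looking
    straight ahead; positive angles look left, negative look right.
    """
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    # atan2 handles all four quadrants, unlike a plain atan(dy / dx)
    return math.degrees(math.atan2(dy, dx))

# A pedestrian crossing left-to-right, 5 m ahead of the car:
for lateral in (3.0, 0.0, -3.0):
    angle = eye_pan_angle((0.0, 0.0), (5.0, lateral))
    print(f"pedestrian offset {lateral:+.1f} m -> pan {angle:+.1f} deg")
```

Recomputing this angle each frame as the pedestrian moves is what makes the eyes appear to follow them.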
What are the Uses of Social Robots?
Social robots can be found today as companions and support tools for children’s development, primarily in autism therapy and social-emotional learning. Social robot pets have also proven an effective form of therapy for people with dementia.
Social robots can also act as concierges in hotels, malls, and other service settings.
It all depends on how loosely one defines a robot. Social robots have become even more personal: smartphones use built-in social AI tools such as Siri to help us avoid traffic jams, compose text messages, and add events and meetings to our calendars.
Examples of Social Robots
Social robots are constantly evolving, whether in the workplace, on the streets, or with our loved ones. These are just a few examples of social robots.
FURHAT BY FURHAT ROBOTICS
Furhat is a social robot that can be customized for prototyping and application development. Researchers and developers can update Furhat’s code to test different verbal and nonverbal response modes. Furhat can be outfitted with different masks representing different human likenesses across gender, age, and race; it can even portray anime characters and dogs. More than 200 voices are available.
According to Furhat Robotics’ website, the robot can also communicate through facial expressions, head movements, and eye gaze, and can even raise its eyebrows to add emphasis to a conversation. Furhat is equipped with computer vision to track facial expressions and can interact with up to 10 people simultaneously.
JENNIE BY TOMBOT
Jennie is a robot dog designed to offer emotional support to people with dementia and other health conditions, helping reduce the effects of loneliness, anxiety, and depression. Modeled after a Labrador retriever puppy and equipped with voice-command software and touch sensors, Jennie can make puppy sounds and wag its tail.
MISTY BY MISTY ROBOTICS
Misty Robotics, a social robot company acquired by Furhat, created Misty, a mobile social robot that can be programmed to interact with people. Misty has been used to conduct temperature screenings in workplaces and healthcare settings, and it is now being marketed to researchers working with people with Alzheimer’s and autism, according to TechCrunch.
Misty can express a variety of emotions using its eyes and sounds. Its neck, head, and arms can also move, allowing it to convey curiosity or excitement. According to Misty Robotics’ website, the robot responds to touch, can recognize and remember people, and uses artificial intelligence to detect 80 object classes.
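The 80-class figure matches the widely used COCO dataset that many off-the-shelf detectors are trained on. As an illustration of how an application typically handles a detector’s raw output before a robot reacts to it (this is not Misty’s actual SDK; the function, data, and labels here are made up for the sketch):

```python
# Illustrative only: raw_detections and the labels below are invented;
# a real robot SDK would supply them from its onboard detector.
def filter_detections(raw_detections, min_confidence=0.6, wanted=None):
    """Keep (label, confidence) pairs above a threshold, optionally by class."""
    kept = []
    for label, confidence in raw_detections:
        if confidence < min_confidence:
            continue  # drop low-confidence guesses
        if wanted is not None and label not in wanted:
            continue  # drop classes the robot shouldn't react to
        kept.append((label, confidence))
    return kept

raw = [("person", 0.92), ("dog", 0.40), ("chair", 0.75)]
print(filter_detections(raw))                     # high-confidence hits only
print(filter_detections(raw, wanted={"person"}))  # react to people only
```

Thresholding like this is what keeps a social robot from greeting a coat rack it half-mistook for a person.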
MOXIE BY EMBODIED
Time named Moxie one of its top inventions of 2020. Developed by the robotics and artificial intelligence company Embodied, this social robot encourages social-emotional learning by engaging children in play-based activities. Moxie is designed for children between the ages of 5 and 10.
It can respond to eye contact, conversation, and facial expressions, and it can remember people, places, and things. In one six-week study, twelve children were asked to interact with Moxie for fifteen minutes each day; participants showed a 55 percent increase in self-esteem and a 43 percent increase in emotional regulation, along with improved interpersonal and communication skills.
ORBIT BY BEN POWELL
Ben Powell, a Loughborough University graduate, created Orbit, a small social robot that helps children with autism develop social skills through storytelling and physical touch. Orbit also communicates visually, for example through facial expressions. According to a Loughborough University press release, Powell, who was diagnosed with mild, high-functioning autism, created Orbit to help children “see emotions in context” and learn social skills on their own.
Powell stated in the release that Orbit can be given a personality to help children empathize and build a relationship with it. This helps the user learn social responsibility and recognize how their actions can affect the robot’s feelings: pushing Orbit too hard or hitting it, for example, will make Orbit look sad or scared.
QTROBOT BY LUXAI
QTrobot is another social robot that educators and families use to support autistic children at school and at home. LuxAI’s website states that QTrobot can increase attention and engagement while decreasing anxiety and overstimulation in a learning environment. LuxAI also created a research-and-development platform, QTrobot V2, equipped with 3D cameras, high-quality microphones, text-to-speech, and emotion and gesture recognition, which researchers and institutions can purchase for studying social robotics.
Future of Social Robots
It is not only how social robots evolve but also how humans develop alongside them that will determine their future. Given that we are not always able to understand one another, it will be crucial for humans to be able to understand the intent of social robots.
Tracking eyes on self-driving cars, for example, could be one way to improve human-robot interaction, helping with both safety and managing expectations.
Kernbaum stated, “Once you know these social cues then we can just listen.”
However, managing expectations can be a complicated design problem, according to Brad Porter, founder and CEO of Collaborative Robotics, a California-based company that develops a collaborative robot, or cobot, for industrial settings. Porter believes human-robot interaction and social awareness play an important role.
“I believe if you’re trying to do something in a work environment, even if the robot can be socially interacted and collaborated with, you can do a lot with your design in terms of signaling what it is capable of and what its limitations are,” Porter stated.
A social robot geared toward children or other social support environments must behave and interact differently from one deployed in a workplace, where expectations can be very different.
Porter stated, “If your robot looks like a cartoon character, then it is likely to have child-like interactions and behaviors.” That expectation can be difficult to meet: children are creative and unique, and infusing all of that into a robot is challenging.
Artificial intelligence, in addition to design, will be crucial for the future of social robotics.
Kernbaum believes AI is important and that it’s worth considering what artificial intelligence, and the robot platforms it will be housed in, can do better than human beings, given their incredible access to data. Kernbaum said he is not concerned about AI taking over our jobs. “But they will be better at certain things and we may want to concentrate on those.”
AI-powered social robots already show promise in providing personalized learning and support for children and older adults. But as in other areas of robotics, their value may lie not in performing a task exactly as a human would, but in performing it consistently, something we humans, with all the daily struggles of being human, are not always capable of.