At CES 2017, you have to ask: What makes a robot a robot?

Tall, small, dancing, singing, cleaning, sassy, silly, cute and scary -- CES 2017 was awash with robots.

While there were obvious entries like the humanoid “Pepper,” exhibits at the colossal electronics show stretched the definition of a robot to include Amazon Echo-like hubs that featured glowing, orb-like eyes or bobbed to music.

You can thank Amazon’s Alexa voice assistant for helping to blur the lines between a robot and a gadget. As more and more smart devices find their voice, first comes conversation between us and them, and then comes an emotional attachment. CES hinted strongly at a big future for the prospect of robots in your home.

But what makes a robot? Does a voice assistant such as Amazon’s Alexa, found in LG’s Hub Robot, automatically count? Does it need human features? I talked to exhibitors at the show to get their take.

Someone to talk to

Olly, a donut-shaped desktop robot that flips up vertically and is covered with gently pulsating LEDs, may not fit the classical robotic image, but its creator, Emotech CEO Hongbin Zhuang, has very specific ideas about what makes Olly a robot.

The key is a combination of personality, movement and interaction, he said. The deep learning capabilities of Olly give each bot a unique personality that develops over time to mimic the personality of its owner.

Meet Olly, a fusion between smart home hubs, like the Amazon Echo, and smart home robots. CNET

Zhuang pointed to two Olly robots sitting side by side in Emotech’s booth at CES Unveiled. While one Olly whizzed around energetically, responding to anyone and everyone who engaged with it, its neighbor was calmer and more reserved -- it only spoke when spoken to.

“A robot is basically an automaton,” said Harry Floor, CEO of Jupiter 9 Productions. “It’s something that can do something automatically, whether it’s something that is programmed with gears or programmed with AI.”

It’s a boom time for robots of all kinds. Market researcher IDC forecasts that worldwide spending on robotics and related services will hit $188 billion in 2020, more than doubling the $91.5 billion spent in 2016. Manufacturing accounts for the largest slice of that, but the consumer robotics sector in 2016 accounted for a not-too-shabby $6 billion.

Last year also saw 128 robotics-related startups pull in $1.95 billion in funding, 50 percent more than in 2015, according to The Robot Report.

Freedom of motion

“Would you say a drone is a robot?” asked Terry Fong, director of NASA’s Intelligent Robotics Group at CNET’s CES robotics panel. “Well yes, as soon as it becomes more autonomous, it is definitely a robot.”

Autonomy is a key defining factor of robotics, which is why, according to Fong, the majority of robotics graduates are being hired by companies building driverless cars.

While a robot’s autonomy might be an important factor, it does not need to be all-encompassing.

“People think robots are only successful if they’re 100 percent hands off, but the reality is quite different than that,” said Fong. Our environments are always changing and are never completely predictable, he added. Therefore robots will need to be trained to be adaptable and to deal with uncertainty.

Even when robots are trained to adapt, humans will have to support them when needed.

Take Roomba -- the original robot vacuum cleaner. Roomba’s ability to vacuum independently of humans is why we call it a robot. It has been designed to deal with uncertainty up to a point, but it is still reliant on humans, and we shouldn’t necessarily expect that to change anytime soon.

In the eye of the beholder

Looks can also play into the ways we categorize robots in our minds.

Pepper, created by Japanese company Softbank, is the poster robot for a breed of home and service droids designed to look friendly, approachable and trustworthy.

LG is taking a similar tack, Chief Technology Officer Skott Ahn said in an interview last week. The company focused on shape and design to create an emotional attachment between people and their bots. By giving its Hub Robot eyes, LG effectively turned a static smart home hub into a robot.

While both adopt human features, they wouldn’t be mistaken for actual people.

“We very specifically chose not to approach the uncanny valley with Pepper,” said Softbank’s Steve Carlin, referring to the sensation of disgust some people experience when they encounter something that looks very close to human without actually being human.

Not all roboticists have shied away from traveling this path. Hanson Robotics CEO David Hanson doesn’t put much stock in the uncanny valley. His company makes hyper-realistic humanoid robots with malleable skin that ripples and moves as the bots’ facial expressions change.

Hanson’s Einstein robot features incredibly lifelike facial expressions. Katie Collins/CNET

At the other end of the spectrum there are LG’s lawn-mowing bot and Roomba. These small, functional machines may not look like Hanson’s creations or Pepper, but that doesn’t mean they deserve to have their robot status revoked.

Adjusting our expectations

Pop culture has shaped our expectations of what a robot should be. But the concept of the robot predates the advances in technology that have occurred over the past 50 years.

“You could literally go back thousands of years to the time of the Greek and Roman empire where they created something they think of as a robot,” Floor said. “Society has become programmed to think a robot is something that has a computer in it.”

For Hanson, his own humanoid droids do not define his view of what a robot is. “The technical definition of robot would be very inclusive,” he said. All that is required to define something as a robot is sensor input, motorized output and some kind of control system on the inside making decisions.

According to this definition, even unlikely suspects such as the inkjet printer in your home could be called robots, he explained. “In my opinion there are lots of robots around the world. They are hidden. You don’t call them robots, but technically they are robots.”

For now, though, the industry is in a period of intense experimentation where we’ll see all sorts of devices arrive at shows like CES.

“We are attempting a lot of approaches,” Ahn said. “For me, I’m not sure which form factor or type of robot becomes majority. We would like to test the waters with many, many types of robots.”

This article originally appeared on CNET.com

