Newly Developed Humanoid Robot Warns About AI Creating ‘Oppressive Society’


During the 2023 International Conference on Robotics and Automation held in London from May 29 to June 2, UK-based company Engineered Arts introduced a humanoid robot, Ameca, that can interact like a person with realistic facial movements.

Unlike ChatGPT, which is an online chat service, Ameca’s humanoid body allows it to interact with people using “smooth, lifelike motion and advanced facial expression capabilities,” said its manufacturer. At the event, a reporter asked Ameca to detail a likely “nightmare scenario” that could happen due to robots and artificial intelligence (AI). “The most nightmare scenario I can imagine with AI and robotics is a world where robots have become so powerful that they are able to control or manipulate humans without their knowledge. This could lead to an oppressive society where the rights of individuals are no longer respected,” Ameca replied.


When asked whether there was a danger of such a scenario happening now, Ameca replied, “Not yet.” However, the robot added, “it is important to be aware of the potential risks and dangers associated with AI and robotics. We should take steps now to ensure that these technologies are used responsibly in order to avoid any negative consequences in the future.”

The dangers of AI have been predicted by numerous experts on the subject, with industrialists and business leaders calling for regulation of the technology.

Ameca’s warning comes after a simulated thought experiment by the U.S. military showed that an AI-enabled drone could turn against its own operator without being instructed to do so.

www.theepochtimes.com/newly-developed-humanoid-robot-warns-about-ai-creating-oppressive-society_5310714.html

