

Transformation of AI to become Human

Artificial intelligence (AI) is the science of making machines that can think like humans.

Abstract

In the field of artificial intelligence (AI), the primary aim is to make AI more similar to humans. Currently, while AI has many capabilities, it still lacks some human-like qualities. However, there is an idea of AI being able to understand emotions, learn from experiences, and even be creative, just like humans. With ongoing advancements in technology, this idea is becoming more achievable. Imagine AI that truly understands what people are saying, learns from its mistakes, and maybe even grasps emotions. Although there is still work to be done, the future could bring AI that is not just intelligent, but almost like having another human around. The following sections explore examples and concepts of how people are working to make AI more human-like in emotion, cognition, and movement, along with the latest developments.

Artificial intelligence (AI)

Artificial intelligence (AI) aims to teach computers to imitate human intelligence. The field is central to computer science and is applied widely across industries and research areas. Everyday tools like Google Search and Siri use AI, and its flexibility is evident in innovations like self-driving cars and AI-generated art. Its influence is seen in simplifying tasks, improving efficiency, and changing how things are done. The ongoing progress of AI suggests further advancements that will shape the future.

Emotion in AI: Practice through Human Emotion Study

Jean-Marc Fellous suggests integrating "robot-emotions" into artificial intelligence (AI). In his 1999 scholarly work (Fellous, 1999),[1] he explores how AI can benefit from insights into human cognition. He argues that giving AI emotions could improve how robots communicate and make decisions. However, achieving this goal requires a deep understanding of emotions, which are complex and multifaceted. This attempt calls for collaboration across disciplines to fully grasp the details of human emotion. If successful, robots could go beyond being simple tools and take on more interactive and emotionally attuned roles in human settings. This proposal highlights the potential of blending emotional intelligence into AI systems, marking a significant shift toward robots displaying emotional expressions.
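As an illustration only (not Fellous's model), one very simple way an "emotion" could influence a robot's decision-making is a running appraisal of event valence that biases action selection. All names, numbers, and thresholds below are invented for the sketch:

```python
def appraise(event_valence, mood):
    """Update a scalar mood in [-1, 1] from an event's valence.

    A toy appraisal rule: the new mood is a weighted blend of the old
    mood and the incoming event. Real affective models are far richer.
    """
    mood = 0.8 * mood + 0.2 * event_valence
    return max(-1.0, min(1.0, mood))

def choose_action(mood):
    """Let the emotional state bias which behaviour the robot selects."""
    if mood > 0.3:
        return "engage"      # positive mood: approach and interact
    if mood < -0.3:
        return "withdraw"    # negative mood: back off
    return "observe"         # neutral mood: wait and watch

# A friendly event nudges the mood upward; repeated positive events
# eventually tip the robot from observing into engaging.
mood = appraise(1.0, 0.0)    # first positive event -> mood 0.2
first = choose_action(mood)
mood = appraise(1.0, mood)   # second positive event -> mood 0.36
second = choose_action(mood)
```

The point of the sketch is only that emotion here acts as a persistent internal state modulating behaviour, rather than a direct stimulus-response mapping.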

AI programming in robot motion

The cited article explores the enhancement of motion planning for robots through artificial intelligence (AI). Motion planning involves a robot determining movement routes without colliding with obstacles. AI serves as the cognitive framework for these robots, facilitating informed decision-making regarding their movements. Through AI, robots can leverage experiential learning to refine their movement skills over time. Additionally, AI helps in identifying optimal pathways for navigating obstacles. Consequently, with the integration of AI, robots can operate more efficiently and safely, whether in industrial settings or autonomous driving scenarios (Kavraki, 2007).[2]
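The cited work concerns sampling-based motion planning. A minimal sketch of the idea, assuming a 2D workspace with circular obstacles and a basic rapidly-exploring random tree (RRT), might look like this; all function names and parameters are illustrative, not taken from the paper:

```python
import math
import random

def collision_free(p, q, obstacles, steps=20):
    """Check the straight segment p -> q against circular obstacles,
    each given as ((cx, cy), radius)."""
    for i in range(steps + 1):
        t = i / steps
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        for (cx, cy), r in obstacles:
            if math.hypot(x - cx, y - cy) <= r:
                return False
    return True

def rrt(start, goal, obstacles, bounds=(0.0, 10.0),
        iters=2000, step=0.5, seed=0):
    """Grow a random tree from start; return a start->goal path, or None."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {start: None}
    for _ in range(iters):
        # Occasionally sample the goal itself to bias growth toward it.
        sample = goal if rng.random() < 0.1 else (
            rng.uniform(*bounds), rng.uniform(*bounds))
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        if d == 0:
            continue
        t = min(1.0, step / d)  # extend at most `step` toward the sample
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        if not collision_free(near, new, obstacles):
            continue
        nodes.append(new)
        parent[new] = near
        if math.dist(new, goal) < step and collision_free(new, goal, obstacles):
            parent[goal] = new
            path = [goal]            # walk parents back to the start
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
    return None

path = rrt((1.0, 1.0), (9.0, 9.0), obstacles=[((5.0, 5.0), 1.5)])
```

The key property, as in sampling-based planning generally, is that the planner never builds an explicit map of the free space: it only needs a collision check, and random sampling discovers a feasible route around the obstacle.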

Robot Chips

Researchers are striving to make robots look and move more like humans, even teaching them to dance. However, teaching them dance moves is difficult: it is usually done by manually programming each step, which is slow and inflexible. Researchers at the University of Tokyo found a better way called "learning-from-observation" (LFO).[3] Instead of coding every single move, the robot learns by watching humans dance and copying them. Because robots and humans move differently, special models decide which parts to copy and then adjust them for the robot. This two-step process helps the robot dance better and seem more human. It is a big breakthrough in making robots behave in a more natural and human-like way, and could open up possibilities for robots to interact with people in more lifelike ways, from entertainment to healthcare.
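The two-step LFO idea (decide what to imitate, then adjust it for the robot) can be shown with a toy sketch. Here poses are assumed to be tuples of joint angles, the "what to imitate" stage is reduced to keeping only sufficiently distinct key poses, and the "how to imitate" stage is reduced to clamping angles into the robot's joint limits; this is a drastic simplification of the actual method:

```python
def extract_key_poses(trajectory, threshold=0.2):
    """Stage 1 (what to imitate): keep a pose only when it differs
    enough from the last kept pose, discarding near-duplicates."""
    keys = [trajectory[0]]
    for pose in trajectory[1:]:
        if max(abs(a - b) for a, b in zip(pose, keys[-1])) > threshold:
            keys.append(pose)
    return keys

def retarget(pose, joint_limits):
    """Stage 2 (how to imitate): clamp each observed joint angle
    into the robot's mechanical range (lo, hi)."""
    return tuple(min(max(a, lo), hi) for a, (lo, hi) in zip(pose, joint_limits))

# Observed human poses (two joint angles each); some exceed what the
# hypothetical robot can physically reach.
human = [(0.0, 0.0), (0.05, 0.0), (0.5, 0.3), (0.5, 0.31), (1.2, -0.9)]
limits = [(-1.0, 1.0), (-0.5, 0.5)]

robot_moves = [retarget(p, limits) for p in extract_key_poses(human)]
```

Even in this toy form, the pipeline reflects the article's point: the robot does not copy every frame verbatim, but selects the essential poses and maps them onto its own body.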

Humanoid Robot

In robotics, researchers aim to create robots that can talk and interact with people effectively. One impressive example is 'Robovie,' crafted by Hiroshi Ishiguro and colleagues. Robovie stands out for its blend of engineering and brain science. It can chat with people just like a human. With parts that move like human limbs and sensors to perceive its environment, Robovie acts a lot like a person. Its operation relies on smart computer programs. Scientists have looked into how people and robots communicate, focusing on body language and the robot's understanding of human speech. Using this insight, the team designed Robovie to excel in chatting with humans, making conversations easy and natural (Ishiguro, 2001).[4]
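The combination of sensing, speech understanding, and body language described above can be caricatured as a single perceive-and-respond rule. The rules and thresholds below are entirely hypothetical; Robovie's actual control software is far more sophisticated:

```python
def respond(utterance, proximity_m):
    """Pick a gesture and a spoken reply from two simple sensor cues:
    what was heard, and how close (in metres) the person is standing.
    Purely illustrative rules, not Robovie's real behaviour."""
    if proximity_m < 0.5:
        gesture = "turn_toward"   # person is close: face them directly
    else:
        gesture = "wave"          # person is far: wave to attract attention
    if "hello" in utterance.lower():
        reply = "Hello! Nice to meet you."
    else:
        reply = "Could you say that again?"
    return gesture, reply
```

The structural point is that a conversational robot couples its verbal channel (the reply) with a nonverbal one (the gesture), both driven by what its sensors report about the person.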

Current Development

In March 2023, the Figure humanoid robot was introduced, marking a significant advancement in AI for robots. It is now preparing to team up with BMW Manufacturing to help with car production. Though the exact start date is uncertain, this collaboration represents an exciting step in integrating robots into automotive manufacturing. BMW plans to create a new facility in South Carolina for making car batteries and hopes to use Figure's robots there. These robots can move and make decisions, which could make the factory run more smoothly and efficiently. This partnership shows progress in using AI-powered robots to change how manufacturing works (Designboom, 2024).[5]

Future

The Future of Cyborgs - Theoretical Yet Promising

In "Cyborg Morals, Cyborg Values, Cyborg Ethics," Kevin Warwick[6] discusses merging humans and machines to create cyborgs and how this affects ethics. He looks at experiments where people connect their bodies to computers to improve themselves. Warwick shares his own experiences with implants and how they changed his sense of self. He argues it is important to think about the ethical problems caused by these new technologies. As humans start to become more like machines, Warwick believes we need to discuss what it means to be human. He thinks that the lines between humans and machines are getting blurry, making us question things like who we are and what is right or wrong. Warwick wants us to talk about these changes and carefully consider how cyborg technology might affect us and our world.

References

  1. ^ Fellous, Jean-Marc (1999). The neuromodulatory basis of emotions.
  2. ^ Kavraki, Lydia E. (2007). "Sampling-based robot motion planning: Towards realistic applications". Computer Science Review. 1: 2–11. doi:10.1016/j.cosrev.2007.08.002.
  3. ^ Aucouturier, Jean-Julien; et al. (2008). "Cheek to Chip: Dancing Robots and AI's Future". IEEE Intelligent Systems. 23.
  4. ^ Ishiguro, Hiroshi (2001). "Robovie: an interactive humanoid robot". Industrial Robot: An International Journal. 28 (6): 498–504. doi:10.1108/01439910110410051.
  5. ^ Burgos, Matthew (2024-01-18). "Figure's humanoid robots to work and assist at BMW's car production factory". Designboom. Retrieved 2024-04-16.
  6. ^ Warwick, Kevin (2003). "Cyborg morals, cyborg values, cyborg ethics".