
Cognitive computing (CC) describes technology platforms that, broadly speaking, are based on the scientific disciplines of artificial intelligence and signal processing. These platforms encompass machine learning, reasoning, natural language processing, speech recognition and vision (object recognition), human–computer interaction, dialog and narrative generation, among other technologies.[1][2]


Definition

At present, there is no widely agreed upon definition for cognitive computing in either academia or industry.[1][3][4]

In general, the term cognitive computing has been used to refer to new hardware and/or software that mimics the functioning of the human brain[5][6][7][8][9][10] and helps to improve human decision-making.[11][12] In this sense, CC is a new type of computing whose goal is more accurate models of how the human brain/mind senses, reasons, and responds to stimuli. CC applications link data analysis and adaptive user interfaces (AUI) to adjust content for a particular type of audience. As such, CC hardware and applications strive to be more affective and more influential by design.

Some features that cognitive systems may express are:

  • Adaptive: They may learn as information changes, and as goals and requirements evolve. They may resolve ambiguity and tolerate unpredictability. They may be engineered to feed on dynamic data in real time, or near real time.[13]
  • Interactive: They may interact easily with users so that those users can define their needs comfortably. They may also interact with other processors, devices, and cloud services, as well as with people.
  • Iterative and stateful: They may aid in defining a problem by asking questions or finding additional source input if a problem statement is ambiguous or incomplete. They may "remember" previous interactions in a process and return information that is suitable for the specific application at that point in time (a minimal code sketch follows this list).
  • Contextual: They may understand, identify, and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, user’s profile, process, task and goal. They may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided).[14]
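
The iterative, stateful behavior described above can be pictured with a minimal sketch. All names here are hypothetical illustrations, not any vendor's actual API: a session object remembers prior turns, asks a clarifying question when a request is too ambiguous, and folds contextual elements such as the user's profile into its answers.

```python
# Minimal illustrative sketch of an "iterative and stateful" cognitive session.
# All names are hypothetical; real platforms expose far richer interfaces.

class CognitiveSession:
    def __init__(self, user_profile):
        self.user_profile = user_profile   # contextual element: who is asking
        self.history = []                  # stateful element: prior interactions

    def ask(self, query):
        self.history.append(query)
        if self._is_ambiguous(query):
            # Iterative element: request clarification instead of guessing.
            return f"Could you clarify what you mean by '{query}'?"
        context = {"user": self.user_profile, "turns": len(self.history)}
        return f"Answering '{query}' with context {context}"

    @staticmethod
    def _is_ambiguous(query):
        # Toy heuristic: very short queries lack enough context.
        return len(query.split()) < 2


session = CognitiveSession(user_profile="analyst")
print(session.ask("sales"))                  # ambiguous -> clarifying question
print(session.ask("sales by region in Q3"))  # answered with accumulated context
```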

Use cases

Cognitive analytics

Cognitive computing-branded technology platforms typically specialize in the processing and analysis of large, unstructured datasets.[15]

Word processing documents, emails, videos, images, audio files, presentations, webpages, social media and many other data formats often need to be manually tagged with metadata before they can be fed to a computer for analysis and insight generation. The principal benefit of utilizing cognitive analytics over traditional big data analytics is that such datasets do not need to be pretagged.

Other characteristics of a cognitive analytics system include:

  • Adaptability: cognitive analytics systems can use machine learning to adapt to different contexts with minimal human supervision (a code sketch follows this list).
  • Natural language interaction: cognitive analytics systems can be equipped with a chatbot or search assistant that understands queries, explains data insights and interacts with humans in natural language.
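
As a rough illustration of the "no pre-tagging" point above, the following sketch groups raw, unlabeled text documents using unsupervised learning. The use of scikit-learn here is an arbitrary assumption for illustration; the cited platforms do not prescribe any particular library.

```python
# Sketch: deriving structure from untagged documents with unsupervised learning.
# scikit-learn stands in for a cognitive analytics engine; the data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "Quarterly revenue grew and profit margins improved.",
    "The new vaccine trial reported a strong immune response.",
    "Stock prices fell after the earnings announcement.",
    "Patients recovered faster with the updated treatment.",
]

# No manual metadata or labels are attached to the documents.
vectors = TfidfVectorizer(stop_words="english").fit_transform(documents)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, doc in zip(labels, documents):
    print(label, doc)
```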

Applications

Education: Even if cognitive computing cannot take the place of teachers, it can still be a powerful driving force in the education of students. Applied in the classroom, cognitive computing essentially gives each individual student a personalized assistant. This cognitive assistant can relieve some of the stress that teachers face while also enhancing the student's overall learning experience.[16] Teachers may not be able to give each and every student individual attention, and this is the gap that cognitive computers can fill. Some students may need more help with a particular subject, and for many students, interaction between student and teacher can cause anxiety and discomfort. With the help of cognitive computer tutors, students do not have to face this uneasiness and can gain the confidence to learn and do well in the classroom.[17] While a student is in class with a personalized assistant, the assistant can develop various techniques, such as creating lesson plans, to tailor its aid to the student and their needs.

Healthcare: Numerous technology companies are developing cognitive computing systems for use in the medical field. The ability to classify and identify is one of the main goals of these cognitive devices,[18] a trait that can be very helpful in the study of identifying carcinogens. A cognitive system with this capability could assist an examiner in interpreting countless documents in far less time than would be possible without it. The technology can also evaluate information about the patient, looking through every medical record in depth and searching for indications that may be the source of the patient's problems.
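
To make the classification idea concrete, the following is a minimal sketch of the kind of document triage described above, using a simple supervised text classifier. The data and labels are invented for illustration; real clinical systems are far more sophisticated and must be rigorously validated.

```python
# Sketch: flagging documents for expert review with a toy text classifier.
# Training data and labels are invented; this is not a clinical tool.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_texts = [
    "compound associated with tumor growth in exposed tissue",
    "substance linked to cellular mutation in repeated trials",
    "no adverse effects observed in the control group",
    "samples remained stable with no abnormal findings",
]
train_labels = [1, 1, 0, 0]  # 1 = flag for examiner review, 0 = routine

vectorizer = CountVectorizer()
model = MultinomialNB().fit(vectorizer.fit_transform(train_texts), train_labels)

new_report = ["unexpected mutation rates observed in exposed samples"]
flag = model.predict(vectorizer.transform(new_report))[0]
print("flag for review" if flag else "routine")
```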

Industry work

Industry: The power of cognitive computing and AI could change the course of mankind, as it holds the potential to affect almost every task that humans are capable of performing. This could negatively affect human employment, as many roles would no longer need human workers. It could also increase the inequality of wealth: the people at the head of the organizations that wield the power of cognitive computing would grow significantly richer, while workers who are no longer employed would grow poorer.[19]

Competition: This technology could become the standard for remaining competitive. Organizations that are late to adopt cognitive computing could fall behind rapidly, because the decision-making advantage the technology confers is overwhelming. Big data offers the chance to create working algorithmic models that comprehend what customers want and need, and that support better decisions.

Management: The more industries utilize the power of cognitive computing, the more difficult it will be for humans to compete.[19] Greater use of the technology also increases the amount of work that AI-driven robots and machines can perform, leaving less room for human workers; only talented, capable humans would be able to keep up with the machines. Employing such people could motivate and inspire them to find ways to gain a lead over the competition, and their influence could steer the whole AI industry into a very competitive and highly proficient era of humankind.[20]

Threats

Learning: The essence of AI and cognitive computing lies in the machine's ability to learn on its own. Machines learning at a controlled rate is encouraged, but if that learning gets out of control, humans as a species could be threatened: once the technology's intelligence increases beyond a certain point, control may be lost.

Limitation: Parameters would be set up in case something goes wrong, since everything in this technology is written as functions and code. As long as everything remains controlled and extreme precautions are taken, the release of cognitive machines to the public could be safe. If a machine were able to break out of its code and its parameters, however, humanity would be threatened.


References

  1. Kelly III, John (2015). "Computing, cognition and the future of knowing" (PDF). IBM Research: Cognitive Computing. IBM Corporation. Retrieved February 9, 2016.
  2. "Augmented intelligence, helping humans make smarter decisions". Hewlett Packard Enterprise. http://h20195.www2.hpe.com/V2/GetPDF.aspx/4AA6-4478ENW.pdf
  3. "IBM Research: Cognitive Computing".
  4. "Cognitive Computing".
  5. "Hewlett Packard Labs".
  6. Terdiman, Daniel (2014). "IBM's TrueNorth processor mimics the human brain". CNET. http://www.cnet.com/news/ibms-truenorth-processor-mimics-the-human-brain/
  7. Knight, Shawn (2011). "IBM unveils cognitive computing chips that mimic human brain". TechSpot, August 18, 2011.
  8. Hamill, Jasper (2013). "Cognitive computing: IBM unveils software for its brain-like SyNAPSE chips". The Register, August 8, 2013.
  9. Denning, P. J. (2014). "Surfing Toward the Future". Communications of the ACM. 57 (3): 26–29. doi:10.1145/2566967.
  10. Ludwig, Lars (2013). "Extended Artificial Memory: Toward an integral cognitive theory of memory and technology" (PDF). Technical University of Kaiserslautern. Retrieved February 7, 2017.
  11. "Research at HP Labs".
  12. "Automate Complex Workflows Using Tactical Cognitive Computing: Coseer". thesiliconreview.com. Retrieved July 31, 2017.
  13. Ferrucci, D. et al. (2010). "Building Watson: an overview of the DeepQA Project". Association for the Advancement of Artificial Intelligence, Fall 2010, 59–79.
  14. Deanfelis, Stephen (2014). "Will 2014 Be the Year You Fall in Love With Cognitive Computing?". Wired, April 21, 2014.
  15. "Cognitive analytics - The three-minute guide" (PDF). 2014. Retrieved August 18, 2017.
  16. Sears, Alec (April 14, 2018). "The Role Of Artificial Intelligence In The Classroom". eLearning Industry. Retrieved April 11, 2019.
  17. Coccoli, M., Maresca, P. & Stanganelli, L. (2016). "Cognitive computing in education". Journal of e-Learning and Knowledge Society, 12(2). Italian e-Learning Association. Retrieved February 14, 2019 from https://www.learntechlib.org/p/173468/.
  18. Dobrescu, E. M., & Dobrescu, E. M. (2018). "Artificial intelligence (AI) - the technology that shapes the world". Global Economic Observer, 6(2), 71–81. Retrieved from https://login.ezproxy.csum.edu/login?url=https://search.proquest.com/docview/2176184267?accountid=10353
  19. Makridakis, S. (2017). "The forthcoming Artificial Intelligence (AI) revolution: Its impact on society and firms". Futures, 90, 46–60.
  20. West, D. (2018). The Future of Work: Robots, AI, and Automation. Washington, D.C.: Brookings Institution Press. Retrieved from http://www.jstor.org/stable/10.7864/j.ctt1vjqp2g
