The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. The term is frequently applied to the project of developing systems endowed with the intellectual processes characteristic of humans, such as the ability to reason, discover meaning, generalize, or learn from past experience.
Since the development of the digital computer in the 1940s, it has been demonstrated that computers can be programmed to carry out very complex tasks—as, for example, discovering proofs for mathematical theorems or playing chess—with great proficiency. Still, despite continuing advances in computer processing speed and memory capacity, there are as yet no programs that can match human flexibility over wider domains or in tasks requiring much everyday knowledge. On the other hand, some programs have attained the performance levels of human experts and professionals in performing certain specific tasks, so that artificial intelligence in this limited sense is found in applications as diverse as medical diagnosis, computer search engines, and voice or handwriting recognition.
Copeland, B. (2020, August 11). Artificial intelligence. Encyclopedia Britannica.
“In a certain sense I think that artificial intelligence is a bad name for what it is we’re doing here,” says Kevin Scott, chief technology officer of Microsoft. “As soon as you utter the words ‘artificial intelligence’ to an intelligent human being, they start making associations about their own intelligence, about what’s easy and hard for them, and they superimpose those expectations onto these software systems.”
Once we liberate ourselves from the mental cage of thinking of AI as akin to ourselves, we can recognize that it’s just another pile of math that can transform one kind of input into another—that is, software.
In its earliest days, in the mid-1950s, there was a friendly debate about what to call the field of AI. And while pioneering computer scientist John McCarthy proposed the winning name—artificial intelligence—another founder of the discipline suggested a more prosaic one.
“Herbert Simon said we should call it ‘complex information processing,’ ” says Dr. Mitchell. “What would the world be like if it was called that instead?”
Mims, C. (2021, July 31). Why Artificial Intelligence Isn’t Intelligent. The Wall Street Journal.
Machine learning is the concept that a computer program can learn and adapt to new data without human intervention. Machine learning is a field of artificial intelligence (AI) that keeps a computer’s built-in algorithms current regardless of changes in the worldwide economy.
Frankenfield, J. (2022, January 16). Machine learning. Investopedia.
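The definition above can be made concrete with a toy example. This is a minimal sketch of the core idea only, a program that fits a simple linear model to example data and then adapts its learned parameter when new data arrives, without being re-programmed; the function name `fit_slope` and the numbers are illustrative, and real systems use libraries such as scikit-learn and far richer models.

```python
def fit_slope(xs, ys, slope=0.0, lr=0.01, steps=1000):
    """Learn the slope of y ≈ slope * x by gradient descent
    on the mean squared error."""
    for _ in range(steps):
        # Gradient of mean squared error with respect to the slope.
        grad = sum(2 * (slope * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        slope -= lr * grad
    return slope

# Initial training data roughly follows y = 2x.
slope = fit_slope([1, 2, 3, 4], [2, 4, 6, 8])
print(round(slope, 2))  # prints 2.0

# New data arrives (the pattern shifts to y = 3x);
# the same program adapts its parameter on its own.
slope = fit_slope([1, 2, 3, 4], [3, 6, 9, 12], slope=slope)
print(round(slope, 2))  # prints 3.0
```

The second call starts from the previously learned slope, which is the "learn and adapt to new data" behavior the definition describes.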
To reach the Library by phone during normal business hours call (650) 738-4311.
Email the librarians for assistance.
Library staff make every effort to respond promptly; however, replies may take up to 24 hours.
Available during normal business hours: (650) 399-7712.
Artificial intelligence is a broad term used to describe technologies and techniques that allow machines to train themselves on sets of data and learn how to recommend or take a subsequent action based on real data. This resource center will demystify how AI can impact business outcomes.
Some examples of the content available:
A Son’s Race to Give His Dying Father Artificial Immortality. For months, he recorded his dying father's life story. Then he used it to re-create his dad as an AI chatbot.
Vlahos, J. (2017, July 18). A Son’s Race to Give His Dying Father Artificial Immortality. Wired.