Artificial intelligence is the application of computer technology to mimic human thought and action; it is also the branch of computer science concerned with developing machines that have this ability. Artificial intelligence was widely acclaimed in the 1980s as a revolutionary technology, but when expectations went unrealized its popularity faded. More recently it has been reemerging with revamped development tools and languages and greater flexibility. The restrictions of expensive, large machines and obscure languages have been overcome, and many of the lofty early claims have been set aside in favor of more pragmatic and profitable applications. Artificial intelligence is now subtly in place in a variety of industrial, business, and consumer applications.
Areas of artificial intelligence include robotics, machine vision, voice recognition, natural language processing, expert systems, neural networks, and fuzzy logic. The term artificial intelligence was first used in 1956 at a conference held at Dartmouth College in Hanover, New Hampshire. The basis of the conference was the conjecture that every feature of intelligence can be described so precisely that a machine can be made to simulate it (1). Since that time the field has grown, evolved, and fragmented, and in many ways it has been incorporated into mainstream software.
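To make one of the areas listed above concrete, the following is a minimal sketch of a neural network's simplest ancestor: a single artificial neuron (perceptron) that learns the logical AND function by adjusting its weights whenever it makes a mistake. The function names, learning rate, and epoch count here are illustrative choices, not drawn from the article.

```python
# A minimal perceptron: one artificial neuron with two inputs,
# two weights, and a bias, trained by the classic error-correction
# rule. Illustrative only; names and parameters are assumptions.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train weights and a bias on (inputs, target) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: the neuron "fires" (outputs 1)
            # if the weighted sum of inputs exceeds zero.
            output = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = target - output
            # Nudge each weight in the direction that reduces the error.
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# Truth table for logical AND: output 1 only when both inputs are 1.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
print([predict(w, b, x1, x2) for (x1, x2), _ in and_samples])
# → [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop settles on correct weights; modern neural networks chain many such units in layers and train them with gradient-based methods instead.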