Artificial Intelligence Background

Artificial intelligence is defined as the application of computer technology to resemble human thought and action. It is also the branch of computer science concerned with the development of machines having this ability. Artificial intelligence was widely acclaimed in the 1980s as a revolutionary technology. However, when expectations went unrealized, its popularity faded. More recently it has reemerged with revamped development tools and languages and greater flexibility. The restrictions of expensive, large machines and obscure languages have been overcome. Many of the lofty early claims have been dismissed and replaced with more pragmatic and profitable applications. Artificial intelligence is now subtly in place in a variety of industrial, business, and consumer applications.

Areas of artificial intelligence include robotics, machine vision, voice recognition, natural language processing, expert systems, neural networks, and fuzzy logic. The term artificial intelligence was first used in 1956 at a conference held at Dartmouth College in Hanover, New Hampshire. The basis of the conference was the conjecture that every feature of intelligence can be described so precisely that a machine can be made to simulate it (1). Since that time the field has grown, evolved, and fragmented, and in many ways its techniques have been incorporated into mainstream software.
