Robotics Essay, Research Paper

A robot, by the definition in Webster's Dictionary, is an automatic device that performs functions normally ascribed to humans, or a machine in the form of a human. Robots and robotics are growing fields that have progressed since the 1940s. The first use of the word 'robot' was made by the acclaimed Czech playwright Karel Capek, who took it from the Czech word for forced labor, or serf. He introduced the word in his play Rossum's Universal Robots, which opened in January of 1921. In Rossum's Universal Robots, Capek depicts a supposed paradise in which the machines initially bring many benefits but in the end bring an equal amount of blight, in the form of unemployment and social unrest. The word 'robotics' was first used in Runaround, a short story published in 1942 by Isaac Asimov. One of the first robots Asimov wrote about was a robotherapist; a modern counterpart to Asimov's fictional character is Eliza. Eliza was created in 1966 by a Massachusetts Institute of Technology professor, Joseph Weizenbaum, who wrote it as a computer program for the study of natural language communication between man and machine. She was initially programmed with 240 lines of code to simulate a psychotherapist by answering questions with questions.
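
As a rough illustration of the "answering questions with questions" idea, the sketch below imitates an Eliza-style exchange in a few lines of Python. It is not Weizenbaum's program; the patterns and response templates are invented for this example.

```python
import re

# Hypothetical rules: (keyword pattern, response template). '{0}' is filled
# with the text captured after the keyword.
RULES = [
    (r"\bi feel (.*)", "Why do you feel {0}?"),
    (r"\bi am (.*)", "How long have you been {0}?"),
    (r"\bmy (.*)", "Tell me more about your {0}."),
]
DEFAULT = "Can you say more about that?"

def respond(statement: str) -> str:
    """Turn the user's statement back into a question, Eliza-style."""
    text = statement.lower().strip().rstrip(".!?")
    for pattern, template in RULES:
        match = re.search(pattern, text)
        if match:
            return template.format(match.group(1))
    return DEFAULT  # no keyword recognized: fall back to a generic prompt

if __name__ == "__main__":
    print(respond("I feel tired today"))       # -> Why do you feel tired today?
    print(respond("Robots are interesting"))   # -> Can you say more about that?
```

Even a toy like this shows why the original needed only a couple of hundred lines: the program never understands the sentence, it only reflects it back.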

Isaac Asimov had four laws that he thought all robots should go by:

Law Zeroth: A robot may not injure humanity, or, through inaction, allow humanity to come to harm.

Law One: A robot may not injure a human being, or, through inaction, allow a human being to come to harm, unless this would violate a higher-order law.

Law Two: A robot must obey orders given it by human beings, except where such orders would conflict with a higher-order law.

Law Three: A robot must protect its own existence as long as such protection does not conflict with a higher-order law.

Some of the first actual robots date to the 1940s. One such robot was built by Grey Walter and called Machina Speculatrix; people called it the turtlebot for short. His robot was only recently restored to working order. The turtlebots are three-wheeled, light-seeking creatures. A photoelectric cell was mounted on the steering column, with a front wheel attached. The turtles were propelled by two small electric motors and could roam in any direction, with contact sensors to avoid obstacles. The turtles searched for and aimed toward light, but when the light intensity became too bright they retreated to their hutches to recharge. It is a very basic robot, but Walter was ahead of his time in building it.
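
The turtle's behaviour can be pictured as a simple control loop: seek light while it is dim, retreat when it is too bright or the battery is low. The sketch below is a toy illustration of that loop, not Walter's actual circuitry; the threshold value and the sensor and action names are invented.

```python
SEEK_THRESHOLD = 0.8   # assumed brightness above which the turtle retreats

def step(light_level: float, battery: float) -> str:
    """Decide the turtle's next action from one light and battery reading."""
    if battery < 0.2 or light_level > SEEK_THRESHOLD:
        return "retreat_to_hutch"    # too bright or low charge: go recharge
    if light_level > 0.0:
        return "steer_toward_light"  # photocell sees light: home in on it
    return "wander"                  # no light detected: roam and scan

if __name__ == "__main__":
    print(step(light_level=0.4, battery=0.9))   # -> steer_toward_light
    print(step(light_level=0.95, battery=0.9))  # -> retreat_to_hutch
```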

In 1956, a meeting occurred between George C. Devol and Joseph F. Engelberger. The two met over cocktails to discuss the writings of Isaac Asimov. The result of this historic meeting was that Devol and Engelberger created a working robot nicknamed the 'Unimate'. The first Unimate was installed at a General Motors plant, where it worked with the heated die-casting machines. Engelberger started a manufacturing company called 'Unimation', which stood for Universal Automation, the first commercial company to produce robots. Devol wrote the necessary patents. Unimation is still in production today, with robots for sale.

All robots that work and do things are run by programs. Many of the programs being written today try to give the robot AI. AI stands for artificial intelligence. Back in 1637 the French philosopher-mathematician Rene Descartes predicted that it would never be possible to make a machine that could truly think. In 1950 the British mathematician and computer pioneer Alan Turing declared that one day there would be a machine that could duplicate the thoughts of a human being. This would be shown by passing a specialized test: a computer and a human, both hidden from view, would be asked identical questions. If the computer were successful, the questioner would be unable to distinguish the machine from the person by their answers.
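
The heart of the test is its blind structure, which the hedged sketch below tries to show; it is not a working evaluation. The same question goes to a hidden human and a hidden machine, the answers come back unlabeled in random order, and the questioner must guess which came from the machine. Both respondent functions are hypothetical placeholders.

```python
import random

def human_answer(question: str) -> str:
    return "Probably blue, though it depends on my mood."

def machine_answer(question: str) -> str:
    return "Blue, I think, although I change my mind often."

def ask(question: str):
    """Return the two answers shuffled, plus the hidden identity labels."""
    answers = [("human", human_answer(question)),
               ("machine", machine_answer(question))]
    random.shuffle(answers)            # hide which respondent is which
    labels, texts = zip(*answers)
    return texts, labels

if __name__ == "__main__":
    texts, labels = ask("What is your favourite colour?")
    print("Answer 1:", texts[0])
    print("Answer 2:", texts[1])
    # The questioner sees only the texts; `labels` stays hidden until scoring.
```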

Inspired by Turing's theory, the first conference on AI convened at Dartmouth College in New Hampshire in 1956. Soon afterwards, an AI laboratory was started at the Massachusetts Institute of Technology by John McCarthy and Marvin Minsky, two of the nation's leading AI researchers. McCarthy also invented the AI computer language Lisp, but by the early 1990s AI itself had not been achieved. However, logic programs called expert systems allow computers to make decisions by interpreting data and selecting from among alternatives. Technicians can run such programs in complex medical diagnostics, language translation, mineral exploration and even computer design.

Machinery can outperform humans physically as well as mentally. The fastest computer is able to perform roughly 10 billion calculations per second. In order to keep pace with the mind, computers have been built with several processors that carry out calculations at the same time.

Critics say that this does not involve understanding, something that a human would have; achieving true understanding would, they argue, be theoretically impossible without actually learning the material. Some experts have suggested that computers should instead be modeled after the human brain, which essentially consists of a network of nerve cells.

Research in AI has progressed so far that some computers can perform complicated, though extremely specialized, tasks. For example, artificial intelligence systems have been produced that can diagnose diseases and locate minerals in the Earth. Such systems are often called expert systems. They require vast amounts of knowledge, or information, in the computer to provide the basis for the computer's reasoning ability. To diagnose a disease, a computer needs to be programmed with knowledge of thousands of symptoms and how these symptoms relate to hundreds of diseases.
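
A very small, hypothetical rule base in that spirit is sketched below: each rule maps a set of symptoms to a candidate diagnosis, and the program selects the diagnoses whose required symptoms are all present. Real expert systems encode thousands of symptoms; these few diseases and rules are invented for illustration.

```python
# Hypothetical rules: diagnosis -> the set of symptoms it requires.
RULES = {
    "common cold": {"cough", "sneezing", "runny nose"},
    "influenza":   {"fever", "cough", "muscle aches"},
    "allergy":     {"sneezing", "itchy eyes"},
}

def diagnose(observed: set[str]) -> list[str]:
    """Return every diagnosis whose required symptoms are all observed."""
    return [disease for disease, required in RULES.items()
            if required <= observed]

if __name__ == "__main__":
    print(diagnose({"fever", "cough", "muscle aches", "sneezing"}))
    # -> ['influenza']
```

The "thinking" here is nothing more than matching observed data against stored alternatives, which is exactly the selection-among-alternatives the essay describes.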

Programs have also been developed that enable computers to comprehend commands in a natural language, e.g., ordinary English. The software systems of this type that have been produced so far are limited in their vocabulary and knowledge to specific, narrowly defined subject areas. They contain large amounts of information about the meaning of words pertaining to that subject, as well as information about grammatical rules and common violations of those rules.
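
The sketch below gives a feel for how narrow such a vocabulary can be: a tiny set of words tied to one subject (moving a robot arm), with everything outside that vocabulary rejected. The command names and word lists are invented for this example, not taken from any real system.

```python
# Hypothetical domain vocabulary: verbs map to robot instructions.
VERBS = {"lift": "ARM_UP", "lower": "ARM_DOWN",
         "open": "GRIP_OPEN", "close": "GRIP_CLOSE"}
OBJECTS = {"arm", "gripper"}

def parse(sentence: str):
    """Map an English command onto a robot instruction, or None if unknown."""
    words = sentence.lower().strip(".!").split()
    verb = next((w for w in words if w in VERBS), None)
    obj = next((w for w in words if w in OBJECTS), None)
    if verb and obj:
        return VERBS[verb]
    return None   # outside the program's narrow subject area

if __name__ == "__main__":
    print(parse("Please lift the arm."))   # -> ARM_UP
    print(parse("Translate this poem."))   # -> None
```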

Major and continuing advances in computer processing speeds and memory sizes have facilitated the development of AI programs. Although most AI programs that attempt to simulate higher mental functions build in the same bottleneck of limited short-term memory that restricts humans to carrying out only one or a few mental tasks at a time, many investigators have begun to explore how the intelligence of computer programs can be enhanced by incorporating parallel processing, i.e., the simultaneous execution of several separate operations by means of computer memories that allow many processes to be carried out at once. The question of which portions of the human brain operate serially and which operate in parallel has been a topic of intense debate among researchers in both the cognitive sciences and AI, but no clear verdict had been reached by the mid-1990s.
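
The sketch below shows the basic idea of parallel processing: several independent calculations are submitted at once instead of one after another. The work function is a stand-in for any task that can be split into separate pieces.

```python
from concurrent.futures import ProcessPoolExecutor

def count_multiples(limit: int, divisor: int) -> int:
    """A separable piece of work: count multiples of `divisor` below `limit`."""
    return sum(1 for n in range(limit) if n % divisor == 0)

if __name__ == "__main__":
    tasks = [(10_000_000, d) for d in (3, 7, 11, 13)]
    # Each task runs in its own process, so the four counts proceed in parallel
    # rather than serially.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_multiples, *zip(*tasks)))
    print(results)
```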

The largest computer memories now contain elementary circuits that are comparable in number to the synaptic connections (about 10 trillion) in the human brain, and they operate at speeds (billions of operations per second) that are far faster than elementary neural speeds. The challenge driving AI research is to understand how computers' capabilities must be organized in order to reproduce the many kinds of mental activity that are encompassed by the term "thinking." AI research has thus focused on understanding the mechanisms involved in human mental tasks and on designing software that performs similarly, starting with relatively simple tasks and continually progressing to levels of greater complexity.

