robot sederhana

Sunday, 13 September 2009

Hungry Robots

by Tony Belpaeme and Andreas Birk


Artificial intelligence (AI) has been studied for more than fifty years and is an established specialty within computer science. A recent sub-specialty of AI is artificial life, also known as alife. Although alife research can be traced back to the 1960s, it has only found prominence in the last decade. Artificial life studies have two main goals: solving common problems inspired by biological phenomena and studying the basic properties of life with computer-based technology [10]. The first goal has been dubbed the "artificial life route to artificial intelligence" [20], emphasizing the relationship between a novel way of building and programming machines and the study of life through artifacts. While studying biological phenomena, some scientists in the field use animats as their research tool [25]. An animat is a robotic device whose physical appearance and inner workings are inspired by the animal world. (The word animat is a contraction of animal robot.)
In the mid-1980s, AI and robotics researchers began arguing against "classic" approaches that relied on symbolic representations and methods [5]. One researcher, Rodney Brooks, advocated new research questions and methods. Instead of building intelligent systems by solving abstract and highly formalized problems such as computer chess programs, he argued in favor of building intelligent robots that were inspired by nature. In doing so, he stressed the importance of reactive mechanisms and discouraged the use of models of the world: "the world is its own best model" [6]. These reactive mechanisms have a tight coupling between sensor values and motor activations and have no central control. Alife robots are typically controlled by short programs, without a central controller organizing their execution; each program is responsible for a specific behavior of the robot, such as avoiding obstacles or seeking an energy source. These robots, and the way they are programmed, are therefore called behavior-based.
Interestingly, the study of animats returns robots to their roots. The term "robot" was introduced in 1921 by the Czech writer Karel Čapek in his satirical drama R.U.R. (Rossum's Universal Robots). There, robots were shown as artificial superhumans. The science-fiction notion of a robot as a human-shaped device with seemingly unlimited strength and intelligence has over the years been replaced by the industrial notion of a robot as a dumb and bulky piece of machinery. Recently, the two disparate notions of a robot have begun to merge. The Sony Aibo dogs are a fine example of the convergence of super-device and drudge machine [7]. These mass-produced toy robots use considerable computing power, a camera, microphone, and touch sensors to interface with the world.
In this article, we first introduce three basic properties of behavior-based robots. Then we describe a prototype alife experiment with an ecosystem of different types of robots competing for energy. We also explain why we found diverse collections of robots more interesting to study than single robots.

From Robots to Animats

Sometimes behavior-based AI is denoted as the bottom-up approach to AI. Behavior-based AI often focuses on systems that might be considered to be non-intelligent. Simple creatures, such as insects, and their behavior are investigated before more sophisticated animals. Although "simple," these creatures have inspired scientists to elegantly solve problems that are awkward or difficult to solve with classic computer science. The study of ant behavior, for example, has led to very efficient solutions for controlling digital network traffic.
This tendency to study simple organisms is also reflected in robotics activities. The robots used in artificial life are often rather uncomplicated and do not match the science-fiction vision of a robot that serves tea or solves world problems. Despite their simplicity, behavior-based robots or animats usually have three important properties:
  • They are highly autonomous.
  • They come equipped with complex sensor and motor-interfaces.
  • They are integrated within an environment.
The word autonomy is derived from the Greek words auto (self) and nomos (law, rule). So, an autonomous system is a self-governed system. Loosely interpreted, autonomy can be seen as the independence of a device from direct and continuous human supervision and maintenance. Often, autonomy for robots is equated with being mobile without an umbilical cord connecting the robot to a power supply and sometimes to an off-board computer. But this view is too simplistic: the robot should also have control autonomy, allowing it to decide and learn without much external aid.
Autonomous robots face two major problems. First, they have to adapt to novel situations. Second, they have to manage resources, such as energy. Both problems can be related to the so-called self-sufficiency of animats: they have to be able to sustain themselves over extended periods of time [18].
An animat is said to perform well when it manages to stay operational autonomously. Its performance is considered poor when it runs out of energy or breaks apart. This idea can be traced back to the field of cybernetics, which originated in the 1940s [24]. The cybernetician W. Ross Ashby formalized this idea as early as 1952 by introducing the notion of essential variables [1], the state variables that ensure successful operation as long as they are kept within the crucial boundaries or the viability zone of the agent's state space.
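Ashby's idea can be made concrete with a small sketch. The variable names and bounds below are hypothetical, chosen only for illustration: an agent counts as viable exactly as long as each of its essential variables stays inside its viability zone.

```python
# Hypothetical essential variables and their viability zones
# (lower bound, upper bound). Illustrative values, not from any real robot.
VIABILITY_ZONE = {
    "battery_level": (0.1, 1.0),      # normalized charge
    "motor_temperature": (0.0, 70.0), # degrees Celsius
}

def is_viable(state):
    """True while every essential variable lies inside its viability zone."""
    return all(lo <= state[var] <= hi for var, (lo, hi) in VIABILITY_ZONE.items())

print(is_viable({"battery_level": 0.6, "motor_temperature": 35.0}))   # healthy agent
print(is_viable({"battery_level": 0.05, "motor_temperature": 35.0}))  # out of energy
```

In this framing, "performing well" is simply keeping `is_viable` true over time, for instance by recharging before the battery variable leaves its zone.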
Classic robots are based on precise mechanics, which are necessary because the robots rely on exact models to describe and compute their kinematics. In contrast, behavior-based robots are more like natural devices. Their control schemes rely more on mechanisms and rules of thumb for their behavior. The Sony Aibo dog [7], for instance, consists of simple motors targeted for the toy market. Nevertheless it has very complex motor skills, as it is capable of walking on four legs, each of which has three degrees of freedom (DOFs). Each DOF corresponds to one free parameter in the physical device. A door hinge, for example, has one DOF. Hence the legged motion of the four-legged robot has to cope with a total of twelve DOFs, compared with the five to six DOFs of a typical industrial robot arm. Each DOF adds an extra possibility to the configuration of the system, and often more than one way exists to place a paw on the floor. Computing the most efficient way is a daunting task in robotics.
Behavior-based robots cannot rely on precise, complex models. Instead they are controlled with simple programs. Fortunately this means that their need for computational resources is small. Instead of intensive calculations of inverse kinematics, simple couplings between sensor-values and motor-activations are used. But how is it possible to save on the electro-mechanical and on the computational side? Where is the trade-off?
To some extent the trade-off is simply hidden in the kind of tasks classic and behavior-based robots are suited for. If very precise, repeated positioning is needed, classic robotics is ideal. For other tasks, behavior-based robots are often the more competitive option because they strongly benefit from developments in sensors (especially in computer vision) and motors. More and more types of sensors are available at constantly dropping prices. Therefore they can be used as the basis for additional behaviors, increasing the robustness and usefulness of the robot. Camera chips, for instance, which are primarily targeted at the entertainment and toy market, can be used for computationally inexpensive visual servo control compared with more common kinematic controls such as gyros or position sensors.
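The tight sensor-motor couplings mentioned above can be illustrated with a short sketch in the style of a Braitenberg vehicle; the sensor names, base speed, and cross-coupling gain here are hypothetical, invented only to show the idea that each motor is a direct function of sensor readings, with no world model in between.

```python
def braitenberg_step(left_light, right_light, base_speed=0.5, gain=0.4):
    """One control step of a light-seeking vehicle.

    Each motor speed is a direct weighted function of the *opposite*
    light sensor, so the robot turns toward the brighter side.
    No internal model of the world is kept between steps.
    """
    left_motor = base_speed + gain * right_light   # brighter right side ->
    right_motor = base_speed + gain * left_light   # faster left motor -> turn right
    return left_motor, right_motor

# Brighter light on the right: the left motor speeds up, steering the robot right.
l, r = braitenberg_step(left_light=0.2, right_light=0.8)
print(round(l, 2), round(r, 2))
```

The whole "controller" is one arithmetic expression per motor, which is why such reactive schemes need so little computation compared with inverse-kinematics calculations.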
Last but not least, an animat is typically not seen as an isolated device but as part of an environment. This is discussed in some detail in the following section.

Living on your own?

Ecosystem-like settings are interesting from an alife perspective. Within ecosystems, the main goal of a robot is self-preservation (staying operational for an extended period of time [16,20]). Resources, especially energy, are limited in time and space. Consequently, robots must compete for them, and this competition forms the basis of all robot interactions in the system. There is a substantial amount of alife research based on simulated ecosystems [2,4,11,12,21,23]; however, unlike the ideas presented above, perception and effector control of the agent are decoupled from the real world.
The basic ecosystem located at the Flemish Free University of Brussels (VUB) [19,14] is a 5 m × 3 m space enclosed by walls (Figure 1). Initially it includes simple mobile robots, the moles (Figure 2). The name of this robot "species", like those of the species that follow, should not be taken too literally: the names are used for convenience only and are not meant to imply direct relations to natural counterparts. The name "mole" is derived from the limited vision of the robots, which can only perceive light intensities through a few simple photosensors. The photosensors, positioned at the front of the robot, are used to navigate and to find objects in the ecosystem.
Basic ecosystem with robot moles, competitors, and a charging station
Figure 1: A part of the basic ecosystem with charging station, two robot moles, three competitors, and several bricks as obstacles.
Robot mole
Figure 2: A so-called mole, a simple autonomous robot that is capable of staying operational in the ecosystem over extended periods of time. Illustrated here is one of its basic behaviors: phototaxis (movement in response to light stimulus) toward the charging station to recharge its batteries.


How to Make a Robot

This is the first robot I want to try building.


How to Make a Robot

                    Building an Explorer Hexapod Robot

                (Read the full article in Elkom magazine, edition 4)
                                                                                 Widodo Budiharto
                                                       Guest Professor, Univ. de Bourgogne, France

In this robot project, the author explains how to build a six-legged (hexapod) robot using three sensors: one SRF04 (Sonar Range Finder) distance sensor and two Sharp GP2D12 sensors. Guaranteed to make you curious, and fun to try!

Circuit Block Diagram
This robot moves based on information from its three distance sensors. It is expected to explore the area it passes through and report information back to its owner, for example via a wireless camera, which is why it is called the Explorer Hexapod. The figure below shows the circuit block diagram to be built:


                                                       Figure 1. Circuit block diagram of the Explorer Hexapod robot

Here are the required materials; the most important, of course, is the frame for the hexapod legs, which you can build yourself or buy as a ready-made kit:
1.       2 HS311 servo motors
2.       Hexapod body and legs
(a complete hexapod leg kit including 2 HS311 servos can be purchased)
3.       Minimum system board: ATmega8535, ATmega16, or ATmega32
4.       L293D DC motor driver / deKits SPC DC Motor
5.       1 SRF04 ultrasonic distance sensor (range 3 cm to 3 m)
6.       2 Sharp GP2D12 infrared distance sensors (10 cm to 80 cm)
7.       2 9 V battery holders
Below is the construction of a standard hexapod leg, driven by the rotation of continuous-rotation servo motors. The servos are controlled from ports B.0-B.3 through a motor driver, namely the DC motor driver kit based on the L293D IC (the deKits SPC DC Motor kit can also be used), or an L298 H-bridge IC if more power is needed. Keep in mind that these servos have three pins; just use the two pins that drive the DC motor inside the servo.


                                                                      Figure 2. Arrangement of one side of the hexapod legs
The HS311 servo has enough torque to move a robot with a maximum load of 1.5 kg.

How It Works
First, let us look at the sensor section. The SRF04 sensor is used to determine the distance ahead of the robot and whether there is an obstacle; it can detect distances from 3 cm up to 3 m. The sensor works on the ultrasonic principle: the range finder emits a 40 kHz sound pulse that travels at the speed of sound (about 343 m/s, roughly 1.1 ft per millisecond). The sensor output is connected to Port C.0 and Port C.1, and it is triggered by a 10 µs TTL pulse. This sensor was chosen because it is the distance sensor most widely used in the Kontes Robot Cerdas (the Indonesian intelligent robot contest), so beginning readers will already be familiar with it. You can add up to four of these sensors, on the right, left, and rear of the robot, for better coverage.
                                                Figure 3. SRF04 pin arrangement
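The echo-time arithmetic behind the SRF04 can be sketched generically; the helper below is an illustration, not code for any particular microcontroller. Sound travels at roughly 343 m/s at room temperature, and the echo pulse covers the round trip, so the distance is half the pulse duration times the speed of sound.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_distance_cm(echo_pulse_us):
    """Convert an SRF04-style echo pulse width (microseconds) to cm.

    The pulse spans the round trip out to the obstacle and back,
    so the one-way distance is half the total travel.
    """
    return (echo_pulse_us * SPEED_OF_SOUND_CM_PER_US) / 2

# A 1750 us echo corresponds to roughly 30 cm.
print(round(echo_to_distance_cm(1750), 1))
```

On the microcontroller the same calculation is done on the timer count measured between the trigger pulse and the falling edge of the echo line.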

Meanwhile, the two GP2D12 infrared sensors on the right and left sides can measure distances from 10 cm to 80 cm with an analog output, so they can be connected directly to ports A.0 and A.1 of the AVR microcontroller. The sensor's response is not linear, so ideally a look-up table should be used to process its raw data.
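Such a look-up table with linear interpolation might look like the sketch below. The calibration points are invented for illustration; for a real GP2D12 they must be measured per sensor, since the output voltage falls off non-linearly with distance.

```python
# Hypothetical calibration points (ADC reading -> distance in cm),
# sorted from highest ADC value (closest) to lowest (farthest).
CALIBRATION = [(600, 10), (450, 15), (300, 25), (200, 40), (120, 60), (80, 80)]

def adc_to_cm(adc):
    """Map a raw ADC reading to cm via the look-up table,
    linearly interpolating between neighboring calibration points."""
    if adc >= CALIBRATION[0][0]:   # closer than the first calibration point
        return CALIBRATION[0][1]
    if adc <= CALIBRATION[-1][0]:  # farther than the last calibration point
        return CALIBRATION[-1][1]
    for (a_hi, d_lo), (a_lo, d_hi) in zip(CALIBRATION, CALIBRATION[1:]):
        if a_lo <= adc <= a_hi:
            frac = (a_hi - adc) / (a_hi - a_lo)
            return d_lo + frac * (d_hi - d_lo)

print(adc_to_cm(375))  # midway between the 450 and 300 entries -> 20.0 cm
```

On an 8-bit microcontroller the same table would typically be stored in flash and searched with integer arithmetic, but the principle is identical.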
The readings from these distance sensors are processed by the microcontroller to decide whether the robot should move forward, move backward, or turn. As the servos rotate, the legs attached to them move alternately, allowing the robot to walk.

'Demo Program for the Explorer Hexapod Robot
'By Mr. Widodo Budiharto
'Univ. de Bourgogne 2007

'Function and variable declarations
Declare Sub Initialize_ultrasonic()
Declare Function Ultrasonic_depan() As Integer
Dim Jarakdepan As Integer
Dim Jaraksampingkanan As Word
Dim Jaraksampingkiri As Word
Dim W As Word

Config Portb = Output
Config Portd = Input
Config Portc = Output
Config Adc = Single , Prescaler = Auto , Reference = Avcc   'ADC configuration
Start Adc

Call Initialize_ultrasonic                        'call the initialization routine
Jarakdepan = Ultrasonic_depan()                   'read the SRF04 front distance
Jaraksampingkiri = Getadc(0)                      'left GP2D12 on ADC channel 0 (port A.0)
Jaraksampingkanan = Getadc(1)                     'right GP2D12 on ADC channel 1 (port A.1)
Print "jarak sampingkiri " ; Jaraksampingkiri

'Demo: move forward when the path ahead is clear, otherwise turn left
If Jarakdepan > 40 Then
   Portb = 8                                      'forward
   Wait 2                                         'delay
Elseif Jarakdepan < 40 And Jaraksampingkanan > 150 Then
   Portb = 0                                      'turn left
   Wait 2
End If

Function Ultrasonic_depan() As Integer
   'set the initial state of the trigger pin
   'generate a 5 us pulse @ 4 MHz
   'measure the return pulse
   '(body omitted here; see the full article in Elkom edition 4)
End Function

Sub Initialize_ultrasonic()                       'initialize the ultrasonic sensor
End Sub

The following figure shows the finished robot, which can walk quite fast and strongly because it uses high-torque Hitec servos.

               A.                                                        B.
                      Figure 4. The robot in action: a) side view, b) front view

Further Development
For research or hobby purposes, you can add artificial-intelligence capabilities using fuzzy logic, genetic algorithms, or neural networks, turning this into a truly intelligent robot. Read the upcoming article on neural networks in this, your favorite magazine.
