Arduino: A Brief History


March 4, 2010

A paper tracing the inventions and events that drove the evolution of the Arduino microcontroller, from its precursors in the era of early computers to the more recent development of compact microcontrollers and the rise of the DIY movement.


The Arduino is an open-source microcontroller board designed to interface easily with a variety of sensors (to register user input) and to drive external components such as LEDs, motors, and speakers (to respond to that input). One of its most important features is how easily it can be programmed, even by users with little technical expertise, an aspect that has made it a tool of choice for artists and designers in the creation of interactive objects and spaces [2].
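
To make this ease of use concrete, a complete Arduino program (a “sketch”) of the kind described above can fit in a dozen lines. The example below is a minimal illustrative sketch, written in the Arduino dialect of C++, that reads a light sensor and switches an LED in response; the pin assignments, sensor choice, and threshold are assumptions made for illustration rather than details drawn from this paper.

// Minimal illustrative Arduino sketch: read a sensor, drive an LED.
// Assumed wiring (not from the paper): a photoresistor voltage divider
// on analog pin A0 and an LED on digital pin 13.

const int SENSOR_PIN = A0;     // analog input from the light sensor
const int LED_PIN = 13;        // most boards have a built-in LED on pin 13
const int THRESHOLD = 512;     // midpoint of the 10-bit (0-1023) reading range

void setup() {
  pinMode(LED_PIN, OUTPUT);    // configure the LED pin as an output
}

void loop() {
  int reading = analogRead(SENSOR_PIN);   // sample the sensor (0-1023)
  if (reading < THRESHOLD) {
    digitalWrite(LED_PIN, HIGH);          // dark enough: turn the LED on
  } else {
    digitalWrite(LED_PIN, LOW);           // bright: turn the LED off
  }
  delay(100);                             // wait 100 ms before sampling again
}

Uploading such a sketch to the board over USB from the Arduino development environment is the entire workflow, which is precisely the low barrier to entry that has attracted artists and designers.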

In this paper, I will examine the history of the Arduino and its precursors, starting with a history of the electronic computer and how it branched into the development of the first microcontrollers used in the business, scientific, and military sectors, and following with a discussion of the needs and events that drove the development of a new group of microcontrollers targeted at artists, designers, and hobbyists.

The computer, commonly defined as a tool for processing, storing, and displaying information[3], arose from a long line of analog devices used for counting and calculation, ranging from the simple abacus (first used in Sumer around 2300 BC), to Napier's bones (introduced in 1617 and a precursor to the slide rule), to Blaise Pascal's gear-based mechanical calculator (1645)[4]. Charles Babbage's Analytical Engine, however, was the first calculating device to incorporate all the elements of the modern computer: data input (in the form of punched cards), a processing mechanism, a memory structure to store ongoing calculations, and an output/display component (in this case, a printer)[5].

The development of the computer accelerated during the 1940s, spurred largely by the highly industrialized nature of military production in World War II (and the resulting need for accurate calculations in the design and mass production of military vehicles and ammunition), as well as by the demand for calculating machines that could break Nazi communication ciphers (such as Turing's “bombe”, which successfully decoded signals from the German Enigma machine)[6]. These developments paved the way for further iterations of the computer, such as Konrad Zuse's Z3 (the first working programmable, fully automatic binary computer) and Howard Aiken's Mark I of 1944 (which consisted of electrical relays and program instructions punched into paper tape)[7]. Two of the most important computers constructed during this era, however, were the Colossus (UK, 1944) and the ENIAC (USA, 1946)[8], the first programmable electronic computers, which employed thousands of vacuum tubes for their processing and memory operations, a noticeable departure from the gears and relays of their mechanical predecessors. Like most computers of this era, both were built for military purposes: the Colossus to perform cryptanalysis on Nazi codes, and the ENIAC to calculate artillery firing tables.

This new vacuum tube technology was later incorporated into the UNIVAC I (USA, 1951), the first commercially produced electronic computer in the United States, which was also one of the first computers to utilize stored programs, that is, instructions stored internally rather than fed in on punched tape and cards[9]. The UNIVAC and its close competitor, the IBM 702, represented a new generation of computers employed pervasively in commercial and business environments to meet a growing need for data processing and statistical analysis[10].

The 1960s marked a significant evolutionary leap for computing due to the development of solid-state computers (such as the IBM 1401), which used transistors for processing and magnetic core memory for storage (instead of roomfuls of vacuum tubes), both of which made computer hardware far more compact[11]. The invention of the integrated circuit by Jack Kilby in 1958, which allowed transistors and circuits to be fused onto small ‘chips’ of semiconducting material (such as silicon), permitted further miniaturization of computer components[12]. Another important development during this decade was the spread of high-level programming languages written in symbolic, English-like notation, which made code easier to read and learn than the machine and assembly languages that preceded them; this enabled individuals without years of technical training to control the basic operations of a computer. COBOL (for business applications) and FORTRAN (for scientific calculations) were the main high-level languages in use during this period[13].

In the 1970s, one of the greatest innovations in modern computer history, the microprocessor, was introduced. The microprocessor miniaturized all the hardware components of a computer's central processing unit (its ‘brain’) to fit onto a single, tiny integrated circuit, now popularly known as a ‘microchip’[14]. The microprocessor also became the main driving component of microcontrollers (such as the Arduino), which generally combine a processor core with memory and input/output hardware for sensors and other peripherals. Due to its small form factor, the microprocessor was incorporated into a plethora of electronic devices, from calculators to personal computers, and it remains at the heart of these devices to this day. The 1970s and 1980s also saw the development of a new generation of more powerful programming languages (such as C, C++, and later Java) for applications in business and science[15].

Having examined the evolution of the microcontroller from its roots in the first electronic computers, I will now turn to recent microcontrollers designed to meet the needs of casual users and hobbyists with limited technical knowledge (departing from the more complex requirements of the business, commercial, and scientific fields). The PIC microcontroller, introduced by General Instrument in 1976[16], became one of the most popular tools for electronics enthusiasts (before the Arduino) for several reasons: ease and speed of programming via simple languages like PicBASIC; storage of programs in onboard, reprogrammable memory (which allowed a chip's instructions to be erased and rewritten at will, so one chip could serve an endless variety of projects); and support for input sensors and output devices (motors, LEDs). Other boards popular with hobbyists include the BASIC Stamp (Parallax Inc., 1990)[17] and Wiring (one of the first microcontroller boards designed specifically for electronic art and tangible media explorations)[18], both of which share the same simplicity of programming and the resulting ease of rapid prototyping.

The Arduino project grew largely out of the “DIY” climate created by the burgeoning popularity of rapid-prototyping boards like the PIC and Wiring, as well as out of the increasing need of artists and designers to prototype interactive works easily. In fact, the immediate precursor to the Arduino was the custom-made Wiring microcontroller created by the artist and designer Hernando Barragan in 2004 for his master's thesis at the Interaction Design Institute Ivrea, intended for use by a “non-technical audience” of “artists, designers, and architects”[19]. In 2005, the Arduino team was formed in Ivrea, Italy, consisting of Barragan, Massimo Banzi, David Cuartielles, Dave Mellis, Gianluca Martino, and Nicholas Zambetti[20]. Their goal was to create an electronics prototyping platform that simplified the Wiring platform further, making it as accessible as possible to non-technical users in the creative fields. As a result, the Arduino incorporated the following characteristics: a programming environment based on the Processing language (a language created by Ben Fry and Casey Reas, likewise aimed at artists and designers), the ability to program the board over a standard USB connection, and a low price point (starting at about $35 USD)[21]. The Arduino achieved rapid success within its first two years of existence, selling more than 50,000 boards. By 2009, it had spawned over 13 variants, each specialized for different applications[22], such as the Arduino LilyPad (for wearable-technology projects), the Arduino Mini (miniaturized for use in small interactive objects), and the Arduino BT (with built-in Bluetooth capabilities).

Today, the Arduino microcontroller has become one of the most popular prototyping platforms in the world, and is a prime example of how hardware and software technologies originally created for military, business, and scientific applications can be repurposed to serve the needs of individuals creating projects in the realms of new media art and design.

 

 

REFERENCES

2 Arduino Team, “Arduino”, http://www.arduino.cc/ (accessed Feb 5, 2010).
3 Britannica Ultimate Reference Suite 2008, “Computer”
4 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 226
5 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 226
6 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 226
7 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 227
8 Paul E. Ceruzzi, A History of Modern Computing (Cambridge, MA: MIT Press), 179
9 Paul E. Ceruzzi, A History of Modern Computing (Cambridge, MA: MIT Press), 217
10 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 227
11 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 227
12 Paul E. Ceruzzi, A History of Modern Computing (Cambridge, MA: MIT Press), 183
13 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 240
14 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 228
15 Harry Henderson, Encyclopedia of Computer Science And Technology (New York, NY: Infobase Publishing, 2009), 228
16 Wikipedia, “PIC Microcontroller”, http://en.wikipedia.org/wiki/PIC_microcontroller (accessed Feb 15, 2010).
17 Wikipedia, “BASIC Stamp”, http://en.wikipedia.org/wiki/BASIC_Stamp (accessed Feb 15, 2010)
18 Wiring, “Wiring ALPHA 1.0”, http://www.wiring.org.co (accessed Feb 17, 2010)
19 Alicia M. Gibb, "New Media Art, Design, and the Arduino Microcontroller: A Malleable Tool" (Master's Thesis, Pratt Institute, 2010), 7
20 Alicia M. Gibb, "New Media Art, Design, and the Arduino Microcontroller: A Malleable Tool" (Master's Thesis, Pratt Institute, 2010), 8
21 Alicia M. Gibb, "New Media Art, Design, and the Arduino Microcontroller: A Malleable Tool" (Master's Thesis, Pratt Institute, 2010), 9
22 Alicia M. Gibb, "New Media Art, Design, and the Arduino Microcontroller: A Malleable Tool" (Master's Thesis, Pratt Institute, 2010), 22