/Chapter Eight/ The Development of the Computer

------------------------------------------------------------------------

* First Generation Computers
* Second Generation Computers
* Third Generation Computers
* Fourth Generation Computers

------------------------------------------------------------------------

_First Generation Computers_

As time progressed, people found they were using adding machines and slide rules to perform more and more extremely tedious calculations. Howard Aiken developed the Mark I in 1944 to ease this calculating burden. However, rather than follow the mechanical approach of the Mark I, many researchers set to work to develop electronic computers.

Prior to World War II, John V. Atanasoff, a professor of physics, and Clifford E. Berry, a graduate student at Iowa State College, began building an electronic computer. Unfortunately, because of the war, they were never able to complete it. In 1939, Atanasoff finished the construction of a small prototype computer he built to test his ideas. Atanasoff then used this model to begin work on his Atanasoff-Berry Computer (ABC), but in 1942 he was forced to stop due to the war. The unfinished computer used 300 vacuum tubes to perform calculations, capacitors to store binary data, and punched cards for input/output.

One important aspect of this computer was that, unlike the old mechanical adding machines, which used direct counting, the ABC utilized logical operations to perform addition and subtraction. The spokes on the mechanical adding machines "counted" the solution to their calculation by turning the same number of times as the values of the addends. The ABC, however, used the logical operators "and" and "or" to compute its results on binary numbers.^1

A binary number is a number written in base two. This means that instead of using the digits zero through nine, as in base ten, only the digits zero and one are used.
Each place is then equivalent to a power of two, so you have the one's place, two's place, four's place, etc. For example, the number five in binary is 101.

To understand how logical operations can be performed on binary numbers, think of zero as representing "false" and one as representing "true." True "and" true produces true, while true "and" false, or false "and" false, results in false. For "or," false "or" false is false, and any other combination is true.

During World War II, researchers made more advances to ease the burden of performing calculations. The Defense Department needed an easier way to compute its firing and ballistic tables.^2 J. Presper Eckert and John W. Mauchly at the Moore School of Electrical Engineering of the University of Pennsylvania found a solution to the Defense Department's dilemma. In 1946, they developed the ENIAC, the Electronic Numerical Integrator and Calculator.^3 It filled a thirty by fifty-foot room and weighed thirty tons.^4 The computer had 18,000 vacuum tubes, which it used to perform calculations at a rate of 5,000 additions per second.^5 This is much faster than any human could manage, but a great deal slower than the computers of today. Operators used plug boards and wires to program the desired operations, and they entered the numbers used in the calculations by turning a series of dials until they corresponded to the correct digits.

In the next few years, a number of other "first generation" computers were built. All of these early computers used vacuum tubes to perform their calculations. One development among these first computers was the use of an internally stored program. In 1945, John von Neumann wrote a paper describing how a binary program could be electronically stored in a computer. This program would enable the computer to alter the operations to be performed depending upon the results of previous steps.
For example, the computer could be programmed so that whenever it calculated a number less than ten, it would add five. This concept greatly increased the flexibility of computers. In 1947, the EDVAC, the Electronic Discrete Variable Automatic Computer, was built by Eckert and Mauchly at the University of Pennsylvania. The EDVAC utilized the idea of an electronically stored program.^6

In 1951, Eckert and Mauchly built the UNIVAC for use at the Census Bureau.^7 The UNIVAC used magnetic tape for input/output rather than the punched cards which had been used in the earlier machines. It was the first computer commercially produced for businesses.^8 A total of forty-six UNIVAC computers were sold.^9

*Grandpa Guinee:* The early computer had many bugs in it and was very frequently inoperative because of the failure of a particular electronic component. So, when they were first put out on the market, people said, "Well, this will never work." But they have overcome all of their bad publicity and now are accepted as a necessity in business.

In 1953, IBM produced its 701 computer, and then two years later its 702 computer. IBM continued to develop and expand its computer line, and within the next decade IBM managed to corner over seventy percent of the industrial computer market.^10

*Uncle Murph:* The first computers that we used used a device called an electronic drum. It did not have any disk or tape; it was fed by cards in and cards out and had no printer. The drum had 2,000 words of fixed storage and had to contain both the instructions and the data on a drum which cycled around just like a garbage can going this way. You had certain fixed read areas, certain fixed punch areas, and certain fixed print areas. So, after you allocated the read area, the punch area, and the print area, the rest was for the program, and then you had a little bit of data.
But basically the medium for input to the computer was the punched card, so you fed cards in, but you could do repetitive programming against the data that was in the cards and produce a punched result. That didn't last very long. It was too slow, too unworkable, too expensive, too hot. Hot enough to keep my coffee warm, by the way, because there were vacuum tubes. The vacuum tubes were about that high, and they were in series.

_Second Generation Computers_

In 1947, Bell Laboratories invented the transistor.^11 This creation sparked the production of a wave of "second generation" computers. Texas Instruments improved the transistor in 1954 by using silicon rather than germanium. Using silicon was an improvement because it could withstand higher temperatures than germanium. The area of California where many manufacturers of silicon chips later located has since become known as Silicon Valley.^12

By using transistors in place of vacuum tubes, manufacturers could produce more reliable computers. Using transistors was also less expensive than building a computer with vacuum tubes. The combination of smaller size, better reliability, and lower cost made these second generation computers very popular with buyers. In 1956, using transistors, researchers at Bell Laboratories in New York built a computer called the Leprechaun. IBM, Philco, GE, and RCA quickly followed suit by producing their own transistorized computers.^13

Americans did not use these new smaller computers solely for calculations. People soon found that computers were very good at data processing. By feeding the computer input via punched cards, the computer could easily sort the data and then print out the sorted material. Computer companies started to produce two different types of computers. For scientists and engineers, they built large, powerful computers which were good at performing calculations.
*Mom:* In the mid-1960s, I worked for a company called Thomas & Betts as my summer job. I was in college at the time, and Thomas & Betts was a company that made electrical fittings. I was working in the computer department there. The computer was new to the company; it was new in companies at the time, and they were just learning how to make use of this tool. The computers that they had were very, very large. They had to be in air-conditioned rooms, and people had to make sure that no dirt was taken into the room, because all of the information that was in the computer was put on magnetic tape. They had very, very large reels of the magnetic tape, and it was all stored in this large room, where they could take the reels and get their information from them in the future if necessary.

For banks and insurance companies, computer manufacturers produced smaller computers which were good at sorting and printing. IBM marketed its 7094 for calculations and its 1401 for data processing.^14

*Mom:* There were also some smaller computers in the accounting department. I remember when we first put the payroll on computer that we would have to punch all the information that we wanted onto what were called keypunch cards, which were cardboard cards, maybe three inches by six inches. Anything that you wanted to go into the computer would first have to go onto these keypunch cards. So, there were people who sat as keypunch operators all day, punching information onto these cards, which would then be stacked up and put on the computer. Then, someone would have to program the computer to take this information and organize it into a usable form.

*Uncle Murph:* The next machine was actually a solid state machine, the 705, and it employed magnetic tape. That was, as I said earlier, a relief to me. When they say, "What was your impression of the first computer?" I say, "Relief!"
because instead of handling 19 million cards, Hollerith IBM cards with the punched holes in them, the stuff you saved as confetti for the football games, instead of handling all those cards, we didn't have to handle them anymore. We could just read them and write them to tape. Once we had them on tape and we had the right software, we could sort the tape.

*Mom:* To program the computer, you would have a circuit board. The circuit board would have wires on it, and you would physically have to move the wires on the circuit board to different positions. Suppose you had a payroll to go out; then the circuit board would have a certain position. If you were doing inventory, the circuit board would have a different position. So, the people in the company were just learning how to program the circuit boards.

*Uncle Murph:* I think the first sort we did was on about 80,000 records, and they were accounting-type records. It probably took about an hour and fifty minutes for 80,000 records. Ridiculous; I mean, once you got over 100,000, forget about it. But a couple of years later, some guys at MIT and IBM worked out a way to read backwards. Once we could read backwards, half of the tape time was not wasted anymore, so we could read backwards on one and forwards on the other. We could do it simultaneously. Unfortunately, that's the way things stayed for almost a decade, just mag tape. Almost ten years, no improvements. We're talking late fifties to late sixties now, no improvement, everybody waiting for this thing called disk.

Computer companies found that it was expensive to produce two different lines of computers, so they set to work to develop a computer which could perform both calculations and data processing equally well.

_Third Generation Computers_

In 1958, the first integrated circuit was made.^15 This invention has led to the widespread use of computers today.
Scientists found a way to reduce the size of transistors so they could place hundreds of them on a small silicon chip, about a quarter of an inch on each side.^16 This enabled computer manufacturers to build smaller computers. Using this new technology, Digital Equipment Corporation produced a minicomputer which sold for fifteen thousand dollars in 1962.^17 Two years later, IBM used chips in its 360 series computers.^18 The 360 series was IBM's solution to the problem of maintaining two different lines of computers for two different markets. Every member of the 360 family, no matter how big or powerful, was compatible with every other. This way, a company could buy a small computer to start with, and when it outgrew that machine and bought a larger one, it would still be able to use all of its old stored data.

At about this same time, the concept of a programming language was developed. Originally, programmers communicated with computers via plug boards and wires. As both the computers and the jobs to be executed became more complex, communication between the computers and their users also became more complicated. In 1956, FORTRAN, the first programming language, was developed.^19 Then in 1959, Grace Hopper helped create COBOL.^20

*Uncle Murph:* You were writing in something called machine language. You would write the instructions that the machine understood, and so your instructions to the computer were one for one. Whatever block you had on your block diagram which said how to do a payroll calculation, every step on that block diagram was at least one program step. That was really terribly inefficient. I mean, that was just terrible, slow and tedious, but not error prone, because if you didn't do it right, it just wouldn't work. I think when Grace Hopper and her friends in Washington decided to get together and have COBOL, that was really a revolutionary concept, but certainly worthwhile, because we needed some COmmon Business Oriented Language, and that's where she got the acronym.
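Uncle Murph's point about one-for-one machine instructions can be made concrete. The sketch below, written in Python, simulates an invented three-instruction accumulator machine; the opcodes, the job of computing c = a + b, and all the names are illustrative assumptions, not any real machine's instruction set.

```python
def run(program, memory):
    """Interpret (opcode, address) pairs on a toy one-accumulator machine."""
    acc = 0
    for op, addr in program:
        if op == "LOAD":     # copy a storage cell into the accumulator
            acc = memory[addr]
        elif op == "ADD":    # add a storage cell to the accumulator
            acc = acc + memory[addr]
        elif op == "STORE":  # copy the accumulator back into storage
            memory[addr] = acc
    return memory

# Machine-language style: three one-for-one steps to compute c = a + b.
memory = {"a": 7, "b": 5, "c": 0}
run([("LOAD", "a"), ("ADD", "b"), ("STORE", "c")], memory)
print(memory["c"])  # prints 12

# High-level style: the same calculation as a single statement.
c = memory["a"] + memory["b"]
```

The three-step program and the single statement compute the same sum; producing the former from the latter is exactly the translation job that compilers took over from programmers.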
The programming languages enabled programmers to write code at a higher conceptual level; a compiler would then translate the code into machine language. For instance, a programmer could now tell the computer to add two numbers simply by using the language's add operation.

The introduction of programming languages enabled this third generation of computers to contain something called an operating system. An operating system serves two functions. First, it provides a buffer between the user and the machine: it enables the user to ask the computer to perform a high-level task, and the operating system translates the task into machine language instructions. Second, the operating system keeps the various pieces of the computer running together smoothly.

The companies that manufactured the third generation computers tried to create computers which could successfully perform both calculations and sorts. However, creating such a versatile computer turned out to be a difficult job. The operating system for these computers was very complex. This complexity resulted in many errors in the code, which would cause the computers to crash.

*Uncle Murph:* I mean, it used to crash, and you would call IBM, and they would say, "Yeah, we know it crashed, but we don't know why." So, they had something called Problem Resolution Teams. We used to call them swat teams. They used to fly in and try to find out what was wrong with your software. They'd fix it, and then they'd remember what it was, and they'd tell everyone else. They'd put out a release saying change your software this way because this might happen, and as soon as they did that, other things went. It was like a virus, you know. But they did that for years; they had a swat team.

Another aspect of computing new to the third generation machines was the presence of multiprogramming. In the early days, a computer was capable of performing only one job at a time.
The problem with this method was that jobs were not continuously active. Sometimes a job would reach a point where it needed user input, so the computer would just sit and wait. Multiprogramming changed this. It enabled the computer to run a number of jobs simultaneously. The jobs would take turns using the computer's central processing unit; while one job was waiting for input, another job would execute.

*Uncle Murph:* Operating systems didn't come along until someone said we want to do multiprogramming, we want to run more than one task at a time, because up to that point you really couldn't run more than one job at a time. You could run one job; no matter how big the computer was, you could run one job. That didn't make much sense. Today, in an MVS/ESA environment with a mainframe big enough and the right kind of software, you could probably run 200 tasks within a couple of seconds. In those days, you could run one. So, it was like the funnel was this big, and as much as you can shove through the funnel, that's all you get. So, it wasn't very productive, let's say. But it was great for IBM, because they sold a lot of computers. I mean, Prudential had a floor of 705s way back when; now they have two machines. They used to have twenty, now they have two.

In 1970, IBM put a "floppy disk" drive in its 3740 system computer. Using a floppy disk provided three times more storage space and faster access to the information.^21

_Fourth Generation Computers_

Then, in 1971, Intel created the first microprocessor.^22 The microprocessor was a large-scale integrated circuit which contained thousands of transistors. The transistors on this one chip were capable of performing all of the functions of a computer's central processing unit. The reduced size, reduced cost, and increased speed of the microprocessor led to the creation of the first personal computers.
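The multiprogramming idea described above, one job executing while another waits for input or output, can be sketched with modern Python threads standing in for 1960s jobs; the job names, step counts, and timings here are invented for illustration.

```python
import threading
import time

finished = []  # names of jobs that have run to completion

def job(name, steps, io_wait):
    """A 'job' that alternates bursts of computation with waits for I/O."""
    for _ in range(steps):
        # (a burst of computation would run here)
        time.sleep(io_wait)  # the job blocks, as if waiting for input/output
    finished.append(name)

# Start two jobs "simultaneously": while one is blocked waiting, the
# processor is free to run the other, so neither monopolizes the machine.
jobs = [threading.Thread(target=job, args=(name, 3, 0.01))
        for name in ("payroll", "inventory")]
for t in jobs:
    t.start()
for t in jobs:
    t.join()

assert sorted(finished) == ["inventory", "payroll"]
```

Run back to back, the two jobs would need roughly twice the wall-clock time; interleaved, their waiting periods overlap, which is exactly the gain multiprogramming offered.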
In 1976, Steve Jobs and Steve Wozniak built the first Apple computer in a garage in California.^23 Then, in 1981, IBM introduced its first personal computer.^24 The personal computer was such a revolutionary concept, and was expected to have such an impact on society, that in 1982 "Time" magazine dedicated its annual "Man of the Year" issue to the computer.^25

When personal computers first came into use, many people did not know what they were. Some people were amazed by these new tools, and others did not like them.

*Uncle Murph:* He put it down on the conference room table in the board room, and he said, "What do you think about this?" I said, "What do I think about what? What is it?" We didn't know what it was, but talk about revolutions.

*Mom:* When I started teaching in the early eighties, we had just the beginning of computers in the classrooms. The Apple computer company offered an Apple computer to every school in the country because they felt there was going to be a large market for computer software and computers, and they wanted to get their foot in the door. Their way of doing this was to offer an Apple computer to any school that would like one. So, we happened to have an Apple computer in the school I was in. I was teaching basic skills, and I was fortunate enough to have an Apple computer in my classroom to use with the children.

*Aunt Dorothy:* I hated them. In guidance, we had to use them for scheduling. At first, I thought I could do it a lot faster by hand than by computer. But now that I'm used to it, I learned WordPerfect. We now do our recommendations on WordPerfect. So, it's a big asset and time saver, now that you know how to do it. But in the beginning, oh, I would hate it. Every time I made a mistake, I would somehow clap my hands because it would make me so angry, and everybody would laugh in the office.
*Mom:* A newspaper reporter called and asked if I would allow my children to be interviewed by him because he wanted to write an article on the use of computers in the classroom. I said, "Certainly." He came in and he spoke to the children. He asked them if they were frightened because they were going to use computers: did they picture this huge machine that was the size of a room, and would this be a frightening prospect to them? The children were quite baffled by the whole situation, because he was thinking of the computers of ten or fifteen years before then. The children were accustomed to a much smaller computer. They said no, they weren't frightened at all by the use of computers, that they looked forward to their time on the computer, and they thought that computers were really a great deal of fun.

Within a matter of years, computers spread from the workplace into the home.

*Mom:* When we got a computer in our home, it was probably in the mid-80s. My children were in elementary school at the time, and we used the computer mainly for games and for fun activities like that. And then, as they got a little older and they had to write reports, they started using the computer as a word processing tool. In later years, it has received more use in that area than as an activity center for games or that type of thing.

Personal computers have changed a great deal since the early eighties. The hardware has definitely changed: the computers are faster now, have more memory, and are relatively inexpensive. But the large increase in home use of computers has come about as the result of an increase in the quantity and quality of the software available. Originally, there was no software available, and so some people wrote their own. Companies now produce software to help people do word processing, balance their checkbooks, and store phone numbers.
*Mom:* There were not too many programs available at the time, so some of the teachers would have to try and write their own programs, or we were able to get some prepackaged programs.

*Jim:* I used to use word processing on the computer, and it was real simple: just type it, capitalize it, and have your mom come down and read it to make sure it's correct. Now, you can run it through all kinds of grammar checks, all kinds of spelling checks, and basically you really don't need to know much grammar or spelling anymore, 'cause you just type it out and the computer fixes it for you.

In addition to the many programs designed for adults, many software products are geared towards children, in particular, video games. The first video games appeared in 1975, but they were nothing like the games of today.^26 The increased processing speed and memory of computers has led to an increase in the quality of computer graphics.

*Jim:* We played it all the time. We used to sit around the TV and play Pacman. That was just a great game. We thought: "Oh, look at this. This is great graphics. Look how quick he moves, and look at the little ghosts blinking and flashing. You have got to gobble them up." But now I've seen some of the graphics on the new computer games. There's a game out called Doom, and it's almost realistic, the graphics. The change, it's unreal. Some of the sports games used to have stick figures and a little square ball that would go across the screen real slow. This was the highest there was when I was young. This was the highest technology in video games. Now, anything you want is there, anything. There's nothing you can't do with the graphics on video games.

The introduction of the integrated circuit and its development into the very-large-scale integrated circuit started a technological revolution which caused computers to invade almost every aspect of our society.
This phenomenon occurred because of the increased performance, reduced size, and reduced cost of the newer computers.

*Uncle Murph:* Today, I think, I don't know, it's almost perfect, almost perfect. I mean, you have unlimited disk. If you run out, you just buy more. If you want more lines, you just add more lines. You don't have time-division multiplexing anymore, because the hardware is so fast you don't have to worry about who gets a slice of time when they need it. The machine goes in, and just like a multiplexer, it just serves everybody. The software's so good. I mean, the response time at EDS from here to Dallas and back is less than two seconds. The large database has more than 171 million rows, DB2, less than two seconds on 171 million rows; that's pretty good. It doesn't get much faster than that. I don't think it can, 'cause we have got to send the signal down, we have got to send it back. How fast can it be? It can't be zero.

------------------------------------------------------------------------

1 "Early Vacuum Tube Devices," The On-line Encyclopedia Britannica.
2 Harpur 133.
3 Williams, Science 128.
4 "Information Age."
5 Harpur 133.
6 Harpur 135.
7 Harpur 139.
8 Williams, Science 128.
9 "First Generation Computer," The On-line Encyclopedia Britannica.
10 Harpur 143.
11 AT&T.
12 Harpur 144.
13 Harpur 153.
14 Andrew S. Tanenbaum, Modern Operating Systems (Englewood Cliffs: Prentice Hall, 1992) 6.
15 Williams, Science 164-5.
16 "Third Generation Computer," The On-line Encyclopedia Britannica.
17 Harpur 173.
18 Harpur 176.
19 Williams, Science 125.
20 Williams, Science 125.
21 Harpur 188.
22 Williams, Science 164-5.
23 Harpur 209.
24 Williams, Science 194.
25 Apple Computer, Inc., Corporate Fact Sheet (on-line).
26 Harpur 205.

------------------------------------------------------------------------

© 1995: Kathleen Guinee, A Journey through the History of Information Technology