4. Computers - As Entertainment

Underpinning all of the above developments has been the evolution of the computer over some 70 years. In 1950 computers were vast behemoths attended by many acolytes feeding them data; by 2020 the same power could hang from your wrist.

In 1951 one of the first key inventions was magnetic core memory, which allowed instructions and data to be held together in a central store of magnetic "cores" for processing to generate the output. Tiny magnetic toroids were threaded into a 32 x 32 array (1 kilobit, or 128 bytes), joined by horizontal and vertical magnetising and demagnetising wires and interlaced by diagonal sense wires to read the state of magnetisation, thus registering whether a binary 1 or 0 had been stored. This greatly sped up computer operation: once read in, instructions could be processed at electronic speed rather than at the speed of the card or tape input system, and the results could be output, once the program had finished, at the maximum speed of the printer. This type of memory remained in use until silicon-based semiconductor memory superseded it some 25 years later. The terminology nevertheless survives, for example in the term "core dump" for the contents of a computer's main memory.
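As a rough illustration of the scheme just described, the sketch below (in modern C, nothing like 1950s practice) models a single 32 x 32 core plane as an array of bits, with the destructive read followed by a rewrite that core memory required; the names plane, core_read and core_write are invented for the example.

    /* A minimal sketch (not period code) of a 32 x 32 core plane:
     * one X wire and one Y wire select a single toroid, and a read
     * is destructive, so the bit must be rewritten afterwards. */
    #include <stdio.h>
    #include <stdbool.h>

    #define SIZE 32

    static bool plane[SIZE][SIZE];   /* one bit per magnetic toroid */

    /* Write a bit by "magnetising" the core at the selected X/Y crossing. */
    void core_write(int x, int y, bool bit) {
        plane[x][y] = bit;
    }

    /* Reading resets the core to 0; the sense wire reports whether it
     * changed state, and the controller then restores the original value. */
    bool core_read(int x, int y) {
        bool bit = plane[x][y];
        plane[x][y] = false;     /* destructive read */
        core_write(x, y, bit);   /* rewrite cycle restores the data */
        return bit;
    }

    int main(void) {
        core_write(5, 12, true);
        printf("core (5,12) holds %d\n", core_read(5, 12));
        printf("plane capacity: %d bits (%d bytes)\n",
               SIZE * SIZE, SIZE * SIZE / 8);
        return 0;
    }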

Computers to that point had relied on storing data on large magnetic tape reels. These had limited storage capacity and mostly had to be exchanged by hand by specialised computer operators. Furthermore they were linear storage, with new records written at the end of the existing data, so reading a particular record could require rewinding the tape some distance to locate it. These disadvantages were overcome with disk storage, which holds data on a stack of disks served by radially moveable read and write heads. Data is stored in tracks on "logical cylinders" cutting through the disk stack, and with this "random access" approach the time taken to reach a record is almost the same for one cylinder as for another. The disks together could also hold vastly more data than the tapes, requiring far fewer changes by the operator. The IBM 350, first marketed in 1956, was the first such random access disk storage unit: a stack of 52 24-inch disks in a 1.7 m x 1.5 m x 0.75 m dustproof cabinet. From then on disks dominated computer storage; tapes remained in use, but for fewer central storage functions, usually to bring in specialised data or programmes prepared offline. IBM finally sold its disk business to Hitachi in 2003, by which time drives had been miniaturised to the 3.5" and 2.5" sizes widely used in personal computers today.
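The sketch below illustrates, in modern C, why this "random access" arrangement works: a record number is translated directly into a cylinder, surface and sector, so any record can be located without reading past the ones before it. The geometry constants and the function name are invented for the example and bear no relation to the real IBM 350.

    /* A simplified sketch of disk addressing: a logical record number
     * is converted into a cylinder, head (surface) and sector. */
    #include <stdio.h>

    #define HEADS             8   /* recording surfaces in the stack */
    #define SECTORS_PER_TRACK 16  /* records per track               */

    struct disk_address {
        int cylinder;  /* which "logical cylinder" through the stack */
        int head;      /* which surface within that cylinder         */
        int sector;    /* which record around the track              */
    };

    struct disk_address locate(long record) {
        struct disk_address a;
        a.sector   = (int)(record % SECTORS_PER_TRACK);
        a.head     = (int)((record / SECTORS_PER_TRACK) % HEADS);
        a.cylinder = (int)(record / (SECTORS_PER_TRACK * HEADS));
        return a;
    }

    int main(void) {
        struct disk_address a = locate(1234);
        printf("record 1234 -> cylinder %d, head %d, sector %d\n",
               a.cylinder, a.head, a.sector);
        return 0;
    }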

By the mid-1960s the acme of computing was the IBM System/360: a vast roomful of air-conditioned equipment, with 8-channel paper tape and 80-column card readers providing input, output on stacks of 132-column printer paper, multiple large tape and disk drives, a single operator console monitoring progress, and an operational team of several staff ministering to the computer's needs. Such computers were mainly used in business to take on the burden of record keeping, or as the central computer system for universities and scientific institutions. They were large and reliable, but expensive.

There was clearly a market for much cheaper computers, particularly in university science departments and industry, where there was demand for fuller control over the way the computer operated, so that it could control research experiments and industrial processes. This niche was filled by the minicomputer, the first of which was the DEC PDP-8 in 1965. This and later models such as the PDP-11 and VAX had their heyday until the arrival of the personal computer. They had magnetic core memory and small tape drives for programme and data input. The typical user worked at a scrolling video terminal (24 lines of 80 characters), with files held in user filespace on disk or tape; output went again to a 132-character-per-line printer, although it was also possible to attach plotters and graphics screens. The PDP-8's instructions were only 12 bits long and its instruction set very small, so even simple arithmetic such as multiplication had to be carried out by a software routine. Nevertheless the operating system's user interface was much more like that later adopted for the personal computer, with commands that were similar and in simple English.
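To give a flavour of what "arithmetic by routine" means, the sketch below shows multiplication built from repeated addition, written here in modern C for readability; a real PDP-8 routine would have been written in assembly language and would typically have used shifts and adds instead.

    /* A sketch of the kind of software routine a machine without a
     * multiply instruction needed: multiplication from repeated addition. */
    #include <stdio.h>

    unsigned multiply(unsigned a, unsigned b) {
        unsigned result = 0;
        while (b-- > 0)      /* add 'a' to the running total 'b' times */
            result += a;
        return result;
    }

    int main(void) {
        printf("7 x 6 = %u\n", multiply(7, 6));
        return 0;
    }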

The use of such computers by scientific institutions and universities meant that when it came to establishing the internet (see earlier) these machines and their operating systems were used to implement it. One flexible operating system, UNIX, which appeared around 1970, came to dominate, particularly because it was well suited to networking computers together. Initially it was written in assembly language for the PDP-11, but other machines needed to be incorporated into networks too, so a new programming language, C, which had been developed for writing UNIX system software and which gave direct access to memory, was used in 1973 to rewrite UNIX, making it far more portable to other machines. These breakthroughs together accelerated the growth of the internet and established UNIX as the operating system of choice for the computers responsible for network intercommunication, even as it spread outside academia. C is still one of the most popular programming languages and has set the style for most of its successors.
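The sketch below gives a minimal taste of the feature alluded to above - C's ability to treat an address as memory to be read and written through a pointer - which is what made it practical to write an operating system in it. For safety the "address" here is just a local buffer; on real hardware it might instead be the fixed address of a device register.

    /* A minimal illustration of direct memory access through a pointer. */
    #include <stdio.h>

    int main(void) {
        unsigned char memory[16] = {0};       /* stand-in for real RAM */
        unsigned char *address = &memory[4];  /* a pointer into it     */

        *address = 0x2A;                      /* write a byte directly */
        printf("byte at offset 4 is 0x%02X\n", memory[4]);
        return 0;
    }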

Transistors had been taking over from valves in controlling current flow in circuits since the early 1950s, as seen previously for early transistor radios and televisions, and by the late 1950s they were also being used in computers. The big step forward came with the realisation that transistors could be etched onto a silicon chip already connected in circuit configurations, making circuit designs much simpler and higher level. One of the first integrated circuits, a NOR gate - a fundamental computer logic element which can be combined to make all others - using three transistors, was produced by Robert Noyce's Fairchild Semiconductor in 1961, using essentially the same planar process as is used for integrated circuits today.
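The claim that a NOR gate can be combined to make all other logic elements is easy to verify; the short C sketch below builds NOT, OR and AND purely from a NOR function and prints their truth tables.

    /* NOR is a "universal" gate: NOT, OR and AND can all be built from it. */
    #include <stdio.h>

    int NOR(int a, int b)  { return !(a || b); }

    int NOT(int a)         { return NOR(a, a); }            /* NOT from one NOR  */
    int OR(int a, int b)   { return NOT(NOR(a, b)); }        /* OR  from two NORs */
    int AND(int a, int b)  { return NOR(NOT(a), NOT(b)); }   /* AND from three    */

    int main(void) {
        printf(" a b | NOT a | a OR b | a AND b\n");
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf(" %d %d |   %d   |   %d    |    %d\n",
                       a, b, NOT(a), OR(a, b), AND(a, b));
        return 0;
    }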

These integrated circuits then began to be incorporated into minicomputers and industrial process control equipment, taking in ever more of the components central to computer operation. The essential scheme was, first, to use data in binary form, made up of only two states, zero and one; second, to operate with a limited set of instructions held in fixed read-only memory (ROM), which manipulate incoming data in a small number of "registers" (short, fast-access storage blocks) to produce results, also in registers, ready for further manipulation or output to storage; third, to bring programme instructions and data progressively from storage into changeable random access memory (RAM) as needed; and fourth, to govern and coordinate the whole process with a clock, ensuring the results of one instruction are available before the next is executed. In 1971 Intel brought out the 4004, the first commercially available "microprocessor" - a general-purpose, clock-driven, register-based digital integrated circuit processing binary data - in other words a computer on a chip. It contained some 2,300 transistors, and since then the number of transistors on essentially the same size of chip has doubled roughly every two years.
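The sketch below is a toy model, in C, of the register, ROM, RAM and clock scheme just described: a fixed "program" is fetched, decoded and executed one instruction per simulated clock cycle. The three-instruction set (LOAD, ADD, PRINT) is invented for the example and does not correspond to the 4004 or any real chip.

    /* A toy fetch-decode-execute loop with a fixed program in "ROM". */
    #include <stdio.h>

    enum opcode { LOAD, ADD, PRINT, HALT };

    struct instruction { enum opcode op; int operand; };

    /* "ROM": the fixed program.  Computes 2 + 3 and prints the result. */
    static const struct instruction rom[] = {
        { LOAD, 2 }, { ADD, 3 }, { PRINT, 0 }, { HALT, 0 }
    };

    int main(void) {
        int accumulator = 0;   /* a register holding intermediate results */
        int pc = 0;            /* program counter register                */

        for (;;) {             /* each pass stands in for one clock cycle */
            struct instruction ins = rom[pc++];  /* fetch                 */
            switch (ins.op) {                    /* decode and execute    */
            case LOAD:  accumulator  = ins.operand; break;
            case ADD:   accumulator += ins.operand; break;
            case PRINT: printf("result: %d\n", accumulator); break;
            case HALT:  return 0;
            }
        }
    }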

Among the earliest applications of such chips were "pocket" calculators such as the Hewlett-Packard HP-35 (1972) and the Texas Instruments SR-50 (1974) - in truth better suited to a pouch on the belt than to a pocket. Nevertheless they finally freed engineers and scientists from the inaccuracy of slide rule calculations and the drudgery of mathematical tables, and were a "must have" despite pricetags of $170 or more. By the end of the 1970s home micro-computers were coming onto the market, integrating a keyboard and using the family's existing colour TV as the display. The Atari 400 and 800 were leaders in the field, mainly offering arcade-type games for entertainment. Later the BBC, in cooperation with Acorn Computers, produced the Proton (better known as the BBC Micro), which widened the capability to include word processing, bringing this to the average family and perhaps launching the idea of the home office.

In 1981 IBM joined the fray with the IBM PC. This was an open architecture standard of keyboard, (cathode ray tube) screen, an optional built-in disk drive and removable (initially 5.25") floppy disks, together with a number of expansion slots which could take more specialised cards, such as sound cards (the basic computer could only beep) and higher-resolution graphics cards. The open standard enabled other manufacturers to offer their own versions, and many hobbyists assembled their own machines from the best parts available at the time. The screen was basically 25 lines of 80 characters, although there was some low-resolution colour graphics capability. The operating system was almost exclusively Microsoft's DOS, although there were also some licensed variants. This gifted Microsoft its first near monopoly and led to its first company fortune, and Microsoft stayed with this basic operating system for years, even underneath its later products. DOS bears some resemblance to Digital Equipment's operating systems for its PDP and VAX ranges of minicomputers.

In 1984 Apple introduced its Macintosh computer, adding a mouse to the equipment. This was the first widely used computer with a graphical user interface, and it quickly established a new standard for personal computing. Its ease of use compared with the text-based IBM approach led to the first fortune for Apple. Under pressure from this competition, Microsoft introduced its own graphical user interface, Windows, in 1985. This had much the same functionality, but was an inferior and unreliable implementation because it ran on top of DOS rather than being a purpose-built operating system in its own right; this was only rectified in later versions, from Windows XP in 2001. Apple claimed Microsoft had copied the Macintosh interface in many respects, but the claim was defeated in the courts, perhaps partly because of the inferiority of the Windows implementation.

In 1987 IBM, seeing its market share slipping, introduced the PS/2 computer, establishing a new, this time proprietary, hardware standard, with higher-resolution graphics (VGA: 16 colours at 640 x 480 pixels) and a higher-density (1.44 MB) floppy disk drive.

By 1991 companies had started to develop laptop computers with liquid crystal displays, and Apple's PowerBook range established a new standard with what at the time seemed an odd decision: placing the pointing device (a trackball) between the user and the keyboard. Users turned out to prefer this arrangement, which set the pattern for laptops still in evidence today. 1991 also saw SanDisk introduce the first solid state drive based on "flash memory", a 20 MB unit fitting into the same space as the hard disk it replaced in an IBM ThinkPad. Flash memory is ever more widely used today, especially in portable devices, now in capacities of up to several terabytes.

The applications in use on personal computers today have of course moved on from the games and office activities of the early days, and far more features come as standard, including high-level graphics and sound generation (e.g. driven by MIDI). But probably the most influential development came in 1990, when Tim Berners-Lee, working at CERN, wrote the first web browser for information made available by others on the internet, thereby inventing the world wide web. His aim was the sharing of scientific information between high energy physicists, but the impact was on everyone: suddenly any information could be shared rapidly - one just had to know where to get it. This stimulated the growth of "search engines" that tried to index the rapidly increasing volume of material people made available. Finding the most relevant pages in that flood was becoming a hopeless task, until Larry Page and Sergey Brin, as part of their PhD studies at Stanford, invented the concept of "page ranking" of world wide web results - listing results according to how many other pages link to a given page, and how important those linking pages themselves are. This has led to Google, the company they founded, dominating the search market ever since.
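The sketch below illustrates the basic link-counting idea in C: pages with more inbound links are listed first. The handful of pages and links are invented for the example, and the real PageRank algorithm goes further, weighting each link by the rank of the page it comes from.

    /* A much-simplified sketch of ranking pages by inbound links. */
    #include <stdio.h>

    #define NPAGES 4

    static const char *names[NPAGES] = { "home", "news", "blog", "shop" };

    /* links[i][j] == 1 means page i links to page j */
    static const int links[NPAGES][NPAGES] = {
        { 0, 1, 1, 0 },   /* home -> news, blog */
        { 0, 0, 1, 0 },   /* news -> blog       */
        { 0, 1, 0, 0 },   /* blog -> news       */
        { 0, 1, 1, 0 },   /* shop -> news, blog */
    };

    int main(void) {
        int inbound[NPAGES] = {0};
        int done[NPAGES] = {0};

        /* count how many pages link to each page */
        for (int i = 0; i < NPAGES; i++)
            for (int j = 0; j < NPAGES; j++)
                inbound[j] += links[i][j];

        /* print pages in descending order of inbound links */
        for (int rank = 0; rank < NPAGES; rank++) {
            int best = -1;
            for (int j = 0; j < NPAGES; j++)
                if (!done[j] && (best < 0 || inbound[j] > inbound[best]))
                    best = j;
            done[best] = 1;
            printf("%d. %s (%d inbound links)\n",
                   rank + 1, names[best], inbound[best]);
        }
        return 0;
    }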

Despite Google's ability to rank the information available, there was little or no guarantee that it was correct. Wikipedia was therefore started in 2001 by Jimmy Wales and Larry Sanger with the intention of becoming the definitive reference on all topics - an encyclopaedia for everyone. Any registered user can edit the texts, but the aim is to present a neutral, objective viewpoint. If editors repeatedly undo each other's changes, moderators step in to try to find a compromise, and editing of an article can be restricted to stop vandalism. The system is essentially self-governing and is funded by user donations, and over the years it has gained an increasing reputation for accuracy and depth; it is now many people's first port of call for information on any topic (as for almost all the facts in this website). The very fact that it cannot automatically be considered definitive encourages users to view its pages critically, to look for alternative viewpoints, and to correct it when it is in error, thus, ironically, improving its accuracy.

Another innovative application, started in 2003, was MySpace, the first social media network with global reach, which grew to 115 million users by 2008 but was thereafter overtaken by Facebook, which by 2020 claimed some 2.8 billion users. These networks allow people to share their lives and views, presenting themselves to the world or just to their friends. They also give the companies running them a captive audience to advertise to, and the ability to monetise users' interests for the benefit of the company. They have likewise given fringe viewpoints the opportunity to reach a wide audience - for example conspiracy theories about the effectiveness of, and justification for, vaccines.

The ability of people to take video of events around them, or to produce films about their work, has been enhanced dramatically by another application: YouTube, started in 2005 by Steve Chen, Chad Hurley and Jawed Karim to allow the online sharing of video material. Users may upload videos on any (legal) topic, from the serious to the frivolous (hence the millions of cute dog and cat videos!). YouTube started the trend for streaming video material that had previously been transmitted terrestrially, by satellite or by cable. This in turn has stimulated demand for faster "broadband" internet connections, and TV companies worldwide increasingly offer their programmes streamed over the internet instead of over the air, satellite or cable. The television receiver accordingly now needs to be a "smart TV" capable of receiving streamed material from the internet - in other words, an internet terminal.

One other application which has had a major influence, perhaps more for smartphone users than for computer users, is Twitter, launched in 2006. It is a "microblogging" service where users post short messages ("tweets") online, to be viewed by their friends or by anyone who "follows" them, who can then "like" a posting and re-tweet it to others if they wish. Tweets were initially limited to 140 characters but are now allowed 280. Both Twitter and Facebook are ideal media for organising groups for action over long distances; this has made demonstrations and other concerted actions against repressive regimes highly effective, and has brought about social change. The downside, as with the internet in general, is that, being a free-for-all, there is no guarantee that the content is correct.
