Computing - Research Databases, The Computer Industry, Digital Information, and Knowledge

Computers and computer networks have changed the way in which people work, play, do business, run organizations and countries, and interact with one another on a personal level. The workplace of the early twentieth century was full of paper, pens, and typewriters. The office of the early twenty-first century is a place of glowing monitor screens, keyboards, mice, scanners, digital cameras, printers, and speech recognition equipment. The office is no longer isolated; it is linked by computer networks to others like it around the world. Computers have had such an effect that some say an information revolution is occurring. This revolution may be as important as the printing revolution of the fifteenth century, the industrial revolution of the nineteenth century, or the agricultural revolutions of the ancient and medieval worlds.

The computer was invented to perform mathematical calculations. It has become a tool for communication, for artistic expression, and for managing the store of human knowledge. Text, photographs, sounds, or moving pictures can all be recorded in the digital form used by computers, so print, photographic, and electronic media are becoming increasingly indistinguishable. As Tim Berners-Lee (1998), developer of the World Wide Web, put it, computers and their networks promise to become the primary medium in which people work and play and socialize, and hopefully, they will also help people understand their world and each other better.

During the last half of the twentieth century, electronic digital computers revolutionized business, learning, and recreation. Computers are now used in newspaper, magazine, and book publishing, and in radio, film, and television production. They guide and operate unmanned space probes, control the flow of telecommunications, and help people manage energy and other resources. They are used to categorize and preserve the store of human knowledge in libraries, archives, and museums. Computer chips called “embedded microprocessors” are found in the control systems of aircraft, automobiles, trains, telephones, medical diagnostic equipment, kitchen appliances, and farm equipment. The effect on society has been so great that digital information itself is now exchanged more rapidly and more extensively than the commodities or manufactured goods it was originally supposed to help manage. Information has become an essential commodity and, some would argue, a necessary social good.

The history of computing is several stories combined. One is a hardware story—a tale of inventions and technologies. Another is a software story—a tale of the operating systems that enabled specific computers to carry out their basic functions and the applications programs designed to deliver services to computer users. A third story tells how computers provide answers to the problems of society, and how they in turn create new possibilities for society.

Research Databases

Computing has changed the way writers research and prepare scientific articles. During the early 1970s, a small number of databases containing “abstracts” (i.e., summaries of scholarly and popular articles) could be searched offline. Users submitted lists of subjects or phrases on coding forms. Keypunchers typed them onto computer cards, and operators processed them on mainframe computers. The answers would be available the next day. Library catalogs were printed on paper cards or produced as computer output microform (COM). A microfiche, one common form of COM, is a transparent plastic sheet, roughly the size of an ordinary index card, that contains the images of many pages of computer output.

The Library of Congress and national libraries in other countries had by this time converted most of the descriptions of the books they owned into machine-readable form. Toward the end of the 1970s, research databases and library catalogs were becoming widely available online. The Dialog search service, and library services such as the Online Computer Library Center (OCLC), made it possible to search the contents of many journals or the holdings of many libraries at once. Standards such as the Machine-Readable Cataloging (MARC) format made it possible to exchange this information worldwide and to display it on many different types of computers. However, limits on computer disk space, telecommunications capacities, and computer processing power still made it impractical to store the full text of articles.

Because of the costs, researchers working for large institutions were the main users of these services. By the mid-1980s, when microcomputer workstations became widely available and compact disc read-only memory (CD-ROM) became a practical distribution method, much research could be conducted without connecting to large central databases. Companies such as EBSCO and InfoTrac began licensing CD-ROMs to their subscribers. With better magnetic “hard” disks and faster microcomputer chips, full-text storage and retrieval finally became workable.

By the end of the twentieth century, databases and catalogs could be accessed over the Internet, on CD-ROM, or through dial-up connections. Some of the special databases include ERIC (for educational issues), Medline and Grateful Med (for medical issues), and Inspec (for engineering issues). Legal research was simplified by services such as Lexis and Westlaw, which allowed identification and cross-referencing of U.S. and international statute and case law. In one of the more interesting applications of computing technology, the Institute for Scientific Information in Philadelphia, Pennsylvania, introduced its citation indexing services, which allow researchers to discover important authors and issues by revealing which authors quote one another. Some databases are free of charge, and some are available for a fee.
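
As a rough illustration of the citation-indexing idea (not taken from the original entry), the following Python sketch counts how often each author is cited in a small, invented set of records:

    # Record who cites whom, then count incoming citations to surface
    # influential authors. The citation data below is invented for illustration.
    from collections import Counter

    citations = {                      # author -> authors that author cites
        "Adams": ["Chen", "Baker"],
        "Baker": ["Chen"],
        "Chen": [],
        "Diaz": ["Chen", "Adams"],
    }

    cited_counts = Counter(name for cited in citations.values() for name in cited)
    print(cited_counts.most_common())  # the most-cited author ("Chen", 3) appears first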

A researcher at a public library, in a television newsroom, or in a medical practice can perform searches against thousands of special databases and millions of sites on the World Wide Web. While this sort of research was possible with printed directories in the past, it was time consuming and labor intensive. However, searching for data electronically can have unexpected results. Because the computer does not really understand what the string of letters “Jim Smith” means, it will faithfully report any occurrence it finds, regardless of the context. Information retrieval theory and informetrics are two fields that study the implications.
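
The following Python sketch, with invented records and an invented query, illustrates this kind of literal matching: every record containing the query terms is reported, whoever the “Jim Smith” in question happens to be.

    # Minimal keyword matching: report any record that contains every query term,
    # with no understanding of who "Jim Smith" is. Sample data is invented.
    records = [
        "Jim Smith elected to city council",
        "Jane Smith thanks Jim Jones for his support",   # both terms, wrong person
        "Obituary: Jim Smith, blacksmith, dies at 90",
    ]

    query_terms = {"jim", "smith"}
    for record in records:
        words = set(record.lower().replace(",", "").replace(":", "").split())
        if query_terms <= words:          # every query term appears somewhere
            print(record)                 # all three records are reported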

The Computer Industry

In the late 1960s, some writers scoffed at the potential of computers. The mainframe machines of the time occupied entire rooms, and only large institutions could afford them. No computer yet conceived, one writer suggested, weighed less than a human being or was capable of performing as many tasks.

Without the transistor and the integrated circuit, computers would still fill large rooms. Without the laser and improved plastics, optical storage media such as CD-ROMs and digital versatile discs (DVDs) would not be possible. Magnetic tapes and disks have also improved greatly over the years and can now store much more information than they could in the past. It is difficult to buy an item in a supermarket or to borrow a book from a library without that item carrying a barcode label. Credit and debit cards with magnetic stripes make it easier to access bank accounts and make retail purchases. Inventions such as these are part of the story of computing, although they are often overlooked.

For example, a minicomputer of the mid-1980s could cost about $500,000 and might contain 64 kilobytes (KB) of random access memory (RAM). By the end of the century, a magnetic floppy disk holding 1.4 megabytes (MB) sold for less than a dollar, a CD-ROM disc that held 650 MB cost less than two dollars, and desktop microcomputers with 64 MB of RAM were common household items.
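
A quick back-of-the-envelope calculation with the figures quoted above (rounded, and for illustration only) shows the scale of the change:

    # Arithmetic with the figures in the paragraph above; approximate only.
    minicomputer_ram_kb = 64                 # mid-1980s minicomputer, in kilobytes
    desktop_ram_mb = 64                      # late-1990s desktop, in megabytes
    floppy_mb, cdrom_mb = 1.4, 650

    print(desktop_ram_mb * 1024 / minicomputer_ram_kb)   # RAM grew roughly a thousand-fold
    print(round(cdrom_mb / floppy_mb))                    # about 464 floppy disks per CD-ROM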

As the industry grew, so did the legends of inventors who made fortunes or revolutionized the industry. William R. Hewlett and David Packard started their company in a garage. Graduate students David Filo and Jerry Yang developed the Yahoo! Internet directory in a dormitory room. Steve Jobs of Apple Computer, Bill Gates of Microsoft, and the heads of many other companies in California’s Silicon Valley became known around the world.

Computer engineers and programmers have often exchanged their ideas openly, out of scientific duty. The Xerox Corporation hit on the idea of the graphical user interface (GUI), developed the “mouse,” and then told everyone how to produce them. Linus Torvalds developed the Linux operating system as a personal project and then made it available for free. Universities also have a long history of developing software and computers and then sharing the knowledge.

Digital Information

Computing involves three activities: input, process, and output. Data enters the computer through a keyboard or mouse, from a camera, or from a file previously recorded on a disk. A program or “process” manipulates the data and then outputs it to a screen, printer, disk, or communications line.
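
A minimal Python sketch of this input-process-output cycle, using the keyboard and screen in place of other devices, might look like the following:

    # A minimal input-process-output cycle: read a line, transform it, display it.
    def process(text: str) -> str:
        """The 'process' step: here, simply count the words that were typed in."""
        return f"{len(text.split())} word(s) received"

    line = input("Type a line of text: ")   # input step (keyboard)
    result = process(line)                  # process step (the program at work)
    print(result)                           # output step (screen)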

Over the years, many different input devices have been used, including punched paper tape, punched cards, keyboards, mice, microphones, touch-screens, and video cameras. Output devices have included paper printouts, teletypewriters, and video monitors. The part of the computer that does the processing is known as the central processing unit (CPU). Collectively, everything other than the CPU, including memory boards, disks, printers, keyboards, mice, and screens can be thought of as peripheral devices, or just “peripherals.”

There are two sorts of computer software. Operating systems, such as Microsoft Windows, the Macintosh operating system, or UNIX, allow machines to perform their basic functions—accepting input, running programs, and sending output to users. Applications programs, such as word processors, Internet browsers, electronic mail programs, or database management programs, do the work required by computer users.

Digital computers use data that has been encoded as a series of zeros and ones—binary digits, or bits. Text, images, sounds, motion pictures, and other media can all be represented as strings of zeros and ones and processed by digital computers. Programs—the instructions on how to manipulate data—are also represented in binary form. The earliest digital computers were designed to store and manipulate numbers and the letters and symbols found on typewriter keyboards. The American Standard Code for Information Interchange (ASCII) uses 128 combinations of bits to represent the letters, numbers, and symbols on a typewriter keyboard. Plain text worked well when computers were used primarily for mathematics.
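
As an illustration of the encoding idea (the sample text is arbitrary), the following Python sketch prints the ASCII code and seven-bit pattern for each character of a short string:

    # Each keyboard character maps to one of 128 numeric codes,
    # which the computer stores as a pattern of bits.
    text = "Hi!"
    for ch in text:
        code = ord(ch)                        # ASCII code point, 0-127 for plain text
        print(ch, code, format(code, "07b"))  # seven bits are enough for 128 values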

Binary numbers can represent visual and audio information as well. By the end of the 1980s, designers had expanded the coding systems to store drawings, photographs, sounds, and moving pictures. Each dot on a screen is called a “picture element” (or “pixel”). To display graphics on the screen, computers use groups of binary numbers—ones and zeros—to represent the color, intensity of light, and position of each pixel.
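
A minimal sketch of this idea, with an invented pixel, shows how one dot’s position and red-green-blue color value reduce to a string of bits:

    # One pixel represented as a position plus a red-green-blue color value,
    # each color channel stored as an 8-bit integer. The values are invented.
    pixel = {"x": 10, "y": 20, "rgb": (255, 128, 0)}   # an orange dot

    bits = "".join(format(channel, "08b") for channel in pixel["rgb"])
    print(bits)   # 24 ones and zeros describing the color of this single pixel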

Modern computers almost always use some type of GUI. Programmers use small graphics called “icons” to represent a program, a document, a movie, or a musical work. When a user selects an icon, the computer can open the file or program that is associated with it. This approach draws on object-oriented programming, in which data and the actions that can be performed on it are bundled together into “objects.”
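
A small, hypothetical Python sketch of this idea follows; the Icon class and its file names are invented for illustration:

    # Each icon object bundles its data (a label and an associated file)
    # with the action it performs when selected.
    class Icon:
        def __init__(self, label: str, path: str):
            self.label = label
            self.path = path

        def open(self) -> None:
            # A real desktop would launch the associated program here.
            print(f"Opening {self.path} for icon '{self.label}'")

    desktop = [Icon("Report", "report.txt"), Icon("Song", "song.mp3")]
    desktop[0].open()   # selecting the first icon opens its associated file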

When the price of computers dropped, it became possible to distribute work among several machines on a network instead of using a large central computer. A piece of software called a “server” could now send information to smaller programs called “clients” located at the workstations. Shared files remain on large computers called “file servers,” so several users can access them at once. Internet browsers, such as Netscape and Internet Explorer, are good examples of “client/server” design at work, where the browser is a client and an Internet site hosts the server software and the large files of information.
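
The following Python sketch, which is illustrative rather than a description of any real browser, runs a tiny server and a client in one program to show the division of labor:

    # A toy client/server exchange using the standard library: the server
    # answers requests for information, and the client asks for it.
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"Hello from the server"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):
            pass   # keep the example's output quiet

    server = HTTPServer(("127.0.0.1", 0), Handler)      # port 0: pick any free port
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()

    print(urlopen(f"http://127.0.0.1:{port}/").read().decode())  # the client's request
    server.shutdown()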

There are many programming languages, each better suited to certain types of problems. The Formula Translation language (FORTRAN) was developed to handle scientific problems. The Beginner’s All-purpose Symbolic Instruction Code (BASIC) and the Common Business-Oriented Language (COBOL) were better for office automation. The languages C, C++, Java, and Visual Basic use libraries of small, interchangeable programs that perform frequently required tasks, such as sorting items or displaying them on a screen. Programmers can combine these small programs into more complex systems, which allows them to build new applications quickly. Other languages, such as Prolog and LISP, were invented for work in artificial intelligence, while Ada was designed to address military needs.
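
As a small illustration of this reuse (the inventory data is invented), the following Python sketch combines two standard library routines, one for sorting and one for display:

    # Reuse library routines rather than rewriting them: sorted() orders the
    # items and pprint() displays them.
    from pprint import pprint

    inventory = [("widgets", 12), ("gears", 3), ("bolts", 40)]
    pprint(sorted(inventory, key=lambda item: item[1]))   # list ordered by quantity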

Once personal computers were available, the demand for special software packages or “applications” increased. Spreadsheets, such as the early SuperCalc and Excel, have simplified accounting and statistical processes, and they allow users to try out various financial scenarios. If the costs or quantities of items change, the results appear immediately on the screen. A whole range of database management packages, including dBase, FoxPro, Oracle, and Access, help users do inventories, maintain customer profiles, and more. Because records in one database file can be matched against those in another, say a customer demographic file against a warehouse inventory file, businesses can predict supply and demand trends and improve the delivery of goods and services. Geographic information systems, online census data, and telephone directories make it easier to market products in areas where there is demand. Some critics argue that using data for reasons other than those for which it was collected is an invasion of privacy. In many countries, freedom of information and privacy protection laws have been passed to address these issues.
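
A hedged Python sketch of such record matching follows; both files and the shared region key are invented for illustration:

    # Match records from two "files" on a shared key and compare the results,
    # a crude supply-versus-demand signal. All data is invented.
    customers = [
        {"region": "north", "households": 1200},
        {"region": "south", "households": 800},
    ]
    inventory = [
        {"region": "north", "stock": 150},
        {"region": "south", "stock": 400},
    ]

    stock_by_region = {row["region"]: row["stock"] for row in inventory}
    for c in customers:
        ratio = stock_by_region[c["region"]] / c["households"]
        print(c["region"], round(ratio, 2))   # units in stock per household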

Computing and Knowledge

Computers have changed the world in which people live and work, and they have provided new ways of thinking about, and making sense of, that world. At the beginning of the twenty-first century, computer science is a mature academic discipline, with almost every university or college offering computer courses.

As an academic subject, computer science may involve information theory, systems analysis, software engineering, electrical engineering, programming, and information studies that examine the use of digital information. The founders of information theory, Claude Shannon and Warren Weaver, published The Mathematical Theory of Communication in 1949. The mathematician Norbert Wiener, who coined the term “cybernetics,” showed how computing theories could be applied to problems of communication and control in both animals and machines. Ludwig von Bertalanffy founded general system theory because he saw that large, complex systems did not necessarily behave in the same way that their individual components did. He is considered one of the founders of systems analysis.

Professional associations have also played important roles in the development of computing theory, practice, and standards. The Association for Computing Machinery, the Institute of Electrical and Electronics Engineers, the International Organization for Standardization, and the World Wide Web Consortium are all agencies concerned with computing methods and standards. Less widely known groups, such as the International Society for the Systems Sciences and Computer Professionals for Social Responsibility, concern themselves with professional ethics and the social effects of computing. Computing has its own journals and magazines that are aimed at special groups of professionals and at consumers.

Modern computing researchers come from many backgrounds. In turn, scholars from other areas apply computing theory and systems analysis to their own disciplines—from philosophy to psychology to social work. Centers such as the Media Lab at the Massachusetts Institute of Technology or the Xerox Corporation’s Palo Alto Research Center bring together experts from many fields to design “neural networks” that simulate the human brain, to build smaller and faster machines, or to find better ways of managing digital information. Nicholas Negroponte, Marvin Minsky, and their colleagues at the Media Lab are associated with developments in artificial intelligence and robotics.

Some people fear that while computers relieve humans of repetitive tasks, they may also “de-skill” workers who forget how to do such tasks by hand. Others suggest that having to cope with computers on the job adds extra stress, raises expectations of promptness, and requires ongoing retraining of workers. Because computing has made it possible to recombine and repackage stories, pictures, and sounds, some fear that the work of authors may one day be regarded as interchangeable, much like mechanical parts. In addition, as people depend more on computers, they become more vulnerable to system failure. If the world’s computers should fail all at once, economic and social chaos might result. A series of Internet “worms” and “viruses” heightened concern over society’s dependence on computers during 1999 and 2000. Governments, banks, companies, and individuals worried that the clocks in their computers might fail at the beginning of 2000, but the “Y2K” crisis they feared did not occur.

Computer designers and computer users think about computers in different terms, and they use different jargon. Hackers, who explore aspects of computers that designers could not have foreseen, have their own way of looking at and talking about computers. People who use computers for destructive purposes are more properly called “crackers.” Finally, those people who do not have access to computers run the risk of economic and educational hardships.
