Computer to Computer Communication

Services intended for access by microcomputers are nowadays usually presented in a very user-friendly fashion: pop in your software disc or firmware, check the connections, dial the telephone number, listen for the tone... and there you are. Hackers, interested in venturing where they are not invited, enjoy no such luxury. They may want to access older services which preceded the modern 'human interface'; they are very likely to travel along paths intended, not for ordinary customers, but for engineers or salesmen; they could be utilising facilities that were part of a computer's commissioning process and have hardly been used since. So the hacker needs a greater knowledge of datacomms technology than does a more passive computer user, and some feeling for the history of the technology is pretty essential, because of its growth pattern and because many interesting installations still use yesterday's solutions.

Getting one computer to talk to another some distance away means accepting a number of limiting factors:

* Although computers can send out several bits of information at once, the ribbon cable necessary to do this is not economical at any great length, particularly if the information is to be sent out over a network--each wire in the ribbon would need switching separately, making exchanges prohibitively expensive. So bits must be transmitted one at a time, or serially.

* Since you will be using, in the first instance, wires and networks already installed--in the form of the telephone and telex networks--you must accept that the limited bandwidth of these facilities will restrict the rate at which data can be sent. The data will pass through long lengths of wire, frequently being re-amplified, and undergoing degradation as it passes through dirty switches and relays in a multiplicity of exchanges.

* Data must be capable of accurate recovery at the far end.

* Sending and receiving computers must be synchronised in their working.

* The mode in which data is transmitted must be one understood by all computers; accepting a standard protocol may mean adopting the speed and efficiency of the slowest.

The present 'universal' standard for data transmission used by microcomputers and many other services uses agreed tones to signify binary 0 and binary 1, the ASCII character set (also known as International Alphabet No 5), and an asynchronous protocol, whereby the transmitting and receiving computers are locked in step every time a character is sent, not just at the beginning of a transmission stream (a small sketch of this framing appears below). Like nearly all standards, it is highly arbitrary in its decisions and derives its importance simply from being generally accepted. Like many standards, too, it has a number of subtle and important variations. To see how the standard works, how it came about and the reasons for the variations, we need to look back a little into history.

The Growth of Telegraphy

The essential techniques of sending data along wires have a history of 150 years, and some of the common terminology of modern data transmission goes right back to the first experiments. The earliest form of telegraphy, itself the earliest form of electrical message sending, used the remote actuation of electrical relays to leave marks on a strip of paper. The letters of the alphabet were built up from two line conditions: 'mark', when current flowed and the pen marked the paper, and 'space', when it did not. The terms have come through to the present, to signify binary conditions of '1' and '0' respectively.
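To make the framing concrete in modern terms, here is a small illustrative sketch (in Python, and not part of the original text) of how an asynchronous link sends one character: the line idles at mark ('1'), a start bit of space ('0') announces the character, the data bits follow, and a stop bit of mark closes the frame. The 7-bit ASCII width, the single stop bit, the absence of a parity bit and the least-significant-bit-first order are all assumptions chosen for illustration.

    # Illustrative sketch only: framing characters for asynchronous serial
    # transmission, using 'mark' = 1 and 'space' = 0 as described above.
    # Assumptions: 7-bit ASCII data, one start bit (space), one stop bit (mark),
    # least-significant bit first, no parity bit.

    def frame_character(ch):
        """Return the line states (0 = space, 1 = mark) used to send one character."""
        code = ord(ch) & 0x7F                             # 7-bit ASCII code point
        data_bits = [(code >> i) & 1 for i in range(7)]   # least-significant bit first
        return [0] + data_bits + [1]                      # start bit, data, stop bit

    def frame_message(text):
        """Frame a whole message; the line idles at mark (1) between characters."""
        bits = [1, 1]                                     # idle line before the message
        for ch in text:
            bits.extend(frame_character(ch))
        bits.append(1)                                    # return to idle
        return bits

    print(frame_character('A'))   # 'A' = 1000001 -> [0, 1, 0, 0, 0, 0, 0, 1, 1]

The receiver re-synchronises on every start bit, which is why the scheme is called asynchronous: the two ends need agree only on the nominal bit rate, not on a continuous shared clock.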
The first reliable machine for sending letters and figures by this method dates from 1840; the direct successor of that machine, using remarkably unchanged electromechanical technology and a 5-bit alphabetic code, is still widely used today, as the telex/teleprinter/teletype. The mark and space have been replaced by holes punched in paper-tape: a hole for mark, no hole for space. Synchronisation between sending and receiving stations is carried out by beginning each letter with a 'start' bit (a space) and concluding it with a 'stop' bit (a mark). The 'idle' state of a circuit is thus 'mark'. In effect, therefore, each letter requires the transmission of 7 bits:

    . * * . . . *     (letter A:  . = space,  * = mark)

of which the first . is the start bit, the last * is the stop bit and * * . . . is the code for A. This is still the principal means of sending text messages around the world, and the way in which news reports are distributed globally. And, until third-world countries are rich enough to afford more advanced devices, the technology will survive.

Early computer communications

When, 110 years after the first such machines came on line, the need arose to address computers remotely, telegraphy was the obvious way to do so. No one expected computers in the early 1950s to give instant results; jobs were assembled in batches, often fed in by means of paper-tape (another borrowing from telex, and still in use), and then run. The instant calculation and collation of data was then considered quite miraculous. So the first use of data communications was almost exclusively to ensure that the machine was fed with up-to-date information, not for the machine to send the results out to those who might want them; they could wait for the 'print-out' in due course, borne to them with considerable solemnity by the computer experts. Typical communications speeds were 50 or 75 baud. (The baud is the measure of the speed of data transmission: specifically, it refers to the number of signal level changes per second and is thus not the same as bits-per-second; a short worked example of the distinction appears at the end of this section.)

These early computers were, of course, in today's jargon, single-user/single-task; programs were entered in direct machine code. Gradually, over the next 15 years, computers acquired multi-user capabilities by means of time-sharing techniques, and their human users came to work with them interactively rather than waiting for batch print-outs. With these facilities grew the demand for remote access to computers, and modern data communications began.
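As the worked example promised above (again an editorial sketch, not part of the original text), the following calculation relates baud rate to character throughput. It assumes one signal level change per bit, so that the line rate in bits per second happens to equal the baud rate, and uses the 7 line bits per character of the telex framing described earlier; real telex circuits often use a slightly longer stop element, so true figures are a little lower.

    # Worked example: relating baud rate to character throughput.
    # Assumptions: one signal level change per bit (so bits-per-second equals
    # baud here) and 7 line bits per character (start + 5-bit code + stop),
    # as in the telex framing described earlier.

    def chars_per_second(baud, bits_per_symbol=1, bits_per_char=7):
        bits_per_second = baud * bits_per_symbol   # equal to baud only at 1 bit/symbol
        return bits_per_second / bits_per_char

    print(round(chars_per_second(50), 1))   # 50 baud -> about 7.1 characters/second
    print(round(chars_per_second(75), 1))   # 75 baud -> about 10.7 characters/second

On a modem that uses multi-level signalling, each signal change carries more than one bit, which is why faster services can quote a bits-per-second figure higher than their baud rate.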