An analog signal is a continuous signal that represents time-varying quantities. Unlike a digital signal, which takes a discrete value only at each sampling instant, an analog signal fluctuates continuously: plotted over time, the analog signal appears as a smooth curve, while the digital signal appears as a series of discrete points.
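To make the contrast concrete, here is a minimal Python/NumPy sketch (an illustration only, with arbitrary frequencies): a densely evaluated sine wave stands in for the continuous analog curve, while a handful of samples taken at fixed instants stands in for the discrete digital pattern.

```python
import numpy as np

# A densely evaluated sine wave stands in for the "continuous" analog signal.
t_analog = np.linspace(0.0, 1.0, 1000)        # one second on a fine time grid
analog = np.sin(2 * np.pi * 5 * t_analog)     # 5 Hz tone (arbitrary choice)

# The digital version exists only at discrete sampling instants.
fs = 50                                       # assumed sampling rate in Hz
t_digital = np.arange(0.0, 1.0, 1.0 / fs)
digital = np.sin(2 * np.pi * 5 * t_digital)

print(f"analog curve points: {len(t_analog)}, digital samples: {len(t_digital)}")
```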

An analog signal can be used to measure changes in physical phenomena such as light, sound, pressure, or temperature. For instance, an analog microphone converts sound waves into an analog electrical signal. Even digital devices typically contain some analog component that takes in information from the external world; that information is then translated into digital form by an analog-to-digital converter (ADC).
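To make the analog-to-digital step concrete, the following is a hedged sketch of how an idealized converter could map a voltage onto an integer code. The reference voltage and bit depth are assumptions chosen for illustration, not properties of any particular device.

```python
def adc_read(voltage, v_ref=3.3, bits=10):
    """Map a voltage in [0, v_ref] onto the integer code of an ideal N-bit ADC."""
    levels = 2 ** bits                        # number of quantization levels
    voltage = min(max(voltage, 0.0), v_ref)   # clamp to the converter's input range
    return int(voltage / v_ref * (levels - 1))

# A sensor producing 1.65 V reads roughly mid-scale on a 10-bit, 3.3 V converter.
print(adc_read(1.65))   # -> 511
```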

An analog signal uses some property of the medium to convey the signal’s information. For example, an aneroid barometer uses rotary position as the signal to convey pressure information. In an electrical signal, the voltage, current, or frequency of the signal may be varied to represent the information.
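As a rough illustration of “varying a property of the signal”, the sketch below encodes the same slowly changing message first in the amplitude and then in the instantaneous frequency of a carrier (plain amplitude and frequency modulation). The carrier frequency, message frequency, and frequency deviation are arbitrary choices.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 10000)
message = np.sin(2 * np.pi * 2 * t)            # 2 Hz "information" signal

# Amplitude modulation: the information rides on the carrier's amplitude.
carrier_hz = 100
am = (1 + 0.5 * message) * np.cos(2 * np.pi * carrier_hz * t)

# Frequency modulation: the information shifts the carrier's instantaneous frequency.
freq_dev = 20                                  # assumed peak deviation in Hz
dt = t[1] - t[0]
phase = 2 * np.pi * np.cumsum(carrier_hz + freq_dev * message) * dt
fm = np.cos(phase)

print(f"AM envelope peaks near {am.max():.2f}; FM amplitude stays at {fm.max():.2f}")
```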

Any information may be conveyed by an analog signal; often such a signal is a measured response to changes in physical phenomena, such as sound, light, temperature, position, or pressure. The physical variable is converted to an analog signal by a transducer. For example, sound striking the diaphragm of a microphone induces corresponding fluctuations in the current produced by a coil in an electromagnetic microphone or the voltage produced by a condenser microphone. The voltage or the current is said to be an “analog” of the sound.

An analog signal is subject to electronic noise and distortion introduced by communication channels and signal processing operations, which can progressively degrade the signal-to-noise ratio (SNR). In contrast, digital signals have a finite resolution. Converting an analog signal to digital form introduces a low-level quantization noise into the signal, but once in digital form the signal can be processed or transmitted without introducing significant additional noise or distortion. In analog systems, it is difficult to detect when such degradation occurs. However, in digital systems, degradation can not only be detected but corrected as well.
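The “low-level quantization noise” of an ideal converter can be estimated directly. The sketch below quantizes a full-scale sine wave to an assumed 12 bits and compares the measured signal-to-noise ratio with the usual 6.02·N + 1.76 dB rule of thumb.

```python
import numpy as np

bits = 12
t = np.linspace(0.0, 1.0, 100000)
signal = np.sin(2 * np.pi * 50 * t)            # full-scale sine in [-1, 1]

# Uniform quantization to 2**bits levels across the [-1, 1] range.
step = 2.0 / (2 ** bits)
quantized = np.round(signal / step) * step
noise = quantized - signal

snr_db = 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))
print(f"measured SNR ≈ {snr_db:.1f} dB, rule of thumb ≈ {6.02 * bits + 1.76:.1f} dB")
```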

The most serious disadvantage of analog signals compared to digital transmission is that analog transmissions always contain noise. As the signal is copied, transmitted, or processed, the unavoidable noise introduced in the signal path accumulates as generation loss, progressively and irreversibly degrading the signal-to-noise ratio until, in extreme cases, the signal can be overwhelmed. Noise can show up as “hiss” and intermodulation distortion in audio signals, or as “snow” in video signals. Generation loss is irreversible because there is no reliable way to distinguish the noise from the signal; amplifying the signal to recover attenuated parts amplifies the noise as well. Digital signals, by contrast, can be transmitted, stored, and processed without introducing additional noise.
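Generation loss can be mimicked numerically: in the sketch below, each “copy” adds a fixed amount of independent noise, and the signal-to-noise ratio falls with every generation. The per-copy noise level is an arbitrary assumption chosen only to make the trend visible.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 10000)
clean = np.sin(2 * np.pi * 10 * t)

copy = clean.copy()
noise_per_copy = 0.02                          # assumed noise added by each copy
for generation in range(1, 6):
    copy = copy + rng.normal(0.0, noise_per_copy, copy.shape)
    error = copy - clean
    snr_db = 10 * np.log10(np.mean(clean ** 2) / np.mean(error ** 2))
    print(f"generation {generation}: SNR ≈ {snr_db:.1f} dB")
```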

In electrical analog signals, noise can be minimized by shielding, good connections, and the use of certain cable types such as coaxial or twisted pair. A notable example of an analog system is the analog computer.

An analog computer or analogue computer is a type of computer that uses the continuously changeable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved. In contrast, digital computers represent varying quantities symbolically and by discrete values of both time and amplitude.
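To give a flavor of “modeling the problem with continuously changeable quantities”: an electronic analog computer solves a differential equation by wiring integrators and summers so that voltages evolve like the variables in the equation. The sketch below digitally simulates such an integrator loop for the simple oscillator x'' = -x; the step size and initial values are arbitrary.

```python
# Simulated integrator loop for x'' = -x, the kind of setup an electronic
# analog computer would realize with two integrators and an inverting summer.
dt = 0.001
x, v = 1.0, 0.0            # initial "voltages": position and velocity

for _ in range(int(6.3 / dt)):   # roughly one period of the oscillation
    a = -x                       # summing/inverting stage: acceleration = -x
    v += a * dt                  # first integrator: velocity
    x += v * dt                  # second integrator: position

print(f"after about one period, x ≈ {x:.3f} (it started at 1.0)")
```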

Analog computers can have a very wide range of complexity. Slide rules and nomograms are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated. Systems for process control and protective relays used analog computation to perform control and protective functions.
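As a concrete example of the simplest analog computers mentioned above: a slide rule multiplies two numbers by adding lengths proportional to their logarithms. The snippet below mimics that principle purely as an illustration; a physical rule is of course limited by how precisely its scales can be read.

```python
import math

def slide_rule_multiply(a, b):
    """Multiply by adding logarithmic 'lengths', as a slide rule does."""
    length_a = math.log10(a)     # distance along the fixed scale
    length_b = math.log10(b)     # distance along the sliding scale
    return 10 ** (length_a + length_b)

print(slide_rule_multiply(2.0, 3.0))   # ≈ 6.0
```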

Analog computers were widely used in scientific and industrial applications even after the advent of digital computers, because at the time they were typically much faster. They started to become obsolete as early as the 1950s and 1960s, although they remained in use in some specific applications, such as aircraft flight simulators, flight computers in aircraft, and the teaching of control systems in universities. More complex applications, such as aircraft flight simulators and synthetic aperture radar, remained the domain of analog (and hybrid) computing well into the 1980s, since digital computers were insufficient for the task.
