Communication Systems: Fundamentals and Design Methods
Nevio Benvenuto, Roberto Corvaja, Tomaso Erseghe, Nicola Laurenti
In undergraduate classes on communications, it is crucial for students to acquire a deep and thorough understanding of system principles, methods of analysis, and design tradeoffs. Communication Systems: Fundamentals and Design Methods provides a rigorous mathematical treatment of modulations, covering well-established analog techniques, such as AM and FM, as well as more advanced digital formats, such as QAM and CDMA. Using a probabilistic approach, the analytical evaluation of system performance gives rise to the key concept of the 'link budget', showing the roles of transmit power, channel bandwidth and receiver noise level. Different systems are then compared on the basis of these parameters.

Key features:
- Comprehensively covers the basics of communication systems, without overemphasizing new technologies which require a much deeper background
- Presents a clearly outlined course track, derived from years of teaching experience
- Enriched by discussions and examples of implementation, and by almost 300 problems, with solutions provided on the companion website
- Includes coverage of deterministic and random signals, as well as transmission media and devices, passband signals, and linear, amplitude, angular, digital and binary modulation

The book is a perfect textbook for undergraduate students in electrical engineering, computer science and telecommunications courses, as well as for graduate students, engineers and operators involved in the design and deployment of communication networks.
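The link budget mentioned above can be sketched numerically: received power is the transmit power plus antenna gains minus propagation loss, while the noise floor follows from the channel bandwidth and the receiver noise temperature via kTB. The snippet below is a minimal illustration with assumed example values (transmit power, gains, path loss, and noise temperature are hypothetical, not taken from the book).

```python
import math

k = 1.380649e-23          # Boltzmann constant, J/K

def db(x):
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(x)

# Assumed example parameters (illustrative only)
p_tx_dbm = 30.0           # transmit power: 1 W
g_tx_db = 12.0            # transmit antenna gain
g_rx_db = 12.0            # receive antenna gain
path_loss_db = 120.0      # total propagation loss
t_sys = 290.0             # receiver noise temperature, K
bandwidth = 1e6           # channel bandwidth, Hz

# Link budget: add gains, subtract losses (all in dB)
p_rx_dbm = p_tx_dbm + g_tx_db + g_rx_db - path_loss_db

# Thermal noise power N = k * T * B, converted to dBm (+30 from dBW)
noise_dbm = db(k * t_sys * bandwidth) + 30

snr_db = p_rx_dbm - noise_dbm

print(f"received power: {p_rx_dbm:.1f} dBm")
print(f"noise power:    {noise_dbm:.1f} dBm")
print(f"SNR:            {snr_db:.1f} dB")
```

Raising the transmit power or narrowing the bandwidth improves the SNR dB-for-dB, which is exactly the tradeoff the link budget makes explicit.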
Table of Contents
1 Preliminaries on deterministic and random signals.
1.1 Time and frequency domain representation.
1.1.1 Continuous time signals.
1.1.2 Frequency domain representation for periodic signals.
1.1.3 Discrete time signals.
1.2 Energy and power.
1.2.1 Energy and energy spectral density.
1.2.2 Instantaneous and average power.
1.3 Systems and transformations.
1.3.1 Properties of a system.
1.4.1 Classification of signals and systems.
1.4.2 Uncertainty principle.
1.4.3 Practical definitions of band.
1.4.4 Heaviside conditions.
1.4.5 Sampling theorem.
1.4.6 Nyquist criterion.
1.5 Representation of passband signals.
1.5.1 Analytic signal.
1.5.2 Baseband equivalent.
1.5.3 Baseband equivalent of a transformation.
1.5.4 Hilbert transform.
1.5.5 Envelope, instantaneous phase and frequency.
1.6 Random variables and vectors.
1.6.1 Statistical description of random variables.
1.6.2 Expectation and statistical power.
1.6.3 Random vectors.
1.6.4 Second order description of random vectors and Gaussian vectors.
1.6.5 Complex-valued random variables.
1.7 Random processes.
1.7.1 Definition and properties.
1.7.2 Stationary and ergodic random processes.
1.7.3 Second order description of a WSS process.
1.7.4 Joint second order description of two random processes.
1.7.5 Second order description of a cyclostationary process.
1.8 Systems with random inputs and outputs.
1.8.1 Filtering of a WSS random process.
1.8.2 Filtering of a cyclostationary random process.
1.8.3 Representation of passband WSS random processes.
1.8.4 Sampling and interpolation of stationary random processes.
Appendix: The complementary normalized Gaussian distribution function.
2 Characterization of transmission media and devices.
2.1 Two-terminal devices.
2.1.1 Device representation.
2.1.2 Electrical power.
2.1.3 Measurement of electrical power.
2.1.4 Load matching and available power.
2.1.5 Thermal noise.
2.1.6 Other sources of noise.
2.1.7 Noise temperature.
2.1.8 Equivalent noise models.
2.2 Two-port networks.
2.2.1 Reference model.
2.2.2 Network power gain and matched network.
2.2.3 Power gain in terms of electrical parameters.
2.2.4 Noise temperature.
2.2.5 Noise figure.
2.2.6 Cascade of two-port networks.
2.2.7 Signal-to-noise ratio.
2.3 Transmission system model.
2.3.1 Electrical model.
2.3.2 Transmission system model.
2.3.3 Signal-to-noise ratio.
2.3.4 Narrowband channel model.
2.3.5 Link budget.
2.4 Transmission media.
2.4.1 Transmission lines and cables.
2.4.2 Optical fibers.
2.4.3 Radio links.
3 Analog modulation systems.
3.1 Principle and system model.
3.2 Linear modulation.
3.2.1 Double side band suppressed carrier (DSB-SC).
3.2.2 Single side band modulation (SSB).
3.2.3 Vestigial side band modulation (VSB).
3.2.4 Quadrature modulation (QM).
3.2.5 Implementation issues.
3.2.6 Performance measure and reference SNR.
3.2.7 Performance evaluation.
3.3 Amplitude modulation (AM).
3.3.2 Implementation issues.
3.3.3 Carrier recovery.
3.3.4 Performance evaluation.
3.4 Phase locked loop (PLL).
3.5 Angular modulation.
3.5.1 Phase and frequency modulations.
3.5.3 Narrowband and wideband FM.
3.5.5 Implementation issues.
3.5.6 Performance evaluation.
3.5.7 Pre-emphasis and de-emphasis in FM.
3.6 Comparison of analog modulation systems.
3.7 Frequency division multiplexing and multiple access.
3.8 Super-heterodyne receiver.
3.9 Examples of application.
3.9.1 AM radio.
3.9.2 FM radio.
3.9.3 FM stereo radio.
3.9.4 Television signal.
4 Digital modulation systems.
4.1 The space of signals.
4.1.1 Linear space.
4.1.2 Signals as elements in a linear space.
4.1.3 Gram-Schmidt orthonormalization in signal spaces.
4.1.4 Vector representation of signals.
4.1.5 Orthogonal projections onto a signal space.
4.2 Digital modulation theory.
4.2.1 Optimum detection in additive noise channels.
4.2.2 Statistical characterization of random vectors.
4.2.3 Optimum decision regions.
4.2.4 Maximum a posteriori criterion.
4.2.5 Maximum likelihood criterion.
4.2.6 Minimum distance criterion.
4.2.7 Implementation of minimum distance receivers.
4.2.8 The theorem of irrelevance.
4.3 Binary modulation.
4.3.1 Error probability.
4.3.2 Antipodal and orthogonal signals.
4.3.3 Single filter receivers.
4.4 M-ary modulation.
4.4.1 Bounds on the error probability.
4.4.2 Orthogonal and bi-orthogonal modulations.
4.5 The digital modulation system.
4.5.1 System overview.
4.5.2 Front-end receiver implementation.
4.5.3 The binary channel.
4.5.4 The inner numerical channel.
4.6 Examples of digital modulations.
4.6.1 Pulse amplitude modulation (PAM).
4.6.2 Quadrature amplitude modulation (QAM).
4.6.3 Phase shift keying (PSK).
4.6.4 Frequency shift keying (FSK).
4.6.5 Code division modulation.
4.7 Comparison of digital modulation systems.
4.7.1 Reference bandwidth and link budget.
4.7.2 Comparison in terms of performance, bandwidth and spectral efficiency.
5 Digital transmission of analog signals.
5.1 Digital representation of waveforms.
5.1.1 Analog to digital converter (ADC).
5.1.2 Digital to analog converter (DAC).
5.1.4 Uniform quantizers.
5.1.5 Quantization error.
5.1.6 SNR of a quantizer.
5.1.7 Non-uniform quantizers.
5.1.8 Companding techniques and SNR.
5.2 Digital transmission of analog signals.
5.2.1 Transmission through a binary channel.
5.2.2 Evaluation of the overall SNR.
5.2.3 Analog versus digital transmission.
5.2.4 Regenerative and analog repeaters.
5.3 Time division multiplexing (TDM).
5.4 Examples of application.
6 Transmission over dispersive channels.
6.1 Channel model.
6.2 Baseband digital transmission (PAM systems).
6.3 Passband digital transmission (QAM systems).
6.3.1 Baseband equivalent of QAM systems.
6.4 Analysis of amplitude modulated systems.
6.4.2 PSD of noise.
6.4.3 PSD of digital modulated signals.
6.5 Intersymbol interference.
6.5.1 Nyquist pulses.
6.5.2 Eye diagram.
6.6 Performance analysis.
6.6.1 Symbol error probability in the absence of ISI.
6.6.2 Symbol error probability in the presence of ISI.
6.7 Application examples.
6.7.1 Line codes.
6.7.2 Transmission formats.
7 Elements of Information Theory, Source and Channel Coding.
7.1 Information and entropy.
7.1.1 A measure for information.
7.1.3 Efficiency and redundancy.
7.1.4 Information rate of a message.
7.1.5 Typical sequences.
7.2 Source coding.
7.2.1 The purpose of source coding.
7.2.2 Entropy coding.
7.2.3 Shannon theorem on source coding.
7.2.4 Huffman coding.
7.2.5 Arithmetic coding.
7.3 Channel coding.
7.3.1 The purpose of channel coding.
7.3.2 Binary block codes.
7.3.3 Decoding criteria: minimum distance decoding.
7.3.4 Linear codes.
7.3.5 Cyclic codes.
7.3.6 Application of channel codes.
7.4 Channel capacity.
7.4.1 Information rate and capacity of a numerical channel.
7.4.2 Capacity of the additive white Gaussian noise (AWGN) channel.
7.4.3 Shannon theorem on channel coding.