Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- List of Abbreviations
- 1 Embedded network systems
- 2 Representation of signals
- 3 Signal propagation
- 4 Sensor principles
- 5 Source detection and identification
- 6 Digital communications
- 7 Multiple source estimation and multiple access communications
- 8 Networking
- 9 Network position and synchronization services
- 10 Energy management
- 11 Data management
- 12 Articulation, mobility, and infrastructure
- 13 Node architecture
- 14 Network data integrity
- 15 Experimental systems design
- 16 Ethical, legal, and social implications of ENS
- 17 Design principles for ENS
- Appendix A Gaussian Q function
- Appendix B Optimization
- Index
2 - Representation of signals
Published online by Cambridge University Press: 10 August 2009
Summary
Source detection, localization, and identification begin with the realization that the sources are by their nature probabilistic. The source, the coupling of source to medium, the medium itself, and the noise processes are each variable to some degree. Indeed, it is this very randomness that presents the fundamental barrier to rapid and accurate estimation of signals. Consequently, applied probability is essential to the study of communications and other detection and estimation problems. This chapter is concerned with the description of signals as random processes, how signals are transformed from continuous (real-world) signals into discrete representations, and the fundamental limits on the loss of information that result from such transformations. Three broad topics will be touched upon: basic probability theory, representation of stochastic processes, and information theory.
Probability
Discrete random variables
A discrete random variable (RV) X takes on values in a finite set X = {x1, x2, …, xm}. The probability that an instance x equals xi, written P(X = xi), is pi. Any probability distribution must satisfy the following axioms:
- P(xi) ≥ 0 for every xi.
- The probability of an event that is certain is 1; equivalently, the pi sum to 1.
- If xi and xj are mutually exclusive events, then P(xi or xj) = P(xi) + P(xj).
That is, probabilities are always positive, the maximum probability is 1, and probabilities are additive if one event excludes the occurrence of another. Throughout the book, a random variable is denoted by a capital letter, while any given sample of the distribution is denoted by a lower-case letter.
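The three axioms can be checked mechanically for any candidate distribution. The sketch below (an illustrative helper, not from the text; the function names `is_valid_distribution` and `prob_union` are assumptions) represents a discrete RV as a mapping from outcomes to probabilities and verifies non-negativity, normalization, and additivity over mutually exclusive outcomes:

```python
def is_valid_distribution(probs, tol=1e-9):
    """Check the probability axioms for a discrete distribution
    given as a mapping {outcome: probability}."""
    values = list(probs.values())
    nonneg = all(p >= 0 for p in values)           # axiom 1: P(xi) >= 0
    normalized = abs(sum(values) - 1.0) < tol      # axiom 2: total probability is 1
    return nonneg and normalized

def prob_union(probs, events):
    """Axiom 3: for mutually exclusive outcomes, probabilities add."""
    return sum(probs[e] for e in events)

# A discrete RV X over {x1, x2, x3}, e.g. a biased three-sided die
px = {"x1": 0.5, "x2": 0.25, "x3": 0.25}
print(is_valid_distribution(px))         # True
print(prob_union(px, ["x1", "x3"]))      # 0.75
```

Because distinct outcomes of a single RV are mutually exclusive by construction, summing their individual probabilities is always valid here; for overlapping events the additivity axiom would not apply directly.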
- Type: Chapter
- Information: Principles of Embedded Networked Systems Design, pp. 12-35
- Publisher: Cambridge University Press
- Print publication year: 2005