Constrained Signals: A General Theory of Information Content and Detection
Mark M. Stecker*
Identifiers and Pagination: Year: 2011
First Page: 1
Last Page: 18
Publisher Id: TOSIGPJ-4-1
Article History: Received Date: 14/10/2010
Revision Received Date: 03/01/2011
Acceptance Date: 22/01/2011
Electronic Publication Date: 19/04/2011
Collection year: 2011
open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: https://creativecommons.org/licenses/by/4.0/legalcode. This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
In this paper, a general theory of signals characterized by probabilistic constraints is developed. As in previous work, the theoretical development employs Lagrange multipliers to implement the constraints and the maximum entropy principle to generate the most likely probability distribution function consistent with the constraints. The method of computing the probability distribution functions is similar to that used in computing partition functions in statistical mechanics. Simple cases in which exact analytic solutions for the maximum entropy distribution functions and the entropy exist are studied, and their implications are discussed. The application of this technique to the problem of signal detection is explored both theoretically and with simulations. It is demonstrated that the method can readily classify signals governed by different constraint distributions as long as the mean values of the constraints under the two distributions differ. Classifying signals governed by constraint distributions that differ in shape but not in mean value is much more difficult. Some solutions to this problem and extensions of the method are discussed.
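To make the partition-function analogy concrete, the sketch below illustrates the general idea in the simplest setting: a maximum entropy distribution over a discrete set of values subject to a single mean constraint, where the Lagrange multiplier plays the role of an inverse temperature and the normalizing sum plays the role of the partition function. This is an illustrative example only; the function name, the bisection root-finding, and the choice of a single linear constraint are assumptions of this sketch, not the paper's implementation.

```python
import math

def max_entropy_dist(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution over `values` subject to E[X] = target_mean.

    The solution has the Gibbs form p_i = exp(-lam * x_i) / Z, where Z is the
    partition function (normalizing sum) and lam is the Lagrange multiplier
    enforcing the mean constraint, found here by bisection.
    """
    def mean_at(lam):
        # Boltzmann-like weights and the resulting constrained mean.
        w = [math.exp(-lam * x) for x in values]
        Z = sum(w)  # partition function
        return sum(x * wi for x, wi in zip(values, w)) / Z

    # mean_at(lam) decreases monotonically in lam, so bisect on a wide bracket.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid  # mean too large: increase lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)

    w = [math.exp(-lam * x) for x in values]
    Z = sum(w)
    p = [wi / Z for wi in w]
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return p, lam, entropy
```

As a sanity check, constraining the mean of the values {0, 1, 2, 3} to 1.5 (the unconstrained average) recovers the uniform distribution with multiplier zero and entropy ln 4, exactly as a symmetric statistical-mechanics argument predicts.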