TY - GEN
T1 - Information efficiency
AU - Ratsaby, Joel
PY - 2007
Y1 - 2007
N2 - Shannon's theory of information stands on a probabilistic representation of events that convey information, e.g., sending messages over a communication channel. Kolmogorov argues that information is a more fundamental concept, which exists also in problems with no underlying stochastic model, for instance, the information contained in an algorithm or in the genome. In a classic paper he defines the discrete entropy of a finite set, which establishes a combinatorially based definition of the information I(x : y) conveyed by a variable x (taking a binary string value x) about the unknown value of a variable y. The current paper extends Kolmogorov's definition of information to a more general setting where, given 'x = x', there may still be uncertainty about the set of possible values of y. It then establishes a combinatorially based description complexity of x and introduces a novel concept termed information width, similar to n-widths in approximation theory. This forms the basis of new measures of cost and efficiency of information, which give rise to a new framework whereby information of any input source, e.g., sample-based, general side-information, or a hybrid of both, is represented and computed according to a single common formula. As an application, we consider the space of Boolean functions, where input strings x correspond to descriptions of properties of classes of Boolean functions.
UR - http://www.scopus.com/inward/record.url?scp=38149132305&partnerID=8YFLogxK
U2 - 10.1007/978-3-540-69507-3_41
DO - 10.1007/978-3-540-69507-3_41
M3 - Conference contribution
AN - SCOPUS:38149132305
SN - 9783540695066
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 475
EP - 487
BT - SOFSEM 2007
T2 - 33rd Conference on Current Trends in Theory and Practice of Computer Science, SOFSEM 2007
Y2 - 20 January 2007 through 26 January 2007
ER -