TY - GEN

T1 - Generalization of the PAC-model for learning with partial information: (Extended abstract)

AU - Ratsaby, Joel

AU - Maiorov, Vitaly

N1 - Publisher Copyright:
© Springer-Verlag Berlin Heidelberg 1997.

PY - 1997

Y1 - 1997

N2 - The PAC model of learning and its extension to real-valued function classes provides a well-accepted theoretical framework for representing the problem of machine learning using randomly drawn examples. Quite often in practice some form of a priori partial information about the target is available in addition to randomly drawn examples. In this paper we extend the PAC model to a scenario of learning with partial information in addition to randomly drawn examples. According to this model, partial information effectively reduces the complexity of the hypothesis class used to learn the target, thereby reducing the sample complexity of the learning problem. This leads to a clear quantitative tradeoff between the amount of partial information and the sample complexity of the problem. The underlying framework is based on a combination of information-based complexity theory (cf. Traub et al. [18]) and Vapnik-Chervonenkis theory. A new quantity I_{n,d}(F), which plays an important role in determining the worth of partial information, is introduced. It measures the minimal approximation error of a target in a class F by the family of all function classes of pseudo-dimension d under a given partial information which consists of any n measurements which may be expressed as linear operators. As an application, we consider the problem of learning a Sobolev target class. The tradeoff between the amount of partial information and the sample complexity is calculated, and by obtaining fairly tight upper and lower bounds on I_{n,d} we identify an almost optimal way of providing partial information.

AB - The PAC model of learning and its extension to real-valued function classes provides a well-accepted theoretical framework for representing the problem of machine learning using randomly drawn examples. Quite often in practice some form of a priori partial information about the target is available in addition to randomly drawn examples. In this paper we extend the PAC model to a scenario of learning with partial information in addition to randomly drawn examples. According to this model, partial information effectively reduces the complexity of the hypothesis class used to learn the target, thereby reducing the sample complexity of the learning problem. This leads to a clear quantitative tradeoff between the amount of partial information and the sample complexity of the problem. The underlying framework is based on a combination of information-based complexity theory (cf. Traub et al. [18]) and Vapnik-Chervonenkis theory. A new quantity I_{n,d}(F), which plays an important role in determining the worth of partial information, is introduced. It measures the minimal approximation error of a target in a class F by the family of all function classes of pseudo-dimension d under a given partial information which consists of any n measurements which may be expressed as linear operators. As an application, we consider the problem of learning a Sobolev target class. The tradeoff between the amount of partial information and the sample complexity is calculated, and by obtaining fairly tight upper and lower bounds on I_{n,d} we identify an almost optimal way of providing partial information.

UR - http://www.scopus.com/inward/record.url?scp=84949267422&partnerID=8YFLogxK

U2 - 10.1007/3-540-62685-9_6

DO - 10.1007/3-540-62685-9_6

M3 - Conference contribution

AN - SCOPUS:84949267422

SN - 3540626859

SN - 9783540626855

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 51

EP - 65

BT - Computational Learning Theory - 3rd European Conference, EuroCOLT 1997, Proceedings

A2 - Ben-David, Shai

T2 - 3rd European Conference on Computational Learning Theory, EuroCOLT 1997

Y2 - 17 March 1997 through 19 March 1997

ER -