TY - GEN
T1 - Locally decodable codes for edit distance
AU - Ostrovsky, Rafail
AU - Paskin-Cherniavsky, Anat
N1 - Publisher Copyright:
© Springer International Publishing Switzerland 2015.
PY - 2015
Y1 - 2015
AB - Locally decodable codes (LDCs) [1,9] are error-correcting codes that allow decoding of any individual symbol of the message by reading only a few symbols of the codeword. LDCs, originally considered in the setting of PCPs [1], have found additional applications in theoretical computer science, such as private information retrieval (PIR) in cryptography, generating a lot of fascinating work (see [12] and references therein). In one straightforward practical application to storage, such codes provide enormous efficiency gains over standard error-correcting codes (ECCs), which need to read the entire codeword to learn even a single bit of the encoded message. Typically, LDCs, as well as standard ECCs, are designed to decode the encoded message when up to some bounded fraction of the symbols has been modified. This corresponds to decoding strings within bounded Hamming distance of a valid codeword. A stronger natural metric is the edit distance, measuring the length of the shortest sequence of insertions and deletions (indels) of symbols leading from one word to another. Standard ECCs for edit distance have been considered previously [11]; furthermore, [11] devised codes with rate and distance (error tolerance) optimal up to constants, together with efficient encoding and decoding procedures. However, combining these two useful settings, local decodability and robustness against indel errors, has never been considered.
UR - http://www.scopus.com/inward/record.url?scp=84958527870&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-17470-9_14
DO - 10.1007/978-3-319-17470-9_14
M3 - Conference contribution
AN - SCOPUS:84958527870
SN - 9783319174693
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 236
EP - 249
BT - Information Theoretic Security - 8th International Conference, ICITS 2015, Proceedings
A2 - Wolf, Stefan
A2 - Lehmann, Anja
T2 - 8th International Conference on Information Theoretic Security, ICITS 2015
Y2 - 2 May 2015 through 5 May 2015
ER -