TY - GEN

T1 - Compressed transitive delta encoding

AU - Shapira, Dana

PY - 2009

Y1 - 2009

N2 - Given a source file S and two differencing files Δ(S, T) and Δ(T, R), where Δ(X, Y) denotes the delta file of the target file Y with respect to the source file X, the objective is to construct R. This is intended for the scenario of upgrading software where intermediate releases are missing, or for the case of file system backups, where non-consecutive versions must be recovered. The traditional way is to decompress Δ(S, T) in order to construct T and then apply Δ(T, R) on T to obtain R. The Compressed Transitive Delta Encoding (CTDE) paradigm, introduced in this paper, is to construct a delta file Δ(S, R) working directly on the two given delta files, Δ(S, T) and Δ(T, R), without any decompression or the use of the base file S. A new algorithm for solving CTDE is proposed and its compression performance is compared against the traditional "double delta decompression". Not only does it use constant additional space, as opposed to the traditional method, which uses linear additional memory storage, but experiments show that the size of the delta files involved is reduced by 15% on average.

UR - http://www.scopus.com/inward/record.url?scp=67650659772&partnerID=8YFLogxK

U2 - 10.1109/DCC.2009.46

DO - 10.1109/DCC.2009.46

M3 - Conference contribution

AN - SCOPUS:67650659772

SN - 9780769535920

T3 - Proceedings - 2009 Data Compression Conference, DCC 2009

SP - 203

EP - 212

BT - Proceedings - 2009 Data Compression Conference, DCC 2009

T2 - 2009 Data Compression Conference, DCC 2009

Y2 - 16 March 2009 through 18 March 2009

ER -