DocumentCode :
1939245
Title :
Statistical models for term compression
Author :
Cheney, James
Author_Institution :
Cornell Univ., Ithaca, NY, USA
fYear :
2000
fDate :
2000
Firstpage :
550
Abstract :
Summary form only given. Computing systems frequently deal with symbolic tree data structures, which are also known as terms in universal algebra and logic. Our goal is to develop universal, effective, and efficient term compression techniques superior to the specialized or universal compression techniques currently available. Our approach is to use knowledge of term structure to build accurate universal statistical models of terms. These models can compress terms faster or more effectively than comparable sequential methods. We present two statistical term models that are related to Markov random fields over trees. These models gather statistical information about parent-child symbol relationships in terms. Huffman or arithmetic codes generated from these probability estimates are used to encode the terms. In the first model, a symbol's value is predicted by the value of its parent symbol alone. Thus, in compressing a subterm of the form t+u, the + operator would be used to select a specialized code for the root symbols of subterms t and u. The second model also uses the symbol's argument position as a predictor. For example, in compressing t+u, we would make different probability estimates for the first and second arguments, and use one code to encode the first children of + and another code for the second children. It might be the case that t+1 (but not 1+t) occurs frequently for many different terms t, in which case we could give 1 a shorter code as the second child of +. We have not achieved our goal of improved term compression, but we believe that more sophisticated and more effective techniques remain to be investigated. For example, one improvement would be to make a term version of PPM in which contexts are ancestor symbols in the term rather than predecessors in a sequence. Our implementation could be viewed as a first step in that direction.
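The second context model described in the abstract (predicting each symbol from its parent symbol and its argument position) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the nested-tuple term representation, the function names, and the unsmoothed maximum-likelihood probability estimates are all assumptions here; a real compressor would feed these estimates to a Huffman or arithmetic coder and would need smoothing for unseen symbols.

```python
from collections import Counter, defaultdict
import math

# A term is a tuple: (symbol, child_term, child_term, ...).
# For example, ("+", ("t",), ("1",)) represents the term t+1.

def contexts(term, parent=None, pos=0):
    """Yield ((parent symbol, argument position), symbol) for every node.

    The (parent, position) pair is the context of the paper's second
    model; dropping the position component would give the first,
    parent-only model."""
    symbol = term[0]
    children = term[1:]
    yield (parent, pos), symbol
    for i, child in enumerate(children):
        yield from contexts(child, symbol, i)

def model(terms):
    """Count symbol frequencies per (parent, position) context."""
    counts = defaultdict(Counter)
    for t in terms:
        for ctx, sym in contexts(t):
            counts[ctx][sym] += 1
    return counts

def code_length_bits(term, counts):
    """Ideal arithmetic-coding length of a term under the model:
    the sum of -log2 p(symbol | context) over the term's nodes.
    (No smoothing: symbols unseen in a context would raise an error.)"""
    bits = 0.0
    for ctx, sym in contexts(term):
        total = sum(counts[ctx].values())
        bits += -math.log2(counts[ctx][sym] / total)
    return bits
```

For instance, if a corpus contains ("+", ("t",), ("1",)), ("+", ("u",), ("1",)), and ("+", ("1",), ("v",)), then "1" occurs twice as the second child of + but only once as the first child, so the position-aware model assigns "1" a shorter code in second-argument position, exactly the t+1 versus 1+t effect the abstract describes.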
Keywords :
Huffman codes; Markov processes; arithmetic codes; data compression; estimation theory; mathematical operators; prediction theory; probability; statistical analysis; tree data structures; Huffman codes; Markov random fields; arithmetic codes; operator; parent-child symbol relationships; probability estimates; statistical models; symbol argument position; symbol prediction; symbolic tree data structures; term compression; trees; Algebra; Arithmetic; HTML; Logic functions; Markup languages; Predictive models; Probability; SGML; Tree data structures;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Data Compression Conference, 2000. Proceedings. DCC 2000
Conference_Location :
Snowbird, UT
ISSN :
1068-0314
Print_ISBN :
0-7695-0592-9
Type :
conf
DOI :
10.1109/DCC.2000.838197
Filename :
838197