Abstract:
We obtain asymptotically tight bounds on the maximum amount of information that a single bit of memory can retain about the entire past. At each of n successive epochs, a single fair bit is generated and a one-bit memory is updated according to a family of memory update rules (possibly probabilistic and time-dependent) depending only on the value of the new input bit and on the current state of the memory. The problem is to estimate the supremum over all possible update rules of the minimum mutual information between the state of the memory at time n + 1 and each of the previous n input bits. We show that this supremum is asymptotically equal to 1/(2n² ln 2) bit, as conjectured by Venkatesh and Franklin (1991). We use this result to derive asymptotically sharp estimates of related maximin correlations between the memory and the input bits, thus resolving two more questions left open by Venkatesh and Franklin and by Komlós et al. (1993). Finally, we generalize the results to the case of an m-bit memory, again obtaining asymptotically tight bounds in many cases.
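To make the setup concrete, below is a minimal Python sketch (not taken from the paper) that evaluates one natural candidate rule: at epoch t the memory is overwritten by the new fair bit with probability 1/t, in the style of reservoir sampling. The function names and the choice of rule are illustrative assumptions; the sketch only shows that this simple rule already attains the 1/(2n² ln 2) rate, not that it is the construction analyzed by the authors.

```python
import math


def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def min_mutual_information(n: int) -> float:
    """
    Minimum over i of I(M_{n+1}; X_i) for the illustrative rule:
    at epoch t the memory is overwritten by the new fair bit X_t
    with probability 1/t (so the first bit is always written).

    Under this rule, the probability that X_i is written and never
    overwritten afterwards is r_i = (1/i) * prod_{j>i} (1 - 1/j) = 1/n
    for every i, hence P(M = X_i) = 1/2 + r_i/2 and
    I(M; X_i) = 1 - H(1/2 + r_i/2).
    """
    mis = []
    for i in range(1, n + 1):
        r_i = (1.0 / i) * math.prod(1.0 - 1.0 / j for j in range(i + 1, n + 1))
        mis.append(1.0 - binary_entropy(0.5 + r_i / 2.0))
    return min(mis)


for n in (10, 100, 1000):
    achieved = min_mutual_information(n)
    bound = 1.0 / (2.0 * n * n * math.log(2))
    print(f"n={n:5d}  min_i I(M;X_i)={achieved:.3e}  1/(2 n^2 ln 2)={bound:.3e}")
```

For small r, 1 − H(1/2 + r/2) ≈ r²/(2 ln 2), so with r_i = 1/n the achieved value approaches 1/(2n² ln 2), matching the asymptotic value stated in the abstract.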
Keywords:
correlation methods; digital storage; probability; rate distortion theory; asymptotically tight bounds; correlations; gambling; input bits; memory update rules; mnemonically impaired; one-bit memory; probabilistic update rule; rate distortion; single fair bit; time-dependent update rule; Binary sequences; Boolean functions; Entropy; Information theory; Mutual information; National security; Rate-distortion; State estimation