This paper concerns the relationship between the risk incurred using a nearest neighbor rule and the size of the data base. The theoretical results show that the proximity of the nearest neighbor to a new sample in a collection of $n$ samples becomes (in probability) arbitrarily small as $n$ is increased; that the convergence is often (but not always) with probability 1; that as a consequence of these convergences, the risk associated with a decision may be closely controlled; and that these facts and their demonstrations help determine the size of the sample of data to be used as a nearest neighbor decision-making base. An example demonstrates that the size of the data base required to meet performance criteria other than the relatively lax expected-risk criterion can be unreasonably large.
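To make the first convergence claim concrete, here is a minimal sketch of the standard argument, under assumptions not stated above: the $n$ samples $X_1, \dots, X_n$ are drawn independently from a common probability measure $\mu$, the new sample $x$ lies in the support of $\mu$, and $B(x, \varepsilon)$ denotes the ball of radius $\varepsilon$ about $x$. Then, by independence,

\[
P\Bigl(\min_{1 \le i \le n} \lVert X_i - x \rVert > \varepsilon\Bigr)
= \bigl(1 - \mu\bigl(B(x, \varepsilon)\bigr)\bigr)^{n} \longrightarrow 0
\quad \text{as } n \to \infty,
\]

since $x$ lying in the support of $\mu$ guarantees $\mu(B(x, \varepsilon)) > 0$ for every $\varepsilon > 0$.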