We study distances between probability distributions via two different metrics: a new metric induced by the Jensen–Shannon divergence, and the well-known statistical distance metric. We show that several important results and constructions in computational complexity under the statistical distance metric carry over to the new metric, such as Yao's next-bit predictor, the existence of extractors, the leftover hash lemma, and the construction of expander-graph-based extractors. Finally, we show that the parity lemma, which is useful in the study of pseudorandomness, does not hold under the new metric.