The minimum distance growth rate of unmerged codewords in a convolutional code is shown to depend upon the minimum average weight per branch $w_0$ in the encoder state diagram. An upper bound on $w_0$ is obtained for a large class of rate $1/2$ codes which includes many of the best known classes of rate $1/2$ codes. The bound is shown to be tight for short constraint length codes. A class of codes is defined to be asymptotically catastrophic if $w_0$ approaches zero for large constraint lengths. Several classes of rate $1/2$ codes are shown to be asymptotically catastrophic. These include classes containing codes known to have large free distance. It is argued that free distance alone is not a sufficient criterion for determining a code's performance with either Viterbi or sequential decoding. A code with a low distance growth rate will yield a high bit error probability and will not perform well with truncated Viterbi decoding.
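To make the central quantity concrete, the following minimal sketch computes the minimum average weight per branch for a small rate $1/2$ feedforward encoder by enumerating the simple cycles of its state diagram with the all-zero self-loop removed. The choice of the memory-2 code with generators $(7,5)_8$ is an illustrative assumption, not an example taken from this work, and the function names are ad hoc.

```python
from fractions import Fraction
from itertools import product

MEMORY = 2                       # encoder memory (constraint length 3)
GENS = (0b111, 0b101)            # assumed example: the classic rate-1/2 (7,5) code

def step(state, bit):
    """One transition of the feedforward encoder state diagram.

    state holds the last MEMORY input bits; returns (next_state, branch_weight),
    where branch_weight is the Hamming weight of the two output bits.
    """
    reg = (bit << MEMORY) | state
    weight = sum(bin(reg & g).count("1") % 2 for g in GENS)
    return reg >> 1, weight

def min_average_weight_per_branch():
    """Minimum, over all cycles of the state diagram (excluding the all-zero
    self-loop), of the average output weight per branch.

    The graph has only 2**MEMORY states, so enumerating simple cycles by DFS
    suffices: the minimum cycle mean is always attained on a simple cycle.
    """
    n = 1 << MEMORY
    edges = {}                                   # state -> list of (next_state, weight)
    for s, b in product(range(n), (0, 1)):
        if s == 0 and b == 0:
            continue                             # drop the zero-weight self-loop at state 0
        edges.setdefault(s, []).append(step(s, b))

    best = None

    def dfs(start, v, path_weight, visited):
        nonlocal best
        for nxt, w in edges[v]:
            if nxt == start:                     # closed a cycle back to the start state
                mean = Fraction(path_weight + w, len(visited))
                best = mean if best is None else min(best, mean)
            elif nxt > start and nxt not in visited:
                dfs(start, nxt, path_weight + w, visited | {nxt})

    for start in range(n):
        dfs(start, start, 0, {start})
    return best

print(min_average_weight_per_branch())           # value of w0 for this encoder
```

For this particular encoder the minimum is attained on the two-branch cycle between states 1 and 2, which has total output weight 1, giving $w_0 = 1/2$; a class of codes is asymptotically catastrophic when this value tends to zero as the constraint length grows.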