In an earlier work, the authors introduced a divergence measure, called the first-order Jensen difference, or in short the $\mathcal{J}_\alpha$-divergence, which is based on entropy functions of degree $\alpha$. This provided a generalization of the measure of mutual information based on Shannon's entropy (corresponding to $\alpha = 1$). It was shown that the first-order $\mathcal{J}_\alpha$-divergence is a convex function only when $\alpha$ is restricted to some range. We define higher order Jensen differences and show that they are convex functions only when the underlying entropy function is of degree two. A statistical application requiring the convexity of higher order Jensen differences is indicated.