This paper proposes new parallel versions of some estimation of distribution algorithms (EDAs). The focus is on maintaining the behavior of sequential EDAs that use probabilistic graphical models (Bayesian networks and Gaussian networks), implementing a master–slave workload distribution for the most computationally intensive phases: learning the probability distribution and, in one algorithm, sampling and evaluation of individuals. In discrete domains, we explain the parallelization of the EBNA_BIC and EBNA_PC algorithms, while in continuous domains the selected algorithms are EGNA_BIC and EGNA_EE.
Implementation has been done using two APIs: the message passing interface (MPI) and POSIX threads. The parallel programs can run efficiently on a range of target parallel computers. Experiments evaluating the programs in terms of speedup and efficiency have been carried out on a cluster of multiprocessors. Compared with the sequential versions, the parallel programs show reasonable gains in terms of speed.
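To make the master–slave idea concrete, the sketch below distributes the evaluation of a population across MPI processes: the master scatters equal chunks of individuals, every process evaluates its share, and the fitness values are gathered back for the subsequent selection and learning steps. This is a minimal illustration under stated assumptions, not the paper's implementation; the names evaluate_individual and NUM_INDIVIDUALS, the encoding of individuals as doubles, and the even chunking are all hypothetical. The same pattern applies to distributing the score computations of the structure-learning phase.

```c
/* Minimal master-slave sketch with MPI. All identifiers here
 * (evaluate_individual, NUM_INDIVIDUALS, ...) are illustrative
 * assumptions, not taken from the paper's code. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define NUM_INDIVIDUALS 64   /* hypothetical population size */

/* Placeholder fitness function standing in for the costly
 * evaluation (or local score computation) being distributed. */
static double evaluate_individual(double x) {
    return x * x;
}

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Pad the workload so it divides evenly among processes. */
    int chunk = (NUM_INDIVIDUALS + size - 1) / size;
    int total = chunk * size;

    double *population = NULL;
    if (rank == 0) {                      /* master prepares the work */
        population = malloc(total * sizeof(double));
        for (int i = 0; i < total; i++)
            population[i] = (double)i;
    }

    double *local     = malloc(chunk * sizeof(double));
    double *local_fit = malloc(chunk * sizeof(double));
    double *fitness   = (rank == 0) ? malloc(total * sizeof(double)) : NULL;

    /* Master distributes equal-sized chunks; each process evaluates its share. */
    MPI_Scatter(population, chunk, MPI_DOUBLE,
                local, chunk, MPI_DOUBLE, 0, MPI_COMM_WORLD);
    for (int i = 0; i < chunk; i++)
        local_fit[i] = evaluate_individual(local[i]);

    /* Results are gathered back on the master for selection/learning. */
    MPI_Gather(local_fit, chunk, MPI_DOUBLE,
               fitness, chunk, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("fitness[0] = %f\n", fitness[0]);

    free(local); free(local_fit);
    if (rank == 0) { free(population); free(fitness); }
    MPI_Finalize();
    return 0;
}
```

A POSIX-threads version of the same scheme would replace the scatter/gather calls with a shared population array and a per-thread index range, which is the usual choice when all workers reside on one multiprocessor node.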