DocumentCode
3590784
Title
Graduated assignment graph matching
Author
Gold, Steven ; Rangarajan, Anand
Author_Institution
Dept. of Comput. Sci., Yale Univ., New Haven, CT, USA
Volume
3
fYear
1996
Firstpage
1474
Abstract
A new technique, termed softassign, is applied to weighted graph matching. Softassign, which has emerged from the recurrent neural network/statistical physics framework, enforces two-way (assignment) constraints without the use of penalty terms in the energy function. The softassign technique is compared to softmax (Potts glass) dynamics. Within the statistical physics framework, softmax with a penalty term has been a widely used method for enforcing the two-way constraints common to many combinatorial optimization problems. The benchmarks present evidence that softassign has clear advantages in accuracy, speed, and algorithmic simplicity over softmax with a penalty term on this weighted graph matching problem.
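As a rough illustration of the softassign idea summarized above (not the paper's exact algorithm, which also uses slack variables to handle unequal graph sizes and outliers), the following minimal Python sketch alternates an exponentiated gradient step with Sinkhorn row/column balancing inside a deterministic-annealing loop. The squared-difference edge compatibility, all parameter values, and the function name softassign_graph_match are illustrative assumptions, not taken from the paper.

import numpy as np

def softassign_graph_match(G, g, beta0=0.5, beta_rate=1.075, beta_max=10.0,
                           relax_iters=4, sinkhorn_iters=30):
    """Sketch of graduated-assignment weighted graph matching via softassign.

    G : (A, A) edge-weight matrix of the first graph
    g : (I, I) edge-weight matrix of the second graph (same size here)
    Returns a doubly stochastic match matrix M of shape (A, I).
    """
    A, I = G.shape[0], g.shape[0]
    # Edge compatibility C[a, i, b, j]: how well matching a->i and b->j
    # preserves edge weights (squared difference is an illustrative choice).
    C = -(G[:, None, :, None] - g[None, :, None, :]) ** 2
    M = np.full((A, I), 1.0 / max(A, I))          # near-uniform initial match
    beta = beta0
    while beta < beta_max:                         # deterministic annealing loop
        for _ in range(relax_iters):
            # Gradient of the matching objective: Q[a, i] = sum_{b,j} C[a,i,b,j] * M[b,j]
            Q = np.einsum('aibj,bj->ai', C, M)
            # Softassign step: exponentiation keeps entries positive
            # (subtracting the max is only for numerical stability).
            M = np.exp(beta * (Q - Q.max()))
            # Sinkhorn balancing enforces the two-way constraints without penalty terms.
            for _ in range(sinkhorn_iters):
                M = M / M.sum(axis=1, keepdims=True)   # row sums -> 1
                M = M / M.sum(axis=0, keepdims=True)   # column sums -> 1
        beta *= beta_rate
    return M

# Example: match a small weighted graph to a permuted copy of itself.
rng = np.random.default_rng(0)
G = rng.random((5, 5)); G = (G + G.T) / 2
perm = rng.permutation(5)
g = G[np.ix_(perm, perm)]
M = softassign_graph_match(G, g)
print(M.argmax(axis=1))   # each row peaks at its matched node (the inverse of `perm`)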
Keywords
graph theory; optimisation; accuracy; algorithmic simplicity; combinatorial optimization problems; graduated assignment graph matching; recurrent neural network; softassign; softmax dynamics; speed; statistical physics; two-way constraints; weighted graph matching; Annealing; Computer science; Constraint optimization; Glass; Physics; Recurrent neural networks; Symmetric matrices; Traveling salesman problems
fLanguage
English
Publisher
IEEE
Conference_Titel
IEEE International Conference on Neural Networks, 1996
Print_ISBN
0-7803-3210-5
Type
conf
DOI
10.1109/ICNN.1996.549117
Filename
549117