Title :
A Tutorial on the Method of Moments [Testing Ourselves]
Author :
Arvas, Ercument ; Sevgi, Levent
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Syracuse Univ., Syracuse, NY, USA
fDate :
June 2012
Abstract :
The Method of Moments (MoM) is a numerical technique for approximately solving linear operator equations, such as differential or integral equations. The unknown function is approximated by a finite series of known expansion functions with unknown expansion coefficients. This approximation is substituted into the original operator equation, and the resulting equation is tested so that the weighted residual is zero. This yields a set of simultaneous algebraic equations for the unknown coefficients, which are then solved by matrix methods. The MoM has been used to solve a vast number of electromagnetic problems during the last five decades. In addition to the basic theory of the MoM, some simple examples are given. To illustrate the concept of minimizing a weighted error, the Fourier series is also reviewed.
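The procedure the abstract outlines (expand the unknown in known basis functions, test the residual against weighting functions, solve the resulting matrix equation) can be sketched in a few lines of Python. This is a minimal illustrative example, not code from the article: it applies a Galerkin MoM (testing functions equal to expansion functions) to the classic one-dimensional boundary-value problem −f″(x) = 1 + 4x² on [0, 1] with f(0) = f(1) = 0, using the polynomial basis uₙ(x) = x − xⁿ⁺¹; the matrix and excitation integrals are evaluated in closed form.

```python
import numpy as np

# Galerkin MoM sketch for L f = -f'' = 1 + 4x^2 on [0,1], f(0) = f(1) = 0.
# Expansion (= testing) functions u_n(x) = x - x^(n+1) satisfy the
# boundary conditions, so the approximation does too.

N = 3  # number of expansion functions

# Matrix elements l_mn = integral_0^1 u_m(x) * (-u_n''(x)) dx,
# which evaluate in closed form to m*n / (m + n + 1).
L = np.array([[m * n / (m + n + 1) for n in range(1, N + 1)]
              for m in range(1, N + 1)])

# Excitation g_m = integral_0^1 u_m(x) * (1 + 4x^2) dx
#               = 3/2 - 1/(m+2) - 4/(m+4).
g = np.array([1.5 - 1 / (m + 2) - 4 / (m + 4) for m in range(1, N + 1)])

# Solving the simultaneous algebraic equations gives the
# unknown expansion coefficients a_n.
a = np.linalg.solve(L, g)

def f_approx(x):
    """Approximate solution: sum of a_n * u_n(x)."""
    return sum(a[n - 1] * (x - x ** (n + 1)) for n in range(1, N + 1))
```

For this problem the exact solution f(x) = 5x/6 − x²/2 − x⁴/3 happens to lie in the span of the first three basis functions, so with N = 3 the MoM result reproduces it exactly; with fewer terms the weighted residual is zero but the solution is only approximate, which mirrors the convergence behavior the tutorial discusses.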
Keywords :
Fourier series; electromagnetic field theory; matrix algebra; method of moments; MoM; differential equations; integral equations; linear operator equations; electromagnetic problems; expansion functions; basis functions; test functions; unknown expansion coefficients; simultaneous algebraic equations; matrix calculus; weighted error minimization; numerical technique; numerical analysis; numerical electromagnetics; numerical electromagnetics code; NEC; electromagnetic measurements; moment methods; tutorials;
Journal_Title :
IEEE Antennas and Propagation Magazine
DOI :
10.1109/MAP.2012.6294003