
# The Multilayer Perceptron Network


The multilayer perceptron network (Fig. 9) generally consists of an input layer of nodes, one or more hidden layers of nodes and an output layer of nodes. Nodes within the same layer are not connected. However, each layer of nodes is fully interconnected to the nodes in the next layer. All units within a layer process data in parallel but the outputs of different layers are calculated sequentially. The output $z_j$ of node $j$ is:

$$z_j = F\left(\sum_{i=1}^{N} w_{ij}\,x_i + c_j\right)$$

where the $x_i$ are the $N$ input signals to the node, $w_{ij}$ is the synaptic weight connecting input $i$ to node $j$, $c_j$ is a bias term and $F$ is the nonlinear activation function.

Figure 9. Architecture of the multilayer perceptron neural network: input, synaptic weights, neurons, output. (Signal and Image Processing for Electromagnetic Testing)
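The node computation above can be sketched in Python (a minimal illustration, not the handbook's implementation; the function names, the sigmoid choice of $F$ and the array shapes are assumptions):

```python
import numpy as np

def sigmoid(v):
    # Nonlinear activation F, limiting each node's output to (0, 1)
    return 1.0 / (1.0 + np.exp(-v))

def mlp_forward(x, weights, biases):
    """Propagate an input vector through the layers sequentially.

    weights[l] has shape (nodes_out, nodes_in) and biases[l] has
    shape (nodes_out,) for layer l.  All nodes within a layer are
    evaluated in parallel as one matrix-vector product; the layers
    themselves are evaluated in order, input layer first.
    """
    z = np.asarray(x, dtype=float)
    for W, c in zip(weights, biases):
        # z_j = F(sum_i w_ij * x_i + c_j)
        z = sigmoid(W @ z + c)
    return z
```

For example, a 2-3-1 network would pass `weights = [W1, W2]` with `W1` of shape `(3, 2)` and `W2` of shape `(1, 3)`.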


Layer outputs are calculated starting from the input layer and moving toward the output layer. Each node generates an output that is a nonlinear function of the weighted sum of all its input signals. This nonlinear function is primarily used to limit the output of a node to values between 0 and 1. The network is trained using the backward error propagation algorithm,[18] in which training patterns are sequentially applied to the network. The overall algorithm, summarized in Fig. 10, uses a gradient search technique to minimize the squared error between the actual output and the desired output by adapting the interconnection weights iteratively. The algorithm cycles through the training data repeatedly until the error drops below a specified threshold value. Neural networks have been used with success for the classification of eddy current and ultrasonic signals.[19]

Figure 10. Flow chart of the backpropagation training algorithm for multilayer perceptron networks:

1. Initialize the weights $w$ and the activation function $f(x) = \dfrac{1}{1 + e^{-x}}$.
2. Present an input $x$ and its desired output $d$, and calculate the network output $y$.
3. Compute the error $\varepsilon = \frac{1}{2}\sum_j (d_j - y_j)^2$.
4. If $\varepsilon < \tau$, training ends. Otherwise compute $\delta_j = y_j(1 - y_j)(d_j - y_j)$ for output nodes and $\delta_j = x_j(1 - x_j)\sum_k \delta_k\,w_{j,k}$ for hidden nodes, where $k$ runs over the nodes in the layer above node $j$.
5. Update the weights: $w_{i,j}(t+1) = w_{i,j}(t) + \eta\,\delta_j\,x_i + \alpha\left[w_{i,j}(t) - w_{i,j}(t-1)\right]$.
6. At the end of the training data, return to step 2 and cycle through the data again.

Legend: $d$ = desired output at node $j$; $t$ = time; $w$ = weighting factor; $x$ = input signal; $y$ = network output at node $j$; $\alpha$ = momentum parameter; $\delta$ = variable defined by equation; $\eta$ = learning parameter; $\tau$ = preset threshold value for error.
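The Fig. 10 flow might be rendered as the following training loop. This is a sketch under assumptions the handbook does not prescribe: a single hidden layer, biases handled as an extra constant input, and invented values for $\eta$, $\alpha$, $\tau$ and the AND demonstration data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(v):
    # f(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-v))

def train_mlp(patterns, targets, n_hidden=4, eta=0.5, alpha=0.5,
              tau=0.05, max_epochs=5000):
    """Backward error propagation for one hidden layer (Fig. 10 flow).

    eta   = learning parameter
    alpha = momentum parameter
    tau   = preset threshold value for the error
    Training patterns are applied sequentially, and the data are
    cycled repeatedly until the error drops below tau.
    """
    n_in, n_out = patterns.shape[1], targets.shape[1]
    # Initialize weights; biases are folded in as an extra constant input
    W1 = rng.uniform(-0.5, 0.5, (n_hidden, n_in + 1))
    W2 = rng.uniform(-0.5, 0.5, (n_out, n_hidden + 1))
    dW1_prev, dW2_prev = np.zeros_like(W1), np.zeros_like(W2)
    eps = np.inf
    for _ in range(max_epochs):
        eps = 0.0
        for x, d in zip(patterns, targets):
            x1 = np.append(x, 1.0)              # input plus bias input
            h = sigmoid(W1 @ x1)                # hidden-layer outputs
            h1 = np.append(h, 1.0)
            y = sigmoid(W2 @ h1)                # network output
            eps += 0.5 * np.sum((d - y) ** 2)   # eps = 1/2 sum (d - y)^2
            # Output nodes: delta_j = y_j (1 - y_j)(d_j - y_j)
            delta_o = y * (1.0 - y) * (d - y)
            # Hidden nodes: delta_j = x_j (1 - x_j) sum_k delta_k w_jk
            delta_h = h * (1.0 - h) * (W2[:, :-1].T @ delta_o)
            # w(t+1) = w(t) + eta delta_j x_i + alpha [w(t) - w(t-1)]
            dW2 = eta * np.outer(delta_o, h1) + alpha * dW2_prev
            dW1 = eta * np.outer(delta_h, x1) + alpha * dW1_prev
            W2 += dW2
            W1 += dW1
            dW2_prev, dW1_prev = dW2, dW1
        if eps < tau:                           # error below threshold: stop
            break
    return W1, W2, eps

# Invented demonstration: learn the logical AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([[0], [0], [0], [1]], dtype=float)
W1, W2, final_eps = train_mlp(X, D)
```

The momentum term $\alpha\,[w(t) - w(t-1)]$ reuses the previous weight change, which with sequential (pattern-by-pattern) updates is simply the last update applied.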
Signal characterization involves a more complete solution to the inverse problem. In materials science, the inverse problem involves reasoning from effects (that is, indications) to draw inferences about test objects. Characterization techniques use information contained in the signal to estimate the size, shape and location of discontinuities. In other words, characterization procedures involve the full two-dimensional or three-dimensional reconstruction of discontinuity profiles in terms of the spatial distribution of the material properties of the test object. In general, the objective of the signal or discontinuity characterization procedure can be described as the identification of a mapping f such that: (43) where S
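As a toy sketch of identifying such a mapping (this is not equation 43; the signal features, discontinuity parameters and every number below are invented for illustration), a linear least-squares estimator can map signal features to discontinuity size:

```python
import numpy as np

# Invented training set: rows of S are signal feature vectors, rows of P
# are the corresponding discontinuity parameters (say depth and length, mm)
S = np.array([[0.2, 1.1], [0.4, 1.9], [0.6, 3.1], [0.8, 4.0]])
P = np.array([[0.5, 2.0], [1.0, 4.0], [1.5, 6.0], [2.0, 8.0]])

# Identify a linear mapping f: S -> P (with a constant offset column)
A, *_ = np.linalg.lstsq(np.c_[S, np.ones(len(S))], P, rcond=None)

# Characterize a new, unseen signal
estimate = np.array([[0.5, 2.5, 1.0]]) @ A
```

In practice the mapping is nonlinear and a trained network such as the multilayer perceptron above, rather than a linear fit, would be used; the linear fit merely makes the idea of an estimated inverse mapping concrete.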
