Neuro-fuzzy Systems
Xinbo Gao, School of Electronic Engineering, Xidian University, 2004.10

Introduction
Neuro-fuzzy systems are soft computing methods that combine neural networks and fuzzy concepts in various ways.
The ANN part plays the role of a nervous system: low-level perception and signal integration.
The fuzzy part represents the emergent "higher-level" reasoning aspects.

Introduction
Two complementary directions: "fuzzification" of neural networks, and endowing fuzzy systems with neural learning features.

Introduction
Co-operative: a neural learning algorithm adapts the fuzzy system.
Off-line: the adaptation takes place before the system is used.
On-line: the learning algorithms adapt the system while it operates.
Concurrent: the two techniques are applied one after another, as pre- or post-processing.
Hybrid: the fuzzy system is represented as a network structure, making it possible to take advantage of learning algorithms inherited from ANNs.

Fuzzy Neural Networks
Introduction of fuzzy concepts into artificial neurons and neural networks.
For example, while neural networks are good at recognizing patterns, they are not good at explaining how they reach their decisions.
Fuzzy logic systems, which can reason with imprecise information, are good at explaining their decisions, but they cannot automatically acquire the rules they use to make those decisions.
These limitations have been a central driving force behind the creation of intelligent hybrid systems, in which two or more techniques are combined in a manner that overcomes the limitations of the individual techniques.

Fuzzy Neurons
A fuzzy model of an artificial neuron can be constructed by using fuzzy operations at the single-neuron level:
x = (x1, x2, ..., xn),  w = (w1, w2, ..., wn),  y = g(w · x)

Fuzzy Neurons
y = g(w · x) becomes y = g(A(w, x)).
Instead of the weighted sum of inputs, a more general aggregation function A is used.
Fuzzy union, fuzzy intersection and, more generally, s-norms and t-norms can be used as the aggregation function for the weighted inputs of an artificial neuron.

OR Fuzzy Neuron
The transfer function g is linear.
If wk = 0 then wk AND xk = 0 independently of xk, while if wk = 1 then wk AND xk = xk.
y = OR(x1 AND w1, x2 AND w2, ..., xn AND wn),  OR: [0,1]^n x [0,1]^n -> [0,1]

AND Fuzzy Neuron
In the generalized forms based on t-norms, operators other than min and max can be used, such as the algebraic and bounded products and sums.
y = AND(x1 OR w1, x2 OR w2, ..., xn OR wn),  AND: [0,1]^n x [0,1]^n -> [0,1]
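The following is a minimal Python sketch of the OR and AND fuzzy neurons just described, using max and min as the s-norm/t-norm pair; the function names and example values are illustrative assumptions, not part of the original slides.

```python
# Minimal sketch of OR and AND fuzzy logic neurons with max/min as the
# s-norm/t-norm pair (other pairs, e.g. algebraic sum/product, also work).
from typing import Sequence

def or_neuron(x: Sequence[float], w: Sequence[float]) -> float:
    """y = OR(x1 AND w1, ..., xn AND wn), with AND = min and OR = max."""
    return max(min(xi, wi) for xi, wi in zip(x, w))

def and_neuron(x: Sequence[float], w: Sequence[float]) -> float:
    """y = AND(x1 OR w1, ..., xn OR wn), with OR = max and AND = min."""
    return min(max(xi, wi) for xi, wi in zip(x, w))

if __name__ == "__main__":
    x = [0.7, 0.2, 0.9]      # input membership degrees
    w = [1.0, 0.0, 0.5]      # connection weights in [0, 1]
    print(or_neuron(x, w))   # 0.7: the input with weight 0 is switched off
    print(and_neuron(x, w))  # 0.2: the input with weight 0 passes through unchanged
```

Note how the two neurons treat the weights dually: in the OR neuron wk = 0 eliminates an input, while in the AND neuron wk = 1 does.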

Fuzzy Neurons
Both the OR and the AND logic neurons are excitatory in character, i.e. an increase in xk never decreases y.
The issue of inhibitory (negative) weights deserves a short digression: in the realm of fuzzy sets, operations are defined on [0,1].
The proper way to make a weighted input inhibitory is to take the fuzzy complement of the excitatory membership value, NOT x = 1 - x.
The input x = (x1, ..., xn) is therefore extended to x = (x1, ..., xn, NOT x1, ..., NOT xn).

Fuzzy Neurons
The weighted inputs xi o wi, where o is a t-norm or t-conorm, can be general fuzzy relations too, not just the simple products used in standard neurons.
The transfer function g can be non-linear, for example a sigmoid.

OR / AND Fuzzy Neuron
This structure can produce a spectrum of intermediate behaviors that can be adjusted to suit a given problem.
If c1 = 0 and c2 = 1 the system reduces to a pure AND neuron; if c1 = 1 and c2 = 0 the behavior corresponds to that of a pure OR neuron.
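The slides do not spell out the internal wiring of the OR/AND neuron, so the sketch below assumes a common construction: an AND neuron and an OR neuron share the same inputs, and a second-layer OR neuron blends their outputs through the connections c1 and c2, wired so that the limit cases match the slide (c1 = 0, c2 = 1 gives the pure AND neuron; c1 = 1, c2 = 0 the pure OR neuron). All names and values are ours.

```python
# Sketch of an OR/AND fuzzy neuron: an AND neuron and an OR neuron see the
# same inputs, and a second-layer OR neuron with connections c1, c2 blends
# their outputs, producing a spectrum of behaviors between pure AND and
# pure OR. The wiring is an assumption consistent with the limit cases above.
from typing import Sequence

def or_neuron(x: Sequence[float], w: Sequence[float]) -> float:
    return max(min(xi, wi) for xi, wi in zip(x, w))

def and_neuron(x: Sequence[float], w: Sequence[float]) -> float:
    return min(max(xi, wi) for xi, wi in zip(x, w))

def or_and_neuron(x, w_and, w_or, c1, c2):
    z_and = and_neuron(x, w_and)
    z_or = or_neuron(x, w_or)
    # c1 = 0, c2 = 1 -> pure AND neuron; c1 = 1, c2 = 0 -> pure OR neuron
    return or_neuron([z_or, z_and], [c1, c2])

if __name__ == "__main__":
    x = [0.7, 0.2, 0.9]
    w_or = [1.0, 1.0, 1.0]    # wk = 1 lets xk pass through the OR neuron
    w_and = [0.0, 0.0, 0.0]   # wk = 0 lets xk pass through the AND neuron
    print(or_and_neuron(x, w_and, w_or, c1=0.0, c2=1.0))  # 0.2 = AND of the inputs
    print(or_and_neuron(x, w_and, w_or, c1=1.0, c2=0.0))  # 0.9 = OR of the inputs
    print(or_and_neuron(x, w_and, w_or, c1=0.5, c2=0.5))  # 0.5, an intermediate response
```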
Multilayered Fuzzy Neural Networks
A generalization of the simple fuzzy neurons above.
If we restrict ourselves to the pure two-valued Boolean case, the network represents an arbitrary Boolean function as a sum of minterms.
More generally, if the values are continuous memberships of a fuzzy set, these networks approximate a certain unknown fuzzy function.
A second possibility is to have OR neurons in the hidden layer and a single AND neuron in the output layer.

Learning in Fuzzy Neural Networks
Supervised learning in an FNN consists in modifying the connection weights in such a manner that an error measure is progressively reduced; performance should remain acceptable when the network is presented with new data.
Given a set of training data pairs (xk, dk) for k = 1, 2, ..., n:
w(t+1) = w(t) + Δw(t), where the weight change is a given function of the difference between the target response d and the calculated node output y: Δw(t) = F(|d(t) - y(t)|).

Learning in Fuzzy Neural Networks
The mean square error E is a measure of how well the fuzzy network maps input data onto the corresponding outputs:
E(w) = 1/2 Σk (dk - yk)^2
Gradient descent: Δwij = -η ∂E/∂wij

An Example: NEFPROX
NEuro Fuzzy function apPROXimator.
A three-layer feedforward network (no cycles in the network, and no connections between layer n and layer n+j with j > 1): input variables / hidden layer of fuzzy rules / output variables.
Hidden and output units use t-norms and t-conorms as aggregation functions.
The fuzzy sets are encoded as fuzzy connection weights and fuzzy inputs.

NEFPROX
The input units are labelled x1, ..., xn, the hidden rule units are called R1, ..., Rk, and the output units are denoted y1, ..., ym.
Each connection is weighted with a fuzzy set and labelled with a linguistic term.
Connections coming from the same input unit and having the same label are weighted by the same common weight (shared weight); the same holds for connections that lead to the same output unit.
There is no pair of rules with identical antecedents.

NEFPROX: learning

A Second Example: The ANFIS System
Adaptive Network-based Fuzzy Inference System.
A neuro-fuzzy system that can identify parameters by using supervised learning methods.
A Sugeno-type fuzzy system with learning capabilities (first-order model).
Nodes have the same function within a given layer but differ from one layer to the next.

ANFIS System
The learning algorithm is a hybrid supervised method based on gradient descent and least squares.
Forward phase: signals travel forward up to layer 4 and the consequent parameters are identified by least squares; in the backward phase the error signals propagate backwards and the premise parameters are updated by gradient descent.
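To make the layer structure concrete, here is a minimal sketch of a first-order Sugeno (ANFIS-style) forward pass with two inputs and two Gaussian fuzzy sets per input; the layer roles follow the standard ANFIS formulation, while the membership functions, parameter values and function names are illustrative assumptions.

```python
# Sketch of a first-order Sugeno (ANFIS-style) forward pass:
# layer 1 fuzzifies the inputs, layer 2 forms rule firing strengths with a
# product t-norm, layer 3 normalizes them, layers 4-5 compute the weighted
# first-order consequents and sum them. All parameter values are illustrative.
import numpy as np

def gauss_mf(x, c, sigma):
    """Gaussian membership function (layer 1)."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def anfis_forward(x1, x2, premise, consequent):
    # Layer 1: membership degree of each input in each fuzzy set
    mu1 = [gauss_mf(x1, c, s) for c, s in premise["x1"]]
    mu2 = [gauss_mf(x2, c, s) for c, s in premise["x2"]]
    # Layer 2: firing strength of each rule (product t-norm, one rule per
    # combination of fuzzy sets -> 4 rules here)
    w = np.array([m1 * m2 for m1 in mu1 for m2 in mu2])
    # Layer 3: normalized firing strengths
    w_bar = w / w.sum()
    # Layer 4: first-order consequents  f_i = p_i*x1 + q_i*x2 + r_i
    f = np.array([p * x1 + q * x2 + r for p, q, r in consequent])
    # Layer 5: overall output as the weighted sum of the rule outputs
    return float(np.dot(w_bar, f))

if __name__ == "__main__":
    premise = {                      # (center, sigma) of each fuzzy set
        "x1": [(0.0, 1.0), (1.0, 1.0)],
        "x2": [(0.0, 1.0), (1.0, 1.0)],
    }
    consequent = [(1.0, 0.5, 0.0),   # (p, q, r) of each of the 4 rules
                  (0.5, 1.0, 0.2),
                  (0.2, 0.3, 1.0),
                  (1.0, 1.0, -0.5)]
    print(anfis_forward(0.3, 0.8, premise, consequent))
```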

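The forward phase of the hybrid learning rule can be sketched in the same spirit: with the premise (membership) parameters held fixed, the overall output is linear in the first-order consequent parameters, so a batch of training pairs can be solved by ordinary least squares. This follows the standard ANFIS procedure in outline; the data and parameter values below are made up for illustration.

```python
# Sketch of the forward phase of ANFIS hybrid learning: with the premise
# parameters held fixed, the network output is linear in the consequent
# parameters (p_i, q_i, r_i), so they can be identified in one shot by
# least squares. Membership functions and training data are illustrative.
import numpy as np

def gauss_mf(x, c, sigma):
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def normalized_firing_strengths(x1, x2, premise):
    mu1 = [gauss_mf(x1, c, s) for c, s in premise["x1"]]
    mu2 = [gauss_mf(x2, c, s) for c, s in premise["x2"]]
    w = np.array([m1 * m2 for m1 in mu1 for m2 in mu2])
    return w / w.sum()

def fit_consequents(X, d, premise):
    """Least-squares identification of the first-order consequent parameters."""
    rows = []
    for x1, x2 in X:
        w_bar = normalized_firing_strengths(x1, x2, premise)
        # output = sum_i w_bar_i * (p_i*x1 + q_i*x2 + r_i), linear in (p, q, r)
        rows.append(np.concatenate([w_bar * x1, w_bar * x2, w_bar]))
    A = np.vstack(rows)
    theta, *_ = np.linalg.lstsq(A, np.asarray(d), rcond=None)
    return theta        # stacked [p_1..p_R, q_1..q_R, r_1..r_R]

if __name__ == "__main__":
    premise = {"x1": [(0.0, 1.0), (1.0, 1.0)],
               "x2": [(0.0, 1.0), (1.0, 1.0)]}
    # In practice many more samples than parameters would be used.
    X = [(0.1, 0.2), (0.4, 0.9), (0.8, 0.3), (0.9, 0.7)]
    d = [0.3, 1.1, 0.9, 1.5]         # target outputs (made up)
    print(fit_consequents(X, d, premise))
```

In the backward phase the premise parameters would then be adjusted by gradient descent on the squared error, and the two phases alternate epoch by epoch.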