Extension neural network

Extension neural network (ENN) is a pattern recognition method introduced by M. H. Wang and C. P. Hung in 2003 to classify instances of data sets. ENN combines artificial neural networks with concepts from extension theory: it uses the fast, adaptive learning capability of neural networks and the correlation estimation property of extension theory, obtained by calculating an extension distance. ENN has been used in several classification applications.

Extension theory

Extension theory was first proposed by Cai in 1983 to solve contradictory problems. While classical mathematics deals with the quantity and form of objects, extension theory represents objects with the matter-element model

R = (N, C, V)     (1)

where, for a matter R, N is its name or type, C is its set of characteristics and V is the corresponding value of each characteristic. A corresponding example is given in equation 2,

R = \begin{bmatrix} Yusuf & Height & 178\ \mathrm{cm} \\ & Weight & 98\ \mathrm{kg} \end{bmatrix}     (2)

where the Height and Weight characteristics form extension sets. These extension sets are defined by the V values, which are range values for the corresponding characteristics. Extension theory is concerned with the extension correlation function between matter-element models such as the one in equation 2 and extension sets. The extension correlation function is used to define the extension space, which is composed of pairs of elements and their extension correlation function values. The extension space is given in equation 3,

A = \{(x, y) \mid x \in U,\ y = K(x)\}     (3)

where A is the extension space, U is the object space, K is the extension correlation function, x is an element of the object space and y is the extension correlation function output for element x. K(x) maps x to a membership interval [-\infty, \infty]. The negative region represents the degree to which an element does not belong to a class, and the positive region the degree to which it does. If x is mapped to [0, 1], extension theory behaves like fuzzy set theory. The correlation function is built from the extension distance shown in equation 4,

\rho(x, X_{in}) = \left| x - \frac{a+b}{2} \right| - \frac{b-a}{2}     (4)

where X_{in} and X_{out} are called the concerned and neighborhood domains and their intervals are (a, b) and (c, d) respectively. The extended correlation function used to estimate the membership degree of x with respect to X_{in} and X_{out} is shown in equation 5,

K(x) = \begin{cases} -\rho(x, X_{in}) & x \in X_{in} \\ \dfrac{\rho(x, X_{in})}{\rho(x, X_{out}) - \rho(x, X_{in})} & x \notin X_{in} \end{cases}     (5)

Equations 6 to 10 define the quantities used by the ENN learning algorithm: the extension distance between an instance and a class (6), the minimum-distance classification decision (7), the initialization of the upper weight of each class (8), the cluster center of each class (9) and the learning performance measure (10):

ED_{ik} = \sum_{j=1}^{n} \left( \dfrac{\left| x_{ij}^{p} - z_{kj} \right| - \dfrac{w_{kj}^{U} - w_{kj}^{L}}{2}}{\left| \dfrac{w_{kj}^{U} - w_{kj}^{L}}{2} \right|} + 1 \right)     (6)

k^{*} = \arg\min_{k}(o_{ik})     (7)

w_{kj}^{U} = \max_{i}\{x_{ij}^{k}\}     (8)

Z_{k} = \{z_{k1}, z_{k2}, \ldots, z_{kn}\}     (9)

E_{\tau} = \dfrac{N_{m}}{N_{p}}     (10)
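As an illustration of equations 4 and 5, the following Python sketch computes the extension distance and the extended correlation function for a single characteristic. It is a minimal sketch: the interval (a, b) stands for the concerned domain X_in and (c, d) for the neighborhood domain X_out as in the text above, and the function and variable names are illustrative assumptions rather than names from the original paper.

# Extension distance (eq. 4) and extended correlation function (eq. 5)
# for a one-dimensional characteristic. Names are illustrative.

def rho(x, low, high):
    # Extension distance between point x and the interval (low, high).
    return abs(x - (low + high) / 2) - (high - low) / 2

def correlation(x, x_in, x_out):
    # Extended correlation function K(x); x_in = (a, b), x_out = (c, d).
    r_in = rho(x, *x_in)
    r_out = rho(x, *x_out)
    if r_in <= 0:                 # x lies inside the concerned domain X_in
        return -r_in
    return r_in / (r_out - r_in)  # x outside X_in: membership degree is negative

# Example: membership of a Height value in a class whose Height range is
# (170, 180) with neighborhood domain (160, 190).
print(correlation(175, (170, 180), (160, 190)))  # inside X_in, positive degree
print(correlation(185, (170, 180), (160, 190)))  # outside X_in, negative degree

Values greater than zero indicate membership in the class, and they are largest for points near the middle of the concerned domain.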

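Equations 6 to 10 can be read together as a nearest-class classifier. The sketch below, assuming a NumPy feature matrix X (instances by characteristics) and integer class labels y, shows the range-based weight initialization of equation 8 (together with its lower-bound counterpart w_kj^L), the cluster centers of equation 9, the extension distance of equation 6, the minimum-distance decision of equation 7 and the misclassification rate of equation 10. It is only a sketch of these quantities under those assumptions; it leaves out the iterative weight-adjustment phase of the full ENN learning algorithm, and all names are illustrative.

import numpy as np

def fit(X, y):
    classes = np.unique(y)
    # Eq. 8 and its lower-bound counterpart: per-class feature ranges as weights.
    w_low  = np.array([X[y == k].min(axis=0) for k in classes])
    w_high = np.array([X[y == k].max(axis=0) for k in classes])
    # Eq. 9: cluster center z_kj of each class, taken as the midpoint of its range.
    centers = (w_low + w_high) / 2
    return classes, w_low, w_high, centers

def extension_distance(x, w_low, w_high, centers):
    # Eq. 6: extension distance ED_ik between one instance x and every class k.
    half_range = np.maximum((w_high - w_low) / 2, 1e-12)  # guard zero-width ranges
    return np.sum((np.abs(x - centers) - half_range) / half_range + 1, axis=1)

def predict(X, classes, w_low, w_high, centers):
    # Eq. 7: assign each instance to the class with the smallest extension distance.
    return np.array([classes[np.argmin(extension_distance(x, w_low, w_high, centers))]
                     for x in X])

def error_rate(y_true, y_pred):
    # Eq. 10: ratio of misclassified instances to the total number of instances.
    return np.mean(y_true != y_pred)

Used in this form, fit followed by predict classifies instances by their extension distance to the learned class ranges; in the full ENN learning algorithm the class weights are further tuned whenever a training instance is misclassified.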
[ "Artificial neural network", "extension theory" ]
Parent Topic
Child Topic
    No Parent Topic