
Web-Based Search System of Pattern Recognition for Component Patterns Database by a Novel Algorithm

Sung-Jung Hsiao (a), Wen-Tsai Sung (b), and Kuo-Chin Fan (a)

(a) Artificial Intelligence and Pattern Recognition Lab, Department of Computer Science & Information Engineering, National Central University, Taiwan
(b) Automation & CAD Lab, Department of Electrical Engineering, National Central University, Taiwan

Contact Address: 4F, No. 184-1, Kee-Kin 1st Road, An Lo District, 204 Keelung, Taiwan
Contact E-mail: song1208@ms5.hinet.net

Abstract

This study applies pattern recognition (PR) technologies with associative memory to real-time recognition of engineering components in a web-based recognizing system built on a client-server network structure. A remote engineer can draw the shape of an engineering component directly in the browser, and the recognition system searches the company's component database over the Internet. Component patterns are stored in the database system; besides the pattern of each engineering component, its properties and specifications are attached to the data fields of the corresponding record. Storing component patterns in a database system effectively improves the capacity of the recognition system. The recognition system adopts parallel computing, which raises its recognition rate. The system is a client-server network structure over the Internet and uses a recurrent neural network (RNN) with associative memory to perform training and recognition. In the last phase, database matching is combined with parallel computing during recognition, which resolves the problem of spurious states. The system was deployed at the Yang-Fen Automation Electrical Engineering Company; the cooperative plan ran for four months, during which the company's engineers became accustomed to web-based pattern recognition. This cooperative plan is analyzed and discussed in this paper.

Keywords: Web-Based, Pattern Recognition, engineering components, component database, RNN

1. Introduction

Pattern recognition by neural networks is widely discussed on the Internet. Weather forecasting (http://www.eunetat.de/en/area2/cqms/ap3-13.htm), document analysis and recognition (CEDAR) (http://www.cedar.buffalo.edu/into.html), optical character recognition (OCHRE) (http://www.geocities.com/SiliconValley/2548/ochre.html), atmospheric blocking recognition and prediction (http://www.aquila.infn.it/atmosfera/neurotools/), and even financial forecasting can utilize pattern recognition [1]. Several recognition procedures with limited capacity have been considered; such technologies can be partially improved, but they have not yet yielded an optimal solution to the restricted capacity when many data are involved [2][3][4]. Associative memory is critical in a neural network used for pattern recognition, and many studies of pattern recognition have focused on the structure of associative memory [5][6]. The recurrent neural network (RNN) possesses the function of non-linear associative memory and is used very effectively in pattern recognition [7][8]. The development of the Internet is becoming increasingly important, yet several pattern recognition programs are still being developed at the local end. Pattern recognition methods on the Internet will therefore be a focus of integrating science and technology in the future.

In this paper, the web-based system uses associative memory to perform component recognition; it is a neural network with an RNN structure. Recognition systems generally adopt an RNN for pattern recognition, but the task is usually limited to character recognition. In our approach, the system performs the recognition task on component patterns with an RNN. The entire web-based recognition system is built in a client-server network structure over the Internet: the database of stored patterns is the server end, and the interface the user adopts is the client end. The server-end database stores all the patterns of the component warehouse. Among many possible sample patterns, this paper proposes using the shapes of engineering components and circuit signs as the recognized sample patterns. The user can input the pattern to be searched in the handwritten drawing region at the client end.

The system begins the recognition task after the search button is clicked. In the training phase of the recognition process, the system uses parallel computing to improve the storage capacity for patterns. In the retrieval stage, the web-based system uses database matching to mitigate the spurious states produced by the RNN. A simulation experiment is also discussed to clarify and corroborate the above web-based PR technology. Future developments of the proposed web-based PR framework and an analysis of the algorithm are also discussed.

2. Parallel computing and system analysis

This work is an innovative pattern recognition network that enhances the network structure of an RNN. In the classical approach [9], an RNN is a discrete-time, discretely valued dynamic system which, at any given time $t$, is characterized by a binary state vector

$x(t) = [x_1(t), \ldots, x_i(t), \ldots, x_n(t)] \in \{1, -1\}^n$   (1)

The behavior of the system is given by a dynamic equation of the type

$x_i(t+1) = \mathrm{sgn}\left(\sum_{j=1}^{n} W_{ij} x_j(t) - \theta_i\right), \quad i = 1, 2, \ldots, n$   (2)

A point $x$ is a fixed point for any of the pattern prototype vectors $\xi^1, \xi^2, \ldots, \xi^p$ [10]:

$\xi^{u} = [x_1(t), \ldots, x_i(t), \ldots, x_n(t)] \in \{1, -1\}^n$   (3)

In our approach, $x(t)$ is a record in the pattern database, and each $x_i(t)$ is a field of that data record. The data are bipolar, taking the values 1 and -1: a "1" represents a black point in the pattern, and a "-1" represents a white blank in the pattern. The sample patterns are stored directly in the database system at the server end via the Internet. A user can modify the patterns of the database system at any time, and a remote user may build up his or her own sample patterns, as shown in Figure 1.
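
As an illustration only (not the authors' code), the bipolar encoding and the state update of Eq. (2) might be sketched as follows; the pattern size of 15 x 16 = 240 neurons matches the component patterns described later in the paper.

```python
import numpy as np

N = 240  # number of neurons: one per pixel of a 15 x 16 component pattern

def to_bipolar(binary_image):
    """Map a 0/1 image (1 = black point, 0 = white blank) to a {+1, -1} vector."""
    x = np.asarray(binary_image, dtype=int).ravel()
    return np.where(x == 1, 1, -1)

def update(x, W, theta):
    """One synchronous application of Eq. (2): x_i(t+1) = sgn(sum_j W_ij x_j(t) - theta_i).
    Ties (zero argument) are broken toward +1 in this sketch; the paper's Eq. (11)
    keeps the previous output instead."""
    h = W @ x - theta
    return np.where(h >= 0, 1, -1)
```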

For the network parameters, the synaptic matrix $W$ and the threshold vector $\theta$ are improved from the discrete RNN [11]. Initially, in the training stage, the records of the pattern database are cut into segments; a parallel computation determines the $W$ and $\theta$ values of every segment, and the $W$ and $\theta$ of every segment are then applied in Eq. (2) to determine the most similar pattern records retrieved in each segment. These most similar pattern records are collected, and their $W$ and $\theta$ are calculated again and applied in Eq. (2). Repeating the computation several times finally yields the correct pattern among many sample patterns. The web-based PR system adopts a parallel computing architecture [25]. For example, if the pattern database includes fifty records and the cut number is ten, parallel computation determines the $W$ and $\theta$ of every group of ten records; then the $W$ and $\theta$ of every group of ten records are applied in Eq. (2). The most similar pattern records become new pattern records, which are collected, and their $W$ and $\theta$ are computed again according to the first cut number. The computation is repeated until the result of the recognition is determined, as shown in Figure 1. The operation of a discrete RNN as a content-addressable memory involves two phases: storage and retrieval.

2.1 Storage phase

Assume that a set of N-dimensional vectors (binary words), denoted by $\{\xi^{\mu} \mid \mu = 1, 2, \ldots, N\}$, is to be stored. These vectors are called fundamental memories and represent the patterns to be memorized by the network. Let $\xi_i^{\mu}$ denote the $i$th element of the fundamental memory $\xi^{\mu}$, where the class $\mu = 1, 2, \ldots, N$.
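
A minimal sketch of this segmented, parallel training stage follows; it assumes the per-segment $W$ and $\theta$ are computed by the storage rule given in Section 2.1 below, and the names train_segment and train_in_segments are illustrative, not from the paper.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def train_segment(segment):
    """Compute (W, theta) for one segment of pattern records using the
    outer-product rule of Eq. (4) and the threshold of Eq. (10)."""
    P = segment.shape[1]                  # number of neurons
    W = (segment.T @ segment) / P
    np.fill_diagonal(W, 0.0)              # Eq. (5): W_ii = 0
    theta = W.sum(axis=0)                 # Eq. (10)
    return W, theta

def train_in_segments(records, cut=10):
    """Split the pattern database into segments of `cut` records and train them in parallel."""
    segments = [records[k:k + cut] for k in range(0, len(records), cut)]
    with ProcessPoolExecutor() as pool:
        return list(pool.map(train_segment, segments))

# Example: 50 records of 240-pixel bipolar patterns, cut into 5 segments of 10.
# records = np.where(np.random.rand(50, 240) > 0.5, 1, -1)
# models = train_in_segments(records, cut=10)
```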

Figure 1. Parallel computing of W and θ via the pattern database system (50 records are divided into segments of 10; W and θ are calculated for each segment, the most similar patterns are collected, and Wnew and θnew are recalculated until the recognized pattern is output).

According to the outer-product rule of storage, Hebb's postulate of learning, the synaptic weight from neuron $i$ to neuron $j$ is generalized as

$W_{ji} = \frac{1}{P} \sum_{\mu=1}^{N} \xi_j^{\mu} \, \xi_i^{\mu}$   (4)

where $1/P$ is taken as the constant of proportionality to simplify the mathematical description of information retrieval [12]. Notably, the learning rule in Eq. (4) is a "one shot" computation. In the normal operation of the RNN, the following is set:

$W_{ii} = 0 \quad \text{for all } i, \; i = 1, \ldots, P$   (5)

Setting $W_{ii} = 0$ prevents positive self-feedback [13]. Let $W$ denote the $P \times P$ synaptic weight matrix of the network, with $W_{ji}$ as its $ji$th element. Equations (4) and (5) can then be combined into a single equation written in matrix form:
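
A minimal sketch of this storage phase under the conventions above ($P$ neurons, $N$ fundamental memories stored as rows of a NumPy array); an illustration, not the authors' implementation:

```python
import numpy as np

def store(memories):
    """Outer-product (Hebbian) storage of Eqs. (4)-(5).

    memories: array of shape (N, P) with entries in {+1, -1}.
    Returns the P x P symmetric weight matrix W with zero diagonal.
    """
    N, P = memories.shape
    W = (memories.T @ memories) / P   # Eq. (4): (1/P) * sum_mu xi^mu (xi^mu)^T
    np.fill_diagonal(W, 0.0)          # Eq. (5): W_ii = 0, equivalent to subtracting (N/P) I
    return W
```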

$W = \frac{1}{P} \sum_{\mu=1}^{N} \xi^{\mu} (\xi^{\mu})^{T} - \frac{N}{P} I$   (6)

where $I$ is the $P \times P$ identity matrix, and $W$ is a symmetric matrix whose diagonal is zero everywhere:

$W = \begin{bmatrix} W_{11} & \cdots & W_{1P} \\ \vdots & \ddots & \vdots \\ W_{P1} & \cdots & W_{PP} \end{bmatrix}$   (7)

$\;\; = \begin{bmatrix} 0 & \cdots & W_{1P} \\ \vdots & \ddots & \vdots \\ W_{P1} & \cdots & 0 \end{bmatrix}$   (8)

The threshold of the $j$th neuron has two modes:

$\theta_j = 0, \quad j = 1, \ldots, P$   (9)

or

$\theta_j = \sum_{i=1}^{P} W_{ij}, \quad j = 1, \ldots, P$   (10)

The threshold of Eq. (10) can increase the memory capacity of the network [14].

2.2 Retrieval phase

If a pattern vector $X$ to be recognized is input, then the initial output value is $X(0)$. Every subsequent neuron output is computed by Eq. (11):

$X_j(n+1) = \mathrm{sgn}\left(\sum_{i=1}^{P} W_{ji} X_i(n) - \theta_j\right) = \mathrm{sgn}(u_j(n) - \theta_j) = \begin{cases} 1 & \text{if } u_j(n) > \theta_j \\ X_j(n) & \text{if } u_j(n) = \theta_j \\ -1 & \text{if } u_j(n) < \theta_j \end{cases}$   (11)

In his original paper, Hopfield used the values 0 and 1 as the outputs [11]; however, the values 1 and -1 are now commonly used [15][16] for convenience with the zero threshold. In Eq. (11), $n$ is the number of iterations. Importantly, the discrete RNN uses an asynchronous method to alter the output of each neuron, and the complete process of associative memory uses Eq. (12) to describe the chain of states:

$X(0) \rightarrow X(1) \rightarrow X(2) \rightarrow \cdots \rightarrow X(k) \rightarrow X(k+1) \rightarrow \cdots$   (12)

The output is unchanged by continued iterative computation once the state converges on a stable state; mathematically, $X = \mathrm{sgn}(WX - \theta)$, where $X$ is a stable state.
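
The retrieval phase of Eq. (11), iterated asynchronously until the state stops changing, could be sketched as follows (again an illustration under the conventions above, not the authors' code):

```python
import numpy as np

def retrieve(W, theta, probe, max_iters=1000, rng=None):
    """Asynchronous retrieval of Eq. (11): update one neuron at a time until stable.

    probe: initial state X(0), entries in {+1, -1}.
    Returns a converged state X satisfying X = sgn(WX - theta).
    """
    rng = np.random.default_rng() if rng is None else rng
    X = probe.copy()
    P = X.size
    for _ in range(max_iters):
        changed = False
        for j in rng.permutation(P):        # asynchronous, random update order
            u = W[j] @ X
            if u > theta[j]:
                new = 1
            elif u < theta[j]:
                new = -1
            else:
                new = X[j]                   # u == theta: keep the previous output
            if new != X[j]:
                X[j] = new
                changed = True
        if not changed:                      # no neuron changed: stable state reached
            break
    return X
```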

Using a synchronous mode to change the output of the network changes many of the results, but the neural network still converges: part of the state space converges on stable states, and the remaining part can exhibit a limit cycle of length at most 2 [17]. Although an asynchronous method is used here to change the output of the network, and $X$ converges on a stable state, the recall is sometimes still incorrect [18]. The $X$ state of final convergence is therefore used to match the original pattern database. Computing each Hamming distance determines the minimal value $d_{H\min}$ [19]. With $n$ pattern records, the Hamming distance is computed by

$d_H = \sum_{i=1}^{P} \left| X_i - \xi_i^{u} \right|, \quad u = 1, 2, \ldots, n$   (13)

and the minimal value is

$d_{H\min} = \min\left\{ \sum_{i=1}^{P} \left| X_i - \xi_i^{1} \right|, \; \sum_{i=1}^{P} \left| X_i - \xi_i^{2} \right|, \; \ldots, \; \sum_{i=1}^{P} \left| X_i - \xi_i^{n} \right| \right\}$   (14)

If the convergent result $X$ equals a sample pattern vector $\xi^{u}$, then $d_{H\min} = 0$. If the convergent result $X$ does not equal any sample pattern vector $\xi^{u}$, then $d_{H\min} > 0$; in that case, $X$ is merely similar to the sample pattern $\xi^{u}$.

3. Establishing and managing the pattern database

Our web site uses Microsoft Internet Information Server (IIS). All web pages of the recognition system are placed on the IIS, which the administrator manages conveniently. When a user changes data in the database, the administrator can monitor which data have been changed. The pattern database work can be divided into two main parts: the first is establishing the patterns, and the second is managing the data; both are accomplished through the browser. When patterns are built in the pattern database, our approach adopts a web-based, real-time method over the Internet: after the user inputs a pattern and clicks the submit button, the pattern is stored in the database. When users input these patterns, they simultaneously input the related properties of the patterns.
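
Matching the converged state against the pattern database by the Hamming distance of Eqs. (13)-(14) is straightforward; a minimal sketch:

```python
import numpy as np

def hamming_match(X, memories):
    """Return (best_index, d_H_min) for the converged state X against the stored patterns.

    memories: array of shape (n, P) with entries in {+1, -1}.
    For bipolar vectors, |X_i - xi_i| is 0 or 2, so d_H here is twice the number
    of differing positions; the argmin is unaffected.
    """
    d = np.abs(memories - X).sum(axis=1)   # Eq. (13) for every stored pattern
    best = int(np.argmin(d))               # Eq. (14)
    return best, int(d[best])
```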

After a user inputs a sample pattern, the administrator can supervise and manage the database through the Web Assistant function. The supervisor can view the newest stored pattern in the pattern database through the browser, and the administrator can also modify the database at any time. The whole system is shown in Figure 2.

Figure 2. The relationship between the administrator and the client user (client web pages and the administrator management interface both access the patterns database through the login system).

In this paper, the Web Assistant displays the newest data in the pattern database. Besides the pattern data, the user can view the field data of the pattern number and the pattern builder, as shown in Figure 3. Our pattern database adopts a relational model to establish its data tables. The data of the pattern registrar and the pattern data are separated by the relational model, which also reduces the complexity of the pattern database.

The relational graph of the data tables is shown in Figure 4. This paper uses a one-to-many method to build the data tables. When recognition is accomplished and the correct pattern is found by the recognition system, the user can also view the properties of that pattern.

Figure 3. The component pattern data and their builders, shown on the homepage of the system.

Figure 4. The relational graph of the pattern database.

4. Storage capacity analysis and improvement

As an important model of associative memory, the RNN has been comprehensively researched and applied to pattern recognition using the sum-of-outer-products rule [11]. Further research has addressed asymmetric or generalized RNN models with other learning algorithms,

since the memory capacity of the RNN using the sum-of-outer-products scheme is very low [21][22][23]. Hopfield originally estimated the number of stable patterns of the Hopfield RNN at 0.15P (for P neurons) [11]. Since then, many other researchers have obtained results that show better performance. The capacity of a Hopfield RNN is the number $C$ of stable states it has; obviously, $C$ depends on the weight matrix, which is taken to be symmetric with zeros on the diagonal. McEliece et al. [21] showed that

$\frac{P}{4\ln P} < C < \frac{P}{2\ln P}$   (21)

For example, for 100 neurons, $C$ satisfies $5 < C < 10$, where $C$ is the number of data records in stable states. The memory capacity of a discrete RNN therefore has an upper limit: if the number of neurons is $P$, Eq. (21) yields the maximum memory capacity

$M_{\max} = \frac{P}{2\ln P}$   (22)

D. J. Amit [24] stated that, for $P$ neurons and 99% correct retrieval, the number of stored data records is limited by

$M \le \frac{P}{4\ln P}$, where $M$ is the memory capacity.   (23)

Notably, $M$ in Eqs. (22) and (23) becomes the basis of the segment division in the pattern database: $M$ is the segment size of the distributed computation. Writing a component pattern in the computer uses $P = 240$ ($15 \times 16$ matrix) neurons and $P^2 - P = 57360$ weight values for recollection. Therefore $P = 240$ in Eqs. (22) and (23), and the capacity determines the number of pattern records per cut in the database. Assume that Eq. (23) is used to compute the cut number with 100% recognition:

$M \le \frac{P}{4\ln P}, \quad P = 240 \;\;\Rightarrow\;\; M \le \frac{240}{4\ln 240} \approx 10.9 \;\;\Rightarrow\;\; M \le 10 \text{ (records)}$   (24)
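
A quick numerical check of the cut-number computation of Eq. (24), as a sketch:

```python
import math

P = 240                        # 15 x 16 component pattern: one neuron per pixel
M = P / (4 * math.log(P))      # Eq. (23): capacity bound for ~99%-correct retrieval
print(math.floor(M))           # prints 10, the number of records per segment
```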

Accordingly, the cut number of the pattern database is set to ten: the patterns are grouped into segments of ten records each, and the $W$ and $\theta$ values of each segment are calculated.

5. Implementing the Web-Based pattern recognition system

The implementation of the new pattern recognition system is now considered further. The pattern database system is first established at the server end, and Microsoft SQL Server is used as the data management platform. Inputting a sample pattern is a dynamic action, and a real-time, web-based method is used to input the pattern. New learning patterns can be built at any time, and established sample patterns can be updated, modified, and deleted. These capabilities fully conform to the rules for building the pattern database; refer to Figure 5.

Figure 5. Using a web-based method to build a pattern database (prototype database): a remote user or administrator inputs the sample pattern in the browser, and the patterns are stored in real time in SQL Server 2000.

Using the method of Figure 1 and parallel computation over the cut database efficiently solves the problem of capacity. Because new learning patterns are added dynamically, pattern recognition can be performed at any time. The recognized results of the pattern database and the client-end operation are included in a web page, as in Figure 6.

Figure 6. Patterns are input at the client end and displayed at the server end.
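
The paper does not list its database schema; the following sketch merely illustrates how a component pattern and its properties might be stored as one record over ODBC. The table and column names (ComponentPattern, PatternID, Builder, Spec, PatternBits) and the connection details are hypothetical.

```python
import pyodbc

# Hypothetical schema: ComponentPattern(PatternID, Builder, Spec, PatternBits),
# where PatternBits holds the 240-pixel pattern serialized as '1'/'0' text.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=server-end;"
    "DATABASE=PatternDB;UID=user;PWD=password"
)
cursor = conn.cursor()

def store_pattern(pattern_id, builder, spec, bipolar_vector):
    """Insert one component pattern record together with its properties."""
    bits = "".join("1" if v == 1 else "0" for v in bipolar_vector)  # serialize {+1, -1}
    cursor.execute(
        "INSERT INTO ComponentPattern (PatternID, Builder, Spec, PatternBits) "
        "VALUES (?, ?, ?, ?)",
        pattern_id, builder, spec, bits,
    )
    conn.commit()
```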

In Figure 6, the right side of the web page is the client end and the left side is the server end. If the user inputs the pattern at the client end, the correct recognized result is presented at the server end, even when the source pattern from the client end has been disturbed by some noise. The new recognition system overcomes many problems that previously existed: the capacity and correctness rate are substantially increased, and distributed computation turns the neural network into a highly efficient system.

The convergence of the recognition system is now analyzed with reference to Lippmann's experiment, in which the inputs applied to the network take the value +1 for black points and -1 for white points. A pattern of interest is distorted by randomly and independently reversing each point of the pattern from +1 to -1 and vice versa with a probability of 0.25, and the corrupted pattern is then used as a probe for the network. Figure 7 presents the result for a component pattern. The patterns produced by the network after 30, 60, 100, 150, 200, and 238 iterations show that the resemblance of the network output to the component pattern progressively improves; indeed, after 238 iterations, the network converges onto exactly the correct form of the component pattern.

Figure 7. The complete pattern recognition system in the convergence process (network output after 30, 60, 100, 150, 200, and 238 iterations).
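
The probe used in this experiment can be generated as in the following sketch, which only illustrates the 0.25 corruption probability and is not the authors' code:

```python
import numpy as np

def corrupt(pattern, flip_prob=0.25, rng=None):
    """Randomly and independently reverse each point of a {+1, -1} pattern
    with the given probability, producing the probe for the network."""
    rng = np.random.default_rng() if rng is None else rng
    flips = rng.random(pattern.shape) < flip_prob
    return np.where(flips, -pattern, pattern)
```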

Figure 7 shows the correct pattern of stable convergence after 238 iterations at the server end. Next, consider a case of spurious states: a noisy pattern is input at the client end, and only a partial pattern is recalled at the server end. The pattern is not correctly recalled because such a sample pattern was never input, as shown in Figure 8.

Figure 8. Partially incorrect recollection.

The system then adds matching against the pattern database; accordingly, the partial incorrectness in Figure 8 does not arise again. After the recognized result is matched against the pattern database, it converges to an accurate pattern, as in Figure 9.

Figure 9. Accurate convergence after the recognized result is matched against the pattern database.

6. Cooperative example

Based on our web-based recognition system, our laboratory and the Yang-Fen Automation Electrical Engineering Company carried out a plan of technological cooperation starting in January 2002. The Yang-Fen Automation Electrical Engineering Company installs power distribution equipment in factories. Each power distribution component has a stock amount, and the company's engineering projects are constructed all over the world. Formerly, its engineers queried the head office about the stock amounts of these components by telephone, and the pronunciations over the telephone easily led to mistaken inquiries. Afterwards, Yang-Fen Company used web pages to perform these query tasks over the Internet.

Unfortunately, some engineers often forget the names of engineering components, which hampers queries of the stock amounts. Yang-Fen Company therefore cooperated with us on an experiment that uses a web-based, real-time way to search the patterns of the component database over the Internet; the search is a recognition task, namely a recognized search. First, we build a pattern for the shape of each component in the component database server: the shape of each component becomes its pattern in the component database. When these patterns are established, the component database also stores the specifications of the components in other fields, as shown in Figure 10.

Figure 10. Relational component pattern database.

These component patterns are shown on the web page of Yang-Fen Company, as in Figure 11.

Figure 11. Component patterns listed on Yang-Fen's web page.

Remote engineers can log in to the server-end homepage of component recognition over the Internet wherever they work. The engineer inputs a self-drawn component pattern, and the system begins recognition after the recognition button is clicked. According to the recognition statistics of Yang-Fen Company from January to April 2002, their engineers were not yet familiar with the operation of the web-based recognition system in January; the recognition rate was therefore low, and the situation improved from February onwards. During the cooperative process, we modified the database of original component patterns frequently; by keeping the component patterns from being too alike, the recognition rate of the system was raised. The numbers of successful and failed recognitions, over all engineer logins to the recognition system from January to April, are listed below.

Table 1. Recognition statistics for the cooperative example

Month      Total recognition times   Correct recognition times   Incorrect recognition times   Recognition ratio
January              232                       163                          69                     70.26%
February             256                       228                          28                     89.06%
March                247                       230                          17                     93.12%
April                269                       262                           7                     97.40%

Figure 12. The cooperative plan analyzed by month (histogram of total, correct, and incorrect recognition times, January to April).

Figure 13. The recognition ratio of the cooperative plan analyzed as a line graph (January 70.26%, February 89.06%, March 93.12%, April 97.40%).

7. Algorithm analysis

Our algorithm is based on the theory of Lippmann [16], with improvements; the newly proposed approaches are included. The approach can easily be implemented in a computer program. The steps of the algorithm for finding the correct pattern in the pattern database are as follows (see the sketch after Step 6).

Step 1: Compute the number of records per cut of the pattern database:

$M = \frac{P}{4\ln P}$, where $P$ is the total number of neurons.

Step 2: Every $M$ records of the pattern database form a segment. All records are divided into the groups $M, 2M, 3M, \ldots, CR_{\max}M$ (where $CR_{\max}$ is the maximum number of cut segments), and the $W$ and $\theta$ values of each segment are computed:

$W_{(M)} = \frac{1}{P} \sum_{K=1}^{M} \xi^{K} (\xi^{K})^{T} - \frac{M}{P} I, \quad \theta_{j(M)} = \sum_{i=1}^{P} W_{ji(M)}, \quad j = 1, \ldots, P$

$W_{(2M)} = \frac{1}{P} \sum_{K=M+1}^{2M} \xi^{K} (\xi^{K})^{T} - \frac{M}{P} I, \quad \theta_{j(2M)} = \sum_{i=1}^{P} W_{ji(2M)}, \quad j = 1, \ldots, P$

$W_{(3M)} = \frac{1}{P} \sum_{K=2M+1}^{3M} \xi^{K} (\xi^{K})^{T} - \frac{M}{P} I, \quad \theta_{j(3M)} = \sum_{i=1}^{P} W_{ji(3M)}, \quad j = 1, \ldots, P$

$\vdots$

$W_{(CR_{\max}M)} = \frac{1}{P} \sum_{K=(CR_{\max}-1)M+1}^{CR_{\max}M} \xi^{K} (\xi^{K})^{T} - \frac{M}{P} I, \quad \theta_{j(CR_{\max}M)} = \sum_{i=1}^{P} W_{ji(CR_{\max}M)}, \quad j = 1, \ldots, P$

Step 3: In the retrieval stage, with $n$ the number of iterations and $X$ the pattern to be recognized, each segment is iterated:

$X_{j(M)}(n+1) = \mathrm{sgn}\left(\sum_{i=1}^{P} W_{ji(M)} X_i(n) - \theta_{j(M)}\right)$

$X_{j(2M)}(n+1) = \mathrm{sgn}\left(\sum_{i=1}^{P} W_{ji(2M)} X_i(n) - \theta_{j(2M)}\right)$

$X_{j(3M)}(n+1) = \mathrm{sgn}\left(\sum_{i=1}^{P} W_{ji(3M)} X_i(n) - \theta_{j(3M)}\right)$

$\vdots$

$X_{j(CR_{\max}M)}(n+1) = \mathrm{sgn}\left(\sum_{i=1}^{P} W_{ji(CR_{\max}M)} X_i(n) - \theta_{j(CR_{\max}M)}\right)$

Step 4: Every convergent $X$ value from Step 3 is matched against its segment of the pattern database to determine the minimum Hamming distance:

$d_{H\min(M)} = \min\left\{ \sum_{i=1}^{P} \left| X_{i(M)}(n+1) - \xi_i^{1} \right|, \; \sum_{i=1}^{P} \left| X_{i(M)}(n+1) - \xi_i^{2} \right|, \; \ldots, \; \sum_{i=1}^{P} \left| X_{i(M)}(n+1) - \xi_i^{M} \right| \right\}$

$d_{H\min(2M)} = \min\left\{ \sum_{i=1}^{P} \left| X_{i(2M)}(n+1) - \xi_i^{M+1} \right|, \; \sum_{i=1}^{P} \left| X_{i(2M)}(n+1) - \xi_i^{M+2} \right|, \; \ldots, \; \sum_{i=1}^{P} \left| X_{i(2M)}(n+1) - \xi_i^{2M} \right| \right\}$

$d_{H\min(3M)} = \min\left\{ \sum_{i=1}^{P} \left| X_{i(3M)}(n+1) - \xi_i^{2M+1} \right|, \; \sum_{i=1}^{P} \left| X_{i(3M)}(n+1) - \xi_i^{2M+2} \right|, \; \ldots, \; \sum_{i=1}^{P} \left| X_{i(3M)}(n+1) - \xi_i^{3M} \right| \right\}$

$\vdots$

$d_{H\min(CR_{\max}M)} = \min\left\{ \sum_{i=1}^{P} \left| X_{i(CR_{\max}M)}(n+1) - \xi_i^{(CR_{\max}-1)M+1} \right|, \; \ldots, \; \sum_{i=1}^{P} \left| X_{i(CR_{\max}M)}(n+1) - \xi_i^{CR_{\max}M} \right| \right\}$

Step 5: The $d_{H\min}$ determined in Step 4 specifies, for each segment, the stored pattern $\xi$ most similar to $X$. These patterns are combined as new pattern records, and Step 2 is revisited and repeated until the number of records remaining after Step 5 equals one.

Step 6: Finally, the remaining sample pattern $\xi$ is the correctly recognized pattern.

The recognition method presented here is new, and a web page with a pattern recognition function can easily be written with it.
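
As a whole, Steps 1-6 might be sketched as follows. This illustration combines the outer-product storage, synchronous recall, and Hamming-distance matching shown earlier; it is a sketch under those assumptions, not the authors' implementation.

```python
import math
import numpy as np

def _recall(W, theta, probe, max_iters=500):
    """Synchronous recall sketch of Eq. (11), iterated until the state stops changing."""
    X = probe.copy()
    for _ in range(max_iters):
        u = W @ X - theta
        new_X = np.where(u > 0, 1, np.where(u < 0, -1, X))
        if np.array_equal(new_X, X):
            break
        X = new_X
    return X

def recognize(records, probe):
    """Steps 1-6: find the stored pattern most similar to `probe` by segmented recall.

    records: (n, P) array of {+1, -1} stored patterns; probe: length-P {+1, -1} vector.
    """
    P = records.shape[1]
    M = max(1, math.floor(P / (4 * math.log(P))))    # Step 1: records per segment (Eq. 23)
    candidates = records
    while len(candidates) > 1:                        # Step 5: repeat until one record remains
        survivors = []
        for k in range(0, len(candidates), M):        # Step 2: cut into segments of M records
            seg = candidates[k:k + M]
            W = (seg.T @ seg) / P                     # outer-product storage, Eq. (4)
            np.fill_diagonal(W, 0.0)                  # Eq. (5)
            theta = W.sum(axis=0)                     # threshold of Eq. (10)
            X = _recall(W, theta, probe)              # Step 3: recall within the segment
            d = np.abs(seg - X).sum(axis=1)           # Step 4: Hamming distances, Eq. (13)
            survivors.append(seg[np.argmin(d)])       # keep the most similar record per segment
        if len(survivors) == len(candidates):         # no further reduction: stop
            break
        candidates = np.array(survivors)
    return candidates[0]                              # Step 6: the recognized pattern
```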

8. Conclusions and Future Work

The application of component recognition on the Internet is not yet mature. This study uses a real-time, web-based method to recognize patterns within the structure of the Internet. A pattern database overcomes many defects in recognition technology. This work provides three new contributions:

1. Using a pattern database to establish the learning patterns solves the capacity problem of the RNN.
2. Adopting pattern-database matching to determine the most similar patterns reduces the prevalence of the spurious states of the RNN and correspondingly raises the recognition rate of the neural network.
3. The web-based approach works in real time over the Internet: any user can use a browser to connect to the server end, and the user can input a training pattern and recognize a source pattern at once.

Many recognition programs must be run on a local machine, and such programs are limited to particular operating systems; transplanting them to the Internet can cause difficulties with the Common Gateway Interface (CGI). The program presented here is built in the web-server environment, and its performance suffers no delay because the system learns and recognizes in real time. The recognition system is managed by the back-end database system: after the user logs in to the system, all pattern data are stored in the database. This method is new and can ensure the completeness and security of the pattern data.

With further development, the recognition system could be widely applied to electronic commerce (EC). If the server end is a bank, an autograph (as a sample pattern) can be remotely registered from a home or office, and the signed pattern would be recognized at the server end of the bank. People on the network would be able to buy more securely, and electronic commerce would thus be further promoted.

9. Acknowledgment

The authors would like to thank the National Central University of Taiwan for financially supporting this research.

10. References

[1] S. Singh, "A Long Memory Pattern Modeling and Recognition System for Financial Forecasting," Pattern Analysis and Applications, vol. 2, no. 3, 1999, pp. 264-273.
[2] S. Kak, "Better Web Searches and Prediction with Instantaneously Trained Neural Networks," IEEE Intelligent Systems, vol. 14, no. 6, 1999, pp. 78-81.
[3] R. P. W. Duin, "Superlearning and neural network magic," Pattern Recognition Letters, vol. 15, 1994, pp. 215-217.
[4] M. A. Kraaijveld and R. P. W. Duin, "The effective capacity of multilayer feedforward network classifiers," Proc. 12th Int'l Conf. on Pattern Recognition (ICPR 94), Israel, vol. B, 1994, pp. 99-103.
[5] Z. Tan and M. K. Ali, "Pattern recognition with stochastic resonance in a generic neural network," International Journal of Modern Physics C, vol. 11, no. 8, 2000, pp. 1585-1593.
[6] M. Perus, "Neural networks as a basis for quantum associative networks," Neural Network World, vol. 10, no. 6, 2000, pp. 1001-1013.
[7] R. K. Brouwer, "An Integer Recurrent Artificial Neural Network for Classifying Feature Vectors," International Journal of Pattern Recognition and Artificial Intelligence, vol. 14, no. 3, 2000, pp. 339-335.
[8] R. K. Brouwer, "A Fuzzy Recurrent Artificial Neural Network for Pattern Classification," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol. 8, no. 5, 2000, pp. 525-538.
[9] Y. Kamp and M. Hasler, Recursive Neural Networks for Associative Memory, Wiley-Interscience Series in Systems and Optimization, England, 1990, pp. 10-34.
[10] V. Gimenez, L. Aslanyan, J. Catellanos, and V. Ryazanov, "Distribution Functions as Attractors for Recurrent Neural Networks," Pattern Recognition and Image Analysis, vol. 11, no. 3, 2001, pp. 492-497.
[11] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proc. National Academy of Sciences, USA, vol. 79, 1982, pp. 2554-2558.
[12] S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed., Macmillan College Publishing Company, Inc., New York, 1999.
[13] J. J. Hopfield and D. W. Tank, "Computing with neural circuits: a model," Science, vol. 233, 1986, pp. 625-633.
[14] B. Mueller, J. Reinhardt, and M. T. Strickland, Neural Networks, Springer-Verlag, Berlin Heidelberg, 1995.

[15] J. M. Zurada, Artificial Neural Systems, West Publishing, St. Paul, MN, 1992.
[16] R. P. Lippmann, "An Introduction to Computing with Neural Nets," IEEE ASSP Magazine, 1987, pp. 4-22; also reprinted in Neural Networks: Theoretical Foundations and Analysis, edited by C. Lau, IEEE Press, New York, 1992, pp. 5-23.
[17] W. A. Little and G. L. Shaw, "Analytical study of the memory storage capacity of a neural network," Mathematical Biosciences, vol. 39, no. 1, 1978, pp. 281-290.
[18] S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan College Publishing Company, Inc., New York, 1994.
[19] J. Ma, "A Neural Network Approach to Real-Time Pattern Recognition," International Journal of Pattern Recognition and Artificial Intelligence, vol. 15, no. 6, 2001, pp. 934-947.
[20] J. W. Ma, "The stability of the generalized Hopfield networks in randomly asynchronous mode," Neural Networks, vol. 10, no. 6, 1997, pp. 1109-1116.
[21] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, "The capacity of the Hopfield associative memory," IEEE Trans. Information Theory, vol. 33, no. 2, 1987, pp. 461-483.
[22] L. F. Abbott and T. B. Kepler, "Optimal learning in neural network memories," J. Phys. A: Math. General, vol. 22, 1989, pp. 711-717.
[23] S. S. Venkatesh and D. Psaltis, "Linear and logarithmic capacities in associative memory," IEEE Trans. Information Theory, vol. 35, 1989, pp. 558-568.
[24] D. J. Amit, Modeling Brain Function: The World of Attractor Neural Networks, Cambridge University Press, New York, 1989.
[25] D. E. Culler, J. P. Singh, and A. Gupta, Parallel Computer Architecture: A Hardware/Software Approach, Morgan Kaufmann Publishers, 1999.
