CURRICULUM VITAE

George Buyanovsky
Date of Birth: September 28, 1964
Address: CIS, Republic of Kazakhstan, Almaty, 480063, Microregion Zhetisu-2, 56-9
Internet E-mail: george@acb.alma-ata.su
Fax: 7-3272-623856
Tel: 7-3272-271317

Education:
1) Almaty Polytechnical College, Diploma, 1982-1987
2) Secondary School, Certificate, 1972-1982

Working experience:

November 1993 - present
    Joint-Stock Company KRAMDS-INFORMATION SYSTEMS, chief specialist of the
    Department of Perspective Developments: data compression, neural-network
    modelling for forecasting the behaviour of multifactor systems, image
    recognition; author of original algorithms and programs.

September 1991 - November 1993
    Joint Venture "SovAvstralTechnica", system programmer: data compression,
    forecasting of dynamic series, stereographics, local networks.

November 1989 - September 1991
    Almaty Subsidiary of Joint Venture "Interquadro",
    mathematician-programmer: problems of discrete optimization, the location
    problem and the covering problem.

September 1987 - November 1989
    Kalchugin Plant of Non-Ferrous Metals, engineer-programmer of the first
    category.

Data Compression - ACB compressor (1994-1995): developed a new data
compression algorithm, the ACB compressor (Associative Coder of Buyanovsky),
and implemented it as the general-purpose archiver ACB.EXE. ACB archives are
10-95% smaller than the corresponding ZIP archives (PKZIP V2.04g). The
algorithm works on a data stream, which makes it suitable for communication
purposes and for hardware implementation. Please see ACB.txt for detailed
information on the ACB compressor and the program ACB.EXE Ver_1.13b.

Forecasting of Dynamic Series (1994): the algorithm is based on an
associative memory in which the forecast is made by a "funnel of analogies".
The funnel of analogies is built on the data up to the 7th exponent, with
rounding on each of the series of discretization levels (256, 128, 64, 32,
16, 8, 4, 2). The program is written for Windows 3.1 and includes a
generator of dynamic series for testing. Its effectiveness was compared with
the "Mezozavr" package of Aivazyan's group on exchange rates relative to the
Dutch krone; a forecast was counted as successful if the subsequent fall or
rise was predicted correctly. "Mezozavr" scored 52/48, the tested algorithm
55/45 per cent.

I also developed an algorithm for fast sorting of bit vectors with time
complexity 3N + o(1) on "white noise" data.

The two algorithms above are based on my article "The method of research of
pseudostochastic systems", published in English in 1993 in the annals of the
Institute of High Energy Physics of the Academy of Sciences of the Republic
of Kazakhstan, and the article "Associative coding" in issue 8 of the
magazine "Monitor", 1994.

Neural-Network Modelling of Multifactor Systems (1994): a program for
forecasting the behaviour of multifactor systems, with the possibility of
revealing mutual causal relations between factors. The weight input
coefficients of the neurons can be obtained from a training selection of no
fewer than n+2 samples, where n is the number of factors of the modelled
system. The time complexity of calculating the weight coefficients is
P(o) = (4/3)*n^3 (a sketch of such a calculation is given below). After
normalization, the weight input coefficients of the neurons can be
interpreted as coefficients of causation and used to analyse the logical
structure of the system.
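As a minimal sketch of the kind of calculation implied above (not the
original program; the neuron layout, the way the equations are built from
samples, and all names and toy data below are my own assumptions for
illustration): if each factor is driven by one linear neuron with n input
weights plus a bias, the weights can be fitted by solving a square linear
system assembled from the training samples, and Gaussian elimination gives
a cubic operation count consistent with the P(o) = (4/3)*n^3 quoted above.

    /* Hypothetical illustration: fit the weights and bias of one linear
     * neuron by solving the square system A*w = b with Gaussian
     * elimination; the elimination loops are what give the ~n^3 cost. */
    #include <math.h>
    #include <stdio.h>

    /* Solve A*w = b (A is m x m, row-major) in place; returns 0 on success. */
    static int solve(double *A, double *b, double *w, int m)
    {
        int i, j, k;
        for (k = 0; k < m; k++) {
            int p = k;                               /* partial pivoting */
            for (i = k + 1; i < m; i++)
                if (fabs(A[i*m + k]) > fabs(A[p*m + k])) p = i;
            if (A[p*m + k] == 0.0) return -1;        /* singular system */
            for (j = 0; j < m; j++) {                /* swap rows k and p */
                double t = A[k*m + j]; A[k*m + j] = A[p*m + j]; A[p*m + j] = t;
            }
            { double t = b[k]; b[k] = b[p]; b[p] = t; }
            for (i = k + 1; i < m; i++) {            /* eliminate column k */
                double f = A[i*m + k] / A[k*m + k];
                for (j = k; j < m; j++) A[i*m + j] -= f * A[k*m + j];
                b[i] -= f * b[k];
            }
        }
        for (i = m - 1; i >= 0; i--) {               /* back substitution */
            double s = b[i];
            for (j = i + 1; j < m; j++) s -= A[i*m + j] * w[j];
            w[i] = s / A[i*m + i];
        }
        return 0;
    }

    int main(void)
    {
        /* Toy case: 2 factors plus a bias, three equations from samples. */
        double A[9] = { 1.0, 2.0, 1.0,
                        2.0, 1.0, 1.0,
                        1.0, 1.0, 1.0 };
        double b[3] = { 0.9, 0.7, 0.6 };
        double w[3];
        if (solve(A, b, w, 3) == 0)
            printf("weights = %.3f %.3f  bias = %.3f\n", w[0], w[1], w[2]);
        return 0;
    }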
Once the model (the weight coefficients) has been calculated, it can be run
in self-development mode from any starting condition of the system ("what
happens if..."). The program runs under MS Windows 3.1 (Win32s), exchanging
input and output data through the Clipboard or through text files, and it
supports the DDE interface, so the initial data can be prepared in Excel,
Lotus, etc. On a computer with an i486-33 MHz processor and 8 MB of RAM it
is possible to model systems of up to 1000 factors, with 20-30 minutes to
calculate the model and 3-6 seconds per forecast step.

Image Recognition (1992-1994): a program that recognizes which class of a
tree of classes a graphic image belongs to. The algorithm builds a tree of
logical statements in a space of indicators obtained from Fourier images at
fractal orders. The peculiarity of the algorithm: during global teaching a
hierarchical neural network N1 is built for all classes; during recognition,
networks N2, N3, ... are built on the subsets of candidates produced by N1,
and so on, until a one-element set is obtained (a sketch of this descent is
given at the end of this section). Only N1 is stored on the hard disk. The
program works under Windows 3.1+ in two regimes:

expert - an interactive regime for extending the training sequence, teaching
the system, and recognizing the classification of new images for subordinate
lists of classes;

user's - an applied program places a graphic image into the Windows
clipboard and reads back the answer (the route through the tree of classes),
which allows existing database management systems to be extended to search
for information by a key that is a graphical image.

Possibilities:
1) Maximal number of classes in one node of the tree of classes: 1200.
2) Depth of the tree of classes is restricted only by the capacity of the
   hard disk.
3) Graphic images are black and white.
4) Number of pixels in the map of a graphical image: not more than 65536.
5) Number of pixels forming an image: not more than 8192.

Time complexity of the recognition algorithm, in integer operations:
a) teaching:    N^2*M*K*log2(K) + o(1);
b) recognition: N*log2(K) + C*N'^2*M*K.
Memory expenditure (bytes): RAM - 4*N*M*K + 2*N*K; HD - 8*N*M*K + o(1),
where N is the number of classes and N' is the subset of the N candidates
(N' < N).
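The following is a minimal sketch of the recognition descent described
above, under my own assumptions rather than the original code: each node of
the tree of classes carries a scoring function standing in for its network
(N1 at the root, N2, N3, ... further down), the best-scoring child is
entered at every node until a leaf is reached, and the route of class names
is the answer. All type and function names here are hypothetical.

    /* Hypothetical illustration of recognition as a descent through the
     * tree of classes; not the original program. */
    #include <stdio.h>

    #define MAX_DEPTH 64   /* depth is limited only by disk in the original */

    struct class_node {
        const char          *name;
        int                  nchildren;     /* up to 1200 in the original */
        struct class_node  **children;
        double             (*score)(const unsigned char *image, int child);
    };

    /* Descend from `node`, storing the chosen class names in `route`;
     * returns the depth reached (length of the route). */
    static int recognize(const struct class_node *node,
                         const unsigned char *image,
                         const char *route[], int max_depth)
    {
        int depth = 0;
        while (node->nchildren > 0 && depth < max_depth) {
            int best = 0, k;
            double best_score = node->score(image, 0);
            for (k = 1; k < node->nchildren; k++) {
                double s = node->score(image, k);
                if (s > best_score) { best_score = s; best = k; }
            }
            route[depth++] = node->children[best]->name;
            node = node->children[best];
        }
        return depth;
    }

    /* Dummy scorer for the demo: always prefers the first child. */
    static double dummy_score(const unsigned char *image, int child)
    {
        (void)image;
        return child == 0 ? 1.0 : 0.0;
    }

    int main(void)
    {
        struct class_node  leaf_a = { "class-A", 0, NULL, NULL };
        struct class_node  leaf_b = { "class-B", 0, NULL, NULL };
        struct class_node *kids[] = { &leaf_a, &leaf_b };
        struct class_node  root   = { "root", 2, kids, dummy_score };
        const char *route[MAX_DEPTH];
        int i, depth = recognize(&root, NULL, route, MAX_DEPTH);
        for (i = 0; i < depth; i++)
            printf("%s%s", route[i], i + 1 < depth ? " / " : "\n");
        return 0;
    }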