Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
 2933; Tue, 28 Sep 93 13:00:39 EDT
Received: from noc4.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
 TCP; Tue, 28 Sep 93 13:00:35 EDT
Received: from CATTELL.PSYCH.UPENN.EDU by noc4.dccs.upenn.edu
	id AA29332; Tue, 28 Sep 93 12:39:12 -0400
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
	id AA04542; Tue, 28 Sep 93 11:39:09 EDT
Posted-Date: Tue, 28 Sep 93 11:38:21 -0400
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #4 (jobs, software, queries, etc.)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Tue, 28 Sep 93 11:38:21 -0400
Message-Id: <4523.749230701@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu

Neuron Digest   Tuesday, 28 Sep 1993
                Volume 12 : Issue 4

Today's Topics:
             Adaptive Simulated Annealing (ASA) version 1.43
                  Journal of Computational Neuroscience
                  Proceedings of the 1993 NNSP Workshop
                             Chaos and NN's
                        Free simulation software
                 CMU Learning Benchmark Database Updated
      postdoctoral fellowship opportunity for women and minorities
                 Job vacancy in evolutionary algorithms
             'Fractal Extrapolation', is there such a thing?


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Adaptive Simulated Annealing (ASA) version 1.43
From:    Lester Ingber <ingber@alumni.cco.caltech.edu>
Date:    Fri, 17 Sep 93 09:22:05 -0800

========================================================================
                Adaptive Simulated Annealing (ASA) version 1.43

To get on or off the ASA email list, just send an email to
asa-request@alumni.caltech.edu with your request.
________________________________________________________________________
                        Significant CHANGES since 1.34

Remarks were added in the NOTES for HP support, and for Turbo C, Turbo
C++, and MS Quick C on PCs.

Another option was added, the ASA_PRINT_MORE Printing Option, to give
more intermediate printout, i.e., the values of the best parameters and
cost function at each new acceptance state.
________________________________________________________________________
                        Wall Street Journal

I have been told that the WSJ will mention the world-wide use of the
ASA code in an article to appear soon.  I gave examples of some
projects using ASA, but I insisted that the relevant people be
contacted before being cited; see the related comment in General
Information below.  Of course, the press has the last word on what it
publishes and how it interprets.
________________________________________________________________________
                        FORTRAN?

I regularly receive requests to run ASA with FORTRAN.  I cannot
maintain both a C and a FORTRAN code, but there does seem to be a
genuine need to interface ASA with FORTRAN.  The NOTES offer a couple
of suggestions: (1) Use f2c on FORTRAN programs; I have done this and
it works very well.  (2) Try CFORTRAN to interface the C and FORTRAN
codes: (a) call FORTRAN from ASA, e.g., from cost_function(), call a
FORTRAN function that performs the actual calculation of the cost
function; (b) call ASA from FORTRAN, e.g., using the ADAPTING section
in the NOTES as a guide, call asa_main() from a FORTRAN function.  Can
someone prepare templates for (a) and/or (b)?  This probably isn't
easy to prepare for public release; about half a dozen people have
started, but none completed, such a project.
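For option (a), a minimal sketch of the C side might look like the
following.  The Fortran routine name COSTFN, the f77 trailing-underscore
calling convention, and the quadratic stand-in cost are all illustrative
assumptions, not part of the ASA distribution:

```c
#include <stdio.h>

/* Stand-in for the compiled Fortran routine -- illustrative only.  In a
   real build this symbol would come from an f77 object file compiled
   from, e.g.:  SUBROUTINE COSTFN(X, N, F).  Most f77 compilers lowercase
   the name, append an underscore, and pass all arguments by reference. */
void costfn_(double *x, int *n, double *f)
{
    double s = 0.0;
    int i;
    for (i = 0; i < *n; i++)
        s += x[i] * x[i];   /* a simple quadratic cost, for the sketch */
    *f = s;
}

/* The C side: a cost_function() that delegates the actual calculation
   to the Fortran routine, as suggested in (a) above. */
double cost_function(double *x, int n)
{
    double f;
    costfn_(x, &n, &f);
    return f;
}
```

The real ASA cost_function() takes more arguments; this sketch shows
only the cross-language call convention.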
________________________________________________________________________
                        sa_pvt93.ps.Z

The new reference for this preprint in ftp.caltech.edu:pub/ingber is
        %A L. Ingber
        %T Simulated annealing: Practice versus theory
        %J Mathl. Comput. Modelling
        %V
        %N
        %D 1993
        %P (to be published)
As announced previously, this is a much expanded version of the
original draft, e.g., including new ideas and calculations regarding
"quenching." In the acknowledgements, I give a sincere thanks to the
many users who read parts of previous drafts and who sent me their own
(p)reprints on simulated annealing.  I'd be interested in hearing about
any systems that find the QUENCHing options as useful (lucky?) as the
ASA test problem did in this paper.
________________________________________________________________________
                        General Information

The latest Adaptive Simulated Annealing (ASA) code and some related
(p)reprints in compressed PostScript format can be retrieved via
anonymous ftp from ftp.caltech.edu [131.215.48.151] in the pub/ingber
directory.

Interactively: ftp ftp.caltech.edu, [Name:] anonymous, [Password:]
your_email_address, cd pub/ingber, binary, ls or dir, get
file_of_interest, quit.  The latest version of ASA is asa-x.y.Z (x and
y are version numbers), linked to asa.Z.  For the convenience of users
who do not have an uncompress utility, there is a file asa which is an
uncompressed copy of asa-x.y.Z/asa.Z; if you do not have sh or unshar,
you can still delete the first-column X's and separate the files at the
END_OF_FILE locations.  There are patches asa-diff-x1.y1-x2.y2.Z up to
the present version; these may be concatenated as required before
applying.  The INDEX file contains an index of the other files.

If you do not have ftp access, get information on the FTPmail service
by: mail ftpmail@decwrl.dec.com, and send only the word "help" in the
body of the message.

If any of the above are not possible, and if your mailer can handle
large files (please test this first), the code or papers you require
can be sent as uuencoded compressed files via electronic mail.  If you
have gzip (which yields smaller files), please say so.

Sorry, I cannot assume the task of mailing out hardcopies of code or
papers.

People willing to be contacted by others interested in their ASA
projects should keep me informed of (1) the title and/or a short
description of their project, and (2) whether I have permission to
release their names along with the project description.

Lester
========================================================================

|| Prof. Lester Ingber                                                ||
|| Lester Ingber Research                                             ||
|| P.O. Box 857                      EMail: ingber@alumni.caltech.edu ||
|| McLean, VA  22101             Archive: ftp.caltech.edu:/pub/ingber ||


------------------------------

Subject: Journal of Computational Neuroscience
From:    Jim Bower <jbower@smaug.bbb.caltech.edu>
Date:    Fri, 17 Sep 93 09:55:00 -0800


*******************************************************************

           JOURNAL OF COMPUTATIONAL NEUROSCIENCE
*******************************************************************

 From neurons to behavior:  A Journal at the interface between
        experimental and theoretical neuroscience.


                     MANAGING EDITORS:

   James M. Bower                           Eve Marder
California Institute of                 Brandeis University
     Technology

    John Miller                             John Rinzel
University of California,           National Institutes of Health
     Berkeley

     Idan Segev                             Charles Wilson
  Hebrew University                    University of Tennessee,
                                               Memphis



ACTION EDITORS:

L. F.  Abbott, Brandeis University
Richard Andersen, Massachusetts Inst. of Technology
Alexander Borst, Max-Planck Inst., Tubingen
Robert  E. Burke, NINDS, NIH
Catherine Carr, Univ. of Maryland, College Park
Rodney Douglas, Oxford University
G. Bard Ermentrout, University of Pittsburgh
Apostolos Georgopoulos, VA Medical Center, MN
Charles Gray, University of California, Davis
Christof Koch, California Institute of Technology
Gilles Laurent, California Institute of Technology
David McCormick, Yale University
Ken Miller, University of California, San Francisco
Steve Redman, Australian National University
Barry Richmond, NIMH, NIH
Terry Sejnowski, Salk Institute
Shihab Shamma, Univ. of Maryland, College Park
Karen Sigvardt, University of California, Davis
David Tank, Bell Labs
Roger Traub, IBM TJ Watson Research Center
Thelma Williams, University of London


JOURNAL DESCRIPTION:

The JOURNAL OF COMPUTATIONAL NEUROSCIENCE is intended to provide a
forum for papers at the interface between computational and
experimental work in the neurosciences. The JOURNAL OF
COMPUTATIONAL NEUROSCIENCE will publish full length original papers
describing theoretical and experimental work relevant to
computations in the brain and nervous system.  Papers that combine
theoretical and experimental work are especially encouraged.
Primarily theoretical papers should deal with issues of obvious
relevance to biological nervous systems.  Experimental papers
should have implications for the computational function of the
nervous system, and may report results using any of a variety of
approaches including anatomy, electrophysiology, biophysics,
imaging, and molecular biology.   Papers that report novel
technologies of interest to researchers in computational
neuroscience are also welcomed.  It is anticipated that all levels
of analysis from cognitive to single neuron will be represented in
THE JOURNAL OF COMPUTATIONAL NEUROSCIENCE.

*****************************************************************
                           CALL FOR PAPERS
*****************************************************************

For Instructions to Authors, please contact:

             Karen Cullen
             Journal of Computational Neuroscience
             Kluwer Academic Publishers
             101 Philip Drive
             Norwell, MA  02061

             PH: 617 871 6300
             FX: 617 878 0449
             EM: Karen@world.std.com
*****************************************************************

*****************************************************************
                        ORDERING INFORMATION:

For complete ordering information and subscription rates,
please contact:

KLUWER ACADEMIC PUBLISHERS
PH: 617 871 6600
FX: 617 871 6528
EM: Kluwer@world.std.com

JOURNAL OF COMPUTATIONAL NEUROSCIENCE
ISSN: 0929-5313
*****************************************************************



------------------------------

Subject: Proceedings of the 1993 NNSP Workshop
From:    Raymond L Watrous <watrous@learning.siemens.com>
Date:    Mon, 20 Sep 93 15:06:34 -0500


The 1993 IEEE Workshop on Neural Networks for Signal Processing was
held September 6-9, 1993, at the Maritime Institute of Technology and
Graduate Studies, Linthicum Heights, Maryland, USA.

Copies of the 593-page, hardbound Proceedings of the workshop may be
obtained for $50 (US, check or money order, please) postpaid from:

        Raymond Watrous, Financial Chair

        1993 IEEE Workshop on Neural Networks for Signal Processing
        c/o Siemens Corporate Research
        755 College Road East
        Princeton, NJ 08540

        (609) 734-6596
        (609) 734-6565 (FAX)




------------------------------

Subject: Chaos and NN's
From:    epperson@evolve.win.net (Mark Epperson)
Date:    Tue, 21 Sep 93 04:18:10

I am looking for references/papers on using NN's and chaos theory
for filtering noisy signals.

Thanks in advance,
Mark Epperson

------------------------------

Subject: Free simulation software
From:    Bard Ermentrout <bard@mthbard.math.pitt.edu>
Date:    Thu, 23 Sep 93 10:41:48 -0500

        F R E E    S I M U L A T I O N   S O F T W A R E

I thought modellers and others might be interested to know of the
availability of free software for the analysis and simulation of
dynamical and probabilistic phenomena.  xpp is available free via
anonymous ftp.  It solves integro-differential equations, delay
equations, and iterative equations, all of which can be combined with
probabilistic models.  PostScript output is supported.  A variety of
numerical methods are employed so that the user can generally be sure
that the solutions are accurate.  Examples are connectionist-type
neural nets, biophysical models, models with memory, and models of
cells with random inputs or random transitions.  A graphical interface
using X windows and numerous plotting options are provided.  The
requirements are a C compiler and an OS capable of running X11.  The
software has been successfully compiled on DEC, HP, SUN, IBM, and NeXT
workstations as well as on a PC running Linux.  Once it is compiled,
no further compilation is necessary, as the program can read algebraic
expressions and interpret them in order to solve them.  The program
has been used in various guises for the last 5 years by a variety of
mathematicians, physicists, and biologists.  To get it, follow the
instructions below:

- ------------Installing XPP1.6--------------------------------

 XPP is pretty simple to install, although you might have to add a
line here and there to the Makefile.  You can get it from
            mthsn4.math.pitt.edu (130.49.12.1)
Here is how:

   ftp 130.49.12.1
   cd /pub
   bin
   get xpp1.6.tar.Z
   quit
   uncompress xpp1.6.tar.Z
   tar xf xpp1.6.tar
   make -k

If you get errors in the compilation, it is likely to be one
of the following:
1) gcc not found, in which case you should edit the Makefile
   so that it says  CC= cc
2) Can't find the X include files.  Then edit the line that says
    CFLAGS= ....
    by adding
            -I<pathname>
where <pathname> is where the include files are for X, e.g.,
     -I/usr/X11/include
3) Can't find the X libraries.  Then add a line
LDFLAGS= -L<pathname>
right after the CFLAGS= line, where <pathname> is where to find the X11
libraries, then change this line:
$(CC) -o xpp $(OBJECTS) $(LIBS)
to this line:
$(CC) -o xpp $(OBJECTS) $(LDFLAGS) $(LIBS)

That should do it!!

If it still doesn't compile, then you should ask your sysadmin about
the proper paths.

Finally, some compilers have trouble with the GEAR algorithm when
optimization is enabled, so you should remove the optimization flags,
i.e., replace
CFLAGS=  -O2 -I<your pathnames>
with
CFLAGS=   -I<your pathnames>

then delete all the .o files and recompile.
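Taken together, the fixes above amount to a Makefile whose relevant
lines look something like the sketch below; the X11 paths are
hypothetical examples and must be replaced with wherever X actually
lives on your system:

```make
# Illustrative fragment only -- adjust the paths for your system.
CC      = cc                     # fix (1): use cc if gcc is not found
CFLAGS  = -I/usr/X11/include     # fix (2): X include path (no -O2, per the GEAR note)
LDFLAGS = -L/usr/X11/lib         # fix (3): X library path

xpp: $(OBJECTS)
	$(CC) -o xpp $(OBJECTS) $(LDFLAGS) $(LIBS)
```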

Good luck!
   Bard Ermentrout


  Send comments and bug reports to
     bard@mthbard.math.pitt.edu



------------------------------

Subject: CMU Learning Benchmark Database Updated
From:    Matthew.White@cs.cmu.edu
Date:    Fri, 24 Sep 93 03:15:48 -0500

The CMU Learning Benchmark Archive has been updated.  As you may know,
the data sets in this collection were previously in varying formats,
requiring that parsing code be written for each one.  This was a waste
of everybody's time.  Those old data sets have been replaced with data
sets in a standardized format.  Now, each benchmark consists of a file
detailing the benchmark and another file that is either a data set
(.data) or a program to generate the appropriate data set (.c).

Data sets currently available are:
        nettalk         Pronunciation of English words.
        parity          N-input parity.
        protein         Prediction of secondary structure of proteins.
        sonar           Classification of sonar signals.
        two-spirals     Distinction of a twin spiral pattern.
        vowel           Speaker-independent recognition of vowels.
        xor             Traditional xor.
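For a flavor of what a generated benchmark looks like, N-input parity
is simple enough to sketch in a few lines of C.  This is purely
illustrative; it is not the CMU generator and does not emit the CMU
data-file format:

```c
#include <stdio.h>

/* Illustrative N-input parity generator -- NOT the CMU .c program, and
   not in the CMU data-file format.  Emits every 2^n input pattern with
   its target: 1 if an odd number of inputs are on, else 0. */
int parity(unsigned pattern)
{
    int bits = 0;
    while (pattern) {
        bits += pattern & 1u;
        pattern >>= 1;
    }
    return bits & 1;
}

void print_parity_set(int n)
{
    unsigned p;
    int i;
    for (p = 0; p < (1u << n); p++) {
        for (i = n - 1; i >= 0; i--)
            printf("%u ", (p >> i) & 1u);
        printf("-> %d\n", parity(p));
    }
}
```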


Accompanying the new data file format are a file describing the format
and a C library to parse it.  In addition, the simulator (C version)
for Cascade-Correlation has been rewritten to use the new file format.
Both the parsing code and the Cascade-Correlation code are distributed
as compressed shell archives and should compile with any
ANSI/ISO-compatible C compiler.

Code currently available:
        nevprop1.16.shar        A user-friendly version of quickprop.
        cascor1a.shar           The re-engineered version of the Cascade-
                                Correlation algorithm.
        parse1.shar             C code for parsing the new data set
                                format.
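For context, quickprop updates each weight by fitting a parabola
through the two most recent error slopes for that weight.  The sketch
below is a hedged per-weight illustration; MU and EPS are illustrative
constants, and the real simulators above carry additional terms (weight
decay, sign tests, and so on):

```c
#include <math.h>

/* Sketch of Fahlman's quickprop per-weight update: the next step is
   prev_step * S / (S_prev - S), the minimum of a parabola fit through
   the two most recent error slopes, clipped so no step grows by more
   than MU times the previous one.  MU and EPS are illustrative values. */
#define MU  1.75    /* maximum growth factor */
#define EPS 0.1     /* fallback gradient-descent rate */

double quickprop_step(double slope, double prev_slope, double prev_step)
{
    double denom = prev_slope - slope;
    double step;

    if (prev_step == 0.0 || denom == 0.0)   /* first step, or flat fit: */
        return -EPS * slope;                /* plain gradient descent   */

    step = prev_step * slope / denom;
    if (fabs(step) > MU * fabs(prev_step))  /* clip runaway parabola jumps */
        step = (step > 0 ? 1.0 : -1.0) * MU * fabs(prev_step);
    return step;
}
```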

Data sets and code are available via anonymous FTP.  Instructions follow.

If you have difficulties with either the data sets or the programs,
please send mail to: neural-bench@cs.cmu.edu.  Any comments or
suggestions should also be sent to that address.  Please don't hold
back questions; they are our single best way to spot places where our
methods can improve.

If you would like to submit a data set to the CMU Learning Benchmark
Archive, send email to neural-bench@cs.cmu.edu.  All data sets should be
in the CMU data file format.  If you have difficulty converting your data
file, contact us for assistance.


Matt White
Maintainer, CMU Learning Benchmark Archive

- --------------------------------------------------------------------

Directions for FTPing datasets:

For people whose systems support AFS, you can access the files directly
from directory "/afs/cs.cmu.edu/project/connect/bench".

For people accessing these files via FTP:

1. Create an FTP connection from wherever you are to machine
   "ftp.cs.cmu.edu". The internet address of this machine is
   128.2.206.173, for those who need it.

2. Log in as user "anonymous" with your own internet address as password.
   You may see an error message that says "filenames may not have /.. in
   them" or something like that.  Just ignore it.

3. Change remote directory to "/afs/cs/project/connect/bench".  NOTE: you
   must do this in a single atomic operation.  Some of the super
   directories on this path are not accessible to outside users.

4. At this point the "dir" command in FTP should give you a listing of
   files in this directory.  Use get or mget to fetch the ones you want.
   If you want to access a compressed file (with suffix .Z) be sure to
   give the "binary" command before doing the "get".  (Some versions of
   FTP use different names for these operations -- consult your local
   system maintainer if you have trouble with this.)

5. The directory "/afs/cs/project/connect/code" contains public-domain
   programs implementing the Quickprop and Cascade-Correlation
   algorithms, among other things.  Access it in the same way.




------------------------------

Subject: postdoctoral fellowship opportunity for women and minorities
From:    Ken Miller <ken@phy.ucsf.edu>
Date:    Mon, 27 Sep 93 03:13:58 -0800

The University of California annually awards 20 or more postdoctoral
fellowships to women and minorities under the "President's
Postdoctoral Fellowship Program".  Fellowships are awarded to work
with a faculty member at any of the nine UC campuses or at one of the
three national laboratories associated with UC (Lawrence Berkeley,
Lawrence Livermore, and Los Alamos).  Fellowships pay
$26,000-$27,000/year, plus health benefits and $4000/year for research
and travel.
Applicants must be citizens or permanent residents of the United
States, and should anticipate completion of their Ph.D.'s by July 1,
1994.  For this year's competition, DEADLINE FOR APPLICATION IS
DECEMBER 14, 1993.

There are many of us who work in computational neuroscience or
connectionism in the UC system or the national labs.  I would
encourage anyone eligible to make use of this opportunity to obtain
funding to work with one of us.  In particular, I encourage anyone
interested in computational neuroscience to contact me to further
discuss my own research program and the research opportunities in
computational and systems neuroscience at UCSF.

To receive a fellowship application and further information, contact:

        President's Postdoctoral Fellowship Program
        Office of the President
        University of California
        300 Lakeside Drive, 18th Floor
        Oakland, CA 94612-3550
        Phone: 510-987-9500 or 987-9503

Ken Miller

        Kenneth D. Miller               telephone: (415) 476-8217
        Dept. of Physiology             internet: ken@phy.ucsf.edu
        University of California,       fax: (415) 476-4929
                   San Francisco
        513 Parnassus
        San Francisco, CA 94143-0444
        [Office: S-859]


------------------------------

Subject: Job vacancy in evolutionary algorithms
From:    S.FLOCKTON@rhbnc.ac.uk
Date:    Mon, 27 Sep 93 12:41:37 +0000

ROYAL HOLLOWAY, UNIVERSITY OF LONDON

POST-DOCTORAL RESEARCH ASSISTANT

EVOLUTIONARY ALGORITHMS IN NON-LINEAR SIGNAL PROCESSING

Applications are invited for this SERC-funded post, tenable for
three years from 1 October 1993 or soon after, to carry out a
comparison of the effectiveness of evolution-based algorithms on
a number of signal processing problems. This comparison will be
made by studying example problems and by developing theoretical
ideas concerning the behaviour of these algorithms.  The successful
applicant will join a group investigating several different
aspects of genetic algorithms and neural networks. Royal Holloway,
one of the five multi-faculty Colleges of the University of
London, is situated in a campus environment approximately 20 miles
west of London, just outside the M25.

Applicants should hold a PhD in Electrical Engineering, Computer
Science, Physics, or a related field, preferably in digital signal
processing or genetic and/or other evolution-based algorithms.

Salary on the Research 1A Scale (UKpounds 14,962 - 17,320 pa,
inclusive of London Allowance).

Informal enquiries to Dr Stuart Flockton (Tel: 0784 443510 , Fax:
0784 472794, email: S.Flockton@rhbnc.ac.uk).
Further particulars from the Personnel Officer, Royal Holloway,
University of London, Egham, Surrey, TW20 0EX Tel: 0784 443030.

Closing date for applications: 15th October 1993


------------------------------

Subject: 'Fractal Extrapolation', is there such a thing?
From:    pwm@csis.dit.csiro.au
Date:    Mon, 27 Sep 93 13:53:29 -0500

Dear Digest Moderator,

I'd be grateful if you could include this in a future digest:

I'm wondering if there is such a thing as fractal extrapolation.  I
have a time series with a probabilistic structural feature that
operates at many levels and allows me to predict, with ~70% accuracy
at each level, the direction of the next move by delta.

I'm interested in trying to understand the 'big picture' here, and I
got to wondering whether 'fractal extrapolation' exists as a
recognized technique.

I would be grateful for any pointers to work along these lines.

Cheers,
    Peter       (milne@csis.dit.csiro.au)


------------------------------

End of Neuron Digest [Volume 12 Issue 4]
****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
 8296; Wed, 29 Sep 93 20:25:03 EDT
Received: from noc4.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
 TCP; Wed, 29 Sep 93 20:25:00 EDT
Received: from CATTELL.PSYCH.UPENN.EDU by noc4.dccs.upenn.edu
	id AA25307; Wed, 29 Sep 93 20:14:28 -0400
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
	id AA27812; Wed, 29 Sep 93 19:24:57 EDT
Posted-Date: Wed, 29 Sep 93 19:24:09 -0400
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #5 (conferences & CFP)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Wed, 29 Sep 93 19:24:09 -0400
Message-Id: <27800.749345049@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu

Neuron Digest   Wednesday, 29 Sep 1993
                Volume 12 : Issue 5

Today's Topics:
                                ISIKNH'94
           call for papers - INSTRUMENTATION AND MEASUREMENTS
          NIPS 93 Announcement: Workshop on Selective Attention
                 NIPS-93 Workshop "Parallel Processing"
             NIPS'93 workshop on "Stability and Solvability"
       Applied Imagery Pattern Recognition: Workshop Announcement
                 DYNAMICAL SYSTEMS IN THE NEUROSCIENCES


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: ISIKNH'94
From:    "Li-Min Fu" <fu@whale.cis.ufl.edu>
Date:    Thu, 16 Sep 93 14:42:20 -0500


                            CALL  FOR   PAPERS


   International Symposium on Integrating Knowledge and Neural Heuristics
                              (ISIKNH'94)

Sponsored by University of Florida, and AAAI,
in cooperation with IEEE Neural Network Council, INNS-SIG, and FLAIRS.

Time: May 9-10, 1994; Place: Pensacola Beach, Florida, USA.


A large amount of research has been directed toward integrating neural
and symbolic methods in recent years.  In particular, the integration
of knowledge-based principles and neural heuristics holds great
promise for solving complicated real-world problems.  This symposium
will provide a forum for discussion and exchange of ideas in this
area. The objective of this symposium is to bring together researchers
from a variety of fields who are interested in applying neural network
techniques to augmenting existing knowledge (or the other way around),
and especially those who have demonstrated that this combined approach
outperforms either approach alone.  We welcome views of this problem
from areas such as constraint- (knowledge-) based learning and
reasoning, connectionist symbol processing, hybrid intelligent
systems, fuzzy neural networks, multi-strategic learning, and
cognitive science.

Examples of specific research include but are not limited to:
1. How do we build a neural network based on a priori
knowledge (i.e., a knowledge-based neural network)?
2. How do neural heuristics improve the current model
for a particular problem (e.g., classification, planning,
signal processing, and control)?
3. How does knowledge in conjunction with neural heuristics
contribute to machine learning?
4. What is the emergent behavior of a hybrid system?
5. What are the fundamental issues behind the combined approach?

Program activities include keynote speeches, paper presentation,
panel discussions, and tutorials.

*****
Scholarships are offered to assist students in attending the
symposium.  Students who wish to apply for a scholarship should send
their resumes and a statement of how their research relates
to the symposium.
*****


Symposium Chairs:
LiMin Fu, University of Florida, USA.
Chris Lacher,  Florida State University, USA.

Program Committee:
Jim Anderson,   Brown University,  USA
Michael Arbib,  University of Southern California,  USA
Fevzi Belli,  The University of Paderborn,  Germany
Jim Bezdek,  University of West Florida,  USA
Bir Bhanu,  University of California,  USA
Su-Shing Chen,  National Science Foundation,  USA
Tharam Dillon,  La Trobe University,  Australia
Douglas Fisher,  Vanderbilt University,  USA
Paul Fishwick,  University of Florida,  USA
Stephen Gallant,  HNC Inc.,  USA
Yoichi Hayashi,  Ibaraki University,  Japan
Susan I. Hruska,  Florida State University,  USA
Michel Klefstad-Sillonville,  CCETT,  France
David C. Kuncicky,  Florida State University,  USA
Joseph Principe,  University of Florida,  USA
Sylvian Ray,  University of Illinois,  USA
Armando F. Rocha, Universidade Estadual, Brazil
Ron Sun,  University of Alabama,  USA

Keynote Speaker: Balakrishnan Chandrasekaran, Ohio State University


Schedule for Contributed Papers
- ----------------------------------------------------------------------
Paper Summaries Due: December 15, 1993
Notice of Acceptance Due: February 1, 1994
Camera Ready Papers Due: March 1, 1994

Extended paper summaries should be limited to four pages (single- or
double-spaced) and should include the title, the names of the authors,
and the network and mailing addresses and telephone number of the
corresponding author.  Important research results should be attached.
Send four copies of extended paper summaries to

      LiMin Fu
      Dept. of CIS, 301 CSE
      University of Florida
      Gainesville, FL 32611
      USA
      (e-mail: fu@cis.ufl.edu; phone: 904-392-1485).

Students' applications for a scholarship should also be sent
to the above address.

General information and registration materials can be obtained by
writing to

      Rob Francis
      ISIKNH'94
      DOCE/Conferences
      2209 NW 13th Street, STE E
      University of Florida
      Gainesville, FL 32609-3476
      USA
      (Phone: 904-392-1701; fax: 904-392-6950)



- ---------------------------------------------------------------------
If you intend to attend the symposium, you may submit the following
information by returning this message:


NAME: _______________________________________
ADDRESS: ____________________________________
_____________________________________________
_____________________________________________
_____________________________________________
_____________________________________________
PHONE: ______________________________________
FAX: ________________________________________
E-MAIL: _____________________________________


------------------------------

Subject: call for papers - INSTRUMENTATION AND MEASUREMENTS
From:    PIURI@IPMEL1.POLIMI.IT
Date:    Fri, 17 Sep 93 18:59:03 +0700

=============================================================================

      1994 INTERNATIONAL CONFERENCE ON INSTRUMENTATION AND MEASUREMENTS
                                   IMTC'94
          Advanced Technologies in Instrumentation and Measurements

                   Hamamatsu, Shizuoka, Japan - 10-12 May 1994

=============================================================================

SPECIAL SESSION ON NEURAL INSTRUMENTS

CALL FOR PAPERS

Program Chair:  Kenzo Watanabe
                Research Institute of Electronics
                Shizuoka University
                3-5-1 Johoku, Hamamatsu, 423 Japan
                phone +81-53-471-1171
                fax +81-53-474-0630

General Chair:  Robert Myers
                3685 Motor Ave., Suite 240
                Los Angeles, CA 90034-5750, USA
                phone +1-310-287-1463
                fax +1-310-286-1851

Sponsored by:   IEEE Instrumentation and Measurement Society
                Society of Instruments and Control Engineers, Japan

Cooperated by:  Institute of Electrical Engineers, Japan
                Institute of Electronics, Information and Communication
                Engineers, Japan
                Japan Society of Applied Physics
                Japan Electric Measuring Instrument Manufacturers' Association

The IMTC'94 Conference is the 10th edition of the annual conference
organized by the IEEE Instrumentation and Measurement Society to provide
a stimulating forum for practitioners and scientists working in areas
related to any kind of measurement: theoretical aspects of measurements,
instruments for measurements, and measurement processing.

Traditional topics are: Acoustics measurements, AI & fuzzy, Automotive &
avionic instrumentation, Calibration, Metrology & standards, Digital
signal analysis & processing, Digital and mobile communications, LSI
analysis, diagnosis & testing, Mixed analog & digital ASICs, Optic &
fiber-optic measurement, Process measurements, Sensors & transducers,
System identification, Waveform analysis and measurements, A/D and D/A,
Data acquisition, Antenna & EMI/EMC, Biomedical instruments,
Computer-based measurements & software, Environment measurements,
Microwave measurements, Nuclear & medical instruments, Pressure &
temperature measurements, Quality & reliability, STM and imaging, Time
and frequency measurements.

To support presentation and discussion of emergent technologies, a
special session on Neural Instruments will be organized within IMTC'94.
Its main topics are neural technologies for measurements, applications
of neural networks in measurements and instruments, design and
implementation of neural solutions for instrument subsystems, neural
subsystems for automatic control, and neural subsystems for signal
processing.

Authors are invited to submit a one-page abstract (containing title,
authors, affiliations, and the session name "Neural Instruments" in the
upper right corner) and a cover page (containing title, authors,
affiliations, the contact author, and the full address, telephone and
fax number of the contact author, with the session name "Neural
Instruments" in the upper right corner).

Submissions must be received by the general chair (for authors from
Europe and North America) or by the program chair (for authors from Asia
and other areas) by October 1st, 1993. Fax submissions are accepted. An
additional copy of the submission should be sent by e-mail or fax to the
coordinator of the session on Neural Instruments (this copy does not
substitute for the formal submission to the general chair or the program
chair). Submission of a paper implies a willingness to attend the
conference and to present the paper. Notification of acceptance will be
mailed by December 1st, 1993; camera-ready papers are due by February
1st, 1994.

Authors of selected papers will also be invited to submit their papers
for consideration for the special IMTC'94 issue of the IEEE Transactions
on Instrumentation and Measurement.

For any additional information regarding the special session on Neural
Instruments, contact the session coordinator.

Session Coordinator for "Neural Instruments":
        Prof. Vincenzo PIURI
        Department of Electronics and Information
        Politecnico di Milano
        piazza L. da Vinci 32
        I-20133 Milano, Italy
        phone no. +39-2-2399-3606, +39-2-2399-3623
        fax no. +39-2-2399-3411
        e-mail piuri@ipmel1.polimi.it

=============================================================================


------------------------------

Subject: NIPS 93 Announcement: Workshop on Selective Attention
From:    Ernst Niebur <ernst@cns.caltech.edu>
Date:    Tue, 21 Sep 93 13:06:36 -0800



Fellow Connectionists:

We would like to announce the final program of a workshop on visual
selective attention to be held at this year's NIPS conference. The
conference will be held from Nov. 29 to Dec. 2 in Denver, CO; the
workshop will be held Dec. 3 and 4 "at a nearby ski area."

For NIPS conference and workshop registration info, please write to:  NIPS*93
Registration / NIPS Foundation / PO Box 60035 / Pasadena, CA  91116-6035 USA

For questions concerning this workshop, please contact either of the
organizers by e-mail.

--Ernst Niebur



   NIPS*93 Workshop:    Neurobiology, Psychophysics, and Computational
   =================    Models of Visual Attention


   Intended Audience:   Experimentalists, modelers and others interested in
   ==================   visual attention and high-level vision

   Organizers:
   ===========

   Ernst Niebur                 Bruno Olshausen
   ernst@caltech.edu            bruno@lgn.wustl.edu


   Program:
   ========

   In any physical computational system, processing resources are
   limited, which inevitably leads to bottlenecks in the processing of
   sensory information.  Nowhere is this more evident than in the primate
   visual system, where the massive amount of information provided by the
   optic nerve far exceeds what the brain is capable of fully processing
   and assimilating into conscious experience.  Visual attention thus
   serves as a mechanism for selecting certain portions of the input to
   be processed preferentially, shifting the processing focus from one
   location to another in a serial fashion.  The study of visual
   attention is integral to our understanding of higher visual function,
   and it may be of practical benefit to machine vision as well.

   What we know of visual attention has been learned from a combination
   of psychophysical, neurophysiological, and computational approaches.
   Psychophysical studies have revealed the behavioral consequences of
   visual attention by measuring either a speed-up in the observer's
   reaction time or an improvement in discrimination performance when
   the observer is attending to a task. Neurophysiological studies, on the other hand,
   have attempted to reveal the neural mechanisms and brain areas
   involved in attention by measuring the modulation in single cell
   firing rate or in the activity in a part of the brain as a function of
   the attentional state of the subject.  A number of computational
   models based on these studies have been proposed to address the
   question of how attention eases the computational burdens faced by the
   brain in pattern recognition or other visual tasks, and how attention
   is controlled and expressed at the neuronal level.

   The goal of this workshop will be to bring together experts from each
   of these fields to discuss the latest advances in their approaches to
   studying visual attention.  Half the available time has been reserved
   for informal presentations and the other half for discussion.


   Morning session:

   7:30-8:00    Introduction/overview

                "Covert Visual Attention: The Phenomenon"
                (Ernst Niebur, Caltech)
                (7:50-8:00: Discussion)

   8:00-9:00    Neurobiology

        8:00    "Effects of Focal Attention on Receptive Field
                 Profiles in Area V4"
                (Ed Connor, Washington University)
                (8:20-8:30: Discussion)

        8:30    "Neurophysiological evidence of scene segmentation
                 by feature selective, parallel attentive mechanisms"
                (Brad Motter, VA Medical Center/SUNY-HSC, Syracuse)
                (8:50-9:00: Discussion)

   9:00-9:30    General Discussion


   Afternoon session:

   4:30-5:00    Psychophysics

                "Attention and salience: alternative mechanisms of
                 visual selection"
                (Jochen Braun, Caltech)
                (4:50-5:00: Discussion)

   5:00-6:00    Computational models

        5:00    "Models for the neural implementation of attention
                 based on the temporal structure of neural signals"
                (Ernst Niebur, Caltech)
                (5:20-5:30: Discussion)

        5:30    "Dynamic routing circuits for visual attention"
                (Bruno Olshausen, Washington University/Caltech)
                (5:50-6:00: Discussion)

   6:00-6:30    General discussion



------------------------------

Subject: NIPS-93 Workshop "Parallel Processing"
From:    Joachim Diederich <joachim@fit.qut.edu.au>
Date:    Fri, 24 Sep 93 04:12:20 -0500


NIPS*93 Workshop:       Connectionist Modelling and Parallel Architectures
=================       --------------------------------------------------
                        4 December 1993; Vail, Colorado


Intended Audience:      computer scientists and engineers as well as
==================      biologists and cognitive scientists

Organizers:
===========

        Joachim Diederich                       Ah Chung Tsoi

  Neurocomputing Research Centre      Dept. of Elec. and Computer Engineering
Queensland University of Technology         University of Queensland
  joachim@fitmail.fit.qut.edu.au              act@s1.elec.uq.oz.au


Program:
========

The objective of the workshop is to provide a discussion platform for
researchers interested in software and modelling aspects of neural
computing. The workshop should be of considerable interest to computer
scientists and engineers as well as biologists and cognitive scientists.

The introduction of specialized hardware platforms for connectionist
modelling ("connectionist supercomputers") has created a number of
research issues which should be addressed. Some of these issues are
controversial (including the need for such specialized architectures):
the efficient implementation of incremental learning techniques, the
need for dynamic reconfiguration of networks at runtime, and possible
programming environments for these machines.

The following topics should be addressed:

- the efficient simulation of homogeneous network architectures; mapping
of homogeneous network architectures to parallel machines

- randomness and sparse coding; the efficient simulation of sparse
networks on sequential and parallel machines; sparse activity and
communication in parallel architectures

- arbitrary interconnection schemes and their mapping to parallel
architectures

- dynamic reconfiguration: the modification of network structures and
activation functions at runtime; possible trade-offs between the
efficient simulation of fixed-size networks and constructive
(incremental) learning algorithms

- software tools and environments for neural network modelling, in
particular for parallel architectures

- connectionist supercomputers (such as CNAPS, SYNAPSE and CNS-1);
hardware and programming issues associated with connectionist
supercomputers

- biologically realistic modelling on parallel machines; the simulation
of synaptogenesis, spike trains, etc.

- realistic simulation of the brain integrating over a number of scales
of complexity, from the detailed simulation of neurons to high-level
abstractions


The following is a preliminary schedule; we expect to have two more
slots for brief presentations and invite abstracts for short talks
(about 10-15 min). Please send e-mail to: joachim@fitmail.fit.qut.edu.au

Morning Session:
----------------

7:30-7:40       Joachim Diederich, Queensland University of Tech., Brisbane
                Introduction

7:40-8:10       Jerome A. Feldman, ICSI & University of California, Berkeley
                The Connectionist Network Supercomputer (CNS-1)

8:10-8:30       Discussion

8:30-8:50       Nigel Goddard, Pittsburgh Supercomputer Center
                Practical Parallel Neural Simulation

8:50-9:10       Per Hammarlund, Royal Institute of Technology, Stockholm
                Simulation of Large Neural Networks: System Specification and
                Execution on Parallel Machines

9:10-9:30       Discussion

Afternoon Session:
------------------

4:30-4:50       Paul Murtagh & Ah Chung Tsoi, University of Queensland, St. Lucia
                Digital implementation of a reconfigurable VLSI
                neural network chip

4:50-5:20       Ulrich Ramacher, Siemens AG, Munich
                The Neurocomputer SYNAPSE-1

5:20-5:30       Discussion

5:30-6:00       Guenther Palm & Franz Kurfess, University of Ulm
                Parallel Implementations of Neural Networks
                for Associative Memory

6:00-6:30       Discussion


------------------------------

Subject: NIPS'93 workshop on "Stability and Observability"
From:    GARZONM@hermes.msci.memst.edu
Date:    28 Sep 93 09:28:07 -0600


                  C A L L    F O R    P A P E R S

                      A One-day Workshop on
                  * STABILITY AND OBSERVABILITY *
                           at NIPS'93
                         December 3, 1993

We are organizing a workshop at NIPS'93, the Neural Information
Processing Systems conference, to be held in the Denver/Vail area of
Colorado on December 3. The themes of the workshop are `Stability
and Observability'. A more detailed description is attached below.

There is still room for some contributed talks. If you are
interested in presenting a paper based on previous and/or current
research, send a short (one-page) abstract or contact one of the
organizers by October 8 via email or fax. A list of speakers will
be finalized by mid October.

Fernanda Botelho                   Max Garzon
botelhof@hermes.msci.memst.edu     garzonm@hermes.msci.memst.edu
FAX (901)678-2480 (preferred); 678-3299
Workshop cochairs

_____________________________ cut here _________________________

The purpose of this one-day workshop is to bring together neural
network practitioners, computer scientists and mathematicians
interested in `stability' and `observability' of neural networks of
various types (discrete/continuous time and/or activations).

These two properties concern the relationship between defining
parameters (weights, transfer functions, and training sets) and the
behavior of neural networks from the point of view of an outside
observer. This behavior is affected by noise, rounding, bounded
precision, sensitivity to initial conditions, etc. Roughly
speaking, *stability* (e.g. asymptotic, Lyapunov, structural)
refers to the ability of a network (or a family of networks) to
generate trajectories/orbits that remain reasonably close (resp.,
in structure, e.g. topological conjugacy) to the original
under small perturbations of the input/initial conditions (or the
defining parameters of the network). Of course, neural networks are
well known for their graceful degradation, but the issue is less clear
with bounded precision, continuous time with local interaction governed
by differential equations, and learning algorithms.
Second, the issue of *observability*, roughly speaking, concerns
the problem of error control under iteration of recurrent nets. In
dynamical systems observability is studied in terms of
shadowing. But observability can also be construed other ways,
e.g. as our ability to identify a network by observing the abstract
i/o function that it realizes (which, at some level, reduces to
essential uniqueness of an irreducible network implementing the i/o
function).

Speakers will present their views in short (<20 min.) talks. A
panel discussion coordinated by the cochairs will then review known
results and identify fundamental problems and questions of interest
for further research.

F. Botelho and M. Garzon, cochairs
botelhof@hermes.msci.memst.edu   garzonm@hermes.msci.memst.edu
Mathematical Sciences            Institute for Intelligent Systems
                    Memphis State University
                    Memphis, TN 38152 U.S.A.

Max Garzon                (preferred)   garzonm@hermes.msci.memst.edu
Math Sciences                           garzonm@memstvx1.memst.edu
Memphis State University                Phone: (901) 678-3138/-2482
Memphis, TN 38152  USA                  Fax:   (901) 678-3299


------------------------------

Subject: Applied Imagery Pattern Recognition: Workshop Announcement
From:    harmon@erim.org (Laurel Harmon)
Date:    Tue, 28 Sep 93 12:26:01 -0500

********************  ANNOUNCEMENT  *********************

  22nd Applied Imagery Pattern Recognition Workshop

                October 13-15, 1993
                   Cosmos Club
                  Washington, D.C.

*********************************************************


INTERDISCIPLINARY COMPUTER VISION:
APPLICATIONS AND CHANGING NEEDS


SESSIONS ON:

     Environment and Global Change
     Medical and Biotechnology
     Security and Law Enforcement
     Document Image Understanding
     Object and Target Recognition
     Intelligent Highways

KICKOFF SPEAKER:

     "Defense Reinvestment"
     Jane Harman
     U.S. Congress

BANQUET ADDRESS:

     "High Performance Computing and Communication in
      Clinical Medicine"
     Julian Rosenman, MD PhD
     University of North Carolina
__________________________________________________________

This Imagery Workshop brings together researchers
from government, industry, and academia in an elegant
setting conducive to technical interchange across a
broad range of disciplines. The papers span a range from
research to fielded systems and provide, to managers and
developers alike, a broad vision of the applicability of
Image Understanding.

__________________________________________________________

For program details and registration information,
please direct inquiries to:

     Dr. Joan Lurie, AIPR Chair
     TRW - R2/1094
     1 Space Park
     Redondo Beach, CA 90278
     (310) 814-8690

     or

     Dr. J.Michael Selander
     AIPR-03 Program Chair
     MITRE (M.S. Z-267)
     7525 Colshire Dr.
     McLean, VA 22102-3481
     (703) 883-7294
     mike@mwunix.mitre.org



********************  ANNOUNCEMENT  *********************


------------------------------

Subject: DYNAMICAL SYSTEMS IN THE NEUROSCIENCES
From:    "Dennis L. Glanzman" <NGG@CU.NIH.GOV>
Date:    Wed, 29 Sep 93 16:26:14 -0500

Would you please post the following announcement in the next issue of
Neuron Digest.

Thanks,

Dennis L. Glanzman, Ph.D.
Chief, Theoretical Neuroscience Program
Division of Neuroscience and Behavioral Science
National Institute of Mental Health
Room 11-102, Parklawn Building
5600 Fishers Lane
Rockville, MD 20857
Phone: (301) 443-1576
FAX: (301) 443-4822
BITNET: NGG@NIHCU
INTERNET: NGG@CU.NIH.GOV


- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -


                      C A L L   F O R   P O S T E R S

National Institute of Mental Health
Division of Neuroscience and Behavioral Science

DYNAMICAL SYSTEMS IN THE NEUROSCIENCES

"Multiscale Time and Space Coherence in Brain Function"

A Satellite Symposium of the Annual Meeting of the Society for Neuroscience
Friday, November 12, 1-5 PM, and Saturday, November 13, 8 AM-12 noon
Washington Convention Center, Rooms 30-32

                                   TUTORIALS

Topics include basic concepts in chaos and fractals; biologically
relevant applications from DNA to aesthetics; new techniques in time
series analyses; communication among chaotic systems.  Tutorial
speakers:  Heinz-Otto Peitgen, Universität Bremen; Richard Voss,
Watson Research Center of IBM; Eric Kostelich, Arizona State
University; and Lou Pecora, Naval Research Laboratory

                                 APPLICATIONS

Topics include fractal measures on auditory neuron activities,
deterministic variability in hippocampal and spinal cord neural
networks, semi-attractors in epilepsy, multifrequency coherence in
brain waves during learning, electroencephalographic complexity
following electroshock treatment, chaotic rhythms in patients with
bipolar disorder.  Invited speakers:  Malvin Teich, Columbia
University; Steven Schiff, Children's National Medical Center;
William Schaffer, University of Arizona; Steven Bressler, Florida
Atlantic University; Andrew Krystal, Duke University; and Alan
Gottschalk, University of Pennsylvania

                                    POSTERS

Original research contributions with a clear application of dynamics
to Neuroscience are solicited.  Authors should submit four copies of
a 500-word (or less) summary clearly stating their problem,
application and results, postmarked by October 15, 1993.

Include addresses of all authors on the front of the summary and
indicate to which author correspondence should be addressed.


For further information about the symposium, or to submit a poster
abstract, please contact:  D. Glanzman, NIMH, Room 11-102, 5600
Fishers Lane, Rockville, MD 20857 Telephone:  (301) 443-1576 Fax:
(301) 443-4822



------------------------------

End of Neuron Digest [Volume 12 Issue 5]
****************************************
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #6
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Thu, 07 Oct 93 12:36:08 -0400

Neuron Digest   Thursday,  7 Oct 1993
                Volume 12 : Issue 6

Today's Topics:
                            Summer Institute
         Re: Info on Neural Network based Multivariate Analysis
              BP on Prediction of Protein Structure Problem
                      Protein Structure Prediction
                          Postdoctoral position
                 Model problem for feedforward networks
               Deadline reminder: Music/arts special issue
                       Research Job - Switzerland
                  Research post - Treebank-translation
          Stochastic Neural Networks for Optimization problems
                        Positions in BIOCOMPUTING


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Summer Institute
From:    Terry Sejnowski <terry@helmholtz.sdsc.edu>
Date:    Tue, 28 Sep 93 17:49:21 -0800

SUMMER INSTITUTE IN COGNITIVE NEUROSCIENCE
at The University of California, Davis

        The 1994 Summer Institute will be held at the University of
California, Davis, from July 10 through 23.  The two week course will
examine how information about the brain bears on issues in cognitive
science, and how approaches in cognitive science apply to neuroscience
research.  A distinguished international faculty will lecture on current
topics in brain plasticity, strategies of neuroencoding, and evolution.
Laboratories and demonstrations will provide practical experience with
cognitive neuropsychology experiments, connectionist/computational
modeling, and neuroimaging techniques.  At every stage, the relationship
between cognitive processes and underlying neural circuits will be
explored.  The Foundation is providing room/board and limited support for
travel.

Faculty Include:
Richard Andersen, Max Bear, Ira Black, Kenneth H. Britten, Simon Baron
Cohen, Leda Cosmides, Randy Gallistel, Michael S. Gazzaniga, Charles
Gilbert, Charles M. Gray, Eric Knudsen, Peter Marler, Michael Merzenich,
Reid Montague, Steven Pinker, V. Ramachandran, Gregg Recanzone, Barry
Richmond, Mitch Sutter, Timothy Tonini, John Tooby, and many others.

For information and applications please write to:
McDonnell Summer Institute in Cognitive Neuroscience
Center for Neuroscience, 1544 Newton Court
University of California, Davis
Davis, California 95616  USA

APPLICATIONS MUST BE RECEIVED BY JANUARY 15, 1994


------------------------------

Subject: Re: Info on Neural Network based Multivariate Analysis
From:    URPANI D <dju@stan.xx.swin.OZ.AU>
Date:    Thu, 30 Sep 93 16:18:37 -0100

Neuron Digesters

This is a plea for information. I am a PhD student at Swinburne
University, Melbourne, Australia. My interests are neural net
applications in the process industry, with particular reference to
aluminum smelting.

The idea is to find ways and means to make the process operate in a
more efficient envelope. Approx. 15 sensor measurements are currently
logged daily. My first step is to try to cluster these measurements
into groups so that later on I can assign a process condition or state
to each one. Each condition can then be tagged with a process
performance index such as kg Al/kWh. I have been using hierarchical
cluster analysis in the hope of achieving this, so that I will have a
set of input vectors (15-dimensional) with a fair, good, bad, etc.
process condition attached to each, i.e. I can then progress from an
unsupervised to a supervised learning situation.

However, cluster analysis does not seem to be discriminating as well as
I had hoped, and I think I will have to do the clustering with NNs
(SOMs, LVQ, etc.). I suspect there could be some problems associated
with the time-series nature of the data (autocorrelated variables, etc.).
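[For readers unfamiliar with SOMs, the basic competitive-learning loop is easy to sketch. The sketch below uses hypothetical two-state data standing in for the real sensor logs; the 15 dimensions match the setup above, but the cluster centres, map size, and schedules are all made up for illustration.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the daily sensor logs: two well-separated
# process "states" in 15 dimensions (real data would be far noisier).
data = np.vstack([
    rng.normal(0.0, 0.3, size=(50, 15)),
    rng.normal(3.0, 0.3, size=(50, 15)),
])

n_units = 10                                  # a 1-D map of 10 units
W = rng.normal(1.5, 1.0, size=(n_units, 15))  # random initial codebook

n_epochs = 20
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)                   # decaying learning rate
    radius = max(1, round(3 * (1 - epoch / n_epochs)))  # shrinking neighbourhood
    for x in rng.permutation(data):
        # Best-matching unit: the prototype closest to this sample.
        bmu = int(np.argmin(np.linalg.norm(W - x, axis=1)))
        for j in range(n_units):
            if abs(j - bmu) <= radius:        # move the BMU and its neighbours
                W[j] += lr * (x - W[j])

# Each sample's BMU index is its cluster label; tagging those clusters
# with a performance index then gives the supervised training targets.
bmus = [int(np.argmin(np.linalg.norm(W - x, axis=1))) for x in data]
```

[With this toy data the two states end up on disjoint regions of the map. For autocorrelated time-series data, one common trick is to append lagged readings to each input vector before clustering.]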

I wonder if any of you have come across a similar problem and could
volunteer some help.
Thank you.
Regards,
David Urpani// dju@stan.xx.swin.OZ.AU //

------------------------------

Subject: BP on Prediction of Protein Structure Problem
From:    pb@cse.iitb.ernet.in (Pushpak Bhattacharya)
Date:    Fri, 01 Oct 93 23:44:20 +0700

We are trying to train a BP neural net (the PDP-3 package of Rumelhart
and McClelland is being used) with the primary structure as input and
the conformational state of the amino acids as output. The net has 65
binary inputs and 3 binary outputs. The number of patterns to be
trained is more than eight thousand. We have 20 neurons in the hidden
layer.

Now, just after one epoch the hidden neurons go into saturation: all
their outputs are 1. At the same time the output neurons also saturate:
all their outputs become 0. We understand that the primary reason for
this is that a LARGE number of patterns are being trained on a net with
a large number of inputs. The weights to the output layer keep getting
the same kind of "credit", so their values keep decreasing toward large
negative values.

Could anybody suggest a way out of this difficulty, which to our mind
is a result of applying BP to a large-size real-life problem? (Those
applying BP to large problems will appreciate our difficulty!)
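[One common cause of this behaviour is that with 65 inputs the initial weighted sums already sit far out on the sigmoid's flat tails, so units saturate before learning begins. A standard remedy, though not necessarily the right fix for the posters' exact setup, is to scale the initial weights down by roughly 1/sqrt(fan-in). The sketch below uses random binary patterns in place of the real protein data (biases omitted) to show the effect on the fraction of saturated hidden units:]

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hidden = 65, 20          # sizes from the posting above
x = rng.integers(0, 2, size=(100, n_in)).astype(float)  # random binary patterns

# Unit-variance initial weights: with ~32 active inputs, the weighted sums
# have a standard deviation of about sqrt(65/2) ~ 5.7, deep in the
# sigmoid's flat region, so most activations are pinned near 0 or 1.
W_big = rng.normal(0.0, 1.0, size=(n_in, n_hidden))
h_big = sigmoid(x @ W_big)

# Scaling by 1/sqrt(fan_in) keeps the sums near zero, where the sigmoid
# is steepest and the backpropagated gradients do not vanish.
W_small = rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, n_hidden))
h_small = sigmoid(x @ W_small)

def saturated_fraction(h):
    # Fraction of hidden activations pinned near 0 or 1.
    return float(np.mean((h < 0.05) | (h > 0.95)))

print(saturated_fraction(h_big), saturated_fraction(h_small))
```

[Shuffling the patterns each epoch, lowering the learning rate, and using a symmetric activation with targets kept away from the asymptotes are other common fixes for this kind of saturation.]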


------------------------------

Subject: Protein Structure Prediction
From:    pb@cse.iitb.ernet.in (Pushpak Bhattacharya)
Date:    Fri, 01 Oct 93 23:49:24 +0700

Oops! We forgot to mention the e-mail addresses:

pb@cse.iitb.ernet.in (Pushpak Bhattacharyya, CS & E, IIT Bombay)
susmita@cse.iitb.ernet.in (Susmita De, CS & E, IIT Bombay)
                    Thanking you,
                                   Pushpak Bhattacharyya
                                    Susmita De


------------------------------

Subject: Postdoctoral position
From:    axon@cortex.rutgers.edu (Ralph Siegel)
Date:    Fri, 01 Oct 93 21:17:56 -0500

                             PLEASE POST

Postdoctoral Fellow. Analysis of visual structure-from-motion in
primates.  These studies combine visual psychophysics and single unit
recording in awake behaving monkeys.  Single unit recordings are being
made in visual association cortex using carefully controlled visual
stimuli. Laboratory also has ongoing human psychophysical studies with
normal and cortically compromised human subjects as well as computational
studies. Recent graduates who are changing fields from either cellular or
computational neuroscience to behavioral and physiological studies are
encouraged to apply.  Computer expertise useful, but not necessary.
Superb experimental and computational facilities in a multi-disciplinary
research center.  NY-NJ Metro area.

Contact: Ralph Siegel
         Center for Molecular and Behavioral Neuroscience
         Rutgers, The State University
         197 University Avenue
         Newark, NJ 07102
         phone: 201-648-1080  x3261
         fax:   201-648-1272

         email: axon@cortex.rutgers.edu

Term:    24 months, beginning 10/1/93 or later
Salary:  NIH level and supplement

Please send statement of research interests, curriculum vitae, and
names of three references.

------------------------------

Subject: Model problem for feedforward networks
From:    maxim@wisdom.weizmann.ac.il (Kovalenko Maxim)
Date:    Sat, 02 Oct 93 20:31:34 +0100

Hello,

I am developing a learning paradigm for feedforward networks
that presumably improves the convergence properties of
the currently available algorithms.

However, one of the drawbacks of the approach is that the
input/output patterns should have a "spatial" nature,
i.e. exhibit local correlation as in images or time series.

Could anyone point me to a good (practically significant)
model problem that involves this specific nature of
input/output data and is known to be solved slowly by
algorithms like back-propagation?

I would appreciate any related references or comments.

Thank you for your attention,

                       Maxim Kovalenko
_________________________________________________________________________
                                   |
Maxim L. Kovalenko,                |  e_mail: maxim@wisdom.weizmann.ac.il
Department of Applied Mathematics, |  fax:   972-8-344122
The Weizmann Institute of Science, |  tel.:  972-8-343668
Rehovot, 76100                     |
Israel                             |


------------------------------

Subject: Deadline reminder: Music/arts special issue
From:    "Peter M. Todd" <ptodd@spo.rowland.org>
Date:    Sat, 02 Oct 93 18:05:02 -0500


       **** PLEASE DISTRIBUTE ****


MUSIC AND CREATIVITY

Call for Papers for a Special Issue of Connection Science--
  Reminder of approaching deadline


The October 15 deadline for submissions to the special issue of Connection
Science on network applications in music, arts, and creativity, is fast
approaching.  We seek full-length papers on empirical or theoretical work in
the areas of modelling musical cognition; network composition, choreography,
or visual creation; integration of high- and low-level musical or artistic
knowledge; cross-modal integration (e.g. rhythm and tonality); developmental
models; cross-cultural models; psychoacoustic models; relationships between
music and language; and connections to cognitive neuroscience.  We also
welcome shorter research notes up to 4000 words in length covering ongoing
research projects.  For a complete call for papers and author guidelines, or
to submit a paper (five copies), contact the Special Issue Editors:

Niall Griffith
Department of Computer Science,
University of Exeter,
Prince of Wales Road,
Exeter,
EX4 4PT, England.
E-mail: ngr@dcs.exeter.ac.uk

Peter M. Todd
The Rowland Institute for Science
100 Edwin H. Land Boulevard
Cambridge, MA  02142  USA
E-mail: ptodd@spo.rowland.org


------------------------------

Subject: Research Job - Switzerland
From:    Christian Lehmann <Christian.Lehmann@di.epfl.ch>
Date:    Mon, 04 Oct 93 10:39:26 +0000


University of Lausanne:
Graduate student position (Doctorant) available in October 1993 at the
Institute of Physiology

Topic: Spatial and temporal processing in neural networks

The student will be integrated in an electrophysiology group working with
simultaneous single unit recordings. A tight collaboration with the Swiss
Federal School of Technology (EPFL) will provide the latest technical
facilities. It is expected that she/he will acquire in-depth knowledge in
the fast growing field of neural networks in order to develop and test
original ideas on information processing in the brain. A good background in
mathematics, physics, and biology as well as knowledge of at least one
higher programming language is recommended.

Our Ph.D. program extends over a minimum of three years. The salary
ranges between US$19,000 and US$24,000 per year.

Please send applications (curriculum vitae and letters of recommendations
of two academic referees) to or get further information from:

Dr. Alessandro Villa or Dr. Yves de Ribaupierre, UNIL Institute of
Physiology, Rue du Bugnon 7, CH-1005 Lausanne, Switzerland.
Tel. ++41-21-313.2809
FAX  ++41-21-313.2865
E-mail: villa@ulmed.unil.ch



------------------------------

Subject: Research post - Treebank-translation
From:    E S Atwell <eric@scs.leeds.ac.uk>
Date:    Mon, 04 Oct 93 16:10:05 +0000

The following is an advert for a Research Fellowship and one or more
Visiting Fellowships attached to a project to map between a number of
English Corpus annotation schemes.  The exact method for mapping between
syntax trees of two given parsing schemes is open - presumably one way is
to `hand-annotate' a common sample (we call this a Multi-Treebank as each
sentence has more than one parse-tree) and then set a Neural Net to learn
the mapping between corresponding parses.  Anyone who'd like to do this
is welcome to seek further details, and maybe even apply for the 3-year
post or just visit Leeds temporarily...

Eric Atwell

cf:

                     THE UNIVERSITY OF LEEDS
             SCHOOL OF COMPUTER STUDIES  -  AI Division

    Centre for Computer Analysis of Language And Speech (CCALAS)

            RESEARCH FELLOW IN COMPUTATIONAL LINGUISTICS

The above SERC-funded post is available immediately for a fixed period of
three years to work on a project in natural language processing, involving
mapping between the syntactic annotation schemes of different tagged and
parsed corpora, including LOB, Brown, London-Lund, UPenn, SEC, ICE, British
National Corpus.

A PhD or equivalent expertise in Linguistics, Computer Science or Artificial
Intelligence is required; experience of corpus-based computational linguistics
and the syntactic models of one or more of these corpora is preferred.

Salary will be on the scale for Research Staff Grade IA (#12,828 - #20,442)
according to qualifications and relevant experience.

Informal enquiries about the post may be made to Eric Atwell, tel 0532
335761, fax 0532 335468, email eric@scs.leeds.ac.uk, or Clive Souter,
tel 0532 335460, email cs@scs.leeds.ac.uk

Application forms and further particulars may be obtained from the Personnel
Office (Academic Section), The University, Leeds LS2 9JT, England, tel 0532
335771 quoting reference no 48/105.

Closing date for applications: November 1st 1993.

The University of Leeds promotes an equal opportunities policy

****************************************************************************

                     SERC VISITING FELLOWSHIPS

SERC may also fund one or more Visiting Fellowships to support leading
researchers from other Institutions who can contribute towards the project,
visiting Leeds University for between a month and a year.  We would
particularly welcome researchers with in-depth experience of one or more of
the tagging and/or parsing schemes, to advise us in the creation of the
detailed mapping algorithms, and the Multi-Tagged Corpus and MultiTreebank.
If you are interested in visiting CCALAS as a project advisor, please contact
Eric Atwell (eric@scs.leeds.ac.uk) and/or Clive Souter (cs@scs.leeds.ac.uk).

****************************************************************************

       PROJECT SUMMARY: Mapping Between Corpus Annotation Schemes

Several alternative tagged and parsed Corpora of English exist, including
LOB, Brown, London-Lund, UPenn, SEC, and ICE, as well as the British
National Corpus, each with its own tagset and/or parsing scheme. A tagged
or parsed Corpus has many
applications, such as training linguistic constraint models for improved
speech recognition; however users cannot combine Corpus training sets into a
single language model, as the annotation schemes are incompatible.

This project will design a set of tag- and tree-transducers or algorithms for
mapping between the main corpus annotation schemes. This will allow users of
one Corpus to view other Corpora as enlargements of their training set. One
tagset and parsing scheme will be our 'base' or interlingua, and transducers
will be built between this interlingua and the other annotation schemes. A
relatively small test corpus will be annotated with all the schemes under
consideration; we will investigate the use of the resulting Multi-tagged
Corpus and Multitreebank as a standard evaluation benchmark for taggers and
parsers.
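A toy illustration of what the simplest possible tag transducer looks like: given a small multi-tagged sample (the same text annotated under two schemes), learn the most frequent target tag for each source tag. The two tagsets and all tag pairs below are invented for illustration; real LOB or UPenn tagsets differ, and context-sensitive mappings would need something stronger, such as the neural net mentioned in the advert.

```python
from collections import Counter, defaultdict

# Invented "multi-tagged" sample: each pair is (tag under scheme A,
# tag under scheme B) for the same word token.
aligned = [
    ("NN", "NOUN"), ("VB", "VERB"), ("NN", "NOUN"),
    ("JJ", "ADJ"), ("VBZ", "VERB"), ("NN", "NOUN"),
    ("JJ", "ADJ"), ("NN", "PROPN"),
]

# Count, for each source tag, how often each target tag co-occurs.
counts = defaultdict(Counter)
for src, tgt in aligned:
    counts[src][tgt] += 1

# The baseline transducer maps each source tag to its most frequent target.
transducer = {src: c.most_common(1)[0][0] for src, c in counts.items()}

print(transducer["NN"])
```

Ambiguous tags (here, NN maps to both NOUN and PROPN in the sample) are exactly where such a context-free baseline fails and a learned, context-aware mapping earns its keep.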

****************************************************************************

       COMPUTATIONAL LINGUISTICS RESEARCH AT LEEDS UNIVERSITY

The School of Computer Studies at Leeds provides excellent broad background
support for research; we were graded 4(A) by UFC/HEFCs, and NLP makes an
important and growing contribution to the School's research profile. At Leeds
the Centre for Computer Analysis of Language And Speech (CCALAS), with Eric
Atwell as Director and Peter Roach and Clive Souter as Deputy Directors,
provides a focus for a broad range of corpus- and dictionary-based research
including word-sense semantic disambiguation and tagging (Demetriou, Jost,
Atwell), grammar-based reasoning (Mott, Silver), speech act theory
(Holdcroft, Wallis, Wynne, Millican), probabilistic parsing (Pocock,
O'Donoghue, Atwell, Souter, Hogg), corpus collocation analysis (Howarth,
Cowie, Davidson), corpus annotation (Atwell, Roach, Souter, Arnfield,
Ghali, Bull), grammatical inference and clustering (Hughes, Tarver,
Atwell), speech recognition (Roach,
Ueberla, Kirby, Moore, Lockhart, Mair, Sergant), speech synthesis (Scully,
Roach), handwriting recognition (Hanlon, Boyle, Bushofa), text generation
(Cole, Grierson, Tawalbeh), human-computer interaction (Crow), computers in
language teaching and linguistics (Davidson, Fox, Roach, Hunter, Shivtiel),
computers in lexicography (Roach, Setter, Cowie, Atwell, Souter).

****************************************************************************

Leeds University has over 15,000 students and 2,000 academic and research
staff, making it one of the largest in Britain. Leeds is half-way between
London and Edinburgh, linked by rail, motorway and air to the rest of the UK
and Europe. It is the 20th largest city in the European Community, with the
excellent arts, sport and other social facilities expected of a growing,
multi-cultural metropolis; but it is also close to four National Parks. More
background information on the Project, CCALAS, the University, and Leeds and
its environs can be found in the Further Particulars from the Personnel
Office.


------------------------------

Subject: Stochastic Neural Networks for Optimization problems
From:    suchi@pollux.cs.uga.edu (Suchi Bhandarkar)
Date:    Mon, 04 Oct 93 16:08:49 -0500

Could someone give me references on Stochastic Neural Network-based
approaches to combinatorial and continuous variable optimization? Please
e-mail your responses to "suchi@cs.uga.edu" Thank you very much.

Sincerely,
Suchi Bhandarkar
University of Georgia
E-mail: suchi@cs.uga.edu
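For readers with the same question, the stochastic flavour that such networks share with simulated annealing (Boltzmann-style acceptance of uphill moves) can be sketched on a small combinatorial problem. The graph, temperature schedule, and constants below are all invented for illustration, not drawn from any particular paper.

```python
import math
import random

random.seed(0)

# Toy combinatorial problem: maximize the cut of a small graph by
# assigning each node to one of two sets.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (0, 4)]
n = 5

def cut_size(assign):
    """Number of edges whose endpoints fall in different sets."""
    return sum(1 for u, v in edges if assign[u] != assign[v])

assign = [random.randint(0, 1) for _ in range(n)]
T = 2.0
for step in range(2000):
    i = random.randrange(n)
    old = cut_size(assign)
    assign[i] ^= 1                      # propose flipping one node
    delta = cut_size(assign) - old
    # Always accept improvements; accept worse states with prob e^(delta/T).
    if delta < 0 and random.random() > math.exp(delta / T):
        assign[i] ^= 1                  # reject: undo the flip
    T *= 0.999                          # geometric cooling schedule

# Greedy cleanup so the answer is at least a one-flip local optimum.
improved = True
while improved:
    improved = False
    for i in range(n):
        old = cut_size(assign)
        assign[i] ^= 1
        if cut_size(assign) <= old:
            assign[i] ^= 1              # no gain: undo
        else:
            improved = True

print(cut_size(assign))
```

The Boltzmann acceptance rule is the common thread: stochastic neural networks replace the single-flip proposal with noisy unit updates driven by the same temperature-controlled probability.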



------------------------------

Subject: Positions in BIOCOMPUTING
From:    Soren Brunak <brunak@cbs.dth.dk>
Date:    Tue, 05 Oct 93 12:04:07 +0000

Positions in BIOCOMPUTING at the Danish Center for Biological Sequence
Analysis, Department of Physical Chemistry, The Technical University of
Denmark (Lyngby).

A number of pre- and post-doctoral positions are available at the newly
formed Center for Biological Sequence Analysis. They have a duration of
one, two, or three years, starting late 1993 or early 1994. The  center
is funded by a  five-year  grant  from  the  Danish  National  Research
Foundation and conducts an  active  research  program  in  biomolecular
sequence  and  structure  analysis  with  emphasis  on  novel  adaptive
computational  strategies.  The  Technical  University  of  Denmark  is
situated in Lyngby just outside Copenhagen.

The center offers employment to researchers with a background primarily
in the natural sciences, molecular  biology,  genetics,  chemistry  and
physics. We seek individuals with additional competence and interest in
areas of computer science, but not with this area as the  main  subject
of expertise.  Priority  will  be  given  to  younger  scientists  with
experience in some of the  following  areas  (in  alphabetical  order):
experimental molecular  biology,  information  theory  and  statistics,
mathematical analysis, neural computation, protein folding, physics  of
computation and complex systems, and sequence analysis.

In a wide range of projects the center collaborates with  national  and
foreign groups using novel adaptive computational methods, many of which
have received attention in the biocomputing context only recently.  The
center is characterized by the use of new approaches both regarding the
algorithmic aspect of the simulation methods as  well  as  the  use  of
advanced hardware. A wide  range  of  parallel  and  cluster  computing
environments is available locally at the center; a nearby supercomputer
center offers easy access to CRAY and  Connection  Machine  facilities.
Among the  research  topics  are  pre-mRNA  splicing,  recognition  of
vertebrate  promoters,  RNA  folding,  protein  structure   prediction,
proteolytic processing of  polyproteins,  signal  peptide  recognition,
phylogenies, global multiple sequence alignment and dedicated  sequence
analysis hardware. The results are evaluated through intensive exchange
with experimentalists.

For further information, feel free to contact us at the address below.

Applicants should send their resumes to

Soren Brunak
Center director
Center for Biological Sequence Analysis
Department of Physical Chemistry
The Technical University of Denmark
Building 206
DK-2800 Lyngby
Denmark

Tel: +45-42882222, ext. 2477
Fax: +45-45934808
Email: brunak@cbs.dth.dk




------------------------------

End of Neuron Digest [Volume 12 Issue 6]
****************************************

Neuron Digest   Monday, 18 Oct 1993
                Volume 12 : Issue 7

Today's Topics:
                           ECAI '94 Amsterdam
              Bat-Sheva Seminar on Functional Brain Imaging
      Symposium on Connectionist Models and Psychology (Australia)


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: ECAI '94 Amsterdam
From:    vet@cs.utwente.nl (Paul van der Vet)
Date:    Thu, 01 Jul 93 16:24:14 +0100




                         E C A I  '94
                       A M S T E R D A M


          11th European Conference on Artificial Intelligence



                    Amsterdam RAI International
                   Exhibition and Congress Centre

                        The Netherlands
                       August 8-12, 1994



Call for Papers
Call for Workshop proposals
Exhibition
Call for Tutorial proposals


Organized by the European Coordinating Committee for Artificial
Intelligence (ECCAI)

Hosted by the Dutch Association for Artificial Intelligence (NVKI)

The European Conference on Artificial Intelligence (ECAI) is the European
forum for scientific exchange and presentation of AI research. The aim of
the conference is to cover all aspects of AI research and to bring
together basic research and applied research.  The Technical Programme
will include paper presentations, invited talks, panels, workshops, and
tutorials. The conference is designed to cover all subfields of AI,
including non-symbolic methods.

ECAIs are held in alternate years and are organized by the European
Coordinating Committee for Artificial Intelligence (ECCAI). The 11th ECAI
in 1994 will be hosted by the Dutch AI Society (NVKI). The conference
will take place at the Amsterdam RAI, International Exhibition and
Congress Centre.


E X H I B I T I O N

An industrial and academic exhibition will be organized from August 9 -
11, 1994.  Detailed information will be provided in the second call for
papers or can be obtained from the conference office (for the address see
elsewhere).

S P O N S O R S (preliminary list)

Bolesian B.V.
Municipality of Amsterdam
University of Amsterdam
Vrije Universiteit Amsterdam
University of Limburg


                 C A L L  F O R  P A P E R S


T O P I C S  O F  I N T E R E S T

You are invited to submit an original research paper that represents a
significant contribution to any aspect of AI, including the principles
underlying cognition, perception, and action in humans and machines; the
design, application, and evaluation of AI algorithms and intelligent
systems; and the analysis of tasks and domains in which intelligent
systems perform. Theoretical and experimental results are equally
welcome. Papers describing innovative ideas are especially sought, provided
such papers include substantial analysis of the ideas, the
technology needed to realize them, and their potential impact.

Of special interest this year are papers which address applied AI. Two
kinds of papers are sought. The first category is case studies of AI
applications that address significant real-world problems and which are
used outside the AI community itself; these papers must justify the use
of the AI technique, explain how the AI technology contributed to the
solution and was integrated with other components, and most importantly
explain WHY the application was successful (or perhaps why it failed) --
these "lessons learned" will be the most important review criteria. The
second category is for papers on novel AI techniques and principles that
may enable more ambitious real-world applications. All the usual AI
topics are appropriate. These papers must describe the importance of the
approach from an applications context, in sufficient technical detail and
clarity, and clearly and thoroughly differentiate the work from previous
efforts. There will be special prizes for the best papers in both these
areas.




S U B M I S S I O N  O F  P A P E R S

Authors are requested to submit to the Programme Chairperson 5 copies
of papers written in English in hardcopy format (electronic and fax
submissions will not be accepted).

Each submitted paper must conform to the following specifications.
Papers should be no longer than 5500 words including references. (Each
full page of figures counts as 1100 words.) Papers longer than this
limit risk being rejected without refereeing. A separate title page
should include the title, the name(s) of the author(s), complete
address(es), email, fax and telephone numbers, the specification of
between one and four Content Areas, preferably chosen from the list
below and an abstract (maximum 200 words). The title page should also
contain a declaration that the paper is unpublished, original work,
substantially different from papers currently under review and will
not be submitted elsewhere before the notification date other than to
workshops and similar specialized presentations with a very limited
audience. Papers should be printed on A4 or 8.5"x11" sized paper in
letter quality print, with 12 point type (10 chars/inch on a
typewriter), single spaced. Double sided printing is preferred.
Authors who wish to check that their submission will fit into the
final CRC format will be able to obtain detailed instructions
including a latex style file and example postscript pages after
October 15 by anonymous FTP from agora.leeds.ac.uk, directory ECAI94,
or by e-mailing ecai94-style@scs.leeds.ac.uk with a message body of
"help".

When submitting a paper an electronic mail message should also be sent
to ecai94-title@scs.leeds.ac.uk giving information in the format
specified below. If an intending author has no e-mail facilities then
this requirement is waived.

Papers should be sent to:

Programme Chairperson:        Dr Tony Cohn
                              Division of Artificial Intelligence
                              School of Computer Studies
                              University of Leeds
                              Leeds LS2 9JT
                              United Kingdom
                              Tel.:     (+44)-532-33.54.82
                              Fax:      (+44)-532-33.54.68
                              E-mail:   ecai94@scs.leeds.ac.uk

Neuron Digest   Tuesday, 19 Oct 1993
                Volume 12 : Issue 8

Today's Topics:
          job opening at Indiana: Cognitive Science/Psychology
   Graduate study in Cognitive and Neural Systems at Boston University
                          New Book Announcement
              Combinatorial optimization using neural nets?
           Request for Information - handwriting verification
                         position @ U. Michigan
             Motif version of hyperplane animator available
             Neural nets and channel decoding/equalization?
                       Assistant Professor opening
                           POSITION AVAILABLE
                      Short-term position (London)
                  Neurosciences Internet Resource Guide
                       SUMMARY: use of ANN in QSAR
                     "On-line" journals and papers?


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: job opening at Indiana: Cognitive Science/Psychology
From:    John Kruschke <kruschke@pallas.psych.indiana.edu>
Date:    Wed, 06 Oct 93 11:13:58 -0600


> If budget constraints permit, the Psychology Department at
> Indiana University anticipates a tenure-track appointment for the fall
> of 1994 in Cognitive/Cognitive Science.  Outstanding candidates in all
> areas of cognitive psychology will be considered, including those
> studying attention, visual perception and recognition, and basic
> memory processes, and those utilizing quantitative and computer
> modeling.  Applications received before November 15, 1993, will be
> assured of consideration.  Submit curriculum vita, (p)reprints, a
> description of current research interests and future directions, and
> arrange for three letters of reference to be forwarded to Professor
> M.J. Intons-Peterson, Chairperson, Department of Psychology, Indiana
> University, Bloomington, IN  47405.  Applications from women and
> minority members are especially encouraged.  Indiana University is an
> Affirmative Action/Equal Opportunity Employer.


------------------------------

Subject: Graduate study in Cognitive and Neural Systems at Boston University
From:    Announce@PARK.BU.EDU
Date:    Thu, 07 Oct 93 16:09:31 -0500



(please post)


         ***********************************************
         *                                             *
         *                 DEPARTMENT OF               *
         *      COGNITIVE AND NEURAL SYSTEMS (CNS)     *
         *              AT BOSTON UNIVERSITY           *
         *                                             *
         ***********************************************

                    Stephen Grossberg, Chairman
         Gail A. Carpenter, Director of Graduate Studies


The Boston University Department of Cognitive and Neural Systems offers
comprehensive advanced training in the neural and computational principles,
mechanisms, and architectures that underlie human and animal behavior,
and the application of neural network architectures to the solution of
technological problems.

Applications for Fall, 1994 admission and financial aid are now being
accepted for both the MA and PhD degree programs.

To obtain a brochure describing the CNS Program and a set of application
materials, write, telephone, or fax:

   Department of Cognitive & Neural Systems
   Boston University
   111 Cummington Street, Room 240
   Boston, MA 02215
   617/353-9481 (phone)
   617/353-7755 (fax)

or send via email your full name and mailing address to:

   rll@cns.bu.edu

Applications for admission and financial aid should be received by the
Graduate School Admissions Office no later than January 15.  Late
applications will be considered until May 1; after that date applications
will be considered only as special cases.

Applicants are required to submit undergraduate (and, if applicable,
graduate) transcripts, three letters of recommendation, and Graduate
Record Examination (GRE) scores. The Advanced Test should be in the
candidate's area of departmental specialization. GRE scores may be
waived for MA candidates and, in exceptional cases, for PhD candidates,
but absence of these scores may decrease an applicant's chances for
admission and financial aid.

Non-degree students may also enroll in CNS courses on a part-time basis.

Description of the CNS Department:

The Department of Cognitive and Neural Systems (CNS) provides advanced
training and research experience for graduate students interested in the
neural and computational principles, mechanisms, and architectures that
underlie human and animal behavior, and the application of neural network
architectures to the solution of technological problems. Students are
trained in a broad range of areas concerning cognitive and neural systems,
including vision and image processing; speech and language understanding;
adaptive pattern recognition; cognitive information processing; self-
organization; associative learning and long-term memory; computational
neuroscience; nerve cell biophysics; cooperative and competitive network
dynamics and short-term memory; reinforcement, motivation, and attention;
adaptive sensory-motor control and robotics; active vision; and biological
rhythms; as well as the mathematical and computational methods needed to
support advanced modeling research and applications. The CNS Department
awards MA, PhD, and BA/MA degrees.

The CNS Department embodies a number of unique features. It has developed
a curriculum that consists of twelve interdisciplinary graduate courses
each of which integrates the psychological, neurobiological, mathematical,
and computational information needed to theoretically investigate
fundamental issues concerning mind and brain processes and the applications
of neural networks to technology. Nine additional advanced courses,
including research seminars, are also offered. Each course is typically
taught once a week in the evening to make the program available to
qualified students, including working professionals, throughout the Boston
area. Students develop a coherent area of expertise by designing a program
that includes courses in areas such as Biology, Computer Science,
Engineering, Mathematics, and Psychology, in addition to courses in the
CNS curriculum.

The CNS Department prepares students for thesis research with scientists
in one of several Boston University research centers or groups, and with
Boston-area scientists collaborating with these centers. The unit most
closely linked to the department is the Center for Adaptive Systems (CAS).
Students interested in neural network hardware work with researchers in
CNS, the College of Engineering, and at MIT Lincoln Laboratory. Other
research resources include distinguished research groups in neurophysiology,
neuroanatomy, and neuropharmacology at the Medical School and the Charles
River campus; in sensory robotics, biomedical engineering, computer and
systems engineering, and neuromuscular research within the Engineering
School; in dynamical systems within the Mathematics Department; in
theoretical computer science within the Computer Science Department; and
in biophysics and computational physics within the Physics Department.

In addition to its basic research and training program, the Department
conducts a seminar series, as well as conferences and symposia, which bring
together distinguished scientists from both experimental and theoretical
disciplines.

1993-94 CAS MEMBERS and CNS FACULTY:

Jacob Beck
Daniel H. Bullock
Gail A. Carpenter
Chan-Sup Chung
Michael A. Cohen
H. Steven Colburn
Paolo Gaudiano
Stephen Grossberg
Frank H. Guenther
Thomas G. Kincaid
Nancy Kopell
Ennio Mingolla
Heiko Neumann
Alan Peters
Adam Reeves
Eric L. Schwartz
Allen Waxman
Jeremy Wolfe


------------------------------

Subject: New Book Announcement
From:    Gary Tajchman <tajchman@ICSI.Berkeley.EDU>
Date:    Thu, 07 Oct 93 17:32:57 -0800

I thought this might be of interest to folks on connectionists.  Kluwer
Academic has just published a book by H. Bourlard and N. Morgan called

``CONNECTIONIST SPEECH RECOGNITION: A Hybrid Approach''.

In the words of the back cover description, this book ``describes the
theory and implementation of a method to incorporate neural network
approaches into state-of-the-art continuous speech recognition systems
based on Hidden Markov Models (HMMs) to improve their performance.''  The
book is based on work done in a 5-year trans-Atlantic collaboration
between Bourlard and Morgan, and puts together in one place what is
otherwise scattered over a bunch of conference and journal papers.
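The core of the hybrid approach, as it is usually summarized, is to let a network trained to output state posteriors P(state | frame) feed an HMM decoder by dividing out the state priors, yielding "scaled likelihoods" proportional to P(frame | state). A minimal one-frame sketch, with all numbers invented for illustration:

```python
# Hypothetical NN outputs (posteriors) for one acoustic frame over
# 3 HMM states, and hypothetical state frequencies from training data.
posteriors = [0.7, 0.2, 0.1]
priors     = [0.5, 0.3, 0.2]

# Scaled likelihood: P(x | state) / P(x) = P(state | x) / P(state).
scaled_likelihoods = [p / q for p, q in zip(posteriors, priors)]

# An HMM decoder would combine these with transition probabilities;
# here we just pick the best state for this single frame.
best_state = max(range(3), key=lambda k: scaled_likelihoods[k])
print(best_state)
```

Dividing by the prior matters: a state that is merely frequent in the training data does not get an unearned advantage at decode time.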

If you would like more information please send email to N. Morgan at
morgan@icsi.berkeley.edu, or reply to this message.


Gary Tajchman                                   tajchman@icsi.berkeley.edu
International Computer Science Institute        TEL: (510)642-4274
1947 Center St., Suite 600                      FAX: (510)643-7684
Berkeley, CA


------------------------------

Subject: Combinatorial optimization using neural nets?
From:    ARDESHIR <CIVAB@VAXA.HERIOT-WATT.AC.UK>
Date:    Fri, 08 Oct 93 13:26:00 +0000

Hello,

I have recently been studying the solution of combinatorial optimization
problems using neural nets, in particular graph partitioning and graph
matching.  I would be grateful if anyone could recommend some of the papers
and publications on the use of neural nets in graph problems.

Yours Sincerely

A. Bahreininejad

email  civab@vaxa.hw.ac.uk    ....... for outside U.K.
       civab@uk.ac.hw.vaxa    ....... for within U.K.

P.S. If possible please provide me with the address(es) for FTP.
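For other readers curious about the same topic, the flavour of a Hopfield-style approach to graph bipartitioning can be sketched in a few lines: connected nodes prefer to share a state (keeping the cut small) while a global penalty keeps the two halves balanced. The graph, penalty weight, and update schedule below are invented for illustration, not taken from any specific paper.

```python
import random

random.seed(1)

# Adjacency lists of a small invented graph: two triangles joined by
# the single edge 2-3.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
alpha = 0.5                              # strength of the balance penalty
s = {i: random.choice([-1, 1]) for i in adj}

for sweep in range(100):                 # asynchronous updates to a fixed point
    changed = False
    for i in adj:
        # Net input: neighbours pull toward agreement (small cut),
        # the alpha term pulls toward a balanced split.
        h = sum(s[j] for j in adj[i]) - alpha * sum(s[j] for j in adj if j != i)
        new = 1 if h > 0 else -1
        if new != s[i]:
            s[i] = new
            changed = True
    if not changed:
        break                            # converged: no unit wants to flip

cut = sum(1 for i in adj for j in adj[i] if j > i and s[i] != s[j])
print(cut)
```

With symmetric couplings and asynchronous updates the energy decreases monotonically, so the network settles into a fixed point; the quality of that fixed point is exactly what the stochastic variants in the literature try to improve.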


------------------------------

Subject: Request for Information - handwriting verification
From:    B.FBORTO%CEFET.ANPR.BR@UICVM.UIC.EDU
Date:    Fri, 08 Oct 93 15:43:00 -0300

Neuron Digesters,

This is a plea for information. I'm an M.Sc. student at the Federal Center
of Technology (CEFET), Curitiba-PR, Brazil. My interests are neural
networks and their application to handwritten signature verification. I
wonder if any one of you can send me references or exchange information
with me on this subject. Thanks.


Nabeel Murshed


------------------------------

Subject: position @ U. Michigan
From:    Colleen Seifert <seifert@psych.lsa.umich.edu>
Date:    Mon, 11 Oct 93 10:53:17 -0600


Position in Cognitive Psychology
University of Michigan

The University of Michigan Department of Psychology invites applications
for a tenure-track position in the area of Cognition, beginning September
1, 1994. The appointment will most likely be made at the Assistant
Professor level, but it may be possible at other ranks.  We seek
candidates with primary interests and technical skills in cognitive
psychology.  Our primary goal is to hire an outstanding cognitive
psychologist, and thus we will look at candidates with any specific
research interest.  We have a preference for candidates interested in
higher mental processes or for candidates with computational modeling
skills (including connectionism) or an interest in cognitive
neuroscience.  Responsibilities include graduate and undergraduate
teaching, as well as research and research supervision.  Send curriculum
vitae, letters of reference, copies of recent publications, and a
statement of research and teaching interests no later than January 7,
1994 to: Gary Olson, Chair, Cognitive Processes Search Committee,
Department of Psychology, University of Michigan, 330 Packard Road, Ann
Arbor, Michigan 48104.  The University of Michigan is an Equal
Opportunity/Affirmative Action employer.


------------------------------

Subject: Motif version of hyperplane animator available
From:    Lorien Pratt <lpratt@slate.Mines.Colorado.EDU>
Date:    Mon, 11 Oct 93 11:46:41 -0700

                     -----------------------------------
                                 Announcing
                           the availability of an
                  X-based neural network hyperplane animator
                                Version 1.01
                              October 10, 1993
                     -----------------------------------

                        Lori Pratt and Steve Nicodemus
                Department of Mathematical and Computer Sciences
                          Colorado School of Mines
                              Golden, CO  80401
                                    USA
                          lpratt@mines.colorado.edu


    Understanding neural network behavior is an important goal of many
    research efforts.  Although several projects have sought to
    translate neural network weights into symbolic representations, an
    alternative approach is to understand trained networks
    graphically.  Many researchers have used a display of hyperplanes
    defined by the weights in a single layer of a back-propagation
    neural network.  In contrast to some network visualization schemes,
    this approach shows both the training data and the network
    parameters that attempt to fit those data.  At NIPS 1990, Paul
    Munro presented a video which demonstrated the dynamics of
    hyperplanes as a network changes during learning.  The program
    displayed ran on a Stardent 4000 graphics engine, and was
    implemented at Siemens.

    At NIPS 1991, we demonstrated an X-based hyperplane animator,
    similar in appearance to Paul Munro's, but with extensions to allow
    for interaction during training.  The user may speed up, slow down,
    or freeze animation, and set various other parameters.  Also, since
    it runs under X, this program should be more generally usable.

    An OpenWindows version of this program was released into the
    public domain in 1992.  This announcement describes a version of
    the hyperplane animator that has been rewritten for Motif.  It was
    developed on an IBM RS/6000 platform and is written in ANSI C.
    The remainder of this message contains more details of the
    hyperplane animator and ftp information.



1. What is the Hyperplane Animator?

The Hyperplane Animator is a program that allows easy graphical display
of training data and weights in a Back-Propagation neural
network [Rumelhart, 1987].  It implements only some of the
functionality that we eventually hope to include.  In particular, it
only animates hyperplanes representing input-to-hidden weights.

Back-Propagation neural networks consist of processing nodes
interconnected by adjustable, or ``weighted'' connections.  Neural
network learning consists of adjusting weights in response to a set of
training data.  The weights w1,w2,...wn on the connections into any one
node can be viewed as the coefficients in the equation of an
(n-1)-dimensional plane.  Each non-input node in the neural net is thus
associated with its own plane.  These hyperplanes are graphically
portrayed by the hyperplane animator.  On the same graph it also shows
the training data.
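To make this concrete, here is a small sketch in Python (ours, not part of the animator's source; the names `hyperplane_point`, `w`, and `b` are illustrative): for a two-input node, the weights and bias define the line w1*x1 + w2*x2 + b = 0 that an animator like this one would draw over the training data.

```python
# Sketch: recover the line (1-dimensional hyperplane) defined by the
# weights into a single two-input node of a back-propagation network.

def hyperplane_point(w, b, x1):
    """Return x2 such that (x1, x2) lies on w[0]*x1 + w[1]*x2 + b = 0."""
    return -(w[0] * x1 + b) / w[1]

w, b = [1.0, -2.0], 0.5          # weights into one hidden node, plus bias

# Two points are enough to draw the node's hyperplane (here, a line):
p_left = (0.0, hyperplane_point(w, b, 0.0))
p_right = (4.0, hyperplane_point(w, b, 4.0))
print(p_left, p_right)   # (0.0, 0.25) (4.0, 2.25)
```

With n inputs the same equation describes an (n-1)-dimensional plane, which is what the animator portrays alongside the training data.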

2. Why use it?

As learning progresses and the weights in a neural net alter,
hyperplane positions move.  At the end of the training they are in
positions that roughly divide training data into partitions, each of
which contains only one class of data.  Observations of hyperplane
movement can yield valuable insights into neural network learning.
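As a toy illustration of that movement (a sketch of ours, not the animator's code, and using the simple perceptron rule in place of full back-propagation): a single threshold unit trained on a separable two-class set shifts its line with each weight update until the classes are divided.

```python
# Toy illustration of hyperplane movement during learning, using the
# perceptron rule on a single unit.  Points with x1 = 1 belong to class 1;
# the unit's line moves until it separates the two classes.

data = [((0.0, 0.0), 0), ((0.0, 1.0), 0), ((1.0, 0.0), 1), ((1.0, 1.0), 1)]
w, b, rate = [0.0, 0.0], 0.0, 1.0

for epoch in range(10):
    errors = 0
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        if out != target:                  # misclassified: move the plane
            w[0] += rate * (target - out) * x1
            w[1] += rate * (target - out) * x2
            b += rate * (target - out)
            errors += 1
    print("epoch %d: %.1f*x1 + %.1f*x2 + %.1f = 0 (%d errors)"
          % (epoch, w[0], w[1], b, errors))
    if errors == 0:
        break
```

Printing (or, in the animator's case, drawing) the coefficients after each pass shows the hyperplane settling into a position that partitions the training data.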

3. Platform information

The Animator was developed using the Motif toolkit on an IBM RS6000
with X-Windows.  It appears to be stable on this platform, and has not
been compiled on other platforms.  However, DECstation 5000 and SGI
workstations have been successfully used as graphics servers for the
animator.

How to install the hyperplane animator:

  You will need a machine which has X-Windows, and the Motif libraries.

  1. Copy the file hyperplane-animator.tar to your machine via ftp as follows:

     ftp mines.colorado.edu (138.67.1.3)
     Name: anonymous
     Password: (your ID)
     ftp> cd pub/software/hyperplane-animator
     ftp> binary
     ftp> get hyperplane-animator.tar
     ftp> quit

  2. Extract files from hyperplane-animator.tar with:
     tar -xvf hyperplane-animator.tar

  3. Read the README file there.  It includes information about
     compiling.  It also includes instructions for running a number of
     demonstration networks that are included with this distribution.

DISCLAIMER:
  This software is distributed as shareware, and comes with no warranties
whatsoever for the software itself or systems that include it.  The authors
deny responsibility for errors, misstatements, or omissions that may or
may not lead to injuries or loss of property.  This code may not be sold
for profit, but may be distributed and copied free of charge as long as
the credits window, copyright statement in the program, and this notice
remain intact.

------------------------------

Subject: Neural nets and channel decoding/equalization?
From:    "P. BALAY (CCETT/SRL/DCS) (Tel 99124225)" <Pascal.Balay@ccett.fr>
Date:    13 Oct 93 09:36:32 -0100

I'm preparing a Ph.D. concerning the use of neural networks in digital
communications (especially equalization and channel decoding). I would
be happy to hear from anyone who has information related to this type of
research.  Please mail your answers to:

     balay@ccett.fr



------------------------------

Subject: Assistant Professor opening
From:    Robbie Jacobs <robbie@prodigal.psych.rochester.edu>
Date:    Wed, 13 Oct 93 14:20:16 -0500

Dear Colleague:

The attached advertisement describes an Assistant Professor
position for a Behavioral Neuroscientist in the Department
of Psychology at the University of Rochester.  It is anticipated
that this position will be available July 1, 1994 or 1995.
We hope to attract a scientist who will interact productively
with existing faculty whose research interests are in developmental
psychobiology and/or learning and memory.  Also, the candidate
would be part of a university-wide community of over 60
neuroscientists contributing to inter-departmental graduate
and undergraduate programs in neuroscience.

I would appreciate it if you could bring this position to the
attention of suitable candidates.

                                Sincerely,

                                Ernie Nordeen
                                Associate Professor of Psychology
                                   and of Neurobiology & Anatomy


BEHAVIORAL NEUROSCIENTIST.  The Department of Psychology at
the University of Rochester anticipates an Assistant Professor
position in neuroscience.  We are particularly interested in
persons investigating relationships between brain and behavioral
plasticity at the level of neural systems.  Individuals whose
research emphasizes either i) neural mechanisms of learning and
memory, or ii) development/reorganization in perceptual or motor
systems are especially encouraged to apply, but persons interested
in related areas of behavioral neuroscience will also be considered.
The successful candidate is expected to develop an active research
program, and participate in teaching within graduate and undergraduate
programs in neuroscience.   Applicants should submit curriculum vitae,
a brief statement of research interests, and three letters of reference
by 1 February 1994 to: Chair, Biopsychology Search Committee,
Dept. of Psychology, University of Rochester, Rochester, NY, 14627.
An Affirmative Action/Equal Opportunity Employer.


------------------------------

Subject: POSITION AVAILABLE
From:    "Frances T. Perillo" <Frances.T.Perillo@Dartmouth.EDU>
Date:    14 Oct 93 09:21:13 -0500

Cognitive Position:  The Department of Psychology at Dartmouth College has a
junior, tenure-track position available in the area of Cognition -- broadly
construed to include any area of research within Cognitive Psychology,
Cognitive Science, and/or Cognitive Neuroscience.  Candidates must be able to
establish a strong research program and must have a commitment to
undergraduate and graduate instruction.  Supervision of both graduate and
undergraduate research will be expected.    Please send a letter of
application, vita and three letters of recommendation to:  Chair, Cognitive
Search Committee, Department of Psychology, 6207 Gerry Hall, Dartmouth
College, Hanover, NH  03755-3459.  Review of applications will begin on
February 15, 1994.   Dartmouth College is an equal opportunity employer with
an affirmative action plan.  Women and members of minority groups are
encouraged to apply.



------------------------------

Subject: Short-term position (London)
From:    annette@cdu.ucl.ac.uk (Annette Karmiloff-Smith)
Date:    Thu, 14 Oct 93 15:09:23 +0000

Can an ad like the following go in the Neuron Digest?
If so, please post:

A short-term non-clinical scientific post is available at the CDU for 3
years to work on the biological basis of cognitive development alongside
Dr Mark Johnson.  The successful candidate should have a PhD and
preferably have experience in the use of event-related potentials as
applied to developmental processes, though this is not essential.
Experience of running ERP studies with infants would be particularly
desirable.  Candidates should also have an interest in the brain basis of
cognition.  Further details of the post may be obtained from Dr Johnson
by E-mail: mark@cdu.ucl.ac.uk

The salary is on the non-clinical scientific scale, £15,563 - £24,736 per
annum, plus £2,134 London Weighting.

Applications should be made in writing enclosing CV and names of three
referees to The Director, Professor John Morton, at 4, Taviton Street,
London WC1H OBT, UK (Fax 071-383-0398, E-mail: j.morton@uk.ac.ucl) by 17
Nov 1993.  The Medical Research Council is an equal opportunities
employer.

MRC Cognitive Development Unit, London.



------------------------------

Subject: Neurosciences Internet Resource Guide
From:    sherylc@umich.edu (Sheryl Cormicle)
Organization: University of Michigan
Date:    14 Oct 93 16:49:10 +0000

========================================================

     The Neurosciences:  An Internet Resource Guide

========================================================

WHAT WE'RE DOING:

A Neurosciences Internet Resource Guide (NIRG) is currently under
construction at the University of Michigan's School of Information and
Library Studies.

The NIRG project proposes to bring together related neuroscience
resources in biology, biotechnology, and the bioelectrical and biomedical
fields.  Resources will include electronic journals, newsletters,
databases, image and video databases, documents, reports, gopher sites,
usenet and listserv groups, among others.

In order to provide the most useful guide possible, we welcome resource
suggestions from members of the Internet community and specifically those
with some expertise in the neurosciences.  If you know of a unique,
relevant, or useful resource, we'd love to hear from you.  Even if you
don't know of a resource, we'd like to know what you think of the value
of such a guide.  Your participation is crucial to the success of this
project.

WHY WE'RE DOING IT:

Several excellent guides to Biology resources on the Internet exist (such
as Una R. Smith's).  This project will provide not only a guide with a
tighter focus on neuroscience resources, but also a guide with a broader,
more interdisciplinary approach.

WHO WE ARE:

The NIRG project is directed by Sheryl Cormicle and Steve Bonario, two
Master's-level students in the School of Information and Library Studies at
the University of Michigan.  The resource guide is a term project and
will be made available to the Internet community in December of this
year.  We hope to make the guide available in several formats, including
a simple text file, a gopher document, and an HTML document.

CONTACT:

If you wish to suggest a resource, please send mail to nirg@umich.edu

You may also write to us directly:

Sheryl Cormicle:  sherylc@umich.edu
or
Steve Bonario:  sbonario@umich.edu

------------------------------

Subject: SUMMARY: use of ANN in QSAR
From:    Bernard Budde <BUDDE@SC.AGRO.NL>
Date:    Tue, 19 Oct 93 10:25:00 +0000

# Hello,
#
# In Neuron Digest 12(1) 1993 (8 Sept) I asked for information on the use
#    of neural nets in QSAR studies (Quantitative Structure-Activity
#    Relationships), responding to Dr. Manallack.
# Two people responded, Dr. Manallack and Mr. Tetko. They sent me
#    interesting papers and titles, *thanks*! I'll list their addresses and
#    the titles below.
# From below the sea-level,  Bernard Budde
#
# One title I'd like to add:
# Tabak, H.H., R. Govind (1993). Prediction of biodegradation kinetics using
#    a nonlinear group contribution method. Environ Toxicol Chem 12:251-260.
#
#----------------------------------------------------------------------------
# Igor V. Tetko
# tetko%bioorganic.kiev.ua@relay.ussr.eu.net
# Ph.D. Student, working with application of NN in QSAR.
#
# Andrea, T.A., and H. Kalayeh, "Application of Neural Networks in QSAR of
#    Dihydrofolate Reductase Inhibitors", J. Med. Chem., 1991, 34, 2824-2836
# Aoyama, T., Y. Suzuki, H. Ichikawa, "Neural Networks Applied to QSAR",
#    J. Med. Chem., 1990, 33, 2583-2590
# Aoyama, Y., Y. Suzaki, H. Ichikawa, "Neural Networks Applied to SAR",
#    J. Med. Chem., 1990, 38, 905-908
# Chastrette, M., J.Y. de Saint Laumer, "Structure-odor relationships using
#    neural networks", Eur. J. Med. Chem., 1991, 26, 829-833
# Erb, R.J., "Introduction to Backpropagation Neural Network Computation",
#    Pharmaceutical Research, 1993, 10, 165-170
# Liu, Q., S. Hirono, I. Moriguchi, "Application of Functional-Link Net
#    in QSAR. 2. QSAR for Activity Data Given by Ratings", Quant.
#    Struct.-Act. Relat., 1992, 11, 318-324
# Liu, Q., S. Hirono, I. Moriguchi, "Comparison of the Functional-Link Net
#    and the Generalized Delta Rule Net in QSAR", Chem. Pharm. Bull., 1992,
#    40, 2962-2969
# Tetko, I., A. Luik, G. Poda, "Application of Neural Networks in SAR of
#    a Small Number of Molecules", J. Med. Chem., 1993, 36, 811-814
#
#----------------------------------------------------------------------------
# David Manallack
# manallack_d%frgen.dnet@smithkline.com
# SmithKline Beecham Pharmaceuticals, Herts., UK.
# ....NN's designed to perform data analysis and the
# associated dangers of chance effects....
#
# Livingstone, D.J., D.T. Manallack (1993). Statistics using neural networks:
#    chance effects. J. Med. Chem. 36(9):1295-1297.
# Livingstone, D.J. and D.W. Salt (1992). Regression analysis for QSAR using
#    neural networks. BioMed. Chem. Lett. 2(3):213-218.
# Manallack, D.T. and D.J. Livingstone (1992). Artificial neural networks:
#    application and chance effects for QSAR data analysis. Med. Chem. Res.
#    2:181-190.
# Salt, D.W., N. Yildiz, D. Livingstone, C.J. Tinsley (1992). The use of
#    artificial neural networks in QSAR. Pestic. Sci. 36:161-170.
#----------------------------------------------------------------------------


------------------------------

Subject: "On-line" journals and papers?
From:    "Frank Zanotto" <"zanotto frank"@a1.snofs1.sno.MTS.dec.com>
Date:    Tue, 19 Oct 93 04:55:01 -0800

[[ Editor's Note: Of course, there is the Ohio State Neuroprose archive
which is frequently advertised in this Digest.  Are there other central
repositories available via ftp, gopher, or WAIS? -PM ]]

Hello

RE ascii journals and/or papers

Can anyone provide a pointer to an archive of journals and/or papers on
neural networks stored in ASCII format?  The reason: I will be accessing
and storing the information on a DIGITAL VMS system.

Many thanks.

Regards

Frank Zanotto

------------------------------

End of Neuron Digest [Volume 12 Issue 8]
****************************************
Received: from BUACCA by BUACCA.BU.EDU (Mailer R2.08 PTF009) with BSMTP id
 3702; Thu, 28 Oct 93 16:18:49 EDT
Received: from noc4.dccs.upenn.edu by BUACCA.BU.EDU (IBM VM SMTP R1.2.1) with
 TCP; Thu, 28 Oct 93 16:18:36 EDT
Received: from CATTELL.PSYCH.UPENN.EDU by noc4.dccs.upenn.edu
	id AA29441; Thu, 28 Oct 93 16:16:52 -0400
Return-Path: <marvit@cattell.psych.upenn.edu>
Received: from LOCALHOST by cattell.psych.upenn.edu
	id AA18321; Thu, 28 Oct 93 15:03:28 EDT
Posted-Date: Thu, 28 Oct 93 15:02:33 -0400
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #9 (conferences & CFP)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
X-Errors-To: "Neuron-Request" <neuron-request@psych.upenn.edu>
Organization: University of Pennsylvania
Date: Thu, 28 Oct 93 15:02:33 -0400
Message-Id: <18272.751834953@cattell.psych.upenn.edu>
Sender: marvit@cattell.psych.upenn.edu

Neuron Digest   Thursday, 28 Oct 1993
                Volume 12 : Issue 9

Today's Topics:
                                 SCS CFP
                                ISIKNH'94
                  ESANN'94: European Symposium on ANNs
                                EuroCOLT
                  Artificial Life Workshop Announcement


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: SCS CFP
From:    CELLIER@cadmus.ece.arizona.edu
Date:    Tue, 19 Oct 93 20:24:14 -0800

                       ANNOUNCEMENT AND CALL FOR PAPERS
       The Society for Computer Simulation International (SCS) presents:

           1994 INTERNATIONAL CONFERENCE ON QUALITATIVE INFORMATION,
                FUZZY TECHNIQUES, AND NEURAL NETWORKS IN SIMULATION

                       June 1-3 1994,  Barcelona, Spain

      Part of the 1994 SCS European Multiconference on Computer Simulation


      The 1994 International Conference on Qualitative Information, Fuzzy
Techniques, and Neural Networks in Simulation brings together research paper
presentations, panel sessions, tutorials, workshops, seminars, industrial
applications, and software demonstrations that make use of qualitative
information of some sort or other in models of dynamic systems for the
purpose of simulation.

      Research papers are welcome in the following categories of presentation
sessions.

      . Tutorials                        . Time-dependent Expert Systems
      . Panel Discussions                . Qualitative Data Bases
      . Software and Tools                  for Simulation
      . Theory                           . Associative Memory
      . Common Sense Reasoning              for Simulation
         about Dynamical Processes       . Fuzzy Information Models
      . Inductive Reasoning              . Fuzzification and Defuzzification
         about Dynamical Processes       . Treatment of Uncertainty
      . Knowledge-based Reasoning           in Dynamical Systems
         about Dynamical Processes       . Treatment of Incomplete Knowledge
      . Naive Physics                       about Dynamical Systems
      . Neural Networks                  . Assumptions and Belief Systems
         for Dynamical Processes         . Models of Human Reasoning
                                            Processes
      . Fault Monitoring and Diagnosis

     In recent years, more and more papers have been published that combine
several of the known qualitative knowledge representation techniques in a
single algorithm.  For example, a number of papers were recently published
on fuzzy neural networks.  We therefore believe that it makes sense to
bring the experts on and advocates of the various techniques together in
one conference.

DEADLINES:
      . November 30, 1993     Extended Abstracts or full paper drafts due.
      . February 14, 1994     Notification of acceptance/rejection to authors.
      . March 14, 1994        Camera Ready Copies due.

      Please send four copies of extended abstracts (four to six typed pages,
excluding figures and tables) or drafts of full papers (a maximum of twelve
typed pages) to:

                    Prof. Dr. Antoni Guasch
                    General Program Chairperson of ESM'94
                    Dept. ESAII,  ETSEIB
                    Universitat Politecnica de Catalunya
                    Diagonal 647,  2 planta
                    08028  BARCELONA
                    Spain

                    Phone: +34(3)401-6544
                    FAX:   +34(3)401-6600
                    EMail: Guasch@ESAII.UPC.ES

Please add a cover letter stating your name, affiliation, mailing address,
telephone number, FAX number, and EMail address.  Indicate clearly that your
paper is being submitted to ICQFN'94.  Each manuscript will be reviewed by
at least two members of the International Program Committee of ICQFN'94.


Conference Chairperson of ICQFN'94        Program Chairperson of ICQFN'94
- ----------------------------------        -------------------------------
Francois E. Cellier, Ph.D.                Dr. Joseph Aguilar Martin
Associate Professor                       Grup SAC,  Dept. ESAII
Dept. of Electr. and Computer Engr.       ETSEIT-UPC
University of Arizona                     Colom, 11
TUCSON,  AZ  85721                        08222  TERRASSA,  Catalunya
U.S.A.                                    Spain

Phone: +1(602)621-6192                    Phone: +34(3)739-8144
FAX:   +1(602)621-8076                    FAX:   +34(3)739-8101
EMail: Cellier@ECE.Arizona.Edu            EMail: Aguilar@ESAII.UPC.ES


                International Program Committee of ICQFN'94
                -------------------------------------------
Robert Babuska             Delft University of Technology         Netherlands
Silvano Colombano          NASA Ames Research Center              U.S.A.
Massimo de Gregorio        Institute of Cybernetics               Italy
Ken Forbus                 Northwestern University                U.S.A.
Ken-ichi Funahashi         University of Aizu                     Japan
Fernando Gomide            University of Campinas                 Brazil
Madan Gupta                University of Saskatchewan             Canada
Katalin Hangos             Hungarian Academy of Sciences          Hungary
Yumi Iwasaki               Stanford University                    U.S.A.
Mohamed Jamshidi           University of New Mexico               U.S.A.
Eugene Kerckhoffs          Delft University of Technology         Netherlands
Bart Kosko                 University of Southern California      U.S.A.
Granino Korn               G.A. & T.M. Korn Associates            U.S.A.
Roy Leitch                 Heriot-Watt University Edinburgh       United Kingdom
Derek Linkens              University of Sheffield                United Kingdom
Hiroshi Narazaki           Kobe Steel Ltd.                        Japan
Kevin Passino              Ohio State University                  U.S.A.
Witold Pedrycz             University of Manitoba                 Canada
Ethan Scarl                The Boeing Company                     U.S.A.
Georgios Stavrakakis       Technical University of Crete          Greece
Jan-Erik Stromberg         Linkoeping University                  Sweden
Hideo Tanaka               University of Osaka                    Japan
Jan Top                    Energy Research Foundation             Netherlands
Louise Trave-Massuyes      L.A.A.S. / C.N.R.S.                    France
Enric Trillas              Technical University of Madrid         Spain
Ghislain Vansteenkiste     University of Ghent                    Belgium
Llorenc Valverde           University of the Balearic Islands     Spain
Tu Van Le                  University of Canberra                 Australia
Takeshi Yamakawa           Kyushu Institute of Technology         Japan
Lotfi Zadeh                University of California Berkeley      U.S.A.
Hans-Juergen Zimmermann    R.W.T.H. Aachen                        Germany

=====virtual scissors=======================================================

 ICQFN'94 -- INTERNATIONAL CONFERENCE ON QUALITATIVE INFORMATION,
             FUZZY TECHNIQUES, AND NEURAL NETWORKS IN SIMULATION
 ****************************************************************

 [  ]  I plan to submit a paper

 [  ]  I wish to receive the Preliminary Program


 Name: _____________________________________

 Title: ____________________________________

 Affiliation: ______________________________

 Address: __________________________________

 ___________________________________________

 ___________________________________________

 Phone Number: _____________________________

 FAX Number: _______________________________

 E_Mail Address: ___________________________


 If you wish to receive further information about ICQFN'94, please fill in
 the above form and return it to: Cellier@ECE.Arizona.Edu



------------------------------

Subject: ISIKNH'94
From:    "Li-Min Fu" <fu@whale.cis.ufl.edu>
Date:    Fri, 13 Aug 93 11:04:26 -0500


                            CALL  FOR   PAPERS


   International Symposium on Integrating Knowledge and Neural Heuristics
                              (ISIKNH'94)

Sponsored by University of Florida, and AAAI,
in cooperation with IEEE Neural Network Council,
and Florida AI Research Society.

Time: May 9-10 1994; Place: Pensacola Beach, Florida, USA.


A large amount of research has been directed
toward integrating neural and symbolic methods in recent years.
Especially, the integration of knowledge-based principles and
neural heuristics holds great promise
in solving complicated real-world problems.
This symposium will provide a forum for discussion
and exchange of ideas in this area. The objective of this symposium
is to bring together researchers from a variety of fields
who are interested in applying neural network techniques
to augment existing knowledge (or the other way around),
and especially those who have demonstrated that this combined approach
outperforms either approach alone.
We welcome views of this problem from
areas such as constraint-(knowledge-) based learning and
reasoning, connectionist symbol processing,
hybrid intelligent systems, fuzzy neural networks,
multi-strategic learning, and cognitive science.

Examples of specific research include but are not limited to:
1. How do we build a neural network based on a priori
knowledge (i.e., a knowledge-based neural network)?
2. How do neural heuristics improve the current model
for a particular problem (e.g., classification, planning,
signal processing, and control)?
3. How does knowledge in conjunction with neural heuristics
contribute to machine learning?
4. What is the emergent behavior of a hybrid system?
5. What are the fundamental issues behind the combined approach?

Program activities include keynote speeches, paper presentation,
panel discussions, and tutorials.

*****
Scholarships are offered to assist students in attending the
symposium.  Students who wish to apply for a scholarship should send
their resumes and a statement of how their research is related
to the symposium.
*****


Symposium Chairs:
LiMin Fu, University of Florida, USA.
Chris Lacher,  Florida State University, USA.

Program Committee:
Jim Anderson,   Brown University,  USA
Michael Arbib,  University of Southern California,  USA
Fevzi Belli,  The University of Paderborn,  Germany
Jim Bezdek,  University of West Florida,  USA
Bir Bhanu,  University of California,  USA
Su-Shing Chen,  National Science Foundation,  USA
Tharam Dillon,  La Trobe University,  Australia
Douglas Fisher,  Vanderbilt University,  USA
Paul Fishwick,  University of Florida,  USA
Stephen Gallant,  HNC Inc.,  USA
Yoichi Hayashi,  Ibaraki University,  Japan
Susan I. Hruska,  Florida State University,  USA
Michel Klefstad-Sillonville,  CCETT,  France
David C. Kuncicky,  Florida State University,  USA
Joseph Principe,  University of Florida,  USA
Sylvian Ray,  University of Illinois,  USA
Armando F. Rocha, University of Estadual, Brasil
Ron Sun,  University of Alabama,  USA

Keynote Speaker: Balakrishnan Chandrasekaran, Ohio State University


Schedule for Contributed Papers
- ----------------------------------------------------------------------
Paper Summaries Due: December 15, 1993
Notice of Acceptance Due: February 1, 1994
Camera Ready Papers Due: March 1, 1994

Extended paper summaries should be
limited to four pages (single or double-spaced)
and should include the title, names of the authors, the
network and mailing addresses and telephone number of the corresponding
author.  Important research results should be attached.
Send four copies of extended paper summaries to

      LiMin Fu
      Dept. of CIS, 301 CSE
      University of Florida
      Gainesville, FL 32611
      USA
      (e-mail: fu@cis.ufl.edu; phone: 904-392-1485).

Students' applications for a scholarship should also be sent
to the above address.

General information and registration materials can be obtained by
writing to

      Rob Francis
      ISIKNH'94
      DOCE/Conferences
      2209 NW 13th Street, STE E
      University of Florida
      Gainesville, FL 32609-3476
      USA
      (Phone: 904-392-1701; fax: 904-392-6950)
- ---------------------------------------------------------------------


- ---------------------------------------------------------------------
If you intend to attend the symposium, you may submit the following
information by returning this message:


NAME: _______________________________________
ADDRESS: ____________________________________
_____________________________________________
_____________________________________________
_____________________________________________
_____________________________________________
PHONE: ______________________________________
FAX: ________________________________________
E-MAIL: _____________________________________


- ---------------------------------------------------------------------



------------------------------

Subject: ESANN'94: European Symposium on ANNs
From:    esann@dice.ucl.ac.be
Date:    Wed, 01 Sep 93 15:16:42 +0100

Dear Moderator,

Could you please include the following call for papers for the second
European Symposium on Artificial Neural Networks, in the next issue of
Neuron-Digest?

Thank you in advance.
Sincerely yours,

Michel Verleysen

- --------------------------- Cut here -------------------------------
____________________________________________________________________
____________________________________________________________________

                        European Symposium
                  on Artificial Neural Networks

                 Brussels - April 20-21-22, 1994

             First announcement and call for papers

____________________________________________________________________
____________________________________________________________________

- -----------------------
Scope of the conference
- -----------------------

ESANN'93 was held in Brussels in April 1993.  It gathered more than 80
scientists, from about 15 countries, who wanted to learn more about the
latest developments in the theory of neural networks.

The European Symposium on Artificial Neural Networks will be organized for
the second time in April 1994 and, as in 1993, will focus on the
fundamental aspects of artificial neural network research.  Today,
thousands of researchers work in this field; they try to develop new
algorithms, to mimic properties found in natural networks, to develop
parallel computers based on these properties, and to use artificial neural
networks in new application areas.  But the field is new, and has expanded
drastically in about ten years; this has led to a lack of theoretical work
on the subject, and also to a lack of comparisons between new methods and
more classical ones.

The purpose of ESANN is to cover the theoretical and fundamental aspects of
neural networks; the symposium is intended to give participants an
up-to-date and comprehensive view of these aspects, through the
presentation of new results and developments, through tutorial papers
covering the relations between neural networks and classical methods of
computing, and through round tables confronting the views of specialists
and non-specialists of the field.  The program committee of ESANN'94
welcomes papers on the following aspects of artificial neural networks:
. theory
. models and architectures
. mathematics
. learning algorithms
. biologically plausible artificial networks
. neurobiological systems
. adaptive behavior
. signal processing
. statistics
. self-organization
. evolutionary learning
Accepted papers will cover new results in one or several of these aspects
or will be of a tutorial nature.  Papers emphasizing the relations between
artificial neural networks and classical methods of information processing,
signal processing or statistics are encouraged.


- ----------------------
Call for contributions
- ----------------------

Prospective authors are invited to submit six originals of their
contribution before November 26, 1993.  The working language of the
conference (including the proceedings) is English.

Papers should not exceed six A4 pages (including figures and references).
The printing area will be 12.2 x 19.3 cm (centered on the A4 page); the
left, right, top and bottom margins will thus be 4.4, 4.4, 5.2 and 5.2 cm
respectively.  A 10-point Times font will be used for the main text;
headings will be in bold characters (but not underlined) and will be
separated from the main text by two blank lines before and one after.
Manuscripts prepared in this format will be reproduced at the same size in
the book.

The first page will begin with a heading, indented 1 cm left and right
with respect to the main text (the heading will thus have left and right
margins of 5.4 cm).  The heading will contain the title (Times 14 point,
bold, centered), one blank line, the author(s) name(s) (Times 10 point,
centered), one blank line, the affiliation (Times 9 point, centered), one
blank line, and the abstract (Times 9 point, justified, beginning with the
word "Abstract." in bold face).
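For authors typesetting in LaTeX, the layout above can be approximated as
follows.  This is only a sketch under our own assumptions: the call for
papers specifies dimensions, not LaTeX settings, and the geometry package
options shown are merely one way to obtain the stated measurements.

```latex
% Illustrative sketch only; not part of the official instructions.
% A 12.2 x 19.3 cm text block centered on A4 (21.0 x 29.7 cm) yields
% margins of (21.0 - 12.2)/2 = 4.4 cm left/right and
% (29.7 - 19.3)/2 = 5.2 cm top/bottom, matching the figures above.
\documentclass[10pt]{article}
\usepackage[a4paper,textwidth=12.2cm,textheight=19.3cm,centering]{geometry}
\usepackage{times}        % Times font for the main text
\pagestyle{empty}         % manuscript pages are not numbered

\begin{document}
% Title block: title (14pt bold, centered), author names (10pt,
% centered), affiliation and abstract (9pt), separated by blank lines.
\end{document}
```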

Originals of the figures will be pasted into the manuscript and centered
between the margins.  The lettering of the figures should be in 10-point
Times font size.  Figures should be numbered.  The legends should also be
centered between the margins and written in 9-point Times font size.

The pages of the manuscript will not be numbered (numbering will be
decided by the editor).

A separate page (not included in the manuscript) will indicate:
. the title of the manuscript
. author(s) name(s)
. the complete address (including phone & fax numbers and E-mail) of the
corresponding author
. a list of five keywords or topics
On the same page, the authors will copy and sign the following paragraph:
"In case of acceptance of the paper for presentation at ESANN 94:
- - at least one of the authors will register for the conference and will
present the paper
- - the author(s) give up their rights over the paper to the organizers of
ESANN 94, for the proceedings and any publication that could directly be
generated by the conference
- - if the paper does not match the format requirements for the proceedings,
the author(s) will send a revised version within two weeks of the
notification of acceptance."

Contributions must be sent to the conference secretariat.  Prospective
authors may request examples of camera-ready contributions by writing to
the same address.


- -------------
Local details
- -------------

The conference will be held in the center of Brussels (Belgium).  Close to
most major European cities, Brussels is exceptionally well served by
closely-knit motorway and railway systems, and an international airport.
Besides being an artistic and cultural center of attraction, Brussels is
also renowned for its countless typical cafés, from the most unassuming to
the most prestigious.  Belgian food is typical and famous, and the night
life in Brussels is lively.


- ---------
Deadlines
- ---------

Submission of papers    November 26, 1993
Notification of acceptance      January 17, 1994
Symposium       April 20-22, 1994


- ------
Grants
- ------

A limited number of grants (registration fees and economic accommodation)
will be given to young scientists coming from the European Community (Human
Capital and Mobility program, European Community - DG XII).  Grants will
also probably be available for scientists from Central and Eastern European
countries.  Please write to the conference secretariat to get an
application form for these grants.


- ----------------------
Conference secretariat
- ----------------------

Dr. Michel Verleysen
D Facto Conference Services
45 rue Masui
B - 1210 Brussels (Belgium)
phone: + 32 2 245 43 63
Fax: + 32 2 245 46 94
E-mail: esann@dice.ucl.ac.be


- ----------
Reply form
- ----------

If your contact address is incomplete or has been changed recently, or if
you know a colleague who might be interested in ESANN'94, please send this
form, with your or his/her name and address, to the conference secretariat:

''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
Name:   .....................................................
First Name:     ...............................................
University or Company:  ....................................
............................................................
Address:        ..................................................
............................................................
............................................................
ZIP:    ......................................................
Town:   .....................................................
Country:        ..................................................
Tel:    ......................................................
Fax:    ......................................................
E-mail:         ...................................................
''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''


- ------------------
Steering committee
- ------------------

François Blayo                 EERIE, Nîmes (F)
Marie Cottrell                 Univ. Paris I (F)
Nicolas Franceschini           CNRS Marseille (F)
Jeanny Hérault                 INPG Grenoble (F)
Michel Verleysen               UCL Louvain-la-Neuve (B)


- --------------------
Scientific committee
- --------------------

Luis Almeida *                 INESC - Lisboa (P)
Jorge Barreto                  UCL Louvain-en-Woluwe (B)
Hervé Bourlard                 L. & H. Speech Products (B)
Joan Cabestany                 Univ. Polit. de Catalunya (E)
Dave Cliff                     University of Sussex (UK)
Pierre Comon                   Thomson-Sintra Sophia (F)
Holk Cruse                     Universität Bielefeld (D)
Dante Del Corso                Politecnico di Torino (I)
Marc Duranton                  Philips / LEP (F)
Jean-Claude Fort               Université Nancy I (F)
Karl Goser                     Universität Dortmund (D)
Martin Hasler                  EPFL Lausanne (CH)
Philip Husbands                University of Sussex (UK)
Christian Jutten               INPG Grenoble (F)
Petr Lansky                    Acad. of Science of the Czech Rep. (CZ)
Jean-Didier Legat              UCL Louvain-la-Neuve (B)
Jean-Arcady Meyer *            Ecole Normale Supérieure - Paris (F)
Erkki Oja                      Helsinki University of Technology (SF)
Guy Orban                      KU Leuven (B)
Gilles Pagès *                 Université Paris I (F)
Alberto Prieto                 Universidad de Granada (E)
Pierre Puget                   LETI Grenoble (F)
Ronan Reilly                   University College Dublin (IRE)
Tamas Roska                    Hungarian Academy of Science (H)
Jean-Pierre Rospars            INRA Versailles (F)
André Roucoux                  UCL Louvain-en-Woluwe (B)
John Stonham                   Brunel University (UK)
Lionel Tarassenko              University of Oxford (UK)
John Taylor                    King's College London (UK)
Vincent Torre                  Universita di Genova (I)
Claude Touzet                  EERIE Nîmes (F)
Joos Vandewalle                KUL Leuven (B)
Eric Vittoz                    CSEM Neuchâtel (CH)
Christian Wellekens            Eurecom Sophia-Antipolis (F)

(* tentatively)

- --------------------------- Cut here -------------------------------



_____________________________
Michel Verleysen

D facto conference services
45 rue Masui
1210 Brussels
Belgium
tel: +32 2 245 43 63
fax: +32 2 245 46 94
E-mail: esann@dice.ucl.ac.be
_____________________________



------------------------------

Subject: EuroCOLT
From:    John Shawe-Taylor <john@dcs.rhbnc.ac.uk>
Date:    Wed, 08 Sep 93 12:36:10 +0000



            The Institute of Mathematics and its Applications

                            Euro-COLT '93

        FIRST EUROPEAN CONFERENCE ON COMPUTATIONAL LEARNING THEORY

    20th-22nd December, 1993       Royal Holloway, University of London

              Call for Participation and List of Accepted Papers
              ==================================================

The inaugural IMA European conference on Computational Learning Theory
will be held 20--22 December at Royal Holloway, University of London.
The conference covers areas related to the analysis of learning
algorithms and the theory of machine learning, including artificial and
biological neural networks, robotics, pattern recognition, inductive
inference, information theory and cryptology, decision theory and
Bayesian/MDL estimation.

Invited Talks
=============
As part of our program, we are pleased to announce three invited talks
by Wolfgang Maass (Graz), Lenny Pitt (Illinois) and Les Valiant (Harvard).

Euroconference Scholarships
===========================
The conference has also received scientific approval from the European
Commission to be supported under the Human Capital and Mobility
Euroconferences initiative. This means that there will be a number of
scholarships available to cover the expenses of young researchers
attending the conference. The scholarships are open to citizens of
European Community Member States or people who have been residing and
working in research for at least one year in one of the European
States. Please indicate on the return form below if you would like to
receive more information about these scholarships.

List of Accepted Papers
=======================
R. Gavalda,  On the Power of Equivalence Queries.

M. Golea and M. Marchand, On Learning Simple Deterministic and
                Probabilistic Neural Concepts.

P. Fischer, Learning Unions of Convex Polygons.

S. Pölt, Improved Sample Size Bounds for PAB-Decisions.

F. Ameur, P. Fischer, K-U. Höffgen and F.M. Heide, Trial and Error: A
                New Approach to Space-Bounded Learning.

A. Anoulova and S. Pölt, Using Kullback-Leibler Divergence in Learning
                Theory.

J. Viksna, Weak Inductive Inference.

H.U. Simon, Bounds on the Number of Examples Needed for Learning
                Functions.

R. Wiehagen, C.H. Smith and T. Zeugmann, Classification of Predicates
                and Languages.

K. Pillaipakkamnatt and V. Raghavan, Read-twice DNF Formulas can be
                learned Properly.

J. Kivinen, H. Mannila and E. Ukkonen, Learning Rules with Local
                Exceptions.

J. Kivinen and M. Warmuth, Using Experts for Predicting Continuous
                Outcomes.

M. Anthony and J. Shawe-Taylor, Valid Generalisation of Functions from
                Close Approximation on a Sample.

N. Cesa-Bianchi, Y. Freund, D.P. Helmbold and M. Warmuth, On-line
                Prediction and Conversion Strategies.

A. Saoudi and T. Yokomori, Learning Local and Recognisable
                omega-Languages and Monadic Logic Programs.

K. Yamanishi, Learning Non-Parametric Smooth Rules by Stochastic Rules
                with Finite Partitioning.

H. Wiklicky, The Neural Network Loading Problem is Undecidable.

T. Hegedus, Learning Zero-one Threshold Functions and Hamming
Balls over the Boolean Domain.

Members of the Organising Committee
===================================
John Shawe-Taylor (Chair: Royal Holloway, University of London, email
to eurocolt@dcs.rhbnc.ac.uk), Martin Anthony (LSE, University of
London), Jose Balcazar (Barcelona), Norman Biggs (LSE, University of
London), Mark Jerrum (Edinburgh), Hans-Ulrich Simon (University of
Dortmund), Paul Vitanyi (CWI Amsterdam).

Location
========
The conference will be held at Royal Holloway, University of London in
Egham, Surrey, conveniently located 15 minutes' drive from London
Heathrow airport. Accommodation will be either in the chateau-like
original Founders Building or in en-suite rooms in a new block also on
the Royal Holloway campus.  Accommodation fees range from 110 pounds
to 150 pounds (inclusive of bed, breakfast and dinner), while the
conference fee is 195 pounds (inclusive of lunch, coffee and tea;
140 pounds for students with reductions available for IMA members;
late application fee of 15 pounds if application received after 16th
November).

- --------------------------------------------------------------------
To: The Conference Officer, The Institute of Mathematics and its
Applications, 16 Nelson Street, Southend-on-Sea, Essex SS1 1EF.
Telephone: (0702) 354020. Fax: (0702) 354111

                         Euro-COLT '93
20th--22nd December, 1993         Royal Holloway, University of London

Please send me an application for the above conference

TITLE ............         MALE/FEMALE .....

SURNAME ..............................    FORENAMES ..................

ADDRESS FOR CORRESPONDENCE ...........................................

......................................................................

TELEPHONE NO ........................ FAX NO .........................


Please send me information about the Euroconference scholarships ........
(Please tick if necessary)




------------------------------

Subject: Artificial Life Workshop Announcement
From:    arantza@cogs.susx.ac.uk (Arantza Etxeberria)
Date:    Tue, 26 Oct 93 09:50:12 +0000

       "Artificial Life: a Bridge towards a New Artificial Intelligence"

                      Palacio de Miramar (San Sebastian, Spain)
                          December 10th and 11th, 1993


                           Workshop organised by the
                Department of Logic and Philosophy of Science,
                          Faculty of Computer Science
                                       &
        Institute of Logic, Cognition, Language and Information (ILCLI)
                                    of the
                  University of the Basque Country (UPV/EHU)

                                  Directors:
               Alvaro Moreno (University of the Basque Country)
                        Francisco Varela (CREA, Paris)


    This Workshop will be dedicated to a discussion of the impact of work on
Artificial Life on Artificial Intelligence.  Artificial Intelligence (AI)
has traditionally attempted to study cognition as an abstract phenomenon
using formal tools, that is, as a disembodied process that can be grasped
through formal operations, independent of the nature of the system that
displays it.  Cognition appears as an abstract representation of reality.
After several decades of research in this direction, the field has
encountered several problems that have taken it to what many consider a
"dead end": difficulties in understanding autonomous and situated agencies,
in relating behaviour to a real environment, in studying the nature and
evolution of perception, and in finding a pragmatic approach to explain the
operation of most cognitive capacities, such as natural language,
context-dependent action, etc.
    Artificial Life (AL) has recently emerged as a confluence of very
different fields trying to study different kinds of phenomena of living
systems using computers as a modelling tool and, ultimately, trying to
artificially (re)produce a living system, or a population of living
systems, in real or computational media.  Examples of such phenomena are
prebiotic systems and their evolution, growth and development,
self-reproduction, adaptation to an environment, evolution of ecosystems
and natural selection, formation of sensory-motor loops, and autonomous
robots.  Thus, AL is having an impact on the classic life sciences, but
also on the conceptual foundations of AI, and it brings new methodological
ideas to Cognitive Science.
    The aim of this Workshop is to focus on the last two points and to
evaluate the influence of the methodology and concepts appearing in AL on
the development of new ideas about cognition that could eventually give
birth to a new Artificial Intelligence.  Some of the sessions consist of
presentations and replies on a specific subject by invited speakers, while
others will be debates open to all participants in the workshop.


MAIN TOPICS:

    * A review of the problems of FUNCTIONALISM in Cognitive Science
        and Artificial Life.
    * Modelling Neural Networks through Genetic Algorithms.
    * Autonomy and Robotics.
    * Consequences of the crisis of the representational models of cognition.
    * Minimal Living System and Minimal Cognitive System.
    * Artificial Life systems as problem solvers.
    * Emergence and evolution in artificial systems.


SPEAKERS

    S. Harnad
    P. Husbands
    G. Kampis
    B. Mac Mullin
    D. Parisi
    T. Smithers
    E. Thompson
    F. Varela

Further Information:

Alvaro Moreno
Apartado 1249
20080 DONOSTIA
SPAIN

E. Mail:    biziart@si.ehu.es
Fax:        34 43 311056
Phone:      34 43 310600 (extension 221)
            34 43 218000 (extension 209)


------------------------------

End of Neuron Digest [Volume 12 Issue 9]
****************************************

Neuron Digest   Monday,  8 Nov 1993
                Volume 12 : Issue 10

Today's Topics:
   CFP - World Congress on Medical Physics and Biomedical Engineering
                      IEEE NNSP'94 Call For Papers
                             Call For Papers
    Pan Pacific Conf on Brain Electric Topography - 1st announcement
                               ISIKNH '94
                   Call for papers, NeuroControl book
                         Call for Papers of WWW
               URGENT: DEADLINE CHANGE FOR WORLD CONGRESS


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: CFP - World Congress on Medical Physics and Biomedical Engineering
From:    MACHADO%RIOSC.BITNET@VTBIT.CC.VT.EDU
Date:    Thu, 16 Sep 93 15:42:18 -0300

     WORLD CONGRESS ON MEDICAL PHYSICS AND BIOMEDICAL ENGINEERING

                         --- RIO'94 ---

      ---------------------------------------------------------
      Symposium on Hybrid Intelligent Systems: neural networks,
          fuzzy logic, genetic algorithms, expert systems
      ---------------------------------------------------------

                        CALL FOR PAPERS

In the period August 21-26 next year, Rio de Janeiro will host the
X International Conference on Medical Physics and the XVII
International Conference on Medical and Biological Engineering.

The above Symposium is one of the planned activities at these
conferences. Its purpose is to bring researchers together to discuss
new hybrid structures for developing intelligent systems for decision
making support in Health Sciences.

Neural networks, Fuzzy Logic, Genetic algorithms, Evolutionary
Computation, etc. are examples of new tools available for the
development of intelligent systems. A new generation of systems
combining these techniques is emerging as a competitive solution for
decision making problems in Health Sciences.

The symposium is planned to combine the following activities, providing
a forum for the discussion of these ideas:

 - Tutorial on the subject:

   "Hybrid Intelligent Systems for Health Sciences"

    by Armando F. Rocha, author of the book Neural Nets - A Theory for
    Brains and Machines

 - Two invited speeches:

   - "Connectionist Expert Systems"

      by Ricardo J. Machado

   - another to be chosen

 - Two paper sessions

Authors are invited to submit paper abstracts to this symposium by the
deadline of December 20, 1993.  They are also invited to propose a theme
and speaker for the second invited speech.

The authors' kit including the camera-ready form for abstracts can be
ordered from  Ms. Solange Oliveira
              Secretariat - 1994 World Congress on Medical Physics
              and Biomedical Engineering
              Congrex do Brasil
              Rua do Ouvidor, 60/414
              20040-030 - Rio de Janeiro - RJ - Brazil
              Phone: +55-21-224-6080  Fax: +55-21-231-1492

Submit the abstract form and three copies to

              Ricardo Jose Machado
              IBM - Rio Scientific Center
              Av. Presidente Vargas, 824
              20071-0001, Rio de Janeiro - RJ - Brazil
              Phone: +55-21-271-2306  Fax: +55-21-271-2797
              E-mail: machado@riosc.bitnet


------------------------------

Subject: IEEE NNSP'94 Call For Papers
From:    hwang@pierce.ee.washington.edu (Jenq-Neng Hwang)
Date:    Thu, 30 Sep 93 13:01:00 -0800

                  1994 IEEE WORKSHOP ON
            NEURAL NETWORKS FOR SIGNAL PROCESSING

              September 6-8, 1994 Ermioni, Greece
        Sponsored by the IEEE Signal Processing Society
           and Co-Sponsored by Intracom S.A. Greece
      (In cooperation with the IEEE Neural Networks Council)

GENERAL CHAIR
John Vlontzos
INTRACOM S.A.
Peania, Attica, Greece
jvlo@intranet.gr

PROGRAM CHAIR
Jenq-Neng Hwang
University of Washington
Seattle, Washington, USA
hwang@ee.washington.edu

PROCEEDINGS CHAIR
Elizabeth J. Wilson
Raytheon Co.
Marlborough, MA, USA
bwilson@sud2.ed.ray.com

FINANCE CHAIR
Demetris Kalivas
INTRACOM S.A.
Peania, Attica, Greece
dkal@intranet.gr



PROGRAM COMMITTEE

Joshua Alspector      (Bellcore, USA)
Les Atlas             (U. of Washington, USA)
Charles Bachmann      (Naval Research Lab. USA)
David Burr            (Bellcore, USA)
Rama Chellappa        (U. of Maryland, USA)
Lee Giles             (NEC Research, USA)
Steve J. Hanson       (Siemens Corp. Research, USA)
Yu-Hen Hu             (U. of Wisconsin, USA)
Jenq-Neng Hwang       (U. of Washington, USA)
Bing-Huang Juang      (AT&T Bell Lab., USA)
Shigeru Katagiri      (ATR Japan)
Sun-Yuan Kung         (Princeton U., USA)
Gary M. Kuhn          (Siemens Corp. Research, USA)
Stephanos Kollias     (National Tech. U. of Athens, Greece)
Richard Lippmann      (MIT Lincoln Lab., USA)
Fleming Lure          (Kaelum Research Co., USA)
John Makhoul          (BBN Lab., USA)
Richard Mammone       (Rutgers U., USA)
Elias Manolakos       (Northeastern U., USA)
Mahesan Niranjan      (Cambridge U., UK)
Tomaso Poggio         (MIT, USA)
Jose Principe         (U. of Florida, USA)
Wojtek Przytula       (Hughes Research Lab., USA)
Ulrich Ramacher       (Siemens Corp., Germany)
Bhaskar D. Rao        (UC San Diego, USA)
Andreas Stafylopatis  (National Tech. U. of Athens, Greece)
Noboru Sonehara       (NTT Co., Japan)
John Sorensen         (Tech. U. of Denmark, Denmark)
Yoh'ichi Tohkura      (ATR, Japan)
John Vlontzos         (Intracom S.A., Greece)
Raymond Watrous       (Siemens Corp. Research, USA)
Christian Wellekens   (Eurecom, France)
Yiu-Fai Issac Wong    (Lawrence Livermore Lab., USA)

                          CALL FOR PAPERS

The fourth of a series of IEEE workshops on Neural Networks for Signal
Processing will be held at the Porto Hydra Resort Hotel, Ermioni, Greece,
in September of 1994. Papers are solicited for, but not limited to,
the following topics:

APPLICATIONS:
Image, speech, communications, sensors, medical, adaptive
filtering, OCR, and other general signal processing and pattern
recognition topics.

THEORIES:
Generalization and regularization, system identification, parameter
estimation, new network architectures, new learning algorithms, and
wavelets in NNs.

IMPLEMENTATIONS:
Software, digital, analog, and hybrid technologies.


Prospective authors are invited to submit 4 copies of extended summaries
of no more than 6 pages. The top of the first page of the summary should
include a title, authors' names, affiliations, address, telephone and
fax numbers and email address if any. Camera-ready full papers
of accepted proposals will be published in a hard-bound volume by IEEE
and distributed at the workshop.  Due to workshop facility constraints,
attendance will be limited with priority given to those who submit
written technical contributions.  For further information, please
contact Mrs. Myra Sourlou at the NNSP'94 Athens office,
(Tel.) +30 1 6644961, (Fax) +30 1 6644379, (e-mail)
msou@intranet.gr.

Please send paper submissions to:

Prof. Jenq-Neng Hwang
IEEE  NNSP'94
Department of Electrical Engineering, FT-10
University of Washington, Seattle, WA 98195, USA
Phone: (206) 685-1603,  Fax: (206) 543-3842

                             SCHEDULE

Submission of extended summary:   February 15
Notification of acceptance:       April 19
Submission of photo-ready paper:  June 1
Advanced registration, before:    June 1


------------------------------

Subject: Call For Papers
From:    ai706@freenet.carleton.ca (Ian Malone)
Date:    Wed, 06 Oct 93 12:29:05 -0500


****************************  CALL FOR PAPERS  *****************

The Bellwood Research Center
Journal of Artificial Neural Systems
1994 Edition

The Journal of Artificial Neural Systems is intended to provide
a forum for theoretical and practical advances in the field
of neural networks.  Submissions should consist of full-length
original papers describing theoretical and/or practical research.

Interested parties should submit three copies of their original
work to ...

        Sandra Shoals
        Managing Editor
        Bellwood Research Center
        17 Briston Private
        Ottawa, Ontario
        Canada, K1G 5R5

For submission, catalogue and order information, please address
your inquiries to ...

        Information
        Bellwood Research Center
        17 Briston Private
        Ottawa, Ontario
        Canada, K1G 5R5

Regards,
Ian Malone
President
Bellwood Research Center



------------------------------

Subject: Pan Pacific Conf on Brain Electric Topography - 1st announcement
From:    Alex A Sergejew <alex@brain.physics.swin.oz.au>
Date:    Sat, 28 Aug 93 20:56:14 +0900

                            FIRST ANNOUNCEMENT

                          PAN PACIFIC CONFERENCE
                                    ON
                         BRAIN ELECTRIC TOPOGRAPHY

                           February 10 - 12, 1994
                              SYDNEY, AUSTRALIA

INVITATION

Brain electric and magnetic topography is an exciting emerging area which
draws on the disciplines of neurophysiology, physics,  signal  processing,
computing and cognitive neuroscience.  This conference will offer a forum for
the presentation of recent findings.  The program will include an outstanding
series of plenary lectures, as well as platform and poster presentations
by active participants in the field.

The conference includes two major plenary sessions.

In the Plenary Session entitled "Brain Activity Topography and Cognitive
Processes," the keynote speakers include Frank Duffy (Boston), Alan Gevins
(San Francisco), Steven Hillyard (La Jolla), Yoshihiko Koga (Tokyo) and Paul
Nunez (New Orleans).

Keynote speakers for the Plenary Session entitled "Brain Rhythmic
Activity and States of Consciousness" will include Walter Freeman
(Berkeley), Rodolfo Llinas (New York), Shigeaki Matsuoka (Kitakyushu)
and Yuzo Yamaguchi (Osaka).

The plenary sessions will provide a forum for discussion of some of the most
recent developments of analysis and models of electrical brain function, and
findings of brain topography and cognitive processes.

This conference is aimed at harnessing multidisciplinary participation
and will be of interest to those working in the areas of clinical
neurophysiology, cognitive neuroscience, biological signal processing,
neurophysiology, neurology, neuropsychology and neuropsychiatry.


CALL FOR PAPERS

Papers are invited for platform and poster presentation.
Platform presentations will be allocated 20 minutes
(15 mins for presentation and 5 mins for questions).
Abstracts of no more than 300 words are invited.

The deadline for receipt of abstracts is November 10th, 1993;
notification of acceptance of abstracts will be sent on December 10th, 1993.

The abstract can be sent by mail, Fax or Email to:
PAN PACIFIC CONFERENCE ON BRAIN ELECTRIC TOPOGRAPHY
C/- Cognitive Neuroscience Unit
Westmead Hospital, Hawkesbury Road
Westmead  NSW  2145, Sydney
AUSTRALIA

Fax :   +61 (2) 635 7734
Tel :   +61 (2) 633 6688
Email : pan@brain.physics.swin.oz.au

Authors may be invited to provide full manuscripts for publication of the
proceedings in CD-ROM and book form.  All authors wishing to have their
papers included must supply a full manuscript at the time of the conference.


GENERAL INFORMATION:

Date:   February 10 - 12, 1994

Venue:
The conference will be held at the Hotel Intercontinental on Sydney Harbour.

Climate:
February is summertime in Australia and the average maximum day-time
temperature in Sydney is 26 degC (78 degF).

Social Programme:
There will be a conference dinner on a yacht sailing Sydney Harbour
on February 11th, 1994.  Cost $A65 per person.

Hotel Accommodation:
Hotels listed offer a range of accommodation at special conference rates.
Please quote the name of the conference when arranging your booking.


Scientific Committee:                                   Organising Committee:
Prof Richard Silberstein, Melbourne (Chairman)          E Gordon (Chairman)
A/Prof Helen Beh, Sydney                                R Silberstein
Dr Evian Gordon, Sydney                                 J Restom
Dr Shigeaki Matsuoka, Kitakyushu
Dr Patricia Michie, Sydney
Dr Ken Nagata, Akita
Dr Alex Sergejew, Melbourne
A/Prof James Wright, Auckland



REGISTRATION:

Name(Prof/Dr/Ms/Mr):__________________________________________________
Address:______________________________________________________________
______________________________________________________________________
______________________________________________________________________
Telephone:  ______________________________ (include country/area code)
Fax:______________________________ E Mail______________________________

On or before November 10th, 1993                $A380.00
After November 10th, 1993                       $A400.00

Students before November 10th,1993              $A250.00

Conference Harbour Cruise Dinner                $A65.00 per person
    number of people _____


Method of Payment:

Cheque _        MasterCard _      VISA _        BankCard _

To be completed by credit card users only:

Card Number     _ _ _ _   _ _ _ _   _ _ _ _   _ _ _ _

Expiration Date __________________________

Signature       __________________________ (Signature not required if
                                            registering by E-mail)

Date            __________________________

Cheques should be payable to "Pan Pacific Conference"
(Address below)


SOME SUGGESTIONS FOR HOTEL ACCOMMODATION

Special conference rates apply.  Quote the name of the conference when
booking.
Prices are per double room per night.

SYDNEY RENAISSANCE HOTEL*****
Guaranteed harbour view.  10 min walk under cover.  $A170.00
30 Pitt St, Sydney NSW 2000, Australia.
Ph: +61 (2) 259 7000  Fax +61 (2) 252 1999

HOTEL INTERCONTINENTAL SYDNEY*****
Harbour view $A205.00  City View $A165.00
117 Macquarie Street, Sydney NSW 2000, Australia.
Ph: +61 (2) 230 0200  Fax: +61 (2) 240 1240

OLD SYDNEY PARKROYAL****
10 min walk.  $A190.00 including breakfast
55 George St, Sydney NSW 2000, Australia.
Ph: +61 (2) 252 0524  Fax: +61 (2) 251 2093

RAMADA GRAND HOTEL, BONDI BEACH****
Complimentary shuttlebus service.  $A130 - $A170 including breakfast
Beach Rd, Bondi Beach NSW 2026, Australia.
Ph: +61 (2) 365 5666  Fax: +61 (2) 365 5330

HOTEL CRANBROOK INTERNATIONAL***
Older style, budget-type accommodation overlooking Rose Bay.
Free shuttlebus service and airport transfers.  $A80.00 including breakfast
601 New South Head Rd, Rose Bay NSW 2020, Australia.
Ph: +61 (2) 252 0524  Fax: +61 (2) 251 2093


Post registration details with your cheque to:

PAN PACIFIC CONFERENCE ON BRAIN ELECTRIC TOPOGRAPHY
C/- Cognitive Neuroscience Unit
Westmead Hospital, Hawkesbury Road
Westmead  NSW  2145, Sydney
AUSTRALIA


------------------------------

From:    Ron Sun <rsun@athos.cs.ua.edu>
Date:    Tue, 10 Aug 93 16:33:56 -0600



                            CALL  FOR   PAPERS


   International Symposium on Integrating Knowledge and Neural Heuristics
                              (ISIKNH'94)

Sponsored by University of Florida, and AAAI,
in cooperation with IEEE Neural Network Council,
and Florida AI Research Society.

Time: May 9-10, 1994; Place: Pensacola Beach, Florida, USA.


A large amount of research in recent years has been directed
toward integrating neural and symbolic methods.
In particular, the integration of knowledge-based principles and
neural heuristics holds great promise
for solving complicated real-world problems.
This symposium will provide a forum for discussion
and exchange of ideas in this area. The objective of this symposium
is to bring together researchers from a variety of fields
who are interested in applying neural network techniques
to augment existing knowledge, or in applying existing knowledge
to improve neural network learning,
and especially those who have demonstrated that this combined approach
outperforms either approach alone.
We welcome views of this problem from
areas such as constraint- (knowledge-) based learning and
reasoning, connectionist symbol processing,
hybrid intelligent systems, fuzzy neural networks,
multi-strategy learning, and cognitive science.

Examples of specific research include but are not limited to:
1. How do we build a neural network based on a priori
knowledge (i.e., a knowledge-based neural network)?
2. How do neural heuristics improve the current model
for a particular problem (e.g., classification, planning,
signal processing, and control)?
3. How does knowledge in conjunction with neural heuristics
contribute to machine learning?
4. What is the emergent behavior of a hybrid system?
5. What are the fundamental issues behind the combined approach?

Program activities include keynote speeches, paper presentation,
and panel discussions.

*****
Scholarships are offered to assist students in attending the
symposium.  Students who wish to apply for a scholarship should send
their resumes and a statement of how their research relates
to the symposium.
*****


Symposium Chairs:
LiMin Fu, University of Florida, USA.
Chris Lacher,  Florida State University, USA.

Program Committee:
Jim Anderson,   Brown University,  USA
Michael Arbib,  University of Southern California,  USA
Fevzi Belli,  The University of Paderborn,  Germany
Jim Bezdek,  University of West Florida,  USA
Bir Bhanu,  University of California,  USA
Su-Shing Chen,  National Science Foundation,  USA
Tharam Dillon,  La Trobe University,  Australia
Douglas Fisher,  Vanderbilt University,  USA
Paul Fishwick,  University of Florida,  USA
Stephen Gallant,  HNC Inc.,  USA
Yoichi Hayashi,  Ibaraki University,  Japan
Susan I. Hruska,  Florida State University,  USA
Michel Klefstad-Sillonville  CCETT,  France
David C. Kuncicky,  Florida State University,  USA
Joseph Principe,  University of Florida,  USA
Sylvian Ray,  University of Illinois,  USA
Armando F. Rocha,  Universidade Estadual,  Brazil
Ron Sun,  University of Alabama,  USA

Keynote Speaker: Balakrishnan Chandrasekaran, Ohio State University


Schedule for Contributed Papers
----------------------------------------------------------------------
Paper Summaries Due: December 15, 1993
Notice of Acceptance Due: February 1, 1994
Camera Ready Papers Due: March 1, 1994

Extended paper summaries should be
limited to four pages (single- or double-spaced)
and should include the title, the names of the authors, and the
network and mailing addresses and telephone number of the corresponding
author.  Important research results should be attached.
Send four copies of extended paper summaries to

      LiMin Fu
      Dept. of CIS, 301 CSE
      University of Florida
      Gainesville, FL 32611
      USA
      (e-mail: fu@cis.ufl.edu; phone: 904-392-1485).

Students' applications for a scholarship should also be sent
to the above address.

General information and registration materials can be obtained by
writing to

      Rob Francis
      ISIKNH'94
      DOCE/Conferences
      2209 NW 13th Street, STE E
      University of Florida
      Gainesville, FL 32609-3476
      USA
      (Phone: 904-392-1701; fax: 904-392-6950)
---------------------------------------------------------------------


---------------------------------------------------------------------
If you intend to attend the symposium, you may submit the following
information by returning this message:


NAME: _______________________________________
ADDRESS: ____________________________________
_____________________________________________
_____________________________________________
_____________________________________________
_____________________________________________
PHONE: ______________________________________
FAX: ________________________________________
E-MAIL: _____________________________________


---------------------------------------------------------------------




------------------------------

Subject: Call for papers, NeuroControl book
From:    "David L. Elliott" <delliott@src.umd.edu>
Date:    Wed, 11 Aug 93 17:52:44 -0500

                  PROGRESS IN NEURAL NETWORKS
                  series Editor O. M. Omidvar
Special Volume:
NEURAL NETWORKS FOR CONTROL
Editor: David L. Elliott

CALL FOR PAPERS


Original manuscripts describing recent progress in neural
networks research directly applicable to Control or making use
of modern control theory. Manuscripts  may be survey or
tutorial in nature. Suggested topics for this book are:

        %New directions in neurocontrol

        %Adaptive control

        %Biological control architectures

        %Mathematical foundations of control

        %Model-based control with learning capability

        %Natural neural control systems

        %Neurocontrol hardware research

        %Optimal control and incremental dynamic programming

        %Process control and manufacturing

        %Reinforcement-Learning Control

        %Sensor fusion and vector quantization

        %Validating neural control systems


The papers will be refereed and uniformly typeset. Ablex and the Progress
Series editors invite you to submit an abstract, extended summary or
manuscript proposal, directly to the Special Volume Editor:

Dr. David L. Elliott, Institute for Systems Research
University of Maryland, College Park, MD 20742
Tel: (301)405-1241   FAX (301)314-9920
Email: DELLIOTT@SRC.UMD.EDU

      or to the Series Editor:
Dr. Omid M. Omidvar, Computer Science Dept.,
University of the District of Columbia, Washington DC 20008
Tel: (202)282-7345   FAX: (202)282-3677
Email: OOMIDVAR@UDCVAX.BITNET
The Publisher is Ablex Publishing Corporation, Norwood, NJ



------------------------------

Subject: Call for Papers of WWW
From:    Takeshi Furuhashi <furu@uchikawa.nuem.nagoya-u.ac.jp>
Date:    Mon, 23 Aug 93 11:22:41 +0200

CALL FOR PAPERS                                   TENTATIVE
1994 IEEE/Nagoya University
World Wisemen/women Workshop (WWW)

ON FUZZY LOGIC AND NEURAL NETWORKS/GENETIC ALGORITHMS
-Architecture and Applications for Knowledge Acquisition/Adaptation-

August 9 and 10, 1994
Nagoya University Symposion
Chikusa-ku, Nagoya, JAPAN

Sponsored by Nagoya University

Co-sponsored by
IEEE Industrial Electronics Society

Technically Co-sponsored by
IEEE Neural Network Council
IEEE Robotics and Automation Society
International Fuzzy Systems Association
Japan Society for Fuzzy Theory and Systems
North American Fuzzy Information Processing Society
Society of Instrument and Control Engineers
Robotics Society of Japan

Interest is growing in technologies that combine fuzzy logic with
neural networks, and fuzzy logic with genetic algorithms, for the
acquisition of experts' knowledge, the modeling of nonlinear systems,
and the realization of adaptive systems. The goal of the 1994
IEEE/Nagoya University WWW on Fuzzy Logic and Neural Networks/Genetic
Algorithms is to give its attendees opportunities to exchange
information and ideas on various aspects of these combination
technologies and to stimulate and inspire pioneering work in this
area. To keep the quality of the workshop high, only a limited number
of people will be accepted as participants. The papers presented at
the workshop will be edited and published by Oxford University Press.

TOPICS:
Combination of Fuzzy Logic and Neural Networks, Combination of Fuzzy
Logic and Genetic Algorithm, Learning and Adaptation, Knowledge
Acquisition, Modeling, Human Machine Interface

IMPORTANT DATES:
Submission of Abstracts of Papers : April 30, 1994
Acceptance Notification           : May 31, 1994
Final Manuscript                  : July 1, 1994

Partial or full assistance with travel expenses will be provided by
the WWW for speakers of excellent papers.  Candidates should
apply as soon as possible, preferably by Jan. 30, '94.

All correspondence and submission of papers should be sent to
Takeshi Furuhashi, General Chair
Dept. of Information Electronics, Nagoya University
Furo-cho, Chikusa-ku, Nagoya 464-01, JAPAN
TEL: +81-52-781-5111 ext.2792
FAX: +81-52-781-9263
E mail: furu@uchikawa.nuem.nagoya-u.ac.jp

IEEE/Nagoya University WWW:

IEEE/Nagoya University WWW (World Wisemen/women Workshop) is a series
of workshops sponsored by Nagoya University and co-sponsored by the IEEE
Industrial Electronics Society. The city of Nagoya, located two hours
from Tokyo, has many electro-mechanical industries in its surroundings,
such as Mitsubishi, TOYOTA, and their allied companies. Nagoya is a
mecca of robotics industries, machine industries and aerospace
industries in Japan. The series of workshops will give its attendees
opportunities to exchange information on advanced sciences and
technologies and to visit industries and research institutes in this
area.

*This workshop will be held just after the 3rd International
Conference on Fuzzy Logic, Neural Nets and Soft Computing (IIZUKA'94)
from Aug. 1 to 7, '94.

WORKSHOP ORGANIZATION

Honorary Chair: Tetsuo Fujimoto
                (Dean, School of Engineering, Nagoya University)
General Chair:  Takeshi Furuhashi (Nagoya University)
Advisory Committee:
        Chair:  Toshio Fukuda (Nagoya University)
                Fumio Harashima (University of Tokyo)
                Yoshiki Uchikawa (Nagoya University)
                Takeshi Yamakawa (Kyushu Institute of Technology)
Steering Committee:
                H.Berenji (NASA Ames Research Center)
                W.Eppler (University of Karlsruhe)
                I.Hayashi (Hannan University)
                Y.Hayashi (Ibaraki University)
                H.Ichihashi (Osaka Prefectural University)
                A.Imura
                (Laboratory for International Fuzzy Engineering)
                M.Jordan (Massachusetts Institute of Technology)
                C.-C.Jou (National Chiao Tung University)
                E.Khan (National Semiconductor)
                R.Langari (Texas A & M University)
                H.Takagi (Matsushita Electric Industrial Co., Ltd.)
                K.Tanaka (Kanazawa University)
                M.Valenzuela-Rendon
                (Instituto Tecnologico y de Estudios Superiores de Monterrey)
                L.-X.Wang (University of California Berkeley)
                T.Yamaguchi (Utsunomiya University)
                J.Yen (Texas A & M University)


------------------------------

Subject: URGENT: DEADLINE CHANGE FOR WORLD CONGRESS
From:    mwitten@chpc.utexas.edu
Date:    Wed, 03 Nov 93 11:37:13 -0600

                   UPDATE ON DEADLINES
FIRST WORLD CONGRESS ON COMPUTATIONAL MEDICINE, PUBLIC
             HEALTH, AND BIOTECHNOLOGY
                    24-28 April 1994
                   Hyatt Regency Hotel
                     Austin, Texas
----- (Feel Free To Cross Post This Announcement) ----

Due to confusion in the electronic distribution of
the congress announcement and deadlines, as well as
incorrect deadlines appearing in a number of society
newsletters and journals, we are extending the abstract
submission deadline for this congress to 31 December 1993.
We apologize to those who were confused by the differing
deadline announcements and hope that this change will
allow everyone to participate. For congress details:

To contact the congress organizers for any reason use any of the
following pathways:

ELECTRONIC MAIL - compmed94@chpc.utexas.edu

FAX (USA)       - (512) 471-2445

PHONE (USA)     - (512) 471-2472

GOPHER: log into the University of Texas System-CHPC
select the Computational Medicine and Allied Health
menu choice

ANONYMOUS FTP: ftp.chpc.utexas.edu
             cd /pub/compmed94
        (all documents and forms are stored here)

POSTAL:
            Compmed 1994
      University of Texas System CHPC
            Balcones Research Center
            10100 Burnet Road, 1.154CMS
            Austin, Texas 78758-4497

SUBMISSION PROCEDURES: Authors must submit 5
copies of a single-page 50-100 word abstract clearly
discussing the topic of their presentation. In
addition, authors must clearly state their choice of
poster, contributed paper, tutorial, exhibit, focused
workshop or birds of a feather group along with a
discussion of their presentation. Abstracts will be
published as part of the preliminary conference
material. To notify the congress organizing committee
that you would like to participate and to be put on
the congress mailing list, please fill out and return
the form that follows this announcement.  You may use
any of the contact methods above. If you wish to
organize a contributed paper session, tutorial
session, focused workshop, or birds of a feather
group, please contact the conference director at
mwitten@chpc.utexas.edu . The abstract may be submitted
electronically to  compmed94@chpc.utexas.edu  or
by mail or fax. There is no official format.


If you need further details, please contact me.

Matthew Witten
Congress Chair
mwitten@chpc.utexas.edu


------------------------------

End of Neuron Digest [Volume 12 Issue 10]
*****************************************
From: "Neuron-Digest Moderator" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
To: Neuron-Distribution:;
Subject: Neuron Digest V12 #11 (proceedings, jobs, protein structures)
Reply-To: "Neuron-Request" <neuron-request@CATTELL.PSYCH.UPENN.EDU>
Organization: University of Pennsylvania
Date: Mon, 08 Nov 93 00:17:21 EST

Neuron Digest   Monday,  8 Nov 1993
                Volume 12 : Issue 11

Today's Topics:
                          Proceedings available
                      Possible position in Finance
               UK PhD connectionist studentship available
                                 Courses
          Summary of ANNs and secondary protein structures pt 1
          Summary of ANNs and secondary protein structures pt 2


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@psych.upenn.edu". The ftp archives are
available from psych.upenn.edu (130.91.68.31). Back issues requested by
mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Proceedings available
From:    Joan Cabestany <cabestan@eel.upc.es>
Date:    Tue, 19 Oct 93 09:59:16 -0300



Proceedings available
_____________________

Following IWANN'93 (International Workshop on Artificial Neural
Networks), held in Sitges, Spain, in June 1993, copies of the
Proceedings are still available at a special price.

Reference:

New Trends in Neural Computation (IWANN'93 Proceedings)
J.Mira, J.Cabestany, A.Prieto editors
Lecture Notes in Computer Science number 686
SPRINGER VERLAG  1993

Price: 9000 pesetas (Spanish currency)

Method of payment:

VISA Card number _________________________  Expiration date _______________

Name of card holder ______________________________________________________




Date ____________      Signature ___________________________________


Send this form to

ULTRAMAR CONGRESS
Att. Mr. J.Balada
Diputacio, 238, 3
08007 BARCELONA  Spain
Fax + 34.3.412.03.19



------------------------------

Subject: Possible position in Finance
From:    William Fulkerson <fulkersw@smtplink.de.deere.com>
Date:    Tue, 19 Oct 93 09:11:36 -0600

A position is anticipated in the Finance department, Deere &
Company, Moline Illinois.  The purpose of this message is to
survey the interest in such a position and to determine the
skills available among the likely candidates.

The successful candidate will have a theoretical background and 1
to 3 years' professional experience in applications of
neural networks, evolutionary computing, and/or fuzzy logic.
Although professional experience in applications is required, it
need not be in finance.  Applicants with advanced degrees in
engineering, statistics, or computer science are preferred.

The applicant must want to apply the above technologies to
financial problems and be willing to pursue these financial
applications for an extended period of years. The initial
assignment will be to develop trading systems for foreign
exchange and commercial paper. Future assignments will be within
the Finance department and may include pension fund management.


If your interest, application experience, training, and skills
match this description, please send a short description of your
qualifications via e-mail to:

fulkersw@smtplink.de.deere.com.

Receipt of your e-mail will be acknowledged.



------------------------------

Subject: UK PhD connectionist studentship available
From:    simon@dcs.ex.ac.uk
Date:    Tue, 19 Oct 93 16:23:58 +0000


PhD Studentship -- full-time, SERC funded

The Department of Computer Science at the University of Exeter has
a SERC quota award available for a suitable candidate to pursue
fulltime research for a PhD degree.  Applicants should have a good
first degree in Computer Science (or a closely related dsicipline)
with a sound knowledge of neural computing and/or software
engineering.  The successful applicant will join a research group
exploring the use of neural computing as a novel software
technology.  Potential projects range from formal analysis of
network implementations of well-defined problems to development of
visualization techniques to facilitate efficient network training
as well as to provide support for a conceptual understanding of
neural net implementations.

Application forms and further information can be obtained from:
  Lyn Shackleton,
  Department of Computer Science,
  University of Exeter,
  Exeter EX4 4PT.
  email: lyn@dcs.exeter.ac.uk;
  tel: 0392 264066; FAX: 0392 264067

Informal enquiries and requests for further details of the
research group's activities may be made to:
  Professor Derek Partridge,
  Department of Computer Science,
  University of Exeter,
  Exeter EX4 4PT,
  email: derek@dcs.exeter.ac.uk
  tel: 0392 264061, FAX: 0392 264067,

The closing date for applications is November 19th, 1993.
Interviews will be conducted in the week beginning November 22, 1993.
It is expected that the award will be taken up in January 1994.


--
Simon Klyne                                     Connection Science Laboratory
email: simon@dcs.exeter.ac.uk                   Department of Computer Science
phone: (+44) 392 264066                         University of Exeter
                                                EX4 4QE, UK.




------------------------------

Subject: Courses
From:    RAUL HECTOR GALLARD <gallardr@unslfm.edu.ar>
Date:    Wed, 20 Oct 93 10:21:27 -0300


Dear colleague:

This e-mail is to ask for your assistance in raising the expertise of
our lecturers and researchers in specific areas.
We belong to the Department of Informatics of the Universidad Nacional de
San Luis, Argentina, where we run a Computer Science degree programme.
We also form a research group working in the area of Intelligent
Distributed Systems. In our project we are looking for applications of
Genetic Algorithms (GAs) and Neural Networks (NNs) to build intelligence
into computer systems.
A PhD Programme has been launched in the areas of GAs and NNs for
further study and research in these topics.
Our Faculty is looking for Professors and/or Researchers who wish to
come to San Luis to give a short postgraduate course (40 to 80 hours)
on GAs and NNs. The stay is estimated at three weeks, and the courses
could be given anytime before October 1994. Travel and reasonable
living expenses will be covered by the Faculty.
We need, as soon as possible, a brief Curriculum Vitae and course
content (current trends are advisable) from possible candidates.
If you are prepared to give any of these courses, please contact me via
e-mail or by fax at 54 652 30224. Otherwise, if you know another
colleague willing to come, please forward this message to him or her.

Thanking you in advance.

Prof. Raul Gallard
Proyecto 338403
Universidad Nacional de San Luis
Ej. de los Andes 950
5700 - San Luis
Argentina

e-mail : gallardr@unslfm.edu.ar


                                        Raul Hector Gallard
                                        gallardr@unslfm.edu.ar



------------------------------

Subject: Summary of ANNs and secondary protein structures pt 1
From:    pb@cse.iitb.ernet.in (Pushpak Bhattacharya)
Date:    Thu, 21 Oct 93 12:43:21 +0700

Here are some of the responses we received to our query
on the prediction of protein secondary structure.  Thanks
very much to all the respondents.
                               Pushpak Bhattacharyya
                               pb@cse.iitb.ernet.in
                               Susmita De
                               susmita@cse.iitb.ernet.in

>From ajay@cgl.ucsf.edu

Hi,

        Saw your note in neuron-digest. I am very surprised that you have
problems running this. I have never had any problems running the protein
structure prediction problem (in fact this is my research area) using
BP, though not with the PDP software.

        I guess that the problem is with the code or the specification of
the various bells and whistles in the code. Saturation problems occur
if you use LARGE learning rates. Before you go around messing with the
code, I would suggest using a very small learning rate, say 0.001, no
momentum, and trying both per-pattern and batch modes. I have found
quicker convergence with per-pattern updates.
        Hope this helps.

        ---ajay
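The advice above (a very small learning rate, no momentum, per-pattern
versus batch updates) can be sketched in a few lines. This is an
illustrative toy example, not the poster's actual code: the single
logistic unit, the four patterns, and all names are made up.

```python
import numpy as np

# Toy comparison of per-pattern (online) and batch updates with a very
# small learning rate (0.001) and no momentum, as suggested above.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.001, epochs=200, mode="per-pattern", seed=0):
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        if mode == "per-pattern":
            for xi, yi in zip(X, y):       # update after every pattern
                err = sigmoid(xi @ w) - yi
                w -= lr * err * xi
        else:                              # batch: one update per epoch
            err = sigmoid(X @ w) - y
            w -= lr * (X.T @ err) / len(y)
    return w

X = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])
w_online = train(X, y, mode="per-pattern")
w_batch = train(X, y, mode="batch")
```

With a rate this small, both modes move slowly but stay well away from
saturation; per-pattern updates make more updates per epoch, which
matches the poster's experience of quicker convergence.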



>From batra@zelia.Colorado.EDU Fri Oct 15 20:05:07 1993

Here are some refs (if you dont have them already)

Qian, N. and Sejnowski, T.J. (1988).  Predicting the Secondary
Structure of Globular Proteins Using Neural Network Models.
Journal of Molecular Biology, 202:865-884.

Holley, L.H. and Karplus, M. (1989).  Protein Secondary Structure
Prediction with a Neural Network.  Proceedings of the National
Academy of Sciences USA, 86:152-156.

There are many references on this topic but these might be the
better ones.

--Sajeev.




>From batra@zelia.Colorado.EDU Fri Oct 15 20:05:05 1993


Hi,

I'm also working on Protein Secondary Structure Prediction...  So,
I might be able to help.

I'm not exactly sure what your problem is...  However, first
you should try the problem without a hidden layer in the network.
Qian and Sejnowski do very well w/o the hidden layer.

Having 20 hidden neurons is too many.  You also mention that you
have 65 input units???  Hmmm...  I think if you use 21 x N, it
might work better.  N is the size of your sliding window (usually
N = 13).

Hope I've helped a little.  If you need more info or anything,
please do not hesitate to email me.  BTW hope this gets thru to
india.

Regards,
Sajeev Batra
batra@boulder.colorado.edu
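The 21 x N window coding Sajeev describes can be made concrete as
follows. This sketch assumes the usual convention of 20 amino-acid
symbols plus one spacer symbol for window positions that fall off
either end of the sequence (the coding used by Qian and Sejnowski);
the sequence and window size here are illustrative.

```python
# One-hot 21 x N input coding for a sliding window over a protein
# sequence: 20 amino-acid bits plus one spacer bit per position.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard residues
SPACER = 20                            # index of the off-the-end spacer bit

def encode_window(sequence, center, n=13):
    """One-hot encode the n-residue window centered at `center`.

    Returns a flat list of 21*n zeros and ones, one bit set per position.
    """
    half = n // 2
    code = []
    for pos in range(center - half, center + half + 1):
        bits = [0] * 21
        if 0 <= pos < len(sequence):
            bits[AMINO_ACIDS.index(sequence[pos])] = 1
        else:
            bits[SPACER] = 1           # window hangs off the sequence end
        code.extend(bits)
    return code

x = encode_window("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", center=5)
# with n = 13 this gives 21 * 13 = 273 inputs, not 65
```

This makes the mismatch concrete: a 13-residue window needs 273 input
units under this coding, so 65 binary inputs cannot carry a full
one-hot window.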




>From reinagel@husc.harvard.edu Tue Oct 12 13:31:47 1993

Hi.

You posted in the NN digest about a problem you're having with bp
solving protein folding, which you attribute to the size of your data
set. As it happens, I also once (long ago) built a bp net to predict
protein folding, (actually millions of people have done it, it seems).
So I am interested in your problem.

I train nets on data sets of 20,000 patterns (40 inputs, 1 output,
and 5 to 20 hidden U's) all the time and I have NEVER seen the behaviour
you describe. I don't think your problem is due to too big a data set,
but due to a bad representation of the data or to bad net architecture.

Have you tested that bp in your hands behaves normally for, e.g., XOR?
What happens if you train on just 100 patterns? Same thing?


You say your net has 65 binary inputs--how is the primary sequence
coded??! The three outputs are? Alpha, beta, other? 3-D spatial
coordinates? Phi, Psi, & something else? I am having trouble imagining
how a sufficient amount of primary sequence could ever be represented
in 65 bits, nor where you can find 8,000 independent sequences of known
structure, unless it is that you're using way too few aa's, or way too
little information about each aa, to specify structure.

I'm asking because it's possible that your input codes carry NO
information about your output codes, and even BP cannot extract
information that is not there.  In such a case, the net will train to
learn intrinsic regularities in the output patterns independent of the
inputs--eg.  if the outputs are so coded that the unit's target is 0
most of the time, the net will learn that just guessing 0 minimizes
its MSE.

If you are convinced that the information is there to be found, try:

1. Incremental training: start with a few patterns and add more and
   more.
2. Hierarchy: make a net to remap inputs to an intermediary code and
   then another net to go from that code to the final one, if you
   already know of an intermediary representation that makes sense.
3. Try fewer hidden units, or local connectivity of hidden units, at
   least in the early part of training.

Good luck.

Pam Reinagel
Harvard University


------------------------------

Subject: Summary of ANNs and secondary protein structures pt 2
From:    pb@cse.iitb.ernet.in (Pushpak Bhattacharya)
Date:    Thu, 21 Oct 93 12:48:42 +0700

Here are some more responses to our query on the
prediction of secondary structures of proteins by
neural nets. We post them via neuron-request for
the benefit of fellow researchers.  Thanks very much
again to the respondents.
                                      Pushpak Bhattacharyya
                                    pb@cse.iitb.ernet.in
                                  Susmita De
                                  susmita@cse.iitb.ernet.in

>From markb@orl.mmc.com Fri Oct  8 01:28:03 1993

I doubt your problems have much to do with the size of your data set,
as I routinely train BP classifiers on data sets of 8,000 examples using
nets roughly the size of 65x20x3 with great success.

More likely there is some little snag in how you are presenting data to the
net, so I am going to go through how you stated your problem. Please let me
know if any of the following is in error. You say you present the net with
the primary sequence of the protein and it has 65 binary values. I assume
you are encoding the amino acids in some way (20 amino acids require
5 bits to represent), so I assume you are working on a sequence of length
13, i.e. 65/5. You have then chosen 20 hidden nodes (because there are 20
amino acids?) and finally have some data which suggests these 13 residues
have three distinct conformations. Question: how do you assign the training
values? Like
values? Like
   Conformation 1:   1.0 0.0 0.0
   Conformation 2:   0.0 1.0 0.0
   Conformation 3:   0.0 0.0 1.0

or do you use 0.9 and 0.1? i.e.
   Conformation 1:   0.9 0.1 0.1
   Conformation 2:   0.1 0.9 0.1
   Conformation 3:   0.1 0.1 0.9

A related question: do you use a sigmoid or other squashing function
on the output layer of nodes?

Finally, are the three conformations presented to the net in random order,
or do you present the BP net with all of the Conformation 1 examples, then
all of the Conformation 2 examples, ...?


I would suggest:
        1) making sure you always put the 'N-terminal' residue at the
           same spot
        2) putting a squashing function at the output
        3) putting "teacher values" of 0.9 and 0.1 on the training examples
        4) presenting the examples in random order

Finally, make sure you
        1) set the learning constant sufficiently low so it does not "peg"
                initial movements to either 1 or 0
        2) include the "bias nodes".
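A minimal Python sketch (all names hypothetical) of suggestions 3 and 4 above: softened teacher values, and a fresh random presentation order each epoch.

```python
import random

def soften_targets(conf_index, n_classes=3, hi=0.9, lo=0.1):
    """One-hot target for a conformation, softened to 0.9/0.1 so the
    sigmoid outputs never have to reach their asymptotes at 0 and 1."""
    return [hi if i == conf_index else lo for i in range(n_classes)]

def shuffled_epochs(examples, n_epochs, seed=0):
    """Yield the training set in a fresh random order each epoch,
    instead of all Conformation 1 examples, then all of 2, then 3."""
    rng = random.Random(seed)
    for _ in range(n_epochs):
        batch = list(examples)
        rng.shuffle(batch)
        yield batch
```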

I would absolutely LOVE to see your data! I have worked BP nets as
classifiers for years, but have no access to protein data.
 Mark.


&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&

>From may@apple.com Fri Oct  8 09:31:38 1993

> We are trying to train a BP neural net (PDP-3 package of Rumelhart and
> Mclleland is being used) with the input as the primary structure and the
> output as the conformational state of the amino acids. The net has 65
> binary inputs and 3 binary outputs. The number of patterns to be trained
> are more than eight thousand.  We have 20 neurons in the hidden layer.
>
> Now, just after one epoch the hidden neurons go into saturation state -
> all their outputs are 1. At the same time the output neurons also go into
> saturation. All their outputs become 0.  We understand that the primary
> reason for this is that A LARGE number of patterns are being trained on a
> net with large no of inputs. The weights to the o/p layer keep on getting
> the same kind of "Credit", so that their values go on decreasing to large
> negative values.
>
> Could anybody suggest a way out of this difficulty - which to our mind is
> a result of applying BP on a large-size real life problem ? (those
> applying BP on large problems will appreciate our difficulty !).

Hi,

     If you're allowed to release your training set I'd be happy to run
it through my own backprop simulator and send you the results (structure
and weights).  I can also send you the code when I've polished the
user-interface (probably another man-month of work).

Regards,

Patrick May
may@apple.com




>From /PN=luca/OU=aeolus/O=ethz/PRMD=switch/ADMD=arcom/C=ch/@x400-gate.iitb.ernet.in
 Fri Oct  8 10:21:02 1993

Dear Sirs,
I have read your post on neural networks and your current problems.
I am finishing a Ph.D. in protein crystallography, and I am interested
in the application of neural network pattern recognition to the prediction
of protein structures. I have also used it in classical crystallographic
problems for the past 2 years.
I suppose you are all well aware of the work done by Cotteril a few
years ago on three-dimensional protein structure prediction, which appeared
in a short paper in FEBS. I do not remember the exact reference, but I will
look it up in the lab tomorrow (I know I have it), in case you haven't yet
read it. I have also worked on a topological approach to predicting the
conformations of amino acids.
My opinion is that the problem, as you posed it, is ill-posed.
That is: 1) are you really SURE that you posed the problem in the most
direct way? 2) 8000 patterns are not many compared to the 1400 weights
that the net has to solve for; are you sure they are enough?

I have never had problems making a net converge and generalize when I
posed the problem in the right way ...

I have applied nets with 20,000 (twenty thousand) neurons as the input
layer, each input varying between 0 and 1 and using a sigmoidal function
as a first step, 5000 hidden units and 20,000 (twenty thousand) outputs.
I was running it on a CRAY-YMP and then lately on a faster machine capable
of storing up to 30 GigaBytes in memory (NEC-SX-3), and never had the
problems you show.

Therefore I suppose that the problem you are encountering is due not to
the size of the problem but to the way you designed it ...

Neural networks have nothing magic! The whole business is to use them
correctly and in an appropriate way ....

OK?

in case I can be of any help, give me a ring :-)

Luca

luca@aeolus.ethz.ch



>From stein@mars.fiz.huji.ac.il Fri Oct  8 14:12:14 1993

> We are trying to train a BP neural net (PDP-3 package of Rumelhart and
> Mclleland is being used) with the input as the primary structure and the
> output as the conformational state of the amino acids. The net has 65
> binary inputs and 3 binary outputs. The number of patterns to be trained
> are more than eight thousand.  We have 20 neurons in the hidden layer.

> Now, just after one epoch the hidden neurons go into saturation state -
> all their outputs are 1. At the same time the output neurons also go into
> saturation. All their outputs become 0.  We understand that the primary
> reason for this is that A LARGE number of patterns are being trained on a
> net with large no of inputs. The weights to the o/p layer keep on getting
> the same kind of "Credit", so that their values go on decreasing to large
> negative values.

> Could anybody suggest a way out of this difficulty - which to our mind is
> a result of applying BP on a large-size real life problem ? (those
> applying BP on large problems will appreciate our difficulty !).

Your problem does not really qualify as a large-size problem, with 65
inputs and 3 outputs. We routinely handle hundreds of inputs and tens
of outputs with no saturation problems. I suggest that you consult our
article that appeared in IJCNN-92 (Baltimore) I-932 for some tips.
First, make sure that your binary inputs are represented by -1 and +1
and NOT by 0 and 1. Otherwise you will have to deal with the derivative
becoming zero by one of several techniques (see references in our article).
Next, you are probably being drawn into a global attractor; to check this,
try stretching your sigmoid to pull the values out of saturation and
continue training. If these don't help, drop me an email.
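A minimal Python sketch (names hypothetical) of the two fixes suggested here: remapping binary inputs to -1/+1, and "stretching" the sigmoid with a temperature to pull units out of saturation.

```python
import math

def remap_bipolar(bits):
    """Map binary 0/1 inputs to -1/+1; a 0 input contributes nothing
    to the weight update (the gradient term is input * delta), so
    bipolar coding keeps every weight learning on every pattern."""
    return [2 * b - 1 for b in bits]

def stretched_sigmoid(x, T=4.0):
    """'Stretched' sigmoid: a temperature T > 1 flattens the curve,
    pulling saturated units back into the responsive region so that
    training can continue."""
    return 1.0 / (1.0 + math.exp(-x / T))
```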

Dr. Yaakov Stein
Neural Network Group
Efrat Future Technology
Tel Aviv, ISRAEL



>From FLOTZI@dpmi.tu-graz.ac.at Fri Oct  8 14:45:45 1993

Hi!

I read about your problems with your backprop application with 65
binary inputs and 3 binary outputs. I have had problems with
backprop, too, and although I don't know the package you are using I
can think of some reasons why the nodes go into saturation so quickly:

1) The learning rate may be too large: depending on the implementation
and on whether you use a momentum term or not, you should use quite
small values like 0.001 or even smaller; look at the error curve to find
out if that is too small.
2) I guess you use batch learning, i.e. you look at all training examples
before changing the weights. I have found that this is not so useful if
the patterns are not well separable. I prefer 'on-line' learning, i.e.
changing the weights after each training example. Unfortunately, with
'on-line' learning, the momentum technique is not applicable, but I don't
know whether you use it or not.
3) Try using more than 20 hidden nodes, but I must confess I would
have started with about the same number.

I hope my suggestions will be of help
Good luck

Doris Flotzinger
Medical Informatics
Technical University Graz / Austria



>From iain.strachan@aea.orgn.uk Fri Oct  8 18:58:31 1993


I read your query in neuroprose, and have a few suggestions that might
help.  If you've already tried this, please ignore.

(1) The most common cause of premature saturation is setting of the
initial values of the weights too high. If the expectation value of
the input to any sigmoid unit is greater than about 3.5, then the
slope of the sigmoid is 0.1 times the slope at value zero, and hence
learning is very slow.  A useful heuristic is to determine a basic
range for the weights (say -1 to 1) and then to scale by the inverse
of the square root of the number of inputs to the particular unit.  If
the problem still persists, reduce the initial range even further.
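The heuristic in (1) can be sketched as follows (Python, names hypothetical): pick a basic range, then scale it by the inverse square root of the fan-in of each unit.

```python
import random

def init_weights(n_inputs, base=1.0, seed=0):
    """Draw initial weights uniformly from [-r, r] where
    r = base / sqrt(fan-in), so the expected net input to a sigmoid
    unit stays well below the saturated region."""
    rng = random.Random(seed)
    r = base / (n_inputs ** 0.5)
    return [rng.uniform(-r, r) for _ in range(n_inputs)]
```

For a unit with 65 inputs and a basic range of -1 to 1, this gives weights in roughly [-0.124, 0.124]; if saturation persists, shrink `base` further.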

(2) A good discussion of premature saturation is given in:

        "An Analysis of Premature Saturation in Back Propagation
        Learning",
        Lee, Oh and Kim, Neural Networks Vol. 6 (1993), pp. 719-728.

(3) If the learning rate is too high, this can also immediately cause
    the neurons to go into saturation.

- -----------------------------------------------------------------------------
Mr. Iain G. D. Strachan
iain.strachan@aea.orgn.uk
                                                        Tel: +44 (0)235 435321
                                                        Fax: +44 (0)235 432726
AEA Technology
Applied Neurocomputing Centre
B521.1 Harwell
Oxfordshire
OX11 0RA
UK
- -----------------------------------------------------------------------------




>From /PN=jmerelo/O=ugr/PRMD=iris/ADMD=mensatex/C=es/@x400-gate.iitb.ernet.in
 Fri Oct  8 22:19:41 1993


Hi, this is JJ Merelo, from Spain...

I don't know if you have checked the literature, but just in case: there
is a classical article by Qian and Sejnowski, as well as newer papers
by people at the EMBL, that achieve just what you're looking for. They
don't manage to achieve more than about 70% accuracy going from protein
sequence to secondary structure.

Several other approaches have been tried; for instance, ours (Andrade
et al., which appeared in Protein Engineering recently), which tries
to predict secondary structure from circular dichroism spectra, achieving
somewhat better results.

Hope this helps,

                                JJ




>From drt@genesis.mcs.com Sat Oct  9 19:42:59 1993

Pushpak, you wrote in Neuron Digest:

>We are trying to train a BP neural net (PDP-3 package of Rumelhart and
>Mclleland is being used) with the input as the primary structure and the
>output as the conformational state of the amino acids. The net has 65
>binary inputs and 3 binary outputs. The number of patterns to be trained
>are more than eight thousand.  We have 20 neurons in the hidden layer.
>
>Now, just after one epoch the hidden neurons go into saturation state -
>all their outputs are 1. At the same time the output neurons also go into
>saturation. All their outputs become 0.  We understand that the primary
>reason for this is that A LARGE number of patterns are being trained on a
>net with large no of inputs. The weights to the o/p layer keep on getting
>the same kind of "Credit", so that their values go on decreasing to large
>negative values.
>
>Could anybody suggest a way out of this difficulty - which to our mind is
>a result of applying BP on a large-size real life problem ? (those
>applying BP on large problems will appreciate our difficulty !).

The easiest solution is to change all those binary 1s in the input
layer to smaller values, like 0.1.  A harder way to do this
is to change the activation function from:

        1 / (1 + exp(-x))

to:

        1 / (1 + exp(-D*x))

and make D less than 1.  This requires modifying the code, but then I
guess you have it so you can do it.  While harder, this is more
convenient because you can keep changing D (called the gain or
sharpness) until you get good results.
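The modified activation function above can be written directly as a short Python sketch:

```python
import math

def sigmoid(x, D=1.0):
    """Logistic activation 1 / (1 + exp(-D*x)) with a gain
    ('sharpness') parameter D; D < 1 flattens the curve so large
    net inputs no longer drive units straight into saturation."""
    return 1.0 / (1.0 + math.exp(-D * x))
```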

Also you will probably want to use very small initial random weights.

But I'll give you some other good advice too:  try some of the newer
variations on backprop like delta-bar-delta and quickprop.  But I'll
give you a caution about this too:  I was recently puttering around
with the sonar data using backprop and quickprop and the backprop
network was generalizing better than quickprop.  Another trick is
to fudge the derivative term for the output units only so instead of
using s(1-s) where s is the value of the output unit use 0.1 + s(1-s)
or just plain 1.  Given that you have 3 binary outputs I think this
may help because I bet some of them will get stuck at the wrong values
and when they do the normal s(1-s) term slows learning down by A LOT.
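The derivative fudge for the output units can be sketched as (Python, name hypothetical):

```python
def fudged_derivative(s, eps=0.1):
    """Output-unit derivative term eps + s*(1 - s) instead of plain
    s*(1 - s); the constant keeps the error signal alive when an
    output is stuck near 0 or 1, where s*(1 - s) vanishes."""
    return eps + s * (1.0 - s)
```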

Finally I could offer you my backprop code that happens to do these
things.  I've posted the backprop source in C for UNIX and DOS in
comp.sources.misc a number of times and people around the world have
used it and liked it.  It's free for ordinary uses, $200 for businesses
and government agencies, if you find it useful.  In that case you also get
the new improved "professional" version.

Don Tveter
5228 N. Nashville Ave.
Chicago, Illinois  60656-2233
USA
drt@genesis.mcs.com




>From huyser@nova3.Stanford.EDU Sun Oct 10 04:27:23 1993

Dear Drs. Bhattacharyya and De,

I should think your learning rate is too large.  I would try batch learning
(one step per epoch on the accumulated errors) and a step-size (lrate) of
roughly 0.1/npatterns, which for you is about 1e-5.  It's possible that
1e-4 would still be stable; it depends on the data.

Best wishes,
Karen Huyser


------------------------------

End of Neuron Digest [Volume 12 Issue 11]
*****************************************
