### On the foundations of intelligent processes - I: An evolving model for pattern learning

by

#### Lev Goldfarb

#### Abstract

A general adaptive model unifying existing models for pattern learning is
proposed. The model, in addition to preserving the merits of geometric and
syntactic approaches to pattern recognition, has decisive advantages over them.
It can be viewed as a far-reaching generalization of the perceptron, or neural
net, models, in which the vector representation and the associated vector
operations are replaced by a more general structural representation and the
corresponding structural operations. The basis of the model is the concept of a
transformation system, which is a generalization of Thue (Post-production)
systems. Parametric distance functions in transformation systems are introduced.
These are generalizations of weighted Levenshtein (edit) distances to more
general structured objects. A learning model for transformation systems, unifying
many existing models (including that of neural nets), is proposed. The model
also suggests how various propositional object (class) descriptions might be
generated based on the outputs of the learning process: these descriptions
represent a "translation" of some information encoded in the *nonpropositional*
"language" of the corresponding transformation system, representing
the environment, into the chosen *logical* (*propositional*)
language, whose semantics is now defined by the "translation". In the
light of the metric model, intelligence emerges as based on simple arithmetic
processes: first, those related to the optimal distance computation, and,
second, "counting" and comparing the results of counting for various "important"
features, detected at the learning stage (arithmetic predicates).
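To fix ideas, the weighted Levenshtein (edit) distance that the parametric distances above generalize can be sketched as a standard dynamic program. The per-operation weights `w_ins`, `w_del`, `w_sub` below are illustrative parameters only; the paper's parametric distances operate over general transformation systems, not just strings.

```python
def weighted_edit_distance(a, b, w_ins=1.0, w_del=1.0, w_sub=1.0):
    """Minimal total weight of insertions, deletions, and substitutions
    transforming string a into string b (classic dynamic program)."""
    m, n = len(a), len(b)
    # d[i][j] = minimal cost of transforming a[:i] into b[:j]
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * w_del                        # delete all of a[:i]
    for j in range(1, n + 1):
        d[0][j] = j * w_ins                        # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0.0 if a[i - 1] == b[j - 1] else w_sub
            d[i][j] = min(d[i - 1][j] + w_del,     # delete a[i-1]
                          d[i][j - 1] + w_ins,     # insert b[j-1]
                          d[i - 1][j - 1] + sub)   # substitute or match
    return d[m][n]
```

With unit weights this reduces to the ordinary edit distance, e.g. `weighted_edit_distance("kitten", "sitting")` gives `3.0`; tuning the weights (learned at the training stage, in the model's terms) changes which transformations the metric considers "cheap".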

goldfarb@unb.ca

last updated: 95/12/22