Journal Article

Growing Grid — a self-organizing network with constant neighborhood range and adaptation strength

Bernd Fritzke
01 Sep 1995, Vol. 2, Iss. 5, pp. 9-13
TLDR
A novel self-organizing network is presented that is generated by a growth process in which both the neighborhood range used to co-adapt units in the vicinity of the winning unit and the adaptation strength are held constant during the growth phase.
Abstract
We present a novel self-organizing network which is generated by a growth process. The application range of the model is the same as for Kohonen’s feature map: generation of topology-preserving and dimensionality-reducing mappings, e.g., for the purpose of data visualization. The network structure is a rectangular grid which, however, increases its size during self-organization. By inserting complete rows or columns of units the grid may adapt its height/width ratio to the given pattern distribution. Both the neighborhood range used to co-adapt units in the vicinity of the winning unit and the adaptation strength are constant during the growth phase. This makes it possible to let the network grow until an application-specific performance criterion is fulfilled or until a desired network size is reached. A final approximation phase with decaying adaptation strength fine-tunes the network.
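The sketch below illustrates the growth scheme summarized in the abstract. It is a minimal reconstruction, not the author's reference implementation: the Gaussian neighborhood function, all parameter values, and the simplified rule for choosing where to insert a row or column are assumptions made to keep the example short and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((5000, 2))                 # toy 2-D input distribution

rows, cols, dim = 2, 2, data.shape[1]
W = rng.random((rows, cols, dim))            # codebook: one weight vector per grid unit
sigma, eps_growth, eps_final = 0.7, 0.1, 0.02
target_units = 64                            # assumed stopping criterion

def adapt(x, W, eps, sigma):
    """Move the winning unit and its grid neighbors toward input x."""
    d = ((W - x) ** 2).sum(axis=2)
    wr, wc = np.unravel_index(d.argmin(), d.shape)
    r, c = np.indices(d.shape)
    h = np.exp(-((r - wr) ** 2 + (c - wc) ** 2) / (2.0 * sigma ** 2))
    W += eps * h[..., None] * (x - W)        # constant range and strength during growth
    return wr, wc

# Growth phase: adapt with constant parameters, then insert a complete row or
# column of units near the most frequent winner (a simplified insertion heuristic).
while W.shape[0] * W.shape[1] < target_units:
    counts = np.zeros(W.shape[:2])
    for x in data[rng.integers(0, len(data), 2000)]:
        wr, wc = adapt(x, W, eps_growth, sigma)
        counts[wr, wc] += 1
    br, bc = np.unravel_index(counts.argmax(), counts.shape)
    if W.shape[0] <= W.shape[1]:             # lets the height/width ratio adjust
        new = 0.5 * (W[br] + W[min(br + 1, W.shape[0] - 1)])
        W = np.insert(W, br + 1, new, axis=0)
    else:
        new = 0.5 * (W[:, bc] + W[:, min(bc + 1, W.shape[1] - 1)])
        W = np.insert(W, bc + 1, new, axis=1)

# Fine-tuning phase: decaying adaptation strength.
for t, x in enumerate(data):
    eps = eps_final * 0.01 ** (t / len(data))
    adapt(x, W, eps, sigma)
```

Because the neighborhood range and adaptation strength stay constant while the grid grows, the growth loop can simply continue until a desired network size (or an application-specific error criterion) is reached; only the final loop decays the adaptation strength to fine-tune the map.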


Citations
Journal Article

Clustering of the self-organizing map

TL;DR: The two-stage procedure, first using the SOM to produce the prototypes that are then clustered in the second stage, is found to perform well when compared with direct clustering of the data and to reduce the computation time.
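A rough illustration of that two-stage procedure, assuming the third-party minisom and scikit-learn packages (the map size, iteration count, and number of clusters below are placeholders, not the paper's setup):

```python
import numpy as np
from minisom import MiniSom            # third-party SOM implementation (assumption)
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
data = rng.random((10_000, 8))         # toy data set

# Stage 1: train a SOM; its codebook vectors act as prototypes of the data.
som = MiniSom(10, 10, data.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(data, 5_000)
prototypes = som.get_weights().reshape(-1, data.shape[1])

# Stage 2: cluster the 100 prototypes instead of the 10,000 samples,
# which is what makes the procedure cheaper than direct clustering.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(prototypes)
```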
Journal Article

The growing hierarchical self-organizing map: exploratory analysis of high-dimensional data

TL;DR: The motivation was to provide a model that adapts its architecture during its unsupervised training process according to the particular requirements of the input data. By providing a global orientation of the independently growing maps in the individual layers of the hierarchy, navigation across branches is facilitated.
Journal Article

Generalized relevance learning vector quantization

TL;DR: A scheme that automatically prunes irrelevant input dimensions by attaching weighting factors to the input dimensions, which leads to a more powerful classifier and to an adaptive metric with little extra cost compared to standard GLVQ.
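In outline, the method replaces the plain Euclidean distance by a relevance-weighted distance with one trainable factor per input dimension; the snippet below is an illustrative sketch only, and the relevance values and pruning threshold are hypothetical, not taken from the paper.

```python
import numpy as np

def relevance_distance(x, w, lam):
    """Squared distance with per-dimension relevance factors lambda_i."""
    return float(np.sum(lam * (x - w) ** 2))

lam = np.array([0.48, 0.47, 0.03, 0.02])   # hypothetical learned relevances (sum to 1)
keep = lam > 0.05                          # prune dimensions judged irrelevant
x = np.array([1.0, 2.0, 9.0, -3.0])
w = np.array([1.1, 1.8, 0.0, 4.0])
print(relevance_distance(x[keep], w[keep], lam[keep]))
```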
Journal Article

Concepts of Artificial Intelligence for Computer-Assisted Drug Discovery

TL;DR: The current state of the art of AI-assisted pharmaceutical discovery is discussed, including applications in structure- and ligand-based virtual screening, de novo drug design, physicochemical and pharmacokinetic property prediction, drug repurposing, and related aspects.
Journal Article

A comparison of self-organizing map algorithm and some conventional statistical methods for ecological community ordination

TL;DR: The present work describes how the SOM can be used for the study of ecological communities and how it can effectively complement classical techniques for exploring data and for achieving community ordination.
References
Proceedings Article

A Growing Neural Gas Network Learns Topologies

TL;DR: An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule.
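The Hebb-like rule amounts to connecting, for every input, the nearest and second-nearest units by an edge and discarding edges that have not been refreshed for a while. The sketch below follows the commonly cited formulation; unit adaptation, error bookkeeping, and unit insertion are omitted, and the age threshold is an assumed parameter.

```python
import numpy as np

def update_topology(x, units, edges, max_age=50):
    """units: (n, d) reference vectors; edges: dict mapping (i, j) pairs to an age."""
    d = ((units - x) ** 2).sum(axis=1)
    s1, s2 = np.argsort(d)[:2]                # winner and second-nearest unit
    for (i, j) in list(edges):
        if s1 in (i, j):
            edges[(i, j)] += 1                # age edges emanating from the winner
            if edges[(i, j)] > max_age:
                del edges[(i, j)]             # drop edges that are no longer refreshed
    edges[(min(s1, s2), max(s1, s2))] = 0     # Hebb-like step: connect them, reset age
    return edges
```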
Journal Article

Growing cell structures—a self-organizing network for unsupervised and supervised learning

Bernd Fritzke
01 Nov 1994
TL;DR: A new self-organizing neural network model with two variants is presented: the first performs unsupervised learning and can be used for data visualization, clustering, and vector quantization, while the supervised variant yields results on the two-spirals benchmark and a vowel classification problem that are better than any previously published.
Journal Article

Analysis of a simple self-organizing process

TL;DR: In this article, an analysis of two partial processes is presented: in the first, a set of numerical representations assumes the correct order, and in the second, the final map of the representations converges to its asymptotic form.
Journal Article

Variants of self-organizing maps

TL;DR: Two innovations are discussed: dynamic weighting of the input signals at each input of each cell, which improves the ordering when very different input signals are used, and definition of neighborhoods in the learning algorithm by the minimum spanning tree, which provides a far better and faster approximation of prominently structured density functions.
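For the minimum-spanning-tree neighborhood variant, the following sketch shows one way the idea can be realized with SciPy; the Gaussian kernel over tree hop counts is an illustrative choice, not necessarily the paper's exact definition.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, shortest_path
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
codebook = rng.random((25, 2))                          # current reference vectors

# Neighborhoods come from the minimum spanning tree over the codebook,
# not from a fixed grid: units close along the tree are co-adapted strongly.
mst = minimum_spanning_tree(cdist(codebook, codebook))
hops = shortest_path(mst, directed=False, unweighted=True)
neighborhood_of_unit_3 = np.exp(-hops[3] ** 2 / 2.0)    # strength for every unit
```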