Artificial Neural Networks Report


Artificial Neural Networks

1. Introduction

Artificial neural networks are computational models, inspired by an animal's central nervous system (the brain), that are capable of machine learning. They are generally presented as systems of interconnected "neurons" that compute values from their inputs (from Wikipedia).
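
To make this concrete, the short sketch below (in Python with NumPy) shows a single artificial "neuron" computing a value from its inputs as a weighted sum passed through a sigmoid activation. The particular weights, bias, and activation function are illustrative assumptions, not details taken from this report.

    import numpy as np

    def neuron(inputs, weights, bias):
        # A single artificial "neuron": weighted sum of the inputs
        # passed through a sigmoid activation function.
        z = np.dot(weights, inputs) + bias
        return 1.0 / (1.0 + np.exp(-z))

    # Three inputs with hypothetical weights and bias
    x = np.array([0.5, -1.0, 2.0])
    w = np.array([0.1, 0.4, -0.3])
    print(neuron(x, w, bias=0.2))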

2. Training an Artificial Neural Network

Once the network has been structured for a particular application, it is ready to be trained. The initial weights are chosen randomly, and then training begins.
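
As a rough illustration of this step (the layer sizes and the uniform sampling range below are assumptions made for the example, not values from the report), the initial weights of a small two-layer network could be drawn at random like this:

    import numpy as np

    rng = np.random.default_rng()

    # Hypothetical structure: 4 input nodes, 5 hidden nodes, 2 output nodes.
    n_in, n_hidden, n_out = 4, 5, 2

    # Initial weights are chosen randomly before training begins.
    W1 = rng.uniform(-0.5, 0.5, size=(n_hidden, n_in))
    W2 = rng.uniform(-0.5, 0.5, size=(n_out, n_hidden))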

There are two approaches to training artificial neural networks: supervised and unsupervised.

2.1 Supervised Training

In supervised training, also called training with a teacher, both the inputs and the desired outputs are provided. The network computes its own outputs and compares them with the desired outputs; the difference is then used to adjust the network's weights.
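
The sketch below illustrates this loop for a single sigmoid neuron trained by gradient descent; the toy AND-function data, learning rate, and update rule are assumptions chosen only to show the "compare outputs with desired outputs and adjust the weights" idea, not the method used in this report.

    import numpy as np

    # Toy supervised setup: both inputs X and desired outputs y are provided.
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    y = np.array([0.0, 0.0, 0.0, 1.0])        # desired outputs (logical AND)

    rng = np.random.default_rng(0)
    w = rng.uniform(-0.5, 0.5, size=2)        # random initial weights
    b = 0.0
    lr = 0.1                                  # assumed learning rate

    for epoch in range(1000):
        pred = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # network outputs
        error = pred - y                            # compare with desired outputs
        grad = pred * (1.0 - pred) * error          # gradient for a sigmoid unit
        w -= lr * X.T @ grad                        # adjust the weights
        b -= lr * grad.sum()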

2.2 Unsupervised Training

In unsupervised training, also called training without a teacher, the network is provided with inputs but not with desired outputs. The system itself must decide which features it will use to group or classify (cluster) the input data. This is often referred to as self-organization.
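
The report does not name a particular algorithm here, so the sketch below uses k-means clustering (via scikit-learn) purely as a stand-in for the idea of grouping inputs when no desired outputs are given; treat it as an assumption, not the report's method.

    import numpy as np
    from sklearn.cluster import KMeans

    # Inputs only -- no desired outputs are provided.
    X = np.array([[1.0, 1.1], [0.9, 1.0], [1.1, 0.9],
                  [5.0, 5.2], [5.1, 4.9], [4.9, 5.0]])

    # The algorithm itself decides how to group (cluster) the input data.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(labels)   # e.g. [0 0 0 1 1 1] -- two discovered groups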

3. Some Issues in Neural Networks

3.1 Number of Input Nodes

Input sets are dynamic. The number of input nodes is equal to the number of features (columns), so once we know the shape of our training data we can determine the input count. Methods such as Sensitivity-Based Pruning, Average Absolute Derivative Magnitude, and others can be used to refine the number of input neurons.
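
For example (the array shape below is hypothetical), once the training data is loaded, the number of input nodes can be read directly from the number of feature columns:

    import numpy as np

    # Hypothetical training set: 150 samples, 4 features (columns).
    X_train = np.zeros((150, 4))

    # One input node per feature column.
    n_input_nodes = X_train.shape[1]   # -> 4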

3.2 Number of Output Nodes

Output sets are dynamic. The number of output neurons is determined by the chosen model configuration. The outpu...

... middle of paper ...

