Download
ANN - Artificial Neural Network for PHP 5.x
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the Installation section for information about requirements and on how to integrate these PHP libraries into your project.
Version 2.1.3 (2010-01-06) stable
Author: Thomas Wien
- Download - ann213.zip (34 KB)
- Download - ann213.tar.gz (18 KB)
- Download (Phar) - ann213.phar.gz (20 KB)
MD5 fingerprints (a verification sketch follows this list)
- ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
- 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
- 981709c7da085a17994cb71ee603f9e0 ann213.zip
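The fingerprints above can be checked with PHP's built-in md5_file() function (or any md5sum command-line tool); a minimal verification sketch, assuming the archive was saved to the current directory:

<?php
// Compare a downloaded archive against the MD5 fingerprint published above.
$expected = '981709c7da085a17994cb71ee603f9e0'; // fingerprint listed for ann213.zip
$actual   = md5_file('ann213.zip');             // assumes the file is in the current directory

echo ($actual === $expected)
    ? "ann213.zip is intact\n"
    : "Checksum mismatch: $actual\n";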
Documentation
Change-Log
- Introduction of a date input support class
Version 2.1.2 (2009-12-26) obsolete
Author: Thomas Wien
- Download - ann212.zip (31 KB)
- Download - ann212.tar.gz (16 KB)
- Download (Phar) - ann212.phar.gz (18 KB)
MD5 fingerprints
- 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
- 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
- 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
Documentation
Change-Log
- Introduction of classification with ANN_Classification (for an example, see: Detection of language with classification)
- Phar support (as of PHP 5.3.0)
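Since the Phar distribution requires PHP 5.3.0 or later, the archive can be registered with PHP's Phar class and accessed through the phar:// stream wrapper; a minimal sketch, in which the alias and internal file name are assumptions for illustration only:

<?php
// Register the downloaded Phar archive under an alias (requires PHP >= 5.3.0
// with the phar extension; the gzip-compressed archive also needs zlib).
Phar::loadPhar('ann212.phar.gz', 'ann.phar');

// Files inside the archive can then be required via the phar:// wrapper.
// The file name below is an assumption; check the archive for the actual entry point.
// require_once 'phar://ann.phar/ANN_Loader.php';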
Version 2.1.1 (2009-12-23) obsolete
Author: Thomas Wien
- Download - ann211.zip (29 KB)
- Download - ann211.tar.gz (16 KB)
Documentation
Change-Log
- Introduction of string association with ANN_StringValue (for an example, see: Detection of language)
Version 2.1.0 (2009-12-22) obsolete
Author: Thomas Wien
- Download - ann210.zip (27 KB)
- Download - ann210.tar.gz (15 KB)
Documentation
Change-Log
- Default learning rate set to 0.7
- Code changes based on profiling
- Changing the formatting of printed network details
- Removing trailing PHP end tags
- Coding-standard cleanup
- Removing momentum, precision and unused math methods
- Removing unused methods
- Removing the error weight derivative
- Removing weight decay
- Removing the dynamic learning rate
- Removing algorithm switches and experimental algorithms; only the standard back-propagation algorithm remains (its weight update is sketched after this list)
- Renaming class files
- Revised handling of the learning rate and deltas
- Rounding of the network error value
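For reference, "standard back propagation" here is the textbook rule that moves each weight by the learning rate times the receiving neuron's error delta times the sending neuron's output; a minimal, library-independent sketch using the default learning rate of 0.7 noted above:

<?php
// Textbook back-propagation weight update (library-independent sketch,
// not the ANN library's internal code).
function adjustWeight($weight, $learningRate, $delta, $input)
{
    // new weight = old weight + learning rate * error delta * incoming output
    return $weight + $learningRate * $delta * $input;
}

echo adjustWeight(0.25, 0.7, 0.05, 0.9); // prints 0.2815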
Version 2.0.7 (2009-01-01) stable
Author: Thomas Wien
- Download - ann207.zip (32 KB)
- Download - ann207.tar.gz (34 KB)
Documentation
Change-Log
- Removing protected method ANN_Neuron::setOutput()
- Removing protected unused method ANN_Layer::getInputs()
- Removing protected unused property ANN_Layer::$arrInputs
- More detailed exceptions in ANN_Filesystem::saveToFile()
- Different distribution of activation calls across the layers
- Different adjustments in ANN_Neuron::adjustWeights() depending on output type
- Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
- Increasing math precision
- Using class constants for output types (increasing performance)
- Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
Version 2.0.6 (2008-12-18) obsolete
Author: Thomas Wien
- Download - ann206.zip (30 KB)
- Download - ann206.tar.gz (33 KB)
Documentation
Change-Log
- Printing network details showing the differences between outputs and their desired values
- Completely reworked coding standard for variables
- New class ANN_Values for defining input and output values
- Code examples added to the phpdoc documentation
- Internal math precision defaults to 5
Version 2.0.5 (2008-12-16) obsolete
Author: Thomas Wien
- Download - ann205.zip (28 KB)
- Download - ann205.tar.gz (31 KB)
Documentation
Change-Log
- Adjustable output error tolerance between 0 and 10 percent
- Internal rounding of floats for performance reasons
- Loader class for all ANN classes (SPL autoload); an autoloader sketch follows this list
- Renaming the file of the ANN_Maths class
- Improving the coding standard
- Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
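The loader mentioned above relies on PHP's SPL autoload mechanism; a minimal, library-independent sketch of such an autoloader, in which the file layout is an assumption for illustration:

<?php
// Minimal SPL autoloader for classes using the ANN_ prefix. The mapping of
// class names to file names is assumed for illustration; the library ships
// its own loader class.
function ann_autoload($className)
{
    if (strpos($className, 'ANN_') === 0) {
        $file = dirname(__FILE__) . '/' . $className . '.php'; // assumed layout
        if (is_file($file)) {
            require_once $file;
        }
    }
}
spl_autoload_register('ann_autoload');

// Afterwards ANN classes such as ANN_Values can be used without explicit
// require statements.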
Version 2.0.4 (2008-01-27) obsolete
Author: Thomas Wien
- Download - ann204.zip (25 KB)
- Download - ann204.tar.gz (29 KB)
Change-Log
- Weight decay
- QuickProp algorithm (experimental)
- RProp algorithm (experimental)
- Linear saturated activation function (experimental)
- Individual learning rate algorithm (experimental)
- Reducing overfitting (no training step if an input pattern already produces its desired output)
- Increasing performance on activation
- Increasing performance on testing all patterns to their desired outputs
- Increasing performance on calculating hidden deltas
- Increasing performance by defining layer relations at construction time
- More details in the printNetwork() output
- Fixing bug: learning rate is not part of saved delta value
Version 2.0.3 (2008-01-17) obsolete
Author: Thomas Wien
- Download - ann203.zip (22 KB)
- Download - ann203.tar.gz (20 KB)
Change-Log
- Support for dynamic learning rate
- Automatic epoch determination
- Automatic output type detection
- Shuffling input patterns each epoch instead of randomized pattern access
- Bug fix: runtime error when calling setMomentum()
- Logging of network errors
- Logging on each epoch instead of each training step
- Avoiding distributed internal calls to setMomentum() and setLearningRate()
- Extending display of network details
Version 2.0.2 (2008-01-14) obsolete
Author: Thomas Wien
- Download - ann202.zip (21 KB)
- Download - ann202.tar.gz (17 KB)
Change-Log
- Client-Server model for distributed applications
- Calculating the total network error for CSV logging
Version 2.0.1 (2008-01-06) obsolete
Author: Thomas Wien
- Download - ann201.zip (19 KB)
- Download - ann201.tar.gz (16 KB)
Change-Log
- Separation of classes into several files
- Version control by Subversion
- Performance improvements
- Graphical output of the neural network topology
- Logging of weights to a CSV file
Version 2.0.0 (2007-12-17) obsolete
Author: Thomas Wien
- Download - ann200.zip (6 KB)
- Download - ann200.tar.gz (6 KB)
Change-Log
- PHP 5.x support
- PHPDoc documentation
- Momentum support
- Avoiding network overfitting
- Linear / binary output
- ANN_InputValue + ANN_OutputValue classes
- Exceptions
- Threshold function
- Hyperbolic tangent (tanh) transfer function
- Several performance improvements
- Avoiding array_keys() and srand() for performance reasons
- Changes in saving and loading network
- Printing network details to browser
- Fixing bug: initializing inputs to all hidden layers
- Fixing bug: training for the first hidden layer was skipped
Version 1.0 (2002)
Author: Eddy Young
Change-Log
- Initial version