Download

From Artificial Neural Network for PHP

This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.

== Version '''2.2.0''' (2011-06-01) '''''stable''''' (PHP 5.3 or above) ==

'''Author: Thomas Wien'''

* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (20 KB)

'''MD5 fingerprints''' (a verification sketch follows the list)

* 1de88cd1077247d3c43ef37c99a9486b ann220.phar.gz
* 87513d54d7558230c496cba478e585fd ann220.tar.gz
* 2f4ea0dea8ca91d6d33b344268f637d2 ann220.zip
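
A downloaded archive can be checked against the fingerprint listed above with PHP's built-in md5_file(). The following is a minimal sketch, assuming ann220.zip has been saved to the current working directory:

<pre>
<?php
// Minimal sketch: compare the archive against the MD5 fingerprint listed above.
// Assumes ann220.zip is located in the current working directory.
$expected = '2f4ea0dea8ca91d6d33b344268f637d2';   // fingerprint for ann220.zip
$actual   = md5_file('ann220.zip');

if ($actual === $expected) {
    echo "ann220.zip is intact\n";
} else {
    echo "Checksum mismatch: got " . var_export($actual, true) . "\n";
}
</pre>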

'''Documentation'''

* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]

'''Change-Log'''

* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now (see the sketch below)
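
The following stand-in sketch (not taken from the library's code; the class body and demo calls are assumptions) illustrates what the namespace change and the protected setter mean for calling code: classes are now addressed through the \ANN namespace, and the learning rate can no longer be set from outside the network.

<pre>
<?php
// Illustrative stand-in only - not the library's actual implementation.
namespace ANN {
    class Network
    {
        protected function setLearningRate($rate)
        {
            // Protected as of 2.2.0: the learning rate is adjusted
            // internally (dynamic learning rate) rather than by callers.
        }
    }
}

namespace Demo {
    use ANN\Network;

    $net = new Network();                     // resolves to \ANN\Network
    var_dump($net instanceof \ANN\Network);   // bool(true)
    // $net->setLearningRate(0.7);            // would now be a fatal error
}
</pre>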


== Version '''2.1.5''' (2011-05-24) '''''stable''''' ==

'''Author: Thomas Wien'''

'''MD5 fingerprints'''

* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip

'''Documentation'''

'''Change-Log'''

* Splitting the method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion (see the sketch below)
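
As a rough illustration of the second change above, the following stand-in class (not the library's ANN_Network; the class name and printed values are made up for the demo) shows how __toString() lets the details be printed with a plain echo, while __invoke() makes the object callable like a function:

<pre>
<?php
// Illustrative stand-in only - not the library's actual class.
class NetworkDetailsDemo
{
    public function __toString()
    {
        return "layers: 3, learning rate: 0.7\n";   // made-up demo values
    }

    public function __invoke()
    {
        return (string) $this;   // reuse the same string representation
    }
}

$demo = new NetworkDetailsDemo();
echo $demo;     // echo triggers __toString()
echo $demo();   // __invoke() makes the object callable
</pre>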

== Version '''2.1.4''' (2011-05-23) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''MD5 fingerprints'''

* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip

'''Documentation'''

'''Change-Log'''

* Better calculation of the remaining running time of the network
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Removing a possibly incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplified ANN_Layer::calculateHiddenDeltas()

== Version '''2.1.3''' (2010-01-06) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''MD5 fingerprints'''

* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip

'''Documentation'''

'''Change-Log'''

* Introduction of a date input support class

== Version '''2.1.2''' (2009-12-26) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''MD5 fingerprints'''

* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip

'''Documentation'''

'''Change-Log'''

== Version '''2.1.1''' (2009-12-23) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Documentation'''

'''Change-Log'''

== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Documentation'''

'''Change-Log'''

* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of the printed network details
* Removing the trailing PHP end tag
* Coding standard changes
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renaming class file names
* Learning rate and delta usage
* Using learning rate
* Rounding of the network error value

== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==

'''Author: Thomas Wien'''

'''Documentation'''

'''Change-Log'''

* Removing protected method ANN_Neuron::setOutput()
* Removing unused protected method ANN_Layer::getInputs()
* Removing unused protected property ANN_Layer::$arrInputs
* More detailed exceptions in ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array

== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Documentation'''

'''Change-Log'''

* Printing network details including the differences between outputs and their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc documentation
* Internal math precision defaults to 5

== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Documentation'''

'''Change-Log'''

* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Autoloading of all ANN classes (SPL autoload)
* Renaming the file name of the ANN_Maths class
* Improving the code standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue

== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Change-Log'''

* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduction of overfitting (no training if an input pattern already produces the desired output)
* Increasing performance of activation
* Increasing performance of testing all patterns against their desired outputs
* Increasing performance of calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details in printNetwork()
* Fixing bug: learning rate is not part of the saved delta value

== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Change-Log'''

* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Fixing bug: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending the display of network details

== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Change-Log'''

* Client-server model for distributed applications
* Calculating the total network error for CSV logging

== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Change-Log'''

* Separation of classes into several files
* Version control with Subversion
* Performance improvements
* Graphical output of the neural network topology
* Logging of weights to a CSV file

== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==

'''Author: Thomas Wien'''

'''Change-Log'''

* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue and ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function (see the sketch after this list)
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading the network
* Printing network details to the browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for the first hidden layer was skipped
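
As general background (not specific to this library's internals), a threshold function maps the weighted input to one of two fixed values, while the hyperbolic tangent squashes it smoothly into the range -1 to 1. A minimal sketch using PHP's built-in tanh() and a hand-written step function:

<pre>
<?php
// General background sketch (not the library's code): a hard threshold
// versus the hyperbolic tangent transfer function.
function threshold($x)
{
    return ($x >= 0.0) ? 1.0 : 0.0;   // hard step at zero
}

foreach (array(-2.0, -0.5, 0.0, 0.5, 2.0) as $x) {
    printf("x = %5.1f  threshold = %.1f  tanh = %7.4f\n",
           $x, threshold($x), tanh($x));
}
</pre>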

== Version '''1.0''' (2002) ==

'''Author: Eddy Young'''

'''Change-Log'''

* Initial version