eneural_net
eNeural.net / Dart is an AI library for efficient Artificial Neural Networks. The library is portable (native, JS/Web, Flutter), and its computation can use SIMD (Single Instruction Multiple Data) to improve performance.
Usage
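A minimal sketch of training an ANN on XOR, based on the package's published examples; the class and parameter names used below (ANN, LayerFloat32x4, HiddenLayerConfig, SampleFloat32x4, SamplesSet, Backpropagation, ScaleDouble) may differ between package versions, so treat this as illustrative rather than exact:

```dart
import 'package:eneural_net/eneural_net.dart';

void main() {
  // Sigmoid activation for all layers.
  var activation = ActivationFunctionSigmoid();

  // 2 inputs -> 3 hidden neurons -> 1 output (a classic XOR topology).
  // Constructor names follow the package's examples and may vary by version.
  var ann = ANN(
    ScaleDouble.ZERO_TO_ONE,
    LayerFloat32x4(2, true, activation),
    [HiddenLayerConfig(3, true, activation)],
    LayerFloat32x4(1, false, activation),
  );

  // XOR training samples in "inputs=output" format.
  var samples = SampleFloat32x4.toListFromString(
    ['0,0=0', '1,0=1', '0,1=1', '1,1=0'],
    ScaleDouble.ZERO_TO_ONE,
    true,
  );

  var samplesSet = SamplesSet(samples, subject: 'xor');

  // Train with Backpropagation until the global error is low enough.
  var training = Backpropagation(ann, samplesSet);
  training.trainUntilGlobalError(targetGlobalError: 0.01, maxEpochs: 50000);

  // Inspect the trained ANN's output for each sample.
  for (var sample in samples) {
    ann.activate(sample.input);
    print('${sample.input} -> ${ann.output}');
  }
}
```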
SIMD (Single Instruction Multiple Data)
Dart supports SIMD when computation is done using Float32x4 and Int32x4.
The Activation Functions are implemented using Float32x4, improving performance by 1.5x to 2x compared to the scalar implementation.
The basic principle of SIMD is to execute math operations on 4 numbers simultaneously.
Float32x4 is a lane of four 32-bit single-precision floating-point values.
Example of multiplication:
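For instance, multiplying two Float32x4 lanes performs the 4 pairwise multiplications in a single SIMD operation (plain dart:typed_data, no library dependency):

```dart
import 'dart:typed_data';

void main() {
  var a = Float32x4(1.0, 2.0, 3.0, 4.0);
  var b = Float32x4(10.0, 20.0, 30.0, 40.0);

  // One SIMD multiplication: 4 pairwise products at once.
  var c = a * b;

  print(c); // products: 10.0, 40.0, 90.0, 160.0
}
```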
See "dart:typed_data library" and "Using SIMD in Dart".
Signal
The class Signal represents the collection of numbers (including its related operations) that flows through the ANN, representing the actual signal that an Artificial Neural Network should compute.
The main implementation is SignalFloat32x4, which represents an ANN Signal based on Float32x4. All operations prioritize the use of SIMD.
The Signal framework allows the implementation of any kind of data to represent the numbers and operations of an eNeural.net ANN. SignalInt32x4 is an experimental implementation to exercise an ANN based on integers.
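The idea behind a Float32x4-based signal can be illustrated with plain dart:typed_data: doubles are packed into Float32x4 lanes so operations touch 4 entries at a time. This shows the storage strategy only, not the package's actual Signal API:

```dart
import 'dart:typed_data';

void main() {
  // Pack 8 values into 2 Float32x4 lanes (the storage strategy behind
  // a Float32x4-based signal; not the package's actual Signal API).
  var values =
      Float32List.fromList([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]);
  var lanes = Float32x4List.view(values.buffer);

  // One SIMD operation per lane scales 4 entries at a time:
  var scale = Float32x4.splat(2.0);
  for (var i = 0; i < lanes.length; i++) {
    lanes[i] *= scale;
  }

  print(values); // each entry doubled
}
```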
Activation Functions
ActivationFunction is the base class for ANN neuron activation functions:
- ActivationFunctionSigmoid: The classic Sigmoid function (returns, for an input x, a value between 0.0 and 1.0).
- ActivationFunctionSigmoidFast: A fast approximation of the Sigmoid function that is not based on exp(x). Function author: Graciliano M. Passos (gmpassos@GitHub).
- ActivationFunctionSigmoidBoundedFast: A fast approximation of the Sigmoid function that is not based on exp(x), bounded to a lower and upper limit for x. Function author: Graciliano M. Passos (gmpassos@GitHub).
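For reference, the classic Sigmoid can be written in plain Dart; this is the standard mathematical definition, independent of the library's own SIMD implementation:

```dart
import 'dart:math' as math;

/// Classic Sigmoid: maps any x to the open interval (0.0, 1.0).
double sigmoid(double x) => 1 / (1 + math.exp(-x));

void main() {
  print(sigmoid(0.0));  // 0.5
  print(sigmoid(4.0));  // close to 1.0
  print(sigmoid(-4.0)); // close to 0.0
}
```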
exp(x)
exp is the natural exponential function: e raised to the power x.
This is an important function for ANNs, since it is used by the popular Sigmoid function. A high-precision version is usually slow, so approximation versions can be used for most ANN models and training algorithms.
Fast Math
An internal Fast Math library is present and can be used on platforms where computing exp (the exponential function) is not efficient. You can import this library to create a specialized ActivationFunction implementation, or use it in any kind of project:
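A sketch of what that import could look like. The library URI below is an assumption; only the `expFloat32x4` function is named in this README, so verify both against the package before using:

```dart
import 'dart:typed_data';

// Assumed library URI; check the package docs for the exact name.
import 'package:eneural_net/eneural_net_fast_math.dart' as fast_math;

void main() {
  // SIMD fast exponential over a lane of 4 values
  // (`expFloat32x4` is the function mentioned in this README):
  var lane = Float32x4(0.0, 1.0, 2.0, 3.0);
  var e4 = fast_math.expFloat32x4(lane);
  print(e4);
}
```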
The implementation is based on the Dart package Complex:
The fast_math.expFloat32x4 function was created by Graciliano M. Passos (gmpassos@GitHub).