Library   "FunctionNNLayer"
Generalized neural network layer method.
 function(inputs, weights, n_nodes, activation_function, bias, alpha, scale)  Generalized layer.
  Parameters:
     inputs : float array, input values.
     weights : float array, weight values.
     n_nodes : int, number of nodes in layer.
     activation_function : string, default='sigmoid', name of the activation function used.
     bias : float, default=1.0, bias to pass into activation function.
      alpha : float, default=na, passed to the activation function if it requires one.
      scale : float, default=na, passed to the activation function if it requires one.
  Returns: float
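A minimal usage sketch, assuming a hypothetical import path/version and a flat n_nodes * n_inputs weight layout (both are assumptions, not part of this documentation):

    //@version=5
    indicator("FunctionNNLayer sketch")
    // Hypothetical import path and version; replace with the actual publisher/release.
    import Publisher/FunctionNNLayer/1 as nn
    // Two inputs into a two-node layer: 2 * 2 = 4 weights (assumed flat layout).
    inputs  = array.from(close, open)
    weights = array.from(0.5, -0.25, 0.1, 0.9)
    // Documented defaults: activation_function='sigmoid', bias=1.0.
    plot(nn.function(inputs, weights, 2, 'sigmoid', 1.0))
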
Library   "FunctionNNPerceptron"
Perceptron function for neural networks.
 function(inputs, weights, bias, activation_function, alpha, scale)  Generalized perceptron node for neural networks.
  Parameters:
     inputs : float array, the inputs of the perceptron.
     weights : float array, the weights for inputs.
     bias : float, default=1.0, the default bias of the perceptron.
     activation_function : string, default='sigmoid', activation function applied to the output.
      alpha : float, default=na, passed to the activation function if it requires one.
      scale : float, default=na, passed to the activation function if it requires one.
   Returns: float
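Conceptually, the node computes a weighted sum of the inputs plus the bias, then applies the activation. A self-contained sketch of that behaviour (hard-coding a sigmoid instead of the library's name-based activation dispatch):

    //@version=5
    indicator("Perceptron sketch")
    // Logistic activation: 1 / (1 + e^-x).
    sigmoid(float x) =>
        1.0 / (1.0 + math.exp(-x))
    // Weighted sum of inputs plus bias, passed through the activation.
    perceptron(float[] inputs, float[] weights, float bias) =>
        float acc = bias
        for i = 0 to array.size(inputs) - 1
            acc += array.get(inputs, i) * array.get(weights, i)
        sigmoid(acc)
    plot(perceptron(array.from(close, open), array.from(0.5, -0.5), 1.0))
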
Library   "MLActivationFunctions"
Activation functions for neural networks.
 binary_step(value)  Basic threshold output classifier to activate/deactivate neuron.
  Parameters:
     value : float, value to process.
  Returns: float
 linear(value)  Input is the same as output.
  Parameters:
     value : float, value to process.
  Returns: float
 sigmoid(value)  Sigmoid or logistic function.
  Parameters:
     value : float, value to process.
  Returns: float
 sigmoid_derivative(value)  Derivative of sigmoid function.
  Parameters:
     value : float, value to process.
  Returns: float
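For reference, these follow the standard definitions " sigmoid(x) = 1 / (1 + e^-x) " and " sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) ".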
 tanh(value)  Hyperbolic tangent function.
  Parameters:
     value : float, value to process.
  Returns: float
 tanh_derivative(value)  Hyperbolic tangent function derivative.
  Parameters:
     value : float, value to process.
  Returns: float
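Standard definitions: " tanh(x) = (e^x - e^-x) / (e^x + e^-x) " and " tanh'(x) = 1 - tanh(x)^2 ".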
 relu(value)  Rectified linear unit (RELU) function.
  Parameters:
     value : float, value to process.
  Returns: float
 relu_derivative(value)  RELU function derivative.
  Parameters:
     value : float, value to process.
  Returns: float
 leaky_relu(value)  Leaky RELU function.
  Parameters:
     value : float, value to process.
  Returns: float
 leaky_relu_derivative(value)  Leaky RELU function derivative.
  Parameters:
     value : float, value to process.
  Returns: float
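Standard definitions: " relu(x) = max(0, x) ", with derivative 1 for x > 0 and 0 otherwise; " leaky_relu(x) = x for x > 0, a * x otherwise " for a small slope a (commonly 0.01), with derivative 1 for x > 0 and a otherwise.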
 relu6(value)  RELU-6 function.
  Parameters:
     value : float, value to process.
  Returns: float
 softmax(value)  Softmax function.
  Parameters:
     value : float array, values to process.
  Returns: float
 softplus(value)  Softplus function.
  Parameters:
     value : float, value to process.
  Returns: float
 softsign(value)  Softsign function.
  Parameters:
     value : float, value to process.
  Returns: float
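Standard definitions: " relu6(x) = min(max(0, x), 6) ", " softmax(x_i) = e^x_i / sum_j e^x_j ", " softplus(x) = log(1 + e^x) " and " softsign(x) = x / (1 + |x|) ". Softmax is the only one above that takes the whole array, normalizing it into a probability distribution.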
 elu(value, alpha)  Exponential Linear Unit (ELU) function.
  Parameters:
     value : float, value to process.
      alpha : float, default=1.0, predefined constant that controls the value to which an ELU saturates for negative net inputs.
  Returns: float
 selu(value, alpha, scale)  Scaled Exponential Linear Unit (SELU) function.
  Parameters:
     value : float, value to process.
      alpha : float, default=1.67326324, predefined constant that controls the value to which an SELU saturates for negative net inputs.
     scale : float, default=1.05070098, predefined constant.
  Returns: float
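Standard definitions: " elu(x) = x for x > 0, alpha * (e^x - 1) otherwise " and " selu(x) = scale * (x for x > 0, alpha * (e^x - 1) otherwise) "; the documented defaults alpha=1.67326324 and scale=1.05070098 are the usual self-normalizing constants.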
 exponential(value)  Wrapper for the math.exp() function.
  Parameters:
     value : float, value to process.
  Returns: float
 function(name, value, alpha, scale)  Activation function.
  Parameters:
     name : string, name of activation function.
     value : float, value to process.
      alpha : float, default=na, passed to the activation function if it requires one.
      scale : float, default=na, passed to the activation function if it requires one.
  Returns: float
 derivative(name, value, alpha, scale)  Derivative of the named activation function.
  Parameters:
     name : string, name of activation function.
     value : float, value to process.
      alpha : float, default=na, passed to the activation function if it requires one.
      scale : float, default=na, passed to the activation function if it requires one.
  Returns: float
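A usage sketch of the name-based dispatch (the import path and version below are assumptions):

    //@version=5
    indicator("MLActivationFunctions sketch")
    // Hypothetical import path and version; replace with the actual publisher/release.
    import Publisher/MLActivationFunctions/1 as act
    // Normalize price into a small range, then activate by name.
    value = (close - ta.sma(close, 20)) / ta.stdev(close, 20)
    plot(act.function('tanh', value))
    plot(act.derivative('tanh', value), color=color.orange)
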
Library   "MLLossFunctions"
Methods for loss functions.
 mse(expects, predicts)  Mean Squared Error (MSE) " MSE = 1/N * sum ((y - y')^2) ".
  Parameters:
     expects : float array, expected values.
     predicts : float array, prediction values.
  Returns: float
 binary_cross_entropy(expects, predicts)  Binary Cross-Entropy Loss (log).
  Parameters:
     expects : float array, expected values.
     predicts : float array, prediction values.
  Returns: float
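Binary cross-entropy follows the standard definition " BCE = -1/N * sum (y * log(y') + (1 - y) * log(1 - y')) ". A usage sketch (the import path and version below are assumptions):

    //@version=5
    indicator("MLLossFunctions sketch")
    // Hypothetical import path and version; replace with the actual publisher/release.
    import Publisher/MLLossFunctions/1 as loss
    expects  = array.from(1.0, 0.0, 1.0)   // target values
    predicts = array.from(0.9, 0.2, 0.7)   // model outputs in (0, 1)
    plot(loss.mse(expects, predicts))
    plot(loss.binary_cross_entropy(expects, predicts), color=color.orange)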



