The Influence of Fractional Calculus in Sequential Models with Low Neuron Count

Abstract:

We study the influence of different fractional-order activation functions on the accuracy of a sequential neural network model with an extremely low neuron count. This methodology enables us to extract better results from an otherwise simple neural network architecture on a synthetic dataset. Our research shows that the effect, beneficial or detrimental depending on the order of the fractional derivative, grows as the number of neurons in the model decreases.
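To make the setup concrete, the sketch below illustrates one plausible instantiation of the idea: a Keras sequential model with a handful of hidden neurons whose activation is a Caputo-type fractional derivative of ReLU, D^alpha max(x, 0) = x^(1-alpha) / Gamma(2-alpha) for x > 0. All names, the choice of fractional activation, the neuron count, and the synthetic dataset are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch, assuming TensorFlow/Keras, a Caputo-type fractional ReLU,
# and a toy synthetic regression task. Hyperparameters are illustrative only.
import numpy as np
import tensorflow as tf
from scipy.special import gamma

ALPHA = 0.5  # fractional order of the derivative (illustrative value)

def fractional_relu(x, alpha=ALPHA):
    # Caputo fractional derivative of ReLU of order alpha:
    #   D^alpha max(x, 0) = x**(1 - alpha) / Gamma(2 - alpha) for x > 0, else 0.
    # A small epsilon keeps the gradient finite near x = 0.
    positive = tf.nn.relu(x)
    return tf.pow(positive + 1e-6, 1.0 - alpha) / gamma(2.0 - alpha)

# Sequential model with a deliberately low neuron count (here, 4 hidden units).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(4),
    tf.keras.layers.Activation(fractional_relu),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Synthetic dataset: a noisy nonlinear target (purely illustrative).
x = np.linspace(-1.0, 1.0, 500).reshape(-1, 1).astype("float32")
y = np.sin(3.0 * x) + 0.1 * np.random.randn(*x.shape).astype("float32")
model.fit(x, y, epochs=50, verbose=0)
```

Sweeping ALPHA over several values and comparing validation loss at a fixed neuron count would reproduce the kind of comparison the abstract describes.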