 # Boost :

From: Janek Kozicki (janek_listy_at_[hidden])
Date: 2006-10-08 13:12:19

John Maddock said: (by the date of Sun, 8 Oct 2006 16:12:02 +0100)

> Janek Kozicki wrote:
> > Please note that it's the function shape that is important, not its
> > exact vaules at any point. So if you know a 'faster' function that
> > has similar shape and has a first derivative I'd like to know about
> > it too!
>
> I guess if x is small, then you could use a truncated Taylor series, and
> either evaluate it as a polynomial, or maybe even just as:
>
> 1/(2 - x + x*x)
>
> but other than that, I'd be surprised if there's anything much faster than
> exp()?

(to readers: this function is a bell curve.)

hmm.. you are right. My knowledge of neural networks is limited; I
haven't taken a dedicated course, so it's only what I managed to learn
myself from articles I found. And I learned only enough to get my job
done, and to have some fun with it along the way.

My conclusion is that maybe I'm wrong to say that only the shape
matters (although I believe one of the papers I read said exactly that).
But then - why aren't people using a simple polynomial for that?

Oh, I know why.... its integral is this:

2*atan((2*x-1)/sqrt(7))/sqrt(7)

(this is a sigmoid function)

I believe that backpropagation will work correctly when the
derivatives/integrals are exact.

Neural networks use both the sigmoid and its derivative (a bell
curve), and they both need to be fast.

Oh, and I found another reason right here in wikipedia :)

"
One reason for the sigmoid's popularity in neural networks is that the
sigmoid function satisfies this property:

sigmoid_derivative(x) = sigmoid(x)*(1-sigmoid(x))
"

So I was wrong saying that it's only the shape that is important :)

So, are you going to add those?

- sigmoid,
- sigmoid_derivative,
- double_sigmoid,
- double_sigmoid_derivative

The derivative of the double sigmoid can be built as a combination of
two sigmoid_derivatives :)

Another question: how about the Fast Fourier Transform? Is it by definition