From: Janek Kozicki (janek_listy_at_[hidden])
Date: 2006-10-08 09:36:52
John Maddock said (on Fri, 6 Oct 2006 10:04:56 +0100):
> Janek Kozicki wrote:
> >> Mathematical Special Functions
> > how about sigmoid, double sigmoid and their derivatives? Used for
> > neural networks.
> Those are new ones to me.
Yes. Actually there is a wide family of sigmoid functions, all with a
similar shape. However, one particular function is called "the sigmoid",
and unless it is specified which sigmoid function is meant, this one is
assumed:
sigmoid(x)=1/(1+exp(-x))
Theodore, please note that:
(tanh(x/2)+1)/2 = 1/(1+exp(-x))
It's the same function, only written in a different way.
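To see why, expand tanh in terms of exponentials and divide through
by exp(x/2):

(tanh(x/2)+1)/2 = ((exp(x/2)-exp(-x/2))/(exp(x/2)+exp(-x/2)) + 1)/2
                = exp(x/2)/(exp(x/2)+exp(-x/2))
                = 1/(1+exp(-x))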
The first parameter to tweak here is the steepness:
sigmoid(x,s)=1/(1+exp(-x/s))
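For reference, a minimal C++ sketch of these two forms (the function
names are mine, not from any existing library):

  #include <cmath>

  // logistic sigmoid: 1/(1+exp(-x))
  double sigmoid(double x)
  {
      return 1.0 / (1.0 + std::exp(-x));
  }

  // sigmoid with steepness parameter s: 1/(1+exp(-x/s))
  double sigmoid(double x, double s)
  {
      return 1.0 / (1.0 + std::exp(-x / s));
  }

A smaller s gives a steeper transition around x = 0, and s = 1 reduces
to the plain sigmoid.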
But there are other sigmoid functions.
The double sigmoid is this:
double_sigmoid(x,d,s) = sign(x-d) * (1 - exp(-((x-d)/s)^2))
d - function centre
s - steepness factor
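A direct C++ transcription of the above (a sketch; sign(0) is taken as
+1 here, and the name is only illustrative):

  #include <cmath>

  // double sigmoid: sign(x-d) * (1 - exp(-((x-d)/s)^2))
  // d - function centre, s - steepness factor
  double double_sigmoid(double x, double d, double s)
  {
      double sign = (x < d) ? -1.0 : 1.0;  // sign(x-d), with sign(0) = +1
      double t = (x - d) / s;
      return sign * (1.0 - std::exp(-t * t));
  }

It runs from -1 to +1 and crosses zero at x = d.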
Actually it's all here:
http://en.wikipedia.org/wiki/Sigmoid_function
http://en.wikipedia.org/wiki/Gaussian_curve
Both sigmoid functions and bell curves (e.g. the Gaussian) are useful in
neural networks, but it is also necessary to have their first
derivatives. The most popular learning algorithm, backpropagation,
performs gradient descent in n-dimensional space, and there the
derivatives are necessary.
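For the standard sigmoid the derivative is particularly convenient,
because it can be expressed through the function's own output:
sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A sketch of how
backpropagation can exploit that (again, the names are mine):

  #include <cmath>

  double sigmoid(double x)
  {
      return 1.0 / (1.0 + std::exp(-x));
  }

  // derivative written in terms of the output y = sigmoid(x);
  // the forward pass already computed y, so no extra exp() is needed
  double sigmoid_derivative(double y)
  {
      return y * (1.0 - y);
  }

This is one reason the 1/(1+exp(-x)) form is so popular for
backpropagation.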
Please note that it's the function shape that is important, not its exact
values at any point. So if you know a 'faster' function that has a similar
shape and a first derivative, I'd like to know about it too!
-- Janek Kozicki