Boost :
From: Jeff Garland (jeff_at_[hidden])
Date: 2004-11-23 09:47:09
On Mon, 22 Nov 2004 22:33:58 +0100, Pavol Droba wrote
> I see that I haven't given the best example. I was just trying to
> point out the problem; unfortunately, you have provided a solution
> to a different one.
>
> So here is another example.
>
> Imagine a statistical system. The input can be an arbitrary type of
> data. It would be very hard to write an algorithm that works for any
> particular type or combination of types. Therefore, the data is
> normalized, i.e. converted to a real number in the interval <0,1>.
> The algorithm processes these normalized values, and at the end the
> results are expanded back to the original types.
>
> This is a common practice when data of multiple types must be
> handled and intermixed.
Ok, I'm not clear on why that would be, but I'll take your word for it.
Personally, I'd be concerned about the loss of accuracy in these conversions.
> Now it is clear that, for the expansion, one needs to multiply a
> time_duration by a real-number factor.
So I take it that the data is not regular -- that is, it might be 0.1, 0.4,
0.9, 0.95, etc.?
I'm still unsure how this interface would operate. Should I truncate or round
fractional values that result? The internal representation of time_duration
isn't changing from an integer to a real, because that would break the
correctness of the current operations. So this calculation will require
conversion to real and back.
At the moment the benefit of putting this in the library doesn't outweigh the
issues it raises for me (I'm still open to persuasion). I'm inclined to leave
this as a user-written function -- especially since I can't write it any more
efficiently than you can...
Jeff
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk