[SciPy-user] constrained optimization
Mon Apr 28 13:57:41 CDT 2008
On Mon, Apr 28, 2008 at 1:34 PM, John Hunter <firstname.lastname@example.org> wrote:
> I need to do a N dimensional constrained optimization over a weight w
> vector with the constraints:
> * w[i] >=0
> * w.sum() == 1.0
> Scanning through the scipy.optimize docs, I see a number of examples
> where parameters can be bounded by a bracketing interval, but none
> where constraints can be placed on combinations of the parameters, eg
> the sum of them. One approach I am considering is doing a bracketed
> [0,1] constrained optimization over N-1 weights (assigning the last
> weight to be 1-sum others) and modifying my cost function to punish
> the optimizer when the N-1 input weights sum to more than one.
> Is there a better approach?
Transform the coordinates to an unconstrained (N-1)-dimensional space.
One such transformation is the Aitchison (or "additive log-ratio") transform:
y = log(x[:-1] / x[-1])
And to go back:
tmp = hstack([exp(y), 1.0])
x = tmp / tmp.sum()
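Spelled out with imports as a runnable round trip (the names `alr` / `alr_inv` are just mine, not anything in scipy):

```python
import numpy as np

def alr(x):
    # Additive log-ratio: map a composition (strictly positive entries
    # summing to 1) to an unconstrained (N-1)-vector.
    return np.log(x[:-1] / x[-1])

def alr_inv(y):
    # Inverse: any real (N-1)-vector maps back to a valid composition
    # (nonnegative entries summing to 1).
    tmp = np.hstack([np.exp(y), 1.0])
    return tmp / tmp.sum()

w = np.array([0.2, 0.3, 0.5])
y = alr(w)          # unconstrained coordinates
w_back = alr_inv(y) # recovers w
```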
Searching for "compositional data analysis" will yield similar
transformations, but this one should be sufficient for maintaining
the constraints. For doing statistics, the others have better properties.
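To show how this plugs into the optimizer, here is a minimal sketch with a toy quadratic cost (distance to a known target composition, so the optimizer should just recover the target); `fmin` (Nelder-Mead) stands in for whichever scipy.optimize routine you actually use:

```python
import numpy as np
from scipy.optimize import fmin

def alr_inv(y):
    # Map unconstrained (N-1)-vector back to a composition.
    tmp = np.hstack([np.exp(y), 1.0])
    return tmp / tmp.sum()

# Toy cost for illustration only: squared distance to a target composition.
target = np.array([0.2, 0.3, 0.5])

def cost(y):
    w = alr_inv(y)  # w automatically satisfies w >= 0 and w.sum() == 1
    return np.sum((w - target) ** 2)

y0 = np.zeros(len(target) - 1)      # uniform composition as the start
y_opt = fmin(cost, y0, disp=False)  # unconstrained search in y-space
w_opt = alr_inv(y_opt)              # feasible weights by construction
```

The point is that the optimizer never sees the constraints at all: every trial point in y-space maps to a feasible weight vector, so no penalty terms are needed.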
"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
-- Umberto Eco