[SciPy-User] constrained optimization: question about Jacobian

Andreas Hilboll lists@hilboll...
Tue Mar 26 06:36:23 CDT 2013


I want to perform constrained optimization of the function

import numpy as np

def F(X, data):
    # x1, x2, x3, t2 are scalars; data is a constant 1-d np.array
    # piecewise-linear model with a breakpoint at t2, least-squares cost
    x1, x2, x3, t2 = X
    T = np.arange(data.size)
    t1, t3 = T[0], T[-1]
    Xbefore = x1 + (T - t1) * (x2 - x1) / (t2 - t1)
    Xafter = x2 + (T - t2) * (x3 - x2) / (t3 - t2)
    Xbreak = np.where(T <= t2, Xbefore, Xafter)
    return ((Xbreak - data)**2).sum()

where the breakpoint parameter t2 must satisfy 0 <= t2 < data.size.
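
For context, this is roughly how I imagine passing that bound to
scipy.optimize.minimize (only a sketch; the data, the starting guess x0,
and the choice of SLSQP are placeholders of mine):

import numpy as np
from scipy.optimize import minimize

data = np.random.rand(100)                          # placeholder data
x0 = np.array([data[0], data[50], data[-1], 50.0])  # x1, x2, x3, t2

# box bounds: the x_i are free, t2 is confined to [0, data.size - 1]
# (strictly t2 < t3 = data.size - 1 is needed to avoid a zero
# denominator in Xafter)
bounds = [(None, None), (None, None), (None, None), (0, data.size - 1)]
res = minimize(F, x0, args=(data,), method='SLSQP', bounds=bounds)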

According to the tutorial
(http://docs.scipy.org/doc/scipy/reference/tutorial/optimize.html#constrained-minimization-of-multivariate-scalar-functions-minimize),
I need to define the Jacobian. However, I'm unsure how to define the
derivative with respect to t2.
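
In case it helps to see what I have in mind, this is what I think the
partial derivatives would look like if I differentiate the two linear
pieces and treat the np.where split (which points belong to which
segment) as locally fixed, so F is really only piecewise differentiable
in t2. The name F_jac and the whole sketch are mine and untested:

import numpy as np

def F_jac(X, data):
    # gradient of F w.r.t. (x1, x2, x3, t2), with the segment
    # assignment held fixed
    x1, x2, x3, t2 = X
    T = np.arange(data.size)
    t1, t3 = T[0], T[-1]
    before = T <= t2
    Xbefore = x1 + (T - t1) * (x2 - x1) / (t2 - t1)
    Xafter = x2 + (T - t2) * (x3 - x2) / (t3 - t2)
    r = np.where(before, Xbefore, Xafter) - data   # residuals

    # partial derivatives of the model values w.r.t. each parameter
    d_x1 = np.where(before, 1 - (T - t1) / (t2 - t1), 0.0)
    d_x2 = np.where(before, (T - t1) / (t2 - t1),
                    1 - (T - t2) / (t3 - t2))
    d_x3 = np.where(before, 0.0, (T - t2) / (t3 - t2))
    d_t2 = np.where(before,
                    -(T - t1) * (x2 - x1) / (t2 - t1)**2,
                    (x3 - x2) * (T - t3) / (t3 - t2)**2)

    # chain rule: dF/dp = sum over points of 2 * r * d(model)/dp
    return 2 * np.array([(r * d_x1).sum(),
                         (r * d_x2).sum(),
                         (r * d_x3).sum(),
                         (r * d_t2).sum()])

If that is sound, I suppose it could be passed via jac=F_jac, but I'm
not sure it is the right way to handle the breakpoint parameter.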

Any ideas are greatly appreciated :)

Cheers, A.

