Finding the roots of a multivariable equation with scipy.optimize

爱德华多·维埃拉(Eduardo Vieira)

I have the following functions:

import numpy as np
import scipy.optimize as optimize

def x(theta1, theta2, w, h, L1, L2):
    sint1 = np.sin(theta1)
    cost1 = np.cos(theta1)
    sint2 = np.sin(theta2)
    cost2 = np.cos(theta2)

    i1 = L1 * (cost1 + cost2) + w
    j1 = L1 * (sint1 - sint2) - h
    D = np.sqrt((L1*(cost2-cost1)+w)**2+(L1*(sint2-sint1)+h)**2)
    a = (0.25)*np.sqrt((4*L2**2-D**2)*D**2)

    return i1/2 + 2*j1*a/(D**2)

def y(theta1, theta2, w, h, L1, L2):
    sint1 = np.sin(theta1)
    cost1 = np.cos(theta1)
    sint2 = np.sin(theta2)
    cost2 = np.cos(theta2)

    i2 = L1 * (sint1 + sint2) + h
    j2 = L1 * (cost1 - cost2) - w
    D = np.sqrt((L1*(cost2-cost1)+w)**2+(L1*(sint2-sint1)+h)**2)
    a = (0.25)*np.sqrt((4*L2**2-D**2)*D**2)

    return i2/2 - 2*j2*a/(D**2)

def det_jacobiano(theta, w, h, L1, L2,eps):
    theta1,theta2 = theta
    dxdt1 = (-x(theta1+eps, theta2, w, h, L1, L2)+4*x(theta1, theta2, w, h, L1, L2)-3*x(theta1-eps, theta2, w, h, L1, L2))/(2*eps)
    dxdt2 = (-x(theta1, theta2+eps, w, h, L1, L2)+4*x(theta1, theta2, w, h, L1, L2)-3*x(theta1, theta2-eps, w, h, L1, L2))/(2*eps)
    dydt1 = (-y(theta1+eps, theta2, w, h, L1, L2)+4*y(theta1, theta2, w, h, L1, L2)-3*y(theta1-eps, theta2, w, h, L1, L2))/(2*eps)
    dydt2 = (-y(theta1, theta2+eps, w, h, L1, L2)+4*y(theta1, theta2, w, h, L1, L2)-3*y(theta1, theta2-eps, w, h, L1, L2))/(2*eps)  
    return dxdt1*dydt2 - dxdt2*dydt1

I want to find the values of theta1 and theta2 that make det_jacobiano zero. As you can see, det_jacobiano evaluates the functions x and y.

When I try to find the root using scipy.optimize:

initial_guess = [2.693, 0.4538]
result = optimize.root(det_jacobiano, initial_guess,tol=1e-8,args=(20,0,100,100,1e-10),method='lm')

I get the error: TypeError: Improper input: N=2 must not exceed M=1

Answer

Root finding is the numerical-computation equivalent of solving a system of equations. The same basic constraint applies: you need as many equations as you have unknowns.

All of scipy's root-finding routines expect their first argument to be a function of N variables that returns N values. In essence, that first argument must be equivalent to a system of N equations in N unknowns. So your problem is that det_jacobiano takes 2 variables but returns only a single value.
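For illustration, here is what a well-posed call to optimize.root looks like: a function of two variables that returns two residuals. This is a toy system chosen for the sketch, not the question's kinematic equations:

```python
import numpy as np
import scipy.optimize as optimize

# Toy system of 2 equations in 2 unknowns (an assumption for
# illustration, not the question's system):
#   x**2 + y**2 = 4
#   x * y       = 1
def system(v):
    x, y = v
    return [x**2 + y**2 - 4, x * y - 1]

sol = optimize.root(system, [1.0, 1.0], method='lm')
# sol.x satisfies both equations to within solver tolerance
```

Because the function returns as many residuals as it has unknowns (N=2, M=2), the 'lm' method accepts it without the "N must not exceed M" complaint.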

You can't use a root-finding method with your current formulation, but you can still do minimization. Change the last line of det_jacobiano to:

return np.abs(dxdt1*dydt2 - dxdt2*dydt1)

and then use optimize.minimize:

result = optimize.minimize(det_jacobiano, initial_guess, tol=1e-8, args=(20,0,100,100,1e-10), method='Nelder-Mead')
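If you'd rather not edit det_jacobiano itself, the same trick works by wrapping the objective in np.abs at call time with a lambda. A minimal self-contained sketch using a stand-in scalar function (f below is an assumption for illustration, not the question's determinant):

```python
import numpy as np
import scipy.optimize as optimize

# Stand-in scalar function of two variables (assumption for illustration);
# it crosses zero along a curve, like det_jacobiano does
def f(theta):
    t1, t2 = theta
    return np.sin(t1) * np.cos(t2) - 0.5

# Wrap in np.abs at call time instead of modifying the function;
# Nelder-Mead is derivative-free, so the kink at |f| = 0 is fine
res = optimize.minimize(lambda t: np.abs(f(t)), [0.3, 0.2],
                        method='Nelder-Mead', tol=1e-10)
# res.x is a point where f is (numerically) zero
```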

Output:

 final_simplex: (array([[ 1.47062275, -3.46178428],
       [ 1.47062275, -3.46178428],
       [ 1.47062275, -3.46178428]]), array([ 0.,  0.,  0.]))
           fun: 0.0
       message: 'Optimization terminated successfully.'
          nfev: 330
           nit: 137
        status: 0
       success: True
             x: array([ 1.47062275, -3.46178428])

result.fun holds the final minimized value (which is indeed the 0.0 you wanted), and result.x holds the values of theta1, theta2 that produce that 0.0.
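One caveat worth keeping in mind: with one equation in two unknowns, the zero set of det_jacobiano is generically a curve rather than an isolated point, so different initial guesses can converge to different zeros. A toy illustration (g below is an assumption, not the question's system):

```python
import numpy as np
import scipy.optimize as optimize

# |g| vanishes on the whole unit circle, not at a single point
def g(t):
    t1, t2 = t
    return np.abs(t1**2 + t2**2 - 1.0)

a = optimize.minimize(g, [2.0, 0.0], method='Nelder-Mead', tol=1e-10).x
b = optimize.minimize(g, [0.0, 2.0], method='Nelder-Mead', tol=1e-10).x
# a and b both minimize |g| to ~0, but are different points on the circle
```

So it is worth trying several initial guesses and checking which of the resulting (theta1, theta2) pairs is physically meaningful for your mechanism.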
