A Python Alternative to fminunc
Recently I had some spare time, so I decided to redo the exercises from the Stanford ML course on Coursera in Python, partly to reinforce my ML fundamentals and partly to get more familiar with libraries such as numpy, matplotlib, and scipy.
In EX2, MATLAB's fminunc function is used to optimize theta, and I didn't know how to do the same in Python. After some searching, I found a Stack Overflow answer suggesting the minimize function from scipy as a replacement. When I first passed in my costfunction and gradient directly, the program threw errors complaining that the dimensions (3,) and (100,1) did not match and that the gradient vector was wrong. After many attempts I finally figured out where the problem was.
Let's look at the function's documentation with np.info(minimize). The parameters are:
fun : callable
    The objective function to be minimized. ``fun(x, *args) -> float``
    where x is a 1-D array with shape (n,) and `args` is a tuple of the fixed
    parameters needed to completely specify the function.
x0 : ndarray, shape (n,)
    Initial guess. Array of real elements of size (n,), where 'n' is the
    number of independent variables.
args : tuple, optional
    Extra arguments passed to the objective function and its derivatives
    (`fun`, `jac` and `hess` functions).
method : str or callable, optional
    Type of solver. Should be one of
        - 'Nelder-Mead'
        - 'Powell'
        - 'CG'
        - 'BFGS'
        - 'Newton-CG'
        - 'L-BFGS-B'
        - 'TNC'
        - 'COBYLA'
        - 'SLSQP'
        - 'trust-constr'
        - 'dogleg'
        - 'trust-ncg'
        - 'trust-exact'
        - 'trust-krylov'
        - custom - a callable object (added in version 0.14.0), see below for description.
    If not given, chosen to be one of ``BFGS``, ``L-BFGS-B``, ``SLSQP``,
    depending if the problem has constraints or bounds.
jac : {callable, '2-point', '3-point', 'cs', bool}, optional
    Method for computing the gradient vector. Only for CG, BFGS, Newton-CG,
    L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg, trust-krylov, trust-exact and
    trust-constr. If it is a callable, it should be a function that returns
    the gradient vector: ``jac(x, *args) -> array_like, shape (n,)``
    where x is an array with shape (n,) and `args` is a tuple with the fixed
    parameters. Alternatively, the keywords {'2-point', '3-point', 'cs'}
    select a finite difference scheme for numerical estimation of the
    gradient. Options '3-point' and 'cs' are available only to 'trust-constr'.
    If `jac` is a Boolean and is True, `fun` is assumed to return the gradient
    along with the objective function. If False, the gradient will be
    estimated using '2-point' finite difference estimation.
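To make the calling convention concrete, here is a minimal, self-contained sketch on a toy quadratic (the names f, grad and c are made up for illustration and are not part of the exercise):

import numpy as np
import scipy.optimize as op

c = np.array([1.0, -2.0, 3.0])     # fixed parameter, forwarded to fun/jac via `args`

def f(x, c):                       # x arrives as a 1-D array of shape (n,)
    return np.sum((x - c) ** 2)    # must return a float

def grad(x, c):                    # must return a 1-D array of shape (n,)
    return 2 * (x - c)

res = op.minimize(fun=f, x0=np.zeros(3), args=(c,), method='BFGS', jac=grad)
print(res.x)                       # close to [1., -2., 3.]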
Note that for the function passed via the fun keyword argument, the theta being optimized must be the first parameter, with X and y after it. Also, theta must be passed in as a 1-D array of shape (n,), otherwise it will fail.
Then jac is the gradient. Two things to watch here: first, the theta passed in must again be a 1-D array of shape (n,); second, the returned gradient must also be a 1-D array of shape (n,).
In short, the key is that the theta passed in must be 1-D with shape (n,), or it simply won't work. I had previously reshaped theta into an (n, 1) column vector for convenience, and that is exactly what made minimize fail (see the sketch below). So learning to read the documentation with help really is important.
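A small sketch of what actually goes wrong when the shapes mix, assuming a design matrix with 100 rows and 3 columns as in the exercise (the variable names here are illustrative):

import numpy as np

m, n = 100, 3
X = np.ones((m, n))
Y = np.ones((m, 1))                  # labels kept as an (m, 1) column vector
theta = np.zeros(n)                  # minimize passes theta as shape (n,)

h = X.dot(theta)                     # predictions, shape (100,)
print((h - Y.ravel()).shape)         # (100,)      -- what the gradient needs
print((h - Y).shape)                 # (100, 100)  -- silent broadcasting, wrong result

# Either flatten Y once up front, or flatten the gradient before returning it,
# so that both theta and the returned gradient stay 1-D arrays of shape (n,).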
import numpy as np
import pandas as pd
import scipy.optimize as op

def LoadData(filename):
    data = pd.read_csv(filename, header=None)
    data = np.array(data)
    return data

def ReshapeData(data):
    m = np.size(data, 0)
    X = data[:, 0:2]
    Y = data[:, 2]
    Y = Y.reshape((m, 1))
    return X, Y

def InitData(X):
    m, n = X.shape
    initial_theta = np.zeros(n + 1)      # 1-D initial guess, shape (n+1,)
    VecOnes = np.ones((m, 1))            # bias column
    X = np.column_stack((VecOnes, X))
    return X, initial_theta

def sigmoid(x):
    z = 1 / (1 + np.exp(-x))
    return z

def costFunction(theta, X, Y):
    m = X.shape[0]
    J = (-np.dot(Y.T, np.log(sigmoid(X.dot(theta))))
         - np.dot((1 - Y).T, np.log(1 - sigmoid(X.dot(theta))))) / m
    return J

def gradient(theta, X, Y):
    m, n = X.shape
    theta = theta.reshape((n, 1))        # column vector for the matrix algebra
    grad = np.dot(X.T, sigmoid(X.dot(theta)) - Y) / m
    return grad.flatten()                # minimize expects a 1-D gradient

if __name__ == '__main__':
    data = LoadData('ex2data1csv.csv')
    X, Y = ReshapeData(data)
    X, initial_theta = InitData(X)
    result = op.minimize(fun=costFunction, x0=initial_theta, args=(X, Y), method='TNC', jac=gradient)
    print(result)
The result is shown below and matches what fminunc gives in MATLAB (fminunc: cost: 0.203, theta: -25.161, 0.206, 0.201):
     fun: array([0.2034977])
     jac: array([8.95038682e-09, 8.16149951e-08, 4.74505693e-07])
 message: 'Local minimum reached (|pg| ~= 0)'
    nfev: 36
     nit: 17
  status: 0
 success: True
       x: array([-25.16131858, 0.20623159, 0.20147149])
Since I knew the cost should end up around 0.203, I also tried the most naive gradient descent. It became extremely slow towards the end, so I set the stopping condition to while J > 0.21, and it still took roughly 130,000 iterations. This shows how much it pays to use a well-engineered optimization routine. Also, I used to think that with an unsuitable learning rate J would simply keep diverging, but yesterday's experiment showed that with some learning rates J diverges at first and still converges later.
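For comparison, here is a rough sketch of that kind of naive gradient-descent loop. The stopping threshold 0.21 comes from the text above; the learning rate alpha is a placeholder that needs tuning for this data set (the value actually used was not given), and the rest is an assumption rather than the original code:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def plain_gradient_descent(X, Y, alpha, target_cost=0.21):
    # Naive batch gradient descent on the logistic-regression cost,
    # stopping once the cost drops below target_cost.
    m, n = X.shape
    theta = np.zeros(n)
    y = Y.ravel()                    # keep everything 1-D to avoid broadcasting bugs
    iters = 0
    while True:
        h = sigmoid(X.dot(theta))
        J = (-y.dot(np.log(h)) - (1 - y).dot(np.log(1 - h))) / m
        if J <= target_cost:
            break
        theta -= alpha * X.T.dot(h - y) / m    # one gradient step
        iters += 1
    return theta, J, iters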
That's everything I have to share about this Python alternative to fminunc. I hope it can serve as a useful reference.