- More web example sentences related to the conditional gradient method (条件梯度法) [Note: this content was collected from the web and is for reference only]
-
On the other hand, an iterative hard-shrinkage algorithm is obtained by means of the generalized conditional gradient method, together with convergence theorems for its solutions and a stopping criterion.
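For orientation, the sketch below shows the classical conditional gradient (Frank-Wolfe) iteration for minimizing a smooth function over an l1 ball; it is only a minimal illustration of the underlying iteration, not the generalized variant or the iterative hard-shrinkage algorithm the abstract refers to, and the names `frank_wolfe_l1` and `grad_f` are made up for the sketch.

```python
import numpy as np

def frank_wolfe_l1(grad_f, x0, radius=1.0, max_iter=100):
    """Classical conditional gradient (Frank-Wolfe) iteration for
    min f(x) subject to ||x||_1 <= radius; grad_f(x) returns the gradient of f."""
    x = np.asarray(x0, float).copy()
    for k in range(max_iter):
        g = grad_f(x)
        # Linear minimization oracle over the l1 ball: a signed coordinate vertex.
        i = np.argmax(np.abs(g))
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (k + 2.0)          # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s
    return x
```

For a least-squares objective f(x) = 0.5*||Ax - b||^2, `grad_f` would simply be `lambda x: A.T @ (A @ x - b)`.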
-
This paper presents a new family of conjugate gradient methods and proves their global convergence under a generalized Wolfe inexact line search.
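To make the setting concrete, here is a minimal Polak-Ribière+ nonlinear conjugate gradient sketch whose step lengths come from `scipy.optimize.line_search`, which enforces Wolfe-type conditions. It is a generic illustration, not the specific family of methods or the generalized Wolfe search studied in the paper, and `nonlinear_cg` is a hypothetical name.

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Polak-Ribiere+ nonlinear conjugate gradient with a Wolfe line search."""
    x = np.asarray(x0, float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]    # step satisfying Wolfe conditions
        if alpha is None:                        # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```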
-
The course covers optimization under both unconstrained and constrained conditions, focusing on methods commonly used in circuit optimization such as the conjugate gradient method, the DFP variable-metric method, Powell's method, the multiplier method, and penalty function methods.
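Of the methods listed, the penalty function approach is the easiest to sketch: the constrained problem is replaced by a sequence of unconstrained ones. The outline below uses a quadratic penalty for equality constraints and `scipy.optimize.minimize` with the CG method for the inner solves; it is a hedged illustration, not course material, and `quadratic_penalty` is a hypothetical helper name.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, constraints, x0, mu0=1.0, growth=10.0, outer=6):
    """Quadratic penalty method for equality constraints c_i(x) = 0:
    solve a sequence of unconstrained problems f(x) + mu/2 * sum c_i(x)^2
    with an increasing penalty parameter mu."""
    x, mu = np.asarray(x0, float), mu0
    for _ in range(outer):
        penalized = lambda z, m=mu: f(z) + 0.5 * m * sum(c(z) ** 2 for c in constraints)
        x = minimize(penalized, x, method="CG").x   # inner solve by nonlinear CG
        mu *= growth
    return x
```

Swapping `method="CG"` for `method="BFGS"` would give a quasi-Newton inner solver in the same spirit as the DFP variable-metric method mentioned above.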
-
This dissertation establishes the global convergence of nonlinear conjugate gradient methods for unconstrained optimization when the step length is generated without a line search, under the strong Wolfe conditions, and under the Goldstein inexact line search.
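For reference, the Goldstein conditions mentioned above can be checked directly. The small helper below is a sketch assuming NumPy arrays, a descent direction d (grad(x) @ d < 0), and a parameter 0 < c < 1/2; the name `satisfies_goldstein` is made up.

```python
def satisfies_goldstein(f, grad, x, d, alpha, c=0.25):
    """Check the Goldstein conditions for a step length alpha along a descent
    direction d, with 0 < c < 1/2:
        f(x) + (1-c)*alpha*slope <= f(x + alpha*d) <= f(x) + c*alpha*slope."""
    slope = grad(x) @ d                  # directional derivative, negative for descent
    fx, f_new = f(x), f(x + alpha * d)
    return fx + (1 - c) * alpha * slope <= f_new <= fx + c * alpha * slope
```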
-
Introducing a line search requires the solution of the subproblem to be a descent direction, i.e. g_k^T δ_k < 0. This paper studies existing methods for solving the subproblem and proves that the dogleg method of Zhang and Xun (1999) and the conjugate gradient method of Steihaug (1983) satisfy the sufficient descent condition, so the resulting solutions can safely be used within a line search.
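Steihaug's truncated conjugate gradient method solves the trust-region subproblem min g^T p + 0.5 p^T B p subject to ||p|| <= delta, stepping to the boundary as soon as negative curvature is detected or an iterate leaves the region. The following is a textbook-style sketch assuming a dense symmetric B, not the exact variant analyzed in the paper; `steihaug_cg` and `_to_boundary` are hypothetical names.

```python
import numpy as np

def steihaug_cg(B, g, delta, tol=1e-8, max_iter=None):
    """Steihaug truncated CG for min g.p + 0.5 p.B.p subject to ||p|| <= delta."""
    n = g.size
    p = np.zeros(n)
    r, d = g.copy(), -g.copy()
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter or 2 * n):
        Bd = B @ d
        dBd = d @ Bd
        if dBd <= 0:                              # negative curvature: go to the boundary
            return _to_boundary(p, d, delta)
        alpha = (r @ r) / dBd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:       # step leaves the region: clip to boundary
            return _to_boundary(p, d, delta)
        r_next = r + alpha * Bd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        p, r = p_next, r_next
    return p

def _to_boundary(p, d, delta):
    """Return p + tau*d with tau >= 0 chosen so that ||p + tau*d|| = delta."""
    a, b, c = d @ d, 2 * (p @ d), p @ p - delta ** 2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p + tau * d
```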
-
Secondly, a step-length selection rule is proposed for non-quadratic objective functions of n variables, and under this rule the gradient method is shown to have a local n-step quadratic convergence rate when the objective satisfies certain conditions.
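The abstract's specific step-length rule is not reproduced here, so the sketch below just shows a plain gradient method with an Armijo backtracking rule as a generic stand-in; `gradient_method` and its parameter defaults are assumptions of the illustration.

```python
import numpy as np

def gradient_method(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4, tol=1e-8, max_iter=500):
    """Plain gradient method with an Armijo backtracking step-length rule."""
    x = np.asarray(x0, float).copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Backtrack until the sufficient decrease condition holds.
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= rho
        x = x - alpha * g
    return x
```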
-
A first-order difference model and a convolutional model of the impedance formula for amplitude-versus-offset (AVO) inversion were established, and the AVO impedance inversion formula was derived within a Bayesian framework. The Huber distribution was used as the prior for the model parameters, with a parameter covariance matrix from well-log data serving as a constraint, and the P-wave impedance, S-wave impedance, and density were then computed with the conjugate gradient method.
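The conjugate gradient solver used in such inversions is typically the standard linear CG applied to a symmetric positive definite system, for example the regularized normal equations of the linearized forward model. A minimal sketch, assuming a dense matrix A and the hypothetical name `conjugate_gradient`:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Standard linear CG for A x = b with A symmetric positive definite."""
    n = b.size
    x = np.zeros(n) if x0 is None else np.asarray(x0, float).copy()
    r = b - A @ x
    d = r.copy()
    rs = r @ r
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rs / (d @ Ad)
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return x
```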
-
By introducing new state variables and using compensation functions, the constrained problem was converted into an unconstrained one. In addition, a dynamic search method for the optimal step-length parameter was developed to modify the conventional gradient method, so that the computational problem of multivariable optimal periodic control could be solved more satisfactorily.
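The paper's dynamic step-length search is not specified here; as one concrete way to locate an optimal step length along a descent direction, a golden-section search over a bracket [a, b] could look like the following (`golden_section_step` is a hypothetical helper, and `phi(t)` would evaluate the objective at x - t*g).

```python
import numpy as np

def golden_section_step(phi, a=0.0, b=1.0, tol=1e-6):
    """Golden-section search for the step length t minimizing phi(t) on [a, b]."""
    invphi = (np.sqrt(5) - 1) / 2
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c                  # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                  # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2
```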
-
A sufficient condition ensuring that the Gauss-Newton method is exactly quadratically convergent is given; on this basis, a new and efficient algorithm is constructed that solves the Gauss-Newton equations with a preconditioned conjugate gradient method.
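A minimal sketch of this combination, assuming a dense Jacobian and a simple Jacobi (diagonal) preconditioner rather than whatever preconditioner the paper constructs: each Gauss-Newton step solves the normal equations (J^T J) dx = -J^T r with preconditioned CG. The names `gauss_newton_pcg` and `pcg` are made up for the sketch.

```python
import numpy as np

def gauss_newton_pcg(residual, jacobian, x0, outer=20, tol=1e-8):
    """Gauss-Newton iteration whose normal equations are solved by Jacobi-preconditioned CG."""
    x = np.asarray(x0, float).copy()
    for _ in range(outer):
        r, J = residual(x), jacobian(x)
        g = J.T @ r
        if np.linalg.norm(g) < tol:
            break
        A = J.T @ J
        M_inv = 1.0 / np.maximum(np.diag(A), 1e-12)   # Jacobi (diagonal) preconditioner
        x = x + pcg(A, -g, M_inv)
    return x

def pcg(A, b, M_inv, tol=1e-10, max_iter=None):
    """Preconditioned conjugate gradient for A x = b with diagonal preconditioner M_inv."""
    n = b.size
    x, r = np.zeros(n), b.copy()
    z = M_inv * r
    d = z.copy()
    rz = r @ z
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rz / (d @ Ad)
        x += alpha * d
        r -= alpha * Ad
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return x
```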
- More web explanations related to the conditional gradient method (条件梯度法) [Note: this content was collected from the web and is for reference only]
-
conditional equation:条件方程
conditional event:条件性事件
conditional gradient method:条件梯度法
conditional inequality:条件不等式
conjugate gradient method:共轭梯度法
conjugate gradient optimization:共轭梯度最优化
multiple search directions conjugate gradient method:多搜索方向共轭梯度方法
nonlinear conjugate gradient method:非线性共轭梯度法
nonmonotone conjugate gradient method:非单调共轭梯度法
polynomial preconditioned conjugate gradient method:多项式预处理共轭梯度法
preconditioned method:预条件方法
preconditioned conjugate gradient method (abbreviated PCG):预条件共轭梯度法
three-term conjugate gradient method:三项共轭梯度法