mathematical-optimization

Optimisation/knapsack algorithm with multiple constraints in JavaScript

Submitted by 限于喜欢 on 2019-12-21 06:40:12
Question: I'm looking for a solution to the knapsack problem where multiple constraints are in place. Say our knapsack has a maximum weight of 30 kg and we have a set of 100 objects, each with a weight and each with a benefit. These objects could look like this: { name: 'water bottle', weight: 2, benefit: 5 }, { name: 'potatoes', weight: 10, benefit: 6 }. Finding the combination of objects with the highest benefit within the maximum weight is simple enough. Here is a Node.js plugin showing how it can be…
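
For the single-constraint version described above, the standard dynamic-programming approach looks roughly like the sketch below (in Python rather than the Node.js the question asks about; the item shape mirrors the objects above, and `max_weight` stands in for the 30 kg capacity). A second constraint, such as volume, would turn the one-dimensional capacity table into a two-dimensional one.

```python
def knapsack(items, max_weight):
    """0/1 knapsack via dynamic programming over integer weights.
    Returns (best_benefit, chosen_item_names)."""
    # best[w] = (benefit, chosen names) achievable with capacity w
    best = [(0, [])] * (max_weight + 1)
    for item in items:
        w, b = item["weight"], item["benefit"]
        # iterate capacities downwards so each item is used at most once
        for cap in range(max_weight, w - 1, -1):
            cand = (best[cap - w][0] + b, best[cap - w][1] + [item["name"]])
            if cand[0] > best[cap][0]:
                best[cap] = cand
    return best[max_weight]

items = [
    {"name": "water bottle", "weight": 2, "benefit": 5},
    {"name": "potatoes", "weight": 10, "benefit": 6},
]
print(knapsack(items, 30))  # -> (11, ['water bottle', 'potatoes'])
```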

Algorithm for finding smallest collection of components

Submitted by 让人想犯罪 __ on 2019-12-21 04:21:33
Question: I'm looking for an algorithm to solve the following problem. I have a number of subsets (1-n) of a given set (a-h). I want to find the smallest collection of subsets that will allow me to construct, by combination, all of the given subsets. This collection can contain subsets that do not exist in 1-n yet. a b c d e f g h 1 1 2 1 1 3 1 1 1 4 1 1 5 1 1 6 1 1 1 1 7 1 1 1 1 8 1 1 1 9 1 1 1 Below are two possible collections, the smallest of which contains seven subsets. I have denoted new subsets…
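
Assuming "combination" means set union, a candidate collection can at least be verified mechanically before worrying about minimality. A minimal Python sketch of such a checker follows; the target subsets and candidate basis are made up, standing in for rows 1-9 above.

```python
def covers(target, collection):
    """Check whether `target` equals the union of some members of `collection`.
    Only members that are subsets of the target can contribute, so the union
    of all such members equals the target iff some sub-collection does."""
    usable = [s for s in collection if s <= target]
    union = frozenset().union(*usable) if usable else frozenset()
    return union == target

def collection_works(targets, collection):
    """A candidate collection is valid if every given subset can be rebuilt."""
    return all(covers(t, collection) for t in targets)

# Invented example over the universe {a..h}.
targets = [frozenset("ab"), frozenset("bc"), frozenset("abc")]
basis = [frozenset("ab"), frozenset("bc"), frozenset("a"), frozenset("c")]
print(collection_works(targets, basis))  # True
```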

Fantasy football linear programming in R with RGLPK

Submitted by 被刻印的时光 ゝ on 2019-12-21 04:12:13
Question: Long-time listener, first-time caller to S.O. I am asking a question that has been asked very similarly before; however, I don't believe I am smart enough to decipher how to implement the solution, for which I apologize. Here is the link to the question I found: Constraints in R Multiple Integer Linear Programming. I am maximizing my projected fantasy points (FPTS_PREDICT_RF), subject to a 50,000 salary cap, while minimizing a 'risk' calculation that I have come up with. Now, the problem lies…
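
A sketch of the lineup-selection core is below, written in Python with SciPy's MILP interface rather than the R/Rglpk setup the question uses; the projections, salaries, cap and roster size are all invented, and scipy.optimize.milp requires a reasonably recent SciPy (1.9+).

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Hypothetical projections and salaries for a handful of players.
points = np.array([25.0, 18.5, 22.1, 15.3, 30.2, 12.7])
salary = np.array([9000, 6500, 8000, 5500, 11000, 4500])
cap = 25000
roster_size = 3

constraints = [
    LinearConstraint(salary, 0, cap),                                  # salary cap
    LinearConstraint(np.ones_like(points), roster_size, roster_size),  # exact roster size
]
res = milp(c=-points,                            # milp minimizes, so negate to maximize
           constraints=constraints,
           integrality=np.ones_like(points),     # integer variables with 0/1 bounds
           bounds=Bounds(0, 1))
picked = np.flatnonzero(res.x > 0.5)
print(picked, -res.fun)                          # selected player indices, total points
```

The 'risk' term the question mentions could be folded into the objective as a weighted penalty, e.g. maximizing points minus lambda times risk, rather than treated as a second objective.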

Wrong result for best fit plane to set of points with scipy.linalg.lstsq?

Submitted by 随声附和 on 2019-12-21 02:51:12
Question: I have a set of (x, y, z) points for which I need to find the plane that best fits them. A plane is defined by its coefficients as: a*x + b*y + c*z + d = 0, or equivalently: A*x + B*y + C = z. The second equation is just a rewrite of the first. I'm using the method developed in this gist, which is a translation to Python of the Matlab code given in this answer. The method finds the coefficients that define the plane equation best fitting the set of points. The issue is that I am able to come…
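
A minimal reproduction of the z = A*x + B*y + C fit with scipy.linalg.lstsq, on synthetic data so the recovered coefficients can be checked against known values:

```python
import numpy as np
from scipy.linalg import lstsq

# Synthetic points roughly on the plane z = 2x - 3y + 5, plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, 100)
y = rng.uniform(-5, 5, 100)
z = 2 * x - 3 * y + 5 + rng.normal(scale=0.1, size=100)

# Solve the overdetermined system [x y 1] [A B C]^T = z in the least-squares sense.
M = np.column_stack((x, y, np.ones_like(x)))
coeffs, residues, rank, sv = lstsq(M, z)
A, B, C = coeffs
print(A, B, C)  # should be close to 2, -3, 5
```

Note that this formulation minimizes vertical distances in z rather than orthogonal distances to the plane, which is one common reason a "best fit" plane looks wrong for steep or noisy point sets.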

QP solver for Java [closed]

Submitted by 烂漫一生 on 2019-12-21 01:58:09
Question: I'm looking for a good, easy-to-use Java-based Quadratic Programming (QP) solver. Googling around I came across ojAlgo (http://ojalgo…

Solving nonlinear equations numerically

Submitted by 主宰稳场 on 2019-12-20 20:38:19
Question: I need to solve nonlinear minimization problems (least residual squares of N unknowns) in my Java program. The usual way to solve these is the Levenberg-Marquardt algorithm. I have a couple of questions. Does anybody have experience with the different LM implementations available? There exist slightly different flavors of LM, and I've heard that the exact implementation of the algorithm has a major effect on its numerical stability. My functions are pretty well-behaved, so this will probably…
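
The question targets Java, but the shape of a Levenberg-Marquardt least-squares problem is easiest to sketch with SciPy's LM driver; the exponential model and data below are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Fit y = a * exp(b * t) to noisy data; the residual vector is what LM minimizes.
t = np.linspace(0, 4, 40)
rng = np.random.default_rng(1)
y = 2.5 * np.exp(0.8 * t) + rng.normal(scale=0.5, size=t.size)

def residuals(params):
    a, b = params
    return a * np.exp(b * t) - y

result = least_squares(residuals, x0=[1.0, 1.0], method="lm")
print(result.x)  # roughly [2.5, 0.8]
```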

Choosing the initial simplex in the Nelder-Mead optimization algorithm

Submitted by 谁都会走 on 2019-12-20 12:36:29
Question: What's the best way to initialize a simplex for use in a Nelder-Mead simplex search from a user's 'guess' vertex? Answer 1: I'm not sure if there is a best way to choose the initial simplex in the Nelder-Mead method, but the following is what is done in common practice. The initial simplex S is constructed by generating n+1 vertices x0, ..., xn around what you call the user's "guess" vertex xin in an N-dimensional space. The most frequent choice is x0 = xin, and the remaining n vertices…
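
A sketch of that construction in Python, using the widely used convention of a 5% relative perturbation per coordinate and a small absolute step for zero coordinates; these step sizes are a common choice, not the only one.

```python
import numpy as np

def initial_simplex(guess, rel_step=0.05, zero_step=0.00025):
    """Build the n+1 vertices of an initial simplex around a guess vertex.

    Vertex 0 is the guess itself; vertex i perturbs coordinate i-1 by a
    relative step (or a small absolute step when that coordinate is zero).
    """
    guess = np.asarray(guess, dtype=float)
    n = guess.size
    simplex = np.tile(guess, (n + 1, 1))
    for i in range(n):
        if simplex[i + 1, i] != 0.0:
            simplex[i + 1, i] *= 1.0 + rel_step
        else:
            simplex[i + 1, i] = zero_step
    return simplex

print(initial_simplex([1.0, 0.0, -2.0]))
```

If you end up using SciPy, such a simplex can be passed to scipy.optimize.minimize(method='Nelder-Mead') via the initial_simplex option instead of letting the solver build its own.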

Maximizing linear objective subject to quadratic constraints

Submitted by 冷暖自知 on 2019-12-20 04:37:05
Question: I have a programming formulation from a paper and want to implement it in a tool for solving specific problems. The authors stated it as a linear programming (LP) instance; however, I am not sure. The formulation is roughly as follows: max x1 + x2 + x3 + ... s.t. x1*x3 + x4*x5 <= 10, x2*x5 + x3*x7 + x1*x9 <= 10, ... I tried to program it through the cplexqcp function (due to the quadratic constraints, although the constraints do not include any x_i^2 terms). However, I receive CPLEX Error 5002: Q in %s is not positive semi…
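
Products of distinct variables make these quadratic, not linear, constraints, and the resulting Q matrices are indefinite, which is what that CPLEX error is complaining about; such bilinear constraints generally need a non-convex QCQP/global solver or a reformulation. A small NumPy check for the first constraint (0-based indices assumed, restricted to x1..x5 for illustration):

```python
import numpy as np

# Quadratic form for the constraint  x1*x3 + x4*x5 <= 10  written as  x^T Q x <= 10.
# Each bilinear term x_i*x_j contributes 1/2 to Q[i, j] and Q[j, i].
n = 5
Q = np.zeros((n, n))
for i, j in [(0, 2), (3, 4)]:          # (x1, x3) and (x4, x5), 0-based
    Q[i, j] = Q[j, i] = 0.5

eigvals = np.linalg.eigvalsh(Q)
print(eigvals)                              # contains negative eigenvalues
print("PSD:", np.all(eigvals >= -1e-12))    # False: the constraint is non-convex
```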

Python scipy.optimize.fmin_l_bfgs_b error occurs

Submitted by ≡放荡痞女 on 2019-12-20 03:17:18
Question: My code implements an active learning algorithm using L-BFGS optimization. I want to optimize four parameters: alpha, beta, w and gamma. However, when I run the code below, I get an error:

optimLogitLBFGS = sp.optimize.fmin_l_bfgs_b(func, x0 = x0, args = (X,Y,Z), fprime = func_grad)
  File "C:\Python27\lib\site-packages\scipy\optimize\lbfgsb.py", line 188, in fmin_l_bfgs_b
    **opts)
  File "C:\Python27\lib\site-packages\scipy\optimize\lbfgsb.py", line 311, in _minimize_lbfgsb
    isave, dsave)…
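
The traceback above is cut off, so the exact cause can't be pinned down; errors surfacing inside _minimize_lbfgsb are often due to func not returning a scalar or fprime returning a gradient whose shape or dtype doesn't match x0. A minimal working pattern for four packed parameters follows, with a toy quadratic loss standing in for the real active-learning objective.

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

# The optimizer expects a flat float64 parameter vector, a scalar loss,
# and a gradient with the same shape as the parameter vector.
def func(params, X, Y, Z):
    alpha, beta, w, gamma = params
    return (alpha - X) ** 2 + (beta - Y) ** 2 + (w - Z) ** 2 + gamma ** 2

def func_grad(params, X, Y, Z):
    alpha, beta, w, gamma = params
    return np.array([2 * (alpha - X), 2 * (beta - Y), 2 * (w - Z), 2 * gamma])

x0 = np.zeros(4)  # packed [alpha, beta, w, gamma]
xopt, fval, info = fmin_l_bfgs_b(func, x0=x0, args=(1.0, 2.0, 3.0), fprime=func_grad)
print(xopt, fval)  # xopt close to [1, 2, 3, 0]
```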