I am trying to solve a rather complicated design problem through optimization. The problem is highly non-linear and has non-linear constraints, but everything involved is somewhat convex, and in order to reach a good minimum of the cost function I wanted to combine a few algorithms.
The setup is: there is a 'main' cost function, which is what I am trying to minimize, and there are the 'constraints', which I am trying to satisfy (drive to zero, or to some target value, within some tolerance).
While all the constraints are normalized, the main cost function is not. Also, some of the constraints vary with the cost function such that their product stays constant, and some vary in a very non-linear way, but all of them are convex, including the cost function itself.
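To make the setup concrete, here is roughly how I would write it down (the symbols are just placeholders for my actual functions):

$$\min_x \; f(x) \quad \text{subject to} \quad |g_i(x) - c_i| \le \varepsilon_i, \qquad i = 1, \dots, m,$$

where $f$ is the un-normalized main cost and the $g_i$ are the normalized constraints, with targets $c_i$ and tolerances $\varepsilon_i$.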
So I want to create a single merged cost function that lumps all of this together, so that I can apply any unconstrained algorithm to it and get the fastest and best results.
I have tried minimizing the sum of squares; however, minimizing squares of terms that vary linearly with each other ends up with them all being equal to the main cost function. For example, if the main cost function is 100, all the constraints end up infeasible by 100, etc.
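For reference, this is roughly the kind of quadratic-penalty formulation I mean, shown on a toy convex problem (the functions `f` and `g`, the penalty weights, and the normalization scheme here are all made up for illustration, not my actual problem):

```python
# Quadratic-penalty sketch on a toy convex problem:
# minimize f(x) = x0^2 + x1^2 subject to g(x) = x0 + x1 - 1 = 0.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # stand-in for the un-normalized main cost (convex)
    return x[0]**2 + x[1]**2

def g(x):
    # stand-in for a single normalized equality constraint, g(x) = 0
    return x[0] + x[1] - 1.0

def penalized(x, mu, f_scale):
    # Rescale the cost so its magnitude is comparable to the
    # (already normalized) constraint, then add the squared penalty.
    return f(x) / f_scale + mu * g(x)**2

x = np.array([0.0, 0.0])
f_scale = 1.0
for mu in [1.0, 10.0, 100.0, 1000.0]:  # gradually tighten the penalty
    res = minimize(penalized, x, args=(mu, f_scale), method="BFGS")
    x = res.x
    f_scale = max(abs(f(x)), 1.0)      # re-estimate the cost's scale

# As mu grows, x approaches the constrained optimum (0.5, 0.5)
# and the constraint violation g(x) shrinks toward zero.
```

With a fixed, finite `mu` the constraint is only approximately satisfied, which is why the weight is increased over a few outer iterations here.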
Lagrange multipliers seem to overcomplicate the solution, since there is a very large number of constraints, which leads to an equally large number of multipliers.
Any ideas what might be a good function to use? Or can anyone point me in a direction that would help me find one?
Sorry for the long question.