Looking for an automatic differentiation trick
I have a minimization problem yout = f(x). I am using an operator overloading routine to compute the Jacobian and Hessian, and everything works fine in small cases.
However, when the length of vector x increases, I run into memory and computing-time issues (no surprise!). I noticed that, in this particular function, most elements of vector x have no impact on yout, and I was able to identify the ones that are relevant. For the sake of this question, let's assume that the first 100 elements of x are relevant for the minimization.
I found no way to teach the AD library to ignore some variables and keep working on others. The limitation is not particular to this toolbox (INTLAB) but lies in the operator overloading mechanism itself. The question is: can anyone suggest a trick to circumvent this limitation? My best result so far has been to compute the derivatives as usual and afterwards extract from the large Jacobian and Hessian the rows and columns of interest. The minimization routine then operates on the small problem, but the overhead of computing a full-size Jacobian and Hessian remains.
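The original setting is MATLAB/INTLAB, but the extract-afterwards workaround described above can be sketched in any language. Below is a minimal Python illustration, with a finite-difference loop standing in for the operator-overloading AD toolbox (the toy function `f`, the vector size `n`, and the index set `relevant` are made up for illustration; in the real problem the first 100 elements would be relevant):

```python
import numpy as np

def f(x):
    # Toy objective: only the first 3 of n elements affect the output
    # (standing in for "the first 100 elements are relevant").
    return np.array([x[0] * x[1], x[1] + x[2] ** 2])

def jacobian_fd(f, x, h=1e-6):
    # Forward finite differences, standing in for the AD toolbox:
    # every one of the n columns is computed, even for irrelevant variables.
    y0 = f(x)
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (f(xp) - y0) / h
    return J

n = 1000                       # large input vector
x = np.ones(n)
relevant = np.arange(3)        # indices known to affect the output

J_full = jacobian_fd(f, x)     # the expensive step: all n columns
J_small = J_full[:, relevant]  # afterwards, keep only the columns of interest
```

The cost of the expensive step still scales with n, which is exactly the overhead the question is about.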
1 Comment
Phil
2024-6-25
I've had the same problem, but the language/compiler came to my rescue. Somehow FC kept moving different parts of memory around. Here is a little intro to the FC program:
FortranCalculus™, Compiler, Alpha version
-----------------
FortranCalculus™ is a (free) calculus-level compiler that simplifies tweaking parameters in one's math model. The FortranCalculus (FC) language is for continuous math modeling, simulation, and optimization. FC is based on Automatic Differentiation (AD) coupled with operator overloading, which reduces the computer code to an absolute minimum, i.e., a mathematical model, constraints, and the objective (function) definition. Minimizing the amount of code lets the user concentrate on the science or engineering problem at hand rather than on the (numerical) process requirements for reaching an optimum solution. Download at https://goal-driven.net/apps/fc-compiler.html
Phil Brubaker
Mathematical Engineer / Electrical Engineer / Author / STEM Speaker
Oregon State University '67 ...
E-mail: <math-coach@goal-driven.net>
Goal: help solve problems like cancer, lupus, atrial fibrillation (Afib), irregular heart beats, and other Continuous math Modeling & Simulation problems.
Answers (1)
Umar
2024-6-25
Hi Carlos,
To circumvent this limitation, one potential strategy is to modify the computation itself: instead of computing derivatives for the entire vector x, differentiate only with respect to the relevant elements (the first 100 in this case). By selectively computing derivatives for the significant variables, you avoid the overhead of carrying the unnecessary ones through the AD machinery.
Additionally, you may consider a sparse representation for the Jacobian and Hessian matrices, so that only the relevant entries are stored and manipulated. This reduces memory usage and improves computational efficiency, especially for large-scale problems.
Furthermore, techniques such as automatic relevance determination (ARD) or feature selection methods can help identify and prioritize the important variables before optimization, streamlining the minimization process.
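The first suggestion above can be implemented with a small wrapper: close over the full vector, let the minimizer and the differentiation routine see only the active subvector, and embed it back before calling f. Here is a Python sketch of that idea, reusing a finite-difference loop as a stand-in for the AD toolbox (the toy function `f`, the size `n`, and the index set `relevant` are hypothetical, as in the question):

```python
import numpy as np

def f(x):
    # Same toy objective: only the first 3 elements matter.
    return np.array([x[0] * x[1], x[1] + x[2] ** 2])

def jacobian_fd(f, x, h=1e-6):
    # Forward finite differences over however many variables it is given.
    y0 = f(x)
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (f(xp) - y0) / h
    return J

n = 1000
x = np.ones(n)
relevant = np.arange(3)   # the active variables (first 100 in the real problem)

def f_reduced(x_small):
    # Embed the small vector of active variables into the full x;
    # the remaining entries stay constant and are never differentiated.
    x_full = x.copy()
    x_full[relevant] = x_small
    return f(x_full)

# The differentiation loop now runs over 3 columns instead of 1000,
# and the result is directly the small Jacobian of interest.
J_small = jacobian_fd(f_reduced, x[relevant])
```

In the INTLAB setting, the analogous move is to initialize only the active subvector as AD variables and treat the rest as plain constants inside the wrapper, so the operator overloading never touches them.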
0 Comments