To use the Adam optimizer in this custom training loop example, you can follow the example on the documentation page for the adamupdate function: https://uk.mathworks.com/help/deeplearning/ref/adamupdate.html.
Note that the adamupdate function has somewhat different required input and output arguments, so you will need to map those differences onto the ODE example you are trying to solve. For example, initialize empty averageGrad and averageSqGrad variables outside the custom training loop so that you can update them at each call to adamupdate. Here is a snippet showing where these quantities would be used.
% Adam moment estimates start empty before the first update
averageGrad = [];
averageSqGrad = [];
...
% Inside the training loop; iteration must increase on every call
[net,averageGrad,averageSqGrad] = adamupdate(net,gradients,averageGrad,averageSqGrad,iteration);
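For context, here is a minimal sketch of how those pieces could fit together in a loop. The names modelLoss, X, T, numEpochs and learnRate are placeholders I am assuming for illustration; substitute whatever your ODE example actually defines.

% Assumed hyperparameters (placeholders, adjust for your problem)
numEpochs = 100;
learnRate = 0.01;

% Adam state must persist across iterations, so initialize it once
averageGrad = [];
averageSqGrad = [];
iteration = 0;

for epoch = 1:numEpochs
    iteration = iteration + 1;

    % Evaluate loss and gradients; modelLoss is a hypothetical function
    % that computes the loss and calls dlgradient on the network parameters
    [loss,gradients] = dlfeval(@modelLoss,net,X,T);

    % Update the network; averageGrad and averageSqGrad carry the Adam
    % moment estimates from one iteration to the next
    [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
        averageGrad,averageSqGrad,iteration,learnRate);
end

Passing learnRate as the sixth argument is optional; if you omit it, adamupdate uses its default learning rate.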