Hey,
As I understand it, the dropout layer randomly sets a fraction of its input elements to zero during training, where that fraction is the value X in dropoutLayer(X).
When you create the layers for an LSTM network, let's say:
numHiddenUnits = 200; % "Number of Neurons"
featureDimension = 2; % "Number of Inputs"
numResponses = 1; %"Number of Outputs"
layers = [ ...
%Input Sequence Layer
sequenceInputLayer(featureDimension)
%LSTM Layers
lstmLayer(numHiddenUnits,'OutputMode','sequence')
%Layer where the output size is reduced to 50
fullyConnectedLayer(50)
%Before the last fully connected layer (whose output size matches numResponses, the number of outputs in the training data), randomly set half of the incoming values to zero during training
dropoutLayer(0.5)
fullyConnectedLayer(numResponses)
regressionLayer];
The dropout layer will randomly set 50% of the activations coming out of the first fullyConnectedLayer to zero at each training iteration (it zeroes activations, not learned parameters); at prediction time the layer simply passes its input through unchanged.
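To make the mechanics concrete, here is a small framework-agnostic sketch of "inverted" dropout in NumPy. The 1/(1-p) rescaling of the surviving activations is the common convention in deep learning frameworks; treat this as an illustration of the idea rather than MATLAB's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                          # dropout probability, as in dropoutLayer(0.5)
activations = np.ones((1, 50))   # stand-in for the fullyConnectedLayer(50) output

# Training: zero each activation independently with probability p,
# then scale the survivors by 1/(1-p) so the expected activation is unchanged
mask = rng.random(activations.shape) >= p
dropped = activations * mask / (1 - p)

# Inference: dropout is a no-op; the input passes through unchanged
inference_out = activations
```

Because each surviving unit here started at 1.0 and is scaled by 1/(1-0.5), every entry of `dropped` is either 0 or 2.0, which is exactly the "randomly knock out units, keep the expected magnitude" behavior described above.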
This is the reference MATLAB provides for understanding dropout, though if you have used Keras you probably won't need it: Srivastava, N., G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov. "Dropout: A Simple Way to Prevent Neural Networks from Overfitting." Journal of Machine Learning Research, Vol. 15, pp. 1929–1958, 2014.