Several factors control the degree of overfitting and how well the network generalizes:
- Number of parameters: adjust the model size based on the gap between training and test scores, while keeping the complexity (non-linearity) of the data in mind. For simpler problems, bring the number of parameters down.
- Dropout: add dropout layers to reduce overfitting by randomly deactivating a fraction of neurons during training.
- Regularization: apply L1 or L2 penalties to the weights (both dropout and weight regularization are shown in the sketch after this list).
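Here is a minimal sketch of how dropout and L1/L2 regularization might look in a Keras model. The layer sizes, input shape, dropout rate, and penalty strengths are illustrative assumptions, not tuned values.

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(20,),
                 kernel_regularizer=regularizers.l2(1e-4)),  # L2 penalty shrinks weights toward zero
    layers.Dropout(0.5),                                     # drop 50% of activations during training
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l1(1e-5)),  # L1 penalty encourages sparse weights
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Dropout is active only during training; at prediction time all neurons are used, so no extra code is needed for inference.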
After you have trained a network, you can reuse its learned weights as the starting point for a related task, typically by replacing the final layer(s) and fine-tuning on the new dataset. This process is termed transfer learning.
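A minimal transfer-learning sketch, assuming TensorFlow/Keras with ImageNet-pretrained MobileNetV2 as the base; the new task's class count of 5 and the dataset names are hypothetical.

```python
import tensorflow as tf

# Load a pretrained feature extractor and freeze its weights.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

# Attach a new classification head for the target task.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(new_train_ds, validation_data=new_val_ds, epochs=5)
```

Freezing the base keeps the pretrained features intact while only the new head is trained; unfreezing some top layers later for fine-tuning is a common follow-up step.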