- LogCosh is similar to MAE, but it is a smoother version: it behaves roughly quadratically (like MSE) for small errors and linearly (like MAE) for large ones, which can allow smoother convergence. It was adapted from https://github.com/tuantle/regression-losses-pytorch
- Note: adding the 1e-12 epsilon is not strictly necessary, but it can act as a small regularizer.
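The notes above can be sketched as follows. This is a minimal illustrative implementation (not the referenced repo's exact code), written with NumPy for self-containment; the function name and the numerically stable identity `log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2)` are choices made here, and `eps` is the optional 1e-12 term mentioned above.

```python
import numpy as np

def log_cosh_loss(pred, target, eps=1e-12):
    # Hypothetical sketch of a log-cosh loss: mean of log(cosh(pred - target)).
    # eps is optional (see note above); it slightly perturbs the error term.
    err = pred - target + eps
    # Stable form avoids overflow in cosh for large errors:
    # log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2)
    a = np.abs(err)
    return float(np.mean(a + np.log1p(np.exp(-2.0 * a)) - np.log(2.0)))
```

For small errors the result closely matches `0.5 * err**2` (MSE-like); for large errors it grows like `|err| - log(2)` (MAE-like), which is where the smoother convergence comes from.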