- relu activation function: maps any negative input value to zero and passes positive values through unchanged
- hyperbolic tangent function / tanh function: maps large positive input values to outputs very close to one and large negative input values to outputs very close to negative one (both functions are illustrated in the first sketch below)
- different activation functions result in differently shaped regression prediction plots or decision boundaries (see the second sketch below)
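A minimal sketch of the two functions, assuming NumPy (which the notes don't name): relu zeroes out negatives, while tanh saturates toward -1 and 1 at the extremes.

```python
import numpy as np

def relu(x):
    # negative inputs become zero; positive inputs pass through unchanged
    return np.maximum(0, x)

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(relu(x))     # [ 0.  0.  0.  1. 10.]
print(np.tanh(x))  # values near -1 for large negative x, near 1 for large positive x
```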
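To see the effect of the activation function on the decision boundary, here is a sketch assuming scikit-learn's `MLPClassifier` on its `make_moons` toy dataset (neither is named in the notes). The same network is trained twice, once per activation; relu tends to produce piecewise-linear (ragged) boundaries, while tanh yields smoother, curved ones.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# small two-class dataset with a curved class boundary
X, y = make_moons(n_samples=100, noise=0.25, random_state=3)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# identical architecture, two different activation functions
for activation in ["relu", "tanh"]:
    mlp = MLPClassifier(solver="lbfgs", activation=activation,
                        hidden_layer_sizes=[10, 10], max_iter=1000,
                        random_state=0).fit(X_train, y_train)
    print(activation, mlp.score(X_test, y_test))
```

Plotting each fitted model's predictions over a grid of points would show the boundary-shape difference directly.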