Generalisation is the end goal for machine learning

Overfitting is the problem where your algorithm fits the training set extremely well but cannot generalise to new data

Dropout

Emerged relatively recently, but works remarkably well https://classroom.udacity.com/courses/ud730/lessons/6379031992/concepts/63923585860923

Dropout works by randomly zeroing out half of your activations during training
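A minimal sketch of this idea in NumPy, using the common "inverted dropout" variant: zero each activation with probability p, then scale the survivors by 1/(1-p) so the expected activation stays the same and nothing needs to change at evaluation time (the function name and p=0.5 default are illustrative assumptions, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, training=True):
    """Inverted dropout: zero each activation with probability p
    and scale the survivors by 1/(1-p) so the expected value is unchanged."""
    if not training:
        # At evaluation time, use all activations unchanged
        return activations
    mask = rng.random(activations.shape) >= p  # True = keep this unit
    return activations * mask / (1.0 - p)

a = np.ones(8)
out = dropout(a, p=0.5)
# With p=0.5, surviving units become 2.0 and the rest become 0.0
```

Because of the 1/(1-p) rescaling, the network sees activations with the same expected magnitude during training and evaluation, which is why no extra scaling step is needed at test time.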


It's one of the most important techniques to have emerged in recent years. If dropout isn't working for you, you probably need a bigger network.