What do you mean by a Dense layer and a Dropout layer in a Keras neural network?
(a) Dense layer
Ans: A Dense layer is a regular, fully connected layer of neurons in a neural network. Each neuron receives input from every neuron in the previous layer, hence the name "dense". The layer performs a matrix-vector multiplication (plus a bias): the values in the weight matrix are trainable parameters that get updated during backpropagation. A Dense layer is also commonly used to change the dimensionality of the data flowing through the network, since the number of units determines the size of the output vector.
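The matrix-vector multiplication described above can be sketched in plain NumPy (an illustrative sketch of what a Dense layer computes, not the actual Keras internals; the shapes are example values):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.standard_normal(4)       # input vector with 4 features
W = rng.standard_normal((4, 8))  # trainable weight matrix (4 inputs -> 8 units)
b = np.zeros(8)                  # trainable bias vector, one per unit

# The core Dense-layer operation: matrix-vector multiplication plus bias.
# In Keras this corresponds roughly to layers.Dense(8) applied to a 4-d input.
y = x @ W + b

print(y.shape)  # the 4-d input has been mapped to an 8-d output
```

Note how the output dimensionality (8) is set by the number of units, independent of the input size: this is what "changing the dimensionality" means in practice.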
(b) Dropout layer
Ans: A Dropout layer is used for regularization: during training it randomly sets a fraction of the dimensions of its input vector to zero. A Dropout layer has no trainable parameters, so nothing in it is updated during training.
Dropout is a technique used to combat overfitting. The layer takes a rate argument between 0 and 1: at each update during training, a fraction rate of the input units is randomly set to 0, which helps prevent the network from relying too heavily on any individual unit. At inference time dropout is disabled and all units are kept.