
What is an LSTM? Explain the different types of gates used in an LSTM.

Ans: Long Short-Term Memory networks, usually just called "LSTMs", are a special kind of RNN capable of learning long-term dependencies. They work remarkably well on a large variety of sequence problems and are now widely used.

LSTMs are explicitly designed to avoid the long-term dependency problem. Remembering information for long periods of time is practically their default behavior, not something they struggle to learn!

Like standard RNNs, LSTMs have a chain-like structure, but the repeating module is different. Instead of a single neural network layer, there are four layers interacting in a very particular way.
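For concreteness, here is a small PyTorch sketch (PyTorch and the layer sizes here are illustrative assumptions, not part of the original answer) showing that an LSTM layer packs exactly four sets of gate weights into its parameter matrices:

```python
import torch
import torch.nn as nn

# An LSTM layer stacks the four gate transformations into one weight matrix:
# for hidden_size=8 and input_size=4, weight_ih_l0 has shape (4*8, 4),
# one block of rows per gate (input, forget, candidate, output).
lstm = nn.LSTM(input_size=4, hidden_size=8, num_layers=1, batch_first=True)
print(lstm.weight_ih_l0.shape)  # torch.Size([32, 4])
print(lstm.weight_hh_l0.shape)  # torch.Size([32, 8])

# Running a batch of sequences returns per-step outputs plus the final (h_n, c_n).
x = torch.randn(2, 5, 4)                # (batch, seq_len, input_size)
out, (h_n, c_n) = lstm(x)
print(out.shape, h_n.shape, c_n.shape)  # (2, 5, 8), (1, 2, 8), (1, 2, 8)
```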

The input gate regulates the write operation, the input modulation gate determines how much new information to write, the forget gate performs the erase/remember operation, and the output gate determines what output to produce from the cell memory. To summarize the working mechanism of the gates: the input gate controls the flow of input data into the memory cell, while the output gate regulates the flow of information from the memory cell to other LSTM blocks.
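The gate interactions can be sketched in a few lines of NumPy (a minimal illustration; the function and variable names are my own, not from the original answer):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts keyed by gate:
    'i' (input), 'f' (forget), 'o' (output), 'g' (input modulation / candidate)."""
    i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])  # input gate: whether to write
    f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])  # forget gate: erase/remember
    o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])  # output gate: what to expose
    g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])  # candidate values to write
    c_t = f * c_prev + i * g       # update the cell memory
    h_t = o * np.tanh(c_t)         # output read out from the cell memory
    return h_t, c_t

# Toy dimensions, just to show the shapes involved.
n_in, n_hid = 4, 8
rng = np.random.default_rng(0)
W = {k: rng.standard_normal((n_hid, n_in)) * 0.1 for k in 'ifog'}
U = {k: rng.standard_normal((n_hid, n_hid)) * 0.1 for k in 'ifog'}
b = {k: np.zeros(n_hid) for k in 'ifog'}

x_t = rng.standard_normal(n_in)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(x_t, h, c, W, U, b)
print(h.shape, c.shape)  # (8,) (8,)
```

The key design point visible here is that the cell state `c_t` is updated additively (scaled by the forget and input gates) rather than being rewritten through a squashing nonlinearity at every step, which is what lets information and gradients persist over long time spans.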
