By Oopsy Oops, 3 years ago
Gated Recurrent Unit

The GRU (Gated Recurrent Unit) is a refinement of the standard RNN (Recurrent Neural Network), introduced by Kyunghyun Cho et al. in 2014. Because a GRU has fewer parameters than an LSTM, it uses less memory and trains faster; an LSTM, however, tends to be more accurate on datasets with longer sequences.
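To make the gating idea concrete, here is a minimal sketch of a single GRU step in NumPy. The class name, weight layout, and random initialisation are illustrative choices, not from the original post; the update rule follows the common convention of interpolating between the previous hidden state and a candidate state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal single-step GRU cell with randomly initialised weights (a sketch)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Weights for the update gate z, reset gate r, and candidate state,
        # stacked row-wise into single matrices.
        self.W = rng.uniform(-s, s, (3 * hidden_size, input_size))
        self.U = rng.uniform(-s, s, (3 * hidden_size, hidden_size))
        self.b = np.zeros(3 * hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, h):
        H = self.hidden_size
        Wx = self.W @ x + self.b   # input contribution to all three parts
        Uh = self.U @ h            # recurrent contribution to all three parts
        z = sigmoid(Wx[:H] + Uh[:H])              # update gate
        r = sigmoid(Wx[H:2 * H] + Uh[H:2 * H])    # reset gate
        h_tilde = np.tanh(Wx[2 * H:] + r * Uh[2 * H:])  # candidate state
        # Interpolate between the old state and the candidate (one common convention).
        return (1 - z) * h + z * h_tilde

# Example usage: one step over a 4-dimensional input with a 3-unit hidden state.
cell = GRUCell(input_size=4, hidden_size=3)
h = cell.step(np.ones(4), np.zeros(3))
print(h.shape)  # (3,)
```

The two gates are what distinguish the GRU from a plain RNN: the reset gate `r` decides how much of the previous state feeds into the candidate, and the update gate `z` decides how much of the state is replaced at each step.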


(Image credit: Exey Panteleev | CC BY 2.0)