Long Short-Term Memory

By Alphonso Bradham

```Note: This page is currently a work in progress```

Long Short-Term Memory (LSTM) refers to a type of recurrent neural network architecture useful for performing classification and regression tasks on long-sequence or time-series data. LSTMs were developed to counter the [vanishing gradient problem], and the key feature of an LSTM network is the inclusion of "cell state" vectors that allow it to keep track of long-range relationships in data that other models would "forget".
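
To make the role of the "cell state" vector concrete, below is a minimal NumPy sketch of a single LSTM time step using the standard gated formulation (forget, input, and output gates). The function and weight names (`lstm_step`, `W_f`, `W_i`, `W_c`, `W_o`) are illustrative conventions, not something defined on this page.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W_f, W_i, W_c, W_o, b_f, b_i, b_c, b_o):
    """One LSTM time step (standard gated formulation; names are illustrative).

    x      : current input vector
    h_prev : hidden state from the previous step
    c_prev : cell state from the previous step (the long-range memory)
    """
    z = np.concatenate([h_prev, x])       # combined input seen by every gate

    f = sigmoid(W_f @ z + b_f)            # forget gate: what to erase from the cell state
    i = sigmoid(W_i @ z + b_i)            # input gate: how much new information to write
    c_tilde = np.tanh(W_c @ z + b_c)      # candidate values to write
    c = f * c_prev + i * c_tilde          # updated cell state (additive, so information can persist)

    o = sigmoid(W_o @ z + b_o)            # output gate: how much of the cell state to expose
    h = o * np.tanh(c)                    # new hidden state

    return h, c

# Tiny usage example with random weights (illustrative only)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
z_dim = n_in + n_hid
W_f, W_i, W_c, W_o = (rng.normal(size=(n_hid, z_dim)) for _ in range(4))
b_f, b_i, b_c, b_o = (np.zeros(n_hid) for _ in range(4))

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):      # step through a short input sequence
    h, c = lstm_step(x, h, c, W_f, W_i, W_c, W_o, b_f, b_i, b_c, b_o)
```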

Motivation

LSTMs were developed to counter the vanishing gradient problem that arises when standard recurrent neural networks are trained on long sequences. Because the error signal is propagated backward through a repeated multiplication at every time step, it tends to shrink toward zero (or blow up), making it difficult for the network to learn relationships between inputs that are far apart in time.
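
As a rough illustration of why this matters (a toy example, not taken from this page), backpropagation through time multiplies the gradient by a recurrent derivative at each step; when that factor is below one, the contribution of early inputs fades exponentially:

```python
# Toy illustration of the vanishing gradient problem (assumed example).
# Backpropagating through T time steps multiplies the gradient by the recurrent
# derivative at each step; a factor below 1 makes the signal decay exponentially.
recurrent_derivative = 0.9
gradient = 1.0
for t in range(1, 101):
    gradient *= recurrent_derivative
    if t in (10, 50, 100):
        print(f"gradient after {t:3d} steps: {gradient:.2e}")
# After 100 steps the gradient is ~2.7e-05, far too small to drive learning about
# inputs that far back, which is the limitation the LSTM cell state addresses.
```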