
Backward gradient flow is multiplied with this factor. Here, the term z ∈ (0, 1) helps in passing unimpeded gradient flow and makes the computations more stable. Furthermore, since the additive terms depend heavily on (1 − z), the overall multiplicative factor tends to be closer to 1 even when z is small. Concretely, differentiating a single component of the update h_t = z · h_{t−1} + (1 − z) · h̃_t (with h̃_t denoting the candidate state) gives

    ∂h_t/∂h_{t−1} = z + (1 − z) · ∂h̃_t/∂h_{t−1} + (h_{t−1} − h̃_t) · ∂z/∂h_{t−1}

The first term provides a direct copy path for the gradient, and the remaining additive terms are modulated by (1 − z), since ∂z/∂h_{t−1} itself carries a factor of z(1 − z) from the sigmoid derivative. Another point is that the value of z and the multiplicative factor ∂h_t/∂h_{t−1} are different for each time stamp, which tends to reduce the propensity for vanishing or exploding gradients.

Although the GRU is a closely related simplification of the LSTM, it should not be seen as a special case of it. A comparison of the LSTM and the GRU is provided in [71,228]. The two models are shown to be roughly similar in performance, and the relative performance seems to depend on the task at hand. The GRU is simpler and enjoys the advantage of greater ease of implementation and efficiency. It might generalize slightly better with less data because of its smaller parameter footprint [71], although the LSTM would be preferable with an increased amount of data. The work in [228] also discusses several practical implementation issues associated with the LSTM. The LSTM has been more extensively tested than the GRU, simply because it is an older architecture that enjoys widespread popularity. As a result, it is generally seen as the safer option, particularly when working with longer sequences and larger data sets. The work in [160] also showed that none of the variants of the LSTM can reliably outperform it in a consistent way, because of the LSTM's explicit internal memory and its greater gate-centric control over state updates.

7.7 Applications of Recurrent Neural Networks

Recurrent neural networks have numerous machine learning applications, such as information retrieval, speech recognition, and handwriting recognition. Text data forms the predominant setting for applications of RNNs, although there are several applications in computational biology as well. Most applications of RNNs fall into one of two categories:

1. Conditional language modeling: When the output of a recurrent network is a language model, one can enhance it with context in order to produce an output that is relevant to that context. In most of these cases, the context is the neural output of another network. To provide one example, in image captioning the context is the neural representation of an image produced by a convolutional network, and the language model provides a caption for the image. In machine translation, the context is the representation of a sentence in the source language (produced by another RNN), and the language model in the target language provides a translation. A minimal sketch of this conditioning pattern is given after this list.

2. Leveraging token-specific outputs: The outputs at the different tokens can be used to learn properties other than a language model. For example, the labels output at different time stamps might correspond to properties of the tokens (such as their parts of speech). In handwriting recognition, the labels might correspond to the characters. In some cases, not every time stamp has an output; instead, the end-of-sentence marker outputs a label for the entire sentence. This approach is referred to as sentence-level classification, and it is often used in sentiment analysis. A sketch of per-token labeling also appears after this list.
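To make the first category concrete, the following is a minimal NumPy sketch of the conditioning pattern. It is not taken from this text: the function name, the weight matrices W_xh, W_hh, W_hy, the embedding matrix E, and the greedy decoding loop are all illustrative assumptions. The context vector produced by an encoder (a CNN for captioning, another RNN for translation) initializes the decoder's hidden state, after which the decoder behaves as an ordinary language model.

```python
import numpy as np

def decode_with_context(context, E, W_xh, W_hh, W_hy, start_id, steps):
    """Greedy generation from a context-conditioned RNN language model.

    Illustrative sketch only. context is the encoder output, e.g., a
    CNN's image encoding (captioning) or the final state of a
    source-language RNN (translation). E is a (vocab_size, d)
    token-embedding matrix.
    """
    h = np.tanh(context)                  # context initializes the state
    token, output = start_id, []
    for _ in range(steps):
        x = E[token]                      # embed previously emitted token
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state update
        scores = W_hy @ h                 # unnormalized next-token scores
        token = int(np.argmax(scores))    # greedy choice of next token
        output.append(token)
    return output

# Tiny self-contained demo with random (untrained) weights:
rng = np.random.default_rng(0)
caption = decode_with_context(
    context=rng.standard_normal(16),      # stand-in image encoding
    E=rng.standard_normal((100, 8)),      # vocabulary of 100 tokens
    W_xh=rng.standard_normal((16, 8)),
    W_hh=rng.standard_normal((16, 16)),
    W_hy=rng.standard_normal((100, 16)),
    start_id=0, steps=5)
```

A trained system would turn the scores into probabilities with a softmax and learn the weights by backpropagation through time; the random weights and greedy argmax above only keep the sketch self-contained.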
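The second category can be sketched in the same hypothetical style: the recurrent core is unchanged, but an output is read off at every time stamp rather than only at the end of the sequence.

```python
import numpy as np

def tag_tokens(xs, W_xh, W_hh, W_hy, sentence_level=False):
    """Emit a label at each time stamp of a simple RNN (sketch only).

    xs is a sequence of token vectors. By default one label id is
    produced per token (e.g., part-of-speech tags); sentence_level=True
    keeps only the output at the final time stamp, corresponding to
    sentence-level classification as used in sentiment analysis.
    """
    h = np.zeros(W_hh.shape[0])               # initial hidden state
    labels = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h)      # shared recurrent update
        labels.append(int(np.argmax(W_hy @ h)))  # label for this token
    return labels[-1] if sentence_level else labels
```

The two sketches differ only in how the per-time-stamp outputs are consumed, which is precisely the distinction drawn between the two application categories above.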

