This requires bringing together a number of different technologies including recurrent neural networks, web applications, templating, HTML, CSS, and of course Python.
The leader of this revolution has been the recurrent neural network, and particularly its manifestation as the LSTM.
While convolutional neural networks are typically used for processing images, recurrent neural networks (RNNs) are used for processing language.
For a recurrent neural network, a signal might propagate through a layer more than once, so the CAP depth is potentially unlimited!
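To make that unlimited-depth point concrete, here is a minimal sketch (NumPy assumed; the layer sizes and the 10-step sequence are made up for illustration) showing that the same recurrent weights are applied once per time step, so unrolling over a longer sequence gives an arbitrarily deep credit assignment path.

import numpy as np

rng = np.random.default_rng(0)
W_x = rng.normal(size=(4, 3))   # input-to-hidden weights (hypothetical sizes)
W_h = rng.normal(size=(4, 4))   # hidden-to-hidden weights, reused at every step

def rnn_forward(inputs):
    """Run the same recurrent layer once per time step."""
    h = np.zeros(4)
    for x in inputs:                    # one pass through the layer per step
        h = np.tanh(W_x @ x + W_h @ h)  # the signal re-enters the same layer
    return h

sequence = rng.normal(size=(10, 3))     # 10 time steps -> an effective depth of 10
print(rnn_forward(sequence))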
After fully-connected and convolutional networks, you should have a look at recurrent neural networks.
We're exploring this very specific direction having to do with deep neural networks, recurrent neural networks, and other kinds of machine learning.
I will describe the foundations of deep learning for natural language processing: word vectors, recurrent neural networks, and tasks and models influenced by linguistics.
We asked a data scientist, Neelabh Pant, to tell you about his experience of forecasting exchange rates using recurrent neural networks.
While the results are not always entirely on point, they do show that the recurrent neural network has learned the basics of English.
Sequence models: hidden Markov models, recurrent neural networks (RNNs), and long short-term memory networks (LSTMs).
One future direction is to introduce a recurrent neural network architecture at the Manager level.
RNN Modeling Capability: Recurrent neural networks (RNNs) are used for speech recognition, time series prediction, image captioning, and other tasks that require processing sequential information.
In recurrent neural networks, updating across fewer prior time steps during training, called truncated backpropagation through time, may reduce the exploding gradient problem.
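A minimal sketch of that idea, assuming PyTorch (the window length, layer sizes, and dummy loss are illustrative, not from the source): the long sequence is processed in short windows and the hidden state is detached between them, so gradients never flow back across more than one window.

import torch
import torch.nn as nn

rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
head = nn.Linear(16, 8)
optimizer = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.01)

data = torch.randn(1, 100, 8)   # one long sequence (dummy data)
window = 20                     # truncation length: backprop through at most 20 steps
hidden = None

for start in range(0, data.size(1), window):
    chunk = data[:, start:start + window]
    output, hidden = rnn(chunk, hidden)
    loss = head(output).pow(2).mean()   # placeholder loss for illustration
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    hidden = hidden.detach()    # cut the graph: no gradient beyond this window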
So recurrent neural networks (RNNs) would be the best choice to handle them, since they remember the previous result: the prior word, in our case.
A popular approach for language modeling is recurrent neural networks (RNNs), as they capture dependencies between words well, especially when using modules such as LSTMs.
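As a rough illustration of that approach, here is a minimal sketch, assuming PyTorch (the class name, vocabulary size, and layer sizes are made up): previous words are embedded, passed through an LSTM, and the hidden state at each position is projected to a distribution over the next word.

import torch
import torch.nn as nn

class TinyLSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        # token_ids: (batch, sequence_length) of word indices
        hidden_states, _ = self.lstm(self.embed(token_ids))
        return self.out(hidden_states)   # next-word logits at every position

model = TinyLSTMLanguageModel()
tokens = torch.randint(0, 1000, (2, 12))   # two dummy sequences of 12 words
print(model(tokens).shape)                 # torch.Size([2, 12, 1000])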
Recurrent neural networks have one problem, though.
A simple recurrent neural network works well only for short-term memory.
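One way to see that limitation, as a purely illustrative numeric sketch (the weight and derivative values below are made up): in a simple RNN, the influence of an early input is multiplied by the recurrent weight and the activation derivative at every step, so it shrinks exponentially with distance.

w_h = 0.5                   # hypothetical recurrent weight (|w_h| < 1)
influence = 1.0
for step in range(20):
    influence *= w_h * 0.9  # 0.9 stands in for a typical tanh derivative
print(influence)            # about 1e-7 after 20 steps: the early word is "forgotten"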
Our next step is bidirectional recurrent neural networks (BRNNs).
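A minimal sketch of one, assuming PyTorch (the sizes and batch are arbitrary): the layer reads the sequence in both directions and concatenates the forward and backward hidden states at every position.

import torch
import torch.nn as nn

birnn = nn.LSTM(input_size=8, hidden_size=16, batch_first=True, bidirectional=True)

x = torch.randn(4, 25, 8)   # batch of 4 sequences, 25 steps, 8 features each
output, _ = birnn(x)
print(output.shape)         # torch.Size([4, 25, 32]): 16 forward + 16 backward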
It incorporates variational inference and utilizes hypernetworks as recurrent neural network cells.