Conversational agents like Siri, Alexa, and Cortana rely on LSTMs and RNNs to power their speech recognition.
LSTMs are very powerful for sequence prediction problems because they are able to store past information.
LSTMs can capture the most important features of time-series data and model their dependencies.
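As a minimal sketch of this idea (assuming Keras/TensorFlow and a toy univariate series; the window length and layer sizes are illustrative, not from the original post), a small LSTM can be trained to predict the next value of a sequence from a sliding window of past values:

```python
import numpy as np
from tensorflow import keras

# Toy univariate series: a noisy sine wave (illustrative data only).
t = np.arange(0, 100, 0.1)
series = np.sin(t) + 0.1 * np.random.randn(len(t))

# Build sliding windows: use the previous 20 values to predict the next one.
window = 20
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, features)

# A single LSTM layer keeps past information in its cell state,
# followed by a dense layer that outputs the next-step prediction.
model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(window, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Predict the value following the last observed window.
next_value = model.predict(X[-1:])
```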
An LSTM can remove or add information to the cell state, carefully regulated by structures called gates.
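To make the gating concrete, here is a NumPy sketch of a single LSTM time step using the standard formulation (the weight names and dictionary layout are illustrative): the forget gate decides what to erase from the cell state, the input gate decides what new information to write, and the output gate decides what to expose as the hidden state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (standard equations; parameter layout is illustrative).

    W, U, b hold the weights and biases for the forget (f), input (i),
    candidate (g), and output (o) computations.
    """
    f = sigmoid(W["f"] @ x + U["f"] @ h_prev + b["f"])   # forget gate: what to remove
    i = sigmoid(W["i"] @ x + U["i"] @ h_prev + b["i"])   # input gate: what to add
    g = np.tanh(W["g"] @ x + U["g"] @ h_prev + b["g"])   # candidate cell content
    o = sigmoid(W["o"] @ x + U["o"] @ h_prev + b["o"])   # output gate: what to expose
    c = f * c_prev + i * g        # updated cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c
```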
LSTMs have been shown to learn complex sequences, such as writing like Shakespeare or composing primitive music.
LSTMs can be quite confusing at first, but if you are interested in learning more, this post has an excellent explanation.
These inherent properties make LSTMs an ideal candidate for anomaly detection tasks involving time-series, non-linear numeric streams of data.
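One common way to exploit this (a sketch of a forecasting-based detector, not necessarily the exact approach described here; it reuses the illustrative `model`, `X`, and `y` from the earlier sketch) is to train the LSTM to predict the next value of the stream and flag points whose prediction error is unusually large:

```python
import numpy as np

# Assumes `model`, `X`, and `y` from the forecasting sketch above (illustrative names).
preds = model.predict(X, verbose=0).ravel()
errors = np.abs(preds - y)

# Flag points whose prediction error exceeds mean + 3 standard deviations
# of the errors observed on (assumed mostly clean) training data.
threshold = errors.mean() + 3 * errors.std()
anomalies = np.where(errors > threshold)[0]
print(f"{len(anomalies)} anomalous time steps detected")
```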
CNN LSTMs were developed for visual time-series prediction problems and for generating textual descriptions from sequences of images (e.g. videos).
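As a rough illustration of this architecture (assuming Keras; the frame size, vocabulary size, and layer widths are placeholders), a CNN is applied to each frame via TimeDistributed, and the resulting per-frame features are fed to an LSTM:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder dimensions: 16 frames of 64x64 RGB, a 5000-word caption vocabulary.
num_frames, height, width, channels = 16, 64, 64, 3
vocab_size = 5000

model = keras.Sequential([
    # The CNN runs independently on every frame of the sequence.
    layers.TimeDistributed(
        layers.Conv2D(32, (3, 3), activation="relu"),
        input_shape=(num_frames, height, width, channels),
    ),
    layers.TimeDistributed(layers.MaxPooling2D((2, 2))),
    layers.TimeDistributed(layers.Flatten()),
    # The LSTM models how the per-frame features evolve over time.
    layers.LSTM(128),
    # A softmax over the vocabulary, e.g. to predict the next caption word.
    layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```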