What is the translation of "语言建模" in English?

language modeling
language modelling
language model

Examples of using 语言建模 in Chinese and their translations into English

它们通过声学和语言建模使用算法。
They use algorithms through acoustic and language modeling.
针对机器翻译和语言建模的Transformer.
Transformer for machine translation and language modeling.
词嵌入:word2vec的目标是简化语言建模
Word embeddings: The objective of word2vec is a simplification of language modelling.
用于机器翻译和语言建模的Transformer.
Transformer for machine translation and language modeling.
然而,要想从理论上更好地理解为什么语言建模似乎在迁移学习中如此有效,还需要进行更多的研究。
Still, much more research is necessary to gain a better theoretical understanding of why language modeling seems to work so well for transfer learning.
语言建模真的是更有趣的自然语言问题的一个子任务,特别是那些在其它输入条件下的语言模型。
Language modeling is really a subtask of more interesting natural language problems, specifically those that condition the language model on some other input.
这激励了模型去学习更多有意义的“词对”表征,而不是更通用的目标,比如语言建模
This encourages the model to learn more meaningful representations of word pairs than with more general objectives, such as language modelling.
语言建模(LM)在给定前一个单词的情况下尝试预测下一个单词。
Language modeling (LM) aims to predict the next word given the previous word.
这对带有大量输出的任务尤其有用,比如语言建模(Melis et al., 2017)。
This is useful particularly for tasks with a large number of outputs, such as language modelling (Melis et al., 2017).
在评估语言建模任务时,Transformer-XL的速度比vanilla Transformer快1800多倍,因为不需要重新计算(见Faster Evaluation部分).
Transformer-XL is up to 1,800+ times faster than a vanilla Transformer during evaluation on language modeling tasks, because no re-computation is needed (see figures above).
这鼓励模型学习更有意义的词对表示,而不是更一般的目标,如语言建模
This encourages the model to learn more meaningful representations of word pairs than with more general objectives, such as language modelling.
GPT-2在Winograd Schema、LAMBADA和其他语言建模任务中达到了当前最佳性能。
GPT-2 achieves state-of-the-art on Winograd Schema, LAMBADA, and other language modeling tasks.
Transformer网络具有学习更长期依赖性的潜力,但这种潜力往往会受到语言建模中上下文长度固定的限制。
Transformers have the potential to learn longer-term dependencies, but are limited by a fixed-length context in the setting of language modeling.
这种方法已经被用来在一组有趣的语言建模问题上生成文本,例如:
This approach has been used to generate text on a suite of interesting language modeling problems, such as:
为此,DLR对LINQ表达式树进行了扩展,以便包括控制流、工作分配以及其他语言建模节点。
For this purpose, the DLR has extended LINQ expression trees to include control flow, assignment, and other language-modeling nodes.
不过,Nvidia提出了一种允许多个gpu并行处理语言建模任务的方法。
Nvidia, though, has come up with a way to allow multiple GPUs to work on the language modeling task in parallel.
凯伦有意将这些女孩放在一起,以利用同龄人的支持和语言建模
Karen has purposely put these girls together to take advantage of peer support and language modeling.
它在语言建模和文档分类等预测问题上取得了很大的成功。
It has been used with great success on prediction problems like language modeling and document classification.
除了语言建模的学术兴趣,它是许多深度学习自然语言处理架构的关键组成部分。
In addition to the academic interest in language modeling, it is a key component of many deep learning natural language processing architectures.
Lukasz也给了一些LSTM应用于语言建模和句子压缩方面的例子。
Lukasz also gave some examples of applications of LSTMs in language modelling and sentence compression.
下面是深度学习语言建模(仅有)的一个例子:
Below is an example of deep learning for language modeling (only):
模型的灵活性使得我们能够将其应用于各种任务,如问答、语言建模
The flexibility of the model allows us to apply it to tasks as diverse as (synthetic) question answering and language modeling.
反过来讲,这意味着近年来NLP的许多重要进展都可以归结为某些形式的语言建模
This conversely means that many of the most important recent advances in NLP reduce to a form of language modelling.
将RNN用于任何序列建模任务,尤其是文本分类,机器翻译和语言建模
Use RNNs for any sequence modelling task, especially text classification, machine translation, and language modelling.
虽然NEG可能因此对于学习词嵌入非常有用,但是它不能保证渐近一致性(asymptoticconsistency),这使得它不适合语言建模
While NEG may thus be useful for learning word embeddings, its lack of asymptotic consistency guarantees makes it inappropriate for language modelling.
在Facebook的这项工作中,他们确定了三个步骤--词到词的翻译(word-by-word initialization)、语言建模和反向翻译--作为无监督机器翻译的重要原则。
In our research, we identified three steps - word-by-word initialization, language modeling, and back translation - as important principles for unsupervised MT.
深度神经网络(DNNs: Deep Neural Networks)已经在大量应用中取得了巨大进展,这些应用包括图像分类、翻译、语言建模以及视频字幕等。
Deep Neural Networks (DNNs) have facilitated tremendous progress across a range of applications, including image classification, translation, language modeling, and video captioning.
因此RNN似乎并没有以正确的偏置对语言建模,这在实践中可能会导致统计效率低下还有泛化能力弱这样的问题。
RNNs thus don't seem to have the right bias for modeling language, which in practice can lead to statistical inefficiency and poor generalization behaviour.
这使词汇量变小,提高了许多语言建模任务的准确性。
This keeps the vocabulary small and improves the accuracy of many language modeling tasks.
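
For readers new to the term, here is a minimal, hypothetical Python sketch (not taken from any of the quoted sources) of the idea in the example above that language modeling predicts the next word given the previous word. The toy corpus, counts, and function name are illustrative assumptions, using a simple bigram count model rather than a neural network.

```python
# A minimal sketch of next-word prediction with a bigram count model.
# The corpus and names below are made up for illustration only.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

# Count how often each word follows each preceding word.
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def predict_next(prev_word):
    """Return the most frequent next word after prev_word, or None if unseen."""
    counts = bigram_counts.get(prev_word)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("the"))  # -> "cat", the most frequent continuation of "the"
```

Real language models generalize this idea by conditioning on much longer contexts with neural networks, as several of the examples above (Transformer, Transformer-XL, GPT-2) illustrate.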