What is the translation of "dropout" in English?

Noun
dropout
辍学 (drop out of school)
退学 (withdraw from school)
辍学率 (dropout rate)
辍学生 (a dropout; a student who has dropped out)

Examples of using Dropout in Chinese and their translations into English

这就是所谓的dropout。
It's called dropout.
使用dropout,这个过程就改了。
With dropout, this process is modified.
Dropout的想法在本质上是过于简单化的。
The idea of dropout is simplistic in nature.
图形变换的应用,如dropout
Application of graph transformations, such as dropout.
Hulu没有说“The Dropout”什么时候会首次亮相。
Hulu did not say when "The Dropout" would debut.
一个名为The Dropout的ABC播客,一个由SNL的Kate McKinnon主演的声望限量系列,刚刚宣布;
An ABC podcast called The Dropout, a prestige limited series starring SNL's Kate McKinnon, was just announced;
他还给出了dropout的有趣解释。
He also gave an interesting intuitive explanation for dropout.
如果没有dropout,我们的网络会表现出大量的过拟合。
Without dropout, our network exhibits substantial overfitting.
过量的数据增强,加上其它形式的正则化(权重L2,dropout操作,等等)可能会导致网络欠拟合。
Too much data augmentation, combined with other forms of regularization (L2 weight penalty, dropout, etc.), can cause the net to underfit.
在VGG的情况下,dropout rate被设置为0.5。
In the case of VGG, the dropout rate is set to 0.5.
本文将说明,Dropout是一个带有固定参数(np,np(1-p))的二项随机变量的特例。
We will show that Dropout is a special case of a binomial random variable with fixed parameters (np, np(1 - p)).
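For reference, a textbook probability fact not stated on this page: if X ~ Binomial(n, p), then E[X] = np and Var(X) = np(1 - p), which is where the parameter pair (np, np(1 - p)) quoted above comes from.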
也许再多增加0.00000001的dropout会有所帮助,看起来我们的训练集有点过拟合了。
Maybe adding 0.00000001 more dropout will help; it seems like we are fitting our training set a little too well here.
其中之一是Dropout层,由Geoffrey Hinton两年前在一篇开创性的论文中提出。
One of these is the Dropout layer, proposed by Geoffrey Hinton two years ago in a seminal paper.
总而言之,不管是否将dropout应用于图层,在Keras中,权重总是具有正确的比例。
To summarize, regardless of whether you apply dropout to a layer, in Keras the weights will always have the correct scale.
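A minimal sketch of why that holds, assuming TensorFlow's Keras API (the model, shapes, and variable names are illustrative, not from the quoted sentence): Keras implements inverted dropout, scaling the surviving units up during training, so the stored weights never need rescaling at inference.

```python
import numpy as np
import tensorflow as tf

# Illustrative toy model: a hidden layer followed by Dropout with rate 0.5.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

x = np.random.rand(4, 10).astype("float32")

# training=True: units are zeroed at random and the survivors are scaled
# up by 1 / (1 - rate), so the expected activation is unchanged.
y_train = model(x, training=True)

# training=False (inference): Dropout acts as an identity op, and the
# weights are already at the correct scale, with no manual adjustment.
y_infer = model(x, training=False)
```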
例如,您可以修剪决策树,在神经网络上使用dropout,或者在回归中向代价函数添加一个惩罚参数。
For example, you could prune a decision tree, use dropout on a neural network, or add a penalty parameter to the cost function in regression.
现在,我们对Dropout有了一个直观的概念,接下来让我们深入地分析它。
Now that we have an intuitive idea of Dropout, let's analyze it in depth.
有正规化技术,如丢失数据(dropout),可以强制它以更好的方式学习,但过拟合也有更深的根源。
There are regularisation techniques like dropout that can force it to learn in a better way, but overfitting also has deeper roots.
注意那些被dropout的神经元,即那些临时性删除的神经元,在图中用虚圈表示:
Note that the dropout neurons, i.e., the neurons which have been temporarily deleted, are shown ghosted in the figure:
我们现在已经目睹了一些进步,像Dropout、超收敛和迁移学习,所有这些使得训练变得更容易。
We have already seen some of this with advances like dropout, super-convergence, and transfer learning, all of which make training easier.
如上图所示,dropout可以应用于隐藏层以及输入层。
As seen in the image above, dropout can be applied to both the hidden layers as well as the input layers.
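A hedged illustration of that placement, again assuming the Keras API (the layer sizes are made up for the example): a Dropout layer right after the input drops input features, while one between Dense layers drops hidden units.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dropout(0.2),                   # dropout on the input layer
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),                   # dropout on a hidden layer
    tf.keras.layers.Dense(10),
])
```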
这似乎相当大,鉴于ImageNet分类的复杂性,想要使我们的dropout rate这么高似乎是合理的。
This seems quite large, and given the complexity of classification for ImageNet, it seems reasonable to want to make our dropout rate this high.
相反,单独使用Dropout方法不能防止参数值在训练阶段变得过大。
Dropout alone, instead, does not have any way to prevent parameter values from becoming too large during the training phase.
在本节,我简要地给出三种减轻过匹配的其他的方法:L1规范化、dropout和人工增加训练样本。
In this section I briefly describe three other approaches to reducing overfitting: L1 regularization, dropout, and artificially increasing the training set size.
他还解释了dropout是L2正则化的自适应形式,两种方法效果相近。
He also explains that dropout is nothing more than an adaptive form of L2 regularization and that both methods have similar effects.
Dropout层最初是通过在CNN中的使用而流行起来的,但后来被应用到其他层,包括输入嵌入或循环网络。
Dropout layers first gained popularity through their use in CNNs, but have since been applied to other layers, including input embeddings or recurrent networks.
还能使用多种高级层比如Dropout或Batch正则化,以及自适应学习率技术比如Adadelta和Adam。
Several advanced layers such as dropout or batch normalization are also available, as well as adaptive learning rate techniques such as Adadelta and Adam.
Dropout模拟来自给定层的稀疏激活,有趣的是,这反过来又鼓励网络实际学习稀疏表示作为副作用。
Dropout simulates a sparse activation from a given layer, which interestingly, in turn, encourages the network to actually learn a sparse representation as a side-effect.
我们将使用Lasagne实现许多不同结构的神经网络架构,也会谈到诸如data augmentation, dropout, the importance of momentum和预训练这些技术。
We will use Lasagne to implement a couple of network architectures, and talk about data augmentation, dropout, the importance of momentum, and pre-training.
Inverted Dropout方法应该和别的规范化参数的技术一起使用,从而帮助简化学习率的选择过程。
Inverted Dropout should be used together with other normalization techniques that constrain the parameter values, in order to simplify the learning rate selection procedure.
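A minimal NumPy sketch of the inverted dropout idea itself (the function name and the keep_prob parameter are illustrative, not from the quoted sentence): survivors are rescaled by 1/keep_prob at training time, so test-time activations can be used unchanged.

```python
import numpy as np

def inverted_dropout(activations, keep_prob=0.5, rng=None):
    """Zero each unit with probability 1 - keep_prob, then rescale the
    survivors by 1 / keep_prob so the expected activation is unchanged."""
    rng = rng or np.random.default_rng()
    mask = (rng.random(activations.shape) < keep_prob) / keep_prob
    return activations * mask  # at test time, use the activations as-is

print(inverted_dropout(np.ones((2, 4)), keep_prob=0.8))  # survivors become 1.25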

Synonyms for Dropout

辍学 (drop out of school), 退学 (withdraw from school)
