BATCH NORMALIZATION 中文是什么意思 - 中文翻译

[bætʃ ˌnɔːməlai'zeiʃn]
batch normalization
批归一化
批量标准化

在英语中使用 Batch normalization 的示例及其中文翻译

Batch normalization has a slight regularization effect.
BatchNorm有轻微的正则化效果.
And that also means we can use higher learning rates during training when using Batch Normalization.
这也意味着当使用Batch Normalization时,我们可以在训练中使用更高的学习率。
Batch normalization allows for faster training.
批归一化可以加快训练速度。
This is where we discovered, by removing batch normalization, that the network was quickly outputting NaN after one or two iterations.
我们在这里发现,删除批归一化层后,网络在一两次迭代之后就会输出NaN。
Batch normalization is a technique for making neural networks easier to train.
Batch normalization是一种让神经网络更易于训练的技术。
Based on VGG16 but modified to take account of the small dataset and reduce overfitting (probably dropout and batch normalization).
基于VGG16网络,但考虑到数据集较小并为减少过拟合(可能使用了dropout和批量标准化)而进行了修改。
We used Batch Normalization before the activation.
我们在激活层之前使用了Batch Normalization。
I will thus present different variants of gradient descent algorithms, dropout, batch normalization and unsupervised pretraining.
因此我将讲解不同类型的梯度下降算法、dropout、batch normalization和无监督预训练。
Batch Normalization is an effective method when training a neural network model.
Batch Normalization是训练神经网络模型的一种有效方法。
These four parameters (mean, variance, gamma, and beta) are what the batch normalization layer learns as the network is trained.
这四个参数(mean、variance、gamma、beta)是批标准化层在网络训练过程中学习得到的。
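To make the role of these four parameters concrete, here is a minimal, framework-agnostic sketch of a batch-norm forward pass in NumPy. It is illustrative only: the momentum and eps values are arbitrary placeholders, gamma and beta are the parameters updated by gradient descent, while the mean and variance are tracked as running statistics rather than learned by backpropagation.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, running_mean, running_var,
                      momentum=0.9, eps=1e-5, training=True):
    """Illustrative batch-norm forward pass; x has shape (batch, features)."""
    if training:
        mean = x.mean(axis=0)   # per-feature mean of the current mini-batch
        var = x.var(axis=0)     # per-feature variance of the current mini-batch
        # moving averages of the batch statistics, used later at inference time
        running_mean = momentum * running_mean + (1 - momentum) * mean
        running_var = momentum * running_var + (1 - momentum) * var
    else:
        mean, var = running_mean, running_var  # use the stored statistics
    x_hat = (x - mean) / np.sqrt(var + eps)    # normalize to zero mean, unit variance
    y = gamma * x_hat + beta                   # learned scale (gamma) and shift (beta)
    return y, running_mean, running_var
```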
Batch Normalization also acts as a form of regularization that helps to minimize overfitting.
Batch Normalization也是一种正则化形式,有助于最小化过拟合。
Finally, we have considered other strategies to improve SGD such as shuffling and curriculum learning, batch normalization, and early stopping.
最后,我们研究了提升SGD的其他策略,比如洗牌和课程学习方法、批量归一化和early stopping。
For example, in this Batch Normalization diagram, the emphasis is on the backward pass.
例如,在批归一化(Batch Normalization)的图中,重点是反向传播过程。
Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, Batch normalization, and gradient checking.
能够高效地使用神经网络通用的技巧,包括初始化、L2和dropout正则化、Batch归一化、梯度检验。
Batch normalization: accelerating deep network training by reducing internal covariate shift.
Batch Normalization:通过减少内部协变量偏移加速深度网络训练。
Finally, we have considered other strategies to improve SGD such as shuffling and curriculum learning, batch normalization, and early stopping.
最后,我们考虑了用于提升SGD性能的其他策略,例如shuffling与curriculum learning、batch normalization以及early stopping。
Batch Normalization allows us to use much higher learning rates and be less careful about initialization.
BN允许我们使用更高的学习率,并且不用太过关心初始化。
Ioffe and Szegedy [2015] S. Ioffe and C. Szegedy, “Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift.”
批归一化是在2015年由Ioffe和Szegedy在论文“Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift”中首次提出的。
Batch normalization: accelerating deep network training by reducing internal covariate shift.
Batch Normalization算法:通过减少内部协变量偏移加速深度网络的训练。
Finally, we have considered other strategies to improve SGD such as shuffling and curriculum learning, batch normalization, and early stopping.
最后,介绍了一些提高SGD性能的其它优化建议,如:训练集随机洗牌与课程学习(shuffling and curriculum learning)、batch normalization与early stopping。
Batch normalization makes hyperparameter tuning easier and makes the neural network more robust.
Batch Normalization使超参数调优更加容易,也使得神经网络鲁棒性更好。
Several advanced layers such as dropout or batch normalization are also available as well as adaptive learning rates techniques such as Adadelta and Adam.
还能使用多种高级层,比如dropout或批归一化,以及自适应学习率技术,比如Adadelta和Adam。
By batch normalization, these outlier activations are reduced and hence higher learning rates can be used to accelerate the learning process.
通过批量标准化,这些异常激活会减少,因此可以使用更高的学习率来加速学习过程。
Revisiting batch normalization, [33] describes covariate shift as a change in the distribution of model inputs.
再来看批归一化(batch normalization),[33]将协变量偏移(covariate shift)描述为模型输入分布的变化。
Batch normalization can help us avoid the phenomenon that the value of x falls into saturation after going through non-linear activation functions.
批归一化可帮助我们避免x的值在经过非线性激活函数之后陷入饱和的现象。
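A small numerical illustration of this point (the values below are arbitrary, not taken from any source): pre-activations far from zero push a sigmoid into its flat, saturated region, while the batch-normalized values stay in the range where the function still responds.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([6.0, 7.0, 8.0, 9.0])                # pre-activations far from zero
x_hat = (x - x.mean()) / np.sqrt(x.var() + 1e-5)  # batch-normalized values

print(sigmoid(x))      # ~[0.998 0.999 1.000 1.000] -> saturated, gradients vanish
print(sigmoid(x_hat))  # ~[0.208 0.390 0.610 0.792] -> in the responsive region
```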
Batch Normalization (BatchNorm) is a widely adopted technique that enables faster and more stable training of deep neural networks (DNNs).
批归一化(BatchNorm)是一种广泛采用的技术,用于更快速、更稳定地训练深度神经网络(DNN)。
Abstract: Batch Normalization (BN) is a milestone technique in the development of deep learning, enabling various networks to train.
批归一化(Batch Normalization)是深度学习发展中的一项里程碑技术,它让各种网络都能够进行训练。
Batch normalization can help us avoid the phenomenon that the value of x falls into saturation after going through non-linear activation functions.
Batch Normalization可以避免x值经过非线性激活函数后趋于饱和的现象。
Batch normalization usually happens after the convolutional layer but before the activation function gets applied (a so-called “leaky” ReLU in the case of YOLO).
批量归一化通常位于卷积层之后、激活函数(在YOLO中称为“泄露”ReLU)生效之前。
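As a rough sketch of that ordering (convolution, then batch normalization, then the leaky activation), such a block could be assembled as follows; the channel counts, kernel size, and negative slope are placeholder values, not settings from any particular YOLO configuration.

```python
import torch.nn as nn

# Conv -> BatchNorm -> LeakyReLU ordering; the conv bias is disabled because
# batch norm's beta parameter already supplies a per-channel shift.
conv_bn_block = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(num_features=16),
    nn.LeakyReLU(negative_slope=0.1),
)
```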
