What does 交叉熵 mean in English - English translation

Noun

cross-entropy
交叉熵

cross entropy
交叉熵

Examples of using 交叉熵 in Chinese and their translations into English

交叉熵损失函数使用梯度下降进行优化。
Cross-Entropy Loss functions are optimized using Gradient Descent.
在分类树中我们使用交叉熵和基尼指数。
In classification trees, we use cross-entropy and Gini index.
这里采用了交叉熵(cross-entropy)来作为cost function。
Here, cross-entropy is adopted as the cost function.
Logistic回归:模型,交叉熵损失,类概率估计。
Logistic regression: model, cross-entropy loss, class probability estimation.
在分类树中,我们使用交叉熵和Gini指数。
In classification trees, we use cross-entropy and Gini index.
Combinations with other parts of speech
Used with verbs
然后我们实现交叉熵函数:
Then we implement the cross-entropy function:
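The page does not reproduce the code this sentence refers to; a minimal sketch of such an implementation (function and variable names are my own) could be:

```python
import math

def cross_entropy(p, q):
    """Discrete cross-entropy H(p, q) = -sum_x p(x) * log(q(x)).

    p is the true distribution and q the predicted one; terms with
    p(x) = 0 contribute nothing, so they are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# One-hot true label vs. a softmax-style prediction:
p = [0.0, 1.0, 0.0]
q = [0.1, 0.7, 0.2]
print(cross_entropy(p, q))  # equals -log(0.7)
```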
也称为交叉熵损失。
Also called the cross entropy loss.
如果我们处理分类结果,我们可能会使用交叉熵。
If we were dealing with a classification outcome, we might use cross-entropy.
对于分类任务,通常使用交叉熵损失函数。
For classification tasks, the cross-entropy loss function is commonly used.
现在,您可以以安全的方式计算交叉熵:
And now you can compute your cross-entropy in a safe way:
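The "safe" code the sentence points to is not shown here; one common approach (a sketch with an assumed clipping threshold, not the original code) is to clip predicted probabilities away from zero before taking the logarithm:

```python
import math

def safe_cross_entropy(p, q, eps=1e-12):
    # Clip each predicted probability into [eps, 1] so that
    # math.log never receives 0 and never returns -inf.
    return -sum(pi * math.log(min(max(qi, eps), 1.0))
                for pi, qi in zip(p, q))

# A prediction of exactly 0 for the true class no longer crashes;
# it just yields a large but finite loss.
print(safe_cross_entropy([1.0, 0.0], [0.0, 1.0]))
```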
交叉熵的公式定义。
Formal definition of the cross-entropy.
交叉熵定义为:
The cross-entropy is defined as:
为什么交叉熵是分类问题中合适的距离度量?
Why is cross-entropy the right distance to use for classification problems?
交叉熵误差函数,其定义如下:
The cross-entropy error function is defined as follows:
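The formal definition these sentences allude to is not reproduced on this page; in standard textbook notation, for a true distribution $p$ and a predicted distribution $q$ over outcomes $x$, it reads:

```latex
H(p, q) = -\sum_{x} p(x) \log q(x)
```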
大致说来,想法就是:交叉熵是对惊讶的测度。
Roughly speaking, the idea is that the cross-entropy is a measure of surprise.
我们引入了交叉熵基准测试来获得复杂多比特动力学的实验保真度。
We introduce cross-entropy benchmarking to obtain the experimental fidelity of complex multiqubit dynamics.
交叉熵作为损失函数,在分类问题中被广泛应用。
Cross-entropy, as a loss function, is widely used in classification problems.
熵,交叉熵和KL-散度经常用于机器学习,特别是用于训练分类器。
Entropy, cross-entropy and KL-divergence are often used in machine learning, in particular for training classifiers.
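The relationship between these three quantities can be checked numerically. This is a self-contained sketch (function names and the two example distributions are mine), using the standard identity H(p, q) = H(p) + KL(p || q):

```python
import math

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Cross-entropy decomposes into entropy plus KL divergence.
print(cross_entropy(p, q))
print(entropy(p) + kl_divergence(p, q))  # same value
```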
交叉熵是一个用来比较两个概率分布p和q的数学工具。
Cross entropy is a mathematical tool for comparing two probability distributions p and q.
交叉熵是用来衡量我们的预测用于描述真相的低效性。
The cross-entropy is measuring how inefficient our predictions are for describing the truth.
我们也选择二元交叉熵作为损失(因为我们处理二分类),并以准确率作为评估指标。
We also choose binary cross-entropy as the loss (because we are dealing with binary classification) and accuracy as our evaluation metric.
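Binary cross-entropy specializes the general formula to two classes. A minimal sketch (names are mine, not from any framework):

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean of -[y*log(p) + (1-y)*log(1-p)] over all examples.

    y_true holds 0/1 labels, y_pred the predicted probability of
    class 1; eps keeps the logs finite at p = 0 or p = 1.
    """
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1.0 - p))
    return total / len(y_true)

print(binary_cross_entropy([1, 0], [0.9, 0.1]))  # equals -log(0.9)
```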
在本章中我们主要使用交叉熵代价函数来解决学习速度衰退的问题。
In this chapter we will mostly use the cross-entropy cost to address the problem of learning slowdown.
在训练循环中,使用该代码在训练数据上计算精度和交叉熵(例如每10次迭代):
The accuracy and cross-entropy are computed on the training data using this code in the training loop (every 10 iterations, for example):
为了实现交叉熵,我们需要先添加一个新的占位符来输入正确答案:
To implement cross-entropy, we first need to add a new placeholder to input the correct answers:
二分类问题:对数损失(也称为交叉熵)"binary_crossentropy"。
Binary classification (2 classes): logarithmic loss, also called cross-entropy or 'binary_crossentropy'.
在这副图片当中,交叉熵被表示为一个具有两个权重的函数。
In this picture, cross-entropy is represented as a function of 2 weights.
损失函数(loss function,此处为「交叉熵」)的选择稍后会做出解释。
The choice of the loss function (here, "cross-entropy") is explained later.
由于真实分布是未知的,我们不能直接计算交叉熵。
Since the true distribution is unknown, cross-entropy cannot be directly calculated.
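What one can do instead is estimate the cross-entropy from samples drawn from the true distribution. A sketch (the distributions and sample size below are made up for illustration):

```python
import math
import random

random.seed(0)

# True distribution p over {0, 1}; in practice it is unknown
# and we only get to draw samples from it.
p = [0.5, 0.5]
q = [0.25, 0.75]          # the model's predicted distribution

samples = random.choices([0, 1], weights=p, k=100_000)

# Monte-Carlo estimate: H(p, q) ~= -(1/N) * sum_i log q(x_i)
estimate = -sum(math.log(q[x]) for x in samples) / len(samples)

exact = -sum(pi * math.log(qi) for pi, qi in zip(p, q))
print(round(estimate, 3), round(exact, 3))  # close to each other
```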
