What is the translation of " AN ACTIVATION FUNCTION " in Chinese?

[æn ˌækti'veiʃn 'fʌŋkʃn]
激活函数

Examples of using An activation function in English and their translations into Chinese

Why an Activation Function?
为什么要用激活函数？
Here we are using the sigmoid function as an activation function:
这里我们就用sigmoid函数作为我们的激活函数：
Why an Activation Function?
为什么会有这么多激活函数？
You still need to add a bias and feed the result through an activation function.
您仍然需要添加偏置，并将结果输入一个激活函数。
Each neuron has an Activation Function.
每个神经元都有一个激活函数
Like any neuron, these take a weighted average of their inputs and then apply an activation function.
像任何神经元一样，这些神经元取其输入的加权平均值，然后应用一个激活函数。
What is an activation function, and why use one?
什么是激活函数，为什么需要它？
In most cases, a sigmoid function is used as an activation function.
在多数情况下，采用Sigmoid函数作为激活函数。
We also need to pick an activation function for our hidden layer.
我们还需要为隐藏层挑选一个激活函数
In many applications the units of these networks apply a sigmoid function as an activation function.
在许多应用中,这些网络的单元将sigmoid函数用作激活函数
Another important feature of an activation function is that it should be differentiable.
激活函数的另一个重要特征是：它应该是可微的。
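The example above notes that an activation function should be differentiable, because gradient-based training (backpropagation) needs its derivative. As an illustrative sketch (not part of the source examples), the sigmoid's derivative has the closed form σ'(x) = σ(x)(1 − σ(x)), which we can check against a finite difference:

```python
import math

def sigmoid(x):
    # Maps any real number into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Closed-form derivative: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# Compare against a central finite difference at x = 1.0
h = 1e-6
numeric = (sigmoid(1.0 + h) - sigmoid(1.0 - h)) / (2 * h)
print(abs(numeric - sigmoid_grad(1.0)) < 1e-6)  # → True
```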
Then the output V from the node in consideration can be calculated as below (f is an activation function such as sigmoid):
涉及到的节点的输出V可以按如下方式计算（f是类似Sigmoid的激活函数）：
Neurons use an activation function to “standardize” the data coming out of the neuron (output).
神经元对数据应用一个激活函数来“标准化”神经元输出的数据。
The result of this transfer function would then be fed into an activation function to produce a labeling.
这个传递函数的结果将被输入到一个激活函数中以产生标记。
Next, it applies an activation function, which is a function that's applied to this particular neuron.
接下来,它应用激活函数,该函数是作用于该特定神经元的函数。
Then the output V from the node in consideration can be calculated as below (f is an activation function such as sigmoid):
那么考虑节点的输出V可以计算如下（f是一个激活函数，如Sigmoid函数）：
During the process, neurons use an activation function to “standardize” the data coming out of the neuron (output).
神经元在数据上应用一个激活函数来“标准化”神经元的输出。
Each neuron takes a weighted average of its inputs, adds a bias value,and then applies an activation function.
每个神经元为输入值加权求平均，加上一个偏置值，然后应用一个激活函数。
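Several of the examples above describe the same computation: a neuron takes a weighted average of its inputs, adds a bias value, and applies an activation function. A minimal Python sketch of that step (the function names are ours, not from the source):

```python
import math

def sigmoid(x):
    # A common choice of activation function
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights, bias):
    # Weighted sum of the inputs, plus a bias, fed through the activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

print(neuron_output([1.0, 2.0], [0.5, -0.25], 0.1))  # ≈ 0.525
```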
For a classification problem, an activation function that works well is softmax.
对于一个分类问题,一个很好的激活函数是softmax。
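The example above recommends softmax as an output activation for classification: it turns raw scores (logits) into probabilities that sum to 1. A sketch of it, numerically stabilized by subtracting the maximum logit (this implementation is ours, not from the source):

```python
import math

def softmax(logits):
    # Subtract the max logit so exp() cannot overflow; the result is unchanged
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)       # three probabilities; the largest logit gets the largest share
print(sum(probs))  # sums to 1.0 up to floating-point rounding
```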
Like any neuron, this one takes a weighted average of these 363 input values and then applies an activation function.
和任何神经元一样,这个神经元取363个输入值的加权平均值,然后应用一个激活函数
If we do not apply an activation function, the output signal would simply be a linear function.
如果我们不运用激活函数的话,则输出信号将仅仅是一个简单的线性函数。
We will use the Sigmoid function, which draws a characteristic “S”-shaped curve, as an activation function for the neural network.
我们将使用Sigmoid函数,它绘制出一个“S”形曲线,将其作为本文创建的神经网络的激活函数
It is called an activation function because it governs the threshold at which the neuron is activated and the strength of the output signal.
它被称为激活函数,是因为它控制神经元激活的阈值和输出信号的强度。
After all of the feature columns and weights are multiplied, an activation function is called that determines whether the neuron is activated.
在所有特征列和权重相乘之后,调用激活函数来确定神经元是否被激活。
Using an activation function on the final layer can sometimes mean that your network cannot produce the full range of required values.
在最后一层使用激活函数有时意味着你的网络无法输出所需值的全部范围。
Where W1 is the matrix of input-to-hidden-layer weights, σ is an activation function, and W2 is the matrix of hidden-to-output-layer weights.
其中W1是输入向量到隐藏节点层的权重矩阵,σ是激活函数,W2是隐藏节点层到输出向量的权重矩阵。
Without an activation function, every neural network, no matter how complex, would be reducible to a linear combination of its inputs.
没有了激活函数,无论多复杂的神经网络都可以简化为它的输入的线性组合。
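The last example is easy to verify directly: composing two linear (affine) layers with no activation in between yields another affine map, since W2(W1·x + b1) + b2 = (W2·W1)·x + (W2·b1 + b2). A small pure-Python check with 2×2 matrices (the helper functions are ours):

```python
def matvec(M, v):
    # Matrix-vector product for a list-of-lists matrix
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def matmul(A, B):
    # Matrix-matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def vecadd(a, b):
    return [x + y for x, y in zip(a, b)]

W1, b1 = [[1.0, 2.0], [0.0, 1.0]], [0.5, -0.5]
W2, b2 = [[2.0, -1.0], [1.0, 3.0]], [0.0, 1.0]
x = [3.0, 4.0]

# Two stacked linear layers with no activation in between...
two_layers = vecadd(matvec(W2, vecadd(matvec(W1, x), b1)), b2)

# ...equal a single linear layer with collapsed weights and bias
W, b = matmul(W2, W1), vecadd(matvec(W2, b1), b2)
one_layer = vecadd(matvec(W, x), b)

print(two_layers == one_layer)  # → True
```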
This means you can use an activation function such as MPSCNNNeuronLinear on its own, as if it were a separate layer.
这意味着你可以单独使用一个激活函数,如MPSCNNNeuronLinear,就像它是一个单独的层级一样。
An activation function, or transfer function, applies a transformation to weighted input data (matrix multiplication between input data and weights).
激活函数或传递函数会对加权后的输入数据(对输入数据与权重执行矩阵乘法运算)进行转换。
Each CEC uses the identity function as its activation function f, and has a connection to itself with a fixed weight of 1.0.
每个CEC使用恒等函数作为激活函数f，并有一个与其自身的连接，其固定权重为1.0。