What is the translation of "ACTIVATION FUNCTIONS" in Chinese?

[ˌækti'veiʃn 'fʌŋkʃnz]
激活函数

Examples of using Activation functions in English and their translations into Chinese

What are activation functions and why are they required?
什么是激活函数,为什么需要它们?
Different layers may have different activation functions.
不同的层可能拥有不同的激活函数
The activation functions were rectified linear units in every hidden layer.
每个隐藏层的激活函数均为修正线性单元。
That's why most experts stick to using non-linear activation functions.
这就是为什么大多数专家坚持使用非线性激活函数
ReLUs are often used as activation functions in Deep Neural Networks.
ReLU常在深度神经网络中被用作激活函数
Many activation functions have been proposed, but for now we will describe two in detail: sigmoid and ReLU.
人们提出了许多种激活函数,但是我们现在只详细描述两种:sigmoid和ReLU。
Fundamentals of Deep Learning - Activation Functions and When to Use Them?
深度学习的基础知识-激活函数以及何时使用它们?
Non-linear activation functions make it possible for neural networks to approximate any mathematical function.
非线性激活函数使神经网络能够逼近任何数学函数。
Without these activation functions, neural networks would not be able to learn very interesting things.
没有这些激活函数,神经网络将无法学习非常有趣的事情。
Doesn't mean they don't have their uses, but most FFNNs with other activation functions don't get their own name.
这并不意味着它们没有用处,但大部分带有其它激活函数的FFNN都没有自己的专用名称。
There can be other activation functions like Tanh, softmax and RELU.
还有其他激活函数,如:Tanh、softmax和RELU。
The sequential action network is a fully connected network with leaky ReLU activation functions.
序列动作网络(sequential action network)是一个带有leaky ReLU激活函数的全连接网络。
There can be other activation functions like Tanh, softmax and RELU.
可以试试其他激活函数,例如softmax、tanh或者relu。
Doesn't mean they don't have their uses, but most FFNNs with other activation functions don't get their own name.
这不是说没有相关的应用,但大多数以其它函数作为激活函数的FFNNs都没有它们自己的名字。
There have been 5 major activation functions tried to date: step, sigmoid, tanh, and ReLU.
迄今为止,人们尝试过的主要激活函数有5种:step、sigmoid、tanh和ReLU。
In order to improve the training effect of the deep neural networks, the neuron connection methods and activation functions have been adjusted.
为了提高深层神经网络的训练效果,人们对神经元的连接方式和激活函数进行了相应的调整。
This is why non-linear activation functions are so important in deep learning.
这就是为什么非线性激励函数在深度学习中如此重要。
In order to improve the training effect of the deep neural networks, the neuron connection methods and activation functions have been adjusted.
为了提高深层神经网络的训练效果,人们对神经元的连接方法和激活函数等方面做出相应的调整。
Most commonly used activation functions in Deep Learning.
浅谈深度学习中的激活函数 - The Activation Function in Deep Learning.
Activation functions are important components of the network because they introduce non-linearity to the system.
激活函数是网络非常重要的组成部分,因为它们将非线性引入了系统。
Because of this, most neural networks use non-linear activation functions like the logistic, tanh, binary or rectifier.
正因为如此,大多数神经网络使用非线性的激活函数,例如Logistic函数、tanh函数、binary函数或者rectifier函数。
Activation functions are important elements of the network architecture since they introduce non-linearity to the system.
激活函数是网络体系结构非常重要的组成部分,因为它们将非线性引入了系统。
You would think this would be as straightforward as APIs go, but strangely enough BNNS has a different way of defining these activation functions than MPSCNN.
你会以为这和一般的API一样直截了当,但奇怪的是,BNNS定义这些激活函数的方式与MPSCNN不同。
We will also explore some key activation functions that are used in conjunction with matrix multiplication.
本章也会探究某些与矩阵乘法配合使用的关键激活函数。
Other activation functions you will see are the logistic (often called the sigmoid), tanh, and softmax functions.
其它常见的激活函数还有对数几率函数(又称sigmoid)、tanh和softmax。
These assumptions appear everywhere in deep learning literature, from weight initialization, to activation functions, to the optimization algorithms which train the network.
这些假设在深度学习文献中随处可见,从权重初始化,到激活函数,再到训练网络的优化算法。
However, only nonlinear activation functions allow such networks to compute nontrivial problems using only a small number of nodes.
然而,只有非线性激活函数才允许这种网络仅使用少量节点来计算非平凡问题。
There are many other activation functions that may be used on the output layer and the specifics of your problem may add confusion.
还有许多其他激活函数可以用在输出层上,而您的问题的具体情况可能会增加选择时的困惑。