What is the translation of "OVERFITTING" in Chinese?

Noun
过度拟合
过拟合
过拟
过拟合问题
过匹配

Examples of using Overfitting in English and their translations into Chinese

Dropout was added to reduce overfitting.
还使用了Dropout来减少过拟合
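To make the dropout examples concrete, here is a minimal NumPy sketch of inverted dropout (the variant used by most modern frameworks). The function name `dropout` and the fixed seed are assumptions for illustration, not any particular library's API.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p and
    scale the survivors by 1/(1-p) so the expected value is unchanged.
    At inference time (training=False) the layer is the identity."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng(0)  # fixed seed only for the demo
    mask = rng.random(x.shape) >= p        # keep with probability 1-p
    return x * mask / (1.0 - p)

acts = np.ones((4, 8))
train_out = dropout(acts, p=0.5)           # roughly half the entries zeroed
infer_out = dropout(acts, p=0.5, training=False)
```

With `p=0.5`, every surviving activation is doubled, which is why the training-time output contains only the values 0.0 and 2.0 here.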
This is called overfitting, and we will explore it later.
这称为过拟合,我们稍后会加以探讨。
How do you ensure your models are not overfitting?
如何确保你的模型没有过拟合？
Worse yet, you might be overfitting your validation set.
更糟糕的是,你可能会过拟合你的验证集。
Without dropout, our network exhibits substantial overfitting.
没有dropout,我们的网络表现出大量的过拟合
However, overfitting is a serious problem in such networks.
但是,过拟合是这类网络的一个严重问题。
Too many features or too complex a model can lead to overfitting.
特征太多或模型过于复杂都可能导致过拟合
Avoiding overfitting can single-handedly improve our model's performance.
仅仅避免过拟合这一项就能提高我们模型的性能。
Use max_depth to control the size of the tree to prevent overfitting.
使用max_depth来控制树的大小防止过拟合
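The `max_depth` example above can be sketched with scikit-learn, assuming it is available. The dataset here is a synthetic toy problem; the point is only that an unrestricted tree memorizes the training set while a depth-limited one generalizes better.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# An unrestricted tree keeps splitting until every training point is fit.
deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Capping the depth limits the number of splits and thus model complexity.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("deep:   train", deep.score(X_tr, y_tr), "test", deep.score(X_te, y_te))
print("shallow: train", shallow.score(X_tr, y_tr), "test", shallow.score(X_te, y_te))
```

The unrestricted tree scores perfectly on its own training data, a typical symptom of overfitting; the depth-3 tree trades some training accuracy for a simpler decision boundary.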
You will be able to discuss overfitting in the context of decision tree models.
您将能够在决策树模型中讨论过拟合
Training tips and tricks: overfitting, dropout, learning rate decay….
训练提示和技巧:过拟合、dropout、学习速率衰减等….
In general, data augmentation is always a good idea to reduce overfitting.
一般来说,数据集增广总是减少过度拟合的好主意。
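As a minimal sketch of what "data augmentation" means in practice, the hypothetical helper below doubles a batch of images by adding horizontal flips, one of the simplest augmentations for vision tasks. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def augment_flips(images, labels):
    """Hypothetical helper: return the batch plus horizontally
    flipped copies, with labels duplicated to match."""
    flipped = images[:, :, ::-1]  # reverse the width axis of each image
    return (np.concatenate([images, flipped]),
            np.concatenate([labels, labels]))

batch = np.arange(2 * 4 * 4, dtype=float).reshape(2, 4, 4)
aug_x, aug_y = augment_flips(batch, np.array([0, 1]))
# aug_x now holds twice as many samples as the original batch
```

Because the flipped copies are plausible inputs with the same labels, the model sees more variety per epoch without any new data being collected.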
To prevent overfitting, the best solution is to use more training data.
为了防止发生过拟合,最好的解决方案是使用更多训练数据。
That helps prevent a common machine-learning problem in health care: overfitting.
这有助于防止医疗保健中常见的机器学习问题:过度拟合
The other way to avoid overfitting in decision trees is to grow the tree to its.
在决策树中避免过度拟合的另一个方法是将树生长到其.
It does raise important topics such as overfitting and regularization.
这确实引出了过拟合（Overfitting）和正则化（Regularization）这样的重要议题。
Overfitting: It is one of the most practical difficulties for decision tree models.
过拟合(Overfitting):过拟合是决策树模型最实际的困难之一。
The simplest way to reduce overfitting is to increase the size of the training data.
减少过拟合的最简单方法是增加训练数据的大小。
This problem is called overfitting: it's like memorizing the answers instead of understanding how to solve a problem.
这个问题被称为过拟合：就像程序记住了答案，而不是理解如何解决问题一样。
High variance results in overfitting: learning errors as if they were relevant information.
高方差导致过度拟合(overfitting),将错误作为相关信息进行学习。
This in turn leads to overfitting because these co-adaptations do not generalize to unseen data.
这反过来又会导致过拟合,因为这些相互适应并没有泛化到看不见的数据。
As a quick recap, I explained what overfitting is and why it is a common problem in neural networks.
简单回顾下上述内容,我解释了什么是过拟合以及为什么它是神经网络当中常见的问题。
This helps in significantly reducing overfitting, while furnishing major improvements over other regularization methods.
这有助于显著减少过拟合，并且相对于其他正则化方法有重大改进。
Earlier we mentioned that overfitting is a result of our network having learned too much of the specifics of our training set.
之前我们曾经提到,过度拟合是我们的网络学习了太多训练集的细节的结果。
Cross-validation can help to combat overfitting, for example by using it to choose the best size of decision tree to learn.
交叉验证可以帮助对抗过拟合，例如用它来选择所要学习的决策树的最佳规模。
When training on small data sets, challenges include overfitting, difficulties in handling outliers, and differences in the data distribution between training and test.
在小型数据集上训练时，难点包括过拟合、异常值处理困难，以及训练数据和测试数据之间分布的差异。
