
alexanderwang · October 3, 2024

I don't understand why B is incorrect.

NO.PZ2023040502000057

Question:

Assuming a Classification and Regression Tree (CART) model is used, which of the following is most likely to result in model overfitting?

Options:

A. Using the k-fold cross validation method.

B. Including an overfitting penalty (i.e., a regularization term).

C. Using a fitting curve to select a model with low bias error and high variance error.

Explanation:

C is correct. A fitting curve shows the trade-off between bias error and variance error for various potential models. A model with low bias error and high variance error is, by definition, overfitted.

A is incorrect. One of the two common methods of reducing overfitting is proper data sampling and cross-validation; k-fold cross validation estimates out-of-sample error directly from the error observed in the validation samples.

B is incorrect. The other common method of reducing overfitting is preventing the algorithm from becoming too complex during selection and training, which requires estimating an overfitting penalty.
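The logic behind answer A can be sketched in a few lines of Python. This is only an illustration, not part of the curriculum question: a memorizing 1-nearest-neighbour model stands in for an overfit tree, and the toy data, fold count, and random seeds are all assumptions of the sketch.

```python
# Minimal sketch of k-fold cross-validation as an out-of-sample error
# estimate. A 1-nearest-neighbour model memorizes its training data
# (zero in-sample error) but the validation folds expose the overfit.
import random

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k disjoint validation folds."""
    idx = list(range(n))
    random.Random(0).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def one_nn_predict(train_x, train_y, x):
    """Predict the label of the closest training point (overfits easily)."""
    j = min(range(len(train_x)), key=lambda i: abs(train_x[i] - x))
    return train_y[j]

def cv_error(xs, ys, k=5):
    """Average validation error across k folds: an out-of-sample estimate."""
    folds = k_fold_indices(len(xs), k)
    errs = []
    for fold in folds:
        train = [i for i in range(len(xs)) if i not in fold]
        tx, ty = [xs[i] for i in train], [ys[i] for i in train]
        wrong = sum(one_nn_predict(tx, ty, xs[i]) != ys[i] for i in fold)
        errs.append(wrong / len(fold))
    return sum(errs) / k

# Noisy toy data: the true rule is y = 1 if x > 0, with 20% label noise.
rng = random.Random(1)
xs = [rng.uniform(-1, 1) for _ in range(100)]
ys = [(1 if x > 0 else 0) if rng.random() > 0.2 else (0 if x > 0 else 1)
      for x in xs]

# In-sample error is zero (the model memorizes), but k-fold CV reveals
# a much larger out-of-sample error -- the signature of overfitting.
in_sample = sum(one_nn_predict(xs, ys, x) != y for x, y in zip(xs, ys)) / len(xs)
print(f"in-sample error: {in_sample:.2f}")
print(f"5-fold CV error: {cv_error(xs, ys):.2f}")
```

The validation error is the quantity k-fold cross validation reports; a large gap between it and the in-sample error is exactly what flags an overfitted model.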

But the explanation for answer C also mentions that using a penalty is one of the methods. Thanks.

1 answer

品职助教_七七 · October 4, 2024

Hi, thoughtful PZer:


The main application of adding a penalty to eliminate overfitting is penalized regression / LASSO; it is not applied in CART.

In other words, although adding a penalty term is indeed one way to control overfitting, it cannot be used in the CART context of this question.
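To make the penalty referred to above concrete, here is a one-variable penalized-regression sketch. The toy data and the λ values are my own illustrative assumptions; the single-coefficient LASSO has the closed-form "soft-thresholding" solution used below.

```python
# One-variable LASSO sketch: minimize  sum((y_i - b*x_i)^2) + lam*|b|.
# The penalty term lam*|b| shrinks the fitted slope toward zero,
# which is how penalized regression curbs model complexity.

def lasso_1d(xs, ys, lam):
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b_ols = sxy / sxx                    # unpenalized least-squares slope
    shrink = lam / (2 * sxx)             # how far the penalty pulls b toward 0
    mag = max(0.0, abs(b_ols) - shrink)  # soft-thresholding
    return (1 if b_ols >= 0 else -1) * mag

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 1.9, 3.2, 3.8]  # roughly y = x plus noise

print(lasso_1d(xs, ys, lam=0.0))    # no penalty: the plain OLS slope
print(lasso_1d(xs, ys, lam=100.0))  # heavy penalty: slope shrunk to zero
```

CART controls complexity differently, through structural limits such as tree depth or pruning, rather than by adding a penalty term to the fitting objective.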

----------------------------------------------
Even if the sun is not coming toward us, we are moving toward it. Keep going!
