
FrankSun · October 9, 2021

Could you please explain this question?


NO.PZ202108310100000106

Question:

Based on Exhibit 2, the accuracy metric for Dataset XYZ’s test set sample is closest to:

Options:

A. 0.67

B. 0.70

C. 0.75

Explanation:

B is correct.

Accuracy is the percentage of correctly predicted classes out of total predictions and is calculated as (TP + TN)/(TP + FP + TN + FN).

In order to obtain the values for true positive (TP), true negative (TN), false positive (FP), and false negative (FN), predicted sentiment for the positive (Class “1”) and the negative (Class “0”) classes are determined based on whether each individual target p-value is greater than or less than the threshold p-value of 0.65. If an individual target p-value is greater than the threshold p-value of 0.65, the predicted sentiment for that instance is positive (Class “1”). If an individual target p-value is less than the threshold p-value of 0.65, the predicted sentiment for that instance is negative (Class “0”). Actual sentiment and predicted sentiment are then classified as follows:
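As an illustrative aside (not part of the official explanation), these rules can be sketched in Python. The function names are arbitrary; only the 0.65 threshold and the TP/FP/TN/FN definitions come from the explanation above.

```python
THRESHOLD = 0.65  # threshold p-value given in the question stem

def predict_sentiment(p_value: float, threshold: float = THRESHOLD) -> int:
    """Predicted class: 1 (positive) if the target p-value exceeds the threshold, else 0 (negative)."""
    return 1 if p_value > threshold else 0

def classify(actual: int, predicted: int) -> str:
    """Compare actual and predicted classes to label the instance TP, FP, TN, or FN."""
    if actual == 1 and predicted == 1:
        return "TP"   # true positive
    if actual == 0 and predicted == 1:
        return "FP"   # false positive
    if actual == 0 and predicted == 0:
        return "TN"   # true negative
    return "FN"       # false negative: actual == 1, predicted == 0
```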


Exhibit 2, with added “Predicted Sentiment” and “Classification” columns, is presented below:


Based on the classification data obtained from Exhibit 2, a confusion matrix can be generated:

                        Actual Class "1"    Actual Class "0"
Predicted Class "1"     TP = 3              FP = 1
Predicted Class "0"     FN = 2              TN = 4

Using the data in the confusion matrix above, the accuracy metric is computed as follows:

Accuracy = (TP + TN)/(TP + FP + TN + FN).

Accuracy = (3 + 4)/(3 + 1 + 4 + 2) = 0.70.
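As a quick numerical check, the same arithmetic in Python (the TP/FP/TN/FN counts are those stated in the solution):

```python
TP, FP, TN, FN = 3, 1, 4, 2  # confusion-matrix counts from the solution

accuracy = (TP + TN) / (TP + FP + TN + FN)
print(accuracy)  # 0.7 -> answer B
```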

A is incorrect because 0.67 is the F1 score, not the accuracy metric, for the sample of the test set for Dataset XYZ, based on Exhibit 2. To calculate the F1 score, the precision (P) and the recall (R) ratios must first be calculated. Precision and recall for the sample of the test set for Dataset XYZ, based on Exhibit 2, are calculated as follows:

Precision (P) = TP/(TP + FP) = 3/(3 + 1) = 0.75.

Recall (R) = TP/(TP + FN) = 3/(3 + 2) = 0.60.

The F1 score is calculated as follows:

F1 score = (2 × P × R)/(P + R) = (2 × 0.75 × 0.60)/(0.75 + 0.60) = 0.667, or 0.67.

C is incorrect because 0.75 is the precision ratio, not the accuracy metric, for the sample of the test set for Dataset XYZ, based on Exhibit 2. The precision score is calculated as follows:

Precision (P) = TP/(TP + FP) = 3/(3 + 1) = 0.75.
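For completeness, both distractor values can be reproduced from the same confusion-matrix counts:

```python
TP, FP, TN, FN = 3, 1, 4, 2  # same confusion-matrix counts as above

precision = TP / (TP + FP)                                 # 0.75 -> answer C
recall = TP / (TP + FN)                                    # 0.60
f1_score = 2 * precision * recall / (precision + recall)   # ~0.67 -> answer A
print(round(precision, 2), round(recall, 2), round(f1_score, 2))  # 0.75 0.6 0.67
```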

Teacher, where in the question do these four classifications come from?

2 answers

星星_品职助教 · April 24, 2023

@ff

Here, "p-value" simply means probability value. It is used in many contexts and is not limited to hypothesis testing; this question does not involve deciding whether to reject a null hypothesis.

When Y is a binary variable, a p-threshold is given as a cutoff: if the p-value obtained for an instance exceeds this threshold, the instance is treated as positive, i.e., Y = 1. If it falls below the threshold, then Y = 0 (negative).

星星_品职助教 · October 10, 2021

Hello,

Using the p-values given in Exhibit 2 and the threshold of 0.65 given in the question stem, you can derive the predicted sentiment column.

Comparing that column against the actual sentiment column gives the classification column, from which you obtain the values of TP, TN, FN, and FP.

Substituting those values into Accuracy = (TP + TN)/(TP + FP + TN + FN) then gives "the accuracy metric".
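To tie these steps together, here is a minimal end-to-end sketch in Python. The p-values and actual labels below are made-up placeholders (the real Exhibit 2 data are not reproduced in this thread); only the 0.65 threshold and the accuracy formula come from the question.

```python
from collections import Counter

THRESHOLD = 0.65  # threshold p-value from the question stem

# Hypothetical (p_value, actual_sentiment) pairs -- NOT the actual Exhibit 2 data.
sample = [(0.80, 1), (0.55, 0), (0.70, 1), (0.40, 0), (0.62, 1)]

counts = Counter()
for p_value, actual in sample:
    predicted = 1 if p_value > THRESHOLD else 0   # step 1: threshold the p-value
    label = {(1, 1): "TP", (0, 1): "FP",
             (0, 0): "TN", (1, 0): "FN"}[(actual, predicted)]
    counts[label] += 1                            # step 2: tally TP/FP/TN/FN

# step 3: plug the counts into the accuracy formula
accuracy = (counts["TP"] + counts["TN"]) / sum(counts.values())
print(dict(counts), accuracy)
```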

FrankSun · October 10, 2021

I see now. The image in the question stem is incomplete; the classification column is missing.

ff · April 24, 2023

I'd like to ask: why does a p-value greater than the threshold p-value of 0.65 mean a positive prediction? I remember learning that the smaller the p-value, the stronger the case for rejecting the null hypothesis, so I'm not sure how to judge whether the prediction here is positive or negative.
