-			return fmt.Errorf("train.num_boost_round should be of type int, received %T", x)
-		}
-	},
-	"objective": func(x interface{}) error {
-		if _, ok := x.(string); !ok {
-			return fmt.Errorf("objective should be of type string, received %T", x)
-		}
-		return nil
-	},
+func newFloat32(f float32) *float32 {
+	return &f
+}
+
+func newInt(i int) *int {
+	return &i
+}
+
+// TODO(tony): complete model parameter and training parameter list
+// model parameter list: https://xgboost.readthedocs.io/en/latest/parameter.html#general-parameters
+// training parameter list: https://github.com/dmlc/xgboost/blob/b61d53447203ca7a321d72f6bdd3f553a3aa06c4/python-package/xgboost/training.py#L115-L117
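The newFloat32 and newInt helpers added here return pointers to literal values, a common Go idiom for optional parameter fields where nil means "not set, use the library default". Below is a minimal sketch of how such helpers might be used; the xgboostParams struct and its field names are assumptions for illustration, not part of this change:

package main

import "fmt"

// Hypothetical parameter struct with optional fields; nil means "use the
// default". The field names here are illustrative assumptions only.
type xgboostParams struct {
	Eta           *float32
	NumBoostRound *int
	Objective     string
}

func newFloat32(f float32) *float32 { return &f }
func newInt(i int) *int             { return &i }

func main() {
	p := xgboostParams{
		Eta:           newFloat32(0.3),
		NumBoostRound: newInt(10),
		Objective:     "reg:squarederror",
	}
	// Only explicitly set fields are non-nil, so callers can tell
	// "user set 0" apart from "user left it unset".
	if p.Eta != nil {
		fmt.Printf("eta = %v\n", *p.Eta)
	}
	if p.NumBoostRound != nil {
		fmt.Printf("num_boost_round = %v\n", *p.NumBoostRound)
	}
}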
Step size shrinkage used in updates to prevent overfitting. After each boosting step, we can directly get the weights of new features, and eta shrinks the feature weights to make the boosting process more conservative.
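In other words, eta scales the contribution of each newly added tree, so smaller values make every boosting step more conservative. A rough sketch of that idea follows; it is not SQLFlow or XGBoost code, and predictTree is a stand-in for one fitted tree's output on a single example:

package main

import "fmt"

// predictTree stands in for the output of one fitted regression tree on a
// single example; here it is just a decaying constant for illustration.
func predictTree(round int, x float64) float64 {
	return 1.0 / float64(round+1)
}

func main() {
	eta := 0.3  // learning rate: shrinks each tree's contribution
	pred := 0.0 // running prediction for one example
	x := 42.0   // a single feature value, purely illustrative

	for round := 0; round < 5; round++ {
		// Each boosting step adds only a fraction (eta) of the new
		// tree's output, keeping the ensemble update conservative.
		pred += eta * predictTree(round, x)
		fmt.Printf("round %d: prediction = %.4f\n", round, pred)
	}
}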