library(glmnet)

a <- runif(100)
b <- runif(100)
c <- runif(100)
d <- b + runif(100)/10
e <- a + runif(100)/10
test <- cv.glmnet(cbind(a, b, c), cbind(d, e), family = "mgaussian", relax = TRUE, gamma = c(0.5))
test
test$lambda.min
In this example, calling test directly gives something like:

> test

Call:  cv.glmnet(x = cbind(a, b, c), y = cbind(d, e), gamma = c(0.5), relax = TRUE, family = "mgaussian")

Measure: Mean-Squared Error

    Gamma Index   Lambda Index  Measure        SE Nonzero
min   0.5     1 0.000884    63 0.001603 0.0001329       4
1se   0.5     1 0.014402    33 0.001715 0.0001521       3
It appears that lambda.min should be 0.000884. However, when I call test$lambda.min, I get a different number, in this instance 0.0004198282. I'm using glmnet v4.1-4 and this R version:
> R.version
               _
platform       x86_64-w64-mingw32
arch           x86_64
os             mingw32
system         x86_64, mingw32
status
major          4
minor          1.2
year           2021
month          11
day            01
svn rev        81115
language       R
version.string R version 4.1.2 (2021-11-01)
nickname       Bird Hippie
Anyone know why there is a difference?
CodePudding user response:
With relax = TRUE in cv.glmnet, two sets of cross-validation results are computed:

- the usual one for the relax = FALSE case;
- the special one for the relax = TRUE case.

They are stored in different places in the returned object, and each has its own print method.
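To see this structure, inspect the returned object. A minimal sketch (exact component names may vary between glmnet versions):

class(test)
# [1] "cv.relaxed" "cv.glmnet"
# so print(test) dispatches to print.cv.relaxed rather than print.cv.glmnet

names(test)          # top-level components hold the relax = FALSE results
names(test$relaxed)  # the relax = TRUE results, with their own lambda.min / lambda.1se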
Result for relax = FALSE:

glmnet:::print.cv.glmnet(test)  # call the unrelaxed print method explicitly
test$lambda.min                 # 0.0004198282 in the question's run
Result for relax = TRUE (stored in test$relaxed):

test                       # dispatches to the relaxed print method, equivalent to:
# print(test)
# glmnet:::print.cv.relaxed(test)
test$relaxed$lambda.min    # 0.000884 in the question's run, matching the printed table
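So the printed 0.000884 is the relaxed lambda.min, while test$lambda.min is the unrelaxed one. A side-by-side comparison (a sketch; the exact values depend on the random data, since no seed was set):

test$lambda.min          # unrelaxed minimum: 0.0004198282 in the question's run
test$relaxed$lambda.min  # relaxed minimum: 0.000884, as shown in the printed table
test$relaxed$lambda.1se  # relaxed 1-SE lambda: 0.014402 above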