Conversation

@PhilippPro
Member

This is the implementation of liquidSVM as an mlr learner. The probability predictions are still technically flawed and the package authors have not (yet) responded to my emails, so unfortunately probability predictions are not provided at the moment.

@larskotthoff
Member

Could you add some tests please?

@PhilippPro
Member Author

Why are there errors on the PR but not on the push? I also do not understand the errors very well...

@larskotthoff
Member

Ok, it looks like the tests are failing with a segmentation fault. I've tried on two different machines and can't reproduce it. Does it work when you run the tests locally?

@PhilippPro
Member Author

Yes, they were OK locally. The algorithm produces warnings in the default configuration, which is not good, and the probability prediction is not working properly yet. But the package has considerable potential regarding performance and runtime, and I have already contacted the creators.

package = "liquidSVM",
par.set = makeParamSet(
makeIntegerLearnerParam(id = "d", lower = 0L),
makeLogicalLearnerParam(id = "scale"),
Member

Default for this is TRUE.
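
For reference, a sketch of how the definition could then look, assuming the liquidSVM default for scale is indeed TRUE:

makeLogicalLearnerParam(id = "scale", default = TRUE),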

cl = "classif.liquidSVM",
package = "liquidSVM",
par.set = makeParamSet(
makeIntegerLearnerParam(id = "d", lower = 0L),
Member

This should not be tunable (tunable = FALSE).
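
For example (a sketch, keeping the existing lower bound):

makeIntegerLearnerParam(id = "d", lower = 0L, tunable = FALSE),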

par.set = makeParamSet(
makeIntegerLearnerParam(id = "d", lower = 0L),
makeLogicalLearnerParam(id = "scale"),
makeIntegerLearnerParam(id = "threads", lower = -1L),
Member

Default is 0.

makeIntegerLearnerParam(id = "d", lower = 0L),
makeLogicalLearnerParam(id = "scale"),
makeIntegerLearnerParam(id = "threads", lower = -1L),
makeIntegerLearnerParam(id = "partition_choice", lower = 0L),
Member

Default is 0.

makeLogicalLearnerParam(id = "scale"),
makeIntegerLearnerParam(id = "threads", lower = -1L),
makeIntegerLearnerParam(id = "partition_choice", lower = 0L),
makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L),
Member

How does this interact with gamma_steps, lambda_steps, etc? Are these options mutually exclusive?

Member

And all of these options should be not tunable.

makeIntegerLearnerParam(id = "partition_choice", lower = 0L),
makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L),
makeIntegerLearnerParam(id = "adaptivity_control", lower = 0L, upper = 2L),
makeIntegerLearnerParam(id = "random_seed", default = 1),
Member

I didn't find any documentation on what the default random seed is -- where did you find this? It also shouldn't be tunable.

makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L),
makeIntegerLearnerParam(id = "adaptivity_control", lower = 0L, upper = 2L),
makeIntegerLearnerParam(id = "random_seed", default = 1),
makeIntegerLearnerParam(id = "fold"),
Member

Should that be folds? And there should be a minimum for this (and again not tunable).
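
A possible revision (a sketch; the name folds and the minimum of 1 are assumptions based on this comment):

makeIntegerLearnerParam(id = "folds", lower = 1L, tunable = FALSE),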

makeIntegerLearnerParam(id = "adaptivity_control", lower = 0L, upper = 2L),
makeIntegerLearnerParam(id = "random_seed", default = 1),
makeIntegerLearnerParam(id = "fold"),
makeIntegerLearnerParam(id = "clipping"),
Member

Seems that the minimum is -1. Also probably shouldn't be tunable.
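
Something along these lines (a sketch; the lower bound of -1 follows this comment, and tunable = FALSE is only the suggestion made here, not a confirmed setting):

makeIntegerLearnerParam(id = "clipping", lower = -1L, tunable = FALSE),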

makeNumericLearnerParam(id = "max_lambda", lower = 0),
makeNumericVectorLearnerParam(id = "lambdas", lower = 0),
makeNumericVectorLearnerParam(id = "c_values", lower = 0),
makeLogicalLearnerParam(id = "useCells")
Member

Seems that the default for this is FALSE. Should be not tunable.
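
For example (a sketch, assuming FALSE is indeed the package default):

makeLogicalLearnerParam(id = "useCells", default = FALSE, tunable = FALSE)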

@PhilippPro
Member Author

I changed all parameters according to your suggestions. I am unsure whether clipping is tunable or not, which is why I did not set tunable = FALSE for it.

@larskotthoff
Member

Thanks, looks good -- what about restrictions between parameters that control the tuning though? If you set all the gamma parameters at the same time, I guess that some of them override others? It may make sense to add requirements saying that if you set a certain parameter, others cannot be set to avoid surprising results when users accidentally set conflicting parameters.

@PhilippPro
Member Author

> Thanks, looks good -- what about restrictions between parameters that control the tuning though? If you set all the gamma parameters at the same time, I guess that some of them override others? It may make sense to add requirements saying that if you set a certain parameter, others cannot be set to avoid surprising results when users accidentally set conflicting parameters.

I changed the parameter settings to take the requirements into account. Moreover, I looked into https://github.com/cran/liquidSVM/blob/master/R/mlr.R and adjusted the parameter settings slightly.
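
For reference, a minimal sketch of how such a requirement can be expressed with ParamHelpers' requires argument. The pairing of grid_choice with gamma_steps, the default of 0 for grid_choice, and the lower bound of 1 for gamma_steps are assumptions for illustration, not necessarily what ended up in this PR:

library(ParamHelpers)

# Sketch: allow gamma_steps only while grid_choice is left at its (assumed) default of 0,
# so the two ways of specifying the hyperparameter grid cannot silently override each other.
par.set = makeParamSet(
  makeIntegerLearnerParam(id = "grid_choice", lower = -2L, upper = 2L, default = 0L, tunable = FALSE),
  makeIntegerLearnerParam(id = "gamma_steps", lower = 1L, tunable = FALSE,
    requires = quote(grid_choice == 0))
)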

@larskotthoff
Member

Thanks for all your work on this. Merging.

@larskotthoff larskotthoff merged commit ca824b2 into master Sep 24, 2018
@larskotthoff larskotthoff deleted the liquidSVM branch September 24, 2018 19:19
vrodriguezf pushed a commit to vrodriguezf/mlr that referenced this pull request Jan 16, 2021
* create liquidSVM learners

* add the necessary package

* include some tests (that are still failing (probably))

* add debug seed

* change parameter settings of liquidSVM