XGBoost expecting different number of features than custom model #137
Comments
Hi! Can you attach the mqpar.xml file of this run?
Cheers, thanks!
I have an update to my error description: The correct error message I receive when doing the described process is:
However, I can see the metadata file in the custom model folder describes it as HCD. The error message in my first post occurred when I replaced the files in bin/conf/fragModel/standard/unmodified/ with my custom model.
Yes, specifying a custom model folder only works if the models lie inside the same folder structure as the default ones (so you need to have …)
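For illustration, mirroring the default layout might look like the sketch below. This is an assumption based on the path mentioned earlier in the thread (bin/conf/fragModel/standard/unmodified/); the MaxQuant root and source folder name are hypothetical placeholders.

```python
# Sketch only: replicate the default fragModel folder structure for a
# custom model. "MaxQuant" and "my_custom_model" are hypothetical names.
import os
import shutil

mq_root = "MaxQuant"  # MaxQuant installation directory (assumption)
model_dir = os.path.join(
    mq_root, "bin", "conf", "fragModel", "standard", "unmodified"
)
os.makedirs(model_dir, exist_ok=True)

# Copy the custom model files over the defaults, e.g.:
# for name in os.listdir("my_custom_model"):
#     shutil.copy(os.path.join("my_custom_model", name), model_dir)
print(os.path.isdir(model_dir))
```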
No change to the parameters when training the models, although I did try with and without the default variable modifications. When I supply the …
Can you try specifying …?
Describe the bug
Using a custom model trained via MQ > Tools > MS/MS intensity prediction causes an error during library prediction in MaxDIA.
The error message indicates a feature-count mismatch: the custom model was trained with 216 features, while the library-prediction input has 227 columns.
To Reproduce
Steps to reproduce the behavior using MQ 2.6.6.0:
-Create a model using Tools > MS/MS intensity prediction (I used the msms_short.txt provided in a previously closed issue)
-Specify this custom model to be used by MaxDIA > Library type > Predicted > Custom
-Error on Library_prediction_0
Error
... Check failed: learner_model_param_.num_feature >= p_fmat->Info().num_col_ (216 vs. 227) : Number of columns does not match number of features in booster. ...
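The semantics of that failed check can be illustrated with a small Python sketch. The `ToyBooster` class below is hypothetical and not part of XGBoost; it only mimics the assertion in the error message, where the booster's training-time feature count must be at least the number of input columns.

```python
# Hypothetical illustration of the check behind the error above
# (not XGBoost's actual code): a model trained with num_feature
# features rejects input that has more columns than that.
class ToyBooster:
    def __init__(self, num_feature):
        self.num_feature = num_feature  # features seen at training time

    def predict(self, rows):
        num_col = len(rows[0])
        if not self.num_feature >= num_col:
            raise ValueError(
                f"Check failed: num_feature >= num_col "
                f"({self.num_feature} vs. {num_col})"
            )
        return [0.0] * len(rows)  # dummy predictions


booster = ToyBooster(num_feature=216)
booster.predict([[0.0] * 216])      # OK: column count matches training
try:
    booster.predict([[0.0] * 227])  # 227 columns, as in the reported error
except ValueError as e:
    print(e)                        # prints "Check failed: ... (216 vs. 227)"
```

In other words, the library-prediction step is feeding 227 feature columns to a model that was built with only 216, so either the training step or the prediction step is computing a different feature set.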
Many thanks for any ideas or help with this! Are there parameters I can specify for building with the expected 227 features?