importModelFromXGBoost
Description
Examples
Import a pretrained XGBoost classification model trained using the ionosphere dataset. The pretrained model is provided with this example.
The model was trained in Python and saved as a JSON file using model.save_model('trainedXGBoostModel.json').
load ionosphere
modelfile = "trainedXGBoostModel.json";
Mdl = importModelFromXGBoost(modelfile)
Mdl =
CompactClassificationXGBoost
ResponseName: 'Y'
ClassNames: [0 1]
ScoreTransform: 'logit'
NumTrained: 30
ImportedModelParameters: [1×1 struct]
The model is imported as a CompactClassificationXGBoost model object.
Use the dot notation to view the imported model parameters.
Mdl.ImportedModelParameters
ans = struct with fields:
BaseScore: 0.6374
Objective: 'binary:logistic'
Booster: 'Tree'
NumBoostingRounds: 30
HasParallelTrees: 1
IsBinary: 1
NumClasses: 2
The parameters indicate it is a binary classification model trained using the 'Tree' booster. Convert the response data to a logical array to match the imported model.
Y = (Y=="g");
Use the loss function to evaluate the classification loss of the model.
loss(Mdl,X,Y)
ans = single
0.0476
Import a pretrained XGBoost regression model trained using the carsmall dataset to predict the fuel economy (MPG) of a car. The pretrained model is provided with this example.
The model was trained in Python using Cylinders, Displacement, Horsepower, and Weight as predictors, and then saved as a JSON file using model.save_model('trainedRegressionXGBoostModel.json').
load carsmall
modelfile = "trainedRegressionXGBoostModel.json";
Mdl = importModelFromXGBoost(modelfile)
Mdl =
CompactRegressionXGBoost
ResponseName: 'Y'
ResponseTransform: 'none'
NumTrained: 30
ImportedModelParameters: [1×1 struct]
The model is imported as a CompactRegressionXGBoost model object.
Use the dot notation to view the imported model parameters.
Mdl.ImportedModelParameters
ans = struct with fields:
BaseScore: 23.7181
Objective: 'reg:squarederror'
Booster: 'Tree'
NumBoostingRounds: 30
HasParallelTrees: 1
The parameters indicate it is a regression model trained using the 'Tree' booster.
Use the trained model to predict the fuel economy (MPG) for a four-cylinder car with a 200-cubic-inch displacement, 150 horsepower, and a weight of 3000 lbs.
predict(Mdl,[4 200 150 3000])
ans = single
24.0842
Import a pretrained XGBoost multiclass classification model trained using the NLP dataset with 13 classes. The pretrained model is provided with this example.
The model was trained in Python and saved as a JSON file using model.save_model('trainedMulticlassXGBoostModel.json').
load nlpdata
modelfile = "trainedMulticlassXGBoostModel.json";
Mdl = importModelFromXGBoost(modelfile)
Mdl =
CompactClassificationXGBoost
ResponseName: 'Y'
ClassNames: [0 1 2 3 4 5 6 7 8 9 10 11 12]
ScoreTransform: 'softmax'
NumTrained: 390
ImportedModelParameters: [1×1 struct]
The model is imported as a CompactClassificationXGBoost model object.
Use the dot notation to view the imported model parameters.
Mdl.ImportedModelParameters
ans = struct with fields:
BaseScore: 0.5000
Objective: 'multi:softmax'
Booster: 'Tree'
NumBoostingRounds: 30
HasParallelTrees: 1
NumClasses: 13
IsBinary: 0
The parameters indicate it is a multiclass classification model trained using the 'Tree' booster. Convert the response data Y to an encoded array to match the imported model, and convert the predictor matrix X to a full matrix.
labels = categories(Y);
% Create mapping with 0-based indexing
labelMap = containers.Map(labels,0:length(labels)-1);
% Encode Y
Y = cellstr(Y);
encodedY = zeros(size(Y));
for i = 1:length(Y)
    encodedY(i) = labelMap(Y{i});
end
fullX = full(X);
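The same 0-based encoding scheme can be sketched in Python, the language the model was trained in. This is a hypothetical illustration with made-up labels, showing only the mapping XGBoost expects for multiclass targets.

```python
# Hypothetical illustration: 0-based label encoding, matching what
# XGBoost expects for multiclass targets.
labels = ["simulink", "matlab", "stats", "matlab"]  # made-up class labels
classes = sorted(set(labels))                       # fixed class order
label_map = {c: i for i, c in enumerate(classes)}   # class -> 0-based index
encoded = [label_map[x] for x in labels]
print(encoded)  # -> [1, 0, 2, 0]
```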
Use the predict function to evaluate the class for the ninth sample.
predict(Mdl,fullX(9,:))
ans = 0
Import a pretrained XGBoost classification model trained using the ionosphere dataset. The pretrained model is provided with this example.
The model was trained in Python and saved as a JSON file using model.save_model('trainedXGBoostModel.json').
load ionosphere
modelfile = "trainedXGBoostModel.json";
Mdl = importModelFromXGBoost(modelfile)
Mdl =
CompactClassificationXGBoost
ResponseName: 'Y'
ClassNames: [0 1]
ScoreTransform: 'logit'
NumTrained: 30
ImportedModelParameters: [1×1 struct]
The model is imported as a CompactClassificationXGBoost model object. Convert the response data to a logical array to match the imported model.
Y = (Y=="g");
Create a test sample with 5 missing predictor values.
missingIdx = randi(size(X,2),[1,5])
missingIdx = 1×5
28 31 5 32 22
for i = missingIdx
    X(1,i) = NaN;
end
Predict the class of the sample with missing predictor values.
predict(Mdl,X(1,:))
ans = 1
Input Arguments
Name of the XGBoost model file, specified as a character vector or string scalar.
modelfile must be in the current folder, or you must include a
full or relative path to the file. The XGBoost model must be pretrained and saved in the
serialized JSON format. For more information on how to save a model,
see Save XGBoost Model in Python.
Example: "xgboostModel.json"
Output Arguments
Compact XGBoost model, returned as either a CompactClassificationXGBoost or CompactRegressionXGBoost
object. The compact model does not include the data used to train the model. To
reference properties of Mdl, use dot notation. For example, to
access or display the cell vector of weak learner model objects for the underlying tree
ensemble, enter Mdl.Trained at the command line.
Limitations
The following features of the XGBoost library are not supported for imported models:
Multilabel classification
Categorical predictors
importModelFromXGBoost is tested up to XGBoost version 3.1.
More About
importModelFromXGBoost accepts XGBoost models in the JSON format. Use the following command in Python® to save a trained XGBoost model before import:
model.save_model("modelfileName.json")
XGBoost models can be trained and saved in other languages, such as R, C, and C++. However, importModelFromXGBoost has been tested only for models saved from Python.
importModelFromXGBoost supports models trained with the following objectives:
reg:squarederror
binary:logistic
multi:softmax
multi:softprob
reg:squarederror, binary:logistic, and
multi:softprob are the default objectives for regression, binary
classification, and multiclass classification, respectively.
Support for missing values is enabled in imported XGBoost models. This aligns with the default behavior of XGBoost, which accepts predictors with missing values.
References
[1] XGBoost. “XGBoost.” Accessed August 14, 2025. https://xgboost.readthedocs.io/en/stable/.
[2] XGBoost. “Introduction to Model IO.” Accessed August 14, 2025. https://xgboost.readthedocs.io/en/stable/tutorials/saving_model.html.
Version History
Introduced in R2026a
See Also
CompactClassificationXGBoost | CompactRegressionXGBoost | predict | loss