

(Not recommended) Pretrained NASNet-Large convolutional neural network

  • NASNet-Large network architecture

nasnetlarge is not recommended. Use the imagePretrainedNetwork function instead and specify the "nasnetlarge" model. For more information, see Version History.


NASNet-Large is a convolutional neural network that is trained on more than a million images from the ImageNet database [1]. The network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals. As a result, the network has learned rich feature representations for a wide range of images. The network has an image input size of 331-by-331. For more pretrained networks in MATLAB®, see Pretrained Deep Neural Networks.


net = nasnetlarge returns a pretrained NASNet-Large convolutional neural network.

This function requires the Deep Learning Toolbox™ Model for NASNet-Large Network support package. If this support package is not installed, then the function provides a download link.



Download and install the Deep Learning Toolbox Model for NASNet-Large Network support package.

Type nasnetlarge at the command line.


If the Deep Learning Toolbox Model for NASNet-Large Network support package is not installed, then the function provides a link to the required support package in the Add-On Explorer. To install the support package, click the link, and then click Install. Check that the installation is successful by typing nasnetlarge at the command line. If the required support package is installed, then the function returns a DAGNetwork object.

ans = 

  DAGNetwork with properties:

         Layers: [1244×1 nnet.cnn.layer.Layer]
    Connections: [1463×2 table]
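With the network loaded, you can classify an image. The sketch below is a minimal example, assuming the sample image peppers.png that ships with MATLAB; the image is resized to match the network's 331-by-331 input size.

```matlab
% Load the pretrained network.
net = nasnetlarge;

% Read a sample image shipped with MATLAB and resize it to the
% network input size of 331-by-331.
I = imread("peppers.png");
I = imresize(I,[331 331]);

% Classify the image and display the result.
label = classify(net,I);
imshow(I)
title(string(label))
```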

Visualize the network using Deep Network Designer.
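For example, you can open the network in the app from the command line:

```matlab
% Open the pretrained network in the Deep Network Designer app.
net = nasnetlarge;
deepNetworkDesigner(net)
```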


Explore other pretrained neural networks in Deep Network Designer by clicking New.

Deep Network Designer start page showing available pretrained neural networks

If you need to download a neural network, pause on the desired neural network and click Install to open the Add-On Explorer.

Output Arguments


Pretrained NASNet-Large convolutional neural network, returned as a DAGNetwork object.


[1] ImageNet.

[2] Zoph, Barret, Vijay Vasudevan, Jonathon Shlens, and Quoc V. Le. "Learning Transferable Architectures for Scalable Image Recognition." Preprint, arXiv:1707.07012, 2017.


Version History

Introduced in R2019a


R2024a: Not Recommended

nasnetlarge is not recommended. Use the imagePretrainedNetwork function instead and specify "nasnetlarge" as the model.

There are no plans to remove support for the nasnetlarge function. However, the imagePretrainedNetwork function has additional functionality that helps with transfer learning workflows. For example, you can specify the number of classes in your data using the NumClasses option, and the function returns a network that is ready for retraining without modification.
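For example, the call below is a minimal sketch that prepares NASNet-Large for retraining on a data set with five classes (the class count is an arbitrary placeholder for your own data):

```matlab
% Sketch: load NASNet-Large with a new classification head for
% 5 classes. The returned dlnetwork is ready for retraining.
net = imagePretrainedNetwork("nasnetlarge",NumClasses=5);
```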

The imagePretrainedNetwork function returns the network as a dlnetwork object, which does not store the class names. To get the class names of the pretrained network, use the second output argument of the imagePretrainedNetwork function.

This table shows some typical usages of the nasnetlarge function and how to update your code to use the imagePretrainedNetwork function instead.

Not Recommended                        Recommended
net = nasnetlarge;                     [net,classNames] = imagePretrainedNetwork("nasnetlarge");
net = nasnetlarge(Weights="none");     net = imagePretrainedNetwork("nasnetlarge",Weights="none");

The imagePretrainedNetwork function returns a dlnetwork object, which also has these advantages:

  • dlnetwork objects are a unified data type that supports network building, prediction, built-in training, visualization, compression, verification, and custom training loops.

  • dlnetwork objects support a wider range of network architectures that you can create or import from external platforms.

  • The trainnet function supports dlnetwork objects, which enables you to easily specify loss functions. You can select from built-in loss functions or specify a custom loss function.

  • Training and prediction with dlnetwork objects are typically faster than with LayerGraph and trainNetwork workflows.

To train a neural network specified as a dlnetwork object, use the trainnet function.
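A retraining call might look like the following sketch, where imds is a hypothetical imageDatastore of your labeled training images (not defined here), and the training options are illustrative placeholders:

```matlab
% Sketch: retrain NASNet-Large on a 5-class data set with trainnet.
% imds is assumed to be an imageDatastore of labeled training images.
net = imagePretrainedNetwork("nasnetlarge",NumClasses=5);
options = trainingOptions("adam",MaxEpochs=5,InitialLearnRate=1e-4);
net = trainnet(imds,net,"crossentropy",options);
```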