Operations

Develop custom deep learning functions

For most tasks, you can use built-in layers. If no built-in layer exists for your task, then you can define your own custom layer, including layers with learnable and state parameters, and you can specify a custom loss function using a custom output layer. After defining a custom layer, you can check that the layer is valid and GPU compatible, and that it outputs correctly defined gradients. To learn more, see Define Custom Deep Learning Layers. For a list of supported layers, see List of Deep Learning Layers.
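
As an illustration, the following is a minimal sketch of a custom layer with a learnable parameter, loosely modeled on a PReLU activation. The class name, parameter initialization, and input sizes here are assumptions for this example, not a prescribed design. After saving the class file, you can validate the layer with checkLayer.

classdef preluLayer < nnet.layer.Layer
    % Hypothetical example layer: PReLU activation with a learnable
    % per-channel scaling coefficient applied to negative inputs.
    properties (Learnable)
        Alpha
    end
    methods
        function layer = preluLayer(numChannels,name)
            layer.Name = name;
            layer.Description = "PReLU with " + numChannels + " channels";
            layer.Alpha = rand([1 1 numChannels]); % illustrative initialization
        end
        function Z = predict(layer,X)
            % Pass positive values through; scale negative values by Alpha.
            Z = max(X,0) + layer.Alpha .* min(X,0);
        end
    end
end

You can then check validity, gradients, and GPU compatibility for a sample input size (here, a hypothetical 24-by-24 input with 20 channels, with observations in dimension 4):

layer = preluLayer(20,"prelu");
checkLayer(layer,[24 24 20],'ObservationDimension',4)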

If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss function that you need, then you can define a custom training loop. For models that layer graphs do not support, you can define a custom model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
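
For example, here is a minimal sketch of a single custom training iteration, assuming a small dlnetwork, a random mini-batch, and an SGDM parameter update. The network architecture, data, and loss are placeholders for illustration.

% Hypothetical three-class network operating on 10 input features.
layers = [featureInputLayer(10) fullyConnectedLayer(3) softmaxLayer];
net = dlnetwork(layers);

% Random mini-batch of 16 observations with one-hot targets.
X = dlarray(rand(10,16),"CB");
idx = randi(3,1,16);
T = zeros(3,16);
T(sub2ind(size(T),idx,1:16)) = 1;
T = dlarray(T,"CB");

% Evaluate the loss and gradients with dlfeval, then update the
% learnable parameters with sgdmupdate. In a full training loop,
% these two lines repeat for each mini-batch.
velocity = [];
[loss,gradients] = dlfeval(@modelLoss,net,X,T);
[net,velocity] = sgdmupdate(net,gradients,velocity);

function [loss,gradients] = modelLoss(net,X,T)
    % Forward pass, loss, and gradients with respect to the learnables.
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    gradients = dlgradient(loss,net.Learnables);
end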

Use deep learning operations to develop MATLAB® code for custom layers, training loops, and model functions.
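
For example, most of these operations act on formatted dlarray objects, whose dimension labels (such as "S" for spatial, "C" for channel, and "B" for batch) determine how each operation treats the data. The sizes below are illustrative.

% Create a formatted dlarray: two spatial, one channel, one batch dimension.
X = dlarray(rand(28,28,3,16),"SSCB");
dims(X)                 % returns 'SSCB'
finddim(X,"C")          % returns 3, the index of the channel dimension
Xdata = extractdata(X); % recover the underlying numeric array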

Functions

dlarray - Deep learning array for customization (Since R2019b)
dims - Dimension labels of dlarray (Since R2019b)
finddim - Find dimensions with specified label (Since R2019b)
stripdims - Remove dlarray data format (Since R2019b)
extractdata - Extract data from dlarray (Since R2019b)
isdlarray - Check if object is dlarray (Since R2020b)
dlconv - Deep learning convolution (Since R2019b)
dltranspconv - Deep learning transposed convolution (Since R2019b)
lstm - Long short-term memory (Since R2019b)
gru - Gated recurrent unit (Since R2020a)
attention - Dot-product attention (Since R2022b)
embed - Embed discrete data (Since R2020b)
fullyconnect - Sum all weighted input data and apply a bias (Since R2019b)
dlode45 - Deep learning solution of nonstiff ordinary differential equation (ODE) (Since R2021b)
batchnorm - Normalize data across all observations for each channel independently (Since R2019b)
crosschannelnorm - Cross-channel square-normalize using local responses (Since R2020a)
groupnorm - Normalize data across grouped subsets of channels for each observation independently (Since R2020b)
instancenorm - Normalize across each channel for each observation independently (Since R2021a)
layernorm - Normalize data across all channels for each observation independently (Since R2021a)
avgpool - Pool data to average values over spatial dimensions (Since R2019b)
maxpool - Pool data to maximum value (Since R2019b)
maxunpool - Unpool the output of a maximum pooling operation (Since R2019b)
relu - Apply rectified linear unit activation (Since R2019b)
leakyrelu - Apply leaky rectified linear unit activation (Since R2019b)
gelu - Apply Gaussian error linear unit (GELU) activation (Since R2022b)
softmax - Apply softmax activation to channel dimension (Since R2019b)
sigmoid - Apply sigmoid activation (Since R2019b)
crossentropy - Cross-entropy loss for classification tasks (Since R2019b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
mse - Half mean squared error (Since R2019b)
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
dlaccelerate - Accelerate deep learning function for custom training loops (Since R2021a)
AcceleratedFunction - Accelerated deep learning function (Since R2021a)
clearCache - Clear accelerated deep learning function trace cache (Since R2021a)
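
As a closing illustration, the following sketch composes several of the operations above into a model function and accelerates it with dlaccelerate. The parameter structure, sizes, and initialization are assumptions for this example.

% Hypothetical parameters: one 3-by-3 convolution with 8 filters,
% followed by a fully connected operation to 10 output classes.
parameters.conv.Weights = dlarray(rand(3,3,1,8));
parameters.conv.Bias = dlarray(zeros(1,8));
parameters.fc.Weights = dlarray(rand(10,28*28*8));
parameters.fc.Bias = dlarray(zeros(10,1));

% Random mini-batch of 16 single-channel 28-by-28 images.
X = dlarray(rand(28,28,1,16),"SSCB");
Y = model(parameters,X);

% dlaccelerate caches the traced computation so repeated calls with
% compatible inputs reuse the trace instead of re-tracing the function.
accfun = dlaccelerate(@model);
Y = accfun(parameters,X);

function Y = model(parameters,X)
    % Convolution followed by a ReLU activation.
    Y = dlconv(X,parameters.conv.Weights,parameters.conv.Bias,'Padding','same');
    Y = relu(Y);
    % Fully connected operation, then softmax over the channel dimension.
    Y = fullyconnect(Y,parameters.fc.Weights,parameters.fc.Bias);
    Y = softmax(Y);
end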

Topics

Automatic Differentiation

Model Functions

Deep Learning Function Acceleration