Microscopy Image Browser 2.91
MIB
mibDeepController Class Reference

A template class for use with a GUI developed using MATLAB App Designer. More...

Inheritance diagram for mibDeepController:
Collaboration diagram for mibDeepController:

Public Member Functions

function [ lgraph , outputPatchSize ] = createNetwork (previewSwitch)
 generate network. Parameters: previewSwitch: logical switch; when 1, the generated network is only for preview, i.e. weights of classes won't be calculated
 
function net = generateDeepLabV3Network (imageSize, numClasses, targetNetwork)
 generate DeepLab v3+ convolutional neural network for semantic image segmentation of 2D RGB images
 
function net = generate3DDeepLabV3Network (imageSize, numClasses, downsamplingFactor, targetNetwork)
 generate a hybrid 2.5D DeepLab v3+ convolutional neural network for semantic image segmentation. The training data should be a small substack of 3, 5, 7, etc. slices, where only the middle slice is segmented. As an update, two blocks of 3D convolutions were added before the standard DLv3
 
function [ net , outputSize ] = generateUnet2DwithEncoder (imageSize, encoderNetwork)
 generate Unet convolutional neural network for semantic image segmentation of 2D RGB images using a specified encoder
 
function [ patchOut , info , augList , augPars ] = mibDeepAugmentAndCrop3dPatchMultiGPU (info, inputPatchSize, outputPatchSize, mode, options)
 
function [ patchOut , info , augList , augPars ] = mibDeepAugmentAndCrop2dPatchMultiGPU (info, inputPatchSize, outputPatchSize, mode, options)
 
function TrainingOptions = preprareTrainingOptionsInstances (valDS)
 prepare training options for training of the instance segmentation network
 
 mibDeepController (mibModel, varargin)
 
function  closeWindow ()
 update preferences structure
 
function  dnd_files_callback (hWidget, dragIn)
 drag and drop config name to obj.View.handles.NetworkPanel to load it
 
function  updateWidgets ()
 update widgets of this window
 
function  activationLayerChangeCallback ()
 callback for modification of the Activation Layer dropdown
 
function  setSegmentationLayer ()
 callback for modification of the Segmentation Layer dropdown
 
function  toggleAugmentations ()
 callback for press of the T_augmentation checkbox
 
function  updateBatchOptFromGUI (event)
 
function  singleModelTrainingFileValueChanged (event)
 callback for press of SingleModelTrainingFile
 
function  selectWorkflow (event)
 select deep learning workflow to perform
 
function  selectArchitecture (event)
 select the target architecture
 
function  returnBatchOpt (BatchOptOut)
 return structure with Batch Options and possible configurations via the notify syncBatch event. Parameters: BatchOptOut: a local structure with Batch Options generated during the Continue callback; it may contain more fields than the obj.BatchOpt structure
 
function  bioformatsCallback (event)
 update available filename extensions
 
function net = selectNetwork (networkName)
 select a filename for a new network in the Train mode, or select a network to use for the Predict mode
 
function  selectDirerctories (event)
 
function  updateImageDirectoryPath (event)
 update directories with images for training, prediction and results
 
function  checkNetwork (fn)
 generate and check network using settings in the Train tab
 
function lgraph = updateNetworkInputLayer (lgraph, inputPatchSize)
 update the input layer settings for lgraph; parameters are taken from obj.InputLayerOpt
 
function lgraph = updateActivationLayers (lgraph)
 update the activation layers depending on the settings in obj.BatchOpt.T_ActivationLayer and obj.ActivationLayerOpt
 
function lgraph = updateConvolutionLayers (lgraph)
 update the convolution layers by providing a new set of weight initializers
 
function lgraph = updateMaxPoolAndTransConvLayers (lgraph, poolSize)
 update maxPool and TransposedConvolution layers depending on the network downsampling factor, only for U-net and SegNet. This function is applied when the network downsampling factor is different from 2
 
function lgraph = updateSegmentationLayer (lgraph, classNames)
 redefine the segmentation layer of lgraph based on obj.BatchOpt settings
 
function [ status , augNumber ] = setAugFuncHandles (mode, augOptions)
 define list of 2D/3D augmentation functions
 
function [ dataOut , info ] = classificationAugmentationPipeline (dataIn, info, inputPatchSize, outputPatchSize, mode)
 
function  setAugmentation3DSettings ()
 update settings for augmentation of 3D images
 
function  setActivationLayerOptions ()
 update options for the activation layers
 
function  setSegmentationLayerOptions ()
 update options for the segmentation layer
 
function  setAugmentation2DSettings ()
 update settings for augmentation of 2D images
 
function  setInputLayerSettings ()
 update init settings for the input layer of networks
 
function  setTrainingSettings ()
 update settings for training of networks
 
function  start (event)
 start calculations; depending on the selected tab, preprocessing, training, or prediction is initialized
 
function imgOut = channelWisePreProcess (imgIn)
 Normalize images. As the input has 4 channels (modalities), remove the mean and divide by the standard deviation of each modality independently.
 
function  processImages (preprocessFor)
 Preprocess images for training and prediction.
 
function  startPreprocessing ()
 
function TrainingOptions = preprareTrainingOptions (valDS)
 prepare training options for the network training
 
function  updatePreprocessingMode ()
 callback for change of selection in the Preprocess for dropdown
 
function  saveConfig (configName)
 save Deep MIB configuration to a file
 
function res = correctBatchOpt (res)
 correct loaded BatchOpt structure if it is not compatible with the current version of DeepMIB
 
function  loadConfig (configName)
 load config file with Deep MIB settings
 
function  startPredictionBlockedImage ()
 predict 2D/3D datasets using the blockedImage class; requires R2021a or newer
 
function [ outputLabels , scoreImg ] = processBlocksBlockedImage (vol, zValue, net, inputPatchSize, outputPatchSize, blockSize, padShift, dataDimension, patchwiseWorkflowSwitch, patchwisePatchesPredictSwitch, classNames, generateScoreFiles, executionEnvironment, fn)
 dataDimension: numeric switch that identifies the dataset dimension; can be 2, 2.5, 3
 
function  startPrediction2D ()
 predict 2D datasets; moved to a separate function for better performance
 
function  startPrediction3D ()
 predict datasets for 3D networks; moved to a separate function to improve performance
 
function bls = generateDynamicMaskingBlocks (vol, blockSize, noColors)
 generate blocks using dynamic masking parameters acquired in obj.DynamicMaskOpt
 
function  previewPredictions ()
 load images of prediction scores into MIB
 
function  previewModels (loadImagesSwitch)
 load images for predictions and the resulting models into MIB
 
function  evaluateSegmentationPatches ()
 evaluate segmentation results for the patches in the patch-wise mode
 
function  evaluateSegmentation ()
 evaluate segmentation results by comparing predicted models with the ground truth models
 
function  selectGPUDevice ()
 select environment for computations
 
function  gpuInfo ()
 display information about the selected GPU
 
function  exportNetwork ()
 convert and export network to ONNX or TensorFlow formats
 
function  transferLearning ()
 perform fine-tuning of the loaded network to a different number of classes
 
function  importNetwork ()
 import an externally trained or designed network to be used with DeepMIB
 
function  updateDynamicMaskSettings ()
 update settings for calculation of dynamic masks during prediction using the blockedImage mode; the settings are stored in obj.DynamicMaskOpt
 
function  saveCheckpointNetworkCheck ()
 callback for press of Save checkpoint networks (obj.View.handles.T_SaveProgress)
 
function  previewDynamicMask ()
 preview results for the dynamic mode
 
function  exploreActivations ()
 explore activations within the trained network
 
function  countLabels ()
 count occurrences of labels in model files; callback for press of the "Count labels" button in the Options panel
 
function  balanceClasses ()
 balance classes before training; see the example here: https://se.mathworks.com/help/vision/ref/balancepixellabels.html
 
function  customTrainingProgressWindow_Callback (event)
 callback for click on obj.View.handles.O_CustomTrainingProgressWindow checkbox
 
function  previewImagePatches_Callback (event)
 callback for value change of obj.View.handles.O_PreviewImagePatches
 
function  sendReportsCallback ()
 define parameters for sending progress report to the user's email address
 
function  helpButton_callback ()
 show Help sections
 
function  duplicateConfigAndNetwork ()
 copy the network file and its config to a new filename
 

Static Public Member Functions

static function  viewListner_Callback (obj, src, evnt)
 
static function data = tif3DFileRead (filename)
 data = tif3DFileRead(filename); custom reading function to load tif files with a stack of images, used in the evaluate segmentation function
 
static function [ outputLabeledImageBlock , scoreBlock ] = segmentBlockedImage (block, net, dataDimension, patchwiseWorkflowSwitch, generateScoreFiles, executionEnvironment, padShift)
 test function for utilization of blockedImage for prediction. The input block will be a batch of blocks from a blockedImage.
 

Public Attributes

 mibModel
 handles to mibModel
 
 mibController
 handle to mib controller
 
 View
 handle to the view / mibDeepGUI
 
 listener
 a cell array with handles to listeners
 
 childControllers
 list of opened subcontrollers
 
 childControllersIds
 a cell array with names of initialized child controllers
 
 availableArchitectures
 containers.Map with available architectures keySet = {2D Semantic, 2.5D Semantic, 3D Semantic, 2D Patch-wise, 2D Instance}; valueSet{1} = {DeepLab v3+, SegNet, U-net, U-net +Encoder}; old: {U-net, SegNet, DLv3 Resnet18, DLv3 Resnet50, DLv3 Xception, DLv3 Inception-ResNet-v2} valueSet{2} = {Z2C + DLv3, Z2C + DLv3, Z2C + U-net, Z2C + U-net +Encoder}; % 3DC + DLv3 Resnet18' valueSet{3} = {U-net, U-net Anisotropic} valueSet{4} = {Resnet18, Resnet50, Resnet101, Xception} valueSet{5} = {SOLOv2} old: {SOLOv2 Resnet18, SOLOv2 Resnet50};
 
 availableEncoders
 containers.Map with available encoders keySet - is a mixture of workflow -space- architecture {2D Semantic DeepLab v3+, 2D Semantic U-net +Encoder} encoders for each "workflow -space- architecture" combination, the last value shows the selected encoder encodersList{1} = {Resnet18, Resnet50, Xception, InceptionResnetv2, 1}; % for 2D Semantic DeepLab v3+ encodersList{2} = {Classic, Resnet18, Resnet50, 2}; % for 2D Semantic U-net +Encoder encodersList{3} = {Resnet18, Resnet50, 1}; % for 2D Instance SOLOv2'
 
 BatchOpt
 a structure compatible with batch operation name of each field should be displayed in a tooltip of GUI it is recommended that the Tags of widgets match the name of the fields in this structure .Parameter - [editbox], char/string .Checkbox - [checkbox], logical value true or false .Dropdown{1} - [dropdown], cell string for the dropdown .Dropdown{2} - [optional], an array with possible options .Radio - [radiobuttons], cell string Radio1 or Radio2... .ParameterNumeric{1} - [numeric editbox], cell with a number .ParameterNumeric{2} - [optional], vector with limits [min, max] .ParameterNumeric{3} - [optional], string on - to round the value, off to do not round the value
 
 AugOpt2D
 a structure with augmentation options for 2D unets, default obtained from obj.mibModel.preferences.Deep.AugOpt2D, see getDefaultParameters.m .FillValue = 0; .RandXReflection = true; .RandYReflection = true; .RandRotation = [-10, 10]; .RandScale = [.95 1.05]; .RandXScale = [.95 1.05]; .RandYScale = [.95 1.05]; .RandXShear = [-5 5]; .RandYShear = [-5 5];
 
 Aug2DFuncNames
 cell array with names of 2D augmenter functions
 
 Aug2DFuncProbability
 probabilities of each 2D augmentation action to be triggered
 
 Aug3DFuncNames
 cell array with names of 3D augmenter functions
 
 Aug3DFuncProbability
 probabilities of each 3D augmentation action to be triggered
 
 gpuInfoFig
 a handle for GPU info window
 
 AugOpt3D
 .Fraction = .6; % augment 60% of patches .FillValue = 0; .RandXReflection = true; .RandYReflection = true; .RandZReflection = true; .Rotation90 = true; .ReflectedRotation90 = true;
 
 ActivationLayerOpt
 options for the activation layer
 
 DynamicMaskOpt
 options for calculation of dynamic masks for prediction using blocked image mode .Method = Keep above threshold; % Keep above threshold or Keep below threshold .ThresholdValue = 60; .InclusionThreshold = 0.1; % Inclusion threshold for mask blocks
 
 SegmentationLayerOpt
 options for the segmentation layer
 
 InputLayerOpt
 a structure with settings for the input layer .Normalization = zerocenter; .Mean = []; .StandardDeviation = []; .Min = []; .Max = [];
 
 modelMaterialColors
 colors of materials
 
 PatchPreviewOpt
 structure with preview patch options .noImages = 9, number of images in montage .imageSize = 160, patch size for preview .labelShow = true, display overlay labels with details .labelSize = 9, font size for the label .labelColor = black, color of the label .labelBgColor = yellow, color of the label background .labelBgOpacity = 0.6; % opacity of the background
 
 SendReports
 send email reports with progress of the training process .T_SendReports = false; .FROM_email = 'user@gmail.com'; .SMTP_server = smtp-relay.brevo.com; .SMTP_port = 587; .SMTP_auth = true; .SMTP_starttls = true; .SMTP_sername = 'user@gmail.com'; .SMTP_password = ''; .sendWhenFinished = false; .sendDuringRun = false;
 
 TrainingOpt
 a structure with training options, the default ones are obtained from obj.mibModel.preferences.Deep.TrainingOpt, see getDefaultParameters.m .solverName = adam; .MaxEpochs = 50; .Shuffle = once; .InitialLearnRate = 0.0005; .LearnRateSchedule = piecewise; .LearnRateDropPeriod = 10; .LearnRateDropFactor = 0.1; .L2Regularization = 0.0001; .Momentum = 0.9; .ValidationFrequency = 400; .Plots = training-progress;
 
 TrainingProgress
 a structure to be used for the training progress plot for the compiled version of MIB .maxNoIter - maximum number of iterations during training .iterPerEpoch - iterations per epoch .stopTraining - logical switch that forces training to stop and saves all progress
 
 wb
 handle to waitbar
 
 colormap6
 colormap for 6 colors
 
 colormap20
 colormap for 20 colors
 
 colormap255
 colormap for 255 colors
 
 sessionSettings
 structure for the session settings .countLabelsDir - directory with labels to count, used in count labels function
 
 TrainEngine
 temp property to test trainnet function for training: can be trainnet or trainNetwork
 
EVENT closeEvent
 Event fired when the window is closed
 
- Public Attributes inherited from handle
 addlistener
 Creates a listener for the specified event and assigns a callback function to execute when the event occurs.
 
 notify
 Broadcast a notice that a specific event is occurring on a specified handle object or array of handle objects.
 
 delete
 Handle object destructor method that is called when the object's lifecycle ends.
 
 disp
 Handle object disp method which is called by the display method. See the MATLAB disp function.
 
 display
 Handle object display method called when MATLAB software interprets an expression returning a handle object that is not terminated by a semicolon. See the MATLAB display function.
 
 findobj
 Finds objects matching the specified conditions from the input array of handle objects.
 
 findprop
 Returns the meta.property object associated with the specified property name.
 
 fields
 Returns a cell array of strings containing the names of public properties.
 
 fieldnames
 Returns a cell array of strings containing the names of public properties. See the MATLAB fieldnames function.
 
 isvalid
 Returns a logical array in which elements are true if the corresponding elements in the input array are valid handles. This method is Sealed so you cannot override it in a handle subclass.
 
 eq
 Relational functions example. See details for more information.
 
 transpose
 Transposes the elements of the handle object array.
 
 permute
 Rearranges the dimensions of the handle object array. See the MATLAB permute function.
 
 reshape
 Changes the dimensions of the handle object array to the specified dimensions. See the MATLAB reshape function.
 
 sort
 Sorts the handle objects in any array in ascending or descending order.
 

Detailed Description

A template class for use with a GUI developed using MATLAB App Designer.

obj.startController('mibDeepController'); % as GUI tool

or

% the code below was used for mibImageArithmeticController
BatchOpt.Parameter = 'test'; % fill edit boxes as strings
BatchOpt.Checkbox = true; % fill checkboxes with logicals: true/false
BatchOpt.Popup = {'value'}; % value for the popups as a cell
BatchOpt.Radio = {'Radio1'}; % selection of radio buttons, as cell with the handle of the target radio button
BatchOpt.showWaitbar = true; % show or not the waitbar
obj.startController('mibDeepController', [], BatchOpt); % start mibDeepController in the batch mode

or

% trigger return of the possible Options using the returnBatchOpt function
% using the notify syncBatch event
obj.startController('mibDeepController', [], NaN);

Constructor & Destructor Documentation

◆ mibDeepController()

mibDeepController.mibDeepController ( mibModel,
varargin )

Member Function Documentation

◆ activationLayerChangeCallback()

function mibDeepController.activationLayerChangeCallback ( )

callback for modification of the Activation Layer dropdown

◆ balanceClasses()

function mibDeepController.balanceClasses ( )

balance classes before training; see the example here: https://se.mathworks.com/help/vision/ref/balancepixellabels.html

References mibInputMultiDlg(), and mibShowErrorDialog().

Here is the call graph for this function:
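For orientation, below is a minimal sketch along the lines of the linked MathWorks balancePixelLabels example; the label directory, block size, and number of observations are placeholder assumptions, not DeepMIB defaults.

% sketch: oversample blocks that contain rare classes (placeholder file names)
labelDir = fullfile(tempdir, 'Labels');                  % directory with label images (assumption)
bim = blockedImage(fullfile(labelDir, 'labels01.png'));  % labels wrapped as a blockedImage
blockSize = [256 256];                                   % block size to sample
numObservations = 200;                                   % number of block locations to select
locationSet = balancePixelLabels(bim, blockSize, numObservations); % balanced blockLocationSet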

◆ bioformatsCallback()

function mibDeepController.bioformatsCallback ( event)

update available filename extensions

Parameters
event: an event structure of appdesigner
Required fields of event:

◆ channelWisePreProcess()

function imgOut = mibDeepController.channelWisePreProcess ( imgIn)

Normalize images. As the input has 4 channels (modalities), remove the mean and divide by the standard deviation of each modality independently.

Parameters
imgIn: input image, as a matrix [height, width, color, depth]
Return values
imgOut: resulting image, stretched between 0 and 1

References max, and min.
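A minimal sketch of this kind of channel-wise normalization is shown below; it illustrates the described behaviour rather than reproducing the DeepMIB implementation, and the final stretch to [0, 1] is an assumption based on the return value description.

function imgOut = channelWisePreProcessSketch(imgIn)
% imgIn: [height, width, color, depth] matrix
imgIn = single(imgIn);
chnMean = mean(imgIn, [1 2 4]);          % per-channel (modality) mean
chnStd = std(imgIn, 0, [1 2 4]);         % per-channel standard deviation
imgOut = (imgIn - chnMean) ./ chnStd;    % zero-center and scale each modality independently
imgOut = (imgOut - min(imgOut(:))) / (max(imgOut(:)) - min(imgOut(:))); % stretch to [0, 1]
end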

◆ checkNetwork()

function mibDeepController.checkNetwork ( fn)

generate and check network using settings in the Train tab

Parameters
fn: optional string with filename (*.mibDeep) to preview its configuration

References mibShowErrorDialog().

Here is the call graph for this function:
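The kind of check performed here can also be reproduced manually with MATLAB's Network Analyzer; a hedged sketch using a placeholder U-net instead of the network defined in the Train tab:

lgraph = unetLayers([256 256 3], 2);   % placeholder network (assumption, not the DeepMIB config)
analyzeNetwork(lgraph);                % opens the Network Analyzer to validate the layer graph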

◆ classificationAugmentationPipeline()

function [ dataOut , info ] = mibDeepController.classificationAugmentationPipeline ( dataIn,
info,
inputPatchSize,
outputPatchSize,
mode )

◆ closeWindow()

function mibDeepController.closeWindow ( )

update preferences structure

References handle.isvalid, and handle.notify.

◆ correctBatchOpt()

function res = mibDeepController.correctBatchOpt ( res)

correct loaded BatchOpt structure if it is not compatible with the current version of DeepMIB

Parameters
res: BatchOpt structure loaded from a file
Required fields of res:
Generated fields of res:

References mibDeepConvertOldAugmentationSettingsToNew().

Referenced by loadConfig().

Here is the call graph for this function:
Here is the caller graph for this function:

◆ countLabels()

function mibDeepController.countLabels ( )

count occurrences of labels in model files; callback for press of the "Count labels" button in the Options panel

References mibDeepStoreLoadCategorical(), mibDeepStoreLoadModel(), and mibInputMultiDlg().

Here is the call graph for this function:
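A hedged sketch of counting label occurrences with a pixelLabelDatastore; the class names, label IDs, and directory are placeholders, not values used by DeepMIB.

classNames = {'Exterior', 'Material1'};           % placeholder class names
pixelLabelIds = [0 1];                            % placeholder label IDs
labelDir = fullfile(tempdir, 'Labels');           % placeholder directory with label images
pxds = pixelLabelDatastore(labelDir, classNames, pixelLabelIds);
tbl = countEachLabel(pxds)                        % PixelCount and ImagePixelCount per class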

◆ createNetwork()

function [ lgraph , outputPatchSize ] = mibDeepController.createNetwork ( previewSwitch)

generate network. Parameters: previewSwitch: logical switch; when 1, the generated network is only for preview, i.e. weights of classes won't be calculated

Return values
lgraph: network object
outputPatchSize: output patch size as [height, width, depth, color]

References mibShowErrorDialog().

Here is the call graph for this function:

◆ customTrainingProgressWindow_Callback()

function mibDeepController.customTrainingProgressWindow_Callback ( event)

callback for click on obj.View.handles.O_CustomTrainingProgressWindow checkbox

◆ dnd_files_callback()

function mibDeepController.dnd_files_callback ( hWidget,
dragIn )

drag and drop config name to obj.View.handles.NetworkPanel to load it

Parameters
hWidget: a handle to the object where the drag action landed
dragIn: a structure containing the dragged object .ctrlKey - 0/1 whether the control key was pressed .shiftKey - 0/1 whether the shift key was pressed .names - cell array with filenames
Required fields of dragIn:

◆ duplicateConfigAndNetwork()

function mibDeepController.duplicateConfigAndNetwork ( )

copy the network file and its config to a new filename

References BatchOpt, mib_uigetfile(), and wb.

Here is the call graph for this function:

◆ evaluateSegmentation()

function mibDeepController.evaluateSegmentation ( )

evaluate segmentation results by comparing predicted models with the ground truth models

References handle.fieldnames, max, mibDeepController(), mibDeepStoreLoadImages(), mibDeepStoreLoadModel(), mibInputMultiDlg(), and mibShowErrorDialog().

Here is the call graph for this function:
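As an illustration only, the comparison of predicted models with ground truth can be expressed with evaluateSemanticSegmentation; the directories, class names, and label IDs below are assumptions.

classNames = {'Exterior', 'Material1'};
pixelIds = [0 1];
pxdsTruth = pixelLabelDatastore(fullfile(tempdir, 'GroundTruth'), classNames, pixelIds);
pxdsPred = pixelLabelDatastore(fullfile(tempdir, 'Predictions'), classNames, pixelIds);
metrics = evaluateSemanticSegmentation(pxdsPred, pxdsTruth); % global accuracy, IoU, BF score per class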

◆ evaluateSegmentationPatches()

function mibDeepController.evaluateSegmentationPatches ( )

evaluate segmentation results for the patches in the patch-wise mode

References C().

Here is the call graph for this function:

◆ exploreActivations()

function mibDeepController.exploreActivations ( )

explore activations within the trained network

◆ exportNetwork()

function mibDeepController.exportNetwork ( )

convert and export network to ONNX or TensorFlow formats

References exportONNXNetwork(), mibInputMultiDlg(), mibShowErrorDialog(), and wb.

Here is the call graph for this function:
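A minimal sketch of the ONNX half of such an export (requires the ONNX converter support package); the assumption that the *.mibDeep file stores a variable named net is illustrative only.

load('myNetwork.mibDeep', '-mat');          % assumption: the file contains a trained network in 'net'
exportONNXNetwork(net, 'myNetwork.onnx');   % write the network in ONNX format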

◆ generate3DDeepLabV3Network()

function net = mibDeepController.generate3DDeepLabV3Network ( imageSize,
numClasses,
downsamplingFactor,
targetNetwork )

generate a hybrid 2.5D DeepLab v3+ convolutional neural network for semantic image segmentation. The training data should be a small substack of 3, 5, 7, etc. slices, where only the middle slice is segmented. As an update, two blocks of 3D convolutions were added before the standard DLv3

Parameters
imageSize: vector [height, width, colors] defining the input patch size, should be larger than [224 224] for resnet18; colors should be 3
numClasses: number of output classes (including exterior) for the output results
targetNetwork: string defining the base architecture for the initialization: resnet18 - resnet18 network, resnet50 - resnet50 network, xception - xception network, inceptionresnetv2 - inceptionresnetv2 network

References mibInputMultiDlg().

Here is the call graph for this function:

◆ generateDeepLabV3Network()

function net = mibDeepController.generateDeepLabV3Network ( imageSize,
numClasses,
targetNetwork )

generate DeepLab v3+ convolutional neural network for semantic image segmentation of 2D RGB images

Parameters
imageSize: vector [height, width, colors] defining the input patch size, should be larger than [224 224] for resnet18; colors should be 3
numClasses: number of output classes (including exterior) for the output results
targetNetwork: string defining the base architecture for the initialization: resnet18 - resnet18 network, resnet50 - resnet50 network, xception - xception network (requires MATLAB), inceptionresnetv2 - inceptionresnetv2 network (requires MATLAB)

References max, and mibInputMultiDlg().

Here is the call graph for this function:

◆ generateDynamicMaskingBlocks()

function bls = mibDeepController.generateDynamicMaskingBlocks ( vol,
blockSize,
noColors )

generate blocks using dynamic masking parameters acquired in obj.DynamicMaskOpt

Parameters
vol: blocked image to process
blockSize: block size
noColors: number of color channels in the blocked image
Return values
bls: calculated blockLocationSet .ImageNumber .BlockOrigin .BlockSize .Levels

References max.
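A hedged sketch of dynamic masking with blockedImage: threshold a coarse copy of the volume and keep only the blocks whose mask coverage exceeds the inclusion threshold. The threshold values mirror the obj.DynamicMaskOpt defaults; the use of gather/selectBlockLocations here is an illustration, not the DeepMIB implementation.

bim = blockedImage('myVolume.tif');                       % placeholder blocked image
coarse = gather(bim);                                     % in-memory copy at the coarsest level
maskImg = coarse > 60;                                    % 'Keep above threshold', ThresholdValue = 60
bmask = blockedImage(maskImg);                            % mask wrapped as a blockedImage
bls = selectBlockLocations(bim, 'BlockSize', [256 256], ...
    'Masks', bmask, 'InclusionThreshold', 0.1);           % keep blocks with >10% mask coverage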

◆ generateUnet2DwithEncoder()

function [ net , outputSize ] = mibDeepController.generateUnet2DwithEncoder ( imageSize,
encoderNetwork )

generate Unet convolutional neural network for semantic image segmentation of 2D RGB images using a specified encoder

Parameters
imageSize: vector [height, width, colors] defining the input patch size, should be larger than [224 224] for Resnet18; colors should be 3
encoderNetwork: string defining the base architecture for the initialization: Classic - classic unet architecture, Resnet18 - Resnet18 network, Resnet50 - Resnet50 network
Return values
net: U-net dlnetwork, with softmax (Name: FinalNetworkSoftmax-Layer) as the final layer
outputSize: output size of the network, returned as [height, width, number of classes]

◆ gpuInfo()

function mibDeepController.gpuInfo ( )

display information about the selected GPU

References D(), and handle.fieldnames.

Referenced by mibDeepController().

Here is the call graph for this function:
Here is the caller graph for this function:

◆ helpButton_callback()

function mibDeepController.helpButton_callback ( )

show Help sections

◆ importNetwork()

function mibDeepController.importNetwork ( )

import an externally trained or designed network to be used with DeepMIB

Example
% generate a network
net = deeplabv3plusLayers([512 512 3], 5, 'resnet18');
% save the network to a file
save('myNewNetwork.mat', 'net', '-mat');
% use the Import operation to load and adapt the network for use with DeepMIB

References ActivationLayerOpt, BatchOpt, DynamicMaskOpt, handle.fieldnames, InputLayerOpt, mib_uigetfile(), mibInputMultiDlg(), SegmentationLayerOpt, and wb.

Here is the call graph for this function:

◆ loadConfig()

function mibDeepController.loadConfig ( configName)

load config file with Deep MIB settings

Parameters
configName: full filename for the config file to load

References correctBatchOpt(), mib_uigetfile(), mibConcatenateStructures(), and updateBatchOptCombineFields_Shared().

Here is the call graph for this function:

◆ mibDeepAugmentAndCrop2dPatchMultiGPU()

function [ patchOut , info , augList , augPars ] = mibDeepController.mibDeepAugmentAndCrop2dPatchMultiGPU ( info,
inputPatchSize,
outputPatchSize,
mode,
options )

◆ mibDeepAugmentAndCrop3dPatchMultiGPU()

function [ patchOut , info , augList , augPars ] = mibDeepController.mibDeepAugmentAndCrop3dPatchMultiGPU ( info,
inputPatchSize,
outputPatchSize,
mode,
options )

◆ preprareTrainingOptions()

function TrainingOptions = mibDeepController.preprareTrainingOptions ( valDS)

prepare training options for the network training

Parameters
valDS: datastore with images for validation

References mibShowErrorDialog().

Here is the call graph for this function:

◆ preprareTrainingOptionsInstances()

function TrainingOptions = mibDeepController.preprareTrainingOptionsInstances ( valDS)

prepare training options for training of the instance segmentation network

Parameters
valDS: datastore with images for validation

References mibShowErrorDialog().

Here is the call graph for this function:

◆ previewDynamicMask()

function mibDeepController.previewDynamicMask ( )

preview results for the dynamic mode

References wb.

◆ previewImagePatches_Callback()

function mibDeepController.previewImagePatches_Callback ( event)

callback for value change of obj.View.handles.O_PreviewImagePatches

◆ previewModels()

function mibDeepController.previewModels ( loadImagesSwitch)

load images for predictions and the resulting models into MIB

Parameters
loadImagesSwitch: [logical], whether to load images (set to false when images have already been preloaded); when true, both images and models are loaded, when false only models are loaded

◆ previewPredictions()

function mibDeepController.previewPredictions ( )

load images of prediction scores into MIB

◆ processBlocksBlockedImage()

function [ outputLabels , scoreImg ] = mibDeepController.processBlocksBlockedImage ( vol,
zValue,
net,
inputPatchSize,
outputPatchSize,
blockSize,
padShift,
dataDimension,
patchwiseWorkflowSwitch,
patchwisePatchesPredictSwitch,
classNames,
generateScoreFiles,
executionEnvironment,
fn )

dataDimension: numeric switch that identifies the dataset dimension; can be 2, 2.5, 3

Required fields of vol:

References max, mibDeepController(), and handle.permute.

Here is the call graph for this function:

◆ processImages()

function mibDeepController.processImages ( preprocessFor)

Preprocess images for training and prediction.

Parameters
preprocessFor: a string with the target, training or prediction

References amiraMesh2bitmap(), mibDeepStoreLoadImages(), mibDeepStoreLoadModel(), mibLoadImages(), mibShowErrorDialog(), handle.permute, and saveImageParFor().

Here is the call graph for this function:

◆ returnBatchOpt()

function mibDeepController.returnBatchOpt ( BatchOptOut)

return structure with Batch Options and possible configurations via the notify syncBatch event. Parameters: BatchOptOut: a local structure with Batch Options generated during the Continue callback; it may contain more fields than the obj.BatchOpt structure

References handle.notify.

◆ saveCheckpointNetworkCheck()

function mibDeepController.saveCheckpointNetworkCheck ( )

callback for press of Save checkpoint networks (obj.View.handles.T_SaveProgress)

References mibInputMultiDlg().

Here is the call graph for this function:

◆ saveConfig()

function mibDeepController.saveConfig ( configName)

save Deep MIB configuration to a file

Parameters
configName: [optional] string, full filename of the config file

References ActivationLayerOpt, BatchOpt, convertAbsoluteToRelativePath(), DynamicMaskOpt, InputLayerOpt, and SegmentationLayerOpt.

Here is the call graph for this function:

◆ segmentBlockedImage()

static function [ outputLabeledImageBlock , scoreBlock ] = mibDeepController.segmentBlockedImage ( block,
net,
dataDimension,
patchwiseWorkflowSwitch,
generateScoreFiles,
executionEnvironment,
padShift )
static

test function for utilization of blockedImage for prediction. The input block will be a batch of blocks from a blockedImage.

  .BlockSub: [1 1 1]
  .Start: [1 1 1]
  .End: [224 224 3]
  .Level: 1
  .ImageNumber: 1
  .BorderSize: [0 0 0]
  .BlockSize: [224 224 3]
  .BatchSize: 1
  .Data: [224×224×3 uint8]

net: a trained DAGNetwork
dataDimension: numeric switch that identifies the dataset dimension; can be 2, 2.5, 3
patchwiseWorkflowSwitch: logical switch indicating the patch-wise mode; when true, use patch mode, when false, use semantic segmentation
generateScoreFiles: variable to generate score files with probabilities of classes: 0 - do not generate, 1 - use AM format, 2 - use Matlab non-compressed format, 3 - use Matlab compressed format, 4 - use Matlab non-compressed format (range 0-1)
executionEnvironment: string with the environment to execute prediction
padShift: numeric, (y,x,z or y,x) value for the padding, used during the overlap mode to crop the output patch for export

Parameters
block: a structure with a block that is provided by blockedImage/apply. The first and second iterations have batch size == 1, while the following ones have the batch size equal to the selected value. The fields of the structure are listed below.
Required fields of block:

References handle.permute, and handle.reshape.
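A hedged usage sketch of how a block-processing function with this signature is typically wired into blockedImage/apply; the parameter values (2D data, semantic mode, no score files, auto environment, zero pad shift) and the file name are assumptions.

bim = blockedImage('myImage.tif');                          % placeholder input image
% net is assumed to be a trained network loaded beforehand
labelsBim = apply(bim, ...
    @(block) mibDeepController.segmentBlockedImage(block, net, 2, false, 0, 'auto', 0), ...
    'Level', 1, 'PadPartialBlocks', true);                  % run block-wise prediction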

◆ selectArchitecture()

function mibDeepController.selectArchitecture ( event)

select the target architecture

◆ selectDirerctories()

function mibDeepController.selectDirerctories ( event)

◆ selectGPUDevice()

function mibDeepController.selectGPUDevice ( )

select environment for computations

◆ selectNetwork()

function net = mibDeepController.selectNetwork ( networkName)

select a filename for a new network in the Train mode, or select a network to use for the Predict mode

Parameters
networkName: optional parameter with the full filename of the network
Return values
net: trained network

References handle.fieldnames, mib_uigetfile(), mibConcatenateStructures(), and updateBatchOptCombineFields_Shared().

Here is the call graph for this function:

◆ selectWorkflow()

function mibDeepController.selectWorkflow ( event)

select deep learning workflow to perform

◆ sendReportsCallback()

function mibDeepController.sendReportsCallback ( )

define parameters for sending progress report to the user's email address

References mibInputMultiDlg().

Here is the call graph for this function:

◆ setActivationLayerOptions()

function mibDeepController.setActivationLayerOptions ( )

update options for the activation layers

References mibInputMultiDlg().

Here is the call graph for this function:

◆ setAugFuncHandles()

function [ status , augNumber ] = mibDeepController.setAugFuncHandles ( mode,
augOptions )

define list of 2D/3D augmentation functions

Parameters
mode: string defining 2D or 3D augmentations
augOptions: a custom temporary structure with augmentation options to be used instead of obj.AugOpt2D and obj.AugOpt3D. It is used by mibDeepAugmentSettingsController to preview selected augmentations
Return values
status: a logical success switch (1 - success, 0 - fail)
augNumber: number of selected augmentations

References handle.fieldnames.

◆ setAugmentation2DSettings()

function mibDeepController.setAugmentation2DSettings ( )

update settings for augmentation of 2D images

References mibDeepConvertOldAugmentationSettingsToNew().

Here is the call graph for this function:

◆ setAugmentation3DSettings()

function mibDeepController.setAugmentation3DSettings ( )

update settings for augmentation of 3D images

References mibDeepConvertOldAugmentationSettingsToNew().

Here is the call graph for this function:

◆ setInputLayerSettings()

function mibDeepController.setInputLayerSettings ( )

update init settings for the input layer of networks

References mibInputMultiDlg(), and handle.reshape.

Here is the call graph for this function:

◆ setSegmentationLayer()

function mibDeepController.setSegmentationLayer ( )

callback for modification of the Segmentation Layer dropdown

◆ setSegmentationLayerOptions()

function mibDeepController.setSegmentationLayerOptions ( )

update options for the segmentation layer

References mibInputMultiDlg().

Here is the call graph for this function:

◆ setTrainingSettings()

function mibDeepController.setTrainingSettings ( )

update settings for training of networks

References mibInputMultiDlg().

Here is the call graph for this function:

◆ singleModelTrainingFileValueChanged()

function mibDeepController.singleModelTrainingFileValueChanged ( event)

callback for press of SingleModelTrainingFile

◆ start()

function mibDeepController.start ( event)

start calculations; depending on the selected tab, preprocessing, training, or prediction is initialized

Required fields of event:

References mibInputMultiDlg(), and mibShowErrorDialog().

Here is the call graph for this function:

◆ startPrediction2D()

function mibDeepController.startPrediction2D ( )

predict 2D datasets; moved to a separate function for better performance

References bitmap2amiraMesh(), max, mibDeepStoreLoadImages(), mibShowErrorDialog(), modelMaterialColors, handle.notify, and saveImageParFor().

Here is the call graph for this function:

◆ startPrediction3D()

function mibDeepController.startPrediction3D ( )

predict datasets for 3D networks; moved to a separate function to improve performance

References bitmap2amiraMesh(), max, mibDeepStoreLoadImages(), mibShowErrorDialog(), min, modelMaterialColors, handle.notify, handle.permute, and saveImageParFor().

Here is the call graph for this function:

◆ startPredictionBlockedImage()

function mibDeepController.startPredictionBlockedImage ( )

predict 2D/3D datasets using the blockedImage class; requires R2021a or newer

References bitmap2amiraMesh(), max, mibDeepStoreLoadImages(), mibDoImageFiltering(), mibResize3d(), mibShowErrorDialog(), min, modelMaterialColors, handle.notify, and saveImageParFor().

Here is the call graph for this function:

◆ startPreprocessing()

function mibDeepController.startPreprocessing ( )

References mibShowErrorDialog(), and wb.

Here is the call graph for this function:

◆ tif3DFileRead()

static function data = mibDeepController.tif3DFileRead ( filename)
static

data = tif3DFileRead(filename); custom reading function to load tif files with a stack of images, used in the evaluate segmentation function
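A minimal sketch of such a multi-page TIFF reader, assuming a single-channel (grayscale) uint8 stack; it illustrates the idea rather than reproducing the DeepMIB code.

function data = tif3DFileReadSketch(filename)
info = imfinfo(filename);                                  % one entry per image in the stack
noSlices = numel(info);
data = zeros(info(1).Height, info(1).Width, noSlices, 'uint8');
for z = 1:noSlices
    data(:, :, z) = imread(filename, 'Index', z);          % read slice z of the multi-page tif
end
end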

◆ toggleAugmentations()

function mibDeepController.toggleAugmentations ( )

callback for press of the T_augmentation checkbox

◆ transferLearning()

function mibDeepController.transferLearning ( )

perform fine-tuning of the loaded network to a different number of classes

References mibInputMultiDlg().

Here is the call graph for this function:

◆ updateActivationLayers()

function lgraph = mibDeepController.updateActivationLayers ( lgraph)

update the activation layers depending on settings in obj.BatchOpt.T_ActivationLayer and obj.ActivationLayerOpt

Required fields of lgraph:

◆ updateBatchOptFromGUI()

function mibDeepController.updateBatchOptFromGUI ( event)

References updateBatchOptFromGUI_Shared().

Here is the call graph for this function:

◆ updateConvolutionLayers()

function lgraph = mibDeepController.updateConvolutionLayers ( lgraph)

update the convolution layers by providing a new set of weight initializers

Required fields of lgraph:

◆ updateDynamicMaskSettings()

function mibDeepController.updateDynamicMaskSettings ( )

update settings for calculation of dynamic masks during prediction using the blockedImage mode; the settings are stored in obj.DynamicMaskOpt

References mibInputMultiDlg().

Here is the call graph for this function:

◆ updateImageDirectoryPath()

function mibDeepController.updateImageDirectoryPath ( event)

update directories with images for training, prediction and results

Required fields of event:

◆ updateMaxPoolAndTransConvLayers()

function lgraph = mibDeepController.updateMaxPoolAndTransConvLayers ( lgraph,
poolSize )

update maxPool and TransposedConvolution layers depending on the network downsampling factor, only for U-net and SegNet. This function is applied when the network downsampling factor is different from 2

Required fields of lgraph:

◆ updateNetworkInputLayer()

function lgraph = mibDeepController.updateNetworkInputLayer ( lgraph,
inputPatchSize )

update the input layer settings for lgraph; parameters are taken from obj.InputLayerOpt

Required fields of lgraph:

References handle.reshape.

◆ updatePreprocessingMode()

function mibDeepController.updatePreprocessingMode ( )

callback for change of selection in the Preprocess for dropdown

◆ updateSegmentationLayer()

function lgraph = mibDeepController.updateSegmentationLayer ( lgraph,
classNames )

redefine the segmentation layer of lgraph based on obj.BatchOpt settings

Parameters
classNames: cell array with class names; when not provided, the auto switch is used
Required fields of lgraph:

◆ updateWidgets()

function mibDeepController.updateWidgets ( )

update widgets of this window

References updateGUIFromBatchOpt_Shared().

Here is the call graph for this function:

◆ viewListner_Callback()

static function mibDeepController.viewListner_Callback ( obj,
src,
evnt )
static

Member Data Documentation

◆ ActivationLayerOpt

mibDeepController.ActivationLayerOpt

options for the activation layer

Referenced by importNetwork(), and saveConfig().

◆ Aug2DFuncNames

mibDeepController.Aug2DFuncNames

cell array with names of 2D augmenter functions

◆ Aug2DFuncProbability

mibDeepController.Aug2DFuncProbability

probabilities of each 2D augmentation action to be triggered

◆ Aug3DFuncNames

mibDeepController.Aug3DFuncNames

cell array with names of 3D augmenter functions

◆ Aug3DFuncProbability

mibDeepController.Aug3DFuncProbability

probabilities of each 3D augmentation action to be triggered

◆ AugOpt2D

mibDeepController.AugOpt2D

a structure with augmentation options for 2D unets; the defaults are obtained from obj.mibModel.preferences.Deep.AugOpt2D, see getDefaultParameters.m
.FillValue = 0;
.RandXReflection = true;
.RandYReflection = true;
.RandRotation = [-10, 10];
.RandScale = [.95 1.05];
.RandXScale = [.95 1.05];
.RandYScale = [.95 1.05];
.RandXShear = [-5 5];
.RandYShear = [-5 5];

◆ AugOpt3D

mibDeepController.AugOpt3D

a structure with augmentation options for 3D networks:
.Fraction = .6; % augment 60% of patches
.FillValue = 0;
.RandXReflection = true;
.RandYReflection = true;
.RandZReflection = true;
.Rotation90 = true;
.ReflectedRotation90 = true;

◆ availableArchitectures

mibDeepController.availableArchitectures

containers.Map with available architectures
keySet = {2D Semantic, 2.5D Semantic, 3D Semantic, 2D Patch-wise, 2D Instance};
valueSet{1} = {DeepLab v3+, SegNet, U-net, U-net +Encoder}; % old: {U-net, SegNet, DLv3 Resnet18, DLv3 Resnet50, DLv3 Xception, DLv3 Inception-ResNet-v2}
valueSet{2} = {Z2C + DLv3, Z2C + DLv3, Z2C + U-net, Z2C + U-net +Encoder}; % 3DC + DLv3 Resnet18
valueSet{3} = {U-net, U-net Anisotropic}
valueSet{4} = {Resnet18, Resnet50, Resnet101, Xception}
valueSet{5} = {SOLOv2} % old: {SOLOv2 Resnet18, SOLOv2 Resnet50}
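A minimal sketch of how this containers.Map can be built from the key/value sets listed above (a reconstruction for illustration; the duplicated DLv3 entry in the 2.5D listing is collapsed):

keySet = {'2D Semantic', '2.5D Semantic', '3D Semantic', '2D Patch-wise', '2D Instance'};
valueSet = { ...
    {'DeepLab v3+', 'SegNet', 'U-net', 'U-net +Encoder'}, ...
    {'Z2C + DLv3', 'Z2C + U-net', 'Z2C + U-net +Encoder'}, ...
    {'U-net', 'U-net Anisotropic'}, ...
    {'Resnet18', 'Resnet50', 'Resnet101', 'Xception'}, ...
    {'SOLOv2'}};
availableArchitectures = containers.Map(keySet, valueSet);
availableArchitectures('2D Semantic')    % -> {'DeepLab v3+', 'SegNet', 'U-net', 'U-net +Encoder'}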

◆ availableEncoders

mibDeepController.availableEncoders

containers.Map with available encoders
keySet is a mixture of "workflow -space- architecture": {2D Semantic DeepLab v3+, 2D Semantic U-net +Encoder}
encoders for each "workflow -space- architecture" combination; the last value shows the selected encoder
encodersList{1} = {Resnet18, Resnet50, Xception, InceptionResnetv2, 1}; % for 2D Semantic DeepLab v3+
encodersList{2} = {Classic, Resnet18, Resnet50, 2}; % for 2D Semantic U-net +Encoder
encodersList{3} = {Resnet18, Resnet50, 1}; % for 2D Instance SOLOv2

◆ BatchOpt

mibDeepController.BatchOpt

a structure compatible with batch operation; the name of each field should be displayed in a tooltip of the GUI, and it is recommended that the Tags of widgets match the names of the fields in this structure
.Parameter - [editbox], char/string
.Checkbox - [checkbox], logical value true or false
.Dropdown{1} - [dropdown], cell string for the dropdown
.Dropdown{2} - [optional], an array with possible options
.Radio - [radiobuttons], cell string Radio1 or Radio2...
.ParameterNumeric{1} - [numeric editbox], cell with a number
.ParameterNumeric{2} - [optional], vector with limits [min, max]
.ParameterNumeric{3} - [optional], string, on - round the value, off - do not round the value

Referenced by duplicateConfigAndNetwork(), importNetwork(), and saveConfig().
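A hedged sketch of a BatchOpt-style structure following the field conventions above; the field names here are illustrative, not the real DeepMIB fields.

BatchOpt.Parameter = 'some text';                 % [editbox], char/string
BatchOpt.Checkbox = true;                         % [checkbox], logical
BatchOpt.Dropdown = {'option A'};                 % {1} - selected value
BatchOpt.Dropdown{2} = {'option A', 'option B'};  % {2} - list of possible options
BatchOpt.Radio = {'Radio1'};                      % [radiobuttons], selected tag
BatchOpt.ParameterNumeric = {10};                 % {1} - numeric value in a cell
BatchOpt.ParameterNumeric{2} = [0 100];           % {2} - limits [min, max]
BatchOpt.ParameterNumeric{3} = 'on';              % {3} - 'on' to round the value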

◆ childControllers

mibDeepController.childControllers

list of opened subcontrollers

◆ childControllersIds

mibDeepController.childControllersIds

a cell array with names of initialized child controllers

◆ closeEvent

EVENT mibDeepController.closeEvent

Event fired when the window is closed

Events
closeEvent

◆ colormap20

mibDeepController.colormap20

colormap for 20 colors

◆ colormap255

mibDeepController.colormap255

colormap for 255 colors

◆ colormap6

mibDeepController.colormap6

colormap for 6 colors

◆ DynamicMaskOpt

mibDeepController.DynamicMaskOpt

options for calculation of dynamic masks for prediction using the blocked image mode
.Method = 'Keep above threshold'; % 'Keep above threshold' or 'Keep below threshold'
.ThresholdValue = 60;
.InclusionThreshold = 0.1; % inclusion threshold for mask blocks

Referenced by importNetwork(), and saveConfig().

◆ gpuInfoFig

mibDeepController.gpuInfoFig

a handle for GPU info window

◆ InputLayerOpt

mibDeepController.InputLayerOpt

a structure with settings for the input layer
.Normalization = 'zerocenter';
.Mean = [];
.StandardDeviation = [];
.Min = [];
.Max = [];

Referenced by importNetwork(), and saveConfig().
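A hedged sketch of how such settings could map onto an image input layer; the layer type and the patch size are assumptions.

InputLayerOpt = struct('Normalization', 'zerocenter', 'Mean', [], ...
    'StandardDeviation', [], 'Min', [], 'Max', []);
inLayer = imageInputLayer([512 512 3], ...
    'Name', 'ImageInputLayer', ...
    'Normalization', InputLayerOpt.Normalization, ...
    'Mean', InputLayerOpt.Mean);                   % empty Mean is computed at training time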

◆ listener

mibDeepController.listener

a cell array with handles to listeners

◆ mibController

mibDeepController.mibController

handle to mib controller

◆ mibModel

mibDeepController.mibModel

handles to mibModel

Referenced by mibDeepController().

◆ modelMaterialColors

mibDeepController.modelMaterialColors

colors of materials

Referenced by startPrediction2D(), startPrediction3D(), and startPredictionBlockedImage().

◆ PatchPreviewOpt

mibDeepController.PatchPreviewOpt

structure with preview patch options
.noImages = 9; % number of images in the montage
.imageSize = 160; % patch size for preview
.labelShow = true; % display overlay labels with details
.labelSize = 9; % font size for the label
.labelColor = 'black'; % color of the label
.labelBgColor = 'yellow'; % color of the label background
.labelBgOpacity = 0.6; % opacity of the background

◆ SegmentationLayerOpt

mibDeepController.SegmentationLayerOpt

options for the segmentation layer

Referenced by importNetwork(), and saveConfig().

◆ SendReports

mibDeepController.SendReports

send email reports with progress of the training process
.T_SendReports = false;
.FROM_email = 'user@gmail.com';
.SMTP_server = 'smtp-relay.brevo.com';
.SMTP_port = 587;
.SMTP_auth = true;
.SMTP_starttls = true;
.SMTP_sername = 'user@gmail.com';
.SMTP_password = '';
.sendWhenFinished = false;
.sendDuringRun = false;

◆ sessionSettings

mibDeepController.sessionSettings

structure for the session settings .countLabelsDir - directory with labels to count, used in count labels function

◆ TrainEngine

mibDeepController.TrainEngine

temp property to test trainnet function for training: can be trainnet or trainNetwork

◆ TrainingOpt

mibDeepController.TrainingOpt

a structure with training options; the default values are obtained from obj.mibModel.preferences.Deep.TrainingOpt, see getDefaultParameters.m
.solverName = 'adam';
.MaxEpochs = 50;
.Shuffle = 'once';
.InitialLearnRate = 0.0005;
.LearnRateSchedule = 'piecewise';
.LearnRateDropPeriod = 10;
.LearnRateDropFactor = 0.1;
.L2Regularization = 0.0001;
.Momentum = 0.9;
.ValidationFrequency = 400;
.Plots = 'training-progress';
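For orientation, a hedged sketch of how a TrainingOpt structure of this kind maps onto MATLAB's trainingOptions; only a subset of fields is shown, the mapping is assumed, and Momentum is omitted because it applies to the sgdm solver only.

TrainingOpt = struct('solverName', 'adam', 'MaxEpochs', 50, 'Shuffle', 'once', ...
    'InitialLearnRate', 0.0005, 'LearnRateSchedule', 'piecewise', ...
    'LearnRateDropPeriod', 10, 'LearnRateDropFactor', 0.1, ...
    'L2Regularization', 0.0001, 'ValidationFrequency', 400);
opts = trainingOptions(TrainingOpt.solverName, ...
    'MaxEpochs', TrainingOpt.MaxEpochs, ...
    'Shuffle', TrainingOpt.Shuffle, ...
    'InitialLearnRate', TrainingOpt.InitialLearnRate, ...
    'LearnRateSchedule', TrainingOpt.LearnRateSchedule, ...
    'LearnRateDropPeriod', TrainingOpt.LearnRateDropPeriod, ...
    'LearnRateDropFactor', TrainingOpt.LearnRateDropFactor, ...
    'L2Regularization', TrainingOpt.L2Regularization, ...
    'ValidationFrequency', TrainingOpt.ValidationFrequency, ...
    'Plots', 'training-progress');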

◆ TrainingProgress

mibDeepController.TrainingProgress

a structure to be used for the training progress plot for the compiled version of MIB
.maxNoIter - maximum number of iterations during training
.iterPerEpoch - iterations per epoch
.stopTraining - logical switch that forces training to stop and saves all progress

◆ View

mibDeepController.View

handle to the view / mibDeepGUI

◆ wb

mibDeepController.wb

handle to waitbar

The documentation for this class was generated from the following files: