Posted: 22.07.2022, 13:24
Title: Problems using a MATLAB script from GitHub in a loop
Code:
function varargout = ps_LST_lpa(varargin)
%ps_LST_lpa Lesion segmentation by a lesion prediction algorithm (LPA)
% Part of the LST toolbox, www.statistical-modelling.de/lst.html
%
% ps_LST_lpa Lesion segmentation by the LPA requires a FLAIR image only.
% However, the user is free to choose an additional image that serves as
% a reference image during a coregistration step before the main lesion
% segmentation. This may be helpful if the dimension of the FLAIR image
% is low or if the goal of the lesion segmentation is to fill lesions in
% T1 images. Beside that, no other parameters need to be set by the user.
%
% This routine produces lesion probability maps (ples...), [coregistered]
% bias corrected versions of the FLAIR inputs, a .mat-file with
% information about the segmentation that is needed for a re-run of the
% algorithm, and a HTML report along with a folder if this option has
% been chosen. See the documentation for details.
%
% ps_LST_lpa asks the user for the input images (FLAIR and optional
% reference images). A HTML report is produced by default.
%
% ps_LST_lpa(Vf2, Vref, html) performs lesion segmentation for the FLAIR
% volumes given in Vf2 and reference volumes given in Vref. The latter
% can be left empty. If specified, both must be characters like a call
% from spm_select. The last argument is a dummy for the HTML reports. If
% the last argument is missing it is replaced by its default value, see
% above.
%
% ps_LST_lpa(job) Same as above but with job being a harvested job data
% structure (see matlabbatch help).
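%
% Example (hypothetical call, assuming character paths as returned by
% spm_select and a logical value for the html argument):
%   Vf2 = spm_select(Inf, 'image', 'Select FLAIR images');
%   ps_LST_lpa(Vf2, '', true);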
%
% Check if user has writing permissions
[~, struc] = fileattrib;
if ~struc.UserWrite
fprintf(fileID, 'User has no writing permissions!');
error('You do not have writing permissions for the current folder.\n');
end
% Load some stuff
load(fullfile(spm('dir'), 'toolbox', 'LST', 'LST_lpa_stuff.mat'))
% Warp TPMs and spatial effect to native space
% --------------------------------------------
strout = 'Rough segmentation of FLAIR ';
fprintf(strout)
tic
% Warp brain position of MNI space in native space
fprintf(fileID, 'Warp brain position ...');
cd(tmpFolder)
bp = ps_LST_lpa_mni2ns(bp_mni, 'bp', Vf2_tmp, [], [], []);
indx_brain = find(bp > 0);
fprintf(fileID, 'ok.\n');
cd ..
% Get TPMs in native space
fprintf(fileID, 'TPMs in native space ...');
csf = ps_LST_bc_mni2ns(csf_mni, 'tissue', Vf2_tmp, bp_mni, indx_brain, bp);
gm = ps_LST_bc_mni2ns(gm_mni, 'tissue', Vf2_tmp, bp_mni, indx_brain, bp);
wm = ps_LST_bc_mni2ns(wm_mni, 'tissue', Vf2_tmp, bp_mni, indx_brain, bp);
fprintf(fileID, 'ok.\n');
% Rough segmentation of FLAIR into CSF, GM and WM
% -----------------------------------------------
fprintf(fileID, 'Rough segmentation of FLAIR ...');
% For the calculation of the lesion belief maps we need a rough
% segmentation of the FLAIR image. As the segmentation by SPM is
% quite poor we use our own one. Here, we set up a simple mixture
% model with one Gaussian per tissue class and keep the prior
% probabilities (the TPMs) for these classes fixed.
% For the segmentation we smooth the FLAIR image with a Gaussian
% kernel with FWHM at 1.5 mm
f2 = spm_read_vols(Vf2_tmp);
sf2 = 0 .* f2;
spm_smooth(f2, sf2, [1, 1, 1] .*1.5);
f2_vec = sf2(indx_brain);
clear sf2;
% Prior probabilities are the TPMs
prior = [csf(indx_brain), gm(indx_brain), wm(indx_brain)];
clear gm;
prior = bsxfun(@times, prior, 1 ./ sum(prior, 2));
post = 0 .* prior;
[m, I] = max(prior, [], 2);
st = 0;
counter = 0;
% Main loop of rough segmentation
while ~st
counter = counter + 1;
% Mean and SD of Gaussians
m = [mean(f2_vec(I == 1)), mean(f2_vec(I == 2)), mean(f2_vec(I == 3))];
s = [std(f2_vec(I == 1)), std(f2_vec(I == 2)), std(f2_vec(I == 3))];
% Posterior probabilities
for j = 1:3
%post(:,j) = normpdf(f2_vec, m(j), s(j)) .* prior(:,j);
post(:,j) = ps_dnorm(f2_vec, m(j), s(j)) .* prior(:,j);
end
post = bsxfun(@times, post, 1 ./ sum(post, 2));
% New class labels
[~, I_new] = max(post, [], 2);
% Correct misclassified voxels, i.e. voxels that are segmented
% as CSF but have intensities larger than the mean of GM
I_new(f2_vec > m(2) & I_new == 1) = 2;
% Check if the labels of any voxels have been changed.
if isequal(I, I_new)
st = 1;
else
I = I_new;
end
end % END while ~st
% One last iteration. Here we use the last posterior probability as
% the prior probabilities. This helps in identifying CSF voxels
% better.
for j = 1:3
%post(:,j) = normpdf(f2_vec, m(j), s(j)) .*mean(I_new == j);
post(:,j) = ps_dnorm(f2_vec, m(j), s(j)) .*mean(I_new == j);
end
post = bsxfun(@times, post, 1 ./ sum(post, 2));
[~, I_new] = max(post, [], 2);
I_new(f2_vec > m(2) & I_new == 1) = 2;
seg = 0.*f2;
seg(indx_brain) = I_new;
% Correct voxels in the TPM for CSF
csf(seg > 1) = 0;
fprintf(fileID, ['ok, with ' num2str(counter), ' iterations.\n']);
fprintf(fileID, 'Lesion belief maps ...');
% For the lesion belief map we use the original FLAIR image
f2_vec = f2(indx_brain);
tmp = f2_vec(I == 2);
f2_vec = f2_vec ./ mean(tmp(~isnan(tmp)));
% In order to calculate the correct distance (in mm) we need to
% correct for different voxel dimensions. We do this by artificially
% blowing the brain up until all voxels have a natural number as
% voxel dimension.
fprintf(fileID, 'Factor ...');
% Find factor that gives a natural number for each voxel size
fac_rs = 1;
vs_rs = vs;
while any(mod(vs_rs, 1) > 0)
fac_rs = fac_rs + 1;
vs_rs = vs .* fac_rs;
end
fprintf(fileID, [' ok, ', num2str(fac_rs), '\n']);
fprintf(fileID, 'Translate coordinates ...');
% Create an empty image with the new voxel size
outerCSF_rs = zeros(size(f2, 1) .* vs_rs(1), ...
    size(f2, 2) .* vs_rs(2), ...
    size(f2, 3) .* vs_rs(3));
% Translate coordinates of the original image into coordinates of
% the new image
coords = indx2coord(indx_brain, size(f2, 1), size(f2, 2));
coords_rs = coords;
for j = 1:3
if vs_rs(j) > 0
coords_rs(:,j) = round(ps_scale(coords(:,j), ...
    min(coords(:,j)), ...
    max(coords(:,j)) .* vs_rs(j)));
end
end
% Translate coordinates back to indx
indx_rs = coord2indx(coords_rs(:,1), coords_rs(:,2), coords_rs(:,3), ...
    size(f2, 1) .* vs_rs(1), size(f2, 2) .* vs_rs(2));
fprintf(fileID, 'ok.\n');
fprintf(fileID, 'Calculate distance ...');
% Fill new image
outerCSF_rs(indx_rs) = outerCSF(indx_brain);
% Calculate distance of all voxels to non-zero voxels
outerCSF_rs_dist = bwdist(outerCSF_rs);
clear outerCSF_rs;
% Put calculated distances back to original image
outerCSF_dist = 0 .* f2;
outerCSF_dist(indx_brain) = outerCSF_rs_dist(indx_rs) ./ fac_rs;
clear outerCSF_rs_dist;
fprintf(fileID, 'ok.\n');
end
% Finally: delete all voxels whose distance to outer CSF is smaller
% than or equal to 4 mm
fprintf(fileID, 'Delete voxels ...');
prob2 = prob;
prob2(outerCSF_dist <= 4) = 0;
fprintf(fileID, 'ok.\n');
% Correct for big lesions that are too close at outer CSF
fprintf(fileID, 'Correct for big lesions that are too close at outer CSF ...');
indx_les = find(prob > 0);
bw = ps_bwlabeln(1 .* (prob > 0));
bw_tmp = bw(indx_les);
p_tmp = prob(indx_les);
p2_tmp = prob2(indx_les);
gs1 = arrayfun(@(x)mean(p_tmp(bw_tmp == x)), (1:max(bw_tmp))');
gs2 = arrayfun(@(x)mean(p2_tmp(bw_tmp == x)), (1:max(bw_tmp))');
%gs1 = grpstats(p_tmp, bw_tmp, 'mean');
%gs2 = grpstats(p2_tmp, bw_tmp, 'mean');
indx_correct = find(gs1 > .5 & gs2 < .5 & gs2 > 0);
if ~isempty(indx_correct)
for j = indx_correct'
prob2(bw == j) = prob(bw == j);
end
end
prob = prob2; clear prob2;
fprintf(fileID, 'ok.\n');
% Delete small lesions with low probability
fprintf(fileID, 'Delete small lesions with low probability ...');
bw = ps_bwlabeln(1 .*(prob > 0));
volfactor = abs(det(Vf2_tmp.mat(1:3,1:3))) / 1000;
for j = 1:max(bw(bw > 0))
indx_tmp = find(bw == j);
if(sum(prob(indx_tmp) > .1)*volfactor < 3*0.003)
prob(indx_tmp) = 0;
end
end
fprintf(fileID, 'ok.\n');
% Delete holes
fprintf(fileID, 'Delete holes ...');
ch = 0;
iter = 0;
while ~ch
iter = iter + 1;
indx_tmp = find(prob == 0 & bp > 0);
nh = getNeighborhood2(prob, indx_tmp, 1);
indx_tmp = indx_tmp(sum(nh > 0) > 4);
if isempty(indx_tmp) || iter > 10
ch = 1;
else
prob(indx_tmp) = mean(nh(:,sum(nh > 0) > 4));
end
end
fprintf(fileID, ['ok, with ', num2str(iter), ' iterations.\n']);
% A bit of growing
fprintf(fileID, 'Grow a bit ...');
st = 0;
iter = 0;
while ~st
iter = iter + 1;
indx_tmp = find(Bf2_2 > 0.5 & prob == 0);
nh = getNeighborhood2(prob, indx_tmp, 3);
tmp_tmp = find(mean(nh > 0) > 0.25);
if numel(tmp_tmp) > 10
tmp = nh(:,tmp_tmp); tmp(tmp == 0) = NaN;
prob(indx_tmp(tmp_tmp)) = ps_quantile(tmp, .85);
else
st = 1;
end
end
fprintf(fileID, ['ok, with ', num2str(iter), ' iterations.\n']);
I found the script above on GitHub; it belongs to the LesionSegmentationToolbox (LST) for SPM. When I run it, an SPM batch opens in which I have to select the input files manually.
Since I would like to apply this analysis to 40 subjects, I would like to define the names of the input files beforehand and then loop the script over all subjects.
Does anyone have an idea how I can implement this?
This can certainly be done in a loop as well; a rough sketch of such a loop follows below.
To help you further:
* Which of the call options do you want to use?
* In what form (data type, dimensions, etc.) is the input data available?
* As an example, what should the 1st and 2nd call look like?
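In the meantime, here is a rough, untested sketch of how such a loop could look. It assumes the ps_LST_lpa(Vf2, Vref, html) call described in the help text, a hypothetical folder layout like D:\data\subj01\FLAIR.nii ... subj40\FLAIR.nii, no reference image, and the html argument being a logical flag; adapt the names and paths to your data.
Code:
% Rough sketch (untested): run ps_LST_lpa once per subject
dataDir = 'D:\data';   % hypothetical root folder, adapt to your data
nSubj   = 40;          % number of subjects

for s = 1:nSubj
    % Hypothetical naming scheme subj01 ... subj40, adapt as needed
    flair = fullfile(dataDir, sprintf('subj%02d', s), 'FLAIR.nii');
    if ~exist(flair, 'file')
        warning('FLAIR not found for subject %d, skipping.', s);
        continue
    end
    % Call form from the help text: ps_LST_lpa(Vf2, Vref, html) with
    % Vf2 as a character path (as from spm_select), Vref left empty
    ps_LST_lpa(flair, '', true);
end
Depending on how ps_LST_lpa handles multiple rows in Vf2, you may also be able to pass all 40 FLAIR paths at once (e.g. selected via spm_select) instead of calling the function once per subject.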
Grüße,
Harald
_________________
1.) Ask MATLAB Documentation
2.) Search gomatlab.de, google.de or MATLAB Answers
3.) Ask Technical Support of MathWorks
4.) Go mad, your problem is unsolvable ;)