BIOM 480A Biomedical
Signal and Image Processing
Colorado State University
Student: Minh Anh Nguyen
Email: minhanhnguyen@q.com
Projection
Read about the radon() function in MATLAB as well as the introduction to the Radon transform in Wikipedia.
(a) Read in the .png image
file using imread() and then use the radon() function at angles of 0, 45, 90
and 135 degrees to create four projections of it.
Plot these four projections
as conventional 1-D graphs on the same page as the original image (use imshow()
to display the image) and comment on whether or not they appear to be projections.
Yes, each of the four plots appears to be a 1-D projection of the image along its corresponding angle.
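A minimal MATLAB sketch of part (a) is given below; the file name is the one used in the code later in this report, and the rgb2gray() conversion is an assumption in case the .png is stored as an RGB image.
% Part (a) sketch: four projections plus the original image on one page.
I = imread('J:\BIOM480A3\Hw7\xid-14412348_1.png');
if size(I,3) == 3, I = rgb2gray(I); end   % radon() expects a 2-D image (assumption)
angles = [0 45 90 135];
[R, xp] = radon(I, angles);               % one column of R per angle
figure
subplot(2,3,1), imshow(I), title('Original image')
for k = 1:4
    subplot(2,3,k+1), plot(xp, R(:,k))
    title(sprintf('Projection at %d degrees', angles(k)))
    xlabel('x'' (pixels)'), ylabel('Ray sum')
end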
(b) Why are the number of
elements in each projection greater than the number of pixels across the image?
- Because each projection must span the full extent of the image at every rotation angle. When the projection direction is not aligned with the image axes, the image's shadow on the detector axis is as long as the image diagonal (about √2 times the side length for a square image), and radon() uses one projection length, with a few bins of padding, for every angle. Each projection therefore has more elements than there are pixels across the image.
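A quick numerical check of this answer, assuming I from the sketch above; the projection length should be close to the image diagonal rather than the image width.
[R0, ~] = radon(I, 0);
fprintf('Pixels across the image: %d\n', size(I,2));
fprintf('Elements per projection: %d\n', numel(R0));
fprintf('Image diagonal (pixels): %.1f\n', hypot(size(I,1), size(I,2)));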
(c)
You will want more projections to more accurately reconstruct the image, so use
radon() to get projections at angles of 0, 5, 10, 15, ..., 170 and 175
degrees. Plot the 2D matrix produced by radon() as a grayscale image by
normalizing the matrix element values so that they range between 0 and 1 and
then using the imshow() function to render the normalized matrix as an image.
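A sketch of part (c), reusing I from part (a); mat2gray() would perform the same normalization in a single call.
theta = 0:5:175;                                  % 36 projection angles
R = radon(I, theta);
Rn = (R - min(R(:))) ./ (max(R(:)) - min(R(:)));  % scale values to [0, 1]
figure, imshow(Rn), title('Normalized radon transform (sinogram)')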
(d)
Why does the image appear to be fuzzy, overlapping sinusoids (“sinogram”)?
Yes,
this is a “Sinogram” because the radon transform of an off-center point source
is a sinusoid. According to the Wikipedia article on the Radon transform, “The Radon transform data is often called a sinogram
because the Radon transform of a Dirac delta function
is a distribution
supported on the graph of a sine wave. Consequently the Radon transform of a
number of small objects appears graphically as a number of blurred sine waves
with different amplitudes and phases.” We collected the projections of
a single slice using
the parallel scanning arrangement.
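The sinusoid claim is easy to verify with a synthetic test image (a single off-center point, not the assignment's .png); the variable names below are arbitrary.
P = zeros(128);                 % blank test image
P(40, 90) = 1;                  % one off-center point source
thetaP = 0:179;
S = radon(P, thetaP);           % sinogram of the point
figure, imagesc(thetaP, 1:size(S,1), S), colormap(gray)
xlabel('\theta (degrees)'), ylabel('Detector position')
title('Sinogram of a single off-center point: a sinusoid')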
Note: there is an inverse Radon transformation
function, iradon(), but you are not allowed to use it for any problem in this
assignment.
Numerical Issues with FFTs
of Projections
(a) Plot the phase and amplitude of the 1D FFT of the 90° projection.
(b) Use the 1-D interpolation function,
interp(), to double the number of points in the FFT, and then plot the
amplitude of the inverse FFT of the interpolated FFT. Be sure to add the
interpolated points after you perform the FFT.
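A sketch of parts (a) and (b), assuming R and theta from the part (c) sketch above; interp() (Signal Processing Toolbox) works on real vectors, so the real and imaginary parts of the FFT are interpolated separately here.
p90 = R(:, theta == 90);                 % the 90-degree projection
P = fft(p90);
figure
subplot(2,1,1), plot(abs(P)),   title('Amplitude of the FFT of the 90-degree projection')
subplot(2,1,2), plot(angle(P)), title('Phase of the FFT of the 90-degree projection')

% Double the number of frequency samples *after* taking the FFT.
P2 = interp(real(P), 2) + 1i*interp(imag(P), 2);
p2 = ifft(P2);
figure, plot(abs(p2)), title('Amplitude of the inverse FFT of the interpolated FFT')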
(c) How well does the
inverse FFT resemble the original projection? Comment on the reason for any observed
changes.
- The FFT returns a complex double array, so the inverse FFT of the interpolated spectrum is also complex, and only its real part (or magnitude) can be compared with the original projection. The result still resembles the original projection, but it is twice as long, since doubling the number of frequency samples corresponds roughly to zero-padding the projection in the spatial domain, and interpolating the rapidly varying FFT values introduces some distortion.
(d) Now
truncate the 90° projection data to contain the same number of points across
the original image (a power of 2) by deleting regions of zeros at the outsides
of the projection. Next, use fftshift() to effectively rotate the truncated projections
by half their length. Plot the shifted, truncated projection. Repeat parts (a) through (c) of this problem
using the shifted, truncated projection.
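A sketch of the truncation and shift in part (d), assuming p90 from the previous sketch and an image whose width N is a power of two.
N = size(I, 2);                         % pixels across the original image
c = ceil(numel(p90)/2);                 % center index of the projection
p90t = p90(c - N/2 : c + N/2 - 1);      % drop the zero regions at the outsides
p90s = fftshift(p90t);                  % rotate the projection by half its length
figure, plot(p90s), title('Shifted, truncated 90-degree projection')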
(f) Given your results to this problem,
discuss if you will modify the original projection data before using it to
construct a 2D FFT.
Yes. The results above show that each projection should be truncated to the width of the original image and circularly shifted with fftshift() before its 1-D FFT is taken, so I will modify the projection data in this way before assembling the 2-D FFT.
Constructing a 2D FFT
(a)
Based on your results from the last problem, manipulate your projection data if
desired, take the FFT of each projection, and plot the phase and amplitude of
the resulting 1D FFT of the sinogram as a grayscale image. This matrix is effectively an FFT of the original image in polar coordinates.
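One way to build and display this polar FFT, assuming R and theta from the earlier sketch and the truncation width N from the previous problem.
c  = ceil(size(R,1)/2);
Rt = R(c - N/2 : c + N/2 - 1, :);        % truncate every projection to N points
Fp = fft(fftshift(Rt, 1), [], 1);        % column-wise FFT of the shifted projections
figure
subplot(1,2,1), imshow(mat2gray(abs(Fp))),   title('Amplitude of the polar FFT')
subplot(1,2,2), imshow(mat2gray(angle(Fp))), title('Phase of the polar FFT')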
(b) To use the ifft2() function in MATLAB to
recover the original image, you will need to construct a Cartesian FFT from the
polar data.
Use the interp2() and
cart2pol() or pol2cart() functions to create a Cartesian version of the FFT
with the same number of pixels as the original image. As a check on your
conversion, plot the phase and amplitude of both the polar and Cartesian FFTs along the -45° axis and comment on any differences. [The 2D interpolation may be the most
challenging part of this assignment, but the points for succeeding at it are
allocated to the next problem which is easy if you got this part right.]
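A sketch of one possible polar-to-Cartesian resampling, assuming Fp, N, and theta from the sketches above; points that fall outside the sampled polar grid (including angles between 175° and 180°) are simply set to zero here.
Fpc   = fftshift(Fp, 1);                 % radial frequencies ordered -N/2 ... N/2-1
rho   = (-N/2 : N/2-1)';                 % radial frequency samples
thRad = deg2rad(theta);                  % projection angles in radians
[TH, RH] = meshgrid(thRad, rho);         % polar sample grid

[KX, KY] = meshgrid(-N/2 : N/2-1);       % Cartesian frequency grid, same size as the image
[tq, rq] = cart2pol(KX, KY);             % query angles in (-pi, pi], radii >= 0
neg      = tq < 0;                       % fold angles into [0, pi) by negating the radius
tq(neg)  = tq(neg) + pi;
rq(neg)  = -rq(neg);

Fcart = interp2(TH, RH, real(Fpc), tq, rq, 'linear', 0) ...
      + 1i*interp2(TH, RH, imag(Fpc), tq, rq, 'linear', 0);
A refinement would be to fill the wedge between 175° and 180° using the conjugate-symmetric copy of each projection rather than zeros, but the sketch keeps the simplest form.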
The phase of the Cartesian FFT is much cleaner than the phase of the polar FFT. The Cartesian FFT also has a higher amplitude than the polar FFT, but it is shifted in the frequency range.
Inverting the 2D FFT
(a)
Use the ifft2() function to invert your Cartesian FFT data. You may need to use
the fftshift() function to put the image parts back where they belong. Print
your reconstructed image. (b) In one or two well-structured paragraphs (that
include good topic sentences and logical structure – talk to the Writing Center
if you need to), discuss how your reconstructed image compares with the
original. Describe any differences and offer a hypothesis on what causes them.
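A sketch of the inversion, assuming Fcart from the previous sketch; whether the final fftshift() is needed depends on how the projections were shifted before their FFTs were taken, as the assignment notes.
img = real(ifft2(ifftshift(Fcart)));     % back to standard FFT ordering, then invert
img = fftshift(img);                     % put the image quadrants back where they belong
figure, imshow(mat2gray(img)), title('Image reconstructed from the Cartesian 2-D FFT')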
The resulting image was not what I expected. I expected an image of the three objects with sharp edges at better resolution; instead I obtained a single white dot in the middle of the image. The quality of the reconstruction should improve as more projections are added; however, I did not see this in my final image. The individual projections are not visible in my reconstruction. I think something is wrong with the interpolation step, but I am not sure how to fix it.
I applied the polar scan technique to reconstruct my image instead of the Cartesian one. The algorithm for the polar scan method is: once the samples of the image are on the polar grid, interpolate them onto a rectangular grid and then take the inverse 2-D FFT to get the image back. As a result, I only saw half of the circle after I performed the interpolation step, so the resulting image does not extend beyond my polar values. If I had used the Cartesian scan method, I would probably have obtained a better result.
I followed the image reconstruction instructions provided in the homework exactly, but I could not get the correct results. I spent more than 10 hours on this problem and did not get a better image.
There
are other methods of tomographic reconstruction including filtered back
projection.
If
you are desperate, and I do mean desperate (because I will grade this very
stringently),
I
will give up to 100 points of homework extra credit for a high-quality homework
assignment, corresponding detailed solution, and at least one-page handout with
related concepts and links to web resources that teach students about filtered
back projection. All three parts (assignment, solution and handout) must be
submitted and they must be appropriate for direct use in a future class without
my editing.
Computed tomography (CT) image reconstruction is described in Chapter 13, The Principles of Computed Tomography, of the Biomedical Signals and Image Processing textbook. This chapter explains the following tomography topics: Attenuation Tomography, Time-of-Flight Tomography, Reflection Tomography, Diffraction Tomography, and the Fourier Slice Theorem. From this list of topics one can see that tomographic image reconstruction using filtered back projection is not discussed in the chapter. For this reason I will explain what filtered back projection is and use MATLAB code to show how it works.
The CT image reconstruction concept is needed for the X-ray and MRI imaging described in Chapters 14 and 15 of the textbook; however, it is not an easy concept for some students to understand.
The filtered back projection method is discussed in the computed tomography section of Chapter 25, Special Imaging Techniques, of the Digital Signal Processing textbook [1]. According to Chapter 25, there are four CT reconstruction algorithms, or techniques, for calculating the slice image. The first method is to set the problem up as a system of linear equations and solve it. In this method, each measurement requires one equation, and each measured sample is the sum of the pixels along its ray through the image. Most CT scanners take about 50% more samples than this analysis strictly requires; for example, a system might take 700 views with 600 samples in each view to reconstruct a 512x512 image. The problem with this reconstruction method is computation time.
The second CT reconstruction method is to use iterative techniques to calculate the final image in steps. This approach has three variations: the Algebraic Reconstruction Technique (ART), the Simultaneous Iterative Reconstruction Technique (SIRT), and the Iterative Least Squares Technique (ILST). The difference between these methods is how the successive corrections are made: ray-by-ray, pixel-by-pixel, or by simultaneously correcting the entire data set. In the ART algorithm, all the pixels in the image array are first set to some arbitrary value, and the image array is then gradually changed to correspond to the measured profiles. For each measurement, the measured sample is compared with the sum of the image pixels along the ray of that sample. If the ray sum is lower than the measured sample, all pixel values along the ray are increased; if the ray sum is higher than the measured sample, all pixel values along the ray are decreased. Because the changes made for any one measurement disrupt the corrections made for previous measurements, a residual error between the measured values and the ray sums can remain.
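To make the ray-by-ray correction concrete, here is a toy ART (Kaczmarz) iteration on a hypothetical 2x2 image measured by four rays; the system matrix, measurements, and iteration count are invented purely for illustration.
% Each row of A is one ray: A(i,j) = contribution of pixel j to ray sum i.
A = [1 1 0 0;                     % ray 1 crosses pixels 1 and 2
     0 0 1 1;                     % ray 2 crosses pixels 3 and 4
     1 0 1 0;                     % ray 3 crosses pixels 1 and 3
     0 1 0 1];                    % ray 4 crosses pixels 2 and 4
xTrue = [1; 2; 3; 4];             % "true" pixel values
b = A * xTrue;                    % simulated ray-sum measurements
x = zeros(4,1);                   % start from an arbitrary image
for sweep = 1:50
    for i = 1:size(A,1)           % one correction per ray
        ai  = A(i,:);
        err = b(i) - ai*x;        % measured sample minus current ray sum
        x   = x + ai' * (err / (ai*ai'));   % raise or lower the pixels on this ray
    end
end
disp([xTrue x])                   % the estimate converges toward the true values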
The third method is Fourier reconstruction, based on the Fourier slice theorem, which is explained in Chapter 13 of the Biomedical Signals and Image Processing textbook. The fourth method is called filtered backprojection, and it is used to correct the blurring that simple backprojection creates. In the simple backprojection technique, a single point in the true image is reconstructed as a circular region whose intensity decreases away from the center. In other words, a backprojection is created by smearing each view back through the image in the direction from which it was originally acquired. The final backprojected image is then the sum of all the backprojected views.
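The smearing idea can be demonstrated in a few lines of MATLAB, reusing the point-source sinogram S and angles thetaP from earlier; the imrotate() rotation and sign conventions may need adjusting to match radon() exactly, so this is only an illustrative sketch.
Np = size(S, 1);
bp = zeros(Np);
for k = 1:numel(thetaP)
    smear = repmat(S(:,k)', Np, 1);                        % smear the k-th view across the image
    bp    = bp + imrotate(smear, thetaP(k), 'bilinear', 'crop');
end
figure, imshow(mat2gray(bp)), title('Simple backprojection: the point becomes a blurred disc')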
In
the filtered backprojection
technique, each
of the one-dimensional views is convolved with a one-dimensional filter kernel
to create a set of filtered views.
These filtered views are then backprojected to provide the reconstructed image,
a close approximation to the "correct" image. In fact, the image
produced by filtered backprojection is identical
to the "correct" image when there are an infinite
number of views and an infinite number of points per view.
The lecture Medical Imaging Computer Tomography, created by the Polytechnic School of Engineering [2], provides a filtered backprojection algorithm. For each angle θ, the algorithm consists of the following steps:
1) Take the 1-D FFT of g(l,θ) for each θ: g(l,θ) -> G(ρ,θ).
2) Filter in the frequency domain: G(ρ,θ) -> Q(ρ,θ) = |ρ| G(ρ,θ).
3) Take the inverse 1-D FFT: Q(ρ,θ) -> q(l,θ).
4) Backproject q(l,θ) into the image domain: q(l,θ) -> bθ(x,y).
5) Sum the backprojected images over all θ.
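A compact sketch of these five steps, again using the point-source sinogram S and angles thetaP as stand-ins for any parallel-beam data; the backprojection loop carries the same orientation caveat as the sketch above.
[Nd, Na] = size(S);
G   = fftshift(fft(S, [], 1), 1);               % step 1: 1-D FFT of each view (centered)
rho = (-floor(Nd/2) : ceil(Nd/2)-1)';           % centered radial frequency axis
Q   = G .* repmat(abs(rho), 1, Na);             % step 2: |rho| ramp filter
q   = real(ifft(ifftshift(Q, 1), [], 1));       % step 3: inverse 1-D FFT
fbp = zeros(Nd);
for k = 1:Na                                    % steps 4 and 5: backproject and sum
    fbp = fbp + imrotate(repmat(q(:,k)', Nd, 1), thetaP(k), 'bilinear', 'crop');
end
figure, imshow(mat2gray(fbp)), title('Filtered backprojection (sketch)')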
The image reconstruction assignment is done using both simple backprojection and filtered backprojection. The CT projections are constructed using the forward Radon transform. Use the “3 objects with sharp edges” .png image file from the previous problem. Use the MATLAB command “iradon” for filtered backprojection and “i_back” for unfiltered backprojection.
a. Use Matlab command imread() to read the
original “.png” image file and then use imshow() to display the image.
b. Use command radon() to generate the
projections
c. Use the command mat2gray() to convert to
grayscale
d. Use the command iradon() for filtered backprojection.
The Matlab code of the assignment is shown in the following text:
clear all; close all; clc;
I = imread('J:\BIOM480A3\Hw7\xid-14412348_1.png');
% Define angle resolution (in degrees)
ang_res = 4;
theta = 0:ang_res:180-ang_res;
[R,xp] = radon(I,theta);
figure('Color', 'w'), imagesc(theta,xp,R)
colormap(hot), colorbar
title('Projections of the 3 objects with sharp edges')
xlabel('Parallel Rotation Angle - \theta (degrees)')
ylabel('Parallel Sensor Position - x\prime (pixels)')
figure;
R = radon(I,0:175);
I1 = iradon(R,0:175);
I2 = iradon(R,0:175,'linear','none');
subplot(1,3,1), imshow(I), title('Original of 3 objects with sharp edges')
subplot(1,3,2), imshow(I1), title('Filtered backprojection of 3 objects with sharp edges')
subplot(1,3,3), imshow(I2,[]), title('Unfiltered backprojection of 3 objects with sharp edges')
The original image and the filtered and unfiltered reconstructions are displayed in the figure below.
The following MATLAB code is for filtered back-projection without using the iradon function:
%% Minh Anh Nguyen
%% filtered backprojection testing, extra credit
clear all; close all; clc;
I = imread('J:\BIOM480A3\Hw7\xid-14412348_1.png');
figure, subplot(3,2,1), imshow(I), hold off, title('The original image of image 2')
subplot(2,3,1); imagesc(I); title('3 shape'); xlabel('X'); ylabel('Y');
% set some parameters
freq = 1; % [1/degree]
thetas = 0:1/freq:180-1/freq;
subplot(2,3,2)
% compute sinogram with matlab function
sinogram = radon(I,thetas);
imagesc(sinogram); title('Sinogram')
xlabel('\alpha'); ylabel('# parallel projection'); subplot(2,3,3)
% simple backprojection (Schlegel & Bille 9.1.2)
% http://www.mathworks.com/matlabcentral/fileexchange/34608-ct-reconstruction-package/content/ctRecontruction/myBackprojection.m
% figure out how big our picture is going to be.
ParallelProjections = size(sinogram,1);
AngularProjections = length(thetas);
% convert thetas to radians
thetas = (pi/180)*thetas;
% set up the backprojected image
backprojected = zeros(ParallelProjections,ParallelProjections);
% find the middle index of the projections
midindex = floor(ParallelProjections/2) + 1;
% set up the coords of the image
[xCoords,yCoords] = meshgrid(ceil(-ParallelProjections/2):ceil(ParallelProjections/2-1));
% set up filter: now for the spatial domain!!!
filterMode = 'sheppLogan'; % put either 'sheppLogan' or 'ramLak'
if mod(ParallelProjections,2) == 0
    halfFilterSize = floor(1 + ParallelProjections);
else
    halfFilterSize = floor(ParallelProjections);
end
if strcmp(filterMode,'ramLak')
    filter = zeros(1,halfFilterSize);
    filter(1:2:halfFilterSize) = -1./([1:2:halfFilterSize].^2 * pi^2);
    filter = [fliplr(filter) 1/4 filter];
elseif strcmp(filterMode,'sheppLogan')
    filter = -2./(pi^2 * (4 * (-halfFilterSize:halfFilterSize).^2 - 1));
end
% loop over each projection
for i = 1:AngularProjections
    % figure out which projections to add to which spots
    rotCoords = round(midindex + xCoords*sin(thetas(i)) + yCoords*cos(thetas(i)));
    % check which coords are in bounds
    indices = find((rotCoords > 0) & (rotCoords <= ParallelProjections));
    newCoords = rotCoords(indices);
    % filter
    filteredProfile = conv(sinogram(:,i),filter,'same');
    % summation
    backprojected(indices) = backprojected(indices) + filteredProfile(newCoords)./AngularProjections;
    % visualization on the fly
    imagesc(backprojected); title('backprojection without using iradon');
    drawnow
end
% find the middle index of the projections
midindex = floor(size(backprojected,1)/2) + 1;
% prepare filter for frequency domain without normalization
[xCoords,yCoords] = meshgrid(1 - midindex:size(backprojected,1) - midindex);
rampFilter2D = sqrt(xCoords.^2 + yCoords.^2);
% 2D Fourier transformation and sorting
reconstrution2DFT = fftshift(fft2(backprojected));
% filter in frequency domain
reconstrution2DFT = reconstrution2DFT .* rampFilter2D;
% inverse 2D Fourier transformation and sorting
reconstrution2DFT = real(ifft2(ifftshift(reconstrution2DFT)));
% figure
% imagesc(reconstrution2DFT);
% title('Filtered backprojection with ifft2(ifftshift()) function, original background')
% xlabel('X'); ylabel('Y');
b4 = mat2gray(reconstrution2DFT);
subplot(2,3,4); imshow(b4); title('filtered backprojection image using ifft2(ifftshift()) with gray background');
The original image and the filtered and unfiltered reconstructions, computed without using the iradon function, are displayed in the figure below.
The filtered backprojection method is fast because it is a direct inversion of the projection formula. It does not itself correct for scatter or non-uniform attenuation, and its filtering step involves a trade-off between blurring and noise.
References: