Understanding MRI Scanner Receivers
How Advanced Technologies Improve MRI Imaging
Multichannel radiofrequency and parallel imaging technologies are hardware and software implementations, respectively, aimed at improving the coverage, signal resolution and speed of MRI examinations. With multichannel RF technology, the MRI signal used to form an image is collected by an array of separate detectors, or coil elements. Each element relays signal information along a separate channel to an image reconstruction computer. Such arrays of coil elements and receivers can improve imaging coverage and the signal-to-noise ratio in the image.
The number of elements in the array of detectors and receivers is an important factor in characterizing an MRI scanner. Parallel imaging technology uses complex software algorithms to reconstruct the signals from multiple channels in a way that can reduce imaging times and/or increase image resolution.
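To illustrate how signals from separate coil elements might be combined into one image, the following sketch applies a basic root-sum-of-squares combination to simulated per-channel images. The array sizes and values are hypothetical, and real parallel-imaging reconstructions (for example SENSE or GRAPPA) are considerably more sophisticated than this.

```python
import numpy as np

# Hypothetical example: 8 coil elements, each delivering a 256x256 complex image.
# Real reconstructions also use coil sensitivity maps; root-sum-of-squares is
# shown here only as the simplest possible channel combination.
n_channels, ny, nx = 8, 256, 256
rng = np.random.default_rng(0)
channel_images = rng.standard_normal((n_channels, ny, nx)) \
               + 1j * rng.standard_normal((n_channels, ny, nx))

# Root-sum-of-squares combination across the channel dimension.
combined = np.sqrt(np.sum(np.abs(channel_images) ** 2, axis=0))
print(combined.shape)  # (256, 256)
```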
The Main Sources of MRI Noise
Before we examine the parameters of MRI scanner receivers, it is important to understand the principal sources of noise in MRI scanner signals:
The magnetic resonance signal is an electromotive force induced in a coil by a rotating magnetic moment of nuclear spins. The MRI scanner signal level must be well above noise levels to produce clinically useful MRI images, and yet this signal is very weak.
Image noise originates in the patient being imaged and is added during the processing of the signal in the receiver chain. In the receiver chain, noise may be generated in the preamplifiers and at the connection between the preamplifier and the RF receive coil. In the RF coil, which is a conductor, thermal noise is produced by the stochastic motion of free electrons. This noise arises from ohmic losses in the RF coil itself and from eddy current losses in the patient, which are inductively coupled to the RF coil. High conductivity in the receiver coil keeps its own noise contribution low, whereas conduction in the patient is an unavoidable source of noise.
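For a sense of scale, the thermal (Johnson-Nyquist) noise voltage can be estimated as v_rms = sqrt(4·kB·T·R·Δf). The resistance, temperature and bandwidth in the sketch below are illustrative assumptions, not values for any particular coil or scanner.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # approximate body temperature, K (assumed)
R = 5.0              # effective coil + patient loading resistance, ohms (assumed)
bandwidth = 100e3    # receive bandwidth, Hz (assumed)

# Johnson-Nyquist thermal noise voltage over the receive bandwidth
v_noise_rms = math.sqrt(4 * k_B * T * R * bandwidth)
print(f"thermal noise: {v_noise_rms * 1e6:.3f} uV rms")
```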
Rapid Improvement in MRI Technologies
The resistance induced in the receiving circuit by eddy currents in the patient is much more significant in modern high-field MRI scanners than a receiver coil’s own resistance. A larger mass in vivo causes greater coil loading and hence more noise.
Since the introduction of multichannel MRI scanners and the application of accelerated parallel imaging, it has become increasingly clear that the use of a large number of RF coils for signal reception offers substantial increases in SNR and acceleration rates. The optimal number of coils depends on the application and field strength, and might exceed 100 for whole-body applications at high field. Because these coil signals have to be received, amplified and digitized independently, the result is increased complexity of the MRI receiver. At the same time, with the advent of digital radio in the early 1990s, receiver technology has developed rapidly as well, with increased performance and reductions in size and cost made possible by improvements in semiconductor technology. Currently, MRI scanners are becoming available with 32 or more independent channels based on digital receiver technology. The number of receivers in MRI scanners is growing rapidly, with at least one MRI scanner manufacturer claiming to offer an “unlimited” number of channels.
The Roles of the MRI Scanner Receiver
The primary role of the receiver in an MRI scanner is to convert the analog coil signals into digital format. The design of a modern digital receiver centers around an analog-to-digital converter (ADC), which samples the analog MRI signal and converts it into digital format. Important characteristics of the ADC are its conversion bandwidth and resolution. The conversion bandwidth equals half the digitization rate. State-of-the-art ADCs allow conversion bandwidths of over 50 MHz at 14-bit resolution. Since Larmor frequencies are generally well above 50 MHz, usually an alias of the MRI signal is detected.
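As a sketch of why an alias is detected, assume an illustrative 3 T proton Larmor frequency of about 127.7 MHz and a hypothetical ADC digitization rate of 80 MS/s, and fold the signal into the first Nyquist zone (0 to fs/2):

```python
f_larmor = 127.74e6   # approximate proton Larmor frequency at 3 T, Hz (illustrative)
fs = 80e6             # ADC digitization rate, samples/s (assumed)

conversion_bandwidth = fs / 2   # first Nyquist zone: 0 .. 40 MHz

# Fold the Larmor frequency into the first Nyquist zone (bandpass sampling).
f_alias = f_larmor % fs
if f_alias > fs / 2:
    f_alias = fs - f_alias

print(f"conversion bandwidth: {conversion_bandwidth/1e6:.1f} MHz")
print(f"aliased center frequency: {f_alias/1e6:.2f} MHz")
```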
Prior to input into the ADC, the MRI signal needs to be amplified and filtered. Amplification serves to match the voltage range of the MRI signal to the input range of the ADC, in order to engage its full dynamic range. Analog filtering serves to reduce noise and interference signals that alias into the ADC conversion band from outside the target band around the MRI scanner Larmor frequency. In addition, depending on the Larmor frequency, down-conversion might be required to bring the signal frequency to within the input band of the ADC. However, with the high input bandwidth of state-of-the-art ADCs, direct digitization is possible for MRI scanner field strengths of at least 3.0 Tesla.
The choice of ADC digitization rate is to some extent dependent on the master clock of the MRI scanner exciter. To avoid phase errors between excitation and reception, the digitization clock of the ADC needs to be synchronized to the exciter clock, typically by choosing a digitization frequency that is a multiple of the exciter clock. In addition, it is beneficial to avoid digitization frequencies that put the Larmor alias at around 0 Hz. For optimal phase stability of the output signal, a digitization clock with minimal noise and jitter is required.
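One simple way to screen candidate digitization rates is to compute the aliased Larmor frequency for each rate and reject rates that place the alias too close to 0 Hz or to the Nyquist edge. The candidate rates and the 1 MHz margin below are assumptions for illustration only:

```python
def aliased_frequency(f_signal, fs):
    """Fold a signal frequency into the first Nyquist zone (0 .. fs/2)."""
    f = f_signal % fs
    return fs - f if f > fs / 2 else f

f_larmor = 127.74e6          # ~3 T proton Larmor frequency, Hz (illustrative)
margin = 1e6                 # keep the alias at least 1 MHz from 0 Hz and fs/2 (assumed)
candidate_rates = [60e6, 64e6, 72e6, 80e6]   # hypothetical multiples of an exciter clock

for fs in candidate_rates:
    f_alias = aliased_frequency(f_larmor, fs)
    usable = margin < f_alias < fs / 2 - margin
    print(f"fs = {fs/1e6:5.1f} MHz -> alias at {f_alias/1e6:6.2f} MHz, usable: {usable}")
```

With these assumed numbers, a 64 MHz rate would put the alias at about 0.26 MHz, uncomfortably close to 0 Hz, while the other candidates keep it well inside the conversion band.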
After digitization, digital down-sampling is performed to reduce the amount of data. The output bandwidth and center frequency can be adjusted to match those of the MRI scanner signal bandwidth and (aliased) center frequency. An added advantage of down-sampling is the increase in dynamic range, which amounts to 1 bit for every factor of 4 of down-sampling.
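A minimal sketch of this down-sampling stage, assuming a simulated signal at an illustrative aliased center frequency: the digitized signal is mixed to 0 Hz, then low-pass filtered and decimated. Per the rule above, the gain in dynamic range is roughly one bit per factor of four of down-sampling, so a factor of 64 yields about three extra bits.

```python
import numpy as np

fs_in = 80e6          # digitization rate, Hz (assumed)
f_center = 32.26e6    # aliased center frequency of the MRI signal, Hz (illustrative)
decimation = 64       # down-sampling factor (assumed): 64 = 4**3, so ~3 extra bits of range
n = 2**18

rng = np.random.default_rng(0)
t = np.arange(n) / fs_in
# Simulated ADC output: a narrowband tone at the aliased center frequency plus broadband noise.
x = np.cos(2 * np.pi * f_center * t) + 0.01 * rng.standard_normal(n)

# Digital down-conversion: mix the aliased center frequency down to 0 Hz ...
baseband = x * np.exp(-2j * np.pi * f_center * t)

# ... then average-and-decimate (a crude boxcar low-pass; real receivers use CIC/FIR filters).
y = baseband.reshape(-1, decimation).mean(axis=1)

print(f"output rate: {fs_in / decimation / 1e3:.1f} kHz, samples kept: {len(y)} of {n}")
```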
What to Look for in MRI Scanner Receivers
When dealing with MRI scanner receivers, it is important to consider the signal-to-noise ratio and the bandwidth. As a rule, a wider bandwidth includes more noise. Decreasing the bandwidth by a factor of 4 results in an increase in the signal-to-noise ratio by a factor of 2 (less noise in the image).
When decreasing the bandwidth, we gain a better signal quality. However, there are also some trade-offs:
the chemical shift artifact increases.
the longer readout requires a longer minimum echo time (TE), which causes some signal reduction.
the longer echo time also means that fewer slices can be fitted into the repetition time (TR) period.
Overall, however, the net effect of reducing the bandwidth is an improvement in the signal-to-noise ratio. The initial bandwidth of the MRI signal produced by the MRI scanner is a function of the spatial-encoding readout gradient strength and the chemical shift. So, if the bandwidth is too narrow, only a low readout gradient strength can be used.
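As a hedged numerical illustration, the readout bandwidth can be estimated as BW ≈ γ̄ · G_read · FOV_read, with γ̄ ≈ 42.58 MHz/T for protons, and SNR scales roughly as 1/√BW. The gradient strength and field of view below are assumed values chosen only to show the arithmetic:

```python
import math

gamma_bar = 42.58e6    # proton gyromagnetic ratio / 2*pi, Hz per Tesla
g_read = 10e-3         # readout gradient strength, T/m (assumed)
fov_read = 0.30        # field of view along the readout direction, m (assumed)

# Receiver bandwidth dictated by the spatial-encoding readout gradient.
bw = gamma_bar * g_read * fov_read
print(f"readout bandwidth: {bw/1e3:.1f} kHz")   # ~127.7 kHz for these assumed values

# SNR scales roughly as 1/sqrt(bandwidth): a 4x narrower bandwidth doubles SNR.
for factor in (1, 2, 4):
    print(f"bandwidth / {factor}: relative SNR x {math.sqrt(factor):.2f}")
```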