Gardner Lab

Psychology Department

Neurosciences Institute

Stanford University


Troubleshooting

Indexing

When scripting your own analysis code, watch out for the following indexing pitfall. Suppose there are several scans in the group you are analyzing (for the sake of illustration, say 10), and you only want to analyze a subset of them - say scans 2, 5 and 9 - because you have already analyzed the others, or because some of the scans should not be analyzed this way. The analysis/overlay data structure that gets saved will have 10 cells, one for each scan in the group, but only three of those cells will contain data, namely cells 2, 5 and 9. This is how mrLoadRet keeps track of which scans have been analyzed. However, when writing your code you will probably need to read in the data from those scans in order to analyze it, and the functions that read in data from multiple scans, for example meanTSeries, return a data structure with as many cells as scans being analyzed. So in our example, meanTSeries returns three cells: the first corresponds to scan 2, the second to scan 5 and the third to scan 9. It has to be this way for everything to work correctly, but you need to know about it and keep careful track of your indexing so the two conventions don't get mixed up.
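The pitfall above can be sketched as follows. This is an illustrative sketch, not working analysis code: newView and meanTSeries are mrTools functions (check the argument order against your version), and the "analysis" stored in the overlay is a placeholder.

```matlab
% Illustrative sketch of the two indexing conventions described above.
scanList = [2 5 9];                      % subset of the 10 scans in the group
v = newView;                             % mrTools view (assumed setup)
tSeries = meanTSeries(v, 1, scanList);   % returns numel(scanList) cells

% tSeries{1} is scan 2, tSeries{2} is scan 5, tSeries{3} is scan 9.
% The saved overlay structure, by contrast, is indexed by scan number,
% so it has one cell per scan in the group:
overlayData = cell(1, 10);
for i = 1:length(scanList)
  scanNum = scanList(i);
  % index the overlay by scanNum, the returned cells by i
  overlayData{scanNum} = mean(tSeries{i}(:));   % placeholder analysis
end
```

The key point is the two index variables: `i` walks the returned cell array, while `scanNum` addresses the overlay; mixing them up silently stores results under the wrong scan.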

Preprocessing

Often you will be doing your analysis on concatenated data. In this case the data has already been preprocessed when it was concatenated - see the section on concatenation. You don't need to do any additional processing (e.g. detrending), though you might still want to do things like spatial normalization (dividing by and subtracting the mean). But you should be careful to apply the same detrending (e.g. hi-pass filtering) to any model you are using in your analysis as was applied in the concatenation. You should also be aware that the hi-pass filtering function called when the data is concatenated is NOT the same as the one called by the functions that read in the data (e.g. percentTSeries), and the filter cut-offs are specified differently. (There should be some other wiki section that spells this out fully.)
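One hedged sketch of applying the concatenation's own filtering to a model time series: mrTools stores the filtering information with the concatenated scan, and the field names used below (concatInfo with n, runTransition and hipassfilter) follow mrTools conventions but should be checked against your version. scanNum, groupNum and model are assumed to be defined already.

```matlab
% Apply the same hi-pass filter to a model time series as was applied
% to the data at concatenation time (sketch; verify field names).
v = newView;
concatInfo = viewGet(v, 'concatInfo', scanNum, groupNum);
for runNum = 1:concatInfo.n
  % frames belonging to this original run within the concatenation
  runFrames = concatInfo.runTransition(runNum,1):concatInfo.runTransition(runNum,2);
  thisRun = model(runFrames);
  % filter in the frequency domain with the same filter used on the data
  thisRun = real(ifft(fft(thisRun(:)) .* concatInfo.hipassfilter{runNum}(:)));
  model(runFrames) = thisRun;
end
```

Filtering the model run-by-run matters because each original run was filtered separately before the runs were joined; filtering the whole concatenated model at once would not match.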

Frame Period

The frame period (how long it takes to acquire a volume, sometimes incorrectly referred to as TR or volume TR) is set in the original nifti file as the fifth element of the pixdim field. The units of that field are set by the nifti field xyzt_units (see nifti1.h). If you are having trouble getting the frame period set correctly, you can try the function:

setFramePeriod(1.5);

This changes both the nifti file and does the proper viewSet. The line above would set the frame period to 1.5 seconds.
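If you want to check what frame period mrLoadRet currently has before fixing it, a sketch like the following should work (the viewGet parameter name 'framePeriod' follows mrTools conventions; verify against your version):

```matlab
% Check the frame period mrLoadRet has for scan 1 of the current group
v = newView;
framePeriod = viewGet(v, 'framePeriod', 1);   % in seconds
disp(framePeriod)
% If it is wrong, fix both the nifti file and the view:
setFramePeriod(1.5);
```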

Slow ROI drawing

If you are having trouble drawing ROIs because the interface has gotten really slow, it is probably because Matlab does not handle the simultaneous display of many line segments (i.e. ROIs) and the definition of a new ROI with the function roipoly very well. You might try the following:

  1. Display ROIs as perimeters (these take fewer lines to draw)
  2. Only show one (or even no) ROIs when you are drawing
  3. Change your Edit/Preferences roiPolygonMethod to getpts. This will not show you the line segments connecting the points as you draw, but may run faster.
  4. Change your Edit/Preferences roiPolygonMethod to getptsNoDoubleClick. This is like getpts above, but instead of using a double-click to end the selection (which can sometimes be problematic) you hit the <enter> key.
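The preference changes in items 3 and 4 can also be made from the command line. mrSetPref and mrGetPref are mrTools utilities; the preference name is assumed to match the Edit/Preferences entry:

```matlab
% Switch the ROI drawing method without going through the GUI
mrSetPref('roiPolygonMethod', 'getptsNoDoubleClick');
mrGetPref('roiPolygonMethod')   % confirm the setting
```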

Memory issues

If you run into memory trouble when running an event-related analysis or loading lots of data (e.g. a heavy concatenated scan) - for instance, the analysis eats up all your application memory and uses so much swap that a "no application memory" warning pops up on your screen - reduce the mrTools "maxBlocksize" value:

>> mrLoadRet

Edit - Preferences - change maxBlocksize (default: 250000000000) to a lower value e.g., 30000000000

The analysis will take longer, but at least it will run. You can check in Activity Monitor that Matlab is not eating up too much memory: make sure that the memory used stays low relative to physical memory and that the amount of swap used stays low.
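The same preference change can be made from the command line rather than through the Edit/Preferences dialog (mrSetPref/mrGetPref are mrTools utilities; the preference name 'maxBlocksize' is assumed to match the GUI entry):

```matlab
% Inspect and lower the block size used when processing time series
mrGetPref('maxBlocksize')                 % see the current value
mrSetPref('maxBlocksize', 30000000000);   % lower value; analyses run in smaller blocks
```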