Sorry, I only just saw this last message. I thought it should be scaled appropriately to show up to 600 or so TRs at once; more than that seems unlikely at this point.
Unless you are trying to run it on concatenated data? Do not run this on concatenated data. The demeaning will be meaningless. (get it? ha ha.) Run censor.py first, then concatenate.
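To see why the order matters, here's a toy illustration (not the real censor.py, just hypothetical numbers) of what goes wrong when you demean after concatenating runs that have different baselines:

```python
# Toy illustration: why demeaning has to happen per run, before
# concatenation. Two runs with different baselines:
run1 = [100.0, 102.0, 98.0, 100.0]   # baseline ~100
run2 = [200.0, 203.0, 197.0, 200.0]  # baseline ~200

def demean(ts):
    m = sum(ts) / len(ts)
    return [x - m for x in ts]

# Correct order: demean each run, then concatenate.
good = demean(run1) + demean(run2)

# Wrong order: concatenate first, then demean. The shared mean (150)
# leaves each run offset by +/-50, so the "demeaned" values still carry
# the between-run baseline difference.
bad = demean(run1 + run2)

print(good[:4])  # [0.0, 2.0, -2.0, 0.0]
print(bad[:4])   # [-50.0, -48.0, -52.0, -50.0]
```

Same idea with real data: each run gets centered on its own baseline first, so the concatenated series isn't dominated by between-run offsets.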
If you do have very long scan runs, I apologize for this limitation, but I don't really have time to fix it right now. I'll be sure to mention it here if I do get around to it.