It's hard to say why the seg fault is not consistent. David
has agreed to show me the problem on his system, so that
should help. What is more confusing, though, is that I have
run your data on his system with no crash at all.
Setting that aside, the likely cause is that the
close-to-linearly-dependent regressors are producing a
division-by-almost-zero scenario. While division by exactly
zero may be handled properly, division by almost zero may
not be explicitly dealt with, and can lead to an overflow
condition. If so, that may be handled gracefully on some
systems (via NaN, not-a-number, or an infinity), but not
on others.
In any case, we can probably count on David hitting the same
problem that you are seeing. And if we can reproduce it, we
can almost certainly fix it. So I will get back to you once
we have figured this out.
Thanks for bringing this up.
- rick