AFNI Message Board

June 05, 2009 04:40PM
Hello,

While running a recent 3dDeconvolve command, I received the following error regarding singular values in my matrices (full command and output are below):

tcsh doDecon.sh JULY10_2007I
++ '-stim_times_AM2 1 stimuli/dPass4_n.1D' has 2 auxiliary values per time point
++ '-stim_times_AM2 1 stimuli/dPass4_n.1D' will have 30 regressors
++ '-stim_times_AM2 2 stimuli/dPass6_n.1D' has 2 auxiliary values per time point
++ '-stim_times_AM2 2 stimuli/dPass6_n.1D' will have 30 regressors
++ '-stim_times_AM2 3 stimuli/dPass8_n.1D' has 2 auxiliary values per time point
++ '-stim_times_AM2 3 stimuli/dPass8_n.1D' will have 30 regressors
++ '-stim_times_AM2 4 stimuli/dTake4_n.1D' has 2 auxiliary values per time point
++ '-stim_times_AM2 4 stimuli/dTake4_n.1D' will have 30 regressors
++ '-stim_times_AM2 5 stimuli/dTake6_n.1D' has 2 auxiliary values per time point
++ '-stim_times_AM2 5 stimuli/dTake6_n.1D' will have 30 regressors
++ '-stim_times_AM2 6 stimuli/dTake8_n.1D' has 2 auxiliary values per time point
++ '-stim_times_AM2 6 stimuli/dTake8_n.1D' will have 30 regressors
++ '-stim_times_AM2 7 stimuli/out_take_bad.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 7 stimuli/out_take_bad.1D' will have 35 regressors
++ '-stim_times_AM2 8 stimuli/out_take_both.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 8 stimuli/out_take_both.1D' will have 35 regressors
++ '-stim_times_AM2 9 stimuli/out_take_good.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 9 stimuli/out_take_good.1D' will have 35 regressors
++ '-stim_times_AM2 10 stimuli/out_take_none.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 10 stimuli/out_take_none.1D' will have 35 regressors
++ '-stim_times_AM2 11 stimuli/out_pass_bad.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 11 stimuli/out_pass_bad.1D' will have 35 regressors
++ '-stim_times_AM2 12 stimuli/out_pass_both.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 12 stimuli/out_pass_both.1D' will have 35 regressors
++ '-stim_times_AM2 13 stimuli/out_pass_good.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 13 stimuli/out_pass_good.1D' will have 35 regressors
++ '-stim_times_AM2 14 stimuli/out_pass_none.1D' has 4 auxiliary values per time point
++ '-stim_times_AM2 14 stimuli/out_pass_none.1D' will have 35 regressors
++ 3dDeconvolve: AFNI version=AFNI_2008_07_18_1710 (Jan 5 2009) [64-bit]
++ Authored by: B. Douglas Ward, et al.
++ current memory malloc-ated = 231742 bytes (about 232 thousand)
++ loading dataset pb04.JULY10_2007I.r01.scale+tlrc.HEAD pb04.JULY10_2007I.r02.scale+tlrc.HEAD pb04.JULY10_2007I.r03.scale+tlrc.HEAD pb04.JULY10_2007I.r04.scale+tlrc.HEAD pb04.JULY10_2007I.r05.scale+tlrc.HEAD pb04.JULY10_2007I.r06.scale+tlrc.HEAD pb04.JULY10_2007I.r07.scale+tlrc.HEAD
++ current memory malloc-ated = 566882900 bytes (about 567 million)
++ Auto-catenated datasets start at: 0 149 298 447 596 745 894
++ Imaging duration=298.0 s; Automatic polort=2
++ -stim_times using TR=2 s for stimulus timing conversion
++ -stim_times using TR=2 s for any -iresp output datasets
++ [you can alter the -iresp TR via the -TR_times option]
++ ** NOTE ** Will guess GLOBAL times if 1 time per line; LOCAL otherwise
++ ** GUESSED ** -stim_times_AM2 1 using GLOBAL times
++ '-stim_times_AM2 1' average amplitude#1=29.5895
++ '-stim_times_AM2 1' average amplitude#2=17.3368
++ ** GUESSED ** -stim_times_AM2 2 using GLOBAL times
++ '-stim_times_AM2 2' average amplitude#1=43.75
++ '-stim_times_AM2 2' average amplitude#2=17.88
++ ** GUESSED ** -stim_times_AM2 3 using GLOBAL times
++ '-stim_times_AM2 3' average amplitude#1=44.216
++ '-stim_times_AM2 3' average amplitude#2=25.144
++ ** GUESSED ** -stim_times_AM2 4 using GLOBAL times
++ '-stim_times_AM2 4' average amplitude#1=16.1478
++ '-stim_times_AM2 4' average amplitude#2=37.2174
++ ** GUESSED ** -stim_times_AM2 5 using GLOBAL times
++ '-stim_times_AM2 5' average amplitude#1=18.0364
++ '-stim_times_AM2 5' average amplitude#2=41.1182
++ ** GUESSED ** -stim_times_AM2 6 using GLOBAL times
++ '-stim_times_AM2 6' average amplitude#1=22.0588
++ '-stim_times_AM2 6' average amplitude#2=37.0118
++ ** GUESSED ** -stim_times_AM2 7 using GLOBAL times
++ '-stim_times_AM2 7' average amplitude#1=48.3333
++ '-stim_times_AM2 7' average amplitude#2=47.0833
++ '-stim_times_AM2 7' average amplitude#3=75
++ '-stim_times_AM2 7' average amplitude#4=44.25
++ ** GUESSED ** -stim_times_AM2 8 using GLOBAL times
++ '-stim_times_AM2 8' average amplitude#1=41.25
++ '-stim_times_AM2 8' average amplitude#2=56.9375
++ '-stim_times_AM2 8' average amplitude#3=73.75
++ '-stim_times_AM2 8' average amplitude#4=62.125
++ ** GUESSED ** -stim_times_AM2 9 using GLOBAL times
++ '-stim_times_AM2 9' average amplitude#1=55.4545
++ '-stim_times_AM2 9' average amplitude#2=32.5455
++ '-stim_times_AM2 9' average amplitude#3=65.4545
++ '-stim_times_AM2 9' average amplitude#4=61.7727
++ ** GUESSED ** -stim_times_AM2 10 using GLOBAL times
++ '-stim_times_AM2 10' average amplitude#1=35
++ '-stim_times_AM2 10' average amplitude#2=33.25
++ '-stim_times_AM2 10' average amplitude#3=80
++ '-stim_times_AM2 10' average amplitude#4=42.8333
++ ** GUESSED ** -stim_times_AM2 11 using GLOBAL times
++ '-stim_times_AM2 11' average amplitude#1=69.5238
++ '-stim_times_AM2 11' average amplitude#2=64.7619
++ '-stim_times_AM2 11' average amplitude#3=54.2857
++ '-stim_times_AM2 11' average amplitude#4=28.619
++ ** GUESSED ** -stim_times_AM2 12 using GLOBAL times
++ '-stim_times_AM2 12' average amplitude#1=77.1429
++ '-stim_times_AM2 12' average amplitude#2=65.2143
++ '-stim_times_AM2 12' average amplitude#3=60
++ '-stim_times_AM2 12' average amplitude#4=58.0714
++ ** GUESSED ** -stim_times_AM2 13 using GLOBAL times
++ '-stim_times_AM2 13' average amplitude#1=71.6667
++ '-stim_times_AM2 13' average amplitude#2=53.8333
++ '-stim_times_AM2 13' average amplitude#3=38.3333
++ '-stim_times_AM2 13' average amplitude#4=59.5
++ ** GUESSED ** -stim_times_AM2 14 using GLOBAL times
++ '-stim_times_AM2 14' average amplitude#1=75.2941
++ '-stim_times_AM2 14' average amplitude#2=40.1765
++ '-stim_times_AM2 14' average amplitude#3=57.6471
++ '-stim_times_AM2 14' average amplitude#4=27.5882
++ total shared memory needed = 2194794640 bytes (about 2.2 billion)
++ current memory malloc-ated = 569175207 bytes (about 569 million)
++ mmap() memory allocated: 2194794640 bytes (about 2.2 billion)
++ Memory required for output bricks = 2194794640 bytes (about 2.2 billion)
++ Wrote matrix image to file X.jpg
++ Wrote matrix values to file X.xmat.1D
++ ========= Things you can do with the matrix file =========
++ (a) Linear regression with ARMA(1,1) modeling of serial correlation:

3dREMLfit -matrix X.xmat.1D \
-input "pb04.JULY10_2007I.r01.scale+tlrc.HEAD pb04.JULY10_2007I.r02.scale+tlrc.HEAD pb04.JULY10_2007I.r03.scale+tlrc.HEAD pb04.JULY10_2007I.r04.scale+tlrc.HEAD pb04.JULY10_2007I.r05.scale+tlrc.HEAD pb04.JULY10_2007I.r06.scale+tlrc.HEAD pb04.JULY10_2007I.r07.scale+tlrc.HEAD" \
-mask full_mask.JULY10_2007I+tlrc -Rbeta coeffs.JULY10_2007I_REML \
-fout -tout -Rbuck stats.JULY10_2007I_REML -Rvar stats.JULY10_2007I_REMLvar \
-Rfitts fitts.JULY10_2007I_REML -verb

++ N.B.: 3dREMLfit command above written to file stats.REML_cmd
++ (b) Visualization/analysis of the matrix via ExamineXmat.R
++ (c) Synthesis of sub-model datasets using 3dSynthesize
++ ==========================================================
++ ----- Signal+Baseline matrix condition [X] (1043x481): 20.7244 ++ GOOD ++
*+ WARNING: !! in Signal+Baseline matrix:
* Largest singular value=3.35145
* 8 singular values are less than cutoff=3.35145e-07
* Implies strong collinearity in the matrix columns!
++ Signal+Baseline matrix singular values:
-3.37751e-16 -1.46008e-16 2.20727e-17 1.12214e-16 2.42623e-16
5.1715e-16 7.56008e-16 9.2526e-16 0.00780312 0.00912512
0.00956073 0.0105657 0.0118083 0.0126447 0.0146948
0.0162681 0.0163646 0.0173503 0.0185045 0.0191722
0.01948 0.0203486 0.0211466 0.0222718 0.0236961
0.0242096 0.0256205 0.0591713 0.0708909 0.0734689
0.0795592 0.0825152 0.0842061 0.0914776 0.0939785
0.095328 0.096909 0.0986183 0.101661 0.1037
0.105901 0.106365 0.107104 0.109718 0.111118
0.116571 0.117067 0.120017 0.121515 0.124903
0.126993 0.133593 0.136765 0.138432 0.140898
0.145163 0.148987 0.14951 0.153848 0.158832
0.160765 0.164589 0.167211 0.172991 0.17462
0.176505 0.180456 0.181774 0.184465 0.189652
0.190882 0.194387 0.196896 0.200383 0.2046
0.206755 0.209177 0.213452 0.216736 0.220916
0.224916 0.231799 0.233003 0.23874 0.242927
0.245056 0.245614 0.248102 0.251439 0.252811
0.25775 0.259751 0.261071 0.265474 0.268304
0.273268 0.275693 0.278814 0.281565 0.283148
0.284706 0.28874 0.290572 0.29392 0.296307
0.298388 0.301875 0.307871 0.310221 0.314425
0.319044 0.323578 0.325944 0.328057 0.333441
0.335298 0.339195 0.341953 0.344474 0.344941
0.348303 0.351421 0.354571 0.361008 0.363377
0.369196 0.373199 0.375262 0.378971 0.380365
0.391915 0.396912 0.40239 0.407246 0.409645
0.421379 0.428246 0.432793 0.436689 0.440194
0.442913 0.445825 0.446466 0.457985 0.461017
0.465931 0.472033 0.473773 0.475813 0.483039
0.488112 0.493353 0.496297 0.502202 0.510487
0.514855 0.515308 0.525728 0.526125 0.532933
0.539237 0.545197 0.551891 0.553251 0.557675
0.559045 0.567266 0.570505 0.57359 0.579959
0.58178 0.587095 0.588387 0.595097 0.601528
0.606078 0.610219 0.613301 0.620857 0.623215
0.62558 0.633817 0.638902 0.645187 0.649688
0.653632 0.657303 0.661681 0.668219 0.668554
0.671417 0.675648 0.677855 0.682412 0.688431
0.695757 0.698683 0.700984 0.705284 0.71024
0.711922 0.715155 0.720578 0.731547 0.738159
0.742175 0.746019 0.747203 0.753548 0.75524
0.760665 0.766466 0.770794 0.773322 0.77468
0.785543 0.788004 0.788732 0.793003 0.796414
0.802949 0.811664 0.81378 0.817633 0.824286
0.82877 0.835352 0.838323 0.84211 0.846288
0.854386 0.858309 0.863523 0.870521 0.872022
0.875954 0.884543 0.886582 0.890478 0.891695
0.894722 0.89714 0.901283 0.910454 0.914036
0.920713 0.922826 0.928627 0.933746 0.934425
0.941823 0.94456 0.947143 0.95026 0.954252
0.958548 0.961363 0.967571 0.969973 0.974097
0.977731 0.989557 0.993277 1.00228 1.00524
1.00742 1.01208 1.01537 1.02106 1.02181
1.02557 1.03004 1.03329 1.0362 1.04227
1.04551 1.04815 1.04979 1.05606 1.06235
1.06548 1.07022 1.07338 1.07438 1.07957
1.08379 1.08638 1.09054 1.09207 1.09412
1.09798 1.10193 1.10543 1.11149 1.11605
1.12012 1.12374 1.12594 1.12803 1.1349
1.14102 1.14578 1.15118 1.15427 1.1635
1.16391 1.16814 1.17487 1.17957 1.18265
1.18609 1.18888 1.19327 1.20155 1.2043
1.20614 1.21238 1.22108 1.2223 1.2388
1.24752 1.25151 1.25634 1.26032 1.26358
1.27266 1.28195 1.28551 1.29104 1.29756
1.29971 1.31176 1.31638 1.321 1.32833
1.33147 1.33682 1.34289 1.34691 1.35762
1.36162 1.36894 1.37335 1.37635 1.3795
1.38618 1.39059 1.39815 1.39949 1.41442
1.41986 1.42215 1.43204 1.4379 1.44312
1.45193 1.46563 1.47447 1.48066 1.4818
1.48624 1.49283 1.49559 1.50753 1.5112
1.51388 1.52155 1.53709 1.54135 1.55894
1.5629 1.57579 1.57924 1.5805 1.58799
1.59576 1.60217 1.60324 1.60879 1.62375
1.62781 1.64449 1.64606 1.65217 1.6622
1.66474 1.67721 1.68286 1.68967 1.69871
1.70578 1.71106 1.71424 1.71909 1.72865
1.73414 1.74039 1.7411 1.74301 1.75502
1.76598 1.77708 1.78519 1.78887 1.80007
1.80504 1.82638 1.83676 1.84777 1.86173
1.87291 1.87653 1.8969 1.90109 1.91244
1.91395 1.92136 1.94217 1.95733 1.96191
1.96839 1.97419 1.98697 1.99412 2.00008
2.00811 2.0115 2.02264 2.03798 2.05821
2.06085 2.0699 2.08468 2.10095 2.11367
2.11834 2.13474 2.1416 2.14794 2.1555
2.1606 2.1673 2.18035 2.2018 2.21105
2.21716 2.23646 2.24843 2.26777 2.30128
2.31366 2.31942 2.33462 2.3531 2.35771
2.38055 2.4068 2.40774 2.42735 2.43855
2.45393 2.47416 2.49574 2.5345 2.57992
2.6081 2.61897 2.67584 2.72493 2.76308
2.77398 2.79494 2.82743 2.83802 2.87691
2.91067 2.94806 2.98922 3.00507 3.08913
3.35145
++ ----- Signal-only matrix condition [X] (1043x460): 18.9199 ++ VERY GOOD ++
*+ WARNING: !! in Signal-only matrix:
* Largest singular value=3.01071
* 8 singular values are less than cutoff=3.01071e-07
* Implies strong collinearity in the matrix columns!
++ Signal-only matrix singular values:
-7.06193e-16 -4.69261e-16 -3.11085e-16 -1.68862e-16 3.88673e-17
1.44896e-16 2.33256e-16 1.02434e-15 0.00841068 0.00957781
0.00979179 0.0108382 0.014178 0.0154626 0.0170386
0.0172154 0.0175943 0.0190998 0.019901 0.0205426
0.0212876 0.0213327 0.022885 0.0240591 0.0247437
0.0262179 0.0642521 0.0721761 0.0834778 0.0849563
0.0877255 0.0925756 0.096091 0.0965247 0.100298
0.101164 0.101825 0.106024 0.107626 0.10769
0.112475 0.112865 0.114445 0.116531 0.123929
0.124777 0.129663 0.130331 0.136234 0.139402
0.140257 0.14562 0.150899 0.152066 0.155395
0.157279 0.160791 0.16497 0.167112 0.169148
0.179616 0.181939 0.185727 0.188069 0.193252
0.19355 0.195096 0.196942 0.203175 0.206547
0.20947 0.215657 0.218677 0.221671 0.227081
0.232976 0.235225 0.239454 0.243752 0.247019
0.249038 0.25175 0.254087 0.255282 0.260084
0.26269 0.267313 0.269802 0.271212 0.275808
0.277615 0.280167 0.28564 0.28661 0.288062
0.291202 0.29407 0.295879 0.302316 0.306192
0.307522 0.308363 0.310515 0.31482 0.320624
0.326681 0.328571 0.334727 0.336071 0.337948
0.34146 0.341634 0.345967 0.351602 0.355614
0.35919 0.362232 0.363583 0.364001 0.365225
0.368748 0.380706 0.383679 0.387645 0.400986
0.403208 0.411389 0.416925 0.423029 0.427592
0.436656 0.437539 0.441353 0.44561 0.452215
0.458109 0.463725 0.464955 0.467877 0.471887
0.482067 0.484751 0.491959 0.494544 0.502277
0.50364 0.510126 0.514237 0.521778 0.524598
0.535616 0.538726 0.545236 0.552893 0.555629
0.556741 0.562018 0.568478 0.574556 0.574687
0.579222 0.584677 0.586632 0.596727 0.601781
0.60548 0.610505 0.616758 0.624513 0.629279
0.63441 0.635222 0.643639 0.6462 0.648843
0.659052 0.660971 0.663569 0.668472 0.670097
0.671883 0.678938 0.684028 0.692696 0.694482
0.702975 0.703598 0.710637 0.713777 0.716123
0.719278 0.731044 0.738621 0.741338 0.746617
0.747412 0.751144 0.752556 0.759449 0.76259
0.770551 0.773058 0.776228 0.78134 0.784692
0.788578 0.792895 0.79664 0.802263 0.812444
0.819293 0.82009 0.826929 0.827481 0.834982
0.838725 0.843246 0.851712 0.861312 0.869136
0.870524 0.873485 0.875996 0.886762 0.889534
0.89126 0.892655 0.894775 0.896271 0.903883
0.907891 0.918045 0.919108 0.920045 0.923735
0.92768 0.932836 0.939285 0.941444 0.941731
0.949565 0.953642 0.95472 0.957022 0.968062
0.969217 0.980504 0.982549 0.987936 1
1 1.00033 1.00987 1.01106 1.01739
1.0202 1.02344 1.02905 1.03034 1.03697
1.03847 1.04506 1.04848 1.05163 1.05285
1.05849 1.06403 1.06586 1.06877 1.07152
1.07677 1.08354 1.08383 1.08572 1.0909
1.09798 1.09972 1.09976 1.10321 1.1082
1.11171 1.11292 1.12048 1.12225 1.12572
1.12989 1.13709 1.14115 1.1454 1.15336
1.15843 1.16367 1.16678 1.1678 1.17066
1.18101 1.18135 1.18739 1.19281 1.19312
1.19954 1.20455 1.21327 1.21773 1.2318
1.23624 1.24447 1.24969 1.25105 1.26666
1.26942 1.27615 1.28493 1.28728 1.2902
1.29847 1.30203 1.30549 1.31132 1.31805
1.32013 1.33332 1.34355 1.3525 1.36182
1.36373 1.36554 1.37093 1.37708 1.38134
1.38359 1.38456 1.3942 1.40304 1.41498
1.4178 1.42601 1.43231 1.44379 1.45165
1.46475 1.47337 1.47692 1.48173 1.48333
1.48626 1.50215 1.50234 1.5132 1.52255
1.52822 1.53807 1.53919 1.54 1.55774
1.55999 1.56569 1.57999 1.58053 1.59179
1.59701 1.60811 1.61703 1.62599 1.63889
1.64089 1.64619 1.65826 1.66232 1.67598
1.67983 1.6879 1.69099 1.69248 1.69381
1.7117 1.71528 1.71664 1.72363 1.72388
1.729 1.74549 1.75153 1.75204 1.76162
1.77557 1.78884 1.80232 1.8065 1.81654
1.82106 1.8269 1.84221 1.87449 1.88641
1.89277 1.89977 1.91003 1.91884 1.92521
1.94123 1.94394 1.97881 1.98023 1.99231
1.99268 1.99642 2 2 2.00827
2.03765 2.05536 2.06529 2.07187 2.08483
2.10499 2.11406 2.12322 2.12426 2.12523
2.14015 2.15668 2.15868 2.16314 2.18668
2.20353 2.21702 2.22514 2.23388 2.26376
2.27866 2.29692 2.31 2.3304 2.34126
2.3624 2.36726 2.41279 2.41532 2.42372
2.43229 2.44192 2.48974 2.49288 2.55057
2.6061 2.60967 2.67337 2.72292 2.7273
2.744 2.77367 2.81462 2.81686 2.82577
2.8649 2.913 2.94112 2.99394 3.01071
++ ----- Baseline-only matrix condition [X] (1043x21): 1 ++ VERY GOOD ++
++ ----- polort-only matrix condition [X] (1043x21): 1 ++ VERY GOOD ++
++ +++++ Matrix inverse average error = 0.00024202 ++ OK ++
++ Matrix setup time = 381.77 s
** ERROR: !! 3dDeconvolve: Can't run past 2 matrix warnings without '-GOFORIT 2'
** ERROR: !! Currently at -GOFORIT 0
** ERROR: !! See file 3dDeconvolve.err for all WARNING and ERROR messages !!
** ERROR: !! Be sure you understand what you are doing before using -GOFORIT !!
** ERROR: !! If in doubt, consult with someone or with the AFNI message board !!
** FATAL ERROR: !! 3dDeconvolve (regretfully) shuts itself down !!
---------------------------------------
** 3dDeconvolve error, failing...
(consider the file 3dDeconvolve.err)

================================


I ran 1d_tool.py on the resulting X.xmat.1D file, and got the following result for matrix correlations:

1d_tool.py -infile X.xmat.1D -show_cormat_warnings

Warnings regarding Correlation Matrix: X.xmat.1D

severity correlation regressor pair
-------- ----------- ----------------------------------------
medium: 0.694 (190 vs. 200) dTake8_n#19 vs. dTake8_n#29
medium: 0.694 (189 vs. 199) dTake8_n#18 vs. dTake8_n#28
medium: 0.694 (188 vs. 198) dTake8_n#17 vs. dTake8_n#27
medium: 0.694 (187 vs. 197) dTake8_n#16 vs. dTake8_n#26
medium: 0.694 (186 vs. 196) dTake8_n#15 vs. dTake8_n#25
medium: 0.694 (185 vs. 195) dTake8_n#14 vs. dTake8_n#24
medium: 0.694 (184 vs. 194) dTake8_n#13 vs. dTake8_n#23
medium: 0.694 (183 vs. 193) dTake8_n#12 vs. dTake8_n#22
medium: 0.694 (182 vs. 192) dTake8_n#11 vs. dTake8_n#21
medium: 0.692 (100 vs. 110) dPass8_n#19 vs. dPass8_n#29
medium: 0.692 (99 vs. 109) dPass8_n#18 vs. dPass8_n#28
medium: 0.692 (98 vs. 108) dPass8_n#17 vs. dPass8_n#27
medium: 0.692 (97 vs. 107) dPass8_n#16 vs. dPass8_n#26
medium: 0.692 (96 vs. 106) dPass8_n#15 vs. dPass8_n#25
medium: 0.692 (95 vs. 105) dPass8_n#14 vs. dPass8_n#24
medium: 0.692 (94 vs. 104) dPass8_n#13 vs. dPass8_n#23
medium: 0.692 (93 vs. 103) dPass8_n#12 vs. dPass8_n#22
medium: 0.692 (92 vs. 102) dPass8_n#11 vs. dPass8_n#21
medium: 0.692 (91 vs. 101) dPass8_n#10 vs. dPass8_n#20
medium: 0.687 (181 vs. 191) dTake8_n#10 vs. dTake8_n#20
medium: -0.657 (428 vs. 442) out_pass_good#17 vs. out_pass_good#31
medium: -0.657 (426 vs. 440) out_pass_good#15 vs. out_pass_good#29
medium: -0.657 (431 vs. 445) out_pass_good#20 vs. out_pass_good#34
medium: -0.657 (429 vs. 443) out_pass_good#18 vs. out_pass_good#32
medium: -0.657 (427 vs. 441) out_pass_good#16 vs. out_pass_good#30
medium: -0.657 (425 vs. 439) out_pass_good#14 vs. out_pass_good#28
medium: -0.657 (430 vs. 444) out_pass_good#19 vs. out_pass_good#33
medium: 0.564 (424 vs. 438) out_pass_good#13 vs. out_pass_good#27
medium: 0.564 (422 vs. 436) out_pass_good#11 vs. out_pass_good#25
medium: 0.564 (420 vs. 434) out_pass_good#9 vs. out_pass_good#23
medium: 0.564 (418 vs. 432) out_pass_good#7 vs. out_pass_good#21
medium: 0.564 (419 vs. 433) out_pass_good#8 vs. out_pass_good#22
medium: 0.564 (421 vs. 435) out_pass_good#10 vs. out_pass_good#24
medium: 0.564 (423 vs. 437) out_pass_good#12 vs. out_pass_good#26
medium: 0.558 (136 vs. 335) dTake4_n#25 vs. out_take_none#29
medium: 0.558 (138 vs. 337) dTake4_n#27 vs. out_take_none#31
medium: 0.558 (139 vs. 338) dTake4_n#28 vs. out_take_none#32
medium: 0.558 (137 vs. 336) dTake4_n#26 vs. out_take_none#30
medium: 0.558 (135 vs. 334) dTake4_n#24 vs. out_take_none#28
medium: 0.558 (140 vs. 339) dTake4_n#29 vs. out_take_none#33
medium: 0.495 (214 vs. 235) out_take_bad#13 vs. out_take_bad#34
medium: 0.495 (212 vs. 233) out_take_bad#11 vs. out_take_bad#32
medium: 0.495 (210 vs. 231) out_take_bad#9 vs. out_take_bad#30
medium: 0.495 (208 vs. 229) out_take_bad#7 vs. out_take_bad#28
medium: 0.495 (211 vs. 232) out_take_bad#10 vs. out_take_bad#31
medium: 0.495 (209 vs. 230) out_take_bad#8 vs. out_take_bad#29
medium: 0.495 (213 vs. 234) out_take_bad#12 vs. out_take_bad#33
medium: -0.482 (400 vs. 407) out_pass_both#24 vs. out_pass_both#31
medium: -0.482 (403 vs. 410) out_pass_both#27 vs. out_pass_both#34
medium: -0.482 (401 vs. 408) out_pass_both#25 vs. out_pass_both#32
medium: -0.482 (399 vs. 406) out_pass_both#23 vs. out_pass_both#30
medium: -0.482 (397 vs. 404) out_pass_both#21 vs. out_pass_both#28
medium: -0.482 (398 vs. 405) out_pass_both#22 vs. out_pass_both#29
medium: -0.482 (402 vs. 409) out_pass_both#26 vs. out_pass_both#33
medium: 0.479 (249 vs. 270) out_take_both#13 vs. out_take_both#34
medium: 0.479 (247 vs. 268) out_take_both#11 vs. out_take_both#32
medium: 0.479 (245 vs. 266) out_take_both#9 vs. out_take_both#30
medium: 0.479 (243 vs. 264) out_take_both#7 vs. out_take_both#28
medium: 0.479 (248 vs. 269) out_take_both#12 vs. out_take_both#33
medium: 0.479 (244 vs. 265) out_take_both#8 vs. out_take_both#29
medium: 0.479 (246 vs. 267) out_take_both#10 vs. out_take_both#31
medium: -0.468 (295 vs. 302) out_take_good#24 vs. out_take_good#31
medium: -0.468 (298 vs. 305) out_take_good#27 vs. out_take_good#34
medium: -0.468 (296 vs. 303) out_take_good#25 vs. out_take_good#32
medium: -0.468 (294 vs. 301) out_take_good#23 vs. out_take_good#30
medium: -0.468 (292 vs. 299) out_take_good#21 vs. out_take_good#28
medium: -0.468 (297 vs. 304) out_take_good#26 vs. out_take_good#33
medium: -0.468 (293 vs. 300) out_take_good#22 vs. out_take_good#29
medium: -0.460 (353 vs. 360) out_pass_bad#12 vs. out_pass_bad#19
medium: -0.460 (354 vs. 361) out_pass_bad#13 vs. out_pass_bad#20
medium: -0.460 (352 vs. 359) out_pass_bad#11 vs. out_pass_bad#18
medium: -0.460 (350 vs. 357) out_pass_bad#9 vs. out_pass_bad#16
medium: -0.460 (348 vs. 355) out_pass_bad#7 vs. out_pass_bad#14
medium: -0.460 (351 vs. 358) out_pass_bad#10 vs. out_pass_bad#17
medium: -0.460 (349 vs. 356) out_pass_bad#8 vs. out_pass_bad#15
medium: 0.458 (71 vs. 322) dPass6_n#20 vs. out_take_none#16
medium: 0.455 (424 vs. 445) out_pass_good#13 vs. out_pass_good#34
medium: 0.455 (422 vs. 443) out_pass_good#11 vs. out_pass_good#32
medium: 0.455 (420 vs. 441) out_pass_good#9 vs. out_pass_good#30
medium: 0.455 (418 vs. 439) out_pass_good#7 vs. out_pass_good#28
medium: 0.455 (419 vs. 440) out_pass_good#8 vs. out_pass_good#29
medium: 0.455 (421 vs. 442) out_pass_good#10 vs. out_pass_good#31
medium: 0.455 (423 vs. 444) out_pass_good#12 vs. out_pass_good#33
medium: 0.449 (46 vs. 475) dPass4_n#25 vs. out_pass_none#29
medium: 0.449 (49 vs. 478) dPass4_n#28 vs. out_pass_none#32
medium: 0.449 (47 vs. 476) dPass4_n#26 vs. out_pass_none#30
medium: 0.449 (45 vs. 474) dPass4_n#24 vs. out_pass_none#28
medium: 0.449 (50 vs. 479) dPass4_n#29 vs. out_pass_none#33
medium: 0.449 (48 vs. 477) dPass4_n#27 vs. out_pass_none#31
medium: 0.443 (61 vs. 71) dPass6_n#10 vs. dPass6_n#20
medium: 0.443 (149 vs. 274) dTake6_n#8 vs. out_take_good#3
medium: 0.443 (148 vs. 273) dTake6_n#7 vs. out_take_good#2
medium: 0.443 (146 vs. 271) dTake6_n#5 vs. out_take_good#0
medium: 0.443 (150 vs. 275) dTake6_n#9 vs. out_take_good#4
medium: 0.443 (147 vs. 272) dTake6_n#6 vs. out_take_good#1
medium: 0.439 (98 vs. 398) dPass8_n#17 vs. out_pass_both#22
medium: 0.439 (99 vs. 399) dPass8_n#18 vs. out_pass_both#23
medium: 0.439 (97 vs. 397) dPass8_n#16 vs. out_pass_both#21
medium: 0.439 (100 vs. 400) dPass8_n#19 vs. out_pass_both#24
medium: 0.438 (72 vs. 323) dPass6_n#21 vs. out_take_none#17
medium: 0.438 (74 vs. 325) dPass6_n#23 vs. out_take_none#19
medium: 0.438 (75 vs. 326) dPass6_n#24 vs. out_take_none#20
medium: 0.438 (73 vs. 324) dPass6_n#22 vs. out_take_none#18
medium: 0.428 (330 vs. 337) out_take_none#24 vs. out_take_none#31
medium: 0.428 (333 vs. 340) out_take_none#27 vs. out_take_none#34
medium: 0.428 (331 vs. 338) out_take_none#25 vs. out_take_none#32
medium: 0.428 (329 vs. 336) out_take_none#23 vs. out_take_none#30
medium: 0.428 (327 vs. 334) out_take_none#21 vs. out_take_none#28
medium: 0.428 (332 vs. 339) out_take_none#26 vs. out_take_none#33
medium: 0.428 (328 vs. 335) out_take_none#22 vs. out_take_none#29
medium: 0.424 (88 vs. 342) dPass8_n#7 vs. out_pass_bad#1
medium: 0.424 (90 vs. 344) dPass8_n#9 vs. out_pass_bad#3
medium: 0.424 (89 vs. 343) dPass8_n#8 vs. out_pass_bad#2
medium: 0.424 (87 vs. 341) dPass8_n#6 vs. out_pass_bad#0
medium: 0.418 (458 vs. 472) out_pass_none#12 vs. out_pass_none#26
medium: 0.418 (459 vs. 473) out_pass_none#13 vs. out_pass_none#27
medium: 0.418 (457 vs. 471) out_pass_none#11 vs. out_pass_none#25
medium: 0.418 (455 vs. 469) out_pass_none#9 vs. out_pass_none#23
medium: 0.418 (453 vs. 467) out_pass_none#7 vs. out_pass_none#21
medium: 0.418 (456 vs. 470) out_pass_none#10 vs. out_pass_none#24
medium: 0.418 (454 vs. 468) out_pass_none#8 vs. out_pass_none#22
medium: 0.417 (198 vs. 293) dTake8_n#27 vs. out_take_good#22
medium: 0.417 (199 vs. 294) dTake8_n#28 vs. out_take_good#23
medium: 0.417 (197 vs. 292) dTake8_n#26 vs. out_take_good#21
medium: 0.417 (200 vs. 295) dTake8_n#29 vs. out_take_good#24
medium: 0.413 (130 vs. 234) dTake4_n#19 vs. out_take_bad#33
medium: 0.413 (128 vs. 232) dTake4_n#17 vs. out_take_bad#31
medium: 0.413 (129 vs. 233) dTake4_n#18 vs. out_take_bad#32
medium: 0.413 (127 vs. 231) dTake4_n#16 vs. out_take_bad#30
medium: 0.413 (125 vs. 229) dTake4_n#14 vs. out_take_bad#28
medium: 0.413 (126 vs. 230) dTake4_n#15 vs. out_take_bad#29
medium: 0.412 (115 vs. 306) dTake4_n#4 vs. out_take_none#0
medium: 0.412 (119 vs. 310) dTake4_n#8 vs. out_take_none#4
medium: 0.412 (117 vs. 308) dTake4_n#6 vs. out_take_none#2
medium: 0.412 (118 vs. 309) dTake4_n#7 vs. out_take_none#3
medium: 0.412 (116 vs. 307) dTake4_n#5 vs. out_take_none#1
medium: 0.412 (120 vs. 311) dTake4_n#9 vs. out_take_none#5
medium: 0.412 (46 vs. 468) dPass4_n#25 vs. out_pass_none#22
medium: 0.412 (49 vs. 471) dPass4_n#28 vs. out_pass_none#25
medium: 0.412 (47 vs. 469) dPass4_n#26 vs. out_pass_none#23
medium: 0.412 (45 vs. 467) dPass4_n#24 vs. out_pass_none#21
medium: 0.412 (50 vs. 472) dPass4_n#29 vs. out_pass_none#26
medium: 0.412 (48 vs. 470) dPass4_n#27 vs. out_pass_none#24
medium: 0.410 (130 vs. 140) dTake4_n#19 vs. dTake4_n#29
medium: 0.410 (129 vs. 139) dTake4_n#18 vs. dTake4_n#28
medium: 0.410 (128 vs. 138) dTake4_n#17 vs. dTake4_n#27
medium: 0.410 (127 vs. 137) dTake4_n#16 vs. dTake4_n#26
medium: 0.410 (126 vs. 136) dTake4_n#15 vs. dTake4_n#25
medium: 0.410 (125 vs. 135) dTake4_n#14 vs. dTake4_n#24
medium: 0.410 (124 vs. 134) dTake4_n#13 vs. dTake4_n#23
medium: 0.410 (123 vs. 133) dTake4_n#12 vs. dTake4_n#22
medium: 0.410 (122 vs. 132) dTake4_n#11 vs. dTake4_n#21
medium: 0.410 (121 vs. 131) dTake4_n#10 vs. dTake4_n#20
medium: 0.409 (59 vs. 379) dPass6_n#8 vs. out_pass_both#3
medium: 0.409 (57 vs. 377) dPass6_n#6 vs. out_pass_both#1
medium: 0.409 (58 vs. 378) dPass6_n#7 vs. out_pass_both#2
medium: 0.409 (56 vs. 376) dPass6_n#5 vs. out_pass_both#0
medium: 0.409 (60 vs. 380) dPass6_n#9 vs. out_pass_both#4
medium: 0.408 (286 vs. 293) out_take_good#15 vs. out_take_good#22
medium: 0.408 (290 vs. 297) out_take_good#19 vs. out_take_good#26
medium: 0.408 (291 vs. 298) out_take_good#20 vs. out_take_good#27
medium: 0.408 (289 vs. 296) out_take_good#18 vs. out_take_good#25
medium: 0.408 (287 vs. 294) out_take_good#16 vs. out_take_good#23
medium: 0.408 (285 vs. 292) out_take_good#14 vs. out_take_good#21
medium: 0.408 (288 vs. 295) out_take_good#17 vs. out_take_good#24
medium: 0.408 (138 vs. 330) dTake4_n#27 vs. out_take_none#24
medium: 0.408 (139 vs. 331) dTake4_n#28 vs. out_take_none#25
medium: 0.408 (137 vs. 329) dTake4_n#26 vs. out_take_none#23
medium: 0.408 (136 vs. 328) dTake4_n#25 vs. out_take_none#22
medium: 0.408 (135 vs. 327) dTake4_n#24 vs. out_take_none#21
medium: 0.408 (140 vs. 332) dTake4_n#29 vs. out_take_none#26

========================
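For intuition about the warnings listed above, the kind of column-pair correlation check that 1d_tool.py performs can be sketched with NumPy. This is a toy illustration, not AFNI code: the matrix and the 0.4 threshold are made up for the example (the real tool also excludes baseline regressors and uses its own severity cutoffs).

```python
import numpy as np

# Toy "design matrix": columns 0 and 2 are built from the same signal,
# so they are deliberately correlated; column 1 is independent noise.
rng = np.random.default_rng(1)
base = rng.standard_normal(200)
X = np.column_stack([
    base + 0.3 * rng.standard_normal(200),
    rng.standard_normal(200),
    base + 0.3 * rng.standard_normal(200),
])

# Column-by-column correlation matrix of the regressors.
cormat = np.corrcoef(X, rowvar=False)

# Report off-diagonal pairs whose |r| exceeds a warning threshold,
# in the same "(i vs. j)" style as the 1d_tool.py output above.
thresh = 0.4
for i in range(cormat.shape[0]):
    for j in range(i + 1, cormat.shape[1]):
        if abs(cormat[i, j]) > thresh:
            print(f"({i} vs. {j})  r = {cormat[i, j]:.3f}")
```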

My question is: what do these singular values say about the collinearity of the matrix, and would 3dDeconvolve be unable to adequately solve the system if I added '-GOFORIT 2' (i.e., would this lead to a pseudo-inverse solution for the matrix)? My impression was that only perfectly collinear columns produce this sort of error, but if there is a more technical explanation for what is happening here, I'm all ears.
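As a minimal sketch of the issue in the question (toy data, not the actual X.xmat.1D): when one column of a matrix is an exact linear combination of others, one singular value drops to (numerically) zero and falls below a relative cutoff of the kind 3dDeconvolve reports, yet the Moore-Penrose pseudo-inverse still returns the minimum-norm least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(100)
b = rng.standard_normal(100)
# Third column is an exact linear combination of the first two,
# making the matrix rank-deficient (perfect collinearity).
X = np.column_stack([a, b, 2.0 * a - b])

# Singular values, with a relative cutoff in the style of the
# "less than cutoff = largest * 1e-7" warning above.
s = np.linalg.svd(X, compute_uv=False)
cutoff = s.max() * 1e-7
print("singular values:", s)
print("below cutoff:", int((s < cutoff).sum()))

# X'X is singular, so an ordinary inverse fails, but the
# pseudo-inverse still yields a (non-unique) solution.
y = rng.standard_normal(100)
beta = np.linalg.pinv(X) @ y
print("pseudo-inverse betas:", beta)
```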

Thank you again for your time and patience,

-Andrew
Subject Author Posted

Output of 1d_tool.py

Andrew Jahn June 05, 2009 04:40PM
