Hello everyone,
I'm using a cDAQ-9174 chassis with an NI 9401 digital I/O module to output a finite sequence at a specific rate to trigger some devices. However, the actual sample rate differs from the rate I configure.
I made some measurements with the attached VI to compare the expected sample rate with the actual effective one (all values in Hz):
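For reference, the measurement boils down to timing each finite generation and dividing the number of samples by the elapsed time. A minimal Python stand-in for that logic (the `time.sleep` simulates the generation; the real VI uses DAQmx calls, and the function name here is made up):

```python
import time

def measure_effective_rate(nominal_hz, n_samples):
    """Time one finite generation and return samples / elapsed seconds.

    Assumption: this mirrors what the attached VI does. The sleep is a
    stand-in for Start Task -> Wait Until Done -> Stop Task.
    """
    start = time.perf_counter()
    time.sleep(n_samples / nominal_hz)  # simulated finite generation
    elapsed = time.perf_counter() - start
    return n_samples / elapsed
```

Since the sleep can only overshoot the requested duration, this simulation always reports a rate at or slightly below nominal; the hardware measurements above obviously behave differently.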
Expected (Hz) | Actual (Hz) | Actual - Expected (Hz) |
1 | 1.19624 | 0.19624 |
5 | 5.92426 | 0.92426 |
10 | 11.7485 | 1.7485 |
20 | 22.9828 | 2.9828 |
30 | 33.7812 | 3.7812 |
40 | 43.8152 | 3.8152 |
50 | 54.1208 | 4.1208 |
60 | 62.8937 | 2.8937 |
70 | 71.8916 | 1.8916 |
80 | 80.5822 | 0.5822 |
90 | 88.9891 | -1.0109 |
100 | 96.6605 | -3.3395 |
120 | 112.182 | -7.818 |
140 | 125.938 | -14.062 |
160 | 139.997 | -20.003 |
180 | 158.569 | -21.431 |
200 | 171.392 | -28.608 |
I expected the effective rate to fall below the nominal rate, since I stop and restart the task after each iteration and that adds a fixed overhead per iteration. But as the table shows, below ~90 Hz the effective sample rate is actually *higher* than expected! What's happening here?
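For what it's worth, here is one toy model that would produce exactly this crossover. This is a guess, not a measurement: `n_samples` and the overhead value below are made-up numbers. If the timed interval spans only the `n_samples - 1` clock periods between the first and last sample edge, while each iteration additionally pays a fixed stop/restart cost, the computed rate exceeds nominal at low frequencies (missing period dominates) and falls below it at high frequencies (fixed overhead dominates):

```python
def modeled_effective_rate(nominal_hz, n_samples, restart_overhead_s):
    """Toy model (assumption, not verified against hardware):

        effective = n_samples / ((n_samples - 1) / nominal_hz + overhead)

    The (n_samples - 1) term reflects timing from the first to the last
    sample edge instead of over n_samples full periods; the overhead term
    is the fixed per-iteration stop/restart cost.
    """
    return n_samples / ((n_samples - 1) / nominal_hz + restart_overhead_s)
```

With, say, 10 samples and 20 ms of restart overhead, the model gives an effective rate above nominal at 1 Hz and well below nominal at 200 Hz, qualitatively matching the sign flip in my table. I'd be glad to hear whether this matches what DAQmx actually does.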