Time in LabVIEW DAQ Assistant and Write LabVIEW file

Post by PKP » Mon, 16 Jul 2007 08:10:04


Hello all,
I would be extremely grateful for your assistance in helping me understand the timing in LabVIEW.
Here are the details of the system. Data acquisition is done with a simple LabVIEW 8.2 program consisting of a DAQ Assistant VI and a Write to Measurement File VI, which writes data in TDM format (binary with an XML header).
The hardware is a PXI-1042Q chassis containing two PXI-6133 cards; 16 transducers are connected to these 8-channel cards. The chassis is connected to a PC via an MXI-4 fiber-optic cable and a PCI-8336 card.
 
I would like to sample all 16 channels simultaneously at 500 kS/s for 2 s, or at 1 MS/s for 0.5 s.
When the signals were sampled at 500 kS/s for 2 s, the time column looks like this when the file is opened in DIAdem.
 
Serial#   Time                       Time (in secs., displayed in number format with 8 decimals)
1         07/10/2007 00:09:42.1406   63351158982.14060800
14        07/10/2007 00:09:42.1407   63351158982.14069760
64        07/10/2007 00:09:42.1408   63351158982.14080000
109       07/10/2007 00:09:42.1409   63351158982.14090240
etc.
 
The time does not change from serial 1 to 13; only at the 14th row does it change, after which it remains constant until the 63rd row, and then it stays the same until row 108, and so on. I understand that the time displayed in the format "63351158982.14069760" is the total number of seconds since 0 A.D.
But why does the time step not change in 0.000002 s (2 µs) intervals? When an operation was performed in DIAdem to take the difference between individual time values, the time steps varied from one pair of rows to another, e.g., 4 µs, -4 µs, -6 µs, etc.
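A likely culprit (an assumption on my part, illustrated in Python rather than LabVIEW, using the values from the table above) is double-precision rounding: near 6.3e10 seconds, adjacent representable doubles are about 7.6 µs apart, which is coarser than the 2 µs sample interval, so several consecutive timestamps collapse onto the same stored value:

```python
import math

# Absolute timestamp from the table above: seconds since 0 A.D.
t0 = 63351158982.1406
dt = 2e-6  # 2 us sample interval at 500 kS/s

# Gap between adjacent representable doubles at this magnitude:
print(math.ulp(t0))  # ~7.6e-6 s, i.e. ~7.6 us, coarser than the 2 us step

# Building absolute timestamps as t0 + n*dt therefore produces runs of
# identical values followed by jumps, much like the rows in the table:
stamps = [t0 + n * dt for n in range(8)]
print(len(set(stamps)))  # fewer than 8 distinct values
```

If that is what is happening, the jitter measured in DIAdem would be a storage/display artifact of the absolute time column, not the board actually sampling at irregular instants.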
 
A similar "non-uniform time step" trend is seen when sampling at 1 MS/s as well.
 
Is the system multiplexing the sampling of signals from the 16 channels, rather than taking simultaneous sampling of all 16 channels at one point in time? (I think this is the case.) How is the time step determined by the system?
 
How do I estimate the uncertainty in the determination of time? Where do I look for the uncertainty information for the DAQ system?
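For what it is worth, relative time kept as a sample index divided by the sample rate stays exact at these magnitudes, since the values are small (0 to 2 s for a 2 s capture). A minimal sketch (Python, not LabVIEW; the 500 kS/s figure is taken from the setup described above):

```python
FS = 500_000  # sample rate in S/s (500 kS/s, from the setup above)

def rel_time(n: int, fs: int = FS) -> float:
    """Relative time of sample n in seconds.

    n/fs is small here (0..2 s), so double precision easily resolves
    the 2 us step -- unlike an absolute value near 6.3e10 s.
    """
    return n / fs

# Consecutive differences come out at the expected 2 us:
steps = [rel_time(n + 1) - rel_time(n) for n in range(5)]
print(steps)
```

So one practical check would be to compare the time column against sample-index/rate rather than differencing the absolute timestamps.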
 
Evidently, I am perplexed. If anyone can point me in the right direction(s), I'd be very grateful. Thank you all for your time.
Philip
 
 
 

1. How to write data acquired by DAQ Assistant in Labview 7.0 to spread sheet?

2. I am trying to use a labview program written on labview 6.1 on labview 7.1.

Hi Kalina,
altenbach already gave you the correct answer, and I am really sorry there is not much we can do at this point. However, if you have any other problems upgrading VIs from one LabVIEW version to another, feel free to contact us. Just make sure you have the block diagrams (i.e., the brains of your programs). As a general rule, it is good to mass compile your directory of old VIs so that they are properly updated to the new version and all subVIs (if any) are relinked. To mass compile, click Tools»Advanced»Mass Compile.
Hope that helps!

3. Why labview error code 11 following installation of Labview 6.1 with NI-DAQ 6.1 on NT4

4. How do I enable DAQ Assistant in LabView Student Edition?

5. Labview ignores all but 1 Daq Assistant, how can I avoid this?

6. What are my options for deploying DAQ assistant with a built LabView application.

7. change a LabVIEW array outside LabVIEW and keep updated in LabVIEW

8. Using a PC4350 DAQ and/or a PC-LPM-16/PnP DAQ under Labview 7.1

9. vi.lib\Daq directory not found after installing NI-DAQ 7 followed by LabVIEW 7.1

10. import of virtual instruments written in Labview 3.1 in Labview 6.1?

11. Desktop PC as LabVIEW Real-Time Target and DAQ ISA card

12. Can anyone provide me suitable material for LabVIEW FPGA & LabVIEW Real-Time?

13. LabVIEW real time vs LabVIEW desktop

14. Labview timing issues with DAQ and Field Point

15. Can LabVIEW Run-Time Engine 7.0 control LabVIEW 6.1 VIs?