Obtaining time information for collected data

Hi,
I'm trying to read the signals from 16 different microphones using the attached VI. I believe that the DAQ VIs are working fine.
However, my problem is that the data I obtain is only a plain array without any time information. This prevents me from calculating things like the frequency of the input data. If I'm not mistaken, the dt value of the input data is currently at its default of 1. I wanted to know what I could change to obtain the actual analog sample timing of the input data, and to be able to save the X and Y values of the waveforms to a file.
Thanks.
Attachments:
data_acquisition3.vi ‏166 KB
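The core of the fix is carrying dt (the reciprocal of the sample rate) alongside the samples instead of a bare array. A minimal sketch of the idea in Python rather than LabVIEW (the sample rate and the 440 Hz test signal here are invented stand-ins for the DAQ output, not values from the attached VI):

```python
import numpy as np

fs = 10000.0           # assumed sample rate in Hz (set by the DAQ task)
dt = 1.0 / fs          # the waveform dt; a bare array loses this and defaults to 1
n = 1024
t = np.arange(n) * dt  # reconstructed time axis (the waveform's X values)

# stand-in for one microphone channel: a 440 Hz tone
y = np.sin(2 * np.pi * 440.0 * t)

# with the correct dt, the FFT bin frequencies are physically meaningful
freqs = np.fft.rfftfreq(n, d=dt)
dominant = freqs[np.argmax(np.abs(np.fft.rfft(y)))]
```

With dt left at its default of 1, the same FFT would report the dominant frequency in cycles-per-sample rather than Hz, which is why the analog frequency comes out wrong.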

Thank you for the suggestion.
I have changed the instance of the polymorphic VI. But now I'm having issues with saving the data: I don't know how I could save the data for all the channels (similar to my post on another thread).
Write To Measurement File does what I want, but I just wondered whether I could use the Write to Spreadsheet File VI instead (as I prefer that format).
Cheers.
Message Edited by imperial-aero on 03-02-2008 05:13 AM
Attachments:
wavedata.PNG ‏24 KB
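For reference, what a spreadsheet-style save of a multichannel acquisition amounts to is a delimited text file with the time (X) values in the first column and one Y column per channel. A rough Python equivalent of that layout (the sample rate, sample count, and random data are placeholders, not the actual acquired signals):

```python
import numpy as np

fs = 10000.0                       # assumed sample rate
n_samples, n_channels = 100, 16    # 16 microphone channels
t = np.arange(n_samples) / fs      # X values (time), derived from dt = 1/fs
y = np.random.rand(n_samples, n_channels)  # stand-in for acquired data

# first column is time, remaining columns are one per channel,
# tab-delimited like a spreadsheet file
table = np.column_stack((t, y))
np.savetxt("wavedata.txt", table, delimiter="\t")

loaded = np.loadtxt("wavedata.txt")
```

The same row-per-sample, column-per-channel arrangement is what you would build in LabVIEW (time column prepended to the transposed channel data) before wiring it to the spreadsheet-writing VI.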

Similar Messages

  • When I change the time zone of the clock, the "Date created" time information for my documents and image files in the Finder window (and in Get Info) is changed. Can I make the time info in "Date created" remain fixed regardless of the clock's timezone?

    When I change the time zone of the clock, the "Date created" time information for my documents and image files in the Finder window (and in Get Info) is changed. Can I make the time info in "Date created" remain fixed regardless of the clock's timezone?

  • TRANSFER failed!.Error: Error obtaining distribution information for asset_

    I am getting the following error while running the Asset Transfer API in R11:
    TRANSFER failed!.
    Error: Error obtaining distribution information for asset_id &ASSET_ID.
    I want to transfer the asset location; the code is as follows:
    l_asset_dist_tbl(1).distribution_id := 2385128;
    l_asset_dist_tbl(1).transaction_units := -1;
    l_asset_dist_tbl(2).transaction_units := 1;
    l_asset_dist_tbl(2).assigned_to := null;
    l_asset_dist_tbl(2).expense_ccid := null;
    Any idea is highly appreciated.

    Below is the code I used to transfer the asset; it is the same code mentioned in Metalink.
    declare
      l_return_status varchar2(1);
      l_msg_count number := 0;
      l_msg_data varchar2(4000);
      l_trans_rec fa_api_types.trans_rec_type;
      l_asset_hdr_rec fa_api_types.asset_hdr_rec_type;
      l_asset_dist_tbl fa_api_types.asset_dist_tbl_type;
      temp_str varchar2(512);
    begin
      --fnd_profile.put('PRINT_DEBUG', 'Y');
      dbms_output.enable(1000000);
      fa_srvr_msg.init_server_message;
      fa_debug_pkg.initialize;
      -- fill in asset information
      l_asset_hdr_rec.asset_id := 2000068;
      l_asset_hdr_rec.book_type_code := 'US ASSET SERIAL';
      -- transaction date must be filled in if performing
      -- a prior period transfer
      --l_trans_rec.transaction_date_entered := to_date('01-JAN-1999 10:54:22','dd-mon-yyyy hh24:mi:ss');
      l_asset_dist_tbl.delete;
      -- fill in distribution data for existing distribution lines
      -- affected by this transfer txn. Note: you need to fill in
      -- only the affected distribution lines.
      -- For the source distribution, you must fill in either the existing
      -- distribution id, or 2 columns (expense_ccid, location_ccid), or
      -- the 3-tuple (assigned_to, expense_ccid, and location_ccid),
      -- depending on the makeup of the particular distribution
      -- of the asset.
      --l_asset_dist_tbl(1).distribution_id := 108422;
      --l_asset_dist_tbl(1).transaction_units := -1;
      l_asset_dist_tbl(1).transaction_units := -1;
      l_asset_dist_tbl(1).expense_ccid := 108422;
      /* either the above 2 lines or the below 4 lines must be provided
         for the source distribution:
         l_asset_dist_tbl(1).transaction_units := -2;
         l_asset_dist_tbl(1).assigned_to := 11;
         l_asset_dist_tbl(1).expense_ccid := 15338;
         l_asset_dist_tbl(1).location_ccid := 3; */
      -- fill in dist info for the destination distribution
      l_asset_dist_tbl(2).transaction_units := 1;
      --l_asset_dist_tbl(2).assigned_to := NULL;
      l_asset_dist_tbl(2).expense_ccid := 109260;
      --l_asset_dist_tbl(2).location_ccid := 3;
      --l_asset_dist_tbl(3).transaction_units := 1;
      --l_asset_dist_tbl(3).assigned_to := 10;
      --l_asset_dist_tbl(3).expense_ccid := 24281;
      --l_asset_dist_tbl(3).location_ccid := 3;
      l_trans_rec.who_info.last_updated_by := 25728;   --FND_GLOBAL.USER_ID;
      l_trans_rec.who_info.last_update_login := 25728; --FND_GLOBAL.LOGIN_ID;
      FA_TRANSFER_PUB.do_transfer(
        p_api_version      => 1.0,
        p_init_msg_list    => FND_API.G_FALSE,
        p_commit           => FND_API.G_FALSE,
        p_validation_level => FND_API.G_VALID_LEVEL_FULL,
        p_calling_fn       => NULL,
        x_return_status    => l_return_status,
        x_msg_count        => l_msg_count,
        x_msg_data         => l_msg_data,
        px_trans_rec       => l_trans_rec,
        px_asset_hdr_rec   => l_asset_hdr_rec,
        px_asset_dist_tbl  => l_asset_dist_tbl);
      if (l_return_status != FND_API.G_RET_STS_SUCCESS) then
        dbms_output.put_line('TRANSFER failed!.');
        l_msg_count := fnd_msg_pub.count_msg;
        if (l_msg_count > 0) then
          temp_str := substr(fnd_msg_pub.get(fnd_msg_pub.G_FIRST, fnd_api.G_FALSE), 1, 512);
          dbms_output.put_line('Error: ' || temp_str);
          for i in 1 .. (l_msg_count - 1) loop
            temp_str := substr(fnd_msg_pub.get(fnd_msg_pub.G_NEXT, fnd_api.G_FALSE), 1, 512);
            dbms_output.put_line('Error: ' || temp_str);
          end loop;
        end if;
      else
        dbms_output.put_line('TRANSFER completed successfully!');
        dbms_output.put_line('THID = ' || to_char(l_trans_rec.transaction_header_id));
      end if;
      fnd_msg_pub.delete_msg();
    end;
    Thanks

  • I have lost my login information for cellular data account. How do I retrieve it?

    I have lost my login information for cellular data account. How do I retrieve it?

    I called AT&T at 1-800-331-0500 and asked for iPad support. They are going to send me a new SIM card. This will solve the problem. Thanks for your help.

  • Change Time Constraint for Personal Data InfoType

    Hi,
    How do I change the Time Constraint for the Personal Data infotype?
    I tried to do it in Customisation Procedure --> Infotypes, but the option to change the Time Constraint is disabled for Infotype 0002.
    Thanks

    Hi,
    You can change time constraints in the general attributes of an infotype; this can be done through table maintenance view V_T582A.
    But note that you cannot change the time constraints for the mandatory infotypes of personnel in an organization, i.e. 0000, 0001, and 0002.
    For example, without personal details such as a name, no personnel record can exist in an organization.
    Regards,
    Soori

  • An unexpected exception occurred while attempting to locate the run-time information for this Web Service. Error: java.lang.reflect.InvocationTargetException:null

    Hi, I am getting the exception below when I run the test browser from Workshop. Please help me.
    An unexpected exception occurred while attempting to locate the run-time information for this Web Service. Error: java.lang.reflect.InvocationTargetException:null

    Thamarai,
    Can you provide more information on your JWS? Also, can you start the server
    from the command line with the verbose option? This will cause
    weblogic_debug.log to be generated in the domain folder.
    Raj Alagumalai
    Backline Workshop Support
    "Thamarai Selvan" <[email protected]> wrote in message
    news:[email protected]..
    Hi, I am getting the exception below when I run the test browser from Workshop. Please help me.
    >
    An unexpected exception occurred while attempting to locate the run-time information for this Web Service. Error:
    java.lang.reflect.InvocationTargetException:null

  • Time information for PO based Invoice

    Hi Gurus,
    In which table is the time information for a PO-based invoice stored, and how can we find out the invoice creation time for a particular PO (with a particular line item, e.g. line item 10, 20, 30)?
    Thanks & Regards,
    VK

    Please check the invoice header table, which will give you the info on who created the invoice, when it was created, etc. It may be RBKP; please check it in SE11, or look at table BSEG, etc.

  • Can i buy a smaller macbook air (64gb) but use the time capsule for additional data ?

    can i buy a smaller macbook air (64gb) but use the time capsule for additional data ?

    If you plan to store important "original" or "master" files and documents on the Time Capsule, then you might want to think about a way to back up those files to another hard drive.
    Why? If the Time Capsule has a problem...and you have no backups....you lose everything.

  • How do I create a delay or time-lag for my data in LABVIEW

    In my data acquisition system I am using acquired data to create a digital output. I want to delay this output, or create a sort of time lag for it. Is there an easy way to incorporate this?

    What you might consider doing is using a sequence. Run the data through this sequence and, inside of the sequence, put a delay equal to the time you wish to delay the signal. Then the output will not be available until the delay has completed.
    If you need continuous streaming data, this might not work too well. In that case you will need some sort of a data buffer; a queue might be one possible solution. I have not used it, but I think that queues can have a time stamp, and you could possibly artificially alter the time stamp.
    Hope this helps,
    Jason
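The buffer approach Jason suggests can be sketched outside LabVIEW as well. Here is a hedged Python stand-in: a fixed-length FIFO that delays each incoming sample by a set number of sample periods (the delay length and input values below are purely illustrative):

```python
from collections import deque

def make_delay(n_samples, fill=0.0):
    """Return a function that delays its input stream by n_samples."""
    buf = deque([fill] * n_samples, maxlen=n_samples)
    def step(x):
        oldest = buf[0]   # the value that entered n_samples ago
        buf.append(x)     # maxlen discards the oldest automatically
        return oldest
    return step

delay = make_delay(3)
out = [delay(x) for x in [1, 2, 3, 4, 5]]
# → [0.0, 0.0, 0.0, 1, 2]: three fill values, then the input replayed
```

In a continuous acquisition loop the same structure applies: enqueue each new sample, dequeue the oldest, and the output lags the input by exactly the buffer length.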

  • Changing time-out for scheduled data refresh

    Using a Power Query connection, is it possible to extend the time-out for scheduled data refreshes? The amount of data to be retrieved is rather limited, but there are thousands of rows (NAV server).
    If not, any suggestions on how to reduce latency?
    Thanks.

    Thorm,
    Is this still an issue?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager (Blog,
    Small Basic,
    Wiki Ninjas,
    Wiki)
    Answer an interesting question?
    Create a wiki article about it!

  • SP2013 alternative for Collect Data workflow task?

    SP2010 offered a workflow action named Collect Data, allowing for easy data collection from users without having to create custom ASPX forms. Apparently, SP2013 no longer offers this feature.
    I'm doing an SP2013 project now that absolutely prohibits me from doing any custom development, so I'm trying to find an easy way to collect data from users in SP2013 without custom coding. Are there alternatives?

    Hi,
    I understand that there is a big list of workflow actions that are not available in SharePoint 2013. However, SharePoint 2013 still allows users to create workflows on the SharePoint 2010 platform. If you need these legacy actions, make sure you select the
    "Platform Type" as SharePoint 2010 when you create the workflow. You cannot change the platform type once the workflow is created.
    Source link:
    What's new in workflow in SharePoint Server 2013
    http://technet.microsoft.com/en-us/library/jj219638.aspx
    Tracy Cai
    TechNet Community Support

  • Use Time Machine for transferring data to new drive

    2008 pre-unibody 17" 250GB MacBook Pro4,1 - 2GB RAM, 10.6.8, 2.5ghz
    How can I use Time Machine to transfer data to hybrid 1TB Seagate SSD?

    Hi V.K., I was just reading your instructions for jumper25 about transferring all info to a new HD and had a question. I am in the same situation, where I have bought a new 320GB drive to replace the OEM 120GB in my MBP. Everything is very straightforward from what you have posted, but I only have the original Tiger 10.4.9 install DVD and I am running Leopard 10.5.5. Will I be able to use my Tiger install disk and then use my TM backup even though it is Leopard? Or will I need a Leopard install disk to complete the process? Also, is there another way around this problem if I cannot use the Tiger disk, such as mirroring the 120GB disk onto my 320GB?
    Thanks in advance for your time and expertise.
    P.

  • To get Run time Statistics for a Data target

    Hello All,
    I need to collect one month of data (i.e. the start time and end time of the cube) for documentation work. Could someone help me find the easiest way to get the above-mentioned data in a BW production system?
    Please guide me to the query name to get the runtime statistics for the cube.
    Thanks in advance,
    Anjali

    It will fetch the data if the BI statistics are turned on for that cube.
    Please see these links:
    http://help.sap.com/saphelp_nw04s/helpdata/en/8c/131e3b9f10b904e10000000a114084/content.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/15c54048035a39e10000000a422035/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm

  • Restrict Time Entry for Future Date in Portal

    Hi Gurus,
    My scenario is to restrict users from entering future-date attendance in the Portal.
    I know the configuration for restricting the user from scrolling to a future week.
    But in my scenario users will be able to scroll to future weeks, and should instead be stopped with an error message when trying to enter future time.
    Please let me know if you have any suggestions.
    Thanks.
    Regards
    Sairam Maharaj S
    Please search in the ESS forum as I remember responding to a similar scenario some time back. Thread is moved.
    ~Suresh

    Hi Srini,
    Thanks for your reply.
    I have achieved this by deselecting the Release Future Time option. But the user wants future time to be blocked even from being saved. Can I change the default information message "1 out of 1 future time not released" to an error message, so that the user will not be able to save? If so, please let me know whether this can be achieved through message configuration or whether it must be done on the programming side.
    Thanks.
    Regards
    Sairam Maharaj S

  • Time needed for automated data transfer - hypothetic

    An EBS client with an interest in maintaining integration between their large customer-base back office and CRMOD has had past challenges with the overall size of the data they wish to integrate and the time it takes to do so. As such, the client wishes to understand, from a data size (X GB) and time perspective, how long it might take to, for instance, synchronize their data. Since they would not quantify their data size, I can only ask: how much data can be moved, and how fast? This is urgent. Any experienced help here using strategies such as batch integration leveraging web services, Cast Iron systems, or PIP would be appreciated!
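Without the client's actual figures, only the back-of-envelope arithmetic can be offered: wall-clock transfer time is roughly payload size divided by effective throughput. A sketch with placeholder numbers (the sizes, link speed, and efficiency factor below are assumptions for illustration, not measurements of any real system):

```python
def transfer_time_hours(size_gb, link_mbps, efficiency=0.7):
    """Rough wall-clock estimate: payload bits / effective throughput.

    efficiency folds in protocol overhead, latency stalls, and
    contention; 0.7 is just an illustrative guess.
    """
    bits = size_gb * 8e9                       # decimal GB -> bits
    effective_bps = link_mbps * 1e6 * efficiency
    return bits / effective_bps / 3600.0

# e.g. a hypothetical 50 GB sync over a 100 Mbps link at 70% efficiency
hours = transfer_time_hours(50, 100)           # roughly 1.6 hours
```

The efficiency factor is the whole argument for the parallelization and WAN-acceleration strategies discussed below: they raise effective throughput toward (or, with compression and caching, beyond) the nominal line rate.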

    "And the solution was using an FTP client which was able to send files using multiple parallel threads."
    Yep, although that assumes you have multiple files (for which it can work very well).
    For a single large file, a variation of what Milan described is to use a utility that can slice and dice (and reassemble) a single file across multiple TCP flows.
    "Another possibility would be using some WAN accelerators (like Riverbeds, e.g.) which would send "local ACKs" to the clients on both sides and improve the efficiency of the WAN file transfer."
    Yep, again, and such products can even "transfer" faster than the line rate. The latter is possible because they generally compress in-line and also use local caching (negating the need to send some data). One of their issues: their transfer rates can be highly variable and inconsistent.
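The "slice and dice a single file across multiple TCP flows" idea reduces to computing byte ranges, fetching them concurrently, and writing each chunk back at its offset. A minimal sketch of just the range computation (the parallel fetch itself would depend on the transport and is omitted here):

```python
def byte_ranges(file_size, n_flows):
    """Split [0, file_size) into n_flows contiguous (start, end) ranges."""
    base, extra = divmod(file_size, n_flows)
    ranges, start = [], 0
    for i in range(n_flows):
        length = base + (1 if i < extra else 0)  # spread the remainder evenly
        ranges.append((start, start + length))
        start += length
    return ranges

# e.g. a 10-byte file over 3 flows
# byte_ranges(10, 3) → [(0, 4), (4, 7), (7, 10)]
```

Each flow then transfers its own range independently; since the ranges tile the file exactly, reassembly is just a seek-and-write per chunk on the receiving side.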
