Data in analysis mode is oscillating, but not in acquisition mode

Hi,
I'm having data acquisition problems using BioBench 1.2.  I'm using pressure transducers hooked up to a PCI-MIO-16XE-50 DAQ board.  When logging data, the pressure shown on the BioBench screen during acquisition matches the numbers on the transducer amplifiers, but when I go to the analysis screen, the numbers oscillate from 5 to 50, even when the amplifiers read 15.  None of the data matches.  I'm not sure whether this is a software or hardware problem.  Any ideas?

Do you have Measurement and Automation Explorer on your machine?  Are you using Traditional NI-DAQ device drivers or NI-DAQmx drivers?  A good way to isolate the device hardware from the software is to acquire data directly in Measurement and Automation Explorer and see whether the data looks good there.  If it does, the hardware is in good shape and this is a software issue.
Please follow this link to see how to use test panels in Measurement and Automation Explorer to verify that your device hardware is working properly:
http://zone.ni.com/devzone/cda/tut/p/id/4638
My other question is: when was this device last calibrated?  It has a one-year calibration cycle, so getting it calibrated could be useful. If you are interested in learning more about calibration, please visit www.ni.com/calibration
Regards,
CharlesD

Similar Messages

  • Data source was activated and replicated but not showing up in RSA7.

    Hello,
    Data source was activated and replicated but not showing up in RSA7.  At what point does the data source appear in the Delta Queue?
    Thanks

    Hi,
    for LO, LIS, generic, and FI DataSources, delta records come from the delta queue.
    If you run the init in BW, whether it succeeds or not, the delta queue will be maintained in RSA7, and you can check the records in RSA7 or SMQ2.
    When the init request goes to R/3, it maintains the delta queue in RSA7.
    Thanks,
    Pavan.

  • HT1937 my cellular data connection works in one location but not in another

    Why does my iPhone's data connection work at other places but not at work?
    It works sometimes.

    Good question to ask your cell phone provider. Maybe the material the building is made of is blocking the cellular data signal.

  • Can you check for data in one table or another but not both in one query?

    I have a situation where I need to link two tables together, but the data may be in another (archive) table, or different records may be in both, and I want the latest record from either table:
    ACCOUNT
    AccountID     Name   
    123               John Doe
    124               Jane Donaldson           
    125               Harold Douglas    
    MARKETER_ACCOUNT
    Key     AccountID     Marketer    StartDate     EndDate
    1001     123               10526          8/3/2008     9/27/2009
    1017     123               10987          9/28/2009     12/31/4712    (high date ~ which means currently with this marketer)
    1023     124               10541          12/03/2010     12/31/4712
    ARCHIVE
    Key     AccountID     Marketer    StartDate     EndDate
    1015     124               10526          8/3/2008     12/02/2010
    1033     125               10987         01/01/2011     01/31/2012  
    So my query needs to return the following:
    123     John Doe                        10526     8/3/2008     9/27/2009
    124     Jane Donaldson             10541     12/03/2010     12/31/4712     (this is the later of the two records for this account between archive and marketer_account tables)
    125     Harold Douglas               10987          01/01/2011     01/31/2012     (he is only in archive, so get this record)
    I'm unsure how to proceed in one query.  Note that I am reading in possibly multiple accounts at a time and returning a collection back to .net
    open CURSOR_ACCT
              select AccountID
              from
                     ACCOUNT A,
                     MARKETER_ACCOUNT M,
                     ARCHIVE R
               where A.AccountID = nvl((select max(M.EndDate) from Marketer_account M2
                                                    where M2.AccountID = A.AccountID),
                                                      (select max(R.EndDate) from Archive R2
                                                    where R2.AccountID = A.AccountID)
                   and upper(A.Name) like parameter || '%'
    <can you do a NVL like this?   probably not...   I want to be able to get the MAX record for that account off the MarketerACcount table OR the max record for that account off the Archive table, but not both>
    (parameter could be "DO", so I return all names starting with DO...)

    If I understand your description correctly, I would assume that for John Doe we would expect the second row from marketer_account ("high date ~ which means currently with this marketer"). Here is a solution with analytic functions:
    drop table account;
    drop table marketer_account;
    drop table marketer_account_archive;
    create table account (
        id number
      , name varchar2(20)
    );
    insert into account values (123, 'John Doe');
    insert into account values (124, 'Jane Donaldson');
    insert into account values (125, 'Harold Douglas');
    create table marketer_account (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account values (1001, 123, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('27.09.2009', 'dd.mm.yyyy'));
    insert into marketer_account values (1017, 123, 10987, to_date('28.09.2009', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    insert into marketer_account values (1023, 124, 10541, to_date('03.12.2010', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    create table marketer_account_archive (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account_archive values (1015, 124, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('02.12.2010', 'dd.mm.yyyy'));
    insert into marketer_account_archive values (1033, 125, 10987, to_date('01.01.2011', 'dd.mm.yyyy'), to_date('31.01.2012', 'dd.mm.yyyy'));
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account_archive;
    with
    basedata as (
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account_archive
    ),
    basedata_with_max_intervals as (
    select key, AccountId, MktKey, FromDt, ToDate
         , row_number() over(partition by AccountId order by FromDt desc) FromDt_Rank
      from basedata
    ),
    filtered_basedata as (
    select key, AccountId, MktKey, FromDt, ToDate from basedata_with_max_intervals where FromDt_Rank = 1
    )
    select a.id
         , a.name
         , b.MktKey
         , b.FromDt
         , b.ToDate
      from account a
      join filtered_basedata b
        on (a.id = b.AccountId);
    ID NAME                     MKTKEY FROMDT     TODATE
    123 John Doe                  10987 28.09.2009 31.12.4712
    124 Jane Donaldson            10541 03.12.2010 31.12.4712
    125 Harold Douglas            10987 01.01.2011 31.01.2012
    If your tables are big it could be necessary to do the filtering (according to your condition) in an early step (the first CTE).
    Regards
    Martin
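    The same keep-the-latest-interval logic can also be prototyped outside the database. Below is a minimal Python sketch (not part of Martin's answer; the rows are hard-coded from the sample data above) that unions both tables and keeps the row with the latest start date per account:

    ```python
    from datetime import date

    # (key, account_id, marketer, start, end) rows from both tables
    marketer_account = [
        (1001, 123, 10526, date(2008, 8, 3), date(2009, 9, 27)),
        (1017, 123, 10987, date(2009, 9, 28), date(4712, 12, 31)),
        (1023, 124, 10541, date(2010, 12, 3), date(4712, 12, 31)),
    ]
    archive = [
        (1015, 124, 10526, date(2008, 8, 3), date(2010, 12, 2)),
        (1033, 125, 10987, date(2011, 1, 1), date(2012, 1, 31)),
    ]

    def latest_per_account(*tables):
        """Union all tables and keep the row with the latest start date per account."""
        best = {}
        for table in tables:
            for row in table:
                account_id, start = row[1], row[3]
                if account_id not in best or start > best[account_id][3]:
                    best[account_id] = row
        return best

    result = latest_per_account(marketer_account, archive)
    # account 123 -> marketer 10987, 124 -> 10541, 125 -> 10987 (archive only)
    ```

    This mirrors what the `row_number() ... order by FromDt desc` ranking plus the `FromDt_Rank = 1` filter does in the SQL version.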

  • Data exists in DSO New table but not in Active table

    Guys,
    I am loading data from one DSO to another DSO. In my target DSO I have 6 fields, 2 of which are mapped in the transformation, while the remaining 4 are populated by start and end routines. Before activation I can see all 6 fields populated in the new data table, but after activation only the 2 mapped fields appear in the active data table, not those populated by the start and end routines.
    I have only one key in my DSO. Am I missing something here? Please let me know.
    Thanks,
    KK

    Hi,
    beside the end routine there is an icon. Click on it and you will see two options.
    Select the one that reads something like "populate values into fields without active transformation rules" (I don't remember the exact wording, but it means the same).
    Then activate the transformation.
    This should solve your problem.
    Regards,
    Raghu
    Edited by: Raghu tej harish reddy on May 2, 2009 11:10 PM

  • Data saved from PDF form printable but not readable?

    I created a Reader extended PDF to send out to clients to collect data.
    I must have done something wrong, as when the forms come back they are populated in OS X Preview but not when the PDF is opened in Acrobat.
    When the form is printed, all data displays correctly.
    Is this something to do with the way the form fields are formatted?
    any help appreciated!!! Thank You
    Running Mac, Acrobat X Pro

    For some reason, when I click on the field in Acrobat the data shows?!
    It's actually not printing either, unless I print from Mac Preview, and then it destroys the line height.
    Any ideas?

  • Data plan used if MiFi connected but not actively being used?

    If my computer is on overnight but not in use (meaning the MiFi is still connected), will that use up my data plan? Or is data only consumed by "active" use? I'm sharing the internet with other people, so I'm not sure whether my data plan is being used on purpose or just getting eaten up while we are all asleep. I'm already maxed out with 3 weeks to go!

    As long as you have devices connected to the MiFi there is always a risk that those devices will consume data behind the scenes without your knowledge.  This is how network connected devices work, they are constantly "in use" while connected.  Computers, smart phones and any kind of internet connected devices are constantly backing up content, looking for updates and talking to other network devices.  It has nothing to do with your active participation and all internet communication is held against you on your data plan.
    The only way to completely stop data usage is to turn off your MiFi when it is not needed.  This is a common practice that you should adopt on a metered data plan.  MiFis are not intended to be left on 24x7 anyway; frequent reboots help ensure the device stays in proper working order and extend its life.
    Verizon does provide everyone with a data log.  The data log will provide you with time stamps and the amounts that are consumed at those times.  It should be easy to see when your biggest spikes of consumption are.  Isolate what devices are online during those times and what may have been going on and you can start to build a picture of where the problem lies.
    To truly see the network traffic in terms of specific applications, web services or devices you need to find and install network monitors.  Network monitors come in a variety of applications so you need to find and build a monitoring system that fits your environment.  Considering you have other people connected to your MiFi it would not be uncommon for each of you to install a separate monitor specific to those machines.
    Let me know if you have any other questions.

  • Getting a error ...data found in the main document but not in data source

    Good morning .....
    I am generating letters, calling WordPad from Forms 6i. When it calls the WordPad document, it reports "data found in the main document but not in the data source",
    but when I open that WordPad data source manually, the data is present.
    Please let me know if anyone knows a solution for this.

    Thanks for the reply, Grant.
    Yes, it worked in the simple case. Now I am trying to retrieve multiple records under a single header, so I used loops in the program, and I am getting this error now.
    I am calling .lst files using OLE2.

  • Data Services job rolling back Inserts but not Deletes or Updates

    I have a fairly simple CDC job that I'm trying to put together. My source table has a record type code of "I" for Inserts, "D" for deletes, "UB" for Update Before and "UP" for Update After. I use a Map_CDC_Operation transform to update the destination table based on those codes.
    I am not using the Transaction Control feature (because it just throws an error when I use it)
    My issue is as follows.
    Let's say I have a set of 10,000 Insert records in my source table. Record number 4000 happens to be a duplicate of record number 1. The job will process the records in order starting with record 1 and begin happily inserting records into the destination table. Once it gets to record 4000 however it runs into a duplicate key issue and then my try/catch block catches the error and the dataflow will exit. All records that were inserted prior to the error will be rolled back in the destination.
    But the same is not true for updates or deletes. If I have 10000 deletes and 1 insert in the middle that happens to be an insert of a duplicate key, any deletes processed before the insert will not be rolled back. This is also the case for updates.
    And again, I am not using Transaction Control, so I'm not sure why the Inserts are being rolled back but, more curiously, the Updates and Deletes are not. I'm not sure why there isn't a consistent result regardless of the type of operation. Does anyone know what's going on here, or what I'm doing wrong / what my misconception may be?
    Environment information: both source and destination are SQL Server 2008 databases and the Data Services version we use is 14.1.1.460.
    If you require more information, please let me know.

    Hi Michael,
    Thanks for your reply. Here are all the options on my source table:
    My Rows per commit on the table is 10,000.
    Delete data table before loading is not checked.
    Column comparison - Compare by name
    Number of loaders - 1
    Use overflow file - No
    Use input keys - Yes
    Update key columns - No
    Auto correct load - No
    Include in transaction - No
    The rest were set to Not Applicable.
    How can I see the size of the commits for each opcode? If they are in fact different from my Rows per commit (10,000) that may solve my issue.
    I'm new to Data Services so I'm not sure how I would implement my own transaction control logic using a control column and script. Is there a guide somewhere I can follow?
    I can also try using the Auto correct load feature.  I'm guessing "upsert" was a typo for insert? Where is that option?
    Thank you very much!
    Riley
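    The all-or-nothing behavior Riley is after -- every opcode in the batch rolled back when one row fails -- is what a single wrapping transaction provides. Here is a minimal sketch of that pattern using Python's sqlite3 as a stand-in database (the table and rows are invented for illustration, not taken from the job above):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("create table dest (id integer primary key, val text)")
    conn.execute("insert into dest values (1, 'old'), (2, 'old')")
    conn.commit()

    # CDC-style batch: a delete, then an insert that violates the primary key
    batch = [("D", 2, None), ("I", 1, "dup")]

    try:
        with conn:  # one transaction for the whole batch: commit or roll back together
            for op, rowid, val in batch:
                if op == "D":
                    conn.execute("delete from dest where id = ?", (rowid,))
                elif op == "I":
                    conn.execute("insert into dest values (?, ?)", (rowid, val))
    except sqlite3.IntegrityError:
        pass  # duplicate key: the whole batch is rolled back, including the delete

    rows = conn.execute("select id from dest order by id").fetchall()
    # both original rows survive: the delete of id 2 was rolled back too
    ```

    If each opcode is committed in separate batches (as per-opcode loaders with their own commit sizes would do), the deletes committed before the failing insert cannot be rolled back, which matches the symptom described.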

  • Data In R/3 is present but not in BW

    Hi experts, I have a problem with my last extraction: I have 5 fields in R/3 which have data, but in BW one field is not getting data while the remaining 4 show data. What could be the problem, and how do I solve it? Please let me know.
    Thanks in advance. Bye,
    jay

    Hi,
    This info may be helpful.
    The first step should be to check whether there were any filters when you ran the report (if you viewed the data through a report).
    Then check the contents at the InfoProvider/table level in BW and see if the data is there. If data is in the cube but not showing in the report, it may be an issue with the report.
    Check the data availability in the first-level target.
    As the data is found in R/3 and not in the cube, take a top-down or bottom-up approach and follow the data flow path to see to what level data is populated, checking for any filters in transformations, etc.
    Check the data flow from source to target.
    Check whether any update or transfer routine is involved in the data flow.
    Check the contents of all the targets in the data flow for this field.
    Refer.
    R/3 - BW reconciliation
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/968dab90-0201-0010-c093-9d2a326969f1
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7a5ee147-0501-0010-0a9d-f7abcba36b14
    Reconciliation
    Please check this link for data reconciliation:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/7a5ee147-0501-0010-0a9d-f7abcba36b14
    Re: Reconciliation of data
    Strategies and methods to reconcile SAP BW data to source systems:
    http://bluestonestep.com/component/option,com_docman/task,doc_download/gid,7
    thanks,
    JituK

  • Date format accepted in new table but not in active table for DSO, BI 7.0

    Hi All,
    Need help from the experts. I was trying to load flat file data in CSV (comma separated) with cell type General for the dates (yyyymmdd). Everything was fine; I have a routine for new fields in the DSO for Fiscal Week and Fiscal Quarter, which are calculated from the flat file date, i.e. the delivery date (yyyymmdd).
    In the new table of the DSO, i.e. before activation, everything was fine: all the logic and mappings for Fiscal Week and Fiscal Quarter worked and the values were visible. But after activation, Fiscal Week and Fiscal Quarter show no data; they come up blank or as '000000' for all records.
    Fiscal Week and Fiscal Quarter are maintained as customized InfoObjects with data type NUM.
    Can you tell me what I am missing? I have also tried different cell types in the CSV file.
    Thanks and Regards,
    Taps
    Edited by: Taps on Oct 1, 2010 1:21 PM
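    For reference, the kind of derivation such a routine performs can be sketched in a few lines -- here in Python, assuming a fiscal calendar that coincides with the calendar year (real fiscal year variants are site-specific, so treat this as illustrative only):

    ```python
    from datetime import datetime

    def fiscal_fields(yyyymmdd: str):
        """Derive fiscal week (YYYYWW) and fiscal quarter (YYYYQ) from a yyyymmdd date.

        Assumes fiscal year == calendar year; adjust for your fiscal year variant.
        """
        d = datetime.strptime(yyyymmdd, "%Y%m%d").date()
        iso_year, iso_week, _ = d.isocalendar()
        quarter = (d.month - 1) // 3 + 1
        return f"{iso_year}{iso_week:02d}", f"{d.year}{quarter}"

    week, quarter = fiscal_fields("20101001")
    # 2010-10-01 falls in ISO week 39 of 2010 and calendar quarter 4
    ```

    If values like these appear in the new table but come out blank after activation, the calculation itself is fine and the problem lies in how the end routine's results are carried into the active table, as the resolution link below describes.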

    Resolved...!!!
    Follow the link ...
    /people/maheshsingh.mony5/blog/2010/09/24/update-behavior-of-end-routine-in-transformations
    Regards,
    Taps
    Edited by: Taps on Oct 1, 2010 2:02 PM

  • Data refreshed when preview in XLF but not refreshed in exported SWF

    Hi,
    I have an XML file linked to Excel, which is the source of an XLF file. All connections are OK and I can see the latest data being refreshed when I preview the XLF file. But when I export the XLF to SWF, error 2048 occurs and the data is not refreshed at all.
    I have added the SWF file to the Global Security Settings panel (Always Allow list) and also checked "Every time I visit the webpage" under Temporary Internet Files and History Settings.
    Anything else I need to do? Can someone help?

    Hi
    Error 2048 is usually caused by one of two things.
    One - the global security settings are not set to allow files from the desired folder to be accessed.  Check that you have added the source folder(s) for your data to the "always trust files in these locations" list in Adobe's online Settings Manager.
    Two - you're trying to access a file on a domain that is different from the one you're running the SWF file on. For this you need a crossdomain policy file added to the root folder of the server you're trying to access.
    So your crossdomain.xml should look like this
    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
    <allow-http-request-headers-from domain="*" headers="*" secure="false" />
    <allow-access-from domain="*" secure="false" />
    </cross-domain-policy>
    For more information, search other posts on this topic or please check out  the Adobe website.
    Andy
    Xcelsius QA
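    A quick way to sanity-check the policy file before deploying it is to parse it and confirm the wildcard grants are present. A minimal sketch (the policy string below is an inline copy for illustration; in practice you would read your server's crossdomain.xml):

    ```python
    import xml.etree.ElementTree as ET

    policy = """<?xml version="1.0"?>
    <cross-domain-policy>
      <allow-http-request-headers-from domain="*" headers="*" secure="false" />
      <allow-access-from domain="*" secure="false" />
    </cross-domain-policy>"""

    root = ET.fromstring(policy)
    # collect which domains the policy grants access to
    granted = [el.get("domain") for el in root.iter("allow-access-from")]
    # "*" means any requesting domain may load data from this server
    ```

    Note that `domain="*"` opens the server's data to every SWF on the web; restrict it to your own domain(s) where possible.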

  • Saved VI file size changes when data is visible on a graph but not the default value

    If a VI is saved when data is visible on a graph, the file size is larger; even though the default is set to be a blank graph.  I have tested this in LabVIEW 8.6 and 2009.  If you load the larger file, the graph is blank as expected.  If this behavior is by design, it appears odd to me.
    To duplicate the issue:
    Create a blank VI.
    Add a Sine Waveform VI and set the number of samples to 1,000,000.
    Add a Waveform Graph VI that spans the entire monitor and connect it to the output of the Sine Waveform VI.
    Save the VI and note the file size.
    Run the VI.
    Save the VI and compare the size to the original size.
    The VI file size is larger.
    In the Waveform Graph, select "Reinitialize to default value."
    The VI file size returns to its original size.

    Your observation is correct, and this is expected behavior.
    This behavior is useful when you have inputs that you would like to set as defaults for the user.
    Obviously, if there is a value to be saved, it requires some memory to store that value.
    If the input is left empty by default, that memory is made available again.
    It may not be easy to think of a good use for this with a graph, but think about numeric or string controls.
    What if the user isn't sure how a certain parameter or value should be entered?
    You could store a default value so they can see how to enter their own.
    Message Edited by Cory K on 09-15-2009 11:12 AM
    Cory K

  • XML data connection works in the preview, but not in the final swf

    Hello,
    I have set up a dashboard with an XML data connection button to load XML data from an internet source. Everything works perfectly in the preview SWF within Xcelsius 4.5, but pressing the button in the exported SWF has no effect. The dashboard is busy for 2 seconds, but no data is loaded. What could be the reason?
    Thanks.

    Yes, the location is registered. The problem also occurs when the SWF is opened directly on the web server where the XML source is running.

  • Data shows on preview in CR2008 but not on report in InfoView

    Hi,
    I'm working on a report and have a strange (to me, anyway) thing happening.  I created a report using a stored procedure as the data source. When I run the stored procedure in SQL Server Manager it shows all the data I'm expecting.  When I create the report in Crystal Reports 2008 and preview the data, I see what I expect to see.  However, when I save the report in the repository and access it from within InfoView, a few fields (numeric only) are not displaying the expected values (values that have shown in SQL Server Manager and CR2008 preview mode) -- in fact they aren't showing anything.  I created a formula field that checks one of the offending fields for a null value, and it returns positive. Not sure where to go from here.
    Any help would be appreciated as I am up against a deadline and this has just reared its ugly head or at least just been noticed.
    Thanks!

    Hi
    There are two things to check:
    1.  Check the driver you are using in BO (if you are connecting using ODBC, try changing the drivers and check).
    2.  Viewer -- in InfoView, go to Preferences and try changing the viewer.  If your report is using the DHTML viewer, try changing it to the Java or Web viewer and check.
    Thanks,
    Sastry

Maybe you are looking for

  • How can I transfer books from kobo ipad app to another ipad

    How can I transfer books from kobo ipad app to another ipad with a kobo ipad app.

  • Backup of iTunes Library to DVD causes iTunes to crash

    I thought backing up my entire iTunes library onto DVD would be simple. iTunes reveals that it will take 3 disks for my library and proceeds to write Disk#1. After requesting (and inserting) a second blank disk, iTunes crashes -- EVERY TIME. Now what

  • Connecting network Printer to laptops via wireless

    My daughter received a new Toshiba laptop for Christmas to help with her school work and I have been working to get it setup. I have nearly everything working fine including a network connection for internet through my network router, but I am having

  • SAP B1 not printing on Dot Matrix Printer

    Hello everyone I'm trying to print from SAP B1 2007B PL:00 to a Dot Matrix Printer connected in network.I selected the printer and gave print.But it is printing one or two junk characters only.When i try the same in a laser printer it prints well. Wh

  • Query on Encryption and Decryption in FTP server

    Hi Experts, We are currently using SFTP receiver adapter in one of our interface. Our query is, is it possible to do encryption and decryption of file in the sender and receiver FTP channel. Your inputs are welcome. Thanks in Advance. Regards Suganya