Delta fetching zero records

Hi all,
The delta load for a master data InfoObject runs daily at 5:00 PM.
On 30th June the load failed due to an invalid character.
As there was no one to monitor the load on Saturday, it remained in that state.
On 1st July the delta ran again, fetched the same number of records, and failed with the same error.
Yesterday (2nd July), when we came in, we saw the error and corrected the load that had run on 1st July.
The delta load then ran again at 5:00 PM yesterday, but we had to kill the job for another reason. After killing it I manually set the request status to red (it had already turned red).
But today's load fetched 0 records with the message "No updates since the last delta".
The delta update in the monitor shows "Repeat of last delta".
So, have I lost any records in this process? How do I know whether there were any records or not?
<b>Observation:</b>
In R/3, the last delta entry in table BWOM2_TIMEST is from the 30th.
Any useful help will be rewarded
Regards
Dhanya.

If I correct the 30th load and request the delta again, will it bring all the records from the 30th?
Yes... it should bring the data from the 30th.
Yesterday I corrected the 1st July load, and when the 5:00 PM load ran I had to kill the job. But today's load fetched 0 records.
Is the 1st July load set to green and loaded in BI?
Nagesh Ganisetti.
Assign points if it helps.
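How "Repeat of last delta" behaves can be sketched with a toy model (Python, purely illustrative; this is not SAP's implementation): the delta queue retains the last sent package until the next delta confirms it, so a request repeated after a red/failed request re-sends the same records rather than losing them.

```python
# Toy model of a BW delta queue with "repeat of last delta" semantics.
# Illustration only; names and logic are simplified assumptions.

class DeltaQueue:
    def __init__(self):
        self.pending = []      # changes not yet sent
        self.last_sent = []    # retained until the next delta confirms it

    def write(self, *records):
        self.pending.extend(records)

    def delta(self):
        """A successful delta confirms the previous package and sends new changes."""
        self.last_sent = self.pending
        self.pending = []
        return self.last_sent

    def repeat(self):
        """After a failed/red request, the same package is offered again."""
        return self.last_sent

q = DeltaQueue()
q.write("rec1", "rec2")
first = q.delta()          # this request fails and is set to red
again = q.repeat()         # "Repeat of last delta" re-sends the same records
assert first == again == ["rec1", "rec2"]
nothing = q.delta()        # no new changes since then -> a 0-record delta
assert nothing == []
```

In this model a 0-record repeat only means nothing new was captured since the last confirmed package; the red request's records are re-sent, not lost.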

Similar Messages

  • Delta Loading -Zero Records Only-- Why?- Please Answer

    Hello Gurus,
    I have created a generic DataSource on a custom table, with a counter field as the delta field. The table is populated by running a program.
    I then replicated the DataSource into BI, created the initialisation, and pulled the data to the PSA, then from there to a DSO and to a cube.
    Then I ran the program for another date and populated the table in R/3 with 200 new records. I created another InfoPackage in BI with update mode Delta and scheduled it immediately. <b>But it pulled 0 records</b>. <b>Why?</b>
    Do we need to delete the initialisation load from the PSA before loading the delta to the PSA?
    While creating the generic delta field (counter), I put the lower limit as 10 and checked the 'additive delta' option.

    Pallavi,
    The program is populating the table. The initial count was 15788; after running the program for a different date the count becomes 15801.
    During initialisation I pulled 15788 records into the PSA, then to the DSO and to the cube. Then I ran the program and brought the count to 15801. RSA3 shows 15801 records. Then I created an InfoPackage for the delta update and scheduled it immediately, but it pulls 0 records.
    As you suggested I checked RSA7. The Total field shows 0, but clicking the status field shows a count of 15801. I went into the delta update and repetition; it shows zero entries.
    I had given a lower limit of 10, as the counter field is numeric.
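For reference, a counter-based generic delta with a lower-limit safety interval behaves roughly like this toy sketch (Python, illustrative only; the selection rule and names are assumptions, not the actual SAPI logic):

```python
# Sketch of generic-delta selection on a numeric counter field with a
# lower-limit safety interval of 10, as in the post above.

def delta_select(rows, last_pointer, lower_limit=10):
    """Select rows with counter > (last_pointer - lower_limit).
    The safety interval re-reads the last `lower_limit` counter values,
    which is why such a DataSource needs additive/overwrite handling."""
    low = last_pointer - lower_limit
    selected = [r for r in rows if r["counter"] > low]
    new_pointer = max((r["counter"] for r in rows), default=last_pointer)
    return selected, new_pointer

rows = [{"counter": c} for c in range(1, 15802)]   # 15801 rows, like the post
# Init pulled up to counter 15788; a working delta should then see the
# 13 new rows plus the 10-record safety interval:
delta, ptr = delta_select(rows, last_pointer=15788)
assert len(delta) == (15801 - 15788) + 10   # 23 rows
assert ptr == 15801
```

If such a delta returns 0 records while RSA3 shows data, the pointer is usually not being advanced or read as expected, which is what the RSA7 check in this thread is probing.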

  • Zero records in delta update

    Dear All,
    In the HR Time Management (Time and Labour) cube 0PT_C01 there are two DataSources, one of which is 0HR_PT_1. During the delta load it does not get any records and shows zero records at cube level, while RSA3 in R/3 shows new records. Please suggest how to solve this issue.
    Regards
    Albaik

    Hi,
    This is because you haven't run the init load yet.
    We have to run an init load before running the delta (a one-time process).
    Now, if you select init with data transfer, you will get all the records transferred.
    From then on there is no need to run the init again and you can proceed with deltas.
    Cheers,
    Srinath.

  • Zero records in Delta Queue for Non-LO Datasource

    Hi,
    I have a process chain which loads data daily; it last loaded on the 5th of this month with a delta load to a DSO. I then triggered the process chain on the 10th; the chain finished successfully but the delta returned zero records. In the delta queue monitor the DataSource shows 0. What could be the reason for this? It is a generic DataSource built on a view, not an LO DataSource, and the delta is on a timestamp.
    Thanks,
    Karan.

    Hi Lokesh,
    The Repair Full Request option won't disturb the existing deltas.
    A repair full request is a full update whose request is flagged as a repair request.
    Via the Scheduler menu we can flag a request with full-update mode as a Repair Full Request. Such a request can be updated into every data target, even if the data target already contains data from an initialization run or delta for this DataSource/source system combination and has overlapping selection criteria.
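A timestamp-based generic delta returns zero records whenever no row in the view carries a timestamp newer than the stored delta pointer. A toy model (Python; the pointer handling and safety interval are simplified assumptions, not the actual SAPI code):

```python
# Toy timestamp-based delta pointer, illustrating a legitimate 0-record delta.
from datetime import datetime, timedelta

def timestamp_delta(rows, pointer, safety=timedelta(minutes=5)):
    """Select rows changed after the pointer, holding back the newest
    `safety` interval; the pointer advances to now - safety."""
    now = datetime(2011, 1, 10, 12, 0, 0)        # fixed "now" for the example
    upper = now - safety
    selected = [r for r in rows if pointer < r["ts"] <= upper]
    return selected, upper

rows = [{"ts": datetime(2011, 1, 5, 9, 0)}]      # last change on the 5th
pointer = datetime(2011, 1, 5, 10, 0)            # pointer already past it
delta, pointer = timestamp_delta(rows, pointer)
assert delta == []   # nothing newer than the pointer -> a 0-record delta
```

So before resorting to a repair full request, it is worth checking whether the view's timestamp column really changed for the rows you expect since the last pointer value.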

  • 0fi_ar_4 INIT returning zero records

    Hi gurus,
    0FI_AR_4 INIT is returning zero records while loading into the ODS. I have checked in RSA3: it returns zero records with the INIT selection, but it does fetch records with a full update. Can anyone tell me how to solve this problem?
    rgds,
    ***Points Assured**

    Hi Suravi,
    Check the InfoPackage selection: it is probably set to "Initialize without data transfer".
    Change it to "Initialize with data transfer".
    Hope it helps.

  • 0FI_AR_4 extractor bringing zero records

    Hi,
    We are using extractor 0FI_AR_4 in delta mode. At times it brings zero records, but the next run brings the data along with the data missed the previous day.
    For example:
    Monday it brought records up to the previous week.
    Tuesday it brought zero records.
    Wednesday it brought more records, i.e. including the records created on Monday and Tuesday.
    We could not figure out in which situation this can happen, but our observation is that there is no entry for Tuesday in table BWOM2_TIMEST.
    BWOM_SETTINGS
    BWFILOWLIM     19910101
    BWFINSAF         3600
    BWFISAFETY     1
    BWFITIMBOR      020000
    DELTIMEST         60
    OBJCURTYPE    10
    Regards
    Vijay

    Check http://help.sap.com/erp2005_ehp_04/helpdata/EN/af/16533bbb15b762e10000000a114084/content.htm
    It states:
    In delta mode, data requests with InfoSource 0FI_AR_4 and InfoSource 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the data in BW for Accounts Receivable and Accounts Payable Accounting is exactly as up to date as the data for General Ledger Accounting.
    You can check this; the link gives details about the delta methods for the FI extractors.
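As a rough illustration of how the BWOM_SETTINGS safety parameters listed above shift the extraction window (a simplified sketch; the real interplay with BWOM2_TIMEST is more involved): BWFISAFETY holds back the most recent day(s), and BWFITIMBOR pushes the limit one day further back when the run starts before the time border.

```python
# Simplified sketch of the FI delta upper time limit derived from
# BWOM_SETTINGS (BWFISAFETY = 1 day, BWFITIMBOR = 02:00:00, as in the post).
# Illustration only, not the actual extractor logic.
from datetime import date, time, timedelta

def fi_upper_limit(run_date, run_time, bwfisafety=1, bwfitimbor=time(2, 0)):
    """Records are extracted up to the end of run_date - BWFISAFETY days,
    but only if the run starts after the BWFITIMBOR time border;
    otherwise the limit moves one day further back."""
    limit = run_date - timedelta(days=bwfisafety)
    if run_time < bwfitimbor:
        limit -= timedelta(days=1)
    return limit

# A run on Tuesday at 05:00 picks up records through the end of Monday:
assert fi_upper_limit(date(2011, 1, 11), time(5, 0)) == date(2011, 1, 10)
# A run at 01:00 (before the 02:00:00 border) stops one day earlier:
assert fi_upper_limit(date(2011, 1, 11), time(1, 0)) == date(2011, 1, 9)
```

Combined with the GL-coupling rule quoted above, a day with no new GL extraction or a run inside the safety window can legitimately yield a zero-record AR delta that catches up on the next run.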

  • Zero Record Data Load Problem

    Hi,
    Please give your suggestion for following problem.
    We are loading data from ETL (flat files via DataStage) into SAP BW 3.1.
    The data may contain zero records. When we push such data into BW, the ETL side shows a successful transfer, but the BW side stays in "Processing" state (yellow light) and all BW resources hang.
    When we then try to send another data load from the ETL side, we cannot push the data because the BW resources are still held by the previous process.
    Whenever we get this kind of problem, we kill the process and continue with a data reload. But this is not a permanent solution, and it is happening more and more often.
    What is the solution for this problem?
    One of my colleagues made the following suggestion. Should I consider it?
    Summary: when loading empty files, data may remain in the processing state in BW.
    Details: When a user loads an empty file (it must be truly empty, with no line returns; check the data file in binary mode), the data is loaded into BW with 0 records. BW shows the yellow (processing) state with 0 records, and in the PSA one data packet appears with nothing inside. Depending on how the system is configured, the BW server can either accept the 0-record packet or deny it. When the server is configured to accept it, the load request changes to the green (finished) state; when it is configured to deny it, the request stays yellow.
    Please give me your suggestions.
    Thanks in advance.
    Regards,
    VPR

    hi VPR,
    have you tried setting the traffic-light "judgement"?
    Go to the monitor of one request, menu Settings -> Evaluation of requests (traffic light). In the next screen, "Evaluation of requests", for "If no data is available in the system, the request" choose the option "is judged to be successful" (green).
    This sets a delta load to complete even when there is no delta data.
    Hope this helps.

  • Zero records in generic extractor

    Dear all ,
    I have created a generic extractor with a function module, but zero records are being extracted. I am able to extract records if I execute the function module on its own.
    Below is the code:
    FUNCTION ZGET_CUST_SALP .
    ""Local Interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SRSC_S_IF_SIMPLE-REQUNR
    *"     VALUE(I_DSOURCE) TYPE  SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
    *"     VALUE(I_READ_ONLY) TYPE  SRSC_S_IF_SIMPLE-READONLY OPTIONAL
    *"     VALUE(I_REMOTE_CALL) TYPE  SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
    *"  TABLES
    *"      I_T_SELECT TYPE  SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZKN_VP OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    * Example: DataSource for table ZKN_VP
      TABLES: ZKN_VP.
    * Auxiliary selection criteria structure
      DATA: L_S_SELECT TYPE SRSC_S_SELECT.
      data : begin of t_tab1 occurs 0,
               kunnr like kna1-kunnr,
               land1 like kna1-land1,
               PERNR like knvp-PERNR,
               end of t_tab1.
      data : begin of t_knvv occurs 0,
             kunnr like knvv-kunnr,
             vkorg like knvv-vkorg,
             VTWEG like knvv-VTWEG,
             spart like knvv-spart,
             end of t_knvv.
      data : begin of t_knvp_kunn2 occurs 0,
              kunnr like knvp-kunnr,
              kunn2 like knvp-kunn2,
              vkorg like knvv-vkorg,
              VTWEG like knvv-VTWEG,
              spart like knvv-spart,
              end of t_knvp_kunn2.
      data : begin of t_knvp_pernr occurs 0,
              kunnr like knvp-kunnr,
              pernr like knvp-pernr,
              vkorg like knvv-vkorg,
              VTWEG like knvv-VTWEG,
              spart like knvv-spart,
              end of t_knvp_pernr.
      data : begin of t_knvp_p_k occurs 0,
           kunnr like knvp-kunnr,
           pernr like knvp-pernr,
           vkorg like knvv-vkorg,
           VTWEG like knvv-VTWEG,
           spart like knvv-spart,
           end of t_knvp_p_k.
      data : IS_BW_CUST1_w like zkn_vp occurs 0 with header line.
    * Maximum number of lines for DB table
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    * counter
              S_COUNTER_DATAPAKID LIKE SY-TABIX,
    * cursor
              S_CURSOR TYPE CURSOR,
              S_CURSOR1 TYPE CURSOR,
              S_CURSOR2 TYPE CURSOR.
    * Select ranges
      RANGES: L_R_KUNNR  FOR KNA1-KUNNR.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF I_INITFLAG = SBIWA_C_FLAG_ON.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
        CASE I_DSOURCE.
          WHEN 'ZGET_CUST_SALP_ATTR'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE E009(R3). ENDIF.
    * this is a typical log call. Please write every error message like this
            LOG_WRITE 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      I_DSOURCE   "message variable 1
                      ' '.                 "message variable 2
            RAISE ERROR_PASSED_TO_MESS_HANDLER.
        ENDCASE.
        APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.
    * Fill parameter buffer for data extraction calls
        S_S_IF-REQUNR    = I_REQUNR.
        S_S_IF-DSOURCE = I_DSOURCE.
        S_S_IF-MAXSIZE   = I_MAXSIZE.
    * Fill field list table for an optimized select statement
    * (in case there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First call      OPEN CURSOR + FETCH
    *                Following calls FETCH only
    * First data package -> OPEN CURSOR
        IF S_COUNTER_DATAPAKID = 0.
    * Fill range tables. BW will only pass down simple selection criteria
    * of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
          LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'KUNNR'.
            MOVE-CORRESPONDING L_S_SELECT TO L_R_KUNNR.
            APPEND L_R_KUNNR.
          ENDLOOP.
    * Template leftover (L_R_CONNID is not declared in this module):
    *    LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CONNID'.
    *      MOVE-CORRESPONDING L_S_SELECT TO L_R_CONNID.
    *      APPEND L_R_CONNID.
    *    ENDLOOP.
    * Determine the number of database records to be read per FETCH statement
    * from input parameter I_MAXSIZE. If there is a one-to-one relation
    * between DataSource table lines and database entries, this is trivial.
    * In other cases, it may be impossible and some estimated value has to
    * be determined.
    ******Get Customer Number from KNA1 Table
          OPEN CURSOR WITH HOLD S_CURSOR FOR
          SELECT kunnr FROM KNA1
                                   WHERE KUNNR  IN L_R_KUNNR.
          fetch next cursor s_cursor APPENDING corresponding fields of table t_tab1
          PACKAGE SIZE S_S_IF-MAXSIZE.
          CLOSE CURSOR S_CURSOR.
    ******Get Customer Details from KNVV Table
          OPEN CURSOR WITH HOLD  S_CURSOR1  FOR
          SELECT kunnr vkorg VTWEG spart  FROM KNVV for all entries in t_tab1
          where
          KUNNR = t_tab1-kunnr.
          fetch next cursor s_cursor1 APPENDING corresponding fields of table t_knvv
          PACKAGE SIZE S_S_IF-MAXSIZE.
          CLOSE CURSOR S_CURSOR1.
    ******Get Customer Partner Function Details from KNVP Table for ship to party.
          OPEN CURSOR WITH HOLD  S_CURSOR2  FOR
      select kunnr kunn2 vkorg VTWEG spart from knvp for all entries in t_knvv
        where kunnr = t_knvv-kunnr and vkorg = t_knvv-vkorg
       and VTWEG = t_knvv-VTWEG and spart = t_knvv-spart and parvw = 'WE'.
          fetch next cursor s_cursor2 APPENDING corresponding fields of table t_knvp_kunn2
                PACKAGE SIZE S_S_IF-MAXSIZE.
          CLOSE CURSOR S_CURSOR2.
          delete adjacent duplicates from t_knvp_kunn2 comparing kunnr.
    ******Get Customer Partner Function Details from KNVP Table for Sales Personnel.
          OPEN CURSOR WITH HOLD S_CURSOR2 FOR
          select kunnr pernr vkorg VTWEG spart from knvp for all entries in t_knvv
          where kunnr = t_knvv-kunnr and vkorg = t_knvv-vkorg
          and VTWEG = t_knvv-VTWEG and spart = t_knvv-spart and parvw = 'ZR'.
        ENDIF.                             "First data package ?
        fetch next cursor s_cursor2 APPENDING corresponding fields of table t_knvp_pernr
           PACKAGE SIZE S_S_IF-MAXSIZE.
        CLOSE CURSOR S_CURSOR2.
        Loop at t_knvp_pernr.
          t_knvp_p_k-kunnr = t_knvp_pernr-kunnr.
          t_knvp_p_k-pernr = t_knvp_pernr-pernr.
          t_knvp_p_k-vkorg = t_knvp_pernr-vkorg.
          t_knvp_p_k-VTWEG = t_knvp_pernr-VTWEG.
          t_knvp_p_k-spart = t_knvp_pernr-spart.
          append t_knvp_p_k.
          loop at t_knvp_kunn2 where kunnr = t_knvp_pernr-kunnr and vkorg = t_knvp_pernr-vkorg
          and VTWEG = t_knvp_pernr-VTWEG and spart = t_knvp_pernr-spart.
            t_knvp_p_k-kunnr = t_knvp_kunn2-kunn2.
            t_knvp_p_k-pernr = t_knvp_pernr-pernr.
            t_knvp_p_k-vkorg = t_knvp_pernr-vkorg.
            t_knvp_p_k-VTWEG = t_knvp_pernr-VTWEG.
            t_knvp_p_k-spart = t_knvp_pernr-spart.
            append t_knvp_p_k.
          endloop.
        endloop.
        delete adjacent duplicates from t_knvp_p_k comparing kunnr pernr vkorg vtweg spart.
        loop at t_knvp_p_k.
          IS_BW_CUST1_w-kunnr = t_knvp_p_k-kunnr.
          IS_BW_CUST1_w-pernr = t_knvp_p_k-pernr.
          IS_BW_CUST1_w-vtweg = t_knvp_p_k-vtweg.
          append IS_BW_CUST1_w to  E_T_DATA.
        endloop.
    * Fetch records into the interface table,
    * named E_T_<name of extract structure>.
       FETCH NEXT CURSOR S_CURSOR
                  APPENDING CORRESPONDING FIELDS
                  OF TABLE E_T_DATA
                  PACKAGE SIZE S_S_IF-MAXSIZE.
        if e_t_data[] is initial.
          RAISE NO_MORE_DATA.
        endif.
      IF SY-SUBRC <> 0.
      CLOSE CURSOR S_CURSOR.
        RAISE NO_MORE_DATA.
      ENDIF.
        S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.
    Nimisha Gandhi.

    Hi,
    you can't open several cursors this way....
    If you need fields from several tables, I suggest opening the cursor on something like:
    SELECT t1~<field> t2~<field>
    FROM KNA1 AS t1
    INNER JOIN KNVV AS t2 ON t2~KUNNR = t1~KUNNR
    Please try to stick to the template as well, since it is of paramount importance to FETCH NEXT at the right moment and to RAISE NO_MORE_DATA at the right place; otherwise it won't work.
    Hope this helps...
    Olivier.
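Olivier's point (one cursor opened on a join, then fetched package by package) can be illustrated outside ABAP with a small Python/sqlite3 sketch; the table and column names are simplified stand-ins for KNA1/KNVV and `MAXSIZE` plays the role of `I_MAXSIZE`:

```python
# Illustration (Python/sqlite3, not ABAP) of the extractor pattern:
# open ONE cursor on a join, fetch in packages, stop when a fetch is empty
# (the ABAP equivalent of RAISE NO_MORE_DATA).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE kna1(kunnr TEXT);
    CREATE TABLE knvv(kunnr TEXT, vkorg TEXT);
    INSERT INTO kna1 VALUES ('C1'), ('C2'), ('C3');
    INSERT INTO knvv VALUES ('C1','1000'), ('C2','1000'), ('C3','2000');
""")

MAXSIZE = 2
cur = con.execute("""
    SELECT t1.kunnr, t2.vkorg
    FROM kna1 AS t1 INNER JOIN knvv AS t2 ON t2.kunnr = t1.kunnr
    ORDER BY t1.kunnr
""")                                   # "OPEN CURSOR" happens once

packages = []
while True:
    package = cur.fetchmany(MAXSIZE)   # one FETCH ... PACKAGE SIZE per call
    if not package:                    # empty fetch -> NO_MORE_DATA
        break
    packages.append(package)

assert [len(p) for p in packages] == [2, 1]
assert packages[0][0] == ('C1', '1000')
```

The key property is that the cursor position survives between fetches; opening, fetching once, and closing several cursors in one call (as in the posted function module) discards that state.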

  • Initialization with zero records

    Hi BW Folks,
    I have scheduled an initialization to an ODS object. It ran successfully, but with zero records.
    After that I tried to run the delta; the package failed without any proper error message. I can see only the message "Start InfoPackage XXXXX".
    If I go to the InfoPackage manually, it shows the message "There is no active delta initialization for this IS/QS/DataSource".
    I have checked in R/3: the extractor checker also shows only zero records.
    Can you please help me in this! Thanks in Advance.
    Regards,
    --Nani.

    Hi,
    That is the reason why you are not able to do the delta loads. The init request must be present in the <i>scheduler</i> as the prerequisite for delta loads. It does not matter that the request is in the data target if there is no delta-init information at InfoPackage level.
    So you need to do the delta init one more time: delete the data from the data targets and run the delta init again.
    With rgds,
    Anil Kumar Sharma .P

  • Deltas fetching negative values

    Hi, this is regarding deltas fetching negative values. When I use a full load the data matches perfectly, but with deltas negative values are picked up because of cancel indicator X, so the data mismatches.
    This is based on a PP cube (production planning).
    For example:
    cancel indicator   order no   confirm qty   target qty
                       205674     100           100
    X                  205674     -50           -100

    Hi, thanks for your reply.
    The DataSource is 2LIS_04_P_ARBPL.
    If we take any document, we get the cancel indicator values space and X. For the space indicator we get correct data that matches R/3, but the X indicator gives negative values, so when we aggregate them we get wrong data and the total becomes zero.
    It happens only in the deltas, not in a full load: with a full load the data matches correctly, but the deltas get wrong values.
    If we delete the setup tables and reload, the data is correct up to that date, but from the next day onwards the deltas bring negative values and the data does not match.
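What the X records do in an additive update can be seen with a toy calculation (plain Python, using the values from the example in this thread): the negative reversal record is supposed to cancel the original posting, so the net only looks "wrong" if the original record is missing from the target, e.g. because it predates the delta init or setup-table fill.

```python
# Toy illustration of reversal records (cancel indicator 'X') in an
# additive delta: the negative image cancels the original posting.
records = [
    {"cancel": " ", "order": "205674", "confqty": 100, "targetqty": 100},
    {"cancel": "X", "order": "205674", "confqty": -50, "targetqty": -100},
]

net_conf = sum(r["confqty"] for r in records)
net_target = sum(r["targetqty"] for r in records)
assert net_conf == 50       # 100 confirmed, 50 reversed
assert net_target == 0      # the target-qty posting is fully cancelled
```

If the cube only ever receives the X record (and not the original +100/+100 row), the aggregate goes negative, which matches the symptom described after the setup-table reload.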

  • Global Temp Table, always return  zero records

    I call a procedure which uses a global temp table. After executing the proc, which populates the global temp table, I run a select query to retrieve the result, but it always returns zero records. I am using a transaction in order to avoid deletion of the records in the global temp table.
    If I do the same thing in SQL Navigator, it works.
    Cn.ConnectionString = Constr
    Cn.Open()
    If FGC Is Nothing Then
        Multiple = True
        'Search by desc
        'packaging.pkg_msds.processavfg(null, ActiveInActive, BrandCode, Desc, Itemtype)
        SQL = "BEGIN packaging.pkg_msds.processavfg(null,'" & _
              ActiveInActive & "','" & _
              BrandCode & "','" & _
              Desc & "','" & _
              Itemtype & "'); end;"
        'Here it will return multiple FGCs that need to be combined
    Else
        'Search by FGC
        SQL = "BEGIN packaging.pkg_msds.processavfg('" & FGC & "','" & _
              ActiveInActive & "','" & _
              BrandCode & "',null,null); end;"
        'Will always return one FGC
    End If
    ' SQL = " DECLARE BEGIN rguo.pkg_msds.processAvedaFG('" & FGC & "'); end;"
    Stepp = 1
    Cmd.Connection = Cn
    Cmd.CommandType = Data.CommandType.Text
    Cmd.CommandText = SQL
    Dim Trans As System.Data.OracleClient.OracleTransaction
    Trans = Cn.BeginTransaction()
    Cmd.Transaction = Trans
    Dim Cnt As Integer
    Cnt = Cmd.ExecuteNonQuery()
    'SQL = "SELECT rguo.pkg_msds.getPDSFGMass FROM dual"
    SQL = "select * from packaging.aveda_mass_XML"
    Cmd.CommandType = Data.CommandType.Text
    Cmd.CommandText = SQL
    Adp.SelectCommand = Cmd
    Stepp = 2
    Adp.Fill(Ds)
    If Ds.Tables(0).Rows.Count = 0 Then
        blError = True
        BlComposeXml = True
        Throw New Exception("No record found for FGC (Finished Good Code) " & FGC)
    End If
    'First row, first column contains the data as XML
    Stepp = 0
    Trans.Commit()

    Hi,
    This forum is for Oracle's data provider and you're using Microsoft's, but I was curious, so I went ahead and tried it. It works fine for me. Here's the complete code I used; could you point out what you are doing differently?
    Cheers,
    Greg
    create global temporary table abc_tab(col1 varchar2(10));
    create or replace procedure ins_abc_tab(v1 varchar2) as
    begin
        insert into abc_tab values(v1);
    end;
    using System;
    using System.Data;
    using System.Data.OracleClient;
    class Program
    {
        static void Main(string[] args)
        {
            OracleConnection con = new OracleConnection("data source=orcl;user id=scott;password=tiger");
            con.Open();
            OracleTransaction txn = con.BeginTransaction();
            OracleCommand cmd = new OracleCommand("begin ins_abc_tab('foo');end;", con);
            cmd.Transaction = txn;
            cmd.ExecuteNonQuery();
            cmd.CommandText = "select * from abc_tab";
            OracleDataAdapter da = new OracleDataAdapter(cmd);
            DataSet ds = new DataSet();
            da.Fill(ds);
            Console.WriteLine("rows found: {0}", ds.Tables[0].Rows.Count);
            // commit, cleanup, etc. omitted for clarity
        }
    }

  • How to look at the delta (daily load) records from R/3 system to BW master

    How can I look at the delta (daily load) records that came from the R/3 system to BW master data on a particular date? Let us say
    a delta ran today from R/3 to 0VENDOR in BW; how do I view which records came to BW 0VENDOR from R/3 in today's delta?
    Regards
    Siva

    Hi,
    You can see the data in the PSA.
    Just go to the request in the monitor tab and, at the top left of the monitor, click the PSA button.
    In the next window, give a selection on the number of records you want to see; if you want to see all the records, enter the number that was loaded in that request.
    This will show you the data loaded in that request.
    Hope this is clear.
    thanks

  • Fetching many records all at once is no faster than fetching one at a time

    Hello,
    I am having a problem getting NI-Scope to perform adequately for my application.  I am sorry for the long post, but I have been going around and around with an NI engineer through email and I need some other input.
    I have the following software and equipment:
    LabView 8.5
    NI-Scope 3.4
    PXI-1033 chassis
    PXI-5105 digitizer card
    DELL Latitude D830 notebook computer with 4 GB RAM.
    I tested the transfer speed of my connection to the PXI-1033 chassis using the niScope Stream to Memory Maximum Transfer Rate.vi found here:
    http://zone.ni.com/devzone/cda/epd/p/id/5273.  The result was 101 MB/s.
    I am trying to set up a system whereby I can press the start button and acquire short, individually triggered waveforms.  I wish to acquire these individually triggered waveforms indefinitely, and to maximize the rate at which the triggers occur.  In the limiting case where I acquire records of one sample, the record size in memory is 512 bytes (using the formula for 'Allocated Onboard Memory per Record' in the NI PXI/PCI-5105 Specifications under the heading 'Waveform Specifications', pg. 16).  The PXI-5105 trigger re-arms in about 2 microseconds (500 kHz), so to trigger at that rate indefinitely I would need a transfer speed of at least 256 MB/s.  So clearly, in this case the limiting factor for increasing the trigger rate while still acquiring indefinitely is the rate at which I transfer records from onboard memory to my PC.
    To maximize my record transfer rate, I should transfer many records at once using the Multi Fetch VI, as opposed to the theoretically slower method of transferring one at a time.  To compare the two methods, I modified the niScope EX Timestamps.vi to let me choose between them by changing the constant wired to the Fetch Number of Records property node to either -1 or 1 respectively.  I also added a loop that ensures all records are acquired before I begin the transfer, so that acquisition and trigger rates do not interfere with measuring the record transfer rate.  This modified VI is attached to this post.
    I have the following results for acquiring 10k records.  My measurements are done using the Profile Performance and Memory Tool.
    I am using a 250kHz analog pulse source.
    Fetching 10000 records one record at a time, the niScope Multi Fetch cluster takes a total of 1546.9 milliseconds, or 155 microseconds per record.
    Fetching 10000 records at once, it takes a total of 1703.1 milliseconds, or 170 microseconds per record.
    I have tried this for larger and smaller total numbers of records, and the transfer time is always around 170 microseconds per record, regardless of whether I transfer one at a time or all at once.  But with a 100 MB/s link and a 512-byte record size, the fetch time should approach 5 microseconds per record as the number of records fetched at once increases.
    With this, my application will be limited to a trigger rate of 5 kHz when running indefinitely, whereas it should be capable of closer to a 200 kHz trigger rate for extended periods of time.  I have a feeling that I am missing something simple or am just confused about how the Fetch functions should work. Please enlighten me.
    Attachments:
    Timestamps.vi ‏73 KB

    Hi ESD
    Your numbers for testing the PXI bandwidth look good.  A value of approximately 100 MB/s is reasonable when pulling data across the PXI bus continuously in larger chunks.  This may decrease a little when working with MXI in comparison to using an embedded PXI controller.  I expect you were using the streaming example "niScope Stream to Memory Maximum Transfer Rate.vi" found here: http://zone.ni.com/devzone/cda/epd/p/id/5273.
    Acquiring multiple triggered records is a little different.  There are a few techniques that will help make sure you can fetch your data fast enough to keep up with the acquired data or the desired reference trigger rate.  You are certainly correct that it is more efficient to transfer larger amounts of data at once, instead of small amounts of data more frequently, as the overhead due to DMA transfers becomes significant.
    The trend you saw, that fetching fewer records was more efficient, sounded odd, so I ran your example and tracked down what was causing it.  I believe it is actually the for loop that you had in your acquisition loop.  I made a few modifications to the application to display the total fetch time to acquire 10000 records.  The best fetch time is when all records are pulled in at once. I left your code in the application but temporarily disabled the for loop to show the fetch performance. I also added a loop to ramp the fetch number up and graph the fetch times.  I will attach the modified application as well as the fetch results I saw on my system for reference.  When the for loop is enabled, the performance is worst at 1-record fetches; the fetch time dips around 500 records/fetch and ramps up again as records/fetch increases to 10000.
    Note I am using the 2D I16 fetch as it is more efficient to keep the data unscaled.  I have also added an option to use immediate triggering - this is just because I was not near my hardware to physically connect a signal so I used the trigger holdoff property to simulate a given trigger rate.
    Hope this helps.  I was working in LabVIEW 8.5, if you are working with an earlier version let me know.
    Message Edited by Jennifer O on 04-12-2008 09:30 PM
    Attachments:
    RecordFetchingTest.vi ‏143 KB
    FetchTrend.JPG ‏37 KB
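The bandwidth arithmetic quoted in this thread can be checked directly (plain Python arithmetic using the numbers from the post; note that 512 bytes at a 500 kHz re-arm rate is 256 megabytes per second):

```python
# Checking the throughput arithmetic from the question.
record_bytes = 512                 # allocated onboard memory per 1-sample record
rearm_rate_hz = 500_000            # ~2 us trigger re-arm (500 kHz)
link_bytes_per_s = 100 * 10**6     # measured ~100 MB/s MXI link

needed = record_bytes * rearm_rate_hz            # to stream at the re-arm rate
assert needed == 256 * 10**6                     # 256 MB/s, above the link speed

per_record_s = record_bytes / link_bytes_per_s   # best case on a 100 MB/s link
assert abs(per_record_s - 5.12e-6) < 1e-9        # ~5 us per 512-byte record

sustainable_hz = link_bytes_per_s / record_bytes
assert sustainable_hz == 195_312.5               # ~195 kHz sustained trigger rate
```

So the roughly 5 us/record figure in the question is the link-limited floor, and the observed 155-170 us/record indicates per-fetch overhead (the for-loop issue identified in the answer) rather than bus bandwidth.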

  • Fetching 10 records at a time

    The Product.Category dimension has 4 child nodes: Accessories, Bikes, Clothing and Components. My problem is that when I have thousands of first-level nodes, my application takes a long time to load. Is there a way to fetch only, say, 100 records at a time, so that
    when I click a Next button I get the next 100?
    E.g. on the first click of a button I fetch 2 members:
    WITH MEMBER [Measures].[ChildrenCount] AS
    [Product].[Category].CurrentMember.Children.Count
    SELECT [Measures].[ChildrenCount] ON 1
    ,TopCount([Product].[Category].Members, 2) on 0
    FROM [Adventure Works]
    This fetches only Accessories. Is there a way to fetch the next two records, Bikes and Clothing, on a click,
    then Components on the next click, and so on?

    Hi Tsunade,
    According to your description, there are thousands of members in your cube, and it takes a long time to retrieve all the members at once; in order to improve performance, you are looking for a function to fetch a fixed number of records at a time, right? Based on my
    research, there is currently no such functionality to work around this requirement.
    If you have any concerns about this behavior, you can submit feedback at
    http://connect.microsoft.com/SQLServer/Feedback and hope it is resolved in the next release of a service pack or of the product. Your feedback enables Microsoft to make its software and services the best they can be, and Microsoft may consider adding this feature
    in a following release after official confirmation.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Query returns zero records in coldfusion context, but works fine in Navicat

    I've got a query that returns zero records when I load a page.  If I copy and paste that same query (from the debug output) into Navicat, I get rows returned (as I expect).  Has anyone seen this?  It happens locally (CF9) AND remotely on our staging server (CF10).  Even weirder, it's a query that was previously working fine: I simply added an if statement to the where clause, and all of a sudden...
    Here's the query:
            <CFQUERY name="LOCAL.getEncounterServices" datasource="#REQUEST.dsn#"> 
            SELECT
                a.EncounterProductID,
                a.DateTime AS ServiceDate,
                aa.CartItemID,
                aaa.CartID,
                aaaaa.CartStatus,
                b.ProductID,
                b.ProductName,
                b.CPTCode,
                b.Price,
                c.EncounterID,
                c.DateTimeClosed AS EncounterClosedDate,
                d.FirstName,
                d.LastName
            FROM
                EncounterProducts a
                    LEFT JOIN CartItemProduct aa ON (a.EncounterProductID = aa.EncounterProductID AND aa.Active = 1)
                    LEFT JOIN CartItem aaa ON (aa.CartItemID = aaa.CartItemID)
                    LEFT JOIN Cart aaaa ON (aaa.CartID = aaaa.CartID)
                    LEFT JOIN CartStatus aaaaa ON (aaaa.CartStatusID = aaaaa.CartStatusID),
                Product b,
                Encounters c,
                Contacts d,
                EncounterStatuses e
            WHERE
                1 = 1
                AND (aa.CartItemID IS NULL OR aaaaa.CartStatus = 'Deleted')
                AND a.Active = 1
                AND a.ProductID = b.ProductID
                AND a.EncounterID = c.EncounterID
                AND c.PatientID = d.ContactID
                AND c.EncounterStatusID = e.EncounterStatusID
                AND e.EncounterStatus = 'Closed'
              <CFIF IsDefined("ARGUMENTS.encounter") AND IsObject(ARGUMENTS.encounter)>
                     AND c.EncounterID = <CFQUERYPARAM cfsqltype="cf_sql_integer" value="#ARGUMENTS.encounter.getID()#">
             <CFELSE>
                    AND c.DateTimeClosed >= <CFQUERYPARAM cfsqltype="cf_sql_date" value="#ARGUMENTS.startDate#">
                    AND c.DateTimeClosed < <CFQUERYPARAM cfsqltype="cf_sql_date" value="#DateFormat(DateAdd('d', 1, ARGUMENTS.endDate), 'yyyy-mm-dd')# 00:00:00">
               </CFIF>
                AND c.LocationID = <CFQUERYPARAM cfsqltype="cf_sql_integer" value="#ARGUMENTS.locationID#">
                AND c.CustomerID = <CFQUERYPARAM cfsqltype="cf_sql_integer" value="#ARGUMENTS.customerID#">
            </CFQUERY>
    All of this worked just fine before I added the lines:
             <CFIF IsDefined("ARGUMENTS.encounter") AND IsObject(ARGUMENTS.encounter)>
                     AND c.EncounterID = <CFQUERYPARAM cfsqltype="cf_sql_integer" value="#ARGUMENTS.encounter.getID()#">
             <CFELSE>
                    AND c.DateTimeClosed >= <CFQUERYPARAM cfsqltype="cf_sql_date" value="#ARGUMENTS.startDate#">
                    AND c.DateTimeClosed < <CFQUERYPARAM cfsqltype="cf_sql_date" value="#DateFormat(DateAdd('d', 1, ARGUMENTS.endDate), 'yyyy-mm-dd')# 00:00:00">
              </CFIF>
    Previously, it had just been:
                    AND c.DateTimeClosed >= <CFQUERYPARAM cfsqltype="cf_sql_date" value="#ARGUMENTS.startDate#">
                    AND c.DateTimeClosed < <CFQUERYPARAM cfsqltype="cf_sql_date" value="#DateFormat(DateAdd('d', 1, ARGUMENTS.endDate), 'yyyy-mm-dd')# 00:00:00">
    With no IF/ELSE statement.
    Anyone seen anything like this before?  Any ideas? 
    Thanks.
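    Not a confirmed fix, but one thing worth ruling out: the query mixes implicit comma joins with explicit LEFT JOINs, which leaves the join order ambiguous (and in MySQL 5+ the comma-joined tables bind with lower precedence than the explicit joins), and that can silently change which rows survive. A sketch of the FROM clause rewritten with only explicit joins, keeping the same table aliases:

    ```sql
    FROM
        EncounterProducts a
            INNER JOIN Product b ON (a.ProductID = b.ProductID)
            INNER JOIN Encounters c ON (a.EncounterID = c.EncounterID)
            INNER JOIN Contacts d ON (c.PatientID = d.ContactID)
            INNER JOIN EncounterStatuses e ON (c.EncounterStatusID = e.EncounterStatusID)
            LEFT JOIN CartItemProduct aa ON (a.EncounterProductID = aa.EncounterProductID AND aa.Active = 1)
            LEFT JOIN CartItem aaa ON (aa.CartItemID = aaa.CartItemID)
            LEFT JOIN Cart aaaa ON (aaa.CartID = aaaa.CartID)
            LEFT JOIN CartStatus aaaaa ON (aaaa.CartStatusID = aaaaa.CartStatusID)
    WHERE
        (aa.CartItemID IS NULL OR aaaaa.CartStatus = 'Deleted')
        AND a.Active = 1
        AND e.EncounterStatus = 'Closed'
        -- the <CFIF>/<CFELSE> parameter filters stay exactly as they are
    ```

    If the rewritten form returns rows while the original does not, the join precedence is the culprit rather than the new CFIF branch.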

    Right, I'll start disabusing myself of the DateFormat!
    I'm sorry, I should've posted the actual query too.  It's inserting the first part - "AND c.EncounterID = ....."
    Here's the full query:
    LOCAL.getEncounterServices (Datasource=xmddevdb, Time=9ms, Records=0) in /Applications/ColdFusion9/wwwroot/XMD_NEW/xmd_dev/cfc/ShoppingGateway.cfc @ 16:56:28.028
    SELECT
                a.EncounterProductID,
                a.DateTime AS ServiceDate,
                aa.CartItemID,
                aaa.CartID,
                aaaaa.CartStatus,
                b.ProductID,
                b.ProductName,
                b.CPTCode,
                b.Price,
                c.EncounterID,
                c.DateTimeClosed AS EncounterClosedDate,
                d.FirstName,
                d.LastName
            FROM
                EncounterProducts a
                    LEFT JOIN CartItemProduct aa ON (a.EncounterProductID = aa.EncounterProductID AND aa.Active = 1)
                    LEFT JOIN CartItem aaa ON (aa.CartItemID = aaa.CartItemID)
                    LEFT JOIN Cart aaaa ON (aaa.CartID = aaaa.CartID)
                    LEFT JOIN CartStatus aaaaa ON (aaaa.CartStatusID = aaaaa.CartStatusID),
                Product b,
                Encounters c,
                Contacts d,
                EncounterStatuses e
            WHERE
                1 = 1
                AND (aa.CartItemID IS NULL OR aaaaa.CartStatus = 'Deleted')
                AND a.Active = 1
                AND a.ProductID = b.ProductID
                AND a.EncounterID = c.EncounterID
                AND c.PatientID = d.ContactID
                AND c.EncounterStatusID = e.EncounterStatusID
                AND e.EncounterStatus = 'Closed'
                     AND c.EncounterID = ?
                AND c.LocationID = ?
                AND c.CustomerID = ?
    Query Parameter Value(s) -
    Parameter #1(cf_sql_integer) = 28
    Parameter #2(cf_sql_integer) = 16
    Parameter #3(cf_sql_integer) = 6
    Thanks again for the help!
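    On the DateFormat point, for what it's worth (an aside about the date branch of the original query, not the EncounterID branch shown in the debug output above): `cf_sql_date` maps to a DATE and can drop the time portion, while `cf_sql_timestamp` keeps it, and passing the date object directly avoids the string building entirely. A sketch using the variable names from the original query:

    ```cfml
    <!--- Sketch: pass date objects straight through and let the driver format them.
          cf_sql_timestamp preserves the time portion that cf_sql_date may truncate. --->
    AND c.DateTimeClosed >= <CFQUERYPARAM cfsqltype="cf_sql_timestamp" value="#ARGUMENTS.startDate#">
    AND c.DateTimeClosed <  <CFQUERYPARAM cfsqltype="cf_sql_timestamp" value="#DateAdd('d', 1, ARGUMENTS.endDate)#">
    ```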

Maybe you are looking for

  • JDK 1.6 doesn't choose proper XPathFactory...

    Hello, I was using jdk 1.5.0 with the saxon api to give me XPath 2.0 functionality. I've recently upgraded to JDK6 and now XPathFactory.newInstance() doesn't give me the saxon implementation anymore. When I try to compile the XPath Expressio

  • File to File (file not getting picked)

    Hi, I am having a file-to-file scenario. My file is not getting processed from the source directory. When I check the communication channel monitoring, both my sender and receiver communication channels are working fine. I have placed the source file w

  • Problem installing Solaris 10 1/06 Operating System

    I am trying to install Sol 10 1/06 release. Problem CDE fails, with Error Opening PAM libraries ? Log in to console (as root) fails with: open_module:/usr/lib/security/pam_authtok_get.so.1 error message At the same time I am missing /usr/lib/mps/libn

  • Template Designer - Custom Partner Functions in Object GDCOIC

    Hi, After spending a lot of time juggling with Template designer we have finally managed to get it working. But we have now come to an issue where the object used to create web service for template designer GDCOIC doesn't have details of partner func

  • Director and SAP

    Hi Has anyone any experience in integrating Director and SAP Cheers Photman52