Records are loaded repeatedly for RDA DataSource

Hi,
I am trying to load data for a custom DataSource into a DSO. This DataSource is real-time (RDA) enabled.
Even though no new records are created in the ECC system, the real-time DataSource keeps fetching the same number of old records daily; that is, the records are loaded into the DSO repeatedly.
Why are records loaded repeatedly into the DSO for a real-time DataSource?
Please let me know.
Regards,
Mohammed

Hi,
In RSO2, click 'Generic Delta', select the timestamp field, and maintain the lower and upper safety limits.
If you are loading the data into a DSO, select the radio button 'New Status for Changed Records';
if you are loading the data into a cube, select 'Additive Delta'.
Regards,
Yerrabelli.
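The timestamp-based generic delta described above can be sketched outside SAP as a minimal Python illustration (not ABAP; record/field names are made up). Note how the lower safety interval deliberately overlaps the previous window, which is one common reason the same records arrive again:

```python
from datetime import datetime, timedelta

def select_delta(records, last_upper, now,
                 lower_safety_sec=60, upper_safety_sec=60):
    """Pick records whose change timestamp falls inside the new delta
    window, mimicking RSO2's generic delta on a timestamp field.

    lower limit = previous upper bound minus a safety interval
                  (re-reads a little old data so late postings are not
                  lost -- this overlap can re-extract old records),
    upper limit = current time minus a safety interval
                  (skips records that may still be being written).
    """
    lower = last_upper - timedelta(seconds=lower_safety_sec)
    upper = now - timedelta(seconds=upper_safety_sec)
    delta = [r for r in records if lower <= r["changed_at"] < upper]
    return delta, upper  # 'upper' becomes the next run's last_upper
```

If the extractor's selection logic ignores the stored pointer (or the pointer is never advanced), every run re-reads the same window, which matches the symptom in the original question.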

Similar Messages

  • Delta records are not loading from DSO to info cube

My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
Delta records are not loading from the DSO to the InfoCube. I have tried all options available in the DTP, but no luck.
I selected "Change Log" and "Get One Request Only" and ran the DTP, but 0 records were updated in the InfoCube.
I selected "Change Log" and "Get All New Data Request by Request", but again 0 records were updated.
I selected "Change Log" and "Only Get Delta Once"; in that case all delta records loaded to the InfoCube as they were in the DSO, but it gave the error message "Lock Table Overflow".
When I run a full load using the same filter, data loads from the DSO to the InfoCube.
Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

Data loads in the case of a full load with the same filter, so I don't think the filter is the issue.
When I follow the sequence below, I get a lock table overflow error:
1. Full load from the active table, with or without archive.
2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
When I change the DTP settings to an init run:
1. Select "Change Log" and "Get One Request Only" and run the init; it completes successfully with green status.
2. But when I run the same DTP for delta records, it does not load any data.
Please help me resolve this issue.

  • Delta records are not populating for enhanced fields in 2LIS_17_I3HDR

    Hello,
I have an issue with the 2LIS_17_I3HDR DataSource. I have enhanced the DS with Z-fields, which are populated from AUFK through a user exit. When I fill the setup table, I can see values being populated for the enhanced fields, but for delta records I don't find any values for these enhanced fields, even though the values exist in the AUFK table. I even tried debugging the user-exit code; it does not go through it during the posting of delta records.
Can someone give me a suggestion on this issue?
    Thanks,
              Pavan

Hello Ravi,
Yes, delta cannot be triggered by changes to enhanced fields alone. But here the issue is different:
I have added a closed date to the extract structure. This is order-specific: when I create a new PM order after the full load, then release and close it, the AUFK table gets an entry for the closed date. My exit compares the object number from the extractor with the object number in AUFK and copies the closed-date value into the DS. This is not the same as enabling delta on the enhanced fields (which can't be done).
    Thanks,
                Pavan
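The lookup Pavan describes (done in ABAP in the user exit, typically EXIT_SAPLRSAP_001 for transaction data) can be sketched in Python purely as an illustration; the field names OBJNR, IDAT2, and ZZCLOSED_DATE are assumptions here:

```python
def enrich_with_closed_date(extract_rows, aufk_by_objnr):
    """For each extracted order record, look up the order's entry in a
    snapshot of AUFK keyed by object number (OBJNR) and copy its
    close-date field (IDAT2 here, illustrative) into the enhanced
    field, called ZZCLOSED_DATE for this sketch."""
    for row in extract_rows:
        aufk_row = aufk_by_objnr.get(row["OBJNR"])
        if aufk_row is not None:
            row["ZZCLOSED_DATE"] = aufk_row.get("IDAT2")
    return extract_rows
```

The point of the thread is that this enrichment runs whenever the exit is called with the record, but a change to AUFK alone does not queue a delta record for the DataSource, so the exit never gets a chance to run.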

  • What are the tables for PS datasource.

Hi all,
I am new to the PS module. I got the DataSources, InfoCubes, and details from help.sap.com, but I want to know the exact flow on the R/3 side: which table values come into the DataSource?
For example, 0PS_DAT_NTW is the DataSource for network dates; where on the R/3 side is that data maintained, and which table holds it?
In the LO Cockpit we have LBWE, where we can see the communication structures; what is the equivalent for the PS module?
My DataSources are:
    0ps_dat_ntw ,
    0ps_dat_nwa ,
    0ps_dat_wbs ,
    0ps_dat_prj ,
    0ps_wbs_eva ,
    0ps_ord_eva ,
    0ps_ntw_eva.
    Thanks In Advance.

    Hi,
You aren't far from the information you require.
URL for BC PS:
http://help.sap.com/saphelp_nw04/helpdata/EN/b9/60073c0057f057e10000000a114084/frameset.htm
Open the folder "DataSources" and go through the DSs.
According to your example:
http://help.sap.com/saphelp_nw04/helpdata/EN/ea/5fcb39722a8504e10000000a114084/content.htm
The required information is provided in the columns "Table of Origin" and "Field in Table of Origin".
    Regards
    Joe

  • Prompt for transport request if records are added/modified for custom table

    Hello Experts,
    A couple of questions,
1. How do we prompt the user to create a transport request whenever he changes a record in my custom table via SM30? My custom table already has maintenance type 'C' (Customizing).
2. How do we transport the custom table contents to another client?
    Thank you guys and take care!

    Hello
Regarding (1), just have a look at a standard maintenance view like V_T001W (SE11 -> menu Utilities -> Table Maintenance Generator):
here you can see that the option "Standard Recording Routine" is chosen, which is responsible for prompting for a transport request as soon as records of the table/view are changed.
    Regards
      Uwe

  • Multiple records are not loaded   in FILE to RFC  scenario

    Hi   ,
I did a File-to-RFC scenario. It was working fine, but the problem is that only the first record from my file was being loaded into the R/3 database. What do I have to do? I'm new to XI.
Generally a file contains multiple records, but my scenario was loading only the first record of my file and skipping the remaining records.
Please provide me the solution.
    Thanks
    Babu

    Hi
In the message mapping, change the occurrence of the RFC to 0..unbounded.
Map the root node of the file XML structure to the root node of the RFC.
In the interface mapping, after selecting the source and target message interfaces, change the occurrence of the target interface (i.e. the RFC) to 0..unbounded.
When doing the interface determination, you will find an option called Enhanced/Extended; select this and then select the interface mapping.
    Thanks

  • Webfonts are not loading for certain sites (407 Proxy Authentication Required)

    Hello,
    I'm running into an issue with CSS web fonts: They often don't load.
    This happens in both FF4RC and the latest FF3.6. I'm behind a proxy (an ISA server I believe), and everything is set up correctly (as far as I can tell) in about:config.
    Today I tried the [https://mozillademos.org/demos/runfield/demo.html Runfield demo]. Everything except the fonts (using google fonts api) loads and the game runs fine. Other browsers load the page (including fonts) without any problems.
    A quick look in Firebug reveals that the proxy is returning "407 Proxy Authentication Required", but only for the font files. Everything else on the page is loaded correctly. Again, other browsers on the same machine (e.g. Chrome, IE9, Opera 11) load the font files without issue.
    What's more, if I grab the URL and load it directly, it works fine, downloading the .woff file without issue.
    This happens with the Google Fonts API, but I'm also seeing the problem on FontFonter.com and FontDeck.com.
    However, Typekit.com works fine and so do the webfonts on this page.
    (I've tried restarting Firefox without add-ons and it makes no difference.)
    Thanks in advance for any pointers.

    A-ha! It seems to be a bug! [https://bugzilla.mozilla.org/show_bug.cgi?id=627616 #627616]

  • Two records are repeated in the report, but differ in one column: the amount

Hi SAP gurus,
I have a report that contains WBS element as one of its columns. When I execute the report, I get two records for the same WBS element: one has the amount (in figures), while the other record has no amount values.
Can you explain why this is happening, i.e. two records that are the same in every respect except the amount values?
Thanks!
Shashi Sharma

Hi Shashi,
Do you have WBS element as the only column in your report, or are other columns also present in the drilldown?
If there are multiple columns, then even though the WBS element value is the same, a different value of another characteristic will give you multiple rows in the report.
    Regards,
    Durgesh.
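The effect of a second drilldown characteristic can be illustrated with a tiny aggregation sketch (Python; the column names 'wbs' and 'value_type' are made up for the example): the same rows collapse to one line when grouped by WBS alone, but split into two lines, one without an amount, when a second characteristic is in the drilldown.

```python
from collections import defaultdict

def run_query(rows, drilldown):
    """Group rows by the drilldown characteristics and sum the amount,
    roughly the way a BW query aggregates over row characteristics."""
    totals = defaultdict(float)
    for r in rows:
        totals[tuple(r[c] for c in drilldown)] += r["amount"]
    return dict(totals)

# Two source rows for the same WBS element, differing only in a second
# characteristic; one carries no amount.
rows = [
    {"wbs": "WBS-100", "value_type": "actual", "amount": 500.0},
    {"wbs": "WBS-100", "value_type": "plan",   "amount": 0.0},
]
```

With `run_query(rows, ["wbs"])` you get a single row for WBS-100; with `run_query(rows, ["wbs", "value_type"])` you get the two rows the original poster is seeing.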

  • Records are repeating

In the query below, records are repeating. In one UNION branch I have to use the RA_CUSTOMERS table and in the other I must not use it; the second condition is that I have to use ard.source_type = 'CASH'. Please advise.
    select
            jc.je_category_name            category,
            jh.je_source                source,
            ir.je_header_id                je_header_id,
           ir.je_line_num                          je_line_num,
            jl.effective_date                           acc_date,
             /* Removed to_char function in effective date   for bug  2514941  kakrishn */
            jh.name                              entry,
            jh.je_batch_id                batch,
            jb.name                       Batch_Name,
           substr(jl.description,1,25)            descr,
       --    ds.name                                           LSequence,
           jh.doc_sequence_value                                  HNumber,
           jl.je_line_num                                                 LLine,
           jl.accounted_dr                debit,
           jl.accounted_cr                credit,
           jl.entered_dr                 ent_dr,
           jl.entered_cr                ent_cr,
           jh.external_reference            reference,
           jl.reference_2                ref2,
           jl.reference_3                ref3,
           jl.period_name                per_name,
         null,--   ael.AE_LINE_NUMBER,
         null, --   ael.CURRENCY_CODE,
         null, --   ael.REFERENCE1,
         null, --   ael.REFERENCE4,
         null, --  ael.REFERENCE5,
         null, --  ael.REFERENCE10,
         null, --  ael.REFERENCE6,
         null, --  ael.ENTERED_DR,
         null, --  ael.ENTERED_CR,
         null, --  ael.ACCOUNTED_DR,
         null, --  ael.ACCOUNTED_CR,
            jl.accounted_dr     Accounted_Line_Dr,
            jl.accounted_cr     Accounted_Line_Cr,
              nvl(ir.reference_1,' ')                             reference_1,
              ir.reference_2                                          reference_2,
              ir.reference_3                                         reference_3,
              ir.reference_4                                          reference_4,
              nvl(ir.reference_5,' ')                                    reference_5,
              ir.reference_6                                          reference_6,
              ir.reference_7                                          reference_7,
              ir.reference_8                                          reference_8,
              ir.reference_9                                          reference_9,
          ir.reference_10                                        reference_10,
              ir.subledger_doc_sequence_id                  seq_id,
              ir.subledger_doc_sequence_value             seq_num,
              jh.doc_sequence_id            h_seq_id,
              ir.gl_sl_link_id                gl_sl_link_id,
        --   NULL,--   rcu.customer_id,
            rcu.customer_name,
          --   ael.ae_line_id                ae_line_id,
             gcc.segment1                   segment1,
             gcc.segment2                   segment2,
             gcc.segment3                   segment3,
             gcc.segment4                   segment4,
             gcc.segment5                   segment5,
             gcc.segment6                   segment6,
             gcc.segment7                   segment7,
             gcc.segment8                   segment8,
             ARD.ACCTD_AMOUNT_DR                                                 Debit_Amount,
            ARD.ACCTD_AMOUNT_CR                                                 Credit_Amount,
            jl.period_name                                                      Period_Name
        from GL_JE_LINES           jl,
             GL_JE_HEADERS         jh,
             GL_JE_BATCHES       jb,
             GL_JE_SOURCES       js,
             GL_JE_CATEGORIES     jc,
             GL_IMPORT_REFERENCES         ir,
               GL_CODE_COMBINATIONS      gcc,
           ar_cash_receipts_all           acr,
           ar_distributions_all            ard,
             AR_CASH_RECEIPT_HISTORY_ALL     ACH,
          ra_customers                    rcu
             where
              jl.status||''                         = 'P'
      -- and jl.code_combination_id     in (59186,39596,41444,41446,59334,39595,59338,114180,114185)
    and jl.code_combination_id     in (39595)
       and jl.set_of_books_id              = 22
       and jh.status                         = 'P'
       and jh.actual_flag                    = 'A'
       and jh.je_header_id                 = jl.je_header_id + 0
       and jb.je_batch_id        = jh.je_batch_id + 0
        and jl.code_combination_id=gcc.code_combination_id
       and jb.average_journal_flag     = 'N'
       and js.je_source_name     = jh.je_source
       and jc.je_category_name      = jh.je_category
       and ir.je_header_id (+)            = jl.je_header_id
       and ir.je_line_num (+)              = jl.je_line_num
       and  (     ir.rowid                           IS NULL
                 or jh.je_source                   IN ('Payables', 'Receivables')
                 or ir.rowid                           IN (  SELECT max(ir2.rowid)
                                                                     FROM    gl_import_references ir2
                                                                     WHERE   ir2.je_header_id = jl.je_header_id
                                                                     AND        ir2.je_line_num = jl.je_line_num))
       and (NVL(ir.reference_4,'0')) = to_char(acr.receipt_number)
      and ir.reference_3=to_char(ard.line_id)
      AND ACH.CASH_RECEIPT_ID=ACR.CASH_RECEIPT_ID
      and acr.pay_from_customer=rcu.customer_id
       AND ACH.CASH_RECEIPT_HISTORY_ID=ARD.SOURCE_ID
    and ir.reference_7=to_char(rcu.customer_id)
      AND ARD.SOURCE_TYPE='CASH'
    and  jl.period_name IN ('FEB-09')
    UNION
    select
            jc.je_category_name            category,
            jh.je_source                source,
            ir.je_header_id                je_header_id,
           ir.je_line_num                          je_line_num,
            jl.effective_date                           acc_date,
             /* Removed to_char function in effective date   for bug  2514941  kakrishn */
            jh.name                              entry,
            jh.je_batch_id                batch,
            jb.name                       Batch_Name,
           substr(jl.description,1,25)            descr,
       --    ds.name                                           LSequence,
           jh.doc_sequence_value                                  HNumber,
           jl.je_line_num                                                 LLine,
           jl.accounted_dr                debit,
           jl.accounted_cr                credit,
           jl.entered_dr                 ent_dr,
           jl.entered_cr                ent_cr,
           jh.external_reference            reference,
           jl.reference_2                ref2,
           jl.reference_3                ref3,
           jl.period_name                per_name,
         null,--   ael.AE_LINE_NUMBER,
         null, --   ael.CURRENCY_CODE,
         null, --   ael.REFERENCE1,
         null, --   ael.REFERENCE4,
         null, --  ael.REFERENCE5,
         null, --  ael.REFERENCE10,
         null, --  ael.REFERENCE6,
         null, --  ael.ENTERED_DR,
         null, --  ael.ENTERED_CR,
         null, --  ael.ACCOUNTED_DR,
         null, --  ael.ACCOUNTED_CR,
            jl.accounted_dr     Accounted_Line_Dr,
            jl.accounted_cr     Accounted_Line_Cr,
              nvl(ir.reference_1,' ')                             reference_1,
              ir.reference_2                                          reference_2,
              ir.reference_3                                         reference_3,
              ir.reference_4                                          reference_4,
              nvl(ir.reference_5,' ')                                    reference_5,
              ir.reference_6                                          reference_6,
              ir.reference_7                                          reference_7,
              ir.reference_8                                          reference_8,
              ir.reference_9                                          reference_9,
          ir.reference_10                                        reference_10,
              ir.subledger_doc_sequence_id                  seq_id,
              ir.subledger_doc_sequence_value             seq_num,
              jh.doc_sequence_id            h_seq_id,
              ir.gl_sl_link_id                gl_sl_link_id,
        --   NULL,--   rcu.customer_id,
        NULL,--    rcu.customer_name,
          --   ael.ae_line_id                ae_line_id,
             gcc.segment1                   segment1,
             gcc.segment2                   segment2,
             gcc.segment3                   segment3,
             gcc.segment4                   segment4,
             gcc.segment5                   segment5,
             gcc.segment6                   segment6,
             gcc.segment7                   segment7,
             gcc.segment8                   segment8,
             ARD.ACCTD_AMOUNT_DR                                                 Debit_Amount,
            ARD.ACCTD_AMOUNT_CR                                                 Credit_Amount,
            jl.period_name                                                      Period_Name
    from GL_JE_LINES           jl,
             GL_JE_HEADERS        jh,
             GL_JE_BATCHES        jb,
             GL_JE_SOURCES       js,
             GL_JE_CATEGORIES      jc,
             GL_IMPORT_REFERENCES           ir,
             GL_CODE_COMBINATIONS     gcc,
           ar_cash_receipts_all             acr,
           ar_distributions_all             ard,
           AR_CASH_RECEIPT_HISTORY_ALL     ACH
             where
              jl.status||''                         = 'P'
      -- and jl.code_combination_id     in (59186,39596,41444,41446,59334,39595,59338,114180,114185)
    and jl.code_combination_id     in (39595)
       and jl.set_of_books_id              = 22
       and jh.status                         = 'P'
       and jh.actual_flag                    = 'A'
       and jh.je_header_id                 = jl.je_header_id + 0
       and jb.je_batch_id        = jh.je_batch_id + 0
        and jl.code_combination_id=gcc.code_combination_id
       and jb.average_journal_flag     = 'N'
       and js.je_source_name     = jh.je_source
       and jc.je_category_name      = jh.je_category
       and ir.je_header_id (+)            = jl.je_header_id
       and ir.je_line_num (+)              = jl.je_line_num
       and  (     ir.rowid                           IS NULL
                 or jh.je_source                   IN ('Payables', 'Receivables')
                 or ir.rowid                           IN (  SELECT max(ir2.rowid)
                                                                     FROM    gl_import_references ir2
                                                                     WHERE   ir2.je_header_id = jl.je_header_id
                                                                     AND        ir2.je_line_num = jl.je_line_num))
       and (NVL(ir.reference_4,'0')) = to_char(acr.receipt_number)
      and ir.reference_3=to_char(ard.line_id)
      AND ACH.CASH_RECEIPT_ID=ACR.CASH_RECEIPT_ID
       AND ACH.CASH_RECEIPT_HISTORY_ID=ARD.SOURCE_ID
      AND ARD.SOURCE_TYPE='CASH'
and  jl.period_name IN ('FEB-09')

user605933 wrote:
Below query record are reparting as in one union i have to use ra_customer table and in another i have to not use ra_customer table, second condition is that i have to use ard.source_type='CASH', Please advice

Do you mean records are "repeating"?
Can you edit your post and put *{noformat}{noformat}* tags before and after your code so that the formatting remains and we can read it properly?
Also, please provide your database version number and some sample input data (CREATE TABLE statements with INSERT statements will do), plus an example of the expected output from that data.

  • Enhance additional fields for RDA

Hi,
What is the process to follow if I want to enhance a DSO with new fields when its loading process is RDA? What is the process when moving the change to production, and what is the process if I need historical loads for the RDA DSO?

    Hi,
What's your data flow?
Assuming the source field already exists at the source/PSA:
In the Dev system, add the required InfoObject to the DSO.
Create the proper transformations from the source.
If there is no historic data at the PSA/source, delete the PSA/DSO data.
Load the historical data from the source to the PSA and then to the DSO, and onward.
In production: stop the DSO data-flow process chain and delete the delta pointer at the DataSource level.
Move your changes to production.
Run a full load.
Run an init InfoPackage without data transfer.
Set the delta pointer and continue your loads.
At DSO level, deleting the data is not mandatory if your DSO key figures use the overwrite option, but memory-wise it is better to delete the old data and reload the historic data.
Think twice before you act.
    Thanks

  • Can I use full load for daily loads in LO Cockpit extraction?

I am using LO Cockpit extraction for 2LIS_11_VAITM with delta loads for the daily load, and I have the following doubt:
can I use a full load for the daily loads instead of delta in LO Cockpit extraction?
My understanding is that a full load always takes data from the setup tables in ECC, and the setup tables will not have delta data.
Please reply. Thanks.

You are right in your understanding that a full load (at least for the extractor at hand) brings data from the setup tables. You are also right that delta records are not loaded into the setup tables.
So if you plan to do full loads every day, you would have to fill the setup tables every day. At this juncture I have to remind you that filling the setup tables requires system downtime, to ensure that no changes or creations happen during the process.
Hence performing a full load is not a very good approach, especially when SAP has made the job easy by doing all the dirty work of identifying changes and writing them to the extraction queue.
Hope this helps!
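The distinction above (full load = frozen setup-table snapshot taken at init, delta = extraction queue that keeps being fed) can be sketched as a simplified Python illustration:

```python
def run_loads(setup_table, delta_queue, mode):
    """Illustrate (in a very simplified way) why daily full loads miss
    new documents: a full load reads only the setup tables, which are
    filled once at init time, while a delta load drains the extraction
    queue that the collective-run jobs keep feeding with new and
    changed documents."""
    if mode == "full":
        return list(setup_table)   # frozen snapshot from the init fill
    fetched = list(delta_queue)
    delta_queue.clear()            # the delta pointer moves on
    return fetched
```

A full run always returns the same init-time snapshot, while each delta run returns only what accumulated since the previous run, which is exactly the behaviour the reply describes.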

  • Move New Windows Inside Recording Area & More

    Hi All
    I have a few questions for all those kind people who
    contribute their time and experience here.
I am exploring Captivate and have trouble with my new windows
or dialogs jumping outside the recording area. I am creating movies
using 'Custom Size', and when I open new dialogs, they open
outside of the recorded area. I have the 'New Windows Inside
Recording Area' recording option checked, but it still happens. Is
there something I am missing, another setting perhaps, or a
combination of things I am doing or not doing to prevent this?
    Another problem I am having occurs when I am recording a CAD
    application. When I open new dialogs and then close them, I get
    this display problem that shows where the dialog just appeared. The
    problem is that my application is blank where the dialog box was -
    I cannot see any of the icons of commands. It only ever happens
    when I am recording, not when I am generally using the program. Is
    this a known phenomenon, maybe a video card acceleration or OpenGL
    issue perhaps?
    Thanks in advance - I look forward to hearing from anyone who
    can help me solve these issues.
    Cheers,
    spritc.

    Hi there "spritc.",
    To your first problem, that of those pesky windows opening
    outside the selected recording area, if you use "Application"
    instead of "Custom", I think you'll find you have better luck.
    To your second problem, yes, I think you figured it out, you
    have a problem with video. Try disabling hardware acceleration and
    see if that won't help with the "shadow of where it was" issue.
    That option is in "Control Panel > Display > Settings (tab)
    > Advanced" on most systems.
    Now, a lesson in Captivate (or you can call it a "rant" by an
    old-timer if you wish).
    Many new users have been so "sold" on Captivate's "automatic"
    features that they have forgotten what its real purpose is, and
    that sometimes there is more than one way to accomplish it. It was
    created to fill a need to capture background changes in
    applications, then add objects to those changed backgrounds to
    accomplish the true goal of electronic learning through
    demonstration and/or simulation. Notice that nowhere in that
    statement did I use the word "fast", nor "automatic", nor
    "effortless", nor did I add "without the effort of thought".
    "spritc.", this is not directed at you nor meant to demean
    you in any way. The statement above is a result of my own
    frustration with a lack of reality in advertising and promotion of
    the product - not just Captivate but all products. With that said,
    here is how you do what you want to do:
    About Windows: Be aware that all "Windows" applications
    share some helpful features. One is that when an application is
    opened, it should open in a window of the same dimensions, and at
    the same position as the position and dimensions it occupied when
    it was last closed. That includes - or is supposed to - additional
    supporting pop-ups and dialog boxes (windows) for the application.
    Another feature of Windows applications is that if the main
    application window is placed too closely to - say - the right edge
    of the screen (or the bottom, or the top) and is too close for an
    attached drop-list to appear to the right, it will automatically
    appear instead - on the left, or above its normal position. In
    other words, Windows compensates for bad positioning by adjusting
    its location of appearance on the screen. So much for issues
    created by and corrected by the Windows OS.
    About Captivate: When in "Record" mode, Captivate does take
    a snapshot each time it detects a change in the background (caused
    by what we call an "event"), thus automatically creating a series
    of snapshots that we know as a demonstration or simulation, as the
    case may be. But there is a key on the keyboard that is much
    ignored, but of greatest value - it is named the "PrtScr" key, and
    (by default - it can be changed to another key) pressing it will
    cause Captivate to "snap" a background image, in addition to those
    snapped automatically by the detection of an "event". Captivate
    also gives you a feature that is nearly unknown among new users ...
    it is called the "other option". That is (to give you a simple
    example), while the Recording Options dialog gives you the
    (default) action of automatically recording keystrokes, the act of
    un-ticking this box gives you the
    other option of
    not automatically recording keystrokes. There are many
    "automatic" functions that are the default selections during the
    recording of a Captivate movie, and each gives you the option of
    turning it off.
    Now, let's put these two groups of knowledge together. Before
    recording, know what you are going to "click" to create a
    background change image. Then do a "walkthrough" without Captivate
    running. Each dialog box that pops up should be adjusted for size
    and position during this walkthrough so that when Captivate's
    actual image-recording begins, each dialog will appear where you
    want it. Also, each menu item that opens outside the recording area
    should be tested to see where it will appear; if the "File" menu
    item you want is falling off the bottom of the recording area, try
    positioning the main application window lower on-screen . . . this
    should force the "File" menu items to appear to "drop up", instead
    of "dropping down", allowing the wanted item to appear properly
    within the recording area. Finally (for this excerpt), you should
    always have one eye on that "PrtScr" key mentioned above. There may
    be times when using that magic key to force the capture of a
    background, is the only way to grab an image of a "missed event" -
    for instance, the appearance of a tooltip.
    In conclusion, don't blindly buy what the advertising people
    are selling. Be prepared to use your imagination and even some
    preparation work in making your movies. There are very few things
that Captivate can't do in its line of work, but sometimes you
    have to do the "thinking" for it.
    Have a really great day!!

  • What are the settings for datasource and infopackage for flat file loading

Hi,
I'm trying to load data from a flat file into a DSO. Can anyone tell me what the settings for the DataSource and the InfoPackage are for flat-file loading?
Please let me know.
Regards,
Kumar

Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into a BI 7 system.
Uploading of transaction data:
Log on to your SAP system.
Transaction code RSA1 takes you to Modelling.
    1. Creation of Info Objects
• In the left panel, select InfoObjects.
• Create an InfoArea.
• Create the InfoObject catalogs (characteristics & key figures) by right-clicking the created InfoArea.
• Create new characteristics and key figures under the respective catalogs according to the project requirements.
• Create the required InfoObjects and activate them.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, because the system will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. An ODS has two tables, a new table and an active table; to move data from the new table to the active table, you have to activate the request after selecting the loaded data. Alternatively, the monitor icon can be used.
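    The extraction-tab settings described above (header rows to be ignored, data separator) amount to a plain CSV parse of the flat file. As a rough illustration only, not SAP code, here is a minimal Python sketch of that behaviour; the sample file contents and field names are invented:

    ```python
    import csv
    import io

    def read_flat_file(text, header_rows_to_ignore=1, separator=","):
        """Mimic the DataSource extraction-tab settings: skip the configured
        number of header rows, then split each remaining line on the separator."""
        reader = csv.reader(io.StringIO(text), delimiter=separator)
        rows = list(reader)
        return rows[header_rows_to_ignore:]

    # Invented sample flat file: one header row, two data records.
    sample = "MATERIAL,PLANT,QUANTITY\nM100,P01,5\nM200,P01,3\n"
    records = read_flat_file(sample, header_rows_to_ignore=1)
    print(records)  # [['M100', 'P01', '5'], ['M200', 'P01', '3']]
    ```

    If the "header rows to be ignored" setting does not match the actual file, the header line ends up in the PSA as a bad data record, which is a common cause of load errors with flat files.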
    Loading of master data in BI 7.0:
    For Uploading of master data in BI 7.0
    Log on to your SAP system.
    Transaction code RSA1 leads you to Modelling.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, because the system will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.

  • DB Connect DataSource PSA records and DSO records are not matching...

    Dear All,
    I'm working with SAP NetWeaver BW 7.3 and, for the first time, I have loaded from a DB Connect source system. I created a DataSource, pulled all records, and found 8,136,559 records in the PSA. When I designed and created a DSO with key fields 0CALDAY, Item No and Company Code, it transferred about 8,136,559 records but added only about 12,534. Similarly, the following InfoCube has about 12,534 records in its fact table. When I tried to reconcile the records with the source DBMS for a month, they could not be matched.
    1. What could be the reason behind this issue? Why am I unable to load the records/data correctly?
    2. Have I not specified the key fields of the DSO correctly?
    3. Is it possible to load records into the DSO without giving any field as a key field?
    4. How should I resolve this issue?
    5. Could this be a case for the DSO overwrite or summation function, and if so, how do I use it?
    Many thanks,
    Tariq Ashraf

    Dear Tariq,
    1. What could be the reason behind this issue? Why am I unable to load the records/data correctly?
    Ans: Check the transformation once. Is there a start routine, or are there direct assignments? What DTP settings have you made?
    Check the messages in the DTP monitor; you will surely find some clue. Also check whether any duplicate records are being detected if you are using semantic keys in your DTP.
    2. Have I not specified the key fields of the DSO correctly?
    Ans: Are the transformation key and the DSO key the same in your case?
    What kind of DSO is it? For a sales order DSO, for example, you take the order number as a key field. So you have to define the key fields according to the business semantics, I suppose. Do you agree?
    3. Is it possible to load records into the DSO without giving any field as a key field?
    Ans: I don't think so, as the keys you defined are what keep the data records unique, isn't it?
    4. How should I resolve this issue?
    Ans: Please check the points under Ans 1 and share your observations.
    5. Could this be a case for the DSO overwrite or summation function, and if so, how do I use it?
    Ans: DSO overwriting of key figures is useful when you have full loads in the picture. Are you always going to perform full loads?
    For reference, you might like to check this thread: Data fields and key fields in DSO
    Let's see what inputs the experts give.
    Thank you...
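    The large gap between transferred and added records discussed above is the classic symptom of DSO overwrite behaviour: every record that shares the same key values collapses into a single active-table row. As a minimal sketch of that behaviour, not SAP code, here is a Python simulation; the field names and values are invented for illustration:

    ```python
    def load_into_dso(records, key_fields):
        """Simulate DSO overwrite behaviour: records sharing the same key
        collapse into one active-table row, with the last record winning."""
        active_table = {}
        for rec in records:
            key = tuple(rec[f] for f in key_fields)
            active_table[key] = rec  # overwrite mode: later record replaces earlier
        return active_table

    # Invented sample: three transferred records, but only two distinct keys.
    records = [
        {"CALDAY": "20120101", "ITEM": "10", "COMP": "1000", "AMOUNT": 5},
        {"CALDAY": "20120101", "ITEM": "10", "COMP": "1000", "AMOUNT": 7},
        {"CALDAY": "20120102", "ITEM": "10", "COMP": "1000", "AMOUNT": 2},
    ]
    dso = load_into_dso(records, ["CALDAY", "ITEM", "COMP"])
    print(len(records), "transferred,", len(dso), "added")  # 3 transferred, 2 added
    ```

    Scaled up, this is how 8,136,559 transferred records can become only 12,534 added rows: the chosen key fields only have 12,534 distinct combinations, so either the keys need to be made finer-grained or the key figures should use summation instead of overwrite.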

  • Data Records are missing in between while loading from R/3 (ECC) to BI.

    Dear Experts,
    I have created a custom DataSource on a custom function module. This DataSource contains 600 fields. (I know it's a monster, and the splitting options are thin.)
    1) Validated the data using RSA3 in R/3; it showed the correct record count.
    2) Validated the data by debugging the FM; it still showed the correct record count.
    But while loading from R/3 to BI, records are missing.
    Various Scenarios load from R/3 to BI:
    1a) Loaded full load (78000 records) with all default data transfer settings.  PSA showed up with 72000 records (missing 6000) only.  Compared the Idocs vs data packets, both reconciled.
    1b) Loaded full load (78000) with modified settings (15000 KB / data packet).  PSA showed up with 74000 records (missing 4000) only.
    2a) Loaded with selection parameters (took a small chunk) (7000 records) with default data transfer settings.  PSA showed up only 5000 records (missing 2000).
    2b) Loaded with selection parameters (7000 records) with modified settings (15000 KB / data packet).  PSA showed up all 7000 records.
    3a) Loaded with selection parameters (took further small chunk) (4000 records).  PSA showed up all records regardless data transfer settings.
    Also, please look at this piece of code from the function module:
    IF l_wa_interface-isource = 'ZBI_ARD_TRANS'.
      l_package_size = l_wa_interface-maxsize DIV 60.
    ENDIF.
    I really appreciate your advise or help in this regard.
    Thanks much,
    Anil
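    As a rough illustration of what that ABAP fragment does, here is a Python rendering of the packet-size calculation (the constant 60 presumably stands for an assumed record width; the function and variable names here are invented). Note that in this simulation every record lands in exactly one packet, so the splitting arithmetic itself cannot drop records; losses like those described above usually come from how a custom FM extractor tracks its cursor or offset across successive calls:

    ```python
    def split_into_packets(records, maxsize, bytes_per_record=60):
        """Mirror the FM logic: derive a package size as maxsize DIV 60,
        then emit the records in consecutive chunks of that size."""
        package_size = maxsize // bytes_per_record  # l_package_size = maxsize DIV 60
        if package_size == 0:
            package_size = 1  # guard: always make progress
        return [records[i:i + package_size]
                for i in range(0, len(records), package_size)]

    packets = split_into_packets(list(range(100)), maxsize=1800)
    print([len(p) for p in packets])  # [30, 30, 30, 10]
    ```

    Because the chunks partition the input exactly, comparing the sum of packet sizes against the RSA3 record count is a quick way to tell whether records vanish in the split or in the extractor's own state handling.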

    Hi,
    Which module do you want?
    If it's SD (for example), the steps are:
    1> In the AWB go to "Business Content"
    2> Go to "InfoProvider"
    3> Under the InfoArea select the SD cubes
    4> Drag the related cubes and ODS to the right panel
    5> Set the grouping option to "In Data Flow Before and Afterwards"
    6> Install the collected objects
    Go to R/3:
    7> Use Tcode RSA5 to transfer all the DataSources for the SD module
    Go to BW:
    8> Right-click on the source system and choose "Replicate DataSources"
    [DataSources|http://help.sap.com/saphelp_nw70/helpdata/en/3c/7b88408bc0bb4de10000000a1550b0/frameset.htm]
    Edited by: Obily on Jul 10, 2008 8:36 AM
