Not all data is loaded from PSA

Hi,
  I have loaded 0MATERIAL_ATTR data to the master data object 0MATERIAL.
After the InfoPackage execution, the PSA received all the records available in R/3.
However, after executing the DTP, not all of the PSA fields arrived in 0MATERIAL.
The fields I mapped in the transformation have data in the PSA, but not in the master data.
I also restricted the fields in the InfoPackage, but the DTP does not extract only those PSA fields; it extracts a lot of fields. How can I restrict this in the DTP?
And if I want delta records next time, what do I have to do?
Thanks in advance

a.  Check your transformation - make sure the key field is set correctly.
b.  There is no need to restrict fields in the DTP unless you need a language filter or something similar.
c.  Delta records will come through fine if your DTP is set to delta extraction.
Works for us.
cheers,
Sarah

Similar Messages

  • Data not loading from PSA to infocube

    Hi,
    I'm facing a problem while extracting data from the PSA to an InfoCube (on the APO side). We are using 3 InfoCubes and extracting the data in such a way that each InfoCube holds only one unit of measure. With the help of an ABAPer, we have written a start routine in the transformation so that, for example, only products with unit of measure KM come into the cube. But no data is coming into the cube, and it shows the following errors.
    From the ABAP side:
    The error is generated at the start routine method, where the code "DELETE SOURCE_PACKAGE where VRKME NE 'KM'." was added. The following errors are displayed:
    1.) Call of the method START_ROUTINE of the class LCL_TRANSFORM failed; wrong type for parameter SOURCE_PACKAGE
    2.) Exception CX_RS_STEP_FAILED logged
    Please help me out with this.
    Thank you
    Regards,
    Raj

    Could you please explain it in more detail?
    We are not passing anything to DATA_PACKAGE, nor using it. In the method call we pass <_yt_SC_1> to the parameter SOURCE_PACKAGE, and inside the method we have written the delete statement "DELETE SOURCE_PACKAGE where VRKME NE 'KM'."
    Do we need to change the code to "DELETE DATA_PACKAGE where VRKME NE 'KM'."?
    Regards,
    Raj
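
    For reference, a minimal sketch of how the filter usually sits in a BI 7.x start routine (in 3.x transfer/update rules the parameter is DATA_PACKAGE instead); if the signature already matches and the error persists, reactivating the transformation may regenerate the program consistently:

    METHOD start_routine.
      " Keep only records whose unit of measure is KM. SOURCE_PACKAGE is
      " the CHANGING parameter of the generated class, typed with the
      " generated table type _ty_t_SC_1; VRKME must be a field of the
      " source structure for this to compile.
      DELETE source_package WHERE vrkme <> 'KM'.
    ENDMETHOD.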

  • Master data attribute not updating from PSA

    Hi all,
        I am on 3.5 and have a custom attribute extension to the 0CUSTOMER object.  This attribute is populated in R/3 and in the PSA, but not in the object itself or in the master data table of the object (/BI0/PCUSTOMER).  Has anyone seen this before, or can anyone recommend how to troubleshoot this?
       Thanks.
    Dave

    Hello David,
    Please follow what Sudharshan and Dennis suggested; along with that:
    1. Check which type of attribute it is (time-dependent or time-independent).
    2. Then search accordingly in the P or Q table of the customer InfoObject.
    3. Try to search in the M view of the customer InfoObject (/BI0/MCUSTOMER) as well; a sketch for checking the P table follows below.
    An attribute change run for the particular master data is a must in order to see the data in the tables.
    4. Also check the InfoPackage settings to see whether the load goes only up to the PSA or further (just to make sure the data is actually loaded into the master data tables). My suspicion is that the data may still be in the PSA, not yet processed to the InfoObject, while you are searching the master data tables ;-).
    Please let us know your findings.
    Thanks,
    Umashankar
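
    A minimal sketch of such a check from SE38, assuming a time-independent attribute; the field name /BIC/ZZ_CUSTATTR is purely illustrative and stands for the real custom attribute:

    DATA lv_count TYPE i.

    " Count active P-table records where the custom attribute is filled.
    SELECT COUNT( * ) FROM /bi0/pcustomer INTO lv_count
      WHERE objvers = 'A'
        AND /bic/zz_custattr <> space.

    WRITE: / 'Active records with the attribute filled:', lv_count.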

  • Data is not loaded into PSA

    When I schedule the InfoPackage, the request finishes successfully saying 200 records were received in the PSA, but no data is actually there in the PSA.

    Hi Ambika,
    This could happen if the PSA table is not active - check whether it is active, and if not, replicate the DataSource from the source system.
    Can you provide the detailed message from scheduling the InfoPackage? There could be a loading issue in the extractor settings, or an invalid selection parameter.
    Regards
    Satish

  • PSA data change not loaded to the target.

    Hi all,
    We have the data in the PSA (it is master data). While transferring the data from the PSA to the target, the system throws an error. On analysing the error, we found that some characters in the data were wrong; these were corrected in the PSA. On reloading to the target after the correction, the system ignores the corrected records and loads only the other records.
    What could be the reason for this behaviour?
    Can the experts help?
    Regards,
    M.M

    Dear Mr. Goud/Murali,
    Thank you for your immediate reply and instructions.
    Mr. Goud,
    The PSA QM status is green, but even then the corrected record is not loaded to the target.
    Mr. Murali,
    As per your information, we do not have an error DTP - I have only one DTP and we are on BI 7.00. I have deleted the request from the target, corrected the erroneous record, and am trying to load only that record using the filter option in the DTP - still the data is not loaded.
    Can you suggest a method to adopt? I am not able to understand why the corrected record is not loaded. Is it that a manually corrected record in the PSA will not be loaded into the target?
    Kindly clarify.
    Regards,
    M.M

  • Do not Extract from PSA but Access Data Source (for Small Amounts of Data)

    Hi Experts,
    In the DTP, the above option is available for full loads for certain extractors, but not for others - particularly certain HR extractors.
    Is there a way to make it available for HR extractors? Is there a setting that needs to be updated in ECC or in BI?
    Thank you for your help!

    Hi,
    There is no special setting for this. Please see the detailed description:
    Data is not extracted from the PSA for the DataSource; it is requested from the data source directly at DTP runtime.
    Use
    You use this mode for small data sets and full uploads, for example small sets of master data. With file source systems, note that the file has to be available on the application server.
    Dependencies
    You do not have to create an InfoPackage in order to extract data from the source.
    Data in the data source is accessed in "direct access mode". This has certain consequences, especially if you are extracting data from SAPI systems:
    - Data is extracted synchronously. This places a particular demand on main memory, especially in remote systems.
    - The SAPI extractors may respond differently than during an asynchronous load, since they receive their information via direct access.
    - SAPI customer enhancements are not processed. Fields that have been added using the append technology of the DataSource remain empty. The exits of enhancement RSAP0001 (EXIT_SAPLRSAP_001, EXIT_SAPLRSAP_002, EXIT_SAPLRSAP_004) do not run.
    - If errors occur during processing in BI, you have to extract the data again, since the PSA is not available as a buffer. This means that deltas are not possible.
    - In the DTP, the filter only contains fields that the DataSource allows as selection fields; with an intermediary PSA, you can filter in the DTP by any field. (A sketch for checking a DataSource's selection fields follows below.)
    Regards,
    Kams
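
    A minimal sketch of that check, run in the source system; it assumes the standard ROOSFIELD layout, and the DataSource name 0HR_PY_1 is illustrative:

    DATA: lt_fields TYPE STANDARD TABLE OF roosfield,
          ls_field  TYPE roosfield.

    " List the fields of a DataSource that are flagged as selection
    " fields, i.e. usable as DTP filters in direct-access mode.
    SELECT * FROM roosfield INTO TABLE lt_fields
      WHERE oltpsource = '0HR_PY_1'
        AND objvers    = 'A'
        AND selection <> space.

    LOOP AT lt_fields INTO ls_field.
      WRITE: / ls_field-field, ls_field-selection.
    ENDLOOP.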

  • Difference between Reconstruction and regular load from PSA

    Gurus,
          What is the difference between reconstruction and a regular load from PSA into a data target? I understand that during reconstruction we are not loading data from R/3.
          I have a cube with 3 requests in it. I deleted all the requests, went to the PSA, selected one of the requests and chose "Reconstruct from this". I went to the cube and could see the selected request among the reconstruction requests of the cube. I pressed the Reconstruction/Insert button there.
          Is this the procedure for reconstruction? If yes, how is it different from a regular load from PSA?
    Thanks

    Function-wise, both do the same thing, but there is a difference in the data flow, since the PSA is used as a staging area. If we choose in the first place to load only to the PSA, analyse the load, do manual editing if needed and then load to the data target, then of course the reconstruct option is not available. Alternatively, we can load to the data target in parallel with, or sequentially after, loading into the PSA. In that case, having already loaded into the data target, if we somehow need to delete and reload, we can either reconstruct the request or run a normal load from the PSA.

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck:
    - Selected "Change log" and "Get one request only" and ran the DTP, but 0 records got updated in the InfoCube.
    - Selected "Change log" and "Get all new data request by request", but again 0 records got updated.
    - Selected "Change log" and "Only get the delta once"; in that case all the delta records loaded to the InfoCube as they were in the DSO, but it gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, the data loads from the DSO to the InfoCube.
    Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. Select "Change log" and "Get one request only" and run the init - it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Urgent:::: Data is not loaded for a particular InfoObject in ODS

    Hi All,
    We have loaded data into an ODS (0PRC_DS01) on the development server. All the InfoObjects were loaded successfully and the reports were working well.
    When we transported it to production, the data load was taking a very long time (15,000 records in 9 hours). So we did two things to improve the loading speed:
    1) We created an index on the fields in the WHERE clause (RCLNT, RLDNR, RYEAR, DOCNR, POPER) of the JVSO1 table.
    (JVSO1 is the R/3 table from which the key data comes into DataSource 0EN_JVA_2.)
    2) We updated the optimizer statistics.
    Now the problem is that data is not loaded to one particular InfoObject, JOINT VENTURE PARTNER, in the DSO, although it was loaded successfully in development.
    Please help us. We will assign points for helpful answers.

    Hi, check in the transfer and update rules whether you mapped the fields to the target, and also whether you have a routine there. Also check whether the data for that object is coming from the source.
    Khaja

  • Loading from PSA

    Hi All,
    I have edited the PSA and made some changes to the data. I want to load that to a cube now. Could anybody tell me the steps to load from the PSA to the cube?
    Thanks
    Manish

    Hi,
    If you have already loaded data into the InfoCube through the PSA and want to reload the same request from the PSA, you need to delete that request from the InfoCube first.
    After that, go to RSA1 --> PSA tab --> go to the respective PSA --> go to the respective request --> right-click --> Schedule the Update.
    It will take you to an InfoPackage, where you select the data load options, such as the particular InfoProvider. Only if you have deleted the specific request from the InfoCube will the InfoCube be available to select.
    Just schedule it, and it will load the cube.
    Hope it's clear.
    Ram

  • Page data does not load with goButton? (new window of identical page)

    Hi guys. I ran into a problem. Essentially, I need to print a .jsff page, but before I do, I have to alter its format a little bit. The solution I decided to use was to create a new .jspx page containing almost exactly the same content. I even copied the data bindings and everything. Finally, I created a goButton on my original page like so:
    <af:goButton text="Print as BOSSMON" destination="SubmitPrint.jspx?org.apache.myfaces.trinidad.agent.email=true"
                        targetFrame="_blank">
    </af:goButton>
    This opens my page nicely in the format I want. But... there is no data on the page! It just opens a frame of the page. Can someone help me resolve this?
    Otherwise, is there any other technique I could use to accomplish my goal?

    Hi guys. Thanks to everyone for their suggestions. I finally got a working solution, something totally different from the suggestions.
    My goal was to print a .jsff page nicely. But before I could do that, I needed to modify some data on the page. I also wanted to do this in as few clicks as possible. I tried the goButton technique, but the binding data would not load; I couldn't figure out why.
    Nonetheless, I finally managed to make my own version (albeit not that neat!). I accomplished this by essentially making a template which automatically opens in showPrintablePageBehavior format. I tried to follow a bunch of online tutorials/blogs, but nothing worked for me. Here are the steps I used:
    1) Create a new .jspx page.
    2) Copy the entire source of the .jsff page (the one which needs to be printed) into the new template .jspx page. (You do have to modify the beginning parts by adding <af:document>, <af:form>, etc.)
    3) Create a pageDef for the template page and copy the entire pageDef as well (you do have to change the name of the pageDef).
    4) Modify your print template to your liking.
    5) Add a task flow rule that links the original .jsff page to the new template page.
    6) Add a commandButton on the original .jsff page which opens this printing page in a new window (remember to use "dialog:....."). Here is mine:
    <af:commandButton text="Open Printer Friendly Version" id="printer_popup" action="dialog:edit_print" useWindow="true"
                                  windowEmbedStyle="window" inlineStyle="display:block;"
                                  windowHeight="100" windowWidth="100">
    </af:commandButton>
    7) Now, in the printing template page, add a command button which simply has the showPrintablePageBehavior tag within it, as well as a client listener to invoke some JavaScript. Here is what I used:
    <af:commandButton text="Printer Page" id="sha_print">
        <af:clientListener method="do_loading" type="focus"/>
        <af:showPrintablePageBehavior/>
    </af:commandButton>
    8) At the beginning of this page you will have an <af:document> tag. Modify it so it sets focus to our print button above when the page loads, like so:
    <af:document id="d2" initialFocusId="sha_print">
    9) Now add a resource tag to your page which holds the "do_loading" JavaScript method:
    <f:facet name="metaContainer">
        <af:resource type="javascript">
              function do_loading(event) {
                  var target = document.getElementById('sha_print');
                  target.click();
                  top.close();
                  //alert("got focus sir!");
              }
        </af:resource>
    </f:facet>
    That's it! What this does is: when you click the command button to open the printer template, the template page opens and, as it opens, automatically sets focus to the printer button via the document tag. The printer button has a listener which activates when it receives focus (in our case, as soon as the page loads). This listener invokes the JavaScript method, which programmatically clicks the printer button and closes the window. What you're left with is a printer-friendly page of your modified page!
    All the data is present there too!
    If anyone has any questions/comments, please do ask.

  • Loading from PSA with different process chains at a different time

    Hi,
    We currently have an issue with loading the same data via 2 separate process chains from the same DataSource at different times.
    Simplified, this is what is happening:
    Process Chain 1:
    Step 1a = Load 2LIS_13_VDITM into ODS "A"
    Step 1b = Load 0FI_GL_4 into ODS "B"
    At a later time (when PC1 has finished) =>
    Process Chain 2:
    Step 1 = Load ODS "A" to ODS "C"
    Step 2 = Load 0FI_GL_4 (from PSA) to ODS "C"
    Because of dependencies, step 2 of PC2 has to be loaded AFTER step 1 (with activation) of PC2.
    We cannot change Process Chain 1.
    What I want to do is load the data from 0FI_GL_4 only to the PSA in Process Chain 1 and pick it up again in Process Chain 2. This functionality is available in NW2004s, but not in our version, BW 3.0B.
    Does anyone have an idea how to do this?
    Thanks,
    Pascal

    Hi Dinesh,
    Thank you for your quick answer.
    The problem is I cannot change PC1, but 0FI_GL_4 is loaded with the option "to PSA and then immediately into the target", so the data is available in the PSA.
    When I try to load data from the PSA into ODS "C" in PC2, I get an error saying that the corresponding InfoPackage (from PC1) is not available in this process chain and thus cannot be loaded from the PSA.
    Regards,
    Pascal

  • Data is not uploaded from the dso to the cube

    Dear Experts,
    In one of my process chains, the data is not uploaded from the DSO to the cube.
    I have tried to upload the data month by month as well, but the load still gets stuck after a certain number of data records.
    I have also recreated the indexes.
    When I check in DB02, the tablespace is shown as 18491 MB, with 97% of the space used.
    Please suggest.

    Hi,
    I didn't quite get your point about recreating the index before loading.
    Basically, the process should be: Delete index --> Load --> Create index (a sketch follows below).
    You mentioned that 97% of the space is in use - please check that with the Basis team.
    Also, check in SM37 whether there is any job that has been running for a long time without progressing, or one that you think you don't need, and kill that job.
    Regards,
    Debjani
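
    As a minimal sketch of that sequence from SE38, assuming the standard BW index function modules; the cube name ZSALES01 is illustrative:

    " Drop the secondary indexes of the InfoCube before the load.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = 'ZSALES01'.

    " ... run the data load here (e.g. via the process chain) ...

    " Rebuild the indexes after the load.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
      EXPORTING
        i_infocube = 'ZSALES01'.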

  • Data is not coming from top to bottom in column

    Hi,
    I am moving my data from itab to itab1.
    In itab1, the data does not start from the top of the column for each a/c no.:
    A/CNO|AB|DZ|RV|SA
    00001|00 |000|00 |10
    00001|00 |000|00 |15
    00001|00 |000|020|000
    00002|00 |15 |000 |00
    00002|21 |00 |000 |00
    00002|00 |25 |000 |00
    00002|00 |00 |000 |17
    What should I do to make my data start from the top of the column?
    Thanks in advance.
    Edited by: sanket sagar on Mar 17, 2011 8:14 AM

    Hi,
    Let me add some more comments in your code:

    LOOP AT it_bkpf INTO wa_bkpf.
      CLEAR wa_final.
      " Do not check SY-SUBRC directly after LOOP - it is meaningless
      " here. Use a real condition instead if you need one, e.g.:
      " IF wa_bkpf-belnr IS NOT INITIAL.
      MOVE-CORRESPONDING wa_bkpf TO wa_final.

      READ TABLE it_bseg INTO wa_bseg WITH KEY belnr = wa_bkpf-belnr
                                               bukrs = wa_bkpf-bukrs
                                               gjahr = wa_bkpf-gjahr.
      IF sy-subrc = 0.
        " Check the READ result before using wa_bseg.
        IF wa_bseg-shkzg = 'H'.
          " Credit postings get a negative sign.
          wa_bseg-pswbt = wa_bseg-pswbt * -1.
        ENDIF.
        MOVE-CORRESPONDING wa_bseg TO wa_final.
      ENDIF.

      APPEND wa_final TO it_final.
    ENDLOOP.

    I think the above loop will get all the BELNRs and their related info into the final internal table.

    SORT it_final BY saknr blart.

    LOOP AT it_final INTO wa_final.
      IF wa_final-blart = 'AB'.
        MOVE wa_final-pswbt TO wa_final-pswbt1.
      ENDIF.
      " ... and the same for all other document types ...
      MODIFY it_final FROM wa_final.   " write the change back
    ENDLOOP.

    SORT it_final BY hkont.

    What is the structure of the internal table?
    If my understanding is correct, you want to populate a table as below: for every HKONT, the document types (BLART) and their corresponding amounts, like this (a COLLECT-based sketch follows below):
    HKONT1 - AB - 1000
    HKONT1 - BC - 2000
    HKONT3 - CD - 2000
    Please correct me if your problem is something different.
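
    If the goal is one summed amount per HKONT/BLART pair, here is a minimal sketch using COLLECT; the structure and field names (ty_sum, and the source fields of wa_final) are illustrative, not taken from your program:

    TYPES: BEGIN OF ty_sum,
             hkont TYPE hkont,
             blart TYPE blart,
             pswbt TYPE pswbt,
           END OF ty_sum.

    DATA: it_sum TYPE STANDARD TABLE OF ty_sum,
          wa_sum TYPE ty_sum.

    LOOP AT it_final INTO wa_final.
      wa_sum-hkont = wa_final-hkont.
      wa_sum-blart = wa_final-blart.
      wa_sum-pswbt = wa_final-pswbt.
      COLLECT wa_sum INTO it_sum.   " sums PSWBT for equal key fields
    ENDLOOP.

    SORT it_sum BY hkont blart.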

  • Impact on transactional Loads if Master data is not loaded

    Scenario for your reference:
    The loads to ZPO_LINE from PC3 had been failing for the past 20 days. The fix was applied on April 21.
    I need help to determine the effects of not loading ZPO_LINE for the past 20 days, and to create a detailed plan for the data loads required.
    If the master data is not loaded for 20 days, will it affect the transaction loads that happened during those days?
    And how can I find out the impact and correct the transaction data if it does?
    Can anyone help me with this?

    Hi,
    If I understand your scenario, the master data has not been updated for the last 20 days, but the transaction data was loaded without any interruption.
    In such a scenario, the transactional loads will only be affected if some field undergoes a transformation that looks up the master data.
    So, first load the master data and run the attribute change run process (a sketch follows below).
    After this, if the transaction data is loaded in full update mode, you don't need to do anything, as the data will be refreshed with the correct values in the next load.
    If delta loads are there, you might need to perform a full repair.
    Regards,
    Rahul
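
    A minimal sketch of triggering the change run from SE38, assuming the standard program RSDDS_AGGREGATES_MAINTAIN; the selection parameter name CHABASNM is an assumption (verify it on the program's selection screen), and 0MATERIAL is an illustrative InfoObject:

    " Trigger the attribute change run for one InfoObject by submitting
    " the standard change-run program (the same work the process-chain
    " change-run step performs).
    SUBMIT rsdds_aggregates_maintain
      WITH chabasnm = '0MATERIAL'
      AND RETURN.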
