Strange Issue in PSA Table

Hi,
I have deleted all the requests in the PSA, but when I check the same PSA table in tcode SE11, it still shows entries.
This seems strange, and it is the first time I have faced this situation.
P.S. There are no requests in PSA currently.
Thanks,
Karan

Hi,
It's not a strange issue. You may have deleted only the latest request.
While deleting requests in the PSA, you need to click on "Display All Requests"; it will show all existing requests in the PSA, and then you can delete them all.
By default, the PSA tree shows only the requests from the last week.
Otherwise, take the PSA table name and delete the whole contents using SE14 (or the ABAP sketch below).
Afterwards you can check again in SE11.
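As an alternative to SE14, here is a minimal ABAP sketch of the same cleanup; the table name /BIC/B0000123000 is a hypothetical placeholder, so look up your actual PSA table name in SE11 first. Note that deleting rows directly bypasses the BW request administration, so only do this when the requests themselves are already deleted:
REPORT z_clear_psa_table.
* Hypothetical PSA table name - replace with the one from SE11.
DATA lv_tabname TYPE tabname VALUE '/BIC/B0000123000'.
* Remove all rows of the PSA table via a dynamic table name.
DELETE FROM (lv_tabname).
WRITE: / sy-dbcnt, 'rows deleted from', lv_tabname.
COMMIT WORK.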
Thanks

Similar Messages

  • Strange issue with ADF table in chrome browser

    I have an ADF table that should display 23 rows, but only 20 rows are visible in the Chrome browser; other browsers like IE and Firefox display all 23 rows correctly. I have used a default ADF table with drag & drop behaviour. All 23 rows are exported correctly to Excel with the export-to-Excel behaviour, and inspecting the page source also shows all the rows in Chrome, so only the display in ADF is broken in the Chrome browser. We're having a production issue with this; any ideas are appreciated.
    Thanks,
    Surya

    Hi All,
    Is this issue fixed yet? There are a couple of threads reporting this issue, and the original thread has been archived. It is a real issue, and it remains an issue. The Chrome browser cuts off the last row of a table in the display; IE displays the row correctly. I am working with JDev 12.1.2 and I am building an application using ADF tables. Without exception, on every page that has one, the last row of the table is cut off from display in a very ugly way, and you cannot scroll down to display the full row. I have tried wrapping the table in a Panel Collection: same result. I have tried setting the height of the table: same result. I have tried surrounding the table with a PanelGroupLayout component (layout set to scroll): same result. I have even tried surrounding the table with a PanelHeader component, with Type set to both default and Stretch: yes, you guessed it, same result! I've even put the table in the middle of a PanelStretchLayout component, but the last row is always cut off.
    This should be easy for you to reproduce: just drop a data control on an ADF page and select a table. When you view it in the Chrome browser, you will see what I'm talking about. I'm using Google Chrome version 31.0.1650.63 m.
    I have experimented with AFStretchWidth and AutoHeightRows (as suggested by previous threads), nothing seems to work.
    Here's another suggestion: if the forum would allow you to insert an image, I could actually show you what I'm talking about. Food for thought perhaps?
    Best regards,
    Nigel
    "Life's too short not to use ADF"

  • Strange issue with Multi-Table LTS and report filters (10g)

    Hi,
    I am having some trouble with the following scenario:
    I have a large flattened table that is being used as both Fact and Dimension in the BMM layer. I also have another table that contains some supplementary measures needed by some reports.
    Logical Fact table has an LTS that consists of the 2 tables which are joined on 2 columns (eg, a.col1 = b.col1 and a.col2 = b.col2, etc). The columns used in the join also exist in the logical dimension which uses the same flattened table. I use these columns in my report filters.
    I have created some Fact measures, some from each source inside the LTS.
    What's happening is that when I am using this model in Answers, the query being generated is applying my filtering conditions to both tables, which seems unnecessary to me since they are joined on these columns.
    i.e.,
    select *
    from table1 a, table2 b
    where a.col1 = b.col1
    and a.col2 = b.col2
    and a.col1 = 'Value 1'
    and a.col2 = 'Value 2'
    and b.col1 = 'Value 1'
    and b.col2 = 'Value 2'
    The last two lines in the query above should not be happening. I have not mapped these columns anywhere in the BMM to table2 (other than physical layer join), only to table1.
    For clarification, it's not hurting the end results from what I've seen, but it is adding additional filtering to the query that can increase overall cost, which I'm trying to avoid.
    Any ideas/suggestions?
    Thanks
    Edited by: odinsride on Jan 31, 2012 5:03 PM

    Hi,
    I think this is a known issue: the BI Server pulls in both tables involved in the LTS for any query (yes, even if the query uses only one column from one of the tables in the LTS), and the filters come along with them.
    In this case, what I would suggest is to create 2 LTSs (probably 3): one for each physical table being joined in the LTS, and one for both of them joined (inner/left/right as appropriate).
    With this setup, when a report uses columns that map to only one of the physical tables, the BI Server can choose the corresponding single-table LTS. However, if a report has columns from both tables, the BI Server will choose the LTS with the join.
    Hope this helps.
    Thank you,
    Dhar

  • CVC creation - Strange issue with Master data table of 9AMATNR

    Hi Experts,
    We have encountered a strange issue with the master data table (/BI0/9APMATNR) of InfoObject 9AMATNR.
    We have a BADI implemented to check for valid characteristics before creation of the CVC using transaction /SAPAPO/MC62. This BADI performs a SELECT on the master data table /BI0/9APMATNR and returns no value, but the material actually exists in the table (checked through SE16).
    Now we go into InfoObject 9AMATNR, open the Master Data tab, and from there open the master table /BI0/9APMATNR and activate it. After activating the table, it is read by the SELECT statement inside the BADI (strange!) and the CVC can be created.
    Ideally it should not even allow us to activate the SAP standard table /BI0/9APMATNR. I observed that in the technical settings of this table, single-record buffering is switched on. (But as far as I know, a buffer gets refreshed every 2 to 4 minutes, not after 2 days or so.)
    Your expert comment is valuable to us. Thanks.
    Best Regards,
    Chandan Dubey

    Hi Chandan,
    Try using a WAIT statement of 5 seconds before your SELECT statement.
    I'm not sure whether this will work; anyway, check it and let me know the result.
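    If the single-record buffering you noticed is indeed serving stale data, a more direct option is to bypass the table buffer in the BADI's read. A minimal sketch (the field name MATNR and the parameter IV_MATNR are assumed placeholders; check the actual fields of /BI0/9APMATNR in SE11):
    " Read the master data table directly from the database, skipping
    " the single-record buffer that may be serving stale data.
    DATA ls_pmatnr TYPE /bi0/9apmatnr.
    SELECT SINGLE * FROM /bi0/9apmatnr
      BYPASSING BUFFER
      INTO ls_pmatnr
      WHERE matnr = iv_matnr.        " iv_matnr: hypothetical parameter
    IF sy-subrc <> 0.
      " characteristic value not found in master data
    ENDIF.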
    Regards,
    Siva.

  • Data not received in PSA Table

    Hi guys,
    I'm facing the issue below while loading master data attributes from the source system to the BI system.
    I'm loading data from 0VENDOR_ATTR to the 0VENDOR object.
    Only half of the records are processed, and then the load hangs.
    In the monitor (Administrator Workbench) it shows (60434 from 157912 records).
    Below are the details of the issue:
    Data not received in PSA Table
    Diagnosis
    Data has not been updated in PSA Table. The request is probably still running or there was a short dump.
    Procedure
    In the short dump overview in BI, look for the short dump that belongs to your data request. Make sure the correct date and time are specified in the selection screen.
    You can use the wizard to get to the short dump list, or follow the menu path "Environment -> Short dump -> In Data Warehouse".
    Removing errors
    Follow the instructions in the short dump.
    Please guide me with your expertise.
    cheerz,
    raps.

    Hi Rajeev,
    Check the job in the background (SM37); you can analyze it from there.
    Check the short dump in ST22, and also check system performance.
    If you are facing this problem for the first time and the job is still active in SM37, cancel the background job and set the request status to red.
    Then repeat the InfoPackage; it should run successfully.
    This is likely due to a performance issue or an RFC problem in your system.
    Regards,
    Venkatesh
    Edited by: Venky1903 on Aug 15, 2011 2:09 AM

  • Data not received in PSA Table  0ORGUNIT

    Hi,
    We have an issue with a text load to 0ORGUNIT with around 35 lakh records. This particular load takes a lot of time and results in a short dump with the message 'Data not received in PSA Table'. But when I set the request to red and restart it manually, it succeeds within 15 minutes. The load starts at 2:30 UK time and fails; I restart it at around 9 o'clock and it succeeds. The same thing has been happening for the past 7 days. Can anyone tell me the likely reason and a solution for this? Is it a temporary memory issue? As the load is to an InfoObject, it has nothing to do with indexes.
    Thanks in advance
    K

    You can change the size of the data packet.
    Open the InfoPackage.
    From the top menu choose "Scheduler >> DataS. Default Data Transfer".
    The default settings of the source system will be shown.
    Keeping those in mind, try to reduce the "Max Size" and "Number of Data Packages per Request". You might need to try this a couple of times with smaller values each time until the issue is resolved.
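    For reference, the source-system defaults shown on that screen come from the data-transfer control parameters table ROIDOCPRMS on the source system (a hedged sketch; the field names MAXSIZE and MAXLINES are assumptions to verify in SE11):
    REPORT z_show_transfer_defaults.
    " Display the control parameters behind the "DataS. Default
    " Data Transfer" screen (field names assumed).
    DATA ls_prms TYPE roidocprms.
    SELECT SINGLE * FROM roidocprms
      INTO ls_prms
      WHERE slogsys = 'E01'.       " hypothetical source system ID
    WRITE: / 'Max size (kB):', ls_prms-maxsize,
           / 'Max lines:', ls_prms-maxlines.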
    Regards
    Anujit Ghosh

  • How to correct the data in the psa table?

    1Q. There are a lot of invalid characters in an InfoPackage of, say, 1 million records. It takes a lot of time to check each and every record in the data package (PSA) and correct it. I think there is a more efficient way to resolve this issue, namely going into the PSA table and correcting all the records there. Is that right, and if yes, how do I do it?
    2Q. Say there are 30 data packages in the request and only data package 25 has the bad records. If I correct the data in the PSA and push it to the data target, it will process all the data packages one by one, which takes a lot of time and delays our process chain job that depends on this load. Can I manually process just this one data package? If yes, how?
    3Q. When I have successfully corrected all the bad records in the data package and pushed it from the PSA, the request does not turn green, and I have to turn it green manually in the data target after verifying that no data package has bad records; and it is a delta update. Is my process right? As it is a delta, what are the pitfalls I have to watch for? The next step after this is to compress the request, which is very dangerous because this basic cube has a lot of history and would take a long time, probably weeks, to reload. What precautions should I take before I turn the request green in the data target?
    Thanks in advance! And I know how to thank SDN experts, by assigning points.

    Hi,
    1Q. Maintain the permitted characters in the filter table using tcode RSKC, and also write an ABAP routine to filter out the invalid characters (see the sketch below).
    2Q. For the incorrect data packet, you can right-click on the data packet in the monitor details tab and choose "Update manually". That way you don't need to reload the entire request again.
    3Q. When you reload the request or update an individual data packet again, the request should automatically turn green; you don't have to turn it green manually. The pitfall is that if you turn a delta request green, you risk losing data and corrupting the delta. Best practice is never to turn a request green manually. Even if you have compressed the requests, you can use selective deletion to delete the data and then use an InfoPackage with the same selections that you used for the deletion to load the same data back.
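    A minimal sketch of such a routine (assuming a character-type field named RESULT, as in a transfer rule routine; the permitted set below is illustrative and should match what you maintain in RSKC):
    " Replace every character that is not in the permitted set with a
    " space, so the record passes the BW invalid-character check.
    CONSTANTS lc_allowed TYPE string VALUE
      ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.
    DATA: lv_len TYPE i,
          lv_off TYPE i.
    lv_len = strlen( result ).
    WHILE lv_off < lv_len.
      IF result+lv_off(1) NA lc_allowed.   " NA: contains none of
        result+lv_off(1) = ' '.
      ENDIF.
      lv_off = lv_off + 1.
    ENDWHILE.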
    Cheers,
    Kedar

  • Issues with Nested Tables and Adobe Designer

    Hi,
    I have some strange issues when trying to create a nested dynamic table with SAP data in Adobe Designer.
    My outer table has article items, and nested within each article are its charges. For instance, the table contains this data:
    DATA
    |->Article1
    |--->charge111
    |--->charge211
    |
    |->Article2
    |--->charge122
    |--->charge222
    Now I am trying to display the data in an Adobe Designer table. But if I create a table with an article row and a repeating charge row, all charges are displayed under the first article.
    This is the result:
    DATA
    |->Article1
    |--->charge111
    |--->charge211
    |--->charge122
    |--->charge222
    |
    |->Article2
    How can I solve this issue? I tried to select a data binding on the charges like article.DATA.charges.DATA[*], but that doesn't work.
    Anyone an idea?

    Alex,
    Is it a print-based form or an interactive form?
    ABAP
    If it is a print-based form and you are designing it from SFP, you can use the following solution.
    You have to create a nested table in the context, as below:
    say Table1 -> article info (fields: ARTICLENAME, ...other fields) and Table2 -> charge info (fields: ARTICLENAME, CHARGE, ...other fields).
    The 2 tables should contain data as below.
    Table1 data.
    1row->Article1  -.....other values.
    2row->Article2  -.....other values.
    3row->Article3  -.....other values.
    Table2 data.
    1row->Article1  -Charge11 .....other values.
    2row->Article1  -Charge12 .....other values.
    3row->Article1  -Charge13 .....other values.
    4row->Article2  -Charge21 .....other values.
    5row->Article2  -Charge22 .....other values.
    6row->Article2  -Charge23 .....other values.
    7row->Article3  -Charge31 .....other values.
    8row->Article3  -Charge32 .....other values.
    9row->Article3  -Charge33 .....other values.
    In the context, drag Table2 into Table1 and define a WHERE clause on ARTICLENAME.
    In the layout, drag the nested table into the body page and make the alignments.
    If your requirement is interactive, you may be able to use similar logic.

  • ORA-39165: During Expdp Strange Issue

    Hi All,
    DB -- 11.1.0.6.0
    OS --HPUX Itanium.
    Today I came across a strange issue where EXPDP tells us the schema does not exist. We then looked for this particular schema in the database and discovered that it does exist. One weird thing is that all the tables inside this schema are the same size and have a somewhat strange naming convention. I also discovered the same issue with one more database on the same host.
    CPMDVFDM> Total estimation using BLOCKS method: 0 KB
    ORA-39165: Schema WMSYS was not found.
    ORA-31655: no data or metadata objects selected for job
    Job "SYS"."SYS_EXPORT_SCHEMA_01" completed with 2 error(s) at 11:02:43
    [1] +  Done(5)                    expdp  \'/ as sysdba\' directory=data_pump_dir1 logfile=wmsys.log schemas=wmsys &
    SQL> select table_name from dba_tables where owner='WMSYS';
    TABLE_NAME
    WM$RIC_TABLE
    WM$RIC_TRIGGERS_TABLE
    WM$INSTEADOF_TRIGS_TABLE
    WM$WORKSPACES_TABLE
    WM$VERSION_TABLE
    WM$NEXTVER_TABLE
    WM$VERSION_HIERARCHY_TABLE
    WM$VERSIONED_TABLES
    MAX(BYTES) SEGMENT_NAME
         65536 WM$RIC_TRIGGERS_TABLE
         65536 WM$WORKSPACES_TABLE
         65536 WM$MODIFIED_TABLES
         65536 WM$NEXTVER_TABLE
         65536 WM$CONS_COLUMNS
         65536 WM$MP_PARENT_WORKSPACES_TABLE
         65536 WM$LOG_TABLE
         65536 WM$ADT_FUNC_TABLE
         65536 WM$UDTRIG_INFO
         65536 WM$VT_ERRORS_TABLE
         65536 WM$NESTED_COLUMNS_TABLE
         65536 WM$VERSIONED_TABLES
         65536 WM$RESOLVE_WORKSPACES_TABLE
         65536 WM$LOCKROWS_INFO
         65536 WM$MP_GRAPH_WORKSPACES_TABLE
         65536 WM$BATCH_COMPRESSIBLE_TABLES
         65536 WM$TMP_DBA_CONSTRAINTS
         65536 WM$RIC_LOCKING_TABLE
         65536 AQ$_WM$EVENT_QUEUE_TABLE_S
         65536 WM$INSTEADOF_TRIGS_TABLE
         65536 WM$VERSION_TABLE
         65536 WM$VERSION_HIERARCHY_TABLE
         65536 WM$WORKSPACE_PRIV_TABLE
         65536 WM$WORKSPACE_SAVEPOINTS_TABLE
         65536 WM$UDTRIG_DISPATCH_PROCS
         65536 WM$REPLICATION_TABLE
         65536 WM$CONSTRAINTS_TABLE
         65536 WM$EVENTS_INFO
         65536 WM$LOG_TABLE_ERRORS
         65536 WM$REMOVED_WORKSPACES_TABLE
         65536 WM$HINT_TABLE
         65536 WM$RIC_TABLE
         65536 WM$ENV_VARS
         65536 WM$SYSPARAM_ALL_VALUES
         65536 WM$EVENT_QUEUE_TABLE
         65536 SYS_IOT_OVER_12359
         65536 WM$REPLICATION_DETAILS_TABLE
    Regards

    This is not true: there is a set of schemas that Data Pump knows are not really user schemas, and those are therefore not exported. Most of their names contain SYS, but they don't have to.
    If you created a schema called sysxyz, it would be exported.
    Dean

  • Strange issue in my n82 handset

    I'm experiencing a strange issue on my N82 handset. When I'm listening to any radio station, the sound suddenly swaps from stereo to mono and back, especially when I'm on the move. First I thought it was because of the headset, but yesterday I bought an HS-45 headset and the problem still persists. I have never had such an issue with my previous Nokia phones. Maybe someone is familiar with this problem and knows whether it is a hardware failure or a software one that is common to this model and can be fixed in future firmware updates. I read about a similar problem concerning the N73 and E51, but no solution. I live in the center of Kolkata, so the FM signal is strong. Any ideas how to solve the problem?

    Then I'm back to the FM signal: because the headset cable is used as the aerial, reception tends to be a bit flaky. Reception on my N80 and my N95 8GB fluctuates even when I have the phone lying on a table playing through the loudspeaker.
    The only alternative explanation would be some sort of dodgy connection between the FM receiver and the port, which is possible, but I would think pretty unlikely.

  • PSA Table Naming convention

    Hi Experts,
    Currently I am working on BW 3.5. I would like to delete old PSA requests through a process chain, and I need some clarification. Please provide your suggestions.
    I have collected the full list of PSA tables in the development system in Excel, so I can filter them by source system.
    While creating the process chain for the PSA deletion, I want to add the collected PSA tables (object names).
    Please refer to the screenshot. I noticed that the naming convention for PSA tables differs from development to quality and production.
    So if I transport this process chain to quality and production, it will not work the same as in development.
    I have already searched the forum and found a thread that discusses the same issue, but no resolution was given.
    Please help me get this issue resolved. Thanks in advance.
    Similar issue thread:
    psa
    Screen shot:
    http://img818.imageshack.us/img818/3963/psa1.jpg
    Thanks,
    RR

    To explain this, I will use systems with the following naming convention:
    Dev BW: BWD
    Dev ECC: E01
    Quality BW: BWQ
    Quality ECC: Q01
    When the conversion is taken over into the quality system, you should have the below parameters:
    BWD to BWQ
    E01 to Q01
    Q01 to Q01
    FLAT File to FLAT FILE.
    So let's say a source-system-dependent object, for example a transfer rule, goes from D to Q: it will be converted in the quality system based on the conversions maintained in table RSLOGSYSMAP.
    The source-system-dependent objects thus get converted to the target system's objects using the mapping maintained there.
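    If you want to verify the mapping quickly, here is a minimal sketch (the field names SLOGSYS and RLOGSYS are assumptions; check the table definition in SE11):
    REPORT z_show_logsys_map.
    " List the source-to-target logical system conversions
    " maintained in RSLOGSYSMAP (field names assumed).
    DATA: lt_map TYPE STANDARD TABLE OF rslogsysmap,
          ls_map TYPE rslogsysmap.
    SELECT * FROM rslogsysmap INTO TABLE lt_map.
    LOOP AT lt_map INTO ls_map.
      WRITE: / ls_map-slogsys, '->', ls_map-rlogsys.
    ENDLOOP.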
    Hope this is clear for you now.
    Thanks
    Murali

  • Data Deletion from PSA Tables

    Hi,
    I deleted 2 failed requests (10,000,000 records) from PSA tables because of a disk space issue in our BW system:
    RSA1 -> PSA -> Delete PSA Data.
    The requests are deleted from the PSA; in BW I can see that there are no requests left in that particular PSA.
    But the Basis/database team raised an issue that data has been added to the database.
    What does this mean, and how can I proceed?
    Thanks

    Hi,
    Thanks for the info provided.
    So according to the info provided, deleting a request from the PSA only marks it as 'to be deleted'.
    What do you mean by 'when the last of all requests in one partition gets deleted'?
    There were 2 requests in our PSA; I deleted both of them, and when I go to the PSA in BW I don't find any requests available there.
    But our Basis team complained that some data has been added to the database.
    Thanks

  • Inconsistent data in PSA table and Data source

    Dear Experts,
    I am using a process chain to update an InfoCube from the data source of a Z table. In the process chain, the 'Delete PSA request' step has the PSA table number of the correct data source. The InfoPackage and the DTP also reference the correct data source.
    But after executing the process chain, the PSA table is updated from the wrong data source.
    I have checked the extraction of the data source in transaction RSA3, and the records match the correct data source. But when I check the PSA contents through the process monitor in the process chain log for the DTP, I find the PSA content is inconsistent and comes from the wrong data source.
    If the process chain is repeated once again, the PSA is updated from the right data source.
    Also, when I execute the process chain steps manually, it works fine.
    Thanks and regards
    Murugesan

    Hi,
    It looks strange, but you can do the following check:
    Go to your data source, and from the PSA table check which InfoPackage is loading the data. You should be able to see that the data is being loaded by the same InfoPackage each time. If there are multiple InfoPackages involved, you could get different data.
    Regards,
    Durgehs.

  • Data deletion in PSA tables

    Hello Folks,
    I am trying to delete data older than 7 days in my PSA tables. With process chains I am able to delete the request IDs, but the actual data is not getting dropped. I could write an ABAP program, but that would delete the entire contents of the PSA tables. I went through a lot of SDN messages in this regard but did not find a solution. Could you please offer some suggestions on this?
    Kris

    Hi,
    There is a process type called "Deleting requests from PSA", and there you can set the frequency, i.e. how much old data you want to retain.
    As far as I have seen, once a request is deleted, all the data belonging to that request is also deleted; otherwise there would be no point in just deleting request IDs.
    I guess there must be some issue with your process chain. Try manually deleting the PSA requests if they are still available in the PSA table.
    Regards,
    Durgesh.

  • Data not received in PSA Table(urgent)

    Hello,
    While loading a master data full load of 457,070 records two days back, only 4 lakh records were received in the PSA; the remaining 57,070 were missing, with the error: Data not received in PSA Table.
    When we loaded 5 lakh records the next day, we got the same error, Data not received in PSA Table; all records except the 57,070 reached the PSA.
    We tried again today with 550,000 records: the same error, and all but the 57,070 records reached the PSA.
    I need your help to solve this.
    I will say thanks by assigning points.
    Regards
    PSC

    Hi,
    Check whether there is a tRFC hanging in SM58.
    Could it be that your DB cannot extend its space for this amount of data?
    How do you perform the load: package by package (in series or in parallel)? Could it be that the next stage (into the InfoObject) is failing, thus stopping the load into the PSA?
    Can you try to perform the same load with "only to PSA" and then enable the "subsequent update in target" option?
    As already mentioned, it could also be that an IDoc is hanging.
    Please let us know.
    Olivier.
