Issue with extraction of data to Delta Queue

Hi All,
Background: One LO DataSource each from Applications 8, 11 and 12 was active in LBWE, with the update method set to Queued Delta. However, these DataSources were not used to pull data into any of the global BI systems, and they did not exist in RSA7. There was a business requirement to enhance these DataSources, and other DataSources under the same applications also had to be enhanced. There were entries in LBWQ that were regularly processed and cleared out by the standard SAP collector jobs. Since no delta had been initialized for these extractors, no entries were passed on to RSA7.
Problem: We did not clear the entries or take downtime before sending the transports, and now the collector job short-dumps saying the structures have changed, which is correct. I believe we won't need any of the data sitting in LBWQ, since none of these extractors had entries in RSA7. The problem occurs only in our production system; we did not face it in any other system, even though entries existed in LBWQ there as well.
What is the solution? Is the only option now to delete the queue from LBWQ and then run the collector job, or is there another way, and will that fix the problem?

Hi Victor,
As you said, you were not using the DataSources for Applications 8, 11 and 12, so I believe you don't have any historical data for them in BW either. Since you have enhanced these DataSources for the new requirement, I assume you have also changed the structures on the BW side to load the newly added fields.
In that case you need to replicate the DataSources in BW and then activate the transfer structures for all of them.
Then flush out LBWQ for all three applications and fill the setup tables for them (this requires downtime in production to make sure you don't lose any records).
Before filling the setup tables, check SM13 for stuck update entries for these applications; if there are any, delete them as well.
Once the setup is done, run an init without data transfer and then repair full loads in BW for the targets. (Activate your V3 job for the regular deltas once the init without data transfer has taken place.)
Thanks
Assign points if this helps

Similar Messages

  • Issue with Ord Start Date

    Hi,
    I am having an issue with the Ord Dep Start Date.
    I create asset XYZ and then post the acquisition for the new asset; for example, using FB01 I enter today's date and post.
    Now when I go to the Dep Area tab for the dep key (M200), the Ord Dep Start Date is 15 Dec 2010.
    For this dep key, the period control method (transaction AFAMP) has Acquisition = 6 (at the start of the year), and the period control (transaction OAVS) is 4 (first-year convention at half-year start date).
    Our client's fiscal year runs from 1 June 2010 to 31 May 2011.
    So when I post the acquisition for the asset, as per the period control the Ord Dep Start Date should be 1 Dec 2010, but it is taking 15 Dec 2010.
    Why is the system taking 15 days more?
    Regards.

    Hi
    Which fiscal year variant are you using?
    Did you maintain it in OAVH? Copy the existing OAVH entry for K4 to your fiscal year variant and try again.
    Also, 04 does not appear for acquisitions in AFAMP under 0004. Whatever you specify in OAVH must also be part of AFAMP.
    Regards
    Ajay M

  • Issue with list saving data after sites upgrade from sharepoint 2010 to sharepoint 2013

    NewForm.aspx of the list:
    The custom list sometimes does not save data in the new form after 15 minutes; only a blank record gets created without the data, even though some columns are mandatory fields. Why?

    Hello dcakumar,
    Sounds like a strange issue. If you can reproduce it, do you see any errors in the ULS logs?
    - Dennis | Netherlands

  • View data in Delta Queue (RSA7)

    Hi,
    We want to see data records available for extraction in delta queue.
    The DataSource is 0CO_OM_NWA_2. In RSA7 it shows 5 in the Total column for this DataSource. When we select "Display Data Entries" (F2), choose update mode Delta and execute, it displays "List contains no data".
    How can we view this data?
    Regards
    SS

    Hi,
    Once you schedule the InfoPackage, it will pick up the data from RSA7. Afterwards, when you check the data in RSA7 with update mode Delta, you won't find any records, but you can see the data by selecting update mode Delta Repetition and executing. That shows the records which have already been transferred to the BW side.
    Update mode Repetition holds the data until the next delta is successful.
    Hope I am clear.
    Regards,
    Siva.

  • Issue with 0hrposition master data

    We are extracting data from SAP using the 0HRPOSITION_ATTR DataSource. I noticed that the data is not maintained correctly in the master data tables in BW and is giving us incorrect results in reporting. Consider the scenario below:
    Position A is created as vacant on 04/01/2006, with start date (BEGDA / Valid From) 04/01/2006 and end date (ENDDA / Valid To) 12/31/9999. The following entries are shown under maintain master data for 0HRPOSITION in BW:
    Position | Valid To   | Valid From | Position Vacant
    A        | 03/31/2006 | 01/01/1000 |
    A        | 12/31/9999 | 04/01/2006 | X
    Position A is then delimited on 09/15/2006 as it is no longer required. In SAP, the position has a record only from 04/01/2006 to 09/15/2006 as vacant. When the record is extracted into BW, it creates the following entries in the master data table:
    Position | Valid To   | Valid From | Position Vacant
    A        | 03/31/2006 | 01/01/1000 |
    A        | 09/15/2006 | 04/01/2006 | X
    A        | 12/31/9999 | 09/16/2006 | X
    The entry 09/16/2006 - 12/31/9999 is incorrect, as the position does not exist for that duration. If we report on 0HRPOSITION with key date 09/30/2006, it shows position A as vacant even though the position no longer exists.
    Has anyone come across this situation? Any help is greatly appreciated.
    Kamal
    P.S.: Milind Rane, I was searching through the forums and came across your post. I would appreciate it if you could let me know how you solved this issue.

    Hi KK,
    I have a similar issue. Can you please let me know how this issue with the 0HRPOSITION_ATTR extractor was resolved? In my case I get incorrect data in BW when reporting is done. Through the extractor and up to the PSA the correct records come in, but in the master data of 0HRPOSITION I have incorrect records.
    Please help.
    Thanks
    Hari

  • R&R issue with extracts

    Hi All,
    We have an issue with the R&R daemon when creating a new extract. The AC_Extract queue stops with a DDIC_WRITE error. After deleting the failed entry and releasing the queue, the extract eventually ends successfully. We have the same problem with all MSA sites. What does the error mean, and what can we do to prevent it?
    Thanks!

    Hi,
    isn't it SPE_DDIC_WRITE that causes the problem?
    In that case, if you are running 5.0, please check SAP Note 765953.
    Regards,
    Wolfhard

  • Issue with the Posting Date of the Purchase Order.

    Hi All,
    There are fields in BW such as SSL1: Time OK, SSL2: Qty OK, SSL3: Time & Qty OK, and SSL4: Days Late (routines are written to calculate them). These fields indicate whether the delivery against a GR is OK with respect to time, quantity and number of days late.
    The issue I am facing is this:
    If there is only one delivery/GR against a single item, the calculations in BW are correct - i.e. for a particular PO with only one delivery, fields like SSL1: Time OK and SSL2: Qty OK show that the delivery was made within the specified time and everything is OK (assuming it was delivered within the allotted time).
    But if there are multiple deliveries or multiple GRs posted for one PO item, the calculations go wrong: even if the delivery is made well within the specified time, the result shows it as delivered too late, because the earlier dates are overwritten.
    Can anyone shed some light on how I can go about solving this issue?
    I am thinking of declaring the posting date as a key field of the DSO (as of now it is a data field). I would also like to know the impact of assigning it as a key field.
    Thanks in advance,
    Prasapbi

    Hi,
    As I understand it, you have a DSO based on purchase orders and your key fields are PO and PO line item. The problem you describe will always be there if multiple deliveries/GRs are created for a single line item, because the system overwrites the entries for the same key.
    The problem with adding posting date as a key field is that your key then becomes PO - PO line item - date. When the PO is created, the posting date will be blank (correct me here if I am wrong), so you will end up with two entries for the same PO/line-item combination, one without a date and one with a date, which again would be incorrect. Even if my assumption about the posting date is wrong, your data may still not be correct, because you may have many entries with the same posting date, which would again overwrite each other.
    If there were a field that directly links a PO line item to the individual deliveries created for it, you could bring that field into the DSO as a key field, but I don't think there is such a field.
    Looking at your report requirement, I would suggest that you build a DSO based on goods receipts and then calculate these key figures by comparing the GR posting date with the PO line item date.
    Alternatively, you can change the way your DataSource works (if it is a generic one based on a function module). Since your main requirement is to check whether the GR posting date meets your SLA, you should fetch the details only when a GR is created and make your key PO - PO line item - GR document.

  • Issues with 4.1 Data Upload

    I've got some issues with the new Data Upload feature in 4.1. I had already built the pages and the data loading, and it worked perfectly. Now a new column has been added to my data loading table and I have to add a new table lookup for this column. Everything is set up as it should be; I've got 4 table lookups. When I try to upload some data and map the table columns to the data, I always get the error "Failed to retrieve the lookup value". There is no problem when the data load only has to retrieve one column from the lookup table; when it has to retrieve data from more tables for more columns, I always get the FAILED message. Does anyone know the cause of this? I already tried creating a completely new data loading, but that also failed to do the job.

    Hi Ellenvanhees,
    I don't think the number of lookups you defined is the issue here. If possible, share more details about your data and tables; the cause is probably in your data.
    If you are able to do the lookups one by one without a problem, then I think your upload is failing because of null values. The data upload feature currently returns a failed lookup even when a null value is to be uploaded. This is bug #13582661 and has been fixed for a future release.
    Patrick

  • Issue with setting Planning Data Source

    I keep getting an error when I try to set up the data source.
    The Oracle database = orcl
    The database schema = EPMAPLAN
    Password = EPMAPLAN
    When I log on to the Oracle database, I use:
    sys as SYSDBA
    Password = Password01
    So what should the values be? Database Source Name = ORCL, correct?
    Server = win-392h1l307n1 or localhost
    Database = ORCL
    User Name = ?
    Password = ?
    I have tried all the combinations.
    Please advise

    Duplicate post - Issue with setting up planning data source
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Issues with processing the data in the Adapter Engine

    Dear experts,
    We are facing the below-mentioned issue while processing data. Please let me know what we can do to resolve it.
    "Error while reading ID of own business system from the SLD for system DX1 and client 500 Internal error in Landscape Directory"
    regards
    shankar

    Hi,
    You can go through what Aashish suggested.
    It seems like a problem with the SLD; it might be wrongly configured. Please check with transaction SLDAPICUST.
    Regards,
    P.Rajesh

  • Issue with Real-Time data acquisition

    Hello,
    I'm using RDA to extract a non-standard transparent table (Z*) from ECC to BI. When I launch my daemon, my InfoPackage and my DTP turn yellow for only 2-3 seconds and green just after, and no data comes from ECC.
    If I check the daemon log I see the following lines:
    Job started
    Step 001 started (program RSCRT_RDA_DEMON, variant &0000000001977, user ID XXXXXX)
    InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL for daemon 01 is locked
    Setting status of InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL
    Status set (2 -> 3)
    Start of upload for InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL for daemon 01
    Request 242.373 opened, status set (0 -> 1)
    InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL: Deleting TIDs
    InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL: TIDs successfully deleted
    Data records deleted in the queue, status set (1 -> 4)
    Data package 000001 opened, status set (4 -> 2)
    The current application triggered a termination with a short dump.
    Upload finished for InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL for daemon 01
    Setting status of InfoPackage ZPAK_4J6KVMAKPPWI3FYUYAWVLYVWL
    Status set (3 -> 2)
    A short dump is triggered in ECC: MESSAGE_TYPE_X in FM RSC3_DEMON_UPL. Why?
    I have built another extractor (on an HR infotype) and it works fine.
    Thanks for your help,

    First tip: do not configure the DAQ task and close the task on every iteration of the while loop. Configure the task before the loop starts and close it after the loop. Only stop and reconfigure the task if any of the configuration data changes.

  • Oracle EPM 11.1.2 issue with system-jazn-data.xml & HIT entries

    I have been working on configuring Oracle EPM 11.1.2 and have one final issue from the diagnostic utility that I cannot figure out. The configuration sequence is as follows, and each step is installed in its own database:
    Step 1 - Foundation/Shared Services/Calc Mgr/EPMA/Essbase to a single relational DB. I am not configuring the web server until the final step.
    Step 2 - Hyperion Performance Scorecard
    Step 3 - Planning
    Step 4 - Profitability
    Step 5 - RA and configure web server.
    I have used both SQL Server Express 2008 and Oracle DB 11g and get the same result.
    When I complete the install, restart all of the services, and run the diagnostic utility, I get a failure for Foundation Services indicating that the file "system-jazn-data.xml" cannot be found. No real help is provided with the error message, and I have found no help in the docs or on the web. I have searched the disk and the file seems to be in the proper place per the docs. I have done partial configurations and do not get the error. I have compared the system-jazn-data.xml file from a successful configuration with the one from the failed configuration, and they are identical. Both files seem to be bloated with tens of thousands of lines, most of them blank.
    I had reached a point where I thought the issue was related to Performance Scorecard and removed that step. I am now getting the error again.
    Is anyone else seeing this issue? Is it just a bogus message in the diagnostic report that can be ignored? Any other thoughts?
    Thanks
    EPMCloud

    Update - After going through the install many more times, I still do not know what the issue is, but I believe I have figured out how to resolve it. It appears that if you go back (after everything is installed and configured) and reconfigure the application server for Foundation services, the issue is corrected.
    I am running some final tests now, and if I discover something different, I will update the post.
    EPMCloud

  • Issue with status of data information in Bex analyzer report

    Hi BI gurus,
    One of our queries shows an older date for the "Status of Data" information in the BEx Analyzer report. I tried to correct it in BEx Analyzer by removing the existing text information element and adding a new text element in the BEx Analyzer designer for the query, but it did not work, since changes made to the query through BEx Analyzer are only saved as a local workbook rather than reflected in the query. Please suggest some options to resolve this issue and any ideas for correcting the "Status of Data" in the BEx Query Designer.

    Hi Aditya
    This is a common problem faced by users when reporting on a MultiProvider.
    In my project, what I did to overcome it was to run a fake DTP to the cube whose status is creating the problem.
    For example, under the MultiProvider I have a planning cube which is only updated monthly, while all the actuals cubes are updated daily. In this case, create a DTP for the plan cube with an impossible selection condition (like fiscal year 2099). This brings 0 records into the planning cube (and so does not impact the data) but updates the last loading time.
    Regards
    Anindya

  • Issue with time sheet data source

    Hi All,
      We are on ECC 5.0.
      I am trying to load CATS timesheet data into BW through DataSource 0CA_TS_1 (approved time). This DataSource fetches data from the CATSDB table.
    In CATSDB there is a good amount of data with status '30', i.e. 'Approved'.
    However, when running the RSA3 extractor checker, this DataSource returns 0 records.
    The most relevant SAP Note was 509592, but it is valid only for older versions.
    ECC HR is on level 10, support package SAPKE50010.
    This issue has been left unresolved in previous threads.
    Please suggest a solution. It's urgent. Points for sure.
    Thanks
    Vishno

    Hi Guillaume,
    Thanks for the reply.
    I had a look at the above link.
    1) The link says we need to perform the steps below. Where do we need to run them (on the SQL Server or the AD server)?
    Sorry for asking a basic question.
    Click Start, click Run, type Adsiedit.msc, and then click OK.
    In the ADSI Edit snap-in, expand Domain [<DomainName>], expand DC=<RootDomainName>, expand CN=Users, right-click CN=<AccountName>, and then click Properties.
    In the CN=<AccountName> Properties dialog box, click the Security tab.
    On the Security tab, click Advanced.
    In the Advanced Security Settings dialog box, make sure that SELF is listed under Permission entries.
    2) The link talks about Windows Server 2000. Is this applicable to 2008?
    Regards
    Santosh

  • In OSB, XQuery issue with large-volume data

    Hi ,
    I am facing a problem with an XQuery transformation in OSB.
    There is one XQuery transformation where I compare all the records, and if there are similar records I club them under the same first node.
    I read the input file from the FTP process. This works perfectly for small input data. For large input data it also works, but it takes a huge amount of time, the file moves to the error directory, and I see duplicate records created for the same input data. I see nothing in the error log or normal log related to this file.
    How can I check what exactly is causing the issue here, why the file moves to the error directory, and why I get duplicate data for large input (approx. 1 GB)?
    My XQuery is something like the one below.
    <InputParameters>
    {
        for $choice in $inputParameters1/choice
        let $withSamePrimaryID := $inputParameters1/choice[PRIMARYID eq $choice/PRIMARYID]
        let $withSamePrimaryID8 := $inputParameters1/choice[FIRSTNAME eq $choice/FIRSTNAME]
        return
            <choice>
            {
                if (data($withSamePrimaryID[1]/ClaimID) = data($withSamePrimaryID8[1]/ClaimID)) then
                    let $claimID := $withSamePrimaryID[1]/ClaimID
                    return <ClaimID>{ $claimID }</ClaimID>
                else
                    <ClaimID>{ data($choice/ClaimID) }</ClaimID>
            }
            </choice>
    }
    </InputParameters>

    Hi,
    I understand your use case is:
    a) read the file (from an FTP location, a txt file hopefully)
    b) process the file (your XQuery, although I will not get into the details)
    c) do something with the result (send it to the backend system via a business service?)
    I also noted that large files take a long time to be processed. This depends on the memory/heap assigned to your JVM; I can say that is expected behaviour.
    On the other point, the file being moved to the error directory could be your error handler doing its job (if you have one).
    If there are no error handlers, look at the timeout and error-condition settings on your service.
    HTH
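    A follow-up thought on the transformation itself: the posted FLWOR emits one <choice> per input record and rescans the entire input for every record, which is quadratic in the number of records and also explains the duplicates (every record with the same PRIMARYID produces its own <choice>). Below is a minimal sketch of an alternative, assuming the element names from the snippet above, grouping only on PRIMARYID for illustration, and an OSB version limited to XQuery 1.0 (so no group by clause):
        <InputParameters>
        {
            (: one iteration per distinct key instead of one per record :)
            for $id in distinct-values($inputParameters1/choice/PRIMARYID)
            (: all records that share this PRIMARYID :)
            let $group := $inputParameters1/choice[PRIMARYID eq $id]
            (: take the ClaimID of the first record in the group; adjust if a different rule applies :)
            let $claimID := data($group[1]/ClaimID)
            return
                <choice>
                    <PRIMARYID>{ $id }</PRIMARYID>
                    <ClaimID>{ $claimID }</ClaimID>
                </choice>
        }
        </InputParameters>
    Even so, a payload of around 1 GB materialised in memory will stress the JVM heap, so the sizing advice above still applies.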
