Inventory 0IC_C03, issue with historical data (data before stock initialization)

Hi Experts,
We followed the Inventory Management implementation as below.
Initialization data and delta records match the ECC MB5B data, but historical data (2007 and 2008, i.e. before initialization) does not match the stock on the ECC side; BI shows only the difference quantity between the stock initialization and the query date.
We have done all the initial settings in BF11, maintained the process keys, and filled the setup tables for the BX and BF DataSources. We are not using the UM DataSource.
1. We loaded the BX data and compressed the request (without the tick mark at "No Marker Update").
2. We loaded the BF initialization data and compressed the request (with the tick mark at "No Marker Update").
3. For deltas, we compress the requests daily (without the tick mark at "No Marker Update").
Is this the correct process?
As you mentioned, for BX there is no need to compress (should we not compress the BX request?).
And do we need to compress the delta requests?
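For reference, the usual recommendation (as in SAP's how-to guide on inventory management scenarios; treat this as a summary to verify rather than an authoritative statement) is:

    Request type                           "No Marker Update" at compression
    BX opening stock                       unchecked (marker is updated)
    BF/UM history (before initialization)  checked (marker is not updated)
    BF/UM deltas (after initialization)    unchecked (marker is updated)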
We have an issue with historical data validation.
here is the example:
We initialized on May 5th, 2009.
We have loaded BX data from 2007 onwards (historical data).
When we look at the data for January 1st, 2007, the BI side shows the value with a negative sign.
ECC shows a different value.
For example, the ECC stock on January 1st, 2007 is 1500 KG
and the stock at initialization on May 5th, 2009 is 2200 KG,
but on the BI side it is showing as: -700 KG.
2200 + (-700) = 1500,
but on the BI side it is not showing as 1500 KG
(it is showing values in negative, relative to the initialization stock).
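For what it's worth, this matches how BW computes a non-cumulative key figure backwards from the reference point (marker); a sketch of the relation in LaTeX notation, using the figures above:

    S(d) = S_{\mathrm{ref}} - \sum_{d < t \le d_{\mathrm{ref}}} \Delta(t)

With the marker S_ref = 2200 KG from BX and net movements of +700 KG between January 1st 2007 and May 5th 2009, this gives S = 2200 - 700 = 1500 KG. If the marker is missing (for example, because the BX request was compressed with "No Marker Update" ticked), the query effectively computes 0 - 700 = -700 KG, which is exactly the symptom described.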
Can you please tell us whether this process is correct, or whether we did something wrong in the data loading?
In the validity table (L table) there are 2 records, with SID values 0 and -1; is this correct?
thanks in advance.
Regards,
Daya Sagar
Edited by: Daya Sagar on May 18, 2009 2:49 PM

Hi Anil,
Thanks for your reply.
1. You have performed the initialization on 15th May 2009.
yes
2. For the data after the stock initialization, I believe that you have either performed a full load from the BF DataSource for the data from 16th May 2009 onwards, or you have not loaded any data after 15th May 2009.
For the BF delta data after the stock initialization, we compressed the requests with the "No Marker Update" option unchecked.
If this is the case, then I think you need to
1. Load the data on 15th May (from BF data source) separately.
Do you mean that the BF (material movements) data for 15th May should be compressed with the "No Marker Update" option unchecked, as we do for the BX DataSource?
2. Compress it with the No Marker Update option unchecked.
3. Check the report for data on 1st Jan 2007 after this. If this is correct, then all the history data will also be correct.
After this you can perform a full load till date
Does "till date" here mean that May 15th is not included?
for the data after the stock initialization, and then start the delta process. The data after the stock initialization (after 15th May 2009) should then also be correct.
Can you please clarify these doubts?
Thanks
Edited by: Daya Sagar on May 20, 2009 10:20 AM

Similar Messages

  • Issue with Ord Start Date

    Hi,
    I am having an issue with the Ord Dep Start Date.
    When I create asset XYZ and then post the acquisition for that new asset (for example, using FB01 with today's date), and then go to the Dep Area tab, for the dep key (M200) the Ord Dep Start Date is 15 Dec 2010.
    For this dep key, in the period control method (transaction AFAMP), 'Acquisition entered' is 6 (at the start of the year), and the period control (transaction OAVS) is 4 (first-year convention at half-year start date).
    Our client is using a fiscal year from 1 June 2010 to 31 May 2011.
    So when I post the acquisition for the asset, per the period control the Ord Dep Start Date should be 1 Dec 2010, but here it is taking 15 Dec 2010.
    Why is the system taking 15 days more?
    Regards.

    Hi
    Which fiscal year variant are you using?
    Did you maintain it in OAVH? Copy the existing entry in OAVH for K4 to your fiscal year variant and try again.
    Also, 04 does not appear for acquisitions in AFAMP under 0004... whatever you specify in OAVH must also be part of AFAMP.
    Regards
    Ajay M
    Edited by: Ajay Maheshwari on Oct 28, 2010 2:31 PM

  • Issue with list saving data after sites upgrade from SharePoint 2010 to SharePoint 2013

    Newform.aspx of the list:
    The custom list sometimes does not save data in the new form after 15 minutes; only a blank record gets created without saving the data, even though some columns are mandatory fields.

    Hello dcakumar,
    Sounds like a strange issue. If you can reproduce this, can you see any errors in the ULS logs?
    - Dennis | Netherlands

  • Issue with extraction of data to Delta Queue

    Hi All,
    Background: one LO DataSource each from applications 8, 11 and 12 was active in LBWE, with the update method set to Queued Delta. However, these DataSources were not used to pull any data to any of the global BI systems, and they did not exist in RSA7. There was a business requirement to enhance these DataSources, along with other DataSources under the same applications. There were entries in LBWQ which used to get flushed out at regular intervals by the standard SAP collector jobs; since no delta was initialized for these extractors, no entries were passed on to RSA7.
    Problem: we did not clear the entries or take downtime before sending the transports, and now the collector job is dumping, saying the structures have changed, which is correct. I believe we won't require any data sitting in LBWQ, as none of the extractors had any entry in RSA7. This problem is happening only in our production system; we did not face it in any other system, even though entries existed in LBWQ there as well.
    What is the solution to fix this? Is the only option to delete the queue from LBWQ and then run the collector job, or is there another way, and will that fix the problem?

    Hi Victor,
    As you said, you were not using the DataSources for applications 8, 11 and 12, so I believe you don't have any data in BW either (historical data). As you have enhanced these DataSources with the new requirement, I hope you have also changed the structures on the BW side to load the data to the newly added fields.
    So in your case you need to replicate the DataSources in BW and then activate the transfer structure for all of them.
    Then flush out LBWQ for all those 3 applications and fill the setup tables for the same (but this requires downtime in production to make sure you won't lose any records).
    Before the setup, check SM13 for blocked entries for these applications; if there are any, you can delete them as well before running the setup.
    Once the setup is done, take an init without data transfer and then take repair full loads in BW for the targets (activate your V3 once the init without data transfer has taken place, for the regular deltas).
    Thanks
    Assign points if this helps

  • Issue with 0hrposition master data

    We are extracting data from SAP using the 0HRPOSITION_ATTR DataSource. I noticed that the data is not maintained correctly in the master data tables in BW, giving us incorrect results in reporting. Consider the following scenario:
    Position A is created as vacant on 04/01/2006, with start date (BEGDA / Valid from) 04/01/2006 and end date (ENDDA / Valid to) 12/31/9999. The following entries are shown under maintain master data for 0HRPOSITION in BW:
    Position | Valid To   | Valid From | Position Vacant
    A        | 03/31/2006 | 01/01/1000 |
    A        | 12/31/9999 | 04/01/2006 | X
    Position A is then delimited on 09/15/2006, as it is no longer required. In SAP, the position has a record only from 04/01/2006 to 09/15/2006, as vacant. When the record is extracted into BW, it creates the following entries in the master data table:
    Position | Valid To   | Valid From | Position Vacant
    A        | 03/31/2006 | 01/01/1000 |
    A        | 09/15/2006 | 04/01/2006 | X
    A        | 12/31/9999 | 09/16/2006 | X   <-- incorrect entry
    The entry 09/16-12/31 is incorrect, as the position does not exist for this duration. If we report on 0HRPOSITION with key date 09/30/2006, it shows position A as vacant even though the position no longer exists.
    Has anyone come across this situation? Any help is greatly appreciated.
    Kamal
    P.S.: Milind Rane... I was searching through the forums and came across your post. I would appreciate it if you could let me know how you solved this issue...
    Message was edited by:
            Kamal K

    Hi KK,
    I have a similar issue. Can you please let me know how this issue with the 0HRPOSITION_ATTR extractor was resolved? In my case I have incorrect data in BW when reporting is done. Through the extractor up to the PSA the correct records are coming, but in the master data 0HRPOSITION I have incorrect records.
    Please help.
    Thanks
    Hari
    Message was edited by:
            SAPCOOL

  • Issue with the Posting Date of the Purchase Order.

    Hi All,
    There are fields in BW like SSL1: Time OK, SSL2: Qty OK, SSL3: Time & Qty OK, SSL4: Days Late (routines are written to calculate them). These fields indicate whether the delivery against a GR is OK with respect to time, quantity and the number of days late.
    The issue I am facing is this:
    If there is only one delivery/GR against a single item, the calculations in BW are correct, i.e. for a particular PO with only one delivery, fields like SSL1: Time OK and SSL2: Qty OK show that the delivery was done within the specified time and everything is OK (in case it was delivered within the allotted time).
    But if there are multiple deliveries or multiple GRs posted for one PO item, the calculations go wrong, i.e. even if the delivery is done well within the specified time, it shows as delivered too late, because the earlier dates are overwritten.
    Can anyone throw me some light on how I can go about solving this issue?
    I am thinking of declaring the posting date as a key field of the DSO (as of now it is a data field), and I also want to know the impact of assigning it as a key field.
    Thanks in advance,
    Prasapbi

    Hi,
    As I understand, you have a DSO based on purchase orders, and your key fields are PO and PO line item. The problem, as you stated, will always be there if multiple deliveries/GRs are created for a single line item, because the system will overwrite the entries for the same key.
    The problem with adding the posting date as a key field is that your key will then be PO - PO line item - date. When the PO is created, the posting date will be blank (correct me here if I am wrong), so you will have two entries for the same PO/line item combination, one without a date and one with a date, which again would be incorrect. If my assumption about the posting date is wrong, even then your data may not be correct, because you may have many entries with the same posting date which again would overwrite each other.
    If there were any direct link between the PO line item and the number of deliveries that will get created for it, you could bring that field into the DSO as a key field, but I don't think there is any such field.
    Looking at your report requirement, I would suggest that you build a DSO based on goods receipts and then calculate these key figures by comparing the GR posting date with the PO line item date.
    Else you can change the way your DataSource works (if it is a generic one based on a function module). Since your main requirement is to check whether the GR posting date has met your SLA or not, you should fetch all the details only when a GR is created and make your key fields PO-PO Line item-GR
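    To illustrate the suggested GR-based approach, here is a minimal SQL sketch; all table and column names (goods_receipts, po_items, delivery_date and so on) are hypothetical placeholders for the corresponding DSO fields, not actual SAP table names:

        -- One row per GR, keyed on PO / PO item / GR number,
        -- so multiple GRs for the same PO item no longer overwrite each other.
        SELECT gr.po_number,
               gr.po_item,
               gr.gr_number,
               gr.posting_date,
               poi.delivery_date,
               DATEDIFF(day, poi.delivery_date, gr.posting_date) AS days_late,   -- cf. SSL4
               CASE WHEN gr.posting_date <= poi.delivery_date
                    THEN 'X' ELSE '' END                         AS time_ok      -- cf. SSL1
        FROM   goods_receipts gr
        JOIN   po_items poi
          ON   poi.po_number = gr.po_number
         AND   poi.po_item   = gr.po_item;

    Because the GR number is part of the key, each delivery keeps its own dates, which is exactly what the overwrite problem above destroys.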

  • Issues with 4.1 Data Upload

    I've got some issues with the new data upload feature in 4.1. I had already built the pages and the data loading, and it worked perfectly. Now a new column was added to my data loading table, and I have to add a new table lookup for this column. Everything is set up as it has to be; I've got 4 table lookups. When I try to upload some data and map the table columns to the data, I always get the error: "Failed to retrieve the lookup value". There is no problem when I do a data load where it only has to retrieve one column from the lookup table; when it has to retrieve data from more tables for more columns, I always get the FAILED message. Does anyone know the cause of this situation? I already tried to build a totally new data load, but this also failed to do the job.

    Hi Ellenvanhees,
    I don't think the number of lookups you defined is an issue here. If possible, try to share more details about your data and tables, but the first thing that comes to my mind is your data itself.
    If you are able to do the lookups one by one without problems, then I think your upload is failing due to null values. The current state of the data upload feature returns a failed lookup even if a null value was to be uploaded. This is bug #13582661 and has been fixed for a future release.
    Patrick

  • Issue with setting Planning Data Source

    I keep getting an error when I try to set up the data source.
    The Oracle database = orcl
    The database schema = EPMAPLAN
    Password = EPMAPLAN
    When I log on to the Oracle database: sys as SYSDBA, password = Password01.
    So what should the values be?
    Database Source Name = ORCL, correct?
    Server = win-392h1l307n1 or localhost
    Database = ORCL
    User Name = ?
    Password = ?
    I have tried all the combinations.
    Please advise
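    In case it helps, a minimal sketch of the values that typically work here, assuming the EPMAPLAN schema created above (the field-to-value mapping is my assumption, not confirmed in this thread):

        Database Source Name = any name you choose, e.g. PLANAPP
        Server               = win-392h1l307n1 (or localhost if running locally)
        Database             = ORCL (the Oracle SID/service name)
        User Name            = EPMAPLAN (the schema owner, not SYS)
        Password             = EPMAPLAN

    The main point is that the data source should use the application schema credentials rather than SYS/SYSDBA.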

    Duplicate post - Issue with setting up planning data source
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Issues with processing the data in the Adapter Engine

    Dear experts,
    we are facing the below-mentioned issue while processing data. Please let me know what we can do to resolve the issue.
    "Error while reading ID of own business system from the SLD for system DX1 and client 500 Internal error in Landscape Directory"
    regards
    shankar

    Hi,
    As suggested by Aashish, you can go through that.
    It seems like a problem with the SLD; it might be wrongly configured. Please check with transaction SLDAPICUST.
    Regards,
    P.Rajesh

  • CRS 5.0(1) issue with historical reports

    Hi support community,
    I'm facing an issue on a CRS 5.0(1) with historical reports (HDS).
    The last known issue on the Windows server was that there was no free space on the hard drive.
    This is now fixed and everything looks fine again.
    Now the users report that they cannot get historical reports.
    As far as I have troubleshot this issue so far, I can say that all related services are up and running.
    I can still run reports for the current day, but not for the last week or month.
    So I guess there is an issue where the CDRs don't stay in the MSSQL server.
    I went through the logs and could not find anything suspicious.
    Just one thing gives me pause, in the database "da_cra" / table "ConfigLog":
    - the last entry was on 07.11.2012, when the space issues started
    - I could not get any useful information about the last entries
    CRS 5.0(1) is not supported by TAC anymore, so I hope someone can point me in the right direction for further troubleshooting.
    The users need the HDS reports until a new solution for CC is online.
    My action plan looks as follows:
    - enable tracing for SQL - HDS
    - perform check table / repair table on MSSQL
    Unfortunately there is a message when I try to enable the tracing, but I don't understand what this message is trying to tell me.
    I also don't know any MSSQL commands to run such checks / repairs.
    Regards, Alexander

    Hi Anand,
    let me see if this helps.
    I'm facing an issue where we cannot get data older than the current day.
    If the database had reached its maximum size, I would expect that we could not get the latest CDRs but still the older ones, right?
    But in our situation it is vice versa: we get current CDRs but not historical ones.
    I checked the purge settings:
    - schedule is configured to daily "12:00 AM"
    - purge data older than "99" months
    - notify when database reaches "70" % of 2048 MB
    - initiate automatic purge when database size exceeds "80" % of 2048 MB
    - auto purge data for the oldest "15" days
    So this should be fine.
    I attached the PurgeProcess.log and have some questions about it:
    line 4: Calling execute with purge date sat nov 27 00:00:03 CET 2004
    --> where is this date coming from? The date/time settings are correct on the Windows host.
    line 17: oldest record in the database has a timestamp of tue feb 26 07:27:39 CET 2013
    --> this is exactly what I experience; somehow the purge process has gone crazy and is deleting too many CDRs.
    line 18: database size exceeded auto-purge threshold of 80%
    --> why does the database have such a large size if the latest record is from the day before (call volume ~500 calls per day)?
    So at this point I have two questions: why is the timestamp wrong, and how can I clean up the database (what is the data that fills 80% of it)?
    Thanks and Regards, Alexander
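    For the MSSQL checks in the action plan above, a minimal sketch of the standard commands (assuming the CRS reporting database is da_cra, as mentioned; run from Query Analyzer or osql/sqlcmd):

        USE da_cra;
        EXEC sp_spaceused;                          -- overall database size and unallocated space
        EXEC sp_spaceused 'ConfigLog';              -- per-table usage; repeat for the CDR tables
        DBCC CHECKDB ('da_cra') WITH NO_INFOMSGS;   -- consistency check; review any errors before attempting a repair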

  • Oracle EPM 11.1.2 issue with system-jazn-data.xml & HIT entries

    Have been working on configuring Oracle EPM 11.1.2 and have one final issue from the diagnostic utility that I cannot figure out. Configuration sequence is as follows and each step is installed in its own database:
    Step 1 - Foundation/Shared Services/Calc Mgr/EPMA/Essbase to a single relational DB. I am not configuring the web server until the final step.
    Step 2 - Hyperion Performance Scorecard
    Step 3 - Planning
    Step 4 - Profitability
    Step 5 - RA and configure web server.
    I have used both SQL Server Express 2008 and Oracle DB 11g and get the same result.
    When I complete the install, restart all of the services, and run the diagnostic utility, I get a failure with Foundation Services indicating that the file "system-jazn-data.xml" cannot be found. No real help is provided with the error message, and I have found no help in the docs or on the web. I have searched the disk, and the file seems to be in the proper place per the docs. I have done partial configs and do not get the error. I have compared the system-jazn-data.xml file from the successful config to the one from the failed config; they are identical. Both files seem to be bloated with tens of thousands of lines, most of them blank.
    I had reached a point where I thought the issue was related to Performance Scorecard and removed that step, but I am now getting the error again.
    Anyone seeing this issue? Is it just a bogus message in the diagnostic report and can be ignored? Any other thoughts?
    Thanks
    EPMCloud

    Update - After going through the install many more times, I still do not know what the issue is, but I believe I have figured out how to resolve it. It appears that if you go back (after everything is installed and configured) and reconfigure the application server for Foundation services, the issue is corrected.
    I am running some final tests now, and if I discover something different, I will update the post.
    EPMCloud

  • Issue with status of data information in Bex analyzer report

    Hi BI gurus,
    One of the queries shows an older date for the "Status of Data" information in the BEx Analyzer report. I tried to correct it in BEx Analyzer by removing the existing text element and adding a new text element in the BEx Analyzer designer for the query, but it didn't work, as changes made to the query through BEx Analyzer are only saved as a local workbook rather than reflected in the query. Please suggest some options to resolve this issue and any idea how to correct the "Status of Data" in the BEx Query Designer.

    Hi Aditya
    This is a common problem faced by users when reporting on a MultiProvider.
    In my project, what I did to overcome this was to run a fake DTP to the cube whose status was creating the problem.
    For example, if under the MultiProvider I have a planning cube which is only updated monthly while all the actuals cubes are updated daily, I create a DTP on the plan cube with an impossible selection condition (like fiscal year 2099). This brings 0 records into the planning cube (thereby not impacting the data) but updates the last loading time.
    Regards
    Anindya

  • Issue with time sheet data source

    Hi All,
    We are on ECC 5.0.
    I am trying to load CATS timesheet data into BW through DataSource 0CA_TS_1 (approved time). This DataSource fetches data from the CATSDB table.
    In CATSDB there is a good amount of data with status '30', i.e. 'Approved'.
    However, on running the RSA3 extractor checker, this DataSource returns 0 records.
    The only relevant SAP Note was 509592, but it is valid for older versions.
    The ECC HR is on level 10 and support pack SAPKE50010.
    This issue has been unresolved in previous threads.
    Please suggest any solution. It's urgent. Points for sure.
    Thanks
    Vishno
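    As a quick sanity check on the source data (a sketch: in practice you would browse CATSDB via SE16, but expressed in SQL the check would be):

        -- count CATSDB records per processing status; status '30' = approved
        SELECT status, COUNT(*) AS records
        FROM   catsdb
        GROUP  BY status;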

    Hi Guillaume,
    Thanks for the reply.
    I had a glimpse at the above link.
    1) The link says we need to perform the steps below. Where do we need to run them (SQL Server or AD server)? Sorry for asking a basic question.
    - Click Start, click Run, type Adsiedit.msc, and then click OK.
    - In the ADSI Edit snap-in, expand Domain [DomainName], expand DC=RootDomainName, expand CN=Users, right-click CN=AccountName, and then click Properties.
    - In the CN=AccountName Properties dialog box, click the Security tab.
    - On the Security tab, click Advanced.
    - In the Advanced Security Settings dialog box, make sure that SELF is listed under Permission entries.
    2) The link talks about Windows Server 2000. Is this applicable for 2008?
    Regards
    Santosh

  • In OSB, XQuery issue with large volume data

    Hi,
    I am facing a problem with an XQuery transformation in OSB.
    There is one XQuery transformation where I compare all the records, and if there are similar records I club them under the same first node.
    I am reading the input file from the FTP process. This works perfectly for small input data. When the input data is large it also works, but it takes a huge amount of time, the file moves to the error directory, and I see duplicate records created for the same input data. I am not seeing anything in the error log or the normal log related to this file.
    How can I check what is exactly causing the issue, why the file moves to the error directory, and why I am getting duplicate data for large input (approx. 1 GB)?
    My XQuery is something like below.
    <InputParameters>
    {
      for $choice in $inputParameters1/choice
      (: note: these predicates rescan the whole input for every record,
         so the transformation is O(n^2) in the number of choice records :)
      let $withSamePrimaryID  := $inputParameters1/choice[PRIMARYID eq $choice/PRIMARYID]
      let $withSamePrimaryID8 := $inputParameters1/choice[FIRSTNAME eq $choice/FIRSTNAME]
      return
        <choice>
        {
          if (data($withSamePrimaryID[1]/ClaimID) = data($withSamePrimaryID8[1]/ClaimID))
          then <ClaimID>{ data($withSamePrimaryID[1]/ClaimID) }</ClaimID>
          else <ClaimID>{ data($choice/ClaimID) }</ClaimID>
        }
        </choice>
    }
    </InputParameters>

    Hi,
    I understand your use case is:
    a) read the file (from the FTP location, a txt file hopefully)
    b) process the file (your XQuery, although I will not get into the details)
    c) send it to the backend system via a business service?
    Also noted: large files take a long time to be processed. This depends on the memory/heap assigned to your JVM, so you can say that is expected behaviour.
    On the other point, the file being moved to the error directory could be the error handler doing its job (if you have one).
    If there are no error handlers, look at the timeout and error condition settings on your service.
    HTH

  • Report with historical payment data and current bp balance

    Hi,
    Has anybody created a report like this?
    cardcode, cardname, address, zipcode, document number, doctotal, docdate, docduedate, paid amount and paydate
    The data should cover from now until 1 year back (historical).
    Kind regards
    Mark

    Dear Hangman,
    This query is for customer receivables.
    If you want it for suppliers, just change the table names in the query.
    -- [%0], [%1] and [%2] are SAP Business One query parameter prompts, filled in at run time
    SELECT T0.DocNum, T0.DocDate, T0.DocDueDate, T0.CardCode,
           T0.DocCur     AS InvoiceCurrency,
           T0.DocTotalFC AS InvoiceTotal,
           T0.PaidFC     AS ReceivedAmount,
           T0.CardName,
           T1.DocNum     AS ReceiptsEntryNo,
           T1.DocDate, T1.DocDueDate,
           T1.CashSum, T1.CheckSum, T1.TrsfrSumFC,
           DATEDIFF(Day, T0.DocDueDate, T1.DocDueDate) AS OverdueDays
    FROM OINV T0
    LEFT OUTER JOIN ORCT T1 ON T0.ReceiptNum = T1.DocEntry
    WHERE T0.DocDate >= [%0]
      AND T0.DocDate <= [%1]
      AND (T0.CardName = '[%2]' OR '[%2]' = ' ')
    Regards
    MANGESH PAGDHARE.
