Issue with Ord Dep Start Date

Hi,
I am having an issue with the Ord Dep Start Date.
I create a new asset (say, XYZ) and then post an acquisition for it. For example, using FB01 I enter today's date and post.
Now when I go to the Dep Area tab, for the dep key M200 the Ord Dep Start Date is 15 Dec 2010.
For this dep key, the period control method (transaction AFAMP) has Acquisition entered as 6 (that is, at the start of year), and the period control (transaction OAVS) is 4 (that is, first year convention at half year start date).
Our client is using the fiscal year 1 June 2010 to 31 May 2011.
So when I post an acquisition for the asset, as per the period control the Ord Dep Start Date should be 1 Dec 2010, but here it is taking 15 Dec 2010.
Why is the system taking 15 days more?
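To illustrate what I expect, here is a minimal Python sketch (illustrative only, not SAP's period-control logic; the half-year start date is assumed from our fiscal year variant):

    from datetime import date

    # Illustrative sketch only -- not SAP's period-control implementation.
    # Assumption: period control 4 ("first year convention at half year
    # start date") pins the ordinary depreciation start to the start of
    # the fiscal half-year.
    FY_START = date(2010, 6, 1)          # fiscal year 1 Jun 2010 - 31 May 2011
    FY_END = date(2011, 5, 31)
    HALF_YEAR_START = date(2010, 12, 1)  # half-year start for this variant

    def expected_ord_dep_start(posting_date: date) -> date:
        # Any acquisition posted within this fiscal year should start
        # ordinary depreciation at the half-year start date.
        assert FY_START <= posting_date <= FY_END
        return HALF_YEAR_START

    print(expected_ord_dep_start(date(2010, 10, 28)))  # 2010-12-01, yet the system shows 2010-12-15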
Regards.

Hi,
Which fiscal year variant are you using?
Did you maintain it in OAVH? Copy the existing OAVH entry for K4 to your fiscal year variant and try again.
Also, 04 does not figure for acquisitions in AFAMP under 0004. Whatever you specify in OAVH must also be part of AFAMP.
Regards
Ajay M
Edited by: Ajay Maheshwari on Oct 28, 2010 2:31 PM

Similar Messages

  • Issue with list saving data after sites upgrade from SharePoint 2010 to SharePoint 2013

    NewForm.aspx of the list:
    The custom list sometimes does not save data in the new form after 15 minutes; only a blank record gets created, even though some columns are mandatory fields.

    Hello dcakumar,
    Sounds like a strange issue. If you can reproduce it, do you see any errors in the ULS logs?
    - Dennis | Netherlands | Blog | Twitter

  • Issue with 0HRPOSITION master data

    We are extracting data from SAP using the 0HRPOSITION_ATTR DataSource. I noticed that the data is not maintained correctly in the master data tables in BW, giving us incorrect results in reporting. Consider the scenario below.
    Position A is created as vacant on 04/01/2006, with start date (BEGDA / Valid From) 04/01/2006 and end date (ENDDA / Valid To) 12/31/9999. The following entries appear under maintain master data for 0HRPOSITION in BW:
    Position   Valid To     Valid From   Position Vacant
    A          03/31/2006   01/01/1000
    A          12/31/9999   04/01/2006   X
    Position A is then delimited on 09/15/2006 as it is no longer required. In SAP, the position has a record only from 04/01/2006 to 09/15/2006, as vacant. When the record is extracted into BW, it creates the following entries in the master data table:
    Position   Valid To     Valid From   Position Vacant
    A          03/31/2006   01/01/1000
    A          09/15/2006   04/01/2006   X
    A          12/31/9999   09/16/2006   X
    The entry from 09/16/2006 to 12/31/9999 is incorrect, as the position does not exist for this duration. If we report on 0HRPOSITION with key date 09/30/2006, it shows position A as vacant even though the position no longer exists.
    Has anyone come across this situation? Any help is greatly appreciated.
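    To make the mismatch concrete, here is a minimal Python sketch of the intervals (illustrative only, using the dates above):

        from datetime import date

        # Illustrative sketch only: expected vs. extracted validity intervals,
        # stored as (valid_from, valid_to, vacant_flag).
        expected = [
            (date(1000, 1, 1),  date(2006, 3, 31), ""),   # before the position existed
            (date(2006, 4, 1),  date(2006, 9, 15), "X"),  # vacant until delimited
        ]
        extracted = expected + [
            (date(2006, 9, 16), date(9999, 12, 31), "X"), # spurious open-ended interval
        ]

        # A key-date lookup on 09/30/2006 hits the spurious interval and
        # reports the position as vacant although it no longer exists:
        key_date = date(2006, 9, 30)
        print([iv for iv in extracted if iv[0] <= key_date <= iv[1]])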
    Kamal
    P.S.: Milind Rane, I was searching through the forums and came across your post. I would appreciate it if you could let me know how you solved this issue.
    Message was edited by:
            Kamal K

    Hi KK,
    I have a similar issue. Could you please let me know how this issue with the 0HRPOSITION_ATTR extractor was resolved? In my case, I get incorrect data in BW when reporting is done. Through the extractor, up to the PSA, the correct records come in, but in the 0HRPOSITION master data I have incorrect records.
    Please help.
    Thanks
    Hari
    Message was edited by:
            SAPCOOL

  • Issue with the Posting Date of the Purchase Order.

    Hi All,
    There are fields in BW like SSL1: Time OK, SSL2: Qty OK, SSL3: Time & Qty OK, SSL4: Days Late (routines are written to calculate them). These fields indicate whether the delivery against a GR is OK with respect to time, quantity, and the number of days late.
    The issue I am facing is this:
    If there is only one delivery/GR against a single item, the calculations in BW are correct, i.e., for a particular PO with only one delivery, fields like SSL1: Time OK and SSL2: Qty OK show that the delivery was made within the specified time and everything is OK (when it was in fact delivered within the allotted time).
    But if there are multiple deliveries or multiple GRs posted for one PO item, the calculations go wrong, i.e., even if the delivery is made well within the specified time, the fields wrongly show it as delivered too late, because the earlier dates are overwritten.
    Can anyone shed some light on how I can go about solving this issue?
    I am thinking of declaring the Posting Date as a key field of the DSO (as of now it is a data field), and I would also like to know the impact of making it a key field.
    Thanks in advance,
    Prasapbi

    Hi,
    As I understand it, you have a DSO based on purchase orders, and your key fields are the PO and its line item. The problem, as you stated, will always be there if multiple deliveries/GRs are created for a single line item, because the system overwrites entries with the same key.
    The problem with adding the posting date as a key field is that your key then becomes PO - PO line item - date. When the PO is created, the posting date will be blank (correct me here if I am wrong), so you will get two entries for the same PO/line item combination: one without a date and one with a date, which again would be incorrect. And if my assumption about the posting date is wrong, your data may still not be correct, because you may have many entries with the same posting date, which again would overwrite each other.
    If there were a direct link between the PO line item and the number of deliveries that will be created for it, you could bring that field into the DSO as a key field, but I don't think any such field exists.
    Looking at your report requirement, I would suggest that you build a DSO based on goods receipts and then calculate these key figures by comparing the GR posting date with the PO line item date.
    Alternatively, you can change the way your DataSource works (if it is a generic one based on a function module). Since your main requirement is to check whether the GR posting date has met your SLA, you should fetch all the details only when a GR is created, and make your key PO - PO line item - GR.
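    To illustrate the overwrite behaviour, a minimal Python sketch (assumed field names, not actual DSO code):

        # Illustrative sketch only: standard DSO semantics overwrite records
        # that share the same key, so the second GR replaces the first.
        records = [
            {"po": "4500000001", "item": "10", "gr": "GR1", "gr_date": "2010-10-01"},
            {"po": "4500000001", "item": "10", "gr": "GR2", "gr_date": "2010-10-20"},
        ]

        dso_po_item = {}
        for r in records:                          # key: PO + line item
            dso_po_item[(r["po"], r["item"])] = r
        print(len(dso_po_item))                    # 1 -> the first GR's dates are lost

        dso_po_item_gr = {}
        for r in records:                          # key: PO + line item + GR
            dso_po_item_gr[(r["po"], r["item"], r["gr"])] = r
        print(len(dso_po_item_gr))                 # 2 -> every delivery is kept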

  • Issues with 4.1 Data Upload

    I have some issues with the new data upload feature in 4.1. I had already built the pages and the data loading, and it worked perfectly. Now a new column has been added to my data loading table, and I have to add a new table lookup for this column. Everything is set up as it has to be; I have four table lookups. When I try to upload some data and map the table columns to the data, I always get the error: "Failed to retrieve the lookup value". There is no problem when I do a data load where only one column has to be retrieved from the lookup table; when data has to be retrieved from more tables for more columns, I always get the FAILED message. Does anyone know the cause of this? I already tried building a totally new data load, but that also failed to do the job.

    Hi Ellenvanhees,
    I don't think the number of lookups you defined is the issue here. If possible, share more details about your data and tables; the few things that come to mind are probably in your data.
    Since you are able to do the lookups one by one without a problem, I think your upload is failing due to null values. The current data upload feature returns a failed lookup even if the value to be uploaded is null. This is bug #13582661, and it has been fixed for a future release.
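    To illustrate the failure mode, a minimal Python sketch (assumed lookup table, not the actual upload code):

        # Illustrative sketch only -- not the APEX implementation.
        lookup = {"NEW YORK": 1, "LONDON": 2}

        def resolve_41(value):
            # 4.1 behaviour per bug 13582661: a null source value is treated
            # as a failed lookup instead of being passed through as null.
            if value is None:
                raise LookupError("Failed to retrieve the lookup value")
            return lookup[value]

        def resolve_fixed(value):
            # Patched behaviour: null passes through; unknown keys still fail.
            return None if value is None else lookup[value]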
    Patrick

  • Issue with setting Planning Data Source

    I keep getting an error when I try to set up the data source.
    Oracle database = orcl
    Database schema = EPMAPLAN
    Password = EPMAPLAN
    When I log on to the Oracle database, I use sys as SYSDBA with password Password01.
    So what should the values be?
    Data Source Name = ORCL (correct?)
    Server = win-392h1l307n1 or localhost
    Database = ORCL
    User Name = ?
    Password = ?
    I have tried all the combinations.
    Please advise

    Duplicate post - Issue with setting up planning data source
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Issues with processing data in the Adapter Engine

    Dear experts,
    We are facing the below-mentioned issue while processing data. Please let us know what we can do to resolve it:
    "Error while reading ID of own business system from the SLD for system DX1 and client 500 Internal error in Landscape Directory"
    Regards
    Shankar

    Hi,
    As Aashish suggested, you can go through that.
    It seems like a problem with the SLD; it might be wrongly configured. Please check it with transaction SLDAPICUST.
    Regards,
    P.Rajesh

  • Issue with extraction of data to Delta Queue

    Hi All,
    Background: one LO DataSource each from applications 08, 11 and 12 was active in LBWE, with the update method set to queued delta. However, these DataSources were not used to pull any data into any of the global BI systems, and they did not exist in RSA7. There was a business requirement to enhance these DataSources, and other DataSources under these applications were also to be enhanced. There were entries in LBWQ, which used to get cleared at regular intervals by the standard SAP collector jobs. Since no delta was initialized for these extractors, no entries were passed on to RSA7.
    Problem: we did not clear the entries or take downtime before sending the transports, and now the collector job is dumping, saying the structures have changed, which is correct. I believe we won't require any of the data sitting in LBWQ, as none of the extractors had an entry in RSA7. This problem is happening only in our production system; we did not face it in any other system, even though entries existed in LBWQ there too.
    What is the solution? Is the only option to delete the queue from LBWQ and then run the collector job, or is there another way, and will that fix the problem?

    Hi Victor,
    As you said, you were not using the DataSources for applications 8, 11 and 12, so I believe you don't have any (historical) data in BW either. As you have enhanced these DataSources for the new requirement, I assume you have also changed the structures on the BW side to load data into the newly added fields.
    So in your case you need to replicate the DataSources in BW and then activate the transfer structures for all of them.
    Then flush out LBWQ for all three applications and fill the setup tables for them (this requires downtime in production to make sure you won't lose any records).
    Before the setup, check SM13 for blocked entries for these applications; if there are any, delete them before running the setup.
    Once the setup is done, run an init without data transfer and then run repair full loads in BW for the targets. (Activate your V3 job once the init without data transfer has taken place, for the regular deltas.)
    Thanks
    Assign points if this helps

  • Issue with time sheet data source

    Hi All,
    We are on ECC 5.0.
    I am trying to load CATS timesheet data into BW through DataSource 0CA_TS_1 (approved time). This DataSource fetches data from the CATSDB table.
    In CATSDB there is a good amount of data with status '30', i.e. 'Approved'.
    However, on running the RSA3 extractor checker, this DataSource returns 0 records.
    The most relevant SAP Note was 509592, but it is valid only for older versions.
    ECC HR is on level 10, support package SAPKE50010.
    This issue has been unresolved in previous threads.
    Please suggest any solution. It's urgent. Points for sure.
    Thanks
    Vishno

    Hi Guillaume,
    Thanks for the reply.
    I had a look at the above link.
    1) The link says we need to perform the steps below. Where do we need to run them (on the SQL Server or the AD server)? Sorry for asking a basic question.
       1. Click Start, click Run, type Adsiedit.msc, and then click OK.
       2. In the ADSI Edit snap-in, expand Domain [<DomainName>], expand DC=<RootDomainName>, expand CN=Users, right-click CN=<AccountName>, and then click Properties.
       3. In the CN=<AccountName> Properties dialog box, click the Security tab.
       4. On the Security tab, click Advanced.
       5. In the Advanced Security Settings dialog box, make sure that SELF is listed under Permission entries.
    2) The link talks about Windows Server 2000. Is this applicable to 2008?
    Regards
    Santosh

  • Issue with SAP demo data for SCPM content in BI system

    Dear Expert,
    "1558947 - SCPM 2.0: Loading Demo Data into SCPM"
    Based on SAP Note 1558947 above, we are performing the demo data loads for the SCPM data targets in the SAP BI system.
    However, the data is not available in any of the data targets: the upload is successful, but the validation fails.
    Has anyone experienced a similar situation? Please help resolve this issue.
    Thanks,
    Khader

    Hi Phani,
    We are trying to follow the above link and proceed accordingly.
    These are the steps we performed:
    1. We made sure that the DSO (Reporting UOM, 0SPM_DS32) and the process chain (DM: Reporting UOM, 0SPM_REP_UOM) are installed and active (as per SAP Note 1540655) in the BW system.
    2. Triggered the upload from the front end.
    3. When we check the job log in the BW system, the system triggers a separate process chain, 0ASA_P044, instead of the UOM process chain 0SPM_REP_UOM; hence the load to the data target is not successful.
    4. Below is the job log for the above run:
    Job started
    Step 001 started (program RSPROCESS, variant &0000000000136, user ID BWREMOTE_CRH)
    Start process TRIGGER 0ASA_P044_VSTART in run F0KTSA5KR590X4TUQ4G7ILGD0 of chain 0ASA_P044
    Event RSPROCESS with parameter BYLPP950MSCT56CM5C4QNL0N1 successfully triggered
    Job finished
    Could you please advise how we should proceed to correct this error?
    Thanks,
    khader

  • Issues with loading master data and hierarchy from BI InfoObject

    Hi,
    I was able to load the master data from the BI InfoObject successfully. It copies the text node as well, but it discards the space in the technical name when copying it to BPC. For example, if my text node's technical ID is "NODE 1", it is copied as "NODE1" on the BPC side, because of which my hierarchy load fails, as it cannot find "NODE 1".
    Why does it delete the space from the technical ID, and how can I resolve this issue?
    Do we have a provision to write an ABAP exit where I can check this?
    (PS: if I manually enter the space in my BPC dimension, the hierarchy loads successfully.)
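    To make the failure concrete, a minimal Python sketch (assumed member and node names, illustrative only):

        # Illustrative sketch only, with assumed names: the master-data copy
        # strips the space from the node ID, so the hierarchy file's parent
        # reference no longer matches any BPC member.
        bpc_members = {"NODE1"}                       # arrives without the space
        hierarchy_rows = [("NODE 1", "ACCOUNT_100")]  # (parent, child) as maintained in BW

        for parent, child in hierarchy_rows:
            if parent not in bpc_members:
                print(f"Hierarchy load fails: parent '{parent}' not found")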
    Thanks in advance,
    Diksha.

    Hi,
    You can check this in a start routine, which can be called from the transformation file using a BADI:
    http://help.sap.com/saphelp_bpc75_nw/helpdata/en/28/b66863b41f47589b9943f80b63def6/content.htm
    Hope this helps...
    regards,
    Raju

  • Oracle EPM 11.1.2 issue with system-jazn-data.xml & HIT entries

    I have been working on configuring Oracle EPM 11.1.2 and have one final issue from the diagnostic utility that I cannot figure out. The configuration sequence is as follows, and each step is installed in its own database:
    Step 1 - Foundation/Shared Services/Calc Mgr/EPMA/Essbase to a single relational DB. I am not configuring the web server until the final step.
    Step 2 - Hyperion Performance Scorecard
    Step 3 - Planning
    Step 4 - Profitability
    Step 5 - RA, and configure the web server.
    I have used both SQL Server Express 2008 and Oracle DB 11g and get the same result.
    When I complete the install, restart all of the services, and run the diagnostic utility, I get a failure in Foundation Services indicating that the file "system-jazn-data.xml" cannot be found. The error message provides no real help, and I have found none in the docs or on the web. I have searched the disk, and the file seems to be in the proper place per the docs. With partial configurations I do not get the error. I have compared the system-jazn-data.xml from a successful config with the one from the failed config, and they are identical; both files seem to be bloated with tens of thousands of lines, most of them blank.
    I had reached a point where I thought the issue was related to Performance Scorecard and removed that step, but I am now getting the error again.
    Is anyone else seeing this issue? Is it just a bogus message in the diagnostic report that can be ignored? Any other thoughts?
    Thanks
    EPMCloud

    Update: after going through the install many more times, I still do not know what the cause is, but I believe I have figured out how to resolve it. It appears that if you go back (after everything is installed and configured) and reconfigure the application server for Foundation Services, the issue is corrected.
    I am running some final tests now, and if I discover something different, I will update the post.
    EPMCloud

  • Issue with "Status of Data" information in BEx Analyzer report

    Hi BI gurus,
    One of the queries shows an older date for the "Status of Data" information in the BEx Analyzer report. I tried to correct it in BEx Analyzer by removing the existing text information element and adding a new text element in the designer for the query, but that did not work, as changes made to the query through BEx Analyzer are only saved as a local workbook rather than reflected back to the query. Please suggest some options to resolve this issue, and any idea how to correct the "Status of Data" in BEx Query Designer.

    Hi Aditya
    This is a common problem faced by users when reporting on a MultiProvider.
    In my project, what I did to overcome it was to run a fake DTP to the cube whose status was causing the problem.
    For example, if under the MultiProvider I have a planning cube that is only updated monthly while all the actuals cubes are updated daily, I create a DTP on the plan cube with an impossible selection condition (like fiscal year 2099). This brings 0 records into the planning cube (thereby not impacting the data) but updates the last load time.
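    To illustrate the mechanics, a minimal Python sketch (illustrative only, not BW code: the impossible filter selects nothing, yet the load status still moves forward):

        from datetime import datetime

        # Illustrative sketch only -- not BW code.
        cube = {"rows": [], "status_of_data": datetime(2010, 1, 31)}
        source = [{"fiscal_year": 2010, "amount": 100.0}]

        def fake_dtp(cube, source, fiscal_year=2099):
            picked = [r for r in source if r["fiscal_year"] == fiscal_year]
            cube["rows"].extend(picked)               # 0 records: data untouched
            cube["status_of_data"] = datetime.now()   # but the status date updates

        fake_dtp(cube, source)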
    Regards
    Anindya

  • In OSB, XQuery issue with large-volume data

    Hi,
    I am facing a problem with an XQuery transformation in OSB.
    In one XQuery transformation I compare all the records, and if there are similar records I club them under the same first node.
    I read the input file via the FTP process. This works perfectly for small input data. For large input data it also works, but it takes a huge amount of time, the file moves to the error directory, and I see duplicate records created for the same input data. I see nothing related to this file in the error log or the normal log.
    How can I check what exactly is causing the issue here: why the file moves to the error directory, and why I get duplicate data for large input (approx. 1 GB)?
    My XQuery is something like below:
    <InputParameters>{
        for $choice in $inputParameters1/choice
        (: records sharing this record's PRIMARYID :)
        let $withSamePrimaryID := $inputParameters1/choice[PRIMARYID eq $choice/PRIMARYID]
        (: records sharing this record's FIRSTNAME :)
        let $withSamePrimaryID8 := $inputParameters1/choice[FIRSTNAME eq $choice/FIRSTNAME]
        return
            <choice>{
                if (data($withSamePrimaryID[1]/ClaimID) = data($withSamePrimaryID8[1]/ClaimID)) then
                    (: club matching records under the first record's ClaimID :)
                    <ClaimID>{ data($withSamePrimaryID[1]/ClaimID) }</ClaimID>
                else
                    <ClaimID>{ data($choice/ClaimID) }</ClaimID>
            }</choice>
    }</InputParameters>

    Hi,
    I understand your use case is:
    a) read the file (from an FTP location, hopefully a .txt file),
    b) process the file (your XQuery; I will not get into the details),
    c) do something with the result (send it to a backend system via a business service?).
    You also noted that large files take a long time to process. This depends on the memory/heap assigned to your JVM, so I can say that is expected behaviour.
    On the other point, the file being moved to the error directory could be an error handler doing its job (if you have one).
    If there are no error handlers, look at the timeout and error-condition scenarios on your service.
    HTH

  • Facing issues with 3G mobile data usage: Current Period and per-app volumes differ

    Hi All,
    I am seeing a data mismatch on my iPhone: the overall mobile data usage volume and the per-app data usage volumes show different figures.
    For example, Mobile Data Usage - Current Period shows 342 MB, but if I add up the Use Mobile Data For volumes for all apps, including System Services and Uninstalled Apps, the total is much less (around 200 MB only) compared with the Current Period volume.
    So the Current Period figure and the per-app data usage figures do not match.
    Please let me know whether it is supposed to show like that, or whether it is an issue. If you want, I can share a screenshot as well.

    Hello kasinathadurai,
    Welcome to the Apple Support Communities!
    I understand that you have some questions about cellular data usage and apps that use cellular data. I would refer you to the attached article, which explains how data usage, call time, and app cellular data are calculated.
    Learn about cellular data settings and usage on your iPhone and iPad (Cellular Model) - Apple Support
    Have a great day,
    Joe
