Data inconsistency

Dear experts,
Our client implemented SAP ECC 6.0 in 2006, and in 2011 they want to implement new profit centers in place of the old ones. I discussed this with the client, and they say they implemented document splitting back in 2006 itself. So I checked the entire document splitting implementation, and I found some things I want to share.
I have checked the following configurations:
1. Classify G/L Accounts for Document Splitting: balance sheet, revenue, and expense accounts are mapped properly.
2. Define Zero-Balance Clearing Account: the zero-balance account is mapped properly.
3. Define Document Splitting Characteristics for General Ledger: here I noticed that the profit center is mapped with zero balance, but the Profit Center field is not yet mandatory.
4. Activate Document Splitting: the document splitting checkbox is not yet activated.
In this scenario I am going to implement the new profit centers, make the Profit Center field mandatory, and activate the document splitting checkbox. My question is: if I do these things, will it create any data inconsistency in the future? Please advise.
Regards
     Ajeesh

Dear Ajeesh,
There is no inconsistency. If you do not set the profit center as mandatory, you cannot see proper reports by profit center; if you select the profit center as mandatory, you can see the reports in a proper manner. The document splitting concept was introduced for reporting purposes.
Regards
shankar

Similar Messages

  • Sample of cost/benefit analysis of data warehouse

    Greetings!
    I am currently doing research on data warehouses; kindly give me a sample of a cost/benefit analysis.
    Looking forward to your immediate response.
    E-mail me at "[email protected]".
    Thank you and God bless!
    Very truly yours,
    Benedict


  • Need tp command to add transport requests to the import queue all at once

    Hi SAP gurus,
    Can you tell me the OS-level or tp command to add transport requests to the import queue all at once, for all the requests available in the cofiles directory? I have to add 500-600 requests, and I can't do STMS_IMPORT -> Extras -> Other Requests -> Add for every single request.
    Thanks and regards
    babu

    Hi,
    You can export the list of all transport requests from your DEV system in one shot: SE09 -> Display (provide your required selection criteria) -> Goto -> "Display as List".
    You can export all these transport requests as a spreadsheet, then filter out your required 400 to 500 transport requests.
    Now use another Excel sheet to create a file with the list of the required transport request numbers only, then concatenate the columns to generate the required tp commands as mentioned below.
    The resulting commands will look as follows:
    tp addtobuffer DEVK900123 <SID> pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp addtobuffer DEVK900124 <SID> pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp addtobuffer DEVK900125 <SID> pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp addtobuffer DEVK900126 <SID> pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp addtobuffer DEVK900127 <SID> pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp import DEVK900123 <SID> client=### pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp import DEVK900124 <SID> client=### pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp import DEVK900125 <SID> client=### pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp import DEVK900126 <SID> client=### pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    tp import DEVK900127 <SID> client=### pf=<Drive>:\usr\sap\trans\bin\TP_DOMAIN_<SID>.PFL
    -> Open Notepad, copy and paste these commands, and save the text file in the <Drive>:\usr\sap\trans\bin\ folder.
    -> Then rename this .txt file to *.bat.
    -> Open a command prompt, change to the <Drive>:\usr\sap\trans\bin\ folder, and run the .bat batch file.
    Make sure that the object lists of all the transport requests are checked and confirmed for transport, to avoid data inconsistency in the target system (e.g. number range objects).
    Regards,
    Bhavik G. Shroff
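    The spreadsheet concatenation above can also be scripted. Here is a minimal sketch in Python; the SID, client, profile path, and request numbers are placeholders, just as in the answer above, and would need to be replaced with your own values:

```python
# Generate "tp addtobuffer" and "tp import" command lines for a list of
# transport requests, mirroring the spreadsheet approach described above.
SID = "QAS"          # placeholder system ID
CLIENT = "100"       # placeholder target client
PF = r"D:\usr\sap\trans\bin\TP_DOMAIN_QAS.PFL"  # placeholder profile path

def tp_commands(requests):
    """Return addtobuffer lines for all requests, then import lines."""
    lines = [f"tp addtobuffer {r} {SID} pf={PF}" for r in requests]
    lines += [f"tp import {r} {SID} client={CLIENT} pf={PF}" for r in requests]
    return lines

batch = tp_commands(["DEVK900123", "DEVK900124"])
print("\n".join(batch))
```

    Writing the result to a .bat file in the trans\bin folder then gives you the same batch file the answer builds by hand in Excel.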

  • Want to know the material documents for the stock-in-transfer quantity in MARD

    Hi Experts,
    I found a quantity of 10 in stock in transfer (storage location) in the MARD table for a material. I want to know which material documents were posted against it individually, because:
    when I do MB1B with movement type 313 for that material, the quantity is added to the existing quantity, with no detail of when and how much was posted into MARD;
    when I use MSEG to fetch the material documents posted with movement type 313 for that material, it shows many material documents with 313, amounting to more quantity than in MARD.
    My question: shouldn't the stock-in-transfer (sloc) quantity equal the total quantity of the material documents? Please help me find them individually.
    Thanks
    Srinvas

    Thanks for the reply, Kumar.
    Actually I have generated a report pulling the material documents posted with movement type 313, ignoring the corresponding documents with 314 and 315, using MARD and MSEG. I will use these material documents in further transactions like MBSU.
    Now my report pulls all the material documents with 313 for a particular material and plant combination when I execute it without any date range. But when I compare with MARD, the stock-in-transfer (sloc) quantity is 3 (for example), while my report shows documents with more quantity (7 material documents with a total quantity of 10, for example), causing a data inconsistency.
    Can you please help me pull only the material documents related to the stock-in-transfer (sloc) quantity?
    Thanks
    srini
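    The reconciliation described above can be sketched outside SAP. As a rough illustration only (the movement-type semantics and the flat extract format are assumptions based on this thread, not verified against a real MSEG extract): net the 313 quantities against the 314 cancellations and 315 receipts per material/plant/storage location, and the remainder should match the MARD stock-in-transfer quantity.

```python
from collections import defaultdict

# Hypothetical MSEG extract: (material, plant, sloc, movement type, quantity).
# Assumed here: 313 puts stock into "in transfer (sloc)", 314 cancels a 313,
# and 315 receives the stock into the target storage location.
movements = [
    ("MAT1", "1000", "0001", "313", 4),
    ("MAT1", "1000", "0001", "313", 6),
    ("MAT1", "1000", "0001", "315", 7),  # already received
]

def open_stock_in_transfer(rows):
    """Net 313 quantities against 314/315 per material/plant/sloc."""
    net = defaultdict(float)
    for mat, plant, sloc, bwart, qty in rows:
        key = (mat, plant, sloc)
        if bwart == "313":
            net[key] += qty
        elif bwart in ("314", "315"):
            net[key] -= qty
    return dict(net)

print(open_stock_in_transfer(movements))
# {('MAT1', '1000', '0001'): 3.0}
```

    Note that this only reproduces the net quantity; matching the remaining quantity back to individual 313 documents still requires an assumption such as first-in-first-out or a reference-document link, which MARD alone does not provide.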

  • Cost benefit analysis for using XI 3.0

    Hi,
    Does anyone have a cost benefit analysis document for using XI 3.0 versus developing point to point interfaces ?
    I need something that can easily justify and sell the idea of using XI over going down the custom path for point to point development of interfaces.
    Kind regards
    Colin.

    Good day Colin.
    Your question is like trying to guess tomorrow's stock quotes: many parameters combine here, and not all of them are constants; in fact, most of them are variables. The issues vary, for example:
    1. The cost of the COBOL programmers needed to support and maintain your current AS/400 custom applications and interfaces.
    2. The number of systems you want to integrate into your organization's business scenarios.
    3. The monitoring capabilities currently available in your organization's "spaghetti" architecture.
    4. Predicting the growth of your company and the addition of business partners' systems.
    5. Trying to estimate the reuse of already-written XI interfaces (not possible...).
    6. The duplicated data and data inconsistency in your IT environment before XI and after XI.
    So, my friend, I believe this imaginary generic TCO document can't be created.
    Good luck with your project
    Nimrod
    If it helps you: I recently asked an ABAP programmer how long it would take him to develop a simple File-to-IDoc scenario. He said approximately 8 days, including design documents and testing. I believe I could do the first one in 5 days, and the second in 50% of that time.
    Message was edited by: Nimrod Gisis

  • Project Plan for KM/ Cost Benefit Analysis document for KM

    Is there a template Project Plan Document available for KM?
    Is there a Cost Benefit Analysis document available for KM, that lists the advantages of KM, things like what are the Cost Benefits of Implementing KM? etc.
    Any help is greatly appreciated.
    Regards,
    Bharath


  • Inconsistent data in RFC

    Hi experts!
    I am facing the following problem: I have created a Web Dynpro Java application that calls an RFC. The RFC returns a table. When I run the RFC directly in the R/3 system, it gives me a certain number of entries (e.g. 48), but when the RFC is called from the Web Dynpro application, it gives me one entry less; to be more precise, the last entry is missing.
    Any ideas?
    Please help!
    Thanks in advance!

    Hi,
    Are you displaying the values in a table after selecting a few rows in another table, I mean based on some condition like isMultiSelected(), and then populating the table? Or have you just applied the table template to a model node?
    If you display based on a condition like isMultiSelected(), the last record is never displayed. If this is the case, change the condition to isSelected() and check.
    Regards,
    Satya.

  • Error during data load due to special characters in source data

    Hi Experts,
    We are trying to load billing data into BW using the billing item DataSource. It seems that there are some special characters in the source data. When a record with these characters is encountered, the request turns red and the package is not loaded even into the PSA. The error we get in the monitor is something like
    'RECORD 5028: Contents from field **** cannot be converted into type CURR',
    where the field **** is a key figure of type currency. We managed to identify the said record in RSA3 on the source system and found that one of the fields contains some invalid (special) characters that show up as squares in RSA3. The data in the rest of the fields, including the fields mentioned in the error, looks correct.
    Our source system is a non-Unicode system, whereas the BW system is Unicode-enabled. I figure that the data in the rest of the fields is getting misaligned due to the presence of the invalid characters in the above field. This was confirmed when we unassigned the field with the special characters from the transfer rules and removed the source field from the transfer structure: after doing this, the data was loaded successfully and the request turned green.
    Can anyone suggest a way to either filter out such invalid characters from the source data, or make some settings in the BW system so that the special characters are no longer invalid? We cannot write code in the transfer rules because the data package does not even reach the PSA. Is there any other method to solve this problem?
    Regards,
    Ted

    Hi Ted,
    I was wondering: whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, as currencies are defined by SAP and the 3-character currency code is plain English.
    Could this be because of some inconsistency in the data? I would like to know which currency had the special characters in that particular record.
    Hope that helps.
    Regards
    Mr Kapadia
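    One pre-processing option, if the extract can be staged as a flat file before the load, is to strip non-printable characters from each record before BW sees it. This is a minimal sketch under that assumption; the record layout is hypothetical, and in-system settings for permitted characters (such as the BW transaction RSKC, which covers characteristic values) are the alternative route:

```python
import string

# Characters we keep: printable ASCII (letters, digits, punctuation,
# whitespace). Control bytes, which show up as squares in RSA3, are dropped.
PRINTABLE = set(string.printable)

def clean_record(line):
    """Drop characters outside the printable ASCII range."""
    return "".join(ch for ch in line if ch in PRINTABLE)

dirty = "4711\x00\x1a;100.00;EUR"   # hypothetical record with control bytes
print(clean_record(dirty))
# 4711;100.00;EUR
```

    Applied line by line to the staged file, this prevents the field misalignment described above, since the stray bytes never reach the CURR conversion.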

  • Utilizing SuccessFactors PMS solution without saving any employee master data in SF

    Hi Friends,
    We're implementing SuccessFactors' Performance Management solution at only one of the client locations, in Saudi Arabia, and we are trying to integrate it with on-premise SAP HCM and push the employee data to SF.
    However, due to security concerns, the customer doesn't want to store any data in the SuccessFactors cloud solution to run the PMS cycle.
    During the purchase of the SuccessFactors licenses, SAP demonstrated to the client that a unique ID will be created in SF for each employee, which will not be the actual user ID (PERNR) from SAP; the actual employee data will be tagged/linked to this unique ID in SF, fetching the data from the SAP on-premise system to run the PMS cycle. Once the user logs in to SF, the data will be fetched from the SAP on-premise system in real time based on the unique ID, and when the user logs out of SF, the data should be cleared/deleted; nothing will be saved or stored in SF with respect to employee master data.
    I understand SAP has provided a standard add-on for the employee data scenario between SuccessFactors and SAP HCM, which could have been leveraged here for integration; however, with the above limitation this solution is out of scope.
    Please advise if anyone has faced such a requirement.
    Thanks,
    Farhan

    Dear Farhan,
    I'm with Luke on this. This is against anything we know about SAP and SF.
    However, taking a step back, I would really, really urge you to trigger a strategy discussion with your HR, IT and data privacy leaders.
    Let me be very blunt here:
    Even if such a process could be made possible at considerable cost and inconvenience, it is inconsistent at its very core.
    If your strategy and culture are anti-cloud, there is a simple strategy execution: don't use cloud (yet).
    If your strategy is cloud, but your culture is anti-cloud (if I had to make a guess, that would be it), what you will see is your culture having your strategy for breakfast very soon, if I may use this old truism. I.e. it's not going to work.
    To be cloud-ready, culture, strategy and capabilities must be in line. Pushing capabilities by building very complex solutions will get you into a complexity spiral, which will eventually get the better of you.
    The decision to use cloud for such a key process (which soon, if not today, will probably be the right decision for most organisations) can't just be triggered by someone appreciating some screens being a bit prettier than in on-premise applications. The foundation needs to be there first. Your organisation has the opportunity to use the Performance Management process to align culture and strategy with the changing business and technological environment - but to do so, you need to acknowledge that the organisation is currently NOT cloud-ready and that this needs changing.
    Some good, provocative questions to start with could be:
    Why are some organisations (put in some names of well renowned companies, ideally leading organisations in your industry, using cloud HCM) successfully using cloud HCM, but we think we can't?
    Which advantages of cloud services are left, if we have a very complex custom solution built to avoid storing data in the cloud?
    Can our IT team demonstrate that they have better expertise in protecting data in systems accessible through the internet (and your SAP system will have to be accessible in some way, if SF is supposed to access the data) than world-leading organisations like SAP and SuccessFactors?
    What is our roadmap for HR systems in the future and how does this weird architecture lead us along that road?
    Do the relevant decision makers actually understand, what cloud in general and Software as a Service in particular are?
    You may find it difficult to find an employee ready to ask your CIO, HR Director, and Data Privacy Lead these questions, unless they want to be fired anyway. So, a third-party change agent may come in handy for that task.

  • I want to update the Custom table using the data available in ITAB.

    Hi,
    I want to update a custom table (Z-table) created by me, using the data available in an internal table (which I populated from different standard tables).
    How is it possible to update the custom table using the ITAB data?
    Can I use MODIFY?
    DPK.

    Here is an example of modifying a database table from an internal table. Advice: before updating the database table, lock it to avoid inconsistency, write the logic for modifying, modify the table, and finally unlock it.
    * Lock the table for further operations
    CONSTANTS: lc_tabname TYPE rstable-tabname VALUE 'FKKVKP'. "FKKVKP
    CALL FUNCTION 'ENQUEUE_E_TABLE'
      EXPORTING
        tabname        = lc_tabname
      EXCEPTIONS
        foreign_lock   = 1
        system_failure = 2
        OTHERS         = 3.
    IF sy-subrc EQ 0.
    * Fetch all the contract accounts for customers of the segment
    * Households/SME.
      PERFORM fetch_contract_accounts USING lc_tabname.
    * Modify the database table as per the new dunning procedure
    * from the internal table
      MODIFY fkkvkp FROM TABLE lt_fkkvkp.
    ENDIF. "IF sy-subrc EQ 0.
    * Unlock the table
    CALL FUNCTION 'DEQUEUE_E_TABLE'
      EXPORTING
        tabname = lc_tabname.

  • BP: can't edit "user name" field in section "employee data"

    Transaction BP, tab Identification, change in BP role Employee:
    why can't I edit the "User Name" field in the "Employee Data" section?
    The field is unchangeable for only one out of ~800 BPs.

    Earlier it was possible to change the User ID field.
    I guess this might be your problem: if this BP is used somewhere else, I mean in some transaction/CRMD_ORDER/support message, it might no longer be possible to change it, because the log would become inconsistent.
    For example, if in ticket XXXX BP no. 10 has been assigned as the reporter, and you later assign no. 10 to some other person or ID, then the log of CRMD_ORDER or the ticket becomes inconsistent, because the BP is now assigned to a different person who never worked on it.
    Actually, I faced this question during one implementation: why are we able to change the User ID of an existing BP even if it is used in some transaction/message? This might be a new correction in the recent release.
    Regards
    Prakhar
    Edited by: Prakhar Saxena on Jan 16, 2009 10:10 AM

  • Segmentation fault error during data load in parallel with multiple rules

    Hi,
    I'm trying to do a SQL data load in parallel with multiple rules files (4 or 5 rules, maybe), and I'm getting a "segmentation fault" error. I tested with 3 rules files and it worked fine. We're using Essbase System 9.3.2, with UDB (v8) as the SQL data source; the ODBC driver is DataDirect 5.2 DB2 Wire Protocol Driver (ARdb222). Please let me know if you have any information on this.
    thx.
    Y


  • I installed Windows 8 and unfortunately all drives were formatted and merged; now I have only one drive, C. I want my BitLocker drive's data back

    Last Sunday I installed Windows 8, and this installation formatted all my drives and made them one drive (drive C). Before all of this I had 5 drives of different sizes, including 1 BitLocker-protected drive.
    So I tried data recovery software to recover my data, and I got back all my data except that BitLocker-protected drive.
    So please, guys, help me: how can I get back the data from BitLocker-protected drives?
    Please, please help me.

    Hi,
    I am sorry for your experience, but as of now there is no way to recover data encrypted by BitLocker.
    BitLocker is designed to make the encrypted drive unrecoverable without the required authentication. When in recovery mode, the user needs the recovery password or recovery key to unlock the encrypted drive. Therefore, we highly recommend that you store the recovery information in AD DS or in another safe location.
    Please refer to the link below for the statement:
    http://technet.microsoft.com/de-de/library/ee449438(v=ws.10).aspx#BKMK_RecoveryInfo
    Roger Lu
    TechNet Community Support

  • MB5B report: tables for opening and closing stock by date

    Hi Friends,
    I am trying to get the values of opening and closing stock by date from the tables MARD and MBEW (Material Valuation), but they do not match the MB5B report.
    Could anyone suggest the correct tables to fetch the opening and closing stock values by date for the MB5B report?
    Thanks
    Mohan M

    Hi,
    Please check the below links...
    Query for Opening And  Closing Stock
    Inventory Opening and Closing Stock
    open stock and closing stock
    Kuber

  • Refreshing the data of an embedded view in a view container

    Hi everybody
    I would like to know how to refresh all the data of a view inside a view container. The problem is:
    I have a window with a view, which in turn contains a view container. The main view shows editable data elements; when I select one element of the main view, the view container shows a list (dependencies) for that element. But only the first time I choose an element does it load the correct data; when I choose another one, it shows the same old data and does not call the wdDoInit() method again.
    The question is:
    How do I force the view to refresh all the data, or call the wdDoInit() method again?
    Thank you for your help

    Aida,
    Let's say you have two components, C1 and C2, and you want a method from C1 to be available in C2. Then follow these steps:
    1) Go to the Interface Controller of C1 and create a method there, let's say doSomething.
    2) Then go to C2. There you can see Used Web Dynpro Components -> right-click Add Used Component -> give some name, say C1Comp -> click Browse and select C1 -> click Finish.
    3) Next go to the Component Controller of C2 -> Properties -> click Add and check whether C1 is added. If not, select the checkbox and select OK.
    4) Now go to the Implementation tab of C2, and in, for example, wdDoInit you can write the following code:
    wdThis.wdGetC1CompInterface().doSomething();
    Chintan
