Best practice for retraction of data from BPC 10.0 to General Ledger

Hi All,
I have the requirement to retract data from BPC 10.0 to General Ledger.
What is the best practice for doing this?
I have read the "How To... Retract data from BPC 7.5 NetWeaver to SAP ERP Cost Center Accounting, Part I", but this is prepared to transfer data to Cost Center accounting. Will General Ledger be populated automatically?
What is your opinion on this?
Best regards,
JA

Hi Rich,
In BPC 10 NW, the data entered in Input Forms has to be retracted to ECC.
For this retraction we are following the document at the link below:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c029accf-6d95-2d10-79aa-cc291180bf73?QuickLink=index&overridelayout=true&59180354379512
In this document, I am unable to find the details of class ZCL_BPC_RTRCT_SUPERCLASS. Since it is a Z (custom) class, it does not ship with the system, and I am unable to create it in my system.
Could you please help me?

Similar Messages

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project, we have separate installations for Dev, Test and Prod. i.e. Master repositories are not common between all the three. Now my code is ready in dev. Test environment is just installed with ODI and Master and Work repositories are created. Thats it
    Now, I need to know and understand what is the simple & best way to import the code from Dev and migrate it to test environment. Can some one brief the same as a step by step procedure in 5-6 lines? If this is the 1st time you are moving to QA, better export/import complete work repositories. If it is not the 1st time then create scenario of specific packages and export/import them to QA. In case of scenario you need not to bother about model/datastores. keep in mind that the logical schema name should be same in QA as used in your DEV.
    Some questions on current state.
    1. Do the id's of master and work repositories in Dev and Test need to be the same?It should be different.
    2. I usually see in export file a repository id with 999 and fail to understand what it is exactly. None of my master or work repositories are named with that id.It is required to ensure object uniqueness across several work repositories. For more understanding you can refer
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical Architecture objects and context do not have an export option. What is the suitable alternative for this?If you are exporting topology then you will get the logical connection and context details. If you are not exporting topology then you need to manually create context and other physical connection/logical connection.
    Thanks,
    Priya
    Edited by: 948115 on Jul 23, 2012 6:19 AM

  • Best Practice for Initial Load Data

    Dear Experts,
    I would like to know the best practices, and the factors to consider, when performing an initial load.
    For example:
    1) requirements from business stakeholders for data analysis
    2) age of data needed to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    check these links:
    Master Data loading
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • What are best practices for managing my iphone from both work and home computers?

    What are best practices for managing my iphone from both work and home computers?

    Sync iPod/iPad/iPhone with two computers
    Although it isn't possible to sync an Apple device with two different libraries, it is possible to sync with the same logical library from multiple computers. Each library has an internal ID, and when iTunes connects to your iPod/iPad/iPhone it compares the local ID with the one the device normally syncs with. If they are the same you can go ahead and sync...
    I have my library cloned to a small 1 TB USB drive which I can take between home & work. At either location I use SyncToy 2.1 to update the local copy with the external drive; Mac users should be able to find similar tools. I can open either of the local libraries, or the one on the external drive, and update the media content of my iPhone. The slight exception is Photos, which normally connects to a specific folder on a specific machine, although that can easily be remapped to the current library if you create a "Photos" folder inside the iTunes Media folder, so that syncing the iTunes folders keeps this up to date as well. I periodically sweep my library for new files & orphans with iTunes Folder Watch, just in case I make changes at one location but then overwrite the library with a newer copy from the other. Again, Mac users should be able to find similar tools.
    As long as your media is organised within an iTunes Music or iTunes Media folder, in turn held inside the main iTunes folder that has your library files (whether or not you let iTunes keep the media folder organised), each library can access items at the same relative path from the library folder, so the library can be on different drives/paths on different machines. This solution ensures I always have adequate backups of my library, and I can update my devices whenever I can connect to the same build of iTunes.
    When working with an iPhone, earlier builds of iTunes would remove any file not physically present in the local library, even if there was an entry for it, making manual management practically redundant on the iPhone. This behaviour has been changed, but it will still only permit manual management with a library that has the correct internal ID. If you don't want to sync your library between machines on a regular basis, just copy the iTunes Library.itl file from the current "home" machine to any other you want to use, then clean out the library entries and import the local content you have on that box.
    tt2

  • Best practice for putting binary data on the NMR

    Hi,
    We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
    1. setContent()?
    2. addAttachment()?
    3. setProperty()?
    If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
    Thanks,
    Bruce

    setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().
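    For reference, here is a minimal sketch of the attachment approach, assuming the standard JBI API (javax.jbi.messaging) together with the JavaMail ByteArrayDataSource; the exchange handling and the "payload" attachment id are illustrative choices, not part of any spec:

    import javax.activation.DataHandler;
    import javax.jbi.messaging.MessageExchange;
    import javax.jbi.messaging.MessagingException;
    import javax.jbi.messaging.NormalizedMessage;
    import javax.mail.util.ByteArrayDataSource;

    public class BinaryMessageHelper {

        // Puts encoded bytes on the exchange as an attachment, leaving
        // setContent() free for the (optional) XML part of the message.
        public static void attachBinary(MessageExchange exchange, byte[] encoded)
                throws MessagingException {
            NormalizedMessage out = exchange.createMessage();
            DataHandler handler = new DataHandler(
                    new ByteArrayDataSource(encoded, "application/octet-stream"));
            out.addAttachment("payload", handler);  // "payload" is an arbitrary id
            exchange.setMessage(out, "out");
        }
    }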

  • BPC:NW - Best practices to load Transaction data from ECC to BW

    I have a very basic question about loading GL transaction data into BPC for a variety of purposes; it would be great if you could point me towards best practices/standard ways of building such interfaces.
    1. For planning
    When we are doing the planning for cost center expenses and need to make variance reports against the budgets, what would be the source InfoCube/DSO for loading the data from ECC via BW, if the application is:
    YTD entry mode:
    Periodic entry mode:
    What difference does it make to use the 0FI_GL_12 DataSource versus the 0FIGL_C10 cube or the 0FIGL_O14 or 0FIGL_D40 DSOs?
    Based on the data entry mode of the planning application, what is the best way to make use of the 0BALANCE or debit/credit key figures on the BI side?
    2. For consolidation
    Since we need trading partner detail, what are the best practices for loading the actual data from ECC?
    What are the typical mappings to be maintained for movement types with flow dimensions?
    I have seen multiple threads with different responses, but I am looking for the best practices and the scenarios you are using to load such transactions from the OLTP system. I would really appreciate some functional insight into such scenarios.
    Thanks in advance.
    -SM

    For planning, please take a look at the SAP Extended Financial Planning rapid-deployment solution: the G/L Financial Planning module. This RDS captures best-practice integration from BPC 10 NW to the SAP G/L. This RDS (including content and documentation) is free to licensed customers of SAP BPC, and it leverages the 0FIGL_C10 cube mentioned above.
      https://service.sap.com/public/rds-epm-planning
    For consolidation, please take a look at the SAP Financial Close & Disclosure Management rapid-deployment solution. This RDS captures best-practice integration from BPC 10 NW to the SAP G/L. This RDS (including content and documentation) is also free to licensed customers of SAP BPC.
    https://service.sap.com/public/rds-epm-fcdm
    Note: you will require an SAP Service Marketplace ID (S-ID) to download the underlying SAP RDS content and documentation.
    The RDS documentation discusses the how and why of the best-practice integration. You can also contact me directly at [email protected] for consultation.
    We are also in the process of rolling out the updated free 2015 training on these two RDS packages. Please register at the link below and you will be sent an invite.
    https://www.surveymonkey.com/s/878J92K
    If the link is inactive at some point after this post, please contact [email protected]

  • Best Practice for disparately sized data

    2 questions in about 20 minutes!
    We have a cache which holds approximately 80K objects, which expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high units set, giving ample space. But the data has a wide size range, from a few bytes to 30 MB, and everything in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A

    Hi A,
    It depends... if there is a relationship between keys and sizes, e.g. if this or that part of the key means that the value will be big, then you can implement a key partitioning strategy, possibly together with key association on the key, in a way that evenly spreads the large entries across the partitions (and have enough partitions).
    Unfortunately, you would likely not get a totally even distribution across nodes, because the number of entries is fairly small compared to the square of the number of nodes (btw, which version of Coherence are you using?)...
    Best regards,
    Robert
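    To make the key association idea concrete, here is a minimal sketch assuming Coherence's com.tangosol.net.cache.KeyAssociation interface; the parentId/large fields and the grouping rule are hypothetical and would have to reflect the real relationship between your keys and value sizes:

    import com.tangosol.net.cache.KeyAssociation;
    import java.io.Serializable;

    // A cache key for entries that are normally co-located by parent id.
    // Entries known to map to large values opt out of the grouping, so they
    // hash independently instead of piling onto the parent's partition.
    public class SizeAwareKey implements KeyAssociation, Serializable {

        private final String id;       // unique entry id
        private final String parentId; // grouping key for small, related entries
        private final boolean large;   // caller knows this value is big

        public SizeAwareKey(String id, String parentId, boolean large) {
            this.id = id;
            this.parentId = parentId;
            this.large = large;
        }

        @Override
        public Object getAssociatedKey() {
            // Small entries stay grouped with their parent; large ones spread out.
            return large ? id : parentId;
        }

        @Override
        public boolean equals(Object o) {
            return o instanceof SizeAwareKey && ((SizeAwareKey) o).id.equals(id);
        }

        @Override
        public int hashCode() {
            return id.hashCode();
        }
    }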

  • Best practices for submitting CF data to an AJAX page?

    Hi everyone,
    I've got a project I'm working on for work and have hit a little problem.
    I am extracting data from my database, and after each piece of data (just numbers, usually 10 chunks of numbers) I tack a "|" onto the end. Then I output the data to the page. Back on my AJAX-enabled page, I get the "responseText" from that page and split it up using JavaScript and the pre-inserted "|".
    This seems to work fine, but it is quite messy. Also, it would really be nice to be able to do sorting and various other operations on the data with JavaScript instead of having to rely on CF's icky code logic.
    Can someone please enlighten me as to best practices for this type of thing? I get the suspicion that I'll probably be using XML somehow, but I'd like your opinion.
    Thanks!

    Check out the Samples and Documentation portions of Adobe's Spry website for client-side use of JSON with Spry:
    http://labs.adobe.com/technologies/spry/home.html
    Here is a link to Adobe's Spry forums:
    http://www.adobe.com/cfusion/webforums/forum/categories.cfm?forumid=72&catid=602
    If you are using CF8 you can use the SerializeJSON function to convert a variable to JSON. You might also be interested in the cfsprydataset tag. CF8 documentation:
    http://livedocs.adobe.com/coldfusion/8/htmldocs/
    If you are using a previous version of CF there is third-party JSON support; you can find links at http://json.org.

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to build custom reports using PDPs & Project Plan data.
    What is the best practice for using "static/random data" (which is not available in the MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in a custom field (in MS Project 2013) or in PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a project-level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that project-level custom field's data in project views, I have made a task-level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows the supervisor name in Schedule.aspx.
    ============
    Question: I want that project-level custom field "Supervisor Name" in the My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx, but the data is not present; the column is blank.
    How can I get the data into the "My Work" views?
    Noman Sohail

  • Best Practice for Migration of BO from one server to another

    Hi All,
    I would like to know the best practice for migrating BO from one server to another.
    I have installed BO XI R2 on my server.
    Thanks,
    Anendu Bothra
    Edited by: Anendu Bothra on Mar 5, 2009 10:24 AM

    You need to copy your input and output file stores from the old server to the new server. By default these are located in the <Business Objects install path>\FileStore directory.
    Then you need to stop the CMS, right-click the CMS, click the Configuration tab, and then click Specify.
    Choose Copy, then click OK.
    Choose the version information for the source CMS database.
    Select the database type for the source CMS database, and then specify its database information (including host name, user name, and password).
    Select the database type for the destination CMS database, and then specify its database information (including host name, user name, and password).
    When the CMS database has finished copying, click OK.
    Once this process has completed, start the CMS and click Update Objects, located at the top of the CCM.
    I'd advise taking full backups beforehand.

  • Best practice for deleting multiple rows from a table , using creator

    Hi
    Thank you for reading my post.
    what is the best practice for deleting multiple rows from a table using a RowSet?
    For example, how can I execute something like
    delete from table1 where field1= ? and field2 =?
    Thank you

    Hi,
    Please go through the AppModel application which is available at: http://developers.sun.com/prodtech/javatools/jscreator/reference/codesamples/sampleapps.html
    The OnePage Table Based example shows exactly how to delete multiple rows from a data table...
    Hope this helps.
    Thanks,
    RK.
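    For the specific two-column delete in the question, a plain JDBC sketch (shown instead of the Creator-generated RowSet for brevity, and assuming an already-open Connection) could look like this; table1/field1/field2 are the names from the post:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    public class RowDeleter {

        // Deletes one row per (field1, field2) pair using a single batched,
        // parameterized statement: safe against SQL injection, fewer round trips.
        public static int[] deleteRows(Connection con, List<Object[]> keys)
                throws SQLException {
            String sql = "DELETE FROM table1 WHERE field1 = ? AND field2 = ?";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (Object[] key : keys) {
                    ps.setObject(1, key[0]);
                    ps.setObject(2, key[1]);
                    ps.addBatch();
                }
                return ps.executeBatch();
            }
        }
    }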

  • Best practice for Smartview when upgrading from Excel 2003 to Excel 2007?

    Does anyone know the best practice for Smartview when upgrading from Excel 2003 to Excel 2007?
    Current users have Microsoft Excel 2003 with Smartview 9.3.1.2.1.003.
    Computers are being upgraded to Microsoft Excel 2007.
    What is the best practice for Smartview in this situation?
    1. Do nothing with Smartview and just install Excel 2007.
    2. Install Excel 2007, then uninstall and reinstall Smartview.
    3. Uninstall Smartview, install Excel 2007, and then install Smartview.
    4. Something else?
    Thanks!

    We went with option 1 and it worked out fine. Be aware that SV runs noticeably slower in Excel 2007 than in 2003; many users were/are unhappy with the switch. We haven't tested SV v11 yet, so I'm not sure whether it improves performance with Excel 2007 (hopefully it does).

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1. Users would like to upload data from a flat file and subsequently view their reports.
    2. The SAP BW support team would not be involved in the data upload process.
    3. Users would not go to RSA1 and use InfoPackages & DTPs; hence, another mechanism for data upload is required.
    4. Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What are the best practices we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We receive the files from the web onto the application server, in a path dedicated to this process. The file placed on the server has a naming convention based on your project; you can name it. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this, the process chain triggers and loads the data from that particular path, which is fixed in the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    So this happens everyday.
    Rgds
    SVU123
    Edited by: svu123 on Mar 25, 2011 5:46 AM

  • Best practice for exposing internal data to external world?

    Currently we have our Internet server sitting in our corporate DMZ, taking website and web service requests from the outside world. Class libraries with compiled connection strings exist on that server. That server then has a connection through the firewall to the database server. I'm told that this is no longer the secure/recommended best practice.
    I'm told to consider having that Internet server make requests not of the database server, but rather of a layer in between (application server, intranet server, whatever) that has those same web UI methods exposed; then THAT server (being inside the firewall) connects to the database server.
    Is this the current recommended best practice for having external users interact with internal data? It seems like a lot of hoops: the outside person's app queries web UI methods on the Internet server, which in turn queries the same methods (duplicated) on the intranet server, which then talks to the database.
    I'm just trying to determine the simplest practice, but also what is appropriately secure for our ASP.NET applications and services.
    Thanks.

    IMO this has little to do with SOA and everything to do with DMZs. What you are trying to stop is the same communication protocol that accesses the web site also accessing the database. As long as you fulfil that, then great. WCF can help here because it helps with configuring the transport of the calls. Another mechanism is to use identities, but IMO it's easier to use firewalls and transports.
    http://pauliom.wordpress.com
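    To make the layering concrete, here is a minimal sketch of the in-between tier, written in Java with only JDK classes for brevity (the original stack is ASP.NET, where a WCF service would play the same role); the endpoint path, query, and connection details are hypothetical stand-ins:

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Intranet-side service: the DMZ web server calls this endpoint over HTTP,
    // and only this process (inside the firewall) ever talks to the database.
    public class IntranetDataService {

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/api/customers/count", exchange -> {
                String body;
                try (Connection con = DriverManager.getConnection(
                        "jdbc:sqlserver://dbhost;databaseName=Sales", "svc", "secret");
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM Customers")) {
                    rs.next();
                    body = String.valueOf(rs.getInt(1));
                } catch (Exception e) {
                    body = "error";  // never leak connection details to the DMZ tier
                }
                byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
                exchange.sendResponseHeaders(200, bytes.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(bytes);
                }
            });
            server.start();  // the DMZ server queries this URL, not the database
        }
    }

    The point of the pattern is that the compiled connection string lives only on this inner tier; the DMZ box knows nothing but the service URL, so the protocol that crosses the outer firewall is never the database protocol.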

  • Best Practice for creating Excel report from SSIS.

    I have a requirement to create an Excel report on a daily basis which pulls data from SQL. I have attempted to resolve this by creating a stored procedure to save the results in SQL, a template in Excel to hold the graphs & pivot tables, and an SSIS package to copy the data to the template.
    Problem 1: when the data turns up in Excel it is saved as text rather than numbers.
    Problem 2: when the data turns up in Excel it appends the data rather than overwriting it.
    I resolved problem 1 by having another sheet which converts the text to numbers (=INT(Sheet1!A1)).
    I resolved problem 2 by adding some VB script to my SSIS package which clears the existing cells before copying the data.
    The job runs fine; however, when I schedule the job to run overnight it complains "System.UnauthorizedAccessException: Retrieving the COM class factory for component with CLSID". A little googling tells me that running the client-side commands in my VB script (workSheet1.Range("A2:F9999").Clear(), workBook.Save(), workBook.Close(), etc.) from a server-side task is bad practice.
    So I am left wondering how people usually get around this problem: copying a SQL table into an existing Excel file and overwriting the data, without having the numbers turn up as text. My requirements are that the report must display pivot charts with selectable options and be automatically updated overnight.
    Help appreciated,
    Bish.
    Office 2013 on my PC, Office 2010 on the server, Windows Server 2008 R2 Enterprise, SQL Server 2008 R2.

    I think that the best practice in a case like this is to link the Excel file to a view, or directly to a table. That way you don't have to struggle with changing the template, with overnight packages, etc. If the data are too complex and the requirements too demanding, then I tend to create a cube and that's it: dashboard, graphs, and everyone is happy. In your case, if the request is not too complex, try not to use SSIS but instead build a view and point directly at SQL.
    SSIS is really strong for ETL, for running stored procedures that are too heavy, for scheduled cut-off runs, etcetera, etcetera, etcetera... I love it. But sometimes we need to find the easier solution.
    I hope this post helped you

Maybe you are looking for

  • OWB9.2 HP-UX 11.00 Installation

    Hi, I am trying to install the HP-UX 11.00 version of OWB9.2. At the stage where we specify the source path, name, and Oracle destination, I am getting the following error: "Cannot create the name. The name already exists in ORCA. Please select another

  • App to schedule backups to WebDAV?

    Looking for an app, please, to schedule backups to a WebDAV on a leased CPanel server. Saw "Twin" in App Store but looking for something less.

  • Total value for PO not displayed

    Hi, While creating a SSP PO, the total value is not displayed. Please suggest. Regards, Anand.

  • Are dashed grid lines possible without bitmap background?

    Hi, I am trying to get dashed grid lines onto my graph.  Having searched around, I suspect that I can only achieve this by setting the background as a bitmap that has the dashed lines on.  Is this correct? I think that this was addressed in the .NET

  • Elements Organizer shows different timestamps in JPEG and NEF files of the same photo

    Hi, I generally use the option of my Nikon D5000 to store pictures in NEF and JPEG format. While both files show the same time stamp under properties - as one may expect (click on a photo, press right mouse button, last entry of context menu "show pr