How to undo data upload in BPC

Hi All,
I have inadvertently uploaded an "Actual" data file in BPC with the wrong year, i.e. the Actuals data file should have 2008 as the year, but instead it has 2009 as the year.
Please suggest what would be the best way to fix it.
Is there any way to delete transaction data in BPC?
I also looked at the option of using journals to fix this issue, but every time I try to open an existing or new journal, I get a message that I do not have authorization for this module.
I am working in a sandbox/demo environment with only one user, and that user is the primary admin/system admin and has all access.
Can anyone throw some light on why I might be getting this error message?
Thanks in advance.
Best Regards.

Hi Anurag,
To delete the existing data, go to Data Manager, run a Data Manager package, and select the Clear package.
Delete your existing data based on the parameters you select (e.g. restrict the scope to the wrong year).
If you do not have authorization for journal posting, check the task profile in your security settings.
Also make sure the journal template has been created (via the journal wizard) before working with journals.
For your current issue, journal posting is not the correct option.
Regards
Naveen.KV

Similar Messages

  • Data upload from R3 to BPC using filters

    We are facing a performance issue on the uploading data flow from R3 to BPC. Two steps have been set up.
    Step1: R/3 --> ODS --> BW CUBE (No filters and full mode)
    Step2: BW  CUBE --> BPC CUBE (Standard BW upload package)
    The first one (from R3 to a BW cube) has no filters, so it loads all the data and takes too much time. However, the second one (from the cube to BPC) has filtering options, so we can load only the data we need (we usually use entity and time filters).
    Both are executed from the BPC DataManager.
    Any advice to improve the performance of the first step? Could we combine step 1 and step 2 and apply the filters in the standard BW upload process using a custom process chain?
    Does anybody have any experience using filters to reduce the amount of records on the data uploading from R/3 to BPC?
    BPC Version 7.50.15

    Hi,
    For example, in BI the data is stored as below:
    GroupAcc - Date - Balance
    100000 - 12.12.10 - 400
    I'm not quite sure how you are getting the above data from the InfoCube. Are you directly loading data from the PSA into the InfoCube without loading it into the DSO 0FIGL_O10 (General Ledger (New): Transaction Figures)?
    To get the required format of data, you do not need to do anything while loading the data into BPC. Your BI InfoCube should maintain the data in the format below.
    GroupAcc - Date - Balance
    100000 - 12.12.10 - 100 - (LC)
    100000 - 12.12.10 - 100 - (TC)
    100000 - 12.12.10 - 200 - (GC)
    To achieve the above format of data in the InfoCube, please follow the standard data flow given by SAP in BI Content (http://help.sap.com/saphelp_nw70ehp1/helpdata/en/e6/f16940c3c7bf49e10000000a1550b0/frameset.htm).
    The flow should be:
    R/3 --> PSA --> DSO --> InfoCube.
    In this flow it is important to remember the InfoObject 0CURKEY_TC in the DSO.
    This is always the currency key of the transaction currency; it is also filled for records with a different currency type. Without this key field, postings with different transaction currencies would be overwritten after summarization and thereby be lost.
    (http://help.sap.com/saphelp_nw70ehp1/helpdata/en/a8/e26840b151181ce10000000a1550b0/content.htm)
    Once you follow the standard flow, your InfoCube will contain the required format of data (LC, TC and GC) to load into the BPC application.
    Hope it helps.
    regards,
    Raju

  • Data Validation Before Upload to BPC

    Hi All,
    We pump trial balance data from our SAP system into BPC, with some breakdown to match the dimension combinations in BPC. One of the required steps is to have a checking procedure to see whether the downloaded data is balanced or not before we upload it to BPC.
    Would you mind sharing how best to do this check? If you have experience with how this validation works in your company, I would appreciate it very much if you could share it.
    Thanks a lot,
    Liam

    Hi Nilanjan,
    Thanks for replying.
    If we use the Business Rules function, we would have to upload the data to BPC first and let the business rules run the validation.
    Our objective is to make sure the figures in the flat file are correct (balanced from an accounting view) before we submit them to BPC.
    Thanks,
    Liam
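    A pre-load check like this can also be scripted outside of BPC. Below is a minimal sketch, assuming the extract is a comma-separated file on the local PC, the signed amount sits in the fourth column, and the first line is a header; the report name, file path, and column position are placeholders, so adjust them to your own layout. It simply sums the signed amounts and reports whether the file nets to zero before you hand it to the Data Manager import.

    REPORT z_check_trial_balance.

    DATA: lt_file   TYPE TABLE OF string,
          lv_line   TYPE string,
          lt_fields TYPE TABLE OF string,
          lv_field  TYPE string,
          lv_row    TYPE i,
          lv_amount TYPE p DECIMALS 2,
          lv_total  TYPE p DECIMALS 2.

    " Read the flat file from the presentation server (path is a placeholder).
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\temp\trial_balance.csv'
        filetype = 'ASC'
      TABLES
        data_tab = lt_file
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      WRITE: / 'File could not be read.'.
      RETURN.
    ENDIF.

    LOOP AT lt_file INTO lv_line.
      lv_row = sy-tabix.
      IF lv_row = 1.                                  " skip the assumed header line
        CONTINUE.
      ENDIF.
      SPLIT lv_line AT ',' INTO TABLE lt_fields.
      READ TABLE lt_fields INDEX 4 INTO lv_field.     " signed amount column (assumed)
      IF sy-subrc = 0.
        TRY.
            lv_amount = lv_field.                     " expects values like '1234.56' or '1234.56-'
            lv_total  = lv_total + lv_amount.
          CATCH cx_sy_conversion_no_number.
            WRITE: / 'Non-numeric amount in line', lv_row.
        ENDTRY.
      ENDIF.
    ENDLOOP.

    IF lv_total = 0.
      WRITE: / 'Trial balance nets to zero - file is safe to import.'.
    ELSE.
      WRITE: / 'File is out of balance by', lv_total.
    ENDIF.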

  • How to schedule Job for data uploading from source to BI

    Hi to all,
    How do I schedule a job for data uploading from a source system to BI?
    Why is it required, and how do we do it?
    As I am a fresher in BI, I need to know it from the bottom up.
    Regards
    Pavneet Rana

    Hi.
    You can create a [process chain|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/502b2998-1017-2d10-1c8a-a57a35d52bc8?quicklink=index&overridelayout=true] for the data loading process and schedule the start process for any time/date, etc.
    Regards.
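    If you later want to trigger such a chain from a program or a background job rather than only from its scheduled start process, a minimal ABAP sketch would look like the following. It assumes the standard BW API function module RSPC_API_CHAIN_START is available in your release; the chain name ZBPC_LOAD_CHAIN is a placeholder for your own chain's technical name.

    DATA: lv_logid TYPE rspc_logid.

    " Start the process chain right away; the periodic (daily/weekly) scheduling
    " can still be carried by the chain's own start process.
    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'ZBPC_LOAD_CHAIN'   " placeholder: technical name of your chain
      IMPORTING
        e_logid = lv_logid.

    WRITE: / 'Chain started with log ID', lv_logid.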

  • How do I schedule regular daily/weekly/monthly/quarterly data uploads?

    Hello gurus!
    How do I schedule regular daily/weekly/monthly/quarterly data uploads?  How can I make it "automatic"?
    Thank you very much!
    Philips

    Hi,
    There are lots of documents available on how to design a process chain. What is needed basically depends on your requirement.
    For example, you have daily master data loaded into BW from R/3, followed by the transaction data. So you create a chain in which you create a meta chain for the master data and drag in the Load InfoPackage option (give the variant as the InfoPackage name you created for that particular master data); do the same for texts and hierarchies, and for hierarchies use the Save Hierarchy option after the hierarchy process. Then use an Attribute Change Run process step to activate the master data. After that, create another meta chain that loads the transaction data. Normally you have an InfoPackage that loads data from the source system into an ODS, so use the InfoPackage process for that, then a process step to activate the ODS data, and then another InfoPackage to send the data to a cube. If you have aggregates built on that cube, use the Roll-up process to roll up the data.
    If you can give me your mail ID, I can send you some documents on process chains.
    Regards
    Srini

  • How to extract data from SAP 4.7 and upload data to SAP ECC 6.0

    Hi,
    How do I extract data from SAP 4.7 and upload the data to SAP ECC 6.0? Can I use BDC, BAPI, or LSMW? Please help.

    Hi,
    Both tasks are not possible simultaneously.
    If you want to do it as two separate tasks, then you can use the GUI_UPLOAD function module to fulfill your requirement.
    thanks
    mrutyun^
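    A rough sketch of the two separate steps is shown below; the file path is a placeholder and lt_data stands for whatever extract structure you use. Write the extracted records to a file on the SAP 4.7 side with GUI_DOWNLOAD, then read the file back into an internal table on the ECC 6.0 side with GUI_UPLOAD and post it with the BDC, BAPI, or LSMW object of your choice.

    DATA: lt_data TYPE TABLE OF string.   " filled by your extraction logic

    " Step 1 - on the SAP 4.7 system: write the extracted records to a local file.
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = 'C:\temp\extract.txt'   " placeholder path
        filetype = 'ASC'
      TABLES
        data_tab = lt_data.

    " Step 2 - on the ECC 6.0 system: read the file back into an internal table,
    " then feed it to your BDC session, BAPI call, or LSMW object.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\temp\extract.txt'
        filetype = 'ASC'
      TABLES
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      WRITE: / 'Upload failed.'.
    ENDIF.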

  • How to load master data in sap bpc 7.0

    Hi ,
      How do I load master data in SAP BPC 7.0? Please give me the steps as well.
    Thank you

    Hello Devi,
    There are three ways you can load master data in BPC:
    1) Copy and paste.
    Download the master data to a flat file, copy the master data and paste it into the dimension members, then process the dimension.
    2) Using an SSIS package - load a flat file into the BPC cube.
    3) Using a SQL command.
    Thanks.
    With regards,
    Anand Kumar

  • Hello! Please give some tips on how to use BAPIs for data uploading

    Hello!
      Please give some tips on how to use BAPIs for data uploading.
    regards,
    Arjun

    Hi,
    See the report extract below, where it_data holds the uploaded data:
    LOOP AT it_data INTO wa_data.
      line_count = sy-tabix.

      " Date validation: convert the uploaded date (DDMMYYYY) into internal format YYYYMMDD
      CONCATENATE wa_data-uplft_date+4(4) wa_data-uplft_date+2(2) wa_data-uplft_date+0(2)
             INTO wa_data-uplft_date.

      " Collect the vendor's purchasing documents that are valid on the posting date
      LOOP AT it_ekko_temp INTO wa_ekko_temp WHERE lifnr = wa_data-vendor.
        IF wa_ekko_temp-kdatb <= wa_data-uplft_date AND wa_ekko_temp-kdate >= wa_data-uplft_date.
          MOVE-CORRESPONDING wa_ekko_temp TO wa_ekko.
          APPEND wa_ekko TO it_ekko.
        ENDIF.
      ENDLOOP.

      LOOP AT it_ekko INTO wa_ekko.
        " Build the goods movement header
        wa_data_header-pstng_date     = wa_data-uplft_date.
        wa_data_header-doc_date       = sy-datum.
        wa_data_header-bill_of_lading = wa_data-bill_of_lad.
        wa_data_header-ref_doc_no     = wa_data-del_no.
        CONCATENATE wa_data-header_text1 '-'
                    wa_data-header_text2 '-'
                    wa_data-header_text3 '-'
                    wa_data-header_text4
               INTO wa_data_header-header_txt.

        IF wa_data-indicator = 'Y'.
          wa_data_item-material = '000000000000200568'.
        ELSE.
          wa_data_item-material = '000000000000200566'.
        ENDIF.

        " Collect item-level data for the matching purchase order items
        LOOP AT it_ekpo INTO wa_ekpo WHERE ebeln = wa_ekko-ebeln AND matnr = wa_data_item-material.
          wa_data_item-plant          = '1000'.
          wa_data_item-stge_loc       = '1001'.
          wa_data_item-move_type      = '101'.
          wa_data_item-vendor         = wa_data-vendor.
          wa_data-qnty                = wa_data-qnty / 1000.
          wa_data_item-entry_qnt      = wa_data-qnty.
          wa_data_item-po_pr_qnt      = wa_data-qnty.
          wa_data_item-entry_uom      = 'KL'.
          wa_data_item-entry_uom_iso  = 'KL'.
          wa_data_item-orderpr_un     = 'KL'.
          wa_data_item-orderpr_un_iso = 'KL'.
          wa_data_item-no_more_gr     = 'X'.
          wa_data_item-po_number      = wa_ekpo-ebeln.
          wa_data_item-po_item        = wa_ekpo-ebelp.
          wa_data_item-unload_pt      = wa_data-unload_pt.
          wa_data_item-mvt_ind        = 'B'.
          APPEND wa_data_item TO it_data_item.
          CLEAR wa_data_item.
        ENDLOOP.

        " First call in test mode: the BAPI only validates, nothing is posted
        CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
          EXPORTING
            goodsmvt_header = wa_data_header
            goodsmvt_code   = goodsmvt_code
            testrun         = 'X'
          TABLES
            goodsmvt_item   = it_data_item
            return          = return.

        READ TABLE return INTO wa_return WITH KEY type = 'S'.
        IF sy-subrc <> 0.
          DESCRIBE TABLE return LINES sy-tfill.
          IF sy-tfill = 0.
            " The test run returned no messages -> post the document for real and commit
            CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
              EXPORTING
                goodsmvt_header = wa_data_header
                goodsmvt_code   = goodsmvt_code
                testrun         = ' '
              TABLES
                goodsmvt_item   = it_data_item
                return          = return.
            CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
              EXPORTING
                wait = 'X'.
          ENDIF.
        ENDIF.

        " Log all messages returned by the BAPI
        LOOP AT return INTO wa_return.
          WRITE: / 'Message type', wa_return-type,
                 / 'ID', wa_return-id,
                 / 'Number', wa_return-number,
                 / 'Message', wa_return-message,
                 / 'Long text', wa_return-message_v1,
                                wa_return-message_v2,
                                wa_return-message_v3,
                                wa_return-message_v4,
                 / 'Failed at line', line_count.
        ENDLOOP.

        " Reset the per-document work areas and the item table
        CLEAR: wa_ekko, wa_ekpo, it_data_item[], wa_data_header.
      ENDLOOP.

      " Reset the per-record buffers before processing the next upload record
      CLEAR: wa_data, it_ekko[].
    ENDLOOP.
    Reward if useful!

  • SAP- BPC (MS Version 10.0) : How to get data from MS SQL !!!

    I would appreciate it if anyone could assist with uploading data from MS SQL (2005/2008) to SAP BPC (EPM 10.0). Thanks.

    Hi Deepak,
    Thanks for your email. Actually, we are not using SAP NW; our BaaN ERP database sits on MS SQL (2005). That is why we have purchased the SAP BPC 10.0 MS version.
    Really appreciate if you can assist to resolve this issue.
    Thanks,
    Nasar Khan

  • How to automate transactional data loads from BW to BPC 7.5

    Hi ,
    How can I automate the data loads from BW to BPC as nightly loads? If the data is coming into the BW cube as a delta, how can I handle that in BPC? I appreciate any inputs.
    Thanks,
    Vara

    There are several different ways to automate BPC processes, but I would recommend creating a Data Manager package (set all of the code in the prompt variable in the advanced DM options), then going to the BW back end and running program UJD_TEST_PACKAGE or UJD_TEST_PACKAGE_LINK (SE38). Create a variant for the needed program and DMP combination and call it from a process chain using the ABAP Program process type.
    There are several How-To articles describing this process
    [Automate MD Load|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00380440-010b-2c10-70a1-e0b431255827?quicklink=index&overridelayout=true]
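    For reference, once the variant exists, the ABAP Program process type in the chain effectively just runs something like the sketch below; ZBPC_NIGHTLY is a placeholder, so use the variant you saved for your own program/DMP combination.

    " Run the saved Data Manager package variant from a chain step or another program (sketch).
    SUBMIT ujd_test_package
      USING SELECTION-SET 'ZBPC_NIGHTLY'   " placeholder variant name
      AND RETURN.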

  • Fixed Assets - How to modify Accu.Dep. Amt. in G/L A/c (Data Uploaded Amt)

    Hi Experts,
    In my organization, in the last financial year (2006-2007), we loaded fixed assets. Our production client has been active since last financial year; our SAP system also went live in production only last year.
    So this is a case of modifying the uploaded amounts for the fixed assets accumulated depreciation G/L accounts in the last financial year (without a depreciation run).
    Now, we have found that we have to rectify (modify/correct) the amounts in the accumulated depreciation G/L accounts for some assets/asset classes, such as Accu. Dep. -> Computers and Accu. Dep. -> Furniture & Fixtures, because some discrepancies have been found.
    How can we modify the amounts in these accumulated depreciation G/L accounts? They are reconciliation accounts (assets) and so cannot be posted to directly.
    How can we post the rectified/modified amounts to these accumulated depreciation accounts, and that too in the last financial year?
    Please note that we cannot do a depreciation run, because it was only last year that we uploaded the legacy data into SAP.
    In summary, I want to modify/rectify the uploaded amounts of the accumulated depreciation G/L accounts in the last financial year, when our SAP production client went live.
    I will definitely award points with an open heart.
    It is now a production client.

    Hi
    Are the accumulated depreciation per asset accounting and the accumulated depreciation in the G/L balances matching right now? If so, how are you planning to make the changes in asset accounting if you are correcting only the G/L?
    Check the status of the company code for asset transfer - is it in status 1 (asset transfer not completed)?
    Besides, how are you going to handle the impact of the changes to accumulated depreciation for the past year - are you planning to open the periods of last year?
    I assume that you loaded the asset values last year (maybe 12/31/2006), posted the G/L balances as of 12/31/2006, and carried forward the G/L balances to 2007.
    Please provide some details so that we can take it from there.
    Thanks
    Satya

  • How to extract data from BPC InfoCube via ABAP program?

    Hi experts!!
    I tried to extract data from a BPC InfoCube via an ABAP program, but I did not succeed.
    I used the function module 'RSDRI_INFOPROV_READ' to extract data from standard InfoCubes such as '0COPC_C07' and it ran OK! However, when I change the InfoCube name to '/CPMB/WAIX8NE' (a BPC InfoCube), everything goes wrong...
    Is there any difference between extracting data from BPC and standard InfoCubes?
    Thank you all!

    Moderator message - Welcome to SCN.
    But please do not cross and duplicate post.
    Thread locked.
    Rob

  • How to Load External Data to SAP BPC using SAP BO Data Services

    Hi,
    We want to load data from an external MS SQL Server DB to SAP Business Planning and Consolidation (BPC 7.5 Netweaver version) using SAP BO Data Services 4.0 as ETL. What is the best way to load data to SAP BPC using Data Services?
    Thanks

    Hello Devi,
    There are three ways you can load master data in BPC:
    1) Copy and paste.
    Download the master data to a flat file, copy the master data and paste it into the dimension members, then process the dimension.
    2) Using an SSIS package - load a flat file into the BPC cube.
    3) Using a SQL command.
    Thanks.
    With regards,
    Anand Kumar

  • How to associate an uploaded file with form data

    I have a "ticketing" app which stores a ticket no. and allows users to upload multiple files per ticket. The problem I am having is how to associate an uploaded file with a particular ticket no. As you can guess, this becomes complicated when the same user can potentially update multiple tickets using the same file names. I am having difficulty understanding how to associate a ticket no. with one or more uploaded files. I do have a custom table which I update with the attachments, but I am unsure how to, or when to, update the ticket information in this custom table. I only want to retrieve attachments for a given ticket, not all attachments uploaded by a user. Does anyone have any ideas?

    Hi,
    My question is a bit related to this topic.
    I have a requirement to upload CSV files so that they are stored in a database table, and later, when users search on that table, it needs to pull the information and display it on the form. I am a newbie to Application Express. Could somebody tell me how to start this process? Just give me an overview/hints so that I can try to carry on on my own.
    Cheers,
    Krishna.

  • Data Collection in BPC

    Hi,
    How does data collection (transaction UCMON) work in BPC? Do we have any standard Data Manager package to do the data collection? How can we achieve this?
    Would appreciate your time and response.
    Thanks.

    Hi,
    First, I am not sure what the transaction code you mentioned does; there is no such code in BPC as far as I am aware.
    Regarding data collection into a BPC application (in other words, a cube):
    You could take a flat file approach: upload the flat file into the BPC file service using Upload Data File, and then import the data into the BPC application using the standard Import Data Manager package. You need to define and use the BPC transformations and conversions (if any) in the import process; these are nothing but files.
    Alternatively, you could also load the data directly from the BW infocube or even from a multi-provider. There is a separate data manager package to do this.
    You could also collect data into BPC through input schedules.
    You have the advantage of triggering the default logic while collecting the data, which applies your business logic to the data before it is saved to the database.
    Hope the above gives some insight to you.
    Thanks
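    For the flat file route, the transformation file mentioned above is just a text file with three sections. Below is a minimal sketch only; the dimension names, column positions, and conversion file name are purely illustrative, and the exact option keywords can differ by BPC version, so start from the standard template delivered with your system.

    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    VALIDATERECORDS = YES

    *MAPPING
    ACCOUNT=*COL(1)
    ENTITY=*COL(2)
    CATEGORY=*COL(3)
    TIME=*COL(4)
    SIGNEDDATA=*COL(5)

    *CONVERSION
    ACCOUNT=ACCOUNT_CONVERSION.XLS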
