BPC data load for consolidation

Once we go live, how frequently will the data be loaded each month? Is it on a daily basis, or only at the end of the month?
My concern is that if the data is loaded only at the end of the month, all errors will surface at month end and delay the consolidation process.

Hi,
Consolidation happens monthly and quarterly only.
Consolidation is normally run on YTD (year-to-date) figures.
If you take the transaction data mid-month and start the execution, your profit will show wrong figures and the eliminations may go wrong.
You should plan your consolidation process to run after the books are closed in the existing source system for the particular month.

Similar Messages

  • Master data loads for Attributes and texts failing

    Hello
Master data loads for attributes and texts are failing. The errors are:
1. Lock not set for loading master data attributes
2. Table /BI0/YAccount does not exist (this error is for the 0Account master data attribute load)
3. Error 1 in the update
We faced this error a few days ago and rebooting the server resolved it, but it has recurred after 4 days.
RS12 and SM12 do not show any locks. Activating the InfoObject has also not resolved the error.
    Any insight is appreciated.
    Thanks
    Inder

  • Need to skip the data load for one of the sub chains

    Hi All,
    In our project we have one meta chain with many sub chains inside it. Today we don't want to run one of the sub chains. As we are on BW 7.0, skip options are not available. Can you please advise how to stop the data load for that particular sub chain?
    Thanks.

    Hi Jalina,
    If this is a frequent request, you can create a custom ABAP program and use the simple logic below to skip or run the sub chain. You will also have to change the link between the sub chains to "Always" instead of "Successful", so that the next chain proceeds whether or not the dependent chain was successful. If you are comfortable with events, you can also use an event to achieve this.
    Program logic: create a new table in which you maintain meaningful values (an indicator and a description); the program reads the data from that table and, based on the values it finds, finishes successfully or fails. A minimal sketch follows.
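    A rough sketch of such a program (the control table ZBW_CHAIN_CTRL and its fields are hypothetical, so adapt the names to your own table):

        REPORT zskip_subchain.

        " Hypothetical control table ZBW_CHAIN_CTRL: key CHAIN_ID, flag RUN_FLAG
        DATA lv_run TYPE char1.

        SELECT SINGLE run_flag FROM zbw_chain_ctrl
          INTO lv_run
          WHERE chain_id = 'SUB01'.

        IF lv_run = 'X'.
          " step ends green, so a sub chain linked on "Successful" runs
          MESSAGE 'Sub chain enabled, proceeding' TYPE 'S'.
        ELSE.
          " TYPE 'E' cancels the step (red): the sub chain linked on
          " "Successful" is skipped, while the "Always" link lets the
          " rest of the meta chain continue
          MESSAGE 'Sub chain disabled in ZBW_CHAIN_CTRL' TYPE 'E'.
        ENDIF.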
    Thanks
    Abhishek Shanbhogue

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some input on master data loading for prices and conditions in CRM?
    The transaction code is /SAPCND/GCM.
    I need to load data from a file (extracted from 4.6) for service contracts.
    I tried LSMW: for this transaction, recording does not work.
    I am trying to load through IDocs (LSMW), but that is not really working either.
    Do we require custom development for this, or is standard SAP functionality available?
    Can anyone provide some valuable input on this?
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    As per the client's requirements, we are maintaining all the configuration for services in CRM.
    The services data from 4.6 is being pulled out into flat files which need to be loaded into CRM, so the middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M" I am able to load a single record, but I cannot find out how to make this work for multiple entries.
    Normally, when loading master data through IDocs, we map the values to the standard fields available in that IDoc.
    But in this particular IDoc there is a common field for which I need to define a field name and a field value.
    So far I am only able to define just one field name and one field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R/3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing 0GL_ACCOUNT master data object.
    This new attribute, along with some other existing fields, gets its data from another master data object with an InfoSource in between, because two transformations cannot be created for the same source and target.
    When I load the data, I don't see data being populated for this new field. I ran the attribute change run (ACR), checked the keys, etc.
    The source object has data, but after executing the DTP no data arrives in this attribute. There are no routines or anything.
    Please suggest
    Regards
    Zabi

    Hi,
    The situation is: field x from the source maps to
    1. field y (an existing field), and also to
    2. field z, which is the new attribute.
    Field y has to be updated for company code 10, for example, and field z for company code 30.
    Now, if I use the same flow and map field x to both y and z, overwriting happens: if code 10 has no value for x but code 30 does, the result is wrong.
    So if I use a separate flow with an InfoSource, I map only x to z. Then, after the loads, if no value went to y for code 10 in the first DTP, and code 30 has a value for x, only z is updated and y remains empty.
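    One alternative sketch, staying in a single flow: keep the existing attribute value in an end routine whenever the incoming record carries none, so one company code's load cannot blank out the other's. All object and field names below are hypothetical (custom attribute ZNEW_ATTR on 0GL_ACCOUNT):

        " End routine body in the BW 7.x transformation to 0GL_ACCOUNT
        FIELD-SYMBOLS <result_fields> LIKE LINE OF result_package.
        DATA lv_prev TYPE /bic/oiznew_attr.   " hypothetical attribute element

        LOOP AT result_package ASSIGNING <result_fields>
             WHERE /bic/znew_attr IS INITIAL.
          " no incoming value for the new attribute: re-read the active
          " value from the P table and keep it instead of blanking it
          SELECT SINGLE /bic/znew_attr FROM /bi0/pgl_account
            INTO lv_prev
            WHERE gl_account = <result_fields>-gl_account
              AND objvers    = 'A'.
          IF sy-subrc = 0.
            <result_fields>-/bic/znew_attr = lv_prev.
          ENDIF.
        ENDLOOP.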

  • Master data load for substance creation: using BAPI in LSMW?

    Hi,
    Has anyone done a master data load for creating a substance in SAP RM? If so, please let me know the best possible option. I have been trying to figure out how to load substances and their IDENTIFIER / MATERIAL ASSIGNMENT from my legacy system, but I have no clue how to use LSMW for this (direct input or batch input).
    I do not see any BAPI for uploading it through LSMW except BUS1077 (method SAVEREPMUL), and I guess this is used along with an IDoc.
    Appreciate your help and suggestion.
    David

    Thanks John.
    I will try the option you suggested. I was also told by the business team to look at CG33, but they mentioned some format issue. They probably meant the same thing you said but were not aware of the workaround; your information is interesting.
    The other option given to me was to create a separate recording for each tab (substance header / identifier / material assignment, etc.). Technically that does not feel right to me. Do you think it is a viable alternate approach?
    With Respect & Regards
    David

  • Master data load for 0COSTCENTER (Cost Center) failing

    Hi Experts
    I have a master data load for 0COSTCENTER (Cost Center). The load has been failing for a couple of days at the DTP:
    (R) Filter Out New Records with the Same Key
          (G) Starting Processing...
          (R) Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
    I am unable to understand the reason for this failure. I tried loading the data one cost center at a time and it still fails, so I doubt it is the internal table storage issue suggested by the dump.
    Could you please help me with this one?
    Regards
    Akshay Chonkar

    Thanks all,
    I have got the issue resolved.
    The error stack for the DTP had accumulated so many entries that it was unable to process further.
    So I deleted the entries in the error stack using SE14 and then executed my DTP again.
    Everything is fine now; thanks for your help.
    Regards
    Akshay Chonkar

  • Data load for VK11

    Hi All,
    Can anyone advise me on the data load for VK11? I mean, which method would be easiest?
    I would appreciate it if you could send me the developed custom code.
    thanks.

    Hi Kiran,
    You can also use BAPI BAPI_PRICES_CONDITIONS.
    Please check this link for sample code:
    Re: Sample code for BAPI_PRICES_CONDITIONS
    Hope this will help.
    Regards,
    Ferry Lianto
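    For orientation, a hedged skeleton of a custom upload report built around that BAPI (the parameter and structure names follow the standard interface as commonly posted, but verify them in SE37; filling the condition key tables is the part the linked sample covers):

        REPORT zvk11_upload_sketch.

        DATA: lt_condct TYPE STANDARD TABLE OF bapicondct,
              lt_condhd TYPE STANDARD TABLE OF bapicondhd,
              lt_condit TYPE STANDARD TABLE OF bapicondit,
              lt_condqs TYPE STANDARD TABLE OF bapicondqs,
              lt_condvs TYPE STANDARD TABLE OF bapicondvs,
              lt_return TYPE STANDARD TABLE OF bapiret2,
              lt_knumhs TYPE STANDARD TABLE OF bapiknumhs,
              lt_memini TYPE STANDARD TABLE OF cnd_mem_initial.

        " Fill lt_condct/lt_condhd/lt_condit from your upload file:
        " condition table, application 'V', condition type, VARKEY with
        " the key combination, validity dates, rate and unit.

        CALL FUNCTION 'BAPI_PRICES_CONDITIONS'
          TABLES
            ti_bapicondct  = lt_condct
            ti_bapicondhd  = lt_condhd
            ti_bapicondit  = lt_condit
            ti_bapicondqs  = lt_condqs
            ti_bapicondvs  = lt_condvs
            to_bapiret2    = lt_return
            to_bapiknumhs  = lt_knumhs
            to_mem_initial = lt_memini.

        " commit only if no error message came back
        READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
        IF sy-subrc <> 0.
          CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
            EXPORTING
              wait = 'X'.
        ELSE.
          CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
        ENDIF.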

  • How to run the EBS adapter data load for 1 year using DAC

    Hi,
    I am trying to run the EBS adapter data load for 1 year for Procurement and Spend. Please let me know the parameter to set in DAC.
    The last extract date is set as Custom Format(@DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP
    Thanks

    You need to set $$INITIAL_EXTRACT_DATE to a year ago. The LAST EXTRACT DATE is used for incremental loads; you do not set it manually.
    If this helps, mark as correct or helpful.

  • "Sematic Group" in BPC data loading?

    Hi All:
    Now, I have a Data Manager package to load data from BW to BPC, and during the load I wrote a UJD routine to calculate expenses from cost centers. The issue is that the data volume for one individual cost center is huge, so its records can arrive split across multiple packages. I would like to know whether BPC has a mechanism similar to the "Semantic Group" in BW, so that I can put all records for one cost center into one package.
    Thanks,
    Sean

    Sean,
    You can increase the package size (default 50,000 records) so that all records for that cost center fit into one package.
    Nikhil
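    For reference, a sketch of where that is usually changed: the *OPTIONS section of the Data Manager transformation file (assuming your BPC NW release supports the PACKAGESIZE option; the other lines are just placeholder file settings):

        *OPTIONS
        FORMAT = DELIMITED
        HEADER = YES
        DELIMITER = TAB
        PACKAGESIZE = 2000000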

  • BI/BPC data load

    We have created the complete data flow for the cube generated by BPC (/cpmb/cube) and loaded the transaction data into the cube, but we cannot see the data in BPC.
    Is there a program to be run on the BI side to make the data available in BPC?
    thanks in advance

    Hey,
    When you load your data directly with IP from BI to BPC, the BPC cube changes from planning mode to loading mode.
    Right-click and change it back to planning mode, then check again.
    But as I said earlier in my previous reply, optimisation would cause loss of data, as BPC has no idea of this load.
    Check Pravin's blog for more details:
    /people/pravin.datar/blog/2009/04/16/loading-transactional-data-from-any-infocube-to-bpc-application-in-bpc7nw
    Hope it helps.

  • Data load for target after target enhancement

    Dear all,
    We are using BI 7.00, and in one of our designs the data is first loaded to the ODS and then to the cube. Now we want to capture the value of one more field. This field is already available in the DataSource, and its data flows up to the PSA. I have added the field to the ODS. My problem starts here: I want the data to flow for all the previous requests, i.e. the requests loaded to the ODS and the cube prior to the enhancement, without disturbing the data load.
    Kindly provide step-by-step instructions. I do not want the earlier data to be deleted; the new field's value should be updated in the previous loads in both the ODS and the cube.
    Regards,
    M.M

    Hi,
    The overwrite option is available in the update mode of the key figures; it is the key figure that determines whether the DSO is in overwrite or addition mode.
    Go to the mapping of each InfoObject in the transformation from the DataSource to the DSO, check the mapping type of each key figure, and see whether it is set to overwrite.
    To avoid reloading the cube, one option is to add the field to both the cube and the DSO and then activate the transformations, mappings, DTPs, etc.
    After that, load the data to the DSO and then schedule the delta from the DSO to the cube; this will bring all the changes in the DSO into the cube.
    This delta may be a huge one, though, and may fail depending on the data in the DSO.
    You can try this, and if it does not work you may have to delete the cube contents and reload.
    Thanks
    Ajeet

  • After recent patch upgrade, BW delta data loads for 2LIS_02_ITM failing

    Hi All
    We have patched our BW 3.5 system from support package 16 to 25. After applying the patches I am getting delta load issues.
    The 2LIS_02_ITM and 2LIS_02_SCL delta loads are taking a very long time. Deltas are failing even for 3,000 records; sometimes a load takes 14 hours. The delta loads fail and I get the message below. Please advise.
    I am still getting the dump below, although I could not find any errors in ST22.
    Processing in Warehouse timed out; processing steps missing
    Diagnosis
    Processing the request in the BW system is taking a long time and the
    processing step Update rules has still not been executed.
    System response
    Caller is still missing.
    Procedure
    Check in the process overview in the BW system whether processes are
    running in the BW system under the background user.
    If this is not the case, check the short dump overview in the BW system.
    Thanks
    Pramod

    Hello Pramod,
    Is this load going to a first-level ODS? If yes, I understand you have an issue with the load itself but not with the activation of the ODS?
    Normally this kind of issue comes up during ODS activation.
    1. If you are getting this during the data loads, try loading only to the PSA first and then from the PSA to the target, and check how much time each step takes; this helps track whether the issue is in the source system or in BW.
    2. If the load is fine up to the PSA, then load the same data to the ODS and check whether the issue is still present.
    3. At the same time, check the system load on the source side and the number of processors available there.
    4. Did you drain the delta queues before the stack upgrade in R/3? Normally this needs to be done as a prerequisite; if not, you will have issues extracting the data.
    5. Finally, this can be avoided by running an init without data transfer and then performing the delta loads. For this you need to first extract all the data from the source into BW, then take a short downtime in R/3 and perform this activity so that the init timestamp is reset properly; the deltas should then work fine for you.
    Do get back to us with more details.
    Thanks
    Murali M

  • Performance Tuning Data Load for ASO cube

    Hi,
    Can anyone help with how to fine-tune the data load for an ASO cube?
    We have an ASO cube which loads around 110 million records from a total of 20 data files.
    18 of the data files have 4 million records each and the last two have around 18 million each.
    On average, loading 4 million records takes 130 seconds.
    Each data file has 157 data columns representing the period dimension.
    With a BSO cube, sorting the data file normally helps, but with ASO it does not seem to have any impact. Any suggestions on how to improve the data load performance for an ASO cube?
    Thanks,
    Lian

    Yes TimG, it sure looks identical - except for the last BSO reference.
    Well, never mind, as long as those that count remember where the words come from.
    To the original poster and to 960127 (come on, create a profile already, will you?):
    The sort order WILL matter IF you are using a compression dimension. In this case the compression dimension acts just like a BSO dense dimension: if you load part of it in one record, then when the next record comes along it has to be added to the already existing part. The ASO "load buffer" is really a file named <dbname.dat> that is built in your temp tablespace.
    The most recent records that fit in the ASO cache are retained there, so if a record is still in the cache it does not have to be reread from the disk drive. So you could (instead of sorting) create an ASO cache as large as your final .dat file; then the record would still be at hand.
    BUT WAIT BEFORE YOU GO RAISING YOUR ASO CACHE. All operating systems use memory-mapped I/O, so even if a record is not in the cache it will likely still be in "Standby" memory (the dark blue memory as seen in Resource Monitor); this continues until the system runs out of "Free" memory (light blue in Resource Monitor).
    So, in conclusion: if your system still has Free memory, there is no need (in a data load) to increase your ASO cache. And if you are out of Free memory, then all you will do by increasing the ASO cache during a data load is slow down the other applications running on your system - so don't do it.
    Finally, if you have enough memory that the entire data file fits in Standby + Free memory, then don't bother to sort it first. But if you do not have enough, then sort it.
    Of course, you have 20 data files, so I hope you do not have compression members spread out amongst these files!
    You also did not say whether you are using parallel load threads. If you really need 20 files, read up on parallel load buffers and parallel load scripts; that will make it faster (see the sketch after this reply).
    But if you do not really need 20 files and just broke them up to load in parallel, then create one single file and raise your DLTHREADSPREPARE and DLTHREADSWRITE settings. Heck, these will help even if you do go parallel, and really help if you don't but still keep 20 separate files.
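    For illustration, a rough MaxL sketch of the parallel load-buffer approach (application/database and file names are placeholders; check the MaxL reference for your Essbase version, since the multi-buffer commit syntax varies by release):

        /* initialize one load buffer per parallel session */
        alter database AsoSamp.Sample initialize load_buffer with buffer_id 1;
        alter database AsoSamp.Sample initialize load_buffer with buffer_id 2;

        /* run the file loads in parallel sessions, one buffer each */
        import database AsoSamp.Sample data from data_file '/data/file01.txt'
          to load_buffer with buffer_id 1 on error write to 'err01.log';
        import database AsoSamp.Sample data from data_file '/data/file02.txt'
          to load_buffer with buffer_id 2 on error write to 'err02.log';

        /* commit all buffers to the cube in a single materialization */
        import database AsoSamp.Sample data from load_buffer with buffer_id 1, 2;

    The pipeline thread settings go in essbase.cfg (server restart needed), for example:

        DLTHREADSPREPARE AsoSamp Sample 4
        DLTHREADSWRITE AsoSamp Sample 4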

  • IMP - Data load for 0BBP_BIGURE_HIER from SRM 5.0 to BI 7.0

    I have been trying to load hierarchy data with the DataSource 0BBP_BIGURE_HIER from SRM to my BI system, but the monitor shows the request staying yellow for hours. The data arrives fine in the PSA but does not get loaded into the InfoObject 0BBP_BIGURE. The text data has been loaded successfully for 0BBP_BIGURE, and there are no attributes to load for the InfoObject.
    Can someone provide some information about how to load 0BBP_BIGURE_HIER from SRM and what the possible errors might be?
    Regards,
    Joy

    Sony,
    You need to implement the following note for SRM 5.0
    Note 931998 - Release Enhancement for note 501836
    Praveen
