Data Load Cube

Hi Experts,
     There is a virtual cube and the data is loaded from a BAPI. Is there a method to load data directly from the BAPI function in R/3? Please send email to srikanth329 at yahoo.co.in
Regards

Similar Messages

  • Data load Error to Planning

    Hi All,
    I have created one interface for loading data to a Planning application. I made the Account dimension the load dimension and Segment the driver dimension (the member is Audio).
    I created one form for this. Now when I run the interface it executes successfully, but the report statistics show the following problem:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 2, in ?
    Planning Writer Load Summary:
         Number of rows successfully processed: 0
         Number of rows rejected: 48
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:345)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:169)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2374)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1615)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1580)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2755)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java:68)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:619)
    I found a log file for it which says:
    Account,Data Load Cube Name,MP3,Point-of-View,Error_Reason
    Max Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Avg Hits Per Day,Consol,Jan,"USD,Cannot load dimension member, error message is: java.lang.IllegalArgumentException: Unbalanced double quotes found at character 3 in the member list: "USD
    Can anybody tell me what I did wrong? My data file does not contain any double quote mark with USD; the POV is like this: USD,E01_0 ,Jan,Current,Working,FY11
    Thanks
    Gourav Atalkar
    Edited by: Gourav Atalkar(ELT) on May 3, 2011 4:06 AM

    As it is comma separated and all the members for the POV need to be together, try enclosing them in quotes, e.g. "USD,E01_0 ,Jan,Current,Working,FY11"
    Cheers
    John
    http://john-goodwin.blogspot.com/
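    A quick way to see John's point is to run both forms of the row through a CSV parser. This is a minimal Python sketch; the row layout is taken from the log above, and the exact member values are illustrative:

        import csv
        import io

        # Unquoted POV: the parser splits it into many separate fields, and a
        # stray quote inside any field triggers the "Unbalanced double quotes"
        # error seen in the log.
        bad_row = 'Max Hits Per Day,Consol,Jan,USD,E01_0,Jan,Current,Working,FY11\n'

        # Quoting the whole POV keeps it together as one field, which is what
        # the load expects in the Point-of-View column.
        good_row = 'Max Hits Per Day,Consol,Jan,"USD,E01_0,Jan,Current,Working,FY11"\n'

        for label, row in (("unquoted", bad_row), ("quoted", good_row)):
            fields = next(csv.reader(io.StringIO(row)))
            print(label, len(fields), fields)

    The unquoted row parses into nine fields, the quoted one into four, with the full POV intact as the last field.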

  • Data loading - Member containing comma taken in point-of-view

    Hi,
    I tried loading data into Planning. My Point-of-View contains a member which has a comma in it.
    When loading, it considers that member as two different members.
    Has anyone tried loading data into Planning with members having commas?
    Eg. Item No.,Start Date,Point-of-View,Data load Cube Name
    30103,11-02-2011,"Actual,Version1,Dazzing Infotech, Inc,230"Capex
    Say I have 6 dimensions.
    Dazzing Infotech, Inc is a member in a dimension. It has a comma, which is separating it into two members.
    The error when loading is: "Dazzing Infotech" doesn't exist or you do not have access to it.
    Has anyone worked on this issue previously?
    Thanks

    I agree that it would be cleaner to use the commas in an alias rather than a member name.
    Cheers
    John
    http://john-goodwin.blogspot.com/
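    John's alias suggestion sidesteps a split that quoting alone cannot fix: the quoted POV survives the CSV parse as one field, but the load then appears to split that field on commas to resolve the individual members (an assumption based on the error text). A minimal Python sketch of that second split:

        # The POV field as it arrives after the CSV parse.
        pov = "Actual,Version1,Dazzing Infotech, Inc,230"

        # Splitting on commas breaks the member name in two; neither token
        # exists in the outline -- hence "Dazzing Infotech" doesn't exist.
        members = [m.strip() for m in pov.split(",")]
        print(members)
        # ['Actual', 'Version1', 'Dazzing Infotech', 'Inc', '230']

    A comma-free member name with the comma moved into the alias avoids the ambiguity entirely.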

  • Data loading - Member containing comma in Point-of-View

    Hi,
    I tried loading data into Essbase via Planning. My Point-of-View contains a member which has a comma in it.
    When loading, it considers that member as two different members.
    Has anyone tried loading data into Planning with members having commas?
    Eg. Item No.,Start Date,Point-of-View,Data load Cube Name
    30103,11-02-2011,"Actual,Version1,Dazzing Infotech, Inc,230"Capex
    Say I have 6 dimensions.
    Dazzing Infotech, Inc is a member in a dimension. It has a comma, which is separating it into two members.
    The error when loading is: "Dazzing Infotech" doesn't exist or you do not have access to it.
    Has anyone worked on this issue previously?
    Thanks

    Cross post :- Data loading - Member containing comma taken in point-of-view
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error Regarding Data loading in Planning and budgeting cloud service.

    Hi ,
    I was able to load data into Planning and Budgeting Cloud Service a month ago.
    I loaded via Administration -> Import and Export -> Import Data from File.
    Today I loaded the same file to the cloud instance for the same entity, after clearing the existing data.
    I am getting an error during validation itself:
    Unrecognized column header value(s) were specified for the "Account" dimension: "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec", "Jan", "Feb", "Mar", "Point-of-View", "Data Load Cube Name". (Check delimiter settings, also, column header values are case sensitive.)
    I checked the source file and everything is correct. I actually loaded the same file before and was able to load it.
    Does anyone know the problem behind this? Can anyone give me a suggestion, please?
    Thanks in Advance
    Pragadeesh.J

    Thanks for your response John.
    I had the Period and Year dimensions in the columns.
    I moved the Year dimension into the POV and loaded the data file. I was able to load without any errors.
    I changed the layout back to the original form and loaded again. I didn't get any errors this time either.
    It worked somehow.
    Thank you
    Cheers John

  • How to create a report in bex based on last data loaded in cube?

    I have to create a query with a predefined filter based on the "latest SAP date", i.e. the user only wants to see the very latest situation from the last load. The report should only show the latest inventory stock situation from the last load. As I'm new to BEx, I am not able to find a way to achieve this. Is there any time characteristic which holds the last update date of a cube? Please help and suggest how to achieve this.
    Thanks in advance.

    Hi Rajesh,
    Thanks for your suggestion.
    My requirement is a little different. I built the query on a MultiProvider, and I want to see the latest record in the report based only on the latest date (not the system date) on which data was last loaded to the cube. This date (when the cube was last loaded with data) is not populated from any DataSource. I guess I have to add the "0TCT_VC11" cube to my MultiProvider to fetch the date when my cube was last loaded with data. Please correct me if I'm wrong.
    Thanks in advance.

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R/3, so I am getting a few fields from the flat file and a few from R/3. They create one record when they are loaded to the DSO. So I am getting one record in the active data table, but when I load the data from that DSO to the cube (the data goes from the change log to the cube), I am getting 2 separate records in the cube (one loaded from the flat file and one loaded from R/3), which I don't want. I want only one record in the cube, just like the one in the active data table of the DSO.
    I can't get the data from the active data table because I need delta loads.
    Would you please advise what I can do to get that one record in the cube?
    Please help.

    Ravi,
    I am sending the data through a DTP only, but is there any solution to get one record? In another scenario I am getting data from 2 different ERP sources and getting one record in the DSO and in the cube as well.
    But that is not happening for this second scenario, where I am getting data from a flat file and ERP and trying to create one record.
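    A rough sketch of the intended merge may make the gap clearer. The DSO combines the two source records by key in its active table, but the change log keeps one entry per load, which is why the cube receives two records. The field names below are made up for illustration, and the "keep non-initial values" rule would normally live in the transformation, since plain overwrite replaces every field:

        # Hypothetical active table of a standard DSO, keyed by document number.
        active_table = {}

        def activate(record):
            """Merge a record into the active table, keeping non-initial values."""
            key = record["doc_number"]
            merged = dict(active_table.get(key, {}))
            merged.update({k: v for k, v in record.items() if v not in ("", None)})
            active_table[key] = merged

        activate({"doc_number": "4711", "ff_field": "A", "r3_field": ""})   # flat file load
        activate({"doc_number": "4711", "ff_field": "",  "r3_field": "B"})  # R/3 load

        print(active_table["4711"])
        # {'doc_number': '4711', 'ff_field': 'A', 'r3_field': 'B'} -- one record,
        # yet the change log still holds one entry per load, so a delta DTP
        # delivers them to the cube separately.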

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads have failed last night and when I try to dig inside the process chains I find out that the cube which should be loaded from DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed, and the error stack showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all these records.
    I tried to assign USD to them, and the changes were saved. I tried to execute again, but then a message says that the request ID should be repaired or deleted before the execution. When I try to repair it, it says it cannot be repaired, so I deleted it and executed; it fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before making the changes and then try to save them, they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad,
    The error stack is request specific. Once you delete the request from the target, the data in the error stack is also deleted.
    In this case, what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP. All the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, in the "Update" tab, you will find the option "Error DTP". If one is not already created, you will get the option "Create Error DTP". Click there and execute the Error DTP. The Error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come in again as part of a delta or full load, your load will fail again. Check it in the source system and fix it there for a permanent solution.
    Regards,
    Debjani
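    For the permanent fix Debjani mentions, the usual pattern is a rule in the transformation that supplies a default unit when the source delivers a blank one. Here is a rough Python sketch of that logic; the field names come from the thread, and the USD default is only the value Prasad tried, so confirm it with the business:

        DEFAULT_CURRENCY = "USD"  # assumption: the default the business agrees on

        def fix_currency(record):
            """Supply a default currency when an amount arrives without one.

            An amount with an empty unit/currency is exactly what makes the
            cube load fail with the 'value space' message.
            """
            if record.get("ZARAMT") and not record.get("CURRENCY"):
                record["CURRENCY"] = DEFAULT_CURRENCY
            return record

        rows = [
            {"ZARAMT": 150.0, "CURRENCY": ""},     # the failing case: blank unit
            {"ZARAMT": 200.0, "CURRENCY": "EUR"},  # already valid, left untouched
        ]
        print([fix_currency(r) for r in rows])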

  • Data loading from DSO to Cube

    Hi,
    I have a question,
    In the book TBW10 I read about the data load from DSO to InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
    My question is: the cube already has the value 10, so if we are sending 10, -10 and 30 (the delta), shouldn't the total be 40 instead of 30?
    Please can someone explain this to me.
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value from the after image will be added as 30.
    So it will be: 10 - 10 + 30 = 30.
    Thank you.
    Regards,
    Vinod
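    Vinod's arithmetic in a few lines: the change log pairs a before image (the negated old value) with an after image (the new value), so applying the delta to the cube is a plain sum:

        # The cube already holds the original value from the first load.
        cube_value = 10

        # Delta from the change log: the before image cancels the old value,
        # the after image carries the new one.
        change_log_delta = [-10, 30]

        cube_value += sum(change_log_delta)
        print(cube_value)  # 30, not 40 -- the -10 nullifies the original 10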

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. So till now in production, the data loads happened using InfoPackages:
    a. Infopackage1 from datasource to ODS and
    b. Infopackage2 from ODS to the CUBE.
    Now after we transported the migrated dataflow to production, to load the same infoproviders I use
    1. Infopackage to load PSA.
    2. DTP1 to load from PSA to DSO.
    3. DTP2 to load from DSO to CUBE.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I try step b (above), it loads the CUBE fine using the InfoPackage. So I am unable to understand why the DTP fails while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Partial data loading from dso to cube

    Dear All,
    I am loading the data from the DSO to the cube through a DTP. The problem is that some records (around 50 crore) get added to the cube through data packet 1, while records from data packet 2 are not getting added.
    It is a full load from the DSO to the cube.
    I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data packet 1, and after that no records are added; the request remains in yellow state only.
    Please suggest.

    Nidhuk,
    Data loads transfer package by package. Your story sounds like it got stuck in the second package or something. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not spread out into too many packages.
    Regards. Jen
    Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM

  • Loading data from Cube to Planning area

    Hi,
    If I am loading data from a cube to a planning area using transaction TSCUBE, does the system load data into the planning area only for the combinations that exist in the cube, or does it load for all CVCs?
    For example, I have my CVC as Plant, Material, Customer.
    If there are 4 CVCs in the POS that were previously generated as
    Plant -- Material -- Customer
    01 -- M1 -- C1
    01 -- M2 -- C3
    01 -- M2 -- C2
    01 -- M4 -- C5
    and the cube has data like this:
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 10
    01 -- M2 -- C3 -- 20
    01 -- M2 -- C2 -- 5
    (it does not have the last combination), then if I use the TSCUBE transaction to load data to the planning area from this cube, is the data loaded as
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 10
    01 -- M2 -- C3 -- 20
    01 -- M2 -- C2 -- 5
    i.e. only for the 3 combinations that exist in the cube, loading nothing for the last one,
    OR is the data loaded as
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 10
    01 -- M2 -- C3 -- 20
    01 -- M2 -- C2 -- 5
    01 -- M4 -- C5 -- 0
    i.e. all 4 combinations, sending 0 where the cube does not have the combination?
    Hope I am clear on this question.
    Thanks.

    Thanks a lot Vinod, Srinivas and Harish. The reason why I am asking is that we have a scenario where this situation arises.
    We initially get data from R/3 to BW to APO like the below:
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 10
    Later, when the customer changes or is bought out by somebody, C1 is changed to C2. Sometimes, when the business does not know initially who the customer is, they just put C1 as a dummy and after some time replace it with C2. Then the new record coming in is as follows:
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C2 -- 10
    BW can identify changes in transaction data but not in master data. What I mean by this is that when Qty. 10 changes to 20, the system can identify it in deltas.
    If the customer (master data) changes from C1 to C2, the system thinks it is a new record altogether, so if I use delta loads I get the following:
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 10
    01 -- M1 -- C2 -- 10
    If I am looking at Plant and Material level, my data is doubled.
    So we are planning to do a full load that works like this:
    1. Initial data like the below:
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 10
    The CVC is created and the planning area has Qty. 10.
    2. Then we delete the contents of the cube and do a full load into the cube with the changed customer:
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C2 -- 10
    This time a new CVC is created, and another 10 is loaded into the planning area.
    If the system loads all CVCs, then it would send
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C1 -- 0
    01 -- M1 -- C2 -- 10
    If the system loads only the combinations in the cube, then it loads
    Plant -- Material -- Customer -- Qty.
    01 -- M1 -- C2 -- 10
    but the system already has another 10 for customer C1, duplicating the values.
    We are in trouble in the second case.
    We had to go for this solution instead of realignment, as our business has no way of knowing that C1 was replaced by C2.
    Hope I am clear.
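    The doubling is easy to reproduce on paper: a delta keys on the full characteristic combination, so a changed customer looks like a brand-new record rather than a correction of the old one. A small Python sketch using the plant/material/customer values from the example above:

        # Planning-area contents keyed by the full CVC.
        planning_area = {}

        def delta_load(records):
            """Apply records keyed on (plant, material, customer)."""
            for plant, material, customer, qty in records:
                planning_area[(plant, material, customer)] = qty

        delta_load([("01", "M1", "C1", 10)])  # initial load with dummy customer C1
        delta_load([("01", "M1", "C2", 10)])  # customer corrected to C2

        # At plant/material level the quantity is doubled, because nothing in
        # the delta ties C2 back to C1, so the C1 record is never reversed.
        total = sum(qty for (plant, mat, cust), qty in planning_area.items()
                    if (plant, mat) == ("01", "M1"))
        print(total)  # 20 instead of 10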

  • Data loaded to Power Pivot via Power Query is not yet supported in SSAS Tabular Cube

    Hello, I'm trying to create an SSAS Tabular cube from data loaded into Power Pivot via Power Query (SAP BOBJ connector), but it looks like this is not yet supported.
    Has anyone tried this before? Is there any workaround that makes sense?
    The final goal is to pull data from SAP BW and a BO universe (using Power Query) and be able to create an SSAS Tabular cube.
    Thanks in advance
    Sebastian

    Sebastian, 
    Depending on the size of the data from Analysis Services, one workaround could be to import the data into Excel, make an Excel table, and then use the Excel table as a data source.
    Reeves
    Denver, CO

  • Loading data from Cube to ODS

    Hi All,
    I have a cube which contains project data. I want to load the data from the cube to a write-optimized ODS. In the cube I have multiple records per project, but while loading the data to the ODS I have to summarise it so that there is one record per project, i.e. if the cube has data as follows:
    Project  Cost
    abc      100
    abc      200
    abc      300
    then in the ODS the record should be
    Project  Cost
    abc      600
    How do I achieve this?
    Thanks,
    Satya

    Hi Satya,
    Generate an export DataSource from the cube (right click on the cube -> Generate Export DataSource).
    Create update rules for the ODS; in the update rule for Cost you will find two options:
    1. overwrite
    2. addition.
    Choose addition and activate the update rules. This will work exactly the way you wanted.
    One more thing to mention: put the Project InfoObject into the key fields of the ODS and all the other InfoObjects into the data fields.
    Hope that helps.
    Regards
    Mr Kapadia
    Assigning points is the way to say thanks in SDN.
    Message was edited by:
            Mr Kapadia
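    The effect of Mr Kapadia's setup (Project as the only key field, the Cost rule set to addition) can be sketched in a few lines of generic Python; this illustrates the aggregation, it is not the actual update-rule code:

        from collections import defaultdict

        # Cube records: multiple rows per project.
        cube_rows = [("abc", 100), ("abc", 200), ("abc", 300)]

        # With Project in the ODS key fields and the Cost rule set to
        # addition, the rows collapse to one record per project.
        ods = defaultdict(int)
        for project, cost in cube_rows:
            ods[project] += cost

        print(dict(ods))  # {'abc': 600}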

  • Error while data loading in real time cube

    HI experts,
    I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube it gives an error.
    The cube is a real-time cube for planning. I have changed its status to allow data loading, but the DTP still gives an error.
    It shows the error "error while extracting from DataStore", along with an RSBK 224 error and an RSAR 051 error.

    What was the resolution to this issue? We are having the same issue, only with an external system (not a flat file). We get the RSAR 051 error with a return code of 238, as if it is not even getting to the RFC connection (DI_SOURCE). We have been facing this issue for a while and have even opened a message with SAP.
