Load to cube - issue

Hi,
I have InfoObject 0GL_ACCOUNT being used in production. I am now using it in a custom ODS and cube as well.
However, when I load data into the ODS, the load runs fine and the GL account field gets loaded with leading zeroes.
When I load from the ODS to the cube (the ODS is not BEx-enabled), I get an error saying no SID exists for GL account '000999999', whereas GL account 999999 exists in the master data.
I can't change the settings of 0GL_ACCOUNT as it is already being used in production. Is there a way to resolve this without creating a custom InfoObject?
Please guide.
Thanks

Hi,
Please continue in the existing thread instead of opening a new one:
Leading zeroes when loading data
PB
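The mismatch described in the question can be sketched outside of BW: the master data holds the account in unpadded form while the transaction data delivers the zero-padded internal form, so the SID lookup compares two unequal strings. A minimal Python sketch of what an ALPHA-style conversion does (the function names here are illustrative, not SAP APIs):

```python
def alpha_input(value: str, length: int = 10) -> str:
    """External -> internal: pad purely numeric values with leading zeros."""
    value = value.strip()
    return value.zfill(length) if value.isdigit() else value

def alpha_output(value: str) -> str:
    """Internal -> external: strip leading zeros from numeric values."""
    return (value.lstrip("0") or "0") if value.isdigit() else value

# Master data was loaded as '999999', but the ODS delivers '000999999':
master_data = {"999999"}
print("000999999" in master_data)  # False -> "No SID exists"

# Once both sides are normalized the same way, the lookup succeeds:
print(alpha_input("000999999", 9) in {alpha_input("999999", 9)})  # True
```

The usual fix is to make sure master data and transaction data pass through the same conversion before the comparison, rather than changing the InfoObject itself.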

Similar Messages

  • Data not loading to cube

    I am loading a cube from another cube. There is data in the source cube that meets my InfoPackage selection criteria, but I get 0 records transferred and added to the target cube. There is no delete in the start routines.
    Just before loading, I selectively deleted all records from the target cube so as not to disturb the existing delta mechanism.
    The load I am running is a full repair.
    Can someone assist?
    Thanks

    Hi Akreddy,
    I am not sure about "Repair full to cube"...!
    Still, the following is my opinion about your issue:
    It seems there is some mapping mismatch. One quick check: in the DTP monitor, at which step are you getting 0 records? Is it at the data package level, the transformation level, or the routine level?
    Identify the step where the transferred record count drops to zero and dig into that step, or post the warning or message in this forum so that it is easier to get it answered.
    Hope this is a little clearer!
    Thanks
    K M R
    "Impossible Means I 'M Possible"
    Winners Don't Do Different things, They Do things Differently...!
    Edited by: K M R on Aug 16, 2010 2:01 PM
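    The check suggested above can be sketched as a small script: walk the DTP steps in order and report the first one where the transferred record count drops to zero (the step names and counts here are hypothetical):

```python
# Hypothetical record counts per DTP step, for illustration only.
steps = [
    ("data package", 5000),
    ("transformation", 5000),
    ("routine", 0),
]

def first_nullifying_step(steps):
    """Return the name of the first step whose record count is zero."""
    for name, records in steps:
        if records == 0:
            return name
    return None

print(first_nullifying_step(steps))  # routine
```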

  • Delta Load to cube

    Hi All,
    I have an issue with the delta load to the cube. From Nov 11 onwards the delta load is not picking up any data; it shows 0 from 0 records in the monitor screen. Could anyone please help me with this?
    Thanks in advance,
    Drisya

    Hi Drisya,
    1. Activate the transfer rules and update rules once, then load the data again. You said it is showing 0 from 0 records. Is the status yellow or green? If it is yellow, force it to red, activate the transfer structure and update rules, and run the load again.
    2. Are you loading data from an ODS to the cube, or directly to the cube? If you are loading through an ODS, regenerate the export DataSource at the ODS level via right-click.
    Thanks,
    kiran

  • Loading relational cube

    Hi,
    Several questions regarding relational cubes (built using the OWB wizard).
    1. I would like to have a sample mapping for loading a relational cube. I am able to load all dimensions correctly, but for some reason when I load the cube nothing is loaded.
    2. Is it possible, on a relational cube, to aggregate the information over only part of the dimensions? I have created a relational cube based on 5 dimensions, including time. The cube has two cumulative measures: amount and storage. For the "amount" and "storage" measures, I would like to sum up over all dimensions except time. For time, I want to take the value at the end_date of a period.
    3. Is it possible to add third and fourth measures, avg_amount and avg_storage, and use an "average" aggregation on those measures (while for "amount" and "storage" we use "sum")?
    Hope the above is clear.
    Thanks
    Galit

    Please ignore, I put it under wrong discussion group.

  • Error Message When Loading a Cube

    Hello - I received the following error message when I attempted to load a cube:
    No SID found for value '0.000' of characteristic 0CURRENCY
    Is anyone familiar with this error message?  Also, I tested this load in DEV before I moved the cube into QA but I did not receive the error message in DEV.  Any help would be greatly appreciated, thanks.

    Hi,
    I had a similar error just a few days ago. I could solve it by replicating the DataSources again and activating the transfer structures via report RS_TRANSTRU_ACTIVATE_ALL. After that the load went fine.
    regards
    Siggi

  • All records loaded to cube successfully but the request is in red status

    All the records are loaded from R/3 to the target, but the status of the request is still RED. What is the reason, and how do I resolve it?

    Hi,
    If you find the records are loaded to the cube but the request is still red, follow any one of these options:
    1. Set the request to green manually.
    2. Check the extraction monitor settings in SPRO: SAP Customizing Implementation Guide ---> Business Intelligence ---> Automated Processes ---> Extraction Monitor Settings. Check how the settings are configured, if you are authorized.
    3. Go to Manage of the target ---> Environment ---> Automatic Request Processing ---> check the option "Set Quality Status to OK".
    Hope this helps,
    Regards,
    Rama Murthy.
    Edited by: Rama Murthy Pvss on May 11, 2010 7:39 AM

  • Why is the delivery date the same date as the transportation planning date, loading date, goods issue date and GR end date?

    Hi Experts,
    Why is the delivery date the same date as the transportation planning date, loading date, goods issue date and GR end date?
    In the shipping tab I can see Planned Deliv. Time 170 days... what could be the reason?
    Many Thanks:
    Raj Kashyap

    Hi Jurgen,
    Thanks for the quick reply!!
    But I did not find anything like that. What could the customizing be? And we are using GATP on the APO side.
    \Raj Kashyap

  • Function module to load transactional cube directly.

    Hi,
    Is there any function module to load a transactional cube directly?
    Thanks.

    Hi.
    A transactional cube behaves like a regular cube, with the additional possibility of being written to directly from planning applications.
    So you can load this cube with the regular BW tools (DataSource, InfoSource, InfoPackage, update rules).
    But, as mentioned above, before loading you should switch the cube to load mode, and after the load switch it back to planning mode, either using a function module or manually.
    Regards.
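    The load-mode/planning-mode rule in the reply can be sketched as a simple guard (pure illustration in Python; the class and method names are invented, not SAP APIs):

```python
class TransactionalCube:
    """Toy model of a transactional cube's two modes."""
    def __init__(self):
        self.mode = "plan"      # default: open for planning writes
        self.records = []

    def switch_mode(self, mode):
        if mode not in ("plan", "load"):
            raise ValueError(mode)
        self.mode = mode

    def load(self, rows):
        # A regular staged load is only accepted in load mode.
        if self.mode != "load":
            raise RuntimeError("switch the cube to load mode first")
        self.records.extend(rows)

cube = TransactionalCube()
cube.switch_mode("load")    # switch before the load (FM or manual)
cube.load([{"account": "999999", "amount": 100}])
cube.switch_mode("plan")    # switch back so planning can write again
```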

  • Can data be loaded from Cube to DSO

    Can data be loaded from Cube to DSO

    Hello,
    Yes, it can be loaded. All you need to do is follow the same source-and-target procedure as in the normal flow; there is nothing different about using a cube as the data source. You can even schedule a delta load from the cube to the DSO.
    Thanks
    Ajeet

  • XRPM Business content: Loading of cube 0RPM_C02

    Hi,
    We are implementing the xRPM BW business content and find one scenario very interesting, related to the cube 0RPM_C02 (Staffing Resource Assignment).
    Looking at the BW content documentation on help.sap.com, we find that this InfoCube can be updated from the following DataSource/InfoSource combinations, the first three being transaction data:
    Data source        Info source
    0RPM_BUPA_AVL      0RPM_SRAR_03
    0RPM_ROLE_D        0RPM_SRAR_02
    0RPM_RELATE_D      0RPM_SRAR_01
    0RPM_TEAM_H        0RPM_TGUID
    0RPM_RELATE_H      0RPM_RELGUI
    0RPM_ROLE_H        0RPM_ROLGUI
    0RPM_PM_01         0RPM_PROJ/0RPM_PROJ_GUI
    0RPM_RESOURCE      0RPM_RESGUI
    The first three combinations are already working, and data has been loaded successfully into the cube. What I am not able to understand is:
    1. Why should the cube also get updated from the last five DataSource/InfoSource combinations, which are basically master data and are loaded before we load the cube?
    2. There is no business content (update rules etc.) connecting these InfoSources to the cube 0RPM_C02. Do we need to create the update rules for them manually?
    3. Why is the cube getting loaded from the master data, which are also characteristics of its own dimensions?
    Please answer point by point.
    Thanks
    Arun

    Can you please post the answer here again? I'm in the same scenario now.
    Regards

  • How to make data loaded into cube NOT ready for reporting

    Hi Gurus: Is there a way by which data loaded into a cube can be made NOT available for reporting?
    Please suggest. <removed>
    Thanks

    See, by default a request that has been loaded to a cube becomes available for reporting. Now, if you have an aggregate, the system needs this new request to be rolled up into the aggregate as well before it becomes available. The reason: queries are written against the cube, not against the aggregate, so whether a query hits a particular aggregate is only known at its runtime. That means a query must ultimately return the same data whether it reads from the aggregate or from the cube. If a request were added to the cube but not to the aggregate, the two objects would contain different data, so the system takes the safer route of not making the un-rolled-up data visible at all, rather than serving inconsistent data.
    Hope this helps...
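    The "safer route" above amounts to a simple visibility rule, sketched here in Python (the request IDs are hypothetical, and this is a conceptual model, not BW internals): a request is readable only once it is both in the cube and rolled up.

```python
def reportable_requests(cube_requests, rolled_up_requests):
    """Only requests already rolled up into the aggregate are readable."""
    return [r for r in cube_requests if r in rolled_up_requests]

cube_requests = ["REQ1", "REQ2", "REQ3"]   # loaded into the cube
rolled_up = ["REQ1", "REQ2"]               # REQ3 not yet rolled up

# Queries see the same data whether served from the cube or the aggregate:
print(reportable_requests(cube_requests, rolled_up))  # ['REQ1', 'REQ2']
```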

  • Load More Messages issues

    I saw some posts about load more messages issues, but nothing seems to be the same as my problem. I'm new to iPhone, so I must be missing something obvious.
    I was looking at my emails this morning and deleted all but 5. I responded to an email but didn't send it, because I forgot to change the server port for outgoing mail that our server requires. So I went into Settings and changed the port. I went back to my inbox but can no longer see the messages I saved; instead there's an option that says "Load More Messages" with a note below it stating "5 messages total, 0 unread". I have no other emails in my inbox. When I tap Load More Messages, nothing happens. I can't figure out how to get to those emails. I tried cycling the power on the iPhone; that didn't help.
    I'm getting new messages normally. My mail is set to show 25 recent messages. Even if I change this, it still displays the "Load More Messages". As an extreme test, I set it to show 200 recent messages, went back to my inbox, hit the "Load More Messages" option, but that didn't help. What's more, I have a full signal and I'm running 3G.

    Hey man, your problem sounds similar to the one I'm having. I can't load more messages on either of my email accounts, but I think I might have figured out the problem. I recently re-installed my Hotmail account, and while I was looking through my "unread" inbox, the "Load More Messages" command was working just fine. I kept going through my messages to mark them read (if you're like me, you don't like to have any messages unread). When all my messages were read there were about 80 in total, and my setting was set to show the 25 most recent. So I backed out of the Mail app, did a few more things, and came back hoping there would be only 25 showing, with 0 unread and the ability to load more if I wished. Well, there were 25 showing, but the box with the Load More Messages command was gone, and no matter how many times I tried to refresh, it would not come back.
    So my theory is this: there always has to be an unread message in your inbox for Load More Messages to work. Sorry about the long post. I'm going to test this theory out; knowing my luck, it won't work at all. Good luck bro!
    Peace.

  • DTP load to cube failed with no errors

    Hi,
    I executed a DTP load from cube to cube with a huge data volume, approximately 25 million records, as a full load. The DTP load failed without any errors in any of the packages. The load ran for 1 day and 33 minutes, and the request just turned red without any error messages. I have checked ST22 and the SM37 job logs. No clue.
    I restarted this load and it failed again at approximately the same point, without any error messages. Please throw some light on this.
    Thanks

    Hi,
    Thanks for the response. There are no abnormal messages in the SM37 job logs. This process runs fine in production; we are only making a small enhancement.
    08/22/2010 15:36:01 Overlapping check with archived data areas for InfoProvider Cube                             RSDA          230          S
    08/22/2010 15:36:42 Processing of data package 000569 started                                                        RSBK          244          S
    08/22/2010 15:36:47 All records forwarded                                                                            RSM2          730          S
    08/22/2010 15:48:58 Overlapping check with archived data areas for InfoProvider Cube                            RSDA          230          S
    08/22/2010 15:49:48 Processing of data package 000573 started                                                        RSBK          244          S
    08/22/2010 15:49:53 All records forwarded                                                                            RSM2          730          S
    08/22/2010 16:01:25 Overlapping check with archived data areas for InfoProvider Cube                            RSDA          230          S
    08/22/2010 16:02:13 Processing of data package 000577 started                                                        RSBK          244          S
    08/22/2010 16:02:18 All records forwarded                                                                            RSM2          730          S
    08/22/2010 16:10:46 Overlapping check with archived data areas for InfoProvider Cube                            RSDA          230          S
    08/22/2010 16:11:12 Job finished                                                                                00           517          S
    As shown here, the process ended with package 577, and package 578 did not start. I could split this into multiple DTPs, but this process runs fine in production. There are no error messages or short dumps to start troubleshooting from.
    Thanks,
    Mathi
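    One way to narrow down where such a load stops is to pull the data-package numbers out of the job log. A sketch, run against a trimmed copy of the log above:

```python
import re

# Trimmed copy of the SM37 job log quoted above.
log = """\
15:36:42 Processing of data package 000569 started
15:49:48 Processing of data package 000573 started
16:02:13 Processing of data package 000577 started
16:11:12 Job finished"""

# Which packages actually started?
packages = [int(m.group(1))
            for m in re.finditer(r"data package (\d+) started", log)]
print(packages)           # [569, 573, 577]
print(max(packages) + 1)  # 578, matching "package 578 did not start"
```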

  • BI Loading to Cube Manually with out creating Indexes.

    BW 3.5
    I have a process chain scheduled overnight which loads data into the InfoCubes from the ODS, after loading to the staging and transformation layer.
    The load into the InfoCube is scheduled in the process chain as:
    Delete Index --> Delete Contents of the Cube --> Load Data to the Cube --> Create Index.
    The above process chain load to the cube normally takes 5-6 hours.
    My only concern is that at times the process chain fails at the staging or transformation layer, and I then have to rectify that manually.
    After rectifying the error, I have to load the data to the cube, and I am left with only a couple of hours, say 2-3 hours, to complete the cube load because of business hours.
    In the above case, where I am short of time to load data to the cube via the process chain:
    Can I manually delete the contents of the cube and load the data to the cube, without deleting the existing index(es) first and recreating them after the load? Index creation normally takes a long time, which I would like to avoid when I am short of time.
    Can I do the above at times, and what are the impacts?
    If the load to the InfoCube is then scheduled via the process chain on the other normal working days, will it fail or go through?
    Also, does deleting the contents of the cube delete the indexes?
    Thanks
    Note: As far as I understand, indexes are created to improve performance at the loading and query level.
    Your input will be appreciated.

    Hi Pawan,
    Please find my views below, after each of your points.
    BW 3.5
    I have a process chain scheduled overnight which loads data into the InfoCubes from the ODS. The load into the InfoCube is scheduled in the process chain as: Delete Index --> Delete Contents of the Cube --> Load Data to the Cube --> Create Index.
    I assume you are deleting the entire contents of the cube. If this is the normal pattern of loads to this cube, and there are no other loads to it, you may consider the InfoCube setting "Delete InfoCube indexes before each data load and then refresh". You will find this setting in the Performance tab, in the create-index batch option. Read the F1 help of the checkbox; it provides more information.
    Can I manually delete the contents of the cube and load the data to the cube? YES, you can.
    Can I do the above at times, and what are the impacts? Impacts: lower query performance and loading performance, as you mentioned.
    If the load to the InfoCube is then scheduled via the process chain on the other normal working days, will it fail or go through? I don't quite understand the question, but I assume you mean: if you did a manual load, will there be a failure the next day? THERE WOULDN'T.
    Also, does deleting the contents of the cube delete the indexes? YES, it does.
    Pawan, you can skip creating indexes, but you will have slower query performance. However, if you have no further loads to this cube, you could create your indexes during business hours as well. I think building the indexes demands a lock on the cube, and since you are not loading anything else you should be able to manage it. Lastly, is there no way you can remodel this cube and its flow? Do you really need full data loads?
    Note: As far as I understand, indexes are created to improve performance at the loading and query level. TRUE.
    Hope it helps,
    Regards,
    Sunmit.
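    The delete-index / load / re-create-index pattern discussed above can be demonstrated with any relational database. A small SQLite sketch (illustrative only; BW manages the cube's fact-table indexes itself):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact (account TEXT, amount REAL)")
con.execute("CREATE INDEX idx_account ON fact (account)")

con.execute("DROP INDEX idx_account")      # 1. delete index before the load
con.executemany(                           # 2. bulk load runs faster unindexed
    "INSERT INTO fact VALUES (?, ?)",
    [(f"{i:09d}", float(i)) for i in range(1000)])
con.execute("CREATE INDEX idx_account ON fact (account)")  # 3. rebuild for queries

count, = con.execute("SELECT COUNT(*) FROM fact").fetchone()
print(count)  # 1000
```

    Skipping step 3 makes the load window shorter but leaves every query on the indexed column slower until the index is rebuilt, which is exactly the trade-off described in the replies.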

  • Tools for analysing DB requests when loading a cube

    Hello everyone !
    I have seen in the documentation a screen in BW where you can see, while a cube is loading, useful information about the SQL statements (request code, elapsed time...).
    (It may be accessible from SM50 while the processes are running.)
    Does anyone have an idea about this monitoring tool (transaction name)?
    Thanks
    Best Regards
    Fred

    It's much like ST05!
    When I load a cube, I have for instance an update routine for the InfoObject XXX.
    Is it possible to get, for the whole data package, the elapsed time for this particular SQL statement?
    Thanks anyway,
    Best Regards
    Fred
