Data load (Urgent)??

Hi all,
I am struggling with an issue; please help.
In my query I have one IO, product group (a navigation attribute of the IO 0HTproduct in the cube). There are no values for the IO product group, but another cube that contains this IO does have values. Can someone tell me why, and how to resolve it? Thank you.
J.

Hi Bhanu,
Have you loaded master data for 0HTproduct (and activated it)?
Based on your question, here is what I did.
<b>0HTproduct has a reference characteristic, 0product.
0product group comes from 0product, and the 0product master data was already loaded (I checked the InfoPackage under it and it showed the master data is there). So all I needed to do was activate '0product group' as a navigation attribute in the cube, which I did. But I still don't see the values.</b> If my understanding of the concept is wrong, please correct me. Thank you.
J.

Similar Messages

  • Data Loading Question - Urgent

    Dear Experts,
    I have to load 0FI_GL_4. I want to do an init with data transfer, but there are 120+ million records, so I want to split the data load into smaller chunks filtered by fiscal period, e.g. 199701 - 199712, 199801 - 199812, ..., 200801 - 200812.
    I would really appreciate it if someone could guide me on how to load the data in chunks. We are on BI 7.0.
    Thanks.
    Regards,
    Jamspam
    Edited by: jamspam jamspam on May 27, 2008 4:42 PM

    Hi,
    Try checking the record count for a time period, say 6 months or a year, and split the selections so that each data set has a reasonable record count, around 10-15 million; then perform multiple inits for those ranges (see the example below).
    You can use RSA3 to check the record counts.
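    For example (illustrative figures, not from the thread): if RSA3 shows roughly 10-15 million records per fiscal year, create one InfoPackage per year, each running an init with data transfer on a disjoint 0FISCPER selection: 199701 - 199712, then 199801 - 199812, and so on up to 200801 - 200812. The delta that follows then covers the union of all successful init selections.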
    Hope this helps.
    Thanks,
    JituK

  • Urgent help required - ASO data loading

    I am working on 9.3.0 and want to do incremental data loading so that it won't affect my past data. I am still not sure whether that is doable. The other question is:
    do I need to design a new aggregation after loading the data?
    Thanks in advance

    Hi,
    The ASO cube will clear off all of its data if you make certain structural changes to the outline. What exactly clears the data depends on the change: adding a new member clears it, for example, while just adding a formula to an existing member might not.
    If you don't want to affect the past data and still want to load incrementally, ensure that all the members are already in the outline (take care of the time dimension: have all the time members in place), and then load using the 'add to existing values' option.
    But remember, you can only do this without structural changes; otherwise, you need to load everything again.
    It is good to design an aggregation, as it helps retrieval performance, but it is not mandatory.
    Sandeep Reddy Enti
    HCC

  • Data loading error- URGENT

    Hi all,
    One of the jobs pulling data from R/3 has failed, and the error message I see in the Status section is below. Can you please tell me what I should do for a successful data load? I appreciate your help.
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    in the source system
    thanks,
    Sabrina.

    Hi all,
    I rescheduled the data package and I can see all the records extracted properly, but I guess the ODS activation is taking time. The status bar in the monitor says
    Data successfully updated
    Diagnosis
    The request has been updated successfully.
    InfoSource : Z_PARTNER_FUNCTION
    Data type : Transaction Data
    Source system: CRPCLNT300
    The step-by-step analysis has everything set to green, but there is no Error Analysis tab.
    In the Details tab, however,
    Extraction, Transfer, Subsequent Processing, and ODS Activation are still yellow, and the overall job status on the left-hand side is also still yellow. What should I do now? Please let me know.
    thanks all
    sabrina.

  • Data load error--urgent

    hi friends ,
    we are getting a data load error: the Status tab shows 'Process 000087 determined for InfoObject DUVTYPE started', and the Details tab shows 'update (251 new / 160 changed): Errors occurred'.
    What should I conclude from this information, and what are the possible reasons for this load failure and the methods to rectify it?
    The load runs through a process chain and is also a delta load.
    Can I delete the request and start the InfoPackage manually? Will that make my process chain green automatically, or should I repeat from the subsequent process?
    Please help.
    I will award points.
    regards,
    siddartha

    Hi siddharta,
    look in the details of the monitor to see the error. Be very cautious with deltas! They cannot always be repeated; it depends on the DataSource. What you can do in the meantime is delete the failed request from all(!) of the data targets.
    Regards,
    Jzergen

  • ODS to cube data load, very urgent

    Hello all,
    I have started the init from the ODS to the cube (data packet size 1,200,000) and it is taking a hell of a long time; there are 85 data packets. Can someone suggest how I can speed up the data load?
    regards sanjeev

    hi
    1) Stop the load first. Check the job log and ST22 for any short dumps.
    2) Go to the generated export InfoSource (its name starts with 8).
    3) Create a new InfoPackage with the processing option 'PSA and then into data targets'.
    4) It will take a little time, but it won't fail.
    5) This will process the load package by package.
    hope it helps
    regards
    AK

  • Urgent: Master Data Load

    Hi All
    I have activated some master data InfoObjects, but I can't find them in the InfoSource area to load master data (attributes, texts, and hierarchies), although the application component is displayed in the InfoObject definition. Can someone tell me the reason behind this?
    Thanks in Advance
    ~Alise

    dear alise
    while activating the InfoObject, use the grouping 'data flow before and after'; this makes sure its application area is also activated.
    Now, coming back to your query on the master data load: once you activate the InfoObject with 'data flow before and after', its InfoSource will also be activated from the content version to the active version. In the InfoSource definition you will find the communication structure and the DataSource assignment (for texts, attributes, and hierarchies) for the source system. Select them as per your requirement, make sure the field mapping is OK, and activate the InfoSource for texts, attributes, and hierarchies.
    Now you can create InfoPackages for these elements (separate InfoPackages for texts, attributes, and hierarchies) and load the master data to the InfoObject via the PSA, so that you can see the data in the PSA.
    Confirm that your data has reached the table properly.
    I hope this resolves your problem.
    Kindly assign points if it helps.

  • Query performance and data loading performance issues

    What query performance issues do we need to take care of? Please explain and let me know the relevant transaction codes. It's urgent.
    What data loading performance issues do we need to take care of? Please explain and let me know the relevant transaction codes. It's urgent.
    I will reward full points.
    Regards,
    Guru

    BW Back end
    Some Tips -
    1)Identify long-running extraction processes on the source system. Extraction processes are performed by several extraction jobs running on the source system. The run-time of these jobs affects the performance. Use transaction code SM37 — Background Processing Job Management — to analyze the run-times of these jobs. If the run-time of data collection jobs lasts for several hours, schedule these jobs to run more frequently. This way, less data is written into update tables for each run and extraction performance increases.
    2)Identify high run-times for ABAP code, especially for user exits. The quality of any custom ABAP programs used in data extraction affects the extraction performance. Use transaction code SE30 — ABAP/4 Run-time Analysis — and then run the analysis for the transaction code RSA3 — Extractor Checker. The system then records the activities of the extraction program so you can review them to identify time-consuming activities. Eliminate those long-running activities or substitute them with alternative program logic.
    3)Identify expensive SQL statements. If database run-time is high for extraction jobs, use transaction code ST05 — Performance Trace. On this screen, select ALEREMOTE user and then select SQL trace to record the SQL statements. Identify the time-consuming sections from the results. If the data-selection times are high on a particular SQL statement, index the DataSource tables to increase the performance of selection (see no. 6 below). While using ST05, make sure that no other extraction job is running with ALEREMOTE user.
    4)Balance loads by distributing processes onto different servers if possible. If your site uses more than one BW application server, distribute the extraction processes to different servers using transaction code SM59 — Maintain RFC Destination. Load balancing is possible only if the extraction program allows the option
    5)Set optimum parameters for data-packet size. Packet size affects the number of data requests to the database. Set the data-packet size to optimum values for an efficient data-extraction mechanism. To find the optimum value, start with a packet size in the range of 50,000 to 100,000 and gradually increase it. At some point, you will reach the threshold at which increasing packet size further does not provide any performance increase. To set the packet size, use transaction code SBIW — BW IMG Menu — on the source system. To set the data load parameters for flat-file uploads, use transaction code RSCUSTV6 in BW.
    6)Build indexes on DataSource tables based on selection criteria. Indexing DataSource tables improves the extraction performance, because it reduces the read times of those tables.
    7)Execute collection jobs in parallel. Like the Business Content extractors, generic extractors have a number of collection jobs to retrieve relevant data from DataSource tables. Scheduling these collection jobs to run in parallel reduces the total extraction time, and they can be scheduled via transaction code SM37 in the source system.
    8)Break up your data selections for InfoPackages and schedule the portions to run in parallel. This parallel upload mechanism sends different portions of the data to BW at the same time, and as a result the total upload time is reduced. You can schedule InfoPackages in the Administrator Workbench.
    You can upload data from a data target (InfoCube or ODS) to another data target within the BW system. While uploading, you can schedule more than one InfoPackage, with different selection options in each one. For example, fiscal year or fiscal year period can be used as selection options. Avoid using parallel uploads for high volumes of data if hardware resources are constrained. Each InfoPackage uses one background process (if scheduled to run in the background) or dialog process (if scheduled to run online) of the application server, and too many processes could overwhelm a slow server.
    9)Building secondary indexes on the tables for the selection fields optimizes these tables for reading, reducing extraction time. If your selection fields are not key fields on the table, primary indexes are not much help when accessing data. In this case it is better to create secondary indexes with the selection fields on the associated table, using the ABAP Dictionary, to improve selection performance.
    10)Analyze upload times to the PSA and identify long-running uploads. When you extract the data using PSA method, data is written into PSA tables in the BW system. If your data is on the order of tens of millions, consider partitioning these PSA tables for better performance, but pay attention to the partition sizes. Partitioning PSA tables improves data-load performance because it's faster to insert data into smaller database tables. Partitioning also provides increased performance for maintenance of PSA tables — for example, you can delete a portion of data faster. You can set the size of each partition in the PSA parameters screen, in transaction code SPRO or RSCUSTV6, so that BW creates a new partition automatically when a threshold value is reached.
    11)Debug any routines in the transfer and update rules and eliminate single selects from the routines. Using single selects in custom ABAP routines for selecting data from database tables reduces performance considerably. It is better to use buffers and array operations (see the first sketch after this list). When you use buffers or array operations, the system reads data from the database tables and stores it in memory for manipulation, improving performance. If you do not use buffers or array operations, the whole reading process is performed on the database with many table accesses, and performance deteriorates. Also, extensive use of library transformations in the ABAP code reduces performance; since these transformations are not compiled in advance, they are carried out at run-time.
    12)Before uploading a high volume of transaction data into InfoCubes, activate the number-range buffer for dimension IDs. The number-range buffer is a parameter that identifies the number of sequential dimension IDs stored in the memory. If you increase the number range before high-volume data upload, you reduce the number of reads from the dimension tables and hence increase the upload performance. Do not forget to set the number-range values back to their original values after the upload. Use transaction code SNRO to maintain the number range buffer values for InfoCubes.
    13)Drop the indexes before uploading high-volume data into InfoCubes. Regenerate them after the upload. Indexes on InfoCubes are optimized for reading data from the InfoCubes. If the indexes exist during the upload, BW reads the indexes and tries to insert the records according to the indexes, resulting in poor upload performance. You can automate the dropping and regeneration of the indexes through InfoPackage scheduling. You can drop indexes in the Manage InfoCube screen in the Administrator Workbench (see the second sketch after this list).
    14)IDoc (intermediate document) archiving improves the extraction and loading performance and can be applied on both BW and R/3 systems. In addition to IDoc archiving, data archiving is available for InfoCubes and ODS objects.
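    To make tip 11 concrete, here is a minimal ABAP sketch, assuming an update routine where DATA_PACKAGE is available and carries a field MATNR; the lookup table ZMAT_TEXT and its fields MATNR and TXTMD are likewise illustrative assumptions, not objects from this thread:
    " Slow pattern: one database round trip per record.
    DATA: ls_data LIKE LINE OF data_package.
    LOOP AT data_package INTO ls_data.
      SELECT SINGLE txtmd FROM zmat_text INTO ls_data-txtmd
        WHERE matnr = ls_data-matnr.
      MODIFY data_package FROM ls_data.
    ENDLOOP.
    " Faster pattern: buffer the lookup table once, then read from memory.
    DATA: lt_text TYPE SORTED TABLE OF zmat_text WITH UNIQUE KEY matnr,
          ls_text TYPE zmat_text.
    IF data_package[] IS NOT INITIAL.
      " One array fetch instead of one select per record.
      SELECT matnr txtmd FROM zmat_text
        INTO CORRESPONDING FIELDS OF TABLE lt_text
        FOR ALL ENTRIES IN data_package
        WHERE matnr = data_package-matnr.
    ENDIF.
    LOOP AT data_package INTO ls_data.
      READ TABLE lt_text INTO ls_text WITH TABLE KEY matnr = ls_data-matnr.
      IF sy-subrc = 0.
        ls_data-txtmd = ls_text-txtmd.
        MODIFY data_package FROM ls_data.
      ENDIF.
    ENDLOOP.
    Tip 13 can likewise be scripted with the standard index maintenance function modules (a sketch; the cube name is a placeholder):
    " Drop secondary indexes before the load, rebuild them afterwards.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = 'ZSALESCUBE'.   " placeholder cube name
    " ... run the data load here ...
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
      EXPORTING
        i_infocube = 'ZSALESCUBE'.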
    Hope it Helps
    Chetan
    @CP..

  • Error regarding data load into Essbase cube for Measures using ODI

    Hi Experts,
    I am able to load metadata for dimensions into an Essbase cube using ODI, but when we try the same for loading data for Measures, we encounter the following errors:
    Time,Item,Location,Quantity,Price,Error_Reason
    '07_JAN_97','0011500100','0000001001~1000~12~00','20','12200','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','30','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    '28_JAN_97','0011500100','0000001300~1000~12~00','500','667700','Cannot end dataload. Essbase Error(1003014): Unknown Member [0011500100] in Data Load, [1] Records Completed'
    Can anyone look into this and reply quickly, as it's an urgent requirement?
    Regards,
    Rohan

    We are having a similar problem. We're using the IKM SQL to Hyperion Essbase (DATA) knowledge module. We are mapping the actual data to the field called 'Data' in the model. But it kicks everything out saying 'Unknown Member [Data] in Data Load', as if it's trying to read that field as a dimension member. We can't see what we missed in building the interface. I would think the knowledge module would just know that the Data field is, um, data; not a dimension member. Has anyone else encountered this?
    Sabrina

  • Data load from R/3....(problem)

    Hello all,
    I have a query regarding a data load.
    The business scenario is:
    1) I did an init delta 2 months ago and started deltas at that time.
    2) At some point I needed to delete the whole set of data.
    3) The present RSA7 state in R/3 is: delta queue: 0 records; repeat delta: 53,000 records.
    4) I reloaded all data to date into BW through a process chain (full load).
    5) Now, my questions:
    Do I need to do the init delta again and then start deltas?
    Will there be any missing data?
    What will happen to the repeat delta queue data?
    Please tell me the steps I should take.
    Thanks...

    hello,
    One more thing:
    I have to load by fiscal period.
    April 2007 is 001.2007, with 013.2006 as a special period.
    I have now loaded up to March 2007 (012.2006).
    So, before starting deltas, is this sequence for loading the data correct?
    1) 013.2006 to 016.2006 <b>(data for these periods may or may not exist)</b>
    2) 001.2007 to 002.2007 <b>(the current open months)</b> (full load)
    3) Then init delta, then delta?
    Please help me, it's urgent.
    Thanks....

  • Master data load gives dump with exception CX_RSR_X_MESSAGE

    Hi all,
    I scheduled a master data load for 0MATERIAL on our BI 7.0 quality server. The load ended in a dump with the exception CX_RSR_X_MESSAGE. The data has loaded into the PSA (37,257 of 37,257 records), but the technical status is red and the master data object has not been updated.
    I have seen note 615389, which talks about number ranges, but that check is successful. The data load was successful on the development server; this error has occurred only in quality.
    We are on SP 12 of BI 7.0.
    We are moving to production soon, so this is an urgent issue. I would appreciate a quick response. Points will be rewarded.
    with regards,
    Ashish

    Hi,
    The log of the dump is given below.
    Error analysis
    An exception occurred which is explained in detail below.
    The exception, which is assigned to class 'CX_RSR_X_MESSAGE', was not caught
    and
    therefore caused a runtime error.
    The reason for the exception is:
    No text available for this exception
    Information on where terminated
    Termination occurred in the ABAP program "SAPLRRMS" - in "RRMS_X_MESSAGE".
    The main program was "SAPMSSY1 ".
    In the source code you have the termination point in line 78
    of the (Include) program "LRRMSU13".
    Source Code Extract (>>>>> marks the termination point, line 78):
    Line  Source code
      48  *... send the message to the message handler
      49
      50  *... S message if no handler is initialized
      51        DATA: l_type  TYPE smesg-msgty,
      52              l_zeile TYPE rrx_mesg-zeile.
      53        CALL FUNCTION 'MESSAGES_ACTIVE'
      54          IMPORTING
      55            zeile      = l_zeile
      56          EXCEPTIONS
      57            not_active = 1.
      58        IF sy-subrc NE 0.
      59          l_type = 'S'.
      60        ELSE.
      61          l_type = 'W'.
      62        ENDIF.
      63
      64        CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
      65          EXPORTING
      66            i_class  = 'BRAIN'
      67            i_type   = l_type
      68            i_number = '299'
      69            i_msgv1  = i_program
      70            i_msgv2  = i_text.
      71      ELSE.
      72        DATA: l_text  TYPE string,
      73              l_repid TYPE syrepid.
      74
      75        l_repid = i_program.
      76        l_text  = i_text.
      77
      78  >>>>> RAISE EXCEPTION TYPE cx_rsr_x_message
      79          EXPORTING text    = l_text
      80                    program = l_repid.
      81      ENDIF.
      82
      83  ENDFUNCTION.
    I went to SM21 and went through the system log. There was an entry with the following text: ORA-20000: Insufficient privileges#ORA-06512: at
    "SYS.DBMS_STATS", line 2150#ORA-06512: at "SYS.DBMS_STATS.
    I found note 963760 for this error. After making the changes described in that note, I reloaded the data and it worked fine. A data load for another DataSource, which had also dumped, now worked as well.
    I am not sure if the note did the trick, so I am keeping this post open in case someone finds another solution or can confirm the one that worked for me.
    Thanks Oliver and Oscar, for replying.
    regards,
    Ashish

  • Error in Master data load because of ###

    Hello,
    I am getting an error in a master data load.
    The problem is that ####### symbols are encountered in a few records.
    We have defined a formula in the transfer structure to replace the ### with S.
    But it still does not replace the #### symbols, and the load gives an error.
    However, when I insert the ####### symbols manually in the PSA and try to load, they are replaced according to the formula definition and the data loads.
    Suggestions are required urgently.
    Thanks in advance
    Ram

    Hello srikanth,
    Check the code below.
    I have used it in the update rule to replace the #### symbol with A.
    But when I check it in debug mode, it does not recognize the symbol, and sy-subrc is always 4.
    GV_STRING = COMM_STRUCTURE-/BIC/XXX.
    FIND '#' IN GV_STRING.
    IF SY-SUBRC = 0.
      REPLACE ALL OCCURRENCES OF '#' IN GV_STRING WITH 'A'.
    ENDIF.
    " result value of the routine
    RESULT = GV_STRING.
    Thanks
    Ram
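    Note: in the ABAP debugger, '#' is usually only the display stand-in for a non-printable character, so searching for the literal character '#' finds nothing, which matches the sy-subrc = 4 above. A minimal sketch of an alternative, under the assumption that the culprit is a common control character such as a tab or a carriage return/line feed:
    " Replace common control characters instead of the literal '#'.
    DATA: LV_VALUE TYPE STRING.
    LV_VALUE = COMM_STRUCTURE-/BIC/XXX.
    REPLACE ALL OCCURRENCES OF CL_ABAP_CHAR_UTILITIES=>HORIZONTAL_TAB
      IN LV_VALUE WITH 'A'.
    REPLACE ALL OCCURRENCES OF CL_ABAP_CHAR_UTILITIES=>CR_LF
      IN LV_VALUE WITH 'A'.
    RESULT = LV_VALUE.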

  • Oracle database table data: load it into Excel

    Hello All,
    Please I need your help for this problem:
    I need to read Oracle database table data, load it into Excel, and save it in .xls format.
    Example: select * from Sales, and load the result into Excel.
    I would appreciate any sample code to help me do that. Please help me out; this is very urgent.
    Thanks a lot and best regards,
    anbu

    >
    I need to read Oracle database table data, load it into Excel, and save it in .xls format.
    Example: select * from Sales, and load the result into Excel.
    I would appreciate any sample code to help me do that. Please help me out; this is very urgent.
    >
    Nothing in these forums is 'urgent'. If you have an urgent problem you should contact Oracle support or hire a consultant.
    You have proven over and over again that you are not a good steward of the forums. You continue to post questions that you say are 'urgent' but rarely take the time to mark your questions ANSWERED when they have been.
    Total Questions: 90 (78 unresolved)
    Are you willing to make a commitment to revisit your 78 unresolved questions and mark them ANSWERED if they have been?
    The easiest way to export Oracle data to Excel is to use sql developer. It is a free download and this article by Jeff Smith shows how easy it is
    http://www.thatjeffsmith.com/archive/2012/09/oracle-sql-developer-v3-2-1-now-available/
    >
    And One Last Thing
    Speaking of export, sometimes I want to send data to Excel. And sometimes I want to send multiple objects to Excel – to a single Excel file, that is. In version 3.2.1 you can now do that. Let's export the bulk of the HR schema to Excel, with each table going to its own worksheet in the same workbook.
    >
    And you have previously been asked to read the FAQ at the top of the thread list. If you had done that, you would have seen that there is a FAQ entry with links to many ways, with code, to export data to Excel.
    5. How do I read or write an Excel file?
    SQL and PL/SQL FAQ

  • Data loading problem with Movement types

    Hi Friends,
    I extracted data to the BW side using the DataSource General Ledger: Line Item Data (0FI_GL_4).
    The problem is that the movement types are missing for some documents.
    But when I check in RSA3, they show up correctly.
    When I restrict the data at the InfoPackage level in BW to one particular document, the data loads perfectly.
    This DataSource has 53,460 records; among all the records, the movement types are missing for 400 records of doc type 'WE'.
    Please give me a solution for how to load the data with the movement types.
    I checked the particular document 50000313 in RSA3 and it shows the movement types; when I then loaded the data into BW, those movement types did not come across. But when I restricted the InfoPackage to the particular doc 50000313 and loaded, the movement types loaded correctly. This extractor has 55,000 records.
    This is a very urgent problem. Please reply urgently; I am waiting for your replies.
    Thanks & Regards,
    Guna.
    Edited by: gunasekhar raya on May 8, 2008 9:40 AM

    Hi,
    we enhanced the General Ledger (0FI_GL_4) extractor with the movement type field (MSEG-BWART).
    This field is populated with data for all accounting document numbers
    except the range 50000295 to 50000615, where we are not getting the movement type values.
    We didn't write any routines at the transfer or update rule level;
    we just mapped the BWART field to the 0MOVETYPE InfoObject.
    When we restrict the InfoPackage to the particular doc no 50000313, the data loads into the cube with the movement types.
    But when we remove the restriction at the InfoPackage level and load, the movement types are missing for doc numbers 50000295 to 50000615.
    Please give me a solution for this; I need to solve it very urgently.
    I am waiting for your reply.
    Thanks,
    Guna.
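    For context: an enhancement like the one described is typically filled in the transaction-data user exit EXIT_SAPLRSAP_001 (include ZXRSAU01). The sketch below is a hypothetical illustration of that pattern, not the actual code from this thread; the append fields ZZBWART, ZZMBLNR, ZZMJAHR, and ZZZEILE and the MSEG link logic are assumptions. Note that if the SELECT finds no matching material document line for a range of FI documents, the field simply stays blank, which would match the symptom described above.
    " Hypothetical user-exit sketch (include ZXRSAU01).
    CASE i_datasource.
      WHEN '0FI_GL_4'.
        DATA: l_s_data TYPE dtfigl_4.   " extract structure of 0FI_GL_4
        LOOP AT c_t_data INTO l_s_data.
          " ZZMBLNR/ZZMJAHR/ZZZEILE are assumed append fields carrying the
          " material document reference; adjust to the real link logic.
          SELECT SINGLE bwart FROM mseg INTO l_s_data-zzbwart
            WHERE mblnr = l_s_data-zzmblnr
              AND mjahr = l_s_data-zzmjahr
              AND zeile = l_s_data-zzzeile.
          " If sy-subrc <> 0, zzbwart stays initial: no movement type in BW.
          MODIFY c_t_data FROM l_s_data.
        ENDLOOP.
    ENDCASE.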

  • BW Production Data Loads

    Today our BW production data load failed.
    When I right-click the data target and choose Manage, I can find the request with a red traffic light, but when I try to click the TV-screen (monitor) icon, the system hangs and does not let me into the request.
    The same thing happens when I try to access the request from RSMON,
    and when I try to check today's logs in the process chain.
    I think this is something related to a background job.
    Please help, it's urgent. Please give me a detailed procedure, as this is a production system.
    Thanks

    ST22 ERRORS
    21.03.2006     09:17:57     sbcpap01     SKELLY2     400     C     MESSAGE_TYPE_X               CL_GUI_CFW====================CP     1
    21.03.2006     04:08:09     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     04:03:18     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     03:57:22     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     03:51:37     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
    21.03.2006     03:45:55     sbcpcibw     ALEREMOTE     400     C     TIME_OUT               SAPLSENA     1
