Invalid Characters during data load

Hi All,
We are on BI 7.0, and it is Unicode compliant with code page 4102.
My setting in RSKC is ALL_CAPITAL. When I tried loading data from the source system, I got many error records in BW saying that they contain invalid characters. On debugging, I found that hexadecimal values 00-1F come as part of the record. Why are control characters like 00-1F not allowed in BW by default? I changed the RSKC setting to ALL_CAPITAL_PLUS_HEX, but it was of no use.
When I checked in the I18N transaction, I could see that code page 4102 can accept all control characters.
Is there any setting that has to be made in order to allow these hexadecimal characters in BW [without converting them to spaces using a routine]?
Thanks and Regards
Amith

Hi Amith Anand,
Please check program RSKC_ALLOWED_CHAR_MAINTAIN or RSKCCHECK.
Let me know.
Regards
Hari

Similar Messages

  • How to debug a transfer rule during data load?

    I am conducting a flat file (Excel sheet saved as a CSV file) data load.  The flat file contains a date field and the value is '12/18/1988'.  In the transfer rule for this field, I use a function call to convert this value to '19881218', which corresponds to the BW DATS format, but the monitor of the InfoPackage shows a red error:
    "Value '1981218' of characteristic 0DATE is not a number with 000008 spaces".
    Somehow, the last digit of the year 1988 was cut off, so the year picked up is 198 instead of 1988.  The function code is below:
    FUNCTION zdm_convert_date.
    *"*"Local Interface:
    *"  IMPORTING
    *"     REFERENCE(CHARDATE) TYPE  STRING
    *"  EXPORTING
    *"     REFERENCE(DATE) TYPE  D
      DATA:
        c_date(2)          TYPE c,
        c_month(2)         TYPE c,
        c_year(4)          TYPE c,
        c_date_combined(8) TYPE c.
      DATA: text(10).
      text = chardate.
    * Pad a single-digit month with a leading zero so the fixed offsets below fit.
      SEARCH text FOR '/'.
      IF sy-fdpos = 1.
        CONCATENATE '0' text INTO text.
      ENDIF.
    * The input is MM/DD/YYYY; rearrange it into YYYYMMDD (DATS format).
      c_month = text(2).
      c_date  = text+3(2).
      c_year  = text+6(4).
      CONCATENATE c_year c_month c_date INTO c_date_combined.
      date = c_date_combined.
    ENDFUNCTION.
    Could experts here tell me what's wrong, and also how to debug a transfer rule during a data load?
    Thanks

    hey Bhanu/AHP,
    I found the reason.  Originally, I set the character length for the date InfoObject ZCHARDAT1 to 9, then I found that the date field value (12/18/1988) is 10 characters long.  So I modified the InfoObject ZCHARDAT1 length from 9 to 10 and activated it.  But when defining the transfer rule for this field, before the code screen, I clicked the radio button "Selected Fields", picked the field /BIC/ZCHARDAT1 and continued to the transfer rule code screen, where the declaration lines for the InfoObject /BIC/ZCHARDAT1 are as follows:
      InfoObject ZCHARDAT1: CHAR - 000009
        /BIC/ZCHARDAT1(000009) TYPE C,
    That means that even though I have modified the length to 10 for the InfoObject and activated it, the transfer rule code screen still takes the old length 9.  Any idea how to get it to take the length 10 in the transfer rule code screen definition?
    Thanks

  • Number of parallel process definition during data load from R/3 to BI

    Dear Friends,
    We are using BI 7.00. We have a requirement to increase the number of parallel processes during data load from R/3 to BI. I want to modify this for a particular DataSource and check the effect. Can experts provide helpful answers to the following questions?
    1) When a load is taking place, or has taken place, where can we see how many parallel processes that particular load has used?
    2) Where should I change the setting for the number of parallel processes for the data load (from R/3 to BI), and not within BI?
    3) How does the system work, and what is the net result of increasing or decreasing the number of parallel processes?
    Expecting experts' help.
    Regards,
    M.M

    Dear Des Gallagher,
    Thank you very much for the useful information provided. The following was my observation.
    From the posts in this forum, I understood that the setting for a specific DataSource can be made at the InfoPackage and DTP level. I did the same and found that there is no change in the load, i.e., the system by default takes only one parallel process even though I maintained 6.
    Can you kindly explain the above point, i.e.,
    1) Even though the value is maintained at the InfoPackage level, will the system consider it or not? If not, from which transaction does the system derive the single parallel process?
    Actually we wanted to increase the package size, but we failed because I could not understand which values have to be maintained. Can you explain in detail?
    Can you clarify my doubt and provide a solution?
    Regards,
    M.M

  • Error during data load due to special characters in source data

    Hi Experts,
    We are trying to load billing data into BW using the billing item DataSource. It seems that there are some special characters in the source data. When a record with these characters is encountered, the request turns red and the package is not loaded even into the PSA. The error we get in the monitor is something like
    'RECORD 5028: Contents from field ****  cannot be converted into type CURR',
    where the field **** is a key figure of type currency. We managed to identify the record in question in RSA3 on the source system and found that one of the fields contains some invalid (special) characters that show up as squares in RSA3. The data in the rest of the fields, including the fields mentioned in the error, looks correct.
    Our source system is a non-Unicode system whereas the BW system is Unicode enabled. I figure that the data in the rest of the fields is getting misaligned due to the presence of the invalid characters in the above field. This was confirmed when we unassigned the field with the special characters from the transfer rules and removed the source field from the transfer structure. After doing this the data was loaded successfully and the request turned green.
    Can anyone suggest a way to either filter out such invalid characters from the source data, or make some settings in the BW system so that these special characters are no longer invalid in BW? We cannot write code in the transfer rules because the data package does not even arrive in the PSA. Is there any other method to solve this problem?
    Regards,
    Ted

    Hi Ted,
    I was wondering: whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, as currencies are defined by SAP and the 3-character currency code part is plain English.
    Could this be because of some inconsistency in the data?
    I would like to know which currency had the special characters in that particular record.
    Hope that helps.
    Regards
    Mr Kapadia

  • NonAscii Characters after data load

    Oracle 10g R2 on HP-UX - Excel 2007.
    I am loading one table of data from Microsoft Excel to SQL Server 2008, and then from SQL Server 2008 to an Oracle 10g R2 database.
    I am accomplishing the above task using an SSIS package (ETL tool).
    When I point the SSIS package to the production database (Oracle database), some of the data in a few columns has non-ASCII characters.
    When the same SSIS package is pointed to the test Oracle database there is no data issue. The Oracle test and production databases are on two different boxes. Why does the same Excel data and SSIS package not work on the production box? Are there any initialization parameters that affect this data load? Or do I have to change any code?
    Thank you for your help.
    smith

    The OS on both boxes is the same:
    HP-UX B.11.11 U (64). When I look at the patch level there are more than 100 files to compare.
    select * from NLS_DATABASE_PARAMETERS;
    Running the above query resulted in 20 rows as output.
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_LANGUAGE     AMERICAN
    NLS_TERRITORY     AMERICA
    NLS_CURRENCY     $
    NLS_ISO_CURRENCY     AMERICA
    NLS_NUMERIC_CHARACTERS     .,
    NLS_CHARACTERSET     WE8ISO8859P1
    NLS_CALENDAR     GREGORIAN
    NLS_DATE_FORMAT     DD-MON-RR
    NLS_DATE_LANGUAGE     AMERICAN
    NLS_SORT     BINARY
    NLS_TIME_FORMAT     HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT     DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT     HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT     DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY     $
    NLS_COMP     BINARY
    NLS_LENGTH_SEMANTICS     BYTE
    NLS_NCHAR_CONV_EXCP     FALSE
    NLS_RDBMS_VERSION     10.2.0.2.0
    The above query result is identical, bit for bit, on the production and test boxes.
    Is the load process exactly the same?
    I just change the connection in SSIS to point to the production database. Also, the table structure (data types, etc., as far as I can see through SQL Developer) is the same in the production and test databases.
    What else could be the reason for this?
    Smith
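    One quick way to narrow this down is to list the rows that actually contain non-ASCII bytes after the load by comparing each value with its ASCIISTR() form; the table and column names in this sketch are placeholders, not taken from the thread:
    -- Placeholder names: list rows whose value changes when non-ASCII
    -- characters are escaped by ASCIISTR(), i.e. rows containing at
    -- least one non-ASCII character.
    SELECT id,
           suspect_col,
           ASCIISTR(suspect_col) AS escaped_form
    FROM   my_table
    WHERE  suspect_col <> ASCIISTR(suspect_col);
    Running the same check on test and production can show whether the characters were already different before they reached Oracle or were converted during the load (for example by a client character set that differs from WE8ISO8859P1).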

  • FDM event scripts firing twice during data loads

    Here's an interesting one. I have added the following to three different event scripts (one at a time, ensuring only one of these exists at any one time), to clear data before loading to Essbase:
    Event Script content:
    ' Declare local variables
    Dim objShell
    Dim strCMD
    ' Call MaxL script to run data clear calculation.
    Set objShell = CreateObject("WScript.Shell")
    strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
    API.DataWindow.Utilities.mShellAndWait strCMD, 0
    MaxL Script:
    login ******* identified by ******* on *******;
    execute calculation 'FIX("Member1","Member2") CLEARDATA "Member3"; ENDFIX' on *******.*******;
    exit;
    However, it appears that the clear is carried out twice, both before and after the data has been loaded to Essbase. This has been verified at each step by checking the Essbase application log:
    No event script:
    - No Essbase data clear in application log
    Adding above to "BefExportToDat" event script:
    - Script is executed once after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in Essbase Application log.
    - Script is then executed a second time when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
    Adding above to "AftExportToDat" event script:
    - Script is executed once after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in Essbase Application log.
    - Script is then executed a second time when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
    Adding above to "BefLoad" event script:
    - Script is NOT executed after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed).
    - Script is executed AFTER the data load to Essbase when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
    Some notes on the above:
    1. "BefExportToDat" and "AftExportToDat" are both executed twice, before and after the "Target System Load" modal popup. :-(
    2. "BefLoad" is executed AFTER the data is loaded to Essbase! :-( :-(
    Does anyone have any idea how we might execute an Essbase database clear before loading data, and not after we have loaded fresh data? And perhaps why the above event scripts appear to be firing twice?! There does not appear to be any logic to this!
    BefExportToDat - Essbase Application Log entries:
    +[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013091)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013162)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1012555)+
    +Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]+
    +...+
    +[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003037)+
    Data Load Updated [98] cells
    +[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003024)+
    Data Load Elapsed Time : [0.52] seconds
    +...+
    +[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013091)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013162)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1012555)+
    +Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]+
    AftExportToDat - Essbase Application Log entries:
    +[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013091)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013162)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1012555)+
    +Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]+
    +...+
    +[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003037)+
    Data Load Updated [98] cells
    +[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003024)+
    Data Load Elapsed Time : [0.52] seconds
    +...+
    +[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013091)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013162)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1012555)+
    +Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]+
    BefLoad - Essbase Application Log entries:
    +[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013091)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013162)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1012555)+
    +Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]+
    +...+
    +[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003037)+
    Data Load Updated [98] cells
    +[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003024)+
    Data Load Elapsed Time : [0.52] seconds
    +...+
    +[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013091)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013162)+
    +Received Command [Calculate] from user [admin@Native Directory]+
    +[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1012555)+
    +Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]+

    Hi Larry,
    As mentioned, our exports do not appear to be generating the "-B.Dat" and "-C.Dat" files at present. However, you are correct about the Export and Load event scripts firing twice (once for the main TB file and again for the journal file). Does this also mean they could fire an additional two times for the "-B.Dat" and "-C.Dat" files?
    On the last run, the output was as follows with the modified scripts:
    After clicking on Export in Workflow, the Target System Load modal popup is displayed, and the first two files have been generated:
    14.24.15.0527_BefExportToDat.txt
    14.24.17.0617_AftExportToDat.txt
    After clicking on OK in the Target System Load modal popup, the actual load to Essbase takes place. A further six files are generated:
    14.24.21.0289_BefLoad.txt
    14.24.22.0117_AftLoad.txt
    *14.24.22.0152_BefExportToDat-A.txt*
    *14.24.22.0414_AftExportToDat-A.txt*
    *14.24.22.0433_BefLoad-A.txt*
    *14.24.22.0652_AftLoad-A.txt*
    This makes a lot more sense, since one can see that the event scripts are being run a second time against the journal files during the data load. Many thanks, this solves my problem, as I can now place my script where I want in the process chain. It's just a shame that there are no separate event scripts to distinguish between the various .Dat exports/loads, which clearly occur at separate times in the process chain.
    Many thanks! :-)
    P.S. Updated script below if anyone wishes to use it:
    Sub BefExportToDat(strLoc, strCat, strPer, strTCat, strTPer, strFile)
        ' Write a timestamped marker file so we can see when, and for which
        ' export file, this event script fires.
        Dim strF, fso, tf, t, temp, m, miliseconds, strSuf
        t = Timer
        temp = Int(t)
        m = Int((t - temp) * 1000)
        miliseconds = String(4 - Len(m), "0") & m
        strF = "D:\TEST\" & Replace(Time, ":", ".") & "." & miliseconds & "_BefExportToDat"
        ' The suffix of the export file name (-A/-B/-C) tells us which .Dat
        ' file triggered the event; append it to the marker file name.
        strSuf = UCase(Left(Right(strFile, 6), 2))
        If strSuf = "-A" Or strSuf = "-B" Or strSuf = "-C" Then
            strF = strF & strSuf & ".txt"
        Else
            strF = strF & ".txt"
        End If
        Set fso = CreateObject("Scripting.FileSystemObject")
        Set tf = fso.CreateTextFile(strF, True)
        tf.WriteLine(strFile)
        tf.Close
        Set fso = Nothing
    End Sub

  • Query Execution during Data Loads (extraction)

    I think BI 7.0 permits that, but I would still like to confirm it with the gurus.
    Can we/users continue to access data or execute queries while extraction is going on? Can we load data during query execution?
    What are the pros and cons of doing that?
    Always appreciative of your help.
    Suresh

    Hi,
    Query execution will not really hamper data loading, or vice versa. But freshly loaded data will not be available for reporting before it gets activated in the InfoProvider. Also, in the case of a cube, if the 'delete overlapping requests' step is to be performed, there could be erroneous-looking data in the report until this step runs - that is, between the time the new load has come in and the time the old request is deleted. That is why loads are best scheduled when users are not working on the system.

  • Segmentation fault error during data load in parallel with multiple rules

    Hi,
    I'm trying to do a SQL data load in parallel with multiple rules files (4 or 5 rules, maybe), and I'm getting a "segmentation fault" error. I tested with 3 rules files and it worked fine. We're using Essbase System 9.3.2, with UDB (v8) as the SQL data source. The ODBC driver is DataDirect 5.2 DB2 Wire Protocol Driver (ARdb222). Please let me know if you have any information on this.
    thx.
    Y

  • Maxl Error during data load - file size limit?

    Does anyone know if there is a file size limit when importing data into an ASO cube via MaxL? I have tried to execute:
    Import Database TST_ASO.J_ASO_DB data
    using server text data_file '/XX/xXX/XXX.txt'
    using server rules_file '/XXX/XXX/XXX.rul'
    to load_buffer with buffer_id 1
    on error write to '/XXX.log';
    It errors out after about 10 minutes and gives "unexpected Essbase error 1130610". The file is about 1.5 GB of data. The file location is right. I have tried the same code with a smaller file and it works. Do I need to increase my cache or anything? I also got "DATAERRORLIMIT reached" and I cannot find the log file for this...? Thanks!

    Have you looked in the data error log to see what kind of errors you are getting? The odds are high that you are trying to load data into calculated members (or upper-level members), resulting in errors. It is most likely the former.
    You specify the error file with the
    on error write to '/XXX.log';
    statement. Have you looked for this file to find out why you are getting errors? Do yourself a favor: load the smaller file and look at the error file to see what kind of error you are getting. It is possible that your error file is larger than your load file, since multiple errors on a single load item may result in a restatement of the entire load line for each error.
    This is a starting point for your exploration into the problem.
    DATAERRORLIMIT is set in the config file; the default is 1000 and the maximum is 65000.
    NOMSGLOGGINGONDATAERRORLIMIT, if set to TRUE, just stops logging and continues the load when the data error limit is reached. I'd advise using this only in a test environment, since it doesn't solve the underlying problem of data errors.
    Probably what you'll have to do is ignore some of the columns in the data load that load into calculated fields. If you have some upper-level members, you could put them under a skip-loading condition.
    Let us know what works for you.

  • Errors during Data loads

    Hi,
    Our end users are preparing their scripts for UAT and I have been asked to provide a list of data load errors which they might need to test during UAT.
    Can someone please help me with this? This is very urgent!
    Thanks,
    RPK.
    Message was edited by:
            RPK

    Hi,
    1. No IDocs could be sent to SAP BW using RFC.
    2. IDocs were found in the ALE inbox for the source system that are not updated; processing is overdue.
    5. 1st - It is a PC file source system. 2nd - There are duplicate data records leading to the error.
    7. 1st - It is a PC file source system. 2nd - For uploading data from the PC file, it seems that the file was not kept on the application server and the job was scheduled in the background.
    8. The background processing was not finished in the source system. It is possible that the background processing in the source system was terminated. System response: there are incomplete data packets present.
    9. Looking at all the above error messages, we find that if you want to load data with the delta update, you must first initialize the delta process.
    10. Here we can see that the activation of the ODS has failed, as the request status in the ODS may not be green.
    13. The fiscal year is mostly the financial year and varies depending on which fiscal year variant the client uses. A client may consider April the start month of the (financial) year; this can be achieved by defining the fiscal year. One client can have different fiscal year definitions with different fiscal year variants, so the fiscal year variant determines the start and end of the fiscal year. For example, in India we follow a fiscal year from April to March. You can have a look at the fiscal year variant and the periods in transaction OB29.
    14. These types of errors occur due to space problems.

  • Concatenate two fields during data loading - SQL*Loader

    How do I concatenate two fields (date and time) into a timestamp field during loading?
    The data is in text format and contains variable-length fields.
    Example data:
    abc#10.02.2003#16:23:12.12345#dsd#
    Expected output in the timestamp field:
    10.02.2003 16:23:12.12345

    Let me try:
    LOAD DATA
    INFILE 'myfile.dat'
    INTO TABLE mytable
    FIELDS TERMINATED BY '#' OPTIONALLY ENCLOSED BY '"'
    (col1 CHAR(3),
    col2 CHAR(10) "to_timestamp(:col2 || ' ' || :field1, 'DD.MM.YYYY HH24:MI:SS.FF')",
    field1 BOUNDFILLER CHAR,
    col3 CHAR(3))     
    The BOUNDFILLER datatype tells SQL*Loader that this part of the record is not loaded into a column of its own, but that it can still be referenced (as :field1) in the SQL expression that builds col2.
    P.S. I am not sure if I got the timestamp format string right.
    P.P.S. This will only work with 9i as 8i and below do not have a timestamp datatype.
    P.P.P.S. Your other option would be to define an external table (again 9i) and then do an insert statement into the final table, combining the two fields in the SELECT; a rough sketch follows.
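    The sketch below assumes placeholder directory, table and column names (they are not from the original post) and Oracle 9i or later:
    -- External table over the '#'-delimited file; placeholder names throughout.
    CREATE DIRECTORY load_dir AS '/path/to/datafiles';
    CREATE TABLE mytable_ext (
      col1      VARCHAR2(3),
      date_part VARCHAR2(10),
      time_part VARCHAR2(20),
      col3      VARCHAR2(3)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY load_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '#' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('myfile.dat')
    );
    -- Combine the date and time parts while inserting into the target table.
    INSERT INTO mytable (col1, col2, col3)
    SELECT col1,
           TO_TIMESTAMP(date_part || ' ' || time_part, 'DD.MM.YYYY HH24:MI:SS.FF'),
           col3
    FROM   mytable_ext;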

  • Member In Outline, Not Found During Data Load

    I am exporting data from Database A (a zero-level export, in columns, to a text file) using MaxL. I then import this data (using MaxL) into Database B on another server, which has exactly the same outline. We set this up about 3 weeks ago; the export/import scripts run every day and have been working fine. However, today I got an error while loading data into Database B: 3 members could not be loaded:
    Member 73997612 Not Found In Database
    Member STCR Not Found In Database
    Member 22-340101 Not Found In Database
    Initially I thought that these 3 members had been added to Database A, were therefore in the export file, and probably didn't exist in Database B. However, when I went to Database B and did a "find members" on the outline, all 3 members are there. Just a note that they had been added manually, but they were added.
    Then I thought maybe they had been added in the wrong dimension. I checked the location of the above three members in Database A, and they were added in the same location in the hierarchy in Database B.
    So if the members exist in Database B and are not in the wrong location, why can't they be loaded into Database B?
    I also wanted to add that there seem to be several data values which exist for these members. I am guessing they came from the export.
    Version is 11.1.2.1. OS = Oracle Enterprise Linux.
    Has anybody encountered an issue like this before? I don't think it has to do with the member names; they seem to fit the syntax requirements, plus they are exported from the production database.
    Thanks in advance
    Edited by: EssbaseApprentice on Feb 9, 2012 11:12 AM
    Edited by: EssbaseApprentice on Feb 9, 2012 11:54 AM

    Check that the member names are exactly the same (without any extra spaces, spelling differences, etc.).
    Check the storage type.

  • How to prevent OLAP from doing unnecessary aggregations during data load?

    Hi,
    I'm trying to create a relatively simple two-dimensional OLAP cube (you might want to call it an "OLAP square"). My current environment is 11.2 EE with AWM for workspace management.
    One dimension is date (all years -> year -> month -> day); the other one is production unit, implemented as a hierarchy with a certain machine at the bottom level. A fact is defined by a pair of bottom-level values of these dimensions; for instance, a measure is taken once a day from each machine. I would like to store these detailed facts in the cube together with the aggregates, so they can easily be drilled down to without querying the original fact table.
    The aggregation rules are set to "Aggregate from level = default" (which is day and machine respectively) for both of my dimensions, the cube is mapped to the fact table with dimension tables, the data is loaded, and the whole thing is working as expected.
    The problem is with the load itself; I noticed it was too slow for my amount of sample data. After some investigation of the issue I found in the cube_build_log table the query with which the data is actually being loaded:
    <SQL>
      <![CDATA[
    SELECT /*+  bypass_recursive_check  cursor_sharing_exact  no_expand  no_rewrite */
      T4_ID_DAY ALIAS_37,
      T1_ID_POT ALIAS_38,
      MAX(T7_TEMPERATURE)  ALIAS_39,
      MAX(T7_TEMPERATURE)  ALIAS_40,
      MAX(T7_METAL_HEIGHT)  ALIAS_41
    FROM
      (SELECT /*+  no_rewrite */
        T1."DATE_TRUNC" T7_DATE_TRUNC,
        T1."METAL_HEIGHT" T7_METAL_HEIGHT,
        T1."TEMPERATURE" T7_TEMPERATURE,
        T1."POT_GLOBAL_ID" T7_POT_GLOBAL_ID
      FROM
        POTS."POT_BATH" T1   )
      T7,
      (SELECT /*+  no_rewrite */
        T1."ID_DIM" T4_ID_DIM,
        T1."ID_DAY" T4_ID_DAY
      FROM
        RI."DIM_DATES" T1   )
      T4,
      (SELECT /*+  no_rewrite */
        T1."ID_DIM" T1_ID_DIM,
        T1."ID_POT" T1_ID_POT
      FROM
        RI."DIM_POTS" T1   )
      T1
    WHERE
      ((T4_ID_DIM = T7_DATE_TRUNC)
        AND (T1_ID_DIM = T7_POT_GLOBAL_ID)
        AND ((T7_DATE_TRUNC)  IN  ( < a long long list of dates for the currently processed cube partition is clipped > ) ) )
    GROUP BY
      (T1_ID_POT, T4_ID_DAY)
    ORDER BY
      T1_ID_POT ASC NULLS LAST ,
      T4_ID_DAY ASC NULLS LAST ]]>
    </SQL>
    Notice T4_ID_DAY, T1_ID_POT in the top-level column list - these are the bottom-level identifiers of my dimensions, which means the query isn't actually doing any aggregation here, as there is only one fact per pair of (ID_DAY, ID_POT).
    What I want to do is to somehow load the data without this (totally useless in my case) intermediate aggregation. Basically, I want it to be something like
    SELECT /*+  bypass_recursive_check  cursor_sharing_exact  no_expand  no_rewrite */
      T4_ID_DAY ALIAS_37,
      T1_ID_POT ALIAS_38,
      T7_TEMPERATURE  ALIAS_39,
      T7_TEMPERATURE  ALIAS_40,
      T7_METAL_HEIGHT  ALIAS_41
    FROM etc...
    without any aggregations. In fact, I can live even with this loading query, as the amounts of data are not that large, but I want things to work the right way (more or less).
    Any chance to do it?
    Thanks.

    I defined a primary key over all the dimension keys in the fact table, but for some reason the build still uses an aggregation. Probably because the aggregation operator for the cube I'm currently playing with is actually set, and I don't see a way to undefine it from the UI toolbar you are referring to. This is a piece of the mapping section from the workspace file I exported using AWM:
    <CubeMap
          Name="MAP1"
          IsSolved="False"
          Query="POT_BATH_T"
          AggregationMethod="SUM">
    It looks like the build aggregates because it is clearly told to do so by the AggregationMethod attribute. Is there any way to override it?

  • PERFORM_CONFLICT_TAB_TYPE short dump during data loading into ODS

    Hi,
    I'm trying to load data into an ODS in the Production and QA systems, but I'm getting a short dump that says PERFORM_CONFLICT_TAB_TYPE.
    In the Development system, however, the data loads into the ODS fine.
    So please tell me what I have to do.
    I will assign the points.
    rizwan

    here it is..
    Note 707986 - Writing in trans. InfoCubes: PERFORM_CONFLICT_TAB_TYPE
    Summary
    Symptom
    When data is written to a transactional InfoCube, the termination PERFORM_CONFLICT_TAB_TYPE occurs. The short dump lists the following reasons for the termination:
               ("X") The row types of the two tables are incompatible.
               ("X") The table keys of the two tables do not correspond.
    Other terms
    transactional InfoCube, SEM, BPS, BPS0, APO
    Reason and Prerequisites
    The error is caused by an intensified type check in the ABAP runtime environment.
    Solution
    Workaround for BW 3.0B (SP16-19), BW 3.1 (SP10-13)
               Apply the attached correction instructions.
    BW 3.0B
               Import Support Package 20 for 3.0B (BW3.0B Patch20 or SAPKW30B20) into your BW system. The Support Package is available once note 0647752 with the short text "SAPBWNews BW3.0B Support Package 20", which describes this Support Package in more detail, has been released for customers.
    BW 3.10 Content
               Import Support Package 14 for 3.10 (BW3.10 Patch14 or SAPKW31014) into your BW system. The Support Package is available once note 0601051 with the short text "SAPBWNews BW 3.1 Content Support Package 14" has been released for customers.
    BW3.50
               Import Support Package 03 for 3.5 (BW3.50 Patch03 or SAPKW35003) into your BW system. The Support Package is available once note 0693363 with the short text "SAPBWNews BW 3.5 Support Package 03", which describes this Support Package in more detail, has been released for customers.
    The notes specified may already be available to provide advance information before the Support Package is released. However, in that case, the short text still contains the term "Preliminary version".
    Header Data
    Release Status: Released for Customer
    Released on: 18.02.2004 08:11:39
    Priority: Correction with medium priority
    Category: Program error
    Primary Component: BW-BEX-OT-DBIF Interface to Database
    Secondary Components: FIN-SEM-BPS Business Planning and Simulation
    Releases
    Software Component   Release   From Release   To Release   And subsequent
    SAP_BW               30        30B            30B
    SAP_BW               310       310            310
    SAP_BW               35        350            350
    Support Packages
    Support Package       Release   Package Name
    SAP_BW_VIRTUAL_COMP   30B       SAPK-30B20INVCBWTECH
    Related Notes
    693363 - SAPBWNews BW SP03 NW'04 Stack 03 RIN
    647752 - SAPBWNews BW 3.0B Support Package 20
    601051 - SAPBWNews BW 3.1 Content Support Package 14
    Correction Instructions
    Correction Instruction   Valid from   Valid to   Software Component   Ref. Correction   Last Modification
    301776                   30B          350        SAP_BW               J19K013852        18.02.2004 08:03:33
    Attributes
    Attribute            Value
    Further components   0000031199
    Languages: German, English
    Vishvesh

  • Facing TIME OUT error during data load extraction

    The program "SAPLSENA" has exceeded the maximum permitted runtime without
    interruption and has therefore been terminated.
    Can someone help me with the solution please?

    Hi Sai,
    It looks like you are running the program in the foreground and the execution takes more time than the value specified under the profile parameter rdisp/max_wprun_time.
    Check whether you can execute the same in the background.
    If not, then increase the value of the parameter rdisp/max_wprun_time, restart SAP, and test again.
    Hope this helps.
    Regards,
    Deepak Kori
