Field Length Restriction on DATA_COLUMN_MEMBERS while loading from ODI into Essbase.

Hi All,
I have a question on the data loading process into Essbase from ODI using the 'RKM Hyperion Essbase'. While doing the custom reverse engineering for the level 0 columns, we get a popup with the following message:
Unable to save Test( ORA-12899: value too large for column "ODI_HYP_W"."SNP_UE_USED"."SHORT_VALUE"(actual: 368, maximum: 250)).
The error says the total length of all the DATA_COLUMN_MEMBERS (level 0 columns) exceeds 250 characters, which is usually the case in a real-world scenario. Is there a setting to bypass this, or a config tag to ignore it? How have others gotten around this issue?
Would appreciate your responses.
thanks,
Sandeep.
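
One workaround that has been suggested for this ORA-12899 (a sketch only, untested here, and a direct change to internal repository tables that Oracle may not support, so back up the work repository first) is to widen the repository column named in the error:

-- Sketch: widen the column that stores the concatenated member list.
-- ODI_HYP_W is the work repository schema quoted in the error above.
ALTER TABLE ODI_HYP_W.SNP_UE_USED MODIFY (SHORT_VALUE VARCHAR2(4000));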

Hi John,
I have set the ExportColHeader to Period, as I need data for all the months. Even then, I get the same error.
Also,
when I tried to load data from a file into a table, the following error was encountered:
java.lang.NumberFormatException: For input string: "#Mi"
     at java.lang.NumberFormatException.forInputString(Unknown Source)
     at java.lang.Integer.parseInt(Unknown Source)
     at java.math.BigInteger.<init>(Unknown Source)
     at java.math.BigInteger.<init>(Unknown Source)
     at java.math.BigDecimal.<init>(Unknown Source)
     at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
     at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
     at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
     at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
     at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
     at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
     at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
     at com.sunopsis.dwg.cmd.e.i(e.java)
     at com.sunopsis.dwg.cmd.h.y(h.java)
     at com.sunopsis.dwg.cmd.e.run(e.java)
     at java.lang.Thread.run(Unknown Source)
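
The "#Mi" string looks like a truncated Essbase #Missing marker arriving in a numeric column. A minimal sketch of one way to keep such rows out of the numeric conversion, assuming the extract is staged with the measure still held as text (all table and column names here are hypothetical, not from the post):

-- Hypothetical staging filter: skip Essbase missing-value markers
-- before converting the measure column to a number.
INSERT INTO target_data (entity, period, amount)
SELECT entity, period, TO_NUMBER(data_value)
FROM   stg_essbase_extract
WHERE  data_value NOT LIKE '#%';  -- drops #Mi / #Missing rows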

Similar Messages

  • Error while loading from PSA to cube: RSM2-704 & RSAR-119

    Dear Gurus,
    I have to extract about 1.13 million records from the setup tables to the MM cube 0IC_C03. While doing so, the following two errors occurred while loading from the PSA to the cube:
    Error ID RSM2, error no. 704: Data records in the PSA were marked for data package 39. Here there were 2 errors. The system wrote information or warnings for the remaining data records.
    Error ID RSAR, no. 119: The update delivered the error code 4.
    (Data records for package 39 selected in PSA - 2 error(s). Record 5129: No SID found for value 'EMN ' of characteristic 0BASE_UOM. Record 5132: No SID found for value 'EMN ' of characteristic 0BASE_UOM.)
    I tried to change the defective records in the PSA by deleting the erroneous field value EMN and loading again, but it failed.
    Now, my questions are:
    (i) How can I resolve the issue?
    (ii) How do we rectify the erroneous records? Should we only delete the erroneous field value, or delete the entire record from the PSA?
    (iii) How do we delete a record from the PSA?
    Thanks & regards,
    Sheeja.

    Hi,
    Data records for package 39 selected in PSA - 2 error(s). Record 5129: No SID found for value 'EMN ' of characteristic 0BASE_UOM. Record 5132: No SID found for value 'EMN ' of characteristic 0BASE_UOM.
    The issue is with records 5129 and 5132.
    In the PSA, check the erroneous records; the PSA will display only the error records. Just edit them as required and try reloading into the cube.
    Deleting a single record is not possible.
    Let us know if you still have any issues.
    Reg
    Pra

  • Data records are missing while loading from R/3 (ECC) to BI

    Dear Experts,
    I have created a custom DataSource on a custom function module. This DataSource contains 600 fields. (I know it's a monster, and the splitting options are thin.)
    1) Validated the data using RSA3 in R/3; it showed the correct record count.
    2) Validated the data by debugging the FM; it still showed the correct record count.
    But while loading from R/3 to BI, records are missing.
    Various scenarios loading from R/3 to BI:
    1a) Loaded full load (78000 records) with all default data transfer settings.  PSA showed up with 72000 records (missing 6000) only.  Compared the Idocs vs data packets, both reconciled.
    1b) Loaded full load (78000) with modified settings (15000 KB / data packet).  PSA showed up with 74000 records (missing 4000) only.
    2a) Loaded with selection parameters (took a small chunk) (7000 records) with default data transfer settings.  PSA showed up only 5000 records (missing 2000).
    2b) Loaded with selection parameters (7000 records) with modified settings (15000 KB / data packet).  PSA showed up all 7000 records.
    3a) Loaded with selection parameters (took further small chunk) (4000 records).  PSA showed up all records regardless data transfer settings.
    Also, please look at this piece of code from the function module:
    IF l_wa_interface-isource = 'ZBI_ARD_TRANS'.
      l_package_size = l_wa_interface-maxsize DIV 60.
    ENDIF.
    I really appreciate your advice or help in this regard.
    Thanks much,
    Anil

    Hi,
    Which module do you want?
    If it is SD (for example), the steps are:
    1> In AWB, go to "Business Content"
    2> Go to "InfoProvider"
    3> Under the InfoArea, select the SD cubes
    4> Drag the related cubes and ODS objects to the right panel
    5> Set the grouping option to "In Data Flow Before and Afterwards"
    6> Install the collected objects
    Go to R/3:
    7> Use T-code RSA5 to transfer all DataSources for the SD module
    Go to BW:
    8> Right-click on the source system and choose "Replicate DataSources"
    [DataSources|http://help.sap.com/saphelp_nw70/helpdata/en/3c/7b88408bc0bb4de10000000a1550b0/frameset.htm]

  • Data flow task fails while loading from database to Excel

    Hello,
    I am getting an error while loading from an OLE DB source to Excel; the errors are shown below.
    Error: 0xC0202009 at DFT - Company EX, OLE DB Destination [198]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    Error: 0xC0209029 at DFT - Company EX, OLE DB Destination [198]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "input "OLE DB Destination Input" (211)" failed because error code 0xC020907B occurred, and the error row
    disposition on "input "OLE DB Destination Input" (211)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the
    failure.
    Error: 0xC0047022 at DFT - Company EX: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (198) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput
    method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
    Error: 0xC02020C4 at DFT - Company EX, OLE DB Source 1 [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
    Error: 0xC0047021 at DFT - Company EX: SSIS Error Code DTS_E_THREADFAILED.  Thread "WorkThread0" has exited with error code 0xC0209029.  There may be error messages posted before this with more information on why the thread has exited.
    Error: 0xC0047038 at DFT - Company EX: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "OLE DB Source 1" (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine
    called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    Error: 0xC0047021 at DFT - Company EX: SSIS Error Code DTS_E_THREADFAILED.  Thread "SourceThread0" has exited with error code 0xC0047038.  There may be error messages posted before this with more information on why the thread has exited.
    Any help would be appreciated ASAP.
    Thanks,
    Vinay s

    You can use this code to import from SQL Server to Excel . . .
    Sub ADOExcelSQLServer()
        ' Carl SQL Server Connection
        ' FOR THIS CODE TO WORK
        ' In VBE you need to go Tools > References and check Microsoft ActiveX Data Objects 2.x Library
        Dim Cn As ADODB.Connection
        Dim Server_Name As String
        Dim Database_Name As String
        Dim User_ID As String
        Dim Password As String
        Dim SQLStr As String
        Dim rs As ADODB.Recordset
        Set rs = New ADODB.Recordset
        Server_Name = "EXCEL-PC\EXCELDEVELOPER" ' Enter your server name here
        Database_Name = "AdventureWorksLT2012" ' Enter your database name here
        User_ID = "" ' Enter your user ID here
        Password = "" ' Enter your password here
        SQLStr = "SELECT * FROM [SalesLT].[Customer]" ' Enter your SQL here
        Set Cn = New ADODB.Connection
        Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & _
            ";Uid=" & User_ID & ";Pwd=" & Password & ";"
        rs.Open SQLStr, Cn, adOpenStatic
        ' Dump to spreadsheet
        With Worksheets("sheet1").Range("a1:z500") ' Enter your sheet name and range here
            .ClearContents
            .CopyFromRecordset rs
        End With
        ' Tidy up
        rs.Close
        Set rs = Nothing
        Cn.Close
        Set Cn = Nothing
    End Sub
    Also, check this out . . .
    Sub ADOExcelSQLServer()
        Dim Cn As ADODB.Connection
        Dim Server_Name As String
        Dim Database_Name As String
        Dim User_ID As String
        Dim Password As String
        Dim SQLStr As String
        Dim rs As ADODB.Recordset
        Set rs = New ADODB.Recordset
        Server_Name = "LAPTOP\SQL_EXPRESS" ' Enter your server name here
        Database_Name = "Northwind" ' Enter your database name here
        User_ID = "" ' Enter your user ID here
        Password = "" ' Enter your password here
        SQLStr = "SELECT * FROM Orders" ' Enter your SQL here
        Set Cn = New ADODB.Connection
        Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & _
            ";Uid=" & User_ID & ";Pwd=" & Password & ";"
        rs.Open SQLStr, Cn, adOpenStatic
        With Worksheets("Sheet1").Range("A2:Z500")
            .ClearContents
            .CopyFromRecordset rs
        End With
        rs.Close
        Set rs = Nothing
        Cn.Close
        Set Cn = Nothing
    End Sub
    Finally, if you want to incorporate a Where clause . . .
    Sub ImportFromSQLServer()
        Dim Cn As ADODB.Connection
        Dim Server_Name As String
        Dim Database_Name As String
        Dim User_ID As String
        Dim Password As String
        Dim SQLStr As String
        Dim RS As ADODB.Recordset
        Set RS = New ADODB.Recordset
        Server_Name = "Excel-PC\SQLEXPRESS"
        Database_Name = "Northwind"
        'User_ID = "******"
        'Password = "****"
        SQLStr = "select * from dbo.TBL where EMPID = '2'" 'and PostingDate = '2006-06-08'"
        Set Cn = New ADODB.Connection
        ' Connection string without Uid/Pwd; see the commented line below for SQL authentication
        Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & ";"
        '& ";Uid=" & User_ID & ";Pwd=" & Password & ";"
        RS.Open SQLStr, Cn, adOpenStatic
        With Worksheets("Sheet1").Range("A1")
            .ClearContents
            .CopyFromRecordset RS
        End With
        RS.Close
        Set RS = Nothing
        Cn.Close
        Set Cn = Nothing
    End Sub
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

  • While loading master data into the PSA, the error is 'Error from PSA'

    Hi All,
    While loading master data into the PSA, the error is 'Error from PSA'.
    Could anybody please help in this regard?
    Thanks,
    Sridhar.

    Hi Sridhar,
    Maybe the IDocs were not processed completely.
    For an IDoc problem, either wait until the timeout and process the IDoc from the detail monitor screen, or go to BD87 and process the IDocs with status = YELLOW (be careful while processing IDocs from BD87; choose only the relevant IDocs).
    Cheers
    Raj

  • Caller-70 Error while loading master data into infoobject

    hi ,
    I am getting the following error while loading master data into an InfoObject (0tb-account). I am loading this master data in the production environment for the first time. There are about 300,000 records, and all of them loaded up to the PSA. The InfoPackage setting was PSA and then into the data target.
    Short dump in the Warehouse
    Diagnosis
    The data update was not finished. A short dump has probably been logged in BI. This provides information about the error.
    System Response
    "Caller 70" is missing.
    ST22 dump analysis is as below:
    Termination occurred in the ABAP program "GP476CZYBEF2WX53UZ8TXFG6XOS" - in                  
    "VALUE_TO_SID_CONVERT_DB".                                                                  
    The main program was "RSMO1_RSM2 ".
    Please help as soon as you can; this is a production problem.
    Regards
    Rakesh

    Hi Rakesh,
    Maybe the IDocs were not processed completely.
    For an IDoc problem, either wait until the timeout and process the IDoc from the detail monitor screen, or go to BD87 and process the IDocs with status = YELLOW (be careful while processing IDocs from BD87; choose only the relevant IDocs).
    Cheers
    Raj

  • Issue while loading data to a sample Essbase app using ODI

    While executing the data load to the sample Essbase application through ODI, I am getting the following error:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 1
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
    at org.python.core.PyMethod.__call__(PyMethod.java)
    at org.python.core.PyObject.__call__(PyObject.java)
    at org.python.core.PyInstance.invoke(PyInstance.java)
    at org.python.pycode._pyx4.f$0(<string>:23)
    at org.python.pycode._pyx4.call_function(<string>)
    at org.python.core.PyTableCode.call(PyTableCode.java)
    at org.python.core.PyCode.call(PyCode.java)
    at org.python.core.Py.runCode(Py.java)
    at org.python.core.Py.exec(Py.java)
    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
    at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i(e.java)
    at com.sunopsis.dwg.cmd.g.y(g.java)
    at com.sunopsis.dwg.cmd.e.run(e.java)
    at java.lang.Thread.run(Unknown Source)
    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 1
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.sendRecordArrayToEsbase(Unknown Source)
    ... 32 more
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Error records reached the maximum error threshold : 1
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
    at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
    at com.sunopsis.dwg.cmd.e.i(e.java)
    at com.sunopsis.dwg.cmd.g.y(g.java)
    at com.sunopsis.dwg.cmd.e.run(e.java)
    at java.lang.Thread.run(Unknown Source)

    Hi,
    It means that in the options of the KM you have it set to quit when it hits one error; if you set it to 0 (infinite), it will not stop no matter how many data load errors it hits.
    If you set it to 0 and run the interface then, depending on how you set up the options in the KM, it can write to two log files, which you should check.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Load from ODS into InfoCube gives TIME_OUT runtime error after 10 minutes?

    Hi all,
       We have a full load from an ODS into an InfoCube, and it was working fine until last week with up to 50,000 records. Now we have around 70,000+ records and it has started failing with a TIME_OUT runtime error.
       The following is from the Short Dump (ST22):
       The system profile "rdisp/max_wprun_time" contains the maximum runtime of a
    program. The current setting is 600 seconds. Once this time limit has been exceeded, the system tries to terminate any SQL statements that are currently being executed and tells the ABAP processor to terminate the current program.
      The following are from ROIDOCPRMS table:
       MAXSIZE (in KB) : 20,000
       Frequency       :  10
       Max Processes : 3
      When I check the data packages under the 'Details' tab in the Monitor, there are four data packages and the first three have 24,450 records each. I right-click on each data package and select 'Manual Update' to load from the PSA. When this manual update takes more than 10 minutes, it fails with TIME_OUT again.
      How can I fix this problem, please?
    Thanks,
    Venkat.

    Hello A.H.P,
    The following is the Start Routine:
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  *-*
    TABLES: /BIC/AZCPR_O0400, /BIC/AZCPR_O0100, /BIC/AZCPR_O0200.
    DATA: material(18), plant(4).
    DATA: role_assignment like /BIC/AZCPR_O0100-CPR_ROLE, resource like
    /BIC/AZCPR_O0200-CPR_BPARTN.
    *$*$ end of global - insert your declaration only before this line   *-*
    * The following definition is new in BW 3.x:
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8ZCPR_O03.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record n
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        *-*
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
    * to make monitor entries
       clear DATA_PACKAGE.
       loop at DATA_PACKAGE.
          select single /BIC/ZMATERIAL PLANT
             into (material, plant)
             from /BIC/AZCPR_O0400
             where CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID
             and ( MATL_TYPE = 'ZKIT' OR MATL_TYPE = 'ZSVK' ).
           if sy-subrc = 0.
              DATA_PACKAGE-/BIC/ZMATERIAL = material.
              DATA_PACKAGE-plant = plant.
              modify DATA_PACKAGE.
              commit work.
           endif.
           select single CPR_ROLE into (role_assignment)
                         from /BIC/AZCPR_O0100
                         where CPR_GUID = DATA_PACKAGE-CPR_GUID.
            if sy-subrc = 0.
              select single CPR_BPARTN into (resource)
                         from /BIC/AZCPR_O0200
                         where CPR_ROLE = role_assignment
                         and CPR_EXT_ID = DATA_PACKAGE-CPR_EXT_ID.
                   if sy-subrc = 0.
                      DATA_PACKAGE-CPR_ROLE = role_assignment.
                      DATA_PACKAGE-/BIC/ZRESOURCE = resource.
                      modify DATA_PACKAGE.
                      commit work.
                   endif.
              endif.
           clear DATA_PACKAGE.
           endloop.
    * if abort is not equal zero, the update process will be canceled
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    Thanks,
    Venkat.

  • While loading transaction data into a cube, which tables are generated?

    Hi,
    While loading transaction data into a cube, which tables are normally generated?

    Hi,
    Normally the data is loaded into the 'F' fact table (/BIC/F****, where **** is the cube name).
    When you compress the request, the data is moved into the 'E' fact table (/BIC/E****); see the naming sketch below.
    Regards,
    Siva.
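    As a concrete naming sketch (ZSALES is a made-up cube name; Oracle SQL, with quoted identifiers because of the '/' in BW table names):
    -- /BIC/FZSALES : F fact table, holds uncompressed requests
    -- /BIC/EZSALES : E fact table, filled when requests are compressed
    SELECT COUNT(*) FROM "/BIC/FZSALES";
    SELECT COUNT(*) FROM "/BIC/EZSALES";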

  • GL Data load using ODI to Essbase

    Hi,
    I am trying to load GL actuals data into an Essbase application using ODI. The source file has 10 columns and the target has 11 columns. We are using a rules file to load the data into Essbase: the rules file splits the 9th column into two columns and loads the data. When we test the rules file in Essbase, the data loads into the application. When we use the same rules file in the ODI interface, the data does not load, and we get an "Unknown member" error for the member we are splitting out of the 9th column.
    Source file:
    HSP_RATES ACCOUNT PERIOD YEAR SCENARIO VERSION CURRENCY ENTITY SFUND PROGRAM DATA
    HSP_InputValue 611101 Jul FY13 ACTUAL Final Local 0000 SBNR AC0001PS0001 25000
    AC0001PS0001 is the concatenated string from GL; we split it into two columns using the rules file to load into the Essbase application. Please suggest what might be the reason for the error, and how to do the mapping between source and target: I am mapping one column (AC0001PS0001) to two dimensions (Program, Activity) in Essbase.
    Thanks
    Sri

    In ODI, what you have to do is split it in ODI itself. While you are mapping, you can use SQL functions to map it to two different columns, similar to the way you are doing it in the rules file (see the sketch below).
    Regards
    Amarnath
    ORACLE | Essbase
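    For example, assuming the concatenated value is always a six-character Activity code followed by a six-character Program code (an assumption based on the sample value AC0001PS0001), the two target columns could be mapped with expressions like this sketch (Oracle syntax; the table and column names are hypothetical):
    -- Sketch: split the one source column into the two target dimensions.
    SELECT SUBSTR(src.concat_col, 1, 6) AS activity, -- e.g. 'AC0001'
           SUBSTR(src.concat_col, 7, 6) AS program   -- e.g. 'PS0001'
    FROM   gl_actuals_stg src;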

  • How to load metadata directly into Essbase 9.3?

    Hi all,
    When I load metadata directly into Essbase with 'IKM SQL to Hyperion Essbase (METADATA)', my metadata source is Microsoft SQL Server 2000. After executing the interface that loads the metadata into Essbase and checking the Essbase outline, it seems the outline was not built correctly, and there are still many rejected records.
    My question is :
    Can I use the same rules file that is used for loading metadata from a text file?
    (In this case I am using the same rules file that loads metadata from a text file.)
    Let me know if need more information on this.
    Thank you in advance.

    Hi,
    You can use the same rules file; make sure that in the IKM options you have the same rules separator as in your rules file.
    You can also turn on error logging and logging in the IKM options, so that you get more information about why the metadata load is failing.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Problem with the field length restrictions in the WSDL file

    Hi all,
    We have created an XSD file where we have defined fields and given some restrictions (like minLength, maxLength) for each field. See below an example for one element, "Id":
    <xs:simpleType name="Id">
        <xs:restriction base="xs:string">
            <xs:maxLength value="40"/>
        </xs:restriction>
    </xs:simpleType>
    Here we have defined the maxLength of this field as 40 characters. Our WSDL refers to (imports) this XSD file, and we generate a Java skeleton using RAD. But at runtime, if we set more than 40 characters, the value is still accepted and no exception is thrown. (In the generated Java skeleton these restrictions are not reflected anywhere.)
    My question is: do such restrictions defined in the XSD file work or not, and is it an industry standard to define restrictions in the XSD file?
    If yes, what more do I need to do to make them work?
    If not, is there any way to do such validation of the fields that are input to the web service? Or do I just have to write my own Java class to validate each field?
    Regards,
    Ravi

    Or is it possible that we give length restrictions in the XSD (and import this XSD in the WSDL), generate the Java skeleton from the WSDL, and have the restrictions defined in the XSD mapped into the Java classes?
    For ex:
    <xs:simpleType name="Id">
        <xs:restriction base="xs:string">
            <xs:maxLength value="40"/>
        </xs:restriction>
    </xs:simpleType>
    So when, in the generated Java skeleton, we set a value for the "Id" element that is longer than 40 characters, it should throw an exception?
    Is this possible by default, or do we need to write custom validation classes to do validations on such fields?
    Has anybody worked in such scenarios?
    Or, simply: how do you do field validations in a web service?
    Thanks In Advance.

  • Dump error ASSIGN_TYPE_CONFLICT while loading from DSO to InfoCube

    Hi all,
    I had a requirement to modify the transformation rules, and I did some routine coding for a particular InfoObject, then deleted the data from the DSO and the cube. I started loading to the DSO; that succeeded, and the DSO now contains the modified data for that InfoObject.
    But I am facing a dump error while loading the data from the DSO to the cube through the DTP.
    The error I see in ST22 is "ASSIGN_TYPE_CONFLICT".
    Please help.
    Thanks in advance,
    sravan

    Hi,
    When I started the load for the first time I got the same error, so I activated the DTP and loaded it again, but I faced the same problem.
    The modified data is already in the DSO, and I have validated that data; it is OK.
    So do I need to delete the DTP, or create another DTP, to load the data from the DSO to the cube?
    I have also checked all the transformation rules; they are all fine, and the DSO structure and the InfoCube structure are OK.
    Please suggest.
    Thanks, LAX

  • Error while loading from DTP

    Hi,
    We are facing an issue while loading data from R/3 to BW using a DTP.
    We are getting the following error (for both full and delta loads):
    Data package 1: Errors during processing
      Extraction datasource
      Filter out new records with same key
      RSDS material
      Update to DataStore object
        Unpack data package
        Exception in substep: write data package
        Processing terminated
      Set technical status to red
      Set overall status to red
      Set overall status to green
      Further processing started
      Set status to 'Processed further'
    The expansion of the error 'Exception in substep: write data package' is:
    Record 0, segment 0001 is not in the cross-record table
    Error while writing error stack
    Record 0, segment 0001 is not in the cross-record table
    Error in substep update to DataStore object
    We are using both start routine and end routine in the transformation.
    Any help on this is highly appreciated.
    End routine:
        IF NOT RESULT_PACKAGE IS INITIAL.
          SELECT salesorg comp_code FROM /bi0/psalesorg
          INTO TABLE it_salesorg
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE salesorg = RESULT_PACKAGE-SALESORG
          AND objvers = 'A'.
          SELECT /bic/ZMATRTUGP FROM /bic/pZMATRTUGP
          INTO TABLE it_mat
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE /bic/ZMATRTUGP = RESULT_PACKAGE-/bic/ZMATRTUGP.
          it_data[] = RESULT_PACKAGE[].
        ENDIF.
    * get company code from salesorg for all material
        LOOP AT it_data INTO st_data.
          w_tabix = sy-tabix.
          READ TABLE it_salesorg INTO st_salesorg
          WITH TABLE KEY salesorg = st_data-salesorg.
          IF sy-subrc = 0.
            st_data-comp_code = st_salesorg-compcode.
            MODIFY it_data FROM st_data INDEX w_tabix.
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        RESULT_PACKAGE[] = it_data[].
    *Finally check for each material if something has changed since the last
    *loading : if yes compare and update, otherwise no upload
        IF NOT it_mat[] IS INITIAL.
          SELECT * FROM /bic/azalomrtu00
          INTO TABLE it_check
          FOR ALL ENTRIES IN it_mat
          WHERE /bic/ZMATRTUGP = it_mat-zmaterial.
        ENDIF.
        DELETE it_check WHERE salesorg IS INITIAL.
        CLEAR st_result_package.
        LOOP AT RESULT_PACKAGE INTO st_RESULT_PACKAGE.
          READ TABLE it_check INTO st_check
          WITH TABLE KEY
          /bic/ZMATRTUGP = st_result_package-/bic/ZMATRTUGP
          CUST_GRP4 = st_result_package-CUST_GRP4
          SALESORG = st_result_package-SALESORG
          VALIDTO = st_result_package-VALIDTO.
          IF sy-subrc = 0.
    *If one of the characteristic is different, let that entry in the
    *result_package otherwise no use updating it
            IF st_check-/BIC/ZCSBRTUGP = st_result_package-/BIC/ZCSBRTUGP
            AND st_check-/BIC/ZCONDTYPE = st_result_package-/BIC/ZCONDTYPE
            AND st_check-comp_code = st_result_package-comp_code
            AND st_check-/BIC/ZCNT_RTU = st_result_package-/bic/ZCNT_RTU
            AND st_check-VALIDFROM = st_result_package-VALIDFROM.
              DELETE RESULT_PACKAGE.
            ELSE.
    *entry is new : let it updated.
            ENDIF.
            DELETE it_check WHERE
                  /bic/ZMATRTUGP = st_result_package-/bic/ZMATRTUGP
          AND CUST_GRP4 = st_result_package-CUST_GRP4
          AND SALESORG = st_result_package-SALESORG
          AND VALIDTO = st_result_package-VALIDTO.
          ENDIF.
        ENDLOOP.
    *if some entries are in the ODS but not in the datapackage, they have to
    *be deleted in the ODS since they don't exist anymore
        LOOP AT it_check INTO st_check.
          CLEAR st_result_package.
          st_result_package-/bic/ZMATRTUGP = st_check-/bic/ZMATRTUGP.
          st_result_package-CUST_GRP4 = st_check-CUST_GRP4.
          st_result_package-SALESORG = st_check-SALESORG.
          st_result_package-VALIDTO = st_check-VALIDTO.
          st_result_package-recordmode = 'D'.
          APPEND st_result_package TO RESULT_PACKAGE.
        ENDLOOP.
    Thanks in advance.
    Thanks,
    Meghana

    Hi Meghana,
    Have you got a short dump in transaction ST22? If yes, what is it?
    In your routine, maybe you should also add a counter for the index in your loop:
    DATA: w_idx LIKE sy-tabix.
    When you loop at RESULT_PACKAGE, do: w_idx = sy-tabix.
    And then: DELETE RESULT_PACKAGE INDEX w_idx.
    Hope it helps you
    Mickael

  • How to validate that data is in a specific list while loading with SQL*Loader

    I have a sample data file like below
    1,name1,05/02/2012 10:00:00,blue
    2,name2,06/02/2012 10:00:00,red
    3,name3,07/02/2012 10:00:00,yellow
    4,name4,08/02/2012 10:00:00,white
    I would like to validate that the 4th column is a valid color, i.e. all colors should be in a specific list; if a value is not in the list, the record should go to the bad/discard file.
    How can I do that while loading the data with SQL*Loader?

    user8860934 wrote:
    How can I do that while loading the data with SQL*Loader?
    Probably a lot easier with an EXTERNAL TABLE (they're much more flexible).
    Is SQL*Loader a mandatory requirement for some reason?
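    A sketch of the external table approach, matched to the sample file above (the directory object, file name, and table names are made up). Note that rows failing the color check are simply not selected; the bad file catches rows that fail parsing, not this business rule:
    -- Hypothetical external table over the flat file, plus a filtered insert.
    CREATE TABLE ext_colors (
      id      NUMBER,
      name    VARCHAR2(30),
      load_ts DATE,
      color   VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        BADFILE 'colors.bad'
        FIELDS TERMINATED BY ','
        ( id,
          name,
          load_ts CHAR(19) DATE_FORMAT DATE MASK "DD/MM/YYYY HH24:MI:SS",
          color )
      )
      LOCATION ('colors.dat')
    );
    INSERT INTO colors_target (id, name, load_ts, color)
    SELECT id, name, load_ts, color
    FROM   ext_colors
    WHERE  color IN ('blue', 'red', 'yellow', 'white');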
