DataLoad using @XREF

I have two cubes, 'Planning' and 'AllData', and I want to use @XREF to transfer data from the 'Planning' cube to the 'AllData' cube. This is the code I'm using:

FIX(Current_Year)
   Budget = @XREF(Planning_L_Alias);
ENDFIX;

But it has been running for more than 2 hours. I killed it the first time, thinking that it might have 'hung' or something. I'm only copying over a very small set of data. Any clues... Thanks in advance - Maneesh Hari

Hi there
We have seen very long transfer times using @XREF as well. We now prefer to export the data and then import it into the target cube. This is a bit more complex to set up, but certainly faster.
Regards
Reto
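
For illustration, here is a minimal MaxL sketch of that export-and-import approach. The application, database, and file names are placeholders invented for this example, and if the source and target cubes do not share the same dimensionality you would also need a load rule on the import:

/* Placeholder names: substitute your own application.database names, file paths and login. */
export database 'Planning'.'Planning' level0 data to data_file 'plan_export.txt';

/* Add "using server rules_file 'yourRule'" if the target cube's dimensionality differs from the source. */
import database 'AllData'.'AllData' data from local text data_file 'plan_export.txt'
   on error write to 'plan_import.err';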

Similar Messages

  • Cross Reference within external Database using XREF API

    Hi Experts,
    Can we do a cross reference against an external database using the XREF API, where JDBC is used to access Oracle database stored procedures from SAP PI? How do we use a JNDI data source to access the DB, and will the connection pooling be handled by the SAP J2EE server? Kindly let me know the step-by-step procedure.
    Regards
    Archana

    Hello Archana,
    It can be done with a Lookup call in a mapping.
    Here's a little article about the topic in the SAP wiki:
    http://wiki.sdn.sap.com/wiki/display/XI/HowtouseCrossReferencewithinexternal+Database
    With kind regards
                     Sebastian

  • How to use XREF table in BPEL?

    Can anyone tell me how to use XREF table in BPEL ??

    It is used to build up an XML fragment for the element on the target side...
    please refer this link....
    http://kr.forums.oracle.com/forums/thread.jspa?threadID=2252997&tstart=2
    Thanks,
    N

  • ** Invalid Member Names when using @XREF **

    To the Oracle Nation,
    I am using the @XREF function to obtain data from one database and copy it to another. An example of my calc script is as follows:
    Fix(Jan:Dec,Product,NY,Other) - where NY = Next Year Budget
    "Sales" = @XREF(GetData,"Sales Wages",Soda,Entity,"NY working member",West);
    When I run the calculation, the status shows that it ran successfully; however, no data is displayed when data is pulled from the target database. To throw a little twist into the mix, when I change "NY" to "NX" in my target database, it works just fine.
    That being said, is NY, or any other name string, a restricted member name due to conflicts within Essbase? As a side note, it also does not work with NNY or NNNY. We are at a loss on this one, and want to put this issue to rest.
    Thank you.
    Concerned Essbase Admin

    To jump on Stuart's comment, I have had exactly this problem with XREFs.
    In the bad old days (Planning 3.5.1, Essbase 6.5.x), I would actually force the creation of blocks in the target by assigning zeros and then clearing the values out to ensure that the calculation went through. Very painful and slow.
    If block creation is in fact your issue, you should be able to use the SET CREATENONMISSINGBLK setting to create your blocks: http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/html_esb_techref/calc/set_createnonmissingblk.htm
    You might want to be careful, however, with how you use that setting: target very carefully what/where you are creating blocks, as you might otherwise get way, way more blocks than you want.
    I have not followed my own advice, wondered why my db was so big/slow, and then used SET MSG DETAIL to see all of the blocks roll in. Whoops. At least my numbers calculated. :)
    Regards,
    Cameron Lackpour
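    As an illustration only, here is a minimal calc-script sketch of that block-creation approach, wrapped around the FIX from the original post (the member names and the "GetData" location alias are taken from that post and are not verified against any outline):
    SET CREATENONMISSINGBLK ON;
    FIX (Jan:Dec, Product, NY, Other)
       /* The assignment creates the target block when the @XREF result is not #Missing */
       "Sales" = @XREF(GetData, "Sales Wages", Soda, Entity, "NY working member", West);
    ENDFIX
    SET CREATENONMISSINGBLK OFF;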

  • Using XREF

    I am stuck with an issue with XREF and can't seem to find where the problem is. The calc script executes with no errors, but the value is not being copied over.
    We have 2 databases (with one cube in each of them).
    The source cube has an additional dimension.
    Here's my calc:
    FIX("FY2010", CAD, Budget, "version1", "Exchange Rate", "No_properties", "No_Period")
    AMOUNT = @XREF("CSBUDGET", AMount, "BU_None");
    ENDFIX
    1. I have already verified that the source cube has a value for the FIX I am using in the calc (the source database has a value of 1.05).
    2. The location alias "CSBUDGET" is already defined in the destination database.
    3. BU_None is the member from the additional dimension in the source cube, and that's the reason it's used in the XREF.
    Any ideas on why the value is not being copied over?

    Have you checked for implied shares on the source side, or the destination side for that matter? Make sure that any one-child parents across all dimensions are tagged as Never Share, so that the calc is able to reference from and store into an actual storage location. Implied shares cause issues across many functions in Essbase.

  • Dataload using load rule in ODI errored

    Hi John,
    I have tried mapping my file in ODI and created a load rule which is not pointing to any file; I just mapped the columns as per my target columns in the field properties.
    But it errored out:
    [AC_4250020,Nov,Future 1,Future 2,BU Version_1,Current,FY10,CC_000,EE_011-00000,0.00] in Data Load, [1] Records Completed'
    'AC_4250040','Nov','Future 1','Future 2','BU Version_1','Current','FY10','CC_000','EE_011-00000','0.00','Cannot end dataload. Analytic Server Error(1003014): Unknown Member [AC_4250040,Nov,Future 1,Future 2,BU Version_1,Current,FY10,CC_000,EE_011-00000,0.00] in Data Load, [1] Records Completed'
    'AC_4250050','Nov','Future 1','Future 2','BU Version_1','Current','FY10','CC_000','EE_011-00000','0.00','Cannot end dataload. Analytic Server Error(1003014): Unknown Member [AC_4250050,Nov,Future 1,Future 2,BU Version_1,Current,FY10,CC_000,EE_011-00000,0.00] in Data Load, [1] Records Completed'
    'AC_1416115','Nov','Future 1','Future 2','BU Version_1','Current','FY10','CC_000','EE_011-00000','27408245.00','Cannot end dataload. Analytic Server Error(1003014): Unknown Member [AC_1416115,Nov,Future 1,Future 2,BU Version_1,Current,FY10,CC_000,EE_011-00000,27408245.00] in Data Load, [1] Records Completed'
    Also, performance-wise, the interface is taking more time to execute. Please suggest.

    The difference in performance shows up when records are rejected; if no records are rejected then the difference will not be noticeable.
    If a load rule is not used and there are rejections, then you would get a warning related to sending the data record by record.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Same transformation using xref:lookupXRef() that works in BPEL fails in BPM

    Hi,
    Developing with JDev 11.1.1.5, I'm trying to get lookupXRef() to work in a transformation in a BPM process, but it fails with no error message, just producing empty output XML. In an attempt to debug, I copied a lookupXRef() call from another transformation used in a BPEL process, hardcoding everything, with no luck.
    Could it be that the supporting jar files are not available for transformations done in a BPM process?
    Appreciate input, thank you.

    Have you configured your default channels from OEM? Try configuring all your default channels at the RMAN level, so that you don't have to specify them in a run { } script.
    For example, in my environment I have the following settings, and the backup command works fine both from the shell and from Grid.
    It's Oracle 9.2.0.6.
    RMAN configuration parameters are:
    CONFIGURE RETENTION POLICY TO REDUNDANCY 3;
    CONFIGURE BACKUP OPTIMIZATION ON;
    CONFIGURE DEFAULT DEVICE TYPE TO DISK;
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '/rman_backup/%F';
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE 'SBT_TAPE' TO '%F_TAPE_CONTROLFILE';
    CONFIGURE DEVICE TYPE 'SBT_TAPE' PARALLELISM 4;
    CONFIGURE DEVICE TYPE DISK PARALLELISM 5;
    CONFIGURE DATAFILE BACKUP COPIES FOR DEVICE TYPE SBT_TAPE TO 1; # default
    CONFIGURE DATAFILE BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE ARCHIVELOG BACKUP COPIES FOR DEVICE TYPE SBT_TAPE TO 1; # default
    CONFIGURE ARCHIVELOG BACKUP COPIES FOR DEVICE TYPE DISK TO 1; # default
    CONFIGURE CHANNEL DEVICE TYPE DISK FORMAT '/rman_backup/d_%d_s_%s_p_%p_t_%t' MAXPIECESIZE 1536 M;
    CONFIGURE CHANNEL DEVICE TYPE 'SBT_TAPE' MAXPIECESIZE 1536 M PARMS "ENV=(NB_ORA_CLIENT=topdb1,NB_ORA_SERV=ilsdb2)" FORMAT 'd_%d_s_%s_p_%p_t_%t';
    CONFIGURE MAXSETSIZE TO UNLIMITED; # default
    CONFIGURE SNAPSHOT CONTROLFILE NAME TO '/opt1/app/oracle/product/ora92/dbs/snapcf_topaz.f'; # default
    BACKUP DEVICE TYPE SBT CURRENT CONTROLFILE;

  • Dataload using sql query

    Hi,
    I need to load data into an Essbase cube from a SQL query by using MaxL. If anyone knows how to do this, please let me know.
    thanks in advance

    Here is a sample MaxL script I use to clear, load, and calculate a BSO application and database. Both the application and database are named "STAT_2".
    /*** Login to essbase server ***/
    /*** login $1 $2 on $3; ***/
    login 'username' 'password' on 'server';
    /*** Set output file to spool all error messages only ***/
    spool stdout on to 'C:\Hyperion\Automation\Logs\STAT_2_ALL_Load_Essbase.LOG';
    spool stderr on to 'C:\Hyperion\Automation\Errors\STAT_2_ALL_Load_Essbase.ERR';
    /*** Clear data in STAT_2 ***/
    /** execute calculation 'STAT_2'.'STAT_2'.'ClrA2Yr'; **/
    /*** Import data from the FACT_STAT_2YEARS table using a load rule ***/
    import database 'STAT_2'.'STAT_2' data connect as 'SQLUserName' identified by 'SQLPassword' using server rules_file 'LdA_All.rul' on error write to 'C:\Hyperion\Automation\Logs\STAT_2_ALL_Load_Essbase.log';
    /*** Calculate data in STAT_2 ***/
    /** execute calculation 'STAT_2'.'STAT_2'.'ClcA2Yr'; ***/
    spool off;
    logout;
    exit;

  • Dataload using Concat, Replace functions errored

    Hi John,
    I was trying to load data into Essbase using a text file as the source, Sunopsis as staging, and Essbase as the target.
    My text file has double quotes (") in one of the columns. For this I have used the Replace(Data,'"',null) expression.
    One more column has to be concatenated with AC_, but I encountered the following error:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 20, in ?
    java.sql.SQLException: Unexpected token: 1 in statement [select   CONCAT('AC_',C3_NAT_ACC)    "Account",Dec    "Period",Future 1]
    Please let me know how I can use these functions in this scenario.
    Thanks,
    Sravan Ganti

    Hi John,
    I also need to concat 2 more columns in the same file. I have written the load rule and executed it through EAS, and the data got loaded.
    Now I want to run the same thing from ODI. So do I need to do the mapping? If yes, how do I do the mapping?
    My source file has: Data, Account, Loc, CostCenter, Company.
    For the Account column I want AC_ to be prefixed, and Loc and CostCenter concatenated into one column.
    In ODI I was not able to do it, so I did it using the load rule. Do I now just have to run the load rule from ODI without any mapping?
    Please let me know how I can achieve this.
    Also,
    I have tried the same thing on my local server with Oracle as the staging area and tried the Concat function using the staging area. But it says:
    Cannot Build the query for unknown reasons: Cannot Display data: Datastores are not on the same data server or one of the tables is not accessible on the data server
    So do I need to first load the data into the Oracle table and then create another interface which has the table as its source, and use the Concat functions there?

  • Manipulating data using Xref

    Hi all,
    I need to copy data between two cubes, but with a little twist: the data for specific values must be copied between two different dimensions.
    Example:
    In the first cube I have the product dimension with two sets of dynamic calcs that aggregate data; one of these groups is sales managers.
    The second cube has the manager dimension, with the same value names as in the other cube (except for a suffix to maintain uniqueness).
    I need to copy the aggregated data from the first cube to the members in the second cube.
    Please help,
    Thanks,
    Guy

    I'm assuming that you want to preserve the existing product -> product relationship, and need the parent/ancestor value to determine which manager gets the details (in other words, mapping one dimension into two). Otherwise, Sandeep's response would still work (you could load the product summary by manager to each manager's member without product detail).
    You could try setting up an attribute dimension, and use that as the basis for your translation. This might be a better way of expressing the products anyway; what purpose do the managers have in the product hierarchy apart from being attributes related to some form of ownership?

  • Infoobject dataload using BI table as datasource

    I have created an InfoObject A, whose A (text) has a datasource created from a BI table.
    The system generated the required proposal while creating the transformation; I then created the Data Transfer Process and loaded the data into the target, which went off smoothly, but zero records were transferred, even though the datasource has 1000 records.
    Can anyone explain why my A (text) table is not getting loaded with data? Please reply ASAP.
    Nikhil

    Hi Nikhil,
    did you execute the DTP?
    Hope this helps.

  • Using @SUMRANGE with @XREF

    Hi,
    I'm having a problem using @XREF with @SUMRANGE and @CURRMBRRANGE.
    I have Plan2 with 9 dimensions and Plan3 with only 7, and I need to create a Dynamic Calc member in Plan3 that references an account in Plan2 called MT_PSDIR.
    I did the following in the member formula:
    @XREF("_PnlCube_",MT_PSDIR,TOT_LB,TOT_CHANNEL);
    TOT_LB and TOT_CHANNEL are the members from the missing dimensions that I want to use.
    What I want now is to use the YTD value so I tried:
    @SUMRANGE(@XREF("_PnlCube_",MT_PSDIR,TOT_LB,TOT_CHANNEL), @CURRMBRRANGE(Period,LEV,0,,0));
    The formula is valid, but the form returns an error when trying to open.
    I also tried to @XREF the @SUMRANGE, but it seems I'm missing the correct syntax, because forms with this member do not open when it uses the @SUMRANGE and @CURRMBRRANGE. I'm probably not seeing the correct picture :) That formula is also valid but fails at calc time.
    Any ideas?
    EDIT: At the moment, what I did as a workaround was to create 2 members: one with the XREF (member A) and one with the @SUMRANGE of A (sketched below).
    Thank you
    Edited by: Icebergue on 5/Mar/2012 10:18
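    For illustration only, that two-member workaround might look roughly like the following member-formula sketches. The member name "MT_PSDIR_XREF" is invented here, and the formulas are untested guesses based on the post:
    /* Member A, e.g. "MT_PSDIR_XREF" (Dynamic Calc): pull the monthly value from Plan2 */
    @XREF("_PnlCube_", MT_PSDIR, TOT_LB, TOT_CHANNEL);
    /* Member B (Dynamic Calc): sum member A over the year-to-date Period range */
    @SUMRANGE("MT_PSDIR_XREF", @CURRMBRRANGE(Period, LEV, 0, , 0));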

    Ditche, I am sorry, I did not follow:
    "overrides due to another axis."
    Actually there were 3 different accounts like Acc_X, so I was initially using a dynamic calc member with the formula ("Acc_x1" + "Acc_X1" + "Acc_X2").
    Acc_Y = @SUMRANGE("Acc_X", @UDA(Entity, "X"));
    Since it was not working I switched to three different sum range functions for these three accounts.
    Acc_Y = @SUMRANGE("Acc_x1", @UDA(Entity, "X")) + @SUMRANGE("Acc_x2", @UDA(Entity, "X")) + @SUMRANGE("Acc_x3", @UDA(Entity, "X"));
    And it worked fine.
    But yeah, you were right: the roll-up members do work for the @SUMRANGE functions,
    but a dynamic calc member (containing roll-up members) does not work.
    Ankur

  • Use of XREF in AIA

    Hello,
    Can anyone please send me some documentation, or a link, where I can get exhaustive information about XREF? I am struggling to get the bigger picture of XREF and how it is used in AIA.
    I want to learn about XREF through some practical examples and see how it is used in AIA (for message mapping).
    Thanks,
    Sanjay Bharatiya

    I recognise the poster's frustration. There is a lot of basic documentation on using XRefs, but it doesn't give much detailed information on how they work in terms of AIA.
    There is the basesample, but the problem is that even this doesn't use the standard AIA method.
    If you use the AIA Service Constructor to create a requestor ABCS, you are provided with a template XSL containing some templates for using XRefs. There is no documentation on how to use these templates; I had to get them working through trial and error. They seem to provide an extra layer of caching around the XRef functions.
    Creating a provider ABCS doesn't result in any transformations at all, so you don't have the XRef functions to use. There is no explanation anywhere stating why one includes these templates and the other doesn't. Personally, I have copied the XRef templates from the requestor into the provider and used them in my own transformations.
    Reverse engineering the templates created by the AIA Service Constructor further reveals the use of service properties to provide the system names for the XRef (and DVM) lookups.
    I think this is a case of the documentation being behind the current version.

  • 11.1.2.1 execute dataload error in erpi and FDM

    Hi
    I have hit a critical problem when executing a data load in ERPi using the classic method.
    There is data in E-Business Suite, but I don't know why I can't pull data from EBS.
    Also, when I execute the data load using the FDM method, the import step in FDM yields an error: "error creating database link".
    Could you help me find the problem?
    regards,
    Samantha
    Please refer to the error message below:
    java.lang.Exception: ERROR: Drill Region was not created/updated. No valid data rows exist in TDATASEG_T for Process ID: 14
    ERPI Process Start, Process ID: 14
    ERPI Logging Level: DEBUG (5)
    ERPI Log File: C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\1\/aif_14.log
    Jython Version: 2.1
    Java Platform: java1.4.2_08
    COMM Process Periods - Insert Periods - START
    COMM Process Periods - Insert Periods - END
    COMM Process Periods - Insert Process Details - START
    COMM Process Periods - Insert Process Details - END
    LKM EBS/FS Extract Type - Set Extract Type - START
    LKM EBS/FS Extract Type - Set Extract Type - END
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - START
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - END
    LKM EBS/FS Extract Delta Balances - Load Audit - START
    LKM EBS/FS Extract Delta Balances - Load Audit - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    LKM EBS/FS Extract Type - Set Extract Type - START
    LKM EBS/FS Extract Type - Set Extract Type - END
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - START
    LKM EBS/FS Extract Delta Balances - Delete Obsolete Rows - END
    LKM EBS/FS Extract Delta Balances - Load Audit - START
    LKM EBS/FS Extract Delta Balances - Load Audit - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    EBS/FS Load Data - Load TDATASEG_T - START
    Import Data from Source for Period '2011年八月'
    Monetary Data Rows Imported from Source: 182
    Total Data Rows Imported from Source: 182
    EBS/FS Load Data - Load TDATASEG_T - END
    COMM Update Data - Update TDATASEG_T/TDATASEGW - START
    Warning: Data rows with zero balances exist
    Zero Balance Data Rows Deleted: 8
    Processing Mappings for Column 'ACCOUNT'
    Data Rows Updated by 'DEFAULT' (LIKE): 174
    Processing Mappings for Column 'ENTITY'
    Data Rows Updated by 'DEFAULT' (LIKE): 174
    Processing Mappings for Column 'ICP'
    Data Rows Updated by 'DEFAULT' (LIKE): 174
    Processing Mappings for Column 'UD1'
    Data Rows Updated by 'Custom1' (LIKE): 174
    Processing Mappings for Column 'UD2'
    Data Rows Updated by 'Custom2' (LIKE): 174
    Processing Mappings for Column 'UD3'
    Data Rows Updated by 'Custom3' (LIKE): 174
    Processing Mappings for Column 'UD4'
    Data Rows Updated by 'Custom4' (LIKE): 174
    Processing Mappings for Column 'UD5'
    Data Rows Updated by 'Value' (LIKE): 174
    Warning: Data rows with unmapped dimensions exist
    Data Rows Marked as Invalid: 174
    Warning: No data rows available for Export to Target
    Total Data Rows available for Export to Target: 0
    COMM Update Data - Update TDATASEG_T/TDATASEGW - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    EBS/FS Load Data - Load TDATASEG_T - START
    Import Data from Source for Period '2011年九月'
    Monetary Data Rows Imported from Source: 193
    Total Data Rows Imported from Source: 193
    EBS/FS Load Data - Load TDATASEG_T - END
    COMM Update Data - Update TDATASEG_T/TDATASEGW - START
    Warning: Data rows with zero balances exist
    Zero Balance Data Rows Deleted: 69
    Processing Mappings for Column 'ACCOUNT'
    Data Rows Updated by 'DEFAULT' (LIKE): 124
    Processing Mappings for Column 'ENTITY'
    Data Rows Updated by 'DEFAULT' (LIKE): 124
    Processing Mappings for Column 'ICP'
    Data Rows Updated by 'DEFAULT' (LIKE): 124
    Processing Mappings for Column 'UD1'
    Data Rows Updated by 'Custom1' (LIKE): 124
    Processing Mappings for Column 'UD2'
    Data Rows Updated by 'Custom2' (LIKE): 124
    Processing Mappings for Column 'UD3'
    Data Rows Updated by 'Custom3' (LIKE): 124
    Processing Mappings for Column 'UD4'
    Data Rows Updated by 'Custom4' (LIKE): 124
    Processing Mappings for Column 'UD5'
    Data Rows Updated by 'Value' (LIKE): 124
    Warning: Data rows with unmapped dimensions exist
    Data Rows Marked as Invalid: 124
    Warning: No data rows available for Export to Target
    Total Data Rows available for Export to Target: 0
    COMM Update Data - Update TDATASEG_T/TDATASEGW - END
    COMM End Process Detail - Update Process Detail - START
    COMM End Process Detail - Update Process Detail - END
    COMM Update YTD Amounts - Update YTD Amounts - START
    COMM Update YTD Amounts - Update YTD Amounts - END
    ODI Hyperion Financial Management Adapter Version 9.3.1
    Load task initialized.
    LKM COMM Load Data into HFM - Load Data to HFM - START
    Connecting to Financial Management application [EEHFM2] on [taly_cluster] using user-name [admin].
    Connected to Financial Management application.
    HFM Version: 11.1.2.1.0.
    Options for the Financial Management load task are:
    Load Options validated.
    Source data retrieved.
    Pre-load tasks completed.
    HFM Log file path: "C:\DOCUME~1\ADMINI~1\LOCALS~1\Temp\1\aif_14HFM59238.log".
    Load to Financial Management application completed.
    Load task statistics - success-count:0, error-count:0
    Cleanup completed.
    Disconnected from Financial Management application.
    LKM COMM Load Data into HFM - Load Data to HFM - END
    ERPI Process End, Process ID: 14

    The data load method for your target application should be set to "FDM" if you are going to be using FDM for your Data Load Rule imports. The error in creating the DB link is due to the FDM schema not being granted the DB link role in Oracle. The FDM DBA Guide explains that the FDM schema must be granted the create DB link role when using the ERPi adapter.

  • Error invoking populateXRefRow: oracle.tip.xref.exception.RepositoryException

    Hi All,
    I created an XREF table using the XREF command-line utility. The table was successfully created and the columns were also successfully added. But I am getting the following error while using the populateXRefRow function in an assign activity in BPEL. The XREF_DATA table is present in the oraesb schema.
    SOA suite version 10.1.3.4
    JDev version 10.1.3.4
    OS - Windows XP.
    Basic Installation with Oracle lite database.
    Caused by: oracle.xml.xpath.XPathException: Extension function error: Error invoking 'populateXRefRow':'oracle.tip.xref.exception.RepositoryException: Unable to access Cross Reference Values from Database.The SQL Exception is: "JDBC 2.0 feature is not yet implemented"
    Please ensure that the database is accessible. If accessible, please look at the stack trace and fix the issue. If unable to fix contact Oracle Support '
         at oracle.xml.xslt.XSLStylesheet.flushErrors(XSLStylesheet.java:1846)
         at oracle.xml.xslt.XSLStylesheet.execute(XSLStylesheet.java:612)
         at oracle.xml.xslt.XSLStylesheet.execute(XSLStylesheet.java:548)
         at oracle.xml.xslt.XSLProcessor.processXSL(XSLProcessor.java:333)
         at oracle.xml.jaxp.JXTransformer.transform(JXTransformer.java:460)
         ... 91 more
    oracle.xml.xpath.XPathException: Extension function error: Error invoking 'populateXRefRow':'oracle.tip.xref.exception.RepositoryException: Unable to access Cross Reference Values from Database.The SQL Exception is: "JDBC 2.0 feature is not yet implemented"
    Please ensure that the database is accessible. If accessible, please look at the stack trace and fix the issue. If unable to fix contact Oracle Support '
         at oracle.xml.xpath.XSLExtFunctions.callStaticMethod(XSLExtFunctions.java:118)
         at oracle.xml.xpath.XPathExtFunction.evaluateMethod(XPathExtFunction.java:337)
         at oracle.xml.xpath.XPathExtFunction.evaluate(XPathExtFunction.java:266)
         at oracle.xml.xslt.XSLValueOf.processAction(XSLValueOf.java:120)
         at oracle.xml.xslt.XSLNode.processChildren(XSLNode.java:480)
         at oracle.xml.xslt.XSLTemplate.processAction(XSLTemplate.java:205)
         at oracle.xml.xslt.XSLStylesheet.execute(XSLStylesheet.java:581)
         at oracle.xml.xslt.XSLStylesheet.execute(XSLStylesheet.java:548)
         at oracle.xml.xslt.XSLProcessor.processXSL(XSLProcessor.java:333)
         at oracle.xml.jaxp.JXTransformer.transform(JXTransformer.java:460)
         at com.collaxa.cube.xml.xpath.functions.xml.GetElementFromXSLTFunction.transform(GetElementFromXSLTFunction.java:335)
         at com.collaxa.cube.xml.xpath.functions.xml.GetElementFromXDKXSLTFunction.transform(GetElementFromXDKXSLTFunction.java:38)
         at com.collaxa.cube.xml.xpath.functions.xml.GetElementFromXSLTFunction.evaluate(GetElementFromXSLTFunction.java:144)
         at com.collaxa.cube.xml.xpath.functions.xml.GetElementFromXSLTFunction.call(GetElementFromXSLTFunction.java:89)
         at com.collaxa.cube.xml.xpath.BPELXPathFunctionWrapper.evaluate(BPELXPathFunctionWrapper.java:50)
         at oracle.xml.xpath.JXPathContext$JXFunction.invoke(JXPathContext.java:147)
         at oracle.xml.xpath.JXPathContext$JXFunction.invoke(JXPathContext.java:116)
         at oracle.xml.xpath.XPathExtFunction.evaluate(XPathExtFunction.java:254)
         at oracle.xml.xpath.JXPathExpression.evaluate(JXPathExpression.java:181)
         at com.collaxa.cube.xml.xpath.BPELXPathUtil.evaluate(BPELXPathUtil.java:189)
         at com.collaxa.cube.engine.ext.wmp.BPELAssignWMP.evalFromValue(BPELAssignWMP.java:679)
         at com.collaxa.cube.engine.ext.wmp.BPELAssignWMP.__executeStatements(BPELAssignWMP.java:143)
         at com.collaxa.cube.engine.ext.wmp.BPELActivityWMP.perform(BPELActivityWMP.java:199)
         at com.collaxa.cube.engine.CubeEngine.performActivity(CubeEngine.java:3698)
         at com.collaxa.cube.engine.CubeEngine.handleWorkItem(CubeEngine.java:1655)
         at com.collaxa.cube.engine.dispatch.message.instance.PerformMessageHandler.handleLocal(PerformMessageHandler.java:75)
         at com.collaxa.cube.engine.dispatch.DispatchHelper.handleLocalMessage(DispatchHelper.java:217)
         at com.collaxa.cube.engine.dispatch.DispatchHelper.sendMemory(DispatchHelper.java:314)
         at com.collaxa.cube.engine.CubeEngine.endRequest(CubeEngine.java:5765)
         at com.collaxa.cube.engine.CubeEngine.createAndInvoke(CubeEngine.java:1087)
         at com.collaxa.cube.engine.ejb.impl.CubeEngineBean.createAndInvoke(CubeEngineBean.java:133)
         at com.collaxa.cube.engine.ejb.impl.CubeEngineBean.syncCreateAndInvoke(CubeEngineBean.java:162)
         at sun.reflect.GeneratedMethodAccessor91.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:585)
         at com.evermind.server.ejb.interceptor.joinpoint.EJBJoinPointImpl.invoke(EJBJoinPointImpl.java:35)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:52)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.JAASInterceptor$1.run(JAASInterceptor.java:31)
         at com.evermind.server.ThreadState.runAs(ThreadState.java:693)
         at com.evermind.server.ejb.interceptor.system.JAASInterceptor.invoke(JAASInterceptor.java:34)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.TxRequiresNewInterceptor.invoke(TxRequiresNewInterceptor.java:52)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:52)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.InvocationContextPool.invoke(InvocationContextPool.java:55)
         at com.evermind.server.ejb.StatelessSessionEJBObject.OC4J_invokeMethod(StatelessSessionEJBObject.java:87)
         at CubeEngineBean_LocalProxy_4bin6i8.syncCreateAndInvoke(Unknown Source)
         at com.collaxa.cube.engine.delivery.DeliveryHandler.initialRequestAnyType(DeliveryHandler.java:547)
         at com.collaxa.cube.engine.delivery.DeliveryHandler.initialRequest(DeliveryHandler.java:464)
         at com.collaxa.cube.engine.delivery.DeliveryHandler.request(DeliveryHandler.java:133)
         at com.collaxa.cube.ejb.impl.DeliveryBean.request(DeliveryBean.java:95)
         at sun.reflect.GeneratedMethodAccessor139.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:585)
         at com.evermind.server.ejb.interceptor.joinpoint.EJBJoinPointImpl.invoke(EJBJoinPointImpl.java:35)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:52)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.JAASInterceptor$1.run(JAASInterceptor.java:31)
         at com.evermind.server.ThreadState.runAs(ThreadState.java:693)
         at com.evermind.server.ejb.interceptor.system.JAASInterceptor.invoke(JAASInterceptor.java:34)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.TxRequiredInterceptor.invoke(TxRequiredInterceptor.java:50)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:52)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
         at com.evermind.server.ejb.InvocationContextPool.invoke(InvocationContextPool.java:55)
         at com.evermind.server.ejb.StatelessSessionEJBObject.OC4J_invokeMethod(StatelessSessionEJBObject.java:87)
         at DeliveryBean_RemoteProxy_4bin6i8.request(Unknown Source)
         at com.collaxa.cube.ws.soap.oc4j.SOAPRequestProvider.processNormalOperation(SOAPRequestProvider.java:451)
         at com.collaxa.cube.ws.soap.oc4j.SOAPRequestProvider.processBPELMessage(SOAPRequestProvider.java:274)
         at com.collaxa.cube.ws.soap.oc4j.SOAPRequestProvider.processMessage(SOAPRequestProvider.java:120)
         at oracle.j2ee.ws.server.provider.ProviderProcessor.doEndpointProcessing(ProviderProcessor.java:956)
         at oracle.j2ee.ws.server.WebServiceProcessor.invokeEndpointImplementation(WebServiceProcessor.java:349)
         at oracle.j2ee.ws.server.provider.ProviderProcessor.doRequestProcessing(ProviderProcessor.java:466)
         at oracle.j2ee.ws.server.WebServiceProcessor.processRequest(WebServiceProcessor.java:114)
         at oracle.j2ee.ws.server.WebServiceProcessor.doService(WebServiceProcessor.java:96)
         at oracle.j2ee.ws.server.WebServiceServlet.doPost(WebServiceServlet.java:194)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:763)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         at com.evermind.server.http.ResourceFilterChain.doFilter(ResourceFilterChain.java:64)
         at oracle.security.jazn.oc4j.JAZNFilter$1.run(JAZNFilter.java:400)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAsPrivileged(Subject.java:517)
         at oracle.security.jazn.oc4j.JAZNFilter.doFilter(JAZNFilter.java:414)
         at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:623)
         at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:370)
         at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:871)
         at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:453)
         at com.evermind.server.http.HttpRequestHandler.serveOneRequest(HttpRequestHandler.java:221)
         at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:122)
         at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:111)
         at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
         at oracle.oc4j.network.ServerSocketAcceptHandler.procClientSocket(ServerSocketAcceptHandler.java:234)
         at oracle.oc4j.network.ServerSocketAcceptHandler.access$700(ServerSocketAcceptHandler.java:29)
         at oracle.oc4j.network.ServerSocketAcceptHandler$AcceptHandlerHorse.run(ServerSocketAcceptHandler.java:879)
         at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
         at java.lang.Thread.run(Thread.java:595)
    Please help me to resolve the above error.
    - Sam

    Hi Sam,
    This error occurs because the JDBC 2.0 feature set is not completely implemented in the Oracle Lite database.
    So, you might want to go for an SOA Advanced Install with a 10.2.0.3 Oracle database to resolve your issue.
    Hope this helps!
    Cheers
    Anirudh Pucha
