Load Process in 11g

Hi,
How do you model a load process in BPM 11g using the new BPMN palette? The load process queries an external Oracle database table and creates tasks for the end users in the Workspace. Each task has a user interface that displays the data passed in.
Thanks.

OracleStudent,
I am not going to recommend fiddling with your redo log size; that would be my last option if I had to.
Number of records = 8,413,427
Text file size = 3.59 GB
Columns = 91
You said remote server; is that the case? I have no idea what you mean by remote server. Could you explain it and tell me how I can check this? I recently joined this company and asked the developer, who showed me the code where he is using direct=true. Please help; this loading process is very frustrating for me. Please tell me what I need to check.
A couple of questions:
1. How are you loading this data? You mentioned using some .NET application; my question is, does this .NET application reside on the same server as your database, or does it run from a different machine? Also, if you are invoking sqlldr (as you mentioned), please post your sqlldr control file. The load should also generate a log file; check it and look for the following lines to verify and confirm that you are using the direct path:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Continuation:    none specified
Path used:      Direct
2. Do you have any indexes on this table? If yes, how many and what type? I mean regular B-tree, bitmap, or both?
3. Is this table in LOGGING or NOLOGGING state?
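For reference, a direct-path control file and sqlldr call typically look something like the sketch below (the control file name, table, columns and data file are placeholders, not your actual ones):
# Illustrative sketch only: acct.ctl, acct_stage and its columns are placeholder names
cat > acct.ctl <<'EOF'
OPTIONS (DIRECT=TRUE, ERRORS=50)
LOAD DATA
INFILE 'acct_data.txt'
APPEND
INTO TABLE acct_stage
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(acct_id, acct_name, amount)
EOF
# run the load, then confirm the path in the log
sqlldr userid=user/pwd control=acct.ctl log=acct.log
grep -i "Path used" acct.log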
Regards

Similar Messages

  • Repository corrupted/loading process is taking long time

    The repository load process is getting stuck at the message below for a long time. We have disabled sort indices on all lookup tables, but it is still taking a huge amount of time to load the repository. Any information on this would be of great help.
    94      2011/08/26 20:16:55.523     Report Info     Background_Thread@Accelerator     Preload.cpp               Processing sort indices for 'Products'... (98%)
    81      2011/08/27 02:08:23.298     Report Info     Background_Thread@Accelerator     Preload.cpp               Processing sort indices for 'Products'... (100%)
    Regards,
    Nitin

    Hi Nitin,
    There are many performance improvement steps that one can take regarding this, including verifying your repository.
    But I think it is only a concern if the problem recurs.
    The accelerators get created when there is a change in a table, and Update Indices creates them; possibly there have been multiple changes and a load with Update Indices has not taken place for some time, which is why they are being created now.
    For better performance, one can do the following:
    Make a judicious choice of which fields to track in change tracking.
    Disk space has a huge impact on the smooth functioning of MDM. If the disk space is not enough, the MDM Server cannot even load the repositories.
    Have a closer look at the data model and field properties.
    Verify that your MDS.INI has this parameter: Session Timeout Minutes (Number. Causes MDM Console, CLIX, and applications based on the new Java API to expire after the specified number of minutes elapses. Default is 14400 (24 hours). When set to 0, sessions never time out.) When you have many open connections, this can generate performance issues on the MDM server.
    Refer to SAP Note Number: 1012745
    Hope this helps.
    Thanks,
    Ravi

  • Process Chain Load Processing Time Issue

    Hi All,
    One of my process chains runs daily, but after 2 hours of loading it shows red in the Monitor screen, even though after 5 hours the load completes successfully.
    Why is the Monitor screen showing red?
    Is it possible to extend the load processing time at the InfoPackage level?
    For example, if it is set to go red after 60 seconds, I want to change it to go red after 120 seconds.
    If yes, where can we do it? Please let me know the steps.
    Regards,
    Nithi.

    Hi,
    Double-click the InfoPackage -> go to "Scheduler" in the top left corner of the menu -> click "Timeout time", and you have the different options to change it.
    Hope this helps.

  • Btree vs Bitmap.  Optimizing load process in Data Warehouse.

    Hi,
    I'm working on fine tuning a Data Warehousing system. I understand that Bitmap indexes are very good for OLAP systems, especially if the cardinality is low and if the WHERE clause has multiple fields on which bitmap indexes exist for each field.
    However, what I'm fine-tuning is not the query but the load process. I want to minimize the total load time. If I create a bitmap index on a field with a cardinality of one million, and if the table has one million rows (each row has a distinct field value), then my understanding is
    The total size of the bitmap index = number of rows * (cardinality / 8) bytes
    (because there are 8 bits in a byte).
    Hence the size of my bitmap index will be
    Million * Million / 8 bytes = 116 GB.
    Also, does anyone know what would be the size of my B-tree index? I'm thinking
    The total size of the B-tree index = number of rows * (field length + 20) bytes
    (assuming that the rowid entry is 20 characters).
    Hence the size of my B-tree index will be
    Million * (10 + 20) bytes = 0.03 GB (assuming that my field length is 10 characters).
    That means the B-tree index is much smaller than the bitmap index.
    Is my math correct? If so, then the disk activity will be much higher for a bitmap index than a B-tree index. Hence, creation of the bitmap index should take much longer than the B-tree index if the cardinality is high.
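    For what it's worth, rather than relying on the formula alone, I suppose I could also verify this empirically by building each index type on a test copy of the table and comparing the actual segment sizes, roughly like the sketch below (the table, column and index names are just placeholders):
    # rough sketch only: load_test, test_col and the index names are placeholders
    sqlplus -s user/pwd <<'EOF'
    -- bitmap index first: create, measure, drop
    CREATE BITMAP INDEX load_test_bix ON load_test (test_col);
    SELECT segment_name, ROUND(bytes/1024/1024) AS size_mb
      FROM user_segments WHERE segment_name = 'LOAD_TEST_BIX';
    DROP INDEX load_test_bix;
    -- then the equivalent B-tree index on the same column
    CREATE INDEX load_test_btx ON load_test (test_col);
    SELECT segment_name, ROUND(bytes/1024/1024) AS size_mb
      FROM user_segments WHERE segment_name = 'LOAD_TEST_BTX';
    DROP INDEX load_test_btx;
    EOF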
    Please let me know your opinions.
    Thanks
    Sankar

    Hi Jaffar,
    Thanks to you and Jonathan. This is the kind of answer I have been looking for.
    If I understand your email correctly, for the scenario from my original email, the bitmap index will be 32 MB whereas the B-tree will be 23 MB. Is that right?
    Suppose there is an order table with 10 orders. There are four possible values for OrderType. Based on your reply, now I understand that the bitmap index is organized as shown below.
    Data Table:
    RowId     OrderNo     OrderType
    1     23456     A
    2     23457     A
    3     23458     B
    4     23459     C
    5     23460     C
    6     23461     C
    7     23462     B
    8     23463     B
    9     23464     D
    10     23465     A
    Index table:
    OrderType     FROM     TO
    A     1     2     
    B     3     3     
    C     4     6     
    B     7     8     
    D     9     9     
    A     10     10     
    That means you might have more entries in the index table than the cardinality. Is that right? So the size of the index table cannot be EXACTLY determined from the cardinality alone. In our example, the cardinality is 4 while there are 6 entries in the index table.
    In an extreme example, if no two adjacent records have the same OrderType, then there will be 10 records in the index table as well, as shown in the example below.
    Data Table (second example):
    RowId     OrderNo     OrderType
    1     23456     A
    2     23457     B
    3     23458     C
    4     23459     D
    5     23460     A
    6     23461     B
    7     23462     C
    8     23463     D
    9     23464     A
    10     23465     B
    Index table (second example):
    OrderType     FROM     TO
    A     1     1     
    B     2     2     
    C     3     3     
    D     4     4     
    A     5     5     
    B     6     6     
    C     7     7
    D     8     8
    A     9     9
    B     10     10
    That means, the size of the index table will be somewhere between the cardinality (minimally) and the table size (maximally).
    Please let me know if I make sense.
    Regards
    Sankar

  • Data Load process for 0FI_AR_4  failed

    Hi!
    I am about to implement the SAP Best Practices scenario "Accounts Receivable Analysis".
    When I schedule the data load process (in dialog, immediately) for transaction data 0FI_AR_4 and check it in the Monitor, the status is yellow:
    On the top I can see the following information:
    12:33:35  (194 from 0 records)
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status
    No Idocs arrived from the source system.
    Question:
    Which actions can I take to run the loading process successfully?

    Hi,
    The job is still in progress, it seems.
    You could monitor the job that was created in R/3 (by copying the technical name from the monitor, prefixing it with "BI", and searching for this in SM37 in R/3).
    Keep an eye on ST22 as well if this job is taking too long, as you may already have gotten a short dump for it that has not yet been reported to the monitor.
    Regards,
    De Villiers

  • Compiling a package without disturbing the load process

    Hi,
    I need to compile a package, with the changes, in the database without stopping the load process that is using this package. Please let me know if anyone has any ideas.
    Thanks

    sdk11 wrote:
    Hi,
    I need to compile a package, with the changes, in the database without stopping the load process that is using this package. Please let me know if any one has any ideas.
    Thanks
    If you mean "I need to create or replace a package" while some session is still running code of that package,
    then sorry: no can do.
    Unless you are on 11.2, in which case you could (with the necessary preparation/configuration done first) create a new version of the package in a different edition from the one the session is using. But the session will have to finish its work using the package as it currently is.

  • How to design data load process chain?

    Hello,
    I am designing data load process chains for the first time and would like to get some general information on best practices in that area.
    My situation is as follows:
    I have 3 source systems (R3 and two for which I use flat files).
    What do you suggest: should I define one big chain for the whole loading process (I have about 20 InfoSources), or define a few shorter ones, e.g.
    1. Master data R3
    2. Master data flat file system 1
    3. Master data flat file system 2
    4. Transaction data R3
    5. Transaction data file sys 1
    ... and execute them one after another, each starting after the previous one ends successfully?
    Could you also suggest me any links or manuals on that topic?
    Thank you
    Andrzej

    Andrzej,
    My advice is to make separate chains for master and transaction data (always load in this order!) and afterwards make a 'master chain' where you insert these two chains one after the other (so: Start process -> Master data chain -> Transaction data chain).
    Regarding the separate chains: parallelize as much as possible (if functionally allowed). Normally, the number of parallel ('vertical') chains equals the number of CPUs available (check with your Basis person).
    Hope this provides you with enough info to start off with!
    Regards,
    Marco

  • How to automate the data load process using data load file & task Scheduler

    Hi,
    I am automating the process of loading data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me with how to automate the data load process using the Data_Load.bat file and Task Scheduler, and with whatever other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue within the batch file (e.g. load_data.bat) where, if you do not use the full path to the MaxL script when running it through Task Scheduler, the task will appear to work but the log and/or error file will not be created. In other words, the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this in the batch file:
    "essmsh C:\data\DataLoad.mxl" You can also use the full path for essmsh; either way works. The only reason I can think of that the MaxL might then not work is if the batch has not been updated to pick up the MaxL PATH changes, or if you need to update your environment variables so that the essmsh command works in a command prompt.

  • Help Please : Cron Job to automate load process

    Hi
    I am trying to automate a data load process. I am loading data into a number of tables from flat files.
    Currently I have a UNIX (SunOS) file with a bunch of SQLLDR commands, and I changed the permissions on this file to make it executable. Every morning I execute this file to load data.
    Now I want to automate this process by writing a cron job and scheduling it. I am running into a number of problems. I exported ORACLE_SID, ORACLE_HOME and PATH, but cron is still unable to find sqlldr.
    What else am I missing? Here are my command file and cron entry.
    Please help!
    ORAENV VARiables
    export ORACLE_HOME=/export/opt/oracle/product/8.1.6
    export ORACLE_SID=fid1
    export PATH=$PATH:$ORACLE_HOME/bin
    .profile
    . $HOME/oraenv
    daily_full.sql file
    export ORACLE_SID ORACLE_HOME PATH
    sqlldr userid=user/pwd control=acct.ctl log=acct.log
    sqlldr .......
    Cron Job
    16 11 * * 1-5 /apps/fire/data/loadscripts/daily_full.sql >> /apps/fire/data/loadscripts/fulllog.log 2>&1
    Output fulllog.log file
    /apps/fire/data/loadscripts/daily_full.sql: sqlldr: not found
    /apps/fire/data/loadscripts/daily_full.sql: sqlldr: not found
    Thanks
    Shanthi

    Hi Ramayanapu,
    First, you have written a shell script, not an SQL script. Please rename your file from daily_full.sql to daily_full.sh.
    I suggest that you run the cron job as a user whose environment has the ORACLE_SID and ORACLE_HOME variables set.
    In that case cron will operate from that user's $HOME.
    Perhaps your export will destroy the .kshrc settings. The statement has no effect in your script; please remove it.
    Change your sqlldr statement as follows:
    $ORACLE_HOME/bin/sqlldr userid=user/pwd control=<path>acct.ctl log=acct.log
    <path> should be replaced with the path of your control file.
    user/pwd must correspond to an Oracle user who has the right to insert into the destination table.
    Your log file will be placed in the $HOME directory.
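    Putting these points together, your script and crontab entry could look roughly like the sketch below (ORACLE_HOME, SID, paths and user/pwd are taken from your own post; adjust them to your real values):
    #!/bin/sh
    # daily_full.sh -- rough sketch combining the points above (values are from the original post)
    ORACLE_HOME=/export/opt/oracle/product/8.1.6
    ORACLE_SID=fid1
    PATH=$PATH:$ORACLE_HOME/bin
    export ORACLE_HOME ORACLE_SID PATH
    cd /apps/fire/data/loadscripts || exit 1
    # call sqlldr by its full path so cron's minimal environment cannot miss it
    $ORACLE_HOME/bin/sqlldr userid=user/pwd control=acct.ctl log=acct.log
    And the crontab entry, now pointing at the renamed script:
    16 11 * * 1-5 /apps/fire/data/loadscripts/daily_full.sh >> /apps/fire/data/loadscripts/fulllog.log 2>&1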
    Hope that I could help solve your problems.
    with kind regards
    Hans-Peter

  • No Infopackage available in Load process (In Process Chain)

    Hello All,
    We have a problem in a process chain.
    When we add the Load Process item and want to see all the available InfoPackages, the system says no data... We have InfoPackages on both 3.x DataSources and BI7 DataSources.
    Thanks in advance for your help.

    Hi Bruno,
    If I am right, you are getting the message "no data selected" when you try to find the InfoPackage. Which Support Package are you on? Try to look for your InfoPackage in the InfoSources tab, next to Data Targets, in process chain maintenance. It looks like a bug; check Note 1062704.
    Thanks
    Prashant

  • Data load process for FI module

    Dear all,
    We are using BI 7.00, and in one of our FI DataSources, 0EC_PCA_1, we had a data load failure. The cause of the failure was analysed and we did the following:
    1) Deleted the data from the cube and the PSA.
    2) Reloaded the data (full load) without disturbing the init.
    This solved our problem. Now that the data reconciliation has been done, we find that there are doubled entries for some of the G/L codes.
    I have a doubt here.
    Since there is no setup table for FI transactions (correct me if I am wrong), the full load picked up data that was also present in the delta queue, and subsequently the delta load loaded the same data again
    (some G/L entries that were available as delta).
    Kindly explain how FI data loads work without setup tables, and whether we should go for a downtime.
    Can the experts provide a solution for addressing this problem? Can anyone provide a step-by-step process that can be adopted to solve this problem permanently?
    Regards,
    M.M

    Hi Magesh,
    The FI DataSources do not involve setup tables when performing full loads, and they do not involve the outbound queue during delta loads.
    A full load happens directly from your DataSource view to BI, and the delta is captured in the delta queue.
    Yes, you are right in saying that when you did a full load, some of the values that were pulled were also present in the delta queue. Hence you have double loads.
    You need to completely reinitialise, as the full load has disturbed the delta process. Whether to take a downtime depends on how frequently transactions are happening.
    You need to:
    1. Completely delete the data in BW, including the initialisation.
    2. Take a downtime if necessary.
    3. Reinitialise the whole DataSource from scratch.
    Regards,
    Pramod

  • Load Process needs to break into two Processes

    Hi friends
    I have a Load Data process in my process chain. This load goes into two targets (the PSA and then the data target, package by package). The two targets are:
    0MATERIAL$T, Material (Texts)
    ZE_MATERL$T, Euro Material (Texts)
    Now I have to break this process into two processes. The first process should load into the PSA, and the second process should load from the PSA to the data targets.
    I have done this for the PSA. Now the problem is how I should do the same for the two data targets, because in a 'Read PSA and Update Data Target' process only one target can be updated at a time. Please suggest.
    Thanks and Regards
    Vismark Tiwari

    Hi,
    Which version are you on?
    If 3.x:
    Choose the third option in the InfoPackage (PSA and update to data target) and save the InfoPackage. Use this InfoPackage in the process chain; this will generate two processes, one that loads up to the PSA and a subsequent one that loads to the target.
    If 7.0:
    Use the InfoPackage to load up to the PSA.
    Create a transformation for the DataSource for each of the two InfoObjects, and create independent DTPs to load both InfoObjects.
    Then use the InfoPackage to load up to the PSA, and run the two DTPs after the InfoPackage load process.
    Ramesh

  • Exception while loading process

    Hi all,
    I just installed BPEL 2.0.10 (PM and Designer), and as I completed my first BPEL process (a simple invocation of a synchronous WS) I got the following problem:
    <2004-09-09 19:05:47,036> <DEBUG> <default.collaxa.cube.engine.deployment> <CubeProcessHolder::bind> Exception while loading process
    ORABPEL-05217
    Error while creating process.
    An error has occurred while attempting to instantiate the class "bpel.FirstBPEL.FirstBPEL__BPEL4WS_BIN" for the process "FirstBPEL" (revision "1.0"). The exception reported was: bpel.FirstBPEL.FirstBPEL__BPEL4WS_BIN
    Please try recompiling your BPEL process again. The current BPEL process archive "FirstBPEL" may have been compiled with an older version of "bpelc".
    Obviously I have no older version of 'bpelc'. The reported stack trace is:
    at com.collaxa.cube.engine.deployment.CubeProcessFactory.create(CubeProcessFactory.java:83)
    at com.collaxa.cube.engine.deployment.CubeProcessLoader.create(CubeProcessLoader.java:351)
    at com.collaxa.cube.engine.deployment.CubeProcessLoader.load(CubeProcessLoader.java:276)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.loadAndBind(CubeProcessHolder.java:698)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.getProcess(CubeProcessHolder.java:512)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.getStatus(CubeProcessHolder.java:168)
    at com.collaxa.cube.engine.CubeEngine.lookupProcessStatus(CubeEngine.java:967)
    at com.collaxa.cube.beans.BPELProcessManagerBean.getErrors(BPELProcessManagerBean.java:52)
    at com.collaxa.cube.beans.ProcessManagerBean_1eyxw1_EOImpl.getErrors(ProcessManagerBean_1eyxw1_EOImpl.java:370)
    at com.oracle.bpel.client.BPELProcessHandle.getErrors(BPELProcessHandle.java:163)
    at jsp_servlet.__ngprocessloaderror._jspService(__ngprocessloaderror.java:188)
    at weblogic.servlet.jsp.JspBase.service(JspBase.java:33)
    at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:1053)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:387)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:305)
    at weblogic.servlet.internal.RequestDispatcherImpl.include(RequestDispatcherImpl.java:594)
    at weblogic.servlet.internal.RequestDispatcherImpl.include(RequestDispatcherImpl.java:409)
    at weblogic.servlet.jsp.PageContextImpl.include(PageContextImpl.java:155)
    at jsp_servlet.__displayprocess._jspService(__displayprocess.java:414)
    at weblogic.servlet.jsp.JspBase.service(JspBase.java:33)
    at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:1053)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:387)
    at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:305)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:6310)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:317)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:118)
    at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3622)
    at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2569)
    at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:197)
    at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:170)
    Can somebody help me?
    BTW I'm using WLS 8.1 sp2
    Regards.
    Antonio.

    I tried both solutions but with no benefit. If it helps, here are the contents of FirstBPEL.bpel and FirstBPEL.wsdl.
    BEGIN FirstBPEL.bpel -----------------------------
    <process name="FirstBPEL" targetNamespace="http://acm.org/samples" suppressJoinFailure="yes" xmlns:tns="http://acm.org/samples" xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/" xmlns:bpelx="http://schemas.oracle.com/bpel/extension" xmlns:ora="http://schemas.oracle.com/xpath/extension" xmlns:ns0="http://www.openuri.org/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:ns1="urn:MSGSenderService">
         <partnerLinks>
              <partnerLink name="client" partnerLinkType="tns:FirstBPEL" myRole="FirstBPELProvider" partnerRole="FirstBPELRequester"/>
              <partnerLink name="msgWs" partnerRole="MSGSenderServiceProvider" partnerLinkType="ns1:MSGSenderServiceLink"/>
         </partnerLinks>
         <variables>
              <variable name="input" messageType="tns:FirstBPELRequestMessage"/>
              <variable name="output" messageType="tns:FirstBPELResponseMessage"/>
              <variable name="msgInput" messageType="ns1:MSGSenderService_sendMsg"/>
              <variable name="msgOutput" messageType="ns1:MSGSenderService_sendMsgResponse"/>
         </variables>
         <sequence>
              <receive createInstance="yes" name="request" partnerLink="client" portType="tns:FirstBPEL" operation="initiate" variable="input"/>
              <assign name="assignInput">
                   <copy>
                        <from variable="input" part="payload" query="/tns:FirstBPELRequest/tns:from">
                        </from>
                        <to variable="msgInput" part="String_1"/>
                   </copy>
                   <copy>
                        <from variable="input" part="payload" query="/tns:FirstBPELRequest/tns:to">
                        </from>
                        <to variable="msgInput" part="String_2"/>
                   </copy>
                   <copy>
                        <from variable="input" part="payload" query="/tns:FirstBPELRequest/tns:body">
                        </from>
                        <to variable="msgInput" part="String_3"/>
                   </copy>
                   <copy>
                        <from variable="input" part="payload" query="/tns:FirstBPELRequest/tns:la">
                        </from>
                        <to variable="msgInput" part="String_4"/>
                   </copy>
              </assign>
              <invoke name="forward" partnerLink="msgWs" portType="ns1:MSGSenderService" operation="sendMsg" inputVariable="msgInput" outputVariable="msgOutput"/>
              <sequence name="main">
                   <assign name="assignOutput">
                        <copy>
                             <from variable="msgOutput" part="result">
                             </from>
                             <to variable="output" part="payload" query="/tns:FirstBPELResponse/tns:msgId"/>
                        </copy>
                   </assign>
                   <invoke name="callback" partnerLink="client" portType="tns:FirstBPELCallback" operation="onResult" inputVariable="output"/>
              </sequence>
         </sequence>
    </process>
    END FirstBPEL.bpel -------------------------------
    BEGIN FirstBPEL.wsdl -----------------------------
    <?xml version="1.0"?>
    <definitions name="FirstBPEL"
                 targetNamespace="http://acm.org/samples"
                 xmlns:tns="http://acm.org/samples"
                 xmlns:plnk="http://schemas.xmlsoap.org/ws/2003/05/partner-link/"
                 xmlns="http://schemas.xmlsoap.org/wsdl/">
      <types>
        <schema attributeFormDefault="qualified"
                elementFormDefault="qualified"
                targetNamespace="http://acm.org/samples"
                xmlns="http://www.w3.org/2001/XMLSchema">
          <element name="FirstBPELRequest">
            <complexType>
              <sequence>
                <element name="from" type="string" />
                <element name="to" type="string" />
                <element name="body" type="string" />
                <element name="la" type="string" />
              </sequence>
            </complexType>
          </element>
          <element name="FirstBPELResponse">
            <complexType>
              <sequence>
                <element name="msgId" type="string"/>
              </sequence>
            </complexType>
          </element>
        </schema>
      </types>
      <message name="FirstBPELRequestMessage">
        <part name="payload" element="tns:FirstBPELRequest"/>
      </message>
      <message name="FirstBPELResponseMessage">
        <part name="payload" element="tns:FirstBPELResponse"/>
      </message>
      <portType name="FirstBPEL">
        <operation name="initiate">
          <input message="tns:FirstBPELRequestMessage"/>
        </operation>
      </portType>
      <portType name="FirstBPELCallback">
        <operation name="onResult">
          <input message="tns:FirstBPELResponseMessage"/>
        </operation>
      </portType>
      <plnk:partnerLinkType name="FirstBPEL">
        <plnk:role name="FirstBPELProvider">
          <plnk:portType name="tns:FirstBPEL"/>
        </plnk:role>
        <plnk:role name="FirstBPELRequester">
          <plnk:portType name="tns:FirstBPELCallback"/>
        </plnk:role>
      </plnk:partnerLinkType>
    </definitions>
    END FirstBPEL.wsdl -------------------------------

  • Failing to load processes after installing patch 10.1.2.3

    I have installed patch 10.1.2.3 after Oracle advised that it might resolve some database adapter problems I was getting in 10.1.2.0.2. However, after installing the patch, many of the BPEL processes are now failing to load when I bounce the OC4J component, and I am getting ORABPEL-05215 errors.
    I turned on debug logging; an extract of the log file is below. Has anybody got any ideas on how I might resolve these errors?
    <2008-07-09 15:32:33,311> <ERROR> <default.collaxa.cube.engine.deployment> <CubeProcessLoader::create>
    <2008-07-09 15:32:33,310> <ERROR> <default.collaxa.cube.engine.deployment> Process "DRSSecuritiesReader" (revision "1.0") load FAILED!!
    <2008-07-09 15:32:33,375> <DEBUG> <default.collaxa.cube.engine.dispatch> <BaseDispatchSet::receive> Receiving message log process event message afa47b51cbfc4dfa:4ee70b:11b083c6e51:-7ffc for set system
    <2008-07-09 15:32:33,376> <DEBUG> <default.collaxa.cube.engine.dispatch> <Dispatcher::adjustThreadPool> Allocating 1 thread(s); pending threads: 1, active threads: 0, total: 0
    <2008-07-09 15:32:33,378> <DEBUG> <default.collaxa.cube.engine.dispatch> <QueueConnectionPool::getConnection> Fetched a queue connection from pool java:comp/env/jms/collaxa/BPELWorkerQueueFactory, available connections=24, total connections=25
    <2008-07-09 15:32:33,403> <DEBUG> <default.collaxa.cube.engine.dispatch> <DispatcherBean::send> Sent message to queue
    <2008-07-09 15:32:33,403> <DEBUG> <default.collaxa.cube.engine.dispatch> <QueueConnectionPool::releaseConnection> Released queue connection to pool java:comp/env/jms/collaxa/BPELWorkerQueueFactory, available connections=25, total connections=25
    <2008-07-09 15:32:33,404> <DEBUG> <default.collaxa.cube.engine.deployment> <CubeProcessHolder::bind> Exception while loading process
    java.lang.AbstractMethodError
    at com.collaxa.cube.engine.core.BaseCubeProcess.loadActivationAgents(BaseCubeProcess.java:946)
    at com.collaxa.cube.engine.core.BaseCubeProcess.load(BaseCubeProcess.java:310)
    at com.collaxa.cube.engine.deployment.CubeProcessFactory.create(CubeProcessFactory.java:66)
    at com.collaxa.cube.engine.deployment.CubeProcessLoader.create(CubeProcessLoader.java:391)
    at com.collaxa.cube.engine.deployment.CubeProcessLoader.load(CubeProcessLoader.java:302)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.loadAndBind(CubeProcessHolder.java:882)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.getProcess(CubeProcessHolder.java:790)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.loadAll(CubeProcessHolder.java:362)
    at com.collaxa.cube.engine.CubeEngine.loadAllProcesses(CubeEngine.java:910)
    at com.collaxa.cube.admin.ServerManager.loadProcesses(ServerManager.java:284)
    at com.collaxa.cube.admin.ServerManager.loadProcesses(ServerManager.java:250)
    at com.collaxa.cube.ejb.impl.ServerBean.loadProcesses(ServerBean.java:219)
    at IServerBean_StatelessSessionBeanWrapper14.loadProcesses(IServerBean_StatelessSessionBeanWrapper14.java:2466)
    at com.collaxa.cube.admin.agents.ProcessLoaderAgent$ProcessJob.execute(ProcessLoaderAgent.java:401)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:141)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:281)
    <2008-07-09 15:32:33,419> <ERROR> <default.collaxa.cube.engine.deployment> <CubeProcessHolder::loadAll> Error while loading process 'DRSSecuritiesReader', rev '1.0': Error while loading process.
    The process domain encountered the following errors while loading the process "DRSSecuritiesReader" (revision "1.0"): null.
    If you have installed a patch to the server, please check that the bpelcClasspath domain property includes the patch classes.
    ORABPEL-05215
    Error while loading process.
    The process domain encountered the following errors while loading the process "DRSSecuritiesReader" (revision "1.0"): null.
    If you have installed a patch to the server, please check that the bpelcClasspath domain property includes the patch classes.
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.bind(CubeProcessHolder.java:1270)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.loadAndBind(CubeProcessHolder.java:883)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.getProcess(CubeProcessHolder.java:790)
    at com.collaxa.cube.engine.deployment.CubeProcessHolder.loadAll(CubeProcessHolder.java:362)
    at com.collaxa.cube.engine.CubeEngine.loadAllProcesses(CubeEngine.java:910)
    at com.collaxa.cube.admin.ServerManager.loadProcesses(ServerManager.java:284)
    at com.collaxa.cube.admin.ServerManager.loadProcesses(ServerManager.java:250)
    at com.collaxa.cube.ejb.impl.ServerBean.loadProcesses(ServerBean.java:219)
    at IServerBean_StatelessSessionBeanWrapper14.loadProcesses(IServerBean_StatelessSessionBeanWrapper14.java:2466)
    at com.collaxa.cube.admin.agents.ProcessLoaderAgent$ProcessJob.execute(ProcessLoaderAgent.java:401)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:141)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:281)

    I forgot to mention that I did flush the JAR cache to make sure the new JARs would be downloaded.
    Gerrit

  • What would make my Cp7 course get hung up during the loading process when launching on LMS?

    In Cp7, I created a SCORM project to post on our LMS and submitted it for testing. Our LMS department confirmed it worked successfully, and in fact I tested it myself – it worked great. However, my work team asked me to make some changes to the content before pushing it out of the testing phase to launch company-wide. Per company policy, making changes to a course means resubmitting it once more to be tested before making it available to everybody. So I made the requested changes and resubmitted it for testing. Now our LMS department reports the course will not launch properly – it gets stuck in the loading process. It shows “Loading...” endlessly but never loads.
    When I submitted the updated version for testing, I kept all the project settings that worked successfully the first time. The changes I made to the project were:
    I added a slide toward the beginning (slide 2) to give the user navigation tips.
    I set slides so that each one must play all the way through before the user can proceed to the next slide (I think I did this simply by removing "Play" from the playbar).
    Under table of contents settings, I checked “navigate visited slides only,” so the user can navigate backwards using the contents bar at left, but can only navigate forward to slides that have already played or to the next slide in the queue.
    I broke up a couple of lengthy multiple choice questions into shorter ones (for an additional two slides).
    Is there any reason one of these changes would make the course get hung up during loading?
    Is there a size limit for Cp7 projects which, if exceeded, might be causing such an issue?
    Or does anyone have ideas about what else might be making it get hung up in the loading process?
    Thank you, any and all, for your feedback.

    Back up the iPhoto library like any other backup - make a copy of the iPhoto library in case of problems.
    Depress the Option (Alt) and Command keys and launch iPhoto - anywhere you can launch iPhoto you can do this - and keep the keys down until you get the rebuild window.
    LN
