Essbase 9.3.1: data load taking a long time

Hi,
I am trying to load 15 GB of data (taken from two separate Oracle databases) directly into my ASO application.
The load is very slow; what factors should I consider to make it faster?
Will increasing the RAM size help in this context?
I have gone through the Essbase 9.3.1 admin guide, which gives the hard-disk and RAM requirements, but only for block storage.
Is there any difference between that estimation and the ASO estimation?
If anyone can guide me on what has to be taken care of while loading a cube with this much data,
I shall be very thankful.

The settings that matter for aggregate storage are:
DLSINGLETHREADPERSTAGE FALSE
DLTHREADSPREPARE Sample Basic 3
The write option (DLTHREADSWRITE Sample Basic 4) has no impact on the speed at which data is written for ASO.
These settings have to go in essbase.cfg, the configuration file of the Essbase server.
However, as per the document, should this be done through MaxL, ESSCMD, or the Analytic Services console?
Questions:
1) Should we simply put these statements in the essbase.cfg file, without a semicolon, and restart the server/application?
2) Can these statements be executed through MaxL? If so, please let us know how.
3) How do we know the current number of threads being used for the read/write stages?
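For reference, a minimal sketch of what the essbase.cfg entries could look like (reusing the Sample Basic names from above; as far as I know these are configuration-file settings rather than MaxL statements, and they take effect only after an Essbase server restart):
; essbase.cfg -- one setting per line, no trailing semicolons
; allow per-stage thread settings for data loads:
DLSINGLETHREADPERSTAGE FALSE
; use 3 threads in the prepare stage for Sample.Basic:
DLTHREADSPREPARE Sample Basic 3
; (DLTHREADSWRITE has no effect for aggregate storage databases)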
Thanks

Similar Messages

  • OWB 10g - The time taken for data load is too high

    I am loading data on the test data warehouse server, and the time taken is very high. The size of the data is around 7 GB (the size of the flat files on the OS).
    The time it takes to load the same amount of data on the production server, from the staging area to the presentation area (data warehouse), is close to 8 hours at most.
    But in the test environment, executing one mapping (containing 300,000 records) by itself takes 8 hours.
    The version of Oracle database on both the test and production servers is the same i.e., Oracle 9i.
    The configuration of the production server is : 4 Pentium III processors (2.7 GHz each), 2 GB RAM, Windows 2000 Advanced Server, 8 kilobyte primary memory cache, 512 kilobyte secondary memory cache, 440.05 Gigabytes Usable Hard Drive Capacity, 73.06 Gigabytes Hard Drive Free Space
    The configuration of the test server is : 4 Pentium III processors (2.4 GHz each), 1 GB RAM, Windows 2000 Advanced Server, 8 kilobyte primary memory cache,
    512 kilobyte secondary memory cache, 144.96 Gigabytes Usable Hard Drive Capacity, 5.22 Gigabytes Hard Drive Free Space.
    Can you guys please help me detect the possible causes of such erratic behaviour of the OWB 10g tool.
    Thanks & Best Regards,
    Harshad Borgaonkar
    PwC

    Hello Harshad,
    2 GB of RAM doesn't seem like very much to me. I guess your bottleneck is I/O; you've got to investigate this (keep an eye on long-running processes). You didn't say very much about your target database design. Do you have a lot of indexes on the target tables, and if so, have you tried dropping them before loading? Do your OWB mappings require a lot of lookups (if so, appropriate indexes on the lookup tables are very useful)? Do you use external tables? Are you talking about loading dimension tables, fact tables, or both? You've got to supply some more information so that we can help you better.
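    For instance, a sketch of the drop-before-load idea (the index name is made up; the statements are standard Oracle DDL):
    -- mark the index unusable before the load ...
    ALTER INDEX sales_fact_idx1 UNUSABLE;
    ALTER SESSION SET skip_unusable_indexes = TRUE;
    -- ... run the load, then rebuild the index afterwards:
    ALTER INDEX sales_fact_idx1 REBUILD NOLOGGING;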
    Regards,
    Jörg

  • How to show the processing time taken for a BPEL process in BAM report.

    Hi All,
    I have the data as below in the Data object. I would like to show the time taken for each order to complete in the report.
    instance Id     order Id     product Name     product Code     price     status     instance Time      updaterName
    1360010     ord004     Guitar     prod003     2000     requested     9/22/2008 12:12:11 PM     Invoke_InsertSalesOrder
    1360010     ord004     Guitar     prod003     2000     Approved     9/22/2008 12:15:11 PM     Invoke_OrderStatusUpdate
    This data comes from a simple BPEL process where sensors are configured at the start and end of the process. There is also a human task activity in between, which creates the time difference.
    In Enterprise Link Design Studio, I tried to calculate the time difference using the expression calculator and store it as a calculated field. But that doesn't seem to work: when I execute the plan, the second sensor's data arrives only after human approval, while the first sensor's data would be waiting for the calculation, and ultimately nothing reaches the data object.
    How and where should the calculation be done to show the processing time in the report? Please, someone, throw some light on this.
    Regards
    Jude.
    Edited by: user600726 on Sep 30, 2008 1:30 AM

    I would suggest modifying your data object so that the data can all be in a single row and use the sensor at the end of the process to upsert (update) the row created by the sensor at the start of the process. The time difference between two fields in the same row is then an easy calculation on a BAM report -- No EL plan should be needed.
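    For example, sketching the same data as above reshaped into one row per order (the column names are made up):
    instance Id   order Id   requestedTime            approvedTime             updaterName
    1360010       ord004     9/22/2008 12:12:11 PM    9/22/2008 12:15:11 PM    Invoke_OrderStatusUpdate
    The report calculation is then simply Duration = approvedTime - requestedTime.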

  • Report to display Average time taken for processing payments".

    Hi,
    I have been asked to develop a "report to display the average time taken for processing payments".
    Could anyone guide me, technically, on the different tables I need to use to generate the report? Treat this as very urgent. Please provide sample code too.
    Thanks in advance.

    Given below is the set up for credit card payment processing:
    Set Up Credit Control Areas:
    Define Credit Control Area
    Transaction: OB45 
    Tables: T014
    Action: Define a credit control area and its associated currency.  The Update Group should be ‘00012’.  This entry is required so the sales order will calculate the value to authorize
    Assign Company Code to Credit Control Area
    Transaction: OB38
    Tables: T001
    Action: Assign a default credit control area for each company code
    Define Permitted Credit Control Area for a Company
    Code
    Transaction: 
    Tables: T001CM
    Action: For each company code enter every credit control area that can be used
    Identify Credit Price
    Transaction: V/08
    Tables: T683S
    Action: Towards the end of the pricing procedure, after all pricing and tax determination, create a subtotal line to store the value of the price plus any sales tax.  Make the following entries:
    Sub to:  “A”
    Reqt:  “2”
    AltCTy:  “4”
    Automatic Credit Checking
    Transaction: OVA8
    Tables: T691F
    Action: Select each combination of credit control areas, risk categories and document types for which credit checking should be bypassed.  You need to mark the field “no Credit Check” with the valid number for sales documents.
    Set Up Payment Guarantees
    Define Forms of Payment Guarantee
    Transaction: OVFD
    Tables: T691K
    Action: R/3 is delivered with form “02” defined for payment cards.  Other than the descriptor, the only other entry should be “3” in the column labeled “PymtGuaCat”
    Define Payment Guarantee Procedure
    Transaction: 
    Tables: T691M/T691O
    Action: Define a procedure and a description.  Then, under Forms of Payment Guarantee, make the following entries:
    Sequential Number  “1”
    Payment Guarantee Form  “02”
    Routine Number  “0”    (the routine number can be used to validate payment card presence)
    Define Customer Payment Guarantee Flag
    Transaction: 
    Tables: T691P
    Action: Define a flag to be stored in table. 
    Create Customer Payment Guarantee = “Payment Card Payment Cards (All Customers can use Payment Cards)”.
    Define Sales Document Payment Guarantee Flag
    Transaction: 
    Tables: T691R
    Action: Define the flag that will be associated with sales document types that are relevant for payment cards
    Assign Sales Document Payment Guarantee Flag
    Transaction: 
    Tables: TVAK
    Action: Assign the document flag type to the sales document types that are relevant for payment cards.
    Determine Payment Guarantee Procedure
    Transaction: OVFJ
    Tables: T691U
    Action: Combine the Customer flag and the sales document flag to derive the payment guarantee procedure
    Payment Card Configuration
    Define Card Types
    Transaction: 
    Tables: TVCIN
    Action: Create the different card types plus the routine that validates the card for length and prefix (etc…) 
    Visa, Mastercard, American Express, and Discover
    Create the following entries for each payment card 
    AMEX  American Express ZCCARD_CHECK_AMEX Month
    DC  Discover Card  ZCCARD_CHECK_DC  Month*****
    MC  Mastercard  ZCCARD_CHECK_MC  Month
    VISA  Visa   ZCCARD_CHECK_VISA  Month
    The Routines can be created based on the original routines delivered by SAP. 
    *****SAP does not deliver a card check for Discover Card. We created our own routine.
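    Purely as an illustration of the kind of prefix/length validation such a custom routine performs -- this is NOT the delivered SAP card-check interface, and the form name and parameters are hypothetical (the 6011 prefix matches the Discover range listed below):
    * Hypothetical illustration only, not the SAP routine interface.
    FORM z_check_discover USING    iv_card TYPE string
                          CHANGING cv_ok   TYPE c.
      DATA lv_len TYPE i.
      lv_len = strlen( iv_card ).
      CLEAR cv_ok.
      " Discover cards: 16 digits, prefix 6011
      IF lv_len = 16 AND iv_card(4) = '6011'.
        cv_ok = 'X'.
      ENDIF.
    ENDFORM.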
    Define Card Categories
    Transaction: 
    Tables: TVCTY
    Action: Define the card category to determine if a payment card is a credit card or a procurement card.
    Create the following two entries:
    Cat Description  One Card  Additional Data
    CC Credit Cards  No-check  No-check
    PC Procurement Cards No-check  Check
    Determine Card Categories
    Transaction: 
    Tables: TVCTD
    Action: For each card category map the account number range to a card category.  Multiple ranges are possible for each card category or a masking technique can be used.  Get the card number ranges from user community.  Below is just a sample of what I am aware are the different types of cards. 
    Visa Credit  Expires in 7 days. 
        400000   405500
        405505   405549
        405555   415927
        415929   424603
        424606   427532
        427534   428799
        428900   471699
        471700   499999
    Visa Procurement  Expires in 7 days.
        405501   405504
        405550   405554
        415928   415928
        424604   424605
        427533   427533
        428800   428899
    Mastercard Credit Expires in 30 days
        500000   540499
        540600   554999
        557000   599999
    Mastercard Procurement Expires in 30 days
        540500   540599
        555000   556999
    American Express Credit Expires in 30 days
        340000   349999
        370000   379999
    Discover Card Credit Expires in 30 days
        601100   601199
    Set Sales Documents to accept Payment Card Information
    Transaction: 
    Tables: TVAK
    Action: Review the listing of Sales Document types and enter “03” in the column labeled “PT” for each type which can accept a payment card
    Configuration for Authorization Request
    Maintain Authorization Requirements
    Transaction: OV9A
    Tables: TFRM
    Action: Define and activate the abap requirement that determines when an authorization is sent.  Note that the following tables are available to be used in the abap requirement (VBAK, VBAP, VBKD, VBUK, and VBUP).
    Define Checking Group
    Transaction: 
    Tables: CCPGA
    Action: Define a checking group and enter the description.  Then follow the guidelines below for the remaining fields.
    AuthReq Routine 901 is set here.
    PreAu  If checked, R/3 will request an authorization for 0.01 and the authorization will be flagged as such (Insight does not use the pre-authorization check).
    A horizon  The number of days in the future SAP will use to determine the value to authorize (Insight does not use the auth horizon period).
    Valid  You will get a warning message if the payment card is expiring within 30 days of the order entry date. 
    Assign Checking Group to Sales Document
    Transaction: 
    Tables: TVAK
    Action: Assign the checking group to the sales order types relevant for payment cards
    Define Authorization Validity Periods
    Transaction: 
    Tables: TVCIN
    Action: For each card type enter the authorization validity period in days.
    AMEX American Express 30
    DC Discover card  30
    MC Master card  30
    VISA Visa   7
    Configuration for clearing houses
    Create new General Ledger Accounts
    Transaction: FS01
    Tables: 
    Action: Two General Ledger accounts need to be created for each payment card type.  One for A/R reconciliation purposes and one for credit card clearing.
    Maintain Condition Types
    Transaction: OV85
    Tables: T685
    Action: Define a condition type for account determination and assign it to access sequence “A001”
    Define account determination procedure
    Transaction: OV86
    Tables: T683 / T683S
    Action: Define procedure name and select the procedure for control.  Enter the condition type defined in the previous step.
    Assign account determination procedure
    Transaction: 
    Tables:
    Action: Determine which billing type we are using for payment card process.
    Authorization and Settlement Control
    Transaction: 
    Tables: TCCAA
    Action: Define the general ledger accounts for reconciliation and clearing and assign the function modules for authorization and settlement along with the proper RFC destinations for each.
    Enter Merchant ID’s
    Transaction: 
    Tables: TCCM
    Action: Create the merchant id’s that the company uses to process payment cards
    Assign merchant id’s
    Transaction: 
    Tables: TCCAA
    Action: Enter the merchant id’s with each clearinghouse account

  • PO Lead Time cannot capture the time taken for shipping!

    Dear All
    I understand that PO lead time = PO processing time (working days) + planned delivery time (calendar days) + GR processing time (working days).
    This PO lead time is added on top of the PO creation date to defer the actual goods availability date.
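    For example (a rough illustration that ignores the working-day vs. calendar-day conversion): a PO created on day 0, with 2 days of processing time + 7 days of planned delivery time + 1 day of GR processing time, gives a goods availability date of roughly day 10.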
    My question:
    1. Planned delivery time is the time taken from the vendor's place to send the goods to my warehouse. What about an overseas purchase, where goods leaving the vendor's port first arrive at my country's customs, take 3 days for clearance, and are then delivered by the forwarding agent from customs to my warehouse? In this case, how do I capture the planned delivery time in the SAP system, as it now has 4 periods of time:
    a. Time taken from vendor's port to reach my country's port
    b. Time taken for my country custom to do clearing
    c. Time taken for forwarding agent to fetch goods from custom to my warehouse
    d. Time taken to unpack, take out, count, inspect and put to use (GR processing time)
    Do I need to use a user exit?
    Thanks
    Edited by: Daimos on Apr 27, 2009 6:52 PM

    Dear dogboy.
    I think we must use the Confirmation Control (CC) Key feature at PO item level:
    ED - Estimated Time of Departure from Overseas Port.
    EA - Actual Time of Departure from Overseas Port.
    EA - Estimated Time of Arrival
    AA - Actual Time of Arrival
    The purchaser will maintain the value of each CC key each time they are notified by the vendor.
    We then need a customised report to capture the CC dates entered, so that finance is able to prepare $ in advance the moment the EA is maintained, meaning the estimated date of arrival at customs.
    But the problem is that the PO user exit is only at the header of the confirmation control key and does not capture the DATE field we entered for each CC.
    That was the problem I last encountered.

  • How to Measure time taken for a some lines of code in a program?

    Hi
    I have a requirement to measure the time taken to execute a block of code in a report. How can we find it?
    Is there any way to code something in the report to calculate it?
    Please send a solution as early as possible.
    Thank you

    OK, try this code:
    DATA : t1    TYPE i,
           t2    TYPE i,
           delta TYPE p DECIMALS 6.

    GET RUN TIME FIELD t1.          " runtime counter, in microseconds
    PERFORM get_data.               " your block of code
    GET RUN TIME FIELD t2.
    delta = ( t2 - t1 ) / 1000000.  " microseconds -> seconds
    WRITE :/ 'Time elapsed : ', delta , 'Secs'.  " time in secs

  • How to analyse the time taken for a query

    Hey gurus,
    How do we find the time taken for a query to execute?
    Regards,
    Venkatesh

    Hi,
    Time taken to execute a query = front-end time + OLAP time + DB time.
    Front-end time is the time taken to do the formatting in BEx.
    OLAP time is the time taken to aggregate data in the OLAP buffer.
    DB time is the time taken to collect data from the data target.
    To find all of this information,
    go to RSRT -> give the query name -> Execute + Debug -> it will display all the fields -> check whatever fields you want.
    Regards,
    Haritha.

  • How to find time taken for a search

    hi all,
    I need to find the time taken for a particular search in the KM Search iView. I referred to the following thread
    /message/5739737#5739737 [original link is broken]
    but was not able to get the duration (time taken). Is there any other way to achieve this?
    All help will be appreciated.
    Regards,
    Shanthakumar.

    Hi Shanthakumar,
    do you want to implement your own Search iView?
    Best regards,
    Denis

  • Time taken for a method to run. ?

    I have a query about ascertaining the time taken for a method to execute.
    I have a SQL statement that reads 10,000 rows.
    String a_SQL = "Select.....from TableA";
    try {
        IQuery query = m_UC.createQuery();
        SimpleTableModel stm = query.executeSelect(a_SQL);
        // Query over.
    } catch (SQLException se) {
        // handle the exception
    }

    private void displayJTable() {
        // display the rows read from the query above
    }
    As the query is executing, I want to display a progress bar showing the status of the query.
    Now I can't use this:
    long start = System.currentTimeMillis();
    try {
        IQuery query = m_UC.createQuery();
        SimpleTableModel stm = query.executeSelect(a_SQL);
    } catch (SQLException se) {
        // handle the exception
    }
    long end = System.currentTimeMillis();
    System.out.println("Time taken to display: " + (end - start) / 1000);
    The above will give me the time elapsed in seconds for the SQL query to execute, but this is not what I want.
    What I want is to use this:
    // Progress bar running, so I need to know the time
    // taken for the query to run.
    try {
        // the SQL execution
    } catch (SQLException se) {
        // handle the exception
    }
    // Query over.
    // Stop progress bar.
    // Display JTable.
    How can I know when to stop the progress bar, as I have no handle on the time taken to execute the query?
    Any help will be appreciated

    You can have a separate thread (or maybe it's just part of the regular GUI update thread? I don't know details about GUIs) that puts up an hourglass or spinning clock hands or dancing hamster or whatever to indicate that something is going on, but like the man says, you can't know ahead of time how long it will take to run a query, so you can't show percent done. You also don't in general know ahead of time how many rows will be returned so that doesn't help you.
    You might be able to do something for the processing of the returned data, once the query completes, because then you can often get a count of the number of rows, so you process each row in a loop, and update the % done counter each time through the loop.
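    A minimal sketch of that indeterminate "busy" indicator in standard Swing (the runQuery() body below is just a stand-in for the poster's query.executeSelect(a_SQL) call):
    import javax.swing.*;

    public class QueryProgressDemo {

        public static void main(String[] args) {
            SwingUtilities.invokeLater(new Runnable() {
                public void run() {
                    createAndShowGui();
                }
            });
        }

        private static void createAndShowGui() {
            final JFrame frame = new JFrame("Query demo");
            final JProgressBar bar = new JProgressBar();
            bar.setIndeterminate(true);            // unknown duration, so no percent done
            frame.add(bar);
            frame.setSize(300, 80);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);

            new SwingWorker<Void, Void>() {
                protected Void doInBackground() throws Exception {
                    runQuery();                    // runs off the event dispatch thread
                    return null;
                }

                protected void done() {
                    bar.setIndeterminate(false);   // query finished: stop the busy indicator
                    // build and display the JTable with the results here
                }
            }.execute();
        }

        // Stand-in for the real query; replace with query.executeSelect(a_SQL).
        private static void runQuery() throws InterruptedException {
            Thread.sleep(3000);
        }
    }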

  • How to find the time taken for creating execution plan alone

    Hi,
    Is there any way to find out the split between the time taken for query parsing, creating the execution plan, and the actual data retrieval separately? If I enable 'set timing on', I see the elapsed time, which is the total time taken for all three.
    Some of my queries take a long time the first time I run them, and so I want to know where that time goes: is it parsing the query or creating the execution plan? (Since my queries run faster the second time, I am assuming the major part was parsing or creating the plan, not the data retrieval.)
    Also, where does Oracle keep the execution plan? Is it in the BUFFER_CACHE? I tried flushing the buffer_cache and restarting the DB as well, but the query execution still seems faster compared to the first time. How long does Oracle keep the execution plan in the cache, and is there any way to increase this cache size?
    Thanks in advance!

    user13169027 wrote:
    Hi,
    Is there any way to find out the split between the time taken for query parsing, creating the execution plan, and the actual data retrieval separately? If I enable 'set timing on', I see the elapsed time, which is the total time taken for all three.
    Some of my queries take a long time the first time I run them, and so I want to know where that time goes: is it parsing the query or creating the execution plan? (Since my queries run faster the second time, I am assuming the major part was parsing or creating the plan, not the data retrieval.)
    The ideal way to answer the questions above would be to perform a (SQL) trace of your query executions. To see the difference in the trace files, you might want to trace the first execution in one session and the second execution in another session: that way you get two different trace files, which you can then separately run through tkprof and investigate.
    Also, where does Oracle keep the execution plan? Is it in the BUFFER_CACHE? I tried flushing the buffer_cache and restarting the DB as well, but the query execution still seems faster compared to the first time. How long does Oracle keep the execution plan in the cache, and is there any way to increase this cache size?
    Execution plans are held in the shared-pool, not the buffer-cache. As far as I know they will be kept in memory in an LRU way (least recently used), just like db-blocks are in the buffer-pool (I know this is not entirely correct, but for all practical purposes, think of it this way).
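    For example, a minimal sketch of tracing one execution (standard 10046 event syntax; the tracefile_identifier just makes the trace file easy to find in user_dump_dest):
    -- in the session that will run the query:
    ALTER SESSION SET tracefile_identifier = 'first_run';
    ALTER SESSION SET events '10046 trace name context forever, level 8';
    -- run the query here, then:
    ALTER SESSION SET events '10046 trace name context off';
    -- on the server, format the raw trace file with tkprof:
    --   tkprof <instance>_ora_<spid>_first_run.trc first_run.txt sys=no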

  • Total time taken for the quiz needs to be displayed in the report

    Hi,
    I would like the total time taken for the quiz to be displayed in the report. Let me know if it's possible.

    Hi,
    Please try this.
    Define three user variables - StartTime, EndTime and Duration.
    At the start of the quiz ('On Slide Enter' of the first Question Slide), have an advanced action with the following actions:
    Assign: StartTime with cpInfoElapsedTimeMS  
    Continue
    At the end of the quiz ('On Success' or 'On Failure' of the last Question Slide or 'On Slide Enter' of Result Slide), have this advanced action.
    Assign: EndTime with cpInfoElapsedTimeMS
    Expression: Duration = EndTime - StartTime
    Expression: Duration = Duration / 1000
    Continue
    The variable 'Duration' will have the time taken by the user for the Quiz in seconds.
    Let me know if you have any queries.
    Thanks,
    Thejas

  • Missing Standard Dimension Column for data load (MSSQL to Essbase Data)

    This is a similar error to one posted by Sravan; however, I'm sure I have all dimensions covered, going from MS SQL to SunOpsys staging to Essbase. It is telling me a standard dimension is missing, but I have all of them accounted for:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last): File "<string>", line 23, in ? com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
    at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    I'm using multiple time period inputs -- BegBalance,Jul,Aug,Sep,Oct,Nov,Dec,Jan,Feb,Mar,Apr,May,Jun (the target has all of those in place of Time Periods).
    I'm using hard-coded input mappings for Metric, Scenario, Version, HSP_Rates and Currencies -> 'Amount', 'Actual', 'Final', 'HSP_InputValue', 'Local' respectively.
    The only thing I can think of is that since I'm loading to each of the months in the Time Periods dimension (the reversal was set up to accommodate that), it is somehow still looking for that dimension? Time Periods as a dimension does not show up in the reversal -- only the individual months named above.
    Any ideas on this one?

    John -- I extracted the data to a file and created a data load rule in Essbase to load it. All dimensions are present and accounted for (five header items, as here) and everything loads fine.
    So I am not sure what else is wrong -- I am still getting the missing dimension error.
    Any other thoughts? Here's the entire error message. Thanks for all your help on this.
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 23, in ?
    com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
         at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx8.f$0(<string>:23)
         at org.python.pycode._pyx8.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.k(e.java)
         at com.sunopsis.dwg.cmd.g.A(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
         at com.hyperion.odi.essbase.ODIEssbaseDataWriter.validateColumns(Unknown Source)
         ... 32 more
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.k(e.java)
         at com.sunopsis.dwg.cmd.g.A(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

  • Sample SOAP request for Data Loader API

    Hi
    Can anyone please help me out by giving a sample SOAP request for the Data Loader API? This is to, say, import 1K records from my system into the CRM instance I have.

    Log into the application and then click on Training and Support; there is a WS Library of Information within the application.

  • Essbase Calculation Script taking more time in new environment

    Hi Everyone:
    We have the following environments in our implementation:
    1. DEV environment - 64-bit Essbase version 11.1.1.3
    2. PreProd environment - 32-bit Essbase version 9.3.0
    3. PreProd environment - 64-bit Essbase version 11.1.1.3
    The PreProd environment with 64-bit Essbase 11.1.1.3 is a newly installed environment.
    We have migrated our application from the 32-bit Essbase 9.3.0 PreProd environment to the 64-bit Essbase 11.1.1.3 PreProd environment. A calculation script that takes only 20 minutes in the 32-bit PreProd is taking more than five and a half hours in the newly installed 64-bit PreProd.
    We have also migrated our application from the 64-bit Essbase 11.1.1.3 DEV environment to the 64-bit Essbase 11.1.1.3 PreProd environment. The calculation script that takes only 20 minutes in the 64-bit DEV is likewise taking more than five and a half hours in the newly installed 64-bit PreProd.
    All the server settings and cache settings look similar in all three environments.
    Please advise us on all the possibilities that could create this issue.
    Thanks and Regards,
    Prabhakar.

    Hi Cameron,
    Thanks for your reply.
    I have cross-checked the virtual memory on both servers; on the new server it is set higher.
    Please find below the cfg settings which we are using in our application.
    AGENTPORT 1423
    SERVERPORTBEGIN 32768
    SERVERPORTEND 33768
    AGENTDESC hypservice_1
    ;CSSREFRESHLEVEL auto
    ;SHAREDSERVICESREFRESHINTERVAL 30
    CALCCACHEHIGH 199999999
    CALCCACHEDEFAULT 150000000
    CALCCACHELOW     10000000
    CALCLOCKBLOCKDEFAULT 3000
    DATAERRORLIMIT 10000
    UPDATECALC FALSE
    EXCEPTIONLOGOVERWRITE FALSE
    CALCREUSEDYNCALCBLOCKS FALSE
    PORTUSAGELOGINTERVAL 15
    QRYGOVEXECTIME 600
    LOGMESSAGELEVEL INFO
    CALCPARALLEL 6
    MAXLOGINS 100000
    AGENTDELAY 100
    AGENTTHREADS 30
    AGTSVRCONNECTIONS 10
    SERVERTHREADS 25
    EXPORTTHREADS 1
    SSLOGUNKNOWN FALSE
    CALCNOTICEDEFAULT 10
    NETRETRYCOUNT 3000
    NETDELAY 2000
    __SM__BUFFERED_IO TRUE
    __SM__WAITED_IO TRUE
    Also find the caches that we define:
    Index cache: 250000
    Data cache: 250000
    Data file cache: 32768
    All the above settings are identical on both servers.
    On the new server, only one script is taking more time; the remaining scripts work fine with less time.
    We also ran a test where we split the script into multiple scripts and executed them; in this test, the script where we directly assign the value of one member (say A1) to another member (say A2) is the one taking more time. But the same script executes fine on the old server.
    We are still not able to find the exact root cause of this issue.
    Could anyone please help me resolve this issue?
    Regards,
    Prabhakar.

  • Time taken for HANA System to become available after a System restart for continuing transaction (in ERP on HANA)

    Hi HANA Gurus,
    I would like to know, in a normal system-replication-based HA setup for SoH without memory preload, how much time it takes after an automatic failover to the second node before HANA becomes available for the SoH (ERP) users to continue transactions.
    Consider an ideal scenario: a 512 GB appliance with an allocation limit of approximately 490 GB (the normal 90% of the first 64 GB and 97% of every further 64 GB of memory on a single host), and a used memory of, say, 128 GB (indicating the database size, with no custom configuration to preload specific tables). Consider savepoints every five minutes, as per the default, with committed transactions written to the log. The host is replicated to the second node using HANA system replication with the automatic failover option. This means that on a failure of the first node, the HANA system is restarted on the second node and data is loaded into memory based on the last state on the other node. As I understand it, the savepoint is loaded, the logs are replayed, the database becomes available for users, the lazy load continues, and so on.
    Kindly tell me, if anyone knows from experience, how much time it takes from failure detection to the system becoming available for a user to start updating, for example, a sales order. I would also like to understand this start-up process better (for example, the last savepoint is written by saving the changed pages, so there is the time taken to load the last savepoint's pages into memory, then to replay the transactions from after the last savepoint, e.g. the last 4 minutes before failing over, etc.). The purpose is to understand whether the failover time from primary to standby is similar to that of a traditional DB (for example DB2), or longer due to the memory load time in HANA. At the same time, I am aware of the other options available to reduce the RTO: memory preload, cluster solutions from vendors, OS-level options like SUSE, etc.
    Thanks in advance.
    Rajesh Vikraman

    Hi John
    My question is specific to a system failover which is not replication with memory preload, where you require cluster software to do the failover. My question concerns an automatic failover configuration using a standby node, for example the GPFS / HANA automatic failover option, where the standby node starts HANA, loads the last savepoint, replays the logs, and then starts the lazy load.
    Regards
