0CO_PC_PCP_20 extraction takes a lot of time due to delta incapability

Hi experts,
We are extracting data from 0CO_PC_PCP_20, which is for product costing analysis, and pulling the data into the 0COPC_C08 InfoCube.
The DataSource 0CO_PC_PCP_20 does not support delta, but it carries a huge volume of data.
If I extract every day, it takes a lot of time through the process chain (inserted into a metachain); because of this, the other local chains get delayed and the business reports are not available on time.
How can I make 0CO_PC_PCP_20 delta capable, or is there any other way to extract the data from this DataSource, so that I can minimize the time by extracting deltas?
Regards
venuscm

Hi Venu,
This DataSource doesn't support delta, and since it is a standard DataSource I am not sure whether you will be able to make it delta capable. What you can do is split your data load by different selections using multiple full-load InfoPackages: schedule the important InfoPackages first, and once they have finished loading, schedule the others.
Even better, execute all these InfoPackages in parallel in a process chain; that will still cut a lot of your source-system extraction time.
Just make sure that your selections together extract all the data.
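For illustration, here is a rough sketch of such a selection routine (an ABAP routine in the InfoPackage data selection). The structure name RSSDLRANGE, the field WERKS and the interval values are placeholder assumptions; the surrounding FORM interface is generated by BW, so treat this only as a sketch of how one package's range table gets filled.
* Sketch only: restrict this InfoPackage's full load to one slice of
* the data. Pick a field that splits the volume evenly; each package
* gets its own, non-overlapping interval.
DATA: l_s_range TYPE rssdlrange.
CLEAR l_s_range.
l_s_range-fieldname = 'WERKS'.   " illustrative slicing field
l_s_range-sign      = 'I'.
l_s_range-option    = 'BT'.
l_s_range-low       = '1000'.
l_s_range-high      = '1999'.
APPEND l_s_range TO l_t_range.   " l_t_range: range table of the routine
p_subrc = 0.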
Regards,
Durgesh.

Similar Messages

  • Server boot takes a lot of time due to JMS persistence file store

    Hi,
    We're using WebLogic 9.2 on a Windows machine.
    Recently, we enhanced our application to work with JMS.
    We're using 2 JMS servers on 2 different servers in a cluster, and working with distributed destinations.
    We're using file store as our persistence store.
    Lately, we found that server boot takes a lot of time due to actions which are performed on the file-store.
    Analyzing this problem, we saw that the file size is ~2GB.
    We understand that the file size does not shrink (unless manually compacted), and that its size matches the highest number of pending messages the queues have ever held (like a high-water mark).
    We also turned store debug on, using weblogic.Debug.DebugStoreIOPhysical, and we saw a lot of recovery logs after the file-store is opened.
    We've checked and verified that our JMS transactions are completed successfully. We also verified that the number of pending and current messages is 0.
    We've stopped the server gracefully, verifying there are no pending messages.
    However, server boot repeatedly takes ~30 minutes due to heavy work done on the persistence store.
    Compacting or removing the file store eliminates the delay.
    However, we want to avoid manual operations every time we want to boot the server.
    Please share your ideas.
    Thanks,
    Itsik

    The only suggestions I have off the top of my head are:
    * Ensure (1) the host system has at least 2GB (file store size) of free physical memory in addition to the memory used by current processes, and (2) the host system is not rebooted between each WebLogic restart. If you make sure of these two items, then the operating system will implicitly cache the store's file contents in memory even between boots -- when WebLogic subsequently boots the file store, the O/S should serve the file contents up much more quickly.
    * Consider moving to a JDBC store. JDBC store runtime performance is lower, but since you typically have only a small number of records to recover, boot performance should be higher in this case.
    * Not that it helps your particular case, but it happens that we have boot performance enhancements in progress for a future release (hopefully a near-future release).
    Tom

  • Generic Extraction: Taking a lot of time

    Hi Experts,
    I have created ZKONV, a generic extractor that is a copy of the KONV table. It has around 1.6 million records. When I'm pulling the data into the ODS it is taking a lot of time: around 5+ hours to load. Is there anything wrong with my DataSource that makes it take so much time?
    Kindly provide some inputs
    Thanks
    NLN

    Hi Lakshminarayana,
    You have got to check a couple of things.
    First go to the source system and see how long the extract program runs. The long runtime may be due to poor source-system performance or heavy processing on the BW side. If the job in the source system runs for a long time, check the source-system resources. Build a proper index on the table in question and see whether it improves performance. You can ask the Basis people for an SQL trace, and they will let you know what kind of indexes you can build on the tables.
    If the processing side takes more time, then you have to improve the start/update routines on the BW side. Please let us know where exactly you have the problem; then we can think about resolving it.
    Sriram

  • Function Module Extraction from KONV Table taking a lot of time for extraction

    Hi,
    I have a requirement wherein I need to get records from the KONV table (Conditions (Transaction Data)). I need the data corresponding to Application (KAPPL) = 'F'.
    For this I wrote a function module, but it is taking a lot of time (around 2.5 hrs) to fetch the records, as there is a very large number of records in KONV.
    I am pasting the function module code for reference.
    Kindly guide me as to how the extraction performance can be improved.
    Function module code:
    FUNCTION ZBW_SHPMNT_COND.
    *"----------------------------------------------------------------
    *"*"Local interface:
    *"  IMPORTING
    *"     VALUE(I_REQUNR) TYPE  SBIWA_S_INTERFACE-REQUNR
    *"     VALUE(I_ISOURCE) TYPE  SBIWA_S_INTERFACE-ISOURCE OPTIONAL
    *"     VALUE(I_MAXSIZE) TYPE  SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
    *"     VALUE(I_INITFLAG) TYPE  SBIWA_S_INTERFACE-INITFLAG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  SBIWA_S_INTERFACE-UPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
    *"     VALUE(I_PRIVATE_MODE) OPTIONAL
    *"     VALUE(I_CALLMODE) LIKE  ROARCHD200-CALLMODE OPTIONAL
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  ZBW_SHPMNT_COND OPTIONAL
    *"      E_T_SOURCE_STRUCTURE_NAME OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER
    *"----------------------------------------------------------------
    * The input parameter I_DATAPAKID is not supported yet!
      TABLES: KONV.
    * Auxiliary selection criteria structure
      DATA: l_s_select TYPE sbiwa_s_select.
    * Maximum number of lines for DB table
      STATICS: l_maxsize TYPE sbiwa_s_interface-maxsize.
      STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
    * counter
               S_COUNTER_DATAPAKID LIKE SY-TABIX,
    * cursor
               S_CURSOR TYPE CURSOR.
    * Select ranges
      RANGES: L_R_KNUMV  FOR KONV-KNUMV,
              L_R_KSCHL  FOR KONV-KSCHL,
              L_R_KDATU  FOR KONV-KDATU.
    * Internal table holding the extracted KONV fields
    * DATA: I_KONV LIKE KONV OCCURS 0 WITH HEADER LINE.
      DATA: BEGIN OF I_KONV OCCURS 0,
              MANDT LIKE konv-mandt,
              KNUMV LIKE konv-knumv,
              KPOSN LIKE konv-kposn,
              STUNR LIKE konv-stunr,
              ZAEHK LIKE konv-zaehk,
              KAPPL LIKE konv-kappl,
              KSCHL LIKE konv-kschl,
              KDATU LIKE konv-kdatu,
              KBETR LIKE konv-kbetr,
              WAERS LIKE konv-waers,
            END OF I_KONV.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters,
    *                 buffer input parameters,
    *                 prepare data selection
    * Invalid second initialization call -> error exit
        IF NOT g_flag_interface_initialized IS INITIAL.
          IF 1 = 2. MESSAGE e008(r3). ENDIF.
          log_write 'E'                    "message type
                    'R3'                   "message class
                    '008'                  "message number
                    ' '                    "message variable 1
                    ' '.                   "message variable 2
          RAISE error_passed_to_mess_handler.
        ENDIF.
    * Check InfoSource validity
        CASE i_isource.
          WHEN 'X'.
          WHEN 'Y'.
          WHEN 'Z'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e009(r3). ENDIF.
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      i_isource            "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
    * Check for supported update mode
        CASE i_updmode.
    * For full upload
          WHEN 'F'.
          WHEN 'D'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e011(r3). ENDIF.
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '011'                "message number
                      i_updmode            "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
        APPEND LINES OF i_t_select TO g_t_select.
    * Fill parameter buffer for data extraction calls
        g_s_interface-requnr    = i_requnr.
        g_s_interface-isource   = i_isource.
        g_s_interface-maxsize   = i_maxsize.
        g_s_interface-initflag  = i_initflag.
        g_s_interface-updmode   = i_updmode.
        g_s_interface-datapakid = i_datapakid.
        g_flag_interface_initialized = sbiwa_c_flag_on.
    * Fill field list table for an optimized select statement
    * (in case there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF i_t_fields TO g_t_fields.
    * Interpretation of date selection for generic extraction
        CALL FUNCTION 'RSA3_DATE_RANGE_CONVERT'
          TABLES
            i_t_select = g_t_select.
      ELSE.                 "Initialization mode or data extraction?
    *   CASE g_s_interface-updmode.
    *     WHEN 'F' OR 'C' OR 'I'.
    * First data package -> OPEN CURSOR
        IF g_counter_datapakid = 0.
          L_MAXSIZE = G_S_INTERFACE-MAXSIZE.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'KNUMV'.
            MOVE-CORRESPONDING l_s_select TO l_r_knumv.
            APPEND l_r_knumv.
          ENDLOOP.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'KSCHL'.
            MOVE-CORRESPONDING l_s_select TO l_r_kschl.
            APPEND l_r_kschl.
          ENDLOOP.
          LOOP AT g_t_select INTO l_s_select WHERE fieldnm = 'KDATU'.
            MOVE-CORRESPONDING l_s_select TO l_r_kdatu.
            APPEND l_r_kdatu.
          ENDLOOP.
    * In case of full upload:
    * fill the field list table for an optimized select statement
          APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
          OPEN CURSOR G_CURSOR FOR
            SELECT MANDT
                   KNUMV
                   KPOSN
                   STUNR
                   ZAEHK
                   KAPPL
                   KSCHL
                   KDATU
                   KBETR
                   WAERS
            FROM   KONV
            WHERE KNUMV IN l_r_knumv
            AND   KSCHL IN l_r_kschl
            AND   KDATU IN l_r_kdatu
            AND   KAPPL EQ 'F'.
        ENDIF.
        REFRESH I_KONV.
    * Use L_MAXSIZE here: S_S_IF-MAXSIZE is never filled, so PACKAGE
    * SIZE would otherwise be 0.
        FETCH NEXT CURSOR G_CURSOR
                   APPENDING CORRESPONDING FIELDS OF TABLE I_KONV
                   PACKAGE SIZE L_MAXSIZE.
        IF SY-SUBRC <> 0.
          CLOSE CURSOR G_CURSOR.
          RAISE NO_MORE_DATA.
        ENDIF.
        LOOP AT I_KONV.
          IF I_KONV-KAPPL EQ 'F'.
            CLEAR: E_T_DATA.
            E_T_DATA-MANDT = I_KONV-MANDT.
            E_T_DATA-KNUMV = I_KONV-KNUMV.
            E_T_DATA-KPOSN = I_KONV-KPOSN.
            E_T_DATA-STUNR = I_KONV-STUNR.
            E_T_DATA-ZAEHK = I_KONV-ZAEHK.
            E_T_DATA-KAPPL = I_KONV-KAPPL.
            E_T_DATA-KSCHL = I_KONV-KSCHL.
            E_T_DATA-KDATU = I_KONV-KDATU.
            E_T_DATA-KBETR = I_KONV-KBETR.
            E_T_DATA-WAERS = I_KONV-WAERS.
            APPEND E_T_DATA.
          ENDIF.
        ENDLOOP.
        g_counter_datapakid = g_counter_datapakid + 1.
      ENDIF.
    ENDFUNCTION.
    Thanks in Advance
    Regards
    Swapnil.

    Hi,
    One option to investigate is to select the data with a condition on KNUMV (the primary index).
    Since shipment costs are stored in VFKP, I would check whether all your 'F' condition records are referenced in that table (field VFKP-KNUMV).
    If that is the case, then something like
    SELECT *
      FROM KONV
      WHERE KNUMV IN ( SELECT DISTINCT KNUMV FROM VFKP ).
    or
    SELECT DISTINCT KNUMV
      INTO CORRESPONDING FIELDS OF TABLE <itab>
      FROM VFKP.
    and then
    SELECT *
      FROM KONV
      FOR ALL ENTRIES IN <itab>
      WHERE KNUMV = <itab>-KNUMV
      AND ...
    will definitely speed it up, because the cursor then reads KONV by its primary key instead of scanning the whole table. (With FOR ALL ENTRIES, make sure <itab> is not empty first; otherwise the WHERE clause is ignored and everything is selected.)
    Hope this helps...
    Olivier

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    I have a couple of questions regarding data extraction.
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most commonly selected?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe that you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide info for ABAP to populate the new field (name of the DataSource, extract structure, field added, name of the application table which contains the field). How does it work now, given there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Thanks!

    Hi,
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    Once you create a DataSource for flat-file extraction, it is specific to the file source system, hence you cannot change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most commonly selected?
    When we don't find a suitable standard extractor we go for a generic one (e.g. if I want sales information together with finance information in one DataSource, there is generally no standard one, hence we go for a generic DS).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capture you can use:
    Timestamp (if the table has a timestamp field, the last changed records are stamped with it, so it is easy to derive the delta from the timestamp).
    Calday (if the table doesn't have a timestamp field, look for a calendar-day field, so the delta is based on the date on which documents are changed).
    Numeric pointer (if the table has neither of the above, we go for this option, where the delta is based on a continuously increasing numeric value, for example a document number).
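    To illustrate the idea, here is a conceptual sketch of what a timestamp-based generic delta effectively does on each delta request. This is not the generated extractor code; the table, field and variable names are made-up placeholders, and the real pointer bookkeeping is done by the delta queue.
    * Conceptual sketch: each delta request selects only the records
    * changed since the previous run.
    DATA: lt_data TYPE STANDARD TABLE OF ztable,   " ZTABLE: hypothetical
          lv_last_pointer    TYPE timestamp,       " saved after last run
          lv_new_upper_limit TYPE timestamp.       " this run's upper limit
    SELECT * FROM ztable
      INTO TABLE lt_data
      WHERE aedtm > lv_last_pointer                " AEDTM: assumed
        AND aedtm <= lv_new_upper_limit.           " timestamp field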
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    A generic DataSource simply extracts data directly from the database table, without any interface between the applications/systems, so no setup table is involved.
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe that you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    Correct.
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide info for ABAP to populate the new field (name of the DataSource, extract structure, field added, name of the application table which contains the field). How does it work now, given there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Once you add the new field to the structure (DS), you will get data for it from that date onwards, not historical data; the setup table is not involved here (delta records come from the delta queue, not from the setup table).
    If you want historic data for the new field, then you need a setup-table deletion and refill, etc.
    Hope it is clear.
    Regards,
    Satya

  • Data load taking a lot of time

    Hi All,
    I am trying to upload 2LIS_02_SCL data and refresh the entire dataset using an init; it is taking a lot of time in production. This is creating a problem for the next background job, which runs at night for the deltas. Can anybody please guide me on how to speed up the load? This DataSource is connected to 2 InfoCubes and 1 ODS.
    Regards,
    Rajesh

    Hi Sal,
    I have done the same things as you said; my mistake is:
    R/3:
    Locked all R/3 users.
    Deleted the setup tables via LBWG.
    Filled the setup tables using Document Date from 2006 to 2008 (I filled them month-wise).
    BW:
    I have cleaned the entire data from 0PUR_C01 and 0PUR_C04.
    Loaded data to the 2 cubes and 1 ODS at the same time using an init upload.
    It started taking a long time. The actual problem is that the load was not finished by the time the daily load process chain starts (at night).
    I cancelled the job and made only the delta available to the process chain.
    Now the problem has escalated; today I started the same init again for 1 cube only, and it is again taking a long time.
    Regards,
    Rajesh

  • Importing SLD content takes a lot of time in Solman 4.0

    Hi Folks,
    Importing the SLD content takes a lot of time while installing Solution Manager 4.0. It is in the 42nd phase of 45 (SQL 2005 DB).
    It is stuck in "Configuring system landscape directory..." and has NOT thrown any error as of now... touch wood.
    Can anyone tell me how much time it should take, OR has it gone into a loop?

    Hi All,
    Finally I have received an error during the above-mentioned phase.
    sapinst.log:
    Import Status: PREPARING
    Import Status: PREPARING
    Import Status: PREPARING
    ERROR: CIM_ERR_FAILED: IO error: Read timed out
    <BR>CONFIGURATION=
    ERROR 2007-10-22 15:55:34
    CJS-30059  J2EE Engine configuration error.<br>DIAGNOSIS: Error when configuring J2EE Engine. See output of logfile java.log: 'JSE'.
    java.exe.log:
    Import Status: PREPARING
    TYPE=A<BR>STATE=<BR>INFO_SHORT=com.sap.sld.api.wbem.exception.CIMCommunicationException: com.sap.sld.api.wbem.exception.CIMCommunicationException: CIM_ERR_FAILED: IO error: Read timed out
         at com.sap.sld.api.wbem.client.WBEMHttpRequestSender.send(WBEMHttpRequestSender.java:158)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.send(WBEMRemoteClient.java:720)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.send(WBEMRemoteClient.java:694)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.send(WBEMRemoteClient.java:638)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.referencesImpl(WBEMRemoteClient.java:375)
         at com.sap.sld.api.wbem.client.WBEMClient.references(WBEMClient.java:1773)
         at com.sap.sld.api.wbem.client.WBEMClientUtil.referencesComplete(WBEMClientUtil.java:490)
         at com.sap.lcr.pers.delta.importing.SAPCRUpgrade.collectAssociationsForRestoration(SAPCRUpgrade.java:700)
         at com.sap.lcr.pers.delta.importing.SAPCRUpgrade.delete(SAPCRUpgrade.java:355)
         at com.sap.lcr.pers.delta.importing.ImportHandler.loadFullImport(ImportHandler.java:1765)
         at com.sap.lcr.pers.delta.importing.ImportHandler.loadImpl(ImportHandler.java:1605)
         at com.sap.lcr.pers.delta.importing.ImportHandler.load(ImportHandler.java:1573)
         at com.sap.ctc.util.SLDConfig.importSldContent(SLDConfig.java:812)
         at com.sap.ctc.util.SLDConfig.performFunction(SLDConfig.java:154)
         at com.sap.ctc.util.ConfigServlet.doGet(ConfigServlet.java:69)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:390)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:264)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:347)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:325)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:887)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:241)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:148)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:100)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:170)
    Caused by: java.net.SocketTimeoutException: Read timed out
         at java.net.SocketInputStream.socketRead0(Native Method)
         at java.net.SocketInputStream.read(SocketInputStream.java:129)
         at java.net.SocketInputStream.read(SocketInputStream.java:182)
         at com.tssap.dtr.client.lib.protocol.streams.ChunkedInputStream.readLine(ChunkedInputStream.java:323)
         at com.tssap.dtr.client.lib.protocol.streams.ResponseStream.readLine(ResponseStream.java:271)
         at com.tssap.dtr.client.lib.protocol.impl.Response.initialize(Response.java:476)
         at com.tssap.dtr.client.lib.protocol.Connection.getResponse(Connection.java:2604)
         at com.tssap.dtr.client.lib.protocol.Connection.sendInternal(Connection.java:1578)
         at com.tssap.dtr.client.lib.protocol.Connection.send(Connection.java:1427)
         at com.sap.sld.api.wbem.client.WBEMHttpRequestSender.send(WBEMHttpRequestSender.java:142)
         ... 30 more
    caused by:
    java.net.SocketTimeoutException: Read timed out
         at java.net.SocketInputStream.socketRead0(Native Method)
         at java.net.SocketInputStream.read(SocketInputStream.java:129)
         at java.net.SocketInputStream.read(SocketInputStream.java:182)
         at com.tssap.dtr.client.lib.protocol.streams.ChunkedInputStream.readLine(ChunkedInputStream.java:323)
         at com.tssap.dtr.client.lib.protocol.streams.ResponseStream.readLine(ResponseStream.java:271)
         at com.tssap.dtr.client.lib.protocol.impl.Response.initialize(Response.java:476)
         at com.tssap.dtr.client.lib.protocol.Connection.getResponse(Connection.java:2604)
         at com.tssap.dtr.client.lib.protocol.Connection.sendInternal(Connection.java:1578)
         at com.tssap.dtr.client.lib.protocol.Connection.send(Connection.java:1427)
         at com.sap.sld.api.wbem.client.WBEMHttpRequestSender.send(WBEMHttpRequestSender.java:142)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.send(WBEMRemoteClient.java:720)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.send(WBEMRemoteClient.java:694)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.send(WBEMRemoteClient.java:638)
         at com.sap.sld.api.wbem.client.WBEMRemoteClient.referencesImpl(WBEMRemoteClient.java:375)
         at com.sap.sld.api.wbem.client.WBEMClient.references(WBEMClient.java:1773)
         at com.sap.sld.api.wbem.client.WBEMClientUtil.referencesComplete(WBEMClientUtil.java:490)
         at com.sap.lcr.pers.delta.importing.SAPCRUpgrade.collectAssociationsForRestoration(SAPCRUpgrade.java:700)
         at com.sap.lcr.pers.delta.importing.SAPCRUpgrade.delete(SAPCRUpgrade.java:355)
         at com.sap.lcr.pers.delta.importing.ImportHandler.loadFullImport(ImportHandler.java:1765)
         at com.sap.lcr.pers.delta.importing.ImportHandler.loadImpl(ImportHandler.java:1605)
         at com.sap.lcr.pers.delta.importing.ImportHandler.load(ImportHandler.java:1573)
         at com.sap.ctc.util.SLDConfig.importSldContent(SLDConfig.java:812)
         at com.sap.ctc.util.SLDConfig.performFunction(SLDConfig.java:154)
         at com.sap.ctc.util.ConfigServlet.doGet(ConfigServlet.java:69)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:390)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:264)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:347)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:325)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:887)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:241)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:148)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:100)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:170)
    //   => Importing Data : E:/usr/sap/SOL/SYS/global/sld/model/CR_Content.zip URL=http://devsrv:50100 USER=J2EE_ADMIN ...
    Import Status: PREPARING
    ERROR: CIM_ERR_FAILED: IO error: Read timed out
    <BR>CONFIGURATION=

  • Partition switching taking lots of time

    I have a partitioned table. All the partitions of this table are mapped to a single filegroup.
    When I try to switch a partition, it takes a long time: more than 5 minutes for a single partition.
    I can see that there is a file in the filegroup and its size is very high, 234 GB.
    Can partition switching take more time due to file size, or is there some other issue?
    I need help in finding out what the problem is.
    Thanks,

    So you are running ALTER TABLE SWITCH? Check for blocking. The operation requires a Sch-M lock, which means that it is blocked by any query that is running, including queries using NOLOCK.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Extractor taking lots of time.

    Hi experts,
    We are using the extractor 2LIS_13_VDKON to extract the data,
    but it is taking a lot of time. How can I solve this problem?
    Any help appreciated.

    Hi,
    Have you done any enhancements (appends) to this DataSource? If so, make sure that the code is performance-optimized.
    The setup table for 2LIS_13_VDKON generally holds a lot of data; for example, it holds on average about 10 records for each billing item record.
    Also check whether the system is uploading records to the new-data table of the ODS or not.
    With rgds,
    Anil Kumar Sharma .P

  • Taking lot of time for loading

    Hi,
    We are loading data from HR to BI. The connection between HR and BI was set up recently. When I try to load data from HR to BI it takes a lot of time; for example, loading 5 records takes 8 hours. It is the same for every DataSource. Should we change anything in the settings to make the IDocs work properly? Thanks

    You have to isolate the part that is taking the time.
    - Is the R/3 extraction quick? (You can check with RSA3 and see how long it takes.)
    - If the R/3 extraction is slow, is the selection using an index? How many records are there in the table/view?
    - Is there a user exit? Is the user exit slow? (A common pattern and its fix are sketched after this list.)
    You can find out using the monitor screen:
    - After the R/3 extraction completed, how long did it take to insert into the PSA?
    - How long did it take in the update rules?
    - How long did it take to activate?
    Once you isolate the problem area, post your findings here and someone will help you.
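    If the user exit turns out to be the slow part, the most common cause is a SELECT executed once per record inside the data-package loop. Below is a rough sketch of the usual fix; all table, field and variable names (MARA/MTART, the ZZMTART append field, C_T_DATA typed to your extract structure) are illustrative assumptions, not your actual exit code.
    * Slow pattern: SELECT SINGLE inside LOOP AT c_t_data (one DB round
    * trip per record). Faster: one array fetch per data package, then
    * an internal-table read per record.
    DATA: lt_mat  TYPE STANDARD TABLE OF mara,
          ls_mat  TYPE mara,
          ls_data LIKE LINE OF c_t_data.
    IF c_t_data[] IS NOT INITIAL.
      SELECT matnr mtart FROM mara
        INTO CORRESPONDING FIELDS OF TABLE lt_mat
        FOR ALL ENTRIES IN c_t_data
        WHERE matnr = c_t_data-matnr.
      SORT lt_mat BY matnr.
    ENDIF.
    LOOP AT c_t_data INTO ls_data.
      READ TABLE lt_mat INTO ls_mat
           WITH KEY matnr = ls_data-matnr BINARY SEARCH.
      IF sy-subrc = 0.
        ls_data-zzmtart = ls_mat-mtart.   " fill the appended field
        MODIFY c_t_data FROM ls_data.
      ENDIF.
    ENDLOOP.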

  • Activation of data targets in BW 3.5 is taking a lot of time

    Hi,
    I am using BW 3.5. I have created a lot of InfoCubes, ODS objects, etc. for practice. Maybe due to this or some other reason, things take a lot of time to execute; sometimes the same screen stays up for a long time (nearly half an hour) even when I am just trying to activate a data target. In that case I stop BW in the SAP MMC and start it again.
    What may be the reason for both these things? Is it due to an overload of data?
    If so, shall I delete the previous data targets?
    Can anyone please give me the reason for this kind of problem?
    Thank you,
    Deepthi Matturi

    Hi Deepthi,
    Are you working on an IDES or training system? If no batch jobs are scheduled to clear down the logs, it may be building up logs which need to be deleted.
    OR
    There may be too little table space; you can check the table space using transaction DB02.
    Please ask your Basis team to check your BW system/server.
    Hope this helps,
    Sukhi

  • Connection takes a lot of time

    We have a problem on our database: it takes a lot of time to connect.
    The connection is made locally (with the BEQ protocol).
    The connection is available only after about a minute.
    What could be the reason for it?

    Run 'tnsping' and see what the response time is:
    tnsping TNSNAME
    This problem could be due to NTS defined in the SQLNET.AUTHENTICATION_SERVICES parameter of the SQLNET.ORA file.
    Please edit the SQLNET.ORA file and remove (NTS) from:
    SQLNET.AUTHENTICATION_SERVICES
    Check whether tracing is enabled on the client side: verify the TRACE_LEVEL_CLIENT parameter in the client's SQLNET.ORA file (it should be off). If it is set to ON, please change it to OFF.
    Try changing the hostname in the TNSNAMES.ORA file to an IP address so that a hostname lookup is not needed.
    SJH
    OCP DBA

  • I bought a new MacBook Pro 13" around two months ago. My Apple ID works on everything except the App Store: it buffers for a long time and finally shows "cannot connect to App Store". Please help me

    I bought a new MacBook Pro 13" around two months ago. My Apple ID works on everything except the App Store. It buffers for a long time and finally the message "cannot connect to App Store" appears on screen. Please help me.

    Have you tried repairing disk permissions? See: iTunes download error -45054

  • Opening up an Excel output takes a lot of time whenever the file size is large

    Hi All,
    We have some BI Publisher reports with Excel output format. These files have records in the order of thousands, and the file sizes are over 2 MB. It takes a lot of time to open such files. Does anyone have any idea how we can get around this problem?
    Thanks in advance,
    Venkatesh N

    Another point is that BI Publisher doesn't create a native/binary Excel file but an XML output. When you open this XML once and save it again as *.XLS, its size will decrease significantly and Excel will open it much faster.
    This applies to Excel 2000 - 2003; I don't know how it works with Office 2007.
    Regards,
    Stefan

  • Loading into the InfoPackages takes a lot of time

    Hi Friends,
    When I schedule and activate the process chain every day, it takes a lot of time for the InfoPackages to load, around 5 to 6 hours.
    In ST13, when I click on the log ID name to see the process chain, the LOAD DATA InfoPackage shows green; after double-clicking on it, in the process monitor I see the request still running, and it is yellow but with 0 from 0 records.
    It stays like that for hours. It is very slow.
    I need your guidance on this.
    Thanks for your help.

    Hi,
    What are you trying to load using this process chain? If it is a cube, are you deleting the indexes on the cube before starting the load? If not, include that step as well in the process chain; this will help improve performance.
    Ideally it is always good practice to delete the indexes before loading and rebuild them afterwards; a rough sketch follows below.
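    For reference, this is normally done with the standard "Delete Index" / "Generate Index" process chain steps. Programmatically, the same effect is often achieved with the function modules below; treat their names and exact interface as an assumption to verify in your own system before use.
    DATA: lv_cube TYPE rsinfocube VALUE 'ZSALES01'.  " illustrative cube name
    * Drop the cube's secondary indexes before the large load ...
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = lv_cube.
    * ... run the load ...
    * ... then rebuild the indexes afterwards.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_CREATE'
      EXPORTING
        i_infocube = lv_cube.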
    Regards,
    Yogesh
