Short Dump Error when we check the Data Source in RSA3

Hi Experts,
We are using DataSource 2LIS_13_VDITM, and when we try to check it in RSA3 for records it ends in a short dump. I am providing the error information below.
An exception occurred that is explained in detail below.
The exception, which is assigned to class 'CX_SY_IMPORT_MISMATCH_ERROR', was not caught in procedure "MCEX_BW_LO_API" "(FUNCTION)", nor was it propagated by a RAISING clause.
Since the caller of the procedure could not have anticipated that the exception would occur, the current program is terminated.
The reason for the exception is:
When importing the object "MC13VD0ITM_TAB", the component no. 96 in the dataset has a different type from the corresponding component of the target object in the program "SAPLMCEX".
The data type is "N" in the dataset, but "X" in the program.
Since I don't have much knowledge of ABAP, please help me out with this issue. Please suggest a possible solution, and let me know if you need any further information.

Hi,
Try adding a TRY ... CATCH block in the extractor function module to catch the exception.
For the type mismatch you will have to make sure the variables are of the same type, or that they can be type-converted.
Use the debug mode of RSA3 to see what data is being exchanged.
You will need to adjust your extractor to avoid the dump, or ensure that the value passed to that object is in the correct format; a small illustration follows below.
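For illustration only (this is not the actual code of SAPLMCEX or of the LO extractor; the structure names and the data cluster id below are made up): a minimal sketch that reproduces the "N" versus "X" component mismatch from the dump and catches CX_SY_IMPORT_MISMATCH_ERROR instead of letting the program terminate:
TYPES: BEGIN OF ty_old,
         field TYPE n LENGTH 10,   " stored with numeric-text type, the "N" in the dump
       END OF ty_old,
       BEGIN OF ty_new,
         field TYPE x LENGTH 10,   " program now expects raw bytes, the "X" in the dump
       END OF ty_new.
DATA: ls_old      TYPE ty_old,
      ls_new      TYPE ty_new,
      lv_buffer   TYPE xstring,
      lv_text     TYPE string,
      lx_mismatch TYPE REF TO cx_sy_import_mismatch_error.
ls_old-field = '1234567890'.
EXPORT my_data = ls_old TO DATA BUFFER lv_buffer.       " cluster written with the old type
TRY.
    IMPORT my_data = ls_new FROM DATA BUFFER lv_buffer. " target type no longer matches
  CATCH cx_sy_import_mismatch_error INTO lx_mismatch.
    lv_text = lx_mismatch->get_text( ).
    WRITE: / lv_text.                                   " handle or log instead of dumping
ENDTRY.
Note that catching the exception only prevents the dump; the stored data still has to match the structure the program expects, so the extraction data and the extract structure need to be brought back in sync.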
Cheers
Kiran

Similar Messages

  • Short dump error when using count(*)

    Hi Experts,
    I am getting a short dump error when selecting the records >= current date - 30. Please see the coding below and correct it; I want to count the records in the Z tables, which is important for the command interface.
    I have 1,402,345 records available after deleting records, but since the memory is not enough the selection gives a short dump error (a counting sketch follows after the report listing).
    select count(*) from ZINT_TABLE
    select count(*) from ZINT_MSGS
       select * from zint_data
                 into table izint_d2 . "PACKAGE SIZE 20000
       where STATUS = 'OK' AND CREATED_ON >= w_date1.          " VALUE
    endselect.
    report z_eslp_command_records.
    data: cnt type i.
    data: cnt2 type i.
    data: cnt3 type i.
    data: cnt4 type i.
    DATA:
         w_date1 like sy-datum .
    DATA:
         w_date2 like sy-datum.
    data: izint_msgs type table of zint_msgs.
    data: izint_data type table of zint_data.
    data: izint_m2 type table of zint_msgs.
    data: izint_d2 type table of zint_data.
    INITIALIZATION.
    w_date1 = sy-datum -  30.
    w_date2 = sy-datum -  30.
    select * from zint_data
                 into table izint_data PACKAGE SIZE 3000
                 where STATUS = 'OK' AND CREATED_ON <= w_date1.   " ZERO
    endselect.
    select * from zint_msgs
                 into table izint_msgs  PACKAGE SIZE 3000
                  where  CREATED_ON <= w_date2.              " ZERO
    endselect.
      select * from zint_data
                 into table izint_d2 PACKAGE SIZE 20000
       where STATUS = 'OK' AND CREATED_ON >= w_date1.          " VALUE
    endselect.
    select * from zint_msgs
                 into table izint_m2 PACKAGE SIZE 20000
      where CREATED_ON >= w_date2.                            " VALUE
      endselect.
    select * from zint_data
                into table izint_data2
    where STATUS = 'OK' AND CREATED_ON >= CONVERT(CHAR(8), GETDATE() - 30, 112)).
    ENDSELECT.
      select * from zint_msgs
                into table izint_msgs2
    where CREATED_ON >= CONVERT(CHAR(8), GETDATE() - 30, 112)).
    ENDSELECT.
    sort izint_data by created_on ascending.
    sort izint_msgs by created_on ascending.
    sort izint_d2 by created_on ascending.
    sort izint_m2 by created_on ascending.
    describe table izint_data lines cnt.
    describe table izint_msgs lines cnt2.
    describe table izint_d2 lines cnt3.
    describe table izint_m2 lines cnt4.
    write:/ ' Note: THE RECORDS COUNTED SHOULD SHOW ZERO ELSE THE SCRIPT FAILED TO RUN' color 3.
    skip.
    write:/ '1. Records counted in ZINT_DATA   <=current date - 30                   :' color 2,                        cnt color 4.
    write:/ '2. Records available in ZINT_DATA >=current date - 30                   'color 4,                   cnt3 color 4 .
    skip.
    write:/ '2. Records counted in ZINT_MSGS   <=current date - 30                   :' color 2                 ,                      cnt2 color 4.
    write:/ '4. Records available in ZINT_MSGS >=current date - 30                   'color 4  ,            cnt4 color 4 .
    TOP-OF-PAGE.
    WRITE:/55(60) ' WAGNERS INVESTMENT LIMITED  '.
    WRITE:/50(40) ' Command Interface Data' CENTERED .
      WRITE:/50(40) '----' CENTERED .
      FORMAT INTENSIFIED ON.
      SKIP.
      "FORMAT COLOR COL_HEADING.
      ULINE.
      FORMAT COLOR 1.
    END-OF-PAGE.
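
    For reference, if only the record counts are needed, here is a minimal counting sketch. It reuses the table and field names from the post (ZINT_DATA, STATUS, CREATED_ON, assumed to exist as described; the report name is hypothetical) and lets the database do the counting instead of reading the rows into internal tables, which avoids the memory problem:
    report z_count_sketch.
    data: cnt     type i,
          cnt3    type i,
          w_date1 like sy-datum.
    w_date1 = sy-datum - 30.
    " aggregate on the database server; no rows are transferred into memory
    select count(*) from zint_data into cnt
           where status = 'OK' and created_on <= w_date1.
    select count(*) from zint_data into cnt3
           where status = 'OK' and created_on >= w_date1.
    write:/ 'Records in ZINT_DATA <= current date - 30:', cnt.
    write:/ 'Records in ZINT_DATA >= current date - 30:', cnt3.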


  • Short dump error when extracting delta records from R/3

    I am working on BW 3.5 and I am facing a short dump error when extracting delta records from R/3 to BW.
    Below is the error message.
    Kindly help as soon as possible.
    Job started
    Step 001 started (program SBIE0001, variant &0000000024277, user ID BWREMOTE)
    Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
    DATASOURCE = 0ISCM_PAYMENT_01
             Current Values for Selected Profile Parameters               *
    abap/heap_area_nondia......... 2000000000                              *
    abap/heap_area_total.......... 2000000000                              *
    abap/heaplimit................ 40000000                                *
    zcsa/installed_languages...... DE                                      *
    zcsa/system_language.......... E                                       *
    ztta/max_memreq_MB............ 2047                                    *
    ztta/roll_area................ 6500000                                 *
    ztta/roll_extension........... 2000000000                              *
    2,454 LUWs confirmed and 2,454 LUWs to be deleted with function module RSC2_QOUT_CONFIRM_DAT
    ABAP/4 processor: MESSAGE_TYPE_X
    Job cancelled

    Hi,
    Look at transaction ST22 to see which type of error you are getting, and take a look at the notes in the dump to correct it.
    Another option is to search OSS notes, since the error is being raised by a standard extractor.
    Greetings,

  • SharePoint PPS 2013 Dashboard Designer error "Please check the data source for any unsaved changes and click on Test Data Source button"

    Hi,
    I am getting the error below in SharePoint PPS 2013 Dashboard Designer while creating the Analysis Services connection using "PROVIDER="MSOLAP";DATA SOURCE="http://testpivot2013:9090/Source%20Documents/TestSSource.xlsx"
    "An error occurred connecting to this data source. Please check the data source for any unsaved changes and click on Test Data Source button to confirm connection to the data source. "
    I have checked all the sites and done all the steps as well, but I am still getting the error. It's frustrating; everything is configured correctly, yet the error persists.
    Thanks in advance.
    Poomani Sankaran

    Hi Poomani,
    Thanks for posting your issue.
    You need to install SQL Server 2012 ADOMD.NET on your machine, and then follow the URL below, which shows step by step how to create a SharePoint dashboard with an Analysis Services data source.
    http://www.c-sharpcorner.com/UploadFile/a9d961/create-an-analysis-service-data-source-connection-using-shar/
    I hope this is helpful to you; if so, please mark it as Helpful.
    If this works, please mark it as Answered.
    Regards,
    Dharmendra Singh (MCPD-EA | MCTS)
    Blog : http://sharepoint-community.net/profile/DharmendraSingh

  • Short dump error when extracting from one of the datasource in R/3 to BW

    When extracting from one of the DataSources I am getting a short dump. Below is the source code extract from the dump.
    Source code extract
    * Get boundaries of next TID block
    L_FROM_INDEX = L_TO_INDEX + 1.
    IF L_FROM_INDEX GT NFILL.  EXIT.  ENDIF.
    L_TO_INDEX   = L_TO_INDEX + L_BLOCK_SIZE.
    IF L_TO_INDEX GT NFILL.
      L_TO_INDEX = NFILL.
      L_BLOCK_SIZE = L_TO_INDEX - L_FROM_INDEX + 1.
    ENDIF.
    * Create hashed index on TID of TID table
    CLEAR L_TH_TID_IDX.
    LOOP AT TIDTAB FROM L_FROM_INDEX TO L_TO_INDEX.
      L_S_TID_IDX-TIDIX = SY-TABIX.
      L_S_TID_IDX-TID   = TIDTAB-TID.
      COLLECT L_S_TID_IDX INTO L_TH_TID_IDX.
    ENDLOOP.
    * Select TID block from STATE table
    SELECT * INTO TABLE L_T_STATE
           FROM ARFCSSTATE FOR ALL ENTRIES IN L_TH_TID_IDX
           WHERE ARFCIPID   EQ L_TH_TID_IDX-TID-ARFCIPID
             AND ARFCPID    EQ L_TH_TID_IDX-TID-ARFCPID
             AND ARFCTIME   EQ L_TH_TID_IDX-TID-ARFCTIME
             AND ARFCTIDCNT EQ L_TH_TID_IDX-TID-ARFCTIDCNT
           ORDER BY PRIMARY KEY.
    * Consistence check
    DESCRIBE TABLE L_T_STATE LINES L_LINES.
    IF L_LINES NE L_BLOCK_SIZE OR
       L_LINES EQ 0.
      MESSAGE X097(SY).
    ENDIF.
    PERFORM DELETE_BATCH_JOB
            USING    L_T_STATE
            CHANGING L_S_TID1.
    * Update LUW status and time
    CLEAR L_T_STATE_IDX.
    CLEAR L_TH_TID2_IDX.
    CLEAR L_T_TID.
    LOOP AT L_T_STATE INTO L_S_STATE.
      L_S_STATE_IDX-TABIX = SY-TABIX.
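
    As a side note on the SELECT ... FOR ALL ENTRIES in the extract above: a generic sketch of the usual guard against an empty driver table (hypothetical key table; VBAK/VBAP are used only as example tables), since FOR ALL ENTRIES with an empty internal table drops the condition and reads the whole database table:
    TYPES: BEGIN OF ty_key,
             vbeln TYPE vbak-vbeln,
           END OF ty_key.
    DATA: lt_keys   TYPE STANDARD TABLE OF ty_key,
          lt_result TYPE STANDARD TABLE OF vbap.
    " guard: with an empty FOR ALL ENTRIES table the WHERE condition is ignored
    IF lt_keys IS NOT INITIAL.
      SELECT * FROM vbap
               INTO TABLE lt_result
               FOR ALL ENTRIES IN lt_keys
               WHERE vbeln = lt_keys-vbeln.
    ENDIF.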

    Hi Pavan,
    This is a tablespace error.
    Regards,
    Rahul

  • Getting error when we loading the data to 0TCT_C01

    When we are loading the delta loads to the 0TCT_C01 cube, the load fails with "Job terminated in source system --> Request set to red".
    Below are the details about the dump:
    Runtime Errors         COMPUTE_INT_TIMES_OVERFLOW
    Exception              CX_SY_ARITHMETIC_OVERFLOW
    Date and Time          10/05/2009 09:56:36
    Short text
         Whole number overflow on multiplication.
    What happened?
         Error in the ABAP Application Program
         The current ABAP program "CL_RSTCT_BIRS_EXTRACTION======CP" had to be
          terminated because it has
         come across a statement that unfortunately cannot be executed.
    What can you do?
         Note down which actions and inputs caused the error.
         To process the problem further, contact your SAP system
         administrator.
         Using Transaction ST22 for ABAP Dump Analysis, you can look
         at and manage termination messages, and you can also
         keep them for a long time.
    Please do help me to solve this error.
    Thanks in advance.
    Regards,
    Anil

    Hi Anil,
    Please check the below SAP Note:
    Note 1323805 - 0TCT_DS01 run time error -COMPUTE_INT_TIMES_OVERFLOW
    Hope this helps.
    Veerendra.
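
    For background, COMPUTE_INT_TIMES_OVERFLOW / CX_SY_ARITHMETIC_OVERFLOW simply means that an integer multiplication exceeded the value range of type i. A minimal, hypothetical illustration (not the extractor's actual code) of catching the exception and redoing the calculation with a wider type:
    DATA: lv_factor TYPE i VALUE 1000000,
          lv_int    TYPE i,
          lv_pack   TYPE p LENGTH 16 DECIMALS 0.
    TRY.
        lv_int = lv_factor * 10000.    " intermediate result exceeds the range of type i
      CATCH cx_sy_arithmetic_overflow.
        lv_pack = lv_factor.
        lv_pack = lv_pack * 10000.     " packed arithmetic copes with the larger value
    ENDTRY.
    In the actual case the correction from the SAP note above is the proper fix; this only illustrates what the runtime error means.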

  • 00933 error when clicking on the data tab for the selected table

    Hi,
    I'm getting the following error when selecting a table from the tree view then clicking on the data tab:
    An error was encountered performing the requested operation:
    ORA-00933: SQL command not properly ended
    00933.00000 - "SQL command not properly ended"
    *Cause:
    *Action:
    Vendor code 933
    The exact same tables created from SQL*Plus for a different user in the same database work fine. Note that I'm not typing any SQL in myself. It appears this happens on all tables created as this user. I cannot see any difference between the permissions for this user and others which work, but the failing user does have a "-" in the username and password, which I'm suspicious of.
    This is SQL Developer v1.2.0 (build 29.98) (the most recent, I believe), Oracle Database 10g Express Edition Release 10.2.0.1.0, and Linux (Ubuntu).
    Thanks
    Martin

    "-" is not allowed in unquoted identifiers and is almost certainly the cause of your problem. It can be worked around by quoting, i.e. using "STUPID-NAME" instead of STUPID-NAME.
    I'm slightly surprised that SQL Developer doesn't quote the statement. Perhaps it quotes table names but not user names.

  • Error when activating Real Time Data Source

    Hi..
    I created a generic datasource in R/3 system via RSO2, with the Real-time check box enabled in the settings for Generic Delta. I saved the datasource. Replicated it into BW System.
    When I tried to activate this real-time DataSource in BW, it said "Error when activating Data Source".
    I am on BI 7.0, with PI_BASIS: 2005_1_700.
    Please let me know if you need any further information.
    Thanks,
    Sai.

    Hi Sai,
    Can you specify how you created your generic DataSource, i.e. using a DB table/view or an InfoSet query?
    am

  • Error when creating a new Data Source

    All, This is on BI Publisher 11g.
    When I create a new data source, supply the password, and then click the "Apply" button in the top right corner, I get the following error:
    Error 500--Internal Server Error
    From RFC 2068 Hypertext Transfer Protocol -- HTTP/1.1:
    10.5.1 500 Internal Server Error
    The server encountered an unexpected condition which prevented it from fulfilling the request.
    what am i doing wrong?
    Thank you for reading this post.
    Thanks,
    R

    The schema/user is system and the password is alphanumeric, upper/lower case, no special characters.
    As a workaround I was able to add the password to the entry in the datasources.xml file and restart BI Publisher.
    The JDBC connection is working; I can test the connection successfully, it just won't save the entry.
    Edited by: tread on Oct 17, 2011 11:55 AM

  • "logon failed" error when connecting to XML data source

    I have an HTTP source that generates XML. I have a schema that describes this XML. If I save the XML to a local file it works fine as a data source with the schema. However, when I try to access the same data via HTTP, I get the following error:
    Logon failed.
    Details: Cannot open file
    Server returned HTTP response code: 401 for URL: http://localhost:8004/report.xqy?Validate%20XML=0&Use%20WS-Security%20Config%20File=&WS-Security%20File%20Location
    However, if I enter that URL in a browser it works fine. And even if I disable security on the HTTP source, it produces the same error.
    I have tried a local HTTP source as well as the same source running on Amazon EC2.
    Any ideas?

    Thanks for the help.
    I am new to crystal reports, so I do not understand your suggestion. Both of those points look like they are related to Adobe Flash.
    I am trying to create a report using the standard report creation wizard:
    1) new, standard report
    2) create  new connection
    3) xml and web services
    4) xml data source
    When using the sample from the SAP site, it works fine:
    http://resources.businessobjects.com/support/downloads/samples/cr/customer_db/customer.xml
    When connecting to my source from a browser, it works fine. When using my source as a local document instead of over http it works fine. It is only when connecting to my source over http that I get the failed logon error.
    Kelly

  • Error when deploying JDBC file data source for J2EE agent in weblogic

    Hi ODI experts!
    we're using ODI 11g and we've installed J2EE agents on weblogic. We need to configure a connection pool within weblogic for a file data source. Thus, we've added snpsFile.jar (that was shipped with ODI 10g...) to the weblogic config so that we can configure a JDBC connection to file.
    When we validate the config, an error occurs (see below).
    Obviously, the driver is found but cannot be deployed.
    ####<4 août 2011 09 h 39 CEST> <Error> <Deployer> <ppdlodi001> <SRVPPDODI_001> <[ACTIVE] ExecuteThread: '2' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <c53bfd6260db74bb:-485a441a:13193b42397:-7ffc-0000000000000027> <1312443551325> <BEA-149265> <Failure occurred in the execution of deployment request with ID '1312443550020' for task 'weblogic.deploy.configChangeTask.6'. Error is: 'weblogic.application.ModuleException: '
    weblogic.application.ModuleException:
    at weblogic.jdbc.module.JDBCModule.prepare(JDBCModule.java:290)
    at weblogic.application.internal.flow.ModuleListenerInvoker.prepare(ModuleListenerInvoker.java:199)
    at weblogic.application.internal.flow.DeploymentCallbackFlow$1.next(DeploymentCallbackFlow.java:508)
    at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:41)
    at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:149)
    at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:47)
    at weblogic.application.internal.BaseDeployment$1.next(BaseDeployment.java:1223)
    at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:41)
    at weblogic.application.internal.BaseDeployment.prepare(BaseDeployment.java:367)
    at weblogic.application.internal.SingleModuleDeployment.prepare(SingleModuleDeployment.java:43)
    at weblogic.application.internal.DeploymentStateChecker.prepare(DeploymentStateChecker.java:154)
    at weblogic.deploy.internal.targetserver.AppContainerInvoker.prepare(AppContainerInvoker.java:60)
    at weblogic.deploy.internal.targetserver.operations.ActivateOperation.createAndPrepareContainer(ActivateOperation.java:208)
    at weblogic.deploy.internal.targetserver.operations.ActivateOperation.doPrepare(ActivateOperation.java:98)
    at weblogic.deploy.internal.targetserver.operations.AbstractOperation.prepare(AbstractOperation.java:217)
    at weblogic.deploy.internal.targetserver.DeploymentManager.handleDeploymentPrepare(DeploymentManager.java:749)
    at weblogic.deploy.internal.targetserver.DeploymentManager.prepareDeploymentList(DeploymentManager.java:1216)
    at weblogic.deploy.internal.targetserver.DeploymentManager.handlePrepare(DeploymentManager.java:218)
    at weblogic.deploy.internal.targetserver.DeploymentServiceDispatcher.prepare(DeploymentServiceDispatcher.java:160)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.doPrepareCallback(DeploymentReceiverCallbackDeliverer.java:171)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.prepare(DeploymentReceiverCallbackDeliverer.java:41)
    at weblogic.deploy.service.internal.statemachines.targetserver.AwaitingContextUpdateCompletion.callDeploymentReceivers(AwaitingContextUpdateCompletion.java:164)
    at weblogic.deploy.service.internal.statemachines.targetserver.AwaitingContextUpdateCompletion.handleContextUpdateSuccess(AwaitingContextUpdateCompletion.java:66)
    at weblogic.deploy.service.internal.statemachines.targetserver.AwaitingContextUpdateCompletion.contextUpdated(AwaitingContextUpdateCompletion.java:32)
    at weblogic.deploy.service.internal.targetserver.TargetDeploymentService.notifyContextUpdated(TargetDeploymentService.java:225)
    at weblogic.deploy.service.internal.DeploymentService$1.run(DeploymentService.java:190)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:528)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused By: weblogic.common.ResourceException: com.sunopsis.jdbc.driver.file.a.i
    at weblogic.jdbc.common.internal.ConnectionEnvFactory.createResource(ConnectionEnvFactory.java:263)
    at weblogic.common.resourcepool.ResourcePoolImpl.makeResources(ResourcePoolImpl.java:1193)
    at weblogic.common.resourcepool.ResourcePoolImpl.makeResources(ResourcePoolImpl.java:1117)
    at weblogic.common.resourcepool.ResourcePoolImpl.start(ResourcePoolImpl.java:244)
    at weblogic.jdbc.common.internal.ConnectionPool.doStart(ConnectionPool.java:1110)
    at weblogic.jdbc.common.internal.ConnectionPool.start(ConnectionPool.java:147)
    at weblogic.jdbc.common.internal.ConnectionPoolManager.createAndStartPool(ConnectionPoolManager.java:386)
    at weblogic.jdbc.common.internal.ConnectionPoolManager.createAndStartPool(ConnectionPoolManager.java:326)
    at weblogic.jdbc.module.JDBCModule.prepare(JDBCModule.java:254)
    at weblogic.application.internal.flow.ModuleListenerInvoker.prepare(ModuleListenerInvoker.java:199)
    at weblogic.application.internal.flow.DeploymentCallbackFlow$1.next(DeploymentCallbackFlow.java:508)
    at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:41)
    at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:149)
    at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:47)
    at weblogic.application.internal.BaseDeployment$1.next(BaseDeployment.java:1223)
    at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:41)
    at weblogic.application.internal.BaseDeployment.prepare(BaseDeployment.java:367)
    at weblogic.application.internal.SingleModuleDeployment.prepare(SingleModuleDeployment.java:43)
    at weblogic.application.internal.DeploymentStateChecker.prepare(DeploymentStateChecker.java:154)
    at weblogic.deploy.internal.targetserver.AppContainerInvoker.prepare(AppContainerInvoker.java:60)
    at weblogic.deploy.internal.targetserver.operations.ActivateOperation.createAndPrepareContainer(ActivateOperation.java:208)
    at weblogic.deploy.internal.targetserver.operations.ActivateOperation.doPrepare(ActivateOperation.java:98)
    at weblogic.deploy.internal.targetserver.operations.AbstractOperation.prepare(AbstractOperation.java:217)
    at weblogic.deploy.internal.targetserver.DeploymentManager.handleDeploymentPrepare(DeploymentManager.java:749)
    at weblogic.deploy.internal.targetserver.DeploymentManager.prepareDeploymentList(DeploymentManager.java:1216)
    at weblogic.deploy.internal.targetserver.DeploymentManager.handlePrepare(DeploymentManager.java:218)
    at weblogic.deploy.internal.targetserver.DeploymentServiceDispatcher.prepare(DeploymentServiceDispatcher.java:160)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.doPrepareCallback(DeploymentReceiverCallbackDeliverer.java:171)
    at weblogic.deploy.service.internal.targetserver.DeploymentReceiverCallbackDeliverer.prepare(DeploymentReceiverCallbackDeliverer.java:41)
    at weblogic.deploy.service.internal.statemachines.targetserver.AwaitingContextUpdateCompletion.callDeploymentReceivers(AwaitingContextUpdateCompletion.java:164)
    at weblogic.deploy.service.internal.statemachines.targetserver.AwaitingContextUpdateCompletion.handleContextUpdateSuccess(AwaitingContextUpdateCompletion.java:66)
    at weblogic.deploy.service.internal.statemachines.targetserver.AwaitingContextUpdateCompletion.contextUpdated(AwaitingContextUpdateCompletion.java:32)
    at weblogic.deploy.service.internal.targetserver.TargetDeploymentService.notifyContextUpdated(TargetDeploymentService.java:225)
    at weblogic.deploy.service.internal.DeploymentService$1.run(DeploymentService.java:190)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:528)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Thanks in advance for your help.
    Fredd.

    Hi!
    Some information: I found this website [http://gerardnico.com/doc/odi/webhelp/en/refmanual/connexion/jdbcdriversample.htm] explaining that the 'old' snpsFile.jar driver was deprecated and that a new one exists, but I can't find it anywhere!
    My questions are:
    - Can the problem (original problem. See initial post) be due to driver version?
    - Does someone know where to find this 'new' driver?
    Thanks for your help.
    Fredd.
    Edited by: fredd2 on Aug 5, 2011 10:04 AM
    Edited by: fredd2 on Aug 5, 2011 10:07 AM

  • Short dump error report

    What is the purpose of the short dump error report generated by the system? What are the major parts/sections of this report?

    Hi, please look through this:
    Whenever any transaction or process is executed, it runs ABAP code.
    If there is any issue in executing the code, the situation is called an "ABAP dump" or "short dump".
    Transaction ST22 is used to check any ABAP dumps within a given period.
    You can also fix the issue with the help of the ABAP editor (transaction SE38) if you are good with ABAP.
    ST22 is not limited to the current day and the previous day: if you click the Selection button you can select previous days as well. You can select back as far as the last 14 days if you are running the SAP_REORG_ABAPDUMPS job (program RSSNAPDL), which deletes the old ABAP dumps every day, and possibly further back if you run it weekly or monthly.
    Dear Sreenivas,
    If you receive an error message in the R/3 system log (SM21), see a terminated update in the update service analysis transaction (SM13), or get a DB-related error, check for dumps using the dump analysis transaction (ST22), or choose Tools -> Administration -> Monitoring -> Dump analysis.
    Transaction ST22 enables you to analyze short dumps from the current and previous day.
    The dump analysis function shows you:
    What happened
    What you can do
    How to correct the error
    The dump analysis function also provides an error ID and keywords that you
    can use to search in SAPNet, as well as information about:
    The system environment
    Users and transactions
    Transaction ST22 enables you to analyze the following data:
    Date, time, user, client
    Contents of system and data fields
    Contents of internal tables and application tables
    ST22 is a very useful feature in SAP. This is where all the system errors and user mistakes show up. If there is a problem that you cannot solve and you contact SAP support, they look at this area to analyze it. It gives a lead on where to handle the problem.
    The main sections of a short dump report are:
    Error analysis
    How to correct the error
    System environment
    User, transaction...
    Information on where termination occurred
    Contents of system fields
    Chosen variables
    Active calls / events
    Internal notes
    Active calls in SAP kernel
    List of ABAP programs affected
    List of internal tables
    Directory of application tables (contents)
    Directory of data areas (administration information)
    Directory of data areas (contents)
    ABAP control blocks CONT
    End of runtime analysis
    I think this would be helpful.
    Regards,
    Sravani

  • Webelements : does not support when i add the data sourcce& view in HTML

    hi Masters,
    I am using SAP Crystal Reports 2008. When I add the data source, add tables to the report, and use WebElements functions as my selection screen, the selection screen does not show; the script remains as plain text. When I take out my data source, it works when I view it in the HTML viewer.
    How can I make it work while still using the data source?
    thank you,
    pasala

    hi Pasala,
    This is due to the SAP integration kit, which does not support pass-through HTML; WebElements require pass-through HTML in order to show up as HTML.
    Here's what you can try though:
    1) Install the latest SAP integration kit on your BOE system machine(s).
    2) If the above does not solve the issue, follow the steps at the bottom of [this thread|WebElements: HTML not rendering when using Bex Query as Source].
    3) If 2 above doesn't work, then see [this thread |Re: WebElements only show HTML tags (even after following WebElement guides)] as a last resort.
    Hopefully step 1 or step 2 will solve the issue.
    jamie

  • Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "Excel_Test"

    Hi,
    I have created a linked server called Excel_Test and it is working fine (for retrieving data from Excel into a SQL Server database table).
    I have written a stored procedure for inserting data; when I execute that stored procedure it inserts data into the table I have given.
    The data in the table should then be shown in the application, but I am getting the following error:
    Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "Excel_Test".
    Please help me on this.
    Thanks in advance.

    I am even able to retrieve the data from SQL Server Management Studio.
    The same retrieved data should be shown in the application (dashboard).
    Query: Select * from OpenQuery(Excel_Test,'Select * from [Sheet1$]')
    The output is exactly the data in the Excel sheet.
    But when I open the application where the above data should be shown, I get the error below:
    Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "Excel_Test".
    Is this error because of a problem in the .NET coding or in the linked server?
    Thanks in advance.

  • Data in PSA content is different than the data source

    Dear Experts,
    I am using a process chain to update an InfoCube from a data source based on a Z table. In the process chain, the PSA request step has the PSA table number of the correct data source. The InfoPackage and the DTP also have the correct data source.
    But after executing the process chain, the InfoCube is not updated with data from the required data source. It is updated with data from some other data source which belongs to a different custom table.
    I have checked the extraction of the data source in RSA3; it extracts from the correct data source. But when I check the PSA contents through the process monitor in the process chain log for the DTP, I find that the PSA content is different from the data source.
    I am totally confused how this is possible. Can anybody advise me how to troubleshoot and fix this issue?
    Thanks and regards
    Murugesan

    Dear Igor,
    The PSA request is being deleted in a previous step prior to loading the InfoPackage. Every time I run the process chain, I can see that the old request number is deleted and a new request number is created.
    Still, the same issue persists.
    I also observed one more thing: the second time we repeat the same process chain, the correct data source data is loaded into the InfoCube and the PSA table. Only the first time are the PSA table and InfoCube not updated with the right data.
    thanks and regards
    Murugesan
