SQLException "Expected 'EOF'" when extracting XMLType from ResultSet

Hi,
I am trying to implement a Java method that returns the nodes matching a given XPath expression. The query works and the code is mostly there; the problem is getting the information out of an XMLType in Java.
The query is as follows:
---8<--
select extract(res, '/xdb:Resource/xdb:Contents'||:xpath,
'xmlns:xdb="http://xmlns.oracle.com/xdb/XDBResource.xsd" '||:ns) node
from resource_view
where under_path(res, :path) = 1;
---8<--
xpath = '//n:PostalCode'
ns = 'xmlns:n="uri:cosmos:schema:company:1.0"'
path = '/home/contactmgr'
And the Java code:
---8<---
resultSet = statement.executeQuery();
while (resultSet.next()) {
    OPAQUE opaque = (OPAQUE) resultSet.getObject(1);
    if (opaque != null) {
        XMLType myNode = XMLType.createXML(opaque);
        String testString = myNode.getStringVal();
        Document testDocument = myNode.getDOM();
    }
}
---8<---
The testString contains the expected result (messy, but workable), i.e. "<PostalCode xmlns="uri:cosmos:schema:company:1.0">GU1 4LY</PostalCode>\n<PostalCode xmlns="uri:cosmos:schema:company:1.0">TW18 4AQ</PostalCode>\n". These are the two postal codes in the first document.
The call to myNode.getDOM() fails with an SQLException and a message of "Expected 'EOF'".
I appreciate that the returned xml is a fragment and as such it shouldn't be possible to return an org.w3c.dom.Document instance. However, when I try using myNode.getDocumentFragment() I still get an SQLException but with no message or any other details.
The method is currently expected to return a list of Nodes.
Any ideas?

The Oracle 10G XDB Developers Guide (http://download-west.oracle.com/docs/cd/B13789_01/appdev.101/b10790/xdb03usg.htm#sthref209) offers the following:
"The extract() function returns the node or nodes that match the XPath expression. Nodes are returned as an instance of XMLType. The results of extract() can be either a document or DocumentFragment."
To allow for the fact that an XPath expression may match multiple nodes in any given document, I modified the query to use xmlsequence(). This should then return an array of XMLType instances.
--8<--
select xmlsequence(extract(res, '/xdb:Resource/xdb:Contents'||?,
'xmlns:xdb="http://xmlns.oracle.com/xdb/XDBResource.xsd"
'||?)) nodes
from resource_view
where under_path(res, ?) = 1;
--8<--
The Java JDBC code then looks like this:
--8<--
OPAQUE[] myRawNodes = (OPAQUE[]) myResultSet.getArray(1).getArray();
for (int i = 0; i < myRawNodes.length; i++) {
    OPAQUE myOpaque = myRawNodes[i];
    if (myOpaque != null) {
        XMLType myNode = XMLType.createXML(myOpaque);
        myCollection.add(myNode.getDOM());
    }
}
--8<--
Testing this out however, I get the following exception:
java.sql.SQLException: Internal Error: makeJavaArray doesn't support type 2007
To reiterate, my goal is to:
a) evaluate an XPath expression that will return a list of nodes against all the documents under a particular path in the resource_view;
b) extract all the returned nodes using JDBC; and
c) return the collection of nodes to the caller.
Any ideas?
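For what it's worth, one possible workaround for goals (b) and (c), given that getStringVal() already returns the fragment text correctly: wrap the fragment string in a synthetic root element and parse it with the standard javax.xml.parsers API, then collect the root's children as nodes. This is only a sketch; the FragmentParser class and the <wrapper> element name are my own inventions, not part of Oracle's API.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class FragmentParser {
    /**
     * Wraps an XML fragment (possibly containing several top-level
     * elements) in a synthetic root so a DOM parser will accept it,
     * then returns the root's child nodes as a list.
     */
    public static List<Node> parseFragment(String fragment) {
        try {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setNamespaceAware(true);
            Document doc = factory.newDocumentBuilder().parse(new InputSource(
                    new StringReader("<wrapper>" + fragment + "</wrapper>")));
            List<Node> nodes = new ArrayList<Node>();
            NodeList children = doc.getDocumentElement().getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                nodes.add(children.item(i));
            }
            return nodes;
        } catch (Exception e) {
            throw new RuntimeException("Could not parse fragment", e);
        }
    }
}
```

Note that the returned nodes belong to the wrapper's Document; if you need them in another document you would have to run them through importNode() first.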

Similar Messages

  • Problem when extracting data from R/3 to BI

    Hi Experts,
    I am facing a strange problem while extracting data from R/3 to BI. One InfoPackage runs successfully and I can see the data on the BI side, but when I run another InfoPackage for a different DataSource it takes a long time and the load eventually fails.
    I have gone through SDN as well but was not able to find anything helpful.
    I think the reason might be the IDocs. The error it gives is:
    Request still running
    Diagnosis
    No errors found. The current process has probably not finished yet.
    System Response
    The ALE inbox of BI is identical to the ALE outbox of the source system
    or
    the maximum wait time for this request has not yet been exceeded
    or
    the background job has not yet finished in the source system.
    Current status
    All of the packets from the source system arrived
    I am not able to see the list of IDocs in SM58.
    What might be the reason?
    Regards,
    sridhar

    Hi,
    If it is a full load, schedule it again after deleting any old requests.
    Then go to the job overview in the source system; you can execute it there and check the status of the job.
    If the job is not there, it may have failed for some reason, or it may still be in scheduled status; check by selecting all active/cancelled/finished jobs.
    Also check under the Details tab, under Extraction, whether the data selection was scheduled, requested and finished.
    If not, try running it again and at the same time monitor the source system for jobs named BIREQU_*, selecting all cancelled/active/finished entries.
    rgds,

  • Error when extracting data from ETL - Missing messages and missing InfoIdoc

    Hi All,
    We are using BW 3.0 and extracting data from other source systems through Informatica PowerCenter (ETL). The system is working fine, but when we try to extract data on 31st Dec we get the following error. Only this load gives the error; all the other loads to other data targets are going fine. All the data is mapped one-to-one from ETL to BW.
    Error messages from Monitor -Status tab:-
       "InfoIdocs missing; external system
       Diagnosis :- InfoIDocs are missing. The IDocs are created in BW with non-SAP systems as source    
       systems that transfer the data and metadata to BW using BAPIs. Since the IDocs are missing, a   
       short dump in BW probably occurred.
       System response:  No selection information arrived from the source system"
    Error messages from Monitor -Details tab:-
        Missing message: Number of sent records,   Missing message: Selection completed
    I would highly appreciate your suggestions.
    Vinod.CH

    Hi Rathy Moorthy,
    Thank you very much for your reply. The source system connections are OK and we are able to load data to other data targets from ETL; we have an issue only with this particular load. The load extracts data, and I have loaded the data from ETL to PSA and checked the data content: it is correct. But when I update the same data to the target I get this error. I have also tried to update from PSA to the target and directly from ETL to the target.
    I appreciate your suggestions.

  • Error when extracting data from R/3 into BW

    Hi,
      There is an InfoObject Payscale Level (DataSource 0PAYSCALELV_ATTR) in the BW HR module that is extracting data from R/3. While extracting data, the following error occurs: "Assignment to feature PFREQ did not take place". I debugged the standard function module HR_BW_EXTRACT_IO_PAYSCALELV and I believe there is definitely an error in the data, but I am not able to find that record. Has anyone come across this error before? Please let me know if there is any solution to this error. The help for this error provides the following details, but I haven't understood it. Any help would be greatly appreciated.
    You can assign a specific operation (assignment) to each decision option
    within a feature. You can also define decision option ** whose
    assignment is effected for all unlisted decision options (this is known
    as a default value).
    Example:
    o                D PERSK
    o     K1           &ABRKS=D1,    <- D1 for employee subgroup K1
    o     K2           &ABRKS=D1,    <- D1 for employee subgroup K2
    o     K3           &ABRKS=D2,    <- D2 for employee subgroup K3
    o     **           &ABRKS=D0,    <- D0 for all other employee subgroups
    If you now create employee subgroup K4 and the default entry ** is
    missing, the system is unable to find a decision option for payroll area
    ABKRS.
    Procedure
        Define a decision option for K4 or for default value **.

    Hi ,
    The error is due to the existing customizing of feature PFREQ for MOLGA = 05: the value to be returned depends on WERKS, but apparently a line for the "Otherwise" case was never added.
    Do it as was already done for the case of MOLGA = 03, for example: add a line for MOLGA = 05 and "Otherwise", and the issue should be solved.
    If this does not solve your issue, then kindly check these OSS notes:
    1033423 -> 0PAYSCALELV_ATTR: Short dump feature_error
    842212   -> 0PAYSCALELV_ATTR: DataSources deliver wrong results
    92055     ->  RP_FROM_PERIOD_TO_PERIOD: FEATURE_ERROR dump
    Regards,
    Lokesh

  • Regexp problem when extracting filename from path

    I'm trying to extract filename and full path from string: \dir1\dir2\file.name
    import java.util.regex.*;
    public class Main {  
        private static final String REGEX = "(\\S*\\)*(\\S*)";
        private static final String INPUT = "\\dir1\\dir2\\file.name";
        public Main() {    }   
        public static void main(String[] args) {
           Pattern p = Pattern.compile(REGEX);
           Matcher m = p.matcher(INPUT); // get a matcher object
           int count = 0;      
           if(m.find()) {
               count++;
               System.out.println("group1: "+m.group(1));
               System.out.println("group2: "+m.group(2));
           }
        }
    }
    I expect group1 to be \dir1\dir2\ and group2 to be file.name.
    When I run the program, an exception is thrown:
    Exception in thread "main" java.util.regex.PatternSyntaxException: Unclosed group near index 12
    (\S*\)*(\S*)
    ^
    at java.util.regex.Pattern.error(Pattern.java:1650)
    at java.util.regex.Pattern.accept(Pattern.java:1508)
    at java.util.regex.Pattern.group0(Pattern.java:2460)
    at java.util.regex.Pattern.sequence(Pattern.java:1715)
    at java.util.regex.Pattern.expr(Pattern.java:1687)
    at java.util.regex.Pattern.compile(Pattern.java:1397)
    at java.util.regex.Pattern.<init>(Pattern.java:1124)
    at java.util.regex.Pattern.compile(Pattern.java:817)
    at javaapplication2.Main.main(Main.java:15)

    'jverd' is right but if you are just trying to split the INPUT into a path + filename then why not just look for the last '\'. i.e.
    int lastSlash = INPUT.lastIndexOf('\\');
    String group1 = INPUT.substring(0, lastSlash);
    String group2 = INPUT.substring(lastSlash+1);
    As it stands, your example will not give you group 1 as \dir1\dir2. For that you would need something like
    "((?:\\[^\\])*)(\\.*)"
    to make sure you capture all of the path in group 1.
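    For the record, here is a compiling variant of the regex approach (my own sketch, not the original poster's code). The key points: in a Java string literal a single regex backslash must be written as "\\\\", and a greedy first group naturally stops at the last backslash, leaving the filename for group 2.

    ```java
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class PathSplit {
        // "(.*\\)?([^\\]*)" as a regex: group 1 greedily takes everything up to
        // and including the last backslash (null if there is none); group 2
        // takes the remaining filename.
        private static final Pattern PATH = Pattern.compile("(.*\\\\)?([^\\\\]*)");

        public static String[] split(String input) {
            Matcher m = PATH.matcher(input);
            if (!m.matches()) {
                return null; // cannot happen for this pattern, but be explicit
            }
            return new String[] { m.group(1), m.group(2) };
        }
    }
    ```

    For "\\dir1\\dir2\\file.name" this yields \dir1\dir2\ and file.name, matching what the original poster expected.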

  • Error when extracting data from Data Source

    Hi All,
    I have a Generic datasource in BW which is based on a Function Module.When I am extracting the data from this source in RSA3 it is ending up in Run time error.
    the error message is
    The current ABAP program "SAPLYBWPU" had to be terminated because one of the
    statements could not be executed.
    and the error analysis is
    You wanted to add an entry to table "\FUNCTION-POOL=YBWPU\DATA=GT_HIERTAB2", which you declared with a UNIQUE KEY. However, there was already an entry with the same key.
    This may have been in an INSERT or MOVE statement, or within a
    SELECT ... INTO statement.
    In particular, you cannot insert more than one initial line into a
    table with a unique key using the INSERT INITIAL LINE... statement.
    Can anybody explain me how can I resolve this
    Thanks
    Sreeja

    Hello
    You can refer following link.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/a0f46157-e1c4-2910-27aa-e3f4a9c8df33&overridelayout=true

  • Failure when extracting data from R/3

    We have many InfoPackages running business content extractors from R/3 into PSA. Occasionally these extracts fail with the following message:
    Non-updated Idocs found in Source System
    Diagnosis
    IDocs were found in the ALE inbox for Source System that are not updated.
    Processing is overdue.
    Error correction:
    Attempt to process the IDocs manually. You can process the IDocs manually using the Wizard or by selecting the IDocs with incorrect status and processing them manually.
    The failure appears to be caused by the extract job in R/3 not starting; there is no BIREQU* job in SM37.
    When this situation occurs the BW job waits a length of time, in our case approximately 8 hours, before going red.
    Does anyone else suffer this kind of issue and, if so, is there any way of reducing the time elapsed before BW errors?

    Hi Terry,
    you can reduce the BW job wait time at InfoPackage level:
    RSA1 -> select your InfoPackage -> Schedule (menu) -> Timeout time -> Maximum wait time for this InfoPackage (here you can set 1 or 2 hrs).
    Best Regards.

  • Extracting xmltype from table into xml output:  redundant nodes

    Hi guys - see the below example.  (edit:   sorry, haven't used these forums for a while and can't seem to figure out how to format code.....)
    SQL> select *
      2    from v$version;
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
    PL/SQL Release 11.2.0.4.0 - Production
    CORE    11.2.0.4.0      Production
    TNS for Linux: Version 11.2.0.4.0 - Production
    NLSRTL Version 11.2.0.4.0 - Production
    SQL> create table mytesttable (item_id number, x_data xmltype);
    Table created.
    SQL>
    SQL> insert into mytesttable
      2  with dat as (select 'blah' item from dual
      3               union all
      4               select 'more blah' from dual)
      5  ,t as (select xmlelement("x_data"
      6                              ,xmlattributes(1 as "dummyattr")
      7                              ,xmlagg(xmlelement("item"
      8                                                ,item))
      9                              ) x_data
    10              from dat
    11              )
    12  select 1,x_data
    13    from t    ;
    1 row created.
    SQL>
    SQL> select *
      2    from mytesttable;
       ITEM_ID
    X_DATA
             1
    <x_data dummyattr="1">
      <item>blah</item>
      <item>more blah</item>
    </x_data>
    1 row selected.
    SQL>
    SQL> select xmlelement("outerelement"
      2                   ,xmlelement("itemID", item_id)
      3                   ,xmlelement("x_data",x_data)
      4                   )
      5     from mytesttable;
    XMLELEMENT("OUTERELEMENT",XMLELEMENT("ITEMID",ITEM_ID),XMLELEMENT("X_DATA",X_DATA))
    <outerelement><itemID>1</itemID><x_data><x_data dummyattr="1">
    <item>blah</item>
    <item>more blah</item>
    </x_data>
    </x_data></outerelement>
    1 row selected.
    As shown above, I have an XMLType column in a table that also has non-XMLType columns. I have a requirement to extract them into a single XML document, in a similar way to the last SQL statement there.
    The problem is that if I do it as above, I artificially add a redundant node: the first <x_data> node is not necessary. How do I add the XMLType column to the result without having to qualify it with another element around it?
    i.e. how do I get the following:
    <outerelement><itemID>1</itemID><x_data dummyattr="1">
    <item>blah</item>
    <item>more blah</item>
    </x_data></outerelement>

    nevermind.. I was having a dumb moment:
    select xmlelement("outerelement"
                     ,xmlelement("itemID", item_id)
                     ,x_data
                     )
      from mytesttable;

  • Losing Metadata when extracting image from stack

    Hi,
    I have a series of 70 images that I edited in colour and B/W. They were stacked with the original RAW file in stacks of 3. I decided to extract the B/W versions to collect them in a separate folder. Now I discovered that from most B/W images the IPTC data is gone (and I had extensive captions and keyword lists). Only the info I filled in when importing the files from the camera remains. The IPTC of the other images in the stacks (RAW and a colour TIFFs) are in some cases gone too. So in more than 50% of the cases I have lost all added data in all three images from each stack.
    Has anyone had similar problems? Is there any way to recover the data of the images where all data is gone?
    In some stacks at least one of the images still has the data. So I can copy that again.
    Thanks,
    Ronald

    I don't have a file to test with mdls but I usually use ExifTool for metadata work.  Here is a script I wrote a while ago that will display all of the metadata contained in the file.
    https://www.johneday.com/219/exiftool-service-for-quick-metadata-access

  • Is it possible to change the dimension order when extracting data from HFM?

    Hello,
    When accessing Consolidation/Extract/Data is it possible to change the dimension order of the extract rather than extracting the standard order?
    I can see you can change the dimension order visually by going to "Reorder dimensions", but this does not carry through to the actual extract.
    If it's not possible using this method, is there another way? I am trying to avoid using Smart View for this purpose.
    Many thanks and kind regards,
    Rich

    Hi,
    No, it is not possible to change the order type.
    -Paul

  • Error when extracting data from Hyperion enterprise 6.3.1

    I extracted data and at the end it returns an error for 2 entities. The problem is it is extracting data from an entity that doesn't exist. Any idea why?
    There is no journal posted to it.
    This is from the error log.
    !\\tdwm-aawhy-cp01\entdata\FSS\IFRS_YE_Rollover_R1\Extract Data AL_IFRS ACTMAVG.txt
    ACTMAVG
    1
    12
    D0786.ADJML,N250099.CDN.RCDN,1806525.000000,1755614.000000,1768445.000000,1780035.000000,,,,,,,,
    D0786.ADJML,N257099.CDN.RCDN,-16904.000000,-15093.000000,-15093.000000,-15093.000000,,,,,,,,
    D0786.ADJML,N607599.CDN.RCDN,17694.000000,17191.000000,17335.000000,17392.000000,,,,,,,,
    D0786.ADJML,P200099.CDN.RCDN,2644.000000,2500.000000,2514.000000,2529.000000,,,,,,,,
    D0786.ADJML,P200099.CDN.ROTH,4694.000000,4694.000000,4694.000000,4694.000000,,,,,,,,
    D0786.ADJML,P300599.CDN.RCDN,49977.000000,403.000000,12897.000000,24182.000000,,,,,,,,
    D0786.ADJML,P301199.CDN.RCDN,1750000.000000,1750000.000000,1750000.000000,,,,,,,,,
    D0786.ADJML,P304099.CDN.RCDN,,115.000000,582.000000,929.000000,,,,,,,,
    D0786.ADJML,P301189.CDN.RCDN,,,,1750000.000000,,,,,,,,
    D1124.ADJML,N250099.CDN.RCDN,360104.000000,352066.000000,354059.000000,356051.000000,,,,,,,,
    D1124.ADJML,P200099.CDN.RCDN,2075.000000,2000.000000,2008.000000,2015.000000,,,,,,,,
    D1124.ADJML,P200099.CDN.ROTH,39.000000,-7.000000,-7.000000,-7.000000,,,,,,,,
    D1124.ADJML,P203099.CDN.RCDN,0.000000,-1.000000,0.000000,0.000000,,,,,,,,
    D1124.ADJML,P300599.CDN.RCDN,7990.000000,66.000000,2047.000000,4028.000000,,,,,,,,
    D1124.ADJML,P301199.CDN.RCDN,350000.000000,350000.000000,350000.000000,,,,,,,,,
    D1124.ADJML,P304099.CDN.RCDN,,8.000000,11.000000,15.000000,,,,,,,,
    D1124.ADJML,P301189.CDN.RCDN,,,,350000.000000,,,,,,,,
    !\\tdwm-aawhy-cp01\entdata\FSS\IFRS_YE_Rollover_R1\Extract Data AL_IFRS ACTMENT.txt
    ACTMEND
    1
    12
    D0786.ADJML,N250099.CDN.RCDN,1806525.000000,1755614.000000,1768445.000000,1780035.000000,,,,,,,,
    D0786.ADJML,N257099.CDN.RCDN,-16904.000000,-15093.000000,-15093.000000,-15093.000000,,,,,,,,
    D0786.ADJML,N607599.CDN.RCDN,17694.000000,17191.000000,17335.000000,17392.000000,,,,,,,,
    D0786.ADJML,P200099.CDN.RCDN,2644.000000,2500.000000,2514.000000,2529.000000,,,,,,,,
    D0786.ADJML,P200099.CDN.ROTH,4694.000000,4694.000000,4694.000000,4694.000000,,,,,,,,
    D0786.ADJML,P300599.CDN.RCDN,49977.000000,403.000000,12897.000000,24182.000000,,,,,,,,
    D0786.ADJML,P301199.CDN.RCDN,1750000.000000,1750000.000000,1750000.000000,,,,,,,,,
    D0786.ADJML,P304099.CDN.RCDN,,115.000000,582.000000,929.000000,,,,,,,,
    D0786.ADJML,P301189.CDN.RCDN,,,,1750000.000000,,,,,,,,
    D1124.ADJML,N250099.CDN.RCDN,360104.000000,352066.000000,354059.000000,356051.000000,,,,,,,,
    D1124.ADJML,P200099.CDN.RCDN,2075.000000,2000.000000,2008.000000,2015.000000,,,,,,,,
    D1124.ADJML,P200099.CDN.ROTH,39.000000,-7.000000,-7.000000,-7.000000,,,,,,,,
    D1124.ADJML,P203099.CDN.RCDN,0.000000,-1.000000,0.000000,0.000000,,,,,,,,
    D1124.ADJML,P300599.CDN.RCDN,7990.000000,66.000000,2047.000000,4028.000000,,,,,,,,
    D1124.ADJML,P301199.CDN.RCDN,350000.000000,350000.000000,350000.000000,,,,,,,,,
    D1124.ADJML,P304099.CDN.RCDN,,8.000000,11.000000,15.000000,,,,,,,,
    D1124.ADJML,P301189.CDN.RCDN,,,,350000.000000,,,,,,,,

    If you are sure the entity doesn't exist, try using the purge entities feature. Go to the Entities module; from its main menu select Task -> Unowned Entities and run the Unknown Entities report. If the entity that is causing the problem is there, run the purge unowned entities task. Make sure you don't purge other entities that you may need.

  • Lost Decimal Places when extracting value from ODS into Cube

    Hi,
    In short, we wish to transfer data from an ODS to a custom cube. One of the Key Figures is becoming an integer in the process.
    Both the ODS and Cube use the same InfoObject which is defined as a Number with a Data Type of FLTP.
    If I set a break-point in the update rules, I can see that it is reading the value into the communication structure correctly. Yet, when I browse the data in the Cube, it seems to have lost the decimal places in the update.
    More details:
    We have a custom ODS and cube which needs to store a Unit Price to four decimal places. In creating the InfoObject for Unit Price, we could not use a Data Type of Currency (Type = Amount) as this is limited to 2 decimal places.
    Instead, I used a Data Type of FLTP (Floating Point) and a type of Number. All decimal places are loading correctly into the ODS, but are being lost when transferred to Cube.
    Many thanks
    Adrian

    Hi Simon and Emmanuel,
    Thanks for the suggestions.
    Unfortunately the Query is returning the same integers as LISTCUBE.
    Also, my key figure has the same definition in both the ODS and cube (uses the same InfoObject), and there is no transformation - just transfer.
    I'd prefer not to create another currency. It could get a bit messy if I'm having to include the figure in calculation with other figures that use the real currency.
    I had thought of making the key figure a standard type of Currency and multiplying the value by 100 before storing them in the cube - then I would simply divide the figures by 100 in the query. But again, not ideal.
    I have logged an OSS message with SAP to see if there is any known issue with transferring a floating point number between the ODS and Cube.
    Thanks
    Adrian
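    As a footnote on the workaround floated above (storing the value multiplied by a power of ten and dividing it back out in the query), the idea is plain fixed-point scaling, which avoids floating-point types entirely. A small Java illustration, assuming four decimal places are needed; the class and method names here are my own:

    ```java
    import java.math.BigDecimal;

    public class PriceScaling {
        static final int SCALE = 4; // four decimal places needed for unit price

        // Store the price as a scaled integer (e.g. 12.3456 -> 123456).
        public static long toStored(BigDecimal price) {
            return price.movePointRight(SCALE).longValueExact();
        }

        // Recover the original price in the query/report layer.
        public static BigDecimal fromStored(long stored) {
            return BigDecimal.valueOf(stored).movePointLeft(SCALE);
        }
    }
    ```

    Because the scaled value is an exact integer, no precision can be lost in transfer; the trade-off is that every consumer must remember to divide by the same factor.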

  • Error when Extracting Metadata from ECC

    Hello,
    DS 4.2 was working fine with ECC after the creation of the RFC connection, and I was able to pull some of the tables. But after the Basis team made some security changes to the objects, I am not able to pull metadata from ECC. It is showing the following error. Can anyone help me with this?

    Hi Navinchandar Selvaraj,
    As per their discussion:
    Yes, it is not best practice to assign SAP_ALL. You should create a new role with the right authorizations; please contact your Basis team and ask them to create a new role and grant the authorizations. In some cases, though, SAP recommends assigning the SAP_ALL authorization.
    Hope this will help you out
    Authorization Profile SAP_ALL - Identity Management - SAP Library
    This composite profile contains all SAP authorizations, meaning that a user with this profile can perform all tasks in the SAP system. You should therefore not assign this authorization profile to any of your users. We recommend that you create only one user with this profile. You should keep the password of this user secret (store it in a safe) and only use it in emergencies (see also Protective Measures for SAP*).
    Instead of using the SAP_ALL profile, you should distribute the authorizations it contains to the appropriate positions. For example, instead of assigning your system administrator (or superuser) the authorization SAP_ALL, assign him or her only those that apply to system administration, namely the S_* authorizations. These authorizations give him or her enough rights to administer the entire SAP system, without allowing him or her to perform tasks in other areas such as Personnel.
    Best Practice - How to analyze and secure RFC connections - Security and Identity Management - SCN Wiki
    Regards,
    Akhileshkiran

  • Error when extracting data from DSO to Cube using a DTP

    When I am extracting data from DSO to Cube using a DTP, I am getting the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) - (Message no. RSBK242).
    Data package processing terminated. (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
    This is a brand new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard business content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level, but when I try to pull the data up to the InfoProvider level (cube) using a DTP, I get the above error.
    My observation: whenever I use a DSO as the source for any target, such as another DSO or a cube, the same kind of error is thrown for any flow, including a simple flat-file load.
    Please suggest whether I need to maintain any basic settings on this new system.
    Please help me out on this issue; I am not able to move forward and it is very urgent.

    hello
    Have you solved the problem?
    I have the same error... if you have solved it, can you help me please?
    yimi castro garcia
    [email protected]

  • Short dump error when extracting from one of the datasource in R/3 to BW

    When extracting from one of the DataSources I am getting a short dump. Below is the relevant source code.
    Source code extract
    * Get boundaries of next TID block
    L_FROM_INDEX = L_TO_INDEX + 1.
    IF L_FROM_INDEX GT NFILL.  EXIT.  ENDIF.
    L_TO_INDEX   = L_TO_INDEX + L_BLOCK_SIZE.
    IF L_TO_INDEX GT NFILL.
      L_TO_INDEX = NFILL.
      L_BLOCK_SIZE = L_TO_INDEX - L_FROM_INDEX + 1.
    ENDIF.
    * Create hashed index on TID of TID table
    CLEAR L_TH_TID_IDX.
    LOOP AT TIDTAB FROM L_FROM_INDEX TO L_TO_INDEX.
      L_S_TID_IDX-TIDIX = SY-TABIX.
      L_S_TID_IDX-TID   = TIDTAB-TID.
      COLLECT L_S_TID_IDX INTO L_TH_TID_IDX.
    ENDLOOP.
    * Select TID block from STATE table
    SELECT * INTO TABLE L_T_STATE
           FROM ARFCSSTATE FOR ALL ENTRIES IN L_TH_TID_IDX
           WHERE ARFCIPID   EQ L_TH_TID_IDX-TID-ARFCIPID
             AND ARFCPID    EQ L_TH_TID_IDX-TID-ARFCPID
             AND ARFCTIME   EQ L_TH_TID_IDX-TID-ARFCTIME
             AND ARFCTIDCNT EQ L_TH_TID_IDX-TID-ARFCTIDCNT
           ORDER BY PRIMARY KEY.
    * Consistency check
    DESCRIBE TABLE L_T_STATE LINES L_LINES.
    IF L_LINES NE L_BLOCK_SIZE OR
       L_LINES EQ 0.
      MESSAGE X097(SY).
    ENDIF.
    PERFORM DELETE_BATCH_JOB
            USING    L_T_STATE
            CHANGING L_S_TID1.
    * Update LUW status and time
    CLEAR L_T_STATE_IDX.
    CLEAR L_TH_TID2_IDX.
    CLEAR L_T_TID.
    LOOP AT L_T_STATE INTO L_S_STATE.
      L_S_STATE_IDX-TABIX = SY-TABIX.

    Hi Pavan,
                     This is a tablespace error.
    Regards,
    rahul
