Unable to interpret 1,000.000 as a number while posting IDOCs

Hi All,
I know this query has been asked multiple times, but I have not yet found the right answer, so I am posting it again. We are posting an IDoc from SAP PI to SAP ECC using the SAP PI IDoc adapter. One specific field in the IDoc carries the value 1000.00, but when the IDoc is received by SAP the value arrives as 1,000.000. As far as I understand, this is because the user configured in the RFC destination in SAP PI (which sends the IDoc to SAP) has the Decimal Notation 1,234,567.89 in its default profile, and hence the value appears as 1,000.000.
The IDoc is then not posted in SAP, and the issue is: "Immediately processing not possible: Unable to interpret 1,000.000 as a number".
As a PI consultant I am not very familiar with ABAP, so I would appreciate it if experts could guide me on how to resolve this issue. Thanks for taking the time to read through it.
Thanks,
Pratik

Please check the inbound parameters at the ABAP end.
In WE20 the message type will be assigned. Double-click it and you will get the process code; double-click that and you will get the required function module.
If the inbound function module is a custom (Z) one, please make the necessary change there.
OR
Else, try WE19 to reprocess the existing IDoc:
Execute it, choose the inbound function module option, and call it in debugging mode to check where the error is happening.
Hope it helps.
Thanks!
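The usual fix is to normalize the formatted string before the numeric conversion, in line with the Decimal Notation user default (SU01/SU3). Here is a language-agnostic sketch of that normalization in Python (the function name and the notation strings mirroring the SU3 options are illustrative, not SAP APIs):

```python
def normalize_sap_number(raw, decimal_notation="1,234,567.89"):
    """Strip thousand separators and unify the decimal sign before conversion.

    decimal_notation mirrors the SU3 'Decimal Notation' user default:
      "1,234,567.89" -> comma is the thousand separator, period the decimal sign
      "1.234.567,89" -> period is the thousand separator, comma the decimal sign
    """
    s = raw.strip()
    if decimal_notation == "1.234.567,89":
        s = s.replace(".", "").replace(",", ".")  # 1.000,00 -> 1000.00
    else:
        s = s.replace(",", "")                    # 1,000.00 -> 1000.00
    return float(s)

print(normalize_sap_number("1,000.000"))  # 1000.0
```

In ABAP the same effect is commonly achieved with REPLACE ALL OCCURRENCES plus CONDENSE before moving the value into a numeric field; alternatively, align the Decimal Notation of the RFC destination user so the sender does not emit separators at all.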

Similar Messages

  • HR : Unable to interpret "40#6411000000#7" as a number

    Hi,
    We upgraded our system from 4.6C to ECC 6.0 Unicode.
    In HR, there is a problem after uploading a file.
    DATA total TYPE p DECIMALS 2.
    DATA : BEGIN OF itab OCCURS 1500,
       clco(2),      "Posting key
       compte(10),   "Account number
       c_cout(8),    "Cost center
       montant(15),  "Amount
       texte(55),    "Line item text
    END OF itab.
    LOOP AT itab.
          ADD itab-montant TO  total.
    ENDLOOP.
    the output of itab-montant comes as 40#6411000000#7 so it can't be put in total...
    The error message is :
    Unable to interpret "40#6411000000#7" as a number.
    How can I solve this problem??
    It's very urgent! please help me...
    Tarick.

    Hi,
    As far as I understood, you uploaded some data from a file and it shows an "unable to interpret" error. When exactly are you getting this error, and what should the value look like instead of 40#6411000000#7? Can you be more specific? Should there be some values instead of the '#'?
    Regards,
    Sampath
    Edited by: sampath pilla on Apr 29, 2008 8:06 PM

  • Unable to interpret 0.00.00 as a number

    Hi Gurus,
    We are facing a peculiar issue while doing a delta load of the material master.
    We are bringing materials from R/3 to the CRM system as a delta load. In R/3, in SMQ1, we found that material master queues are stuck due to the error message "Unable to interpret 0.00.00 as a number".
    Please help us resolve this issue; points are sure and thanks in advance.
    Regards,
    Muki

    Hi Frederic,
    Thanks a lot; we are using CRM 4.0 and R/3 4.7.
    We have checked the SAP Notes mentioned by you; we are getting the error in the R/3 outbound queue.
    We have the same patch level in both systems:
    CRM:
    2005_1_620 SAPKIPYJ55
    In  R/3:
    2005_1_640 SAPKIPYJ65
    Do you have any idea?
    Regards,
    muki

  • Unable to Interpret 0é`Xú_ as a number in CJ20N

    Hi experts,
    I have only one user with this problem, every time she tries to get into the Tcode CJ20N, she receives a DUMP, saying this:
    CONVT_NO_NUMBER
    Unable to Interpret 0é`Xú_ as a number.
    SAP Program is SAPLCNPB_W
    Does anyone know how I can solve this issue?
    Thanks in advance.

    Hello,
    If this is happening for just one user, you might want to check her parameters in SU01...
    Rgds
    Martina

  • Unable to interpret " 1,100.0 " as a number

    Hi all
    I'm getting this error when I try to print a customized SAPscript form. The SAPscript calls a form routine from an include file.
    Here's the code:
    FORM cal_weight TABLES in_tab  STRUCTURE itcsy
                           out_tab STRUCTURE itcsy.
      DATA: mate  TYPE string,        "Material no
            mqty  TYPE resbd-bdmng,   "Qty
            umret TYPE marm-laeng,    "Temp qty
            umrez LIKE marm-umrez,    "Base UoM qty
            umren LIKE marm-umren.    "Alt. UoM qty

      READ TABLE in_tab INDEX 1.
      MOVE in_tab-value TO mate.

      READ TABLE in_tab INDEX 2.
      MOVE in_tab-value TO mqty.      "<--- the error is pointing to here
      umret = mqty.

      SELECT SINGLE umrez umren
        INTO (umrez, umren)
        FROM marm
        WHERE matnr = mate
          AND meinh = 'KG'.

      umret = ( umret / umrez ) * umren.
      mqty = umret.

      READ TABLE out_tab INDEX 1.
      MOVE mqty TO out_tab-value.
      MODIFY out_tab INDEX sy-tabix.
    ENDFORM.
    And this is the call code in sapscript:
    DEFINE &V_WEIGHT& = ''.
    PERFORM CAL_WEIGHT IN PROGRAM ZFIR002
    USING &RESBD-MATNR&
    USING &RESBD-BDMNG&
    CHANGING &V_WEIGHT&
    ENDPERFORM.
    Please advise
    Thanks
    az

    Hi Santosh
    I've changed the MOVE to WRITE, but now I encounter this error in the include file:
    "MQTY" must be a character-type field (data type C, N, D or T)
    Any idea?
    Thanks
    az
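The ITCSY interface passes values as formatted display strings, so the routine has to parse in_tab-value before doing arithmetic and format the result back into out_tab-value. A sketch of that round trip in Python (the fixed conversion factors stand in for the MARM lookup and are illustrative only):

```python
def cal_weight(qty_str, umrez=1, umren=1):
    # Parse the display string: strip padding and thousand separators.
    qty = float(qty_str.replace(",", "").strip())  # " 1,100.0 " -> 1100.0
    # Unit-of-measure conversion, as in the FORM above.
    weight = qty / umrez * umren
    # Return a plain display string for out_tab-value.
    return "%.3f" % weight

print(cal_weight(" 1,100.0 "))  # 1100.000
```

In the ABAP form this corresponds to cleaning in_tab-value (REPLACE/CONDENSE) before the MOVE to mqty, and using WRITE ... TO a character variable to format the numeric result back into out_tab-value.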

  • ABAP DF Unable to interpret as a number

    I have a very simple ABAP data flow which joins SAP tables PA0001 and T528T together using PLANS.  If I use PA0001.PLANS in my output schema I receive the following error message:
    "RFC CallReceive error <Function Z_AW_ABAP_RUN: RFC_ABAP_MESSAGE- Unable to interpret 0    0.00 as a number."
    If I select T528T.PLANS in the output schema it works fine, but I have no idea what is causing this error message.

    Hi Tom, if the error text is correct, it seems a string is being loaded into an integer field. Change the target data type, or validate the value before it is loaded. I am not sure whether the failing value is "0    0.00 " (a string) or "0.00 " (a number, but not an integer). Regards.

  • SMQ2 failure during Initial load - Unable to interpret 9.990. as number

    hi All,
    I performed the initial download of customizing objects from R/3 (4.6C) to CRM 7.0.
    For object DNL_CUST_BASIS2 I received the "Unable to interpret 9.990.  as a number" error message.
    Could you give some hints how to solve this problem?
    I checked several posts with similar error messages, but none of them seemed to be related to my case.
    We just implemented note 777944 to update SMOFPARSFA in CRM to skip the Unicode check.
    I was able to download other customizing objects like DNL_CUST_BASIS, DNL_CUST_ACGRPB, DNL_CUST_ACGRPP, DNL_CUST_ADDR.
    My next step is to debug the queue, if you have further suggestions let me know.
    Thanks

    Hi,
    Please check my reply in the link
    Loading DNL_CUST_BASIS2: Unable to interpret "9.990. " as a number.
    Hope this will help.
    Thanks,
    Vikash.

  • Unable to interpret " 12,000.00 " as a number.

    Hi,
         can some one help me with this error:
    Unable to interpret " 12,000.00 " as a number.

    Hi Sanjay,
    [unable to interpret as number|unable to interpret as number]
    Here is a post in SCN which describes all the possible ways to solve your problem..
    Thanks & regards,
    Dileep .C

  • Lost my iPad password and unable to reactivate before 22,000,000 minutes...

    I would like to reboot the iPad. I've done a few hard reboots, but it doesn't change anything. I'm also unable to link the iPad to iTunes, because to do so I'm asked to enter the password, but I would have to wait 22,000,000 minutes first. How can I erase everything on the iPad and start again from scratch?

    You may need to put the iPad into recovery mode : http://support.apple.com/kb/ht1808 - you should then be able to reset the iPad and restore/resync your content to it.

  • Abap dump (Unable to interpret "65000.00  " as a number.)

    Hi everyone,
    I'm getting the following dump error:
    Unable to interpret "65000.00  " as a number.
    It previously worked fine; after some patches were moved from the production server to the quality server and from quality to production, this report alone is getting a dump.
    DATA: pono LIKE ekko-ebeln,
          poit LIKE ekpo-ebelp,
          sno  LIKE essr-lblni,
          pit  LIKE ekpo-ebelp,
          dis  LIKE konv-kbetr,
          frg  LIKE konv-kbetr,
          val  LIKE konv-kbetr,
          l_char1(20) TYPE c.   "declaration missing from the original post

    FORM tax TABLES intab  STRUCTURE itcsy
                    outtab STRUCTURE itcsy.
      READ TABLE intab INDEX 1.
      pono = intab-value.
      READ TABLE intab INDEX 2.
      poit = intab-value.
      READ TABLE intab INDEX 3.
      SNO = intab-value.
      READ TABLE intab INDEX 4.
      L_CHAR1 = intab-value.
    REPLACE all occurrences of ',' in l_char1 with ' '.
      VAL = L_CHAR1.      <----
    this is the line im getting dump.  l_char1 = 65,000.00
    thanks in advance
    karthe

    KARTHE wrote:
    > Unable to interpret "65000.00  " as a number.
    >   VAL = L_CHAR1.      <----
    this is the line im getting dump.  l_char1 = 65,000.00
    You mention "65000.00  " and "65,000.00", and I guess L_CHAR1 contains "65,000.00".
    If that's the case, remove the thousand separator from L_CHAR1 so that 65,000.00 becomes 65000.00.
    edit: basically what Pawan mentioned.
    Edited by: Maen Anachronos on Oct 5, 2010 1:48 PM
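Note that the REPLACE in the form above swaps each ',' for a space, so "65,000.00" becomes "65 000.00", which still fails the conversion; the separator has to be removed, not blanked. A quick Python sketch of the working variant:

```python
# Remove the thousand separator entirely (do not replace it with a space),
# then trim padding before converting.
l_char1 = " 65,000.00 "
val = float(l_char1.replace(",", "").strip())
print(val)  # 65000.0
```

The ABAP equivalent would be REPLACE ALL OCCURRENCES OF ',' IN l_char1 WITH '' followed by CONDENSE l_char1 NO-GAPS, before the assignment to VAL.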

  • We got error "System is unable to interpret the SSO ticket received"

    We got the error "System is unable to interpret the SSO ticket received" on the backend system (R/3 4.7).
    We use SAP Enterprise Portal 6.0 SP15.
    Please let us know what the problem is.
    We did the following steps in SAP EP:
    1. Created LOGON-METHOD (Logon Method : SAPLOGONTICKET).
    2. Export Public-key (verify.der) from SAP EP.
       (System Admin -> System config -> Keystore Administration -> Download)
    We did the following steps in SAP R/3:
    3. Start STRUSTSSO2 Transaction.
       In the PSE status frame on the left, choose the system PSE.
       In the certificate section, choose Import Certificate.
         The Import Certificate screen appears.
       Choose the File tab.
       In the File path field, enter the path of the portal's verify.der file.
       Set the file format to DER coded and confirm.
       In the Trust Manager, choose Add to PSE.
       Choose Add to ACL, to add the Portal Server to the ACL list.
       In the dialog box that appears, enter the following:
         system ID    :J2E
         client       :000
         Subject name :CN=J2E
         Issuer name  :CN=J2E
         Serial number:00
       Save your entry.
    4. Set profile parameters
       login/accept_sso2_ticket = 1
       login/create_sso2_ticket = 0
    5. Restart SAP R/3 system.

    Hi,
    I faced the same problem when I was configuring SSO.
    Check the portal client number in System Administration -> System Configuration -> UM Configuration -> Direct Editing
    and give the same client number in the dialog box that appears:
    system ID :J2E
    client :000
    Otherwise, check whether you are importing the certificate correctly:
    first export verify.der, then import it using STRUSTSSO2.
    Thanks
    Ajay

  • SM21 Errors DIA 000 000 SAPSYS D01 Transaction Canceled 00 152

    OK, I have done almost every search I can think of in service.sap.com, here, and in Google.  I have exhausted my mind... I cannot figure out what is causing the problem.  Here is what I am seeing:
    14:16:12 DIA 000 000 SAPSYS            R68 Perform rollback
    14:17:13 DIA 000 000 SAPSYS            D01 Transaction Canceled 00 152 ( )
    14:17:13 DIA 000 000 SAPSYS            R68 Perform rollback
    14:18:13 DIA 000 000 SAPSYS            D01 Transaction Canceled 00 152 ( )
    14:18:13 DIA 000 000 SAPSYS            R68 Perform rollback
    14:19:13 DIA 000 000 SAPSYS            D01 Transaction Canceled 00 152 ( )
    14:19:13 DIA 000 000 SAPSYS            R68 Perform rollback
    14:20:13 DIA 000 000 SAPSYS            D01 Transaction Canceled 00 152 ( )
    14:20:13 DIA 000 000 SAPSYS            R68 Perform rollback
    14:21:13 DIA 000 000 SAPSYS            D01 Transaction Canceled 00 152 ( )
    14:21:13 DIA 000 000 SAPSYS            R68 Perform rollback
    14:22:14 DIA 000 000 SAPSYS            D01 Transaction Canceled 00 152 ( )
    14:22:14 DIA 000 000 SAPSYS            R68 Perform rollback
    Double clicking I see:
    Documentation for system log message D0 1 :
    The transaction has been terminated.  This may be caused by a
    termination message from the application (MESSAGE Axxx) or by an
    error detected by the SAP System due to which it makes no sense to
    proceed with the transaction.  The actual reason for the termination
    is indicated by the T100 message and the parameters.
    Additional documentation for message 00                  152
    Name or password is incorrect. Please re-enter
    No documentation exists for message 00152
    As you can see, this is happening every minute.  I have checked the batch jobs and cannot see anything there.  I see that there is an incorrect name or password, but where?  I cannot dig it up.  This is a NetWeaver 04 XI 3.0 box running on an HP-UX platform.  XI seems to be running fine, so I don't want to reset all the Java passwords.  Is there any way to trace this down?  I set the trace on the dialog processes to 3 but I cannot decipher them.
    Please give me any advice, as I am stuck.  This is not stopping us from working, but it is getting hard to read through the logs in SM21 when this is non-stop.
    Best Regards,
    Paul

    Hi
    The error
    14:17:13 DIA 000 000 SAPSYS D01 Transaction Canceled 00 152 ( )
    14:17:13 DIA 000 000 SAPSYS R68 Perform rollback
    is basically a logon-time error: when someone tries to log on through SAPGUI in dialog mode, the logon takes place through a dialog work process under client 000.
    As the SM21 log further says:
    Name or password is incorrect. Please re-enter
    This is the message you get at the bottom of the GUI screen when you enter a wrong user ID or password, so this error often shows up in the SM21 log when a user enters invalid logon data.
    In your case it is occurring every minute, so there might be some process which initiates a dialog logon to the system every minute but is unable to complete it.
    That could be a CCMS monitoring component on a remote system, e.g. a Solution Manager system, which is monitoring your XI system and trying to log in to collect data.
    Please check if this is a possibility in your case.
    Hope this might help.
    Rahul

  • What happens if the iPod is said to be disabled for 27,000,000 minutes?

    What do you do if your iPod has been disabled for 27,000,000 minutes?

    Just what the user guide, and the dozen or so posts/replies to this question that users make every day, say:
    place the iPod in recovery mode and restore it via iTunes on your computer.  For recovery mode see:
    iPhone and iPod touch: Unable to update or restore

  • SSRS and a report of 3,000,000 rows

    I have SQL Server 2008 R2 with SSRS. I have created an SSRS report that may contain up to 3,000,000 rows.
    When I tried to generate such huge report I saw the following picture:
    The stored procedure (one that brings the data into the Report) worked 50 seconds
    After this, the SSRS ReportingServicesService.exe started to consume a lot of memory. Its working set grew to 11 GB. It took 6 minutes, and then the report generation failed with the following error message:
    An error has occurred during report processing. (rsProcessingAborted)
    There is not enough space on the disk.
    “There is not enough space on the disk.” – this was probably about the disk drive on that server where the Windows page file was mapped into. The drive had 14 GB of free space.
    A NOTE: the report was not designed as a single-page report. It is divided on pages by 40 rows. When I try to generate the same report with 10,000 rows – it takes just 1 minute.
    The question is: can this be fixed somehow?


  • Building secondary Index fails for large number(25,000,000) of records

    I am inserting 25,000,000 records of the type:
    Key --> Data
    [long, String, long] --> [{long, long}, {String}]
    using setSecondaryBulkLoad(true) and then building two secondary indexes on {long, long} and {String} of the data portion.
         private void buildSecondaryIndex(DataAccessLayer dataAccessLayer) {
             try {
                 SecondaryIndex<TDetailSecondaryKey, TDetailStringKey, TDetailStringRecord> secondaryIndex =
                     store.getSecondaryIndex(dataAccessLayer.getPrimaryIndex(),
                                             TDetailSecondaryKey.class, SECONDARY_KEY_NAME);
             } catch (DatabaseException e) {
                 throw new RuntimeException(e);
             }
         }
    It fails when I build the SecondaryIndex probably due to Java Heap Space Error. See the failure trace below.
    I do not face this problem when I deal with 250,000 records.
    Is there a workaround that does not require changing the JVM memory settings?
    Failure Trace:
    java.lang.RuntimeException: Environment invalid because of previous exception: com.sleepycat.je.RunRecoveryException
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.buildSecondaryIndex(TDetailStringDAOInsertTest.java:444)
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.insertCellSetInOneTxn(TDetailStringDAOInsertTest.java:280)
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.mainTest(TDetailStringDAOInsertTest.java:93)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at org.junit.internal.runners.TestMethodRunner.executeMethodBody(TestMethodRunner.java:99)
         at org.junit.internal.runners.TestMethodRunner.runUnprotected(TestMethodRunner.java:81)
         at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
         at org.junit.internal.runners.TestMethodRunner.runMethod(TestMethodRunner.java:75)
         at org.junit.internal.runners.TestMethodRunner.run(TestMethodRunner.java:45)
         at org.junit.internal.runners.TestClassMethodsRunner.invokeTestMethod(TestClassMethodsRunner.java:66)
         at org.junit.internal.runners.TestClassMethodsRunner.run(TestClassMethodsRunner.java:35)
         at org.junit.internal.runners.TestClassRunner$1.runUnprotected(TestClassRunner.java:42)
         at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
         at org.junit.internal.runners.TestClassRunner.run(TestClassRunner.java:52)
         at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:38)
         at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
    Caused by: Environment invalid because of previous exception: com.sleepycat.je.RunRecoveryException
         at com.sleepycat.je.dbi.EnvironmentImpl.checkIfInvalid(EnvironmentImpl.java:976)
         at com.sleepycat.je.log.LogManager.getLogEntry(LogManager.java:584)
         at com.sleepycat.je.txn.Txn.undo(Txn.java:713)
         at com.sleepycat.je.txn.Txn.abortInternal(Txn.java:631)
         at com.sleepycat.je.txn.Txn.abort(Txn.java:599)
         at com.sleepycat.je.txn.AutoTxn.operationEnd(AutoTxn.java:36)
         at com.sleepycat.je.Environment.openDb(Environment.java:505)
         at com.sleepycat.je.Environment.openSecondaryDatabase(Environment.java:382)
         at com.sleepycat.persist.impl.Store.openSecondaryIndex(Store.java:684)
         at com.sleepycat.persist.impl.Store.getSecondaryIndex(Store.java:579)
         at com.sleepycat.persist.EntityStore.getSecondaryIndex(EntityStore.java:286)
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.buildSecondaryIndex(TDetailStringDAOInsertTest.java:441)
         ... 22 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
         at java.util.HashMap.resize(HashMap.java:462)
         at java.util.HashMap.addEntry(HashMap.java:755)
         at java.util.HashMap.put(HashMap.java:385)
         at java.util.HashSet.add(HashSet.java:200)
         at com.sleepycat.je.txn.Txn.addReadLock(Txn.java:964)
         at com.sleepycat.je.txn.Txn.addLock(Txn.java:952)
         at com.sleepycat.je.txn.LockManager.attemptLockInternal(LockManager.java:347)
         at com.sleepycat.je.txn.SyncedLockManager.attemptLock(SyncedLockManager.java:43)
         at com.sleepycat.je.txn.LockManager.lock(LockManager.java:178)
         at com.sleepycat.je.txn.Txn.lockInternal(Txn.java:295)
         at com.sleepycat.je.txn.Locker.nonBlockingLock(Locker.java:288)
         at com.sleepycat.je.dbi.CursorImpl.lockLNDeletedAllowed(CursorImpl.java:2357)
         at com.sleepycat.je.dbi.CursorImpl.lockLN(CursorImpl.java:2297)
         at com.sleepycat.je.dbi.CursorImpl.fetchCurrent(CursorImpl.java:2227)
         at com.sleepycat.je.dbi.CursorImpl.getCurrentAlreadyLatched(CursorImpl.java:1296)
         at com.sleepycat.je.dbi.CursorImpl.getNextWithKeyChangeStatus(CursorImpl.java:1442)
         at com.sleepycat.je.dbi.CursorImpl.getNext(CursorImpl.java:1368)
         at com.sleepycat.je.Cursor.retrieveNextAllowPhantoms(Cursor.java:1587)
         at com.sleepycat.je.Cursor.retrieveNext(Cursor.java:1397)
         at com.sleepycat.je.SecondaryDatabase.init(SecondaryDatabase.java:182)
         at com.sleepycat.je.SecondaryDatabase.initNew(SecondaryDatabase.java:118)
         at com.sleepycat.je.Environment.openDb(Environment.java:484)
         at com.sleepycat.je.Environment.openSecondaryDatabase(Environment.java:382)
         at com.sleepycat.persist.impl.Store.openSecondaryIndex(Store.java:684)
         at com.sleepycat.persist.impl.Store.getSecondaryIndex(Store.java:579)
         at com.sleepycat.persist.EntityStore.getSecondaryIndex(EntityStore.java:286)
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.buildSecondaryIndex(TDetailStringDAOInsertTest.java:441)
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.insertCellSetInOneTxn(TDetailStringDAOInsertTest.java:280)
         at com.infobionics.ibperformance.TDetailStringDAOInsertTest.mainTest(TDetailStringDAOInsertTest.java:93)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    1. Does the speed of building the secondary index depend on the type of the data in the key? Will having integers in the secondary key, as opposed to strings, be better?
    The byte size of the key and data is significant of course, but the data type is not.
    2. How much are we bound by memory? Let's assume my memory setting is fixed.
    a. I know that with the current memory settings, if I set txn on, I get a Java heap error. So will I be limited in the size of the secondary index, or will it just get really slow, swapping tree information from the disk as it builds?
    No. The out-of-memory error was caused by a very large transaction that holds locks. When using small transactions or non-transactional access, you won't have this problem. In general, like most databases, JE writes and reads information to/from disk as needed.
    b. Is there any other way of speeding up the build of the secondary database?
    No; other than general performance tuning, nothing I know of.
    c. Will it be more beneficial not to bulk load when the data size gets large, so that the secondary database is built incrementally?
    It's up to you whether you want to pay the price during an initial load or incrementally.
    d. Do you think it will help to partition the original database into smaller databases using some criteria, and thus build smaller trees? The only weak point in this is that if we have to bulk load into one partition at some time, increasing its size, we may face the same problem again.
    Why? You can use deferred write or non-transactional access to load any size database. Face what problem?
    --mark
