Exception while loading data into the cache

I'm getting the following error while attempting to pre-populate the cache:
2010-11-01 16:27:21,766 ERROR [STDERR] (Logger@9229983 n/a) 2010-11-01 16:27:21.766/632.975 Oracle Coherence EE n/a <Error> (thread=DistributedCache, member=1): SynchronousListener cannot be added on the service thread:
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BinaryMap.addMapListener(PartitionedCache.CDB:14)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.addMapListener(PartitionedCache.CDB:1)
     at com.tangosol.coherence.component.util.SafeNamedCache.addMapListener(SafeNamedCache.CDB:27)
     at com.tangosol.net.cache.CachingMap.registerListener(CachingMap.java:1463)
     at com.tangosol.net.cache.CachingMap.ensureInvalidationStrategy(CachingMap.java:1579)
     at com.tangosol.net.cache.CachingMap.registerListener(CachingMap.java:1484)
     at com.tangosol.net.cache.CachingMap.get(CachingMap.java:487)
     at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.isEnabled(AbstractGpsCacheStore.java:54)
     at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.store(AbstractGpsCacheStore.java:83)
     at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.store(ReadWriteBackingMap.java:4783)
     at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.storeInternal(ReadWriteBackingMap.java:4468)
     at com.tangosol.net.cache.ReadWriteBackingMap.putInternal(ReadWriteBackingMap.java:1147)
     at com.tangosol.net.cache.ReadWriteBackingMap.put(ReadWriteBackingMap.java:853)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.put(PartitionedCache.CDB:98)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:41)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest.onReceived(PartitionedCache.CDB:90)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:11)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
     at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
     at java.lang.Thread.run(Thread.java:619)
2010-11-01 16:27:21,829 ERROR [STDERR] (pool-14-thread-3) Exception in thread "pool-14-thread-3"
2010-11-01 16:27:21,829 ERROR [STDERR] (Logger@9229983 n/a) 2010-11-01 16:27:21.766/632.975 Oracle Coherence EE n/a <Error> (thread=DistributedCache, member=1): Assertion failed: poll() is a blocking call and cannot be called on the Service thread
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.poll(Grid.CDB:5)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.poll(Grid.CDB:11)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$BinaryMap.get(PartitionedCache.CDB:26)
     at com.tangosol.util.ConverterCollections$ConverterMap.get(ConverterCollections.java:1559)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.get(PartitionedCache.CDB:1)
     at com.tangosol.coherence.component.util.SafeNamedCache.get(SafeNamedCache.CDB:1)
     at com.tangosol.net.cache.CachingMap.get(CachingMap.java:491)
     at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.isEnabled(AbstractGpsCacheStore.java:54)
     at com.jpm.ibt.primegps.cachestore.AbstractGpsCacheStore.store(AbstractGpsCacheStore.java:83)
     at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.store(ReadWriteBackingMap.java:4783)
     at com.tangosol.net.cache.ReadWriteBackingMap$CacheStoreWrapper.storeInternal(ReadWriteBackingMap.java:4468)
     at com.tangosol.net.cache.ReadWriteBackingMap.putInternal(ReadWriteBackingMap.java:1147)
     at com.tangosol.net.cache.ReadWriteBackingMap.put(ReadWriteBackingMap.java:853)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$Storage.put(PartitionedCache.CDB:98)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onPutAllRequest(PartitionedCache.CDB:41)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$PutAllRequest.onReceived(PartitionedCache.CDB:90)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:11)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
     at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
     at java.lang.Thread.run(Thread.java:619)
2010-11-01 16:27:21,922 ERROR [STDERR] (pool-14-thread-3) (Wrapped: Failed request execution for DistributedCache service on Member(Id=1, Timestamp=2010-11-01 16:20:09.372, Address=169.65.134.90:7850, MachineId=28250, Location=site:EMEA.AD.JPMORGANCHASE.COM,machine:WLDNTEC6WM754J,process:5416) (Wrapped: Failed to store key="9046019") poll() is a blocking call and cannot be called on the Service thread) com.tangosol.util.AssertionException: poll() is a blocking call and cannot be called on the Service thread

I'm a bit stumped, as my code doesn't call poll() anywhere, and this appears to be caused by the following in my CacheStore class:
public void store(Object key, Object value) {
    log.info("CacheStore currently " + isEnabled());
    if (isEnabled()) {
        throw new UnsupportedOperationException("Store method not currently supported");
    }
}

public boolean isEnabled() {
    return ((Boolean) CacheFactory.getCache(CacheNameEnum.CONTROL_CACHE.name()).get(ENABLED)).booleanValue();
}

The only thing I can think of is that maybe it has a problem calling a cache from within a CacheStore (if that makes sense). What I have is a CONTROL_CACHE which just stores a boolean value to indicate whether the store(), storeAll(), erase(), and eraseAll() methods should do anything. Is this correct?
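Both stack traces point at that isEnabled() call: store() runs on the cache service thread, and the get() against CONTROL_CACHE is a re-entrant cache call from that thread. The near cache tries to register a synchronous listener (first trace) and then to make a blocking poll() (second trace), neither of which is allowed there. One possible workaround, sketched below as an assumption rather than a tested fix, is to keep the flag in a volatile field maintained by a MapListener, so the service thread never calls into another cache:

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
import com.tangosol.util.MapEvent;
import com.tangosol.util.MultiplexingMapListener;

public class ControlFlag {
    // "CONTROL_CACHE" and the ENABLED key come from the original post;
    // the rest of this class is illustrative.
    private static final Object ENABLED = "ENABLED";

    private volatile boolean enabled;

    public ControlFlag() {
        // Assumes construction happens on an application thread (e.g. Spring
        // wiring spring-bean:partyCacheStore), where both the listener
        // registration and the initial get() are legal.
        NamedCache control = CacheFactory.getCache("CONTROL_CACHE");
        control.addMapListener(new MultiplexingMapListener() {
            protected void onMapEvent(MapEvent evt) {
                // an entry-deleted event has a null new value, so the flag goes false
                enabled = Boolean.TRUE.equals(evt.getNewValue());
            }
        }, ENABLED, false);
        enabled = Boolean.TRUE.equals(control.get(ENABLED));
    }

    public boolean isEnabled() {
        return enabled; // plain volatile read: safe on the cache service thread
    }
}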

Hi Jonathan,
I am trying to implement a write-behind cache but my configs may be wrong. The config for the cache with the cachestore looks like:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE cache-config SYSTEM "cache-config.dtd" >
<cache-config>
     <caching-scheme-mapping>
     <cache-mapping>
          <cache-name>PARTY_CACHE</cache-name>
          <scheme-name>party_cache</scheme-name>
     </cache-mapping>
     </caching-scheme-mapping>
     <caching-schemes>
          <near-scheme>
               <scheme-name>party_cache</scheme-name>
               <service-name>partyCacheService</service-name>
               <!-- a sensible default ? -->
               <thread-count>5</thread-count>
               <front-scheme>
                    <local-scheme>
                    </local-scheme>
               </front-scheme>
               <back-scheme>
                    <distributed-scheme>
                         <backing-map-scheme>
                              <read-write-backing-map-scheme>
                                   <internal-cache-scheme>
                                        <local-scheme>
                                        </local-scheme>
                                   </internal-cache-scheme>
                                   <cachestore-scheme>
                                        <class-scheme>
                                             <class-name>spring-bean:partyCacheStore</class-name>
                                        </class-scheme>
                                   </cachestore-scheme>
                              </read-write-backing-map-scheme>
                         </backing-map-scheme>
                    </distributed-scheme>
               </back-scheme>
               <autostart>true</autostart>
          </near-scheme>
     </caching-schemes>
</cache-config>

and the control cache config looks like:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE cache-config SYSTEM "cache-config.dtd" >
<cache-config>
     <caching-scheme-mapping>
     <cache-mapping>
          <cache-name>CONTROL_CACHE</cache-name>
          <scheme-name>control_cache</scheme-name>
     </cache-mapping>
     </caching-scheme-mapping>
     <caching-schemes>
          <near-scheme>
               <scheme-name>control_cache</scheme-name>
               <service-name>controlCacheService</service-name>
               <!-- a sensible default ? -->
               <thread-count>5</thread-count>
               <front-scheme>
                    <local-scheme>
                         <high-units>100</high-units>
                    </local-scheme>
               </front-scheme>
               <back-scheme>
                    <distributed-scheme>
                         <backing-map-scheme>
                              <read-write-backing-map-scheme>
                              </read-write-backing-map-scheme>
                         </backing-map-scheme>
                    </distributed-scheme>
               </back-scheme>
               <autostart>true</autostart>
          </near-scheme>
     </caching-schemes>
</cache-config>

They have different service names, but I'm guessing this isn't what you mean. I thought I was using write-behind, but again I'm guessing my config is not correct?
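Two hedged observations on these configs (the snippet below is a sketch with illustrative values, not a verified fix). First, the read-write-backing-map-scheme has no <write-delay> element; without one, a read-write backing map runs write-through, not write-behind. Second, <thread-count> sits under <near-scheme> here; for store() to be executed by a worker thread instead of the cache service thread, it would normally go inside the <distributed-scheme>. Something along these lines for the back scheme:

<distributed-scheme>
     <service-name>partyCacheService</service-name>
     <!-- worker threads, so CacheStore calls run off the service thread -->
     <thread-count>5</thread-count>
     <backing-map-scheme>
          <read-write-backing-map-scheme>
               <internal-cache-scheme>
                    <local-scheme/>
               </internal-cache-scheme>
               <cachestore-scheme>
                    <class-scheme>
                         <class-name>spring-bean:partyCacheStore</class-name>
                    </class-scheme>
               </cachestore-scheme>
               <!-- a non-zero delay is what turns on write-behind -->
               <write-delay>10s</write-delay>
          </read-write-backing-map-scheme>
     </backing-map-scheme>
</distributed-scheme>

Even with worker threads, calling into a second near cache from inside a CacheStore can still trip the listener registration shown in the first stack trace, so keeping the control flag out of the store path (as sketched earlier) may be the safer option.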

Similar Messages

  • Error while loading data into the cube

    Hi,
    I loaded data into the PSA, and when I load the data into the cube through a Data Transfer Process, I get an error (red status).
    Through "Manage", I can see the request in red. How do I find the exact error, and what could be the possible reason for it?
    Also, can someone explain the Data Transfer Process (not in a process chain)?
    Regards,
    Sam

    Hi Sam,
    After you run the DTP (after clicking the Execute button), go to the monitor screen and press Refresh; you will find the logs there. Alternatively, on the request screen, next to the request number, there is a logs icon you can click.
    A DTP (Data Transfer Process) is used to transfer data from the PSA to data targets or InfoProviders such as a DSO or a cube. For more on DTPs, see:
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    In your case, check the date formats and special characters in the data.
    REGARDS
    @JAY

  • Short Dump while loading data into the GL ODS

    Hello Guyz
    I am getting a database error while loading the GL ODS in BW. I am performing an INIT load, and the error occurs during the activation stage:
    Database Error

    Hello Atul,
    PSAPTEMP has nothing to do with the PSA. It is a temporary storage space and is present in all systems.
    You can check this in transaction DB02 if you are using BW 3.x, or DB02OLD if you are using BI 7.0. Once in DB02/DB02OLD, you will see a block that lists tablespaces. In this block there is a push button for freespace statistics; click it and it will take you to a screen that shows the free space for all tablespaces.
    Hope this helps,
    Regards.

  • Error while loading data into External table from the flat files

    Hi,
    We have a data load in our project that feeds Oracle external tables with data from flat files (.bcp files) on Unix.
    While loading the data, we encounter the following error:
    Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: un) while loading data into table_ext
    Please let us know what needs to be done in this case to solve this problem.
    Thanks,
    Kartheek

    Kartheek,
    I used Google (mine still works).... please check those links:
    http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
    http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
    HTH,
    Thierry

  • Error while loading data into clob data type.

    Hi,
    I have created an interface to load data from an Oracle table into another Oracle table. The target table has an attribute with a CLOB data type. While loading data into the CLOB field, ODI gave the error below. I am using ODI 10.1.3.6.0.
    java.lang.NumberFormatException: For input string: "4294967295"
         at java.lang.NumberFormatException.forInputString(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
    Let me know if anyone come across and resolved this kind of issue.
    Thanks much,
    Nishit Gajjar

    Mr. Gajjar,
    You didn't mention which KMs you are using.
    Have a read of
    Re: Facing issues while using BLOB
    and
    Load BLOB column in Oracle to Image column in MS SQL Server
    Try again.
    And please mark the Correct/Helpful points on the answers too.

  • Error while loading data into cube

    hi BW gurus,
    Whenever I try to load data into the cube from a flat file, after scheduling I get a short dump in the BW system. I checked ST22, and it shows the exception add_partition_failed. Please help me sort out this problem. If you know the error recovery, please give me the answer in detail.
    I will assign points for good answers.

    This is what the note says:
    Symptom
    The process of loading transaction data fails because a new partition cannot be added to the F fact table.The loading process terminates with a short dump.
    Other terms
    RSDU_TABLE_ADD_PARTITION_ORA, RSDU_TABLE_ADD_PARTITION_FAILED, TABART_INCONSITENCY, TSORA, TAORA, CATALOG
    Reason and Prerequisites
    The possible causes are:
    SQL errors when creating the partition
    Inconsistencies in the Data Dictionary control tables TAORA and TSORA
    Solution
    BW 3.0A & BW 3.0B
    In the case of SQL errors: analyze the SQL code in the system log or short dump and, if possible, eliminate the cause. The cause is often a disk space problem or lock situations on the database catalog or, less frequently, the partitioning option in the ORACLE database is not installed.
    The most common cause of the problem is inconsistencies in the TAORA and TSORA tables. As of Support Package 14 for BW 3.0B/Support Package 8 for BW 3.1C, the TABART_INCONSITENCY exception is issued in this case. The reason is almost always missing entries in TSORA for the tablespaces of the DDIM, DFACT and DODS data classes.
    The TAORA table contains the assignment of data classes to data tablespaces and their attributes, for example:
    Data class Tablespace
    DDIM PSAPDIMD ........
    DFACT PSAPFACTD ........
    DODS PSAPODSD .......
    For each data tablespace, the TSORA table must contain an entry for the corresponding index tablespace, for example:
    TABSPACE INDSPACE
    PSAPDIMD PSAPDIMD
    PSAPFACTD PSAPFACTD
    PSAPODSD PSAPODSD
    In most cases, these entries are missing and have to be added. See also notes 502989 and 46272.

  • Getting error while loading Data into ASO cube by flat file.

    Hi All,
    I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
    Does anyone have a solution?
    Regards,
    VM

    Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? And which versions of Essbase and ODI are you using?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Facing problem while loading data into XI

    Hi All,
    I am facing a peculiar problem while loading data into XI. When we load a small amount of data, say 5-10 records, the records get processed, but when we try to load a somewhat higher volume, say 100-500 records (which comes out to around 7 records per second), the record processing stalls and we get the message "recorded for outbound processing".
    Can anybody throw some light on why we are facing this problem and how we can resolve it?
    Thanks in Advance.
    Regards
    rahul

    Hi Rahul,
    Deregistering and registering the queue is a temporary solution.
    The better way would be to tune your XI system, as Vijaya already mentioned, following these guides:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7fdca26e-0601-0010-369d-b3fc87d3a2d9
    Regards,
    Bhavesh

  • Error while loading data into ODS

    Hi, we have BI for NW04S and CRM 4.0. While loading data for a CRM ODS, I received an error message I have never seen before. Yesterday's data load was smooth; I only see the issue while loading data today.
    The error message in the monitor is as follows
    "Port 'A000000003' does not exists in the table of Port Description' ( message ID E0, NO 31 ).
    Has anybody encountered this?
    Thanks
    Arun

    Thanks Marc,
    We found the table EDIPOA and found the port that is missing. It seems that the entry (row) for our CRM system, which has this port as a field in this table, is missing.
    Not sure whether to populate it manually or whether it is part of the source system repair, etc.
    Please let me know.
      Thanks
    Arunava

  • While uploading data into the r/3 using call transaction or session method

    hi experts
    While uploading data into R/3 using the call transaction or session method, if an error occurs in the middle of processing, how do these methods behave? Do they transfer the next records or not?

    Hi,
    Session method: the records are not added to the database until the session is processed. sy-subrc is not returned. Error logs are created for the error records, and the session and its log can be viewed in transaction SM35. Database updates are always synchronous.
    Call transaction method: the records are added to the database table immediately. sy-subrc is returned (0 if successful). Error logs are not created, so the errors need to be handled explicitly. Database updates are either synchronous or asynchronous.
    So with the session method, the error records end up in the system-generated error log, while with call transaction the program must capture the errors itself:
    First, declare an internal table with the structure of BDCMSGCOLL.
    Then, when writing the CALL TRANSACTION statement, use mode 'E' so that all the errors are captured.
    Finally, pass the captured errors to the internal table declared at the beginning, using the function module FORMAT_MESSAGE to turn them into readable messages.

  • Steps for loading data into the infocube in BI7, with dso in between

    Dear All,
    I am loading data into the InfoCube in BI7, with a DSO in between. My data flow looks like:
    Top to bottom:
    InfoCube (Customized)
    Transformation
    DSO (Customized)
    Transformation
    DataSource (Customized).
    The mapping and everything else looks fine and data is also seen in the cube on the FULL Load.
    But due to some minor error (I guess), I am unable to see the delta data in the DSO, although it is loaded into the DataSource through process chains.
    Kindly advise me on where I went wrong.
    Or, step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
    Regards,

    Hi,
    My first impulse would be to check whether the DSO is set to "direct update". In that case no delta is possible, because the change log is not maintained.
    My second thought would be to check the DTP moving data between the DSO and the target cube. If it is set to Full, you will not get a delta. Only one DTP can be created between the two, so if you created it in Full mode you can't switch it to Delta; create the DTP in Delta mode instead.
    Hope this helps.
    Kind regards,
    Jürgen

  • Runtime error while loading data into infocube

    Hi all,
    I am getting a runtime error while loading the transaction data into the InfoCube.
    Exception name: CX_SY_GENERATE_SUBPOOL_FULL
    Runtime error: GENERATE_SUBPOOL_DIR_FULL
    Anyone has any idea about this error? Any help will be really helpful. Thanks in advance.
    Regards,
    Sathesh

    Hi Sathesh,
    Please provide more details about the problem.
    You can check in ST22: go through the dump analysis and let us know.
    Regards
    Hemant Khemani

  • Error while loading data into 0pur_c03 using 2lis_02_scl

    Hi,
    While trying to load data into 0pur_c03 using DataSource 2lis_02_scl from R/3, the error message "The material 100 does not exist or not active" is displayed. I have loaded master data for 0Material and performed the change run as well, but the problem is still not resolved. Please suggest how to resolve this issue.
    Regards,
    Hari Prasad B.

    I have the same problem in my system. I found that material XXXXX had movements (sales documents and more), but at a certain moment the material was deleted in R/3, so it is not loaded into the 0MATERIAL InfoObject because it is marked for deletion.
    If you get this error, you must activate the load request manually, and your InfoCube can then hold the data without any problem.
    Regards

  • Error while loading Data into Essbase using ODI

    Hi,
    I am very new to ODI. I have installed ODI and am working in the demo environment only. I haven't done any configuration. I am using the Essbase technology that comes by default.
    I have created a sample outline in Essbase and a text file to load data into Essbase using ODI.
    My text file follows:
    Time     Market     Product     Scenario     Measures     Data
    Jan     USA     Pepsi     Actual     Sales     222
    I am getting an error. I have checked in Operator; it fails at step 6, i.e. "Integration SampleLoad data into essbase".
    Here is the description.
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C3_C1 ""Time"",C5_C2 ""Market"",C2_C3 ""product"",C6_C4 ""Scenario"",C1_C5 ""Measures"",C4_C6 ""Data"" from "C$_0Demo_Demo_genData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me to proceed further...

    Hi John,
    Here is the error message from the Execution tab:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 20, in ?
    java.sql.SQLException: Unexpected token: TIME in statement [select   C3_C1    ""Time]
         at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
         at org.hsqldb.jdbc.jdbcStatement.fetchResult(jdbcStatement.java:1598)
         at org.hsqldb.jdbc.jdbcStatement.executeQuery(jdbcStatement.java:194)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx4.f$0(<string>:20)
         at org.python.pycode._pyx4.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    java.sql.SQLException: java.sql.SQLException: Unexpected token: TIME in statement [select   C3_C1    ""Time]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
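    A hedged reading of this trace, going only by what the log shows: each alias in the generated statement is wrapped in doubled quotes (""Time""), so the parser sees an empty identifier followed by the bare word Time, which matches "Unexpected token: TIME". Assuming an hsqldb staging area, the same query with single pairs of quotes around the aliases would parse:

    select C3_C1 "Time", C5_C2 "Market", C2_C3 "product",
           C6_C4 "Scenario", C1_C5 "Measures", C4_C6 "Data"
    from "C$_0Demo_Demo_genData" where (1=1)

    Where the doubled quoting comes from (the KM, the topology, or the model's column aliases) is not visible in the post, so treat this as a pointer rather than a fix.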

  • Error while loading data into hierarchy

    Hi,
    I tried loading the master data from a flat file into a hierarchy, but it gives me an "invalid entry" error message and says the hierarchy doesn't exist. Could anyone please guide me on how to proceed?
    Thanks,
    Kuldeep.

    Hi,
    1. Create the hierarchy: go to Transfer structure -> Hierarchy structure -> Create hierarchy, and set the flags for whether it is sorted, time-dependent, etc.
    2. Assuming it is a flat-file load in CSV format, select IDoc as the upload method; then, while loading data through the InfoPackage, check that you have selected an appropriate hierarchy name on the hierarchy selection tab.
    This will hopefully solve your problem.
    Kind Regards,
    Ray
