Strange data from cache

Hi,
Windows x64, TimesTen 11.2.2, direct connection from a .NET client. TT Cache maps an NCLOB field in Oracle to an NVARCHAR field in TimesTen.
Sometimes when I update the Oracle field with string data shorter than the previous value, after the cache refresh I get mixed data back from the TimesTen cache: the new, shorter string overlaid on the old one. This happens only with a direct connection.
It looks like a bug...
Regards, Andrey

So are you saying that if you use a client/server connection the problem does not occur? Do you see the same issue with ttIsql if you query the column value?
Which *exact* version are you using (ttVersion output)?
Chris
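
One quick check is to read the column back over a direct connection yourself and print the string lengths, then compare with ttIsql. The sketch below uses the TimesTen JDBC direct driver; the DSN, table, and column names are hypothetical stand-ins for the cached NVARCHAR column.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CheckCachedValue {
        public static void main(String[] args) throws Exception {
            // TimesTen direct-mode JDBC driver and URL; "my_dsn" is a placeholder DSN
            Class.forName("com.timesten.jdbc.TimesTenDriver");
            try (Connection con = DriverManager.getConnection("jdbc:timesten:direct:dsn=my_dsn");
                 Statement st = con.createStatement();
                 // hypothetical cache-group table/column mapped from the Oracle NCLOB
                 ResultSet rs = st.executeQuery("SELECT id, doc_text FROM cached_docs")) {
                while (rs.next()) {
                    String s = rs.getString(2);
                    // stale trailing characters from the previous value would show up here
                    System.out.println(rs.getInt(1) + " len=" + s.length() + " [" + s + "]");
                }
            }
        }
    }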

Similar Messages

  • 5 months rolling monthly report by fetching data from cache & Database

    Scenario: One monthly report is created at each month end; it contains one summary tab and several detail tabs. The summary tab contains the summary data for all months (the current month and previous months).
    Problem:
    The current month's data has to come directly from the database, while previous months' data comes from the cache; the whole result is then cached so that when we run next month's report, the data up to this month also comes from the cache.
    Can anyone please help me ASAP?

    What's wrong with F_GET_COMPANY_CODE? Below is similar code - try running it and see what you get:
    report zlocal_jc_t001w.
    tables:
      t001k,     "Valuation area
      t001w.     "Plants/Branches
    parameters:
      p_bukrs          like t001k-bukrs default '1000'.
    select-options:
      s_werks          for t001w-werks.
    start-of-selection.
      perform get_data.
    *&      Form  get_data
    form get_data.
      data:
        begin of gt_t001k occurs 10,
          bukrs             like t001k-bukrs,
          bwkey             like t001k-bwkey,
          werks             like t001w-werks,
        end of gt_t001k.
      select
        t001k~bukrs
        t001k~bwkey
        t001w~werks
        into corresponding fields of table gt_t001k
        from t001k as t001k
        inner join t001w as t001w on t001w~bwkey = t001k~bwkey
        where t001k~bukrs = p_bukrs
        and   t001w~werks in s_werks.
      loop at gt_t001k.
        write: /
          gt_t001k-bukrs,
          gt_t001k-bwkey,
          gt_t001k-werks.
      endloop.
    endform.                    "get_data
    As for links to locally defined database tables, that will depend on the keys defined in SE11; e.g. there will be a primary key plus relationships to other tables (for example, "WERKS LIKE YSDA_EXP_PRTLOG-YY_PLANT" indicates YY_PLANT relates to T001W).
    Jonathan

  • Bad Data from Cache after infoprovider update

    Hello,
    We have queries running off a MultiProvider consisting of 4 cubes. Occasionally, when we generate queries after deltas have been loaded into the cubes, we get incomplete or bad data. Deactivating the cache, or deleting the cache and running the query again, always seems to resolve the problem.
    Any thoughts on what the issue might be and how to resolve?
    Thanks
    Stan Pickford

    Stanley,
    after any data change in an InfoCube, table RSDINFOPROVDATA should contain the timestamp. This timestamp is used by the OLAP cache to determine whether new data has been loaded or the cache is still valid.
    I'm not aware of any problems in this area. If it is reproducible and you can't fix it, open a message with SAP Support.
    Regards,
    Marc
    SAP NetWeaver RIG

  • Error while reading data from cache

    Hi,
    I am implementing IPofSerializer for a C# object. I have successfully written the C# object, and I have also read the first object.
    But when I read the second object's last field, which is of type boolean, I get this error:
    System.IO.EndOfStreamException was unhandled by user code
    Message="Unable to read beyond the end of the stream."
    Source="mscorlib"
    StackTrace:
    at System.IO.__Error.EndOfFile()
    at System.IO.BinaryReader.ReadByte()
    at Tangosol.IO.DataReader.ReadPackedInt32()
    at Tangosol.IO.Pof.PofStreamReader.UserTypeReader.AdvanceTo(Int32 index)
    at Tangosol.IO.Pof.PofStreamReader.ReadInt32(Int32 index)
    at Tangosol.IO.Pof.PofStreamReader.ReadBoolean(Int32 index)
    at HyperRig.CachingService.Adapter.POFSerializers.MapMessageFieldPOFSerializer.Deserialize(IPofReader Reader) in D:\Hyper Rig\HyperRig.CachingService.Adapter\HyperRig.CachingService.Adapter.POFSerializers\MapMessageFieldPOFSerializer.cs:line 21
    at Tangosol.IO.Pof.PofStreamReader.ReadAsObject(Int32 typeId)
    at Tangosol.IO.Pof.PofStreamReader.ReadCollection[T](Int32 index, ICollection`1 coll)
    at HyperRig.CachingService.Adapter.POFSerializers.MapMessagePOFSerializer.Deserialize(IPofReader Reader) in D:\Hyper Rig\HyperRig.CachingService.Adapter\HyperRig.CachingService.Adapter.POFSerializers\MapMessageFieldPOFSerializer.cs:line 55
    at Tangosol.IO.Pof.PofStreamReader.ReadAsObject(Int32 typeId)
    at Tangosol.IO.Pof.PofStreamReader.ReadObject(Int32 index)
    at Tangosol.IO.Pof.ConfigurablePofContext.Deserialize(DataReader reader)
    at Tangosol.Net.Impl.RemoteNamedCache.ConverterFromBinary.Convert(Object o)
    at Tangosol.Util.ConverterCollections.ConvertArray(Object[] ao, IConverter conv)
    at Tangosol.Util.ConverterCollections.ConverterCollection.CopyTo(Array array, Int32 index)
    at Tangosol.Util.ConverterCollections.ConverterQueryCache.GetValues(IFilter filter)
    at Tangosol.Util.ConverterCollections.ConverterNamedCache.GetValues(IFilter filter)
    at Tangosol.Net.Impl.RemoteNamedCache.GetValues(IFilter filter)
    at Tangosol.Net.Impl.SafeNamedCache.GetValues(IFilter filter)
    at HyperRig.CachingService.Adapter.OracleCoherence.OracleCoherenceCachingLayer.ReadMultipleBySubjectName(String SubjectName, List`1& ReceivedMessageRecords, Object TransactionObject) in D:\Hyper Rig\HyperRig.CachingService.Adapter\HyperRig.CachingService.Adapter.OracleCoherence\OracleCoherenceCachingLayer.cs:line 444
    at HyperRig.CachingService.Adapter.OracleCoherence.OracleCoherenceCachingLayer.ReadMultipleBySubjectName(String SubjectName, List`1& ReceivedMessageRecords) in D:\Hyper Rig\HyperRig.CachingService.Adapter\HyperRig.CachingService.Adapter.OracleCoherence\OracleCoherenceCachingLayer.cs:line 461
    at HyperRig.DataManager.ETLEngine.DataCacheExtractor.Extract(Object transaction, DataTable& resultingDataTable)
    InnerException:
    I am using Oracle Coherence 3.5.3
    Regards
    Nitin Jain
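
    An EndOfStreamException in PofStreamReader is typically a symptom of the read side being out of step with the write side: properties read at different indexes, in a different order, or with different types than they were written, or a missing remainder call. That may or may not be the cause here, but as a reference, here is a minimal sketch of the required symmetry using the Java PofSerializer API (the MapMessageField shape is a hypothetical stand-in); the same index rules apply to the .NET IPofSerializer.

    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofSerializer;
    import com.tangosol.io.pof.PofWriter;
    import java.io.IOException;

    // Hypothetical stand-in for the poster's MapMessageField class.
    class MapMessageField {
        final String name;
        final String value;
        final boolean readOnly;
        MapMessageField(String name, String value, boolean readOnly) {
            this.name = name; this.value = value; this.readOnly = readOnly;
        }
    }

    public class MapMessageFieldSerializer implements PofSerializer {
        public void serialize(PofWriter out, Object o) throws IOException {
            MapMessageField f = (MapMessageField) o;
            out.writeString(0, f.name);       // indexes must be written in ascending order...
            out.writeString(1, f.value);
            out.writeBoolean(2, f.readOnly);
            out.writeRemainder(null);         // ...and the stream must always be terminated
        }

        public Object deserialize(PofReader in) throws IOException {
            // read exactly the same indexes, with the same types, in the same order
            String name = in.readString(0);
            String value = in.readString(1);
            boolean readOnly = in.readBoolean(2);
            in.readRemainder();               // consume any remaining properties
            return new MapMessageField(name, value, readOnly);
        }
    }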

    Hi Luk,
    Here is the server side config:
    <?xml version="1.0"?>
    <!--
    | Copyright (c) 2000, 2010, Oracle and/or its affiliates. All rights reserved.
    |
    | Oracle is a registered trademarks of Oracle Corporation and/or its affiliates.
    |
    | This software is the confidential and proprietary information of
    | Oracle Corporation. You shall not disclose such confidential and
    | proprietary information and shall use it only in accordance with the
    | terms of the license agreement you entered into with Oracle.
    |
    | This notice may not be removed or altered.
    -->
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>ACCOUNT</cache-name>
          <scheme-name>dist-default</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <distributed-scheme>
          <scheme-name>dist-default</scheme-name>
          <serializer>
            <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
            <init-params>
              <init-param>
                <param-type>string</param-type>
                <param-value>custom-types-pof-config.xml</param-value>
              </init-param>
            </init-params>
          </serializer>
          <backing-map-scheme>
            <local-scheme/>
          </backing-map-scheme>
          <autostart>true</autostart>
        </distributed-scheme>
        <proxy-scheme>
          <service-name>ExtendTcpProxyService</service-name>
          <thread-count>5</thread-count>
          <acceptor-config>
            <tcp-acceptor>
              <local-address>
                <address>localhost</address>
                <port>9099</port>
              </local-address>
            </tcp-acceptor>
            <serializer>
              <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
              <init-params>
                <init-param>
                  <param-type>string</param-type>
                  <param-value>custom-types-pof-config.xml</param-value>
                </init-param>
              </init-params>
            </serializer>
          </acceptor-config>
          <autostart>true</autostart>
        </proxy-scheme>
      </caching-schemes>
    </cache-config>
    and the client-side config is:
    <?xml version="1.0"?>
    <!--
    | Copyright (c) 2000, 2009, Oracle and/or its affiliates. All rights reserved.
    |
    | Oracle is a registered trademarks of Oracle Corporation and/or its affiliates.
    |
    | This software is the confidential and proprietary information of
    | Oracle Corporation. You shall not disclose such confidential and
    | proprietary information and shall use it only in accordance with the
    | terms of the license agreement you entered into with Oracle.
    |
    | This notice may not be removed or altered.
    -->
    <cache-config xmlns="http://schemas.tangosol.com/cache">
      <caching-scheme-mapping>
        <cache-mapping>
          <cache-name>ACCOUNT</cache-name>
          <scheme-name>extend-direct</scheme-name>
        </cache-mapping>
      </caching-scheme-mapping>
      <caching-schemes>
        <remote-cache-scheme>
          <scheme-name>extend-direct</scheme-name>
          <service-name>ExtendTcpProxyService</service-name>
          <initiator-config>
            <tcp-initiator>
              <remote-addresses>
                <socket-address>
                  <address>localhost</address>
                  <port>9099</port>
                </socket-address>
              </remote-addresses>
            </tcp-initiator>
            <outgoing-message-handler>
              <request-timeout>30s</request-timeout>
            </outgoing-message-handler>
            <serializer>
              <class-name>Tangosol.IO.Pof.ConfigurablePofContext, Coherence</class-name>
              <init-params>
                <init-param>
                  <param-type>string</param-type>
                  <param-value>Config\pof-config.xml</param-value>
                </init-param>
              </init-params>
            </serializer>
          </initiator-config>
        </remote-cache-scheme>
      </caching-schemes>
    </cache-config>
    Regards
    Nitin Jain

  • Client automatically cache the data got from cache server?

    Hi expert,
    I have 2 questions about the client-side local cache. Would you please give me some suggestions?
    1. Will the client automatically cache data obtained from the cache server the first time, and automatically update the data in its local cache when it gets the same data from the cache server again? I went through the API reference but cannot find any API to query the data currently held in the local cache.
    2. If the client does automatically cache data obtained from the cache server, is there any way for a client to get the data events that happen to its local cache, such as entry created, entry deleted, and entry updated? In my opinion, when getting an entry from the cache server the first time, the MapListener's entry-create event should be triggered; when getting the same entry again, the entry-update event should be triggered.
    However, I have tried a client with a replicated cache, a client with a partitioned cache, an extend client with a remote cache, and a client with a local cache (the front-cache part of a near cache), and in all cases the client (a MapListener has been set on the NamedCache object) does not get any event notification after getting data from the cache server. By the way, my listener is fine, since the entry-create and entry-update events are triggered when putting data.
    Your suggestion is very appreciated. :)

    Hi
    If I were you I would read this http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/toc.htm
    and particularly the section about Near Caching here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/nearcache.htm#CDEFEAJG
    which is what you are asking about in your question.
    Near Caching is how Coherence stores data locally - which is the answer to your first question. How Near Caching works is explained in the documentation.
    Events, which you ask about in your second question, are explained here: http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/delivereventsjava.htm#CBBIIEFA
    It might be that ContinuousQueryCache is closer to what you want; see http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/queryabledatafabric.htm#sthref38 - a ContinuousQueryCache is like having a subset of the underlying cache on the local client, which you can then listen to, etc...
    JK
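
    For what it's worth, a minimal Java sketch of the ContinuousQueryCache approach described above; the cache name is taken from the configs earlier on this page, and AlwaysFilter (which selects everything) is just an example filter.

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.net.cache.ContinuousQueryCache;
    import com.tangosol.util.AbstractMapListener;
    import com.tangosol.util.MapEvent;
    import com.tangosol.util.filter.AlwaysFilter;

    public class CqcExample {
        public static void main(String[] args) {
            NamedCache cache = CacheFactory.getCache("ACCOUNT");

            // A local, continuously updated view of the matching subset of the cache.
            ContinuousQueryCache cqc =
                new ContinuousQueryCache(cache, AlwaysFilter.INSTANCE, true);

            // Insert/update/delete events now fire for changes to the view,
            // including changes made by other cluster members.
            cqc.addMapListener(new AbstractMapListener() {
                public void entryInserted(MapEvent evt) { System.out.println("insert: " + evt); }
                public void entryUpdated(MapEvent evt)  { System.out.println("update: " + evt); }
                public void entryDeleted(MapEvent evt)  { System.out.println("delete: " + evt); }
            });

            System.out.println(cqc.get("some-key")); // reads are served locally from the view
        }
    }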

  • Problem in getting the data from a cache in hibernate

    I am storing data inside a cache, but when I get the data from the cache it comes back null. How can I retrieve the data from the cache?
    I am using EhCache.
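
    Without seeing the code it is hard to say, but a null from Ehcache usually means either the key used for the get does not match the key used for the put, or the returned Element was not checked and unwrapped. A minimal sketch with the classic net.sf.ehcache API (the cache name, key, and value are hypothetical):

    import net.sf.ehcache.Cache;
    import net.sf.ehcache.CacheManager;
    import net.sf.ehcache.Element;

    public class EhCacheExample {
        public static void main(String[] args) {
            CacheManager manager = CacheManager.create(); // uses ehcache.xml from the classpath
            manager.addCache("userCache");                // hypothetical cache name
            Cache cache = manager.getCache("userCache");

            cache.put(new Element("user:42", "Alice"));   // key and value must be Serializable

            Element e = cache.get("user:42");             // returns null on a miss
            if (e != null) {
                System.out.println(e.getObjectValue());   // unwrap the stored value
            }
            manager.shutdown();
        }
    }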

    Hi,
    You have made a mistake while setting the input parameters for the BAPI. Do the following to set the input for the BAPI:
    Bapi_Goodsmvt_Getitems_Input input = new Bapi_Goodsmvt_Getitems_Input();
    wdContext.nodeBapi_Goodsmvt_Getitems_Input().bind(input);
    Bapi2017_Gm_Material_Ra input1 = new Bapi2017_Gm_Material_Ra();
    wdContext.nodeBapi2017_Gm_Material_Ra().bind(input1);
    Bapi2017_Gm_Move_Type_Ra input2 = new Bapi2017_Gm_Move_Type_Ra();
    wdContext.nodeBapi2017_Gm_Move_Type_Ra().bind(input2);
    Bapi2017_Gm_Plant_Ra input3 = new Bapi2017_Gm_Plant_Ra();
    wdContext.nodeBapi2017_Gm_Plant_Ra().bind(input3);
    Bapi2017_Gm_Spec_Stock_Ra input4 = new Bapi2017_Gm_Spec_Stock_Ra();
    wdContext.nodeBapi2017_Gm_Spec_Stock_Ra().bind(input4);
    input1.setSign("I");
    input1.setOption("EQ");
    input1.setLow("1857");
    input2.setSign("I");
    input2.setOption("EQ");
    input2.setLow("M110");
    input3.setSign("I");
    input3.setOption("EQ");
    input3.setLow("309");
    input4.setSign("I");
    input4.setOption("EQ");
    input4.setLow("W");
    wdThis.wdGetWdsdgoodsmvmtcustController().execute_BAPI_GOODSMOVEMENT_GETITEMS();
    Finally, invalidate your output node, like:
    wdContext.node<output node name>.invalidate();
    Also put your code inside a try/catch block to display the exception, if any.
    Regards,
    Amit Bagati

  • ORA-13773: insufficient privileges to select data from the cursor cache

    We are trying to create an STS using the query below:
    exec sys.dbms_sqltune.create_sqlset(sqlset_name => 'TEST_STS', -
    sqlset_owner => 'SCOTT');
    The procedure below loads SQL starting with 'select /*MY_CRITICAL_SQL*/%' from the cursor cache into the STS TEST_STS.
    DECLARE
      stscur dbms_sqltune.sqlset_cursor;
    BEGIN
      OPEN stscur FOR
        SELECT VALUE(P)
        FROM TABLE(dbms_sqltune.select_cursor_cache(
          'sql_text like ''select /*MY_CRITICAL_SQL*/%''',
          null, null, null, null, null, null, 'ALL')) P;
      dbms_sqltune.load_sqlset(sqlset_name => 'TEST_STS',
                               populate_cursor => stscur,
                               sqlset_owner => 'SCOTT');
    END;
    /
    We were getting the following error: ORA-13761: invalid filter.
    After granting the privileges below to the user, we now get this error:
    Err msg:
    ERROR at line 1:
    ORA-13773: insufficient privileges to select data from the cursor cache
    ORA-06512: at "SYS.DBMS_SQLTUNE", line 2957
    ORA-06512: at line 10
    For SQL Tuning Sets:
    GRANT ADMINISTER ANY SQL TUNING SET TO scott;
    For Managing SQL Profiles:
    GRANT CREATE ANY SQL PROFILE TO scott;
    GRANT ALTER ANY SQL PROFILE TO scott;
    GRANT DROP ANY SQL PROFILE TO scott;
    For SQL Tuning Advisor:
    GRANT ADVISOR TO scott;
    Others:
    GRANT SELECT ON V_$SQL TO SCOTT;
    GRANT SELECT ON V_$SQLAREA TO SCOTT;
    GRANT SELECT ON V$SQLAREA_PLAN_HASH TO SCOTT;
    GRANT SELECT ON V_$SQLSTATS TO SCOTT;
    grant select on sys.DBA_HIST_BASELINE to SCOTT;
    grant select on sys.DBA_HIST_SQLTEXT to SCOTT;
    grant select on sys.DBA_HIST_SQLSTAT to SCOTT;
    grant select on sys.DBA_HIST_SQLBIND to SCOTT;
    grant select on sys.DBA_HIST_OPTIMIZER_ENV to SCOTT;
    grant select on sys.DBA_HIST_SNAPSHOT to SCOTT;
    Any info from your end to resolve the issue will be of great help.
    Thanks

    What is the alert log reporting? Are you seeing any other errors in the alert log as well?

  • How to cache data from database

    Hi all,
    How can I cache the data from a database so that I don't have to go to the database again and again to get the same results?
    thanks in advance
    gcs

    Well, you can make a structure to match the record layout for your data and put each record in a collection. Search the collection each time before you go out to the database. You may want to overload the contains method or make your own method that checks according to just your key values though.
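
    A minimal sketch of that approach: keep each fetched record in a map keyed by its primary key, and consult the map before going out to the database (the table and column names are hypothetical).

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.HashMap;
    import java.util.Map;

    public class CustomerCache {
        private final Map<Integer, String> cache = new HashMap<>();
        private final Connection con;

        public CustomerCache(Connection con) { this.con = con; }

        // Check the local collection first; hit the database only on a miss.
        public String getName(int id) throws SQLException {
            String name = cache.get(id);
            if (name == null) {
                try (PreparedStatement ps =
                         con.prepareStatement("SELECT name FROM customers WHERE id = ?")) {
                    ps.setInt(1, id);
                    try (ResultSet rs = ps.executeQuery()) {
                        if (rs.next()) {
                            name = rs.getString(1);
                            cache.put(id, name); // remember it for next time
                        }
                    }
                }
            }
            return name;
        }
    }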

  • Attempt to fetch cache data from Integration Directory failed

    Hi,
    While checking cache connectivity testing, the status is:
        green:  Integration Repository
        green:  Integration Directory
        green:  Integration Server - JAVA
        red:    Adapter Engine af.axd.aipid
        yellow: Integration Server - ABAP
    Jun 30, 2007 1:16:08 PM - Cache notification from Integration Directory received successfully
    Attempt to fetch cache data from Integration Directory failed; cache could not be updated
    [Fetch Data]: Unable to find an associated SLD element (source element: SAP_XIIntegrationServer, [CreationClassName, SAP_XIIntegrationServer, string, Name, is.00.aipid, string], target element type: SAP_BusinessSystem)
    [Data Evaluation]: GlobalError
    What should I do?
    Also, there is nothing under Integration Server and Integration Engine, but there is a green status under Non-Central Adapter Engines. From there I am doing send-message testing from XI to BI,
    sending the message to: http://aibid:8000/sap/xi/engine?type=entry
    payload:
    <?xml version="1.0" encoding="utf-8"?>
    <ns1:MI_VCNdatatoBI
        xmlns:ns1="http://bi.sap.com"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <DATA>
        <item>
          </BIC/ZG_CWW010>1000<//BIC/ZG_CWW010>
          </BIC/ZVKY_CHK>1<//BIC/ZVKY_CHK>
        </item>
      </DATA>
    </ns1:MI_VCNdatatoBI>
    I can send the message from there (Component Monitoring > Non-Central Adapter Engines), but I am unable to see it in message monitoring or on the BI side.
    dushyant

    Thanks,
    but I have adapter type XI, and I am following the steps in that link; according to it there is no need to create a file adapter type. It is almost done, but when I send a message through the configuration monitor in RWB it goes out, yet it does not appear in message monitoring or on the BI side.
    See topics 3 and 4 under 4.5, and topics 3, 4, and 5 under 4.6:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f027dde5-e16e-2910-97a4-f231046429f2
    Now what should I do?
    dushyant

  • Adobe Reader is Caching Data From OLEDB Connection

    I am trying to pre-populate a form from an oledb connection to an access database. I have an html page where a user can search an id. This id then gets written to a table where the sql query defined in the PDF form can grab it, join it with a table where the user info is stored, then display it.
    My problem is that Adobe Reader seems to be caching data from the first SQL select query that is executed. When I change the id I am loading, I still get data from the first SQL query in Reader. If I open the PDF via Acrobat, the data loads up properly, and it doesn't seem to be cached. I have looked at the following forum for suggestions: http://www.adobeforums.com/webx/.3bc3549c, but their suggestions haven't worked.
    I have tried turning off caching anywhere I can find it (i.e. in LiveCycle, Adobe Reader), but nothing is working. Does anybody have any suggestions?

    It sounds like you need to use the Adobe schema in MII so that the output of your transaction matches it.
    Regards,
    Jamie

  • Error reading data from static cursor cache.

    Hi,
    Does anyone know what causes the error below? It just started happening out of the blue. I'm running Apache Tomcat 4.1 on a Windows 2000 server with an MS SQL Server 2000 database.
    Error:
    java.sql.SQLException: [Microsoft][SQLServer JDBC Driver]Error reading data from static cursor cache.
    Thanks,
    TR

    Hi,
    I had a similar sort of error, something along the lines of "error setting up static cursor cache", using the SQL JDBC drivers on Win2K. I deleted the file entries in the TEMP folder of c:\documents and settings\<user>\Local Settings\TEMP and everything was cool after that. I'm not sure what the exact issue is (probably something like the maximum folder size had been reached). I ran the FileMon utility from www.sysinternals.com and it reported a DISK_FULL error on a temporary file being read by the process. To cut a long story short, everything is now cool.
    Cheers,
    dara
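
    If the temp files keep filling up, it may also help to avoid the static cursor cache altogether: as far as I know, the driver only materializes result sets into temporary files for scrollable cursors, so requesting a forward-only, read-only statement should stream rows instead. A sketch under that assumption (the URL, database name, and credentials are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ForwardOnlyQuery {
        public static void main(String[] args) throws Exception {
            Class.forName("com.microsoft.jdbc.sqlserver.SQLServerDriver");
            try (Connection con = DriverManager.getConnection(
                     "jdbc:microsoft:sqlserver://localhost:1433;DatabaseName=mydb",
                     "user", "password");
                 // forward-only/read-only result sets are streamed rather than
                 // cached to temporary files on the client
                 Statement st = con.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
                 ResultSet rs = st.executeQuery("SELECT 1")) {
                while (rs.next()) {
                    System.out.println(rs.getInt(1));
                }
            }
        }
    }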

  • ( very urgent ) read data from bw to live cache

    Hi,
    I am very new to APO liveCache. I need to read data from BW into liveCache for forecasting. I know transactions LC10 and LC11, but the problem is that I need to know where to enter the BW cubes or BW information into liveCache.
    I would appreciate your help.
    Regards,
    Suguna.S

    Try transaction /SAPAPO/TSCUBE or program /SAPAPO/RTSINPUT_CUBE.
    Regards
    Krishna Chaitanya.P

  • MRP Data from Live Cache

    Hi
    Please let me know the FM/BAPI that will get me the MRP data from liveCache.
    My requirement is to get the date and quantity for each element; the MRP elements to be used are VC, VE, and LF, based on the plant and material.
    Thanks in advance
    Amit

    Hello Sebastian,
    FM '/SAPAPO/OM_PEG_CAT_GET_ORDERS' seems to cover my requirements and I'm happy I found out about it here. Unfortunately I have some problems testing with SE37:
    Specifying only PEGID together with the needed categories was not successful. Searching for calling programs of this FM, I found FM '/SAPAPO/ATP_DISP_LC_SINGLE', which additionally sets parameter IS_GEN_PARAMS-SIMVERSION to '000' and receives the correct values.
    Now I have the problem that, even with the same input values set, I still receive exception 'LC_APPL_ERROR' and don't know why. Since I'm new to APO it could be some basic setting; still, I'm wondering why this could be. Hopefully you can provide some helpful information, otherwise I think I'm lost.
    Thanks in advance,
    eddy

  • Pulling differential data from the cache

    Hi,
    I am using LCDS 2.6 in my application. My application uses DMS to pull data from EhCache, which in turn gets refreshed from the database. Once I receive a piece of data, it will not change in the future; only new data gets appended to it. Hence I only need the differential data, not the entire data set. Currently, at regular intervals, I use the DataService to hit the cache and fetch the whole data again, which includes the older data as well as the newly appended data, since I have the VOs mapped on both the Flex and Java ends using assemblers.
    What I now want is to avoid pulling the data that has already come in; I just need the new data that has been appended to the original data. Is there any way for LCDS itself to detect the differential data and notify the client?
    Please note that the LCDS service hits the EhCache using assemblers and is not directly mapped to the database.
    I am currently in the design phase and would really need help on this.
    Looking forward to your replies on this.
    Thanks and Regards,
    Shally

    I believe data is being added to the store (in this case EhCache) from some other location. You need to notify LCDS that you are adding/removing/modifying data in the store directly; LCDS will then evaluate the change and send updates to the relevant clients.
    You do this using the DataServiceTransaction API: http://help.adobe.com/en_US/livecycle/9.0/programLC/javadoc/flex/data/DataServiceTransaction.html
    Inside the same webapp as the one in which LCDS is running, on adding an object you need to begin() a DST, call createItem() to notify LCDS that you have added a new item, and then commit() the DST to send messages to clients.
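
    A minimal Java sketch of that flow, to run inside the same webapp as LCDS; the destination name "records" and the item object are hypothetical.

    import flex.data.DataServiceTransaction;

    public class CacheUpdateNotifier {
        // Call this from the code that appends new records to EhCache.
        public void notifyClients(Object newItem) {
            // false: start a new transaction rather than joining an existing one
            DataServiceTransaction dtx = DataServiceTransaction.begin(false);
            try {
                // tell LCDS a new item exists in the "records" destination (name assumed)
                dtx.createItem("records", newItem);
                dtx.commit(); // pushes the change to subscribed clients
            } catch (RuntimeException e) {
                dtx.rollback();
                throw e;
            }
        }
    }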

  • 30EA2 - Strange message from Data Modeler Viewer.

    I am getting a strange message from the Data Modeler Viewer when I try to shut down SQL Developer.
    I have not used the Data Modeler Viewer, but on shutting down SQL Developer I get:
    Oracle SQL Developer Data Modeler Viewer is waiting for tasks that should finish, such as writing files.

    As reported in "30EA2: Data Modeler Viewer annoyance when exiting".
    The team has yet to respond.
    Regards,
    K.
