Data synchronization on Android

Hello,
I am trying to run a mobile application on an Android device and I have a problem with the synchronization process. When I try to run a sync operation in the example application, the Mobile Server throws this exception:
log-1: ============== Server Exception - Begin ==================
oracle.lite.resource.ResourceNotFound: Device (-1) - Resource not found
     at oracle.lite.resource.Resource.exception(Unknown Source)
     at oracle.lite.resource.User.getDevice(Unknown Source)
     at oracle.lite.dm.AuthManager.verifyRequest(Unknown Source)
     at oracle.lite.dm.AuthManager.processCredentials(Unknown Source)
     at oracle.lite.dm.AuthManager.authenticateMAC(Unknown Source)
     at oracle.lite.dm.AuthManager.authenticateMAC(Unknown Source)
     at oracle.lite.dm.AuthManager.verifyMAC(Unknown Source)
     at oracle.lite.sync.HeliosSession.initConvIS(Unknown Source)
     at oracle.lite.sync.HeliosSession.recvCompressed(Unknown Source)
     at oracle.lite.sync.HeliosSession.recvRec(Unknown Source)
     at oracle.lite.sync.HeliosSession.startSession(Unknown Source)
     at oracle.lite.sync.resume.Client$1.run(Unknown Source)
     at oracle.lite.sync.resume.ThreadPool$PoolTask.run(Unknown Source)
================== Server Exception - End ====================
I tried to add the mobile device to the Mobile Server manually, but there is no suitable platform for it (only SQLite Linux x86 is listed; there is no Android).
How can I deploy this application?
Thanks in advance,
Kamil

http://download.oracle.com/docs/cd/E12095_01/doc.10303/e12548/cpreinstall.htm#CBHDGGBB
Oracle Lite 10.3.0.3 does not support Android, iPhone, or other such platforms.
As far as I know, it only supports Win32, Linux, and WinCE.

Similar Messages

  • Data Synchronizer GroupWise connector time shift problem

    I have recently installed the latest 1.2.2 Data Synchronizer Mobility with the GroupWise connector. One bug I have found: looking at the time stamps of emails sent and received from the connector's perspective, I am seeing a 20-hour time shift. For example, if I send an email to myself from my Android device at 06:00 May 6 local time, it shows as received at 10:00 May 5, and the sent time shown on the device also says 10:00 May 5. When I look at the GroupWise client or WebAccess, the sent and received times and dates are correct. Conclusion: it is only the Data Synchronizer showing the time and date incorrectly. Any thoughts welcome.

    Update - This problem appears to be phone specific: it is a Motorola MB525 with Android 2.2.2, and we have some HTCs which do NOT have a problem with the time zone. Another minor problem is that items deleted on the device do not sync up to GroupWise.
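    A side note on the size of the shift: 20 hours is exactly what you would see if a UTC+10 offset were subtracted twice. This is only a hypothesis and assumes the device sits in a UTC+10 time zone; a quick Python illustration (the date is arbitrary):

        from datetime import datetime, timedelta

        local_offset = timedelta(hours=10)           # assumed device time zone: UTC+10
        sent_local = datetime(2011, 5, 6, 6, 0)      # 06:00 May 6 local time, as in the report

        sent_utc = sent_local - local_offset         # correct conversion: 20:00 May 5 UTC
        double_converted = sent_utc - local_offset   # offset mistakenly applied a second time
        print(double_converted)                      # 2011-05-05 10:00:00 -> the time stamp being shown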

  • Urgent: SRM and BW user data synchronization problem

    Dear Buddies:
    I work on BW in an SRM project. These days I have met a very strange problem in the user data synchronization configuration between the SUS and BW systems.
    The symptom is:
    I configured the user data synchronization parameters in the SUS system:
    SAP Reference IMG → Supplier Relationship Management → Supplier Self-Services → Master Data → Maintain Systems for Synchronization of User Data
    Here I've maintained the BW logical system, filled the 'FM BPID' field with 'RS_BW_BCT_SRM_SUS_USER_BPID', and filled the 'Function Module for creating user' field with 'BAPI_USER_CREATE'.
    The function of the config above is that:
    When a new user is created in the SAP SUS system, it will automatically be created in SAP BW, too.
    At the same time, an internal table (SRM_USER_SUPBPID) is filled automatically. The table contains the assignment between the automatically created SAP BW user and the corresponding Business Partner ID of the supplier company.
    Then I tested user creation in SUS on the web. I found that when the SUS user is created, the same user is created automatically in the BW system. That means 'BAPI_USER_CREATE' is working.
    But the content of the user-BPID mapping table 'SRM_USER_SUPBPID' is still empty. That means the FM 'RS_BW_BCT_SRM_SUS_USER_BPID' is not working at all.
    Has anybody met a similar problem? If you have any suggestions, please kindly share your solutions. Thanks!

    No solutions? I need your support, my friends.

  • Data load from flat file through data synchronization

    Hi,
    Can anyone please help me out with a problem? I am doing a data load into my Planning application from a flat file, using Data Synchronization for it. How can I specify in my data synchronization mapping that values from the load file at the same intersection should be added together instead of overridden?
    For example, the load file has the following data:
    Entity Period Year Version Scenario Account Value
    HO_ATR Jan FY09 Current CurrentScenario PAT 1000
    HO_ATR Jan FY09 Current CurrentScenario PAT 2000
    the value at the intersection HO_ATR->Jan->FY09->Current->CurrentScenario->PAT should be 3000.
    Is this possible? I don't want to give users rights to the Admin Console for loading data.
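    If the tool itself cannot accumulate values, one workaround is to pre-aggregate the flat file before loading it. A minimal Python sketch, assuming a whitespace-delimited file laid out like the sample above (the file names are placeholders):

        from collections import defaultdict

        totals = defaultdict(float)

        # Sum the Value column for rows that share the same intersection
        # (Entity, Period, Year, Version, Scenario, Account).
        with open("load_file.txt") as src:                  # placeholder input name
            header = src.readline().split()
            for line in src:
                fields = line.split()
                if not fields:
                    continue
                *intersection, value = fields
                totals[tuple(intersection)] += float(value)

        with open("load_file_aggregated.txt", "w") as dst:  # placeholder output name
            dst.write(" ".join(header) + "\n")
            for intersection, value in totals.items():
                dst.write(" ".join([*intersection, f"{value:g}"]) + "\n")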

    Hi Manmit,
    First let us know if you are in BW 3.5 or 7.0
    In either of the cases, just try including the fields X,Y,Date, Qty etc in the datasource with their respective length specifications.
    While loading the data using Infopackage, just make the setting for file format as Fixed length in your infopackage
    This will populate the values to the respective fields.
    Prathish

  • Migration of mapping created in data synchronization

    Hi,
    I've created a mapping in the EPM data synchronization utility. I am about to migrate from Dev to Production. Is there any way to migrate/export the data synchronization along with the mappings created in Dev, or do I have to recreate everything from scratch? It seems that there is no way to export the mappings I created. Appreciate your help.
    Thanks,
    ADB

    Hi Alexey,
    Could you elaborate on the requirement? It is still not clear to me what you want to achieve.
    What I do understand is that the users should be able to make adjustments to the mapping/lookup entries.
    If that is the case, what exactly is going to be maintained in the 'additional table' and how are you suggesting end users are going to maintain this?
    Ideally, your query transformation should not change when parameter values change, so you have to think about what logic you put where.
    My suggestion would be to use a file or a table which can be maintained by users. In your query transformation you can then use the lookup or lookup_ext function.
    Especially with lookup_ext you can make it as complicated as you want. The function is well documented but if you need help then just reply and explain in a bit more detail what you're trying to do.
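    For illustration only (plain Python, not Data Services syntax), the idea of a user-maintained lookup with a default fallback looks like this; the file name and column names are assumptions:

        import csv

        # Load a user-maintained mapping file: source_code -> target_code.
        with open("user_mapping.csv", newline="") as f:         # placeholder file name
            mapping = {row["source_code"]: row["target_code"]   # placeholder column names
                       for row in csv.DictReader(f)}

        def map_code(source_code, default="UNMAPPED"):
            # Return the mapped value, or a default when no entry exists,
            # similar in spirit to a lookup with a default value.
            return mapping.get(source_code, default)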
    If you do think the 'hard-coded' option would suit you better, you can look into the 'decode' function. Again, it is well documented in the technical manual.
    Jan.

  • Any way to stop Data Synchronizer from syncing every folder?

    SLES 11 64 bit for VMware.
    Groupwise connector 1.0.3.512
    Mobility connector 1.0.3.1689
    I have "folders" unchecked every place you can uncheck them.
    Still the Groupwise connector insists on syncing every single folder in a user's cabinet and contacts (even the ones that are unselected in the Groupwise connector->user edit section).
    I do *not* mean that it force syncs these folders to the device, just that it syncs them into the "folder list" section that you can monitor from Mobility Connector->Monitor->Click on user name.
    Most of our users have dozens of folders, and all the scrolling makes it kind of a pain to monitor the folder (ONE) and address books (usually 1-2) that they are syncing. Also, it seems like adding a ton of unneeded work to the system, and it eats up a pretty good chunk of CPU.

    rhconstruction wrote:
    > So, do the "Folders" checkboxes currently do anything?
    If you are talking about the Folder checkboxes in the GroupWise connector, these
    do not currently pertain to Mobility. Remember that Mobility is part of a larger
    "Data Synchronizer" product, with connectors to other components like SugarCRM,
    Teaming, etc. So, some of the GroupWise connector settings show for all types
    of connectors, but do not always apply to each connector.
    Danita Zanr
    Novell Knowledge Partner
    Get up and running with Novell Mobility!
    http://www.caledonia.net/gw-mobility.html

  • Data warehouse monitor initial state data synchronization process failed to write state.

    Data Warehouse monitor initial state data synchronization process failed to write state to the Data Warehouse database. Failed to store synchronization process state information in the Data Warehouse database. The operation will be retried.
    Exception 'SqlException': Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
    One or more workflows were affected by this. 
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Synchronization.MonitorInitialState
    Instance name: Data Warehouse Synchronization Service
    Instance ID: {0FFB4A13-67B7-244A-4396-B1E6F3EB96E5}
    Management group: SCOM2012R2BIZ
    Could you please help me out of the issue?

    Hi,
    It seems that you are encountering event 31552; you may check the Operations Manager event logs for more information regarding this issue.
    There can be many causes of the 31552 event, such as:
    - A sudden flood (or excessive sustained amounts) of data to the warehouse that is causing aggregations to fail moving forward.
    - The Exchange 2010 MP is imported into an environment with lots of state changes happening.
    - Excessively large ManagedEntityProperty tables causing maintenance to fail because it cannot be parsed quickly enough in the time allotted.
    - Too much data in the warehouse staging tables which was not processed due to an issue and is now too much to be processed at one time.
    Please go through the links below to get more information about troubleshooting this issue:
    The 31552 event, or “why is my data warehouse server consuming so much CPU?”
    http://blogs.technet.com/b/kevinholman/archive/2010/08/30/the-31552-event-or-why-is-my-data-warehouse-server-consuming-so-much-cpu.aspx
    FIX: Failed to store data in the Data Warehouse due to a Exception ‘SqlException': Timeout expired.
    Regards,
    Yan Li
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected]

  • Offline data synchronization in azure mobile services on windows server 2008

    Hi,
    I have a class library which inserts data into tables in Azure Mobile Services, on the Windows Server 2008 OS, for the Windows Universal C# platform. I am trying to insert data using offline data synchronization.
    I have installed the SQLite runtime for Windows 8.1 and Windows Phone 8.1, but I am unable to add a reference to 'SQLite for Windows Runtime (Windows 8.1)'.
    Please guide me on whether the Windows Server 2008 OS supports offline data synchronization in Azure Mobile Services.
    Thank you.

    I also have a Windows Server 2012 R2 system using Azure Backup, and I don't have the problem. However, you probably noticed that you use a different Azure Backup installation download for Windows Server 2008 R2 vs. Windows Server 2012 R2. Although both
    show the same Microsoft Azure Recovery Services Agent version 2.0.8692.0 installed, my Windows Server 2012 R2 also lists Microsoft Azure Backup for Windows Server Essentials version 6.2.9805.9 installed as well. It could be the case that my problem with the
    CATALOG FAILURE 0x80131500 errors is something specific to the version of Azure Backup installed on my Windows 2008 R2 servers.
    Trilon, Inc.

  • Essbase Analytics Link cannot create data synchronization server database

    When I try to create the data synchronization server database using Essbase Analytics Link, the error below occurs. Can anyone help? Thanks
    dss.log:
    [19 Oct 2011 17:28:55] [dbmgr] ERROR: last message repeated 2 more times
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\Comma.hdf"
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\PERIOD.hrd"
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\VIEW.hrd"
    [19 Oct 2011 17:28:55] [dbmgr] removed "C:\oracle\product\EssbaseAnalyticsLink\oem\hfm\Comma\Default\YEAR.hrd"
    [19 Oct 2011 17:28:58] [dbmgr] Create metadata: "C:/oracle/product/EssbaseAnalyticsLink/oem/hfm/Comma/Default/Comma.hdf"
    [19 Oct 2011 17:28:59] [dbmgr] WARN : HR#03826: Directory "C:\oracle\product\EssbaseAnalyticsLink/Work/XOD/backUp_2" not found. Trying to create
    [19 Oct 2011 17:29:15] [dbmgr] ERROR: ODBC: HR#01465: error in calling SQLDriverConnect ([Microsoft][ODBC SQL Server Driver][Shared Memory]Invalid connection. [state=08001 code=14]).
    [19 Oct 2011 17:29:15] [dbmgr] ERROR: HR#00364: Cannot open source reader for "ACCOUNT"
    [19 Oct 2011 17:29:15] [dbmgr] ERROR: HR#00627: Cannot create dimension: "ACCOUNT".
    [19 Oct 2011 17:29:16] [dbmgr] ERROR: HR#07722: Cube 'main_cube' of application 'Comma' is not registered.
    eal.log:
    [2011-Oct-19 17:28:56] http://localhost/livelink/Default.aspx?command=readYear&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:56] http://localhost/livelink/Default.aspx?command=readPeriod&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:57] http://localhost/livelink/Default.aspx?command=readView&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:57] http://localhost/livelink/Default.aspx?command=getVersion&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:28:58] DSS Application created
    [2011-Oct-19 17:28:58] http://localhost/livelink/Default.aspx?command=getICPWeight&server=TestEss64&application=Comma&domain=
    [2011-Oct-19 17:29:15] (-6981) HR#07772: cannot register HDF
    [2011-Oct-19 17:29:15] com.hyperroll.jhrapi.JhrapiException: (-6981) HR#07772: cannot register HDF
    [2011-Oct-19 17:29:15]      at com.hyperroll.jhrapi.JhrapiImpl.updateMetadata(Native Method)
    [2011-Oct-19 17:29:15]      at com.hyperroll.jhrapi.Application.updateMetadata(Unknown Source)
    [2011-Oct-19 17:29:15]      at com.hyperroll.hfm2ess.bridge.HyperRollProcess.updateMetadata(Unknown Source)
    [2011-Oct-19 17:29:15]      at com.hyperroll.hfm2ess.bridge.ws.BridgeOperationManagerImpl.createAggServerApp(Unknown Source)
    [2011-Oct-19 17:29:15]      at com.hyperroll.hfm2ess.bridge.ws.BridgeOperationManager.createAggServerApp(Unknown Source)
    [2011-Oct-19 17:29:15]      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [2011-Oct-19 17:29:15]      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [2011-Oct-19 17:29:15]      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [2011-Oct-19 17:29:15]      at java.lang.reflect.Method.invoke(Method.java:597)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.WLSInstanceResolver$WLSInvoker.invoke(WLSInstanceResolver.java:92)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.WLSInstanceResolver$WLSInvoker.invoke(WLSInstanceResolver.java:74)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.InvokerTube$2.invoke(InvokerTube.java:151)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.sei.EndpointMethodHandlerImpl.invoke(EndpointMethodHandlerImpl.java:268)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.sei.SEIInvokerTube.processRequest(SEIInvokerTube.java:100)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber.__doRun(Fiber.java:866)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber._doRun(Fiber.java:815)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber.doRun(Fiber.java:778)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.api.pipe.Fiber.runSync(Fiber.java:680)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.server.WSEndpointImpl$2.process(WSEndpointImpl.java:403)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.transport.http.HttpAdapter$HttpToolkit.handle(HttpAdapter.java:532)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.transport.http.HttpAdapter.handle(HttpAdapter.java:253)
    [2011-Oct-19 17:29:15]      at com.sun.xml.ws.transport.http.servlet.ServletAdapter.handle(ServletAdapter.java:140)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.WLSServletAdapter.handle(WLSServletAdapter.java:171)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.HttpServletAdapter$AuthorizedInvoke.run(HttpServletAdapter.java:708)
    [2011-Oct-19 17:29:15]      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:363)
    [2011-Oct-19 17:29:15]      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:146)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.util.ServerSecurityHelper.authenticatedInvoke(ServerSecurityHelper.java:103)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.HttpServletAdapter$3.run(HttpServletAdapter.java:311)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.HttpServletAdapter.post(HttpServletAdapter.java:336)
    [2011-Oct-19 17:29:15]      at weblogic.wsee.jaxws.JAXWSServlet.doRequest(JAXWSServlet.java:98)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.http.AbstractAsyncServlet.service(AbstractAsyncServlet.java:99)
    [2011-Oct-19 17:29:15]      at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:183)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3717)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
    [2011-Oct-19 17:29:15]      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    [2011-Oct-19 17:29:15]      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
    [2011-Oct-19 17:29:15]      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
    [2011-Oct-19 17:29:15]      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
    [2011-Oct-19 17:29:15]      at weblogic.work.ExecuteThread.run(ExecuteThread.java:176)
    [2011-Oct-19 17:29:15] LiveLinkException [HR#09746]: Data Synchronization Server database cannot be created

    What version of EAL have you installed, and what OS (32-bit or 64-bit) are you installing it on?
    What version of the OUI did you use?
    Have you gone through all the configuration steps successfully?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Data Filter on CSV file using Data Synchronization

    I got an error when I used a Data Filter on a CSV file in a Data Synchronization task. Filter condition: BILLINGSTATE LIKE 'CA'
    TE_7002 Transformation stopped due to a fatal  error in the mapping. The expression [(BILLINGSTATE LIKE 'CA')] contains  the following errors [<<PM Parse Error>> missing operator  ... (BILLINGSTATE>>>> <<<<LIKE 'CA')].
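    The parse error shows the expression parser rejecting LIKE as an operator in the filter. As a workaround sketch only (plain Python, not Informatica expression syntax), the CSV could be pre-filtered before the task picks it up; the file names are assumptions, and LIKE 'CA' with no wildcard is treated here as a plain equality test:

        import csv

        with open("source.csv", newline="") as src, \
             open("source_filtered.csv", "w", newline="") as dst:   # assumed file names
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
            writer.writeheader()
            for row in reader:
                if row.get("BILLINGSTATE") == "CA":                 # keep only CA rows
                    writer.writerow(row)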

    Hi,
    Yes,This can be done through BEx Broadcaster.
    Please follow the steps below:
    1.Open your query in BEx Analyzer
    2.Go to BEx Analysis Toolbar->Tools->BEx Broadcaster...
    3.Click "Create New Settings"->Select the "Distribution Type" as "Broadcast Email" and "Output Format"  as "CSV"
    4.Enter the Recipients Email Address under "Recipients" tab
    5.Enter the Subject and Body of the mail under "Texts" tab
    6.Save the Setting and Execute it.
    Now the query data will be attached as a CSV file and sent to the recipients through email.
    Hope this helps you.
    Rgds,
    Murali

  • Data Synchronization from HFM: FMDmeException - Query Failed or invalid

    I'm trying to create a Data Synchronization where HFM is the data source.
    I keep getting the following DME exception and don't know how to further diagnose the issue.
    4/19/12 1:29 PM : Error submitting request to source connector - exception: com.hyperion.datasync.fmdme.exception.FMDmeException: Acknowledgement failure - status: 1, reason: Query Failure: The query failed or is invalid! (0)
    4/19/12 1:29 PM : Translation failed - Source query failed to open - exception:; nested exception is:
         com.hyperion.datasync.fmdme.exception.FMDmeException: Acknowledgement failure - status: 1, reason: Query Failure: The query failed or is invalid! (0)
    The DME Listener service is running.
    I've tried the simplest data sync -- just one member in each dimension, copying to same application (in a different scenario).
    I have data synchronizations that do work where the source is a Planning application.
    I was able -- a month ago -- to create a sync where HFM is the source on this development application, so I don't know if something has changed in the application itself.
    Any suggestions appreciated.
    Thanks,
    Barb
    Edited by: bg on Apr 19, 2012 1:31 PM

    You're right. I should have put it in Financial Consolidation. I don't see that I can post a question directly to EPMA level.

  • The First Recurring Yearly Appointment (Anniversary) Disappears on z10 with Novell Data Synchronizer

    Device: z10
    Carrier: Verizon
    Sync: Novell Data Synchronizer
    1. Open Calendar, select month, select add
    2. Type the subject (My Anniversary), set all-day event to on
    3. Select the Starts day and leave the Ends day as the default
    4. Select Recurrence and set it as yearly
    5. Leave the recurrence defaults and select <, then select Save
    6. Note that the appointment (Anniversary) will show on the desired date for approx. 20 seconds and then disappear. The appointment will also show as scheduled for next year, but for this year, you will miss your anniversary.
    7. Monthly and weekly recurrence works fine. The local calendar works fine.
    8. Support recommends KB33905 with a Security Wipe. Result: no difference.
    9. The only workaround is that if you need to add a yearly recurrence, you must start the event with a prior-year date
    10. Does anyone else have a solution?
    Thanks

    pgainer,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://forums.novell.com/faq.php
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://forums.novell.com/

  • Offline data synchronization

    We are trying to use the offline data synchronization feature of DMS using data modeling.
    Below is the only example we found on the Adobe site, and it is working:
    http://help.adobe.com/en_US/LiveCycleDataServicesES/3.1/Developing/WS4ba8596dc6a25eff-56f1c8f4126dcd963c6-8000.html
    Also, we have read the "occasionally connected client" and model-driven applications documentation in the LCDS 3.1 user guide.
    Is there any other example that demonstrates how to use offline data sync? We don't want to generate the Java code and use an assembler class for this.
    In our example we are implementing the SearchCustomerList functionality. Based on the search criteria, a list of customers is displayed to the user.
    Below are the configuration settings:
                            var cs:ChannelSet = new ChannelSet();
                            var customChannel:Channel = new RTMPChannel("my-rtmp",
                                        "rtmp://wpuag85393:2038");
                            cs.addChannel(customChannel);
                            this.serviceControl.channelSet=cs;
                            this.serviceControl.autoCommit = false;
                            this.serviceControl.autoConnect = false;
                            this.serviceControl.autoSaveCache = true;
                            this.serviceControl.offlineAdapter = new
                                        com.deere.itarch.adapter.MaintainCustomerBeanOfflineAdapter();
                            this.serviceControl.fallBackToLocalFill=true;
                            this.serviceControl.encryptLocalCache = true;
                            this.serviceControl.cacheID = "localCache";
    CASE I:
    Below is our understanding of offline data sync for our implementation:
    - The LCDS server is started and the application is connected to the server.
    - The user enters search criteria and clicks the Search button.
    - Data is fetched and displayed on the screen. As autoSaveCache is set to true, it should automatically save the result in the local cache.
    - Shut down the LCDS server.
    - Close the earlier search window.
    - Run the application and open the customer search page.
    - Enter the same search criteria and click Search.
    - Result: nothing is displayed on the screen (no data is fetched from the local cache).
    Many times we get a 'cannot connect to server' error (even when the server is running, about 50% of the time).
    We also tried setting the reconnect strategy to 'instance', but this is also not working.
    Also, can you please provide an end-to-end sample for data synchronization?

    Good to see you got a little further along with your application. I'm not sure why setting autoconnect to true helped.
    Regarding your search, I'm not sure how you implemented it, but the easiest way to do it with model-driven development is by using a criteria filter. It will result in a new query in your offline adapter. You just add a filter element to an entity in your model, and in that filter you specify your like expression. I added one to the doc sample app as an example. When you generate code for the offline adapter, you'll be able to see the proper structure for the like clause too. I'm including my fml and offline adapter source below. I've also included the MXML so you can see how I called the new filter method from the client. After I saved to the local cache and went offline, I successfully performed the search in the client app. There were no issues with it.
    Here's my fml. The new filter is the <filter> element near the end of the entity. I should have chosen a better filter name, since it will generate a method called byProductName, which is very close to the existing getByProductName. But you'll get the idea. Once you add the filter, just remember to redeploy your model and regenerate your code.
    Regarding your question about associations, I'll look into that, but I think you would generate offline adapters for each entity involved in the association and your relationships should behave correctly offline.
    <model xmlns="http://ns.adobe.com/Fiber/1.0">
        <annotation name="DMS">
            <item name="datasource">java:/comp/env/jdbc/ordersDB</item>
            <item name="hibernate.dialect">org.hibernate.dialect.HSQLDialect</item>
        </annotation>
        <entity name="Product" persistent="true">
            <annotation name="ServerProperties" ServerType="LCDS"/>
            <annotation name="DMS" Table="PRODUCT"/>
            <annotation name="VisualModeler" width="114" height="88" x="66" y="79"/>
            <annotation name="ActionScriptGeneration" GenerateOfflineAdapter="true" OfflineAdapterPackage="com.adobe.offline"/>
            <id name="productid" type="integer">
                <annotation name="DMS" ColumnName="PRODUCTID"/>
            </id>
            <property name="description" type="string" length="255">
                <annotation name="DMS" ColumnName="DESCRIPTION"/>
            </property>
            <property name="price" type="float">
                <annotation name="DMS" ColumnName="PRICE"/>
            </property>
            <property name="productname" type="string" length="255">
                <annotation name="DMS" ColumnName="PRODUCTNAME"/>
            </property>
            <filter name="byProductName" criteria="productname like"/>
        </entity>
    </model>
    Here's the new query for byProductName in my offline adapter, which contains a valid like clause. That section is the "byProductName" case in getQueryCriteria().
    /**
     * This is an auto-generated offline adapter for the Product entity.
     */
    package com.adobe.offline
    {
        import mx.core.mx_internal;
        import mx.data.SQLiteOfflineAdapter;
        import mx.utils.StringUtil;

        use namespace mx_internal;

        public class ProductOfflineAdapter extends SQLiteOfflineAdapter
        {
            /**
             * Return an appropriate SQL WHERE clause for a given set of fill parameters.
             * @param originalArgs fill parameters
             * @return String representing the WHERE clause for a SQLite SQL query
             */
            override protected function getQueryCriteria(originalArgs:Array):String
            {
                var args:Array = originalArgs.concat();
                var filterName:String = args.shift();
                var names:Array = new Array();
                switch (filterName)
                {
                    case "byProductName":
                        // JPQL: select Product_alias from Product Product_alias where Product_alias.productname like :productname
                        // Preview: productname like :productname
                        names.push(getTargetColumnName(["productname"]));
                        return StringUtil.substitute("{0} like :productname", names);
                        break;
                }
                return super.getQueryCriteria(originalArgs);
            }
        }
    }
    Here's my modified MXML. I'm still calling getAll(), but after that I use the new filter to filter the results/datagrid display to just the subset that matches the string I input in the search field. This results in a new call to productService.byProductName(), which is the client-side method generated from the filter element in my model.
    <?xml version="1.0" encoding="utf-8"?>
    <s:WindowedApplication xmlns:fx="http://ns.adobe.com/mxml/2009"
                           xmlns:s="library://ns.adobe.com/flex/spark"
                           xmlns:mx="library://ns.adobe.com/flex/mx" xmlns:OfflineAIRAPP="TestOfflineApp.*"
                           preinitialize="app_preinitializeHandler(event)"
                           creationComplete="windowedapplication1_creationCompleteHandler(event)">
        <fx:Script>
            <![CDATA[
                import com.adobe.offline.ProductOfflineAdapter;
                import mx.controls.Alert;
                import mx.events.FlexEvent;
                import mx.messaging.Channel;
                import mx.messaging.ChannelSet;
                import mx.messaging.channels.RTMPChannel;
                import mx.messaging.events.ChannelEvent;
                import mx.rpc.AsyncToken;
                import mx.rpc.events.FaultEvent;
                import mx.rpc.events.ResultEvent;
                public var myOfflineAdapter:ProductOfflineAdapter;
                public function channelConnectHandler(event:ChannelEvent):void
                {
                    productService.serviceControl.autoConnect = false;
                }
                protected function app_preinitializeHandler(event:FlexEvent):void
                {
                    var cs:ChannelSet = new ChannelSet();
                    var customChannel:Channel = new RTMPChannel("my-rtmp",
                        "rtmp://localhost:2037");
                    cs.addChannel(customChannel);
                    productService.serviceControl.channelSet = cs;
                    customChannel.addEventListener(ChannelEvent.CONNECT,
                        channelConnectHandler);
                }
                protected function dataGrid_creationCompleteHandler(event:FlexEvent):void
                {
                    getAllResult.token = productService.getAll();
                }
                protected function windowedapplication1_creationCompleteHandler(event:FlexEvent):void
                {
                    productService.serviceControl.autoCommit = false;
                    productService.serviceControl.autoConnect = true;
                    productService.serviceControl.autoSaveCache = true;
                    productService.serviceControl.fallBackToLocalFill = true;
                    productService.serviceControl.encryptLocalCache = true;
                    productService.serviceControl.cacheID = "myOfflineCache4";
                }
                protected function connectBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.connect();
                }
                protected function disconnectBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.disconnect();
                }
                protected function commitBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.commit();
                }
                protected function saveCacheBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.saveCache();
                }
                protected function clearCacheBtn_clickHandler(event:MouseEvent):void
                {
                    productService.serviceControl.clearCache();
                }
                protected function button_clickHandler(event:MouseEvent):void
                {
                    getAllResult.token = productService.byProductName("%" + key.text + "%");
                }
            ]]>
        </fx:Script>
        <fx:Declarations>
            <mx:TraceTarget />       
            <s:CallResponder id="getAllResult" />
            <OfflineAIRAPP:ProductService id="productService"
                                          fault="Alert.show(event.fault.faultString + '\n' +
                                          event.fault.faultDetail)"/>
            <s:CallResponder id="byProductNameResult"/>
            </fx:Declarations>
        <mx:DataGrid editable="true" x="141" y="10" id="dataGrid"
                     creationComplete="dataGrid_creationCompleteHandler(event)"
                     dataProvider="{getAllResult.lastResult}">
            <mx:columns>
                <mx:DataGridColumn headerText="productid" dataField="productid"/>
                <mx:DataGridColumn headerText="description" dataField="description"/>
                <mx:DataGridColumn headerText="price" dataField="price"/>
                <mx:DataGridColumn headerText="productname" dataField="productname"/>
            </mx:columns>
        </mx:DataGrid>
        <s:Button x="10" y="246" label="Connect" click="connectBtn_clickHandler(event)"
                  id="connectBtn" width="84" height="30"/>
        <s:Button x="112" y="204" label="Save to Local Cache" id="saveCacheBtn"
                  click="saveCacheBtn_clickHandler(event)" height="30"/>
        <s:Button x="110" y="246" label="Commit to Server" id="commitBtn"
                  click="commitBtn_clickHandler(event)" width="135" height="30"/>
        <s:Button x="10" y="204" label="Disconnect" id="DisconnectBtn"
                  click="disconnectBtn_clickHandler(event)" height="30"/>
        <s:Label x="270" y="204" text="{'Commit Required: ' +
                 productService.serviceControl.commitRequired}"/>
        <s:Label x="270" y="246" text="{'Connected: ' +
                 productService.serviceControl.connected}"/>
        <s:TextInput x="10" y="19" id="key"/>
        <s:Button x="10" y="49" label="Search" id="button" click="button_clickHandler(event)"/>   
    </s:WindowedApplication>

  • Preventive Work Order Dates Synchronization with its automatically created notification

    Dear all,
    Preventive Work Order Dates Synchronization with its automatically created notification
    My question originated in the PM module forum; please check the URL above if you have time.
    My ultimate problem is this:
    The "SMOD" enhancement "QQMA0018" needs to receive the originating work order's data at the time IP10 or IP30 is executed,
    so that I can decide/calculate the desired dates on the notification according to the originating work order's dates.
    Do I need to use "export/import memory" statements to communicate with the originating work order?
    If so, please let me know how.
    Thanks for your help in advance.

    Dear Pete,
    I just made test data in our DEV server.
    Work orders have been created by "IP30", which was executed today with its maintenance plan (date format DD.MM.YYYY).
    The work order below has been created, and its basic start date seems to be derived from the plan date.
    The image below shows our 'SPRO' configuration for the PM priority.
    The "ZM03" order type is using the "ZP" priority type.
    (I don't know why this work order didn't take the values defined in 'Priorities for Each Priority Type' when calculating the "basic start" and "basic finish" dates; it seems it just copied the same date as the plan date.)
    You can see the notification, which is created automatically by the work order, below.
    The "M4" notification type is using the "ZP" priority type as well.
    It derived the 'required start' date by adding 2 weeks to today (the creation date), as 'Priorities for Each Priority Type' defines, and the 'required end' date was also set to 6 months after the 'required start' date.
    (Am I right?)
    Please let me know, if you need further information.
    Thank you for your help.

  • Data synchronization in EPMA V11

    Hi,
    In EPMA System 11 it is possible to synchronize EPMA applications. Does anybody know how this is done in the background? I assume that data export calc scripts and rule files are generated for exchanging data between the Essbase BSO and Planning applications.
    Additionally, how is the mapping of dynamically calculated members handled? In a DATAEXPORT calc script this option can be set with DataExportDynamicCalc ON | OFF, but how is it done in data synchronization?
    Kind regards,
    Ilker

    EPMA data sync seems to use an MDX query to pull the data out of an Essbase/Planning application. The way the query runs, for Essbase for example: if you want to specify a certain member, the syntax is MEMBER(FY10,Years); you have to specify the dimension that the member belongs to. Now I could be wrong here, but all of my experience with seeing how the tool queries Essbase points to this.
    There is no special handling of dynamic calc members right now; it just extracts the data and tries to load it right back to the same member (which will then fail if the corresponding member is dynamic calc).
    The export is actually relatively quick. You can move data from one app to the next in about 30-40 minutes (11 million rows exported out of the Essbase cube). Data Sync thinks about the volume of data being exported in rows, much like a relational database. Once the data is extracted into a text file, it is sent to Essbase and a 'streaming data load' process kicks off. I think this can utilize DLTHREADSWRITE and DLTHREADSPREPARE, but I am not entirely sure.
    The problem with EPMA data synchronizations lies in the inability to refine the member set being pulled. You can't specify level 0 members, so usually you are relegated to using DESCENDANTS, one or two members, or no filter at all (essentially pulling all members of the dimension, including upper levels, which is quite inefficient). Also, the EPMA data synch technology is not advanced enough yet to filter out dynamic calc members.
    It is important to remember that EPMA data synchs are truly a 'push.' So if you are trying to push data to an application with a lower level of detail, or one that requires mapping, it is going to be a monumental pain in the butt. Maintaining a mapping using the Data Synch tools is a lot like maintaining a mapping using a partition. The unheralded blessing (or curse, depending on how you look at it) is that the data synch will not stop when it encounters an error trying to locate a member; it treats it a lot like a load rule would (member-not-found kickouts).
    The EPMA data synchronization documentation for Essbase apps is also incorrect, so you will sometimes have problems specifying syntax. I think there has been some work done on fixing the documentation on Oracle's end, but they are still working on it.
    I am sure that the next few releases of the tool will add lots of functions and useful commands, but right now it's still limited. Before I recommend this tool wholeheartedly, I would love to see level 0 members only, and other member set functions.
    -Matt
