Mining data residing in a different 9i DB server

Can I build a model on a 10g DB but mine data from a 9i DB server via a DB link?
Can I also redirect the mining results back to the 9i DB server?
If this method is not viable, are there any suggestions?

Hi Weng,
The DB link will work.
For going back, I suppose you could use a DB link going the other way:
you would update the results table locally and have the external DB pull the data through the link.
There are probably other techniques, but at least this keeps it simple. For example:
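A minimal sketch, assuming hypothetical link, table, and account names:

-- On the 10g instance: create a link to the 9i server and stage the remote data locally.
CREATE DATABASE LINK mine9i
  CONNECT TO scott IDENTIFIED BY tiger
  USING 'ORA9I_TNS';

CREATE TABLE mining_data AS
  SELECT * FROM customer_data@mine9i;

-- Build and apply the model against the local MINING_DATA table, keeping the
-- scores in a local table, e.g. MINING_RESULTS.

-- On the 9i server: pull the results back through a link that points at the 10g DB.
INSERT INTO mining_results_9i
  SELECT cust_id, prediction FROM mining_results@link_to_10g;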
Thanks, mark

Similar Messages

  • Trying to use FTP to get data from a different server

    Hi Friends,
    I have to use FTP to get data from a different server and upload it to the SAP server. Now my problem is that when I try FTP through the command line, it brings the file over but with no data.
    Through the ABAP program, nothing happens at all.
    Here's my code:
      V_PASSWORD = 'test@123'.
      V_PWD_LEN = STRLEN( V_PASSWORD ).
      CALL FUNCTION 'HTTP_SCRAMBLE'
        EXPORTING
          SOURCE      = V_PASSWORD
          SOURCELEN   = V_PWD_LEN
          KEY         = CS_KEY_500098
        IMPORTING
          DESTINATION = V_PASSWORD.
      CALL FUNCTION 'FTP_CONNECT'
        EXPORTING
          USER            = 'test'
          PASSWORD        = V_PASSWORD
          HOST            = '176.0.1.6'
          RFC_DESTINATION = 'SAPFTPA'
        IMPORTING
          HANDLE          = MI_HANDLE
        EXCEPTIONS
          NOT_CONNECTED   = 1
          OTHERS          = 2.
      CHECK SY-SUBRC = 0.
      CMD = 'lcd d:\ftp'.             " change local working directory
      PERFORM FTP_COMMAND USING CMD.
      CMD = 'asc'.                    " switch to ASCII transfer mode
      PERFORM FTP_COMMAND USING CMD.
      CONCATENATE 'dir' 'ftpt*' INTO CMD SEPARATED BY SPACE.
      PERFORM FTP_COMMAND USING CMD.
      CMD = 'ls'.                     " list the remote directory
      PERFORM FTP_COMMAND USING CMD.
      CONCATENATE 'mget' 'trial.txt' INTO CMD SEPARATED BY SPACE.
      CALL FUNCTION 'FTP_COMMAND'
        EXPORTING
          HANDLE        = MI_HANDLE
          COMMAND       = CMD
        TABLES
          DATA          = MTAB_DATA1
        EXCEPTIONS
          TCPIP_ERROR   = 1
          COMMAND_ERROR = 2
          DATA_ERROR    = 3
          OTHERS        = 4.
      IF SY-SUBRC = 0.
        LOOP AT MTAB_DATA1.
          WRITE: / MTAB_DATA1.
        ENDLOOP.
      ELSE.
        CONCATENATE 'Error in FTP Command while executing' CMD INTO ERROR SEPARATED BY SPACE.
        WRITE: / ERROR.
      ENDIF.

    Hi
    Try this... I did the same thing successfully in one of my requirements:
    FORM FTPCON.
    *--- FTP ------------------------------------------------------*
      CLEAR DSTLEN.
      SET EXTENDED CHECK OFF.
      DSTLEN = STRLEN( S_PWD ).   " S_PWD (password) is a selection screen field
      CALL FUNCTION 'HTTP_SCRAMBLE'
        EXPORTING
          SOURCE      = S_PWD
          SOURCELEN   = DSTLEN
          KEY         = KEY
        IMPORTING
          DESTINATION = S_PWD.
      CALL FUNCTION 'FTP_CONNECT'
        EXPORTING
          USER            = P_USER    " user name
          PASSWORD        = S_PWD     " password
          HOST            = P_HOST    " host
          RFC_DESTINATION = P_DEST    " RFC destination
        IMPORTING
          HANDLE          = HDL
        EXCEPTIONS
          NOT_CONNECTED   = 1
          OTHERS          = 2.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      CALL FUNCTION 'FTP_COMMAND'
        EXPORTING
          HANDLE        = HDL
          COMMAND       = 'set passive on'
        TABLES
          DATA          = RESULT
        EXCEPTIONS
          TCPIP_ERROR   = 1
          COMMAND_ERROR = 2
          DATA_ERROR    = 3.
      CALL FUNCTION 'FTP_R3_TO_SERVER'
        EXPORTING
          HANDLE         = HDL
          FNAME          = G_FCNAME
          CHARACTER_MODE = 'X'
        TABLES
          TEXT           = T_FILE1
        EXCEPTIONS
          TCPIP_ERROR    = 1
          COMMAND_ERROR  = 2
          DATA_ERROR     = 3
          OTHERS         = 4.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
             WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      CALL FUNCTION 'FTP_R3_TO_SERVER'
        EXPORTING
          HANDLE         = HDL
          FNAME          = G_FCNAME1
          CHARACTER_MODE = 'X'
        TABLES
          TEXT           = T_FILE2
        EXCEPTIONS
          TCPIP_ERROR    = 1
          COMMAND_ERROR  = 2
          DATA_ERROR     = 3
          OTHERS         = 4.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
             WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      CALL FUNCTION 'FTP_DISCONNECT'
        EXPORTING
          HANDLE = HDL.
      CALL FUNCTION 'RFC_CONNECTION_CLOSE'
          EXPORTING
            DESTINATION          = P_DEST
          EXCEPTIONS
            DESTINATION_NOT_OPEN = 1
            OTHERS               = 2.
        IF SY-SUBRC <> 0.
          MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                  WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ENDIF.
    ENDFORM.                    " FTPCON
    Hope it helps.....

  • Office 365 Streaming Notifications, "One or more subscriptions in the request reside on another Client Access server."

    Hello all,
    I am maintaining a part of our product that requires monitoring mailboxes for events. This is currently being done by using streaming connections to get the notifications. Our solution has been successful with smaller numbers of mailboxes, ~200 or less. However, we are seeing some issues when scaling up to, say, 5000 mailboxes.
    The error and the sequence leading up to it are as follows:
    Make an Exchange Service Account.
    exchSvc.ConnectionGroupName = someGroupName;
    add to the httpheaders ("X-AnchorMailbox", userSmtp) and ("X-PreferServerAffinity", "true");
    create a new impersonated UserId for the userSmtp address that is our anchor mailbox.
    set the Exchange Service account ImpersonatedUserID to the one we just made.
    ExchangeServiceAccount.SubscribeToStreamingNotifications(new FolderId[] { WellKnownFolderName.Inbox }, _mailEvents);
    Up to this point everything was successful; we saw no error messages.
    We then create a second impersonated UserId for a different mailbox and repeat the process above from that step forward. Upon the final step, subscribing to the streaming notifications, we get the error:
    Exception: Microsoft.Exchange.WebServices.Data.ServiceResponseException: One or more subscriptions in the request reside on another Client Access server. GetStreamingEvents won't proxy in the event of a batch request.
    This is only the second subscription that we are trying to add to this connection, and it is to a different mailbox than the first.
    Can anyone please help point me to where this is going wrong?

    >> Is there a good way to verify the number of subscriptions in a group?
    Not that I know of; you should be tracking this in your code. There are no server-side operations in EWS to even tell you whether there are active subscriptions on a mailbox.
    >>The error I am getting is on the second subscription in a new group, just after doing the anchor mailbox so I don't think we are hitting the 200 limit. 
    It's hard to say without seeing your code, but it sounds like there is a problem with your grouping code. One way to validate this: every request you make with the EWS Managed API carries a
    RequestId header, see http://blogs.msdn.com/b/exchangedev/archive/2012/06/18/exchange-web-services-managed-api-1-2-1-now-released.aspx
    You should be able to give that RequestId to the Office 365 support people, and they should be able to check the EWS log on the server and tell you more about what's happening (it may be a server-side bug). Something doesn't quite add up, in that the X-BackEndOverrideCookie
    is what ultimately determines which server the request ends up at, and the error is essentially telling you it's ending up at the wrong server (have you looked at the headers on the error message?). Is it always one group of users that fails? Have
    you tried different groups and different combinations, etc.?
    Cheers
    Glen

  • Systimestamp value in different timezone database servers

    Hi Experts,
    There is a SQL statement (select systimestamp from dual;).
    If we run the above SQL at the same moment on two database servers in different time zones (PST and EST), does it return different values from systimestamp?
    If it returns the same value from the different-timezone database servers, why?
    Please explain!
    Thanks
    Jim

    http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/functions173.htm#SQLRF06125
    SYSTIMESTAMP returns the system date, including fractional seconds and time zone, of the system on which the database resides. The return type is TIMESTAMP WITH TIME ZONE. SYSTIMESTAMP uses the database server OS's time zone:
    if the server is in PST, systimestamp will return PST;
    if the server is in EST, systimestamp will use EST.
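    For illustration, a hypothetical pair of outputs captured at the same instant (the values are made up):

    -- On a server whose OS time zone is PST:
    SELECT SYSTIMESTAMP FROM dual;
    -- 03-APR-13 09.45.00.123456 AM -08:00

    -- On a server whose OS time zone is EST:
    SELECT SYSTIMESTAMP FROM dual;
    -- 03-APR-13 12.45.00.123456 PM -05:00

    Both represent the same absolute moment; only the wall-clock value and the UTC offset differ.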

  • APEX Application accessing data from two different databases

    Hi All,
    As we all know, an APEX application resides in a database and is connected to a schema of that database.
    I want my APEX application to run while accessing data from two different databases. Elaborating my question:
    Currently, my APEX production application is connected to the XXXX schema of database DB1 (where APEX resides). Now I want to add some pages to this APEX application for report purposes, but I want these report pages to get their data from a different schema, YYYY, in database DB2.
    Is it possible to configure this scenario?
    The reason for doing this is to avoid the report-related (ad hoc query) resource utilization affecting the production DB1 database.
    Thanks
    Nil

    1. If you join two or more remote tables in DB1, all the data is pulled over to DB1 and the join is executed there: more data over the database link and more work for DB1. Better to keep the join where the data resides and pull over exactly the data you need.
    2. Don't know about your different block sizes. Seems a nice question for one of the other forums (DBA or SQL).
    3. I mean: create synonyms on DB1 for the report views in DB2, as sketched below.
    Hope all is clear!
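    A minimal sketch of point 3, assuming a hypothetical link db2_link and report view rpt_sales_v:

    -- On DB1: create a link to DB2, then a synonym for the remote report view.
    CREATE DATABASE LINK db2_link
      CONNECT TO yyyy IDENTIFIED BY secret
      USING 'DB2_TNS';

    CREATE SYNONYM rpt_sales_v FOR rpt_sales_v@db2_link;

    -- Report pages on DB1 can then simply query:
    SELECT * FROM rpt_sales_v;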

  • Unable to load data into any application.database on server.

    I have a rather odd problem that's been vexing me for a few days.
    I am unable to do a data import into any cubes within Essbase; it's as if the cube is in read-only mode, though everything seems OK. I'm not running any sort of archiving, and to check manually, within essmsh, I did an "alter database end archive". I've tried with our existing and unchanged data load script as well as from within EAS (right-click Data Load). If I check the processes running on the server (a Linux server), the ESSSVR process is the top process, eating 100% CPU. I can do outline builds OK.
    From within the application log file, the last few entries are:
    [Wed Apr  3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021044)
    Starting to execute query
    [Wed Apr  3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021045)
    Finished executing query, and started to fetch records
    [Wed Apr  3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021000)
    Connection With SQL Database Server is Established
    [Wed Apr  3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1003040)
    Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
    [Wed Apr  3 09:45:19 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021047)
    Finished fetching data
    Nothing has changed on this server for a few weeks so I'm somewhat flummoxed and no other errors show in any other logs ( nohup.out ).
    Essbase 11.1.1.2
    Redhat LINUX_x64 2.6

    Yes, I tried rebooting the whole box; the server has 9Gb free space. Same story when I load just 1 row via a text file in EAS: it seems to hang and the ESSSVR process for that cube goes to 100%.
    I've tried a few different cubes, and they all have the same problem, so I suspect it's something to do with Essbase itself rather than the specific cube I'm having problems with.
    This is our test server I'm experiencing the problem with, and I tried migrating 1 app from UAT back to test, which still didn't work.
    Stumped!

  • Multiple FTP locations on different drives in Server

    I've been working to set up an older Power Mac as a location to back up my ReadyNAS. The ReadyNAS has an automated function for FTP backup to an FTP location on my network.
    I've put two 1.5TB drives in the OS X server. I have about 1.7TB of data on the ReadyNAS. I had hoped to backup about half of that to each of the drives in the Power Mac Server.
    However, I can't figure out how to set up 2 FTP Share Point locations, one on each of the drives, so that I can set the ReadyNAS to back up some of its directories to one drive and other directories to the other.
    The data is actually mostly static. Once I do the initial backup the incremental ones (what's changed) will be relatively small.
    Does anyone know how to set up multiple FTP share points on different OS X Server drives?
    Thank you in advance for any enlightenment.

    Thank you for the quick response. My problem is how to have the system "know" which backup job it is and switch (without user intervention) to the right Share Point and drive.
    I have 6 shares on the NAS with ~760GB and would like to have those backed up to Drive 1. Then when the job starts with the 1 share with 800GB I want that to back up to Drive 2.
    That's what I can't figure out how to do automatically. It seems like you have to manually go in and change the Share Point and then manually start the job.
    Is there a way to do that in OS X Server without me getting involved?
    Thanks again,
    BflatBlues

  • Can I populate a text field in one PDF with the modification date of a different file?

    Rather convoluted problem here but I'm trying to place a text field on a PDF document that serves as the main menu page for a library of interlinked PDF documents
    The complete library contains over 7,000 files and additions are added and documents changed almost daily.
    We currently use batch files to open the main menu from its shortcut, and this runs a check on a sync log file (.txt) to ascertain when the user last synchronised with the server to get the latest copy of the files.
    Within a certain time range they are told how long it has been since they synced and are offered the option to sync before opening; after a prescribed timeframe they cannot enter without synchronising. We use Allways Sync to conduct the file synchronisation with our mother files on our server.
    What I'd like to do is take advantage of Allways Sync's automatic synchronisation options to synchronise on log-on and at prescribed idle periods thereafter.
    This works fine, but I'd like the text field on the main menu PDF to say when the last synchronisation took place - easy if the main menu is the last file modified... just use info.modDate.
    However, the main menu is rarely modified, therefore I wish to import the text to populate the text field from a different file... either by interrogating the other file's modification date (though I doubt Acrobat can do this) or by simply importing some text stored in another file (a .txt file?) which has previously been created by batch file commands.
    Any assistance would be greatly appreciated.
    Regards,
    Nifty Styles
    (Norfolk, England)
    P.S.  I'm using Acrobat 8.3.1. Professional on Windows XP (SP3).

    Thank you for all your help above.
    Just to confirm your advice, am I right with the following conclusions? :
    1. The script (function) to fill the text field with the modification date of a different PDF file needs to be stored in a folder level .js file.
    2. The document containing the text field needs to call the .js function either within the document script or within the custom script property of the text field itself.
    Further to that, can you just advise on the syntax for accessing the modification date of the other document?
    Do I need to assign a variable to the address of the file to be used, and then use this variable in the text-field-filling script (as below), or can I use a direct file reference for the .modDate command?
    var LastSync = "C:\sync\bin\lastsync.pdf";
    var strMsg = util.printd("h:MM tt",LastSync.modDate) + " on ";
    strMsg += util.printd("dddd, d mmmm, yyyy",LastSync.modDate);
    this.getField("LastSyncDate").value = strMsg;
    If the syntax is totally different to the above I would be very grateful for some guidance in the right direction.
    I much appreciate your time to help me ... I'm almost there.
    Kind Regards,
    Nifty

  • Load Data from a table on one server's database, to the same table structure in multiple server databases

    Hi,
    I have a situation where I have to load data from one server/database table to multiple servers/databases.
    Example:
    I need to load data from dbo.TABLE_A  (on Server: Server_A & Database: Database_A)  to the same table on the list of server databases like
    Server: Server_B , Database: Database_B
    Server: Server_C , Database: Database_C
    Server: Server_D , Database: Database_D
    Server: Server_E , Database: Database_E
    Server: Server_F , Database: Database_F
    Server: Server_G , Database: Database_G
    Server: Server_H , Database: Database_H
    and so on and so forth, across 250 such server/database combinations.
    The table structure is the same on all the servers.
    If I make the source or destination dynamic, it throws an error while mapping.
    I cannot get linked-server permissions, and the SQL Server Config approach doesn't work either.
    Please suggest how to load data from one source to multiple servers/databases.
    Thank you.

    I just need to transfer one table's data. I have to use a query to pick the most recent data, so I use something like: select A, B, C, D from dbo.table where ETL_TIMESTAMP > (the max(etl_timestamp) in the destination on the other server). There are no foreign key relationships, and the data should not be truncated; it just has
    to append the new records.
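    A minimal sketch of that incremental append in T-SQL (TABLE_A and the columns are from the post; the variable and floor date are hypothetical, and in SSIS these would typically be separate steps executed once per destination):

    -- Step 1: on the destination, read the high-water mark
    -- (COALESCE to a floor date in case the destination is empty).
    DECLARE @max_ts DATETIME;
    SELECT @max_ts = COALESCE(MAX(ETL_TIMESTAMP), '19000101') FROM dbo.TABLE_A;

    -- Step 2: on the source, pull only the newer rows.
    SELECT A, B, C, D
    FROM dbo.TABLE_A
    WHERE ETL_TIMESTAMP > @max_ts;

    -- Step 3: append the result set to the destination table (no truncation).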

  • How can I look up an EJB residing on a different machine from the client side?

    Hi,
    How can I look up an EJB residing on a different machine from the client side?
    Here is my code. I don't know what I should use as the initial context factory. I am using a Sun app server 8.
    package com.parx.lms.lmsdelegate;

    import com.parx.lms.exception.LMSException;
    import javax.naming.Context;
    import javax.ejb.CreateException;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.rmi.PortableRemoteObject;
    import java.rmi.RemoteException;
    import com.parx.lms.controller.*;
    import com.parx.lms.vo.UserVo;
    import com.parx.lms.exception.BusinessException;
    import java.util.Hashtable;

    public class LmsDelegate {
        private final static String JNDI_NAME = "LmsBean";
        private static String url = "http://localhost:4848";
        public static Lms lms = null;

        public void getController() throws CreateException, NamingException, RemoteException {
            if (lms == null) {
                Hashtable h = new Hashtable();
                h.put(Context.INITIAL_CONTEXT_FACTORY, " ********what should i use here???????*************************");
                h.put(Context.PROVIDER_URL, url);
                System.out.println("Before Loading Context in Delegate");
                Context ctx = new InitialContext(h);
                System.out.println("Loaded Context in Delegate");
                Object obj = ctx.lookup(JNDI_NAME);
                System.out.println("Loaded Object in Delegate");
                System.out.println("Before Loading Home in Delegate");
                LmsHome home = (LmsHome) PortableRemoteObject.narrow(obj, com.parx.lms.controller.LmsHome.class);
                System.out.println("Loaded Home in Delegate");
                lms = home.create();
                System.out.println("Loaded remote in Delegate");
            }
        }

        public void addUserDelegate(UserVo vo) throws BusinessException {
            try {
                getController();
                System.out.println("Before calling the addUser of Session");
                lms.addUser(vo);
            } catch (CreateException e) {
                System.out.println("Create Exception in Delegate due to--->" + e);
                e.printStackTrace();
                throw new BusinessException(e);
            } catch (NamingException e) {
                System.out.println("Naming Exception in Delegate due to--->" + e);
                e.printStackTrace();
                throw new BusinessException(e);
            } catch (RemoteException e) {
                System.out.println("Remote Exception in Delegate due to--->" + e);
                e.printStackTrace();
                throw new BusinessException(e);
            } catch (LMSException e) {
                System.out.println("duplicate user name--->" + e);
                e.printStackTrace();
                throw new BusinessException(e);
            }
        }
    }

    Please help me.

    h.put(Context.INITIAL_CONTEXT_FACTORY," ********what should i use here???????*************************")
    Each app server provides its own JNDI factory class. For example, for WebLogic it is weblogic.jndi.WLInitialContextFactory. Since you are using the Sun app server, check the docs or examples to find out.
    private static String url="http://localhost:4848
    Since the client is on a different machine, localhost is not going to work here. Provide the URL or machine name of the system on which your Sun app server is running. In addition, you will need the stubs of the remote interfaces on your client machine.

  • Load one text file with 12 periods' data into 12 different periods at once?

    Hi guys,
    In one swoop, can we load one .txt file with 12 periods of data into 12 different periods?
    The scenario:
    Budget data is required for monthly comparative reporting with actuals, so we have 12 periods in our Budget version.
    From a non-SAP system we get one .txt file containing 12 periods worth of budget data,
    - it is in the correct format (and we don't want to create risk by opening and editing it) so it is loaded into each period and the other 11 periods of irrelevant data are obviously ignored.
    Some extra tasks (such as validation, cashflow calculations) are then performed per period.
    Because this has to be repeated 12 times, it can be a time-consuming process.
    We now have multiperiod monitor functionality (from EHP2) and I see how it works for the automatic tasks (very well).
    I'm aware that the guide says that manual tasks will be ignored during the automatic run, and this is true.
    However, if I remember correctly, EC-CS used to allow it with upload files, so I was expecting it in BCS 6.02
    - is there any way to load a file containing 12 periods' worth of data into 12 individual periods all at once?
    (NB we still have an improvement on the previous situation: the user can scroll between periods more quickly and load the file 12 times, then go back to the start and run all auto tasks at once.)
    One thought was to use a file server location with a hardcoded filename, but this would require work beyond my expertise.
    All suggestions welcome

    Hi
    I will suggest that mapping the path based on a common location is the best way, and then map the same logically in the upload method.
    If you are using Citrix then it becomes all the easier to have a common location.
    Rgds
    Dheeraj

  • Stage tab delimited CSV file and load the data into a different table

    Hi,
    I'm pretty new to writing PL/SQL packages.
    We are using Application Express for our development. We get CSV files, which are stored as BLOB content in a table. I need to write a trigger that executes once the user uploads the file, parses through the BLOB content, and uploads or stages the data in a different table.
    I would like to see if there is any tutorial or article that explains the above process, with example or sample code. Any help in this regard will be highly appreciated.

    Hi,
    This is slightly unusual but at the same time easy to solve. You can read through a BLOB using the dbms_lob package, which is one of the Oracle-supplied packages. This is presumably the bit you are missing; once you know how to read a LOB, the rest is programming 101.
    Alternatively, you could write the lob out to a file on the server using another built in package called utl_file. This file can be parsed using an appropriately defined external table. External tables are the easiest way of reading data from flat files, including csv.
    I say unusual because why are you loading a csv file into a blob? A clob is almost understandable but if you can load into a column in a table why not skip this bit and just load the data as it comes in straight into the right table?
    All of what I have described is documented functionality, assuming you are on 9i or greater. But you didn't provide a version so I can't provide a link to the documentation ;)
    HTH
    Chris
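    As a minimal sketch of the dbms_lob route Chris describes (table, column, and bind names are hypothetical; a real version must also handle rows that span chunk boundaries and multi-byte character sets):

    DECLARE
      l_blob   BLOB;
      l_len    PLS_INTEGER;
      l_pos    PLS_INTEGER := 1;
      l_amount PLS_INTEGER;
      l_raw    RAW(32767);
      l_text   VARCHAR2(32767);
    BEGIN
      SELECT file_content INTO l_blob
        FROM uploaded_files
       WHERE id = :file_id;

      l_len := DBMS_LOB.GETLENGTH(l_blob);
      WHILE l_pos <= l_len LOOP
        l_amount := LEAST(32767, l_len - l_pos + 1);
        DBMS_LOB.READ(l_blob, l_amount, l_pos, l_raw);   -- read one chunk
        l_text := UTL_RAW.CAST_TO_VARCHAR2(l_raw);       -- bytes -> text
        -- split l_text on line terminators and commas here, then
        -- INSERT INTO staging_table (...) VALUES (...) for each parsed row
        l_pos := l_pos + l_amount;
      END LOOP;
    END;
    /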

  • Accessing data from 2 different servers

    I have Oracle data on 2 different servers. I want to access the data, retrieve it
    through a SQL query, and combine the data retrieved. How do I retrieve the data
    on the 2 servers?
    thanks

    Try it like this:
    SQL> select name from v$database;
    NAME
    DBDEMO
    SQL> create user testing identified by passwd;
    User created.
    SQL> grant connect,resource to testing;
    Grant succeeded.
    SQL> conn testing/passwd;
    Connected.
    SQL> create table test_tb (id number);
    Table created.
    SQL> insert into test_tb values(123);
    1 row created.
    SQL> /
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select count(*) from test_tb;
      COUNT(*)
             2
    SQL>
    On the machine where the DB link is created:
    SQL> select name from v$database;
    NAME
    DB2
    SQL> create user nic identified by nic;
    User created.
    SQL> grant connect,resource, create database link to nic;
    Grant succeeded.
    SQL> conn nic/nic;
    Connected.
    SQL> create database link test_db_link connect to testing identified by passwd using '(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.1)(PORT = 1521)) (CONNECT_DATA = (SERVER = DEDICATED) (SERVICE_NAME = dbdemo)))
      2  ';
    Database link created.
    SQL> select count(*) from test_tb@test_db_link;
      COUNT(*)
             2
    SQL>
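    Once the link exists, combining the data is a single query; a hypothetical example (LOCAL_TB stands for a table on the local server):

    SQL> select id from local_tb
      2  union all
      3  select id from test_tb@test_db_link;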

  • iPhone Nike+ Sync - Data Could Not Be Validated By Server

    Hi
    I have been using my iPhone 3GS with cardio equipment at my local gym (TechnoGym Vario Series) which has built in docks that among other features, store workout data to the iPhone to sync with nike+. This had been working perfectly since I started using it last month.
    In the past week I have been unable to sync any of my past 6 sessions to nike+ because I get the following error message:
    "Your workout data could not be sent to nike.com because the data could not be validated by the server."
    I understand how to access the xml files on the device and force a re-synchronisation by moving them to the "latest" folder, but the error message persists and none of my past 6 workout sessions will sync with nike+.
    I have tried to sync the data on 2 different machines, one Windows 7 64-bit and the other Windows XP 32-bit, but exactly the same error occurs. Both are running iTunes 9, which I suspect may be what has broken this feature.
    I really hope I can find a solution to this problem and would be grateful for any help. I am happy to provide any information required including copies of the xml files affected.
    Many thanks.

    Hi
    I have had this response from nike plus and this cleared the error when uploading my runs.
    Sometimes it happens that the iPod registers the runs as already uploaded, after an unsuccessful upload.
    Then the only way to upload them again is the following.
    First you need to enable viewing hidden files (Control Panel > Folder Options > enable viewing of hidden files).
    1. Dock your iPod nano and allow iTunes to launch. Select your iPod from the device tree on the left-hand side of the screen in iTunes.
    2. On the Summary page, make sure that "Enable disk use" is selected.
    3. From your Start menu, navigate to Start > All Programs > Accessories > Windows Explorer. (Please note that this program is not the same as Internet Explorer, the web browser.)
    4. In Windows Explorer, click Tools > Folder Options and select the View tab. Make sure that "View hidden files and folders" is selected.
    5. Find your iPod nano in the device list under "My Computer." It will appear as an external drive; mine, for example, is Drive E, and it's labeled "Your iPod." Yours should have a similar name. Click on it.
    6. Navigate your iPod nano's folders as follows: iPod_Control > Device > Trainer > Workouts > Empeds. In the Empeds folder, you'll see one or more subfolders with alphanumeric filenames. These filenames correspond to your Nike+ sensors. Explore the folders until you locate the one containing the runs you'd like to manually re-upload. They'll probably be in the "synched" folder within your most recent sensor's folder.
    Important: you need to delete the settings.plist file in the sensor folder, which stores your login information, to make sure that next time you will upload to the correct profile.
    7. Drag and drop each run folder you'd like to manually re-upload from "synched" to "latest."
    8. Eject your iPod nano and close iTunes.
    9. Dock your iPod nano and allow iTunes to launch. Your run data should upload
    Hope this helps.
    C

  • Use of SAP memory to transfer data between two different sessions.

    Hello experts,
    I wish to know how to use SAP memory to transfer data between two different sessions.
    The scenario is that when I run a report and change a variable, the value of the changed variable should be available to another user on another terminal.
    Thanks & Regards!
    Sumit

    Hello,
    Just to add to what Max has already mentioned: EXPORT TO DATABASE / IMPORT FROM DATABASE statements can be used to store data in special "cluster" tables (you can't use just any DDIC table), e.g., INDX.
    @OP: You can opt for Shared Memory (SHM) for this specific requirement as well. In my opinion SHM is a bit tricky to code, but it is easier to monitor. The opposite holds true for "data clusters".
    You should remember that SHM is app-server specific. So if you have a load-balancing scenario, using SHM can cause problems.
    Hope i'm clear.
    BR,
    Suhas
    Edited by: Suhas Saha on Nov 19, 2010 4:12 PM
