Where is my To Do data from 10.6 after upgrading to Mavericks 10.9?

I would be happy if I could retrieve the data from a backup and use it in Notes or Reminders, or even a Word document.
New question: my cheap LG phone used to connect via USB, but now I can't see it in the Finder. Is there something I am missing?
Thank You
Albert

Hi Eric, thanks for the reply. I tried Disk Utility and it did not show up. The phone says it is connecting to mass storage, but I cannot find it.
Albert

Similar Messages

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have the below setup in our system:
    1. A planning BEx query with which the user makes certain entries and writes data back to the planning cube.
    2. An actual reporting cube which gets data from the planning cube above.
    Now, what we want to do is to automate the data load from the planning cube to the reporting cube.
    This involves two things:
    1. Change the setting "Change real-time load behaviour" of the planning cube to Planning.
    2. Trigger the DTP which loads data from the planning cube to the reporting cube.
    We want to automate the above two steps.
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if that is the correct place to provide the event details).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
    4. Created a process chain where we have used the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    I hoped this would trigger the event as soon as a new request arrives in the planning cube, which in turn would trigger the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is being triggered, and even if it is, I am not sure whether it will start the process chain automatically. Any ideas, please?

    Hi,
    Try to do the transformation directly in the input cube by using a CR of type exit. More details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    Hope it helps.
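    If you stay with the event-based approach, the only custom code you really need is something that raises the SM64 event once the new planning request has arrived; the process chain that uses this event as its start variant then takes over (switching the load behaviour and running the DTP as its own steps). A minimal sketch, assuming a background event named Z_PLAN_TO_REP maintained in SM64 (the name is made up; replace it with yours):
    REPORT z_raise_load_event.
    * Raise the background event that the process chain's start variant waits for.
    * BP_EVENT_RAISE is the standard function module for raising SM64 events.
    CONSTANTS gc_event TYPE btceventid VALUE 'Z_PLAN_TO_REP'.
    CALL FUNCTION 'BP_EVENT_RAISE'
      EXPORTING
        eventid = gc_event
      EXCEPTIONS
        OTHERS  = 1.
    IF sy-subrc <> 0.
      WRITE: / 'Could not raise event', gc_event.
    ENDIF.
    Schedule or call this report from whatever step closes the planning request; the "Subsequent Processing" setting you already found is one place, a small step at the end of the planning sequence is another.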

  • HT3231 What cables are necessary for the fastest transfer of data from MacBook Pro to MacBook Air

    What cables are necessary for the fastest transfer of data from MacBook Pro to MacBook Air

    As I posted, it is unknown, at least to me, what type of FireWire port is on the OP's older Mac.
    LowLuster (Dec 26, 2013, in response to OGELTHORPE):
    But the newest Mac notebooks do not have a FireWire port.
    So he may need a Thunderbolt to FireWire adapter, $29, a FW 400 or 800 cable, $??, and then a FW 400 to 800 cable/port adapter (if the old Mac only has a FW 400 port), $??.
    It is unknown whether the old Mac has a FW 400 or 800 port.

  • How can we improve the performance while fetching data from the RESB table?

    Hi All,
    Can anybody suggest the right way to improve performance while fetching data from the RESB table? Below is the select statement.
    SELECT aufnr posnr roms1 roanz
        INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
        FROM resb
        WHERE kdauf  = p_vbeln
        AND   ablad  = itab-sposnr+2.
    Here I am using KDAUF and ABLAD in the condition. Can we use a secondary index to improve the performance in this case?
    Regards,
    Himanshu

    Hi,
    Declare an internal table with only those four fields and try the code below (the SELECT stays inside your loop over itab):
    * itab2: internal table with only AUFNR, POSNR, ROMS1 and ROANZ
    SELECT aufnr posnr roms1 roanz
      APPENDING TABLE itab2
      FROM resb
      WHERE kdauf = p_vbeln
        AND ablad = itab-sposnr+2.
    Yes, you can also use a secondary index on KDAUF and ABLAD to improve the performance in this case.
    Regards,
    Anand .
    Reward if it is useful....
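    If itab is large, a common alternative is to hit RESB only once instead of selecting inside the loop. A rough sketch, assuming itab and p_vbeln are declared as in the original code (the helper types and table names below are made up for illustration):
    TYPES: BEGIN OF ty_key,
             ablad TYPE resb-ablad,
           END OF ty_key,
           BEGIN OF ty_resb,
             aufnr TYPE resb-aufnr,
             posnr TYPE resb-posnr,
             roms1 TYPE resb-roms1,
             roanz TYPE resb-roanz,
           END OF ty_resb.
    DATA: lt_keys TYPE STANDARD TABLE OF ty_key,
          ls_key  TYPE ty_key,
          lt_resb TYPE STANDARD TABLE OF ty_resb.
    * Collect the ABLAD values once instead of reading RESB per row of itab
    LOOP AT itab.
      ls_key-ablad = itab-sposnr+2.
      APPEND ls_key TO lt_keys.
    ENDLOOP.
    SORT lt_keys BY ablad.
    DELETE ADJACENT DUPLICATES FROM lt_keys COMPARING ablad.
    IF lt_keys IS NOT INITIAL.
      SELECT aufnr posnr roms1 roanz
        FROM resb
        INTO TABLE lt_resb
        FOR ALL ENTRIES IN lt_keys
        WHERE kdauf = p_vbeln
          AND ablad = lt_keys-ablad.
    ENDIF.
    Combined with a secondary index on KDAUF and ABLAD, this touches the table once with a compact key list instead of once per itab row.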

  • What are the ways to download data from an ODS?

    Hi all,
    What are the ways to download data from an ODS?
      1. SE11/SE12
      2. LISTCUBE
      3. InfoSpokes
    Apart from the above three, is there any other way to download data from my ODS?
    I need to download around 90 fields from my ODS to a flat file, but LISTCUBE doesn't allow me to select all the fields. So I used SE11 to download the data from my ODS, but SE12 still doesn't allow me to download all 90 fields; it downloads only 40 fields.
    Can anyone suggest how to select all 90 fields and download them to an Excel sheet?
    Thanks
    Haritha

    Hi Haritha,
    Go to transaction SE16, give your ODS active table name, and set the width of the output list to 1023.
    Now run the transaction to see your data, then click Settings --> User Parameters and select ALV Grid Display.
    You should now see an Excel icon on top; click on it, then select Table, then
    Microsoft Excel, and it will open your data with all the columns you want.
    I just tried it for 213 columns.
    Hope this helps.
    Thanks
    CK
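    If the frontend route gets awkward, another option is a tiny custom report that reads the ODS active table and writes a tab-separated file. A sketch only, assuming an ODS whose active table is /BIC/AZSALES00 (a made-up name; the active table of an ODS XYZ normally sits in /BIC/AXYZ00) and that you run it in dialog:
    REPORT z_download_ods.
    * Read the ODS active table (replace /BIC/AZSALES00 with your own)
    DATA lt_data TYPE STANDARD TABLE OF /bic/azsales00.
    SELECT * FROM /bic/azsales00 INTO TABLE lt_data UP TO 100000 ROWS.
    * Download as a tab-separated file that Excel can open
    CALL METHOD cl_gui_frontend_services=>gui_download
      EXPORTING
        filename = 'C:\temp\ods_data.txt'
        filetype = 'DAT'
      CHANGING
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      WRITE: / 'Download failed.'.
    ENDIF.
    All 90 fields come along because the whole active-table row is selected; the UP TO clause is only a safety cap you can remove.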

  • The attempt to read data from the server failed

    Today I was looking into an error with several IMAP accounts in Apple Mail:
    The attempt to read data from the server "<<servername.tld>>" failed.
    At first I thought this was an Apple Mail problem, as the accounts in question seemed to work just fine when not used in combination. One possible answer to this problem is rather short:
    Apple Mail uses IMAP caching, which uses more than 4 connections at the same time to the mail server. Some mail servers (like courier-imap in its default configuration) do not allow that many connections from the same IP address. The more accounts you try to connect to at the same time, the higher this number of connections becomes. That means that while you could probably check one account for new mail, the second will ultimately fail for no obvious reason. The only solution to this problem is to raise the number of connections allowed by your IMAP server software. This solution only applies to people who have root access to their mail server.
    In courier-imap you have to edit /etc/courier-imap/imapd
    and change MAXPERIP=4 to a higher number (5 to 10 times the number of accounts you want to check simultaneously),
    and change MAXDAEMONS=40 to a higher number (with only one user 200 might work, whereas if you serve multiple users, something like 500 or higher might be better suited).
    Of course, increasing the numbers increases the load on your server; this is why these restrictions are in place.

    MY SOLUTION REPOSTED FROM ANOTHER THREAD:
    I've just solved a similar issue.
    I have a dedicated server running Plesk 9.5, and when I upgraded to iLife 11 and Snow Leopard this error appeared. I could quickly click "Get Mail" and I'd get all my mail, but only 3-4 of my 9 mail accounts would connect. The others would have the error:
    "The server error encountered was: The attempt to read data from the server..."
    I found solutions for those using IMAP mail:
    modify the /etc/courier-imap/imapd configuration file and change MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40. This allows all the machines behind my home firewall to connect to multiple accounts on the e-mail server with mailbox caching enabled.
    I'd made this change on my server, but it didn't seem to have any effect. It dawned on me that I'm using POP, not IMAP. So I found the same settings in /etc/courier-imap/pop3d. I changed MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40 and, voila, all my connections worked concurrently.
    This has taken me more than two days to fix and I hope posting this helps someone else with the same issue.

  • Server error: "The attempt to read data from the server '(null)' failed"

    Multiple times during each day my client (Mail.app) puts up a little exclamation mark "!" next to the mail account hosted on our Leopard Server. Clicking on this little alert icon pops up a message that reads:
    There may be a problem with the mail server or network. Verify the settings for account “Leopard Server Account” or try again.
    The server returned the error: The attempt to read data from the server “(null)” failed.
    I can make the "!" go away by choosing Mailbox>Synchronize>Leopard Server Account. And everything seems peachy but it inevitably pops up again in another hour or two. It's annoying because I'm not sure if mail is getting through or not when the "!" is up.
    Any ideas why this is happening?

    MY SOLUTION REPOSTED FROM ANOTHER THREAD:
    I've just solved a similar issue.
    I have a dedicated server running Plesk 9.5, and when I upgraded to iLife 11 and Snow Leopard this error appeared. I could quickly click "Get Mail" and I'd get all my mail, but only 3-4 of my 9 mail accounts would connect. The others would have the error:
    "The server error encountered was: The attempt to read data from the server..."
    I found solutions for those using IMAP mail:
    modify the /etc/courier-imap/imapd configuration file and change MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40. This allows all the machines behind my home firewall to connect to multiple accounts on the e-mail server with mailbox caching enabled.
    I'd made this change on my server, but it didn't seem to have any effect. It dawned on me that I'm using POP, not IMAP. So I found the same settings in /etc/courier-imap/pop3d. I changed MAXDAEMONS from 40 to 80 and MAXPERIP from 4 to 40 and, voila, all my connections worked concurrently.
    This has taken me more than two days to fix and I hope posting this helps someone else with the same issue.

  • Why won't my wife's new iPad 2 upload the "contacts" and "calendar" data from Outlook when I sync it? Her iPhone does it just fine.

    Why won't my wife's new iPad 2 upload the "contacts" and "calendar" data from Outlook when I sync it? Her iPhone does it just fine.

    Hi! Check out the answers given here:
    https://discussions.apple.com/message/15387968#15387968

  • Which t-code can I use to mass-copy G/L master data from a company code?

    Hi All,
    Could anyone show me the customizing transaction to mass-copy G/L master data from one company code to another?
    Thanks
    Moderator: Please search before posting.

    Hi Rossi,
    You can do this through transaction code OB_GLACC01. Please check it and let us know if you run into any issues.
    For more information, check the following area:
    SPRO > G/L Accounting > G/L Accounts > Master Data > Preparation > G/L Account Creation and Processing > Change G/L Accounts Collectively.
    Warm regards,
    Murukan Arunachalam
    Edited by: Murukan_A on Jan 23, 2012 8:47 PM

  • What are the IDocs to send data from SAP HR to SAP FI?

    What are the IDocs to send data from SAP HR to SAP FI?

    The message type of the IDoc depends on the data you wish to send.
    Please give details of the data that you need to send in the IDocs.
    regards,
    Nitin

  • Fetch data from a table after a given time interval

    Hello
    I want to fetch data from a table after a fixed time interval. Is there any way to do it?
    thanks

    Not sure what you want, but maybe this helps:
    SELECT date+interval FROM atable;
    Also, if you tell us more about the business case, the answers could be more to the point.

  • How to restore files from Time Machine after Clean Install of Mavericks

    How do I restore files from Time Machine after a clean install of Mavericks? I know the data is there, but it seems inaccessible because I may have changed the computer or owner name. Is there any way to fix that now?

    Hi Linc,
    my backup data is managed by Time Machine; the actual data is on a second internal hard drive with a capacity of 1 TB, not partitioned. I am using Time Machine and have access to all data backed up since the Mavericks clean install, but not to the data prior to that date.
    When I open the 1 TB drive in the Finder I can see a folder "Backups.backupdb/Sigi's Mac Pro/" followed by many folders of backup dates going back to 2010-09-20-103441 and up to 2014-02-07-142414, all followed by a folder Macintosh HD. February 7th, 2014 was the date of the clean install.
    These backups are followed by the backup date folder 2014-02-08-075554 with a subfolder MacPro-320GB (this is the name I assigned to my boot drive during formatting; I suspect I should have assigned the same name as before, i.e. Macintosh HD) and 2014-02-11 with a subfolder Macintosh HD; these are the ones I get access to by way of Time Machine > Restore.
    The weird thing is that on February 11, when I realised my possible mistake, I renamed the boot drive back to what it was initially, i.e. Macintosh HD. Despite this I have access to all backup data since the clean install in both folders, MacPro-320GB as well as Macintosh HD, but not to any data prior to the clean install.
    Is there something that Time Machine knows that prevents access, or is it simply a matter of renaming the subfolder 2014-02-08-075554/MacPro-320GB to .../Macintosh HD?
    I'd like to restore selectively and not everything. It was hard work reinstalling the apps that were supposed to be the troublemakers (ref. my discussion on Mavericks problems), but I now need to get at data specific to some of these apps as well as other data I may have missed. Long story? Yes, and sorry.
    Sigi

  • How to find the table in which data from a structure sits

    Hi,
    I want to know how to find the exact table in which data that sits in various structures at runtime is stored.
    For instance, in ME23N we have various tabs, and the data in those is held in various structures. We can see this by checking the technical settings of each field.
    I want to know in which table the data is actually stored for each field and how to find them.
    Are there any means other than using the "where-used" option?
    Thanks
    CM

    After checking the technical settings of the field from the screen, when you reach the structure, you can double-click on the particular field's data element. From this data element you can find out in which tables it is used. Also, if the data element refers to some master data field, then you can check its domain, and in the domain you can refer to the value table for that domain. This is what I do if I am not sure about anything.
    Hope it will help a little.
    Jignesh.
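    If you prefer to query this rather than click through the where-used list, a quick-and-dirty report can read the dictionary tables directly. A sketch only; MATNR is used purely as an example data element:
    REPORT z_tables_for_data_element.
    PARAMETERS p_roll TYPE rollname DEFAULT 'MATNR'.   " data element to look up
    TYPES: BEGIN OF ty_hit,
             tabname   TYPE dd03l-tabname,
             fieldname TYPE dd03l-fieldname,
           END OF ty_hit.
    DATA: lt_hits TYPE STANDARD TABLE OF ty_hit,
          ls_hit  TYPE ty_hit.
    * DD03L holds table fields, DD02L the table headers; TABCLASS = 'TRANSP'
    * keeps only transparent tables (real database tables, not structures).
    SELECT f~tabname f~fieldname
      FROM dd03l AS f
      INNER JOIN dd02l AS t
        ON t~tabname = f~tabname
      INTO TABLE lt_hits
      WHERE f~rollname = p_roll
        AND t~tabclass = 'TRANSP'.
    LOOP AT lt_hits INTO ls_hit.
      WRITE: / ls_hit-tabname, ls_hit-fieldname.
    ENDLOOP.
    For a field on the ME23N screen you would first note its data element from the technical settings (as Jignesh describes) and then run the report with that data element.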

  • Unable to connect to the server to pull data from MySQL

    Hello,
    I am a novice working with Flash Builder 4 and I just created a test application which runs well on my computer, pulling data from MySQL using PHP and populating a datagrid. But when I transferred it to my hosting provider, it failed. I have been making some modifications to gateway.php and amf_config.ini to solve some of the issues. Now the application tries to run but doesn't populate the data in the datagrid. I included a tracking point in my data service file to read the connection variables, but they come up blank. I highly appreciate any help. Here are my gateway.php, amf_config.ini and the data service.
    gateway.php
    <?php
    ini_set("display_errors", 1);
    $dir = dirname(__FILE__);
    $webroot = $_SERVER['DOCUMENT_ROOT'];
    $configfile = "$dir/amf_config.ini";
    $fp = fopen("tracking.txt", "a");
    fwrite($fp, "1-config file " . $configfile . "\r\n");
    // default zend install directory
    $zenddir = $webroot . '/ZendFramework/library';
    //-$zenddir = $webroot;
    fwrite($fp, "2-default zenddir " . $zenddir . "\r\n");
    // Load ini file and locate zend directory
    if (file_exists($configfile)) {
        $arr = parse_ini_file($configfile, true);
        if (isset($arr['zend']['webroot'])) {
            $webroot = $arr['zend']['webroot'];
            $zenddir = $webroot . '/ZendFramework/library';
        }
        if (isset($arr['zend']['zend_path'])) {
            $zenddir = $arr['zend']['zend_path'];
        }
    }
    fwrite($fp, "3-after zenddir " . $zenddir . "\r\n");
    // Setup include path: add zend directory to include path
    set_include_path(get_include_path() . PATH_SEPARATOR . $zenddir);
    // Initialize Zend Framework loader
    require_once 'Zend/Loader/Autoloader.php';
    //-require_once 'Autoloader.php';
    Zend_Loader_Autoloader::getInstance();
    // Load configuration
    $default_config = new Zend_Config(array("production" => false), true);
    $default_config->merge(new Zend_Config_Ini($configfile, 'zendamf'));
    $default_config->setReadOnly();
    $amf = $default_config->amf;
    fwrite($fp, "4-config production " . $default_config->production . "\r\n");
    // Store configuration in the registry
    Zend_Registry::set("amf-config", $amf);
    // Initialize AMF Server
    $server = new Zend_Amf_Server();
    $server->setProduction($amf->production);
    if (isset($amf->directories)) {
        $dirs = $amf->directories->toArray();
        foreach ($dirs as $dir) {
            // Get the first character of the path. If it does not start with a slash,
            // the path is relative to the webroot; otherwise it is treated as absolute.
            $length = strlen($dir);
            $firstChar = $dir;
            if ($length >= 1)
                $firstChar = $dir[0];
            if ($firstChar != "/") {
                // if the directory is the ./ path then we add the webroot only
                if ($dir == "./") {
                    $server->addDirectory($webroot);
                } else {
                    $tempPath = $webroot . "/" . $dir;
                    $server->addDirectory($tempPath);
                    fwrite($fp, "5-temp path " . $tempPath . "\r\n");
                }
            } else {
                $server->addDirectory($dir);
            }
        }
    }
    fwrite($fp, "******************************************" . "\r\n");
    fclose($fp);
    // Initialize introspector for non-production
    if (!$amf->production) {
        $server->setClass('Zend_Amf_Adobe_Introspector', '', array("config" => $default_config, "server" => $server));
        $server->setClass('Zend_Amf_Adobe_DbInspector', '', array("config" => $default_config, "server" => $server));
    }
    // Handle request
    echo $server->handle();
    ?>
    amf_config.ini
    [zend]
    ;set the absolute location path of webroot directory, example:
    ;Windows: C:\apache\www
    ;MAC/UNIX: /user/apache/www
    ;-webroot =c:/wamp/www/
    webroot = /home/frutiexp/public_html
    ;set the absolute location path of zend installation directory, example:
    ;Windows: C:\apache\PHPFrameworks\ZendFramework
    ;MAC/UNIX: /user/apache/PHPFrameworks/ZendFramework
    ;zend_path = /home/frutiexp/public_html/ZendFramework
    [zendamf]
    amf.production = true
    amf.directories[]=fb41/services
    ;amf.directories[]=./
    CoursesService.php
    <?php
    /**
     * README for sample service
     * This generated sample service contains functions that illustrate typical service operations.
     * Use these functions as a starting point for creating your own service implementation. Modify the
     * function signatures, references to the database, and implementation according to your needs.
     * Delete the functions that you do not use.
     * Save your changes and return to Flash Builder. In Flash Builder Data/Services View, refresh
     * the service. Then drag service operations onto user interface components in Design View. For
     * example, drag the getAllItems() operation onto a DataGrid.
     * This code is for prototyping only.
     * Authenticate the user prior to allowing them to call these methods. You can find more
     * information at <link>
     */
    class CoursesService {

        var $username = "myusername";
        var $password = "mypassword";
        var $server = "localhost";
        var $port = "3306";
        var $databasename = "frutiexp_trainsur";
        var $tablename = "courses";
        var $connection;

        /**
         * The constructor initializes the connection to the database. Every time a request is
         * received by Zend AMF, an instance of the service class is created and then the
         * requested method is invoked.
         */
        public function __construct() {
            $this->connection = mysqli_connect(
                $this->server,
                $this->username,
                $this->password,
                $this->databasename,
                $this->port
            );
            // Tracking point: $this-> is required here; a bare $databasename or
            // $username is undefined in this scope and would be written out blank.
            $fp = fopen("./tracking.txt", "a");
            fwrite($fp, "1-service " . $this->databasename . " " . $this->username . "\r\n");
            fclose($fp);
            $this->throwExceptionOnError($this->connection);
        }

        /**
         * Returns all the rows from the table.
         * Add authorization or any logical checks for secure access to your data.
         * @return array
         */
        public function getAllCourses() {
            $stmt = mysqli_prepare($this->connection, "SELECT * FROM $this->tablename");
            $this->throwExceptionOnError();
            mysqli_stmt_execute($stmt);
            $this->throwExceptionOnError();
            $rows = array();
            $row = new stdClass();
            mysqli_stmt_bind_result($stmt, $row->cou_id, $row->cou_title, $row->cou_overview, $row->cou_objectives);
            while (mysqli_stmt_fetch($stmt)) {
                $rows[] = $row;
                $row = new stdClass();
                mysqli_stmt_bind_result($stmt, $row->cou_id, $row->cou_title, $row->cou_overview, $row->cou_objectives);
            }
            mysqli_stmt_free_result($stmt);
            mysqli_close($this->connection);
            return $rows;
        }

        /**
         * Returns the item corresponding to the value specified for the primary key.
         * Add authorization or any logical checks for secure access to your data.
         * @return stdClass
         */

    Hello Jdesko,
    Thank you for your prompt response. Yes, I have changed the connection variables in my data service (I didn't post the real values). You are right; after all, I didn't make changes to gateway.php except to add some tracking points. The one that I changed is amf_config.ini. The application runs without any error exceptions but doesn't populate the datagrid. According to the tracing, it is stopping just when establishing the connection to the database. Please let me know if you have any other clue. Thanks.

  • Getting a short dump when loading data from R/3 to ODS

    Hi BW Gurus,
    I am trying to load data from R/3 to an ODS, but after running for a few minutes it runs into a short dump and displays the following runtime error. Please give me a solution for how I can load the data without getting a short dump. I tried three times, and it failed the same way each time.
    Runtime error: TSV_TNEW_PAGE_ALLOC_FAILED

    Hi,
    Check whether a start routine or individual routine is present in the update/transfer rules.
    Reading a large amount of data (SELECT * FROM) from another ODS into an internal table can cause this type of error.
    Regards,
    Saran
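    For example, if the start routine looks data up in another ODS, restricting the read to the keys of the current data package usually avoids TSV_TNEW_PAGE_ALLOC_FAILED. A rough sketch for an update-rule start routine; the lookup table /BIC/AZLOOKUP00 and the field DOC_NUMBER are made-up names you would replace with your own:
    DATA lt_lookup TYPE STANDARD TABLE OF /bic/azlookup00.
    * Read only the rows needed for this data package instead of doing
    * SELECT * over the whole lookup ODS, which is what usually exhausts memory.
    IF NOT data_package[] IS INITIAL.
      SELECT * FROM /bic/azlookup00
        INTO TABLE lt_lookup
        FOR ALL ENTRIES IN data_package
        WHERE doc_number = data_package-doc_number.
    ENDIF.
    If the routine really needs the whole table, selecting only the columns it actually uses (instead of SELECT *) at least keeps the internal table smaller.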
