Using IPD files to restore data using a Mac

Hi
Whilst my daughter was living at home I was backing up her BlackBerry 9700 using my PC. However, she has now gone off to university and lost her BB, which has been replaced under insurance. We now need to restore all of her contacts etc. The problem is that she uses a Mac, and I am trying to find out whether I can get the Mac desktop software to read IPD files. Is there any way to do this?
Thanks
Mike

I'm guessing the reason you want to 'bundle' up the files is to avoid the slower copy speed of lots of small files? If that, or the desire to compress the data, is your reason for using a disk image, zip them instead. Windows and most Macs will be able to open the zip file without additional software, and any Macs running Jaguar or older OS versions will almost certainly have StuffIt Expander, which can also decompress zip files.
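For example, a tiny script that bundles a whole backup folder into one zip (just a sketch; the folder path and archive name are placeholders):

import shutil

# Pack everything under /path/to/backups into backups.zip.
# Windows, OS X and StuffIt Expander can all open the result.
shutil.make_archive("backups", "zip", root_dir="/path/to/backups")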

Similar Messages

  • I used time machine to restore on a formatted MAC. Now the HDD space has reduced by 100GB but I cannot see any of the files. How do I find and delete those 100GB data from the HDD?


    dglenn9000 wrote:
    I created a new user account just to see if it was my user Library or if there was something wrong with my system. And the new user account is doing most of the same things so I will need to do a full restore anyway.
    Not necessarily. I'd suggest downloading and installing the "combo" update. That's a combination (thus the clever name) of all the updates to Leopard since it was first released, so installing it should fix anything that's gone wrong since then, such as with one of the normal "point" updates. Info and download available at: http://support.apple.com/downloads/MacOS_X_10_5_8_ComboUpdate Be sure to do a Repair Permissions via Disk Utility (in your Applications/Utilities folder) afterwards.
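    If you prefer the Terminal, the same permissions repair can be run from the command line on Leopard-era systems (worth double-checking on your OS version):

    diskutil repairPermissions /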

  • Error while using Rule file in loading data into Essbase through ODI

    Hi Experts,
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to existing values I get an error.
    test is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for your reply. I have already checked that no locks are held on the rule file. I don't know what the problem is. It works fine without the rule file, but throws this error only when the rule file is used.
    Please help on this.
    Thanks,
    Chinnu
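    For what it's worth, Essbase error 1053025 usually means the server still holds the 'test' rule file object; even if EAS shows no locks, force-unlocking (or deleting/renaming) the object is a common suggestion. A sketch only — the application/database names are guesses based on your load, and the exact alter object syntax should be verified against the MaxL reference for your version:

    login admin identified by 'password' on localhost;
    alter object 'HexaApp'.'PLData'.'test' of type rules_file unlock;
    logout;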

  • Using XML File as a Data Server

    OK - here's my scenario:
    - Have an XML file that I am holding parameters (i.e log file folder) in
    - I have set the XML file as a data server (which test connects OK) and created physical and logical schemas in Topology Manager
    - In Designer, have created a data model for this file
    What I am trying to do is populate/refresh a global variable with a value from the XML file; however, when I make a change to the value in the XML file, the value does not get passed into the variable.
    The global variable is set to refresh from the XML data source, and in a package I am using a variable step set to refresh, but still the variable does not update with the current values in the file.
    A couple of other things: if I go to the data model and view the data, it sees the value that is in the XML file, but the variable does not get the same value; it seems to be using some kind of cached value. Also, if I close any open ODI apps (Designer, Operator, etc.) and re-open them, the values (sometimes) seem to reflect what is in the file.
    Crux of the post: am I missing something so obvious that I cannot see it, or has anyone else experienced this kind of issue?
    Thanks,
    Gee

    OK - I got this working but in order to do this I followed some info from another post and changed the structure of the XML.
    Previously, it was structured as:
    <params>
    <param>
    <name>Test1</name>
    <value>Some Value</value>
    </param>
    <param>
    <name>Test2</name>
    <value>Some Value2</value>
    </param>
    </params>
    Now I have changed it to:
    <params>
    <param name="Test1" value="Some Value" />
    <param name="Test2" value="Some Value2" />
    </params>
    I also added the dod=true property to the JDBC URL.
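    For anyone else hitting this, dod (drop on disconnect) is a property of the ODI XML driver URL; a sketch of the shape (the file path and schema name here are placeholders):

    jdbc:snps:xml?f=/path/to/params.xml&s=PARAMS&re=params&dod=true

    With dod=true the driver drops its relational schema on disconnect, so the XML file is re-parsed on the next connection instead of serving cached values.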
    Now, one of these circumstances fixed my issue (of which I am very glad) but I don't know which one :-) .... and I currently don't have the time to find out.
    I may investigate this further in the future (and post accordingly).
    Thanks,
    G

  • Flat file to upload data using BDC for transaction MM01

    Hi
    I am trying to create material master data using BDC; the code is attached below and reads from a .txt file.
    It updates the first set of data into table MARA, but not the rest.
    All the data from the .txt file gets loaded into the internal table; the problem is that it does not get updated from the internal table to the database.
    Only the first set of data is loaded; the rest of the data is not.
    Content of the txt file:
    zsc     zsc     kg     
    zsv     zsv     kg     
    zsb     zsb     kg
    Actual code:
    report ZMAT_UPLOAD
           no standard page heading line-size 255.
    * types declaration
    types : begin of t_mat,
       matnr(20),
       desc(50),
       uom(5),
    end of t_mat.
    * internal table and work area declaration
    data : i_mat type table of t_mat.
    data : wa_mat type t_mat.
    include bdcrecx1.
    start-of-selection.
    * moving the flat file content to the internal table
    CALL FUNCTION 'UPLOAD'
         EXPORTING
             FILETYPE   = 'DAT'
         TABLES
             data_tab   = i_mat.
    IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    perform open_group.
    loop at i_mat into wa_mat.
    perform bdc_dynpro      using 'SAPLMGMM' '0060'.
    perform bdc_field       using 'BDC_CURSOR'
                                       'RMMG1-MATNR'.
    perform bdc_field       using 'BDC_OKCODE'
                                        '=AUSW'.
    perform bdc_field       using 'RMMG1-MATNR'
                                        wa_mat-matnr.
    perform bdc_field       using 'RMMG1-MBRSH'
                                        'P'.
    perform bdc_field       using 'RMMG1-MTART'
                                        'ZOH'.
    perform bdc_dynpro   using 'SAPLMGMM' '0070'.
    perform bdc_field       using 'BDC_CURSOR'
                                        'MSICHTAUSW-DYTXT(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                        '=ENTR'.
    perform bdc_field       using 'MSICHTAUSW-KZSEL(01)'
                                        'X'.
    perform bdc_dynpro   using 'SAPLMGMM' '4004'.
    perform bdc_field       using 'BDC_OKCODE'
                                        '=BU'.
    perform bdc_field       using 'MAKT-MAKTX'
                                        wa_mat-desc.
    perform bdc_field       using 'BDC_CURSOR'
                                        'MARA-MEINS'.
    perform bdc_field       using 'MARA-MEINS'
                                        wa_mat-uom.
    perform bdc_field       using 'MARA-MTPOS_MARA'
                                        'NORM'.
    perform bdc_transaction using 'MM01'.
    endloop.
    Perform close_group.

    Hi Sumant,
    just concentrate on the changed lines (commented below):
    report ZMAT_UPLOAD
           no standard page heading line-size 255.
    * types declaration: use a table with a header line so the BDC include
    * can work with it directly
    data : begin of t_mat occurs 0,
             matnr(20),
             desc(50),
             uom(5),
           end of t_mat.
    * the separate internal table and work area are no longer needed:
    *data : i_mat type table of t_mat.
    *data : wa_mat type t_mat.
    include bdcrecx1.
    start-of-selection.
    * moving the flat file content to the internal table
    CALL FUNCTION 'UPLOAD'
      EXPORTING
        filetype = 'DAT'
      TABLES
        data_tab = t_mat.                  " changed from i_mat
    IF sy-subrc <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    perform open_group.
    loop at t_mat.                         " changed from: loop at i_mat into wa_mat
      perform bdc_dynpro      using 'SAPLMGMM' '0060'.
      perform bdc_field       using 'BDC_CURSOR' 'RMMG1-MATNR'.
      perform bdc_field       using 'BDC_OKCODE' '=AUSW'.
      perform bdc_field       using 'RMMG1-MATNR' t_mat-matnr.   " changed from wa_mat-matnr
      perform bdc_field       using 'RMMG1-MBRSH' 'P'.
      perform bdc_field       using 'RMMG1-MTART' 'ZOH'.
      perform bdc_dynpro      using 'SAPLMGMM' '0070'.
      perform bdc_field       using 'BDC_CURSOR' 'MSICHTAUSW-DYTXT(01)'.
      perform bdc_field       using 'BDC_OKCODE' '=ENTR'.
      perform bdc_field       using 'MSICHTAUSW-KZSEL(01)' 'X'.
      perform bdc_dynpro      using 'SAPLMGMM' '4004'.
      perform bdc_field       using 'BDC_OKCODE' '=BU'.
      perform bdc_field       using 'MAKT-MAKTX' t_mat-desc.     " changed from wa_mat-desc
      perform bdc_field       using 'BDC_CURSOR' 'MARA-MEINS'.
      perform bdc_field       using 'MARA-MEINS' t_mat-uom.      " changed from wa_mat-uom
      perform bdc_field       using 'MARA-MTPOS_MARA' 'NORM'.
      perform bdc_transaction using 'MM01'.
    endloop.
    perform close_group.
    Reward points for helpful answers.
    Thanks
    Naveen khan

  • Use Control File to load data in BW7.0

    Hi All,
    I have a requirement to load data into BW using a control file. The development is done in BW 7.0. In BW 3.5 there is an option of FILE IS ( Control File or Data File ). Please suggest how to simulate the same in BW 7.0.
    Regards,
    Vikram

    Hi Vikram,
    Please find below the contents of a sample control file. The file should be saved as ".TXT".
    FILENAME = C:\test.csv
    FILETYPE = CSV
    LOCATION = C
    FS = ,
    ESCAPE = "
    DECIMALPOINT = ,
    1000SEPARATOR = .
    RECCOUNT = 8
    RECSIZE = 53
    PACKETSIZE = 1000
    FILENAME should contain the path of the CSV file.
    FILETYPE is CSV.
    LOCATION is 'C' for workstation and 'A' for application server.
    FS contains the field separator, ',' in our case.
    RECCOUNT contains the record count.
    RECSIZE is the maximum DB size of a single row (this can be evaluated from the total DB size of a single line of the target BW structure as well).
    The content of my sample test.csv are
    1234567890,10,9999999999,,,15,01/01/2005
    1234567891,20,9999999999,,,30,01/01/2005
    1234567892,30,9999999999,,,0,01/01/2005
    1234567893,10,9999999999,,,5,01/01/2005
    1234567894,20,9999999999,,,6,01/01/2005
    1234567895,40,9999999999,,,10,01/01/2005
    1234567896,10,9999999999,,,5,02/01/2005
    1234567897,20,9999999999,,,6,02/01/2005
    Please let me know if there are any further concerns.
    Regards,
    Shrey

  • Using .ods File to import data

    How can I use an .ods file, generated by OpenOffice, to import data into Siena?

    Hey Anurag,
    Thanks for the post!
    You could convert the .ods file into .xlsx and use the Excel file in Project Siena. There are many ways to convert the file but here is one option that worked for me:
    1. Create/save the OpenOffice Calc file in some location.
    2. Launch Excel and use File > Open to read the .ods file (in Excel 2013, the file type option reads "OpenDocument Spreadsheet").
    3. Save the converted file in .xlsx format. Format any table data so it can be recognized in Project Siena.
    4. Launch Project Siena and use the add data source option to import the .xlsx file.
    Your mileage might vary depending on the formulas/tables in the .ods. Let us know if you run into any issues.
    Thanks!
    -Karthik
    This posting is provided "AS IS" with no warranties and confers no rights.

  • Using .csv file and adding data into database

    hi,
    I'm working on a project which shows all the share prices from the FTSE 100 on a webpage.
    My problem is: I connect to yahoo.co.uk to get my share price information, which is updated every 15 mins; they return a .csv file to me. At the moment I am just printing the information onto my website, but is there any way I could store this information in a database if I needed the data to be used elsewhere? Thanks for the help in advance.

    Below is the code I used to get my info onto the web. I'd just like to know how I would use this to store the data in a database.
    import java.net.*;
    import java.io.*;
    import java.util.*;

    public class SharePrice {
        private String line;
        private int maxShares = 101;   // maximum shares a user can have
        private int details = 5;       // five details: name, date, time, price, change
        public String[][] shareData = new String[maxShares][details];

        public SharePrice(String[] shares) throws Exception {
            getShare(shares);
        }

        // returns a double array containing the data of each share as a separate row
        public String[][] getShare(String[] sh) throws Exception {
            for (int i = 0; i < sh.length; i++) {
                // if the entry is null we have reached the end of the array
                if (sh[i] != null) {
                    String share = sh[i];
                    // base URL of the resource; the share symbol is appended so that
                    // particular share's data is retrieved
                    String address = "http://uk.finance.yahoo.com/d/quotes.csv?s=" + share;
                    System.out.println(address);
                    try {
                        // a connection is created to the resource and an input stream
                        // opened to read the data
                        URL url = new URL(address);
                        BufferedReader in = new BufferedReader(
                                new InputStreamReader(url.openStream()));
                        line = in.readLine();
                        in.close();
                    } catch (Exception e) {
                        System.err.println("Exception: " + e.getMessage());
                        e.printStackTrace();
                    }
                    if (line == null) continue; // nothing retrieved for this share
                    // the retrieved line is split and placed in a single row of the
                    // array; because each piece of data is separated by commas it is
                    // easily tokenized
                    StringTokenizer t = new StringTokenizer(line, ",");
                    int count = t.countTokens();
                    System.out.println(" count= " + count);
                    for (int j = 0; j < count && t.hasMoreTokens(); j++) {
                        shareData[i][j] = t.nextToken();
                    }
                }
            }
            return shareData;
        }
    }
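    To write those rows to a database, one option is plain JDBC; a minimal sketch, assuming a JDBC driver on the classpath and a table you create yourself (the driver, URL, credentials and table name below are placeholders, not from the original post):

    import java.sql.*;

    public class SharePriceStore {
        // Suggested table (placeholder schema):
        // CREATE TABLE share_price (name VARCHAR(40), quote_date VARCHAR(12),
        //   quote_time VARCHAR(12), price VARCHAR(12), change_val VARCHAR(12));
        public static void store(String[][] shareData) throws SQLException {
            String url = "jdbc:mysql://localhost/shares";           // placeholder
            try (Connection con = DriverManager.getConnection(url, "user", "pass");
                 PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO share_price VALUES (?, ?, ?, ?, ?)")) {
                for (String[] row : shareData) {
                    if (row[0] == null) continue;                   // skip unused slots
                    for (int j = 0; j < 5; j++) {
                        ps.setString(j + 1, row[j]);
                    }
                    ps.addBatch();
                }
                ps.executeBatch();                                  // one round trip
            }
        }
    }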

  • Using XML File As Target- Data Integrator (SAP BODI Tool)

    I am trying to populate records in an XML file from a table. The XML schema has a node, say 'Address'. When I run my job, I get duplicate records in the output XML file. For example: under the element Address, the fields are Address1 and Address2, and these fields get repeated as many times as there are records in the output file.
    Source data looks like this:
    EmployeeRecord
      FirstName
      LastName
      EmployeeID
      Address
    Target schema looks like this:
    EmployeeDetails
      FirstName
      LastName
      EmployeeTrack
        Details
          EmployeeID
          Address
    Note: the EmployeeTrack complex element has to repeat only once (Min = 1 and Max = 1 occurrences).
    I need the output like
    <EmployeeDetails>
    <FirstName>ABC</FirstName>
    <LastName>XYZ</LastName>
    <EmployeeTrack>    
            <Details>
                  <EmployeeID>12345</EmployeeID>
                  <Address>12tyytyt</Address>
            </Details>
    </EmployeeTrack>
    </EmployeeDetails>
    As I said earlier, I am getting duplication of the data, and if I try to validate the schema it throws an error.
    Looking forward to getting help from you all, experts.
    Regards
    S

    Hi,
    I would suggest you post the question in the Enterprise Information Management area. This forum is about the Integration Kit for SAP.
    ingo

  • Need original IPD File for restore

    Hello,
    I own a BlackBerry Curve 8520 and I would like to restore the phone data to the original settings from when I purchased it. Sadly, I didn't back up the data to any file, so my question is whether there is any original IPD file available that I could use to restore to original settings.
    Ty

    isabels wrote:
    Well then how can I restore the original settings of the Phone?
    Ah, that is a different question.
    Read this RIM Knowledge Base article to reset your device to the factory settings.
    KB18998 How to reset a BlackBerry smartphone to factory defaults

  • How do I use Time Machine to restore to a new Mac

    Just bought the new iMac. I'm using Mountain Lion on my old Mac, and Time Machine with a backup. How do I restore to my new Mac from Time Machine?

    Pondini's excellent resource provides some useful reading:
    http://pondini.org/OSX/Setup.html

  • Using OEM to backup/restore DB using EMC Avamar (3rd party)

    Hi,
    I'm trying to find a solution for using OEM to back up and restore a DB using EMC Avamar software.
    Normally we can back up/restore the DB with Avamar through the RMAN CLI, using scripts similar to the following:
    connect target /
    run {
    configure controlfile autobackup on;
    set controlfile autobackup format for device type sbt to 'CONTROLFILE.orcl.%F';
    allocate channel c0 type sbt PARMS="SBT_LIBRARY=c:\PROGRA~1\avs\bin\orasbt64.dll" format '%d_%U';
    send '"--prefix=11g/orcl/" "--flagfile=c:\flagfile\avtar-flags.txt" "--bindir=c:\PROGRA~1\avs\bin"';
    backup database plus archivelog delete input;
    release channel c0;
    }
    Where orasbt64.dll is the name of Avamar's 64bit library and flagfile contains some internal flags necessary for backup/restore operations.
    Now I want to use OEM for the same thing, so what I did was try to specify the Media Management Vendor Library Parameters under Backup Settings in OEM. I saved the following as the parameters:
    "SBT_LIBRARY=c:\PROGRA~1\avs\bin\orasbt64.dll";
    Even after this, the backup script generated by OEM does not contain this parameter. Naturally, backups to EMC Avamar fail.
    I would like answers to these 2 questions:
    1. How to correctly specify the Media Management Vendor Library Parameters?
    2. Is there a way to specify all the parameters (--prefix, flagfile, bindir, PARMS etc.) so that the script generated by OEM is the same as (or as close as possible to) the one I use for RMAN CLI backups?
    Thanks!

    Hi Anant,
    I'm also having the same issue. It fails because it doesn't understand the "send" command in OEM. Here is the output of the tape backup test:
    RMAN> run {
    2> allocate channel oem_sbt_backup type 'sbt_tape' format '%U' parms 'SBT_LIBRARY=/usr/local/avamar/lib/libobk_avamar64.so,
    3> ENV=(PATH=/bin:/usr/bin:/usr/local/avamar/bin);
    4> send "--flagfile=/home/oracle/scripts/backup/my-avtar-flags.txt"';
    5> backup as COMPRESSED BACKUPSET current controlfile tag '04302012033156';
    6> restore controlfile validate from tag '04302012033156';
    7> release channel oem_sbt_backup;
    8> }
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03009: failure of allocate command on oem_sbt_backup channel at 04/30/2012 15:32:00
    ORA-19554: error allocating device, device type: SBT_TAPE, device name:
    ORA-27209: syntax error in device PARMS - unknown keyword or missing =
    RMAN> allocate channel for maintenance type 'sbt_tape' parms 'SBT_LIBRARY=/usr/local/avamar/lib/libobk_avamar64.so,
    2> ENV=(PATH=/bin:/usr/bin:/usr/local/avamar/bin);
    3> send "--flagfile=/home/oracle/scripts/backup/my-avtar-flags.txt"';
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03009: failure of allocate command on ORA_MAINT_SBT_TAPE_1 channel at 04/30/2012 15:32:01
    ORA-19554: error allocating device, device type: SBT_TAPE, device name:
    ORA-27209: syntax error in device PARMS - unknown keyword or missing =
    When I use TSM it doesn't require the "send" command and it backs up successfully. I guess the other option would be to create an EM job to do the backups. However, this wouldn't populate the EM tables for backup reporting.
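    One avenue that may be worth testing: persist the channel settings with CONFIGURE, so that OEM-generated scripts inherit them instead of relying on an ALLOCATE plus send. A sketch only — whether Avamar's flagfile can be omitted or passed some other way is a question for EMC support:

    RMAN> CONFIGURE CHANNEL DEVICE TYPE 'SBT_TAPE'
    2> PARMS 'SBT_LIBRARY=/usr/local/avamar/lib/libobk_avamar64.so,
    3> ENV=(PATH=/bin:/usr/bin:/usr/local/avamar/bin)';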

  • Can I manually back up cookies.sqlite and use that file to restore my important cookies after a HD reformat?

    I read the first suggested article before posting this question, but it wasn't specific enough, so I was forced to comment, "This article fails to specify use of the backed up file in the event of a system/HDD reformat." The article reveals that the profile folder name must match (match WHAT in this scenario?), but does not reveal whether any of the files within the profile have that specific identification embedded in them [which might result in logical errors or error messages, and failure to accomplish the desired goal]. It also does not say whether, if all I want to do is back up the cookies, backing up cookies.sqlite and placing it in the new default profile after the system reinstallation/hard drive reformat will work.
    Let me get more general in the event I'm not communicating what I wish to: I don't want to use a confusing, perhaps badly-written Firefox add-on to do this. I have a few secure sites I visit that give me a really hard time when I update or reformat, like my bank and NetFlix. Both parties CLAIM that I am "attempting to access your account from a different computer," but their security protocol writers are MORONS and liars, because it is NOT a different computer; instead of storing the STATIC MAC address of my computer, which DOES identify the specific computer attempting access, they store a number of different bits of data which are VOLATILE and DYNAMIC [in cookies], and I have to go through this bullcrap of accessing my email account and entering in a verifying one-time-use code to get to my own [epithet for solid waste matter]. So, will I be able to do what I want just by manually backing up cookies.sqlite and placing it in the appropriate directory when needed? It should be a very simple matter for these idiot security experts to query systems for the MAC addresses or to get users to supply it manually (it's easy to find), but NO! That would make SENSE, be EASY, and MORE EFFECTIVELY MANAGE THE SECURE NETWORKS. My idiot bank does not even allow the use of special characters necessary for the creation of sophisticated passwords [that would take over a quintillion years to hack via a Brute Force attack].

    You can back up specific files from the Firefox profile folder and restore them when needed.
    This shouldn't pose a problem with SQLite database files like cookies.sqlite.
    With other files you may have to be more cautious though.
    *http://kb.mozillazine.org/Transferring_data_to_a_new_profile_-_Firefox
    You may also want to back up the permissions.sqlite file, which stores exceptions for cookies, images, pop-up windows, and software installation.
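    For example, a minimal backup of just that file (a sketch; the profile path is a placeholder — yours is shown under Help > Troubleshooting Information):

    import shutil

    # Copy cookies.sqlite out of the profile folder before the reformat;
    # afterwards, copy it back into the new default profile the same way.
    profile = "/path/to/Firefox/Profiles/xxxxxxxx.default"   # placeholder
    shutil.copy2(profile + "/cookies.sqlite", "/backup/cookies.sqlite")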

  • Use transaction FILE to store data from a cube into a file with Open Hub

    Hi people:
    I'm using BI 7.0. My requirement is to create a flat file using the information in a virtual cube. The file name must contain the number of the month and the year. I know that this is possible through the FILE transaction.
    Can anybody give me a clue how this transaction is used? What are the steps to assemble the name of the file? Or is there any other option? I have defined the physical directory where the file must be placed.
    Any help will be great. Thanks in advance.

    Hi,
    pick up the code which you need from below.
    REPORT RSAN_WB_ROUTINE_TEMP_REPORT .
    TYPES: BEGIN OF y_source_fields ,
             /BIC/ZTO_ROUTE TYPE /BIC/OIZTO_ROUTE ,
             ZINT_HU__Z_WM_HU TYPE /BIC/OIZ_WM_HU ,
             CREATEDON TYPE /BI0/OICREATEDON ,
             ROUTE TYPE /BI0/OIROUTE ,
             PLANT TYPE /BI0/OIPLANT ,
             PLANT__0STREET TYPE /BI0/OISTREET ,
             PLANT__0CITY TYPE /BI0/OICITY ,
             PLANT__0REGION TYPE /BI0/OIREGION ,
             PLANT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
             /BIC/ZRECVPLNT TYPE /BIC/OIZRECVPLNT ,
             ZRECVPLNT__0STREET TYPE /BI0/OISTREET ,
             ZRECVPLNT__0CITY TYPE /BI0/OICITY ,
             ZRECVPLNT__0REGION TYPE /BI0/OIREGION ,
             ZRECVPLNT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
             KYF_0001 TYPE /BI0/OIDLV_QTY ,
             ROUTE__Z_CR_DOCK TYPE /BIC/OIZ_CR_DOCK ,
             REFER_DOC TYPE /BI0/OIREFER_DOC ,
           END OF y_source_fields .
    TYPES: yt_source_fields TYPE STANDARD TABLE OF y_source_fields .
    TYPES: BEGIN OF y_target_fields ,
             RECORDTYPE TYPE /BI0/OISTREET ,
             CONTAINER TYPE /BI0/OICITY ,
             /BIC/ZTO_ROUTE TYPE /BIC/OIZTO_ROUTE ,
             TRACKINGNUMBER TYPE /BIC/OIZ_WM_HU ,
             PO TYPE /BI0/OICITY ,
             STAGEDDATE TYPE /BI0/OICITY ,
             MOVEMENTTYPE TYPE /BI0/OICITY ,
             ROUTE TYPE /BI0/OIROUTE ,
             PLANT TYPE /BI0/OIPLANT ,
             PLANT__0STREET TYPE /BI0/OISTREET ,
             PLANT__0CITY TYPE /BI0/OICITY ,
             PLANT__0REGION TYPE /BI0/OIREGION ,
             PLANT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
             ORIGINCONTACTNAME TYPE /BI0/OISTREET ,
             ORIGINCONTACTPHONE TYPE /BI0/OISTREET ,
             /BIC/ZRECVPLNT TYPE /BIC/OIZRECVPLNT ,
             ZRECVPLNT__0STREET TYPE /BI0/OISTREET ,
             ZRECVPLNT__0CITY TYPE /BI0/OISTREET ,
             ZRECVPLNT__0REGION TYPE /BI0/OISTREET ,
             ZRECVPLNT__0POSTAL_CD TYPE /BI0/OISTREET ,
             DESTINATIONCONTACTNAME TYPE /BI0/OISTREET ,
             DESTINATIONCONTACTPHONE TYPE /BI0/OISTREET ,
             RCCCODE TYPE /BI0/OISTREET ,
             GLCORCLLICODE TYPE /BI0/OISTREET ,
             JFCCODE TYPE /BI0/OISTREET ,
             DESCRIPTIONOFWORK1 TYPE /BI0/OISTREET ,
             DESCRIPTIONOFWORK2 TYPE /BI0/OISTREET ,
             INSTRUCTIONS TYPE /BI0/OISTREET ,
             REQUESTEDSHIPDATE TYPE /BI0/OICITY ,
             ROUTE__Z_CR_DOCK TYPE /BIC/OIZ_CR_DOCK ,
             REQUESTEDDELIVERYDATE TYPE /BI0/OICITY ,
             ATTSEORDER TYPE /BI0/OICITY ,
             CUBE TYPE /BI0/OISTREET ,
             WEIGHT TYPE /BI0/OISTREET ,
             PIECES TYPE /BI0/OIREFER_DOC ,
             REEL TYPE /BI0/OISTREET ,
             REELSIZE TYPE /BI0/OISTREET ,
             VENDORSKU TYPE /BI0/OISTREET ,
             ATTSESKU TYPE /BI0/OISTREET ,
             COMPANYNAME TYPE /BI0/OISTREET ,
             OEM TYPE /BI0/OISTREET ,
             REFER_DOC TYPE /BI0/OIREFER_DOC ,
             REFERENCENUMBER2 TYPE /BI0/OISTREET ,
             REFERENCENUMBER3 TYPE /BI0/OISTREET ,
             REFERENCENUMBER4 TYPE /BI0/OISTREET ,
           END OF y_target_fields .
    TYPES: yt_target_fields TYPE STANDARD TABLE OF y_target_fields .
    * Begin of type definitions
    *TYPES: ...
    * End of type definitions
    FORM compute_data_transformation
         USING     it_source TYPE yt_source_fields
                   ir_context TYPE REF TO if_rsan_rt_routine_context
         EXPORTING et_target TYPE yt_target_fields .
    * Begin of transformation code
      DATA: ls_source TYPE y_source_fields,
            ls_target TYPE y_target_fields,
            var1(10),
            var2(10),
            year(4),
            month(2),
            day(2),
            date(10),
            it_workdays type table of /bic/pzworkdays,
            wa_workdays type /bic/pzworkdays,
            sto_date(10),
            V_tabix TYPE sy-tabix,
            Y_tabix TYPE sy-tabix,
            sto_var1(10),
            sto_year(4),
            sto_month(2),
            sto_day(2),
            sto_final_date(10),
            W_HEADER LIKE LS_TARGET-RECORDTYPE,
            W_HEADER1(12) TYPE C VALUE 'HEDR00000000',
            W_FOOTER LIKE W_HEADER VALUE 'TRLR0000',
            CNT(5),
            CMD(125) TYPE C.
    **********CODE FOR GENERATING CSV FILE PATH*******************
    data: OUTFILE_NAME(100) TYPE C,
          OUTFILE_NAME1(10) TYPE C VALUE '/sapmnt/',
          OUTFILE_NAME3(18) TYPE C VALUE '/qoutsap/ATTUVS',
          DATE1 LIKE SY-DATUM,
          DD(2) TYPE C,
          MM(2) TYPE C,
          YYYY(4) TYPE C.
    MOVE SY-DATUM+6(2) TO DD.
    MOVE SY-DATUM+4(2) TO MM.
    MOVE SY-DATUM(4) TO YYYY.
    CONCATENATE YYYY MM DD INTO DATE1.
    CONCATENATE OUTFILE_NAME1 SY-SYSID OUTFILE_NAME3 '.CSV' INTO
    OUTFILE_NAME.
    **********END OF CODE FOR GENERATING CSV FILE PATH*************
      OPEN DATASET OUTFILE_NAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    * Code for generating header
      CONCATENATE W_HEADER1  SY-DATUM SY-UZEIT INTO W_HEADER.
      APPEND W_HEADER TO ET_TARGET.
      TRANSFER W_HEADER TO OUTFILE_NAME.
      CLEAR W_HEADER.
    * End of code for generating header
    * code for excluding the rows whose quantity (PIECES) equals zero
      LOOP AT it_source INTO ls_source where KYF_0001 NE '0'.
    * end of code for excluding the rows whose quantity (PIECES) equals zero
        MOVE-CORRESPONDING ls_source TO ls_target.
        ls_target-RECORDTYPE = 'PKUP'.
        ls_target-CONTAINER = ''.
        ls_target-TRACKINGNUMBER = ls_source-ZINT_HU__Z_WM_HU.
        ls_target-PO = ''.
    * Date conversion for staged date
        var1 = ls_source-CREATEDON.
        year = var1+0(4).
        month = var1+4(2).
        day = var1+6(2).
        CONCATENATE month '/' day '/' year INTO date.
    * End of date conversion for staged date
        ls_target-STAGEDDATE = date.
        ls_target-MOVEMENTTYPE = 'P'.
        ls_target-ORIGINCONTACTNAME = ''.
        ls_target-ORIGINCONTACTPHONE = ''.
        ls_target-DESTINATIONCONTACTNAME = ''.
        ls_target-DESTINATIONCONTACTPHONE = ''.
        ls_target-RCCCODE = ''.
        ls_target-GLCORCLLICODE = ''.
        ls_target-JFCCODE = ''.
        ls_target-DESCRIPTIONOFWORK1 = ''.
        ls_target-DESCRIPTIONOFWORK2 = ''.
        ls_target-INSTRUCTIONS = ''.
        ls_target-REQUESTEDSHIPDATE = date.
    * Calculating STO creation date + 3 working days
        select /BIC/ZWORKDAYS from /bic/pzworkdays into table it_workdays.
        loop at it_workdays into wa_workdays.
            if  wa_workdays-/bic/zworkdays = ls_source-CREATEDON.
                V_tabix = sy-tabix.
                Y_tabix = V_tabix + 3.
            endif.
            If sy-tabix = y_tabix.
                sto_date = wa_workdays-/bic/zworkdays.
            endif.
        Endloop.
        clear v_tabix.
        clear Y_tabix.
        sto_var1 = sto_date.
        sto_year = sto_var1+0(4).
        sto_month = sto_var1+4(2).
        sto_day = sto_var1+6(2).
        CONCATENATE sto_month '/' sto_day '/' sto_year INTO sto_final_date.
    * End of calculating STO creation date + 3 working days
        ls_target-REQUESTEDDELIVERYDATE = sto_final_date.
        ls_target-ATTSEORDER = ''.
        ls_target-CUBE = ''.
        ls_target-PIECES = ls_source-KYF_0001.
        ls_target-REEL = ''.
        ls_target-REELSIZE = ''.
        ls_target-VENDORSKU = ''.
        ls_target-ATTSESKU = ''.
        ls_target-COMPANYNAME = 'AT&T'.
        ls_target-OEM = ''.
        ls_target-REFERENCENUMBER2 = '0'.
        ls_target-REFERENCENUMBER3 = '0'.
        ls_target-REFERENCENUMBER4 = '0'.
        APPEND ls_target TO et_target.
        TRANSFER ls_target TO OUTFILE_NAME.
        CNT = CNT + 1.
      ENDLOOP.
        CNT = CNT + 2.
    * Code for generating footer
      SHIFT CNT LEFT DELETING LEADING SPACE.
      CONCATENATE W_FOOTER CNT INTO W_HEADER.
      APPEND W_HEADER TO ET_TARGET.
    * End of code for generating footer
    * Code for file permissions
      TRANSFER W_HEADER TO OUTFILE_NAME.
      CLOSE DATASET OUTFILE_NAME.
      CONCATENATE 'chmod 644' OUTFILE_NAME INTO CMD SEPARATED BY SPACE.
      CALL 'SYSTEM' ID 'COMMAND' FIELD CMD.
    * End of code for file permissions
    * End of transformation code
    ENDFORM.
    Hope it helps
    bhaskar

  • Using time capsule to restore data to new computer and doing the back-up for the new computer

    Hello
    My iMac has broken down. Data were saved by Time Machine on a Time Capsule, so the new computer was set up with things from the Time Capsule. Worked fine. Time Machine informed me that the backups would no longer be available for the old iMac. That's fine. After everything was set up on the new one (15" MacBook Pro Retina) I wanted to make sure that the backup process for this one was ready to go as it was done in the past. I have selected the Time Capsule (it's a 500GB unit that is backing up two computers), so there is limited space available on it, but that's OK, since the process creates space when needed by deleting old backups from 2011 which I don't need any more, one year of history being enough.
    The problem I am now facing is that Time Machine seems to consider the backup of the new MacBook Pro a new volume, i.e. on top of the one used to set up the new MacBook (the old iMac's!) and the one for the other computer. Time Machine seems to be looking to open a third volume and obviously doesn't have the space to perform the backup.
    How can I sort this? Ideally, after restoring the data from the old iMac to the MacBook Pro, I would like to keep updating the backups (originally from the old iMac) with the changes that happen on the MacBook Pro.
    I hope I am clear and looking for some tips on that.
    Many thanks in advance.
    Best
    Pierre

    You can't do that, because backups are tied to the MAC address of the network adapter in the computer being backed up. You could "fake" the MAC address, but this would make it impossible to use the iMac on the same network (as its MAC address would already be in use). Your only option is to delete the old iMac's backups, or buy a new Time Capsule (the best option if you want to keep the old backups).
