How can I copy a large amount of data between two HDs?

Hello !
Which command could I use to copy a large amount of data between two hard disk drives?
How does Lion identify the disk drives when you want to write a script? For example, in Windows I use
Robocopy D:\folder source\Files E:\folder destination
I just want to copy the files, and if the files/folders already exist in the destination they should be overwritten.
Help please, I bought my first Mac 4 days ago.
Thanks !

Select the files/folders on one HD and drag & drop onto the other HD. The copied ones will overwrite anything with the same names.
Since you're a newcomer to the Mac, see these:
Switching from Windows to Mac OS X,
Basic Tutorials on using a Mac,
Mac 101: Mac Essentials,
Mac OS X keyboard shortcuts,
Anatomy of a Mac,
MacTips,
Switching to Mac Superguide, and
Switching to the Mac: The Missing Manual,
Snow Leopard Edition.
Additionally, *Texas Mac Man* recommends:
Quick Assist,
Welcome to the Switch To A Mac Guides,
Take Control E-books, and
A guide for switching to a Mac.

Similar Messages

  • How can I edit a large amount of data using Acrobat X Pro

    Hello all,
    I need to edit a catalog that contains a large amount of data - mainly the product prices. Currently I can only export the document into an Excel file and then paste the new prices onto the catalog using Acrobat X Pro one by one, which is extremely time-consuming. I am sure there's a better way to make this faster while keeping the accuracy of the data. Thanks a lot in advance if anyone's able to help!

    Hi Chauhan,
    Yes, I am able to edit text/images via the tool box, but the thing is the catalog contains more than 20,000 price entries and all I can do is delete the original price info from the catalog and replace it with the revised data from Excel. Repeating this process over 20,000 times would be a waste of time and manpower... Not sure if I've made my situation clear enough? Please just ask away, I really hope to sort it out. Thanks!

  • How can we transfer a huge amount of data from a database server to XML format

    Hi gurus,
    How can we transfer a huge amount of data from a database server to XML format?
    regards
    subhasis.

    Create the ABAP code
    First we create the internal table TYPES and DATA definitions that we want to fill with the XML data. I have declared the table "it_airplus" like the structure from the XML file definition for a better overview, because it is a long XML definition (see the XSD file in the sample ZIP container from airplus.com).
    *the declaration
    TYPES: BEGIN OF t_sum_vat_sum,
              a_rate(5),
              net_value(15),
              vat_value(15),
             END OF t_sum_vat_sum.
    TYPES: BEGIN OF t_sum_total_sale,
            a_currency(3),
            net_total(15),
            vat_total(15),
            vat_sum TYPE REF TO t_sum_vat_sum,
           END OF t_sum_total_sale.
    TYPES: BEGIN OF t_sum_total_bill,
            net_total(15),
            vat_total(15),
            vat_sum TYPE t_sum_vat_sum,
            add_ins_val(15),
            total_bill_amount(15),
           END OF t_sum_total_bill.
    TYPES: BEGIN OF t_ap_summary,
            a_num_inv_det(5),
            total_sale_values TYPE t_sum_total_sale,
            total_bill_values TYPE t_sum_total_bill,
           END OF t_ap_summary.
    TYPES: BEGIN OF t_ap,
            head    TYPE t_ap_head,
            details TYPE t_ap_details,
            summary TYPE t_ap_summary,
           END OF t_ap.
    DATA: it_airplus TYPE STANDARD TABLE OF t_ap.
    *call the transformation
    CALL TRANSFORMATION ZFI_AIRPLUS
         SOURCE XML l_xml_x1
         RESULT xml_output = it_airplus.
    See the complete report: Read data from XML file via XSLT program
    Create XSLT program
    There are two options to create an XSLT program:
    Tcode: SE80 -> create/choose packet -> right click on it | Create -> Others -> XSL Transformation
    Tcode: XSLT_TOOL
    For a quick overview you can look at the SXSLTDEMO* programs.
    In this example we already use the three XSLT options explained later.
    As you can see, we define XSL and ASX (ABAP) tags to handle the ABAP and XML variables/tags. After "

  • How can I reduce the amount of space between paragraphs in Pages?

    How can I reduce the amount of space between paragraphs in Pages?

    Let's pretend that you told us which version of Pages you are talking about, and that you are using Pages 5:
    click in the text > Format > Spacing > click on triangle to expand options > change Before/After
    Peter

  • Couldn't copy a large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from an enterprise DB (EDB) to Oracle and vice versa.
    The datatype of a field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO.
    I could successfully copy a limited amount of data from EDB to Oracle, but if there is more data I get the following exceptions with different Oracle drivers (although I could read a large amount of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I could copy any amount of data from Oracle to EDB, but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The JDBC driver isn't able to directly call the insert with a column of type XMLTYPE. The solution I was using was to build a wrapper procedure in PL/SQL.
    Here it is (for insert, but I suppose that update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
    insert into AQAOST_FILES (file_no,program_no,ost_xml,btl_xml) values(file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    here is the sqlmap file I used
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    and here is a simple program
    package com.sg2net.jdbc;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;
    public class TestInsertXMLType {
         /**
          * @param args
          */
         public static void main(String[] args) throws Exception {
              String resource = "sql-map-config-xmlt.xml";
              Reader reader = Resources.getResourceAsReader(resource);
              SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
              OracleDataSource dataSource = new OracleDataSource();
              dataSource.setUser("test");
              dataSource.setPassword("test");
              dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
              Connection connection = dataSource.getConnection();
              sqlMap.setUserConnection(connection);
              AqAost aqAost = new AqAost();
              aqAost.setFileNo(3);
              aqAost.setProgramNo("prg");
              Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
              Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
              aqAost.setOstXML(readerToString(ostXMLReader));
              aqAost.setBltXML(readerToString(bltXMLReader));
              sqlMap.insert("insert", aqAost);
              connection.commit();
         }
         public static String readerToString(Reader reader) {
              StringWriter writer = new StringWriter();
              char[] buffer = new char[2048];
              int charsRead = 0;
              try {
                   while ((charsRead = reader.read(buffer)) > 0) {
                        writer.write(buffer, 0, charsRead);
                   }
              } catch (IOException ioe) {
                   throw new RuntimeException("error while converting reader to String", ioe);
              }
              return writer.toString();
         }
    }
    package com.sg2net.jdbc;
    public class AqAost {
         private long fileNo;
         private String programNo;
         private String ostXML;
         private String bltXML;
         public long getFileNo() {
              return fileNo;
         }
         public void setFileNo(long fileNo) {
              this.fileNo = fileNo;
         }
         public String getProgramNo() {
              return programNo;
         }
         public void setProgramNo(String programNo) {
              this.programNo = programNo;
         }
         public String getOstXML() {
              return ostXML;
         }
         public void setOstXML(String ostXML) {
              this.ostXML = ostXML;
         }
         public String getBltXML() {
              return bltXML;
         }
         public void setBltXML(String bltXML) {
              this.bltXML = bltXML;
         }
    }
    I tested the insert and it works correctly
    ciao,
    Giovanni

  • How can I stabilise large transfers of data & retain full smart playlists?

    Hi,
    I hope somebody can help. Now that I have more than 20,000 songs in iTunes, all of which I want to sync, my iPod classic 160GB goes into the 'infinite reset loop' or, when occasionally appearing to sync correctly, ends up with 0 songs, 0 videos, etc. on the iPod.
    This is despite supposedly having some 22GB of unused space.
    I assumed it was my 2-and-a-half-year-old iPod succumbing to old age and bought a replacement, but the problem occurs on the new machine too.
    I have tried various solutions from the Discussion forum, e.g.:
    1) procedures for checking for hard-drive errors
    2) restoring to factory settings and then re-syncing
    3) I found one helpful tip for creating a new smart playlist to introduce the files in stages, which is supposed to stabilise the transfer of large amounts of data. While this was helpful, and appeared to stabilise the data, I was left without all my smart playlists (except the one I'd created for this exercise). Switching back to syncing the entire library, so as to regain all my smart playlists, merely made the iPod unstable again.
    I really want to be able to access my (extensive) smart playlists: I'm presuming it's the very amount of these that is contributing to the problem. Can anyone suggest a solution?
    Thanks!

    jcannon wrote:
    I am using my notebook which only has 2GB RAM... I might need to try the workstation PC which has 8GB...
    I did read all the data using the Read From Text File VI and displayed it all in a string indicator (reading in x number of bytes/lines at a time)... How can I now plot this text/string data?
    8GB won't really help you unless you also use LabVIEW 64bit. 2GB should be OK.
    To graph it, you need to decimate it as shown in the link I gave you earlier. Most likely your monitor has only a few thousand pixels across, so it is not worth displaying more points than that. It won't look any different if done right!
    Can you show us your code? Can you attach a small sample file that has the same structure, just with less data?
    LabVIEW Champion. Do more with less code and in less time.

  • How to pass a large amount of data between steps

    Hi all,
    I have some LabVIEW VIs for data acquisition.
    I need to pass a large amount of data (array size > 5,000,000 each time) from one step to another.
    But it is not allowed to set the array size larger than 5,000,000.
    Any suggestion?
    czhen
    Win 7 SP1 & LabVIEW 2012 SP1, Teststand 2012 SP1
    Attachments:
    Array Size Limits.png (34 KB)

    In your LabVIEW code, put the data into a data value reference. Pass this reference between your TestStand steps. As an added bonus, you will not get an extra copy of the data at each step. You will need to use the In Place Element structure to get your data out of the data value reference.
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas based on best practices for transferring large amounts of data in and out of a NetWeaver-based application.
    We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface in and out of the system. As it turns out, RFC is not good at dealing with this much data being pushed through a single call.
    We have considered a number of possible ideas, however we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.

  • Copying a large amount of data from one table to another is getting slower

    I have a process that copies data from one table (big_tbl) into a very big archive table (vb_archive_tbl, 30 million records, partitioned). If there are fewer than 1 million records in big_tbl the copy to vb_archive_tbl is fast (~10 min) and, more importantly, consistent. However, if there are more than 1 million records in big_tbl, copying the data into vb_archive_tbl is very slow (30 min to 4 hours) and very inconsistent. Every few days the time it takes to copy the same amount of data grows significantly.
    Here's an example of the code I'm using, which uses BULK COLLECT and FORALL INSERT to copy the data.
    I occasionally change 'LIMIT 5000' to see performance differences.
    DECLARE
      TYPE t_rec_type IS RECORD (fact_id    NUMBER(12,0),
                                 store_id   VARCHAR2(10),
                                 product_id VARCHAR2(20));
      TYPE CFF_TYPE IS TABLE OF t_rec_type
        INDEX BY BINARY_INTEGER;
      T_CFF CFF_TYPE;
      CURSOR c_cff IS
        SELECT * FROM big_tbl;
    BEGIN
      OPEN c_cff;
      LOOP
        FETCH c_cff BULK COLLECT INTO T_CFF LIMIT 5000;
        FORALL i IN T_CFF.first..T_CFF.last
          INSERT INTO vb_archive_tbl
          VALUES T_CFF(i);
        COMMIT;
        EXIT WHEN c_cff%NOTFOUND;
      END LOOP;
      CLOSE c_cff;
    END;
    Thanks you very much for any advice
    Edited by: reid on Sep 11, 2008 5:23 PM

    Assuming that there is nothing else in the code that forces you to use PL/SQL for processing, I'll second Tubby's comment that this would be better done in SQL. Depending on the logic and partitioning approach for the archive table, you may be better off doing a direct-path load into a staging table and then doing a partition exchange to load the staging table into the partitioned table. Ideally, you could just move big_tbl into the vb_archive_tbl with a single partition exchange operation.
    That said, if there is a need for PL/SQL, have you traced the session to see what is causing the slowness? Is the query plan different? If the number of rows in the table is really a trigger, I would tend to suspect that the number of rows is causing the optimizer to choose a different plan (with your sample code, the plan is obvious, but perhaps you omitted some where clauses to simplify things down) which may be rather poor.
    Justin

  • How to deal with a large amount of data

    Hi folks,
    I have a web page that needs 3 maps to serve. Each map will hold about 150,000 entries, and all of them together will use about 100MB. For some users with lots of data (e.g. 1,000,000 entries), it may use up to 1GB of memory. If a few of these high-volume users log in at the same time, it can bring the server down. The data comes from files; I cannot read it on demand because that would be too slow. Loading the data into maps gives me very good performance, but it does not scale. I am thinking of serializing the maps and deserializing them when I need them. Is that my only option?
    Thanks in advance!

    JoachimSauer wrote:
    I don't buy the "too slow" argument.
    I'd put the data into a database. They are built to handle that amount of data.

  • How can I delete large amounts of mail at once

    On my iPad, how can I delete all of my mail at once?

    How to Delete Email on the iPad
    http://ipad.about.com/od/iPad_Guide/ss/How-To-Delete-Email-On-The-Ipad.htm
    How to Mass Delete Emails from iPhone and iPad Inbox (with video)
    http://suiteminute.com/how-to-mass-delete-emails-from-iphone-and-ipad-inbox/
    How to delete ALL mail messages from iPhone/iPad in one step
    http://www.conferencesthatwork.com/index.php/technology/2014/01/how-to-delete-all-mail-messages-from-iphoneipad-in-one-step/
    Cheers, Tom

  • Why does copying a large amount of data block the internet?

    I have a WRT600N and copied about 20GB of data over a 100Mb connection, and it took over the entire router. All computers plugged into the router lost internet access, and I was unable to log into the router admin during the transfer. Why would this happen? After the transfer everything went back to normal.

    It was not just my computer... I tried all the computers on my network... none could reach the internet. Even the new software that Linksys has to display your network shows all lines red.
    And the computer that was sending the data was 100Mb... while the others are all gigabit. Strange... switches should localize the traffic to the two computers talking and not affect the others.

  • How to get list of all the dates between two dates

    Hi, can anybody please help me?
    I have two dates in string format ("dd/MM/yyyy"). I need to get all the dates between these two dates. How can I do this in Java?
    Thanks in advance

    Look at the classes Calendar and SimpleDateFormat.
    And get your abstraction straight: you don't have two dates. You have two Strings that represent dates. To use them, you have to convert them to Date objects first with SimpleDateFormat.

  • Copying a large amount of data from one table to another

    We have a requirement to copy data from a staging area (consists of several tables) to the actual tables. The copy operation needs to
    1) delete the old values from the actual tables
    2) copy data from staging area to actual tables
    3) commit only after all rows in the staging area are copied successfully. Otherwise, it should rollback.
    Is it possible to complete all these steps in one transaction without causing problems with the rollback segments, etc.? What are the things that I need to consider in order to make sure that we will not run into production problems?
    Also, what other best practices/alternative methods are available to accomplish what is described above.
    Thanks,
    Eser

    It's certainly possible to do this in a single transaction. In fact, that would be the best practice.
    Of course, the larger your transactions are, the more rollback you need. You need to allocate sufficient rollback (undo in 9i) to handle the transaction size you're expecting.
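    The delete, copy, and commit steps from the question can be sketched as a single transaction. The sketch below uses the sqlite3 command line tool with invented table names purely so it is runnable as-is; in Oracle there is no explicit BEGIN (a transaction opens implicitly), but the single-COMMIT-at-the-end pattern is the same:

```shell
# Sketch of the staging-to-actual copy as ONE transaction.
# Illustrated with sqlite3 and invented table names so it runs as-is;
# in Oracle the transaction opens implicitly and ends at COMMIT/ROLLBACK.
db=$(mktemp -u /tmp/staging_demo_XXXXXX).db
sqlite3 "$db" <<'SQL'
CREATE TABLE actual  (id INTEGER, val TEXT);
CREATE TABLE staging (id INTEGER, val TEXT);
INSERT INTO actual  VALUES (1, 'old');
INSERT INTO staging VALUES (1, 'new'), (2, 'new');
BEGIN;                                     -- one transaction for all steps
DELETE FROM actual;                        -- 1) delete the old values
INSERT INTO actual SELECT * FROM staging;  -- 2) copy from the staging area
COMMIT;                                    -- 3) commit only after every row is in
SELECT count(*) FROM actual;
SQL
```

    If anything fails before the COMMIT, rolling back (or simply aborting the session) leaves the actual tables untouched, which is the all-or-nothing behaviour asked for; the cost, as noted, is enough rollback/undo space for the whole batch.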
    Justin
    Distributed Database Consulting, Inc.
    www.ddbcinc.com

  • Mail. How can I copy the sender and date info from the Header like I used to do in Lion?

    I used to find it very useful to drag and highlight all the info in the header of the email, including sender, cc, date, subject, etc., to be able to paste it into other docs as reminders of who sent what when. Now I have to do 2 copy/pastes. Not a deal killer, but it makes my life just that little bit more unnecessarily complicated.

    There is usually a toolbar button in the editor to turn a text link into a clickable hyperlink (look for a chain-like button).
    You can select the text and click that button to turn the link into a clickable hyperlink.
    If you can't find the button then hover over them all to check the tooltip of each.
    * Make Link - https://addons.mozilla.org/firefox/addon/142

Maybe you are looking for

  • Image processing on matched patterns

    Hi all, I am using LabVIEW 7.1 for image processing. I have used the pattern matching example to suit my requirements. Now I want that whatever image processing operations I perform after pattern matching, like threshold, circle detection, particle f

  • Mail slow to reach recipients

    Two of my users have reported that e-mails they send sometimes take over 12 hours to reach the intended recipients. These messages are not showing up stuck in the queue in Server Admin. The recipients' e-mail accounts are in the same domain as the se

  • Non-ascii, jsp-struts, not reciveing the correct value in the server

    Hi, I am developing a JSP-Struts based application using the Tomcat server. In a JSP Struts form, when I put a non-ascii character in the text field or text area, on the server side I receive that non-ascii character twice... for example, in the text field I e

  • MODIFYING WEB SITES TAB BUG - Anyone else?

    This isn't a how-to thread as such, but I'm using Mac OS X Server 10.4.11 and I have many websites that I'm hosting (27 at the moment, mainly subdomains). The nature of my business is that I need to keep adding subdomains the more users I get, and th

  • MobileMe uploads of photos are not of good quality.

    When I upload images from Aperture 3 to MobileMe Gallery the image quality is poor, yet the pictures I have are awesome. How can I improve the quality of images posted to my MobileMe Gallery?