'fromuser' to 'touser' option in datapump

Dear Friends,
I am using Oracle 10g on AIX. I took a full export (expdp) using Data Pump, and now I want to import it into another database on my test server. From the export dump I only want to import two users (user1, user2), i.e. the equivalent of the old 'fromuser'/'touser' option.
How can I do this in Data Pump?
Please help.

The EXCLUDE option filters which specific objects expdp or impdp exports or imports.
If you're exporting a table and you don't want to export its indexes, triggers, or statistics, you can use
tables=A
exclude=INDEX exclude=TRIGGER exclude=STATISTICS
It works the same way for a schema. If you're running a full database export and you specify EXCLUDE=SCHEMA:"='SCOTT'", the schema SCOTT will not be exported.
> If I want to import 4 users (user1,user2,user3,user4) from a full export dump using 'impdp', then how can I use "remap_schema"?
You haven't specified into which schemas the data is to be imported.
REMAP_SCHEMA=scott:newscott - load database objects from a source schema (SCOTT) to a target schema (NEWSCOTT).
Import DataPump Parameter REMAP_SCHEMA - How to Move Database Objects from one Schema to Another - Note:260225.1
HTH
Anantha
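For the original question, a minimal impdp parameter file might look like the sketch below (the directory, dump file name, and user names are illustrative): SCHEMAS pulls just the two users out of the full dump, and REMAP_SCHEMA is only needed if the target schema names differ.

```
# impdp parameter file (sketch; all names are illustrative)
userid=system/password
directory=DATA_PUMP_DIR
dumpfile=full_export.dmp
logfile=imp_two_users.log
schemas=USER1,USER2
# only if the target schema names differ:
# remap_schema=USER1:NEWUSER1
# remap_schema=USER2:NEWUSER2
```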

Similar Messages

  • FLASHBACK_SCN option within datapump

    I am trying to use the datapump option FLASHBACK_SCN by setting a variable within a script as the value from "SELECT dbms_flashback.get_system_change_number FROM dual;" in the following way.
    sqlplus '/ as sysdba' <<EOF
    set head off
    spool flashback_scn.output
    SELECT dbms_flashback.get_system_change_number FROM dual;
    spool off
    exit;
    EOF
    sed '/^$/d' flashback_scn.output |\
    sed -e "s/[ <tab>]*//g" > flashback_scn.list
    export FLASHBACK_SCN=`cat flashback_scn.list`
    The variable FLASHBACK_SCN is then passed to the option and the datapump export performed. This has worked fine on a test database but the SCN is a lot smaller number than in production. I am now trying to run this in production but the value that comes back from "SELECT dbms_flashback.get_system_change_number FROM dual;" is as follows.
    GET_SYSTEM_CHANGE_NUMBER
    9.2196E+12
And Data Pump does not see this as a valid SCN, nor does a SELECT statement using "AS OF SCN"; I just get the error "specified number is not a valid change number".
Can anyone suggest a way to get the value as a proper number and not a "to the power of whatever" value?
    Kind regards.
    Graeme.

    Hi Graeme,
This is only a display issue in SQL*Plus.
    Try this:
    sqlplus '/ as sysdba' <<EOF
    set head off
    col get_system_change_number format 9999999999999999999999
    spool flashback_scn.output
    SELECT dbms_flashback.get_system_change_number FROM dual;
    spool off
    exit;
EOF
The COLUMN format changes the display to the given number of digits (22 in this case; you can set it to whatever you like).
    By the way, I take the date and use FLASHBACK_TIME instead of FLASHBACK_SCN, but it really doesn't matter.
    HTH
    Liron Amitzi
    Senior DBA consultant
    http://www.dbsnaps.com
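As an alternative to relying on the COLUMN format, the whitespace-stripping step in Graeme's script can be hardened to pick up only the purely numeric line, whatever the display settings. A minimal sketch; the printf line stands in for the real spooled query output:

```shell
# Stand-in for the spooled SQL*Plus output (blank lines, padded number):
printf '  \n 9219602411218 \n\n' > flashback_scn.output
# Strip all whitespace, keep only the line that is purely digits:
FLASHBACK_SCN=$(tr -d ' \t' < flashback_scn.output | grep -E '^[0-9]+$' | head -1)
echo "$FLASHBACK_SCN"
```

The grep filter also discards any stray SQL*Plus headings or blank lines, so the exported variable is guaranteed to be a plain integer.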

  • Compression options for datapump

    Hi
    We are using 10.2 version of the db.
    Using datapump
I have read the docs for Data Pump, and as far as compression goes, I can only see COMPRESSION=METADATA_ONLY.
    We need to export something like a 200gb schema and we have no space for it.
    Is this the only option for compression? (yes, I have looked into other mount points)
Looks like there isn't any. "Stay tuned for a data compression solution in a future release."
    Brilliant.
    Thanks.
    DA
    Edited by: Dan A on Mar 28, 2009 12:18 PM

    Good point,and one I will look into.
In the end I eventually got the dump, with 99% of the space used! I then scp-ed it to another box and gzipped it there.
I am going to look into setting those environment variables, as this took me two hours yesterday, and yesterday was a Saturday, know what I mean?
    DA
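For the record, with classic exp the dump can be compressed on the fly through a named pipe, so the uncompressed file never lands on disk. This does not work with expdp, which writes server-side. A minimal sketch; printf stands in for the actual exp command:

```shell
#!/bin/sh
# Named-pipe compression sketch: gzip reads from a FIFO while the
# exporter writes to it, so only the compressed file touches disk.
PIPE=./exp_pipe
mkfifo "$PIPE"
gzip -c < "$PIPE" > dump.dmp.gz &      # reader: compress as data arrives
# Real use: exp system/pw owner=BIG_SCHEMA file="$PIPE" log=exp.log
printf 'fake dump data\n' > "$PIPE"    # writer (stand-in for exp)
wait                                   # let gzip finish
rm -f "$PIPE"
gunzip -c dump.dmp.gz                  # verify the round trip
```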

  • How export one table along with data from one location to other location

    Hi All,
I'm new to export/import practice.
Can anyone please tell me the steps, along with the commands, to do the following:
1. I want to export a table with its data from one location (computer) to another on the same network.
2. Also from one user to another user.
    I'm using oracle 10g.
    regards
    Sonia
    Edited by: 983040 on Feb 19, 2013 11:35 PM

First of all, read the documentation:
Oracle Export/Import : http://docs.oracle.com/cd/B19306_01/server.102/b14215/exp_imp.htm
Datapump Export/Import : http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm
Whether you are using Data Pump or traditional export/import, you need to follow these steps:
    *1) Take User dump via EXPDP on Computer A .*
    For EXP
    exp username/password owner=Test file=D:\test.dmp log=D:\test.log
    For EXPDP
expdp username/password schemas=TEST directory=TEST_DIR dumpfile=TEST.dmp logfile=TEST.log
*2) Copy that to Computer B*
    *3) Import dumpfile.*
For IMPDP, use the remap_schema option: http://www.acehints.com/2012/05/data-pump-impdp-remapschema-parameter.html
For IMP, use the fromuser and touser options.
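A minimal imp parameter file for the fromuser/touser case might look like this (all names and paths are illustrative):

```
# imp parameter file (sketch)
userid=system/password
file=D:\test.dmp
log=D:\imp_test.log
fromuser=TEST
touser=TEST2
```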

  • Import all but synonyms

    Hi.
I have made a dump of a schema with exp and the OWNER option. Now I want to import it into another schema using the FROMUSER and TOUSER options. But what I don't want is the synonyms to be imported.
How can I exclude them from the import?
Thanks in advance

> There's about 80 of them
It's not too much, we have more than 32000 synonyms in our db... ;-)
You can script the deletion so you don't have to write out each synonym name yourself.
    Nicolas.
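A sketch of scripting that deletion: in practice synonyms.lst would be spooled from USER_SYNONYMS via sqlplus; here it is created inline as a stand-in.

```shell
# Stand-in for a spooled list of synonym names, one per line:
printf 'EMP_SYN\nDEPT_SYN\n' > synonyms.lst
# Wrap each name in a DROP statement:
sed 's/^/DROP SYNONYM /; s/$/;/' synonyms.lst > drop_synonyms.sql
cat drop_synonyms.sql
```

The generated drop_synonyms.sql can then be run in the target schema before (or instead of) cleaning up by hand.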

  • System refresh from non-cluster to a MSCS cluster system

I have a system (call it GTA) running NW 7.0 SR2 with both ABAP and Java instances. It runs on Windows 2003 64-bit and SQL Server 2005. Now we want to migrate to a new set of hardware with Microsoft cluster (MSCS). I have installed this new system (call it GTB), MSCS clustered with both ABAP and Java instances, and it runs with no problems.
Now I am at the last point of the migration: I need to refresh database GTB from GTA to capture live data. In WAS 6.40 ABAP, I just needed to perform a database backup/restore and execute the SAPINST tool, and everything worked. Now, based on SAP's "System Copy Guide", I need to run an export on the source system and then an import on the target system. On the source system the export seems straightforward, since it's a central system (ABAP + Java). However, when I get to the target system and choose cluster, it lists all the items I have to choose (ABAP, SCS, 1st node, 2nd node, DB, enqueue, etc.). They are exactly the same as for a new installation. My questions are:
1. Do I have to go through all these steps just to refresh a database? If I go through this installation, am I overwriting something in my GTB from GTA unnecessarily?
2. Can I just perform a database backup/restore? But what kind of steps do I need to perform to make the cluster work after the database refresh, since the original database (GTA) is not a cluster system?
I would appreciate any advice and shared experience.
    Thanks,
    Yujun Ran

    Hi
    I have used the FROMUSER and TOUSER option of the IMPORT cmd to perform a system copy like you have described.
    Steps:
    1) Export BW4 (EXPORT USER=SAPBW4 ... )
2) In the MCOD database, delete/drop all objects belonging to SAPBW1. From memory, I think I found it easier to DROP USER SAPBW1 CASCADE and then recreate the SAPBW1 user.
    3) Import BW4 export (IMPORT FROMUSER=SAPBW4 TOUSER=SAPBW1 ..)
    I don't use MCOD anymore because it is such a pain to refresh the systems.
    Regards
    Doug

  • Backup on a extern Hard disk

    Hi,
Can anybody tell me the steps for taking a backup onto an external hard disk,
and then not of all schemas, just two schemas?
Thanks
Navneet Kaur

Take an export backup of the database onto the external hard disk.
While importing, use the fromuser and touser options for both schemas.
Other than this I am not very sure.
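A minimal exp parameter file for exporting just the two schemas might look like this (the paths and schema names are illustrative):

```
# exp parameter file (sketch)
userid=system/password
owner=(SCHEMA1,SCHEMA2)
file=/mnt/external_disk/two_schemas.dmp
log=/mnt/external_disk/two_schemas.log
```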

  • Grant details required for user and schema

    Hi
    I have Oracle version - 10.2.0.4.0
We have schema A (with lots of objects) and user B (no objects; it acts as an application user to access objects in the other schema).
My doubts are listed below.
1) I want to know how to find the list of users that have access to objects in schema A, and the privileges granted on the objects in schema A.
2) I want to know how to find the list of grants on schema objects provided to user B.

    user1368801 wrote:
    Thanks ajallen.
    It really helped me.
One more question: I think DBA_TAB_PRIVS gives details for tables only, right? What about other objects like procedures, views, etc.?
Go back to the Reference Manual and re-read the description of DBA_TAB_PRIVS. Re-read the specific description of TABLE_NAME.
Actually I am exporting 3 schemas (A, B, C) from production and importing them into a test environment (A1, B1, C1) using the fromuser and touser options.
Now I have to properly remap all the privileges, grants, synonyms, etc.
There are so many objects and I am wondering how to remap them properly.
It may be simple, but as a newbie, some direction would be very helpful.
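A sketch of both lookups against the dictionary (despite its name, DBA_TAB_PRIVS also lists grants on views, procedures, and other objects; the schema and user names are illustrative):

```sql
-- 1) Who has been granted what on schema A's objects:
SELECT grantee, table_name, privilege
FROM   dba_tab_privs
WHERE  owner = 'A';

-- 2) What has been granted to user B:
SELECT owner, table_name, privilege
FROM   dba_tab_privs
WHERE  grantee = 'B';
```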

  • Stats question

    Hi All
    This is related to schema stats from production to QA.
    We have two DB
    Production: ABC
    QA: XYZ
    Now,
I exported schema ABC from production and imported the complete schema into XYZ using the 'fromuser'/'touser' options of exp/imp.
Then I created a table for the production (ABC) stats using:
exec dbms_stats.create_stat_table(ownname => 'ABC', stattab => 'ABCSTATS', tblspace => 'ABC_DATA');
Then I exported the stats:
exec dbms_stats.export_schema_stats(ownname => 'ABC',statown=>'ABC', stattab=>'ABCSTATS');
Then I exported the table 'ABCSTATS' to file exp.dmp and imported it into the QA database XYZ.
    PROD: exp file=/exp.dmp tables=ABCSTATS
    QA: imp file=exp.dmp fromuser=ABC touser=XYZ
Now I have the complete data from production in QA, and the production stats too, in table ABCSTATS.
Now, when I run
exec dbms_stats.import_schema_stats(ownname => 'XYZ',statown=>'XYZ', stattab=>'ABCSTATS');
the command runs successfully, but the stats are not loaded as they are in production.
I have a stats cost of 40,000 for one query in production, whereas in QA it comes out as only 7,000.
So the question is: am I doing something wrong? Or is it possible to load schema stats from one schema into another for the same data structure?
    OS: Sun10
    DB: 10G 10.2.0.3.0
    Thanks
    aps
    Edited by: aps on Apr 13, 2009 9:31 AM

> Is it possible to import one schema's stats into another? Can someone confirm?
It's possible; you have to perform a few manual steps:
Create the stat table in the source and export the source schema stats.
Export the stat table from the source using the exp utility.
Transfer and import the stat table into the target database.
Now, the schema recorded in the stat table isn't available in your target, so you have to update the stat table's C5 column to the target schema name.
Then import the stats using exec dbms_stats.import_table_stats.
Any special reason you want to do this? Instead, you could gather stats in the target schema directly.
    HTH
    Anantha.
    Edited by: Anantha on Apr 13, 2009 2:13 PM
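A sketch of the C5 update step Anantha describes (the table and schema names are illustrative, and C5 is the owner-name column of a DBMS_STATS stat table):

```sql
-- Re-point the exported stats at the target schema before importing:
UPDATE xyz.abcstats SET c5 = 'XYZ' WHERE c5 = 'ABC';
COMMIT;
-- Then import:
EXEC dbms_stats.import_schema_stats(ownname => 'XYZ', statown => 'XYZ', stattab => 'ABCSTATS');
```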

  • Export schema and import with different name

    Hi ,
    I need your assistance guys,
I need to export a schema from one database and then import it under a different name into another database. How can I do that?
I exported it with the following command:
exp userid=system/PW file=May_02_Export.dmp log=May_02_Export.log owner=schema_name statistics="none"
Now I want to import it, but under a different name (schema_name).
    Regards

Did you try the FROMUSER and TOUSER options?
    E:\oracle\product\10.2.0\db_1\install>sqlplus / as sysdba
    SQL*Plus: Release 10.2.0.1.0 - Production on Sat May 2 20:01:09 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to an idle instance.
    SQL> startup
    ORACLE instance started.
    Total System Global Area  167772160 bytes
    Fixed Size                  1247900 bytes
    Variable Size              71304548 bytes
    Database Buffers           92274688 bytes
    Redo Buffers                2945024 bytes
    Database mounted.
    Database opened.
    SQL> create user fromuser identified by fromuser;
    User created.
    SQL> grant resource, create table, create session to fromuser;
    Grant succeeded.
    SQL> create user touser identified by touser;
    User created.
    SQL> grant resource, create table, create session to touser;
    Grant succeeded.
    SQL> conn fromuser/fromuser
    Connected.
    SQL> create table export_table(a char);
    Table created.
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    E:\oracle\product\10.2.0\db_1\install>exp fromuser/fromuser file=e:\a.dmp
    Export: Release 10.2.0.1.0 - Production on Sat May 2 20:03:06 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user FROMUSER
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user FROMUSER
    About to export FROMUSER's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    . about to export FROMUSER's tables via Conventional Path ...
    . . exporting table                   EXPORT_TABLE          0 rows exported
    . exporting synonyms
    . exporting views
    . exporting stored procedures
    . exporting operators
    . exporting referential integrity constraints
    . exporting triggers
    . exporting indextypes
    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting materialized views
    . exporting snapshot logs
    . exporting job queues
    . exporting refresh groups and children
    . exporting dimensions
    . exporting post-schema procedural objects and actions
    . exporting statistics
    Export terminated successfully without warnings.
    E:\oracle\product\10.2.0\db_1\install>imp help=y
    Import: Release 10.2.0.1.0 - Production on Sat May 2 20:03:47 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    You can let Import prompt you for parameters by entering the IMP
    command followed by your username/password:
         Example: IMP SCOTT/TIGER
    Or, you can control how Import runs by entering the IMP command followed
    by various arguments. To specify parameters, you use keywords:
         Format:  IMP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
         Example: IMP SCOTT/TIGER IGNORE=Y TABLES=(EMP,DEPT) FULL=N
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    USERID must be the first parameter on the command line.
    Keyword  Description (Default)       Keyword      Description (Default)
    USERID   username/password           FULL         import entire file (N)
    BUFFER   size of data buffer         FROMUSER     list of owner usernames
    FILE     input files (EXPDAT.DMP)    TOUSER       list of usernames
    SHOW     just list file contents (N) TABLES       list of table names
    IGNORE   ignore create errors (N)    RECORDLENGTH length of IO record
    GRANTS   import grants (Y)           INCTYPE      incremental import type
    INDEXES  import indexes (Y)          COMMIT       commit array insert (N)
    ROWS     import data rows (Y)        PARFILE      parameter filename
    LOG      log file of screen output   CONSTRAINTS  import constraints (Y)
    DESTROY                overwrite tablespace data file (N)
    INDEXFILE              write table/index info to specified file
    SKIP_UNUSABLE_INDEXES  skip maintenance of unusable indexes (N)
    FEEDBACK               display progress every x rows(0)
    TOID_NOVALIDATE        skip validation of specified type ids
    FILESIZE               maximum size of each dump file
    STATISTICS             import precomputed statistics (always)
    RESUMABLE              suspend when a space related error is encountered(N)
    RESUMABLE_NAME         text string used to identify resumable statement
    RESUMABLE_TIMEOUT      wait time for RESUMABLE
    COMPILE                compile procedures, packages, and functions (Y)
    STREAMS_CONFIGURATION  import streams general metadata (Y)
    STREAMS_INSTANTIATION  import streams instantiation metadata (N)
    The following keywords only apply to transportable tablespaces
    TRANSPORT_TABLESPACE import transportable tablespace metadata (N)
    TABLESPACES tablespaces to be transported into database
    DATAFILES datafiles to be transported into database
    TTS_OWNERS users that own data in the transportable tablespace set
    Import terminated successfully without warnings.
    E:\oracle\product\10.2.0\db_1\install>imp system/oracle fromuser=fromuser touser=touser file=e:\a.d
    Import: Release 10.2.0.1.0 - Production on Sat May 2 20:04:20 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export file created by EXPORT:V10.02.01 via conventional path
    Warning: the objects were exported by FROMUSER, not by you
    import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    . importing FROMUSER's objects into TOUSER
    . . importing table                 "EXPORT_TABLE"          0 rows imported
    Import terminated successfully without warnings.
    E:\oracle\product\10.2.0\db_1\install>sqlplus touser/touser
    SQL*Plus: Release 10.2.0.1.0 - Production on Sat May 2 20:04:36 2009
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    SQL> desc export_table
    Name                                      Null?    Type
    A                                                  CHAR(1)
SQL>
HTH
    Aman....

  • Urgent: IMP-00034: Warning: FromUser "USER1" not found in export file

    Hello all
I successfully exported a DMP file using the sys user with the option full=y.
I now want to import the schema objects for user1 with the following command:
    C:\oracle\ora81\bin\IMP.EXE sys/sys@bas fromuser=user1 touser=scott file=C:\omran12-06-2005.dmp
    But I received this error :
    IMP-00034: Warning: FromUser "OMBR" not found in export file
    Import terminated successfully with warnings.
What is the problem?
    Thanks in advance

    Hello
Thanks for your reply, but it doesn't work at all:
- I tried the first suggestion by creating the user, and I received this:
    C:\oracle\ora81\bin\IMP.EXE newuser/newnewuser@sid file=C:\x.dmp log=c:\log.log full=y
    Warning: the objects were exported by SYSTEM, not by you
    import done in AR8MSAWIN character set and AR8MSAWIN NCHAR character set
    . importing SYSTEM's objects into OMBR1
    Import terminated successfully without warnings.
    Then I tried this:
    C:\>C:\oracle\ora81\bin\IMP.EXE sys/sys@sid file=C:\x.dmp log=c:\log.log full=y show=y
    I received this:
    Warning: the objects were exported by SYSTEM, not by you
    import done in AR8MSAWIN character set and AR8MSAWIN NCHAR character set
    . importing SYSTEM's objects into SYS
    Import terminated successfully without warnings.
Then I tried this:
    C:\>C:\oracle\ora81\bin\IMP.EXE system/system@sid file=C:\x.dmp log=c:\log.log full=y show=y
    And I get this error:
    Export file created by EXPORT:V08.01.07 via conventional path
    import done in AR8MSAWIN character set and AR8MSAWIN NCHAR character set
    . importing SYSTEM's objects into SYSTEM
    Import terminated successfully without warnings.
Why does nothing work for me?

Imp fromuser/touser

I want to clear up my concepts.
What does fromuser/touser do? For example, in this case, what is going on with
fromuser=(FA_F3PROD)
touser=(FA_R3TST01)
imp parfile=exp.par, where the parfile contains:
    userid=system/*******@R3TST01
    file=refresh.dmp
    fromuser=(FA_F3PROD)
    touser=(FA_R3TST01)
    ignore=Y
    commit=y
    compile=y
    buffer=2000000
    resumable=y
    resumable_timeout=12000
    log=imp_refresh.log

In this case, objects from the FA_F3PROD schema are being imported into the FA_R3TST01 schema. So basically it is a schema-level import.
    For other parameters
    http://download-east.oracle.com/docs/cd/B10501_01/server.920/a96652/ch02.htm#1014036
    Import Modes
    http://download-east.oracle.com/docs/cd/B10501_01/server.920/a96652/ch02.htm#1005234
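For comparison, a rough Data Pump equivalent of the parfile above might look like the sketch below (the directory name is illustrative, and table_exists_action=append only loosely corresponds to ignore=Y):

```
# impdp parameter file (sketch)
userid=system/*******@R3TST01
directory=DATA_PUMP_DIR
dumpfile=refresh.dmp
logfile=imp_refresh.log
remap_schema=FA_F3PROD:FA_R3TST01
table_exists_action=append
```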

  • Datapump TTS from filesystem to ASM migration

    Hi Experts,
I have to move 1.5 TB of data from a filesystem on Solaris SPARC (single instance) to a Linux x86 RAC (10.2.0.4) with ASM. I am considering the transportable tablespaces option with Data Pump, but I have only done filesystem-to-filesystem Data Pump TTS. Is there any way to use Data Pump TTS from filesystem to ASM directly? Any suggestions and comments, please.
    Thanks
    Shashi

I'm not sure what your question is. Data Pump works with ASM and supports transportable tablespaces. You would have to copy your data files from your file system into ASM (e.g. with RMAN or DBMS_FILE_TRANSFER), do the export on your source, copy the dump file to your target, and then run Data Pump. Note that Solaris SPARC (big-endian) and Linux x86 (little-endian) have different endian formats, so the data files will also need an RMAN CONVERT step. If you have more specific questions, please list them.
    Hope this helps.
    Thanks
    Dean

  • E-business database migration to linux (Cross platform) using datapump

Has anyone used the parallel=? option of Data Pump when migrating an E-Business database (cross-platform) during export (expdp) and import (impdp)?

    174b0d50-0464-47ad-8831-22215fbc4bbe wrote:
Has anyone used the parallel=? option of Data Pump when migrating an E-Business database (cross-platform) during export (expdp) and import (impdp)?
    You can use this option, just consider not using a high number to avoid ORA-31693 errors.
    Using Transportable Tablespaces for EBS 11i Using Database 11gR2 (Doc ID 1366265.1)
    Using Transportable Tablespaces for EBS Release 12.0 or 12.1 Using Database 11gR2 (Doc ID 1311487.1)
    Thanks,
    Hussein

  • Oracle 11g imp erroneously tries to recreate existing tables with CLOBs?

    I have a shell script for loading database dumps from both Datapump and the older exp/imp.
    Often when loading dumps, I need to rename the schema owner and tablespace names (which is handled by REMAP_SCHEMA and REMAP_TABLESPACE in Datapump).
    However I have a whole bunch of dumps created with exp at this point and not that many Datapump dumps yet. As such the old style dumps are handled by the shell script in this way:
    1) A first pass imp is run using INDEXFILE to generate a file with the SQL to create tables and indexes. Options also include FROMUSER and TOUSER.
2) A series of sed commands edits the SQL file to change the tablespace names (which are schema-owner specific in our case).
3) The edited SQL file is run with sqlplus to create the tables and indexes.
    4) A second pass imp is run to load the table rows as well as triggers, stored procedures, views, etc. Options include FROMUSER, TOUSER, COMMIT=Y, IGNORE=Y, BUFFER, STATISTICS=NONE, CONSTRAINTS=N
    This shell script has been working great for loading exp dump files into Oracle 9 and Oracle 10 databases, but now that I'm trying to load these dumps into Oracle 11, it fails.
The problem is in step 4: the imp program tries to create some of the tables that were already created with sqlplus in step 3. The problematic tables all seem to have CLOB columns. The table creation fails because it uses the tablespace names from the dump file, which do not exist in the destination database. And when the table creation fails, imp then decides not to load the rows for those tables.
    This seems like a bug in the Oracle 11 imp program. I don't understand why it thinks it needs to recreate tables that already exist when those tables have CLOB columns. Is there something different about CLOB columns in Oracle 11 that I should know about that might be confusing imp into thinking that it needs to create tables when they already exist? Maybe I need to do something to those tables in SQL so that imp does not think it needs to recreate them?
    I know that the tables with the CLOBs were created correctly because I was trying to find some way to workaround this. For step 4, I tried using DATA_ONLY=Y, in which case imp does not try to create the tables and just loads the table rows. Of course using DATA_ONLY, I don't get a lot of other things like triggers, view and stored procedures. I started to try to get around that by doing 3 passes with imp, so that I could pick up the missing pieces by using an imp pass with ROWS=N, but strangely that has the same problem of trying to recreate the existing tables.

    The only solution I've found so far as a workaround is rather convoluted.
    1. I took an export using datapump's expdp of SCHEMA1 (in 10g it will skip the table with the xmltype).
    2. I imported the data to my empty schema (SCHEMA2) using impdp. To avoid the error that the type already exists with another OID, I used the TRANSFORM=oid:n parameter e.g.
    impdp user/pwd dumpfile=noxmltable.dmp logfile=importallbutxmltable.log remap_schema=SCHEMA1:SCHEMA2 TRANSFORM=oid:n directory=MYDUMPDIR
3. I then manually created my xmltype table in SCHEMA2 and loaded it with an INSERT ... SELECT (make sure you have the SELECT privileges to do so):
    INSERT INTO SCHEMA2.XMLTABLE2 SELECT * FROM SCHEMA1.XMLTABLE1;
    4. I am still taking an export with exp of the xmltable as well even though I'm not sure I can do anything with it.
    Thanks!
    Edited by: stacyz on Jul 28, 2009 9:49 AM
