Data Pump between different Oracle versions
This is a follow-on to yesterday's posting. We have a customer database at 10.2.N, which we currently do not support. They are planning to buy a new server and install Oracle at the supported version, 10.1.0.5. We then plan to export the data from the 10.2 database using the schema option and import it into the 10.1.0.5 database.
I thought I could use Data Pump and specify the VERSION parameter as 10.1.0.5 during the export, but according to the help that parameter is only valid with NETWORK_LINK and SQLFILE. I was not planning on using those options, just the default dump file.
Am I better off just using the old export/import utilities? The schema owner has tables that contain LOB columns, so how does that work with SQLFILE?
Any advice is appreciated!
thanks!
I don't think it is unsupported; it works:
C:\Documents and Settings\ATorelli>expdp system/xxxxx version=*10.1.0.1* schemas=pippo
Export: Release 10.2.0.4.0 - Production on Wednesday, 29 June, 2011 18:42:19
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/******** version=10.1.0.1 schemas=pippo
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
C:\ORA\ATORELLI\APP\PRODUCT\ADMIN\MRMANREP\DPDUMP\EXPDAT.DMP
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 18:42:23
Regards
Similar Messages
-
Data Pump issue with Oracle 10g on Windows 2003
Hi Experts,
I am trying to run Data Pump in Oracle 10g on a Windows 2003 server.
I get an error:
D:\>cd D:\oracle\product\10.2.0\SALE\BIN
D:\oracle\product\10.2.0\SALE\BIN>expdp system/xxxxl@sale full=Y directory=dumpdir dumpfile=expdp_sale_20090302.dmp logfile=exp_sale_20090302.log
Export: Release 10.2.0.4.0 - Production on Tuesday, 03 March, 2009 8:05:50
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31650: timeout waiting for master process response
However, the old exp utility runs fine.
What is wrong with my Data Pump?
Thanks
JIM
Hi Anand,
I did not see any error log at that time. Actually, it did not work any more. I will test it again, based on your email, after the exp is done.
Based on the new testing, I got the errors below:
ORA-39014: One or more workers have prematurely exited.
ORA-39029: worker 1 with process name "DW01" prematurely terminated
ORA-31671: Worker process DW01 had an unhandled exception.
ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
ORA-06512: at "SYS.KUPC$QUEUE_INT", line 277
ORA-06512: at "SYS.KUPW$WORKER", line 1366
ORA-04030: out of process memory when trying to allocate 65036 bytes (callheap,KQL tmpbuf)
ORA-06508: PL/SQL: could not find program unit being called: "SYS.KUPC$_WORKERERROR"
ORA-06512: at "SYS.KUPW$WORKER", line 13360
ORA-06512: at "SYS.KUPW$WORKER", line 15039
ORA-06512: at "SYS.KUPW$WORKER", line 6372
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS while calling DBMS_METADATA.FETCH_XML_CLOB [PROCOBJ:"SALE"."SQLSCRIPT_2478179"]
ORA-06512: at "SYS.KUPW$WORKER", line 7078
ORA-04030: out of process memory when trying to allocate 4108 bytes (PLS non-lib hp,pdzgM60_Make)
ORA-06500: PL/SQL: storage error
ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucpcon: tds)
ORA-04030: out of process memory when trying to allocate 16396 bytes (koh-kghu sessi,pmucalm coll)
Job "SYSTEM"."SYS_EXPORT_FULL_01" stopped due to fatal error at 14:41:36
ORA-39014: One or more workers have prematurely exited.
The trace file shows:
*** 2009-03-03 14:20:41.500
*** ACTION NAME:() 2009-03-03 14:20:41.328
*** MODULE NAME:(oradim.exe) 2009-03-03 14:20:41.328
*** SERVICE NAME:() 2009-03-03 14:20:41.328
*** SESSION ID:(159.1) 2009-03-03 14:20:41.328
Successfully allocated 7 recovery slaves
Using 157 overflow buffers per recovery slave
Thread 1 checkpoint: logseq 12911, block 2, scn 7355467494724
cache-low rba: logseq 12911, block 251154
on-disk rba: logseq 12912, block 221351, scn 7355467496281
start recovery at logseq 12911, block 251154, scn 0
----- Redo read statistics for thread 1 -----
Read rate (ASYNC): 185319Kb in 1.73s => 104.61 Mb/sec
Total physical reads: 189333Kb
Longest record: 5Kb, moves: 0/448987 (0%)
Change moves: 1378/5737 (24%), moved: 0Mb
Longest LWN: 1032Kb, moves: 45/269 (16%), moved: 41Mb
Last redo scn: 0x06b0.9406fb58 (7355467496280)
----- Recovery Hash Table Statistics ---------
Hash table buckets = 32768
Longest hash chain = 3
Average hash chain = 35384/25746 = 1.4
Max compares per lookup = 3
Avg compares per lookup = 847056/876618 = 1.0
*** 2009-03-03 14:20:46.062
KCRA: start recovery claims for 35384 data blocks
*** 2009-03-03 14:21:02.171
KCRA: blocks processed = 35384/35384, claimed = 35384, eliminated = 0
*** 2009-03-03 14:21:02.531
Recovery of Online Redo Log: Thread 1 Group 2 Seq 12911 Reading mem 0
*** 2009-03-03 14:21:04.718
Recovery of Online Redo Log: Thread 1 Group 1 Seq 12912 Reading mem 0
*** 2009-03-03 14:21:16.296
----- Recovery Hash Table Statistics ---------
Hash table buckets = 32768
Longest hash chain = 3
Average hash chain = 35384/25746 = 1.4
Max compares per lookup = 3
Avg compares per lookup = 849220/841000 = 1.0
*** 2009-03-03 14:21:28.468
tkcrrsarc: (WARN) Failed to find ARCH for message (message:0x1)
tkcrrpa: (WARN) Failed initial attempt to send ARCH message (message:0x1)
*** 2009-03-03 14:26:25.781
kwqmnich: current time:: 14: 26: 25
kwqmnich: instance no 0 check_only flag 1
kwqmnich: initialized job cache structure
ktsmgtur(): TUR was not tuned for 360 secs
Windows Server 2003 Version V5.2 Service Pack 2
CPU : 8 - type 586, 4 Physical Cores
Process Affinity : 0x00000000
Memory (Avail/Total): Ph:7447M/8185M, Ph+PgF:6833M/9984M, VA:385M/3071M
Instance name: vmsdbsea
Redo thread mounted by this instance: 0 <none>
Oracle process number: 0
Windows thread id: 2460, image: ORACLE.EXE (SHAD)
Dynamic strand is set to TRUE
Running with 2 shared and 18 private strand(s). Zero-copy redo is FALSE
*** 2009-03-03 08:06:51.921
*** ACTION NAME:() 2009-03-03 08:06:51.905
*** MODULE NAME:(expdp.exe) 2009-03-03 08:06:51.905
*** SERVICE NAME:(xxxxxxxxxx) 2009-03-03 08:06:51.905
*** SESSION ID:(118.53238) 2009-03-03 08:06:51.905
SHDW: Failure to establish initial communication with MCP
SHDW: Deleting Data Pump job infrastructure
Is it a system memory issue for Data Pump? My exp works well.
How do I fix this issue?
JIM
Edited by: user589812 on Mar 3, 2009 5:07 PM
Edited by: user589812 on Mar 3, 2009 5:22 PM -
Dataguard between different oracle versions
Can I Data Guard an 11.1.0.6 database to a standby 11.2.0.1 database?
For rolling upgrade purposes, yes, but only to a logical standby. You can also use the transient logical standby rolling upgrade if you are a physical standby user.
See the following:
Mixed Oracle Version support with Data Guard Redo Transport Services (Doc ID 785347.1) on Oracle Support
MAA Rolling Upgrade paper at http://www.oracle.com/technology/deploy/availability/maa/maa_wp_11g_upgrades_made_easy.pdf
Transient Logical Standby Rolling Upgrade MAA paper at http://www.oracle.com/technology/deploy/availability/pdf/maa_wp_11g_transientlogicalrollingupgrade.pdf
Larry -
Data Guard with different Oracle versions?
Hi,
I have one query on Data Guard.
Can I configure Data Guard with two different Oracle versions?
i.e. Primary => 8i
Standby => 10g
or either way. Please post your comments.
JAI HIND
Darshan
Some of the parameters that you need with a physical standby database are:
*.background_dump_dest='/ford/app/oracle/admin/xchbot1/bdump'
*.compatible='9.2.0.7'
*.control_files='/home30/oradata/xchange/xchbot1/control01.ctl','/home30/oradata/xchange/xchbot1/control02.ctl','/home30/oradata/xchange/xchbot1/control03.ctl'
*.core_dump_dest='/ford/app/oracle/admin/xchbot1/cdump'
*.db_block_buffers=1024
*.db_block_size=8192
*.db_file_multiblock_read_count=8# SMALL
*.db_files=1000# SMALL
*.db_name='xchbot1'
*.global_names=TRUE
*.log_archive_dest_1='LOCATION=/home30/oradata/xchange/xchbot1/archivelog'
*.log_archive_dest_2='SERVICE=standby'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_format='arch_%t_%s.arc'
*.log_archive_start=true
*.log_buffer=16384000# SMALL
*.log_checkpoint_interval=10000
*.max_dump_file_size='10240'# limit trace file size to 5 Meg each
*.parallel_max_servers=5
*.parallel_min_servers=1
*.processes=50# SMALL
*.rollback_segments='rbs01','rbs02','rbs03','rbs04','rbs05','rbs06','rbs07','rbs08','rbs09','rbs10'
*.shared_pool_size=67108864
*.sort_area_retained_size=2048
*.sort_area_size=10240
*.user_dump_dest='/ford/app/oracle/admin/xchbot1/udump' -
Transports between different Oracle versions
Hello Experts,
Is it possible to have DEV, QAS on Oracle 10.2.0.2 and PRD on Oracle 9.2.0.8? Will the transports work?
Thanks & Regards,
tamilboyus
> Is it possible to have DEV, QAS on Oracle 10.2.0.2 and PRD on Oracle 9.2.0.8? Will the transports work?
Sure - the whole transport thing works database-independently.
But you should really move on with your Oracle versions there - 10.2.0.2: old, 9.2.0.8: damn old...
regards,
Lars -
Inserting data into multiple tables (Oracle version 9.2.0.6)
Hi,
we are going to receive the following XML file from one of our vendors. We need to parse the file and then save the data into multiple database tables (around 3).
<?xml version="1.0"?>
<datafeed xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="DailyFeed.xsd" deliverydate="2007-02-14T00:00:00" vendorid="4">
<items count="1">
<item feed_id="001379" mode="MERGE">
<content empbased="true">
<emp>10000000</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="EN">
<url>www.pqr.com</url>
<description>pqr website</description>
</link>
<link lang="DE">
<url>www.efg.com</url>
<description>efg website</description>
</link>
</links>
</content>
<content empbased="true">
<emp>10000001</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="EN">
<url>www.abc.com</url>
<description>abc website</description>
</link>
<link lang="DE">
<url>www.xyz.com</url>
<description>xyz website</description>
</link>
</links>
</content>
<content empbased="true">
<emp>10000002</emp>
<value>
</value>
<date>2006-01-16</date>
<unit>CHF</unit>
<links>
<link lang="IT">
<url>www.rst.com</url>
<description>rst website</description>
</link>
</links>
</content>
</item>
</items>
</datafeed>
Now the operation to be performed on the table depends on the mode attribute. Further, some basic validations need to be done using the count attribute. The item, content, and link tags are recurring elements.
The problem is that I am not able to extract the attributes mode, feed_id, and lang correctly through a SQL query (they are getting duplicated), though I was able to get the deliverydate and vendorid attributes since they are non-repetitive. Here are the scripts:
create table tbl_xml_rawdata (xml_content xmltype);
create directory xmldir as 'c:\xml';
--put the above xml file in this directory and name it testfile.xml
Declare
l_bfile BFILE;
l_clob CLOB;
BEGIN
l_bfile := BFileName('XMLDIR', 'testfile.xml');
dbms_lob.createtemporary(l_clob, cache=>FALSE);
dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
dbms_lob.loadFromFile(dest_lob => l_clob,
src_lob => l_bfile,
amount => dbms_lob.getLength(l_bfile));
dbms_lob.close(l_bfile);
insert into tbl_xml_rawdata values(xmltype.createxml(l_clob));
commit;
end;
My query is:
select extractvalue(value(b),'/datafeed/@deliverydate') ddate, extractvalue(value(b),'/datafeed/@vendorid') vendorid,
extractvalue( value( c ), '//@feed_id') feed_id,
extractvalue( value( a ), '//@empbased') empbased,
extractvalue( value( a ), '//emp') emp,
extractvalue( value( a ), '//value') value,
extractvalue( value( a ), '//unit') unit,
extractvalue( value( a ), '//date') ddate1,
extract( value( a ), '//links/link/url') url,
extract( value( a ), '//links/link/description') description
from tbl_xml_rawdata t,
table(xmlsequence(extract(t.xml_content,'/datafeed/items/item/content'))) a,
table(xmlsequence(extract(t.xml_content,'/'))) b ,
table(xmlsequence(extract(t.xml_content,'/datafeed/items/item'))) c;
If the above query is run, the feed_id is Cartesian-joined with the other data, which is wrong.
How should I go about this so that I get one relational record for each element and its sub-elements?
Also, if this is not doable in SQL, can someone direct me to a PL/SQL example? I read that dbms_xmldom and dbms_xmlparser can be used to traverse an XML document, but I don't know how to use them.
Any help please??
I'm still getting the same error while installing Oracle patch set 9.2.0.6. I downloaded the patchset two weeks back.
Please advise where to download the correct version. -
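If the SQL/XML approach in 9.2 proves unwieldy, the traversal shape the query needs (each content row tied to its own item, each link tied to its own content) can be illustrated outside the database. Here is a minimal sketch in Python on a trimmed copy of the sample feed (namespace attributes dropped for brevity); it is not PL/SQL, just a demonstration of the nesting that avoids the Cartesian join:

```python
import xml.etree.ElementTree as ET

# Trimmed version of the vendor feed from the post above.
FEED = """<?xml version="1.0"?>
<datafeed deliverydate="2007-02-14T00:00:00" vendorid="4">
  <items count="1">
    <item feed_id="001379" mode="MERGE">
      <content empbased="true">
        <emp>10000000</emp>
        <unit>CHF</unit>
        <links>
          <link lang="EN"><url>www.pqr.com</url></link>
          <link lang="DE"><url>www.efg.com</url></link>
        </links>
      </content>
    </item>
  </items>
</datafeed>"""

def flatten(xml_text):
    """Walk datafeed -> item -> content -> link, carrying parent
    attributes down so every output row is correctly scoped."""
    root = ET.fromstring(xml_text)
    rows = []
    for item in root.iter("item"):
        for content in item.findall("content"):
            for link in content.findall("links/link"):
                rows.append({
                    "feed_id": item.get("feed_id"),
                    "mode": item.get("mode"),
                    "emp": content.findtext("emp"),
                    "lang": link.get("lang"),
                    "url": link.findtext("url"),
                })
    return rows

rows = flatten(FEED)
for r in rows:
    print(r["feed_id"], r["emp"], r["lang"], r["url"])
```

The same shape applies in the SQL version: drive each xmlsequence() from value() of its parent row source (links from value(a), content from value(c)) rather than from t.xml_content, so child rows stay correlated with their parents instead of being cross-joined.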
Data migration between different ECC versions
Is there a way to migrate data between two different ECC versions?
I know that there are differences between the tables, and therefore TDMS transports are not possible. But is there another way to get master data from one version (6.0 EHP 3) to another (6.0 EHP 4)? For example, a tool that can prepare the exported data for import into a higher version?
thx
That's indeed the most feasible option we have on the table right now, although we have a lot of Z-fields and would need to transfer them with LSMW.
The ZMATMAS has only been built for outbound... -
Replication between different db versions?
Can anyone tell me what limitations there might be when performing data replication between different Oracle database versions?
Thanks
Depends on the versions and what exactly you mean by "replication", since there are a variety of technologies that may fall under that umbrella.
You can't directly replicate data between a 7.3.4 database and a 10.1 database, for example. You can use a variety of replication technologies between a 9.2 database and a 10.1 database -- materialized views, multi-master materialized views, Streams, transportable tablespaces -- though you cannot use things like Data Guard, which may fall under the replication umbrella in some cases.
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC -
How to find tables with columns that are not supported by a Data Pump NETWORK_LINK export
Hi Experts,
We are trying to import a database into a new DB using Data Pump with NETWORK_LINK.
Per the Oracle documentation, tables with columns that are object types are not supported in a network export. An ORA-22804 error will be generated and the export will move on to the next table. To work around this restriction, you can manually create the dependent object types within the database from which the export is being run.
My question: how do I find the tables with columns of object types that are not supported in a network export?
We have LOB objects and the Oracle Spatial SDO_GEOMETRY object type. Our database size is about 300 GB; normally exp takes 30 hours.
We are trying to use Data Pump with NETWORK_LINK to speed up the export process.
How do we handle the Oracle Spatial SDO_GEOMETRY type issue during Data Pump?
Our system is 32-bit Windows 2003 with a 10gR2 database.
Thanks
Jim
Edited by: user589812 on Nov 3, 2009 12:59 PM
Hi,
I remember there being issues with SDO_GEOMETRY and Data Pump. You may want to contact Oracle Support about this issue.
Dean -
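On actually finding the affected tables up front: the usual approach (my suggestion, not from the thread) is to query DBA_TAB_COLUMNS for columns whose DATA_TYPE appears in DBA_TYPES as a user-defined object type such as SDO_GEOMETRY; plain LOB columns (CLOB/BLOB) are not the problem here. The filtering logic, sketched in Python over sample rows shaped like a dictionary-view result (all names and data invented):

```python
# Hypothetical rows as (owner, table_name, column_name, data_type),
# shaped like a DBA_TAB_COLUMNS query result.
tab_columns = [
    ("JIM", "CUSTOMERS", "NAME",     "VARCHAR2"),
    ("JIM", "DOCUMENTS", "BODY",     "CLOB"),
    ("JIM", "PARCELS",   "SHAPE",    "SDO_GEOMETRY"),
    ("JIM", "ADDRESSES", "LOCATION", "SDO_GEOMETRY"),
]

# Types that would come back from DBA_TYPES with TYPECODE = 'OBJECT'
# (hard-coded here; in the database this is a join, not a literal).
object_types = {"SDO_GEOMETRY"}

def tables_with_object_columns(columns, object_types):
    """Return the distinct (owner, table) pairs that have at least
    one column of a user-defined object type."""
    hits = set()
    for owner, table, column, data_type in columns:
        if data_type in object_types:
            hits.add((owner, table))
    return sorted(hits)

print(tables_with_object_columns(tab_columns, object_types))
```

In the database itself the equivalent is a query selecting from DBA_TAB_COLUMNS where DATA_TYPE matches a type in DBA_TYPES with TYPECODE = 'OBJECT'; treat that as a starting point to verify against the 10.2 reference, not a definitive statement of the NETWORK_LINK restriction's exact scope.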
Different Oracle versions Replication?
Hello,
We want to create a replication in our laptop.
The server has Oracle 9i Release 1. On the laptop we want to install Oracle 9i Release 2.
Can we replicate part of the database using different Oracle DBMS versions?
As a comment: the replicated data is SDE information.
Thanks,
Hi;
Please see:
Client / Server / Interoperability Support Between Different Oracle Versions [ID 207303.1]
Re: Replication between 8i and 10g
Regards
Helios -
Hi
I am trying to import data into Oracle 11g Release 2 (11.2.0.1) using the impdp utility and getting the error below:
UDI-00018: Data Pump client is incompatible with database version 11.2.0.1.0
The export dump was taken on a database with Oracle 11g Release 1 (11.1.0.7.0) and I am trying to import into a higher version of the database. Is there any parameter I have to set to avoid this error?
AUTHSTATE=compat
A__z=! LOGNAME
CLASSPATH=/app/oracle/11.2.0/jlib:.
HOME=/home/oracle
LANG=C
LC__FASTMSG=true
LD_LIBRARY_PATH=/app/oracle/11.2.0/lib:/app/oracle/11.2.0/network/lib:.
LIBPATH=/app/oracle/11.2.0/JDK/JRE/BIN:/app/oracle/11.2.0/jdk/jre/bin/classic:/app/oracle/11.2.0/lib32
LOCPATH=/usr/lib/nls/loc
LOGIN=oracle
LOGNAME=oracle
MAIL=/usr/spool/mail/oracle
MAILMSG=[YOU HAVE NEW MAIL]
NLSPATH=/usr/lib/nls/msg/%L/%N:/usr/lib/nls/msg/%L/%N.cat
NLS_DATE_FORMAT=DD-MON-RRRR HH24:MI:SS
ODMDIR=/etc/objrepos
ORACLE_BASE=/app/oracle
ORACLE_HOME=/app/oracle/11.2.0
ORACLE_SID=AMT6
ORACLE_TERM=xterm
ORA_NLS33=/app/oracle/11.2.0/nls/data
PATH=/app/oracle/11.2.0/bin:.:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/oracle/bin:/usr/bin/X11:/sbin:.:/usr/local/bin:/usr/ccs/bin
PS1=nbsud01[$PWD]:($ORACLE_SID)>
PWD=/nbsiar/nbimp
SHELL=/usr/bin/ksh
SHLIB_PATH=/app/oracle/11.2.0/lib:/usr/lib
TERM=xterm
TZ=Europe/London
USER=oracle
_=/usr/bin/env -
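UDI-00018 is odd here given that client and database both report 11.2.0.1.0; a common cause (an assumption worth checking against the environment dump above, e.g. `which impdp` versus ORACLE_HOME) is that the impdp binary picked up from PATH belongs to a different Oracle home than the target instance. The general rule, as I understand it, is that a Data Pump client at or below the server version can attach, while a newer client cannot. A toy check of that rule:

```python
def parse_version(v):
    """'11.2.0.1.0' -> (11, 2, 0, 1, 0), zero-padded so strings with
    a different number of components compare sensibly."""
    parts = tuple(int(p) for p in v.split("."))
    return parts + (0,) * (5 - len(parts))

def client_can_attach(client, server):
    """Sketch of the rule: a client at or below the server version
    can attach; a newer client cannot."""
    return parse_version(client) <= parse_version(server)

print(client_can_attach("11.1.0.7", "11.2.0.1.0"))  # older client, newer server
print(client_can_attach("11.2.0.2", "11.2.0.1.0"))  # newer client, older server
```

Note the dump file itself is a separate question: an 11.1.0.7 export imports into 11.2 without any VERSION parameter, since Data Pump reads dump files from earlier releases.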
Oracle 10g - Data Pump: Export / Import of Sequences ?
Hello,
I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from experienced users.
My question concerns the Data Pump utility and what happens to sequences that were defined in the source database:
I have exported a schema with the following command:
"expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
This worked fine and also the import seemed to work fine with the command:
"impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
It loaded the exported objects directly into the schema of the target database.
BUT:
Something has happened to my sequences. :-(
When I want to use them, all sequences start again at value 1. Since I have already inserted data with higher values into my tables, I get into trouble with the PKs of these tables, because I sometimes used sequences as primary keys.
My questions:
1. Did I do something wrong with the Data Pump utility?
2. What is the correct way to export and import sequences so that they keep their current values?
3. If the behaviour described here is correct, how can I reset the sequences to continue from the last value used in the source database?
Thanks a lot in advance for any help concerning this topic!
Best regards
FireFighter
P.S.
It might be that my English does not sound perfect, since it is not my native language. Sorry for that! ;-)
But I hope that someone can understand nevertheless. ;-)
1. Did I do something wrong with the Data Pump utility?
I do not think so. But maybe with the existing schema :-(
2. What is the correct way to export and import sequences so that they keep their current values?
If the sequences exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
3. If the behaviour described here is correct, how can I reset the sequences to continue from the last value used in the source database?
You can either rerun the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
The easier way is to generate a script from the source, if you know how to do it. -
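The "generate a script from the source" suggestion can be made concrete. On the source, LAST_NUMBER in USER_SEQUENCES approximates each sequence's next value (it includes cached values, so it is a safe lower bound to restart from). A sketch in Python that formats the DDL, assuming you have already fetched (name, last_number, increment_by) rows from the source; the sample values are invented:

```python
def recreate_sequence_ddl(name, last_number, increment_by=1):
    """Build DROP/CREATE statements so the target sequence resumes
    at (or safely past) the source sequence's next value."""
    return [
        f"DROP SEQUENCE {name};",
        f"CREATE SEQUENCE {name} "
        f"START WITH {last_number} INCREMENT BY {increment_by};",
    ]

# Invented rows, shaped like a USER_SEQUENCES query on the source.
sequences = [("EMP_SEQ", 1041, 1), ("ORDER_SEQ", 5020, 20)]

script = []
for name, last_number, inc in sequences:
    script.extend(recreate_sequence_ddl(name, last_number, inc))

print("\n".join(script))
```

Run the generated script on the target after the import (dropping the freshly imported sequences first, as described above), then verify a NEXTVAL from each sequence lands above the highest key already in its table.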
Differences between using Data Pump to back up database and using RMAN ?
What are the differences between using Data Pump to back up a database and using RMAN? What are the pros and cons?
Thanks
Search for "database backup" in
http://docs.oracle.com/cd/B28359_01/server.111/b28318/backrec.htm#i1007289
In short
RMAN -> Physical backup.(copies of physical database files)
Datapump -> Logical backup.(logical data such as tables,procedures)
Docs for RMAN--
http://docs.oracle.com/cd/B28359_01/backup.111/b28270/rcmcncpt.htm#
Docs for Datapump
http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm
Edited by: Sunny kichloo on Jul 5, 2012 6:55 AM -
What are the differences between the older versions of Oracle Financials and 11i?
Vijay,
Thanks for your answer, but I am still not clear about it. I have an example to describe my question in detail.
If there is a final product A, the planning strategy for A is 20 (MTO), and the procurement type of A is F (external procurement) in the MRP2 view of the material master data:
Step 1: I create a sales order.
Step 2: I run MRP for A.
Step 3: I convert the purchase requisition into a purchase order, and the acc.***.cat. field in the purchase order is filled with M automatically, because the acc.***.cat. in planning strategy 20 (MTO) is set to E.
Well, the purchase order is created. What is the relationship between the sales order and the purchase order? What happens with costing between the SO and the PO?
If I delete the E and make the PO a standard PO, what is the difference between a standard PO and the PO including E?
Best Regards
Bob -
Transfer data from two oracle version to one sql server 2005
Hi,
I have two database servers on different machines. They are
1) Oracle 8.1.7.4
2) Oracle 7.3.1.4
I have to create agents that can transfer tables from these two databases to one machine running a SQL Server 2005 database.
Please tell me what the options are. What drivers do I need to install on the machine running SQL Server 2005 so that I can transfer data from both Oracle versions?
Thanks
Rajneesh.
Your Oracle databases are so old you might want to look around and see if you can find dinosaur bones nearby.
Given the differences in data types between Oracle and SQL Server, I'd suggest you start by dumping the data into delimited ASCII files and then loading them using whichever SQL Server tool you wish.