Cmd for exporting and importing all the table data in a specified schema
I need to export table data without exporting the definitions. How can I do this?
Why do you have such a requirement?
Are you worried about import failures if the object already exists on the database?
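If Data Pump is available (10g or later), the CONTENT parameter does exactly this; a minimal sketch (credentials and directory name are placeholders):

```sql
-- export rows only, skipping the DDL/metadata
expdp scott/tiger directory=DATA_PUMP_DIR dumpfile=data_only.dmp schemas=scott content=data_only
```

With classic exp there is no data-only export; the usual workaround is to import with IGNORE=Y so that "object already exists" errors are suppressed and rows are loaded into the existing tables.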
Similar Messages
-
How to bind table data to a datatable component and show all the table data?
I bound a table to the datatable component.
The datatable has four rows,
but it always shows the first row's data in all four rows. Why?
Do you mean at design time or at runtime?
At design time, the datatable uses generic fields to
show whether the data type is numeric, text, date, etc...
If this is at runtime,
- what driver are you using?
- how did you bind the data to the table?
- what is the resulting code in the Java backing file?
hth,
-Alexis -
Request for the steps to delete all the table data in a user schema.
Hi Gurus,
Could someone please provide me the steps involved in deleting all the table data in a user (schema)?
Thanks in advance.
Write a script as below:
sys@11GDEMO> select 'truncate table '||owner||'.'||table_name||';' from dba_tables where owner='SCOTT';
'TRUNCATETABLE'||OWNER||'.'||TABLE_NAME||';'
truncate table SCOTT.DEPT;
truncate table SCOTT.EMP;
truncate table SCOTT.BONUS;
truncate table SCOTT.SALGRADE;
truncate table SCOTT.EMPBACKUP;
truncate table SCOTT.T_NAME;
truncate table SCOTT.D_TEMP_STSC;
Example:
sys@11GDEMO> truncate table SCOTT.T_NAME;
Table truncated.
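One caveat worth adding: TRUNCATE fails with ORA-02266 if an enabled foreign key references the table. A similar generator query can produce the DISABLE statements to run first (a sketch, assuming the same SCOTT schema):

```sql
-- generate DISABLE statements for referential (foreign key) constraints
select 'alter table '||owner||'.'||table_name||
       ' disable constraint '||constraint_name||';'
from dba_constraints
where owner = 'SCOTT'
  and constraint_type = 'R';
```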
sys@11GDEMO> -
Query for a customer and all its underlying sites in AR
Hi Expert,
Please help me build a query for a customer and all its underlying sites.
Thanks.
Please check this thread:
Running Total in QLD
Thanks,
Gordon -
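For what it's worth, in the TCA model a query along these lines is a common starting point (a sketch only; the column list is trimmed, `:account_number` is a placeholder bind, and the joins should be verified against your EBS release):

```sql
select p.party_name,
       ca.account_number,
       cas.cust_acct_site_id,
       ps.party_site_number,
       l.address1,
       l.city
from   hz_parties p,
       hz_cust_accounts ca,
       hz_cust_acct_sites_all cas,
       hz_party_sites ps,
       hz_locations l
where  ca.party_id         = p.party_id
and    cas.cust_account_id = ca.cust_account_id
and    ps.party_site_id    = cas.party_site_id
and    l.location_id       = ps.location_id
and    ca.account_number   = :account_number;
```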
Exporting and importing only the incremental data
Dear all,
We are facing a very time-consuming process of moving Oracle schema data from live servers to our test servers.
The scenario is as follows:
1. .dmp files of 1 GB to 3 GB in size are taken from the live server and loaded on the test servers to check the live issues
2. Same-sized dumps are taken for simulation to rectify issues
This process consumes all our time in exporting and importing.
Can anyone suggest how the incremental data alone can be exported and imported to the test server?
Hoping I can get a valuable solution for this.
Thanks in advance
S.Aravind
Hi Aravind,
as Nicolas specified, RMAN would be the best option.
Incremental data refresh is not possible through exp/imp, but it is possible to write a script that automates this refresh process. -
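To illustrate the RMAN route, an incremental strategy looks roughly like this (a sketch only; channel, device, and catalog configuration are omitted):

```sql
-- on the live server: periodic level 0 base, then smaller level 1 increments
RMAN> BACKUP INCREMENTAL LEVEL 0 DATABASE;
RMAN> BACKUP INCREMENTAL LEVEL 1 DATABASE;
```

The test copy is then kept as a duplicate of the live database and rolled forward by applying the level 1 backups, instead of re-importing multi-gigabyte full dumps each time.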
To export and import Oracle 11g table data only
Hi Gurus,
I am just not sure of the procedure to export just the table data, then truncate the table, do some changes (not table structure changes), and then import the same table data into the relevant table.
Could someone please help me with the steps involved?
Thanks a lot in advance.
If you can use Data Pump, here are your commands:
expdp table_owner/password directory=<your_directory> dumpfile=table_name.dmp tables=table_name content=data_only
impdp table_owner/password directory=<your_directory> dumpfile=table_name.dmp tables=table_name table_exists_action=append
Data Pump requires version 10.1 or later.
Dean -
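Putting those commands into the full sequence described above (a sketch; the owner, directory, and table names are placeholders):

```sql
-- 1. export the rows only
--    expdp table_owner/password directory=<your_directory> dumpfile=table_name.dmp tables=table_name content=data_only

-- 2. empty the table, then make the non-structural changes
truncate table table_name;

-- 3. load the saved rows back into the (still existing) table
--    impdp table_owner/password directory=<your_directory> dumpfile=table_name.dmp tables=table_name table_exists_action=append
```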
After I connected my iPhone to my computer to upload some pictures, the couple of apps that I had synced to my iTunes were deleted from my iPhone. I re-connected my phone, dragged the apps back onto it, and all seemed okay, until I went to play an app (dinner dash) and noticed that not only was all my data (scores and levels beaten) erased, but the extra levels I had purchased are no longer available; they are asking me to pay for the levels again. I know they weren't expensive, but I already paid for these and obviously still want them without having to pay again. Any help would be great; I can't even seem to find an email address on the Apple site that I could write to for help!! I either want to get these levels again, or my money back to download them again.
Hello johnohuk,
Thanks for the question. It sounds like most of your music was purchased from the iTunes Store. If your country supports iTunes in the Cloud, you can download your past purchases using this article:
Downloading past purchases from the App Store, iBookstore, and iTunes Store
http://support.apple.com/kb/HT2519
Thanks,
Matt M. -
Export and import of table data
Hi All,
I have a situation where I need to append/import the production table (X) data into a test table (X) which already has some data in it that must not be lost during the operation. The record count is around 6 million.
Any expert suggestions or tips on the export and import commands are highly appreciated.
Thanks in advance.
If you are using:
IMPDP --- use TABLE_EXISTS_ACTION=APPEND
IMP --- use IGNORE=Y
Let's say:
impdp username/pwd directory=<directory_name> dumpfile=<dumpfile_name> tables=X table_exists_action=APPEND
This will append the data to the existing table. -
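For completeness, the classic import equivalent of the append (IGNORE=Y suppresses the "table already exists" error so the rows are inserted into the existing table; the file and table names are placeholders):

```sql
imp username/pwd file=exp_X.dmp tables=X ignore=y
```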
I recently had to delete my Yahoo e-mail account, which is also my Apple ID. Is there any way that I can transfer all my purchased songs, pictures, and apps to a new Apple account?
You do not need to create a new Apple ID, you just need to change the email address (which is your username) for your Apple ID. People change email providers all the time and they just change the ID. In fact, you do not want to create a new ID because you cannot transfer everything to a new ID. All purchases are tied to the ID that was used to make the purchases.
Take a look at this,
Change your Apple ID - Apple Support -
I want to split off my daughter from my Apple account and set up her own. Once she has her own account she would like to have her apps/data/music, etc. on her iPhone and iPad under the new account as it was under mine. Can I set her up a separate account and then move her apps/data/music, etc., to her new account? If so, how do we do this and what are the dangers/pitfalls of doing this (if any)?
You can migrate her to a separate iCloud account if you want, but existing purchased media (apps, music, etc.) will still be tied to the existing ID. Purchased media is permanently tied to the ID used to purchase it.
If you want to move her to a separate iCloud account, you can do that and still continue to share the same ID for iTunes purchases. The ID you use for iTunes does not have to be the same as the one you use for iCloud. To do this, start by saving any photo stream photos she wants to keep to the camera roll (unless already there) by opening the my photo stream album, tapping Select, tapping the photos, tapping the share icon (box with upward facing arrow), then tapping Save to Camera Roll. If she is syncing notes with iCloud, you'll need to open each of the notes and email them to her so she can later copy and paste the text into new notes created in her new account. Then go to Settings>iCloud, tap Delete Account (which only deletes it from this device, not from iCloud; your devices will not be affected by this), choose Keep on My iDevice and provide the password when prompted. Then sign back in with a different Apple ID to create her account and choose Merge to upload the data.
Once you are both on separate accounts, you can each go to icloud.com and delete the other person's data from your account. -
How to clear the table data when leaving the page
Hi all,
I have a requirement to clear all the table data on the page. For that I have written code like:
public void clearRunCalcPage() {
    System.out.println("Inside clearRunCalcPage");
    this.getCmProcessView().executeEmptyRowSet();
    this.getCmProcessFiveRecordsView().executeEmptyRowSet();
}
But I don't have a button for clearing, and I don't have a next button either; I move from one page to another based on menu navigation. My problem is where to put this logic. I tried the constructor of the page, but then I get errors when I come to the page.
Can anyone suggest something?
Call a backing bean method on navigation to the next page and place the below code in the backing bean:
ViewCriteria vc = myView.getViewCriteria("criteriaName");
vc.setProperty(ViewCriteriaHints.CRITERIA_AUTO_EXECUTE, false);
Morgan. -
How to generate the insert script of the table data present in an entire schema
How to generate the insert script of the table data present in an entire schema in a SQL*Plus environment, without TOAD? Can you please help me?
Hi,
First create this function to get the insert scripts.
/* Formatted on 2012/01/16 10:41 (Formatter Plus v4.8.8) */
CREATE OR REPLACE FUNCTION extractdata (v_table_name VARCHAR2)
RETURN VARCHAR2
AS
b_found BOOLEAN := FALSE;
v_tempa VARCHAR2 (8000);
v_tempb VARCHAR2 (8000);
v_tempc VARCHAR2 (255);
BEGIN
FOR tab_rec IN (SELECT table_name
FROM user_tables
WHERE table_name = UPPER (v_table_name))
LOOP
b_found := TRUE;
v_tempa := 'select ''insert into ' || tab_rec.table_name || ' (';
FOR col_rec IN (SELECT *
FROM user_tab_columns
WHERE table_name = tab_rec.table_name
ORDER BY column_id)
LOOP
IF col_rec.column_id = 1
THEN
v_tempa := v_tempa || '''||chr(10)||''';
ELSE
v_tempa := v_tempa || ',''||chr(10)||''';
v_tempb := v_tempb || ',''||chr(10)||''';
END IF;
v_tempa := v_tempa || col_rec.column_name;
IF INSTR (col_rec.data_type, 'CHAR') > 0
THEN
v_tempc := '''''''''||' || col_rec.column_name || '||''''''''';
ELSIF INSTR (col_rec.data_type, 'DATE') > 0
THEN
v_tempc :=
'''to_date(''''''||to_char('
|| col_rec.column_name
|| ',''mm/dd/yyyy hh24:mi'')||'''''',''''mm/dd/yyyy hh24:mi'''')''';
ELSE
v_tempc := col_rec.column_name;
END IF;
v_tempb :=
v_tempb
|| '''||decode('
|| col_rec.column_name
|| ',Null,''Null'','
|| v_tempc
|| ')||''';
END LOOP;
v_tempa :=
v_tempa
|| ') values ('
|| v_tempb
|| ');'' from '
|| tab_rec.table_name
|| ';';
END LOOP;
IF NOT b_found
THEN
v_tempa := '-- Table ' || v_table_name || ' not found';
ELSE
v_tempa := v_tempa || CHR (10) || 'select ''-- commit;'' from dual;';
END IF;
RETURN v_tempa;
END;
SET PAUSE OFF
SET LINESIZE 1200
SET PAGESIZE 100
SET TERMOUT OFF
SET HEAD OFF
SET FEED OFF
SET ECHO OFF
SET VERIFY OFF
SPOOL GET_INSERTS.SQL REP
SELECT EXTRACTDATA('EMP') FROM DUAL;
SPOOL OFF
SET PAUSE ON
SET LINESIZE 120
SET PAGESIZE 14
SET TERMOUT ON
SET HEAD ON
SET FEED 5
SET ECHO ON
SET VERIFY ON
SELECT 'insert into EMP ('
|| CHR (10)
|| 'EMPNO,'
|| CHR (10)
|| 'ENAME,'
|| CHR (10)
|| 'JOB,'
|| CHR (10)
|| 'MGR,'
|| CHR (10)
|| 'HIREDATE,'
|| CHR (10)
|| 'SAL,'
|| CHR (10)
|| 'COMM,'
|| CHR (10)
|| 'DEPTNO) values ('
|| DECODE (empno, NULL, 'Null', empno)
|| ','
|| CHR (10)
|| ''
|| DECODE (ename, NULL, 'Null', '''' || ename || '''')
|| ','
|| CHR (10)
|| ''
|| DECODE (job, NULL, 'Null', '''' || job || '''')
|| ','
|| CHR (10)
|| ''
|| DECODE (mgr, NULL, 'Null', mgr)
|| ','
|| CHR (10)
|| ''
|| DECODE (hiredate,
NULL, 'Null',
'to_date('''
|| TO_CHAR (hiredate, 'mm/dd/yyyy hh24:mi')
|| ''',''mm/dd/yyyy hh24:mi'')'
|| ','
|| CHR (10)
|| ''
|| DECODE (sal, NULL, 'Null', sal)
|| ','
|| CHR (10)
|| ''
|| DECODE (comm, NULL, 'Null', comm)
|| ','
|| CHR (10)
|| ''
|| DECODE (deptno, NULL, 'Null', deptno)
|| ');'
FROM emp;
SELECT '-- commit;'
FROM DUAL;
Now run the above select statement and you will get the following insert statements:
/* Formatted on 2012/01/16 10:57 (Formatter Plus v4.8.8) */
--'INSERT INTO EMP('||CHR(10)||'EMPNO,'||CHR(10)||'ENAME,'||CHR(10)||'JOB,'||CHR(10)||'MGR,'||CHR(10)||'HIREDATE,'||CHR(10)||'SAL,'||CHR(10)||'COMM,'||CHR(10)||'DEPTNO)VALUES('||DECODE(EMPNO,NULL,'NULL',EMPNO)||','||CHR(10)||''||DECODE(ENAME,NULL,'NULL',''''|
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7369, 'SMITH', 'CLERK', 7902, TO_DATE ('12/17/1980 00:00', 'mm/dd/yyyy hh24:mi'), 800, NULL, 20);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7499, 'ALLEN', 'SALESMAN', 7698, TO_DATE ('02/20/1981 00:00', 'mm/dd/yyyy hh24:mi'), 1600, 300, 30);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7521, 'WARD', 'SALESMAN', 7698, TO_DATE ('02/22/1981 00:00', 'mm/dd/yyyy hh24:mi'), 1250, 500, 30);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7566, 'JONES', 'MANAGER', 7839, TO_DATE ('04/02/1981 00:00', 'mm/dd/yyyy hh24:mi'), 2975, NULL, 20);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7654, 'MARTIN', 'SALESMAN', 7698, TO_DATE ('09/28/1981 00:00', 'mm/dd/yyyy hh24:mi'), 1250, 1400, 30);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7698, 'BLAKE', 'MANAGER', 7839, TO_DATE ('05/01/1981 00:00', 'mm/dd/yyyy hh24:mi'), 2850, NULL, 30);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7782, 'CLARK', 'MANAGER', 7839, TO_DATE ('06/09/1981 00:00', 'mm/dd/yyyy hh24:mi'), 2450, NULL, 10);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7788, 'SCOTT', 'ANALYST', 7566, TO_DATE ('04/19/1987 00:00', 'mm/dd/yyyy hh24:mi'), 3000, NULL, 20);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7839, 'KING', 'PRESIDENT', NULL, TO_DATE ('11/17/1981 00:00', 'mm/dd/yyyy hh24:mi'), 5000, NULL, 10);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7844, 'TURNER', 'SALESMAN', 7698, TO_DATE ('09/08/1981 00:00', 'mm/dd/yyyy hh24:mi'), 1500, 0, 30);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7876, 'ADAMS', 'CLERK', 7788, TO_DATE ('05/23/1987 00:00', 'mm/dd/yyyy hh24:mi'), 1100, NULL, 20);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7900, 'JAMES', 'CLERK', 7698, TO_DATE ('12/03/1981 00:00', 'mm/dd/yyyy hh24:mi'), 950, NULL, 30);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7902, 'FORD', 'ANALYST', 7566, TO_DATE ('12/03/1981 00:00', 'mm/dd/yyyy hh24:mi'), 3000, NULL, 20);
INSERT INTO emp (empno, ename, job, mgr, hiredate, sal, comm, deptno)
VALUES (7934, 'MILLER', 'CLERK', 7782, TO_DATE ('01/23/1982 00:00', 'mm/dd/yyyy hh24:mi'), 1300, NULL, 10);
I hope this helps.
Thanks,
P Prakash
Edited by: prakash on Jan 15, 2012 9:21 PM
Edited by: prakash on Jan 15, 2012 9:22 PM -
How to move all the tables from one tablespace to other for a whole schema
Hi,
Is there any way to move all the tables in a schema from one tablespace to other?
If so please help me out to do that.
Thanks
Regards
Gatha
Hi,
here is the steps to move SCOTT's objects from their current tablespace to a NEW_TABLESPACE
would be:
1) do an export of all of scott's objects. Make sure no one modifies them after you
begin this process. You will lose these changes if they do.
$ exp userid=scott/tiger owner=scott
2) You would drop all of Scott's tables. This will get the indexes as well. I don't
suggest dropping the user SCOTT but rather dropping Scott's objects. Dropping SCOTT
would cause any system privileges SCOTT has to disappear and the import would not restore
them. This script can be used to drop someone's tables:
set heading off
set feedback off
set verify off
set echo off
spool tmp.sql
select 'drop table &1..' || table_name || ' cascade constraints;'
from dba_tables
where owner = upper('&1');
spool off
@tmp.sql
3) You would modify the user to not have unlimited tablespace (else the IMP will just
put the objects right back into the tablespace they came from) and then give them
unlimited quota's on the new tablespace you want the objects to go into and on their
temporary tablespace (for the sorts the index creates will do)
alter user SCOTT default tablespace NEW_TABLESPACE;
revoke unlimited tablespace from SCOTT;
alter user SCOTT quota unlimited on NEW_TABLESPACE;
alter user SCOTT quota unlimited on SCOTTS_TEMPORARY_TABLESPACE;
4) you will IMP the data back in for that user. IMP will rewrite the create statements
to use the users default tablespace when it discovers that it cannot create the objects
in their original tablespace. Please make sure to review the file imp.log after you do
this for any and all errors after you import.
imp userid=scott/tiger full=y ignore=y log=imp.log
5) you can optionally restore 'unlimited tablespace' to this user (or not). If you do
not, this user can only create objects in this new tablespace and temp (which in itself
is not a bad thing)...
Regards,
Mohd Mehraj Hussain
http://mehrajdba.wordpress.com -
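As an aside, and not part of the exp/imp method above: if the export/import downtime is a concern, ALTER TABLE ... MOVE can relocate tables directly; indexes must then be rebuilt because MOVE leaves them UNUSABLE. A generator sketch in the same style as the drop script:

```sql
-- generate MOVE statements for the tables
select 'alter table SCOTT.'||table_name||' move tablespace NEW_TABLESPACE;'
from   dba_tables
where  owner = 'SCOTT';

-- generate REBUILD statements for the indexes left UNUSABLE by the move
select 'alter index SCOTT.'||index_name||' rebuild tablespace NEW_TABLESPACE;'
from   dba_indexes
where  owner = 'SCOTT';
```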
What are all the tables used for this report:
hi
what are all the tables used for this report:
report:
<b>Stock Report, which will give opening balance, receipt, issue, and closing balance for any given Duration for any material.</b>
Thanks in advance.
Tables: MSEG, MKPF, MARD.
FOR REFERENCE SEE TRANSACTION : MB5B.
Message was edited by: Sharath kumar R -
How to generate test data for all the tables in oracle
I am planning to use PL/SQL to generate test data for all the tables in a schema. The schema name is given as an input parameter, along with the minimum records for master tables and the minimum records for child tables. Data should be consistent in the columns which are used for constraints, i.e. using the same column values.
planning to implement something like
execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
schemaname = owner,
minrecinmstrtbl= minimum records to insert into each parent table,
minrecsforchildtable = minimum records to enter into each child table of a each master table;
Using all_tables where owner = schemaname;
all_tab_columns and all_constraints where owner = schemaname;
using the dbms_random package.
Does anyone have a better idea how to do this? Is this functionality already there in the Oracle DB?
Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
There are two approaches you can take with this. I'll mention both and then ask which
one you think you would find most useful for your requirements.
One approach I would call the generic bottom-up approach which is the one I think you
are referring to.
This system is a generic test data generator. It isn't designed to generate data for any
particular existing table or application but is the general case solution.
Building on damorgan's advice define the basic hierarchy: table collection, tables, data; so start at the data level.
1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
a. min length - the minimum length to generate
b. max length - the maximum length
c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
d. suffix - a suffix for the generated data; see prefix
e. whether to generate NULLs
3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
min/max scale.
4. store the attribute combinations in Oracle tables
5. build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated.
6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
8. Then add the metadata, business rules and functionality to control the TABLE-TO-TABLE relationships; that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
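A minimal sketch of that last point, keeping a parent/child key consistent while generating random values with DBMS_RANDOM (the dept_gen and emp_gen tables here are hypothetical):

```sql
BEGIN
  FOR d IN 1 .. 4 LOOP
    -- parent row with a random 10-character uppercase name
    INSERT INTO dept_gen (deptno, dname)
    VALUES (d * 10, DBMS_RANDOM.string('U', 10));
    -- child rows reuse the parent key, keeping the FK consistent
    FOR e IN 1 .. 5 LOOP
      INSERT INTO emp_gen (empno, ename, deptno)
      VALUES (d * 100 + e,
              DBMS_RANDOM.string('U', TRUNC(DBMS_RANDOM.value(3, 11))),
              d * 10);
    END LOOP;
  END LOOP;
  COMMIT;
END;
/
```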
The second approach I have used more often. I would call it the top-down approach and I use
it when test data is needed for an existing system. The main use case here is to avoid
having to copy production data to QA, TEST or DEV environments.
QA people want to test with data that they are familiar with: names, companies, code values.
I've found they aren't often fond of random character strings for names of things.
The second approach I use for mature systems where there is already plenty of data to choose from.
It involves selecting subsets of data from each of the existing tables and saving that data in a
set of test tables. This data can then be used for regression testing and for automated unit testing of
existing functionality and functionality that is being developed.
QA can use data they are already familiar with and can test the application (GUI?) interface on that
data to see if they get the expected changes.
For each table to be tested (e.g. DEPT) I create two test system tables. A BEFORE table and an EXPECTED table.
1. DEPT_TEST_BEFORE
This table has all DEPT table columns and a TEST_CASE column.
It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
test for that test case is performed.
CREATE TABLE DEPT_TEST_BEFORE (
TESTCASE NUMBER,
DEPTNO NUMBER(2),
DNAME VARCHAR2(14 BYTE),
LOC VARCHAR2(13 BYTE)
);
2. DEPT_TEST_EXPECTED
This table also has all DEPT table columns and a TEST_CASE column.
It holds DEPT-image rows for each test case that show the row as it should look AFTER the
test for that test case is performed.
Each of these tables are a mirror image of the actual application table with one new column
added that contains a value representing the TESTCASE_NUMBER.
To create test case #3 identify or create the DEPT records you want to use for test case #3.
Insert these records into DEPT_TEST_BEFORE:
INSERT INTO DEPT_TEST_BEFORE
SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
look after test #3 is run. For example, if test #3 creates one new record, add all the
records from the BEFORE data set and add a new one for the new record.
When you want to run test case #3 the process is basically (ignore for this illustration that
there is a foreign key between DEPT and EMP):
1. delete the records from SCOTT.DEPT that correspond to test case #3 DEPT records.
DELETE FROM DEPT
WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
2. insert the test data set records for SCOTT.DEPT for test case #3.
INSERT INTO DEPT
SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
3 perform the test.
4. compare the actual results with the expected results.
This is done by a function that compares the records in DEPT with the records
in DEPT_TEST_EXPECTED for test #3.
I usually store these results in yet another table or just report them out.
5. Report out the differences.
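The comparison in step 4 can be written as a symmetric-difference query; an empty result means the actual rows match the expected rows (a sketch for test case #3):

```sql
-- rows in DEPT but not expected, plus rows expected but missing from DEPT
(SELECT deptno, dname, loc FROM dept
 MINUS
 SELECT deptno, dname, loc FROM dept_test_expected WHERE testcase = 3)
UNION ALL
(SELECT deptno, dname, loc FROM dept_test_expected WHERE testcase = 3
 MINUS
 SELECT deptno, dname, loc FROM dept);
```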
This second approach uses data the users (QA) are already familiar with, is scaleable and
is easy to add new data that meets business requirements.
It is also easy to automatically generate the necessary tables and test setup/breakdown
using a table-driven metadata approach. Adding a new test table is as easy as calling
a stored procedure; the procedure can generate the DDL or create the actual tables needed
for the BEFORE and AFTER snapshots.
The main disadvantage is that existing data will almost never cover the corner cases.
But you can add data for these. By corner cases I mean data that defines the limits
for a data type: a VARCHAR2(30) name field should have at least one test record that
has a name that is 30 characters long.
Which of these approaches makes the most sense for you?