Importing only one table but got a PL/SQL CALL STACK.
Hi,
I am using Oracle Database 11g and Linux Enterprise 5. I am executing a command to import a single table from another schema into our schema. This import is done over the network_link parameter. The log shows many errors, but the table does get imported.
What do these errors mean? Can anyone please suggest how to correct them?
$impdp tables=IUT01.FLX_DD_FIN_STMT_LOGS network_link=DBLINK_IUT01 remap_schema=iut01:ngpiut01 remap_tablespace=iut01:ngpiut01 table_exists_action=replace
Import: Release 11.2.0.3.0 - Production on Wed Mar 28 10:06:54 2012
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Username: sys as sysdba
Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_IMPORT_TABLE_01": sys/******** AS SYSDBA tables=IUT01.FLX_DD_FIN_STMT_LOGS network_link=DBLINK_IUT01 remap_schema=iut01:ngpiut01 remap_tablespace=iut01:ngpiut01 table_exists_action=replace
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 2 MB
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.CONFIGURE_METADATA_UNLOAD [WORK_PHASE]
ORA-03113: end-of-file on communication channel
ORA-02063: preceding line from DBLINK_IUT01
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x8aabd0e0 20462 package body SYS.KUPW$WORKER
0x8aabd0e0 9028 package body SYS.KUPW$WORKER
0x8aabd0e0 6814 package body SYS.KUPW$WORKER
0x8aabd0e0 2691 package body SYS.KUPW$WORKER
0x8aabd0e0 9697 package body SYS.KUPW$WORKER
0x8aabd0e0 1775 package body SYS.KUPW$WORKER
0x9f3376a0 2 anonymous block
Processing object type TABLE_EXPORT/TABLE/TABLE
. . imported "NGPIUT01"."FLX_DD_FIN_STMT_LOGS" 2921 rows
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/COMMENT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39083: Object type REF_CONSTRAINT failed to create with error:
ORA-02298: cannot validate (NGPIUT01.FLX_DD_FIN_ACCOUNT_CODE_FK) - parent keys not found
Failing sql is:
ALTER TABLE "NGPIUT01"."FLX_DD_FIN_STMT_LOGS" ADD CONSTRAINT "FLX_DD_FIN_ACCOUNT_CODE_FK" FOREIGN KEY ("ACCOUNT_CODE") REFERENCES "NGPIUT01"."FLX_DD_ACCOUNTS_B" ("ACCOUNT_CODE") ENABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.DISPATCH_WORK_ITEMS [TABLE_STATISTICS]
ORA-31600: invalid input value 100001 for parameter HANDLE in function CLOSE
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
ORA-06512: at "SYS.DBMS_METADATA", line 709
ORA-06512: at "SYS.DBMS_METADATA", line 5734
ORA-06512: at "SYS.DBMS_METADATA", line 1475
ORA-06512: at "SYS.DBMS_METADATA", line 7481
ORA-06512: at "SYS.KUPW$WORKER", line 2792
ORA-02067: transaction or savepoint rollback required
ORA-06512: at "SYS.DBMS_METADATA", line 1475
ORA-06512: at "SYS.DBMS_METADATA", line 7481
ORA-06512: at "SYS.KUPW$WORKER", line 10928
ORA-03113: end-of-file on communication channel
ORA-02055: distributed update operation failed; rollback required
ORA-02063: preceding lines from DBLINK_IUT01
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x8aabd0e0 20462 package body SYS.KUPW$WORKER
0x8aabd0e0 9028 package body SYS.KUPW$WORKER
0x8aabd0e0 9831 package body SYS.KUPW$WORKER
0x8aabd0e0 1775 package body SYS.KUPW$WORKER
0x9f3376a0 2 anonymous block
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.CONFIGURE_METADATA_UNLOAD [POST_WORK_PHASE]
ORA-03113: end-of-file on communication channel
ORA-02063: preceding line from DBLINK_IUT01
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPW$WORKER", line 9001
----- PL/SQL Call Stack -----
object line object
handle number name
0x8aabd0e0 20462 package body SYS.KUPW$WORKER
0x8aabd0e0 9028 package body SYS.KUPW$WORKER
0x8aabd0e0 6814 package body SYS.KUPW$WORKER
0x8aabd0e0 2691 package body SYS.KUPW$WORKER
0x8aabd0e0 9697 package body SYS.KUPW$WORKER
0x8aabd0e0 1775 package body SYS.KUPW$WORKER
0x9f3376a0 2 anonymous block
Job "SYS"."SYS_IMPORT_TABLE_01" completed with 10 error(s) at 10:07:27
ORA-03113: end-of-file on communication channel

This error means that you have a communication error. The Data Pump code has automatic restart, so when this happens the worker process (KUPW$WORKER) fails with a fatal error, but the MCP process restarts the worker and resumes the job where it left off.
Is the import command getting terminated in the middle of a transaction and dumping the call stack to the screen? I didn't use to get these errors on import. What could be the reason?

Part of the Data Pump job is terminated, but the complete job is not. There is an MCP process that runs and controls the job, and there are also worker processes that are assigned work items to perform. In your case a work item fails because the worker can't communicate with some other process. This causes the worker process to die. The MCP detects this, starts another worker process, and reassigns the same work item that failed. When a worker process fails, it dumps its stack.
All this means is that the communication works, but it looks flaky: the worker process can communicate sometimes but fails at other times. I have no idea which layer is flaky.
Dean
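One hedged way to check whether the database link itself is dropping connections (the link name is taken from the log above; the loop count is arbitrary) is to poll a trivial query across it from the importing database and watch for ORA-03113/ORA-02063:

```sql
-- Sketch: ping the remote database over DBLINK_IUT01 a few times.
-- If the link is flaky, some iterations will raise ORA-03113.
-- Run with SET SERVEROUTPUT ON in SQL*Plus to see the output.
BEGIN
  FOR i IN 1 .. 20 LOOP
    FOR r IN (SELECT sysdate AS now FROM dual@DBLINK_IUT01) LOOP
      DBMS_OUTPUT.PUT_LINE('ping ' || i || ': ' || TO_CHAR(r.now, 'HH24:MI:SS'));
    END LOOP;
  END LOOP;
END;
/
```

If some iterations fail while others succeed, that points at the network or listener layer rather than at Data Pump itself.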
Similar Messages
-
Invalid XML, Expected item name child table but got 'row' UDO name
Hi experts
my client is using SAP 2007B, PL15
I have a problem uploading data via DTW into a UDO master-type table.
I don't have a problem importing data; I only have a problem updating data in the UDO master-type table.
Detail log is Invalid XML, Expected item name 'TB_SALES_AGTDISC_CH' but got 'row' UDO_obj_sales_agt_disc
TB_SALES_AGTDISC - master type
fields are Code, Name
TB_SALES_AGTDISC_CH - master row type
fields are Code, vaidfrm, validto, disc_ltr
Please give some suggestions
Prasad

Hi
I checked my previous thread, but I am still getting a problem importing one additional row into the child table.
The message I get when importing via DTW: Re: Invalid XML, Expected item name<child table> but got 'row' UDO name
Please give me some links or a procedure for how to import a row into the child table.
Prasad -
Importing only tables from full export
hi,
our database is going to go live...
Oracle 10gR2
Windows 2003.
I have a full expdp dump file...
I just want to import only the tables from that dump file...
How can I do that?
There are around 1059 tables...
I searched the net but did not find anything...
Your help is needed...
Thank you...
Waiting for your reply.

Thanks for the reply.
The schema into which the tables need to be imported is different, but it contains the same number of tables with the same structures.
thanks. -
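As a sketch of what was asked above (the directory object, dump file name and schema names are placeholders, not taken from the thread), a full dump can be restricted to tables with INCLUDE=TABLE:

```shell
impdp system/password directory=DATA_PUMP_DIR dumpfile=full.dmp \
      logfile=imp_tables.log include=TABLE \
      remap_schema=srcuser:dstuser table_exists_action=replace
```

Note that the TABLE path also brings along dependent objects such as indexes and constraints; add EXCLUDE filters if those are unwanted.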
Can I import only the tables from dump file
(without synonyms)?
It's in Oracle 9i.
Thanks in advance.

Yes, of course: specify tables=(...).
If you have a large number of tables, then dynamically build a par file from the source db and call that par file on import -
Trying to import a table but getting this:
SAP DBTech JDBC: [258]: insufficient privilege: Not authorized
I have selected Tables/Import.

Hello,
this is a security-related problem.
Please see my blog for required privileges (chapter 2.1 Required security authorizations):
SAP HANA - Modeling Content Migration - part 1: Preparations, Security Requirements, Export
I think you are missing the IMPORT privilege (but it can be another privilege as well).
2.1.3 Target: Import of data content
For table import you need to have IMPORT system privilege and write access to target schema:
for table creation (and optionally data load): CREATE ANY
for table dropping and re-creation (and optionally data load): CREATE ANY, DROP
Note: It might look like error but you do not need any other privileges (no need to grant SELECT, INSERT or DELETE privilege).
In case the target schema does not exist and you wish to create it as part of the import process, you need to grant the user the system privileges IMPORT and CREATE SCHEMA (no other privileges are required).
Note: From owner perspective it does not matter who started the import. Object owner is always user SYSTEM.
Tomas -
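The privileges listed above could be granted roughly like this (user and schema names are placeholders; check the exact syntax against the blog and the HANA documentation):

```sql
-- System privilege needed for any table import
GRANT IMPORT TO import_user;
-- Schema privileges on the target schema
-- (DROP is only needed if tables are dropped and re-created)
GRANT CREATE ANY, DROP ON SCHEMA target_schema TO import_user;
```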
'You can download past purchases on this computer with just one Apple ID every 90 days. You cannot associate this computer with a different Apple ID for 67 days.'
What is this? It's the most frustrating thing, and I just want to scream 'I DIDN'T'. 23 days ago I did, because I was unaware of the new 90-day rule.
But this time, I didn't do anything. I was watching a TV series, and I was purchasing the episodes one at a time (in hopes that this would encourage me to stop and go to sleep after I finished the next one). Anyway, I had just finished watching episode 3 and purchased 4, and it had begun to download, when I left my laptop to go get something to eat. I came back and it had obviously gone into sleep mode, so I had to continue the download. BUT when I try to continue the download, the 'message' pops up. I haven't signed out; it's purchased under j_akale, so I don't understand.
<Edited by Host>

First tip: after installing software updates (I assume your machine started off with 10.4.2, and when you first used it you were prompted to update to 10.4.4), go into the Utilities folder (under Applications), open Disk Utility, and verify and repair disk permissions; this can greatly boost performance and may fix your problem. (In fact, do this once a week.) Go ahead and run TechTool (the AppleCare hardware test CD) and check for any problems. If these come up with nothing, I'd call AppleCare and tell them what is happening. AppleCare will cover you at any Apple dealer and some Apple resellers as well. Going elsewhere will void your warranty, BTW.
-
While importing the table I got error "ORA-39166"
I got error "ORA-39166: Object XXXXXXX_030210 was not found." while importing.
Edited by: AshishS on 03-Feb-2012 04:37

Please post details of the OS and database versions, along with the complete impdp command used and the sections of the log file where this error occurs.
HTH
Srini -
Export and import only table structure
Hi ,
I have two schemas, scott and scott2. The scott schema has tables, indexes and procedures, and the scott2 schema is completely empty.
Now I want the table structures, indexes and procedures from the scott schema in the scott2 schema. No DATA needed.
What is the command to export the table structures, indexes and procedures from the scott schema and import them into the scott2 schema?
Once this is done, I want the scott schema to have full access to the scott2 schema.
Oracle Database 10g Release 10.2.0.1.0 - 64bit Production
Please help...

Pravin wrote:
I used rows=n
but it gives me the below error while importing the dump file:
IMP-00003: ORACLE error 604 encountered
ORA-00604: error occurred at recursive SQL level 1
ORA-01013: user requested cancel of current operation^C

You are getting this error because you hit "Ctrl C" during the import, which essentially cancels the import.
IMP-00017: following statement failed with ORACLE error 604:
"CREATE TABLE "INVESTMENT_DETAILS_BK210509" ("EMP_NO" VARCHAR2(15), "INFOTYP"
"E" VARCHAR2(10), "SBSEC" NUMBER(*,0), "SBDIV" NUMBER(*,0), "AMOUNT" NUMBER("
"*,0), "CREATE_DATE" DATE, "MODIFY_DATE" DATE, "FROM_DATE" DATE, "TO_DATE" D"
"ATE) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 6684672"
" FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "DMSTG" LOGG"
"ING NOCOMPRESS"

Srini -
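With Data Pump on 10.2, the structure-only copy described above can be sketched as follows (directory object, passwords and file names are placeholders):

```shell
# Export metadata only: tables, indexes, procedures, but no rows
expdp scott/tiger schemas=SCOTT content=METADATA_ONLY \
      directory=DATA_PUMP_DIR dumpfile=scott_meta.dmp
# Import the structures into scott2
impdp system/password remap_schema=scott:scott2 \
      directory=DATA_PUMP_DIR dumpfile=scott_meta.dmp logfile=imp_meta.log
```

The grants giving scott access to scott2's objects would still have to be issued separately afterwards.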
How to check if internal table exists in dynamical called subroutine ?
Hi,
in a dynamically called subroutine I'm using an internal table, but in some calls this table does not exist.
How can I check in the code whether the internal table exists or not?
regards,
Hans

In Horst Keller's blog /people/horst.keller/blog/2005/05/27/abap-geek-10--everything-functions-150-but-how my issue is discussed:
All other parameters are handled as if they were declared as global data objects in the top include of the function group, usable only during the execution of the function module: they are visible throughout the function group, but you can access them only while the function module is active. If you access such a parameter while the respective function module is not executing, you get the runtime error GETWA_NOT_ASSIGNED. (Why? Well, technically those guys are represented via field symbols which are valid only during the runtime of the function module.)
The code is in SD pricing. Sometimes the code is called from function module PRICING_BUILD_XKOMV or PRICING_SUBSCREEN_PBO, where TKOMV is defined as a global parameter.
And sometimes it is called from function module PRCING_CHECK, where TKOMV is NOT defined as a parameter.
In the call of the last function, the dump occurs on the ASSIGN statement:
data: ls_tkomv like line of tkomv,
      lv_tablename(30) type c value 'TKOMV[]'.
field-symbols: <lfs> type any table.
assign (lv_tablename) to <lfs>.
if <lfs> is assigned.
  " TKOMV exists in the calling context
endif.
Any suggestions to solve the issue ?
regards,
Hans -
Datapump API: Import all tables in schema
Hi,
how can I import all tables using a wildcard in the datapump-api?
Thanks in advance,
tensai

_tensai_ wrote:
Thanks for the links, but I already know them...
My problem is that I couldn't find an example which shows how to perform an import via the API which imports all tables, but nothing else.
Can someone please help me with a code example?

I'm not sure what you mean by "imports all tables, but nothing else". It could mean that you only want to import the tables but not the data, and/or not the statistics, etc.
Using the samples provided in the manuals:
DECLARE
ind NUMBER; -- Loop index
h1 NUMBER; -- Data Pump job handle
percent_done NUMBER; -- Percentage of job complete
job_state VARCHAR2(30); -- To keep track of job state
le ku$_LogEntry; -- For WIP and error messages
js ku$_JobStatus; -- The job status from get_status
jd ku$_JobDesc; -- The job description from get_status
sts ku$_Status; -- The status object returned by get_status
spos NUMBER; -- String starting position
slen NUMBER; -- String length for output
BEGIN
-- Create a (user-named) Data Pump job to do a "schema" import
h1 := DBMS_DATAPUMP.OPEN('IMPORT','SCHEMA',NULL,'EXAMPLE8');
-- Specify the single dump file for the job (using the handle just returned)
-- and directory object, which must already be defined and accessible
-- to the user running this procedure. This is the dump file created by
-- the export operation in the first example.
DBMS_DATAPUMP.ADD_FILE(h1,'example1.dmp','DATA_PUMP_DIR');
-- A metadata remap will map all schema objects from one schema to another.
DBMS_DATAPUMP.METADATA_REMAP(h1,'REMAP_SCHEMA','RANDOLF','RANDOLF2');
-- Include and exclude
dbms_datapump.metadata_filter(h1,'INCLUDE_PATH_LIST','''TABLE''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/C%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/F%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/G%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/I%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/M%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/P%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/R%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/TR%''');
dbms_datapump.metadata_filter(h1,'EXCLUDE_PATH_EXPR','LIKE ''TABLE/STAT%''');
-- no data please
DBMS_DATAPUMP.DATA_FILTER(h1, 'INCLUDE_ROWS', 0);
-- If a table already exists in the destination schema, skip it (leave
-- the preexisting table alone). This is the default, but it does not hurt
-- to specify it explicitly.
DBMS_DATAPUMP.SET_PARAMETER(h1,'TABLE_EXISTS_ACTION','SKIP');
-- Start the job. An exception is returned if something is not set up properly.
DBMS_DATAPUMP.START_JOB(h1);
-- The import job should now be running. In the following loop, the job is
-- monitored until it completes. In the meantime, progress information is
-- displayed. Note: this is identical to the export example.
percent_done := 0;
job_state := 'UNDEFINED';
while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
dbms_datapump.get_status(h1,
dbms_datapump.ku$_status_job_error +
dbms_datapump.ku$_status_job_status +
dbms_datapump.ku$_status_wip,-1,job_state,sts);
js := sts.job_status;
-- If the percentage done changed, display the new value.
if js.percent_done != percent_done
then
dbms_output.put_line('*** Job percent done = ' ||
to_char(js.percent_done));
percent_done := js.percent_done;
end if;
-- If any work-in-progress (WIP) or Error messages were received for the job,
-- display them.
if (bitand(sts.mask,dbms_datapump.ku$_status_wip) != 0)
then
le := sts.wip;
else
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
else
le := null;
end if;
end if;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
dbms_output.put_line(le(ind).LogText);
ind := le.NEXT(ind);
end loop;
end if;
end loop;
-- Indicate that the job finished and gracefully detach from it.
dbms_output.put_line('Job has completed');
dbms_output.put_line('Final job state = ' || job_state);
dbms_datapump.detach(h1);
exception
when others then
dbms_output.put_line('Exception in Data Pump job');
dbms_datapump.get_status(h1,dbms_datapump.ku$_status_job_error,0,
job_state,sts);
if (bitand(sts.mask,dbms_datapump.ku$_status_job_error) != 0)
then
le := sts.error;
if le is not null
then
ind := le.FIRST;
while ind is not null loop
spos := 1;
slen := length(le(ind).LogText);
if slen > 255
then
slen := 255;
end if;
while slen > 0 loop
dbms_output.put_line(substr(le(ind).LogText,spos,slen));
spos := spos + 255;
slen := length(le(ind).LogText) + 1 - spos;
end loop;
ind := le.NEXT(ind);
end loop;
end if;
end if;
-- dbms_datapump.stop_job(h1);
dbms_datapump.detach(h1);
END;
/

This should import nothing but the tables (excluding the data and the table statistics) from a schema export (including a remapping, shown here); you can play around with the EXCLUDE_PATH_EXPR expressions. Check the serveroutput generated for possible values to use in EXCLUDE_PATH_EXPR.
Use the DBMS_DATAPUMP.DATA_FILTER procedure if you want to exclude the data.
For more samples, refer to the documentation:
http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_api.htm#i1006925
Regards,
Randolf
Oracle related stuff blog:
http://oracle-randolf.blogspot.com/
SQLTools++ for Oracle (Open source Oracle GUI for Windows):
http://www.sqltools-plusplus.org:7676/
http://sourceforge.net/projects/sqlt-pp/ -
How to import only table data, without procedures and synonyms
Hello All,
I want to import only the tables of one database into a new database. I did a table-level export of one schema, but along with it the stored procedures, synonyms, views etc. were exported. Is there any way to stop that? Or is it possible to import only the tables from the dump created?
Regards
Satish

Hi Maran,
Thanks for the feedback. I think I did not specify the parfile with the table list while exporting. I am redoing it now and will update this post within 20 minutes.
Regards
Satish -
Import for table only in one line
There is the following command for import of the schema:-
imp <userid/password> file=<export dump file location> fromuser=<export user name> touser=<import user name> grants=y constraints=Y rows=y log=<import log file location>
But what if we want to import only one table? What should the import statement look like?
I hope my question is clear.
Please help in resolving this doubt.
regards

Just a query!!
I have a 9 GB full dump and I want to recover, say, one 100 MB table from it. Suppose the table is in a schema that is near the end of the dump file.
Why does the imp utility search through the whole dump to come across the table to be imported? Is there any way to cut the search time, so that imp jumps directly to the schema and then starts searching for the table?
Please share experiences.
Regards,
Zaffer Khan -
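For completeness, a single-table classic import along the lines asked above might look like this (user, file and table names are placeholders):

```shell
imp system/password file=full_export.dmp log=imp_emp.log \
    fromuser=SRCUSER touser=DSTUSER tables=EMP ignore=y
```

As for the scan time: classic imp reads the dump file sequentially, so it has to pass over everything before the requested table; only Data Pump dumps carry a master table that allows more selective access.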
I can't insert all values into the table; only default values are inserted?
Hi,
I can insert only default values into the database. Other values do not show up in the DB.
Steps I have done so far:
I have created an EO based on a VO, and the VO has the query: select * from emp. It is attached to the AM.
That AM is attached to the page. The page consists of employee details and a SAVE button.
I set a controller for that page, and the CO's PR calls the create method in the AM.
Default values for the WHO columns and SLNo are also set through a sequence.
In the CO's PFR I call the apply method, which commits the data.
When I run the page and enter values, they are not inserted into the table; only the default values are inserted.
I checked with System.out.println(getattribute("my attr").toString()). It prints the output correctly.
But why are the other values not inserted?
Anybody please help me in this regard. I have been struggling with this for the last week, and I have to apply this to another real-time scenario; the deadline is near.
Regards,
Lakshmi Chandiran

Hi Prince,
Thanks for your immediate response. Please find my code written in the EO, AM and CO.
In the VO I haven't added anything except the query.
Only the following fields are getting inserted in the table.
My EOImpl code:
public void create(AttributeList attributeList) {
  super.create(attributeList);
  System.out.println("NOW I AM IN CREATE METHOD");
  OADBTransaction transaction = getOADBTransaction();
  Number EMP_ID = transaction.getSequenceValue("Employeeid");
  setEmpId(EMP_ID);
  setCreationDate(transaction.getCurrentDBDate());
  setStartDate(transaction.getCurrentDBDate());
  setLastUpdateDate(transaction.getCurrentDBDate());
}
VO:
select * from emp1
AM:
public void saveForm() {
  OAViewObjectImpl empvo = getEMP_VO();
  System.out.println("NOW I AM IN AM SAVEFORM OF EMTS");
  if (!empvo.isPreparedForExecution())
    empvo.executeQuery();
  Row prow = empvo.createRow();
  empvo.insertRow(prow);
  prow.setNewRowState(Row.STATUS_INITIALIZED);
  System.out.println("NOW I AM IN ROW CREATED");
}
public void commitdata() {
  System.out.println("commitMethod()");
  getDBTransaction().commit();
}
CO: PR & PFR:
public void processRequest(OAPageContext pageContext, OAWebBean webBean) {
  super.processRequest(pageContext, webBean);
  if (!pageContext.isFormSubmission())
    pageContext.getApplicationModule(webBean).invokeMethod("saveForm", null);
}
public void processFormRequest(OAPageContext pageContext, OAWebBean webBean) {
  super.processFormRequest(pageContext, webBean);
  OAApplicationModule EMP_AM = pageContext.getApplicationModule(webBean);
  OAViewObject vo = (OAViewObject) EMP_AM.findViewObject("EMP_VO");
  if (pageContext.getParameter("BtnSave") != null) {
    pageContext.getApplicationModule(webBean).invokeMethod("commitdata");
    s1 = (String) vo.first().getAttribute("Empname");
    s2 = (String) vo.first().getAttribute("Emptype");
    s3 = (String) vo.first().getAttribute("Emporg");
    s4 = (String) vo.first().getAttribute("Empcity");
    System.out.println(" values are " + s1 + s2 + s3 + s4);
  }
}
Here I can get the values printed correctly in the output, exactly as entered in the form.
I don't know where the problem is.
Please help me.
Thanks in advance,
Regards,
Lakshmi Chandiran -
How to export and import only data not table structure
Hi Guys,
I am not very familiar with the import/export utilities; please help me.
I have two schema .. Schema1, Schema2
I have been using Schema1, and my valuable data is in it. Now I want to move this data from Schema1 to Schema2.
In Schema2 I have only the table structures, no data.

user1118517 wrote:
Hi Guys,
I am not much aware about import ,export utility please help me ..
I have two schema .. Schema1, Schema2
i used to use Schema1 in that my valuable data is present . now i want to move this data from Schema1 to Schema2 ..
In schema2, I have only the table structure, not any data.

Nothing wrong with exporting the structure. Just use 'ignore=y' on the import. When it tries to do the CREATE TABLE, the CREATE statement will fail because the table already exists, but ignore=y means "ignore failures of CREATE", and it will then proceed to INSERT the data. -
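Under the ignore=y approach described above, the pair of commands could look like this (classic exp/imp; passwords, file and schema names are placeholders):

```shell
# Export Schema1 with its data
exp user1/password owner=SCHEMA1 file=schema1.dmp log=exp1.log
# Import into Schema2: ignore=y skips the failing CREATE TABLE
# statements and loads the rows into the existing tables
imp user2/password file=schema1.dmp fromuser=SCHEMA1 touser=SCHEMA2 \
    ignore=y log=imp2.log
```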
I purchased Lightroom 5, loaded it on my Mac, and updated to version 5.6. I then tried importing photos from my Sony DSC-RX100iii. Unfortunately it only imports JPEG files, no RAW files. What is the problem and how do I fix it?
Then LR should have brought in the RAW files as well. ACR is the RAW processing engine used by LR under the covers, so anything not supported by ACR would not be supported by Lightroom.
Are you shooting RAW plus JPEG? By default, Lightroom will stack them unless you tell it not to. Check to see if there is a small badge on the preview that looks like two sheets of paper with a number; that is how Lightroom indicates a stack. You can unstack them by clicking on the badge. There is also a menu item to stack/unstack photos. Otherwise, I would have to research, but I'm not sure why Lightroom would ignore the RAW photos but still import the JPEGs.