Error while importing schemas using datapump

Hi,
I am trying to import a schema from QC to development. After the import I got the following errors:
Processing object type SCHEMA_EXPORT/TABLE/GRANT/WITH_GRANT_OPTION/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/GRANT/CROSS_SCHEMA/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
ORA-39065: unexpected master process exception in RECEIVE
ORA-39078: unable to dequeue message for agent MCP from queue "KUPC$C_2_20090421161917"
Job "SYS"."uat.210409" stopped due to fatal error at 20:15:13
ORA-39014: One or more workers have prematurely exited.
ORA-39029: worker 2 with process name "DW02" prematurely terminated
ORA-31671: Worker process DW02 had an unhandled exception.
ORA-39078: unable to dequeue message for agent KUPC$A_2_20090421161934 from queue "KUPC$C_2_20090421161917"
ORA-06512: at "SYS.KUPW$WORKER", line 1397
ORA-06512: at line 2
ORA-39029: worker 3 with process name "DW03" prematurely terminated
ORA-31671: Worker process DW03 had an unhandled exception.
ORA-39078: unable to dequeue message for agent KUPC$A_2_20090421162030 from queue "KUPC$C_2_20090421161917"
ORA-06512: at "SYS.KUPW$WORKER", line 1397
ORA-06512: at line 2
ORA-39029: worker 4 with process name "DW04" prematurely terminated
ORA-31671: Worker process DW04 had an unhandled exception.
ORA-39078: unable to dequeue message for agent KUPC$A_2_20090421162031 from queue "KUPC$C_2_20090421161917"
ORA-06512: at "SYS.KUPW$WORKER", line 1397
ORA-06512: at line 2
Did my import complete successfully or not? Please help...

When a Data Pump job runs, it creates a table called the master table. It has the same name as the job. This table is used to keep track of where all of the information in the dump file is located, and it is also used when restarting a job. For some reason, this table got dropped. I'm not sure why the original error was reported, but in most cases Data Pump jobs are restartable, so I was hoping yours would be. You could always just rerun the job. Since the job that failed already created tables and indexes, if you rerun it, all of the objects that depend on those tables and indexes will not be created by default.
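If the master table still exists, a stopped job can usually be reattached and resumed from the command line. A minimal sketch (the job name and credentials here are placeholders, not the ones from your log):
impdp system/password ATTACH=SYS_IMPORT_SCHEMA_01
Import> START_JOB
Import> CONTINUE_CLIENT
In your case the ORA-39065/ORA-39078 errors suggest the job's queues are already gone, so simply rerunning the import may be easier.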
To see why rerunning can be a problem, say you have table tab1 with an index ind1, and both the table and the index are analyzed. Since tab1 already exists, the Data Pump job will mark all of the objects that depend on tab1 to be skipped. This includes the index, table_statistics, and index_statistics. To get around this, you could say
table_exists_action=replace
but this will replace all tables that are in the dumpfile. Your other options are:
table_exists_action=
truncate -- to truncate the data in the table and then just reload the data, but not the dependent objects
append -- to append the data from the dumpfile to the existing table, but do not import the dependent objects
skip -- skip the data and dependent objects from the dumpfile.
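For example, a minimal rerun that keeps the existing tables and only reloads their rows might look like this (directory, dump file, and schema names are placeholders, assuming the same dump file you imported from):
impdp system/password SCHEMAS=UAT DIRECTORY=dump_dir DUMPFILE=uat.dmp TABLE_EXISTS_ACTION=TRUNCATE
Note that TRUNCATE and APPEND reload only the data; dependent objects such as indexes and statistics are still skipped, as described above.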
Hope this helps.
Dean

Similar Messages

  • Getting access denied error while importing file using input type="file"

    Hi All,
    I am using a Struts application in which I need to import a file. I have used input type="file" for this, which goes like:
    <input type="file" id="uploadFile" name="uploadFile" size="50">
    There is an Import button with an onClick event that calls the JavaScript function submitValues(), which validates all the fields on the page:
    <input type="button" name="select" value="Import" class="CSSButton" onClick="javascript:submitValues();">
    The JS function then submits the form and calls the action. The problem is that sometimes, even when the correct path is specified for the file to be imported, I get an access denied error. The error is intermittent; other times it works fine. But when it does occur, I need to log in to the application again before it works. I am using IE7.
    Any idea why I am getting the access denied error while importing? Does it have something to do with the IE7 version or with the input type="file" being used here?
    Thanks for any help anyone can provide.

    vishnuS1984 wrote:
    Hi Friends,
    I have gone through scores of examples and I am failing to understand the right thing to be done to copy a file from one directory to another. Here is my class...
    So let's see... C:\GetMe1 is a directory on your machine, right? And this is what you are doing with that directory:
    public static void copyFiles(File src, File dest) throws IOException
    // dest is a 'File' object but represents the C:\GetMe1 directory, right?
    fout = new FileOutputStream(dest);
    If it's a directory, where in your code are you appending the source file name to the path before trying to open an output stream on it? You're not.
    BTW, this is awful:
    catch (IOException e)
    IOException wrapper = new IOException("copyFiles: Unable to copy file: " +
    src.getAbsolutePath() + "to" + dest.getAbsolutePath()+".");
    wrapper.initCause(e);
    wrapper.setStackTrace(e.getStackTrace());
    throw wrapper;
    }
    1) You're hiding the original IOException and replacing it with your own? For what good purpose?
    2) Even if you had a good reason to do that, this would be simpler and better:
    throw new IOException("your custom message goes here", e);
    rather than explicitly invoking initCause and setStackTrace. Yuck!

  • Error while taking dump using datapump

    getting following error -
    Export: Release 10.2.0.1.0 - Production on Friday, 15 September, 2006 10:31:41
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Starting "XX"."SYS_EXPORT_SCHEMA_02": XX/********@XXX directory=dpdump dumpfile=XXX150906.dmp logfile=XXX150906.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-31642: the following SQL statement fails:
    BEGIN "DMSYS"."DBMS_DM_MODEL_EXP".SCHEMA_CALLOUT(:1,0,0,'10.02.00.01.00'); END;
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
    ORA-06512: at "SYS.DBMS_METADATA", line 907
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DMSYS.DBMS_DM_MODEL_EXP' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    2A68E610 14916 package body SYS.KUPW$WORKER
    2A68E610 6300 package body SYS.KUPW$WORKER
    2A68E610 9120 package body SYS.KUPW$WORKER
    2A68E610 1880 package body SYS.KUPW$WORKER
    2A68E610 6861 package body SYS.KUPW$WORKER
    2A68E610 1262 package body SYS.KUPW$WORKER
    255541A8 2 anonymous block
    Job "XX"."SYS_EXPORT_SCHEMA_02" stopped due to fatal error at 10:33:12
    The required action is to contact customer support. On Metalink I found a note stating this is a bug in 10g Release 1 that was supposed to be fixed in 10g Release 1 version 4.
    Some of the default schemas were purposely dropped from the database. The only default schemas available now are:
    DBSNMP, DIP, OUTLN, PUBLIC, SCOTT, SYS, SYSMAN, SYSTEM, TSMSYS.
    DIP, OUTLN, TSMSYS were created again.
    Could this be the cause of the problem?
    Thanks in adv.

    Hi,
    Below is the DDL taken from a different database. Will this be enough? One more thing: what should the password be? Should it be DMSYS, since this account will not be used by me but by the system?
    CREATE USER "DMSYS" PROFILE "DEFAULT" IDENTIFIED BY "*******" PASSWORD EXPIRE DEFAULT TABLESPACE "SYSAUX" TEMPORARY TABLESPACE "TEMP" QUOTA 204800 K ON "SYSAUX" ACCOUNT LOCK
    GRANT ALTER SESSION TO "DMSYS"
    GRANT ALTER SYSTEM TO "DMSYS"
    GRANT CREATE JOB TO "DMSYS"
    GRANT CREATE LIBRARY TO "DMSYS"
    GRANT CREATE PROCEDURE TO "DMSYS"
    GRANT CREATE PUBLIC SYNONYM TO "DMSYS"
    GRANT CREATE SEQUENCE TO "DMSYS"
    GRANT CREATE SESSION TO "DMSYS"
    GRANT CREATE SYNONYM TO "DMSYS"
    GRANT CREATE TABLE TO "DMSYS"
    GRANT CREATE TRIGGER TO "DMSYS"
    GRANT CREATE TYPE TO "DMSYS"
    GRANT CREATE VIEW TO "DMSYS"
    GRANT DROP PUBLIC SYNONYM TO "DMSYS"
    GRANT QUERY REWRITE TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_JOBS_RUNNING" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_REGISTRY" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_SYS_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TAB_PRIVS" TO "DMSYS"
    GRANT SELECT ON "SYS"."DBA_TEMP_FILES" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_LOCK" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_REGISTRY" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYSTEM" TO "DMSYS"
    GRANT EXECUTE ON "SYS"."DBMS_SYS_ERROR" TO "DMSYS"
    GRANT DELETE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT INSERT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT UPDATE ON "SYS"."EXPDEPACT$" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$PARAMETER" TO "DMSYS"
    GRANT SELECT ON "SYS"."V_$SESSION" TO "DMSYS"
    The other database has DMSYS, and its status is EXPIRED & LOCKED, but I'm still able to take the dump using Data Pump there. How is that possible?
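    If it helps, one way to check whether the Data Mining (ODM) component is actually registered before recreating DMSYS by hand is to query DBA_REGISTRY as a DBA; this is only a diagnostic sketch, not a fix:
    SELECT comp_id, comp_name, status FROM dba_registry ORDER BY comp_id;
    If ODM is not listed there while the export still tries to call DMSYS.DBMS_DM_MODEL_EXP, that mismatch would explain the PLS-00201 in the log above.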

  • Error while importing data using SAPINST system copy method

    Dear Friends,
    I am in process of doing system copy from production to Quality.
    OS: Linux
    Database: DB2
    We have completed the export from production. I uninstalled the quality system before importing, and the uninstall was successful. Now, while importing the exported data using the SAPinst method, I am getting an error in the Define Parameters phase. The error is shown below.
    ==============
    TRACE      2011-07-01 13:15:35.738
    Found Error, error_codes[0] = <SELECT 'DATABASE-SCHEMA '||name||' COUNT : ( '||STRIP(CHAR(cnt1))||' ) ' FROM ( SELECT 'SAPGEQ' AS name , COUNT(*) AS cnt1 FROM SYSCAT.SCHEMATA WHERE schemaname='SAPGEQ' ) AS sub
    SQL1036C  An I/O error occurred while accessing the database.  SQLSTATE=58030>
    TRACE      2011-07-01 13:15:35.738
    During execution of <last_command.sql>, <1> errors occured.
    ERROR      2011-07-01 13:15:35.738 [iaxxinscbk.cpp:260]
    MDB-01999  Error occured, first error is: <SQL1036C  An I/O error occurred while accessing the database.  SQLSTATE=58030>
    ===================
    From the above error it seems it is trying to connect to the old quality database, but I don't know why, as I have already uninstalled the complete instance using SAPinst. I need your expert suggestions to overcome this issue.
    Thanks
    Anil Bhandary

    Hi ,
    This is probably due to the system not being shut down correctly.
    You will have to re-create the control file.
    Regards
    Nagaraju

  • Getting error while importing schema with ORACLE TEXT

    IMP-00003: ORACLE error 20000 encountered
    ORA-20000: Oracle Text error:
    DRG-52204: error while registering index
    DRG-10507: duplicate index name: WORKORDER_Q, owner: SYS
    ORA-06512: at "CTXSYS.DRUE", line 160
    ORA-06512: at "CTXSYS.DRIIMP", line 115
    ORA-06512: at line 2
    IMP-00088: Problem importing metadata for index WORKORDER_Q. Index creation will be skipped
    Database version - Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    Os version - Linux nlxs1012.slb.atosorigin-asp.com 2.6.18-308.el5 #1 SMP Fri Jan 27 17:17:51 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
    We have taken an export of the schema from the production DB and are now importing the data into the QA environment.
    During the import we are facing the above error.

    I am importing objects from P20_MAXIMO to Q25_MAXIMO in another database.
    Below is the import par file:
    USERID='/ as sysdba'
    FILE=exp_P20_MAXIMO_C2364781.dmp
    LOG=imp_P20_MAXIMO__Q25_MAXIMO_C2364781_1.log
    FROMUSER=P20_MAXIMO
    TOUSER=Q25_MAXIMO
    buffer=1000000
    feedback=100000
    Export parfile
    userid='/ as sysdba'
    owner=P20_MAXIMO
    FILE=exp_P20_MAXIMO_C2364781.dmp
    LOG=exp_P20_MAXIMO_C2364781.log
    buffer=10000000
    feedback=100000
    statistics=none
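    Before rerunning the import, it may be worth checking whether a Text (domain) index with that name already exists in the target database; a hedged diagnostic sketch, run as a DBA (the index name is copied from the DRG-10507 message above):
    SELECT owner, index_name, status FROM dba_indexes WHERE index_name = 'WORKORDER_Q' AND ityp_owner = 'CTXSYS';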

  • Getting error while importing metadata using View Objects

    Hi All,
    I am trying to create a repository using View Objects in OBIEE 11.1.5.1, but I am getting an error while viewing the data after importing the metadata into the repository: "[nQSError: 77031] Error occurs while calling remote service ADFService11G. Details: Runtime error for service -- ADFService11G - oracle/apps/fnd/applcore/common/ApplSession".
    I am also getting the error "ADFException-2015: The BI Server is incompatible with the BI-ADF Broker Servlet: BI Server protocol version = null, BI-ADF Broker Servlet protocol version = 1" when testing my sample, which is deployed to the Admin server. I followed the BI Administrator help file guide in order to create the sample for creating a repository using a View Object.
    Admin server log says
    [2011-09-27T02:59:03.646-05:00] [AdminServer] [NOTIFICATION] [] [oracle.bi.integration.adf] [tid: [ACTIVE].ExecuteThread: '2' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: <anonymous>] [ecid: b260b57746aa92d3:-1f9ca26:1328fcfd3e6:-8000-0000000000006744,0] [APP: BIEEOrders] [[
    QUERY:
    <?xml version="1.0" encoding="UTF-8" ?><ADFQuery><Parameters></Parameters><Projection><Attribute><Name><![CDATA[Deptno]]></Name><ViewObject><![CDATA[AppModule.DeptViewObj1]]></ViewObject></Attribute><Attribute><Name><![CDATA[Dname]]></Name><ViewObject><![CDATA[AppModule.DeptViewObj1]]></ViewObject></Attribute><Attribute><Name><![CDATA[Loc]]></Name><ViewObject><![CDATA[AppModule.DeptViewObj1]]></ViewObject></Attribute></Projection><JoinSpec><ViewObject><Name><![CDATA[AppModule.DeptViewObj1]]></Name></ViewObject></JoinSpec></ADFQuery>
    [2011-09-27T02:59:04.199-05:00] [AdminServer] [ERROR] [] [oracle.bi.integration.adf.v11g.obieebroker] [tid: [ACTIVE].ExecuteThread: '2' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: <anonymous>] [ecid: b260b57746aa92d3:-1f9ca26:1328fcfd3e6:-8000-0000000000006744,0] [APP: BIEEOrders] java.lang.NoClassDefFoundError: oracle/apps/fnd/applcore/common/ApplSession[[
         at oracle.bi.integration.adf.ADFDataQuery.makeQueryBuilder(ADFDataQuery.java:81)
         at oracle.bi.integration.adf.ADFDataQuery.<init>(ADFDataQuery.java:70)
         at oracle.bi.integration.adf.ADFReadQuery.<init>(ADFReadQuery.java:15)
         at oracle.bi.integration.adf.ADFService.makeADFQuery(ADFService.java:227)
         at oracle.bi.integration.adf.ADFService.execute(ADFService.java:136)
         at oracle.bi.integration.adf.v11g.obieebroker.ADFServiceExecutor.execute(ADFServiceExecutor.java:185)
         at oracle.bi.integration.adf.v11g.obieebroker.OBIEEBroker.doGet(OBIEEBroker.java:89)
         at oracle.bi.integration.adf.v11g.obieebroker.OBIEEBroker.doPost(OBIEEBroker.java:106)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adf.share.http.ServletADFFilter.doFilter(ServletADFFilter.java:62)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
         at java.security.AccessController.doPrivileged(Native Method)
         at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
         at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:413)
         at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
         at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:161)
         at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:136)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.lang.ClassNotFoundException: oracle.apps.fnd.applcore.common.ApplSession
         at weblogic.utils.classloaders.GenericClassLoader.findLocalClass(GenericClassLoader.java:297)
         at weblogic.utils.classloaders.GenericClassLoader.findClass(GenericClassLoader.java:270)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:305)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:246)
         at weblogic.utils.classloaders.GenericClassLoader.loadClass(GenericClassLoader.java:179)
         ... 38 more
    Please suggest how to make it work.

    Hi,
    Thanks for providing the online help URL; I have already completed the steps mentioned in the URL.
    I am able to import metadata using the View Object, but I get the above error ("[nQSError: 77031] Error occurs while calling remote service ADFService11G. Details: Runtime error for service -- ADFService11G - oracle/apps/fnd/applcore/common/ApplSession".) while validating the data.
    It fails at step 5 of the "To import metadata from an ADF Business Component data source" section of the above URL.

  • Error while import schema in HANA DB

    Hi...
    I am trying to perform an export-import of a HANA DB user instance at schema level including all objects (in my case tables and synonyms) and data.
    Export and import work very well, except when importing a schema that contains a synonym referring to another existing schema.
    Here is a test case:
    - Schema A with a table, and Schema B containing a synonym for Schema A's table.
    - Exported both schemas to CSV, which completed successfully; everything looked OK in the export location.
    - Imported Schema A and then Schema B with the following command.
    Command:
    IMPORT "<bkup_schema>"."*" FROM '<bkp_location>' WITH RENAME SCHEMA <bkup_schema> TO <brand_new_schema>;
    The schema A (with table) imported fine.  Schema B (with synonym) gives the below error :
    "SAP DBTech JDBC: [2048]: column store error: table import failed:  [2003] An index already exists with the same name "
    When trying to use "REPLACE" clause in the command the error changes as below:
    SAP DBTech JDBC: [2048]: column store error: table import failed:  [30117] Binary import failed (cannot execute drop statement)
    Has anybody faced this type of error before??

    Folks, I am running into the same issue. What I figured out was the following:
    1) I turn off the autocommit
    2) I run the import for the first time
    import "<SCHEMA_NAME>"."*" as binary from '<location>' with replace;
    rollback;
    3) I get the error as described
    SAP DBTech JDBC: [2048]: column store error: table import failed:  [30117] Binary import failed (cannot execute drop statement)
    4) I run the import once again with a commit, and surprisingly the error is gone and the import succeeds.
    import "<SCHEMA_NAME>"."*" as binary from '<location>' with replace;
    commit;

  • Error while compiling schema using JAXB 1.0.2 to

    Hi ,
    Please help! I am trying to compile my schema using JAXB 1.0.2 (bundled with Java Web Services Developer Pack 1.3). I am getting a java.util.MissingResourceException. Can anyone tell me which portion of the schema I should look at? Here are the error details.
    thanks
    Rahul
    parsing a schema...
    java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.
    java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
    sorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:324)
    at org.apache.commons.launcher.ChildMain.run(ChildMain.java:269)
    Caused by: java.util.MissingResourceException: Can't find resource for bundle ja
    va.util.PropertyResourceBundle, key parser.cc.8
    at java.util.ResourceBundle.getObject(ResourceBundle.java:314)
    at java.util.ResourceBundle.getString(ResourceBundle.java:274)
    at com.sun.msv.datatype.xsd.regex.RegexParser.ex(RegexParser.java:138)
    at com.sun.msv.datatype.xsd.regex.ParserForXMLSchema.parseCharacterClass
    (ParserForXMLSchema.java:291)
    at com.sun.msv.datatype.xsd.regex.RegexParser.parseAtom(RegexParser.java
    :736)
    at com.sun.msv.datatype.xsd.regex.RegexParser.parseFactor(RegexParser.ja
    va:638)
    at com.sun.msv.datatype.xsd.regex.RegexParser.parseTerm(RegexParser.java
    :342)
    at com.sun.msv.datatype.xsd.regex.RegexParser.parseRegex(RegexParser.jav
    a:320)
    at com.sun.msv.datatype.xsd.regex.RegexParser.parse(RegexParser.java:158
    at com.sun.msv.datatype.xsd.regex.RegularExpression.setPattern(RegularEx
    pression.java:3040)
    at com.sun.msv.datatype.xsd.regex.RegularExpression.setPattern(RegularEx
    pression.java:3051)
    at com.sun.msv.datatype.xsd.regex.RegularExpression.<init>(RegularExpres
    sion.java:3017)
    at com.sun.msv.datatype.xsd.PatternFacet.compileRegExps(PatternFacet.jav
    a:79)
    at com.sun.msv.datatype.xsd.PatternFacet.<init>(PatternFacet.java:67)
    at com.sun.msv.datatype.xsd.TypeIncubator.derive(TypeIncubator.java:261)
    at com.sun.tools.xjc.reader.xmlschema.DatatypeBuilder.restrictionSimpleT
    ype(DatatypeBuilder.java:82)
    at com.sun.xml.xsom.impl.RestrictionSimpleTypeImpl.apply(RestrictionSimp
    leTypeImpl.java:66)
    at com.sun.tools.xjc.reader.xmlschema.DatatypeBuilder.build(DatatypeBuil
    der.java:65)
    at com.sun.tools.xjc.reader.xmlschema.SimpleTypeBuilder.buildPrimitiveTy
    pe(SimpleTypeBuilder.java:161)
    at com.sun.tools.xjc.reader.xmlschema.SimpleTypeBuilder.access$100(Simpl
    eTypeBuilder.java:50)
    at com.sun.tools.xjc.reader.xmlschema.SimpleTypeBuilder$Functor.checkCon
    version(SimpleTypeBuilder.java:201)
    at com.sun.tools.xjc.reader.xmlschema.SimpleTypeBuilder$Functor.restrict
    ionSimpleType(SimpleTypeBuilder.java:276)
    at com.sun.xml.xsom.impl.RestrictionSimpleTypeImpl.apply(RestrictionSimp
    leTypeImpl.java:66)
    at com.sun.tools.xjc.reader.xmlschema.SimpleTypeBuilder.build(SimpleType
    Builder.java:93)
    at com.sun.tools.xjc.reader.xmlschema.cs.DefaultClassBinder.simpleType(D
    efaultClassBinder.java:130)
    at com.sun.xml.xsom.impl.SimpleTypeImpl.apply(SimpleTypeImpl.java:89)
    at com.sun.tools.xjc.reader.xmlschema.cs.ClassSelector._bindToClass(Clas
    sSelector.java:212)
    at com.sun.tools.xjc.reader.xmlschema.cs.ClassSelector.bindToType(ClassS
    elector.java:177)
    at com.sun.tools.xjc.reader.xmlschema.BGMBuilder.populate(BGMBuilder.jav
    a:254)
    at com.sun.tools.xjc.reader.xmlschema.BGMBuilder.buildContents(BGMBuilde
    r.java:223)
    at com.sun.tools.xjc.reader.xmlschema.BGMBuilder._build(BGMBuilder.java:
    116)
    at com.sun.tools.xjc.reader.xmlschema.BGMBuilder.build(BGMBuilder.java:8
    0)
    at com.sun.tools.xjc.GrammarLoader.annotateXMLSchema(GrammarLoader.java:
    424)
    at com.sun.tools.xjc.GrammarLoader.load(GrammarLoader.java:130)
    at com.sun.tools.xjc.GrammarLoader.load(GrammarLoader.java:79)
    at com.sun.tools.xjc.Driver.run(Driver.java:177)
    at com.sun.tools.xjc.Driver._main(Driver.java:80)
    at com.sun.tools.xjc.Driver.access$000(Driver.java:46)
    at com.sun.tools.xjc.Driver$1.run(Driver.java:60)

    Hi,
    A similar problem occurred to me once.
    In that case it was that I copied the files from:
    JWSDP_HOME\jaxb\lib
    to
    JAVA_HOME\jre\lib\endorsed
    when I shouldn't have.
    I should have taken the files only from JWSDP_HOME\jwsdp-shared\lib\endorsed.
    So try taking the files:
    jaxb-api.jar
    jaxb-impl.jar
    jaxb-xjc.jar
    jaxb-libs.jar
    out of the JAVA_HOME\jre\lib\endorsed directory, leaving them only in jaxb\lib where they should be.
    Try that, and let's see if the compilation improves.

  • Stats Not getting imported while importing partition using DATAPUMP API

    Hi,
    I am using the Data Pump API to export a partition of a table and then import that partition into a table on another DB.
    The data is getting imported, but I can see that the partition stats are not getting imported. I am using TABLE_EXISTS_ACTION = APPEND.
    Any idea how to get the stats imported into the partition as well?
    Thanks!
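    One workaround, not mentioned in this thread, is to bring the statistics across separately after the data has been loaded, for example by simply regathering them on the imported partition. A minimal sketch, with owner, table, and partition names as placeholders:
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname     => 'SCOTT',
        tabname     => 'SALES',
        partname    => 'P2012',
        granularity => 'PARTITION');
    END;
    /
    DBMS_STATS.EXPORT_TABLE_STATS and IMPORT_TABLE_STATS on the source and target would be an alternative if regathering is too expensive.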

  • URGENT: Error while importing personalizations using Functional Admin Resp

    Hi all,
    We are facing problem when importing personalizations using Functional Administrator responsibility.
    We have personalised a standard oracle page in development instance and have successfully exported the personalizations in a file.
    But when we try to import the same files in another instance, we get the following error when opening the "Exported Personalizations" tab under "Personalizations > Import/Export":
    The path ' $HOME/ ' you specified does not exist. Please specify an existing path in the profile 'FND: Personalization Document Root Path'.
    We have tried giving different paths in profile option 'FND: Personalization Document Root Path', but it gives the same error for every path in this new instance.
    This was not the case with the development instance where we were able to import/export personalizations successfully.
    Can anybody give inputs on this?
    This is a bit urgent.
    Regards,
    Anas.

    What is the current value in the Profiles Form?
    What is the value displayed under "About this Page" -> "Profiles" on the Functional Administration screen?
    Can you also confirm if the directory has 777 permissions?

  • ATG 10.0.1 - Error while importing data using CIM

    Hi,
    I am trying to set up the data for ATG 10.0.1 using CIM.
    I was able to successfully create the publishing schema. However, when I try to import the data through CIM, it gives me the following error:
    Importing (1 of 37) /Publishing/base/install/epub-role-data.xml to /atg/
    userprofiling/InternalProfileRepository...
    -------DATA IMPORT FAILED-------------------------------------------------------
    +enter [h]elp, [m]ain menu, [q]uit to exit+
    Make sure you have configured the connection details and created the schema.
    exec returned: 255
    Please help. Am I missing something?
    Thanks

    cim error log:
    *** info     Wed Nov 23 00:57:05 PST 2011     1322038625172     atg.cim.task.ant.utility.AntLogger     Total time: 0 seconds
    **** Warning     Wed Nov 23 00:57:05 PST 2011     1322038625176     atg.cim.worker.common.PropertyFileCreatorTask     Attempting to create properties file D:\ATG\ATG10.0.1\home\..\home\servers\cimDbInitializer\localconfig\atg\registry\RepositoryGroups.properties with out contents.
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625177     atg.cim.worker.common.PropertyFileCreatorTask     Wrote File D:\ATG\ATG10.0.1\home\..\home\servers\cimDbInitializer\localconfig\atg\registry\RepositoryGroups.properties
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625197     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask     Executing startSqlRepository for /atg/userprofiling/InternalProfileRepository
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625197     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask     Import File D:\ATG\ATG10.0.1\home\..\Publishing\base\install\epub-role-data.xml
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625197     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask     args : -workspace workspace -comment initial_data_import -user publishing
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625204     atg.cim.task.ant.utility.AntLogger     
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625204     atg.cim.task.ant.utility.AntLogger     start.sqlrepository.windows:
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625263     atg.cim.task.ant.utility.AntLogger     [exec] \Java\jdk1.6.0_27\bin\java was unexpected at this time.
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.task.ant.utility.AntLogger     
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.task.ant.utility.AntLogger     BUILD FAILED
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.task.ant.utility.AntLogger     D:\ATG\ATG10.0.1\CIM\plugins\Base\ant\cim-ant.xml:4: exec returned: 255
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.task.ant.utility.AntLogger     
    **** info     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.task.ant.utility.AntLogger     Total time: 0 seconds
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask     ---     D:\ATG\ATG10.0.1\CIM\plugins\Base\ant\cim-ant.xml:4: exec returned: 255
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:636)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:662)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:487)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at sun.reflect.GeneratedMethodAccessor102.invoke(Unknown Source)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at java.lang.reflect.Method.invoke(Method.java:597)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:105)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.Task.perform(Task.java:348)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.Target.execute(Target.java:357)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.Target.performTasks(Target.java:385)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1329)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at org.apache.tools.ant.Project.executeTarget(Project.java:1298)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.task.ant.utility.AntExecutionWrapper.executeAntTarget(AntExecutionWrapper.java:145)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.worker.AntTask.executeAntTarget(AntTask.java:93)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask.execute(StartSqlRepositoryExecutorTask.java:243)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.database.ImportDataTask.createStartSqlRepositoryTask(ImportDataTask.java:635)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.database.ImportDataTask.importTask(ImportDataTask.java:454)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.database.ImportDataTask.execute(ImportDataTask.java:217)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.task.TaskExecutor.execute(TaskExecutor.java:134)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.task.TaskExecutor.executeTasks(TaskExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.types.ExecuteStepTasks.execute(ExecuteStepTasks.java:51)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.execute(CommandExecutor.java:128)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.executeCommands(CommandExecutor.java:156)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.executeValidations(StepExecutor.java:289)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.ui.text.TextDisplay.processStep(TextDisplay.java:436)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.ui.UIDispatchImpl.processStep(UIDispatchImpl.java:89)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processStep(StepExecutor.java:201)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processCurrentStep(StepExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.runner.Runner.run(Runner.java:152)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.types.LaunchTemplate.execute(LaunchTemplate.java:69)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.execute(CommandExecutor.java:128)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.executeCommands(CommandExecutor.java:156)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processStep(StepExecutor.java:129)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processCurrentStep(StepExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.runner.Runner.run(Runner.java:152)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.types.LaunchTemplate.execute(LaunchTemplate.java:69)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.execute(CommandExecutor.java:128)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.executeCommands(CommandExecutor.java:156)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processStep(StepExecutor.java:129)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processCurrentStep(StepExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.runner.Runner.run(Runner.java:152)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.types.LaunchTemplate.execute(LaunchTemplate.java:69)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.execute(CommandExecutor.java:128)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.executeCommands(CommandExecutor.java:156)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processStep(StepExecutor.java:129)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processCurrentStep(StepExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.runner.Runner.run(Runner.java:152)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.types.LaunchTemplate.execute(LaunchTemplate.java:69)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.execute(CommandExecutor.java:128)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.executeCommands(CommandExecutor.java:156)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processStep(StepExecutor.java:129)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processCurrentStep(StepExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.runner.Runner.run(Runner.java:152)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.types.LaunchWizard.execute(LaunchWizard.java:73)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.execute(CommandExecutor.java:128)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.command.CommandExecutor.executeCommands(CommandExecutor.java:156)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processStep(StepExecutor.java:216)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.step.StepExecutor.processCurrentStep(StepExecutor.java:80)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.runner.Runner.run(Runner.java:152)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.flow.CimFlow.startupFlow(CimFlow.java:69)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.flow.CimFlowCreator.startDefaultCimFlow(CimFlowCreator.java:78)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.Launcher.startCimFlow(Launcher.java:168)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask          at atg.cim.Launcher.main(Launcher.java:68)
    **** Error     Wed Nov 23 00:57:05 PST 2011     1322038625265     atg.cim.worker.dynamo.StartSqlRepositoryExecutorTask     
    *

  • How to resolve error while importing data using IDoc method in LSMW ?

    Hi
    I am trying to import my data using the IDoc method in LSMW.
    After completing the whole LSMW process, when I look into the IDoc that was generated, the error description is as follows.
    It talks about the process code and other stuff.
    Function module not allowed : APPL_IDOC_INPUTI
    Message No. B1252
    Diagnosis :
    The function module APPL_IDOC_INPUTI and the application object type which were determined are not valid for this IDoc.
    I am not able to resolve the problem.
    Please help.
    Regards,
    Rachesh Nambiar

    Check the link below:
    /people/stephen.johannes/blog/2005/08/18/external-data-loads-for-crm-40-using-xif-adapter

  • Error in import data using datapump

    Hi all,
    I just installed Oracle 10gR2 on my local machine and tried to import data into a 9iR2 database, but it ended with errors. Here are the steps I have followed.

    SQL> conn sys/ as sysdba
    Enter password:
    Connected.
    SQL> crete user amila identified by a;
    SP2-0734: unknown command beginning "crete user..." - rest of line ignored.
    SQL> create user amila identified by a;
    User created.
    SQL> grant connect,resource,create any directory to amila;
    Grant succeeded.
    SQL> create or replace directory dump_dir as 'E:\oracle\dump';
    Directory created.
    SQL> grant read,write on directory dump_dir to amila;
    Grant succeeded.
    -==========================================================
    C:\>impdp amila5/a@cd01 directory=dump_dir dumpfile=amila5.dmp logfile=amila5.lo
    g
    Import: Release 10.2.0.1.0 - Production on Friday, 19 October, 2007 13:27:17
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.1.0 - Production
    UDI-00008: operation generated ORACLE error 6550
    ORA-06550: line 1, column 52:
    PLS-00201: identifier 'SYS.DBMS_DATAPUMP' must be declared
    ORA-06550: line 1, column 52:
    PL/SQL: Statement ignored
    Please help with this issue.
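    For what it's worth, the log above shows the client connecting to an Oracle9i database (cd01), and the Data Pump packages only exist from 10g onwards, which would explain the PLS-00201 on SYS.DBMS_DATAPUMP. A quick hedged check on the database you are actually connecting to:
    SELECT owner, object_name, status FROM dba_objects WHERE object_name = 'DBMS_DATAPUMP';
    If nothing comes back, impdp cannot be used against that database; the classic exp/imp utilities would be needed for 9i instead.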

  • Error in trigger while importing schema

    I am getting the error below while importing a schema back into a database:
    ORA-39083: Object type TRIGGER failed to create with error:
    ORA-04072: invalid trigger type
    Failing sql is:
    BEGIN DBMS_DDL.SET_TRIGGER_FIRING_PROPERTY('"pula"','"Trigger1"',FALSE) ; END;
    ORA-39083: Object type TRIGGER failed to create with error:
    I have checked the status of the trigger and everything is fine...
    Please help..!!
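    It may help to compare how the trigger is actually defined on the source and target databases before the import; a hedged diagnostic sketch (the owner and trigger name are copied from the failing statement above, and they are lower case, so the quotes matter):
    SELECT owner, trigger_name, trigger_type, status FROM dba_triggers WHERE owner = 'pula' AND trigger_name = 'Trigger1';
    DBMS_DDL.SET_TRIGGER_FIRING_PROPERTY is normally used to mark a trigger as fire-once (for example for replication), so ORA-04072 suggests it does not consider this object a trigger type it can handle on the target.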

    Rajesh,
    Please go through this Wiki chapter on CoA import using DTW and you should be able to understand this better
    https://wiki.sdn.sap.com/wiki/display/B1/SAPBusinessOne...ToGo-10.DataTransfer+Workbench
    Suda

  • Error while importing RFC schema

    Hello Friends,
    I am getting the following error while importing the RFC schema into the Integration Repository from SAP, which is on an ECC version.
    The error I am getting is:
    Unable to establish connection to R/3 system 10.238.52.79 (system=03, client=250) Troubleshooting tips:
    Is the target system online?
    Check the connection data (note that server names and groups are case-sensitive)
    Tips for administrators (see the configuration guide for more details):
    Does the user have the required authorizations in the target system?
    Is the target system configured correctly in "etc/services"?
    I checked the target system configuration as well as the user authorization. All the connection parameters are also correct.
    Does anybody have an idea about the solution to this problem?
    Regards,
    Nitin.

    I had this problem before. I had to make an entry for SAPGW03 in the services file on the XI server. The entry should look like:
    sapgw03     3303
    That resolved my problem. If you are working with an XI server on Windows, the file is located in the <WINDOWS_DIR>\system32\drivers\etc\ directory.
    Thanks,
    Kalyan Musunuri
    OBT Global Inc.
