JDeveloper 3.1.1.2 problem: conversion from UTF8 to UCS2 failed

Oracle team:
I am testing JDeveloper 3.1.1.2 and it has a problem at runtime: "conversion from UTF8 to UCS2 failed", followed by an AttributeLoadException. (Our language is Chinese.)
I found that oracle\jbo\server\QueryCollection.class in dacf.zip may be the cause. When I replaced it with the class of the same name from JDeveloper 3.1, the problem above disappeared,
but because that class is not intended for JDeveloper 3.1.1.2, other problems appeared.
Please look into this problem; I hope it will run correctly.

I searched this forum and the SQLJ/JDBC forum, and found a few occurrences of this problem. Among the things people suggested:
* Changing JDBC drivers (experience varied as to which one fixed the problem)
* Adding nls_charset1x.zip to your CLASSPATH
* Ensuring you're using the same character set on the client and server.
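A quick way to check the last point is to ask both sides directly. The sketch below is only illustrative: the connection URL, user, and password are placeholders, and it assumes the Oracle thin driver is on the CLASSPATH; NLS_DATABASE_PARAMETERS and the file.encoding system property are the standard places to look.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CharsetCheck {
    public static void main(String[] args) throws Exception {
        // Older thin drivers need the driver class registered explicitly.
        Class.forName("oracle.jdbc.driver.OracleDriver");
        // Placeholder connection details -- adjust for your environment.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");

        // Server side: the character set the database was created with.
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(
                "SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET'");
        if (rs.next()) {
            System.out.println("Database character set: " + rs.getString(1));
        }

        // Client side: the default encoding the JVM is running with.
        System.out.println("Client file.encoding:   " + System.getProperty("file.encoding"));

        rs.close();
        stmt.close();
        conn.close();
    }
}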
I suggest you take a look at the following discussion threads:
http://technet.oracle.com:89/ubb/Forum8/HTML/001810.html
http://technet.oracle.com:89/ubb/Forum8/HTML/000065.html
http://technet.oracle.com:89/ubb/Forum2/HTML/000820.html
Blaise

Similar Messages

  • Fail to Convert between UTF8 and UCS2: fail UTF Conversion

    I'm using the JReport software, a product which generates Java reports based on a JDBC data source.
    During installation, the JReport installer located the Microsoft Java VM installed on my PC and I accepted it as the working JVM for JReport.
    This worked very well. I installed the JDBC driver for Oracle 8.1.5, which also worked fine at design time (I connected to Oracle and could see my tables and other information in my DB).
    But when I switched to the run-time view (when the real data is about to load), the following exception message appeared:
    "Fail to Convert between UTF8 and UCS2: fail UTF Conversion".
    If there is anybody who understands my problem, any kind of help will be appreciated.
    Any suggestions on this matter will be welcome.
    Thanks in advance.

    I had the same problem using Oracle 9i. The problem lay within the Oracle JDBC driver itself!
    If you're using the JDBC driver shipped with Oracle 9i, stop using it!
    Download the Oracle 10g JDBC driver from the Oracle site and use that driver instead.
    After I changed the driver, I no longer had any problem getting Korean characters from the database.
    Hope this solves your problem too.
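    If you are not sure which driver actually gets loaded at runtime, the connection itself can tell you. This is only a small diagnostic sketch (it assumes you already have an open java.sql.Connection); DatabaseMetaData reports the driver name and version the application is really using:

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.SQLException;

    public class DriverVersionCheck {
        // Prints which JDBC driver and database version the running application really uses.
        static void printDriverInfo(Connection conn) throws SQLException {
            DatabaseMetaData meta = conn.getMetaData();
            System.out.println("Driver name:    " + meta.getDriverName());
            System.out.println("Driver version: " + meta.getDriverVersion());
            System.out.println("Database:       " + meta.getDatabaseProductVersion());
        }
    }

    If the version printed is not the one you think you installed, an older driver jar is probably shadowing it on the CLASSPATH.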

  • Fail to convert between UTF8 and UCS2

    : java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv
    (BC4J throws that exception)
    I get that error message when trying to show the table's contents in a UIX page.
    I use VARCHAR2(40). The error occurs when I enter special characters like õ, ü, Ü and so on.
    This affects only a UTF8-encoded database.
    We use: database 9.0.2.5, JDeveloper 9.0.3.3, UIX, iAS 9.4.2 (I don't know exactly).
    As far as I know, the problem is that when the column holds 40 letters and contains a "special character", it cannot be converted, because the space required to store the values is much larger.
    If I use NVARCHAR, I don't think it will fix this problem.
    And as far as I know, every SQL constant would then have to use the WHERE column = N'constant value' format.
    Q:
    How can I set up the UTF8 database and BC4J so they run correctly?
    I mean: what type should I use, and what column size should I set?
    Thanks in advance,
    Viktor
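    One commonly suggested approach for this situation (offered here as a sketch, not something from the original thread): from Oracle 9i onward a column can be declared with character rather than byte length semantics, so VARCHAR2(40 CHAR) really holds 40 characters even when each one needs several bytes in UTF8. The table and column names below are placeholders:

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class WidenColumn {
        // Switch a column to character-length semantics so 40 means 40 characters, not 40 bytes.
        // Table and column names here are placeholders.
        static void useCharSemantics(Connection conn) throws SQLException {
            Statement stmt = conn.createStatement();
            stmt.executeUpdate("ALTER TABLE my_table MODIFY (my_column VARCHAR2(40 CHAR))");
            stmt.close();
        }
    }

    NVARCHAR2 columns with N'...' literals are the other route the post mentions, but character-length semantics usually require fewer changes to existing SQL.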

    I had the same problem using Oracle 9i. The problem lay within the Oracle JDBC driver itself!
    If you're using the JDBC driver shipped with Oracle 9i, stop using it!
    Download the Oracle 10g JDBC driver from the Oracle site and use that driver instead.
    After I changed the driver, I no longer had any problem getting Korean characters from the database.
    Hope this solves your problem too.

  • Fail to convert between UTF8 and UCS2: failUTF8Conv

    We need to be able to store potentially any UTF8 character in this database,
    especially $ & ( 4 8 < = > from WE8ISO8859P1 and from WE8ISO8859P15.
    So we installed a UTF8 instance and set NLS_LANG to UTF8.
    We have to select/update from a Java client and from SQL*Plus (or similar).
    When we insert from the Java client it is unreadable from SQL*Plus, and
    when we insert from SQL*Plus we get 'Fail to convert between UTF8 and UCS2: failUTF8Conv'.
    Here is the statement executed in SQL*Plus:
    update CPW_TEST set lb_comportement='$ & ( 4 8 < = > ' WHERE ID_TEST=14805;
    Here is the statement executed from Java:
    update CPW_TEST set lb_comportement='$ & ( 4 8 < = > ' WHERE ID_TEST=14804;
    And here is the result in the database:
    SELECT id_test,LB_COMPORTEMENT FROM CPW_TEST WHERE ID_TEST=14804 or ID_TEST=14805
    ID_TEST LB_COMPORTEMENT
    14804 B$ B& B( B4 B8 B< B= B> B
    14805 $ & ( 4 8 < = >
    2 rows selected
    and the dump
    SELECT id_test,dump(LB_COMPORTEMENT) FROM CPW_TEST WHERE ID_TEST=14804 or ID_TEST=14805
    ID_TEST DUMP(LB_COMPORTEMENT)
    14804 Typ=1 Len=26: 194,164,32,194,166,32,194,168,32,194,180,32,194,184,32,194,188,32,194,189,32,194,190,32,194,128
    14805 Typ=1 Len=17: 164,32,166,32,168,32,180,32,184,32,188,32,189,32,190,32,128
    2 rows selected
    I'm not sure, but it seems that SQL*Plus uses true UTF8 (variable-length codes) while the Java client uses UCS-2 (2 bytes per character).
    How can I solve my problem?
    Our configuration:
    Java client (both thin and OCI JDBC driver 8.1.7), SQL*Plus client and Oracle 8.1.7.0.0 database, all on the same computer (W2000 or NT4).
    Thank you for your attention.
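    For what it is worth, the dump output already shows the mismatch: row 14804 holds proper UTF8 (each special character is a two-byte sequence starting with 194), while row 14805 holds single high ISO-8859 bytes, which are not valid UTF8 and therefore cannot be converted by the driver when it reads them. A small standalone illustration (plain Java, nothing Oracle-specific; U+00A4 is used because the byte 164 appears in the dump):

    public class Utf8BytesDemo {
        public static void main(String[] args) throws java.io.UnsupportedEncodingException {
            // The currency sign U+00A4 encodes to the two bytes 194,164 in UTF-8 ...
            byte[] good = "\u00A4".getBytes("UTF-8");
            System.out.println("UTF-8 bytes: " + (good[0] & 0xFF) + "," + (good[1] & 0xFF));

            // ... whereas the single Latin-1 byte 164 on its own is not a legal UTF-8 sequence.
            byte[] bad = { (byte) 164 };
            String decoded = new String(bad, "UTF-8");
            // The decoder cannot interpret the byte and substitutes a replacement character;
            // the Oracle driver reports the same situation as failUTF8Conv instead.
            System.out.println("Decoded lone byte 164: " + decoded);
        }
    }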

    Hi Luis, thanks for your suggestions. You're right that the problem was in JServ and its JVM.
    There was a conflict between different versions of Java: while iFS was using JRE 1.3.1, JServ was configured to use JRE 1.1.8. As soon as I corrected the link to point to the right Java, the problem disappeared.
    Radek

  • java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv

    Hi all,
    I am writing a servlet that connects to Oracle 8.0.6 through JDBC for JDK 1.2 on NT 4.0
    (English version) and it works fine.
    But when the servlet is deployed to Solaris with Oracle 8.0.5 (not a typo: the Oracle on
    NT is 8.0.6 and the Oracle on Solaris is 8.0.5) and JDBC for JDK 1.2 (for Solaris, of course),
    the servlet fails with the exception:
    java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv
    (I am using JRun 3.0 as the application and web server on both NT and Solaris.)
    (The databases on both the NT and Solaris platforms use the UTF8 character set.)
    My servlet looks like this (dbConn is a Connection object proved to be connected to Oracle
    in a previous segment of the same method):
    String strSQL = "SELECT * FROM test";
    try {
        Statement stmt = dbConn.createStatement();
        ResultSet rs = stmt.executeQuery(strSQL);
        while (rs.next()) {
            out.println("id = " + rs.getInt("id"));
            System.out.println("id written");
            out.println("name = " + rs.getString("name")); // <-- this is the line where the exception is thrown
            System.out.println("name written");
        }
    } catch (java.sql.SQLException e) {
        System.out.println("SQL Exception");
        System.out.println(e);
    }
    The definition of the "test" table is:
    create table test(
    id number(10,0),
    name varchar2(30));
    There are about 10 rows in the table "test", all of which contain ONLY Chinese
    characters in the "name" field.
    And when I view the system log, the string "id written" is shown EXACTLY ONCE and then there
    is:
    SQL Exception
    java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv
    That means the result set is fetched back from the database correctly. The problem arises only
    in the getString("name") call.
    Again, this problem only happens when the servlet runs on the Solaris platform.
    At first I expected to see some strange characters on the web page rather than get an
    exception. I know that I should use getBytes to convert between different encodings, but
    that's another story.
    One more piece of information: when all the rows contain only ASCII characters in their "name"
    field, the servlet works perfectly even on Solaris.
    If anyone knows why and how to tackle the problem, please let me know. Feel free to
    send email to me at [email protected]
    Many thanks,
    Ben

    Hi all,
    For the problem I previously posted, I found that Oracle has had such a bug filed before (against Oracle 7.3.2 or thereabouts) and it was classified as NOT A BUG.
    Further research led me to Oracle documentation stating that the error message
    "java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv"
    is a JDBC driver error message with error number ORA-17037.
    I'm still wondering why this behaviour happens only on the Solaris platform. The servlet on the NT machine I am using (which runs Oracle 8.0.6 and JDBC for JDK 1.2) works just fine. I also suspect that this may be some sort of mistake in the JDBC driver.
    Nevertheless, I have found a way to work around not being able to get non-English strings from Oracle on Solaris, and I would like to share it with you all here.
    Before I go on: I found that there are many people out there on the web who encounter the same problem (some of whom said they had been working on it for a month). So if you find that this workaround helps you, please tell those who have the same problem but don't know how to tackle it. Thanks very much.
    Here's how I worked it out. It's kinda simple, but it does work:
    Instead of using:
    String abc = rs.getString("SomeColumnContainsNonEnglishCharacters");
    I used this:
    String abc = new String(rs.getBytes("SomeColumnContainsNonEnglishCharacters"));
    This will give you a string decoded WITH YOUR SYSTEM'S DEFAULT CHARSET (or ENCODING).
    If you want to convert the string read to some other encoding type, say Big5, you can do it like this:
    String abc = new String(rs.getBytes("SomeColumneContainsNonEnglishCharacters"), "BIG5");
    Again, it's simple, but it works.
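    If you need this in several places it is easy to wrap in a helper. A small sketch of that idea (the charset name must be whatever encoding the bytes are really stored in, and the usual caveat applies: this bypasses the driver's own conversion entirely):

    import java.io.UnsupportedEncodingException;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class EncodingWorkaround {
        // Reads a character column as raw bytes and decodes it with an explicit charset,
        // sidestepping the driver's UTF8-to-UCS2 conversion.
        static String getStringAs(ResultSet rs, String column, String charsetName)
                throws SQLException, UnsupportedEncodingException {
            byte[] raw = rs.getBytes(column);
            return (raw == null) ? null : new String(raw, charsetName);
        }
    }

    Usage would then look like: String name = EncodingWorkaround.getStringAs(rs, "name", "BIG5");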
    Finally, if anyone knows why the fail to convert problem happens, please kindly let me know by leaving a word in [email protected]
    Again, thanks to those of you who had tried to help me out.
    Creambun

  • Help me: Fail to convert between UTF8 and UCS2: failUTF8Conv

    Using Sun App Server 7.0 + Studio 4 + Oracle 9i to develop CMP entity beans, when I input Chinese in some fields I encounter the following errors:
    java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:180)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:222)
    at oracle.jdbc.dbaccess.DBError.check_error(DBError.java:916)
    at oracle.jdbc.dbaccess.DBConversion.failUTF8Conv(DBConversion.java:1958)
    at oracle.jdbc.dbaccess.DBConversion.utf8BytesToJavaChars(DBConversion.java:1797)
    at oracle.jdbc.dbaccess.DBConversion.charBytesToJavaChars(DBConversion.java:828)
    at oracle.jdbc.dbaccess.DBConversion.CHARBytesToJavaChars(DBConversion.java:783)
    at oracle.jdbc.ttc7.TTCItem.getChars(TTCItem.java:231)
    at oracle.jdbc.dbaccess.DBDataSetImpl.getCharsItem(DBDataSetImpl.java:1094)
    at oracle.jdbc.driver.OracleStatement.getCharsInternal(OracleStatement.java:2947)
    at oracle.jdbc.driver.OracleStatement.getStringValue(OracleStatement.java:3103)
    at oracle.jdbc.driver.OracleStatement.getObjectValue(OracleStatement.java:5089)
    at oracle.jdbc.driver.OracleStatement.getObjectValue(OracleStatement.java:4964)
    at oracle.jdbc.driver.OracleResultSetImpl.getObject(OracleResultSetImpl.java:404)
    at com.sun.jdo.spi.persistence.support.sqlstore.ResultDesc.getConvertedObject(ResultDesc.java:399)
    at com.sun.jdo.spi.persistence.support.sqlstore.ResultDesc.setFields(ResultDesc.java:746)
    at com.sun.jdo.spi.persistence.support.sqlstore.ResultDesc.getResult(ResultDesc.java:635)
    at com.sun.jdo.spi.persistence.support.sqlstore.SQLStoreManager.executeQuery(SQLStoreManager.java:648)
    at com.sun.jdo.spi.persistence.support.sqlstore.SQLStoreManager.retrieve(SQLStoreManager.java:500)
    at com.sun.jdo.spi.persistence.support.sqlstore.SQLStateManager.reload(SQLStateManager.java:1197)
    at com.sun.jdo.spi.persistence.support.sqlstore.SQLStateManager.loadForRead(SQLStateManager.java:3797)
    at com.sun.jdo.spi.persistence.support.sqlstore.impl.PersistenceManagerImpl.getObjectById(PersistenceManagerImpl.java:604)
    at com.sun.jdo.spi.persistence.support.sqlstore.impl.PersistenceManagerWrapper.getObjectById(PersistenceManagerWrapper.java:247)
    at com.tops.gdgpc.EntityBean.SpeBaseTable.SpeBaseTableBean_1769729755_ConcreteImpl.jdoGetInstance(SpeBaseTableBean_1769729755_ConcreteImpl.java:2479)
    at com.tops.gdgpc.EntityBean.SpeBaseTable.SpeBaseTableBean_1769729755_ConcreteImpl.ejbLoad(SpeBaseTableBean_1769729755_ConcreteImpl.java:2267)
    at com.sun.ejb.containers.EntityContainer.callEJBLoad(EntityContainer.java:2372)
    at com.sun.ejb.containers.EntityContainer.afterBegin(EntityContainer.java:1362)
    at com.sun.ejb.containers.BaseContainer.startNewTx(BaseContainer.java:1405)
    at com.sun.ejb.containers.BaseContainer.preInvokeTx(BaseContainer.java:1313)
    at com.sun.ejb.containers.BaseContainer.preInvoke(BaseContainer.java:462)
    at com.tops.gdgpc.EntityBean.SpeBaseTable.SpeBaseTableBean_1769729755_ConcreteImpl_EJBObjectImpl.getSpeBaseTable(SpeBaseTableBean_1769729755_ConcreteImpl_EJBObjectImpl.java:24)
    at com.tops.gdgpc.EntityBean.SpeBaseTable._SpeBaseTable_Stub.getSpeBaseTable(Unknown Source)
    at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:324)
    at com.sun.forte4j.j2ee.ejbtest.webtest.InvocableMethod$MethodIM.invoke(InvocableMethod.java:233)
    at com.sun.forte4j.j2ee.ejbtest.webtest.EjbInvoker.getInvocationResults(EjbInvoker.java:98)
    at com.sun.forte4j.j2ee.ejbtest.webtest.DispatchHelper.getForward(DispatchHelper.java:191)
    at jasper.dispatch_jsp._jspService(_dispatch_jsp.java:127)
    at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:107)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at com.iplanet.ias.web.jsp.JspServlet$JspServletWrapper.service(JspServlet.java:552)
    at com.iplanet.ias.web.jsp.JspServlet.serviceJspFile(JspServlet.java:368)
    at com.iplanet.ias.web.jsp.JspServlet.service(JspServlet.java:287)
    at javax.servlet.http.HttpServle
    I think it is caused by the fields not being long enough: UTF8 needs 3 bytes for one Chinese character, but in our database only 2 bytes are reserved per Chinese character. How can I resolve this puzzle? Please help me!
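    Whether or not column sizing is the real cause of the exception above, the byte-count claim itself is easy to verify in isolation: in UTF8 a typical Chinese character takes 3 bytes. A quick standalone check (the sample string is arbitrary):

    public class Utf8Length {
        public static void main(String[] args) throws java.io.UnsupportedEncodingException {
            String chinese = "\u4E2D\u6587";   // two Chinese characters
            byte[] utf8 = chinese.getBytes("UTF-8");
            // Prints "2 characters -> 6 bytes": each character needs 3 bytes in UTF-8.
            System.out.println(chinese.length() + " characters -> " + utf8.length + " bytes");
        }
    }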

    I've got many more tips from our experts:
    1. One thing you can try is to make sure that the Oracle database is set to UTF-8. If it is, then you also need to make sure that the client encoding is UTF-8.
    2. A colleague at Oracle points out that oracle.jdbc.* is a package that Oracle provides, not Sun. He says you might try asking this question on an Oracle forum. He writes:
    We have the Oracle Technology Network (OTN) discussion forums, which would be a good place to start with this question:
    Oracle Technology Network (OTN) > Technologies > Java > SQLJ/JDBC
    Try the SQLJ/JDBC forum first.
    -- markus.

  • SQL Exception: Fail to convert between UTF8 and UCS2: failUTF8Conv

    Hi,
    I am trying to use the DBMS_OBFUSCATION_TOOLKIT to encrypt/decrypt some strings through JDBC, but I am getting the following exception:
    SQL Exception: Fail to convert between UTF8 and UCS2: failUTF8Conv
    The input and output parameters for the encryption/decryption functions are VarChar2.
    Our database is using UTF8.
    I will be glad if someone can help me on this.
    Thanks,
    Cenk

    Susi,
    This is just a wild guess, but is your Java client on a different computer from your Oracle 9.2 server, with different locales (or encodings) on the two machines? Am I correct? If so, then I guess you need to change the locales (or encodings) so that they match.
    Have you tried printing out the "System" properties for the two JVMs (Oracle's and your client's)? Something simple like this:
    System.getProperties().list(System.out);
    Good luck,
    Avi.

  • JBO-27022, Fail to convert between UTF8 and UCS2: failUTF8Conv

    Steps to reproduce:
    On a database with UTF8 as the NLS_CHARACTERSET:
    create table test (s1 varchar2(10), s2 varchar2(32));
    DECLARE
      l_data      varchar2(4000);
      return_data varchar2(4000);
      skey        varchar2(255);
      p_str       varchar2(10);
    BEGIN
      p_str  := 'Test';
      skey   := chr(20) || chr(115) || chr(110) || chr(122) || chr(37) || chr(37) || chr(94) || chr(48);
      l_data := rpad(p_str, (trunc(length(p_str)/8)+1)*8, chr(0));
      dbms_obfuscation_toolkit.DESEncrypt(input_string => l_data, key_string => skey,
                                          encrypted_string => return_data);
      insert into test values ('1', return_data);
    END;
    commit;
    Now create a new Workspace and a new "Business Components Package"
    with only table test selected.
    In the tester we get an oracle.jbo.AttributeLoadException:
    (oracle.jbo.AttributeLoadException) JBO-27022: Failed to load value at index 2
    with java object of type java.lang.String due to java.sql.SQLException.
    ----- LEVEL 1: DETAIL 0 -----
    (java.sql.SQLException) Fail to convert between UTF8 and UCS2: failUTF8Conv
    oracle.jbo.AttributeLoadException: JBO-27022: Failed to load value at index 2 with java object of type java.lang.String due to java.sq... [the rest of the posting was truncated by the forum at ~1 kB]

    This is still happening in JDeveloper 9.0.3.1 (jdev9031).
    And it is very easy to reproduce!
    How can we avoid this error?
    Thanks,
    Bert.
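    One way to avoid the error, suggested by the reproduction script itself rather than by the original thread: keep the DESEncrypt output out of character columns altogether. The encrypted bytes are arbitrary binary data and will rarely form valid UTF8, so the driver is bound to fail when it converts the column on read. Storing the bytes in a RAW column and reading/writing them with getBytes/setBytes involves no character-set conversion at all. A minimal JDBC sketch, assuming the test table were created as (s1 varchar2(10), s2 raw(32)) instead:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class RawColumnSketch {
        // Write encrypted bytes into a RAW column; no character-set conversion is involved.
        static void storeEncrypted(Connection conn, String id, byte[] encrypted) throws SQLException {
            PreparedStatement ps = conn.prepareStatement("INSERT INTO test (s1, s2) VALUES (?, ?)");
            ps.setString(1, id);
            ps.setBytes(2, encrypted);
            ps.executeUpdate();
            ps.close();
        }

        // Read the bytes back and hand them to whatever routine decrypts them.
        static byte[] loadEncrypted(Connection conn, String id) throws SQLException {
            PreparedStatement ps = conn.prepareStatement("SELECT s2 FROM test WHERE s1 = ?");
            ps.setString(1, id);
            ResultSet rs = ps.executeQuery();
            byte[] result = rs.next() ? rs.getBytes(1) : null;
            rs.close();
            ps.close();
            return result;
        }
    }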

  • Are there problems converting from Adobe Bridge to Organizer?

    HI,
    I've searched for information on this and haven't found anything, so maybe there aren't any problems :} but... Murphy and I are good friends, so I have to ask if there are problems.
    I currently run Elements 8 on a MacBook and have about 2000 pictures in Bridge and some in iPhoto. I use Bridge for web-design pictures and iPhoto for family stuff. I 'help out' with the computer stuff at a cat shelter in town, and they have just bought Elements 10 and run the PC version. I am thinking of upgrading to 10 so I will be familiar with their version, but I am worried about converting from Bridge to Organizer. Are there any known problems? Is there a tutorial some place about the conversion from Bridge to Organizer?
    Thanks for any ideas

    You have no pictures in Bridge. Bridge is just a browser; it just shows the current state of any folder to which you point it, so that's no problem. If you really want to use organizer, it will pick up your metadata keywords when you import the photos. Photos from iphoto will be duplicated on import to the organizer (normally it just makes a dbase pointer to the existing photo) to avoid inadvertently writing into the iphoto library, which can corrupt it and cause loss of the photos it contains.
    You will lose any stacks in bridge when you import the photos into organizer.

  • Problem converting from wma

    Hi, I am having trouble converting my old WMA files into iTunes. It all started well until it found a track whose name it said was repeated, so it couldn't transfer it. Thinking nothing of it, I clicked OK and the converting stopped. Can I override this, as I have lots which are saved purely as "track 1"? And how do I restart converting from WMA?
    Thanks

    Oh, I would love to know this too! Is there a program to convert the WMA files to MP3?
    Here is what I want to do: I have burned a lot of CDs (some I do not own anymore) to the hard drive on my computer and have put them on my player. I would like to take the WMA files on my computer, convert them to MP3 to make them smaller files, wipe my Creative Jukebox Zen 2.0, and then reload them onto the player, on the theory that the MP3s will take up less space than the WMAs. I am thinking this will help battery life as well as the amount of space on my MP3 player. Does anyone know of a good program that can do this? I assume there are no free programs to do this, so any suggestions on good ones to purchase for a decent amount of money? Or can I just go into any store, pick one up, and have it work on files already on my hard drive? I want the smallest MP3 file size for each song that I can get without compromising the sound on my MP3 player.
    Thanks!

  • Problem converting from pdf to doc (encrypted or corrupt file?)

    Hi everyone.
    I need to convert my resume (PDF) to DOC so I can make a couple of edits, and then convert it back to PDF. The problem is, I can't seem to do it. I first tried straight from Adobe by saving the file as a DOC, but the file formatting was far too screwed up when I did that.
    So I then googled around and found zamzar.com. I tried that site, but kept getting conversion errors which said "file may be encrypted, password protected, or corrupt". I don't have any passwords or encryption on it, so I tried downloading other conversion programs, but I kept getting the same types of errors.
    I checked the "security properties" for the file, and I don't have anything set. I also don't know how the file could be corrupt, since I don't have any issues viewing it, and when I sent it to my friend via email he was able to open it too.
    I have recently been using the Acrobat 9 Pro Extended version that I downloaded as a torrent from a reputable person. I just wanted to get a feel for the program, but maybe this is the problem, since any PDF I've saved since I began using it seems to have that conversion issue. I figure there has to be a workaround. I really don't know what the issue is or how I can solve it, so I'm stumped.
    Any suggestions would be really great.
    Thanks a lot.

    This is a good read for anyone attempting to convert a PDF to a Word document...
    http://www.planetpdf.com/enterprise/article.asp?ContentID=PDF-to-Word_Conversion_-_Why_it_is_so_hard_to_do&gid=7837&fa

  • Problems converting from MS Word 2007 to PDF using LiveCycle Designer 8

    Hi all,
    I hope somebody out there can assist me. I have a Word document (we are using
    Office 2007) and I'm trying to use Adobe LiveCycle Designer 8.0 to convert
    a form from Word to PDF. However, I get an error message (as shown in the
    image below).
    Is this a compatibility problem with Office 2007?
    Appreciate all the assistance.
    Thanks and regards.

    Hi Peggy,
    I found a solution in a tech note from the adobe site:
    http://kb.adobe.com/selfservice/viewContent.do?externalId=329044&sliceId=2
    2007 is a little tricky, since the instructions it gives are for Office 2003.
    In 2007 I went into "Word Options" from that funky button at the top of the Word window.
    From there, go into "Add Ins".
    At the bottom where it says "Manage" select "Com Add Ins" then click "Go".
    You should have a list that includes the Acrobat PDFMaker Office Add In. My Add In file is located at: \\Program Files\Adobe\Acrobat 8.0\PDFMaker\Office\PDFMOfficeAddin.dll
    The document finally converted, but it couldn't auto-detect any of the form fields I'd placed in the original Word document (.docx). Still looking for an answer on that one.
    Good luck.
    Also posted this solution to: https://www.adobeforums.com/webx/.3bc400e4/4

  • NAT Problems Converting from 7.2(2) to 8.6(1)2

    I am trying to replace an ASA 5510 running 7.2(2) with an ASA 5515-X running 8.6(1)2. The problem I am having is that the NAT entries are not working on the ASA 5515-X. Is there anything that needs to be considered when moving the configuration from the ASA 5510 to the ASA 5515-X?

    Hi,
    The ASA's NAT configuration format went through a big change between 8.2 and 8.3. The format changed completely, and therefore none of the old NAT configurations work anymore. These are the "global", "nat" and "static" commands. The new NAT configurations still start with the command "nat", but are otherwise in a totally different format.
    Your new ASA 5500-X series firewall can only run software level 8.6 or above; that is its "oldest" software. Therefore you can't use your old configuration on it. People who simply upgrade the software on an original ASA 5500 series device can just boot the ASA into the new software, and the ASA then migrates the NAT configurations to the new format, though the results aren't always the best.
    One major change also affects ACLs. In the new software you always use the real IP address in the interface ACL when allowing traffic somewhere. So even if you are allowing traffic to some server (that has a static NAT configured on the ASA), you now use the real IP address as the destination rather than the NAT IP address. This is mainly because the ASA now handles NAT before the ACL in the new software.
    There are also some minor changes to the commands related to VPN configurations.
    But the above are the biggest changes.
    How large is the NAT configuration on the original ASA 5510? If we are not talking about a huge configuration, I could probably help with converting the NAT configurations.
    Here is a document I wrote about the new NAT configuration format
    https://supportforums.cisco.com/docs/DOC-31116
    Here is also a good document that might help you compare the old and new NAT configuration formats
    https://supportforums.cisco.com/docs/DOC-9129
    Hope this helps
    Please do remember to mark a reply as the correct answer if it answered your question.
    Feel free to ask more if needed.
    - Jouni

  • Problems Converting from Form6i to Forms10g/Reports10g - first experiences

    Hello,
    We are converting a Forms 6i application to Forms 10g.
    There is one FMB file with a size of more than 3100 KB.
    The first step, and a surprise: after using the Forms Migration Assistant to convert the file from
    6i to 10g, the size is reduced and is now about 1000 KB.
    We then did not use the Migration Assistant and opened all the
    6i files with Form Builder 10g (Windows XP),
    assuming that the conversion would be done when saving the file in Form Builder 10g.
    After making a lot of changes in the big FMB file and saving it,
    then closing the form and opening it again, the changes were lost!
    (Perhaps because there was more than one crash of Form Builder?)
    My questions are:
    - How can I check whether the integrity of my FMB file is OK?
    - Is the minimum hardware requirement for developing a 10g FMB file on a PC more than 256 MB RAM / 450 MHz processor?
    - For Windows XP there is a Developer Suite 10g Preview -
    what does that mean? When will the final version be released?
    My experiences are:
    - If you are integrating a 10g report with a parameter form in Forms 10g: at first it does not work - you have to integrate a JavaBean in Forms (writing an encrypted userid into a cookie
    before calling the report), or the other solution I read about was funny, with nearly no security (the userid in hex in the URL).
    How can you call a 10g report from the web without writing all your credentials in cgicmd.dat and without having OID (single sign-on)?
    A solution would be writing your credentials encrypted into a cookie and then calling the report with the parameter form -
    but how, if you have never done Java before?
    - Handling Form Builder 10g is much slower than Form Builder 6i: is Forms 10g written in Java?
    - You have to do a lot of configuration before calling a report on your development PC.
    Thanks for any comments and tips.
    PS:
    Shall I use web Forms 10g now, learn Java/JDeveloper, or wait for java.net (= the cooperation between Microsoft and Sun)?

    Hello Frank,
    Thank you for your 9i/10g secure calls to web.show_document solution. I spent hours with it but couldn't get it working. I followed all the instructions from the PDF document. Everything looks fine (Java console output included) but the cookie isn't created.
    I watched the TCP packets for the HTTP header to see whether any cookie information was included, but that isn't the case. I switched browsers (IE5 and IE6) and changed their security settings to very low. Nothing seems to matter. I am still getting the "REP-51018: Need database user authentication" error. What am I doing wrong???
    I'm running the 9.0.4 (10g) Forms and Reports Services edition on Windows 2000.
    ============
    Forms code :
    ============
    DECLARE
    rep_url varchar2(2000);
    BEGIN
    rep_url:='http://cen0060s:9989/reports/rwservlet?server=repcl&report=ebn2900r.rep'
    ||'&desformat=htmlcss&destype=cache&envid=ebn&userid=';
    set_custom_property('control.userid_bean',1,'WRITE_LOGOUTPUT','true');
    set_custom_property('control.userid_bean',1,'ADD_USERID',
    get_application_property(username)||'/'||
    get_application_property(password)||'@'||
    get_application_property(connect_string));
    set_custom_property('control.userid_bean',1,'SET_MAX_AGE','30');
    set_custom_property('control.userid_bean',1,'SET_COOKIE_DOMAIN','.dlg.agro.nl');
    set_custom_property('control.userid_bean',1,'SET_COOKIE_PATH','/reports/');
    set_custom_property('control.userid_bean',1,'WRITE_USERID_COOKIE','10g');
    SYNCHRONIZE; -- per a Metalink bulletin regarding REP-51018 errors
    WEB.SHOW_DOCUMENT(rep_url, '_blank');
    END;
    ===============
    Applet output :
    ===============
    Loading http://cen0053s/forms90/java/frmrwinteg.jar from JAR cache
    proxyHost=null
    proxyPort=0
    connectMode=HTTP, native.
    Versie van Forms-applet is: 9.0.4.0
    4.0 (compatible; MSIE 5.5; Windows NT 5.0; LNV; .NET CLR 1.1.4322)
    FrmReportsInteg0: Debugging true
    FrmReportsInteg0: Adding new userid string "EBN_OWNER/[email protected]"
    FrmReportsInteg0: Default cookie domain:
    FrmReportsInteg0: set RW_AUTH10g
    FrmReportsInteg0: Arguments: encryptionKey=reports9i; Reports version=RW10g
    FrmReportsInteg0: Cookie value for RW10g is: EBN_OWNER/[email protected];1093349923071:30
    FrmReportsInteg0: Encoded cookie value is: ZF/zcPJEKWXsS9Rh4pfD3079dl+p4fnz20rz8aM2PdMW4ITpb+rYdtWOF2GUmqkXrw==
    FrmReportsInteg0: Complete cookie string is: userid=ZF/zcPJEKWXsS9Rh4pfD3079dl+p4fnz20rz8aM2PdMW4ITpb+rYdtWOF2GUmqkXrw==
    FrmReportsInteg0: Added domain " " to cookie
    FrmReportsInteg0: Generated Cookie String: userid=ZF/zcPJEKWXsS9Rh4pfD3079dl+p4fnz20rz8aM2PdMW4ITpb+rYdtWOF2GUmqkXrw==; domain= ; path=/
    FrmReportsInteg0: IE Cookie Set

  • Problems converting from CS4 - eps files show TIFF error

    Having downloaded the latest version of InDesign CS6, I have tried to open several documents of a music tutor featuring many EPS music graphics along with many TIFF images. However, I keep getting the following error: "Error encountered while reading TIFF image. Image may be damaged or incompatible. Resave the image with different settings and try again."
    I have resaved all the EPS and TIFF images as CS6 files and relinked them into the document, but to no avail - the problem remains. I suspect that the image files are not actually damaged but that there is some other problem. I have hundreds of images and am not sure what to do next. I noticed another user (Kaleidoscopes Violin) who had a similar problem, and it was suggested that "if your CS2 file is still intact, try exporting it to interchange format (.inx), then open the .inx file in CS5 (or CS5.5, which is the current release and not the same as CS5)" - but this was for CS2 to CS5, and CS6 does not allow .inx file export. I need to work with the document in the future, so exporting it as a PDF file is of no use to me. Is this a known bug in CS6? I would be grateful for any advice. Thank you.

    Wow ... very, very helpful responses. I wish I'd posted this earlier!
    I had no idea that EPS was an older file format, or that PDF was considered a "legitimate" graphics format. Sibelius 6 only exports to EPS, TIFF (yuck!), and PNG. However, I did discover that it "prints" to Adobe PDF. Odd that the option is located in an unexpected file menu, but I'm very glad it's there! The PDF format loads fine into InDesign and also prints beautifully, unlike the PNG option. Yay!
    As for embedding fonts, I had been making sure this option was checked (when exporting from Sibelius), so it seems unlikely that was the problem. So in theory that would mean it's not a problem with the InDesign font cache, or a font discrepancy. In practice ... who knows. What's particularly puzzling is that some EPS files are fine while others are not, and also that CS2 had no problems with any of them.
    Regardless, the font information is really, really good to know ... I'll save this thread for future use!
    Thanks so much to all for your generous help!!!
    Elise
