PreparedStatement, VARCHAR2(4000) and umlauts

Can someone explain this to me, please?
I have a VARCHAR2 column of size 4000 - just big enough that I don't have to use a LOB. I can insert a String of length 4000 with no problem. But if my String contains special characters, such as umlauts, I get an error saying the String is too big for the column. This surprises me, as my column uses char rather than byte length semantics. However, if I don't use prepared statements, the same String inserts fine.
Further, if I reduce the column size to 1, I can insert 'ä' using prepared statements.
Simple test code is below. Can someone explain to me what is going on? Thanks.
create table test_table (
    name varchar2(4000 char)
);

import java.sql.*;

public class UmlautTest {
    public static void main(String[] args) {
        final String url = "jdbc:oracle:thin:@host:port:DB";
        final String username = "me";
        final String password = "password";

        String veryBigString = "";
        for (int i = 0; i < 4000; i++) veryBigString = veryBigString + "a";
        System.out.println("vbs length: " + veryBigString.length());

        Connection connection = null;
        try {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            connection = DriverManager.getConnection(url, username, password);

            // plain Statement: the string is embedded in the SQL text
            Statement s = connection.createStatement();
            int inserted = s.executeUpdate("insert into test_table values ('" + veryBigString + "')");
            System.out.println("1: inserted " + inserted);
            inserted = s.executeUpdate("insert into test_table values ('" + ("ä" + veryBigString.substring(1)) + "')");
            System.out.println("2: inserted " + inserted);

            // PreparedStatement: the string is sent as a bind variable
            PreparedStatement ps = connection.prepareStatement("insert into test_table values (?)");
            ps.setString(1, veryBigString);
            inserted = ps.executeUpdate();
            System.out.println("3: inserted " + inserted);
            ps.setString(1, "ä" + veryBigString.substring(1));
            inserted = ps.executeUpdate();
            System.out.println("4: inserted " + inserted);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (connection != null) try {
                connection.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}

I am unable to reproduce your problem. I took your code - I had to change the connect string, but otherwise it is the same - and I was able to load all four rows with no problem. My character set is US7ASCII for both the database and the client.
I suspect you might be having a character-set-conversion issue: when the client character set does not match the database character set, a translation has to happen, and that translation is called 'lossy' if the client character set is not a complete subset of the database character set.
So I changed the NLS_LANG env var on the client to WEISO... (the one you mentioned in your mail), but it still worked. Then I changed the driver - I was using classes12.jar, so I switched to ojdbc14.jar - and it still worked.
I am using 10.2 - is that what you guys are using?
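One way to see both ends of a possible conversion is to print the database character set next to the client JVM's default. A minimal sketch (the connect string is a placeholder; nls_database_parameters is the standard dictionary view):

import java.nio.charset.Charset;
import java.sql.*;

public class CharsetCheck {
    public static void main(String[] args) throws Exception {
        Connection c = DriverManager.getConnection(
                "jdbc:oracle:thin:user/pass@host:1521:db");
        // the character set the database stores VARCHAR2 data in
        ResultSet rs = c.createStatement().executeQuery(
                "select value from nls_database_parameters"
                + " where parameter = 'NLS_CHARACTERSET'");
        while (rs.next()) {
            System.out.println("DB charset:  " + rs.getString(1));
        }
        // the default character set of the client JVM, for comparison
        System.out.println("JVM default: " + Charset.defaultCharset());
        c.close();
    }
}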
One thing you might want to try - do a
SELECT DUMP(NAME) FROM TEST_TABLE;
It will give you the byte-by-byte ASCII info that was loaded into the table; that might give you some insight. Maybe load just 1000 characters and see what it puts in the db for the a's as opposed to the umlaut-a's.
If you look at the data I loaded, it put in the umlaut-a as a single byte (but that makes sense for the US7ASCII character set). I trimmed the output:
SQL> select dump(name) from test_table;
DUMP(NAME)
Typ=1 Len=4000: 228,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,...
Typ=1 Len=4000: 97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,9...
Typ=1 Len=4000: 97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,9...
Typ=1 Len=4000: 228,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,...
See below: char 228 is the umlaut-a.
SQL> select chr(228) from dual;
C
ä
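If you want the same check from JDBC rather than SQL*Plus, a few lines like these can be dropped into the try block of the test program below (a sketch; it assumes the same test_table). Comparing LENGTH and LENGTHB is the quickest tell: a VARCHAR2 column is capped at 4000 bytes no matter what its CHAR length semantics say, so against a multi-byte database character set (AL32UTF8, say) a 4000-character string whose umlaut expands to two bytes no longer fits.

ResultSet dumpRs = connection.createStatement().executeQuery(
        "select dump(name), lengthb(name), length(name) from test_table");
while (dumpRs.next()) {
    // if lengthb > length, at least one character was stored multi-byte
    System.out.println(dumpRs.getString(1)
            + "  bytes=" + dumpRs.getInt(2)
            + "  chars=" + dumpRs.getInt(3));
}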
Here is the code. It is almost identical; I just put the user/pass in the connect string:
import java.sql.*;
import java.math.*;
import java.io.*;
import oracle.jdbc.*;
import java.util.*; // for the Properties
class Varchar {
    public static void main(String[] args) {
        final String url = "jdbc:oracle:thin:user/pass@host:1521:db";

        String veryBigString = "";
        for (int i = 0; i < 4000; i++) veryBigString = veryBigString + "a";
        System.out.println("vbs length: " + veryBigString.length());

        Connection connection = null;
        try {
            Class.forName("oracle.jdbc.driver.OracleDriver");
            connection = DriverManager.getConnection(url);
            Statement s = connection.createStatement();
            int inserted = s.executeUpdate("insert into test_table values ('" + veryBigString + "')");
            System.out.println("1: inserted " + inserted);
            inserted = s.executeUpdate("insert into test_table values ('" + ("ä" + veryBigString.substring(1)) + "')");
            System.out.println("2: inserted " + inserted);
            PreparedStatement ps = connection.prepareStatement("insert into test_table values (?)");
            ps.setString(1, veryBigString);
            inserted = ps.executeUpdate();
            System.out.println("3: inserted " + inserted);
            ps.setString(1, "ä" + veryBigString.substring(1));
            inserted = ps.executeUpdate();
            System.out.println("4: inserted " + inserted);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (connection != null) try {
                connection.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}

Similar Messages

  • Retrieving data from a "varchar(4000)" field w/ JDBC

    I'm using the Oracle thin JDBC drivers w/ the 1.1.8 JDK. In the past when needing to access "LONG" fields with over 2000 bytes of data I used the "rs.getAsciiStream(xx)" method and read the data from the STREAM. A later version of the thin drivers allowed me to use the straight-forward method of "rs.getString(xx)" and it would return all of the data.
    I'm now trying to access an ORACLE database (NT, 8.0.5) with a field defined as "varchar(4000)". Using either of the retrieval methods [getString(), getAsciiStream()], I only get 2000 bytes of data, which is not acceptable. I ran into the same problem when I used the 1.2.2 JDK.
    I finally downloaded the thin drivers for ORACLE 8.1.6, and they seem to work against the 1.2.2 JDK, and the ORACLE 8.1.5 drivers work with the 1.1.7 JDK.
    My question is, can I "safely" use the 8.1.6 drivers against an 8.0.5 database? If not, is there another way of pulling the data out? [I tried using the "defineColumnType" to "LONGVARCHAR" and that didn't work either].
    Any help would be appreciated.
    Wayne Johnson
    [email protected]
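    For reference, the stream-based workaround described above looks roughly like this (a sketch; the connect string, table and column names are placeholders, and with current drivers a plain getString() is enough):

    import java.io.*;
    import java.sql.*;

    public class ReadWideVarchar {
        public static void main(String[] args) throws Exception {
            Connection c = DriverManager.getConnection(
                    "jdbc:oracle:thin:user/pass@host:1521:db");
            ResultSet rs = c.createStatement().executeQuery(
                    "select name from test_table");
            while (rs.next()) {
                // old-driver fallback: read the column as a stream instead of
                // getString(), which used to truncate at 2000 bytes
                InputStream in = rs.getAsciiStream(1);
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                int b;
                while ((b = in.read()) != -1) {
                    buf.write(b);
                }
                System.out.println("read " + buf.size() + " bytes");
            }
            c.close();
        }
    }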

    Hi,
    I entered the SQL statement in transaction MDXTEST, but it has been running for 7 minutes now.
    >> Transaction MDXTEST has an option to generate a test sequence and is not there to test SQL statements. You need to enter a valid MDX statement.
    Function module "/Crystal/MDX_GET_STREAM_INFO" not found
    >> then I would suggest you make sure that all the Transports from the SAP Integration Kit have been properly imported - see the Installation Guide for the SAP Integration Kit for details.
    Ingo

  • How to store bit data in VARCHAR(4000) field?

    Hi.
    Please help!
    We are porting some C/C++ software with embedded SQLs from NT/DB2 to Linux/Oracle.
    On NT/DB2 we have some table to store file data in a VARCHAR(4000) blocks.
    Table looks like this
    CREATE TABLE FileData (filetime as timestamp not null, idx int not null, datablock varchar(4000) FOR BIT DATA not null, primary key (filetime, idx) );
    As you can see, DB2 has an appropriate field modifier - "FOR BIT DATA" - which makes DB2 store the data as-is, without converting characters.
    I need to know: is it possible to do the same in Oracle as in DB2?
    Does Oracle have some kind of field modifier like DB2's "FOR BIT DATA"?
    If not, how can I achieve the same in Oracle?
    The current problems are:
    1) when the application imports a file with some national chars, Oracle stores "?" in the database in place of the national chars.
    2) another piece of the problem - if the file is more than 4000 bytes long, it reports the ORA-01461 error (it seems to be expanding some chars to UTF8 sequences which do not fit in a single char, so the data no longer fits the field size).
    So, it seems that it cannot process national chars at all. :-\
    For details please see the enclosed [C code|http://dmitry-bond.spaces.live.com/blog/cns!D4095215C101CECE!1606.entry]; there is an example of how data is written to the table.
    In other places we also need to read data from the FileData table and store it back to a file (other filename, other location).
    Here is a summary of the field-datatype variants I have tried for the "datablock" field:
    1) VARCHAR2, RAW, LONG RAW, BLOB - do not work! All report the same error - ORA-01461.
    2) CLOB, LONG - both work fine, but(!) both still return "?" instead of national chars when reading the data back.
    Hint: to try these field types I just dropped the "FileData" table, created it with a different type for the "datablock" field, and ran the same application to test it.
    I think I need to explain what the problem is: we do not provide direct access to the Oracle database; we use some middleware. The middleware is C/C++ software which also has an interface for dynamic SQL execution. So it is exactly this middleware (which typically runs on the same server as Oracle) that receives the "?" instead of the national chars! But we need it to return all data AS-IS(!) without any changes!!! That is why I asked whether Oracle has any options to store byte data as-is.
    The BIG QUESTION - HOW CAN WE DO THIS?!
    Another thing I need to explain - it is ok to use Oracle-specific SQL ONLY IF THERE IS REALLY NO WAY TO ACHIEVE THIS WITH STANDARD SQL! Please.
    So, please look at the C code (at the link I posted above) and tell me - is it possible to make the VARCHAR approach we are currently using work in Oracle?
    If not, please describe what options we have for Oracle.
    Regards,
    Dmitry.
    PS. it is Oracle 11gR2 on CentOS 5.4, all stuff installed with default settings, so Oracle db encoding is "AL32UTF8".
    C/C++ application is built as ANSI/ASCII application (non-unicode), so sizeof(char)=1.
    The target Oracle db (I mean the one which will be used on the customer site) is Oracle 10g, so the solution needs to work on Oracle 10g.

    P. Forstmann wrote:
    There is some contradiction in your requirements:
    - if you want to store data as is without any translation, use RAW or BLOB
    - if you want to store national character data, try to use NVARCHAR2 or NCLOB.
    It seems you did not understand the problem. OK, I'll try to explain. Please look at the code sample I provided in the original question (I just added the expanded data structures there; sorry, I forgot to publish them when posting the original question):
    EXEC SQL BEGIN DECLARE SECTION;
      struct {
        char timestamp[27];
        char station[17];
        char filename[33];
        char task[17];
        char orderno[17];
        long filelen;
      } gFilehead;
      struct {
        char timestamp[27];
        long idx;
        struct {
          short len;
          char arr[4001];
        } datablock;
      } gFiledata;
    EXEC SQL END DECLARE SECTION;
    #define DATABLOCKSIZE 4000
    #ifdef __ORACLE
      #define VARCHAR_VAL(vch) vch.arr
    #elif __DB2
      /* DB2 variant elided in the original post */
    #endif
    short dbWriteFile( char *databytes, long datalen )
    {
      short nRc;
      long movecount;
      long offset = 0;
      gFilehead.filelen = gFilehead.filelen + datalen;
      while ((datalen + gFiledata.datablock.len) >= DATABLOCKSIZE)
      {
        movecount = DATABLOCKSIZE - gFiledata.datablock.len;
        memcpy(&VARCHAR_VAL(gFiledata.datablock)[gFiledata.datablock.len], databytes, movecount);
        gFiledata.datablock.len = (short)(gFiledata.datablock.len + movecount);
        exec sql insert into filedata (recvtime, idx, datablock)
          values(
            :gFiledata.timestamp type as timestamp,
            :gFiledata.idx,
            :gFiledata.datablock /* <--- ORA-01461 appears here */
          );
        nRc = sqlcode;
        switch (nRc)
        {
          case SQLERR_OK: break;
          default:
            LogError(ERR_INSERT, "filedata", IntToStr(nRc), LOG_END);
            exit(EXIT_FAILURE);
        }
        offset = offset + movecount;
        datalen = datalen - movecount;
        gFiledata.idx = gFiledata.idx + 1;
        memset(&gFiledata.datablock, 0, sizeof(gFiledata.datablock));
        databytes = databytes + movecount;
        gFiledata.datablock.len = 0;
      }
      if (datalen + gFiledata.datablock.len)
      {
        memcpy(&VARCHAR_VAL(gFiledata.datablock)[gFiledata.datablock.len], databytes, datalen);
        gFiledata.datablock.len = (short)(gFiledata.datablock.len + datalen);
      }
      return 0;
    }
    So, the thing we need is to put some data into the "datablock" field of the following structure:
      struct {
        char timestamp[27];
        long idx;
        struct {
          short len;
          char arr[4001];
        } datablock;
      } gFiledata;
    Then insert it into a database table using static SQL like this:
        exec sql insert into filedata (recvtime, idx, datablock)
          values(
            :gFiledata.timestamp type as timestamp,
            :gFiledata.idx,
            :gFiledata.datablock /* <--- ORA-01461 appears here */
          );
    And then expect to read exactly the same data back!
    The problems are:
    1) Oracle decides to convert the data we are inserting (why? and how do we disable the conversion?!)
    2) even if it inserts the data (the CLOB and LONG field datatypes work fine when inserting data with such static SQL + such a host variable), it then becomes unreadable! (why?! how do we make it readable?!)
    P. Forstmann wrote:
    ORA-01461 could mean that you have a wrong data type for the bind variable in your client code.
    It was not me who decided that the host variable is a "LONG datatype value" - Oracle made that decision instead of me. And that is the problem!
    It looks like Oracle reacts to any char code >= 0x80.
    So, assume if I run the code:
    // if Tab1 was created as "CREATE TABLE Tab1 (value VARCHAR(5))" then
    char szData[10] = "\x41\x81\x82\x83\x84";
    EXEC SQL INSERT INTO Tab1 (value) VALUES (:szData);
    Oracle will report the ORA-01461 error!
    EXACTLY THIS IS THE PROBLEM I HAVE DESCRIBED IN MY ORIGINAL QUESTION.
    So, the problem is: why does Oracle make this decision instead of me?! How can we make Oracle insert data into a table AS-IS?
    What other type of host variable should we use to make Oracle treat the data as binary?
    void*? unsigned char? Could you please provide any examples?
    OK, you did recommend "use the RAW datatype". But the RAW datatype is limited to 2000 bytes only - we need 4000! So it does not match our needs at all.
    You also mentioned "use BLOB" - but testing shows that Oracle reports the same ORA-01461 error when inserting data into a BLOB field from such a host variable! (see the code I posted)
    What else can we do?
    Change the type of the host variables? BUT HOW?!
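    For what it's worth, from JDBC the byte-preserving round trip asked for here is just a binary bind; the driver applies no character-set translation to binary data. A sketch, assuming a hypothetical table created as CREATE TABLE filedata_blob (idx NUMBER, datablock BLOB):

    import java.io.ByteArrayInputStream;
    import java.sql.*;
    import java.util.Arrays;

    public class BlobAsIs {
        public static void main(String[] args) throws Exception {
            Connection c = DriverManager.getConnection(
                    "jdbc:oracle:thin:user/pass@host:1521:db");
            byte[] block = new byte[4000];
            for (int i = 0; i < block.length; i++) {
                block[i] = (byte) (i % 256); // deliberately includes bytes >= 0x80
            }
            PreparedStatement ps = c.prepareStatement(
                    "insert into filedata_blob (idx, datablock) values (?, ?)");
            ps.setInt(1, 1);
            // binary bind: no character-set conversion is applied
            ps.setBinaryStream(2, new ByteArrayInputStream(block), block.length);
            ps.executeUpdate();
            // read the bytes back and verify they are unchanged
            ResultSet rs = c.createStatement().executeQuery(
                    "select datablock from filedata_blob where idx = 1");
            rs.next();
            byte[] back = rs.getBytes(1);
            System.out.println("round-trip intact: " + Arrays.equals(block, back));
            c.close();
        }
    }

    This also sidesteps the 2000-byte RAW limit mentioned above, since a BLOB holds far more than 4000 bytes; the Pro*C equivalent would be binding the block as a binary host variable rather than a character one.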

  • How to handle varchar(4000) field in reporting?

    Hello,
    I have an Oracle database whose tables are populated from a web application. One of the fields of a table has the varchar(4000) datatype.
    The data in this field is text which may contain carriage returns.
    I have to create a report of this table. How do I display this data properly in a report?
    I tried a few things:
    I accessed the table using SQL Developer and pasted the query output into Excel.
    As the text contains carriage returns, each value gets spread across one or more rows.
    How do I handle this? What is the proper way to create a report for such data?
    Please help.
    -Sameer

    Use the following to remove the carriage returns:
    SQL> select 'line 1'||chr(10)||'line 2' from dual ;
    'LINE1'||CHR(
    line 1
    line 2
    SQL> select replace('line 1'||chr(10)||'line 2',chr(10),'') from dual ;
    REPLACE('LIN
    line 1line 2
    If this is not the soln , can you paste some sample data
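    If the cleanup is easier on the client side, the same replace can be done in Java before the value is written to the report. A two-line sketch (rs is a ResultSet positioned on a row; "remarks" is a hypothetical column name; the regex also catches Windows-style CR+LF, which chr(10) alone would miss):

    // hypothetical column; collapse line breaks so each DB row stays on one report row
    String cell = rs.getString("remarks");
    String oneLine = (cell == null) ? "" : cell.replaceAll("\\r?\\n", " ");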

  • Dblink poor performance with varchar(4000) after upgrade 11g

    Hi
    For a long time we have connected from a 10g database via dblink to another 10g database to copy a simple table with four columns. Table size is about 500 MB:
    column1 (varchar(100))
    column2/3/4 (varchar(4000)).
    After the upgrade of the source database to 11g, the dblink performance is poor with the big varchar columns. If I copy only column1 (select column1 from ...), I get the data within minutes. If I want to copy the whole table (select column1, column2, column3, column4 from ...), the performance is poor - it didn't finish within days.
    Does anyone know about dblink issues with 11g and big varchar columns?
    Thank you very much

    Use DBlink to pull data from table(s) using IMPDP:
    #1 Create DBlink in the Target database
    create database link sourceDB1_DBLINK connect to system identified by password10 using 'SourceDB1';
    [update tnsnames.ora appropriately]
    #2
    select DIRECTORY_PATH from dba_directories where DIRECTORY_NAME='DATA_PUMP_DIR';
    (create the DATA_PUMP_DIR directory if it does not exist)
    #3
    impdp system/pwdxxx SCHEMAS=SCOTT NETWORK_LINK=ORCL10R2 JOB_NAME=scott_import LOGFILE=data_pump_dir:network_imp_scott.log
    That's all

  • Nice to see 13" retina but it has only Intel HD Graphics 4000 and does not have NVIDIA GeForce GT 650M with 1GB of GDDR5 memory card. How it will affect the speed, performance and other things compared to 15" retina where NVIDIA GeForce card is available.

    Nice to see 13" retina but it has only Intel HD Graphics 4000 and does not have NVIDIA GeForce GT 650M with 1GB of GDDR5 memory card. How it will affect the speed, performance and other things compared to 15" retina where NVIDIA GeForce card is available.

    The 15" Retina's will have better performance than any 13" Retina. Not only do the 15" machines have dedicated GPU's, but they also have quad-core processors, whereas the 13" Retina's only have dual-core processors.

  • I have a 12 core with Quadro 4000 and 5770 and want a dual monitor setup; monitors are NEC with Spectraview-II. How do I connect? The 4000 only has 1 DisplayPort and 1 DVI. The 5770 has 2 of each; if I use both 5770 DisplayPorts, does the 4000 contribute?

    I just bought a 12 core with Quadro 4000 and 5770, and I want to use a dual monitor setup; the monitors are NEC with Spectraview-II. How do I connect? The 4000 only has 1 DisplayPort and 1 DVI. The 5770 has 2 of each; if I use both 5770 DisplayPorts, does the 4000 contribute any work at all? I read that on a PC they would work together, but on a Mac they do not.
    I read that DisplayPort has higher bandwidth than DVI; for best performance with these NEC monitors they recommend using DisplayPort.
    When I was setting this up I looked at an Nvidia Quadro 4000; unfortunately it was for PC, and it had 2 DisplayPorts, while in the Mac version they reduce it to one. I did not think there could be a difference.
    I mainly want to use it for CS6 and LR4.
    How to proceed???
    I do not want to use the Quadro 4000 for both; that would not optimize both monitors (one DP and 1 DVI). Using just the 5770 would work, but then I do not think the 4000 would be doing anything, and the 5770 has been replaced by the 5870, which has more bandwidth.
    Any ideas? I am a Mac newbie; I have never tried a Mac Pro before, just bought this off eBay, and now I have these problems.
    As a last resort I could sell both and get a 5870. That would work, I'm sure of that; it's just that I wanted the better graphics card.
    Thanks,
    Bill

    The Hatter,
    I am a novice at Mac so I read all I can. From what I understand, the NEC monitors I bought require DisplayPort for their maximum performance; the GTX 680 only has DVI outputs. The difference, as I understand it, is the larger bandwidth of DisplayPort.
    You said I have the 4000 for CUDA. I am not all that familiar with CUDA, and when I do read about it I do not understand it.
    A concern I have is that if I connect the 2 high-end NEC monitors via the 5770, using its 2 DisplayPorts, I would have nothing connected to the 4000. Is the 4000 doing anything with nothing connected? I read that in a PC system the 2 cards would interact, but in a Mac system they do not.
    Bottom line, as I see it: the 4000 will not be useful at all to me, since I want a dual monitor setup.
    So far the 5870 seems the best choice: higher bandwidth than the 5770, and it has 2 DisplayPorts to optimize the NEC monitors.
    I'm not sure how fine I am splitting hairs, nor do I know how important those hairs are.  I am just trying to set up a really fast reliable system that will mainly be used for CS6 and LR4.  Those NEC monitors are supposed to be top notch.

  • HP Laserjet 4000 and 6L in Windows 7 - cannot locate drivers

    I recently purchased an Asus laptop, Windows 7 operating system, 64-bit. I am trying to install HP LaserJet 4000 and HP LaserJet 6L printers to this computer with a parallel-to-USB cable.
    The computer allows me to install the 6L, and the printer appears in “Devices and Printers.” However, I cannot print pages.  I checked with the Windows Compatibility Center website, and this printer is supposed to run in Windows 7 without additional drivers.  Restarting has not helped.
    The HP LaserJet 4000 will not install. I went to HP’s website and found the page for LaserJet 4000 drivers.  There are two boxes on this page: “Drivers” and “Universal Print Drivers.”  The driver listed under “Drivers” does not have a download button, only an “Obtain Software” link giving me instructions on navigating to the very page I am on.  I am not sure what Universal Print Drivers are or if they are what I should be installing.
    Any help is greatly appreciated. Thanks!

    There are LaserJet 4000 drivers available through Windows Update. Go to Start, Devices and Printers, Add a Printer, select the proper port, then click on Windows Update. After a few minutes the list will repopulate; then select HP (not Hewlett-Packard) and choose the appropriate driver. Note you will need an appropriate driver for your USB-parallel cable.
    Bob Headrick,  HP Expert
    I am not an employee of HP, I am a volunteer posting here on my own time.

  • Looking for how to run both a Quadro 4000 and a geforce card win7 please

    hello,
    way back when I built the Windows 7 editing rig I tried to get a Quadro 4000 and a GeForce 9800 GT (older version) to work together to get 4 displays going, but I had stability issues, so I got another Quadro and never had a problem.
    however, reading this recent thread
    http://forums.adobe.com/thread/1258243?tstart=0
    it sounds like users are able to mix a Quadro 4000 with GeForce cards, using the GeForce for the CUDA processing and the Quadro for the output.
    I still want to run 4 monitors (2 from the Quadro 4000 and then 2 from the GeForce card - open to suggestions here).
    if I can mix cards now, I'd like to get a GeForce GPU to help out with 4K editing of a feature (probably RED Scarlet) and use the Quadro for 10-bit output.
    as always, positive feedback is appreciated. cheers, j

    hello,
    no, I am still sticking with CS6 for now, and I found where I had read the instructions before (an answer to one of my own previously asked questions):
    Install Quadro Driver full installation via Wizard.
    Decompress Geforce Driver latest version but do not run Wizard install.
    Manually update Geforce Card driver via Device manager by pointing to Nvidia/Geforce folder
    Launch Nvidia Control Panel and set the GPU Acceleration to Geforce card only.
    Eric
    ADK
    I guess this wraps this one up.
    thanks for taking the time to read, and sorry for wasting your time.
    cheers, j

  • Epson Stylus Pro 4000 and SL

    Is anyone successfully printing with an SP 4000 and the 10.5 driver in Snow Leopard? I can work with either Photoshop CS4 or Lightroom 2. Epson suggests using Rosetta as a work around. Does that work with CS4?
    Thanks,
    Peter

    I am able to print with ICC profiles properly colour calibrated on my Epson 4000 providing I follow a very specific sequence.
    My system is 10.6.1 SL, Lightroom 2.5 (starting in 32 bit mode), Epson driver 3.09, USB connected printer
    I must do the following, in this sequence (if I do not, the prints come out way too dark):
    1. Use lightroom as normal, press print or print one with profile settings as normal
    2. In the print driver dialogue go to "Color Management" and choose "OFF"
    3. in the print driver dialogue go to "Print Settings" and choose Mode "Automatic"
    4. In the print driver dialogue press "Print"
    If I do step 3 before step 2, the prints come out way too dark; it appears to only work in this order for me.
    I am very pleased with SL, the performance improvement is impressive.
    I have resolved the issue with my Epson. It is not a big brick, but it is clunky to print this way, and it took a lot of trial and error and paper/ink to figure out something that works for me.
    Now, if only I could get my eye-1 software to recognise the monitor profile is installed already and I don't need to reprofile every time I start up, I'd be a very happy man.

  • Is The Intel HD Graphics 4000 And The Intel Core i3-3110M Processor Good For Premier?

    Dear Forum Members,
    Lately I've been looking for a laptop that can fit my video editing needs plus some office work. What I need from Premiere is the ability to edit SD-quality videos; my budget is not that high (plus I live in Israel, where everything is more expensive).
    I've been thinking about the Lenovo IdeaPad G500 5938-9998, which comes with an Intel® Core™ i3-3110M processor, Intel® HD Graphics 4000, 4GB RAM and a 500GB SATA (5400RPM) hard drive.
    My question is: will it be OK to edit SD videos in Premiere 6 or 5? And if not, which Premiere IS good for this specific computer?
    Thanks ahead, Jonathan.

    Jonathan, If you mean the old Premiere 6 or Premiere 5, then those versions of Premiere are extremely old and severely outdated. In fact, Premiere 6 was released before Windows XP even came out. As such, you may have serious trouble running Premiere 6 (not to be confused with Premiere Pro CS6) on newer versions of Windows such as Windows 7 or Windows 8.x.
    On the other hand, if you mean Premiere Pro CS6 or CS5.x, then that planned laptop is too weak to handle much of anything (video editing related), especially since that i3-3110M CPU is only a dual-core CPU that's capped at 2.4GHz and that CS6/5 does not support GPU acceleration for any non-Nvidia or any integrated GPUs. As such, I second the responses suggesting the cheapo consumer version of Premiere, Premiere Elements (the current release is 12.0).

  • Pro 4000 and XP 64 Pro

    Hi,
    I did a search on Pro 4000 x64 on this board and was disappointed to see that there is no mention of that combination working, as I have always been very happy with the Pro 4000 and had intended to use it with the new machine I have running XP 64 Pro.
    Can you please advise on the following;
    1) Will the Pro 4000 Webcam be supported on Windows X64 Pro? There is no mention of it in SID7824.
    2) In the absence of support, which current Webcam is comparable / better?
    ( 3) And I know this is off topic, but if anyone has any ideas on getting the remote control to work for the Audigy 4 with the x64 drivers, that would be cool too. There is a driver, but the majority of the associated software does not work with x64. )
    Cheers,
    .\/.artin

    Yeah, but even if he had a Creative product he would still get no help. Which makes me think of a question: you respond to a non-Creative user who doesn't need a response, but the people who have a Creative product and a question get no response. Why is that?
    My unanswered question as of 5 days ago ....
    I use Windows x64 and have the Creative Live Pro. I have installed the 64-bit driver multiple times with no luck getting the product to work. However, when I plug the camera into my USB port my computer recognizes it as the Creative Live Pro. When I try to locate the driver it tells me to make sure the driver is compatible with x64. I have the correct driver installed, but for some reason it can't recognize it. In my device manager it is listed as "other device" with a question mark next to it. I'm assuming that when the driver is installed correctly it would be listed under USB devices. Why isn't the x64 driver working correctly? I have other USB devices that work properly.

  • T530, HD 4000 and Quad Core: Not Possible?

    Hello,
    I'm in the market for a new ThinkPad, and it seems that on the T530, if you go for a quad core (QM CPUs), you're forced to go with Nvidia.
    Is this set in stone? I ask because I couldn't care less about the Nvidia GPU on Linux...
    If it is, can the GPU be completely disabled in the BIOS?

    JDay wrote:
    You get the Nvidia GPU and the HD 4000 (HD 4000 is built into the CPU). It is actually a nice feature to have, especially if you want to use more than 2 monitors at a time. If you only want to use the HD 4000 just select Integrated Graphics in the UEFI firmware (BIOS). I don't think you can upgrade to the quad-core yourself since I am pretty sure you need a different heatsink/fan assembly to prevent overheating. In fact the motherboard may be different as well, which would explain why it is only available with the Nvidia GPU.
    The HD 4000 should be able to handle as many monitors as I would need. That said, you have a good point about the heatsink/fan assembly: looking at the HW Maintenance Manual, it's definitely not the same piece (page 92 of the T530 Maintenance Manual).
    Bottom line is I guess I'll fork out the extra money, even if the GPU isn't really of any use.
    Thanks folks.

  • Abap function module http_post and umlaut characters

    I am using ABAP function module HTTP_POST to frame XML and send the data to our middleware system 'CASTIRON'.
    Everything works fine, but when there are any umlaut characters in the XML, the function HTTP_POST is unable to transmit the data and creates SM21 update failures. We are on an ECC 6.0 Unicode system.
    Any help is very much appreciated.

    Are you using a CDATA tag to encapsulate your data? Try doing that.
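    For illustration, the CDATA idea in Java terms (the ABAP version just concatenates the same markers; note that the payload itself must not contain the sequence "]]>"):

    // wrap the payload so the XML parser treats it as literal character data:
    // markup-significant characters (&, <, >) and umlauts need no escaping inside
    String payload = "Grüße von Müller & Söhne";
    String xml = "<note><![CDATA[" + payload + "]]></note>";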

  • Error 4000 and other anomalies

    Like many others I also have the dreaded 4000 error and can't burn discs. I also have another problem that is giving me great pain: on playback of any recorded or commercial CDs there is skipping or stuttering. This issue only presents itself in iTunes. I have used Windows Media Player, Blaze Audio Pro, and Creative Media Source as players, and they do not skip or stutter. I have updated all software/firmware and have gone as far as doing a non-destructive system restore, with no success. The only time these drives did not exhibit this issue was when I was using iTunes 4-something-or-other. The issue presents itself on both drives.
    I also have a Sound Blaster X-Fi Extreme Music sound card installed (though iTunes CD diagnostics doesn't show it). I have taken the steps of disabling it and its associated drivers/software and reinstalling the on-board audio to see whether it is the card or not.
    I have spent the last week with HP support only to narrow down the issue.
    The issue is with iTunes.
    Microsoft Windows XP Professional Service Pack 2 (Build 2600)
    HP Pavilion 061 PX724AA-ABA M7170N
    iTunes 6.0.2.23
    CD Driver 2.0.4.3
    CD Driver DLL 2.0.3.2
    LowerFilters: Pfc (2.5.0.201), PxHelp20 (2.0.0.0), drvmcdb (1.0.0.1),
    UpperFilters: GEARAspiWDM (2.0.4.3), pwd_2k (8.0.5.39),
    Video Driver: NVIDIA GeForce 6600 GT\GeForce 6600 GT
    IDE\DiskMaxtor6B250S0_________________________BANC1B10, Bus Type ATA, Bus Address [0,0]
    USBSTOR\DiskGenericUSB_CF_Reader__1.01, Bus Type USB
    USBSTOR\DiskGenericUSB_MS_Reader__1.03, Bus Type USB
    USBSTOR\DiskGenericUSB_SD_Reader__1.00, Bus Type USB
    USBSTOR\DiskGenericUSB_SM_Reader__1.02, Bus Type USB
    USBSTOR\DiskST3160023A____________8.01, Bus Type USB
    IDE\CdRomASUSDVD-E616P3H________________________1.04___, Bus Type ATA, Bus Address [1,0]
    IDE\CdRomHL-DT-STDVDRRW_GWA-4166B_______________1I24___, Bus Type ATA, Bus Address [0,0]
    If you have multiple drives on the same IDE or SCSI bus, these drives may interfere with each
    other.
    Some computers need an update to the ATA or IDE bus driver, or Intel chipset. If iTunes has
    problems recognizing CDs or hanging or crashing while importing or burning CDs, check the
    support site for the manufacturer of your computer or motherboard.
    Current user is administrator.
    E: HL-DT-ST DVDRRW GWA-4166B, Rev 1I24
    Audio CD in drive.
    Found 12 songs on CD, playing time 39:03 on Audio CD.
    Track 1, start time 00:02:00
    Track 2, start time 01:52:37
    Track 3, start time 05:18:25
    Track 4, start time 08:29:55
    Track 5, start time 09:34:03
    Track 6, start time 13:06:39
    Track 7, start time 18:52:30
    Track 8, start time 22:02:43
    Track 9, start time 26:05:54
    Track 10, start time 27:24:63
    Track 11, start time 31:25:55
    Track 12, start time 34:31:13
    Audio CD reading succeeded.
    Get drive speed succeeded.
    The drive CDR speeds are: 16 24 40.
    The drive CDRW speeds are: 16.
    The drive DVDR speeds are: 16.
    The drive DVDRW speeds are: 16.
    The last failed audio CD burn had error code 4000(0x00000fa0). It happened on drive E: HL-DT-ST
    DVDRRW GWA-4166B on CDR media at speed 24X.
    F: ASUS DVD-E616P3H, Rev 1.04
    Audio CD in drive.
    Found 12 songs on CD, playing time 39:03 on Audio CD.
    Track 1, start time 00:02:00
    Track 2, start time 01:52:37
    Track 3, start time 05:18:25
    Track 4, start time 08:29:55
    Track 5, start time 09:34:03
    Track 6, start time 13:06:39
    Track 7, start time 18:52:30
    Track 8, start time 22:02:43
    Track 9, start time 26:05:54
    Track 10, start time 27:24:63
    Track 11, start time 31:25:55
    Track 12, start time 34:31:13
    Audio CD reading succeeded.
    Get drive speed succeeded.
    Please help.
    HP m7170n   Windows XP Pro  

    Well, success may be at hand. I believe I found a way to repair the skipping/stuttering in iTunes. If you suffer from this rare anomaly, go here: http://www.mp3.com/stories/1826.html&refid=38&ref_typeid=8
    Now if only I could fix this rather frustrating error 4000 issue...
    HP m7170n Windows XP Pro
