Data load uses wrong character set, where to correct? APEX bug/omission?

Hi,
I created a set of Data Load pages in my application, so the users can upload a CSV file.
But unlike the Load spreadsheet data (under SQL Workshop\Utilities\Data Workshop), where you can set the 'File Character Set', I didn't see where to set the Character set for Data Load pages in my application.
Now there is a character set mismatch: "m³/h" and "°C" become "m�/h" and "�C".
Where to set?
Seems like an APEX bug, or at least an omission; IMHO the Data Load page should ask for the character set, as clients with different character sets could be uploading CSV files.
Apex 4.1 (testing on the apex.oracle.com website)

Hello JP,
Please give us some more details about your database version and its character set, and the character set of the CSV file.
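For reference, the database and national character sets can be checked with a simple data dictionary query (this should also work from SQL Workshop on apex.oracle.com):
select parameter, value
  from nls_database_parameters
 where parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');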
>> …But unlike the Load spreadsheet data (under SQL Workshop\Utilities\Data Workshop), where you can set the 'File Character Set', I didn't see where to set the Character set for Data Load pages in my application.
It seems that you are right. I was not able to find any reference to the (expected/default) character set of the uploaded file in the current APEX documentation.
>> If it's an APEX omission, where could I report that?
Usually, an entry on this forum is enough as some of the development team members are frequent participants. Just to be sure, I’ll draw the attention of one of them to the thread.
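In the meantime, one possible workaround is to convert the uploaded file yourself, from the character set the client actually used, before parsing it. The following is only a minimal sketch: APEX_APPLICATION_FILES is the usual place an uploaded file lands, but the page item name (:P10_FILE_NAME) and the client character set (WE8MSWIN1252) are assumptions you would have to adapt.
declare
  l_blob  blob;
  l_csv   clob;
  l_dest  integer := 1;
  l_src   integer := 1;
  l_ctx   integer := dbms_lob.default_lang_ctx;
  l_warn  integer;
begin
  -- fetch the uploaded file (view and item names are assumptions, adapt as needed)
  select blob_content into l_blob
    from apex_application_files
   where name = :P10_FILE_NAME;

  dbms_lob.createtemporary(l_csv, true);

  -- decode the bytes using the character set the client really used
  dbms_lob.converttoclob(
      dest_lob     => l_csv,
      src_blob     => l_blob,
      amount       => dbms_lob.lobmaxsize,
      dest_offset  => l_dest,
      src_offset   => l_src,
      blob_csid    => nls_charset_id('WE8MSWIN1252'),  -- assumed client character set
      lang_context => l_ctx,
      warning      => l_warn);

  -- l_csv now holds "m³/h" and "°C" correctly and can be parsed further
end;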
Regards,
Arie.
♦ Please remember to mark appropriate posts as correct/helpful. For the long run, it will benefit us all.
♦ Author of Oracle Application Express 3.2 – The Essentials and More

Similar Messages

  • Message uses a character set that is not supported by the internet service

    Does anyone have any advice on how to fix this problem?
    E-mails sent from my iphone 3G periodically arrive in an unreadable form at the recipient. The body of the e-mail has been replaced with the message "This message uses a character set that is not supported by the internet service...." The problem e-mails also include an attachment that contains an unformatted text file containing the original message surrounded by what appears to be lots of formatting data that is displayed as gibberish.
    This occurs sometimes, but not always, even with the same recipients. I am sending e-mail through a Gmail account that is configured on the iPhone using IMAP. I have tried setting the Gmail account to use both of the available formatting options for mail, but neither fixes the problem.
    I have also upgraded to 2.01 and restored a few times without impact.

    Hi,
    I have a somewhat similar problem with special characters (German umlauts ä, ö, ü, ...).
    I create a file with Java that contains special characters. If I open this file I am able to view the special characters in it. But if I attach this file and send it using the following code, the receiver cannot see the umlaut characters; they get replaced by _ or ?.
    // build the attachment part (fileName and msg are defined elsewhere in the program)
    MimeBodyPart mbp2 = new MimeBodyPart();
    FileDataSource fds = new FileDataSource(fileName);
    mbp2.setDataHandler(new DataHandler(fds));
    mbp2.setFileName(fds.getName());
    // wrap the part in a multipart and send the message
    Multipart mp = new MimeMultipart();
    mp.addBodyPart(mbp2);
    msg.setContent(mp);
    Transport.send(msg);
    From your message it looks like you are able to send the mail attachment correctly (preserving the special characters).
    Can you tell me what might be wrong in my code?
    I appreciate your efforts in advance.
    Prasad

  • ORA-12709: error while loading create database character set after upgrade

    Dear All
    I am getting "ORA-12709: error while loading create database character set" after upgrading the database from 10.2.0.3 to 11.2.0.3 in an E-Business Suite environment.
    The current application version is 12.0.6.
    Please help me to resolve it.
    SQL> startup;
    ORACLE instance started.
    Total System Global Area 1.2831E+10 bytes
    Fixed Size 2171296 bytes
    Variable Size 2650807904 bytes
    Database Buffers 1.0133E+10 bytes
    Redo Buffers 44785664 bytes
    ORA-12709: error while loading create database character set
    -bash-3.00$ echo $ORA_NLS10
    /u01/oracle/PROD/db/teche_st/11.2.0/nls/data/9idata
    export ORACLE_BASE=/u01/oracle
    export ORACLE_HOME=/u01/oracle/PROD/db/tech_st/11.2.0
    export PATH=$ORACLE_HOME/bin:$ORACLE_HOME/perl/bin:$PATH
    export PERL5LIB=$ORACLE_HOME/perl/lib/5.10.0:$ORACLE_HOME/perl/site_perl/5.10.0
    export ORA_NLS10=/u01/oracle/PROD/db/teche_st/11.2.0/nls/data/9idata
    export ORACLE_SID=PROD
    -bash-3.00$ pwd
    /u01/oracle/PROD/db/tech_st/11.2.0/nls/data/9idata
    -bash-3.00$ ls -lh |more
    total 56912
    -rw-r--r-- 1 oracle oinstall 951 Jan 15 16:05 lx00001.nlb
    -rw-r--r-- 1 oracle oinstall 957 Jan 15 16:05 lx00002.nlb
    -rw-r--r-- 1 oracle oinstall 959 Jan 15 16:05 lx00003.nlb
    -rw-r--r-- 1 oracle oinstall 984 Jan 15 16:05 lx00004.nlb
    -rw-r--r-- 1 oracle oinstall 968 Jan 15 16:05 lx00005.nlb
    -rw-r--r-- 1 oracle oinstall 962 Jan 15 16:05 lx00006.nlb
    -rw-r--r-- 1 oracle oinstall 960 Jan 15 16:05 lx00007.nlb
    -rw-r--r-- 1 oracle oinstall 950 Jan 15 16:05 lx00008.nlb
    -rw-r--r-- 1 oracle oinstall 940 Jan 15 16:05 lx00009.nlb
    -rw-r--r-- 1 oracle oinstall 939 Jan 15 16:05 lx0000a.nlb
    -rw-r--r-- 1 oracle oinstall 1006 Jan 15 16:05 lx0000b.nlb
    -rw-r--r-- 1 oracle oinstall 1008 Jan 15 16:05 lx0000c.nlb
    -rw-r--r-- 1 oracle oinstall 998 Jan 15 16:05 lx0000d.nlb
    -rw-r--r-- 1 oracle oinstall 1005 Jan 15 16:05 lx0000e.nlb
    -rw-r--r-- 1 oracle oinstall 926 Jan 15 16:05 lx0000f.nlb
    -rw-r--r-- 1 oracle oinstall 1.0K Jan 15 16:05 lx00010.nlb
    -rw-r--r-- 1 oracle oinstall 958 Jan 15 16:05 lx00011.nlb
    -rw-r--r-- 1 oracle oinstall 956 Jan 15 16:05 lx00012.nlb
    -rw-r--r-- 1 oracle oinstall 1005 Jan 15 16:05 lx00013.nlb
    -rw-r--r-- 1 oracle oinstall 970 Jan 15 16:05 lx00014.nlb
    -rw-r--r-- 1 oracle oinstall 950 Jan 15 16:05 lx00015.nlb
    -rw-r--r-- 1 oracle oinstall 1.0K Jan 15 16:05 lx00016.nlb
    -rw-r--r-- 1 oracle oinstall 957 Jan 15 16:05 lx00017.nlb
    -rw-r--r-- 1 oracle oinstall 932 Jan 15 16:05 lx00018.nlb
    -rw-r--r-- 1 oracle oinstall 932 Jan 15 16:05 lx00019.nlb
    -rw-r--r-- 1 oracle oinstall 951 Jan 15 16:05 lx0001a.nlb
    -rw-r--r-- 1 oracle oinstall 944 Jan 15 16:05 lx0001b.nlb
    -rw-r--r-- 1 oracle oinstall 953 Jan 15 16:05 lx0001c.nlb
    Starting up:
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options.
    ORACLE_HOME = /u01/oracle/PROD/db/tech_st/11.2.0
    System name: SunOS
    Node name: proddb3.zakathouse.org
    Release: 5.10
    Version: Generic_147440-19
    Machine: sun4u
    Using parameter settings in server-side spfile /u01/oracle/PROD/db/tech_st/11.2.0/dbs/spfilePROD.ora
    System parameters with non-default values:
    processes = 200
    sessions = 400
    timed_statistics = TRUE
    event = ""
    shared_pool_size = 416M
    shared_pool_reserved_size= 40M
    nls_language = "american"
    nls_territory = "america"
    nls_sort = "binary"
    nls_date_format = "DD-MON-RR"
    nls_numeric_characters = ".,"
    nls_comp = "binary"
    nls_length_semantics = "BYTE"
    memory_target = 11G
    memory_max_target = 12G
    control_files = "/u01/oracle/PROD/db/apps_st/data/cntrl01.dbf"
    control_files = "/u01/oracle/PROD/db/tech_st/10.2.0/dbs/cntrl02.dbf"
    control_files = "/u01/oracle/PROD/db/apps_st/data/cntrl03.dbf"
    db_block_checksum = "TRUE"
    db_block_size = 8192
    compatible = "11.2.0.0.0"
    log_archive_dest_1 = "LOCATION=/u01/oracle/PROD/db/apps_st/data/archive"
    log_archive_format = "%t_%s_%r.dbf"
    log_buffer = 14278656
    log_checkpoint_interval = 100000
    log_checkpoint_timeout = 1200
    db_files = 512
    db_file_multiblock_read_count= 8
    db_recovery_file_dest = "/u01/oracle/fast_recovery_area"
    db_recovery_file_dest_size= 14726M
    log_checkpoints_to_alert = TRUE
    dml_locks = 10000
    undo_management = "AUTO"
    undo_tablespace = "APPS_UNDOTS1"
    db_block_checking = "FALSE"
    session_cached_cursors = 500
    utl_file_dir = "/usr/tmp"
    utl_file_dir = "/usr/tmp"
    utl_file_dir = "/u01/oracle/PROD/db/tech_st/10.2.0/appsutil/outbound"
    utl_file_dir = "/u01/oracle/PROD/db/tech_st/10.2.0/appsutil/outbound/PROD_proddb3"
    utl_file_dir = "/usr/tmp"
    plsql_code_type = "INTERPRETED"
    plsql_optimize_level = 2
    job_queue_processes = 2
    cursor_sharing = "EXACT"
    parallel_min_servers = 0
    parallel_max_servers = 8
    core_dump_dest = "/u01/oracle/PROD/db/tech_st/10.2.0/admin/PROD_proddb3/cdump"
    audit_file_dest = "/u01/oracle/admin/PROD/adump"
    db_name = "PROD"
    open_cursors = 600
    pga_aggregate_target = 1G
    workarea_size_policy = "AUTO"
    optimizer_secure_view_merging= FALSE
    aq_tm_processes = 1
    olap_page_pool_size = 4M
    diagnostic_dest = "/u01/oracle"
    max_dump_file_size = "20480"
    Tue Jan 15 16:16:02 2013
    PMON started with pid=2, OS id=18608
    Tue Jan 15 16:16:02 2013
    PSP0 started with pid=3, OS id=18610
    Tue Jan 15 16:16:03 2013
    VKTM started with pid=4, OS id=18612 at elevated priority
    VKTM running at (10)millisec precision with DBRM quantum (100)ms
    Tue Jan 15 16:16:03 2013
    GEN0 started with pid=5, OS id=18616
    Tue Jan 15 16:16:03 2013
    DIAG started with pid=6, OS id=18618
    Tue Jan 15 16:16:03 2013
    DBRM started with pid=7, OS id=18620
    Tue Jan 15 16:16:03 2013
    DIA0 started with pid=8, OS id=18622
    Tue Jan 15 16:16:03 2013
    MMAN started with pid=9, OS id=18624
    Tue Jan 15 16:16:03 2013
    DBW0 started with pid=10, OS id=18626
    Tue Jan 15 16:16:03 2013
    LGWR started with pid=11, OS id=18628
    Tue Jan 15 16:16:03 2013
    CKPT started with pid=12, OS id=18630
    Tue Jan 15 16:16:03 2013
    SMON started with pid=13, OS id=18632
    Tue Jan 15 16:16:04 2013
    RECO started with pid=14, OS id=18634
    Tue Jan 15 16:16:04 2013
    MMON started with pid=15, OS id=18636
    Tue Jan 15 16:16:04 2013
    MMNL started with pid=16, OS id=18638
    DISM started, OS id=18640
    ORACLE_BASE from environment = /u01/oracle
    Tue Jan 15 16:16:08 2013
    ALTER DATABASE MOUNT
    ORA-12709 signalled during: ALTER DATABASE MOUNT...

    ORA-12709 signalled during: ALTER DATABASE MOUNT...
    Do you have any trace files generated at the time you get this error?
    Please see these docs.
    ORA-12709: WHILE STARTING THE DATABASE [ID 1076156.6]
    Upgrading from 9i to 10gR2 Fails With ORA-12709 : Error While Loading Create Database Character Set [ID 732861.1]
    Ora-12709 While Trying To Start The Database [ID 311035.1]
    ORA-12709 when Mounting the Database [ID 160478.1]
    How to Move From One Database Character Set to Another at the Database Level [ID 1059300.6]
    Thanks,
    Hussein

  • ORA-12709: error while loading create database character set

    I installed Oracle 8.0.5 on Linux successfully: I was able to log in
    with SQL*Plus, start and stop the db with svrmgrl, etc.
    During this install I chose WE8ISO8859P9 as the database
    character set when prompted.
    After that I installed Oracle Application Server 3.02, and now
    I'm getting the
    ORA-12709: error while loading create database character set
    message when I try to start up the database, and the database
    won't mount.
    Platform is RedHat Linux 5.2.
    I set NLS_LANG to different settings,
    e.g. AMERICAN_AMERICA.WE8ISO8859P9,
    but without success.
    Anyone any clue?
    Thanks!

    In reply to Jogchum Reitsma's question above:
    You can create the database with the WE8DEC character set
    and use WE8ISO8859P9 on the client, even on Linux.
    The NLS_LANG setting doesn't affect the database, only the
    interface with the database. The same setting can be used in the
    Windows 95/98/NT registry.

  • Transferring Data between Databases with Character Sets UTF8 and US7ASCII

    Hi,
    I am trying to transfer data from Oracle 10g (character set: UTF8) to Oracle 8i (character set: US7ASCII). I have tried the transfer using DB links and found that there is no way the data can be transferred directly from 10g to Oracle 8i.
    The last option available is to use a staging database for the transfer. The staging database would also be Oracle 10g, but its character set would be US7ASCII. I am expecting that since the character set is US7ASCII, it would be compatible with Oracle 8i (US7ASCII). Secondly, the transfer from 10g to the staging 10g should also work, since the staging 10g would support the UTF8 character set.
    Kindly tell me if this option would work or if there is any other way around.
    Thanks
    Nitin

    You possibly have a fundamental problem, which is more important than any technical issue: if your UTF8 (Unicode) database stores non-English characters, you will lose those characters when transferring to a US7ASCII database.
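    A quick way to see the effect is the CONVERT function; anything outside US7ASCII gets replaced (the exact replacement character can vary, but the information is gone):
    select convert('m³/h °C', 'US7ASCII', 'UTF8') as ascii_version
      from dual;
    -- typically returns something like 'm?/h ?C'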
    Werner

  • Server uses WE8ISO8859P15 character set (possible charset conversion)

    Hi,
    when running exp from 9i I receive:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export done in WE8PC850 character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P15 character set (possible charset conversion)
    What is the problem?
    Thank you.
    I exported just a table; how can I check that it was exported correctly?

    Dear user522961,
    You have either not defined or have misdefined the NLS_LANG environment variable before running the export command.
    Here is a little illustration;
    $ echo $NLS_LANG
    AMERICAN_AMERICA.WE8ISO8859P9
    $ exp system/password@opttest file=ogan.dmp owner=OGAN
    Export: Release 10.2.0.4.0 - Production on Mon Jul 12 18:10:47 2010
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export done in WE8ISO8859P9 character set and AL16UTF16 NCHAR character set
    About to export specified users ...
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user OGAN
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user OGAN
    About to export OGAN's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    . about to export OGAN's tables via Conventional Path ...
    . exporting synonyms
    . exporting views
    . exporting stored procedures
    . exporting operators
    . exporting referential integrity constraints
    . exporting triggers
    . exporting indextypes
    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting materialized views
    . exporting snapshot logs
    . exporting job queues
    . exporting refresh groups and children
    . exporting dimensions
    . exporting post-schema procedural objects and actions
    . exporting statistics
    Export terminated successfully without warnings.
    $ export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P15
    $ exp system/password@opttest file=ogan.dmp owner=OGAN
    Export: Release 10.2.0.4.0 - Production on Mon Jul 12 18:12:41 2010
    Copyright (c) 1982, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export done in WE8ISO8859P15 character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P9 character set (possible charset conversion)
    About to export specified users ...
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user OGAN
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user OGAN
    About to export OGAN's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    . about to export OGAN's tables via Conventional Path ...
    . exporting synonyms
    . exporting views
    . exporting stored procedures
    . exporting operators
    . exporting referential integrity constraints
    . exporting triggers
    . exporting indextypes
    . exporting bitmap, functional and extensible indexes
    . exporting posttables actions
    . exporting materialized views
    . exporting snapshot logs
    . exporting job queues
    . exporting refresh groups and children
    . exporting dimensions
    . exporting post-schema procedural objects and actions
    . exporting statistics
    Export terminated successfully without warnings.
    Hope it helps,
    Ogan

  • Server uses AL32UTF8 character set

    Hello All,
    OS: Linux (SLES 8)
    Recently I have applied the 10g Release 1 (10.1.0.5) patch set for AIX 64-bit -- Patch #
    4505133.
    My earlier version of the database was 10.1.0.3.0.
    I got the message that the patch was successfully installed.
    I can log in to the database.
    One of our weekly routines is to run our internal shell script interface
    programs.
    The very first step of these programs is to create a .dmp (exp) file. We have been doing this
    for years.
    Now, after applying this patch set,
    I am getting an error in this program, as below:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.5.0 -
    Production
    With the Partitioning, OLAP and Data Mining options
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses AL32UTF8 character set (possible charset conversion)
    About to export specified users ...
    . exporting pre-schema procedural objects and actions
    . exporting foreign function library names for user abc
    . exporting PUBLIC type synonyms
    . exporting private type synonyms
    . exporting object type definitions for user abc
    About to export abc's objects ...
    . exporting database links
    . exporting sequence numbers
    . exporting cluster definitions
    EXP-00056: ORACLE error 932 encountered
    ORA-00932: inconsistent datatypes: expected BLOB, CLOB got CHAR
    EXP-00000: Export terminated unsuccessfully
    DN

    Pierre,
    I do not have any invalid objects.
    But I ran
    SQL> @?/rdbms/admin/catmetx.sql (per Metalink note 339938.1).
    Now I am getting the errors below:
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_COM_USER_PRIVILEGES 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_COLLECTION_DEFINITION 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_COLLECTION_DEF_REPOS 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_COLLECTION_REPOSITORY 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_ERRORS 0 rows exported
    . . exporting table QUEST_SL_EXPLAIN 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_EXPLAIN_PICK 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_QUERY_DEFINITIONS 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identifier
    . . exporting table QUEST_SL_QUERY_DEF_REPOSITORY 0 rows exported
    EXP-00056: ORACLE error 904 encountered
    ORA-00904: "SYS"."DBMS_EXPORT_EXTENSION"."FUNC_INDEX_DEFAULT": invalid identif
    Now I am running @catproc.sql.
    After that I will run the utlrp.sql script if there are any invalid objects.
    DN

  • HOW can I enter text using Japanese character sets?

    The "Text, Plates, Insets" section of the LOOKOUT(6.01) Help files states:
    "Click the » button to the right of the Text field to expand the field for multiple line entries. You can enter text using international character sets such as Chinese, Korean, and Japanese."
    Can someone please explain HOW to do this? Note, I have NO problem inputting Hiragana, Katakana, and Kanji into MS WORD; the keyboard emulates the Japanese layout and characters (Romaji is default), the IME works fine converting Romaji, and I can also select characters directly from the IME Pad. I have tried several different fonts with success and am currently using MS UI Gothic.ttf as the default. Again, everything is normal and working in a predictable manner within Word.
    I cannot get these texts into Lookout. I can't cut/paste from HTML pages or from text editors, even though both display properly. Within Lookout, with JP selected as language/keyboard, when trying to type directly into the text field, the IME CORRECTLY displays Hiragana until <enter> is pressed, at which point all text reverts to question marks (?? ???? ? ?????). If I use the IME Pad, it does pretty much the same. I managed to get the "Yen" symbol to display, though, if that's relevant. As I said, the font selected (in text/plate font options) is MS UI Gothic with Japanese as the selected script. Oddly enough, at this point the "sample" window is showing me the exact Hiragana character I want displayed in Lookout, but Lookout won't show it. I've also tried staying in English and copying Unicode characters from the Windows Character Map. Same results (Yen sign works, Hiragana WON'T).
    Help me!
    JW_Tech

    JW_Tech,
    Have you changed the regional setting to Japanese?
    Doug M
    Applications Engineer
    National Instruments
    For those unfamiliar with NBC's The Office, my icon is NOT a picture of me
    Attachments:
    language.JPG 50 KB

  • Data load using   0HR_PA_OS_1 data source

    Hi,
    I am trying to load data (full load) using the 0HR_PA_OS_1 (Staffing Assignments) data source, but it is extracting 0 records.
    I checked in RSA3 (R/3) and it is still showing 0 records, but there is a significant amount of data already present in the BW cube (50,000 records).
    Do I need to perform any prerequisites before loading data using 0HR_PA_OS_1?
    Thanks

    Referred to SAP Note 429145.

  • Help : error while loading create database character set

    SQL> startup
    ORACLE instance started.
    Total System Global Area 599785472 bytes
    Fixed Size 2022600 bytes
    Variable Size 180355896 bytes
    Database Buffers 411041792 bytes
    Redo Buffers 6365184 bytes
    ORA-12709: error while loading create database character set

    ORA-12709 error while loading create database character set
    Cause: This is an internal error.
    Action: Contact Oracle Support Services.

  • Unable to load CSV data using APEX Data Load using Firefox/Safari on a MAC

    I have APEX installed on a Windows XP machine connected to an 11g database on the same Windows XP machine.
    While on the windows XP, using IE 7, I am able to successfully load a CSV spreadsheet of data using the APEX Data Load utility.
    However, if I switch to my MacBook Pro running OS X Leopard, log in to the same APEX machine using Firefox 2 or 3 or Safari 3, and then try to upload CSV data, it fails on the "Table Properties" step. When it asks for the name of the new table and then asks you to set the table properties, the table properties simply never appear (they do appear in IE 7 on Windows XP), and if you try to hit the NEXT button, you get the error message: "1 error has occurred. At least one column must be specified to include in new table." Of course, you can't specify any of the columns because there is nothing under SET TABLE PROPERTIES in the interface.
    I also tried to load data with Firefox 2, Firefox 3 (beta), and Safari 3.1, but get the same failed result on all three. If I return to the Windows XP machine and use IE 7.0, the Data Load works just fine. I work in an all-Mac environment (it was difficult to get a Windows machine into my workplace), and all my end users will be using Macs. There is no current version of IE for the Mac, so I have to use Firefox or Safari.
    Is there some option in Firefox or Safari that I can turn on so this Data Load feature will work on the MAC?
    Thanks for your help. Any assistance appreciated.
    Tony

    I managed to get this to work by saving the CSV file as Windows CSV (not DOS CSV), which allowed the CSV data to be read by Oracle running on Windows XP. I think the problem had to do with different character sets being used for CSV on the Mac versus CSV on Windows. Maybe if I had created my Windows XP Oracle database with Unicode as the default character set, I never would have experienced this problem.

  • Wrong character set in RFC Response.

    Hi all,
    Interesting little problem: when calling an ERP system via the RFC Adapter, I am expecting English text in a string field; however, I end up getting a series of what I assume are Chinese characters. I don't know why; it has nothing to do with a language text lookup, as I am explicitly forcing a return of 'Call Made!' to one of the string fields as part of my debugging.
    So the question is: where can I go to find out what the default character set is for the PI system?
    Thanks.

    Thanks Stefan,
    the front-end codepage/character set is UTF-8, and the RFC user, in fact all users, are logging on with a language of 'EN'. It seems to be just this particular field. Another field, for example, is Material Description, and that comes through as English and is readable. However, this other field, as populated by the RFC call, comes through in a different character set regardless of whether I set it deliberately via code, e.g.
    ev_sortf = 'Call Made!'.
    or let it populate from a select statement. When I look at the field configuration and the WSDL, it is set as type STRING both in the data type and in the RFC definition, and I cannot see anything that stands out as a reason not to use the English character set.

  • Extraction problem - selection conditions for data load using abap program

    Hi All,
    I have a problem loading data over a selected period where the selection of the date range is done using an ABAP routine (type 6). Although in the request header tab of the monitor screen I am able to see the selection date range populated correctly, no records are being extracted. But if I delete the ABAP filter and directly give the same date range for the selection, we are able to extract data. If anybody has faced a similar problem and has a solution for it, please help me with your suggestions.
    Thanks,
    nithin.

    It seems the date range is not properly set in the routine.
    You can check the value of the selection period generated by the routine in the data selection tab; there is an Execute button there.
    Click it to test the selection values generated by the ABAP routine.
    If the values there look correct, then paste the code of the routine you have written, with brief details of the logic you applied.
    Sonal.....

  • Export using UTF8 character set

    Hi,
    My client has a production database with the default character set.
    While exporting, I need to export the database to create a dump with the UTF8 character set.
    Please let me know how to export with the UTF8 option.
    Thanks....

    Hi, I am not sure if I got you correctly. Here is what I think I have understood:
    - your client has a db which uses UTF8 as the character set.
    - you want to export and make sure that there is no conversion taking place.
    For this you must set the NLS_LANG variable properly in the shell from which you call exp (or expdp).
    NLS_LANG is composite and consists of:
    NLS_LANGUAGE_NLS_TERRITORY.NLS_CHARACTERSET
    In a bash shell this would look like this:
    $ export NLS_LANG=american_america.UTF8
    In other shells you might need to first define the variable and then export it:
    $ NLS_LANG=american_america.UTF8
    $ export NLS_LANG
    What you also need to know is that by specifying NLS_TERRITORY you influence the behavior of the decimal separator and the group separator in numeric values, and a few more settings.
    For instance, NLS_TERRITORY=AMERICA uses a "." as the decimal separator and a "," as the group separator for numeric values.
    This can be a pitfall! If you have the wrong territory specified, it might destroy all your numeric values!
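    Just to illustrate the separator pitfall, the effect of NLS_NUMERIC_CHARACTERS can be seen directly in SQL (illustrative values only):
    select to_char(1234567.89, '9G999G999D99',
                   'nls_numeric_characters=''.,''') as us_style,
           to_char(1234567.89, '9G999G999D99',
                   'nls_numeric_characters='',.''') as de_style
      from dual;
    -- us_style: 1,234,567.89    de_style: 1.234.567,89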
    Hope this helps,
    Lutz

  • Use 2 character sets in the same sapscript form (1100 & 4030)

    Is this possible, or do I need to make a new character set and then make the changes that I need?
    For example:
    The 'Field group' character is usually hex 1D. The problem is
    that hex 1D is not part of the printer character set 1100 . So
    a lot of changes are necessary:
    - Copy your device type into the customer namespace (transaction SPAD,
    Utilities - For device types - Copy device type).
    - Check, which character set your device type uses (transaction SPAD,
    device type).
    - Copy the character set to the customer namespace (SPAD - Character
    set; customer namespace: 9xxx)
    - Call SPAD - Device Type and put the new character set into all three
    fields at 'Printer character sets'.
    - Now go into your new character set (SPAD - character set , Button 'Edit
    character set '). Unfortunately the character 'Field group'  doesn't
    exist yet. So you should select a character which you don't need and
    which you can 'misuse' as 'Field group' . Let's say, you don't need
    the character 'Thorn'. Then you
    have to add/change the entry of character 354,
    if you want to use the Thorn character for the 'Field group' .
    Here you must insert the sequence '1D' to get the 'Field group'  character .
    Now if you write <354> in the Sapscript layout set in the old
    editor or if you press 'Insert command' in the new editor
    and insert SAP character 354, a 'Field group'  should be inserted
    at that place in the print data.

    The Function Module CLOSE_FORM has an optional TABLES parameter called OTFDATA that can be filled with the contents of the document instead of printing.
    To do this, you must set the field TDGETOTF = 'X' in the OPTIONS structure that can be supplied as an optional parameter to Function Module OPEN_FORM when the document is opened for output.
    This table contains the OTF format data that describes your SAPscript document. Normally the SAPscript composer would print this to an external printer or fax or email, but once captured in your program you can do anything you like with this information.
    For instance, you can convert OTF data to a PDF file using SX_OBJECT_CONVERT_OTF_PDF. See the many threads on this topic in the ABAP Development forum if you are interested.
    I hope this hint is helpful to you. Good luck.
