Unicode-UTF8-Support for MaxDB?

Hello,
according to the general information, MaxDB supports only UTF-16. If one is
facing the decision to migrate an Oracle 9.2 non-Unicode DB to either a Unicode UTF-16 MaxDB or a Unicode UTF-8 Oracle DB, the answer would probably be to go for the Unicode UTF-8 Oracle DB, because the SAN costs should be approximately half of the MaxDB SAN costs.
Will UTF-8 support be ready for MaxDB soon?
Thanks for your reply.
br
Chris

Hi Chris,
yeah, that note needs a small change (UTF-16 changed into UCS-2).
Indeed, a 2TB DB under UTF-8 needs about 3TB under UCS-2. So yes, one would need more space for MaxDB, but the TCO does not depend only on the cost of the SAN. Easy administration and freedom from reorganisation are just two examples of what can drastically lower the TCO of a system. Added to the low cost of the MaxDB software itself, that <i><b>can</b></i> make a difference.
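The size difference described here follows directly from the encodings: UCS-2 always spends 2 bytes per character, while UTF-8 spends 1 byte for ASCII and up to 3 bytes for other characters in the Basic Multilingual Plane. A minimal sketch of the arithmetic (plain Python, nothing MaxDB-specific; the sample strings are my own):

```python
# Byte cost of the same text under UTF-8 vs UCS-2. For BMP characters,
# UTF-16 (little-endian, no BOM) is byte-identical to UCS-2, so it serves
# as a stand-in here.
samples = {
    "latin": "Customer order 4711, net amount 99.95 EUR",
    "german": "Straßenkürzel für München",
    "japanese": "受注伝票の明細データ",
}
for name, text in samples.items():
    utf8 = len(text.encode("utf-8"))
    ucs2 = len(text.encode("utf-16-le"))  # 2 bytes per BMP character
    print(f"{name:9s} utf-8: {utf8:3d} bytes   ucs-2: {ucs2:3d} bytes")
```

For mostly Latin business data, UTF-8 stays close to 1 byte per character while UCS-2 doubles it, which is roughly where the 2TB-versus-3TB figure comes from.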
Because of other topics, which I cannot go into and which currently have a higher priority, UTF-8 support is only planned for the long term.
I'm not sure if this is allowed, but this (German) article is quite an interesting read:
<a href="http://www.computerwoche.de/produkte_technik/storage/581295/">http://www.computerwoche.de/produkte_technik/storage/581295/</a>
With the Unicode / ASCII mixture I just meant that, for example, a Unicode WebAS contains ASCII data as well. Another example: in MDM there's an explicit distinction between ASCII and Unicode data. Our interfaces support all three (ASCII, UTF-8 and UCS-2).
Regards,
Roland

Similar Messages

  • How to buy support for MaxDB?

    hi,
    I wish to know how we can buy support for MaxDB.
    KR
    Clóvis

    Hi Clovis,
    currently, SAP does not sell MaxDB support to non-SAP customers.
    You might want to take a look at the OMDG, maybe they can provide what you're looking for: [Open MaxDB Group (OMDG) – The User Group|http://www.open-maxdb-group.org].
    Other option: this forum.
    Regards,
    Roland

  • OpenJPA support for MaxDB?

    I'm not finding anything on the internet that shows that MaxDB will work with OpenJPA. If it isn't supported, is there a persistence abstraction like OpenJPA (or Hibernate, which unfortunately does not have an acceptable license, so we can't use it)?
          Thanks,
          Harold Shinsato


  • Cdrtools package, support for nls/utf8 character sets

    Hello ppl,
    I've been trying desperately to burn a CD/DVD (k3b) with Greek file and directory names. I ended up with file names like "???????????????????? (invalid unicode)".
    After a lot of searching, I managed to isolate and solve the problem. There is a patch (http://bugs.gentoo.org/attachment.cgi?id=52097) for cdrtools to support nls/utf8 character sets.
    I guess that 90%+ of people using Arch and burning CDs/DVDs ignore the problem because they just burn CDs/DVDs using standard English characters.
    For all others, here it is:
    # Patched cdrtools to support nls/utf8 character sets
    # Contributor: Akis Maziotis <[email protected]>
    pkgname=cdrtools-utf8support
    pkgver=2.01.01
    pkgrel=3
    pkgdesc="Tools for recording CDs patched for nls/utf8 support!"
    depends=('glibc')
    conflicts=('cdrtools')
    source=(ftp://ftp.berlios.de/pub/cdrecord/alpha/cdrtools-2.01.01a01.tar.gz http://bugs.gentoo.org/attachment.cgi?id=52097)
    md5sums=('fc085b5d287355f59ef85b7a3ccbb298' '1a596f5cae257e97c559716336b30e5b')
    build() {
      cd $startdir/src/cdrtools-2.01.01
      msg "Patching cdrtools ..."
      patch -p1 -i ../attachment.cgi?id=52097 || return 1
      msg "Patching done"
      make || return 1
      make INS_BASE=$startdir/pkg/usr install
    }
    It's a modified PKGBUILD of the official Arch cdrtools package (http://cvs.archlinux.org/cgi-bin/viewcv … cvs-markup), patched to support nls/utf8 character sets.
    Worked like a charm. 
    If you want to install it, you should uninstall the cdrtools package first:
    pacman -Rd cdrtools
    P.S.: I've reported this as a bug at http://bugs.archlinux.org/task/3830, but nobody seemed to care...    :cry:  :cry:  :cry:

    Hi Bharat,
    I have created an Oracle 8.1.7 database with the UTF8 character set
    on Windows 2000.
    Now I want to store and retrieve information in other languages,
    say Japanese or Hindi.
    I had set the NLS language and NLS territory to HINDI and INDIA
    in the SQL*Plus session but could not see the information.

    You cannot view Hindi using SQL*Plus. You need iSQL*Plus.
    (Available as a download from OTN, and requiring the Oracle HTTP
    server).
    Then you need the fonts (either Mangal from Microsoft or
    Code2000).
    Have your NLS_LANG settings in your registry to
    AMERICAN_AMERICA.UTF8. (I have not tried with HINDI etc, because
    I need my solution to work with 806,817 and 901, and HINDI was
    not available with 806).
    Install the language pack for Devanagari/Indic languages
    (c_iscii.dll) on Windows NT/2000/XP.
    How can I use Forms 6i to support these languages?

    I am not sure about that.
    Do write back if this does not solve your problem.
    --Shirish

  • MaxDB support for SAP XI 3.0 SP 18 (NW04)

    Hi,
    I need an advice.
    We are using SAP XI 3.0 SP 18 (NW04) on MaxDB 7.5.0.046. Because of some issues with the database we are considering upgrading to 7.6. However, it is not clear if MaxDB 7.6 is supported for NW04.
    service.sap.com/pam -> SAP NetWeaver -> SAP NetWeaver 04 suggests:
    1. tab Database platforms
    SAP KERNEL 6.40 64-BIT UNICODE  Maintained until 31.03.2013; MaxDB
    supported database version: MaxDB 7.5 64-BIT
    2. tab Process integration (PI/XI)
    supported database version: MaxDB 7.6 64-BIT
    QUESTION IS: which tab (1 or 2) should I trust regarding the supported database version for XI?
    Best Regards,
    Piotr

    Hi,
    2. tab Process integration (PI/XI)
    supported database version: MaxDB 7.6 64-BIT <b>is correct</b>, as ..
    1. tab Database platforms
    SAP KERNEL 6.40 64-BIT UNICODE Maintained until 31.03.2013; MaxDB
    <b>is for the application server ABAP</b>
    Award points if helpful
    Regards
    Umesh

  • Call upon even better support for Unicode

    Hello
    Following some messages I have posted regarding problems I encountered while developing a non-English web application, I would like to call upon an even better support for Unicode. Before I describe my call, I want to say that I consider Berkeley DBXML a superb product. Superb. It lets us develop very clean and maintainable applications. Maintainability is, in my view, the keyword in good software development practices.
    In this message I would like to remind you that the 7-bit US-ASCII set of characters represents only 0.4% of all characters in the world. It is also true to say that most of our software comes from the efforts of American developers, for which I am of course very grateful.
    But problems with non-US-ASCII characters are very, very time-consuming to solve. To start with, our operating systems need to be configured especially for Unicode, our servers too, our development tools too, our source code too and, finally, our data too. That's a lot of configuring, isn't it? Believe me, as a Flemish, French- and Danish-speaking developer who is currently developing a new application in Portuguese, I know what I am talking about.
    Have you ever tried to write a Java class called Ação.java that loads an XML instance called Ação.xml containing something like <?xml version="1.0" encoding="utf-8"?><ação variável="descrição"/>? It takes at least twice as long to get all this working right in a web application on a Linux server as it would take to write an Acao.java that loads Acao.xml containing <?xml version="1.0" encoding="us-ascii"?><acao variavel="descricao"/> (which is clearly something we do not want in Portugal).
    I have experienced a problem while using the dbxml shell to load documents that have UTF-8 encoded names. See "difficulties retrieving documents with non ascii characters in name". The workaround is not to use the dbxml shell, with which I am of course not very happy.
    So, while trying not to be arrogant and while trying to express my very, very great appreciation for this great product, I call upon even better support for Unicode. After all, when the rest of us, who use the other 65279 characters in our software, are able to use this great product without problems, will it not contribute to the success of Berkeley DBXML?
    Thank you
    Koen
    Edited by: koenheene on 29/Out/2009 3:09

    Hello John and thank you for replying,
    You are completely correct that it is a shell problem. I investigated and found solutions for running dbxml in a Linux shell. On Windows, as one could expect, there is no solution so far.
    Here is an overview of my investigation, which I hope will be useful for other developers who also persist in writing code and XML in their language.
    difficulties retrieving documents with non ascii characters in name
    I was wondering, though, if it would not be possible to write the dbxml shell in such a way that it becomes independent of the encoding of the shell. Surely there must be, no? Rewrite dbxml in Java? Any candidates :-) ?
    Thanks again for the very good work,
    Koen

  • XDK support for UTF-16 Unicode

    Hi,
    Does the Oracle Java XDK, specifically the XSQL servlet and API,
    support UTF-16 Unicode?
    Presumably, .xsql files have to be stored, read and queries
    executed in a Unicode-compliant format for this to work. Is this
    currently possible?
    Thanks,
    - Manish

    If you are using XDK 9.0.1 or later with JDK 1.3,
    this combination supports UTF-16. XSQL inherits the support
    for free in this case.

  • Unicode Support for Brio 8

    <p>Does Brio 8 (or any of the subsequent Hyperion versions) provide support for Unicode characters? If not, how do we tackle reporting from databases containing non-English characters like German, Japanese etc.? Thanks. Cheers.</p>

    It's not what you describe, but here is more detail on what I'm doing.
    This is an example of the value string I'm storing: a simple XML object, converted to a unicode string, encoded in UTF-8:
    u'<d cdt="1267569920" eml="[email protected]" nm="\u3059\u3053\u3099\u304f\u597d\u304d\u306a\u4e16\u754c" pwd="2689367b205c16ce32ed4200942b8b8b1e262dfc70d9bc9fbc77c49699a4f1df" sx="M" tx="000000000" zp="07030" />'
    The nm attribute is Japanese text: すごく好きな世界
    So when I add a secondary index on nm, my callback function is an xml parser which returns the value of a given attribute:
    Generically it's this:
    def callbackfn(attribute):
        """Define how the secondary index retrieves the desired attribute from the data value"""
        return lambda primary_key, primary_data: xml_utils.parse_item_attribute(primary_data, attribute)
    And so for this specific attribute ("nm"), my callback function is:
    callbackfn('nm')
    As I said in my original post, if I add this to the db, I get this type error:
    TypeError: DB associate callback should return DB_DONOTINDEX/string/list of strings.
    But when I do not place a secondary index on "nm", the type error does not occur.
    So that's consistent with what Sandra wrote in the other post, i.e.:
    "Berkeley DB never operates on the value part of a record. Values are simply payload, to be stored with keys and reliably delivered back to the application on demand."
    My guess is that I need to add an additional UTF-8 encoding or decoding step to the callback function, or else define a custom comparison function so the callback will know what to do with the nm attribute value, but I'm not sure what exactly.
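A minimal sketch of that guess, assuming the fix is simply to encode the callback's return value to UTF-8 bytes (the associate callback must return a byte string, not a unicode object). `parse_item_attribute` below is an illustrative stand-in for the poster's `xml_utils` helper, rewritten with the standard library:

```python
import xml.etree.ElementTree as ET

def parse_item_attribute(primary_data, attribute):
    # Illustrative stand-in for xml_utils.parse_item_attribute: parse the
    # stored XML value and return one attribute, or None if absent.
    if isinstance(primary_data, bytes):
        primary_data = primary_data.decode("utf-8")
    return ET.fromstring(primary_data).get(attribute)

def callbackfn(attribute):
    # Secondary-index callback that returns the attribute value encoded
    # back to UTF-8 bytes, i.e. a plain byte string the callback may return.
    def callback(primary_key, primary_data):
        value = parse_item_attribute(primary_data, attribute)
        if value is None:
            return None  # a real bsddb callback would return DB_DONOTINDEX
        return value.encode("utf-8")
    return callback

# The Japanese nm value round-trips as UTF-8 bytes:
record = '<d nm="すごく好きな世界" zp="07030" />'.encode("utf-8")
key = callbackfn("nm")(b"some_key", record)
```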

  • MaxDB support for production systems

    Hi,
    I have a DMS with MaxDB version 7.6.00.18.
    I opened an OSS message regarding an issue. I got a solution; however, I was told that '7.6.00.18 is very outdated and no longer supported for productive systems'.
    I can install the latest patch, but how do I know when a MaxDB version is no longer supported for productive systems?
    Can someone refer me to a note/article?
    Thanks,
    Omri

    Here are the details I found:
    Especially make sure to check in the Product Availability Matrix (PAM) at http://service.sap.com/pam that the MaxDB / SAP DB version you plan to use is released for your specific SAP product!
    Note 1178367 - SAP MaxDB: End of Support Dates
    Cheers,
    -Sunil

  • Chinese characters or else with a driver db2 z-os Unicode (UTF8)

    Post Author: fred_181061
    CA Forum: Desktop Intelligence Reporting
    Hi,
    Is there anybody who uses or has tested this driver with a DB2 z/OS database and a tablespace declared in Unicode UTF-8?
    I notice that this driver is not supported by Business Objects.
    The interface succeeded in the HTML viewer (Deski 3-tier),
    but not in Java (Webi) or Deski 2-tier.
    Thanks for your answer and best regards

    I am also not an expert and haven't used DB2 on z/OS, but am using it on OS/400. I think your DBA is confused.
    From what I have experienced, setting commit control to CS causes more locking than RS. From what I can determine (which has been through trial and error, so it may not be totally accurate), CS causes the database to exclusively lock all records in the cursor, while RS will use shared locks. I do know from painful experience that if you set all your transactions to CS you will experience frequent update problems due to record locks. We had to go through and set everything to RS and, once we did, our concurrency problems were nearly eliminated. There are some instances where you have to use *CS, however, such as when a stored procedure returns a cursor, but the compiler will tell you about this.
    Another thing that doesn't make sense: locks are typically implemented as data structures in memory and require little if any processor time. I don't know the specifics of the DB2 implementation, but Oracle uses simple semaphores, so I would assume IBM does, too.
    Also, from my experience, there appears to be no way around DB2 locking. Having come from an Oracle background, I find the amount of locking DB2 does totally frustrating. It's not as bad as SQL Server but can become problematic at times. That is just my $.02 anyway.

  • Disk Volume cost for MaxDB 7.6 and 7.8

    Hello, everyone
    I heard that MaxDB 7.8 uses UTF-8 to store characters while 7.6 uses UTF-16, so I ran a comparison to verify this.
    To my disappointment, the result did not support it. The experiment procedure was as follows:
    1. Create a DB with same disk volume size(30MB) and log volume size(20MB) and default char code as Unicode.
    2. Create a table with only one column of varchar(200) unicode.
    3. Insert 6000 lines of records to this table.
    4. See the data area percentage filled.
    5. It showed that the two were nearly the same, and 7.6 was even less than 7.8.
    So, I have the following questions:
    1. Does MaxDB 7.8 use UTF-8 to store characters?
    2. What are the feature lists for MaxDB 7.6 and 7.8?
    I was asked for an SAP Service Marketplace account to log in. Is this kind of information really that secret?
    3. If MaxDB 7.8 uses UTF-8, how do I configure it?
    Any reply will be highly appreciated!
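One possible reason the fill percentages came out so close: the raw payload of the experiment is tiny compared to the 30 MB volume. A back-of-the-envelope sketch (plain Python; it assumes the 6000 test rows contained single-byte Latin characters, the case where the two encodings differ most):

```python
# Step 3 inserts 6000 rows of varchar(200). Raw character payload under
# the two encodings, assuming single-byte (Latin) test characters:
rows, chars = 6000, 200
utf8_bytes = rows * chars * 1   # Latin characters: 1 byte each in UTF-8
ucs2_bytes = rows * chars * 2   # UCS-2: always 2 bytes per character
print(f"UTF-8: {utf8_bytes / 1024:.0f} KiB, UCS-2: {ucs2_bytes / 1024:.0f} KiB")
```

Even in the best case the difference is only about 1.1 MiB, which page headers, free-space management and block rounding inside a 30 MB volume can easily blur.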

    Hi,
    As per the Wiki link http://wiki.sdn.sap.com/wiki/display/MaxDB/FAQ:
    "Bigger development requests (e.g. Unicode UTF-8 or page compression) that would have required man-years of effort in SAP MaxDB are either already available in SAP In-Memory Database or will be done there."
    So it seems UTF-8 is not yet there in MaxDB 7.8.
    Additional features in MaxDB 7.8: http://maxdb.sap.com/training/expert_sessions/SAP_MaxDB_New_Features_in_Version_78.pdf
    Regards,
    Srikishan

  • Unicode UTF8

    Hello all,
    we are using SAP R/3 Enterprise 4.7 with an Oracle 9.2.x database.
    At the beginning of this year we migrated to Unicode UTF-8.
    For several reasons we are planning to migrate to MaxDB next year.
    Our current Oracle DB size is about 250 GB (plus QAS + DEV).
    Big tables are for e.g.
    GLPCA (data: 16 GB,  index: 9 GB), PPOIX (data:10 GB, index: 12 GB),
    COEP   (data:8 GB, index 3 GB),     RFBLG (data: 7 GB, index: 0,5 GB).
    Now my questions:
    1) I have read an SAP document with the message:
        migrating to MaxDB with UTF-16 means additional disk space (40-60%).
        Is this more or less realistic? (I heard up to 100%.)
    2) I read the new features of MaxDB 7.7. There is a kernel parameter
        "useunicodecolumncompression = YES":
        the system uses UTF-8 for Unicode, except for key columns.
        Does this mean that indexes will migrate to UTF-16 and data to UTF-8,
        so the total increase will be about 20-30%?
    3) MaxDB 8.0 can use UTF-8.
        When is the release, approximately?
    Thanks for your answers in advance.
    Regards, Reinhard

    Hello Reinhard
    Customers reported that the size of Oracle DBs might be larger than the same data in a MaxDB (see e.g. <a href="http://maxdb.sap.com/events/MaxDB_Infoday_2007/K1_2007_MaxDB%20in%20der%20GESIS-Systemlandschaft.pdf">GESIS-Systemlandschaft.pdf</a>). This could balance your demands a bit.
    ad 1)
    Of course, the numbers depend on the data. The given 40-60% are expectations.
    ad 2)
    I would answer the question with yes. However, how do you come up with 20-30%? Is there an index on every second Unicode character column?
    ad 3)
    I neither know nor can answer anything about 8.0.
    Regards Thomas
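Reinhard's 20-30% estimate only comes out that way under particular assumptions about how much of the character data sits in key/index columns, which is presumably what Thomas is asking about. A sketch of the arithmetic (the 30/70 split below is an illustrative assumption, not a measured value):

```python
# Growth of a single-byte (non-Unicode) database after migration when key
# columns become UCS-2 (2 bytes/char) and non-key data stays about 1
# byte/char under UTF-8 (true for Latin text). The split is hypothetical.
index_fraction, data_fraction = 0.3, 0.7
growth = index_fraction * 2.0 + data_fraction * 1.0
print(f"size factor: {growth:.2f}x (+{(growth - 1) * 100:.0f}%)")
```

With 30% of the character data in keys, the total grows by about 30%; a larger index share pushes the figure toward the pure UCS-2 doubling.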

  • Backup using backint fails for maxdb

    Hi All, I have configured backint for backup of maxdb for content server 640. I configured it as per the documents available, created the configuration fiale and the parameter file. Created the backup medium in dbmgui. Now when i try to run the backup using the pipe am getting the above mentioned error. Please find below the dbm.ebp log for the same...
    more dbm.ebp
    2009-10-22 02:06:08
    Setting environment variable 'TEMP' for the directory for temporary files and pipes to default ''.
    Setting environment variable 'TMP' for the directory for temporary files and pipes to default ''.
    Using connection to Backint for MaxDB Interface.
    2009-10-22 02:06:08
    Checking existence and configuration of Backint for MaxDB.
    Using configuration variable 'BSI_ENV' = '/sapdb/CFC/lcbackup/apoatlas.env' as path of the configuration file of Backint for MaxDB.
    Setting environment variable 'BSI_ENV' for the path of the configuration file of Backint for MaxDB to configuration value '/sapdb/CFC/lcbackup/apoatlas.env'.
    Reading the Backint for MaxDB configuration file '/sapdb/CFC/lcbackup/apoatlas.env'.
    Found keyword 'BACKINT' with value '/sapdb/CFC/db/bin/backint'.
    Found keyword 'INPUT' with value '/tmp/backint4sapdbCFC.in'.
    Found keyword 'OUTPUT' with value '/tmp/backint4sapdbCFC.out'.
    Found keyword 'ERROROUTPUT' with value '/tmp/backint4sapdbCFC.err'.
    Found keyword 'PARAMETERFILE' with value '/sapdb/CFC/lcbackup/param.cfg'.
    Found keyword 'TIMEOUT_SUCCESS' with value '1800'.
    Found keyword 'TIMEOUT_FAILURE' with value '1800'.
    Finished reading of the Backint for MaxDB configuration file.
    Using '/sapdb/CFC/db/bin/backint' as Backint for MaxDB program.
    Using '/tmp/backint4sapdbCFC.in' as input file for Backint for MaxDB.
    Using '/tmp/backint4sapdbCFC.out' as output file for Backint for MaxDB.
    Using '/tmp/backint4sapdbCFC.err' as error output file for Backint for MaxDB.
    Using '/sapdb/CFC/lcbackup/param.cfg' as parameter file for Backint for MaxDB.
    Using '1800' seconds as timeout for Backint for MaxDB in the case of success.
    Using '1800' seconds as timeout for Backint for MaxDB in the case of failure.
    Using '/sapdb/data/wrk/CFC/dbm.knl' as backup history of a database to migrate.
    Using '/sapdb/data/wrk/CFC/dbm.ebf' as external backup history of a database to migrate.
    Checking availability of backups using backint's inquire function.
    Check passed successful.
    2009-10-22 02:06:08
    Checking medium.
    Check passed successfully.
    2009-10-22 02:06:08
    Preparing backup.
    Setting environment variable 'BI_CALLER' to value 'DBMSRV'.
    Setting environment variable 'BI_REQUEST' to value 'NEW'.
    Setting environment variable 'BI_BACKUP' to value 'FULL'.
    Constructed Backint for MaxDB call '/sapdb/CFC/db/bin/backint -u CFC -f backup -t file -p /sapdb/CFC/lcbackup/param.cfg -i /tmp/backint4sapdbCFC.in -c'.
    Created temporary file '/tmp/backint4sapdbCFC.out' as output for Backint for MaxDB.
    Created temporary file '/tmp/backint4sapdbCFC.err' as error output for Backint for MaxDB.
    Writing '/sapdb/CFC/lcbackup/pipe1 #PIPE' to the input file.
    Writing '/sapdb/CFC/lcbackup/pipe2 #PIPE' to the input file.
    Prepare passed successfully.
    2009-10-22 02:06:08
    Creating pipes for data transfer.
    Creating pipe '/sapdb/CFC/lcbackup/pipe1' ... Done.
    Creating pipe '/sapdb/CFC/lcbackup/pipe2' ... Done.
    All data transfer pipes have been created.
    2009-10-22 02:06:08
    Starting database action for the backup.
    Requesting 'SAVE DATA QUICK TO '/sapdb/CFC/lcbackup/pipe1' PIPE,'/sapdb/CFC/lcbackup/pipe2' PIPE BLOCKSIZE 8 NO CHECKPOINT MEDIANAME 'BACKINT_ONLINE1'' from db-kernel.
    The database is working on the request.
    2009-10-22 02:06:09
    Waiting until database has prepared the backup.
    Asking for state of database.
    2009-10-22 02:06:09 Database is still preparing the backup.
    Waiting 1 second ... Done.
    Asking for state of database.
    2009-10-22 02:06:10 Database is still preparing the backup.
    Waiting 2 seconds ... Done.
    Asking for state of database.
    2009-10-22 02:06:12 Database has finished preparation of the backup.
    The database has prepared the backup successfully.
    2009-10-22 02:06:12
    Starting Backint for MaxDB.
    Starting Backint for MaxDB process '/sapdb/CFC/db/bin/backint -u CFC -f backup -t file -p /sapdb/CFC/lcbackup/param.cfg -i /tmp/backint4sapdbCFC.in -c >>/tmp/backint4sapdbCFC.out 2>>/tmp/backint4sapdbCFC.err'.
    Process was started successfully.
    Backint for MaxDB has been started successfully.
    2009-10-22 02:06:12 Waiting for end of the backup operation.
    2009-10-22 02:06:12 The backup tool is running.
    2009-10-22 02:06:12 The database is working on the request.
    2009-10-22 02:06:14 The backup tool process has finished work with return code 2.
    2009-10-22 02:06:17 The database is working on the request.
    2009-10-22 02:06:27 The database is working on the request.
    2009-10-22 02:06:42 The database is working on the request.
    2009-10-22 02:07:02 The database is working on the request.
    2009-10-22 02:07:15 Canceling Utility-task after a timeout of 60 seconds elapsed ... OK.
    2009-10-22 02:07:17 The database has finished work on the request.
    Receiving a reply from the database kernel.
    Got the following reply from db-kernel: SQL-Code :-903
    The backup operation has ended.
    2009-10-22 02:07:17 Filling reply buffer.
    Have encountered error -24920: The backup tool failed with 2 as sum of exit codes. The database request was canceled and ended with error -903.
    Constructed the following reply: ERR -24920,ERR_BACKUPOP: backup operation was unsuccessful
    The backup tool failed with 2 as sum of exit codes.
    The database request was canceled and ended with error -903.
    Reply buffer filled.
    2009-10-22 02:07:17 Cleaning up.
    Removing data transfer pipes.
    Removing data transfer pipe /sapdb/CFC/lcbackup/pipe2 ... Done.
    Removing data transfer pipe /sapdb/CFC/lcbackup/pipe1 ... Done.
    Removed data transfer pipes successfully.
    Copying output of Backint for MaxDB to this file.
    - Begin of output of Backint for MaxDB (/tmp/backint4sapdbCFC.out) -
    Data Protection for mySAP(R) Interface between BR*Tools and Tivoli Storage Manager - Version 5, Release 4, Modification 0.0 for Linux x86_64 - Build: 303 compiled on Nov 16 2006 (c) Copyright IBM Corporation, 1996, 2006, All Rights Reserved.
    BKI0008E: The environment variable BI_CALLER is not set correctely. The current value is "DBMSRV"
    usage: backint -p [-u ] [-f ] [-t ] [-i ] [-o ] [-c] where: backint utility user backup | restore | inquire | password | delete file | file_online parameter file for backup utility name of a text file that defines the objects default: STDIN Pool for processing messages and the results of the executed function. default: STOUT
    BKI0020I: End of program at: Thu 22 Oct 2009 02:06:14 AM EDT.
    BKI0021I: Elapsed time: 01 sec.
    BKI0024I: Return code is: 2.
    - End of output of Backint for MaxDB (/tmp/backint4sapdbCFC.out) -
    Removed Backint for MaxDB's temporary output file '/tmp/backint4sapdbCFC.out'.
    Copying error output of Backint for MaxDB to this file.
    - Begin of error output of Backint for MaxDB (/tmp/backint4sapdbCFC.err) -
    - End of error output of Backint for MaxDB (/tmp/backint4sapdbCFC.err) -
    Removed Backint for MaxDB's temporary error output file '/tmp/backint4sapdbCFC.err'.
    Removed the Backint for MaxDB input file '/tmp/backint4sapdbCFC.in'.
    Have finished clean up successfully.
    Also, is there any specification about the user permissions and about how the backup should be run?

    Hi Lars,
    I understand that it's a bit clumsy over here, but I have already raised an OSS message and SAP said that they cannot support this issue with backint. If you can provide me with an email address, I can send you the log files, which would be easier to read.
    My issue is that I am not able to run a backup for the MaxDB of Content Server 640 using the backint tool.
    I have created the configuration file and the parameter file as per the specifications from http://maxdb.sap.com/doc/7_7/a9/8a1ef21e4b402bb76ff75bb559a98a/content.htm and http://maxdb.sap.com/doc/7_7/50/075205962843f69b9ec41f34427be7/content.htm.
    The server is registered to the TSM server. Now when I run the wizard to take the backup using the backint tool, it gives the error:
    "- Begin of output of Backint for MaxDB (/tmp/backint4sapdbCFC.out) -
    Data Protection for mySAP(R) Interface between BR*Tools and Tivoli Storage Manager - Version 5, Release 4, Modification 0.0 for Linux x86_64 - Build: 303 compiled on Nov 16 2006 (c) Copyright IBM Corporation, 1996, 2006, All Rights Reserved.
    BKI0008E: The environment variable BI_CALLER is not set correctely. The current value is "DBMSRV"
    usage: backint -p [-u ] [-f ] [-t ] [-i ] [-o ] [-c] where: backint utility user backup | restore | inquire | password | delete file | file_online parameter file for backup utility name of a text file that defines the objects default: STDIN Pool for processing messages and the results of the executed function. default: STOUT
    BKI0020I: End of program at: Thu 22 Oct 2009 02:06:14 AM EDT.
    BKI0021I: Elapsed time: 01 sec.
    BKI0024I: Return code is: 2.
    - End of output of Backint for MaxDB (/tmp/backint4sapdbCFC.out) -
    Removed Backint for MaxDB's temporary output file '/tmp/backint4sapdbCFC.out'.
    Copying error output of Backint for MaxDB to this file.
    - Begin of error output of Backint for MaxDB (/tmp/backint4sapdbCFC.err) -
    - End of error output of Backint for MaxDB (/tmp/backint4sapdbCFC.err) -
    Removed Backint for MaxDB's temporary error output file '/tmp/backint4sapdbCFC.err'.
    Removed the Backint for MaxDB input file '/tmp/backint4sapdbCFC.in'.
    Have finished clean up successfully."
    I think this should be fine to read...
    Krishna KK
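One way to isolate whether BKI0008E comes from the environment or from the TDP side is to rebuild by hand exactly the call and environment that the dbm.ebp log shows the DBM server using, then run it interactively. A sketch with the values copied from the log (whether TDP for mySAP accepts 'DBMSRV' as BI_CALLER is precisely the open question):

```python
# Environment variables the DBM server sets before calling Backint,
# copied from the dbm.ebp log in this thread.
backint_env = {
    "BSI_ENV": "/sapdb/CFC/lcbackup/apoatlas.env",
    "BI_CALLER": "DBMSRV",
    "BI_REQUEST": "NEW",
    "BI_BACKUP": "FULL",
}
# The Backint call from the log, switched to 'inquire' so it can be run
# without actually writing a backup.
backint_cmd = ["/sapdb/CFC/db/bin/backint", "-u", "CFC", "-f", "inquire",
               "-t", "file", "-p", "/sapdb/CFC/lcbackup/param.cfg"]
# To execute manually on the server:
#   import os, subprocess
#   subprocess.run(backint_cmd, env={**os.environ, **backint_env})
print(" ".join(backint_cmd))
```

If the same BKI0008E appears when run this way, the problem sits between backint and TSM rather than in the MaxDB DBM server.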

  • Multilingual support for BI Publisher reports

    Hi all,
    We are currently using the BI Publisher APIs to generate reports. Our aim is to enable multi-language support for these reports. The translatable strings, based on the locale, are stored in XLIFF files. Using jEdit, UTF-8 encoding is chosen as the default encoding for these XLIFF files. It works fine with French and some other single-byte languages, but with double-byte languages like Chinese and Japanese it throws the error mentioned below.
    Interestingly, with the same type of encoding (UTF-8), the BI Publisher standalone server successfully supports the multi-byte feature.
    java.io.UTFDataFormatException: Invalid UTF8 encoding.
    at oracle.xml.parser.v2.XMLUTF8Reader.checkUTF8Byte(XMLUTF8Reader.java:160)
    at oracle.xml.parser.v2.XMLUTF8Reader.readUTF8Char(XMLUTF8Reader.java:187)
    at oracle.xml.parser.v2.XMLUTF8Reader.fillBuffer(XMLUTF8Reader.java:120)
    at oracle.xml.parser.v2.XMLByteReader.saveBuffer(XMLByteReader.java:450)
    at oracle.xml.parser.v2.XMLReader.fillBuffer(XMLReader.java:2363)
    at oracle.xml.parser.v2.XMLReader.tryRead(XMLReader.java:1087)
    at oracle.xml.parser.v2.XMLReader.scanXMLDecl(XMLReader.java:2922)
    at oracle.xml.parser.v2.XMLReader.pushXMLReader(XMLReader.java:519)
    at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:283)
    at oracle.apps.xdo.template.FOProcessor.mergeTranslation(FOProcessor.java:1922)
    at oracle.apps.xdo.template.FOProcessor.createFO(FOProcessor.java:1706)
    at oracle.apps.xdo.template.FOProcessor.generate(FOProcessor.java:1027)
    <Line 1, Column 1>: XML-20108: (Fatal Error) Start of root element expected.
    oracle.apps.xdo.XDOException
    java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:585)
    at oracle.apps.xdo.common.xml.XSLT10gR1.invokeNewXSLStylesheet(XSLT10gR1.java:641)
    at oracle.apps.xdo.common.xml.XSLT10gR1.transform(XSLT10gR1.java:247)
    at oracle.apps.xdo.common.xml.XSLTWrapper.transform(XSLTWrapper.java:181)
    at oracle.apps.xdo.template.fo.util.FOUtility.generateFO(FOUtility.java:1151)
    at oracle.apps.xdo.template.fo.util.FOUtility.generateFO(FOUtility.java:275)
    at oracle.apps.xdo.template.FOProcessor.createFO(FOProcessor.java:1809)
    at oracle.apps.xdo.template.FOProcessor.generate(FOProcessor.java:1027)
    at glog.server.report.BIPReportRTFCoreBuilder.format(BIPReportRTFCoreBuilder.java:61)
    at glog.server.report.BIPReportCoreBuilder.build(BIPReportCoreBuilder.java:152)
    at glog.server.report.BIPReportCoreBuilder.buildFormatDocument(BIPReportCoreBuilder.java:77)
    at glog.webserver.oracle.bi.BIPReportProcessServlet.process(BIPReportProcessServlet.java:279)
    at glog.webserver.util.BaseServlet.service(BaseServlet.java:614)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:654)
    at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:445)
    at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:379)
    at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:292)
    at glog.webserver.report.BIPReportPreProcessServlet.getDocument(BIPReportPreProcessServlet.java:57)
    at glog.webserver.util.AbstractServletProducer.process(AbstractServletProducer.java:79)
    at glog.webserver.util.BaseServlet.service(BaseServlet.java:614)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at glog.webserver.session.ParameterValidation.doFilter(ParameterValidation.java:29)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at glog.webserver.screenlayout.ClientSessionTracker.doFilter(ClientSessionTracker.java:59)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at glog.webserver.util.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:44)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:263)
    at org.apache.jk.server.JkCoyoteHandler.invoke(JkCoyoteHandler.java:190)
    at org.apache.jk.common.HandlerRequest.invoke(HandlerRequest.java:283)
    at org.apache.jk.common.ChannelSocket.invoke(ChannelSocket.java:767)
    at org.apache.jk.common.ChannelSocket.processConnection(ChannelSocket.java:697)
    at org.apache.jk.common.ChannelSocket$SocketConnection.runIt(ChannelSocket.java:889)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:690)
    at java.lang.Thread.run(Thread.java:595)
    Caused by: oracle.xdo.parser.v2.XMLParseException: Start of root element expected.
    at oracle.xdo.parser.v2.XSLProcessor.reportException(XSLProcessor.java:806)
    at oracle.xdo.parser.v2.XSLProcessor.newXSLStylesheet(XSLProcessor.java:571)
    Thanks,
    Anuradha.
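The "Start of root element expected" (XML-20108) error in the trace above means the XSLT step was handed input that is empty or not XML at all, so the parser reaches the end of input before it ever sees a root element. A minimal Python sketch (illustrative only, not Oracle's parser) reproduces the same class of failure:

```python
import xml.etree.ElementTree as ET

# An empty document has no root element, so the parser fails in
# the same way Oracle's parser does when it reports XML-20108.
try:
    ET.fromstring("")  # empty input: no root element to start
except ET.ParseError as exc:
    print("parse failed:", exc)
```

In practice this usually means the report's XML data source produced an empty or non-XML file, so the fix is upstream of the parser: check what the report query actually wrote.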

    Hi Klaus,
    Even Notepad does not solve the issue!
    Is there any other text editor with reliable encoding handling that can save as UTF-8?
    Thanks,
    Kosstubh
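If no editor at hand writes reliable UTF-8, a small script can do the re-encoding instead. This is a minimal sketch assuming the source file is in a known single-byte encoding (cp1252 here is an assumption; adjust it to whatever the editor actually saved):

```python
def resave_as_utf8(src_path, dst_path, src_encoding="cp1252"):
    """Read a text file in a known legacy encoding and rewrite it as UTF-8."""
    with open(src_path, encoding=src_encoding) as f:
        text = f.read()
    # Use encoding="utf-8-sig" instead if the consuming tool expects a BOM.
    with open(dst_path, "w", encoding="utf-8") as f:
        f.write(text)
```

If the source encoding is unknown, guessing it wrong will silently corrupt non-ASCII characters, so it is worth confirming the original encoding before converting.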

  • Support for Asian Languages

    Well, it looks like support for Asian languages in dynamic
    text fields wasn't included in this version either. I have a simple
    Flash file embedded in an HTML file with two text boxes. Here's the
    code I'm using:
    var strJapanese = "あらわせて";
    var strChinese = "我想你";
    japanese_txt.text = strJapanese;
    chinese_txt.text = strChinese;
    I currently have these fonts installed in the \Windows\Fonts\
    directory:
    mingliu.ttc
    msgothic.ttc
    but all I see is a bunch of squares in the text boxes. I have
    verified that my Pocket PC can use these fonts in other programs.
    Any ideas?
    Thanks,
    BackBayChef

    I used both options, but neither helps. Check the images to understand the problem. I tried Arial Unicode MS as the other font, but it does not display the text correctly.
    Please suggest another option.
