Struggling with exporting data from a Unicode database

Background information
Server: Sun Solaris 5.10; Oracle 10g
Client: Windows 2000; Oracle 10g client, TOAD, Oracle ODBC 10.2.0.1
select * from v$NLS_PARAMETERS
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_CHARACTERSET AL32UTF8
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
We import SAS data (Windows Latin character set) into Oracle, use OWB for ETL, and export the results back to SAS. Per regulatory requirements, character columns cannot exceed a length of 200.
Problem scenario
The data that causes the trouble (200 characters, with a degree sign at the 77th position):
XXX PT PRIOR TO MSFC NOT TO USE WALKER, PT STATED SHE NEEDED IT LAST VISIT 2° BAD HEADACHE DECREASED BALANCE, WHICH WAS LATER FOUND TO BE SINUS INFECTION. ASKED PT NOT TO USE WALKER THIS TIME, PT SAID
The degree sign is U+00B0 in UTF-8, or 0xB0 (176) in ASCII. However, I found that select ascii('°') from dual returns 49480 (i.e., 0xC2 0xB0).
In order to accommodate the import, Source.COMMENTX is VARCHAR2(201). Using OWB, we are mapping this to Target.COVAL which is VARCHAR2(200).
To get around ORA-12899: value too large for column, we use the expression convert(Source.COMMENTX, 'WE8ISO8859P1', 'AL32UTF8').
Although viewing Target.COVAL shows a ¿ at that position (in both TOAD and SQL*Plus), dump(COVAL) confirms the byte at the 77th position is 176:
DUMP(COVAL)
Typ=1 Len=200: [...],32,50,176,32,[...]
Desirable outcome
Store and display the text in a VARCHAR2(200) column without compromising the high-bit ASCII characters, e.g., the degree sign, the micro sign (which resembles the Greek letter mu), the copyright sign, etc.
Questions
1. Is it a wrong assumption that AL32UTF8 supports the high-bit ASCII characters (i.e., characters between 128 and 255)? If not, why do the clients display the inverted question mark instead of the degree sign when executing select chr(176) from dual?
2. The aforementioned DUMP statement seems to confirm that ASCII 0xB0 (i.e., not 0xC2 0xB0, or 0xBF) is being stored in the database at the 77th position. Why do my applications via ODBC interpret and replace it with 0xBF, which is the inverted question mark?
Avenues attempted without the desirable outcome
1. Changing Target.COVAL from VARCHAR2(200) to NVARCHAR2(200) or VARCHAR2(200 CHAR) makes SAS (which accesses the data through ODBC) report the length as 400 or 800, respectively [Note: the vendor claims it is ODBC 3.0 compliant].
2. Through Microsoft's ODBC Test tool, this is the output of describe column all against select COVAL from Target:
icol, szColName, pcbColName, pfSqlType, pcbColDef, pibScale, *pfNullable
1, COMMENTX, 8, SQL_WVARCHAR=-9, 200, 0, SQL_NULLABLE=1

> The degree sign is U+00B0 in UTF-8, or 0xB0 (176) in ASCII. However, I found that select ascii('°') from dual returns 49480 (i.e., 0xC2 0xB0).

Well, U+00B0 represents 'degree sign' in Unicode, and the UTF-8 encoded value for this code point is C2 B0. ASCII does not include a degree sign, and 176 is not an ASCII code value (ASCII covers only 0-127). The function ascii simply returns the decimal form of the encoded value in the character set of the database (which is not necessarily ASCII, or US7ASCII as Oracle calls it).
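As an illustration of that point, here is a quick check you can run against the AL32UTF8 database (a sketch only; the dump output assumes the client passed the degree sign through correctly, and 49480 is the value reported above):

-- The literal '°' arrives as the two UTF-8 bytes C2 B0.
SELECT ascii('°') AS ascii_val,   -- 49480, i.e. 0xC2B0 read as one number
       dump('°')  AS dump_val     -- Typ=96 Len=2: 194,176
FROM   dual;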
> To get around ORA-12899: value too large for column, we use the expression convert(Source.COMMENTX, 'WE8ISO8859P1', 'AL32UTF8').

This part I don't understand. Where are you storing the result? In the same AL32UTF8 database? I think this might be your problem.
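To see why the CONVERT might be the culprit, compare the bytes before and after the call. This is only a sketch of what I would expect in an AL32UTF8 database where '°' is stored correctly as C2 B0:

-- CONVERT re-encodes the two UTF-8 bytes C2 B0 into the single WE8ISO8859P1 byte B0 (176).
-- That byte is then stored as-is in an AL32UTF8 column, where 0xB0 on its own is not a valid UTF-8 sequence.
SELECT dump('°')                                      AS utf8_bytes,    -- 194,176
       dump(convert('°', 'WE8ISO8859P1', 'AL32UTF8')) AS latin1_bytes   -- 176
FROM   dual;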
> Although viewing Target.COVAL shows a ¿ at that position (in both TOAD and SQL*Plus), dump(COVAL) confirms the byte at the 77th position is 176:

Yes, because 176 is an invalid value in UTF-8. U+0079 is encoded as 79, while U+0080 is encoded as C2 80 - notice the "leap" there. If I fed 176 into a UTF-8 decoder I would get "out of range" or NaN back. Similarly, if you have managed to store 176 illegally as an encoded character value in an AL32UTF8 database and try to retrieve it, involving a conversion to the client character set, you get the replacement character ¿, meaning "bad conversion".
> DUMP(COVAL)
> Typ=1 Len=200: [...],32,50,176,32,[...]
Try
select chr(49480) from dual;
- but you need to do this from a tool such as Oracle SQL Developer (it's free) that can handle Unicode output.
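If the regulatory limit really means 200 characters rather than 200 bytes, one alternative to CONVERT is to keep the data in UTF-8 and truncate by characters instead of re-encoding it. A rough sketch, using the table and column names from the question (this still has to be reconciled with the SAS/ODBC length issue you reported for VARCHAR2(200 CHAR)):

-- Character-length semantics: the two-byte degree sign counts as one of the 200 characters.
ALTER TABLE target MODIFY (coval VARCHAR2(200 CHAR));

-- In the OWB mapping, truncate on characters instead of converting character sets.
SELECT substr(commentx, 1, 200) AS coval FROM source;

-- Sanity check in SQL Developer: a correctly stored degree sign dumps as the two bytes 194,176.
SELECT dump(chr(49480)) FROM dual;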

Similar Messages

  • Data extraction from Non-Unicode ECC6 to Unicode SAP BI system

    Hi,
    We have an existing non-Unicode ECC6 system. Currently we are installing a Unicode SAP BI system. Can anyone tell me whether there are any issues with data extraction for SAP BI from a non-Unicode ECC6 system to a Unicode SAP BI system?
    Please also note that our data contains Asian (Korean, Japanese & Chinese) characters.
    Regards,
    Anirban

    Hi Des Gallagher,
    Thank you for your reply.
    I have gone through the notes you suggested, but they address issues related to BW 3.x versions. We are currently on SAP_BW 700 - SP16. Also, among other notes I found note 510882, which might be helpful for custom developments.
    But I am still wondering whether we are going to face major issues with data extraction from the non-Unicode ECC6 system to the Unicode SAP_BW 700 system.
    In case you have any further details, please let me know.
    Thanks in advance.
    Regards,
    Anirban Kundu

  • Target Spry RowID on page with Multiple data sets from another page

    Hi all,
    I am trying to target a specific data item, on a page with multiple data sets, from a link on another page. (I also have to pass the link through Flash, but let's start with the simple part...)
    You can take a look at the site in progress here:
    http://www.3andband.com/TestSite/iframeTest3.html
    From the Home page I want to link to specific news or concert items on the News page.
    I have been trying to get SpryURLUtils to do it but I can't seem to get it working.
    Any help would be greatly appreciated.
    Thanks!
    Ben

    Did you try whether it even passes the row value, with a simple alert? alert(params.row)
    Also, maybe you need to reorder the scripts to this:
    <script src="../SpryAssets/SpryURLUtils.js" type="text/javascript"></script>
    <script src="../SpryAssets/xpath.js" type="text/javascript"></script>
    <script src="../SpryAssets/SpryData.js" type="text/javascript"></script>
    <script src="../SpryAssets/SpryCollapsiblePanel.js" type="text/javascript"></script>
    <script src="../SpryAssets/SpryEffects.js" type="text/javascript"></script>
    <script src="../SpryAssets/SpryAccordion.js" type="text/javascript"></script>
    and your JS script:
    var params = Spry.Utils.getLocationParamsAsObject();
    var dsConcerts = new Spry.Data.XMLDataSet("includes/concerts.xml", "Concerts/concert");
    dsConcerts.setColumnType("image", "image");
    var dsNews = new Spry.Data.XMLDataSet("includes/news.xml", "News/item");
    // Set an observer so that when the data is loaded, we update the current row to the URL param value
    dsNews.addObserver({ onPostLoad: function(ds, type) { dsNews.setCurrentRow(params.row); } });
    function MM_effectBlind(targetElement, duration, from, to, toggle)
    {
        Spry.Effect.DoBlind(targetElement, {duration: duration, from: from, to: to, toggle: toggle});
    }
    So the URL params get loaded before the data.

  • Help with exporting data from pdf form

    I have about 100 PDF forms that I created in Adobe FormsCentral and distributed as PDF forms (rather than on the web). I am trying to export the data into a spreadsheet, but when I export it, the fields are all jumbled in the CSV file, i.e., they are not in the same order. I need to export the data all together, so I'm going to the Forms menu and selecting "Manage form data" and then "Merge data files into spreadsheet". I tried exporting a single file, but that gave me something really weird.
    Please help, I have a deadline next week to analyze this data and can't make sense of it once it is exported to a spreadsheet.

    Would you please share your form with me and send me one of your pdf forms and some of the csv files?
    You can share your form by doing the following:
    1. Click on the “Share” icon on the bottom left corner.
    2. Click on “Add Collaborator” on the popup menu.
    3. Enter [email protected] under “People to share with”.
    4. Set subject to "Export data from pdf form"
    5. Click the “Share” button on the bottom right of the dialog.
    Thanks
    Ken

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While both the export and the import work fine from the command line, it fails with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how I should achieve the same as above with the Oracle Data Pump API?
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
      dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
      dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
      dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    END;
    Also, in the API I want to know how to export and import multiple tables (selective tables only) using a single criterion like "WHERE TIME_NUM > 1204884480100".

    Yes, I have read the Oracle documentation.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA and it is in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")'
    The above is the condition with which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions and only for selected tables.
    Any help is highly appreciated.
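    For what it is worth, below is a rough sketch of how a filtered TABLE-mode job could be put together with the API. It is untested and rests on a few assumptions: the table and schema names are passed in uppercase, the SUBQUERY value is just the WHERE-clause text (no surrounding parentheses or extra quoting), and the DATA_PUMP_DIR directory object exists and is writable:

    DECLARE
      l_dp_handle NUMBER;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation => 'EXPORT',
        job_mode  => 'TABLE',
        job_name  => 'LDEV_FILTERED_EXPORT');

      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_filtered.dmp',
        directory => 'DATA_PUMP_DIR');

      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'ldev_filtered.log',
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

      -- Restrict the TABLE-mode job to the schema and tables of interest.
      DBMS_DATAPUMP.metadata_filter(
        handle => l_dp_handle,
        name   => 'SCHEMA_EXPR',
        value  => 'IN (''XPSLPERF'')');

      DBMS_DATAPUMP.metadata_filter(
        handle => l_dp_handle,
        name   => 'NAME_EXPR',
        value  => 'IN (''LDEV_PERF_DATA'')');

      -- Row filter: a plain WHERE clause, applied to the named table.
      DBMS_DATAPUMP.data_filter(
        handle      => l_dp_handle,
        name        => 'SUBQUERY',
        value       => 'WHERE XP_TIME_NUM > 1204884480100',
        table_name  => 'LDEV_PERF_DATA',
        schema_name => 'XPSLPERF');

      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    /

    The same DATA_FILTER call can be repeated once per table if several tables need the same condition.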

  • Problem with Exporting Data to Excel

    Hi Everyone,
    I have a problem with exporting data to an Excel file. I am using an XP system in German, which uses the comma "," as the decimal point, and I also set "Use localized decimal point*" to true under the Front Panel options. But the exported Excel file cannot recognize (or just ignores) the comma for a whole column (flow rate); for example "1,234" (1.234 on a non-German system) becomes 1234. But if I open the text file where the data came from by calling Excel directly, there is no problem. Is this a bug in LabVIEW, or did I forget some settings?
    Thanks a lot!
    Le
    P.S.: The LabVIEW version is 8.2.1 and the Office version is Office 2007.

    Hi Le,
    Sounds familiar; here in the Netherlands we have the same problem. That is why we use the English versions of XP and Office.
    I don't quite understand how you interface with Excel: directly with ActiveX or through a CSV file?
    Can you explain a bit more?

  • Getting error "Cannot create a BACPAC from a file that does not contain exported data." from SqlManagementClient.Dac.ImportAsync

    We're trying to import a dacpac to Azure via the new SqlManagementClient DacOperations ImportAsync API, and I get an exception with the error: "Cannot create a BACPAC from a file that does not contain exported data."
    This same dacpac imports fine using an alternate but less friendly API from sql server's tooling. We'd like to use the new management SDK instead for various reasons.

    Hi Kyle A Wilt,
    I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected due to the job transfer. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    Regards,
    Sofiya Li
    TechNet Community Support

  • Issue with exporting the image from camera and scanner

    Hello,
    When importing JPG digital photos from my Konica Minolta SLR digital camera, the file size is typically 3-4 megabytes. If I import and then export the file out of Lightroom 2.6, the size of the file is not changed (at least not very much).
    However, when scanning 35 mm color negative films using a Plustek OpticFilm 7600i with SilverFast-SE, the JPG file has a size of about 5 megabytes (generated by SilverFast-SE). If I take this file, import it into Lightroom and then export it out again, the size increases to about 12 megabytes. What could be the reason for this?
    Browser:  Microsoft IE
    Operating System: Windows 7
    Scanner / driver  version: SilverFast-SE v6.6.0r6
    Has it ever worked? If so, what's  changed? (provide comments in description field): Not Applicable
    Same results with different file?: Not Applicable
    Same results with  different computer?: Not Applicable
    Recent System Hardware or  Software change?: Not Applicable
    Regards
    JPSingh

    It is the compression settings on export that define the size of an image exported from LR; the amount of compression applied will produce differing file sizes according to the type of image, the amount of sharpening, and many other factors. To compare actual file sizes (rather than compressed sizes), see what it says in Lightroom or when the image is open in Photoshop. Two images with the same pixel dimensions can have very different file sizes when compressed. When open, in PS for instance, the file size will be the same for images of the same pixel dimensions and bit depth.

  • Export data forms from unix

    Hi,
    We have installed Planning on Unix. Can anybody help with how to export data forms?
    We have tried to export the normal way, but it's not working. See below what it returns after entering the correct format:
    server(user):./FormDefUtil.sh export "Form1" localhost admin passwrd TestApp
    Usage: HspFormDefUtil <import/export> <filename/formname/-all> <server> <username> <password> <application>
    Please help with this.
    Note: I heard that we can't run this from a Unix box. If that is correct, then how can we export the forms?
    Thanks,

    Hi John,
    Thanks for quick reply.
    We are using Planning version 9.3.1.1.
    It is not updating any log; it just outputs what is shown below and then exits. It is not updating any log (Planning log/FormDefUtil.log).
    server(user):./FormDefUtil.sh export "Form1" localhost admin passwrd TestApp
    Usage: HspFormDefUtil <import/export> <filename/formname/-all> <server> <username> <password> <application>
    server(user):
    Even FormDefUtil.log is not updated with anything.
    Please help.

  • Data load issue with export data source - BW 3.5

    Hi,
    We are facing issues loading data with the help of an export data source.
    We have created an export data source for the 0PCA_C01 cube. With the help of this export data source, we are loading data into another custom cube. The scenario works fine on the development server.
    But when we transported the objects to the quality server, data is not getting loaded into the custom target cube.
    It is extracting zero records. All transports are OK, and we generated the export data source in quality before the transports. We also regenerated the export data source after the transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but data is not getting extracted.
    RSA3 for the 80PCA_C01 data source isn't extracting any records in quality. Records are extracted in development. We are on BW 3.5 with patch level 19.
    Please guide us to resolve the issue.
    Thanks,
    Aditya

    Hi
    Make sure that you have the relevant roles & authorizations at Quality/PRS.
    You have to transport the source cube first and then generate the export data source in QAS. Then replicate the data sources for the BW QAS source system and make sure the replicated data source exists in QAS. Only then can you transport the new update rules for the second cube.
    Hope it helps.

  • Call a method with complex data type from a DLL file

    Hi,
    I have a Win32 API with a DLL file, and I am trying to call some of its methods from LabVIEW. To do this, I used the Import Library Wizard, and everything works as expected. The only problem I have is with a method whose return type is a complex data type (a vector). According to this link, the Import Library Wizard cannot import methods with complex data types.
    The name of this method is this:   const std::vector< BlackfinInterfaces::Count > Counts ()
    where Count is a structure defined as below:
    struct Count
    {
       Count() : countTime(0) {}
       std::vector<unsigned long> countLines;
       time_t countTime;
    };
    It seems that I should manually use the Call Library Function Node. How can I configure parameters for the above method?

    You cannot configure Call Library Function Node to call this function.  LabVIEW has no way to pass a C++ class such as vector to a DLL.

  • Creating a dynamic nested menu with xml data received from a webservice

    I need to create a dynamic menu based on XML returned by a web service.
    The XML comes basically in this format:
    [quote]
    <resposta>
        <status>Success</status>
        <mensagem>Whatever</mensagem>
        <dados>
            <projeto nome="name" cliente="client name">
                <atividade nome="name">
                    <etapa>
                        <nome>name</nome>
                         <other_attributes>...</other_attributes>
                    </etapa>
                    (other etapas)
                 </atividade>
                 (other atividades)
            </projeto>
            (other projetos)
        </dados>
    </resposta>
    [/quote]
    What I need is to create a menu like:
    - Projeto.Nome - Projeto.Cliente:
        - Atividade.nome:
            (start button) etapa1
            (start button) etapa2
    - Projeto2.Nome - Projeto2.Cliente:
        - Atividade.nome:
            (start button) etapa1
            (start button) etapa2
    And so on...
    I've tried using an HTTPService and a DataGroup; the code below works fine for displaying the projeto names:
    [quote]
    <s:HTTPService id="loginService"
                            url="http://timesheet.infinitech.local/services"
                            method="POST" contentType="application/xml"
                            result="handleLoginResult();"
                            fault="handleFault(event);" >
                            <s:request xmlns="">
                                <requisicao>
                                    <tipo>login</tipo>
                                    <usuario>{campoUsuario.text}</usuario>
                                    <senha>{campoSenha.text}</senha>
                    </requisicao>
                </s:request>
    </s:HTTPService>
    and the DataGroup:
    <s:DataGroup dataProvider="{tarefasService.lastResult.resposta.dados.projeto}" width="100%" y="100" x="20"
                         includeIn="Principal">
                <s:layout>
                    <s:VerticalLayout />
                </s:layout>
                <s:itemRenderer>
                    <fx:Component>
                        <s:ItemRenderer>
                            <s:layout>
                                <s:HorizontalLayout />
                            </s:layout>
                            <s:Button />
                            <s:Label text="{data.nome}" />
                        </s:ItemRenderer>
                    </fx:Component>
                </s:itemRenderer>
            </s:DataGroup>
    [/quote]
    I have then tried including another DataGroup inside the DataGroup item renderer, but I just couldn't get it to work, and I tried it in a lot of ways... (basically, it would be a DataGroup with dataProvider={data.atividade}).
    Can anyone tell me how to get this to work?
    I've uploaded an example xml, you can use it as the url for the HTTPService:
    http://www.pdinfo.com.br/example.xml
    Thanks in advance.

    Hi,
    A lot of the information you need is in Adobe's scripting guide http://www.adobe.com/go/learn_lc_scriptingReference. There is also a very useful Adobe guide to calculations and scripts (and while it is for version 6, it is still very good because of the way it is laid out): http://partners.adobe.com/public/developer/en/tips/CalcScripts.pdf
    The Javascript could be used in the Layout: Ready event.
    For italic font:
    if (...some test...)
         this.font.posture = "italic";
    else
         this.font.posture = "normal";
    For bold font:
    if (...some test...)
         this.font.weight = "bold";
    else
         this.font.weight = "normal";
    The script will change the font for the complete field. I don't think you can change parts of a field. You can also change font colour and font type (the guides above will help).
    Good luck,
    Niall

  • Data errors/changes in unicode database Once all code is unicode compliant

    Hi All,
    This is regarding a Unicode project.
    We have currently made all the programs Unicode compliant,
    and the database we are using is not a Unicode database.
    We are now moving the entire code to the Unicode database system.
    1> Could anyone tell us what kind of data errors might be encountered due to this new database system?
    2> What kind of changes regarding the format/data might we observe in the generated output files?
    Any expertise and experience with a similar upgrade will be very helpful.
    Thank you all in advance

    Hi Kumar,
    each code page encodes characters into a binary representation. ASCII is maybe the best known. It encodes 128 characters with seven bits. The first 32 characters are control characters for printers and terminals, like carriage return and bell. Then there are some special characters like space and comma, followed by digits and the characters of the Roman alphabet in upper case and lower case. Unicode is another code page, defined in the Unicode standard documentation. Because Unicode characters are wider than one byte (the current standard contains almost 100,000 characters), different encodings are used in applications. The most used encoding is probably UTF-8, which is used by DB2 and Oracle. MaxDB uses UTF-16, which uses much more space for the most commonly used characters. Languages use characters from a code page to build words. You may have multiple code pages in one system (MDMP) or a Unicode system which supports all languages with a single code page.
    I hope this helps you to understand the difference between a code page and a language. You may also want to check out http://www.asciitable.com and http://unicode.org.
    Best regards
    Ralph

  • Load data warehouse from non-Oracle database

    Hi
    I need to load a data warehouse (in an Oracle 8.1.6 database) using PostgreSQL database tables.
    The question is: can Oracle Warehouse Builder identify this database as a source module?
    Or is it better to generate text files from the PostgreSQL tables, load a staging database (in Oracle) with the same structure as the PostgreSQL tables, and after that load my data warehouse?
    Best Regards
    Honorio Cardozo Jr

    You can identify Oracle sources from Oracle 7.3 and up. That is, yes, you can go directly against the database and you do not need to extract into flat files.
    Mark.

  • Data fetched from buffer or database

    Hi,
    How can I check whether, in a select query, the data is being fetched from the buffer or from the database? Is there a method to trace that, or is it just a setting we define while creating the table?
    Please suggest.

    Hi,
    arun purushothaman wrote:
    > Is there a method to trace that, or is it just a setting we define while creating the table?
    > Please suggest.
    sure. ST05.
    SQL Trace shows everything that is going to the database. This means those statements are NOT using the buffer. The SQL Trace lines are yellow.
    Buffer Trace shows everything that is going to the buffer. This means those statements are NOT going to the database. The Buffer Trace lines are blue.
    Kind regards,
    Hermann
