Export with data length semantics

Hello,
I have the following problem.
I have a table abcd which contains two VARCHAR2 columns with different data length semantics (one BYTE, one CHAR). The character set is single-byte, say WE8MSWIN1252, so data length semantics should not be a problem. Should not; details below.
So this would be:
create table abcd (a_char VARCHAR2(2 CHAR), a_byte VARCHAR2(2 BYTE));
After that I export the table via exp. I am not setting the NLS_LENGTH_SEMANTICS environment variable, so BYTE is used.
In the dump file the data length semantics for the BYTE column is omitted, since I exported it with BYTE:
create table abcd (a_char VARCHAR2(2 CHAR), a_byte VARCHAR2(2));
After that I "accidentally" import it with data length semantics set to CHAR, and the table now looks like this:
abcd
a_char VARCHAR2(2 CHAR)
a_byte VARCHAR2(2 CHAR)
The same happens vice versa when using CHAR for export and BYTE for import...
In single-byte character sets this might not be much of a problem, since one CHAR equals one BYTE, but...
If I compile PL/SQL against the original table and then run it against the table that comes out of export/import, I get an ORA-04062 and have to recompile.
That would not be a problem if the PL/SQL I compile lived in the database. The big problem is that the ORA-04062 occurs in Forms, where it is difficult for me to recompile (I would have to transfer all the sources to the customer and compile there).
Is there any possibility to export the data length semantics regardless of which environment variable is set?
The database version is 9.2.0.6, but if a solution exists in higher versions I would also be happy to hear about it...
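Side note: one way to verify which semantics actually arrived after an import is the data dictionary; a minimal check using the CHAR_USED column of USER_TAB_COLUMNS ('B' = byte semantics, 'C' = char semantics):
-- Show the length semantics each column actually uses after the import.
select column_name, data_type, data_length, char_length, char_used
from   user_tab_columns
where  table_name = 'ABCD';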
many thanks,
regards

I can't reproduce your problem:
SQL> show parameter nls_length_semantics
NAME                                 TYPE        VALUE
nls_length_semantics                 string      BYTE
SQL> create table scott.demo( col1 varchar2(10 byte), col2 varchar2(10 char) );
SQL> describe scott.demo
Name                                      Null?    Type
COL1                                               VARCHAR2(10)
COL2                                               VARCHAR2(10 CHAR)
$ export NLS_LENGTH_SEMANTICS=BYTE
$ exp scott/tiger file=scott.dmp tables=demo
SQL> drop table scott.demo;
$ export NLS_LENGTH_SEMANTICS=CHAR
$ imp scott/tiger file=scott.dmp
SQL> describe scott.demo
Name                                      Null?    Type
COL1                                               VARCHAR2(10 BYTE)
COL2                                               VARCHAR2(10)
SQL> alter session set nls_length_semantics=byte;
SQL> describe scott.demo
Name                                      Null?    Type
COL1                                               VARCHAR2(10)
COL2                                               VARCHAR2(10 CHAR)
Can you post a test like mine?
Enrique
PS If you have access to Metalink, read Note:144808.1 Examples and limits of BYTE and CHAR semantics usage. From 9i and up, imp doesn't read nls_length_semantics from the environment.
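A common workaround (my suggestion, an assumption rather than something from the note): pre-create the table with the exact semantics you want, then let imp load only the rows. With IGNORE=Y, imp skips the failing CREATE TABLE and inserts the data into the pre-created table:
SQL> create table scott.demo( col1 varchar2(10 byte), col2 varchar2(10 char) );
$ imp scott/tiger file=scott.dmp tables=demo ignore=y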

Similar Messages

  • Data Length Semantics?

    Dear members,
    What is the Data Length Semantics property of a text item?
    What is its behavior?
    Thanks,
    Muhammad Nadeem
    [email protected]

    Values: CHAR, BYTE, null
    Refer to the GET_ITEM_PROPERTY built-in.
    Usage Notes
    - If a null value is specified, byte semantics will be used unless the environment variable NLS_LENGTH_SEMANTICS is set to CHAR when the form is compiled.
    - When the Synchronize with Item property is set, DATA_LENGTH_SEMANTICS will be ignored in a subordinate mirror item. The DATA_LENGTH_SEMANTICS property is always taken from the master mirror item. A compiler (generator) warning will be issued if a non-null value is specified in a subordinate mirror item.
    - A compiler (generator) warning will also be issued if a non-null value is specified in an item whose datatype is neither CHAR, ALPHA, nor LONG.
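    To read the property at runtime, a minimal sketch (my addition; it assumes DATA_LENGTH_SEMANTICS is accepted as a GET_ITEM_PROPERTY constant, as the usage notes above imply, and 'MYBLOCK.MYITEM' is a placeholder item name):
    DECLARE
      v_semantics VARCHAR2(10);
    BEGIN
      -- Returns 'CHAR', 'BYTE', or NULL; null means byte semantics unless
      -- the form was compiled with NLS_LENGTH_SEMANTICS=CHAR.
      v_semantics := Get_Item_Property('MYBLOCK.MYITEM', DATA_LENGTH_SEMANTICS);
      Message('Data length semantics: ' || NVL(v_semantics, 'null'));
    END;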

  • Querying CHAR columns with character length semantics unreliable

    Hi again,
    It appears that there is a bug in the JDBC drivers whereby it is highly unlikely that the values of CHAR columns that use character length semantics can be accurately queried using ResultSet.getString(). Instead, the drivers return the value padded with space (0x20) characters out to a number of bytes equal to the number of characters multiplied by 4. The number of bytes varies depending on the number and size of any non-ASCII characters stored in the column.
    For instance, if I have a CHAR(1) column, a value of 'a' will return 'a   ' (4 characters/bytes are returned), a value of '\u00E0' will return '\u00E0  ' (3 characters / 4 bytes), and a value of '\uE000' will return '\uE000 ' (2 characters / 4 bytes).
    I'm currently using version 9.2.0.3 of the standalone drivers (ojdbc.jar) with JDK 1.4.1_04 on Redhat Linux 9, connecting to Oracle 9.2.0.2.0 running on Solaris.
    The following sample code can be used to demonstrate the problem (the DDL in the comment at the top of the file must be executed first):
    import java.sql.*;
    import java.util.*;

    /*
     * This sample demonstrates a bug in the Oracle JDBC drivers where it is not
     * possible to accurately query the values of CHAR columns that use character
     * length semantics. The inclusion of the VARCHAR2 column is just a control.
     *
     * Run this DDL first:
     *
     * CREATE TABLE TMP2 (
     *     TMP_ID NUMBER(10) NOT NULL PRIMARY KEY,
     *     TMP_CHAR CHAR(10 CHAR),
     *     TMP_VCHAR VARCHAR2(10 CHAR)
     * );
     */
    public class ClsCharSelection
    {
        private static String createString(char character, int length)
        {
            char characters[] = new char[length];
            Arrays.fill(characters, character);
            return new String(characters);
        } // private static String createString(char, int)

        private static void insertRow(PreparedStatement ps, int key, char character)
            throws SQLException
        {
            ps.setInt(1, key);
            ps.setString(2, createString(character, 10));
            ps.setString(3, createString(character, 10));
            ps.executeUpdate();
        } // private static void insertRow(PreparedStatement, int, char)

        private static void analyseResults(PreparedStatement ps, int key)
            throws SQLException
        {
            ps.setInt(1, key);
            ResultSet results = ps.executeQuery();
            results.next();
            String tmpChar = results.getString(1);
            String tmpVChar = results.getString(2);
            System.out.println(key + ", " + tmpChar.length() + ", '" + tmpChar + "'");
            System.out.println(key + ", " + tmpVChar.length() + ", '" + tmpVChar + "'");
            results.close();
        } // private static void analyseResults(PreparedStatement, int)

        public static void main(String argv[]) throws Exception
        {
            Driver driver = (Driver) Class.forName(
                "oracle.jdbc.driver.OracleDriver").newInstance();
            DriverManager.registerDriver(driver);
            Connection connection = DriverManager.getConnection(
                argv[0], argv[1], argv[2]);
            PreparedStatement ps = null;
            try
            {
                ps = connection.prepareStatement("DELETE FROM tmp2");
                ps.executeUpdate();
                ps.close();
                ps = connection.prepareStatement(
                    "INSERT INTO tmp2 ( tmp_id, tmp_char, tmp_vchar " +
                    ") VALUES ( ?, ?, ? )");
                insertRow(ps, 1, 'a');
                insertRow(ps, 2, '\u00E0');
                insertRow(ps, 3, '\uE000');
                ps.close();
                ps = connection.prepareStatement(
                    "SELECT tmp_char, tmp_vchar FROM tmp2 WHERE tmp_id = ?");
                analyseResults(ps, 1);
                analyseResults(ps, 2);
                analyseResults(ps, 3);
                ps.close();
                connection.commit();
            }
            catch (SQLException e)
            {
                e.printStackTrace();
            }
            connection.close();
        } // public static void main(String[])
    } // public class ClsCharSelection
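    As a server-side cross-check (my addition, not part of the original report), DUMP shows what is actually stored in the column, which separates a storage problem from a driver problem; a minimal query against the TMP2 table above:
    -- If the stored value is the clean 10-character padded string, any extra
    -- padding seen through getString() is being added by the driver.
    SELECT tmp_id, DUMP(tmp_char, 1016) AS char_dump
    FROM   tmp2;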

    FYI, this has been mentioned as early as November last year:
    String with length 1 became 4 when nls_lang_semantics=CHAR
    and was also brought up in February:
    JDBC thin driver pads CHAR col to byte size when NLS_LENGTH_SEMANTICS=CHAR

  • Export with DATA and DDL

    Hi,
    in version 8i, I have to export a table with DATA and DDL. I wonder if the following command line is enough, or whether I should add some other option:
    exp sys/***@DB TABLES=MYTABLE file=d:\MYTABLE.dmp
    Many thanks.

    or I should add some other option ?
    exp sys/***@DB TABLES=MYTABLE file=d:\MYTABLE.dmp
    Yup, this is enough; it will bring the DDL for the indexes on table MYTABLE as well.
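    For completeness (my addition): classic exp exports rows, indexes, grants, triggers and constraints by default, so the same command with those defaults spelled out would be:
    exp sys/***@DB TABLES=MYTABLE FILE=d:\MYTABLE.dmp ROWS=Y INDEXES=Y GRANTS=Y TRIGGERS=Y CONSTRAINTS=Y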
    Virag

  • Exporting with precise length?

    Hey everyone,
    Relatively new to Logic, but not a total DAW dummy. The problem I'm having is kind of infuriating but otherwise I love the software.
    Basically, I am building 4- and 8-bar loops in Logic at a tempo of 103 BPM. I have uploaded my samples to a beat sequencer and am triggering them via MIDI. I then quantize the samples according to Logic's 103 BPM grid. Once I have a loop of either 4 or 8 bars, I export its track as a WAV file. Under normal circumstances, I then load the WAV file, which should be precisely 4 or 8 bars long at 103 BPM, into my sampling pad (Roland SPD-SX). With the click on the Roland set to 103, I should be able to play the loop and have it repeat indefinitely without it "creeping".
    What I described above would be my best-case scenario. Thankfully, this has worked many times for me. In fact, I can double-check that it's working by using the Tempo Match feature on the Roland-- if I tell the Roland that the loop is 8 bars long, it infers that the tempo is 103 based on the length of the WAV. This works perfectly.
    Until today. I did my same system as always, exactly as described above, no hitches. But when I load this new loop in the Roland, it creeps and gets ahead. The Tempo Match check? Roland says my loop is 103.4 BPM. I was careful to not make a mistake in Logic; Logic told me it was 103.
    I have since tried to make this particular loop 4 or 5 times and it's always the same. The Roland thinks it's at 103.4, which I would assume means that Logic is truncating the length of the export WAV. I don't think the problem is that Logic's metronome is faulty. My ears tell me the samples within the loop are perfectly in time at 103, but at the end of the 8 bars, the loop starts again just a hair early, getting further and further ahead each time the loop recycles. Which sounds very much like Logic is truncating the length of the export WAV.
    I called Roland and confirmed the issue is not with the sampling pad. Does anyone know why this is happening now and never happened before, and could someone please tell me how I can be more precise with the export length in Logic so that my pad's Tempo Match feature correctly reads the length of the WAV loop to arrive at 103 (instead of 103.4) BPM?
    Thanks in advance to anyone who takes the time.
    Jake

    Logic's track export ends the exported file at the end of the audio, so if the last sound ends before the end of the four bars, it will not add the small amount of silence needed to make the file come out even.
    Set your left and right locators precisely at the loop start and end (4 or 8 bars), turn cycle on, and bounce the loop instead of exporting it.

  • Select Option with data length more than 45 character

    Hi everyone,
    Kindly suggest how to use a select-option when the data type has a length of 128 characters, because the input is truncated after 45 characters. The requirement is to pass 1-7 files in the select-option input.
    Thanks in advance.

    Hi,
    Select-options don't support the STRING data type.
    The VISIBLE LENGTH addition also does not work in this case, and for multiple selection the input length remains the same, i.e. 45 characters.
    Thanks.

  • Help on export sybase iq tables with data and import in another database ?

    Help on export Sybase iq 16 tables with data and import into another database ?

    Hi Nilesh,
    If you have the table/index create commands (DDLs), you can create them in Developer and import the data using one of the methods below:
    - Extract / Load table
    - Insert location method: requires the IQ servers to be entered in the interfaces file
    - Backup/Restore: copies the entire database content
    If you do not have the DDLs, you can generate them using IQ Cockpit or SCC.
    http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01773.1604/doc/html/san1288042631955.html
    http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01840.1604/doc/html/san1281564927196.html
    Regards,
    Tayeb.

  • Is it possible to export interactive textfields with data?

    Hey guys,
    I have the following problem. I'm creating our variable price list for our customers. "Variable" means an interactive PDF where our customers can edit the prices, product description, etc.
    I can paginate our catalogue with "Easy Catalogue" including all interactive textfields. It looks like this:
    https://www.dropbox.com/s/kn81v9db69dx0a7/screenshot_pagination_incl_interactive_textfield s.png?dl=0
    The problem now is that when I export it as an interactive PDF, the data of the text fields is not exported -> Dropbox - screenshot_exported_interactive_form.png
    It's just blank.
    Does anybody have a solution for this problem? I need the fields to be preset with the data!
    Is it generally possible to export interactive text fields with data?
    Any idea is appreciated!
    cheers from Austria,
    Chris

    I'm using Easy Catalog: http://www.65bit.com/software/easycatalog/
    But I think it has something to do with InDesign's interactive PDF export properties.
    What I also tried:
    I converted an ordinary text field with text into an interactive text field -> then exported as an interactive PDF -> also a blank editable field!

  • How can i export the data to excel which has 2 tables with same number of columns & column names?

    Hi everyone, again landed up with a problem.
    After trying a lot to do it myself, I finally decided to post here.
    I have created a form in Form Builder 6i in which, on clicking a button, the data gets exported to an Excel sheet.
    It is working fine with a single table. The problem now is that I am unable to do the same with 2 tables, because both tables have the same number of columns and the same column names.
    Below are the two tables; each has exactly the same columns:
    Table-1 (MONTHLY_PART_1): SL_NO, COMP, DUE_DATE, U-1, U-2, U-4, U-20, U-25
    Table-2 (MONTHLY_PART_2): SL_NO, COMP, DUE_DATE, U-1, U-2, U-4, U-20, U-25
    Since both tables have the same column names, I'm getting the following error:
    Error 402 at line 103, column 4
      alias required in SELECT list of cursor to avoid duplicate column names.
    So how can I export the data to Excel when it comes from 2 tables with the same number of columns and column names?
    Should I paste the code? Should I post this query in the 'SQL and PL/SQL' forum?
    Help me with this please.
    Thank You.

    You'll have to *alias* your columns, not prefix them with the table names:
    $[CHE_TEST@asterix1_impl] r
      1  declare
      2    cursor cData is
      3      with data as (
      4        select 1 id, 'test1' val1, 'a' val2 from dual
      5        union all
      6        select 1 id, '1test' val1, 'b' val2 from dual
      7        union all
      8        select 2 id, 'test2' val1, 'a' val2 from dual
      9        union all
    10        select 2 id, '2test' val1, 'b' val2 from dual
    11      )
    12      select a.id, b.id, a.val1, b.val1, a.val2, b.val2
    13      from data a, data b
    14      where a.id = b.id
    15      and a.val2 = 'a'
    16      and b.val2 = 'b';
    17  begin
    18    for rData in cData loop
    19      null;
    20    end loop;
    21* end;
      for rData in cData loop
    ERROR at line 18:
    ORA-06550: line 18, column 3:
    PLS-00402: alias required in SELECT list of cursor to avoid duplicate column names
    ORA-06550: line 18, column 3:
    PL/SQL: Statement ignored
    $[CHE_TEST@asterix1_impl] r
      1  declare
      2    cursor cData is
      3      with data as (
      4        select 1 id, 'test1' val1, 'a' val2 from dual
      5        union all
      6        select 1 id, '1test' val1, 'b' val2 from dual
      7        union all
      8        select 2 id, 'test2' val1, 'a' val2 from dual
      9        union all
    10        select 2 id, '2test' val1, 'b' val2 from dual
    11      )
    12      select a.id a_id, b.id b_id, a.val1 a_val1, b.val1 b_val1, a.val2 a_val2, b.val2 b_val2
    13      from data a, data b
    14      where a.id = b.id
    15      and a.val2 = 'a'
    16      and b.val2 = 'b';
    17  begin
    18    for rData in cData loop
    19      null;
    20    end loop;
    21* end;
    PL/SQL procedure successfully completed.
    cheers

  • How to export some data from the tables of an owner with integrity?

    Hi to all,
    How can I export some data from the tables of one owner while maintaining integrity?
    I want to bring some data from all tables of a single owner of the production database into the development environment.
    My initial requirements are: filtering on company code (emp), contract status (status) and/or effective contract settlement date (dt_liq_efetiva) - a small amount of data for the developers.
    These three fields are present in the main system table (the contracts table). So I thought about...
    - creating a temporary table from the results of a query against the contracts table;
    - and then using this temporary table as a reference to fetch the data from the other tables of the owner while maintaining integrity. But how? I have not found the answer, because: what to do when there is no possible join between the contracts table and some other table?
    I am considering querying the table names, foreign keys and columns involved, and building dynamic SQL from that. Conceptually, something like:
    select r.constraint_name "FK name",
           rc.table_name "FK table",
           rc.column_name "FK column",
           p.constraint_name "Referenced name",
           pc.table_name "Referenced table",
           pc.column_name "Referenced column"
    from   all_constraints r
    join   all_constraints p
           on  p.owner = r.r_owner
           and p.constraint_name = r.r_constraint_name
    join   all_cons_columns rc
           on  rc.owner = r.owner
           and rc.constraint_name = r.constraint_name
    join   all_cons_columns pc
           on  pc.owner = p.owner
           and pc.constraint_name = p.constraint_name
           and pc.position = rc.position
    where  r.constraint_type = 'R'
    and    p.constraint_type in ('P', 'U')
    and    r.owner = 'OWNERNAME';
    -- + dynamic SQL
    If anyone has any suggestions and/or reusable code, thank you very much!
    After resolving this impasse I intend to build the inserts with UTL_FILE from a table, and to create another program to read and replay them in the development environment.
    Thinking...
    Let's Share!
    My thanks in advance,
    Philips

    Thanks, Peter.
    Well, I am working with release 9.2.0.8.0, but the plan is to migrate to 10g this year. So my questions are:
    With Data Pump I can export data from just the tables owned by me (SCHEMAS=MYOWNER), parameterizing the volume of data (SAMPLE) and per-table filters (QUERY), right? But when parameterizing the contracts table with QUERY="WHERE status NOT IN (2,6) ORDER BY contract":
    1º - does Data Pump automatically search for related data in the other tables of the owner? E.g. the parcel table has X records related (FK) to Y contracts not in (2,6): will X * SAMPLE records be randomly exported?
    2º - for tables without a relation (FK) that are within the owner (MYOWNER), is the data exported based only on the SAMPLE parameter?
    Once again, thank you,
    Philips
    Reading Oracle Docs...
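    For reference, the two parameters combine like this on the command line (my sketch; apart from MYOWNER and the status filter, all names are placeholders, and whether QUERY and SAMPLE may be combined on the same table is worth verifying in the docs):
    expdp myowner/pw SCHEMAS=MYOWNER SAMPLE=10 DIRECTORY=dp_dir DUMPFILE=dev_subset.dmp QUERY=CONTRACTS:\"WHERE status NOT IN (2,6)\"
    SAMPLE=10 asks for roughly 10 percent of each table's rows; the QUERY clause applies only to the CONTRACTS table. Putting the parameters in a parfile avoids the shell quoting issues.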

  • Transferring data to a flat file with a length greater than 255 bytes??

    Is there a way to do this? At the end of the month, my dataset will reach a length of anywhere between 271 and 335. Even though I have the transfer field set up with a length of 512, I am only getting 255 characters' worth of data when I pull the flat file in from the server.
    Has anyone discovered a way to handle this? I cannot break the record up into blocks of 255; the TRANSFER has to be able to handle something greater than a length of 255.
    Many Thanks!
    Tavares L. Phillips

    OK - according to OSS note 626010:
    Short text: "TRANSFER f TO dataset" ignores LENGTH addition
    Responsible: SAP AG
    Component: BC-ABA-LA (Syntax, Compiler, Runtime)
    Symptom: In rare cases, the "TRANSFER f TO dataset" statement ignores the LENGTH addition.
    Other terms: DATASET, FILE
    Reason and Prerequisites: This is caused by a kernel error.
    Solution: The error is corrected for SAP_BASIS 6.20 using kernel patch 848.
    Valid releases: SAP_BASIS (SAP Basis component), releases 610 to 620.
    It's an old note but...?
    Rob

  • Videos in iPhoto used to export with the correct date; now they don't.

    Hello, I use iPhoto a lot. About every 6 months I rename and export my photos and videos to a separate portable hard drive for safe storage.
    I am doing that right now, but have noticed that my videos do not export with the correct date (changes the date to the export date), and therefore the videos are not going to be in the right order, and will not have the correct date on my backup files. The pictures export properly with correct date. Just the videos are the problem.
    This problem did not exist before. What has changed? Is there a way to export my videos and retain the correct date again?
    Thank you!

    Thank you for the quick reply Terence!
    Is this something new with Mavericks, or a new iPhoto update? I have always been able to export video with the correct date. Even when I imported it back into iPhoto later, the correct date was there. I think I updated to Mavericks after my last renaming and exporting session.
    I have found that if I drag the video from iPhoto to my desktop, it keeps the correct date on the file and the video is intact. So there must be some sort of "metadata" in the file that has the correct date?? You probably have the knowledge to experiment with this??
    Just seems weird, because for years it was just fine. Now it is not.
    How did the videos used to keep the correct date?
    Thank you.

  • Materialized View with "error in exporting/importing data"

    My system is 10g R2 on AIX (dev). When I impdp a dmp file from another box, also 10g R2, the dump log file contains an error about the materialized view:
    ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
    ORA-02354: error in exporting/importing data
    Desc mv_emphora
    Name              Null?    Type
    C_RID                      ROWID
    P_RID                      ROWID
    T$CWOC            NOT NULL CHAR(6)
    T$EMNO            NOT NULL CHAR(6)
    T$NAMA            NOT NULL CHAR(35)
    T$EDTE            NOT NULL DATE
    T$PERI                     NUMBER
    T$QUAN                     NUMBER
    T$YEAR                     NUMBER
    T$RGDT                     DATE
    As I checked here and on Metalink, the information I found has little to do with the MV itself. What was the cause?

    The log has 25074 lines in total, so I used grep from the OS to pull out the lines involving the MV and its base tables. Here they are:
    grep -n -i "TTPPPC235201" impBaanFull.log
    5220:ORA-39153: Table "BAANDB"."TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    5845:ORA-39153: Table "BAANDB"."MLOG$_TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent meta data will be skipped due to table_exists_action of truncate
    8503:. . imported "BAANDB"."TTPPPC235201"                     36.22 MB  107912 rows
    8910:. . imported "BAANDB"."MLOG$_TTPPPC235201"               413.0 KB    6848 rows
    grep -n -i "TTCCOM001201" impBaanFull.log
    4018:ORA-39153: Table "BAANDB"."TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    5844:ORA-39153: Table "BAANDB"."MLOG$_TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    9129:. . imported "BAANDB"."MLOG$_TTCCOM001201"               9.718 KB      38 rows
    9136:. . imported "BAANDB"."TTCCOM001201"                     85.91 KB     239 rows
    grep -n -i "MV_EMPHORA" impBaanFull.log
    8469:ORA-39153: Table "BAANDB"."MV_EMPHORA" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    8558:ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
    8560:ORA-12081: update operation not allowed on table "BAANDB"."MV_EMPHORA"
    25066:ORA-31684: Object type MATERIALIZED_VIEW:"BAANDB"."MV_EMPHORA" already exists
    25072: BEGIN dbms_refresh.make('"BAANDB"."MV_EMPHORA"',list=>null,next_date=>null,interval=>null,implicit_destroy=>TRUE,lax=>
    FALSE,job=>44,rollback_seg=>NULL,push_deferred_rpc=>TRUE,refresh_after_errors=>FALSE,purge_option => 1,parallelism => 0,heap_size => 0);
    25073:dbms_refresh.add(name=>'"BAANDB"."MV_EMPHORA"',list=>'"BAANDB"."MV_EMPHORA"',siteid=>0,export_db=>'BAAN'); END;
    The number in front of each line is the line number in the import log.
    Here is the syntax of my impdp run:
    impdp user/pw SCHEMAS=baandb DIRECTORY=baanbk_data_pump DUMPFILE=impBaanAll.dmp LOGFILE=impBaanAll.log TABLE_EXISTS_ACTION=TRUNCATE
    Yes, I can create the MV manually, and I have no problem refreshing it manually after the import.
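    One way to avoid the ORA-12081 on the MV container table (my suggestion, not something from this thread; verify against your version's Data Pump docs) is to skip the materialized view and its container table during the import and rebuild the MV afterwards, e.g. with a parameter file:
    # impdp user/pw parfile=imp_no_mv.par
    SCHEMAS=baandb
    DIRECTORY=baanbk_data_pump
    DUMPFILE=impBaanAll.dmp
    LOGFILE=impBaanAll.log
    TABLE_EXISTS_ACTION=TRUNCATE
    EXCLUDE=MATERIALIZED_VIEW
    EXCLUDE=TABLE:"IN ('MV_EMPHORA')"
    After the import, recreate or refresh MV_EMPHORA manually, as you already do.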

  • Using expdp to export a mix of tables with data and tables without data

    Hi,
    I would like to create a .dmp file using expdp, exporting one set of tables with data and another set without data. Is there a way to do this in a single .dmp file? For example, I want all the tables in a schema with data, but for the fact tables in that schema I only want the table objects, not the data. I thought it might be easier to create two separate .dmp files, one for each scenario, but it would be nice to have one .dmp file that satisfies my requirement. Any help is appreciated.
    Thanks,
    -Rodolfo

    You could do this with where clauses. Let's say you have 10 tables to export, 5 with data and 5 without data. I would do it like this
    tab1_w_data
    tab2_w_data
    tab3_w_data
    tab4_w_data
    tab5_w_data
    tab1_wo_data
    tab2_wo_data
    tab3_wo_data
    tab4_wo_data
    tab5_wo_data
    I would make one generic query
    query="where rownum = 0"
    and I would make 5 specific queries
    query=tab1_w_data:"where rownum > 0"
    query=tab2_w_data:"where rownum > 0"
    query=tab3_w_data:"where rownum > 0"
    query=tab4_w_data:"where rownum > 0"
    query=tab5_w_data:"where rownum > 0"
    The first query will be applied to all tables that don't have their own specific query, and it will export no rows; the next five each apply to their corresponding table.
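    Putting that into an expdp parameter file (a sketch; the schema, directory and dump file names are placeholders, the table names are the ones above):
    # expdp scott/tiger parfile=mixed.par
    # The unqualified QUERY applies to every table without its own query.
    SCHEMAS=myschema
    DIRECTORY=dp_dir
    DUMPFILE=mixed.dmp
    QUERY="WHERE rownum = 0"
    QUERY=tab1_w_data:"WHERE rownum > 0"
    QUERY=tab2_w_data:"WHERE rownum > 0"
    QUERY=tab3_w_data:"WHERE rownum > 0"
    QUERY=tab4_w_data:"WHERE rownum > 0"
    QUERY=tab5_w_data:"WHERE rownum > 0"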
    Dean

  • Export Of Data In The .CSV Format With Column Headings On Top

    I'm trying to export data in the .csv format from a report (Crystal Reports XI Release 2). When I export the data, the headings appear in the first few columns on the left-hand side (i.e. columns "A" through "G") of every row, rather than as column headings on top. These headers describe the data beneath them. I tried various combinations of export options. The file needs the extension .csv (or .xls) for the tool that uses the data. Any suggestions on how I can accomplish this?

    Abhishek,
    I tried to apply your solution, but forgot that the Crystal Reports support desk updated my version of Crystal Reports XI to Release 2 for a previous problem. My CDs are Crystal Reports XI Release 1. Now I can't export at all until I figure out how to turn the export option on. I do have access to an older version (version unknown) that came with one of our business systems. The options it has are "Character", "Tab", and "Delimiter" for "Separated Values (CSV)". It does not have the option that you mentioned. Is there a similar option in this older version? Lastly, how can I turn on "Export" for Crystal Reports XI Release 2? Thanks!
