Creating Exception Tables

Hi,
how do I create exception tables?

There are several kinds of exception tables; see the exceptions_clause in the documentation.

Similar Messages

  • How to create an exceptions table

    I am trying to record the records that fail to load in my dimension and fact mappings. I found the Constraint Management option for "Exception Table Name", but I could not find how to create the table, i.e. what the structure of the table is and whether the table is special (an AQ table, etc.).
    I looked through the docs and it is not mentioned anywhere.
    Regards,
    Abraham

    Your database should provide a script for this:
    rdbms\admin\utlexcpt.sql
    Regards,
    Torsten
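    For reference, the exceptions table that utlexcpt.sql creates typically has the structure below. This is a sketch based on the standard script; verify against the copy under your own $ORACLE_HOME/rdbms/admin, since column lengths differ between releases.
    -- Typical structure created by rdbms/admin/utlexcpt.sql (verify against your release)
    CREATE TABLE exceptions (
      row_id     ROWID,         -- rowid of the offending row
      owner      VARCHAR2(30),  -- owner of the table whose constraint was violated
      table_name VARCHAR2(30),  -- table on which the constraint is defined
      constraint VARCHAR2(30)   -- name of the violated constraint
    );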

  • Exceptions Table query help?

    I have two exception tables, cust_day_of_week and cust_date.
    cust_day_of_week has the following fields:
    Key_Id, Day_week, begintime, endtime
    1,1,8,20 (1 is the Oracle number for Sunday)
    cust_date has the following fields:
    Key_Id, exception_date, begintime, endtime
    2,08/24/2011,8,20
    I am writing a function to get the key_id to use for some purpose by checking whether sysdate falls in any of the exceptions.
    Say sysdate is a Sunday, then profile ID = 1 should be returned.
    If a date exception is to be checked, then the cust_date table should be checked...
    Is there a way to get the profile ID from both tables in one function by checking which exception is fulfilled? Hope I am clear...
    Say I want to check whether the sysdate day is Sunday or not; if yes, then return profile ID = 1.
    Else, if I want to check whether today = 08/24/2011, then return profile ID = 2. But this should be done in one function if possible... is it possible?
    This is the function I am planning to create...
    How do I check for the cust_date table date exception in the same query?
    FUNCTION key_id
    ( v_date IN DATE DEFAULT SYSDATE )
    RETURN NUMBER
    IS
      ret_value NUMBER := 0;
    BEGIN
      SELECT key_id
        INTO ret_value
        FROM cust_day_of_week
       WHERE to_char(v_date, 'D') IN (SELECT day_week FROM cust_day_of_week);
      RETURN ret_value;
    EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line('Error: ' || SQLERRM);
        RETURN 0;
    END;

    Does this help?
    It's not a function, but I don't see a need for one.
    drop table cust_date purge;
    drop table cust_day_of_week purge;
    create table cust_day_of_week (key_id number,Day_week number,begintime number,endtime number);
    insert into cust_day_of_week values (1,1,8,20);
    insert into cust_day_of_week values (2,2,8,20);
    insert into cust_day_of_week values (3,3,8,20);
    insert into cust_day_of_week values (4,4,8,20);
    insert into cust_day_of_week values (5,5,8,20);
    insert into cust_day_of_week values (6,6,8,20);
    insert into cust_day_of_week values (7,7,8,20);
    create table cust_date (key_id number,exception_date date,begintime number,endtime number, constraint edunique unique(exception_date));
    insert into cust_date values (8,to_date('08/24/2011','MM/DD/YYYY'),8,20);
    insert into cust_date values (9,to_date('08/25/2011','MM/DD/YYYY'),8,20);
    commit;
    select key_id from (
    SELECT key_id from cust_date where exception_date = trunc(sysdate)
    union all
    SELECT key_id from cust_day_of_week where day_week = to_number(to_char(sysdate,'D','nls_date_language = AMERICAN')))
    where rownum = 1;
    I am not 100% sure that you are allowed to rely on UNION ALL preserving the order. I have not seen otherwise, but I am not sure whether it's guaranteed.
    As the documentation doesn't say it's guaranteed, it probably isn't.
    To be sure, this is a safe way of doing it. Maybe someone else can say whether there is a shorter way of doing that:
    select key_id from (
    select key_id ,rank() over (order by sorted) rnk from (
    SELECT key_id, 1 sorted from cust_date where exception_date = trunc(sysdate)
    union all
    SELECT key_id, 2 sorted from cust_day_of_week where day_week = to_number(to_char(sysdate,'D','nls_date_language = AMERICAN')))
    ) where rnk = 1;
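    If you do want the lookup wrapped in a function as originally asked, a minimal sketch along the same lines might look like the following. The function name get_key_id is illustrative, and note that the 'D' day-of-week number still depends on your NLS territory settings, as discussed above.
    CREATE OR REPLACE FUNCTION get_key_id (p_date IN DATE DEFAULT SYSDATE)
      RETURN NUMBER
    IS
      v_key_id NUMBER;
    BEGIN
      -- prefer an exact date exception, fall back to the day-of-week entry
      SELECT key_id
        INTO v_key_id
        FROM (SELECT key_id, rank() OVER (ORDER BY sorted) rnk
                FROM (SELECT key_id, 1 sorted
                        FROM cust_date
                       WHERE exception_date = trunc(p_date)
                      UNION ALL
                      SELECT key_id, 2 sorted
                        FROM cust_day_of_week
                       WHERE day_week = to_number(to_char(p_date, 'D'))))
       WHERE rnk = 1
         AND rownum = 1;
      RETURN v_key_id;
    EXCEPTION
      WHEN no_data_found THEN
        RETURN 0;
    END get_key_id;
    /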

  • NULL in primary keys NOT logged to exceptions table

    Problem: Inconsistent behavior when enabling constraints using the "EXCEPTIONS INTO" clause. RDBMS Version: 9.2.0.8.0 and 10.2.0.3.0
    - NULL values in primary keys are NOT logged to exceptions table
    - NOT NULL column constraints ARE logged to exceptions table
    -- Demonstration
    -- NULL values in primary keys NOT logged to exceptions table
    TRUNCATE TABLE exceptions;
    DROP TABLE t;
    CREATE TABLE t ( x NUMBER );
    INSERT INTO t VALUES ( NULL );
    ALTER TABLE t
    ADD ( CONSTRAINT tpk PRIMARY KEY (x) EXCEPTIONS INTO exceptions );
    SELECT * FROM exceptions; -- returns no rows
    -- NOT NULL column constraints logged to exceptions table
    TRUNCATE TABLE exceptions;
    DROP TABLE t;
    CREATE TABLE t ( x NUMBER );
    INSERT INTO t VALUES ( NULL );
    ALTER TABLE t MODIFY ( X NOT NULL EXCEPTIONS INTO EXCEPTIONS );
    SELECT * FROM exceptions; -- returns one row
    I would have expected all constraint violations to be logged to exceptions. I was not able to find any documentation describing the behavior I describe above.
    Can anyone tell me if this is the intended behavior and if so, where it is documented?
    I would also appreciate it if others would confirm this behavior on their systems and say if it is what they expect.
    Thanks.
    - Doug
    P.S. Apologies for the repost from an old thread, which someone else found objectionable.

    I should have posted the output. Here it is.
    SQL>TRUNCATE TABLE exceptions;
    Table truncated.
    SQL>DROP TABLE t;
    Table dropped.
    SQL>CREATE TABLE t ( x NUMBER );
    Table created.
    SQL>INSERT INTO t VALUES ( NULL );
    1 row created.
    SQL>ALTER TABLE t ADD ( CONSTRAINT tpk PRIMARY KEY (x) EXCEPTIONS INTO exceptions );
    ALTER TABLE t ADD ( CONSTRAINT tpk PRIMARY KEY (x) EXCEPTIONS INTO exceptions )
    ERROR at line 1:
    ORA-01449: column contains NULL values; cannot alter to NOT NULL
    SQL>SELECT * FROM exceptions;
    no rows selected
    SQL>
    SQL>TRUNCATE TABLE exceptions;
    Table truncated.
    SQL>DROP TABLE t;
    Table dropped.
    SQL>CREATE TABLE t ( x NUMBER );
    Table created.
    SQL>INSERT INTO t VALUES ( NULL );
    1 row created.
    SQL>ALTER TABLE t MODIFY ( X NOT NULL EXCEPTIONS INTO EXCEPTIONS );
    ALTER TABLE t MODIFY ( X NOT NULL EXCEPTIONS INTO EXCEPTIONS )
    ERROR at line 1:
    ORA-02296: cannot enable (MYSCHEMA.) - null values found
    SQL>SELECT * FROM exceptions;
    ROW_ID OWNER TABLE_NAME CONSTRAINT
    AAAkk5AAMAAAEByAAA MYSCHEMA T T
    1 row selected.
    As you can see, I get the expected error message. But I only end up with a record in the exceptions table for the NOT NULL column constraint, not for the null primary key value.
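    A practical workaround, if the goal is simply to capture the offending rows, is to rely on the behavior demonstrated above: validate the NOT NULL part explicitly first (which does populate the exceptions table), clean up those rows, and only then add the primary key. A sketch based on the demonstration:
    -- capture the NULL rows first via an explicit NOT NULL check
    ALTER TABLE t MODIFY ( x NOT NULL EXCEPTIONS INTO exceptions );
    -- fix or remove the rows that were logged ...
    DELETE FROM t WHERE rowid IN (SELECT row_id FROM exceptions);
    -- ... then add the primary key, which will log any remaining (duplicate) violations
    ALTER TABLE t ADD ( CONSTRAINT tpk PRIMARY KEY (x) EXCEPTIONS INTO exceptions );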

  • Error while creating adf table in jdev 11.1.1.2.0

    Hi
    I am getting the error below when running the JSF page.
    =======
    <UIXInclude><_setupIncludeContext> Illegal call to setup the context of an include that is already in context
    <UIXInclude><_tearDownIncludeContext> Illegal call to tear down the context of an include that is not in context
    <UIXPageTemplate><tearDownVisitingContext> Tear down of page template context failed due to an unhandled exception.
    java.util.NoSuchElementException
    =======
    In this JSF page I just dragged a view object onto the page to create a table.
    What could be the reason?
    Thanks

    Hi,
    Did you drop it as an ADF table? Can you post the code snippet of your jspx page?
    -Arun

  • Issue while creating ADF Table programatically

    Hi,
    I am trying to create a table programmatically, and I implemented it like below:
                RichTable phoneTable = new RichTable();
                phoneTable.setEmptyText("No Phone Details yet");
                getContactPhone(contactObj.getPhone());
                phoneTable.setValue(contactObj.getPhone());
    // contactObj.getPhone() is a ArrayList<TelephoneBOD> in pojo Object...
    // Which is taken from a Object returned from DataControl Method (Captured in pageFlowScope)
                phoneTable.setVar("row");
                // Add Columns
                RichColumn column = new RichColumn();
                column.setHeaderText("Type");
                column.setId("phoneType");
                column.setAlign("right");
                column.setWidth("100");
                // Set output.
                RichOutputText output = new RichOutputText();
                output.setValue("#{row.phoneType}");
                // Add output into column.
                column.getChildren().add(output);
                // Add column into table.
                phoneTable.getChildren().add(column);
    When I try to implement it like this, I am getting the following error:
    popup:
    ZIP_STATE_FAILED
    ADF_FACES-60097: For more information, please see the server's error log for an entry beginning with: ADF_FACES-60096: Server Exception during PPR, #2
    log:
    Caused By: java.io.NotSerializableException: org.ieee.internal.ws.proxy.conf.types.TelephoneBOD
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1156)
         at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:326)
         at java.util.ArrayList.writeObject(ArrayList.java:570)
         at sun.reflect.GeneratedMethodAccessor252.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:945)
         at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1461)
         at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1392)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1150)
         at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1338)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1146)
         at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1338)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1146)
         at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1338)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1146)
         at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:326)
         at org.apache.myfaces.trinidad.component.TreeState.writeExternal(TreeState.java:239)
         at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1421)
         at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1390)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1150)
         at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1338)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1146)
         at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:326)
         at org.apache.myfaces.trinidad.component.TreeState.writeExternal(TreeState.java:241)
         at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1421)
         at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1390)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1150)
         at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1338)
         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1146)
         at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:326)
    I think this is happening because I am not serializing the object passed to the table. If this is the reason, how do I serialize the object?
    Or am I missing something in the code?
    Thanks
    Thoom
    Edited by: User007 on Aug 31, 2011 1:47 PM

    Thanks, Timo, for the simple solution.
    Actually, TelephoneBOD is auto-generated when the web service proxy is created from the web service.
    I don't think updating those Java files directly is a best practice, because they will change whenever the web service schema changes.
    Do you think there is any other way to do this?
    Thanks
    Thoom

  • Unable to create a table in cloudscape with solaris

    Hi experts,
    I have WLS 6.1 running on Solaris. I'm trying to create tables in Cloudscape but
    it throws a NullPointerException. I'm able to run java utils.Schema jdbc:cloudscape:test;create=true
    COM.cloudscape.core.JDBCDriver
    -verbose test.ddl
    This is the error :
    CREATE TABLE test ( id varchar(32), name varchar(32))
    SQL Exception: Java exception: ': java.lang.NullPointerException'.
    SQL Error Code: 0
    SQL State: XJ001
    It creates an empty database. I have the required permissions. And I can run java
    COM.cloudscape.tools.cview, but when I try to create a table even using cview it
    throws a NullPointerException.
    Any help is appreciated.

    Try this SQL statement:
    CREATE TABLE myTable (myField COUNTER)
    This creates a table with name 'myTable' and a column 'myField' of type AutoNumber.

  • Uploading data from excel file to a dynamically created internal table

    Hi,
    I have a requirement where I have to upload data from an Excel file into a database table. I would be able to determine the structure of the table only at runtime, based on user input, so I have created an internal table dynamically.
    Could you please tell me if it is possible to upload data from an Excel file to the dynamically created internal table using any function modules?
    I thought of doing this by declaring a generic internal table with one field, uploading the *.csv file into it, splitting it on "," and then assigning it to the field symbol referencing the internal table, but my file length exceeds 132 characters and I'm only able to get data of length 132 characters into my internal table (the generic one).
    Could anyone please show me a way around this?
    Thanks in advance,
    Harsha

    Sure, check this out.
    report zrich_0002.
    type-pools: slis.
    field-symbols: <dyn_table> type standard table,
                   <dyn_wa>,
                   <dyn_field>.
    data: it_fldcat type lvc_t_fcat,
          wa_it_fldcat type lvc_s_fcat.
    type-pools : abap.
    data: new_table type ref to data,
          new_line  type ref to data.
    data: iflat type table of string.
    data: xflat type string.
      data: irec type table of string with header line.
      data: tabix type sy-tabix.
    data: file type string.
    selection-screen begin of block b1 with frame title text .
    parameters: p_file type  rlgrap-filename default 'c:\Test.csv'.
    parameters: p_flds type i.
    selection-screen end of block b1.
    start-of-selection.
    * Add X number of fields to the dynamic itab catalog
      do p_flds times.
        clear wa_it_fldcat.
        wa_it_fldcat-fieldname = sy-index.
        wa_it_fldcat-datatype = 'C'.
        wa_it_fldcat-inttype = 'C'.
        wa_it_fldcat-intlen = 10.
        append wa_it_fldcat to it_fldcat .
      enddo.
    * Create dynamic internal table and assign to FS
      call method cl_alv_table_create=>create_dynamic_table
                   exporting
                      it_fieldcatalog = it_fldcat
                   importing
                      ep_table        = new_table.
      assign new_table->* to <dyn_table>.
    * Create dynamic work area and assign to FS
      create data new_line like line of <dyn_table>.
      assign new_line->* to <dyn_wa>.
      file = p_file.
      call method cl_gui_frontend_services=>gui_upload
        exporting
          filename                = file
        changing
          data_tab                = iflat
        exceptions
          file_open_error         = 1
          file_read_error         = 2
          no_batch                = 3
          gui_refuse_filetransfer = 4
          invalid_type            = 5
          no_authority            = 6
          unknown_error           = 7
          bad_data_format         = 8
          header_not_allowed      = 9
          separator_not_allowed   = 10
          header_too_long         = 11
          unknown_dp_error        = 12
          access_denied           = 13
          dp_out_of_memory        = 14
          disk_full               = 15
          dp_timeout              = 16
          others                  = 17.
      loop at iflat into xflat.
        clear irec. refresh irec.
        split xflat at ',' into table irec.
        loop at irec.
          tabix = sy-tabix.
          assign component tabix of structure <dyn_wa> to <dyn_field>.
          <dyn_field> = irec.
        endloop.
        append <dyn_wa> to <dyn_table>.
      endloop.
    * Write out data from table.
      loop at <dyn_table> into <dyn_wa>.
        do.
          assign component  sy-index  of structure <dyn_wa> to <dyn_field>.
          if sy-subrc <> 0.
            exit.
          endif.
          if sy-index = 1.
            write:/ <dyn_field>.
          else.
            write: <dyn_field>.
          endif.
        enddo.
      endloop.
    Regards,
    Rich Heilman

  • How to create hashed table in runtime

    hi experts
    how to create hashed table in runtime, please give me the coading style.
    please help me.
    regards
    subhasis

    Hi,
    Have a look at the code, and please reward points.
    Use Hashed Tables to Improve Performance :
    report zuseofhashedtables.
    * Program: ZUseOfHashedTables
    * Author: XXXXXXXXXXXXXXXXXX
    * Versions: 4.6b - 4.6c
    * Notes:
    *     this program shows how we can use hashed tables to improve
    *     the response time.
    *     It shows,
    *        1. how to declare hashed tables
    *        2. a cache-like technique to improve access to master data
    *        3. how to collect data using hashed tables
    *        4. how to avoid deletions of unwanted data
    * Results: the test we ran read about 31000 rows from mkpf, 150000
    *          rows from mseg, 500 rows from makt and 400 from lfa1.
    *          it filled ht_lst with 24500 rows and displayed them in
    *          alv grid format.
    *          It needed about 65 seconds to perform this task (with
    *          all the db buffers empty)
    *          The same program with standard tables needed 140 seconds
    *          to run with the same recordset and with buffers filled in
    * Objective: show a list that consists of all the material movements
    *          '101' - '901' for a certain range of dates in mkpf-budat.
    * the columns to be displayed are:
    *          mkpf-budat,
    *          mkpf-mblnr,
    *          mseg-lifnr,
    *          lfa1-name1,
    *          mkpf-xblnr,
    *          mseg-zeile,
    *          mseg-charg,
    *          mseg-matnr,
    *          makt-maktx,
    *          mseg-erfmg,
    *          mseg-erfme.
    * or show a summary list by matnr - menge
    * You'll have to create a pf-status called VISTA -
    * See form set_pf_status for details
    * tables used
    tables: mkpf,
            mseg,
            lfa1,
            makt.
    * global hashed tables used
    data: begin of wa_mkpf, "header
          mblnr like mkpf-mblnr,
          mjahr like mkpf-mjahr,
          budat like mkpf-budat,
          xblnr like mkpf-xblnr,
          end of wa_mkpf.
    data: ht_mkpf like hashed table of wa_mkpf
          with unique key mblnr mjahr
          with header line.
    data: begin of wa_mseg, " line items
          mblnr like mseg-mblnr,
          mjahr like mseg-mjahr,
          zeile like mseg-zeile,
          bwart like mseg-bwart,
          charg like mseg-charg,
          matnr like mseg-matnr,
          lifnr like mseg-lifnr,
          erfmg like mseg-erfmg,
          erfme like mseg-erfme,
          end of wa_mseg.
    data ht_mseg like hashed table of wa_mseg
          with unique key mblnr mjahr zeile
          with header line.
    * cache structure for lfa1 records
    data: begin of wa_lfa1,
          lifnr like lfa1-lifnr,
          name1 like lfa1-name1,
          end of wa_lfa1.
    data ht_lfa1 like hashed table of wa_lfa1
          with unique key lifnr
          with header line.
    * cache structure for material related data
    data: begin of wa_material,
          matnr like makt-matnr,
          maktx like makt-maktx,
          end of wa_material.
    data: ht_material like hashed table of wa_material
            with unique key matnr
            with header line.
    * result table
    data: begin of wa_lst, "
          budat like mkpf-budat,
          mblnr like mseg-mblnr,
          lifnr like mseg-lifnr,
          name1 like lfa1-name1,   
          xblnr like mkpf-xblnr,
          zeile like mseg-zeile,
          charg like mseg-charg,
          matnr like mseg-matnr,
          maktx like makt-maktx,
          erfmg like mseg-erfmg,
          erfme like mseg-erfme,
          mjahr like mseg-mjahr,
          end of wa_lst.
    data: ht_lst like hashed table of wa_lst
            with unique key mblnr mjahr zeile
            with header line.
    data: begin of wa_lst1, " sumary by material
          matnr like mseg-matnr,
          maktx like makt-maktx,
          erfmg like mseg-erfmg,
          erfme like mseg-erfme,
          end of wa_lst1.
    data: ht_lst1 like hashed table of wa_lst1
            with unique key matnr
            with header line.
    * structures for alv grid display.
    * itabs
    type-pools: slis.
    data: it_lst            like standard table of wa_lst with header line,
          it_fieldcat_lst   type slis_t_fieldcat_alv with header line,
          it_sort_lst       type slis_t_sortinfo_alv,
          it_lst1           like standard table of wa_lst1 with header line,
          it_fieldcat_lst1  type slis_t_fieldcat_alv with header line,
          it_sort_lst1      type slis_t_sortinfo_alv.
    * structures
    data: wa_sort         type slis_sortinfo_alv,
          ls_layout       type slis_layout_alv.
    * global variables
    data: g_lines type i.
    data: g_repid like sy-repid,
          ok_code       like sy-ucomm.
    * selection-screen
    "text: Dates:
    select-options: so_budat for mkpf-budat default sy-datum.
    "text: Material numbers.
    select-options: so_matnr for mseg-matnr.
    selection-screen uline.
    selection-screen skip 1.
    "Text: show summary by material.
    parameters: gp_bymat as checkbox default ''.
    start-of-selection.
      perform get_data.
      perform show_data.
    end-of-selection.
    *       FORM get_data                                                 *
    form get_data.
            select mblnr mjahr budat xblnr
                into table ht_mkpf
               from mkpf
              where budat in so_budat. " make use of std index.
    * have we retrieved data from mkpf?
      describe table ht_mkpf lines g_lines.
      if g_lines > 0.
    * if true then retrieve all related records from mseg.
    * Doing this way we make sure that the access is by primary key
    * of mseg.
    * The reason is that it is faster to filter them in memory
    * than to allow the db server to do it.
        select mblnr mjahr zeile bwart charg
                 matnr lifnr erfmg erfme
          into table ht_mseg
          from mseg
            for all entries in ht_mkpf
         where mblnr = ht_mkpf-mblnr
           and mjahr = ht_mkpf-mjahr.
      endif.
    * fill t_lst or t_lst1 according to user's choice.
      if gp_bymat = ' '.
        perform fill_ht_lst.
      else.
        perform fill_ht_lst1.
      endif.
    endform.
    form fill_ht_lst.
      refresh ht_lst.
    * Example: how to discard unwanted data in an efficient way.
      loop at ht_mseg.
    * filter unwanted data
        check ht_mseg-bwart = '101' or ht_mseg-bwart = '901'.
        check ht_mseg-matnr in so_matnr.
    * read header line.
        read table ht_mkpf with table key mblnr = ht_mseg-mblnr
        mjahr = ht_mseg-mjahr.
        clear ht_lst.
    * note : this may be faster if you specify field by field.
        move-corresponding ht_mkpf to ht_lst.
        move-corresponding ht_mseg to ht_lst.
        perform read_lfa1 using ht_mseg-lifnr changing ht_lst-name1.
        perform read_material using ht_mseg-matnr changing ht_lst-maktx.
        insert table ht_lst.
      endloop.
    endform.
    form fill_ht_lst1.
      refresh ht_lst1.
    * Example: how to discard unwanted data in an efficient way.
    * how to simulate a collect in a faster way
      loop at ht_mseg.
    * filter unwanted data
        check ht_mseg-bwart = '101' or ht_mseg-bwart = '901'.
        check ht_mseg-matnr in so_matnr.
    * note : this may be faster if you specify field by field.
        read table ht_lst1 with table key matnr = ht_mseg-matnr
        transporting erfmg.
        if sy-subrc <> 0. " if matnr doesn't exist in sumary table
        " insert a new record
          ht_lst1-matnr = ht_mseg-matnr.
          perform read_material using ht_mseg-matnr changing ht_lst1-maktx.
          ht_lst1-erfmg = ht_mseg-erfmg.
          ht_lst1-erfme = ht_mseg-erfme.
          insert table ht_lst1.
        else." a record was found.
        " collect erfmg.  To do so, fill in the unique key and add
        " the numeric fields.
          ht_lst1-matnr = ht_mseg-matnr.
          add ht_mseg-erfmg to ht_lst1-erfmg.
          modify table ht_lst1 transporting erfmg.
        endif.
      endloop.
    endform.
    * implementation of cache for lfa1.
    form read_lfa1 using p_lifnr changing p_name1.
            read table ht_lfa1 with table key lifnr = p_lifnr
            transporting name1.
      if sy-subrc <> 0.
        clear ht_lfa1.
        ht_lfa1-lifnr = p_lifnr.
        select single name1
           into ht_lfa1-name1
          from lfa1
        where lifnr = p_lifnr.
        if sy-subrc <> 0. ht_lfa1-name1 = 'n/a in lfa1'. endif.
        insert table ht_lfa1.
      endif.
      p_name1 = ht_lfa1-name1.
    endform.
    * implementation of cache for material data
    form read_material using p_matnr changing p_maktx.
      read table ht_material with table key matnr = p_matnr
      transporting maktx.
      if sy-subrc <> 0.
        ht_material-matnr = p_matnr.
        select single maktx into  ht_material-maktx
          from makt
         where spras = sy-langu
           and matnr = p_matnr.
        if sy-subrc <> 0. ht_material-maktx = 'n/a in makt'. endif.
        insert table ht_material.
      endif.
      p_maktx = ht_material-maktx.
    endform.
    form show_data.
      if gp_bymat = ' '.
        perform show_ht_lst.
      else.
        perform show_ht_lst1.
      endif.
    endform.
    form show_ht_lst.
      "needed because the FM can't use a hashed table.
      it_lst[] = ht_lst[].
      perform fill_layout using 'full display'
                           changing ls_layout.
      perform fill_columns_lst.
    *  perform sort_lst.
      g_repid = sy-repid.
      call function 'REUSE_ALV_GRID_DISPLAY'
           exporting
                i_callback_program       = g_repid
                i_callback_pf_status_set = 'SET_PF_STATUS'
                is_layout                = ls_layout
                it_fieldcat              = it_fieldcat_lst[]
               it_sort                  = it_sort_lst
           tables
                t_outtab                 = it_lst
           exceptions
                program_error            = 1
                others                   = 2.
    endform.
    form show_ht_lst1.
      "needed because the FM can't use a hashed table.
      it_lst1[] = ht_lst1[].
      perform fill_layout using 'Sumary by matnr'
                           changing ls_layout.
      perform fill_columns_lst1.
    *  perform sort_lst.
      g_repid = sy-repid.
      call function 'REUSE_ALV_GRID_DISPLAY'
           exporting
                i_callback_program       = g_repid
                i_callback_pf_status_set = 'SET_PF_STATUS'
                is_layout                = ls_layout
                it_fieldcat              = it_fieldcat_lst1[]
               it_sort                  = it_sort_lst
           tables
                t_outtab                 = it_lst1
           exceptions
                program_error            = 1
                others                   = 2.
    endform.
    form fill_layout using p_window_titlebar
                   changing cs_layo type slis_layout_alv.
      clear cs_layo.
      cs_layo-window_titlebar        = p_window_titlebar.
      cs_layo-edit                   = 'X'.
      cs_layo-edit_mode              = space.
    endform.                    " armar_layout_stock
    form set_pf_status using rt_extab type slis_t_extab.
    * create a new status
    * and then select extras -> adjust template -> listviewer
      set pf-status 'VISTA'.
    endform.        "set_pf_status
    define add_lst.
      clear it_fieldcat_lst.
      it_fieldcat_lst-fieldname     = &1.
      it_fieldcat_lst-outputlen     = &2.
      it_fieldcat_lst-ddictxt       = 'L'.
      it_fieldcat_lst-seltext_l       = &1.
      it_fieldcat_lst-seltext_m       = &1.
      it_fieldcat_lst-seltext_m       = &1.
      if &1 = 'MATNR'.
        it_fieldcat_lst-emphasize = 'C111'.
      endif.
      append it_fieldcat_lst.
    end-of-definition.
    define add_lst1.
      clear it_fieldcat_lst1.
      it_fieldcat_lst1-fieldname     = &1.
      it_fieldcat_lst1-outputlen     = &2.
      it_fieldcat_lst1-ddictxt       = 'L'.
      it_fieldcat_lst1-seltext_l       = &1.
      it_fieldcat_lst1-seltext_m       = &1.
      it_fieldcat_lst1-seltext_m       = &1.
      append it_fieldcat_lst1.
    end-of-definition.
    form fill_columns_lst.
    * set columns for output.
      refresh it_fieldcat_lst.
      add_lst 'BUDAT' 10.
      add_lst   'MBLNR' 10.
      add_lst  'LIFNR' 10.
      add_lst  'NAME1' 35.
      add_lst  'XBLNR' 15.
      add_lst    'ZEILE' 5.
      add_lst    'CHARG' 10.
      add_lst   'MATNR' 18.
      add_lst   'MAKTX' 30.
      add_lst   'ERFMG' 17.
      add_lst   'ERFME' 5.
      add_lst   'MJAHR' 4.
    endform.
    form fill_columns_lst1.
    * set columns for output.
      refresh it_fieldcat_lst1.
      add_lst1 'MATNR' 18.
      add_lst1 'MAKTX' 30.
      add_lst1 'ERFMG' 17.
      add_lst1 'ERFME' 5.
    endform.
    Regards,
    Ameet

  • SSIS 2008 – Read roughly 50 CSV files from a folder, create SQL table from them dynamically, and dump data.

    Hello everyone,
    I’ve been assigned one requirement wherein I would like to read around 50 CSV files from a specified folder.
    In step 1 I would like to create the schema for these files, meaning take the CSV files one by one and create a SQL table for each, if it does not exist at the destination.
    In step 2 I would like to append the data of these 50 CSV files into respective table.
    In step 3 I would like to purge data older than a given date.
    Please note, the data in these CSV files will be very bulky; I would like to know the best way to insert bulky data into a SQL table.
    Also, in some of the CSV files, there will be 4 rows at the top of the file which contain the header details/header rows.
    As far as I know I will be asked to implement this on SSIS 2008, but I'm not 100% sure of it.
    So, please feel free to provide multiple approaches if we can achieve these requirements elegantly in newer versions like SSIS 2012.
    Any help would be much appreciated.
    Thanks,
    Ankit
    Thanks, Ankit Shah

    Hello Harry and Aamir,
    Thank you for the responses.
    @Aamir, thank you for sharing the link. Yes, I'm going to use a Script Task to read the header columns of the CSV files and prepare one SSIS variable holding the SQL script that creates the required table (with an IF EXISTS check) inside the Script Task itself (see the sketch below).
    I will have an "Execute SQL Task" following the Script Task, and this will create the actual table for a CSV.
    Both these components will be inside a Foreach Loop container and will process all 50 CSV files one by one.
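    As a rough illustration, the script such a Script Task might emit for one file could look like the snippet below. The table and column names are placeholders, not taken from the actual CSV files.
    -- hypothetical script generated for one CSV (names are placeholders)
    IF OBJECT_ID(N'dbo.SalesImport', N'U') IS NULL
    BEGIN
        CREATE TABLE dbo.SalesImport (
            OrderId   INT,
            OrderDate DATETIME,
            Amount    DECIMAL(18, 2)
        );
    END;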
    Some points to be clarified,
    1. In this bunch of 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50 we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50, the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we manage this within the same loop? Do we need one more Foreach loop inside the parent one? What is the best way to achieve this requirement?
    3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an older-than criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    Please know, I'm very new to the SSIS world and would like to develop these packages for the client using best package development practices.
    Any help would be greatly appreciated.
    Thanks, Ankit Shah
    1. In this bunch of 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50 we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    How can you identify these files? Is it based on the file name, or is there some info in the file which indicates that it requires a purge? If yes, you can pick this information up during the file name or file data parsing step and set a boolean variable. Then in the control flow have a conditional precedence constraint which checks the boolean variable and, if set, executes an Execute SQL Task to do the purge (you can use TRUNCATE TABLE or DELETE FROM TableName statements).
    2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50, the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we manage this within the same loop? Do we need one more Foreach loop inside the parent one? What is the best way to achieve this requirement?
    The best way to achieve this is to append a sequential value to the filename (maybe a timestamp) and then process them in sequence. This can be done prior to the main loop so that you can use the same loop to process these duplicate filenames as well. The best thing would be to use the file creation date attribute value so that they get processed in the right sequence. You can use a script task to get this for each file as shown below:
    http://microsoft-ssis.blogspot.com/2011/03/get-file-properties-with-ssis.html
    3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an older-than criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    You can use a SQL script for this. Just call a SQL procedure with a single date parameter (say @CutOffDate) and then write logic like below:
    CREATE PROC PurgeTableData
    @CutOffDate datetime
    AS
    DELETE FROM Table1 WHERE DateField < @CutOffDate;
    DELETE FROM Table2 WHERE DateField < @CutOffDate;
    DELETE FROM Table3 WHERE DateField < @CutOffDate;
    GO
    @CutOffDate denotes the date before which older data has to be purged.
    You can then schedule this SP in a SQL Agent job to be executed at your required frequency.
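    For instance, to purge everything older than 1st Jan 2015 as in the scenario above, the job step could simply run:
    EXEC PurgeTableData @CutOffDate = '2015-01-01';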
    Visakh

  • Jbo.PCollException: JBO-28006: Could not create persistence table PS_TXN_seq

    Hi everyone,
    Got the following exception:
    2005-11-08 13:50:54,514 ERROR enatis.error (MsgLogger.java:logError:161) [Error Ref# INT.1131450654514]- An unhandled runtime exception occured.
    oracle.jbo.PCollException: JBO-28006: Could not create persistence table PS_TXN_seq
    at oracle.jbo.PCollException.throwException(PCollException.java:39)
    at oracle.jbo.pcoll.OraclePersistManager.createTable(OraclePersistManager.java:893)
    at oracle.jbo.pcoll.OraclePersistManager.queryNextCollectionId(OraclePersistManager.java:1372)
    at oracle.jbo.pcoll.PCollManager.register(PCollManager.java:560)
    at oracle.jbo.pcoll.PCollection.<init>(PCollection.java:102)
    at oracle.jbo.pcoll.PCollManager.createCollection(PCollManager.java:460)
    at oracle.jbo.server.DBSerializer.setup(DBSerializer.java:153)
    at oracle.jbo.server.DBSerializer.passivateRootAM(DBSerializer.java:286)
    at oracle.jbo.server.DBSerializer.passivateRootAM(DBSerializer.java:267)
    at oracle.jbo.server.ApplicationModuleImpl.passivateStateInternal(ApplicationModuleImpl.java:5123)
    at oracle.jbo.server.ApplicationModuleImpl.passivateState(ApplicationModuleImpl.java:5001)
    at oracle.jbo.server.ApplicationModuleImpl.passivateStateForUndo(ApplicationModuleImpl.java:7429)
    Does anyone know whether there is a process that is supposed to cleanup this table? How is it managed?
    Thanks

    Just to wrap this up i will attach the last couple of postings on Metalink:
    09-NOV-05 07:29:03 GMT
    New info : BUKSVDL : Hi Kjeld,
    I'm still on the passivateStateForUndo topic. This time with the PS_TXN table.
    It looks like BC4J writes to this user table when passivating the AM state.
    Please see my questions in the OTN thread below.
    jbo.PCollException: JBO-28006: Could not create persistence table PS_TXN_se
    The latest entry:
    "The data sources are correct. The problem here were the priviledges after
    upgrading the db to 10g rel 2. Some of the implicit priviledges were removed in
    the latest version of the db.
    The question is still, who manages these tables. When/How are entries removed?
    We see this table, "PS_TXN", growing all the time. How do we prevent problems
    like this in the future. Should we include this table, and maybe others, in the
    maintanance scripts? "
    09-NOV-05 09:29:05 GMT
    New info : BUKSVDL : Hi Kjeld,
    The DBA that did the investigation is out of office today.
    What i can tell you is that:
    We use a data source on the app servers that is defined by the DBAs. We only
    require the DS name. Apparently, in the past, when a user was created certain
    default privileges were automatically granted. This doesn't happen anymore
    with the latest release of the DB. The DBA had to explicitly grant the
    privileges.
    09-NOV-05 10:16:09 GMT
    ISSUE CLARIFICATION
    ====================
    After upgrading the database to Oracle Server 10.1.0.2 the ADF application
    returns following error:
    BC4J - ApplicationModuleImpl.passivateStateForUndo();
    oracle.jbo.PCollException: JBO-28006: Could not create persistence table
    PS_TXN_seq
    The error occurs as soon as passivation is done in the application.
    eos (end of section)
    ISSUE VERIFICATION
    ===================
    Verified the issue by error messages supplied by customer.
    eos (end of section)
    CAUSE DETERMINATION
    ====================
    The user connecting to the database from the ADF application does not have
    the required database grants to create a table. The upgrade did
    delete/remove some required privileges.
    eos (end of section)
    CAUSE JUSTIFICATION
    ====================
    If the database user does not have the privilege "CREATE ANY TABLE", then
    this user cannot create a database table. The tables PS_TXN and PS_TXN_seq
    are created at runtime when passivation is done for the first time. If
    the user does not have the necessary privileges the tables cannot be created and
    the error JBO-28006 will occur.
    The upgrade of the database removed some necessary privileges.
    eos (end of section)
    STATUS
    ======
    @ WIP - Work In Progress
    09-NOV-05 10:16:56 GMT
    POTENTIAL SOLUTION(S)
    ======================
    Make sure the database user has the privileges "CREATE TABLE" and "CREATE
    SEQUENCE" to create objects such as tables and sequences.
    eos (end of section)
    POTENTIAL SOLUTION JUSTIFICATION(S)
    ====================================
    When the database user has the privileges "CREATE TABLE" and "CREATE
    SEQUENCE" it will be possible to create the BC4J tables PS_TXN and
    PS_TXN_seq on passivation.
    eos (end of section)
    SOLUTION / ACTION PLAN
    =======================
    To implement the solution, please execute the following steps:
    1. Connect as user SYS to the database.
    2. Grant at least the following privileges to the ADF application user:
    GRANT CREATE TABLE TO <user>
    GRANT CREATE SEQUENCE TO <user>
    REMARK: Replace <user> with the actual username that is used to connect
    from the adf application to the database.
    eos (end of section)
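    To verify whether the application user actually holds these privileges after the upgrade, a quick check run as that user could be the following (a sketch using the standard data dictionary view):
    -- should return CREATE TABLE and CREATE SEQUENCE for the ADF application user
    SELECT privilege
      FROM user_sys_privs
     WHERE privilege IN ('CREATE TABLE', 'CREATE SEQUENCE');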

  • How to create a table using a variable or flat file ?

    Hello All,
    I want to create a table in which the column names and data types will be read from a flat file or a variable.
    How can I do this? Any suggestions are appreciated.
    Regards,
    Ashish

    How about using this code? It uses the directory object EXT_DIR, where the input file abc.txt is located with the data below:
    TABLE SANJ_TEST
    SANJ_AB NUMBER(6)
    SANJ_BC VARCHAR2(100)
    DECLARE
      v_file      utl_file.file_type;
      z           varchar2(100);
      v_cnt       number := 0;
      v_str       varchar2(1000) := '';
      v_tab       varchar2(40);
      v_col       varchar2(30);
      v_data_type varchar2(30);
    BEGIN
      v_file := utl_file.fopen('EXT_DIR', 'abc.txt', 'r');
      LOOP
        utl_file.get_line(v_file, z);
        IF substr(z, 1, 5) = 'TABLE' THEN
          -- a new table definition starts: create the previous one, if any
          IF v_cnt <> 0 THEN
            v_str := v_str || ')';
            EXECUTE IMMEDIATE v_str;
          END IF;
          v_cnt := 0;
          v_str := 'Create Table ';
          v_tab := ltrim(rtrim(substr(z, 7)));
          v_str := v_str || v_tab || '(';
        ELSE
          -- column line: name in the first 30 characters, data type from position 31
          dbms_output.put_line('count in else start ' || v_cnt);
          v_col := ltrim(rtrim(substr(z, 1, 30)));
          v_data_type := ltrim(rtrim(substr(z, 31)));
          IF v_cnt = 0 THEN
            v_str := v_str || v_col || ' ' || v_data_type;
          ELSE
            v_str := v_str || ',' || v_col || ' ' || v_data_type;
          END IF;
          v_cnt := v_cnt + 1;
        END IF;
      END LOOP;
      utl_file.fclose(v_file);
    EXCEPTION
      WHEN no_data_found THEN
        -- end of file: create the last table that was being built
        v_str := v_str || ')';
        EXECUTE IMMEDIATE v_str;
        dbms_output.put_line('In excep ' || sqlerrm);
        utl_file.fclose(v_file);
    END;
    /
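    For the sample abc.txt shown above, and assuming each column name occupies the first 30 characters of its line (which is what the SUBSTR calls expect), the block would effectively execute:
    CREATE TABLE SANJ_TEST (SANJ_AB NUMBER(6), SANJ_BC VARCHAR2(100))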

  • How to create a table using a subform if LiveCycle Designer 7.1 is not available

    Hi,
    please tell me how to create a table, because I am using Adobe LiveCycle Designer 6.1 and in the library
    there is no object for a table.
    Also tell me: if I have Adobe LiveCycle Designer, which is the better option and why,
    using a table from the library directly or creating a table using a subform?

    Hi Sweta,
    Create the interface attributes of type string and xstring.
    Create a node in the context of type graphics. Bind the interface fields to the graphic node context element properties:
    graphic content (the field of xstring type) and mimetype (a string or any character data type of suitable length).
    In the layout, drag and drop the image field and bind it to the graphic element.
    In your report program,
    do the code as below to pass the graphic data:
    CALL METHOD cl_ssf_xsf_utilities=>get_bds_graphic_as_bmp
    EXPORTING
    p_object = 'GRAPHICS'
    p_name = '<mime type graphic name>'
    p_id = 'BMAP'
    p_btype = 'BCOL' "BMON if monochrome
    RECEIVING
    p_bmp = w_binary
    EXCEPTIONS
    not_found = 1
    internal_error = 2
    OTHERS = 3.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    In the call to the <form function module>,
    pass the interface values:
    the xstring (graphic field) should be w_binary (the importing parameter of the previous method)
    and the mime type should be 'image/bmp'.
    This would work.
    Hopefully you may be able to see a tutorial on this soon ;).
    - anto.

  • How to create a table dynamically

    Hello All,
    I want to create a table dynamically in the DDIC. I know that there is a function module DB_CREATE_TABLE, but I am getting some errors while using it.
    Could anyone please post some code for it.
    Regards,
    Lisa

    Here is the code I have written myself.
    PARAMETERS: tabname TYPE dd02l-tabname,
                fldname TYPE dd03p-fieldname,
                rollname TYPE dd03p-rollname.
    DATA: gt_cl_bc_dyn TYPE REF TO zcl_bc_dyn,
          status TYPE sy-subrc.
    data: lct_table type ZTT_ZTSITAB.
    INITIALIZATION.
      tabname = 'ZTEST1'.
      CREATE OBJECT gt_cl_bc_dyn.
    START-OF-SELECTION.
      CALL METHOD gt_cl_bc_dyn->create_table
        EXPORTING
          tabname   = tabname
          fieldname = fldname
          rollname  = rollname
          itab      = lct_table
        IMPORTING
          status    = status    .
      IF status = 0.
        WRITE: 'Table creation is successful'.
      ELSE.
        WRITE: 'Table creation is unsuccessful'.
      ENDIF.
    Code in the method
    METHOD create_table.
      DATA: ls_dd02v_wa TYPE dd02v,
            ls_dd09l_wa TYPE dd09l,
            ls_dd03p TYPE dd03p,
            lt_dd03p TYPE TABLE OF dd03p,
            rc TYPE sy-subrc.
      ls_dd02v_wa-tabname = tabname.
      ls_dd02v_wa-tabclass = 'TRANSP'.
      ls_dd02v_wa-contflag = 'A'.
      ls_dd09l_wa-tabname = tabname.
      ls_dd09l_wa-tabkat = '0'.
      ls_dd09l_wa-tabart = 'APPL0'.
      ls_dd03p-tabname = tabname.
      ls_dd03p-fieldname = fieldname.
      ls_dd03p-position = '0001'.
      ls_dd03p-keyflag = 'X'.
      ls_dd03p-rollname = rollname.
      APPEND ls_dd03p TO lt_dd03p.
      CALL FUNCTION 'DDIF_TABL_PUT'
        EXPORTING
          name                    = tabname
         dd02v_wa                = ls_dd02v_wa
         dd09l_wa                = ls_dd09l_wa
       TABLES
         dd03p_tab               = lt_dd03p
       EXCEPTIONS
         tabl_not_found          = 1
         name_inconsistent       = 2
         tabl_inconsistent       = 3
         put_failure             = 4
         put_refused             = 5
         OTHERS                  = 6.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
      CALL FUNCTION 'DDIF_TABL_ACTIVATE'
        EXPORTING
          name              = tabname
       IMPORTING
         rc                = rc
       EXCEPTIONS
         not_found         = 1
         put_failure       = 2
         OTHERS            = 3.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.
      status = rc.
    ENDMETHOD.

  • I created new table in database and want to bind with system form

    Hi All,
    1) I created a new table in the database and want to bind it to a system form.
    2) How do I bind these fields to the system form Sales Order? I added a new folder tab there with some fields that I want to bind to the database. When I click on the Next, Previous, First and Last buttons,
    the bound value should change.
    Awaiting your reply,
    Rajkumar G.

    hi,
    try this
    Public Sub BindDataToForm()
            Dim oItem As SAPbouiCOM.Item
            Dim oEdit As SAPbouiCOM.EditText
            Dim oComboBox As SAPbouiCOM.ComboBox
            '// getting the matrix column by the UID
            'oItem = oForm.Items.Item("docname")
            'oComboBox = oItem.Specific
            'oComboBox.DataBind.SetBound(True, "OSRI", "BaseType")
            'oItem = oForm.Items.Item("docno")
            'oEdit = oItem.Specific
            'oEdit.DataBind.SetBound(True, "OSRI", "BaseEntry")
            oColumn = oColumns.Item("Code")
            'oColumn.DataBind.SetBound(True, "", "DSCardCode")
            oColumn.DataBind.SetBound(True, "OSRI", "ItemCode")
            oColumn = oColumns.Item("Serial")
            oColumn.DataBind.SetBound(True, "OSRI", "IntrSerial")
            Try
                oColumn = oColumns.Item("Inspection")
                oColumn.DataBind.SetBound(True, "OSRI", "U_Inspection")
            Catch ex As Exception
                MessageBox.Show(ex.Message)
            End Try
            oColumn = oColumns.Item("Quality")
            oColumn.DataBind.SetBound(True, "OSRI", "U_Quality")
            oColumn = oColumns.Item("Status")
            oColumn.DataBind.SetBound(True, "OSRI", "U_Status")
            oColumn = oColumns.Item("Finish")
            oColumn.DataBind.SetBound(True, "OSRI", "U_Finish")
            oColumn = oColumns.Item("Thickness")
            oColumn.DataBind.SetBound(True, "OSRI", "U_Thickness")
            oColumn = oColumns.Item("uom")
            oColumn.DataBind.SetBound(True, "OSRI", "U_NetUOM")
            oColumn = oColumns.Item("length")
            oColumn.DataBind.SetBound(True, "OSRI", "U_Length")
            oColumn = oColumns.Item("height")
            oColumn.DataBind.SetBound(True, "OSRI", "U_Height")
            oColumn = oColumns.Item("sqf")
            oColumn.DataBind.SetBound(True, "OSRI", "U_sqf")
            oColumn = oColumns.Item("sqm")
            oColumn.DataBind.SetBound(True, "OSRI", "U_sqm")
        End Sub

Maybe you are looking for

  • How to recreate a deleted user account w/ same email address

    Hi, I have a MacMini running Maverics 10.9.1 and Server 3.0.1. In the server, I also have the Open Directory running, but there are no Local Network Users configured at the moment. I deleted one of my Local Users from the server and would now like to

  • A question about headphones of 8800

    Is the sound come out from the right headphones ostiole always louder than the left? Mine does.

  • Dual displays in motion

    Hi - is it possible to separate the timeline from the canvas window, so that in a dual monitor setup the timeline can occupy the full height of one screen? have just switched for this very reason, and can't find a way of doing it! Thanks for any info

  • Rule based monitors (RZ20) - Show system availability for selected systems

    Hi experts, I'm new with CCMS monitoring and need an advice for rule based monitors. I want to create three alert monitors (RZ20). One for our developing systems, one for our quality assurance systems and one to monitor our production systems. For th

  • C++ runtime error opening Photoshop CS3

    Does anyone know how to fix this issue?  We have re-imaged our school district computers with XP SP3 and IE7, and now on Creative Suite CS3, only Photoshop will not open as a student. It opens only as administrator and students get the runtime error.