JSPX neighbor column overwriting when content larger than column width

Hello,
When I have a table on a jspx page and the content of a column (let's call it "large_text") is larger than the column width, then the column(s) on the right-hand side are overwritten with the content of that large_text column.
How can I avoid this?
Thank you !
Andre

Hi Ric,
please see the code below. It's an ADF Faces table.
I have created this sample with no extra modifications, just as JDeveloper generates it out of the box. In this case, in the row with "United States of America" you can see that the text overwrites the next column. Even if I could widen the CountryName column in this special case, there are other cases where that is not possible because there is no more room on the screen.
And at that point this becomes a serious problem.
Andre
Code:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<%@ page contentType="text/html;charset=windows-1252"%>
<%@ taglib uri="http://java.sun.com/jsf/html" prefix="h"%>
<%@ taglib uri="http://java.sun.com/jsf/core" prefix="f"%>
<%@ taglib uri="http://xmlns.oracle.com/adf/faces/rich" prefix="af"%>
<f:view>
<af:document>
<af:form>
<af:table value="#{bindings.CountriesView1.collectionModel}" var="row"
          rows="#{bindings.CountriesView1.rangeSize}"
          first="#{bindings.CountriesView1.rangeStart}"
          emptyText="#{bindings.CountriesView1.viewable ? 'No rows yet.' : 'Access Denied.'}"
          fetchSize="#{bindings.CountriesView1.rangeSize}">
  <af:column sortProperty="CountryId" sortable="false"
             headerText="#{bindings.CountriesView1.hints.CountryId.label}">
    <af:inputText value="#{row.CountryId}" simple="true"
                  required="#{bindings.CountriesView1.hints.CountryId.mandatory}"
                  columns="#{bindings.CountriesView1.hints.CountryId.displayWidth}"/>
  </af:column>
  <af:column sortProperty="CountryName" sortable="false"
             headerText="#{bindings.CountriesView1.hints.CountryName.label}">
    <af:inputText value="#{row.CountryName}" simple="true"
                  required="#{bindings.CountriesView1.hints.CountryName.mandatory}"
                  columns="#{bindings.CountriesView1.hints.CountryName.displayWidth}"/>
  </af:column>
  <af:column sortProperty="RegionId" sortable="false"
             headerText="#{bindings.CountriesView1.hints.RegionId.label}">
    <af:inputText value="#{row.RegionId}"
                  required="#{bindings.CountriesView1.hints.RegionId.mandatory}"
                  columns="#{bindings.CountriesView1.hints.RegionId.displayWidth}">
      <af:convertNumber groupingUsed="false"
                        pattern="#{bindings.CountriesView1.hints.RegionId.format}"/>
    </af:inputText>
  </af:column>
</af:table>
</af:form>
</af:document>
</f:view>
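
One way to avoid the overflow (a sketch only, not tested against your exact JDeveloper version; the width and columns values below are assumptions to adjust to your layout) is to give the affected column an explicit width and a fixed field size, so the cell no longer forces itself over its neighbour:
<!-- sketch: explicit column width plus a fixed field size -->
<af:column sortProperty="CountryName" sortable="false" width="150"
           headerText="#{bindings.CountriesView1.hints.CountryName.label}">
  <af:inputText value="#{row.CountryName}" simple="true"
                required="#{bindings.CountriesView1.hints.CountryName.mandatory}"
                columns="20"/>
</af:column>
Replacing the displayWidth hint with a fixed columns value keeps the input field itself from growing wider than the column.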

Similar Messages

  • I got CS4 software, but Photoshop is missing from the Start menu and not listed on the installation list

    I got the CS4 software. When I installed it the first time, I chose the normal installation content. Then I found that there was no Photoshop on the Start menu. After that I put the disc into the notebook again and found that there is no Photoshop listed on the "installation list".

    Download the CS4 Design Premium installation files and install PS using them.
    Downloadable installation files available:
    Suites and Programs:  CC 2014 | CC | CS6 | CS5.5 | CS5 | CS4, CS4 Web Standard | CS3
    Acrobat:  XI, X | 9,8 | 9 standard
    Premiere Elements:  13 | 12 | 11, 10 | 9, 8, 7 win | 8 mac | 7 mac
    Photoshop Elements:  13 |12 | 11, 10 | 9,8,7 win | 8 mac | 7 mac
    Lightroom:  5.7.1| 5 | 4 | 3
    Captivate:  8 | 7 | 6 | 5.5, 5
    Contribute:  CS5 | CS4, CS3
    Download and installation help for Adobe links
    Download and installation help for Prodesigntools links is listed on most linked pages. Those steps are critical, especially steps 1, 2 and 3. If you click a link that does not have those steps listed, open a second window using the Lightroom 3 link to see those 'Important Instructions'.

  • UTL_FILE write_error when writing large binary files to Unix OS

    I am trying to write large files to a folder in Unix from a table containing a BLOB object. The procedure below is called by another procedure I have written to do this. It works fine in a Windows environment with files up to 360MB. When I run this exact same procedure in Unix I get an initialization error. When I change the WB in the fopen call to W, it works: I can store all the files I want up to 130MB in size. The next size larger file I have is 240MB, and it fails after writing the first 1KB, raising the utl_file.write_error message. If someone can help me to diagnose the problem, I would really appreciate it. I have been trying everything I can think of to get this to work.
    Specifics: the Windows version is 10gR2; on Unix we are running 9iR2 on Sun Solaris 9.
    PROCEDURE writebin(pi_file_name IN VARCHAR2, pi_file_url IN VARCHAR2, pi_file_data IN BLOB)
    IS
      v_file_ref      utl_file.file_type;
      v_lob_size      NUMBER;
      v_raw_max_size  CONSTANT NUMBER := 32767;
      v_buffer        RAW(32767);
      v_buffer_offset NUMBER := 1; -- position in stream
      v_buffer_length NUMBER;
    BEGIN
      -- WB used in Windows environment. W used in Unix
      v_lob_size := dbms_lob.getlength(pi_file_data);
      v_file_ref := utl_file.fopen(pi_file_url, pi_file_name, 'WB', v_raw_max_size);
      v_buffer_length := v_raw_max_size;
      WHILE v_buffer_offset < v_lob_size
      LOOP
        IF v_buffer_offset + v_raw_max_size > v_lob_size THEN
          v_buffer_length := v_lob_size - v_buffer_offset;
        END IF;
        dbms_lob.read(pi_file_data, v_buffer_length, v_buffer_offset, v_buffer);
        utl_file.put_raw(v_file_ref, v_buffer, TRUE);
        v_buffer_offset := v_buffer_offset + v_buffer_length;
      END LOOP;
      utl_file.fclose(v_file_ref);
    END writebin;

    Check if this sample code helps:
    CREATE OR REPLACE PROCEDURE prc_unload_blob_to_file IS
      vlocation      VARCHAR2(16) := 'LOB_OUTPUT';
      vopen_mode     VARCHAR2(16) := 'w';
      bimax_linesize NUMBER := 32767;
      v_my_vr        RAW(32767);
      v_start_pos    NUMBER := 1;
      v_output       utl_file.file_type;
    BEGIN
      FOR cur_lob IN (SELECT vmime_type,
                             blob_resim,
                             vresim,
                             dbms_lob.getlength(blob_resim) len
                        FROM tcihaz_resim a
                       WHERE rownum < 3 -- for test purposes
                       ORDER BY a.nresim_id) LOOP
        v_output := utl_file.fopen(vlocation, cur_lob.vresim, vopen_mode, bimax_linesize);
        dbms_output.put_line('Column length: ' || to_char(cur_lob.len) || ' for file: ' ||
                             cur_lob.vresim);
        v_start_pos := 1;
        IF cur_lob.len < bimax_linesize THEN
          dbms_lob.read(cur_lob.blob_resim, cur_lob.len, v_start_pos, v_my_vr);
          utl_file.put_raw(v_output, v_my_vr, autoflush => TRUE);
          dbms_output.put_line('Finished Reading and Flushing ' || to_char(cur_lob.len) ||
                               ' Bytes' || ' for file: ' || cur_lob.vresim);
        ELSE
          dbms_lob.read(cur_lob.blob_resim, bimax_linesize, v_start_pos, v_my_vr);
          utl_file.put_raw(v_output, v_my_vr, autoflush => TRUE);
          dbms_output.put_line('Finished Reading and Flushing ' || to_char(cur_lob.len) ||
                               ' Bytes' || ' for file: ' || cur_lob.vresim);
        END IF;
        v_start_pos := v_start_pos + bimax_linesize;
        -- loop till the entire data is fetched (the original compared v_start_pos
        -- against bimax_linesize, so this loop never ran for LOBs over one buffer)
        WHILE (v_start_pos < cur_lob.len) LOOP
          dbms_lob.read(cur_lob.blob_resim, bimax_linesize, v_start_pos, v_my_vr);
          utl_file.put_raw(v_output, v_my_vr, autoflush => TRUE);
          dbms_output.put_line('Finished Reading and Flushing ' ||
                               to_char(bimax_linesize + v_start_pos - 1) || ' Bytes' ||
                               ' for file: ' || cur_lob.vresim);
          v_start_pos := v_start_pos + bimax_linesize;
        END LOOP;
        utl_file.fclose(v_output);
        dbms_output.put_line('Finished successfully and file closed');
      END LOOP;
    END prc_unload_blob_to_file;
    set serveroutput on
    set timing on
    create or replace directory LOB_OUTPUT as '/export/home/oracle/tutema/';
    GRANT ALL ON DIRECTORY LOB_OUTPUT TO PUBLIC;
    exec prc_unload_blob_to_file;
    Column length: 3330 for file: no_image_found.gif
    Finished Reading and Flushing 3330 Bytes for file: no_image_found.gif
    Finished successfully and file closed
    Column length: 10223 for file: OT311.gif
    Finished Reading and Flushing 10223 Bytes for file: OT311.gif
    Finished successfully and file closed
    PL/SQL procedure successfully completed
    With 9iR2, PL/SQL can write binary files using the UTL_FILE put_raw function; prior to Oracle 9iR2 you would need to create an external procedure with Java, C, VB or some other 3GL language.
    Some references -
    http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:6379798216275
    Oracle® Database PL/SQL Packages and Types Reference 10g Release 2 (10.2)
    UTL_FILE - http://download-uk.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#sthref14095
    http://psoug.org/reference/dbms_lob.html
    Metalink Note:70110.1, Subject: WRITING BLOB/CLOB/BFILE CONTENTS TO A FILE USING EXTERNAL PROCEDURES

  • Problem with "Insert" and "Overwrite Sequence Content"

    I'm working with XDCAM footage. I like to take all the individual clips recorded on the XDCAM disc and, after getting them into FCP, put them all in a sequence in FCP, then use that sequence as the source in the Viewer. This allows for quick scanning of all the clips, much as if it were a digitized tape.
    I recently found the "Insert Sequence Content" and "Overwrite Sequence Content" commands in FCP, and like them in that they actually put the individual clips into my project timeline, and not just the combined sequence (which looks more or less like a nest when dropped into my project sequence).
    Here's the problem: When I put IN and OUT points in the timeline, and "Insert or Overwrite Sequence Content" using the sequence containing all the clips, the video tends to be contained to between the IN and OUT points I set in the TIMELINE, but the audio tracks tend to expand past the OUT point I set in the TIMELINE, and I can't figure out why, or how to get around this.
    Any assistance would be appreciated.

    If you're inserting or overwriting, you're basically pasting the content that you've chosen. I wouldn't think the out point would be recognized, or even wanted. If you copied 5 minutes of clips and inserted it into a sequence with a 4 minute in/out point duration, do you want to only have the first 4 minutes inserted into the sequence? If you only place an in point, it will insert all the clips, starting at that point, and it will take up as much time as the clip content's duration.
    Am I missing something here?

  • Error when opening large data forms

    Hi,
    We are working on a Workforce planning implementation. We have 2 large custom defined dimensions.
    When opening large data forms we get a standard "Error has occurred" error. If we reduce the member selection the data form opens fine.
    Is anyone aware of a setting that can be increased to open large data forms? I'm not referring to the "Warn if data form is larger than 3000 cells" setting.
    I'm pretty sure there is a parameter that can be increased, but I can't find it.
    Thanks for your help.
    Seb

    Hi Seb,
    If you do find the magic parameter then let us know because I would be interested to know.
    It is probably something like ALLOW_LARGE_FORMS = true :)
    In the Planning logs, is the error related to Planning or is it an Essbase-related error?
    Is it failing due to the number of rows, or because it is going beyond the max of 256 columns?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Update Item when Content Type is changed in the browser

    I feel like I might be missing something obvious here - I set up a number of managed content types.  While editing a list item with multiple content types, when a user changes to a different content type using the drop down at the top of the page,
    the edits are not saved.  Is there any way to force an update when the user changes content type?  Clicking save returns the user to the list view which means the user needs to edit it again, and navigate to the content type.

    Hi Steve,
    Per my understanding, you want to update the item with the input values when the content type is changed by selecting an option in the "Content Type" drop-down menu on the EditForm.aspx page.
    A content type is a set of predefined columns. From the perspective of using content types in a list, it is easier to maintain each record with only one content type per item; this is also the recommended behavior when working with multiple content types in a list.
    If you want to make use of columns from multiple content types in one item, a suggestion is to create a custom content type with the columns you want from the other content types, then add it to your list.
    Thanks 
    TechNet Community Support

  • I have a problem with Apple store when installing large application: The system crashes. Any suggestion?

    I have a problem with the App Store when installing a large application (for example Xcode): the system crashes.
    It began with 10.8; usually after a few tries it worked, but with Xcode 4.6 no success.
    Any suggestion?

    If you have more than one user account, you must be logged in as an administrator to carry out these instructions.
    Launch the Console application in any of the following ways:
    ☞ Enter the first few letters of its name into a Spotlight search. Select it in the results (it should be at the top.)
    ☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the key combination shift-command-U. The application is in the folder that opens.
    ☞ Open LaunchPad. Click Utilities, then Console in the icon grid.
    Select the most recent panic log under System Diagnostic Reports. Post the entire contents — the text, please, not a screenshot. In the interest of privacy, I suggest you edit out the “Anonymous UUID,” a long string of letters, numbers, and dashes in the header and body of the report, if it’s present (it may not be.) Please don't post shutdownStall, spin, or hang reports.

  • Unable to write CLOB value larger than approx. 97KB

    Hello,
    I am writing a CLOB value using the Java JDBC bridge (9.0.1) to the DB (8.1.6). I managed to write CLOB values OK, but when I try to write text larger than 97KB it throws the following error:
    java.io.IOException: ORA-00600: internal error code, arguments: [kdlseek-kgbtnscb], [], [], [], [], [], [], []
    ORA-06512: at "SYS.DBMS_LOB", line 708
    ORA-06512: at line 1
    at oracle.jdbc.dbaccess.DBError.SQLToIOException(DBError.java:618)
    at oracle.jdbc.driver.OracleClobWriter.flushBuffer(OracleClobWriter.java:201)
    at oracle.jdbc.driver.OracleClobWriter.flush(OracleClobWriter.java:161)
    at com.trask.edoceo.sql.OracleConnection$OraclePreparedStatement.getCLOB(OracleConnection.java:103)
    at com.trask.edoceo.sql.OracleConnection$OraclePreparedStatement.setCharacterStream(OracleConnection.java:85)
    at ...
    I use the PreparedStatement function setClob, and for writing I create the CLOB object using the following method:
    private CLOB getCLOB(String text, Connection conn) throws SQLException {
        CLOB tempClob = null;
        try {
            // If the temporary CLOB has not yet been created, create new
            tempClob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
            // Open the temporary CLOB in readwrite mode to enable writing
            tempClob.open(CLOB.MODE_READWRITE);
            // Get the output stream to write
            Writer tempClobWriter = tempClob.getCharacterOutputStream();
            // Write the data into the temporary CLOB
            tempClobWriter.write(text);
            // Flush and close the stream
            tempClobWriter.flush();
            tempClobWriter.close();
            // Close the temporary CLOB
            tempClob.close();
        } catch (SQLException sqlexp) {
            tempClob.freeTemporary();
            sqlexp.printStackTrace();
        } catch (Exception exp) {
            tempClob.freeTemporary();
            exp.printStackTrace();
        }
        return tempClob;
    }
    Below the 97KB boundary everything works fine. Is the limit hidden in the JDBC drivers (I tried the newer drivers, but they throw the END OF TNS DATA CHANNEL error) or somewhere in the Oracle DB settings?
    I am using JDK 1.3.1 on Win2000, and the Oracle DB resides on another machine than the application.
    I tried to append the error dump from the DB, but it has 1.5MB of text :-(
    Thanks for your help.
    Jan Antos.
    [email protected]

    This forum is meant for discussions about OTN content/site and services.
    Questions about Oracle products and technologies will NOT be answered in this forum. Please post your product or technology related questions in the appropriate product or technology forums, which are monitored by Oracle product managers.
    Product forums:
    http://forums.oracle.com/forums/index.jsp?cat=9
    Technology forums:
    http://forums.oracle.com/forums/index.jsp?cat=10
    As a general guideline, please first search the forum to see if your question is already answered. You will find answers for the most frequently asked questions by simply searching the forum. This will help you to find the answer right away and will save time for all of us.

  • Text items larger than 32K

    Greetings all
    On our project we are using the RTE inside Portal to allow our content managers to edit the contents of the page. This works fine in version 9, but in version 10 we have a small problem. While the information is saved correctly inside the database when it's larger than 32K in length, and it shows correctly on the page, the item can't be modified anymore, since you get a "character string too long" error.
    After logging a TAR on Metalink, they confirmed the bug is related to this one: Bug 2346039 - TASK: TEXT ITEMS CANNOT EXCEED 32K IN SIZE. The biggest problem at this time for our content managers is that they have no clue at all that they are entering an item that is too large to edit, since everything is saved correctly in the tables.
    We've added some JavaScript code to the buildUIembed.html file so that the users get a popup window that tells them the item is too large, but until now we have been unable to stop Oracle from saving the item in the database.
    Does anyone know a way to stop the default action of Oracle, so that people can copy/paste their work to a safe spot and split it in two to avoid this bug?
    Best regards
    Johan Van Volsem

    Hello Christian,
    Thank you for your response, but when you use the rich text editor, you don't use text items (Oracle does internally), because the item of over 32K is nicely stored in the table without any errors. If we don't use the RTE, then when you save the item the browser returns an error; the RTE doesn't, which makes it more confusing for content managers, since they don't even know something is wrong when they save the item.
    Hence Oracle has added my bug from the TAR to this ER.
    Thanks for your input anyhow.
    Regards
    Johan

  • DSS problems when publishing large amounts of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items at a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms). So that is one item on the DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS become unstable if loaded too much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on DSS? (see attached picture)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Profesional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat and it's
    > fantastic what it is supposed to do for a developer, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at
    VISA.
    Some programs have some issues on serial adapters but run fine on a
    regular serial port. We've had that problem recently.
    Best, Mark

  • USB key ejecting itself when transferring large files

    I have an LG 16GB USB key formatted as Mac OS Extended (Journaled). I've been encountering a particular issue when I take files from another computer and copy them from the key to my MacBook Pro (OS X 10.6.2): after it starts transferring, the key ejects itself with the "the disk was not ejected properly" message, then reappears right away in the Finder...
    The problem doesn't present itself when I copy a file from my MacBook Pro to the key and then try transferring it back to my computer. And I've tried this with both small files (under 200MB) and large files (over 4GB).
    I've also noticed the transfer rate is very slow when this happens.
    So far the problem only presents itself when I transfer files from one of the Mac Pros in my university's studios, but since I work a lot on these machines, this is becoming quite a problem. I don't have the complete specs of these machines; they all run Leopard, probably 10.5.something, not SL... perhaps that's the problem?
    I've seen similar posts on the forum concerning this issue with Time Machine backup drives, suggesting it had something to do with the sleep preferences; however, I've tried all the potential solutions from these posts and the problem persists.
    So far the only solution I've found has been to transfer the files to my girlfriend's MacBook (running Tiger), and then transfer the files in target mode from her computer to mine... quite inconvenient...
    I have a feeling this might be due to the formatting of my key. I have a 2GB USB key formatted as MS-DOS (FAT32) and have had no problems with it so far. The reason I formatted in Mac OS Extended is simple: I work in the video field, and with HD I often find myself moving around single files larger than 4GB, which I've come to understand isn't possible with FAT32.
    I'd like to know how to resolve this, and especially whether it is indeed a format issue, since I'm soon going to acquire a new external hard drive for the sole purpose of storing my increasingly large media files and would like to know how to format it.

    Hi,
    I have an external USB card reader (indeed, two different readers), which has the same problem with self-ejecting disks. Every transfer of data from my camera's SD card is a pain now. There are many different threads related to the "the disk was not ejected properly" situation, but no working solution yet.

  • ADF how to display a processing page when executing large queries

    The ADF application that I have written currently has the following structure:
    DataPage (search.jsp) that contains a form where the user enters their search criteria --> forward action (doSearch) --> DataAction (validate) that validates the inputted values --> forward action (success) --> DataAction (performSearch) that has a refresh method dragged onto it, and an action that manually sets the iterator for the collection to -1 --> forward action (success) --> DataPage (results.jsp) that displays the results of the then (hopefully) populated collection.
    I am not using a database, I am using a java collection to hold the data and the refresh method executes a query against an Autonomy Server that retrieves results in XML format.
    The problem that I am experiencing is that sometimes a user may submit a query that is very large and this creates problems because the browser times out whilst waiting for results to be displayed, and as a result a JBO-29000 null pointer error is displayed.
    I have previously got round this using Java servlets, whereby when a processing servlet is called, it automatically redirects the browser to a processing page with an animation on it so that the user knows something is being processed. The processing page then recalls the servlet every 3 seconds to see if the processing has been completed and, if it has, forwards to the appropriate results page.
    Unfortunately I cannot stop users entering large queries, as the system requires users to be able to search in excess of 5 million documents on a regular basis.
    I'd appreciate any help/suggestions that you may have regarding this matter as soon as possible so I can make the necessary amendments to the application prior to its pilot in a few weeks time.

    Hi Steve,
    After a few attempts - yes, I have hit a few snags.
    I'll send you a copy of the example application that I am working on but this is what I have done so far.
    I've taken a standard application that populates a simple java collection (not database driven) with the following structure:
    DataPage --> DataAction (refresh Collection) -->DataPage
    I have then added this code to the (refreshCollectionAction) DataAction
    protected void invokeCustomMethod(DataActionContext ctx)
    {
        super.invokeCustomMethod(ctx);
        HttpSession session = ctx.getHttpServletRequest().getSession();
        Thread nominalSearch = (Thread) session.getAttribute("nominalSearch");
        if (nominalSearch == null)
        {
            synchronized (this)
            {
                // create new instance of the thread
                nominalSearch = new ns(ctx);
            } // end of synchronized wrapper
            session.setAttribute("nominalSearch", nominalSearch);
            session.setAttribute("action", "nominalSearch");
            nominalSearch.start();
            System.err.println("started thread calling loading page");
            ctx.setActionForward("loading.jsp");
        }
        else
        {
            if (nominalSearch.isAlive())
            {
                System.err.println("trying to call loading page");
                ctx.setActionForward("loading.jsp");
            }
            else
            {
                System.err.println("trying to call results page");
                ctx.setActionForward("success");
            }
        }
    }
    Created another class called ns.java:
    package view;
    import oracle.adf.controller.struts.actions.DataActionContext;
    import oracle.adf.model.binding.DCIteratorBinding;
    import oracle.adf.model.generic.DCRowSetIteratorImpl;
    public class ns extends Thread
    {
        private DataActionContext ctx;
        public ns(DataActionContext ctx)
        {
            this.ctx = ctx;
        }
        public void run()
        {
            System.err.println("START");
            DCIteratorBinding b = ctx.getBindingContainer().findIteratorBinding("currentNominalCollectionIterator");
            ((DCRowSetIteratorImpl) b.getRowSetIterator()).rebuildIteratorUpto(-1);
            //b.executeQuery();
            System.err.println("END");
        }
    }
    and added a loading.jsp page that calls a new DataAction called processing every second. The processing DataAction has the following code within it:
    package view;
    import javax.servlet.http.HttpSession;
    import oracle.adf.controller.struts.actions.DataForwardAction;
    import oracle.adf.controller.struts.actions.DataActionContext;
    public class ProcessingAction extends DataForwardAction
    {
        protected void invokeCustomMethod(DataActionContext actionContext)
        {
            super.invokeCustomMethod(actionContext);
            HttpSession session = actionContext.getHttpServletRequest().getSession();
            String action = (String) session.getAttribute("action");
            if (action.equalsIgnoreCase("nominalSearch"))
            {
                actionContext.setActionForward("refreshCollection.do");
            }
        }
    }
    I'd appreciate any help or guidance that you may have on this as I really need to implement a generic loading page that can be called by a number of actions within my application as soon as possible.
    Thanks in advance for your help
    David.

  • Please solve: radio buttons (when selected) will then display a column on the report

    Each of these new radio buttons (when selected) will then display a column on the report in the same location, which is after the existing columns.
    Check boxes are checked by default.
    When the 1st radio button is selected, a new column will be displayed in the ALV output next to A.
    Likewise, when the 2nd radio button is selected, a new column will be displayed in the ALV output next to B.
    Likewise the ALV output is displayed.
    A   is a checkbox
    B  is a checkbox
    C   is a checkbox
    D   is a checkbox
    1   is a radio button                         
    2   is a radio button      
    3  is a radio button
    4   is a radio button

    While populating the field catalog itself, include this condition:
    if pradiobutton = 'X'.
    *-- populate the field catalog entry for the field you want, in the specified position
    endif.
    Similarly, proceed for the other fields and the other conditions.
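    For illustration, here is a minimal sketch of that idea using the classic SLIS field catalog; the radio button, field names, output table and position below are hypothetical placeholders to adapt to the actual report:
    TYPE-POOLS: slis.
    DATA: ls_fcat TYPE slis_fieldcat_alv.
    * extra column only when the 1st radio button is selected
    IF p_rad1 = 'X'.
      CLEAR ls_fcat.
      ls_fcat-fieldname = 'NEW_COL1'.   " hypothetical field in the output table
      ls_fcat-tabname   = 'GT_OUTPUT'.  " hypothetical output table
      ls_fcat-seltext_m = 'New column 1'.
      ls_fcat-col_pos   = 2.            " place it right after column A
      APPEND ls_fcat TO gt_fieldcat.
    ENDIF.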

  • How to save a JPEG larger than 30,000 px?

    Hi everyone,
    I'm new to Photoshop, so this is probably very basic for most of you. I just wonder what settings or options I need if I want to save my .PSB as a JPEG and avoid reducing the size down to 30,000 px.
    Actually I do not care if some editors won't be able to handle JPEGs larger than 30,000 px.
    I just want to produce one with 39,000 px, and can't imagine PS won't allow me to do that.
    So what's the trick?
    Please help! Thank you!

    According to Wikipedia (http://en.wikipedia.org/wiki/JPEG)
    It supports a maximum image size of 65535×65535.
    So it seems you are right that Adobe's limit is not inherent to the format.
    You could save a tif and open that in Preview (as you don’t state your OS I can just act as if you were a Mac-user) and save the jpg from there.
    Unfortunately the 30,000-pixel restriction seems to also apply in Photoshop when opening files and in InDesign when placing them, so you may not be able to use that JPEG in an Adobe workflow.
    Why do you want the image as a jpg anyway?

  • "CASE WHEN 1=0 THEN.......ELSE....END"??

    This might be a very basic question, but I'm still confused about it.
    Can you please explain "CASE WHEN 1=0 THEN ... ELSE ... END" to me?
    Where and why would you use this statement?
    Help appreciated.

    First let's break down what it means. Since 1 is never equal to 0, OBI will ignore what comes immediately after THEN and execute what comes after the ELSE.
    So why would we want to do this? Well, often we want to perform some action that has nothing to do with a particular column in our subject area. By using this CASE statement, we trick OBI into thinking we are using a column for a calculation or action, but in effect, the column is unchanged.
    For example, suppose you were building a dashboard prompt. You want to use a column twice in your dashboard prompt, say an ACCOUNT_OPEN_DATE column to get a range of dates. You cannot use the same column twice in a dashboard prompt. (Try it, OBI will ignore your attempt to put the same column twice into your workspace.)
    So what do you do? First you get a column (in this example, I'm using a CHAR column) like Branch Name. It doesn't matter what column you use. The CHAR is used so the syntax makes sense.
    You move the column to your workspace. Now you click on the Edit Formula and you type CASE WHEN 1=0 THEN Organization."Branch Name" ELSE 'TEST' END.
    So in this case, the values of Branch Name are unaffected and the dummy column just represents the word 'TEST.' Now in the "Show" part of the prompt, we switch to "SQL Results" and type SELECT "Account Attributes".Account_Open_Date FROM Subject Area and set this to a PV called StartDate.
    Because the column is a dummy, you can actually call it again. Use the same CASE statement to make it a "dummy" column. Then use the same SQL you used above, but this time save it to a PV called EndDate.
    Now you have two date values you can use on your ACCOUNT_OPEN_DATE column to get a range of dates.
    So, in conclusion, normally you choose a column from your subject area, because you need the values of that column in your report. But when you just need a column to do something unrelated to any column, you use the CASE 1=0 to make it a "dummy" column and then you can perform your action.
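    To make the steps concrete, here is a minimal sketch of the two formulas described above, using this example's own column names (substitute your subject area's columns):
    -- Edit Formula for the dummy prompt column:
    CASE WHEN 1=0 THEN Organization."Branch Name" ELSE 'TEST' END
    -- "Show" SQL Results for the first prompt, saved to the PV StartDate:
    SELECT "Account Attributes".Account_Open_Date FROM "Subject Area"
    -- Repeat both steps with a second dummy column and save the result to the PV EndDate.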
    HTH,
