Using BI Beans or ADF DVT for a BI project

Hi everybody,
I'm new to the Oracle world. I installed Oracle 11gR2 and used OWB to create some cubes. My problem is that I want to write a web application in JDeveloper that can connect to a cube and run queries against it. I am using Oracle JDeveloper 11gR1.
I don't know how to connect to the OWB OLAP cubes from JDeveloper.
I read somewhere that BI Beans is meant for this, but BI Beans does not work in JDeveloper 11, and there is no component in the JDeveloper 11 ADF DVT set for connecting to cubes.
If anybody has any ideas, please help me.
Thanks
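
For illustration, since BI Beans is not available in JDeveloper 11g, one approach is to query the cube relationally (for example through cube views exposed in the database) and feed the rows to ADF or ADF DVT components. Below is a minimal JDBC sketch under that assumption; the connection details and the view name units_cube_view are hypothetical.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CubeQuery {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; replace with your own database.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:orcl", "olap_user", "password");
        try {
            Statement stmt = conn.createStatement();
            // Hypothetical relational view exposing the cube's dimensions and measure.
            ResultSet rs = stmt.executeQuery(
                    "SELECT time_id, product_id, sales FROM units_cube_view");
            while (rs.next()) {
                System.out.println(rs.getString(1) + " "
                        + rs.getString(2) + " " + rs.getDouble(3));
            }
        } finally {
            conn.close();
        }
    }
}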

I have 2 TARs currently open...
1. 2865331.995
This one is about my OLAP catalog becoming invalid when I run the CWM2 stored procedures to create metadata. Bug 2858427 is associated with it.
2. 2904033.996
This one is about poor performance when the metadata has been created using OEM. I have created my metadata and OLAP materialized views, but they are not being used by the executed SQL.
Regards
Dylan.

Similar Messages

  • Can you use SQL as a data source for a project in the same way you can in Excel?

    Excel allows you to create a data source that executes a SQL stored procedure, display that data as a table in a spreadsheet, and have that data automatically refresh each time you open the spreadsheet. Is it possible to do the same thing in MS Project, displaying the data from the stored procedure as a series of tasks?
    Here's what I'm trying to do: I have a stored procedure that pulls task data meeting specific criteria from all projects in Project Server. We're currently displaying this data as an Excel report. However, the data includes start dates and durations, so it would be nice to be able to display it as a Gantt chart. I've played around with creating a Gantt chart in Excel and have been able to do a very basic one, but it doesn't quite fit our needs.

    No, you cannot use SQL as a data source for a project.
    You have 3 options to achieve it:
    1. You can create a SharePoint list with the desired columns, fill that list with the desired data, and then create an MS Project plan from the SharePoint list.
    2. You can create an SSRS report in which you display a Gantt chart; Joe has given you that link.
    3. You can write a macro in MS Project which takes data from your Excel workbook, where Excel fetches the data from your stored procedure. Either schedule it to run every day to update your data, or create an Excel report that updates automatically and write a macro in MS Project that fetches the data and then publishes it so that it is available to team members.
    kirtesh

  • Helvetica font for a project

    I need to use a full Adobe Helvetica version for a project. The system will not let me load it because it uses some Helveticas for screen display. How can I change the system font so I can use the Helvetica I need?

    We're both getting into scary territory here, but...
    Beginning with Leopard, 10.5, Apple has made it nearly impossible to remove critical fonts. If you attempt to remove protected fonts from the /System/Library/Fonts/ folder, the OS will tell you that you cannot remove the font(s) and will replace them from copies kept in another location, sometimes immediately, at other times after you have restarted. Even if it doesn't immediately replace Lucida Grande, the OS continues to use it from the protected location, preventing you from losing control of your Mac. There are quite a few fonts you can still remove from the /System/Library/Fonts/ folder, but others resurrect themselves if removed. See section five of the page below on how to permanently remove Apple's supplied versions of Helvetica and Helvetica Neue in Leopard, 10.5, or Snow Leopard, 10.6, if this is important to you.
    http://designskeywest.com/fonts_osx.html

  • 10,000 RPM HD for LP9 projects?

    Hi,
    Would using a 10,000 RPM HD for LP9 projects help improve the overall performance/responsiveness of the system, compared to a 7,200 RPM HD, when working with Logic Pro 9 projects?
    Thanks.

    Thanks,
    I will be working with a rather large number of audio tracks, mainly audio stems recorded into LP9 from slave PCs hosting VST instruments, which I would then need to edit, mix, and process in LP9.
    I guess having a fast drive (10K RPM) would make audio work snappier when editing?

  • JSF - Best Practice For Using Managed Bean

    I want to discuss what the best practice is for managed bean usage, especially whether to use session scope or request scope to build database-driven pages.
    ---- Session Bean ----
    - In the book Core JavaServer Faces, the author mentioned that in most cases a session bean should be used, unless the processing is passed on to another handler. Since JSF can store the state on the client side, I think storing everything in the session is not a big memory concern (can some expert confirm this is true?). Session objects are easy to manage and state can be shared across pages, which makes programming easy.
    In the case of a page bound to a result set, the bean usually holds a java.util.List for the result, which is initialized in the constructor by querying the database first. However, this approach has a problem: when the user navigates to another page and comes back, the data is not refreshed. You can of course solve the problem by issuing the query every time in your getXXX method, but then you need to be very careful not to bind this XXX property too many times. If you query in getXXX, setXXX is also tricky because you don't have a member to set. You usually don't want to persist the result set changes in setXXX, as the changes may not be final; instead, you want to handle them in an action listener (like a save(ActionEvent)). A sketch of this re-querying getter pattern follows at the end of this thread.
    I would be glad to hear your thoughts on this.
    --- Request Bean ---
    A request bean is initialized every time a request is made. It sometimes drove me nuts because JSF does not seem to be very consistent in updating model values. Suppose you have a page showing a parent with a list of child records from the database, and you also allow the user to edit the children directly. If I bind the parent to a bean called #{Parent} and bind the children to an ADF table (value="#{Parent.children}" var="rowValue"), and I set Parent to request scope, the setChildren method is never called when I submit the form. I am not sure if this is specific to ADF or a general JSF problem, but if you change the bean to session scope, everything works fine.
    I believe JSF doesn't update the bindings for all component attributes; it only updates the value bindings of input components. Can someone please verify that this is true?
    In many cases, I found a request bean very hard to work with if there are lots of updates (I had lots of trouble updating the binding value for rendered attributes).
    However, a request bean works fine for read-only pages and simple bound forms. It definitely frees up memory sooner than a session bean.
    ----- any comments or opinions are welcome!!! ------

    I think it should be either Option 2 or Option 3.
    Option 2 would be necessary if the bean data depends on some request parameters.
    (Example: Getting customer bean for a particular customer id)
    Otherwise Option 3 seems the reasonable approach.
    But, I am also pondering on this issue. The above are just my initial thoughts.
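
    As an illustration of the "query inside getXXX" pattern discussed above, here is a minimal sketch of a session-scoped bean. The class name and the simulated data source are hypothetical; a real bean would call its DAO or an ADF binding instead.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // A session-scoped bean (registered in faces-config.xml) whose getter lazily
    // (re)loads its data instead of querying once in the constructor.
    public class CustomerListBean {

        private List<String> customers;   // cached result, cleared by refresh()

        public List<String> getCustomers() {
            if (customers == null) {
                customers = queryCustomers();   // runs at most once per reload
            }
            return customers;
        }

        // Action method (e.g. wired to a save or navigation button) that forces
        // the next access to the property to reload the data.
        public String refresh() {
            customers = null;
            return null;   // stay on the same view
        }

        private List<String> queryCustomers() {
            // Placeholder for the real database query or ADF binding call.
            return new ArrayList<String>(Arrays.asList("Acme", "Globex", "Initech"));
        }
    }

    Wiring refresh() to a save or navigation action means the next access to the property reloads the data, which avoids the stale-data problem described above while keeping the bean in session scope.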

  • Defining the path for the use of beans in a JSP

    Hello,
    I would like to use a bean in a JSP file, via the <jsp:useBean> directive.
    I didn't find any WebLogic property which could define the path to the directory containing my bean classes.
    So:
    1) I put the path of this directory in the WebLogic classpath in the WebLogic start script.
    2) I did the same with the Java classpath.
    Neither solution works at all.
    Any suggestion would be appreciated.
    Thanks in advance.
              

    We use \weblogic\myserver\serverclasses for beans.
    Of course, we pack the beans into jar files, and these jar files are included in the WLS classpath.
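
    For reference, here is a minimal sketch of a bean class usable from <jsp:useBean>; the package and class names are hypothetical. In a standard web application the compiled class is normally deployed under the web application's WEB-INF/classes directory (or in a jar under WEB-INF/lib) rather than only on the server-wide classpath, and a packaged (non-default-package) class is recommended.

    package com.example.beans;   // hypothetical package

    import java.io.Serializable;

    // Must be public with a public no-argument constructor so that
    // <jsp:useBean id="greeting" class="com.example.beans.GreetingBean" scope="session"/>
    // can instantiate it.
    public class GreetingBean implements Serializable {

        private String message = "Hello from a JSP bean";

        public GreetingBean() {
        }

        public String getMessage() {
            return message;
        }

        public void setMessage(String message) {
            this.message = message;
        }
    }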
              "Sylvain R." <[email protected]> wrote:
              >
              >Hello,
              >I would like to use a bean in a JSP file, via the <jsp:usebean> directive.
              >I didn't find any property (weblogic property) which could define the
              >path
              > to the directory of my beans classes.
              >So :
              >1) i have put the path of this directory in the weblogic classpath in
              >the weblogic
              >start script.
              >2) i made the same with the java classpath.
              >
              >Both solutions don't work at all.
              >
              >Any suggestion would be appreciated.
              >Thanks in advance.
              >
              

  • How can I use JS files in ADF for language translation?

    Hi,
    I have JS files for different languages and don't want to convert them to property files (resource bundle files). How can I use JS files in ADF for language translation?
    Thanks

    Hi ILya Cyclone,
    Thanks a lot for the reply. Can you tell me where I should include this in the jspx page?
    Step 1)
    I have the JS file as js/ifl_messages_US.js and I created a resource file entry as you mentioned: JS_FILE_PATH=js/ifl_messages_US.js
    Step 2)
    Then I added the entry in faces-config.xml for the resource file as follows:
    <resource-bundle>
    <base-name>resource_en.properties</base-name>
    <var>resource</var>
    </resource-bundle>
    <locale-config>
    <supported-locale>en</supported-locale>
    </locale-config>
    Step 3) This is my jspx page, in which a table is dynamically created on page load. Can you help me with where I should enter "#{resource.JS_FILE_PATH}"?
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
    xmlns:f="http://java.sun.com/jsf/core"
    xmlns:h="http://java.sun.com/jsf/html"
    xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
    <jsp:directive.page contentType="text/html;charset=UTF-8"/>
    <f:view>
    <af:document id="d1">
    <af:messages id="m1"/>
    <af:resource type="javascript" source="/js/pdfFile.js"/>
    <af:form id="f1">
    <input type="hidden" name="checkRadio" id="checkRadio" value=""/>
    <af:panelGroupLayout id="pgl1" halign="left" layout="vertical">
    <af:image source="/images/BRAND_IMAGE.gif" id="i1"/>
    </af:panelGroupLayout>
    <af:spacer width="10" height="10" id="s1"/>
    <af:table varStatus="rowStat" summary="table"
    value="#{backingBeanScope.DummyBean.collectionModel}"
    rows="#{backingBeanScope.DummyBean.collectionModel.rowCount}"
    rowSelection="none" contentDelivery="immediate" var="row"
    rendered="true" id="t1" styleClass="AFStretchWidth"
    binding="#{backingBeanScope.DummyBean.myTableBinding}"
    columnResizing="disabled">
    <af:column id="c2" headerText="Actions">
    <af:activeOutputText value="#{row.Actions}" id="aot2"/>
    <af:goLink text="#{row.Actions}" id="gl1"
    clientComponent="true" visible="false"/>
    <af:selectBooleanRadio text="" id="sbr1"
    valueChangeListener="#{backingBeanScope.DummyBean.checkselectbox}">
    <af:clientListener method="selectCheckBox" type="click"/>
    </af:selectBooleanRadio>
    </af:column>
    <af:forEach items="#{backingBeanScope.DummyBean.columnNames}" end="#{backingBeanScope.DummyBean.columnSize}"
    var="name" begin="1">
    <af:column sortable="false" sortProperty="#{name}"
    rowHeader="unstyled" headerText="#{name}"
    inlineStyle="width:100px;" id="c1">
    <af:activeOutputText value="#{row[name]}" id="aot1" escape="false">
    </af:activeOutputText>
    <!-- <af:outputFormatted value="#{row[name]}" id="of1"/>-->
    <!--<af:goLink text="goLink 1" id="gl1"
    destination="#{row.bindings.url.inputvalue}"/>-->
    </af:column>
    </af:forEach>
    </af:table>
    </af:form>
    </af:document>
    </f:view>
    </jsp:root>
    Thanks in advance
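
    One way to approach this, sketched below under assumptions: keep the per-locale script path in the bundle (note that <base-name> normally takes the bundle name without the .properties extension, e.g. resource_en) and resolve it from a backing bean so the page can reference it wherever the script is loaded. The bean and method names here are hypothetical.

    import java.util.Locale;
    import java.util.ResourceBundle;

    import javax.faces.context.FacesContext;

    public class JsLocaleBean {

        // Returns e.g. "js/ifl_messages_US.js" from the key defined in the bundle.
        public String getJsFilePath() {
            FacesContext ctx = FacesContext.getCurrentInstance();
            Locale locale = ctx.getViewRoot().getLocale();
            // "resource" is the assumed bundle base name
            // (i.e. resource_en.properties on the classpath for the en locale).
            ResourceBundle bundle = ResourceBundle.getBundle("resource", locale);
            return bundle.getString("JS_FILE_PATH");
        }
    }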

  • Using DVT for Anonymous user

    Hi,
    Can we use DVT for user interface customization for an anonymous user? I need to use DVT without a user login. Please let me know if there is any way of doing this.
    Regards,
    Prabu

    Hi Prabu,
    Not really, no. If the user is anonymous, the Portal framework has no way of persisting the changes he/she makes into the database, because they can't be associated with a unique user.
    If you really wanted to, you could automatically log in a predefined user (i.e., a user named "anonymous"). However, since everyone would be logged in as the same user, changes made by anyone would be seen by everyone, and users would constantly be undoing/redoing each other's changes.
    George

  • How to bind ADF table with a collection of elements using backing bean.

    Hi Experts,
    My JDev version is 11.1.1.6.0.
    I need to bind an ADF table to a collection of elements using a backing bean.
    My backing bean consists of 6 lists of strings, where each list represents a column of the table. How can I populate the entries of the table from these lists?
    Thanks
    Gopi

    Hi,
    Create an object representing a row (with setters/getters). Then have a list of these objects. Drag and drop the table and point its value to the list and its type to the row object.
    Frank
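
    A minimal sketch of this suggestion, with hypothetical names: the six existing column lists are combined into one list of row objects, and the af:table's value attribute points at that list.

    import java.util.ArrayList;
    import java.util.List;

    public class TableBackingBean {

        // One object per table row; each column of the af:table references one property.
        public static class Row {
            private final String col1, col2, col3, col4, col5, col6;

            public Row(String col1, String col2, String col3,
                       String col4, String col5, String col6) {
                this.col1 = col1; this.col2 = col2; this.col3 = col3;
                this.col4 = col4; this.col5 = col5; this.col6 = col6;
            }

            public String getCol1() { return col1; }
            public String getCol2() { return col2; }
            public String getCol3() { return col3; }
            public String getCol4() { return col4; }
            public String getCol5() { return col5; }
            public String getCol6() { return col6; }
        }

        // The table's value attribute points here, e.g.
        // value="#{backingBeanScope.tableBackingBean.rows}" var="row"
        // with columns referencing #{row.col1} ... #{row.col6}.
        public List<Row> getRows() {
            List<String> c1 = getList1(), c2 = getList2(), c3 = getList3(),
                         c4 = getList4(), c5 = getList5(), c6 = getList6();
            List<Row> rows = new ArrayList<Row>();
            for (int i = 0; i < c1.size(); i++) {
                rows.add(new Row(c1.get(i), c2.get(i), c3.get(i),
                                 c4.get(i), c5.get(i), c6.get(i)));
            }
            return rows;
        }

        // Placeholders for the six existing lists of strings from the question.
        private List<String> getList1() { return new ArrayList<String>(); }
        private List<String> getList2() { return new ArrayList<String>(); }
        private List<String> getList3() { return new ArrayList<String>(); }
        private List<String> getList4() { return new ArrayList<String>(); }
        private List<String> getList5() { return new ArrayList<String>(); }
        private List<String> getList6() { return new ArrayList<String>(); }
    }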

  • Can anyone describe using entity beans for persistence in a software architecture?

    Can anyone describe using entity beans for persistence in a software architecture you built for a product?

    Although this forum is supposed to help you gain knowledge of entity beans, it's not a free ride where everyone regurgitates their knowledge to you.
    Read something first and then clarify your thoughts with this forum.

  • Problem in Master Detail form when using ADF table for Detail

    hi,
    JDev version: 11.1.2.1.0
    I have created a master-detail form by dragging a data control as an ADF Master Form, Detail Table.
    Now when I create a new row in the detail table using the CreateInsert button, a blank new row is created at the top of the detail table, and the other rows still show the data of the previous records for that master.
    The problem is that when I click the CreateInsert button I want all rows of the detail table to be blank, and only after the user has filled in two or three rows should the data be committed.
    Thanks in Advance

    Hi,
    If a detail table has data, then CreateInsert adds to it. If you want to hide the existing rows, create a new View Object instance and set its "Retrieve from the Database" option to "No Rows". Then use an af:switcher to change the table shown when the user clicks the CreateInsert button. There is a bit of coding required for this use case in ADF, but it is mostly declarative (a sketch of the backing-bean part follows below). Bottom line: there is no automated option other than creating new rows in a separate page or dialog if you are bothered by the existing rows.
    Frank
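
    To illustrate the "bit of coding" mentioned above, here is a hedged sketch of a backing-bean action that executes the CreateInsert operation binding and flips a flag an af:switcher facet expression could use to show the empty-table instance. The binding name "CreateInsert", the facet names, and the bean/property names are assumptions.

    import oracle.adf.model.BindingContext;
    import oracle.binding.BindingContainer;
    import oracle.binding.OperationBinding;

    public class DetailTableBean {

        // Read by the af:switcher, e.g.
        // facetName="#{backingBeanScope.detailTableBean.insertMode ? 'emptyTable' : 'dataTable'}"
        private boolean insertMode = false;

        public boolean isInsertMode() {
            return insertMode;
        }

        // Action for the CreateInsert button.
        public String createInsert() {
            BindingContainer bindings =
                    BindingContext.getCurrent().getCurrentBindingsEntry();
            // "CreateInsert" is the operation binding name assumed from the page definition.
            OperationBinding op = bindings.getOperationBinding("CreateInsert");
            op.execute();
            if (op.getErrors().isEmpty()) {
                insertMode = true;   // switch to the table instance that starts with no rows
            }
            return null;
        }
    }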

  • ADF DVT: Stacked bar graph unable to display all bars

    Hi Experts,
    I'm currently having a problem displaying bar graphs in my use case.
    There are unavoidable instances where some part of the information provided contains very large values.
    Below is my sample code.
    In the example code below, only the detail with the huge values is rendered in the graph; the rest are not rendered. Is this a known issue?
    screenshot: http://sdrv.ms/13DXeyn
    I'm using ADF PS6 on Windows 7 (64-bit), with the Chrome browser.
    ManagedBean. This bean contains static data for testing only. Notice how big the data is in the second-to-last detail.
    import java.text.DateFormat;
    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.ArrayList;
    import java.util.Date;
    import java.util.List;
    import javax.faces.event.AbortProcessingException;
    import oracle.adf.view.faces.bi.component.graph.UIGraph;
    import oracle.adf.view.faces.bi.event.TimeSelectorEvent;
    public class GraphTimeAxisManagedBean {
        SimpleDateFormat stdFormat = new SimpleDateFormat("yyyy-MM-dd-HH.mm.ss");
        public List getTabularData() {
            ArrayList list = new ArrayList();
            try {
              list.add(new Object[] { new Date(stdFormat.parse("2010-06-18-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-06-18-00.00.00").getTime()),"description 1", new Double(20) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-06-18-00.00.00").getTime()),"description 2", new Double(50) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-06-18-00.00.00").getTime()),"description 3", new Double(30) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-07-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-07-01-00.00.00").getTime()),"description 1", new Double(150) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-07-01-00.00.00").getTime()),"description 2", new Double(240) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-07-01-00.00.00").getTime()),"description 3", new Double(10) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-08-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-08-01-00.00.00").getTime()),"description 1", new Double(60) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-08-01-00.00.00").getTime()),"description 2", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-08-01-00.00.00").getTime()),"description 3", new Double(10) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-09-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-09-01-00.00.00").getTime()),"description 1", new Double(90) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-09-01-00.00.00").getTime()),"description 2", new Double(50) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-09-01-00.00.00").getTime()),"description 3", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-10-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-10-01-00.00.00").getTime()),"description 1", new Double(10) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-10-01-00.00.00").getTime()),"description 2", new Double(90) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-10-01-00.00.00").getTime()),"description 3", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-11-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-11-01-00.00.00").getTime()),"description 1", new Double(200) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-11-01-00.00.00").getTime()),"description 2", new Double(20) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-11-01-00.00.00").getTime()),"description 3", new Double(70) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-12-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-12-01-00.00.00").getTime()),"description 1", new Double(60) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-12-01-00.00.00").getTime()),"description 2", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2010-12-01-00.00.00").getTime()),"description 3", new Double(10) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-01-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-01-01-00.00.00").getTime()),"description 1", new Double(90) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-01-01-00.00.00").getTime()),"description 2", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-01-01-00.00.00").getTime()),"description 3", new Double(70) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-02-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-02-01-00.00.00").getTime()),"description 1", new Double(60) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-02-01-00.00.00").getTime()),"description 2", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-02-01-00.00.00").getTime()),"description 3", new Double(30) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-03-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-03-01-00.00.00").getTime()),"description 1", new Double(203)});
              list.add(new Object[] { new Date(stdFormat.parse("2011-03-01-00.00.00").getTime()),"description 2", new Double(90)});
              list.add(new Object[] { new Date(stdFormat.parse("2011-03-01-00.00.00").getTime()),"description 3", new Double(70)});
              list.add(new Object[] { new Date(stdFormat.parse("2011-04-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-04-01-00.00.00").getTime()),"description 1", new Double(75) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-04-01-00.00.00").getTime()),"description 2", new Double(86) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-04-01-00.00.00").getTime()),"description 3", new Double(99) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-05-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-05-01-00.00.00").getTime()),"description 1", new Double(60105) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-05-01-00.00.00").getTime()),"description 2", new Double(50309) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-05-01-00.00.00").getTime()),"description 3", new Double(50210) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-06-01-00.00.00").getTime()),"", new Double(0) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-06-01-00.00.00").getTime()),"description 1", new Double(80) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-06-01-00.00.00").getTime()),"description 2", new Double(77) });
              list.add(new Object[] { new Date(stdFormat.parse("2011-06-01-00.00.00").getTime()),"description 3", new Double(99) });
            } catch (ParseException e) {
                e.printStackTrace();
            }
            return list;
        }
    }
    JSFF (UI Page).
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
              xmlns:dvt="http://xmlns.oracle.com/dss/adf/faces"
              xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
      <af:panelGroupLayout id="pgl1">
        <dvt:graph id="barGraph1" graphType="BAR_VERT_STACK" shortDesc="testing"
                   inlineStyle="width:800px; height:300px;"
                   tabularData="#{pageFlowScope.GraphTimeAxisManagedBean.tabularData}">
          <dvt:background>
            <dvt:specialEffects/>
          </dvt:background>
          <dvt:graphPlotArea/>
          <dvt:seriesSet>
            <dvt:series/>
          </dvt:seriesSet>
          <dvt:o1Axis/>
          <dvt:y1Axis/>
          <dvt:legendArea automaticPlacement="AP_NEVER"/>
        </dvt:graph>
      </af:panelGroupLayout>
    </jsp:root>
    Thanks,
    achie

    Achie,
    This is not an issue with the graph as such, but a data issue: the value being displayed is huge while the other values are relatively very small for the same graph area (e.g., 60105 vs 10).
    You may want to try implementing zoomListener and/or zoomScrollListener on the <dvt:graph> tag so that you can zoom in and see the small values.
    -Arun

  • ADF DVT graphs drillAction

    I am following this link to enable drilling in my ADF DVT graphs - http://download.oracle.com/docs/cd/E12839_01/apirefs.1111/e12418/tagdoc/dvt_graph.html
    It describes drillAction as:
    drillAction String Yes Refers to a backing bean method. The method will be processed when a label slice on an axis is drilled.
    1) How to make the label slices drillable?
    I am using a data control to plot my graph.
    I have added drillingEnabled and drill listeners to my code as follows, but it's not working:
    <dvt:graph shortDesc="Graph" id="g1" value="#{bindings.EmpGrpViewObj1.graphModel}"
    drillingEnabled="true"
    drillRequestingListener="#{pageFlowScope.backing_test.onClickDrill}"
    drillRequestedListener="#{pageFlowScope.backing_test.onClickDrill1}"
    animationOnDisplay="auto" animationOnDataChange="alphaFade"
    />
    I am also following this link - http://download.oracle.com/docs/cd/E12839_01/apirefs.1111/e12418/tagdoc/dvt_pivotTable.html
    which says that the drill operation is available only when the underlying data control supports it. Does the same apply to graphs as well? I mean, if a data control is used to define the graph, does it need to support the drill operation? How do I make a data control support drill actions for graphs?

    There are some comments on this in the following thread:
    Drilldown in ADF DVT charts using data control

  • Help Needed : ADF DVT Dual Y graph Tabular data

    Hi,
    I need to display a dvt:lineGraph with 2 Y-axes.
    The data for the graph is tabular data from a backing bean.
    I am able to display a graph with a single Y-axis.
    But since a dual-Y graph needs two series sets, I am having trouble specifying which rows belong to the first series set and which rows belong to the other.
    The tabular data is dynamic and may change on every partial refresh, every 5 seconds.
    regards
    Lalatendu Patra

    Hi,
    Please see my response on this thread on how to achieve this with a relational data control:
    Re: Drilldown in ADF DVT charts using data control
    Thanks
    Katia

  • Generate PDF using Managed Bean with custom HTTP headers

    Background
    Generate a report in various formats (e.g., PDF, delimited, Excel, HTML, etc.) using JDeveloper 11g Release 2 (11.1.2.3.0) upon clicking an af:commandButton. See also the StackOverflow version of this question:
    http://stackoverflow.com/q/13654625/59087
    Problem
    HTTP headers are being sent twice: once by the framework and once by a bean.
    Source Code
    The source code includes:
    - Button Action
    - Managed Bean
    - Task Flow
    Button Action
    The button action:
    <af:commandButton text="Report" id="submitReport" action="Execute" />
    Managed Bean
    The Managed Bean is fairly complex. The call to `responseComplete` is being made; however, it does not seem to happen early enough to prevent the application framework from writing the HTTP headers.
    HTTP Response Header Override
    /**
     * Sets the HTTP headers required to indicate to the browser that the
     * report is to be downloaded (rather than displayed in the current
     * window).
     */
    protected void setDownloadHeaders() {
        HttpServletResponse response = getServletResponse();
        response.setHeader( "Content-Description", getContentDescription() );
        response.setHeader( "Content-Disposition", "attachment; filename="
            + getFilename() );
        response.setHeader( "Content-Type", getContentType() );
        response.setHeader( "Content-Transfer-Encoding",
            getContentTransferEncoding() );
    }
    Issue Response Complete
    The bean indirectly tells the framework that the response is handled (by the bean):
    getFacesContext().responseComplete();
    Bean Run and Configure
    public void run() {
        try {
            Report report = getReport();
            configure(report.getParameters());
            report.run();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void configure(Parameters p) {
        p.put(ReportImpl.SYSTEM_REPORT_PROTOCOL, "http");
        p.put(ReportImpl.SYSTEM_REPORT_HOST, "localhost");
        p.put(ReportImpl.SYSTEM_REPORT_PORT, "7002");
        p.put(ReportImpl.SYSTEM_REPORT_PATH, "/reports/rwservlet");
        p.put(Parameters.PARAM_REPORT_FORMAT, "pdf");
        p.put("report_cmdkey", getReportName());
        p.put("report_ORACLE_1", getReportDestinationType());
        p.put("report_ORACLE_2", getReportDestinationFormat());
    }
    Task Flow
    The Task Flow calls Execute, which refers to the bean's `run()` method:
    entry -> main -> Execute -> ReportBeanRun
    Where:
    <method-call id="ReportBeanRun">
    <description>Executes a report</description>
    <display-name>Execute Report</display-name>
    <method>#{reportBean.run}</method>
    <outcome>
    <fixed-outcome>success</fixed-outcome>
    </outcome>
    </method-call>
    The bean is assigned to the `request` scope, with a few managed properties:
    <control-flow-rule id="__3">
    <from-activity-id>main</from-activity-id>
    <control-flow-case id="ExecuteReport">
    <from-outcome>Execute</from-outcome>
    <to-activity-id>ReportBeanRun</to-activity-id>
    </control-flow-case>
    </control-flow-rule>
    <managed-bean id="ReportBean">
    <description>Executes a report</description>
    <display-name>ReportBean</display-name>
    <managed-bean-scope>request</managed-bean-scope>
    </managed-bean>
    The `<fixed-outcome>success</fixed-outcome>` strikes me as incorrect -- I don't want the method call to return to another task.
    Restrictions
    The report server receives requests from the web server exclusively. The report server URL cannot be used by browsers to download directly, for security reasons.
    Error Messages
    The error message that is generated:
    Duplicate headers received from server
    Error 349 (net::ERR_RESPONSE_HEADERS_MULTIPLE_CONTENT_DISPOSITION): Multiple distinct Content-Disposition headers received. This is disallowed to protect against HTTP response splitting attacks.
    Nevertheless, the report is being generated. Preventing the framework from writing the HTTP headers would resolve this issue.
    Question
    How can you set the HTTP headers in ADF while using a Task Flow to generate a PDF by calling a managed bean?
    Ideas
    Some additional ideas:
    - Override the Page Lifecycle Phase Listener (`ADFPhaseListener` + `PageLifecycle`)
    - Develop a custom Servlet on the web server
    Related Links
    - http://www.oracle.com/technetwork/middleware/bi-publisher/adf-bip-ucm-integration-179699.pdf
    - http://www.slideshare.net/lucbors/reports-no-notes#btnNext
    - http://www.techartifact.com/blogs/2012/03/calling-oracle-report-from-adf-applications.html?goback=%2Egde_4212375_member_102062735
    - http://docs.oracle.com/cd/E29049_01/web.1112/e16182/adf_lifecycle.htm#CIABEJFB
    Thank you!

    The problem was that the HTTP headers were in fact being written twice:
    1. The report server was returning HTTP response headers.
    2. The bean was including its own HTTP response headers (as shown in the question).
    3. The bean was copying the entire contents of the report server response, including the headers, into the output stream.
    Firefox ignored the duplicate header errors, but Google Chrome did not.
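
    A minimal sketch of the kind of fix described above (not the poster's actual code; the class and parameters are hypothetical): set the download headers once, fetch the report from the report server over HTTP, and copy only the response body into the servlet output stream, so the report server's own headers are never forwarded to the browser.

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import javax.servlet.http.HttpServletResponse;

    public class ReportStreamer {

        /** Streams the report body to the client with our own headers only. */
        public static void stream(HttpServletResponse response,
                                  String reportUrl,
                                  String filename) throws Exception {
            // Write the download headers exactly once, before any body bytes.
            response.setHeader("Content-Type", "application/pdf");
            response.setHeader("Content-Disposition", "attachment; filename=" + filename);

            HttpURLConnection conn =
                    (HttpURLConnection) new URL(reportUrl).openConnection();
            try {
                // getInputStream() yields only the body; the report server's
                // headers are consumed by the connection and never copied through.
                InputStream in = conn.getInputStream();
                OutputStream out = response.getOutputStream();
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
                out.flush();
                in.close();
            } finally {
                conn.disconnect();
            }
        }
    }

    From the bean's run() method one would call something like ReportStreamer.stream(getServletResponse(), reportUrl, getFilename()) and then FacesContext.getCurrentInstance().responseComplete(), so that the framework writes nothing further to the response.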
