SSAS Date Parameter times out

Environment: We are running SQL Server 2012 Standard Edition with Analysis Services, Reporting Services on a separate machine, and Visual Studio 2010.
I have a Date dimension built on a generated table named Time. To that table I have added custom columns, Week_Ending and Short_Date_Alpha, which are used in Excel reports built against the Analysis Services cubes.
When I attempt to use this dimension as a parameter in Reporting Services, it times out. The report runs for 30 minutes, never surfaces the parameter selection box, and eventually times out with an unknown error related to the time dimension. This is in Visual Studio.
If I pull the dimension directly from the SQL table I can produce a report on it, and if I run the full query for the measures with the default date selected it completes in seconds in the query builder. Here are 10 rows of the relevant table:
PK_Date Date_Name Reporting_Year Reporting_Year_Name Reporting_Month Reporting_Month_Name Reporting_Week Reporting_Week_Name Reporting_Day Reporting_Day_Name Short_Date_Alpha End_of_Week_Name Week_Ending WorkDay
2008-01-01 00:00:00.000 Tuesday, January 01 2008 2007-01-07 00:00:00.000 2007 2007-12-02 00:00:00.000 Rpt Dec 2007 2007-12-30 00:00:00.000 Reporting Week 52, 2007 2008-01-01 00:00:00.000 Tuesday, January 01 2008 Jan 1 2008 WE 01/07/08 2008-01-07 00:00:00.000 1
2008-01-02 00:00:00.000 Wednesday, January 02 2008 2007-01-07 00:00:00.000 2007 2007-12-02 00:00:00.000 Rpt Dec 2007 2007-12-30 00:00:00.000 Reporting Week 52, 2007 2008-01-02 00:00:00.000 Wednesday, January 02 2008 Jan 2 2008 WE 01/07/08 2008-01-07 00:00:00.000 1
2008-01-03 00:00:00.000 Thursday, January 03 2008 2007-01-07 00:00:00.000 2007 2007-12-02 00:00:00.000 Rpt Dec 2007 2007-12-30 00:00:00.000 Reporting Week 52, 2007 2008-01-03 00:00:00.000 Thursday, January 03 2008 Jan 3 2008 WE 01/07/08 2008-01-07 00:00:00.000 1
2008-01-04 00:00:00.000 Friday, January 04 2008 2007-01-07 00:00:00.000 2007 2007-12-02 00:00:00.000 Rpt Dec 2007 2007-12-30 00:00:00.000 Reporting Week 52, 2007 2008-01-04 00:00:00.000 Friday, January 04 2008 Jan 4 2008 WE 01/07/08 2008-01-07 00:00:00.000 1
2008-01-10 00:00:00.000 Thursday, January 10 2008 2008-01-06 00:00:00.000 2008 2008-01-06 00:00:00.000 Rpt Jan 2008 2008-01-06 00:00:00.000 Reporting Week 1, 2008 2008-01-10 00:00:00.000 Thursday, January 10 2008 Jan 10 2008 WE 01/12/08 2008-01-12 00:00:00.000 1
2008-01-11 00:00:00.000 Friday, January 11 2008 2008-01-06 00:00:00.000 2008 2008-01-06 00:00:00.000 Rpt Jan 2008 2008-01-06 00:00:00.000 Reporting Week 1, 2008 2008-01-11 00:00:00.000 Friday, January 11 2008 Jan 11 2008 WE 01/12/08 2008-01-12 00:00:00.000 1
2008-01-12 00:00:00.000 Saturday, January 12 2008 2008-01-06 00:00:00.000 2008 2008-01-06 00:00:00.000 Rpt Jan 2008 2008-01-06 00:00:00.000 Reporting Week 1, 2008 2008-01-12 00:00:00.000 Saturday, January 12 2008 Jan 12 2008 WE 01/12/08 2008-01-12 00:00:00.000 0
2008-01-13 00:00:00.000 Sunday, January 13 2008 2008-01-06 00:00:00.000 2008 2008-01-06 00:00:00.000 Rpt Jan 2008 2008-01-13 00:00:00.000 Reporting Week 2, 2008 2008-01-13 00:00:00.000 Sunday, January 13 2008 Jan 13 2008 WE 01/19/08 2008-01-19 00:00:00.000 0
2008-01-19 00:00:00.000 Saturday, January 19 2008 2008-01-06 00:00:00.000 2008 2008-01-06 00:00:00.000 Rpt Jan 2008 2008-01-13 00:00:00.000 Reporting Week 2, 2008 2008-01-19 00:00:00.000 Saturday, January 19 2008 Jan 19 2008 WE 01/19/08 2008-01-19 00:00:00.000 0
2008-01-20 00:00:00.000 Sunday, January 20 2008 2008-01-06 00:00:00.000 2008 2008-01-06 00:00:00.000 Rpt Jan 2008 2008-01-20 00:00:00.000 Reporting Week 3, 2008 2008-01-20 00:00:00.000 Sunday, January 20 2008 Jan 20 2008 WE 01/26/08 2008-01-26 00:00:00.000 0
The dimension looks fine when I browse it, and it produces the correct information in the Excel reports, but I cannot get it to work in Reporting Services.
I have deleted the dates out of the cubes and processed, then added the Time dimension back in and processed again, and it still won't work. I have a cube running off a table called Date and that one seems to work.
Is the issue with the table name? With the custom descriptions? Something else?

How would you best suggest limiting the time frame? Add a filter to the query so that future dates are not part of the selection? I'm doing a data refresh in the data mart, so I won't be able to try limiting the selection for a few hours.
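For context, the valid-values dataset that the SSRS MDX query designer typically generates for a dimension parameter enumerates every member of the attribute, which is what makes a very large date attribute slow to populate. A rough sketch of that generated query, and of a variant that drops future dates, is below; the cube name [YourCube] and the hierarchy name [Time].[Date] are assumptions based on the dimension described above, and the MEMBERVALUE comparison assumes the Date attribute's value column is a datetime:

  WITH
  MEMBER [Measures].[ParameterCaption] AS [Time].[Date].CURRENTMEMBER.MEMBER_CAPTION
  MEMBER [Measures].[ParameterValue]   AS [Time].[Date].CURRENTMEMBER.UNIQUENAME
  MEMBER [Measures].[ParameterLevel]   AS [Time].[Date].CURRENTMEMBER.LEVEL.ORDINAL
  SELECT
    { [Measures].[ParameterCaption], [Measures].[ParameterValue], [Measures].[ParameterLevel] } ON COLUMNS,
    -- default: every member of the attribute, including the All member
    [Time].[Date].ALLMEMBERS ON ROWS
  FROM [YourCube]

  -- variant: restrict the ROWS set so future dates never appear in the parameter list
  --   FILTER([Time].[Date].[Date].MEMBERS,
  --          [Time].[Date].CURRENTMEMBER.MEMBERVALUE <= NOW()) ON ROWS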
Hi Diane,
Could you let us know how many available values your date parameter has (that is, how many members the date dimension in SSAS contains)? One workaround is to add an additional text parameter that prefilters the available values in the large parameter list.
Here is a similar thread on this topic for your reference:
http://dataqueen.unlimitedviz.com/2012/02/filter-a-parameter-with-long-list-of-values-using-type-ahead/
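A minimal sketch of that prefiltering idea, in the spirit of the linked post: add a plain text report parameter (here called @DateSearch, an assumed name) and use it to filter the member captions returned by the date parameter's valid-values dataset, so only matching dates are listed. The cube and hierarchy names are again assumptions:

  WITH
  MEMBER [Measures].[ParameterCaption] AS [Time].[Date].CURRENTMEMBER.MEMBER_CAPTION
  MEMBER [Measures].[ParameterValue]   AS [Time].[Date].CURRENTMEMBER.UNIQUENAME
  MEMBER [Measures].[ParameterLevel]   AS [Time].[Date].CURRENTMEMBER.LEVEL.ORDINAL
  SELECT
    { [Measures].[ParameterCaption], [Measures].[ParameterValue], [Measures].[ParameterLevel] } ON COLUMNS,
    -- keep only the date members whose caption contains the typed-ahead text
    FILTER([Time].[Date].[Date].MEMBERS,
           INSTR([Time].[Date].CURRENTMEMBER.MEMBER_CAPTION, @DateSearch) > 0) ON ROWS
  FROM [YourCube]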
Regards,
Elvis Long
TechNet Community Support

Similar Messages

  • List of Values (dynamic parameter) times out once and then works.

    Hello.  First time posting here.  I'm having a strange (yet very predictable) issue with Crystal Reports Lists of Values.  The List of Values is used on a dynamic cascading prompt and is published to the BO XI R2 repository.  The prompt/list works fine when using the Crystal Reports thick client, but when using the Crystal Reports viewer from InfoView the following happens every time:
    1.  Any Report that uses the Parameter/LOV is launched.
    2.  After about 1-2 minutes, the viewer returns a "Timeout" error message.
    3.  The report is launched a 2nd time.
    4.  The parameters/LOVs display (takes about 8 seconds) and the report runs fine.
    Then, any time a report is run within a certain period, it works OK.  But if no one runs a report for an extended period of time (I'm not sure how long, but it's on the order of maybe 30 minutes to an hour), it starts all over again.  In addition, if I restart the services/server the same thing immediately occurs (it times out once, but works after that for a period of time).
    I initially did not schedule the LOV to be generated on the server and left it so that it is done in real time when the report runs.  When I ran into this error, I tried scheduling it every 10 minutes or so, but the error continues to happen.
    I also restarted the server/etc to no avail. 
    I saw another post on here that was kind of similar and that suggested that starting the LOV services as a domain user (that had admin rights on the BO server) helped him.  I tried this as well and restarted everything but that didn't fix my issue.
    Any help would be greatly appreciated!
    Scott

    Just wanted to update this.
    Because I use Business Objects OEM, I didn't have direct support with SAP/Business Objects.  However, because of this issue (and other issues) we purchased Business Objects Edge 3.0.
    After getting BOE 3 installed/configured, I migrated those reports and underlying Business Views over.
    And guess what?   The problem still occurred on the new platform.
    I then created a support incident and the person that initially tried to help me couldn't figure out what was going on.
    However, during that time, I did try re-creating the report using a newer version of Crystal (Crystal 2008) and the problem went away.  (note, I didn't recreate the business views.. just the reports).
    So bottom line:
    I created all reports from scratch instead of using the reports that I migrated from BO XI R2/CR XI and no longer had the problem.
    The techs never figured out the exact cause of the issue.
    So far, my experience with their support has been the same on all incidents.  They don't seem to be trained (at least at the first level) to actually use the debug logs etc. to quickly pinpoint and analyze issues.  They basically do what I, the end user, do: poke around intelligently and use trial and error.  I have yet to get a support tech who knew the exact nature of my issue and could solve it without this kind of trial-and-error approach.

  • Loading Large Data Set Times Out In APEX

    I am trying to load a large text file using APEX 3.  I am using a 1 GB file which has about 50,000 rows.  After about 5 minutes the browser times out.  Any ideas?  There is nothing in the alert log, so it is not database related.  Here is the error in the Apache log:
    mod_plsql: Long running URL [pls/apex/wwv_flow.accept] timed out

    Steve,
    The Apache process is timing out. Most likely this is set to 300 in your httpd.conf (the default).
    You can extend it beyond 300, but if I were in your shoes and had the proper access, I'd use some other means to load 1 GB of data (e.g., SQL*Loader).
    Joel

  • Function Module to get Date and Time out of Timestamp

    Hi,
    Source system timestamp field CREATED_TS of Type DEC-15
    BW PSA data in format 20.140.707.105.948
    In DSO, I have created two target InfoObjects; one for Date and one for Time.
    In transformation from PSA to DSO, in field level routine, I want to split timestamp into Date and Time.
    There is a function module CACS_TIMESTAMP_GET_DATE in the source system, but it is not available in BW.  Another function module, ADDR_CONVERT_TIMESTAMP_TO_DATE, is available in BW but returns only the date.
    Does anyone know a function module in BW which takes a timestamp (PSA data in the format 20.140.707.105.948) and returns date and time?
    Even better would be an FM which takes the timestamp plus a time zone and returns date and time.
    Thanks
    Ahmad

    Timestamp to date time conversion (with time zone) is built into ABAP. Why use a function module?
    Read the ABAP help on CONVERT.

  • Data load time out error.

    Hi Gurus,
    We have initialized one of our InfoCubes; it has more than 14 lakh (1.4 million) records.  Each of the 90 PSA data packets contains roughly 15k records.  Out of the 90 PSA data packets, 10 succeeded during last night's scheduled load.  Now I am scheduling each packet individually through manual upload by right-clicking on the data packet, and every time it results in a runtime error.
    It's a highly time-consuming load, so I can't schedule it again.
    Do you have any other solution?????
    Thanks in advance,
    Kironmoy Banerjee.

    Hi,
    Try to see which step in the package is taking the time.  Is it the start routines?  Check whether there are start routines in the update rules.
    You probably should have reduced the data package size for the first load, but since you are already loading from PSA that is no longer possible.
    If you can delete the request and reload it via PSA, reduce the package size to maybe 5,000 or 4,000 and it should load.
    You can make this setting in the InfoPackage used to schedule the load: InfoPackage -> Schedule -> DataS. Source option.
    Change the size for delta as well as for full loads.
    Then schedule the loads again: do an init without data transfer and a full repair without selections this time.
    Thanks
    Ajeet

  • Profile Parameter : Time out for executing query on the web

    Hi gurus,
    I am executing queries on the web directly.  This can be done from Query Designer with the button that says "Execute query on web".  The problem is that for queries that take more than 600 seconds to run, I get an "Application timed out" error.  Queries that take less than this run smoothly.
    Can anyone please tell me the profile parameter associated with this particular setting?  It is not rdisp/max_wprun_time; I know that for sure, since the value of that parameter in my system is 9999.  Please help.
    Thanks & rgds,
    Sree

    Issue resolved.
    Profile Parameter - icm/server_port_0
    Current Value - PROT=HTTP,PORT=8000,TIMEOUT=60,PROCTIMEOUT=600
    Changed to - PROT=HTTP,PORT=8000,TIMEOUT=60,PROCTIMEOUT=3600
    rgds,
    Sree

  • DSO data Activation Time out-DATABASE ERROR-440

    Hello all,
    We are getting an error in one of our standard DSOs during activation: "Process 000001 did not confirm within the permitted time period."  We repeated the process chain and got the same error; we activated manually and got the same error.  I should mention that we only have 3,278 records, which should take no more than a few seconds or minutes to activate.
    We increased the wait time in RSODSO_SETTINGS, but it did not help.  We changed the data package size to 10,000 and to 50,000, but that did not correct the error.  We even changed the DSO SID generation setting during activation to "never create SIDs"; that did not correct the error either.  We reactivated the DSO and indexed both the new and the active table, but we are still getting the same error.
    In SM21 we are getting DATABASE ERROR -440: [IBM][CLI Driver][DB2] NO AUTHORIZED PROCEDURE NAMED DSO_VERSION HAVING COMPATIBLE ARGUMENTS WAS FOUND.  The DBA did a reorg and some other things, but we are still getting the same error, and they are not sure what the procedure DSO_VERSION is.  Does anybody have any idea what that procedure is and why we are getting it in SM21 during activation of this one DSO?  Has anybody had this kind of experience before?  We have researched Google and SCN and found nothing close to our scenario.
    By the way, we ran the activation with the different settings both in dialog and in the background.  We also got the error "Background process BCTL_4Z81MSY8311QNTKRYGRDYVFXY terminated due to missing confirmation".  Any suggestions will be of great help and points will be awarded.
    Thank you.

    Error in Sm21-
    04:57:20 BTC  036 810 PRGRAU0                   BY  2 Database error -440 at EXE
    04:57:20 BTC  036 810 PRGRAU0                   BY  0 > [IBM][CLI Driver][DB2] NO AUTHORIZED PROCEDURE NAME
    04:57:20 BTC  036 810 PRGRAU0                   BY  0 > DSO_VERSION HAVING COMPATIBLE ARGUMENTS WAS FOUND
    04:57:20 BTC  036 810 PRGRAU0                   BY  1 Database error -440
    Background process BCTL_4Z9BTOI3XPN2XB0VL9IUZ08ZA terminated due to missing
    confirmation
    Message no. RSODSO_PROCESSING041
    Diagnosis
    The confirmation from the background process BCTL_4Z9BTOI3XPN2XB0VL9IUZ08ZA
    did not occur within the set time. It was thus ended by the system.
    System Response
    Background process: BCTL_4Z9BTOI3XPN2XB0VL9IUZ08ZA
    Number: 1
    Procedure
    Check the package size for the application to be processed or change the
    package size.

  • Broadcaster times-out due to dialog process

    Hi !
    I'm trying to set up broadcasting for a report with a large number of records; since it times out when I execute it in dialog, broadcasting it in the background seemed a great idea.
    The thing is that, even though I schedule it to run in a background process, it still creates a dialog process to fetch the data, which times out, and so the background process (the document creation and email sending) either ends "correctly" by sending a "No Data Available" report or ends with an error: com.sap.ip.bi.base.exception.BIBaseRuntimeException.
    The parameters for the report are being set in the General Precalculation tab, while there is a Filter Navigation tab which allows you to filter specific characteristics.  What is the difference here?
    Regards,
    Santiago

    Sonal, thanks for the quick answer.
    Let me see if I get this right.
    The Precalculation tab allows filtering before precalculation, so the query is served from the precalculation cache instead of direct DB access.  The data excluded by filters on the Precalculation tab will not be available in the query.
    On the other hand, the Filter tab sets filter values for characteristics contained in the query.  Is this like the right-hand column of the Filter tab in Query Designer?
    Let me know if this is right.
    Regards,
    Santiago

  • FAGLL03 (SAPLFAGL_ITEMS_SELECT) report times out

    When the FAGLL03 (SAPLFAGL_ITEMS_SELECT) report runs to retrieve a lot of data, it times out.
    I wonder if there is a better (similar) way to extract the G/L data, either by using another report in ERP or by creating a query (BEx query) in BW to get the data.
    What are the best/recommended practices for running FAGLL03?
    Best regards,
    Tom

    Hi Tom,
    Try reviewing the information below; if this does not help, you might have to narrow your current selection
    for FAGLL03, or run it in the background.
    The normal recommendations for performance problems in the FAGLL03 transaction are:
    1. Use the ALV grid, if possible - not the ALV list
    2. Do not use filters
    3. Use the collapse/expand functionality - summarization levels
    4. Do not use a layout with special fields - very time consuming
    5. Check which index is used - activate an ST05 trace on the time-consuming tables and, if necessary, replace the index for these tables.
    On the other hand, to improve the performance of transaction FAGLL03, you should try to create a new index on the FAGLFLEXA table with the following fields:
    RCLNT
    RACCT
    BUDAT
    RLDNR
    BSTAT
    With this index, the selection which is currently causing the issue should run a lot faster.  The problem with the selection is the JOIN of the tables BSIS/FAGLFLEXA and BSAS/FAGLFLEXA, so the new index should speed it up considerably.  Also, please update the statistics with transaction DB20 after you have created the new index.
    Hope it can help.
    BR,
    Leonardo Cunha

  • Parsing date and time info

    Hello,
    Could you please help me find the date and time out of this output :-
    1094130971507 ?
    The following code resulted in this output,
    Date now = new Date();
    DateFormat fmt = DateFormat.getDateTimeInstance(DateFormat.SHORT,
        DateFormat.MEDIUM, Locale.US);
    String timeFrame = fmt.format(now) + " " + frame; // frame = some data
    try {
        // the log file name embeds the raw millisecond value from getTime()
        File histFile = new File("logfile" + (new Date()).getTime() + ".txt");
        FileOutputStream histStream = new FileOutputStream(histFile);
        printWriter = new PrintWriter(histStream, true);
    } catch (Exception e) {
        System.out.println(e);
    }
    printWriter.println("OUT " + timeFrame);
    I tried to tokenize the log file and read the date and time value using a StreamTokenizer, and I get the value 1094130971507 in the double st.nval.  I have to extract the time information from this double.
    Does someone have a smart idea?
    Thanks in advance,
    Domnic

    Could you please help me find the date and time out of this output: 1094130971507?
    printWriter.println("OUT " + timeFrame);
    You are claiming that the first was the result of the second?  Not with the Sun VM it doesn't.  The value 1094130971507 comes from (new Date()).getTime() in the log file name, i.e. milliseconds since the epoch, not from the formatted timeFrame string.
    The following code...
    DateFormat fmt = DateFormat.getDateTimeInstance(DateFormat.SHORT,
        DateFormat.MEDIUM, Locale.US);
    String timeFrame = fmt.format(now) ...
    ...is going to produce output like the following:
    1/27/05 1:28:25 PM

  • SNMP Informs and time-outs.

    I understand that SNMP informs are an alternative to SNMP traps in which the information is re-sent if no acknowledgement is received for a packet.
    I have a problem here.
    SNMP informs sends as many packets as the configured retries parameter allows.  I also configure a timeout parameter.  Does that mean the packets will be sent one after another with the configured timeout as the interval between them?
    I saw packets sent one after another instantaneously.  Is there any way of controlling the time interval between the packets?
    Thanx in advance.
    Regards
    Ashraf

    The command to set retries, timeouts etc. is: snmp-server informs [retries retries] [timeout seconds] [pending pending]
    When a trap is sent to the manager, the sending station does not wait for an acknowledgement and assumes that the message reached the destination successfully.  However, when an inform is sent, the manager will acknowledge receipt of the inform request.  If the sending station does not receive the acknowledgement within a specified period of time (the timeout), the inform will be retransmitted.  Once again the sending station will wait for that period of time.  The sending station will continue retransmitting the inform until it receives the response message or until the number of retransmits equals the maximum specified retransmission attempts (specified as retries).  The parameter 'pending' specifies the maximum number of informs waiting for acknowledgements at any one time.  More information is available at http://www.cisco.com/univercd/cc/td/doc/product/software/ios113ed/113t/113t_1/snmpinfm.htm.

  • SSRS Report date parameter values using SSAS Cube as datasource

    Hey Guys, 
    I'm building an SSRS report using an SSAS cube as the data source.  The report contains shared datasets which provide the required data.  Below is the setup.
    SQL Server version: SQL Server 2008 R2
    Report Builder 3.0
    I have Report A, powered by Dataset D, running on Cube C.  The dataset has 3 parameters, one of which is a date parameter, P1.  I have it set up so that the report has another parameter, P2, whose type is Date (to ensure the user is exposed to a date picker); the input from P2 is then manipulated into the required format before being fed into the hidden dataset parameter P1.  I have two questions regarding this parameter.
    Currently, when the user picks a date (P2) which has no values, the report errors out because it doesn't find the corresponding member in the cube.  Can it be made so that if the member doesn't exist, a simple error message like "date not found" pops up instead of the report failing?
    Secondly, can I manipulate the date picker (P2) exposed to the user so that the unavailable dates are greyed out?
    Please let me know if any more info is required on either question.
    Thanks
    Srikanth

    Hello Katherine,
    Below are the errors which pop up with and without the CONSTRAINED flag in place.  The MDX query follows.
    Without: 
    An Error has occurred during report processing. (rsProcessingAborted). The Execution Failed for the shared data set “Dataset1”.(rsDataSetExecutionError). Query Execution failed for dataset ‘DataSet1’. (rsErrorExecutingCommand). The dimension ‘[10 Sep 2014]’
    was not found in the cube when the string, [10 Sep 2014], was parsed.
    With Constrained flag:
    An Error has occurred during report processing. (rsProcessingAborted). The Execution Failed for the shared data set “Dataset1”.(rsDataSetExecutionError). Query Execution failed for dataset ‘DataSet1’. (rsErrorExecutingCommand). Query(1,1476) The restrictions
    imposed by the CONSTRAINED flag in the STRTOSET function were violated.
    SELECT
      NON EMPTY
      {
        [Measures].[A]
       ,[Measures].[B]
       ,[Measures].[C]
       ,[Measures].[D]
       ,[Measures].[E]
      } ON COLUMNS
     ,NON EMPTY
        [DimA].[LevelA].[LevelA].ALLMEMBERS *
        [DimB].[LevelB].[LevelB].ALLMEMBERS *
        [Date].[Date].[Date].ALLMEMBERS
      DIMENSION PROPERTIES
        MEMBER_CAPTION
       ,MEMBER_UNIQUE_NAME
      ON ROWS
    FROM
    (
      SELECT
        StrToSet(@FilterA, CONSTRAINED) ON COLUMNS
      FROM
      (
        SELECT
          StrToSet(@Date, CONSTRAINED) ON COLUMNS
        FROM [Cube1]
      )
    )
    WHERE
    (
      [DimC].[Level1].&[Member1]
     ,[DimC].[Level2].&[Member1]
     ,[DimC].[Level3].&[Member1]
     ,[DimC].[Level4].&[Member1]
    )
    Thanks
    Srikanth
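    For reference, the first error above ("The dimension '[10 Sep 2014]' was not found in the cube when the string, [10 Sep 2014], was parsed") usually indicates that the string handed to StrToSet is a plain date caption rather than a full member unique name.  A minimal sketch of the shape the @Date value generally needs is below; the key format shown is an assumption, since it depends on how the Date attribute's key column is defined in this cube:

    -- hypothetical value passed into @Date (a full unique name, not a caption):
    --   [Date].[Date].&[2014-09-10T00:00:00]
    SELECT
      { [Measures].[A] } ON COLUMNS
    FROM
    (
      -- with CONSTRAINED, @Date must resolve to an existing member's unique name;
      -- a date that has no member in the dimension will still fail, so either restrict
      -- the date picker to valid dates or validate the value before running the query
      SELECT StrToSet(@Date, CONSTRAINED) ON COLUMNS
      FROM [Cube1]
    )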

  • Crystal report Viewer Session times out for more data in Portal

    Hi All,         
     I am using the Java SDK code below to render a report in the Crystal Report Viewer.  When I refresh a report with more data (more parameter values), the server session times out in the portal.  Is there any way to fix this?  The report loads its data and then displays in the Crystal Report Viewer; when there is more data, the server times out because the server timeout is set to 60 seconds.  Is there a way to open the Crystal Report Viewer as soon as the report loads data, to avoid the server timeout issue?
    Please help.  Please let me know if I am missing something.  Thanks in advance!
    CODE;
    <%@page language="java" contentType="text/html; charset=ISO-8859-1"
           pageEncoding="ISO-8859-1" session="false"%>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.application.OpenReportOptions"%>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.application.ReportClientDocument"%>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.application.ParameterFieldController"%>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.lib.ReportSDKException"%>
    <%@page
           import="com.crystaldecisions.report.web.viewer.CrystalReportViewer"%>
           <%@page import="com.crystaldecisions.report.web.viewer.*"%>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.lib.ReportSDKExceptionBase"%>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.reportsource.IReportSource"%>
    <%@page import="java.io.Writer"%>
    <%@page import="java.io.IOException "%>
    <%@ page import="com.crystaldecisions.report.web.viewer.ReportExportControl" %>
    <%@ page import="com.crystaldecisions.sdk.occa.report.exportoptions.ExportOptions" %>
    <%@ page import="com.crystaldecisions.sdk.occa.report.exportoptions.ReportExportFormat" %>
    <%@page
           import="com.crystaldecisions.sdk.occa.report.application.DatabaseController"%>
                  <%@page
           import="com.crystaldecisions.sdk.occa.report.application.ReportSaveAsOptions"%>
           <% response.setHeader("pragma","no-cache");//HTTP 1.1
    response.setHeader("Cache-Control","no-cache");
    response.setHeader("Cache-Control","no-store");
    response.addDateHeader("Expires", -1);
    response.setDateHeader("max-age", 0);
    //response.setIntHeader ("Expires", -1);
    //prevents caching at the proxy server
    response.addHeader("cache-Control", "private"); %>
    <%
           String reportPath,Sharedpath;
           ReportClientDocument reportClientDocument;
                ParameterFieldController parameterFieldController;
                try{
                    reportPath = "reportlocation";
                 Sharedpath = "Target Location";
                    reportClientDocument = new ReportClientDocument();
                    reportClientDocument.setReportAppServer(ReportClientDocument.inprocConnectionString);
                         reportClientDocument.open(reportPath, OpenReportOptions._openAsReadOnly);
                         reportClientDocument.getDatabaseController().logon("Dbname", "dbpassword");              
                         System.out.println("Connecting...");
                       parameterFieldController = reportClientDocument.getDataDefController()
                   .getParameterFieldController();
                    parameterFieldController.setCurrentValues("", "param 1",
                         new Object[] {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,29});
    parameterFieldController.setCurrentValues("", "Param 2",
                  new Object[] {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23});
    reportClientDocument.saveAs("Target Report Name","Target Location", ReportSaveAsOptions._overwriteExisting);
           reportClientDocument.close();
           System.out.println("Finished...");              
    CrystalReportViewer viewer = new CrystalReportViewer();
    viewer.setOwnPage(true);
    viewer.setPrintMode(CrPrintMode.ACTIVEX);
    viewer.setReportSource(Sharedpath);
    viewer.processHttpRequest(request, response, getServletConfig().getServletContext(), null);
                  System.out.println("Finished...");
           }  catch (ReportSDKException e) {
                  // TODO Auto-generated catch block
                  e.printStackTrace();
           }
%>


  • How to increase the web session time out for FDM While data uploading.

    I have very large data files for the Balance Sheet and Profit & Loss.  These take a very long time to load through FDM.  Kindly let me know the following:
    1 - How can I increase the "web session time out" in FDM; and
    2 - What is the standard data loading time, e.g., how long should it ideally take for approximately 1,000 lines to load into Hyperion?
    Regards
    Amjad
    Edited by: ar_aff on Sep 12, 2011 8:30 AM

    You supposedly feed it an (undocumented) parameter, -rxidletimeout, with the time in seconds, at startup.
    app.serverSettings.sessionTimeout will report back whatever value you fed it.  However, in my experience so far, the timeout is somewhere around 30 seconds no matter what value you feed it.  I might be doing something wrong.
    I currently have a ticket open with Adobe support about this very issue, but it's slow going. I'll try to update you with whatever I find out.
    I'd love to hear whether anyone else has this working.
    Jeff

  • Data extraction hanging/time-out for certain FISCPER parameters

    Hi all,
    Our process chain that extracts and loads data from one cube into another hangs when the FISCPER parameter value range is within a single fiscal year and the 'period to' value is 09 or less.  We've also tested this using RSA3, and the same thing happened.
    Below are the FISCPER scenarios that we've tested using both our Process Chain and RSA3:
    2009001 to 2009012 -> test result is ok
    2008001 to 2008011 -> test result is ok
    2009002 to 2009012 -> test result is ok
    2009001 to 2009009 -> test result is not ok:loading hangs/time-out
    2008002 to 2008008 -> test result is not ok:loading hangs/time-out
    Can anyone share the solution please if this was also experienced before? 
    Thanks in advance.
    Aloy

    This has been resolved.  Oracle releases earlier than 10.2.0.4.0 still use the fact view when selecting records.  Because of this, certain issues arise, one of which is the time-out during data selection.  The problem can be solved by disabling the use of the fact view or by upgrading the Oracle database.
