Winter Time Changes & Database Functioning

Hi,
Can anyone please give me some idea on the following:
How do system time changes (during the winter change / daylight-saving switch) affect the data, database, redo, archive, and recovery?
How is the timestamp maintained in this case?
Say the system time gives you 2am;
at 4am you change it back to 2am.
Now, after four hours the system time will be 4am again.
How are the timestamps of the archive logs generated at those two 4am's managed?
Will that create a problem while doing "recovery until time"?
Thanks in advance,
Don

If you use RECOVER UNTIL TIME, I advise you to do it with the time zone explicitly set, because "30.10.2005 02:30:00 Europe/Zurich" is not unique.
To avoid that, you can set ERROR_ON_OVERLAP_TIME in your session, or always specify the UTC offset, like "30.10.2005 02:30:00 +02:00" or "30.10.2005 02:30:00 +01:00".
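To make the ambiguity concrete, here is a minimal sketch in plain SQL (the dates are the Zurich example above): the same wall-clock reading exists twice, and only the explicit offset distinguishes the two instants.

    -- Ask Oracle to raise an error instead of guessing when a local time is ambiguous.
    ALTER SESSION SET ERROR_ON_OVERLAP_TIME = TRUE;

    -- The same wall-clock reading, once before and once after the switch.
    SELECT TIMESTAMP '2005-10-30 02:30:00 +02:00' AS before_switch,
           TIMESTAMP '2005-10-30 02:30:00 +01:00' AS after_switch
    FROM   dual;

    -- Subtracting them shows the two instants are exactly one hour apart.
    SELECT TIMESTAMP '2005-10-30 02:30:00 +01:00'
         - TIMESTAMP '2005-10-30 02:30:00 +02:00' AS gap
    FROM   dual;

If in doubt, recovering to an SCN or log sequence instead of a wall-clock time sidesteps the ambiguity entirely.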

Similar Messages

  • ICal Web Publishing Error, UK time change

    We publish booking availability for our village hall via ical onto the hall's website.
    During October 2010 the "We're Sorry" Apple error message was encountered and put down to problems associated with the UK summer/winter time change.
    The problem then disappeared for some weeks - the calendar published to the web site without a hitch.
    Now it has re-occurred.
    Have Apple made a decision to remove the web publishing feature or is it just another technical hitch?

    Hey Christos,
    I'm not sure if that is possible  with the plugin, but I will double check. If you are just looking to do some basic monitoring and control, using a web service might be better suited in this case, since it is lighter and can serve actual web pages instead of relying on a plugin. Feel free to check out our Web development community group, try some of the examples, and post to our discussions: https://decibel.ni.com/content/groups/web-services
    Daniel C.
    Applications Engineer
    National Instruments

  • Return value from database function taking a lot more time than the query

    Hi guys,
    I have a Query that does a call to a database function. The function takes in a few parameters and returns a Date. Now, the query within the function takes barely .05 seconds. However, doing a select get_join_dates from dual is taking almost 6 seconds for each call.
    Here is the Query:
    select s.student_id, s.student_name, s.organization_code
    from   student s
    where  s.student_id = :p_student_id
    and    s.student_enrollment_date = get_join_dates( :p_year,
                                                       :p_month,
                                                       :p_student_id,
                                                       s.organization_code );
    And here is the database function. The select inside this function barely takes 0.05 seconds per call. This function gets called 3 times in my case as there are 3 records in the org_body table for this student.
    create or replace function
    get_join_dates( p_yyyy in org_body.fiscal_yyyy%type,
                           p_month in  org_body.fiscal_mm%type,
                           p_student_id in student.student_id%type,
                           p_organization_code in org_body.organization_code%type) return date as
        t_enrollment_date  date;
        cursor cur_latest_enrollment_date is
          select max(enrollment_date)
          from   org_body
          where  fiscal_yyyy = p_yyyy
          and    fiscal_mm = p_month
          and    student_id = p_student_id
          and    organization_code = p_organization_code;
      BEGIN
        open cur_latest_enrollment_date ;
        fetch cur_latest_enrollment_date into t_enrollment_date;
        close cur_latest_enrollment_date ;
        return t_enrollment_date;
      exception
        when others then
          null;
    end;
    However, when I run the following statement below, it takes close to 6 seconds to retrieve a record. In turn, my query is becoming really slow and taking almost 35 seconds. Imagine that with more records.
    select get_join_dates( 2010, '01', '2167543', 'PSYCH01' ) from dual;
    If I run my query with this condition below, it takes 0.5 seconds.
    select s.student_id, s.student_name, s.organization_code
    from   student s
    where  s.student_id = :p_student_id
    and    s.student_enrollment_date = '01-JAN-10'
    Any ideas would be greatly appreciated.

    Any reason why you are doing this with the stored function?
    You could just do this with SQL. Embed the query in the function as a subquery in your initial query from STUDENT.
    select s.student_id, s.student_name, s.organization_code
    from   student s
    where  s.student_id = :p_student_id
    and    s.student_enrollment_date =
    (select max(enrollment_date)
          from   org_body
          where  fiscal_yyyy = :p_year
          and    fiscal_mm = :p_month
          and    student_id = s.student_id
          and    organization_code = s.organization_code);
    Why your function is not performing: I cannot say that with the information you have provided.
    Maybe sqltrace a call and see what the reason is.
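    As a rough sketch of that trace (assuming 10g or later where DBMS_MONITOR is available; the bind values are just the ones quoted above):

    -- Tag the trace file so it is easy to find in USER_DUMP_DEST.
    ALTER SESSION SET TRACEFILE_IDENTIFIER = 'get_join_dates_test';

    -- Trace the current session, including waits and bind values.
    EXEC DBMS_MONITOR.SESSION_TRACE_ENABLE(waits => TRUE, binds => TRUE);

    SELECT get_join_dates( 2010, '01', '2167543', 'PSYCH01' ) FROM dual;

    EXEC DBMS_MONITOR.SESSION_TRACE_DISABLE;

    -- Run tkprof on the resulting trace file to see where the ~6 seconds go
    -- (e.g. a full scan of ORG_BODY inside the function, or recursive SQL).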

  • IdM Scheduler and Time Change to Daylight Savings Time GMT to BST

    This isn't really a 'question' , but just a topic for discussion.
    We're not live yet, but will be by the time the clocks go back. So I'm just interested to hear everyone's experiences and any issues with regard to switching to / from DST.
    I did some "interesting" (or not) tests last night to see what happens with Pending Values which are due to take effect at the time change threshold of 01:00 GMT when switching to Daylight Savings Time (BST) GMT +0100.
    It seems as if the IdM scheduler keeps going according to UTC or GMT. However any values provisioned to target systems or to local tables using the ddm.datetime8601 system variable are in local time according to the runtime server or SQL server's locale settings.
    I set up a task to output from a javascript the current system date
    using the following in the functions.
    currentTime = new Date();
    return currentTime.toString(); // returns Locale Time String
    currentTime = new Date();
    return currentTime.toUTCString(); // returns UTC Time String
    in the same pass I also output the ddm.datetime8601 system parameter and the ddm.time24 system parameter.
    This task was to update a comment on an entry and Scheduled to run at 00:00:00 and was executed at 00:00:06 GMT all times were still in GMT as expected.
    Local Time: Sun Mar 27 00:00:06 GMT-0000 (GMT) 2011, UTC Time: Sun, 27 Mar 2011 00:00:06 GMT, ddm.datetime8601: 2011-03-27T00:00:06, ddm.time24: 00:00:06
    This task was scheduled at 01:00:00 in IdM and was executed at 01:00:08 GMT by the runtime, and you can see the local time is being correctly reported as 02:00:08 BST. Note also the ddm.datetime8601 is also 02:00:08, i.e. BST.
    Local Time: Sun Mar 27 02:00:08 GMT+0100 (BST) 2011, UTC Time: Sun, 27 Mar 2011 01:00:08 GMT, ddm.datetime8601: 2011-03-27T02:00:08, ddm.time24: 02:00:08
    In the database the entry values MXI_VALUES ModifyTime field has the GMT / UTC timestamp.
    I also set up some pending value objects to be applied at 00:59:59 and 01:30:00 to see if the 01:30:00 would be 'skipped', as the local clock change would've gone from 00:59:59 to 02:00:00. Well, the pending values were correctly applied and expired, and no 'lost' transactions happened as a result of the change.
    So essentially it seems to me that the IdM runtime and scheduler times are all based on UTC / GMT.
    I guess it's only an issue twice a year if you're expecting something to happen at a specific local time 01:00:00 or if you've got a global user base and their account expiry should be in each user's own country's local time. In these cases, it would seem that you'd need to convert the schedule times and pending value times to UTC first before setting them in the database.
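    (Purely as an illustration of that conversion, and assuming an Oracle repository for the moment, since on SQL Server the equivalent functions differ, normalising a local schedule time to UTC could look like the sketch below; the literal is just one of the test times above.)

    -- Attach the local time zone to the naive timestamp, then extract UTC.
    SELECT SYS_EXTRACT_UTC(
             FROM_TZ(TIMESTAMP '2011-03-27 00:30:00', 'Europe/London')
           ) AS schedule_time_utc
    FROM   dual;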
    Has anyone else had any issues, thoughts, questions or suggestions on scheduling tasks which span the DST time change when switching to and from Daylight Savings Time?
    Edited by: Paul Abrahamson on Mar 27, 2011 10:51 AM

    Read here:
    http://www.appleinsider.com/articles/10/10/11/applesays_ios_software_update_to_fix_pesky_alarm_clockbug.html

  • Problem in using query-database() function in Transformation

    Hi All,
    I am using JDev and SOA 10.1.3.4.
    I have an async process.
    In that I am doing a transformation in which source is InputVariable and target is result.
    In Transformation I am using query-database function to fetch a record from DB and am assigning that to result.
    *<xsl:value-of select='orcl:query-database(concat("select ename from emp where empid=",/ns1:DBXSLProcessRequest/ns1:input),false(),false(),"jdbc:oracle:thin:scott/tiger@localhost:1521:abcd")'/>*
    I am not getting any error, but query-database is not returning anything; I mean it is returning null.
    If i run that query in SQL prompt it is returning the empname.
    Please help me out.
    Regards
    PavanKumar.M

    Hi Pavan
    I tried the following in the BPEL transform activity's XSL file, and it works fine and returns the sysdate for me.
    <xsl:value-of select='orcl:query-database("select sysdate from dual",false,false,"jdbc/myDS")'/>
    Can you try the above in a new XSL file of the Transform activity and let me know if it gives you the result? If it works you can start making changes according to your requirement.
    steps to follow:
    1. Create a connection pool in EM, and make sure it's working fine by using Test on the connection pool.
    2. Create a datasource and point it to the connection pool created above, and restart the oc4j_soa.
    3. Rename the XSL file of the transform activity in BPEL and use the above query-database function. Sometimes XSL files are cached, so even if you make changes they will not take effect; for testing, renaming the file is a good idea.
    Thanks
    Seshagiri.Rayala
    http://soabpel.wordpress.com/

  • Changing database server on a report with subreports = formula error

    Good morning,
    I currently have several reports that print out, and were developed attached to our development database. However, I need to be able to dynamically change the server that the report uses according to the server configured in our application. Each of these reports contains one or more subreports, which point to the same server and database as the main report. All reports, both the main and subreports, are based on manual SQL commands.
    I'm running into some significant issues. So significant, in fact, that we were forced to deploy our application with reports that had been switched to our production environment in the designer in order to get them functional. This is, obviously, not an acceptable or long-term solution.
    I've gone round and round a couple of times, and I get different results with different methods of changing this information. I'll outline them below. First, my current code:
    ConnectionInfo connectionInfo = new ConnectionInfo();
    TableLogOnInfo logOnInfo = new TableLogOnInfo();
    Console.WriteLine("Report \"{0}\"", report.Name);
    foreach (Table table in report.Database.Tables)
    {
        logOnInfo = table.LogOnInfo;
        connectionInfo = new ConnectionInfo(logOnInfo.ConnectionInfo);
        connectionInfo.ServerName = "panthers-dev";
        connectionInfo.DatabaseName = "Prosys";
        logOnInfo.ConnectionInfo = connectionInfo;
        //table.Location = "Prosys.dbo." + table.Location.Substring(table.Location.LastIndexOf(".") + 1);
        table.ApplyLogOnInfo(logOnInfo);
        table.LogOnInfo.ConnectionInfo = connectionInfo;
        Console.WriteLine("\t\"{0}\": \"{1}\", \"{2}\", \"{3}\", {4}", table.Name, table.LogOnInfo.ConnectionInfo.ServerName, table.LogOnInfo.ConnectionInfo.DatabaseName, table.Location, table.TestConnectivity());
    }
    foreach (Section section in report.ReportDefinition.Sections)
    {
        foreach (ReportObject ro in section.ReportObjects)
        {
            if (ro.Kind == ReportObjectKind.SubreportObject)
            {
                SubreportObject sro = (SubreportObject)ro;
                ReportDocument subreport = report.OpenSubreport(sro.SubreportName);
                Console.WriteLine("\tSubreport \"{0}\"", subreport.Name);
                foreach (Table table in subreport.Database.Tables)
                {
                    logOnInfo = table.LogOnInfo;
                    connectionInfo = new ConnectionInfo(logOnInfo.ConnectionInfo);
                    connectionInfo.ServerName = "panthers-dev";
                    connectionInfo.DatabaseName = "Prosys";
                    logOnInfo.ConnectionInfo = connectionInfo;
                    //table.Location = "Prosys.dbo." + table.Location.Substring(table.Location.LastIndexOf(".") + 1);
                    table.ApplyLogOnInfo(logOnInfo);
                    table.LogOnInfo.ConnectionInfo = connectionInfo;
                    Console.WriteLine("\t\t\"{0}\": \"{1}\", \"{2}\", \"{3}\", {4}", table.Name, table.LogOnInfo.ConnectionInfo.ServerName, table.LogOnInfo.ConnectionInfo.DatabaseName, table.Location, table.TestConnectivity());
                }
            }
        }
    }
    Using this approach, my console output prints what I expect and want to see: the correct server and database information, and True for TestConnectivity for all reports and subreports. The two reports I have that have no subreports print out correctly, with data from the proper server. However, all of the reports with subreports fail with formula errors. If this procedure is not run, they work just fine on either server.
    I had to place the assignment of table.LogOnInfo.ConnectionInfo = connectionInfo after the call to ApplyLogOnInfo, as that function did not behave as expected. If I perform the assignment first (or not at all), then calling ApplyLogOnInfo on the outer report's table did NOT affect the values of its ConnectionInfo object, but it DID affect the values of the ConnectionInfo object's of its subreports!
    In any event, if anyone could post a code sample of changing database connection information on a report containing subreports, I would appreciate it.
    Any help is greatly appreciated and anxiously awaited!

    Hi Adam,
    Code for changing database connection information on a report containing subreports :
    private ReportDocument northwindCustomersReport;

    private void ConfigureCrystalReports()
    {
        northwindCustomersReport = new ReportDocument();
        string reportPath = Server.MapPath("NorthwindCustomers.rpt");
        northwindCustomersReport.Load(reportPath);
        ConnectionInfo connectionInfo = new ConnectionInfo();
        connectionInfo.ServerName = "localhost";
        connectionInfo.DatabaseName = "Northwind";
        connectionInfo.IntegratedSecurity = false;
        SetDBLogonForReport(connectionInfo, northwindCustomersReport);
        SetDBLogonForSubreports(connectionInfo, northwindCustomersReport);
        crystalReportViewer.ReportSource = northwindCustomersReport;
    }

    private void Page_Init(object sender, EventArgs e)
    {
        ConfigureCrystalReports();
    }

    private void SetDBLogonForReport(ConnectionInfo connectionInfo, ReportDocument reportDocument)
    {
        Tables tables = reportDocument.Database.Tables;
        foreach (CrystalDecisions.CrystalReports.Engine.Table table in tables)
        {
            TableLogOnInfo tableLogonInfo = table.LogOnInfo;
            tableLogonInfo.ConnectionInfo = connectionInfo;
            table.ApplyLogOnInfo(tableLogonInfo);
        }
    }

    private void SetDBLogonForSubreports(ConnectionInfo connectionInfo, ReportDocument reportDocument)
    {
        Sections sections = reportDocument.ReportDefinition.Sections;
        foreach (Section section in sections)
        {
            ReportObjects reportObjects = section.ReportObjects;
            foreach (ReportObject reportObject in reportObjects)
            {
                if (reportObject.Kind == ReportObjectKind.SubreportObject)
                {
                    SubreportObject subreportObject = (SubreportObject)reportObject;
                    ReportDocument subReportDocument = subreportObject.OpenSubreport(subreportObject.SubreportName);
                    SetDBLogonForReport(connectionInfo, subReportDocument);
                }
            }
        }
    }
    Hope this helps!!
    Regards,
    Shweta

  • Conversion from summer time (daylight savings) to winter time.

    Hi All,
    When converting from summer to winter time (with one double hour), action needs to be taken for the ABAP stacks to make sure no problems arise. For the ABAP stacks there are several possibilities; e.g. see /people/tikkana.akurati/blog/2009/12/09/daylight-saving-time-and-slowing-down-the-time
    What about the JAVA stacks, however? Is it OK to keep them running, or does action need to be taken for JAVA systems too? And what if there is JAVA <-> ABAP communication and the ABAP stack is set to slow time during this hour?
    Can anyone advise on this?
    Thanks.
    Regards,
    Bart Groot

    Hi,
    Well, it is no longer necessary to stop an ABAP system when coming back to winter time.
    Check OSS note 950114 about the system parameter zdate/DSTswitch_contloctime.
    For the JAVA stack, as there is no functional data in the database, I don't think there is anything to do.
    You might get strange log files, but who cares? Java log files are already so strange!
    Regards,
    Olivier

  • Capturing change code/Reason code in CAT2 transaction for time change

    Experts,
    I wanted your suggestions and expertise on a solution where, in CAT2, if a user changes the time/WBS element/tax area/work type, he should be forced to enter a reason code for the change. The change event would populate/capture the change code and reason for change at cell level in CAT2.
    Your pointers would be of great help

    The following enhancements are required to achieve the new pop up window functionality:
    1.     A new customer field should be added to the CATS time entry screen in the CATSDB.
    2.     A new check table should be created in CATSDB containing the allowable drop down entries for the new field.
    Customer fields are inserted into the CI_CATSDB structure. This structure is contained in the database table for the Time Sheet (CATSDB).
    Steps:
    1. Create a customer project using the SAP enhancement CATS0005.
    2. On the initial project administration screen, select the Enhancement components field and choose Change.
    3. Select the entry CI_CATSDB and choose Edit -> Components.
    4. Create the structure CI_CATSDB.
    5. Insert the new field to the Time Sheet database table into the structure.
        Select the data dictionary type CHAR
    6. Check and save the structure.
    7. Activate the structure.
    8. In the Cross Application > Time Sheet section of the IMG, choose Make field assignment.
    9. Assign the new field to the view.
    10. Assign a number from 1 - 10 and specify the name of the field, "Change Reason".
    Then add condition checks for the new field according to your requirements.

  • Summer Time -- Winter Time

    Hello fellow admins,
    In Europe, during the night of october 29th to october 30th, we will change from Summer time to Winter Time.
    In my company we have always stopped our SAP systems during the "time leap" from 3am to 2am.
    We know that, when setting the system parameter zdate/DSTswitch_contloctime = on, the SAP abap kernel will be able to "slow" time to avoid the time leap.
    We are now investigating whether we could avoid stopping our ECC6, R/3 4.7, SRM and CRM systems.
    What bothers us is that SAP only guarantees the BASIS module and tells us that we should check each functional module to decide if a time difference between SAP time and legal time of up to 30 minutes is acceptable for our business processes.
    I would like to do a kind of poll to know what you do with your own SAP systems.
    Could you, please, answer by telling which SAP system is used and whether you stop it or keep it running during the time change?
    Thank you in advance for sharing information.
    Regards,
    Olivier

    Hello
    During that period our business will send a notice to the end users that the system will not be available, and we stop all our applications (DEV, QAS & PRD): ECC, ECC HCM, EP, SRM, Solman.
    It's always good to bring down the system to avoid any business impact or any new challenges.
    Thanks & regards
    bala

  • Fall Daylight time change & 9i

    I am a Unix system administrator and not DBA.
    We have a mission critical DB application running on HP-UX 11.23 Itanium.
    With the forthcoming fall daylight change, the time will fall back from 2AM to 1AM. The DBAs are saying the change will create a mess with the application. I am not sure how much the actual application or jobs are dependent on time and how this affects Oracle itself. The DBAs are asking for a way to slow down the system clock so that downtime can be avoided.
    I want to know how this is being taken care of at other places.
    Is it possible to have a smooth transition of time without affecting the application operation?
    I read there are daylight-saving datatypes in 9i, but I am not sure if they can take care of this change at the Oracle level (recovery etc.) and at the application DB transaction level.
    Will they help to have zero downtime?
    I would appreciate your feedback or experience suggestions.
    Thanks in advance

    Thanks Pierre for quick reply.
    I referred to those documents and, as you said, things dependent on the SYSDATE function would be affected. This function could be used in many PL/SQL programs/jobs of the application and would certainly be affected by a change in the system clock.
    Referring to doc 149120.1, I guess the timezone/daylight support was added in Oracle 9i so one can use those datatypes within the database (part of the globalisation of the DB). This has nothing to do with SYSDATE or any ability of Oracle to let PL/SQL application code/jobs run without problems in case the system clock changes. Am I right?
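    (One quick way to see the distinction, as a sketch: SYSDATE and SYSTIMESTAMP follow the database server's OS clock, while CURRENT_TIMESTAMP follows the session time zone, so only the time-zone-aware values carry an offset you can reason about.)

    -- Any session time zone will do for the comparison.
    ALTER SESSION SET TIME_ZONE = 'America/New_York';

    SELECT SYSDATE           AS os_clock_date,   -- no time zone; moves with the OS clock
           SYSTIMESTAMP      AS os_clock_ts,     -- carries the server's UTC offset
           CURRENT_TIMESTAMP AS session_ts       -- expressed in the session time zone
    FROM   dual;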
    The application is running 24x7. Are you aware of any workaround in such a situation to have smooth operation (business continuance)?
    I think this situation might be there for most businesses; how do they tackle this time change?
    Any guidance would be appreciated.

  • How to find which all workbook is using Database function ( User Defined)

    Hi All,
    Is it possible to find out which workbooks are using a (user-defined) database function?
    Thanks,

    Hi,
    If I had to do this detective work, I would probably do the following:
    1. Activate, for a period of time, the function eul5_post_save_document. This function, when activated, is triggered at the time a workbook is saved. If you look at its columns, it saves the worksheet's SQL.
    2. Next, I would parse the EUL5_WORKSHEET_SQL.SQL_SEGMENT column, which is a varchar2(4000) column. There are many effective Oracle functions which could aid you in this effort (e.g. INSTR or perhaps a regular expression function); see the sketch below.
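    A minimal sketch of step 2 (EUL5_WORKSHEET_SQL and SQL_SEGMENT are as described above; the function name is only a placeholder, and very long SQL can be split across several segment rows, so you may need to stitch segments back together):

    -- Find worksheet SQL segments that mention the (placeholder) function name.
    SELECT *
    FROM   eul5_worksheet_sql
    WHERE  INSTR(UPPER(sql_segment), 'MY_DB_FUNCTION') > 0;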
    I hope this helps.
    Patrick

  • Need to know how to find the last execution time for a function module

    HI all
    I need to know
    1) How to find out the last execution time of a function module?
      Say, for example, I have executed a function module at 1:39pm. How do I retrieve this time (1:39pm)?
    2) I have created 3 billing documents in tcode VF01, i.e. 3 billing document numbers would be created in the SAP table "VBRP" between 12am and 12:30am.
    How do I capture the latest SAP database update between those time intervals?
    3) Suppose I am downloading a TXT file using "GUI_DOWNLOAD" and say in the 20th record some error has happened. I can capture the error using the exception.
    Is it possible to run the program once again from the 21st record? All this will be running in the background...
    Kindly clarify....
    Points will be rewarded
    Thanks in advance

    1. Use tcode STAT, enter the tcode of the FM as input, and execute.
    2. The billing documents are created in table VBRK (header), and there will always be a creation date and time.
    VBRK-ERDAT is the date; you can check the time field as well.
    So if you take the date and time, we can filter and then display the records in intervals (see the sketch below).
    3. With an error exception, how is your TXT download finished?
    Once the exception is raised there will not be a download.
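    Roughly, that selection on VBRK looks like the sketch below (shown as plain SQL; in an ABAP program you would add an INTO clause, and the date literal is only an example):

    -- Billing documents created on a given date between 00:00:00 and 00:30:00.
    SELECT vbeln, erdat, erzet
    FROM   vbrk
    WHERE  erdat = '20110101'
    AND    erzet BETWEEN '000000' AND '003000';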
    regards,
    vijay

  • How to pass dynamic parameter to a database function in OBIEE

    Hi,
    I have a requirement like this: I have to create a report in OBIEE which was in Discoverer. In the Discoverer report there are some calculated items in the worksheet based on database package functions. The parameters which the user gives at run time are passed to the Discoverer calculated items dynamically. But I am not able to do this in OBIEE Answers.
    Can anyone tell me, step by step, how I can pass the user-selected parameter values at the OBIEE Answers level?
    The example:
    GET_COMM_VALUE_PTD("AFE Cost & Commitment".Afe Id,:"Period Name(AFE)","AFE Cost & Commitment".Data Sel,"AFE Cost & Commitment".Org Id)
    GET_COMM_VALUE_PTD --- Function database
    ("AFE Cost & Commitment".Afe Id,:"Period Name(AFE)","AFE Cost & Commitment".Data Sel,"AFE Cost & Commitment".Org Id --- Parameters... :"Period Name(AFE)" is the dynamic parameter selected run time by user.
    Please help.
    Thanks
    Titas

    Hi,
    I already did that. But the existing Discoverer report is showing the correct value, while the OBIEE report with the EVALUATE function shows an incorrect value.
    The requirement is to create an OBIEE dashboard from an existing Discoverer report. Now, in the Discoverer report there are several calculated items at worksheet level where a database custom function is used and the user-selected parameter value is passed to that function at run time. But when that is created in OBIEE with the EVALUATE function in the RPD, the report shows incorrect data. I could not understand how to pass the parameter value at runtime.
    Below is the discoverer 'Show SQL' query and EVALUATE function syntax which has been used.
    SELECT APPS.XXBG_PL_PROJ_ANALYSIS_AFE_PKG.GET_COMM_VALUE_PTD(XXBG_PL_PROJ_AFE_V.AFE_ID,
    :"Period Name(AFE)",
    XXBG_PL_PROJ_AFE_V.DATA_SEL,
    XXBG_PL_PROJ_AFE_V.ORG_ID),
    XXBG_PL_PROJ_AFE_V.AFE_DESC,
    XXBG_PL_PROJ_AFE_V.AFE_NUMBER,
    XXBG_PL_PROJ_AFE_V.APPROVED_AFE_AMOUNT
    FROM APPS.PA_PERIODS_ALL PA_PERIODS_ALL,
    APPS.XXBG_PL_PROJ_AFE_V XXBG_PL_PROJ_AFE_V
    WHERE ((XXBG_PL_PROJ_AFE_V.ORG_ID = PA_PERIODS_ALL.ORG_ID))
    AND (XXBG_PL_PROJ_AFE_V.DATA_SEL = :"Data Selection(AFE)")
    AND (PA_PERIODS_ALL.PERIOD_NAME = :"Period Name(AFE)")
    AND (XXBG_PL_PROJ_AFE_V.AFE_NUMBER = :"AFE Number(AFE)")
    AND (XXBG_PL_PROJ_AFE_V.OPERATING_UNIT = :"Operating Unit(AFE)")
    The EVALUATE function syntax is as below:
    EVALUATE('XXBG_PL_PROJ_ANALYSIS_AFE_PKG.GET_COMM_VALUE_PTD(%1,%2,%3,%4)' AS FLOAT , "BG PL Project Analysis Report_1"."AFE Cost & Commitment"."AFE Cost & Commitment.Afe Id", "BG PL Project Analysis Report_1"."Periods 1"."Periods 1.Period Name", "BG PL Project Analysis Report_1"."AFE Cost & Commitment"."AFE Cost & Commitment.Data Sel", "BG PL Project Analysis Report_1"."AFE Cost & Commitment"."AFE Cost & Commitment.Org Id")
    The PERIOD, which the user will select, needs to be passed at run time.
    Please help to solve the issue.
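    For what it's worth, the usual workaround (not verified against this particular model, so treat it as a sketch) is to expose the period as a dashboard prompt that sets a presentation variable, and then reference that variable as a literal inside the EVALUATE call in the column formula. With a presentation variable named pv_period (the name and the default JAN-11 are illustrative), the formula would look roughly like:

    EVALUATE('XXBG_PL_PROJ_ANALYSIS_AFE_PKG.GET_COMM_VALUE_PTD(%1,%2,%3,%4)' AS FLOAT,
             "BG PL Project Analysis Report_1"."AFE Cost & Commitment"."AFE Cost & Commitment.Afe Id",
             '@{pv_period}{JAN-11}',
             "BG PL Project Analysis Report_1"."AFE Cost & Commitment"."AFE Cost & Commitment.Data Sel",
             "BG PL Project Analysis Report_1"."AFE Cost & Commitment"."AFE Cost & Commitment.Org Id")

    The prompt then writes the user's period choice into pv_period before the request runs, which would play the role of the Discoverer :"Period Name(AFE)" bind.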

  • Changing Database connection at runtime in CR2008

    Hi,
    I have a VS2008 application which is using CR2008 to generate PDF reports for the user. The problem is I am not able to change the database connection to any server other than the one that was used to create the report template.
    My environment has CR2008 SP2. I have tried reports with no parameters as well as ones with parameters; both don't seem to work.
    I have already tried a couple of things:
    a) report document refresh after loading it
    b) setting the database connection as
    this._crReport.DataSourceConnections[0].SetConnection(dbServerName, dbName, dbUserId, dbPwd);
    c) setting the database connection as
    tbls = this._crReport.Database.Tables;
    bool test = false;
    foreach (Table tbl in tbls)
    {
        tblLogonInfo = tbl.LogOnInfo;
        tblLogonInfo.ConnectionInfo = connInfo;
        tbl.ApplyLogOnInfo(tblLogonInfo);
        test = tbl.TestConnectivity(); // note: this always returns true, even for the changed database server
    }
    I have read a couple of threads but none has worked for me... Is this a known issue with CR2008?
    Thanks
    Kajal

    Hi Kajal,
    I am assuming that you are not changing the schema of the report at any point. I would suggest you try pointing to the other database through the designer to make sure that you can hit the database and the schema is correct. You can also create a udl file to check the database connectivity.
    Then just for testing purpose you can try the following code:
    ReportDocument reportDoc = new ReportDocument();
    reportDoc.Load("path of the report");
    reportDoc.SetDatabaseLogon("uid", "pwd", "ServerName", "DatabaseName");
    reportDoc.SetParameterValue("Parameter-Name", "value");
    crystalReportViewer.ReportSource = reportDoc;
    Does that help?
    Thanks.

  • Issue with changing database location at runtime

    I am having a similar issue to:
    Re: Issue with changing database location at runtime
    where I am using Crystal Reports 2008 SP3, fix pack 3.3, and an OLE DB connection to a SQL 2008 R2 server. Running on a computer on the network where the report can see the original development server is fast; on other computers, where that server is not available, reports hang for 20 seconds before coming up.
    I am using the same code from the Crystal sample app to change the connection on each table. I found that the slowness comes the first time the ReportDocument object is accessed to get the collection of database tables, before the connection info is set.
    Fix pack 3.4 was mentioned in that post.  Does fix pack 3.4 fix that issue? I don't see fix pack 3.4 on the downloads page (https://websmp130.sap-ag.de/sap%28bD1lbiZjPTAwMQ==%29/bc/bsp/spn/bobj_download/main.htm)

    I just found that on the reports that were having the issue there was a SQL Expression. Per this thread:
    Re: Report load is slow after changing database servers
    There was an issue with that, and the fix is not out till the end of February, so I found a way not to use the SQL Expression, and the speed is much better.
    However, your information provided led me to this post:
    Cannot Change Table Location, but Only for One Report
    And I am also experiencing an issue where the table location is not changing on one subreport and I will look into that as a possible solution.
    Thanks so much for your help.
