SQL failover test for reporting

I have set up a mirror and failover for SQL App-V reporting.
Does anyone know of a good way to test whether the failover is working?
Thanks
Dave Kozlowski

Note that SQL Mirroring is not supported for the Reporting database (only for the Management DB):
http://technet.microsoft.com/en-us/library/dn343758.aspx
In this thread someone tried to do the same for Reporting as for Management, but it didn't work:
http://social.technet.microsoft.com/Forums/en-US/20b1d02d-7b94-4b9d-8dfe-d81df93dc65e/appv-5-reporting-database-sql-mirroring?forum=mdopappv
And here is an article that explains in a bit more detail how to configure mirroring for the Management DB (but again not for Reporting):
http://www.sbcprojects.com/blog/93-how-to-get-your-app-v-5-database-high-available-with-sql-mirroring.html
So I'd say it probably isn't that straightforward. I haven't actually tested how the client behaves when the Reporting Server can't talk to the DB, but I'd expect the Reporting Server to return an error to the client, so the client knows the reporting data was not uploaded successfully. In that case the client keeps the usage data locally and simply retries the upload at the next scheduled interval, so HA for the Reporting DB should not be necessary.
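If you do want to exercise the mirror on the Management DB side (the supported scenario), a minimal sketch is to check the mirroring state and then force a manual failover from the current principal; AppVManagement below is an assumed database name, so substitute your own:
-- Run on the current principal. 'AppVManagement' is an assumed name.
-- 1) Check the mirroring role and state of every mirrored database.
SELECT  db_name(database_id)  AS database_name,
        mirroring_role_desc,  -- PRINCIPAL or MIRROR
        mirroring_state_desc  -- should be SYNCHRONIZED before a manual failover
FROM    sys.database_mirroring
WHERE   mirroring_guid IS NOT NULL;
-- 2) Force a manual failover to the partner (only valid on the principal
--    and only while the session is SYNCHRONIZED).
ALTER DATABASE AppVManagement SET PARTNER FAILOVER;
-- 3) Re-run step 1 on both partners to confirm the roles have swapped,
--    then check that the Management Server still responds.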
Falko
Twitter @kirk_tn   |   Blog kirxblog   |   Web kirx.org   |   Fireside appvbook.com

Similar Messages

  • Creating ABAP Unit Tests for Report Programs

    How can we create ABAP unit tests for report programs?
    Could you please explain the steps?

    Hi Devendra.
    I would like to suggest some references which are as below:
    [SAP HELP - Complete ABAP unit|http://help.sap.com/saphelp_nw04/helpdata/en/a2/8a1b602e858645b8aac1559b638ea4/frameset.htm]
    [SDN - Reference for ABAP unit testing|about unit testing;
    Hope that's useful.
    Good Luck & Regards.
    Harsh Dave
    Edited by: Harsh Dave on Jul 14, 2008 2:13 PM
    Edited by: Harsh Dave on Jul 14, 2008 2:20 PM

  • SCOM 2012 Installation does not show the SQL Server Instance for reporting services

    I have an issue when I try to install SCOM 2012 with the Reporting Server feature: at the step where I have to select a SQL Server instance I cannot do so, because, as you can see in the picture, no options appear.
    In my scenario I have a DC server, SQL Server 2008 R2, SCCM 2012, and on another server I am trying to install SCOM 2012. I am trying to use the same SQL Server for SCOM 2012.
    Should I run any other procedure on SQL Server before continuing? I would appreciate it if someone could tell me what I need to do to resolve this issue.
    Regards,
    Paul Mendoza.

    Hi Paul
    Can you confirm that SQL Server Reporting Services is installed on the server where you are running the install?
    For a lab environment where you have the SCOM Management Server and Web Server on one server and SQL Server on another, you need to:
    - run setup on the SCOM server and choose NOT to install reporting. The install here creates the OperationsManager database and the OperationsManagerDW (reporting) database.
    - run setup on the SQL Server where SQL Reporting Services is installed, and there choose to install reporting. You should then see the SQL Reporting Services instance in the window that is blank in your screenshot.
    Take note of the warning in that window - SCOM uses its own role based security within SQL Reporting Services and this can break other reporting. E.g. you can't install SCOM and SCSM reporting (SQL RS component) together. 
    Cheers
    Graham
    New SCOM 2012 Blog! - http://www.systemcentersolutions.com/blog/
    View OpsMgr tips and tricks at http://systemcentersolutions.wordpress.com/

  • SQL to test for any combination of three characters

    Greetings,
    I am using Oracle 10g (10.1.0.4.0) on Linux (Red Hat Enterprise Linux AS release 3). One of our database tables has a VALUE column. Valid values for this column is a string comprising any combination of the letters 'A', 'B', 'C' or 'D' where each letter appears only once, for example: 'CAB' or 'DCB' or 'DB' or 'A' or 'ABCD', etc.
    [Note that 'BBD' is not valid because the letter 'B' appears twice.]
    I am looking for SQL that will test whether the letter 'D' is not present in the string.
    Thanks (in advance),
    Avi.

    Check this out :
    TEST@db102 SQL> create or replace function check_str (str_in in varchar2)
      2     return varchar2
      3  is
      4     len     number;
      5  begin
      6     for letter in 65..68 loop
      7             len := length(str_in) - length(replace(str_in, chr(letter)));
      8             if len > 1 then
      9                     return str_in || ' incorrect';
    10             end if;
    11     end loop;
    12     return str_in || ' correct';
    13* end;
    TEST@db102 SQL> /
    Function created.
    TEST@db102 SQL> select check_str('A') from dual;
    CHECK_STR('A')
    A correct
    TEST@db102 SQL> select check_str('ABC') from dual;
    CHECK_STR('ABC')
    ABC correct
    TEST@db102 SQL> select check_str('ABBC') from dual;
    CHECK_STR('ABBC')
    ABBC incorrect
    TEST@db102 SQL> select check_str('ABCC') from dual;
    CHECK_STR('ABCC')
    ABCC incorrect
    TEST@db102 SQL>
    ...and so on.
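    An alternative that avoids PL/SQL altogether (a sketch only; t and value below are placeholders for your actual table and column names) is to validate the strings with regular expressions, available in 10g: one pattern rejects any character outside A-D, and a backreference rejects any character that appears twice:
    -- Sketch only: t / value are placeholder names for your table and column.
    SELECT value,
           CASE
             WHEN NOT REGEXP_LIKE(value, '[^ABCD]')   -- nothing outside A, B, C, D
              AND NOT REGEXP_LIKE(value, '(.).*\1')   -- no character appears twice
             THEN 'correct'
             ELSE 'incorrect'
           END AS check_result
    FROM   t;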

  • Interesting test for report compilation error

    I just tried to figure out the compilation error I had and I ran into this:
    I deleted all fields on my report so the report was empty.
    Then I added a background color to the report header section and ran the report; it worked fine. Then I added a label on top of the report header section and got this error:
    Report compilation error. Error at (32, 20: null
    Is there a bug in the report or is it just my problem?
    Help!!

    Similar issue here. It happens with just a few reports. I can compile them in Reports Builder, but not using rwconverter.sh. I am using 11.1.2.
    Any clues so far?

  • Mainframe data loaded into Oracle tables - Test for low values using PL/SQL

    Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle. Some columns in the mainframe data had 'low values' in them. These columns were defined on the Oracle tables as VARCHAR2 types.
    Looking at the data, some of these columns appear to contain what look like little square boxes; I'm not sure, but maybe that is how Oracle renders the 'low values' from the original data in a VARCHAR2. When I run a select to find all rows where this column is not null, it returns these rows. In the results of the select statement the columns appear to be blank; however, looking at the data in SQL Developer, I can see the odd 'square boxes'. My guess is that the select statement is detecting that something exists in this column.
    Long story short, somehow I am going to have to test this legacy data in the Oracle tables using PL/SQL to check for 'low values'. Does anyone have any suggestions on how I could do this? Help! The mainframe data we are loading into these tables is loaded with columns containing low values.
    I am using Oracle 11i.
    Thanks
    Edited by: ncsthbell on Nov 2, 2009 8:38 AM

    ncsthbell wrote:
    Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle.
    Not a wise thing to do. Mainframe operating systems typically use EBCDIC, while Unix and Windows servers use ASCII. The endianness is also different (big endian vs. little endian).
    Does anyone have any suggestions on how I could do this?
    As suggested, use the SQL function DUMP() to see the actual contents (in hex) of these columns.
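    For example (a sketch only; legacy_table and legacy_col stand in for your actual names), DUMP() shows the byte values, and mainframe low-values usually arrive as CHR(0) bytes, which you can search for and strip:
    -- 1) Inspect the raw bytes of the suspect column
    SELECT legacy_col, DUMP(legacy_col) AS raw_bytes
    FROM   legacy_table
    WHERE  ROWNUM <= 10;
    -- 2) Count rows whose column contains the low-value byte (0x00)
    SELECT COUNT(*)
    FROM   legacy_table
    WHERE  INSTR(legacy_col, CHR(0)) > 0;
    -- 3) Strip the 0x00 bytes; a column that held only low-values then becomes NULL
    SELECT REPLACE(legacy_col, CHR(0)) AS cleaned_col
    FROM   legacy_table;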

  • Recreate standby after failover test

    Hi -
    I'm doing a failover test for the customer:
    1. Disconnect all network between Primary and Standby
    2. On the Standby - do a failover and open it to perform application tests.
    3. After we finish, we need to re-create the Standby again.
    Is there a way to re-create the Standby without restoring it all from backup? It's a large 2 TB database and that would take hours.
    Is there a way to do it using Flashback or some other technology?
    We are talking 10.2.0.3 here...
    Thanks
    Edited by: 912294 on 04:07 06/02/2012

    Is there a way to re-create Standby again without bringing it all from backup? it's a large 2TB database and it will take hours.
    Is there a way like using flashback or other technology to do it?
    We are talking 10.2.0.3 here...
    It's called "open the Standby database in read/write mode for any reporting or testing and then move it back to a standby database using the flashback technology".
    For the details, check MOS note *How To Open Physical Standby For Read Write Testing and Flashback [ID 805438.1]*
    Thanks.
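    The general shape of that approach looks roughly like the following (a sketch only; the restore point name is arbitrary, and the exact prerequisites, such as flashback logging being enabled on the standby, are in the MOS note):
    -- On the standby, before handing it over for the test:
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
    CREATE RESTORE POINT before_dr_test GUARANTEE FLASHBACK DATABASE;
    ALTER DATABASE ACTIVATE STANDBY DATABASE;   -- standby becomes read/write for the test
    ALTER DATABASE OPEN;
    -- After the application tests, turn it back into a standby without a 2 TB restore:
    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    FLASHBACK DATABASE TO RESTORE POINT before_dr_test;
    ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
    DROP RESTORE POINT before_dr_test;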

  • Service Accounts for Reporting Service in SQL Server Failover Cluster setup

    I am setting up two Reporting Services (SSRS) instances in a SQL Server failover clustering setup (version: 2012 SP1) on Windows 2012, as part of a scale-out architecture.
    There are 2 options to configure the service account for SSRS:
    Option 1) Use domain accounts, as I have done for the DB Engine and SQL Agent.
    Option 2) Accept the default, which is a virtual account for SSRS. Per the documentation URL:
    http://msdn.microsoft.com/en-us/library/ms143504.aspx
    Which is the recommended one? Is it option 2?
    There is a security note on the above URL as well, but it does not clearly state that option 1 is not recommended.
    Security Note:  Always run SQL Server services by using the lowest possible user rights. Use a MSA or  virtual account when possible. When MSA and virtual accounts are not possible, use a specific low-privilege user account or domain account instead
    of a shared account for SQL Server services. Use separate accounts for different SQL Server services. Do not grant additional permissions to the SQL Server service account or the service groups. Permissions will be granted through group membership or granted
    directly to a service SID, where a service SID is supported.
    Thanks very much for your help!

    Hi Luo Donghua,
    In a SQL Server failover cluster instance, both options can work well. If you use the virtual account for SQL Server Reporting Services: virtual accounts in Windows Server 2008 R2 and Windows 7 are managed local accounts that simplify service administration. The virtual account is auto-managed and can access the network in a domain environment.
    Of course, you can also use domain accounts in your cluster.
    Just make sure your service account is set up here, or that it is using a proper built-in account. For more information, see: http://ermahblerg.com/2012/11/08/cluster-ssrs-in-2008/
    Thanks,
    Sofiya Li
    TechNet Community Support

  • Need a good test to see SQL failover "in action"

    I have a two-node cluster in a test lab. I'm thinking of INSERTing a million rows into a table and, while the process is running, failing over to the other node. (Obviously I'm hoping for minimal loss.) Can you think of a better way to show failover "in action"?
    TIA,
    edm2

    Sorry to reiterate some of the above, but some of the points and the wording will be confusing to someone new to failover clustering.
    From the standpoint of an in-flight transaction you are correct: a failover cluster is no better than a standalone instance. However, if your standalone instance goes down then you are down, whereas in a failover cluster your SQL instance will be restarted or moved to another node of the cluster (failover is not an online operation).
    Key points:
    It doesn't matter if you have one node or eight nodes, an individual SQL failover cluster instance will only run on a single node of the cluster at any given time.
    A failover of a SQL Failover Cluster Instance is not an online operation. It is functionally equivalent to restarting SQL.
    In-flight transactions are rolled back during failover as SQL comes back online and your database goes through recovery.
    If you have some problem impacting that instance of SQL then the Windows Cluster will either restart SQL or failover depending on configuration of the cluster.
    Paul Burpo | Twitter | LinkedIn
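    For the demo edm2 describes, a minimal sketch (all object names below are placeholders; run it against the clustered instance's network name) is a batch that keeps inserting numbered, timestamped rows. Fail the instance over mid-run, watch the client connection drop and the batch die, then reconnect and check where the committed rows stop:
    -- Placeholder table; each INSERT commits on its own (autocommit),
    -- so rows committed before the failover survive, the in-flight one does not.
    CREATE TABLE dbo.FailoverDemo
    (
        Id         INT IDENTITY(1,1) PRIMARY KEY,
        BatchNo    INT       NOT NULL,
        InsertedAt DATETIME2 NOT NULL DEFAULT SYSDATETIME()
    );
    GO
    SET NOCOUNT ON;
    DECLARE @i INT = 1;
    WHILE @i <= 1000000
    BEGIN
        INSERT INTO dbo.FailoverDemo (BatchNo) VALUES (@i);
        SET @i += 1;
    END
    GO
    -- After the failover, reconnect and see where the inserts stopped.
    SELECT MAX(BatchNo)    AS last_committed_batch,
           MAX(InsertedAt) AS last_committed_time
    FROM dbo.FailoverDemo;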

  • Issue: SharePoint 2013 databases for Reporting Services on a second server, SQL Server 2012

    Hello,
    I have server A: Windows Server 2012 Standard, SQL Server 2012 Standard
    instance Sharepoint: contains the data for SharePoint
    instance Reporting: should contain the databases for reporting
    Server B:
    Windows Server 2012 Standard, contains the SharePoint 2013 installation
    SharePoint works (without Reporting Services); the databases are located on server A.
    My issue is:
    When I installed Reporting Services on server B, I had already installed SQL Server 2012 on server B, and it works.
    I am able to create a report in Report Builder and place it into SharePoint.
    But I would like to use only one full SQL machine, on server A.
    When I reconfigure the reporting settings on server A in Central Administration - Manage Service Applications, the databases are automatically created on SQL server A in the Reporting instance. That is no problem.
    But the first difference is that when I want to manage the service application for reporting under Provision Subscriptions and Alerts, there is the message:
    SQL Server Agent state cannot be determined
    When I want to create a report in Report Builder, I have this issue:
    server A-7380mw016\reporting, i.e. server A with the full SQL Server:
    The test of the connection was successful.
    Then I clicked Test Connection and received this screen with a failure: Logon failed for user NT AUTHORITY\ANONYMOUS LOGON
    My account belongs to the SQL admins on server A (A-7380mw016\reporting). I do not understand why it is not possible to create a report, when the connection test succeeds in the first step and then there is a problem in the second step...
    Please, can somebody help me?

    Hi,
    Since you are getting an Anonymous Logon error, it appears there may be a problem passing your credentials to the SQL Server Agent Service. This would indicate a Kerberos issue. See this thread for details:
    http://social.technet.microsoft.com/Forums/sharepoint/en-US/46b7c773-6a77-435d-b471-cb9a6ec41c43/has-anyone-else-upgraded-reporting-services-to-denali-2012
    Microsoft Virtual Academy: Breakthrough Insights using SQL Server 2012 : Analysis Services and Credible, Consistent data (Module 2) - Configuring and Securing Complex BI Applications in a SharePoint 2010 Environment with Microsoft SQL Server 2012
    http://technet.microsoft.com/en-us/video/Video/hh858469
    Tips from the video:
    We connect to Reporting Services using Kerberos when Reporting Services runs in SharePoint integrated mode.
    For the account running Reporting Services, we just need a dummy SPN. Go to the Attribute Editor tab in AD for the RS account; the Delegation tab then becomes available.
    On the Delegation tab, if we are using Claims to Windows Token, we need to use "Trust this user for delegation to specified services only".
    There you have 2 options: "Use Kerberos only" means we only want to delegate when the service that is doing the delegation actually has the Kerberos ticket to start with.
    "Use any authentication protocol" is needed when we require protocol transition (for example from NTLM to claims for intra-farm communication).
    We need to delegate this to the SQL Server.
    Please check out these articles as well:
    How to configure SQL Reporting Services in SharePoint Server for Kerberos authentication
    http://support.microsoft.com/kb/2723587
    Configure Kerberos authentication (Office SharePoint Server)
    http://blogs.technet.com/b/mbiswas/archive/2009/07/10/configure-kerberos-authentication-office-sharepoint-server.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support

  • Network DR test causes Exchange DAG network to fail (Failover Cluster Manager reports comms errors)

    We have a DAG configured between 2 mailbox servers, one in each of our main data centres. Our comms team recently performed a DR test between our 2 data centres, switching from the main production link to the backup link. During this outage the Failover Cluster Manager reported errors, with each mailbox server reporting the other as uncontactable. The events that were logged include the following:
    Isatap interface isatap.{02ADE20A-D5D4-437F-AD00-E6601F7E7A9D} is no longer active. (EventID 4201)
    Cluster node 'MAILBOX_SERVER' was removed from the active failover cluster membership. The Cluster service on this node may have stopped. This could also be due to the node having lost communication with other active nodes in the failover cluster. Run the
    Validate a Configuration wizard to check your network configuration. If the condition persists, check for hardware or software errors related to the network adapters on this node. Also check for failures in any other network components to which the node is
    connected such as hubs, switches, or bridges. (EventID 1135)
    File share witness resource 'File Share Witness (\\WITNESS_SERVER\SHARE_NAME)' failed to arbitrate for the file share '\\WITNESS_SERVER\SHARE_NAME'. Please ensure that file share '\\WITNESS_SERVER\SHARE_NAME' exists and is accessible by the cluster. (EventID
    1564)
    Cluster resource 'File Share Witness (\\\WITNESS_SERVER\SHARE_NAME)' in clustered service or application 'Cluster Group' failed. (EventID 1069)
    The Cluster service is shutting down because quorum was lost. This could be due to the loss of network connectivity between some or all nodes in the cluster, or a failover of the witness disk. Run the Validate a Configuration wizard to check your network
    configuration. If the condition persists, check for hardware or software errors related to the network adapter. Also check for failures in any other network components to which the node is connected such as hubs, switches, or bridges. (EventID 1177)
    The Cluster Service service terminated with service-specific error A quorum of cluster nodes was not present to form a cluster. (EventID 7024)
    The Microsoft Exchange Information Store service terminated unexpectedly.  It has done this 1 time(s).  The following corrective action will be taken in 5000 milliseconds: Restart the service. (EventID 7031)
    Looking at the Cluster Events in the Failover Cluster Manager snap-in, I see a heap of Event ID 47 entries (cannot activate the DAG databases as the server is not up according to the Windows Failover Cluster service) and:
    Node status could not be recorded. This could prevent some network failure logic from functioning correctly. NodeStatus:IsHealthy=True,HasADAccess=True,ClusterErrorOverrideFalse,LastUpdate=5/2/2011 8:25:42 AMUTC Failure:An Active Manager operation failed.
    Error: An error occurred while attempting a cluster operation. Error: Cluster API '"ClusterRegSetValue() failed with 0x6be. Error: The remote procedure call failed"' failed.. (EventID 184)
    Forcefully dismounting all the locally mounted databases on server 'BACKUP_MAILBOX_SERVER. (EventID 307).
    Our comms team doesn't believe it is a comms issue, as they did not log any network communication errors between the servers in the two sites (using ICMP). So if it is not a comms issue, how can I configure the Failover Cluster Manager to be resilient to this type of network failover event?
    Thanks
    Dan

    Isn't it also true that in a stretched DAG with an even number of nodes, the PAM needs to be in the same site as the active DAG node? If the connection between the nodes goes down and the PAM is in the "passive" site, the primary node will dismount the databases, since it can't check with the PAM to make sure it's safe for it to be up.
    In an even-numbered-node stretched DAG, the PAM moves to the DR/passive site every time a failover occurs, but doesn't automatically switch back when you reactivate the primary node.

  • How to test for different SELECTs in a single PL/SQL block?

    Hi,
    I am relatively new to PL/SQL and I am trying to do multiple SELECTs in a single PL/SQL block. I am confronted with the fact that if a single SELECT returns no data, I go to the NO_DATA_FOUND exception.
    So I would like to test for the different SELECTs.
    In an authentication script, I am searching a table for a user ID (USERID) and an application ID, to check whether a user is registered under this USERID for this APPLICATION.
    There are 4 possibilities:
    - USERID existing or not existing, and
    - application ID found or not found for this particular USERID.
    I would like to test for these 4 possibilities to get the status of this particular user regarding this application.
    The problem is that if one SELECT returns no row, I go to the NO_DATA_FOUND exception.
    In the example below you see that if no row is returned, it goes to the exception:
    DECLARE
    P_USERID VARCHAR2(400) DEFAULT NULL;
    P_APPLICATION_ID NUMBER DEFAULT NULL;
    P_REGISTERED VARCHAR2(400) DEFAULT NULL;
    BEGIN
    SELECT DISTINCT(USERID) INTO P_USERID FROM ACL_EMPLOYEES
    WHERE  USERID = :P39_USERID AND APPLICATION_ID = :APP_ID ;
    :P39_TYPE_UTILISATEUR := 'USER_REGISTERED';
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    :P39_TYPE_UTILISATEUR := 'USER_NOT_FOUND';
    END;
    I would like to do this statement first:
    SELECT DISTINCT(USERID) INTO P_USERID FROM ACL_EMPLOYEES
    WHERE  USERID = :P39_USERID
    Then to do this one if the user is found:
    SELECT DISTINCT(USERID) INTO P_USERID FROM ACL_EMPLOYEES
    WHERE  USERID = :P39_USERID AND APPLICATION_ID = :APP_ID ;
    etc...
    I basically don't want to go to the NO_DATA_FOUND exception before having tested the 4 possibilities.
    Do you have a suggestion ?
    Thank you for your kind help !
    Christian

    Surely there are only 3 conditions to check?
    1. The user exists and has that app
    2. The user exists and doesn't have that app
    3. The user doesn't exist
    You could do this in one sql statement like:
    with mimic_data_table as (select 1 userid, 1 appid from dual union all
                              select 1 userid, 2 appid from dual union all
                              select 2 userid, 1 appid from dual),
    -- end of mimicking your table
             params_table as (select :p_userid userid, :p_appid appid from dual)
    select pt.userid,
           pt.appid,
           decode(min(case when dt.userid = pt.userid and dt.appid = pt.appid then 1
                           when dt.userid = pt.userid then 2
                           else 3
                      end), 1, 'User and app exist',
                            2, 'User exists but not for this app',
                            3, 'User doesn''t exist') user_app_check
    from   mimic_data_table dt,
           params_table pt
    where  pt.userid = dt.userid (+)
    group by pt.userid, pt.appid;
    :p_userid = 1
    :p_appid = 2
        USERID      APPID USER_APP_CHECK                 
             1          2 User and app exist   
    :p_userid = 1
    :p_appid = 3
        USERID      APPID USER_APP_CHECK                 
             1          3 User exists but not for this app
    :p_userid = 3
    :p_appid = 2
        USERID      APPID USER_APP_CHECK                 
             3          2 User doesn't exist  
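    If you prefer to keep separate SELECT ... INTO statements, another common pattern (a sketch reusing the table and bind names from the question; the status strings other than the original two are made-up placeholders) is to wrap each lookup in its own nested block, so a NO_DATA_FOUND from one probe does not abort the whole block:
    DECLARE
      p_userid     ACL_EMPLOYEES.USERID%TYPE;
      l_user_found BOOLEAN := TRUE;
    BEGIN
      -- Probe 1: does the user exist at all?
      BEGIN
        SELECT DISTINCT USERID INTO p_userid
        FROM   ACL_EMPLOYEES
        WHERE  USERID = :P39_USERID;
      EXCEPTION
        WHEN NO_DATA_FOUND THEN
          l_user_found := FALSE;
          :P39_TYPE_UTILISATEUR := 'USER_NOT_FOUND';
      END;
      -- Probe 2: only if the user exists, check the application registration.
      IF l_user_found THEN
        BEGIN
          SELECT DISTINCT USERID INTO p_userid
          FROM   ACL_EMPLOYEES
          WHERE  USERID = :P39_USERID
          AND    APPLICATION_ID = :APP_ID;
          :P39_TYPE_UTILISATEUR := 'USER_REGISTERED';
        EXCEPTION
          WHEN NO_DATA_FOUND THEN
            :P39_TYPE_UTILISATEUR := 'USER_NOT_REGISTERED_FOR_APP';
        END;
      END IF;
    END;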

  • ORA-01489 Received Generating SQL for Report Region

    I am new to APEX and I am running into an issue with a report region I am puzzled by. Just a foreword: I'm sure this hack solution will get a good share of facepalms and chuckles from those with far more experience. I welcome suggestions and criticism that are helpful and edifying!
    I am on Apex 4.0.2.00.07 running on 10g, I believe R2.
    A little background: my customer has asked that an Excel spreadsheet be converted into a database application. As part of the transition they would like an export from the database that is in the same format as the current spreadsheet. Because the column count in this export is dynamic, based on the number of records in a specific table, I decided to create a temporary table for the export. The column names in this temp table are based on a "name" column from the same data table, so I end up with columns named 'REC_NAME A', 'REC_NAME B', etc. (e.g. Alpha Record, Papa Record, Echo Record, X-Ray Record). The column count is currently ~350 for the spreadsheet version.
    Because the column count is so large and the column names are dynamic, I've run into a host of challenges and errors creating this export. I am a contractor in a corporate environment, so making changes to the APEX environment or installation is beyond my influence and really beyond what could be justified by this single requirement for this project. I have tried procedures and APEX plug-ins for generating the file; however, the UTL_FILE package is not available to me. I am currently generating the SQL for the query in a function and returning it to the report region in a single column (the user will be doing a text-to-column conversion later). The data is successfully being generated; however, the SQL for the headers is where I am stumped.
    At first I thought it was because I returned both queries as one and they were joined with a 'union all'. However, after looking closer, the SQL being returned for the headers is about 10K characters long. The SQL being returned for the data is about 14K. As mentioned above, the data is being generated and exported; however, when I generate the SQL for the headers I am receiving a report error with "ORA-01489: result of string concatenation is too long" in the file. I am puzzled why a shorter string is generating this message. I took the functions from both pages and ran them at a SQL command prompt and both return their string values without errors.
    I'm hopeful that it's something obvious and noobish that I'm overlooking.
    here is the code:
    data SQL function:
    declare
      l_tbl varchar2(20);
      l_ret varchar2(32767);
      l_c number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      l_tbl := 'EXPORT_STEP';
      l_ret := 'select ';
      for rec in (select column_name from user_tab_columns where table_name = l_tbl order by column_id)
      loop
        if l_c = 1 then
            l_ret := l_ret || '||' || l_dlim || '|| to_char("'||rec.column_name||'")';
        else
            l_c := 1;
            l_ret := l_ret || ' to_char("' || rec.column_name || '")';
        end if;
      end loop;
        l_ret := l_ret || ' from ' || l_tbl;
      dbms_output.put_line(l_ret);
    end;
    header SQL function:
    declare
      l_tbl varchar2(20);
      l_ret varchar2(32767);
      l_c number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      l_tbl := 'EXPORT_STEP';
      for rec in (select column_name from user_tab_columns where table_name = l_tbl order by column_id)
      loop
        if l_c = 1 then
            l_ret := l_ret || '||' || l_dlim || '||'''||rec.column_name||'''';
        else
            l_c := 1;
            l_ret := l_ret || '''' || rec.column_name || '''';
        end if;
      end loop;
        l_ret := l_ret || ' from dual';
      dbms_output.put_line(l_ret);
    end;
    -------
    EDIT: just a comment on the complexity of this export. Each record in the back-end table adds 12 columns to my export table. Those 12 columns are coming from 5 different tables and are the product of a set of functions calculating or looking up their values. This export is really a pivot table based on the records in another table.
    Edited by: nimda xinu on Mar 8, 2013 1:28 PM

    Thank you, Denes, for looking into my issue. I appreciate your time!
    It is unfortunately a business requirement. My customer has required that the data we are migrating to this app from a spreadsheet be exported in the same format, albeit temporarily. I still must meet the requirement. I'm working around the 350 columns by dumping everything into a single column, which is working for the data; however, the headers export is throwing the 01489 error. I did run into the error you posted in your reply. I attempted to work around it with the CLOB type but ended up running into my string concatenation error again.
    I'm open to any suggestions at this point given that I have the data. I'm so close because the data is exporting, but because the columns are dynamic, the export does me little good without the headers to go along with it.
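    One thing that may be worth a try (a sketch only, reusing the names from your header block): the 4000-byte cap that raises ORA-01489 applies to VARCHAR2 concatenation inside the generated query itself, not to the 32K PL/SQL variable, which would explain why the block runs fine at a SQL prompt but the report fails. If the generated header query starts from a CLOB operand, every following || produces a CLOB and the cap no longer applies:
    declare
      l_tbl  varchar2(20) := 'EXPORT_STEP';
      l_ret  varchar2(32767);
      l_c    number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      -- to_clob('') as the first operand makes the whole concatenation a CLOB,
      -- so the single header row is no longer limited to 4000 bytes.
      l_ret := 'select to_clob('''')';
      for rec in (select column_name
                  from   user_tab_columns
                  where  table_name = l_tbl
                  order  by column_id) loop
        if l_c = 1 then
          l_ret := l_ret || ' || ' || l_dlim || ' || ''' || rec.column_name || '''';
        else
          l_c := 1;
          l_ret := l_ret || ' || ''' || rec.column_name || '''';
        end if;
      end loop;
      l_ret := l_ret || ' from dual';
      dbms_output.put_line(l_ret);
    end;
    The header row then comes back as a CLOB rather than a VARCHAR2, which the report region should still be able to render as a single column.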

  • SQL ENTERPRISE: The edition of Reporting Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database

    The error below makes absolutely no sense! I'm using Enterprise Core...yet I'm being told I can't use remote data sources:
    w3wp!library!8!03/05/2015-19:08:48:: i INFO: Catalog SQL Server Edition = EnterpriseCore
    w3wp!library!8!03/05/2015-19:08:48:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.OperationNotSupportedException: , Microsoft.ReportingServices.Diagnostics.Utilities.OperationNotSupportedException: The feature: "The edition of Reporting
    Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database." is not supported in this edition of Reporting Services.;
    Really? This totally contradicts the documentation found here:
    https://msdn.microsoft.com/en-us/library/ms157285(v=sql.110).aspx
    That article says remote connections are completely supported.
    ARGH! Why does this have to be so difficult to setup?!?

    Hi jeffoliver1000,
    According to your description, you are using Enterprise Core edition and you are prompted that you can’t use remote data sources.
    In your scenario, we are not ignoring your point or doubting what you say, but we have actually seen cases before where the SQL Server engine is Enterprise while Reporting Services is still Standard. So I would recommend that you find the actual edition of Reporting Services you are using. You can find the Reporting Services starting SKU in the Reporting Services logs (default location: C:\Program Files\Microsoft SQL Server\<instance name>\Reporting Services\LogFiles). For more information, please refer to the similar thread below:
    https://social.technet.microsoft.com/Forums/en-US/f98c2f3e-1a30-4993-ab41-acbc5014f92e/data-driven-subscription-button-not-displayed?forum=sqlreportingservices
    By the way, have you installed another SQL Server edition before?
    Best regards,
    Qiuyun Yu
    TechNet Community Support

  • SQL*NET V1 FOR APPC/LU6.2 TEST

    Product: SQL*NET
    Date written: 1995-06-05
    These are the results of testing SQL*Net V1 for APPC/LU6.2 on an Olivetti system.
    Customer: 한국컴퓨터
    Test system: 2 Olivetti UNIX machines
    O/S version: SVR4 2.4
    1. Server Install
    1.1 Shared Memory Parameters
    SHMMAX 8388608
    SHMSEG 6
    SHMMIN 1
    SHMMNI 100
    1.2 Kernel Parameters
    SVMMLIM 0x7FFFFFFF
    HVMMLIM 0x7FFFFFFF
    SFSZLIM 0x7FFFFFFF
    HFSZLIM 0x7FFFFFFF
    If these parameters are set lower than the values above, errors occur while building the Oracle kernel.
    1.3 During the Oracle Server install, select Yes for the relink option.
    This is mandatory in order to use LU6.2.
    1.4 If the version is 2.3 or lower, an error occurs during kernel generation.
    2. SQL*Net Install
    2.1 Generate the Transaction Program.
    $ $ORACLE_HOME/bin/gentpn ORACLE_SID Max_RU_size Min_RU_size
    RU_size must be set within the range 256 to 1024.
    ex) $ gentpn ORA7 512 512
    After generation, a message appears saying that a Transaction Program named TPORA7 has been generated.
    The TP name is formed by prefixing the ORACLE_SID with TP.
    After generation, a binary file named TPORA7 is created in the $ORACLE_HOME/lu62/server directory.
    2.2 Register TPORA7 with APPC Services.
    Register it using the sysadm tool.
    TP_Name : TPORA7
    TP_Filespec : /home/oracle7/lu62/server/TPORA7
    3. Connection.
    3.1 Connect String
    - @L:remote_LU_name::local_LU_name:mode_name:TP_name
    The values for remote_LU_name, local_LU_name, and mode_name are registered in the APPC service.
    ex) $ sqlplus scott/tiger@L:BBBBBBBB::AAAAAAAA:CCCCCCCC:TPORA7
    4. Notes.
    - With gentpn run with Max_RU_size and Min_RU_size set to 256:
    when connecting from SQL*Plus, the SQL> prompt never appears and the session just waits,
    although a session is established on the remote system.
    - With Max_RU_size and Min_RU_size set to 512:
    SQL*Plus can connect.
    However, for data larger than 512 bytes an ORA-6412 error is raised and the session is disconnected. (ORA-6412: bad read length)
    The data selected on the remote side does arrive at the local system.
    The data size was 548.
    Setting the array size to '1' in SQL*Plus makes it work normally.
    ex) 1). select empno, ename from emp; ----> OK
    2). select * from emp; ----> disconnect
    3). select empno, ename, hiredate, sal,
    mgr, comm, deptno
    from emp; ----> disconnect
    4). set array 1
    select * from emp; ----> OK
    - With Max_RU_size and Min_RU_size set to 1024:
    the same problem occurs whenever the data size exceeds 1024.

    Great to hear someone talking about SQL*Net for DOS.
    Hi,
    you can't connect from v1 to Net8;
    you can connect from v1 to v2.
    Don't start Net8 on the server;
    start SQL*Net 2.x on the server instead.
    Hope this helps.
    I need drivers for SQL*Net on DOS.
    Where can I download these from?
    Thanks
    Adrian Maier (guest) wrote:
    : Here is my problem:
    : I have some DOS applications written for Oracle 6 for DOS.
    : I want that these connect to an Oracle Server using SQL*Net.
    : For the start, I want to connect to the server with sqlplus,
    : from DOS. For testing I have a small TCP/IP network with two
    : computers:
    : 1. the SERVER, running Linux(Debian 2.1) and Oracle 8.0.5
    : 2. the CLIENT, running MSDOS, PSNFS as networking software
    : and SQL*Net Client v1 for DOS.
    : Question: Is this version of SQl*Net client compatible with
    : Net8 which comes with Oracle8?
    : From win95 I've been able to connect to the server, so I
    : believe the server is correctly configured. When I'm trying
    : to connect from DOS with sqlplus, the ethernet card's leds
    : blink two or three times, which means that some data is
    : transmitted through network. After that, sqlplus waits for
    : an indefinite period of time.
    : If I stop the listener, sqlplus generates the error ORA-06136
    : and asks for a new username. If I don't stop the listener,
    : sqlplus remains blocked.
    : In SQLNet documentation I've found:
    : "ORA-06136: Error during connection handshake.
    : Cause: The destination server was unable to obtain enough
    : information to complete the connection.
    : Action: Check that the configuration of the server is correct,
    : blah, blah .... "
    : I think that the server might not understand this version
    : of client, but Net8 should be "backward compatible"!
    : SQL*Net client v1 was the only DOS version I could find.
    : Are there any other newer DOS SQL*Net clients available?
    : If you have any idea about what could I do, please let me know.
    : Best regards,
    : Adrian Maier
    : [email protected]
