All Data Source History is Empty

Using Crystal Reports 10 for many years and this problem just started happening last week. When I try to create a new report or change the data source of an existing one, all History, Favorites, and even the current connection are empty. The ODBC data source still exists and tests successfully in Control Panel. I'm using Windows XP SP3.
Another oddity: after Crystal has been open for a while, going to open a new file does nothing; I have to start Crystal Reports again to be able to open a new report.
Any ideas?
Shirley

Hi Shirley,
CR saves the history in your registry. Either you have lost permissions through your profile (possibly your IT team has pushed out an updated, more limited profile), or your anti-virus software is now blocking CR from accessing the registry.
Check your AV history and log files to see if they logged anything.
Or try doing a Repair install of CR 10 through Add/Remove Programs. You may need the keycode to repair/re-install it; run a program called licensemanager.exe in the CR folder (search your PC for it) and it should show you the keycode. Save it in a text file just in case.
Don
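
A quick way to test the registry-permissions theory above is to run a small VB.NET check under the affected Windows profile. The key path below is an assumption based on CR 10's vendor at the time (Crystal Decisions); verify the actual key with regedit before relying on it.

    ' Minimal sketch: can the current profile read and write the registry key
    ' where CR 10 is assumed to keep its data source history?
    Imports Microsoft.Win32

    Module CheckCrRegistryAccess
        Sub Main()
            ' Assumed location -- confirm with regedit on a working machine.
            Const crKey As String = "Software\Crystal Decisions\10.0\Crystal Reports"
            Try
                Using key As RegistryKey = Registry.CurrentUser.OpenSubKey(crKey, writable:=True)
                    If key Is Nothing Then
                        Console.WriteLine("Key not found - the history may simply have been wiped.")
                    Else
                        Console.WriteLine("Read/write access OK; {0} subkeys visible.", key.SubKeyCount)
                    End If
                End Using
            Catch ex As Security.SecurityException
                Console.WriteLine("Access denied - profile permissions problem: " & ex.Message)
            End Try
        End Sub
    End Module

If the key opens fine here but Crystal still shows nothing, the anti-virus theory becomes more likely, since AV hooks can block one process's registry access without affecting another's.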

Similar Messages

  • iPad 2: All data lost after battery empty, Apple logo! *** is this now!?

    I am very upset; it took me countless hours to set up and customize my new iPad 2. I was finally done and playing "fishfarm" on it when I noticed the battery was down to 4%. I went to bed thinking the iPad would go into sleep mode soon enough, since it was so low on battery power.
    Then the shock when I woke up! I went to charge my iPad, and I noticed the frigging Apple logo! I assumed the worst when I had to connect it to iTunes.
    Yes, ALL DATA GONE! Everything, all apps, data, settings, games; it went back to "out of box" mode. I recently upgraded to the new software, 4.3.3.
    This is unacceptable. *** is wrong? Did they rush to launch the new iPad when its mainboard wasn't ready? This clearly looks like a major problem, since I don't seem to be the only one having it.
    Apple needs to release a fix ASAP! This doesn't happen with Microsoft software; I never lost all my data at once when working with an MS OS.
    Anyone else with this problem? Luckily I ordered two iPad 2s, so when I pick up the other one on Monday, I am sending this one straight back.

    Countless hours? There are only 24 hours in a day.
    Backing up the iPad is as simple as attaching the cable to the iPad and your computer and telling the iTunes app on your computer to make a backup. It takes about 5 minutes usually. You should always backup important data on all computer devices.
    The iPad 2 was not rushed to market. There is nothing wrong with the iPad 2's logic board. Mine has worked perfectly from day one, over a month ago.
    But when a factory assembles millions of electronic devices with dozens of parts from different sources, there may occasionally be a bad one. Call Apple Care and have them walk you through the steps to determine if yours is a lemon or you personally have made a mistake with it, aside from failing to back it up.

  • A-Office 7.1 error when trying to open a query: Filter: SAP BI Add-in has disconnected all data sources

    Hi,
    I'm facing an issue when trying to consume a BW 7.4 query using A-Office 1.4 SP 7 Patch 1.
    The error is:
    Source type \CLASS=CL_RSBOLAP_VARIABLE_CHARACTERI is not compatible, for the purposes of assignment, with target type (RS_EXCEPTION-000)
    Screen print attached.
    Any inputs on how to solve this?
    Thanks,
    Rodrigo.

    Hi Rodrigo - I haven't seen this in a while; since you are an SAP employee you may want to try the internal team, and it might be something in the BW back end.

  • How to find all uses of data source views in an SSIS solution

    I am upgrading Visual Studio from 2008 to 2013 (with SSDT) and SQL Server 2008 R2 to 2012. I have a solution with over 30 .dtsx files. Each file has multiple OLE DB data sources and lookup tasks. There is inconsistent usage of data source views throughout (as compared to SQL queries or table references). It is my understanding that all data source views need to be removed before upgrading SSIS packages from BIDS to SSDT. I tried searching the files as XML for the DSVs, but it appears the GUID reference changes per .dtsx file. It seems like I will have to look at each source/lookup. Is there a quicker way to search for where they are used?
    Thanks
    Adding this question here; I posted it incorrectly in the VSO forum.

    All right, yes, they are dropped.
    I've never upgraded a package that had them.
    What happens if you just upgrade, leaving a copy?
    Arthur My Blog
    This ended up being what I did for many of the packages. Upgrading the packages severed the data source views and left the SQL in the related tasks (e.g. the OLE DB Source task). Sorry for the delayed mark-as-answered.
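
    Since the GUID changes per file, one way that may work is to search for a stable element name rather than the GUID itself. Here is a hedged VB.NET sketch; the "DataSourceViewID" marker and the folder path are assumptions, so open one known-affected package as XML first and substitute whatever tag actually marks a DSV reference there.

        ' Scan every .dtsx under a solution folder for an assumed DSV marker string.
        Imports System.IO

        Module FindDsvReferences
            Sub Main()
                Const solutionDir As String = "C:\Projects\MySsisSolution"  ' hypothetical path
                Const marker As String = "DataSourceViewID"                 ' assumed tag name
                For Each pkg As String In Directory.GetFiles(solutionDir, "*.dtsx", SearchOption.AllDirectories)
                    If File.ReadAllText(pkg).IndexOf(marker, StringComparison.OrdinalIgnoreCase) >= 0 Then
                        Console.WriteLine(pkg)   ' this package still references a DSV
                    End If
                Next
            End Sub
        End Module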

  • Retrieve data source connection string from reportserver.dbo.DataSource

    I would like to retrieve the connection string (and credentials) for each of my data sources without having to click on each individual item. The DataSource table has a column called ConnectionString, but the data type is image, which cannot be converted to varchar. Does anyone have ideas on how to pull this information from the database?
    I have tried converting to binary/varbinary and then converting to varchar, but I get garbage back (presumably the value is stored encrypted).
    Note: I am currently using SSRS 2005.
    Thanks,
    Marianne

    Hi Jin,
    I downloaded your script and was impressed with the information it retrieves. However, it was not formatted very well, with the name of each data source landing on the same line as the previous data source, so I updated the script to create a blank line between each data source block, and also to include the login credentials used by the data source and the path of the data source, which is very useful for locating the data source for editing in Report Manager. However, it is not retrieving all data sources, and I don't know why. I know this for a fact because there are data sources defined that are not in the output of this script, but I don't know enough about VB to improve the script further, or about how the report server is configured, to ensure the code finds the missing data sources.
    Here's the updated script.
    '=============================================================================
    '  File:      PublishSampleReports.rss
    '  Summary:  Demonstrates a script that can be used with RS.exe to
    '      publish the sample reports that ship with Reporting Services.
    ' This file is part of Microsoft SQL Server Code Samples.
    '  Copyright (C) Microsoft Corporation.  All rights reserved.
    ' This source code is intended only as a supplement to Microsoft
    ' Development Tools and/or on-line documentation.  See these other
    ' materials for detailed information regarding Microsoft code samples.
    ' THIS CODE AND INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY
    ' KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
    ' IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A
    ' PARTICULAR PURPOSE.
    Public Sub Main()
        rs.Credentials = System.Net.CredentialCache.DefaultCredentials
        'Start at the root of the catalog
        GetCatalogItems(String.Empty)
    End Sub
        ' <summary>
        ' Recursively walk the report server catalog and print
        ' the definition of every data source found.
        ' </summary>
        ' <param name="catalogPath"></param>
        Public Sub GetCatalogItems(ByVal catalogPath As String)
            Dim items As CatalogItem()
            Try
                ' if catalog path is empty, use root, if not pass the folder path
                If catalogPath.Length = 0 Then
                    ' no recursion (false)
                    items = rs.ListChildren("/", False)
                Else
                    ' no recursion (false)
                    items = rs.ListChildren(catalogPath, False)
                End If
                ' iterate through catalog items and print each data source definition
                For Each item As CatalogItem In items
                    ' skip hidden items (note: this also skips data sources
                    ' inside hidden folders, so they never appear in the output)
                    If item.Hidden <> True Then
                        ' data sources: print their definition
                        If item.Type.Equals(ItemTypeEnum.DataSource) Then
                            ' use GetDataSourceContents to get the definition of the data source
                            Dim definition As DataSourceDefinition = Nothing
                            Try
                                definition = rs.GetDataSourceContents(item.Path)
                                Console.WriteLine(item.Name)
                                Console.WriteLine(item.Path)
                                Console.WriteLine("Connection String: {0}", definition.ConnectString)
                                Console.WriteLine("Extension name: {0}", definition.Extension)
                                Console.WriteLine("UserName: {0}", definition.UserName)
                            Catch e As SoapException
                                Console.WriteLine(e.Detail.InnerXml.ToString())
                            End Try
                            Console.WriteLine()   ' blank line between data source blocks
                        ElseIf item.Type.Equals(ItemTypeEnum.Folder) Then
                            ' recurse                     
                            GetCatalogItems(item.Path)
                        End If
                    End If
                Next
            Catch ex As Exception
                Console.WriteLine(ex.Message)
            Finally
                'Do nothing
            End Try
        End Sub
    To run this I use:
    rs -i GetPropertiesOfDataSources.rss -s http://<reportserver_name>/reportserver > DataSources_<reportserver_name>.txt
    But as I mentioned, it is not picking up all of the data sources, and I don't know why. I also don't know why it's so difficult to find all the data sources from within Report Manager. I'm new to SSRS, but was told that prior to SQL 2005 this information was easily available by using Management Studio to connect to the report server; now it's buried in Report Manager and not accessible from one spot. Bad move by Microsoft, apparently. You'd think that there would be an easy way to find all of the data sources.
    We need to migrate the SQL Servers on which reports are running, but we have tons of reports and, historically, little documentation for them, so we need to be able to determine where all the data sources are so we know which ones to update. This is proving extremely difficult. Your script helps but is not comprehensive.
    Any and all help is appreciated.
    Michael MacGregor, Senior SQL Server DBA, Carlson Marketing
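
    One hedged guess at why data sources go missing, based only on the script as posted: it skips hidden items, and because it recurses folder by folder, everything underneath a hidden folder is never visited at all. A minimal change to test (ListChildren's recursive flag is part of the standard SSRS 2005 SOAP API):

        ' In Main, list the entire catalog in one call instead of recursing manually,
        ' and drop (or relax) the item.Hidden check so hidden folders are included:
        items = rs.ListChildren("/", True)   ' True = recurse through all folders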

  • Data Federator data source not available causes Universe Connection error

    I created a Data Federator project that connects to 20 servers across the US and Canada. All data sources are SQL Server 2005. The DF project maps 40 source objects into 4 target objects. I created a universe based on the DF project, and we have been quite pleased with WebI query response. Today one of the source servers was taken offline, and this generated a connection error when trying to access the universe (without trying to access the data source that failed). We do not want the universe connection to fail when one source server is not available; is that possible?
    If the answer is no, then I see us abandoning what appears to be a great solution for real-time distributed reporting and resorting to ETL and moving data.

    Hi Chapman,
    Can you elaborate a little on what you did to solve the issue?
    Thanks,
    Dayanand

  • Fields and Data sources

    BW Experts,
    What is the way to find out, in SAP R/3, which DataSources a field is used in? Say, for example, "SPBUP" (Fiscal Period): how do I find out which DataSources contain this field?
    Thanks
    Ashwin

    Hi Ashwin,
    You can look it up in SE16 > table ROOSFIELD. Put SPBUP in the field name field and press F8.
    Hope this helps...
    Message was edited by: Bhanu Gupta
    Thanks Ashwin! Put yourself on the SDN world map (http://sdn.idizaai.be/sdn_world/sdn_world.html) and earn 25 points.
    Spread the wor(l)d!

  • Any examples of a data template using multiple data sources?

    I'm looking for an example report using multiple data sources. I've seen one that does a master/detail, but I'm just looking to combine results in sorted order (sorted across all data sources). The master/detail example used a bind variable to link the two defined queries; I'm thinking what I want won't have that, so I'm lost on how to make that happen. I have reports using multiple SQL queries, and there is a way in the data source pulldown to tell it to combine the data sources. It appears to be a more manual process with data templates, if it's even possible.
    Any pointers/links would be appreciated.
    Gaff

    Hi Vetsrini:
    That's just it. Mine is simpler than that. There is no master/detail relationship between the two queries. I have the exact same query that I run in two databases, and I want to merge the results (ordered by, say, eventTime) in one report. So I think my results are going to be two separate groups (one for each data source), which I'll have to let BI merge via XSLT or whatever it uses. That's fine for small result sets, but for larger ones it would be nice if the database did the sorting/merging.
    Gaff

  • Reports fail when run against a different data source

    Hello,
    We have a VB.NET 2008 WinForms application running on Microsoft .NET 3.5. We are using Crystal Reports 2008 runtime, service pack 3 -- using the CrystalDecisions.Windows.Forms.CrystalReportViewer in the app to view reports. In the GAC on all our client computers, we have versions 12.0.1100.0 and 12.0.2000.0 of CrystalDecisions.CrystalReports.Engine, CrystalDecisions.Shared, and CrystalDecisions.Windows.Forms.
    Please refer to another one of our posted forum issues, "Critical issue since upgrading from CR9 to CR2008", as these issues seem to be related:
    Critical issue since upgrading from CR9 to CR2008
    We were concerned with report display slowdown, and we seem to have solved that by using the Oracle Server driver (instead of either Microsoft's or Oracle's OLEDB driver).  But now we must find a resolution to another piece of the puzzle, which is:  why does a report break if the data source embedded in the .rpt file is different from the one you are trying to run the report against in the .NET viewer?
    Problem:
    If you have a production database name (e.g. "ProdDB") embedded in your .rpt file that you built your report from and try to run that report against a development database (e.g. "DevDB") (OR VICE VERSA -- it is the switch that is the important concept here), the report fails with a list of messages such as this:
        Failed to retrieve data from the database
        Details:  [Database vendor code: 6550 ]
    This only seems to happen if the source of the report data (i.e. the underlying query) is an Oracle stored procedure or a Crystal Reports SQL Command; the reports run fine against all data sources if the source is a table or a view.  In trying different things to troubleshoot this, including adding a ReportDocument.VerifyDatabase() call after setting the connection information, the Crystal Reports viewer will spit out other nonsensical errors about being unable to find certain fields (e.g. "The field name is not known") or not being able to find the table (even though the source data should be coming from an Oracle stored procedure, not a table).
    When the reports are run in the Crystal Reports Designer, they run fine no matter what database is being used; but the problem only happens while being run in the .NET viewer.  It's almost as if something internally isn't getting fully "set" to the new data source, or something -- we're really grasping at straws here.
    For the sake of completeness, here is how we're setting the connection information:
            '-- Set database connection info for the main report
            For Each oConnectionInfo In oCrystalReport.DataSourceConnections
                oConnectionInfo.SetConnection(gsDBDataSource, "", gsDBUserID, gsDBPassword)
            Next oConnectionInfo
            '-- Set database connection info for each subreport
            For Each oSubreport In oCrystalReport.Subreports
                For Each oConnectionInfo In oSubreport.DataSourceConnections
                    oConnectionInfo.SetConnection(gsDBDataSource, "", gsDBUserID, gsDBPassword)
                Next oConnectionInfo
            Next oSubreport
    ... but in troubleshooting, we've even tried an "overkill" approach and added this code as well:
            '-- Set database connection info for each table in the main report
            For Each oTable In oCrystalReport.Database.Tables
                With oTable.LogOnInfo.ConnectionInfo
                    .ServerName = gsDBDataSource
                    .UserID = gsDBUserID
                    .Password = gsDBPassword
                    For Each oPair In .LogonProperties
                        If UCase(CStr(oPair.Name)) = "DATA SOURCE" Then
                            oPair.Value = gsDBDataSource
                            Exit For
                        End If
                    Next oPair
                End With
                oTable.ApplyLogOnInfo(oTable.LogOnInfo)
            Next oTable
            '-- Set database connection info for each table in each subreport
            For Each oSubreport In oCrystalReport.Subreports
                For Each oTable In oSubreport.Database.Tables
                    With oTable.LogOnInfo.ConnectionInfo
                        .ServerName = gsDBDataSource
                        .UserID = gsDBUserID
                        .Password = gsDBPassword
                        For Each oPair In .LogonProperties
                            If UCase(CStr(oPair.Name)) = "DATA SOURCE" Then
                                oPair.Value = gsDBDataSource
                                Exit For
                            End If
                        Next oPair
                    End With
                    oTable.ApplyLogOnInfo(oTable.LogOnInfo)
                Next oTable
            Next oSubreport
    ... alas, it makes no difference.  If we run the report against a database that is different from the one specified with "Set Datasource Location" in Crystal, it fails with nonsense errors.

    Thanks for the reply, Ludek.  We have made some breakthroughs, uncovered some Crystal bugs and workarounds, and we're probably 90% there I hope.
    For your first point, unfortunately the information on the Oracle 6550 error was generic, and not much help in our case.  And for your second point, the errors didn't have anything to do with subreports at that time -- the error would manifest itself even in a simple, one-level report.
    However, your third point (pointing us to KB 1553921) helped move us forward quite a bit more.  For the benefit of all, here is a link to that KB article:
    Link: [KB 1553921|http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes_boj/sdn_oss_boj_bi/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/scn_bosap/notes%7B6163636573733d36393736354636443646363436353344333933393338323636393736354637333631373036453646373436353733354636453735364436323635373233443330333033303331333533353333333933323331%7D.do]
    We downloaded the tool referenced there, and pointed it at a couple of our reports.  The bottom line is that the code it generated uses a completely new area of the Crystal Reports .NET API which we had not used before -- in the CrystalDecisions.ReportAppServer namespace.  Using code based on what that RasConnectionInfo tool generated, we were able to gain greater visibility into some of the objects in the API and to uncover what I think qualifies as a genuine bug in Crystal Reports.
    The CrystalDecisions.ReportAppServer.DataDefModel.ISCRTable class exposes a property called QualifiedName, something that isn't exposed by the more commonly-used CrystalDecisions.CrystalReports.Engine.Table class.  When changing the data source with our old code referenced above (CrystalDecisions.Shared.ConnectionInfo.SetConnection), I saw that Crystal would actually change the Table.QualifiedName from something like "SCHEMAOWNER.PACKAGENAME.PROCNAME" to just "PROCNAME" (essentially stripping off the schema and package name).  Bad, Crystal...  VERY BAD!  IMHO, Crystal potentially deserves to be swatted on the a** with the proverbial rolled-up newspaper.
    I believe this explains why we were also able to generate errors indicating that field names or tables were not found -- because Crystal had gone and changed the QualifiedName to remove some key info identifying the database object!  So, knowing this and using the code generated by the RasConnectionInfo tool, we were able to work around this bug with code that worked for most of our reports ("most" is the key word here -- more on that in a bit).
    So, first of all, I'll post our new code.  Here is the main area where we loop through all of the tables in the report and subreports:
    '-- Replace each table in the main report with new connection info
    For Each oTable In oCrystalReport.ReportClientDocument.DatabaseController.Database.Tables
        oNewTable = oTable.Clone()
        oNewTable.ConnectionInfo = GetNewConnectionInfo(oTable)
        oCrystalReport.ReportClientDocument.DatabaseController.SetTableLocation(oTable, oNewTable)
    Next oTable
    '-- Replace each table in any subreports with new connection info
    For iLoop = 0 To oCrystalReport.Subreports.Count - 1
        sSubreportName = oCrystalReport.Subreports(iLoop).Name
        For Each oTable In oCrystalReport.ReportClientDocument.SubreportController.GetSubreportDatabase(sSubreportName).Tables
            oNewTable = oTable.Clone()
            oNewTable.ConnectionInfo = GetNewConnectionInfo(oTable)
            oCrystalReport.ReportClientDocument.SubreportController.SetTableLocation(sSubreportName, oTable, oNewTable)
        Next oTable
    Next iLoop
    '-- Call VerifyDatabase() to ensure that the tables update properly
    oCrystalReport.VerifyDatabase()
    (Thanks to Colin Stynes for his post in the following thread, which describes how to handle the subreports):
    Setting subreport connection info at runtime
    There seems to be a limitation on the number of characters in a post on this forum (before all formatting gets lost), so please see my next post for the rest....
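
    For readers who cannot see that follow-up post, the GetNewConnectionInfo helper called in the loop above is not shown here. Purely as a hedged sketch of the shape such a helper can take with the RAS API, and not the author's actual code (the "QE_ServerDescription" attribute name is an assumption; dump the Attributes property bag for your driver to see which logon properties it really uses):

        Private Function GetNewConnectionInfo( _
                ByVal oTable As CrystalDecisions.ReportAppServer.DataDefModel.ISCRTable) _
                As CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo
            ' Start from the table's existing connection info so driver-specific
            ' attributes (and the all-important QualifiedName) are preserved.
            Dim oConnInfo As New CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo
            oConnInfo.Attributes = oTable.ConnectionInfo.Attributes
            oConnInfo.Kind = oTable.ConnectionInfo.Kind
            oConnInfo.UserName = gsDBUserID
            oConnInfo.Password = gsDBPassword
            ' Assumed attribute name -- verify against your driver's property bag.
            oConnInfo.Attributes("QE_ServerDescription") = gsDBDataSource
            Return oConnInfo
        End Function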

  • 903/902/BC4J can't get data-sources.xml conn pooling to work in production; help

    I have several BC4J ears deployed to a 903 instance of OC4J being configured as a standalone
    instance. I've had this problem since I started deploying in development on 902. So it's
    some basic problem that I've not mastered.
    I can't get data-sources.xml managed connection pooling to actually pool conns. I'm wanting to declare my JNDI JDBC source connection pool in j2ee/home/config/data-sources.xml and have all BC4J apps get conns from this JNDI JDBC pool. I've removed all data-sources.xml from my BC4J ears and published the JNDI JDBC source in my OC4J common data-sources.xml. I've tested that this is the place controlling the conn URL/login passwd by commenting it out of config/data-sources.xml; my BC4J apps then throw exceptions, can't get conn.
    I've set the OC4J startup cmd line with the BC4J property to enable connection pooling:
    -Djbo.doconnectionpooling=true
    Symptom:
    Connections are created and closed. Instead of being put back into the pool managed by OC4J, whatever BC4J or my data-sources.xml is doing, the connections are just being created and closed.
    I can verify this via (solaris) lsof and netstat, where I see my oc4j instance under test load
    with only 1 or 2 conns to the db box, and the ephemeral port is tumbling, meaning a new socket is
    being opened for each conn. ;( grrrrrrr
    Does anyone have a clue as to why this is happening?
    Thanks, curt
    my data-sources.xml
    <data-sources>
         <data-source
            class="com.evermind.sql.DriverManagerDataSource"
            connection-driver="oracle.jdbc.driver.OracleDriver"
            ejb-location="jdbc/DEVDS"
            location="jdbc/DEVCoreDS"
            name="DEVDS"
            password="j2train"
            pooled-location="jdbc/DEVPooledDS"
            url="jdbc:oracle:thin:@10.2.1.30:1521:GDOC"
            username="jscribe"
            xa-location="jdbc/xa/DEVXADS"
            inactivity-timeout="300"
            max-connections="50"
            min-connections="40"
        />
    </data-sources>

    I've run another test using a local data-sources.xml that's packaged in the .ear. Pooling under BC4J still doesn't work.
    A piece of info: the 903 OC4J release notes state that global conn pooling doesn't work, inferring that the j2ee/home/config/data-sources.xml data sources aren't pooled, or ??
    I just tested so-called local connection pooling, where I edited the data-sources.xml that gets packaged in the ear to include the min/max params, and re-ran my test.
    Still, the AM creates a new conn, it's to a new socket, and it closes the conn when done, causing each conn to be opened and then closed to the DB box rather than pooled. As verified with lsof and netstat, the ephemeral port # on the DB box side always changes, meaning it's a new socket and not an old pooled conn socket.
    ???? What the heck??
    Surely if the AM conn check-out / return code works properly, OC4J's pooling JDBC driver would pool and not close the socket?
    Has anyone gotten JDBC DataSource connections in BC4J to actually be pooled under OC4J?
    Since I couldn't get this to work in my early 902 OC4J testing, and still can't get it to work under 903 OC4J, is it my config, BC4J AM's code, or OC4J?
    Any thoughts on how to figure out what's not configured correctly or has a bug?
    Thanks, curt

  • Data Source path in Pivot Table changes to absolute on its own

    Hello.
    I have a .XLSX file that was created a long time ago (I don't even know in which Office version, but definitely not 2013), and maybe it even was a .XLS file at first.
    So it's a 4 MB file with 16 Sheets and 8 Pivot Tables.
    All of the Pivot Tables use other sheets from the same file as Data Source.
    Data Source for some of them look like this: 'Sheet3'!$A:$E
    Everything is fine when I save the file, and open it from the saved file.
    But as soon as I try to move the file elsewhere, or rename it, or email it - all Data Source paths change to something like this: '\Users\Sergii_Litnevskyi\Desktop\New folder\[FileName.xlsx]Sheet3'!$A:$E
    And it happens with all Pivot Tables. The problem is that it links to an old file path, where the file does not exist anymore. And it links to an external file, which is not what I want.
    If I Save As and select a different path and filename, then it works fine. So that's a workaround for renaming and moving files, but not for sending them to other people.
    I've read some threads where people recommend disabling "Save external link values", but it does not help. It is already turned off in my Office, but the file keeps acting weird.
    So what I need is: save the file, close it, rename it, move it to another place, send it over email as an attachment, and then have the same Data Source path in my PivotTables as I had before I saved the file. How can I do it?
    My Office version: Microsoft Excel 2013 (15.0.4454.1503) MSO (15.0.4517.1005) 32-bit

    Hi,
    Based on your description, I am not sure of the cause.
    Do you link to an outside data source? Simply moving, renaming, or emailing the file should not change the data source paths, yet yours are being rewritten as absolute paths.
    I recommend you zip the file and send it as an email attachment.
    If the issue persists, save the file under a new name and test it on another computer.
    Regards,
    George Zhao
    TechNet Community Support
    I am pretty sure that I don't have any external links in the document.
    However, even if I did, why would it change the Data Source path for all of the Pivot Tables when I did not request it?
    I tried zipping it and sending to other person over email, but he got the file with changed data source paths.
    I can even record a short video to show what happens.
    Actually, I just did it. You can see the video here: http://screencast.com/t/qMBild3ck9b
    It is rather big - 23.8 MB.
    Let me explain what I showed there:
    I opened my original file. I showed that there are Pivot Tables, whose Data Sources are in the same file, on various other sheets.
    I showed this for all of the Pivot Tables in the document.
    I saved the file using Save As in a different folder and under a different name (TEST.xlsx).
    I then opened that saved file to show you that it is fine, and the Data Source path for one of the Pivot Tables is the same as it was in original file. It is the same for all of the other Pivot Tables.
    Then I closed it, and simply renamed the file to TEST123.xlsx.
    I opened it, and the first thing wrong was a security warning.
    Then I got 'Cannot open PivotTable source file ….' messages. And, as I showed, all Data Source paths have now been changed to the full path of the file created by Save As (TEST.xlsx) from the original file.
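
    As an aside for anyone stuck in the same state: once the paths have already gone absolute, a small VBA macro can repoint every PivotTable cache back at the local sheet. This is a hedged sketch, not a fix for the underlying rename behavior; the exact shape of the SourceData strings is an assumption, so check the Debug.Print output in the Immediate window before trusting the rewrite.

        ' Repoint every PivotTable whose cache source has been rewritten to an
        ' external '\path\[book]Sheet'!ref back to the local 'Sheet'!ref.
        Sub RepointPivotSources()
            Dim ws As Worksheet, pt As PivotTable
            Dim src As String, pos As Long
            For Each ws In ActiveWorkbook.Worksheets
                For Each pt In ws.PivotTables
                    src = pt.PivotCache.SourceData
                    Debug.Print ws.Name, pt.Name, src   ' inspect before trusting
                    pos = InStr(src, "]")               ' end of the [workbook] token
                    If pos > 0 Then
                        src = "'" & Mid$(src, pos + 1)  ' drop path/[book], keep 'Sheet'!ref
                        pt.ChangePivotCache _
                            ActiveWorkbook.PivotCaches.Create( _
                                SourceType:=xlDatabase, SourceData:=src)
                    End If
                Next pt
            Next ws
        End Sub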

  • Connection/data source caching in JSC App Server?

    Tried to change the data source for one of my apps (changed all data sources on all row sets). I got a connection error despite the fact that the IDE could see the data source.
    I restarted the app server to no avail.
    I finally stopped the app server AND restarted JSC, and everything started working again (except for the Conversion error I'm tracking down :-) ...

    Did the connection error occur at runtime?
    If so, then it's not recommended to change the datasource, especially if you change databases. It's not impossible to make the changes: you need to edit the Page bean constructor, the constructor in the SessionBean, and anywhere else you initialize the rowset with the URL.
    If you changed the database, then because vendors map JDBC types differently, you may need to make changes. Typically you'll see conversion errors when changing the database.
    JDBC Type (which we call SQL Type) --> Java Type:
    =========================================
    CHAR --> java.lang.String
    VARCHAR --> java.lang.String
    LONGVARCHAR --> java.lang.String
    NUMERIC --> java.math.BigDecimal
    DECIMAL --> java.math.BigDecimal
    BIT --> boolean
    BOOLEAN --> boolean
    TINYINT --> byte
    SMALLINT --> short
    INTEGER --> int
    BIGINT --> long
    REAL --> float
    FLOAT --> double
    DOUBLE --> double
    BINARY --> byte[]
    VARBINARY --> byte[]
    LONGVARBINARY --> byte[]
    DATE --> java.sql.Date
    TIME --> java.sql.Time
    TIMESTAMP --> java.sql.Timestamp
    CLOB --> java.sql.Clob
    BLOB --> java.sql.Blob
    ARRAY --> java.sql.Array
    DISTINCT --> (mapping of underlying type)
    STRUCT --> java.sql.Struct
    REF --> java.sql.Ref
    DATALINK --> java.net.URL
    JAVA_OBJECT --> (underlying Java class)
    Source: JDBC 3.0 Specification, Appendix B
    Also, see this post for more info :
    http://swforum.sun.com/jive/thread.jspa?forumID=123&threadID=47015
    John
    JSC QA

  • ODBC Data Source Won't Recover from Connection Errors

    I have several ODBC data sources to a mainframe and want to use the "Maintain Connections" setting, as it dramatically improves performance under heavy load. The problem is that we frequently have path problems between the CF server and the mainframe 180 miles away. With Maintain Connections checked, the data source doesn't consistently start working again after the path problem clears up. The path problem usually only lasts a minute or two, but SOME queries through the affected data sources continue to raise errors until the ODBC service is cycled. The SOME aspect is really weird, because not every hit through that data source reports an error after the path comes back up. I've had side-by-side users hit the same page using that DS, and one consistently gets an error while the other works fine. Restarting the ODBC service clears the problem so it works fine for everybody. Of course, that breaks all the connections that were still working, so I've just affected a bunch more users. The silver lining there is that at least nobody can claim preferential treatment, right? It's also kind of bizarre that I can use the CF Administrator to verify all data sources, and they usually all report that they are working fine, even while some connections are still reporting problems.
    The specific ODBC drivers we are using are Neon Shadow Drivers (Multi Threaded). I'm thinking about using the Admin API to hit Disable Connections and then re-enable it.
    Any constructive, tactful input for addressing this problem would be appreciated.

    spikehenning wrote:
    > I have several ODBC data sources to a mainframe and want to use the
    > "Maintain Connections" setting ... I'm thinking about using the Admin API
    > to hit the Disable Connections and then re-enable it.
    Do you have a JDBC Type 4 alternative for these drivers? I got rid of all my datasource problems long ago when I got rid of ODBC altogether.
    Jochem
    Jochem van Dieten
    Adobe Community Expert for ColdFusion

  • SPM: Using "refresh data sources" in Data Source Administrator

    Hi All,
    When making changes to a query that I want to be reflected in the Flex front end, I carry out the following steps:
    1) save the query in query designer
    2) load up the SPM user interface, navigate to Administration -> Data Source Administration
    3) Click on "Refresh All Data Sources"
    4) Wait for the "wait" cursor to finish (only takes a few seconds)
    5) log out of the portal
    6) clear the browsers cache
    7) reload the web page
    8) Repeat Steps 2-7 again
    9) the query's changes are not reflected in the UI
    Does anyone know what I should be doing differently so that the query changes are reflected in the Data Source?
    Kind regards,
    Neil

    Hi Neil,
    You are almost right on the steps, just a few changes.
    1. Modify your query and save the query changes.
    2. Close the query designer so you are not locking the query (optional step, but recommended).
    3. Go to transaction RSRT and generate the query (optional step, but recommended).
    4. Login to portal and launch SPM user interface, navigate to Administration -> Data Source Administration
    5. Click on "Refresh All Data Sources"
    6. Check the UI logs (Settings > View logs) to confirm that the datasource refresh has finished successfully; you should receive messages saying 'Metadata Loaded'. The cursor will turn into a clock multiple times; this is the UI checking whether the metadata refresh has finished. You should see the message 'Metadata Loaded' as many times as there are registered BW queries. This could take a few minutes depending on the number of queries registered in the SPM UI.
    You need not clear your browser cache or log out of the application. At most you may need to relaunch the application.
    These steps should be sufficient to reflect any query changes to the SPM UI.
    Regards,
    Rohit

  • 903/902/BC4J can't get OC4J data-sources.xml conn pooling to work in production: help

    [cross posted to the j2ee forum]
    I have several BC4J ears deployed to a 903 instance of OC4J being configured as a standalone
    instance. I've had this problem since I started deploying in development on 902. So it's
    some basic problem that I've not mastered.
    I can't get data-sources.xml managed connection pooling to actually pool conns. I'm wanting to declare my JNDI JDBC source connection pool in j2ee/home/config/data-sources.xml and have all BC4J apps get conns from this JNDI JDBC pool. I've removed all data-sources.xml from my BC4J ears and published the JNDI JDBC source in my OC4J common data-sources.xml. I've tested that this is the place controlling the conn URL/login passwd by commenting it out of config/data-sources.xml; my BC4J apps then throw exceptions, can't get conn.
    I've set the OC4J startup cmd line with the BC4J property to enable connection pooling:
    -Djbo.doconnectionpooling=true
    Symptom:
    Connections are created and closed. Instead of being put back into the pool managed by OC4J, whatever BC4J or my data-sources.xml is doing, the connections are just being created and closed.
    I can verify this via (solaris) lsof and netstat, where I see my oc4j instance under test load
    with only 1 or 2 conns to the db box, and the ephemeral port is tumbling, meaning a new socket is
    being opened for each conn. ;( grrrrrrr
    Does anyone have a clue as to why this is happening?
    Thanks, curt
    my data-sources.xml
    <data-sources>
         <data-source
            class="com.evermind.sql.DriverManagerDataSource"
            connection-driver="oracle.jdbc.driver.OracleDriver"
            ejb-location="jdbc/DEVDS"
            location="jdbc/DEVCoreDS"
            name="DEVDS"
            password="j2train"
            pooled-location="jdbc/DEVPooledDS"
            url="jdbc:oracle:thin:@10.2.1.30:1521:GDOC"
            username="jscribe"
            xa-location="jdbc/xa/DEVXADS"
            inactivity-timeout="300"
            max-connections="50"
            min-connections="40"
        />
    </data-sources>

    Thanks Leif,
    Yes, I set it to the location JNDI path.
    A piece of info: the 903 OC4J release notes state that global conn pooling doesn't work, inferring that the j2ee/home/config/data-sources.xml data sources aren't pooled, or ??
    I just tested so-called local connection pooling, where I edited the data-sources.xml that gets packaged in the ear to include the min/max params, and re-ran my test.
    Still, the AM creates a new conn, it's to a new socket, and it closes the conn when done, causing each conn to be opened and then closed to the DB box rather than pooled. As verified with lsof and netstat, the ephemeral port # on the DB box side always changes, meaning it's a new socket and not an old pooled conn socket.
    ???? What the heck??
    Surely if the AM conn check-out / return code works properly, OC4J's pooling JDBC driver would pool and not close the socket?
    Has anyone gotten JDBC DataSource connections in BC4J to actually be pooled under OC4J?
    Since I couldn't get this to work in my early 902 OC4J testing, and still can't get it to work under 903 OC4J, is it my config, BC4J AM's code, or OC4J?
    Any thoughts on how to figure out what's not configured correctly or has a bug?
    Thanks, curt
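
    An editorial note on the data-sources.xml above, hedged against your exact OC4J release: with com.evermind.sql.DriverManagerDataSource, which JNDI name you look up determines whether you get pooling at all, and the reply above mentions binding to the plain location path. Annotated for reference:

        <!--
          With class="com.evermind.sql.DriverManagerDataSource" (per Oracle's
          OC4J documentation; verify against your release notes):
            ejb-location    ("jdbc/DEVDS")         pooled, transaction-aware; the
                                                   lookup Oracle recommends
            location        ("jdbc/DEVCoreDS")     raw physical connections, unpooled
            pooled-location ("jdbc/DEVPooledDS")   pooled, non-emulated
          If BC4J is bound to the plain location, every checkout can open a fresh
          physical connection, which matches the lsof/netstat symptom described.
        -->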
