Quick Question re Data Source

Hi,
I have a quick question regarding the data source login.
I haven't touched my Crystal Reports for about a year. Since I created the reports (which users have been running through Crystal Reports Server) I have changed my PC.
I need to make a change to the formulas in the reports, so I have reinstalled Crystal Reports 11 on my PC and opened the reports. The formula change is simple, but I am having a nightmare with the data source login. The report runs against a SQL Server database. From memory, all I did last time was create an ODBC connection on my PC (which used the SQL sa login) and configure the reports to use that ODBC connection. When I open the report it asks me for the data source location, I point it at the connection I created, and the report runs fine.
If I close Crystal Reports and then re-open it and press F5 to run the report, it again asks me for the data source credentials.
Any ideas? I'm sure it is something simple I am missing.
Thanks

Guys
Thanks for your help
I've managed to fix it, but I'm not really sure how, although I'm sure a few of the things you mentioned are what I did.
I basically restored the reports from last night's backup (before I made the changes), re-did the changes, and saved, and now it seems to work. I think the difference is that this time the ODBC connection already existed before I changed the reports, if that makes sense.
What I would like to do is fully understand how this data source connection works.
Don, when I open the report now and hit F5, the report presents me with the parameters and not the database login screen as it was doing before. How is that? What credentials is it using: the SQL sa account (which I configured the ODBC connection to use), or my domain account, for which I assume permissions must be set on the actual SQL database?
As you and Jeff suggested, when I look at the data source connection screen it shows "Trusted Connection: 1". What does this mean and how is it configured? Have I configured this in Crystal somewhere? (See the connection-string sketch after this post.)
Thanks
Andy
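
A minimal sketch of the two kinds of ODBC login Crystal can end up using; pyodbc, the DSN name and the password below are illustrative stand-ins, not part of the actual setup described above:

```python
# Hypothetical DSN name and credentials, purely to show the two styles.
import pyodbc

# SQL Server authentication: the connection carries an explicit login,
# like the sa account originally configured in the ODBC DSN.
sql_auth = pyodbc.connect("DSN=ReportsDSN;UID=sa;PWD=secret")

# Windows (trusted) authentication: no stored username/password; the
# connection runs as the Windows/domain account of whoever refreshes the
# report. "Trusted Connection: 1" on Crystal's Set Datasource Location
# screen indicates this style, so the domain account needs the appropriate
# permissions on the SQL Server database.
trusted = pyodbc.connect("DSN=ReportsDSN;Trusted_Connection=yes")
```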

Similar Messages

  • Report based on custom Query - Question about Data Source

    Hello,
    I am trying to create Crystal reports (EBO R3.1, Oracle 11g database, CR2008) based on custom queries. I have tried using different kinds of data sources, but I get stuck once I export the reports to InfoView.
    1. Using Oracle Server: This method works fine both locally and in InfoView. But since I have a lot of reports, I do not want the DB username/password associated with each report, and I do not want BO to prompt my users for a username/password.
    2. Using ODBC (RDO):
    Again, this works fine locally. Once I export to InfoView, it fails with this error: "Failed to open the connection"
    Which makes sense, because the DSN is created on my local machine and not on the server (see the sketch after this message).
    How and what do I create on the server, so that the reports run?
    I do not know if using ODBC (RDO) is even the right way to create custom SQL reports.
    I have been reading about Business Views, but I am not clear on how well they work.
    My goal is to create multiple reports, using custom SQL queries, which use some kind of connection to the database that can be managed on the server in one place and used by multiple reports.
    Can any of you please help me here?
    Thanks,
    Sowmya.

    Thanks for moving it to the right forum!
    But if I set up a default username/password in the CMC for each report, what would happen when my password changes?
    I would need to go to each and every report and update the password.
    I am looking for one common "connection" for multiple reports, so that when the password changes, all I have to update is one central location.
    Can anyone please help!
    Thanks,
    Sowmya.
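
    A hedged illustration of why the ODBC (RDO) report fails once published: a DSN is resolved by name on the machine that runs the report. The DSN name, driver name, TNS alias and credentials below are hypothetical:

    ```python
    import pyodbc

    # Works only where a System DSN named "MyReportsDSN" is defined (e.g. the
    # developer's workstation). The same DSN name would have to be created on
    # the BusinessObjects server for the published report to connect.
    via_dsn = pyodbc.connect("DSN=MyReportsDSN;UID=report_user;PWD=secret")

    # A DSN-less connection string keeps the driver and database details in
    # the string itself rather than in a per-machine DSN (driver name and TNS
    # alias here are examples only).
    dsn_less = pyodbc.connect(
        "DRIVER={Oracle in OraClient11g_home1};DBQ=ORCL;UID=report_user;PWD=secret"
    )
    ```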

  • Newbie Question: Creating Data Sources

    Our current Hyperion setup consists of the Hyperion 9.3 client, which users access via a terminal server. We are trying to get the web version of Hyperion running.
    I installed some base components of Oracle Enterprise Performance Management System (11.1.1.2.0.940). It all starts fine and I can log in as admin, but I do not see how to create data sources or load .bqy query files. If I create a new Interactive Reporting document, it makes me choose a data source from the virtual directory, in which I do not have one. If I try to import one, or import a .bqy file, I browse to the file, select it, then always get a "File path is wrong." error. I also found how to create data sources using the CMC, but I do not see how to reference these. What am I doing wrong?

    Hi,
    What version of Hyperion Planning are you using, 9.3 or EPM 11.1?
    They have different ways of creating data sources.
    In version 9.3 you create the data source in the Configuration Utility.
    In version 11 you need to create the Planning application first, either as a Classic Planning application or through EPM Architect, before creating the data source.
    Thanks,
    Hanson

  • Quick question about data replication

    Hi all,
    We have a 9i database running on Sun Solaris. Can we make a copy of this database on another server (backup and restore) and then apply archive logs from the database to its copy, to keep the copy current?
    I know Oracle has solutions like a standby database for this, but we need this as a temporary solution for a few months until we go to 10g/11g.
    Please let me know if I am not clear or if any other info is required to answer this.
    Thanks & Regards,

    That is what a scripted standby is -- not using Data Guard.
    You set up scripts to push/move/copy archivelogs from the primary server to the standby server (or you use log_archive_dest_2 to have the archivelogs written to the standby via RFS, with a listener and the database instance in MOUNT mode on the standby).
    On the standby, you have a script that periodically "wakes up" (or is awakened by a process on the primary which has pushed new archivelogs) and then issues a RECOVER but does NOT do an OPEN. It then shuts down after having applied the last archivelog received.
    I don't have a script ready, but I did do this even in Oracle7 (with additional logic to check CHANGE# or SEQUENCE# and also do checksum comparisons of the archivelogs).
    A Google search will return a number of suggestions; a rough sketch of the apply step also follows at the end of this message.
    For example:
    http://www.dbasupport.com/oracle/ora10g/manual_standby.shtml
    http://www.idevelopment.info/data/Oracle/DBA_tips/Standby_Database/SD_2.shtml
    Hemant K Chitale
    http://hemantoracledba.blogspot.com
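
    A minimal sketch, under stated assumptions, of the periodic "apply logs" step described above. It assumes the standby was built with a standby controlfile, that sqlplus is on the PATH, and that new archivelogs have already been copied into the standby's archive destination; it is not Hemant's script:

    ```python
    # Hypothetical wrapper around SQL*Plus for a scripted (non-Data Guard) standby.
    import subprocess

    APPLY_SCRIPT = """
    CONNECT / AS SYSDBA
    SET AUTORECOVERY ON
    RECOVER STANDBY DATABASE;
    EXIT
    """

    def apply_archivelogs() -> None:
        # Assumes the instance is already started in MOUNT mode. Recovery applies
        # every archivelog it finds and stops (with an ORA-00308 style message)
        # when the next log is not yet available; that is expected here.
        subprocess.run(["sqlplus", "-S", "/nolog"], input=APPLY_SCRIPT, text=True)

    if __name__ == "__main__":
        apply_archivelogs()  # typically driven by cron, or by a push from the primary
    ```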

  • Can't Create a Data Source - Failed to test connection. [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied

    Hi there,
    I am having a serious issue with The Power BI Data Management Gateway which I am hoping that someone can help me with.
    Basically I am setting up a connection between a Power BI demo site and a SQL Server 2012 database hosted in Azure. The Data Management Gateway is up and running, and Power BI has managed to connect to it successfully.
    By following the tutorials here I was able to successfully create my Data Connection Gateway with a self-signed certificate.
    However, when trying to create the data source I run into problems. The Data Source Manager does manage to resolve the hostname successfully.
    Bear in mind that I exposed the required ports in Azure as endpoints, and I modified the hosts file on my local machine so I could access the SQL Server hosted in Azure using its internal name -- otherwise I would not have been able to get this far.
    However, creating the data source also fails when I try to create it while logged in to the SQL Server machine in question.
    The Data Source Manager returns the error when using the Microsoft OLE DB Provider for SQL Server:
    Failed to test connection. [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied
    I tried using the SQL Server Native Client 11.0 instead, but I also get an error. This time the error is:
    Failed to test connection. Login timeout expired. A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. Named Pipes Provider: Could not open a connection to SQL Server [53].
    Some considerations:
    If I provide an invalid username/password, the Data Source Manager does say that the username and password are incorrect.
    The firewall is turned off on the SQL Server (either way, this error also happens if I try to use the Data Source Manager while logged in to the SQL Server itself).
    SQL Profiler does not show any connection attempt.
    The SQL Server instance in question is the default one.
    The error happens regardless of whether I select the option to encrypt the connection or not.
    In SQL Server Configuration Manager I can see that all protocols are enabled (TCP/IP, Named Pipes and Shared Memory).
    The Event Viewer does not provide any errors beyond the one I have copied into this post.
    I'm at a loss here. Could someone please advise what I might be doing wrong?
    Regards,
    P.

    Here is what I had to do to solve this issue:
    Basically I had to add the MSSQL TCP/IP port as an endpoint in Azure. After I did that, I was able to create the data source. However, I was only able to authenticate with a SQL account, as any domain account would return an error saying that the domain isn't trusted. (A quick way to check for the missing endpoint is sketched at the end of this message.)
    What puzzles me here is how the Data Source Manager could tell me that an account's username/password was invalid, yet fail or time out when I provided valid credentials.
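
    A hedged way to confirm the missing-endpoint symptom described above. The hostname is hypothetical and 1433 is assumed to be the instance's port:

    ```python
    # If TCP 1433 is not reachable (e.g. not exposed as an Azure endpoint), the
    # OLE DB / Native Client tests fail with the "does not exist or access
    # denied" and "Named Pipes Provider ... [53]" errors quoted above.
    import socket

    def port_open(host: str, port: int = 1433, timeout: float = 5.0) -> bool:
        """Return True if a plain TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(port_open("myazuresql.cloudapp.net"))  # False until the endpoint exists
    ```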

  • Data Source showing up as N/A in collection details

    Hi,
    I have a collection configured to use the Enterprise Crawler as a data source. I originally set this up a few years back via the collection details UI by clicking "Set up data source". I would like to edit the data source without having to modify the XML config and apply it with collection-admin -f <xmlconfig>; I would rather use the GUI. However, under collection details for the collection in question, the data sources box says "N/A".
    I have seen this problem before with the JDBC connector, and reinstalling it usually helped. How can I resolve it with the Crawler, though?
    FAST ESP 5.3 on Server 2008 R1

    Hello Mark,
    Have you tried using the web interface to edit the data source and re-add the crawler? If there is a broken interface dependency, you should be able to add the crawler again from the collection overview section. Let us know your findings, and if you have any questions.
    Thanks!
    Rob Vazzana | Sr Support Escalation Engineer | US Customer Service & Support

  • Question on Dynamic Query Data Source and Form Folders in Oracle Forms 6i

    Hi there -
    I have one interesting requirement in Oracle Forms.
    This is what I wanted to do.
    1. Have a LOV and Go button on Form.
    2. From LOV, I will select a pre-defined custom table and click Go.
    3. Based on the selected custom table, I have to populate the Block (Tabular Format).
    4. User should be able to do update, delete, insert on this block.
    5. User should be able to use the Oracle Form folders functionality (select only necessary column and save views etc. Std folder functionality).
    6. If user selects a different custom table name in the LOV on top, I need to refresh the data from the block based on this new table. Remaining functionality should be as it is (steps 3 to 5).
    You can see here that I am going to have a dynamic query data source (table name as well as column mapping) on the block. I do not know beforehand how many columns the user-selected table has!
    This is what I have planned for this so far but I have some major questions before I can move on with this design:
    1. I am going to create a table structure with a fixed number of columns in the form (40 columns, assuming that the custom table will not have more than 40). (A limitation, but it's okay for now.)
    2. Dynamically populate the block based on the table name selected by the user from the LOV, dynamically changing the block's column names based on the selected table, etc.
    3. Perform insert, update, delete using PL/SQL package.
    So far it looks okay.
    Now my real question is,
    Will the user still be able to use the "Folders" functionality here? I have never done this kind of development before, and I doubt that the dynamic column naming and dynamic column data source will really work with "folders"!
    Also, I am not really sure whether the user will be able to save these "folder" queries.
    So, Forms experts, can you tell me whether this is really going to work? Are there any better ways to do this?
    Initially I tried to do this in OA Framework, but I got stuck because, per the OAF developer guide, "I cannot use OAF personalization for dynamic items, regions etc".
    For more info on that thread see this link...
    Re: setUserCustomizable issue!
    Thanks in advance for the help.

    Any suggestions, anyone?

  • Question - Best practice data source for Vs2008 and Crystal Reports 2008

    I have posted a question here
    CR2008 using data from .NET data provider (ADO.NET DATASET from a .DLL)
    but I think that perhaps I need general community advice on best practice with data sources.
    In Crystal Reports I can choose the data source location from any number of connection types, e.g. ADO.NET (XML), COM, OLE DB, ODBC.
    Now, in regard to that post: the reports were all created in Crystal Reports 6.3, upgraded to Crystal XI, and now I'm using the latest and greatest. I wrote the 6.3/XI reports back in the day to do the following: the reports use a function from a COM object which returns an ADO recordset, which is then consumed fine.
    So I don't want to rewrite all these reports, of which there are many.
    I would like to know whether any developers are actually using .NET class libraries to return ADO.NET datasets via a method call, or whether you are connecting directly to XML data via whatever source (disk, web service, HTTP request, etc.).
    I have not been able to eliminate the problem listed in the post mentioned above, which is that the Crystal report calls the .NET class library method twice before displaying the data. I have confirmed this by debugging the class library.
    So any guidance or tips is appreciated.
    Thanks

    This is already being discussed in one of your other threads. Let's close this one out and concentrate on the one I've already replied to.
    Thanks

  • General Data source question

    Hi,
    I am a beginner and I want to know something about standard DataSources.
    Say, for example, DataSource 0FI_AR_3: how many times can this DataSource be used, or can it be used only once, for one target?
    The question may be silly, but I am trying to learn.
    Thanks.

    http://help.sap.com/saphelp_nw04/helpdata/en/70/10e73a86e99c77e10000000a114084/frameset.htm
    It loads 0FIAR_O03 (ODS, FIAR: Line Item), and that DSO in turn loads 0FIAR_C05 (Cube, FIAR: Line Item).
    A DataSource can be used to update any number of data targets.
    Message was edited by:
            Jr Roberto

  • Newbie Table Data Source Question

    Hi,
    I have a question about table data sources. I can set them up OK, but I can't find the answers I'm looking for in the documentation or online.
    With a table data source, the content column is the one that's indexed, right? Are the attribute columns indexed as well? If the content column is null, will the attribute columns still be indexed?
    Thanks in advance
    Mark.

    The attribute columns are indexed along with the content column. However, the attribute columns will not be indexed if the content column is null; that is, an empty document is not indexed.
    The workaround is to put at least a few space characters into the content column (a sketch of this follows at the end of this message).
    Steve
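
    A minimal sketch of that workaround, with a hypothetical table (MY_DOCS), content column and cx_Oracle connection details; the point is simply that rows whose content column is NULL get a single space so their attribute columns are still indexed:

    ```python
    import cx_Oracle

    conn = cx_Oracle.connect("scott", "tiger", "localhost/ORCLPDB1")
    cur = conn.cursor()
    # Give empty documents a single space so the row (and its attribute
    # columns) is still picked up by the index.
    cur.execute("UPDATE my_docs SET content = ' ' WHERE content IS NULL")
    conn.commit()
    conn.close()
    ```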

  • BI 7.0 Data Source Question

    I have a requirement in the functional spec based on tables EKKO and EKPO. The DataSources 2LIS_02_ITM, 2LIS_02_SCL, etc. carry data from these tables. But in BI 7.0 there is a DataSource called 2LIS_06_INV which provides the data previously supplied by these DataSources, i.e.:
    The new DataStore object of Invoice Verification (0LIV_DS01) is supplied with data by the new DataSource 2LIS_06_INV, whereas previously the relevant invoice verification data was only supplied by the three DataSources of Purchasing: 2LIS_02_HDR (purchasing data, header level), 2LIS_02_ITM (purchasing data, item level), and 2LIS_02_SCL (purchasing data, delivery schedule line level).
    I am already using 2LIS_06_INV in another data model for another requirement. So my question is: instead of spending time building this with 2LIS_02_ITM/HDR/SCL, can I just use the new DataSource that I am already using for the other requirement?
    Thanks

    Hi Harish,
    You can use the same DataSource to load data into different InfoProviders. Just create new transformation rules and a DTP for the new InfoProvider, assigning your DataSource as the source. In the InfoPackage, schedule the data transfer to the PSA, and then execute the DTP to load the data into the required InfoProvider.
    Hope this helps resolve your query.
    Regards
    Vaibhav

  • Newbie XML data source questions

    Post Author: tel
    CA Forum: Data Connectivity and SQL
    I'm trying to use the XML data source via HTTP and I'm running into a couple of issues.
    First, we have separate environments for development, QA, and production. Each environment will have a separate URL to retrieve the XML for the report. I would like to use the same report design against each environment. Is there a way to update the URL in the data source (or replace the data source with one that has the correct URL) without having to re-add the fields to the report design?
    The second issue is similar. I can't seem to get Crystal to recognize changes in the XML format (new fields, for example) without having to delete and recreate the data source (which automatically removes all the associated fields from the report). Is there a way around this, or again, a way of re-creating the data source without it removing the existing fields from the report?
    Thanks,

    Post Author: rosariosanto
    CA Forum: Data Connectivity and SQL
    I have the same problem. Where can I find help on setting the data source when connecting to a web service? Since the hostname is hard-coded in the report, it is necessary to update it in the code.

  • Standard Purchase Order Data Source question

    Hi,
    I have read the documentation discussing how to create data definitions. It all says to make sure that the code is the short name of the concurrent request. However, I noticed that the Standard Purchase Order data source's code is PO_STANDARD_PO, which seems to match the XML schema PO_STANDARD_PO.xsd. How is the concurrent request initiated at runtime?
    Thanks,
    Mark

    As a start, go to the XML Publisher administrator screen and look up the definition of the XML template and which program it is associated with. From memory, I believe the definition of the Print PO concurrent job is where the output is set so that the PO is printed in the XML format.

  • Data-source question

    Hi,
    I have defined a data source in the data-sources.xml file. If I do not specify the following parameters:
    max-connections
    min-connections
    then what default values does the OC4J container use?
    Secondly, on what basis are these values decided? I mean to say, what factors should be considered when specifying these values?
    Any help appreciated. Thanks in advance.
    Regards,
    Narinder

    Hi:
    If the date format begins with yyyymmdd, then
    Date Lng should be "CHAR - 8".
    Use Date Lng in the transfer structure.
    In this case, it will take the first 8 characters and ignore the rest.
    Then, in the transfer/update rules you can map both of the dates.
    This only works if the date starts with yyyymmdd,
    and assumes you don't need hhmmss.
    If not, you have to write ABAP.
    Ram Chamarthy

  • Oracle Data Source Question

    Can anyone tell me if the communication between a ColdFusion Oracle data source and the Oracle database itself is encrypted or clear text by default? I'm assuming it is clear text unless other features, such as Oracle Advanced Security, have been implemented.
    If the data transmission is clear text, are the data source username and password passed in clear text as well?

    I dunno about the authentication info, but I'd say the rest of it would be clear text.
    It might pay to have a look at the JDBC spec, or the docs for the driver you're using.  That'll have the relevant info.
    Adam
