CSV Data Source in CMC

Hello Everyone,
I am trying to schedule a Crystal Report through the Central Management Console but am running into an issue when my data source is a .CSV file. The report merges three data sources: a SQL database via an ODBC connection and two .CSV files placed in a shared location in the same folder.
Report:  Crystal XI
Crystal Reports Server XI R2 SP5
Microsoft .NET 3.5 SP1
The report runs fine on my desktop but I receive a logon error when processing from the server.
Below is a picture of my connection to the .CSV file. I have also tried changing to a specific user and password without any success, so I think we can rule out folder rights.
Thanks in advance!

When you publish the report to the server, you need to place the CSV file on the server and change the file source location path in your Crystal Report to the server location.
The other thing you can do, while developing the report, is create two ODBC connections for the two files and use them to access the data; a sketch of that approach follows.
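A minimal sketch of that ODBC approach, assuming the Microsoft Text Driver is available on the server and using placeholder paths and file names. A DSN-less connection string points the driver at the folder, and each CSV file in it is then exposed as a table:

    Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=\\fileserver\reports\csv;Extensions=asc,csv,tab,txt;

An optional schema.ini in the same folder pins down the column layout so the driver does not have to guess types:

    [sales_extract.csv]
    Format=CSVDelimited
    ColNameHeader=True

If you use a named DSN instead of a connection string, remember to create the same DSN (same name, pointing at the server-side folder) on the Crystal Reports Server machine as well.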

Similar Messages

  • Custom Data Source in CMC does not work

    Hi there,
    Server : Crystal Reports Server 2008 V1
    Designer : CR 2008
    I have several reports that were originally set (via Set Location) to ODBC DSN1. I deployed these reports to the server and updated the database configuration in the CMC as below.
    Enabled "Use Custom Data Source Information"
    Specified a new ODBC DSN2, user id, password, fully qualified table prefix for the new database etc.
    I can see that the report is running against the new database, and the data is selected, but the output is always blank. 
    But if I open the report in CR 2008, set locate the report to the new database DSN2, deploy again and use "original data source information", it works just fine. I have noticed this issue mainly with reports having sub-reports.
    Is there a solution to this problem? I don't want to set locate each report and redeploy to the CMC if I can avoid it.
    Thanks for the help.
    Ajith

    I believe that the connection details are stored as part of the report itself. Therefore, updating a connection in the CMC might not change the parameters and credentials everywhere, especially when sub-reports are involved.
    Hope this helps...
    Martijn van Foeken
    Focuzz BI Services
    http://www.focuzz.nl
    http://nl.linkedin.com/in/martijnvanfoeken
    http://twitter.com/mfoeken

  • Performance issue with big CSV files as data source

    Hi,
    We are creating Crystal reports for a large banking corporation with CSV files as the data source. For some reports, we need to join 2 CSV files. The problem we have met is that when the 2 CSV files are very large (both over 200 MB), performance is very bad and it takes an hour or so to refresh the data in the Crystal Reports designer. The same happens with both CR 11.5 and CR 2008.
    My question is: is there any way to improve performance in such situations? For example, can we create an index on the CSV files? If you have ever created reports connecting to CSV, your suggestions will be highly appreciated.
    Thanks,
    Ray

    Certainly a reasonable concern...
    The question at this point is, How are the reports going to be used and deployed once they are in production?
    I'd look at it from that direction.
    For example... They may be able to dump the data directly to another database on a separate server that would insulate the main enterprise server. This would allow the main server to run the necessary queries during off peak hours and would isolate any reporting activity to a "reporting database".
    This would also keep the data secure and encrypted (it would continue to enjoy the security provided by an RDBMS). Text & csv files can be copied, emailed, altered & deleted by anyone who sees them. Placing them in encrypted .zip folders prevents them from being read by external applications.
    Hope you liked the sales pitch I wrote for you to give to the client... =^)
    If all else fails and you're stuck using the CSV files, at least see if they can get it all into one file. Joining the 2 files is killing you performance-wise... more so than using 1 massive file.
    Jason
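    A minimal sketch of the "reporting database" approach described above, assuming SQL Server on the reporting side and using hypothetical file, table, and column names:

        -- Load each CSV into a staging table on the reporting server
        BULK INSERT staging.Transactions
        FROM '\\fileshare\exports\transactions.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

        BULK INSERT staging.Accounts
        FROM '\\fileshare\exports\accounts.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

        -- Index the join keys so the report no longer has to scan and join flat files
        CREATE INDEX IX_Transactions_AccountId ON staging.Transactions (AccountId);
        CREATE INDEX IX_Accounts_AccountId ON staging.Accounts (AccountId);

    The report would then be pointed at the two indexed staging tables (or a view joining them) instead of the raw CSV files.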

  • Setting up csv files as a data source for Hyperion intelligence explorer on Windows 7

    Hi. Wondering if anyone can point me to a resource, or supply me with steps, on how to set up CSV files as a data source for Hyperion Intelligence Explorer on Windows 7?
    I used to be able to do it in Windows XP, but it seems things have really changed with Windows 7... Absolutely desperate... Any and all help welcome...

    Hi
    Can you please try setting full permissions for Everyone in the security properties of the wsconfig folder (C:\ColdFusion10\config\wsconfig) and then restart the ColdFusion server.
    Swaraj

  • 2.5 GB CSV file as data source for Crystal report

    Hi Experts,
    I was asked to create a Crystal Report using a CSV file as the data source (a pretty huge file, about 2.4 GB). Could you help me with any doc that explains the steps, mainly regarding data connectivity?
    The objective is to create a Crystal Report using that CSV file as the data source, save the report as .rpt with the data, and send the results to the customer to be read with Crystal Reports Viewer, or save the results to PDF.
    Please suggest the steps, as I am new to Crystal Reports and to CSV as a source.
    BR, Nanda Kishore

    Nanda,
    The issue of some records using a comma and some using a semicolon will need to be resolved before you can do an import. Assuming that there are no semicolons in any of the text values of the report, you could do a "Find & Replace" to convert the semicolons to commas.
    If find & replace isn't an option, you'll need to get the files separately.
    I've never used the Import/Export Wizard myself; I've always used the BULK INSERT command.
    It would look something like this...
    BULK INSERT SQLServerTableName
    FROM 'c:\My_CSV_File.csv'
    WITH (FIELDTERMINATOR = ',')
    This of course implies that your table has the same columns, in the same order as the csv files and that each column is the correct data type to accept the incoming data.
    If you continue to have issues getting your data into SQL Server Express, please post in one of these two forums
    [Transact-SQL|http://social.msdn.microsoft.com/Forums/en-US/transactsql/threads]
    [SQL Server Express|http://social.msdn.microsoft.com/Forums/en-US/sqlexpress/threads]
    The Transact-SQL forum has some VERY knowledgeable people (including MVPs and book authors) posting answers.
    I've never posted to the SQL Server Express forum, but I'm sure they can troubleshoot your issues with the Import/Export Wizard.
    If you post in one of them, please copy the post link back to this thread so I can continue to help.
    Jason
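    If the semicolon-delimited rows end up in their own file (i.e. Find & Replace is not an option and the files stay separate), the same command can load that file with a different terminator; a sketch using a hypothetical second file name:

        BULK INSERT SQLServerTableName
        FROM 'c:\My_CSV_File_semicolon.csv'
        WITH (FIELDTERMINATOR = ';')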

  • Excel-2007 cannot connect to Oracle ODBC data source, Control Panel can.

    I cannot make an ODBC connection from Excel-2007 to Oracle work. I am an expert Excel and VBA user (since 1994) and I have frequently used Excel to access ODBC databases, including Oracle (I have done this with Excel-2003 both with worksheet queries and with VBA ADO-connection routines). Even though a worksheet ODBC query is supposed to be easier to create in Excel-2007 than in previous versions, my connection fails. Any suggestions and all help are welcome and much appreciated.
    DETAILS
    1) What is my system? I am using Excel-2007 on Windows Vista x64 and Oracle server v.11g on my computer (all of this is on one machine, no network issues).
    2) Why use Excel with Oracle at all? I use Excel-2007 to access Oracle rather than Access-2007 (or any other application like TOAD, etc.) because I do engineering calculations with the data stored in Oracle. These calculations are easier done in Excel. (I suppose one alternative would be to use some SQL or Access to get the data from the database, store it as a plain-vanilla CSV file, open that file in Excel, do the math (the math involves complex optimisation algorithms), save the results as CSV, then use some SQL or Access to put the data back into the database. However, this does not strike me as a quick or neat solution. And after all, Excel was designed to access ODBC databases, so why not use it?)
    3) What do I do in Excel-2007 that won't work? I create an ODBC link to Oracle that does not work. In Excel-2007 this is straightforward:
    - define an ODBC connection (Data tab --> From Other Sources --> From Data Connection Wizard);
    - define a query on the worksheet -- that's it, this is all!
    I start with creating an ODBC connection:
    a) I choose an ODBC data source type: ODBC DSN.
    b) Excel-2007 displays the list of the available ODBC data sources. I see my Oracle database name in it and I select it.
    c) Excel-2007 displays the Data Link Properties:
    - the "Provider" tab has a list of the OLE DB drivers with "Microsoft OLE DB Provider for ODBC Drivers" preselected. I keep this default selection.
    - the "Connection" tab has a connection string "DSN=<my database name>", which I keep; it also has fields for the user name and the password, which I fill with the correct credentials.
    - Finally there is a "Test Connection" button, which when clicked produces the following error message: "Test connection failed because of an error in initializing provider. Unspecified error".
    4) Additional food for thought:
    a) In the above walk-through the only data I type are the user name and password; everything else is selected from lists offered by Excel-2007, so typos can safely be ruled out as the cause of the problem.
    b) I can test the ODBC driver in the Control Panel and it shows that it can connect to the Oracle database:
    - in Control Panel --> Admin Tools --> Data Sources (ODBC), on the "User DSN" tab I can see the list of the available ODBC data sources (the same list as in Excel-2007, point 3b above) with the name of my database in it;
    - selecting the name of my database from the list and clicking the "Configure" button opens a tab with the Data Source Name (same as in Excel-2007), TNS Service Name and User ID. I enter <user name>/<password> and click the "Test Connection" button. A message "Connection successful" appears (just as a test, I entered incorrect credentials and an "Unable to connect" message appeared).
    BOTTOM LINE
    The procedure for using an ODBC connection from Excel is very simple; in the past I have created and used such connections numerous times with Excel-2003 and earlier on Win-XP and earlier. But now, with Excel-2007 on Vista x64, I cannot make it work.
    Also, testing an ODBC connection driver is easy to do in the Control Panel, and there the same ODBC connection that fails in Excel-2007 tests successfully.
    I am frustrated by the simplicity of the problem and yet the persistent error. I have now lost two full days in failed attempts to make this simple procedure work and in searching the internet for answers.
    All help is highly appreciated,
    Plamen

    Did you find the solution to your problem?
    If not, I think I may know.
    Excel 2007 is a 32-bit application.
    When installing 32-bit applications in a 64-bit environment, the "default" location is:
    C:\Program files (x86)\...
    Excel then launches from this location.
    However, when it connects to Oracle and passes the name of the calling program, Oracle attempts to "interpret" the value of (x86) as if
    the value within the parentheses were being passed as a reference. Of course, Oracle doesn't find anything, so the result is (),
    and then it cannot return the connection info back to the calling application.
    I corrected it by installing Excel in C:\Program Files\, and once launched from that location, it works the same as on the 32-bit machines.
    However, at my location, they are FORCING Excel to be installed in the (x86) location.
    What I'm trying to discover now is:
    Is it possible to flag Oracle to NOT process the embedded variables?
    Or, is it possible to assign a variable in Oracle such that x86 = "(x86)", so that the end result is viable?
    Have you had any luck with your installation?
    thanks,
    Paul

  • Changing the data source from DB2 to ORACLE in SBOP 4.0

    Hi Gurus,
    We have done the SBOP 4.0 SP2 installation successfully on Linux, choosing DB2 as the default database as suggested by SAP (there is some issue with the RH Linux 5.5 version). Now we need to change the CMS data source back to Oracle 11g. For that we have to execute cmsdbsetup.sh and go with the "copy" option (copy data from another data source). We need to provide the target/destination CMS database details, in my case Oracle (TNS & CMS user), and we also need to provide the source CMS user (DB2) details. As we went with the bundled/default DB2 installation, we are not able to find the CMS user name and password (nowhere during the installation were we prompted to provide a CMS username and password).
    What will be the default cms username /password in DB2?
    Thanks,
    Sandeep

    Hi,
    The workaround is to create/add an extra node (SIA node) with the default servers option for the existing CMS, providing the Oracle CMS username/password along with the TNS name, using cmsdbsetup.sh. Make sure that this new node is visible in the Servers section of the CMC (i.e. http://<webappserver>:8080/BOE/CMC --> Servers) and that all of its servers are running. Then you can delete the old SIA, which was connected to DB2, from CMC --> Servers.
    Thanks,
    Sandeep

  • SQL Interface - Error in Loading the data from SQL data source

    Hello,
    We have been using a SQL data source for loading the dimensions and the data for many years, and we have been on Essbase 11.1.1.0 for quite a while (more than one year). For the past few days, we have been getting the error below when trying to load the data.
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Info(1021013)
    ODBC Layer Error: [S1000] ==> [[DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]CURSOR IDENTIFIED IN FETCH OR CLOSE STATEMENT
    IS NOT OPEN (DIAG INFO: ).]
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Info(1021014)
    ODBC Layer Error: Native Error code [4294966795]
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Error(1021001)
    Failed to Establish Connection With SQL Database Server. See log for more information
    [Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Error(1003050)
    Data Load Transaction Aborted With Error [7]
    [Mon Jan 10 11:02:56 2011]Local/{App Name}///Info(1013214)
    Clear Active on User [Olapadm] Instance [1]
    Interestingly, after the job fails through our batch scheduler environment, when I run the same script manually it completes successfully.
    Also, this is the first time I have seen this kind of error message.
    Appreciate any help or any suggestions to find a resolution. Thanks,

    Hi Priya,
    The reasons may be that the file is open, the format/flat-file structure is not correct, the mapping/transfer structure is not correct, or there are invalid characters/data inconsistencies in the file, etc.
    Check whether the flat file is in .CSV format.
    You have to save it in .CSV format for the flat-file loading to work.
    Also check for connection issues between the source system and BW; sometimes the problem is due to inactive update rules.
    Refer
    error 1
    Find out the actual reason and let us know.
    Hope this helps.
    Regards,
    Raghu.

  • Report based on custom Query - Question about Data Source

    Hello,
    I am trying to create Crystal reports (BOE R3.1, Oracle 11g database, CR 2008) based on custom queries. I have tried using different kinds of data sources but get stuck once I export the reports to InfoView.
    1. Using Oracle Server: This method works fine both locally and in InfoView. But since I have a lot of reports, I do not want the DB username/password tied to each report, nor do I want BO to prompt my users for credentials.
    2. Using ODBC(RDO):
    Again, works fine on my local. Once I export to Infoview, it fails with this error: "Failed to open the connection"
    Which makes sense, because the DSN is created on my local machine and not on the server.
    How and what do I create on the server, so that the reports run?
    I do not know if using ODBC(RDO) is even the right way for creating custom SQL reports.
    I have been reading about Business Views, but not clear on how good they work.
    My goal is to create multiple reports, using custom SQL queries which use some kind of connection to the database that can be managed on the
    server in one place, and used by multiple reports.
    Can any of you please help me here
    Thanks,
    Sowmya.
    Edited by: SBat22 on Apr 13, 2011 5:58 PM

    Thanks for moving it to the right forum!!
    But if I set up in the CMC a default username/password that the report needs, what happens when my password changes?
    I would need to go to each and every report and update the password.
    I am looking for one common "connection" used by multiple reports, so that when passwords change, all I update is one central location.
    Can anyone please help!
    Thanks,
    Sowmya.

  • Data security for multiple data sources

    Dear BO guru's,
    I am struggling with a brainbraker on authorizations on Universes since quite some time.
    I am not a BO guru so hopefully someone can help me with this.
    I (more or less) know the concept of data security in BO: users can be restricted on data level in (mainly) two ways:
    1) with roles in CMC and with restrictions in Universe Designer.
    OR
    2) with a DB table that contains all authorized values per user and per field (e.g. John can see data for country UK)
    The first is easy to set up, but hard to maintain.
    The second is difficult to set up, but very flexible.
    Now here is the problem...
    Suppose your BO server is connected to different source systems (e.g. a SQL Server, an Oracle server...) and you want a single universe to get data from all the systems at the same time and display it in one report.
    If I am not mistaken, this means we need a Data Federator.
    ...My questions:
    1) Is there a possibility to do this without a data federator (but still have one universe to build my report or dashboard on)?
    2) Where do I keep the table with authorizations for the users? Is that a database on the BO Enterprise server, a separate database, or a table in one of the source systems (SQL, Oracle...)?
    As this questions keeps me busy since long time, I would be very grateful to have your help.
    It seems hard to find information on this.
    Thanks a lot in advance!
    UniverseDummy

    1) Is there a possibility to do this without a data federator (but still have one universe to build my report or dashboard on)?
    Apart from the Data Federator, you can either use an ETL tool to load your data into a single data warehouse or wait until XI 4.0.
    2) Where do I keep the table with authorizations for the users? Is that a database on the BO Enterprise server, a separate database, or a table in one of the source systems (SQL, Oracle...)?
    If you are going to use the Data Federator, then you can create the table in one of your source systems and add it as a data source in your DF project.
    Regards,
    Stratos
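    A minimal sketch of the table-driven restriction described in option 2 above, assuming a SQL source and hypothetical table and column names; the commented predicate shows how a universe-level restriction could reference the logged-in BO user via @Variable('BOUSER'):

        -- Authorization table kept in one of the source systems (or the reporting database)
        CREATE TABLE BO_USER_AUTH (
            BO_USER VARCHAR(50) NOT NULL,
            COUNTRY VARCHAR(3)  NOT NULL
        );

        -- Example row: John may only see data for country UK
        INSERT INTO BO_USER_AUTH (BO_USER, COUNTRY) VALUES ('john', 'UK');

        -- A restriction on the fact table in the universe could then look like:
        -- FACT_SALES.COUNTRY IN
        --     (SELECT COUNTRY FROM BO_USER_AUTH WHERE BO_USER = @Variable('BOUSER'))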

  • How can I convert a space from ABAP to CSV data?

    Hello guys,
    how can I convert a string like
    "This is a%test%"
    to
    "This is  a test "
    into CSV data?
    I tried with REPLACE ALL OCCURRENCES OF '%' IN string_above WITH space,
    but I got the string without space, like
    "This is  atest"
    I read in some posts that CSV will ignore spaces.
    But which kind of symbol in CSV represents space?
    Thanks for any suggestion!
    Regards,
    Liying

    Use the code as shown below:
    TYPES : BEGIN OF tp_data,
              line(4096),
            END   OF tp_data.
    *&      Form  gui_upload
    *       Upload the data from the input file to the internal table
    FORM gui_upload.
    *--Local data declaration
      DATA: l_filename_ip TYPE string,
            tl_data       TYPE STANDARD TABLE OF tp_data WITH HEADER LINE.
      l_filename_ip = p_input.
    *--Upload Data
      CALL FUNCTION 'GUI_UPLOAD'
           EXPORTING
                filename                = l_filename_ip
                filetype                = 'ASC'
                has_field_separator     = ''
           TABLES
                data_tab                = tl_data
           EXCEPTIONS
                file_open_error         = 1
                file_read_error         = 2
                no_batch                = 3
                gui_refuse_filetransfer = 4
                invalid_type            = 5
                no_authority            = 6
                unknown_error           = 7
                bad_data_format         = 8
                header_not_allowed      = 9
                separator_not_allowed   = 10
                header_too_long         = 11
                unknown_dp_error        = 12
                access_denied           = 13
                dp_out_of_memory        = 14
                disk_full               = 15
                dp_timeout              = 16
                OTHERS                  = 17.
      IF sy-subrc <> 0.
        MESSAGE ID   sy-msgid TYPE 'S' NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        IF sy-subrc = 1.
          MESSAGE e000 WITH text-e04.
        ENDIF.
      ELSE.
        LOOP AT tl_data FROM 2.
    *--Taking the contents of the input file leaving the header part
          PERFORM split_and_save USING tl_data-line.
        ENDLOOP.
      ENDIF.
    ENDFORM.                    " gui_upload
    *&      Form  split_and_save
    *       Splitting the contents from the input file and saving them into
    *       an internal table
    *      -->P_TL_DATA_LINE  text
    FORM split_and_save USING fp_data.
    *--Local data declaration
      DATA : tl_data TYPE STANDARD TABLE OF tp_data WITH HEADER LINE,
             l_type  TYPE c,
             c_space(1) TYPE c VALUE ' '.
    *--Split at the space
      SPLIT fp_data AT c_space INTO TABLE tl_data.
      CLEAR inputtab.
    *--Move it to the target fields
      DO.
        ASSIGN COMPONENT sy-index OF STRUCTURE inputtab TO <fs_field>.
        IF NOT sy-subrc IS INITIAL.
          EXIT.
        ENDIF.
    *--Extract source data
        CLEAR tl_data.
        READ TABLE tl_data INDEX sy-index.
    *--Populate the target
        <fs_field> = tl_data-line.
      ENDDO.
    *--Append this record
      APPEND inputtab.
    ENDFORM.                    " split_and_save
    In the above code you can make inputtab like tp_data and get the result, as required, in an internal table with space-separated fields.
    In place of a space you can also use any other symbol.
    After that you can replace all occurrences of the space with a comma by using the REPLACE statement you have already written.

  • Trying to change the data source for a Crystal Report.

    The method below represents my best attempt to programmatically change the data source of a Crystal Report. The goal is to have a routine that will update the data source for reports after they have been distributed to production servers. So far I have not been successful in saving the report back to the CMS. No exceptions are thrown, but when I view the Database Configuration of the report in the CMC nothing has changed.
    Am I missing a step, or is there another way to accomplish this?
    Thank you.
    private void test(String reportName)
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects;
       IInfoObject reportObj;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dc;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB; //0;
       Fields connFields = null;
       String queryStr = "Select * From CI_INFOOBJECTS " +
          "Where SI_NAME='wfr.rpt' AND SI_KIND='CrystalReport' AND SI_INSTANCE=0";
       newInfoObjects = getCms().executeQuery(queryStr);
       if(newInfoObjects.size() > 0)
       {
          reportObj = (IInfoObject)newInfoObjects.get(0);
          try
          {
             clientDoc = getCms().getReportAppFactory().openDocument(
                reportObj
                , OpenReportOptions._refreshRepositoryObjects
                , java.util.Locale.US);
             dc = clientDoc.getDatabaseController();
             conInfos = dc.getConnectionInfos(null);
             for(int i = 0; i < conInfos.size(); ++i)
             {
                oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(i);
                newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                pBag = newConInfo.getAttributes();
                pBag.putStringValue("QE_ServerDescription", "alio");
                logonProps = new PropertyBag();
                logonProps.putStringValue("Trusted_Connection", "false");
                logonProps.putStringValue("Server", "alio");
                pBag.put("QE_LogonProperties", logonProps);
                newConInfo.setUserName("admin");
                newConInfo.setPassword("password");
                dc.replaceConnection(
                   oldConInfo
                   , newConInfo
                   , connFields
                   , connOptions);
             }
          }
          catch(ReportSDKServerException Ex)
          {
             String msg = "A server error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          catch(Exception Ex)
          {
             String msg = "An error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          finally
          {
             clientDoc.save();
             getCms().commitToInfoStore(newInfoObjects);
             clientDoc.close();
          }
       }
    }
    Edited by: Mark Young on Sep 10, 2009 2:16 PM

    Justin,
    Thank you for the reply. Time constraints have not allowed me to post back to this thread till now. I will try your suggestion. My assumption is that "Save the report back to the info store" refers to IInfoStore.commit(IInfoObjects).
    I'm afraid that I do not understand why I don't want to change the report client document, or why "successfully exporting the report with the new login/password" is not what I want to do. Any explanation of that statement would be appreciated.
    I did find a way to accomplish my goal. It involved adding the SSOKEY property to the logon property bag. Below you'll see my revised code, which modifies the report logon and server. I have no idea what this does, and SAP support has not been able to tell me why it works. However, what I discovered is that if I changed the report option "Database Configuration -> When viewing report:" in the CMS to "Use same database logon as when report is run" from "Prompt the user for database logon", then the SSOKEY property was added to the logon property bag with an empty string as its value. This allowed me to successfully update and save the modified logon back to the CMS.
    So I took a chance and added code to always add the SSOKEY property with an empty string as its value, and I could then successfully modify and save the report's logon info and server. Again, I don't know what this means, but it has worked so far. If anyone has some insight or comments, either is welcome. Thank you in advance.
    private void changeDataSourceOfAWFCrystalReports()
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects = null;
       IInfoObject reportObj = null;
       IReport curReport = null;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dbController;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB;
       Fields connFields = null;
       String outputStr;
       int numOfReports;
       int numOfQueryPages;
       double progressIncrementPerPage = 30;
       int progressIncrementPerReport = 0;
       // Path query to reports is in a .properties file.
       String queryStr = getAppSettingsFile().getWscAwfCrystalReportPathQuery();
       try
       {
          // Executes IInfoStore.getPageingQuery() and generates a list of queries.
          getCms().setPathQueryQueries(queryStr, 100);
          numOfQueryPages = 0;
          // Gets a List<String> of the IPageResult returned from IInfoStore.getPageingQuery().
          if(getCms().getPathQueryQueries() != null)
             numOfQueryPages = getCms().getPathQueryQueries().size();
          if(numOfQueryPages > 0)
             // Use 30% of progress bar for the following loop.
             progressIncrementPerPage = Math.floor(30.0/(double)numOfQueryPages);
          for(int queryPageIndex = 0; queryPageIndex < numOfQueryPages; ++queryPageIndex)
          {
             // Gets the IInfoObjects returned from the current page query
             newInfoObjects = getCms().getPathQueryResultSetPage(queryPageIndex);
             numOfReports = newInfoObjects.size();
             if(newInfoObjects != null && numOfReports > 0)
             {
                progressIncrementPerReport =
                   Math.round((float)Math.floor(progressIncrementPerPage/(double)numOfReports));
                for(int reportIndex = 0; reportIndex < numOfReports; ++reportIndex)
                {
                   reportObj = (IInfoObject)newInfoObjects.get(reportIndex);
                   curReport = (IReport)reportObj;
                   clientDoc = getCms().getReportAppFactory().openDocument(
                      reportObj
                      , OpenReportOptions._refreshRepositoryObjects
                      , java.util.Locale.US);
                   dbController = clientDoc.getDatabaseController();
                   conInfos = dbController.getConnectionInfos(null);
                   for(int conInfosIndex = 0; conInfosIndex < conInfos.size(); ++conInfosIndex)
                   {
                      oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                      newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                      pBag = newConInfo.getAttributes();
                      pBag.putStringValue(
                         "QE_ServerDescription"
                         ,getConfigFile().getDBDataSourceConnections());
                      logonProps = new PropertyBag();
                      logonProps.putStringValue("Trusted_Connection", "false");
                      logonProps.putStringValue("SSOKEY", "");
                      logonProps.putStringValue(
                         "Server"
                         ,getConfigFile().getDBDataSourceConnections());
                      pBag.put("QE_LogonProperties", logonProps);
                      newConInfo.setUserName(getConfigFile().getUNVConnectionUserName());
                      newConInfo.setPassword(getConfigFile().getUNVConnectionPasswordDecrypted());
                      dbController.replaceConnection(
                         oldConInfo
                         , newConInfo
                         , connFields
                         , connOptions);
                      newConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                   } // end for on conInfosIndex
                   clientDoc.save();
                } // end for on reportIndex
             } // end if on newInfoObjects
          } // end for on queryPageIndex
       } // end try
       catch(ReportSDKServerException Ex)
       {
          // handle...
       }
       catch(Exception Ex)
       {
          // handle...
       }
       finally
       {
          getCms().commitToInfoStore(newInfoObjects);
          if(clientDoc != null)
             clientDoc.close();
       }
    }

  • Error while creating Data Source for master data attributes

    Hi BI Experts,
    Well, it has been some time since I was last part of extraction in BI; I primarily handled reporting in my recent assignments.
    I was trying extraction with flat files in SAP BI 7 (new to SAP BI 7 but very much familiar with BW 3.5) but failed during the master data attributes and text upload into an InfoObject (say IOSP_Mat).
    Here is the procedure I followed after creating the characteristic IOSP_Mat. I created a source system for the flat file, followed by a DataSource for master data attributes, and selected all the parameters correctly, i.e. CSV file format, data separator ","
    and the other settings. Now when I try to look at the proposed data in the next tab using "Load example data", it does not show the desired result. The columns I have maintained in the flat file are MAT_NUMBER and MAT_NAME (with, say, 100 records in the file).
    The result is the same when I try to load the text data; the columns maintained are
    LANGUAGE, MAT_NUMBER, Short Description (the same 100 records).
    I then used the RSA1OLD transaction to upload the file the 3.5 way: I created an InfoSource for master data/texts/hierarchies for IOSP_Mat.
    Now when trying to upload it using an InfoPackage for master and text data, I observe that the data is not maintained in the characteristic IOSP_Mat.
    When I monitored, I found that the data had not even been uploaded to the PSA level.
    Can you BI experts tell me the answer for this.
    Thanks,
    Srijith

    Apologies to all of you for the late response;
    I was busy with some other activities.
    I don't remember the exact message, but I remember it was not loaded even to the PSA level. I will try it again and post the exact message.
    Thanks again for your quick response.
    Once again sorry to all of you for my late response
    Thanks,
    Sri

  • Not enough memory for Data Provider-Error while creating Data Source

    Hi,
    I am loading data into a master data attribute InfoObject and I am getting the following error message while creating the DataSource, under the "Proposal" tab:
    "Not enough memory for Data Provider"
    My master data InfoObject has 65 attributes.
    My CSV file has 15,00000 records.
    I am using BI 7.0.
    If anybody has faced this problem, please share with me.
    Thanks.

    Hi,
    The problem here is with the space, so please contact your Basis people to increase the space for the particular object.

  • Issue with Crystal Report based on 2 different data sources

    Hi there,
    I am having a frustrating problem with a report I've designed and I'm hoping someone might be able to assist please.
    The report has 3 different prompts, each of which is based on a dynamic list of values retrieved via views within a SQL server db.
    We want to introduce the use of Universes as much as possible, so the data returned is based on a BO Universe data source query.
    I have uploaded the report into BO and have provided the necessary database logon information for the report (in the "Process" > "Database" settings in the CMC) for both the direct db datasource connection (for the views) and the BO Universe query connection.
    When the report is run however, the report still prompts for the database user name & password credentials. I have triple checked my db connection settings, and also have "Use same database logon as when report is run" set to true. I also tested a cut-down version of the report without the Universe connection with the same db logon credentials I provided and there was no credentials prompt when it was run, proving those values are accepted.
    Does anyone know why this is happening & if there is a way around it? Alternatively, is there some way that a report prompt may be based on a dynamic list of values retrieved via a Universe connection? This way I'd be able to remove the db connection for the views and have the report solely based on the Universe.
    Another issue that occurs is out of the 3 prompts, a user can select a User Name OR Number, and also must select a Period. However if the User Name or Number is left blank the message "The value is not valid" is shown. So I tried a cut-down version of the report with only the BO Universe as a data source (static prompts) and this didn't occur, i.e. I was allowed to leave either the User Name or Number empty.
    I hope all of this makes sense - let me know if not. If anyone is able to help out with any of this it would be very much appreciated.
    Cheers,
    Marco

    If this is still an issue, please re-post it to the Business Objects forum, or purchase a support case and have a dedicated support engineer work with you directly.
