Recurring Crystal schedules fail if original data source changes

We're using BO 3.1 (SP3 plus FP 3.3 and 3.5) with an Oracle 11g repository and both 10g and 11g clients installed on the BO servers (Windows Server 2008 R2).  We've been trying to resolve an issue with recurring schedules of Crystal reports.  We always configure them to use the Custom Data Source settings, but if the Original Data Source name changes, the report schedules fail with the error message below.  In other words, if the parent report's original data source no longer matches the schedule's original data source, the schedule starts failing even though it's configured to use the custom data source settings.  I've had a ticket open with support for over a month now but am not getting anywhere.  Any feedback would be greatly appreciated.
Error in File ~tmp6406e9398df050.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 1005 ]

Hi John,
Thanks for the response, but rescheduling reports does NOT resolve the issue and shouldn't be necessary when they're set to use the CUSTOM data source.  We've found that the only real fix is to delete the existing recurring schedules and create brand new ones.  Needless to say, that isn't very convenient for users, and it's difficult to prevent the original data source name from ever changing.  It sure seems like a bug, but SAP will not acknowledge it as such.  If you use Query Builder to look up all the details of a recurring schedule, nothing indicates where the problem is: SI_USE_ORIGINALDS is set to "false", yet if the original data source in the recurring schedule does not match that of the parent report, the schedule fails even though the custom data source settings are correct.  Why would the recurring schedule do anything at all with the original data source when it is set to use the custom one??
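For reference, the same lookup can be scripted with the BOE .NET SDK instead of pasting queries into Query Builder by hand. The sketch below is only illustrative: the CMS name, credentials, and the exact placement of SI_USE_ORIGINALDS inside the SI_PROCESSINFO bag are assumptions to adjust for your own system.

    // Sketch only: list recurring Crystal report schedules and their processing info.
    // "myCMS", the credentials and the property names below are placeholders.
    using System;
    using CrystalDecisions.Enterprise;

    class ListRecurringSchedules
    {
        static void Main()
        {
            SessionMgr sessionMgr = new SessionMgr();
            EnterpriseSession session = sessionMgr.Logon(
                "Administrator", "password", "myCMS:6400", "secEnterprise");
            InfoStore infoStore = new InfoStore(session.GetService("InfoStore"));

            // The same query can be pasted straight into Query Builder.
            InfoObjects schedules = infoStore.Query(
                "SELECT SI_ID, SI_NAME, SI_PROCESSINFO " +
                "FROM CI_INFOOBJECTS " +
                "WHERE SI_KIND='CrystalReport' AND SI_RECURRING=1");

            foreach (InfoObject schedule in schedules)
            {
                // SI_PROCESSINFO is the bag that carries SI_USE_ORIGINALDS and the
                // data source settings; compare it against the parent report's.
                Console.WriteLine("{0} ({1})", schedule.Title, schedule.ID);
            }

            session.Logoff();
        }
    }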

Similar Messages

  • How to design crystal report with web service data source?

    I want to design and run a Crystal Report 2008 against the ADO.NET DataSet returned by a web service method.
    I choose New Report; for Data I choose XML, then "Use Web Data Source" and hit Next; then I choose "HTTP WSDL URL", enter http://localhost/RDWS/Service.asmx?wsdl, hit Next, and leave HTTP username and HTTP password blank. Then I see the Services, Ports and Methods screen: Services: Service, Ports: ServiceSoap, Methods: CustomerOrdersDataSet.
    When I click Finish, I get Logon Failed; Details: Schema file is invalid. Error: Element 'Schema@http://www.w3.org/2001/XMLSchema' not found.
    How can I get this to work?

    The native XML driver is incompatible with ADO.NET DataSet XML.  The specific issue is that the driver cannot handle the recursive definition of "Schema" that the .NET DataSet XML uses.
    The workaround is to create a .NET class that adds a Web Reference to the service, invokes the Web Service method, and returns the DataSet.
    Then use the ADO.NET (XML) driver in Crystal Reports to consume that .NET class as the data source.
    Note - when you deploy your report, ensure you deploy the DLL for the .NET class you've created.
    Sincerely,
    Ted Ueda
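    For illustration only, the wrapper Ted describes can be as small as the sketch below. The proxy type (localhost.Service) and the provider class name are assumptions; the web method name (CustomerOrdersDataSet) comes from the original post.

        // Hypothetical wrapper: add a Web Reference to the .asmx service in Visual
        // Studio, then expose a method that returns the DataSet for Crystal to consume.
        using System.Data;

        public class CustomerOrdersProvider
        {
            public DataSet GetCustomerOrders()
            {
                // "localhost.Service" is the proxy class generated by the Web Reference.
                localhost.Service ws = new localhost.Service();
                return ws.CustomerOrdersDataSet();
            }
        }

    The ADO.NET (XML) connection in the designer is then pointed at the compiled assembly and this method is chosen as the data source; as Ted notes, that DLL has to be deployed alongside the report.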

  • How to configure crystal report xml file as data source in BOE in Solaris?

    Hi,
    How do I configure Crystal Reports to use an XML file as a data source on Solaris? I didn't find any suitable driver for XML / Excel files on Sun Solaris.
    Which driver do I have to use to connect an XML file to a Crystal report so that I can view the report in BOE on Solaris?
    And the same question for an Excel file as a data source for a Crystal report.
    Thanks

    Hi Don, thanks for the reply.
    In a Windows environment I don't have any problem creating Crystal reports from XML and Excel files. After creating the reports, when I publish them to the BOE server on Solaris I get a connection failed error.
    My Solaris BOE server doesn't have any network connection to the Windows machines, so I have to place the files on the Solaris server.
    Below are the steps I tried:
    1. Created the Crystal reports in CR Designer on Windows using the ADO.NET (XML) driver, and in another attempt with the XML Web Services driver. The reports work as they should.
    2. Saved them to the BOE repository on the Solaris server from Crystal Reports and changed the database configuration settings as follows:
        - Used custom database logon information and specified cr_xml as the custom driver.
        - Changed the database path to the file directory path on the Solaris server </app/../../>
        - Tried a table prefix as well.
        - Selected the radio button "Use same database logon as when report is run" and saved.
    My environment :
    SAP BOXI3.1 sp3
    crystal reports 2008 sp3
    SunOS
    CR development on Windows 7.
    For Excel I tried ODBC on Windows, but I can't find any ODBC or JDBC drivers for Excel on Solaris.
    Any help with these issues would be appreciated.
    Thanks
    Nagalla

  • Re: request failed in FIAP Data source

    Hi Gurus,
    Please let me know: the FIAP data source delta is scheduled daily at 1 PM, but the delta fails on a daily basis. When I run a repeat delta it succeeds, though sometimes the repeat delta fails as well. What could be the root cause?
    Regards
    s. dipu

    Thanks for the reply,
    Please see the details below.
    In the Details tab:
    1. Requests (messages): Everything OK
    2. Extraction (messages): Missing messages
           Data selection scheduled
           Missing message: Number of sent records
           Missing message: Selection completed
    Transfer (IDocs and TRFC): Everything OK
    Processing (data packet): No data
    Deletion (messages): Everything OK
    In the Status tab:
    Diagnosis
         The information available is not sufficient to perform an analysis. You
         must determine whether further IDocs exist in BI that have not yet been
         processed, and could deliver additional information.
    Further analysis:
         Check the BI ALE inbox. You can check the IDocs using the wizard or the
         "All IDocs" tab page.
      regards
    s dipu
    Edited by: Siddhes on Nov 30, 2011 2:37 PM

  • Crystal Reports - Failed to retrieve data from the database

    Hi There,
    I'm hoping that somebody can help me.
    I've developed a crystal report from a stored procedure which I wrote. I can execute the stored procedure within SQL Server and within the Crystal Reports designer without any errors. Furthermore, I have imported the report into sap and can run it within SAP from the server without any errors. SAP version 8.81 PL5
    The issue is that when it's run from a client machine, I get the following error: "Failed to retrieve data from the database. Details: Database Vendor Code: 156. Error in the File RCR10010 {tempfile location}".
    Here's a list of things which I have tried, all to no avail:
    - Checked user permissions to ensure that they have proper authorizations
    - Re-set the datasource connection and re-imported the report to SAP.
    - Exported the report and reviewed the datasource connection and re-imported to SAP.
    - Tried to run the report on multiple machines to ensure that it's not machine specific
    - Tried to run the report using different users to ensure it's not user specific.
    - Tested other reports built from stored procedures on client machines to ensure that they work.
    Any assistance in this would be GREATLY appreciated.
    Thank you

    After further testing, we found that the report could be run within SAP on any workstation which had the CR designer installed on it.
    As it turns out, the procedure I wrote uses temp tables.  The runtimes built into the SAP client install do not support creating temp tables when executing the report from within SAP, which is why the report could not retrieve data.
    To work around this, I installed the external runtimes matching the version of Crystal Reports, and now the report can be run within SAP from any workstation that has the external runtimes (and not just the runtimes bundled with the SAP client).
    I hope this makes sense.

  • Sub reports In Crystal reports XML as a data Source

    Hi,
    I am facing a problem related to sub-report functionality in Crystal Reports with XML as a data source. We are using Crystal Reports XI for this.
    I have been generating PDFs with Crystal Reports from Ruby on Rails, using the Java API to produce the PDF by passing the XML, XSD, and RPT files as parameters. I succeeded in this case,
    and the way we communicate with the Java API from my Ruby on Rails application is described at the following link:
    http://rorataspire.wordpress.com/2008/06/13/how-to-integrate-crystal-reports-with-ruby-on-rails/
    My current problem is as follows.
    The problem is with sub-report generation from Crystal Reports.  We have one main report file (RPT) that uses sub-reports, and those sub-reports need to parse XMLs and XSDs dynamically based on the sub-report encountered in the main (RPT) file.  However, I have not succeeded in sending more than one XSD and XML dynamically, per sub-report, to the Java API from the Ruby on Rails application.  As a result, a PDF with the dynamic data for the sub-reports is not produced.
    Your help is appreciated. Thanks in advance.

    Is this a coding question or a data source format question?
    Sincerely,
    Ted Ueda

  • Crystal Reports 2008 and Excel data source

    I want to use a large Excel sheet (139,569 rows) as a data source for a report. Crystal Reports 2008 only seems to accept Excel sheets as a data source if they are stored in Excel 97-2003 format. But then there is a limit of 65,536 rows, and I lose more than half of the data I need to process. Does anyone have an idea how I can solve this problem?

    Hello,
    There is no workaround in CR when connecting to .xls files; CR can only use .xls files, which have the 65K row limit. Our next version does support .xlsx files, though. CR for VS 2010 is available, but with no designer outside of the IDE. CR 2011 is still in beta and no release date has been announced yet.
    For now, all you can do is limit your sheets to 65K rows or export your XLS files to Access; it must also be an MDB file type, not the new extended type. Once the data is in an Access database, you can use it as your data source.
    You can use a temp database so it gets destroyed each time, or simply delete all records. It's up to you how to manage it.
    Thank you
    Don
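    If you end up scripting Don's XLS-to-Access suggestion rather than doing it by hand, something along these lines works: read the whole .xlsx through the ACE OLE DB provider (which has no 65K limit) and push the rows into an existing MDB table. This is only a sketch; the file paths, sheet name, table and column names and types are placeholders, and the ACE provider must be installed.

        // Sketch: copy rows from a large .xlsx into an Access .mdb so CR 2008 can
        // report off the MDB. All names/paths/types below are placeholders.
        using System.Data;
        using System.Data.OleDb;

        class XlsxToMdb
        {
            static void Main()
            {
                string xlsxConn = "Provider=Microsoft.ACE.OLEDB.12.0;" +
                    @"Data Source=C:\data\big.xlsx;" +
                    "Extended Properties=\"Excel 12.0 Xml;HDR=YES\"";
                string mdbConn = "Provider=Microsoft.Jet.OLEDB.4.0;" +
                    @"Data Source=C:\data\report_data.mdb";

                // 1. Pull the whole sheet into memory (no 65,536-row limit here).
                DataTable rows = new DataTable();
                using (OleDbConnection src = new OleDbConnection(xlsxConn))
                using (OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", src))
                {
                    da.Fill(rows);
                }

                // 2. Insert into an existing MDB table with matching columns.
                using (OleDbConnection dest = new OleDbConnection(mdbConn))
                {
                    dest.Open();
                    using (OleDbCommand cmd = new OleDbCommand(
                        "INSERT INTO ReportData (Col1, Col2) VALUES (?, ?)", dest))
                    {
                        cmd.Parameters.Add("p1", OleDbType.VarWChar);
                        cmd.Parameters.Add("p2", OleDbType.VarWChar);
                        foreach (DataRow row in rows.Rows)
                        {
                            cmd.Parameters["p1"].Value = row["Col1"];
                            cmd.Parameters["p2"].Value = row["Col2"];
                            cmd.ExecuteNonQuery();
                        }
                    }
                }
            }
        }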

  • Error loading report when data source changed

    We are migrating our Crystal Reports from using XML files generated from our SAP R3 system to using a direct connection to SAP R3 (using the SAP Integration Kit).
    The code is straightforward:
          crReport = new CrystalDecisions.CrystalReports.Engine.ReportDocument();
          crReport.Load(reportPath);
    ...where the reportPath is just a string with the path and filename (e.g. "C:\reports\Report1.rpt").
    This worked fine with the report when it had an XML data source. When we run the code with the new version of the report that connects to a data table on our SAP R3 we get the following error message:
    Error in File Report1 {40205567-F890-4B3C-A103-C9C8F0A1E665}.rpt: Failed to logon to the Crystal Report Object Repository.
    The report is not in an Object Repository, nor do we even have a Repository in place right now. The report can be run manually from Crystal Reports on the desktop without any kind of Repository.
    Both versions of the report were built in Crystal Reports 2008.
    Thanks,
    Byron

    I am not sure if connecting to SAP is any different than connecting to SQL, but we develop one report and then deploy to N number of client who all have their own SQL servers and server names.
    Here is the code I use to alter the database connection at runtime:
      connection.DatabaseName = _databaseName;
      connection.ServerName = _serverName;
      if (_integratedSecurity)
      {
          connection.IntegratedSecurity = _integratedSecurity;
      }
      else
      {
          connection.UserID = _userId;
          connection.Password = _password;
      }
      connection.Type = ConnectionInfoType.SQL;

      // First we assign the connection to all tables in the main report.
      foreach (CrystalDecisions.CrystalReports.Engine.Table table in _reportDocument.Database.Tables)
          AssignTableConnection(table, connection);

      // In each section we need to loop through all the reporting objects and
      // fix up the tables of any subreports as well.
      foreach (CrystalDecisions.CrystalReports.Engine.Section section in _reportDocument.ReportDefinition.Sections)
      {
          foreach (CrystalDecisions.CrystalReports.Engine.ReportObject reportObject in section.ReportObjects)
          {
              if (reportObject.Kind == ReportObjectKind.SubreportObject)
              {
                  SubreportObject subReport = (SubreportObject)reportObject;
                  ReportDocument subDocument = subReport.OpenSubreport(subReport.SubreportName);
                  foreach (CrystalDecisions.CrystalReports.Engine.Table table in subDocument.Database.Tables)
                      AssignTableConnection(table, connection);
              }
          }
      }
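    The AssignTableConnection helper isn't shown in the post; a typical implementation (a sketch using the standard CrystalDecisions.Shared API) just pushes the ConnectionInfo into each table's logon info:

      private void AssignTableConnection(
          CrystalDecisions.CrystalReports.Engine.Table table,
          CrystalDecisions.Shared.ConnectionInfo connection)
      {
          // Copy the table's current logon info, swap in the new connection,
          // and apply it back to the table.
          CrystalDecisions.Shared.TableLogOnInfo logOnInfo = table.LogOnInfo;
          logOnInfo.ConnectionInfo = connection;
          table.ApplyLogOnInfo(logOnInfo);
      }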

  • Nested ItemsControls' binding breaks when underlying data source changes, and the ItemsControls are collapsed

    Windows Phone WinRT App, targeted to Windows Phone 8.1.
    I have a nested data structure: a class (SampleDataSource) has an ObservableCollection (Groups) that contains SampleDataGroup items, each of which has an ObservableCollection (Items) that contains SampleDataItem items.
    SampleDataSource is the DataContext of the root item, a Pivot control's ItemsSource is set to Groups, and in the Pivot's DataTemplate there is an ItemsControl with its ItemsSource set to Items.
    The above scenario is a slightly modified version of the pivot app template that is generated by Visual Studio.
    So far it works properly.
    Then I added a ToggleSwitch to the page that toggles the Visibility of the pivot.
    Still works.
    Then I added some more code to clear the Groups ObservableCollection when the pivot is collapsed, and to reload the data before its visibility is set back to Visible.
    The result is that the top-level ItemsControl (the pivot) gets displayed properly, but the nested ItemsControl is empty. If I scroll to another pivot page, its ItemsControl is populated, and from that point everything works correctly; even the initially empty ItemsControl works just fine.
    The code that is relevant to the issue is:
    private async void ToggleVisibility_Toggled(object sender, RoutedEventArgs e)
    {
        if (pivot != null)
        {
            // Was collapsed, therefore we are going to show it, so load the data source
            if (pivot.Visibility == Windows.UI.Xaml.Visibility.Collapsed)
                await SampleDataSource.GetInstance().GetSampleDataAsync();
            pivot.Visibility = (sender as ToggleSwitch).IsOn ? Visibility.Visible : Visibility.Collapsed;
            // Just got collapsed, so remove the data from memory
            if (pivot.Visibility == Windows.UI.Xaml.Visibility.Collapsed)
                SampleDataSource.GetInstance().Groups.Clear();
        }
    }
    The project (as mentioned above, it's a slightly modified version of the Pivot App template) that reproduces this issue is available HERE.
    Does anyone have any idea what I should do to make the ItemsControl work properly after making the pivot visible again?

    I can't help you with this myself as I've not done it but there are quite a few tutorials kicking around the net.
    Simple ODBC Connections in Adobe LiveCycle:
    http://www.youtube.com/watch?v=C56_Cz-aE0c
    Connecting a form to a database:
    http://forms.stefcameron.com/2006/09/18/connecting-a-form-to-a-database/
    Database connected forms:
    http://acrobatusers.com/tutorials/database-connected-forms

  • SMSY Data Source changes after performing Managed System Config

    Hello everyone,
    I am working with Solman EHP1 SPS26.  I have set up my central SLD to push system data to my Solman local SLD, which is then retrieved via Landscape Fetch into SMSY.  Initially, everything seemed to be working and all data sources reported SLD.
    However, once I performed the Managed System Configuration wizard for a particular technical system, that system's data source changed to TMS/RFC under SMSY.  It's not even consistent: for example, the server says the source is RFC, the product system says TMS/RFC, and some of the product instances still say SLD.
    Is it normal for this to happen after connecting a managed system?  Will these systems still be updated via the SLD, considering SMSY_SETUP is configured to use the SLD?
    Anyone's help would be greatly appreciated.
    Alex

    Hello,
    If you notice, this is not an editable field in SMSY.
    It is reporting the last data source used.
    Therefore, when you made a change via managed system setup, that became the last data source used.
    So yes, it is normal to see this change.
    If you have configured the data source to be the SLD, it should continue to update these systems.
    Where you need to watch out is making manual changes in SMSY: the SLD can view the changed entry as a different system from the SID it knows, and because the SLD will not overwrite manual changes, it will create new systems instead, appending a suffix to the known SID to distinguish them.
    But the field in SMSY reflects the last data source used.
    Hope this helps some.
    Regards,
    Paul

  • Error in Data source 'change'...

    Hi,
    I want to unhide the field DMBE2 in the 0FI_AR_3 data source extract structure. When I go to RSA5 > FI > 0FI_AR_3 > change data source, uncheck the 'hide' box and try to save, the system displays the message "the OLTP source still has errors". What does this mean, and how can I solve it?
    Please help. Points for answers.

    Hi,
    I faced the same error. In my append structure I had not added the currency field, and that was the cause. Once I added the currency field (ZZWAERKS) to the structure, the error was resolved. So please check whether all the required fields have been added.
    Hope this helps.
    Thanks
    K

  • Data source change while migrating report to another server

    Hi all.
    I have actually two questions:
    1) How exactly does the data source connection work in CR? If I create a package in the database and then connect my report to it, what happens at the time the report is created? Does CR use the current package version stored in the database, or the version of the package as it was when it was connected to the report (so that CR effectively loaded the code into the report and is resistant to any later changes in the database)? If the second option is right (I think it is, unfortunately), can that behaviour be changed?
    2) And now: if I create a report and connect it to a data source (a database package) on the development server, and I then want to move the completed, working report to the test server (and later to other environments), is there any way to automatically switch the data source to the test server database? Everything is the same on the DEV and TEST servers and their databases, except the data, of course.
    The prospect of having to open every report after migrating it to another environment and manually change the data source is very, very unappealing.
    Thanks for answers!

    Hi Filip,
    I don't think I understand the first question.
    For the second one though, if you want your reports to work seamlessly across environments, you should have your reports connect to the database using an ODBC System DSN.
    Each environment (Dev, Test and Prod) should have the same DSN name, with each DSN pointing to its respective database server.
    -Abhilash
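    As a sketch of how little the runtime side then needs to know: with the same System DSN name defined on every server, any code that sets the logon programmatically only ever references that one name. The DSN name, credentials and paths below are placeholders.

        // Sketch: for an ODBC connection, ServerName is simply the DSN name, so the
        // same call works on DEV, TEST and PROD as long as each defines "CR_REPORTS".
        using CrystalDecisions.CrystalReports.Engine;

        class DsnLogonExample
        {
            static void Main()
            {
                ReportDocument report = new ReportDocument();
                report.Load(@"C:\reports\Report1.rpt");
                report.SetDatabaseLogon("report_user", "report_password", "CR_REPORTS", "");
                report.ExportToDisk(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat,
                                    @"C:\reports\Report1.pdf");
            }
        }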

  • Crystal Reports 2008 - Excel 2007 Data Source

    I have been using Crystal Reports 10 and have downloaded 2008, but I can't see a way to use an Excel 2007 spreadsheet (.xlsx) as a data source. The reason I need 2007 is the increased number of columns.
    Can anyone confirm whether it can be done and, if so, how?
    cheers
    Malcolm

    Hello,
    Sorry, it's not supported. The next version of CR will support it.
    CR 2008 will generate sheets for every 65,000 rows of data.
    If you need that many columns, you may want to look at some other export format or at refining your data.
    Thank you
    Don

  • Spring fails to lookup Data Source using JNDI

    Hi all, I am now trying to configure JNDI in Tomcat 5.5. My integration is Spring + Hibernate + JPA.
    The error I am getting is:
    ERROR - ContextLoader.initWebApplicationContext(219) | Context initialization failed
    org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in ServletContext resource [WEB-INF/xaconfig/daoJPAConfig.xml]: Invocation of init method failed; nested exception is javax.naming.NamingException: Cannot create resource instance
         at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1412)
         at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
         at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
         at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:291)
         at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
         at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:288)
         at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:190)
         at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:546)
         at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:871)
         at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:423)
         at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:272)
         Caused by: javax.naming.NamingException: Cannot create resource instance
    my tomcat 5.5/context.xml is
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- The contents of this file will be loaded for each web application --><Context>
    <!-- Default set of monitored resources -->
    <WatchedResource>WEB-INF/web.xml</WatchedResource>
         <Transaction factory="com.atomikos.icatch.jta.UserTransactionFactory"/>
    <!-- Uncomment this to disable session persistence across Tomcat restarts -->
    <!--
    <Manager pathname="" />
    -->
    <ResourceLink global="jdbc/Paymentsdb" name="jdbc/Paymentsdb" type="com.atomikos.jdbc.AtomikosDataSourceBean"/>
    </Context>
    and my tomcat5.5/server.xml is
    <?xml version="1.0" encoding="UTF-8"?>
    <Server port="8005" shutdown="SHUTDOWN">
    <Listener className="org.apache.catalina.mbeans.ServerLifecycleListener"/>
    <Listener className="org.apache.catalina.mbeans.GlobalResourcesLifecycleListener"/>
    <Listener className="org.apache.catalina.storeconfig.StoreConfigLifecycleListener"/>
    <Listener className="com.atomikos.tomcat.AtomikosLifecycleListener"/>
    <GlobalNamingResources>
    <Environment name="simpleValue" type="java.lang.Integer" value="30"/>
    <Resource auth="Container" description="User database that can be updated and saved" factory="org.apache.catalina.users.MemoryUserDatabaseFactory" name="UserDatabase" pathname="conf/tomcat-users.xml" type="org.apache.catalina.UserDatabase"/>
    </GlobalNamingResources>
    <Service name="Catalina">
    <Connector acceptCount="100" connectionTimeout="20000" disableUploadTimeout="true" enableLookups="false" maxHttpHeaderSize="8192" maxSpareThreads="75" maxThreads="150" minSpareThreads="25" port="8080" redirectPort="8443"/>
    <Connector enableLookups="false" port="8009" protocol="AJP/1.3" redirectPort="8443"/>
    <Engine defaultHost="localhost" name="Catalina">
         <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/>
         <Host appBase="webapps" autoDeploy="true" name="localhost" unpackWARs="true" xmlNamespaceAware="false" xmlValidation="false">
    <Context docBase="com.evolvus.payments.web" path="/com.evolvus.payments.web" reloadable="true" source="org.eclipse.jst.j2ee.server:com.evolvus.payments.web">
    <Resource name="jdbc/Paymentsdb" auth="Container"
    type="com.atomikos.jdbc.AtomikosDataSourceBean"
    driverClassName="com.mysql.jdbc.jdbc2.optional.MysqlXADataSource"
    url="jdbc\:mysql\://devserver\:3306/payhub"
    username="root"
    password="root"
    maxActive="20"
    maxIdle="10"
    maxWait="20000"
    />
    </Context></Host>
    </Engine>
    </Service>
    </Server>
    and my web.xml is
    <?xml version="1.0" encoding="UTF-8"?>
    <web-app version="2.5"
         xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">
         <display-name>Payments</display-name>
         <context-param>
              <param-name>contextConfigLocation</param-name>
              <param-value>/WEB-INF/xaconfig/daoJPAConfig.xml</param-value>
         </context-param>
         <context-param>
         <param-name>javax.faces.CONFIG_FILES</param-name>
         <param-value>/WEB-INF/faces-config.xml</param-value>
         </context-param>
         <filter>
              <filter-name>PrimeFaces FileUpload Filter</filter-name>
              <filter-class>org.primefaces.webapp.filter.FileUploadFilter</filter-class>
         </filter>
         <filter-mapping>
              <filter-name>PrimeFaces FileUpload Filter</filter-name>
              <servlet-name>Faces Servlet</servlet-name>
         </filter-mapping>
         <context-param>
    <param-name>javax.faces.DEFAULT_SUFFIX</param-name>
    <param-value>.xhtml</param-value>
    </context-param>
         <servlet>
              <servlet-name>Faces Servlet</servlet-name>
              <servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
              <load-on-startup>1</load-on-startup>
         </servlet>
         <servlet-mapping>
              <servlet-name>Faces Servlet</servlet-name>
              <url-pattern>*.jsf</url-pattern>
         </servlet-mapping>
    <resource-ref>
    <description>PaymentsDatabase</description>
    <res-ref-name>jdbc/Paymentsdb</res-ref-name>
    <res-type>com.atomikos.jdbc.AtomikosDataSourceBean</res-type>
    <res-auth>Container</res-auth>
    </resource-ref>
    </web-app>
    and in my daoJPAConfig.xml I configured the data source like this:
    <beans:bean id="dataSource"
              class="org.springframework.jndi.JndiObjectFactoryBean">
              <beans:property name="jndiName">
                   <beans:value>java:comp/env/jdbc/Paymentsdb</beans:value>
              </beans:property>
              </beans:bean>
    What jar files do I have to add to my Tomcat 5.5 installation and to the web application?
    Please help me, I am struggling a lot; where did I make a mistake?
    I am using JTA as the transaction type: <persistence-unit name="payhub" transaction-type="JTA">

    import java.sql.*;
    import javax.sql.*;
    import javax.naming.*;
    import java.util.*;

    public class MyDataSourceLookupClient {
        public final static String JNDI_FACTORY = "weblogic.jndi.WLInitialContextFactory";
        private static String serverUrl = "t3://localhost:7001";

        public static void main(String ar[]) throws Exception {
            InitialContext ic = null;
            try {
                Hashtable env = new Hashtable();
                env.put(Context.INITIAL_CONTEXT_FACTORY, JNDI_FACTORY);
                env.put(Context.PROVIDER_URL, serverUrl);
                ic = new InitialContext(env);
            } catch (Exception e) {
                e.printStackTrace();
            }
            try {
                // Cast the looked-up object to DataSource before using it.
                DataSource ds = (DataSource) ic.lookup("YourDataSourceJNDIName");
                Connection con = ds.getConnection();
                System.out.println("\n\t Got Connection: " + con);
                con.close();
            } catch (Exception e) {
                System.out.println("\n\n\t jack Exception => " + e);
                e.printStackTrace();
            }
        }
    }
    Thanks
    Jay SenSharma
    http://jaysensharma.wordpress.com (WebLogic Wonders Are Here)

  • External table fails to view data after changing file name

    Hello,
    I have previously imported 122 external tables and was able to view their data without any problems.
    The names of these files have since changed. The names have a date in them.
    So I dropped the external tables from that database and recreated them with the new file names.
    I then went into OWB and dropped them from OWB.
    I reimported all of the tables without errors.
    I go to view the data and get an error that the file does not exist.
    The problem is that the file it says does not exist is the file I had been using last week, not the new file.
    However, when I go and right click the external table and do a configure, the correct name is showing up under datafiles.
    It's almost as though when I dropped the external tables, not everything was cleaned up. For whatever reason, OWB is still using the old names.
    Dan

    Hi there... Which OWB version are you using?
    If you've created these external tables using some other tool (i.e. SQL*Plus, SQL Developer, Toad, SQLNavigator, etc.), are you able to query these objects from one of those tools?
    If you've created these ETs using OWB, are you creating them as external tables or are you mapping them as source flat files?
    If you are using OWB to create them, there are a few configurations/properties you should change to have them running well.
    Please, post some more info.
    Thanks
