Difference between DataSource & JDBC Data Source Factory

Hi,
Can anyone tell me what the difference between a 'DataSource' and a 'JDBC Data Source
Factory' is? I see both of these in the WebLogic admin console. I have reviewed
the documentation, but it is not clear to me what the difference between the two
is, or in what situations each one is appropriate.
Thanks,
Raju

Hi Sree,
I have read literally ALL of the online documentation related to JDBC Data Source Factories,
including the page you provided below, but I still do not understand what they
are for. I have also searched Google, but to no avail.
I will try to be more specific about where my confusion lies.
1) Why would anyone ever choose a JDBC Data Source Factory instead of a Tx Data
Source or a plain vanilla (non-XA) Data Source? EJBs connect just fine with the
latter two (see the lookup sketch below).
2) What should go inside the Properties box?
a) http://e-docs.bea.com/wls/docs70/ejb/EJB_environment.html#resourcefact states
that connection properties should go in here.
b) The document then goes on to describe a syntax for binding a Connection Pool
to a JNDI name, all in the Properties box. I have never been able to successfully
create a data source factory with a JNDI name. I get no errors, but it appears
nowhere in the JNDI tree.
c) If this mysterious syntax associates a resource factory with an existing Connection
Pool, then why go through all the fuss of redefining the Driver Class Name, URL,
etc., when those are already defined on the Connection Pool?
When you set up either a Data Source or a Tx Data Source, WebLogic does not require
that information, since it was all set up in the pool. Why is it needed here?
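For reference, this is the plain lookup I mean in (1): a minimal sketch, assuming a pool-backed Data Source is already bound at the hypothetical JNDI name "jdbc/myPool".

import javax.naming.InitialContext;
import javax.sql.DataSource;
import java.sql.Connection;

public class PlainDataSourceLookup {
    public static void main(String[] args) throws Exception {
        // Inside the server (or with a WebLogic InitialContext configured for a
        // remote client), this lookup returns the pool-backed DataSource.
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup("jdbc/myPool");   // hypothetical JNDI name
        Connection con = ds.getConnection();
        try {
            System.out.println("Got pooled connection: " + con);
        } finally {
            con.close();   // returns the connection to the pool
        }
    }
}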
Thanks in advance for your help.
"Sree Bodapati" <[email protected]> wrote:
Hi Raju,
Please see,
http://e-docs.bea.com/wls/docs70/ConsoleHelp/domain_jdbcdatasourcefactorytable.html
sree

Similar Messages

  • Difference between 2 AP Data Sources

    Hi guys
    We are on ECC 6 and BI 7. We have developed a data model for 6 Procurement reports with the 0FI_AP_4 DataSource. For one of the FI reports (Vendor Spend) we need to know whether we can use the same 0FI_AP_4 DataSource, or whether we need to use 0FI_AP_6, which is already delivered for the Vendor Spend report. If we want to use the AP_6 DataSource we need to enhance it with PRCTR, since we need that field for our reporting requirement, but we did not find the corresponding table for it. When we enhanced it with the BSEG table, we were unable to populate PRCTR with the keys in both the extractor and the DataSource.
    In our AP_4 DataSource we already have PRCTR available (we enhanced it with BSEG), and all the fields needed for the Vendor Spend report are available. Our concern is whether AP_6 and AP_4 will produce the same data. Is there anything we would miss by using AP_4 for Vendor Spend? And if we can simply use the AP_4 DataSource for the Vendor Spend report, what is the use of AP_6? Why did SAP deliver a complete flow from DataSource to cube for the standard Vendor Spend report?
    Any guidelines please...
    Thanks in advance

    You are running into a problem casting your character variables into dates. As you can see here, it is not such a good idea to let the database do this for you...
    WITH temp AS
      (SELECT SYSDATE start_date FROM dual)
    SELECT *
    FROM   temp
    WHERE START_DATE >= &BEGIN
    AND start_date <= &END + 1;
    Enter value for begin: '01-dec-07'
    old   7: WHERE START_DATE >= &BEGIN
    new   7: WHERE START_DATE >= '01-dec-07'
    Enter value for end: '01-jan-08'
    old   8: AND start_date <= &END + 1
    new   8: AND start_date <= '01-jan-08' + 1
    AND start_date <= '01-jan-08' + 1
    ERROR at line 8:
    ORA-00932: inconsistent datatypes: expected DATE got NUMBER
    But if you wrap the character strings with to_date()...
    WITH temp AS
      (SELECT SYSDATE start_date FROM dual)
    SELECT *
    FROM   temp
    WHERE START_DATE >= to_date(&BEGIN,'DD-MON-YY')
    AND start_date <= to_date(&END,'DD-MON-YY') + 1;
    Enter value for begin: '01-DEC-07'
    old   7: WHERE START_DATE >= to_date(&BEGIN,'DD-MON-YY')
    new   7: WHERE START_DATE >= to_date('01-DEC-07','DD-MON-YY')
    Enter value for end: '01-JAN-08'
    old   8: AND start_date <= to_date(&END,'DD-MON-YY') + 1
    new   8: AND start_date <= to_date('01-JAN-08','DD-MON-YY') + 1
    START_DAT
    ---------
    18-DEC-07
    And to be absolutely correct, you should actually use a format mask with a full YYYY and supply a full year with your variables.
    Greg
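    On the JDBC side the same pitfall can be avoided by binding a real java.sql.Date parameter instead of splicing strings into the SQL; a minimal sketch (the connection details, table, and column names are placeholders):

    import java.sql.Connection;
    import java.sql.Date;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class DateBindSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:XE", "scott", "tiger");
            // Binding real DATE values means no character-to-date conversion in SQL.
            PreparedStatement ps = con.prepareStatement(
                    "SELECT * FROM temp WHERE start_date >= ? AND start_date <= ? + 1");
            ps.setDate(1, Date.valueOf("2007-12-01"));   // java.sql.Date.valueOf expects yyyy-mm-dd
            ps.setDate(2, Date.valueOf("2008-01-01"));
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getDate("start_date"));
            }
            rs.close();
            ps.close();
            con.close();
        }
    }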

  • Using JDBC Data Sources with ADFBC, NoInitialContextException

    Using JDBC Data Sources with ADF Business Components, NoInitialContextException
    I followed the instructions in the link below to create an ADF Swing application using a data source. I am using JDeveloper version 10.1.3.
    http://www.oracle.com/technology/products/jdev/howtos/10g/usingdatasources/using_datasources.html
    The ADF generated code looks like this:
    JUMetaObjectManager.setErrorHandler(new JUErrorHandlerDlg());
    JUMetaObjectManager mgr = JUMetaObjectManager.getJUMom();
    mgr.setJClientDefFactory(null);
    BindingContext ctx = new BindingContext();
    ctx.put(DataControlFactory.APP_PARAM_ENV_INFO, new JUEnvInfoProvider());
    ctx.setLocaleContext(new DefLocaleContext(null));
    HashMap map = new HashMap(4);
    map.put(DataControlFactory.APP_PARAMS_BINDING_CONTEXT, ctx);
    mgr.loadCpx("datasource.view.DataBindings.cpx" , map);
    final FormMain frame = new FormMain();
    frame.setBindingContext(ctx);
    I got this error when executing the last line: frame.setBindingContext(ctx);
    (oracle.jbo.common.ampool.ApplicationPoolException) JBO-30003: The application pool (datasource.datamodel.AppModuleDS) failed to checkout an application module due to the following exception:
    ----- LEVEL 1: DETAIL 0 -----
    (oracle.jbo.JboException) JBO-29000: Unexpected exception caught: oracle.jbo.DMLException, msg=JBO-27200: JNDI failure. Unable to lookup Data Source at context jdbc/xe_hrDS
    ----- LEVEL 2: DETAIL 0 -----
    (oracle.jbo.DMLException) JBO-27200: JNDI failure. Unable to lookup Data Source at context jdbc/xe_hrDS
    ----- LEVEL 3: DETAIL 0 -----
    (javax.naming.NoInitialContextException) Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    If I configure the application module connection type as JDBC URL, everything works.
    If the connection type is JDBC Datasource, I got the above error.
    Can someone show me how to adjust the generated code by ADF to use datasource?

    ADF BC has a bug: with a Data Source configured in the Application Module, the application module does not connect. Use a JDBC URL connection instead.
    Also refer
    ADF BC: JDBC URL vs JDBC DataSource
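    The NoInitialContextException at the bottom of that stack just means the client JVM has no JNDI provider configured, so the jdbc/xe_hrDS lookup has nowhere to go. A rough sketch of what a standalone (non-container) client would have to supply before any data-source lookup can succeed (the factory class and provider URL below are placeholders for whatever your container documents):

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class StandaloneJndiLookup {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<String, String>();
            // Placeholders: use the initial context factory and URL documented
            // for your application server's remote JNDI access.
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.example.SomeInitialContextFactory");
            env.put(Context.PROVIDER_URL, "ormi://localhost:23791");
            InitialContext ctx = new InitialContext(env);
            DataSource ds = (DataSource) ctx.lookup("jdbc/xe_hrDS");
            System.out.println("Looked up: " + ds);
        }
    }

    When the Application Module is switched to a JDBC URL, ADF BC opens the connection directly and never goes through JNDI, which is why that option works in the standalone application.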

  • Standalone application can't get JDBC data source from Weblogic 10.3

    We have the following configuration :
    A Weblogic server 10.3 (default installation).
    The server contains a JMS queue (jndi name: "DMQ") and JDBC data sources (jndi names: Oracle thin XA - "dataSource", MS SQL - "dataSource1")
    We built wlfullclient5.jar for Java 1.5 according to the docs (http://edocs.bea.com/wls/docs103/client/jarbuilder.html#wp1078122).
    And now we use a test standalone application with the wlfullclient5.jar :
    public static void main (String[] args) throws NamingException {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        env.put(Context.PROVIDER_URL, "http://serv1:7001");
        env.put(Context.SECURITY_CREDENTIALS, "weblogic");
        env.put(Context.SECURITY_PRINCIPAL, "weblogic");
        InitialContext ic = new InitialContext(env);

        System.out.println("Get DMQ");
        ic.lookup("DMQ");
        System.out.println("Get dataSource");
        ic.lookup("dataSource");
        System.out.println("Get dataSource1");
        ic.lookup("dataSource1");
        System.out.println("Done");
    }
    Here is the output when connected to WLS 10.3:
    Get DMQ
    Get dataSource
    Exception in thread "Main Thread" java.lang.AssertionError: Failed to generate class for weblogic.jdbc.common.internal.RmiDataSource_1030_WLStub
        at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:790)
        at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:779)
        at weblogic.rmi.extensions.StubFactory.getStub(StubFactory.java:74)
        at weblogic.rmi.internal.StubInfo.resolveObject(StubInfo.java:213)
        at weblogic.rmi.internal.StubInfo.readResolve(StubInfo.java:207)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1033)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1728)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1305)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:348)
        at weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:197)
        at weblogic.rjvm.MsgAbbrevInputStream.readObject(MsgAbbrevInputStream.java:564)
        at weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:193)
        at weblogic.rmi.internal.ObjectIO.readObject(ObjectIO.java:62)
        at weblogic.rjvm.ResponseImpl.unmarshalReturn(ResponseImpl.java:240)
        at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:348)
        at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259)
        at weblogic.jndi.internal.ServerNamingNode_1030_WLStub.lookup(Unknown Source)
        at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:392)
        at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:380)
        at javax.naming.InitialContext.lookup(InitialContext.java:351)
        at test.main(test.java:23)
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
        at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:788)
        at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:779)
        at weblogic.rmi.extensions.StubFactory.getStub(StubFactory.java:74)
        at weblogic.rmi.internal.StubInfo.resolveObject(StubInfo.java:213)
        at weblogic.rmi.internal.StubInfo.readResolve(StubInfo.java:207)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1033)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1728)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1305)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:348)
        at weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:197)
        at weblogic.rjvm.MsgAbbrevInputStream.readObject(MsgAbbrevInputStream.java:564)
        at weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:193)
        at weblogic.rmi.internal.ObjectIO.readObject(ObjectIO.java:62)
        at weblogic.rjvm.ResponseImpl.unmarshalReturn(ResponseImpl.java:240)
        at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:348)
        at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259)
        at weblogic.jndi.internal.ServerNamingNode_1030_WLStub.lookup(Unknown Source)
        at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:392)
        at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:381)
        at javax.naming.InitialContext.lookup(InitialContext.java:351)
        at test.main(test.java:26)
    Caused by: java.lang.ArrayIndexOutOfBoundsException: 6
        at weblogic.jdbc.common.internal.RmiDataSource_1030_WLStub.ensureInitialized(Unknown Source)
        at weblogic.jdbc.common.internal.RmiDataSource_1030_WLStub.<init>(Unknown Source)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
        at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:788)
        at weblogic.rmi.internal.StubGenerator.generateStub(StubGenerator.java:779)
        at weblogic.rmi.extensions.StubFactory.getStub(StubFactory.java:74)
        at weblogic.rmi.internal.StubInfo.resolveObject(StubInfo.java:213)
        at weblogic.rmi.internal.StubInfo.readResolve(StubInfo.java:207)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1033)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1728)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1305)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:348)
        at weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:197)
        at weblogic.rjvm.MsgAbbrevInputStream.readObject(MsgAbbrevInputStream.java:564)
        at weblogic.utils.io.ChunkedObjectInputStream.readObject(ChunkedObjectInputStream.java:193)
        at weblogic.rmi.internal.ObjectIO.readObject(ObjectIO.java:62)
        at weblogic.rjvm.ResponseImpl.unmarshalReturn(ResponseImpl.java:240)
        at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:348)
        at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259)
        at weblogic.jndi.internal.ServerNamingNode_1030_WLStub.lookup(Unknown Source)
        at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:392)
        at weblogic.jndi.internal.WLContextImpl.lookup(WLContextImpl.java:380)
        at javax.naming.InitialContext.lookup(InitialContext.java:351)
        at test.main(test.java:23)
    But at the same time the output when connected to WLS 10.0 is :
    Get DMQ
    Get dataSource
    Get dataSource1
    Done
    (so that the test passed)
    Could you give me a piece of advice ?
    Thanks,
    Sergey

    I hit the same problem as you. It is caused by the JDK version: the JDK used by WebLogic 10.3 is Java 6, while your client program's JDK is Java 5. Change both of them to Java 6 and the issue will be fixed.
    Don't forget to rebuild wlfullclient.jar, which should also be built with Java 6.
    Good luck!
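    A quick way to confirm which JVM the standalone client is actually running under before rebuilding anything (a trivial, WebLogic-independent sketch):

    public class ClientJvmCheck {
        public static void main(String[] args) {
            // Both sides (server and standalone client) should report a 1.6.x version here.
            System.out.println("java.version = " + System.getProperty("java.version"));
            System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
        }
    }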

  • Using WLST to set up credential mapping on JDBC Data Source

    I want to write a WLST script to set up a large initial set of credential mappings on a JDBC data source.
    I can create the data source and do all the basic setup, but I can't figure out how to do the credential mappings.
    I tried the "record" option from the Admin console, but it created a script that says that credential mappings are not recorded.
    This is the mapping function that you get to from the "Security" tab on the DataSource page, and then the "Credential Mappings" tab. This allows you to map a "WLS user" to a "remote user" - which in my case is an Oracle OLS user.
    I am using Weblogic 10.3
    Can anyone point me to a set of commands which can perform this function?
    Thanks!

    I wonder if there isn't a publicly visible MBean for the credential mapping. That would explain why the recorded script doesn't have the MBean method invocations.

  • WLS 10.3.4: How to use OS authentication for JDBC Data Source

    Hello all,
    As a preface, I've tried searching the forum/Google for "OS authentication" and reading the WLS JDBC doc to no avail - if it's documented somewhere, a RTFM link would be much appreciated.
    I'm trying to set up a JDBC data source on WLS that leverages the OS Authentication capability of the Oracle database. If it would help, I can go into the reasoning behind why I want to do this, but basically it's to simplify the configuration/deployment of a COTS application. What I have in the database is an "identified externally" user that corresponds to the OS user running the WebLogic Server. Normally, in tools such as SQL*Plus, I would use "/@db" as the username/password (in other words, no username and no password specified), and I would be logged in as the "identified externally" user. I want to configure the same thing for a WebLogic Data Source, but if I leave the username/password blank, testing the connection in the WLS console gives me "invalid username/password, login denied". I've also tried using "/" as the username, as was documented in a quite old WLS FAQ, but that gives me the same result.
    Is there some magic switch I need to flip?
    Thanks,
    John

    Hi John, there's no way to do that with connection pools (which is how WLS data sources get their
    connections), or with middleware in general. WebLogic would have no way of knowing which, if any, of the
    pooled connections was appropriate for the current 'user', which is not the application user but
    the OS identity of the person who started the WebLogic server. If you start up your
    WebLogic server and people start pointing their browsers at it, doing various stuff, the OS knows
    you started WebLogic, and maybe with the help of OCI, Oracle's JDBC might know it was you who
    started WebLogic's OS process; but what does the OS know about any user that may be running
    a browser or application elsewhere (even if on this same machine), when that browser or application
    connects to your WebLogic server process?
    HTH,
    Joe

  • JDBC Data Sources: Potential Issue with JDeveloper 10.1.3.4

    I think I found a bug or issue with the latest JDeveloper 10.1.3.4 release when using JDBC Data Sources on the Embedded OC4J container.
    To state the issue bluntly: if I use a JDBC Data Source in an ADF Faces application, I get the following error on the screen when I run my application with a simple page that uses a Form layout for database data:
    [http://cs.uwindsor.ca/~ruston7/jdbcError.jpg]
    Or if I use a simple drag and drop ADF Faces Table:
    javax.faces.el.PropertyNotFoundException: Error testing property '<<FIRST_FIELD_ON_THE_PAGE>>' in bean of type null
        at com.sun.faces.el.PropertyResolverImpl.isReadOnly(PropertyResolverImpl.java:274)
        at oracle.adfinternal.view.faces.model.FacesPropertyResolver.isReadOnly(FacesPropertyResolver.java:124)
        at com.sun.faces.el.impl.ArraySuffix.isReadOnly(ArraySuffix.java:236)
        at com.sun.faces.el.impl.ComplexValue.isReadOnly(ComplexValue.java:209)
        at com.sun.faces.el.ValueBindingImpl.isReadOnly(ValueBindingImpl.java:266)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.EditableValueRenderer.getReadOnly(EditableValueRenderer.java:211)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.FormElementRenderer.renderAsElement(FormElementRenderer.java:155)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.InputLabelAndMessageRenderer.getLabelFor(InputLabelAndMessageRenderer.java:53)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.LabelAndMessageRenderer$Label.getForId(LabelAndMessageRenderer.java:500)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.OutputLabelRenderer.encodeAll(OutputLabelRenderer.java:69)
        at oracle.adfinternal.view.faces.renderkit.core.CoreRenderer.delegateRenderer(CoreRenderer.java:281)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.LabelAndMessageRenderer.encodeAll(LabelAndMessageRenderer.java:123)
        at oracle.adfinternal.view.faces.renderkit.core.xhtml.InputLabelAndMessageRenderer.encodeAll(InputLabelAndMessageRenderer.java:94)
        at oracle.adfinternal.view.faces.renderkit.core.CoreRenderer.encodeEnd(CoreRenderer.java:169)
        at oracle.adf.view.faces.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:624)
        at oracle.adfinternal.view.faces.renderkit.core.CoreRenderer.encodeChild(CoreRenderer.java:246)
    When I change my Application Module connection to a JDBC URL, this all works perfectly. Also, everything works fine when I deploy to our Oracle AS 10.1.3 application servers.
    I also tried this on a different computer using a fresh install of JDeveloper just to make sure that the copy of JDeveloper that I downloaded didn't have a fluke in it.
    Thanks!

    M. Ruston,
    It must be something on your side. I just tried the same thing using the employees table from the HR sample schema (it has a date column). It works with both a JDBC URL and a data source.
    Just out of curiosity - if you look at the properties for your Business Components project at the business components section, what does it show for the SQL Flavor and Type Map?
    John

  • Getting JDBC data source from PersistenceUnit

    I have an application accessing a database, and am using JPA to read/write/update records in the database.
    Some of the database tables use stored procedures to perform complex computations across entire
    database tables, and I need to call these stored procedures when the user does certain operations,
    such as cloning an object and all its associated data. To call the stored procedures I use the same
    JDBC data source that is used to define the PersistenceUnit.
    Is there any way to get that DataSource from the PersistenceUnit?
    Since I do not know how to do this I am currently using @Resource DataSource annotations to get the JDBC
    datasource. That is ugly because I have the datasource specified in two different places in my code.
    Please help!
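    For what it's worth, this is the @Resource workaround spelled out: a minimal sketch assuming a session bean, with the JNDI name and procedure name made up.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import javax.annotation.Resource;
    import javax.ejb.Stateless;
    import javax.sql.DataSource;

    @Stateless
    public class CloneService {
        // The same physical data source the persistence unit points at,
        // declared a second time just to reach JDBC directly.
        @Resource(name = "jdbc/appDS")   // hypothetical JNDI name
        private DataSource dataSource;

        public void recompute(long objectId) throws Exception {
            Connection con = dataSource.getConnection();
            try {
                // Hypothetical stored procedure doing the table-wide computation.
                CallableStatement cs = con.prepareCall("{call recompute_totals(?)}");
                cs.setLong(1, objectId);
                cs.execute();
                cs.close();
            } finally {
                con.close();
            }
        }
    }

    The duplication complained about above is exactly this: the data-source name ends up both in the persistence unit definition and in the @Resource annotation.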


  • Copy the JDBC data sources to another weblogic domain?!

    We have a WebLogic domain that has around 20 JDBC data sources.
    Is there an easy way to copy the JDBC data sources to another WebLogic domain?

    You can write a WLST script that creates the data source(s) and targets them to whatever servers/clusters
    you like. As implemented, though, for security reasons certain aspects of the data sources themselves are
    encrypted so that they operate only in their originating environment; so, for instance, you cannot
    inadvertently email access to the company's DBMS to your new friend in Nigeria.

  • Difference between using new Date() and new Date(System.currentTimeMillis())

    Hi All,
    I have seen many open source APIs where dates are updated with the following code:
    new Date(System.currentTimeMillis())
    When I print new Date() it prints the same thing.
    What is the difference between using new Date() and new Date(System.currentTimeMillis())?
    Thanks,
    J.Kathir

    "when i print the new Date() it prints the same"
    "It does because of backward compatibility. This constructor exists in version 1.3.1 but not in newer versions."
    Really? Please point me to where you read that.
    API doc 1.5: new Date() (still there, not even deprecated)
    A quick look at the code for this constructor:
        /**
         * Allocates a <code>Date</code> object and initializes it so that
         * it represents the time at which it was allocated, measured to the
         * nearest millisecond.
         *
         * @see     java.lang.System#currentTimeMillis()
         */
        public Date() {
            this(System.currentTimeMillis());
        }
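    A small check along the same lines (just a sketch; the two values can differ by however many milliseconds pass between the two calls):

    import java.util.Date;

    public class DateEquivalence {
        public static void main(String[] args) {
            // The no-arg constructor simply delegates to System.currentTimeMillis().
            Date a = new Date();
            Date b = new Date(System.currentTimeMillis());
            System.out.println(a);
            System.out.println(b);
            System.out.println("difference in ms: " + (b.getTime() - a.getTime()));
        }
    }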

  • What is the difference between master data and transaction data

    Hi Gurus,
    This is Sudhakar.
    I want to know what the difference between transaction data and master data is; please give examples.
    Thanks,
    Sudhakar

    Hi,
    The most important data classes are master data, transaction data, organizational data and system
    data.
    Master data is data that is rarely modified. An example of master data is the data of an address file, for
    example the name, address and telephone number.
    Transaction data is data that is frequently modified. An example is the material stock of a warehouse,
    which can change after each purchase order.
    Organizational data is data that is defined during customizing when the system is installed and that is
    rarely modified thereafter. The country keys are an example.
    System data is data that the R/3 System itself needs. The program sources are an example.
    For more info, go through:
    [http://help.sap.com/saphelp_nw70/helpdata/]
    regards,
    NR

  • Disk Utility: Differences between "Zero Out Data" and "7-Pass Erase"?

    I'm wondering if anyone knows if there's a significant difference between the "Zero Out Data" erase option in Disk Utility (specifically Disk Utility 10.5.5), and the "7-Pass Erase" and "35-Pass Erase" options in same software.
    Here's why I'm asking: I have a co-worker with an iMac G5 20" 1.8GHz with 160GB internal hard drive. As a result of the power supply overheating a week ago due to dust, some hard drive problems resulted. I'm trying to assess whether these are 'soft' formatting problems that can be recovered from, or 'hard' problems requiring replacement of the hard drive and/or power supply.
    Following the failure, I removed the dust and restored the iMac to serviceable form. The power supply seems to be OK now. The next thing was to attempt to recover as much data as possible from the 160GB, as the last full backup was a week old. Carbon Copy Cloner, shell copy via 'sudo cp -p -R -v', Finder copy, and DiskWarrior recovery all met with problems. TechTool Pro identified a huge swath of unreadable sectors during repeated surface scans. Unfortunately, these unreadable sectors were located midway in the OS X boot partition (an 80GB partition), and not in the other 80GB partition devoted to lower-priority video data.
    When I was satisfied I had backed up the data to the best of my abilities, I next set out to reformat the drive and see if the bad sectors could be eliminated or remapped out of existence. I did a "Zero Out Data" erasure in Disk Utility (with no errors during the erase), but TechTool Pro showed the bad sectors persisted in equal strength at the same location. I next executed a sixteen hour "7-Pass Erase" (again no errors, and confirming that it takes about an hour per 10GB). The next day when I ran TechTool pro, all of the sector errors had disappeared. I'm a bit perplexed as to why the "7-Pass Erase" seems to have recovered the use of the drive. Is it possible that there are simply thousands of bad sectors now remapped that I'm not seeing? [If so, how do I check for this?] TechTool Pro has not reported any S.M.A.R.T. issues to date on the drive. What am I to make of that?
    There are some related threads I've checked into, but I'm not sure how to properly assess my situation based on this information:
    <http://discussions.apple.com/thread.jspa?threadID=232007>
    <http://discussions.apple.com/thread.jspa?threadID=138559>
    <http://discussions.apple.com/thread.jspa?threadID=118455>
    Since the iMac has three weeks left on its one-year warranty, and I've already moved the user to another machine temporarily, I'm thinking that the smart thing to do is to send it in to Apple and have them look at the power supply and hard drive. That way, when it returns, even if there is still a lingering hardware problem, at least it will be covered under warranty for another 90 days.
    Any thoughts?
    iMac G5 20" 1.8GHz   Mac OS X (10.4.6)   1.25GB RAM, 160GB hard disk, SuperDrive

    Hi, Bret.
    The only differences between "Zero Out Data", "7-Pass Erase", and "35-Pass Erase" are the number of times a binary zero is written to every bit on the disk. "Zero Out Data" writes a binary zero once, whereas the 7- and 35-Pass options write a zero seven and 35 times, respectively.
    Technically, one pass with Zero Out Data should be sufficient to map bad sectors out of service, a process also known as sparing. If a bad sector is encountered, it is both marked as "in use" in the directory's allocation table and added to the directory's "bad blocks file."
    My understanding is that the Surface Scan of Tech Tool Pro should identify bad sectors every time it is run unless the bad sectors have been locked out by the drive controller of the ATA drive itself. This is because Surface Scan checks the entire surface of the disk.
    What may have happened is that running "Zero Out Data" spared the bad blocks from a directory standpoint, but did not result in the drive's controller locking out those sectors for reasons detailed in the "Surface Scan" section of the Tech Tool Pro manual. However, the 7-Pass Erase may have resulted in the drive's controller locking out the bad sectors and why Surface Scan did not pick them up after such.
    Given the problems you described, I concur with your plan to have Apple check the affected computer. You might also want to consider purchasing an AppleCare Protection Plan for that Mac: I recommend and buy these for all my Macs.
    For some additional information on bad sectors, see the "Bad Sectors" section of my "Resolving Disk, Permission, and Cache Corruption" FAQ.
    Good luck!
    Dr. Smoke
    Author: Troubleshooting Mac® OS X
    Note: The information provided in the link(s) above is freely available. However, because I own The X Lab™, a commercial Web site to which some of these links point, the Apple Discussions Terms of Use require I include the following disclosure statement with this post:
    I may receive some form of compensation, financial or otherwise, from my recommendation or link.

  • How to set JDBC Data Sources in Oracle MapViewer for Oracle database 12c Release 1 (12.1.0.1)

    How to set JDBC Data Sources in Oracle MapViewer for Oracle database 12c Release 1 (12.1.0.1)?
    The following is my configuration in the conf\mapViewerConfig.xml:
    <map_data_source name="mvdemo12"
    jdbc_host="127.0.0.1"
    jdbc_sid="orcl12c1"
    jdbc_port="1522"
    jdbc_user="mvdemo"
    jdbc_password="7OVl2rJ+hOYxG5T3vKJQb+hW4NPgy9EN"
    jdbc_mode="thin"
    number_of_mappers="3"
    allow_jdbc_theme_based_foi="true"
    editable="true"/>
    <!--  ****  -->
    But it does not work.
    After using "sqlplus mvdemo/[email protected]:1522/pdborcl", it connected to the Oracle Database 12c instance.
    Does anyone know why?
    Thanks,

    For 11.1.1.7.1, use the service-name syntax for jdbc_sid, i.e.
    //mypdb1.foo.com, as described in the README:
    - MapViewer native (non-container) data sources can now use a database service name in place of the SID. To supply a DB service name, you use the same jdbc_sid attribute, but specify the service name with double slashes in front, as follows:
      <map_data_source name="myds"
        jdbc_host="foo.com"
        jdbc_sid="//mypdb1.foo.com"
        jdbc_port="1522"
      />
    For 11.1.1.7.0 use a container_ds instead.
    i.e. instead of using
    <map_data_source name="my_12c_test"
                       jdbc_host="mydbinstance"
                       jdbc_sid="pdborcl12c"
                       jdbc_port="1522"
                       jdbc_user="mytestuser"
                       jdbc_password="m2E7T48U3LfRjKwR0YFETQcjNb4gCMLG8/X0KWjO00Q="
                       jdbc_mode="thin"
                       number_of_mappers="6"
                       allow_jdbc_theme_based_foi="false"
                       editable="false"
       />
    use
      <map_data_source name="my_12c_test"
                       container_ds="jdbc/db12c"
                       number_of_mappers="6"
                       allow_jdbc_theme_based_foi="false"
                       editable="false"
       />
    In my case the GlassFish 3.1.2.2 JDBC connection pool definition had the property:
    url = jdbc:oracle:thin:@mydbinstance:1522/pdborcl12c.rest_of.service.name
    Uncheck the Wrap JDBC Objects option in the Advanced panel, i.e. the Edit JDBC Connection Pool Advanced properties page.
    Add a JDBC resource for that newly created pool.
    Use that in mapviewerconfig.xml as above.

  • Differences between the Portal Data Collector and the Activity Data Collect

    Hello,
    I want to know what the differences are between the Portal Data Collector and the Activity Data Collector.
    Best Regards.
    Pablo Mortera.

    All of my SQL Server instances are SQL Server 2008 R2 Standard Edition (10.50.2500). MDW is an existing database; I am trying to set up collection sets for multiple instances and store the data in one central MDW database.
    I created the MDW on one instance, then ran Configure Management Data Warehouse on the target instance. The collection sets were created successfully, but the job failed with the following error:
    Executed as user: COCAD\INTDEPT01SQLAgentC10. The Management Data Warehouse version "00.00.0000.00" is not supported by the current data collector. Please upgrade the Management Data Warehouse by running the Management Data Warehouse Configuration
    Wizard. Process Exit Code 5. The step failed.
    Thanks
    PAULqaz

  • How to create JDBC data source w/o LDAP server

    I am trying to test using a JDBC data source on a computer without an LDAP server. Is there an alternative JNDI solution? How about using the file system or RMI registry JNDI service providers?

    Any J2EE container should be able to handle that. I use JNDI data sources with Tomcat 4.1.27. I'm sure any other J2EE app server (e.g., WebLogic, WebSphere, JBOSS, etc.) would be able to manage it, too. - MOD
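    If the goal is just to exercise DataSource-based code in a standalone test without any JNDI provider at all (LDAP, file system, or RMI registry), another option is to instantiate the driver's DataSource implementation directly; a minimal sketch assuming the Oracle thin driver is on the classpath (URL and credentials are placeholders):

    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;

    public class StandaloneDataSourceTest {
        public static void main(String[] args) throws Exception {
            // No JNDI involved: the DataSource is constructed in plain Java.
            OracleDataSource ds = new OracleDataSource();
            ds.setURL("jdbc:oracle:thin:@localhost:1521:XE");   // placeholder URL
            ds.setUser("scott");                                 // placeholder credentials
            ds.setPassword("tiger");
            Connection con = ds.getConnection();
            System.out.println("Connected to: " + con.getMetaData().getDatabaseProductVersion());
            con.close();
        }
    }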
