Deploying Tables in my own datasource

Hi, I have created some tables with the Java Dictionary. When I deploy, they are automatically created in the default datasource of the J2EE Engine.
I need to deploy them to another datasource that I have created with the Visual Administrator. How can I configure this?
Thanks in advance
Pablo

Hi Pablo,
You specify the O/R mapping for CMP entity beans in persistent.xml. For this purpose you have to define the DataSource that will be used. Since you want to use another DB and not the system one, you only have the option to define its SQL Engine type as Vendor SQL or Native SQL (i.e. you cannot benefit from the Open SQL persistence layer). For more information please have a look at these documents:
http://help.sap.com/saphelp_nw04/helpdata/en/bb/69da54d8aedc419d46d9ea074e4d41/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/d2/369f1dddb7ff4c96c2bacdd7aa6c87/frameset.htm
and the links from there.
Hope that helps!
Vladimir

Similar Messages

  • How can I create my own datasource

    Hi All,
    I would like to create my own datasource. Please provide me the steps to do so. I don't want to modify the standard datasource in RSA6, because many fields would be appended and I don't want to disturb the other BI consultants either.
    Thank you in advance,
    Sukanya

    Hi,
    You can create your own datasource by using generic extraction in RSO2 on the R/3 side.
    Generic extraction Function module:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/d3219af2-0c01-0010-71ac-dbb4356cf4bf
    Generic Delta:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    Generic Extractions:
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/bi/generic%2bextraction
    Regards
    Tg

  • Using Single Datasource to Access Multiple Databases

    Hi,
    We would like to know the pros and cons of accessing multiple databases through a single datasource, versus accessing each database through its own datasource. Our environment includes multiple web servers with the latest version of ColdFusion MX 7, clustered through a load balancer. Each web server has 800+ DSNs pointing to different SQL databases on the same SQL server. We have noticed that the ColdFusion Administrator takes a long time to display or verify all datasources, and sometimes it even times out. Another problem is that sometimes the neo-query file gets corrupted (for unknown reasons), which results in the deletion of one, more, or all datasources on the web server.
    Because of the issues above, we are researching the possibility of removing most of the datasources and then accessing each database through a single bridge datasource. To that end, we plan to change our queries by inserting the SQL db name and owner in front of each table in the query, such as:
    <cfquery name="query" datasource="single_dsn_name">
    select * from [#dbname#].dbo.tableName
    </cfquery>
    In the example above, #dbname# would be a variable that holds the name of the requested database. The same approach would apply to queries using update, insert, and join.
    Are there any limitations or negatives, from a scalability, performance, and reliability perspective, to implementing the above scenario versus having one datasource for each database?
    Also, if there is a better way of accomplishing this, we would love to hear about it.

    Here is my opinion, because I work with both schemes. The main advantage of using one datasource for all DBs on a SQL Server is the simplicity of administration.
    But the main disadvantage is security: because you are using a single user to access all DBs on a server, you don't have isolation, and a user who knows your schema can access data in other DBs that he should not be authorized to see.
    Another issue is that if a user must access two different DBs with different permissions (one DB read-only and the other read/write), you'll have to create another datasource, user, etc. for it.
    But the decision depends on the environment. If you are a hosting company, I would use one datasource per user or DB. If the servers and DBs belong to the same company, I would use one datasource for each SQL server.
    Best regards
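    One risk worth underlining with the single-datasource pattern discussed above: because the database name is spliced into the SQL text rather than bound as a query parameter, it should be validated before use. A minimal sketch in Java (the class name, database names, and bracket-qualified format here are illustrative assumptions, not from the thread):

```java
import java.util.Set;

public class DbNameGuard {
    // Hypothetical whitelist of database names this application may touch.
    private static final Set<String> KNOWN_DBS =
        Set.of("sales_db", "hr_db", "inventory_db");

    // Builds a qualified table reference like [sales_db].dbo.orders,
    // rejecting any database name not on the whitelist so a
    // user-supplied value cannot smuggle SQL into the query text.
    public static String qualify(String dbName, String table) {
        if (!KNOWN_DBS.contains(dbName)) {
            throw new IllegalArgumentException("Unknown database: " + dbName);
        }
        return "[" + dbName + "].dbo." + table;
    }

    public static void main(String[] args) {
        System.out.println(qualify("sales_db", "orders")); // prints [sales_db].dbo.orders
    }
}
```

    The same idea applies to the #dbname# variable in the ColdFusion query: resolve it from a fixed lookup, never directly from request input.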

  • KODO DataSource questions

    Our setup:
    Tomcat 5.5, KODO 3.4.1, single transaction per request on the application (application is sessionless).
    Observations:
    1. When we specify javax.jdo.option.ConnectionURL, javax.jdo.option.ConnectionUserName, javax.jdo.option.ConnectionPassword in the properties file, it uses the KODO DataSource specifically com.solarmetric.jdbc.PoolingDataSource.
    2. If we want to use JNDI, we specify javax.jdo.option.ConnectionFactoryName=java:comp/env/jdbc/MyDB in the properties file and then in our tomcat server.xml context we add something like the following:
       <Resource
       name="jdbc/MyDB"
       auth="Container"
       type="javax.sql.DataSource"
       driverClassName="oracle.jdbc.driver.OracleDriver"
       url="jdbc:oracle:thin:@_host_:_port_:_SID_"
       username="_username_"
       password="_password_"
       maxActive="10"
       maxIdle="5"
       maxWait="-1"/>
    3. When specifying with JNDI in Tomcat, it appears to use the Tomcat DataSource (org.apache.tomcat.dbcp.dbcp.BasicDataSource).
    4. Using the tomcat BasicDataSource means that we get no prepared statement caching - this is a documented issue with using third party DataSources.
    5. Using the tomcat BasicDataSource also seems to do a database rollback after every persistence manager 'close', which in our setup equates to each request.
    Questions:
    1. How do we force KODO to use its own DataSource when using JNDI in Tomcat?
    2. How would we force KODO to use a different DataSource in the properties file (like the tomcat BasicDataSource)?
    3. Is there any way to stop the database rollback after each persistenceManager close when using the tomcat BasicDataSource?
    4. Is it recommended to use the KODO DataSource? Are there any reasons to not use it?
    5. What are the main differences between the KODO DataSource and the tomcat BasicDataSource?
    Many thanks in advance.
    Nick

    Can anyone shed some light on how we get statement caching working whilst using JNDI to specify the DB connection details? Is this possible?
    Any info/help would be very much appreciated as statement caching is something we would now like to enable.
    Many thanks,
    Nick
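    For what it's worth, when Kodo manages its own connection pool (observation 1 above), statement caching is typically configured through the pooling DataSource's properties. A hedged sketch of a kodo.properties fragment follows; the property names are from memory and should be verified against the Kodo 3.4 configuration reference:

```
javax.jdo.option.ConnectionDriverName: oracle.jdbc.driver.OracleDriver
javax.jdo.option.ConnectionURL: jdbc:oracle:thin:@_host_:_port_:_SID_
# Assumed pool settings for com.solarmetric.jdbc.PoolingDataSource;
# MaxCachedStatements enables the prepared statement cache.
kodo.ConnectionFactoryProperties: MaxActive=10, MaxCachedStatements=50
```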

  • Delta on Sales Data (Generic Datasource)

    Hi,
    I have created a generic datasource which takes data from VBRK & VBRP. Now I have to set up delta on it, for which I was looking for a change-date field, but I didn't find one.
    Does anyone have any idea which field I can use to set up my delta for sales data?

    Hi,
    generic extractor is an extractor that you create yourself for a special requirement which is not solved by the standard available business content.
    Before creating your own datasource take a look at the Logistic Cockpit (Transaction LBWE) -> application 13. Here you found BW content datasources like 2LIS_13_VDITM
    to extract the tables for SD billing document (VBRK & VBRP) in delta mode.
    After you have filled the setup table for application 13 you can do an initial run from BW. Now you can extract your delta without building your own datasource. Delta updating is part of the standard Logistics extraction process.
    Additional you can use now standard business content in BW system for this extraction (or take a look on it how it is implemented).
    Check these blogs for Logistics extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    I hope this helps.
    Best regards
    Andreas

  • Websphere 5.0 and non-jts-datasource = 2PC exception!

    Hello all
    We're migrating a working WebLogic 8 app to WebSphere 5.0, and we have run into this problem.
    TopLink tries to enlist the non-JTS datasource in the global transaction. In WebLogic we defined our non-JTS datasource as a non-transactional datasource, but there is no such option in WebSphere. What is going on?
    Please help
    TIA
    - Russ -

    Hello Rustam,
    WebSphere 5 throws exceptions when you try to get a non-JTA datasource while in a transaction - it seems to try to enlist it in the transaction. This is really a WebSphere issue, since it means you cannot read outside of the transaction.
    There are 3 options:
    1) Don't define a non-JTA datasource in TopLink at all. The drawback is that there may be problems reading when there is no transaction, such as when you are using cache synchronization.
    2) Create your own datasource (outside of WebSphere) and place it in JNDI. Then have TopLink access it as a non-JTA datasource. Your datasource must be completely independent of WebSphere so that it does not attempt to associate with JTA.
    3) Use a TopLink-maintained read connection pool. You can use the non-jts-connection-url sessions.xml tag, which will use the login settings defined in your project.xml. I've not tested it, but you can also override the read pool in a preLogin event that should look something like:
    public void preLogin(SessionEvent event) {
        DatabaseLogin dbLogin = new DatabaseLogin();
        dbLogin.usePlatform(new Oracle9iPlatform());
        dbLogin.setUserName("name");
        dbLogin.setPassword("password");
        dbLogin.setConnectionString("jdbc:oracle:thin:@ip:port:sid");
        ConnectionPool readPool = new ReadConnectionPool("read", dbLogin, minCon, maxCon, event.getSession());
        event.getSession().setReadConnectionPool(readPool);
    }
    Best Regards,
    Chris Delahunt

  • "Source is not Active" when creating DTP from Datasource (BI 7.0)

    Hello,
      I need to load data into an ODS from a Z table. Version 7.0.
    This is what I have done so far:
      - Created a Z table. (OK)
      - Loaded data via function modules through Excel macros. (OK)
      - Created DataSource (ZGDRDS01) from the Z table in RSO2. (I guess OK)
      - Created Data Transfer Process. (not OK)
           Source of DTP
             Object Type : DataSource
             DataSource : ZGDRDS01
             Source System: BWD
        - Error Message: Source ZGDRDS01 of BWD (type DTASRC) is not active (as if it didn't exist)
    (I didn't find any Activate button in RSO2.)
    Somebody else did something similar before and created their own DataSource, which I can use, but I need to create my own.

    Thank you for your fast answers! That seemed to be part of the problem.
    Did you create a Transformation, which is a prerequisite for creating a DTP?
    - No, I didn't create the transformation first, but the problem was the same. When I created a Transformation it asked me for the DataSource, and then back again to "source is not Active".
    New problem:
    I went to the source system and replicated the datasources.
    - Now it appears just below the other one that was previously working.
    - However, it appears with status Inactive and there is no Activate button.
    - Something went wrong with the version, because the column "M = A Version" shows =/= for my new DataSource. I right-click on it -> Replicate Metadata, but that didn't help. Any other option tells me something like "is not Active" or
       "The DataSource ZGDRDS01 (BWD) does not exist in object version A"
    If I right-click -> Display:
    Version = Modified (yellow)
    I deleted the datasource and generated it again, with the same result.
    Do you know how to fix this?
    P.S. Should I post a new thread for this problem?

  • Playback delay of customized DataSource/Stream

    I have developed a centralized voice chat conferencing program using JMF. The server mixes received streams into a single stream and sends the mixed stream to each client. To make this possible, I have developed my own DataSource and Stream classes. My stream class uses the standard GSM_RTP audio format. When a client receives the mixed stream, the client immediately creates a player for the stream. The player spends about four seconds realizing the stream before playback begins. If I send a non-mixed/non-customized stream to the clients, there is no delay. Any ideas as to why a customized stream would cause the player to delay for so long?

    I found the problem. I had some old code left over from the previous version that was setting the buffer length and threshold for the receive RTPManager. As soon as I removed this code, there was no longer a delay realizing the player.

  • Datasource for plain UDP ....

    Hi. I tried to write my own datasource, which should help interpret plain UDP signals (without RTP). Can anybody look at the files below and help me with two problems?
    1. The line UDPPushSourceStream s = new UDPPushSourceStream(sock, ml.getIpAdresse(), ml.getPortnummer()); in DataSource.java doesn't compile -> "Cannot resolve symbol". Why?
    2. When I comment out this line, I can compile all the files. But when loading the player applet I get the following exception. Why? What does it mean?
    java.security.AccessControlException: access denied (java.net.SocketPermission 225.225.225.225 connect,accept,resolve)
         at java.security.AccessControlContext.checkPermission(Unknown Source)
         at java.security.AccessController.checkPermission(Unknown Source)
         at java.lang.SecurityManager.checkPermission(Unknown Source)
         at java.lang.SecurityManager.checkMulticast(Unknown Source)
         at java.net.MulticastSocket.joinGroup(Unknown Source)
         at DataSource.connect(DataSource.java:80)
         at player.init(player.java:32)
         at sun.applet.AppletPanel.run(Unknown Source)
         at java.lang.Thread.run(Unknown Source)
    Sources:
    First DataSource.java:
    /** DataSource for the UDP protocol **/
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.udp.*;
    import java.io.*;
    import java.net.*;
    import java.util.*;

    public class DataSource extends javax.media.protocol.PushDataSource {
      protected UDPMediaLocator ml;
      protected MulticastSocket sock;

      // No-argument constructor
      public DataSource() {
        this.ml = null;
      }

      // Constructor
      public DataSource(MediaLocator src) {
        this();
        setLocator(src);
      }

      // Returns the streams managed by this DataSource
      public PushSourceStream[] getStreams() {
        initCheck();
        PushSourceStream[] streams = new PushSourceStream[1];
        UDPPushSourceStream s = new UDPPushSourceStream(sock, ml.getIpAdresse(), ml.getPortnummer());
        streams[0] = s;
        return streams;
      }

      // Sets the MediaLocator and thereby the location of the data.
      // This method should only be called once!
      public void setLocator(MediaLocator src) {
        if (this.ml == null) {
          this.ml = (UDPMediaLocator) src;
        } else {
          throw new Error("MediaLocator already set!");
        }
      }

      // Returns the MediaLocator object
      public MediaLocator getLocator() {
        return this.ml;
      }

      // Has the connection been initialized?
      protected void initCheck() {
        if (this.ml == null) {
          throw new Error("Data source uninitialized");
        }
      }

      // Returns the content type of the DataSource;
      // unknown in our case.
      public String getContentType() {
        initCheck();
        return Manager.UNKNOWN_CONTENT_NAME;
      }

      // Opens the connection to the server
      public void connect() throws java.io.IOException {
        initCheck();
        if (sock != null) {
          disconnect();
          sock = null;
          // reader = null;
        }
        InetAddress address = ml.getIpAdresse();
        sock = new MulticastSocket(ml.getPortnummer());
        sock.joinGroup(address);
      }

      // Closes the connection to the server
      public void disconnect() {
        initCheck();
        sock.close();
        sock = null;
      }

      // start() has nothing to do.
      public void start() {
      }

      // stop() has nothing to do.
      public void stop() throws java.io.IOException {
      }

      // Returns the duration of the medium;
      // unknown in our case.
      public Time getDuration() {
        return Duration.DURATION_UNKNOWN;
      }

      // Returns an array with all controls;
      // we have no controls.
      public Object[] getControls() {
        return new Object[0];
      }

      public Object getControl(String controlName) {
        return null;
      }
    }
    ===========================
    Now: UDPMediaLocator.java:
    /** MediaLocator for the UDP protocol **/
    import javax.media.udp.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import java.util.*;

    public class UDPMediaLocator extends javax.media.MediaLocator {
      protected String IpAdresse;
      protected int Portnummer;

      // Constructor
      public UDPMediaLocator(String IpAdresse, int Portnummer) {
        super(IpAdresse + ":" + Portnummer);
        this.IpAdresse = IpAdresse;
        this.Portnummer = Portnummer;
      }

      // Returns the IP address as an InetAddress
      public InetAddress getIpAdresse() throws UnknownHostException {
        return InetAddress.getByName(IpAdresse);
      }

      // Returns the port number
      public int getPortnummer() {
        return this.Portnummer;
      }

      // Returns null, since no URL exists
      public URL getURL() throws MalformedURLException {
        return null;
      }

      // Returns the protocol name
      public String getProtocol() {
        return "udp";
      }

      // Returns null; not needed in our case
      public String getRemainder() {
        return null;
      }
    }
    ===========================
    Now: UDPPushSourceStream.java:
    /** PushSourceStream for the UDP protocol **/
    import java.io.IOException;
    import java.net.InetAddress;
    import java.net.DatagramSocket;
    import java.net.MulticastSocket;
    import java.net.DatagramPacket;
    import java.net.SocketException;
    import javax.media.udp.*;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.PushSourceStream;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.SourceTransferHandler;

    public class UDPPushSourceStream implements javax.media.protocol.PushSourceStream {
      MulticastSocket sock;
      InetAddress address;
      String addresse;
      int port;
      SourceTransferHandler sth = null;

      // Constructor
      public UDPPushSourceStream(MulticastSocket sock, String address, int port) {
        this.sock = sock;
        this.addresse = address;
        this.port = port;
      }

      // Reads data from the stream
      public synchronized int read(byte buffer[], int offset, int length) throws IOException {
        DatagramPacket p = new DatagramPacket(buffer, offset, length, address, port);
        try {
          sock.receive(p);
        } catch (IOException e) {
          return -1;
        }
        return p.getLength();
      }

      // The stream is theoretically endless
      public boolean endOfStream() {
        return false;
      }

      // Returns the ContentDescriptor for the stream
      public ContentDescriptor getContentDescriptor() {
        ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.CONTENT_UNKNOWN);
        return cd;
      }

      // Returns the number of bytes in the stream;
      // unknown in our case.
      public long getContentLength() {
        return LENGTH_UNKNOWN;
      }

      // Control method is not supported
      public Object getControl(String controlName) {
        return null;
      }

      // Returns an array with all controls;
      // not supported.
      public Object[] getControls() {
        return new Object[0];
      }

      public synchronized void setTransferHandler(SourceTransferHandler sth) {
        this.sth = sth;
      }

      public int getMinimumTransferSize() {
        return 2 * 1024;
      }
    }
    ===========================
    Now: Player.java:
    import javax.media.udp.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import java.applet.*;
    import java.awt.*;
    import java.net.*;
    import java.util.*;

    public class player extends java.applet.Applet implements ControllerListener {
      private UDPMediaLocator locator;
      private String ip = "225.225.225.225";
      private int port = 1112;
      private Label cachingStatus;
      private transient Player player;
      private transient DataSource dataSrc;
      private transient Component visualComp;
      private transient long bytesReceived;

      public void init() {
        setLayout(new BorderLayout());
        cachingStatus = new Label("");
        add("Center", cachingStatus);
        locator = new UDPMediaLocator(ip, port);
        DataSource ds = null;
        Class dsClass = null;
        String dsClassname = "DataSource";
        try {
          dsClass = Class.forName(dsClassname);
          ds = (DataSource) dsClass.newInstance();
          ds.setLocator(locator);
          ds.connect();
          player = Manager.createPlayer(ds);
          player.addControllerListener(this);
          player.prefetch();
        }
        catch (ClassNotFoundException ex) { throw new Error(ex.toString()); }
        catch (IllegalAccessException ex) { throw new Error(ex.toString()); }
        catch (InstantiationException ex) { throw new Error(ex.toString()); }
        catch (javax.media.NoPlayerException ex) { throw new Error(ex.toString()); }
        catch (java.io.IOException ex) { throw new Error(ex.toString()); }
      }

      public void stop() {
        if (player != null) {
          player.stop();
          player.deallocate();
        }
      }

      public void destroy() {
        if (player != null) {
          player.removeControllerListener(this);
          player.stop();
          player.close();
          player = null;
        }
        super.destroy();
      }

      public synchronized void controllerUpdate(ControllerEvent evt) {
        if (evt instanceof EndOfMediaEvent) {
          player.stop();
          player.setMediaTime(new Time(0L));
          player.start();
        }
        else if (evt instanceof PrefetchCompleteEvent) {
          remove(cachingStatus);
          visualComp = player.getVisualComponent();
          if (visualComp != null) { this.add("Center", visualComp); }
          else {
            Label novisual = new Label("This Player has no visual component");
            add("Center", novisual);
          }
          validate();
          repaint();
          player.start();
        }
        else if (evt instanceof CachingControlEvent) {
          CachingControlEvent cce = (CachingControlEvent) evt;
          bytesReceived = cce.getContentProgress();
          cachingStatus.setText(bytesReceived + " bytes received");
          validate();
          repaint();
        }
        else if (evt instanceof ControllerErrorEvent) {
          ControllerErrorEvent cee = (ControllerErrorEvent) evt;
          System.out.println("cee.getMessage " + cee.getMessage());
          System.out.println("bytesReceived = " + bytesReceived);
        }
      }
    }
    What's going wrong? Are these files correct, or does anybody have comments on them? Thanks for any help!
    Jan

    Okay... I read something about the sandbox, but I can't find out exactly how to get applets working with this network stuff. Can you tell me a little about it?
    Thx,
    Jan
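    For context on the AccessControlException above: unsigned applets run in the sandbox and may only open sockets back to their originating host, so joining a multicast group is denied. For local testing, the permission named in the stack trace can be granted through a Java policy file; a minimal sketch follows (the grant below is illustrative, and signing the applet is the usual production route):

```
grant {
    // Allow the applet to join the multicast group used by the player.
    permission java.net.SocketPermission "225.225.225.225", "connect,accept,resolve";
};
```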

  • Error while activating Process Chains

    Hi all,
    While activating the process chains, I am getting the following error: "Job BI_PROCESS_PSAPROCESS could not be scheduled. Termination with return code 8"
    When I double-click on the error message, I get the following help message:
    <i>Message no. RSPC065
    Diagnosis
    Program RSPROCESS is to be scheduled as job BI_PROCESS_PSAPROCESS under user ALEREMOTE.</i>
    Can anyone please show me a way to solve this problem? Please do me this favor; I have been struggling with this error for a long time.
    Points will be given
    Thanks
    Ganesh

    Hi,
    Just analyze the error message that you get while activating the process chain; don't give any server name. If your source system is a flat file, the process chain won't work: you need an R/3 source system, or your own datasources in the BW system itself - in that case you can use a process chain to extract the data. Otherwise you should place your flat file on the application server, using the AL11 tcode.
    <b>OSS : 511475</b>
    <b>Symptom</b>
    You cannot schedule or perform any batch jobs with the BW or source system background user.
    The error RSPC 065 occurs in the process chains:"Job could not be scheduled, termination with return code 8"
    <b>Other terms</b>
    RSPC065
    <b>Reason and Prerequisites</b>
    The user type is
    "CPIC" up to 4.6B
    "Communication" as of 4.6C
    This user type may not execute or start any batch jobs, irrespective of the user authorizations.
    <b>Solution</b>
    Set the type of background user to
    "Background" up to 4.6B
    "System" as of 4.6C
    This user type corresponds to the "Communication" type and may also perform background functions.
    Through Customizing, the BW user is automatically created by mistake as a communication user. Depending on your BW system release, you can solve this problem as follows:
    BW 2.0B
               Import Support Package 24 for 2.0B (BW2.0B patch24 or SAPKW20B24) into your BW system. The Support Package is available once note 456551 with the short text "SAPBWNews BW 2.0B Support Package 24", which describes this Support Package in more detail, has been released for customers.
    BW 2.1C
               Import Support Package 16 for 2.1C (BW2.1C patch16 or SAPKW21C16) into your BW system. The Support Package is available once note 456566 with the short text "SAPBWNews BW 2.1C Support Package 16" has been released for customers.
    BW 3.0A
               Import Support Package 8 for 3.0A (BW3.0A patch08 or SAPKW30A08) into your BW system. The Support Package is available once note 452632 with the short text "SAPBWNews BW 3.0A Support Package 08" has been released for customers.

  • Vendor Master Extraction

    Hello,
    I have successfully activated 0vendor_attr and 0vendor_text. The extract structure name, BIW_LFA1_S, and description indicate that this is vendor master data from table LFA1.
    My requirement is to also extract data from tables LFB1 and LFM1.
    Using transaction code RSA5, I am not able to identify a business content datasource whose extract structure corresponds to these two OLTP system tables.
    Using SE11 I have verified that the structure BIW_LFB1_S and BIW_LFM1_S exist in the OLTP system.
    How can I activate these two datasources?
    Thanks in advance for your assistance.
    Mark
    Message was edited by: Mark Castaldi
    Found the business content data sources under SAP-R/3 - LO - LO-IO: 0ven_compc and 0ven_purorg.

    Hi Mark,
    You can try creating your own datasources in RSO2 using these structures for extraction.
    Hope this helps...

  • MapViewer metadata problem - accessing spatial data in a different schema.

    I have a MapViewer application that uses data from three different schemas.
    1. Dynamic Themes come from schema A.
    2. Static Themes come from schema B.
    3. A newly added static theme in B whose data comes from schema C.
    The MapViewer datasource points to schema B, where the static themes, data, and metadata are defined, while the dynamic themes have their own datasource specified as part of addJDBCTheme(...).
    To get the newly added map to work, I've had to add a view in schema B that points to C, instead of referencing the table directly, and I've had to add the metadata twice: once for schema B and once for schema C.
    If I put the metadata in just one of the two schemas I get the following errors.
    08/11/21 13:58:57 ERROR [oracle.sdovis.ThemeTable] cannot find entry in ALL_SDO_GEOM_METADATA table for theme: AMBITOS_REST
    08/11/21 13:58:57 ERROR [oracle.sdovis.ThemeTable] java.sql.SQLException: Invalid column index
    OR
    08/11/21 13:53:39 ERROR [oracle.sdovis.theme.pgtp] java.sql.SQLException: ORA-29902: error in executing ODCIIndexStart() routine
    ORA-13203: failed to read USER_SDO_GEOM_METADATA view
    It's not a big deal, but I'd like to know if anyone else has had similar problems.
    Saludos,
    Lew.
    Edited by: Lew2 on Nov 21, 2008 6:42 AM

    Hi Lew,
    if you are using a recent version (10.1.3.1 or later) there is no need to use a view and create the metadata in both schemas.
    You need to grant SELECT on the tables between the schemas.
    You can try the following. Assume you have the MVDEMO schema (from MapViewer kit) and SCOTT schema.
    1) grant select on MVDEMO Counties table to SCOTT
    SQL> grant select on counties to scott;
    2) Now you are ready to create a predefined theme in schema SCOTT using the MVDEMO Counties table.
    - Open MapBuilder and load the SCOTT schema.
    - In the Data navigator (bottom-left tree), go to Geometry tables and you should see the MVDEMO node with the COUNTIES node inside it.
    - Start the wizard to create a geometry theme based on this Counties table.
    - At the end you should see that the base table name is MVDEMO.COUNTIES. Therefore MapViewer will use the metadata in the MVDEMO schema, and there is no need to replicate it in the SCOTT schema.
    Joao

  • SQL Error in Query Editor

    Hello All,
    I am new to Sun Java Studio Creator and am working through the Linking Components to Data tutorial.
    I created my own datasource, which is an MS Access database.
    I have a simple lookup table, and I dragged the table onto the dropdown list as instructed, but nothing happened. The dialog box the tutorial said would appear didn't appear.
    So I right-clicked on the rowset added to the page to get into the Query Editor.
    When opening the Query Editor, I get an error:
    Error Connecting to Database:
    java.sql.SQLException:[Microsoft][ODBC Driver Manager] does not support this function
    The SQL in the editor reads something like this:
    SELECT ALL tableName.Column 1 FROM tblName
    Why does it automatically put in the "ALL"? Is that standard SQL? I don't think so, but I guess I could be wrong.
    I have not seen this used in other SQL. I've used T-SQL, PL/SQL, and Access SQL, and I have never seen "ALL" preceding the columns of a table.
    It's quite frustrating when something as simple as a SELECT statement does not work when you're trying to learn a new tool.
    When adding the datasource, the test connection is fine. The Query Editor sees all of my tables in the MS Access database and all of the added table's columns.
    Am I missing a driver, or an update to a driver? Or is this Sun's way of saying "screw your Microsoft database, use something that we want you to use"? Which is not cool.
    I'm a newbie with this tool, but I have been coding for a while using Microsoft products, and I have not had problems like this - or at least I could find help in the documentation.
    I'm just venting here. I want to learn this tool, and when simple processes don't work, and documentation on why they're not working is hard to find, it gets quite frustrating.
    Anyone's help would be greatly appreciated.
    I truly appreciate your time for reading through my frustration.
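    On the ALL keyword specifically: SELECT ALL is in fact standard SQL - ALL is the implicit default and the complement of DISTINCT - so the ODBC error more likely comes from the driver than from the keyword itself. A short illustration (table and column names are made up):

```sql
-- ALL is the default: these two queries return the same rows.
SELECT ALL name FROM person;
SELECT name FROM person;
```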

    Hi,
    Please check the driver, and also take a look at the link for "Supported Database Servers and JDBC Drivers":
    http://www.sun.com/software/products/jscreator/sysreq.xml
    regards
    MJ
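    One note on the generated statement itself: `SELECT ALL` is actually legal SQL-92. `ALL` is the implicit default set quantifier (the opposite of `DISTINCT`), although some ODBC drivers reject it anyway. As a workaround sketch (the class name, method name, and regex here are my own, not from the tutorial or the IDE), you could strip the optional keyword before the statement reaches the driver:

```java
public class StripAll {
    // "ALL" after SELECT is legal SQL-92 (the opposite of DISTINCT) but
    // redundant, and some ODBC drivers reject it. This hypothetical helper
    // removes the optional keyword before handing the SQL to the driver.
    static String stripSelectAll(String sql) {
        return sql.replaceFirst("(?i)^\\s*SELECT\\s+ALL\\b", "SELECT");
    }

    public static void main(String[] args) {
        System.out.println(stripSelectAll("SELECT ALL tblName.Col1 FROM tblName"));
        // prints: SELECT tblName.Col1 FROM tblName
    }
}
```

    Statements without the keyword pass through unchanged, so it is safe to apply to every query.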

  • Java connect to SAP R/3

    I would like to ask for opinions on a problem we are currently encountering related to Java. We are developing an intranet system using Java, with Borland Enterprise Server as the application/web server, and connecting to an SAP R/3 database to get certain information. The problem is that the database connection is not closed even though users have already logged out of the system. I can check the connections via SAP (transaction SM04) and they are still established.
    According to the Java developers, they already put the disconnect statement in the Java program.
    Please help.

    Try and adhere to this pattern:
    DataSource datasource = ... ; // configure your own datasource if you need to
    Connection con = null;
    PreparedStatement pstat = null;
    ResultSet rset = null;
    try {
        con = datasource.getConnection();
        pstat = con.prepareStatement(SQLSTRING);
        rset = pstat.executeQuery();
        if (rset.next()) {
            // read the row here
        }
    } catch (SQLException e) {
        e.printStackTrace();
    } finally {
        // Close in reverse order of acquisition, and null-check each resource
        // in case an earlier call failed before it was assigned.
        try { if (rset != null) rset.close(); } catch (SQLException e) { e.printStackTrace(); }
        try { if (pstat != null) pstat.close(); } catch (SQLException e) { e.printStackTrace(); }
        try { if (con != null) con.close(); } catch (SQLException e) { e.printStackTrace(); }
    }
  • Expand FI_GL_4 by BSEG and PUR fields

    Hi Experts,
    I am trying to expand FI-data from standard datasource FI_GL_4 by purchase information.
    Above all: EBELN & EBELP
    Additionally fields (BSEG): MWART, ZUONR,
    Additionally fields (EKBE, for Scheduling agreements): MENGE, MEINS
    Since I am inexperienced with FI extraction, I searched for notes, hints, etc.
    I found the thread "0FI_GL_4 customer exit", which seemed to be very helpful.
    But after studying that, I am facing some (similar) problems:
    1) Expanding FI_GL_4 by append fields to Customer Include CI_BSIS (as recommended in note 410799)
    After creating CI_BSIS and adding EBELN & EBELP to it, I cannot activate the include, because there are some dependent tables (other custom datasources) which already contain these fields. That brings up some errors.
    How can I activate the extract structure?
    Should I have these fields begin with ZZ*? But in that case I think they will not be filled automatically.
    I cannot implement note 430303 as all these fields are contained in DTFIGL_WF.
    2) Field ZUONR not filled
    This component is already in the structure DTFIGL_4, but it is not filled (RSA3 shows no data in that field, although it is filled in table BSEG for an account number).
    It seems that it was added to that datasource by an SAP enhancement afterwards.
    As shown in  http://help.sap.com/saphelp_nw04/helpdata/en/f2/f7e93a4295ac61e10000000a114084/frameset.htm
    the column Table of Origin is empty.
    How can I get data into that field?
    3) How should I append fields to FI_GL_4 so that they are filled from EKBE (MENGE, MEINS)?
    Do I have to create a new append and fill them via CMOD?
    Or can I also use CI_BSIS and fill them by a function module?
    I would be really happy if you could help me ASAP!
    Maybe Mr. Simon Turnbull has an idea to get along with it?
    Thanks a lot in advance!!!
    Best regards,
    Daniel
    Edited by: Daniel Haserodt on Sep 27, 2011 11:43 AM
    Edited by: Daniel Haserodt on Sep 27, 2011 11:47 AM

    Hi Suyhas,
    thanks for your encouraging reply.
    1)
    Yes, I am trying to append the fields EBELN & EBELP (without ZZ*), but it seems that both of these fields are
    somehow used in 2 customer-specific datasources (let's call them ZBW_t1 & ZBW_t2).
    I guess CI_BSIS is used within these datasources, too. Maybe that's why there are these dependencies and this trouble.
    The error message is below:
    TABL CI_BSIS activated
    Check table CI_BSIS
    Enhancement category for table missing
    Enhancement category for include or subtype missing
    Field EBELN does not lie within customer namespace
    Field EBELP does not lie within customer namespace
    List of tables dependent on CI_BSIS
    Table DTFIGL_4
    Table ZBW_t1
    Table ZBW_t2
    Table CI_BSIS was checked with warnings
    =========================================================================
    Adjustment of active dependent objects
    =========================================================================
    TABL DTFIGL_4 was adjusted
    TABL ZBW_t1 is inconsistent in active version
    Check table ZBW_t1
    Enhancement category 3 possible, but include or subty. not yet classified
    Field EBELN in table ZBW_t1 is specified twice. Please check
    Field EBELP in table ZBW_t1 is specified twice. Please check
    Check on table ZBW_t1 resulted in errors
    TABL ZBW_t2 is inconsistent in active version
    Check table ZBW_t2
    Enhancement category 3 possible, but include or subty. not yet classified
    Field EBELN in table ZBW_t2 is specified twice. Please check
    Field EBELP in table ZBW_t2 is specified twice. Please check
    Check on table ZBW_t2 resulted in errors
    2)
    I have to check whether ZUONR is used in these other extract structures.
    But there are no (syntax) errors during activation.
    I will respond to it later...
    Regards,
    Daniel
