Incorrect table creation statements being generated for DB2

Hi,
I am using JDeveloper 9.0.3 and I have a data source of type IBM DB2 ver 7.3. I am creating CMP 2.0 beans Person and Address. When I try running these beans from JDeveloper on the embedded OC4J, I get the following exception:
Auto-creating table: create table Person_current_workspace_app__vkk_AmsNTO_Project1_classes_ (id BIGINT not null primary key, firstName VARGRAPHIC(255) null, lastName VARGRAPHIC(255) null, middleName VARGRAPHIC(255) null)
Auto-creating table: create table Address_current_workspace_app__vkk_AmsNTO_Project1_classes_ (id BIGINT not null primary key, address1 VARGRAPHIC(255) null, address2 VARGRAPHIC(255) null, city VARGRAPHIC(255) null, state VARGRAPHIC(255) null, zip VARGRAPHIC(255) null, type VARGRAPHIC(255) null, Address_addresses BIGINT null)
Auto-creating table: create table Address_addresses_Address_Address_addresses_current_workspace_app__vkk_AmsNTO_Project1_classes_ (Address_id BIGINT not null)
Error creating table: [IBM][CLI Driver][DB2/NT] SQL0104N An unexpected token "," was found following "VARGRAPHIC(255) null". Expected tokens may include: "<references_spec>". SQLSTATE=42601
Error creating table: [IBM][CLI Driver][DB2/NT] SQL0104N An unexpected token "," was found following "VARGRAPHIC(255) nu
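For reference, the SQL0104N error above indicates that DB2 rejects the explicit "null" column attribute that OC4J emits; DB2 columns are nullable by default, so a hand-created equivalent of the first auto-generated table would simply drop that keyword (a sketch based on the DDL above, using the same generated table name):

```sql
-- DB2 columns are nullable unless declared NOT NULL, so the
-- explicit "null" attribute is dropped from each column:
create table Person_current_workspace_app__vkk_AmsNTO_Project1_classes_ (
    id BIGINT not null primary key,
    firstName VARGRAPHIC(255),
    lastName VARGRAPHIC(255),
    middleName VARGRAPHIC(255)
)
```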

Thanks wildmight for the reply.
You're right on with the tip you provided -- I figured this had to be related to some DB object setting in the Physical Layer. As I've never been able to find any documentation from Oracle on the definition of the "Features" tab settings, I just began trial-and-error troubleshooting with the settings to see what I could find. I actually tested it by selecting every "Value" checkbox in the "Features" tab to see what I'd get. :D
Apparently, when I selected "DB2/AS400" as the database object's data source type in the "General" tab, the "Features" tab also got updated to the default DB2 features that OBIEE thinks are appropriate. HOWEVER, one +slightly+ important feature is then automatically deselected (and also not listed as "default") in the list:
*DERIVED_TABLES_SUPPORTED*
Once I put a check in the "Value" column of the "Features" tab and re-ran my query, it actually submitted one unified query to the DB as I had hoped!
I have a fact table with about 30 million rows and when I would ask for only a month's worth of data, it previously would return ALL 30M records and then filter down on my month. That query was taking about 15 minutes. After I got the view to work properly, the same query took 19 seconds. :)
Cheers,
Jeremy

Similar Messages

  • Notifications are not being generated for any emails that have rules applied to them.

    Notifications are not being generated for any emails that have rules applied to them. Any clues how to fix this?

    In System Center 2012 Operations Manager, the alert notification will be sent when the alert first meets all criteria, regardless of resolution state, unless resolution state itself is a criterion. If alert suppression is enabled for the rule or monitor
    that raises an alert, only one notification will be sent when the subscription criteria are first met. No additional notifications will be sent until the alert is closed and a new alert is raised that meets all subscription criteria. Please check:
    1) whether your rule has turned on alert suppression
    2) close the alert raised by the rule, and try again
    Roger

  • Table creation Error "Ref Count for this object is higher than 0"

    Hi All
    I have a problem with creating tables using the SDK on a button event. I have a procedure that creates tables and fields. If I call this procedure on a menu event it works fine, but if I call it on a button event it gives the error "Ref count for this object is higher than 0". I know this error occurs when an object is not released after creating a table, but here the error occurs when the first table is created. My code is given below. Please have a look and give me your valuable suggestions.
      Dim oUserTablesMD As SAPbobsCOM.UserTablesMD
                Try
                    oUserTablesMD = B1Connections.diCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.oUserTables)
                    oUserTablesMD.TableName = "AM_OBIN"
                    oUserTablesMD.TableDescription = "PutAway Table"
                    oUserTablesMD.TableType = BoUTBTableType.bott_Document
                    lRetCode = oUserTablesMD.Add
                    '// check for errors in the process
                    If lRetCode <> 0 Then
                        B1Connections.diCompany.GetLastError(lRetCode, sErrMsg)
                        MsgBox(sErrMsg)
                    Else
                        B1Connections.theAppl.StatusBar.SetText("Table: " & oUserTablesMD.TableName & " was added successfully", BoMessageTime.bmt_Short, BoStatusBarMessageType.smt_Success)
                    End If
                    System.Runtime.InteropServices.Marshal.ReleaseComObject(oUserTablesMD)
                    oUserTablesMD = Nothing
                    GC.Collect()
                    oUserTablesMD = B1Connections.diCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.oUserTables)
                    oUserTablesMD.TableName = "AM_BIN1"
                    oUserTablesMD.TableDescription = "PutAway Upper Grid"
                    oUserTablesMD.TableType = BoUTBTableType.bott_DocumentLines
                    lRetCode = oUserTablesMD.Add
                    '// check for errors in the process
                    If lRetCode <> 0 Then
                        If lRetCode = -1 Then
                            '// lRetCode = -1 is deliberately ignored here; all other errors are shown
                        Else
                            B1Connections.diCompany.GetLastError(lRetCode, sErrMsg)
                            MsgBox(sErrMsg)
                        End If
                    Else
                        B1Connections.theAppl.StatusBar.SetText("Table: " & oUserTablesMD.TableName & " was added successfully", BoMessageTime.bmt_Short, BoStatusBarMessageType.smt_Success)
                    End If
                    System.Runtime.InteropServices.Marshal.ReleaseComObject(oUserTablesMD)
                    oUserTablesMD = Nothing
                    GC.Collect()
                Catch ex As Exception
                    MsgBox(ex.Message)
                End Try
    Thanks
    Regards
    Gorge

    Hi Gorge,
    The "Ref count error..." usually occurs when metadata objects are not properly released. And yes, you have got the error in the right place: it generally appears while creating tables / fields / UDOs. You just need to release the metadata object (at times even DI objects like a Recordset) once it is used.
    Eg: Release these objects in a Finally block, like below.
    System.Runtime.InteropServices.Marshal.ReleaseComObject(oRecordSet);
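    A minimal VB.NET sketch of that pattern, adapted to the oUserTablesMD object from the original post (the B1Connections helper and table names are taken from the code above; this is only an outline, not the full procedure):

    ```vb
    Dim oUserTablesMD As SAPbobsCOM.UserTablesMD = Nothing
    Dim lRetCode As Integer
    Try
        oUserTablesMD = B1Connections.diCompany.GetBusinessObject(SAPbobsCOM.BoObjectTypes.oUserTables)
        oUserTablesMD.TableName = "AM_OBIN"
        oUserTablesMD.TableDescription = "PutAway Table"
        oUserTablesMD.TableType = BoUTBTableType.bott_Document
        lRetCode = oUserTablesMD.Add()
    Finally
        '// Always release the COM wrapper, even if Add() threw an exception
        If oUserTablesMD IsNot Nothing Then
            System.Runtime.InteropServices.Marshal.ReleaseComObject(oUserTablesMD)
            oUserTablesMD = Nothing
        End If
    End Try
    ```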
    Hope this helps.
    Regards,
    Satish

  • Notifications not being generated for alerts generated by rules.

    I'm using SCOM 2012 R2 and have configured an alert generating rule to create alerts when specific events occur in a windows event log.
    The rule works fine, alerts are generated as expected.
    I have also configured subscriptions/subscribers/channels to send notifications via email and SMS (via script in command channel) however even with the criteria set to send on all alerts, I only receive notifications for alerts generated by monitors (so
    I know the subs and channels do function)
    It seems like some sort of workflow issue where alerts generated by a rule fail to create a notification.
    Is there any logging I can enable to see what's happening here? 
    Any other ideas?

    In System Center 2012 Operations Manager, the alert notification will be sent when the alert first meets all criteria, regardless of resolution state, unless resolution state itself is a criterion. If alert suppression is enabled for the rule or monitor
    that raises an alert, only one notification will be sent when the subscription criteria are first met. No additional notifications will be sent until the alert is closed and a new alert is raised that meets all subscription criteria. Please check:
    1) whether your rule has turned on alert suppression
    2) close the alert raised by the rule, and try again
    Roger

  • Table DFKKQSR not being populated for 1099 withholding tax reporting

    We had to run a catch-up program to populate this table in order to run program RFIDYYWT to generate 1099 forms. I am wondering what config might be missing that would allow this table to be populated automatically as relevant transactions occur, i.e. payments to business partners that are 1099-able...
    thanks,
    Kashmir

    I wouldn't mind it being closed if it weren't related to FICA public sector, but the table DFKKQSR is not an FI table; it's a table in contract accounting. FI folks won't know the answer to this, I don't believe. It's something someone working on FICA would have encountered.

  • Wrong Select statement being generated

    The problem is to do with querying an object (db table) in an inheritance. The main comment is at the end.
    A. Here is the base class:
    @javax.persistence.Table(name = "DATA_PROFILE", schema="LPDS")
    @javax.persistence.Entity
    @javax.persistence.Inheritance(     strategy = javax.persistence.InheritanceType.JOINED)
    @javax.persistence.DiscriminatorColumn(name="DOMAIN_TYPE" , discriminatorType=javax.persistence.DiscriminatorType.STRING)
    public abstract class DataProfileEntity implements java.io.Serializable{
    B. And here is the sub class being queried:
    @javax.persistence.Table(name = "DATA_CONSTRAINT", schema="LPDS")
    @javax.persistence.Entity
    @javax.persistence.DiscriminatorValue("DataConstraint")
    @javax.persistence.PrimaryKeyJoinColumn(name="ID", referencedColumnName="ID")
    public class DataConstraintEntity extends DataProfileEntity implements java.io.Serializable, Cloneable{
    C. Here is my DAO using the Eclipselink API:
    @javax.ejb.TransactionAttribute(javax.ejb.TransactionAttributeType.NOT_SUPPORTED)
    public java.util.Collection<T> queryByExample(T entity) {
        if (type == null) {
            throw new UnsupportedOperationException("The type must be set to use this method.");
        }
        if (entity == null) {
            throw new java.lang.NullPointerException(DAOExceptionMessageBuilder.buildMessage("param.value.empty", new Object[] { entity }));
        }
        try {
            ReadAllQuery readAllQuery = new ReadAllQuery(type);
            readAllQuery.setExampleObject(entity);
            return (java.util.Collection<T>) getUnitOfWork().executeQuery(readAllQuery);
        } catch (org.eclipse.persistence.exceptions.DatabaseException dbe) {
            dbe.printStackTrace();
            throw new DAOException(DAOExceptionMessageBuilder.buildMessage("object.find.exception"), dbe);
        } catch (java.lang.RuntimeException rte) {
            rte.printStackTrace();
            throw new DAOException(DAOExceptionMessageBuilder.buildMessage("object.find.exception"), rte);
        }
    }
    D. When I do the following:
         DataConstraintEntity entity = new DataConstraintEntity();
         entity.setParentTable("User");
         entity.setCrudMode("All");
    dao.queryByExample(entity);
    The log message shows the following:
    2009-11-20 20:43:11,671 INFO [STDOUT] (http-127.0.0.1-8080-1) Begin profile of{ReadAllQuery(referenceClass=DataConstraintEntity )
    2009-11-20 20:43:11,689 INFO  [STDOUT] (http-127.0.0.1-8080-1) [EL Fine]: 2009-11-20 20:43:11.68--ServerSession(25760263)--Connection(32815217)--Thread(Thread[http-127.0.0.1-8080-1,5,jboss])
    --SELECT t0.ID, t0.DOMAIN_TYPE, t0.IS_LOCKOUT, t0.VERSION, t0.UPDATED_BY, t0.HITS, t0.CREATED_BY, t0.DESCRIPTION, t0.NAME, t0.DATE_LAST_UPDATED, t0.DATE_CREATED, t0.NOTES, t0.owner, t1.ID, t1.DATA_TYPE, t1.READ_ONLY, t1.ENABLED_ON_USER_ROLE, t1.RENDERED_ON_USER_ROLE, t1.CRUD_MODE, t1.PARENT_TABLE, t1.DATA_NAME FROM ODMS.DATA_PROFILE t0, ODMS.DATA_CONSTRAINT t1 WHERE ((((t0.DOMAIN_TYPE = ?) AND (t1.CRUD_MODE = ?)) AND (t1.PARENT_TABLE = ?)) AND ((t1.ID = t0.ID) AND (t0.DOMAIN_TYPE = ?)))
         bind => [DataProfile, All, User, DataConstraint]
    2009-11-20 20:43:12,412 INFO [STDOUT] (http-127.0.0.1-8080-1) Profile(ReadAllQuery,
         class=com.xoftsystems.odms.domain.bean.DataConstraintEntity,
         total time=736,
         local time=736,
         profiling time=1,
         connect=1,
         logging=10,
         query prepare=5,
         sql prepare=1,
         sql execute=716,
         sql generation=3,
    }End profile
    In "bind => [DataProfile, All, User, DataConstraint]", I don't know why EclipseLink is passing "DataProfile" as a value to the query, or why there are two "(t0.DOMAIN_TYPE = ?)" predicates in the statement. t0.DOMAIN_TYPE cannot be both 'DataProfile' and 'DataConstraint'; it should only be "t0.DOMAIN_TYPE = 'DataConstraint'".
    Testing the above Select statement directly on my database fails but succeeds when I remove the first "(t0.DOMAIN_TYPE = ?)" and run it.
    Questions:
    1. Could this be a bug in Eclipselink?
    2. How do I get rid of the first "(t0.DOMAIN_TYPE = ?)" in the Select statement?
    3. What am I doing wrong?
    Thanks in advance.

    Hello,
    The first question should really be: where is the 'DataProfile' DOMAIN_TYPE value coming from? I don't see a @DiscriminatorValue annotation set with this value, so do you have it set through a mapping somewhere?
    Best Regards,
    Chris

  • Is there any post 2008R2 information available on Table Valued Parameters being usable for writes?

    The last I heard on the efforts to make TVPs writable was that they were on the roadmap for the 2008 R2 release but that it didn't make the cut.  
    Srini Acharya commented in the connect item associated with this feature that...
    Allowing table valued parameters to be read/write involves quite a bit of work on the SQL Engine
    side as well as client protocols. Due to time/resource constraints as well as other priorities, we will not be able to take up this work as part of the SQL Server 2008 release. However, we have investigated this issue and have this firmly on our radar to address
    as part of the next release of SQL Server.
    I have never heard any information regarding why this was pulled from the 2008R2 release and why it wasn't implemented in either SQL Server 2012 or SQL Server 2014.  Can anyone shed any light on what's going on here and why it hasn't been enabled
    yet?  I've been champing at the bit for the better part of 6 years now to be able to move my Data Access Methodology to a more properly structured message oriented architecture using Request and Response Table Types for routing messages to and from SQL
    Server Functions and Stored Procs.    
    Please tell me that I won't have to manually build all of this out with XML for much longer.
    Note that in SQL Server 2008 table valued parameters are read only. But as you notice we actually
    require you to write READONLY. So that actually then means that at some point in the future maybe if you say please, please please often enough we might be able to actually make them writable as well at some point.
    Please, please, please, please, please!

    Can someone please explain what the complication is?  
    It makes no sense to me that you can
    1) declare a table-typed variable inside a stored procedure
    2) insert items into it
    3) return the contents of it with a select from that table variable
    but you can't say "hey, the OUTPUT parameter that was specified by the calling client points to this same variable."
    I would like to understand what is so different between
    create database [TechnetSSMSMessagingExample]
    create schema [Resources]
    create schema [Messages]
    create schema [Services]
    create type [Messages].[GetResourcesRequest] AS TABLE([Value] [varchar](max) NOT NULL)
    create type [Messages].[GetResourcesResponse] AS TABLE([Resource] [varchar](max) NOT NULL, [Creator] [varchar](max) NOT NULL,[AccessedOn] [datetime2](7) NOT NULL)
    create table [Resources].[Contrivance] ([Value] [varchar](max) NOT NULL, [CreatedBy] [varchar](max) NOT NULL) ON [PRIMARY]
    create Procedure [Services].[GetResources]
    (@request [Messages].[GetResourcesRequest] READONLY)
    AS
    DECLARE @response [Messages].[GetResourcesResponse]
    insert @response
    select [Resource].[Value] [Resource]
    ,[Resource].[CreatedBy] [Creator]
    ,GETDATE() [AccessedOn]
    from [Resources].[Contrivance] as [Resource]
    inner join @request as [request]
    on [Resource].[Value] = [request].[Value]
    select [Resource],[Creator],[AccessedOn]
    from @response
    GO
    and
    create Procedure [Services].[GetResources]
    ( @request [Messages].[GetResourcesRequest] READONLY
    ,@response [Messages].[GetResourcesResponse] OUTPUT)
    AS
    insert @response
    select [Resource].[Value] [Resource]
    ,[Resource].[CreatedBy] [Creator]
    ,GETDATE() [AccessedOn]
    from [Resources].[Contrivance] as [Resource]
    inner join @request as [request]
    on [Resource].[Value] = [request].[Value]
    GO
    that this cannot be accomplished in 7 years with 3 major releases of SQL Server.
    If you build the database that I provided (I didn't provide flow control commands, of course so they'll need to be chunked into individual executable scripts) and then 
    insert into [Resources].[Contrivance] values('Arbitrary','kalanbates')
    insert into [Resources].[Contrivance] values('FooBar','kalanbates')
    insert into [Resources].[Contrivance] values('NotInvolvedInSample','someone-else')
    GO
    DECLARE @request [Messages].[GetResourcesRequest]
    insert into @request
    VALUES ('Arbitrary')
    ,('FooBar')
    EXEC [Services].[GetResources] @request
    your execution will return a result set containing 2 rows.  
    Why can these not 'just' be pushed into a "statically typed" OUTPUT parameter rather than being returned as a loose result set that then has to be sliced and diced as a dynamic object by the calling client?

  • Stop purchase requisition being generated for third party scenario

    Hi All,
    I have a requirement that in case of third party order the Purchase Requisition should Not be created if the order has a header level Delivery block.
    How to fulfill this requirement?
    Thanks & Regards,
    Amrish Purohit

    Hi Amrish,
       I do have same requirement can you please let me know what you have done for the same.
    Your reply will be very helpful for me in this regards
    Regards
    kiran

  • Trace files generated for every session in 11g

    Hi
    I have two databases - both 11.1.0.7, both on RHEL5
    Database A runs on Server A
    Database B runs on Server B
    Both installation of 11g and each database are new installations.
    On Database A a trace file is being created for every session in ADR_HOME.../trace.
    On Database B - this is not happening
    The problem I have is with Database A. As every session connection creates a trace file (or two: *.trc and *.trm), at the end of the day we have thousands of unnecessary trace files.
    A trace file is created for every user - SYS, SYSTEM, application users, etc... It's being created immediately - even if no SQL statements are run in the session.
    I've compared the init.ora parameters running in each database - and can find no differences. btw - SQL_TRACE is set to FALSE.
    Any ideas why a trace file is being generated for every session on Database A? And how to switch this off?
    TIA
    Regards
    Paul

    What type of content is in generated trace files? Is it SQL trace or something different?
    Do you have any AFTER LOGON trigger? It can be checked with:
    col text format a100
    select name, text
      from dba_source
    where name in (select trigger_name from dba_triggers where triggering_event like 'LOGON%')
    order by name, line

  • Grey (blank) EWA reports generated for Portal systems

    I am attempting to set up EWA reports for J2EE systems. At this stage despite all efforts I continue to get grey rated (blank) reports generated for our Portal systems.
    As far as I can tell all required components are installed correctly (Wily, SMD server, SMD agents etc).
    This problem seems to be restricted to our Portal systems as report data is being generated for our XI J2EE engines with the same SMD components installed.
    CGA collection jobs are running successfully against the managed Portal systems, and I have attempted to manually generate reports via transaction DSA in the Solution Manager system, using an uploaded service data xml file generated with the SMD/services utility. This also results in a blank report.
    Our systems comprise the following components:
    Wily v7.1
    SMD 13 on Solution Manager and managed systems
    Solution Manager 4.0 SP13: (ABAP):
    SAP_ABA     700     0013     SAPKA70013
    SAP_BASIS     700     0013     SAPKB70013
    ST-PI  2005_1_700     0006     SAPKITLQI6
    PI_BASIS 2005_1_700     0013     SAPKIPYJ7D
    SAP_BW     700     0015     SAPKW70015
    SAP_AP     700     0009     SAPKNA7009
    BBPCRM     500     0009     SAPKU50009
    CPRXRPM     400     0010     SAPK-40010INCPRXRPM
    ST     400     0013     SAPKITL423
    BI_CONT     703     0008     SAPKIBIIP8
    ST-A/PI     01K_CRM560     0000          -
    ST-ICO     150_700     0011     SAPK-1507BINSTPL
    ST-SER     700_2006_2     0004     SAPKITLOK4
    Solution Manager (J2EE):
    sap.com     SAP_JTECHF     7.00 SP13 (1000.7.00.13.0.20070812015644)
    sap.com     ISAGENT     7.00 SP13 (1000.7.00.14.0.20070927052333)
    sap.com     SAP-JEE     7.00 SP13 (1000.7.00.13.0.20070812015311)
    sap.com     SAP-JEECOR     7.00 SP13 (1000.7.00.13.0.20070907082334)
    sap.com     BASETABLES     7.00 SP13 (1000.7.00.13.0.20070812013638)
    sap.com     CORE-TOOLS     7.00 SP13 (1000.7.00.13.0.20070812014121)
    sap.com     JLOGVIEW     7.00 SP13 (1000.7.00.13.0.20070812001600)
    sap.com     JSPM     7.00 SP13 (1000.7.00.13.0.20070812001700)
    sap.com     CAF     7.00 SP13 (1000.7.00.13.0.20070809092315)
    sap.com     UMEADMIN     7.00 SP13 (1000.7.00.13.0.20070809093947)
    sap.com     ADSSAP     7.00 SP13 (1000.7.00.13.0.20070812011854)
    sap.com     BI_MMR     7.00 SP13 (1000.7.00.13.0.20070812013749)
    sap.com     CAF-UM     7.00 SP13 (1000.7.00.13.0.20070809092324)
    sap.com     KM-KW_JIKS     7.00 SP13 (1000.7.00.13.0.20070812014519)
    sap.com     SAP_JTECHS     7.00 SP13 (1000.7.00.13.0.20070812015951)
    sap.com     BI_UDI     7.00 SP13 (1000.7.00.13.0.20070811190400)
    sap.com     LM-SERVICE     7.00 SP13 (1000.7.00.13.0.20070812001200)
    sap.com     LM-TOOLS     7.00 SP13 (1000.7.00.13.0.20070906104634)
    Among many, many others, the following notes and documentation have been followed:
    976054 Availability of EWA for Non ABAP components
    1010428 End-to-End Diagnostics
    762969 - Grey rating for EarlyWatch Report
    EWA guide for NON-ABAP components
    SAP Solution Manager 4.0 End-to-End Root Cause Analysis Installation Guide
    And of course all the Wily & SMD installation guides.
    Please let me know if there is any other information required. I'm posting this in the hope that someone else has had this problem and has been able to resolve it. I have read through what seems like hundreds of threads on SMD setup problems, but not many that describe exactly the problem I'm having.
    Many thanks in advance.
    Tim

    Hi Stephan,
    You're on the right track! When I try to manually generate reports for our XI systems via uploaded service data, I only get valid report data if I generate the report with graphics - otherwise the report is blank.
    Alas with the Portal systems, I get no report data either way.
    The XML storage table does contain data (with XI & Portal systems) - so comms with the J2EE is working.
    Thanks for your help.
    Tim

  • Expired stock being picked for production order staging.

    Hi all,
    While picklists are being generated for production order staging, materials that are past their expiry date are being selected and allowed to be picked and confirmed. Is there a way that such expired materials can be excluded from picking?
    Many thanks in advance.
    Kevin.

    Hi,
    As mentioned by Philippe, you can make use of a batch search strategy to avoid picking expired batches, be it for production or customer delivery.
    You can use transaction LS51 to set up batch search strategy WM01 (at warehouse level) or WM02 (warehouse / movement type). Hopefully you maintain classification data for your materials (batch class 023), where you maintain expiry duration, production information, etc. In LS51, select the appropriate batch class, maintain the batch parameter as released, and sort the strategy by shelf life expiration date.
    The storage type search strategy (T334T) has to be maintained accordingly.
    If this helps you kindly reward your points.
    thanks and regards
    Varadharajan

  • VLD-1119: Unable to generate Multi-table Insert statement for some or all t

    Hi All -
    I have a map in OWB 10.2.0.4 which is ending with following error: -
    VLD-1119: Unable to generate Multi-table Insert statement for some or all targets.
    A multi-table insert statement cannot be generated for some or all of the targets because the upstream graphs of those targets are not identical on "active operators" such as "join".
    The map is created with the following logic in mind. Let me know if you need more info. Any directions are highly appreciated, and many thanks in advance for your input:
    I have two source tables, say T1 and T2. They are full-outer-joined in a joiner, and the output of this join is passed to an expression that evaluates column values based on business logic, i.e. if T1 is available then take T1.C1, else take T2.C1, and so on.
    A flag is also evaluated in the expression, because these intermediate results need to be joined to a third source table, say T3, with a different condition.
    Based on the value taken, a flag is set in the expression; it is used in a splitter to route the results into three intermediate tables based on the flag value evaluated earlier.
    These three intermediate tables are all truncate/insert, and they are unioned to fill a final target table.
    Visually it is something like this:
    T1 --\
          Join1 (FULL OUTER) --> Expression --> SPLITTER --> JOINER1/JOINER2/JOINER3 (with T3) --> UNION --> Target Table
    T2 --/
    Please suggest.

    I verified that there is a limitation with the splitter operator which will not let you generate a multi-split having more than 999 columns in all.
    I had to use two separate splitters to achieve what I was trying to do.
    So the situation is now:
    Source -> Split -> Split 1 -> Insert into table -> Union1 --------- Final table A
    Source -> Split -> Split 2 -> Insert into table -> Union1

  • Creation of Table Maintainence Generator for Standard table

    Hi,
    I need to create a table maintenance generator for a standard SAP table.
    The end user needs to delete certain entries from the table.
    Is it viable to go for a table maintenance generator, or for a program which will delete the entries from the standard SAP table?
    Which is the best method?
    Thanks in Advance,
    Irfan Hussain

    Hi,
    But I think there is a different procedure to create a table maintenance generator for standard SAP tables than for normal Z-tables.
    I think we need to get an access key and do the modification.
    Thanks in Advance,
    Irfan Hussain

  • Generating Create Table As Statements

    Hi all - just after a bit of guidance on the following procedure:
    I have identified various columns that need their numeric values upgrading during a cutover. These are stored in the table FOR_UPGRADE which contains the columns TABLE_NAME, COLUMN_NAME, and FACTOR, (the factor is assigned depending on the object type being upgraded and is used in the calculation to create the new 'upgraded' value for that column).
    My problem is that I'm dealing with very large volumes (TBs) of data, and speed is a key factor here. As such I've been asked to upgrade these using create table statements rather than updates (I believe this was down to being able to turn almost all logging off during creates, but not updates). So the plan would be to use a CTAS selecting from the existing table, applying the upgrades based on the factor during this, then rename the upgraded tables to match the originals.
    Now I'd rather not manually create 200+ of these create table statements so I was hoping someone could advise as to how I could use a combination of FOR_UPGRADE table and ALL_TABLE_COLS to dynamically output all the statements required.
    Any help greatly appreciated.

    Write one script manually and use it as a template to create other scripts.
    The easiest way would be to loop through the FOR_UPGRADE table:
    - as soon as the table name changes, you know that the next statement starts and the previous one ends, so you should do something different than for all other rows;
    - for each column, generate its part of the statement and output it, for example using dbms_output.
    The question remains whether you really need the user/all_tab_columns view, because as I understand it you already have all the column names in the FOR_UPGRADE table.
    In case you need it (for example to check data type or whatever) you can simply join it to your base cursor.
    With one template already created and a bit of PL/SQL it should not be that hard :)
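    A minimal PL/SQL sketch of that loop, assuming FOR_UPGRADE(TABLE_NAME, COLUMN_NAME, FACTOR) as described in the question, and joining to ALL_TAB_COLUMNS to get the full select list (the _UPG suffix and the NOLOGGING clause are illustrative choices, not requirements):

    ```sql
    -- Sketch: emit one CTAS per table listed in FOR_UPGRADE,
    -- multiplying flagged columns by their factor and copying the rest.
    DECLARE
      v_cols VARCHAR2(32767);
    BEGIN
      FOR t IN (SELECT DISTINCT table_name FROM for_upgrade) LOOP
        v_cols := NULL;
        FOR c IN (SELECT atc.column_name, fu.factor
                    FROM all_tab_columns atc
                    LEFT JOIN for_upgrade fu
                      ON fu.table_name = atc.table_name
                     AND fu.column_name = atc.column_name
                   WHERE atc.table_name = t.table_name
                   ORDER BY atc.column_id) LOOP
          v_cols := v_cols || CASE WHEN v_cols IS NOT NULL THEN ', ' END
                 || CASE WHEN c.factor IS NOT NULL
                         THEN c.column_name || ' * ' || c.factor || ' AS ' || c.column_name
                         ELSE c.column_name END;
        END LOOP;
        dbms_output.put_line('CREATE TABLE ' || t.table_name || '_UPG NOLOGGING AS SELECT '
                             || v_cols || ' FROM ' || t.table_name || ';');
      END LOOP;
    END;
    /
    ```

    In a multi-schema database you would also want to filter ALL_TAB_COLUMNS by OWNER; that is omitted here for brevity.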
    Gints Plivna
    http://www.gplivna.eu

  • Confusion over DBCA script generated for manual RAC DB creation

    Version: 11.2.0.4/RHEL 6.3
    We would like to create our 3-node RAC DB manually. DBCA cannot meet our requirements because our redo log files, datafiles, tempfiles and control files are placed in a complicated manner; if we used DBCA, we would have to spend a lot of time reconfiguring to our requirements after the DB creation.
    I generated the DB creation scripts from DBCA (DB Name = BRCFPRD )
    DBCA placed the DB creation scripts in the specified directory on all three nodes.
    They all have almost the same contents; the only difference is the instance name (BRCFPRD2.sql for Node2, etc.).
    The scripts on each node include CreateDB.sql, which contains the CREATE DATABASE "BRCFPRD" statement. Why is this? The database needs to be created from only one node, so why did DBCA place CreateDB.sql on all the nodes?
    I just want to run the script from one node, say Node1, and have it create the 3-node RAC DB. How can I do this manually?
    -- The scripts generated by DBCA in Node1
    apex.sql
    BRCFPRD1.sh
    BRCFPRD1.sql
    context.sql
    CreateClustDBViews.sql
    CreateDB.sql
    CreateDBCatalog.sql
    CreateDBFiles.sql
    cwmlite.sql
    emRepository.sql
    init.ora
    interMedia.sql
    JServer.sql
    lockAccount.sql
    ordinst.sql
    owb.sql
    postDBCreation.sql
    spatial.sql
    xdb_protocol.sql
    -- The contents of the main shell script BRCFPRD1.sh
    $ cat BRCFPRD1.sh
    #!/bin/sh
    OLD_UMASK=`umask`
    umask 0027
    mkdir -p /optware/product/admin/BRCFPRD/adump
    mkdir -p /optware/product/admin/BRCFPRD/dpdump
    mkdir -p /optware/product/admin/BRCFPRD/hdump
    mkdir -p /optware/product/admin/BRCFPRD/pfile
    mkdir -p /optware/product/cfgtoollogs/dbca/BRCFPRD
    umask ${OLD_UMASK}
    ORACLE_SID=BRCFPRD1; export ORACLE_SID
    PATH=$ORACLE_HOME/bin:$PATH; export PATH
    echo You should Add this entry in the /etc/oratab: BRCFPRD:/optware/product/oracle/11.2.0:Y
    /optware/product/oracle/11.2.0/bin/sqlplus /nolog @/optware/product/BRCFPRD1.sql
    -- Contents of BRCFPRD1.sql
    $ cat BRCFPRD1.sql
    set verify off
    ACCEPT sysPassword CHAR PROMPT 'Enter new password for SYS: ' HIDE
    ACCEPT systemPassword CHAR PROMPT 'Enter new password for SYSTEM: ' HIDE
    ACCEPT sysmanPassword CHAR PROMPT 'Enter new password for SYSMAN: ' HIDE
    ACCEPT dbsnmpPassword CHAR PROMPT 'Enter new password for DBSNMP: ' HIDE
    host /optware/product/oracle/11.2.0/bin/orapwd file=/optware/product/oracle/11.2.0/dbs/orapwBRCFPRD1 force=y
    host /grid/product/11.2.0/bin/setasmgidwrap o=/optware/product/oracle/11.2.0/bin/oracle
    host /optware/product/oracle/11.2.0/bin/srvctl add database -d BRCFPRD -o /optware/product/oracle/11.2.0 -p +DATA/BRCFPRD/spfileBRCFPRD.ora -n BRCFPRD -a "DATA,ARCH_DG"
    host /optware/product/oracle/11.2.0/bin/srvctl add instance -d BRCFPRD -i BRCFPRD1 -n cimprd175
    host /optware/product/oracle/11.2.0/bin/srvctl add instance -d BRCFPRD -i BRCFPRD3 -n cimprd177
    host /optware/product/oracle/11.2.0/bin/srvctl add instance -d BRCFPRD -i BRCFPRD2 -n cimprd176
    host /optware/product/oracle/11.2.0/bin/srvctl disable database -d BRCFPRD
    @/optware/product/CreateDB.sql
    @/optware/product/CreateDBFiles.sql
    @/optware/product/CreateDBCatalog.sql
    @/optware/product/JServer.sql
    @/optware/product/context.sql
    @/optware/product/xdb_protocol.sql
    @/optware/product/ordinst.sql
    @/optware/product/interMedia.sql
    @/optware/product/cwmlite.sql
    @/optware/product/spatial.sql
    @/optware/product/emRepository.sql
    @/optware/product/apex.sql
    @/optware/product/owb.sql
    @/optware/product/CreateClustDBViews.sql
    host echo "SPFILE='+DATA/BRCFPRD/spfileBRCFPRD.ora'" > /optware/product/oracle/11.2.0/dbs/initBRCFPRD1.ora
    @/optware/product/lockAccount.sql
    @/optware/product/postDBCreation.sql
    -- Contents of CreateDB.sql in Node1
    $ cat /optware/product/CreateDB.sql
    SET VERIFY OFF
    connect "SYS"/"&&sysPassword" as SYSDBA
    set echo on
    spool /optware/product/CreateDB.log append
    startup nomount pfile="/optware/product/init.ora";
    CREATE DATABASE "BRCFPRD"
    MAXINSTANCES 32
    MAXLOGHISTORY 1
    MAXLOGFILES 192
    MAXLOGMEMBERS 3
    MAXDATAFILES 3000
    DATAFILE SIZE 700M AUTOEXTEND ON NEXT  10240K MAXSIZE UNLIMITED
    EXTENT MANAGEMENT LOCAL
    SYSAUX DATAFILE SIZE 600M AUTOEXTEND ON NEXT  10240K MAXSIZE UNLIMITED
    SMALLFILE DEFAULT TEMPORARY TABLESPACE TEMP TEMPFILE SIZE 20M AUTOEXTEND ON NEXT  640K MAXSIZE UNLIMITED
    SMALLFILE UNDO TABLESPACE "UNDOTBS1" DATAFILE SIZE 200M AUTOEXTEND ON NEXT  5120K MAXSIZE UNLIMITED
    CHARACTER SET AL32UTF8
    NATIONAL CHARACTER SET AL16UTF16
    LOGFILE GROUP 1  SIZE 28672M,
    GROUP 2  SIZE 28672M
    USER SYS IDENTIFIED BY "&&sysPassword" USER SYSTEM IDENTIFIED BY "&&systemPassword";
    set linesize 2048;
    column ctl_files NEW_VALUE ctl_files;
    select concat('control_files=''', concat(replace(value, ', ', ''','''), '''')) ctl_files from v$parameter where name ='control_files';
    host echo &ctl_files >>/optware/product/init.ora;
    spool off
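    As an aside, the final select/host pair in CreateDB.sql is just string plumbing: it reads the control_files value the CREATE DATABASE run actually set and appends it to init.ora in pfile syntax, quoting each file name individually. The rewrite it performs can be sketched like this (Python for illustration only; the sample paths are made up):

    ```python
    # Sketch of the concat/replace expression in CreateDB.sql:
    # turn v$parameter's comma-separated control_files value into
    # an init.ora entry with each file name quoted separately.
    def control_files_entry(value: str) -> str:
        return "control_files='" + value.replace(", ", "','") + "'"

    print(control_files_entry("+DATA/BRCFPRD/control01.ctl, +DATA/BRCFPRD/control02.ctl"))
    ```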

    If you look at the scripts generated on Node2 and Node3, you can see that all the calls except the instance-specific ones are commented out with REM.
    REM host /u01/product/oracle/11.2.0.3/dbhome_1/bin/srvctl add instance -d STOMPER -i STOMPER1 -n ugxtlprd186
    REM host /u01/product/oracle/11.2.0.3/dbhome_1/bin/srvctl add instance -d STOMPER -i STOMPER2 -n ugxtlprd187
    REM host /u01/product/oracle/11.2.0.3/dbhome_1/bin/srvctl disable database -d STOMPER
    REM @/u01/product/CreateDB.sql
    REM @/u01/product/CreateDBFiles.sql
    REM @/u01/product/CreateDBCatalog.sql
    REM @/u01/product/JServer.sql
    REM @/u01/product/context.sql
    <snipped >
