NVARCHAR2 table columns in server model generator

Hi All,
I am using Designer version 9.0.2.4 with Oracle 9.0.2.3 (NLS_NCHAR_CHARACTERSET = UTF8).
I have a table definition with a column of type NVARCHAR2, and I run Generate Server Model Definitions from the table.
If I run the generator a second time (or any time the table already exists in the database), the report shows a database value for the column's Max Length that is three times the value shown by DESC <table>.
My guess is that Designer reads DATA_LENGTH rather than CHAR_LENGTH from ALL_TAB_COLUMNS, or something like that (with UTF8 the byte/character ratio is 3).
Any ideas?
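To illustrate where the factor of three would come from, here is a minimal query sketch against ALL_TAB_COLUMNS (MY_TABLE is a placeholder name): DATA_LENGTH is the byte length, while CHAR_LENGTH matches what DESC reports.

SELECT column_name,
       data_type,
       data_length,   -- length in bytes; 3x the character length for NVARCHAR2 under UTF8
       char_length    -- length in characters, as shown by DESC
FROM   all_tab_columns
WHERE  table_name = 'MY_TABLE'
AND    data_type  = 'NVARCHAR2';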

I just installed Designer version 9.0.2.5, and this annoying behaviour is fixed.
Thanks to all the good people at Oracle.

Similar Messages

  • Can anyone describe how to cut and paste table columns in Data Modeler?

    A number of times I have tried the copy and paste columns functionality in Data Modeler, but every time I've copied a set of columns from a table (by highlighting the column set and pressing the Copy Columns button), opened a new table, and pressed the Paste Columns At End button, nothing has happened. Is there some unintuitive step that I need to do?

    Ok, thanks.
    I see I left "Copy" out of the first reply I sent.
    1. Create a table or tables, say tables A and B.
    2. Expand the table which has the columns you want to copy, say table A, in the browser
    3. Shift-select (or Ctrl-select) the columns and select Copy from the context menu.
    4. Click the table you want to copy to, say table B, in the diagram and right-click to use the context menu, then select Paste,
    or select table B in the browser, right-click, and select Paste.
    Are you using the Copy option in the menu?
    There was a bug in our Early Adopter release that required you to keep holding the keys, but this has been addressed; you can read this thread to see how things were handled there: Re: Data Model: set of columns

  • Trouble with opening/creating Server Model Diagrams

    I created server model diagrams by dragging and dropping specific tables from the Server Model Navigator. Is it supposed to take a LONG time
    for this step (e.g., 3 whole minutes for one table), or should it be basically "immediate"? The tables eventually made it onto the diagram, but it seemed unreasonably long!
    I saved the diagrams, quit, and came back to the repository; I can see the diagram names, click on a diagram, and verify the elements inside, and everything
    looks good -- until I try to open the diagram, at which point the Design Editor locks up every time!
    Does anybody know why I'm having either one of the above problems? Is there any setting I need to configure?
    (By the way, I'm using Designer 6i with an Oracle 8.1.7 database.)
    Thanks in advance.
    Brenda

    It could be your memory. Designer takes a lot of memory to run, as you may know; the minimum should be 256 MB. Close all other apps and try to open
    the diagram again. I have had lots of trouble with my Designer locking up as well!

  • Any techniques to speed up Server Model generation/Table to Entity retrofit?

    Hi,
    Is there any technique or method to speed up the Server Model generation / table-to-entity retrofit?
    Basically I am reverse engineering an Oracle 8i database,
    generating ERDs from existing table structures in the database.
    Designer usually takes around 12 hours to generate the Server Model and 14 hours to do the table-to-entity retrofit.
    The client machine is Windows NT 4 with 128 MB RAM and the server is Windows NT Server with 256 MB RAM.
    -Shaju B

    Hi Cherif Bouslimi,
    Thanks for your hints. I can now generate the Table API from the Design Editor's Server Model Diagram. I ran the generated SQL file, and the package specification, package body and triggers were created. However, when I try to insert data into a master table, the following errors appear:
    SQL> desc products
    Name           Null?      Type
    PRODUCT_NO     NOT NULL   VARCHAR2(6)
    DESCRIPTION    NOT NULL   VARCHAR2(30)
    UNIT_PRICE     NOT NULL   NUMBER(5,2)
    SQL> insert into products values ('1','1',1);
    insert into products values ('1','1',1)
    ERROR at line 1:
    ORA-04068: existing state of packages has been discarded
    ORA-04063: package body "POSOWNER.CG$PRODUCTS" has errors
    ORA-06508: PL/SQL: could not find program unit being called
    ORA-06512: at "POSOWNER.CG$BIS_PRODUCTS", line 4
    ORA-04088: error during execution of trigger 'POSOWNER.CG$BIS_PRODUCTS'
    Do these Table APIs have to be used with generated Forms, and do they NOT work in SQL*Plus?
    Is there any relationship betw [Long postings are being truncated to ~1 kB at this time.]
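    As a first diagnostic step (a minimal sketch, assuming you are connected as POSOWNER, the owner named in the errors above), you can look at why the Table API package body is invalid and try to recompile it:

    SELECT line, position, text
    FROM   user_errors
    WHERE  name = 'CG$PRODUCTS'
    AND    type = 'PACKAGE BODY'
    ORDER  BY sequence;

    ALTER PACKAGE cg$products COMPILE BODY;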

  • "Importing" table-comments into an existinting server model with a text file

    Hello
    I have reverse-engineered a database with the "Capture Design" tool; the server model and
    server model diagram are OK.
    Now I want to add many table and column comments to the server model.
    I wrote these comments with the "COMMENT ON TABLE ..." and "COMMENT ON COLUMN ..." commands (in uppercase)
    in a script file and tried to "import" this script file with "Capture Design" ==> "Server Model" ==> "DDL Files".
    Designer said "Processing Complete: 0 error(s), 0 warning(s)", but if I look in the server model or in the
    server model diagram, nothing has happened.
    Does anyone have an idea?
    regards
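    For reference, the statements in the script file take this form (a sketch; EMPLOYEES and LAST_NAME are placeholder names, not from the original post):

    COMMENT ON TABLE employees IS 'Employee master data';
    COMMENT ON COLUMN employees.last_name IS 'Family name of the employee';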

    Aha, we're onto something here, and I've just been reading the thread with 75+ replies that's currently on the front screen, which seems to be related though not entirely.
    OK the original filename was Renting Checklist.txt with a space, I deleted the space and renamed it Renting_Checklist.txt, made it the only attachment to a new email and sent from my ISP email account to my gmail account and it went fine.
    So that's why then but it's strange that it went when it was attached on an email with another attachment (whose filename didn't contain spaces or special characters). Does this sound tied in with the similar problems that others are having? It ought to be worthy of a fix, I can't tell when this problem started because I haven't had cause to send an email with a text attachment that had spaces in the filename.

  • Generating DDL from Server Model

    I would like to generate DDL that matches ERDs from the server model in Designer. However, when selecting an application folder, and expanding Relational Tables, I see a lot of tables I don't recognize - they aren't entities in the ERD. I don't know where they came from, and some of the ERD entities aren't showing up in the server model. I just want to generate DDL that creates only the tables shown in the ERD. Any way to do this?

    Regarding your question about more complex constraints: that is not possible in Designer.
    If you are using Headstart, you can achieve this by using CDM RuleFrame, which is a way to record all types of (complex) constraints (business rules).
    See http://www.oracle.com/technology/products/headstart/index.html for more information.

  • Data Model: Relational Tables/Columns Naming Options

    Hello,
    My design consists of several Relational models.
    1. Using Logical model properties, I defined Naming Options (Max. Name Length, Character Case, Valid Characters) applicable to all Entities, Attributes and Views.
    2. But for Relational models, Tables/Columns/Views Naming Options (Max. Name Length, Character Case, Valid Characters) can only be defined per Relational model and not for all the models.
    Question: Is it possible to define Tables/Columns/Views Naming Options (Max. Name Length, Character Case, Valid Characters) applicable to all Relational models? If yes, please inform me.
    Thanks
    Chiedu

    Andrey,
    When you select all the columns, you need to hold the Shift key to select them all. Don't release it before doing the copy. So, still holding the Shift key, right-click for the context menu. The menu has only two items, Copy and Delete. Choose Copy. Then you can go to the diagram and select Paste, to paste the columns into each of the tables you want them to appear in.
    Sue

  • Need To Create a table in Sql Server and do some culculation into the table from Oracle and Sql

    Hello All,
    I'm moving data from Oracle to SQL Server with ETL (80 tables with data) and I want to track the number of records that I move on a daily basis, so I need to create a table in SQL Server with 4 columns: Table Name, OracleRowsCount, SqlRowCount,
    and Diff (OracleRowsCount - SqlRowCount). It will tell me, for each table, how many rows I have in Oracle, how many rows I have in SQL Server after the ETL load, and the difference between them, something like this:
    Table Name    OracleRowsCount    SqlRowCount    Diff
    Customer      150                150            0
    Sales         2000               1998           2
    Devisions     5                  5              0
    (I can add a lot of SQL Tasks and variables for each table, but it does not seem logical to do that; I tried to find a way to handle this in VB but I didn't find one.)
    What's the simplest way to do it?
    Thank you
    Best Regards
    Daniel
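    A minimal sketch of such a tracking table in T-SQL (the table name RowCountAudit and the LoadDate column are assumptions, not from the original post; Diff is a computed column so it can never get out of sync):

    CREATE TABLE dbo.RowCountAudit
    (
        TableName        sysname   NOT NULL,
        OracleRowsCount  bigint    NOT NULL,
        SqlRowCount      bigint    NOT NULL,
        Diff AS (OracleRowsCount - SqlRowCount),                  -- computed difference
        LoadDate         datetime  NOT NULL DEFAULT (GETDATE())   -- one row per table per daily load
    );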

    Hi Daniel,
    According to your description, what you want is an indicator to show whether all the rows are inserted into the destination table. To achieve your goal, you can add a Row Count Transformation following the OLE DB Destination, and redirect bad rows to the Row Count Transformation. This way, we can get the count of the bad rows. Since the row count value is stored in a variable, we can create another string-type variable to retrieve the row count value from the variable used by the Row Count Transformation, and then use a Send Mail Task to send the row count value in an email message body. You can also insert the row count value into the SQL Server table through an Execute SQL Task. Then, you can check whether bad rows were generated in the package by querying this table.
    Regards,
    Mike Yin
    TechNet Community Support

  • New table/column in publication breaks replication

    Hi, 
    SQL 2008R2 
    I added a control table to a database that is being replicated to a different server. The table, called [__Updated], has one column called [DateUpdated] of type datetime2. I manually created the table on the subscriber, added the new table/column
    to the list of articles, and ran the replication.
    It falls over with the error shown below; any ideas?
    2015-02-24 16:23:34.32 [95%] Generating schema scripts for article 'AAA'
    2015-02-24 16:23:34.32 [95%] Generating schema scripts for article '__Updated'
    2015-02-24 16:23:34.33 [95%] The replication agent had encountered an exception.
    2015-02-24 16:23:34.33 Source: Unknown
    2015-02-24 16:23:34.33 Exception Type: Microsoft.SqlServer.Management.Smo.FailedOperationException
    2015-02-24 16:23:34.33 Exception Message: Script failed for Table 'dbo.__Updated'.
    2015-02-24 16:23:34.33 Message Code: Not Applicable
    2015-02-24 16:23:34.33
    Exact version is: 
    SELECT @@VERSION
    Microsoft SQL Server 2008 R2 (SP2) - 10.50.4000.0 (X64)
    Jun 28 2012 08:36:30
    Copyright (c) Microsoft Corporation
    Standard Edition (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)

    Replication is considered to be a mature technology, which means there are few changes. Oracle publishing is gone and updatable subscriptions are gone; other than that it is the same.
    Looking for a book on SQL Server 2008 Administration?
    http://www.amazon.com/Microsoft-Server-2008-Management-Administration/dp/067233044X
    Looking for a book on SQL Server 2008 Full-Text Search?
    http://www.amazon.com/Pro-Full-Text-Search-Server-2008/dp/1430215941

  • What table column size is needed to accommodate Unicode characters

    Hi guys,
    I have encountered something which I don't understand, and I hope the gurus here can shed some light on it for me.
    I am running a non-Unicode database and I decided to port the data over to a Unicode database.
    So:
    1) I exported the schema --> data.dmp
    2) then I created the Unicode database and created a user
    3) then I imported the schema into the new database
    During the import I can see that character conversion takes place.
    While importing the data into the Unicode database
    I encountered an error
    saying a column size is too small,
    so I went to check the row with the column value that is too large to fit in the table.
    I realised it has some [][][][] data, so I went to the live non-Unicode database and found the row. Indeed it has some [][][][] garbage data, which makes me think someone inserted a language other than English into the database.
    But regardless,
    I modified the column to a larger size, and now the row can be accommodated. However the data is still [][][].
    q1) Why is that? Since my database is now Unicode, this column data [][][] should have been converted to Unicode during the import, but I still cannot see what language it is.
    q2) Why did the [][][] data fit into the column on the non-Unicode database, while on the Unicode database the same column size needs to be increased?
    q3) While doing more research on Unicode, I read that a Unicode character takes up 2 bytes per character. A lot of my table data is exactly the same size as the table column size.
    E.g. Name VARCHAR2(5);
    value - 'Peter'
    Now if converting to Unicode, characters take 2 bytes instead of 1, so isn't 'PETER' going to take up 10 bytes (2 bytes per character)?
    Why is it that I can still fit the data into the table column?
    q4) Now that the Unicode database is up, I will be supporting characters from different languages around the world. How big should I set my column sizes? The longest a name can get? Or what?
    Thanks guys!

    /// Does Oracle automatically "look" at each individual character in a word and determine how many bytes it should take?
    Characters usually originate from a keyboard, which has an associated keyboard layout and an associated character set encoding (a.k.a. code page, a.k.a. encoding). This means that the keyboard driver knows that when a key with the letter "á" on it is pressed on a French keyboard, and the associated character set encoding is MS Code Page 1252 (Oracle name WE8MSWIN1252), then one byte with the value 225 is generated. If the associated character set encoding is UTF-16LE (the standard internal Windows encoding), two bytes 225 and 0 are generated. When the generated bytes travel through APIs, they may undergo character set conversions from one encoding to another. The conversion algorithms use translation tables to find out how to translate a given byte sequence from one encoding to another. In the case of translation from WE8MSWIN1252 to AL32UTF8, Oracle knows that the byte sequence resulting from conversion of the code 225 should be 195 followed by 161. For a Chinese character, for example when converting from ZHS16GBK, Oracle knows the resulting sequence as well, and this sequence is usually 3 bytes.
    This is how AL32UTF8 data gets into a database. Now, when Oracle processes a multibyte string and needs to look at individual characters, for example to count them with LENGTH or take a substring with SUBSTR, it uses the information it has about the structure of the character set. Multibyte character sets are of two types: fixed-width and variable-width. Currently, Oracle supports only one fixed-width multibyte character set in the database: AL16UTF16, which is Oracle's name for the Unicode UTF-16BE encoding. It supports this character set for NCHAR/NVARCHAR2/NCLOB data types only. This character set uses two bytes per character code. To find the next code, 2 is simply added to the string pointer.
    All other Oracle multibyte character sets are variable-width character sets, including AL32UTF8. In most cases, the length of each character code can be determined by looking at its first byte. In AL32UTF8, the number of 1-bits in the most significant positions in the first byte before the first 0-bit tells how many bytes a character has. 0 such bits means 1 byte (such codes are identical to 7-bit ASCII), 2 such bits mean two bytes, 3 bits mean 3 bytes, 4 bits mean four bytes. 1 bit (e.g. the bit sequence 10) starts each second, third or fourth byte of a code.
    In other ASCII-based multibyte character sets, the number of bytes is usually determined by the value range of the first byte. Bytes below 128 means a one-byte code, bytes above 128 begin a two- or three-byte sequence, depending on the range.
    There are also EBCDIC-based (mainframe) multibyte character sets, a.k.a. shift-sensitive character sets, where a sequence of two-byte codes is introduced by inserting the SO character (code 14=0x0e) and ended by inserting the SI character (code 15=0x0f). There are also character sets, like ISO-2022-JP, which use more complicated byte sequences to define the length and meaning of byte sequences, but Oracle supports them only in a limited number of places.
    /// E.g. I have a word with 4 characters; the 3rd character will be a Chinese character, and the rest are ASCII characters.
    /// Will Oracle use 4 bytes per character regardless of whether it is ASCII (English) or Chinese?
    No.
    /// Or will it use 1 byte per English character and then 3 bytes for the Chinese character, i.e. 6 bytes taken in total?
    It will use 6 bytes.
    Thnx,
    Sergiusz
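    As a concrete check of the byte counting explained above, the following sketch uses LENGTH (characters) versus LENGTHB (bytes), and shows character-length semantics for column sizing; names_demo is a made-up table, not from the original thread:

    SELECT LENGTH('Peter')  AS char_count,   -- 5 characters
           LENGTHB('Peter') AS byte_count    -- 5 bytes in AL32UTF8, since ASCII letters stay 1 byte each
    FROM   dual;

    -- Declaring the column with CHAR semantics sizes it in characters,
    -- so 5 characters fit regardless of how many bytes each one needs:
    CREATE TABLE names_demo (name VARCHAR2(5 CHAR));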

  • How to implement authorisation on table columns

    Can anyone suggest a smart way to use WebLogic platform capabilities to implement
    a table column security/authorisation "control", i.e. control on a column-by-column
    basis who can view or update a column? Scenario: a primary data owner "owns"
    a set of records in a database, but would like to give (or delegate) selected
    access to groups of users to view and/or update the content of certain fields
    in the recordset.
    It seems like this is probably not that uncommon a requirement, but I can't seem to
    find any design patterns for this.

    Dean Tine wrote:
    Can anyone suggest a smart way to use Weblogic platform capabilities to implement
    a table column security/authorisation "control". ie. control on a column by column
    basis who can view or update a column? Scenario - a primary data owner "owns"
    a set of records in a database, but would like to give (or delegate) selected
    access to groups of users to view and/or update the content of certain fields
    in the recordset.
    Seems like this is probably not that uncommon a requirement but can't seem to
    find any design patterns for this.
    The first question is: if you are going through an application server,
    why do you need to do anything clever at all? You can check the role of
    authenticated users and grant or deny access based on that (i.e. some
    dynamically generated SQL dependent on the role).
    If you really need support at the DBMS level, you can use SQL VIEWs.
    Create a view containing the appropriate columns and grant permissions
    to that view to the appropriate users. With an app server you will need
    multiple connection pools though and it quickly gets messy.
    Alternatively use stored procedures. Depending on your DBMS this might
    be required (if views aren't updatable) or could give better
    performance. Or worse performance.
    Robert
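    To make the VIEW approach concrete, here is a minimal sketch (EMPLOYEES, SALARY, CLERK_ROLE and HR_ROLE are hypothetical names; the column-level UPDATE grant shown works in Oracle and may differ in other DBMSs):

    -- Expose only the columns a given group may see, then grant on the view, not the base table.
    CREATE VIEW employee_public AS
    SELECT employee_id, first_name, last_name   -- salary deliberately omitted
    FROM   employees;

    GRANT SELECT ON employee_public TO clerk_role;

    -- Column-level update privilege on the base table for a more trusted group:
    GRANT UPDATE (last_name) ON employees TO hr_role;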

  • Compare two tables on different server without using Linked Servers

    I have an SSIS package that needs to compare a table to another table on another server, and then delete data from the first table only if it exists in the second table.
    I know this is possible with Linked Servers, but we are doing away with them in my company. Is there a way inside SSIS to do this?
    Oh, and I can't use a Data Flow since I don't know the table names or columns at run time (the whole package is dynamic), so I can't map the columns ahead of time.
    So no MERGE JOIN, unless it can be used outside of a Data Flow.

    You'll have to use a Data Flow Transformation; the thing is, given the constraints, your only option seems to be generating the package programmatically (using code).
    The program would be a Windows console application. It would expect a few parameters, such as the source and target tables, along with any necessary connection parameters.
    Then use the merge join (being the easiest to code) in an Execute SQL Task component.
    Examples of coding packages:
    Samples for creating SSIS packages programmatically
    PS: I can tell you that, given your circumstances, at the very minimum the impact on your company's bottom line will be brittle apps accompanied by expensive maintenance, which translates into reduced ROI from your tooling.
    Arthur My Blog
    The package is table driven and is a dynamic archival process.  The end user just fills in the framework table and the package archives data from within a certain time range.  The table can exist on any server, any database.  
    That part is done.  The part that still needs to be accomplished is to delete data out of the base table since it is archived.  I need a safe way of doing this and so thought to compare the archive table against the base table.  Because they
    can exist anywhere and can have any name I am not sure how to do this without Linked Servers.  That being said we will not have Linked Servers for this.  
    Is there another safe way to delete data out of the base table?

  • Designer 6i : Server Model Generation Error while reverse engineering

    Hi there,
    I am creating a DB model from an existing database (reverse
    engineering). When I try to capture the design of the
    Server Model, I get the errors below and the operation fails.
    --------------------------------------------- Error Message start
    Internal API error - ORA-20291: integrity constraint (FK_OTB_TO_OSP) violated - parent key not found
    ORA-06512: at "REP_USER.SDD_OTB_VAL", line 22
    ORA-04088: error during execution of trigger 'REP_USER.SDD_OTB_VAL'
    RME-00020: Internal API error - ORA-20291: integrity constraint (FK_OTB_TO_OSP) violated - parent key not found
    ORA-06512: at "REP_USER.SDD_OTB_VAL", line 22
    ORA-04088: error during execution of trigger 'REP_USER.SDD_OTB_VAL'
    CDR-01052: Tablespace HHNEWBUILD/HHNEWBUILD.USERS is referencing invalid Storage Definition (ref = 2105528996887633452799620949781640613).
    RME-00011: Operation 'ins' on TABLESPACE has failed
    RME-00222: Failed to dispatch operation to Repository
    RME-00222: Failed to dispatch operation to Repository
    --------------------------------------------- Error Message End
    My init.ora was created according to the installation guide.
    Could anyone please help me with this?
    Your earliest response would be highly appreciated.
    Thanks & regards,
    Bhavesh Doshi

    1. Bounce the Repository's DB instance. RME errors are usually an indication that this is needed.
    2. Go to the RAU and perform all of your maintenance:
    a. Compute statistics
    b. View objects and recompile all 'INVALID' objects (see the SQL sketch below)
    c. Clear temporary tables.
    3. If you are reverse engineering with DBA privileges, you are going to retrieve a lot of unnecessary objects. Be explicit
    in your retrieval.
    Any questions? Arnold Angel, [email protected]
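    The check in step 2b can also be done from SQL*Plus against the repository owner's schema; a minimal sketch (the object name in the ALTER statement is a placeholder for whatever the query lists):

    SELECT object_name, object_type
    FROM   user_objects
    WHERE  status = 'INVALID';

    -- then recompile each listed object, e.g.:
    ALTER PACKAGE some_invalid_package COMPILE BODY;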

  • BUG: Export DDL and Data fails for mixed case table/column names

    Hi there,
    I have found a bug in SQL Developer. See details below.
    Description:
    When "Export DDL and Data) function is used on a table/columns not named in UPPERCASE, sql generated by SQL Developer is invalid.
    Steps to reproduce:
    - open SQL Developer, connect to DB
    - make a table named "lowerCase" (in double quotes, so it won't be automatically changed to capital letters)
    - you may also add some columns, for example "lowerCol1", "UpCol2", ALLUPCOL3
    - add some data rows to the table
    - choose Tools -> Export DDL and Data
    - check exporting of tables and data, on "filter" tabs choose your "lowerCase" table
    - press "Apply"
    Error:
    The generated SQL contains invalid INSERTs: the mixed-case table and column names are referenced without the obligatory double quotes, which yields an error when the generated script is executed (see below; the relevant line is the INSERT statement).
    -- DDL for Table lowerCase
    CREATE TABLE "DBO_HT"."lowerCase"
    (     "lowerCol1" VARCHAR2(100),
         "UpCol2" VARCHAR2(100),
         "ALLUPCOL3" VARCHAR2(100)
    -- DATA FOR TABLE lowerCase
    -- FILTER = none used
    -- INSERTING into lowerCase
    Insert into lowerCase (lowerCol1,UpCol2,ALLUPCOL3) values ('lc','uc','auc');
    -- END DATA FOR TABLE lowerCase
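    For comparison, the same statement runs correctly once the mixed-case identifiers are quoted (a corrected version of the generated line, not SQL Developer's actual output):

    Insert into "lowerCase" ("lowerCol1","UpCol2","ALLUPCOL3") values ('lc','uc','auc');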
    Remarks
    SQL Developer: version 1.2.1, build MAIN-32.13
    Oracle DBs: 9.2 & Express
    OS: Windows 2000 Professional
    If you need any more details/testing, let me know. I'd really appreciate a quick patch for this issue...
    Alternatively, do you know of any other simple way of copying a single database (it's called a schema in Oracle, right?) from one computer to another? Possibly something as simple as detaching->copying->reattaching mdf (data) files in SQL Server... I thought that this "Export DDL & Data" function would do, but as you can see I couldn't use it.
    I just need a simple solution that works: one operation on the source to stuff everything into files, get the resulting files to the other computer, and one operation to have it running there... I think that such a scenario is very basic, yet I just can't achieve it and I am simply not allowed to spend more time on it (read: our test project fails, my company rejects my "lobbying" and stays with MSSQL :/ )
    Thanks a lot & bye

    Thanks for your reply.
    ad. 1)
    You're right. I just wanted to give some very short feedback on my experiences with SQL Developer, so I didn't think starting new threads would be necessary, but as I was writing it became much bigger than I initially planned - sorry about that. I will make proper threads as soon as possible. Having an "Edit post" button on this forum would also be useful.
    ad. 2)
    Generally, you're right - in most cases it's true that "switching DBMS is a major commitment" and "you will produce terrible code" if you don't learn the new one.
    However, I think that you miss one part of the market here - the market that I think Express is also targeted at. I'd call it a "fire&forget databases" market; MySQL comes to mind as possibly the most common solution here. It's the rather small systems, possibly web-accessed, whose data-throughput requirements are rather modest; the point is to store data at all, and not necessarily in the fastest way, because given the amount of data that is used, even on low-end hardware it will work well enough. What's important here is general ease of use - how easy it is to set up such a system, connect and access data, develop software using it, how much maintenance is needed, how easy this maintenance is, how easy the most common development tasks are, such as creating a DB, moving a DB from a test to a production server, etc. There, "how easy" directly translates to "how much time we need to set it up", which translates to "how much the development will cost".
    Considering current technology, switching the DBMS in such systems is not necessarily a major commitment, and believe me, you will not produce terrible code. In many cases it's as simple as changing a switch in your ORM toolkit: hibernate.dialect = Hibernate.Dialect.OracleDialect vs MySQLDialect vs MsSql2005Dialect
    Therefore, in some parts of the market it's easy to switch DBMS, even on a project-by-project basis. The reason to switch will appear when another DBMS makes life easier => development faster. From that point of view, I can understand my colleagues giving me an embarrassed look and saying "come on, I won't read all these docs just to have the db copied to the test server". And it doesn't mean "they are not willing to learn anything new"; it's just that they feel such a basic task should have a self-explanatory solution that doesn't require mastering any special knowledge. And if they get such simple solutions somewhere else, it costs them nothing to change the Hibernate dialect.
    I think Oracle did the great job with introducing the Express to this "fire&forget" market. The installation is a snap, it just works out of the box, nothing serious to configure, opposite to what I remember from installing and working on Oracle 9 a few years ago. In some places it's still "you need to start SQL*Plus and enter this script", but it's definitely less than before. I also find the SQL Developer a great tool, it can do most of what we need to do with the DB, it's also much better and pleasant to use over Oracle 9 tools. Still, a few basic things still require too much hassle, and I'd say taking your schema to another machine is one of them. So I think that, if you do it well, the "schema copy wizard" you mentioned might be very helpful. If I was to give any general advice for Express line of DB/tools, I'd say "make things simple" - make it "a DB you can't see".
    That's, IMHO, the way to attract more Express users.

  • Npe when using popup inside adf table column

    Hi, I have a popup which is inside an ADF table column, but when I click the button I get this NPE error:
    <af:column id="c7" headerText="Action">
                  <af:commandButton text="Add" id="cb2">
                    <af:showPopupBehavior popupId="p1" triggerType="click"/>
                  </af:commandButton>
                  <af:panelGroupLayout id="pgl2" inlineStyle="width:1042px;"
                                       layout="horizontal" valign="middle"
                                       halign="right">
                    <af:popup id="p1" contentDelivery="lazyUncached">
                      <af:dialog id="d1" type="cancel">
                        <af:region value="#{bindings.usrtaskflowdefinition1.regionModel}"
                                   id="r1"/>
                      </af:dialog>
                    </af:popup>
                    <af:commandButton text="Cancel" id="cb1" rendered="false"/>
                    <af:commandButton text="Remove" id="cb3" visible="false"/>
                  </af:panelGroupLayout>
                </af:column>
    I am getting this NPE error:
    <FacesCtrlSearchBinding> <release> ADFv: release():: Release all resources.
    <ADFLogger> <end> ADF web request
    <XmlErrorHandler> <handleError> ADF_FACES-60096:Server Exception during PPR, #1
    javax.el.ELException: java.lang.NullPointerException
        at javax.el.BeanELResolver.getValue(BeanELResolver.java:266)
        at com.sun.faces.el.DemuxCompositeELResolver._getValue(DemuxCompositeELResolver.java:173)
        at com.sun.faces.el.DemuxCompositeELResolver.getValue(DemuxCompositeELResolver.java:200)
        at com.sun.el.parser.AstValue.getValue(Unknown Source)
        at com.sun.el.parser.AstEqual.getValue(Unknown Source)
        at com.sun.el.parser.AstOr.getValue(Unknown Source)
        at com.sun.el.ValueExpressionImpl.getValue(Unknown Source)
        at org.apache.myfaces.trinidad.bean.FacesBeanImpl.getProperty(FacesBeanImpl.java:68)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.getBooleanProperty(UIXComponentBase.java:1204)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.isRendered(UIXComponentBase.java:423)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:810)
        at org.apache.myfaces.trinidad.component.UIXEditableValue.processValidators(UIXEditableValue.java:263)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildrenImpl(UIXComponentBase.java:1022)
        at oracle.adf.view.rich.component.fragment.UIXRegion.validateChildrenImpl(UIXRegion.java:634)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.validateChildren(UIXComponentBase.java:1007)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.processValidators(UIXComponentBase.java:814)
        at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl$ProcessValidationsCallback.invokeContextCallback(LifecycleImpl.java:1422)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.invokeOnNamingContainerComponent(UIXComponentBase.java:1358)
        at oracle.adf.view.rich.component.fragment.UIXRegion.invokeOnComponent(UIXRegion.java:555)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.invokeOnChildrenComponents(UIXComponentBase.java:1330)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.invokeOnComponent(UIXComponentBase.java:1424)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.invokeOnChildrenComponents(UIXComponentBase.java:1330)
        at org.apache.myfaces.trinidad.component.UIXComponentBase.invokeOnComponent(UIXComponentBase.java:1424)
        at oracle.adf.view.rich.component.rich.RichDocument.invokeOnComponent(RichDocument.java:168)
        at javax.faces.component.UIComponent.invokeOnComponent(UIComponent.java:720)
        at javax.faces.component.UIComponentBase.invokeOnComponent(UIComponentBase.java:678)
        at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:407)
        at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:194)
        at javax.faces.webapp.FacesServlet.service(FacesServlet.java:265)
        at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
        at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
        at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
        at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:205)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:106)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:446)
        at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:446)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:271)
        at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:177)
        at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.adf.library.webapp.LibraryFilter.doFilter(LibraryFilter.java:179)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
        at java.security.AccessController.doPrivileged(Native Method)
        at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
        at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
        at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
        at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
        at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
        at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
        at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
        at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
        at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
        at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
        at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
        at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
        at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused by: java.lang.NullPointerException
        at oracle.adf.model.binding.DCControlBinding.reportException(DCControlBinding.java:201)
        at oracle.jbo.uicli.binding.JUCtrlValueBinding.findAttributeDef(JUCtrlValueBinding.java:632)
        at oracle.jbo.uicli.binding.JUCtrlValueBinding.findAttributeDef(JUCtrlValueBinding.java:597)
        at oracle.jbo.uicli.binding.JUCtrlValueBinding.getAttribute(JUCtrlValueBinding.java:1341)
        at oracle.adfinternal.view.faces.model.binding.FacesCtrlSearchBinding$AdfAttributeCriterion.getOperator(FacesCtrlSearchBinding.java:2240)
        at sun.reflect.GeneratedMethodAccessor357.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at javax.el.BeanELResolver.getValue(BeanELResolver.java:261)
        ... 87 more
    <QueryCollection> <finalize> [4342] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4343] ##### QueryCollection.finl oracle.jbo.Key[Surname 0 0 ]
    <QueryCollection> <finalize> [4344] ##### QueryCollection.finl oracle.jbo.Key[Firstname 0 0 ]
    <ControllerState> <finalizeRequest> ADFc: Request number [9] for session [14gycvxp1h_] has been finalized.
    <QueryCollection> <finalize> [4345] ##### QueryCollection.finl oracle.jbo.Key[Username 0 0 ]
    <RootViewPortContextImpl> <unlockViewPortRequestLock> ADFc: Attempting to release RootViewPort request lock on 14gycvxp1h_0
    <QueryCollection> <finalize> [4346] ##### QueryCollection.finl oracle.jbo.Key[Organisationname 0 0 ]
    <QueryCollection> <finalize> [4347] ##### QueryCollection.finl oracle.jbo.Key[Surname 0 0 ]
    <RootViewPortContextImpl> <unlockViewPortRequestLock> ADFc: Successfully released RootViewPort request lock on 14gycvxp1h_0
    <QueryCollection> <finalize> [4348] ##### QueryCollection.finl oracle.jbo.Key[Firstname 0 0 ]
    <QueryCollection> <finalize> [4349] ##### QueryCollection.finl oracle.jbo.Key[Username 0 0 ]
    <Auditor> <isEnabled> IAU:Event Enabled : false, Event Type : CheckPermission, Event Status : true, Properties : null
    <QueryCollection> <finalize> [4350] ##### QueryCollection.finl oracle.jbo.Key[Organisationname 0 0 ]
    <QueryCollection> <finalize> [4351] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4352] ##### QueryCollection.finl no RowFilter
    <Auditor> <isEnabled> IAU:Event Enabled : false, Event Type : CheckPermission, Event Status : true, Properties : null
    <QueryCollection> <finalize> [4353] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4354] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4355] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4356] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4357] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4358] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4359] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4360] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4361] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4362] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4363] ##### QueryCollection.finl no RowFilter
    <QueryCollection> <finalize> [4364] ##### QueryCollection.finl no RowFilter
    <AbstractExecutionContext> <getECForJDBC> adding new ECForJDBC null to set of listeners for this context
    <WatchingDocumentChangeNotifier> <run> decide if checkUsingListeners should run. loopCnt: 0 changeInterval: 60000 originalChangeInterval: 60000 forceCheckForUpdate: false notifier instance: oracle.as.config.notification.filesystem.WatchingDocumentChangeNotifier@ca7192
    <WatchingDocumentChangeNotifier> <checkUsingListeners> BEGIN checkUsingListeners for notifier instance: oracle.as.config.notification.filesystem.WatchingDocumentChangeNotifier@ca7192
    <WatchingDocumentChangeNotifier> <checkUsingListeners> notifier processing file: C:\Users\10017134\App

    Caused by: java.lang.NullPointerException
        at oracle.adf.model.binding.DCControlBinding.reportException(DCControlBinding.java:201)
        at oracle.jbo.uicli.binding.JUCtrlValueBinding.findAttributeDef(JUCtrlValueBinding.java:632)
        at oracle.jbo.uicli.binding.JUCtrlValueBinding.findAttributeDef(JUCtrlValueBinding.java:597)
        at oracle.jbo.uicli.binding.JUCtrlValueBinding.getAttribute(JUCtrlValueBinding.java:1341)
        at oracle.adfinternal.view.faces.model.binding.FacesCtrlSearchBinding$AdfAttributeCriterion.getOperator(FacesCtrlSearchBinding.java:2240)
    It seems that the problem is caused by an af:query or quickQuery component (or the underlying view object or view criteria).
    Maybe you can drop the BTF as a static region directly onto your page and see if this runs correctly?
    Yes, I have tried that; it still does not work.
    So then this is not related to "NPE when using popup inside ADF table column".
    Dario
