No business key column is identified on existing dimension - Slow chg dimension

Hi,
I'm working to insert/update my dimension table (in the data mart) using the SSIS "Kimball Method SCD Component" tool.
I have different columns as the PK in the source, and an identity column as the PK in the dimension table.
I also have a unique index in the dimension on the columns that are the PK in the source table, yet I'm getting the error "no business key column is identified on existing dimension" from the tool.
Any suggestions?
Thanks,

The business key needs to be defined in the component itself.
Double-click the component and it will open a wizard-style dialog box.
The first tab is "Existing Dimension Input Column Definitions".
In this tab, the grid has a column "SCD Column Type" that looks like a label field, but if you click it twice it turns into a dropdown list, and from there you can define the business key for your dimension table.
HTH
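As a rough sketch of the layout the component expects (table and column names here are invented for illustration): a surrogate identity column as the dimension PK, the source's natural key carried as the business key, and a unique index covering it. The unique index alone is not enough for the component; the business key still has to be flagged in the wizard as described above.

-- Hypothetical dimension: surrogate identity PK plus the business key from the source.
CREATE TABLE dbo.DimCustomer (
    CustomerSK    INT IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_DimCustomer PRIMARY KEY,   -- surrogate key
    CustomerCode  VARCHAR(20)   NOT NULL,        -- business key (the PK of the source table)
    CustomerName  NVARCHAR(100) NOT NULL,
    EffectiveDate DATETIME      NOT NULL,
    ExpiryDate    DATETIME      NULL,
    IsCurrent     BIT           NOT NULL
);

-- For a type 2 SCD the business key alone repeats across versions,
-- so the unique index also needs the effective date.
CREATE UNIQUE INDEX UX_DimCustomer_CustomerCode
    ON dbo.DimCustomer (CustomerCode, EffectiveDate);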

Similar Messages

  • What is the best practice for inserting (unique) rows into a table with a key-column constraint when the source may contain duplicate (already existing) rows?

    My final data table contains a unique key constraint on two key columns. I insert data into this table from a daily capture table, which also contains the two columns that make up the key in the final data table, but they are not constrained
    (not unique) in the daily capture table. I don't want to insert rows from the daily capture table that already exist in the final data table (based on the two key columns). Currently, what I do is select * into a #temp table from the join
    of the daily capture and final data tables on these two key columns. Then I delete the rows in the daily capture table that match the #temp table. Then I insert the remaining rows from the daily capture table into the final data table.
    Would it be possible to simplify this process by using an INSTEAD OF trigger on the final table and just insert directly from the daily capture table? How would this look?
    What is the best practice for inserting unique (new) rows and ignoring duplicate rows (rows that already exist in both the daily capture and final data tables) in my particular operation?
    Rich P

    Please follow basic Netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. You should follow ISO-8601 rules for displaying temporal data. We need
    to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL. And you need to read and download the PDF for: 
    https://www.simple-talk.com/books/sql-books/119-sql-code-smells/
    >> My final data table contains a two key columns unique key constraint. [unh? one two-column key or two one column keys? Sure wish you posted DDL] I insert data into this table from a daily capture table (which also contains the two columns that make
    up the key in the final data table but are not constrained (not unique) in the daily capture table). <<
    Then the "capture table" is not a table at all! Remember the fist day of your RDBMS class? A table has to have a key.  You need to fix this error. What ETL tool do you use? 
    >> I don't want to insert rows from daily capture which already exists in final data table (based on the two key columns). <<
    MERGE statement; Google it. And do not use temp tables. 
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL
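    As a minimal sketch of that MERGE, assuming the two-column key is (KeyCol1, KeyCol2) — all names here are placeholders, since no DDL was posted:

    -- Hypothetical names: FinalData and DailyCapture, keyed on (KeyCol1, KeyCol2).
    -- DISTINCT guards against exact duplicate rows within the daily capture; if the same key
    -- can appear with different values, pick one row per key first, otherwise MERGE will fail.
    MERGE INTO dbo.FinalData AS tgt
    USING (SELECT DISTINCT KeyCol1, KeyCol2, SomeValue
             FROM dbo.DailyCapture) AS src
       ON tgt.KeyCol1 = src.KeyCol1
      AND tgt.KeyCol2 = src.KeyCol2
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (KeyCol1, KeyCol2, SomeValue)
        VALUES (src.KeyCol1, src.KeyCol2, src.SomeValue);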

  • ORA-20001: Secondary key column identified was not located in the select list

    Error in mru internal routine: ORA-20001: Secondary key column identified was not located in the select list of the query
    I am receiving this error when I try and update a record in an updatable report (SQL QUERY Updateable). Two of the values being updated rely on a named LOV.
    The SQL query for the report does pull in both keys (there are two keys). I am not certain why this is happening; any clues would be appreciated.
    thanks. Karen

    Sorry, I thought I had made it current. It is now.
    I am not certain whether you will be able to run it, as it is accessing our test tables. Will I need to move those tables to the workspace? I am also checking here to see if anyone has had experience with this. Thank you for your patience. I really appreciate it.
    Karen

  • Dropping business key from existing dimension

    A business key needs to be removed from an existing dimension in a data warehouse.
    I couldn't locate any documentation on this topic, so please let me know of any links to documentation if you know of any.
    I need to know whether this can be done, or whether I will need to create another dimension (with the correct keys), transfer the data,
    and then swap the dimensions in the mapping.
    Thanks for any input.

    I am sure the business key is part of the constraint with which you load the data into the table, so it is better to analyse how the change will affect the load before you make it. Either create a new dimension with the correct keys, test it with the data and then swap the data, or take the existing table, make the changes, test it in DEV/SIT, and once you are confident it does not affect the load you can move it to quality and production.
    Regards
    Bharath
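    A rough sketch of the "build new, then swap" route in plain SQL (object names invented; in practice you would also re-point and redeploy the mapping that loads the dimension):

    -- Hypothetical names. Build the replacement dimension without the dropped business key,
    -- copy the data across, validate the load in DEV/SIT, then swap by renaming.
    CREATE TABLE dim_customer_new AS
    SELECT customer_sk, customer_name, effective_date
    FROM   dim_customer;

    ALTER TABLE dim_customer_new
      ADD CONSTRAINT dim_customer_new_pk PRIMARY KEY (customer_sk);

    ALTER TABLE dim_customer     RENAME TO dim_customer_old;
    ALTER TABLE dim_customer_new RENAME TO dim_customer;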

  • How do we generate surrogate keys without using an identity column?

    Hi All,
    How do we generate surrogate keys without using an identity column?
    Regards,
    Manish

    There are various options:
    1. IDENTITY columns - simplest to implement
    2. The NEWID() / NEWSEQUENTIALID() functions (if you want to use GUID values as surrogate keys)
    3. A SEQUENCE object (SQL Server 2012 and above)
    4. Custom functions to generate the keys yourself
    A short sketch of options 2 and 3 follows below.
    This is a good article comparing the use of GUIDs against integers as surrogate keys:
    http://blog.jonathanoliver.com/integers-vs-guids-and-natural-vs-surrogate-keys/
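    A minimal sketch of options 2 and 3, assuming SQL Server 2012 or later (table, sequence, and column names are made up):

    -- Option 3: a SEQUENCE object driving the surrogate key.
    CREATE SEQUENCE dbo.CustomerKeySeq AS INT START WITH 1 INCREMENT BY 1;

    CREATE TABLE dbo.CustomerDim
    (
        CustomerSK   INT NOT NULL
            CONSTRAINT DF_CustomerDim_SK DEFAULT (NEXT VALUE FOR dbo.CustomerKeySeq)
            CONSTRAINT PK_CustomerDim PRIMARY KEY,
        CustomerCode VARCHAR(20) NOT NULL
    );

    -- Option 2: a GUID surrogate key; NEWSEQUENTIALID() can only be used as a column default.
    CREATE TABLE dbo.ProductDim
    (
        ProductSK   UNIQUEIDENTIFIER NOT NULL
            CONSTRAINT DF_ProductDim_SK DEFAULT (NEWSEQUENTIALID())
            CONSTRAINT PK_ProductDim PRIMARY KEY,
        ProductCode VARCHAR(20) NOT NULL
    );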
    Please Mark This As Answer if it solved your issue
    Please Vote This As Helpful if it helps to solve your issue
    Visakh

  • Why not allow Business Rule Design Transformer to overwrite existing BRDDs

    Hi,
    It can be quite annoying during development to continually have to remove the BRDD trigger and PL/SQL definition whenever running the BR Design Transformer reveals missing information in the BR Analysis definition. It seems that it would be practical to have at least the option of letting the Business Rule Design Transformer overwrite an existing Business Rule Design Definition.
    You could think of an additional parameter in this utility labelled "Overwrite existing BRDD?" with allowable values Y(es) and N(o), with No as the default (so that the default functionality is the same as before).
    Below you find the code that can be added to the HSU_BRTR package to realize such a functional improvement. Maybe you can consider it for a next release...?!
    best regards,
    Lucas Jellema
    AMIS Services BV
    code to add to HSU_BRTR to provide a parameter that allows the user to specify whether or not the HSU_BRTR is allowed to overwrite an existing BRDD:
    in the revision history:
    09-jan-2003 Lucas Jellema
    6.5.2.3AMIS1.1 Added new parameter and supporting code that allows user to indicate
    whether existing Business Rule Design Definitions may be/should be overwritten
    at the end of procedure install
    hsu_install.add_parameter
    ( PACKAGE_NAME -- package name
    , 40 -- sequence
    , 'Overwrite existing BRDDs?' -- prompt
    , 'N' -- default value actual
    , 'No' -- default value displayed
    , 'Y' -- mandatory
    , 'N' -- allow multi-select
    , 'N' -- include shared elements
    , '' -- element type short name
    , '' -- sql expression actual
    , '' -- sql expression displayed
    , null -- where clause
    , null -- synchronize with
    , null -- foreign key column
    , 'Choose whether you want this utility to overwrite existing Business Rule Design Definitions.' -- help text
    );
    hsu_install.add_allowable_value
    ( PACKAGE_NAME -- package name
    , 40 -- parameter
    , 1 -- sequence
    , 'Y' -- actual value
    , 'Yes' -- displayed value
    );
    hsu_install.add_allowable_value
    ( PACKAGE_NAME -- package name
    , 40 -- parameter
    , 2 -- sequence
    , 'N' -- actual value
    , 'No' -- displayed value
    );
    --------6.5.2.3AMIS1.1
    Just before procedure transform_stage2
    procedure delete_brdds
    ( p_tbl_id in ci_table_definitions.id%type
    , p_br_name in varchar2
    )
    -- Purpose  Delete existing Business Rule Design Definition
    -- Usage    from run procedure
    -- Remarks  Find PL/SQL Module Definition based on Business Rule Label
    is
    cursor c_brdd
    ( b_tbl_id in ci_table_definitions.id%type
    , b_br_name in varchar2
    )
    is
    select trg.id trg_id
    , trg.name name
    , plm.id plm_id
    from ci_database_triggers trg
    , ci_plsql_modules plm
    where trg.table_definition_reference = b_tbl_id
    and trg.plsql_module_reference = plm.id
    and plm.name = b_br_name;
    begin
    -- loop over journalling busrules of this application
    <<brdd>>
    for r_brdd in c_brdd(p_tbl_id, p_br_name) loop
    -- delete busrule (deleting the trigger is, since 6i, no longer enough, so
    -- explicitly delete the PL/SQL module as well)
    bllog.write
    ( 'Deleting trigger '||r_brdd.name||' and PL/SQL module '
    ||hsu_name.get_name_and_path(r_brdd.plm_id)
    ||', a new business rule design definition overwrites this old one.'
    , bllog.information
    );
    bltrg.del(r_brdd.trg_id);
    blplm.del(r_brdd.plm_id);
    end loop brdd;
    end delete_brdds;
    inside procedure transform_stage2
    (just before the comment: -- check if BRDD with this plm_name already exists)
    if p_overwrite_br = 'Y'
    then
    delete_brdds
    ( p_tbl_id => g_trg_tbl(i).tbl_id
    , p_br_name => l_plm_name
    );
    end if; -- find_trg and overwrite = Y
    A new parameter in procedure transform_br_fun
    procedure transform_br_fun
    , p_overwrite_br in varchar2 default 'N' -- 6.5.2.3AMIS1.1
    use the parameter p_overwrite_br in the call to transform_stage2
    transform_stage2
    ( p_fun_id => p_fun_id
    , p_fun_label => r_fun.fun_function_label
    , p_fun_short_definition => r_fun.fun_short_definition
    , p_msg_prefix => p_msg_prefix
    , p_msg_language => p_msg_language
    , p_rule_type => l_rule_type
                   , p_overwrite_br => p_overwrite_br -- 6.5.2.3AMIS1.1
    new parameter p_overwrite_br in the procedure run
    NOTE: in both PACKAGE SPECIFICATION and BODY
    , p_overwrite_br in varchar2 default 'N' -- 6.5.2.3AMIS1.1
    use the parameter p_overwrite_br in all calls to transform_br_fun inside procedure run
    transform_br_fun
    ( p_fun_id => r_fun_tree.id
    , p_msg_prefix => p_msg_prefix
    , p_msg_language => p_msg_language
    , p_create_att_usages => p_create_att_usages
    , p_overwrite_br => p_overwrite_br -- 6.5.2.3AMIS1.1

    Among the alternatives not mentioned... Using a TiVo DVR, rather than the X1; a Roamio Plus or Pro would solve both the concern over the quality of the DVR, as well as providing the MoCA bridge capability the poster so desperately wanted the X1 DVR to provide. (Although the TiVo's support only MoCA 1.1.) Just get a third-party MoCA adapter for the distant location. Why the hang-up on having a device provided by Comcast? This seems especially ironic given the opinions expressed regarding payments over time to Comcast. If a MoCA 2.0 bridge was the requirement, they don't exist outside providers. So couldn't the poster have simply requested a replacement XB3 from the local office and configured it down to only providing MoCA bridging -- and perhaps as a wireless access point? Comcast would bill him the monthly rate for the extra device, but such is the state of MoCA 2.0. Much of the OP sounds like frustration over devices providing capabilities the poster *thinks* they should have.

  • Ability to auto create a primary key column while creating an entity ?

    Product: Oracle Data Modeler (3.0.0.665, 32-bit version) on Windows XP Professional; target DB: Oracle 11gR2
    I am new to Mozilla Rhino. Is it feasible to have a mechanism whereby the primary key column name is auto-generated when the entity (name) is first created and it has no columns defined yet? For example, one could create an entity named vendor, and after the table name is typed in, a column vendor identifier, of type numeric, would be auto-created. This mechanism should work even if the entity name changes and the primary key is in the format {tablename}_id, but it should not if the primary key was manually set later and is not in the expected format.

    I created this script (1) for adding an ID column (using a specific domain) to all supertype entities that do not already have an ID attribute, and a script (2) for adding primary key constraints to all ID attributes.
    - Marc de Oliveira
    Script 1:
    entities = model.getEntitySet().toArray();
    domain = model.getDesign().getDomainSet().getByName('Integer DIS Domain');
    for (var e = 0; e < entities.length; e++) {
        entity = entities[e];
        if (entity.getHierarchicalParent() == null) {
            attributes = entity.getAttributes().toArray(); // find existing attributes
            IDattribute = null;
            for (var a = 0; a < attributes.length; a++) { // look for an ID attribute
                if (attributes[a].getName() == "ID") { IDattribute = attributes[a]; }
            }
            if (IDattribute == null) { // if an ID attribute was not found...
                IDattribute = entity.createAttribute(); // create a new ID attribute
                IDattribute.setName("ID");
                IDattribute.setDomain(domain);
                entity.moveToIndex(IDattribute, 0); // move the ID attribute to the top
            }
        }
    }
    Script 2:
    entities = model.getEntitySet().toArray();
    for (var e = 0; e < entities.length; e++) {
        entity = entities[e];
        attributes = entity.getAttributes().toArray(); // find existing attributes
        IDattribute = null;
        for (var a = 0; a < attributes.length; a++) { // look for an ID attribute
            if (attributes[a].getName() == "ID") { IDattribute = attributes[a]; }
        }
        if (IDattribute != null) { // if an ID attribute was found...
            key = entity.createCandidateKey(); // create a key
            key.add(IDattribute); // add the ID attribute to the key
            key.setPK(true); // define the key as a primary key
            PKName = entity.getName().replace(" ", "_") + "_PK";
            key.setName(PKName); // name the key
            entity.setPK(key); // workaround for bug...
        }
    }

  • Hit an exception when editing the value of a row key column in a newly created row in a table

    1. I created a view object with 2 entity objects (parent table: YARD_FIXED_SLOT - child table: YARD_FIXED_SLOT_DETAIL), and the primary key of the child table is composed of 2 columns (one of them is an FK: YardFixedSlotDetail.FIXED_SLOT_ID REFERENCES YARD_FIXED_SLOT(FIXED_SLOT_ID)).
    SQL queries:
    SELECT YardFixedSlotDetail.FIXED_SLOT_ID,
           YardFixedSlotDetail.MODIFIED_DT,
           YardFixedSlotDetail.SLOT_FROM_N,
           YardFixedSlotDetail.SLOT_TO_N,
           YardFixedSlotDetail.USER_ID,
           YardFixedSlot.BLOCK_M,
           YardFixedSlot.BLOCK_N,
           YardFixedSlot.FIXED_SLOT_ID AS FIXED_SLOT_ID1,
           YardFixedSlot.SECTION_N,
           YardFixedSlot.STATUS_C,
           YardFixedSlot.TERMINAL_C
    FROM  YARD_FIXED_SLOT_DETAIL YardFixedSlotDetail, YARD_FIXED_SLOT YardFixedSlot
    WHERE YardFixedSlotDetail.FIXED_SLOT_ID = YardFixedSlot.FIXED_SLOT_ID
    2. I dragged this view object onto a JSF page as an editable table and added an 'add' button to add a new row to the table. The handling logic in the managed bean is as follows; a new row can now be added successfully to the table.
        public void processSlotDetailCreation(ActionEvent ae) {
            DCBindingContainer bindings = (DCBindingContainer) getBindings();
            DCIteratorBinding dciter = bindings.findIteratorBinding("YardFixedSlotDetailFindAllByBlock1Iterator");
            Row row = dciter.getCurrentRow();
            // get the last row for the index and create a new row for the user to edit
            Row lastRow = dciter.getNavigatableRowIterator().last();
            YardFixedSlotDetailFindAllByBlockRowImpl newRow =
                (YardFixedSlotDetailFindAllByBlockRowImpl) dciter.getNavigatableRowIterator().createRow();
            newRow.setFixedSlotId(new Integer(21));
            newRow.setUserId("adftest");
            newRow.setModifiedDt(new Timestamp(System.currentTimeMillis()));
            // bug exists here
            newRow.setSlotFromN(new Integer(1));
            //newRow.setSlotToN(new Integer(1));
            newRow.setNewRowState(Row.STATUS_INITIALIZED);
            int lastRowIndex = dciter.getNavigatableRowIterator().getRangeIndexOf(lastRow);
            dciter.getNavigatableRowIterator().insertRowAtRangeIndex(lastRowIndex + 1, newRow);
            // make the new row the current row of the table
            dciter.setCurrentRowIndexInRange(lastRowIndex);
            dciter.setCurrentRowWithKey(newRow.getKey().toStringFormat(true));
            // the table should have its displayRow attribute set to "selected"
            // AdfFacesContext.getCurrentInstance().addPartialTarget(slotDetailsTable);
        }
    3. When filling in a new value for the SlotFromN column (note that the SlotFromN and FixedSlotId columns are the row key), I hit the exception below:
    [2013-12-04T13:04:28.866+08:00] [DefaultServer] [ERROR] [] [oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter] [tid: [ACTIVE].ExecuteThread: '14' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: <anonymous>] [ecid: eb5e281b-6b07-4c17-987e-049792c97dda-000001bf,0] [APP: YPCApp] [DSID: 0000KAvzIaA5qYWFLzmJOA1IbdqZ000003] ADF_FACES-60096:Server Exception during PPR, #7[[
    oracle.jbo.InvalidOperException: JBO-29114 ADFContext is not setup to process messages for this exception. Use the exception stack trace and error code to investigate the root cause of this exception. Root cause error code is JBO-34014. Error message parameters are {0=oracle.jbo.Key[21 null ], 1=root}
    at oracle.jbo.uicli.binding.JUCtrlHierBinding.bringNodeToRangeKeyPath(JUCtrlHierBinding.java:859)
    at oracle.adfinternal.view.faces.model.binding.FacesCtrlHierBinding.bringNodeToRangeKeyPath(FacesCtrlHierBinding.java:122)
    at oracle.adfinternal.view.faces.model.binding.RowDataManager.setRowKey(RowDataManager.java:131)
    at oracle.adfinternal.view.faces.model.binding.FacesCtrlHierBinding$FacesModel.setRowKey(FacesCtrlHierBinding.java:951)
    at org.apache.myfaces.trinidad.component.UIXCollection.setRowKey(UIXCollection.java:527)
    at org.apache.myfaces.trinidad.component.UIXTable.setRowKey(UIXTable.java:760)
    at oracle.adfinternal.view.faces.renderkit.rich.TableRendererUtils._processStampedChildrenForActiveRow(TableRendererUtils.java:2950)
    at oracle.adfinternal.view.faces.renderkit.rich.TableRendererUtils.processFacetsAndChildrenForClickToEdit(TableRendererUtils.java:1604)
    at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.processFacetsAndChildrenForClickToEdit(TableRenderer.java:352)
    at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.decodeChildren(TableRenderer.java:193)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1347)
    at org.apache.myfaces.trinidad.component.UIXCollection.processDecodes(UIXCollection.java:226)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at oracle.adf.view.rich.component.fragment.UIXRegion.decodeChildrenImpl(UIXRegion.java:605)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXForm.processDecodes(UIXForm.java:75)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildrenImpl(UIXComponentBase.java:1365)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.decodeChildren(UIXComponentBase.java:1351)
    at org.apache.myfaces.trinidad.component.UIXComponentBase.processDecodes(UIXComponentBase.java:1124)
    at javax.faces.component.UIComponentBase.processDecodes(UIComponentBase.java:1176)
    at javax.faces.component.UIViewRoot.processDecodes(UIViewRoot.java:933)
    at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl$ApplyRequestValuesCallback.invokeContextCallback(LifecycleImpl.java:1574)
    at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:416)
    at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:225)
    at javax.faces.webapp.FacesServlet.service(FacesServlet.java:593)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:280)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:254)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:136)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:341)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:25)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
    at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:192)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
    at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:106)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:478)
    at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:478)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:303)
    at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:208)
    at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
    at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:137)
    at java.security.AccessController.doPrivileged(Native Method)
    at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
    at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
    at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:120)
    at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:217)
    at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:81)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
    at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:225)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
    at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:79)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3367)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3333)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    at weblogic.servlet.provider.WlsSubjectHandle.run(WlsSubjectHandle.java:57)
    at weblogic.servlet.internal.WebAppServletContext.doSecuredExecute(WebAppServletContext.java:2220)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2146)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2124)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1564)
    at weblogic.servlet.provider.ContainerSupportProviderImpl$WlsRequestExecutor.run(ContainerSupportProviderImpl.java:254)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:295)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:254)
    4. I think the problem may be related to the row key, but I need the end user to be able to change the row key column value. Is it allowed to change the value of a column that is part of the row key? I found that this problem seems to occur only for newly created rows; for existing rows, even if I change the value of the row key column, no such problem occurs. How do I handle this situation?
    Appreciate it if anybody can help.

    Hi Bangaram,
    Thank you for your reply. 
    The error: "Root cause error code is JBO-34014. Error message parameters are {0=oracle.jbo.Key[21 null ], 1=root} "
    I didn't create master records; I just used a joined query to display information from both master and detail. I am trying to create a row in the UI table to create a new detail record, and the master record already exists.
    The row key for the newly added row in the UI rich table is [21 null]; the row key of the detail records table is composed of 2 columns. 21 is for FixedSlotId and null is for SlotFromN. When I provide a new value for the SlotFromN column in the UI rich table, the problem occurs.

  • What are the Key Column Name and Value Column Name in the JDBC adapter parameters?

    Hi
    Can anyone please tell me what the Key Column Name and Key Column Value are in the JDBC adapter parameters? If I don't mention those parameters, I get the following error:
    Value missing for mandatory configuration attribute tableEOColumnNameId
    Please help me
    Best Regards
    Ravi Shankar B

    Hi
    I am doing a database lookup in XI.
    First I created a table in the database (CheckUser) which has two fields, UserName and PhoneNumber.
    I then created one communication channel for the receiver adapter.
    I gave it the parameters like this:
    JDBC Driver : com.microsoft.jdbc.sqlserver.SQLServerDriver
    Connection : jdbc:microsoft:sqlserver://10.7.1.43:1433;DatabaseName=Ravi;
    UserName : sa
    Password : sa
    Persistence : Database
    Database Table Name : CheckUser
    Key Column Name and Value Column Name I left blank, and activated.
    I then created:
    Data Types : Source ...... UserName
                      Destination .... PhoneNumber
    Message Types
    Message Interfaces
    In Message Mapping
    I created one user-defined function, DBProcessing_SpecialAPI(). This method gets the data from the database.
    In this function I have written the following code:
        // write your code here
        String query = "";
        Channel channel = null;
        DataBaseAccessor accessor = null;
        DataBaseResult resultSet = null;
        query = "select Password from CheckUser where UserName = '" + UserName[0] + "'";
        try {
            channel = LookupService.getChannel("Ravi", "CC_JDBC");
            accessor = LookupService.getDataBaseAccessor(channel);
            resultSet = accessor.execute(query);
            for (Iterator rows = resultSet.getRows(); rows.hasNext(); ) {
                Map rowMap = (Map) rows.next();
                result.addValue((String) rowMap.get("Password"));
            }
        } catch (Exception e) {
            result.addValue(e.getMessage());
        } finally {
            try {
                if (accessor != null)
                    accessor.close();
            } catch (Exception e) {
                result.addValue(e.getMessage());
            }
        }
    And then I have mapped like this:
    UserName -> DBProcessing_SpecialAPI -> PhoneNumber
    When I test this mapping I get the following error:
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Dest_JDBC_MT xmlns:ns0="http://filetofilescenario.com/ilg"><phoneNumber>Plain exception:Problem when calling an adapter by using communication channel CC_JDBC (Party: , Service: Ravi, Object ID: c360bc139a403293afbc49d5c46e4478) Check whether the communication channel exists in the Integration Directory; also check the cache notifications for the instance Integration Server (central Adapter-Engine) Channel object with Id Channel:c360bc139a403293afbc49d5c46e4478 not available in CPA Cache.
    com.sap.aii.mapping.lookup.LookupException: Problem when calling an adapter by using communication channel CC_JDBC (Party: , Service: Ravi, Object ID: c360bc139a403293afbc49d5c46e4478) Check whether the communication channel exists in the Integration Directory; also check the cache notifications for the instance Integration Server (central Adapter-Engine) Channel object with Id Channel:c360bc139a403293afbc49d5c46e4478 not available in CPA Cache.
         at com.sap.aii.ibrun.server.lookup.AdapterProxyLocal.<init>(AdapterProxyLocal.java:61)
         at com.sap.aii.ibrun.server.lookup.SystemAccessorInternal.getProxy(SystemAccessorInternal.java:98)
         at com.sap.aii.ibrun.server.lookup.SystemAccessorInternal.<init>(SystemAccessorInternal.java:38)
         at com.sap.aii.ibrun.server.lookup.SystemAccessorHmiServer.getConnection(SystemAccessorHmiServer.java:270)
         at com.sap.aii.ibrun.server.lookup.SystemAccessorHmiServer.process(SystemAccessorHmiServer.java:70)
         at com.sap.aii.utilxi.hmis.server.HmisServiceImpl.invokeMethod(HmisServiceImpl.java:169)
         at com.sap.aii.utilxi.hmis.server.HmisServer.process(HmisServer.java:178)
         at com.sap.aii.utilxi.hmis.web.HmisServletImpl.processRequestByHmiServer(HmisServletImpl.java:296)
         at com.sap.aii.utilxi.hmis.web.HmisServletImpl.processRequestByHmiServer(HmisServletImpl.java:211)
         at com.sap.aii.utilxi.hmis.web.workers.HmisInternalClient.doWork(HmisInternalClient.java:70)
         at com.sap.aii.utilxi.hmis.web.HmisServletImpl.doWork(HmisServletImpl.java:496)
         at com.sap.aii.utilxi.hmis.web.HmisServletImpl.doPost(HmisServletImpl.java:634)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:390)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:264)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:347)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:325)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:887)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:241)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:148)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:100)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:170)
    </phoneNumber></ns0:Dest_JDBC_MT>
    In RWB I checked the status of the JDBC driver and it shows the following error:
    Value missing for mandatory configuration attribute tableEOColumnNameId
    Best Regards
    Ravi Shankar B

  • FillSchema picks too many primary key columns

    I don't know whether this is an ODP.NET error or a Microsoft error.
    Oracle9i Release 9.2.0.1.0
    ODP.NET 9.2.0.4
    .NET Framework 1.1
    Create a table with one primary key column and one unique column and name the primary key column like the unique column with the suffix "_ID":
    CREATE TABLE t_bib_uebertrag_kap(
      kapazitaet_id NUMBER(8) NOT NULL
        CONSTRAINT pk_kapa PRIMARY KEY,
      kapazitaet VARCHAR2(15) NOT NULL
        CONSTRAINT a_un_kapa UNIQUE,
      geschwindigkeit NUMBER(12) NOT NULL,
      bemerkung VARCHAR2(255)
    );
    After calling FillSchema on this table the PrimaryKey property of the DataTable contains 2 columns: KAPAZITAET_ID and KAPAZITAET.
    The same happens with other tables and similar column names
    R. Lüthke

    Tony, I'm gonna find time to try this because I was thinking it probably would be a double check.
    BUT... Oracle is written in C, and I imagine the extra check as a simple if comparison for the new value versus a constant NULL value. Kind of like checking for end of string. It's already doing things like checking datatypes and that as it inserts data, and I don't think this additional check (if it even exists) would add any significant overhead.
    But unless I find a much larger performance hit than I expect, I'm gonna stick with creating the NOT NULL's explicitly. Good data modelling trumps small performance tricks for me.
    My main worry is if/when somebody removes the primary key constraint (such as the example above where they might do a CREATE TABLE x AS SELECT * FROM y). Or if the data model gets updated to where the primary key becomes just a unique key, then the NOT NULL goes away when it shouldn't. I don't like basic table and column properties changing.
    Okay.. test done. Ran about 31,000 records from dba_tables through an insert into two tables, both with primary keys and one with explicit NOT NULLs. No measurable difference in stats detected.
    With NOT NULL and PRIMARY KEY:
    call      count    cpu  elapsed   disk  query  current   rows
    Parse         1   0.35     0.35      0      0        0      0
    Execute       1   1.88     1.83      0  24792    35887  30954
    Fetch         0   0.00     0.00      0      0        0      0
    total         2   2.24     2.18      0  24792    35887  30954
    Now with only PRIMARY KEY:
    call      count    cpu  elapsed   disk  query  current   rows
    Parse         1   0.37     0.36      0      0        0      0
    Execute       1   1.87     1.83      0  24792    35887  30954
    Fetch         0   0.00     0.00      0      0        0      0
    total         2   2.24     2.19      0  24792    35887  30954
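    For context, a rough reconstruction of the kind of setup that comparison implies (table and column names are invented; the test above pulled its rows from dba_tables):

    -- Hypothetical targets: one with explicit NOT NULLs, one relying on the primary key alone.
    CREATE TABLE t_with_notnull (
      id         NUMBER       NOT NULL CONSTRAINT t_with_notnull_pk PRIMARY KEY,
      table_name VARCHAR2(30) NOT NULL
    );

    CREATE TABLE t_pk_only (
      id         NUMBER       CONSTRAINT t_pk_only_pk PRIMARY KEY,
      table_name VARCHAR2(30)
    );

    INSERT INTO t_with_notnull (id, table_name)
    SELECT ROWNUM, table_name FROM dba_tables;

    INSERT INTO t_pk_only (id, table_name)
    SELECT ROWNUM, table_name FROM dba_tables;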

  • JPA: Attr ... mapped to a primary key column in the DB. Update not allowed.

    Let me just say that I posted a bug report for this here:
    https://glassfish.dev.java.net/issues/show_bug.cgi?id=3937
    But I'm also posting the info here, so that people who search on this forum may get some help:
    TopLink (both Essentials and 11g) has a problem with flushing entities
    containing a self-reference relationship. When flushing, the following exception
    occurs:
    Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.1 (Build b14-fcs
    (12/17/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is
    mapped to a primary key column in the database. Updates are not allowed.
    No manual updates have been made to ANY primary key.
    What I'm doing is:
    1. I instantiate a new entity.
    2. I start a transaction
    3. I persist the new entity.
    4. I read an existing entity from the DB.
    5. I let the existing entity point to the new entity via the self-reference
    relationship.
    6. I flush the persistence context.
    7. I issue commit(), and the exception occurs. (I have provided the stack traces for various versions of TopLink below.)
    This is a clear bug.
    Here are some additional observations:
    1. Reproduced on the following versions of TopLink:
    1.1. Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))
    1.2. Oracle TopLink Essentials - 2.1 (Build b14-fcs (12/17/2007))
    1.3. Oracle TopLink - 11g Technology Preview 3 (11.1.1.0.0) (Build 071214)
    2. Reproducible both on Java SE and Java EE. (I tested on Oracle Application Server)
    3. Reproducible with and without class weaving
    4. Reproducible regardless of whether the JPA annotations are on fields or on
    methods
    5. Reproducible regardless of whether "cascade={CascadeType.PERSIST}" is used or
    not.
    6. Reproducible regardless of the fetch type of the self-reference relationship
    (EAGER or LAZY).
    Also:
    1. Without flushing, the bug doesn't occur. That is, if I commit without
    flushing, it works.
    2. Without setting the self-reference relationship, the bug doesn't occur.
    This is an issue that appears when using BOTH self-reference relationship AND
    flushing.
    Best regards,
    Bisser
    The message was edited by bisser:
    Added info that the exception occurs "when I issue commit()" on step 7.

    I'm extremely surprised that you couldn't reproduce the error. It is reproduced every time I run the test scenario that I described above.
    You can download a sample Eclipse project that reproduces the error from here: https://glassfish.dev.java.net/issues/show_bug.cgi?id=3937
    For the log below I used TopLink, version: Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007)).
    Could you please tell me which version you use, and I will try the test case on it?
    Here's the FINEST log:
    [TopLink Finest]: 2008.01.09 07:35:58.094--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.weaving; value=false
    [TopLink Finest]: 2008.01.09 07:35:59.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.orm.throw.exceptions; default value=true
    [TopLink Finer]: 2008.01.09 07:35:59.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--Searching for default mapping file in file:/D:/dev/bull/jpa_pk_bug/bin/
    [TopLink Config]: 2008.01.09 07:35:59.547--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The alias name for the entity class [class test.jpa.entities.Person] is being defaulted to: Person.
    [TopLink Config]: 2008.01.09 07:35:59.594--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The column name for element [private java.lang.Long test.jpa.entities.Person.id] is being defaulted to: ID.
    [TopLink Config]: 2008.01.09 07:35:59.609--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The column name for element [private java.lang.String test.jpa.entities.Person.name] is being defaulted to: NAME.
    [TopLink Config]: 2008.01.09 07:35:59.641--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The target entity (reference) class for the many to one mapping element [test.jpa.entities.Person test.jpa.entities.Person.mgr] is being defaulted to: class test.jpa.entities.Person.
    [TopLink Config]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The primary key column name for the mapping element [test.jpa.entities.Person test.jpa.entities.Person.mgr] is being defaulted to: ID.
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end predeploying Persistence Unit Test; state Predeployed; factoryCount 0
    [TopLink Finer]: 2008.01.09 07:35:59.703--Thread(Thread[Main Thread,5,main])--cmp_init_transformer_is_null
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin predeploying Persistence Unit Test; state Predeployed; factoryCount 0
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end predeploying Persistence Unit Test; state Predeployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:35:59.719--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin deploying Persistence Unit Test; state Predeployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:35:59.734--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.logging.level; value=FINEST; translated value=FINEST
    [TopLink Finest]: 2008.01.09 07:35:59.734--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.logging.level; value=FINEST; translated value=FINEST
    [TopLink Finest]: 2008.01.09 07:35:59.750--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.user; value=rms
    [TopLink Finest]: 2008.01.09 07:35:59.750--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.password; value=xxxxxx
    [TopLink Finest]: 2008.01.09 07:36:00.766--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.target-database; value=Oracle; translated value=oracle.toplink.essentials.platform.database.oracle.OraclePlatform
    [TopLink Finest]: 2008.01.09 07:36:00.781--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.driver; value=oracle.jdbc.OracleDriver
    [TopLink Finest]: 2008.01.09 07:36:00.781--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.url; value=jdbc:oracle:thin:@//10.20.6.126:1521/region2
    [TopLink Info]: 2008.01.09 07:36:00.797--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--TopLink, version: Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))
    [TopLink Config]: 2008.01.09 07:36:00.812--ServerSession(1968077)--Connection(5182312)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.797--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.797--ServerSession(1968077)--Connection(5744890)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.875--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.891--ServerSession(1968077)--Connection(5760373)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.969--ServerSession(1968077)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.969--ServerSession(1968077)--Connection(5497095)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.047--ServerSession(1968077)--Connection(5500006)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.047--ServerSession(1968077)--Connection(5512041)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.125--ServerSession(1968077)--Connection(5514977)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.125--ServerSession(1968077)--Connection(5527528)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.203--ServerSession(1968077)--Connection(5530440)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.203--ServerSession(1968077)--Connection(5542993)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.281--ServerSession(1968077)--Connection(5545904)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Finest]: 2008.01.09 07:36:02.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing connected, state is Preallocation_NoTransaction_State
    [TopLink Info]: 2008.01.09 07:36:02.484--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--file:/D:/dev/bull/jpa_pk_bug/bin/-Test login successful
    [TopLink Finest]: 2008.01.09 07:36:02.484--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end deploying Persistence Unit Test; state Deployed; factoryCount 1
    [TopLink Finer]: 2008.01.09 07:36:02.516--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--client acquired
    [TopLink Finest]: 2008.01.09 07:36:02.531--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query DoesExistQuery()
    [TopLink Finest]: 2008.01.09 07:36:02.547--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--PERSIST operation called on: test.jpa.entities.Person@563c8c.
    [TopLink Finest]: 2008.01.09 07:36:02.562--ClientSession(5666151)--Thread(Thread[Main Thread,5,main])--Execute query ValueReadQuery()
    [TopLink Fine]: 2008.01.09 07:36:02.594--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--SELECT PERSONS_ID_SEQ.NEXTVAL FROM DUAL
    [TopLink Finest]: 2008.01.09 07:36:03.297--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing preallocation for PERSONS_ID_SEQ: objects: 1 , first: 5, last: 5
    [TopLink Finest]: 2008.01.09 07:36:03.312--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--assign sequence to the object (5 -> test.jpa.entities.Person@563c8c)
    [TopLink Finest]: 2008.01.09 07:36:03.328--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query ReadObjectQuery(test.jpa.entities.Person)
    [TopLink Fine]: 2008.01.09 07:36:03.438--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--SELECT ID, NAME, MGR_ID FROM Persons WHERE (ID = ?)
         bind => [1]
    [TopLink Finest]: 2008.01.09 07:36:03.531--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Register the existing object test.jpa.entities.Person@3a4484
    [TopLink Finer]: 2008.01.09 07:36:03.625--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--begin transaction
    [TopLink Finest]: 2008.01.09 07:36:03.625--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query UpdateObjectQuery(test.jpa.entities.Person@3a57fa)
    [TopLink Finest]: 2008.01.09 07:36:03.641--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query WriteObjectQuery(test.jpa.entities.Person@563c8c)
    [TopLink Fine]: 2008.01.09 07:36:03.656--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--INSERT INTO Persons (ID, NAME, MGR_ID) VALUES (?, ?, ?)
         bind => [5, Boss, null]
    [TopLink Fine]: 2008.01.09 07:36:03.688--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--UPDATE Persons SET MGR_ID = ? WHERE (ID = ?)
         bind => [5, 1]
    [TopLink Finer]: 2008.01.09 07:36:03.703--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--begin unit of work commit
    [TopLink Finest]: 2008.01.09 07:36:03.703--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query UpdateObjectQuery(test.jpa.entities.Person@563c8c)
    [TopLink Warning]: 2008.01.09 07:36:03.812--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Local Exception Stack:
    Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.exceptions.ValidationException.primaryKeyUpdateDisallowed(ValidationException.java:2222)
         at oracle.toplink.essentials.mappings.foundation.AbstractDirectMapping.writeFromObjectIntoRowWithChangeRecord(AbstractDirectMapping.java:750)
         at oracle.toplink.essentials.internal.descriptors.ObjectBuilder.buildRowForUpdateWithChangeSet(ObjectBuilder.java:948)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.updateObjectForWriteWithChangeSet(DatabaseQueryMechanism.java:1263)
         at oracle.toplink.essentials.queryframework.UpdateObjectQuery.executeCommitWithChangeSet(UpdateObjectQuery.java:91)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:390)
         at oracle.toplink.essentials.queryframework.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:109)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.execute(DatabaseQuery.java:628)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:555)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:138)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:110)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2233)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:952)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:909)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitChangedObjectsForClassWithChangeSet(CommitManager.java:309)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:195)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:2657)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1044)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitToDatabase(RepeatableWriteUnitOfWork.java:403)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithChangeSet(UnitOfWorkImpl.java:1126)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:107)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:856)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:102)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:60)
         at test.jpa.TestPkBug.runTest(TestPkBug.java:53)
         at test.jpa.TestPkBug.main(TestPkBug.java:95)
    [TopLink Finer]: 2008.01.09 07:36:03.828--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--rollback transaction
    [TopLink Finer]: 2008.01.09 07:36:03.844--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--release unit of work
    [TopLink Finer]: 2008.01.09 07:36:03.844--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--initialize identitymaps
    [TopLink Finer]: 2008.01.09 07:36:03.844--ClientSession(5666151)--Thread(Thread[Main Thread,5,main])--client released
    [TopLink Finest]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin undeploying Persistence Unit Test; state Deployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing disconnected
    [TopLink Config]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Finer]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--initialize identitymaps
    [TopLink Info]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--file:/D:/dev/bull/jpa_pk_bug/bin/-Test logout successful
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5182312)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5500006)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5514977)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5530440)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5545904)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.891--ServerSession(1968077)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Finest]: 2008.01.09 07:36:03.891--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end undeploying Persistence Unit Test; state Undeployed; factoryCount 0
    Exception in thread "Main Thread" javax.persistence.RollbackException: Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:120)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:60)
         at test.jpa.TestPkBug.runTest(TestPkBug.java:53)
         at test.jpa.TestPkBug.main(TestPkBug.java:95)
    Caused by: Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.exceptions.ValidationException.primaryKeyUpdateDisallowed(ValidationException.java:2222)
         at oracle.toplink.essentials.mappings.foundation.AbstractDirectMapping.writeFromObjectIntoRowWithChangeRecord(AbstractDirectMapping.java:750)
         at oracle.toplink.essentials.internal.descriptors.ObjectBuilder.buildRowForUpdateWithChangeSet(ObjectBuilder.java:948)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.updateObjectForWriteWithChangeSet(DatabaseQueryMechanism.java:1263)
         at oracle.toplink.essentials.queryframework.UpdateObjectQuery.executeCommitWithChangeSet(UpdateObjectQuery.java:91)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:390)
         at oracle.toplink.essentials.queryframework.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:109)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.execute(DatabaseQuery.java:628)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:555)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:138)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:110)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2233)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:952)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:909)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitChangedObjectsForClassWithChangeSet(CommitManager.java:309)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:195)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:2657)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1044)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitToDatabase(RepeatableWriteUnitOfWork.java:403)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithChangeSet(UnitOfWorkImpl.java:1126)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:107)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:856)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:102)
          ... 3 more
    EDIT: Are you using the EXACT Test Case as I have described it in the previous posts? It's important that you commit(), and not rollback(), the transaction after the flush.
    EDIT: Updated the log because I found out that I had made a small change to the original Test Case while I was trying to find a workaround. The current log is produced by the EXACT Test Case I described in my previous posts.
    Message was edited by: bisser
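    For reference, here is a minimal, hedged sketch of the commit-after-flush pattern the EDIT above refers to. It assumes the test.jpa.entities.Person entity from the log, with an id primary key, a name, and a self-referencing manager mapped to the MGR_ID column (as the logged INSERT/UPDATE statements suggest); the setter names, the id type, and the persistence unit name "Test" are illustrative assumptions, not the exact original test case.

        import javax.persistence.EntityManager;
        import javax.persistence.EntityManagerFactory;
        import javax.persistence.Persistence;

        import test.jpa.entities.Person; // the entity class named in the log above

        public class TestPkBugSketch {
            public static void main(String[] args) {
                // "Test" is the persistence unit name that appears in the log.
                EntityManagerFactory emf = Persistence.createEntityManagerFactory("Test");
                EntityManager em = emf.createEntityManager();

                em.getTransaction().begin();

                // Load an existing Person (id 1 in the logged UPDATE) and give it a new manager.
                // The id type (Integer) is an assumption.
                Person employee = em.find(Person.class, 1);
                Person boss = new Person();      // new row -> the logged INSERT INTO Persons
                boss.setName("Boss");            // assumed setter; "Boss" matches the logged bind value
                em.persist(boss);
                employee.setManager(boss);       // assumed setter; populates MGR_ID on the existing row

                // flush() pushes the INSERT and the MGR_ID UPDATE shown in the log to the database...
                em.flush();

                // ...and, per the EDIT, the transaction must then be committed (not rolled back)
                // for the TOPLINK-7251 ValidationException to appear on commit.
                em.getTransaction().commit();

                em.close();
                emf.close();
            }
        }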

  • Is it mandatory to assign a language key to all the words that exist in the system

    Hi experts, please clarify my doubt.
    Current Scenario: We are in the Scan phase of the Combined Upgrade and Unicode Conversion.
    Our questions:
    1.      Is it mandatory to assign a language key to all the words that exist in the system? When we tried without assigning a language key, we were still able to proceed to the next step.
    2.      Now that the upgrade has started, please confirm whether it is necessary to assign a language key to all the words without a language assignment before the export, or whether SAP treats SPACE as a language character. This question arises because in MDMP systems there is one column with a space.
    3.      For words with mixed characters, which language should be assigned? Example of mixed characters: “a#p//e”.
    4.      How are such words generated? Usually words come from table CDPOS, where the table-key field can also hold binary values; when the system does not recognize the characters in the input, an ambiguous word is generated and stored in the vocabulary list. So which language should be assigned to such a word?
    There are 21,000 such words without a language assignment in the vocabulary.

    Hello:
    1. It is mandatory if you want everything to be done correctly.
    2. You have to run all the scans for the MDMP conversion.
    3. I assigned German (the original language).
    4. I had 21,195 in my case, which is normal; the total number of tables is 27,265.
    Please confirm the message and award points (you can contact me by email).
    Regards,
    Alfredo.

  • Is it mandatory to assign a language key to all the words that exist in the system

    Current Scenario: We are in the Scan phase of the Combined Upgrade and Unicode Conversion.
    Our questions:
    1.    Is it mandatory to assign a language key to all the words that exist in the system? When we tried without assigning a language key, we were still able to proceed to the next step.
    2.    Now that the upgrade has started, please confirm whether it is necessary to assign a language key to all the words without a language assignment before the export, or whether SAP treats SPACE as a language character. This question arises because in MDMP systems there is one column with a space.
    3.    For words with mixed characters, which language should be assigned? Example of mixed characters: “a#p//e”.
    4.    How are such words generated? Usually words come from table CDPOS, where the table-key field can also hold binary values; when the system does not recognize the characters in the input, an ambiguous word is generated and stored in the vocabulary list. So which language should be assigned to such a word?
    There are 21,000 such words without a language assignment in the vocabulary.
    Thanks,
    Vinay

    Hello Vinay:
    I'm doing the same project, CU&UC.
    I'm now in the PREPARE phase on the source system (all of the additional steps have been done), and I believe it is mandatory to assign the language key.
    As for the other questions, I'm still investigating too.
    Alfredo.

  • Key value does not currently exist in the primary key table error

    I'm testing out a form and creating a new insert. Once all the fields are filled in, it won't allow me to move on and comes up with this error message:
    key value does not currently exist in the primary key table
    Is this because the form contains foreign keys that reference other tables whose fields are not displayed on this form?
    If so, how do I link the other forms together so that when I create a new record in this form it automatically creates the primary key entry in the other table, allowing the insert of the referenced foreign key in this form?
    Does this make sense?

    It seems that the table into which you're inserting has a foreign key.
    Take the emp and dept table example in the scott/tiger schema.
    Say we have deptno 10, 20, 30, 40 in the dept table.
    We make a form for the emp table and insert through this form.
    Now, we are limited to inserting deptno 10, 20, 30, or 40 through this emp form. The form will not allow us to insert deptno 50, as it does not exist in the primary key column of the dept table.
    Obviously, if a department does not exist in the company, it does not make sense to assign employees to that department.
    So for this purpose we first need to add the department to the dept table; then we can insert rows into the emp table for that department.
    For this, I suggest putting a button on the emp form that invokes the dept form, so you can add the new department there and then insert into the emp table.
    The same scenario applies in your case.
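    To make the dept/emp relationship above concrete, here is a small, hedged JDBC sketch against the classic scott/tiger schema. The connection URL, the credentials, and the sample values (deptno 50, empno 9999, 'NEWGUY') are illustrative assumptions; the point is only that the child insert fails until the parent row exists.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;
        import java.sql.Statement;

        public class DeptEmpForeignKeyDemo {
            public static void main(String[] args) throws SQLException {
                // Placeholder connection details -- point these at your own database.
                try (Connection con = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//localhost:1521/XE", "scott", "tiger");
                     Statement st = con.createStatement()) {

                    // Rejected while deptno 50 has no row in dept: the foreign key on
                    // emp.deptno raises the same "parent key not found" kind of error
                    // that the form reports.
                    try {
                        st.executeUpdate(
                            "INSERT INTO emp (empno, ename, deptno) VALUES (9999, 'NEWGUY', 50)");
                    } catch (SQLException e) {
                        System.out.println("Child insert rejected: " + e.getMessage());
                    }

                    // Insert the parent (dept) row first; then the child (emp) insert succeeds.
                    st.executeUpdate(
                        "INSERT INTO dept (deptno, dname, loc) VALUES (50, 'IT', 'DALLAS')");
                    st.executeUpdate(
                        "INSERT INTO emp (empno, ename, deptno) VALUES (9999, 'NEWGUY', 50)");
                }
            }
        }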

  • Process ADD(increment) using SSIS for a dimension with two Key columns

    Hi,
    I'm trying to add data to an existing dimension using ProcessAdd (incremental processing) from an SSIS data flow. The dimension has two key columns (ID and Timestamp). When I do a full process for a particular day, the dimension processes fine, but when I try to add data for the next day using ProcessAdd from SSIS I get an error.
    And yes, I will have the same IDs from the first day on the next day as well, but with a different Timestamp, so I have set "Key not found" to "Ignore errors". But I still see the above error. Below is my DSV.
    Can anyone help me with how to approach this?

    Hi K.Kalyan,
    According to your description, you get the above error when executing the ProcessAdd data flow task. Right?
    In this scenario, since you have the same ID differing only by Timestamp, it can cause a duplicate key issue, which throws this error. Please change the "Duplicate Key" error action to "Report and continue" (recommended) or "Ignore error" to resolve this issue.
    Reference:
    Dimension processing failing (Incremental)
    FIX: "Internal error: An unexpected error occurred" when you run a "Process" command against the TFS SSAS cube
    Best Regards, 
    Simon Hou
    TechNet Community Support
