Attribute sorting

Hi,
If I read in an XML document and subsequently interrogate one of the Element nodes to discover its attribute list, is there any way of preserving the order in which the attributes are specified in the DTD?
At the moment the attributes seem to be stored in the order they are specified, provided that they have a value.
Any attributes without a specified value appear to be stored after those with a value.

Yes, that is what I mean. I want to do that because the data is updatable from a GUI and it would be nice to have associated attributes displayed close together.
An example (probably a poor one ;-) ) would be an address element with housename, street, city and county attributes. It would look silly if the GUI asked for the attribute data in (city, street, county, housename) order. I realise that I can code the order into the GUI, but it is a dynamic, flexible GUI and it would be handy if there were some way of sorting the attributes in the Document API. From your answer I can see that I have to do it myself. Drat ;-(
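
Assuming the standard Java DOM (org.w3c.dom) API, the NamedNodeMap returned for an element makes no ordering guarantee, so the ordering does indeed have to be imposed in your own code. A minimal sketch of "doing it yourself" (the class name and the hard-coded preferred order are just illustrative; the order could equally be read from the DTD or a small config file):

    import java.util.ArrayList;
    import java.util.List;
    import org.w3c.dom.Element;
    import org.w3c.dom.NamedNodeMap;
    import org.w3c.dom.Node;

    // Returns an element's attribute nodes in a caller-supplied preferred order,
    // with any attributes not named in that order appended at the end.
    public class AttributeOrderer {

        public static List orderedAttributes(Element element, String[] preferredOrder) {
            NamedNodeMap map = element.getAttributes(); // DOM does not guarantee any order here
            List remaining = new ArrayList();
            for (int i = 0; i < map.getLength(); i++) {
                remaining.add(map.item(i));
            }
            List ordered = new ArrayList();
            for (int i = 0; i < preferredOrder.length; i++) {
                Node attr = map.getNamedItem(preferredOrder[i]); // pick attributes in the desired order first
                if (attr != null) {
                    ordered.add(attr);
                    remaining.remove(attr);
                }
            }
            ordered.addAll(remaining); // anything not listed goes last
            return ordered;
        }
    }

For the address example, calling orderedAttributes(addressElement, new String[] {"housename", "street", "city", "county"}) would hand the GUI the attributes in the intended order regardless of how the parser stored them.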

Similar Messages

  • Report Attribute - Column Attribute - Sort

    I have a SQL Query (PL/SQL function body returning SQL Query) report. I have the number of rows returned limited to 100 (Report Attributes - Layout and Pagination - Max Row Count). When I run this report without any column sorts (none of the Column Attribute Sort columns are checked) it runs fast. If I set either the Sort Sequence, or check any one of the Sort columns, the report takes forever to run. This is with only 100 rows on the report.
    Where is the sort being performed - within Oracle (meaning, in the database), in APEX, or on the client?
    Note that there is no "order by" nor "group by" contained within the SQL - just a simple two-table join (an inner join).

    Thank You, Thank You, Thank You.
    That is indeed the point I am trying to make.
    My report is a SQL Query (PL/SQL Function body returning SQL Query) and is based on information provided in several PopUp LOVs. The more LOVs used, the more information provided to the WHERE CLAUSE, and therefore the less data returned to the screen.
    When the End User produces a small report, they are likely to want to sort it.
    But when they bring back huge amounts of data, they don't want it sorted. Especially not automatically sorted.
    In most cases, this is happening when we are joining two or three tables with say 10 million rows in each table. The join condition is good (meaning there is no Cartesian product) but the resultant data is 10 million rows - something you are not likely to want to sort. Even with a limit on the number of rows returned (Report Attributes - Layout and Pagination - Max Row Count) set to something reasonable, like 500, the 10 million resultant rows are first being sorted and then the first 500 returned.
    So, yes, a "sort-enabled" report should NOT automatically sort - yet that is what is happening.
    Of course, it might be a good thing to have some sort of Variable that we could set on a page to decide if we want automatic sorting. On most pages, an automatic sort is great - as the amount of data returned will always be small.
    On the pages where I have this problem, it would be great to do the automatic sort when one or more of the PopUp LOVs are populated. So, within an "After Submit" process I might want to turn automatic sorting on.
    Hey, if we are going to ask for an enhancement/change, we might as well ask for something really flexible.

  • Report not initially sorted as defined by Report Attributes "Sort Sequence"

    I have a Report Region with Type SQL Query and Source "SELECT * FROM <table>" where <table> has a primary key from a sequence. Under Report Attributes, I have Report Column "ST_NM" with Show and Sort checked and having a Sort Sequence of "1". I assume this is to set the initial display sequence but regardless, the report rows display in primary key order initially.
    To try it: http://apex.oracle.com/pls/apex/f?p=21997:2 with Dever/Ima9Dever
    1) How do I set the initial display sequence to be other than the primary key sequence?
    [Note: It seems to work part (or all?) of the time under our 4.1 implementation.]
    Thanks,
    Howard
    I thought this one would be easy!

    Howard (DBA in Training) wrote:
    I have a Report Region with Type SQL Query and Source "SELECT * FROM <table>" where <table> has a primary key from a sequence. Under Report Attributes, I have Report Column "ST_NM" with Show and Sort checked and having a Sort Sequence of "1". I assume this is to set the initial display sequence but regardless, the report rows display in primary key order initially.
    To try it: http://apex.oracle.com/pls/apex/f?p=21997:2 with Dever/Ima9Dever
    1) How do I set the initial display sequence to be other than the primary key sequence?
    [Note: It seems to work part (or all?) of the time under our 4.1 implementation.]
    Standard reports store sort columns persistently across sessions as user preferences. It's likely that the Dever user has at some point clicked to sort on the PK column and created a persistent preference. Try it with a new user that has never viewed the page before...
    See +{thread:id=2433320}+ for more on this, including how to reset it.

  • Attribute sort of characteristic

    Is it possible to create a sort, or custom sort, of a characteristic's attributes within Analysis? We are seeing that the sequence between BEx and Analysis characteristics is very different. We have a requirement that these be the same for a few reports.
    Can characteristic sorting be forced from BEx into Analysis? Note: in BW we are getting the correct sequencing/sorting of attributes.

    I think the description is clear; but I don't think this is possible with Analysis Office
    If you want this functionality, I recommend turning the attribute characteristic into a navigation one on the backend of BW - of course this would require some BW development

  • How to change report attributes like heading, sort etc programmatically

    Some of the columns of a classic report using PL/SQL as the region source should be sortable. Is it possible to change this attribute (Sort = Yes) using JavaScript or the APEX API?
    David
    Edited by: david on Sep 6, 2011 11:56 PM

    There is something I don't understand here. Why do you need to point at the xsd? What is changing? Like I mentioned before, if the contents of the xsd change, all the report will do is - at best - drop the changed fields, so you will be missing data. Perhaps some of the following will help:
    [Crystal Reports Guide To ADO.NET|http://www.sdn.sap.com/irj/boc/go/portal/prtroot/docs/library/uuid/401c4455-a31d-2b10-ae96-fa57af5aec20?quicklink=index&overridelayout=true]
    [Crystal Reports For Visual Studio .NET Reporting Off ADO.NET Datasets|http://www.sdn.sap.com/irj/boc/go/portal/prtroot/docs/library/uuid/2091d0c3-da1d-2b10-22be-a3426b183f75?quicklink=index&overridelayout=true]
    [Crystal Reports For Visual Studio 2005 Walkthroughs|http://www.sdn.sap.com/irj/boc/go/portal/prtroot/docs/library/uuid/2081b4d9-6864-2b10-f49d-918baefc7a23?quicklink=index&overridelayout=true] (page 332 on).
    Ludek

  • Sorting folders base on extended attribute?

    I created a subclass of Folder and added a custom attribute 'SORT'. I added several folders with the 'SORT' value set. Now I want to retrieve these folders sorted by the 'SORT' attribute. Below is my code; note that if I put 'NAME' as my SortQualifier value the sort works, but using 'SORT' does not work - it returns 0 results.
    How can I specify the sort order in which I want my folders to appear when the order is not based on the Folder class attributes? If it is not possible with subclassing, what about other options?
    Vector liblist = new Vector();
    Folder po = null;
    PublicObject[] libraries = null;
    SortSpecification sort = new SortSpecification();
    SortQualifier sq = new SortQualifier("SORT", true); // <<< note custom attr.
    sort.addSortQualifier(sq);
    String rootDir = m_Root + curloc;
    po = (Folder) getPublicObject(session, rootDir);
    po.setSortSpecification(sort);
    libraries = po.getItems();
    for (int i = 0; i < libraries.length; i++) {
        liblist.addElement(libraries[i].getName());
    }

    I don't know of any functionality to do that in Bridge now; however, it certainly can be done via scripting.
    Perhaps one of our forum contributors might want to take on a "picture sorter" script...
    Bob
    Adobe Workflow Scripting
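
    If the server-side SortSpecification cannot be made to honour the subclass attribute, one workaround is to sort on the client after fetching the items. A rough sketch using java.util.TreeMap, continuing from the snippet in the question; getName() is only a stand-in for whatever call reads the custom 'SORT' attribute on the subclass, and duplicate sort values would collide in this simple version:

    // Client-side fallback: key each item by the value you want to sort on and let a
    // TreeMap order them, then walk the map in key order.
    TreeMap sorted = new TreeMap();
    for (int i = 0; i < libraries.length; i++) {
        sorted.put(libraries[i].getName(), libraries[i]); // stand-in for the 'SORT' attribute value
    }
    for (Iterator it = sorted.values().iterator(); it.hasNext(); ) {
        PublicObject item = (PublicObject) it.next();
        liblist.addElement(item.getName());
    }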

  • Sorting transient attributes

    Does anyone know how to take advantage of ADF/UIX built-in sorting for view objects that contain only transient fields? For example, when I build a view object that references an entity object, sorting (via a uix table tag) works great. However, when using a view object with only transient attributes, sorting does not work through a uix table. Any ideas would really be appreciated...

    FYI,
    I've figured this out for anyone that sees this post in the future. Here's how to do it.
    1. Open the transient view object in the view object editor (by double-clicking on the view object).
    2. Select Java.
    3. Press the Class Extends button.
    4. For Object, browse to the following class: SortableTransientViewObject (see below for the implementation).
    5. Press OK.
    6. Select Attributes.
    7. For each attribute:
       a. Select the "Selected in Query" property.
       b. Copy the exact name of the attribute into the query column alias field.
    8. Press OK.
    SortableTransientViewObject implementation:
    =============================================================
    import java.util.Collections;
    import java.util.Iterator;
    import java.util.LinkedList;
    import java.util.List;
    import oracle.jbo.Row;
    import oracle.jbo.RowSetIterator;
    import oracle.jbo.server.ViewObjectImpl;
    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    import oracle.jbo.NoDefException;

    public class SortableTransientViewObject extends ViewObjectImpl
    {
        //flag for determining an invalid column index
        public static final int INVALID_COLUMN_INDEX_FLAG = -1;

        //default sort to INVALID_COLUMN_INDEX_FLAG
        private int sortColumnIndex = INVALID_COLUMN_INDEX_FLAG;
        private boolean sortAscending = true;

        /**
         * Sole constructor. Do not remove.
         */
        public SortableTransientViewObject()
        {
        }

        /**
         * Sets the default sort order. If the order has not been pre-defined, then the
         * sort will not be executed when the view object is first created.
         * @param sortField the sort field index. Each view object row implementation
         * defines these attribute indices as constants. Those values are the values
         * that should be used for this method.
         * @param sortAscending sort ascending if true; descending otherwise.
         */
        public void setDefaultSortOrder( int sortField, boolean sortAscending )
        {
            this.setSortColumnIndex( sortField );
            this.setSortAscending( sortAscending );
        }

        /**
         * ADF Framework extension.
         * <br>
         * This method is called by ADF when a view object is to be refreshed. In
         * the case of sorting, this signals that the view object should be sorted.
         * The ADF framework will have already called the setOrderByClause() method
         * which tells this view object what to sort by.
         */
        public void executeQuery()
        {
            sort( getSortColumnIndex(), isSortAscending() );
        }

        /**
         * Over-riding to implement sort for a transient view object.
         * <br>
         * This method will be called by the ADF framework whenever a request
         * is made to sort the view object through a UI component; most
         * likely a table. The orderByClause parameter will always contain
         * the sort column and will optionally contain the sort direction.
         * <br>
         * This method parses the orderByClause parameter and sets the
         * corresponding sort index and direction. This method does not perform
         * the actual sort; the sort will not take place until the executeQuery()
         * method has been invoked.
         * @param orderByClause a string representing the sort column and sort
         * direction.
         */
        public void setOrderByClause( String orderByClause )
        {
            if( isNullOrEmpty( orderByClause ) ||
                isNullOrEmpty( orderByClause.trim() ))
            {
                this.setSortColumnIndex( INVALID_COLUMN_INDEX_FLAG );
                this.setSortAscending( true );
                return;
            }
            else
            {
                orderByClause = orderByClause.trim();
            }
            boolean sortAscending = true;
            String sortColumn = null;
            int spaceCharacterIndex = orderByClause.indexOf( ' ' );
            if( spaceCharacterIndex == -1 )
            {
                sortColumn = orderByClause;
                sortAscending = true;
            }
            else
            {
                sortColumn = orderByClause.substring( 0, spaceCharacterIndex );
                String sortDirectionStringValue = orderByClause.substring( spaceCharacterIndex, orderByClause.length() );
                if( isNullOrEmpty( sortDirectionStringValue ) ||
                    isNullOrEmpty( sortDirectionStringValue.trim() ) )
                {
                    sortAscending = true;
                }
                else
                {
                    sortDirectionStringValue = sortDirectionStringValue.trim();
                    if( "desc".equals( sortDirectionStringValue.toLowerCase() ) )
                    {
                        sortAscending = false;
                    }
                }
            }
            try
            {
                this.setSortColumnIndex( this.getAttributeIndexOf( sortColumn ) );
            }
            catch( NoDefException e )
            {
                this.setSortColumnIndex( INVALID_COLUMN_INDEX_FLAG );
            }
            this.setSortAscending( sortAscending );
        }

        /**
         * Sort helper method.
         * <br>
         * Sorts the view object. Because ADF's built-in sorting only works for
         * non-transient attributes (ie attributes from an entity object), this
         * method was created as a framework extension to support sorting.
         * <br>
         * This method accomplishes sorting by using Java's built-in API for
         * sorting: the Collections framework. First, all view object rows are
         * copied into a List object. Second, the list is sorted using Java's
         * built-in Collections.sort method. Third, all rows are removed from the
         * view object and lastly, the view object is re-populated by iterating
         * through the list.
         * @param sortFieldParameter the field to sort by.
         * @param sortAscendingParameter sort ascending if true; descending otherwise.
         */
        private void sort( int sortFieldParameter, boolean sortAscendingParameter )
        {
            //don't sort if the column index is invalid
            if( sortFieldParameter < 0 || sortFieldParameter >= this.getAttributeCount() )
            {
                return;
            }
            //step 1 - copy all rows to a List object and remove all rows from view object
            List list = new LinkedList();
            RowSetIterator rowIterator = this.getRowSet().createRowSetIterator( null );
            while( rowIterator.hasNext() )
            {
                Row rowToAdd = (Row) rowIterator.next();
                list.add( rowToAdd );
            }
            rowIterator.closeRowSetIterator();
            //step 2 - sort the List object (ViewObjectRowComparator compares two rows
            //on the chosen attribute index; it was not included in the post - see the sketch below)
            Collections.sort( list, new ViewObjectRowComparator( sortFieldParameter, sortAscendingParameter ) );
            //step 3 - remove all rows from view object
            rowIterator = this.getRowSet().createRowSetIterator( null );
            while( rowIterator.hasNext() )
            {
                rowIterator.next();
                rowIterator.removeCurrentRowAndRetain();
            }
            rowIterator.closeRowSetIterator();
            //step 4 - re-populate the view object with the sorted list.
            Iterator sortedValuesIterator = list.iterator();
            while( sortedValuesIterator.hasNext() )
            {
                Row row = (Row) sortedValuesIterator.next();
                this.insertRow( row );
            }
        }

        //small helper used above; the original post did not include an implementation
        private static boolean isNullOrEmpty( String value )
        {
            return value == null || value.length() == 0;
        }

        public boolean isSortAscending()
        {
            return sortAscending;
        }

        private void setSortAscending(boolean sortAscending)
        {
            this.sortAscending = sortAscending;
        }

        public int getSortColumnIndex()
        {
            return sortColumnIndex;
        }

        private void setSortColumnIndex(int sortColumnIndex)
        {
            this.sortColumnIndex = sortColumnIndex;
        }
    }
    =============================================================
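
    The class above relies on a ViewObjectRowComparator that the post does not include. A minimal sketch of what such a comparator might look like - an assumption on my part, not the original poster's class - comparing two rows on the chosen attribute index:

    import java.util.Comparator;
    import oracle.jbo.Row;

    //Not part of the original post: orders two view object rows by the value of one
    //attribute index, ascending or descending.
    public class ViewObjectRowComparator implements Comparator
    {
        private final int attributeIndex;
        private final boolean ascending;

        public ViewObjectRowComparator( int attributeIndex, boolean ascending )
        {
            this.attributeIndex = attributeIndex;
            this.ascending = ascending;
        }

        public int compare( Object o1, Object o2 )
        {
            Object v1 = ((Row) o1).getAttribute( attributeIndex );
            Object v2 = ((Row) o2).getAttribute( attributeIndex );
            int result;
            if( v1 == null && v2 == null )
                result = 0;
            else if( v1 == null )
                result = -1;  //nulls sort first when ascending
            else if( v2 == null )
                result = 1;
            else
                result = ((Comparable) v1).compareTo( v2 );
            return ascending ? result : -result;
        }
    }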

  • Issue with ADF table range paging and sorting

    Hello,
    We have a requirement to support pagination in ADF tables. For this, we have made use of the Range Paging access mode at the view object level. The paging works perfectly fine with this. But we have another requirement of letting the user do a sort on the table (yes, the paginated table). The sort should be applied to all rows in the db and control should return to the first row.
    Applying the sort as it is provided by the framework sorts the records in the obtained range only. So we have overridden the sort listener in a managed bean for this and are able to achieve the sorting of all rows through an AM method call. We are seeing a problem after this. If the sort action is applied to the key attribute then the previous and next range navigation works fine. If the same action is applied to a non-key field, then sometimes (yes, this is not consistent) the next set is not fetched.
    Here is the code snippet that is called on Next navigation:
    Map<String,Object> pfScope = AdfFacesContext.getCurrentInstance().getPageFlowScope();
    Object objPageNumber = pfScope.get("pageNumber");
    int pageNumber = 0;
    if(null != objPageNumber)
    pageNumber = new Integer(objPageNumber.toString()).intValue();
    BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();
    JUCtrlRangeBinding view = (JUCtrlRangeBinding)bindings.getControlBinding("GeDmRequestVO");
    int iRange = getTable().getAutoHeightRows();
    int currentPage = view.getIteratorBinding().getNavigatableRowIterator().getRangeStart()/(iRange + 1);
    System.out.println("Before " + view.getIteratorBinding().getNavigatableRowIterator().getRangeStart());
    System.out.println("Current : " + currentPage);
    System.out.println("Page Number : " + pageNumber);
    System.out.println("Range : " + iRange);
    System.out.println("Value : " + iRange*(currentPage + pageNumber));
    view.getIteratorBinding().getNavigatableRowIterator().scrollRange(iRange*pageNumber);
    System.out.println("After " + view.getIteratorBinding().getNavigatableRowIterator().getRangeStart());
    Although the new values are not refreshed in the table, the SOPs for before and after print the proper range sizes. And as I mentioned above, the above code works perfectly fine if there is no sort applied or when the key attribute is sorted.
    Would appreciate your help on this regard with navigation after non-key attribute sort.
    Thanks,
    Chitra.

    Hi Chitra,
    Can you point me to some links on implementing range paging? We need to implement pagination. If possible, can you share the code or specify the steps for how you achieved it?
    Thanks

  • Time Dimension Type allows different values in attributes - Bug or Feature?

    Not sure if this is a bug or a feature.
    But if one has multiple hierarchies on a Time dimension, you have the ability to specify different values for member attributes in different hierarchies.
    Example.
    Hierarchy A has MIN_ID for its Member and uses MIN_END_DATE for its END_DATE.
    Hierarchy B has MIN_ID for its Member and uses SESS_END_DATE for its END_DATE.
    As per this post and David Greenfield's comment:
    Dimension Sort issue when multiple mappings for different hierarchies
    "Are you attempting to map the same attribute, SORT, to different columns in the two hierarchies? Put another way, do you expect the same member to have different values for the attribute in the two different hierarchies? If so, then this is a problem since a member must have the same value for the attribute regardless of the hierarchy."
    Unlike a user dimension, a time dimension appears to allow this and it appears to work as intended. Is the behavior in this case intended to be different between a user and time dimension?

    I think that this is not a bug. There is an incompatibility in design which prevents you from using the same attribute differently for both hierarchies.
    NOTE: Unlike parent relationship which depends on <dimension, dimension hierarchy>, Dimension Attribute is dependent on <dimension> alone, not dependent on <dimension, dimension hierarchy> combination. Hence it can only take on 1 value for 1 dimension member.
    I think that the time dimension only appears to allow this. The key thing to check is the Time Dimension members which are common to both hierarchies. Only one of the mappings will take effect (usually the hierarchy which is loaded last will remain in the AW/usable for queries and reports - it will have overwritten the earlier attribute value loaded as per the earlier hierarchy load).
    Visualize a dimension as a long list of members which are built up contiguously on a per-hierarchy, per-level basis using the mapping information saved. Once a member is defined (created) via Hierarchy A, it won't be created once again while loading Hierarchy B but is instead updated or redefined based on Hierarchy B's mapping info.
    Assuming the dimension load attempts to load Hierarchy A first and then Hierarchy B,
    * Dimension load for Hierarchy A will define the various members using MIN_ID and set the END_DATE attribute to value=MIN_END_DATE
    * Dimension load for Hierarchy B will re-define the various members using MIN_ID and re-set or over-write the END_DATE attribute to value=SESS_END_DATE
    * In this case, it looks like all members are common for both hierarchies (as both members are mapped to same column MIN_ID) and you would end up with END_DATE=SESS_END_DATE.
    Actually whether all members are common to both hierarchies or not depends on the quality of data in your snowflake/star table: if parent level for Hierarchy A as well as Hierarchy B is setup fine then the members will be same set (overlapping in whole). If some rows for MIN_ID have parent column for Hierarchy A setup correctly but parent column for Hierarchy B =null or invalid value then that member will exist in Hierarchy A alone and would contain END_DATE=MIN_END_DATE as the corresponding update along Hierarchy B would fail due to hierarchy data quality issues (join from current level to parent level).
    As regards a solution to your problem, you should not use the same attribute "SORT" for dual purpose (both hierarchies). Instead define attributes SORT_A and SORT_B and make them enabled for Hierarchy A, Hierarchy B respectively and map/use them appropriately in your reports.
    HTH
    Shankar

  • Sorting table view columns

    Hi ,
    In my application I want to sort specific tableview columns. When I set Sort="server", the hand pointer shows sorting on every column. I want to restrict this to specific columns. If you have information, can you explain how exactly the sort option works with sort by server versus sort by application?
    Are there any weblogs available on sorting of tableview columns?
    Regards
    Usman

    Hi Usman,
    you can do this either with a tableview iterator or with the extension htmlb:tableviewcolumn.
    With the iterator you can change the behaviour of your columns in the method IF_HTMLB_TABLEVIEW_ITERATOR~GET_COLUMN_DEFINITIONS. The attribute P_COLUMN_DEFINITIONS is of type TABLEVIEWCONTROLTAB. With this attribute you can check "SORT" for a special column.
    With the extension htmlb:tableviewcolumn you can use the attribute SORT.
    Hoping this helps.
    Regards,
    Rainer

  • Changing sort order changes column width

    I turned on column sorting (Report Attributes -> Sort checkbox) for a particular column. Now when I sort descending on that column, it seems to disable word-wrap and the width of the column increases dramatically.
    Does anyone know why this happens or how to prevent it?

    The width of the column would be the width of the widest bit of text within the column.
    You can fix the width of a column by entering the following into the HTML Expression for the column definition:
    <table><col style="width:100px;"><tr><td>#COLUMN_NAME#</td></tr></table>
    Regards
    Andy

  • Content sort on merged folders

    Hello experts,
    in the content administration I have defined three roles with following structure
    RoleA
    Reports - folder entry point
    -Projects - folder
    --ReportA1
    --ReportA2
    RoleB
    Reports - folder entry point
    -Projects - folder
    --ReportB1
    --ReportB2
    --ReportB3
    The folders "Reports" and "Projects" in both roles are identified with the same Merge ID. Subsequently, when a user has both roles assigned he or she sees the following structure:
    Reports
    --Projects
    --ReportA1
    --ReportA2
    --ReportB1
    --ReportB2
    --ReportB3
    so far so good. Now I want to reorder the reports in the Projects folder in the following way:
    Reports
    --Projects
    --ReportB1
    --ReportB2
    --ReportB3
    --ReportA1
    --ReportA2
    I tried to change the attribute "Sort Order" on individual reports or on the folders (Reports, Projects) in both the roles but without any effect on the display order.
    Can someone help?
    Regards
    Jiri

    Hi Jiri,
    Change the merge priority so that Role B becomes the dominant role.
    Refer
    http://help.sap.com/saphelp_nw70/helpdata/en/53/89503ede925441e10000000a114084/content.htm
    Thanks
    Prashant

  • Sorting columns in editor.

    Hi,
    I haven't found anything written so far on sorting columns by clicking on the column header in the Data tab of the editor. I find that particular operation for sorting very useful.
    Thanks.
    JC

    If you are looking at a result set from some SQL, then there are no sort buttons, functions, etc., except for changing the SQL. It would be nice to be able to click on the data grid headers though, eh?
    If you are browsing table data via the Data tab in the table's tabs, then you will see 'Sort...'. Click it to define your sort. You get to this by clicking on a table name. It would be nice to be able to click on the data grid headers for simple one-attribute sorting though, eh?
    DK

  • XML: sorting XML nodes

    How would I sort a bunch of XML nodes? I don't necessarily
    need to change the existing structure, but I do need to iterate
    through in a specific order. In AS2 I would use the childNodes
    array and use Array.sort.

    And here's an attribute sort function from
    http://freerpad.blogspot.com/2007/07/more-hierarchical-sorting-e4x-xml-for.html:
    var xml:XML =
    <root>
    <node id="2">alpha</node>
    <node id="3">delta</node>
    <node id="5">bravo</node>
    <node id="0">foxtrot</node>
    <node id="1">echo</node>
    <node id="4">charlie</node>
    </root>
    sortXmlAttribute(xml,"id",true, Array.CASEINSENSITIVE);
    trace(xml)
    function sortXmlAttribute(
        avXml :XML,
        avAttributeName :String,
        avPutEmptiesAtBottom :Boolean,
        avArraySortArgument0 :* = 0,
        avArraySortArgument1 :* = 0 )
        :void
    {
        var lvChildrenCount:int = avXml.children().length();
        if( lvChildrenCount == 0 )
            return;
        if( lvChildrenCount > 1 )
        {
            var lvAttributeValue :String;
            var lvXml :XML;
            // resolve the Array.sort() options, which may arrive in either argument
            var lvSortOptions:int
                = avArraySortArgument0 is Function
                ? avArraySortArgument1 : avArraySortArgument0;
            var lvSortCaseInsensitive:Boolean
                = ( lvSortOptions & Array.CASEINSENSITIVE )
                == Array.CASEINSENSITIVE;
            // collect the distinct attribute values
            var lvArray:Array = new Array();
            for each( lvXml in avXml.children() )
            {
                lvAttributeValue = lvXml.attribute( avAttributeName );
                if( lvSortCaseInsensitive )
                    lvAttributeValue = lvAttributeValue.toUpperCase();
                if( lvArray.indexOf( lvAttributeValue ) == -1 )
                    lvArray.push( lvAttributeValue );
            } // for each
            if( lvArray.length > 1 )
            {
                lvArray.sort( avArraySortArgument0, avArraySortArgument1 );
                if( avPutEmptiesAtBottom )
                {
                    if( lvArray[0] == "" )
                        lvArray.push( lvArray.shift() );
                } // if
            } // if
            // rebuild the child list in sorted attribute order
            var lvXmlList:XMLList = new XMLList();
            for each( lvAttributeValue in lvArray )
            {
                for each( lvXml in avXml.children() )
                {
                    var lvXmlAttributeValue:String
                        = lvXml.attribute( avAttributeName );
                    if( lvSortCaseInsensitive )
                        lvXmlAttributeValue = lvXmlAttributeValue.toUpperCase();
                    if( lvXmlAttributeValue == lvAttributeValue )
                        lvXmlList += lvXml;
                } // for each
            } // for each
            avXml.setChildren( lvXmlList );
        } // if
        // recurse so nested nodes are sorted as well
        for each( var lvXmlChild:XML in avXml.children() )
        {
            sortXmlAttribute(
                lvXmlChild,
                avAttributeName,
                avPutEmptiesAtBottom,
                avArraySortArgument0,
                avArraySortArgument1 );
        } // for each
    } // sortXmlAttribute

  • Error in transfer with oracle WB transfer

    hi,
    I am trying to transfer my Oracle target module to the database (target schema).
    I receive the following error:
    **! Transfer logging started at Sat Feb 05 09:21:30 IRST 2005 !**
    OWB Bridge processed arguments
    Bridge Parameters are:
    <BA>=<All Collections>
    <LANGUAGE>=<All Languages>
    <LOCKREPOSITORY>=<True>
    <file>=<C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107582690567.XMI>
    Using attribute sorting
    export to file
    Default local= fa_IR
    Exporting project:KAFA_PROJECT2
    initializing project:KAFA_PROJECT2
    Initializing module :KAFA_DW_MODULE
    exportSchema:KAFA_DW_MODULE SCHM14314
    Exporting cube:EMPLOYEMENT_CUBE
    exportCube:EMPLOYEMENT_CUBE/EMPLOYEMENT_CUBE Cube14396
    exportCubeDimUse:CDU14344FK14400
    exportFactLevelUse:FLU14396FK14400
    exportCubeDimUse:CDU14359FK14403
    exportFactLevelUse:FLU14396FK14403
    exportCubeDimUse:CDU14379FK14397
    exportFactLevelUse:FLU14396FK14397
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_0
    exportMeasure:SALARY/SALARY MEA14410
    Exporting dimension:DEP_DIM
    exportDimension:DEP_DIM/DEP_DIM DIM14359
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_2
    exportLevel:DEP_MAIN/DEP_MAIN LEV14360
    exportClassificationEntry for:Description CLE_3
    exportLevelAttribute:DEP_ID/DEP_ID LATR14363
    exportLevelAttribute:DEP_NAME/DEP_NAME LATR14367
    exportKey:DEP_MAIN_UK/DEP_MAIN_UK KEY14361
    exportKeyAttributeUse:KEYAU14363KEY14361
    exportHierarchy:HIE_DEP/HIE_DEP HEIR14374
    exportHierarchyLevelUse:HLU14374LEV14360
    Exporting dimension:EMP_DIM
    exportDimension:EMP_DIM/EMP_DIM DIM14344
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_5
    exportLevel:EMP_MAIN/EMP_MAIN LEV14345
    exportClassificationEntry for:Description CLE_6
    exportLevelAttribute:EMP_ID/EMP_ID LATR14348
    exportLevelAttribute:EMP_NAME/EMP_NAME LATR14352
    exportKey:EMP_MAIN_UK/EMP_MAIN_UK KEY14346
    exportKeyAttributeUse:KEYAU14348KEY14346
    exportHierarchy:HIE_EMP/HIE_EMP HEIR14354
    exportHierarchyLevelUse:HLU14354LEV14345
    Exporting dimension:JOB_DIM
    exportDimension:JOB_DIM/JOB_DIM DIM14379
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_7
    exportLevel:JOB_MAIN/JOB_MAIN LEV14380
    exportClassificationEntry for:Description CLE_8
    exportLevelAttribute:JOB_ID/JOB_ID LATR14383
    exportLevelAttribute:JOB_NAME/JOB_NAME LATR14389
    exportKey:JOB_MAIN_UK/JOB_MAIN_UK KEY14381
    exportKeyAttributeUse:KEYAU14383KEY14381
    exportHierarchy:HIE_JOB/HIE_JOB HEIR14391
    exportHierarchyLevelUse:HLU14391LEV14380
    Exporting mappings
    exportDimensionTableMap:DTMAP14359
    exportItemMap:IMAP14364
    exportItemUse:TIU14363
    exportItemUse:SIU14364
    exportItemMap:IMAP14368
    exportItemUse:TIU14367
    exportItemUse:SIU14368
    exportDimensionEntityUse:EU_DIM14359
    exportDimensionTableUse:EU_TAB14359
    exportDimensionTableMap:DTMAP14344
    exportItemMap:IMAP14349
    exportItemUse:TIU14348
    exportItemUse:SIU14349
    exportItemMap:IMAP14353
    exportItemUse:TIU14352
    exportItemUse:SIU14353
    exportDimensionEntityUse:EU_DIM14344
    exportDimensionTableUse:EU_TAB14344
    exportDimensionTableMap:DTMAP14379
    exportItemMap:IMAP14384
    exportItemUse:TIU14383
    exportItemUse:SIU14384
    exportItemMap:IMAP14390
    exportItemUse:TIU14389
    exportItemUse:SIU14390
    exportDimensionEntityUse:EU_DIM14379
    exportDimensionTableUse:EU_TAB14379
    exportFactTableMap:FTMAP14396
    exportMapDependency:MAPDEP14396DTMAP14344
    exportMapDependency:MAPDEP14396DTMAP14359
    exportMapDependency:MAPDEP14396DTMAP14379
    exportItemMap:IMAP14410
    exportItemUse:TIU14410
    exportItemUse:SIU14410
    exportCubeEntityUse:EU_Cube14396
    exportFactUse:EU_TAB14396
    exportFactLevelGroup:FLG_TAB14396
    Exporting table:DEP_DIM
    exportTable:DEP_DIM/DEP_DIM TAB14359
    exportKey:DEP_DIM_DEP_MAIN_UK/DEP_DIM_DEP_MAIN_UK KEY14362
    exportKeyAttributeUse:KEYAU14364KEY14362
    exportColumn:DEP_MAIN DEP_ID/DEP_MAIN_DEP_ID COL14364
    exportColumn:DEP_MAIN_DEP_NAME/DEP_MAIN_DEP_NAME COL14368
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_9
    Exporting table:EMP_DIM
    exportTable:EMP_DIM/EMP_DIM TAB14344
    exportKey:EMP_DIM_EMP_MAIN_UK/EMP_DIM_EMP_MAIN_UK KEY14347
    exportKeyAttributeUse:KEYAU14349KEY14347
    exportColumn:EMP_MAIN EMP_ID/EMP_MAIN_EMP_ID COL14349
    exportColumn:EMP_MAIN_EMP_NAME/EMP_MAIN_EMP_NAME COL14353
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_10
    Exporting table:JOB_DIM
    exportTable:JOB_DIM/JOB_DIM TAB14379
    exportKey:JOB_DIM_JOB_MAIN_UK/JOB_DIM_JOB_MAIN_UK KEY14382
    exportKeyAttributeUse:KEYAU14384KEY14382
    exportColumn:JOB_MAIN JOB_ID/JOB_MAIN_JOB_ID COL14384
    exportColumn:JOB_MAIN_JOB_NAME/JOB_MAIN_JOB_NAME COL14390
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_11
    Exporting table:EMPLOYEMENT_CUBE
    exportTable:EMPLOYEMENT_CUBE/EMPLOYEMENT_CUBE TAB14396
    exportKey:SEG_UK_EMPLOYEMENT_CUBE/SEG_UK_EMPLOYEMENT_CUBE KEY14406
    exportKeyAttributeUse:KEYAU14401KEY14406
    exportKeyAttributeUse:KEYAU14404KEY14406
    exportKeyAttributeUse:KEYAU14398KEY14406
    exportForeignKey:FK_EMPLOYEMENT_CUBE_14362/FK_EMPLOYEMENT_CUBE_14362 FK14403
    exportKeyAttributeUse:KEYAU14404FK14403
    exportForeignKey:FK_EMPLOYEMENT_CUBE_14347/FK_EMPLOYEMENT_CUBE_14347 FK14400
    exportKeyAttributeUse:KEYAU14401FK14400
    exportForeignKey:FK_EMPLOYEMENT_CUBE_14382/FK_EMPLOYEMENT_CUBE_14382 FK14397
    exportKeyAttributeUse:KEYAU14398FK14397
    exportColumn:DEP_MAIN_DEP_ID/DEP_MAIN_DEP_ID COL14404
    exportColumn:EMP_MAIN_EMP_ID/EMP_MAIN_EMP_ID COL14401
    exportColumn:JOB_MAIN_JOB_ID/JOB_MAIN_JOB_ID COL14398
    exportColumn:SALARY/SALARY COL14410
    exportClassificationEntry for:KAFA_COLLECTION2 CLE_12
    exportContext:CTXT00
    exportTypeSet:TSET
    Exporting datatypes
    exportScalarDatatype:NUMBER DTYP7836
    exportScalarDatatype:VARCHAR2 DTYP7839
    exportClassification:KAFA_COLLECTION2 CLAS1
    exportClassificationType:Warehouse Builder Business Area CLT13
    exportClassification:Description CLAS4
    exportClassificationType:Dimensional Attribute Descriptor CLT14
    Exporting project KAFA_PROJECT2 complete.
    **! Target bridge jvm parameters = "..\..\..\jdk\jre\bin\javaw" -mx50m -DORCLCWM_META_MODEL_FILE=..\..\bridges\admin\orcl_cwm.xml -classpath .;..\..\bridges\lib\bridge_cwmlite10.jar;..\..\bridges\lib\bridge_parser10.jar;..\..\bridges\lib\vek10.jar;..\..\..\lib\xmlparserv2.jar;..\..\..\jdbc\lib\ojdbc14.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\util.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;..\..\bridges\lib\bridge_cwmlite10.jar;..\..\bridges\lib\bridge_parser10.jar;..\..\bridges\lib\vek10.jar;..\..\..\lib\xmlparserv2.jar;..\..\..\jdbc\lib\ojdbc14.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\util.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;"..\..\..\jdk\jre\lib\rt.jar;..\..\..\jdk\jre\lib\charsets.jar";"..\..\bridges\admin;..\..\bridges\lib\bridge_wrapper10.jar;..\..\bridges\lib\util10.jar;..\..\lib\int\reposimpl.jar;..\..\lib\int\repossdk.jar;..\..\lib\int\rts.jar;..\..\..\jlib\rts2.jar;..\..\lib\int\util.jar" oracle.cwm.tools.bridge.BridgeWrapper -bridge_name oracle.cwm.bridge.cwmlite.ImportMain !**
    **! Target bridge parameters = -log_level 2 -log_file C:\TEMP\bridges\log\null-nullMy_Metadata_Transfer1107582690567.log -olapimp.deploytoaw N -olapimp.awname -olapimp.awobjprefix -olapimp.loadcubedata Y -olapimp.awaggregate 1 -olapimp.dimuniquekeys N -olapimp.awuser -olapimp.createviews N -olapimp.viewprefix -olapimp.viewaccesstype OLAP -olapimp.creatematviews Y -olapimp.viewscriptdir -olapimp.deploy Y -olapimp.username dw_targetschema -olapimp.password password -olapimp.host 200.20.20.11 -olapimp.port 1521 -olapimp.sid ora10g -olapimp.inputfilename C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107582690567.XMI -olapimp.outputfilename C:\Documents and Settings\stabatabaee\Desktop\test.sql -paramfile C:\TEMP\bridges\1107582690832.par !**
    setting parameter: olapimp.deploytoaw = N
    setting parameter: olapimp.awname =
    setting parameter: olapimp.awobjprefix =
    setting parameter: olapimp.loadcubedata = Y
    setting parameter: olapimp.awaggregate = 1
    setting parameter: olapimp.dimuniquekeys = N
    setting parameter: olapimp.awuser =
    setting parameter: olapimp.createviews = N
    setting parameter: olapimp.viewprefix =
    setting parameter: olapimp.viewaccesstype = OLAP
    setting parameter: olapimp.creatematviews = Y
    setting parameter: olapimp.viewscriptdir =
    setting parameter: olapimp.deploy = Y
    setting parameter: olapimp.username = dw_targetschema
    setting parameter: olapimp.host = 200.20.20.11
    setting parameter: olapimp.port = 1521
    setting parameter: olapimp.sid = ora10g
    setting parameter: olapimp.inputfilename = C:\TEMP\bridges\null-nullMy_Metadata_Transfer1107582690567.XMI
    setting parameter: olapimp.outputfilename = C:\Documents and Settings\stabatabaee\Desktop\test.sql
    Loading Metadata
    Loading XMI input file
    connecting ...
    processing dim: DEP_DIM
    processing level: DEP_MAINin dimension DEP_DIM
    processing level attribute use: DEP_MAIN_DEP_ID in level DEP_MAIN for level attribute DEP_ID
    processing level attribute : DEP_ID in level DEP_MAIN
    processing level attribute use: DEP_MAIN_DEP_NAME in level DEP_MAIN for level attribute DEP_NAME
    processing level attribute : DEP_NAME in level DEP_MAIN
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    cwm_olap_dimension.Set_Description (USER, 'DEP_DIM', '');
    cwm_olap_dimension.Set_Display_Name(USER, 'DEP_DIM', 'DEP_DIM');
    cwm_olap_dimension.Set_Plural_Name(USER, 'DEP_DIM', 'DEP_DIM');
    cwm_olap_level.Set_Description (USER, 'DEP_DIM', 'DEP_MAIN', '');
    cwm_olap_level.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN');
    begin
    cwm_olap_level_attribute.set_name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN_DEP_ID', 'DEP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_ID', 'DEP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_ID', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.set_name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_MAIN_DEP_NAME', 'DEP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_NAME', 'DEP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'DEP_DIM', 'DEP_MAIN', 'DEP_NAME', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'DEP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'DEP_DIM', 'Short_Description', 'Short_Description', 'Short Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'DEP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'DEP_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'DEP_DIM', 'Long_Description', 'Long_Description', 'Full Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Long Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'DEP_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'DEP_DIM', 'Short_Description', 'DEP_MAIN', 'DEP_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'DEP_DIM', 'Long_Description', 'DEP_MAIN', 'DEP_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    commit;
    exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
    end;
    processing dim: EMP_DIM
    processing level: EMP_MAINin dimension EMP_DIM
    processing level attribute use: EMP_MAIN_EMP_ID in level EMP_MAIN for level attribute EMP_ID
    processing level attribute : EMP_ID in level EMP_MAIN
    processing level attribute use: EMP_MAIN_EMP_NAME in level EMP_MAIN for level attribute EMP_NAME
    processing level attribute : EMP_NAME in level EMP_MAIN
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    cwm_olap_dimension.Set_Description (USER, 'EMP_DIM', '');
    cwm_olap_dimension.Set_Display_Name(USER, 'EMP_DIM', 'EMP_DIM');
    cwm_olap_dimension.Set_Plural_Name(USER, 'EMP_DIM', 'EMP_DIM');
    cwm_olap_level.Set_Description (USER, 'EMP_DIM', 'EMP_MAIN', '');
    cwm_olap_level.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN');
    begin
    cwm_olap_level_attribute.set_name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN_EMP_ID', 'EMP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_ID', 'EMP_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_ID', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.set_name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_MAIN_EMP_NAME', 'EMP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_NAME', 'EMP_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'EMP_DIM', 'EMP_MAIN', 'EMP_NAME', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'EMP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'EMP_DIM', 'Short_Description', 'Short_Description', 'Short Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'EMP_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'EMP_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'EMP_DIM', 'Long_Description', 'Long_Description', 'Full Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Long Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'EMP_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'EMP_DIM', 'Short_Description', 'EMP_MAIN', 'EMP_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'EMP_DIM', 'Long_Description', 'EMP_MAIN', 'EMP_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    commit;
    exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
    end;
    processing dim: JOB_DIM
    processing level: JOB_MAINin dimension JOB_DIM
    processing level attribute use: JOB_MAIN_JOB_ID in level JOB_MAIN for level attribute JOB_ID
    processing level attribute : JOB_ID in level JOB_MAIN
    processing level attribute use: JOB_MAIN_JOB_NAME in level JOB_MAIN for level attribute JOB_NAME
    processing level attribute : JOB_NAME in level JOB_MAIN
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    cwm_olap_dimension.Set_Description (USER, 'JOB_DIM', '');
    cwm_olap_dimension.Set_Display_Name(USER, 'JOB_DIM', 'JOB_DIM');
    cwm_olap_dimension.Set_Plural_Name(USER, 'JOB_DIM', 'JOB_DIM');
    cwm_olap_level.Set_Description (USER, 'JOB_DIM', 'JOB_MAIN', '');
    cwm_olap_level.Set_Display_Name(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_MAIN');
    begin
    cwm_olap_level_attribute.set_name(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_MAIN_JOB_ID', 'JOB_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_ID', 'JOB_ID');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_ID', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.set_name(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_MAIN_JOB_NAME', 'JOB_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_Display_Name(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_NAME', 'JOB_NAME');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_level_attribute.Set_description(USER, 'JOB_DIM', 'JOB_MAIN', 'JOB_NAME', '');
    exception when OTHERS then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'JOB_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'JOB_DIM', 'Short_Description', 'Short_Description', 'Short Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Short Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'JOB_DIM', 'Short_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.drop_Dimension_Attribute(USER, 'JOB_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Create_Dimension_Attribute(USER, 'JOB_DIM', 'Long_Description', 'Long_Description', 'Full Description');
    select descriptor_id into id from all_olap_descriptors where descriptor_value='Long Description';
    cwm_classify.add_entity_descriptor_use(id, cwm_utility.DIMENSION_ATTRIBUTE_TYPE,USER, 'JOB_DIM', 'Long_Description');
    exception when others then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'JOB_DIM', 'Short_Description', 'JOB_MAIN', 'JOB_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    begin
    cwm_olap_dim_attribute.Add_Level_Attribute(USER, 'JOB_DIM', 'Long_Description', 'JOB_MAIN', 'JOB_NAME');
    exception when cwm_exceptions.attribute_already_exists then null; when cwm_exceptions.attribute_not_found then null;
    end;
    commit;
    exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
    end;
    processing cube: EMPLOYEMENT_CUBE
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    begin
    cwm_olap_cube.drop_cube(USER, 'EMPLOYEMENT_CUBE');
    exception when cwm_exceptions.cube_not_found then null;
    end;
    cwm_olap_cube.create_cube(USER, 'EMPLOYEMENT_CUBE', 'EMPLOYEMENT_CUBE', '');
    id := cwm_olap_cube.add_dimension(USER, 'EMPLOYEMENT_CUBE', USER, 'JOB_DIM', 'JOB_DIM');
    id := cwm_olap_cube.add_dimension(USER, 'EMPLOYEMENT_CUBE', USER, 'DEP_DIM', 'DEP_DIM');
    id := cwm_olap_cube.add_dimension(USER, 'EMPLOYEMENT_CUBE', USER, 'EMP_DIM', 'EMP_DIM');
    cwm_olap_cube.map_cube(USER, 'EMPLOYEMENT_CUBE', USER, 'EMPLOYEMENT_CUBE', 'FK_EMPLOYEMENT_CUBE_14382', 'JOB_MAIN', USER, 'JOB_DIM', 'JOB_DIM', false);
    cwm_olap_cube.map_cube(USER, 'EMPLOYEMENT_CUBE', USER, 'EMPLOYEMENT_CUBE', 'FK_EMPLOYEMENT_CUBE_14362', 'DEP_MAIN', USER, 'DEP_DIM', 'DEP_DIM', false);
    cwm_olap_cube.map_cube(USER, 'EMPLOYEMENT_CUBE', USER, 'EMPLOYEMENT_CUBE', 'FK_EMPLOYEMENT_CUBE_14347', 'EMP_MAIN', USER, 'EMP_DIM', 'EMP_DIM', false);
    cwm_olap_measure.create_measure(USER, 'EMPLOYEMENT_CUBE', 'SALARY', 'SALARY', '');
    cwm_olap_measure.set_column_map(USER, 'EMPLOYEMENT_CUBE', 'SALARY', USER, 'EMPLOYEMENT_CUBE', 'SALARY');
    begin
    f_id := cwm_utility.create_function_usage('SUM');
    cwm_olap_measure.set_default_aggregation_method(USER, 'EMPLOYEMENT_CUBE', 'SALARY', f_id, USER, 'JOB_DIM', 'JOB_DIM');
    exception when others then null;
    end;
    begin
    f_id := cwm_utility.create_function_usage('SUM');
    cwm_olap_measure.set_default_aggregation_method(USER, 'EMPLOYEMENT_CUBE', 'SALARY', f_id, USER, 'DEP_DIM', 'DEP_DIM');
    exception when others then null;
    end;
    begin
    f_id := cwm_utility.create_function_usage('SUM');
    cwm_olap_measure.set_default_aggregation_method(USER, 'EMPLOYEMENT_CUBE', 'SALARY', f_id, USER, 'EMP_DIM', 'EMP_DIM');
    exception when others then null;
    end;
    commit;
    exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
    end;
    processing classification type is := Warehouse Builder Business Area
    processing catalog name := KAFA_COLLECTION2 ,and description is := KAFA_COLLECTION2
    processing Cube
    processing catalog entity cube := EMPLOYEMENT_CUBE
    processing measure := SALARY , in a cube := EMPLOYEMENT_CUBE
    processing classification type is := Dimensional Attribute Descriptor
    declare
    id number;
    s_id number;
    l_id number;
    levelday_id number;
    levelmonth_id number;
    levelquarter_id number;
    levelyear_id number;
    f_id number;
    begin
    select descriptor_id into l_id from all_olap_descriptors where descriptor_value='Long Description';
    select descriptor_id into s_id from all_olap_descriptors where descriptor_value='Short Description';
    select descriptor_id into levelday_id from all_olap_descriptors where descriptor_value='Day';
    select descriptor_id into levelmonth_id from all_olap_descriptors where descriptor_value='Month';
    select descriptor_id into levelquarter_id from all_olap_descriptors where descriptor_value='Quarter';
    select descriptor_id into levelyear_id from all_olap_descriptors where descriptor_value='Year';
    begin
    id := cwm_classify.create_catalog('KAFA_COLLECTION2', 'KAFA_COLLECTION2');
    exception when cwm_exceptions.catalog_already_exists then
    SELECT catalog_id INTO id from all_olap_catalogs
    WHERE catalog_name = 'KAFA_COLLECTION2'
    AND parent_catalog_id is null;
    end;
    begin
    cwm_classify.add_catalog_entity(id, USER, 'EMPLOYEMENT_CUBE', 'SALARY');
    exception when OTHERS then null;
    end;
    commit;
    exception when others then dbms_output.enable(1000000); cwm_utility.dump_error(); raise program_error;
    end;
    declare
    s_dir varchar2(4000);
    begin
    dbms_odm.createdimlevtuple (USER, 'EMPLOYEMENT_CUBE');
    dbms_odm.createcubeleveltuple (USER, 'EMPLOYEMENT_CUBE');
    dbms_odm.createfactowb (1,USER, 'EMPLOYEMENT_CUBE');
    exception when others then raise program_error;
    end;
    declare
    s_dir varchar2(4000);
    begin
    dbms_odm.createdimowb (2,USER, 'DEP_DIM');
    dbms_odm.createdimowb (3,USER, 'EMP_DIM');
    dbms_odm.createdimowb (4,USER, 'JOB_DIM');
    exception when others then raise program_error;
    end;
    BRD-05030: Error occurred while executing the PL/SQL file, see log file for details.
    ORA-06501: PL/SQL: program error
    ORA-06512: at line 8
    ORA-29283: invalid file operation
    BRD-05030: Error occurred while executing the PL/SQL file, see log file for details.
    disconnecting ...
    closing output file
    closing log stream
    **! Transfer logging stopped at Sat Feb 05 09:21:41 IRST 2005 !**
    thanks,
    shima

    I have run into the same problem using Oracle 8.1.6.1 with RedHat 6.2.
    I have found that the MTS will only work correctly when I specify the host IP in mts_dispatchers (specifying the host domain name is no use).
    Here is a sample of the configuration, hope that helps:
    mts_dispatchers = "(address=(protocol=tcp)(host=192.168.0.1))(dispatchers=4)"
