File and boolean attribute on item

Hi,
I work with Oracle9iAS Portal PL/SQL API (9.0.2.6).
I have defined an item type (CAID = 213 / ID = 37399) with several attributes. When I try to create or modify an item, I have problems with the boolean attributes and the file attributes.
For the boolean attributes, I am not able to set the value to true. I have tried several values (IS_ON, True, 1), but the attribute is still set to false. Is there anything wrong with the value I assign to the attribute before creating the item? When I create the item first and then modify the attribute, the value also stays false.
For the file attributes, I use the upload_blob function in wwsbr_api and the file does appear in the wwdoc_document table. I then set the file attribute value to the return value of upload_blob.
When I call add_item_post_upload, an error occurs (ORA-29532: Java call terminated by uncaught Java exception: java.lang.NullPointerException : -29532).
Is this a bug or not?
Thanks Eddy.
My sample code is below:
(The attribute, page, and region IDs are correct. I have also tested each attribute type separately.)
declare
l_master NUMBER;
l_store portal.wwsto_api_session;
l_custom_attribute portal.wwsbr_type.array := portal.wwsbr_type.empty;
l_custom_attribute_id portal.wwsbr_type.array := portal.wwsbr_type.empty;
l_custom_attribute_caid portal.wwsbr_type.array := portal.wwsbr_type.empty;
l_custom_attribute_data_type portal.wwsbr_type.array := portal.wwsbr_type.empty;
l_str VARCHAR2(100);
l_Blob BLOB;
l_filename VARCHAR2(100);
begin
-- set context
portal.wwctx_api.SET_CONTEXT('ctx','ctx01','');
-- load a session (Allow use of set_Attribute ...)
DBMS_OUTPUT.put_line('Load session');
l_store := portal.wwsto_api_session.load_session('ctx','ctx');
-- set parameters
-- item type = 'Item_ed'
l_store.set_attribute('ITEM_TYPE', 37399); -- Item type id
l_store.set_attribute('ITEM_CAID', 213); -- Item type caid (page group owner of item type)
l_store.set_attribute('PAGE_GROUP_ID', 213); -- Page group
l_store.set_attribute('FOLDER_ID', 37179); -- Page within page group
l_store.set_attribute('REGION_ID', 3216); -- Region id within page
-- see wwv_user_corners to determine template of page
-- see wwsbr_all_folder_regions for region display_name and region id (for template)
-- Get date format to insert right date string
SELECT DISTINCT value
INTO l_str
FROM v$nls_parameters
WHERE parameter = 'NLS_DATE_FORMAT';
dbms_output.put_line('date format ins : ' || l_str);
-- define attributes (for this example, the IDs are hardcoded)
DBMS_OUTPUT.put_line('Define attributes');
-- 1080 = PRODUCT_CODE
l_custom_attribute(1) := 'MEHI';
l_custom_attribute_id(1) := 1080;
l_custom_attribute_caid(1) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(1) := 'text';
-- 1081 = PRODUCT_AUTHOR
l_custom_attribute(2) := 'ESTAT';
l_custom_attribute_id(2) := 1081;
l_custom_attribute_caid(2) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(2) := 'text';
-- 1469 = LANGUAGE
l_custom_attribute(3) := 'fr';
l_custom_attribute_id(3) := 1469;
l_custom_attribute_caid(3) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(3) := 'text';
-- 3 = title
l_custom_attribute(4) := 'title value';
l_custom_attribute_id(4) := 3;
l_custom_attribute_caid(4) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(4) := 'text';
-- 50 = wwsbr_text_
l_custom_attribute(5) := 'wwsbr_text_ value';
l_custom_attribute_id(5) := 50;
l_custom_attribute_caid(5) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(5) := 'text';
-- 1464 = Release date
l_custom_attribute(6) := TO_CHAR(TO_DATE('21-JAN-2004 10:00 AM', 'DD-MON-YYYY HH12:MI PM'),l_str);
l_custom_attribute_id(6) := 1464;
l_custom_attribute_caid(6) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(6) := 'date';
-- 1108 = download
l_custom_attribute(7) := 'http://www.oracle.com/';
l_custom_attribute_id(7) := 1108;
l_custom_attribute_caid(7) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(7) := 'url';
-- 1485 = CDROM
l_custom_attribute(8) := '1';
l_custom_attribute_id(8) := 1485;
l_custom_attribute_caid(8) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(8) := 'boolean';
-- 1111 = PAGE_NB (use a new array index here; reusing index 8 would overwrite the CDROM boolean attribute)
l_custom_attribute(9) := '1';
l_custom_attribute_id(9) := 1111;
l_custom_attribute_caid(9) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(9) := 'number';
-- 1783 = COVER_IMAGE
-- get image
SELECT BANNER
INTO l_Blob
FROM metadata_tbl
WHERE PRODUCT_CODE = 'caa10000';
-- upload image in repository
l_filename := portal.wwsbr_api.upload_blob('BANNERupload',l_Blob, 'image/pjpeg');
DBMS_OUTPUT.put_line('filename : ' || l_filename);
l_custom_attribute(10) := l_filename;
l_custom_attribute_id(10) := 1783;
l_custom_attribute_caid(10) := portal.wwsbr_api.SHARED_OBJECTS; -- = 0
l_custom_attribute_data_type(10) := 'file';
DBMS_OUTPUT.put_line('Insert item starts');
l_master := portal.wwsbr_api.add_item_post_upload(
p_caid => l_store.get_attribute_as_number('PAGE_GROUP_ID'),
p_folder_id => l_store.get_attribute_as_number('FOLDER_ID'),
p_display_name => 'Insert : MEHI',
p_type_id => l_store.get_attribute_as_number('ITEM_TYPE'),
p_type_caid => l_store.get_attribute_as_number('ITEM_CAID'),
p_region_id => l_store.get_attribute_as_number('REGION_ID'), --to set or default
p_display_option => portal.WWSBR_API.IN_PLACE,
-- p_category_id in number default general_category,
-- p_category_caid in number default shared_objects,
-- p_perspectives in g_perspectiveidarray default g_perspectiveidemptyarray,
-- p_perspectives_caid in g_caid_array default g_empty_caid_array,
-- p_author in varchar2 default wwctx_api . get_user,
-- p_image_name => l_filename,
-- p_image_alignment in varchar2 default align_left,
-- p_description in varchar2 default null,
-- p_keywords in varchar2 default null,
-- p_file_name =>l_filename, --in varchar2 default null,
p_text => 'text field',
-- p_url in varchar2 default null,
-- p_plsql in varchar2 default null,
-- p_plsql_execute_mode in varchar2 default null,
-- p_plsql_execute_user in varchar2 default null,
-- p_folderlink_id in number default null,
-- p_folderlink_caid in number default null,
-- p_publish_date in varchar2 default null,
-- p_expire_mode in varchar2 default permanent,
-- p_expiration in varchar2 default null,
-- p_master_item_id in number default null,
-- p_hide_in_browse in number default no,
-- p_checkable in number default no,
-- p_parent_item_id in number default 0,
p_attribute_id => l_custom_attribute_id,
p_attribute_caid => l_custom_attribute_caid,
p_attribute_data_type => l_custom_attribute_data_type,
p_attribute_value => l_custom_attribute);
DBMS_OUTPUT.put_line('Insert item ends. Item identifier : ' || l_master);
-- Invalidate cache from SQLPLUS
portal.wwpro_api_invalidation.execute_cache_invalidation;
DBMS_OUTPUT.put_line('Cache invalidated');
-- Drop session
portal.wwsto_api_session.drop_session('ctx','ctx');
-- Clean context
portal.wwctx_api.clear_context;
COMMIT;
exception
WHEN portal.wwctx_api.AUTHENTICATION_EXCEPTION THEN
DBMS_OUTPUT.PUT_LINE('AUTHENTICATION_EXCEPTION : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.CANNOT_INSERT_DOCUMENT THEN
DBMS_OUTPUT.PUT_LINE('CANNOT INSERT DOCUMENT : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.DUPLICATE_FOLDER THEN
DBMS_OUTPUT.PUT_LINE('DUPLICATE_FOLDER : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.DUPLICATE_ID THEN
DBMS_OUTPUT.PUT_LINE('DUPLICATE_ID : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.DUPLICATE_NAME THEN
DBMS_OUTPUT.PUT_LINE('DUPLICATE_NAME : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.FOLDER_VERSIONING_IS_AUDIT THEN
DBMS_OUTPUT.PUT_LINE('FOLDER_VERSIONING_IS_AUDIT : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.FOLDER_VERSIONING_IS_NONE THEN
DBMS_OUTPUT.PUT_LINE('FOLDER_VERSIONING_IS_NONE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.ILS_DISABLED THEN
DBMS_OUTPUT.PUT_LINE('ILS_DISABLED : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.ILS_DISABLED_FOR_ITEM THEN
DBMS_OUTPUT.PUT_LINE('ILS_DISABLED_FOR_ITEM : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_CAID THEN
DBMS_OUTPUT.PUT_LINE('INVALID CAID : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_CATEGORY THEN
DBMS_OUTPUT.PUT_LINE('INVALID CATEGORY : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_EXPIRE_DATE THEN
DBMS_OUTPUT.PUT_LINE('INVALID_EXPIRE_DATE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_EXPIRE_DATE_FORMAT THEN
DBMS_OUTPUT.PUT_LINE('INVALID_EXPIRE_DATE_FORMAT : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_EXPIRE_NUMBER THEN
DBMS_OUTPUT.PUT_LINE('INVALID_EXPIRE_NUMBER : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_FOLDER THEN
DBMS_OUTPUT.PUT_LINE('INVALID_FOLDER : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_FOLDER_ID THEN
DBMS_OUTPUT.PUT_LINE('INVALID FOLDER ID : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_ITEM_ID THEN
DBMS_OUTPUT.PUT_LINE('INVALID ITEM ID : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_ITEMTYPE THEN
DBMS_OUTPUT.PUT_LINE('INVALID ITEMTYPE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_MOVE THEN
DBMS_OUTPUT.PUT_LINE('INVALID MOVE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_NAME THEN
DBMS_OUTPUT.PUT_LINE('INVALID NAME : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_NUMBER THEN
DBMS_OUTPUT.PUT_LINE('INVALID NUMBER : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_PERSPECTIVE THEN
DBMS_OUTPUT.PUT_LINE('INVALID PERSPECTIVE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_PLSQL_EXECUTE_USER THEN
DBMS_OUTPUT.PUT_LINE('INVALID_PL/SQL_EXECUTE_USER : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_PUBLISH_DATE_FORMAT THEN
DBMS_OUTPUT.PUT_LINE('INVALID_PUBLISH_DATE_FORMAT : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_PUBLISH_DATE_VALUE THEN
DBMS_OUTPUT.PUT_LINE('INVALID_PUBLISH_DATE_VALUE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.INVALID_USERNAME THEN
DBMS_OUTPUT.PUT_LINE('INVALID USERNAME : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.ITEM_CREATION_ERROR THEN
DBMS_OUTPUT.PUT_LINE('ITEM_CREATION ERROR : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.ITEM_NOT_FOUND_ERROR THEN
DBMS_OUTPUT.PUT_LINE('ITEM NOT FOUND ERROR : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.ITEM_UPDATE_ERROR THEN
DBMS_OUTPUT.PUT_LINE('ITEM_UPDATE_ERROR : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.MISSING_DISPLAY_NAME THEN
DBMS_OUTPUT.PUT_LINE('MISSING DISPLAY NAME : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.MISSING_ITEM_TYPE THEN
DBMS_OUTPUT.PUT_LINE('MISSING ITEM TYPE : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.MISSING_NAME THEN
DBMS_OUTPUT.PUT_LINE('MISSING NAME : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.MISSING_PLSQL_EXECUTE_USER THEN
DBMS_OUTPUT.PUT_LINE('MISSING plsql execute user : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.NAME_TOO_LONG THEN
DBMS_OUTPUT.PUT_LINE('name too long : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.NO_ITEM_REGION THEN
DBMS_OUTPUT.PUT_LINE('not item region : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.NO_MASTER_ITEM_ID THEN
DBMS_OUTPUT.PUT_LINE('no master item id : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.NOT_ENOUGH_PRIVS THEN
DBMS_OUTPUT.PUT_LINE('not enough privs : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.NULL_EXPIRE_DATE THEN
DBMS_OUTPUT.PUT_LINE('null expire date : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.NULL_EXPIRE_NUMBER THEN
DBMS_OUTPUT.PUT_LINE('null expire number : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.PERMISSION_DENIED THEN
DBMS_OUTPUT.PUT_LINE('permission denied : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.PLSQL_REQUIRED THEN
DBMS_OUTPUT.PUT_LINE('plsql required : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.QUOTA_EXCEEDED THEN
DBMS_OUTPUT.PUT_LINE('quota exceeded : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.REQD_ATTR_MISSING THEN
DBMS_OUTPUT.PUT_LINE('reqd attr missing : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.UNKNOWN_ERROR THEN
DBMS_OUTPUT.PUT_LINE('unknown error : ' || sqlerrm);
ROLLBACK;
WHEN portal.wwsbr_api.URL_REQUIRED THEN
DBMS_OUTPUT.PUT_LINE('url required : ' || sqlerrm);
ROLLBACK;
when OTHERS then
DBMS_OUTPUT.PUT_LINE('OTHERS : ' || sqlerrm || ' : ' || sqlcode);
ROLLBACK;
end;

Hi Eddy:
On Friday we opened a TAR with Oracle support about the identical problem. When editing an item using the API, the boolean attributes are always reset, even though we are not actually changing their values; the only reason we need to set them again at all is the limitation of the API that forces you to feed all attributes back in.
So far, Oracle has said it sounds like a bug, and they want to know the exact sequence of steps to reproduce it.
Rgds/Mark M.
Portal 9.0.2.6

Similar Messages

  • How to index text in TIFF and DWG files - and the attributes of the files

    Hi all
    Scenario
    Portal installation with a TREX on the side (NW7.0).
    No enterprise search involved.
    Questions
    1) Is it possible for the TREX to extract text from .DWG and .TIFF files in any standard way?
    2) Is it possible to index custom properties that are maintained on a DWG-file and a TIFF-file? Or is it only the name and description fields that will be indexed?
    3) If it is not possible for the TREX to extract text from DWG-files and TIFF-files, do you know of a third-party filter for TREX like dwgifilter for Microsoft software? http://www.dwgifilter.com/default.aspx
    4) Would it be possible to code and implement your own python extension in order for OCR to work with the TREX? If yes, would this be a complicated or fairly easy task? Where would you start... ?
    Thanks!
    Best regards,
    Martin

    Hi,
    my answers:
    1) Is it possible for the TREX to extract text from .DWG and .TIFF files in any standard way?
    For TIFF: not in the standard delivery. You would have to develop this feature yourself.
    For DWG: perhaps via AutoCAD Interchange and Native Drawing Formats (DXF and DWG),
    V. 2.5 - 2.6, 9.0 - 14.0, 2000i - 2002
    see help.sap.com
    2) Is it possible to index custom properties that are maintained on a DWG-file and a TIFF-file? Or is it only the name and description fields that will be indexed?
    Within KM you can define some meta-properties. During upload of your documents you can maintain the values of these properties for every file. It is also possible to search on the content of the properties.
    3) If it is not possible for the TREX to extract text from DWG-files and TIFF-files, do you know of a third-party filter for TREX like dwgifilter for Microsoft software? http://www.dwgifilter.com/default.aspx
    I am not quite sure, but I do not think a third-party solution exists for this.
    4) Would it be possible to code and implement your own python extension in order for OCR to work with the TREX? If yes, would this be a complicated or fairly easy task? Where would you start... ?
    You can implement your own Python extension.
    Add your coding to the default Python extension code.
    Best regards
    Frank

  • CS4-JS : Read XML file and getting Attributes

    Dear All,
    How do I get the attributes based on the root elements?
    For Example:
    //========================== XML File : Start ================================//
    <stag>
         <cust>
            <custname>120</custname>
             <atagst name="alpha" attributename="1" attributevalue="2" sty="First"/>
             <atagst name="beta" attributename="1" attributevalue="5" sty="Second"/>
             <atagst name="gama" attributename="1" attributevalue="2" sty="Third"/>
              <atagst name="theta" attributename="1" attributevalue="5" sty="Fourth"/>
          <cust>
         <cust>
             <custname>121</custname>
              <atagst name="A.alpha" attributename="1" attributevalue="2" sty="First"/>
              <atagst name="A.beta" attributename="1" attributevalue="5" sty="Second"/>
             <atagst name="A.gama" attributename="1" attributevalue="2" sty="Third"/>
              <atagst name="A.theta" attributename="1" attributevalue="5" sty="Fourth"/>
          <cust>
         <cust>
             <custname>122</custname>
              <atagst name="B.alpha" attributename="1" attributevalue="2" sty="First"/>
              <atagst name="Bbeta" attributename="1" attributevalue="5" sty="Second"/>
             <atagst name="B.gama" attributename="1" attributevalue="2" sty="Third"/>
              <atagst name="B.theta" attributename="1" attributevalue="5" sty="Fourth"/>
          <cust>
    </stag>
    //==========================  XML File : End ================================//
    Here is what I want to check through JavaScript code [InDesign]:
    //======================== Script : Starts ====================================//
    var myEveryName = new Array();
    traverse(roots);
      for(var Element_name=0; Element_name<myEveryName.length; Element_name++)
                      if(myEveryName[Element_name] == "customername")
                                      custname.push(myEveryContent[Element_name]);
                    if(myEveryName[Element_name] == "applytagstyle")
                                Aname.push(myEveryAttributes[Element_name][0]);
                                 Aattributename.push(myEveryAttributes[Element_name][1]);
                                 Aattributevalue.push(myEveryAttributes[Element_name][2]);
                                 Asty.push(myEveryAttributes[Element_name][3]);
    function traverse(tree) {
        myEveryName.push(tree.name()); 
        if(tree.elements().length() > 0) {
            for(var i=0; i<tree.elements().length(); i++) {
                traverse(tree.elements()[i]);
    //========================  Script : End =====================================//
    Everything is working fine, but I couldn't get the attribute values. Please check the example below.
    For Example:
    If you check first root element in above xml code
    I need output like:
    custname=120
    name=alpha,beta,gama,theta
    attributename=1,1,1,1
    attributevalue=2,5,2,5
    sty=first,second,third,fourth
    custname=121
    Can anyone please help me and give me a solution?
    Thanks & Regards
    T.R.Harihara SudhaN

    Few questions:
    1. Your XML is not well formed.
    2. Secondly, I do not see the relation of the XML to the script. For instance, I do not see any elements "customername" or "applytagstyle" in the input.
    3. Either you have not provided the complete source, or your dummy XML is incorrect.
    Anyway, having had a quick look, I guess you are trying to get specific attribute values from the XML tree. I will try to give you a kick start, though you will be required to customize the script as per your requirement (for instance, rearranging the attribute values in an array and so forth). Otherwise, please try to post the complete inputs.
    #include "glue code.jsx"
    //Get the attribute values of all elements
    main();
    function main(){
    if (app.documents.length != 0){
    var myDoc = app.activeDocument;
    var myRuleSet = new Array (
    new findObjAttribute("//*")
    with(myDoc){
    var elements = xmlElements;
    __processRuleSet(elements.item(0), myRuleSet);
    else{
    alert("You have no document open!");
    exit();
    function findObjAttribute(XPATH){
    this.name = "findObjAttribute";
    this.xpath = XPATH;
    this.apply = function(myElement, myRuleProcessor)
    var elmName=myElement.markupTag.name;
    with(myElement){
    try {
    var Name=myElement.xmlAttributes.itemByName("name").value;
    var AttName=myElement.xmlAttributes.itemByName("attributename").value;
    var AttValue=myElement.xmlAttributes.itemByName("attributevalue").value;
    var AttSty=myElement.xmlAttributes.itemByName("sty").value;
    $.writeln("Name: "+Name);
    $.writeln("AttributeName: "+AttName);
    $.writeln("AttributeValue: "+AttValue);
    $.writeln("Sty: "+AttSty);
         } catch(e){};
    return true;
    This will just print the values to the JavaScript console.
    HTH,
    Pankaj Chaturvedi

  • JAVA Read XML file and modify attribute values based on some conditions

    I have the following XML file "C:/Data.xml".
    If the attributes on Dimension, Metric, and Data date match, then add the amount values together and remove the duplicate DS node.
    I looked at some examples using hashtables/hashmaps, but I could not find one that meets my criteria. I appreciate any direction or suggestions on this.
    <ED LG="US">
    <DS name="1" source="A" freq="Day">
    <Dimension name="code" value="3">
    <Metric ref_name="A1-ACT">
    <Data date="2011-03-04T00:00:00" amount="30" />
    </Metric>
    </Dimension>
    </DS>
    <DS name="1" source="A" freq="Day">
    <Dimension name="code" value="3">
    <Metric name="A1-ACT">
    <Data date="2011-03-04T00:00:00" amount="40" />
    </Metric>
    </Dimension>
    </DS>
    <DS name="1" source="A" freq="Day">
    <Dimension name="code" value="3">
    <Metric name="A1-ACT">
    <Data date="2011-03-05T00:00:00" amount="20" />
    </Metric>
    </Dimension>
    </DS>
    </ED>
    Expected Result:
    <ED LG="US">
    <DS name="1" source="A" freq="Day">
    <Dimension name="code" value="3">
    <Metric ref_name="A1-ACT">
    <Data date="2011-03-04T00:00:00" amount="70" />
    </Metric>
    </Dimension>
    </DS>
    <DS name="1" source="A" freq="Day">
    <Dimension name="code" value="3">
    <Metric name="A1-ACT">
    <Data date="2011-03-05T00:00:00" amount="20" />
    </Metric>
    </Dimension>
    </DS>
    </ED>
    thanks
    Edited by: user7188033 on Mar 19, 2011 1:40 PM
    Edited by: user7188033 on Mar 19, 2011 2:01 PM
    Edited by: user7188033 on Mar 19, 2011 2:02 PM

    Use XSLT for transforming the XML document.
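    To make that one-line answer a little more concrete: the merge/sum logic itself would live in an XSLT stylesheet (for example, one that uses xsl:key to group the DS nodes by name/source/freq and Data date and sums their @amount values), and the Java side only has to apply it. This is a hedged sketch, not the original poster's solution; merge.xsl and Data_merged.xml are hypothetical names.
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    public class MergeDuplicateDS {
        public static void main(String[] args) throws Exception {
            // compile the (hypothetical) stylesheet and run it over the input document
            Transformer t = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("C:/merge.xsl"));
            t.transform(new StreamSource("C:/Data.xml"),
                        new StreamResult("C:/Data_merged.xml"));
        }
    }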

  • Pros and Cons - Representation of boolean attributes

    A coworker and I are discussing the different ways to represent boolean attributes in a new schema.
    So far, there are three options we came up with and were wondering what the pros, cons, and opinions were. There are many attributes that work like the following examples:
    Addresses - A person may have multiple addresses
    Shipping Address - One and only one address is the shipping address for a person
    1) Shipping is a boolean attribute of the Addresses table, represented by a char(3) datatype that can be "yes" or "no".
    2) A separate shipping address table contains a primary key made up of the primary key from the People table and the primary key from the Addresses table.
    3) One intermediate table is made for all the boolean values associated with the address entity, creating a many-to-many relationship between Addresses and "boolean attributes".
    I prefer 1, he prefers 2 and suggested 3.
    I like 1, because
    It enforces the relationship that a person can only have one shipping address, since the key is composed of a person and an address, and is unique.
    It is also easy to add to or drop when clients change their minds, as much as they do. I can also associate extra data with it if the users decide to start asking for more (they often do).
    I dislike 2, because I have to actually compare text to get a boolean value. It also leaves room for people to make typos or put something besides yes or no. I don't mind 2 if an entity happens to have bunches of boolean attributes, as I don't want to create 25 separate tables... I guess it depends on what the data is.
    We've got attributes that work like this all over the place: shipping addresses, billing addresses, primary phone number, primary email address, courses that are 'advanced' courses, active vs. non-active, etc.
    I see his argument for long queries or too many tables, if an entity has many boolean attributes and I start making tables for all of them, but when there are 1-4 I think it makes more sense to make a separate table for groupings of things. I see it more as a subset than an attribute... Give me all the addresses that are shipping addresses, or give me all the courses that are not in the active course table.
    What are your thoughts?
    Edited by: brekehan on Apr 16, 2010 11:12 AM

    user12999515 wrote:
    "This is the coworker trying to clarify the question. There is a distinct possibility that an entity represented in a database may have Boolean attributes, so the question is what is the best way to represent them. The first answer is to make them a column of a table. But let's think about it: what if we are going to search by these attributes? A solution is to have a separate table for each Boolean attribute pointing to the primary key of the addresses table, in this case shipping address and billing address."
    Sure, you could do that, but then you introduce a few complexities, and I'm not sure I see where the tangible benefit is. With the example provided by your colleague you would have:
    Entity = address (pk_value, <other_columns>)
         where pk_value = person_id, some incremented ID for address
    Entity = shipping address (pk_value)
         where pk_value = person_id
    So this takes care of the business rule that a person can have at most 1 shipping address (that's good), but when you want to know ALL the address information for a given person on file you need to query:
    select
       a.*,
       case when sa.person_id is null then 'NO' else 'YES' end as shipping_address
    from addresses a, shipping_address sa
    where a.person_id = :person_id
    and a.person_id = sa.person_id (+)
    As opposed to:
    Entity = address (pk_value, shipping_address, <other_columns>)
         where pk_value = person_id, some incremented ID for address
         shipping_address = 0 or 1
         with a check constraint on shipping_address ( in (0,1) )
         with a unique index to enforce the rule that a person can have at most one shipping address, which would look something like:
    drop table addresses;
    create table addresses
    (
         person_id number,
         address_id number,
         shipping_address     number(1),
         constraint addresses_pk primary key (person_id, address_id),
         constraint addresses_c1 check (shipping_address in (0,1))
    );
    create unique index addresses_u01 on addresses (case when shipping_address = 1 then person_id else null end);
    -- shows the unique index enforcing the business rule that one person can have at most one shipping address
    insert into addresses values (1, 1, 0);
    insert into addresses values (1, 2, 0);
    insert into addresses values (1, 3, 1);
    -- take a break, add someone else's info
    insert into addresses values (2, 2, 0);
    insert into addresses values (2, 3, 1);
    -- back to person 1, add another shipping address (which will raise an error because of our index)
    insert into addresses values (1, 4, 1);
    And when querying this table we have a simple:
    select
       a.*
    from addresses a
    where a.person_id = :person_id
    -- if you need to: decode(shipping_address, 0, 'NO', 'YES')
    When you start getting into adding tables to represent a 1-to-1 relationship, you're really adding overhead. In this case, instead of adding a single column (with a number(1) datatype) you'll be adding a table with a number(x), where x is the length of your person_id. So the storage goes up (I won't argue this is a big deal, as we're in the year 2010), but when you need to query you now have 2 tables to grab data from, which means more IO (you have data blocks for the address table, and data blocks for the shipping_address table).
    Unless you have a situation like the one described in http://www.oracle.com/technology/oramag/oracle/09-mar/o29asktom.html ("Wide Load Storage"), I would recommend staying away from modelling 1-to-1 relationships.

  • TS4153 few files and folder are not deleting from trash folder

    I am trying to delete all files and folders from the Trash by clicking Empty Securely, but it is not deleting the files and folders. Please suggest how to clean the Trash by deleting all the deleted files and folders.

    Securely deleting items in Trash writes Zeros over the data so that it is completely eradicated.
    This will take quite a long time, depending on how much data you have in the Trash.
    Unless there is data in the Trash that needs to be rendered unrecoverable, you could choose to just delete the files and folders without the secure option.
    Once you have emptied the Trash unsecurely, you can always use Disk Utility to Securely Erase Free Space on the drive.

  • How to get nodes and their attributes from an XML file using DOM parsing

    How do I get nodes and their attributes from an XML file using DOM parsing?
    I am new to XML parsing.
    Thank you.

    import org.w3c.dom.*;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.DocumentBuilder;
    import org.xml.sax.SAXException;
    import org.xml.sax.SAXParseException;
    import java.io.File;
    ...
    // Set up the document
    DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
    DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
    Document doc = docBuilder.parse(new File("MY_XML_FILE.xml"));
    // Get an element by name. getElementsByTagName() can return multiple nodes;
    // in this instance I take item(0), the first node.
    String elementValue = doc.getElementsByTagName("MY_ELEMENT").item(0).getTextContent();
    Read the API for other methods of getting data.
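    Since the question also asks about attributes, here is a small follow-up sketch using the same DOM API (MY_ELEMENT and MY_ATTRIBUTE are placeholder names, not from the original post):
    NodeList nodes = doc.getElementsByTagName("MY_ELEMENT");
    for (int i = 0; i < nodes.getLength(); i++) {
        Element el = (Element) nodes.item(i);
        // read one named attribute directly
        String value = el.getAttribute("MY_ATTRIBUTE");
        // or walk every attribute on the node
        NamedNodeMap attrs = nodes.item(i).getAttributes();
        for (int j = 0; j < attrs.getLength(); j++) {
            Node attr = attrs.item(j);
            System.out.println(attr.getNodeName() + " = " + attr.getNodeValue());
        }
    }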

  • Reading text file and display in the selectOnechoice list item In ADF.

    Hi,
    I have a requirement to read a text file which has a list of strings, and to display those strings in a SelectOneChoice list item component on page load.
    I am using JDeveloper 11.1.2.3.
    Any suggestion will be highly appreciated.
    Thanks in advance.
    Regards

    Hi,
    Google will provide you with hints on how to read the content of a file from Java (ideally the file uses some delimiter). Then, in a managed bean, you read the file and save its content in a list of SelectItem. So your managed bean should have the following property and getter (a setter is not needed):
    private ArrayList<SelectItem> listFromFile = new ArrayList<SelectItem>();
    public ArrayList<SelectItem> getListFromFile(){
       // read the file content and iterate over the file entries
       for (int i = 0; i < fileContent.length; ++i){
          SelectItem si = new SelectItem();
          si.setValue( /* the value to update the list of values with */ );
          si.setLabel("the label to show in the list");
          listFromFile.add(si);
       }
       return listFromFile;
    }
    The af:selectOneChoice component should look as follows:
    <af:selectOneChoice id=".." value="...attribute to update with selection ..." ...>
       <f:selectItems value="#{managedBean.listFromFile}"/>
    </af:selectOneChoice>
    Frank
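    A minimal end-to-end sketch of the file-reading step Frank leaves open, assuming a hypothetical /tmp/choices.txt with one entry per line (adjust the path and parsing to your file format):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.ArrayList;
    import javax.faces.model.SelectItem;
    public class ChoiceListBean {
        private ArrayList<SelectItem> listFromFile = new ArrayList<SelectItem>();
        public ArrayList<SelectItem> getListFromFile() {
            if (listFromFile.isEmpty()) {
                try {
                    BufferedReader br = new BufferedReader(new FileReader("/tmp/choices.txt"));
                    String line;
                    while ((line = br.readLine()) != null) {
                        listFromFile.add(new SelectItem(line, line)); // value, label
                    }
                    br.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
            return listFromFile;
        }
    }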

  • War file and new Trusted attributes since SE 6 update 19

    We deploy a war file which provides multiple applets. The new mixed code warning dialog is being displayed multiple times per user session (whenever a user selects another applet). Our war file contains multiple signed jar files, and several other components/files that are not packaged in a jar file.
    Adding "deployment.security.mixcode=HIDE_RUN" to a client PC's deployment.properties file suppresses the new warning dialog, but is not a practical solution for our web customers.
    #1) Is there any logging facility available in a develop/test environment that identifies the offending unsigned item(s) ?
    #2) To use the Trusted-Only attribute, must every component/file in our war file be packaged in a signed jar file, or is there another alternative ?
    #3) For this issue and for similar deployment issues, how likely is the hope that the Oracle Java team will improve upon these Update 19 mixed code enhancements in a near-future Update ?

    What are you missing?
    I inherited this app, and signing the third-party jars is how it was set up. I was wondering the same thing too: why was it necessary to sign the third-party jars?
    The applet runs in either JRE 1.6.0_13 or JRE 1.6.0_27, depending on the other Java apps the user uses. JRE 1.6.0_13 does not have the mixed code security (so it is effectively disabled), but JRE 1.6.0_27 does have the mixed code security, and the applet will not launch with mixed code security enabled, so we have to disable it. With all the hacking going on in the last two years, it is important to improve security, so this is a must.
    Yes, I always clear the cache.
    Any idea on how to resolve this problem?
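    On question #2: as far as the 6u19 mixed-code changes go, Trusted-Only: true in a signed jar's manifest tells the JRE to refuse any untrusted classes or resources, so every component the applet loads would indeed have to come from a signed, trusted jar. For suppressing the warning around your own signed libraries, the Trusted-Library manifest attribute is the one usually added instead. A hedged sketch of the manifest fragment (the jar must be re-signed after the manifest is updated):
    Manifest-Version: 1.0
    Trusted-Library: true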

  • Mismatch in attributes of item category B and Blanket PO item details

    Hi
    I am not able to create a blanket PO with document type FO and item category B (limit).
    Normally for a blanket PO, GR is not allowed and IR is mandatory. Accordingly, I can see the checkboxes for GR (not set and grayed out) and IR (set and grayed out) in the item details of the PO on the delivery and invoice tabs. But I end up with an error message along the lines of 'GR is not set is used is not allowed' (I don't remember it exactly).
    Before creating the blanket PO, I found in 'attributes of item categories' (in customizing, OMH4) that for item category B, GR is set and IR is not set. Of course, one cannot change the attributes of item categories.
    I suppose that, due to this mismatch between the attributes of the item category and those in the PO, I am getting an error message while creating the blanket PO and cannot save it.
    I have also checked all the system messages for the message displayed. Nowhere (i.e. in any message class or category, for that message number) did I find such a message defined as an error in 'attributes of system messages' for purchasing, material master, inventory, invoice verification, etc.
    I suppose this error message is coming from somewhere in the application program for the PO.
    So, what could the solution be?
    Waiting for your reply.
    Thank you

    Thank you for your answer. I didn't mention this point in my thread, but I have already tried this.
    I have tried matching the GR and IR controls of the account assignment category with the PO GR/IR controls, and also with the item category B GR/IR controls. I am still getting the same error and cannot save the PO.
    I have also checked the field settings for the PO for all categories (transaction relevant, item category relevant, etc.) in customizing.
    The main problem is that both the GR and IR controls are grayed out in the PO. Otherwise I could change them as I wish and avoid the error, even though the GR and IR controls in the PO are correct as per the process requirement.
    If possible, please check the GR and IR controls for item category B (in my system it shows that GR is set as binding and IR is not set as binding). It should be the reverse as per the process requirements, and in the limit PO it shows correctly, i.e. GR not set as binding and IR set as binding.
    Thank you

  • How do I read txt file and add items to dropdownlist or checkbox

    I want to add items to a dropdown or checkbox by reading from a text file (and then select one of them). (I do not use any table or database.) The list of items is sometimes up to 20 MB and hence cannot be populated using a session bean. I want items to be added to either a checkbox or a listbox during a button action. I have done this for a textarea, but have so far been unable to achieve it for a checkbox or listbox. I use the following code, which does not work:
    public String button3_action() {
        try {
            FileReader fr = new FileReader("F:/CreatorProjects/checkboxtst.prs");
            BufferedReader br = new BufferedReader(fr);
            String s;
            while ((s = br.readLine()) != null) {
                dropdown1.setValue(s);
            }
            br.close();
            fr.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }
    I know I can't just transplant the textarea code to a dropdown list or checkbox.
    Any help is greatly appreciated.
    Thanks.
    Dr.AM.Mohan Rao

    I am able to read from the txt file into a listbox if I write this in SessionBean1:
    try {
        FileReader fr = new FileReader("F:/CreatorProjects/checkboxtst.prs");
        BufferedReader br = new BufferedReader(fr);
        String s1 = "";
        String s = "";
        while ((s = br.readLine()) != null) {
            s1 = s1 + s;
            s1 = s1 + "\n";
        }
        disOptions = new com.sun.rave.web.ui.model.Option[] {
            new Option(s1, s1)
        };
        diseases = new String[] {};
        fr.close();
        br.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    But I get all the data in one line! If I click the submit button, the text area gets everything. How do I display the items on separate lines? Please help...
    Dr.AM. Mohan Rao
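    One way to get one list entry per line, sketched against the same disOptions field and Creator Option class used above (a hedged example, not tested code): build one Option per line instead of concatenating everything into a single string.
    java.util.ArrayList optionList = new java.util.ArrayList();
    try {
        BufferedReader br = new BufferedReader(new FileReader("F:/CreatorProjects/checkboxtst.prs"));
        String line;
        while ((line = br.readLine()) != null) {
            optionList.add(new com.sun.rave.web.ui.model.Option(line, line)); // value, label
        }
        br.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    disOptions = (com.sun.rave.web.ui.model.Option[])
            optionList.toArray(new com.sun.rave.web.ui.model.Option[optionList.size()]);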

  • I renamed a PowerPoint file and now when I try to open it, it keeps giving me error code -108. If I try to open or copy it, it says: The operation can't be completed because one or more required items can't be found. (Error code -43). What do I do?

    I renamed a PowerPoint file and now when I try to open it, it keeps giving me error code -108. If I try to open or copy it, it says: "The operation can't be completed because one or more required items can't be found. (Error code -43)." What do I do?

    Post in Microsoft's Powerpoint (Mac) message boards.

  • Reading XML file and skip certain elements/attributes??

    Hi folks!
    Suppose I have a XML file looking like this:
    <?xml version="1.0" encoding="ISO-8859-1"?>
    <!DOCTYPE dvds SYSTEM "DTDtest.dtd">
    <dvds>
    <dvd>
    <title>
    Aliens
    </title>
    <director>
    James Cameron
    </director>
    <format>
    1.85:1
    </format>
    </dvd>
    <dvd>
    <title>
    X-Men
    </title>
    <director>
    Bryan Singer
    </director>
    <format>
    2.35:1
    </format>
    </dvd>
    </dvds>
    In my Java application I want to read this XML file and print it on the screen (including all tags etc). So far, so good. BUT, if I want to skip certain elements, i.e. all information about the dvd 'X-Men', how am I supposed to do this? In other words, I would like my app to skip reading all information about X-Men and continue with the next <dvd>... </dvd> tag. Is this possible?
    My code so far is from the XML tutorial from Sun and it looks like this:
    import java.io.*;
    import org.xml.sax.*;
    import org.xml.sax.helpers.DefaultHandler;
    import javax.xml.parsers.SAXParserFactory;
    import javax.xml.parsers.ParserConfigurationException;
    import javax.xml.parsers.SAXParser;
    public class MyXML extends DefaultHandler {

        public static void main(String argv[]) {
            if (argv.length != 1) {
                System.err.println("Usage: cmd filename");
                System.exit(1);
            }
            // Use an instance of ourselves as the SAX event handler
            DefaultHandler handler = new MyXML();
            // Use the default (non-validating) parser
            SAXParserFactory factory = SAXParserFactory.newInstance();
            try {
                // Set up output stream
                out = new OutputStreamWriter(System.out, "UTF8");
                // Parse the input
                SAXParser saxParser = factory.newSAXParser();
                saxParser.parse(new File(argv[0]), handler);
            } catch (Throwable t) {
                t.printStackTrace();
            }
            System.exit(0);
        }

        static private Writer out;

        //===========================================================
        // SAX DocumentHandler methods
        //===========================================================

        public void startDocument() throws SAXException {
            emit("<?xml version='1.0' encoding='UTF-8'?>");
            nl();
        }

        public void endDocument() throws SAXException {
            try {
                nl();
                out.flush();
            } catch (IOException e) {
                throw new SAXException("I/O error", e);
            }
        }

        /**
         * <p>This method prints the start elements including attributes.
         * @param namespaceURI
         * @param lName
         * @param qName
         * @param attrs
         * @throws SAXException
         */
        public void startElement(String namespaceURI,
                                 String lName,   // local name
                                 String qName,   // qualified name
                                 Attributes attrs)
                throws SAXException {
            String eName = lName; // element name
            if ("".equals(eName)) eName = qName; // namespaceAware = false
            emit("<" + eName);
            if (attrs != null) {
                for (int i = 0; i < attrs.getLength(); i++) {
                    String aName = attrs.getLocalName(i); // Attr name
                    if ("".equals(aName)) aName = attrs.getQName(i);
                    emit(" ");
                    emit(aName + "=\"" + attrs.getValue(i) + "\"");
                }
            }
            emit(">");
        }

        public void endElement(String namespaceURI,
                               String sName,   // simple name
                               String qName)   // qualified name
                throws SAXException {
            emit("</" + qName + ">");
        }

        /**
         * <p>This method prints the data between 'tags'
         * @param buf
         * @param offset
         * @param len
         * @throws SAXException
         */
        public void characters(char buf[], int offset, int len) throws SAXException {
            String s = new String(buf, offset, len);
            emit(s);
        }

        //===========================================================
        // Utility Methods ...
        //===========================================================

        // Wrap I/O exceptions in SAX exceptions, to
        // suit handler signature requirements
        private void emit(String s) throws SAXException {
            try {
                out.write(s);
                out.flush();
            } catch (IOException e) {
                throw new SAXException("I/O error", e);
            }
        }

        // Start a new line
        private void nl() throws SAXException {
            String lineEnd = System.getProperty("line.separator");
            try {
                out.write(lineEnd);
            } catch (IOException e) {
                throw new SAXException("I/O error", e);
            }
        }
    }
    Sorry about the long listing... :)
    Best regards
    /Paul

    A possibility that comes to mind is to create an XSLT script to do whatever it is you want - and call it from inside the program. The XSLT script can be stashed inside your .jar file by using getClass().getClassLoader().getResource("...")
    - David
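    As an alternative to the XSLT route David suggests, the skipping can also be done directly in the SAX handler by buffering each <dvd> element and only echoing it if its <title> is not the one to be skipped. This is a hedged sketch: it assumes the same emit() helper and out writer as the tutorial class above (these methods would replace startElement/endElement/characters there), and it omits echoing attributes for brevity.
    private StringBuffer dvdBuffer = null;            // non-null while inside a <dvd> element
    private StringBuffer titleText = null;            // non-null while inside a <title> element
    private String currentTitle = "";
    private static final String SKIP_TITLE = "X-Men"; // the dvd to leave out

    // Route output into the buffer while inside a <dvd>, otherwise straight to emit()
    private void write(String s) throws SAXException {
        if (dvdBuffer != null) dvdBuffer.append(s);
        else emit(s);
    }

    public void startElement(String uri, String lName, String qName, Attributes attrs)
            throws SAXException {
        if ("dvd".equals(qName)) dvdBuffer = new StringBuffer();
        if ("title".equals(qName)) titleText = new StringBuffer();
        write("<" + qName + ">");
    }

    public void characters(char[] buf, int offset, int len) throws SAXException {
        String s = new String(buf, offset, len);
        if (titleText != null) titleText.append(s);
        write(s);
    }

    public void endElement(String uri, String sName, String qName) throws SAXException {
        if ("title".equals(qName)) {
            currentTitle = titleText.toString().trim();
            titleText = null;
        }
        write("</" + qName + ">");
        if ("dvd".equals(qName)) {
            // emit the buffered element only if it is not the one to skip
            if (!SKIP_TITLE.equals(currentTitle)) emit(dvdBuffer.toString());
            dvdBuffer = null;
        }
    }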

  • PI 7.1 : Taking an input PDF file and mapping it to a hexBinary attribute

    Hello All,
    We have a requirement which involves taking an input PDF file, mapping it to a message type with a binary attribute, and sending it to an R3 system.
    Can anyone please detail the steps, or point us to the correct documents for setting up the scenario?
    The scenario is file to proxy adapter. The part where we need assistance is picking up the input PDF and mapping it to the binary field.
    Thanks.
    Kiran

    Thanks Praveen, Mayank, Sarvesh and Andreas for your valuable help with the issue.
    I was able to successfully pick up the binary PDF file from a file server, encode it using Base64, and post it to R3.
    I used the following code snippet and added the mentioned jar files to create a new jar file, which was used as the Java mapping in the operation mapping.
    import com.sap.aii.mapping.api.StreamTransformation;
    import com.sap.aii.mapping.api.*;
    import com.sap.aii.utilxi.base64.api.*;
    import java.io.*;
    import java.util.*;
    public class Base64EncodingXIStandard implements StreamTransformation {

        String fileNameFromFileAdapterASMA;
        private Map param;

        public void setParameter(Map map) {
            param = map;
            if (param == null) {
                param = new HashMap();
            }
        }

        public static void main(String args[]) {
            Base64EncodingXIStandard con = new Base64EncodingXIStandard();
            try {
                InputStream is = new FileInputStream(args[0]);
                OutputStream os = new FileOutputStream(args[1]);
                con.execute(is, os);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public void execute(InputStream inputstream, OutputStream outputstream) {
            DynamicConfiguration conf = (DynamicConfiguration) param.get("DynamicConfiguration");
            DynamicConfigurationKey KEY_FILENAME = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
            fileNameFromFileAdapterASMA = conf.get(KEY_FILENAME);
            if (fileNameFromFileAdapterASMA == null) {
                fileNameFromFileAdapterASMA = "ToBase64.txt";
            }
            try {
                while ((len = inputstream.read(buffer)) > 0) {
                    baos.write(buffer, 0, len);
                }
                str = Base64.encode(baos.toByteArray());
                outputstream.write("<?xml version=\"1.0\" encoding=\"utf-8\"?><ROOT>".getBytes());
                outputstream.write(("<FILENAME>" + fileNameFromFileAdapterASMA + "</FILENAME>").getBytes());
                outputstream.write(("<BASE64DATA>" + str + "</BASE64DATA></ROOT>").getBytes());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        byte[] buffer = new byte[1024 * 5000];
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        int len;
        String str = null;
    }
    I had to make the following configuration settings:
    1) Create a Sender Communication Channel with Adapter-Specific Message Attributes and the FileName checkbox checked.
    2) Use the Java mapping in the operation mapping.
    The scenario is working smoothly without any issues.
    Thanks.
    Kiran
