CM SDK 10g - need to export Class Object/attributes into XML

I am running CMSDK 10g, and need to export Class Object and attributes (metadata) for a document into XML. Specifically the Class Object and attributes that I created. I am thinking I can either use an API to export the Class object and attribute data, or I can go directly into the DB and pull the data from the table into XML.
1) Is there an API available?
2) What user is the owner of the Class Objects and attributes?
3) What table contains the Class Object and attributes names and data?
Thanks,
Bilal

Hi Bilal,
I think the CUP (command line utility) will do the hard work for you.
1. Find the ID of the wanted classobject
(find CLASSOBJECT "name='WHATEVER'" -attrall)
2. Cat the definition with:
cat -id <the id>
You will get the description in XML. You can run the commands as a script, and with an XSL or CSS you will be able to transform the output into HTML for your doc.
Hope it helps
Regards
Gunther
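
For example, the XSL step could be done with a command-line XSLT processor such as xsltproc (just a sketch; classobject.xsl and classobject.xml are hypothetical names for your stylesheet and the CUP output):
xsltproc classobject.xsl classobject.xml > classobject.html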

Similar Messages

  • Need to Pass Active Directory Attributes into OBIEE for use in Report Query

    Hi,
    We're using OBIEE 11.1.1.5 and have integrated it with AD security. Users successfully log into OBIEE and BIP (which is integrated with OBIEE) using their network/AD accounts.
    Next step for us is to be able to pass some AD attributes that are on the user accounts into OBIEE and BIP so we can restrict data queries for the person logged in.
    I've searched the web and so far have only come up with specific steps for setting this up in BIP. Since BIP is integrated with OBIEE, we have not set any of the MSAD/LDAP security up there - it is all in OBIEE. I have been unable to find the equivalent for adding the attribute names for data query bind variables.
    With our current setup, we are unable to filter the report data automatically based on who has logged into the application (analytics and BIP). We have security setup within the catalog so only certain AD groups can access objects, but beyond that we need to be able to secure the data using AD attribute information. For example, all of our users should be able to access an Open Purchase Order report, but they should only be able to see their own purchase orders - not everyone in the company.
    Anyway, we're looking to be able to pass AD attributes into OBIEE so that we can use this data in our Analytics and BIP queries.
    How can we get this setup?
    Thank you in advance!
    Suzanne

    A fairly common problem, see:
    http://www.williamrobertson.net/documents/comma-separated.html
    http://www.oracle-base.com/articles/misc/DynamicInLists.php
    http://tkyte.blogspot.com/2006/06/varying-in-lists.html
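
    Those links cover the classic "varying in-list" problem. One common pattern for it (a sketch only, not necessarily the approach used in those articles; it assumes an 11g database and a bind variable :p_list holding the comma-separated values) splits the list with a regular expression inside a subquery:
    SELECT e.*
      FROM hr.employees e
     WHERE e.department_id IN (
             SELECT TO_NUMBER(TRIM(REGEXP_SUBSTR(:p_list, '[^,]+', 1, LEVEL)))
               FROM dual
            CONNECT BY LEVEL <= REGEXP_COUNT(:p_list, ',') + 1);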

  • Why we need to convert Context Node data into XML file - Export to Excel

    Hi All,
    Let me clarify my doubt: today I went through the concept of "Exporting Context Data Using the Web Dynpro Binary Cache" in SAP Online Help.
    From the SAP Online Help PDF document, I found that the context node data is first converted into an XML file, and that file is then stored in the Web Dynpro binary cache.
    My question is: why did they convert the context node data into an XML file? Without doing that, can we not export the context node data to an Excel file?
    Regards
    Seshu
    Edited by: Sesshanna D on Dec 19, 2007 7:25 AM

    Hi Sesshanna,
    it is not necessary to do that, but XML has the advantage that it can easily be transformed into any output format that might be required in later project stages.
    If it's simply about producing some Excel output, I suggest using an open-source library such as jExcelAPI or Jakarta POI and building the Excel file the way you need it.
    regards,
    Christian

  • Exporting xmltype table data into xml/txt file

    I want to export data stored in Oracle as an XMLType table into an XML file.
    I am looking for alternatives to the method shown below, as my XML file is large.
    set long 10000000
    spool c:\StudentXMLJan08.xml
    SELECT XMLElement("Student",
             XMLForest(s.studentid "studentid",
                       s.firstname "firstname",
                       s.lastname  "surname"),
             XMLElement("enrollments",
               (SELECT XMLAGG(XMLForest(sc.coursecode "courseid"))
                  FROM studentcourse sc
                 WHERE sc.studentid = s.studentid
                   and sc.is_approved = 'Y'
                   and sc.takenyear = '2008'
                   and sc.takenterm = '1')))
      FROM student s
     where s.statuscode in (select studentstatuscode from studentstatus where studentstatusactive = 1)
     order by s.studentid;
    spool off
    please help, thank you
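    One alternative for large results (a minimal sketch, not from the thread: it assumes a server-side directory object named XML_DIR has been created and granted, and it aggregates the per-student elements so the query returns a single XMLType) is to fetch the XML as a CLOB in PL/SQL and write it with DBMS_XSLPROCESSOR.CLOB2FILE, avoiding the SQL*Plus long/spool limits:
    DECLARE
      l_xml CLOB;
    BEGIN
      -- Aggregate the per-student fragments into one document so a single CLOB is produced.
      SELECT XMLElement("Students",
               XMLAgg(
                 XMLElement("Student",
                   XMLForest(s.studentid AS "studentid",
                             s.firstname AS "firstname",
                             s.lastname  AS "surname")))).getClobVal()
        INTO l_xml
        FROM student s;
      -- Write the CLOB to a file in the XML_DIR directory on the database server.
      dbms_xslprocessor.clob2file(l_xml, 'XML_DIR', 'StudentXMLJan08.xml');
    END;
    /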

    How's this one for size
    SQL> create or replace view DEPARTMENT_XML of xmltype
      2  with object id
      3  (
      4    'DEPARTMENT'
      5  )
      6  as
      7  select xmlElement
      8         (
      9           "Departments",
    10           (
    11             select xmlAgg
    12                    (
    13                      xmlElement
    14                      (
    15                      "Department",
    16                      xmlAttributes( d.DEPARTMENT_ID as "DepartmentId"),
    17                      xmlElement("Name", d.DEPARTMENT_NAME),
    18                      xmlElement
    19                      (
    20                        "Location",
    21                        xmlForest
    22                        (
    23                           STREET_ADDRESS as "Address", CITY as "City", STATE_PROVINCE as "State",
    24                           POSTAL_CODE as "Zip",COUNTRY_NAME as "Country"
    25                        )
    26                      ),
    27                      xmlElement
    28                      (
    29                        "EmployeeList",
    30                        (
    31                          select xmlAgg
    32                                 (
    33                                   xmlElement
    34                                   (
    35                                     "Employee",
    36                                     xmlAttributes ( e.EMPLOYEE_ID as "employeeNumber" ),
    37                                     xmlForest
    38                                     (
    39                                       e.FIRST_NAME as "FirstName", e.LAST_NAME as "LastName", e.EMAIL as "EmailAddre
    ss",
    40                                       e.PHONE_NUMBER as "Telephone", e.HIRE_DATE as "StartDate", j.JOB_TITLE as "Job
    Title",
    41                                       e.SALARY as "Salary", m.FIRST_NAME || ' ' || m.LAST_NAME as "Manager"
    42                                     ),
    43                                     xmlElement ( "Commission", e.COMMISSION_PCT )
    44                                   )
    45                                 )
    46                            from HR.EMPLOYEES e, HR.EMPLOYEES m, HR.JOBS j
    47                           where e.DEPARTMENT_ID = d.DEPARTMENT_ID
    48                             and j.JOB_ID = e.JOB_ID
    49                             and m.EMPLOYEE_ID = e.MANAGER_ID
    50                        )
    51                      )
    52                    )
    53                  )
    54             from HR.DEPARTMENTS d, HR.COUNTRIES c, HR.LOCATIONS l
    55            where d.LOCATION_ID = l.LOCATION_ID
    56              and l.COUNTRY_ID  = c.COUNTRY_ID
    57           )
    58         )
    59    from dual
    60  /
    View created.
    SQL> create or replace trigger DEPARTMENT_DML
      2  instead of INSERT or UPDATE or DELETE
      3  on DEPARTMENT_XML
      4  begin
      5    null;
      6  end;
      7  /
    Trigger created.
    SQL> declare
      2    cursor getDepartments is
      3      select ref(d) XMLREF
      4        from DEPARTMENT_XML d;
      5    res boolean;
      6    targetFolder varchar2(1024) :=  '/public/Departments';
      7  begin
      8    if dbms_xdb.existsResource(targetFolder) then
      9       dbms_xdb.deleteResource(targetFolder,dbms_xdb.DELETE_RECURSIVE_FORCE);
    10    end if;
    11    res := dbms_xdb.createFolder(targetFolder);
    12    for dept in getDepartments loop
    13      res := DBMS_XDB.createResource(targetFolder || '/Departments.xml', dept.XMLREF);
    14    end loop;
    15  end;
    16  /
    PL/SQL procedure successfully completed.
    SQL> select path
      2    from path_view
      3   where equals_path(RES,'/public/Departments/Departments.xml') = 1
      4  /
    PATH
    /public/Departments/Departments.xml
    SQL> select xdburitype('/public/Departments/Departments.xml').getXML()
      2    from dual
      3  /
    XDBURITYPE('/PUBLIC/DEPARTMENTS/DEPARTMENTS.XML').GETXML()
    <Departments>
      <Department DepartmentId="60">
        <Name>IT</Name>
        <Location
    SQL> quit
    Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    C:\Temp>ftp localhost
    Connected to mdrake-lap.
    220- mdrake-lap
    Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution.
    220 mdrake-lap FTP Server (Oracle XML DB/Oracle Database) ready.
    User (mdrake-lap:(none)): SCOTT
    331 pass required for SCOTT
    Password:
    230 SCOTT logged in
    ftp> cd /public/Departments
    250 CWD Command successful
    ftp> ls -l
    200 EPRT Command successful
    150 ASCII Data Connection
    -rw-r--r--   1 SCOTT    oracle         0 NOV 10 20:18 Departments.xml
    226 ASCII Transfer Complete
    ftp: 71 bytes received in 0.01Seconds 7.10Kbytes/sec.
    ftp> get Departments.xml -
    200 EPRT Command successful
    150 ASCII Data Connection
    <Departments><Department DepartmentId="60"><Name>IT</Name><Location><Address>2014 Jabberwocky Rd</Address><City>Southlak
    e</City><State>Texas</State><Zip>26192</Zip><Country>United States of America</Country></Location><EmployeeList><Employe
    e employeeNumber="103"><FirstName>Alexander</FirstName><LastName>Hunold</LastName><EmailAddress>AHUNOLD</EmailAddress><T
    elephone>590.423.4567</Telephone><StartDate>2006-01-03</StartDate><JobTitle>Programmer</JobTitle><Salary>9000</Salary><M
    anager>Lex De Haan</Manager><Commission></Commission></Employee><Employee employeeNumber="105"><FirstName>David</FirstNa
    me><LastName>Austin</LastName><EmailAddress>DAUSTIN</EmailAddress><Telephone>590.423.4569</Telephone><StartDate>2005-06-
    25</StartDate><JobTitle>Programmer</JobTitle><Salary>4800</Salary><Manager>Alexander Hunold</Manager><Commission></Commi
    ssion></Employee><Employee employeeNumber="106"><FirstName>Valli</FirstName><LastName>Pataballa</LastName><EmailAddress>
    VPATABAL</EmailAddress><Telephone>590.423.4560</Telephone><StartDate>2006-02-05</StartDate><JobTitle>Programmer</JobTitl
    e><Salary>4800</Salary><Manager>Alexander Hunold</Manager><Commission></Commission></Employee><Employee employeeNumber="
    107"><FirstName>Diana</FirstName><LastName>Lorentz</LastName><EmailAddress>DLORENTZ</EmailAddress><Telephone>590.423.556
    7</Telephone><StartDate>2007-02-07</StartDate><JobTitle>Programmer</JobTitle><Salary>4200</Salary><Manager>Alexander Hun
    old</Manager><Commission></Commission></Employee><Employee employeeNumber="104"><FirstName>Bruce</FirstName><LastName>Er
    nst</LastName><EmailAddress>BERNST</EmailAddress><Telephone>590.423.4568</Telephone><StartDate>2007-05-21</StartDate><Jo
    bTitle>Programmer</JobTitle><Salary>6000</Salary><Manager>Alexander Hunold</Manager><Commission></Commission></Employee>
    </EmployeeList></Department><Department DepartmentId="50"><Name>Shipping</Name><Location><Address>2011 Interiors Blvd</A
    ddress><City>South San Francisco</City><State>California</State><Zip>99236</Zip><Country>United States of America</Count
    ry></Location><EmployeeList><Employee employeeNumber="120"><FirstName>Matthew</FirstName><LastName>Weiss</LastName><Emai
    lAddress>MWEISS</EmailAddress><Telephone>650.123.1234</Telephone><StartDate>2004-07-18</StartDate><JobTitle>Stock Manage
    r</JobTitle><Salary>8000</Salary><Manager>Steven King</Manager><Commission></Commission></Employee><Employee employeeNum
    ber="122"><FirstName>Payam</FirstName><LastName>Kaufling</LastName><EmailAddress>PKAUFLIN</EmailAddress><Telephone>650.1
    23.3234</Telephone><StartDate>2003-05-01</StartDate><JobTitle>Stock Manager</JobTitle><Salary>7900</Salary><Manager>Stev
    en King</Manager><Commission></Commission></Employee><Employee employeeNumber="121"><FirstName>Adam</FirstName><LastName
    Fripp</LastName><EmailAddress>AFRIPP</EmailAddress><Telephone>650.123.2234</Telephone><StartDate>2005-04-10</StartDate><JobTitle>Stock Manager</JobTitle><Salary>8200</Salary><Manager>Steven King</Manager><Commission></Commission></Employee
    <Employee employeeNumber="124"><FirstName>Kevin</FirstName><LastName>Mourgos</LastName><EmailAddress>KMOURGOS</EmailAddress><Telephone>650.123.5234</Telephone><StartDate>2007-11-16</StartDate><JobTitle>Stock Manager</JobTitle><Salary>5800<
    /Salary><Manager>Steven King</Manager><Commission></Commission></Employee><Employee employeeNumber="123"><FirstName>Shan
    ta</FirstName><LastName>Vollman</LastName><EmailAddress>SVOLLMAN</EmailAddress><Telephone>650.123.4234</Telephone><Start
    Date>2005-10-10</StartDate><JobTitle>Stock Manager</JobTitle><Salary>6500</Salary><Manager>Steven King</Manager><Commiss
    ion></Commission></Employee><Employee employeeNumber="128"><FirstName>Steven</FirstName><LastName>Markle</LastName><Emai
    lAddress>SMARKLE</EmailAddress><Telephone>650.124.1434</Telephone><StartDate>2008-03-08</StartDate><JobTitle>Stock Clerk
    </JobTitle><Salary>2200</Salary><Manager>Matthew Weiss</Manager><Commission></Commission></Employee><Employee employeeNu
    mber="127"><FirstName>James</FirstName><LastName>Landry</LastName><EmailAddress>JLANDRY</EmailAddress><Telephone>650.124
    .1334</Telephone><StartDate>2007-01-14</StartDate><JobTitle>Stock Clerk</JobTitle><Salary>2400</Salary><Manager>Matthew
    Weiss</Manager><Commission></Commission></Employee><Employee employeeNumber="126"><FirstName>Irene</FirstName><LastName>
    Mikkilineni</LastName><EmailAddress>IMIKKILI</EmailAddress><Telephone>650.124.1224</Telephone>
    <StartDate>2002-06-07</St
    artDate><JobTitle>Public Relations Representative</JobTitle><Salary>10000</Salary><Manager>Neena Kochhar</Manager><Commi
    ssion></Commission></Employee></EmployeeList></Department></Departments>226 ASCII Transfer Complete
    ftp: 40392 bytes received in 0.08Seconds 480.86Kbytes/sec.
    ftp>

  • Need to export Missing object type called "TYPE" Please Help

    I did a refresh of one of the schemas, and I can see a difference in the object count between the source and destination schemas.
    The difference lies in objects of type "TYPE".
    Three TYPE objects exist in production that have not been imported into the destination DB.
    Could you please help me with how I can export those object types and import them into the destination?
    For reference: Missing objects in destination schema
    OBJECT_NAME OBJECT_TYPE
    SYS_PLSQL_88116_DUMMY_1 TYPE
    SYS_PLSQL_88116_481_1 TYPE
    SYS_PLSQL_88116_433_1 TYPE

    Sir, I have already tried that, but could not succeed.
    For reference: for the 19 types (out of 22) that were imported successfully, the DDL can be extracted, but when I try to get the DDL of the remaining three from production it throws an error saying they are not found in the schema; I even tried from the SYS schema, but got the same error.
    The one which is imported properly
    =========================
    SQL> select dbms_metadata.get_ddl('TYPE','XRATE_TABLE_TYPE','TKCSOWNER') FROM DUAL;
    DBMS_METADATA.GET_DDL('TYPE','XRATE_TABLE_TYPE','TKCSOWNER')
    CREATE OR REPLACE TYPE "TKCSOWNER"."XRATE_TABLE_TYPE" IS TABLE OF XRATE_RECORD
    The one which is not imported:
    ==================
    SQL> select dbms_metadata.get_ddl('TYPE','SYS_PLSQL_88116_DUMMY_1','TKCSOWNER') FROM DUAL;
    ERROR:
    ORA-31603: object "SYS_PLSQL_88116_DUMMY_1" of type TYPE not found in schema
    "TKCSOWNER"
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
    ORA-06512: at "SYS.DBMS_METADATA", line 3241
    ORA-06512: at "SYS.DBMS_METADATA", line 4812
    ORA-06512: at line 1
    no rows selected
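    As a side note, types named SYS_PLSQL_% are normally generated internally by the database to support pipelined table functions, so they usually do not need to be exported and are recreated when the owning code is recompiled in the target. A quick dictionary check (a sketch, assuming SELECT access on DBA_DEPENDENCIES) can show which object, if any, references them:
    SELECT owner, name, type
      FROM dba_dependencies
     WHERE referenced_type = 'TYPE'
       AND referenced_name LIKE 'SYS_PLSQL_88116%';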

  • Export Schema Objects

    I have Oracle XE on a dev machine with objects (tables, sequences, views, indexes, etc.) in a chaluwa schema. We want to test on a mock production machine with Oracle Enterprise Manager 11g (just downloaded it). First, how do I use the XE interface (not code, like in the link in the previous post) to export these objects?
    So far, I used XE's DDL creation utility to generate the code and tried to run it on an empty test schema on my dev machine, to see whether the DDL statements would reproduce the objects from the chaluwa schema, but I got errors. The DDL statements were not arranged in dependency order: a foreign key being created in one table referenced a table not yet created. I would rather not start tampering with the DDL code to rearrange it. In XE, I could make a schema by creating a new user; I've not been able (at least so far) to create a schema on the mock production machine running Oracle Enterprise Manager 11g.
    So the issues are:
    1. need to export the objects from chaluwa schema in the Oracle XE in my dev machine.
    2. create a schema on the mock production machine running Oracle Enterprise Manager 11g where I will put the exported objects from step 1.
    3. put the objects into the schema created in step two.
    Sorry if my questions were stupid, I really need your help. Cheers

    Hi,
    I have two XE instances on different machines, one on my office PC (production) and another on my laptop (dev). I use the Data Pump utility with XE a lot, and it is great.
    It is very simple to use, and it has many more options than my simple example below, but this one works.
    Say I want to export all my data from machine1 to machine2, and the schema name is chaluwa.
    1. On machine1:
    Create a directory with the OS, for example 'c:\chaluwa_080914', then go to the SQL command line (SQL*Plus):
    sqlplus / as sysdba
    sql> create or replace directory chaluwa_080914 as 'c:\chaluwa_080914';
    Directory created.
    sql> grant read, write on directory chaluwa_080914 to chaluwa;
    Grant succeeded.
    sql> exit;
    --- now you are back at the Windows command line again
    c:\Documents and Settings\peteryzhang> expdp system schemas=chaluwa directory=chaluwa_080914
    password: ******
    This exports everything to a dump file (the default name is expdat.dmp) plus a log file.
    Copy the directory (with the exported data) to machine2.
    2. Import the file you just exported into the other database.
    On machine2, copy chaluwa_080914 to a location such as 'd:\chaluwa_080914', then go to SQL*Plus and connect as sysdba:
    sql> create user chaluwa identified by password default tablespace USERS;
    (Note: it is better to use the same tablespace as on machine1.)
    User created.
    sql> grant connect, resource, dba to chaluwa;
    (That is what I usually use. You may not need the DBA role, but you do need enough privileges for the import to succeed; for example, if you have no right to create triggers, importing triggers will fail.)
    Grant succeeded.
    sql> create or replace directory chaluwa_080914 as 'd:\chaluwa_080914';
    Directory created.
    sql> grant read, write on directory chaluwa_080914 to chaluwa;
    Grant succeeded.
    sql> exit;
    c:\documents and settings\hostname> impdp system schemas=chaluwa directory=chaluwa_080914
    and then enter the password.
    With that, the schema is imported successfully; there is only one error, because the user has already been created, but that is OK.
    If you want to import the file into a different schema you can do that, and if you need to import it into a different tablespace you can do that too; there are options for this (see the sketch below).
    You can also refer to the docs, the XE Two Day DBA guide and Oracle Database Utilities:
    Two Day DBA
    http://download.oracle.com/docs/cd/B25329_01/doc/admin.102/b25107/impexp.htm#BCEEDCIB
    Oracle Database Utilities
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/part_dp.htm#i436481
    Peter
    Edited by: PeterCN on Sep 15, 2008 10:05 AM
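    For the remapping mentioned above, a minimal sketch (assuming the dump file has the default name expdat.dmp, and remapping into a hypothetical chaluwa_test schema and USERS2 tablespace that already exist) would be:
    impdp system directory=chaluwa_080914 dumpfile=expdat.dmp remap_schema=chaluwa:chaluwa_test remap_tablespace=USERS:USERS2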

  • RE: How to Export the Table data Into PDF File  in ADF

    Hi Experts,
    I am using Jdeveloper 11.1.2.3.0
    I have created an employee VO and dropped it as a table on a page. Now I need to export the table data into a PDF file.
    Please give me some suggestions regarding this scenario.
    With Regards,
    satish

    Hi Guys ,
    Any more answers for this question.
    Please find my jsff below
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1" xmlns:af="http://xmlns.oracle.com/adf/faces/rich"
              xmlns:f="http://java.sun.com/jsf/core" xmlns:report="http://www.adfwithejb.blogspot.com">
      <af:panelGroupLayout layout="vertical" id="pgl2">
          <af:query id="qryId1" headerText="Service Tariff Mapping Details" disclosed="true"
                    value="#{bindings.findByTarifValidFromQuery.queryDescriptor}"
                    model="#{bindings.findByTarifValidFromQuery.queryModel}"
                    queryListener="#{reportWiseInvoiceBean.genericQueryListener}"
                    queryOperationListener="#{bindings.findByTarifValidFromQuery.processQueryOperation}"
                    resultComponentId="pc1::t2">
         <f:attribute name="queryExpression" value="bindings.findByTarifValidFromQuery.processQuery"/>
                          </af:query>
        <af:panelCollection id="pc1" styleClass="AFStretchWidth">
          <f:facet name="menus"/>
          <f:facet name="toolbar">
              <af:toolbar id="t1">
                 <af:menuBar id="pt_m1">
                <report:reportDeclarative ButtonName="ExportToExcel" ReportName="ServiceTariffMappingDetails"
                                          ReportType="PDF" TableId=":::pc1:t2" id="rd1" Pagination="true"/>
                <af:commandButton text="excel" id="cb1" binding="#{exportToExcelBean.exportID}">
                <af:setActionListener from="pt1:pgl1:pgl2:pc1:t2" to="#{viewScope['exporter.exportedId']}"/>
                <af:setActionListener from="border:1px solid #cccccc" to="#{viewScope['exporter.thStyle']}"/>
                <af:setActionListener from="border:1px solid #cccccc" to="#{viewScope['exporter.tdStyle']}"/>
                <af:fileDownloadActionListener method="#{exportToExcelBean.exportToExcel}" filename="Service TariffMapping.xls"
                                                 contentType="text/excel;charset=UTF-8;"/>
                </af:commandButton>
                <af:commandMenuItem id="pt_cmi133" icon="/images/common/Excel-icon.png"
                                                shortDesc="ExportToExcel"
                                >
                                <af:exportCollectionActionListener exportedId="t2" type="excelHTML"
                                                                   title="Service Tariff Mapping"
                                                                   filename="Service Tariff Mapping.xls"/>
                            </af:commandMenuItem></af:menuBar>
              </af:toolbar>
          </f:facet>
          <f:facet name="statusbar"/>
          <af:table value="#{bindings.ServiceTariffMappingDtlsRVO1.collectionModel}" var="row"
                    rows="#{bindings.ServiceTariffMappingDtlsRVO1.rangeSize}"
                    emptyText="#{bindings.ServiceTariffMappingDtlsRVO1.viewable ? 'No data to display.' : 'Access Denied.'}"
                    fetchSize="#{bindings.ServiceTariffMappingDtlsRVO1.rangeSize}" rowBandingInterval="0"
                    filterModel="#{bindings.findByTarifValidFromQuery.queryDescriptor}"
                    queryListener="#{bindings.findByTarifValidFromQuery.processQuery}" filterVisible="true" varStatus="vs"
                    id="t2" columnStretching="last" binding="#{ServiceTariffMappBean.testTable}">
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.label}"
                       id="c1">
              <af:inputText value="#{row.bindings.NormalTariffCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.tooltip}" id="it1">
                <f:validator binding="#{row.bindings.NormalTariffCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.label}"
                       id="c2">
              <af:inputText value="#{row.bindings.ServiceCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.tooltip}" id="it2">
                <f:validator binding="#{row.bindings.ServiceCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.label}" id="c3">
              <f:facet name="filter">
                <af:inputDate value="#{vs.filterCriteria.TrfVldFrm}" id="id1">
                  <af:convertDateTime pattern="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.format}"/>
                </af:inputDate>
              </f:facet>
              <af:inputDate value="#{row.bindings.TrfVldFrm.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.displayWidth}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.tooltip}" id="id2">
                <f:validator binding="#{row.bindings.TrfVldFrm.validator}"/>
                <af:convertDateTime pattern="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.format}"/>
              </af:inputDate>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.label}"
                       id="c4">
              <af:inputText value="#{row.bindings.ServiceDesc.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.tooltip}" id="it3">
                <f:validator binding="#{row.bindings.ServiceDesc.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.label}" id="c5">
              <af:inputText value="#{row.bindings.OtTrfCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.tooltip}" id="it4">
                <f:validator binding="#{row.bindings.OtTrfCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.label}" id="c6">
              <af:inputText value="#{row.bindings.OtUnitRate.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.tooltip}" id="it5">
                <f:validator binding="#{row.bindings.OtUnitRate.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.label}" id="c7">
              <af:inputText value="#{row.bindings.NtUnitRate.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.tooltip}" id="it6">
                <f:validator binding="#{row.bindings.NtUnitRate.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.label}" id="c8">
              <af:inputText value="#{row.bindings.TrfGrt.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.tooltip}" id="it7">
                <f:validator binding="#{row.bindings.TrfGrt.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.label}"
                       id="c9">
              <af:inputText value="#{row.bindings.ChargePartyCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.tooltip}" id="it8">
                <f:validator binding="#{row.bindings.ChargePartyCode.validator}"/>
              </af:inputText>
            </af:column>
          </af:table>
        </af:panelCollection>
      </af:panelGroupLayout>
    </jsp:root>

  • Marshalling 'autotype' generated objects back to XML

    I have a document-literal webservice implementation that uses
    WebLogic request/response binding classes generated by
    'autotype'. My trouble is that there does not seem to be any way to programmatically
    transform these Java objects back into XML form. I'm fairly new to web services
    development with webLogic, so I may be missing something, and my hope is that
    there is a straight-forward solution for this.
    I have tried to use the serialize method from the Codec class, but something in
    the underlying implementation throws a NullPointerException in reference to the
    SerializationContext object. And I can't seem to find any documentation on this
    object, or anything explicitly pertaining to marshalling back to XML. So at this
    point, my less than desirable approach is to create a set of parallel classes
    using JAXB, so that I can have access to the JAXBContext marshaller.
    Here is the underlying method for our web service operation (which unsuccessfully
    calls the serialize method on the codec class):
    public com.hp.bea.OutValidateConfig validate_v1_2(com.hp.bea.InValidateConfig validateRequest) {
        InValidateConfigCodec vcrequestCodec = new InValidateConfigCodec();
        weblogic.xml.stream.XMLName xmlType =
            weblogic.xml.stream.ElementFactory.createXMLName("http://production.psg.hp.com/types",
                                                             "InValidateConfigType");
        try {
            DocumentBuilder docBuilder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document vcDoc = docBuilder.newDocument();
            XMLOutputStream xmlOutputStream = XMLOutputStreamFactory.newInstance().newOutputStream(vcDoc);
            SerializationContext ctx = SerializationContextFactory.newInstance().createSerializationContext();
            // ctx.setNamespacePrefixMap();
            if (validateRequest == null)
                System.out.println("ValidateRequest is null");
            else
                System.out.println("ValidateRequest is not null");
            System.out.println(validateRequest);
            try {
                vcrequestCodec.serialize(validateRequest, xmlType, xmlOutputStream, ctx);
            } catch (Exception excp) {
                excp.printStackTrace();
            }
            // TODO - call service handler
            Document vcResp = docBuilder.newDocument();
            OutValidateConfigCodec vcResponseCodec = new OutValidateConfigCodec();
            XMLInputStream xmlInputStream = XMLInputStreamFactory.newInstance().newInputStream(vcResp);
            DeserializationContext dCtx = DeserializationContextFactory.newInstance().createDeserializationContext();
            vcResponseCodec.deserialize(xmlType, xmlInputStream, dCtx);
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        return null;
    }
    There must be something I'm missing, because I'm sure there is a simple way to
    marshal my object back to XML. Am I using the wrong API? Can someone please
    shed some light on this? Any insight or suggestions are much appreciated.
    Thank you,
    Matt

    Hi Matthew,
    Try this example from Manoj [1] and see if it fits your use case.
    Hope this helps,
    Bruce
    [1]
    http://www.manojc.com/?sample43

  • Problem to access the object attribute in a code

    Hi guys,
    I need to pass an attribute of a business object to a class. Below is the code, but it throws the error 'There is no component ZYBTT in l_bus2000116'. The business object does contain the attribute, yet the error is still raised. Could anyone throw some light on the code below?
    data: l_BUS2000116 type swc_object,
          Z_BRF_Function_name TYPE FDT_UUID.
    swc_get_element ac_container 'Z_Quotation' l_BUS2000116.
    swc_get_element ac_container 'Z_BRF_Function_name' Z_BRF_Function_name.
    CALL METHOD ZYCLOTO_AGENTDTMT_BRF=>ZYOTOM_GET_AGENT_IDS_SING_LEV
      EXPORTING
        IM_FUNCTION_ID  = Z_BRF_Function_name
        IM_BTT          = l_BUS2000116-ZYBTT
       IM_COMPL_CAT    =
       IM_COUNTRY_CODE = ZYCOUNTRY
        IM_LEAD_BRAND   = l_BUS2000116-leadbrand
        IM_RISK_LEVEL   = l_BUS2000116-ZYRISKLEVEL
        IM_SALES_ORG    = l_BUS2000116-ZYsalesorg
        IM_TCV          = l_BUS2000116-ZYTCV.
    IMPORTING
       EX_FULL_NAME    =
       EX_RETURN       =

    Hi,
    Is ZYBTT a new attribute? Have you changed the status of the attribute to implemented or released (in SWO1: Edit -> Change release status)?
    And actually I think that your syntax is wrong too (the BOR world is different from ABAP classes). If you want to use an attribute, I think the syntax should be something like this:
    SWC_GET_PROPERTY <Object> <Attribute> <AttributeValue>.
    Read more here:
    http://help.sap.com/saphelp_nw04/helpdata/en/c5/e4acef453d11d189430000e829fbbd/content.htm
    Regards,
    Karri

  • Oracle 10g - Data Pump: Export / Import of Sequences ?

    Hello,
    I'm new to this forum and also to Oracle (version 10g). Since I could not find an answer to my question, I am opening this post hoping to get some help from experienced users.
    My question concerns the Data Pump Utility and what happens to sequences which were defined in the source database:
    I have exported a schema with the following command:
    "expdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp LOGFILE=logfile.log"
    This worked fine and also the import seemed to work fine with the command:
    "impdp <user>/<pass> DIRECTORY=DATA_PUMP_DIR DUMPFILE=dumpfile.dmp"
    It loaded the exported objects directly into the schema of the target database.
    BUT:
    Something has happened to my sequences. :-(
    When I want to use them, all sequences start again at value "1". Since my tables already contain data with higher values, I run into trouble with the PKs of these tables, because I sometimes used sequences as primary keys.
    My questions are:
    1. Did I do something wrong with the Data Pump utility?
    2. What is the correct way to export and import sequences so that they keep their current values?
    3. If the behaviour described here is correct, how can I fix the sequences so that they continue from the last value that was used in the source database?
    Thanks a lot in advance for any help concerning this topic!
    Best regards
    FireFighter
    P.S.
    My English may not be perfect since it is not my native language. Sorry for that! ;-)
    But I hope someone can understand it nevertheless. ;-)

    > 1. Did I do something wrong with the Data Pump utility?
    I do not think so, but maybe something is wrong with the existing schema. :-(
    > 2. What is the correct way to export and import sequences so that they keep their current values?
    If the sequences already exist in the target before the import, Oracle does not drop and recreate them. So you need to ensure that the sequences do not already exist in the target, or that the existing ones are dropped before the import.
    > 3. If the behaviour described here is correct, how can I fix the sequences so that they continue from the last value that was used in the source database?
    You can either re-run the import after the above correction, or drop and manually recreate the sequences to START WITH the next value of the source sequences.
    The easier way is to generate a script from the source, if you know how to do it (see the sketch below).
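    A minimal sketch of generating such a script from the source schema (run as the schema owner; note that for cached sequences LAST_NUMBER can be slightly ahead of the last value actually used, which is usually acceptable here):
    SELECT 'CREATE SEQUENCE ' || sequence_name ||
           ' START WITH '     || last_number   ||
           ' INCREMENT BY '   || increment_by  || ';' AS ddl
      FROM user_sequences;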

  • Need help exporting large file to bitmap format

    I have a large file that I need to export to a bitmap format and I am experiencing unknown "IMer" errors, memory and file size limitations.
    The file contains only one artboard of size around 8000 pixels by 6000 pixels. I can successfully export as a png at 72dpi giving me a 100% image of my AI file. However I need to export an image at 200% of the AI file. If I export as a png at 144dpi to achieve this I get the following error(s)
    "The operation cannot complete because of an unknown error (IMer)" or "Could not complete this operation. The rasterised image exceeded the maximum bounds for savung to the png format"
    I have tried exporting as tiff, bmp and jpeg but all give "not enough memory"  or "dimensions out of range" errors.
    Does anyone know the limitations with AI for saving large files to bitmap format?
    Does anyone have a workaround I could try?

    You could try printing to Adobe Postscript files.
    This will actually save a .ps file and you can scale it to 200% when doing so. You'll need to "print" for every image change.
    Then open a Photoshop document at your final size (16000x12000 pixels) and place or drag the .ps files into the Photoshop document.
    I just tested this and it seems to work fine here. But, of course, I don't have your art, so complexity of objects might factor in and I'm not seeing that here.
    And you are aware that Photoshop won't allow you to save a png at those dimension either, right? The image is too large for PNG. Perhaps you could explain why you specifically need a png that large???

  • Need help for finding objects impacted by size change for an infoobject

    hi all,
    need help finding the objects impacted by a size change:
    for the xxx infoobject, due to some requirements, the size has to be changed from
    CHAR(4) to CHAR(10) in the source database tables, and the corresponding adjustment
    has to be done on the BI side.
    This infoobject xxx is a navigational attribute of YYY as well as of WWW,
    and xxx is loaded via the InfoPackage for the WWW infoobject load.
    Now I have to prepare an impact analysis document for the BI side.
    Please help me with what could be impacted and what needs to be done
    to implement the size change.
    FYI:
    the where-used list for the xxx infoobject reveals these object types:
    infocubes,
    infosources,
    tranfer rules,
    DSO.
    attribute of characteristic,
    nav attribute,
    ref infoobject,
    in queries,
    in variables

    Hi Swetha,
    You will have to manually make the table adjustments in all the systems using SE14 trans since the changes done using SE14 cannot be collected in any TR.
    How to adjust tables :
    Enter the table name in SE14. For example, for any Z master data (say ZABCD), the master data table name would be /BIC/PZABCD and the text table would be /BIC/TZABCD. Similarly, the table for any DSO (say ZXYZ) would be /BIC/AZXYZ00, etc.
    Just enter the table name in SE14 trans --> Edit --> Select the radio button "Save Data" --> Click on Activate & adjust database table.
    NOTE : Be very careful in using SE14 trans since there is possibility that the backend table could be deleted.
    How to collect the changes in TR:
    You can collect only the changes made to the IO --> when you activate, it will ask you for the TR --> enter the correct package name & create a new TR. If it doesn't prompt you for a TR, just go to Extras --> Write transport request from the IO properties menu screen. Once these IO changes are moved successfully, the above procedure can be followed using the SE14 transaction.
    Hope it helps!
    Regards,
    Pavan

  • Setting object attribute in dynamic sql

    I am trying to set an object attribute in PL/SQL. It looks like EXECUTE IMMEDIATE is unhappy with the following:
    create type test_object as object(username varchar2(200),id number);
    declare
    obj test_object;
    username varchar2(50);
    begin
    obj := test_object(null,null);
    username := 'MIKE';
    execute immediate 'begin :x.username := :y; end;' using obj, username;
    end;
    Is this supposed to work?
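    Not exactly as written: USING arguments default to IN mode, so even when the dynamic block runs, nothing is copied back into obj. One way around it (an untested sketch, assuming the test_object type above) is to bind the object IN OUT and do the attribute assignment on a local variable inside the dynamic block:
    DECLARE
      obj      test_object  := test_object(NULL, NULL);
      username VARCHAR2(50) := 'MIKE';
    BEGIN
      -- :x is bound IN OUT, so the value assigned to it inside the dynamic block
      -- is copied back into obj when the call returns.
      EXECUTE IMMEDIATE
        'declare l test_object; begin l := :x; l.username := :y; :x := l; end;'
        USING IN OUT obj, IN username;
      dbms_output.put_line(obj.username);  -- prints MIKE
    END;
    /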

    Hi Tony,
    Thanks so much for your response. I've had to study up on the dbms_sql package to understand your function... first time I've used it. I've fed my dynamic query to your function and see that it returns a colon delimited list of the column names; however, I think I need a little more schooling on how and where exactly to apply the function to actually set the column names in APEX.
    From my test app, here is the code for my dynamic query. I've got it in a "PL/SQL function body returning sql query" region:
    DECLARE 
      v_query      VARCHAR2(4000);
      v_as         VARCHAR2(4);
      v_range_from NUMBER;
      v_range_to   NUMBER;         
    BEGIN
      v_range_from := :P1_FY_FROM;
      v_range_to   := :P1_FY_TO;
      v_query      := 'SELECT ';
      -- build the dynamic column selections by looping through the fiscal year range.
      -- v_as is meant to specify the column name as (FY10, FY11, etc.), but it's not working.
      FOR i IN v_range_from.. v_range_to  LOOP
        v_as    := 'FY' || SUBSTR(i, 3, 4);
        v_query := v_query || 'MAX(DECODE(FY_NB,' || i || ',PFH_HEADCOUNT,0)) '
          || v_as || ',';
      END LOOP;
      -- add the rest of the query to the dynamic column selection
      v_query := rtrim(v_query,',') || ' FROM ('
        || 'SELECT FY_NB, PFH_HEADCOUNT FROM ('
        || 'SELECT FY_ID, FY_NB FROM FISCAL_YEAR) A '
        || 'LEFT OUTER JOIN ('
        || 'SELECT FY_ID, PFH_HEADCOUNT '
        || 'FROM PROJECT_FY_HEADCOUNT '
        || 'JOIN PROJECT_FY USING (PF_ID) '
        || 'WHERE PL_ID = ' || :P1_PROJECT || ') B '
        || 'ON A.FY_ID = B.FY_ID)';
      RETURN v_query;
    END;
    I need to invoke GET_QUERY_COLS(v_query) somewhere to get the column names, but I'm not sure where I need to call it and how to actually set the column names after getting the returned colon-delimited list.
    Can you (or anyone else) please help me get a little further? Once again, feel free to login to my host account to see it first hand.
    Thanks again!
    Mark

  • Which 'Values' need to Export in START-OF-SELECTION for Hotspot

    I developed a small test report for a hotspot, and it's mostly working fine.
    Actually, the error is at START-OF-SELECTION.
    I tried many combinations for passing the parameters, but it's not working.
    Which values do I need to export?
    START-OF-SELECTION.
      CREATE OBJECT lr_details.
      lr_details->data_gathering( ).
      IF gi_final IS NOT INITIAL.
        lr_details->display_alv( ).
        lr_details->on_double_click( EXPORTING row = ? column = '?' ). " ?
        lr_details->on_link_click( EXPORTING row = ? column = '?' ).     " ?
      ELSE.
        MESSAGE 'No Data for the Selection Criteria' TYPE 'S' DISPLAY LIKE 'E'.
      ENDIF.
    Here row & column are obligatory fileds - I need to pass the Values
    (I am already passing the Values with Form get_aufnr_info.)
    I try with Hard Coding row = 1 & column = 'AUFNR'
        lr_details->on_double_click( EXPORTING row = 1 column = 'AUFNR' ).
        lr_details->on_link_click( EXPORTING row = 1 column = 'AUFNR' ).
    One may find it silly, but in this case the hotspot is not working, and when I go 'BACK' it shows IW33 (the required transaction).
    The complete code for the hotspot is as follows.
    METHODS:  data_gathering,
                  display_alv,
                  on_double_click FOR EVENT double_click OF cl_salv_events_table IMPORTING row column,
                  on_link_click FOR EVENT link_click OF cl_salv_events_table IMPORTING row column.
    METHOD on_double_click.
        PERFORM get_aufnr_info USING row column.
      ENDMETHOD.                   
      METHOD on_link_click.
        PERFORM get_aufnr_info USING row column.
      ENDMETHOD.
    FORM get_aufnr_info  USING row TYPE salv_de_row
                            column TYPE salv_de_column.
      IF column = 'AUFNR'.
        CLEAR: gwa_final.
        READ TABLE gi_final INTO gwa_final INDEX row.
        IF sy-subrc EQ 0.
          SET PARAMETER ID 'ANR' FIELD gwa_final-aufnr.
          CALL TRANSACTION 'IW33' AND SKIP FIRST SCREEN.
        ENDIF.
      ENDIF.
    ENDFORM.      

    Thanks Clemens & Lakshmi,
    But Here I am not taking a FieldCat and
    cl_gui_custom_container, "Detail container
    cl_gui_alv_grid        , "Detail ALV instance
    Instead, I am displaying it like this:
    DATA:  lr_table     TYPE REF TO cl_salv_table,
               lr_layout    TYPE REF TO cl_salv_layout,
               lr_functions TYPE REF TO cl_salv_functions_list,
               lr_columns   TYPE REF TO cl_salv_columns_table,
               lr_column1   TYPE REF TO cl_salv_column_table,
               lr_column    TYPE REF TO cl_salv_column,
               lr_events    TYPE REF TO cl_salv_events_table .
        DATA  : gr_events TYPE REF TO get_details.
        DATA: ls_layout TYPE salv_s_layout_key.
        ls_layout-report = sy-repid.
          TRY.
            CALL METHOD cl_salv_table=>factory           " call factory method of alv
    *        EXPORTING
              IMPORTING
                r_salv_table   = lr_table
              CHANGING
                t_table        = gi_final.
          CATCH cx_salv_not_found .
        ENDTRY.
        lr_functions = lr_table->get_functions( ).    " activate the ALV functionality
        lr_layout  = lr_table->get_layout( ).
        lr_table->display( ).
          TRY.
            CALL METHOD pr_columns->get_column
              EXPORTING
                columnname = 'AUFNR'
              RECEIVING
                value      = lr_column.
          CATCH cx_salv_not_found.
        ENDTRY.
        lr_column->set_long_text( text-006 ).
    I have already developed reports with a field catalog (set_table_for_first_display) using double_click,
    and I also created a small test report where I handle double_click in the class definition.
    Both are working.
    But when I call double_click from START-OF-SELECTION in this report, it's not working.
    If I should not call double_click in START-OF-SELECTION, then how should I call it?

  • Entity object attribute with a list of objects

    Does anyone know how one sets up an entity object that has an attribute with a list of objects as the type? (assuming that's supported)
    as in:
    CREATE TYPE phones AS VARRAY(10) OF varchar2(10);
    Create table suppliers (supcode number(5),
    Company varchar2(20),
    ph phones);
    The SOA Suite in JDeveloper (new Entity Object/attributes etc.) has an ARRAY type that can point to REF or OBJECT. Neither works. When I later try to Create DB Object from the entity object I've created, I get an invalid type.

    What you suggested about "validation codes on the VO" is not written in the ADF documentation.
    I try to strictly follow best practices (particularly on validations, using declarative and/or built-in validators) from most ADF documentation and blogs, but there are many scenarios when coding large ADF projects where I think one must veer away from the best practices stated in the documentation, or add new rules to it, depending on how complex the ADF project is.
    I religiously followed the best practice stated in the documentation to use entity and attribute validators when performing validations. What I did was create lots of custom validators (by implementing the JboValidatorInterface interface) for each of the attributes on an entity object that needed validation. Each of those validators is valid for only one attribute, so it is not reusable. And the validation code either references a view object or calls a PL/SQL procedure, so at some point the code became messy.
    Ultimately the whole project became harder to manage as the code base grew. Now I am trying to refactor the whole application by separating it into projects/packages, and I am hoping to do it with as little re-coding as possible.
    Hope to get your opinion on this one.
    regards,
    Anton
