More About Data Template

I've searched and referenced some sample Data Template code from the community.
I tried changing the parameters and SQL in the data template to what I think they should be for my case, and got it working by submitting it via the Concurrent Manager.
However, I don't really know Data Templates beyond adapting someone else's example.
For example, I have no idea what effect turning "include_parameters" on or off has:
<property name="include_parameters" value="false" />
Just wondering where I can find more information about coding Data Templates?
Any info, reference or documentation would really be helpful.
Thanks in advance!
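For what it's worth, here is roughly where that property sits; as far as I understand it, include_parameters controls whether the report parameter values are echoed into the generated XML. This is a minimal sketch with made-up names (XX_ORDERS, P_ORDER_NUM), not taken from any delivered report:
<dataTemplate name="EXAMPLE_DT" description="include_parameters sketch" version="1.0">
  <properties>
    <!-- false = do not write the parameter values into the output XML -->
    <property name="include_parameters" value="false"/>
  </properties>
  <parameters>
    <parameter name="P_ORDER_NUM" dataType="number"/>
  </parameters>
  <dataQuery>
    <sqlStatement name="Q1">
      <![CDATA[SELECT order_num, order_total FROM xx_orders WHERE order_num = :P_ORDER_NUM]]>
    </sqlStatement>
  </dataQuery>
  <dataStructure>
    <group name="G_ORDER" source="Q1">
      <element name="ORDER_NUM" value="order_num"/>
      <element name="ORDER_TOTAL" value="order_total"/>
    </group>
  </dataStructure>
</dataTemplate>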

With your references, I've finally completed a truly customized BIP report by using Data Template and RTF Template.
In case anyone also needs RTF Template references, here is the link as well.
http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/e12187/T421739T481157.htm
Thanks everyone.

Similar Messages

  • More about Data Load

    Hello All,
    I am learning Importing CSV file and I have different errors and warnings.
    I would like to know more about Utilities > Data Load.
    Any recommendations for links, books or other information about it would be appreciated.
    Thanks,
    NYorker

    It is a good question.
    Plus, illegal schema and tablespace names are another issue.
    Any thoughts?
    EB

  • OBI Publisher Data Template

    Hi all
    Please can someone tell me where I can find a description of Data Templates.
    I want to call a PL/SQL Proc inside some package and get data from it.
    I am using OBI Publisher to create a new report and define the data model in it.
    Please guide me to some good online tutorial about data templates.
    Thanks a lot
    Amit

    Hi,
    You can get some info about data templates (structure etc.) in the OBI Publisher Report Designer's Guide.
    If you want to call a PL/SQL procedure, you should define your default package and then call the procedure from it in a dataTrigger section.
    Read this guide for additional info and examples =)
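    A rough sketch of what that looks like - the package, table and column names here are invented, so substitute your own:
    <dataTemplate name="EMP_LIST" defaultPackageName="xx_emp_report_pkg" version="1.0">
      <parameters>
        <parameter name="P_DEPTNO" dataType="number"/>
      </parameters>
      <dataQuery>
        <sqlStatement name="Q1">
          <![CDATA[SELECT empno, ename, sal FROM emp WHERE deptno = :P_DEPTNO]]>
        </sqlStatement>
      </dataQuery>
      <!-- beforeReport fires before the data query runs; the function lives in the default package -->
      <dataTrigger name="beforeReport" source="xx_emp_report_pkg.before_report"/>
      <dataStructure>
        <group name="G_EMP" source="Q1">
          <element name="EMPNO" value="empno"/>
          <element name="ENAME" value="ename"/>
          <element name="SAL" value="sal"/>
        </group>
      </dataStructure>
    </dataTemplate>
    The package would then declare a variable named after each parameter (p_deptno) and a before_report function returning boolean, which can prepare data (for example, fill a temporary table) before the query runs.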

  • I was updating software and suddenly my iPhone started asking for iTunes on the screen. How can I get my screen back, or how can I restore without losing data? I'm more worried about the data. Please help with a resolution.

    I was updating software and suddenly my iPhone started asking for iTunes on the screen. How can I get my screen back, or how can I restore without losing data? I'm more worried about the data. Please help with a resolution.

    What exactly are you seeing on the phone's screen? If the iTunes icon and a cable, then the phone is in recovery mode, in which case it's too late to take a new backup or to copy any content off the phone. You will have to connect the phone to your computer's iTunes and reset it back to factory defaults, after which you can restore from the last backup that you took, or just resync your content to it.

  • Replacement of iPad 3 with iPad 4? I want to know more about the policy within 14 days of the date of purchase. How can I replace it with the iPad 4?

    Hi
    I want to know the replacement policy within 14 days of purchase of an iPad 3. I bought my iPad 3 three days before the launch of the iPad 4, and I want to know more about the 14-day policy. I bought my iPad 3 not from the Apple Store, but from an authorized Apple center. If they are not part of you, why are you giving them the license to sell your product? Is this not cheating? We are purchasing an Apple product. We trust Apple, that's why we purchase from them.
    I bought Apple's iPad 3, not a Samsung or Nokia one, so Apple is responsible, not anyone else, because Apple didn't announce anything beforehand, nor did they cut prices on the iPad 3.
    Now my point is this: I want to know more about the 14-day exchange policy for those who bought their iPad 3 within 14 days from the dealers.
    I want to replace my iPad 3 with the iPad 4 because I bought it just 3 days before the new one launched. Same price, less technology, 3 days before... wow.
    New CEO Tim Cook is taking a great step for his loyal customers. Tim, are you working with Samsung now? How much are they paying you to destroy Apple?

    I rang my local Apple Store this morning and they told me that they will swap the iPad 3 for the iPad 4. I have the 4G version and I'm in the UK; I'm sure that may be why they will do it for me, since the 4G networks in the UK are not compatible with the iPad 3.
    Just ring your local store; I'm sure they will help you with all the info.

  • About the template FSCM9.1 FP2 Peopletools 8.52.03 (v4 - July 2012)

    Hello,
    I just quickly tested this new template delivered two months ago (July 2012).
    As far as I understand, it is just a recut of the one delivered in April 2012. At least it solves the main issue I reported in the other thread, About the template FSCM9.1 FP2 Peopletools 8.52.03 (v3), about the missing network prompt.
    But I still have remarks/issues on the template FSCM9.1 FP2 Peopletools 8.52.03 (v4) released earlier in July 2012.
    _1. First of all, there are a lot of errors reported in /var/log/messages_
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526918.883:3): avc:  denied  { read } for  pid=92 comm="restorecon" name="libc.so.6" dev=xvda2 ino=21 scontext=system_u:system_r:restorecon_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=lnk_file
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526918.910:4): avc:  denied  { execute } for  pid=92 comm="restorecon" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext=system_u:system_r:restorecon_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.489:5): avc:  denied  { read } for  pid=296 comm="pam_console_app" name="ld.so.cache" dev=xvda2 ino=94143 scontext=system_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=file
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.489:6): avc:  denied  { getattr } for  pid=290 comm="pam_console_app" path="/etc/ld.so.cache" dev=xvda2 ino=94143 scontext=system_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=file
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.530:7): avc:  denied  { read } for  pid=293 comm="pam_console_app" name="libc.so.6" dev=xvda2 ino=21 scontext=system_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=lnk_file
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526921.530:8): avc:  denied  { execute } for  pid=293 comm="pam_console_app" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext=system_u:system_r:pam_console_t:s0-s0:c0.c1023 tcontext=system_u:object_r:file_t:s0 tclass=file
    Sep 13 05:02:55 localhost kernel: input: PC Speaker as /class/input/input3
    Sep 13 05:02:55 localhost kernel: Initialising Xen virtual ethernet driver.
    Sep 13 05:02:55 localhost kernel: Error: Driver 'pcspkr' is already registered, aborting...
    Sep 13 05:02:55 localhost kernel: Floppy drive(s): fd0 is unknown type 15 (usb?), fd1 is unknown type 15 (usb?)
    Sep 13 05:02:55 localhost kernel: floppy0: Unable to grab IRQ6 for the floppy driver
    Sep 13 05:02:55 localhost kernel: lp: driver loaded but no devices found
    Sep 13 05:02:55 localhost kernel: md: Autodetecting RAID arrays.
    Sep 13 05:02:55 localhost kernel: md: Scanned 0 and added 0 devices.
    Sep 13 05:02:55 localhost kernel: md: autorun ...
    Sep 13 05:02:55 localhost kernel: md: ... autorun DONE.
    Sep 13 05:02:55 localhost kernel: EXT3 FS on xvda2, internal journal
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526929.896:9): avc:  denied  { execute } for  pid=965 comm="restorecon" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext=system_u:system_r:restorecon_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
    Sep 13 05:02:55 localhost kernel: kjournald starting.  Commit interval 5 seconds
    Sep 13 05:02:55 localhost kernel: EXT3 FS on xvda1, internal journal
    Sep 13 05:02:55 localhost kernel: EXT3-fs: mounted filesystem with ordered data mode.
    Sep 13 05:02:55 localhost kernel: kjournald starting.  Commit interval 5 seconds
    Sep 13 05:02:55 localhost kernel: EXT3 FS on xvdb1, internal journal
    Sep 13 05:02:55 localhost kernel: EXT3-fs: mounted filesystem with ordered data mode.
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526930.647:10): avc:  denied  { execute } for  pid=989 comm="setfiles" path="/lib64/libc-2.5.so" dev=xvda2 ino=20 scontext=system_u:system_r:setfiles_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526942.398:11): avc:  denied  { net_admin } for  pid=990 comm="setfiles" capability=12  scontext=system_u:system_r:setfiles_t:s0 tcontext=system_u:system_r:setfiles_t:s0 tclass=capability
    Sep 13 05:02:55 localhost kernel: hrtimer: interrupt took 35229469 ns
    Sep 13 05:02:55 localhost kernel: Adding 2104504k swap on /dev/xvda3.  Priority:-1 extents:1 across:2104504k SS
    Sep 13 05:02:55 localhost kernel: warning: process `kudzu' used the deprecated sysctl system call with 1.23.
    Sep 13 05:02:55 localhost kernel: Loading iSCSI transport class v2.0-870.
    Sep 13 05:02:55 localhost kernel: libcxgbi:libcxgbi_init_module: tag itt 0x1fff, 13 bits, age 0xf, 4 bits.
    Sep 13 05:02:55 localhost kernel: libcxgbi:ddp_setup_host_page_size: system PAGE 4096, ddp idx 0.
    Sep 13 05:02:55 localhost kernel: Chelsio T3 iSCSI Driver cxgb3i v2.0.0 (Jun. 2010)
    Sep 13 05:02:55 localhost kernel: iscsi: registered transport (cxgb3i)
    Sep 13 05:02:55 localhost kernel: NET: Registered protocol family 10
    Sep 13 05:02:55 localhost kernel: cnic: Broadcom NetXtreme II CNIC Driver cnic v2.2.14 (Mar 30, 2011)
    Sep 13 05:02:55 localhost kernel: Broadcom NetXtreme II iSCSI Driver bnx2i v2.6.2.3 (Jan 06, 2010)
    Sep 13 05:02:55 localhost kernel: iscsi: registered transport (bnx2i)
    Sep 13 05:02:55 localhost kernel: iscsi: registered transport (tcp)
    Sep 13 05:02:55 localhost kernel: iscsi: registered transport (iser)
    Sep 13 05:02:55 localhost kernel: iscsi: registered transport (be2iscsi)
    Sep 13 05:02:55 localhost kernel: ip6_tables: (C) 2000-2006 Netfilter Core Team
    Sep 13 05:02:55 localhost kernel: warning: `mcstransd' uses 32-bit capabilities (legacy support in use)
    Sep 13 05:02:55 localhost kernel: type=1400 audit(1347526970.336:12): avc:  denied  { sys_tty_config } for  pid=1374 comm="consoletype" capability=26  scontext=system_u:system_r:consoletype_t:s0 tcontext=system_u:system_r:consoletype_t:s0 tclass=capability
    Sep 13 05:02:55 localhost kernel: RPC: Registered udp transport module.
    Sep 13 05:02:55 localhost kernel: RPC: Registered tcp transport module.
    Sep 13 05:02:55 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
    Sep 13 05:03:00 localhost automount[1769]: lookup_read_master: lookup(nisplus): couldn't locate nis+ table auto.master
    Sep 13 05:03:59 localhost kernel: type=1400 audit(1347527039.771:13): avc:  denied  { sys_tty_config } for  pid=2029 comm="consoletype" capability=26  scontext=system_u:system_r:consoletype_t:s0 tcontext=system_u:system_r:consoletype_t:s0 tclass=capability
    Sep 13 05:04:00 localhost NET[2061]: /sbin/dhclient-script : updated /etc/resolv.conf
    Sep 13 05:04:01 localhost kernel: IPv6 over IPv4 tunneling driver
    Sep 13 05:04:01 localhost NET[2219]: /opt/oracle/psft/vm/oraclevm-template.sh : updated /etc/resolv.conf
    Sep 13 05:04:08 localhost NET[2472]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
    Sep 13 05:06:08 localhost restorecond: Reset file context /etc/resolv.conf: system_u:object_r:etc_runtime_t:s0->system_u:object_r:net_conf_t:s0
    Sep 13 05:08:19 localhost kernel: Slow work thread pool: Starting up
    Sep 13 05:08:19 localhost kernel: Slow work thread pool: Ready
    Sep 13 05:08:19 localhost kernel: FS-Cache: Loaded
    Sep 13 05:08:19 localhost kernel: FS-Cache: Netfs 'nfs' registered for caching
    Sep 13 05:08:19 localhost kernel: svc: failed to register lockdv1 RPC service (errno 97).
    ...Well, I don't know if it triggers other problems yet, but the last line could reveal an error within the /etc/hosts file, which has not been properly modified during deployment (especially the IPv6 entry, which probably should be removed):
    [root@psovmfscmfp2 /]# more /etc/hosts
    127.0.0.1       localhost.localdomain   localhost
    ::1     localhost6.localdomain6 localhost6
    192.168.1.150   psovmfscmfp2.phoenix.nga psovmfscmfp2
    [root@psovmfscmfp2 /]#
    _2. Now about the COBOL_
    Although I chose to install Micro Focus COBOL, it does not work. Sample COBOL processes such as PTPDBTST and PTPDTTST finish in error.
    The log is empty; here below is the output from the file $PS_CFG_HOME/psft/pt/8.52/appserv/prcs/PRCSDOM/LOGS/stdout (psadm2):
    =================================Error===============================
    Message:     Process 10899 is marked 'Initiated' or 'Processing' but can not detect status of PID
            Process Name: PTPDBTST
            Process Type: COBOL SQL
            Session Id:   9313
    =====================================================================
    OprId = VP1
    Note that I successfully tested AEs and SQRs.
    Here is the command line fired, as seen from the process monitor > parameters (nga being my run control id):
    PSRUN PTPDBTST ORACLE/F91TMPLT/VP1/OPRPSWD/nga/10899//0
    I used the following trace setting on PTPDBTST's process parameter (override) to see what happens:
    %%DBTYPE%%/%%DBNAME%%/%%OPRID%%/%%OPRPSWD%%/%%RUNCNTLID%%/%%INSTANCE%%//%%DBFLAG%%
    But it does not generate more logs...
    I also use "RCCBL Redirect =1" in psappsrv.cfg (reconfigure and restart appdom), then start the COBOL through menu PeopleTools > Utilities > Debug > PeopleTools Test Utilities, and run a "Remote Call Test".
    I received "COBOL Program PTPNTEST aborted (2,-1) FUNCLIB_UTIL.RC_TEST_PB.FieldChange PCPC:2143 Statement:26", but it generated two empty files (PTPNTEST_VP1_0913064910.out and PTPNTEST_VP1_0913064910.err).
    Next step, checking the folder $PS_HOME/cblbin, it is...er... empty... does this mean COBOL have not been compiled ? Hmmm, I'm pretty sure I replied 'yes' when it was prompted though (still have the screenshots)...
    And we can see several folders dated from today and license seems ok from Microfocus directories :
    [psadm1@psovmfscmfp2 tools]$ cd /opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
    [psadm1@psovmfscmfp2 svrexp-51_wp4-64bit]$ ls -lrt
    total 264
    -r--r--r--  1 root root 10455 Nov 19  2009 ADISCTRL
    dr-xr-xr-x 10 root root  4096 Nov 19  2009 terminfo
    dr-xr-xr-x  2 root root  4096 Nov 19  2009 xdb
    -r--r--r--  1 root root 11949 Nov 19  2009 eslmf-mess
    dr-xr-xr-x  2 root root  4096 Nov 19  2009 include
    dr-xr-xr-x 17 root root  4096 Nov 19  2009 lang
    dr-xr-xr-x  4 root root  4096 Nov 19  2009 es
    dr-xr-xr-x  2 root root  4096 Nov 19  2009 dynload
    drwxrwxrwx  2 root root  4096 Nov 19  2009 deploy
    dr-xr-xr-x  2 root root  4096 Nov 19  2009 dynload64
    dr-xr-xr-x  2 root root  4096 Nov 19  2009 dialog
    dr-xr-xr-x  2 root root  4096 Nov 19  2009 cpylib
    dr-xr-xr-x  8 root root 28672 Nov 19  2009 lib
    dr-xr-xr-x  3 root root  4096 Nov 19  2009 snmp
    dr-xr-xr-x  8 root root  4096 Nov 19  2009 src
    dr-xr-xr-x 28 root root  4096 Nov 19  2009 demo
    dr-xr-xr-x  6 root root  4096 Nov 19  2009 docs
    -rw-r--r--  1 root root    49 Sep 13 05:13 license.txt
    -r-xr-xr-x  1 root root 12719 Sep 13 05:13 install.orig
    -r-xr-xr-x  1 root root 13006 Sep 13 05:13 install
    dr-xr-xr-x  6 root root  4096 Sep 13 05:13 lmf
    dr-xr-xr-x  2 root root  4096 Sep 13 05:13 aslmf
    dr-xr-xr-x  6 root root  4096 Sep 13 05:15 etc
    dr-xr-xr-x  4 root root 12288 Sep 13 05:15 bin
    [psadm1@psovmfscmfp2 svrexp-51_wp4-64bit]$ more license.txt
    I
    ORACLE-30DAYDEV64
    01030 A0780 014A6 7980B A17C
    So let's assume it has been properly installed, and let's compile the COBOLs. Here we go:
    [psadm1@psovmfscmfp2 svrexp-51_wp4-64bit]$ cd $PS_HOME/setup
    [psadm1@psovmfscmfp2 setup]$ ./pscbl.mak
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Convert all files for Unicode ....
    Conversion Summary for Source Codes in  :
         Source: /opt/oracle/psft/pt/tools/src/cbl/
         Target: /opt/oracle/psft/pt/tools/src/cblunicode/
          Number of Copy Libraries Read: 71
                         Modified:       71
                     Not Modified:       0
          Number of Programs Read:       44
                         Modified:       44
                     Not Modified:       0
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL files were converted for Unicode successfully
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCBLAE.cbl ...
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak: line 249: cob: command not found
    cp: cannot stat `PTPCBLAE.gnt': No such file or directory
    cp: cannot stat `PTPCBLAE.int': No such file or directory
    cp: cannot stat `PTPCBLAE.lst': No such file or directory
    ...What about the environment variables? COBDIR, COBPATH and COBOL do not appear anywhere in PATH...
    [psadm1@psovmfscmfp2 setup]$ env|grep -i cobol
    [psadm1@psovmfscmfp2 setup]$
    Let's set the env variables as we would expect them to be (page 27, step 17 of the given doc), and retry compiling the COBOLs:
    [psadm1@psovmfscmfp2 setup]$ export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
    [psadm1@psovmfscmfp2 setup]$ export LD_LIBRARY_PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/lib:$LD_LIBRARY_PATH
    [psadm1@psovmfscmfp2 setup]$ export PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/bin:$PATH
    [psadm1@psovmfscmfp2 setup]$ ./pscbl.mak
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Convert all files for Unicode ....
    Conversion Summary for Source Codes in  :
         Source: /opt/oracle/psft/pt/tools/src/cbl/
         Target: /opt/oracle/psft/pt/tools/src/cblunicode/
          Number of Copy Libraries Read: 71
                         Modified:       71
                     Not Modified:       0
          Number of Programs Read:       44
                         Modified:       44
                     Not Modified:       0
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL files were converted for Unicode successfully
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCBLAE.cbl ...
    Micro Focus LMF - 010: Unable to contact license manager. This product has been unable to contact the License Manager. Execution of this product has been terminated. This product cannot execute without the License Manager. Contact your license administrator or refer to the 'Information Messages' chapter of the License Management Facility Administrator's Guide.
    cob64: error(s) in compilation: PTPCBLAE.cbl
    cp: cannot stat `PTPCBLAE.gnt': No such file or directory
    cp: cannot stat `PTPCBLAE.int': No such file or directory
    cp: cannot stat `PTPCBLAE.lst': No such file or directory
    ...OK, maybe a bit better; at least it is trying to contact the LMF. Probably the LMF is not started. Let's try to start it:
    [root@psovmfscmfp2 microfocus]# ./mflmman
    MF-LMF:Thu Sep 13 07:19:37 2012: LMF Starting
    [root@psovmfscmfp2 microfocus]#
    Good, it is starting, which means it wasn't before (sic). Now retry the compile:
    [psadm1@psovmfscmfp2 setup]$ export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
    [psadm1@psovmfscmfp2 setup]$ export LD_LIBRARY_PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/lib:$LD_LIBRARY_PATH
    [psadm1@psovmfscmfp2 setup]$ export PATH=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/bin:$PATH
    [psadm1@psovmfscmfp2 setup]$ ./pscbl.mak
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Convert all files for Unicode ....
    Conversion Summary for Source Codes in  :
         Source: /opt/oracle/psft/pt/tools/src/cbl/
         Target: /opt/oracle/psft/pt/tools/src/cblunicode/
          Number of Copy Libraries Read: 71
                         Modified:       71
                     Not Modified:       0
          Number of Programs Read:       44
                         Modified:       44
                     Not Modified:       0
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL files were converted for Unicode successfully
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCBLAE.cbl ...
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPCURND.cbl ...
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPDBTST.cbl ...
    <snipped>
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : Compiling PTPWLGEN.cbl ...
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : All COBOL programs have been successfully compiled.
    /opt/oracle/psft/pt/tools/setup/pscbl_mf.mak : The COBOL executables were copied to /opt/oracle/psft/pt/tools/cblbin
    rm: cannot remove `/opt/oracle/psft/pt/apptools/src/cblunicode/CECCRLP1.cbl': Permission denied
    rm: cannot remove `/opt/oracle/psft/pt/apptools/src/cblunicode/CECCRLUP.cbl': Permission denied
    It looks better; I think the last lines marked with "Permission denied" can safely be ignored.
    Those files are owned by psadm3 with read-only access for other users (sic). But of more concern, I'm wondering why it looks into apptools (???) whereas I'm using psadm1 (tools only, COBPATH=/opt/oracle/psft/pt/tools/cblbin).
    Anyway, it seems the *.gnt files required to run the COBOL programs are now in cblbin:
    [psadm1@psovmfscmfp2 setup]$ ls /opt/oracle/psft/pt/tools/cblbin
    PTPCBLAE.gnt  PTPDTTST.gnt  PTPECOBL.gnt  PTPLOGMS.gnt  PTPRATES.gnt  PTPSQLGS.gnt  PTPTESTU.gnt  PTPTSCNT.gnt  PTPTSLOG.gnt  PTPTSTBL.gnt  PTPTSWHR.gnt
    PTPCURND.gnt  PTPDTWRK.gnt  PTPEFCNV.gnt  PTPMETAS.gnt  PTPRUNID.gnt  PTPSQLRT.gnt  PTPTESTV.gnt  PTPTSEDS.gnt  PTPTSREQ.gnt  PTPTSUPD.gnt  PTPUPPER.gnt
    PTPDBTST.gnt  PTPDYSQL.gnt  PTPERCUR.gnt  PTPNETRT.gnt  PTPSETAD.gnt  PTPSTRFN.gnt  PTPTFLDW.gnt  PTPTSEDT.gnt  PTPTSSET.gnt  PTPTSUSE.gnt  PTPUSTAT.gnt
    PTPDEC31.gnt  PTPECACH.gnt  PTPESLCT.gnt  PTPNTEST.gnt  PTPSHARE.gnt  PTPTEDIT.gnt  PTPTLREC.gnt  PTPTSFLD.gnt  PTPTSTAE.gnt  PTPTSWHE.gnt  PTPWLGEN.gnt
    [psadm1@psovmfscmfp2 setup]$
    Now have a try at linking the COBOLs:
    [psadm1@psovmfscmfp2 setup]$ ./psrun.mak
    ./psrun.mak - linking PSRUN for oel-5-x86_64, Version 2.6.32-200.13.1.el5uek ...
    ./psrun.mak - Successfully created PSRUN in directory: /opt/oracle/psft/pt/tools/bin
    ./psrun.mak - linking PSRUNRMT for oel-5-x86_64, Version 2.6.32-200.13.1.el5uek ...
    ./psrun.mak - Successfully created PSRUNRMT in directory: /opt/oracle/psft/pt/tools/bin
    [psadm1@psovmfscmfp2 setup]$
    The .err files are empty:
    -rw-r--r-- 1 psadm1 oracle     0 Sep 13 07:26 psrun.err
    -rw-r--r-- 1 psadm1 oracle     0 Sep 13 07:26 psrunrmt.err
    So far, so good. We are able to test the sample COBOL again... until the next failure.
    Yes, unfortunately, it fails again. But the good thing is, the log is not empty now:
    PSRUN: error while loading shared libraries: libcobrts64.so: cannot open shared object file: No such file or directory
    That's probably coming from some missing libraries during the psprcs.cfg configuration. Let's use the same environment variable settings as for psadm1 when compiling the COBOLs.
    [psadm2@psovmfscmfp2 appserv]$ export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
    [psadm2@psovmfscmfp2 appserv]$ export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
    [psadm2@psovmfscmfp2 appserv]$ export PATH=$COBDIR/bin:$PATH
    [psadm2@psovmfscmfp2 appserv]$ ./psadmin
    Reconfigure, restart the process scheduler and re-test... SUCCESSFULLY!!!
    The log from the PTPDBTST process shows:
    SUCCESSFUL DATABASE CONNECTION
    SUCCESSFUL DATABASE DISCONNECT
    What a pain!
    I did not go further, but we could expect the same issue with the application COBOLs, since the cblbin directory is also empty out there.
    According to psprcsrv.env, there are two directories in COBPATH, and the one for the application COBOLs is empty:
    [psadm2@psovmfscmfp2 PRCSDOM]$ more psprcsrv.env
    INFORMIXSERVER=192.168.1.149
    COBPATH=/opt/oracle/psft/pt/apptools/cblbin:/opt/oracle/psft/pt/tools/cblbin
    PATH=/opt/oracle/psft/pt/apptools/bin:/opt/oracle/psft/pt/apptools/bin/interfacedrivers::/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit/bin:/opt/oracle/psft/pt/tools/appserv:/opt/oracle/psft/pt/tools/setup:/opt/oracle/psft/pt/tools/jre/bin:/opt/oracle/psft/pt/bea/tuxedo/bin:.:/opt/oracle/psft/pt/oracle-client/11.2.0.x/bin:/opt/oracle/psft/pt/oracle-client/11.2.0.x/perl/bin:/usr/local/bin:/bin:/usr/bin:/opt/oracle/psft/pt/tools/bin:/opt/oracle/psft/pt/tools/bin/sqr/ORA/bin:/opt/oracle/psft/pt/tools/verity/linux/_ilnx21/bin:/home/psadm2/bin:.
    [psadm2@psovmfscmfp2 PRCSDOM]$ ls /opt/oracle/psft/pt/apptools/cblbin
    [psadm2@psovmfscmfp2 PRCSDOM]$ ls /opt/oracle/psft/pt/tools/cblbin
    PTPCBLAE.gnt  PTPDTTST.gnt  PTPECOBL.gnt  PTPLOGMS.gnt  PTPRATES.gnt  PTPSQLGS.gnt  PTPTESTU.gnt  PTPTSCNT.gnt  PTPTSLOG.gnt  PTPTSTBL.gnt  PTPTSWHR.gnt
    PTPCURND.gnt  PTPDTWRK.gnt  PTPEFCNV.gnt  PTPMETAS.gnt  PTPRUNID.gnt  PTPSQLRT.gnt  PTPTESTV.gnt  PTPTSEDS.gnt  PTPTSREQ.gnt  PTPTSUPD.gnt  PTPUPPER.gnt
    PTPDBTST.gnt  PTPDYSQL.gnt  PTPERCUR.gnt  PTPNETRT.gnt  PTPSETAD.gnt  PTPSTRFN.gnt  PTPTFLDW.gnt  PTPTSEDT.gnt  PTPTSSET.gnt  PTPTSUSE.gnt  PTPUSTAT.gnt
    PTPDEC31.gnt  PTPECACH.gnt  PTPESLCT.gnt  PTPNTEST.gnt  PTPSHARE.gnt  PTPTEDIT.gnt  PTPTLREC.gnt  PTPTSFLD.gnt  PTPTSTAE.gnt  PTPTSWHE.gnt  PTPWLGEN.gnt
    [psadm2@psovmfscmfp2 PRCSDOM]$
    The directory /opt/oracle/psft/pt/apptools/cblbin is owned by psadm3 and hosted on the database server (NFS mounted), so I assume we also need to set the proper environment variables and compile the COBOLs there before being able to use them.
    To summarize what I did to make the COBOLs work on this PSOVM:
    1. As root, start LMF (this has to be done only once)
    cd /opt/oracle/psft/pt/cobol/microfocus
    ./mflmman
    2. As psadm1, set proper env. variable and compile (setting env variable has to be done each time you want to compile COBOLs)
    export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
    export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
    export PATH=$COBDIR/bin:$PATH
    cd $PS_HOME/setup
    ./pscbl.mak
    ./psrun.mak
    3. As psadm2, set the proper env. variables, reconfigure psprcs.cfg and restart the process scheduler (setting the env variables has to be done each time you want to start the process scheduler, so it is probably better to append them to the .bash_profile)
    export COBDIR=/opt/oracle/psft/pt/cobol/svrexp-51_wp4-64bit
    export LD_LIBRARY_PATH=$COBDIR/lib:$LD_LIBRARY_PATH
    export PATH=$COBDIR/bin:$PATH
    cd $PS_HOME/appserv
    ./psadmin
    4. Same as step 2, but with user psadm3.
    HTH,
    Nicolas.
    PS: will it be the same issue on the HCM template delivered at the same time ? To be tested as well.
    PS2: and yes, I tested it twice before posting; the result is the same.
    Edited by: N Gasparotto on Sep 13, 2012 5:17 PM

    Fortunately, the COBOL issue does not exist on the PSOVM HCM9.1 FP2 PT8.52.06 delivered in July 2012 (v3). The COBOLs are properly compiled (tools and app COBOLs), cblbin is not empty, and they ran successfully on the first shot.
    Nicolas.

  • How to pass the parameter in "Data Template"

    Hi,
    Can anyone please help me sort out a problem with passing parameters and checking conditions on the parameters in the data template?
    Also, please tell me how to link multiple queries in a data template; as per the user guide I have, it says to use the link tag, but it is not working with my data template.
    Please check the data template given below and help me sort out the problem.
    One more thing: in the table, the data type for number and name is varchar. I don't know whether I can use that data type in a data template; please let me know if I can.
    <?xml version="1.0"?>
    <dataTemplate name="VENODR_DETAILS">
    <parameters>
    <parameter name="VENDOR_NUM" dataType="number"/>
    <parameter name="VENDOR_NAME" dataType="char"/>
    </parameters>
    <dataQuery>
    <sqlStatement name="Q1">
    <![CDATA[SELECT pv.segment1 ven_num,
              pv.vendor_name vname,
                          pvs.vendor_site_code VEN_SIT_CODE,
              pvs.purchasing_site_flag PURCH_SIT_FLG,
              pvs.pay_site_flag PAY_SIT_FLG,
              pv.vendor_name VENDOR_ADDR_NAME,
              pvs.address_line1 VENDOR_ADDR1,
              pvs.address_line2 VENDOR_ADDR2,
              pvs.city VENDOR_CITY,
              pvs.state VENDOR_STATE,
              pvs.zip VENDOR_ZIP
    FROM vendors pv,supplier_info pvi,vendor_sites_all pvs
    where pv.segment1 = pvi.vendor_id(+)
    and pv.vendor_id = pvs.vendor_id
    and (pv.segment1=:VENDOR_NUM or pv.vendor_name=:VENDOR_NAME)
    ]]>
    </sqlStatement>
    <sqlStatement name="Q2">
    <![CDATA[select pvc.area_code||'-'||pvc.phone Contact_Phone,
    pvc.last_name||' '||pvc.first_name||' '||pvc.middle_name cname
    from vendor_contacts pvc,vendor_sites_all pvs,vendors pv
    where pvs.vendor_site_id = pvc.vendor_site_id
    and pv.vendor_id = pvs.vendor_id
    and pv.segment1=(pv.segment1=:VENDOR_NUM or pv.vendor_name=:VENDOR_NAME)
    ]]>
    </sqlStatement>
    </dataQuery>
    <dataStructure>
    <group name="VENDOR" source="Q1">
    <element name="VENDOR_NUMBER" value="ven_num"/>
    <element name="VENDOR_NAME" value="vname"/>
    <element name="CATEGORY" value="VENDOR_CAT"/>
    <element name="UPDATE_DATE" value="UPDATE_DT"/>
         <group name="VENDOR_SITE" source="Q1">
         <element name="VEN_SIT_CODE" value="VEN_SIT_CODE"/>
         <element name="PURCH_SIT_FLG" value="PURCH_SIT_FLG"/>
         <element name="PAY_SIT_FLG" value="PAY_SIT_FLG"/>
         <element name="VENDOR_ADDR1" value="VENDOR_ADDR1"/>
         <element name="VENDOR_ADDR2" value="VENDOR_ADDR2"/>
         <element name="VENDOR_CITY" value="VENDOR_CITY"/>
         <element name="VENDOR_STATE" value="VENDOR_STATE"/>
         <element name="VENDOR_ZIP" value="VENDOR_ZIP"/>
         </group>
    </group>
    <group name="VENDOR_CONTACT" source="Q2">
    <element name="Contact_Phone" value="Contact_Phone"/>
    <element name="cname" value="cname"/>
    </group>
    </dataStructure>
    </dataTemplate>
    Thanks in advance.
    Regards,
    SP

    Hi Rani,
    I see one issue in Q2, with the where clause:
    and pv.segment1=(pv.segment1=:VENDOR_NUM or pv.vendor_name=:VENDOR_NAME)
    The right one should be
    and pv.vendor_id = :VEN_VENDOR_ID
    The variable VEN_VENDOR_ID you have to select in the first query, Q1; add it there right after the SELECT:
    pv.vendor_id VEN_VENDOR_ID,
    Regarding the other question about the data types: number and varchar2 are fine, perhaps char as well, but go with varchar2 instead.
    One last note: you should not mix uppercase and lowercase; keep it clear and structured, and check your element names and values.
    Best Regards
    Volker
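    To spell out the suggestion as a trimmed-down sketch (not the full template): Q1 selects the vendor_id, and Q2 reuses it as a bind variable. Note that for the bind to be filled per vendor, the contact group has to sit inside the vendor group (or the two queries have to be joined with a link element):
    <dataQuery>
      <sqlStatement name="Q1">
        <![CDATA[SELECT pv.vendor_id VEN_VENDOR_ID, pv.segment1 ven_num, pv.vendor_name vname
                 FROM vendors pv
                 WHERE (pv.segment1 = :VENDOR_NUM OR pv.vendor_name = :VENDOR_NAME)]]>
      </sqlStatement>
      <sqlStatement name="Q2">
        <![CDATA[SELECT pvc.area_code||'-'||pvc.phone Contact_Phone
                 FROM vendor_contacts pvc, vendor_sites_all pvs
                 WHERE pvs.vendor_site_id = pvc.vendor_site_id
                 AND pvs.vendor_id = :VEN_VENDOR_ID]]>
      </sqlStatement>
    </dataQuery>
    <dataStructure>
      <group name="VENDOR" source="Q1">
        <element name="VENDOR_NUMBER" value="ven_num"/>
        <element name="VENDOR_NAME" value="vname"/>
        <group name="VENDOR_CONTACT" source="Q2">
          <element name="Contact_Phone" value="Contact_Phone"/>
        </group>
      </group>
    </dataStructure>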

  • How to generate xml using "Data Template"

    Hi,
    Can anyone please tell me the steps to create XML using a data template?
    As per the user guide, the execution method for a data template is "Java Concurrent Program" and the executable mentioned is "XDODTEXE".
    But there is no information about the "execution file name" and "execution path name", which are mandatory.
    Thanks and Regards,
    Sandhya

    Hi Sandhya,
    To put a Data Template into use, you don't need to define any new executables. The XDODTEXE executable is already registered in the system, as it ships with the application. The Data Template is not an executable as such, but a collection of instructions for which queries to perform and what the resulting XML should look like. XDODTEXE knows how to interpret those instructions and produce the XML output.
    (And if you add a layout template, it handles that too.)
    You do need to do the following:
    1) Create a new Data Definition (XML Publisher Administrator > Data Definitions)
    2) Upload your Data Template (remember the Code for the next step)
    3) Create a Concurrent Program that is linked to the Data Definition (Short Name = Code) (System Administrator > Program > Define)
    4) Add your Concurrent Program to a Request Group (System Administrator > Security > Responsibility > Request)
    I recommend you take a look at the XML Publisher Administration and Developer's Guide, you will find more information there.
    (http://download-west.oracle.com/docs/cd/B40089_02/current/acrobat/120xdoig.pdf)
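    For illustration, the file uploaded in step 2 is simply the data template XML itself; a minimal one (invented name, standard DEPT table) looks like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <dataTemplate name="XX_DEPT_DT" description="Minimal data template example" version="1.0">
      <dataQuery>
        <sqlStatement name="Q1">
          <![CDATA[SELECT deptno, dname, loc FROM dept]]>
        </sqlStatement>
      </dataQuery>
      <dataStructure>
        <group name="G_DEPT" source="Q1">
          <element name="DEPTNO" value="deptno"/>
          <element name="DNAME" value="dname"/>
          <element name="LOC" value="loc"/>
        </group>
      </dataStructure>
    </dataTemplate>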
    Best Regards & Happy New Year 2008,
    Matilda Smeds

  • Index: 0, Size: 0 error while creating data template in BI Publisher

    Hi
    When I try to create the data template for BI Publisher reporting, I am getting the Index: 0, Size: 0
    error. I checked the query that I used to create the template and it works fine, but I am not sure why this happens while creating the data template. Can anybody help me out, please?

    How about pasting the content of your data template here, so that forum members can see what the problem could be?

  • PA_CONTRACT_XSLFO: How to invoke a RTF-template with related data template

    Dear Reader,
    Actually, I want to extend the standard Document Type Layout for a Purchase Agreement Contract with additional data from the approved supplier list (ASL).
    Therefore I have created an RTF template and a data template with the needed SQL statement. For testing I put these in a standalone concurrent program and it works fine (the result was a blue table with all data rows).
    The next step for me was to invoke the RTF template from the PA_CONTRACT_XSLFO template, to extend the Document Type Layout for my Purchase Agreement Contract. So I put the needed invoke statements
    <xsl:import href="xdo://XXOC.XX_RTF_TEMPLATE.de.00/"/>
    and
    <xsl:call-template name="XX_RTF_TEMPLATE"/>
    into the XSLFO template. I also extended the RTF template with the template definition statement
    <?template:XX_RTF_TEMPLATE?>
    So all seems to be fine.
    As a result I get the standard document for the Purchase Agreement Contract with the additional blue table from the RTF template, BUT WITHOUT DATA!
    From my point of view, the SQL statement in the data template is not executed, but I don't know why.
    Does Oracle support a combination of an XSLFO template with a data template?
    [XSLFO-template] with related [XSD-data definition]
    calls [RTF-template] with related [data template (with included sql-statement)]
    Thanks for your help.
    Best regards
    Mario.
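    For reference, assembling the statements quoted above, the main XSLFO template and the RTF subtemplate fit together roughly like this (just the post's own fragments put in context, not a verified working layout):
    <!-- PA_CONTRACT_XSLFO (main layout) -->
    <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
      <xsl:import href="xdo://XXOC.XX_RTF_TEMPLATE.de.00/"/>
      <!-- ... standard contract layout ... -->
      <xsl:call-template name="XX_RTF_TEMPLATE"/>
    </xsl:stylesheet>
    <!-- RTF subtemplate: the block to be called is wrapped as -->
    <?template:XX_RTF_TEMPLATE?>
    ... ASL table ...
    <?end template?>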

    "How to call a rtf template from another rtf template by passing a value" - try creating, in the main template, a hyperlink of a URL with parameters for the other template:
    http://bipconsulting.blogspot.ru/2010/02/drill-down-to-detail-or-another-report.html
    "When the user pulls a quote report from Siebel, this new rtf template should attach to the quote at the end" - it will only be another report.
    IMHO you cannot attach it to the main one; it will be a second, independent report.
    You can try a subtemplate, but that is not about calling one RTF from another by click;
    it is about automatically calling an RTF subtemplate from the main RTF based on some conditions.
    For example, the main template contains some data, and if some condition is true, then the subtemplate is called and placed where the condition is.

  • Any examples of a data template using multiple data sources?

    I'm looking for an example report using multiple data sources. I've seen one where they do a master/detail, but I'm just looking to combine results in sorted order (sorted across all data sources). The master/detail example used a bind variable to link the two defined queries; I'm thinking what I want won't have that, so I'm lost on how to make it happen. I have reports using multiple SQL queries, and there is a way in the data source pulldown to tell it to combine the data sources. It appears to be a more manual process with data templates, if it's even possible.
    Any pointers/links would be appreciated.
    Gaff

    Hi Vetsrini:
    That's just it. Mine is simpler than that. There is no master/detail relationship between the two queries. I have the exact same query that I run in two databases, and I want to merge the results (ordered by, say, eventTime) in one report. So I think my results are going to be two separate groups (one for each data source), which I'll have to let BI merge via XSLT or whatever it uses. That's fine for small result sets, but for larger ones it would be nice if the database did the sorting/merging.
    Gaff

  • Report Based on XML Data Template is timing out

    Hi there,
    I have an XML Publisher report based on an XML data template that I've written. The report works fine, except that when a huge amount of data is supposed to come out, it times out. I've set the db_fetch_size parameter to 20 and scalable_mode to on, but it is still timing out.
    If anyone has a solution or a thought it would be much appreciated.
    Thanks.

    Thanks for the reply.
    Well, when I open the report, I wait for about 15 seconds and then 'Page Cannot be Displayed' appears in my browser. I'm having this case only when I'm expecting huge data. By huge data I mean that the report gets information about a deal; when the deal has 10 categories the report runs fine, but when I add all categories, about 1000, it gives me the 'Page cannot be displayed' error. I'm guessing it's a timeout issue, since I've set the scalable_mode parameter to on. Any thoughts?
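    For reference, those two settings go in the properties section at the top of the data template; the values below are only an illustration and may need tuning (a larger db_fetch_size usually helps with big result sets):
    <dataTemplate name="MY_REPORT" version="1.0">
      <properties>
        <!-- write intermediate data to temporary files instead of keeping it all in memory -->
        <property name="scalable_mode" value="on"/>
        <!-- number of rows fetched from the database per round trip -->
        <property name="db_fetch_size" value="300"/>
      </properties>
      <!-- dataQuery and dataStructure as in the existing template -->
    </dataTemplate>
    That said, a 'Page Cannot be Displayed' after only 15 seconds sounds like the web tier giving up rather than the data engine, so it may also be worth running the report through the scheduler instead of online.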

  • I have a question about Data Rates.

    Hello All.
    This is a bit of a noob question I'm sure. I don't think I really understand Data Rates and how it applies to Motion... therefore I'm not even sure what kind of questions to ask. I've been reading up online and thought I would ask some questions here. Thanks to all in advance.
    I've never really worried about Data Rates until now. I am creating an Apple Motion piece with about 15 different video clips in it. And 1/2 of them have alpha channels.
    What exactly is Data Rate? Is it the rate at which video clip data is read (in bits/second) from the disc and placed onto my screen? In Motion, is the Data Rate for video only? What if the clip has audio? If an HDD is simply a plastic disc with a dye read by one laser... how come my computer can pull two files off the disc at the same time? Is that what data transfer is all about? Is that where RAM comes into play?
    I have crunched my clips as much as I can. They are short clips (10-15seconds each). I've compressed them with the Animation codec to preserve the Alpha channel and sized them proportionally smaller (320x240). This dropped their data rate significantly. I've also taken out any audio that was associated with them.
    Is data rate what is slowing my system down?
    The data rates are all under 2MBs. Some are as low as 230Kbs. They were MUCH higher. However, my animation still plays VERY slowly.
    I'm running a 3GigRam Powerbook Pro 2.33GHz.
    I store all my media on a 1TB GRaid Firewire 800 drive. However for portability I'm using a USB 2 smartdisk external drive. I think the speed is 5200rpm.
    I'm guessing this all plays into the speed at which motion can function.
    If I total my data rate transfer I get somewhere in the vicinity of 11MB/second. Is that what Motion needs for it to play smoothly, an 11MB/second data connection? USB 2.0 is what, 480Mb/second? So there is no way it's going to play quickly. What if I played it from my hard drive? What is the data rate of my internal HDD?
    I guess my overall question is.
    #1. Is my thinking correct on all of these topics? Do my bits, bytes and megs make sense. Is my thought process correct?
    #2. Barring getting a new machine or buying new hardware. What can I do to speed up this workflow? Working with 15 different video clips is bogging Motion down and becoming frustrating to work with. Even if only 3-4 of the clips are up at a time it bogs things down. Especially if I throw on a glow effect or something.
    Any help is greatly appreciated.
    -Fraky

    Data rate DOES make a difference, but I'd say your real problem has more to do with the fact that you're working on a PowerBook. Motion's real-time capabilities derive from the capability of the video card, not the processor. Some cards do better than others, but laptops are not even recommended for running Motion.
    Ways to improve your workflow on a laptop are limited, but there are a few things that you can try.
    Make sure that thumbnails and previews are turned off.
    Make sure that you are operating in Draft Mode.
    Lower the display resolution to half, or quarter.
    Don't expect to be getting real time playback. Treat it more like After Effects.
    Compressing your clips into smaller files does help because it lowers the data rate, but you're still dealing with the Animation codec, which is a high-data-rate codec. Unfortunately, it sounds necessary in your case because you're dealing with alpha channels.
    The data rate comes into play with your setup trying to play through your USB drive. USB drives are never recommended for editing or Motion work; their throughput is not consistent enough for video work. A small FW drive would be better, though your real problem, as I said, is the PowerBook.
    If you must work on the powerbook, then don't expect real-time playback. Instead, build your animation, step through it, and do RAM previews to view sections in real time.
    I hope this helps.
    Andy

  • IF THEN ELSE in DATA TEMPLATE NOT IN RTF

    Hi,
    Can anyone tell me how to implement IF THEN ELSE, DECODE or CASE logic in a data template, not in the RTF,
    as I have seen loads of posts about implementing them in the RTF but not in the template itself.
    I have a requirement; please click the link below for the detailed version of this post. Otherwise, please just give me a snippet from which I can implement the IF THEN ELSE logic in the data template (the .xml file), but please not via an RTF and not using a PL/SQL package:
    Logic to select Report name based on User's input.
    Thanks
    vasanthanand

    Hello
    I am looking to do the same type of thing. Did you ever get an answer without having to use an RTF, perhaps in another forum or through your own investigation?
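    One way that works purely inside the data template is to push the IF THEN ELSE into the SQL itself with CASE (or DECODE), for example (a sketch with invented names):
    <dataQuery>
      <sqlStatement name="Q1">
        <![CDATA[SELECT order_num,
                        CASE WHEN :P_REPORT_TYPE = 'SUMMARY' THEN 'Summary Report'
                             ELSE 'Detail Report'
                        END report_name
                 FROM xx_orders
                 WHERE order_num = NVL(:P_ORDER_NUM, order_num)]]>
      </sqlStatement>
    </dataQuery>
    The chosen value then comes through as a normal element in the XML, so the layout only has to print it.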

  • Regarding Data Template Issue

    Hi All,
    I am using a Data Template to publish output.
    Based on Data Template.xdo:
    <?xml version = '1.0' encoding = 'utf-8'?>
    <report version="1.1" xmlns="http://xmlns.oracle.com/oxp/xmlp" defaultDataSourceRef="CNE1">
    <title>Based on Data Template</title>
    <properties>
    <property name="showControls" value="true"/>
    <property name="online" value="true"/>
    <property name="parameterColumns" value="3"/>
    <property name="openLinkInNewWindow" value="true"/>
    <property name="autoRun" value="true"/>
    </properties>
    <dataModel defaultDataSet="New DataSet 2">
    <dataSet id="New DataSet 2">
    <dataTemplate name="MyDT" dataSourceRef="CNE1">
                        <dataQuery>
                             <sqlStatement name="Q_Emp">
                                  <![CDATA[ SELECT * FROM EMP]]>
                             </sqlStatement>
                        </dataQuery>
                   </dataTemplate>
    </dataSet>
    </dataModel>
    <valueSets/>
    <parameters/>
    <templates/>
    <burst enabled="false"/>
    </report>
    I have created the template and registered it in EBS. I have also created a Concurrent Program with XDODTEXE as the executable.
    I am getting the following error :
    Custom APPS: Version : UNKNOWN
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    XX_TEST_DP module: XX: Test Data template -1
    Current system time is 12-SEP-2007 00:26:25
    XDO Data Engine Version No: 5.6.3
    Resp: 20420
    Org ID : 83
    Request ID: 7440294
    All Parameters:
    Data Template Code: XX_TEST_DP
    Data Template Application Short Name: CUSTOM
    Debug Flag: N
    Calling XDO Data Engine...
    [091207_122635145][][EXCEPTION] java.lang.NullPointerException
         at oracle.apps.xdo.dataengine.DataTemplateParser.GetNodeNumChildren(DataTemplateParser.java:345)
         at oracle.apps.xdo.dataengine.DataTemplateParser.templateParser(DataTemplateParser.java:277)
         at oracle.apps.xdo.dataengine.XMLPGEN.setDataTemplate(XMLPGEN.java:599)
         at oracle.apps.xdo.dataengine.DataProcessor.setDataTemplate(DataProcessor.java:193)
         at oracle.apps.xdo.oa.util.DataTemplate.<init>(DataTemplate.java:136)
         at oracle.apps.xdo.oa.cp.JCP4XDODataEngine.runProgram(JCP4XDODataEngine.java:282)
         at oracle.apps.fnd.cp.request.Run.main(Run.java:161)
    java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
         at com.sun.java.util.collections.ArrayList.RangeCheck(ArrayList.java:492)
         at com.sun.java.util.collections.ArrayList.get(ArrayList.java:306)
         at oracle.apps.xdo.dataengine.DataTemplateParser.getParentDataSource(DataTemplateParser.java:1737)
         at oracle.apps.xdo.dataengine.XMLPGEN.writeDefaultGroup(XMLPGEN.java:320)
         at oracle.apps.xdo.dataengine.XMLPGEN.writeGroupStructure(XMLPGEN.java:279)
         at oracle.apps.xdo.dataengine.XMLPGEN.processData(XMLPGEN.java:266)
         at oracle.apps.xdo.dataengine.XMLPGEN.processXML(XMLPGEN.java:205)
         at oracle.apps.xdo.dataengine.XMLPGEN.writeXML(XMLPGEN.java:237)
         at oracle.apps.xdo.dataengine.DataProcessor.processData(DataProcessor.java:364)
         at oracle.apps.xdo.oa.util.DataTemplate.processData(DataTemplate.java:236)
         at oracle.apps.xdo.oa.cp.JCP4XDODataEngine.runProgram(JCP4XDODataEngine.java:293)
         at oracle.apps.fnd.cp.request.Run.main(Run.java:161)
    Start of log messages from FND_FILE
    End of log messages from FND_FILE
    Executing request completion options...
    ------------- 1) PUBLISH -------------
    Beginning post-processing of request 7440294 on node SJCORCL05 at 12-SEP-2007 00:26:37.
    Post-processing of request 7440294 failed at 12-SEP-2007 00:26:44 with the error message:
    One or more post-processing actions failed. Consult the OPP service log for details.
    Finished executing request completion options.
    Concurrent request completed
    Current system time is 12-SEP-2007 00:26:44
    Please help.
    Thanks
    Bharat B Nayak
    [email protected]
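    For comparison, a standalone data template uploaded as a Data Definition for XDODTEXE normally carries its own dataTemplate root and a dataStructure section, along the lines of the sketch below (a generic example, not a verified fix for the NullPointerException above):
    <?xml version="1.0" encoding="UTF-8"?>
    <dataTemplate name="XX_TEST_DP" description="Employee listing" version="1.0">
      <dataQuery>
        <sqlStatement name="Q_Emp">
          <![CDATA[SELECT empno, ename, sal FROM emp]]>
        </sqlStatement>
      </dataQuery>
      <dataStructure>
        <group name="G_EMP" source="Q_Emp">
          <element name="EMPNO" value="empno"/>
          <element name="ENAME" value="ename"/>
          <element name="SAL" value="sal"/>
        </group>
      </dataStructure>
    </dataTemplate>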

    Go to DSO -> Manage.
    On the Contents tab, you get the option for selective deletion (the last button, "Selective").
    Enter your selections under Delete Selection and start the job.
    This will delete the selected data from the active table only.
    Please note that the data from the change log will not be deleted; no selective deletion is possible on the change log. However, you can delete the older change logs if they are not necessary: go to Environment -> Delete Change Log Data and specify the date or days.
