Application Migration from HP-UX to Sun Solaris

Hi Gurus,
Need help with an application migration from HP-UX to Sun Solaris. I copied the file systems APPL_TOP, OA_HTML, OA_JAVA, COMMON_TOP/util and COMMON_TOP/_pages from the SOURCE (HP) to the TARGET (Sun Solaris) as per note 238276.1.
Now I am trying to clone the AutoConfig XML context file on the TARGET system:
perl adclonectx.pl migrate java=\opt\java1.4 \
contextfile=\ul01\app\applmgr\lasrx\admin\LASRX_fiapd4.xml
Error: source context file ul01appapplmgrlasrxadminLASRX_fiapd4.xml does not exist.
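The stripped path separators in the error above point at the unquoted backslashes: the shell consumes them as escape characters before adclonectx.pl ever sees the arguments. A quick demonstration, using the same path as the command above:

```shell
# The shell treats an unquoted backslash as an escape for the next
# character, so the separators are stripped from the argument:
printf '%s\n' contextfile=\ul01\app\applmgr\lasrx\admin\LASRX_fiapd4.xml
# prints: contextfile=ul01appapplmgrlasrxadminLASRX_fiapd4.xml
# On Unix the separators are forward slashes, so no escaping is needed:
printf '%s\n' contextfile=/ul01/app/applmgr/lasrx/admin/LASRX_fiapd4.xml
# prints: contextfile=/ul01/app/applmgr/lasrx/admin/LASRX_fiapd4.xml
```

Rewriting the java= and contextfile= arguments with forward slashes (for example java=/opt/java1.4) should therefore let adclonectx.pl find the context file.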
Questions:
======
1. Do we still need to install a JDK on the TARGET, or does copying the file systems from the SOURCE (HP) to the TARGET (Solaris) bring the JDK along? And how do I check the JDK on Solaris?
2. In the clone AutoConfig XML context step, is contextfile= the location on the SOURCE or on the TARGET?
3. I think the above should be run with root permissions; is that right?
Can somebody please help with this issue?
Thank You.

Hi Hussein,
I need your help with the application migration from HP-UX to Sun Solaris. As per note 238276.1 we do not copy $806_ORACLE_HOME for a cross-platform migration. I copied all the file systems from the SOURCE to the TARGET but renamed that one, because it is supposed to be created while installing the middle tier.
Issue 1:
=====
When I try to set the environment as shown below, I get an error:
. /ul01/app/applmgr/lasrx/APPSORA.env
ksh: /ul02/app/oracle/lasrx/8.0.6_new/LASRX_fiapd4.env: not found
I am trying to install the Middle Tier Technology Stack on the Sun server (migrating from HP-UX to Sun Solaris).
Issue 2: (display error while installing the Tech Stack)
=====
./rapidwiz -techstack
Rapid Install Wizard is validating your file system......
>> Wizard requires the DISPLAY variable to be set.
>> Please set your DISPLAY variable and restart Rapid Install Wizard.
I cannot even see anything from
echo $DISPLAY, because the environment was never set.
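Once the environment problem is sorted out, DISPLAY just needs to be exported before rerunning the wizard. A minimal sketch, where the host and display number are placeholders for your own X server:

```shell
# Point DISPLAY at an X server you can actually reach;
# "myworkstation:0.0" is a placeholder -- substitute your own display.
DISPLAY=myworkstation:0.0
export DISPLAY
echo "$DISPLAY"   # the wizard only checks that this is set
```

After that, ./rapidwiz -techstack should get past the DISPLAY check (the wizard still needs to be able to open a window on that display).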
Need your help and suggestions on this.
Thank You.

Similar Messages

  • Cross platform migration from hp unix to sun

    Hi
recently we completed a cross-platform migration from HP-UX to Sun (10g R2 RAC + ASM). It was a challenging experience to complete the migration in 8 hours of downtime for a database of about 2 TB.
I would like to share the experience so that it may be helpful to others, and so that I can learn new techniques for improvement.
Source environment:
HP-UX - Itanium - 2-node RAC - 10g R2 - Oracle Clusterware - ASM - heavily loaded all the time.
Destination environment:
Sun Solaris 10 - 2-node RAC - 10g R2 - Oracle Clusterware - ASM
Total downtime available: 6 hours.
Steps followed:
1. Snapshot backup of the source database onto an intermediate HP-UX box (stage area).
2. Recover the database and start the RMAN image backup to disk in the stage area.
3. Mount this filesystem locally (Veritas) on one of the nodes in the destination environment.
4. Mount this filesystem as NFS on the second node using a cross cable between the two (the interconnect VLAN can also be used here).
5. Keep applying the archive logs from the source environment until the downtime window.
6. Startup nomount the database, change the datafile names from ASM to the RMAN image copy files, and mount.
7. Start the RMAN convert statements and pump into the already existing ASM nodes with as many parallel sessions as possible (with 35 sets (513 datafiles) we were able to pump 2 TB of data in 4.5 hours). Only the SYSTEM and UNDO datafiles need convert statements (platform migration); the others only need a filesystem-to-ASM conversion.
8. Alter database open resetlogs in the destination environment.
9. Create the services manually and start the application.
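The per-datafile RMAN convert step above lends itself to scripting. A dry-run sketch that only generates the statements; the stage path, '+DATA' disk group, file list and platform string are illustrative, not from the original post:

```shell
# Emit one RMAN CONVERT statement per image copy. Only SYSTEM and UNDO
# need CONVERT for the platform change; the names below are placeholders.
for df in system01.dbf undotbs01.dbf; do
  echo "CONVERT DATAFILE '/stage/$df' FROM PLATFORM 'HP-UX IA (64-bit)' FORMAT '+DATA/$df';"
done
```

The generated statements can then be split across several RMAN sessions to get the parallelism described above (the valid platform strings are listed in V$TRANSPORTABLE_PLATFORM).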
We were given very little time to decide upon this strategy; by God's grace we were able to complete it.
A doubt came up while we were chalking out this plan:
can we connect to an ASM instance remotely and pump data from one ASM to another ASM? It seems it is not possible, just a wild thought.
Let me know if you see any more improvements in this.
    Thanks
    Naveen samala

can we connect to ASM instance remotely and pump data from one ASM to another ASM. It seems it is not possible, just a wild thought.
No, that's not possible.

  • Migration from iplanet webserver to Sun Directory Server

    Hi,
I have Oracle iPlanet Web Server Enterprise Edition V6.0 SP2 in my dev environment. I want to migrate the system to Sun Java System Directory Server V6.0. I have looked through the migration guide for Sun DS V6.0, but I could not find any reference to iPlanet Web Servers.
Can anybody please let me know the procedure for migrating from the iPlanet server to Sun Directory Server?
Any help would be appreciated
    Thank you
    Nowfal

Please ignore this question, since we have dropped the plan to migrate and will instead set up a new DS instance from scratch

  • Testing of SAP BW system after migration from OS Unix to windows

    Hi Experts,
I am working on a project in which testing of a BW system is required after migration from Unix to Windows.
If anyone has done this same set of activities in the past or has any ideas about it, please share all the major test steps required after migration.
    Thanks in advance.
    Regards,
    Neeraj

    Hi Neeraj,
I will paste the list we used for our upgrade from 3.1 to 7.0. I removed some company-specific details. Sorry for the layout; it was copied from Excel, which messes up the whole message.
    best regards, André.
    Preparation     PHASE 1
    Review New Functionality     Review BW section on SAPNet for latest information on new release
    Confirm Upgrade Timing (First Sat)    
    GUI Upgrade     Install GUI's for Upgrade Test persons
    "Check Prerequisites     -Operating System
    -SAP Kernel
    -DBMS
    -Disk space
    -R/3 plugin
    -BW software version and SP
    Citrix.."
    "Check Compatibility Reqs with other systems     APO
    ECC6.0"
    Check Compatibility Reqs with 3rd party software     ?
"Test Team 2004s Delta Training     To
have an idea of the new functionality
and not be confused if things look different"
    "End User Communication: upgraded system and changes     e.g. right vs left click
    - send out communication twice"
    End User Communication: several "look and feel" sessions  NOTE: incorporate WISBECH !!!    
    Setup Portal connected to BW QA for testing once upgraded     Setup Portal connected to BW DEV for testing once upgraded
Check best go-live date - based on assumption 2 days needed
    Determine test set  - based on input BS / KU    
    Estimate resources Business for testing    
    BluePrint     PHASE 2
    Check/CleanUp Development Objects     Check all open and not-released developments for release prior to upgrade
    Prepare System Setup     Prepare (if needed) flat-files for test loads
Check OSS for release notes on objects     Each functional area should check OSS for release notes on objects (for changes etc., new functionality)
    "Prepare (detail) list of objects to be tested     Complete list of objects to be tested, special attention to custom/non-standard programs, enhancements. List of:
    - extractors
    - process chains
    - reports"
"Setup delta mechanisms (create transaction data)     BackEnd Test: Financial Extraction and load, process chains
• tbd
•
• "
    Verification / creation of process chains     Check existence of to-be-tested process chains; if none, create 'sample' process chains
    Realization     PHASE 3
    Prior to Upgrade: Install any necessary frontend software    
    Prior to Upgrade: Install any necessary frontend software (end-users)    
    Prior to Upgrade: Complete any extraction - suspend V3 and scheduled BW jobs    
    Create Sandbox    
    Connect Sandbox to external systems (connections/BDLS)    
    Pre-upgrade steps    
    Perform Upgrade of BW     
    Check connection with R/3/APO    
    Perform system backup    
"BackEnd Test: Logistics extraction and load, process chains
• Z_example_CHAIN
• 2lis_03_BF and 2lis_03_BX     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
"BackEnd Test: Sales Extraction and load, process chains
• Z_SALES_1
• all lis extractors:
• 12_vcitm
• 13_vditm
• 11_vaitm
• 11_vahdr
• 11_v_ssl
• 11_vasti
• 12_vcscl
• selection of z-extractors
     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
"BackEnd Test: Financial Extraction and load, process chains
• 0FI_GL_4
• NOTE: no further extractors are chosen for testing
• tbd     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
"BackEnd Test: Master Data... Extraction and load, process chains     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
"BackEnd Test: Zxxx Extraction and load, process chains
• Z_TRANSACTION_DATA
• Zxxx
• Zyyy     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
"BackEnd Test: Plant Maintenance Extraction and load, process chains     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
BackEnd Test: PP Extraction and load, process chains     Test: load from Zxxx
"BackEnd Test: Purchasing Extraction and load, process chains     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
"BackEnd Test: APO Extraction and load, process chains
• ZAPOxxx
• ZAPOyyy
• others can be tested, but were not included in the test set     Test:
- Extractor within process chain preferably, otherwise mimic behavior (drop index, load, rebuild, aggregate, etc.)
- Delta extraction"
    BackEnd Test: Regular Schedule - Production    
    BackEnd Test: Flat file upload     Test: 2 types from server and PC
BackEnd Test: Test Admin Workbench: creation of objects, change existing     Test the Admin Workbench: creation of infoobjects, change infoobject, create/change hierarchy, create cube, …
    BackEnd Test: Test Bex: creation of queries (kf etc...); change existing     Creation of new query/ change query, create condition, calculated/restricted kf, variables,..
    BackEnd Test: Issue handling    
    Development box issue fixing - on forehand     apply fixes based on issues BWQ
FrontEnd Test: Run Logistics Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run Sales Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run Supply Chain Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run Financial Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run Controlling Reports     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run Master Data and Special Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run PM Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run Purchasing Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
FrontEnd Test: Run APO Reports (log results in Reports tab)     Run reports, execute various navigation steps, make sure to execute each item of the menu paths, validate no loss of functionality, report new functionality. Run a few managed bookmarks
    FrontEnd Test: Issue handling    
FrontEnd Test: Make before image of reports - Production     Run reports; save a 'before' image to compare once upgraded
FrontEnd Test: Run All Reports - Production     Run reports, execute various navigation steps, compare with the 'before' image taken in the previous step
    Test Security     Test security with various end-user roles/IDS other than BW developer (usually has * ALL)
    Test Portal integration     Test portal with various end-user roles/IDS other than BW developer (usually has * ALL)
Edited by: A. Nagelhout on Feb 12, 2010 10:50 AM
Edited by: A. Nagelhout on Feb 12, 2010 10:52 AM

  • How can I import eex files into Applications EUL from the unix command line

    How can I import *.eex and *.dis files into an Applications EUL from the unix command line?
    Thanks

    Hi
The simple answer is that you either have to use the client tool DIS51ADM to import files from the command line (Discoverer Admin is a Windows-only client tool), or the Java command line, which needs a browser.
In theory, if you have a browser running on your Unix box, you may be able to use the Java command line to make this work.
    Best wishes
    Michael

  • How to invoke VI function in LabView running on Windows from a unix process in Solaris

    Labview is only installed on Windows and Process on Solaris is a standard unix process without any Labview installed.

I'm not sure this is possible. Really, the only way to call a VI is from VI Server or ActiveX. Since you don't have LabVIEW on the other machine, that won't work, and you can't use DCOM from a non-Windows machine. There might be a third-party application like pcAnywhere that will work between environments, but I doubt it. Your best chance might be to rig it so that LabVIEW works as a TCP/IP server and responds to commands by using the TCP VIs. That should be easy enough to use from a UNIX system.

  • Migration from Siteminder DMS to Sun Access Manager

    Hi
    We are working on a project that involves migration of Siteminder and SiteMinder-DMS to Sun Access Manager.
    My concerns are
1. Do I need any changes to the Directory Tree of the LDAP?
2. How do I migrate the policies?
3. Does Sun have exact equivalents (the same coarse-grained APIs) as SiteMinder-DMS?
4. I have heard of a tool that can do the migration from SiteMinder to Sun Access Manager. How good is the tool, what is its scope, and what are its limitations?
    Thnx
    siva

I am currently reviewing migrating from SiteMinder to Sun Access Manager. I have the same issues as you have had. I would greatly appreciate any feedback on any of these issues. My email address is [email protected] if you prefer to email me directly.

  • Migrating from 8.1.7 (on Solaris) to 10.2.0 (to Linux/Windows)

    Hi,
Our company is considering migrating our database from 8.1.7 (on Solaris) to 10.2.0 (on either Linux or Windows). Currently, I'm testing migrating our database from 8.1.7.0.0 (on Solaris) to 10.2.0 (on Windows 2000 Advanced Server). I checked the 8i documentation and found I cannot take advantage of transportable tablespaces due to the different platforms. I also tried to export and import the schemas, but that failed due to complicated referential integrity constraints among those schemas. Is there any other way I can try in order to migrate the database to 10.2.0 on a Windows platform successfully?
    Any input will be appreciated!

Adding the parameter CONSTRAINTS=N while exporting could be one solution.
    Amir
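The constraints parameter suggested above can go on the exp command line or in a parameter file; exp expects a Y/N value here. A sketch, in which the dump file, log and schema owner names are placeholders:

```
# export.par -- illustrative parameter file for exp; names are placeholders
file=appschema.dmp
log=exp_appschema.log
owner=APPUSER
rows=y
constraints=n
```

It would then be invoked as exp with parfile=export.par (and your own credentials), with the constraints created separately on the target after the import.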

  • Migration from physical server to zone (solaris 10)

    Hello all,
I found an old thread about this subject (Migrate Solaris 10 Physical Server to Solaris 10 Zone), but I have a question.
When using the flarcreate command, will it add all the zpools I have in the server to the flar archive? Right now we have 14 zpools.
If I execute this command, will it take all the zpools?
flarcreate -n "Migration File" -S -R / -x /flash /flash/Migration_File-`date '+%m-%d-%y'`.flar
This is for migrating from an E25k server to an M9k server.
The E25k (physical server) has release "Solaris 10 10/08 s10s_u6wos_07b SPARC" and the zone server (M9k) has release "Oracle Solaris 10 8/11 s10s_u10wos_17b SPARC". Could this be an issue?
    Thanks for any help.
    Edited by: 875571 on Dec 9, 2011 7:38 AM

    flarcreate will only include data from the root pool (typically called rpool). The odds are that this is what you actually want.
    Presumably, on a 25k you would have one pool for storing the OS and perhaps home directories, etc. This is probably from some sort of a disk tray. The other pools are likely SAN-attached and are probably quite large (terabytes, perhaps). It is quite likely that instead of creating a multi-terabyte archive, you would instead want an archive of the root pool (10's to 100's of megabytes) and would use SAN features to make the other pools visible to the M9000.
    One thing that you need to do that probably isn't obvious from the documentation is that you will need to add dataset resources to the solaris10 zone's configuration to make the other zpools visible in the solaris10 zone. Assuming that these other pools are on a SAN, the zpools are no longer imported on the 25k, and the SAN is configured to allow the M9000 to access the LUNs, you will do something like the following for each zpool:
# zpool import -N -o zoned=on poolname
# zonecfg -z zonename "add dataset; set name=poolname; end"
In the event that you really do need to copy all of the zpools to the M9000, you can do that as well. However, I would recommend doing that using a procedure like the one described at http://docs.oracle.com/cd/E23824_01/html/821-1448/gbchx.html#gbinz. (zfs send and zfs recv can be used to send incremental streams as well. Thus, you could do the majority of the transfer ahead of time, then do an incremental transfer when you are in your cut-over window.)
    If you are going the zfs send | zfs recv route and you want to consolidate the zpools into a single zpool, you can do so, then use dataset aliasing to make the zone still see the data that exists in multiple pools as though nothing changed. See http://docs.oracle.com/cd/E23824_01/html/821-1448/gayov.html#gbbst and http://docs.oracle.com/cd/E23824_01/html/821-1460/z.admin.task-11.html#z.admin.task-12.
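The two-phase send/receive described above can be sketched as a dry run that merely prints the pipeline; the pool, snapshot and host names are placeholders, not from the original reply:

```shell
# Dry-run sketch of a bulk copy followed by an incremental cut-over;
# swap the echo in run() for real execution once the names are yours.
run() { echo "$@"; }
run "zfs snapshot -r tank@bulk"
run "zfs send -R tank@bulk | ssh m9000 zfs recv -F tank"
# During the cut-over window, ship only the delta since @bulk:
run "zfs snapshot -r tank@cutover"
run "zfs send -R -i tank@bulk tank@cutover | ssh m9000 zfs recv tank"
```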

  • Migrating from UCD snmp to SUN snmp

Hi, can anyone tell me if there is an easy way to migrate from the UCD SNMP we currently use to Sun SNMP? Basically, we currently have a config file, snmpd.conf, that we use for monitoring. We'd like to migrate to Sun SNMP as seamlessly as possible. Any help is appreciated. Thank you.

    Hi Jeff,
This is definitely an OS/DB migration, since the binary codes are different between SPARC and AMD CPUs. You will need a certified migration specialist to perform it; that is required by SAP for support reasons.
I've done a few migrations. It is a complex process, and its duration depends on your dictionary and the size of the cluster tables.
    Good luck,
    Savo

  • Moving Oracle 9i database from AIX server to Sun Solaris 9 server

I want to move my Oracle 9i database residing on an AIX server to a Sun Solaris 9 server.
The database residing on the AIX server has the following features:
1. It is a production database.
2. Running in NOARCHIVELOG mode.
3. It was live for a few weeks, hence transactions are stored in it.
4. It has one schema, with around 700 tables, 129 views, etc.
5. A large downtime window is acceptable.
What is the safest way to do this?
    Yachendra

    Hi Yachendra,
    1. It is a production database.
2. Running in NOARCHIVELOG mode.
By the way, do you know that PRODUCTION databases are always kept in ARCHIVELOG mode? It is highly advisable.
3. It was live for a few weeks, hence transactions are stored in it.
If you are really serious about your data, don't want to lose anything, and want to perform a CROSS-PLATFORM migration, you must make sure that you have all your archive logs.
my oracle 9i database residing on AIX server
You have already got the solution for that: Maran has told you about EXPORT/IMPORT, and Burleson has given you enough documents to consult.
    Thanks
    Shivank

  • MIgration from physical servers to Oracle Solaris Containers

    We are in process of migrating our Oracle Databases from a physical SUN SPARC servers to Oracle Solaris Container.
Do we need any extra settings on the Oracle side or the server side in a virtualized environment to run Oracle databases?
    Any comment will be of help.
    Thanks
    Abdul

Oracle databases work fine in zones.
Some planning is required to find out whether you need dedicated CPUs or other resource management options.
Use projects to set the appropriate memory settings.
Depending on the versions of the databases, the installer may choke and complain about missing entries in /etc/system (like shminfo_shmmax), but those can be safely ignored.
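For the projects advice above, the shared-memory setting typically ends up as an /etc/project entry. A sketch, in which the project name, project id, user and the 8 GB cap are all placeholders:

```
# /etc/project entry (illustrative): cap System V shared memory for the
# oracle user; the project name, id and 8 GB value are placeholders.
user.oracle:100::oracle::project.max-shm-memory=(privileged,8589934592,deny)
```

The same attribute can also be set on a live system with projmod -sK, and verified from a session in the project with prctl.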

  • Deploying ear application - migrating from OC4j to Weblogic

I am trying to migrate an application from OC4J to WebLogic Server. Apart from the usual differences in descriptors, the structure of the EAR, etc., I came across an unusual error:
"both the remote home and remote component interface must be specified. Currently, only one of them is specified"
It is about deploying a session bean: a usual EJB3 stateless session bean with local and remote interfaces, deployed without any problem on OC4J and JBoss. Of course, I know home interfaces are something to do with EJB 2.1, but this is an EJB3 application and, I guess, WebLogic has full support for it. Does it? :)
Anyway, is there a workaround? Or am I missing something? I don't have to tell you what problems we will face if I have to restructure the whole application with dozens or hundreds of session beans.
    Thanks a lot!

    OK. I just want to make sure that there are no problems with the deployment descriptors.
    To pin point the problem do the following in JDeveloper:
    1. Create a simple EJB 3.0 session bean
    2. On that session bean create a Sample Java Client.
    3. Run the Session Bean
    4. Run the Client
    5. If both are running fine, compare the deployment descriptors from the sample and your real application.
--olaf

  • Planning classical application Migration from version 9.3.1 to 11.1.1.3.

    Hi,
    Current Version
    Hyperion Version 11.1.1.3
    Database: SQL Server 2005
    Windows Server 2003 R2
    Old Version
    Hyperion Version 9.3.1
    Database: SQL Server 2000
    Windows Server 2000 R2
1. I have created a classical Planning application (CPLAN) in 11.1.1.3 which is similar to the 9.3.1 application in terms of application name, database names and dimensions.
2. I took a backup of the CPLAN of 11 and the CPLAN of 9.3.1.
3. Stopped Planning, Essbase, RMI Registry and Shared Services.
4. Restored the 9.3.1 CPLAN backup (from SQL Server 2000) into the 11.1.1.3 CPLAN (SQL Server 2005).
5. HSP_OBJECT table: the object id column and object name column relate to the password; here I tried with both the old password (hypadmin) and the new password (admin).
6. HSP_USERS table: USER_ID = 50001 and SID = (I entered the version 11 details).
7. HSP_SYSTEMCFG table: I tried changing the VERSION from 9 to 11 and also clearing the version (tried both ways),
and changed EIE_SERVER: this is the Shared Services path, with the port number and server name changed.
I mean http://mycurrent server:28080/interop.
8. After the columns in these 3 tables above were changed, I started all the services, Planning etc.
9. Tried to open Planning web. I got the application name CPLAN, but once I enter the username, password and CPLAN, it gives an error: "error while processing this current page, check the log for details".
10. I checked the logs, Access.log and Hyperion Planning.log; in these two logs I did not find any error.
Please help me on this, anyone who has done this kind of migration. ASAP.
    Thanks
    Chinna
    Edited by: Hyperion Chinna on Aug 12, 2010 4:45 PM

    Make sure you follow the steps here :- Migration of Hyperion Planning Application
    Do not start changing anything in HSP_SYSTEMCFG
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Application migration from 10.1.2 to 10.1.3.4

    Hi,
In migrating our application from JDeveloper 10.1.2 to 10.1.3.4, we are experiencing an issue on one of the pages. The page contains First, Prev, Next, Last and All pagination buttons, where the default range size is 10. If the user clicks the "All" button, then the range size should be changed to 500. Unfortunately, the range size remains 10 once the "All" button is clicked. The application is currently using Struts, and the struts-config.xml file was migrated to 10.1.3.4. I have been unable to find anything in the forums or through Google, so any help that you could give me would be greatly appreciated.
    Regards,
    TJ
ShowTransactionsAction.java
public class ShowTransactionsAction extends DataAction {
    private static int DefaultRangeSize = 10;
    private static int MaxRangeSize = 500;
    private static String RowSetIterator = "MasterTransactionView1Iterator";

    public String getRowSetIterator() {
        return RowSetIterator;
    }

    /** Supports the All button of the Super Search by setting a larger range size. */
    public void onEventAll(DataActionContext ctx) {
        DCIteratorBinding iter = (DCIteratorBinding)ctx.getBindingContainer().findIteratorBinding(getRowSetIterator());
        RowSetIterator rsi = iter.getRowSetIterator();
        int currentRangeSize = iter.getRangeSize();
        // Set a max range size to facilitate ALL. We do not want -1 here, as the max rows could be quite large.
        iter.setRangeSize(MaxRangeSize);
    }
}
    showTransactions.jsp
    <c:set var="previousDisabled" value="${bindings.PreviousSet.enabledString}"/>
    <c:set var="nextDisabled" value="${bindings.NextSet.enabledString}"/>
    <c:set var="rangeBinding" value="${bindings.MasterTransactionView1}"/>
    <c:set var="totalRows" value="${rangeBinding.estimatedRowCount}"/>
    <c:set var="firstRowShown" value="${rangeBinding.rangeStart + 1}"/>
    <c:choose>
    <c:when test="${rangeBinding.rangeSize == -1}">
    <c:set var="rowsPerPage" value="${totalRows}"/>
    </c:when>
    <c:otherwise>
    <c:set var="rowsPerPage" value="${rangeBinding.rangeSize}"/>
    </c:otherwise>
    </c:choose>
    <c:choose>
    <c:when test="${firstRowShown + rowsPerPage - 1 > totalRows}">
    <c:set var="lastRowShown" value="${totalRows}"/>
    </c:when>
    <c:otherwise>
    <c:set var="lastRowShown" value="${firstRowShown + rowsPerPage - 1}"/>
    </c:otherwise>
    </c:choose>
<html:form name="pagingOne" type="org.apache.struts.action.DynaActionForm" action="/Transactions.do?hash=page&reset=false" >
    <input type="hidden" name="<c:out value='${bindings.MasterTransactionView1.statetokenid}'/>" value="<c:out value='${bindings.MasterTransactionView1.statetoken}'/>" >
    <table width="970" border="0" cellpadding="0" cellspacing="0">
    <tr valign="bottom">
<td width="10"><img src="img/whitespacer.jpg" width="5" height="3"></td>
    <td width="460" class="table-header">
    TRANSACTION DETAIL</td>
    <c:choose >
    <c:when test="${lastRowShown > 0}">
    <td width="140" class="table-header">
    <fmt:parseNumber var="firstRowShownFMT" type="number" pattern="##,###" value="${firstRowShown}"/>
    <fmt:parseNumber var="lastRowShownFMT" type="number" pattern="##,###" value="${lastRowShown}"/>
    <fmt:parseNumber var="totalRowsFMT" type="number" pattern="##,###" value="${totalRows}"/>
    <c:out value="${firstRowShown}"/> - <c:out value="${lastRowShown}"/> of <c:out value="${totalRowsFMT}"/>
    </td>
    <td width="370" class="table-header">
    <table border="0" align="right" cellpadding="0" cellspacing="0">
    <tr valign="bottom">
    <td width="40" align="center" valign="bottom" nowrap>
<img src="img/export-icon3.jpg" style="background-color:transparent; border-style:none;"/>
</td>
<!-- <td width="30" valign="bottom" ><img src="img/whitespacer.jpg" width="5" height="3"></td> -->
    <c:choose>
    <c:when test="${previousDisabled != 'disabled'}">
<td width="35" align="center" valign="bottom" nowrap><input type="image" src="img/first-icon3.jpg" name="event_First" value="First" <c:out value="${bindings.First.enabledString}" /> > </td>
<td width="35" align="center" valign="bottom" nowrap><input type="image" src="img/prev-icon3.jpg" name="event_PreviousSet" value="PreviousSet" <c:out value="${bindings.PreviousSet.enabledString}" /> > </td>
    </c:when>
    <c:otherwise>
    <c:choose>
    <c:when test="${lastRowShown == totalRows && firstRowShown == 1 && totalRows > 10}">
<td width="35" align="center" valign="bottom" nowrap><input type="image" src="img/first-icon3.jpg" name="event_First" value="First" > </td>
</c:when>
<c:otherwise>
<td width="35" align="center" valign="bottom" nowrap><img src="img/first-icon-gray3.jpg"> </td>
</c:otherwise>
</c:choose>
<td width="35" align="center" valign="bottom" nowrap><img src="img/prev-icon-gray3.jpg"> </td>
    </c:otherwise>
    </c:choose>
    <c:choose>
    <c:when test="${nextDisabled != 'disabled'}">
<td width="35" align="center" valign="bottom" nowrap><input type="image" src="img/next-icon3.jpg" name="event_NextSet" value="NextSet" <c:out value="${bindings.NextSet.enabledString}" /> > </td>
<td width="35" align="center" valign="bottom" nowrap><input type="image" src="img/last-icon3.jpg" name="event_Last" value="Last" <c:out value="${bindings.Last.enabledString}" /> > </td>
    </c:when>
    <c:otherwise>
<td width="35" align="center" valign="bottom" nowrap><img src="img/next-icon-gray3.jpg"> </td>
<td width="35" align="center" valign="bottom" nowrap><img src="img/last-icon-gray3.jpg"> </td>
    </c:otherwise>
    </c:choose>
    <c:choose>
    <c:when test="${lastRowShown == totalRows && firstRowShown == 1}">
<td width="35" align="center" valign="bottom" nowrap><img src="img/all-icon-gray-v5.jpg"> </td>
    </c:when>
    <c:otherwise>
<td width="35" align="center" valign="bottom" nowrap><input type="image" src="img/all-icon-v5.jpg" name="event_EventAll" value="All" > </td>
    </c:otherwise>
    </c:choose>
    </tr>
    </table>
    </td>
    </c:when>
    <c:otherwise>
    <td>
    </td>
    </c:otherwise>
    </c:choose>
    </tr>
</table>
</html:form>
    ...

Your issue does not seem to be related to OA Framework.
To get more responses, post your issue in this forum:
JDeveloper and ADF
Thanks
