Best Practice to transfer workspace in NWDS

Hello,
I have to change the machine I have been working on until now.
The issue is that I have NWDS installed on that machine, along with a few local Web Dynpro Java projects in its workspace.
On the new machine I will get NWDS installed as well.
I would appreciate it if anybody here could explain how I should transfer my code/workspace/Web Dynpro Java applications from machine 1 to machine 2.

Hi Saurabh,
Hope this [link |http://www.****************/Tutorials/EP/Versions/Page2.htm] will give you some idea of how to transfer the .dtc folder from one workspace to another.
Hope this helps you.
Regards,
Saleem Mohammad.

Similar Messages

  • Best Practice to transfer objects

    Hello,
    I am trying to transfer multiple DTOs to the same method. Is there a good practice for approaching this? Here is an example:
    public ObjectA oA;
    public ObjectB oB;
    someClass.setObject(oA);
    //How to set oB?
    I would like to create one method which takes all the objects (ObjectA, ObjectB, ...) and iterates over them in one single place. Any suggestions? Your help is appreciated, thank you!

    Thanks for the reply.
    I might be wrong here, but the number of DTOs may change; it's not fixed.
    And I am trying not to make it application-specific. This is my common class: it just takes all the objects, iterates over them, and returns some result (which I will code inside it). Thanks again!
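    Since the number of DTOs is not fixed, a varargs method is one common way to accept any number of objects and iterate over them in a single place. A minimal sketch (class and method names are illustrative, not from a real API):
    public class DtoProcessor {
        // Varargs: callers may pass any number of DTOs without a fixed signature.
        public void setObjects(Object... dtos) {
            for (Object dto : dtos) {
                process(dto);
            }
        }
        private void process(Object dto) {
            // Dispatch on the concrete type here if handling differs per DTO.
            System.out.println("Processing " + dto.getClass().getSimpleName());
        }
    }
    The same call site then works for two DTOs or twenty, which keeps the class application-agnostic: new DtoProcessor().setObjects(oA, oB);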

  • Best practice for timing of Transfer of forecast to R/3 Demand Management

    Hi All,
    I have APO DP 4.0 with monthly buckets. What is the best practice for transferring the planning results to R/3, i.e., how many times should we transfer?
    And when the forecast (PIR) is transferred, it replaces the old PIR in Demand Management for that month. How will this affect forecast consumption in R/3? Say in a given month the first transferred forecast was 100, and by the middle of the month 50 had been consumed. Now the forecast is carried out again, 200 is calculated for the same month, and transferred to R/3. Ideally the remaining quantity to be made should now be 150 in R/3. Is that so? Any idea?
    TBR, MM

    Hi Somnath,
    Thanks for the reply. The problem with frequent transfers of the forecast to R/3 Demand Management is that the latest forecast ignores the prior shipments made in that month.
    I will give a numeric example:
    Let us say we are in September, and in the first week we transferred a forecast of 100 for one product to Demand Management. In the first two weeks 40 items were shipped, so 60 remain.
    Now we are in the third week, and the planner realizes that instead of 100 we needed to plan 150, so 150 is sent to R/3 Demand Management as a fresh forecast for that month.
    For September we should then see a net demand of 110 (= 150 - 40). Somehow this is not happening; it shows the whole 150 as net demand. Any idea why?
    Thanks and Best Regards, Manoj

  • Best way to transfer old Macbook to new Macbook, both with two hard drives?

    I am waiting for the new MacBook Pros to come out, and I'm planning to upgrade from my original 13in MBP. When I bought that MBP, I simply used Time Machine to copy my old MacBook to the new one. It worked well.
    But I've since replaced the hard drive with a solid-state drive that houses the applications and OS, plus some working files (photos), and I removed the optical drive and installed a large hard drive where I keep the user files.
    I'm planning to order the new MacBook Pro with a larger SSD, and then replace the optical drive with a bigger hard drive, probably the one I currently have in my MBP.
    What would be the best practice to transfer everything over? I want to keep most of what was on my old machine, including preferences, etc., but I also don't want years and years of junk transferred over. I've gone from Leopard to Snow Leopard to Mountain Lion, so there is probably some junk from there as well, and I'm hoping that Mavericks will come with the new MBP.
    Thanks for the help!

    Just one comment on this:
    ‘If I have, say, an older version of iPhoto but a newer one on my new MacBook Pro, will it still transfer photos no problem?’
    The problems are ALWAYS the other way around: when you have the newest (or a newer) version of a given app and wish to transfer its library to an older version.
    Otherwise, see the link to Pondini.
    I have been using Martin Jahn’s iBackup, to which I have become accustomed and which works fine. It also makes daily backups. Its advantage over other apps is that you may add whatever you wish to save besides its default settings (which you may delete or cancel, of course, even if that is not recommended); it also has a friendly interface and is easily customizable. Of course, this is a personal view; you may try other methods or other backup apps as well. All are good if you are satisfied and they correspond to your needs.

  • Transfer iphoto library to external harddrive. Best practice?

    I need to transfer my iPhoto library to an external hard drive due to space issues. What is the best practice?

    Moving the iPhoto library is safe and simple: quit iPhoto and drag the iPhoto Library, intact, as a single entity to the external drive. Then hold down the Option key and launch iPhoto, using the "Select Library" option to point to the new location on the external drive. Fully test it, and then trash the old library on the internal drive (test one more time prior to emptying the trash).
    And be sure that the external drive is formatted Mac OS Extended (Journaled) (iPhoto does not work with drives in other formats) and that it is always available prior to launching iPhoto.
    And back up soon and often: having your iPhoto library on an external drive is not a backup, and if you are using Time Machine you need to check and be sure that TM is backing up your external drive.
    LN

  • NWDS iFlow and Communication Channel best practice

    Hi,
    Is there a recommended best practice for the creation of Communication Channels when using NWDS? I know that a CC can be created in two ways:
    A CC can be created automatically from the iFlow.
    A CC can be created manually against the Business System and then referenced in the iFlow.
    When a CC is created from the iFlow, it is not visible against the Business System in NWDS, which is annoying.
    Any advice? This is a fresh system, so no CCs were migrated from the Integration Directory.
    Che

    Hi,
    to me this sounds like an iFlow-related question rather than an NWDS or NWDI one.
    I am not familiar with iFlow; I only found some guides after googling, like
    Introducing iFlow in PI 7.31 Configuration
    and
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/304eda9e-736f-2f10-7c99-a9b61353cc34?overridelayout=t…
    but I can't tell whether these are related to your question.
    What I can definitely suggest, though, is to raise this at:
    Process Integration (PI) & SOA Middleware
    Still, I hope this helps.
    Regards,
    Ervin

  • SAP Best Practice for Document Type./Item category/Acc assignment cat.

    What is the best practice for the document type and item category?
    I want to use NB with item categories B and K (blanket PO), D (service), and T (text).
    Does SAP recommend using FO only for blanket purchase orders?
    We want to use service contracts (with/without service entry sheets) for all our services.
    We want to buy assets for our office equipment.
    Which is the better one to use, NB or FO?
    Please give me any OSS notes or references for this.
    Thanks
    Nick


  • Best practice for TM on AEBS with multiple macs

    Like many others, I just plugged a WD 1TB drive (Mac-ready) into the AEBS and started TM.
    But in reading here and elsewhere I'm realizing that there might be a better way.
    I'd like suggestions for best practices on how to setup the external drive.
    The environment is...
    ...G4 Mac mini, 10.4 PPC - this is the system I'm moving from; it has all the iPhotos and iTunes content and is being left untouched until I get the whole TM/backup setup tested. But it will go to 10.5 eventually.
    ...Intel iMac, 10.5 soon to be 10.6
    ...Intel Mac mini, 10.5, soon to be 10.6
    ...AEBS with (mac ready) WD-1TB usb attached drive.
    What I'd like to do...
    ...use the one WD-1TB drive for all three backups, AND keep a copy of the system and iLife DVDs to recover from.
    From what I'm reading, I should have a separate partition for each Mac's TM to back up to.
    The first question is partitioning... Disk Utility sees my iMac's internal HD & DVD, but doesn't see the WD-1TB on the AEBS. (When TM is active it appears in Disk Utility, but when TM ends, it drops off the Disk Utility list.)
    I guess I have to connect it via USB to the iMac for the partitioning, right?
    I've also read about the benefits of keeping a copy of the install DVDs on the external drive... but this raises more questions.
    How do I get an image of the install DVD onto the 1TB drive?
    How do I do that? (An install? An ISO image? A straight copy?)
    And what about the 2nd disc (for iLife)? The same partition, a different one, an ISO image, a straight copy?
    Can I actually boot from the external WD 1TB while it is connected to the AEBS, or do I have to temporarily plug it in via USB?
    And if I have to boot the OS from USB, once I load it and it wants to restore from the TM backup, do I leave it on USB or move it to the AEBS? (I've heard the way the backups are created differs local vs. network.)
    I know it's a lot of questions, but here are the two objectives...
    1. Use TM in typical fashion, to recover the occasional deleted file.
    2. The ability to perform a bare-metal, point-in-time recovery (not always to the very last backup, but sometimes to a day or two before).

    dmcnish wrote:
    From what I'm reading, I should have a separate partition for each Mac's TM to back up to.
    Hi, and welcome to the forums.
    You can, but you really only need a separate partition for the Mac that's backing up directly. It won't have a sparse bundle but a Backups.backupdb folder, and if you ever have to or want to delete all of its backups (new Mac, certain hardware repairs, etc.) you can just erase the partition.
    The first question is partitioning... Disk Utility sees my iMac's internal HD & DVD, but doesn't see the WD-1TB on the AEBS. (When TM is active it appears in Disk Utility, but when TM ends, it drops off the Disk Utility list.)
    I guess I have to connect it via USB to the iMac for the partitioning, right?
    Right.
    I've also read about the benefits of keeping a copy of the install DVDs on the external drive... but this raises more questions.
    Can I actually boot from the external WD 1TB while it is connected to the AEBS, or do I have to temporarily plug it in via USB?
    I don't think so. I've never tried it, but even if it works, it will be very slow. So connect via FireWire or USB (the PPC Mac probably can't boot from USB, but the Intels can).
    And if I have to boot the OS from USB, once I load it and it wants to restore from the TM backup, do I leave it on USB or move it to the AEBS? (I've heard the way the backups are created differs local vs. network.)
    That's actually two different questions. To do a full system restore, you don't load OS X at all, but you do need the Leopard install disc, because it has the installer. See item #14 of the Frequently Asked Questions *User Tip* at the top of this forum.
    If for some reason you do install OS X, then you can either "transfer" (as part of the installation) or "migrate" (after restarting, via the Migration Assistant app in your Applications/Utilities folder) from your TM backups. See the *Erase, Install, & Migrate* section of the Glenn Carter - Restoring Your Entire System / Time Machine *User Tip* at the top of this forum.
    In either case, if the backups were done wirelessly, you must transfer/migrate wirelessly (although you can speed it up by connecting via Ethernet).

  • Best practice with WCCP flows for WAAS

    Hi,
    I have a WAAS SRE 910 module in a 2911 router; it intercepts packets from this router with WCCP.
    All packets are received on the external interface (gi 2/0, connected to a switch with the port configured in the WCCP VLAN), and are sent back to the router via the internal interface (gi 1/0, directly connected to the router):
    WAAS# sh interface gi 1/0
    Internet Address                    : 10.0.1.1
    Netmask                             : 255.255.255.0
    Admin State                         : Up
    Operation State                     : Running
    Maximum Transfer Unit Size          : 1500
    Input Errors                        : 0
    Input Packets Dropped               : 0
    Packets Received                    : 20631
    Output Errors                       : 0
    Output Packets Dropped              : 0
    Load Interval                       : 30
    Input Throughput                    : 239 bits/sec, 0 packets/sec
    Output Throughput                   : 3270892 bits/sec, 592 packets/sec
    Packets Sent                        : 110062
    Auto-negotiation                    : On
    Full Duplex                         : Yes
    Speed                               : 1000 Mbps
    WAAS# sh interface gi 2/0
    Internet Address                    : 10.0.2.1
    Netmask                             : 255.255.255.0
    Admin State                         : Up
    Operation State                     : Running
    Maximum Transfer Unit Size          : 1500
    Input Errors                        : 0
    Input Packets Dropped               : 0
    Packets Received                    : 86558
    Output Errors                       : 0
    Output Packets Dropped              : 0
    Load Interval                       : 30
    Input Throughput                    : 2519130 bits/sec, 579 packets/sec
    Output Throughput                   : 3431 bits/sec, 2 packets/sec
    Packets Sent                        : 1580
    Auto-negotiation                    : On
    Full Duplex                         : Yes
    Speed                               : 100 Mbps
    The default route configured on the WAAS module is 0.0.0.0/0 to 10.0.1.254 (the router interface).
    Would it be better for packets to leave the WAAS module by the external interface (instead of the internal interface)?
    Is there a best practice recommended by Cisco for this?
    Thanks.
    Stéphane

    Hi Stephane,
    We usually advise the following in such a scenario with an internal module:
    "ip wccp 61 redirect in" on the LAN interface.
    "ip wccp 62 redirect in" on the WAN interface.
    "ip wccp redirect exclude in" on the internal interface between the WAAS and the router.
    That way, we are sure that no loops are created because of the WCCP redirection.
    Regards,
    Nicolas
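    Put together as router configuration, that advice looks roughly like the sketch below; the interface names are illustrative assumptions (SM1/0 standing in for the router interface facing the internal WAAS module). Services 61 and 62 intercept the two directions of a flow, and the exclude statement prevents traffic returning from the WAAS from being re-intercepted:
    ip wccp 61
    ip wccp 62
    !
    interface GigabitEthernet0/0
     description LAN-facing interface
     ip wccp 61 redirect in
    !
    interface GigabitEthernet0/1
     description WAN-facing interface
     ip wccp 62 redirect in
    !
    interface SM1/0
     description internal interface towards the WAAS module
     ip wccp redirect exclude in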

  • Best practice - material staging for production order

    Hi Experts,
    could any of you please support me with some hints on the best practice for handling material staging via the WM-PP interface in a certain case?
    Up till now we had a system where production had no separate storage location in IM; one location covered both the raw material warehouse and production. At the same time, in WM, we had separate storage types for production and for raw materials - hence we did material staging by transferring goods only inside one IM location, between different WM storage types. The material staging is to be done based on separate production orders.
    Now this needs to be changed, and a separate storage location needs to be handled in IM for production - which means the staging should be done between different IM locations, and the WM administration also needs to be handled.
    Up till now we used LP10 for staging, then LB13 for TO creation, etc. We can keep going like that, but if we do so, another step is required in IM - movement 311, where material numbers and quantities need to be added manually to finish the whole procedure. I would like to avoid this - it makes the administrative procedure quite long.
    I have been checking the following possibilities:
    1. Set "released order parts" staging at the control cycle and use MF60 for staging - but I cannot select requirements based on production orders here (I am only able to find demand when including the component in the selection).
    2. Two-step transfer 313/315 - but this is not a supported procedure - 313 TI / TO / 315.
    3. Try to find a solution to create the 311 movement based on the TO, or based on the WM stock at a certain storage type / dynamic bin.
    I have failed.
    So could any of you please support me with some useful ideas on how to handle material staging where 311 is included as the last step of the procedure, but the administrator does not need to enter items manually one by one in MIGO.
    All answers will be appreciated

    Hi,
    Storage location control should be able to take care of your problem.
    If you want to stage the material to a different IM location than the WM location, then make the following settings.
    Say location XXXX is your WM location and location YYYY is your production location.
    You have defined production storage type ZZZ for production storage location YYYY and have maintained the supply area for the same.
    In WM configuration - For Interfaces - IM Interface - Control of Assignment "Plant / Stor.Loc. - Whse Number":
    Assign location XXXX as the standard location. Maintain the entry "do not copy sloc in TR" for location YYYY.
    In WM configuration - For Interfaces - IM Interface - Storage Location Control for WH:
    This entry ensures that there will be a WM transfer posting between your WM and production storage locations automatically when you confirm your TO. You can also have this done via a batch job if you want cumulative posting (schedule job RLLQ0100).
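    If the automatic posting cannot be made to fit, another option is to post the 311 movement programmatically through the standard BAPI_GOODSMVT_CREATE (GM code 04 = transfer posting, movement type 311), so the administrator does not have to key the items into MIGO. A minimal sketch using SAP JCo 3; the destination name, material, plant, and location values are illustrative assumptions:
    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.JCoStructure;
    import com.sap.conn.jco.JCoTable;

    public class Post311Transfer {
        public static void main(String[] args) throws JCoException {
            JCoDestination dest = JCoDestinationManager.getDestination("MY_SAP_DEST"); // hypothetical destination
            JCoFunction fn = dest.getRepository().getFunction("BAPI_GOODSMVT_CREATE");

            JCoStructure header = fn.getImportParameterList().getStructure("GOODSMVT_HEADER");
            header.setValue("PSTNG_DATE", new java.util.Date());
            header.setValue("DOC_DATE", new java.util.Date());

            // GM code 04 = transfer posting
            fn.getImportParameterList().getStructure("GOODSMVT_CODE").setValue("GM_CODE", "04");

            JCoTable items = fn.getTableParameterList().getTable("GOODSMVT_ITEM");
            items.appendRow();
            items.setValue("MATERIAL", "RAW-0001");   // illustrative material
            items.setValue("PLANT", "1000");
            items.setValue("STGE_LOC", "XXXX");       // source: WM-managed location
            items.setValue("MOVE_TYPE", "311");       // SLoc-to-SLoc transfer
            items.setValue("MOVE_STLOC", "YYYY");     // target: production location
            items.setValue("ENTRY_QNT", 10.0);
            items.setValue("ENTRY_UOM", "ST");

            fn.execute(dest);
            // In real code, check the BAPI's RETURN table for errors before committing.
            JCoFunction commit = dest.getRepository().getFunction("BAPI_TRANSACTION_COMMIT");
            commit.execute(dest);
        }
    }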

  • Best practice for having separate clone data for development purposes?

    Hi
    I am on a hosted Apex environment
    I have a workspace containing two instances/copies of the application: DEV and PROD.
    I would like to be able to develop functionality and data in/with the DEV instance and then promote it to PROD.
    I gather that I can copy pages from DEV to PROD via Create -> New page as copy -> Page in another application.
    But I don't know how I can mimic this process with database objects, e.g. if I want to create a new table or manipulate the data in an existing table in the DEV environment before implementing it in the PROD environment.
    Ideally this would be done in such a way that minimises changing table names etc. when promoting pages from DEV to PROD.
    Would it be possible to create a clone schema that could contain the same tables (with the same names) as PROD?
    Any tips, best practices appreciated :)
    Thanks

    Hi,
    ideally you should have a little more separation between your dev and prod environments. At a minimum you should have separate workspaces, each addressing a separate schema. Apex can be a little difficult if you want to move individual Apex application objects, such as pages, between applications (a much-requested improvement), but this can be overcome by exporting and importing the whole application. You should also have some form of version control/backup for the export files.
    As far as database objects go (tables etc.), if you have TNS access to your hosted environment, you can use SQL Developer to develop, maintain, and synchronize your development and production schemas, and objects in the different environments should have identical names. If you don't have that access, you can use the Apex SQL Workshop features, but these are a little more cumbersome than a tool like SQL Developer. Once again, the scripts for creating and upgrading your database schemas should be kept under some sort of version control.
    All of this supposes that your hosting solution allows more than one workspace and schema; if not, you may have to incur the cost of a second environment. One other option would be to do your development locally in an instance of Oracle XE, ensuring you don't have any version conflicts between the different database object features and the Apex version.
    I hope this helps.
    Regards
    Andre

  • Best Practice for Acquisition of Utility Plant Assets from another utility

    My company is located in the United States and will be taking on an initiative of purchasing the utility plant assets of another company. We are governed by Federal Energy Regulatory Commission (FERC) accounting standards. The guidance for the accounting where one utility purchases the assets of another states that the purchasing company must account for the utility plant assets in FERC Account 102 at net book value until FERC approval is received on the sale. Depreciation must be calculated based on the gross book value and applied to this same FERC account. To track the assets, each asset's APC value and depreciation value must transfer from the balance sheet of the utility selling the plant assets to the balance sheet of the utility purchasing them.
    As an example:
    Utility ABC is selling their plant assets to Utility XYZ. The NBV of the plant assets is $60,000,000, broken down as a debit of $80,000,000 for the APC value (in FERC Account 101 on Utility ABC's balance sheet) and a credit of $20,000,000 for the associated depreciation value (in FERC Account 108 on Utility ABC's balance sheet). While the sale is pending FERC approval, the NBV is accounted for on Utility XYZ's balance sheet in FERC Account 102. This amount will be processed to the selling utility on a PO for the purchase.
    I have configured the Fixed Asset module of SAP to account for the APC value and the depreciation value in separate sub-accounts of FERC Account 102, which are set up as reconciliation accounts on the G/L, in the account assignment of the respective asset classes. We track our assets based on asset classes that pertain to the FERC primary plant accounts.
    I am trying to load the assets into the Fixed Asset module with the APC value and the depreciation value reported respectively. If the NBV amount is processed on the PO, what would be the best practice to load the APC value and depreciation value to the respective assets?
    My first thought would be to process the PO for the NBV of the assets against a generic FERC Account 102 that is not set up as a reconciliation account. I would then process an asset transaction using t-code ABSO with transaction type 158, using the generic FERC Account 102 as the offsetting account in the entry and document type AA.
    I would like to follow best practice in this scenario.
    Your help on this subject would be greatly appreciated.
    Wayne

    Thank you very much for your response.
    I hope I can provide some clarity on how the accounting needs to be handled per FERC regulations. The G/L balance on the utility that is selling the assets will be in the following accounts (standard accounts across all FERC-regulated utilities):
    101 - Acquisition value for the assets
    108 - Accumulated depreciation value for the assets
    For example, there is a debit of $60,000,000 in FERC Account 101 and a credit of $30,000,000 in FERC Account 108. When the purchase occurs, the net book value of the assets will be on our G/L in FERC Account 102. Once we have FERC approval to acquire the plant assets, we will need to enter the acquisition value and the associated accumulated depreciation onto our G/L in FERC Account 101 and FERC Account 108 respectively, with an offset to FERC Account 102.
    The method that I came up with is to purchase the NBV of the assets to a clearing account. I then set up account assignments that will track the acquisition value and the respective accumulated depreciation for each asset being purchased. I load the respective asset values using t-code AS91 and then make an entry to the two respective accounts with the offset against the clearing account using t-code OASV. Once my company receives FERC approval, I will transfer the assets to new assets that have the account assignments for FERC Account 101 and FERC Account 108, using t-code ABUMN or FB01.

  • Best practices for gathering statistics in 10g

    I would like to get some opinions on what is considered best practice for gathering statistics in 10g. I know that 10g has automatic statistics gathering, but that doesn't seem to be very effective, as I see some table stats are way out of date.
    I have recommended that we have at least a weekly job that gathers stats for our schema using DBMS_STATS (DBMS_STATS.gather_schema_stats). Is this the right approach to generate object stats for a schema and keep them up to date? Are index stats included in that via CASCADE?
    Is it also necessary to gather system stats? I welcome any thoughts. Thanks.

    Hi,
    Is this the right approach to generate object stats for a schema and keep it up to date?
    The choices of execution plans made by the CBO are only as good as the statistics available to it. The old-fashioned analyze table and dbms_utility methods for generating CBO statistics are obsolete and somewhat dangerous to SQL performance, since the CBO uses object statistics to choose the best execution plan for all SQL statements.
    I spoke with Andrew Holdsworth of the Oracle Corp SQL tuning group, and he says that Oracle recommends taking a single, deep sample and keeping it, re-analyzing only when there is a chance that doing so would change execution plans (not at the default 20% re-analyze threshold).
    I have my detailed notes here:
    http://www.dba-oracle.com/art_otn_cbo.htm
    As to system stats, oh yes!
    By measuring the relative costs of sequential vs. scattered I/O, the CBO can make better decisions. Here are the data items collected by dbms_stats.gather_system_stats:
    No Workload (NW) stats:
    CPUSPEEDNW - CPU speed
    IOSEEKTIM - I/O seek time in milliseconds
    IOTFRSPEED - I/O transfer speed in bytes per millisecond
    I have my notes here:
    http://www.dba-oracle.com/t_dbms_stats_gather_system_stats.htm
    Hope this helps. . . .
    Don Burleson
    Oracle Press author
    Author of “Oracle Tuning: The Definitive Reference”
    http://www.dba-oracle.com/bp/s_oracle_tuning_book.htm
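    For the weekly job itself, a minimal sketch of invoking DBMS_STATS.gather_schema_stats from Java via JDBC (the connection string, credentials, and schema name are illustrative placeholders); passing cascade => true also gathers the index statistics asked about:
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class GatherSchemaStats {
        public static void main(String[] args) throws SQLException {
            // Hypothetical connection details; replace with your own.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger");
                 CallableStatement cs = conn.prepareCall(
                     "begin dbms_stats.gather_schema_stats("
                   + "ownname => ?, cascade => true); end;")) {
                cs.setString(1, "SCOTT");  // schema to analyze
                cs.execute();              // cascade => true includes index stats
            }
        }
    }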

  • JEE5 Application Architecture Best Practice.

    Hi Everybody
    I am going to redesign a moderately sized application (not very big, but larger than normal).
    Now I have a few questions in my mind.
    I am using JSF as the front end, EJB3 session beans for the business logic and, last but not least, JPA as the domain model.
    1 - With JPA we have domain classes. Is it better to use the entity as the managed bean for JSF, or should the managed bean be separate?
    2 - Is using DTOs (Data Transfer Objects) good practice or not in JEE5?
    3 - Simplicity or complexity: with the EntityManager I feel no need for a DAO, but I am used to the DAO pattern. So, as best practice, should I make one session bean the DAO and call it from all the session beans where I write business logic, or forget about the DAO session bean and call the EntityManager from every session bean?
    4 - One way to obtain an EJB is a JNDI lookup; the other way is
    @EJB EJBCLASSNAME ejbclassobject; // automatically initialized and injected by the container
    Is initializing it like this standard, or is it an extension provided by some app servers?

    Hi,
    Here is my opinion:
    1 - With JPA we have domain classes. Is it better to use the entity as the managed bean for JSF, or should the managed bean be separate?
    >> I think the managed bean should be separate, because you may also need to bind your visual components to it.
    2 - Is using DTOs (Data Transfer Objects) good practice or not in JEE5?
    >> You can put your entity as a member of your managed bean.
    3 - Simplicity or complexity: with the EntityManager I feel no need for a DAO, but I am used to the DAO pattern. So, as best practice, should I make one session bean the DAO and call it from all the session beans where I write business logic, or forget about the DAO session bean and call the EntityManager from every session bean?
    >> For CRUD operations I don't create an additional class, but for complex business logic you can use a separate class (a business manager).
    Best regards
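    A minimal sketch of the layering suggested above (class names are illustrative): the JPA entity stays a plain domain class held as a member of a separate managed bean, the session bean talks to the EntityManager directly for CRUD, and @EJB field injection is standard in Java EE 5 for container-managed classes (servlets, other EJBs, JSF managed beans), not a vendor extension:
    import javax.ejb.EJB;
    import javax.ejb.Stateless;
    import javax.persistence.Entity;
    import javax.persistence.EntityManager;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.persistence.PersistenceContext;

    // Each class would live in its own file in a real project.

    @Entity
    class Customer {                                  // plain JPA domain class
        @Id @GeneratedValue
        private Long id;
        private String name;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    @Stateless
    class CustomerService {                           // session bean: CRUD without a separate DAO
        @PersistenceContext
        private EntityManager em;
        public void save(Customer c) { em.persist(c); }
    }

    // JSF 1.2 managed bean; in Java EE 5 it is registered in faces-config.xml.
    class CustomerBean {
        @EJB
        private CustomerService service;              // container injection, no JNDI lookup needed

        private Customer customer = new Customer();   // entity as a member of the managed bean

        public Customer getCustomer() { return customer; }

        public String save() {
            service.save(customer);
            return "success";                         // JSF navigation outcome
        }
    }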

  • Need best practice when accessing an ucm content after being transferred.

    Hi All,
    I have a business requirement where I need to auto-transfer content to another UCM instance when the content expires in the source UCM.
    This content then needs to be deleted after it spends a certain duration in the target UCM.
    Can anybody advise me on the best practice for doing this in Oracle UCM?
    I have set up an expiration date and am trying to auto-replicate the content to the target UCM once the content reaches the expiration date.
    I am not aware of the best practice for accessing the content once it is in the target UCM.
    Any help in this case would be greatly appreciated.
    Regards,
    Ashwin
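    For the last point, accessing content in the target UCM, one common route is the RIDC (Remote Intradoc Client) Java API; a minimal sketch, where the host, port, user, and content ID are illustrative placeholders:
    import oracle.stellent.ridc.IdcClient;
    import oracle.stellent.ridc.IdcClientException;
    import oracle.stellent.ridc.IdcClientManager;
    import oracle.stellent.ridc.IdcContext;
    import oracle.stellent.ridc.model.DataBinder;
    import oracle.stellent.ridc.protocol.ServiceResponse;

    public class FetchFromTargetUcm {
        public static void main(String[] args) throws IdcClientException {
            IdcClientManager manager = new IdcClientManager();
            // idc:// is the native Intradoc socket protocol; 4444 is its usual port.
            IdcClient client = manager.createClient("idc://target-ucm-host:4444");
            IdcContext ctx = new IdcContext("weblogic"); // illustrative user

            DataBinder binder = client.createBinder();
            binder.putLocal("IdcService", "GET_FILE");             // standard UCM service
            binder.putLocal("dDocName", "TRANSFERRED_CONTENT_ID"); // illustrative content ID
            binder.putLocal("RevisionSelectionMethod", "LatestReleased");

            ServiceResponse response = client.sendRequest(ctx, binder);
            // response.getResponseStream() now holds the file content.
        }
    }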

    SR,
    Unfortunately, temp tables are the way to go. In Apex we call them collections (not the same as PL/SQL collections), and there's an API for working with them. In other words, the majority of the legwork has already been done for you. You don't have to create the tables or worry about tying data to different sessions. Start your learning here:
    http://download.oracle.com/docs/cd/E14373_01/appdev.32/e11838/advnc.htm#BABFFJJJ
    Regards,
    Dan
    http://danielmcghan.us
    http://sourceforge.net/projects/tapigen
    http://sourceforge.net/projects/plrecur

Maybe you are looking for

  • Websites service not working

    I am running Yosemite with Server 4.0 on a late 2012 Mac mini. The Websites service says "Available on your local network at server.local". How can I get that published to the rest of the Universe? The mini is connected via Ethernet using a manual IP of

  • Difference between BAPI and FM for MIRO

    Hi All, what is the difference between using BAPI_INCOMINGINVOICE_CREATE and MRM_INVOICE_CREATE? I have a situation where I need to use CALL TRANSACTION for MIRO, as I cannot use BAPI_INCOMINGINVOICE_CREATE. Though I have done a recording an

  • iPhone 4s lagging for 15 seconds when deleting an app

    Currently I am using an iPhone 4s without jailbreak. I found that when I try to delete some applications, my 4s lags for 15 seconds before popping up the box for 'Delete' or 'Cancel'. I have already tried some little tips to make it run smoothly, like closing all m

  • Article on PPro CS4 64 bit performance...

    I thought this was an interesting read, and maybe some of you here will too... Part 1 http://digitalcontentproducer.com/affordablehd/newsletter/cs4_64bit_1208/index.html?imw=Y Part 2 http://digitalcontentproducer.com/workflow/cs4_64bit_1222/index.

  • MacBook Pro, 5.1 audio issue!

    Hello Apple community! At first, I searched many threads here and on other forums, and also YouTube, and haven't found an answer to my problem. I am trying to set up my 5.1 speakers via an external sound card. I have found a nice topic here by Vegas, who shows how t