Best Practices - Working with lots of fields

I am working on new modules based on a migrated project, and I have
many fields to work with, which is making the code large and somewhat
repetitive. If I follow the existing structure (from the migration) it
works fine, but I am looking for better ways to do it.
I would appreciate any suggestions or experiences.
Thanks.
Sincerely,
Partha

Welcome to the forum!
I don't see anything obviously wrong with that SELECT statement. It looks very much like this query:
SELECT      e.ename, e.empno, e.mgr
,           d.deptno, d.dname
FROM        scott.emp    e
,           scott.dept   d
WHERE       e.deptno     = d.deptno
START WITH  e.empno      = 7566
CONNECT BY  PRIOR e.empno = e.mgr
;
which produces this output:
ENAME           EMPNO        MGR     DEPTNO DNAME
JONES            7566       7839         20 RESEARCH
SCOTT            7788       7566         20 RESEARCH
ADAMS            7876       7788         20 RESEARCH
FORD             7902       7566         20 RESEARCH
SMITH            7369       7902         20 RESEARCH
In the scott.emp table, empno is unique, and mgr is the empno of the boss, or parent. I assume employees.id is unique, and the employees.dependencia on any row is the boss of that employee.
Whenever you have a problem, it helps if you post a little sample data (CREATE TABLE and INSERT statements) and the results you want from that data, or give an example using commonly available tables, like those in the scott schema. Point out where the query is getting the wrong results, say what the right results are in those places, and explain how you get those results from that data.
Always say which version of Oracle you're using (for example, 10.2.0.3.0).

Similar Messages

  • Best Practice: Working with Win7 32Bit & 64Bit Versions in MDT 2012

    What's the recommended way for producing bootable media using MDT 2012 that would support 32-bit and 64-bit versions of Windows 7?
    Should I have 2 deployment shares, or do I import the 2 operating systems' source DVDs into the same deployment share?
    Is it possible to create 32-bit and 64-bit bootable media from 1 deployment share, and if so, what settings would I need to modify in the task sequence & advanced configs to allow me to do so?

    No problem to run both x86 and x64 in the same deployment share.
    Yes, you can create both 32-bit and 64-bit boot media (WinPE) in one deployment share.
    In the properties of your deployment share, on the General tab, you choose whether you want to support both x86 & x64.

  • Aperture crashed after 3.3 update. I am a novice at best when working with restore functions. Any suggestions on how to fix this problem?

    I had the 3.3 update window with Aperture pop up and say that I could no longer work with my library until updating Aperture. The update froze up about 90% through the completion of the last set (9 of 9 / 90% complete) and ran overnight. My Mac shows version 3.3 installed. When I try to open Aperture, I get a system crash error. I am a novice at best when working with software and I am panicking because of the amount of pictures that I have on the system. I do back them up but do not know how to restore the Aperture software (I purchased it before the App Store and cannot find it). Please advise.

    There is an Aperture forum that you can repost in; that is where you will find the gurus for that product. Simply click: Aperture to get there. BTW, before you do, please complete your profile; it makes it much easier for those attempting to assist you. You can find instructions by clicking Profile Update.

  • Best Practice(s) for Laptop in Field, Server at Home? (Lightroom 3.3)

    Hi all!
    I just downloaded the 30-day evaluation of Lightroom, now trying to get up to speed. My first task is to get a handle on where the files (photos, catalogs, etc.) should go, and how to manage archiving and backups.
    I found a three-year-old thread titled "Best Practice for Laptop in Field, Server at Home" and that describes my situation, but since that thread is three years old, I thought I should ask again for Lightroom 3.3.
    I tend to travel with my laptop, and I'd like to be able to import and adjust photos on the road. But when I get back home, I'd like to be able to move selected photos (or potentially all of them, including whatever adjustments I've made) over to the server on my home network.
    I gather I can't keep a catalog on the server, so I gather I'll need two Lightroom catalogs on the laptop: one for pictures that I import to the laptop, and another for pictures on the home server -- is that right so far?
    If so, what's the best procedure for moving some/all photos from the "on the laptop catalog" to the "on the server catalog" -- obviously, such that I maintain adjustments?
    Thanks kindly!  -Scott

    Hi TurnstyleNYC,
    Yes, I think we have the same set-up.
    I only need 1 LR-catalog, and that is on the laptop.
    It points to the images wherever they are stored: initially on the laptop; later on I move some of them (once I am fairly done with developing) within LR via drag & drop onto the network storage. Then the catalog on the laptop always knows they are there.
    I can still continue to work on the images on the network storage (slightly slower than on laptop's hard drive) if I still wish to.
    While travelling, I can also work on metadata / keywording, although without access to my home network the images themselves are offline for develop work.
    2 separate catalogs would be very inconvenient, as I would always have to remember whether I had already moved some images. No collections would be possible that include some images on the laptop and some on the network.
    Remember: a LR catalog is just a database with entries about images and the pointer to their storage location.
    You can open only 1 DB of this sort at a time.
    There is no technical reason for limiting a LR catalog - I have read of people with several hundred thousand images within one.
    The only really ever-growing part on my laptop with this setup is the previews folder "<catalog name> Previews.lrdata". I render standard previews so that I can do most of the work for offline images while travelling.
    The catalog itself, "<catalog name>.lrcat", grows much slower. It is now 630 MB for 60'000+ images, whereas the previews folder is 64 GB.
    So yes, I dedicate quite a chunk of my laptop hard disk to that. I could define standard previews somewhat smaller, fitting the laptop's screen resolution, but then when working at home with a bigger external monitor LR would have to load the missing detail all the time, which is why I have defined the standard preview size for my external monitor. It may turn out to be the weakness of my setup long-term.
    That is all that is needed in terms of Lightroom setup.
    What you need additionally to cover potential failure of drives is not a matter of LR, but of *usual common backup sense*, along the question "what can be recreated after failure, and if so, by what effort?" Therefore I do not back up the previews, but very thoroughly the images themselves as well as the catalog/catalog backups, and for convenience my LR presets.
    Message was edited by: Cornelia-I: sorry, initially I had written "1:1-previews", but "standard previews" is correct.

  • I need a 2 TB hard drive for my new iMac. What's the best to work with Time Machine for around £100?

    Any suggestions about the best 2 TB hard drive to work with my iMac, compatible with Time Machine, for around £100?

    There is no such thing as "best" and "inexpensive"; you can have one or the other but not both. IMHO I'd recommend OWC's Mercury Elite Pro (www.macsales.com); however, because you are on the other side of the pond you may want to consider a LaCie product. High quality, not cheap, but good stuff for the most part.

  • Couldn't work with multiple key fields in keyword search

    Hi,
    I have an issue with keyword search under free-form search in Data Manager. In my repository I have two key fields: Name (default field) and Nation (lookup flat). While working with keyword search, any value I pass is compared only with the values under the key field Name, not with the values of the other key field, Nation. I am working with MDM 5.5 SP6. It would be appreciated if anyone could solve this issue.
    Regards
    Ravi

    Hi Ravi,
    If you want to perform keyword search for the Nation field as well without changing its field type, then please follow these steps:
    1. Set the Keyword property for the Nation field to Normal in the Main table.
    2. Now go to the lookup table that the Nation field looks into (say the lookup table name is Nations).
    3. Now set the Keyword property for the Nation field to Normal in the lookup table (Nations table) as well.
    4. Go back to the Main table and perform the keyword search again. This time it will perform the search on both fields (i.e., Name as well as Nation).
    Please let us know, if problem still persists.
    Regards,
    Varun

  • SAP Best Practice - working in parallel on the same object

    Experts,
    We are doing a roll-out on a SAP box for a particular geography, which is already being used/live for other markets. There are 2 teams, one supporting the existing system, and our team for roll-out.
    There are common includes that can be accessed by both the teams (for e.g. MV45AFZZ, RV50AFZZ, RV60AFZZ etc.).
    Could you please suggest the best practice by which both teams can work on the same include simultaneously (like defining Z-includes in the standard include and working on them) and avoid any conflicts/locks?
    Rgds,
    Birendra

    Birendra Chatterjee wrote:
    > Could you please suggest the best practice by which both teams can work on the same include simultaneously (like defining Z-includes in the standard include and working on them) and avoid any conflicts/locks?
    Not possible within the same System (sy-sysid) and the same Client (sy-mandt) - not that it makes any sense to do it anyway!
    Cheers,
    Sougata.

  • JSF Best Practices: Dealing with phone numbers

    I am sure this would be beneficial to everyone, so... I was wondering what would be the best practice to show/edit/search phone numbers in a JSF-based UI? I guess it all depends on the table/attribute definition? Suppose in your table you have
    COUNTRY_CODE
    AREA_CODE
    PHONE (xxx-xxxx)
    EXT
    how would you design the 'most' user friendly UI? Is there any component with 'pre-built' phone mask?

    Hello,
    It's funny since I got the same question today in a meeting.
    I would write a PhoneNumber and a PhoneNumberConverter class. The getAsString method would simply return the (xxx) xxx-xxxx format; the getAsObject method would check the string value against a regex, probably something like: \s*((\(\d{3}\))|(\d{3}\s*-))?\s*(\d{3})\s*-?\s*(\d{4})
    That way the user will always see the format xxx-xxxx but will be able to enter all of those formats:
    xxx-xxx-xxxx
    xxx - xxx - xxxx
    (xxx)xxx-xxxx
    (xxx) xxx-xxxx
    xxx-xxxx
    xxx - xxxx
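    A rough, untested sketch of such a converter could look like the following. PhoneNumber here is a hypothetical value class invented just for this example; only the javax.faces.convert.Converter interface itself is standard JSF, and the group numbers simply follow the regex above.
    // PhoneNumber.java - hypothetical immutable value object, not part of any JSF API
    public class PhoneNumber {
        private final String areaCode;   // may be null when the user omits it
        private final String exchange;   // first three digits
        private final String line;       // last four digits

        public PhoneNumber(String areaCode, String exchange, String line) {
            this.areaCode = areaCode;
            this.exchange = exchange;
            this.line = line;
        }

        public String getAreaCode() { return areaCode; }
        public String getExchange() { return exchange; }
        public String getLine()     { return line; }
    }

    // PhoneNumberConverter.java
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;
    import javax.faces.application.FacesMessage;
    import javax.faces.component.UIComponent;
    import javax.faces.context.FacesContext;
    import javax.faces.convert.Converter;
    import javax.faces.convert.ConverterException;

    public class PhoneNumberConverter implements Converter {

        // Same idea as the regex above: optional "(xxx)" or "xxx-" area code,
        // then "xxx-xxxx" with flexible whitespace and an optional dash.
        private static final Pattern PHONE = Pattern.compile(
                "\\s*((\\((\\d{3})\\))|((\\d{3})\\s*-))?\\s*(\\d{3})\\s*-?\\s*(\\d{4})\\s*");

        public Object getAsObject(FacesContext context, UIComponent component, String value) {
            if (value == null || value.trim().length() == 0) {
                return null;
            }
            Matcher m = PHONE.matcher(value);
            if (!m.matches()) {
                throw new ConverterException(new FacesMessage(
                        "Please enter a phone number such as (514) 555-1234 or 555-1234"));
            }
            // Group 3 carries the area code for the "(xxx)" form, group 5 for the "xxx-" form.
            String areaCode = m.group(3) != null ? m.group(3) : m.group(5);
            return new PhoneNumber(areaCode, m.group(6), m.group(7));
        }

        public String getAsString(FacesContext context, UIComponent component, Object value) {
            if (value == null) {
                return "";
            }
            PhoneNumber p = (PhoneNumber) value;
            String prefix = p.getAreaCode() != null ? "(" + p.getAreaCode() + ") " : "";
            return prefix + p.getExchange() + "-" + p.getLine();
        }
    }
    The converter would then be registered in faces-config.xml and referenced from the phone input components.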
    Regards,
    Simon Lessard
    Message was edited by:
    Simon Lessard

  • SAP Best Practice Integrated with Solution manager

    We have a server on which we installed the SAP Best Practices baseline package, and we have Solution Manager 7.01 SP 25.
    We maintained the logical port, but when we try to check connectivity to Solution Manager we get the following error:
    Connectivity check to sap solution manager system not successful
    Message no. /SMB/BB_INSTALLER375
    Can anyone guide us on how to solve the problem, and also whether there is another way to upload the solution defined in the Best Practices solution builder into SAP Solution Manager as a template project?
    Thanks,
    Heba Hesham

    Hi,
    Patches for SAPGUI 7.10 can be found at the following location:
    http://service.sap.com/patches
    -> Entry by Application Group -> SAP Frontend Components
    -> SAP GUI FOR WINDOWS -> SAP GUI FOR WINDOWS 7.10 CORE
    -> SAP GUI FOR WINDOWS 7.10 CORE -> Win 32
    -> gui710_2-10002995.exe

  • Best Practice for Oil and Gas Field Data Capture

    Hi All
    What is the best practice to capture volume data for Oil and Gas commodities in SAP? There is a solution [FDC|http://help.sap.com/miicont-fdc] that addresses the requirements; however, what parameters it asks for, what the process is, and how it calculates different variables isn't provided in the resource center.
    Appreciate any response.
    Regards
    Nayab

    Hi Zack,
    It will be easier for you to maintain the data in a single application. Every application must have the category dimension. So, you can use this dimension to maintain the actual and plan data.
    Hope this helps.

  • SAP Best Practice: Problems with Loading of Transaction Data

    Hi!
    I am about to implement SAP Best Practices scenario "B34: Accounts Receivable Analysis".
    Therefore I load the data from SAP ERP IDES system into SAP NetWeaver 2004s system.
    My problems are:
    when I try to load the transaction data for InfoSources 0FI_AR_4 and 0FI_AR_6, I get the following errors/warnings:
    when I start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)"
    On the right side I see some actions that are also "yellow", e.g. "DataStore Activation (Change Lo ): not yet activated".
    As a result I cannot see any data in tcode "RSRT" when executing the queries "0FIAR_C03/...".
    The problems there:
    1) The input help of the query's web template doesn't contain any values
    2) No data is shown
    Can some one help me to solve this problem?
    Thank you very much!
    Jürgen

    Go to the monitor window where you got the issue below:
    when I start the "schedule process", the status stays "yellow 14:27:31 (194 from 0 records)"
    Then go to Environment in the menu options -> Transact. RFC -> In the source system...
    Give the logon details, press Enter, and from there give the correct target destination as your BI server and execute.
    If you find some IDocs pending there, push them manually using F6.
    Then come back to your load and refresh.
    If it still doesn't turn green, you can manually change the status to red in the STATUS tab, then go to the Processing tab, expand your processing details, right-click on the data packet which was not yet updated, and select manual update.
    It shows a busy status, and when it comes out of that, refresh once again.
    Regards,

  • APEX 4.0 - AJAX not working with Popup LOV field

    Hello,
    In the application I have built, I have three fields which are 'fed' by selects via an AJAX construction.
    All three fields are based on three others, which are Popup LOV fields. Up to version 3.2 I had no problems: the fields were 'refreshed' as expected when a specific value was entered in the 'parent' field or chosen by the Popup LOV selector.
    But since the migration to APEX 4 (in a test system), all three fields are no longer refreshed correctly: two of them give the value 'undefined'; the third, which is a select list, shows values, but the moment I try to click on the select list's down arrow to pick a value, the list is emptied and stays that way.
    More specifically about the third field:
    Item 1 is a popup-LOV defined item in which a user can type a value, but can also choose a value from a popup LOV. Item 2 is based on item 1: it does a select based on the value of item 1 and refreshes a select list. Item 2 is defined as a select list.
    I got it all working when making item 1 also a select list, but since the list can become very large, I made a popup LOV of it. Most users do know what code they have to enter, and in case somebody is not sure, the LOV can be used.
    In this forum I came across a problem with an AJAX callback, which was answered by Patrick Wolf, saying that in an htmldb_Get( ... ,0) call the last 0 should be replaced by $v('pFlowStepId'), but this did not fix my problem. I have the feeling that the problem is somewhere else, since at first, after entering a value in item 1, I see item 2 refreshed with the correct values, but the moment I try to select one item, the list is emptied.....
    I hope I made it clear what my problem is and that somebody can help me; otherwise APEX 3.2 was the latest version for this application....
    Thanks in advance and best regards,
    Jan.
    Edited by: user13110728 on 9-aug-2010 8:44

    Hi Jan,
    the problem is the
    onBlur="javascript:f_P21_select_kostenposten(this,'P21_KOSTENPOST');"
    on P21_GBREKNR. This is getting attached to the HTML input text field, but also to the anchor link which opens the popup. So when you leave the text field, the cursor will be put onto the icon to open the Popup LOV. When you press tab another time, or leave the anchor link by clicking somewhere, the above JavaScript function is fired again. But the problem is that "this" this time points to the anchor and not the text item anymore, so it will return an invalid value when you access this.value in your f_P21_select_kostenposten function, which results in an empty result of your on-demand call.
    I still have to investigate why the above JavaScript code is now added to the anchor link as well. But as a workaround, just change your code to
    onChange="javascript:f_P21_select_kostenposten(this,'P21_KOSTENPOST');"
    which is better anyway, because it only fires when the value really got changed. And because the anchor link can't really be changed, the event will never fire when you leave the anchor. BTW, you can use the Application Search feature to search for all your onBlur events. onChange is much better than onBlur because it doesn't fire unintended as often.
    But if you want to use some of the new APEX 4.0 features, you could use the cascading LOV feature of APEX 4 and avoid all this JavaScript and on-demand code which you have written.
    I have changed your application to take use of it.
    1) I changed the LOV query for P21_KOSTENPOST from
    select '('||replace(lpad(gbreknr, 3, chr(1)),chr(1),' ')||') '|| omschrijving d, kostenpost r
    from   kostenposten
    where  vervallen is null
      and  bedrijf = :P21_BEDRIJF
    order by gbreknr, kostenpost
    to
    select '('||replace(lpad(gbreknr, 3, chr(1)),chr(1),' ')||') '|| omschrijving d, kostenpost r
    from   kostenposten
    where  vervallen is null
      and  bedrijf = :P21_BEDRIJF
      and  (:P21_GBREKNR is null or gbreknr = :P21_GBREKNR)
    order by gbreknr, kostenpost
    as it was in your on-demand process. The query will return all values if P21_GBREKNR is null but will restrict it if the field is filled out.
    2) Set "Cascading LOV Parent Item(s)" to P21_BEDRIJF,P21_GBREKNR Because based on your LOV query I assume "Kostenpost" should be refreshed if one of the two fields gets changed.
    3) Set "Optimize Refresh" to No because one of your parent items (P21_GBREKNR) can contain a null value.
    4) Removed the onBlur/onChange from P21_GBREKNR
    5) Removed the *%null%* from the "Null Return Value" attribute of P21_BEDRIJF, because that will cause an ORA-1722 if someone picks "-- kies: --"
    That's it.
    Have a look at dynamic actions and the "Set Value" action as well, because I think that could remove your other on-demand calls as well without having to write any JavaScript code.
    I will still investigate what's going on with the anchor link in APEX 4.0
    Regards
    Patrick
    My Blog: http://www.inside-oracle-apex.com
    APEX 4.0 Plug-Ins: http://apex.oracle.com/plugins

  • What is the best practice for dealing with process.getErrorStream()?

    I've been playing around creating Process objects with ProcessBuilder. I can use getErrorStream() and getOutputStream() to read the output from the process, but it seems I have to do this on another thread. If I simply call process.waitFor() and then try to read the streams, that doesn't work. So I do something like:
    final InputStream errorStream = process.getErrorStream();
    final StringWriter errWriter = new StringWriter();
    ExecutorService executorService = Executors.newCachedThreadPool();
    executorService.execute(
        new Runnable() {
            public void run() {
                try {
                    IOUtils.copy(errorStream, errWriter, "UTF-8");
                } catch (IOException e) {
                    getLog().error(e.getMessage(), e);
                }
            }
        });
    int exitValue = process.waitFor();
    getLog().info("exitValue = " + exitValue);
    getLog().info("errString =\n" + errWriter);
    This works, but it seems rather inelegant somehow.
    The basic problem is that the Runnable never completes on its own. Through experimentation, I believe that when the process is actually done, errorStream is never closed, or never gets an end-of-file. My current code works because when it goes to read errWriter it just reads what is currently in the buffer. However, if I wanted to clean things up and use executorService.submit() to submit a Callable and get back a Future, then a lot more code is needed because "IOUtils.copy(errorStream, errWriter, "UTF-8");" never terminates.
    Am I misunderstanding something, or is process.getErrorStream() just a crappy API?
    What do other people do when they want to get the error and output results from running a process?
    Edited by: Eric Kolotyluk on Aug 16, 2012 5:26 PM

    OK, I found a better solution.
    Future<String> errString = executorService.submit(
        new Callable<String>() {
            public String call() throws Exception {
                StringWriter errWriter = new StringWriter();
                IOUtil.copy(process.getErrorStream(), errWriter, "UTF-8");
                return errWriter.toString();
            }
        });
    int exitValue = process.waitFor();
    getLog().info("exitValue = " + exitValue);
    try {
        getLog().info("errString =\n" + errString.get());
    } catch (ExecutionException e) {
        throw new MojoExecutionException("proxygen: ExecutionException");
    }
    The problem I was having before seemed to be that the call to Apache's IOUtil.copy(errorStream, errWriter, "UTF-8"); was not working right; it did not seem to be terminating on EOS. But now it seems to be working fine, so I must have been chasing some other problem (or non-problem).
    So, it does seem the best thing to do is read the error and output streams from the process on their own daemon threads, and then call process.waitFor(). The ExecutorService API makes this easy, and using a Callable to return a future value does the right thing. Also, Callable is a little nicer as the call method can throw an Exception, so my code does not need to worry about that (and the readability is better).
    Thanks for helping to clarify my thoughts and finding a good solution :-)
    Now, it would be really nice if the Process API had a method like process.getFutureErrorString() which does what my code does.
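    A rough, untested sketch of such a helper might look like the following. The class and the getFutureErrorString name are made up here; only ExecutorService, Callable, Future and Commons IO's IOUtils.copy are real APIs.
    import java.io.IOException;
    import java.io.StringWriter;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import org.apache.commons.io.IOUtils;

    public final class ProcessStreams {

        private static final ExecutorService POOL = Executors.newCachedThreadPool();

        // Starts draining the process's error stream on a background thread right away,
        // so the pipe cannot fill up and block the child; call get() after waitFor().
        public static Future<String> getFutureErrorString(final Process process) {
            return POOL.submit(new Callable<String>() {
                public String call() throws IOException {
                    StringWriter errWriter = new StringWriter();
                    IOUtils.copy(process.getErrorStream(), errWriter, "UTF-8");
                    return errWriter.toString();
                }
            });
        }
    }
    Usage would simply be Future<String> err = ProcessStreams.getFutureErrorString(process); before process.waitFor(), and err.get() afterwards.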
    Cheers, Eric

  • Best practice dealing with mixed footage types

    I will soon embark on my first proper FCP project. The film will be shot at the end of the week on a Sony PMW-EX1 in 1080p25. We already shot some footage a few weeks ago, alas this was mistakenly shot interlaced 1080i50. What would be the best way to go about combining these two sets of footage? Should I de-interlace the 1080i50 stuff first or just drop it all into the timeline and allow FCP to deal with it?


  • Best Practice Solution with 2 Internet Connections

    Good day everyone,
    We used to have single ADSL connections at our clients that provide the internet connection for the network. We have recently partnered up with a Fiber provider and are slowly rolling out Fiber connections at our clients. We are also offering redundancy with the ADSL connection as a backup for the Fiber connection.
    I would like to know if anyone has suggestions as to what would be the best practice configuration on the various Windows Server platforms to make this possible. The idea is that if the Fiber connection fails (for whatever reason), the ADSL connection takes over. This must be an automated process.
    I am open for any form of suggestions.
    Thanks for your time.
    Rudi

    Hi,
    Our current solution is to have two NICs and have both connect to the server. They then have 2 different IPs, and we have DHCP give out the IP as the gateway. The only problem with that is that we cannot control the automated change of the gateway IP if the main connection fails.
    We are also willing to look into other hardware solutions that could control this.
    Regards,
    Rudi
