Best Practice regarding using and implementing the pref.txt file

Hi All,
I would like to start a post on what is best practice in using and implementing the pref.txt file. We have reached the stage where we are about to go live with Discoverer Viewer, and I am interested to know what others have encountered or done with their pref.txt file and the Viewer look and feel.
If any of you have been able to add additional lines to the file, please share ;-)
Look forward to your replies.
Lance

Hi Lance
Wow, what a question and the simple answer is - it depends. It depends on whether you want to do the query predictor, whether you want to increase the timeouts for users and lists of values, whether you want to have the Plus available items and Selected items panes displayed by default, and so on.
Typically, most organizations go with the defaults with the exception that you might want to consider turning off the query predictor. That predictor is usually a pain in the neck and most companies turn it off, thus increasing query performance.
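If you end up scripting that change rather than editing by hand, here is a rough sketch in Python. It assumes the preference sits in pref.txt as a plain "Name = value" line; QPPEnable is the predictor switch on the installs I have seen, but verify the exact name against your own file, and remember the change only takes effect once you apply preferences and restart the relevant services:

```python
import re
from pathlib import Path

def set_preference(pref_file, name, value):
    """Rewrite a 'Name = value' entry in pref.txt; return True if found."""
    path = Path(pref_file)
    lines = path.read_text().splitlines(keepends=True)
    # Match the entry name at the start of a line, keeping its "Name = " prefix.
    pattern = re.compile(r"^(\s*" + re.escape(name) + r"\s*=\s*)\S+")
    changed = False
    for i, line in enumerate(lines):
        match = pattern.match(line)
        if match:
            lines[i] = match.group(1) + str(value) + "\n"
            changed = True
    if changed:
        path.write_text("".join(lines))
    return changed

# Turn the query predictor off:
# set_preference("pref.txt", "QPPEnable", 0)
```

Test it against a copy of your pref.txt first; the function leaves every other line untouched.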
Do you have a copy of my Discoverer 10g Handbook? If so, take a look at pages 785 to 799 where I discuss in detail all of the preferences and their impact.
I hope this helps
Best wishes
Michael Armstrong-Smith
URL: http://learndiscoverer.com
Blog: http://learndiscoverer.blogspot.com

Similar Messages

  • Best Practice on using and refreshing the Data Provider

I have a 'users' page that lists all the users in a table - let's call it the master page. One can click on the first column of the master page and it takes them to the 'detail' page, where one can view and update the user detail.
    Master and detail use two different data providers based on two different CachedRowSets.
Master CachedRowSet (Session scope): SELECT * FROM Users
Detail CachedRowSet (Session scope): SELECT * FROM Users WHERE User_ID=?
I want the master to be updated whenever the detail page is updated. There are various options to choose from:
    1. I could call masterDataProvider.refresh() after I call the detailDataProvider.commitChanges() - which is called on the save button on the detail page. The problem with this approach is that the master page will not be refreshed across all user sessions, but only for the one saving the detail page.
2. I could call masterDataProvider.refresh() on the preRender() event of the master page. The problem with this approach is that refresh() will be called every single time someone views the master page. Furthermore, if someone goes to the next page (using the built-in pagination on the table on the master page), clicks on a user to view its detail and then closes the detail page, it does not keep track of the pagination (what page the user was on when he/she clicked on a record to view its detail).
I can find a workaround to resolve this problem, but I think this should be a fairly common usage (two-page CRUD with master-detail). If we can discuss and document some best practices for doing this, it will help all the developers.
    Discussion:
1. What is the best practice for setting the scope of the Data Providers and CachedRowSet? I noticed that in the tutorial examples, they used page/request scope for the Data Provider but session scope for the associated CachedRowSet.
2. What is the best practice for refreshing the master data provider when a record/row is updated in the detail page?
3. How do we keep track of pagination (what page the user was on when he/she clicked on the first column in the master page table), so that upon updating the detail page we can provide the user with a 'Close' button to take them back to whatever page number he/she was on?
    Thanks
    Message was edited by:
    Sabir

    Thanks. I think this is a useful information for all. Do we even need two data providers and associated row sets? Can't we just use TableRowDataProvider, like this:
    TableRowDataProvider rowData=(TableRowDataProvider)getBean("currentRow");If so, I am trying to figure out how to pass this from master to detail page. Essentially the detail page uses a a row from master data provider. Then I need user to be able to change the detail (row) and save changes (in table). This is a fairly common issue in most data driven web apps. I need to design it right, vs just coding.

  • Best Practices regarding AIA and CDP extensions

    Based on the guide "AD CS Step by Step Guide: Two Tier PKI Hierarchy Deployment", I'll have both
    internal and external users (with a CDP in the DMZ) so I have a few questions regarding the configuration of AIA/CDP.
    From here: http://technet.microsoft.com/en-us/library/cc780454(v=ws.10).aspx
A root CA certificate should have an empty CRL distribution point because the CRL distribution point is defined by the certificate issuer. Since the root's certificate issuer is the root CA, there is no value in including a CRL distribution point for the root CA. In addition, some applications may detect an invalid certificate chain if the root certificate has a CRL distribution point extension set.
    To have an empty CDP do I have to add these lines to the CAPolicy.inf of the Offline Root CA:
    [CRLDistributionPoint]
    Empty = true
    What about the AIA? Should it be empty for the root CA?
    Using only HTTP CDPs seems to be the best practice, but what about the AIA? Should I only use HTTP?
    Since I'll be using only HTTP CDPs, should I use LDAP Publishing? What is the benefit of using it and what is the best practice regarding this?
If I don't want to use LDAP Publishing, should I omit the commands: certutil -f -dspublish "A:\CA01_Fabrikam Root CA.crt" RootCA / certutil -f -dspublish "A:\Fabrikam Root CA.crl" CA01
    Thank you,

Is there any reason why you specified a '2' for the HTTP CDP ("2:http://pki.fabrikam.com/CertEnroll/%1_%3%4.crt")? This will be my only CDP/AIA extension, so isn't it supposed to be '1' in priority?
    I tested the setup of the offline Root CA but after the installation, the AIA/CDP Extensions were already pre-populated with the default URLs. I removed all of them:
    The Root Certificate and CRL were already created after ADCS installation in C:\Windows\System32\CertSrv\CertEnroll\ with the default naming convention including the server name (%1_%3%4.crt).
I guess I could rename it without impact? If someday I have to revoke the Root CA certificate or the certificate has expired, how will I update the Root CRL since I have no CDP?
    Based on this guide: http://social.technet.microsoft.com/wiki/contents/articles/15037.ad-cs-step-by-step-guide-two-tier-pki-hierarchy-deployment.aspx,
the Root certificate and CRL are published in Active Directory:
    certutil -f -dspublish "A:\CA01_Fabrikam Root CA.crt" RootCA
    certutil -f -dspublish "A:\Fabrikam Root CA.crl" CA01
    Is it really necessary to publish the Root CRL in my case?
    Instead of using dspublish, isn't it better to deploy the certificates (Root/Intermediate) through GPO, like in the Default Domain Policy?

  • Any "Best Practice" regarding use of zfs in LDOM with zones

    I have 3 different networks and I want to create a guest-domain for each of the three networks on the same control domain.
    Inside each guest-domain, I want to create 3 zones.
    To make it easy to handle growth and also make the zones more portable, I want to create a zpool inside each guest domain and then a zfs for each zoneroot.
    By doing this I will be able to handle growth by adding vdisks to the zpool(in the guest domain) and also to migrate individual zones by using zfs send/receive.
    In the "LDoms Community Cookbook", I found a description on how to use zfs clone in the control domain to decrease deploy time of new guest domains:
    " You can use ZFS to very efficiently, easily and quickly, take a copy of a previously prepared "golden" boot disk for one domain and redeploy multiple copies of that image as a pre-installed boot disk for other domains."
    I can see clear advantages in using zfs in both the control domain and the guest domain, but what is the downside?
I end up with a kind of nested zfs, where I create a zpool inside a zpool: the first in the control domain and the second inside a guest domain.
    How is zfs caching handled, will I end up with a solution with performance problems and a lot of I/O overhead?
    Kindest,
    Tor

I'm not familiar with the Sybase agent code and you are correct, only 15.0.3 seems to be supported. I think we'd need a little more debug information to determine if there is a workaround. Maybe switching on *.info messages in syslogd.conf would get some more useful hints (no guarantee).
    Unfortunately, I can't comment on if, or when, Sybase 15.5.x might be supported.
    Regards,
    Tim
    ---

  • Best practice to use Tortoise SVN with LV

Can anyone recommend what is best practice for using and structuring a project with TSVN and LV? I have seen the JKI tool and have also read about some linkage issues when using TSVN with LV, as posted on the forum here. I suppose these linkage issues still exist? Other than Perforce, is there any suggestion for source control that integrates well with LV?
    TIA
    CLD,CTD

We use Tortoise SVN with LV and it works very well. It's not integrated, in that I cannot check stuff in and out from within LV; I have to do that in Explorer. That's not a problem to me.
    SVN is a very good source and version control regardless.
One small issue with external handling is if you want to change an already used and active filename. In LV you can save to another filename and references will update, but of course SVN doesn't pick up on that automagically. There are two solutions to this:
    1. When you check in, you'll get 1 added and 1 deleted file, select both, r-click and "Repair move".
    2. After changing the filename in LV, change it back in explorer and r-click the file for a SVN rename and rename it to the new name.
    /Y
    LabVIEW 8.2 - 2014
    "Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
    G# - Free award winning reference based OOP for LV

  • What is the best practice for using the Calendar control with the Dispatcher?

    It seems as if the Dispatcher is restricting access to the Query Builder (/bin/querybuilder.json) as a best practice regarding security.  However, the Calendar relies on this endpoint to build the events for the calendar.  On Author / Publish this works fine but once we place the Dispatcher in front, the Calendar no longer works.  We've noticed the same behavior on the Geometrixx site.
    What is the best practice for using the Calendar control with Dispatcher?
    Thanks in advance.
    Scott

    Not sure what exactly you are asking but Muse handles the different orientations nicely without having to do anything.
    Example: http://www.cariboowoodshop.com/wood-shop.html

  • SAP RAR - Best Practice ECC,CRM and BW systems

    Hi All
I have the requirement to configure RAR for the ECC, CRM and BW systems. Each system has only one client. What is best practice regarding using the rules against each system? I am assuming the rules will be the same irrespective of the system, but when I look at the names of the initial files, they are system specific. Can anybody elaborate on this? Thanks
    Regards
    Prasad

    Prasad,
To build on Chinmaya's explanation, make sure you use a logical system for CRM, BI, and ECC for the basis portion of the rule set (and only the basis portion). This will keep you from duplicating your rules to meet your basis requirements. The other rules should be attributed to the individual systems (or additional logical systems if you are including multiple landscapes, e.g. Dev, QA, and Prod ECC merged into one ECC logical system).

  • Correct values for DefaultExportPath option in Disco pref.txt file

    Hi All,
I am trying to edit the "DefaultExportPath" parameter in the Disco pref.txt file. We are using Discoverer Plus and Viewer on CP4 with JVM 1.5 or higher.
    I referred to Metalink Doc Id: 365245.1 and Oracle Configuration Guide doc B13918_03.
    I am interested in making the local system desktop as the default export location for all the users with no exceptions.
    Tried to edit the parameter values to
    DefaultExportPath = "C:\Documents and Settings\<Windows user name>\Desktop"
    but it didn't work.
Also came across Metalink Doc Id: 438598.1, which talks about changing each individual system's "deployment.properties" file, which I don't want to get into. It's too much overhead.
    I am looking for some pointer so that we can edit the pref.txt file to change the default location to the desktop universally across all the users.
    Any help is appreciated.
    Thanks.

    Hi Rod,
    Thx for the update.
Just one more dumb question: is there an automated way of changing these properties? Physically changing them would be very tough, since quite a few of our users work remotely.
    Thanks.
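Not authoritative, but one common approach for remote users is to push the change from a login script rather than touching each machine by hand. A minimal Python sketch of the file edit itself follows; the key name and path in the example are made-up placeholders, and note that real deployment.properties files escape special characters (backslashes, colons), so test against one profile first:

```python
from pathlib import Path

def set_property(prop_file, key, value):
    """Set key=value in a simple properties file, appending the key if absent.
    Creates the file if it does not exist yet."""
    path = Path(prop_file)
    lines = path.read_text().splitlines() if path.exists() else []
    prefix = key + "="
    out, found = [], False
    for line in lines:
        if line.startswith(prefix):
            out.append(prefix + value)  # overwrite the existing entry
            found = True
        else:
            out.append(line)
    if not found:
        out.append(prefix + value)
    path.write_text("\n".join(out) + "\n")

# Hypothetical example - the key and path are placeholders, not real settings:
# set_property(r"C:\Users\jdoe\deployment.properties",
#              "deployment.some.setting", "somevalue")
```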

  • 3i PREF.txt file

The IP address of the machine that 3i is installed on is supposed to be placed in the PREF.txt file. Now, does it matter if the machine has two IP addresses (two network cards)? I guess what I am trying to ask is: does it matter which address you use in the PREF.txt file? Thanks!

I have never seen this with a Disco server. Why don't you try it both ways: once with a single IP address and once with both. I would be interested to hear your results.
    Christopher

  • What are the best practices for using the enhancement framework?

    Hello enhancement framework experts,
    Recently, my company upgraded to SAP NW 7.1 EhP6.  This presents us with the capability to use the enhancement framework.
    A couple of senior programmers were asked to deliver a guideline for use of the framework.  They published the following statement:
    "SAP does not guarantee the validity of the enhancement points in future releases/versions. As a result, any implemented enhancement points may require significant work during upgrades. So, enhancement points should essentially be used as an alternative to core modifications, which is a rare scenario.".
    I am looking for confirmation or contradiction to the statement  "SAP does not guarantee the validity of enhancement points in future releases/versions..." .  Is this a true statement for both implicit and explicit enhancement points?
Is the impact of activated explicit and implicit enhancements on an SAP upgrade much greater than that of BAdIs and user exits?
Are there any SAP-published guidelines/best practices for use of the enhancement framework?
    Thank you,
    Kimberly
    Edited by: Kimberly Carmack on Aug 11, 2011 5:31 PM

    Found an article that answers this question quite well:
    [How to Get the Most From the Enhancement and Switch Framework as a Customer or Partner - Tips from the Experts|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c0f0373e-a915-2e10-6e88-d4de0c725ab3]
    Thank you Thomas Weiss!

  • Best practices for using the knowledge directory

    Anyone know when it is best to store docs in the Knowledge Directory versus Collab? They are both searchable, but I guess you can publish from the Publisher to the KD. Anyone have any best practices for using the KD or setting up taxonomies in the KD?


  • Best practices for using the 'cost details' fields

    Hi
Please could you advise us on best practice for using the 'cost details' field within Pricing. Currently I cannot find a way to surface the individual Cost Details fields within the Next Generation UI, even with the tick box for 'display both cost and price' ticked. It seems that these get surfaced when the Next Generation UI is turned off, but I cannot find them when it is turned on. We can see the 'Pricing Summary' field but this does not fulfill our needs, as some of our services have both recurring and one-off costs.
    Attached are some screenshots to further explain the situation.
    Many thanks,
    Richard Thornton

    Hi Richard,
    If you need to configure dynamic pricing that may vary by tenant and/or if you want to set up cost drivers that are service item attributes, you should configure Billing Tables in the Demand Management module in 10.0. 
The cost detail functionality in 9.4 will likely be merged with the new pricing feature in 10.0.  The current plan is not to bring cost detail into the Service Catalog module.

  • Does anyone know the best practices to use Captivates on an Elearning course, please...

I need to know the best practices for using Captivate in an eLearning course, such as how much information it should contain, etc.

    Hello There,
Adobe Captivate has multiple workflows which can help you create eLearning courses. It can create various types of learning content, and I suggest you visit the following links.
    Product Info: www.adobe.com/products/captivate/
    OnDemand Seminars to get more info on what captivate can do: http://www.adobe.com/cfusion/event/index.cfm?event=list&type=ondemand_seminar&loc=en_us
Register for Trainings and Webinars: http://www.adobe.com/cfusion/event/index.cfm?event=list&loc=en_us&type=&product=Captivate&interest=&audience=&monthyear=
    If you have specific scenarios to discuss, you can mail me at [email protected] or tweet me at @vish_adobe
    Thanks,
    Vish
    @vish_adobe

  • I am using Firefox 11 and I am unable to print web pages. I can print PDFs ok. I tried to open the prefs.js file to delete anything that starts with print and I get an error message.

I cannot print web pages. I've tried the solutions suggested, like deleting anything that starts with print in the prefs.js file, but I get an error message: Script: c:\users\rick\appdata\roaming\mozilla\firefox\profiles\iviwh5c1.default\prefs.js
    line: 1
    Char: 1
    Error: invalid character
    code: 800A03F6
    Source: Microsoft JScript compilation error

When editing '''prefs.js''' you need to use a simple text editor program. WinXP Notepad messes up the line formatting of prefs.js, so I use WordPad - but you need to save the edited file in a text-only format (WordPad asks about that when you go to save the edited file). I never used Win7, so I don't know if that version of Notepad messes with the line formatting of prefs.js or not.
Another thing - make sure '''''"Hide extensions for known file types"''''' is de-selected in Windows file / folder options > view -> advanced settings. Otherwise Windows may add a .txt file extension that you won't be able to see, and that will break that file in Firefox. '''prefs.js.txt''' won't be recognized by Firefox when it is expecting to find '''''prefs.js'''''.
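If hand-editing keeps tripping over the editor or hidden extensions, a small script can do the same cleanup. Here is a sketch in Python; it assumes your print settings all sit on lines of the form user_pref("print..., which is how they look in my prefs.js (verify in yours). Run it with Firefox fully closed; it writes a .bak copy first:

```python
import shutil

def strip_print_prefs(prefs_path):
    """Remove user_pref lines whose preference name starts with "print".
    A backup copy is written next to the original first."""
    shutil.copy2(prefs_path, prefs_path + ".bak")
    with open(prefs_path, "r", encoding="utf-8", errors="replace") as f:
        lines = f.readlines()
    # Keep every line except the print.* preferences.
    kept = [ln for ln in lines
            if not ln.lstrip().startswith('user_pref("print')]
    with open(prefs_path, "w", encoding="utf-8") as f:
        f.writelines(kept)
    return len(lines) - len(kept)  # how many lines were removed
```

If something goes wrong, restoring the .bak copy (or just deleting prefs.js) puts you back where you started, since Firefox rebuilds missing prefs with defaults.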

  • I have been trying to download and install a trial. What is the best browser to use and how do i do it?

I have been trying to download and install a trial. What is the best browser to use and how do I do it? The setup and everything is on the desktop, but it never seems to load anything.

    what file (name and extension) is on your desktop and what os?
    what have you done to install the desktop file and what do you see when you do that?

Maybe you are looking for

  • How to find out the max/min value of one field corresponding to a second field in HANA through graphical way.

    Hi, I am trying to find out the latest delivery date(EINDT)  for each purchasing document (EBELN) through graphical way. The view contains other fields apart from the above mentioned two fields. When only the two fields (EBELN, EINDT) are there, then

  • Parallel Insert statements (bulk) to the same table

Hi, I need to insert a set of insert statements (50000+) into a table and want to split them into two halves and run them in two SQL worksheets. Note that both will try to insert into the same table. Please let me know whether it is advisable or tell me

  • Set deletion flag in the maintenance order

    Hi all, Am not able to set the deletion flag from the maintenance order, the :set: button is not active. can anyone help me in this regards? Or is there any tcode for setting deletion flag to maintenance orders? Thanks Anish

  • Configuring Shutdown without logon, using command line or script

I've configured the shutdown-without-logon capability in Win2012/R2 using GPEDIT.MSC. How can I do that by command line? In the past I could use the reg key ShutdownWithoutLogon, but I've configured it in the GUI and nothing happened in the registry

  • AR - Autoinvoice error

Hi. Sorry, my English is poor.. I have a problem with: Module AR. Autoinvoice Master Program is launched from the Applications. Parameters: 1, 1094, KOAS_TEST, 2011/10/01 00:00:00, , , , , , , , , , , , , , , , , , , , , Y, , 84 Output of the log file