Exception Propagation - Best Practices

Hello,
I was wondering what the best practice for exception propagation is.
The way I know and have been doing it is to specify an error page in my web.xml; in that page I read an attribute from the session (which I populate in the catch blocks across the various classes) and display it to the user, asking him to contact the admin. Of course I log the exceptions using log4j.
I was wondering if there are other ways people do this (other than just displaying a "Sorry, Application Error" page), and what you think the best practice of exception handling and, more importantly, exception propagation should be.
Thanks in advance for your time
rgds,

Sarvananda wrote:
...what do you think should be the best practice of exception handling and more importantly exception propagation.
The very best practice is to always handle the exception; that is to say, never use empty catch blocks.
As already stated, there are many correct ways to handle exceptions, depending largely on the result you want for a given exception. If you want feedback for debugging: I once made the errors descriptive, with the class/method and exception/error included in the message to the end user. This almost never works, since users never read it, and if they report it, they just say: "I got this error thingy and it said to call you..." I got smarter the second time around and put the errors in logs, so when they actually called, I could have them look up the error for me or, even better, just send me the log so I could see any other problems they didn't bother to report.
It sounds like you are doing web development. One thing I have done in the past is to pop up an e-mail, ready to go with all the info in it; all the end user had to do was hit Send.
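For illustration, here is a minimal sketch of the pattern described above, done as a servlet filter. The class name, the error.jsp target, and the errorId attribute are hypothetical, not from the posts; the idea is just to centralize the catch, log the full trace with log4j, and give the user only a reference id to quote when contacting the admin.

import java.io.IOException;
import java.util.UUID;
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.log4j.Logger;

public class ErrorFilter implements Filter {
    private static final Logger log = Logger.getLogger(ErrorFilter.class);

    public void init(FilterConfig config) { }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        try {
            chain.doFilter(req, res);
        } catch (Exception e) {
            // Log the full detail for the admin; show the user only a reference id.
            String errorId = UUID.randomUUID().toString();
            log.error("Unhandled exception, errorId=" + errorId, e);
            HttpServletRequest httpReq = (HttpServletRequest) req;
            httpReq.getSession().setAttribute("errorId", errorId);
            ((HttpServletResponse) res).sendRedirect(httpReq.getContextPath() + "/error.jsp");
        }
    }

    public void destroy() { }
}

Registered against /* in web.xml (alongside or instead of an error-page element), a filter like this also catches anything you forgot to handle in individual classes, so no raw stack trace reaches the container's default error page.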

Similar Messages

  • Best Practices on SMART scans

    For Exadata X2-2, is there a best practices document on enabling smart scans for all the application code?

    We cover more in our book, but here are the key points:
    1) Smart scans require a full segment scan (full table scan, fast full index scan, or fast full bitmap index scan).
    2) Additionally, smart scans require a direct path read (reads go directly to the PGA, bypassing the buffer cache). This happens automatically for all parallel scans (unless parallel_degree_policy has been changed to AUTO). For serial sessions, the decision to do a serial direct path read depends on the segment size, the _small_table_threshold parameter value (which is derived from the buffer cache size), and how many blocks of the segment are already cached. If you want to force serial direct path reads for your serial sessions, you can set _serial_direct_read = always (a quick test sketch follows at the end of this reply).
    3) Because of the above requirements, smart scans are not used for index range scans, index unique scans, or any single-row/single-block lookups. So if you are migrating an old DW/reporting application to Exadata, you probably want to get rid of all the old hints and hacks in there, as you don't care that much about indexes for DW/reporting anymore (in some cases not at all). Note that OLTP databases still absolutely require indexes as usual - smart scans are for large bulk processing operations (reporting, analytics, etc.), not OLTP-style lookups of one or a few rows.
    Ideal execution plan for taking advantage of smart scans for reporting would be:
    1) accessing only required partitions thanks to partition pruning (partitioning key column choices must come from how the application code will query the data)
    2) full scan the partitions (which allows smart scans to kick in)
    2.1) no index range scans (single block reads!) and ...
    3) joining all the data with hash joins, propagating results up the plan tree to the next hash join, and so on
    3.1) This allows bloom filter predicate pushdown to the cells, to pre-filter rows fetched from the probe row source of the hash join.
    So, simple stuff really - and many of your everyday optimizer problems just disappear when there's no trouble deciding between a full scan and a nested loop with some index. Of course this is a broad generalization; your mileage may vary.
    Even though DWs and reporting apps benefit greatly from smart scans and some well-partitioned databases don't need any indexes at all for reporting workloads, the design advice does not change for OLTP at all. It's just RAC with faster single block reads thanks to flash cache. All your OLTP workloads, ERP databases etc still need all their indexes as before Exadata (with the exception of any special indexes which were created for speeding up only some reports, which can take better advantage of smart scans now).
    Note that many DW databases out there are used not only for brute-force reporting and analytics but also for frequent single-row lookups (golden trade warehouses or other reference data being examples). These would likely still need indexes to support fast lookups of single rows (or a few). So it all comes down to the nature of your workload: how many rows you're fetching and how frequently you'll be doing it.
    And note that smart scans only make data access faster - not sorts, joins, PL/SQL functions coded into the select column list or where clause, or application loops doing single-row processing... These still work as usual (with the exception of the bloom filter pushdown optimization for hash joins)... Of course, when moving to Exadata from your old E25k you'll see a speedup anyway, as the Xeons with their large caches are just fast :-)
    Tanel Poder
    Blog - http://blog.tanelpoder.com
    Book - http://apress.com/book/view/9781430233923
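    To make point 2 above concrete, here is a hedged JDBC sketch (the connection string, credentials, and the sales table are placeholders; _serial_direct_read is an undocumented parameter, so treat this as test-only): it forces serial direct path reads for the session, runs a full scan, and then checks the offload statistic to verify that a smart scan took place.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SmartScanCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for an Exadata database.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//exadata-scan:1521/dw", "scott", "tiger");
            Statement stmt = conn.createStatement();

            // Force serial direct path reads (undocumented parameter - test use only).
            stmt.execute("alter session set \"_serial_direct_read\" = always");

            // Run the full segment scan whose offload behaviour we want to verify.
            ResultSet rs = stmt.executeQuery("select /*+ full(s) */ count(*) from sales s");
            if (rs.next()) {
                System.out.println("rows: " + rs.getLong(1));
            }
            rs.close();

            // Non-zero offload-eligible bytes indicate the scan took the smart scan path.
            rs = stmt.executeQuery(
                    "select s.value from v$mystat s " +
                    "join v$statname n on n.statistic# = s.statistic# " +
                    "where n.name = 'cell physical IO bytes eligible for predicate offload'");
            if (rs.next()) {
                System.out.println("offload-eligible bytes: " + rs.getLong(1));
            }
            conn.close();
        }
    }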

  • Request for HOWTO - error processing best practice

    Hi JDev Team. Something I would like to see in a future HOWTO is error handling in a BC4J/JSP application. What is best practice? How do we make sure that when a database error occurs, we can trap it and provide a friendly error message, or failing that, at least ensure the standard error is usable by a maintenance programmer? For example, the following error occurs if a referential constraint restricts the delete:
    javax.servlet.jsp.JspException: JBO-26041: Failed to post data to database during "Delete": SQL Statement " DELETE FROM TECHTRANSFER.TTSITES Sites WHERE SITEID=:1".
    In fact, the same error message is displayed for almost any database error - the programmer can't fix the problem when he has no idea what it is!! (The same goes for update and insert.)
    I wasn't going to request this until I had read all of the help available on error processing, but the way this project is going I won't get time. If you think it is adequately covered in the help, then fine, just let me know where.
    Thanks,
    Simon

    You can enclose your BC4J/JSP code in a try/catch block. That way, if a failure occurs, you can trap it, display a friendly error, and do whatever you want with the exception.
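    To illustrate, here is a minimal hedged sketch: JboException, getDetails(), findViewObject, and findByKey are the real oracle.jbo API, but the view object name, key handling, and messages are made up for this example.

    import oracle.jbo.ApplicationModule;
    import oracle.jbo.JboException;
    import oracle.jbo.Key;
    import oracle.jbo.Row;
    import oracle.jbo.ViewObject;

    public class SiteDeleteHelper {
        // Attempts the delete and returns a message for the end user, while
        // surfacing the nested cause (e.g. the ORA-02292 referential constraint
        // error hidden behind JBO-26041) for the maintenance programmer.
        public static String deleteSite(ApplicationModule am, Key siteKey) {
            try {
                ViewObject vo = am.findViewObject("SitesView"); // illustrative name
                Row[] rows = vo.findByKey(siteKey, 1);
                if (rows.length > 0) {
                    rows[0].remove();
                }
                am.getTransaction().commit();
                return "Site deleted.";
            } catch (JboException e) {
                am.getTransaction().rollback();
                StringBuffer detail = new StringBuffer(e.getMessage());
                Object[] nested = e.getDetails(); // nested exceptions, may be null
                for (int i = 0; nested != null && i < nested.length; i++) {
                    detail.append(" | ").append(((Throwable) nested[i]).getMessage());
                }
                // Log or e-mail the detail; show the user something friendlier.
                return "Delete failed: " + detail;
            }
        }
    }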
    What I have been doing for development purposes is sending a modified errorpage.jsp via email. Here is what gets emailed to me (*'s mask potentially sensitive data) and displayed on the screen (I'm eventually going to replace all the displayed garbage with something friendly):
    An error occured in application PDC User Administration
    User Session Properties:
    Sesion ID: *********
    App ID: *********
    User Name: *********
    User ID: *********
    Priv Role: *********
    Password: *********
    Org No: *********
    First Name: skunitzer
    Last Name: ANALYST
    App Title : PDC User Administration
    Current Url: insertNewUser.jsp
    Specific error is javax.servlet.jsp.JspException: JBO-25013: Too many objects match the primary key oracle.jbo.Key[1423 ].
    Parameters:
    LastName
    Kunitzer
    EmailAddress
    [email protected]
    FirstName
    SteveLiveTest
    OrgNo
    PhoneWorkNo
    I have no phone #
    ExpireDate
    2001-04-26
    ExpireDateString
    jRQiIsFGANIbrGlihGTl[epofZmSNgEkGqbHN@iErHNPRi
    UserID
    UserPrivs
    Exception:
    javax.servlet.jsp.JspException: JBO-25013: Too many objects match the primary key oracle.jbo.Key[1423 ].
    Message:
    JBO-25013: Too many objects match the primary key oracle.jbo.Key[1423 ].
    Localized Message:
    JBO-25013: Too many objects match the primary key oracle.jbo.Key[1423 ].
    Stack Trace:
    javax.servlet.jsp.JspException: JBO-25013: Too many objects match the primary key oracle.jbo.Key[1423 ].
    at java.lang.Throwable.fillInStackTrace(Native Method)
    at java.lang.Throwable.fillInStackTrace(Compiled Code)
    at java.lang.Throwable.<init>(Compiled Code)
    at java.lang.Exception.<init>(Compiled Code)
    ...Stack Trace goes on but I won't bother with it anymore...
    While not always as specific as I would like, I have not had too much trouble hunting down the errors.
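    The emailing itself can be done with the JavaMail API; here is a minimal hedged sketch (the SMTP host and addresses are placeholders, and the report argument would be the assembled page content shown above):

    import java.util.Properties;
    import javax.mail.Message;
    import javax.mail.Session;
    import javax.mail.Transport;
    import javax.mail.internet.InternetAddress;
    import javax.mail.internet.MimeMessage;

    public class ErrorMailer {
        // Sends the assembled error report to the developer; called from the error page.
        public static void send(String report) throws Exception {
            Properties props = new Properties();
            props.put("mail.smtp.host", "mail.example.com"); // placeholder host
            Session session = Session.getInstance(props);

            Message msg = new MimeMessage(session);
            msg.setFrom(new InternetAddress("app@example.com"));
            msg.setRecipient(Message.RecipientType.TO, new InternetAddress("developer@example.com"));
            msg.setSubject("Application error report");
            msg.setText(report);
            Transport.send(msg);
        }
    }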

  • Any best practice for archiving POs which do not have a corresponding invoice

    Hello,
    As part of the initial implementation and conversion, we have a lot of POs/LTAs created whose corresponding invoices were never converted into SAP from the legacy system. The SAP archiving program flags those as not business-complete because the invoice quantity does not match the PO quantity (there are no invoices to start with). Just setting the 'delivery completed' and 'final confirmation' flags on the PO does not help. Has anybody run into a similar situation, and how did you resolve it? I am reluctant to enhance the standard SAP archiving program to bypass those checks; that is my option of last resort. Any SAP-recommended note, best practice, etc. would help.
    Satyajit Deb

    Where was the invoice posted?
    Was the invoice posted in the legacy system?
    Clearing the GR/IR account with MR11 will usually close such POs.

  • When granting a user or a role access to a group of pages, to what type of file or component is it best practice to grant that access?

    My question is the same: when granting a user or a role access in the application, what is the best practice? How do you decide at what level to apply the role - page definitions, XML files, or some other file I have missed?

    As far as I'm concerned, I would go for page definition files.

  • Best practice in SAP BW master data management and transport

    Hi sap bw gurus,
    I would like to know the best practice for SAP BW master data transport. For example, if I update attributes in development, what is the minimum set of BW objects I should transport?
    Appreciate advice.
    Thank you,
    Eric

    Hi Vishnu,
    Thanks for the reply, but that answer would suit a new BW implementation. What I'm looking for is more about daily operational maintenance and transport (a BW system that went live a while ago).
    Regards,
    Eric

  • What is the best practice for providing a text file to a Java class in an OSGi bundle in CQ?

    This is probably a very basic question so please bear with me.
    What is the best way to provide a .txt file to be read by a Java class in an OSGi bundle in CQ 5.5?
    I have been able to read a file called "test.txt", placed in a structure like /src/resources/<any-sub-folder>/test.txt, from my Java class at /src/main/java/com/test/mytest/Test.java using the bundle's getResource and getEntry calls, but I was not able to use context.getDataFile. How is this getDataFile method call meant to be used?
    And what if I want to read a file located in another bundle; is that possible? Or can I add the file to some repository and then access it? I am not clear how to do this.
    I would also like to know the best practice for providing a large data set in a flat file to be read by a Java class in CQ5.
    Please provide detailed steps or point me to a how-to guide or other helpful resources, as I am a novice.
    Thank you in advance for your time and help.
    VS

    As you can read in the OSGi Core specification (section 4.5.2), the getDataFile() method is to read/write a file in the bundle's private persistent area. It cannot be used to read files contained in the bundle. The issue Sham mentions refers to a version of Felix which is not used in CQ.
    The methods you mentioned (getResource and getEntry) are appropriate for reading files contained in a bundle.
    Reading a file from the repository is done using the JCR API. You can see a blueprint for how to do this by looking at the readFile method in http://svn.apache.org/repos/asf/jackrabbit/tags/2.4.0/jackrabbit-jcr-commons/src/main/java/org/apache/jackrabbit/commons/JcrUtils.java. Unfortunately, this method is not currently usable as it was declared incorrectly (it should be a static method, but is an instance method).
    Regards,
    Justin
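    For reference, a minimal sketch of the getEntry approach Justin describes (the folder path mirrors the poster's layout; the class name and charset are illustrative):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import org.osgi.framework.BundleContext;

    public class BundleFileReader {
        public String readTestFile(BundleContext context) throws Exception {
            // getEntry looks only inside this bundle's jar; getResource also
            // searches the bundle classpath, including attached fragments.
            URL url = context.getBundle().getEntry("/any-sub-folder/test.txt");
            if (url == null) {
                throw new IllegalStateException("test.txt not found in bundle");
            }
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(url.openStream(), "UTF-8"));
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append('\n');
            }
            reader.close();
            return sb.toString();
        }
    }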

  • Advice or best practice information about 1 or 2 clients in SAP R/3 DEV

    I'm searching for advice or best practice information about clients in an SAP R/3 development system.
    The reason is that we are about to refresh our SAP R/3 development system, and up to now we have had two clients on it:
    -     One customizing/development client without master data, transaction data et cetera
    -     One local test client with master data, transaction data and so on
    One of our developers suggested having only one client in development, where we could customize, program, and test. That client would then contain master data, transaction data, et cetera.
    What would be your advice, or what would be best practice for the development system: 1 client (with data) or 2 clients (one clean customizing client and one with data)? And what are the most important reasons for that choice?
    Maybe there is already some good (SAP) information about this specific subject, but up to now I haven't found it.

    Maybe I've asked my question too broadly. I'll try to narrow it down.
    Up to now we always had two clients on our SAP R/3 development system:
    - Client 200 - Customizing/development only. No other data in this client
    - Client 400 - Local test client with master data and transaction data. New customizing is copied from client 200 to test
    The reasons for having those two clients are:
    - It somehow feels good to have a customizing-only client
    - We've always done this before
    A developer suggested having only one client in our SAP R/3 development system, for the following reasons:
    - You never need to copy the customizing (transaction SCC1) first in order to test it
    - You can work in one client and don't need to log in to the other client to test things (for example, ABAP reports)
    - For simple customizing settings (for example the product hierarchy, which we don't test every time in client 400), it is possible to forget to copy them into client 400 (the test client). With one client, you cannot forget
    This developer's reasons seem very valid, and up to now we haven't found a convincing/compelling reason on which to base a good choice between one or two clients.
    Please try to convince us with good reasons to choose one or two clients.

  • Experiences with SAP Best Practices packages?

    We are considering applying a specific SAP Best Practices package on top of ECC6. It will be a new ECC6 installation.
    I have read the BP FAQ pages at http://help.sap.com/bp_bw370/html/faq.htm, note 1225909 (How to apply SAP Best Practices), and the specific note and installation guide for the required Best Practices package.
    I understand that applying a BP package is not that easy, but it should speed up the implementation process.
    If you have participated in a project that implemented SAP Best Practices on ERP, please share your experiences with me.
    Here are some topics for discussion:
    (BP = SAP Best Practices package)
    Could a BP implementation constrain your future EHP or SP upgrades, since BP is tightly linked to a certain EHP and SP level?
    SAP support?
    Is the installation process, including the config steps in the BP installation guide, longer than usual, given that there can be multiple activation and configuration steps done by people other than Basis consultants?
    BPs are country-dependent. What if your application is used multinationally?
    What if you later find BP unnecessary?
    Did BP create difficulties in the beginning but considerably speed up reaching the final goal, compared to ECC6 without BP?
    Traps to fall into?
    Typical issues that Basis staff have to watch for when preparing/installing an ERP system with BP?
    Positive/negative things?
    I am not saying the above statements are true or false; I just wanted to prompt you to comment.
    Br: KimZi
    Edited by: Kimmzini Siewoinenski on Aug 11, 2009 8:35 PM

    Hi,
    Make sure your web service URL is correct in the Live Office connection and also in the Xcelsius data manager.
    Did you check all connections in the refresh button properties? You may try selecting "Refresh After Components Are Loaded" in the refresh button properties' Behavior tab.
    I think Xcelsius refreshes are serial, so it may be that the first component's refresh is still in progress while you are expecting another component to refresh.
    Click on "Enable Load Cursor" in the data manager's Usage tab; it will give you visibility of the refresh. If anything is refreshing you will see an hourglass.
    Cheers

  • ECC 5.0 and BO without BW/BI integration best practice

    Hi,
    I know that SAP can connect to a BO system using RapidMarts (replacing the BW/BI data warehouse). Is this best practice? Looking at the BI platform roadmap, I conclude that BI and BO will be integrated into one product, and I'm afraid to invest right now because it seems we will need to implement a BW system in the future to make the BI system optimal.
    My company wants to use BO without implementing a BW system right now. The SAP system we use is ECC 5.0, and my company will only upgrade it to the next version when 5.0 is no longer supported. Thank you.
    Cheers,
    Satria

    Hi Satria,
    the answer depends a little bit on what the requirement is.
    RapidMarts are an option to implement a data mart solution on top of an ERP solution. On top of the Rapid Marts you can then use the BusinessObjects tools without hitting the OLTP system directly.
    Ingo

  • BEST PRACTICE on user deletions HR/SU01

    Hi
    we use CUA/SSO.
    The records are fed from HR and sent to Active Directory (AD).
    AD brings the records back and creates/changes users in SU01.
    A function module populates the CVR (timesheet) parameter depending on whether you are an employee or a contractor.
    Occasionally, our HR department asks the SAP support team to delete records - for example, if the employee or contractor never actually joined the company.
    Until some time ago, deletion was causing problems because:
    a) the record does not get deleted in AD, and there is no way to send the deletion across afterwards
    b) when AD tries to reprocess that specific record, the LDAP connector will not find an HR record for it, so in SU01, for some reason, the Valid From field gets wiped out, and so does the CVR timesheet parameter...
    We have since changed the deletion process; however, I would like to ask what you consider best practice here. HR want to delete the record so it can be reused.
    I cannot delete those records from UMR unless I am 100% sure they have never used the system (I will have to check that).
    I hope I have provided enough info on what the issue is.
    Thank you
    Nadia

    Best practice is not to delete.
    > HR want to delete the record so it can be re-utilised
    So many people with the same name? Perhaps add a suffix of two numbers when the ID naming convention produces a clash. Besides, don't your AD admins want unique names in AD as well?
    E.g. (just an imperfect example)
    MUSTERMA = Alfred MUSTERMan
    MUSTERMM = Manfred MUSTERMan
    MUSTER01 = Mechtilde MUSTERMuller
    > I cannot delete those records from UMR unless I am 100% sure they have never used the system (I will have to check that)
    The surest way is to determine that they have never logged on. But that does not exclude the possibility that records exist for them which may eventually trigger a "user existence check" when read. One such example is the Security Audit Log: there may have been failed login attempts.
    Good luck,
    Julius

  • Transport landscape best practice

    I'm wondering if SAP has a best practice document on transport landscape planning.
    SAP Help has a pretty clear description of a standard three-system landscape, but no document can be found describing complex transport landscape considerations: multiple ABAP development/test systems, and conflict resolution between a project landscape and a maintenance landscape.
    Any feedback is greatly appreciated.

    Hi. Go to http://help.sap.com/bp/initial/index.htm
    There you will find everything about BP.
    Regards, Award if helpful

  • Configure the "best practices baseline" manually

    Dear experts!
    I'm studying SAP ERP as a student. I am interested in SD. I have read the certification material for TSCM and SCM.
    Now I am hoping for your advice on whether I need to configure the "best practices baseline" manually.
    As we see it, configuring the "best practices baseline" manually will take a lot of time.
    However, I was thinking that it would help my general view of ERP, and I could use it to study the "best practices industry" packages once I complete it.
    What do you think?
    Waiting for your advice!
    Thank you!
    Best regards!
    Tang Dark

    Hi,
    Best Practices is a specific SAP solution, and to get it you have to buy it specifically - it doesn't come standard with every ERP solution.
    What is Best Practices? It is basically a set of preconfigured content (BC sets) which you upload into SAP, hence reducing your configuration effort.
    The baseline is basically the general configuration (per module) that a company requires before carrying on with config - things like the org structure, etc.
    On top of the baseline you can then start loading other, more specific BC sets per module.
    Best Practices is a methodology aimed at smaller companies because it reduces the blueprint and configuration phases. Projects can be finalised within 6 months, compared to standard ERP implementations that may take 1 to 2 years.
    If you want to be a proper SD consultant, do not study Best Practices only. Do your SAP academy. Best Practices you can learn after you know SAP.
    As for industry-specific solutions, these are add-ons that address specific industry requirements. For example, bills of services is something used in project environments such as engineering & construction, so that functionality is not necessary in, say, the retail industry, which sells only finished products.
    Again, first learn your SAP, and then you can focus on learning the rest. If you learn SAP you can work with any Best Practices package or any industry solution; if you study Best Practices only, you can work only with BP, and the same goes for industry solutions.

  • Best practice for SAP users who leave the company

    Hi
    Could anyone recommend a best practice document or give advice on how to deal with SAP user IDs when employees/contractors/consultants leave? I am the Basis admin just starting an SAP implementation, and we have no dedicated authorisation team at the moment, so I have been asked to look into this:
    Currently we set the validity date in SU01 to the termination date.
    We check there are no background jobs scheduled under that user ID; if there are, we change the job owner to a valid user (we try to run all background jobs under an admin account).
    We do not delete the user, as from an audit point of view I believe it restricts the information you can report on, and there are implications for change documents etc., so it is best to lock it with validity dates.
    Can anyone advise further?
    We are running SAP ECC 5.0 on Windows 2003 64 Bit/MS SQL 2000.
    Thanks for any help.

    Hi,
    Different people will tell you different versions of what they believe is best practice, but in my opinion you are already doing reasonably well.
    What I prefer is
    1. Lock ID & set validity date.
    2. Assign the user to a user group LEAVER or EXPIRED or something similar (helps with reporting out of SUIM/S_BCE* reports).
    3. Delete role assignment (should you need it, the role assignment will be in the change history docs anyway).
    4. Check background jobs & act accordingly.
    For ease of getting info I prefer not to delete the ID, though plenty of people do.

  • Business Content Best Practice

    Hi Guys
    Just a quick request -
    I have activated BC a couple of times, but each time it takes longer than it should - I miss certain areas, activate far too much, etc.
    Apart from help.sap.com, does anyone have any best practice guides or docs on BC that would let me cut down my activation effort?
    Thanks.

    Hi Scott,
    I would really recommend activating with the 'only necessary objects' grouping ('in data flow before and afterwards' is very painful). Just collect the data structures like DataSource, InfoSource, DataStore object, InfoCube, ..., queries (then all the InfoObjects get collected). Then collect the linkages (transfer rules, transformations, etc.); these can easily be found in the help menu. After the collection, activate it in batch.
    best regards clemens
