Best practice for the use of reserved words
Hi,
What is the best practice to observe for using reserved words as column names?
For example if I insisted on using the word comment for a column name by doing the following:
CREATE TABLE ...
"COMMENT" VARCHAR2(4000),
What impact down the track could I expect and what problems should I be aware of when doing something like this?
Thank You
Ben
Hi, Ben,
Benton wrote:
Hi,
What is the best practice to observe for using reserved words as column names?
Sybrand is right (as usual): the best practice is not to use them.
For example if I insisted on using the word comment for a column name by doing the following:
CREATE TABLE ...
"COMMENT" VARCHAR2(4000),
What impact down the track could I expect and what problems should I be aware of when doing something like this?
Using reserved words as identifiers is asking for trouble. You can expect to get what you ask for.
Whatever benefits you may get from naming the column COMMENT rather than, say, CMNT or EMP_COMMENT (if the table is called EMP) will be insignificant compared to the extra debugging you will certainly need.
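To see the cost concretely, here is a small runnable sketch. It uses SQLite (via Python's sqlite3 module) rather than Oracle, with "order" standing in for COMMENT as the reserved word; Oracle behaves analogously, and its quoted identifiers are additionally case-sensitive, so "COMMENT" and "Comment" become different names.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "order" is reserved in SQLite (as COMMENT is in Oracle), so the
# DDL only works with the identifier quoted...
conn.execute('CREATE TABLE emp ("order" INTEGER)')
conn.execute('INSERT INTO emp ("order") VALUES (1)')

# ...and every later statement must quote it too:
rows = conn.execute('SELECT "order" FROM emp').fetchall()   # [(1,)]

# Forget the quotes once and the statement fails outright:
unquoted_failed = False
try:
    conn.execute("SELECT order FROM emp")
except sqlite3.OperationalError:
    unquoted_failed = True
```

Every query, view, and report that ever touches the column inherits the quoting requirement; that is the "extra debugging" in practice.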
Similar Messages
-
Best Practice for the Service Distribution on multiple servers
Hi,
Could you please suggest as per the best practice for the above.
Requirements : we will use all features in share point ( Powerpivot, Search, Reporting Service, BCS, Excel, Workflow Manager, App Management etc)
Capacity : We have 12 Servers excluding SQL server.
Please do not just refer any URL, Suggest as per the requirements.
Thanks
srabon
How about a link to the MS guidance!
http://go.microsoft.com/fwlink/p/?LinkId=286957 -
Best practice for the test environment & DBA plan Activities Documents
Dears,,
In our company, we made sizing for hardware.
we have Three environments ( Test/Development , Training , Production ).
But the test environment has fewer servers than the production environment.
My question is:
What is the best practice for the test environment?
( Is there any recommendations from Oracle related to this , any PDF files help me ............ )
Also please , Can I have a detail document regarding the DBA plan activities?
I appreciate your help and advise
Thanks
Edited by: user4520487 on Mar 3, 2009 11:08 PM
Follow your build document for the same steps you used to build production.
You should know where all your code is. You can use the deployment manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
-Kevin -
Best Practice for the Vendor Consignment Process
Hiii ,,,
Does anybody have the best practices for the vendor consignment process?
Please send me the document.
Please explain the consignment process in SAP.
Thanx & Regards,
Kumar Rayudu
Hi Kumar,
In order to have consignment in SAP you need master data such as the material master, the vendor master, and a purchase info record of consignment type. You have to enter item category K when you create the PO. The goods receipt posting into vendor consignment stock is non-valuated.
1. The initial step starts with raising a purchase order for the consignment item.
2. The vendor receives the purchase order.
3. GR happens for the consignment material.
4. Stocks are received and placed under consignment stock.
5. Whenever we issue to production, or if we transfer post (using movement type 411) from consignment to own stock, a liability occurs.
6. Finally comes the settlement using transaction MRKO. You settle the amount for the goods which were consumed during a specific period.
regards
Anand.C -
Req:SAP Best practice for the Funds Management
Dear all,
Let me know where I can get the SAP Best Practices for Funds Management. Waiting for your valuable reply.
Regards
Manohar
Hello Manohar,
You can find documentation in links below:
Industry Solution Master Guide - SAP for Public Sector:
https://websmp105.sap-ag.de/~form/sapnet?_SHORTKEY=00200797470000065911
SAP Best Practices for Public Sector:
http://help.sap.com/ SAP Best Practices -> Industry Packages -> Public Sector
Online Library for Funds Management:
http://help.sap.com/saphelp_erp2005vp/helpdata/en/41/c62c6d6d84104ab938aa7eae51db06/frameset.htm
I hope it helps
Best Regards,
Vanessa Barth. -
Best Practice for the database owner of an SAP database.
We recently had a user account removed from our SAP system when this person left the agency. The account was associated with the SAP database (he created the database a couple of years ago).
I'd like to change the owner of the database to <domain>\<sid>adm (ex: XYZ\dv1adm) as this is the system admin account used on the host server and is a login for the sql server. I don't want to associate the database with another admin user as that will change over time.
What is the best practice for the database owner of an SAP database?
Thanks
Laurie McGinley
Hi Laurie,
I'm not sure if this is best practise or not, but I've always had the SA user as the owner of the database. It just makes it easier for restores to other systems etc.
Ken -
Best practice for the Update of SAP GRC CC Rule Set
Hi GRC experts,
We have in a CC production system an SoD matrix that we would like to modify extensively, basically by activating many permissions.
What is the best practice to accomplish our goal?
Many thanks in advance. Best regards,
Imanol
Hi Simon and Amir,
My name is Connie and I work at the Accenture GRC practice (and am a colleague of Imanol's). I have been reading this thread and I would like to ask you a question related to this topic. We have a case with a Global Rule Set ("Logic System") where we may also need to create a Specific Rule Set. Is there a document (from SAP or from best practices) that indicates the potential impact (regarding risk analysis, system performance, process execution time, etc.) caused by implementing both types of rule sets in a production environment? Are there any special considerations to be aware of? Have you ever implemented this type of scenario?
I would really appreciate your help, and if you could point me to specific documentation it would be of great assistance. Thanks in advance and best regards,
Connie -
What are the best practices for the RCU's schemas
Hi,
I was wondering if there are some best practices for the RCU schemas created with BIEE.
I already have Discoverer (and Application Server), so I have a metadata repository for the Application Server. I will upgrade Discoverer 10g to 11g, so I will create a new schema with RCU in my metadata repository (MR) of the Application Server. I'm wondering if I can put the BIEE RCU schemas in the same database.
Basically,
1. Is there a standard for the PREFIX?
2. If I have multiple components of Fusion in the same database, will I have multiple PREFIX_MDS schemas? Can they have the same PREFIX, or do they all need a different prefix?
For example: DISCO_MDS and BIEE_MDS, or can I have DEV_MDS and have this schema valid for both Discoverer and BIEE?
Thank you!
What are the best practices for exception handling in n-tier applications?
The application is a fat client based on MVVM pattern with
.NET framework.
That would be to catch all exceptions at a single point in the n-tier solution, log them, and create user-friendly messages to display to the user. -
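The question is about a .NET/MVVM fat client, but the single-choke-point idea is language-neutral; here is a minimal sketch in Python (the function and message names are hypothetical, purely for illustration):

```python
import logging

def run_command(action):
    """Single choke point: every UI command is dispatched through here,
    so exceptions are logged and translated in exactly one place."""
    try:
        return True, action()
    except Exception as exc:
        # Log the full stack trace for the developers...
        logging.exception("Unhandled error in UI command")
        # ...but hand the user a friendly message, not a traceback.
        return False, "Sorry, something went wrong (ref: %s)." % type(exc).__name__

ok, result = run_command(lambda: 2 + 2)     # (True, 4)
ok2, message = run_command(lambda: 1 / 0)   # (False, "Sorry, ... ZeroDivisionError ...")
```

In an MVVM client the same shape usually lives in the command dispatcher or a global unhandled-exception hook, so view models never need their own try/catch blocks.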
Best Practice for CTS_Project use in a Non-ChARM ECC6.0 System
We are on ECC6.0 and do not leverage Solution Manager to any extent. Over the years we have performed multiple technical upgrades but in many ways we are running our ECC6.0 solution using the same tools and approaches as we did back in R/3 3.1.
The future vision for us is to utilize CHARM to manage our ITIL-centric change process but we have to walk before we can run and are not yet ready to make that leap. Currently we are just beginning to leverage CTS_Projects in ECC as a grouping tool for transports but are still heavily tied to Excel-based "implementation plans". We would appreciate references or advice on best practices to follow with respect to the creation and use of the CTS_Projects in ECC.
Some specific questions:
#1 Is there merit in creating new CTS Projects for support activities each year? For example, we classify our support system changes as "Normal", "Emergency", and "Standard". These correspond to changes deployed on a periodic schedule, priority one changes deployed as soon as they are ready, and changes that are deemed to be "pre-approved" as they are low risk. Is there a benefit to create a new CTS_Project each year e.g. "2012 Emergencies", "2013 Emergencies" etc. or should we just create a CTS_Project "Emergencies" which stays open forever and then use the export time stamp as a selection criteria when we want to see what was moved in which year?
#2 We experienced significant system performance issues on export when we left the project intersections check on. There are many OSS notes about performance of this tool, but in the end we opted to turn off this check. Does anyone use this functionality? Any recommendations?
Any other advice would be greatly appreciated.
Hi,
I created a project (JDeveloper) with local xsd-files and tried to delete and recreate them in the structure pane with references to a version on the application server. After reopening the project I deployed it successfully to the bpel server. The process is working fine, but in the structure pane there is no information about any of the xsds anymore and the payload in the variables there is an exception (problem building schema).
How does bpel know where to look for the xsd-files and how does the mapping still work?
This cannot be the way to do it correctly. Do I have a chance to rework an existing project or do I have to rebuild it from scratch in order to have all the references right?
Thanks for any clue.
Bette -
BEST PRACTICE FOR THE REPLACEMENT OF REPORTS CLUSTER
Hi,
I've read the note reports_gueide_to_changed_functionality on OTN.
On page 5 it is stated that Reports clustering is deprecated.
Snippet:
Oracle Application Server High Availability provides the industry's most
reliable, resilient, and fault-tolerant application server platform. Oracle
Reports integration with OracleAS High Availability makes sure that your
enterprise-reporting environment is extremely reliable and fault-tolerant.
Since using OracleAS High Availability provides a centralized clustering
mechanism and several cutting-edge features, Oracle Reports clustering is now
deprecated.
Can anyone please tell me what the best practice is to replace Reports clustering?
It's really annoying that the clustering technology changes in every version of Reports!
martin
hello,
in reality, Reports Server "clusters" was more a load-balancing solution than a clustering one (no shared queue or cache). since it is desirable to have one load-balancing/HA approach for the application server, Reports Server clustering is deprecated in 10gR2.
we understand that this frequent change can cause some level of frustration, but it is our strong belief that unifying the HA "attack plan" for all of the app server components will ultimately benefit customers in simplifying their topologies.
the current best practice is to deploy LBRs (load-balancing routers) with sticky-routing capabilities to distribute requests across middle-tier nodes in an app-server cluster.
several customers in high-end environments have already used this kind of configuration to ensure optimal HA for their systems.
thanks,
philipp -
Best practice for ConcurrentHashMap use?
Hi All, would the following be considered "best practice", or is there a better way of doing the same thing? The requirement is to have a single unique "Handler" object for each Key:
public class HandlerManager {
    private final Object lock = new Object();
    private final Map<Key,Handler> map = new ConcurrentHashMap<Key,Handler>();

    public Handler getHandler(Key key) {
        Handler handler = map.get(key);
        if (handler == null) {
            synchronized (lock) {
                handler = map.get(key);
                if (handler == null) {
                    handler = new Handler();
                    map.put(key, handler);
                }
            }
        }
        return handler;
    }
}
Clearly this is the old "double-checked-locking" pattern which didn't work until 1.5 and now only works with volatiles. I believe I will get away with it because I'm using a ConcurrentHashMap.
Any opinions? is there a better pattern?
Thanks,
Huw
My personal choice would be to use the reliable "single-checked-locking" pattern:
public Handler getHandler(Key key) {
    synchronized (lock) {
        Handler handler = map.get(key);
        if (handler == null) {
            handler = new Handler();
            map.put(key, handler);
        }
        return handler;
    }
}
But I'm afraid the Politically Correct way of doing it nowadays looks as ugly as this:
class HandlerManager {
    private final Map<Key,Handler> map = new HashMap<Key,Handler>();
    private final Lock readLock;
    private final Lock writeLock;

    public HandlerManager() {
        ReadWriteLock lock = new ReentrantReadWriteLock();
        readLock = lock.readLock();
        writeLock = lock.writeLock();
    }

    public Handler getHandler(Key key) {
        Handler handler = null;
        readLock.lock();
        try {
            handler = map.get(key);
        } finally {
            readLock.unlock();
        }
        if (handler == null) {
            writeLock.lock();
            try {
                handler = map.get(key);
                if (handler == null) {
                    handler = new Handler();
                    map.put(key, handler);
                }
            } finally {
                writeLock.unlock();
            }
        }
        return handler;
    }
} -
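For completeness: since Java 5, ConcurrentHashMap.putIfAbsent gives the one-unique-Handler-per-key guarantee without any hand-rolled locking, and Java 8's computeIfAbsent(key, k -> new Handler()) additionally avoids constructing a Handler that loses the race. The same get-or-create contract, sketched in Python purely so the example here is self-contained and runnable:

```python
import threading

class Handler:
    pass

class HandlerManager:
    """One unique Handler per key. The check and the insert happen
    under a single lock, the 'single-checked locking' shape from
    this thread; no double-check is needed."""
    def __init__(self):
        self._lock = threading.Lock()
        self._map = {}

    def get_handler(self, key):
        with self._lock:
            handler = self._map.get(key)
            if handler is None:
                handler = Handler()
                self._map[key] = handler
            return handler

mgr = HandlerManager()
same = mgr.get_handler("jobs") is mgr.get_handler("jobs")   # True: one Handler per key
```

The trade-off is the same one debated above: a single lock is trivially correct but serializes all callers, while the concurrent-map primitives push that coordination into the library.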
Best practices for the data push in BlazeDS?
Hello,
Currently I'm working on the data push using BlazeDS. I'm wondering about the best practices to follow to reduce hits to the server and increase performance when using data push. Could you please guide me or direct me to good resources on this? Also, I'm currently using the default AMF channel for the data push; help me choose the right channel for this process.
Hi,
According to
this documentation, “You must configure a new name in Domain Name Services (DNS) to host the apps. To help improve security, the domain name should not be a subdomain
of the domain that hosts the SharePoint sites. For example, if the SharePoint sites are at Contoso.com, consider ContosoApps.com instead of App.Contoso.com as the domain name”.
More information:
http://technet.microsoft.com/en-us/library/fp161237(v=office.15)
For production hosting scenarios, you would still have to create a DNS routing strategy within your intranet and optionally configure your firewall.
The link below will show how to create and configure a production environment for apps for SharePoint:
http://technet.microsoft.com/en-us/library/fp161232(v=office.15)
Thanks
Patrick Liang
Forum Support
Please remember to mark the replies as answers if they
help and unmark them if they provide no help. If you have feedback for TechNet
Subscriber Support, contact [email protected]
Patrick Liang
TechNet Community Support -
Any best practices for the iPad mini????
I am in the beginning stages of designing my mag for the iPad... now the iPad mini seems to be all the hype, and the latest news states that the mini may outsell the larger one.
So... I know that the dimensions are 1x1 on bumping down to the smaller screen... but what about font sizes? what about the experience? Anyone already ahead of the game?
I have my own answers to these questions, but any one out there have some best practice advice or links to some articles they find informative...
I think 18-pt body text is fine for the iPad 2 but too small for the iPad mini. Obviously, it depends on your design and which font you're using, but it seems like a good idea to bump up the font size a couple points to account for the smaller screen.
For the same reason, be careful with small buttons and small tap areas.
I've also noticed that for whatever reason, MSOs and scrollable frames in PNG/JPG articles look great on the iPad 2 but look slightly more pixelated on the iPad Mini. It might just be my imagination.
Make sure that you test your design on the Mini. -
Best practice for development using REST API - OData
Hi All, I am new to REST. I am a developer who works mostly in server-side code using Visual Studio. Now that Microsoft is advocating to write code using REST API instead of server-side code or client side object model, I am trying to use REST API.
I googled, and most of the examples show writing code and putting it in a Content Editor/Script Editor web part. How do I organize code and deploy it to staging/production in this scenario? Is there any best practice or example around this?
Regards,
Khushi
If you are writing code in aspx or cs, it does not mean that you need to deploy it on the SharePoint server; it could be any other application running on your remote server. What I mean is you can use C# and the REST API to connect to the SharePoint server.
REST API in SharePoint 2013 provides the developers with a simple standardized method of retrieving information from SharePoint and it can be used from any technology that is capable of sending standard HTTP requests.
Refer to the following blog posts that provide more details about a comparison of the major features of these programming choices:
http://msdn.microsoft.com/en-us/library/jj164060.aspx#RESTODataA
http://dlr2008.wordpress.com/2013/10/31/sharepoint-2013-rest-api-the-c-connection-part-1-using-system-net-http-httpclient/
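To illustrate the "any technology that can send standard HTTP requests" point, here is a minimal Python sketch that builds (but does not send) a REST call for list items. The site URL is hypothetical, and authentication is deliberately left out; the /_api/web/lists path is the standard SharePoint 2013 REST shape:

```python
from urllib import request

# Hypothetical site; substitute your own SharePoint web URL.
site = "https://contoso.example.com/sites/dev"

# Standard SharePoint 2013 REST endpoint for the items of a list.
req = request.Request(
    site + "/_api/web/lists/getbytitle('Documents')/items",
    headers={"Accept": "application/json;odata=verbose"},
)
# req is ready for request.urlopen(req) once authentication
# (e.g. an OAuth bearer token or NTLM credentials) has been added.
```

Because this is just an HTTP GET, the same call works from C#, JavaScript, PowerShell, or any remote application, which is exactly why it need not be deployed on the SharePoint server itself.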
Hope this helps
--Cheers -
Defragmenting Mac (Leopard) best practice for the safest disk optimisation
What is the best way to defragment?
It's always dangerous to perform disk-related stuff; sometimes some of the apps can lead to file corruption. I know of a lot of cases where almost all the third-party software has caused issues.
I have Disk Warrior, TechTool Pro, and Drive Genius; which one is safest? I know most of them require booting off media to work on the boot drive.
Recently, I upgraded to a 320gb 7200 rpm drive, and since I had about 190gb of data, I thought Disk Utility would be faster. It did copy in 1.5 hours, but it didn't defragment my hard drive since it does a block-level copy. My computer is IN FACT RUNNING SLOWER.
I'm thinking of Carbon Copy Cloner to an external hard drive (file-level copy), and then booting off the 10.5 restore DVD to perform a restore with Disk Utility, which is a lot faster since it does a block-level copy.
I use iDefrag.
http://www.coriolis-systems.com/iDefrag-faq.php
from iDefrag help:
Why Defragment?
It has often been asserted that defragmentation (or disk optimization) is not a good idea on
systems using Apple’s HFS+ filesystem. The main reasons given for this historically have been:
HFS+ is very much better at keeping files defragmented than many other commodity filesystems.
Advanced features in recent versions of HFS+ can easily be disrupted by a defragmentation tool
that does not support them, resulting in decreased performance.
There is a risk associated with defragmentation.
Whilst these arguments are certainly valid, they are not the whole story. For one thing, iDefrag,
unlike most other disk defragmentation tools, fully supports the most recent features of HFS+,
namely the metadata zone (or “hot band”) and the adaptive hot file clustering support added in
Mac OS X 10.3. Not only does it avoid disrupting them, but it is capable of fixing disruption caused
by other software by moving files into or out of the metadata zone as appropriate.
Sensible arguments for occasional optimization of your disk include:
HFS+ is not very good at keeping free space contiguous, which can, in turn, lead to large files
becoming very fragmented, and can also cause problems for the virtual memory subsystem on
Mac OS X.
Older versions of the Mac OS are not themselves aware of the metadata zone policy, and may
disrupt its performance.
HFS+ uses B-Tree index files to hold information about the filesystem. If a large number of files
are placed on a disk, the filesystem may have to enlarge these B-Tree structures; however, there
is no built-in mechanism to shrink them again once the files are deleted, so the space taken up
by these files has been lost.
Whilst HFS+ is good at keeping individual files defragmented, mechanisms like Software Update
may result in files that are components of the same piece of software being scattered across the
disk, leading to increased start-up times, both for Mac OS X itself and for applications software.
This is a form of fragmentation that is typically overlooked.
Defragmenting disk images can be helpful, particularly if they are to be placed onto a CD/DVD, as
seeks on CD/DVD discs are particularly expensive.
Some specific usage patterns may cause fragmentation despite the features of HFS+ that are
designed to avoid it.
We do not recommend very frequent optimization of your disk; optimizing a disk can take a
substantial amount of time, particularly with larger disks, far outweighing the benefits that are
likely to be obtained by (say) a weekly optimization regime.
Optimization may make more sense, however, following large software updates, or on an
occasional basis if you notice decreased performance and lots of hard disk seeking on system
start-up or when starting an application.
Kj