JDBC / SQL Update Best Practice

My application updates a database table whenever a user modifies their profile.
I have two questions on this...
1. I've chosen to use PreparedStatements purely because it means I don't have to worry about special characters (e.g. '%"? in my data), and not because I re-use the statements. Is this a respected approach?
2. Is it worth dynamically building the update SQL and adding parameters because in most cases only a subset of the possible fields will be modified? (e.g. avoid setting col1="test" if col1 already equals "test"). Is there an acknowledged pattern / algorithm / library that does this?
Thanks,
Steve

> My application updates a database table whenever a user modifies their profile.
> I have two questions on this...
> 1. I've chosen to use PreparedStatements purely because it means I don't have to worry about special characters (e.g. '%"? in my data), and not because I re-use the statements. Is this a respected approach?

Yes.
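The driver binds the parameter values separately from the SQL text, so special characters in the data (quotes, %, ?, etc.) can't break the statement, and you also get protection against SQL injection for free. A minimal sketch, assuming a hypothetical user_profile table:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    // Minimal sketch; the table and column names are made up for illustration.
    public class ProfileDao {
        public int updateProfile(Connection conn, long userId,
                                 String displayName, String email) throws SQLException {
            String sql = "UPDATE user_profile SET display_name = ?, email = ? WHERE user_id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, displayName); // values may safely contain ' % " ? etc.
                ps.setString(2, email);
                ps.setLong(3, userId);
                return ps.executeUpdate();    // rows affected
            }
        }
    }

Even if you never execute the same statement twice, the clarity and safety alone justify the approach.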
> 2. Is it worth dynamically building the update SQL and adding parameters because in most cases only a subset of the possible fields will be modified? (e.g. avoid setting col1="test" if col1 already equals "test").

Probably not. The only time this is going to matter is if there is a significantly sized field (like a BLOB) that often does not get updated; in that case you would probably want to exclude that field.

> Is there an acknowledged pattern / algorithm / library that does this?

Officially, not as far as I know.
There are several patterns that I have used.
1. A modified flag for each field. If the set method is called then the flag is set to true.
2. A modified flag for each field. If the set method is called then the new value is compared to the old and the flag is set depending on the outcome.
3. The database layer holds (or retrieves) the previous data. It compares the two, noting the fields that have changed.
Note that in all of the above, primary keys must be handled separately. Usually the primary key is either set or not set: if not set, it is a new record; if set, it is an update.
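As a rough illustration of pattern 2 combined with a dynamically built UPDATE (question 2), here is a minimal sketch; the table, column names and helper class are hypothetical, not an established library:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.Objects;

    // Tracks "dirty" columns and builds an UPDATE touching only those columns.
    public class ProfileUpdate {
        private final long userId;                                  // primary key set => this is an update
        private final Map<String, Object> dirty = new LinkedHashMap<>();

        public ProfileUpdate(long userId) { this.userId = userId; }

        // Pattern 2: record the column only if the new value differs from the old one.
        public void set(String column, Object oldValue, Object newValue) {
            if (!Objects.equals(oldValue, newValue)) {
                dirty.put(column, newValue);
            }
        }

        public int execute(Connection conn) throws SQLException {
            if (dirty.isEmpty()) return 0;                          // nothing changed, skip the round trip
            StringBuilder sql = new StringBuilder("UPDATE user_profile SET ");
            ArrayList<Object> params = new ArrayList<>();
            for (Map.Entry<String, Object> e : dirty.entrySet()) {
                if (!params.isEmpty()) sql.append(", ");
                sql.append(e.getKey()).append(" = ?");              // column names come from your code, never from user input
                params.add(e.getValue());
            }
            sql.append(" WHERE user_id = ?");
            params.add(userId);
            try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
                for (int i = 0; i < params.size(); i++) {
                    ps.setObject(i + 1, params.get(i));
                }
                return ps.executeUpdate();
            }
        }
    }

In practice an ORM such as Hibernate/JPA gives you this dirty-field detection for free, so rolling your own is usually only worth it for the large-field (BLOB) case mentioned above.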

Similar Messages

  • SQL server Best Practice Analyzer output in .CSV

    Hi Team, I ran the SQL Server Best Practices Analyzer on our SQL 2008 R2 server. I was trying to export the scan result in .csv format, but it only gives me the option to save it in .xml format. I have been looking for a way to export the output in a readable form that I can send to our clients, but no luck.
    How can I export the SQL BPA output to .csv or any other user-friendly format?
    Thanks in Advance.

    Hi MSRS27,
    You can run Best Practices Analyzer (BPA) scans either from Server Manager, by using the BPA GUI, or by using cmdlets in Windows PowerShell. You can view or save BPA results from a Windows PowerShell session in different formats.
    If you want to export BPA results to a comma-separated values (CSV) text file, run the following cmdlet, where Path represents the path and text file name to which you want to save the CSV results.
    Get-BPAResult <Model ID> | Export-CSV <Path>
    CSV results can be imported into Microsoft® Excel, or other programs that display data in spreadsheets or grids.
    For more information, see: Run Best Practices Analyzer Scans and Manage Scan Results
    http://technet.microsoft.com/en-us/library/hh831400.aspx
    Regards,
    Sofiya Li
    TechNet Community Support

  • SQL backup best practice on VMs that are backed up as a complete VM

    Hi,
    Apologies as I am sure this has been asked many times before, but I can't really find an answer to my question. So my situation is this: I have two types of backups, agent-based and snapshot-based.
    For the VMs that are being backed up by snapshots, the process is: VMware does the snap, then the SAN takes a snap of the storage, and then the backup is taken from the SAN. We then have full VM backups.
    The agent-based backups only back up file-level stuff, so we use this for our SQL cluster and some other servers. These are not snaps/full VM backups, but simply backups of databases, files, etc.
    This works well, but there are a couple of servers that need to be in the full VM snap category and therefore can't have the backup agent installed, as they are already being backed up by the snapshot technology. So what would be the best practice for these snapped VMs that also have SQL installed? Should I configure a recurring backup in SQL Management Studio (if that is possible?) to run before the VM snap backup, or is there another way I should be backing up the DBs?
    Any suggestions would be very welcome.
    Thanks,
    Aaron

    Hello Aaron,
    If I understand correctly, you perform a snapshot backup of the complete VM.
    In that case you also need to create a SQL Server backup schedule to perform Full and Transaction Log backups.
    (if you do a file-level backup of the .mdf and .ldf files with an agent you also need to do this)
    I would run a database backup before the VM snapshot (to a SAN location if possible), then perform the Snapshot backup.
    You should set up the transaction log backups depending on business recovery needs.
    For instance: if your company accepts a maximum of 30 minutes data loss make sure to perform a transaction log backup every 30 minutes.
    In case of emergency you could revert to the VM Snapshot, restore the full database backup and restore transaction log backups till the point in time you need.

  • SAP Business One 2007 - SQL Security best practice

    I have a client with a large user base running SAP Business One 2007. 
    We are concerned about the use of the SQL sa user and the ability to change the password of this ID from the SAP Business One logon.
    We therefore want to move to Windows Authentication (i.e. Trusted Connection) for the SAP BO logon.  It appears, however, that this can only work by granting the Windows IDs (of the SAP users) sysadmin access in SQL.
    Does anyone have a better method of securing SAP Business One, or is there a recommended best practice?  Any help would be appreciated.
    Damian

    See the Administrator's Guide for best practice.
    You can use SQL Authentication mode; don't tick "Remember password".
    Also check this thread
    SQL Authentication Mode
    Edited by: Jeyakanthan A on Aug 28, 2009 3:57 PM

  • SQL Server Best Practices Architecture UCS and FAS3270

    Hey there,
    We are moving from an EMC SAN and physical servers to a NetApp FAS3270 and a virtual environment on Cisco UCS B200 M3.
    Traditionally, best practice for SQL Server databases is to separate the following files onto separate LUNs and/or volumes: database data files, transaction log files, and TempDB data files. I have also seen additional separation for system data files (Master, Model, MSDB, Distribution, Resource DB etc...) and indexes.
    Depending on the size of the database and the I/O requirements you can add multiple files per database. The goal is to provide optimal performance. The method of choice is to separate reads & writes (random and sequential activities).
    If you have 30 disks, is it better to separate them, or is it better to leave the files in one continuous pool? For example: 12 drives RAID 10 (data files), 10 drives RAID 10 (log files), 8 drives RAID 10 (TempDB).
    Please don't get too caught up in the numbers used in the example, but focus on whether or not (using the FAS3270) it is better practice to separate or consolidate drives/volumes for SQL Server databases. Thanks!

    Hi Michael,
    It's a completely different world with NetApp! As a rule of thumb, you don't need separate spindles for different workloads (like SQL databases & logs) - you just put them into separate flexible volumes, which can share the same aggregate (i.e. a grouping of physical disks).
    For more detailed info about SQL on NetApp have a look at this doc: http://www.netapp.com/us/system/pdf-reader.aspx?pdfuri=tcm:10-61005-16&m=tr-4003.pdf
    Regards,
    Radek

  • PL/SQL Design: Best Practice

    Hello everybody,
    I'm trying to improve my competence in PL/SQL design. At the moment I am (or I would like to be) a good practitioner, but I have to admit to shortcomings when it comes to designing architecture.
    I mean, for example, how to organize procedures and functions in a package, how to use architectural patterns, logging and testing practices...
    Do you know where I can find resources on these topics? Which books do you suggest starting with?
    Thank you very much.
    Nicola

    The best practices are the very same fundamentals that apply to all other languages and have existed since the dawn of programming.
    The single biggest fundamental principle is to modularise your design and code. A well designed program consists of building blocks. Different languages have different names for these - procedures, functions, units, packages, methods, routines, etc.
    A program lives or dies by how well it is modularised.
    If you only get that right, you can claim to be a Programmer and not a mere two bit developer.

  • Coldfusion, MS SQL, Hash Best Practices,...

    Hello,
    I am trying to store hashed data (user password) in an
    MS SQL database; the datatype in the database is set to varbinary.
    I get a datatype conflict when trying to insert the hashed data. It
    works when the datatype in the database is set to varchar.
    I understand that you can set your hash function with
    arguments that will convert the data before sending to the
    database, but I am not clear on how this is done. Now, along with
    any assistance with the conversion, what exactly is the best
    practice for storing the hash data? Should I store it as varchar or
    varbinary? Of course, if varchar I won't have the problem, but I am
    interested in best practices as well.
    Thnx

    brwright,
    I suggest parameterizing your queries to add protection from
    injection.
    http://livedocs.adobe.com/coldfusion/6.1/htmldocs/tags-b20.htm
    Hashing is best suited for passwords because it is one-way:
    once a value has been hashed with hash(), it can't be decrypted.
    For other fields that you want to encrypt but still be able
    to decrypt, you can use the encrypt() and decrypt()
    functions.
    http://livedocs.adobe.com/coldfusion/6.1/htmldocs/functi75.htm
    I think there are also new encryption functions available in
    ColdFusion 8...
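    The thread above is ColdFusion, but the underlying point is language-neutral: to store a hash in a varbinary column, pass the hash as raw bytes through a binary bind parameter rather than as a string. A minimal sketch of the same idea in JDBC terms (the table and column names are hypothetical, and a real system should use a salted, slow password hash such as PBKDF2 or bcrypt rather than bare SHA-256):

        import java.nio.charset.StandardCharsets;
        import java.security.MessageDigest;
        import java.sql.Connection;
        import java.sql.PreparedStatement;

        // Sketch only: hashes the password to bytes and binds them as binary data.
        public class PasswordStore {
            public void savePassword(Connection conn, long userId, String password) throws Exception {
                byte[] hash = MessageDigest.getInstance("SHA-256")
                                           .digest(password.getBytes(StandardCharsets.UTF_8));
                String sql = "UPDATE app_user SET password_hash = ? WHERE user_id = ?";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setBytes(1, hash);   // binary bind, matching a VARBINARY column
                    ps.setLong(2, userId);
                    ps.executeUpdate();
                }
            }
        }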

  • IOS Update Best Practices for Business Devices

    We're trying to figure out some best practices for doing iOS software updates to business devices.  Our devices are scattered across 24 hospitals and parts of two states. Going forward there might be hundreds of iOS devices at each facility.  Apple has tools for doing this in a smaller setting with a limited network, but to my knowledge, nothing (yet) for a larger implementation.  I know configurator can be used to do iOS updates.  I found this online:
    https://www.youtube.com/watch?v=6QPbZG3e-Uc
    I'm thinking the approach to take for the time being would be to have a mobile sync station setup with configurator for use at each facility.  The station would be moved throughout the facility to perform updates to the various devices.  Thought I'd see if anyone has tried this approach, or has any other ideas for dealing with device software updates.  Thanks in advance. 

    Hi Bonesaw1962,
    We've had our staff and students run iOS updates OTA via Settings -> Software Update. In the past, we put a DNS block on Apple's update servers to prevent users from updating iOS (like last fall when iOS 7 was first released). By blocking mesu.apple.com, the iPads weren't able to check for or install any iOS software updates. We waited until iOS 7.0.3 was released before we removed the block to mesu.apple.com, at which point we told users that if they wanted to update to iOS 7 they could do so OTA. We used our MDM to run reports periodically to see how many people updated to iOS 7 and how many stayed on iOS 6. As time went on, just about everyone updated on their own.
    If you go this route (depending on the number of devices you have), you may want to take a look at Caching Server 2 to help with the network load https://www.apple.com/osx/server/features/#caching-server . From Apple's website, "When a user on your network downloads new software from Apple, a copy is automatically stored on your server. So the next time other users on your network update or download that same software, they actually access it from inside the network."
    I wish there was a way for MDMs to manage iOS updates, but unfortunately Apple hasn't made this feature available to MDM providers. I've given this feedback to our Apple SE, but haven't heard if it is being considered or not. Keeping fingers crossed.
    Hope this helps. Let us know what you decide on and keep us posted on the progress. Good luck!!
    ~Joe

  • Exchange SP1 to SP3 Update Best Practices?

    Can someone outline Patching Best Practices?
    I need to upgrade Exchange 2010 SP1 to SP3 for a potential customer. There are a few questions on my mind:
    - How do I proceed with a multi-site deployment? Can I patch all CAS, MBX and Edge servers for site A and then move to site B?
    - What are the things that must be taken care of while performing the whole process?
    - I have a theoretical overview of how to proceed with the upgrade; I just want to make sure there is nothing else that needs to be taken care of before doing this in a production environment.
    Thanks in Advance.

    Hi,
    In addition to Will Martin's suggestion, I would like to verify if there is a DAG in your environment. If yes, please follow the steps below to upgrade it.
    1. Run the StartDagServerMaintenance.ps1 script to put the DAG member into maintenance mode and prepare it for the update rollup installation.
    2. Install the update rollup.
    3. Run the StopDagServerMaintenance.ps1 script to take the DAG member out of maintenance mode and put it back into production.
    4. Optionally rebalance the DAG by using the RedistributeActiveDatabases.ps1 script.
    Hope this can be helpful to you.
    Best regards,
    Amy Wang
    TechNet Community Support

  • EJB3 entity bean update, best practice

    I have an ejb3 entity bean that models a time that can be reserved in a booking system.
    I need a way to reserve the time for a specific user. Of course the reservation should not overwrite an existing reservation if the time has already been reserved by another user.
    What is the best/cleanest way to provide this service?
    I have thought of the following ways.
    I could put a version field on the entity. When the user is set in the frontend, the entity bean will check that the time is not already reserved. If not, it will be sent back to the stateless session bean for persisting. If the time has been reserved by another user in the meantime, JPA will throw an exception since the version doesn't match any more. The frontend can then show the error to the user.
    I could make the frontend call a method in a stateless session bean to reserve the time. The method could take the time's primary id and the user's primary id and load them from persistence, then check whether the time is already reserved, and otherwise set the user and persist the time again. This should of course be within a transaction and possibly also use a version attribute on the entity.

    Only fields detected as persistent-dirty will be updated in the database record.
    Laurent
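    For what it's worth, a minimal sketch of the version-field approach described in the question (EJB3/JPA annotations; the entity and field names are made up):

        import javax.persistence.Entity;
        import javax.persistence.Id;
        import javax.persistence.Version;

        // If two users load the same slot and both try to reserve it, the second
        // commit fails with an OptimisticLockException because the version column
        // no longer matches, and the frontend can show the error to the user.
        @Entity
        public class BookingSlot {
            @Id
            private Long id;

            @Version
            private long version;          // incremented by the persistence provider on every update

            private Long reservedByUserId; // null means the slot is still free

            public boolean isReserved() { return reservedByUserId != null; }

            public void reserveFor(Long userId) {
                if (isReserved()) {
                    throw new IllegalStateException("Slot already reserved");
                }
                this.reservedByUserId = userId;
            }
        }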

  • Update Hierarchy Acrobat 9.x/Update Best Practices?

    Because of the many problems with Acrobat that seem to be caused by updates, I find it is common that I have to reinstall Acrobat (we are using 9) on someone's machine.
    Our installer disc/files are Acrobat 9, missing of course all the updates.  I typically run the updates separately as I have them already downloaded... but I have been going in step by step (9.12, 9.13, 9.14, etc.), which takes forever.  By my count, I have 14 updates that need to be installed one by one after I install Acrobat.  This is very time consuming.
    Is there any idea or list of which updates I can skip because their fixes are included in the next release?  Or, better yet, is there a way that I can download a full installer of 9.4.4 (or whatever is current) that works with my 9.x activation keys?
    Wondering what other system admins do with the growing updates for this application.

    How would one create an installer as you mentioned?  I created a batch file to install each update silently, one at a time, but it is still a pain because if you don't disable UAC, you sit there and press Continue every time... not ideal for widespread use.

  • SQL 2012 Best Practice Analyzer issue with nothing available in pulldown on Microsoft Baseline Configuration Analyzer V2.0

    We have tried using both a Windows 7 and a Windows 8 machine and still cannot see any items available in the pulldown (i.e. no SQL 2012 or anything).  Is this a known issue - does BPA not work for SQL 2012?  Any suggestions?  I've seen several
    posts with the same issue, but no resolution.
    Laura

    Hi Laura,
    I installed Microsoft Baseline Configuration Analyzer 2.0 successfully. I can select a product: SQL Server 2012 BPA. Do you mean this?
    Thanks.
    Maggie Luo
    TechNet Community Support

  • Sql connection best practices

    Can someone discuss the pros and cons of setting your DB connection in web.xml and then using the following in a JSP?
    <sql:query var="myQuery">
         SELECT * FROM mytable
    </sql:query>
    I find that it is quick and easy, but would I want to give this kind of code to my supervisor? From 20,000 ft up. :)
    Edited by: Reme on Aug 2, 2008 5:12 AM

    [http://java.sun.com/javaee/5/docs/tutorial/doc/bnald.html]
    "The JSTL SQL tags for accessing databases listed in Table 7-7 are designed for quick prototyping and simple applications. For production applications, database operations are normally encapsulated in JavaBeans components."
    I wouldn't use it. Layer your application properly. Make use of a DAO class.
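    As a minimal sketch of what "make use of a DAO class" could look like, so the JSP (or a servlet/bean) never embeds SQL directly; the table, column and class names are assumptions:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.util.ArrayList;
        import java.util.List;
        import javax.sql.DataSource;

        // Simple DAO: all SQL for "mytable" lives here, behind a plain Java API.
        public class MyTableDao {
            private final DataSource dataSource;

            public MyTableDao(DataSource dataSource) { this.dataSource = dataSource; }

            public List<String> findAllNames() throws SQLException {
                List<String> names = new ArrayList<>();
                String sql = "SELECT name FROM mytable";
                try (Connection conn = dataSource.getConnection();
                     PreparedStatement ps = conn.prepareStatement(sql);
                     ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        names.add(rs.getString("name"));
                    }
                }
                return names;
            }
        }

    The JSP then only iterates over the returned list, which keeps the connection details in web.xml/JNDI and the SQL out of the view layer.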

  • Piloting Software Updates Best Practices

    Hello All
    I am in the process of configuring our Software Updates infrastructure. My plan is to deploy patch Tuesday updates first to a Pilot Group (Collection) and then to what I will call Production group (Collection). Now when it comes to the pilot phase of software
    update testing, of course we are just trying to see if any updates may prove problematic. Are there any other extensive testing methods or tips that anyone uses? Any helpful info would be appreciated.
    Thanks,
    Phillip
    Phil Balderos

    Generally, folks let the pilot users know that updates are coming and to do their normal "thing(s)". If there is something business critical, then having them test that would also be prudent. The pilot users should be representative of a broad
    cross-section of users to try to catch any issues among all of the various tasks that go on. Getting informed pilot users involved is of course helpful also so that they can provide relevant and pertinent feedback. At the end of the day though, there are no
    guarantees, so you should stress this to management.
    Jason | http://blog.configmgrftw.com | @jasonsandys

  • SQL 2008 R2 Best Practices for Updating Statistics for a 1.5 TB VLDB

    We currently have a ~1.5 TB VLDB (SQL 2008 R2) that services both OLTP and DSS workloads pretty much on a 24x7x365 basis. For many years we have been updating statistics (full scan- 100% sample size) for this VLDB once a week on the weekend, which
    is currently taking up to 30 hours to complete.
    Somewhat recently we have been experiencing intermittent issues while statistics are being updated, which I doubt is just a coincidence. I'd like to understand exactly why the process of updating statistics can cause these issues (timeouts/errors). My theory
    is that the optimizer is forced to choose an inferior execution plan while the needed statistics are in "limbo" (stuck between the "old" and the "new"), but that is again just a theory. I'm somewhat surprised that the "old" statistics couldn't continue to
    get used while the new/current statistics are being generated (like the process for rebuilding indexes online), but I don't know all the facts behind this mechanism yet so that may not even apply here.
    I understand that we have the option of reducing the sample percentage/size for updating statistics, which is currently set at 100% (full scan).  Reducing the sample percentage/size for updating statistics will reduce the total processing time, but
    it's also my understanding that doing so will leave the optimizer with less than optimal statistics for choosing the best execution plans. This seems to be a classic case of not being able to have one’s cake and eat it too.
    So in a nutshell I'm looking to fully understand why the process of updating statistics can cause access issues and I'm also looking for best practices in general for updating statistics of such a VLDB. Thanks in advance.
    Bill Thacker

    I'm with you. Yikes is exactly right with regard to suspending all index optimizations for so long. I'll probably start a separate forum thread about that in the near future, but for now let's stick to the best practices for updating statistics.
    I'm a little disappointed that multiple people haven't already chimed in about this and offered up some viable solutions. Like I said previously, I can't be the first person in need of such a thing. This database has 552 tables with a whole lot more statistics
    objects than that associated with those tables. The metadata has to be there for determining which statistics objects can go (not utilized much if at all so delete them- also produce an actual script to delete the useless ones identified) and what
    the proper sample percentage/size should be for updating the remaining, utilized statistics (again, also produce a script that can be used for executing the appropriate update statistics commands for each table based on cardinality).
    The above solution would be much more ideal IMO than just issuing a single update statistics command that samples the same percentage/size for every table (e.g. 10%). That's what we're doing today at 100% (full scan).
    Come on SQL Server Community. Show me some love :)
    Bill Thacker
