Storing File Paths - Best Practice

I have a db that stores the paths of images that are loaded into a page. All the files are stored in the same folder. My question: is it better to store the entire file path in my db, or should I just store the file name and define a constant within the webpage calling the picture? I would think it's the second option, as it is less data in the db, but I just wanted to make sure.
Thanks!

If the path is always the same, I would store just the filenames. Another option is to create a new field in your table for the path and assign the current path as its default value. When inserting records, just supply the file name; the path field will take the default value.
In your SQL:
SELECT *, CONCAT(path, filename) AS image
FROM mytable
If you already have records stored in the table, you can update them very quickly with two SQL queries:
UPDATE mytable SET path = '/path/to/images/folder/'
UPDATE mytable SET filename = REPLACE(filename, '/path/to/images/folder/', '')
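To see the whole pattern in one place, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (the table and folder names are just placeholders; note that SQLite concatenates with || where MySQL uses CONCAT, while REPLACE works the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The path column defaults to the constant folder, so inserts only
# need to supply the file name.
conn.execute("""
    CREATE TABLE mytable (
        filename TEXT NOT NULL,
        path     TEXT NOT NULL DEFAULT '/path/to/images/folder/'
    )
""")
conn.execute("INSERT INTO mytable (filename) VALUES ('photo.jpg')")

# Reassemble the full path at query time (SQLite's || is MySQL's CONCAT).
full = conn.execute("SELECT path || filename FROM mytable").fetchone()[0]
print(full)  # /path/to/images/folder/photo.jpg

# Migrating a legacy row that stored the full path in filename:
conn.execute("INSERT INTO mytable (filename, path) "
             "VALUES ('/path/to/images/folder/old.jpg', '')")
conn.execute("UPDATE mytable SET path = '/path/to/images/folder/'")
conn.execute("UPDATE mytable SET filename = "
             "REPLACE(filename, '/path/to/images/folder/', '')")
```

After the two UPDATE statements, every row holds just a bare file name plus the shared folder in the path column, which is the layout the answer recommends.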

Similar Messages

  • SQL Server installation paths best practices

    In my company we're planning to set up a new (consolidated) SQL Server 2012 server (on Windows 2012 R2, VMware). The current situation is that there is one SQL Server 2000 instance, a few SQL Server 2008 Express instances, and a lot of Access databases. For the installation I'm wondering
    what the best selections for the various installation paths are. Our infra colleagues (offshore) have the following standard partition setup for SQL Server servers:
    C:\ OS
    E:\ Application
    L:\ Logs
    S:\ DB
    T:\ TEMPDB
    And during the installation I have to make a choice for the following
    Shared feature directory: x:\Program Files\Microsoft SQL Server\
    Shared feature directory (x86): x:\Program Files\Microsoft SQL Server\
    Instance root directory (SQL Server, Analysis Services, Reporting Services): x:\Program Files\Microsoft SQL Server\
    Database Engine Configuration Data Directories:
    Data root directory: x:\Program Files\Microsoft SQL Server\
    User database directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    User database log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Backup directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Analysis Services Configuration Data Directories:
    User database directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    User database log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Temp DB log directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Backup directory: x:\Program Files\Microsoft SQL Server\MSSQL11.x\MSSQL\...
    Distributed Replay Client:
    Working Directory: x:\Program Files (x86)\Microsoft SQL Server\DReplayClient\WorkingDir\
    Result Directory: x:\Program Files (x86)\Microsoft SQL Server\DReplayClient\ResultDir\
    So I'd like some assistance on filling in the x drive letters. I understand it's best practice to separate the data files and the log files. But should that also be the case for tempdb? And should both the database and tempdb log files go to the
    same log partition then? What about the backup directories? Any input is very much appreciated!
    Btw, I followed the http://www.sqlservercentral.com/blogs/basits-sql-server-tips/2012/06/23/sql-server-2012-installation-guide/ guide for the installation (Test server now).

    You can place all installation libraries on the E:\ drive.
    >>So I'd like some assistance on filling in the x drive letters. I understand it's best practice
    to separate the data files and the log files. But should that also be the case for tempdb? And should both the database and tempdb log files go to the same log partition then? What about the backup directories? Any input is very much appreciated!
    You can place tempdb data files on the T:\ drive, and I prefer to place the tempdb log and user database log files
    on the same drive, i.e., the L:\ drive.
    >>Backup directories
    If you are not using any third-party tool, then I would prefer to create a separate drive for backups.
    Refer the below link for further reading
    http://www.brentozar.com/archive/2009/02/when-should-you-put-data-and-logs-on-the-same-drive/
    --Prashanth

  • Essbase unix file system best practice

    Is there such a thing in Essbase as storing files in different file systems to avoid I/O contention? For example, in Oracle it is best practice to store index files and data files in different locations to avoid I/O contention. If everything on the Essbase server is stored under one directory structure, as it is now, the Unix team is afraid we may run into performance issues. Can you please share your thoughts?
    Thanks

    In an environment with many users (200+), or with Planning apps where users can run large, long-running rules, I would recommend you separate the applications onto separate volume groups if possible, each volume group having multiple spindles available.
    The alternative to planning for load up front would be to analyze the load during peak times -- although I've had mixed results in getting the server/disk SMEs to assist in these kinds of efforts.
    A more advanced thing to worry about is journaling filesystems that share a common cache for all disks within a VG.
    Regards,
    -John

  • File import best practice

    I need some outside input on a process. We get a file from a bank, and I have to take it and move it along to where it needs to go. Pretty straightforward.
    The complexity is the import process from the bank. It's a demand-pull process where an exe needs to be written that pulls the file from the bank and drops it into a folder. My issue is they want me to kick the exe off from inside SSIS and then use a file
    watcher to import the file into a database once the download is complete. My opinion is that the SSIS package that imports the file and the exe that gets the file from the bank should be totally divorced from each other.
    Does anybody have an opinion on the best practice of how this should be done?

    Here it is:http://social.msdn.microsoft.com/Forums/sqlserver/en-US/bd08236e-0714-4b8f-995f-f614cda89834/automatic-project-execution?forum=sqlintegrationservices
    Arthur My Blog

  • Flat File load best practice

    Hi,
    I'm looking for a Flat File best practice for data loading.
    The need is to load flat file data into BI 7. The flat file structure has been standardized, but it contains 4 slightly different flavors of data. Thus, some fields may be empty while others are mandatory. The idea is to have separate cubes at the end of the data flow.
    Onto the loading of said file:
    Is it best to load all data flavors into 1 PSA and then separate into 4 specific DSOs based on data type?
    Or should the data be separated into separate file loads as early as the PSA? So, have 4 DataSources/PSAs and have separate flows from there on up to the cube?
    I guess pros/cons may come down to where the maintenance falls: separate files vs separate PSA/DSOs...??
    Appreciate any suggestions/advice.
    Thanks,
    Gregg

    I'm not sure if there is any best practice for this scenario (or maybe there is one), as this is data specific to a particular customer's needs. But if I were you, I would bring the one file into a single PSA and route the data from there to its respective ODS. That would give me more flexibility within BI to manipulate the data as needed, without having to involve the business for 4 different files (chances are that they will get them wrong when splitting the files).
    So in case of any issue, your troubleshooting would start from the PSA rather than going through the file (very painful and frustrating) to see which records in the file screwed up the report. I'm more comfortable handling BI objects than data files, because you know exactly where you have to look.

  • FDM file format best practice

    All, we are beginning to implement an Oracle GL, and I have been asked to provide input on the file format provided from the ledger to process through FDM (I know, processing directly into HFM is out... at least for now).
    Is there a "Best Practice" for file formats to load through FDM into HFM? I'm really looking for efficiency (fastest to load, easiest to maintain, etc.).
    Yes, we will have to use maps in FDM, so that is part of the consideration.
    Questions: fixed-width or delimited? Concatenate fields or not? Security? Minimize the use of scripts? Is it better to have the GL consolidate, etc.?
    Thoughts appreciated
    Edited by: Wtrdev on Mar 14, 2013 10:02 AM

    If possible, a comma- or semicolon-delimited file would be easy to maintain and easy to load.
    The less scripting used on the file, the better the import performance.
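    As an illustration of why a delimited file is easy to work with, here is a tiny sketch in Python (the column names and sample rows are made up; a real GL extract would of course differ):

```python
import csv
import io

# Hypothetical semicolon-delimited GL extract (header + two rows).
sample = "Entity;Account;Period;Amount\nUS01;4000;Jan;125.50\nUS01;4100;Jan;98.10\n"

# DictReader handles the delimiter and header in one line; no custom
# parsing script is needed, which is the point of the advice above.
rows = list(csv.DictReader(io.StringIO(sample), delimiter=";"))
total = sum(float(r["Amount"]) for r in rows)
print(len(rows), total)  # 2 223.6
```

    A fixed-width file, by contrast, needs column offsets maintained by hand, which is where import scripting (and its performance cost) tends to creep in.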

  • DW:101 Question - Site folder and file naming - best practices

    OK - my first post! I'm new to DW and fairly new to developing websites (I've done a couple in FrontPage and a couple in SiteGrinder), although I'm not at all new to technical concepts: building PCs, figuring things out, etc.
    For websites, I know I have a lot to learn and I'll do my best to look for answers, RTFM and all that before I post. I even purchased a few months of access to lynda.com for technical reference.
    So no more introduction. I did some research (and I kind of already knew) that for file names and folder names: no spaces, just dashes or underscores, don't start with a number, keep the names short, no special characters.
    I’ve noticed in some of the example sites in the training I’m looking at that some folders start with an underscore and some don’t. And some start with a capital letter and some don’t.
    So the question is - what is the best practice for naming files, and especially folders? And what's the best way to organize the files in the folders? For example, all the .css files in a folder called 'css' or '_css'.
    While I’m asking, are there any other things along the lines of just starting out I should be looking at? (If this is way to general a question, I understand).
    Thanks…
    \Dave
    www.beacondigitalvideo.com
    By the way I built this site from a template – (modified quite a bit) in 2004 with FrontPage. I know it needs a re-design but I have to say, we get about 80% of our video conversion business from this site.

    So the question is - what is the best practice for naming files, and especially folders? And what's the best way to organize the files in the folders? For example, all the .css files in a folder called 'css' or '_css'.
    For me, best practice is always the nomenclature and structure that makes most sense to you, your way of thinking and your workflow.
    Logical and hierarchical always helps me.
    Beyond that:
    Some seem to use _css rather than css because (I guess) those file/folder names rise to the top in an alphabetical sort. Or perhaps they're used to that from a programming environment.
    Some use CamelCase; some use all lowercase or underscores to separate words.
    Some work with CMSes or in team environments which have agreed schemes.

  • File Creation - Best Practice

    Hi,
    I need to create a file daily based on a single query. There's no logic needed.
    Should I just spool the query in a unix script and create the file like that, or should I use a stored procedure with a cursor, utl_file, etc.?
    The first is probably more efficient, but is the latter a cleaner and more maintainable solution?

    I'd be in favour of keeping code inside the database as far as possible. I'm not dismissing scripts at all - they have their place - I just prefer to have all code in one place.
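    To make the trade-off concrete, here is a sketch of the script-style approach in Python (the thread is about Oracle, where the real choices would be a SQL*Plus spool script or a UTL_FILE procedure; sqlite3 and the orders table here are just stand-ins to keep the example runnable):

```python
import csv
import sqlite3
from datetime import date

# Stand-in for the source database and the single extract query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])

# One dated file per day, written straight from the query - no extra logic.
outfile = f"orders_{date.today():%Y%m%d}.csv"
with open(outfile, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "amount"])
    writer.writerows(conn.execute("SELECT id, amount FROM orders ORDER BY id"))
```

    The stored-procedure route keeps the same logic inside the database, which is what the reply argues for; the script route keeps the database free of file-system dependencies.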

  • Storing File Path Names - Indexing problems

    I would like to store the files in the file system and the path in the database. (for example: '/mypath/test.doc')
    The statament for creating the index is the following:
    CREATE INDEX Test.IDX_Dokument ON Test.Dokument(path)
    INDEXTYPE IS CTXSYS.CONTEXT
    PARAMETERS ('DATASTORE CTXSYS.FILE_DATASTORE');
    This works on Windows XP without any problems.
    But on Linux I always get an empty index and the following message:
    SQL> 2 3 4 /opt/oracle/product/9ir2/ctx/bin/ctxhx: relocation error: /opt/oracle/product/9ir2/ctx/lib/libsc_ut.so: undefined symbol: stat
    Index created.
    Any ideas??
    Oracle 9.2.0.1.0, SuSe Enterprise Server 8
    Thank you in advance
    Wolfgang

    Hi,
    In my case I have been able to create the same type of FILE_DATASTORE index, but so far only on files residing on the same server as the database. I haven't been able to index files this way when they reside on another server on a linked system. This is in spite of having an active mapping to the targeted directories on the other server. I can, however, read those files, load them into a BLOB column in my table, and create an index using DIRECT_DATASTORE.
    This indicates that Oracle recognizes the path and that I have the required operating system permissions to read files from those directories; otherwise I wouldn't have been able to copy them into the database.
    Can anyone shed light on this? All the documentation about FILE_DATASTORE, both from Oracle and others, only says that you can index data using the path when the files are on the operating system; it never says whether this works across a network.
    Thanks in advance for any replies.
    Regards
    Nehro

  • File naming best practice

    Hello
    We are building a CMS system that uses BDB XML to store the individual xhtml pages, editorial content, config files etc. A container may contain tens of thousands of relatively small (<20 Kb) files.
    We're trying to weigh up the benefit of meaningful document names such as "about-us.xml" or "my-profile.xml" versus integer/long file names such as 4382 or 5693 without the .xml suffix, to make filename indexing as efficient as possible.
    In both situations the document names remain unique: appending '_1', '_2' etc. where necessary in the former case, and always incrementing by 1 in the latter.
    There is a 'lookup' document that describes the hierarchy and relationships of these files (rather like a site map), so the name of the required document will be known in advance (as a reference in the lookup doc), and we believe that we shouldn't need to index the file names. XQuery will run several lookups, but only based upon the internal structure/content of the documents, not on the document names themselves.
    So is there any compelling reason not to use meaningful names in the container, even if there are > 50,000 documents?
    Thanks, David

    George,
    I was interested in finding out whether document names made of integers would be much more efficient - albeit less intuitive - in the name index than something like 'project_12345.xml'.
    We may need to return all documents of type 'project' so putting the word in the document name seemed like a good idea, but on reflection perhaps we're better off putting that info in the metadata, indexing that and leave the document name as a simple integer/long such as '12345'.
    If so, is it worth rolling my own integer-based counter to uniquely name documents, or am I better off just using the built-in method setGenerateName()? Is there likely to be much of a performance difference?
    Regards, David

  • Photo file management - best practice query

    A number of recent posts have discussed file management protocols, but I wondered if some of the experts on this forum would be so kind as to opine on a proposed system set up to better manage my photo files.
    I have an imac, time machine & various external hard drives.
    I run Aperture 3 on my iMac (my main computer), with about 15k referenced images. Currently my photo masters are kept on my iMac, as are my Aperture library file and vault. After editing in Aperture, I then export the edited JPEGs onto another part of my iMac. The iMac is backed up to Time Machine and an off-site drive.
    Following some of the threads, the main message seems to be to take as many of my photo files as possible off my iMac and store them on a separate drive. So does the following setup sound better?
    *Aperture run on imac, still using referenced images
    *Master images moved from imac to external drive 1
    *Aperture library file moved from imac to external drive 1
    *Aperture vault moved to external drive 1
    *External drive 1 backed up to external drive 2. Run idefrag on both drives regularly.
    *Edited exports from Aperture kept on imac, which then feed Apple TV, iphone, mobileme etc. Backed up to time machine.
    *If ever I ran Aperture on an additional machine, presumably I could just plug and play / synch with external hard drive 1.
    Is that a "good" setup? Any enhancements? The setup would seem to free up my boot volume, while hopefully maintaining the safety / integrity of my files. But happy to be told it is all wrong!!
    Many thanks
    Paul

    Seems to be a good approach. However,
    Depending on how much disk space is on the local drive and the speed of the external, along with the iMac specs... you might keep the library on the iMac instead of the external, but keep the masters on the external. Assuming referenced masters, you could then keep the vault on the external (as is, I wouldn't put the vault on the same external drive as both the masters and the library).

  • BPC 5 - Best practices - Sample data file for Legal Consolidation

    Hi,
    we are following the steps indicated in the SAP BPC Business Practice: http://help.sap.com/bp_bpcv151/html/bpc.htm
    A Legal Consolidation prerequisit is to have the sample data file that we do not have: "Consolidation Finance Data.xls"
    Does anybody have this file or know where to find it?
    Thanks for your time!
    Regards,
    Santiago

    Hi,
    From [https://websmp230.sap-ag.de/sap/bc/bsp/spn/download_basket/download.htm?objid=012002523100012218702007E&action=DL_DIRECT] you can obtain a .zip file for Best Practices, including all scenarios and the csv files (under the misc directory) used in these scenarios.
    Consolidation Finance Data.txt is in there as well.
    Regards,
    ergin ozturk

  • SAP Best Practice for Water Utilities v 1.600

    Hi All,
    I want to install SAP Best Practices for Water Utilities v 1.600. I have downloaded the package (currently only Mat. No. 50079055 "Docu: SAP BP Water Utilities-V1.600" is available) from the Marketplace, but there is NO transport file included in it. It only contains documentation. Should I use the transport file from Best Practices for Utilities v 1.500?
    Thank you,
    Vladimir

    Hello!
    The file should contain eCATTs with data for the Best Practices preconfigured scenarios, and transactions to install them.
    Some information about the preconfigured scenarios can be found here:
    http://help.sap.com/bp_waterutilv1600/index.htm -> Business Information -> Preconfigured Scenarios
    Under the "Installation" path you can find the "Scenario Installation Guide" for Water Utilities.
    I hope it is helpful.
    Vladimir

  • Best practice on storing the .as and .mxml files

    I have some custom components, and they use their own .as
    ActionScript files. The custom components are placed in the
    "src/component" folder right now. Should I place the associated .as
    files in the same "src/component" folder? What is the suggested
    best practice?
    Thanks,

    Not quite following what you mean by "associated .as files",
    but yes, that sounds fine.
    Tracy

  • Best practice for storing user's generated file?

    Hi all,
    I have a web application where the user draws an image in an applet and can then send the image via MMS.
    I wonder what the best practice is for storing the user's image before sending the MMS.
    Message was edited by:
    tomdog

    java.util.prefs

Maybe you are looking for

  • Sharepoint Workflow Access Token Error

    Whenever run a workflow on the SharePoint site it gets stuck on "Started" and throws this error when inspected in the workflow view: Retrying last request. Next attempt scheduled after 07/01/2015 10:58. Details of last request: HTTP  to https://***.s

  • Disk Utility won't erase iMac drive

    I have a late 2012 27" iMac. I was taking it to the Apple Store to fix The Mechanism™ broke. (The tilt hinge).  As I knew fixing it would entail leaving the iMac there for several days, I cloned my drive and erased the iMac. HOWEVER, when trying to d

  • Color issues after update to 10.4.9

    Has anyone discovered color printing (images) after the 10.4.9 update? ColorSync was updated as stated here: http://docs.info.apple.com/article.html?artnum=305214. I personally swear that I have onscreen proofing go bad using Aperture with a particul

  • Inbound IDOC not getting posted in database

    Hello All, I have a similar problem below: http://www.sapfans.com/forums/viewtopic.php?t=59840&highlight=inboundaleconfiguration Currently, I have distributed exchange rates from my sending system to my receiving system. When I check on the IDOC stat

  • If i downloaded a paid app and want to redownload

    i downloaded an app for seven dollars a LONG time ago. i went to the app store to look at it and it said download instead of the price. if i redownload it, will it cost money? will it show up on the bill?