Upload file to a global directory in Dev, QA and Prod!

I have an upload application in BSP that uploads files to, for example, /usr/sap/BWD/files.
This works in Development, but of course this directory is not available in Production, so the BSP won't work there.
Isn't it possible to use one global directory?
Right now somebody created a directory for us that exists on all 3 systems (Dev, QA and Production):
on Development:  DIR_TRANS     /usr/sap/transBW
on Quality:  DIR_TRANS     /usr/sap/transBW
on Production:  DIR_TRANS     /usr/sap/trans
Notice the small difference in the Production path... Is there a way to use DIR_TRANS instead of the hard-coded physical path?
My application writes data like this:
fname = '/usr/sap/CBD/files/FILE.CSV'.
OPEN DATASET fname FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  WRITE: / 'Error opening file'.
ELSE.
  LOOP AT data_tab INTO lin.
    TRANSFER lin TO fname.
  ENDLOOP.
  CLOSE DATASET fname.
ENDIF.
Thanks a lot, points will be awarded for useful answers!

Use transaction FILE to define a logical file name for the physical path on each system, and then resolve it at runtime with function module FILE_GET_NAME:

CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    client           = sy-mandt
    logical_filename = pil_file  "input: logical file name
    operating_system = sy-opsys
  IMPORTING
    file_name        = p_i_file  "output: physical file name
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.
IF sy-subrc = 0.
  OPEN DATASET p_i_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
ENDIF.

Because each system's FILE definition maps the same logical name to its own physical path, the program no longer needs to hard-code /usr/sap/... anywhere.
Regards
Raja

Similar Messages

  • How to Upload files in a directory?

    Can anyone tell me how to upload a file to my root directory using JSP?
    Any examples, links, or scripts?
    I want to upload only a small text file (and how do I restrict the size?)

    Read this tutorial and implement it -
    http://www.javazoom.net/jzservlets/uploadbean/uploadbean.html
    It's easy and effective.
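    Whatever upload library you end up using, the size and type restriction the question asks about can be enforced server-side before accepting the file. A minimal sketch (the limit, the allowed extensions, and the class/method names are illustrative, not part of any particular library):

```java
import java.util.Arrays;
import java.util.List;

public class UploadPolicy {
    static final long MAX_BYTES = 10 * 1024; // e.g. only small text files
    static final List<String> ALLOWED = Arrays.asList("txt", "csv");

    // true if the file name has an allowed extension and the size is in bounds
    static boolean accept(String name, long sizeBytes) {
        int dot = name.lastIndexOf('.');
        if (dot < 0) return false;
        String ext = name.substring(dot + 1).toLowerCase();
        return ALLOWED.contains(ext) && sizeBytes > 0 && sizeBytes <= MAX_BYTES;
    }
}
```

    Run a check like this against the multipart item's reported size before writing anything to disk; rejecting early avoids buffering oversized uploads.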

  • I have read 118 files (from a directory), each with 2088 rows, into a 2-D array with 2 columns and 246384 rows; I want each file in separate columns with 2088 rows

    I have read 118 files from a directory using the list.vi. Each file has 2 columns with 2088 rows. Now I have the data in a 2-D array with 2 columns and 246384 rows (118 files * 2088 rows). However, I want to keep each file in the same array but in separate columns; then I would have 236 columns (2 columns for each of the 118 files) by 2088 rows. Thank you very much in advance.

    Hiya,
    here's a couple of .vi's that might help out. I've taken a minimum-manipulation approach by using Replace Array Subset, and I'm not bothering to strip out the 1-D array before inserting it. Instead I flip the array of filenames, and from this fill in the final 2-D array from the right, overwriting the column that will eventually become the "X" data values (the same for each file) and appear on the right.
    The second .vi is a sub-vi I posted elsewhere on the discussion group to get the number of lines in a file. If you're sure that the number of data points is always going to be 2088, then replace this sub-vi with a constant to speed up the program (by about 2 seconds!).
    I've also updated the .vi to work as a sub-vi if you want it to, complete with error handling.
    Hope this helps.
    S.
    // it takes almost no time to rate an answer
    Attachments:
    read_files_updated.vi ‏81 KB
    Num_Lines_in_File.vi ‏62 KB
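    Outside LabVIEW, the reshaping being described (118 vertically stacked 2088×2 blocks turned into 2088 rows by 236 columns, one column pair per file) can be sketched like this. This is an illustrative Java translation of the idea, not the attached .vi, and the names are made up:

```java
public class ReshapeFiles {
    // stacked: (files * rowsPerFile) x 2, with the file blocks appended vertically.
    // result:  rowsPerFile x (2 * files), each file occupying its own column pair.
    static double[][] reshape(double[][] stacked, int files, int rowsPerFile) {
        double[][] out = new double[rowsPerFile][2 * files];
        for (int f = 0; f < files; f++) {
            for (int r = 0; r < rowsPerFile; r++) {
                out[r][2 * f]     = stacked[f * rowsPerFile + r][0];
                out[r][2 * f + 1] = stacked[f * rowsPerFile + r][1];
            }
        }
        return out;
    }
}
```

    With files = 118 and rowsPerFile = 2088 this turns the 246384×2 array into the desired 2088×236 layout.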

  • Transporting roles from dev to qa and prod servers ep6sp11

    Hi
    We have EP6 SP11 -erp2004
    I will be transporting roles from the dev environment to a QA (quality assurance) and a production portal server.
    Only the roles need to be transported, <b>no</b> user assignments.
    I have read some docs on this and would just like to confirm this is the procedure:
    Basically:
    1. Export the roles to a common directory
    2. Import them in QA and Prod
    3. From here onwards I need help: do I add the roles through delta links? What other steps need to be done?

    Hi Pradeep
    Thanx for your reply.
    With the exception of the delta link assignment (I will assign the roles directly), can I just follow the procedure I have written?
    RD

  • How to deploy Business components from DEV to QA and PROD

    I developed business components in my development database using JDeveloper and would like to port them to QA and then Production. Because the XML files contain all of your constraint names, which will be different in my QA and Prod, how do I make it work? Should I be editing my XML files?

    The constraint names are not used at runtime, only at design time. They would be used if you decided to forward-generate the DDL for your tables.
    Not to worry.

  • PL/SQL package global variable in Dev. 6i and newer....

    Hi,
    In database packages, using a global variable is an appropriate way to keep the value it contains globally, without the danger of it being overridden by another session (each session has its own 'private' section in memory, in the library cache).
    Is the above also true for PL/SQL package global variables kept on the client side (Forms/Reports)?
    Thanks,
    Sim

    db-package variables are global to your session
    forms globals are global to your forms application
    In one case you can get into trouble: when you start a new form via OPEN_FORM (with a new session), your db globals are global only within that new session, while your forms globals stay the same in all forms of the current Forms session.

  • ODI Objects Migration from Dev to Test and Prod Enviornments

    Hi ,
    Can someone please advise the order of the steps that have to be followed for ODI code migration from Development to Test and Prod environments?
    Below are the Details ,
    ODI Version : 11.1.1.6.0
    Development Environment : 1 Master Repository (M1) and 1 Work Repository
    Test Environment : 1 Master Repository (M2) and 1 Execution Repository
    Prod Environment : 1 Master Repository ( M2) and 1 Execution Repository
    Test and Prod environments are on the same master Repository.
    Thank you.

    Thanks for your prompt response !!!
    "If they are not in sync, import models, datastores, topology first & then import interfaces/projects"
    Do we import models and data stores? Since they are execution repositories, we can only import scenarios, right?

  • Dev is ODI and Prod is Sunopsis

    We're in the process of upgrading from Sunopsis 4.1 to ODI. During this time, can I use the ODI Designer to view our Production Master/Repository which has not been upgraded? Or do I have to maintain my Sunopsis Designer until everything is converted to ODI?

    Firstly, I want to say that I haven't done the test.
    But having upgraded several Sunopsis environments to ODI, I think that you won't be able to connect to your Sunopsis repositories with an ODI Designer.
    But I could be wrong.

  • Uploading file with an enforced category

    I am trying to upload files to a directory with an enforced category. I took HandlingEnforcedCategories.java from the Dev Kit and modified it to my requirements, but I am still running into an issue. I think it is something to do with the library/workspace, which I am still confused about.
    My sandbox path is /regionUS/BUSINESS%20SECTOR/ (I am escaping the space or else the API chokes)
    My library/workspace is CUSTOMERS (the library/bookshelf icon shows up here)
    My folder where the category is applied is COOLEZ, which is under CUSTOMERS
    I am trying to upload images and apply the category to /regionUS/BUSINESS%20SECTOR/CUSTOMERS/COOLEZ/IMAGES/2009/04
    All the folders exist and the category exists too, but I am still getting this error:
    "Required metadata folder missing. Ensure prerequisite sample has been run."
    The question is: why would this complain about a missing metadata folder?
    Here are the two methods of interest:
    public static void main(String args[]) {
        try {
            upload("coolac",
                   "coolpw",
                   "http://contentdbserverurl:7779/content/ws",
                   "regionUS/BUSINESS%20SECTOR/",
                   "CUSTOMERS/COOLEZ/IMAGES/2009/04/",
                   "test1.jpg",
                   new FileInputStream(new File("c:/test/test1.jpg")),
                   "DAMAGED_PRODUCT_IMAGES",
                   mapObj);
        } catch (ExelContentDBException e) {
            e.printStackTrace();
            // TODO
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void upload(String username,
                              String password,
                              String contentDBURL,
                              String sandboxRootPath,
                              String workspaceName,
                              String sourceFilename,
                              FileInputStream sourceFileInputStream,
                              String categoryName,
                              Map categoryMap) throws ContentDBException {
        System.out.println("**** : ApplyingCategories: starting");
        FdkCredential credential = new SimpleFdkCredential(username, password);
        ManagersFactory session = null;
        try {
            session = ManagersFactory.login(credential, contentDBURL);
            FileManager fileM = session.getFileManager();
            Item folder = null;
            try {
                Item sandbox = fileM.resolvePath(sandboxRootPath, null);
                // Note: some clients may use the term 'Library' when referring to
                // or displaying a Workspace Item (including the Content DB Web UI).
                // For now the values are hard-coded.
                Item workspace = (sandbox == null) ? null
                    : fileM.resolveRelativePath(sandbox.getId(), "CUSTOMERS", null);
                folder = (workspace == null) ? null
                    : fileM.resolveRelativePath(workspace.getId(), "COOLEZ/IMAGES/2009/04/", null);
            } catch (FdkException e) {
                System.out.println("**** : Error locating metadata folder.");
                throw e;
            }
            if (folder == null) {
                throw new ContentDBException("**** : Required metadata folder missing. Ensure prerequisite sample has been run.");
            }
            CommonManager commonM = session.getCommonManager();
            AttributeRequest[] requestedAttributes = AttributeRequests.ITEM_CATEGORY_CONFIGURATION;
            folder = commonM.getItem(folder.getId(), requestedAttributes);
            Item categoryConfiguration = (Item) CommonUtils.getAttribute(folder, Attributes.CATEGORY_CONFIGURATION);
            Map m = CommonUtils.getAttributesMap(categoryConfiguration);
            boolean configEnabled = ((Boolean) m.get(Attributes.CONFIGURATION_ENABLED)).booleanValue();
            Item[] requiredCategories = (Item[]) m.get(Attributes.REQUIRED_CATEGORIES);
            // note: the original post had "configEnabled = false" here, an assignment instead of a test
            if (!configEnabled ||
                requiredCategories == null ||
                requiredCategories.length != 1) {
                System.out.println("**** : Category configuration settings unexpected. Have the prerequisite samples been run?");
                throw new ContentDBException("Category configuration settings unexpected.");
            }
            requestedAttributes = AttributeRequests.CATEGORY_CLASS_ATTRIBUTES;
            Item catClass = commonM.getItem(requiredCategories[0].getId(), requestedAttributes);
            String displayName = (String) CommonUtils.getAttribute(catClass, Attributes.DISPLAY_NAME);
            if (!displayName.equals(categoryName)) {
                System.out.println("**** : Category configuration settings unexpected. Have the prerequisite samples been run?");
                throw new ContentDBException("Category configuration settings unexpected.");
            }
            // Get the category metadata attribute internal names from the metadata
            // attribute display names; a category instance definition uses internal names.
            String categoryKeyItems[] = new String[categoryMap.size()];
            Set setObj = categoryMap.keySet();
            Iterator iterObj = setObj.iterator();
            int i = 0;
            while (iterObj.hasNext()) {
                categoryKeyItems[i] = (String) iterObj.next();
                i++;
            }
            Map attributeMap = CategoryUtils.getCategoryAttrInternalNames(session,
                catClass.getId(), categoryKeyItems);
            if (attributeMap == null || attributeMap.isEmpty()) {
                System.out.println("**** : Category class structure invalid.");
                throw new ContentDBException("Metadata configuration of the required Category class is invalid. Suggested resolution is to drop and recreate the category class.");
            }
            requestedAttributes = null;
            Item docDef = fileM.createDocumentDefinition(new NamedValue[] {
                ClientUtils.newNamedValue(Attributes.NAME, sourceFilename),
                ClientUtils.newNamedValue(Options.CONTENTSTREAM, sourceFileInputStream)
            }, requestedAttributes);
            // First attempt createDocument in the enforced metadata folder without
            // supplying a category instance definition.
            // If the enforced metadata folder has been set up correctly, an
            // FdkException with error code ORACLE.FDK.AggregateError will occur,
            // containing an FdkExceptionEntry with error code ORACLE.FDK.MetaDataError
            // and detailed error code ORACLE.FDK.MetadataRequired. The info
            // NamedValue[] of the FdkExceptionEntry should have a key
            // ECM.EXCEPTIONINFO.MissingMetaData with a long[] value containing the
            // ids of the mandatory category classes for which definitions were
            // not supplied.
            Item doc = null;
            NamedValue[] documentDef = new NamedValue[] {
                ClientUtils.newNamedValue(Options.USE_SAVED_DEFINITION, new Long(docDef.getId())),
                ClientUtils.newNamedValue(Options.DESTFOLDER, new Long(folder.getId()))
            };
            try {
                // The following call should FAIL; the exception is caught below
                doc = fileM.createDocument(documentDef, null, null);
                // The following calls should never occur ...
                System.out.println("**** : Document create succeeded.");
                System.out.println("**** : Are there no longer required categories on the folder?");
            } catch (FdkException fe) {
                System.out.println("**** : Document create failed (which is what we expected).");
                System.out.println("**** : " + fe);
            }
            // Next, retry createDocument in the enforced metadata folder, this time
            // supplying a category instance definition for the mandatory category
            // class. The advantage of using document definitions is that the content
            // does not need to be re-uploaded; the document definition can be reused
            // when retrying the failed operation.
            NamedValue[] categoryInstanceAttributes = new NamedValue[categoryKeyItems.length];
            for (int x = 0; x < categoryKeyItems.length; x++) {
                categoryInstanceAttributes[x] = (NamedValue) categoryMap.get(categoryKeyItems[x]);
            }
            NamedValue[] categoryDef = new NamedValue[] {
                ClientUtils.newNamedValue(Options.CATEGORY_CLASS_ID, new Long(catClass.getId())),
                ClientUtils.newNamedValue(Options.CATEGORY_DEFINITION_ATTRIBUTES, categoryInstanceAttributes)
            };
            documentDef = new NamedValue[] {
                ClientUtils.newNamedValue(Options.USE_SAVED_DEFINITION, new Long(docDef.getId())),
                ClientUtils.newNamedValue(Options.DESTFOLDER, new Long(folder.getId())),
                ClientUtils.newNamedValue(Options.CATEGORY_DEFINITION, categoryDef)
            };
            System.out.println("**** : Create a Document using saved definition item ...");
            requestedAttributes = AttributeRequests.DOCUMENT_CATEGORY_ATTRIBUTES;
            doc = fileM.createDocument(documentDef, null, requestedAttributes);
        } catch (Throwable t) {
            t.printStackTrace();
        } finally {
            CommonUtils.bestEffortLogout(session);
            System.out.println("**** : ApplyingCategories: done");
        }
    }

    I finally got the code working; it uploads a file with the attached category values. The problem was the internal attribute map that had to be created with CategoryUtils.getCategoryAttrInternalNames.
    I will post the code later with some cleanup. The documentation is not very friendly and the example code is very confusing. I don't have any books on it and I didn't go to any of Oracle's training either.
    Do you guys know of any good books, sites or blogs on development with Content DB?

  • How do I delete a file from a directory

    I want to delete the file I have finished processing in the c:\upload directory, but the f.delete() call is not working. Am I using this command correctly?
    The f.delete() comes after I copy the file to an archive directory.
    Program:
    import java.io.*;
    import java.util.*;
    import java.text.*;
    import java.sql.*;
    public class ReadSource {
    public static void main(String[] args) throws Exception {
         StringTokenizer st1;
              String val1, val3, val4, val5, val9, val10, val11, val12, val13, val14, val16;
              String val2, val6, val7, val8, val15, val17, val18, val19, val20;
              int cnt;
              File f = new File("C://upload" );
              FileWriter outFile = new FileWriter("C://RIFIS/log/logfile.txt", true);
              String filetext = "Starting RIFIS Upload";
              java.util.Date d = new java.util.Date();
              SimpleDateFormat form = new SimpleDateFormat("dd/MMM/yyyy hh:mm:ss");
              String dateString = form.format(d);
         try {
              Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
    Connection conn = DriverManager.getConnection("jdbc:oracle:thin:@xxxxx.xxxx.xxx:1521:xx","xxxxxx","xxxx");
                        Statement st = conn.createStatement();
                        outFile.write(System.getProperty("line.separator"));
                        outFile.write(filetext+" - "+dateString);
                        if (f.isDirectory())
                        { String [] s = f.list();
                        for (int i=0; i<s.length; i++)
                        { outFile.write(System.getProperty("line.separator"));
                              outFile.write("Found file - "+f+"/"+s[i]);
              FileReader file = new FileReader(f+"/"+s[i]);
                                  File inputFile = new File(f+"/"+s[i]);
                                  File outputFile = new File("C://RIFIS/archive/"+s[i]);
                        BufferedReader buff = new BufferedReader(file);
                        boolean eof = false;
                        String val0="";
                        ResultSet rec = st.executeQuery("SELECT landings_hold_batch_seq.nextval FROM dual");
                        while(rec.next())
                        { val0 = rec.getString(1); }
                                  cnt=0;
                        while (!eof)
                        { String line = buff.readLine();
                             if (line == null)
                             { eof = true; }
                             else
                             { //System.out.println(line);
                                       cnt = cnt+1;
                                  st1 = new StringTokenizer(line,",");
                                       val1 = st1.nextToken();
                                       val2 = st1.nextToken();
                                       val3 = st1.nextToken();
                                       val4 = st1.nextToken();
                                       val5 = st1.nextToken();
                                       val6 = st1.nextToken();
                                       val7 = st1.nextToken();
                                       val8 = st1.nextToken();
                                       val9 = st1.nextToken();
                                       val10 = st1.nextToken();
                                       val11 = st1.nextToken();
                                       val12 = st1.nextToken();
                                       val13 = st1.nextToken();
                                       val14 = st1.nextToken();
                                       val15 = st1.nextToken();
                                       val16 = st1.nextToken();
                                       val17 = st1.nextToken();
                                            val18 = st1.nextToken();
                                            val19 = st1.nextToken();
                                            val20 = st1.nextToken();
                                       /*System.out.println("Token 0: " + val0);
                                  System.out.println("Token 1: " + val1);
                                  System.out.println("Token 2: " + val2);
                                       System.out.println("Token 3: " + val3);
                                       System.out.println("Token 4: " + val4);
                                       System.out.println("Token 5: " + val5);
                                       System.out.println("Token 6: " + val6);
                                       System.out.println("Token 7: " + val7);
                                       System.out.println("Token 8: " + val8);
                                       System.out.println("Token 9: " + val9);
                                       System.out.println("Token 10: " + val10);
                                       System.out.println("Token 11: " + val11);
                                       System.out.println("Token 12: " + val12);
                                       System.out.println("Token 13: " + val13);
                                       System.out.println("Token 14: " + val14);
                                       System.out.println("Token 15: " + val15);
                                       System.out.println("Token 16: " + val16);
                                       System.out.println("Token 17: " + val17);
                                            System.out.println("Token 18: " + val18);
                                            System.out.println("Token 19: " + val19);
                                            System.out.println("Token 20: " + val20);*/
                                       st.executeUpdate("INSERT INTO LANDINGS_HOLD (lh_id, lh_batch, supplier_dr_id, supplier_unique_id, supplier_dealer_id, supplier_cf_id, supplier_vessel_id, unload_year, unload_month, unload_day, state_code, county_code, port_code, itis_code, market, grade, reported_quantity, unit_measure, dollars, lh_loaddt, lh_loadlive, purch_year, purch_month, purch_day)" +
                        "VALUES (0,'"+val0+"','"+val1+"',"+val2+",'"+val3+"','"+val4+"','"+val5+"',"+val6+","+val7+","+val8+",'"+val9+"','"+val10+"','"+val11+"','"+val12+"','"+val13+"','"+val14+"',"+val15+",'"+val16+"',"+val17+",SYSDATE,NULL,"+val18+","+val19+","+val20+")");
                              } // end else (line was parsed and inserted)
                         } // end while (!eof)
                              FileReader in = new FileReader(inputFile);
                             FileWriter out = new FileWriter(outputFile);
    int c;
                             while ((c = in.read()) != -1)
                             { out.write((char)c); }
                             in.close();
                             out.close();
                             outFile.write(System.getProperty("line.separator"));
                             outFile.write("Number of records inserted - "+cnt);
                             outFile.write(System.getProperty("line.separator"));
                             outFile.write("Copied upload file to archive directory");
                              buff.close(); // close the reader first; an open handle blocks the delete on Windows
                              new File(f, s[i]).delete(); // delete the processed file; f itself is the directory, so f.delete() fails
                              outFile.write(System.getProperty("line.separator"));
                              outFile.write(f+"/"+s[i]+" - Has been removed from upload directory");
                        outFile.write(System.getProperty("line.separator"));
                        outFile.write("Upload Complete...NO ERRORS");
                        outFile.write(System.getProperty("line.separator"));
                        outFile.write("*************************************************************");
                        outFile.write(System.getProperty("line.separator"));
                        //outFile.flush();
              //outFile.close();
                         } // end for loop over files
                         conn.close();
                         } // end if (f.isDirectory())
                         else
                         { outFile.write("No files to process"); }
          } // end try
          catch(Exception e)
               {    outFile.write(System.getProperty("line.separator"));
                    outFile.write("ALERT....ALERT....ALERT");
                    outFile.write(System.getProperty("line.separator"));
                    outFile.write("Error Occurred in ReadSource.java - RIFIS Upload");
                    outFile.write(System.getProperty("line.separator"));
                    outFile.write("My Error: " + e);
                    outFile.write(System.getProperty("line.separator"));
                    outFile.write("*************************************************************");
               }
          outFile.flush();
          outFile.close();
     } // end main
} // end class ReadSource

    Gave it a try, but the file c:\upload\DS121002.csv did not go away. Here is the location of my delete command:
    st.executeUpdate("INSERT INTO LANDINGS_HOLD (lh_id, lh_batch, supplier_dr_id, supplier_unique_id, supplier_dealer_id, supplier_cf_id, supplier_vessel_id, unload_year, unload_month, unload_day, state_code, county_code, port_code, itis_code, market, grade, reported_quantity, unit_measure, dollars, lh_loaddt, lh_loadlive, purch_year, purch_month, purch_day)" +
                        "VALUES (0,'"+val0+"','"+val1+"',"+val2+",'"+val3+"','"+val4+"','"+val5+"',"+val6+","+val7+","+val8+",'"+val9+"','"+val10+"','"+val11+"','"+val12+"','"+val13+"','"+val14+"',"+val15+",'"+val16+"',"+val17+",SYSDATE,NULL,"+val18+","+val19+","+val20+")");
                             FileReader in = new FileReader(inputFile);
                             FileWriter out = new FileWriter(outputFile);
    int c;
                             while ((c = in.read()) != -1)
                             { out.write((char)c); }
                             in.close();
                             out.close();
                             inputFile.delete();
                             outFile.write(System.getProperty("line.separator"));
    ............
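    A likely reason the delete still fails on Windows is that the FileReader opened on the same file earlier (file/buff) is never closed before inputFile.delete() runs, and File.delete() only returns false rather than throwing, so the failure is silent. A minimal sketch of the close-then-delete order (the class and method names are illustrative, not the poster's program):

```java
import java.io.*;

public class SafeDelete {
    // Copy src to dst, close every stream, then delete src and check the result.
    // File.delete() returns false on failure instead of throwing; on Windows an
    // open handle on the file makes it fail, and so does calling delete() on a
    // non-empty directory (the original code called delete() on the directory f).
    static boolean archiveAndDelete(File src, File dst) throws IOException {
        FileInputStream in = new FileInputStream(src);
        FileOutputStream out = new FileOutputStream(dst);
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } finally {
            in.close();  // close BEFORE delete, or the open handle blocks it
            out.close();
        }
        return src.delete();
    }
}
```

    Checking the boolean return (and logging when it is false) makes this kind of problem visible instead of silent.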

  • File upload: problem with writing uploaded file to server-harddisk

    Hello,
    according to the example in the File Upload tutorial, I tried to save an uploaded file to disk, but it didn't work (and also no messages / exceptions were thrown).
    The Javadoc for the function uploadedFile.write(filename) says that if the file should be written to a directory other than the server's temp directory, the server.policy has to be adjusted.
    The following line is already included in my server.policy:
    permission java.io.FilePermission "<<ALL FILES>>", "read,write";
    So I have two questions:
    1) Where can I find the server's temp directory?
    2) What should the server.policy permission entry look like?

    The File Upload tutorial at http://developers.sun.com/prodtech/javatools/jscreator/learning/tutorials/2/file_upload.html has you save the uploaded file to disk. It also gives this information:
    The server holds the uploaded file in memory unless it exceeds 4096 bytes, in which case the server holds the file contents in a temporary file. You can change this threshold by modifying the sizeThreshold parameter for the UploadFilter filter entry in the web application's web.xml file. For more information on modifying the web.xml file, see the last section in this tutorial, Doing More: Modifying the Maximum File Upload Size.
    In cases where you want to retain the uploaded file, you have three choices:
    * Write the file to a location of your choosing, as shown in this tutorial.
    * Create an UploadedFile property in a managed bean and set it to the component's value before you exit the page (as in the button's action method).
    * Save the file to a database.
    By default, the File Upload component can handle files up to one megabyte in size. You can change the maximum file size by modifying the maxSize parameter for the UploadFilter filter entry in the application's web.xml file, as described in the last section in this tutorial, Doing More: Modifying the Maximum File Upload Size.
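    For reference, the web.xml changes the tutorial describes look roughly like this. The filter-class value and the byte values here are assumptions for illustration; check the UploadFilter entry your IDE actually generated:

```xml
<filter>
    <filter-name>UploadFilter</filter-name>
    <filter-class>com.sun.rave.web.ui.util.UploadFilter</filter-class>
    <init-param>
        <!-- maximum accepted upload size in bytes (default is one megabyte) -->
        <param-name>maxSize</param-name>
        <param-value>5242880</param-value>
    </init-param>
    <init-param>
        <!-- in-memory threshold before the server spills to a temp file -->
        <param-name>sizeThreshold</param-name>
        <param-value>4096</param-value>
    </init-param>
</filter>
```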

  • Photo upload file permissions?

    Hi,
    My website will give users the option to upload photographs which will be displayed on the website.
    Before a user can do this they must register or be logged in to their account.
    I will be using a shared hosting environment.
    My script does check the file size and type and only allows .jpg, .gif and .png files.
    I need to set global permissions. Is there a way of defining a registered user as the owner, so that I only need to apply owner permissions of 700 rather than setting 777?
    My other concern is that the folder I upload the photos to is also the one that I link to from my web pages to display the images. Should I be copying the uploaded images to another folder and linking to that one to display the images?
    Hope I am making sense.
    Thank you in advance for your help and information.

    Hi Rob,
    Just to clarify your helpful comments, there are two points that I am not fully understanding.
    Firstly, yes, I am allowing registered users to upload through the HTTP protocol.
    And yes, the folders being uploaded to will be under website ownership.
    As it will be shared Apache hosting, I need to set read, write and execute permissions to allow the upload script to work, which I have to do using chmod, assigning the most restrictive permissions possible.
    Your comment: "As long as the scripts performing the uploads are within the SAME ACCOUNT..."
    The script is just there within the page: a user registers their details and is then allowed onto the page that uploads information to the database and photos via the upload script; returning users, after log-in is verified, are also allowed on that page. Does that mean they are within the same account as the website owner?
    Your comment: "Why do you feel you need to assign apache permissions to INDIVIDUAL USERS?"
    I wanted to apply permissions to the upload folders, but I thought the 'status' of my users would be like 'general public'. I guess that ties in with the last comment about account ownership: I was thinking that a user, even if registered, would be just like a member of the public, and that for them to be able to use the upload scripts I would need to somehow make that user the 'owner' of the folder, so that I could apply 700 permissions rather than 777. I was trying to find a more restrictive permission level (sorry if I didn't explain it well).
    So am I getting this right (I do hope so)? A user on my website who is using the upload scripts has ownership permissions, so if I set the permissions on my upload folder to 700 it will still allow read, write and execute for the script, and I don't need to set my users' status to 'owner'; they effectively are the 'owner' because they are using the script?
    Thank you for your time and patience. I look forward to your reply and hopefully confirmation that I am now understanding this correctly.
    Best regards 
    Quoted reply (Rob Hecker2, Sat 17 Nov 2012, in "Developing server-side applications in Dreamweaver"):
    "is there a way of defining a registered user as the owner so that I only need apply owner permissions of 700 rather than setting at 777?"
    If users are uploading through the HTTP protocol, then the owner of the folders and files is going to be set to the website ownership. All files and folders will share the same ownership. As long as the scripts performing the uploads are within the same account, there should not be an issue, and you should be able to assign more restrictive permissions than 777. Why do you feel you need to assign apache permissions to individual users? (You can't do that anyway, using HTTP.) It would be pretty easy using sessions and PHP to keep user files separate from each other in unique folders. But if users will use the FTP protocol, the situation would be very different.

  • Regarding uploading file

    I would like to ask about uploading a file from one directory to another.
    I want to upload a .mdb file from a network directory to a local directory.
    Could somebody help me? Thanks

    You could also try UploadBean. It includes working JSP samples.
    http://www.javazoom.net/jzservlets/uploadbean/uploadbean.html
    It also lets you select the underlying multipart parser, such as Commons FileUpload, COS, ...
    Some add-ons (upload progress bar, email notification on upload) are available too.
    Hope it helps.
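    For the simpler case in the original question — a source directory the machine running the code can already see — the transfer is just a file copy. A minimal sketch (Python for illustration; the paths and file name are placeholders):

```python
import os
import shutil
import tempfile

def copy_upload(src, dst_dir):
    """Copy a file (for example an .mdb) from one directory to another,
    preserving timestamps; returns the destination path."""
    os.makedirs(dst_dir, exist_ok=True)
    return shutil.copy2(src, dst_dir)

# Demo with temporary directories standing in for the network and local paths
net = tempfile.mkdtemp()
src = os.path.join(net, "data.mdb")
with open(src, "wb") as f:
    f.write(b"mdb-bytes")
dst = copy_upload(src, tempfile.mkdtemp())
print(os.path.basename(dst))  # -> data.mdb
```

    A multipart-upload library like UploadBean is only needed when the file has to travel over HTTP rather than a mounted path.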

  • Script to open all PST files in a directory and open with Outlook.

    Hi!  I am looking for a script that finds all PST files in a directory, opens each one in Outlook, and writes to a log recording whether it completed correctly.  I'm new to VBScript and just want to see how the script would be written.
    Thank you!!

    Thank you for all the input!  JRV, I went through the repository and couldn't find anything about importing into Outlook.  I will use it for future reference and I appreciate you directing me there.
    Grant, I need to be able to locate all PST files within a directory, not point it at a specific PST.  I also need it written to a log file.  Here is what I have... Don't laugh, I'm very new...
    Const ForAppending = 8

    ' Create the FileSystemObject before using it to open the log
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set objTextFile = objFSO.OpenTextFile("C:\Users\jimmy.nguyen\Desktop\Lucky.txt", ForAppending, True)

    ' Start Outlook and give it time to initialize
    Set objShell = CreateObject("WScript.Shell")
    strCommand = """C:\Program Files (x86)\Microsoft Office\Office12\outlook.exe"""
    Set objExecObject = objShell.Exec(strCommand)
    WScript.Sleep 4000

    Set myOlApp = CreateObject("Outlook.Application")
    Set myNS = myOlApp.GetNamespace("MAPI")

    Sub ShowSubFolders(fFolder)
        For Each objFile In fFolder.Files
            ' Compare extensions in the same case, or "pst" will never match
            If UCase(objFSO.GetExtensionName(objFile.Name)) = "PST" Then
                On Error Resume Next
                myNS.AddStore objFile.Path    ' open this PST in Outlook
                If Err.Number = 0 Then
                    objTextFile.WriteLine "OK: " & objFile.Path
                Else
                    objTextFile.WriteLine "FAILED: " & objFile.Path & " - " & Err.Description
                End If
                On Error GoTo 0
            End If
        Next
        For Each Subfolder In fFolder.SubFolders
            ShowSubFolders Subfolder          ' recurse into subfolders
        Next
    End Sub

    ' Start the search from the directory to scan (change this path)
    ShowSubFolders objFSO.GetFolder("C:\PSTFiles")
    objTextFile.Close

  • Uploading file to the servelet and write directly into database

    I want the client to upload a file to the server (using a servlet for that), and I don't want to store it on the hard disk; instead I want to write it
    into the MS SQL database that I'm using.
    TIA

    Here is a good site for learning how to upload a file using a servlet:
    http://www.servlets.com/cos/index.html
    Once you get the file in the servlet you shouldn't have any problem storing it to a DB.
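    Once the bytes are in hand, storing them is an ordinary BLOB insert. The thread targets MS SQL from a Java servlet; this sqlite sketch (Python, table and column names assumed) just shows the store-bytes-directly idea without touching the disk:

```python
import sqlite3

def store_upload(conn, filename, data):
    """Insert an uploaded file's bytes straight into a BLOB column,
    instead of writing the file to the hard disk first."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS uploads (name TEXT PRIMARY KEY, content BLOB)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO uploads (name, content) VALUES (?, ?)",
        (filename, sqlite3.Binary(data)),
    )
    conn.commit()

# In a servlet you would read the multipart request body; here we fake it
conn = sqlite3.connect(":memory:")
store_upload(conn, "report.mdb", b"\x00\x01binary-payload")
row = conn.execute("SELECT length(content) FROM uploads WHERE name = ?",
                   ("report.mdb",)).fetchone()
print(row[0])  # -> 16
```

    With JDBC the equivalent is a PreparedStatement and setBinaryStream on the upload's InputStream; the parameterized-query shape is the same.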
