FileUpload - Checking Resource data prior to upload

Hi All,
Is there a way of examining the Resource data before a file is uploaded, e.g. to prevent users from uploading files above a certain size or with a certain extension? I can't see how to do this until the file has already been uploaded.
Cheers,
Steve

// For checking the size of the resource, use the following code.
public byte[] getByteArray(InputStream ipStr) {
    // Read the stream in chunks; available() does not reliably report the full length.
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    byte[] chunk = new byte[8192];
    int len;
    try {
        while ((len = ipStr.read(chunk)) != -1) {
            buffer.write(chunk, 0, len);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return buffer.toByteArray();
}

// Getting the resource that the user has browsed
IWDResource resource = element.getResDocumentUploaded();
// Getting the byte array of the selected resource
byte[] arrBIN = getByteArray(resource.read(false));
// Checking if the file size is less than 25 MB
if (arrBIN.length < 25 * 1024 * 1024) {
    // file is within the limit
}
Using this code we can check whether the uploaded file is smaller than 25 MB.
For getting the file extension of the uploaded document, one can use the following code:
public static String getFileExtension(String strResourceName) {
    int iIndex = strResourceName.lastIndexOf(".");
    if (iIndex < 0) {
        return ""; // no extension present
    }
    return strResourceName.substring(iIndex + 1);
}

// Getting the resource that the user has browsed
IWDResource resource = element.getResDocumentUploaded();
String strResource = resource.getResourceName();
String strExtension = getFileExtension(strResource);
One can create another input field which takes the allowed extension, read its value from the context, and compare it with strExtension:

String application = wdContext.currentContextElement().getAcceptedExtension(); // hypothetical context attribute
if (application.equalsIgnoreCase(strExtension)) {
    // extension is acceptable
}
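Putting the two checks together, here is a self-contained sketch of the validation helpers in plain Java, so they can be tested outside Web Dynpro (the IWDResource calls above are assumed to supply the InputStream and the file name; the 25 MB limit and the accepted extension are examples):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class UploadValidator {
    static final int MAX_BYTES = 25 * 1024 * 1024; // 25 MB limit

    // Read the whole stream in chunks; available() is not a reliable total length.
    public static byte[] getByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int len;
        while ((len = in.read(chunk)) != -1) {
            out.write(chunk, 0, len);
        }
        return out.toByteArray();
    }

    // Returns the part after the last dot, or "" if there is no extension.
    public static String getFileExtension(String name) {
        int i = name.lastIndexOf('.');
        return (i < 0) ? "" : name.substring(i + 1);
    }

    public static boolean isAccepted(byte[] data, String name, String allowedExt) {
        return data.length < MAX_BYTES
            && getFileExtension(name).equalsIgnoreCase(allowedExt);
    }

    public static void main(String[] args) throws IOException {
        byte[] data = getByteArray(new ByteArrayInputStream(new byte[]{1, 2, 3}));
        System.out.println(data.length);                           // 3
        System.out.println(getFileExtension("report.PDF"));        // PDF
        System.out.println(isAccepted(data, "report.PDF", "pdf")); // true
    }
}
```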
Regards
Rohan.
Please do provide me points if this helps you. Thank you.
Edited by: rohan Henry on May 27, 2008 11:05 AM

Similar Messages

  • Data is not uploaded from the dso to the cube

    Dear Experts,
    In one of my process chains, the data is not uploaded from the DSO to the cube.
    I have tried to upload the data month-wise as well, but the load still gets stuck after a certain number of records.
    I have also recreated the indexes.
    When I check in DB02, the tablespace is shown as 18491 MB and the used space is 97%.
    Please suggest.

    Hi,
    I didn't get your point about recreating the index before loading.
    Basically, the process should be: Delete index --> Load --> Create index.
    You have mentioned that 97% of the space is in use; please check with the Basis team once.
    Also, check in SM37 whether any job has been running for a long time without progressing, or any job you think you don't need, and kill that job.
    Regards,
    Debjani

  • Data Source to upload plan data for CO_OM_CCA_: CC: Costs and Allocations

    Hi Guru's,
    We have a data source that uploads actual data for CCA (0CO_OM_CCA_1 - Cost Centers: Costs and Allocations). Every time it is a full upload; before loading, the previous load request is deleted.
    One more piece of information: I have checked the BW cube, and plan data is available in the InfoCube for only one cost center; the rest of the cost centers have no plan data. The users are now requesting that plan data be uploaded for the remaining cost centers.
    As per the business requirement, I am not sure whether the same data source will pull the plan data along with the actuals into the BW cube. Is there an alternative with the same data source, or is there a specific data source for uploading Costs and Allocations plan data?
    Please suggest how I can go ahead with this requirement.
    Thanks in advance.
    Ganni

    You can use the same data source to load plan/budget data to the cube for all the cost centers. Regarding the actuals, you can use the 0CO_OM_CCA_9 data source to load the actuals cube, and create a MultiProvider on top of both the plan and actuals cubes for reporting purposes.

  • Need help on dates of photos uploaded in info

    Hi people,
    I have an issue with the storing of my photos. When I first got Adobe Elements, it would store the photos and record the date the photo was taken in the PHOTO INFO. I need this because I am a scrapbooker, and it is useful for my journalling. Over the past few months it has been recording the date the photo was uploaded to the computer as the date TAKEN, which could be days later, giving me incorrect dates for events.
    Can anyone help me get it back to what it should be, the DATE TAKEN?
    Photoshop Elements

    Hi,
    Check this [Link1|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/3004a2d2-0653-2a10-779c-f5562b3fac39] [Link2|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/92914af6-0d01-0010-3081-ded3a41be8f2] & [Link3|https://wiki.sdn.sap.com/wiki/x/dDg ].
    Regards,
    Surjith

  • Data (.dat) download and upload from a PC in background

    Hi All,
    In my R/3 program I have used WS_DOWNLOAD to download a .dat file (after processing) to my local machine, and a similar WS_UPLOAD function module is used in IS-U to populate/update records in the mirror-image table in IS-U (a different system and server).
    The file is created under the default path, with sy-datum plus a variable as its name, after my program is executed in R/3. After that, I am supposed to run the upload program in IS-U to update/populate the data in the table maintained there; a few condition and logic checks run before the data is updated/inserted in IS-U. That program also takes the file directly from the default path with the matching file name (i.e. no user interaction for file and location selection).
    The problem is that the data file does not get downloaded when the program runs in the background, and the upload program does not work in the background either. I have since learned that these two function modules do not run in the background.
    Can anybody provide me with an alternative? These programs are to be scheduled daily, after 12 midnight, in background mode only.
    Awaiting your answers.
    Thanks
    Thanks

    Hi All,
    Finally I was able to fix my problem of transferring data across the two different servers.
    As suggested, I used OPEN DATASET and CLOSE DATASET to put the data file in the respective directory on the application server (both R/3 and IS-U).
    To transfer the data file, we wrote a shell script (xxx.sh) in the same directory on the application server, containing the ftp command, target IP, user, destination location, the PUT command, and the name of the file to be transferred. Then an external command (see transactions SM69 and SM49) was assigned to this shell script in SAP.
    After my program writes the file to the application server with OPEN DATASET, I call the function module SXPG_COMMAND_EXECUTE and pass it the external command name (note: authorization is needed to execute this function module).
    Upon execution it writes the file on the R/3 application server and transfers it to the destination location on the IS-U application server.
    To add, the program now runs very well in the background.
    Regards.
    Message was edited by:
            navin devda

  • SAP IHC PAYMENT ORDER - posting date prior to Current posting date

    Hello folks,
    I have a general question about SAP-IHC module.
    While creating a manual internal payment order using transaction IHC1IP, if I change the "Date Executed" date, the system allows the payment order to be posted in a closed period, i.e. with a posting date prior to the current posting date for the bank area in F9B1. The payment items contained in the IHC payment order are then posted with an old posting date, prior to the current posting date of the bank area. Is this normal?

    Dear,
    Check with settings in OLMR transaction.
    Invoice Block - Set Tolerance Limits [Transaction : OMR6]
    Here you need to add new entries with tolerance key LD and the company code, and set the limits. Please check the documentation available, which reads as below:
    LD: Blanket purchase order time limit exceeded
    The system determines the number of days by which the invoice is outside the planned time interval. If the posting date of the invoice is before the validity period, the system calculates the number of days between the posting date and the start of the validity period. If the posting date of the invoice is after the validity period, the system calculates the number of days between the posting date and the end of the validity period. The system compares the number of days with the absolute upper limit defined.
    Regards,
    Syed Hussain.
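    The day calculation described above for tolerance key LD can be illustrated with a small sketch (Java with java.time, purely as an illustration of the quoted rule; the SAP documentation above is the authoritative source):

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class ToleranceLD {
    // Days outside the validity period, mirroring the LD logic described above:
    // before the period -> days to its start; after -> days past its end; inside -> 0.
    public static long daysOutside(LocalDate posting, LocalDate validFrom, LocalDate validTo) {
        if (posting.isBefore(validFrom)) {
            return ChronoUnit.DAYS.between(posting, validFrom);
        }
        if (posting.isAfter(validTo)) {
            return ChronoUnit.DAYS.between(validTo, posting);
        }
        return 0;
    }

    public static void main(String[] args) {
        LocalDate from = LocalDate.of(2008, 1, 1);
        LocalDate to = LocalDate.of(2008, 6, 30);
        System.out.println(daysOutside(LocalDate.of(2007, 12, 25), from, to)); // 7
        System.out.println(daysOutside(LocalDate.of(2008, 7, 5), from, to));   // 5
        System.out.println(daysOutside(LocalDate.of(2008, 3, 1), from, to));   // 0
    }
}
```

    The system would then compare this day count with the absolute upper limit maintained for tolerance key LD.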

  • Check Box data NOT been collected consistently by Acrobat XI Pro

    Hi folks
    I'm having a real problem with Check Box data not being processed consistently by Acrobat XI Pro. 
    I've got 140 response forms back - and have processed them in one batch or smaller batches of 50 forms.  I can see the various check boxes ticked in the pdf view but on some forms that data does not get exported to the csv or xls.   For other forms it works fine. 
    I've tried various exports, even opening one returned pdf form and exporting that alone.  All with the same result.
    I'm at a loss to the problem. 
    The original form is at:
    http://www.orchardrevival.org.uk/wp-content/uploads/2014/02/Orchard-Inventory-Survey-form-v2-superceded.pdf
    Any pointers would be very welcome. 
    thanks
    Crispin

    Thanks for that George. 
    Corruption:  If the file is corrupted, do you know why Acrobat doesn’t report this?  There was no indication of an error in this respect. 
    Method: I have followed your suggestion. Opened ‘BORD0008checkboxissue.pdf’ in Acrobat ProXI.  Export as .fdf.  Close all files.
    Open a blank form in Acrobat Pro XI.  Import the fdf file. 
    Result;  The newly filled form is still missing checkbox data.  Result file at http://www.orchardrevival.org.uk/?p=798
    [ I also exported as xfdf and xml and csv, and looked at these files in a text editor. Neither had the correct checkbox data exported.  Therefore I think it is clear that Acrobat Pro XI is not exporting checkbox data for this file to any format]
    Does that shed any more light on what might be the problem ?
    thanks
    Crispin
    Background to document:  An outline form was created on Word Mac 2008, and then a pdf generated with the Save As pdf in the Word dialogue (which I think may have been an Mac OS based generator). 
    Then the fillable form was created complete with all 143 fields in Acrobat Pro 8.  Then distributed by email.  So despite the fillable form being created in Acrobat Pro, the PDF Producer is reported as Mac OSX 10.n.n Quartz PDFContext for all forms.  However some of them work, some don’t. Obviously I do not have control over how the forms are filled out there in the wild – even though I’ve asked folk to use Adobe Reader.
    I’ve tried processing results with both Acrobat Pro 8 and XI with similar inconsistent results. (a further interesting issue is that the filename stated in the record is incorrect for that record)

  • How can I check a file which is uploaded from the server

    When uploading an Excel file from the server into an internal table, how can I check whether the data meets the required conditions?
    For example, I want to upload a file whose data is of a packed type with three integer digits and two decimal places. How can I check that in my code?
    thanks,

    Hi Sichen,
    First upload the file, then do your validations and delete the records that don't satisfy your requirements.
    Thanks,
    Vinod.
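    The format check asked about above (a packed value with three integer digits and two decimal places) can be sketched as a pattern match applied after the upload. This is an illustration in Java; in ABAP the same idea would be a pattern/length check or a catchable conversion on each field:

```java
import java.util.regex.Pattern;

public class PackedCheck {
    // Hypothetical rule from the question above: one to three integer
    // digits, a decimal point, and exactly two decimal places.
    private static final Pattern AMOUNT = Pattern.compile("\\d{1,3}\\.\\d{2}");

    public static boolean isValidAmount(String value) {
        return value != null && AMOUNT.matcher(value).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidAmount("123.45")); // true
        System.out.println(isValidAmount("1234.5")); // false
    }
}
```

    Records failing the check can then be deleted from the internal table, as the reply suggests.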

  • Resource Data Found Damaged: Adobe Elements 10 Organizer.app

    I ran DiskWarrior, which "scanned the disk named “Macintosh HD” checking all files and folders for damage and potential compatibility problems."
    1 file had resource data that was found to be damaged.
    File:  “AdobeSWFL.rsrc”
    Detected that the resource header is damaged and cannot be repaired. 
    Location:  “Macintosh HD/Applications/Adobe Elements 10 Organizer.app/Contents/ElementsAutoAnalyzer.app/Contents/Frameworks/AdobeOwl.framework/Versions/A/Resources/AdobeSWFL.bundle/Contents/Resources/”
    I deleted Adobe Elements 10, as well as Adobe Elements 10 Organizer, reinstalled Adobe Elements 10, and ran DiskWarrior again.  Again, the same file had resource data that was found to be damaged.
    Should I be worried about the problem?  Has anyone else found the same damage?  What can be done about it, short of Adobe fixing the problem and issuing an upgrade?

    First of all, I personally don't think it's wise to run pre-emptive strikes with repair utilities. Use them when you actually have a problem.
    If you uninstalled PSE by deleting it, you have a problem. You MUST use the uninstaller in Applications>Adobe Photoshop Elements 10. PSE is not a nice tidy package like most mac apps, and there are bits and bobs all over your hard drive. Try running the uninstaller now, but it may well not work. If it doesn't, post back.
    IMHO, running DW unnecessarily is more likely to cause problems than to fix them. I'm always hearing from enthusiastic DW users about how it narrowly saved them from a major failure of this or that, while people who don't run these utilities don't have these crises at all. It's great when things really have gone wrong, don't get me wrong. It's the very best for fixing serious problems, but OS X just doesn't need all that maintenance and actually runs better without intervention, most of the time.

  • Can the case be changed while uploading the data or after uploading ????

    hi all,
    Can you please help me? Can the case of the data in an itab be changed while the program is running? The data is uploaded to an internal table, and then in a loop over that itab the conditions are evaluated to give the result. The problem is that when the data is entered in lowercase, the last (fallback) condition executes even though the data satisfies an earlier condition, which is not supposed to happen. This is a case-sensitivity problem. Can the case be changed while uploading the data or after uploading?

    this is the itab declaration ..
    data: begin of it_input occurs 0 ,
           tra          like tstc-tcode,
         end of it_input.
    and then, from the uploaded data, the program should check whether each transaction code has any user exits or not...
    here comes the code...
    sort it_input by tra.
    delete adjacent duplicates from it_input  .
    loop at it_input.
               it_itab-sno = sy-tabix.
      select single * from tstc where tcode eq it_input-tra.
    if sy-subrc eq 0.
        select single devclass from tadir into v_devclass
                 where pgmid = 'R3TR'
                        and object = 'PROG'
                        and obj_name = tstc-pgmna.
             if sy-subrc ne 0.
             select single * from trdir where name = tstc-pgmna.
             if trdir-subc eq 'F'.
                select single * from tfdir where pname = tstc-pgmna.
                select single * from enlfdir where funcname =
                tfdir-funcname.
                select single * from tadir where pgmid = 'R3TR'
                                   and object = 'FUGR'
                                   and obj_name eq enlfdir-area.
             move : tadir-devclass to v_devclass.
              endif.
           endif.
           select * from tadir into table jtab
                         where pgmid = 'R3TR'
                           and object = 'SMOD'
                           and devclass = v_devclass.
           if sy-subrc = 0.
            select single * from tstct where sprsl eq sy-langu and
                                            tcode eq it_input-tra.
                      if not jtab[] is initial.
               loop at jtab.
                    select single modtext from modsapt  into str
                         where sprsl = sy-langu and
                                name = jtab-obj_name.
                    it_itab-tra        = it_input-tra.
                    it_itab-i_obj_name = jtab-obj_name.
                    it_itab-i_modtext = str.
                    append it_itab.
                    str = ''.
               endloop.
              endif.
            else.
                    it_itab-tra        = it_input-tra .
                    it_itab-i_obj_name = ' '.
                    it_itab-i_modtext = 'No user Exit exists'.
                     append it_itab.
            endif.
          else.
                    it_itab-tra        = it_input-tra .
                    it_itab-i_obj_name = ' '.
                    it_itab-i_modtext = 'Transaction Code Does Not Exist'.
                     append it_itab.
          endif.
    endloop.
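    To answer the original question: yes, the usual approach is to normalize the case once, right after the upload and before any comparisons (in ABAP, TRANSLATE it_input-tra TO UPPER CASE inside a loop over it_input). The idea, sketched in Java:

```java
public class CaseFix {
    // Normalizing case (and trimming) once after upload makes every
    // later comparison against uppercase transaction codes reliable.
    public static String normalize(String tcode) {
        return tcode.trim().toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(normalize("  se38 "));             // SE38
        System.out.println(normalize("Se38").equals("SE38")); // true
    }
}
```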

  • Bad resource data offset 0

    GraphicConverter (an image processing program) is dumping this message onto my console about 2 of my files:
    bad resource data offset 0 and size -928468624 in file
         I routinely check my file system with DiskUtility, Disk Warrior, etc. for any inconsistencies.  I can open the files fine from Finder; what is a "bad resource data offset" and does it mean anything about these files in my file system that I should be concerned about?

    Here's what I see when I do mdls on one of those files:
    kMDItemContentCreationDate     = 2010-08-25 07:10:09 -0400
    kMDItemContentModificationDate = 2010-08-25 07:10:09 -0400
    kMDItemContentType             = "com.adobe.pdf"
    kMDItemContentTypeTree         = (
        "com.adobe.pdf",
        "public.data",
        "public.item",
        "public.composite-content",
        "public.content"
    )
    kMDItemDisplayName             = "pnas00101-0399.pdf"
    kMDItemEncodingApplications    = (
        "Apex PDFWriter"
    )
    kMDItemFSContentChangeDate     = 2010-08-25 07:10:09 -0400
    kMDItemFSCreationDate          = 2010-08-25 07:10:09 -0400
    kMDItemFSCreatorCode           = ""
    kMDItemFSFinderFlags           = 0
    kMDItemFSHasCustomIcon         = 0
    kMDItemFSInvisible             = 0
    kMDItemFSIsExtensionHidden     = 0
    kMDItemFSIsStationery          = 0
    kMDItemFSLabel                 = 0
    kMDItemFSName                  = "pnas00101-0399.pdf"
    kMDItemFSNodeCount             = 0
    kMDItemFSOwnerGroupID          = 20
    kMDItemFSOwnerUserID           = 501
    kMDItemFSSize                  = 1024494
    kMDItemFSTypeCode              = ""
    kMDItemKind                    = "Adobe PDF document"
    kMDItemLastUsedDate            = 2012-05-24 07:57:45 -0400
    kMDItemNumberOfPages           = 8
    kMDItemPageHeight              = 747.6
    kMDItemPageWidth               = 493.44
    kMDItemSecurityMethod          = "None"
    kMDItemUsedDates               = (
        "2010-08-25 00:00:00 -0400",
        "2012-05-24 00:00:00 -0400"
    )
    kMDItemVersion                 = "1.3"

  • I need to load the reference/check table data into the MDM server - help

    Hi,
      I need to load the reference table/check table data from ECC 5.0 into the FTP server/ports. I am interested only in material and vendor extraction. Is this possible with MDMGX, or do I need to use the old Z-reports to extract the reference table data? Any help on this is appreciated.
    Thanks,
    Daniel.LA

    Hi Daniel,
    You have to use the generic extractor MDMGX to extract reference data (customizing data) from R/3.
    Prerequisite:
    MDMGX must be installed.
    Procedure to use it:
    A) Setup Execution.
    1) Define the object type as per your requirement (standard objects such as customer, vendor, products etc. are provided in the drop-down list).
    2) Define the repository and FTP server name.
    3) Upload ports and check tables.
    4) Maintain ports and check tables.
    B) Execute Generation and Extraction.
    1) Generate the XSD.
    2) Start Execution.
    For each check table you have to create a separate port in the MDM repository.
    The Z-report is an older approach; it can be used, but it need not contain the latest updates to the check tables.
    Reward points if helpful.
    Regards,
    Neethu Joy.

  • iTunes Match: a) if incorrect genre etc. data has been uploaded to iCloud, I cannot delete it or replace the track/data; b) I now have 9k tracks "stuck" in iCloud and cannot delete them no matter what I do. Eh?

    Hi
    I'm having some iTunes Match issues.
    I ran iTunes Match. I've since found, I think:
    a) If incorrect genre etc. data has been uploaded to iCloud, I cannot find how to delete it and re-upload the correct song and data. I am presently assuming that once songs are matched, the metadata (genre etc.) that was uploaded is treated as 'the master' by iTunes.
    Question: if I want to change the genre on a track, what's the correct way to overwrite the data held in iCloud? If I delete the track, I just bring down the old incorrect genres again!
    b) I've deleted all my music tracks from iCloud (select all, delete, Yes to delete tracks from iCloud). I have turned off iTunes Match on my iPad, iPhone, Apple TV, Mac and Windows PC. In desperation I've also deleted the iTunes library files from the Mac and PC. But hours/days later, when I turn iTunes Match on again on the Mac or PC, it tells me I still have 9k tracks in iCloud, all with the old, wrong metadata, ready to mess up my now-corrected files back on my PC!
    I cannot delete them no matter what I do. The number of tracks in the cloud was 12k and is now 9k, so it is reducing, by about 1,500 tracks a day.
    does it really take itunes servers that long to delete the tracks I've asked it to delete, or am i seeing garbage data?
    is there a quicker way of deleting all my itunes match library, songs and data?

    Thanks for the helpful replies.
    My original intention was to 'upgrade' some of my tracks. In those cases, I'd iTunes matched, and the process was completed.
    I later downloaded those new tracks but found during that process, the tracks I was downloading had the old metadata.
    I guess the idea there is to update the local genres, 'update iTunes match' and wait. How long, I don't know but that's the current solution,yes ?
    Thing is I was trying to get my local library updated and, as I thought it, back in order, but was finding my local genre updates were being overwritten. I had a lot of tracks that kept on stating they could be downloaded so I thought the new files hadn't been copied down...
    I think I was trying to get match to do too many things at once.
    Deleting my tracks in the cloud? That was an act of desperation, to start over with all known-good genres locally and re-upload from scratch. I'd read in some forum or other to do that; I guess that was wrong.
    OK, so I will, err:
    A  wait and see how long it takes to delete my tracks in the cloud ie let it complete the process I'd already started.
    B  Correct my local tracks genres
    C  Wait til those changes propagate up to the iCloud ( or whatever the right term is)
    Q1: How long should I wait for A and C to complete?
    Q2: Is it 100% definite that local genre changes will get copied up to the cloud, when I 'update iTunes match'?
    I don't see any animated cloud icon, but I will watch for it!
    Thanks again
    Iain

  • Create a ReportingServices data source and upload to a data connection library

    I have installed Reporting Services 2012 in SharePoint 2013 integrated mode. I need to create a report data source and upload it to a data connection library using C#.
    As far as I know this uses the ReportingService2010 class, but I cannot browse to a report server URL.
    There is a ReportingService2010.asmx file in the 15 hive, though.
    Also it works fine when I manually set the data source.

    Hi,
    The following materials would be helpful:
    Inserting Data Connections into a SharePoint Library
    https://social.technet.microsoft.com/Forums/en-US/df79dce5-fd92-4506-af4e-11127cb0d655/inserting-data-connections-into-a-sharepoint-library?forum=sharepointdevelopment
    Programmatically exporting reports from SQL 2012 Reporting Services
    http://stackoverflow.com/questions/12199995/programmatically-exporting-reports-from-sql-2012-reporting-services
    Report Server Web Service Endpoints
    http://msdn.microsoft.com/en-us/library/ms155398(v=sql.110).aspx
    Best Regards
    Dennis Guo
    TechNet Community Support

  • TS3297 My 2nd generation ipod touch is giving me the following error when I try accessing Itunes or the app store. "Cannot connect to the Store. A secure connection could not be established. Please check your date & time settings"  I am on a secure networ

    My 2nd generation iPod touch gives me the following error when I try accessing iTunes or the App Store: "Cannot connect to the Store. A secure connection could not be established. Please check your date & time settings." I am on a secure network.

    Can't connect to the iTunes Store
    Make sure that the time zone is correct, in addition to the date and time.
