How to get the creation date of a file created on the application server?

Dear experts,
I have some txt files in a particular path on the application server. Suppose the path is \\PMICHSAPLA30\INTERFACE\\BCD\CO\OTC\SALES_RPT\test.txt.
I want to fetch the creation date of this txt file, test.txt. I tried the FM ADS2KIPUPL_GET_FILE_ATTRIBUTES, but it does not give the output I need.
I need to get the file creation date. If I create a file abc.txt today, then for that file I should get today's date as the creation date.
Please suggest.

TYPES: BEGIN OF ty_file,
         line(400) TYPE c,                                      "File names
       END OF ty_file.
This is the structure declaration; declare i_file as an internal table of ty_file and l_v_unixcom as a character field wide enough for the command.
l_v_unixcom would be: ls -l \\PMICHSAPLA30\INTERFACE\\BCD\CO\OTC\SALES_RPT\test.txt
Now call
CALL 'SYSTEM' ID 'COMMAND' FIELD l_v_unixcom
              ID 'TAB'     FIELD i_file[].
and look at the contents of i_file[]; it should have the details. Post the results so we can help further.
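A minimal end-to-end sketch along those lines (the UNIX-style path, the field widths and the parsing of the date columns are assumptions; CALL 'SYSTEM' typically works only on UNIX application servers, and ls -l shows the modification time, which equals the creation time only if the file is never changed after it is written):

  TYPES: BEGIN OF ty_file,
           line(400) TYPE c,
         END OF ty_file.

  DATA: i_file           TYPE STANDARD TABLE OF ty_file,
        ls_file          TYPE ty_file,
        l_v_unixcom(255) TYPE c,
        lv_perm(20)      TYPE c,
        lv_links(10)     TYPE c,
        lv_owner(20)     TYPE c,
        lv_group(20)     TYPE c,
        lv_size(20)      TYPE c,
        lv_month(5)      TYPE c,
        lv_day(5)        TYPE c,
        lv_time(10)      TYPE c,
        lv_rest          TYPE string.

  " hypothetical UNIX path - adjust to the real directory on the application server
  l_v_unixcom = 'ls -l /interface/BCD/CO/OTC/SALES_RPT/test.txt'.

  CALL 'SYSTEM' ID 'COMMAND' FIELD l_v_unixcom
                ID 'TAB'     FIELD i_file[].

  LOOP AT i_file INTO ls_file.
    " typical ls -l line: -rw-r--r-- 1 adm sapsys 1234 Jan  5 10:12 test.txt
    CONDENSE ls_file-line.                     " collapse repeated blanks before splitting
    SPLIT ls_file-line AT space INTO lv_perm lv_links lv_owner lv_group
                                     lv_size lv_month lv_day lv_time lv_rest.
    WRITE: / 'Date columns:', lv_month, lv_day, lv_time.
  ENDLOOP.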

Similar Messages

  • How to get creation date from Fuego.Papi.Instance

    I got all process instances from papi. The instances are Fuego.Papi.Instance.
    I also need the creation date of the process instance, but I cannot find it in Fuego.Papi.Instance.
    How to get the creation date from Fuego.Papi.Instance?
    Thanks,

    Yes, I know Fuego.Lib.ProcessInstance can provide the creation date.
    But my code is not in that process. I am using PAPI to get the process instances, which returns Fuego.Papi.Instance objects. Is there a way to get a Fuego.Lib.ProcessInstance from a Fuego.Papi.Instance?
    My code is like this:
    businessProcess = ProcessService.getProcess(process : "processName");
    instances = businessProcess.getInstancesByFilter(filter : instFilter);
    //then loop each instance and I need the creation date

  • Regarding reading data from a file in the application server.

    Hello Everyone,
    My question is:
    The file on the application server consists of data with a header, detail and trailer, of which the detail contains the main information. Each detail record comes as one continuous string, followed by some spaces, and then more data belonging to the same record. I need to split the data into the target internal table so that the first few characters go into field-1, and then take the spaces into account when filling field-2. How do I write the SPLIT statement when part of the record is a single continuous string without spaces and the rest follows after some spaces?
    Your help is very much needed. Thanks to all the experts in advance.

    Hi,
    this is the sample code I used for a similar requirement.
    DATA: single_line TYPE string.
    v_file_listings = pa_filn1.
    IF v_file_listings IS INITIAL.
      MESSAGE e039 WITH v_file_listings.
    ENDIF.
    "-- read file, split lines into fields and put data into table
    OPEN DATASET v_file_listings FOR INPUT IN TEXT MODE ENCODING NON-UNICODE. "Opening the file
    IF sy-subrc EQ 0.
      DO.
        READ DATASET v_file_listings INTO single_line. "Reading one line of the file
        IF sy-subrc = 0.
          IF sy-index > 1. "skip header line
            SPLIT single_line AT k_split INTO "Split the line into the work area
              wa_listings-kschl      " Condition type
              wa_listings-tabname16  " Condition table name
              wa_listings-vkorg      " Sales organisation
              wa_listings-kunnr      " Sold-to party number or ship-to party number
              wa_listings-matnr      " Material number
              wa_listings-kodatab    " Valid-from date
              wa_listings-kodatb1.   " Valid-to date
            APPEND wa_listings TO itab_listings. "Appending work area to internal table
          ENDIF.
        ELSE.
          EXIT.
        ENDIF.
        v_count1 = sy-index. "number of lines read so far
      ENDDO.
      CLOSE DATASET v_file_listings.
    ENDIF.
    Regards,
    Sreeram
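    If part of each detail record sits at fixed positions rather than being delimited, offset/length access on the line can be used instead of (or in addition to) SPLIT. A small sketch building on the READ DATASET loop above; the field names, positions and lengths are invented for illustration:
      DATA: lv_field1(10) TYPE c,   " assumed: first 10 characters of the record
            lv_field2(8)  TYPE c.   " assumed: 8 characters starting after a 5-character gap
      " inside the loop, once single_line has been filled (and is long enough):
      lv_field1 = single_line+0(10).
      lv_field2 = single_line+15(8).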

  • Missing CRLF in file created on application server

    Hello,
    We have a requirement to create a file on the Application Server (AS/400) with CRLF at the end of every record. The records are stored in this structure:
    - OUTPUT: CHAR655, text to export;
    - LENGTH: NUMC4, length of the record (text to export).
    To create the file I have the following code:
      open dataset lv_filename
        for output in text mode
        encoding default
        with windows linefeed.
      loop at pt_output into ls_output.
        try.
            clear lv_length.
            lv_length = ls_output-length.
            transfer ls_output-output to lv_filename
              length lv_length.
          catch cx_root into gv_oref.
    *    TODO
        endtry.
      endloop.
    I have two problems:
    1. When I open the file in Notepad++ the records should have 500 characters (lv_length = 500), but they have 502;
    2. Only LF appears, not CRLF.
    Can anyone help me please?
    Regards.

    Hello,
    Strangest thing, when I test it with 255 characters it works fine (CR+LF):
            clear lv_string.
            ls_output-output+254(1) = cl_abap_char_utilities=>cr_lf.
            lv_string = ls_output-output+0(255).
            transfer lv_string to lv_filename
              length 255.
    When I try with 256 characters it doesn't work, at the end of each record I can only find LF:
            clear lv_string.
            ls_output-output+255(1) = cl_abap_char_utilities=>cr_lf.
            lv_string = ls_output-output+0(256).
            transfer lv_string to lv_filename
              length 256.
    Don't understand why...
    Regards.
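    One possible workaround (an untested sketch, not the confirmed cause of the behaviour above): append CR+LF to the record yourself and write in LEGACY BINARY MODE, which adds no line ending of its own, so the bytes 0D 0A land in the file exactly once per record. The variable names reuse the ones above; RESPECTING BLANKS keeps the trailing spaces of the fixed-length record:
      DATA lv_rec TYPE string.
      OPEN DATASET lv_filename FOR OUTPUT IN LEGACY BINARY MODE.
      LOOP AT pt_output INTO ls_output.
        lv_length = ls_output-length.
        CONCATENATE ls_output-output+0(lv_length)
                    cl_abap_char_utilities=>cr_lf
                    INTO lv_rec RESPECTING BLANKS.
        TRANSFER lv_rec TO lv_filename.
      ENDLOOP.
      CLOSE DATASET lv_filename.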

  • How to find creation date of a file in RHEL 5

    We have instances running RHEL 5.10 in our setup. The file system type shown by
    df -T
    is ext3. The ls -ltr command seems to show the modification time but not the creation date.
    I hope my question is clear: how to find the creation date of a file in Red Hat Linux.
    Requesting a reply to my query.
    Regards

    Hi,
    I agree with Dude.
    You can also check the find command; it has many options, such as:
    -atime n
                  File was last accessed n*24 hours ago.
    -cmin n
                  File's status was last changed n minutes ago.
    -ctime n
                  File's status was last changed n*24 hours ago. See the comments for -atime to understand how
                  rounding affects the interpretation of file status change times.
    -mtime n
                  File's data was last modified n*24 hours ago. See the comments for -atime to understand how
                  rounding affects the interpretation of file modification times.
    Best regards

  • How to get creation date of a released transport??

    Hi Guys!!
    I need to get the creation date of a released transport. Before the transport is released, this date is stored in table E070CREATE; but once it is released, the record is removed from this table. Does anyone know of a table in SAP where I can still pick up the creation date?
    Any help will be appreciated.

    Hi Sandip,
    The creation date is overwritten in the same field E070-AS4DATE, along with the time E070-AS4TIME and the status E070-TRSTATUS, in table E070.
    Hope this helps
    Thanks
    Lakshman
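    A minimal sketch of reading those fields for a given request (the transport number is a made-up example; as noted above, after release the date in E070-AS4DATE reflects the release, not the original creation):
      DATA: lv_date TYPE e070-as4date,
            lv_time TYPE e070-as4time.

      SELECT SINGLE as4date as4time
        FROM e070
        INTO (lv_date, lv_time)
        WHERE trkorr = 'DEVK900123'.   " hypothetical transport request number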

  • How to save 4 versions of a file in the application server

    Hi,
       I have to write a program which will extract some data from SAP and write it to a file on the SAP application server. This program will be run once every month.
    I need to keep 4 versions of the file: the current one and the 3 prior ones. The oldest version should be cycled out once a new version has been created.
    Could anyone help me how to do this?
    Thank you!
    Regards,
    Sunitha.

    Hi Sunitha,
    The best way is to write a Unix script on the server to do this job for you. Once the file has been written on the server, leave the rotation to the Unix side.
    I think this is the best option. Writing a script is not a big task. You can call the script from ABAP or from an external daemon on the Unix server. We have many such programs in our org.
    Hope this helps
    Cheers
    VJ
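    If it has to stay inside ABAP, one way is to shift the generations by copying the files dataset-to-dataset before writing the new extract. A rough sketch; the base file name and the text-mode handling are assumptions:
      DATA: lv_src      TYPE string,
            lv_dst      TYPE string,
            lv_line     TYPE string,
            lv_idx      TYPE i,
            lv_c_src(1) TYPE n,
            lv_c_dst(1) TYPE n.
      CONSTANTS lc_base TYPE string VALUE '/interface/sales_extract_v'. " hypothetical base name

      " Shift generations: v3 -> v4, v2 -> v3, v1 -> v2 (v4, the oldest, gets overwritten)
      lv_idx = 3.
      WHILE lv_idx >= 1.
        lv_c_src = lv_idx.
        lv_c_dst = lv_idx + 1.
        CONCATENATE lc_base lv_c_src INTO lv_src.
        CONCATENATE lc_base lv_c_dst INTO lv_dst.
        OPEN DATASET lv_src FOR INPUT IN TEXT MODE ENCODING DEFAULT.
        IF sy-subrc = 0.
          OPEN DATASET lv_dst FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
          DO.
            READ DATASET lv_src INTO lv_line.
            IF sy-subrc <> 0.
              EXIT.
            ENDIF.
            TRANSFER lv_line TO lv_dst.
          ENDDO.
          CLOSE DATASET lv_dst.
          CLOSE DATASET lv_src.
        ENDIF.
        lv_idx = lv_idx - 1.
      ENDWHILE.
      " ...then write the current month's extract to /interface/sales_extract_v1.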

  • How to get PDF generated by adobe forms on to Application Server

    I have a program which gathers data into an internal table. The program then calls function FP_JOB_OPEN (with relevant outputparams). Then it calls the function module name, of the Adobe Form, passing the internal table across. Then the job is closed.
    This all works fine and we end up with a PDF file on the spool (type ADSP).
    The requirement is, within the program, instead of outputting to the spool, to output the PDF document onto the Application server.
    Any help greatly appreciated.

    If the PDF spool is already generated, then
    you may need to read the PDF data, perhaps using 'RSPO_RETURN_SPOOLJOB'.
    Once you have the spool data in an internal table, use OPEN DATASET and CLOSE DATASET to write it out; see if that works.
    Cheers,
    VB
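    An alternative sketch that bypasses the spool altogether: ask the form runtime to hand the PDF back and write it in binary mode. The target path is a placeholder, and the generated form function module call is only indicated, not named:
      DATA: ls_outputparams TYPE sfpoutputparams,
            ls_formoutput   TYPE fpformoutput,
            lv_file         TYPE string VALUE '/interface/output/form.pdf'. " hypothetical path

      ls_outputparams-getpdf = 'X'.   " return the PDF to the program instead of spooling it

      CALL FUNCTION 'FP_JOB_OPEN'
        CHANGING
          ie_outputparams = ls_outputparams.

      " CALL FUNCTION '<generated form function module>'
      "   EXPORTING  ...the internal table...
      "   IMPORTING  /1bcdwb/formoutput = ls_formoutput.

      CALL FUNCTION 'FP_JOB_CLOSE'.

      OPEN DATASET lv_file FOR OUTPUT IN BINARY MODE.
      TRANSFER ls_formoutput-pdf TO lv_file.
      CLOSE DATASET lv_file.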

  • Getting the size of a file on the application server ?

    Hi everyone!
    I'm trying to transfer a file with the function module FTP_R3_TO_SERVER...
    But for this, I need to know the size of the file I wish to transfer...
    I found nothing about this, but if anyone has any idea, it would be really helpful...
    Thanks.

    Hi,
    Why don't you use a shell script for FTPing your file?
    You can execute it via ABAP and it will probably do the same,
    but then you don't have to know how big the file is.
    I can think of a few other options:
    1) Check the F1 help on READ DATASET. It has the addition LENGTH, which you should be able to use for this purpose.
    2) Call function SXPG_COMMAND_EXECUTE for command LIST_DB2DUMP. Parse the returned table. You may need to explicitly pass 'UNIX' or 'Windows NT' to the OPERATINGSYSTEM parameter.
    3) CALL 'SYSTEM' ID 'COMMAND' FIELD 'ls -l' ID 'TAB' FIELD result-sys.
    4) Dissect AL11 and make use of the same C-calls.
    There are probably more...
    If you are sending the data from the presentation server (SAPFTP), use
    DESCRIBE TABLE itab LINES lin.
    size = lin * width of an internal table row.
    If you are sending the data from the application server, read the application server file into an internal table in your program, calculate the size as above, and pass that size to the FTP_R3_TO_SERVER function module.
    Cheers,
    Chandra Sekhar.
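    A small sketch of determining the size of an application server file by reading it in binary mode; the path is a placeholder:
      DATA: lv_file TYPE string VALUE '/tmp/test.txt',   " hypothetical path
            lv_data TYPE xstring,
            lv_size TYPE i.

      OPEN DATASET lv_file FOR INPUT IN BINARY MODE.
      IF sy-subrc = 0.
        READ DATASET lv_file INTO lv_data.   " reads the rest of the file into the xstring
        CLOSE DATASET lv_file.
        lv_size = xstrlen( lv_data ).        " size in bytes (the whole file is held in memory)
        WRITE: / 'File size in bytes:', lv_size.
      ENDIF.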

  • How to get the data from a file and insert into a table

    Good morning,
    I need to read the file sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv, which contains the following information:
    NUID,NUMERO_DE_CUENTA_CONTRATO,CÓDIGO_DANE_DEPARTAMENTO,CÓDIGO_DANE_MUNICIPIO,ZONA_IGAC,SECTOR_IGAC,MANZANA_O_VEREDA_IGAC,NÚMERO_DEL_PREDIO_IGAC,CONDICION_DE_PROPIEDAD_DEL_PREDIO_IGAC,DIRECCIÓN_DEL_PREDIO,NÚMERO_DE_FACTURA,FECHA_DE_EXPEDICIÓN_DE_LA_FACTURA,FECHA_DE_INICIO_DEL_PERÍODO_DE_FACTURACIÓN,DIAS_FACTURADOS,CÓDIGO_CLASE_DE_USO,UNIDADES_MULTIUSUARIO_RESIDENCIAL,UNIDADES_MULTIUSUARIO_NO_RESIDENCIAL,HOGAR_COMUNITARIO_O_SUSTITUTO,USUARIO_FACTURADO_CON_AFORO,USUARIO_CUENTA_CON_CARACTERIZACIÓN,CARGO_FIJO,CARGO_POR_VERTIMIENTO_BASICO,CARGO_POR_VERTIMIENTO_COMPLEMENTARIO,CARGO_POR_VERTIMIENTOSUNTUARIO,CMT,VERTIMIENTO_DEL_PERIOD_EN_METROS_CUBICOS,VALOR_FACTURADO_POR_VERTIDO,VALOR_DEL_SUBSIDIO,VALOR_DE_LA_CONTRIBUCIÓN,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_CARGO_FIJO,FACTOR_DE_SUBSIDIO_O_CONTRIBUCIÓN_VERTIMIENTO,CARGOS_POR_CONEXIÓN,PAGO_ANTICIPADO_DEL_SERVICIO,DÍAS_DE_MORA,VALOR_DE_MORA,INTERESES_POR_MORA,OTROS_COBROS,CAUSAL_DE_REFACTURACIÓN,NUMERO_DE_LA_FACTURA_OBJETO_DE_REFACTURACIÓN,VALOR_TOTAL_FACTURADO,PAGOS_DEL_CLIENTE_DURANTE_EL_PERÍODO_FACTURADO
    242602,242602,76,845,99,99,9999,9999,999,CLL 5 CRA 7 PEATONAL,24911920,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000005,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000000000.00
    242604,242604,76,845,99,99,9999,9999,999,CRA 4 # 6 - 13,24911846,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000013,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000004411.00
    242605,242605,76,845,99,99,9999,9999,999,CRA 2 CLLES 3 Y 4,24911509,12-01-2011,01-12-2010,30,01,,,0,0,0,1,0000000000.00,0000000000.00,0000000000.00,0000000000.00,0000000004,0000002200.00,0000000000,0000000000,0.000,0.000,0000000000.00,0000000000.00,0,0000000000.00,0000000000.00,0000000000.00,0,0,0000002201.00,0000002200.00
    this is the function that I have
    <<function_test>>
    DECLARE
    TOTAL_CAR NUMBER;
    POS_1 NUMBER:= 0;
    POS_2 NUMBER:= 0;
    REST NUMBER:= 0;
    ACUM NUMBER:= 0;
    CADEN VARCHAR2(200);
    nom_archivo varchar2(80);
    v1 utl_file.file_type;
    v2 varchar2(2048);
    BEGIN
         nom_archivo := 'sui_facturacion_alcantarillado_15085_2011_01_76845_00A.csv';
         v1:= utl_file.fopen('PUBLIC_ACCESS',nom_archivo,'R',32767);
         utl_file.get_line(v1,v2);
         SELECT LENGTH(v2) INTO TOTAL_CAR FROM DUAL;
         ACUM:=1;
         POS_1:=0;
         WHILE ACUM <= 60
         LOOP
              select instr(v2, ',', 1, ACUM) PRUEBA
              INTO      POS_2
              FROM      DUAL;
              dbms_output.put_line(' TOTAL POSICION 1--> '|| POS_1);
              dbms_output.put_line(' TOTAL POSICION 2--> '|| POS_2);
              dbms_output.put_line(' TOTAL ACUMULADO --> '|| ACUM);
              REST := (POS_2-POS_1)-1;
              SELECT SUBSTR(v2,(POS_1+1),REST) PRUEBA2
                   INTO CADEN
              FROM      DUAL;     
              dbms_output.put_line(' CADENA SELECCIONADA --> '|| CADEN);
              ACUM := ACUM + 1;
              POS_1:= POS_2;
         END LOOP;
         utl_file.fclose(v1);
         dbms_output.put_line(' -->');
         dbms_output.put_line(' TOTAL POSICION 1-->'|| POS_1);
         dbms_output.put_line(' TOTAL POSICION 2-->'|| POS_2);
         dbms_output.put_line(' TOTAL ACUMULADO -->'|| ACUM);
         dbms_output.put_line(' TOTAL DE CARACTERES -->'|| TOTAL_CAR);
         dbms_output.put_line(' ');     
         EXCEPTION
              WHEN NO_DATA_FOUND THEN
                   dbms_output.put_line('NO SE ENCONTRARON MAS CARACTERES');
              WHEN OTHERS THEN
                   dbms_output.put_line('OTRO TIPO DE ERROR ');
                   dbms_output.put_line('CODIGO ERROR '|| SQLCODE ||' '||SQLERRM);
    END;
    The fields are separated by commas and need to be inserted into a table. The function above only reads the first line, which I do not need.
    I do not need the information from row number 1 (the header).
    I need the information from rows 2, 3 and 4. Each row has 41 fields, which I want to insert into a table called Dato_archivos.
    How can I change the function to do this?
    I appreciate the cooperation and explanation...
    GOOD DAY...
    REYNEL SALAZAR MARTINEZ
    COLOMBIA...

    When you get an error with external tables (or sql*loader) look in the same folder as the data file and you should get a .log file and maybe a .bad file too.
    The log file should indicate the nature of the error it has trying to load the data.
    I've just copied your sample data from your first post to a file on my server and tried it to find that you are not specifying the required format for your dates. The below shows it now working...
    CREATE TABLE tabla_prueba
      (NUID NUMBER,
       NUM_CUENTA_CONTRATO NUMBER,
       COD_DANE_DD NUMBER,
       COD_DANE_MM NUMBER,
       ZONA_IGAC NUMBER,
       SECTOR_IGAC NUMBER,
       MANZANA_VEREDA_IGAC NUMBER,
       NUM_PREDIO_IGAC NUMBER,
       CONDICION_PREDIO_IGAC NUMBER,
       DIRECCION_PREDIO_IGAC VARCHAR2(80),
       NUM_FACTURA NUMBER,
       FECHA_EXPED_FACTURA DATE,
       FECHA_INI_PERIODO_FACTURACION DATE,
       DIAS_FACTURADOS NUMBER,
       COD_CLASE_USO NUMBER,
       UNI_MULTIUSUARIO_RESIDENCIAL NUMBER,
       UNI_MULTIUSUARIO_NORESIDENCIAL NUMBER,
       HOGAR_COMUNITARIO NUMBER,
       USUARIO_FACTURADO_AFORO NUMBER,
       USUARIO_CON_CARACTERIZACION NUMBER,
       CARGO_FIJO NUMBER,
       CARGO_VERTIMENTO_BAS NUMBER,
       CARGO_VERTIMENTO_COMP NUMBER,
       CARGO_VERTIMENTO_SUNT NUMBER,
       CMT NUMBER,
       VLR_FACTURADO_VERTIDO NUMBER,
       VLR_SUBSIDIO NUMBER,
       VLR_CONTRIBUCCION NUMBER,
       FACTOR_SUBS_CONTR_CARGO_FIJO NUMBER,
       FACTOR_SUBS_CONTR_VERTIMENTO NUMBER,
       CARGO_CONEXION NUMBER,
       PAGO_ANTICIPADO_SERVICIO NUMBER,
       DIAS_MORA NUMBER,
       VLR_MORA NUMBER,
       INTERES_MORA NUMBER,
       OTROS_COBROS NUMBER,
       CAUSAL_REFACTURACION NUMBER,
       NUM_FACTURA_OBJ_REFACTURACION NUMBER,
       VLR_TOTAL_FACTURADO NUMBER,
       PAGOS_CLIENTE_DURANTE_PERIODO NUMBER
      )
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY TEST_DIR
       ACCESS PARAMETERS
        (RECORDS DELIMITED BY NEWLINE
         SKIP 1
         FIELDS TERMINATED BY ","
         OPTIONALLY ENCLOSED BY '"'
          (NUID,
           NUM_CUENTA_CONTRATO,
           COD_DANE_DD,
           COD_DANE_MM,
           ZONA_IGAC,
           SECTOR_IGAC,
           MANZANA_VEREDA_IGAC,
           NUM_PREDIO_IGAC,
           CONDICION_PREDIO_IGAC,
           DIRECCION_PREDIO_IGAC,
           NUM_FACTURA,
           FECHA_EXPED_FACTURA CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           FECHA_INI_PERIODO_FACTURACION CHAR DATE_FORMAT DATE MASK "DD-MM-YYYY",
           DIAS_FACTURADOS,
           COD_CLASE_USO,
           UNI_MULTIUSUARIO_RESIDENCIAL ,
           UNI_MULTIUSUARIO_NORESIDENCIAL,
           HOGAR_COMUNITARIO ,
           USUARIO_FACTURADO_AFORO ,
           USUARIO_CON_CARACTERIZACION ,
           CARGO_FIJO ,
           CARGO_VERTIMENTO_BAS ,
           CARGO_VERTIMENTO_COMP,
           CARGO_VERTIMENTO_SUNT,
           CMT,
           VLR_FACTURADO_VERTIDO,
           VLR_SUBSIDIO ,
           VLR_CONTRIBUCCION ,
           FACTOR_SUBS_CONTR_CARGO_FIJO ,
           FACTOR_SUBS_CONTR_VERTIMENTO ,
           CARGO_CONEXION ,
           PAGO_ANTICIPADO_SERVICIO ,
           DIAS_MORA ,
           VLR_MORA ,
           INTERES_MORA ,
           OTROS_COBROS ,
           CAUSAL_REFACTURACION ,
           NUM_FACTURA_OBJ_REFACTURACION,
           VLR_TOTAL_FACTURADO,
            PAGOS_CLIENTE_DURANTE_PERIODO
           )
         )
        LOCATION ('test.csv')
       );
    SQL> select * from tabla_prueba;
          NUID NUM_CUENTA_CONTRATO COD_DANE_DD COD_DANE_MM  ZONA_IGAC SECTOR_IGAC MANZANA_VEREDA_IGAC NUM_PREDIO_IGAC CONDICION_PREDIO_IGAC DIRECCION_PREDIO_IGAC                                                    NUM_FACTURA FECHA_EXPE FECHA_INI_
    DIAS_FACTURADOS COD_CLASE_USO UNI_MULTIUSUARIO_RESIDENCIAL UNI_MULTIUSUARIO_NORESIDENCIAL HOGAR_COMUNITARIO USUARIO_FACTURADO_AFORO USUARIO_CON_CARACTERIZACION CARGO_FIJO CARGO_VERTIMENTO_BAS CARGO_VERTIMENTO_COMP CARGO_VERTIMENTO_SUNT        CMT
    VLR_FACTURADO_VERTIDO VLR_SUBSIDIO VLR_CONTRIBUCCION FACTOR_SUBS_CONTR_CARGO_FIJO FACTOR_SUBS_CONTR_VERTIMENTO CARGO_CONEXION PAGO_ANTICIPADO_SERVICIO  DIAS_MORA   VLR_MORA INTERES_MORA OTROS_COBROS CAUSAL_REFACTURACION NUM_FACTURA_OBJ_REFACTURACION
    VLR_TOTAL_FACTURADO PAGOS_CLIENTE_DURANTE_PERIODO
        242602              242602          76         845         99          99                9999         9999              999 CLL 5 CRA 7 PEATONAL                                                         24911920 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        5         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242604              242604          76         845         99          99                9999         9999              999 CRA 4 # 6 - 13                                                               24911846 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                       13         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
        242605              242605          76         845         99          99                9999         9999              999 CRA 2 CLLES 3 Y 4                                                            24911509 12-01-2011 01-12-2010
                 30             1                                                                             0               0                           0          1                    0                     0                     0          0
                        4         2200                 0                            0                            0              0                        0          0          0            0            0                    0                             0
                      0                          2201
    SQL>

  • How to get Remote Data from a file which is in remote system

    Hi everybody,
    I have developed four classes.
    FileInterface.java:-
    import java.rmi.Remote;
    import java.rmi.RemoteException;

    public interface FileInterface extends Remote {
        public byte[] downloadFile(String fileName) throws RemoteException;
    }

    FileServer.java:-
    import java.io.*;
    import java.rmi.*;

    public class FileServer {
        public static void main(String argv[]) {
            if (System.getSecurityManager() == null) {
                System.setSecurityManager(new RMISecurityManager());
            }
            try {
                FileInterface fi = new FileImpl("FileServer");
                Naming.rebind("//10.161.15.219/FileServer", fi);
            } catch (Exception e) {
                System.out.println("FileServer: " + e.getMessage());
                e.printStackTrace();
            }
        }
    }

    FileImpl.java:-
    import java.io.*;
    import java.rmi.*;
    import java.rmi.server.UnicastRemoteObject;

    public class FileImpl extends UnicastRemoteObject implements FileInterface {
        private String name;

        public FileImpl(String s) throws RemoteException {
            super();
            name = s;
        }

        public byte[] downloadFile(String fileName) {
            try {
                File file = new File(fileName);
                byte buffer[] = new byte[(int) file.length()];
                BufferedInputStream input = new BufferedInputStream(new FileInputStream(fileName));
                input.read(buffer, 0, buffer.length);
                input.close();
                return (buffer);
            } catch (Exception e) {
                System.out.println("FileImpl: " + e.getMessage());
                e.printStackTrace();
                return (null);
            }
        }
    }

    FileClient.java:-
    import java.io.*;
    import java.rmi.*;

    public class FileClient {
        public static void main(String argv[]) {
            try {
                String name = "//10.161.15.219/FileServer";
                System.out.println("1");
                FileInterface fi = (FileInterface) Naming.lookup(name);
                System.out.println("2");
                byte[] filedata = fi.downloadFile("//10.161.15.88/C/RMITEST/Test.txt");
                System.out.println("3");
                File file = new File("//10.161.15.219/T.txt");
                System.out.println("4");
                BufferedOutputStream output = new BufferedOutputStream(new FileOutputStream(file.getName()));
                output.write(filedata, 0, filedata.length);
                output.flush();
                output.close();
            } catch (Exception e) {
                System.err.println("FileServer exception: " + e.getMessage());
                e.printStackTrace();
            }
        }
    }
    Now I am trying to access a text file which is on a remote system.
    The stub and skeleton classes are generated, and it works when I run it locally.
    When I try to access the file remotely, the error is:
    1
    2
    3
    FileServer exception: null4
    java.lang.NullPointerException
         at FileClient.main(FileClient.java:18)
    Can anybody help me please.

    Probably the easiest way in this case is to run the performance trace (System/Utilities/Performance trace). Start transaction MD04, put on the trace in another window, press enter to see the stock/requirements list, then stop the trace and list the results. Then you'll see which tables were accesses with which queries.

  • How to get list data to Excel to create chart with date filter?

    Hi all,
    I have to create a chart from a custom list on an o365 site. There is a column named "Due Date" in my list. I want only those records whose Due Date is today or earlier, i.e. Due Date <= Today.
    How can do it?
    I have tried the following ways.
    I tried REST (OData data feed) but was not able to use today's date (i.e. a dynamic value) as a filter.
    I tried Export to Excel on my view, and it works, but if I upload my Excel file to o365 and refresh the data connection it shows an error and stops working.
    NOTE : I cannot user Power BI features like Power Query we have not that licences.
    Thanks,
    Ritesh
    Ritesh Goswami

    Hi Ritesh
    Not sure if I understood you correctly, but what about creating a calculated field which has an if condition like
    if([Due Date]<=today(), "past", "future")
    and then just filter on the 'past' / 'future' column?
    Kind regards,
    https://www.sharepointbay.com

  • How do I get the date of a file in an FTP?

    As stated in the subject.
    I am at my wits' end trying to get the date of a file from FTP (stuck for days).
    I tried the "URLConnection" class but could only get the size of the file with "getContentLength()".
    "getContentDate()" & "getLastModified()" both give me 0. They only work for files over HTTP, not FTP.
    Anyone has any simple solution on how to get the date of a file on FTP?
    Thanks in Advance.

    The FTP specification has long awaited the magical file meta-data access protocol, which never came. There is no FTP defined manner to obtain the date of a file. The common work-around is to get the containing directory list and <shudder> parse the format of the returned text for the target file and date. Since there is no standard directory listing format, there is no guarantee that you will be able to get that date, and your application would have to cry to the user for help.
    Some FTP applications, in addition to providing parsers for Unix/Linux and DOS/Windows directory list formats, also provide the user with a format definition script to parse whatever the user encounters.
    I hope you find this of some help.

  • How to find creation date of a cron job in RHEL 5

    We have instances having RHEL 5.10 in our setup. I wanted to know how to find the creation date of a cron job in RHEL 5. I was investigating a cron job, hence the query to know the creation date of a cron job.
    I hope my question is clear.
    Requesting a reply to my query.
    Regards

    Sorry to have upset you, but I cannot understand your accusation. Could you please explain where you see aggression in my problem evaluation or where I implied "how dare you"?
    I also provided an explanation why the cronfile in /var/spool/cron is not helpful. If I'm wrong, by all means, please let me and anyone else know.
    If no filesystem is specified I assume the default file system, which under OL 6 is ext4 and xfs under OL 7. Furthermore I was assuming ext3 due to other questions the OP has been asking in a similar context, which you may have not seen. For example: How to find creation date of a file in RHEL 5

  • How to read the data from Excel file and Store in XML file using java

    Hi All,
    I have a problem with an Excel file.
    My problem is how to read the data from an Excel file and store it in an XML file using the Java Excel API.
    What are the steps I need to follow to get the correct result from the Excel file?
    Can anybody send me the code (with Java code and Excel sheet) to this mail id: [email protected]
    Thanks & Regards,
    Sreenu,
    [email protected],
    india,

    If you want someone to do your work, please have the courtesy to provide payment.
    http://www.rentacoder.com
