Initialize a date using eCATT and a file

We are trying to clear the values in a date field (MBEW-ZKDAT), i.e. we're trying to update the field from some date back to the initial value.
We have an eCATT that has been working fine when there is some value in the date field. But when we put any other value (00/00/0000, 00000000, or just leave it blank) in the file, we get an 'Invalid date' error.
I've looked in the FAQ and SAP Help but can't find any useful information on this. Does anyone know what we need to enter in the date field to make it blank (i.e. initialize it)?
Thank you.

Hello Jelena
Normally &CLEAR does this job.
[Special Variables (eCATT Tutorial)|http://help.sap.com/saphelp_nw04/helpdata/EN/3d/72cd3bb961766de10000000a11402f/content.htm]
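For example, in the test data (external variant file) you would put &CLEAR in the column for the date parameter instead of a real date. The parameter name and layout below are only illustrative:
VARIANT     I_ZKDAT
VAR_01      &CLEAR
With &CLEAR the field is reset to its initial value rather than being interpreted as a date.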
Regards
  Uwe

Similar Messages

  • Error message when importing data using Import and export wizard

    Getting the below error message when importing data using the Import and Export Wizard:
    Messages
    Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Could not allocate a new page for database REPORTING' because of insufficient disk space in filegroup 'PRIMARY'.
    Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.".
    (SQL Server Import and Export Wizard)
    Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "Destination - Buyer_.Inputs[Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "Destination
    - Buyer_First_Qtr.Inputs[Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    (SQL Server Import and Export Wizard)
    Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Destination - Buyer" (28) failed with error code 0xC0209029 while processing input "Destination Input" (41). The
    identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information
    about the failure.
    (SQL Server Import and Export Wizard)
    Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
    (SQL Server Import and Export Wizard)
    Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Source - Buyer_First_Qtr returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput().
    The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    (SQL Server Import and Export Wizard)
    Smash126

    Hi Smash126,
    Based on the error message "Could not allocate a new page for database 'REPORTING' because of insufficient disk space in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup", we know the issue is caused by insufficient disk space in the 'PRIMARY' filegroup of the REPORTING database.
    To fix this, either add a new file to the PRIMARY filegroup on the Files page of the database properties, or turn on Autogrowth for the existing files in the filegroup so that they can provide the necessary space.
    The following document about Add Data or Log Files to a Database is for your reference:
    http://msdn.microsoft.com/en-us/library/ms189253.aspx
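    If you prefer to script it, the two options correspond roughly to the following T-SQL (the logical and physical file names here are only illustrative):
    ALTER DATABASE REPORTING
        MODIFY FILE (NAME = REPORTING_Data, FILEGROWTH = 256MB, MAXSIZE = UNLIMITED);
    ALTER DATABASE REPORTING
        ADD FILE (NAME = REPORTING_Data2,
                  FILENAME = 'D:\SQLData\REPORTING_Data2.ndf',
                  SIZE = 1GB,
                  FILEGROWTH = 256MB)
        TO FILEGROUP [PRIMARY];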
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Upload XML data using XSQL and HTTP Post ?

    Upload XML data using XSQL and HTTP Post: is that possible?
    An xsql contains an <xsql:insert-request table="aTable">
    The XML data file follows the ROWSET/ROW paradigm.
    What is the HTML form to upload the xml file to the XSQL ?
    I tried:
    <form action="myXSQL.xsql" method="POST" ENCTYPE="multipart/form-data">
    XML data file to upload: <input type="file">
    <input type="submit">
    </form>
    But the answer of myXSQL is:
    <xsql-status action="xsql:insert-request" result="No posted document to process" />
    Where is the problem ?
    Thank you.

    Hello,
    You are posting your XML file as a parameter therefore you should use the <xsql:insert-params/> tag, not the <xsql:insert-request/>. The insert-request can only handle data not posted via a parameter.
    Usage:
    <form action="myXSQL.xsql" method="GET" ENCTYPE="multipart/form-data">
    XML data file to upload: <input type="file" name="myXML">
    <input type="submit">
    </form>
    in combination with
    <xsql>
    <xsql:insert-params name="myXML" table="your table"/>
    </xsql>
    Two remarks:
    I was not able to successfully POST the form; the answer was <xsql-status action="xsql:insert-request" result="No posted document to process" />. With GET it was successful.
    Second, if you use MS Internet Explorer 5 or higher you could post the XML directly (not as a parameter) using an ActiveX object.
    Regards,
    Harm Verschuren

  • Useful logs and trace files

    Hello experts, for our Netweaver AS administration, I am in charge of periodically checking logs and trace files. I would like to know which are the most useful logs and trace files and the information each one will hold. I am familiar with "DefaultTrace.trc", and as of today it is the only one I have used, but I believe I should also be looking at other logs and trace files.
    Any suggestions?

    Hi Pedro,
    If you are talking about a Java-only system, the default trace is the best log/trace to look at. There are other log files, such as the application log, but maybe the best way to check your logs is using NWA (NetWeaver Administrator) at the following URL on your Java system:
    http://<hostname>:<port>/nwa
    From there you need to go to Monitoring -> Logs and Traces and then Predefined View/SAP logs.
    My other recommendation is to change the severity level to ERROR for all your Java components within the Visual Administrator -> Server Node -> Services -> Log Configurator -> Locations; otherwise you may see a lot of garbage in the default traces. You can still change the severity level per component, on demand, to investigate any possible problem.
    The work directory is also very important; you can check the file "dev_serverX" there, which will give you information about out-of-memory conditions and garbage collection activity if you have set these values for the server node using the Config Tool:
    -verbose:gc
    -XX:+PrintGCDetails
    -XX:+PrintGCTimeStamps
    You can find more information on here:
    http://help.sap.com/saphelp_nw70/helpdata/en/ac/e9d8a51c732e42bd0e7de54b9ff4e2/content.htm
    Hopefully this helps you. Let me know if you need more information,
    Zareh

  • I need to open and use Excel and Word files on my iPad, which software need to get?

    I need to open and use Excel and Word files on my iPad. Which software do I need to get?

    The options include :
    Apple's Pages app for Word docs and Numbers for Excel spreadsheets
    There are also third-party apps which support both Word and Excel in the one app, e.g. Documents To Go and QuickOffice HD.

  • Error while updating data using session and call transaction method

    Hi all,
        I have to update data from a flat file to the database using transaction MM01. I have used both the session method and the call transaction method. In both methods the data is transferred from the internal tables to the screens, but while updating the data (i.e. when the OK code is processed at the end of the transaction) I get a dialog box stating
       SAP EXPRESS DOCUMENT "UPDATE WAS TERMINATED" RECEIVED FROM AUTHOR "SAP".
      Please tell me where the problem lies and the solution for it.
                                       Thanks and regards.

    Hi,
    Check your recording and check whether you saved the material number in the recording or not.
    If not, record transaction MM01 again.
           MATNR LIKE RMMG1-MATNR,
           MBRSH LIKE RMMG1-MBRSH,
           MTART LIKE RMMG1-MTART,
           MAKTX LIKE MAKT-MAKTX,
           MEINS LIKE MARA-MEINS,
           MATKL LIKE MARA-MATKL,
           BISMT LIKE MARA-BISMT,
           EXTWG LIKE MARA-EXTWG,
    These are the fields which you have to take in the internal table.
    This is the record which I took in my flat file; use FILETYPE = 'ASC' and HAS_FIELD_SEPARATOR = 'X'.
    SUDHU-6     R     ROH     MATSUDHU     "     001     7890     AA
    I did the same but didn't get any error.

  • Using archivelog and control file from other Oracle server

    I am still bothered with my backup process.
    I have 2 AIX boxes (same model, say A and B); both have BAAN 5 and Oracle 10g R2 on them. Right now my colleague insists on using a Data Pump export (as a cold backup) from the Prod Oracle server (A) to restore the Oracle server on Box B. The Prod server has archivelog mode turned on, but this approach will miss any transactions made between the export and the crash point of Box A. So this is my confusion.
    Can I pass the control files and archivelog files from Box A (prod server) to Box B and use them to restore Box B as the latest Prod server? How?
    I tried to convince them to use an RMAN backup, but was not successful.
    I think the best way is probably to use Oracle Data Guard. However, my manager and colleague have one concern: that such a process will leave the data on the restored server (failover, Box B) unrecognizable by BAAN, which defines the objects (tables).
    Thanks

    Performing a logical backup is not useful for restoring to the point of failure. The only valid option here is a hot backup with archivelog mode. Your Recovery Manager backup includes the control file and archived redo logs, so those can be restored at the destination. You must take care of the way you perform the backup, and ensure the paths where your backups are deposited are visible to the second node. Shared storage with the same mount points is suitable in this case; a tape robot configured at both nodes is also a suitable solution.
    Recovery Manager performs a control file and SPFILE restore, too. These RMAN commands perform the action:
    SET DBID <DBID of the database for which you want to restore the control file>;
    RESTORE CONTROLFILE FROM '<name of the backup piece which contains the control file backup>';
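    On the destination host a complete restore would then look roughly like this (only a sketch; the DBID, backup piece name and paths are illustrative):
    RMAN> CONNECT TARGET /
    RMAN> SET DBID 1234567890;
    RMAN> STARTUP NOMOUNT;
    RMAN> RESTORE CONTROLFILE FROM '/backup/ctl_c-1234567890.bkp';
    RMAN> ALTER DATABASE MOUNT;
    RMAN> RESTORE DATABASE;
    RMAN> RECOVER DATABASE;
    RMAN> ALTER DATABASE OPEN RESETLOGS;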
    I don't see any problem on the Recovery Manager side; technically speaking, on the Oracle side it is perfectly possible to restore your database at a remote location. I don't know what happens on the BAAN side, or whether you are required to have it configured to be operative on the target node. You could try to clone your database on node B, configure BAAN and prepare the procedure for the failure case.
    Configuring Data Guard is also a recommended option, as is thinking about cold failover clusters. I have recently performed a CFC configuration with BAAN; no problem, it works smoothly.
    ~ Madrid.

  • How do I completely crop a PDF so that the cropped data is removed and the file size is reduced?

    How do I completely crop a PDF so that the cropped data is removed and the total file size is reduced?
    When I use the "Crop" function, the cropped data still remains in the file and there is no reduction in file size. I need a way to truly crop a PDF using Acrobat software.

    When you export, try to get the full file path or else you will have to do a lot of manual searching.
    If you downloaded the picture from Messages, the picture is stored in your User Library/Messages. To make your User Library visible, hold down the Option key while using the Finder "Go To Folder" command. Enter ~/Library/Messages/Attachments.
    If you prefer to make your user library permanently visible, use the Terminal command found below.
    http://osxdaily.com/2011/07/04/show-library-directory-in-mac-os-x-lion/
    You might want to bookmark the command. I had to use it again after I installed 10.8.4. I have also been informed that if you drag the user library to Finder it will remain visible.

  • Not able to Import data using "clear and replace"

    Hi,
    If I import data using the data admin package "Import" and "Merge" as 'method for importing' the process runs without problems.
    If I change the 'method for importing' to "Clear and Replace" the process fails. See message:
    TOTAL STEPS  2
    1. Convert Data:         completed  in 3 sec.
    2. Load and Process:     Failed  in 1 sec.
    3. Import:               completed  in 1 sec.
    [Selection]
    FILE=\UHRENHOLT\LEGAL_DATALOAD\DataManager\DataFiles\\Axapta_Load.txt
    TRANSFORMATION=\UHRENHOLT\LEGAL_DATALOAD\DataManager\TransformationFiles\\Axapta_Load.xls
    CLEARDATA= Yes
    RUNLOGIC= Yes
    CHECKLCK= Yes
    [Messages]
    Key cannot be null.
    Parameter name: key
    I'm using the standard data admin package (and thereby the values 0 and 1). For some reason the value 1 is not accepted.
    Any suggestions?
    /Lars

    Hi,
    The "Replace & clear..." feature during data import depends on Work Status. So to use this functionality, you need to setup Work Status under your application. Notice that you need to setup Work Status even if you aren't selecting the option to check Work Status when running the package.
    Hope this will help you.
    Kind Regards,
    Patrick

  • Problem in Uploading Data using a Tab Separated File?

    Hi All,
    I am trying to upload a tab-separated file containing customer and bank details; my file structure is roughly as follows.
    10     21169     abcde     xyz     kdHDHLk     gdh     ghgah  (Customer Details)
    20     21169     DE     20050000     01122334  (bank details for customer 21169)
    20     21169     DE     23022200     1122334455
    (bank details for customer 21169)
    20     21169     DE     23984899     223344556    (bank details).
    But when I do the initial upload of the details into an internal table using the GUI_UPLOAD FM and display them to check whether the file is loading correctly, I get no output at all.
    I am copying the code which I am trying to execute. Please tell me how I need to modify the code so that it executes correctly.
    parameters: p_file type rlgrap-filename.
    data: begin of wa_file,
          text(256) type c,
          end of wa_file.
    data: it_file like table of wa_file.
    types: begin of ty_kna1,
           kunnr type kunnr,
           name1 type name1,
           sortl type sortl,
           stras type stras,
           ort01 type ort01,
           land1 type land1,
           spras type spras,
           end of ty_kna1.
    data: it_kna1 type standard table of ty_kna1,
          wa_kna1 type ty_kna1.
    types: begin of ty_knbk,
           kunnr type kunnr,
           banks type knbk-banks,
           bankl type knbk-bankl,
           bankn type knbk-bankn,
           end of ty_knbk.
    data: it_knbk type standard table of ty_knbk,
          wa_knbk type ty_knbk.
    data: v_id(2).
    At Selection-Screen on Value-Request for p_file.
    CALL FUNCTION 'F4_FILENAME'
    EXPORTING
       PROGRAM_NAME        = SYST-CPROG
       DYNPRO_NUMBER       = SYST-DYNNR
    *   FIELD_NAME          = ' '
    IMPORTING
       FILE_NAME           = p_file
    Start-of-Selection.
    data: p_file1 type string.
          p_file1 = p_file.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        FILENAME                      = p_file1
       FILETYPE                      = 'ASC'
       HAS_FIELD_SEPARATOR           = 'X'
    *   HEADER_LENGTH                 = 0
    *   READ_BY_LINE                  = 'X'
    *   DAT_MODE                      = ' '
    * IMPORTING
    *   FILELENGTH                    =
    *   HEADER                        =
      TABLES
        DATA_TAB                      = it_file
    * EXCEPTIONS
    *   FILE_OPEN_ERROR               = 1
    *   FILE_READ_ERROR               = 2
    *   NO_BATCH                      = 3
    *   GUI_REFUSE_FILETRANSFER       = 4
    *   INVALID_TYPE                  = 5
    *   NO_AUTHORITY                  = 6
    *   UNKNOWN_ERROR                 = 7
    *   BAD_DATA_FORMAT               = 8
    *   HEADER_NOT_ALLOWED            = 9
    *   SEPARATOR_NOT_ALLOWED         = 10
    *   HEADER_TOO_LONG               = 11
    *   UNKNOWN_DP_ERROR              = 12
    *   ACCESS_DENIED                 = 13
    *   DP_OUT_OF_MEMORY              = 14
    *   DISK_FULL                     = 15
    *   DP_TIMEOUT                    = 16
    *   OTHERS                        = 17
    IF SY-SUBRC <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    constants: c_tab type X value '09'.
    loop at it_file into wa_file.
    if wa_file+0(2) = '10'.
    split wa_file at 'c_tab'
              into v_id
                   wa_kna1-kunnr
                   wa_kna1-name1
                   wa_kna1-sortl
                   wa_kna1-stras
                   wa_kna1-ort01
                   wa_kna1-land1
                   wa_kna1-spras.
    append wa_kna1 to it_kna1.
    elseif wa_file+0(2) = '20'.
    split wa_file at 'c_tab'
           into v_id
                wa_knbk-kunnr
                wa_knbk-banks
                 wa_knbk-bankl
                 wa_knbk-bankn.
    append wa_knbk to it_knbk.
    endif.
    endloop.
    write:/ 'Customer Master General Data'.
    uline.
    loop at it_kna1 into wa_kna1.
    write:/ wa_kna1-kunnr,
             wa_kna1-name1,
                   wa_kna1-sortl,
                   wa_kna1-stras,
                   wa_kna1-ort01,
                   wa_kna1-land1,
                   wa_kna1-spras.
    endloop.
    clear wa_kna1.
    skip 2.
    write:/ 'Customer Master Bank Data'.
    uline.
    loop at it_knbk into wa_knbk.
    write:/ wa_knbk-kunnr,
             wa_knbk-banks,
             wa_knbk-bankl,
             wa_knbk-bankn.
    endloop.
    clear wa_knbk.
    Regards,
    MD

    Declare the tab character from class cl_abap_char_utilities, use FILETYPE 'DBF' and pass HAS_FIELD_SEPARATOR = w_tab in FM GUI_UPLOAD:
    DATA: w_tab TYPE c VALUE cl_abap_char_utilities=>horizontal_tab.
        CALL FUNCTION 'GUI_DOWNLOAD'
          EXPORTING
    *       bin_filesize            =
            filename                = w_file
            filetype                = 'DBF'
            append                  = ' '
            write_field_separator   = w_tab
          TABLES
            data_tab                = it_extractchar
            fieldnames              = it_header
          EXCEPTIONS
            file_write_error        = 1
            no_batch                = 2
            gui_refuse_filetransfer = 3
            OTHERS                  = 22.
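    Note also that in the posted report, SPLIT wa_file AT 'c_tab' splits at the literal text c_tab rather than at a tab character. A minimal sketch of the corrected loop, reusing the names from the post and the w_tab declaration above (only the separator handling changes):
    LOOP AT it_file INTO wa_file.
      IF wa_file-text+0(2) = '10'.
        SPLIT wa_file-text AT w_tab
              INTO v_id
                   wa_kna1-kunnr
                   wa_kna1-name1
                   wa_kna1-sortl
                   wa_kna1-stras
                   wa_kna1-ort01
                   wa_kna1-land1
                   wa_kna1-spras.
        APPEND wa_kna1 TO it_kna1.
      ELSEIF wa_file-text+0(2) = '20'.
        SPLIT wa_file-text AT w_tab
              INTO v_id
                   wa_knbk-kunnr
                   wa_knbk-banks
                   wa_knbk-bankl
                   wa_knbk-bankn.
        APPEND wa_knbk TO it_knbk.
      ENDIF.
    ENDLOOP.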

  • Data definition, templates and ldt files

    Can anyone please advise me where I should look for templates, data definitions and ldt files for certain concurrent requests? I am relatively new to EBS and not quite sure where I should start looking.
    Edited by: maylo on Feb 23, 2013 4:28 AM

    maylo wrote:
    Thank you All for your reply.
    What entities (rdf, xml, concurrent programs, data definitions.....) are required to consider when migrating customization reports? Is it enough to move rdf, ldt (for concurrent program) and xml template? Do I need to get ldt for Data Definition and Templates using something like this:
    FNDLOAD apps/$CLIENT_APPS_PWD O Y DOWNLOAD $XDO_TOP/patch/115/import/xdotmpl.lct XX_CUSTOM_DD.ldt XDO_DS_DEFINITIONS APPLICATION_SHORT_NAME='XXCUST' DATA_SOURCE_CODE='XX_SOURCE_CODE' TMPL_APP_SHORT_NAME='XXCUST' TEMPLATE_CODE='XX_SOURCE_CODE'
    If anyone could give just a big picture of what is required to migrate when migrating customization reports.
    Thank you
    Maylo
    Edited by: maylo on Mar 13, 2013 1:53 AM
    Please see these docs.
    How To Use XDOLoader to Manage, Download and Upload Files? [ID 469585.1]
    Running XDOLoader to Download Data for XML Publisher Data Definitions Fails with ORA-6401 [ID 374195.1]
    Unable to Download Data Templates With XDOLoader: java.lang.NullPointerException [ID 428956.1]
    How To Upload Xliff Files for XML Publisher Using XDOLoader [ID 810163.1]
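    For what it's worth, the upload counterpart of the FNDLOAD command you quoted would look roughly like this on the target instance (only a sketch, reusing the file names from your example):
    FNDLOAD apps/$CLIENT_APPS_PWD O Y UPLOAD $XDO_TOP/patch/115/import/xdotmpl.lct XX_CUSTOM_DD.ldt
    The RTF template files themselves are moved with XDOLoader, as described in the notes above.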
    Thanks,
    Hussein

  • Unload dynamic data using JDBC into a file

    Hi,
    I have to download the result of a dynamic SELECT query into a file.
    The query will be provided at runtime. So I am unsure of the number of columns and the data types of the columns.
    Also I am unable to use io on the returned resultset object.
    So basically i have to persist the data from the resultset object into a file.
    Any hints
    Thanks in advance

    "Also I am unable to use io on the returned resultset object." What do you mean by this?
    See the JDBC tutorial: http://java.sun.com/docs/books/tutorial/jdbc/index.html
    and the file I/O tutorial: http://java.sun.com/docs/books/tutorial/essential/io/index.html
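    As a starting point, something along these lines handles an arbitrary query, because ResultSetMetaData tells you the column count and names at runtime (a rough sketch; the JDBC URL, credentials and output file name are only placeholders):
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;

    public class QueryToFile {
        public static void main(String[] args) throws Exception {
            String sql = args[0];   // the dynamic SELECT, supplied at runtime
            try (Connection con = DriverManager.getConnection(
                         "jdbc:oracle:thin:@host:1521:orcl", "user", "password");
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery(sql);
                 PrintWriter out = new PrintWriter("query_result.txt")) {
                ResultSetMetaData md = rs.getMetaData();
                int cols = md.getColumnCount();
                // header line with the column names
                for (int i = 1; i <= cols; i++) {
                    out.print(md.getColumnName(i));
                    out.print(i < cols ? "\t" : "\n");
                }
                // one line per row; getString gives a printable form for most column types
                while (rs.next()) {
                    for (int i = 1; i <= cols; i++) {
                        out.print(rs.getString(i));
                        out.print(i < cols ? "\t" : "\n");
                    }
                }
            }
        }
    }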

  • Problem in Creating new row & inserting data using CreateInsert and Commit

    Hello All,
    I have created a page with a few input text fields, and I want to insert their data into a database table. I have created an Application Module and I am using the CreateInsert and Commit operations, but there is one problem.
    The first time it creates a row in the database; after that it does not create a new row but instead updates the same row with the new values.
    In bindings of my jspx page I have created two binding for action (1) CreateInsert for the VO of that Application Module (2) Commit operation of that Application Module.
    Here is the code snippet of my application:
    BindingContainer bindings = getBindings();
    OperationBinding operationBinding = bindings.getOperationBinding("CreateInsert");
    Object result = operationBinding.execute();
    if (!operationBinding.getErrors().isEmpty()) {
        return null;
    }
    OperationBinding operationBinding1 = bindings.getOperationBinding("Commit");
    Object result1 = operationBinding1.execute();
    if (!operationBinding1.getErrors().isEmpty()) {
        return null;
    }
    I have tried using Execute+Commit and Insert+Commit case also in every case it is updating the same row and not inserting a new row.
    Is there anything I am missing?
    Please Help.

    Hi,
    I don't know why you are trying to do this with hand-written code; ADF provides almost zero-code development, with wonderful drag-and-drop functionality from the framework.
    When you double-click the button, this code is registered in your bean:
        public String cb6_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding = bindings.getOperationBinding("CreateInsert");
            Object result = operationBinding.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            return null;
        }

        public String cb8_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding = bindings.getOperationBinding("Commit");
            Object result = operationBinding.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            return null;
        }

        public String cb7_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding = bindings.getOperationBinding("Delete");
            Object result = operationBinding.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            return null;
        }

        public String cb14_action() {
            BindingContainer bindings = getBindings();
            OperationBinding operationBinding =
                bindings.getOperationBinding("Delete4");   // something different here: after deleting you usually commit
            OperationBinding operationBinding1 =
                bindings.getOperationBinding("Commit");    // so here the Commit operation
            Object result = operationBinding.execute();
            Object result1 = operationBinding1.execute();
            if (!operationBinding.getErrors().isEmpty()) {
                return null;
            }
            if (!operationBinding1.getErrors().isEmpty()) {
                // add error handling here
                return null;
            }
            return null;
        }
    If I have not understood you correctly, please explain a bit more.

  • Changing "Created Date" on image and other files.

    I am scanning old family photographs and need to redate all of these photos. I would like to place them in actual date order, but iPhoto doesn't allow that.
    How can I change the "Created Date" on the .jpg files I'm creating on my iMAC?
    Thanks!!
    (And i'm fairly new to the MAC world, "I WAS A PC"..... ;-)

    Hello & a warm welcome to the forums & Macdom!
    There's the terminal method...
    http://danilo.ariadoss.com/howto-change-date-modified-date-created-mac/
    Then a couple of Apps to do it...
    http://www.publicspace.net/ABetterFinderAttributes/
    Mac App Store - File Date Changer 5
    And Automator method...
    http://reviews.cnet.com/8301-13727_7-57411491-263/how-to-batch-rename-files-using-automator-in-os-x/
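    The terminal method in the first link boils down to commands like these (the date and file name are just examples; SetFile is installed with the Xcode command line tools):
    touch -t 198607151200 scan001.jpg
    SetFile -d "07/15/1986 12:00:00" scan001.jpg
    touch sets the modification date, while SetFile -d sets the creation date.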

  • How to use jsp and xhtml files in an application

    This is web.xml file,
    <web-app>
    <context-param>
         <param-name>javax.faces.DEFAULT_SUFFIX</param-name>
         <param-value>.xhtml</param-value>
    </context-param>
    <context-param>
    <param-name>facelets.DEVELOPMENT</param-name>
    <param-value>true</param-value>
    </context-param>
    <servlet>
         <servlet-name>Faces Servlet</servlet-name>
         <servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
         <load-on-startup>1</load-on-startup>
    </servlet>
    <servlet-mapping>
         <servlet-name>Faces Servlet</servlet-name>
         <url-pattern>*.jsf</url-pattern>
    </servlet-mapping>
    <listener>
    <listener-class>org.apache.myfaces.webapp.StartupServletContextListener</listener-class>
    </listener>
    <context-param>
    <param-name>javax.faces.CONFIG_FILES</param-name>
    <param-value>/WEB-INF/faces-config.xml</param-value>
    </context-param>
    </web-app>
    If I configure it like this, I am not able to work with JSP files.
    So, how do I configure web.xml to work with both JSP and XHTML files?
    Thanks,
    Vinutha.

    Hi Sam,
    To use a properties file you need to keep the file in a known location, and in the code you mention the path from which it should be read at runtime.
    Fuego.Io.PropertiesFile propfile;
    propfile.load(fileName : "<Path: C:/sample.properties>"); //Path: Place the properties file in the respective directory/drive and mention the path
    String val = propfile.get(key : "One"); //One - is the key against which the value should be mentioned in the properties file like <One = 1>. It should return 1
    logMessage("Value: " + val);
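    The properties file itself just contains plain key = value lines, so for this example C:/sample.properties would simply contain:
    One = 1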
    To use an Enum in your project, first create a Module in your catalogue component, then right-click on the module and create a New Enumeration, say ProcessStatus.
    Uncheck "Is Sequential" if you want to keep key/value pairs. Click on Add (+) and enter the Name and Value, say ABORTED as the Name and Aborted as the Value. To use it in a conditional path in your project, write the condition as somevariable == String(ProcessStatus.ABORTED).
    Hope this will help you.
    Bibhu
