SQL*Loader or external table for loading an MSG (email) file

Hi there!
I'm looking for a way to load an email into an Oracle DB.
I mean, not the whole email body in a single column, but to "parse" it into a multi-column/table fashion.
Is it possible to do with a SQL*Loader script or an external table?
I think it is not possible, and that I must switch to XML DB.
Any idea?
Thanks,
Antonio

Hello,
Why don't you just load the entire MSG (email) as a CLOB into one email_body column, or whatever column name you want to use?
To load data up to 32K you can use VARCHAR2(32656), but it's not a good idea to load a CLOB that way because it's very inconsistent: the length can vary, resulting in "string literal too long" errors. So you have two choices: use a procedure or anonymous block to load the CLOB data, or use SQL*Loader.
First method: I loaded alert.log successfully, and you can imagine how big this file can be (5 MB in my test case).
CREATE OR REPLACE DIRECTORY DIR AS '/mydirectory/logs';
DECLARE
   clob_data   CLOB;
   clob_file   BFILE;
BEGIN
   INSERT INTO t1clob
   VALUES (EMPTY_CLOB ())
   RETURNING clob_text INTO clob_data;
   clob_file := BFILENAME ('DIR', 'wwalert_dss.log');
   DBMS_LOB.fileopen (clob_file);
   DBMS_LOB.loadfromfile (clob_data,
                          clob_file,
                          DBMS_LOB.getlength (clob_file));
   DBMS_LOB.fileclose (clob_file);
   COMMIT;
END;
/
Second method: use of sqlldr.
Example of control file:
LOAD DATA
INFILE alert.log "STR '|\n'"
REPLACE INTO TABLE t1clob
(
   clob_text CHAR(30000000)
)
Hope this helps.
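Coming back to the original question of parsing the message into columns rather than storing it whole: once the raw message is staged as a CLOB (via either method above), the headers can be picked apart in PL/SQL. The sketch below is a naive illustration only - the raw_email and emails tables are hypothetical, it assumes LF-terminated headers, and it ignores MIME structure, folded headers, and encodings entirely; for anything robust, purpose-built parsing (or XML DB after converting the message) is indeed the better route.

```sql
-- Naive sketch: pull simple header values out of a staged raw-email CLOB.
-- raw_email(msg_text CLOB) and emails(subject, sender, body) are hypothetical.
DECLARE
   v_msg   CLOB;

   FUNCTION header_value (p_msg CLOB, p_header VARCHAR2)
      RETURN VARCHAR2
   IS
      v_start   PLS_INTEGER;
      v_end     PLS_INTEGER;
   BEGIN
      v_start := DBMS_LOB.INSTR (p_msg, p_header || ': ');
      IF v_start = 0 THEN
         RETURN NULL;                                -- header not present
      END IF;
      v_start := v_start + LENGTH (p_header) + 2;    -- skip past "Name: "
      v_end   := DBMS_LOB.INSTR (p_msg, CHR (10), v_start);
      RETURN TRIM (DBMS_LOB.SUBSTR (p_msg, v_end - v_start, v_start));
   END;
BEGIN
   SELECT msg_text INTO v_msg FROM raw_email;

   INSERT INTO emails (subject, sender, body)
   VALUES (header_value (v_msg, 'Subject'),
           header_value (v_msg, 'From'),
           v_msg);
   COMMIT;
END;
/
```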

Similar Messages

  • SQL Loader versus External Table

    If anyone has worked on external tables please let me know your views.

    user637544 wrote:
    for sqlldr i follow the following approach
    1. truncate table
    2. call sqlldr and load records in staging table
    3. check for bad records
how do I truncate, call the external table, and check bad records for an external table?
As part of the SQL*Loader control file you can tell it to truncate the table as part of the process before it loads in the data.
    The key differences between SQL*Loader and External Tables are:
    1. SQL*Loader is an external utility run from the o/s command line, whereas External tables are usable from within SQL and PL/SQL.
2. SQL*Loader can only load data into the database, whereas External tables can allow you to read files and write files (10g onwards)
    3. SQL*Loader has limited control on skipping rows etc. whereas External tables can be queried using SQL and has all the flexibility that SQL can offer.
    4. SQL*Loader requires connection information to the database, whereas External tables are already part of the database
    5. SQL*Loader can only change the source filename through dynamic o/s level scripts, whereas External tables can have the filename changed using an ALTER TABLE ... LOCATION ... command.
If you want to truncate your staging table before loading it with data from the external table, then you will just have to do that as a separate step, because the external table is not associated with other tables. Simply truncate the staging table and then query the data from the external table into the staging table.
    External tables offer a lot more flexibility than using SQL*Loader.
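The truncate-then-load flow and the location change from point 5 above can be sketched as follows (all object and file names are hypothetical):

```sql
-- Hypothetical external table over a comma-separated flat file.
CREATE TABLE staging_ext
( id    NUMBER,
  name  VARCHAR2(50)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    BADFILE 'staging.bad'     -- rejected rows, analogous to SQL*Loader's bad file
    FIELDS TERMINATED BY ','
  )
  LOCATION ('staging.csv')
)
REJECT LIMIT UNLIMITED;

-- Truncate the real staging table, then load it with plain SQL.
TRUNCATE TABLE staging_tab;
INSERT INTO staging_tab SELECT * FROM staging_ext;
COMMIT;

-- Point the same external table at next week's file (point 5 above).
ALTER TABLE staging_ext LOCATION ('staging_week2.csv');
```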

  • Sql loader or external table

    Hi all good morning.
    can we write a stored procedure that loads a file into the table using sql loader?
    can Java call the above stored procedure?
    regards
    raj
    Edited by: user10887630 on Apr 23, 2009 6:18 AM

    Are you saying the files themselves can't be located on the server? or just that the process for loading the files can't be located on the server.
    If the files themselves cannot reside on the server then you won't be able to use SQL*Loader or External tables from within stored procedures. It would require some clever Java code type stuff to get across to a.n.other machine where the files are stored and get the data.
    Typically it is normal for such files to be located on the server. What reasons do the DBA's give for not wanting them there?
DBA's say that application specific files cannot be maintained on the database server.
So what are all the database data files then? They hold data from the applications. ;)
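To sketch the external-table route for this thread (SQL*Loader itself cannot be invoked directly from PL/SQL; all names below are hypothetical): a stored procedure can do the whole load, and Java can then call it over JDBC.

```sql
-- Hypothetical: staging_ext is an external table over the flat file on the server.
CREATE OR REPLACE PROCEDURE load_staging
IS
BEGIN
   EXECUTE IMMEDIATE 'TRUNCATE TABLE staging_tab';
   INSERT INTO staging_tab
      SELECT * FROM staging_ext;
   COMMIT;
END;
/
```

From Java this would be invoked with a plain CallableStatement, e.g. connection.prepareCall("{call load_staging}").execute(). Note this only works if the file resides where the database can see it.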

  • SQL *Loader and External Table

    Hi,
Can anyone tell me the difference between SQL*Loader and external tables?
What are the conditions under which we can use SQL*Loader and external tables?
    Thanx

    External tables are accessible from SQL, which generally simplifies life if the data files are physically located on the database server since you don't have to coordinate a call to an external SQL*Loader script with other PL/SQL processing. Under the covers, external tables are normally just invoking SQL*Loader.
    SQL*Loader is more appropriate if the data files are on a different server or if it is easier to call an executable rather than calling PL/SQL (i.e. if you have a batch file that runs on a server other than the database server that wants to FTP a data file from a FTP server and then load the data into Oracle).
    Justin

  • Use of External tables to load XML data.

    Hi,
I have used external table definitions to load various XML files to the database, usually splitting the XML into separate records - 1 per major element tag - and using PL/SQL to parse out a primary key to store in a relational table, with all of the XML relevant to that primary key value stored as an XMLTYPE column in a row of the table. This has worked fine for XML with a single major entity (element tag).
However, I now have an XML file that contains two "major" elements (both children of the root) that I would like to split out and store in separate tables.
    The XML file is of the following basic format:-
    <drugs>
    <drug>drug 1...</drug>
    <drug>drug 2...</drug>
    <partners>
    <partner>partner 1</partner>
    <partner>partner 2</partner>
    </partners>
    </drugs>
The problem is there are around 18000 elements of the first type, followed by several thousand of the 2nd type. I can create two separate external tables - one for each element type - but how do I get the external table for the 2nd to ignore all the elements of the first type? My external table definition is:
CREATE TABLE DRUGBANK_OWNER.DRUGBANK_PARTNERS_XML_EXTERNAL
( DRUGBANK_XML CLOB )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY DRUGBANK_DIR
  ACCESS PARAMETERS
  ( records delimited by "</partner>" SKIP 100000
    characterset al32utf8
    badfile extlogs:'drugbank_partners_xml.bad'
    logfile extlogs:'drugbank_partners_xml.log'
    discardfile extlogs:'drugbank_partners_xml.dis'
    READSIZE 52428800
    fields
    ( drugbank_xml CHAR(50000000) terminated by '</partners>' )
  )
  LOCATION (DRUGBANK_DIR:'drugbank.xml')
)
REJECT LIMIT UNLIMITED
PARALLEL ( DEGREE 8 INSTANCES 1 )
NOMONITORING;
The problem is that before the first <partners> element the 18000 or so <drug> elements cause a data cartridge error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-29400: data cartridge error
    KUP-04020: found record longer than buffer size supported, 52428800
    This happens regardless of the value of the SKIP or the size of the drugbank_xml field.
    I have tried using an OR on the "records delimited by" access parameter, to 'delimit by "</partner>" OR "</drug>"', with the intention of filtering out the <drug> elements but this leads to a syntax error.
    Anyone ever tried anything similar and got it to work?
    Any other suggestions?
    Thanks,
    Sid.

No, the content inside quotes spans multiple lines... there are line breaks after every HTML tag.
"What's the error message you are getting?"
I am not getting any error while selecting from the external table, but I am getting those rows in the BAD file, and the log file has the following entries:
KUP-04021: field formatting error for field TKBS_DSCN
KUP-04036: second enclosing delimiter not found
    Message was edited by:
    user627610

  • External Table for Variable Length EBCDIC file with RDWs

I am loading an EBCDIC file where the record length is stored in the first 4 bytes. I am able to read the 4 bytes using the DB's native character set, i.e.:
records variable 4
characterset WE8MSWIN1252
data is little endian
But I then have to convert each string column individually on the select, i.e.:
convert(my_col, 'WE8MSWIN1252', 'WE8EBCDIC37')
If I change the character set to EBCDIC:
records variable 4
characterset WE8EBCDIC37
data is little endian
I get the following error reading the first 4 bytes:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-29400: data cartridge error
KUP-04019: illegal length found for VAR record in file ...
We cannot use the FTP conversion as the file contains packed decimals.
There are other options for converting the file, but I am wondering if anyone was able to get an external table to read a native EBCDIC file without a pre-processing step.


  • Can we create single External Table for multiple files?

    HI,
    Can we create External table for multiple files? Could anyone please explain it.
    Thanks and regards
    Gowtham Sen.

"to merge 16 files having same structure"
Well, if the files have the same structure then, per the example in the following documentation, you can create one external table for all your files:
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14231/tables.htm#i1007480
    Nicolas.

  • Mozilla shows NOT RESPONDING until it loads the pages completely, for FACEBOOK and checking EMAILS on yahoo and google

    Mozilla shows NOT RESPONDING until it loads the pages completely, for FACEBOOK and checking EMAILS on yahoo and google.
    You can't do anything, you should just wait.

    I'm getting a similar error over here. Sporadically some sites won't load specifically in Safari; they open just fine in Opera, Firefox and Camino though. This is really annoying.
    It's pretty clear it's nothing network related…

  • I upgraded to OSX.8 and cannot load MS Encore. I desperately need some email files from ENCORE. Is there a program that can open my ENCORE email files?

    I upgraded to OSX.8 and cannot load MS Encore. I desperately need some email files from ENCORE. Is there a program that can open my ENCORE email files?
    SLM01

    Do you mean Entourage? It appears you have Office 2004 or older. Those are PowerPC apps only and will not run in Lion or later. If you need access to those emails like, "RIGHT NOW!", then I would suggest purchasing Office 2011 for Mac. Outlook is the new Office email client in 2011 and can import your older Entourage data.

Sql Loader Vs External table? Which one is preferred?

    Hello guru,
We are trying to load data into about 160 tables. Here is what we are currently doing:
1. We get the flat file in .txt format.
2. We write a control file for each flat file for its respective table.
3. We use the sqlldr command to load this data into the table.
I was wondering if this process is cumbersome, and whether external tables could make loading the tables simpler compared to what we are doing. I have not used external tables, so I wanted to know which one is better. Any idea is greatly appreciated!
    FYI.. Version :- Oracle 11g
    Thank you so much!

Thanks for your reply, Justin!
    I found the below example for loading the data into external table...
    CREATE OR REPLACE DIRECTORY dat_dir AS 'C:\Oradata\Data';
    CREATE OR REPLACE DIRECTORY log_dir AS 'C:\Oradata\Log';
    CREATE OR REPLACE DIRECTORY bad_dir AS 'C:\Oradata\Bad';
    GRANT READ ON DIRECTORY dat_dir TO scott;
    GRANT WRITE ON DIRECTORY log_dir TO scott;
    GRANT WRITE ON DIRECTORY bad_dir TO scott;
    CREATE TABLE revext (person      VARCHAR2(20),
                         rev_jan     NUMBER(4),
                         rev_feb     NUMBER(4),
                         rev_mar     NUMBER(4),
                         rev_apr     NUMBER(4),
                         rev_mai     NUMBER(4),
                         rev_jun     NUMBER(4),
                         rev_jul     NUMBER(4),
                         rev_aug     NUMBER(4),
                         rev_sep     NUMBER(4),
                         rev_oct     NUMBER(4),
                         rev_nov     NUMBER(4),
                         rev_dez     NUMBER(4))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dat_dir
  ACCESS PARAMETERS
  ( records delimited by newline
    badfile bad_dir:'revext%a_%p.bad'
    logfile log_dir:'revext%a_%p.log'
    fields terminated by ','
    missing field values are null
    ( person,
      rev_jan,
      rev_feb,
      rev_mar,
      rev_apr,
      rev_mai,
      rev_jun,
      rev_jul,
      rev_aug,
      rev_sep,
      rev_oct,
      rev_nov,
      rev_dez
    )
  )
  LOCATION ('revext.dat')
)
PARALLEL 4
REJECT LIMIT UNLIMITED;
    CREATE TABLE revenue (
        person       VARCHAR2(20),
        month        VARCHAR2(3),
        revenue      NUMBER,
        CONSTRAINT revenue_pk PRIMARY KEY (person,month));
INSERT INTO revenue (person,month,revenue)
   SELECT person,'Jan',rev_jan
   FROM revext;
...but currently we are using SQL*Loader; our data looks like this:
  1119Smith      01/01/1982AXYZ corporation  xyz corp
  1111collen      01/01/1990AABC corporation  abc corp
and the control file is like this:
INTO TABLE "XYZ_tbl"
(
   ID                  POSITION(01:05)        CHAR    "DECODE(RTRIM(:ID), NULL, 'NA', :ID)",
   Name                POSITION(06:15)        CHAR    "DECODE(RTRIM(:NAME), NULL, 'NA', :Name)",
   Act_dt              POSITION(16:25)        DATE    "MM/DD/YYYY" NULLIF ACT_DT=BLANKS
)
My question is: can I use options like NULLIF/DECODE, change datatypes when loading, or use functions like REPLACE/TO_DATE with external tables? Any idea? Any example code or SQL would be great.
    Thank you so much!
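External tables do not accept the control-file column transforms directly; the usual pattern is to define the fields plainly in the ACCESS PARAMETERS (POSITION clauses and datatypes included) and apply the DECODE/TO_DATE-style logic in ordinary SQL when selecting from the table. A hypothetical equivalent of the control file above (xyz_ext is an assumed external table exposing the three fields as CHAR):

```sql
-- The control-file transforms become plain SQL functions in the SELECT.
INSERT INTO xyz_tbl (id, name, act_dt)
SELECT NVL (RTRIM (id), 'NA'),
       NVL (RTRIM (name), 'NA'),
       TO_DATE (NULLIF (TRIM (act_dt), ''), 'MM/DD/YYYY')
FROM xyz_ext;
```

Anything SQL can express - REPLACE, CASE, datatype conversions - is available here, which is exactly the flexibility point made elsewhere in this thread.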

  • Sql loader utl_file & external table

Can anyone let me know the differences between:
    1.sql loader
    2.utl_file
    3.external table
    Regards.
    Asif.

    To expand on Aron's answer....
    SQL*Loader - An operating system utility which uses control files (which you create) to load data files onto database tables.
    UTL_FILE - A database package which can be used for reading and writing files in any format you care to design programmatically.
External Table - The newest option, which can be used instead of SQL*Loader. This is done from the database end, by creating a table as an external table and pointing it at the source file on the operating system. It also allows information similar to that put in the SQL*Loader control files to be specified against the table. By querying the table you are in fact querying the source file. There are some limitations compared to regular database tables, such as no ability to write to the external table.
    ;)
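For contrast with the SQL*Loader and external-table entries above, a minimal UTL_FILE read loop might look like this (directory and file names are hypothetical):

```sql
DECLARE
   v_file   UTL_FILE.FILE_TYPE;
   v_line   VARCHAR2 (32767);
BEGIN
   v_file := UTL_FILE.FOPEN ('DATA_DIR', 'input.txt', 'r');
   LOOP
      BEGIN
         UTL_FILE.GET_LINE (v_file, v_line);
      EXCEPTION
         WHEN NO_DATA_FOUND THEN
            EXIT;                 -- end of file reached
      END;
      -- Process each line however you like; the format is entirely up to you.
      DBMS_OUTPUT.PUT_LINE (v_line);
   END LOOP;
   UTL_FILE.FCLOSE (v_file);
END;
/
```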

SQL Loader versus External table

    hi
Can anybody explain the advantages of external tables over SQL*Loader in a data warehousing environment?
Or any URL where I can find suitable knowledge.
    Thanks in advance.
    Moloy

    here it is.
    http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14215/et_concepts.htm#sthref1672

  • Error when loading from External Tables in OWB 11g

    Hi,
    I face a strange problem while loading data from flat file into the External Tables.
    ORA-12899: value too large for column EXPIRED (actual: 4, maximum: 1)
    error processing column EXPIRED in row 9680 for datafile <data file location>/filename.dat
In a total of 9771 records, nearly 70 records are rejected due to the above mentioned error. The column (EXPIRED) where the error is being reported doesn't have a value longer than 1 character at all. I suspect it to be a different problem.
    Example: One such record that got rejected is as follows:
    C|234|Littérature commentée|*N*|2354|123
Highlighted in bold is the EXPIRED column.
When I tried to insert this record into the external table using the UTL_FILE utility it loaded successfully. But when I try with the file already existing in the file directory it again fails with the above error. I would also like to mention that not all the records which have been loaded are OK; please have a look at the DESCRIPTION column, which is highlighted. The original information in the data file looks like:
    C|325|*Revue Générale*|N|2445|132
    In the External Table the Description Value is replaced by the inverted '?' as follows:
    Reue G¿rale
    Please help.
    Thanks,
    JL.

Sorry, I couldn't see the highlighted text. Could you please enclose it in code tags?
Also post the table definition with attributes. By the way, what is your NLS_LANGUAGE set to?

  • Selective load in external table

    Hi all,
I need to load data into my external table based on conditions. If I use 'LOAD WHEN' I am able to load the data into the table, but my requirement is to load data into different columns of the same table based on the conditional "when" clause.
This is equivalent to what we do in a .ctl file using the loader. The scenario being:
LOAD DATA
INFILE '/m018/applmgr/ERP1/modap/1.0.0/bin/BNKSTMT260508.txt'
APPEND
INTO TABLE MODAP_IN_BNK_RECON_TBL
WHEN record_type = '01'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(RECORD_TYPE POSITION(1)
,SENDER_IDENTI
,RECEIVER_IDENTI
,FILE_CREATION_DATE DATE "YYMMDD" NULLIF FILE_CREATION_DATE=BLANKS
,FILE_CREATION_TIME
,FILE_SEQ_NO
,RECORD_LENGTH
,BLOCKING_FACTOR
,VERSION_NO
)
INTO TABLE MODAP_IN_BNK_RECON_TBL
WHEN record_type = '02'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(RECORD_TYPE POSITION(1) "TRIM(:RECORD_TYPE)"
,RECEIVER_IDENTI "TRIM(:RECEIVER_IDENTI)"
,ORIGINATOR_IDENTI "TRIM(:ORIGINATOR_IDENTI)"
,GROUP_STATUS "TRIM(:GROUP_STATUS)"
,AS_OF_DATE DATE "YYMMDD" NULLIF AS_OF_DATE=BLANKS
,AS_OF_TIME "TRIM(:AS_OF_TIME)"
,CURRENCY_CODE "TRIM(:CURRENCY_CODE)"
)
I want to replicate the same in an external table. Please suggest a suitable solution for this.
    Regards,
    Balaji V

    Here is a sample I used:
DROP TABLE myschema.mytable;
CREATE TABLE myschema.mytable (
"ID" VARCHAR2(12),
"RESP_DOB" DATE,
"RESP_FAM_REL" VARCHAR2(5),
"PET_FIRST_NAME" VARCHAR2(11),
"PET_MIDDLE_NAME" VARCHAR2(11),
"PET_LAST_NAME" VARCHAR2(17),
"PET_SUFFIX" VARCHAR2(1)
)
ORGANIZATION EXTERNAL (
TYPE ORACLE_LOADER
DEFAULT DIRECTORY "MY_DIRECTORY"
ACCESS PARAMETERS(
RECORDS DELIMITED BY NEWLINE
BADFILE 'the_BAD.TXT'
DISCARDFILE 'the_DISCARD.TXT'
LOGFILE 'the_LOG.TXT'
FIELDS
MISSING FIELD VALUES ARE NULL
(
ID POSITION(1:12),
RESP_DOB POSITION(98:105) DATE MASK "mmddyyyy",
RESP_FAM_REL POSITION(106:110) NULLIF RESP_FAM_REL=".",
PET_FIRST_NAME POSITION(111:121) NULLIF PET_FIRST_NAME=".",
PET_MIDDLE_NAME POSITION(122:132) NULLIF PET_MIDDLE_NAME=".",
PET_LAST_NAME POSITION(133:149) NULLIF PET_LAST_NAME=".",
PET_SUFFIX POSITION(150:150) NULLIF PET_SUFFIX="."
)
)
LOCATION ('the_file.txt')
);
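On the original multi-record-type requirement: the ORACLE_LOADER driver does accept a LOAD WHEN clause in the access parameters, so one option is a separate external table per record type over the same file. A hypothetical sketch (names and fields abbreviated, not the full column list):

```sql
-- Picks up only the '01' header records; rows failing the condition are
-- discarded. A twin table with LOAD WHEN (record_type = '02') would expose
-- the detail records from the same file.
CREATE TABLE bnk_recon_01_ext
( record_type          VARCHAR2(2),
  sender_identi        VARCHAR2(20),
  file_creation_date   DATE
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    LOAD WHEN (record_type = '01')
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    ( record_type   POSITION (1:2),
      sender_identi,
      file_creation_date CHAR DATE_FORMAT DATE MASK "YYMMDD"
    )
  )
  LOCATION ('BNKSTMT260508.txt')
)
REJECT LIMIT UNLIMITED;
```

Loading "different columns of the same table" per record type would then be a plain SQL INSERT ... SELECT from each external table into the appropriate columns.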

  • While loading through External Tables, Japanese characters wrong load

    Hi all,
I am loading a text file through external tables. While loading, Japanese characters load as junk characters. In the text file, the characters display correctly.
    My spool file
    SET ECHO OFF
    SET VERIFY OFF
    SET Heading OFF
    SET LINESIZE 600
    SET NEWPAGE NONE
    SET PAGESIZE 100
    SET feed off
    set trimspool on
    spool c:\SYS_LOC_LOGIC.txt
    select CAR_MODEL_CD||',' || MAKER_CODE||',' || CAR_MODEL_NAME_CD||',' || TYPE_SPECIFY_NO||',' ||
         CATEGORY_CLASS_NO||',' || SPECIFICATION||',' || DOOR_NUMBER||',' || RECOGNITION_TYPE||',' ||
         TO_CHAR(SALES_START,'YYYY-MM-DD') ||',' || TO_CHAR(SALES_END,'YYYY-MM-DD') ||',' || LOGIC||',' || LOGIC_DESCRIPTION
    from Table where rownum < 100;
    spool off
    My External table load script
CREATE TABLE SYS_LOC_LOGIC
(
   CAR_MODEL_CD         NUMBER,
   MAKER_CODE           NUMBER,
   CAR_MODEL_NAME_CD    NUMBER,
   TYPE_SPECIFY_NO      NUMBER,
   CATEGORY_CLASS_NO    NUMBER,
   SPECIFICATION        VARCHAR2(300),
   DOOR_NUMBER          NUMBER,
   RECOGNITION_TYPE     VARCHAR2(30),
   SALES_START          DATE,
   SALES_END            DATE,
   LOGIC                NUMBER,
   LOGIC_DESCRIPTION    VARCHAR2(100)
)
ORGANIZATION EXTERNAL
(
   TYPE ORACLE_LOADER
   DEFAULT DIRECTORY XMLTEST1
   ACCESS PARAMETERS
   (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ','
      MISSING FIELD VALUES ARE NULL
      (
         CAR_MODEL_CD, MAKER_CODE, CAR_MODEL_NAME_CD, TYPE_SPECIFY_NO,
         CATEGORY_CLASS_NO, SPECIFICATION, DOOR_NUMBER, RECOGNITION_TYPE,
         SALES_START date 'yyyy-mm-dd', SALES_END date 'yyyy-mm-dd',
         LOGIC, LOGIC_DESCRIPTION
      )
   )
   LOCATION ('SYS_LOC_LOGIC.txt')
   --location ('products.csv')
)
REJECT LIMIT UNLIMITED;
    How to solve this.
    Thanks in advance,
    Pal

    Just so I'm clear, user1 connects to the database server and runs the spool to generate a flat file from the database. User2 then uses that flat file to load that data back in to the same database? If the data isn't going anywhere, I assume there is a good reason to jump through all these unload and reload hoops rather than just moving the data from one table to another...
    What is the NLS_LANG set in the client's environment when the spool is generated? Note that the NLS_CHARACTERSET is a database setting, not a client setting.
What character set is the text file? Are you certain that the text file is UTF-8 encoded, and not encoded using the operating system's local code page (assuming the operating system is capable of displaying Japanese text)?
    There is a CHARACTERSET parameter for the external table definition, but that should default to the character set of the database.
    Justin
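One concrete thing to try along the lines of the last point above: declare the file's encoding explicitly in the access parameters rather than relying on the default. A hypothetical fragment (the character set name must match the file's real encoding):

```sql
ACCESS PARAMETERS
(
   RECORDS DELIMITED BY NEWLINE
   CHARACTERSET AL32UTF8   -- or e.g. JA16SJIS if the file is Shift-JIS encoded
   FIELDS TERMINATED BY ','
   MISSING FIELD VALUES ARE NULL
)
```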
