Creating an ABAP data flow, open file error

Hello experts,
I am trying to pull all the fields of the MARA table into BODS, so I am using an ABAP data flow. After executing the job I get the error "can't open the .dat file".
I am new to ABAP data flows, so I think I may have made a mistake in the datastore configuration.
Can anyone guide me on how to create a datastore for an ABAP data flow?

In your SAP Applications datastore, are you using "Shared Directory" or "FTP" as the "Data transfer method"? Given the error, probably the former. In that case, the account used by the Data Services job server must have access to wherever SAP is putting the .DAT files.

When you run an ABAP data flow, SAP runs the ABAP extraction code (of course) and then exports the results to a .DAT file, which I believe is just a tab-delimited flat text file, in the folder given by "Working directory on SAP server." This path is specified from the perspective of the SAP server, e.g. "E:\BODS\TX", where the E:\BODS\TX folder is local to the SAP application server. I believe this folder is passed as a directive to the ABAP code, telling SAP where to put the .DAT files. The DS job server then picks the files up from there, and you tell it how to get there via "Application path to the shared directory," which, in the above case, might be "\\SAPDEV1\BODS\TX" if you shared out the E:\BODS folder as "BODS" and the SAP server was SAPDEV1.

Anyway: the DS job server needs to be able to read files at \\SAPDEV1\BODS\TX, and it may not have any rights to do so, especially if it's just logging in as Local System. That's likely your problem. In a Windows networking environment, I always have the DS job server log in using an AD account, which then needs to be granted privileges to, in our example's case, the \\SAPDEV1\BODS\TX folder. That also comes in handy for getting to data sources, sometimes.
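To make the relationships concrete, here is a sketch of the settings involved, using the example server name SAPDEV1 and share name BODS from above (your actual paths, share names, and accounts will differ):

    Data transfer method:                       Shared Directory
    Working directory on SAP server:            E:\BODS\TX         (path as seen by the SAP application server)
    Application path to the shared directory:   \\SAPDEV1\BODS\TX  (the same folder, as seen by the DS job server)
    DS job server service log-on:               an AD account granted read access to \\SAPDEV1\BODS\TX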
Best wishes,
Jeff Prenevost
Data Services Practice Manager
itelligence

Similar Messages

  • Abap Data flow error in BODS

    Hi All,
I have started to run a simple ABAP data flow in BODS; the data flow has been designed, and when it is executed the below error is thrown.
It seems to be some ABAP driver issue, not a problem with the data flow design.
Could anyone please suggest what the actual issue is here, and let me know if you need any further information.
    Thanks.
    Best Regards,
    Edu

    Hi Konakanchi,
Please share the job execution log for more details, such as whether it is an RFC configuration issue or a BODS issue. Meanwhile, can you please check the RFC connection test through SAP GUI?
    Thanks,
    Daya

  • RapidMarts - "Open file error" R3C-150607 loading from file

    Hello experts.
We are installing RapidMarts for the SA module, but when executing the main workflow in Data Services we get an error while trying to load the file dates.dat.
The generated ABAP looks just fine, and the file exists where it should be (in the SAP working directory), but it still cannot be loaded.
    This is the error msg:
    Data flow DF TimeDim SAP - Execute ABAP program <E:/RM generated ABAP dir/TimeDim.aba> error  > Open File Error -- E:/RM Working directory/dates.dat>.
    We have the following setup:
    Windows server
MS SQL Server 2005 database
    SAP BO Dataservices v12.2.9.0
    SAP BO RapidMarts for v12.2.0
    Any good ideas on how to fix this?
    Thanks in advance!
    Best regards,
    IngA

    Hi again.
    The problem is solved.
When configuring the R3 datastore in Data Services we had assigned the different directories (working dir, generated ABAP dir, etc.) on a server that is NOT our R3 server.
When we changed this to use folders on our SAP server, it works just fine (for this, at least).
    Cheers, IngA

  • DS 4.2 get ECC CDHDR deltas in ABAP data flow using last run log table

    I have a DS 4.2 batch job where I'm trying to get ECC CDHDR deltas inside an ABAP data flow.  My SQL Server log table has an ECC CDHDR last_run_date_time (e.g. '6/6/2014 10:10:00') where I select it at the start of the DS 4.2 batch job run and then update it to the last run date/time at the end of the DS 4.2 batch job run.
The problem is that CDHDR has the date (UDATE) and time (UTIME) in separate fields, and inside an ABAP data flow there are limited DS functions. For example, outside of the ABAP data flow I could use the DS function concat_date_time for UDATE and UTIME so that I could have a where clause of 'concat_date_time(UDATE, UTIME) > last_run_date_time and concat_date_time(UDATE, UTIME) <= current_run_date_time'. However, inside the ABAP data flow the DS function concat_date_time is not available. Is there some way to concatenate UDATE + UTIME inside an ABAP data flow?
    Any help is appreciated.
    Thanks,
    Brad

    Michael,
    I'm trying to concatenate date and time and here's my ABAP data flow where clause:
    CDHDR.OBJECTCLAS in ('DEBI', 'KRED', 'MATERIAL')
    and ((CDHDR.UDATE || ' ' || CDHDR.UTIME) > $CDHDR_Last_Run_Date_Time)
    and ((CDHDR.UDATE || ' ' || CDHDR.UTIME) <= $Run_Date_Time)
    Here are DS print statements showing my global variable values:
    $Run_Date_Time is 2014.06.09 14:14:35
    $CDHDR_Last_Run_Date_Time is 1900.01.01 00:00:01
    The issue is I just created a CDHDR record with a UDATE of '06/09/2014' and UTIME of '10:48:27' and it's not being pulled in the ABAP data flow.  Here's selected contents of the generated ABAP file (*.aba):
    PARAMETER $PARAM1 TYPE D.
    PARAMETER $PARAM2 TYPE D.
    concatenate CDHDR-UDATE ' ' into ALTMP1.
    concatenate ALTMP1 CDHDR-UTIME into ALTMP2.
    concatenate CDHDR-UDATE ' ' into ALTMP3.
    concatenate ALTMP3 CDHDR-UTIME into ALTMP4.
    IF ( ( ALTMP4 <= $PARAM2 )
    AND ( ALTMP2 > $PARAM1 ) ).
    So $PARAM1 corresponds to $CDHDR_Last_Run_Date_Time ('1900.01.01 00:00:01') and $PARAM2 corresponds to $Run_Date_Time ('2014.06.09 14:14:35').  But from my understanding ABAP data type D is for date only (YYYYMMDD) and doesn't include time, so is my time somehow being defaulted to '00:00:00' when it gets to DS?  I ask this as a CDHDR record I created on 6/6 wasn't pulled during my 6/6 testing but this 6/6 CDHDR record was pulled today.
    I can get  last_run_date_time and current_run_date_time into separate date and time fields but I'm not sure how to build the where clause using separate date and time fields.  Do you have any recommendations or is there a better way for me to pull CDHDR deltas in an ABAP data flow using something different than a last run log table?
    Thanks,
    Brad
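One possible way to build that where clause from separate date and time values (a sketch; the global variables $Last_Run_Date, $Last_Run_Time, $Run_Date and $Run_Time are illustrative names, not ones defined in the job above) is to compare the dates first and only fall back to the times on the boundary days:

    CDHDR.OBJECTCLAS in ('DEBI', 'KRED', 'MATERIAL')
    and ( CDHDR.UDATE > $Last_Run_Date
          or ( CDHDR.UDATE = $Last_Run_Date and CDHDR.UTIME > $Last_Run_Time ) )
    and ( CDHDR.UDATE < $Run_Date
          or ( CDHDR.UDATE = $Run_Date and CDHDR.UTIME <= $Run_Time ) )

This uses only simple comparisons on the native UDATE and UTIME columns, so it should stay within what the generated ABAP can push down.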

  • Intermittent too many open files error and Invalid TLV error

    Post Author: jam2008
    CA Forum: General
    I'm writing this up in the hopes of saving someone else a couple of days of hair-pulling...
    Environment: Crystal Reports XI Enterprise / also runtime via Accpac ERP 5.4
    Invalid TLV error in Accpac
    "too many open files" error in event.log file
    Situation:
The Invalid TLV error occurs seemingly randomly on a report created in CR Professional 11. Several days of troubleshooting finally led to the following diagnosis:
This error occurs in a report that contains MORE THAN 1 bitmap image.
The error only shows up after 20 or more reports have been generated sequentially, WITHOUT CLOSING the application that is calling the report (in our case, the Invoice Report dialog within Accpac). The same error occurred in a custom 3rd-party VB.NET app that also called the report through an Accpac API.

After getting this message you need to do 2 things:
1. Delete the current workspace, because it contains some bad data in one of the config files. Failure to delete the workspace will cause the error message to appear even when trying to upload a single file.
2. Add files to DTR in groups of no more than 500 per add.

  • Cannot Open File error - tried what I know

    I have received the dreaded Cannot Open File error. Usually I can manage to recover the file by deleting the lock file or recovering from a backup. This time, nothing is working.
    I've searched the forum, and while many people have this issue I cannot find a solution.
    Any suggestions?
    Thanks!

    Hi,
Have you checked this article for steps to recover a project from the cache (the Adobe Captivate Cache Projects folder in My Documents)?
http://blogs.adobe.com/captivate/2010/09/recovering-the-project.html
For the future, make sure you have enabled the option to automatically create a project backup on save, at the same location where your project is saved (Edit > Preferences > General > Generate Project Backup).
    Thanks,
    Anjaneai

  • Using ABAP DATA FLOW to pull data from APO tables

I am trying to use an ABAP data flow to pull data from APO and receive error 150301. I can do a direct table pull and receive no error, but when I try to put it in an ABAP data flow I get the issue. Any help would be great.

    Hi
I know you "closed" this; however, someone else might read it, so I'll add that when you use an ABAP data flow, logic can be pushed down to ECC (table joins, filters, etc.), which can be seen in the generated ABAP.
    Michael

  • Suddenly getting "Too Many Open File" error

    Dear All,
I have a listener program which has been working well for the past few months. Suddenly I have started to get a "Too many open files" error. What can the solution be? Do I need to increase the file descriptor limit, or is there some other solution? Thank you.

    Dear Ejp,
Attached below is my code. I have removed some of the db fields just to make the code a bit easier to read. Where do you think it is leaking?
public class commServer {
    public static void main(String[] args) {
        try {
            final ServerSocket serverSocketConn = new ServerSocket(8888);
            while (true) {
                try {
                    Socket socketConn1 = serverSocketConn.accept();
                    new Thread(new ConnectionHandler(socketConn1)).start();
                } catch (Exception e) {
                    System.out.println(e.toString());
                }
            }
        } catch (Exception e) {
            System.out.println(e.toString());
            //System.exit(0);
        }
    }
}
    class ConnectionHandler implements Runnable {
    private Socket receivedSocketConn1;
    DateFormat dateFormat = new SimpleDateFormat("dd/MM/yyyy HH:mm:ss");
    DateFormat formatter = new SimpleDateFormat("EEE, dd MMM yyyy");
    DateFormat inDf=new SimpleDateFormat("ddMMyyHHmmss");
    DateFormat outDf=new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    ConnectionHandler(Socket receivedSocketConn1) {
    this.receivedSocketConn1=receivedSocketConn1;
    //@Override
    public void run() {
    Connection dbconn = null;
    BufferedWriter w = null;
    BufferedReader r = null;
    try {
    PrintStream out = System.out;
         BufferedWriter fout = null;
    w = new BufferedWriter(new OutputStreamWriter(receivedSocketConn1.getOutputStream()));
    r = new BufferedReader(new InputStreamReader(receivedSocketConn1.getInputStream()));
    int m = 0, count=0;
    String line="";
    String n="";
    w.write("$PA\n");
    w.flush();
    while ((m=r.read()) != -1)
    Date dateIn = new Date();
         n = n + (char) m;
         int i = n.indexOf("GET");
                                  if(i != -1) {
                                       break;
         if (m==35)
         String ori = n;
         String noCheckSum = n.substring(0,(n.length()-4));
                   int addExist = n.indexOf("@");
    String[] slave = null;
    if(addExist!=-1)
         slave = noCheckSum.split("@");
              n = slave[0];
              //System.out.println(" slave : "+slave.length);
                   else
                        n = noCheckSum;
         String[] result = n.split(",");
         Statement stmt = null;
         w.write("$PA\n");
    w.flush();
                   int count1 = 0;
    Date date = Calendar.getInstance().getTime();
    String today = formatter.format(date);
                        String filename= "MyDataFile"+today+".txt";
    boolean append = true;
    FileWriter fw = null;
    try
    fw = new FileWriter(filename,append);
         fw.write(ori+" "+dateFormat.format(dateIn)+"\n");//appends the string to the file
         Date dateOut = new Date();
         fw.write("$PA"+" "+dateFormat.format(dateOut)+"\n");//appends the string to the file
    catch (IOException ex)
                        //ex.printStackTrace(new PrintWriter(sWriter));
                        System.out.println("MyError:IOException has been caught in in the file operation");
                        ex.printStackTrace();
                        finally
                   try
         if ( fw != null )
              fw.close();
         else
              System.out.println("MyError:fw is null in finally close");
                        catch(IOException ex){
                        System.out.println("MyError:IOException has been caught in fw is null in finally close");
                        ex.printStackTrace();
         try
                             String deviceID=result[3].trim();      
                             String dateTime=result[4].trim();                          
                             String[] result2 = result[10].split("'");                          
                             String gpsDate = result2[1].trim().substring(0,6);                    
                             String gpsTime = result2[1].trim().substring(6,12);                         
                             String gpsLat = result2[1].trim().substring(13,20);
                             String latitude = result2[1].trim().substring(13,20).substring(0,2)+"."+result2[1].trim().substring(13,20).substring(2,7);
                             String gpsLong = result2[1].trim().substring(21,29);
                             String longitude = result2[1].trim().substring(21,29).substring(0,3)+"."+result2[1].trim().substring(21,29).substring(3,8);
                             String speed = result2[1].trim().substring(30,33);                         
                             String course = result2[1].trim().substring(33,36);
                   String dateTimer = null;
                             try
                             Date inDate=null;
                             inDf.setTimeZone(TimeZone.getTimeZone("UTC"));
                             inDate=inDf.parse(dateTime);
                             outDf.setTimeZone(TimeZone.getTimeZone("Asia/Kuala_Lumpur"));
                                       dateTimer=outDf.format(inDate);
                             catch(ParseException ex)
                             System.out.println("MyError:Parse Error has been caught for date parse close");
              ex.printStackTrace();
                        dbconn = DriverManager.getConnection("jdbc:mysql://192.168.1.155:3306/db1?"+"user=db1&password=test1");
                   stmt = dbconn.createStatement();
                   String selectQuery2 = "Select * from tripData Where deviceID='"+ deviceID +"' and dateTimer>'"+dateTimer+"' Order By dateTimer Desc Limit 1";
                   ResultSet rs2 = stmt.executeQuery(selectQuery2);
                   String updateQuery = "";
                   if(rs2.next())
                        String previousLatitude="";
                        String previousLongitude="";
                        String previousSpeed="";
                        previousLatitude = rs2.getString("latitude");
                        previousLongitude = rs2.getString("longitude");
                        previousSpeed = rs2.getString("speed");
                        updateQuery = "UPDATE device SET " +
                                       "latitude='" + previousLatitude +
                                       "',longitude='" + previousLongitude +
                                       "',speed='" + previousSpeed +
                                       "' WHERE serialNumber='" + deviceID + "'";
                   else
                   updateQuery = "UPDATE device SET " +
                                       "latitude='" + latitude +
                                       "',longitude='" + longitude +
                                       "',speed='" + speed +
                                       "',course='" + course +
                                       "',dateTimer='" + dateTimer +
                                       "' WHERE serialNumber='" + deviceID + "'";
                        count = stmt.executeUpdate(updateQuery);                    
                   String insertQuery = "INSERT INTO tripData" +
                                       "(latitude,longitude,speed,course,dateTimer,deviceID)" + " VALUES ('" +
                                       latitude + "','" + longitude + "','" + speed + "','" + course + "','" + dateTimer + "','" + deviceID + "' )";
         count = stmt.executeUpdate(insertQuery);
              if(addExist!=-1)
              for(int iSlave=1; iSlave<slave.length ; iSlave++)
                                            String[] slaveDetails = slave[iSlave].split(",",-1);
                                            String slaveEventType = slaveDetails[0];
                                            String slaveGroup = slaveDetails[1];
                                            String slaveUnitID = slaveDetails[2];
                                            String slaveBattLevel = slaveDetails[3];
                                            String slaveSwitchStat = slaveDetails[4];
                                            String slaveTempHumid = slaveDetails[5];
                                       String slaveTemp="",slaveHumid="";                                   
                                       if(slaveTempHumid.length()>0)
                                       slaveHumid = slaveTempHumid.substring(5,7);
                                                 if(slaveTempHumid.charAt(0)=='P')
                                                 slaveTemp = "+"+slaveTempHumid.substring(1,3)+"."+slaveTempHumid.substring(3,5);                                                  
                                                 else if(slaveTempHumid.charAt(0)=='M')
                                                 slaveTemp = "-"+slaveTempHumid.substring(1,3)+"."+slaveTempHumid.substring(3,5);
                                       slaveBattLevel = slaveBattLevel.trim().substring(0,2)+"."+slaveBattLevel.trim().substring(2,3);
                                            String insertQuery2 = "INSERT INTO tripDataSlave" +
                                            "(dateTimer,deviceID,slaveUnitID,slaveEventType,slaveGroup,slaveBattLevel,slaveSwitchStat,slaveTemp,slaveHumidity)" + " VALUES ('" +
                                            dateTimer + "','" + deviceID + "','" + slaveUnitID + "','" + slaveEventType + "','" + slaveGroup + "','" + slaveBattLevel + "','" + slaveSwitchStat + "','" + slaveTemp + "','" + slaveHumid + "')";
              count = stmt.executeUpdate(insertQuery2);
                   catch (SQLException ex)
    System.out.println("MyError:Error : "+ex);
                        finally
                        try
                             if ( stmt != null )
                             stmt.close();
                             else
                                  System.out.println("MyError:stmt is null in finally close");
                        catch(SQLException ex){
                        System.out.println("MyError:SQLException has been caught for stmt close");
    ex.printStackTrace();
                        try
                             if ( dbconn != null )
                             dbconn.close();
                             else
                             System.out.println("MyError:dbconn is null in finally close");
                        catch(SQLException ex){
                        System.out.println("MyError:SQLException has been caught for dbconn close");
    ex.printStackTrace();
         n="";
    catch (IOException ex)
    System.out.println("MyError:IOException has been caught in in the main first try");
    ex.printStackTrace();
    finally
    try
         if ( w != null )
              w.close();
         else
              System.out.println("MyError:w is null in finally close");
    catch(IOException ex){
    System.out.println("MyError:IOException has been caught in w in finally close");
    ex.printStackTrace();
    }
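The reply above asks where the leak is. One likely candidate (an assumption based on the posted code, not something confirmed in the thread) is that run() closes the BufferedWriter w in its finally block but never closes the BufferedReader r or the socket receivedSocketConn1, so every accepted connection leaves file descriptors open until the OS limit is hit. A minimal sketch of a handler skeleton that releases everything per connection (the class name LeakFreeHandler is hypothetical):

    import java.io.*;
    import java.net.Socket;

    class LeakFreeHandler implements Runnable {
        private final Socket socket;

        LeakFreeHandler(Socket socket) {
            this.socket = socket;
        }

        public void run() {
            BufferedReader r = null;
            BufferedWriter w = null;
            try {
                r = new BufferedReader(new InputStreamReader(socket.getInputStream()));
                w = new BufferedWriter(new OutputStreamWriter(socket.getOutputStream()));
                // ... read the request, do the database work, write the response ...
            } catch (IOException ex) {
                ex.printStackTrace();
            } finally {
                // Close every per-connection resource; each close is isolated so that one
                // failure does not stop the others from running.
                try { if (r != null) r.close(); } catch (IOException ex) { ex.printStackTrace(); }
                try { if (w != null) w.close(); } catch (IOException ex) { ex.printStackTrace(); }
                try { socket.close(); } catch (IOException ex) { ex.printStackTrace(); }
            }
        }
    }

Raising the file descriptor limit only delays the error if the per-connection resources are never closed.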

  • "Could not get the audio data from the file" error

    Hi
I get a lot of "Could not get the audio data from the file" errors when opening a project. Sometimes they crash Premiere Pro CC 7.01 (Mac), but if not, Premiere works as usual, with the audio.
I tried converting the audio files (coming from Audition) from 32 to 16 bit. It worked when opening the project once, but not the following times.
Where could it come from?
Edit: I also get this error when importing audio: "Error: Premiere Pro version 7.0 is not compatible with the Premiere Pro Plug-in version 5.7.4". Maybe it is connected.

    Got it! It was the Smartsound plugin making trouble. Got rid of it and no more errors.

  • Could not open file error on Copy Express

    Hi Experts,
I'm using Copy Express to copy Item Master data from the origin db to the target db.
Right now I am only copying 1 item (I have tried several, and a variety, but still the same) and I hit a "Could not open file" error; it contains some Chinese characters, and I can't make out what it means or why the error is appearing.
Before this error, I also see an "XML ERROR" in the status that disappears very quickly. Then this error message follows.
    [Click here to view screenshot|http://www.flickr.com/photos/45736280@N07/5077898618/sizes/l/in/photostream/]
    Please help and Thanks!
    Warmest Regards,
    Chinho

    Hi,
The item can be copied using Copy Express, but you must make sure that there is no exclamation sign before running it.
The maximum number of items that can be copied is limited in Copy Express; it is only 2,000 items.
I suggest using the Data Transfer Workbench instead.
    JimM

  • How to store SP2-0310: unable to open file errors in log file

    We are using oracle 10g on Linux platform.
When we send scripts to the clients, we also send a control file that executes all the SQL files, e.g.:
    control.sql
    SPOOL test.log
    SELECT 'Start of Control File at:'||systimestamp from dual;
    SPOOL OFF
    @./dbscripts/00_insert_scripts.sql
    SPOOL test.log APPEND
    SELECT 'End of Control File at:'||systimestamp from dual;
SPOOL OFF
The scripts are executed from sqlplusw only by typing @control.sql.
The problem is that when the SP2-0310: unable to open file error occurs, it appears on the screen only; it does not go into the test.log file. Is there any way to store these errors in a file from sqlplusw?
    Thanks.

Try specifying the full path:
@<fully qualified path>/00_insert_scripts.sql
Also check whether the file has the necessary permissions:
ls -l <fully qualified path>/00_insert_scripts.sql
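If you also need the SP2-0310 messages themselves in a file, one possible approach (a sketch, assuming the scripts can be run with command-line sqlplus instead of the sqlplusw GUI, and with illustrative connect details and log file name) is to redirect everything SQL*Plus writes to the screen at the operating-system level:

    sqlplus user/password@tnsalias @control.sql > control_run.log 2>&1

The spool file still captures the query output, while the OS-level redirect also captures messages such as SP2-0310 that SQL*Plus writes only to the screen.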

  • Can't open file , error message 1310 & 2203

    can't open file , error message 1310 & 2203

Reproducing it is easy: any time I click to open an attachment or a PDF file, it won't open. A notice comes up that Adobe Acrobat has stopped working and the program will close. I can go to Reader and open the file, but not from Acrobat X Pro. Rather silly.
Error messages attached.
Thanks for the assist.
    Dan Valentine, GRI, e-PRO, MA min
    Broker-Owner
    Valentine Sales & Management
    "Building & Maintaining Rental Portfolios"
    "One-Stop-Shop: AcquireRehabLeaseManagementStrategic Disposition"
    www.ValentineTeam.com
    www.ValentinePropertyManagement.com
    www.InvestAZRentals.com
    12031 N. Cave Creek Rd.
    Phoenix, AZ 85020
    Cell-Text 602-377-0621
    Office- 602-866-0150
    Fax-602-926-8260
    [email protected]

  • ABAP Data flow

    Hi
Can we replicate an ABAP data flow and make modifications to the copy for a history data upload?

Yes, it is a copy, and whatever changes you make are not going to impact the other ABAP data flows.

  • ABAP Data Flows - Parallel Execution?

    Hi Guys,
If I have a data flow that includes within it multiple ABAP data flows, let's say 3, I see that when the job is started only 1 ABAP flow at a time will run, even though there is no precedence enforced and they could all kick off together.
    Is there a way to get these ABAP flows to all run in parallel in SAP?
    Thanks,
    Flip.

    Hi Flip,
I have never actually tried to do this, but I see that the Performance Optimization guide only specifies that data flows and workflows can be processed in parallel. So I would suppose that if you placed each ABAP data flow into its own separate data flow, and then encapsulated the 3 unlinked data flows into a workflow, the ABAP data flows should execute in parallel, since they are initiated by the data flows.
    Clint.

  • Weblogic Server Switch over automatically due to too many open files error.

    Hi,
I am facing a problem in the production environment. I am using a WebLogic 8.1 SP4 application.
The WebLogic server automatically switches over every 3 weeks due to a few reasons:
1. Out of memory error.
2. Too many open files error.
Please see my portalserver.log below. Kindly provide a good solution to solve this problem.
The following log is from the portalserver.log file:
    ###<May 6, 2009 8:28:15 PM ICT> <Notice> <WebLogicServer> <ebizdr> <portalServer> <ListenThread.Default> <<WLS Kernel>> <> <BEA-000205> <After having failed to listen, the server is now listening on port 9001.>
    ####<May 6, 2009 8:28:15 PM ICT> <Critical> <WebLogicServer> <ebizdr> <portalServer> <ListenThread.Default> <<WLS Kernel>> <> <BEA-000204> <Failed to listen on port 9001, failure count: 1, failing for 0 seconds, java.net.SocketException: Too many open files>
    ####<May 6, 2009 8:28:15 PM ICT> <Error> <HTTP> <ebizdr> <portalServer> <ExecuteThread: '5' for queue: 'default'> <<WLS Kernel>> <> <BEA-101019> <[ServletContext(id=18480784,name=NBIAProject,context-path=)] Servlet failed with IOException
    java.io.FileNotFoundException: /var/opt/weblogic/user_projects/domains/eBizPortalDomain/portalServer/.wlnotdelete/NBIAPortalApp/NBIAProject/images/go.gif (Too many open files)
         at java.io.FileInputStream.open(Native Method)
         at java.io.FileInputStream.<init>(FileInputStream.java:106)
         at weblogic.utils.classloaders.FileSource.getInputStream(FileSource.java:23)
         at weblogic.servlet.FileServlet.sendFile(FileServlet.java:563)
         at weblogic.servlet.FileServlet.service(FileServlet.java:206)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at weblogic.servlet.internal.ServletStubImpl$ServletInvocationAction.run(ServletStubImpl.java:1006)
         at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:419)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:28)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
         at com.bea.p13n.servlets.PortalServletFilter.doFilter(PortalServletFilter.java:293)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:27)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:6724)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3764)
         at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2644)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:219)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:178)
    >
    ####<May 6, 2009 8:28:16 PM ICT> <Notice> <WebLogicServer> <ebizdr> <portalServer> <ListenThread.Default> <<WLS Kernel>> <> <BEA-000205> <After having failed to listen, the server is now listening on port 9001.>
    ####<May 6, 2009 8:28:16 PM ICT> <Critical> <WebLogicServer> <ebizdr> <portalServer> <ListenThread.Default> <<WLS Kernel>> <> <BEA-000204> <Failed to listen on port 9001, failure count: 1, failing for 0 seconds, java.net.SocketException: Too many open files>
    ####<May 6, 2009 8:28:16 PM ICT> <Error> <HTTP> <ebizdr> <portalServer> <ExecuteThread: '5' for queue: 'default'> <<WLS Kernel>> <> <BEA-101019> <[ServletContext(id=18480784,name=NBIAProject,context-path=)] Servlet failed with IOException
    java.io.FileNotFoundException: /var/opt/weblogic/user_projects/domains/eBizPortalDomain/portalServer/.wlnotdelete/NBIAPortalApp/NBIAProject/images/search_right.gif (Too many open files)
    Thanks & Regards,
    Suriyaprakash.V

Sorry for the late response. Here's what development suggests be investigated:
    I would want to know:
    Can they do a "$ORACLE_HOME/bin/dmstool -dump" and
    save/compress/send the results?
    Are there any errors printed in the Apache error_log while this leak occurs?
    The customer could/should re-check their TCP settings, especially the TCP time wait interval and generally follow the TCP settings recommended in Chapter 5 of the iAS Performance Guide for 9.0.2.
    Is there anything else interesting/unusual about the site?
    Let us know how it goes.
