Meaning of "data-failed" files in queue?

We're running iMS 6.3-1.04 on Sol9.
We're getting a great many zero-length spool files of the form
./tcp_intranet/spool/0D5B75042B43D85F46B07FBF0BFA9D31.data-failed
in our tcp_intranet queue.
I can't find any mention of these in the manual, or online. Does anyone here have any ideas?
Chris

The final followup:
The culprit is Microsoft's .NET mail. When it completes the sending of its email, it issues the QUIT SMTP command, but doesn't wait for the Messaging Server to reply!
The .NET client just terminates the connection immediately after sending the QUIT, without waiting for the server's reply, so the Messaging Server is bound by the SMTP RFC (RFC 2821) to signal an error.
Since Microsoft has come back and said, "Huh? No, no, we're not doing anything wrong!", and since the "data-failed" files are 0 bytes and are regularly cleaned up, we're just going to ignore the problem.
Just FYI, in case anyone else is seeing these.
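For contrast, a well-behaved client waits for the server's 221 goodbye after QUIT. Here is a minimal sketch (the fake in-process server and host name are illustrative stand-ins for a real MTA) using Python's smtplib, whose quit() sends QUIT and then waits for the reply before closing the socket:

```python
import smtplib
import socket
import threading

# Minimal fake SMTP server: just enough of the protocol to show the
# QUIT exchange (an illustrative stand-in, not a real MTA).
def serve_once(listener, transcript):
    conn, _ = listener.accept()
    conn.sendall(b"220 fake.example ESMTP\r\n")
    reader = conn.makefile("rb")
    for raw in reader:
        line = raw.strip().decode("ascii", "replace")
        transcript.append(line)
        verb = line.split()[0].upper() if line else ""
        if verb == "QUIT":
            # The server replies 221, and only then does the connection close.
            conn.sendall(b"221 2.0.0 bye\r\n")
            break
        conn.sendall(b"250 ok\r\n")
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

transcript = []
server = threading.Thread(target=serve_once, args=(listener, transcript))
server.start()

client = smtplib.SMTP("127.0.0.1", port)
# smtplib's quit() sends QUIT and *waits* for the server's 221 reply
# before closing the socket -- the step the .NET client skipped.
code, _ = client.quit()
server.join()
listener.close()
print(code)  # -> 221
```

A client that instead closed the socket right after writing QUIT would leave the server's 221 undelivered, which is the premature-close behavior described above.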

Similar Messages

  • What does a greyed out folder mean? (after failed file copy from Time Capsule)

    I did a clean install of Mavericks and I'm trying to copy my files back from my Time Capsule.
    The transfer rate is terrible, despite being a direct link via gigabit ethernet.
    I have about 800GB of Logic projects on the Time Capsule, so I set it off to copy them over.
    After 2 days and 233GB copied across, my iMac hung and I had to do a hard reset.
    I now have a 233GB folder on my desktop that is lighter in colour and I can't open it. Nothing happens.
    Is there any way to resume the copy, or am I looking at starting again, leaving my iMac untouched for 5 days and hoping for the best?
    The tech just doesn't work reliably, despite being a fresh install of Mavericks and an official Apple Time Capsule 2TB. No idea what I'd do if I had 2 or 3TB of data...

    Phil,
    If you haven't already, you may want to restart the TC first, to ensure that it doesn't think that folder or its contents are still in use in some way.
    If that doesn't allow deletion, then you may need to use terminal to delete it regardless of permissions.  Be careful when doing the following and make sure that all spaces and typing are done exactly as given below:
    1) Log into the mac as an account that is an Administrator
    2) Open Terminal, then arrange Terminal and Finder so you can see them both
    3) Open a Finder where you can see the grayed-out folders on the TC
    4) In terminal, type the following, including a [space] at the end (but don't hit enter yet)
    sudo rm -r
    5) From Finder, drag the bad folder to the terminal window and release.  This will type the correct path into terminal.  Ignore and leave any backslashes that are added to the path.  You should end up with a command that looks like this:
    sudo rm -r /Volumes/MyTC/ParentFolder/BadFolder
    6) Hit enter ONLY if the path looks correct and ends in the name of the bad folder you are trying to kill.  If the path looks wrong or does not end in the targeted folder name, then do not hit enter and close terminal.
    7) Once you hit enter you will be prompted for the password of your currently logged in user.
    8) It should then blow away the folder.
    9) Close terminal

  • File server gives data failures while working on DWG files

    While users are working on DWG files, the server gives a data failure?!

    Hi,
    Thanks for your posting.
    Please provide more detailed information about the issue. If the issue only occurs with DWG files, then as Milos mentioned, you could ask the software vendor for help.
    http://forums.autodesk.com/t5/dwg-trueview/bd-p/109
    Thank you for your understanding and support. 
    Best Regards,
    Mandy
    We are trying to better understand customer views on the social support experience, so your participation in this interview project would be greatly appreciated if you have time.
    Thanks for helping make community forums a great place.

  • ORA-39080: failed to create queues "" and "" for Data Pump job

    When I am running datapump expdp I receive the following error:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    ORA-31626: job does not exist
    ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user CHESHIRE_POLICE_LOCAL
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPV$FT_INT", line 600
    ORA-39080: failed to create queues "" and "" for Data Pump job
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
    ORA-01403: no data found
    Sys has the following two objects as invalid at present after running catproc.sql and utlrp.sql and manual compilation:
    OBJECT_NAME OBJECT_TYPE
    AQ$_KUPC$DATAPUMP_QUETAB_E QUEUE
    SCHEDULER$_JOBQ QUEUE
    While I run catdpb.sql the datapump queue table does not create:
    BEGIN
      dbms_aqadm.create_queue_table(
        queue_table        => 'SYS.KUPC$DATAPUMP_QUETAB',
        multiple_consumers => TRUE,
        queue_payload_type => 'SYS.KUPC$_MESSAGE',
        comment            => 'DataPump Queue Table',
        compatible         => '8.1.3');
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE = -24001 THEN
          NULL;
        ELSE
          RAISE;
        END IF;
    END;
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at line 7

    Snehashish Ghosh wrote:
    compatible=>'8.1.3'
    Does it work better when specifying an Oracle version that is from this century, newer than V8.1?

  • The Finder can't complete the operation because some data in "FILE NAME" can't be read or written. (Error code -36)

    I have a RAID that my Mac says is damaged. It does mount, so I want to copy as much of the data as I can. I tried SuperDuper!, but it crashed (or gave me an error and stopped). I tried copying and pasting the entire drive (1.75TB), but it stopped with this error, and the same error happens with some of the files when I try to copy and paste (drag and drop) them individually.
    I get this error: The Finder can't complete the operation because some data in "FILE NAME" can't be read or written. (Error code -36)

    Well, in pure Apple File System talk... -36            ioErr            I/O error (bummers)
    Not very helpful, but generally means a Drive quit being Readable or Writable since starting the operation...
    http://fuzzy.wordpress.com/2006/12/10/ioerror-36/
    If Disk Utility or fsck should fail to repair it, your best bet is DiskWarrior from Alsoft, you'll need the CD to boot from if you don't have another boot drive...
    http://www.alsoft.com/DiskWarrior/
    Might try the free trial of my favorite, Tri-Backup; it does a better copy/clone job in that it usually doesn't quit copying at every error, and has a nice log afterwards so you can try certain files again if you wish...
    http://www.tri-edre.com/english/products/tribackup.html

  • Data retrieval from JMS queue using single SOA composite in a clustered env

    Hi,
    I have following situation:
    1) A requester SOA composite (Composite1) is reading data from a file.
    2) A Mediator is routing data received from Composite1 and writing it to a JMS queue.
    3) A provider composite (Composite2) is reading from the JMS queue.
    Both Composite1 and Composite2 are deployed in a clustered environment.
    The problem is that Composite2 is sometimes not able to read data sequentially from the JMS queue (for example, out of 30000 transactions it fails once). Here it's a queue, and there is only one producer and only one consumer.
    What may cause this issue? I thought a single producer and a single consumer on a queue would guarantee FIFO retrieval. Could you please advise where the gap may be?
    Thanks
    Edited by: user1054373 on Sep 18, 2012 11:29 PM

    Hi,
    You wrote: "I thought a single producer and single consumer on a queue will guarantee FIFO retrieval." The Java Message Service specification does not guarantee ordered message delivery when applications use distributed queues...
    Using Weblogic Message Unit-of-Order may solve your issue...
    http://docs.oracle.com/cd/E23943_01/web.1111/e13727/uoo.htm#JMSPG389
    Hope this helps...
    Cheers,
    Vlad
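The unit-of-order idea Vlad points to can be sketched language-neutrally. This is an illustrative Python model, not WebLogic API code: every message that shares an order key is pinned to one member of a distributed destination, so per-key FIFO survives clustering.

```python
import zlib
from collections import deque

# Illustrative model of the Unit-of-Order idea (not WebLogic code):
# route every message sharing an order key to the same queue member,
# so per-key FIFO ordering survives a clustered, distributed destination.
members = [deque(), deque(), deque()]   # three hypothetical cluster members

def route(order_key, msg):
    # A stable hash pins each key to one member for its whole lifetime.
    idx = zlib.crc32(order_key.encode()) % len(members)
    members[idx].append(msg)

for i in range(5):
    route("order-42", "msg%d" % i)      # all five land on one member, in order

pinned = [m for m in members if m]      # exactly one member got traffic
```

Without a pinning key, consecutive messages may land on different members and be consumed out of order, which matches the rare ordering failures described in the question.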

  • HT1692 I use an iPhone 5 and want to sync my phone calendar and contacts with Outlook 7 calendar and contacts; two problems: I now get dozens of new calendars every time, and my calendar will not sync at all, meaning that date entries in Outlook do not come across to iPhone

    I use an iPhone 5 and want to sync my phone calendar and contacts with Outlook 7 calendar and contacts; two problems: I now get dozens of new calendars every time, and my Outlook calendar will not sync at all, meaning that date entries in Outlook do not come across to iPhone.

    It is unusual for the calendar to sync and not the contacts in Outlook. I've worked with Outlook for years. You didn't say which computer OS you are using. If it is Windows, have you tried resetting the sync history in iTunes? Do that by opening iTunes, going to Edit > Preferences > Devices, and clicking Reset Sync History. If you have done this and it doesn't help, then we can try running scanpst.exe on your Outlook file to see if there are any errors. Search your computer for that file; it normally resides in one of the Microsoft Office folders under Program Files. After that, you can see if it will sync.

  • What does the "logical end-of-file reached" message mean? (screen attached)

    I see that when loading a project. Everything is OK after I press OK, but what does it mean?

    Not seeing a picture... are there audio files associated with this project?
    "Logical end-of-file reached" usually refers to a corrupt audio file, where the file length recorded in the data header does not match the actual file length. In other words, the audio file is truncated, or the file header describing it is corrupt.
    It could also be that an audio file has been marked "read only", but that is less likely.
    Are you sure all of your audio files are working correctly?
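The header-vs-length mismatch described above can be checked mechanically. A minimal sketch for the canonical 44-byte PCM WAV layout (an assumption; real files may carry extra chunks, and Logic's own formats differ):

```python
import struct

# Sketch of a "logical end-of-file" style check for a canonical PCM WAV
# file: the header's declared data-chunk size can disagree with the bytes
# actually present, which indicates a truncated file.
def wav_truncated(data: bytes) -> bool:
    declared = struct.unpack_from("<I", data, 40)[0]  # data-chunk size field
    actual = len(data) - 44                           # bytes after the header
    return actual < declared

# Build a tiny valid header declaring 8 bytes of samples.
header = (b"RIFF" + struct.pack("<I", 36 + 8) + b"WAVE"
          + b"fmt " + struct.pack("<IHHIIHH", 16, 1, 1, 8000, 16000, 2, 16)
          + b"data" + struct.pack("<I", 8))
ok = header + b"\x00" * 8    # complete file: header matches contents
cut = header + b"\x00" * 4   # truncated: only 4 of 8 declared bytes present
```

A real repair tool would either rewrite the declared size to match the surviving bytes or pad the data chunk, which is roughly what "fixing" such a file amounts to.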

  • Error occured while reading identity data: failed to decrypt safe contents

    Hello,
    We are trying to access the Tibco JMS server through SSL using a JNDI lookup. We get the following error while executing a sample Java file.
    Java Version -
    java version "1.4.2_05"
    Java(TM) 2 Runtime Environment, Standard Edition (build 1.4.2_05-b04)
    Java HotSpot(TM) Client VM (build 1.4.2_05-b04, mixed mode)
    Please let me know if any of you have faced similar issues.
    Thanks in advance.
    Following are the error messages.
    javax.jms.JMSSecurityException: Error occured while reading identity data: failed to decrypt safe contents entryCOM.rsa.jsafe.SunJSSE_cs: Could not perform unpadding: invalid pad byte.
    at com.tibco.tibjms.TibjmsSSL._identityFromStore(TibjmsSSL.java:2699)
    at com.tibco.tibjms.TibjmsSSL.createIdentity(TibjmsSSL.java:2604)
    at com.tibco.tibjms.TibjmsxLinkSSL._initSSL(TibjmsxLinkSSL.java:291)
    at com.tibco.tibjms.TibjmsxLinkSSL.connect(TibjmsxLinkSSL.java:338)
    at com.tibco.tibjms.TibjmsConnection._create(TibjmsConnection.java:611)
    at com.tibco.tibjms.TibjmsConnection.<init>(TibjmsConnection.java:1772)
    at com.tibco.tibjms.TibjmsTopicConnection.<init>(TibjmsTopicConnection.java:37)
    at com.tibco.tibjms.TibjmsxCFImpl._createImpl(TibjmsxCFImpl.java:139)
    at com.tibco.tibjms.TibjmsxCFImpl._createConnection(TibjmsxCFImpl.java:201)
    at com.tibco.tibjms.TibjmsTopicConnectionFactory.createTopicConnection(TibjmsTopicConnectionFactory.java:84)
    at tibjmsSSLJNDI.<init>(tibjmsSSLJNDI.java:202)
    at tibjmsSSLJNDI.main(tibjmsSSLJNDI.java:252)
    ##### Linked Exception:
    com.tibco.security.AXSecurityException: failed to decrypt safe contents entryCOM.rsa.jsafe.SunJSSE_cs: Could not perform unpadding: invalid pad byte.
    at com.tibco.security.impl.j2se.IdentityImpl.init(IdentityImpl.java:70)
    at com.tibco.security.IdentityFactory.createIdentity(IdentityFactory.java:61)
    at com.tibco.tibjms.TibjmsSSL._identityFromStore(TibjmsSSL.java:2680)
    at com.tibco.tibjms.TibjmsSSL.createIdentity(TibjmsSSL.java:2604)
    at com.tibco.tibjms.TibjmsxLinkSSL._initSSL(TibjmsxLinkSSL.java:291)
    at com.tibco.tibjms.TibjmsxLinkSSL.connect(TibjmsxLinkSSL.java:338)
    at com.tibco.tibjms.TibjmsConnection._create(TibjmsConnection.java:611)
    at com.tibco.tibjms.TibjmsConnection.<init>(TibjmsConnection.java:1772)
    at com.tibco.tibjms.TibjmsTopicConnection.<init>(TibjmsTopicConnection.java:37)
    at com.tibco.tibjms.TibjmsxCFImpl._createImpl(TibjmsxCFImpl.java:139)
    at com.tibco.tibjms.TibjmsxCFImpl._createConnection(TibjmsxCFImpl.java:201)
    at com.tibco.tibjms.TibjmsTopicConnectionFactory.createTopicConnection(TibjmsTopicConnectionFactory.java:84)
    at tibjmsSSLJNDI.<init>(tibjmsSSLJNDI.java:202)
    at tibjmsSSLJNDI.main(tibjmsSSLJNDI.java:252)
    Subexception stack trace follows:
    java.io.IOException: failed to decrypt safe contents entryCOM.rsa.jsafe.SunJSSE_cs: Could not perform unpadding: invalid pad byte.
    at com.sun.net.ssl.internal.ssl.PKCS12KeyStore.engineLoad(Unknown Source)
    at java.security.KeyStore.load(Unknown Source)
    at com.tibco.security.impl.j2se.IdentityImpl.init(IdentityImpl.java:66)
    at com.tibco.security.IdentityFactory.createIdentity(IdentityFactory.java:61)
    at com.tibco.tibjms.TibjmsSSL._identityFromStore(TibjmsSSL.java:2680)
    at com.tibco.tibjms.TibjmsSSL.createIdentity(TibjmsSSL.java:2604)
    at com.tibco.tibjms.TibjmsxLinkSSL._initSSL(TibjmsxLinkSSL.java:291)
    at com.tibco.tibjms.TibjmsxLinkSSL.connect(TibjmsxLinkSSL.java:338)
    at com.tibco.tibjms.TibjmsConnection._create(TibjmsConnection.java:611)
    at com.tibco.tibjms.TibjmsConnection.<init>(TibjmsConnection.java:1772)
    at com.tibco.tibjms.TibjmsTopicConnection.<init>(TibjmsTopicConnection.java:37)
    at com.tibco.tibjms.TibjmsxCFImpl._createImpl(TibjmsxCFImpl.java:139)
    at com.tibco.tibjms.TibjmsxCFImpl._createConnection(TibjmsxCFImpl.java:201)
    at com.tibco.tibjms.TibjmsTopicConnectionFactory.createTopicConnection(TibjmsTopicConnectionFactory.java:84)
    at tibjmsSSLJNDI.<init>(tibjmsSSLJNDI.java:202)
    at tibjmsSSLJNDI.main(tibjmsSSLJNDI.java:252)
    Caused by: COM.rsa.jsafe.SunJSSE_cs: Could not perform unpadding: invalid pad byte.
    at COM.rsa.jsafe.SunJSSE_al.a(Unknown Source)
    at COM.rsa.jsafe.SunJSSE_ag.a(Unknown Source)
    at com.sun.net.ssl.internal.ssl.PKCS12KeyStore.a(Unknown Source)
    ... 16 more
    Subexception stack trace follows:
    java.io.IOException: failed to decrypt safe contents entryCOM.rsa.jsafe.SunJSSE_cs: Could not perform unpadding: invalid pad byte.
    at com.sun.net.ssl.internal.ssl.PKCS12KeyStore.engineLoad(Unknown Source)
    at java.security.KeyStore.load(Unknown Source)
    at com.tibco.security.impl.j2se.IdentityImpl.init(IdentityImpl.java:66)
    at com.tibco.security.IdentityFactory.createIdentity(IdentityFactory.java:61)
    at com.tibco.tibjms.TibjmsSSL._identityFromStore(TibjmsSSL.java:2680)
    at com.tibco.tibjms.TibjmsSSL.createIdentity(TibjmsSSL.java:2604)
    at com.tibco.tibjms.TibjmsxLinkSSL._initSSL(TibjmsxLinkSSL.java:291)
    at com.tibco.tibjms.TibjmsxLinkSSL.connect(TibjmsxLinkSSL.java:338)
    at com.tibco.tibjms.TibjmsConnection._create(TibjmsConnection.java:611)
    at com.tibco.tibjms.TibjmsConnection.<init>(TibjmsConnection.java:1772)
    at com.tibco.tibjms.TibjmsTopicConnection.<init>(TibjmsTopicConnection.java:37)
    at com.tibco.tibjms.TibjmsxCFImpl._createImpl(TibjmsxCFImpl.java:139)
    at com.tibco.tibjms.TibjmsxCFImpl._createConnection(TibjmsxCFImpl.java:201)
    at com.tibco.tibjms.TibjmsTopicConnectionFactory.createTopicConnection(TibjmsTopicConnectionFactory.java:84)
    at tibjmsSSLJNDI.<init>(tibjmsSSLJNDI.java:202)
    at tibjmsSSLJNDI.main(tibjmsSSLJNDI.java:252)
    Caused by: COM.rsa.jsafe.SunJSSE_cs: Could not perform unpadding: invalid pad byte.
    at COM.rsa.jsafe.SunJSSE_al.a(Unknown Source)
    at COM.rsa.jsafe.SunJSSE_ag.a(Unknown Source)
    at com.sun.net.ssl.internal.ssl.PKCS12KeyStore.a(Unknown Source)
    ... 16 more

    For the benefit of others: the issue is resolved.
    When we set the certificate password inside our application, we encrypted it inside our system.
    When we sent it to Tibco, we did not decrypt it.
    So the encrypted password was sent as-is; that was the issue :(
    Thanks,
    Reflex.

  • Multipart/form-data and file attachment

    Hi ,
    This question has probably been asked before, but if not, here it is. Any replies will be appreciated:
    Q. When using Enctype="multipart/form-data" with a file attachment along with other form fields, is it mandatory to attach a file? What if the user selects no file to attach?
    Q. If not, then how can a form be submitted without attaching a file, since when I try to submit a form with no file attached, it gives me an error message saying: java.lang.NullPointerException?
    Q. Does it mean that I can't have a form with a blank "File" input field if the form's Enctype is "multipart/form-data"? Users may not select a file to attach to the form; in other words, it is optional.
    I hope I was clear enough in explaining my questions.
    Thanks in advance.

    I am using O'Reilly's file attachment package.
    Here's what I am doing in my JSP page. It does the following:
    int maxFileSize = 10 * 1024 * 1024; // 10MB max
    String uploadDir = "/direct/files/upload/";
    String FormResults = "";
    String FileResults = "";
    String fileName = "";
    String fileName2 = "";
    String paramName="";
    String paramValue="";     
    File f;
    int filecounter=1;
    First, get the form fields using the following code:
    MultipartRequest multi = new MultipartRequest(request, uploadDir, maxFileSize);
    Enumeration params = multi.getParameterNames();
    // Get the form information
    while (params.hasMoreElements()) {
         paramName = (String) params.nextElement();
         paramValue = multi.getParameter(paramName);
         if (paramName.equals("emailconfirm"))
              emailconfirmation = paramValue;
         else if (paramName.equals("Requester"))
              Requester = paramValue;
         else if (paramName.equals("TodaysDate"))
              TodaysDate = paramValue;
         else if (paramName.equals("Extension"))
              Extension = paramValue;
    } // end while
    Then it gets the file information using the following code. I have two file fields in my form, so I use a filecounter to find out whether the user attached two files or just one:
    Enumeration files = multi.getFileNames();
    while (files.hasMoreElements()) {
         String formName = (String) files.nextElement();
         // getFilesystemName() returns null when the user left this file
         // field empty; skipping it avoids the NullPointerException
         if (multi.getFilesystemName(formName) == null)
              continue;
         if (filecounter == 2) {
              fileName2 = multi.getFilesystemName(formName);
              String fileType = multi.getContentType(formName);
              f = multi.getFile(formName);
              FileResults += "<BR>" + formName + "=" + fileName2 + ": Type= " + fileType + ": Size= " + f.length();
         } else {
              fileName = multi.getFilesystemName(formName);
              String fileType = multi.getContentType(formName);
              f = multi.getFile(formName);
              FileResults += "<BR>" + formName + "=" + fileName + ": Type= " + fileType + ": Size= " + f.length();
         }
         filecounter = filecounter + 1;
    }
    Then after composing the mail message I send email with the form fields and file attachments using following code:
    Properties props = new Properties();
    MimeBodyPart mbp1 = new MimeBodyPart();
    MimeBodyPart mbp2 = new MimeBodyPart();
    MimeBodyPart mbp3 = new MimeBodyPart();
    // URLDecoder.decode is static; no instance is needed
    String to1 = URLDecoder.decode(toemail);
    String from1 = URLDecoder.decode(fromemail);
    String cc1 = URLDecoder.decode(ccemail);
    props.put( "mail.host", host );
    Session session1 = Session.getDefaultInstance(props, null);
    // Construct the message
    Message msg = new MimeMessage( session1 );
    msg.setFrom( new InternetAddress( from1 ) );
    msg.setRecipients( Message.RecipientType.TO, InternetAddress.parse( to1, false ) );
    msg.setRecipients( Message.RecipientType.CC, InternetAddress.parse( cc1, false ) );
    msg.setSubject( subject );
    msg.setHeader( "X-Mailer", "ExceptionErrorMail" );
    msg.setSentDate( new Date() );
    mbp1.setContent(mail_message, "text/html"); // setContent replaces any earlier setText
    // Send the email message
    FileDataSource fds = new FileDataSource(uploadDir + fileName);
    FileDataSource fds2 = new FileDataSource(uploadDir + fileName2);
    mbp2.setDataHandler(new DataHandler(fds));
    mbp3.setDataHandler(new DataHandler(fds2));
    mbp2.setFileName(fileName);
    mbp3.setFileName(fileName2);
    Multipart mp = new MimeMultipart();
    mp.addBodyPart(mbp1);
    mp.addBodyPart(mbp2);
    mp.addBodyPart(mbp3);
    msg.setContent(mp);
    Transport.send( msg );
    //email sent...
    //delete the two files from the server..
    File f2 =new File(uploadDir + fileName);
    f2.delete();
    File f3 =new File(uploadDir + fileName2);
    f3.delete();     
    //End of code
    So when I don't attach a file and submit my form , I get the error message that I mentioned in my previous post.
    Any more ideas?

  • Lotus Notes Backup data .nsf file corrupted

    Hello friends,
    One of my users' backed-up Lotus Notes data (an .nsf file) got corrupted. The file size is 4 GB. Now the user is asking me to retrieve the data (mails), and I am not able to find any utility.
    Does anyone have an idea about a free utility/tool to repair a corrupt .nsf file?
    Thanks in advance.

    Lotus Notes database corruption usually occurs when the software application or the operating system crashes while the .nsf data file is open in memory. In most cases, headers or parts of the data file are not saved to disk, causing the data corruption or the application to fail. There are ways to recover all or part of your data file. You should immediately make 2 or 3 backup copies of the corrupted file(s) and stop using the computer except to send the file(s) for recovery analysis.
    There is a tool called NSF Repair Kit that repairs and recovers Lotus Notes NSF files and restores them in their original format to any user-intended location.
    1. Make a new copy of the database and deselect Access Control List in the “Specify What to Copy” section of the Copy Database dialog box. This will bypass the corrupt ACL and create a default ACL that you can then modify as necessary.
    2. Replace the design of a mail database with a non-mail template and then do a replace back to the mail template, which has been known to correct corrupted ACLs.
    3. From the Administration client > Files tab, select a healthy ACL database where the ACL is similar to the one that needs to be replaced. Right-click and select ACL.Copy. Select the bad ACL database, then right-click and select ACL.Paste. This is an easy way to reset the ACL.
    You may try the repair tool at this link:
    http://www.nsf.repair/
    If all of these fail, go to your Lotus directory, search for the data directory, and take a full backup of the archive directory.
    Another possible source of a solution: http://www.filerepairforum.com/forum/other/other-aa/lotus-notes/247-lotus-notes-8-5
    Hope this helps!

  • Export and Import of data fails

    Hi,
    Import and export of data fail. Sometimes the application itself crashes when I try to import data. The error I get while importing data using a rule file is:
    Reading Rule SQL Information For Database [Rev_FLA]
    Reading Rules From Rule Object For Database [Rev_FLA]
    Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
    Unable to Allocate Aligned Memory for [pMemByte] in [adDatGetFreeBuffer].
    Reducing cache memory pages from [3996] to [1545] due to a lack of virtual memory.
    Transaction [ 0x1004a( 0x4e576783.0x92ba8 ) ] aborted due to status [1130203].
    Data Load Transaction Aborted With Error [1130203]
    Unexpected Essbase error 1003050
    The error from the application log is
    [Fri Aug 26 14:59:29 2011]Local/Plan_UAT/Rev_FLA/HypUser/25336/Info(1013166)
    Received Command [Import] from user [HypUser] using [L_RevFLA.rul]
    [Fri Aug 26 14:59:29 2011]Local/Plan_UAT/Rev_FLA/HypUser/25336/Info(1003040)
    Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
    [Fri Aug 26 14:59:30 2011]Local/Plan_UAT/Rev_FLA/HypUser/22716/Info(1008139)
    Unable to Allocate Aligned Memory for [pMemByte] in [adDatGetFreeBuffer].
    [Fri Aug 26 14:59:30 2011]Local/Plan_UAT/Rev_FLA/HypUser/22716/Info(1006064)
    Reducing cache memory pages from [3996] to [1545] due to a lack of virtual memory.
    [Fri Aug 26 14:59:39 2011]Local/Plan_UAT/Rev_FLA/HypUser/22716/Warning(1080014)
    Transaction [ 0x1004a( 0x4e576783.0x92ba8 ) ] aborted due to status [1130203].
    [Fri Aug 26 14:59:41 2011]Local/Plan_UAT/Rev_FLA/HypUser/25336/Error(1003050)
    Data Load Transaction Aborted With Error [1130203]
    [Fri Aug 26 14:59:41 2011]Local/Plan_UAT/Rev_FLA/HypUser/25336/Info(1003037)
    Data Load Updated [1.52368e+007] cells
    [Fri Aug 26 14:59:41 2011]Local/Plan_UAT/Rev_FLA/HypUser/25336/Error(1013289)
    Command [Import] failed due to memory allocation failure
    What am I supposed to do? This happens in all three cubes of the application I have. In all three cubes, the index and data caches are set to 0.5 GB, and the server RAM is 64GB.
    The size of the file I am trying to import is not more than 0.5 GB.
    Regards,
    Ragav.

    You need to check how much memory your application is consuming, if it is windows 32bit then that will need to be less than 2GB.
    It sounds to me like it is asking for more than 2GB and cannot address any more; the combination of the caches for all the databases in the application could be taking it close to the maximum, so when the application asks for more memory it cannot get any from the OS.
    Cheers
    John
    http://john-goodwin.blogspot.com/
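John's reasoning can be checked with back-of-the-envelope arithmetic (figures taken from the thread: 0.5 GB index cache plus 0.5 GB data cache per cube, three cubes, and an assumed ~2 GB address space for a 32-bit process):

```python
# Back-of-the-envelope check of the memory-exhaustion explanation above.
# Assumed figures from the thread: index and data cache each 0.5 GB,
# three cubes in the application, ~2 GB addressable by a 32-bit process.
GB = 1024 ** 3
cache_per_cube = 0.5 * GB + 0.5 * GB   # index cache + data cache
total_cache = 3 * cache_per_cube       # three cubes in the application
limit_32bit = 2 * GB                   # 32-bit process address-space ceiling
print(total_cache > limit_32bit)       # -> True: allocation can fail
```

The configured caches alone (3 GB) exceed what a 32-bit process can address, before counting the process image and load buffers, which is consistent with the "Unable to Allocate Aligned Memory" messages in the log.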

  • How to use a data link file (UDL)

    Hi,
    Several times I developed applications using Crystal Reports 2008 which used a database on my development PC, and when I deployed them on a production server I had the problem of changing the Data Source Location to reflect the new server name and SQL Server instance. Moreover, this had to be done on the server, because CR 2008 checks the new data source location when you change it, and because I was not connected to the server I couldn't do it on my development PC.
    I thought that a data link file was the solution to this problem, meaning that CR 2008 could use a data link (UDL) file as a data source location, so when deploying the reports on the server I would only have to point the data link file to the new server instance.
    To my surprise, I have seen that when using a data link file to define the data source in CR 2008, it copies the definition found in it instead of using it at run time. So when I deploy the reports on the server, I still have the same problem of changing the data source location for all the reports.
    Perhaps I didn't understand how the data link file is used. So, how do I solve the problem of using a data source location that remains valid when deploying the reports on a server?
    Thanks

    Hi Antonio,
    Go to our download page above; there you will find samples to test with:
    http://www.sdn.sap.com/irj/boc/samples
    Or to .NET samples: http://www.sdn.sap.com/irj/boc/samples?rid=/webcontent/uuid/80774579-b086-2b10-db91-ed58c4dda375 [original link is broken]
    Find any of them that sets database location/log on.
    Thank you
    Don

  • Best way to record 50 kS/s data to file

    I am trying to read data from a DAQ at a sampling rate of 50 kS/s and record it to a file. My application needs to run for at least a few hours. Initially, I tried a producer-consumer loop pattern with a Write to Measurement File Express VI, but it wrote the data too slowly, and my queue filled up. Now I am trying to use the Write to Spreadsheet File VI with or without a producer-consumer loop pattern (see attached VIs). Both seem to write data to file, but neither writes the number of datapoints I would expect (both have far fewer datapoints).
    What is the best way to write this data to file? This seems like a basic question, so if it has already been discussed at length in another forum, or if there are examples of it which someone could direct me to, that would also be appreciated.
    Thank you!
    Attachments:
    Record Data 8-7-13 with Producer Consumer.vi ‏54 KB
    Record Data 8-7-13 without Producer Consumer.vi ‏48 KB

    I agree that TDMS is probably the best way to go for fast data streaming.
    I also want to point out that you can read TDMS files with some spreadsheet software. This makes it easier to manage and provides a method for human readability.
    Add in tool for Open Office Calc
    http://www.ni.com/white-paper/6849/en
    Add in tool for MS excel
    http://www.ni.com/white-paper/4906/en
    Jeremy P.
    Applications Engineer
    National Instruments
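The producer/consumer streaming pattern discussed above can be sketched in Python as a stand-in for the LabVIEW VIs (block and file sizes are illustrative): the producer enqueues raw sample blocks, and the consumer streams them to disk with cheap binary writes, which is why binary TDMS-style logging keeps up at 50 kS/s where text-formatted spreadsheet writes fell behind.

```python
import os
import queue
import struct
import tempfile
import threading

# Python stand-in for the LabVIEW producer/consumer pattern: the producer
# enqueues raw sample blocks; the consumer streams them to disk as binary.
# One cheap binary write per block (no text formatting) is the same reason
# TDMS outperforms the spreadsheet/Express file VIs for fast streaming.
q = queue.Queue(maxsize=100)
DONE = None

def producer(n_blocks, block):
    for _ in range(n_blocks):
        q.put(block)          # blocks here if the consumer falls behind
    q.put(DONE)

def consumer(path):
    with open(path, "wb") as f:
        while (item := q.get()) is not DONE:
            f.write(item)     # one binary write per block, no formatting

block = struct.pack("<1000d", *([0.0] * 1000))   # 1000 float64 samples
fd, path = tempfile.mkstemp()
os.close(fd)
writer = threading.Thread(target=consumer, args=(path,))
writer.start()
producer(50, block)           # 50 blocks = 50,000 samples total
writer.join()
size = os.path.getsize(path)
os.remove(path)
print(size)                   # -> 400000 bytes (50,000 samples x 8)
```

The queue decouples acquisition timing from disk latency, and a bounded queue gives back-pressure instead of unbounded memory growth when the disk briefly stalls.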

  • Patch 6372396 - FAILED: file FNDLIBR on worker + 6372396

    Hi,
    I cloned the apps and db. While applying patch 6372396 in a Windows env, I am getting this error: "Patch 6372396 - FAILED: file FNDLIBR on worker 1".
    thanks in advance

    -ms128m -mx256m -Xrs oracle.aurora.server.tools.loadjava.LoadJavaMain -f -thin -user "APPS/*****@adam.muscat.org:1521:MMAPPS" D:\MMAPPS\mmappsappl\admin\MMAPPS\out\p001ldjva.jar
    Calling D:\MMAPPS\mmappscomn\util\java\1.4\j2sdk1.4.2_04\bin\java.exe ...
    Error:
    Program exited with status 1
    Cause: The program terminated, returning status code 1.
    Action: Check your installation manual for the meaning of this code on this operating system.
    java.lang.NoClassDefFoundError: oracle/aurora/util/tools/ToolException
         at oracle.aurora.server.tools.loadjava.LoadJavaMain.<init>(LoadJavaMain.java:51)
         at oracle.aurora.server.tools.loadjava.LoadJavaMain.main(LoadJavaMain.java:56)
    Exception in thread "main"
    AD Worker error:
    The above program failed. See the error messages listed
    above, if any, or see the log and output files for the program.
    Time when worker failed: Wed Jan 06 2010 13:06:08
    Manager says to quit.
    Time when worker quit: Wed Jan 06 2010 13:08:26
    AD Worker is complete.
    Errors and warnings are listed in the log file
    D:\MMAPPS\mmappsappl\admin\MMAPPS\log\adwork001.log
    and in other log files in the same directory.
