Log.nmbd is a huge, unremovable file

Hi All,
I've noticed that people have been reporting problems with Windows sharing and the creation of HUGE log files. I think I'm reporting the largest file to date: a whopping 85 GB file.
I've stopped Windows file sharing, checked/repaired permissions, fixed the disk, etc., but the problem is that I cannot delete the file. I always get a kernel panic if I try to rm it, srm it, cp /dev/null log.nmbd, etc.
How can I recover what amounts to about HALF of my hard disk?
Eric

Hi BDAqua, I tried ALL of the tricks short of buying DiskWarrior or something like that.
I tried single user mode, and then tried to delete the massive file. It locked up the computer again.
Eventually, I just had to reinstall my hard drive. Hah, this time I do not have Windows file sharing enabled -- I just bought my girlfriend a MacBook so that we don't have any more Windows machines.
Notwithstanding, there is something very, very wrong with the deletion of these massive files. What makes it worse is that I can imagine a number of situations where I would want a file this big: digital video, HDF5 files, etc.
In short, if you can create it, you should be able to delete it!
Eric

Similar Messages

  • Log.nmbd file hogging HD space

    I have a log.nmbd file on my Macbook Pro that is taking up 18 gigs of HD space.
    private/var/log/samba/log.nmbd
    Why is this file so big and how do I remove it?
    MacBook Pro, 2 GB RAM / Dual 2 GHz G5, 2 GB RAM, HD monitor, Mac OS X (10.3.8)

    You can view the log file with Console. It's in the /var/log section.
    My guess would be that smb/cifs is selected in the Directory Access app (Utilities folder). If you don't do file sharing with Windows machines, I don't think it is necessary. I don't know if that will stop it from writing to the log, as it may not be what is doing it. Do you have Windows file sharing running in the Sharing preference pane? That could be it, also. Otherwise, I too am at a loss.
    Regardless, it is a log file and you should be able to delete it. I don't think anything goes back and reads the logs, except you, if you want.

  • Efficient way to get FCE4 Log and Transfer to read .mts files stored on a drive?

    Hi All
    I've searched the FCE discussion forum and not found an answer verified by more than one user to this question: What is an efficient way to get FCE4 (via the Log and Transfer window) to see .mts files from an AVCHD camera stored on a drive (NOT via the camera -- directly from the drive)?
    I am trying to plan the most space-efficient system possible for storing un-transcoded .mts files from a Panasonic AG-HMC151 on a hard drive so that I can easily ingest them into FCE4. I am shooting a long project and I want to be able to look at the .mts files so that I can decide which ones to transcode to AIC for the edit.
    Since FCE4 cannot see .mts files unless they have their metadata wrapper, the question is really 'how do I most efficiently transfer .mts files from the camera to a storage hard drive with their metadata wrappers so that FCE4 can see them via the Log and Transfer window?'
    Nick Holmes, in a reply in this thread
    http://discussions.apple.com/thread.jspa?messageID=10423384&#10423384
    gives 2 options: Use the Disk Utility to make a disk image of the whole SD card, or copy the whole contents of the card to a folder. He says he prefers the first option because it makes sure everything on the card is copied.
    a) Have other FCE users done this successfully and been able to read the .mts files via Log and Transfer?
    In a response to this thread:
    http://discussions.apple.com/thread.jspa?messageID=10257620&#10257620
    wallybarthman gives a method for getting Log and Transfer to see .mts files that have been stored on a hard drive without their metadata wrappers by using Toast 9 or 10.
    b) Have any other FCE4 users used this method? Does it work well?
    c) Why is FCE4 unable to see .mts files without their metadata wrappers in the Log and Transfer window? Is it just a matter of writing a few lines of code?
    d) Is there an archiving / library app. on the market that would allow one to file / name / tag many .mts clips and view them prior to transcoding into space-hungry AIC files in FCE?
    Any/all help would be most gratefully received!

    I have saved the complete file structure on DVD as a backup, but have not needed to open them yet. But I will add this: as I understand the options with Toast, you are in fact converting the video to AIC or something like it. I haven't looked into it myself, and I can't imagine the extra files are that large, but maybe they are significant; I don't know. The transcoded files are huge in comparison to the AVCHD files.
    A new player on the scene for AVCHD is Clipwrap 2.0. As I understand this product, it rewraps the AVCHD into a wrapper that QuickTime can open and play. This works with the MTS files only; the rest of the file structure is not needed. The rewrap is much faster than the transcode to AIC, so you have the added benefit of being able to play the files as well as not storing the extra files. The 2.0 version (which is for AVCHD) was just recently released. I haven't tried it and don't personally know of anyone who has, but you might want to give it a look; there is a trial version, as I recall.

  • 11g SOA with AIA suddenly creates huge temp files (sar files)!!

    Hi All,
    One of our clients is on 11g SOA with AIA. While deploying applications, the team observed that deployment suddenly creates huge temp files (sar files); the server slows down and then shuts down. Has anyone seen such behavior, or possible reasons for it?
    If anyone could share such prior experience, it would be appreciated!
    Thanks for your time!
    Regards,

    Hi Ajay,
    Could you check the managed server logs on the server you are deploying to? I prefer the soa_server1.out file if it's available. Hopefully there is something more telling on that side.
    My gut feeling is that a schema required by the ProcessFulfillmentOrderBillingBRMCommsAddSubProcess process has not been deployed (which sometimes happens with this PIP in particular).

  • So I have a file on my desktop and I cannot open it, move it to a folder, or delete it from my Mac. I don't know what it has in it, and that makes me suspicious. What can I do to remove this unremovable file from my computer? Is it a virus?


    First, I would recommend repairing the hard drive with Disk Utility.
    If that doesn't fix the problem, there's a very dangerous command you can execute in the Terminal.  This is very hazardous, because a simple typo can result in very drastic consequences.  I have seen people erase their entire hard drive by putting a space in the wrong place!  So, please follow the directions I'm going to give you very carefully!
    Open the Terminal, which is found in the Utilities folder in the Applications folder.  In the Terminal, enter the following:
    sudo rm -f
    Make sure to put a space after the "-f"!  Then, drag the troublesome file from the Finder and drop it on the Terminal window.  That should insert the path to the file in the command.  Then go back to the Terminal and press return.  You will be asked for your password, and when you type it, nothing will be shown as a security measure.  Press return again after entering your password.  The file should be deleted.

  • Java Programming: Any ideas for breaking a huge class file into smaller ones?

    Hello Java pros,
    I have some very huge class files, some with dozens of methods, each containing an average of a screen-page full of code.
    Obviously, such huge class files are difficult to maintain in spite of using an IDE, especially when changes have to be made to a whole category of methods scattered all over the class.
    I am wondering if there are ways/best practices out there to make the core class file smaller/smarter, for example:
    <a> by retaining the real core definitions within the core class and moving the detailed implementation outside the core class
    <b> by breaking down the file into more manageable pieces - something to the effect of using 'include' files that some languages support
    etc.
    Thanks for your help in advance.
    Sree Nidhi

    If you have huge class files with dozens of methods, maybe the design of your application is not so sound. You could use all kinds of OO design techniques to design your application so that it is easier to maintain.
    Start by learning about design patterns. The most famous book about design patterns is this one: http://www.amazon.com/exec/obidos/ASIN/0201633612/qid=1029971487/sr=2-1/ref=sr_2_1/102-4299125-5141710
    Here is also a nice book about anti-patterns: http://www.antipatterns.com/
    Jesper
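
    A minimal sketch of the kind of refactoring Jesper is pointing at (all class and method names below are hypothetical, not from the original post): pull each cohesive group of methods out into a small helper class and have the core class delegate to it. Callers keep using the same public API, but each source file shrinks to a single responsibility.

    // Report.java -- hypothetical example of splitting a god class by delegation.
    // Helpers own the details; the core class only forwards to them.
    class ReportFormatter {
        String header(String title) {
            return "== " + title + " ==";
        }
    }

    class ReportCalculator {
        double average(double[] values) {
            if (values.length == 0) return 0.0;
            double sum = 0.0;
            for (double v : values) sum += v;
            return sum / values.length;
        }
    }

    public class Report {
        private final ReportFormatter formatter = new ReportFormatter();
        private final ReportCalculator calculator = new ReportCalculator();

        // The public API stays on Report; the implementation now lives elsewhere.
        public String render(String title, double[] values) {
            return formatter.header(title) + "\naverage: " + calculator.average(values);
        }

        public static void main(String[] args) {
            System.out.println(new Report().render("Sales", new double[] {1.0, 2.0, 3.0}));
        }
    }

    The payoff is exactly the maintenance case described above: when a change hits one category of methods, it touches one small class instead of the huge file.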

  • Need help - to restrict a huge temp file, which grows to around 3 GB, in OBIEE 11g

    Hi Team,
    I am working on OBIEE version 11.1.1.5 for a client-specific BI application. We have an issue concerning massive space consumption in our OBIEE 11g Linux environment whenever we try to run certain detail-level drill-down reports. While investigating, we found that whenever a user runs a drill-down report, a temp file named nQS_xxxx_x_xxxxxx.TMP is created and keeps growing in size under the folder structure given below,
    <OBIEE_HOME>/instances/instance1/tmp/OracleBIPresentationServicesComponent/coreapplication_obips1/obis_temp/
    This temp file grows as large as around 3 GB and is erased automatically when the drill-down report output is displayed in the UI. Hence, when multiple users simultaneously try to access these sorts of drill-down reports, the environment runs out of space.
    Regarding the drill down reports:
    * The drill-down report has around 55 columns, is configured to display only 25 rows on screen, and allows the user to download the whole data set as Excel output.
    * The complete set of rows fetched by the query ranges from 1,000 to above 100k. The temp file size grows with the rows fetched; i.e., if the query fetches around 4,000 rows, a temp file of around 60 MB is created and erased when the report output is generated on screen (similarly, for around 100k rows, the temp file grows up to 3 GB before it gets deleted automatically).
    * The report output has only one table view, alongside the Title & Filters views. (No pivot table view is used to generate this report.)
    * The cache settings for BI Server & BI Presentation services cache are not configured or not enabled.
    My doubts or questions:
    * Is there any way to control or configure this temp file generation in OBIEE 11g?
    * Why does the growing temp file get deleted automatically right after the report output is generated on screen? Is there any default server-specific setting governing this behaviour?
    * As per a certain OBIEE article for OBIEE 10g, I learnt that for large pivot-table-based reports the temp file generation is quite normal because of the huge in-memory calculations involved. However, we have used only a table view in the output, and it still creates huge temp files. Is this behaviour normal in OBIEE 11g? If not, can anyone please suggest specific settings to avoid generating these huge files, or at least to generate a compressed temp file?
    * Is there any other workaround for generating a report of this type without the generation of temp files in the environment?
    Any help/suggestions/pointers or document references in this regard will be much appreciated. Please advise.
    Thanks & Regards,
    Guhan

    Hello Guhan,
    The temp files are used to prepare the final result set for OBI Presentation Server processing, so as long as your dataset is big, the tmp files will also be big. You can only avoid this by reducing your dataset, for example by filtering your report.
    You can also control the size of your temp files by reducing the usage of the BI Server. By this I mean: if you are using any functions, such as sorting, that can be handled by your database, just push them down to the DB.
    Once the report is finished, the BI Server automatically removes the tmp files because they are not needed anymore. You can see them as files used for internal calculations; once those are done, the server gets rid of them.
    Hope this helps
    Adil

  • Huge size file processing in PI

    Hi Experts,
    1. I have seen blogs which explain processing huge files for the File and SFTP adapters:
    SFTP Adapter - Handling Large File
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    Here also we have the constraint that we cannot do any mapping, and it has to be EOIO QoS.
    Would it be possible to process a 1 GB file and do mapping? Which hardware factor decides whether a system is capable of processing a large size with mapping?
    Is it the number of CPUs, the application servers (Java and ABAP), the number of server nodes, or the Java heap size?
    If my system is able to process a 10 MB file with mapping, there must be something determining that capability.
    This kind of huge file processing will only fit some scenarios. For example, a Proxy-to-SOAP scenario with a 1 GB message exchange does not make sense; I have no idea whether any web service would handle such a huge file.
    2. Suppose PI is able to process a 50 MB message with mapping. What options do we have in PI to increase the performance?
    I have come across these two points many times during the design phase of my project, and am looking for your suggestions.
    Thanks.

    Hi Ram,
    You have not mentioned what sort of integration it is; you just mentioned FILE, so I presume it is a File-to-File scenario. In that case, on PI 7.11 I am able to process a 100 MB file (more than 1 million records; the file is the delta extract in SAP ECC AL11) with mapping. In the sender file adapter I chose recordsets per message and processed the messages in bits and pieces. Please note this is not the actual standard chunk mode: the initial run of the sender adapter loads the 100 MB file into memory, and after that messages are sent to the Integration Engine based on the recordsets per message. If it is more than 100 MB, the PI Java stack starts bouncing because of memory issues. Later we redesigned the interface as proxy-to-file async, with the proxy sending the messages to PI in chunks; in a single run it sends 5,000 messages.
    For PI 7.11 I believe we have a memory limitation per cluster node: each cluster node can't have more than 5 GB, and processing again depends on the number of Java app servers. I think this is no longer the limitation from PI 7.30 onward, where we can use 16 GB of memory per cluster node.
    "This kind of huge file processing will only fit some scenarios. For example, a Proxy-to-SOAP scenario with a 1 GB message exchange does not make sense; I have no idea whether any web service would handle such a huge file."
    If I understand this correctly: if it is async communication, then definitely 1 GB of data can be sent to a web service; however, the messages from the proxy should be sent to PI in batches. The same idea may work for sync communication as well, but then timeouts in the receiver channel will be the next issue. Increasing timeouts globally is not best practice; however, if you are on version 7.30 or later, you can increase the timeouts specific to your scenario.
    To handle a 50 MB file, make sure you have additional Java app servers. I don't remember exactly how many app servers we had in my case to handle the 100 MB file.
    Thanks
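
    The chunking idea described above, sketched in plain Java outside of any PI or adapter API (BatchedSender and its send method are hypothetical stand-ins, and 5,000 records per batch mirrors the figure in the answer): read the source in fixed-size batches so memory stays bounded no matter how large the file grows.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.ArrayList;
    import java.util.List;

    public class BatchedSender {
        static final int BATCH_SIZE = 5000; // records per message, as described above

        public static void main(String[] args) throws Exception {
            try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
                List<String> batch = new ArrayList<>(BATCH_SIZE);
                String line;
                while ((line = in.readLine()) != null) {
                    batch.add(line);
                    if (batch.size() == BATCH_SIZE) {
                        send(batch);
                        batch.clear(); // memory stays bounded regardless of file size
                    }
                }
                if (!batch.isEmpty()) send(batch); // trailing partial batch
            }
        }

        // Stand-in for handing one chunk to the integration layer.
        static void send(List<String> records) {
            System.out.println("sending batch of " + records.size() + " records");
        }
    }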

  • Word docs become huge RTF files in RoboHelp 8.02

    Every Word document that I add to a RoboHelp 8 project becomes huge in file size. For example, a 100 KB Word 2007 document consisting of a couple of pages of text with a couple of small JPEG images (around 50 KB each) becomes a 15-30 MB RTF file in RoboHelp after importing. The same happens if I create the page/document from within RoboHelp. Is there a setting to keep the RTF files from swelling to such gigantic sizes?

    I found out what the problem was. For some reason, the RTF files swell up when a JPEG image is added and embedded into the Word document. I created a test document, converted some of the JPEG images to bitmap (BMP) files, and used the image import tool that comes with RoboHelp. The RTF documents stayed tiny even when the same images were added, but now as BMPs. The difference is that the RoboHelp image tool creates a link to the bitmap image, as opposed to embedding it in the document, which is what happened with the JPEGs. I read about this in another post in one of the RoboHelp forums.
    Thanks!

  • Reading huge xml files in OSB 11gR1 (11.1.1.6.0)

    Hi,
    I want to read a huge XML file, about 1 GB in size, in OSB (11.1.1.6.0).
    I will be creating a (JCA) file adapter in JDeveloper and importing the artifacts to OSB.
    Please let me know the maximum file size that can be handled in OSB.
    Thanks in advance.
    Regards,
    Suresh

    Depends on what you intend to do after reading the file.
    Do you want to parse the file contents and maybe do some transformation? Or do you just have to move the file from one place to another, for example reading from the local system and moving to a remote system using FTP?
    If you just have to move the file, I would suggest using JCA File/FTP adapter's Move operation.
    If you have to parse and process the file contents within OSB, then it may be possible depending on the file type and what logic you need to implement. For example, for very large CSV files you can use JCA File Adapter batching to read a few records at a time.

  • Error in Java compilation: \s7u.log (The system cannot find the file specified)

    Hi there,
    I got an error when I tried to compile a group of SQLJ and Java files. The error is below:
    sqlj -status -passes -compile=true -ser2class -d=. -classpath=.;%CLASSPATH% E:\work\src\java\com\interadnet\persistence\*.sqlj E:\work\src\java\com\interadnet\persistence\*.java
    out [Translating 53 files]
    out [Reading file AdServerInformationType]
    out [... 52 more "Reading file" lines, one per translated type, through VendorView ...]
    out [Translating file AdServerInformationType]
    out [... 52 more "Translating file" lines for the same 53 types ...]
    out [Compiling 53 Java files]
    err*** The following character string is too long:
    err***
    out Error in Java compilation: \s7u.log (The system cannot find the file specified)
    Any suggestion appreciated.
    Daniel Huang

    Having not seen this before, I assume that the issue is that the command line string is too long when Javac is invoked.
    Here are a few things to reduce the size of the command line:
    - do not use the -classpath option, but set the CLASSPATH environment variable (you can do that in a .bat file)
    - do not provide the .java files, but rely on the implicit make capability in both the SQLJ translator and the Java compiler. (Note: you must mention all of the .sqlj files that your program requires; do not rely on implicit make to find all of those.)
    - you can perform the -ser2class conversion as a separate step (see the SQLJ FAQ), as well as the Java compilation
    - there may or may not be a facility in your operating system to increase the size of the environment (and this may or may not have an effect on the maximum command line length)
    - you could try to omit the -passes option and see if that helps (that may actually be where your invocation is failing in the first place, in that it cannot write to some temporary file).
    Let us know how it goes. Thanks!

  • Write the results script of results log pane to XLS or CSV file with VBA.

    Hi,
    How can I write the results script of the results log pane to an XLS or CSV file with VBA code or something? I have tried hard, but I can't.
    Thanks

    MoGas,
    This is actually not a trivial process. You need to use the results object and code it to write to your file (it is described in the help files).
    e-Tester automatically saves the results log as a text file so you may just want to stick with that for simplicity.

  • How do I compress a huge flash file for a web banner?

    Hi there,
    I am having trouble figuring out how to compress a huge Flash file for a small web banner. I have 3 JPEG sequences (low quality) that are used as background animations, and the SWF file is now over 900 KB. I need a file under 200 KB, or even 100 KB.
    thank you for your attention,
    Ayka

    The only way to reduce the size of the file is to reduce the sizes of the content it has. If you go into your Flash Publish Settings and choose the option to generate a size report, it will publish a report that lets you know how the weight of the file is distributed, and you can focus on eliminating that weight from there.

  • How to break up a huge XML file and generate serialized JSP pages

    I have a huge xml file, with about 100 <item> nodes.
    I want to process this xml file with pagination and generate jsp pages.
    For example:
    Display items 0 to 9 on page1.jsp, 10 to 19 on page2.jsp, and so on...
    Is it possible to generate JSP pages like this?
    I have heard of Velocity but don't know if it would be the right technology for this kind of job.

    Thank you for your reply. I looked at the display tag library, and it looks pretty neat with a lot of features; I could definitely use it in a different situation.
    The xml file size is about 1.35 MB, and the size is unpredictable; it could shrink or grow periodically depending on the number of items available.
    I was hoping to create a documentation-style set of static pages from the xml feed instead of having one JSP with dynamic pages.
    I was looking at Anakia: http://jakarta.apache.org/velocity/docs/anakia.html , maybe it has features that enable me to create static pages, but I am not very sure.
    I think for now I will transform the xml with an xsl file and pass the page numbers as input parameters to the xsl file.
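
    A minimal sketch of that approach in plain Java using the standard JAXP transformation API (the file names paginate.xsl and feed.xml and the 'page' parameter are hypothetical; the stylesheet would declare <xsl:param name="page"/> and select items (page - 1) * 10 + 1 through page * 10):

    import java.io.File;
    import javax.xml.transform.Templates;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class PaginateFeed {
        public static void main(String[] args) throws Exception {
            // Compile the stylesheet once and reuse it for every page.
            Templates stylesheet = TransformerFactory.newInstance()
                    .newTemplates(new StreamSource(new File("paginate.xsl")));

            int itemsPerPage = 10;
            int totalItems = 100; // about 100 <item> nodes in the feed
            int pages = (totalItems + itemsPerPage - 1) / itemsPerPage;

            for (int page = 1; page <= pages; page++) {
                Transformer t = stylesheet.newTransformer();
                t.setParameter("page", Integer.toString(page)); // read by <xsl:param name="page"/>
                t.transform(new StreamSource(new File("feed.xml")),
                            new StreamResult(new File("page" + page + ".jsp")));
            }
        }
    }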

  • How to compare two huge xml files (50MB+) using Java code

    I want to compare two huge XML files using Java code and need to find the differences between those XML files.
    Is there any API for that?

    You should look for a third-party API.
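
    XMLUnit is one well-known third-party library for this (its DiffBuilder API reports differences node by node), but DOM-based comparison can be heavy for 50 MB+ documents. A streaming comparison with the JDK's own StAX API keeps memory flat; below is a minimal sketch that reports the first difference in element names or text, ignoring attributes, comments, and whitespace (an assumption made to keep the example short):

    import java.io.FileInputStream;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class XmlCompare {
        public static void main(String[] args) throws Exception {
            XMLInputFactory factory = XMLInputFactory.newInstance();
            XMLStreamReader a = factory.createXMLStreamReader(new FileInputStream(args[0]));
            XMLStreamReader b = factory.createXMLStreamReader(new FileInputStream(args[1]));

            while (a.hasNext() && b.hasNext()) {
                int ea = nextSignificant(a);
                int eb = nextSignificant(b);
                if (ea != eb) { // one document has extra or missing content here
                    System.out.println("Structural difference near line " + a.getLocation().getLineNumber());
                    return;
                }
                if (ea == XMLStreamConstants.START_ELEMENT
                        && !a.getLocalName().equals(b.getLocalName())) {
                    System.out.println("Different elements: " + a.getLocalName() + " vs " + b.getLocalName());
                    return;
                }
                if (ea == XMLStreamConstants.CHARACTERS
                        && !a.getText().trim().equals(b.getText().trim())) {
                    System.out.println("Different text: " + a.getText().trim() + " vs " + b.getText().trim());
                    return;
                }
                if (ea == XMLStreamConstants.END_DOCUMENT) break;
            }
            System.out.println("No differences found by this comparison.");
        }

        // Skip comments, processing instructions, and whitespace-only text.
        private static int nextSignificant(XMLStreamReader r) throws Exception {
            while (r.hasNext()) {
                int e = r.next();
                if (e == XMLStreamConstants.COMMENT
                        || e == XMLStreamConstants.PROCESSING_INSTRUCTION
                        || e == XMLStreamConstants.SPACE) continue;
                if (e == XMLStreamConstants.CHARACTERS && r.getText().trim().isEmpty()) continue;
                return e;
            }
            return XMLStreamConstants.END_DOCUMENT;
        }
    }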
