Deleting Universal Binary Files

I have just downloaded OS X Lion and I am trying to delete some files on my MacBook to speed things up.  I have downloaded CleanMyMac and it has found universal binary files. Can I delete these?  I am using a Mac with a 2.4 GHz Intel Core 2 Duo.

I have some spare parts from my first car, my 2nd and 3rd car, my 5th and 6th car... from the 70s and 80s and 90s, and you never know - perhaps one day we will need them....  I have never used a single part yet, because the new cars use smarter devices and parts - the old ones will never fit... but still I keep all this stuff. My new car has to stay outside; there is no space left in my garage or the carport because of all the spare parts.
I also have spare parts from my first 8086 PC, the Apple II, Apple III, Lisa 1 + 2, the Commodore 64, the Amiga 2000 and every PC and PPC and Mac since. The parts will not fit in any of the new iPad post-PC devices anyway. But you never know, perhaps one day.... I will soon have to find another house - there is no space for me in my bedroom anymore - more spare parts from the first iMac and the MacBooks I no longer use because of the iPad...
So better keep all the Universal Binaries - you never know - perhaps one day we will need them... ;-)

Similar Messages

  • Mac Universal Binary

    Hello,
    I have an application that runs fine on my Mac where I developed it; it is an application package (.app) with Universal Binary files. I also ran it on a friend’s Mac and it ran just fine. This second Mac has never had Valentina or Valentina Studio or anything used to develop a Valentina application on it. Both these Macs run Mac OS X 10.5.6.
    Now, one of my clients cannot even launch the application, let alone run it. It crashes right before the screen even shows up. All you get is a message about "application quit unexpectedly", and when I look at the crash log, I see that there are errors related to the dylib files etc. He tried it on another Mac of his and had the same problem. I am puzzled - what could be going on? Has anyone come across this situation?
    Vansh

    I think your question would best be asked in the software developer section of the forum.
    http://discussions.apple.com/forum.jspa?forumID=728

  • Deleting a single element from a binary file

    I am working on a server application that must keep track of the messages that have been sent but not responded to.  After I send a message, I append it to an array cycling through an uninitialized shift register and I write it to the end of a binary file.  When I receive a response to a message, which was probably but not necessarily the first message sent, I delete that message from the array and the file.  This allows me to look up entries quickly but also to maintain a permanent record of what messages have been sent and not responded to.
    Basically, I need an intelligent way to make a file, or any other permanent storage medium, act effectively like a queue.  The problem with the current implementation is that when I delete one variable-sized entry from the file, I need to move all subsequent entries, which are usually all of the other entries in the file, forward in memory to take its place.  Is there a way to get around recopying the majority of the file or to implement this entire thing more intelligently?

    You can organize your data file as a list. In this list, each record consists of a fixed-size data field and references (file positions, for example) to the next record and to the previous record. You can then overwrite the record-to-be-deleted with a new one, or you can "delete" the record by modifying the references of the "previous" and "next" records. When you add a new record, you can write it at this "free" place.
    The other way is to manage two files. The first one contains your data, record by record. The second one contains the record numbers and the corresponding data file positions, or a record counter. This second file can be very small and easy to manage. It can hold the records' queue positions in the data file. You can overwrite your records or mark them as deleted without moving large portions of data. (A sketch of this idea follows below.)
    You can set the data file capacity in terms of a number of records.
    Additionally, it is possible to use records with variable length, but that will be much more difficult.
    Best regards, Evgeny. 
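
    The question is about LabVIEW, but the idea is language-neutral. Here is a rough sketch of the "fixed-size records, mark deleted, reuse the hole" variant; the record layout, sizes and names are made up for illustration, and the next/previous links and separate index file from Evgeny's description are left out.

    import java.io.*;

    // Sketch of a file-backed queue with fixed-size records: deleting a record
    // only flips a status byte, and new records reuse the first free slot
    // instead of shifting the rest of the file forward.
    public class RecordFile implements Closeable {
        static final int RECORD_SIZE = 1 + 255;          // 1 status byte + fixed payload
        static final byte FREE = 0, IN_USE = 1;

        private final RandomAccessFile file;

        public RecordFile(String path) throws IOException {
            file = new RandomAccessFile(path, "rw");
        }

        /** Writes a message into the first free slot (or appends) and returns its slot index. */
        public long add(byte[] payload) throws IOException {
            long slots = file.length() / RECORD_SIZE;
            long slot = slots;                            // default: append at the end
            for (long i = 0; i < slots; i++) {            // look for a reusable hole
                file.seek(i * RECORD_SIZE);
                if (file.readByte() == FREE) { slot = i; break; }
            }
            byte[] record = new byte[RECORD_SIZE];
            record[0] = IN_USE;
            System.arraycopy(payload, 0, record, 1, Math.min(payload.length, RECORD_SIZE - 1));
            file.seek(slot * RECORD_SIZE);
            file.write(record);
            return slot;
        }

        /** "Deletes" a record by marking its slot free; nothing is moved. */
        public void delete(long slot) throws IOException {
            file.seek(slot * RECORD_SIZE);
            file.writeByte(FREE);
        }

        @Override public void close() throws IOException { file.close(); }
    }

    Since the original application already keeps the outstanding messages in an in-memory array, the linear scan for a free slot could be replaced by a small in-memory free list, so only the slot being written or cleared is ever touched on disk.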

  • Deleted mysql binary log files - can't start mysql

    Hi all,
    I had mysql running and used it for quite a while.
    Since my disk ran out of space, I deleted all the mysql-bin.0000* files in the MySQL data folder and configured MySQL not to use binary logs anymore.
    Quite a mistake, I assume, since now I can't start mysql(d) anymore.
    I've tried a lot by now:
    - re-installed mysql
    - deleted everything in the mysql data folder
    But I'm a bit stuck and can't repair it anymore... (there is no data loss, everything is backed up - that's not the problem).
    Can someone please guide me in a good direction?
    Running Arch Linux (systemd).
    systemctl status mysqld:
    Mar 08 13:12:34 arch systemd[1]: mysqld.service: control process exited, code=exited status=203
    Mar 08 13:12:34 arch systemd[1]: Failed to start MySQL database server.
    Mar 08 13:12:34 arch systemd[1]: Unit mysqld.service entered failed state
    Mar 08 13:12:34 arch systemd[1]: mysqld.service holdoff time over, scheduling restart.
    Mar 08 13:12:34 arch systemd[1]: Stopping MySQL database server...
    Mar 08 13:12:34 arch systemd[1]: Starting MySQL database server...
    Mar 08 13:12:34 arch systemd[1]: mysqld.service start request repeated too quickly, refusing to start.
    Mar 08 13:12:34 arch systemd[1]: Failed to start MySQL database server.
    Mar 08 13:12:34 arch systemd[1]: Unit mysqld.service entered failed state
    Mar 08 14:32:24 arch systemd[1]: Stopped mysqld.service.
    The data folder /var/lib/mysql is totally empty (which is, I assume, not good...).
    Thanks in advance
    Disclaimer: I tried for many hours now, but since I'm quite new, I don't have a clue anymore what to do.

    These are systemd errors, not MySQL errors, and as such they are quite useless -- you already know that MySQL does not work. Look for /var/log/mysqld.log or $datadir/<hostname>.err -- next time, of course, because there's just one thing to do now: restore your MySQL data folder, as there should be much more in it than binary logs. Or, should you prefer to start from scratch, try to use mysql_install_db and then start MySQL again.
    For future reference, you can safely delete old binary logs from inside MySQL using the PURGE MASTER LOGS command, and/or consider using the expire_logs_days config option.

  • Aperture DELETES System ID file

    I recently switched from one MacBook at work to another. The first was giving me trouble and we want to have it looked at. I moved the hard drive from the original to the new machine so that I could keep working. All seemed fine until I ran Aperture for the first time. Aperture asked for my serial number. Since I didn't have it handy, I quit. Aperture deleted the "Aperture System ID" file from the support directory. Now I can't find my printed serial number and Apple's software has removed it from my system. I contacted Apple support and they were NO HELP.
    Clearly there could be a better way for Aperture to handle this. How about just refusing to run and NOT deleting the file? I could have eventually moved back to the other MacBook, but not anymore.... In an effort to prevent copying of their software, Apple has ****** off a paying customer.
    If I need to buy something again, it will be Lightroom not Aperture.

    If you're trying to launch on a machine with Intel processor, you'll need the universal binary version of FCP. FCP 3 is not Universal Binary and will not run on your MacBook Pro.
    -Takayasu

  • Flash Player Universal Binary

    Hello,
    I recently bought a new iMac with Intel. When I try to view sites containing Flash content I am directed to a Macromedia site which says I need to:
    1. download the Flash Player universal binary.
    2. replace the old plug-in by copying the new one into the same folder as the old one.
    This has not worked; both plug-ins remain in the folder (and I still cannot access Flash). Is it OK to just delete the original plug-in? Or is there an uninstall process? Do you think it will work?
    Thanks, Shontelle

    Antonio, I think we may have a communication breakdown here. lol.
    An Intel Binary is for Intel processors only.
    A PowerPC Binary is for PowerPC processors only.
    A Universal Binary is for both.
    Adobe offer an Intel download and a PowerPC download, which would lead one to believe that one is for Intel processors only and one is for PowerPC processors.
    However, the Intel download has "UB" in its filename, which would suggest it is a Universal Binary after all, not an Intel binary as the website states.
    A Universal Binary and an Intel Binary will run the same on an Intel machine, I was just wondering which one it actually was.

  • Finding and deleting powerbook g4 files from macbook pro

    I am finding old powerbook G4 files on my macbook pro and would like to find and delete them.  Any suggestions/recommendations on how I can do this?

    Go MBP (the best thing it has to offer is a PCI slot that the MacBook doesn't have). If you really need to use FM7, then cut the track on the iMac and import it into the MBP.
    As far as Rosetta, it won't open a plug in that isn't the same version (i.e., PPC or Universal Binary) as the host. So if your Logic version is UB, then the plug ins have to be too.
    If you can use FM7 in a stand alone mode, then you should be able to use it. I wouldn't recommend it because anything processor intensive tends to run crappy under Rosetta.
    Hope that helps.
    X

  • How to tell if an app is "Universal Binary" i.e. Intel-compatible?

    I have various freeware and shareware apps on my new MacBook Pro which I "migrated" over from an old PowerBook G4 which I had been using until getting my new MBP last week. To whatever extent possible, I want to avoid using old PowerPC-chip designed apps, since they'd have to go through Rosetta translation in order to work, but using Rosetta extensively slows down Snow Leopard's snappy response time.
    So, I want to make sure that each app is designed to be Intel-compatible. And if it isn't, I will try to track down a newer version, or if there aren't any, then find a replacement app that will do the same function.
    However, I don't have documentation or "read me" files for most of these older share/freeware applications, so it's not easy to tell just from "looking at them" whether or not they were written in Universal Binary (or in some other Intel-compatible way). One could always look at the date the application was made, and if it's really really old, then one could assume that it's not Intel compatible; and if it's really really new, then one could assume that it is compatible. Unfortunately, most of the apps are from that exact transitional period when some apps were made for PowerPC, some for Intel, and some Universal Binary, etc. -- mostly they're from the 2006-2009 era.
    So: Is there some method for "peering inside" applications to see if they're written in Universal Binary?
    As an example:
    I have a program called flvThing 1.0.1. If I control-click its icon, choose "Show Package Contents," then go flvThing > Contents > MacOS > flvThing, and do "Get Info" on the executable file therein, it says "Unix Executable File (Universal)." Does that mean that it was written as a Universal Binary?
    If that's true, can I use this same method to peer inside most/any apps and see if they are a "Unix Executable File (Universal)"? Or is there some other method for finding this out?
    Any suggestions would be most appreciated. Thanks.

    Yow! That was easy! Thanks. Worked like a charm. "Solved."
    Two follow-up questions:
    1. I see five ancient apps listed as being "Classic." I presume that I might as well Trash those right now, since no Classic apps will ever run in the Post-Snow-Leopard world? None of these five Classic apps are important or necessary, so trashing them is not a problem.
    2. Some of my Intel apps are listed as being 64-bit, and some as not 64-bit. Should this concern me? Does it matter? I presume that both 64-bit Intel apps and non-64-bit Intel apps will both work perfectly fine on my i5 MacBook Pro 15" -- right?
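
    For anyone else peering inside apps: a universal application's executable is a Mach-O "fat" file whose header simply lists the architectures it contains, and that is what tools such as `lipo -info` or `file` report in Terminal. Purely as an illustration of what those tools look at, here is a rough sketch (the flvThing path is just the example from the post) that reads the fat header directly:

    import java.io.*;

    public class FatCheck {
        public static void main(String[] args) throws IOException {
            // Path to the executable inside the bundle, e.g. the example from the post.
            String path = args.length > 0 ? args[0] : "flvThing.app/Contents/MacOS/flvThing";
            try (DataInputStream in = new DataInputStream(
                    new BufferedInputStream(new FileInputStream(path)))) {
                int magic = in.readInt();                      // fat headers are big-endian on disk
                if (magic == 0xCAFEBABE) {                     // FAT_MAGIC: a universal binary
                    int count = in.readInt();                  // number of architecture slices
                    System.out.println("Universal binary with " + count + " slices:");
                    for (int i = 0; i < count; i++) {
                        int cputype = in.readInt();
                        in.readInt();                          // cpusubtype (ignored here)
                        in.readInt();                          // offset of the slice (ignored here)
                        int size = in.readInt();               // size of the slice in bytes
                        in.readInt();                          // alignment (ignored here)
                        System.out.println("  " + name(cputype) + ", " + size + " bytes");
                    }
                } else if (magic == 0xFEEDFACE || magic == 0xCEFAEDFE
                        || magic == 0xFEEDFACF || magic == 0xCFFAEDFE) {
                    System.out.println("Thin (single-architecture) Mach-O binary");
                } else {
                    System.out.println("Not a Mach-O binary");
                }
            }
        }

        static String name(int cputype) {
            switch (cputype) {
                case 7:          return "i386";
                case 0x01000007: return "x86_64";
                case 18:         return "ppc";
                case 0x01000012: return "ppc64";
                default:         return "cputype " + cputype;
            }
        }
    }

    A thin (single-architecture) binary starts with one of the plain Mach-O magic numbers instead of the fat magic, which is how the sketch tells the two apart.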

  • Universal Binary..Once and For All

    I know this topic has been discussed, but I'm still a little confused. I want to put this issue to rest, at least from my perspective.
    If you install a Universal application via download or CD/DVD, does it install an application that would be able to run on BOTH an Intel and a PPC Mac? I know that the file size isn't exactly doubled, but does it still add extra unneeded code?
    I'm on a PPC Mac and I want to know: if I download a Universal app, am I getting the Intel version too? I really don't want this because I want my drive to be as slim as possible.
    If the Universal app does install both versions (which I think was clearly evident in the huge increase in size of iLife '06 and iWork), is the only way to get the Intel version off to use "Trim the Fat"?
    I don't really want to get into the nitty gritty...I just want to know if unneeded code is being installed and taking up space on my iBook because of this new Universal App thing.
    Thank you so much.

    I don't really want to get into the nitty gritty...I just want
    to know if unneeded code is being installed and taking up
    space on my iBook because of this new Universal App thing.
    The answer is yes.
    Universal binaries typically include both PowerPC and x86 versions of a compiled application.
    Universal binaries are usually substantially larger than single-platform binaries, because two copies of the compiled code must be stored. The size of the resulting universal binary is usually not double (roughly a 30% increase in size) as a significant portion will be resources which are shared between the two architectures.

  • The first binary file write operation for a new file takes progressively longer.

    I have an application in which I am acquiring analog data from multiple
    PXI-6031E DAQ boards and then writing that data to FireWire hard disks
    over an extended time period (14 days).  I am using a PXI-8145RT
    controller, a PXI-8252 FireWire interface board and compatible FireWire
    hard drive enclosures.  When I start acquiring data to an empty
    hard disk, creating files on the fly as well as the actual file I/O
    operations are both very quick.  As the number of files on the
    hard drive increases, it begins to take considerably longer to complete
    the first write to a new binary file.  After the first write,
    subsequent writes of the same data size to that same file are very
    fast.  It is only the first write operation to a new file that
    takes progressively longer.  To clarify, it currently takes 1 to 2
    milliseconds to complete the first binary write of a new file when the
    hard drive is almost empty.  After writing 32, 150 MByte files,
    the first binary write to file 33 takes about 5 seconds!  This
    behavior is repeatable and continues to get worse as the number of
    files increases.  I am using the FAT32 file system, required for
    the Real-Time controller, and 80GB laptop hard drives.   The
    system works flawlessly until asked to create a new file and write the
    first set of binary data to that file.  I am forced to buffer lots
    of data from the DAQ boards while the system hangs at this point. 
    The requirements for this data acquisition system do not allow for a
    single data file so I can not simply write to one large file.  
    Any help or suggestions as to why I am seeing this behavior would be
    greatly appreciated.

    I am experiencing the same problem. Our program periodically monitors data and eventually saves it for post-processing. While it's searching for suitable data, it creates one file for every channel (32 in total) and starts streaming data to these files. If it finds the data is not suitable, it deletes the files and creates new ones.
    In our lab, we tested the program on Windows and then on RT and we did not find any problems.
    Unfortunately, when it was time to install the PXI in the field (an electromechanical shovel at a copper mine) and test it, we found that saving was taking too long and the program screwed up. Specifically when creating files (i.e. the "New File" function). It could take 5 or more seconds to create a single file.
    As you can see, the field startup failed and we will have to modify our programs to work around this problem and return next week to try again, with the additional time and cost involved. Not to mention the bad image we are giving to our customer.
    I really like LabVIEW, but I am particularly upset because of this problem. LV RT is supposed to run as if it were LV Win32, with the obvious and expected differences, but a developer cannot expect things like this to happen. I remember a few months ago I had another problem: on RT, the Time/Date function gives a wrong value as your program runs when using timed loops. Can you expect something like that when evaluating your development platform? Fortunately, we found the problem before giving the system to our customer and there was a relatively easy workaround. Unfortunately, now we had to hit the wall to find the problem.
    On this particular problem I also found that it gets worse when there are more files in the directory. Create a new directory every N hours? I really think that's not a solution. I would not expect this answer from NI.
    I would really appreciate someone from NI giving us a technical explanation of why this problem happens and not just "trial and error" "solutions".
    By the way, we are using a PXI RT controller with the solid-state drive option.
    Thank you.
    Daniel R.
    Message Edited by Daniel_Chile on 06-29-2006 03:05 PM

  • Not able to deploy binary file asset

    Hi
    I am trying to deploy a WWWFileSystem binary file asset, but I am getting an exception while deploying.
    Message Detail:
    CONTAINER:atg.deployment.file.DeploymentSourceException: Unable to find VirtualFileSystem component for path /atg/epub/file/WWWFileSystem; SOURCE:deploymentHandlerNoVFSComponent: level 4: Unable to find VirtualFileSystem component for path /atg/epub/file/WWWFileSystem
    atg.deployment.file.DeploymentTargetException: Unable to find VirtualFileSystem component for path /atg/epub/file/WWWFileSystem
        at atg.deployment.file.FileWorkerThread.handleError(FileWorkerThread.java:1256)
        at atg.deployment.file.FileWorkerThread.runCommand(FileWorkerThread.java:740)
        at atg.deployment.file.FileWorkerThread.processMarkerForAddUpdatePhase(FileWorkerThread.java:441)
        at atg.deployment.DeploymentWorkerThread.processMarkerPhase(DeploymentWorkerThread.java:521)
        at atg.deployment.DeploymentWorkerThread.run(DeploymentWorkerThread.java:300)
    Can anybody tell me if I am missing anything?
    I have the WWWFileSystem and ConfigFileSystem mapped in the target agent.
    Thanks

    Hi Rama,
    definitely you don't need the PDK to upload a PAR.
    From the stack trace it seems a problem with your DB.
    I would try to restart the server completely, even if the chance to get it repaired by this is not that high...
    --- I just read the other thread where you also posted. Which DB are you on? Even if the PAR is not 0 bytes, are there possibly files in it (JARs) which are 0 bytes? If that is the case, try to delete them.
    Hope it helps
    Detlev

  • Downloading a binary file

    Following is a short program that attempts to copy a binary file from a remote site to a local file. I tested the program with a text file and it basically works. The problem is that the read() function does not throw an EOFException. So I have the following questions:
    1. Is there some method that accomplishes what my copyFile(...) method does?
    2. Is there a marker that signals the end of a file? Is this marker the same for text files and for binary files?
    3. Is there a better way to copy a binary file?
    Your help will be much appreciated.
    Miguel
    import java.util.*;
    import java.io.*;
    import java.awt.*;
    import java.net.*;
    public class filecopy {

        public static void main(String[] args) {
            copyFile("http://localhost/super/input.txt", "output.txt");
        }

        static void copyFile(String mapURL, String mapName) {
            BufferedReader bin = null;
            DataOutputStream bout = null;
            try {
                URL url = new URL(mapURL);
                URLConnection connection = url.openConnection();
                bin = new BufferedReader(new InputStreamReader(connection.getInputStream()));
                bout = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(mapName)));
                while (true) {
                    bout.write(bin.read());
                }
            } catch (EOFException eofexc) {
                try {
                    bin.close();
                    bout.flush();
                    bout.close();
                } catch (IOException ioexc) {
                    System.out.println("copyFile: IOException: " + ioexc);
                }
            } catch (IOException ioexc) {
                System.out.println("copyFile: IOException: " + ioexc);
            }
        }
    }

    I would highly recommend getting the file size of the remote file, and collecting data until you have a file as big as the one you want. I would do this for both binary and text files. Not all text files have an EOF marker, and there is no universal "EOF" marker for a binary file. (A sketch along these lines follows below.)
    ~Kullgen
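
    For reference, a minimal sketch of the usual fix (class and method names are made up): copy raw bytes through streams and stop when read() returns -1, since an InputStream signals end-of-file that way rather than by throwing EOFException. The Content-Length header, when the server sends it, lets you verify the final size as Kullgen suggests.

    import java.io.*;
    import java.net.*;

    public class BinaryCopy {
        static void copyFile(String mapURL, String mapName) throws IOException {
            URLConnection connection = new URL(mapURL).openConnection();
            long expected = connection.getContentLengthLong();   // -1 if the server does not say
            long copied = 0;
            try (InputStream in = connection.getInputStream();
                 OutputStream out = new BufferedOutputStream(new FileOutputStream(mapName))) {
                byte[] buffer = new byte[8192];
                int n;
                // read() returns -1 at end of stream; it never throws EOFException here
                while ((n = in.read(buffer)) != -1) {
                    out.write(buffer, 0, n);
                    copied += n;
                }
            }
            if (expected >= 0 && copied != expected) {
                System.out.println("Warning: expected " + expected + " bytes, got " + copied);
            }
        }

        public static void main(String[] args) throws IOException {
            copyFile("http://localhost/super/input.txt", "output.txt");
        }
    }

    Avoiding Reader/Writer classes also matters for correctness: they decode bytes into characters, which can silently corrupt binary data.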

  • Binary file problem (Read a specific Stream)

    Hi guys,
    I have a problem: I want to read a binary file, but not the whole binary file. I only want to read one stream.
    For example, I have a binary file which has 5 streams. If I only want to read one of the streams, what should I do? How do I get the start and end position of each stream?
    The binary file is attached.
    Please note that the file is 5.7 MB whereas the LabVIEW message system allows 5.2 MB, therefore I uploaded the file to a file host:
    http://uploading.com/files/55524a3d/10211001_.raw/
    Thank you in advance
    Rgs
    M Omar Tariq

    As far as I can understand, you are able to read and decode the file in LabVIEW but have problems with big files. In such cases, read the data file in chunks and discard the data you do not need. You can also make a tool that splits the multi-channel data into separate files in a binary format that is easy to read from LabVIEW. It is also not necessary to have all the data in memory when analyzing it; analysis may in many cases be done in chunks. (A rough sketch of the chunked idea follows below.)
    Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
    (Sorry no Labview "brag list" so far)
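
    The thread is about LabVIEW, so the following is only a language-neutral illustration of the chunked idea. It assumes, purely for the sketch, that the streams are interleaved fixed-size samples - the actual layout of the .raw file is not described here - and pulls out one channel frame by frame without ever holding the whole file in memory.

    import java.io.*;

    // Illustration only: assumes samples are interleaved per channel with a fixed
    // sample size, which may or may not match the actual .raw layout.
    public class ChannelExtract {
        static final int CHANNELS = 5;        // assumed channel count
        static final int SAMPLE_BYTES = 2;    // assumed bytes per sample

        static void extractChannel(String inPath, String outPath, int channel) throws IOException {
            int frame = CHANNELS * SAMPLE_BYTES;              // one sample from every channel
            byte[] frameBuf = new byte[frame];
            try (InputStream in = new BufferedInputStream(new FileInputStream(inPath));
                 OutputStream out = new BufferedOutputStream(new FileOutputStream(outPath))) {
                // Read one frame at a time and keep only the requested channel's bytes,
                // so memory use stays constant no matter how large the file is.
                while (in.readNBytes(frameBuf, 0, frame) == frame) {
                    out.write(frameBuf, channel * SAMPLE_BYTES, SAMPLE_BYTES);
                }
            }
        }

        public static void main(String[] args) throws IOException {
            extractChannel("10211001_.raw", "channel0.raw", 0);
        }
    }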

  • Sending binary file via RS232 connection

    Hi,
    I want to send a binary file via an RS232 connection, but I have trouble reading the .bin file and then converting the data to a string in order to send it via RS232. Any ideas how I can deal with this?
    Thanks in advance,
    IG. 

    It is hard to help when you do not post any code. I suggest you go to the toolbar, then Help, then select Find Examples and search for files. Also, since you are new to LabVIEW, always have context help enabled. Both tools are important if you want to learn LabVIEW. Feel free to post again if you are stuck.
    Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
    (Sorry no Labview "brag list" so far)

  • Compare service-types.db.pacnew, binary file

    After today's update I have /usr/lib/avahi/service-types.db.pacnew. I usually use 'Meld' to compare a .pacnew with the original file. However, I am unable to do so because 'Meld' says, "Could not read file... appears to be a binary file".
    I don't have any knowledge of comparing binary files. I don't know how to deal with it even if I use some other utility which might read binaries.
    I need to know whether it will be safe to delete either file, /usr/lib/avahi/service-types.db or /usr/lib/avahi/service-types.db.pacnew.
    I am sure I never edited the said file, unless some application did so.
    The Forum guidance is requested.
    Thanks.
    Last edited by fantab (2013-03-03 11:28:50)

    I got a reply to my query from the package maintainer:
    Gaetan Bisson wrote:
    Feel free to inspect the svnlog:
            https://projects.archlinux.org/svntogit … ages/avahi
    Notice that this change to the backup array was introduced to
    "implement FS#33930", and check the corresponding bug report:
            https://bugs.archlinux.org/task/33930
    https://projects.archlinux.org/svntogit … 842f2d4496
    If someone can translate the gist of the links into simple English, I will be grateful.
    Thanks...
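
    Not an answer to the avahi question itself, but on the general "how do I compare two binary files" point: the cmp utility reports the first differing byte from the shell, and the same check is only a few lines in, for example, Java (paths as given in the post; Files.mismatch needs Java 12 or newer):

    import java.nio.file.*;

    public class PacnewCompare {
        public static void main(String[] args) throws Exception {
            Path current = Paths.get("/usr/lib/avahi/service-types.db");
            Path pacnew  = Paths.get("/usr/lib/avahi/service-types.db.pacnew");
            long pos = Files.mismatch(current, pacnew);   // -1 means the files are identical
            if (pos == -1) {
                System.out.println("Files are identical; keeping either one gives the same content.");
            } else {
                System.out.println("Files first differ at byte offset " + pos);
            }
        }
    }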
