Advantage of being immutable

What is the advantage of String object being immutable?

It means there can be many different references to that string, and none of them has to worry about another one changing the contents of the string.
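
For illustration, here is a small sketch (class and variable names are made up) of why that sharing is safe: two holders keep a reference to the same String, and no operation on one can change what the other sees, because every "modification" produces a new String object.

public class SharedStringDemo {
    public static void main(String[] args) {
        String label = "report-2024";   // one String object
        String alias = label;           // a second reference to the same object

        // "Modifying" a String always returns a new object;
        // the object that alias still points to is untouched.
        label = label.toUpperCase();

        System.out.println(label);  // REPORT-2024
        System.out.println(alias);  // report-2024 (unchanged)
    }
}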

Similar Messages

  • Advantage of Being Techno-Functional

    Hi,
    Greetings for the day !!
    Can anyone explain what the advantages of being Techno-Functional are?
    With Best Regards
    Mamatha.B

    Hi mamatha,
    In my personal opinion:
    1. The only real advantage is personal satisfaction.
    2. Salary-wise, nobody will get extra money for having such extra knowledge.
       (Because a person is appointed in a company to handle the technical ABAP work,
       he won't be RESPONSIBLE for customization and other functional-area tasks;
       that responsibility will lie with some other functional consultant.)
    3. Another minor advantage is that ABAP development quality and turnaround time
       improve if we also know a little about the functional side.
    regards,
    amit m.

  • VT advantage not being recognized by windows xp

    I have installed VT Advantage software version 2.02 and IP Communicator. Somehow, Windows XP does not recognize the installed hardware, and in Device Manager the USB device is marked as unknown.
    The VT Advantage software does not recognize the softphone installed on the PC, nor the hardphone.
    Any ideas?

    Hi,
    I assume you are an administrator on the local machine. Did you install the software before plugging in the camera? If not, delete the device, remove the software, unplug the camera, and do the installation again after a reboot.
    Make sure you have the latest Service Pack on XP (I think the minimum should be SP1).
    Good day

  • Being able to "click" on an area in a picture?

    Hi there,
    I am currently creating an incident report form and need to include a picture of a human body. The person filling in the form will need to be able to "click" on any part of the picture to show where they were injured (e.g. left elbow, top right of the head).
    How do I make the picture clickable? Is this possible? I urgently need some help with this... picture below.

    One method is demonstrated with the following sample: https://workspaces.acrobat.com/?d=mbkU5n6wi87zMO6nhES9BA
    It is based on this earlier simpler example: https://workspaces.acrobat.com/?d=cLYeEBVQqrYO1ox6CP93PA
    but it has additional code that attempts to correct the position of the dot as the position is further from the center. Note that increments are integers from 0 to 100, so you can't get any finer than that.
    You will have to extend this idea if you want to allow for more than one location.
    This approach has the advantage of being able to place the coordinates in a form field so they are included along with the other form data. If you used a drawing markup such as the pencil tool, this wouldn't be the case.

  • WHY DO APPLE APPS NOT TAKE ADVANTAGE OF AN 8-CORE'S SPEED?

    I bought a new 8-core, 3GHz Mac with 16GB of RAM because I always buy the "best" when upgrading, so what I buy lasts as long as possible.
    The unfortunate thing I discovered is that neither FCP nor Motion takes full advantage of the processor speed.
    Apple tells me there is a limit to the processor speed at which they will function, as well as to the amount of memory they will use.
    This seems strange since Apple makes the "box".
    I'd be content with all programs dynamically sharing the "real estate", but right now the only advantage is being able to work in two apps at the same time. Since that doesn't happen that often, I'd trade that advantage for faster processing per app.
    If anyone knows how to remove the “speed-limiter” for apps on a multi-proc Mac, please let me know.
    Dan

    I appreciate the desire to purchase something that will last and has quality. Still, "best" is a very slippery concept and is often confused with "more stuff" and/or "more expensive stuff".
    I'm reminded of the potential folly of the "I always get the best" attitude when I go out to dinner and the person deciding on the wine simply picks the most expensive thing they can find on the wine list. They did not consider the nature of the food (spicy or subtle, acidic or basic, etc) and how the wine will react. They just equated cost with quality. And while the most expensive bottle on the list may well be an extraordinary vintage, it also may be equally lost on the palates of the diners if it does not suit the food that evening.
    Since "best" is a term of relativity, that is, it references a comparison between things, the real question might be "best for what"? Best compared to what other things, under what circumstances?
    Did you define the tasks you wanted this machine to accomplish and investigate other options to accomplish these tasks? If not, how can you have decided this is the 'best' machine?
    In the for what it's worth category, FCP has been and will likely remain for the near future a 32bit program. It is in the nature of 32bit computing to be limited to 4GB of addressable RAM. Apple does not decide this. This is just the way it is.
    Eight processor machines used to be the realm of supercomputers. The OSs of those rarified beasts spent enormous time and computing resources simply keeping track of what its processors were doing - keeping them fed and cleaning up the messes when things get out of sequence. Oh, and the programs were written and compiled for that particular machine as well. OSX is not such an OS and FCP is not such a program. They are designed to run on a range of machines, from single processor PPC G4 Powerbooks all the way up to your mighty intel octobeast. Eventually we'll have 64bit OSs and applications - but backward compatibility and consumer based (aka cheap and as broad a market share as possible) economics limit this movement.
    Enjoy your machine, it is very capable. You need to understand how those capabilities can be made manifest. Get an ATI X1900 card and learn how to use as many programs at the same time as possible. If you use Compressor, learn how to set up virtual clusters. You are correct, it is a machine that you should be able to use for a long (in "computer years") time.
    Good luck.
    x

  • "Economizing" in a list of maps with identical keys (like a database table)

    Hi there!:
    I've been checking this forum for information about something like what I state in the title of this message, but haven't found such a specific case. I'm commenting my doubt below.
    I'm working with a list of maps whose keys are exactly the same in all them (they're of type String). Indeed it could be considered an extrapolation of a database table: The list contains maps which act as rows, and every map contains keys and values which represent column names and values.
    However, this means repeating the same keys in every map, and that wastes memory. Granted, maybe it's not such a big waste, but since the list can contain thousands of maps, I think it would be better to choose a more "economical" way to achieve the same result.
    I had thought about building a class which stored everything as a list of lists and, internally, mapped the String keys to the corresponding Integer indexes of every list. But then I realized that maybe I was re-inventing the wheel, because it's very probable that someone has already made that. Is there maybe a class in the core API which allows that?
    Thank you very much for your help.

    Well, after re-reading the Java tutorial on the Sun website I've come to a conclusion I should have reached before, back when I thought about using StringBuffers as keys of the maps instead of Strings.
    I'm so used to building Strings from literals instead of the "new String()" constructor (just like everyone) that I had forgotten that, as happens with any kind of object but not the primitive data types, Strings are not passed to methods by value but by reference. The fact that they are immutable made me think that they were passed by value.
    Apart from that, my problem was also that, using literals, I was creating different String objects every time, despite their content being equal (making 400 different keys called "name", for example).
    In other words, I was doing something like this:
    // It makes a list which will contain maps of each boy's personal data (as if they were "rows" in a table).
    List <Map <String, Object>> listData = new ArrayList <Map <String, Object>> (listBoy.size ());
    // It loops over a list of Boy objects, obtained using EJB.
    for (Boy boy : listBoy) {
         // It makes a new map containing only the information I'm interested in from the Boy object.
         Map <String, Object> map = new HashMap <String, Object> (2);
         map.put ("name", boy.getName ());
         map.put ("surname", boy.getSurname ());
         // It adds the map to the list of data.
         listData.add (map);
    }
    Well, the "problem" here (being too demanding, but I am :P ) is that I was adding new String objects as keys in every map all the time. The key "name" in the first map was different from "name" in the second one, and so on.
    I guess my knowledge got muddled at some point and I thought it was impossible to use exactly the same String object in different maps (the reference, not just the same value!). Hence my idea of using StringBuffers instead.
    But thinking about it carefully, why not do this?:
    List <Map <String, Object>> listData = new ArrayList <Map <String, Object>> (listBoy.size ());
    // It creates the necessary String keys up front, instead of using literals in every loop iteration later.
    String name = "name";
    String surname = "surname";
    for (Boy boy : listBoy) {
         // It uses references (pointers) to the same String keys, instead of new ones every time.
         Map <String, Object> map = new HashMap <String, Object> (2);
         map.put (name, boy.getName ());
         map.put (surname, boy.getSurname ());
         listData.add (map);
    }
    Unfortunately, the "hashCode" method on String is overridden, and instead of returning the typical hash code based on the object's identity in memory, it returns one based on its content. That way I can't make sure that the "name" key in one map refers to the same object in memory as in another one. I know, I know. Common sense and the Java documentation confirm it, but I would have loved to have an empirical way to demonstrate it.
    I guess that using "javap" and disassembling the generated bytecode is the only way to make sure that it's that way.
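    For what it's worth, a simpler empirical check is possible without disassembling anything: the == operator compares object references (not content), and System.identityHashCode gives the identity-based hash that String.hashCode overrides. A small sketch (the class name is made up) which also shows that string literals are interned, so two occurrences of the literal "name" actually refer to the same object:
    public class KeyIdentityCheck {
        public static void main(String[] args) {
            String a = "name";               // literal, interned by the JVM
            String b = "name";               // same literal -> same interned object
            String c = new String("name");   // explicitly a distinct object

            System.out.println(a == b);      // true  -> same reference
            System.out.println(a == c);      // false -> different objects
            System.out.println(a.equals(c)); // true  -> same content

            // Identity-based hashes (what Object.hashCode would have returned):
            System.out.println(System.identityHashCode(a) == System.identityHashCode(b)); // true
            System.out.println(System.identityHashCode(a) == System.identityHashCode(c)); // false
        }
    }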
    I believe it's solved now :) (unless someone tells me the contrary). I'm still mad at myself for thinking that Strings were passed by value. Thinking about it now, it made no sense!
    dannyyates: It's curious, because re-reading every answer I think you may have been pointing to this solution already. But the sentence "you put the +same+ string" was a little ambiguous to me, and I thought you meant putting the same String as in "putting the same text" (which I was already doing), not the same object reference (in other words, using a common variable). I wish we could have continued discussing that in depth. Thanks a lot for your help anyway :) .

  • Is there a way to make Wake on Demand work?

    I have checked the AKB and I have a Mac Pro that is all set to wake on demand; the only issue is that I have an older AirPort Express (b/g, not n). Does anyone know of a workaround to get this AirPort to work with Wake on Demand, or am I out of luck?

    There are many Sleep-related problems on these boards; just search them out here in Discussions. You can try troubleshooting steps like disconnecting all peripherals, or creating a new user account to see if the issue is system-wide or limited to your user account. If it is limited to your account, you might try dragging the yourname/Library/Preferences folder to the desktop and restarting to see if it is a pref...
    In short I'm not sure I can solve your issue but I can offer a good workaround.
    If your Mac is only troublesome with Sleep, you might like to know that sleep is a very unnecessary state.
    If you choose to leave your Mac on 24/7 it uses very little more power than "Sleep", as the HD and fans spin down quickly when not in use and you will be using Screen Sleep. The advantage of being on 24/7 is that there are periodic scripts (daily, weekly and monthly housekeeping tasks) set to run between 3 and 5 AM. If you do not leave your Mac on, these periodic scripts will not run. If you "Sleep" your Mac they will not run either.
    Another advantage is that your Mac is ready to respond at the touch of a key or move of a mouse, with no "grogginess". And yet another, of course, is that you will suffer no Sleep-related problems.
    Just set your Mac to "Never Sleep" and "Screen Sleep" to whatever you like and don't worry.
    For security reasons you may want to log out.
    As for shutting down, each time you reboot your system you can cause damage. A computer running Unix should theoretically never need to be rebooted; of course a reboot sometimes can't be helped, but in theory it should never be needed. Each time you reboot you can send a power surge through your CPU, HD, etc., which could cause problems and shorten the life of your computer.
    fyi - some Unix servers have been up and running for over a dozen years. There also have been many posts here in Discussions where folks were posting their "up" times.
    -mj

  • Get the Value copy from Header feild to Line feild through Persnalization

    Hi,
    I have one requirement.
    I have a field at the header level. When I assign a value to that field at the header level, the same field at the line level should get the same value at the same time; and when I query the form and change the value of that field, the line-level field should also change. How can I achieve that through Form Personalization?
    Any help is appreciated...
    Thanks in Advance.
    Nihar

    If your string can be modified, why are you using a String? Since String is immutable, whenever the string gets "modified" a new copy of the string is generated.
    If you don't need many of String's capabilities, you may be better off using a char array and modifying it manually. I haven't tested it, but the following code should work if you just need to append to the string.
    public class CustomString {
        private char[] data;

        public CustomString(String in) {
            this(in.toCharArray());
        }

        public CustomString(char[] in) {
            data = in;
        }

        public void append(String a) {
            append(a.toCharArray());
        }

        public void append(char[] a) {
            // Grow the backing array and copy the old contents plus the new characters into it.
            char[] tmp = new char[data.length + a.length];
            System.arraycopy(data, 0, tmp, 0, data.length);
            System.arraycopy(a, 0, tmp, data.length, a.length);
            data = tmp;
        }

        public char[] get() {
            return data;
        }
    }
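    A quick usage sketch (hypothetical, just to show the intended behaviour of the class above); in practice the standard library's StringBuilder already covers this append-to-a-mutable-buffer use case:
    public class CustomStringDemo {
        public static void main(String[] args) {
            CustomString cs = new CustomString("Hello");
            cs.append(", world");                      // grows the backing char[] and appends
            System.out.println(new String(cs.get()));  // prints: Hello, world
        }
    }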

  • Flex/AMFPHP App NOT working in IE but works in FireFox...

    Hello, I am beginning to develop Flex applications. I am working with PHP in a Symfony environment.
    The error below was appearing before in Firefox but disappeared after I uninstalled the Flash Player plugin and reinstalled the Flash debug player for Firefox. I also uninstalled the IE Flash Player and replaced it with the Flash debug player for IE (I believe it comes with ActiveX). I have the exact same version of Flash Player for both browsers (0.0.47.0). This is the error:
    [object RemotingConnection]
    Error #2044: Unhandled NetStatusEvent:. level=error, code=NetConnection.Call.BadVersion
    at sample/initApplication()
    at sample/___Application1_creationComplete()
    at flash.events::EventDispatcher/flash.events:EventDispatcher::dispatchEventFunction()
    at flash.events::EventDispatcher/dispatchEvent()
    at mx.core::UIComponent/set initialized()
    at mx.managers::LayoutManager/::doPhasedInstantiation()
    at Function/http://adobe.com/AS3/2006/builtin::apply()
    at mx.core::UIComponent/::callLaterDispatcher2()
    at mx.core::UIComponent/::callLaterDispatcher()
    Again, this error was appearing in FF before, but now ONLY in IE. The same error arises if I try to open the SWF file located in the bin/ directory of the Flex project. My Flex project is located on my Desktop and the server I am trying to connect to is on a Linux box. AGAIN, this project displays the data perfectly in FF but NOT in IE. It just displays several rows from a DB table in a datagrid in Flex.
    I have no idea how to approach this! Here is my ".mxml" file:
    <?xml version="1.0" encoding="utf-8"?>
    <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
        xmlns="*" creationComplete="initApplication()">
        <mx:DataGrid dataProvider="{phpData}">
            <mx:columns>
                <mx:DataGridColumn headerText="User ID" dataField="userid"/>
                <mx:DataGridColumn headerText="User Name" dataField="username"/>
                <mx:DataGridColumn headerText="Email Address" dataField="emailaddress"/>
            </mx:columns>
        </mx:DataGrid>
        <mx:Script>
            <![CDATA[
                import flash.net.Responder;

                [Bindable]
                public var phpData:Array;

                public var gateway:RemotingConnection;

                public function initApplication():void
                {
                    gateway = new RemotingConnection("http://project_dev_environment/web/backend_dev.php/gateway/amfphp");
                    gateway.call("Sample.getUsers", new Responder(onResult, onFault));
                }

                public function onResult(result:Array):void
                {
                    trace(phpData);
                    phpData = result;
                }

                public function onFault(fault:String):void
                {
                    trace(fault);
                }
            ]]>
        </mx:Script>
    </mx:Application>
    Here is the required ".as" file:
    package
    {
        import flash.net.NetConnection;
        import flash.net.ObjectEncoding;

        public class RemotingConnection extends NetConnection
        {
            public function RemotingConnection(sURL:String)
            {
                objectEncoding = ObjectEncoding.AMF0;
                if (sURL) {
                    connect(sURL);
                }
            }

            public function AppendToGatewayUrl(s:String):void
            {
                // body omitted in the original post
            }
        }
    }
    I have given a different URL because this is an internal application and won't be accessible by outside users.
    Again, this app works when loaded in FF but not in IE! HELP, please; I have been trying to figure this out since 7 am this morning and I'm desperate!
    Just FYI, AMFPHP is set up correctly, although it's version 1.2 - could this be causing problems?
    Please, PLEASE help me! Or at least give me a hint. Thank you!

    Hey Tikis,
    There are a few factors that could be throwing this thing through a loop. I've had this exact same thing happen to me once before, and the culprit for me was that my crossdomain.xml file wasn't 100% perfectly formed. Apparently Firefox didn't care enough and went on with its life, however IE decided it wasn't going to read it and I received that error, so that would be the first thing I would check.
    The next thing is what version of PHP you are running on the server; it seems to me that the newest release, 5.2.2, is throwing this error more often than ever before. Something I might suggest is upgrading to AMFPHP 1.9. RemoteObjects natively send data in what's called the AMF3 format, which is much speedier than its predecessor, AMF0. Only AMFPHP 1.9 has the ability to receive / send the data back in AMF3 format, whereas the 1.2 version takes the call, converts it down to AMF0, sends the data (at a slower pace) and generates a result. The other advantage of being able to use AMF3 is having truly type-cast results: if PHP returns a number you can check it with if(event.result == 0), whereas with AMF0 everything is converted into a string and then sent back to Flash / Flex. So aside from all of these advantages, the last one is less code; you no longer need to use the RemotingConnection to convert everything down to AMF0. I would put a copy of the 1.9 version of amfphp on your Linux box, point the remote object at that first and see what happens; if you are still receiving the same error then the problem lies in something else.
    Hope this helps.

  • How can I transfer Itunes from external hard drive to a new computer?

    Hello,
    I had a Dell laptop with Windows XP on it recently crash, but I have everything saved to a Seamagine external 1 terabyte hard drive. I have about 250 gigabytes of media on it, mostly CD's and audiobooks. The setup I had on the XP computer was my Itunes folder saved on the C drive, but I kept all the actual music and audiobooks on the external hard drive, which was labeled the F drive.
    After the crash, a relative sold me their three year old Compaq laptop, with Windows 7 instead of XP as the OS. My relative already had Itunes downloaded on it, but when I opened the Itunes on the new (new to me) laptop, Itunes only recognized their (my relative's) account. So, I uninstalled their Itunes and then I re-downloaded Itunes for 64-bit Windows 7. I then opened the Windows Explorer folder so to drag and drop my backed up Itunes folder and library information from the old C drive to the new Itunes folder in the new C drive. But, it wouldn't do anything. It still wanted the password info from my relative's account. It wouldn't recognize my account, even though I had uninstalled and then reinstalled a newer version of Itunes. Dragging and dropping my backed-up Itunes folder from my external hard drive to the new C drive didn't work because it said it didn't have certain permissions or something like that. I think I might have downloaded a newer version of Itunes, different from what I had saved a couple of days ago, which might be one of the problems here, not to mention that I switched OS's from XP to Windows 7. 
    When I plugged the external hard drive USB into the new laptop no Seamagine manager came up for user-friendly accessibility. I didn't know what to do, so I installed Seagate Dashboard onto the new computer. It didn't do anything for me either. No problem, I thought, I'll just use Windows Explorer to move the Itunes library, and then once I transfer (i.e., drag and drop) the old Itunes folder to the new one, all will be as it was. I could then direct my Itunes account to use my external hard drive as the media source folder. I was wrong. It won't let me do that either. It says I don't have permission to do it.
    How can I get my old Itunes library info onto the C drive of this new computer, but keep the 250GB of media files on the external hard drive, just like the setup I had on the old laptop before it crashed? If you can help me, then you will be my hero. I'm scared to death that I've lost my library info and afraid to mess with it too much more for fear of erasing it or, worse, losing the 250 GB of media that is on the external hard drive, which has taken me a tremendous amount of time and money to compile. It is, in effect, irreplaceable. I'm very worried and frustrated. I've been working on this for days, to no avail. You're my last hope. Thank you.

    Hmm, I think we should start over...
    Here are the typical layouts for the iTunes folders:
    In the image above the red outline denotes the "media folder", that is the folder listed under Edit > Preferences > Advanced. In the standard layout where the "media folder" is inside the "library folder" the iTunes library has the advantage of being self-contained and portable. Since you've lost the "library folder" from your old computer you're going to have to start over so you might as well get organized properly.
    If the folder currently holding your media files is called G:\iTunes then rename it as G:\Media.
    The support doc How to open an alternate iTunes Library file or create a new one explains how to create a new iTunes library where you want it. In short, click the icon to start iTunes and immediately press and hold down the Shift key. Keep holding until asked to choose or create a library, click Create, browse to G:\, make sure the folder is going to be called iTunes, and click Save.
    Move the folder that actually holds the media inside the folder G:\iTunes\iTunes Media. We're going to let iTunes reorganize everything (unless you have a collection of .wav files in which case stop and tell me because they require a different approach).
    Use File > Add folder to library and select the folder G:\iTunes\iTunes Media. iTunes will search through the whole series of subfolders adding all media files into the library and reorganizing them into the correct artist & album folders.
    Again, for data security, you really ought to obtain a second external drive at some point and backup your library to it. Only having one copy, as you have found, leaves you vulnerable to failure. External drives are no less prone to it.
    A) Windows uses \ as a folder separator in path descriptions, Mac uses /.
    B) Do you have a backup of the iTunes folder that was on drive C:\ or are you simply referring to the location of your media as a "backup" when it is really the only copy? If you have a backup copy of your "original" iTunes library you could move/copy that to G:\iTunes, shift-start iTunes to open it, and then I can offer tools or techniques for fixing the links that will be broken because the drive letter has changed.
    C) Use Store > Sign out and then Store > Sign in to sign into your iTunes account. You will also need to use Store > Authorize the computer to allow playback and transfer of any protected content. If your new library contains protected content from any other accounts then you will have to authorize the computer for those too.
    tt2

  • How do I setup multiple IP addresses?, How do I setup multiple IP addresses?

    Just bought a new Mac Mini and Lion Server to replace my company's three 15-year-old 8500s. Ooof. They were running mail, web and FileMaker servers, respectively. We have AT&T DSL Small Business, and five static IP addresses. Lion Server claims to be able to run all that simultaneously on the same new computer. But I can't find specific instructions on exactly how to configure the two websites to listen for their respective, dedicated IP addresses.
    The mail server is up and running (I copied the appropriate server settings from the old OSX 10.3.9 server computer), but I can't find where to add my other assigned IP addresses for my two web sites.  When I add the names of the sites in the setup window, the only options I can see are the two ethernet addresses (one wired and one wireless) of the actual computer itself.  Thanks in advance for your help. 

    You don't need a "Gigabit router". It's way, way overkill (and $$$$$$) for your network.
    The Comcast link is going to max out at about 15-20 Mbps, so any router with 100 Mbps links is going to be more than sufficient.
    As it happens, I wouldn't get a router anyway. Comcast is going to provide you with a router (they need to in order to terminate the RG-58 cable connection they drop) and provide an ethernet port (that ethernet port, BTW, will be 100Base-T).
    What you really need is a firewall. One that can handle multiple public IP addresses and allow specific incoming traffic (e.g. the FTP servers).
    I would recommend something from NetScreen. You may be able to pick up one of their NetScreen 5GTs pretty cheap now they're end of life (I have several of these around my network, including one in the exact same configuration you're talking about here). Failing that the SSG-5 would be sufficient.
    They also have the advantage of being able to run as a VPN endpoint should you need that functionality.
    At the end of the day your network topology would look something like:
    Comcast router -> (NetScreen) Firewall -> Switch -> Clients
    All the clients will connect to the switch and can talk amongst themselves at gigabit speeds. Any internet traffic would pass from the switch to the firewall and out to the big bad world.

  • Back-ups with and without Time Machine + G -MINI

    I've got a problem with my new G-MINI 500GB running on an Intel iMac, 10.5.1 / 230GB. I hope someone can help me with this.
    Overview of problem: I need more hard disc space and opted for an external drive. I would have got a G-Tech Q 500 GB drive but there weren't any immediately available, so I got the G-MINI 500. Basically this is for video and large audio files. Anyway, I plugged the new MINI in and it pretty much set itself up. Time Machine offered to do a backup of my whole system and contents; I clicked 'accept' and, excluding a few items in the options pane in TM, it came to around 210 GB now on the MINI. As I need to take the Mac in to have a defective LCD serviced, I thought a backup of the system was a good idea. I added a few other video files in a separate folder on the MINI. I then thought that was way too much - I will need more than the remaining 245 GB for future stuff. So I wanted to get some more space back, and thinking that it would work pretty much like the Mac's own hard disc and flash USB drives, I just tried to send some files from the backup folder on the MINI to the trash in the dock. I got a message saying I couldn't trash any files included in a backup! Kinda locked. I tried trashing one of the video files that wasn't in the system TM backup folder and it did go to the trash, but then I checked and the available space on the MINI hadn't changed: the size of that file wasn't added to the 'available space'. Buff, problem.
    Anyway, I then decided to trash the whole back-up folder and everything else. All went in the trash can in the dock but again, despite expecting the Mini's space to have been freed up, like any hard drive or other storage device, it hadn't been. It still counted 210 GB used up. To make sure I emptied the trash as well but again with no result.
    I couldn't access the trashed files in any way now but the Mini was obviously counting them. What's going on?
    So then I used Disk Utility to check the MINI, same result, and then, completely exasperated, I just went ahead and partitioned it, that seeming the best option at the time. I got the space back but lost about 230 MB in the process. Annoying all in all. Now, I didn't want a drive that you can't delete files from without having to erase the whole bl**dy disk! This seems to be the only way to get space back from the MINI's drive but it seems ludicrous. What am I doing wrong, what don't I know?
    A related question is what's the difference between 'erasing' a disk and 'partitioning' one (in Disk Utility)? I can't work this one out. The important question is whether there is a way to delete items from the MINI in order to get back the space they occupy. If not, I should have bought something else. I thought G-Tech was supposed to be top end; it is faster than USB, I'll give it that for now.
    Thanks in advance for any replies.
    Oh and I'm posting this here because there is nothing in G Tech support about this and the post I sent to the G Tech forum goes unanswered after 2 days already - it seems like a simple question that someone should know the answer to. I want to know if I should just simply copy files to the mini and not bother with system files unless I really want to make a back up. A final question would also be: what exact folders are needed to make just a simple system back up but without all the clutter that you could reinstall from the discs anyway?
    Thanks!
    Message was edited by: Paul Quemades

    If you just want to drag a few files around, how about creating a folder on the external drive? I know some say you shouldn't do this, but TM will still work even if there is another folder on the same volume you use for backing up with TM. In that folder you can drag whatever you like into it, just as you are able to do on any flash drive.
    But let's revisit how TM works. I don't understand the detailed operation of TM but once it establishes the initial large copy of your Operating System, subsequent small changes are linked back to the big one, so when in the Time Machine window with all of the different backups over the past days or weeks, clicking on any one of them will show you what appears to be the same thing - the contents of your hard disk. In amongst that will be the small changes or added files.
    To save it recreating by and large the same appearance of your hard disk each time it reuses the initial copy and just adds any changes. It does this by associating any newly changed or newly added file to the base copy, by using links. It is like creating a context for the file, so TM is making file X linked to folder Y which is part of application Z. If we want that file back using TM, it will know that file X has certain associations and it will put it back where it got it from.
    When we drag that same file to another location, it is just that file in isolation. It has no context. The only association it has, is what we remember about it, "that photo was taken at Lucy's wedding" for instance. TM or the computer for that matter doesn't know who Lucy is, or even that she is a human being.
    So in our flesh and blood reality, we see files as objects that have meaning to us, but to the computer it is all just data that has a place and a function. Time machine is a kind of illusion. It is also a Place machine by replacing files in previous places.
    You say that using TM to just store "things like personal documents, music, video and other such stuff, unrelated to system back ups, seems such a fussy and tedious way to do it,..." I agree, but TM has the advantage of being able to restore any file no matter how obscure, or even restore your whole OS if you need to. You can always retrieve any document or song or photo from TM by going into TM and having it restore that document. Not by dragging and dropping, but by letting it do it. The other advantage of TM is that it will keep variations of the same file as it is changed, such as various drafts as they are edited.
    Maybe a more suitable backup application for you would be SuperDuper, where you have much greater control over what it backs up. The disadvantage of SuperDuper is that it won't keep older versions of something; rather, it makes a fresh copy of that folder each time you use it.
    So you have a few options, but I think if you can get used to opening TM and locating the files that it automatically saved, and telling it to restore those back to your hard drive, it may well serve your needs.
    roam

  • Multiple Users Sharing WinCE Device

    Is it possible for multiple users to share the same WinCE device?
    One of our business requirements is to allow users to share the same WinCE device, example, one device shared for an entire branch office.
    Common scenario: UserA syncs and modifies data locally on the device, then UserB syncs the device, obtaining their own data partition/snapshot, and modifies data without affecting UserA's data.
    In my testing, the initial sync of UserA works fine, but when I try to sync UserB, I get the error 'connection does not exist'. However, if I delete the database created by UserA, I can successfully sync UserB.
    Any help would be greatly appreciated.

    I guessed that you were using subsetting. This, however, does mean that Lite will detect that there is a new user for the device and send a complete rebuild of all of the data. This actually links up with the next issue.
    If you have 5 (or 10) users in a department and the snapshots are defined per user, then each time the user changes, your initial sync to get their data will do a complete refresh. This can mean a lot of data - not normally too bad in the comms time, but it can take a long time in the processing phase. If you have the snapshot at department level, then the first sync to create the database would indeed be 5/10 times longer (this would be a one-off hit), but you could then have all snapshots set to fast refresh, and only smaller updates would come through each time, for everyone, rather than complete refreshes all of the time.
    You will see that when you sync a new user, a directory with that user name gets created under the Oracle directory, but I have never seen anything in these. For our PDAs the database gets created in (by priority order) 1 - SD card, 2 - built-in storage, 3 - Oracle directory. A removable card is a little slower, but has the advantage of being able to increase the storage capacity if needed. The application we are just finishing off has 83 objects, with around 60,000 records in total and a database size of around 15 MB.
    concli has a table system.c$client_preferences, with (normally) two records in it (column dev_owner, values 0 and 1). 0 will have the user used when the database was created, and is the default for Device Manager and msync. 1 will normally be the same, but could be different if you have subsequently synced as a different user. NOTE: you can override the user in msync, but NOT in Device Manager. This may be your problem.
    Try (after making sure that Device Manager is not running) removing these records (at least the type 0 one), and syncing. This could save you having to remove the databases.

  • Is there a way to keep the search bar from unfolding downward?

    I just updated to FF 34 and see that when I start to type something into the search bar it now unfolds downward, taking up about seven lines worth of space -- even though I have search suggestions disabled. The problem with this is that the unfolding covers up part of the urls, tab and top-of-webpage text that I sometimes reference (or even copy-n-paste) to guide my search-term selection. I usually use only my default search engine, so the disadvantages of this "obscuring" definitely outweigh any advantages from being able to more easily select an alternate search engine on-the-fly. In fact, one reason I used the search bar to begin with is to *avoid* the kind of unfolding behavior search engine pages often have. But it's actually more intrusive here, because it is covering up more pertinent text. Is there any way to keep this "unfolding" behavior from happening automatically when using the search bar in FF 34? Thanks

    You can revert to the older Search Bar scheme like this.
    Open up '''about:config''' ''(typed in the Location Bar)''
    Right-click and toggle this preference to '''''false'''''
    '''browser.search.showOneOffButtons'''
    Then restart Firefox.

  • Why I like Aperture

    I need to preface this by saying that no application is perfect for everyone. Different people have different workflows, different post-processing needs, and different priorities. I'm not saying Aperture is perfect for everyone. Nor should anyone else say Aperture is useless. It may be useless to them, but not to everyone.
    I shoot mostly fashion and advertising type work. I'm a pretty serious amateur, in that I have good gear, and I'm very serious about photography, but I have a day job doing something else (security architecture, which I also love). I shoot only RAW as it gives me way more latitude if I want to adjust the exposure after the fact to change or increase a look (i.e. I want to make things darker and moodier, or I want to blow things out a little). My post-processing requirements are usually the following (in order of frequency): Exposure, white point, saturation, sharpening, levels, blemish fixing. On very rare occasion I'll need to do something beyond that.
    My pre-Aperture workflow looked a lot like this:
    Copy files from CF card. Due to my camera putting them in different folders based on the sequence, I had to write an automator script to pull out just the image files from all the folders and put them in a new folder on my desktop. This works, but takes a little while, and is something I had to write myself.
    Create a folder for my project "Sarah-DarkWear hoodie".
    Create the following folders inside that: "raws", "all-jpeg", "best-psd", "best-jpeg". Move all the RAWs from my automator action's results folder into the raws folder.
    Open up Adobe CS2 Bridge. View the files. Try to pick the best ones. I can't emphasize enough how laborious and time consuming this task is. Out of 200 shots, about 20 are really good, and about 5 are worth using (in a portfolio or ad or whatever). Bridge has no way to compare two pictures other than switching back and forth between them. You also can't see the pictures at 100% so figuring out sharpness or focus is pretty impossible unless you open them up in Photoshop. Which requires a multi-dialog process and a conversion time.
    Once I get my 20 good ones, batch convert them all to PSDs using an action I wrote. This takes a while. The PSDs go into the "best-psds" folder. They each take up about 40-70 MB of space vs. 3-6 MB for each RAW file.
    Make the levels, saturation, sharpness adjustments as needed with each file. Using another action I wrote, batch convert the best PSDs to full rez jpegs with my copyright notice on them. As this action involves opening a 70 MB file, creating a new layer for my copyright, setting it up, converting to srgb, converting to 8bit, saving as jpeg, this takes a while. Several seconds each file on my dual 2.5 with 2.5 GB ram.
    Using another action I wrote, batch convert all the RAWs to small rez jpegs with my copyright notice on them. These are for the model if it's a tfcd shoot, or for my records, or whatever. This takes a good long while. Now my 1 GB of raws are about 2.3 GB of raws, jpegs, psds.
    Open up iView Media Pro and update its index so that all my new files are in it.
    Done.
    With Aperture, I put my card in the reader.
    Aperture pops up and asks if I'd like to import these images. I pick a destination, specify the metadata and keywords for this shoot, and it loads them all in.
    I turn on auto-stack. I make a few manual stacking adjustments. I start picking the best shots. Aperture has excellent compare modes, including 2-up, 3-up, more-up, full rez zoom, a loupe tool for instantly checking focus at full resolution, a 0-5 star rating system, a quick-select key for picking an image as five star, a quick-reject key for an image I know is junk. Within a stack I can promote, demote, and pick the stack "pick" very quickly and easily. I can do this with just the keyboard. I can easily compare any pictures next to each other. I can go full screen, which drops all the unneeded junk and keeps the various window and toolbar colors from interfering with my vision on my color-calibrated display. Picking the best shots is amazingly faster and less frustrating due to the features mentioned above.
    I can now make my adjustments (exposure, levels, brightness, saturation, shadows, highlights, spot and patch blemish fixing, red eye, etc.), and then can apply them to all the other similar-condition pictures. (In Photoshop/Bridge you can batch apply things like white point and exposure changes, but you can't do saturation, sharpness, etc.) My adjustments go into a 24 KB XML file, instead of a 70 MB PSD. Each adjustment can be turned on or off, removed, modified, etc. I can instantly create different versions of an image. I might want a crop to zoom in on the model's face, or I might want a black and white version, etc. The versions are just a tiny amount of data in the XML file. In Photoshop I'd need a new 70 MB PSD for each version I wanted.
    Once I'm all done getting the images rated and adjusted the way I want them, I can at any time use the export function to generate the jpegs. Since the copyright is a watermark layer and is rendered by Core Image in my video card, the export is about 10X faster than the Photoshop batch action processing. I haven't timed the two side by side, but I will. It's about 10X or so faster though. For me.
    Done.
    I just converted my 70 GB working library into Aperture over the weekend. I was able to duplicate my photoshop adjustments in Aperture and drop my psds. This took my 70 GB lib to 35.5 GB. That's about half the size.
    So for me, and my workflow, and my post-processing requirements, Aperture is faster, uses less hard drive space, is easier to use, and does a great job. It will pay for itself during the first shoot's sorting and post-processing.
    There have been reports of the RAW conversion not doing as good a job as Adobe's. It turns out many of those people bringing that up left the default sharpening turned on in Adobe. Since raw files, at least Canon raws, pretty much always need sharpening and a small saturation boost, comparing a converted raw to a converted raw with sharpening will clearly show that the one with sharpening looks better. So most of them aren't valid tests. There may be some real issues with Apple's image handling vs. Adobe's. Hopefully if there are, Apple will fix them. My personal experience is that the raw conversion looks pretty much the same to me as Adobe's non-sharpened conversion. I've found that Apple's noise reduction looks better than Adobe's or Fred Miranda's action. I've found that it takes me less time to get a look I like in Aperture than in Adobe. I've found that my workflow is vastly quicker. To me it is an amazing program that will only get better with each revision.
    Devon
    2X2.5 GHz w/2.5 GB + 2X2.3 GHz w/4.5 GB + 17"pb   Mac OS X (10.4.3)  

    I have used Aperture quite extensively over the weekend and I also see a lot of potential. I also see the typical amount of bugs for a first release of a software of this complexity and I also see a few software architecture problems.
    What did I do with Aperture so far?
    First I have imported a few 70MB Tiff scans. Probably less than hundred and played around with it. Rated them, defined some searches, added some keyword hierarchies, tried some image manipulations, created a light table, printed the light table to pdf, created a book and created some example web pages. I played also with the fullscreen mode.
    There were a few user interface glitches: The light table sometimes has problems with selections. Creating a query takes too long since it tries to update live. I can't seem to create a book with a light table visible at the same time. While entering a query I clicked on a triangle to open a folder - Aperture didn't like that. And some more.
    Lots of stuff worked fine. Some things (like the book designer) didn't have enough features. Some features I only understood after some time.
    Now I copied my Aperture library to an external firewire disk. The disk is a fast RAID 0 disk connected via firewire 800.
    Next I loaded my iPhoto library into Aperture. Something like 17000 photos. This took a few (five?) hours and went without any problem. I got a few hundred projects - I would like to join some of them. I created a query to get the iPhoto edited photos (1900 photos) and removed them. This took about ten minutes. Next I created some queries. No problems with that. Speed is okay on the Powermac.
    Next I did some film scans with Nikon Scan and imported them (hundred maybe) large TIFFs into Aperture in small batches.
    Then I imported probably a hundred RAW images from my Canon EOS 350d. I tried the raw import patch mentioned somewhere else to get Mac OS X 10.4.3 to recognize them. This worked fine. (I later tried another method which I cannot mention here, but that also worked fine.)
    So currently my Aperture library is about 55 GB.
    I never liked the rendering of iPhoto too much. Often I used GraphicConverter to view, scale, batch convert, ... GraphicConverter also has quite a good (IMHO) rendering of images. So I was a bit sceptical about Aperture's rendering, but actually I don't have a problem with its on-screen rendering. I kind of like it. I haven't tried to print yet, though.
    I also have Photoshop Elements, though I don't use it very much. GraphicConverter is used though. I also have the Canon tools which I also don't use much. I use the scan application sometimes. For the Filmscanner I use Nikon Scan which is okay. For my taste the Aperture application looks & feels better than those - though I'm not a big fan of an all-grey interface (which may have some advantages with being more neutral).
    So I had a few crashes (two maybe) and had to force Aperture to quit (three times maybe). But I didn't seem to have lost any data and Aperture started quickly again. Sometimes I restarted Aperture when it acted strangely (like didn't want to provide a working crop tool - maybe four times).
    So, would I buy it again? Yes, without a doubt. It's lots of fun... Can't wait to show my friends Aperture loaded with some of the scans I did over the weekend.
    Regards,
    Rainer Joswig
    PowerMac Dual G5, 2.5Ghz, 4GB RAM, 22" Cinema Display, Canon 350d + Canon s80 + Nikon Coolscan IV ED   Mac OS X (10.4.3)  
