Migrating across different $HOMEs

I just bought myself a new laptop. Migrating all my settings should be as simple as copy/paste. However, I have a few concerns about my particular setup.
Firstly, my old laptop is i686 and the new one is x86_64. This is probably of little concern as far as userspace configs go.
But the bigger problem is that my home directory on the old laptop is /home/andrew-arch and on the new one it is /home/andrew.
I did this on the old laptop deliberately because I wanted to be able to dual boot with Ubuntu without there being any strange conflicts due to different package versions getting confused over the config files. So I changed my $HOME to andrew-arch in /etc/passwd and created symlinks across to Ubuntu's /home/andrew/ for doc folders and a few configs that wouldn't break (e.g. .mozilla and .wine). It worked fine since I used the same UID.
The way I see things breaking is that random config files may have absolute paths in them and will do weird things if /home/andrew-arch suddenly doesn't exist any more.
Does anyone have any advice on how to go about this in the least hackish way? I want my new $HOME to be /home/andrew, because the bash prompt doesn't show ~ for $HOME if it's different from the username. The easiest fix would be to just symlink /home/andrew-arch to the real home directory, but that's still hackish.
What I would like most is to do a recursive plain-text find/replace on every instance of the old path. Would that be safe, though?

Don't worry about backups. That is what the old laptop is. I'll burn the lot to DVD before I wipe it to give to my mum.
I found the fastest way to do the actual transfer is to set up NFS on the old laptop (read only), mount it at /home/andrew-arch-copy on the new one, and then cp -fv /home/andrew-arch-copy/* /home/andrew/
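As an aside, cp with a * glob skips dotfiles, which is most of what a home directory migration cares about. A minimal alternative sketch, assuming the NFS mount is at /home/andrew-arch-copy and the target home is /home/andrew, that also copies hidden files and preserves permissions, timestamps and symlinks:

    # -a preserves permissions, times, symlinks and ownership; -H keeps hard links; -v is verbose
    # the trailing slash on the source means "copy the contents", not the directory itself
    rsync -aHv /home/andrew-arch-copy/ /home/andrew/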
Does anyone know what command would do the plain text find/replace? I'll just symlink a second home dir as a temporary measure until I find out how.
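For what it's worth, a sketch of the kind of recursive plain-text find/replace asked about above, assuming the old and new paths are exactly /home/andrew-arch and /home/andrew and that the new home is already populated. grep -I skips binary files, and running the grep on its own first shows which files would be touched:

    old=/home/andrew-arch
    new=/home/andrew
    # -r recurse, -I skip binaries, -l list matching files, -Z NUL-terminate names for odd filenames
    grep -rIlZ "$old" "$new" | xargs -0 sed -i "s|$old|$new|g"

Binary files that embed the old path are skipped by this, so the temporary symlink mentioned above may still be useful for a while.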

Similar Messages

  • LWAPP to CAPWAP across different software versions?

    Hi guys,
    I was wondering if someone could clarify what happens when you have controllers live on two different software versions..
    We have 4 controllers.. 2 on version 4.2, which APs currently register to with LWAPP and 2 new ones on version 7..
    On WCS, I have moved an AP currently registered on a version 4.2 controller and reassigned it to the newer controller ( ver 7 ).
    The AP then reboots and I can see in the event logs that it comes back up on LWAPP after resolving the entry CISCO-LWAPP-CONTROLLER.
    On WCS, the AP shows as registered back on the old Controller ( ver 4.2 ) but the Primary and Secondary Controllers are still pointing to newer ones ( ver7 )..
    According to the Cisco document, the transfer from LWAPP to CAPWAP should be seamless, so is this a problem when migrating across different software versions? Will the AP sort itself out eventually?
    If I console into the AP and statically assign the new controller IP address, this forces the AP to download the newer software version ( after multiple auto reboots ) and it comes up on CAPWAP.. This shows that the path between the AP and controller is ok..
    Ideally, I'm hoping that I can simply reassign our current APs to the new controller ( and onto CAPWAP ) rather than having to console into all our remote APs and statically reconfigure them. Not fun!!
    Any advice would be appreciated..
    Thanks.
    Jon.

    Hi Nicolas,
    Thanks for the reply..
    Yes, I agree that once on version 7 it will perform a lookup for "cisco-capwap-controller". I forgot to say in my original post that I had already checked this and it does resolve correctly to the two newer controller IPs. If I console into the AP and manually define the newer controller IP address, it all goes through successfully and I can see the discovery and correct registration process in the logs.
    So we know that it kind of works, but I'm hoping that there is a simpler solution from within WCS where I can do all this remotely, rather than consoling into every AP locally.
    Your suggestion of inputting the new controller IPs in WCS instead of the names sounds good; however, it needs to be on version 7 first to be able to do this? As you say, there is a field for this in version 4.2... Looks like there is a bit of chicken and egg going on here!!
    Any other ideas or thoughts would be appreciated.
    Many thanks.
    Jon.

  • Migrating existing portable homes to new server

    aside from moving the homedir data from the old server to the new, there seem to be at least a few issues with migrating existing portable home accounts to a new server:
    1. some of the users' account details, like GeneratedUID, authentication authority, kerberos principals, OriginalNFSHomeDirectory, are different, while others (name, shortname, UID, GID, etc.) remain the same.
    2. the home directory attributes (OriginalNFSHomeDirectory, etc.) point to the old server.
    3. there's data on local machines that we don't sync back to the server, so we can't just blow away the existing local accounts and start fresh.
    the quickest way to migrate these users to the new server (with all the same shortnames and UIDs, etc.) seems to be to remove the local cached accounts (leaving the home folders) and have them recreate new PHDs on login, syncing things back down to the original home folder. i'm guessing this won't involve much syncing; it's all the same data, essentially.
    the other way i can see resolving this is to replace the account attributes for each client to match what they should be when pointed to the new server. this would involve scripting the process for reliability and not moving any data or deleting accounts, but it will take more testing on my part.
    what do you think? can you think of better ways to accomplish this task?
    summary: what's the best way to move existing portable home accounts bound to "Server A" to "Server B," while maintaining data and portable homes pointed to the new server and storage?
    thanks.

    that createmobileaccount syntax was wrong. i guess you don't need the -t option and can instead specify the whole path to the user's home. it seems to work well enough, creating a portable home with no default sync settings -- basically manual. for my needs that's fine. the sync settings are managed via mcx anyway.
    here's an updated version of the standalone script. i realized just now the script assumes the diradmin usernames and passwords are the same between servers. if that's not the case, you can hard code it or add a couple of variables. since they're just taken in order as arguments, add them in order. i should also add a function to interactively ask for the passwords with stty -echo, to avoid having the passwords logged in command history, or allow the script to curl the password from another file on a web server or something. for now, this seems to work for my purposes. edit as you see fit.
    #!/bin/bash
    # nate@tsp, 3/4/10: initial version
    # 3/5/10: added prettier heredoc usage statement, variables, further tested
    # todo: add function to add user to local admin group, as needed. this shouldn't be required in most environments.
    # todo: convert some of these one-liners to functions for better modular use; make it "smarter"
    # todo: convert the whole thing to ruby for practice
    # automates the process of unbinding from the old OD server, binding to the new,
    # removing the existing local user, adding it back, and other bits
    # there are no "smarts" in this script, so use carefully

    # variables
    diradminpass=$1
    account=$2
    password=$3
    oldserver=$4
    newserver=$5
    mkdadmin=$6 # not used in this version

    # if fewer than 5 parameters are passed, display usage and exit
    if [ -z "$5" ]; then
        cat <<endofnote
    usage: you must include at least 5 arguments (6th isn't used right now)
    run with `basename $0`
    1. [directory admin password]
    2. [shortname of account to change]
    3. [account password, which should be the default 'xxxxxxxx' on the new server]
    4. [name of old server]
    5. [name of new server]
    6. [yes or no to make this account a local admin - optional and not used now]
    ex: `basename $0` diradminpass jbrown password oldserver newserver yes
    endofnote
        exit 1
    fi

    # if you're running this as root or with sudo, proceed; otherwise, quit it!
    if [ "$(whoami)" = "root" ]; then
        echo "you're root. let's proceed..."

        # delete the user in question from the local directory store
        echo "deleting local account: $account"
        dscl . -delete /Users/"$account"

        # remove the old od config
        echo "removing the old OD bind..."
        dsconfigldap -v -r "$oldserver" -c "$HOSTNAME" -u diradmin -p "$diradminpass"

        # remove the old server from the search and contacts paths
        echo "removing the old search paths..."
        dscl /Search -delete / CSPSearchPath /LDAPv3/"$oldserver"
        dscl /Search/Contacts -delete / CSPSearchPath /LDAPv3/"$oldserver"

        # add the new one
        echo "adding the new OD bind..."
        dsconfigldap -v -f -a "$newserver" -n "$newserver" -c "$HOSTNAME" -u diradmin -p "$diradminpass"

        # create and add the new ldap node to the search policy
        echo "adding the new search paths..."
        dscl -q localhost -create /Search SearchPolicy dsAttrTypeStandard:CSPSearchPath
        dscl -q localhost -merge /Search CSPSearchPath /LDAPv3/"$newserver"

        # create and add the new ldap node for contacts lookups
        dscl -q localhost -create /Contact SearchPolicy dsAttrTypeStandard:CSPSearchPath
        dscl -q localhost -merge /Contact CSPSearchPath /LDAPv3/"$newserver"

        # give directoryservice a kick to point it to the new server
        echo "killing directoryservice and waiting for 20 seconds..."
        killall DirectoryService
        # rest a bit to ensure everything has settled down
        sleep 20

        # optional: look up the account you deleted in the first step to ensure it exists in the new directory
        # (odtestor is a test account name; substitute one of your own)
        echo "this id lookup should return details because it exists in the new OD:"
        id odtestor
        echo "this id lookup should fail because it doesn't exist in the old OD:"
        id odtestor

        # check the search path to ensure it looks like you need
        echo "verify the new OD server is in the search path:"
        dscl /Search -read / CSPSearchPath

        # optional: create a mobile account on the local machine with various options set
        echo "creating a portable home for the user..."
        /System/Library/CoreServices/ManagedClient.app/Contents/Resources/createmobileaccount -n "$account" -v -p "$password" -h /Users/"$account" -S -u afp://"$newserver"/homes/"$account"
        killall DirectoryService

        cat <<endofnote
    you should be ready to login with this account now.
    if you have trouble, revert the process by re-running with the old and new server names
    (and diradmin passwords, if they're different) reversed.
    endofnote
    else
        echo "you're not root or an admin. please re-run the script as an admin or via sudo."
        exit 1
    fi
    exit 0
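    for reference, a hypothetical invocation of the script above (assuming it's saved as migrate-od.sh and made executable; the server names, account and passwords are placeholders, much like the usage heredoc shows):

        # run as root via sudo; arguments in order: diradmin password, shortname, account password, old server, new server, make-admin flag
        sudo ./migrate-od.sh 'd1radminpa55' jbrown xxxxxxxx oldserver.example.com newserver.example.com yes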

  • Can home sharing be used from two different homes?

    I was wondering, if you connect Home Sharing from two different homes, is it possible to share the music?

    The primary constraint is that Home Sharing requires the same wifi network for sharing/streaming between computers/devices. So if the wifi network stretches across multiple houses, it should be possible. But if the houses are on different wifi networks, I'm afraid you're out of luck.

  • On My Mac folders synchronizing problems across different Macs

    I am getting frustrated dealing with synchronizing "On My Mac" folders across different Macs (I have 3: a laptop, a tower at home and one at the office).
    Let's just talk about my tower and laptop. When I need to get going, I always sync files so my laptop will have the same info as my tower. Before I sync tower/laptop, on my tower I transfer mail from the mail server to my local hard drive into "On My Mac" folders. I then run a sync program to sync the "On My Mac" folders between tower & laptop. Unfortunately, when new mail files are loaded onto the laptop, the mail program DOES NOT SEEM to automatically see the new mail "1234.emlx" file. The solution is to run the "Rebuild" function. When that happens, sure, all the new mail loaded on my laptop becomes visible. But the repercussion is that Mail reconfigures and renumbers the original "1234.emlx" file to, say, "9876.emlx".
    Well, guess what: when I have to sync my laptop with my tower again, the files on the tower now have to erase the "1234.emlx" files and load the "9876.emlx" files, and the "Rebuild" button has to be engaged again, which then creates yet another file sequence, "5432.emlx". Good gracious!!!!
    And that's just the sync between my tower and laptop. What about my third machine, the one at the office?? It gets confusing, and I end up losing mail. It's absurd.
    Engineers at Apple, do the steps I describe above sound Apple-esque?? I hope you find a simple way to sync "On My Mac" folders, and might I hope you would even find a way to sync POP mail??!! And what's the point of calling it full sync if .mac cannot give me the option of syncing "On My Mac" mail??
    Thanks
    Power PC Tower   Mac OS X (10.4.7)

    Synchronization of local mail data between computers by means of a file synchronization utility is a really bad idea if more than one computer is allowed to access mail between synchronizations.
    The most prominent issue is that Mail keeps a reference to every message within the ~/Library/Mail/ folder in a global Envelope Index file. If this file is modified in more than one computer between synchronizations, there is no way a file synchronization utility can handle the situation properly.
    Another, more subtle, and potentially more dangerous issue, is that Mail may use different *.emlx sequence numbers to name the same message in different computers or, worse yet, the same sequence number to name different messages in different computers. Again, the only thing a file synchronization utility can do about it is either overwrite files with the same name (thus potentially losing data) or not synchronize them at all.
    Mail data "synchronization" at the filesystem level can only be done reliably if it's a one-way operation, i.e. if the entire contents of the Mail folder in one of the computers are overwritten by the entire contents of the other, and even then it may not work properly because of the issues described here:
    Mac Backup Software Harmful
    This is not an Apple thing, not even a Mail thing. You would encounter similar issues with any other mail client. The only reliable way to achieve mail synchronization between computers is using an IMAP account and storing mail on the server -- that's precisely what IMAP is for.
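    If someone is nevertheless determined to do the one-way overwrite described above, a minimal sketch of what it could look like, assuming the tower holds the authoritative copy, the laptop is reachable over SSH as laptop.local (a placeholder), and Mail is quit on both machines - and bearing in mind the caveat above that even this may not work properly:

        # mirror the tower's entire Mail folder onto the laptop; --delete makes it an
        # exact copy, so anything that exists only on the laptop is lost
        rsync -a --delete ~/Library/Mail/ username@laptop.local:Library/Mail/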

  • Splitting Music and Podcasts Across Different Discs

    Is it possible to split Music and Podcasts in the same library across different discs? The reason is that my music is too big to fit on my laptop drive, so I have an external disk drive that is always plugged in when I'm at home. However, I want to take podcasts with me when I travel, so I want them to be put onto my laptop hard drive. I know I could probably achieve this by creating 2 libraries (although I haven't tried yet) but would prefer to see if I can do it with one.

    Put its internal drive into an enclosure and copy the files off.

  • Anyone know of a good iOS app that can play all media files across a home network, file share on a NAS or Windows PC?

    Anyone know of a good iOS app that can play all media files (MKV, AVI, Xvid with subtitles) across a home network, from a file share on a NAS or Windows PC? I know I have BS Player Free for Android, which works pretty well on my Nexus tablet. Is there a good iOS one that anyone can think of that can browse the network?
    Or do I have to run transcoders?
    I store some files on the Windows PC and some files on a Linux NAS that the Windows PC maps to.

    I'm sorry to hear that.
    I'm not affiliated w/ the developer, just a happy user that gave up fighting the Apple podcast app a while ago.  I used to have a bunch of smart playlists in iTunes for my podcasts, and came home every day and pathologically synced my phone as soon as I walked in the door, and again before I walked out the door in the morning.
    Since my wife was doing this too, we were fighting over whose turn it was to sync their phone.
    Since I've switched to Downcast, I no longer worry about syncing my phone to itunes at all.  I can go weeks between syncs.
    Setup a "playlist" in downcast (ex., "Commute") and add podcasts to that playlist.  Add another playlist ("walk" or "workout") and add different podcasts to that one. 
    Set podcast priorities on a per-feed basis (ex., high priority for some daily news feeds, medium priority for some favorite podcasts, lower priority for other stuff).  Downcast will play the things in the priority you specify, and within that priority, it will play in date order (oldest to newest).
    Allegedly, it will also sync your play status to other devices, although that is not a feature I currently use and can't vouch for.  It uses apple's iCloud APIs, so to some extent may be limited by what Apple's APIs can do.

  • How can I install a second app icon of the same app on a different home screen?

    How can I install a second app icon of the same app on a different home screen?
    For example, I have the photo app on one home screen and I want the same photo app showing on the other home screens.
    I want to have a duplicate app icon on more than one home screen.
    Another example: if I have the Evernote app showing on my first home screen, I'd want it on my last home screen as well.

    Put it in the dock at the bottom. Apps put there stay there and show at the bottom of all home screens.

  • Is there a way to merge/migrate my local home folder to a network account

    My family has a number of MacBooks and a couple of iMacs, and we've been thinking we'd like centralized storage for our media collection and other files, and I'd like an easier way to deal with these machines to keep them updated, etc.  Also, we swap laptops and desktops depending on who needs to do what at a particular moment.  Is there a way to migrate an existing home folder on a MacBook to an account on the server?  What I would like to be able to do is log into any computer in my home and have it look like "my" computer, with files, settings, etc.  Since I am new to the server world, I am confused by the terminology re: network accounts and mobile accounts.  Is there a good guide someone could recommend to get me started?  Thanks.

    Hi Yodalogger,
    I hope you have yourself sorted.  I've been through a lot of pain with Lion Server; it's very buggy at best.
    Your best bet is SolidWood's suggestion of network accounts if you are constantly on the same network.  I use this at home and it works very well.  For simplicity, you can use WorkGroup Manager for this as it's more intuitive!
    If, you need a mobile account, this is what I did.
    I migrated the local macbook accounts to the server machine (Migration Assistant).
    I renamed the /Users home folders on the macbook to _backup, for safety.
    I deleted the local accounts from the macbook.  Keep your _backup home folders!  Also, you will need to have a local Admin account in place.  Make sure you do not delete it.  (A command-line sketch of these two prep steps follows after this post.)
    On Server.  You will have local accounts created for all your migrated macbook accounts.  Just remove the accounts in System Preferences but ensure you leave the /Users home folders in place when prompted.
    On server. I created the new users (the old macbook accounts) and groups in the Server app.  This doesn't create or overwrite your existing home folders.  So go ahead and name them exactly the same as before and make sure the accounts match your home folders' names.
    On server, using profile manager, I set up mobility etc., for the device.  That is, you need to enroll your macbook with the server and configure services for it in profile manager.  You can add a placeholder for this in profile manager to configure stuff.
    A handy tip to alleviate all the automatic push settings pain and heartache is to set the general payload to manual.  You can then whip up Profile Manager from your macbook to install the profiles manually (easily done).
    On Macbook, login with local admin account.
    On Macbook, go to system prefs and accounts, set up your open directory stuff in the login options.
    On Macbook, log out of admin.  Back at the login screen, you should see your admin account and 'Other.'  Give it a few minutes or so to figure this out.  It needs to contact the server etc. for info.
    Once you have 'Other' click on it and login with one of your new network accounts.  This will log you in as a network account - you should see all your usual settings that previously existed on your macbook when it was a local account.
    At this point, you whip up profile manager.  http://yourserver.local/profilemanager  Change yourserver to the name of your server.
    Login to profile manager with your admin account.  I do this as I will be downloading a few profiles that only admin has access to.
    So, you need to download a trust profile, your device profile, and a profile for remote management if you have set this up.  You may have seen various download buttons knocking around the interface.  In Downloads, double-click these to install (if it doesn't do this automatically).
    Log out of everything.
    Log back in with one of your network accounts.  This time you should be prompted to create a mobile account.  Say yes and let it sync your home folders from server to macbook.
    Once each mobile account is created, you can then further define user/group settings in profile manager.  You download these by logging into http://yourserver.local/mydevices as the user and download the appropriate settings.
    I think that's it.  Sorry if it's not detailed enough - I'm presuming you know your way around a Mac!  I have to say the process is straightforward but Lion Server is not.  I do not get consistent results with it and I'm still trying to tame it...
    By far the easiest option is network accounts.  Mobile accounts need more attention.
    I hope this helps (and anybody else!)
    Paul.
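    A minimal command-line sketch of the home folder backup and local account removal steps mentioned in the post above, assuming the account shortname is jbrown (a placeholder) and you are logged in as the separate local admin. This is only the client-side prep, not the Profile Manager or Open Directory parts:

        # keep the home folder but move it out of the way, then remove the local account record
        sudo mv /Users/jbrown /Users/jbrown_backup
        sudo dscl . -delete /Users/jbrown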

  • How do I fix colour picker to work across different colour-managed monitors?

    Hey everyone!
    I'm assuming this problem I'm having stems from having colour-calibrated monitors, but let me know if I'm wrong!
    To preface, this is the setup I have:
    Windows 7
    3 monitors as follows, all have individual colour profiles calibrated using the Spyder 3
    Cintiq 12WX
    Dell U2410
    Dell 2409WFP
    Photoshop CS6 - Proofed with Monitor RGB, and tested with colour-managed and non-colour-managed documents
    I usually do most of my work on the Cintiq 12WX, but pull the photoshop window to my main monitor to do large previews and some corrections. I noticed that the colour picker wouldn't pick colours consistently depending on the monitor the Photoshop window is on.
    Here are some video examples:
    This is how the colour picker works on my Dell U2410: http://screencast.com/t/lVevxk5Ihk
    This is how it works on my Cintiq 12WX: http://screencast.com/t/tdREx4Xyhw9
    Main Question
    I know the Cintiq's video capture makes the picture look more saturated than the Dell's, but it actually looks fine physically, which is okay. But notice how the Cintiq's colour picker doesn't pick a matching colour. It was actually happening the opposite way for a while (Dell was off, Cintiq was fine), but it magically swapped while I was trying to figure out what was going on. Anyone know what's going on, and how I might fix it?
    Thanks for *any* help!
    Semi-related Question regarding Colour Management
    Colour management has always been the elephant in the room for me, ever since I first tried to calibrate my monitors with a Spyder colourimeter years ago. My monitors looked great, but Photoshop's colours became unpredictable and I decided to abandon the idea of calibrating my monitors for years, until recently. I decided to give it another chance and follow some tutorials and articles in an attempt to keep my colours consistent across Photoshop and web browsers, at least. I've been proofing against monitor colour and exporting for web without an attached profile to keep pictures looking good in web browsers. However, pictures exported this way look horrible when uploaded to Facebook. Uploading pictures with an attached colour profile makes them look good on Facebook. This has forced me to export 2 versions of a picture, one with an attached colour profile and one without, each time I want to share it across different platforms. Is there no way to fix this issue?
    Pictures viewed in Windows Photo Viewer are also off-colour, but I think that's because it's not colour managed... but that's a lesser concern.

    I think I've figured out the colour management stuff in the secondary question, but the weird eyedropper issue is still happening. Could just be a quirk from working on things across multiple monitors, but I'm hoping someone might know if this is a bug/artifact.
    Going to lay out what I inferred from my experiments regarding colour management in case other noobs like me run into the same frustrations as I did. Feel free to correct me if I'm wrong - the following are all based on observation.
    General Explanation
    A major source of my problems stems from my erroneous assumption that all browsers will use sRGB when rendering images. Apparently, most popular browsers today are colour-managed, and will use an image's embedded colour profile if it exists, and the monitor's colour profile if it doesn't. This was all well and good before I calibrated my monitors, because the profile attached to them by default was either sRGB or a monitor default close to it. While you can never guarantee consistency on other people's monitors, you can catch most cases by embedding a colour profile - even if it is sRGB. This forces colour-managed browsers to use sRGB to render your image, while non-colour-managed browsers will simply default to sRGB. sRGB seems to be the profile used by Windows Photo Viewer too, so images saved in other, wider-gamut colour spaces will look relatively drab when viewed in WPV versus a colour-managed browser.
    Another key to figuring all this out was understanding how profile assignment and conversion work, and the somewhat-related soft-proofing feature. Under Edit, you are given the option to either assign a colour profile to the image or convert the image to another colour profile. Converting an image to a colour profile will replace the colour profile and perform colour compensations so that the image will look as physically close to the original as possible. Assigning a profile only replaces the colour profile but performs no compensations. The latter is simulated when soft-proofing (View > Proof Colors, or ctrl/cmd-Y). I had followed bad advice and made the mistake of setting up my proofing to Monitor Color because this made images edited in Photoshop look identical to the same image viewed in the browser, which was rendering my images with the monitor's colour profile - which in turn stemmed from yet more bad advice I'd been given against embedding profiles. This should formally answer Lundberg's bewilderment over my mention of soft-proofing against Monitor Colour.
    Conclusion and Typical Workflow (aka TL;DR)
    To begin, these are the settings I use:
    Color Settings: I leave it at the default of North American General Purpose 2, but will probably switch from sRGB to Adobe RGB or ProPhoto RGB so I can play in a wider gamut.
    Proof Setup: I don't really care about this anymore because I do not soft-proof (ctrl/cmd-Y) in this new workflow.
    Let's assume that I have a bunch of photographs I want to post online. RAWs usually come down in the AdobeRGB colour space - a nice, wide gamut that I'll keep while editing. Once I've made my edits, I save the source PSD to prep for export for web.
    To export to web, I first Convert to the sRGB profile by going to Edit > Convert to Profile. I select sRGB as the destination space, and change the Intent to either Perceptual or Relative Colorimetric, depending on what looks best to me. This will convert the image to the sRGB colour space while trying to keep the colours as close to the original as possible, although some shift may occur to compensate for the narrower gamut. Next, go to Save for Web. The settings you'll use:
    Embed Color Profile CHECKED
    Convert to sRGB UNCHECKED (really doesn't matter since you're already in the sRGB colour space)
    and Preview set to Internet Standard RGB (this is of no consequence - but it will give a preview of what the image will look like in the sRGB space)
    That's it! While there might be a slight shift in colour when you converted from AdobeRGB to sRGB, everything from then on should stay consistent from Photoshop to the browser.
    Edit: Of course, if you'd like people to view your photos in glorious wide gamut in their colour-managed browsers, you can skip the conversion to sRGB and keep them in AdobeRGB. When Saving for Web, simply remember to Embed the Color Profile, DO NOT convert to sRGB, and set Preview to "Use Document Profile" to see what the image will look like when drawn with the embedded color profile.
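    As a quick sanity check on the "Embed Color Profile" step above, a one-liner like the following (assuming exiftool is installed; the filename is a placeholder) shows whether an exported file actually carries an embedded ICC profile - no output means no profile was embedded:

        # print all ICC profile tags from the exported image
        exiftool -ICC_Profile:all exported-photo.jpg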

  • Material for a single project spread across different locations

    Dear Friends
    my client is executing turnkey projects. A single project may be spread across different geographical locations. Say the project name is "PRJ001".
    PRJ001 will be executed in Bombay, Hyderabad, and Chennai.
    There are 2 scenarios for procuring material:
    1. Since these places are quite far apart, I might procure material from a vendor near each location.
    2. I might give a PO to one single vendor to dispatch material to these different locations for project execution.
    In both cases, how should the material be handled?
    What would be the best option? Should I create a storage location (my client stores material at site, as these projects run for years)?
    I'm procuring material as Project Stock (Q).
    Say the Bombay location needs 500 units of material, Hyderabad needs 700 units, and Chennai needs 300 units.
    Now how do I ensure that the right material in the right quantity is reaching the respective project site?
    In some cases the project runs in a remote location where there won't be any connectivity or access to the system. In such cases, if the site engineer enters GR/IR and activity confirmations in an Excel sheet and later e-mails that sheet to the office, how can we upload it to the system so that it updates the required fields?
    Please give your suggestions.
    I appreciate your support/suggestions.
    Thanks

    Hi Amaresh,
    I think your option no. 1 holds good for your requirement. You can define the corresponding project sites in Chennai, Mumbai and the other places as storage locations. Better to define separate storage locations for the different sites.
    I think having a delivery schedule with the specific requirement quantities and the storage location should resolve your issue of handling different quantities of the same material. You can discuss and sort this out with your MM consultant.
    Hope this gives some idea.
    Regards,
    L.Balachandar

  • How to maintain the same width of columns across different table sections in an OBIEE report

    I have a prompt and a report (analysis) on the dashboard. In the report, there are table sections.
    The problem is that when I run the report, the column widths vary across the different sections of the table. The report also shows totals.
    The report needs to fit so that there is no horizontal scroll bar in the report.
    I have already set the same width in the additional properties of the column values, header and folder, and in the total's properties, both on the Criteria and Results tabs.
    I am new to OBIEE, so I don't know how to fix it. Any help would be appreciated.
    thanks in advance.

    You might want to post to the OBIEE discussion area
    Business Intelligence Suite Enterprise Edition

  • Is it possible to cluster appliances across different subnets?

    We are attempting to cluster two appliances across different subnets in order to provide greater survivability. Although we were able to cluster the appliances, the manageability of the appliances has become somewhat impaired. We've opened ports 443, 22 and 2222 between the two appliances. The appliances are C350s running AsyncOS 7.1.3-010. Are we missing something?
    Thanks,
    Rob

    Rob,
    Are these appliances communicating using IP addresses? If so, in order to join a cluster using IP addresses, there must be a reverse DNS (PTR) record configured in the DNS server for each Cisco IronPort appliance. Please check whether the reverse lookup works. If not, it might be another issue.
    Regards,
    Jyothi Gandla
    Customer Support Engineer
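    For reference, a quick way to check the reverse (PTR) and forward lookups from a shell before retrying the join; the addresses and hostnames below are placeholders for the two C350s:

        # reverse (PTR) lookups for each appliance IP
        dig -x 192.0.2.10 +short
        dig -x 192.0.2.20 +short
        # forward lookups should return the same addresses used in the cluster config
        dig +short appliance1.example.com
        dig +short appliance2.example.com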

  • The screen where my apps show on all the pages is not letting me drag them. The screen looks kind of gray. What do I do to be able to move them to different home screens, and to check and uncheck the apps in the list?

    The screen where my apps show on all the pages is not letting me drag them. The screen looks kind of gray. What do I do to be able to move them to different home screens, and to check and uncheck the apps in the list?

    The specific demo would need to provide code for touch events.
    http://www.w3.org/TR/touch-events/

  • Retrofit in SAP Solution Manager 7.1 Supported Across Different OS's?

    Is Solution Manager 7.1 Retrofit Capability supported with a different enhancement pack level and across different Operating Systems (HP-UX & AIX)?
    The specific scenario would be the setup of a dual landscape where the traditional Dev -> QAS -> PRD track is on HP-UX 11.31. We would then create a parallel 'Upgrade Project' landscape (uDEV -> uQA) on IBM AIX 7.1, perform an enhancement pack upgrade on uDEV and uQA, and use the Solution Manager retrofit capability to synchronize transports between the two landscapes.
    Thanks in Advance!
    -Patrick

    Hi, we are configuring Google SMTP and are getting the error below:
    No delivery to xxx.com, authentication required
    Message no. XS856
    Diagnosis
    The message was processed successfully in the SAP system. The mail server that is to receive the message for further processing requires authentication. Probably there is no logon data specified in the SAPconnect configuration.
    Information from external system (if available)
    smtp.gmail.com:587
    530 5.7.0 Must issue a STARTTLS command first. i91sm11178241qgd.25 - gsmtp
    Procedure
    Enter the logon data in the SAPconnect node.
    We are using the Gmail SMTP server "smtp.gmail.com" with port 587.
    Please advise.
    Regards,
    Sudarshan
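    As an aside, the error above means the mail server insists on STARTTLS before accepting mail, so the SAPconnect node needs TLS and the logon data enabled. A quick check from the host that smtp.gmail.com:587 really negotiates STARTTLS (assuming openssl is available on the server) would be:

        # open an SMTP session, issue STARTTLS and negotiate TLS against Gmail's submission port
        openssl s_client -starttls smtp -connect smtp.gmail.com:587 -crlf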
