Splitting a Site Collection and Changing the Managed Paths

Hi there,
I was hoping for some advice.  We are planning to upgrade our SP 2010 farm to SP 2013.  We have one Site Collection whose content DB is nearing 180GB and growing, so we are going to "split" it up into four.
So, the logical thing would be to export sites/subsites and import them into their own site collections.  What I have found, though, is that when doing this the workflow history is not kept - we need the workflow history, so this is not an option.
My other thought was to restore the DB, move the contents to a new Content DB, delete what I don't want from it, and do that four times.  That could work, but then I get the problem of the URL: I'm assuming that by restoring this same DB four times each copy will be using the same URL?
So, if anybody knows of a tool out there that can do an export/import that keeps the workflow history, that would be perfect (it would have to be free...) - or, how can I restore the DBs to new URLs/Managed Paths?  Or maybe you have a better solution?  Any help would be appreciated.

Rebecca, thanks for the suggestion - I just think that could get a bit messy for my users.
Alex, the Workflow Cleanup job is disabled as we need to keep a record of who approved the task items.  I cannot lose the history.
What I have done, and have only just started testing, was to back up and restore the "root site" in PowerShell and delete the three subsites.  Then, do the same again but this time restore to a new URL:
http://blogs.msdn.com/b/erica/archive/2013/11/26/customer-question-renaming-site-collections-in-sharepoint-2013.aspx 
So, I've done the following:
Backup-SPSite http://domain/sites/HR -Path E:\Backup\HRBackup.bak -UseSqlSnapshot
Restore-SPSite http://sp.contoso.net/sites/HR/Subsite1 -Path E:\Backup\HRBackup.bak -ContentDatabase SPS_HRSite1
Restore-SPSite http://sp.contoso.net/sites/HR/Subsite2 -Path E:\Backup\HRBackup.bak -ContentDatabase SPS_HRSite2
Restore-SPSite http://sp.contoso.net/sites/HR/Subsite3 -Path E:\Backup\HRBackup.bak -ContentDatabase SPS_HRSite3
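After each restore, every copy still contains all the subsites, so the final step is to trim each copy down to the content it should keep. A minimal sketch of that clean-up (the subsite names below are hypothetical):
# Hypothetical: in the copy restored to /sites/HR/Subsite1, delete the subsites that copy should not keep
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Remove-SPWeb http://sp.contoso.net/sites/HR/Subsite1/SubsiteB -Confirm:$false
Remove-SPWeb http://sp.contoso.net/sites/HR/Subsite1/SubsiteC -Confirm:$false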
Obviously space could be an issue.  Not a perfect solution but workflow history is kept and I've been able to split my Site Collection. My next challenge is moving the subsite into the root of that new site collection.  I added a comment here if
anybody has any answers -
https://social.technet.microsoft.com/forums/sharepoint/en-US/1dbd3d7c-ccd3-42e2-ba48-ad5731f38a58/change-subsite-to-toplevel-site-without-changing-the-link?prof=required

Similar Messages

  • Can I change the file path that iTunes reads my music files from? 65000 files trying to be transferred from external to internal HD. I don't want iTunes to manage my collection, just read it. Thanks

    Can I change the file path that itunes reads my music files from?
    I have about 65000 music files that I have transferred from my external to a new laptop internal HD.
    I have organised the folders myself as I use a PC and do not want iTunes to organise the folders, because when I'm searching through music on Windows it is sorted by artists/albums with 'the...' (eg The Beatles) under 'T' instead of 'B'.
    Long story short.... Is there any way that I can keep my iTunes library as it is on the external HD and copy it to the new laptop's C drive, keep all the info (playlists etc) and still have the same folder structure as on the external HD?
    Or, is there any way of making windows sort things in alphabetical order like it is in itunes (eliminating the 'the' issue)?

    The files that weren't inside the media folder on the original machine need to be copied over to exactly the same paths as they had on the source machine. See this thread for an ongoing discussion of a similar problem. See also this post on migrating the iTunes library.
    tt2

  • Granting permissions to manage my sites site collections and user profiles

    We currently have no governance in place to deal with user profiles and My Site site collections.
    So the farm has quite a large number of both profiles and site collections which are orphaned. We do not currently run the timer job that deletes My Site site collections when profiles are removed.
    I have a few questions about handling this kind of thing.
    1. We have administrative staff who handle the "off boarding" process. They send out mail to people asking whether project folders continue to be needed, etc. and give project members a month to make copies of the important info. Right now, there
    are a few of these staff who have site collection admin rights on normal projects.
    Is there a way that a powershell script could be constructed to give them the ability to delete obsolete user profiles and site collections without giving them full farm admin rights?
    2. Is there a way to set up an AD or SharePoint group and then use that group as a secondary site collection admin on the my site site collections?
    3. With appropriate permissions, would they be able to grant someone else permission to look at data in the my site collection to determine whether any of it needed to be kept?
    I have been requested, as a SharePoint admin, to work with the off-boarding process staff so that they include the SharePoint My Site data in the data that they recommend people check before deleting it.
    I would like to automate as much of the dealings they have to have with the system as possible, just to reduce unrelated accidental actions.

    1. We have administrative staff who handle the "off boarding" process. They send out mail to people asking whether project folders continue to be needed, etc. and give project members a month to make copies of the important info. Right now, there
    are a few of these staff who have site collection admin rights on normal projects.
    Is there a way that a powershell script could be constructed to give them the ability to delete obsolete user profiles and site collections without giving them full farm admin rights?
    Inder : 
    http://blogs.msdn.com/b/kaevans/archive/2012/06/25/top-recommendations-for-managing-the-my-site-cleanup-timer-job.aspx
    http://www.harbar.net/archive/2011/02/10/account-deletion-and-sharepoint-2010-user-profile-synchronization.aspx
    http://blogs.technet.com/b/seanearp/archive/2009/03/04/sharepoint-profile-cleanup.aspx
    2. Is there a way to set up an AD or SharePoint group and then use that group as a secondary site collection admin on the my site site collections?
    Inder: No, it has to be a user. But you can go to Site Actions > Site Settings > Site Collection Administrators and add that group there.
    3. With appropriate permissions, would they be able to grant someone else permission to look at data in the my site collection to determine whether any of it needed to be kept?
    Inder:  Yes
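    For question 1, a minimal PowerShell sketch of what such a delegated clean-up script might do (the input file, URLs and account names below are hypothetical, and the script would still need to run under an account with sufficient rights, e.g. as a scheduled task, so the staff themselves never get farm admin):
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    # Hypothetical list of site collection URLs the off-boarding staff have flagged as obsolete
    $obsolete = Get-Content "C:\Scripts\ObsoleteSiteCollections.txt"
    foreach ($url in $obsolete) {
        Backup-SPSite $url -Path ("E:\Backup\" + $url.Split("/")[-1] + ".bak")   # safety copy before deleting
        Remove-SPSite $url -Confirm:$false
    }
    # For question 2: the secondary admin has to be a single account rather than a group, e.g.
    # Set-SPSite http://mysites.contoso.net/personal/jdoe -SecondaryOwnerAlias "CONTOSO\svc-offboard"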
    If this helped you resolve your issue, please mark it Answered

  • Managed Metadata Refiner and Displaying the Full Path

    Does anyone know how best to work with the metadata refiner for a column that is set to show the full path of the tag?  We need to display the full path for this column when used in various views; however, when the full path is shown in the search refiner
    it is too long and runs out into the search results.  (see picture)
    Is there a way to only show the tag name and not the full path to the tag within the refiner but still have it show the full path within libraries?  Thank you

    Daniel,
    Thank you for this article.  It looks like it would help, but when I replace the xslt with what's suggested in the article my refinement panel disappears.  Perhaps I'm not implementing it correctly.  I have a couple of managed metadata refiners
    listed as the first two refiners in the panel in addition to the usual ones such as author, date, site.
    What I did is that I replaced the xslt inside the <a> tags with the xslt mentioned in the article:
    <a class="ms-searchref-filterlink" href="{$SecureUrl}" title="{$RefineByHeading}: {$UrlTooltip}">
    <xsl:value-of select="Value"/></a>
    with
    <a class="ms-searchref-filterlink" href="{$SecureUrl}" title="{$RefineByHeading}: {$UrlTooltip}">
    <xsl:variable name="PartOfValue">
    <xsl:call-template name="substring-after-last">
    <xsl:with-param name="string" select="Value" />
    <xsl:with-param name="delimiter" select="':'" />
    </xsl:call-template>
    </xsl:variable>
    <xsl:variable name="PartOfTooltip">
    <xsl:call-template name="substring-after-last">
    <xsl:with-param name="string" select="$UrlTooltip" />
    <xsl:with-param name="delimiter" select="':'" />
    </xsl:call-template>
    </xsl:variable>
    <xsl:choose>
    <xsl:when test="($FilterCategoryType = 'Microsoft.Office.Server.Search.WebControls.TaxonomyFilterGenerator') and ($PartOfValue != '')">
    <xsl:if test="not(contains($PartOfValue, '…'))">
    <xsl:value-of select="$PartOfValue"/>
    </xsl:if>
    <xsl:if test="contains($PartOfValue, '…')">
    <xsl:value-of select="$PartOfTooltip"/>
    </xsl:if>
    </xsl:when>
    <xsl:otherwise>
    <xsl:value-of select="Value"/>
    </xsl:otherwise>
    </xsl:choose>
    </a>
    Is this correct? Thanks for the help.
    Matthew

  • How to resize and change the resolution of a batch of photos using Automator

    I searched for a long time tonight looking for the answer to this (seemingly) simple question:
    How do I use Automator to scale and change the resolution of a batch of images?
    It was not so simple.
    Links to this question:
    https://discussions.apple.com/message/12341246#12341246
    https://discussions.apple.com/message/12342026#12342026
    https://discussions.apple.com/message/5785047#5785047
    https://discussions.apple.com/message/1173358#1173358
    https://discussions.apple.com/message/5641853#5641853
    https://discussions.apple.com/message/3207516#3207516
    These are just the links on this site - I found them all over the place at MacRumors, Apple Tips, Mac Help, etc.
    You can actually manage this in Automator.
    Here are the steps that worked for me:
    Create an Automator APPLICATION - not a workflow (this is due to the way that I'm batch converting images - workflows might be ok for some cases)
    Step 1 is Copy Finder Items
    My flow inserts an SD card, opens the DCIM folder that my Nikon creates, selecting the images that I click (command + click to multi-select) and once I have the photos highlighted, I drag them onto this Automator App we're creating.
    As a result - I want to copy the originals to my computer as step 1.  I don't touch the originals on the SD card (and cards are cheap so I tend to leave them on the cards as well)
    Step 2 is the Scale Images action - you can search the library for this and find it quickly.  For my part, I found that scaling images to about 38.8 percent of their size on the SD card is good for uploading to a blog.  Change this value to whatever you wish.
    Step 3 is Run Shell Script - and here is where we marry the brilliance found at this link with our script. If you have a hard time reading the text in the image, it is as follows:
    #!/bin/bash
    for f in "$@"
    do
        /usr/bin/sips -s dpiHeight 72.0 -s dpiWidth 72.0 "$f"
    done
    Save this application (I named mine "Format Photos")
    Place the application inside the target folder where you want the images to end up.  I do this because when I have the SD card window open, I can also open my "Photos" window and see my App sitting there.  I select my images as I mentioned and drag them on top of this app.  The app copies the originals and the conversions into the folder.
    NOTES: When you open a converted pic in Preview, you will see Resolution = 300 dpi if you go to Tools --> Adjust Size...  This reading is explained by another brilliant discussion as sips only touches the JFIF properties inside the file's MetaData.  However, if you look at the bottom of the Adjust Size... window, you'll see the image size is probably around 500 kb (give or take depending on the original).  My goal was to get the images down from the 3.0 MB I shoot at to around 500 kb.  Therefore even though the MetaData still thinks that it is 300 DPI according to Preview, it has been changed to 72 (open it in some other applications mentioned at the links and you'll find different readings - it all depends on what the application reads from the Meta).
    This does not rename the files, so you'll get DSC_1000.jpg and DSC_1000 copy.jpg in all likelihood.  If that annoys you, add a step into the Automator Application that renames the file after the "Run Shell Script" action has run, and you can have each file renamed according to some convention that you like.
    This took a heck of a lot longer than I expected - so I decided to put in the effort to share this with the community and save others the hassle. 

    PPI is pixels per inch of the image.  It is difficult to increase resolution as you are trying to add data that is not there.
    But for printing purposes what you want is dpi or dots per inch.
    The image processor, accessed either from Bridge (Tools > Photoshop) or from PS, is a good way to change a batch of images.

  • How can I change the folder path to my library

    I just changed the file path to my locally stored music from C:\MyStuff\...\iPod Nano Music\... to C:\OneDrive\...\iPod Nano Music\...
    Is there a way I can edit the path that Apple has locally stored to redirect iTunes to the new, correct location instead of (as I've had to do previously) deleting all the music files in iTunes and then adding each music folder back?
    In Advanced Preferences the 'iTunes media folder location' shows as C:\Users\Mark\Music\iTunes\iTunes Media with an edit button. However this is not the location of all my music files (as indicated above).
    Mark

    If you want to move your media to a new path what you should do is edit the media folder preference in iTunes and then consolidate to that new path. Moving the media first by hand and then trying to fix iTunes is generally the wrong approach.
    Your options are to move everything back where it came from, then consolidate to the new path, then delete the originals that iTunes leaves behind. Alternatively you should be able to use my FindTracks script to reconnect the relocated media to iTunes. See this post for an explanation of how it works.
    See also Make a split library portable for reasons why splitting the media folder to a new path may be undesirable.
    tt2

  • Changing the Management Pack extensions or items are stored in

    I think I know the answer to this but I will ask it anyway. Is it possible to change the Management Pack an extension or an object such as a Queue or Subscription is stored in?
    Reason I ask is that the person who was maintaining SM before me stored every single form extension, subscription, queue, view, etc. in the same Management Pack. I really don't like this particular setup and I am sure it's not best practice. I plan to break
    out all the above, and possibly more, into separate newly created Management Packs, like a separate one for Views, a separate one for Subscriptions, etc. Is the only solution to delete the existing item and create a new one with the same details, including
    the display name?

    You can, but not as easily as you would like. 
    Essentially, what you would need to do is split up the XML by hand, making sure to sort out the DisplayStrings sections to the correct components. Export the "main" MP, cut out the section of XML that represents the form extension, paste
    it into a new file with a new name, add a reference to the new MP to the main MP, and replace all references to that object with MP aliases to that object's new home. Import both the new MP and the reduced "main" MP back into the database.
    There are two problems, though:
    Some objects, like projections (used in form extensions created by the authoring tool) and class extensions, should only ever be stored in sealed MPs, because removing them causes data that references them to become invalid, i.e. properties defined in class extensions
    are lost if that class extension is removed and re-added, and templates that depend on projections are inaccessible if the projection they are based on is lost.
    Some atomic SDK objects, like workflows, consist of several non-consecutive XML sections, meaning you would need to trace the references, identify which of many <Rule> and <Action> sections belonged with which workflows, and make sure they
    would all end up in the same MP.
    Now, you could do this slowly, pulling sections out of the "main" MP and replacing them with references over various nightly downtimes, exporting the "main" MP and carving out a few objects, then importing the trimmed-down "main"
    MP and the new single-use MP, until you had something resembling best-practice functional isolation (all change request notifications over here, all service request workflows over there, etc.), and the hollowed-out husk of your "main" MP as a miscellaneous
    container, but I think you'll find that you still have to face and deal with data loss and breakage, potentially a lot more trouble than it's worth.
    Given that, I would instead propose that you give up on correcting the sins of the past and start doing it right now: set up your new MPs, put new things into them, and, while you're working, move over anything that you can easily rebuild. Make a plan to
    rebuild your templates and form customizations. Start identifying class extensions you'll have to rebuild, and work up a plan to export the data defined by them somewhere (probably PowerShell and CSV) so you can restore it when the classes are restored.
    Like most data hygiene problems, there isn't an easy bulk-application answer. You just have to look at everything and clean it up as you go. You can mix salt and sugar quite easily. Unmixing them is a lot more work.

  • Search does not crawl new site collection and documents

    We have the following situation. We have two locations with different farms sharing the same databases (using AlwaysOn for the content databases). Everything works fine, the second site is also read-only while having the primary farm online. For existing
    databases the search crawler on the second site is able to crawl existing site collections.
    For new site collections created on the first farm, the crawler on the first farm indexes the content properly. The second farm, though, is not aware of the content unless you force it to re-enumerate the content database. After this procedure the sites
    are available on the second site as well (they show in the web browser), but the search farm is still not able to see the new site collection and the data created within it.
    Is there any additional iteration we have to go through making the crawler aware of the new structure / content ?
    Thanks in advance, Jens

    Nope.
    Change log wipes are real, that's how incremental crawls work in SharePoint.
    Site A is created and modified. Changes are mirrored to the second AG, content is added, logged in the changes log and then removed as the crawler on the primary farm indexes it.
    This continues until you make farm 2 aware of the changes. At that point farm 2 will look for any changes to the content in the change logs on the newly added sites. Which will be empty, or at least not contain any changes since the primary farm's last crawl.
    That explains why sites aren't indexed properly when they are added, and would also explain why some content is indexed afterwards, which I believe is the case?
    The second issue you'll find is that the crawls won't synchronise. Assuming continuous crawls kicking off at the same time you'll end up in a race between the two. If the primary farm is quicker then the second farm will continuously fall behind then catch
    up and go ahead of the primary indexing process, but if the secondary farm is faster then it'll race off into the distance and then any changes that occur between the secondary farm indexing a site and the primary indexing the site will be lost on the secondary
    farm.
    You'll have to run full crawls. Unless MS have done a lot of work on the supporting infrastructure, incremental or continuous crawls of AOAGs won't work well.
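    If full crawls become a routine step, here is a minimal sketch of starting one from PowerShell (assuming a single Search service application and the default content source name "Local SharePoint sites"):
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $ssa = Get-SPEnterpriseSearchServiceApplication
    $cs  = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
    $cs.StartFullCrawl()   # queues a full crawl of that content source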

  • Two different site collection cannot use the same Term Set . Is there any solution for this ?

    I can save a department site collection as a site collection template, and I can use this template to create a new department site collection.
    But Managed Metadata Navigation throws the following error:
    •Error loading navigation: The Managed Navigation term set is improperly attached to the site. (Correlation ID: 6fa6b19c-820a-001d-b9e0-d939d702050a)

    Hi,
    This is a limitation. Please refer to this:
    http://social.technet.microsoft.com/forums/sharepoint/en-US/029b7e2e-661f-4ac3-bfcb-eb99b43016f2/the-selected-term-set-is-already-used-by-another-site
    Unfortunately in SP 2013, a termset can only be used by a single site collection for navigation. This is a known limitation. There is no out of the box way to use a central termset to manage navigation across site collections. You need to develop a custom sitemap
    provider to do that.
    Another way to do this is to create a term set for each site collection and reuse/pin the terms from the central term set instead of creating duplicate terms. You can script this easily using PowerShell, as sketched below. This has some limitations though.
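    A minimal sketch of that reuse approach with the server-side taxonomy API (the site URL, group name and term set names below are hypothetical):
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    $site    = Get-SPSite "http://sp.contoso.net/sites/dept1"      # hypothetical department site collection
    $session = Get-SPTaxonomySession -Site $site
    $store   = $session.TermStores[0]
    $group   = $store.Groups["Navigation"]                          # hypothetical term group
    $central = $group.TermSets["Global Navigation"]                 # the shared central term set
    $local   = $group.CreateTermSet("Dept1 Navigation")             # per-site-collection term set
    foreach ($term in $central.Terms) {
        $local.ReuseTerm($term, $true) | Out-Null                   # reuse each top-level term and its children
    }
    $store.CommitAll()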
    Also, you can try little customization. Follow this link -
    http://www.mavention.com/blog/building-global-naviga
    Please remember to click 'Mark as Answer' on the answer if it helps you

  • The SharePoint Server Publishing Infrastructure feature must be activated at the site collection level before the Publishing feature can be activated.

    Hi All,
    I am trying to activate the Publishing feature for a SharePoint website using a custom Site Definition File. The problem I am getting
    is that when I try to use the custom site template I get this error message:
    New-SPSite : The SharePoint Server Publishing Infrastructure feature must be activated at the site collection level before the Publishing feature can be activated.
    The custom Site Definition File snippet is shown below:
    <?xml version="1.0" encoding="utf-8"?>
    <Project Title="SiteDefinition1" Revision="2" ListDir="" xmlns:ows="Microsoft SharePoint" xmlns="http://schemas.microsoft.com/sharepoint/">
      <NavBars>
      </NavBars>
      <Configurations>
        <Configuration ID="0" Name="SiteDefinitionDelegateControls2">
          <Lists/>
          <SiteFeatures>
            <!-- Document Set -->
            <Feature ID="{3bae86a2-776d-499d-9db8-fa4cdc7884f8}" Name="FeatureDefinition/15/3bae86a2-776d-499d-9db8-fa4cdc7884f8" />
            <!-- Publishing Prerequisites -->
            <Feature ID="{a392da98-270b-4e85-9769-04c0fde267aa}" Name="FeatureDefinition/15/a392da98-270b-4e85-9769-04c0fde267aa" />
            ...Other Site Features
          </SiteFeatures>
          <WebFeatures>
            <!-- Publishing Web -->
            <Feature ID="{94c94ca6-b32f-4da9-a9e3-1f3d343d7ecb}" Name="FeatureDefinition/15/94c94ca6-b32f-4da9-a9e3-1f3d343d7ecb" />
            ...Other Web Features
          </WebFeatures>
          <Modules>
            <Module Name="DefaultBlank" />
          </Modules>
        </Configuration>
      </Configurations>
      <Modules>
        <Module Name="DefaultBlank" Url="" Path="">
          <File Url="default.aspx">
          </File>
        </Module>
      </Modules>
    </Project>
    I hope you can help
    Colin

    As the error says, the Publishing Infrastructure feature must be activated at the Site Collection level where you are creating a new site PRIOR to turning on the Publishing feature.  So if you are trying to use this site definition to create a
    new Site Collection then it's not going to work.  If you are trying to create a sub site then make sure that Publishing Infrastructure is enabled in the site collection before trying to create a sub site with this site definition.
    Internally Microsoft does this for the Publishing Portals using custom code.  Here's an article that discusses how to do it in SharePoint 2010.  
    http://msdn.microsoft.com/en-us/library/office/gg615465(v=office.14).aspx
    Compare what's in the article to one of the Publishing templates available out of the box in 2013 and you should be able to find the assembly that is used to load the Publishing features.
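    As a hedged alternative to custom code, the two features can simply be enabled right after the site collection is created, e.g. from PowerShell (the URL and account below are hypothetical):
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    New-SPSite http://sp.contoso.net/sites/pub -OwnerAlias "CONTOSO\admin" -Template "STS#0"
    Enable-SPFeature PublishingSite -Url http://sp.contoso.net/sites/pub   # site collection scoped: SharePoint Server Publishing Infrastructure
    Enable-SPFeature PublishingWeb -Url http://sp.contoso.net/sites/pub    # web scoped: SharePoint Server Publishing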
    Paul Stork SharePoint Server MVP
    Principal Architect: Blue Chip Consulting Group
    Blog: http://dontpapanic.com/blog
    Twitter: Follow @pstork
    Please remember to mark your question as "answered" if this solves your problem.

  • How do I stop iCal from changing the server path of CalDAV accounts?

    I'm using Davical as a calendar server for our family. Some may wonder why I use Davical and not iCloud. I do for one reason...there is no way to disable getting other users' alarms on the iPhone. I want to be able to view my wife's calendar but not get her alarms and have her be able to view my calendar and not get my alarms.
    So, now onto the issue with the server path setting changing in iCal. We each have our own calendar set up within our own accounts on our Davical server. We each have read access to the other's calendar. Within iCal account settings I have her account set up on my Mac with my username/password (jason/****) and, under server settings, the server path points to her calendar (/caldav.php/juiper/). After a few days, sometimes just a few hours, suddenly I will have duplicate events throughout my calendar in iCal. Going to preferences I will see that the server path for her calendar will have been changed to the location of my calendar (/caldav.php/jason/). I'll correct the server path, restart iCal and then the calendar will be correct for a while until the server path is automatically and incorrectly changed again and I must manually correct it again.
    Thanks!

    My coworkers and I use DavMail to share access to a shared Exchange calendar, authenticating with our own ADS credentials. We also see the behavior where iCal will reset the CalDav calendar path to the username (our own) that is authenticating to the CalDav service. In Lion this wasn't a deal breaker for us. While very annoying, the work around was simple. Update the path via preferences, and restart iCal. However in Mountain Lion, the issue has become a lot more difficult to fix. It's no longer a matter of updating the path via preferences and restarting.
    In Mountain Lion the solution to change the CalDav path back to the original path requires you to edit the file ~/Library/Calendars/<somehash>.caldav/Info.plist
    Contained within this file is a value "CalendarHomePath", simply update the string to your desired path, clear out the 'Calendar Cache' file for good measure, then restart iCal.

  • How can one edit a .gif file graphics background and change the fonts in Photoshop Elements 13?

    I am trying to change the color and fonts of some of my web site's .gif files. Can I change the color to another hexadecimal color and change the fonts of the letters in Photoshop Elements 13? I am using Windows 7 and heard that you could save the .gif file to another format, edit it, then change the file back to a .gif file. Is this true? If so, how do you do it? The file just has a solid color with the letters "Firm " on it. No animation.

    You can edit a gif without changing the format, but once you save the file as a gif the text becomes part of the image, not text anymore, so you would need to clone or heal away the existing text and then create a new text layer and use save for web to create a new gif.
    For access to the most editing tools, while the file is open in the editor go to image>mode and change it from index color to RGB. Saving as a gif will change it back to index mode.

  • How can i add a special user to the "site collection administrators" of the personal site on sharepoint online when the personal site is creating ?

    When a new personal site is created on SharePoint Online 2013, the administrator of this personal site will be the user himself. We can add another person to the "site collection administrators", and then this person has full control of this personal site.
    But my request is: add this person to the personal site while the personal site is being created, not after the personal site has been created.
    Does anybody know how to do that?

    Hi,
    According to your post, my understanding is that you want to add a special user to the "site collection administrators" of the personal site on SharePoint Online while the personal site is being created.
    Per my knowledge,
    there is no out of the box way to accomplish this with SharePoint.
    This is a forum which is supported for SharePoint On-Premise.
    Regarding SharePoint Online, for quick and accurate answers to your questions, it is recommended that you initiate a new thread in the Office 365 forum.
    Office 365 forum
    http://community.office365.com/en-us/forums/default.aspx
    Thanks,
    Linda Li                
    Forum Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]
    Linda Li
    TechNet Community Support

  • How can i change the default path of installation

    I have had Creative Cloud for a while. Today I changed the default installation folder, from the Creative Cloud settings/preferences (can't remember) menu, from C to D. Then I uninstalled all the programs from Adobe CS6 to get the new CC programs. When I want to reinstall Adobe Creative Cloud it installs on D, and since there is no operating system on D I can't install Creative Cloud. It keeps installing on D. How can I change the default path of installation if I have uninstalled Creative Cloud?

    Or how can I change the preferred installation folder (which can be changed from the Creative Cloud menu/preferences) when I have uninstalled Creative Cloud?

  • How can I change the installation path  of JWS?

    I have to install the JRE 1.4.0_01 in silent mode on Windows XP. For the JRE I can set the installation path in the corresponding dialog and record it in an *.iss file. When starting the silent installation the JRE is correctly installed into this path but the Java Webstart installation procedure doesn't care about this path and installs JWS in 'D:\Programme\Java Web Start'.
    How can I change the installation path for JWS?
    Jörg

    Verify Free Disk Space. Assuming you are installing the OS to the C: drive.
