ALUI 6.1 to MOSS 2010

Hi,
We are using ALUI 6.1 and need a guide or link on how to migrate it to MOSS 2010 in case we need to. Is Tzunami the preferred tool? What problems can we face? Also, how can we migrate the custom applications that we built using the ALUI IDK to MOSS?
Thanks.

Why go to MOSS and not WebCenter? Is it the cost? ALUI is now WCI and is being merged into WebCenter.
Is your ALUI running on IIS, and are your applications using the .NET IDK or the Java IDK?

Similar Messages

  • Integration of Oracle UCM with MOSS 2010

    Hi Experts,
    Do you know if the following scenario is possible:
    - MOSS 2010 for the user interface
    - Oracle UCM for all the document management core activities
    Are there any standard connectors available?
    Thanks in advance,
    Ricardo

    Hi
    Yes, there are SharePoint adapters available for UCM-MOSS integration.
    Please check the following link: http://www.oracle.com/technetwork/middleware/webcenter/content/downloads/index-ucm10g-082682.html
    Thanks
    Srinath

  • Open Modal Dialog on Page Load of MOSS 2010

    Hi All
    We are trying to open a modal dialog when the home page opens on our SharePoint 2010 portal.
    We followed these steps:
    1. Created Page
    2. Created HTML Form Editor web part
    3. Added the following code to its source:
    <script type="text/javascript" language="javascript">
    alert('Shahab');
    _spBodyOnLoadFunctionNames.push("HideNewIcons");
    function HideNewIcons() {
        var options = SP.UI.$create_DialogOptions();
        options.url = 'http://servername/sites/itd/SitePages/Test.aspx?IsDlg=1';
        options.height = 400;
        SP.UI.ModalDialog.showModalDialog(options);
    }
    </script>
    4. Neither ALERT nor MODAL DIALOG appear.
    We are on MOSS 2012. I checked the net and found there might be some issues in 2010. However, it works best on 2007
    Is there any other solution?
    Regards
    SSA

    Hi,
    I understand that you want to open a modal dialog on page load in SharePoint 2010. The code Ashish provided works great, but you need to put the code on the page directly by editing the page in SharePoint Designer (when the code is placed in a Content Editor Web Part, the modal dialog doesn't pop up correctly).
    Edit the page in SharePoint Designer in Advanced Mode.
    Browse to the bottom of the code view.
    Add the code before the </asp:Content> tag.
    <script language="javascript" type="text/javascript">
        ExecuteOrDelayUntilScriptLoaded(yourFunction, 'SP.js');
        function yourFunction() {
            var options = { url: '/_layouts/viewlsts.aspx', title: 'Title, Description, and Icon', width: 640, height: 400 };
            SP.UI.ModalDialog.showModalDialog(options);
        }
        _spBodyOnLoadFunctionNames.push("yourFunction");
    </script>
    Thanks,
    Entan Ming
    TechNet Community Support

  • Getting 401 error while creating a Report Data Source with MOSS 2010 Foundation

    I have set up SQL Server 2008 R2 Reporting Services with SharePoint 2010 Foundation in SharePoint integrated mode. SharePoint Foundation is on machine 1, whereas SQL Server 2008 R2 and the SSRS Report Server are on machine 2. While configuring Reporting Services - SharePoint integration, I used "Windows Authentication" as the Authentication Mode (I need to use Kerberos).
    My objective is to setup a Data Connection Library, a Report Model Library, and a Reports Library so that I can upload a Report Data Source, some SMDLs, and a few Reports onto their respective libraries.
    While creating the top level site, "Business Intelligence Center" was not available for template selection since SharePoint Foundation is being used. I therefore selected "Blank Site" as the template.
    While creating a SharePoint Site under the top level site, for template selection I again had to select "Blank Site".
    I then proceeded to create a library for the data connection. Towards this, I created a new document library and selected "Basic page" as the document template. I then went to Library Settings for this newly created library and clicked on Advanced Settings. On the Advanced Settings page, for "Allow management of content types?" I selected "Yes". Then I clicked on "Add from existing content types" and selected "Report Data Source". I deleted the existing "Document" content type for this library.
    Now I wanted to create a Data Connection in the above Data Connection library. For this, when I clicked on "New Document" under "Documents" of "Library Tools" and selected "Report Data Source", I got the error "The request failed with HTTP status 401: Unauthorized.".
    Can anybody tell me why I am getting this error?
    Note: I have created the site and the library using SharePoint Admin account.

    Hi,
    Thank you for your detailed description. According to the description, I noticed that the report server was not part of the SharePoint farm. Add the report server to the SharePoint farm and see how it works.
    To join a report server to a SharePoint farm, the report server must be installed on a computer that has an instance of a SharePoint product or technology. You can install the report server before or after installing the SharePoint product or technology instance.
    For more information, see
    http://msdn.microsoft.com/en-us/library/bb283190.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support

  • MOSS 2010 Shared Review Not Connecting For All Reviewers (Acrobat 9 and X (Std and Pro) and Reader9)

    I am using SharePoint 2010 for Shared Review.  Most of the testing has been successful with a few minor problems.  The biggest has been that some are not getting the connection request.  The document for review opens immediately either in the user's browser or Adobe Acrobat or Reader.  We have not been able to nail down the reason yet.  But, it has happened on systems running Acrobat 9 Pro and Acrobat X Pro.  We have had success with others using Acrobat X Pro and Reader 9.
    What I am most confused about is why some of our users are not getting the request to connect to the SharePoint workspace to do their review. If they go through Windows Explorer to where the file is located and open it directly, it asks the user to check out the file and then Save & Continue. If they do this, they seem to lose the ability to publish their comments directly to the server. They don't even get the option to check for new comments. The file is opened and they are able to work with it like a normal file.
    What could we have done wrong that caused this? Each of the users has the appropriate permissions to the SharePoint site. I have noticed in the tracker that the two who had the issues don't have their name or title appear. I checked both of them in the Identity section of the Preferences, and their name, title, organization and email are all missing; only their UserID is shown. Could this be part of it? We were all using the same intranet wireless system. All users were able to open the review file. When someone receives the link for review through the email, does the link verify against the email address listed in the preferences?
    Jason Lopez

    Hello Vikas,
    To answer your questions:
    1. Do all reviewers have a local copy of the review file, or is it kept at a shared location and everyone is commenting on it.
    - I have tried it both ways with the same results. Our preference is to keep the PDF in a shared location so that all reviewers can see each other's comments. (Since the problem began, we have lost that capability, and the "semi-workaround" is to email each reviewer the PDF; each reviewer makes comments locally and emails it back to the writer. This is less than ideal, as each reviewer cannot see the others' comments. However, I was able to publish them "on behalf".)
    2. Are you only having trouble with publishing new comments or is it that you cannot see earlier added comments also.
    Reviewers are unable to publish their newly-added comments as there is no Publish button/icon. If I have published another's comments on their behalf, they can see them. However, they cannot Reply - as the Reply button is disabled.
    3. Are there some specific files with which you are having trouble or is it a generic problem.
    It happens with all files, so it is generic. Anyone who has Reader 9 has no Publish button when opening a shared review generated from Acrobat 8.
    Thanks for any help you can provide. We are hoping the problem resolves when we upgrade to Acrobat 9 but that alas will not be until sometime in the new year, hopefully.

  • MOSS 2007 SP1 & SQL Server 2008 upgrade plan into MOSS 2010 and SQL Server 2008 R2 ?

    Hi All,
    I've got the following setup in one single server:
    Windows Server 2008 Standard
    SQL Server 2008 Standard
    SharePoint Server 2007 Enterprise SP1
    that I'd like to upgrade into the following:
    Windows Server 2008 R2 Standard
    SQL Server 2008 R2 SP1 ?
    SharePoint Server 2010 Enterprise
    Can I just perform a Central Administration farm-level backup and then import it on the new SharePoint server? Are there any caveats that I should pay attention to (e.g. non-working site lists or web parts)?
    thanks.
    /* Infrastructure Support Engineer */

    Backup/import will not work for an upgrade. As mentioned on the other thread, you need to perform a database attach upgrade to move to a separate 2010 farm from a 2007 farm. For information about upgrading to SharePoint Server 2010, see the upgrade guide:
    http://technet.microsoft.com/en-us/library/cc303420.aspx 
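    As a rough illustration of the database attach approach, a minimal PowerShell sketch for the 2010 side follows; the database name WSS_Content, the server name SQL2010 and the web application URL http://sharepoint2010 are placeholders, not values from this thread:
    # Minimal sketch of a database attach upgrade (placeholder names throughout).
    # 1. Back up the 2007 content database in SQL Server and restore it on the
    #    SQL instance used by the new 2010 farm.
    # 2. On the 2010 farm, test the restored database for missing dependencies:
    Test-SPContentDatabase -Name WSS_Content -WebApplication http://sharepoint2010
    # 3. Attach the database to the new web application; the content is upgraded during the mount:
    Mount-SPContentDatabase -Name WSS_Content -DatabaseServer SQL2010 -WebApplication http://sharepoint2010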

  • How to Save site as template in SharePoint 2010

    hello,
    I am a little bit confused about how I can save a site as a template in MOSS 2010.
    In SharePoint 2007 it was simple: Site Actions > Save site as template... and done!
    But I cannot find how to do that in the new SharePoint; I would be really happy if someone could tell me how.
    I found some tutorial:
    To create a Site Template, browse to the Site you want to create from.  Go to Site Actions –> Site Settings –> Save site as template (under the Site Actions heading).  Give your template a name, and don't forget to check the "Include Content" box if you want to include library and list content in the template.  This will create a new site template in your Solutions Gallery, which you can then use when creating new sites.  You can also download directly from the Solutions Gallery to a .WSP file.
    But the thing is, I do not have the "Save site as template" link! I also checked whether I need some special site feature activated, but I have them all activated.
    thank you in advance

    I imagine you are trying to save a publishing site as a template, correct? If so, the save as site template link is not available for publishing sites. This was not supported in MOSS and I imagine isn't in SharePoint 2010. The reason was due to publishing sites having pages and layouts tied to content types that do not move with the template.
    If you would like a work around, you can disable the publishing feature (which will then make the save as site template link visible), save the site as a template, then re-enable the publishing feature. You would have to re-enable the publishing feature on each new site built from this template, if desired.
    Go to Site Actions > Site Settings > Manage site features under Site Actions. Deactivate the SharePoint Server Publishing feature, then go back to the Site Settings page. You should now see the Save site as template link. After you save the template, turn SharePoint Server Publishing back on.
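    If you prefer to script that toggle, a minimal PowerShell sketch follows; the site URL is a placeholder and PublishingWeb is the web-level SharePoint Server Publishing feature:
    # Minimal sketch; http://portal/sites/demo is a placeholder URL.
    $url = "http://portal/sites/demo"
    # Deactivate the web-level publishing feature so the Save site as template link appears.
    Disable-SPFeature -Identity "PublishingWeb" -Url $url -Confirm:$false
    # ... save the site as a template from Site Settings ...
    # Re-activate publishing afterwards.
    Enable-SPFeature -Identity "PublishingWeb" -Url $url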
    Adam Preston - MCTS | Blog: http://sptwentyten.wordpress.com | Twitter:@_apreston

  • Term Store Error

    Hello,
    When I want to create a Managed Metadata column on our SharePoint 2010, I get this error:
    Default termstore for this site cannot be identified.
    I found a lot of solutions at this link:
    http://www.sharepointfire.com/MyBlog/2013/01/default-termstore-for-this-site-cannot-be-identified-2/
    In our case, our default MMS service application is on another SharePoint farm (the service app farm), which is connected successfully with the consuming farm.
    In MMS on our service app farm, this setting is already in place: "This service application is the default storage location for column specific term sets" is checked.
    What can now be the problem with this connected service app?
    I hope somebody can help me,
    Thanks Horst MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; TFS 2013; IIS 7.5

    Hi wuwu,
    According to your description, my understanding is that you are unable to create a managed metadata column using the shared Managed Metadata service application.
    Please make sure you have shared the Managed Metadata service application across farms correctly. You can check your steps against the following link:
    http://www.c-sharpcorner.com/uploadfile/anavijai/how-to-publish-managed-metadata-service-across-farms-in-sharepoint-2010/
    Check whether this link (see the Update section) is useful for you:
    http://sharedpointers.blogspot.jp/2011/03/exceptions-when-creating-site-columns.html
    In addition, please check the log file to find more information about this issue.
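    If it helps, the default term store flags on the consuming farm's connection can also be checked from PowerShell; this is only a minimal sketch, and I am assuming the standard cmdlet names and switches here:
    # Minimal sketch, run on the consuming farm (an assumption, not from this thread).
    $proxy = Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "*Managed Metadata*" }
    # These switches correspond to the default keyword store and the "default storage
    # location for column specific term sets" checkboxes on the connection properties.
    Set-SPMetadataServiceApplicationProxy -Identity $proxy -DefaultKeywordTaxonomy -DefaultSiteCollectionTaxonomy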
    Best Regards,
    Wendy
    Forum Support
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
    [email protected]
    Wendy Li
    TechNet Community Support

  • Parse robocopy Log File - new value

    Hello,
    I have found a script that parses a robocopy log file, which looks like this:
       ROBOCOPY     ::     Robust File Copy for Windows                             
      Started : Thu Aug 07 09:30:18 2014
       Source : e:\testfolder\
         Dest : w:\testfolder\
        Files : *.*
      Options : *.* /V /NDL /S /E /COPYALL /NP /IS /R:1 /W:5
         Same          14.6 g e:\testfolder\bigfile - Copy (5).out
         Same          14.6 g e:\testfolder\bigfile - Copy.out
         Same          14.6 g e:\testfolder\bigfile.out
                   Total    Copied   Skipped  Mismatch    FAILED    Extras
        Dirs :         1         0         1         0         0         0
       Files :         3         3         0         0         0         0
       Bytes :  43.969 g  43.969 g         0         0         0         0
       Times :   0:05:44   0:05:43                       0:00:00   0:00:00
       Speed :           137258891 Bytes/sec.
       Speed :            7854.016 MegaBytes/min.
       Ended : Thu Aug 07 09:36:02 2014
    Most of the values are included in the output file, but the two Speed parameters are not.
    How can I get these two Speed parameters into the output file?
    Here is the script:
    param(
    [parameter(Position=0,Mandatory=$true,ValueFromPipeline=$false,HelpMessage='Source Path with no trailing slash')][string]$SourcePath,
    [switch]$fp
    write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
    #Arguments
    # -fp File parse. Counts status flags and oldest file Slower on big files.
    $ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
    $refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
    # These summary fields always appear in this order in a robocopy log
    $HeaderParams = @{
    "04|Started" = "date";
    "01|Source" = "string";
    "02|Dest" = "string";
    "03|Options" = "string";
    "07|Dirs" = "counts";
    "08|Files" = "counts";
    "09|Bytes" = "counts";
    "10|Times" = "counts";
    "05|Ended" = "date";
    #"06|Duration" = "string"
    $ProcessCounts = @{
    "Processed" = 0;
    "Error" = 0;
    "Incomplete" = 0
    $tab=[char]9
    $files=get-childitem $SourcePath
    $writer=new-object System.IO.StreamWriter("$(get-location)\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
    function Get-Tail([object]$reader, [int]$count = 10) {
    $lineCount = 0
    [long]$pos = $reader.BaseStream.Length - 1
    while($pos -gt 0)
    $reader.BaseStream.position=$pos
    # 0x0D (#13) = CR
    # 0x0A (#10) = LF
    if ($reader.BaseStream.ReadByte() -eq 10)
    $lineCount++
    if ($lineCount -ge $count) { break }
    $pos--
    # tests for file shorter than requested tail
    if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
    $reader.BaseStream.Position=0
    } else {
    # $reader.BaseStream.Position = $pos+1
    $lines=@()
    while(!$reader.EndOfStream) {
    $lines += $reader.ReadLine()
    return $lines
    function Get-Top([object]$reader, [int]$count = 10)
    $lines=@()
    $lineCount = 0
    $reader.BaseStream.Position=0
    while(($linecount -lt $count) -and !$reader.EndOfStream) {
    $lineCount++
    $lines += $reader.ReadLine()
    return $lines
    function RemoveKey ( $name ) {
    if ( $name -match "|") {
    return $name.split("|")[1]
    } else {
    return ( $name )
    function GetValue ( $line, $variable ) {
    if ($line -like "*$variable*" -and $line -like "* : *" ) {
    $result = $line.substring( $line.IndexOf(":")+1 )
    return $result
    } else {
    return $null
    function UnBodgeDate ( $dt ) {
    # Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
    if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
    $dt=$dt.split(" ")
    $dt=$dt[2],$dt[1],$dt[4],$dt[3]
    $dt -join " "
    if ( $dt -as [DateTime] ) {
    return $dt.ToStr("dd/MM/yyyy hh:mm:ss")
    } else {
    return $null
    function UnpackParams ($params ) {
    # Unpacks file count bloc in the format
    # Dirs : 1827 0 1827 0 0 0
    # Files : 9791 0 9791 0 0 0
    # Bytes : 165.24 m 0 165.24 m 0 0 0
    # Times : 1:11:23 0:00:00 0:00:00 1:11:23
    # Parameter name already removed
    if ( $params.length -ge 58 ) {
    $params = $params.ToCharArray()
    $result=(0..5)
    for ( $i = 0; $i -le 5; $i++ ) {
    $result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
    $result=$result -join ","
    } else {
    $result = ",,,,,"
    return $result
    $sourcecount = 0
    $targetcount = 1
    # Write the header line
    $writer.Write("File")
    foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
    if ( $HeaderParam.value -eq "counts" ) {
    $tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
    $tmp=$tmp.replace("~","$(removekey $headerparam.name)")
    $writer.write(",$($tmp)")
    } else {
    $writer.write(",$(removekey $HeaderParam.name)")
    if($fp){
    $writer.write(",Scanned,Newest,Summary")
    $writer.WriteLine()
    $filecount=0
    # Enumerate the files
    foreach ($file in $files) {
    $filecount++
    write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
    $results=@{}
    $Stream = $file.Open([System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite)
    $reader = New-Object System.IO.StreamReader($Stream)
    #$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
    $HeaderFooter = Get-Top $reader 16
    if ( $HeaderFooter -match "ROBOCOPY :: Robust File Copy for Windows" ) {
    if ( $HeaderFooter -match "Files : " ) {
    $HeaderFooter = $HeaderFooter -notmatch "Files : "
    [long]$ReaderEndHeader=$reader.BaseStream.position
    $Footer = Get-Tail $reader 16
    $ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
    if ($ErrorFooter) {
    $ProcessCounts["Error"]++
    write-host -foregroundcolor red "`t $ErrorFooter"
    } elseif ( $footer -match "---------------" ) {
    $ProcessCounts["Processed"]++
    $i=$Footer.count
    while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
    $Footer=$Footer[$i..$Footer.Count]
    $HeaderFooter+=$Footer
    } else {
    $ProcessCounts["Incomplete"]++
    write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
    foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
    $name = "$(removekey $HeaderParam.Name)"
    $tmp = GetValue $($HeaderFooter -match "$name : ") $name
    if ( $tmp -ne "" -and $tmp -ne $null ) {
    switch ( $HeaderParam.value ) {
    "date" { $results[$name]=UnBodgeDate $tmp.trim() }
    "counts" { $results[$name]=UnpackParams $tmp }
    "string" { $results[$name] = """$($tmp.trim())""" }
    default { $results[$name] = $tmp.trim() }
    if ( $fp ) {
    write-host "Parsing $($reader.BaseStream.Length) bytes"
    # Now go through the file line by line
    $reader.BaseStream.Position=0
    $filesdone = $false
    $linenumber=0
    $FileResults=@{}
    $newest=[datetime]"1/1/1900"
    $linecount++
    $firsttick=$elapsedtime.elapsed.TotalSeconds
    $tick=$firsttick+$refreshrate
    $LastLineLength=1
    try {
    do {
    $line = $reader.ReadLine()
    $linenumber++
    if (($line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16) ) {
    # line is end of job
    $filesdone=$true
    } elseif ($linenumber -gt 16 -and $line -gt "" ) {
    $buckets=$line.split($tab)
    # this test will pass if the line is a file, fail if a directory
    if ( $buckets.count -gt 3 ) {
    $status=$buckets[1].trim()
    $FileResults["$status"]++
    $SizeDateTime=$buckets[3].trim()
    if ($sizedatetime.length -gt 19 ) {
    $DateTime = $sizedatetime.substring($sizedatetime.length -19)
    if ( $DateTime -as [DateTime] ){
    $DateTimeValue=[datetime]$DateTime
    if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
    if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
    $line=$line.Trim()
    if ( $line.Length -gt 48 ) {
    $line="[...]"+$line.substring($line.Length-48)
    $line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
    write-host $line.PadRight($LastLineLength) -NoNewLine
    $LastLineLength = $line.length
    $tick=$tick+$refreshrate
    } until ($filesdone -or $reader.endofstream)
    finally {
    $reader.Close()
    $line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
    write-host $line -NoNewLine
    $writer.Write("`"$file`"")
    foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
    $name = "$(removekey $HeaderParam.Name)"
    if ( $results[$name] ) {
    $writer.Write(",$($results[$name])")
    } else {
    if ( $ErrorFooter ) {
    #placeholder
    } elseif ( $HeaderParam.Value -eq "counts" ) {
    $writer.Write(",,,,,,")
    } else {
    $writer.Write(",")
    if ( $ErrorFooter ) {
    $tmp = $($ErrorFooter -join "").substring(20)
    $tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
    $writer.write(",,$tmp")
    } elseif ( $fp ) {
    $writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")
    foreach ( $FileResult in $FileResults.GetEnumerator() ) {
    $writer.write(",$($FileResult.Name): $($FileResult.Value);")
    $writer.WriteLine()
    } else {
    write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
    write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
    write-host "Results written to $($writer.basestream.name)"
    $writer.close()
    I hope somebody can help me,
    Horst
    Thanks Horst MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; TFS 2013; IIS 7.5

    Hi Horst,
    To convert multiple robocopy log files to a .csv file with the "speed" option, the script below may be helpful for you. I tested it with a single robocopy log file, and the .csv file will be output to "D:\":
    $SourcePath="e:\1\1.txt" #robocopy log file
    write-host "Robocopy log parser. $(if($fp){"Parsing file entries"} else {"Parsing summaries only, use -fp to parse file entries"})"
    #Arguments
    # -fp File parse. Counts status flags and oldest file Slower on big files.
    $ElapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
    $refreshrate=1 # progress counter refreshes this often when parsing files (in seconds)
    # These summary fields always appear in this order in a robocopy log
    $HeaderParams = @{
     "04|Started" = "date"; 
     "01|Source" = "string";
     "02|Dest" = "string";
     "03|Options" = "string";
     "09|Dirs" = "counts";
     "10|Files" = "counts";
     "11|Bytes" = "counts";
     "12|Times" = "counts";
     "05|Ended" = "date";
     "07|Speed" = "default";
     "08|Speednew" = "default"
    $ProcessCounts = @{
     "Processed" = 0;
     "Error" = 0;
     "Incomplete" = 0
    $tab=[char]9
    $files=get-childitem $SourcePath
    $writer=new-object System.IO.StreamWriter("D:\robocopy-$(get-date -format "dd-MM-yyyy_HH-mm-ss").csv")
    function Get-Tail([object]$reader, [int]$count = 10) {
     $lineCount = 0
     [long]$pos = $reader.BaseStream.Length - 1
     while($pos -gt 0)
      $reader.BaseStream.position=$pos
      # 0x0D (#13) = CR
      # 0x0A (#10) = LF
      if ($reader.BaseStream.ReadByte() -eq 10)
       $lineCount++
       if ($lineCount -ge $count) { break }
      $pos--
     # tests for file shorter than requested tail
     if ($lineCount -lt $count -or $pos -ge $reader.BaseStream.Length - 1) {
      $reader.BaseStream.Position=0
     } else {
      # $reader.BaseStream.Position = $pos+1
     $lines=@()
     while(!$reader.EndOfStream) {
      $lines += $reader.ReadLine()
     return $lines
    function Get-Top([object]$reader, [int]$count = 10)
     $lines=@()
     $lineCount = 0
     $reader.BaseStream.Position=0
     while(($linecount -lt $count) -and !$reader.EndOfStream) {
      $lineCount++
      $lines += $reader.ReadLine()  
     return $lines
    function RemoveKey ( $name ) {
     if ( $name -match "|") {
      return $name.split("|")[1]
     } else {
      return ( $name )
    function GetValue ( $line, $variable ) {
     if ($line -like "*$variable*" -and $line -like "* : *" ) {
      $result = $line.substring( $line.IndexOf(":")+1 )
      return $result
     } else {
      return $null
    }function UnBodgeDate ( $dt ) {
     # Fixes RoboCopy botched date-times in format Sat Feb 16 00:16:49 2013
     if ( $dt -match ".{3} .{3} \d{2} \d{2}:\d{2}:\d{2} \d{4}" ) {
      $dt=$dt.split(" ")
      $dt=$dt[2],$dt[1],$dt[4],$dt[3]
      $dt -join " "
     if ( $dt -as [DateTime] ) {
      return $dt.ToStr("dd/MM/yyyy hh:mm:ss")
     } else {
      return $null
    function UnpackParams ($params ) {
     # Unpacks file count bloc in the format
     # Dirs :      1827         0      1827         0         0         0
     # Files :      9791         0      9791         0         0         0
     # Bytes :  165.24 m         0  165.24 m         0         0         0
     # Times :   1:11:23   0:00:00                       0:00:00   1:11:23
     # Parameter name already removed
     if ( $params.length -ge 58 ) {
      $params = $params.ToCharArray()
      $result=(0..5)
      for ( $i = 0; $i -le 5; $i++ ) {
       $result[$i]=$($params[$($i*10 + 1) .. $($i*10 + 9)] -join "").trim()
      $result=$result -join ","
     } else {
      $result = ",,,,,"
     return $result
    $sourcecount = 0
    $targetcount = 1
    # Write the header line
    $writer.Write("File")
    foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
     if ( $HeaderParam.value -eq "counts" ) {
      $tmp="~ Total,~ Copied,~ Skipped,~ Mismatch,~ Failed,~ Extras"
      $tmp=$tmp.replace("~","$(removekey $headerparam.name)")
      $writer.write(",$($tmp)")
     } else {
      $writer.write(",$(removekey $HeaderParam.name)")
    if($fp){
     $writer.write(",Scanned,Newest,Summary")
    $writer.WriteLine()
    $filecount=0
    # Enumerate the files
    foreach ($file in $files) { 
     $filecount++
        write-host "$filecount/$($files.count) $($file.name) ($($file.length) bytes)"
     $results=@{}
    $Stream = $file.Open([System.IO.FileMode]::Open,
                       [System.IO.FileAccess]::Read,
                        [System.IO.FileShare]::ReadWrite)
     $reader = New-Object System.IO.StreamReader($Stream)
     #$filestream=new-object -typename System.IO.StreamReader -argumentlist $file, $true, [System.IO.FileAccess]::Read
     $HeaderFooter = Get-Top $reader 16
     if ( $HeaderFooter -match "ROBOCOPY     ::     Robust File Copy for Windows" ) {
      if ( $HeaderFooter -match "Files : " ) {
       $HeaderFooter = $HeaderFooter -notmatch "Files : "
      [long]$ReaderEndHeader=$reader.BaseStream.position
      $Footer = Get-Tail $reader 16
      $ErrorFooter = $Footer -match "ERROR \d \(0x000000\d\d\) Accessing Source Directory"
      if ($ErrorFooter) {
       $ProcessCounts["Error"]++
       write-host -foregroundcolor red "`t $ErrorFooter"
      } elseif ( $footer -match "---------------" ) {
       $ProcessCounts["Processed"]++
       $i=$Footer.count
       while ( !($Footer[$i] -like "*----------------------*") -or $i -lt 1 ) { $i-- }
       $Footer=$Footer[$i..$Footer.Count]
       $HeaderFooter+=$Footer
      } else {
       $ProcessCounts["Incomplete"]++
       write-host -foregroundcolor yellow "`t Log file $file is missing the footer and may be incomplete"
      foreach ( $HeaderParam in $headerparams.GetEnumerator() | Sort-Object Name ) {
       $name = "$(removekey $HeaderParam.Name)"
                            if ($name -eq "speed"){ #handle two speed
                            ($HeaderFooter -match "$name : ")|foreach{
                             $tmp=GetValue $_ "speed"
                             $results[$name] = $tmp.trim()
                             $name+="new"}
                            elseif ($name -eq "speednew"){} #handle two speed
                            else{
       $tmp = GetValue $($HeaderFooter -match "$name : ") $name
       if ( $tmp -ne "" -and $tmp -ne $null ) {
        switch ( $HeaderParam.value ) {
         "date" { $results[$name]=UnBodgeDate $tmp.trim() }
         "counts" { $results[$name]=UnpackParams $tmp }
         "string" { $results[$name] = """$($tmp.trim())""" }  
         default { $results[$name] = $tmp.trim() }  
      if ( $fp ) {
       write-host "Parsing $($reader.BaseStream.Length) bytes"
       # Now go through the file line by line
       $reader.BaseStream.Position=0
       $filesdone = $false
       $linenumber=0
       $FileResults=@{}
       $newest=[datetime]"1/1/1900"
       $linecount++
       $firsttick=$elapsedtime.elapsed.TotalSeconds
       $tick=$firsttick+$refreshrate
       $LastLineLength=1
       try {
        do {
         $line = $reader.ReadLine()
         $linenumber++
         if (($line -eq "-------------------------------------------------------------------------------" -and $linenumber -gt 16)  ) {
          # line is end of job
          $filesdone=$true
         } elseif ($linenumber -gt 16 -and $line -gt "" ) {
          $buckets=$line.split($tab)
          # this test will pass if the line is a file, fail if a directory
          if ( $buckets.count -gt 3 ) {
           $status=$buckets[1].trim()
           $FileResults["$status"]++
           $SizeDateTime=$buckets[3].trim()
           if ($sizedatetime.length -gt 19 ) {
            $DateTime = $sizedatetime.substring($sizedatetime.length -19)
            if ( $DateTime -as [DateTime] ){
             $DateTimeValue=[datetime]$DateTime
             if ( $DateTimeValue -gt $newest ) { $newest = $DateTimeValue }
         if ( $elapsedtime.elapsed.TotalSeconds -gt $tick ) {
          $line=$line.Trim()
          if ( $line.Length -gt 48 ) {
           $line="[...]"+$line.substring($line.Length-48)
          $line="$([char]13)Parsing > $($linenumber) ($(($reader.BaseStream.Position/$reader.BaseStream.length).tostring("P1"))) - $line"
          write-host $line.PadRight($LastLineLength) -NoNewLine
          $LastLineLength = $line.length
          $tick=$tick+$refreshrate      
        } until ($filesdone -or $reader.endofstream)
       finally {
        $reader.Close()
       $line=$($([string][char]13)).padright($lastlinelength)+$([char]13)
       write-host $line -NoNewLine
      $writer.Write("`"$file`"")
      foreach ( $HeaderParam in $HeaderParams.GetEnumerator() | Sort-Object Name ) {
       $name = "$(removekey $HeaderParam.Name)"
       if ( $results[$name] ) {
        $writer.Write(",$($results[$name])")
       } else {
        if ( $ErrorFooter ) {
         #placeholder
        } elseif ( $HeaderParam.Value -eq "counts" ) {
         $writer.Write(",,,,,,")
        } else {
         $writer.Write(",")
      if ( $ErrorFooter ) {
       $tmp = $($ErrorFooter -join "").substring(20)
       $tmp=$tmp.substring(0,$tmp.indexof(")")+1)+","+$tmp
       $writer.write(",,$tmp")
      } elseif ( $fp ) {
       $writer.write(",$LineCount,$($newest.ToString('dd/MM/yyyy hh:mm:ss'))")   
       foreach ( $FileResult in $FileResults.GetEnumerator() ) {
        $writer.write(",$($FileResult.Name): $($FileResult.Value);")
      $writer.WriteLine()
     } else {
      write-host -foregroundcolor darkgray "$($file.name) is not recognised as a RoboCopy log file"
    write-host "$filecount files scanned in $($elapsedtime.elapsed.tostring()), $($ProcessCounts["Processed"]) complete, $($ProcessCounts["Error"]) have errors, $($ProcessCounts["Incomplete"]) incomplete"
    write-host  "Results written to $($writer.basestream.name)"
    $writer.close()
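    As a side note, if you only need the two speed values rather than the full .csv report, a much smaller sketch also works (using the same log path as above):
    # Minimal sketch: pull just the two Speed lines out of a robocopy log.
    Select-String -Path "e:\1\1.txt" -Pattern 'Speed :' | ForEach-Object { $_.Line.Trim() }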
    If you have any other questions, please feel free to let me know.
    Best Regards,
    Anna Wang
    TechNet Community Support

  • Search string and export

    Hello,
    I want to search for two strings, "+error" and "+denied", in the .txt files in one folder.
    If a line containing one of them exists in such a file, it should export this line into a new .txt file.
    How can I do that?
    Thanks Horst MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; TFS 2013; IIS 7.5

    You can try updating the if statement to be:
    if (($_ -like "*+error*") -or ($_ -like "*+denied*")) {
        Out-File C:\MyNewFile.txt -InputObject $_ -Append
    }
    That way, if the current line has +error or +denied with characters before or after it, the wildcard * will still pick it up.
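    For completeness, here is a minimal end-to-end sketch; the folder C:\Logs is a placeholder path, and C:\MyNewFile.txt is kept from the snippet above:
    # Minimal sketch; C:\Logs is a placeholder for the folder holding the .txt files.
    Get-ChildItem C:\Logs\*.txt | Get-Content | ForEach-Object {
        if (($_ -like "*+error*") -or ($_ -like "*+denied*")) {
            Out-File C:\MyNewFile.txt -InputObject $_ -Append
        }
    }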
    If you find that my post has answered your question, please mark it as the answer. If you find my post to be helpful in anyway, please click vote as helpful.
    Don't Retire Technet

  • Error: HTML content does not allow it to be displayed in a frame.

    Hi,
    We migrated our site from MOSS 2010 to 2013. The system prevents HTML pages on other domains from hosting our site in a frame, because it assumes that external framing is a security threat. Please find the error details below. I gathered from a different source that there is a workaround, namely adding
    <meta http-equiv="X-Frame-Options" content="ALLOW">
    to the HTML file, but it didn't work. Can you please guide me on whether I am missing anything? I am using IE 9. I think the line
    <meta http-equiv="X-Frame-Options" content="ALLOW">
    will help to resolve this, but I am missing something here.
    Knowledge is power.

    Hi,
    Please try the FormDigest control in your master page, as in:
    <asp:ContentPlaceHolder ID="PlaceHolderFormDigest" runat="server">
        <SharePoint:FormDigest ID="Formdigest1" runat="server" />
    </asp:ContentPlaceHolder>
    Thanks! Best Regards, Prasham Sabadra http://prashamsabadra.blogspot.in

  • Removing MissingSiteDefinition and MissingSetupFile

    Hi,
    I need to upgrade MOSS 2010 to SP1, but these errors are stopping me...
    PS C:\Users\User> Test-SPContentDatabase -name wss_content_inet
    cmdlet Test-SPContentDatabase at command pipeline position 1
    Supply values for the following parameters:
    WebApplication: SharePoint - inet80
    Category        : MissingSiteDefinition
    Error           : True
    UpgradeBlocking : False
    Message         : 1 Sites in database [WSS_Content_net] has reference(s) to a missing site definition, Id = [75806], Lcid = [1033].
    Remedy          : The site definitions with Id 75806 is referenced in the database [WSS_Content_net], but is not installed on the current farm. The missing site definition may cause upgrade to fail. Please install any solution which contains the site definition and restart upgrade if necessary.

    Category        : MissingSetupFile
    Error           : True
    UpgradeBlocking : False
    Message         : File [Features\KnowledgeBaseKnowledgeBaseList\kbase\repair.aspx] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this file.
    Remedy          : One or more setup files are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these files.

    Category        : MissingSetupFile
    Error           : True
    UpgradeBlocking : False
    Message         : File [Features\KnowledgeBaseKnowledgeBaseList\kbase\Upload.aspx] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this file.
    Remedy          : One or more setup files are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these files.

    Category        : MissingSetupFile
    Error           : True
    UpgradeBlocking : False
    Message         : File [Features\KnowledgeBaseModules\default.aspx] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this file.
    Remedy          : One or more setup files are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these files.

    Category        : MissingSetupFile
    Error           : True
    UpgradeBlocking : False
    Message         : File [Features\TSAimage\homepage.gif] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this file.
    Remedy          : One or more setup files are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these files.

    Category        : MissingSetupFile
    Error           : True
    UpgradeBlocking : False
    Message         : File [SiteTemplates\PWA\dwp\OWCViewPart.webpart] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this file.
    Remedy          : One or more setup files are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these files.

    Category        : MissingSetupFile
    Error           : True
    UpgradeBlocking : False
    Message         : File [SiteTemplates\PWA\OWCView.aspx] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this file.
    Remedy          : One or more setup files are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these files.

    Category        : MissingWebPart
    Error           : True
    UpgradeBlocking : False
    Message         : WebPart class [94b5bae7-436c-dcc2-5869-d491181c283b] is referenced [1] times in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this web part.
    Remedy          : One or more web parts are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these web parts.

    Category        : MissingAssembly
    Error           : True
    UpgradeBlocking : False
    Message         : Assembly [KnowledgeBaseEventHandler, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c] is referenced in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this assembly.
    Remedy          : One or more assemblies are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these assemblies.

    Category        : MissingAssembly
    Error           : True
    UpgradeBlocking : False
    Message         : Assembly [Microsoft.Office.Project.Server.PWA,Version=12.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c] is referenced in the database [WSS_Content_net], but is not installed on the current farm. Please install any feature/solution which contains this assembly.
    Remedy          : One or more assemblies are referenced in the database [WSS_Content_net], but are not installed on the current farm. Please install any feature or solution which contains these assemblies.

    Hi,
    It seems that you have some MOSS 2007 history there, or you didn't uninstall the features/solutions properly.
    As a start: you can still upgrade to SP1 with these errors, they won't block your upgrade, but it's still a good idea to clean them up.
    You can do two things:
    1. Install the missing features/solutions if you have the installation packages or files.
    2. Remove the missing definitions from the farm, especially if you are not planning to use them.
    I see that the missing files are related to Project Server and a KnowledgeBase module. You need to find where the web parts that reference the missing files are located and simply remove those web part definitions from the sites.
    For missing site templates, you need to add those templates; I don't know of another way (like replacing/removing template definitions).
    You can check my post about missing features and the PowerShell that helps find them in the content databases; you can adapt this script to find where your missing site definitions and other dependencies are: http://net-pro.net/2011/sharepoint-server/powershell-script-to-find-missing-features/
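    As a small additional step, you can also dump the full report to a text file so the missing dependencies can be worked through one by one; a minimal sketch using the database and web application names from your output (the output file name is just a placeholder):
    # Minimal sketch: save the Test-SPContentDatabase report for later review.
    Test-SPContentDatabase -Name wss_content_inet -WebApplication "SharePoint - inet80" | Out-File .\wss_content_inet-issues.txt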
    Let me know if you succeed or have more questions
    Regards, Stanisław Delost, NETPRIME. http://www.net-pro.net

  • Warmup scripts

    Hello,
    I have many warmup scripts in PowerShell for SP 2010, but none of them seems to work; at least I could not make them work.
    I just wish to show you the particular code fragment which ultimately fails:
        $wc = new-object net.webclient;
        $wc.credentials = [System.Net.CredentialCache]::DefaultCredentials;
        $pageContents = $wc.DownloadString("http://mywebapp/default.aspx");
    --OR--
        $wc = new-object net.webclient;
        $wc.Credentials = New-Object System.Net.NetworkCredential("domainuser", "password", "mydomain");
        $pageContents = $wc.DownloadString("http://mywebapp/default.aspx");
    What I get with both is:
    Exception calling "DownloadString" with "1" argument(s): "The remote server returned an error: (401) Unauthorized."
    The user I used is actually the Primary Site Collection Administrator, which has all rights over all the sites.
    The MOSS 2010 farm was recently installed, with the default configuration.
    I also allowed anonymous access within the web application.
    Does anybody have any idea what the issue could be?
    Thank you

    Dear Leonid Lyublinski,
    No luck :( I have logged in to the server as FarmAdmin and tried; I still get the unauthorized access.
    Each time I run the script, my FarmAdmin account appears in the Event Log as an audit failure, see below.
    How could this be? My FarmAdmin is a local admin on the server, and as far as I know the initial configuration wizard assigns additional rights to the FarmAdmin account (SQL db_owner, etc.).
    Any thoughts? What should I check about my FarmAdmin account?
    Thank you
    Dear Nico,
    It does the same, unfortunately, with the same error.
    Thank you
    An account failed to log on.
    Subject:
        Security ID:        NULL SID
        Account Name:        -
        Account Domain:        -
        Logon ID:        0x0
    Logon Type:            3
    Account For Which Logon Failed:
        Security ID:        NULL SID
        Account Name:        myFarmAdmin :)
        Account Domain:        myDomain :)
    Failure Information:
        Failure Reason:        An Error occured during Logon.
        Status:            0xc000006d
        Sub Status:        0x0
    Process Information:
        Caller Process ID:    0x0
        Caller Process Name:    -
    Network Information:
        Workstation Name:    SRVMOSS01
        Source Network Address:    (my ip address could not tell:)
        Source Port:        59070
    Detailed Authentication Information:
        Logon Process:      
        Authentication Package:    NTLM
        Transited Services:    -
        Package Name (NTLM only):    -
        Key Length:        0
    This event is generated when a logon request fails. It is generated on the computer where access was attempted.
    The Subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.
    The Logon Type field indicates the kind of logon that was requested. The most common types are 2 (interactive) and 3 (network).
    The Process Information fields indicate which account and process on the system requested the logon.
    The Network Information fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.
    The authentication information fields provide detailed information about this specific logon request.
        - Transited services indicate which intermediate services have participated in this logon request.
        - Package name indicates which sub-protocol was used among the NTLM protocols.
        - Key length indicates the length of the generated session key. This will be 0 if no session key was requested.
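    For what it is worth, and purely as an assumption not confirmed in this thread, an NTLM audit failure like the one above when a warmup script calls the local web application by host name is often caused by the Windows loopback check. In a test environment it can be relaxed with a sketch like the following, followed by a restart:
    # Minimal sketch (test environments only, an assumption): relax the loopback
    # check that can cause local 401s when a web application is addressed by host name.
    New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" -Name "DisableLoopbackCheck" -Value 1 -PropertyType DWord -Force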

  • Migration of SharePoint portals

    Presently our client has MOSS 2007 and 2003 portals, and they want to migrate those sites into MOSS 2010 in a different environment. We need to showcase something to that client. If anyone has any ideas on this, please let us know. It's very urgent.

    At a high level, the following steps need to be taken:
    - content needs to be moved (by moving the db's themselves, or individual site collections, or a 3rd party migration tool).
    - user accounts need to be re-ACLed
    - an inventory must be made, along with a plan for upgrading customizations
    - a plan must be made to upgrade the various SharePoint services
    This is high-level advice; if there is something specific...?
    Kind regards,
    Margriet Bruggeman
    Lois & Clark IT Services
    web site: http://www.loisandclark.eu
    blog: http://www.sharepointdragons.com

  • StreamWriter output different from Out-File

    Hello,
    I have a script where I write values to a file with Out-File:
    "<<<local>>>" | Out-File $NagiosLogFile -Append -encoding ASCII
    $Export_W3SVC | Out-File $NagiosLogFile -Append -encoding ASCII
    $Export_App_Pool | Out-File $NagiosLogFile -Append -encoding ASCII
    $Export_IIS_Site | Out-File $NagiosLogFile -Append -encoding ASCII
    $Export_NetTCP_8093 | Out-File $NagiosLogFile -Append -encoding ASCII
    $Export_IIS_Read_File | Out-File $NagiosLogFile -Append -encoding ASCII
    "" | Out-File $NagiosLogFile -Append -encoding ASCII
    The output in the file is:
    <<<local>>>
    0 PolyOP_service_W3SVC - Der W3SVC Service is running
    0 PolyOP_ApplPool_ZWACMESBGHCON_Client - Application Pool PolyOP_ApplPool_ZWACMESBGHCON_Client is running
    0 PolyOP_ApplPool_ZWACMESBGHCON_Server - Application Pool PolyOP_ApplPool_ZWACMESBGHCON_Server is running
    0 PolyOP_WebSite_ZWACMESBGHCON_Client - IIS Site PolyOP_WebSite_ZWACMESBGHCON_Client is running
    0 PolyOP_WebSite_ZWACMESBGHCON_Server - IIS Site PolyOP_WebSite_ZWACMESBGHCON_Server is running
    0 PolyOP_Net_TCP_Port_8093_more_then_one_time - TCP Port 8093 is only set once at this server
    0 PolyOP_Read_File_EMSDeploy_Folder - EMS Deploy directory can be read from this server
    The other script method uses a StreamWriter:
    $sw = new-object system.IO.StreamWriter($NagiosLogFile, "True", [System.Text.Encoding]::Ascii)
    $sw.writeline("<<<local>>>")
    $sw.writeline($Export_W3SVC)
    $sw.writeline($Export_App_Pool)
    $sw.writeline($Export_IIS_Site)
    $sw.writeline($Export_NetTCP_8093)
    $sw.writeline($Export_IIS_Read_File)
    $sw.writeline("")
    $sw.close()
    The output is:
    <<<local>>>
    0 PolyOP_service_W3SVC - Der W3SVC Service is running
    System.Object[]
    System.Object[]
    0 PolyOP_Net_TCP_Port_8093_more_then_one_time - TCP Port 8093 is only set once at this server
    0 PolyOP_Read_File_EMSDeploy_Folder - EMS Deploy directory can be read from this server
    The values for $Export_App_Pool and $Export_IIS_Site are arrays; here is that part of the script:
    $DateTime = (Get-Date -Format yyyy.MM.dd-hh:mm:ss)
    $nagios_State = "0"
    $ServiceName = "ApplPool_"+$item.Name
    $Export_App_Pool1 = "$nagios_State PolyOP_$ServiceName - Application Pool PolyOP_$ServiceName is running"
    $Export_App_Pool += $Export_App_Pool1
    $Export_App_Pool = @($Export_App_Pool)
    What is the problem?
    I hope somebody can help me,
    Thanks Horst MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; IIS 7.5

    Hello,
    Thank you, it works, but now I have a new line in the log file which I don't need.
    <<<local>>>
    0 PolyOP_service_W3SVC - Der W3SVC Service is running
    0 PolyOP_ApplPool_ZWACMESBGHPRO_Client - Application Pool PolyOP_ApplPool_ZWACMESBGHPRO_Client is running
    0 PolyOP_ApplPool_ZWACMESBGHPRO_Server - Application Pool PolyOP_ApplPool_ZWACMESBGHPRO_Server is running
    0 PolyOP_WebSite_ZWACMESBGHPRO_Client - IIS Site PolyOP_WebSite_ZWACMESBGHPRO_Client is running
    0 PolyOP_WebSite_ZWACMESBGHPRO_Server - IIS Site PolyOP_WebSite_ZWACMESBGHPRO_Server is running
    0 PolyOP_Net_TCP_Port_8093_more_then_one_time - TCP Port 8093 is only set once at this server
    0 PolyOP_Read_File_EMSDeploy_Folder - EMS Deploy directory can be read from this server
    How can I remove this line after these two arrays?
    Thanks Horst MOSS 2007 Farm; MOSS 2010 Farm; TFS 2010; IIS 7.5
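    In case it helps, one way to avoid both the System.Object[] output and the extra separator line is to write the array entries one by one; a minimal sketch reusing the variable names from your script (this is an assumption about how the earlier fix was applied):
    # Minimal sketch: write each array element on its own line instead of passing
    # the whole array to a single WriteLine() call.
    foreach ($line in $Export_App_Pool) { $sw.WriteLine($line) }
    foreach ($line in $Export_IIS_Site) { $sw.WriteLine($line) }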
