STMS change systems in the request

Hello,
Due to an administrative process, I have released transport requests in the Development system with the wrong target system.
I now need to import the released requests into the correct target system. How can I do this?
It is not possible to change the target system of a released request.
Any idea how to transport the requests to the correct target system?
thanks in advance
Javier

Hello,
I have a pretty simple solution, provided both the incorrect and the correct target system belong to the same TMS domain. I am also assuming that the transports have not yet been imported into the incorrect target system.
In this scenario, go to the import queue of the incorrect target system (using STMS_IMPORT). Select the block of transport requests that need to be moved to the correct target. This can be done from the menu via Edit > Select Block.
After this, choose Request > Forward > System from the menu, enter the correct target system, and the transports will be forwarded to the import buffer of the correct target.
If, however, the transports have already been imported into the incorrect target system, go to the import history instead, do a block selection there as well, and forward those transports.
This way you can do the forwarding en masse in a single go.
Using the option Extras > Other Requests > Add you can add only one transport at a time, unless of course you use CATT like I do.
Regards.
Ruchit Khushu.

Similar Messages

  • 'System.Exception: The request failed with HTTP Status 404

    Hi
    BPC version we are using is : 5.0.502
    When we  Publish all reports using the steps below:
    a. Log into BPC Web
    b. Click on Available Interfaces> Select BPC for Administration
    c. Under Web Admin Task Click Publish Reports
    d. Select all of the reports and click the green check mark to publish the
    reports.
I am getting the following message: 'System.Exception: The request failed with HTTP Status 404: Not Found.' Below is the event log entry related to this error.
    Event Type:     Error
    Event Source:     OutlookSoft log
    Event Category:     None
    Event ID:     0
    Date:          12/8/2008
    Time:          11:26:42 AM
    User:          N/A
    Computer:     BPCAPP
    Description:
    ==============[System Error Tracing]==============
    [System  Name] : OSoftCPM
    [Message Type] : ErrorMessage
    [Job Name]     : AuditMgrService/SetAuditReportFiles
    [DateTime]     : 12/8/2008 11:26:42 AM
    [UserId]       : MMPlanning
    [Exception]
        DetailMsg  : {The request failed with HTTP status 404: Not Found.}
    ==============[System Error Tracing  End ]==============
We tried 'Publish all reports' as the resolution for the error 'object variable or with block variable not set' on modifying an application, as per SAP Note 1131320.
    If anyone knows the resolution for the above issue, please help me.
    Thanks in advance
    Sajeev Abraham

    Hi,
This forum is for BI Integrated Planning and BW/SEM BPS. For BPC-related questions, please post this message in the Enterprise Performance Management (SAP EPM) forum.
    Best regards,
    Gerd Schoeffl,
    SAP NetWeaver RIG BI

  • Unhandled Exception: System.TimeoutException: The request channel timed out while waiting for a reply after 00:01:59.9139778.Increase the timeout value passed to the call to Request or increase the SendTimeout value on the Binding.

    Hi, 
I created a simple plugin, and since I wanted to use early binding I added the Xrm.cs file to my solution. After I tried registering the plugin (using the Plugin Registration Tool), the plugin does not get registered and I get the exception below.
    Unhandled Exception: System.TimeoutException: The request channel timed out while waiting for a reply after 00:01:59.9139778. Increase the timeout value passed to the call to Request or increase the SendTimeout value on the Binding. The time allotted to this
    operation may have been a portion of a longer timeout.
    Server stack trace: 
       at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Channels.SecurityChannelFactory`1.SecurityRequestChannel.Request(Message message, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
       at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
       at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
    Exception rethrown at [0]: 
       at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
       at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
       at Microsoft.Xrm.Sdk.IOrganizationService.Update(Entity entity)
       at Microsoft.Xrm.Sdk.Client.OrganizationServiceProxy.UpdateCore(Entity entity)
       at Microsoft.Crm.Tools.PluginRegistration.RegistrationHelper.UpdateAssembly(CrmOrganization org, String pathToAssembly, CrmPluginAssembly assembly, PluginType[] type)
       at Microsoft.Crm.Tools.PluginRegistration.PluginRegistrationForm.btnRegister_Click(Object sender, EventArgs e)
    Inner Exception: System.TimeoutException: The HTTP request to 'https://demoorg172.api.crm.dynamics.com/XRMServices/2011/Organization.svc' has exceeded the allotted timeout of 00:01:59.9430000. The time allotted to this operation may have been a portion of a
    longer timeout.
       at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException webException, HttpWebRequest request, HttpAbortReason abortReason)
       at System.ServiceModel.Channels.HttpChannelFactory`1.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
       at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
    Inner Exception: System.Net.WebException: The operation has timed out
       at System.Net.HttpWebRequest.GetResponse()
       at System.ServiceModel.Channels.HttpChannelFactory`1.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
And to my surprise, after I removed the Xrm.cs file from my solution, the plugin got registered!
I don't understand what exactly the issue is.
    Any Suggestions are highly appreciated.
    Thanks,
    Shradha
      

Hello Shradha,
This is a very strange issue, and it basically occurs because of the large size of your early-bound class combined with a slow internet connection.
I would strongly recommend reducing the size of your early-bound class before registering. By default, the early-bound class is generated for all entities present in CRM (system entities as well as custom entities). Such early-bound classes take a long time to register on the server, and hence the timeout exception occurs.
There is a standard way to reduce the size of the early-bound class. Follow the steps below to get rid of the oversized early-bound class.
Create a new C# class library project in Visual Studio called SvcUtilFilter.
In the project, add references to the following:
- CrmSvcUtil.exe (from the SDK) - this exe has the interface we will implement.
- Microsoft.Xrm.Sdk.dll (found in the CRM SDK).
- System.Runtime.Serialization.
Add the following class to the project:
using System;
using System.Collections.Generic;
using System.Xml.Linq;
using Microsoft.Crm.Services.Utility;
using Microsoft.Xrm.Sdk.Metadata;

namespace SvcUtilFilter
{
    /// <summary>
    /// CodeWriterFilter for CrmSvcUtil that reads a list of entities from an XML file to
    /// determine whether or not the entity class should be generated.
    /// </summary>
    public class CodeWriterFilter : ICodeWriterFilterService
    {
        // list of entity names to generate classes for.
        private HashSet<string> _validEntities = new HashSet<string>();

        // reference to the default service.
        private ICodeWriterFilterService _defaultService = null;

        /// <summary>
        /// constructor
        /// </summary>
        /// <param name="defaultService">default implementation</param>
        public CodeWriterFilter(ICodeWriterFilterService defaultService)
        {
            this._defaultService = defaultService;
            LoadFilterData();
        }

        /// <summary>
        /// loads the entity filter data from the filter.xml file
        /// </summary>
        private void LoadFilterData()
        {
            XElement xml = XElement.Load("filter.xml");
            XElement entitiesElement = xml.Element("entities");
            foreach (XElement entityElement in entitiesElement.Elements("entity"))
            {
                _validEntities.Add(entityElement.Value.ToLowerInvariant());
            }
        }

        /// <summary>
        /// Uses the filter entity list to determine if the entity class should be generated.
        /// </summary>
        public bool GenerateEntity(EntityMetadata entityMetadata, IServiceProvider services)
        {
            return _validEntities.Contains(entityMetadata.LogicalName.ToLowerInvariant());
        }

        // All other methods just use the default implementation:
        public bool GenerateAttribute(AttributeMetadata attributeMetadata, IServiceProvider services)
        {
            return _defaultService.GenerateAttribute(attributeMetadata, services);
        }

        public bool GenerateOption(OptionMetadata optionMetadata, IServiceProvider services)
        {
            return _defaultService.GenerateOption(optionMetadata, services);
        }

        public bool GenerateOptionSet(OptionSetMetadataBase optionSetMetadata, IServiceProvider services)
        {
            return _defaultService.GenerateOptionSet(optionSetMetadata, services);
        }

        public bool GenerateRelationship(RelationshipMetadataBase relationshipMetadata, EntityMetadata otherEntityMetadata, IServiceProvider services)
        {
            return _defaultService.GenerateRelationship(relationshipMetadata, otherEntityMetadata, services);
        }

        public bool GenerateServiceContext(IServiceProvider services)
        {
            return _defaultService.GenerateServiceContext(services);
        }
    }
}
This class implements the ICodeWriterFilterService interface. This interface is used by the code generation
utility to determine which entities, attributes, etc. should actually be generated. The interface is very simple and just has seven methods that are passed metadata info and return a boolean indicating whether or not the metadata should be included
in the generated code file.
    For now I just want to be able to determine which entities are generated, so in the constructor I read from an XML
    file (filter.xml) that holds the list of entities to generate and put the list in a Hashset.  The format of the xml is this:
    <filter>
      <entities>
        <entity>systemuser</entity>
        <entity>team</entity>
        <entity>role</entity>
        <entity>businessunit</entity>
      </entities>
    </filter>
    Take a look at the methods in the class. In the GenerateEntity method, we can simply check the EntityMetadata parameter
    against our list of valid entities and return true if it's an entity that we want to generate.
    For all of the other methods we want to just do whatever the default implementation of the utility is.  Notice
    how the constructor of the class accepts a defaultService parameter.  We can just save a reference to this default service and use it whenever we want to stick with the default behavior.  All of the other methods in the class just call the default
    service.
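As a quick, self-contained illustration of the filtering logic described above (a sketch only: the filter XML is inlined as a string here rather than loaded from filter.xml, and there is no dependency on CrmSvcUtil or the CRM SDK), the whitelist check boils down to:

```csharp
using System;
using System.Collections.Generic;
using System.Xml.Linq;

class FilterDemo
{
    // Build the entity whitelist from the filter XML (inlined for this demo).
    static HashSet<string> LoadFilter(string xml)
    {
        var valid = new HashSet<string>();
        foreach (XElement e in XElement.Parse(xml).Element("entities").Elements("entity"))
            valid.Add(e.Value.ToLowerInvariant());
        return valid;
    }

    // Mirrors CodeWriterFilter.GenerateEntity: generate only whitelisted entities.
    static bool GenerateEntity(HashSet<string> valid, string logicalName)
    {
        return valid.Contains(logicalName.ToLowerInvariant());
    }

    static void Main()
    {
        string filterXml =
            "<filter><entities>" +
            "<entity>systemuser</entity><entity>team</entity>" +
            "<entity>role</entity><entity>businessunit</entity>" +
            "</entities></filter>";

        var valid = LoadFilter(filterXml);
        Console.WriteLine(GenerateEntity(valid, "SystemUser")); // True
        Console.WriteLine(GenerateEntity(valid, "contact"));    // False
    }
}
```

The real CodeWriterFilter does exactly this membership test inside GenerateEntity, while every other Generate* call falls through to the default service.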
    To use our extension when running the utility, we just have to make sure the compiled DLL and the filter.xml file
    are in the same folder as CrmSvcUtil.exe, and set the /codewriterfilter command-line argument when running the utility (as described in the SDK):
    crmsvcutil.exe /url:http://<server>/<org>/XrmServices/2011/Organization.svc /out:sdk.cs  /namespace:<namespace> /codewriterfilter:SvcUtilFilter.CodeWriterFilter,SvcUtilFilter
    /username:[email protected] /password:xxxx
That's it! You now have a generated sdk.cs file that is only a few hundred kilobytes instead of 5 MB.
One final note: there is actually a lot more you can do with extensions to the code generation utility.
For example, if you return true in the GenerateOptionSet method, it will actually generate enums for each CRM picklist (which it doesn't normally do by default).
Also, the source code for this SvcUtilFilter example can be found here.
Use at your own risk, no warranties, etc. etc.
Please mark as an answer if this post is useful to you.

  • Permissions Change on Folder - "the requested security information is either unavailable or can't be displayed"

I am getting this message on some folders' Security tab, and I cannot delete the folder.
I can't see the owner. The Security tab displays "The requested security information is either unavailable or can't be displayed." I've tried to grant access via the command line with no success.
    Regards,
    Kanna..


  • Event Log Error: Microsoft.ResourceManagement: System.Data: System.InvalidOperationException: The requested operation cannot be completed because the connection has been broken. at System.Data.SqlClient.SqlInternalConnectionTds.ExecuteTransaction

    Has anyone ever seen this? Any Clues?

I had set up a clean environment some weeks ago, and just today I started configuring MAs. When I got to the FIM MA I wanted to do an import, sync and export, and when I got to the export it seemed to run endlessly. In the event log I got the same errors as you did.
I didn't find any real hints in SQL or in the FIM request history... After doing a good old reboot it just started working...
    http://setspn.blogspot.com

  • The request has not yet been updated into the data targets.

    Request successfully loaded into PSA - start further update
    Diagnosis
    The request has as data target "only PSA". It arrived there successfully.
    System Response
    The request has not yet been updated into the data targets.
    Procedure
    You can now trigger the update manually. The final target is determined in the request tables. You do not have to make any further settings.
I am getting this error daily. In the RSMO screen, on the Status tab, there is an option called "Process manually", so I am doing this manually every day. Is there any solution to fix this problem permanently?
    Thanks & Regards,
    Praveen Yagnamurthy.

    Hi Praveen,
Don't you have process chains in the Production system for loading this data? If so, it's better to schedule the process chain rather than go for manual loading. In the Production system you will not be able to make changes. You will have to make the changes in the Development environment, collect them in a transport request (TR), and move them into the Test environment first to make sure everything is working fine. Once it is tested, you can move the TR to the Production system.
    Hope it helps!
    Regards,
    Pavan

  • The request failed with HTTP status 404: Not Found

(The question body is identical to the 'System.Exception: The request failed with HTTP Status 404' thread above: BPC 5.0.502 returns HTTP 404: Not Found when publishing all reports via BPC Web Admin.)
Thanks in advance
Sajeev Abraham

Hi
Thanks for the response.
Below are the full contents of the error message pop-up:
System.Exception: The request failed with HTTP status 404: Not Found
   at Microsoft.VisualBasic.CompilerServices.LateBinding.LateGet(Object o, Type objType, String name, Object[] args, String[] paramnames, Boolean[] CopyBack)
   at OSoft.Services.WebServices.ReportManageProxy.ReportManageProxy.SetPublishReport(String strAppSet, String strApp, String strFilter)
The URL is given below:
http://<HostName>/osoft/Admin/WebAdminMain.aspx
The BPC deployment is multiple-server (two servers): database and application are deployed separately.
The SQL Reporting server is working; I could connect using SQL Management Studio.
Thanks

  • The requested Performance Counter is not a custom counter...

    Hello All,
    I'm very new to both Oracle & .NET/IIS, and I am having trouble configuring one of my five IIS6/Win2003 servers. I have successfully deployed the same code on three other servers, each with the same ODAC (11.1.0.5.10 Beta) installation. Using sqlplus, I've been able to validate the connection to my db. There is even code using an older Oracle driver in part of the codebase that connects and works fine with the db, but I can't get my ODAC-reliant code to work.
    When I try to access the code that uses ODAC, I get the error below.
    Thanks in advance for any clues at all!
    - Jason
    The requested Performance Counter is not a custom counter, it has to be initialized as ReadOnly.
    Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
    Exception Details: System.InvalidOperationException: The requested Performance Counter is not a custom counter, it has to be initialized as ReadOnly.
    Source Error:
    Line 7: // Code that runs on application startup
    Line 8:
    Line 9: DashboardBusiness.WorkflowHelper.Init();
    Line 10: }
    Line 11:
    Source File: d:\Inetpub\hhh.net\shared\iHHH\Dashboard\Global.asax Line: 9
    Stack Trace:
    [InvalidOperationException: The requested Performance Counter is not a custom counter, it has to be initialized as ReadOnly.]
    System.Diagnostics.PerformanceCounter.Initialize() +1232
    System.Diagnostics.PerformanceCounter..ctor(String categoryName, String counterName, String instanceName, Boolean readOnly) +110
    System.Workflow.Runtime.PerformanceCounterManager.CreateCounters(String name) +70
    System.Workflow.Runtime.Hosting.ManualWorkflowSchedulerService.OnStarted() +220
    System.Workflow.Runtime.Hosting.WorkflowRuntimeService.HandleStarted(Object source, WorkflowRuntimeEventArgs e) +11
    System.EventHandler`1.Invoke(Object sender, TEventArgs e) +0
    System.Workflow.Runtime.WorkflowRuntime.StartRuntime() +835
    DashboardBusiness.WorkflowHelper.Init() +95
    ASP.global_asax.Application_Start(Object sender, EventArgs e) in d:\Inetpub\hhh.net\shared\iHHH\Dashboard\Global.asax:9
    Version Information: Microsoft .NET Framework Version:2.0.50727.1378; ASP.NET Version:2.0.50727.1378

    I've never seen this error in the context of trying to connect via ODP.NET or ODT.
    If you have Oracle Dev Tools for Visual Studio installed, you can try to create a connection to the database using Server Explorer. If this works, that means your ODP.NET connection is just fine.
    If you don't have VS or the tools, another thing you can try is connecting through ODP.NET using a simple OracleConnection.Open. Make sure you have your client set up to connect to the DB server.
    Here's some info about connecting to the DB:
    http://download.oracle.com/docs/html/B28089_01/featConnecting.htm#sthref101

  • The request was aborted: The request was canceled.

I am trying to upload some files through C# on Azure, but I am getting the error below on one particular system.
The request was aborted: The request was canceled.
    I ran fiddler and get the following response.
    <?xml version="1.0" encoding="UTF-8"?><request protocol="3.0" version="1.3.23.0" ismachine="1" sessionid="{7D4FD5D2-27F8-45D9-9CB9-94995712087B}" installsource="core" testsource="auto"
    requestid="{8D9DECDD-4920-4AE4-B5FD-B2D9DBC56329}"><os platform="win" version="6.1" sp="Service Pack 1" arch="x86"/><app appid="{22BC5D71-241A-4198-94D2-4A45297A4910}" version="1.3.23.0"
    nextversion="" lang="" brand="GGLS" client="" installage="251"><updatecheck/><ping r="-1"/></app><app appid="{BD53C92C-0EDE-40B1-9E29-DA95D1F72764}" version="1.0.0.1"
    nextversion="" lang="" brand="" client=""><updatecheck/><ping r="-1"/></app></request>
    HTTP/1.1 504 Fiddler - Receive Failure
    Date: Mon, 29 Dec 2014 14:13:49 GMT
    Content-Type: text/html; charset=UTF-8
    Connection: close
    Cache-Control: no-cache, must-revalidate
    Timestamp: 19:43:49.762
    [Fiddler] ReadResponse() failed: The server did not return a complete response for this request. Server returned 0 bytes.         
    K K Sanghi

try
{
    StorageCredentials sc = new StorageCredentials(storage, accessKey);
    CloudStorageAccount storageAccount = new CloudStorageAccount(sc, true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer blobContainer = blobClient.GetContainerReference(container);
    CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(targetFileName);
    using (var fileStream = System.IO.File.OpenRead(localPathwithFileName))
    {
        blockBlob.UploadFromStream(fileStream);
    }
}
catch (Exception ex)
{
    // rethrow, preserving the original stack trace
    throw;
}
K K Sanghi

  • Checking dates for the request

    Hello,
    Can anybody will help me to resolve this issue.
I have a scenario where, for the current month, we are loading data into an ODS via delta. I would like to check whether the system already has a request for the current selection date; if so, that previous request needs to be deleted from the ODS and the current request loaded into the ODS as a delta.
    thanks

    Hi,
You have various options in the "Automatic loading of similar/identical requests" column in the "Data Targets" tab of the InfoPackage.
If none of the available options suits you, you can also write code in the routine on the same screen.
    Thanks.

  • STMS: how to hide the Import All Requests button in the import queue view?

    Hi All,
    I'd like to know how to hide the "Import all requests" button in the import queue view.
    Thanks a lot for your answers.
    G.

    Hi,
I have my doubts about hiding it, but you can deactivate it with the following procedure,
as referred to in a thread:
On the domain controller, go to STMS > Overview > Systems, double-click the system you want to change, select the Transport Tool tab and click Change. Add Import_single_only with value 1, import_single_strategy with value 1 and no_import_all with value 1. Save, distribute and activate.
    or
    Refer to OSS NOTE 194000.
    Thanks & regards.

  • Schedule lines are not modified when i change the requested delivery date

    Hi  
I have an issue: when I create a sales order with requested delivery date 10.11.2011,
the same date is copied into the item schedule lines as 10.11.2011.
Before saving the order I change the requested delivery date to 11.11.2011,
but this new requested delivery date is not copied to the schedule lines at item level (they still have 10.11.2011).
Can anyone tell me whether this is standard system behaviour, or can we correct this?
    Thanks and Regards
    Kishore

    Hi DevarapalliKK,
I tried this solution and the problem is solved.
--Go to VA01 and enter the material and quantity.
--Select the line item in VA01 and click Edit.
--Select Fast Change of..., click Delivery date, enter the delivery date 11.11.2011 and click Copy.
--Check the requested delivery date at header level in the Sales tab: it will change to 11.11.2011. Also check the schedule lines at item level: the date will be 11.11.2011. Then save.
Please let me know if your problem is solved
    Please let me know if your problem is solved
    Regards
    Pradeep

  • Change the status of the request from Released to un released.

    Dear all
Some requests were created with no target system and were released.
These requests are to be transported to our Production system.
Is there any other way to transport these requests?
Things I tried:
1. I tried to find the files in the cofiles and data files directories but could not find them in the trans folder.
2. I tried to add the objects to new requests but faced a problem while adding the keys to the request.
    Regards
    Mahesh

    Hi,
You have created the requests as local requests: since you haven't specified a target system, they are treated as local requests, so no cofiles and data files are created.
You can use the Transport of Copies option and check. You actually can't change a request from Released to Unreleased.
Otherwise you have to create a new request; that's the only way if you are not able to copy the objects to a new request or if the transport of copies doesn't work.
    -Srinivas.korva

  • Insufficient system resources exist to complete the requested service

    [I did intend to start this post with a screenshot of the above error when I initiate the transfer from Windows Explorer, but apparently 'Body text cannot contain images or links until we are able to verify your account.' so I will just have to do some typing,
    viz the error dialog says:
     'An unexpected error is keeping you from copying the file. If you continue to receive this error, you can use the error code to search for help with this problem.
    Error 0x800705AA: Insufficient system resources exist to complete the requested service.'
    I get this error pretty much 100% of the time from one particular PC when trying to copy a folder of 10 2GB files to a server with both mirror and parity storage spaces.
    I recently purchased a Thecus W5000 running Windows Storage Server 2012 R2 Essentials. Absent any guidance either way I decided to set up a storage pool across the three 3TB WD Red drives that I have installed in it and to allocate 1.5TB of that space to a
mirror storage space and the remainder to a parity storage space. Having read some fairly dire things about storage spaces, but wanting the resilience provided by those two types of storage space, I decided to run some benchmarking tests before finalising anything.
    To that end I only went as far through the Essentials setup as creating a handful of user accounts before setting up the storage spaces and sharing both of them, with all authenticated users permitted full control. My benchmarking consists of a Take Command
    batch file timing three large directory copies - one with 10 2GB files, one with 10240 10K files and another with a multi-level directory with a variety of files of differing sizes. The first two are completely artificial and the latter is a real world example
    but all are roughly 20GB total size.
    To test various aspects of this I copied the three structures to and then from a partition created on the internal disk (the W5000 has a 500GB SSHD) and to the two storage space partitions. I also created a version of the batch file for use internally which
    did something similar between the internal disk and the two storage space partitions, and another as a control that tested the same process between the two Windows PCs. The internal test ran to successful completion, as did the PC to PC copy and the external
    one from my Windows 8.1 64-bit system (i5 3570K, 16GB RAM, 1TB HD) but when I ran it from my Windows 7 Pro 64-bit gaming rig (i7 2600K, 8GB RAM, 1TB HD) I got a number of failures with this error from Take Command:
    TCC: (Sys) C:\Program Files\bat\thecus_test_pass.btm [31] Insufficient system resources exist to complete the requested service.
    (where line 31 of that batch file is a copy command from local D: to the parity space on the Thecus).
    The error occurs only when copying large files (the 2GB ones already mentioned but some of those in the real world structure that are about 750MB in size) from the Win7 system to the Thecus and only when doing so to the storage space volumes - ie. copying to
    the internal disk works fine, copying from all volumes works fine, copying internally within the Thecus works fine, copying between the Win8 and Win7 machines works fine and initiating the copy as a pull from the server between the same two disks also works
    fine. One aspect of this that surprised me somewhat was just how quickly the copy fails when initiated from Windows Explorer - checking out the details section of the copy dialog I see roughly ten seconds of setting up and then within five seconds after the
    first file transfer is shown as starting the error dialog pops up (as per the image no longer at the top of this post).
    There are no entries in the event log on either machine related to this error and I've had the System Information window of the Sysinternals Process Explorer up and running on both machines whilst testing this, and it shows nothing surprising on either side.
    I've also run with an xperf base active and I can't see anything pertinent in the output from either system.
    Frankly, I am at a loss and have no idea what other troubleshooting steps I should try. The vast majority of the existing advice for this error message seems to relate to Windows 2003 and memory pools - which both the fact that this works from one PC but not
    the other and the SysInfo/xperf output seems to suggest is not the issue. The other thing I've seen mentioned is IRPStackSize, but again if that was the problem I would expect the failure to occur where ever I initiated the large file transfer from.

If it works from the Win 8 box, it must be something in the Win 7 box?
    I'm going to answer this one first because much of the rest of this is not going to be pertinent to the problem at hand. I've been over and over this aspect whilst trying to think this issue through and you are right, except that it only happens when copying
    files to the Thecus and only then when the target is a ReFS partition on a mirror or parity storage space. So the best I can come up with is that it is most likely an issue on the Win7 box that is triggered by something that is happening on the server side,
    but even that is a bit of a stretch. This is why the lack of information from the error message bugs me so much - in order to debug a problem like this you need to know what resource has been exhausted and in which part of the software stack.
    Now that may not be easy to do in a generic way, and since programmers are inherently lazy it is tempting just to return a simple error value and be done with it. However, I've been in the position of doing just that in a commercial product and ended up
    having to go back and improve the error information when that particular message/code was tripped and I was expected to debug the problem! Obviously there is a significant difference between a Microsoft consumer product and a mainframe product that costs many
    times as much and comes with a built in maintenance fee, but the underlying requirement is the same - somebody needs to be able to solve the problem using the information returned. In this case that simply isn't possible.
    You spend your time testing file copies, where I devote most of my time to backup and restore
    I don't really want to be testing file copies - the initial intention was to benchmark the different storage space and file system combinations that I was intending to use but the error whilst doing so has spiralled into a cycle of testing and tweaking that
    really isn't achieving anything. My primary reason for having a NAS at all has always been backup. My current strategy for the two boxes participating in this testing involves having a local drive/partition to hold backups, running a daily incremental file
    copy to that partition which is then immediately copied to a NAS and backing that up with a regular (needs to be at least once per month to be totally secure) full image copy of the local disks that is also copied to the NAS afterwards (hence my fascination
    with copying large files).
    There is a weakness in that strategy because I've never been very good at performing that full image backup regularly enough, so one of the reasons for buying the W5000 was the possibility of making those backups automatic and driven from the server end.
    However, that takes the local backup drives out of the equation and leaves me with the need to backup the NAS, which I don't do with my existing unit because there are (nearly) always copies held elsewhere.
    The other reasons for going with the Thecus were a desire to back up the other machines in the household - I've always dreaded a hard drive failure on my wife's laptop but getting her to perform any kind of housekeeping is nigh on impossible - and also to provide
    a file server capability protected by a single set of userids (the existing NAS data is open to all household members). So my goal is backup and restore too ;)
    I meant a different nic on the beast (win 7)
    I should have realised that but obviously wasn't thinking straight. I don't have a spare gigabit NIC to hand (although perhaps even a megabit one might provide an interesting data point) although there is such a card in one of my other (less used) PCs that
    I could cannibalise for testing purposes. Another project for the coming weekend methinks.
    put some limits on it to keep the lawyers happy. 2gb ram, OS loaded on a drive, limit the # of Hard Drives
    That statement got me thinking, because I've never been able to find a definition anywhere of what the restrictions are with WSS 2012 R2 Essentials - if I bring up the software license terms on the box itself they are for 2012 Standard!? - and I wonder whether
    they'd stop me doing things like adding RAM or changing the processor.
    Even my buddies at wegotserved do not seem to have done any hands on reviews and they get "everything."
    The cynic in me wonders whether that is because Thecus know that they've just shovelled this onto a handful of existing boxes that barely meet the spec. and which simply aren't up to snuff as anything other than a box full of disks.  The Thecus boxes
    look like good value because they include the server OS (the unit cost me roughly 50% more than I could buy Windows Server 2012 R2 Essentials for) but if you can't realise that value then they are just an expensive NAS. 
    if perhaps the algorithms in the Seagate SSHD do not know ReFS?
    I haven't put a ReFS partition on the SSHD, only on the three 3TB WD Reds.
    I will ask my contacts at MS to take a look at this thread, but they stay so busy with v.next I don't know if they will spend many cycles on it
    Perhaps you could ask them if the next version of the OS could do a better job of identifying which resources have been exhausted, by what part of the stack and where in the maze of connectivity that makes up a modern computing environment?? {gd&r}
    Cheers, Steve

  • Changing the System in the desktoplaunch/InfoView Log in

    We are on BO XI SP3 (Windows) and recently installed the BO application on a new server.  We copied our old repository over.  So the "System" on the login screen shows our old server information and I would like to update this.  Can you tell me which config file this is stored in?

    Hello Karen,
      I believe this is what you're looking for.  If it's not, I'd recommend searching the SAP notes, as you might find something helpful there in resolving this issue.  The note is 1218472 - How to change the default CMS name on the logon page for the InfoView or CMC.
    Jorge
    Symptom
    How do you change the Central Management Server (CMS) name that appears on the logon page for either the InfoView or Central Management Console (CMC)?
    ====================
    NOTE:
    Changing the CMS name is necessary if you want to log on to a cluster (@clustername) and ensure fault-tolerance.
    ====================
    Resolution
    To change the CMS name that appears on the logon page for either the InfoView or Central Management Console (CMC), modify the following files:
    For Microsoft Visual Studio .NET, modify the web.config file.
    For Java, modify the web.xml file.
    Microsoft Visual Studio .NET
    To modify the CMS name on the logon page for the CMC, search for web.config in the C:\Program Files\Business Objects\BusinessObjects Enterprise 11\Web Content folder.
    To modify the CMS name on the logon page for the InfoView, search for web.config in the C:\Program Files\Business Objects\BusinessObjects Enterprise 11\Web Content\Enterprise11\InfoView folder.
    Change the following parameter in the web.config file to show a different CMS name on the CMC or InfoView logon page:
    <add key="connection.cms" value="@Clustername"/>
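    For context, that key lives in the appSettings section of web.config. A minimal sketch of the surrounding structure (other sections omitted; the value shown is a placeholder):

    ```xml
    <configuration>
      <appSettings>
        <!-- placeholder value: use your CMS host name, or @clustername for a cluster -->
        <add key="connection.cms" value="@Clustername"/>
      </appSettings>
    </configuration>
    ```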
    Java
    To modify the CMS name on the logon page for the CMC, search for web.xml in the C:\Program Files\Business Objects\Tomcat\webapps\businessobjects\WEB-INF folder.
    To modify the CMS name on the logon page for the InfoView, search for web.xml in the C:\Program Files\Business Objects\Tomcat\webapps\businessobjects\enterprise11\desktoplaunch\WEB-INF folder.
    Change the following parameter in the web.xml file to show a different CMS name at the CMC or InfoView logon page:
    <param-value>vca-w3s-be11:6400</param-value>
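    In web.xml that value sits inside a context-param element. A hedged sketch of the surrounding structure follows - the param-name shown here is illustrative only; keep whatever name your existing web.xml already uses and change just the value:

    ```xml
    <!-- sketch only: retain the existing param-name from your web.xml;
         replace the value with your CMS machine name and port -->
    <context-param>
      <param-name>cms.default</param-name>
      <param-value>myCmsHost:6400</param-value>
    </context-param>
    ```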
    ====================
    NOTE:
    If the Java Performance Manager is installed, you will need to include the appropriate CMS name in the following file:
    \Program Files\Business Objects\Performance Management 11.5\InitConfig.properties
    Add the CMS name in the following line:
    initialization.CMSName=
    ====================
    Additional Information for Java
    You cannot use the cluster name for Java. You can only change the explicit machine name to another cluster member. The cluster name (@clustername) is only for use with COM and .NET SDKs since name resolution with the cluster name relies on a lookup in the Windows registry. However, Java application servers do not use the Windows registry.
    In addition, the Java SDK variable that holds the list of CMSs is an internal variable that is not exposed to the end user.
    Fault-tolerance and load-balancing with Java is achieved after a direct connection to one of the cluster members has been made. This loads the cluster member list, and the Java Virtual Machine uses a round-robin principle to handle requests.
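    To make the round-robin idea above concrete, here is a minimal Java sketch. This is illustrative only - the BusinessObjects Java SDK keeps its cluster member list internally and does not expose it; the class and method names here are invented for clarity:

    ```java
    import java.util.List;
    import java.util.concurrent.atomic.AtomicInteger;

    // Hypothetical helper showing round-robin selection over cluster members.
    class RoundRobinPicker {
        private final List<String> members;               // e.g. "host:6400" entries
        private final AtomicInteger next = new AtomicInteger(0);

        RoundRobinPicker(List<String> members) {
            this.members = members;
        }

        // Each call returns the next cluster member in turn, wrapping around.
        String pick() {
            int i = Math.floorMod(next.getAndIncrement(), members.size());
            return members.get(i);
        }
    }
    ```

    The wrap-around via Math.floorMod is what spreads successive requests evenly across the members once the initial connection has loaded the list.
    
    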
    Keywords
    DEFAULT SERVER NAME .NET JAVA CHANGE CLUSTER DOTNET CRYSTAL MANAGEMENT CONSOLE BusinessObjects Enterprise CMC and InfoView Logon page , c2017197
