Adjusting queuestatistics.vbs timeout in MSMQ management pack

Hi all,
I'm running SCOM 2012 R2 and my question is specific to the MSMQ 6.0.6700.88 management pack.  I'm seeing a few alerts titled
"Operations Manager failed to start a process", with the alert details saying "Forced to terminate the following process started at x:xx:xx PM because it ran past the configured timeout 300 seconds"
Command executed: "C:\Windows\system32\cscript.exe" /nologo "queuestatistics.vbs" server.domain.local false
One or more workflows were affected by this.
Workflow name: many
Instance name: many
Instance ID: many
etc etc - we've all seen these errors before.
When I run the script manually on the server as Local System, the script runs fine, but it takes roughly 360 seconds (six minutes) to finish executing.
I want to adjust the timeout threshold for this script. I understand there is no override for this particular timeout available via the SCOM console, so I need to edit the management pack itself.
I've exported the management pack and found the .vbs script contained within it; however, I'm looking for some guidance on which XML field(s) I should change to adjust the timeout. Searching through the management pack I've found several areas that might be what I'm after.
I'm really just after someone to point me at the right DataSource or Rule ID to edit, rather than taking a guess and reimporting a busted management pack that I then need to clean up.
Cheers,
Noel.
http://www.dreamension.net

Thanks Michael! 
That didn't quite work out for me, but I eventually got it working.  At first I changed the <TimeoutSeconds> value under the script as per your diagram, and after reimporting the management pack and flushing the health service cache for good measure, the script still failed with a 300-second timeout.
However, I started looking through the rules in more detail and found some rules that reference a data source with "Queue.Statistics" in the name.  I figured that's a close enough match to the VB script that's also running.  Here is a screen grab...
There were several rules and data sources that had "Microsoft.MSMQ.2008R2.DataSource.Queue.Statistics" in the name, and I changed the "TimeoutSeconds" value for all of them to 360 seconds.  I then reimported the management pack again, and now the script is completing.
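For anyone else chasing this, the edit ends up looking roughly like the fragment below in the exported XML. Only the data source TypeID ("...Queue.Statistics") and the TimeoutSeconds change are taken from this thread; the rule ID, target, alias, interval and write action are illustrative placeholders, so match them against your own export rather than copying them verbatim.

    <!-- Illustrative sketch only: IDs, target, aliases, interval and write action are assumptions -->
    <Rule ID="Microsoft.MSMQ.2008R2.Queue.Statistics.Collection.Rule" Target="MSMQ!Microsoft.MSMQ.2008R2.Queue" Enabled="true">
      <Category>PerformanceCollection</Category>
      <DataSources>
        <DataSource ID="DS" TypeID="Microsoft.MSMQ.2008R2.DataSource.Queue.Statistics">
          <IntervalSeconds>900</IntervalSeconds>
          <!-- was 300; raised to 360 so the roughly 360-second script run can finish -->
          <TimeoutSeconds>360</TimeoutSeconds>
        </DataSource>
      </DataSources>
      <WriteActions>
        <WriteAction ID="WriteToDB" TypeID="SC!Microsoft.SystemCenter.CollectPerformanceData" />
      </WriteActions>
    </Rule>

The same TimeoutSeconds element appears in every rule and data source that uses the Queue.Statistics module, which is why it has to be changed in all of them.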
Thanks very much for your assistance with this and giving me a nudge in the right direction.
http://www.dreamension.net

Similar Messages

  • Cisco UCS Management Pack for System Center Operations Manager 2007 R2 Login Error

    I have Cisco UCS 1.4 (3q) and just installed the MP v2.1 for Cisco UCS; however, when I click on the "Management Pack Events" alert view in the SCOM console, there are errors for event number 19900.  The description of the error is as follows:
    Description:
    CISCO UCS R2 MP (https://10.x.x.x/nuova) [CISCO.UCS.R2.Proxy.LoadCache.ProbeAction.vbs] : CISCO UCS Login Error: Authentication failed (Code: 551)
    I noticed that it's looking for the /nuova directory, but in order to login to our UCS environment, we usually just go to the IP address - so I'm wondering if that has something to do with it?  I've verified that when going directly to the IP from Internet Explorer I can login using the opsmgr credentials we specifically created for this account.  I've also gone through the process of creating a run-as account on the SCOM side and associated it with the correct profiles.
    Any help is greatly appreciated.

    Troubleshooting found that it was user error; the wrong password had been configured.

  • Issue with custom management pack

    I've created a form and class for the service request.
    I can open the form and it works as designed, but when I click OK, I get this error.
    System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
       at System.ThrowHelper.ThrowKeyNotFoundException()
       at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
       at Microsoft.EnterpriseManagement.GenericForm.ContentPanel.VerifyRequiredValues(List`1& errorMessages)
       at Microsoft.EnterpriseManagement.GenericForm.ContentPanel.OnPreviewSubmit(Object sender, PreviewFormCommandEventArgs e)
       at Microsoft.EnterpriseManagement.GenericForm.FormChanger.OnPreviewSubmit(Object sender, PreviewFormCommandEventArgs e)
       at Microsoft.EnterpriseManagement.UI.FormsInfra.PreviewFormCommandEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
       at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
       at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
       at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
       at Microsoft.EnterpriseManagement.UI.FormsInfra.FormView.RaiseFormEvent(RoutedEvent formEvent, RoutedEventArgs eventArgs)
       at Microsoft.EnterpriseManagement.UI.FormsInfra.FormViewController.OnPreviewSubmit()
       at Microsoft.EnterpriseManagement.UI.FormsInfra.FormViewController.ExecuteSubmitInternal(Boolean async)
       at System.Windows.Input.CommandBinding.OnExecuted(Object sender, ExecutedRoutedEventArgs e)
       at System.Windows.Input.CommandManager.ExecuteCommandBinding(Object sender, ExecutedRoutedEventArgs e, CommandBinding commandBinding)
       at System.Windows.Input.CommandManager.FindCommandBinding(CommandBindingCollection commandBindings, Object sender, RoutedEventArgs e, ICommand command, Boolean execute)
       at System.Windows.Input.CommandManager.FindCommandBinding(Object sender, RoutedEventArgs e, ICommand command, Boolean execute)
       at System.Windows.Input.CommandManager.OnExecuted(Object sender, ExecutedRoutedEventArgs e)
       at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
       at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
       at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
       at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted)
       at System.Windows.Input.RoutedCommand.ExecuteImpl(Object parameter, IInputElement target, Boolean userInitiated)
       at MS.Internal.Commands.CommandHelpers.CriticalExecuteCommandSource(ICommandSource commandSource, Boolean userInitiated)
       at System.Windows.Controls.Button.OnClick()
       at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e)
       at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
       at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
       at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent)
       at System.Windows.UIElement.OnMouseUpThunk(Object sender, MouseButtonEventArgs e)
       at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
       at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
       at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
       at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted)
       at System.Windows.Input.InputManager.ProcessStagingArea()
       at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input)
       at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport)
       at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel)
       at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
       at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
       at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
       at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
       at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Boolean isSingleParameter)
       at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Boolean isSingleParameter, Delegate catchHandler)
       at System.Windows.Threading.Dispatcher.InvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Boolean isSingleParameter)
       at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
       at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
       at System.Windows.Threading.Dispatcher.TranslateAndDispatchMessage(MSG& msg)
       at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
       at System.Windows.Window.ShowHelper(Object booleanBox)
       at System.Windows.Window.Show()
       at System.Windows.Window.ShowDialog()
       at Microsoft.EnterpriseManagement.ConsoleFramework.WindowManager.GenericWpfWindowConstructor.BeginShow(ShowViewContext showViewContext, Object parent, Object view, AsyncCallback callback, Object state)
       at Microsoft.EnterpriseManagement.ConsoleFramework.ViewConstructor.BeginShow(ShowViewContext showViewContext, AsyncCallback callback, Object state)
       at Microsoft.EnterpriseManagement.ConsoleFramework.WindowManager.WpfWindowRecord.ShowWindow()
       at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
       at System.Threading.ThreadHelper.ThreadStart()

    You know, the more I look at this, I think what happened is this: I followed these instructions up to the point of creating the runbooks
    http://blogs.technet.com/b/antoni/archive/2014/04/09/system-center-2012-service-manager-and-orchestrator-integration-example-walkthrough-start-to-finish-new-hire-provisioning-service-request.aspx#pi47623=3
    and I can't use the form because there are a few required fields on the general tab. Those instructions were written to do all the work with runbooks and request offerings, with the idea of never even using the form inside of Service Manager, so that's why it doesn't work in my case.
    So I guess the options for this are:
    1. Remove the field requirements on the general tab.
    2. Make the general tab and the other tabs work with this management pack/form.

  • What is the best way to write management pack modules?

    I have written many modules using PowerShell scripts, but when I deploy the management pack in SCOM it throws many errors saying the PowerShell script was dropped due to timeout.
    My MP has a lot of PowerShell scripts that get data from a service, and the scripts have to execute for each and every instance; the MP currently has around 265 instances.
    How can I improve the scripts?
    Do I need to use some other scripting language, like JavaScript or VBScript, in the management pack to execute the different modules?
    What is the best practice for writing modules?
    I have even used cookdown for multi-instance data gathering.
    Thanks & Regards, Suresh Gaddam
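    One pattern that usually helps with script-heavy MPs of this kind is making sure cookdown can apply: if every workflow hands the script data source an identical configuration, the agent runs the script once per interval per computer instead of once per instance. A rough sketch of the idea is below, assuming the Microsoft.Windows.TimedPowerShell.PropertyBagProvider module from the Windows library; the module and element names are written from memory, and the IDs, aliases, script name and parameters are made up, so treat this as a shape to compare against rather than something to paste in.

        <DataSourceModuleType ID="MyMP.DataSource.ServiceData" Accessibility="Internal">
          <Configuration>
            <xsd:element name="IntervalSeconds" type="xsd:integer" />
            <xsd:element name="TimeoutSeconds" type="xsd:integer" />
          </Configuration>
          <ModuleImplementation>
            <Composite>
              <MemberModules>
                <!-- One script run per computer: the script returns a property bag per instance,
                     and a downstream condition detection (not shown) filters out each instance's bag.
                     Passing per-instance values as parameters here would defeat cookdown. -->
                <DataSource ID="PSScript" TypeID="Windows!Microsoft.Windows.TimedPowerShell.PropertyBagProvider">
                  <IntervalSeconds>$Config/IntervalSeconds$</IntervalSeconds>
                  <SyncTime />
                  <ScriptName>GetServiceData.ps1</ScriptName>
                  <ScriptBody>$IncludeFileContent/GetServiceData.ps1$</ScriptBody>
                  <Parameters>
                    <Parameter>
                      <Name>ComputerName</Name>
                      <Value>$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Value>
                    </Parameter>
                  </Parameters>
                  <TimeoutSeconds>$Config/TimeoutSeconds$</TimeoutSeconds>
                </DataSource>
              </MemberModules>
              <Composition>
                <Node ID="PSScript" />
              </Composition>
            </Composite>
          </ModuleImplementation>
          <OutputType>System!System.PropertyBagData</OutputType>
        </DataSourceModuleType>

    If the 265 instances are spread across many computers, this alone can cut the number of script executions dramatically; if they all live on one computer, the script itself needs to collect everything in a single pass.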

    One thing you have not mentioned is how you are consuming the data after you save it.  Your solution should be compatible with whatever software you are using at both ends.
    Your data rate (40kS/s) is relatively slow.  You can achieve it using just about any format from ASCII, to raw binary and TDMS, provided you keep your file open and close operations out of the write loop.  I would recommend a producer/consumer architecture to decouple the data collection from the data writing.  This may not be necessary at the low rates you are using, but it is good practice and would enable you to scale to hardware limited speeds.
    TDMS was designed for logging and is a safe format (<fullDisclosure> I am a National Instruments employee </fullDisclosure> ).  If you are worried about power failures, you should flush it after every write operation, since TDMS can buffer data and write it in larger chunks to give better performance and smaller file sizes.  This will make it slower, but should not be an issue at your write speeds.  Make sure you read up on the use of TDMS and how and when it buffers data so you can make sure your implementation does what you would like it to do.
    If you have further questions, let us know.
    This account is no longer active. Contact ShadesOfGray for current posts and information.

  • UCS management pack for SCOM

    I've installed the UCS management pack for Microsoft SCOM 2007 R2. It appears to be working, however I now keep seeing all computers that have the SCOM agent installed trying to run the cisco.ucs.computer.probeaction.vbs script, which fails and generates alerts. Does anyone know why these servers that are unrelated to UCS would be trying to run these scripts, or how to fix this problem?

    The problem is that you have used an “&” character in the name of one of the protection groups.
    Change the name of the protection group where you used the “&” symbol, then restart the “System Center Management” service and you should see that the discovery succeeds.
    If you still have the same issue, re-register the relevant DLLs on the machine, in particular MOMScriptAPI.dll:
    http://scug.be/dieter/2012/11/08/scom-event-id-21406-file-name-or-class-not-found/
    Please remember, if you see a post that helped you please click "Vote As Helpful" and if it answered your question, please click "Mark As Answer"
    Mai Ali | My blog: Technical | Twitter: Mai Ali

  • SQL Management Pack Agent Job Unit Monitors

    I am trying to utilize the SQL Agent Job Monitors included with the SQL Management Pack (page 81 of the MP guide), particularly the "Last Run Status" monitor. I have set an Override for "Generate Alerts" to True. The goal
    is to have an email notification sent when any SQL Agent job fails on a specific group of computers. However, I am not familiar enough with what "Last Run Status" is actually monitoring and if it will do what I am looking for. Does this monitor
    check the job status of any agent job on the target group? I tried setting up a job for it to fail, which it does fail, but it does not generate an alert. Any suggestions appreciated. Thanks.

    1) For monitoring SQL server job status, you can refer to the following blog
    http://blogs.technet.com/b/kevinholman/archive/2011/08/05/how-to-monitor-sql-agent-jobs-using-the-sql-management-pack-and-opsmgr.aspx
    2) For troubleshooting: make sure that your SQL Agent job is discovered; you can find the discovered jobs under Microsoft SQL Server --> SQL Agent --> SQL Agent job status. By default, the discovery is disabled and you should enable it by using an override.
    Roger
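    For reference, when you enable that discovery for a group through the console, the override that lands in your unsealed MP looks roughly like the fragment below. The override ID, context group, alias and discovery ID here are illustrative rather than copied from the SQL management pack, so check the actual IDs in your exported override MP.

        <DiscoveryPropertyOverride ID="Override.EnableAgentJobDiscovery"
                                   Context="MyMP.Group.SqlServersToMonitorJobs"
                                   Enforced="false"
                                   Discovery="SQL!Microsoft.SQLServer.2008.AgentJobDiscoveryRule"
                                   Property="Enabled">
          <Value>true</Value>
        </DiscoveryPropertyOverride>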
    Thanks. So I enabled Agent Job discovery for the systems that I want. It discovered those objects. Now, I still cannot get the alerting to work. I can see the Health State of the job change to a Warning, indicating "Last Run Status" equals Failed, but
    it is not generating an alert. I set Overrides (Generates Alert = True) on a Group that included one of the job objects, nothing. I also set the same Override directly on the job object and I still do not receive an alert. Any ideas? I
    noticed someone in the comments mentioned in the blog you linked, a similar issue, except with the Job Duration monitor. However, he was able to get alerts to generate by setting the Override directly on the Job object and noted that it is not a
    feasible solution due to the number of jobs he wants to monitor. Again, I tried the same, but still could not get alerts. That solution would not be acceptable for me either because of the number of jobs that I would have to set individual Overrides for.

  • Error showing while adding a management pack to 2007R2 authoring console

    Hi everyone,
    While trying to add an unsealed management pack to the 2007 Authoring Console we are getting the error:
    XSD verification failed for management pack [line: 1, ...]
    System.Xml.Schema.XmlSchemaValidationException: The SchemaVersion attribute is not declared
    Any help would be appreciated.
    Regards,
    Nikhil

    The Authoring Console has not been updated and does not understand the OpsMgr 2007 R2 MP schema version. It can still be used to produce MPs that will work on OpsMgr 2007 R2, but you need to reference the old versions of the libraries.
    You can check this link; it will help you with your issue:
    http://blogs.inframon.com/post/2012/06/12/Choosing-the-correct-Management-Pack-Solution-with-Visual-Studio-Authoring-Extensions.aspx
    Please remember, if you see a post that helped you please click "Vote As Helpful" and if it answered your question, please click "Mark As Answer"

  • A note on Setup Manager - Application Change Management Pack

    Hi All,
    This is Mugunthan, development manager for iSetup and Setup Manager. I am happy to begin my note with the launch of Application Change Management Pack 3.1. For those who are not aware of Application Change Management Pack, it is a new product built on top of Oracle Enterprise Manager and comprises three major modules, namely Setup Manager, Customization Manager and Patch Manager. You can find more information about this pack here (LCM: Oracle Application Change Management Pack). Setup Manager is an enhanced and advanced version of iSetup on Oracle Enterprise Manager. There were good reasons to re-architect iSetup on Oracle Enterprise Manager as Setup Manager.
    Why Setup Manager?
    Ability to migrate setup data over point releases of EBS – EBS comes up with rollup patches on top of major releases periodically. This means that you have to certify the patch before it gets deployed to production, which in turn means that the setup data has to be certified again. iSetup cannot connect 12.0.4 to 12.0.6 or 12.0.6 to 12.1.1. We have achieved data migration for the above scenarios in Setup Manager (11i to R12 is not supported in Setup Manager).
    Projects – Setup Manager supports grouping of Extracts, Transforms, Loads and Reports as a single entity that can be deployed multiple times to multiple targets. A project consists of one or more tasks (of type Extract/Load/Transform/Report) and can be orchestrated according to your needs. Project execution supports scheduling, which means that you can use this feature to sync up setup data between two instances at periodic intervals. Projects can be shared with other users, which is a remarkable difference between iSetup and Setup Manager: you can share your projects with others in a combined development environment. The functional configuration data (Extracts) are stored in Oracle Enterprise Manager, which means that you can refresh EBS without losing Extracts.
    Integration with Change Approval mechanism – Execution of projects is integrated with change approval mechanism which means that you can control who executes what. Also, you have fine grain access control where you can control EBS targets assigned to a user.
    Offline Transformation – I would say this is the most unique feature of Setup Manager and as an implementer you would welcome this feature very much. iSetup has got very limited capability on Transformation. Here we have gone way ahead and support Transformation on almost all Setup Objects. We present you the extracted data in excel sheets. You can download the excel template and work offline. All the attribute value fields in the excel sheet are editable which means that you supply your own value. For example, you can download Operating Unit data in xls file and change operating unit “OU100” to “OU200” and upload it back to system and load to target instance. Please note that any attribute of Operating Unit setup data can be edited. Also, system has got intelligence and once you upload the excel sheet, it automatically changes all the inventory organization which belongs to “OU100” to “OU200”. This behavior can be controlled using attribute mapping which you can create in Setup Manager. Hold on, this is not the end, also you can add more operating unit in excel, say OU300 and OU400 and load it to target instance. You can delete few Operating Units from extracted data in excel sheet and load the remaining to target instance. This gives you complete flexibility to manage (add/edit/delete) functional configuration data offline. This is fantastic feature and please have a look at how it works.
    Advanced Filtering – Filter support in iSetup is not flexible: you cannot use comma-separated values or complex SQL join conditions, and you do not know what query is executed behind the scenes to get the data. All of these problems are addressed in Setup Manager 3.1. We show you the “select clause” (SQL) associated with an interface and provide a text field where you can refer to the select clause and provide your own custom where criteria (SQL). This means that filter support is unlimited and you can set a filter on any database column. For example, you can extract “Responsibilities/Menu” which are active or created as of yesterday, or extract all Operating Units excluding “OU100 and OU200”. We support all filter criteria that are supported by SQL.

    Hi,
    Your note is very interesting. Is there any other way to extract the configuration using SQL, without using Enterprise Manager? I hope this would be faster than using the GUI.
    Thanks,
    Fahmi Fahlevi

  • Management pack version difference in System Center - Operations Manager 2012 R2 - Help us.

    Hi,
    I have SC Operations Manager 2012 R2 in my environment, managed with multiple management servers.
    I have imported some unsealed MPs through Ops SDK code and manually using the Operations Manager console.
    I then tried to install higher versions of the MPs, upgrading them from version 1.1.0.0 to 6.0.0.0.
    Case 1:
    After successfully installing some MPs, the Operations Manager console shows the newly upgraded version (6.0.0.0).
    But when I check the version inside the MP that I imported, it shows 1.1.0.0 (the older version).
    So the imported MP has version 1.1.0.0 inside, while the console shows its version as 6.0.0.0.
    Why does it show different versions in the imported MP (inside the xml) and in the Operations Manager console view?
    case 2:
    After that I tried to import other MPs using the console, but it shows the error below:
    Verification failed with 1 errors:
    Error 1:
    Found error in 2|Monitoring.ActiveDirectory_server|6.0.0.0|Monitoring.ActiveDirectory_server|| with message:
    Could not load management pack [ID=Monitoring.Base, KeyToken=9e875220a3987e0b, Version=6.0.0.0]. The management pack was not found in the store.
    : Version mismatch. The management pack ([Monitoring.Base, 9e875220a3987e0b, 1.1.0.0]) requested from the database was version 6.0.0.0 but the actual version available is 1.1.0.0.
    This happened in a multiple-management-servers scenario, so could it be happening because of the multiple management servers? There are no issues with these same MPs when I try them on a single management server; they import and work fine in the single-management-server scenario.
    Help me to solve these 2 issues!!
    Thanks,
    satheesh
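    Regarding case 2, the verification error means the MP being imported declares a reference to Monitoring.Base version 6.0.0.0, while the store in that management group still only has 1.1.0.0. Taking the IDs and versions straight from your error message (only the alias below is made up), the manifest of the failing MP contains a reference along these lines, and the import can only succeed once Monitoring.Base 6.0.0.0 has actually been imported into that management group first:

        <Reference Alias="MonBase">
          <ID>Monitoring.Base</ID>
          <Version>6.0.0.0</Version>
          <PublicKeyToken>9e875220a3987e0b</PublicKeyToken>
        </Reference>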

    1) At first, you should make sure that the imported MP has the correct version: open the XML of the imported MP and check its current version.
    When you open the xml file of the imported MP, the version number is shown as:
    <Manifest>
        <Identity>
            <ID>Microsoft.SystemCenter.Visualization.Component.Library.Resources</ID>
            <Version>1.0.0.2</Version>
        </Identity>
        <Name>Resources library</Name>
        <References>
            <Reference Alias="System">
            <ID>System.Library</ID>
            <Version>1.0.0.1</Version>
    From the above sample, the imported MP version number is 1.0.0.2.
    2) Use the Operations console to import the management pack, and check in the console whether the version number has been updated to the new version (1.0.0.2 in this sample).
    3) Use the Operations console to export the MP that was imported in step 1), open it, and check the version number of the MP. Make sure that the version is 1.0.0.2.
    4) If the above procedure shows a proper match of the version numbers, re-import all problematic MPs.
    5) Finally, you may use the following SQL to check whether the correct version number of the MP has been stored in the DB successfully:
    select version, friendlyname from managementpackview where friendlyname like 'Microsoft.SystemCenter.Visualizatio%'
    Roger

  • Advisor failed to import the latest advisor knowledge management packs

    Hello all, I've been facing this issue for a couple of days and have troubleshot it from every perspective, but haven't had any success...
    Event Description: Failed to import the latest Advisor Management Packs to the Management Server.
    Reason: System.ArgumentException: The requested management pack is not valid. See inner exception for details.
    Parameter name: managementPack ---> Microsoft.EnterpriseManagement.Common.ManagementPackException: Verification failed with 1 errors:
    Error 1:
    Found error in 2|Microsoft.IntelligencePacks.OfflineUpdate|7.0.9401.0|Microsoft.IntelligencePacks.OfflineUpdate|| with message:
    Could not load management pack [ID=Microsoft.IntelligencePacks.Types, KeyToken=31bf3856ad364e35, Version=7.0.9401.0]. The management pack was not found in the store.
    : Version mismatch. The management pack ([Microsoft.IntelligencePacks.Types, 31bf3856ad364e35, 7.0.9366.0]) requested from the database was version 7.0.9401.0 but the actual version available is 7.0.9366.0.
    Summary
    The latest Management Packs for the Advisor connector have not been imported to the management server database.
    Causes
    Management Pack Update Failed (55032,55008), Management pack Update SCOM Error(55031), Management Pack Update Import Failed (55006)
    Resolutions
    Refer to the Operations Manager event log on the management server for more information
    NOTE: 
    Proxy settings are configured on server as well as on console...
    REGARDS DANISH DANIE

    Hi,
    According to the error message, "The management pack ([Microsoft.IntelligencePacks.Types, 31bf3856ad364e35, 7.0.9366.0]) requested from the database was version 7.0.9401.0 but the actual version available is 7.0.9366.0."
    We may need to check the management pack's version.
    Have you applied the SC advisor connector?
    System Center Advisor Connector for Operations Manager
    http://www.microsoft.com/en-us/download/details.aspx?id=38199
    Regards, Yan Li

  • Service monitoring is not showing all the services when i create a service monitor under Management pack templates.

    Hi All,
    I am using SCOM 2007 R2 CU4 in my environment. I want to monitor services on specific agents. When I create a service monitor using the Management Pack template, the drop-down where I select the service to monitor does not show all the services.
    For example, the Windows Audio service is present on the machine, but it is not showing in the service list.
    So, of the services starting with "W", I see only 5 in SCOM, while in services.msc on the agent I see more than 5.
    Below is the screen shot.
    Can any one please help.
    Gautam.75801

    Hi Yan Li,
    So based on your suggestion above, if the services are already managed / monitored by a specific management pack, those services will not appear in the wizard while creating this type of management pack object, right?
    If that is the case, why is the same not reflected in the Services monitor tab in the Operations console?
    Gautam.75801

  • Distributed Application group relationships not displaying in health view (VSAE created management pack)

    Hi there,
    I am working through my first complex Distributed Application which I have been creating using VSAE. I have followed Brian Wren's training, but the result does not match what I am expecting.
    I have created a single DA (based on the System.Service class) with four groups (based on System.Group) which will be populated with the core components of the application. Each of the groups (and the DA) is configured as a Public, Non-Abstract, Non-Hosted Singleton, as per the MPAuthor.Stores example. Each of the four groups is linked via an Internal, Non-Abstract System.Containment relationship to the DA.
    When I open the DA diagram view, I only get the DA class visible. Each of the four groups shows in the Groups node in the Operations Console. Based on the experience with importing the MPAuthor.Stores MP, I was not expecting to see the groups in the Groups
    node, but linked in the DA diagram node.
    I have configured dependency monitors for each of the groups, and these display in the Health Explorer, but there is no health rollup.
    The only difference between the MPAuthor.Stores module and mine (that I am aware of) is MPAuthors.Stores is a 2007 R2 MP, and I have created a 2012 MP as I will be deploying to 2012 SP1 and there is no need for backwards compatibility.
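    For reference, the DA-to-group link described above is the kind of thing VSAE declares roughly as below in a 2012-format MP; the IDs are illustrative, and only the System.Containment base is taken from the post. As far as I understand, instances of this relationship still have to be discovered (the DA designer normally generates membership rules for that), otherwise the diagram view and the health rollup have nothing to walk.

        <RelationshipType ID="MyMP.DistributedApp.Contains.Group1" Accessibility="Internal"
                          Abstract="false" Base="System!System.Containment">
          <Source ID="Source" Type="MyMP.DistributedApp" />
          <Target ID="Target" Type="MyMP.Group1" />
        </RelationshipType>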

    Hi Vladimir,
    Thanks again for your response. I'm not quite sure what you mean with your last post. My current DA model is:
    Dist App (System.Service)
    ├── Group 1 (System.Group)
    ├── Group 2 (System.Group)
    ├── Group 3 (System.Group)
    └── Group 4 (System.Group)
    Where all the groups have discoveries for membership. Are you saying that I need another level of groups below these to house the objects?
    I thought that the point of System.Group as a base class was that it did not appear in the console, but gave an automatic health rollup (as per TechNet). Interestingly, I am not getting the automated health rollup on the groups from their members, and the groups are displaying in the Groups list in the console.
    I have also tried creating a new DA with a single group in an 07 MP, and have the same issue. The sample MPAuthor.Stores management pack works as expected.

  • Changing the Management Pack extensions or items are stored in

    I think I know the answer to this but I will ask it anyway. Is it possible to change the Management Pack an extension or an object such as a Queue or Subscription is stored in?
    Reason I ask is the person that was maintaining SM before me stored every single form extension, subscription, queue and view etc...in the same Management Pack. I really don't like this particular setup and I am sure its not best practice. I plan to break
    out all the above and possibly more into separate newly created Management Packs, like a separate one for Views, a separate one for Subscriptions etc.... Is the only solution to delete the existing item and create a new one with the same details including
    the display name?

    You can, but not as easily as you would like. 
    Essentially, what you would need to do is split up the XML by hand, making sure to sort out the DisplayStrings sections to the correct components. Export the "main" MP, cut out a section of XML that represents the form extension, paste it into a new file with a new name, add a reference to the new MP to the main MP, and replace all references to that object with MP aliases pointing to the object's new home. Import both the new MP and the reduced "main" MP back into the database.
    There are two problems, though:
    Some objects, like projections (used in form extensions created by the authoring tool) and class extensions, should only ever be stored in sealed MPs, because removing them causes data that references them to become invalid; i.e. properties defined in class extensions are lost if that class extension is removed and re-added, and templates that depend on projections are inaccessible if the projection they are based on is lost.
    Some atomic SDK objects, like workflows, consist of several non-consecutive XML sections, meaning you would need to trace the references and identify which of many <Rule> and <Action> sections belonged with which workflows, and make sure they would all end up in the same MP.
    Now, you could do this slowly, pulling sections out of the "main" MP and replacing them with references over various nightly downtimes, exporting the "main" MP and carving out a few objects, then importing the trimmed-down "main" MP and the new single-purpose MP, until you had something resembling best-practice functional isolation (all change request notifications over here, all service request workflows over there, etc.), with the hollowed-out husk of your "main" MP as a miscellaneous container. But I think you'll find that you still have to face and deal with data loss and breakage, potentially a lot more trouble than it's worth.
    Given that, I would instead propose that you give up on correcting the sins of the past and start doing it right from now on. Start setting up your new MPs and put new things into them, and, as you work, move over anything that you can easily rebuild. Make a plan to rebuild your templates and form customizations. Start identifying the class extensions you'll have to rebuild, and work up a plan to export the data defined by them somewhere (probably PowerShell and CSV) so you can restore it when the classes are restored.
    Like most data hygiene problems, there isn't an easy bulk-application answer; you just have to look at everything and clean it up as you go. You can mix salt and sugar quite easily. Unmixing them is a lot more work.
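    To make the "replace references with MP aliases" step a bit more concrete: the carved-out MP gets declared in the main MP's manifest roughly as below, and every occurrence of a moved element's ID elsewhere in the main MP then gets the alias prefix (e.g. Views!Contoso.View.AllOpenIncidents). The IDs and token here are made up, and as far as I recall a reference can only point at a sealed MP, so the pieces you carve out generally need to be sealed before the main MP can reference them.

        <!-- In the trimmed-down "main" MP, inside <Manifest><References> -->
        <Reference Alias="Views">
          <ID>Contoso.ServiceManager.Views</ID>
          <Version>1.0.0.0</Version>
          <PublicKeyToken>abcdef0123456789</PublicKeyToken>
        </Reference>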

  • How to manage your environments, going from Development (DEV) to Production (PRD) with Management Packs.

    I am looking for a way to go from the development (DEV) environment to the production (PRD) environment in a controlled manner.
    I do not want to manually click around following a walk-through screenshot document, as that is error prone.
    As I read these articles
    http://technet.microsoft.com/en-us/library/hh519659.aspx
    http://blogs.technet.com/b/antoni/archive/2013/10/09/system-center-service-manager-operations-manager-management-pack-and-naming-convention-best-practices.aspx
    http://blogs.technet.com/b/wincat/archive/2013/03/28/copying-user-roles-between-management-servers-in-service-manager-2012.aspx
    http://www.netiviaconsulting.com/2012/05/10/staying-organized-with-scsm-beware-of-the-management-pack-jungle/
    I still have some questions:
    1. Can all MPs (un-sealed) be exported from DEV and imported to PRD without breaking things?
    2. Are there any known pitfalls?
    3. Do you have some tips on this matter? In other words, how do you manage this?
    Regards,
    Erik

    Alex is spot-on. Most MPs are environment/management group independent.
    But some MPs contain what I call "Environment Specific Variables". In my experience, ESVs are in:
    1) Templates (such as the runbook guids)
    2) Criteria blocks (like view criteria, and rule criteria)
    3) Parameters (such as write action parameters, and console task parameters)
    Alex already gave an example of a template ESV.
    An example of a Criteria ESV might be a view that you developed that returns objects with a specific set of values in your dev environment, which may be different in your prod environment. I.e.: you have a view whose criteria looks for a specific business service ServiceId property, say "Business Service X". But "Business Service X" has a different ServiceId in your dev environment than in your prod environment, so every time you import your view MP from dev to prod, you have to remember to change your criteria.
    A quick Parameter ESV example; you have a notification subscription (a rule+writeaction). In Dev, you've hardcoded the notification recipient to be yourself. In Prod, the recipient has to be your boss. So, every time you import your MP from Dev to Prod,
    you have to remember to change the recipient parameter in the subscription's writeaction.
    Like Alex said, handling ESVs by hand is error-prone and time consuming.
    Alex, when you write that utility, you should make it more robust. Within an MP, allow a user to pick and configure
    any ESV. Store that configuration. Then, just run the MP through the utility and have it automatically set all the configured ESVs to the appropriate environment specific value :) (It's been something I've wanted to write into TFS for
    a while, but I never take the time to do it ;) ) If an ESV is missing (because it was removed or changed in such a way that it no longer resembles the configured ESV), throw a warning during conversion.

  • Setup Manager - Application Change Management Pack

    Hi All,
    This is Mugunthan, development manager for iSetup and Setup Manager. I am happy to begin my note with the launch of Application Change Management Pack 3.1. For those who are not aware of Application Change Management Pack, it is a new product built on top of Oracle Enterprise Manager and comprises three major modules, namely Setup Manager, Customization Manager and Patch Manager. You can find more information about this pack here (LCM: Oracle Application Change Management Pack). Setup Manager is an enhanced and advanced version of iSetup on Oracle Enterprise Manager. There were good reasons to re-architect iSetup on Oracle Enterprise Manager as Setup Manager.
    Why Setup Manager?
    Ability to migrate setup data over point releases of EBS – EBS comes up with roll-up patches on top of major releases periodically. This means that you have to certify the patch before it gets deployed to production, which in turn means that the setup data has to be certified again. iSetup cannot connect 12.0.4 to 12.0.6 or 12.0.6 to 12.1.1. We have achieved data migration for the above scenarios in Setup Manager (11i to R12 is not supported in Setup Manager).
    Projects – Setup Manager supports grouping of Extracts, Transforms, Loads and Reports as a single entity that can be deployed multiple times to multiple targets. A project consists of one or more tasks (of type Extract/Load/Transform/Report) and can be orchestrated according to your needs. Project execution supports scheduling, which means that you can use this feature to sync up setup data between two instances at periodic intervals. Projects can be shared with other users, which is a remarkable difference between iSetup and Setup Manager: you can share your projects with others in a combined development environment. The functional configuration data (Extracts) are stored in Oracle Enterprise Manager, which means that you can refresh EBS without losing Extracts.
    Integration with Change Approval mechanism – Execution of projects is integrated with change approval mechanism which means that you can control who executes what. Also, you have fine grain access control where you can control EBS targets assigned to a user.
    Offline Transformation – I would say this is the most unique feature of Setup Manager and as an implementer you would welcome this feature very much. iSetup has got very limited capability on Transformation. Here we have gone way ahead and support Transformation on almost all Setup Objects. We present you the extracted data in excel sheets. You can download the excel template and work offline. All the attribute value fields in the excel sheet are editable which means that you supply your own value. For example, you can download Operating Unit data as xls file and change operating unit “OU100” to “OU200” and upload it back to system and load to target instance. Please note that any attribute of Operating Unit setup data can be edited. Also, system has got intelligence and once you upload the excel sheet, it automatically changes all the inventory organization which belongs to “OU100” to “OU200”. This behavior can be controlled using attribute mapping which you can create in Setup Manager. Hold on, this is not the end, also you can add more operating unit in excel, say OU300 and OU400 and load it to target instance. You can delete few Operating Units from extracted data in excel sheet and load the remaining to target instance. This gives you complete flexibility to manage (add/edit/delete) functional configuration data offline. This is fantastic feature and please have a look at how it works.
    Advanced Filtering – Filter support in iSetup is not flexible: you cannot use comma-separated values or complex SQL join conditions, and you do not know what query is executed behind the scenes to get the data. All of these problems are addressed in Setup Manager 3.1. We show you the “select clause” (SQL) associated with an interface and provide a text field where you can refer to the select clause and provide your own custom where criteria (SQL). This means that filter support is unlimited and you can set a filter on any database column. For example, you can extract “Responsibilities/Menu” which are active or created as of yesterday, or extract all Operating Units excluding “OU100 and OU200”. We support all filter criteria that are supported by SQL.

    Hi Mugunthan,
    Can you provide links to any tutorial or example (screenshots) of using Setup Manager to transfer something between two environments (say Users or DFFs)?
    Thanks,
    Gareth
