JMX for high-level APIs?

I have to implement a high-level API for a special device, and I wonder whether JMX is the right choice.
Is it possible to write a high-level, object-oriented API with JMX, or is it just a modern substitute for SNMP?
For example, if I write an API for my mp3 radio player, can I say <record the song "Walk Like an Egyptian" from "The Bangles", whenever it comes on> and later copy all received songs to the client, or can I just say <start recording now> and <stop recording now>?
To be clear, I want to implement an API so that other complex applications can control my device comfortably, not a plug-in for an existing management application that also controls other stand-alone devices.
Any comments? What's your experience?
Marvin Haktar

Yes, JMX is suitable for creating a high-level API that offers more flexibility in management than a pure SNMP-based approach. One of the goals of the JMX specification is to address the needs of application management, whereas SNMP has traditionally focused on network/device management.
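For illustration, here is a minimal sketch of what such a high-level interface could look like as a Standard MBean. None of this comes from the spec; the names (RadioRecorderMBean, scheduleRecording, the "radio" domain) are invented for the example:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import javax.management.MBeanServer;
import javax.management.MBeanServerFactory;
import javax.management.ObjectName;

// --- RadioRecorderMBean.java (a public interface in its own file) --------
// By Standard MBean convention, the management interface is named after
// the implementation class plus the "MBean" suffix.
public interface RadioRecorderMBean {
    // High-level, domain-oriented operation: record a song whenever it airs.
    void scheduleRecording(String title, String artist);
    // The low-level operations can still be exposed alongside it.
    void startRecording();
    void stopRecording();
    List getRecordedSongs();
}

// --- RadioRecorder.java ---------------------------------------------------
public class RadioRecorder implements RadioRecorderMBean {
    private final List wishList = Collections.synchronizedList(new ArrayList());
    private final List recorded = Collections.synchronizedList(new ArrayList());

    public void scheduleRecording(String title, String artist) {
        // The device would watch the broadcast stream and capture the song
        // when it comes on; here we only remember the request.
        wishList.add(artist + " - " + title);
    }
    public void startRecording() { /* start capture on the device */ }
    public void stopRecording()  { /* stop capture on the device  */ }
    public List getRecordedSongs() { return recorded; }

    public static void main(String[] args) throws Exception {
        MBeanServer server = MBeanServerFactory.createMBeanServer();
        server.registerMBean(new RadioRecorder(),
                             new ObjectName("radio:type=RadioRecorder"));
        // A client can now invoke scheduleRecording("Walk Like an Egyptian",
        // "The Bangles") through a JMX connector or adaptor.
    }
}

Because the MBean operation carries domain semantics rather than raw start/stop toggles, other applications can drive the device at exactly the level you describe.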
Juha Lindfors
Author of "JMX: Managing J2EE with Java Management Extensions"
http://www.amazon.com/exec/obidos/ASIN/0672322889/104-6670791-7933546
Senior Developer, JBoss Group LLC

Similar Messages

  • How to Skip weight data from Sales order for Higher level BOM material

    I have maintained a BOM at item level with item category group LUMF.
    The higher-level item is not subject to pricing and costing. While creating the higher-level item I did not maintain a weight for it. Now, when I create a sales order, the incompletion log makes the system ask me to maintain weight details for the higher-level item as well.
    But I don't want to maintain a weight for that item, nor do I want to remove net weight and gross weight from the incompletion log.
    I created the higher-level material with material type FERT and maintained LUMF; everything else is normal. Do I need to select a different material type, or do I need to go for different settings?
    I hope my query is clear. Please ask if it is not.
    Gurus, please suggest.

    Hi ,
    I do not know what a reference characteristic is, but these are all independent characteristics in terms of their values.
    My guess is that because the VC data is changed after the sales order is created, the updated VC data is displayed properly only in VA02 or VA03, while other function modules fail to display it.
    I understand there is an instance relationship between the sales order object and the VC data object, which we need to update in order to see the latest data.
    Please let me know if anyone has debugged VA03 to find out how the standard SAP program retrieves VC data in a sales order.
    Thank you,
    Hemant

  • Maintain Columns for Higher Levels in Hierarchy when using Drill Down

    Hello all,
    I've searched through this forum looking for an answer to this, since I thought it might be a common question, but I wasn't able to find any related content.  So I apologize if this has already been answered in the past.
    I would like to know if, when drilling down in a WEBI report, it is possible to automatically add dimension columns to the report as you drill down to the lower levels of the hierarchy, keeping the columns with the higher-level dimension values rather than replacing them, which is the default behavior. As a simple example, suppose I have the following columns in a report:
       Product      Sales Amount
    Now suppose I have set up a hierarchy in the Universe with Product, Sub-Product, then Region, in that order. In the WEBI report with the drill filter options enabled, when I click on the Product link, the default behavior is to replace the Product column with the Sub-Product dimension and display the following:
      Sub-Product  Sales Amount
    Instead, I would like it to display:
      Product Sub-Product Sales Amount
    Then subsequently, of course:
       Product Sub-Product Region Sales Amount
    Does anybody know of a simple way to accomplish this? I suppose it could be done using the openDocument method with links to separate reports showing the additional columns, but this seems a bit complicated for a relatively simple requirement. I'm hoping that there is an existing, simpler solution to this problem. Please advise. Any information would be greatly appreciated.

    Bernardo,
    The drill-down capability is vertical, not horizontal. There are two solutions: openDocument, as you mentioned, is one. The other depends on the way you have deployed your system. If you have given users the capability to run reports in "modify" mode (WebI), then they can insert columns into the grid according to the scenario you mention:
    Product Sub-Product Sales Amount
    and forego using the drill filter per se. If your deployment only permits most users to view the report in "read-only" mode (InfoView), then you have limited options.
    Thanks,
    John

  • OWB 10g R2: Does OWB create records for higher levels in a dimension?

    For example, if there are two levels PRODUCT_CATEGORY and PRODUCT, it looks like OWB populates the PRODUCT_CATEGORY records too in the dimension. Is this the default behaviour? Is there a way to turn this off, and only have the lowest level records populated?

    I will expand on the prior example to ask another dimension related question.
    ROW - 1
    PRODUCT_CATEGORY - DAIRY, FARM_NAME
    PRODUCT - NULL
    ROW - 2
    PRODUCT_CATEGORY - DAIRY, FARM_NAME
    PRODUCT - 2L MILK, FAT_CONTENT
    Q1: I need to join to another table on DAIRY, and to pull FARM_NAME and the attributes from the second level (2L MILK, FAT_CONTENT). How do you do something like this in practical terms: check where the surrogate key for the second level is not null? One could also join on DAIRY where 2L MILK or FAT_CONTENT is not null. (See the sketch below.)
    Q2: The other area where I am unclear is whether to use the dimension object or the bound table in a mapping. If I try to use the dimension, OWB will not let me, because you cannot source from more than one level at the same time. I guess I will have to use a join? The other option is to use the table, but with a complex, large dimension that can get messy. So what is the correct approach in general: use dimensions or the underlying tables in a mapping?
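    For what it's worth, a minimal JDBC sketch of the Q1 approach (keep only the rows where the second-level surrogate key is populated, then join). The connection details and the table/column names (PRODUCT_DIM, PRODUCT_ID, CATEGORY_NAME, SUPPLIER) are hypothetical stand-ins for the bound dimension table:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class DimensionLevelJoin {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "user", "password");
            // Leaf (PRODUCT) rows carry a non-null PRODUCT-level surrogate key;
            // the category-only rows (as in ROW 1 above) are filtered out here.
            String sql =
                "SELECT d.category_name, d.farm_name, d.product_name, d.fat_content "
              + "FROM product_dim d "
              + "JOIN supplier s ON s.category_name = d.category_name "
              + "WHERE d.product_id IS NOT NULL";
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(sql);
            while (rs.next()) {
                System.out.println(rs.getString(1) + " / " + rs.getString(3));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }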

  • Forum for High level business process change / enhancements for GAPs

    Hello Guru's,
    Is there any forum like SDN in the SAP Portal to discuss, share knowledge, bring new enhanced functionality into the existing SAP standard functionality, find solutions for gaps, and develop functional specs? Kindly provide a link.
    Reward points for answer

    found solution

  • Higher level work area

    Hi all,
    I am trying to develop a report on EHS
    in which the user enters a plant as input; the report then shows the work areas, and when the user double-clicks a work area it should show the higher-level work area. But I am unable to find the table where the higher-level work area data is stored. Can anybody tell me the table name or field name for the higher-level work area?
    Thanks
    Ankit Modi

    Hi again,
    the table TWGLV is also used to store data when defining a product catalog. The field "HigherLev. area" is only used when you are creating a layout for a product catalog (transaction WWM1). Within this layout you can build up a hierarchy, and for that purpose the field you asked for is used.
    The field is not integrated into layout maintenance for store layouts (with transactions like the layout workbench, etc.).
    Regards
    TK

  • Errors in the high-level relational engine. The data source view does not contain a definition for the table or view. The Source property may not have been set.

    Hi All,
    I have a cube in which I'm using the TIME dimension that I created in the warehouse. I now want a new measure in the cube, an average over time, but when I tried to create the new measure I got a message that no time dimension was defined, so I created a new time dimension in SSAS using the wizard. When I tried to process the new time dimension, I got the following error message:
    "Errors in the high-level relational engine. The data source view does not contain a definition for the "SSASTIMEDIM" table or view. The Source property may not have been set."
    Can anyone please tell me why I cannot create a new average-over-time measure using my time dimension? Also, what am I doing wrong with SSASTIMEDIM that I am getting this error?
    Thanks

    Hi PMunshi,
    According to your description, you get the above error when processing the time dimension. Right?
    In this scenario, since you have updated the DSV, the table's existence should not be the problem. One possibility is that the table has been specified for tracking in the notifications for proactive caching, but isn't available any more for some reason. Please change the Proactive Caching setting to "MOLAP".
    Reference:
    How To Implement Proactive Caching in SQL Server Analysis Services SSAS
    If you have any question, please feel free to ask.
    Best Regards,
    Simon Hou
    TechNet Community Support

  • Error: Maintain settlement rule of the sender for a higher level WBS

    Hi,
    I don't want to maintain the settlement rule for a higher-level WBS. How can I configure this so that I don't get the error "Maintain settlement rule of the sender" while running CJ88? Maintaining a separate settlement profile for the higher-level WBS is an option, but we are looking at whether something else could be done. The problem is that there are no actuals booked against, say, a level 2 WBS, yet when I execute CJ88 I get the aforesaid error. How can I ensure that only the lowest-level WBS asks for the settlement rule and not the levels above it? I have already removed the investment profile from the higher-level WBS but am still getting the same error.
    Regards,
    DPil

    Hi,
    It is a capex-type WBS and the billing element is not checked. In fact, I get a warning while doing the settlement: WBS is neither a billing element nor an account assignment element.
    Diagnosis
    WBS element  is not indicated as either an account assignment element or as a billing element in the master record.
    System Response
    The WBS element cannot be assigned to an account.
    Procedure
    Correct your entries or add the missing indicator to the master record for the WBS element.
    But this is just a warning. On pressing Enter I get the error: "Maintain settlement rule of the sender".

  • Service item not relevant for pricing if used with a higher-level item category

    Hi,
    We have a service item e.g S900 with Item category ZTAD.
    This line Item automatically creates a Service Order.
    The requirement is: if this service item is used with an equipment item,
    Item 10 --> Equipment
    Item 20 --> Service
    --> there would be a price required for the equipment (condition type ZPRO, mandatory)
    --> the system should not ask for the price of the service item, as it is included in the equipment charges
    --> in short, if the service item is used as a sub-item with equipment, it is not relevant for pricing.
    I tried copying ZTAD and creating a new item category which is not relevant for pricing. Would that be the correct approach? I am facing several issues related to automatic service order generation.
    What could be possible ways to achieve the above?
    Regards
    Trupti Deulkar

    Hi,
    The system will ask for a price for item category TAD because services are also chargeable. In your case, instead of TAD use TANN (free of charge). You can define the determination based on your higher-level item category, for example:
    OR + item category group NORM + usage (blank) + higher-level item (blank) = TAN
    OR + item category group NORM + usage (blank) + higher-level item TAN = TANN
    This is the correct way; otherwise, you can manually enter the item category (TANN) at the sales order line item level.
    Thanks
    Vinayak.
    Edited by: vinayak4all on Jul 12, 2011 2:45 PM

  • High-level design for business process scenario

    Hi all, I have a business process scenario for which CRM and ECC has to be used. The scenario is like this:
    There are 2 subsidiaries of a company - A and B, which are into trading of materials.
    Between A and customer there is sales agreement. Between B and vendor there is purchase agreement.
    Whenever the customer requires materials, it sends a request to A, so A creates a sales order. B in turn sends a request to the vendor, for which a purchase order is created. The vendor ships the goods directly to the customer and bills B for the purchase order. B adds some margin and then goes into inter-company settlement with A. A bills the customer for the materials. In this entire process, both A and B earn profit through margins.
    It has to be noted that CRM has to be used as contact point with customer.
    Now I have the following questions:
    1. How would the high-level business design be like?
    2. How would be the document flow?
    3. How can inter-company settlement and billing be done?
    All constructive comments and feedback will be rewarded.

    Hi Animesh,
    The rule of thumb in CRM process design is to stick to the customer-facing process.
    Thus you just have to stick to the process design of company A and its interaction with the customer.
    We are not concerned with how the material will be purchased, but with how we will deliver it to the end customer.
    Thus, once the customer places an order with company A, our process would suggest a sales order for the customer with one of the partner functions ("Vendor") set to B, with the required material. A later process in R/3 would convert this sales order, with the customer as sold-to party, into a purchase order with vendor B and our original customer as a secondary sold-to partner.
    Billing would happen the way it is, i.e. based on the sales order; thus you will have some inflow. Again, you will be raising a purchase requisition to company B, and that would be your outflow.
    That was a really good scenario to think about.
    Best Regards,
    Pratik Patel
    Reward with Points!

  • High level description of why an Enterprise Admin account is required for DirSync config

    Hi all,
    I understand that as part of the Azure AD Sync tool configuration wizard you are required to enter the credentials of an Enterprise Admin account. These credentials are required for the creation of the MSOL_AD_Sync service account within the Users OU of Active Directory. This account is granted read and synchronization permissions to the local Active Directory.
    Is someone able to provide a high-level description of what this actually means, i.e. exactly which permissions are granted and on which objects? Are we talking about having to modify the permissions of every single object within Active Directory?
    Many thanks in advance,
    Graham

    Hi,
    To start with, I guess you know that the Enterprise Admin credentials are only needed temporarily.
    I have some details from previous conversations and blogs; hope this sheds some light on your query.
    When configuring the Microsoft Online Services Directory Synchronization Tool, you are asked to provide the credentials for an account that has Enterprise Admin permissions on your organization's local Active Directory directory service. It accepts credentials in either of the following forms:
    someone@example.com
    Example\someone
    These Enterprise Administrator credentials are not saved. They are erased from the computer's memory after the service account is created.
    How the Active Directory Credentials Are Used
    The Microsoft Online Services Directory Synchronization Tool Configuration Wizard uses the Enterprise Admin credentials to create the directory synchronization service account, MSOL_AD_Sync. This service account is created as a domain account with directory replication permissions on your local Active Directory and with a randomly generated complex password that never expires.
    Note: Changing the password associated with the service account is not recommended.
    How the Service Account Is Used
    When the directory synchronization service runs, it uses the service account credentials to read from your local Active Directory and write to the synchronization database. The contents of the synchronization database are written to Microsoft Online Services using the Microsoft Online Services credentials requested on the Microsoft Online Services Credentials page of the Microsoft Online Services Directory Synchronization Tool Configuration Wizard.
    Note: If you add a domain to your Active Directory forest, you must run the Microsoft Online Services Directory Synchronization Tool Configuration Wizard again to add the new domain to the list of domains to be synchronized.
    Thanks & Regards
    John Chris

  • High Level Recommendations For Multi-Tier Application

    Hello:
    I have been reviewing the Windows Azure documentation and I'm still somewhat confused about which configuration and set of services is best for my organization. I will start off with a high-level description of what the environment should be:
    A) 2 "front end" IIS instances, load balanced, running an MVC 4.0/.NET 4.5 web application
    B) A "dedicated" SQL Server 2008 R2 server with medium-high resources (ample RAM and processing power)
    C) An application server which hosts a Windows service. This service will require access to the SQL Server listed in B. In addition, the IIS front ends listed in A should have access to a "shared" folder or directory where files can be dropped and processed by this Windows service.
    I have looked at Azure Web Sites, Azure Virtual Machines, and Cloud Services, and I'm not sure what is best for our situation. If we went with Azure Web Sites, do we need TWO virtual machines, or a single virtual machine which can "scale out" up to 6 instances? We would get a Standard web site, and the documentation I see says it can scale out to 6 instances. I'm somewhat confused about the difference between a "virtual machine" and an "instance". In addition, does Azure Web Sites come with built-in load balancing between instances, virtual machines, or both? Or is it better to go with Azure Virtual Machines and host the IIS front end there? I'm just looking for a brief description and advice as to which would be better.
    Regarding the SQL Server database, is there a benefit to using Azure SQL Database, or should we go with a virtual machine with SQL Server installed as the primary template? We have an existing SQL Server database, and initially we would like to move our existing schema up to the cloud. We are looking for decent processing power and RAM for the database.
    Finally, the "application" tier, which requires a Windows service: is an Azure Virtual Machine the best route to take? If so, can an Azure Web Site (given that that is the best setup for our needs) write to a shared folder/drive on a secondary virtual machine? Basically, there will be JSON instruction files dropped into a folder, which the application tier will pick up, deserialize, and process in the backend.
    As a final question, if we also wanted to use SSRS, are there updated/affordable pricing and hosting options for this as well?
    I appreciate any feedback or advice. We are definitely leaning towards Azure, and I am trying to wrap my head around what our best configuration and service selection should be.
    Thanks in advance

    Hi,
    A) 2 "Front End" IIS Instances, Load Balanced running an MVC 4.0/.Net 4.5 Web Application
    B) A "dedicated" SQL SERVER 2008 R2 server with medium-high resources (ample RAM and processing power)
    C) An application server which hosts a Windows Service.  This service will require access to the SQL Server listed in B. In addition the IIS "Front Ends" listed in A should have access to a "shared" folder or directory where files can be dropped and
    processed by this windows service.
    Base on my experience and your requirement, you could try to use this solution:
    1.Two cloud service to host your "front end" web application. Considering to Load Balanced, You could use traffic manager to set Load Balancing Settings.
    2. About sql server or ssrs, you have two choice:>1,create a sql server vm  >2, use sql azure and azure ssrs
    I guess all of them could meet your requirement.
    3. About your C requirement, which type application is? If it is website, You could host it on azure website or cloud service.
    And if you want to manage the file by your code, I think you could save your file into azure blob storage. You could add,delete file using rest API(http://msdn.microsoft.com/en-us/library/windowsazure/dd135733.aspx
    ) or code(http://www.windowsazure.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs-20/ ). And the Blob storage could be as a share file
    folder.
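    As an aside, here is a minimal sketch of that blob-as-shared-folder idea in Java, assuming the Azure Storage SDK for Java; the connection string, container name, and file name are placeholders:

    import java.io.File;
    import java.io.FileInputStream;
    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.blob.CloudBlobClient;
    import com.microsoft.azure.storage.blob.CloudBlobContainer;
    import com.microsoft.azure.storage.blob.CloudBlockBlob;

    // The web tier drops a JSON instruction file into a blob container instead
    // of a shared folder; the worker service polls the same container.
    public class DropInstructionFile {
        public static void main(String[] args) throws Exception {
            CloudStorageAccount account = CloudStorageAccount.parse(
                    "DefaultEndpointsProtocol=https;AccountName=<name>;AccountKey=<key>");
            CloudBlobClient client = account.createCloudBlobClient();
            CloudBlobContainer container = client.getContainerReference("instructions");
            container.createIfNotExists();

            File json = new File("order-123.json");
            CloudBlockBlob blob = container.getBlockBlobReference(json.getName());
            FileInputStream in = new FileInputStream(json);
            try {
                blob.upload(in, json.length()); // the worker later downloads and deletes it
            } finally {
                in.close();
            }
        }
    }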
    For the billing question, you could ask Azure billing support for more details: http://www.windowsazure.com/en-us/support/contact/
    Hope it helps.
    Regards,

  • High level estimation for datasource enhancement

    hi,
    I have recently been assigned to a support project. Now we are going to do a datasource enhancement (we need one more field, i.e. jurisdiction code, for tax calculation; the datasource is 0FI_GL_4), and we need a high-level estimation of the procedure and the time duration for each step (development to production).
    Can anyone help me with this?
    Thanks,
    Shaliny

    Shaliny,
    If the field is readily available to add with no hiccups, everything can be done in 3 days.
    If you need to go for a customer exit during the enhancement, you need at least 15 days, which includes testing.
    It generally depends on how complex it is expected to be, but keep a buffer of at least 2 days on your side after completion.

  • High-level view of steps for 10g OWB-OLAP to Discoverer

    I would greatly appreciate ANY feedback on the following steps. These are not necessarily correct or the best way to do this. I am attempting to take source data, use OWB to create the analytical workspace, and from there have the metadata available for use by Discoverer.
    This is rather high-level, feel free to jump in anywhere.
    We are trying to see if we can get away with NOT using the Analytical workspace manager (AWM) if possible. With that in mind, we are trying to make the most of the process with OWB & OLAP.
    Is this possible to do without ever using the AWM? Can we go end to end (source data--->discoverer final reporting) primarily using OWB to get to the point where we can use the metadata for Discoverer?
    Can anyone relate experiences perhaps that would make me want to consider using the AWM at certain points instead?
    Most importantly, if I do use this methodology, would I be safe after everything has been set up? Would I want to consider using AWM at a later point for performance reasons while I am using Discoverer? Or would OWB be helpful as well in some aspects of data maintenance? Any clue how often I might need to rebuild, and if so, what to use in that case to minimize time?
    Thanks so much for any insight or opinion on anything I have mentioned!

    Hi Gregory,
    I guess the answer is that this depends. My first question is whether you are looking at a Relational OLAP or Multi Dimensional OLAP solution? This may change the discussion slightly, but just lets look at some thoughts:
    In essence you can use the OWB bridge to generate the AW objects (cubes etc). If you do that (for either ROLAP or MOLAP) you will get the AW objects enabled for querying, using any OLAPI query tool, like BI Beans or the new Discoverer for OLAP. The current OWB release does not run the discoverer enabler (creating views specifically written for EUL support in Disco classic).
    So if you are looking at Disco classic you must use the AWM route...
    The other thing that you must be aware of is that the OWB technology is limited to cloning the relational objects for now. This means that you will create a new model based on your existing data. If you want to tweak the generated objects, you will probably need to go to the underlying code in either scenario.
    So if you want to create calculated measures, for example, you could generate a cube with OWB, create a "dummy measure" and add the formula in OLAP DML. The same goes for some other objects you may want to create, such as text measures.
    The benefit of creating placeholder or dummy measures is that the metadata is completely in order; you simply change the measure's behavior.
    In the future (the beta starts relatively soon) OWB will support much more modeling, like logical cubes, and you can then deploy directly to OLAP. Also, the mappings are transparent to the storage: you map to a logical cube and OWB will generate the correct code to load either OLAP or relational targets.
    We will also start supporting calculated measures, sparsity definitions, partitioning and compression on cubes, and we will support parallel building of cubes.
    Hope this gives you some insight!
    Jean-Pierre

  • Low-Level or High-Level for GUI?

    I am developing a MIDlet that has to run on CLDC 1.1 / MIDP 2.0 devices. The MIDlet has a simple user interface, and there is no gaming. I developed it using the high-level GUI classes, but I discovered that while these classes are nice, they are limited. Now I am investigating using the low-level Canvas class for the GUI. Is it possible to maintain cross-platform compatibility with Canvas, or should I stick with the high-level classes?

    High-Level Group
    Classes provided are
         perfect for development of MIDlets that target the maximum number of devices
         heavily abstracted, providing minimal control over their look and feel
         unable to give exact control over their display
    Low-Level Group
    Classes provided are
         perfect for MIDlets where we want precise control over the location and display of the UI elements
         with more control comes less portability: the MIDlet may not be deployable on certain devices (see the sketch below)
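    As a rough illustration, here is a sketch of both styles using the standard javax.microedition.lcdui API (the MIDlet wiring and the class name GuiDemo are invented for the example):

    import javax.microedition.lcdui.Canvas;
    import javax.microedition.lcdui.Display;
    import javax.microedition.lcdui.Form;
    import javax.microedition.lcdui.Graphics;
    import javax.microedition.lcdui.TextField;
    import javax.microedition.midlet.MIDlet;

    public class GuiDemo extends MIDlet {
        // High-level: the device decides how the Form and its items look.
        private final Form form = new Form("Settings");

        // Low-level: you draw every pixel yourself, so portability means
        // coping with each device's screen size and colors on your own.
        private final Canvas canvas = new Canvas() {
            protected void paint(Graphics g) {
                g.setColor(0xFFFFFF);
                g.fillRect(0, 0, getWidth(), getHeight()); // clear to white
                g.setColor(0x000000);
                g.drawString("Hello", getWidth() / 2, 0,
                             Graphics.TOP | Graphics.HCENTER);
            }
        };

        protected void startApp() {
            form.append(new TextField("Name", "", 32, TextField.ANY));
            Display.getDisplay(this).setCurrent(form); // or setCurrent(canvas)
        }
        protected void pauseApp() {}
        protected void destroyApp(boolean unconditional) {}
    }

    Since your UI is simple and there is no gaming, the high-level classes are usually the safer choice for maximum device coverage.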
    Cheers,
    Rohan Chandane
