How to implement such a validation requirement?

I have two af:inputText fields, A and B.
Only numbers and letters are allowed in both of them,
but it is not allowed for both values to consist of letters only.
How do I implement this?
JDev: 11g.
Thanks.
Kevin.

Hello,
Do you just want to display them as red, or also validate them?
If it's just display, you can try this:
Set both fields to autoSubmit="true" and immediate="true",
then set the styleClass depending on the values, like this:
styleClass="#{bindings.A.inputValue == bindings.B.inputValue ? 'AFFieldText' : 'customErrorStyleClass'}"
- Anton

Similar Messages

  • HOW TO IMPLEMENT SCREEN VALIDATION IN LDB?

    Hi,
    I am making use of a Logical Database in my report, and I want to do screen validation with the screen which already exists in the system. Is it possible to do so? If yes, then how? The LDB I am using is PSJ.
    Also, I would like to know: can we define our own selection screen in the case of an LDB? How is that possible, if the answer is yes? I am not talking about dynamic selection screens here. What I mean to say is: can we restrict the selection screen in an LDB by defining which fields are required on the selection screen when the report is executed?
    Someone, please guide.
    Points will be rewarded for a satisfactory answer.
    Thanks & Regards,
    Ravi

    Hi Ravi,
    Check out the link - http://help.sap.com/saphelp_erp2005/helpdata/en/60/183d78163011d2953c0000e8353423/frameset.htm
    Yes, we can define our own selection screen, and it will appear right below your LDB selection screen.
    The LDB selection screen can also be restricted: for this you have to code in the PBO of the report to hide the elements you don't require.
    Regards
    Mithlesh

  • How to implement such that "Order reason" does not allow the SO to generate a PO?

    Hi,
    During Sales Order creation, if the 'Order Reason' is "Poor Quality", then I do not want the Purchase Order to be generated.
    That is, the system should not allow the Sales Order to generate a Purchase Order until the order reason is changed.
    Please tell me how can I do this?
    Regards,
    Chamandeep

    Hi Vinod,
    This is a third-party Sales Order creation situation. I am creating the PO using the item category TAS.
    The PO is generated, and the PO then generates the Sales Order in another SAP system using ALE.
    What I want is that if a particular "order reason" is chosen in the sending SAP system while creating the Sales Order, the PO should not be sent to the other system.
    Do tell if any other information is required?
    Regards,
    Chamandeep

  • How to implement JavaScript validation in JSF

    Dear all,
    If I wish to implement a JavaScript front-end validation function in JSF, what should I do?
    thanks.

    Hi,
    in Core JSF by David Geary there is a chapter on using the
    Jakarta Commons Validator for JavaScript validation.
    The source can be found here:
    http://www.horstmann.com/corejsf/
    HTH,
    Matthias
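    Whatever generates the client-side JavaScript, the same rule should also be enforced on the server, since browsers can bypass front-end checks. A minimal sketch of the server-side counterpart as a custom JSF validator (the class name and the regex are illustrative assumptions):

    import javax.faces.application.FacesMessage;
    import javax.faces.component.UIComponent;
    import javax.faces.context.FacesContext;
    import javax.faces.validator.Validator;
    import javax.faces.validator.ValidatorException;

    // Register under a validator-id in faces-config.xml and attach to an
    // input with <f:validator validatorId="emailValidator"/>.
    public class EmailValidator implements Validator {
        public void validate(FacesContext ctx, UIComponent comp, Object value)
                throws ValidatorException {
            String s = value == null ? "" : value.toString();
            // Deliberately simple pattern, for illustration only.
            if (!s.matches("[^@\\s]+@[^@\\s]+\\.[^@\\s]+")) {
                throw new ValidatorException(new FacesMessage(
                        FacesMessage.SEVERITY_ERROR, "Invalid e-mail address.", null));
            }
        }
    }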

  • My Proxy: How to implement a validation before insert/update

    Hi,
    In my environment, the OIM User has an attribute called Employee Grade. When a user is configuring My Proxy, he/she should only be able to select proxy users whose grade is equal or higher. I created a Rule Generator adapter and tried to assign it to the Pre-Insert of the Users.Proxy data object. Problem: there is no field available for this entity, and I thought it was related to the table PXD (PXD_ORIG_USR_KEY and PXD_PROXY_KEY).
    If I could access those fields, I would look up the proxy user by PXD_PROXY_KEY and get his/her grade, and after that compare it with the grade of the original user.
    I also have to implement the same logic when a user reassigns a task to another user.
    Have you ever implemented that or have any idea?
    Thanks,
    Renato Guimaraes.
    Edited by: Renato.Guimaraes on 05/10/2009 13:58 - Implement the same logic to reassign task

    One option is to implement the proxy function as a separate RO. That makes it possible to do the validation in the approval process, but it will force you to manually implement any functionality that you would like to have associated with the proxy function.
    Probably not easier than a GUI customization but I wanted to mention the option.
    Best regards
    /Martin
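    Once the adapter can obtain both grades, the comparison itself is simple. A hedged sketch of logic such an adapter could delegate to; the grade format (e.g. "G1".."G9", higher number = higher grade) and all names are assumptions, and the OIM plumbing for resolving PXD_ORIG_USR_KEY/PXD_PROXY_KEY to users is deliberately omitted:

    public class ProxyGradeCheck {
        // Assumption: grades look like "G1".."G9"; a higher number means a higher grade.
        private static int rank(String grade) {
            return Integer.parseInt(grade.replaceAll("\\D+", ""));
        }

        // True when the proxy's grade is equal to or higher than the original user's grade.
        public static boolean isValidProxy(String originalUserGrade, String proxyGrade) {
            return rank(proxyGrade) >= rank(originalUserGrade);
        }
    }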

  • How to implement a Java class in my form

    Hi All ,
    I'm trying to create a Button or a Bean Area item and attach a class to it through the (IMPLEMENTATION CLASS) property, such as the (oracle.forms.demos.RoundedButton) class, but it doesn't work... please tell me how to attach such a class to my button.
    Thanx a lot for your help.
    AIN

    hi [email protected],
    tell me, my friend: how can I extend the standard Forms button in Java? What is the tool for that? Can you explain more, please, or give me a full example? I don't have any experience with that. I'm waiting for your reply.
    Thanks a lot for your cooperation.
    Ali
    Originally posted by [email protected]:
    Henrik, the Java importer lets you call Java classes on the app server side - I think what Ali is trying to do is integrate on the client side.
    If you want to add your own button, extend the standard Forms button in Java and use that class name in the Implementation Class property; the Java for your button will then be used instead of the standard Forms button. And since it extends the basic Forms button, it keeps all the standard button functionality.
    There is a white paper on OTN about this and we have created a new white paper which will be out in a couple of months (I think).
    Regards
    Grant Ronald
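    Following Grant's description, a minimal PJC sketch might look like the following; the package and class names are my assumptions, and the Forms client classes (frmall.jar) must be on the project's classpath. After deploying the compiled class to the server, set the item's Implementation Class property to my.forms.pjc.FancyButton.

    package my.forms.pjc;

    import java.awt.Color;
    import oracle.forms.ui.VButton;

    // Extends the standard Forms button, so all standard behavior is
    // inherited; only the appearance is customized here.
    public class FancyButton extends VButton {
        public FancyButton() {
            super();
            setBackground(Color.lightGray); // assumption: a simple visual tweak for the demo
        }
    }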

  • How to implement a forever running program?

    I made a program that does some file I/O in an infinite loop (while(!stop)). If all work is done, it waits until "jnotify" notifies it that a new folder was created, and it starts working again. This program has a JMX interface to start and stop this loop. All functionality is done; only the main() is missing now.
    How do I implement a forever-running main that only waits for (user-)methods to be called and executes them - just like any GUI application? I guess in a Swing app there is no while(!closed) loop? (In the meantime I have noticed that such loops consume 100% CPU ;-)). So is there a best practice or a "pattern" for such common apps?
    thx in advance

    817113 wrote:
    in my case the "inner-jnotify-file-io" thread itself is stoppable (while(!stop) listen for jnotify events). but if this thread is stopped, the hosting main method must keep running - waiting until someone calls start() via JMX, or shutdown(). Strictly speaking, the main method's only task is to host the JMX interface (and the logger...) and to run until someone calls shutdown (like a Swing GUI that waits until someone closes the window). How to implement such behavior? Do I need another thread within main, waiting for such an event and messaging it to the hosting main method?
    Main already runs in its own thread. If your "inner-jnotify-file-io" thread has to be able to be stopped (instead of just paused), then you need another thread that stays alive.
    You could make the main thread wait() on something that gets notified() when close is called.
    (i could hardly believe that there is no "pattern" or example of a main() method/app that runs until someone calls a "close" method)
    No, you're just not understanding the big picture here. It's quite easy to make an app like that. And you've been given ways and information on how to do it.
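    A minimal sketch of that wait/notify idea, here using a CountDownLatch so the main thread blocks without burning CPU; the class and method names are assumptions:

    import java.util.concurrent.CountDownLatch;

    public class Daemon {
        private static final CountDownLatch shutdownLatch = new CountDownLatch(1);

        // In the original design this would be exposed through the JMX interface.
        public static void shutdown() {
            shutdownLatch.countDown();
        }

        public static void main(String[] args) throws InterruptedException {
            // ... register the JMX MBean and start the jnotify worker thread here ...
            shutdownLatch.await(); // blocks here without consuming CPU
            // ... stop the worker thread and clean up; the JVM then exits ...
        }
    }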

  • How to implement authorization at sales order/projects level?

    If we open a sales order in VA02 and then try to open the same sales order in another session in VA02, there will be a message that the sales order is being processed.
    I wish to implement similar functionality in my application, which consists of an editable report.
    I tried to debug the VA02 case mentioned above to learn more about it, but could not understand anything!
    I know that lock objects can be helpful here, but in my case there can be a range of sales orders (and projects within a range of profit centers).
    How to implement such functionality?
    Edited by: Ankit Modi on Dec 30, 2009 10:54 AM
    Edited by: Ankit Modi on Dec 30, 2009 10:55 AM

    @ Max -
    Yes Max, but the problem is that I want the lock on many sales orders at a time. It's like this: there is a report in which many rows (for sales orders) are editable, and I want all of those locked at the same time. This will not work with a lock object, I believe...
    @ Soumya -
    Yes Soumya, but, as I explained above, I want a range of sales orders rather than one sales order in particular. That will not work with a lock object, right?

  • How many physical network adapters are required to implement an NVGRE gateway?

    Hello Guys,
    I would like to know how many physical network adapters are required to implement network virtualization with technologies such as Windows Azure Pack, System Center 2012 R2 & Hyper-V 2012 R2.
    Thanks 
    NM-BG

    A quick search of TechNet will turn up all sorts of step-by-step guides, overviews, and blogs with lots of information about how to configure and set this up. Minimum of two. Actual configurations may vary according to business needs and requirements.
    . : | : . : | : . tim

  • How to implement this requirement

    Greetings!!!
    The following are my tables
    CREATE TABLE SHIPMENT_CARTON(
    SSSD_ID        NUMBER(10) NOT NULL,
    CTN_MEAS_ID    NUMBER(10),
    SSSD_NO_CTN    NUMBER(10),
    SSSD_GW        NUMBER(10,2),
    SSSD_NW        NUMBER(10,2),
    CONSTRAINT PK_SHIPMENT_CARTON PRIMARY KEY(SSSD_ID))
    TABLESPACE order_data01
         STORAGE(
         INITIAL 10K
         NEXT 10K
         MINEXTENTS 5
         MAXEXTENTS 10);
    ALTER TABLE SHIPMENT_CARTON ADD CONSTRAINT FK_SHIPMENT_CARTON1 FOREIGN KEY(CTN_MEAS_ID)
    REFERENCES CARTON_MEAS(CTN_MEAS_ID);
    CREATE TABLE ITEMS_IN_CARTON(
    IIC_ID         NUMBER(10) NOT NULL,
    IIC_QTY        NUMBER(10),
    SSSD_ID        NUMBER(10),
    OI_ITEM_ID     NUMBER(10),
    CONSTRAINT PK_ITEMS_IN_CARTON PRIMARY KEY(IIC_ID))
    TABLESPACE order_data01
         STORAGE(
         INITIAL 10K
         NEXT 10K
         MINEXTENTS 5
         MAXEXTENTS 10);
    ALTER TABLE ITEMS_IN_CARTON ADD CONSTRAINT FK_ITEMS_IN_CARTON1 FOREIGN KEY(SSSD_ID)
    REFERENCES SHIPMENT_CARTON(SSSD_ID);
    ALTER TABLE ITEMS_IN_CARTON ADD CONSTRAINT FK_ITEMS_IN_CARTON2 FOREIGN KEY(OI_ITEM_ID)
    REFERENCES ORDER_ITEM(OI_ITEM_ID);
    REQUIREMENT:
    More than one item code will be put in one carton.
    Solution:
    I implemented it as a master-detail entry wherein the user initially selects the carton size (CTN_MEAS_ID), enters the number of cartons (SSSD_NO_CTN),
    gross weight (SSSD_GW), net weight (SSSD_NW) and the primary key (SSSD_ID, sequence generated) in table SHIPMENT_CARTON (Master Block Entry).
    After which, the user selects item codes (OI_ITEM_ID) and enters the quantity (IIC_QTY) and the primary key (IIC_ID, sequence generated) in
    ITEMS_IN_CARTON (Detail Block Entry).
    MY NEW REQUIREMENT:
    More than one item code will be put in one carton.
    The user will initially select the item code(s) and quantity, then select the carton size and enter the number of cartons, gross & net weight.
    How do I implement this in the reverse manner?
    I hope somebody out there can suggest how to do it in Forms 10g.
    Thank you in advance...

    It depends on what the front end should look like and whether the carton details are known immediately after the items have been entered or not. I'll assume the carton details will not be known for some time.
    Presumably you have other tables in the model, for example there would probably be a table above shipment_carton. "Shipment" perhaps? If not then create it and foreign key both tables here to the Shipment table. When you add items you add them to a shipment and when you add cartons you add them to a shipment. Then you just have to assign items to cartons within that shipment. The foreign key from items to cartons must now include the Carton AND Shipment primary key columns so you know that the items are not being put into another shipment's carton.

  • How to implement the schema validation with XSD in adapter module

    Dear All,
    I am trying to develop an EJB as the file adapter module.
    Please guide me on how to implement schema validation of the source message against an XSD,
    or point me to relevant resources about this task.
    Thanks & Regards,
    Red
    Edited by: Grace Chien on Nov 19, 2008 8:23 AM

    Hi Grace,
    You can do the XML schema validation directly in PI 7.1.
    To develop an adapter module for XML schema validation, see:
    Validating messages in XI using XML Schema
    Schema Validation of Incoming Message
    Regards
    Goli Sridhar
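    For the validation step itself, the standard javax.xml.validation API can be called from such a module. A minimal sketch of a helper the module's process() method could delegate to; the class name and the way the XSD path is supplied are assumptions, and all SAP PI-specific plumbing is omitted:

    import java.io.ByteArrayInputStream;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class XsdValidationHelper {
        // Throws an exception with a descriptive message when the payload violates the XSD.
        public static void validate(byte[] payload, String xsdPath) throws Exception {
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new StreamSource(xsdPath));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new ByteArrayInputStream(payload)));
        }
    }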

  • How to implement a characteristic validation that forbids the InfoCube update?

    Hi Gurus,
    I need to implement in SEM-BPS a validation so that the update of a certain combination of characteristics is forbidden.
    I built an exit function. It gives an error message when this combination of characteristics occurs, but if the user then pushes the save button, the information is still updated into the InfoCube.
    Can someone help me? How do I implement this validation so that it forbids the update into the InfoCube?
    Thanks in advance for your help.

    Hi,
    I'm using the Web Interface. In order to execute the planning function when the save button is pressed, I implemented the following how-to:
    How To... Run Planning Sequences on Save and other events (web) - http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e032e790-0201-0010-8bae-f9bc2b1e5aaa
    On the other side, the message given back by the planning function is an error message, yet I can still save data into the InfoProvider.
    Since you confirm that this standard behavior is the desired one, I'm going to continue this way in order to obtain the desired result.
    Thanks a lot.

  • How to implement Dynamic form validation

    Hello,
    I've been reading up on the forums about using the htmldb_item package to dynamically create form elements from a SQL query. Assuming I can generate the form dynamically, what approaches/options are available for implementing dynamic validation of the form elements themselves?
    Thanks in advance

    You could write a page-level 'PL/SQL Function Body returning error text' validation in which you loop over your dynamic form elements (the htmldb_application.g_f01 .. g_f50 arrays), check the values, and raise errors as needed.

  • What is BI? How do we implement it, and what is the cost to implement?

    What is BI? How do we implement it, and what is the cost to implement?
    Thanks,
    Sumit.

    Hi Sumit,
    Below is a description addressing your query.
    Business Intelligence is a process for increasing the competitive advantage of a business by intelligent use of available data in decision making. This process is pictured below.
    The five key stages of Business Intelligence:
    1.     Data Sourcing
    2.     Data Analysis
    3.     Situation Awareness
    4.     Risk Assessment
    5.     Decision Support
    Data sourcing
    Business Intelligence is about extracting information from multiple sources of data. The data might be: text documents - e.g. memos or reports or email messages; photographs and images; sounds; formatted tables; web pages and URL lists. The key to data sourcing is to obtain the information in electronic form. So typical sources of data might include: scanners; digital cameras; database queries; web searches; computer file access; etcetera.
    Data analysis
    Business Intelligence is about synthesizing useful knowledge from collections of data. It is about estimating current trends, integrating and summarising disparate information, validating models of understanding, and predicting missing information or future trends. This process of data analysis is also called data mining or knowledge discovery. Typical analysis tools might use:-
    •     probability theory - e.g. classification, clustering and Bayesian networks;
    •     statistical methods - e.g. regression;
    •     operations research - e.g. queuing and scheduling;
    •     artificial intelligence - e.g. neural networks and fuzzy logic.
    Situation awareness
    Business Intelligence is about filtering out irrelevant information, and setting the remaining information in the context of the business and its environment. The user needs the key items of information relevant to his or her needs, and summaries that are syntheses of all the relevant data (market forces, government policy etc.). Situation awareness is the grasp of the context in which to understand and make decisions. Algorithms for situation assessment provide such syntheses automatically.
    Risk assessment
    Business Intelligence is about discovering what plausible actions might be taken, or decisions made, at different times. It is about helping you weigh up the current and future risk, cost or benefit of taking one action over another, or making one decision versus another. It is about inferring and summarising your best options or choices.
    Decision support
    Business Intelligence is about using information wisely. It aims to warn you of important events, such as takeovers, market changes, and poor staff performance, so that you can take preventative steps. It seeks to help you analyse and make better business decisions, to improve sales or customer satisfaction or staff morale. It presents the information you need, when you need it.
    This section describes how we are using extraction, transformation and loading (ETL) processes and a data warehouse architecture to build our enterprise-wide data warehouse in incremental project steps. Before an enterprise-wide data warehouse could be delivered, an integrated architecture and a companion implementation methodology needed to be adopted. A productive and flexible tool set was also required to support ETL processes and the data warehouse architecture in a production service environment. The resulting data warehouse architecture has the following four principal components:
    • Data Sources
    • Data Warehouses
    • Data Marts
    • Publication Services
    ETL processing occurs between data sources and the data warehouse, between the data warehouse and data marts and may also be used within the data warehouse and data marts.
    Data Sources
    The university has a multitude of data sources residing in different Data Base Management System (DBMS) tables and non-DBMS data sets. To ensure that all relevant data source candidates were identified, a physical inventory and logical inventory was conducted. The compilation of these inventories ensures that we have an enterprise-wide view of the university data resource.
    The physical inventory was comprised of a review of DBMS cataloged tables as well as data sets used by business processes. These data sets had been identified through developing the enterprise-wide information needs model.
    The logical inventory was constructed from "brain-storming" sessions which focused on common key business terms which must be referenced when articulating the institution's vision and mission (strategic direction, goals, strategies, objectives and activities). Once the primary terms were identified, they were organized into directories such as "Project", "Location", "Academic Entity", "University Person", "Budget Envelope" etc. Relationships were identified by recognizing "natural linkages" within and among directories, and the "drill-downs" and "roll-ups" that were required to support "report by" and "report on" information hierarchies. This exercise allowed the directories to be sub-divided into hierarchies of business terms which were useful for presentation and validation purposes.
    We called this important deliverable the "Conceptual Data Model" (CDM) and it was used as the consolidated conceptual (paper) view of all of the University's diverse data sources. The CDM was then subjected to a university-wide consultative process to solicit feedback and communicate to the university community that this model would be adopted by the Business Intelligence (BI) project as a governance model in managing the incremental development of its enterprise-wide data warehousing project.
    Data Warehouse
    This component of our data warehouse architecture (DWA) is used to supply quality data to the many different data marts in a flexible, consistent and cohesive manner. It is a 'landing zone' for inbound data sources and an organizational and re-structuring area for implementing data, information and statistical modeling. This is where business rules which measure and enforce data quality standards for data collection in the source systems are tested and evaluated against appropriate data quality business rules/standards which are required to perform the data, information and statistical modeling described previously.
    Inbound data that does not meet data warehouse data quality business rules is not loaded into the data warehouse (for example, if a hierarchy is incomplete). While it is desirable for rejected and corrected records to occur in the operational system, if this is not possible then start dates for when the data can begin to be collected into the data warehouse may need to be adjusted in order to accommodate necessary source systems data entry "re-work". Existing systems and procedures may need modification in order to permanently accommodate required data warehouse data quality measures. Severe situations may occur in which new data entry collection transactions or entire systems will need to be either built or acquired.
    We have found that a powerful and flexible extraction, transformation and loading (ETL) process is to use Structured Query Language (SQL) views on host database management systems (DBMS) in conjunction with a good ETL tool such as SAS® ETL Studio. This tool enables you to perform the following tasks:
    • The extraction of data from operational data stores
    • The transformation of this data
    • The loading of the extracted data into your data warehouse or data mart
    When the data source is a "non-DBMS" data set it may be advantageous to pre-convert this into a SAS® data set to standardize data warehouse metadata definitions. Then it may be captured by SAS® ETL Studio and included in the data warehouse along with any DBMS source tables using consistent metadata terms. SAS® data sets, non-SAS® data sets, and any DBMS table will provide the SAS® ETL tool with all of the necessary metadata required to facilitate productive extraction, transformation and loading (ETL) work.
    Having the ability to utilize standard structured query language (SQL) views on host DBMS systems and within SAS® is a great advantage for ETL processing. The views can serve as data quality filters without having to write any procedural code. The option exists to "materialize" these views on the host systems or leave them "un-materialized" on the hosts and "materialize" them on the target data structure defined in the SAS® ETL process. These choices may be applied differentially depending upon whether you are working with "current only" or "time series" data. Different deployment configurations may be chosen based upon performance issues or cost considerations. The flexibility of choosing different deployment options based upon these factors is a considerable advantage.
    Data Marts
    This component of the data warehouse architecture may manifest as the following:
    • Customer "visible" relational tables
    • OLAP cubes
    • Pre-determined parameterized and non-parameterized reports
    • Ad-hoc reports
    • Spreadsheet applications with pre-populated work sheets and pivot tables
    • Data visualization graphics
    • Dashboard/scorecards for performance indicator applications
    Typically a business intelligence (BI) project may be scoped to deliver an agreed-upon set of data marts in a project. Once these have been well specified, the conceptual data model (CDM) is used to determine what parts need to be built or used as a reference to conform the inbound data from any new project. After the detailed data mart specifications (DDMS) have been verified and the conceptual data model (CDM) components determined, a source and target logical data model (LDM) can be designed to integrate the detailed data mart specifications (DDMS) and conceptual data model (CDM). An extraction, transformation and loading (ETL) process can then be set up and scheduled to populate the logical data models (LDM) from the required data sources and assist with any time series and data audit change control requirements.
    Over time, as more and more data marts and logical data models (LDMs) are built, the conceptual data model (CDM) becomes more complete. One very important advantage of this implementation methodology is that the order of the data marts and logical data models can be entirely driven by project priority, project budget allocation and time-to-completion constraints/requirements. This data warehouse architecture implementation methodology does not need to dictate project priorities or project scope as long as the conceptual data model (CDM) exercise has been successfully completed before the first project request is initiated.
    McMaster's Data Warehouse design
    [Diagram: development, test and production warehouse environments fed by DB2/Oracle operational sources through ETL and a staging area, supplying data marts (SAS data sets) that users access through a BI tool.]
    Publication Services
    This is the visible presentation environment that business intelligence (BI) customers will use to interact with the published data mart deliverables. The SAS® Information Delivery Portal will be utilized as a web delivery channel to deliver a "one-stop information shopping" solution. This software solution provides an interface to access enterprise data, applications and information. It is built on top of the SAS Business Intelligence Architecture, provides a single point of entry and provides a Portal API for application development. All of our canned reports generated through SAS® Enterprise Guide, along with a web-based query and reporting tool (SAS® Web Report Studio) will be accessed through this publication channel.
    Using the portal's personalization features we have customized it for a McMaster "look and feel". Information is organized using pages and portlets and our stakeholders will have access to public pages along with private portlets based on role authorization rules. Stakeholders will also be able to access SAS® data sets from within Microsoft Word and Microsoft Excel using the SAS® Add-In for Microsoft Office. This tool will enable our stakeholders to execute stored processes (a SAS® program which is hosted on a server) and embed the results in their documents and spreadsheets. Within Excel, the SAS® Add-In can:
    • Access and view SAS® data sources
    • Access and view any other data source that is available from a SAS® server
    • Analyze SAS® or Excel data using analytic tasks
    The SAS® Add-In for Microsoft Office will not be accessed through the SAS® Information Delivery Portal as this is a client component which will be installed on individual personal computers by members of our Client Services group. Future stages of the project will include interactive reports (drill-down through OLAP cubes) as well as balanced scorecards to measure performance indicators (through SAS® Strategic Performance Management software). This, along with event notification messages, will all be delivered through the SAS® Information Delivery Portal.
    Publication is also channeled according to audience with appropriate security and privacy rules.
    SECURITY – AUTHENTICATION AND AUTHORIZATION
    The business value derived from using the SAS® Value Chain Analytics includes an authoritative and secure environment for data management and reporting. A data warehouse may be categorized as a "collection of integrated databases designed to support managerial decision making and problem solving functions" and "contains both highly detailed and summarized historical data relating to various categories, subjects, or areas". Implementation of the research funding data mart at McMaster has meant that our stakeholders now have electronic access to data which previously was not widely disseminated. Stakeholders are now able to gain timely access to this data in the form that best matches their current information needs. Security requirements are being addressed taking into consideration the following:
    • Data identification
    • Data classification
    • Value of the data
    • Identifying any data security vulnerabilities
    • Identifying data protection measures and associated costs
    • Selection of cost-effective security measures
    • Evaluation of effectiveness of security measures
    At McMaster access to data involves both authentication and authorization. Authentication may be defined as the process of verifying the identity of a person or process within the guidelines of a specific
    security policy (who you are). Authorization is the process of determining which permissions the user has for which resources. Authentication is also a prerequisite for authorization. At McMaster, business intelligence (BI) services that are not public require a sign-on with a single university-wide login identifier, which is currently authenticated using Microsoft Active Directory. After a successful authentication, the university login identifier can be used by the SAS® Metadata Server. No passwords are ever stored in SAS®. Future plans at the university call for this authentication to be done using Kerberos.
    At McMaster aggregate information will be open to all. Granular security is being implemented as required through a combination of SAS® Information Maps and stored processes. SAS® Information Maps consist of metadata that describe a data warehouse in business terms. Through using SAS® Information Map Studio which is an application used to create, edit and manage SAS® Information Maps, we will determine what data our stakeholders will be accessing through either SAS® Web Report Studio (ability to create reports) or SAS® Information Delivery Portal (ability to view only). Previously access to data residing in DB-2 tables was granted by creating views using structured query language (SQL). Information maps are much more powerful as they capture metadata about allowable usage and query generation rules. They also describe what can be done, are database independent and can cross databases and they hide the physical structure of the data from the business user. Since query code is generated in the background, the business user does not need to know structured query language (SQL). As well as using Information Maps, we will also be using SAS® stored processes to implement role based granular security.
    At the university some business intelligence (BI) services are targeted for particular roles such as researchers. The primary investigator role of a research project needs access to current and past research funding data at both the summary and detail levels for their research project. A SAS® stored process (a SAS® program which is hosted on a server) is used to determine the employee number of the login by checking a common university directory and then filtering the research data mart to selectively provide only the data that is relevant for the researcher who has signed onto the decision support portal.
    Other business intelligence (BI) services are targeted for particular roles such as Vice-Presidents, Deans, Chairs, Directors, Managers and their Staff. SAS® stored processes are used as described above with the exception that they filter data on the basis of positions and organizational affiliations. When individuals change jobs or new appointments occur the authorized business intelligence (BI) data will always be correctly presented.
    As the SAS® stored process can be executed from many environments (for example, SAS® Web Report Studio, SAS® Add-In for Microsoft Office, SAS® Enterprise Guide) authorization rules are consistently applied across all environments on a timely basis. There is also potential in the future to automatically customize web portals and event notifications based upon the particular role of the person who has signed onto the SAS® Information Delivery Portal.
    ARCHITECTURE (PRODUCTION ENVIRONMENT)
    We are currently in the planning stages for building a scalable, sustainable infrastructure which will support a scaled deployment of the SAS® Value Chain Analytics. We are considering implementing the following three-tier platform which will allow us to scale horizontally in the future:
    Our development environment consists of a server with 2 x Intel Xeon 2.8GHz processors and 2GB of RAM, running Windows 2000 – Service Pack 4.
    We are considering the following for the scaled roll-out of our production environment.
    A. Hardware
    1. Server 1 - SAS® Data Server
    - 4 way 64 bit 1.5Ghz Itanium2 server
    - 16 Gb RAM
    - 2 x 73 Gb Drives (RAID 1) for the OS
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for Itanium
    2. Mid-Tier (Web) Server
    - 2 way 32 bit 3Ghz Xeon Server
    - 4 Gb RAM
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for x86
    3. SAN Drive Array (modular and can grow with the warehouse)
    - 6 x 72GB Drives (RAID 5), 360GB total, for SAS® and Data
    B. Software
    1. Server 1 - SAS® Data Server
    - SAS® 9.1.3
    - SAS® Metadata Server
    - SAS® WorkSpace Server
    - SAS® Stored Process Server
    - Platform JobScheduler
    2. Mid-Tier Server
    - SAS® Web Report Studio
    - SAS® Information Delivery Portal
    - BEA WebLogic for the future SAS® SPM Platform
    - Xythos Web File System (WFS)
    3. Client-Tier Server
    - SAS® Enterprise Guide
    - SAS® Add-In for Microsoft Office
    REPORTING
    We have created a number of parameterized stored processes using SAS® Enterprise Guide, which our stakeholders will access as both static (HTML as well as PDF documents) and interactive reports (drill-down) through SAS® Web Report Studio and the SAS® Add-In for Microsoft Office. All canned reports along with SAS® Web Report Studio will be accessed through the SAS® Information Delivery Portal.
    NEXT STEPS
    Next steps of the project include development of a financial data mart along with appropriate data quality standards, monthly frozen snapshots and implementation of university-wide financial reporting standards. This will facilitate electronic access to the integrated financial information necessary for the development and maintenance of an integrated, multi-year financial planning framework. Canned reports will include monthly web-based financial statements with drill-down capability, along with budget templates automatically populated with data values and saved in different workbooks for different subgroups (for example, by Department). The latter will be accomplished using Microsoft Dynamic Data Exchange (DDE).
    As well, we will begin the implementation of SAS® Strategic Performance Management software to support the performance measurement and monitoring initiative that is a fundamental component of McMaster's strategic plan. This tool will assist in critically assessing and identifying meaningful and statistically relevant measures and indicators. This software can perform causal analyses among various measures within and across areas, providing useful information on inter-relationships between factors and measures. As well as demonstrating how decisions in one area affect other areas, these cause-and-effect analyses can reveal both good performance drivers and possible detractors, and enable 'evidence-based' decision-making. Finally, the tool provides a balanced scorecard reporting format, designed to identify statistically significant trends and results that can be tailored to the specific goals, objectives and measures of the various operational areas of the University.
    LESSONS LEARNED
    Lessons learned include the importance of taking a consultative approach not only in assessing information needs, but also in building data hierarchies, understanding subject matter, and in prioritizing tasks to best support decision making and inform senior management. We found that a combination of training and mentoring (knowledge transfer) helped us accelerate learning the new tools. It was very important to ensure that time and resources were committed to complete the necessary planning and data quality initiatives prior to initiating the first project. When developing a project plan, it is important to

  • How to implement this scenario in the workflow - more details given inside

    We are implementing a simple PO release workflow in ECC 5.0. The workflow just has a decision step and a release step. We have implemented the workflow in our system and it seems to be working fine.
    Problem Statement:
    The client has many Purchase Order Administrators or Purchasing Clerks who need to approve the PO. But the client does not have enough user logins or licenses to have more users in SAP. So they are asking whether it is possible to keep the workflow in SAP but to execute it from outside SAP using one user login.
    To be more specific, Here is what they are asking.
    1). They want that SAP must send a work item to their lotus notes inbox with a URL link in the email.
    2). The user will open the email from Lotus notes and then click on the URL link which might be something similar to the decision work item. The user will select YES or NO which is similar to clicking approve or reject
    3). Once the user has made a decision by clicking on the URL in the email, they want the PO to be released in SAP with a common user ID, say WF-BATCH.
    4). We cannot use extended notifications, as we would like to have only one user, WF-BATCH, for approval but send emails to all the purchasing clerks. I believe that extended notifications can only send emails to the ID configured in the user master, whereas the client wants to have only one user in SAP but many PO approvers outside SAP.
    My questions:
    1). Can we configure extended notifications in workflow such that the system sends emails to different purchasing clerks in their lotus notes inbox? If yes then how can I configure that?
    2). If it is not possible then how do we achieve the requirement from outside SAP but still have the workflow?
    3). If we need to implement some external bespoke solutions then what all things would be involved? Any guidance on how this scenario can be achieved and what issues we may have?
    4). How can we trigger a workitem from outside SAP but still make it work as if it would have been triggered from SAP. Can we call some RFC FM to trigger the workitem and force a single user id?
    Any suggestions are welcome. Thanks in advance.
    any idea guys?
    Edited by: Workflow  learner on Jul 28, 2008 11:22 AM

    Well, apart from the obvious licensing issues which you should definitely discuss with your account manager, that's going to fail any audit. The 'signature' on the approval is the user's login, so if approvals are done by a common userid you will need to put a new authentication mechanism in place.
    The two major issues are:
    Forwarding - how do you control someone forwarding it to someone else to approve?
    Traceability - how can you prove who really approved it?
    This is not a small project. Much of this depends on your internal infrastructure; you will need to get Basis, Security, Networks, and Mail people involved, and we cannot really help you much at this point.
    Sorry it's not the answer you want to hear.
    Good luck,
    Mike

Maybe you are looking for

  • Twist S230U, should I do Win 8.1 upgrade

    I read in these forums of all the problems people are having after doing the 8.1 upgrade.  Does every one who upgrades end with problems or are most upgrades successful? To upgrade or not to upgrade.

  • Standard FM/ Program/BADI which converts approved SC to Purchase Order

    Hi, In SRM, once the Shopping Cart is created and approved, a Purchase Order (PO) will be generated. I need to debug this standard functionality to analyze one issue. Can anybody provide me the FM/BADI name which does that? Thanks in advance. Regards

  • How to change color of slider-thumb?

    Hi, I want to change the color of the slider in my gui. I tried it with the UIManager: public static void main(String[]args){ //UIManager.put("Slider.thumb", Color.BLACK); //no effect UIManager.put("Slider.foreground", Color.BLACK); UIMangager.put("S

  • MSI GT683DXR - HOW TO SWITCH TO RAID?

    Hi everyone, I'm inclined to switch to RAID 0 on my MSI GT683DXR. (It's the French model but I've put an English Keyboard in). I have 2x 750GB Hard Drives inside. I would very much appreciate if somebody could guide me on how to go about the process.

  • How come since FF11 memory consumption has gone down from 1.2 to 0.5 GB ;-)

    How come since FF11 memory consumption has gone down from 1.2 to 0.5 GB ;-) What have you done .... specifically ? Also, I have almost 50 popular addons :-) Note: I`ve checked the FF11 release notes and a search for memory gives mainly memory account