How is SEQUENCE_NUM derived in AP_SELECTED_INVOICES_ALL?

Hello,
Please can you advise how the SEQUENCE_NUM is derived in the AP_SELECTED_INVOICES_ALL table in Payables?
The sequence number is said to be the order in which Payables pays supplier invoices. How does Payables decide on the order of payment?
I very much look forward to your response,
Kind Regards, Claire

Hi Claire,
If my understanding of the question is correct, these are the steps you need to perform to get SEQUENCE_NUM (the generated voucher number) assigned in sequential order in Payables:
1. Define the document sequence
Responsibility – System Administrator
Navigation – Application – Document – Define
2. Define the document category
Document Category – Payables
Responsibility – System Administrator
Navigation – Application – Document – Category
3. Assign the document sequence
Document Assignment – Payables
Responsibility – System Administrator
Navigation – Application – Document – Assign
4. Set the profile option: Sequential Numbering – Always Used / Partially Used
Sequential Numbering – Payables
Responsibility – System Administrator
Navigation – Profile – System
The profile option name is Sequential Numbering.
Please ignore the steps suggested if my understanding is wrong.
Thanks
Manish Jain.

Similar Messages

  • Deriving / changing the item category from the handling unit

    Hi SAP gurus,
    In VL01N I want to derive the item category based on the handling unit. The handling unit is derived from the payment terms.
    Please let me know the required configuration or development for this.
    Thanks

    I need your help regarding sales order item category redetermination.
    We have two item categories, YNAE and YNAG, with schedule line categories CP and CB respectively (i.e. for item category YNAE the schedule line category is CP, and for item category YNAG it is CB). For YNAG a purchase requisition is created automatically on save because of the functional settings, and that works fine.
    For every line item in the sales order, item category YNAE is determined by default as per the functional settings.
    Now my requirement is that, if a sales order line item does not have a 100% confirmed quantity, I need to change the default item category YNAE to YNAG. I am trying to do this in USEREXIT_PRICING_PREPARE_TKOMP, because in this exit I have the values in XVBAP and XVBEP, and that part works fine.
    But the schedule line category is not redetermined automatically. What I mean is: if I change the item category from YNAE to YNAG in the user exit, the corresponding schedule line category CB of YNAG is not determined automatically at schedule line level; and if I change the schedule line categories in XVBEP in USEREXIT_PRICING_PREPARE_TKOMP, the changes are reflected at schedule line level, but the purchase requisition is not created for the schedule line on save, and its delivery dates are not determined properly. If I press the ATP check option at schedule line level, then everything is corrected in the order.
    I am using UPDKZ = 'U' in both XVBAP and XVBEP for the changes.
    I feel that I am not doing this the proper way. Do you have any idea how to redetermine the item category? Please let me know if you have any ideas.
    Regards,
    Vaddepally Manoj
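    For reference, a minimal, hedged sketch of the item-category switch described above, as it could look inside USEREXIT_PRICING_PREPARE_TKOMP (include MV45AFZZ); the confirmed-quantity check is deliberately left as a placeholder, since it would have to be summed from the item's XVBEP schedule lines:

    DATA: LS_XVBAP LIKE LINE OF XVBAP.

    LOOP AT XVBAP INTO LS_XVBAP WHERE PSTYV = 'YNAE'.
      " Placeholder: switch only when the item's confirmed quantity
      " (sum of BMENG over its XVBEP lines) is below the ordered KWMENG.
      LS_XVBAP-PSTYV = 'YNAG'.   "redetermine the item category
      LS_XVBAP-UPDKZ = 'U'.      "flag the row as changed
      MODIFY XVBAP FROM LS_XVBAP.
    ENDLOOP.

    As the post notes, changing XVBAP alone does not make the system redetermine the schedule line category; XVBEP has to be adjusted consistently as well, which is exactly where the purchase requisition and delivery date problems appear.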

  • HT1386 I am unable to sync photo albums from iPad 3 to iTunes on my Windows PC. In addition to the Camera Roll and iPod cache albums, I have 6 other albums made from pictures derived from the Camera Roll. I have trawled the net for the last 8 hours in vain!

    Dear Community members,
    I am unable to sync photo albums from my iPad 3 to iTunes on my Windows 7 PC.
    In addition to the Camera Roll and iPod cache default albums, I have 6 other albums made from pictures derived from the Camera Roll, e.g. family pictures in one folder, office pictures in another, plant pictures in another, house architecture pictures in another; all of these also exist in the Camera Roll.
    I have trawled the net for the last 8 hours in vain!
    I have iOS 5 on the iPad 3 and the latest iTunes on the Windows PC.
    As a precaution, I have used the Windows "add pictures from...." option to keep a copy in a separate folder on the Windows desktop, but again it has all the pictures mixed up together and not sorted into the various albums as on the iPad!
    How can I replicate the albums on the Windows PC?
    Regards
    Sohal NS

    Sync the photos back to the iPad using iTunes. You have to place the photos into a folder from which you sync photos. All photos that you want to sync to the device must be included in the main photos folder and selected to sync each time that you sync.
    Connect the iPad to the PC and launch iTunes.
    Click on the iPad name on the left side under devices.
    Click on the Photos Tab on the right.
    Click on the Sync Photos From heading in order to select it.
    Select the photos folder that you want to sync from.
    Select all of the albums or photos within the folder that you want to sync.
    Click on Apply in the lower right corner of iTunes.
    You can read more here.
    http://support.apple.com/kb/HT4236

  • Profit Center derivation at the time of invoicing

    Hi,
    Business requirement is to map fixed profit center for combination of Main and Sub transactions at the time of invoicing.
    As per my understanding, to derive fixed values for main and sub-transactions, COKEY needs to be mapped at EK02, which in turn will pick up values from tables TECOD and TFKCOD as the case may be.
    Values are also flowing correctly in FM ISU_CODATA_DETERMINE and FKK_CODATA_DETERMINE.
    But actual results are not matching with expected values.
    For profit center derivation, the relevant SAP notes point to events 0140 and 1102, neither of which is triggered during invoicing.
    Moreover, I have also tried maintaining values in posting area 2610, but still no change.
    So could you kindly provide your expert opinion on how to achieve the subject requirement?
    Regards,
    Paresh

    Hi,
    Finally found solution through EK02.
    Here is the basic crux of the derivation at the time of invoicing.
    The profit center is derived from the contract (through COKEY). If this value needs to be overridden, set the field NOCRCT to 'X' with the proper values in COKEY at EK02 level.
    Moreover, to ensure that it works, always do a full reversal (EA13) and then rebill and reinvoice.
    Hope it helps others who have to face such a requirement from the business.
    Regards,
    Paresh

  • How to call a derived class from a base class

    Hello everybody,
    I created a GUI application in Java Swing. Now I want to navigate between the screens, but switching between the screens is very slow because I imported the class from one package into another package. Now I want to extend one package's class in another package to reduce the navigation time, but it gives an error because I cannot call my derived class from the base class. If anyone knows the answer to this, please reply.
    If there is any other method to optimise the navigation time, please tell me.
    by
    (kamal)

    Sorry, I've got major difficulties understanding your query:
    "I create a GUi application in java swing." - OK.
    "Now i want to navigate between the screen" - What? Switch screens? Display a different dialog?
    "but the timing between the screen is very slow" - Timing? Do you mean the time it takes to display a different dialog?
    "bcoz i imported the class from package to another package." - How did you come to the conclusion that that is the reason for the slowness? Did you do any profiling, or is this just guess-work?

  • How to delete derivation rule - COPA (KEA5)

    Hi All,
    Could you please inform me how to delete a derivation rule (COPA_KOSTL) in transaction KEA5? When we try to delete it, the system displays the following message:
    Message no. K6430
    Diagnosis
    You are trying to delete a field delivered by SAP, although you are in the customer name range.
    System Response
    You can delete the field, but must assume that the field will reappear in the next put.
    Procedure
    We are dealing with a currency  - you can delete the field.
    We added this field to the operating concern and now we want to delete it.
    Awaiting your valuable replies.
    Thanks in advance
    Srinivasa Chary

    Hi,
    In KEA5, did you copy the fixed characteristics and create your own user characteristics? It is better not to copy the standard fields along with the table, since both will have the same features if you just rename one to a different description text.
    Regards,
    Sreekanth

  • 11gR2 OLAP - How to define or set the size of the OLAP PAGE?

    Hi,all!
    Could you help me? How to define or set the size of the OLAP PAGE in 11gR2?

    Hi there,
    The size of an OLAP AW page is automatically derived from the block size of the tablespace in which it is stored (which in turn is typically derived from the db_block_size initialisation parameter).
    Do you have a particular concern about this setting?
    Thanks,
    Stuart Bunby
    OLAP Blog: http://oracleOLAP.blogspot.com
    OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
    OLAP on OTN: http://www.oracle.com/technology/products/bi/olap/index.html
    DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html

  • How can I derive 0CALYEAR (calendar year) from ZBUDQUART?

    Hi friends,
    I have ZBUDQUART; for example, its value is 200402, which means I just have the year and the month.
    Now, from this, how can I derive 0CALYEAR?
    thanks and regards
    sampath

    Hi,
    If ZBUDQUART follows the calendar year exactly, then just write the following logic in the 0CALYEAR update rule:
    RESULT = COMM_STRUCTURE-ZBUDQUART+0(4). "the first four characters, i.e. the year
    Hope it helps...
    regards,
    raju

  • How can I derive measures for coming periods, like (Next Qtr – Measure)?

    Hi All,
    How can I derive time-series measures for coming periods? Just as with Qtr Ago and Year Ago, I want Next Qtr and Next Year columns in my fact table. How can I achieve this?
    I have a report which compares actual recorded values to the projected values of the current quarter, the quarter ago, and the next quarter:
    Ago Qtr | Current Qtr | Next Qtr
    xxx | xxxx | xxxxx
    yyy | yyy  | yyyy
    Ago(measure, level, no of periods)
    Thanks,
    SMA

    Hi All,
    I have derived the next-to-next quarter's starting date using the following formula:
    TIMESTAMPADD( SQL_TSI_QUARTER , 3, TIMESTAMPADD( SQL_TSI_QUARTER , -(1), TIMESTAMPADD( SQL_TSI_DAY , -( EXTRACT( DAY_OF_QUARTER FROM "Business Intelligence"."Dso Date Dim"."Date")) + 1, "Business Intelligence"."Dso Date Dim"."Date")))
    I have extracted the quarter from the derived date as well:
    EXTRACT( QUARTER_OF_YEAR FROM "Business Intelligence"."Dso Date Dim"."Next Qtr Start Date")
    However, I could not create a measure for the next quarter using the Ago or ToDate functions in the BMM layer of OBIEE.
    Please let me know if anyone has worked on this before.
    Thanks,
    SMA

  • How to find the link between the field KZBEW (movement indicator) and EBELN?

    Hi all,
    While looking for the link between KZBEW and EBELN, the only table I found is MSEG.
    But I want to populate the data in this table only.
    So please give me the solution: how can I fetch the movement indicator for a particular purchase order?
    As I am using the IDoc type MBGMCR02 (for creating goods receipts), I have to pass the movement indicator along with the movement type.

    The field description in MSEG says:
    Movement Indicator
    Specifies the type of document (such as purchase order or delivery note) that constitutes the basis for the movement.
    Use
    This indicator is necessary, for example, to enable a distinction to be made between a goods receipt for a purchase order and a goods receipt for a production order. These two goods movements result in different data and account updates in the system.
    Dependencies
    The movement indicator is derived from the transaction code.
    Which transaction do you want to use for your IDoc?
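    For a goods receipt that references a purchase order, the movement indicator is 'B'. As a hedged sketch only, the item data behind the MBGMCR02 segments might be filled like this (assuming the usual E1BP2017_GM_ITEM_CREATE segment, which mirrors the BAPI structure below; the document numbers are hypothetical placeholders, so verify the field names in your system):

    DATA: LS_ITEM TYPE BAPI2017_GM_ITEM_CREATE.

    LS_ITEM-MATERIAL  = 'MAT-001'.     "hypothetical material number
    LS_ITEM-PLANT     = '1000'.        "hypothetical plant
    LS_ITEM-PO_NUMBER = '4500000001'.  "hypothetical purchase order (EBELN)
    LS_ITEM-PO_ITEM   = '00010'.
    LS_ITEM-MOVE_TYPE = '101'.         "movement type: goods receipt
    LS_ITEM-MVT_IND   = 'B'.           "movement indicator (KZBEW): purchase order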

  • What is BI? How do we implement it, and what is the cost to implement?

    What is BI? How do we implement it, and what is the cost to implement?
    Thanks,
    Sumit.

    Hi Sumit,
    Below is a description according to your query.
    Business Intelligence is a process for increasing the competitive advantage of a business by intelligent use of available data in decision making. This process is outlined below.
    The five key stages of Business Intelligence:
    1. Data Sourcing
    2. Data Analysis
    3. Situation Awareness
    4. Risk Assessment
    5. Decision Support
    Data sourcing
    Business Intelligence is about extracting information from multiple sources of data. The data might be: text documents - e.g. memos or reports or email messages; photographs and images; sounds; formatted tables; web pages and URL lists. The key to data sourcing is to obtain the information in electronic form. So typical sources of data might include: scanners; digital cameras; database queries; web searches; computer file access; etcetera.
    Data analysis
    Business Intelligence is about synthesizing useful knowledge from collections of data. It is about estimating current trends, integrating and summarising disparate information, validating models of understanding, and predicting missing information or future trends. This process of data analysis is also called data mining or knowledge discovery. Typical analysis tools might use:-
    • probability theory - e.g. classification, clustering and Bayesian networks;
    • statistical methods - e.g. regression;
    • operations research - e.g. queuing and scheduling;
    • artificial intelligence - e.g. neural networks and fuzzy logic.
    Situation awareness
    Business Intelligence is about filtering out irrelevant information, and setting the remaining information in the context of the business and its environment. The user needs the key items of information relevant to his or her needs, and summaries that are syntheses of all the relevant data (market forces, government policy etc.).  Situation awareness is the grasp of  the context in which to understand and make decisions.  Algorithms for situation assessment provide such syntheses automatically.
    Risk assessment
    Business Intelligence is about discovering what plausible actions might be taken, or decisions made, at different times. It is about helping you weigh up the current and future risk, cost or benefit of taking one action over another, or making one decision versus another. It is about inferring and summarising your best options or choices.
    Decision support
    Business Intelligence is about using information wisely. It aims to provide warning of important events, such as takeovers, market changes, and poor staff performance, so that you can take preventative steps. It seeks to help you analyse and make better business decisions, to improve sales or customer satisfaction or staff morale. It presents the information you need, when you need it.
    This section describes how we are using extraction, transformation and loading (ETL) processes and a data warehouse architecture to build our enterprise-wide data warehouse in incremental project steps. Before an enterprise-wide data warehouse could be delivered, an integrated architecture and a companion implementation methodology needed to be adopted. A productive and flexible tool set was also required to support ETL processes and the data warehouse architecture in a production service environment. The resulting data warehouse architecture has the following four principal components:
    • Data Sources
    • Data Warehouses
    • Data Marts
    • Publication Services
    ETL processing occurs between data sources and the data warehouse, between the data warehouse and data marts and may also be used within the data warehouse and data marts.
    Data Sources
    The university has a multitude of data sources residing in different Database Management System (DBMS) tables and non-DBMS data sets. To ensure that all relevant data source candidates were identified, a physical inventory and a logical inventory were conducted. The compilation of these inventories ensures that we have an enterprise-wide view of the university data resource.
    The physical inventory comprised a review of DBMS cataloged tables as well as data sets used by business processes. These data sets had been identified through developing the enterprise-wide information needs model.
    The logical inventory was constructed from "brain-storming" sessions which focused on common key business terms which must be referenced when articulating the institution's vision and mission (strategic direction, goals, strategies, objectives and activities). Once the primary terms were identified, they were organized into directories such as "Project", "Location", "Academic Entity", "University Person", "Budget Envelope" etc. Relationships were identified by recognizing "natural linkages" within and among directories, and the "drill-downs" and "roll-ups" that were required to support "report by" and "report on" information hierarchies. This exercise allowed the directories to be sub-divided into hierarchies of business terms which were useful for presentation and validation purposes.
    We called this important deliverable the "Conceptual Data Model" (CDM) and it was used as the consolidated conceptual (paper) view of all of the University's diverse data sources. The CDM was then subjected to a university-wide consultative process to solicit feedback and communicate to the university community that this model would be adopted by the Business Intelligence (BI) project as a governance model in managing the incremental development of its enterprise-wide data warehousing project.
    Data Warehouse
    This component of our data warehouse architecture (DWA) is used to supply quality data to the many different data marts in a flexible, consistent and cohesive manner. It is a 'landing zone' for inbound data sources and an organizational and re-structuring area for implementing data, information and statistical modeling. This is where business rules which measure and enforce data quality standards for data collection in the source systems are tested and evaluated against appropriate data quality business rules/standards which are required to perform the data, information and statistical modeling described previously.
    Inbound data that does not meet data warehouse data quality business rules is not loaded into the data warehouse (for example, if a hierarchy is incomplete). While it is desirable for rejected and corrected records to occur in the operational system, if this is not possible then start dates for when the data can begin to be collected into the data warehouse may need to be adjusted in order to accommodate necessary source systems data entry "re-work". Existing systems and procedures may need modification in order to permanently accommodate required data warehouse data quality measures. Severe situations may occur in which new data entry collection transactions or entire systems will need to be either built or acquired.
    We have found that a powerful and flexible extraction, transformation and loading (ETL) process is to use Structured Query Language (SQL) views on host database management systems (DBMS) in conjunction with a good ETL tool such as SAS® ETL Studio. This tool enables you to perform the following tasks:
    • The extraction of data from operational data stores
    • The transformation of this data
    • The loading of the extracted data into your data warehouse or data mart
    When the data source is a "non-DBMS" data set it may be advantageous to pre-convert this into a SAS® data set to standardize data warehouse metadata definitions. Then it may be captured by SAS® ETL Studio and included in the data warehouse along with any DBMS source tables using consistent metadata terms. SAS® data sets, non-SAS® data sets, and any DBMS table will provide the SAS® ETL tool with all of the necessary metadata required to facilitate productive extraction, transformation and loading (ETL) work.
    Having the ability to utilize standard structured query language (SQL) views on host DBMS systems and within SAS® is a great advantage for ETL processing. The views can serve as data quality filters without having to write any procedural code. The option exists to "materialize" these views on the host systems or leave them "un-materialized" on the hosts and "materialize" them on the target data structure defined in the SAS® ETL process. These choices may be applied differentially depending upon whether you are working with "current only" or "time series" data. Different deployment configurations may be chosen based upon performance issues or cost considerations. The flexibility of choosing different deployment options based upon these factors is a considerable advantage.
    Data Marts
    This component of the data warehouse architecture may manifest as the following:
    • Customer "visible" relational tables
    • OLAP cubes
    • Pre-determined parameterized and non-parameterized reports
    • Ad-hoc reports
    • Spreadsheet applications with pre-populated work sheets and pivot tables
    • Data visualization graphics
    • Dashboards/scorecards for performance indicator applications
    Typically a business intelligence (BI) project may be scoped to deliver an agreed-upon set of data marts in a project. Once these have been well specified, the conceptual data model (CDM) is used to determine what parts need to be built or used as a reference to conform the inbound data from any new project. After the detailed data mart specifications (DDMS) have been verified and the conceptual data model (CDM) components determined, a source and target logical data model (LDM) can be designed to integrate the detailed data mart specification (DDMS) and conceptual data model (CDM). An extraction, transformation and loading (ETL) process can then be set up and scheduled to populate the logical data models (LDM) from the required data sources and assist with any time series and data audit change control requirements.
    Over time, as more and more data marts and logical data models (LDMs) are built, the conceptual data model (CDM) becomes more complete. One very important advantage of this implementation methodology is that the order of the data marts and logical data models can be entirely driven by project priority, project budget allocation and time-to-completion constraints/requirements. This data warehouse architecture implementation methodology does not need to dictate project priorities or project scope as long as the conceptual data model (CDM) exercise has been successfully completed before the first project request is initiated.
    McMaster's Data Warehouse design
    [Diagram: development, test and production warehouses fed by ETL from DB2 and Oracle operational sources through a staging area (SAS data sets), with further ETL into data marts accessed by users through BI tools.]
    Publication Services
    This is the visible presentation environment that business intelligence (BI) customers will use to interact with the published data mart deliverables. The SAS® Information Delivery Portal will be utilized as a web delivery channel to deliver a "one-stop information shopping" solution. This software solution provides an interface to access enterprise data, applications and information. It is built on top of the SAS Business Intelligence Architecture, provides a single point of entry and provides a Portal API for application development. All of our canned reports generated through SAS® Enterprise Guide, along with a web-based query and reporting tool (SAS® Web Report Studio), will be accessed through this publication channel.
    Using the portal's personalization features we have customized it for a McMaster "look and feel". Information is organized using pages and portlets and our stakeholders will have access to public pages along with private portlets based on role authorization rules. Stakeholders will also be able to access SAS® data sets from within Microsoft Word and Microsoft Excel using the SAS® Add-In for Microsoft Office. This tool will enable our stakeholders to execute stored processes (a SAS® program which is hosted on a server) and embed the results in their documents and spreadsheets. Within Excel, the SAS® Add-In can:
    • Access and view SAS® data sources
    • Access and view any other data source that is available from a SAS® server
    • Analyze SAS® or Excel data using analytic tasks
    The SAS® Add-In for Microsoft Office will not be accessed through the SAS® Information Delivery Portal as this is a client component which will be installed on individual personal computers by members of our Client Services group. Future stages of the project will include interactive reports (drill-down through OLAP cubes) as well as balanced scorecards to measure performance indicators (through SAS® Strategic Performance Management software). This, along with event notification messages, will all be delivered through the SAS® Information Delivery Portal.
    Publication is also channeled according to audience with appropriate security and privacy rules.
    SECURITY – AUTHENTICATION AND AUTHORIZATION
    The business value derived from using the SAS® Value Chain Analytics includes an authoritative and secure environment for data management and reporting. A data warehouse may be categorized as a "collection of integrated databases designed to support managerial decision making and problem solving functions" and "contains both highly detailed and summarized historical data relating to various categories, subjects, or areas". Implementation of the research funding data mart at McMaster has meant that our stakeholders now have electronic access to data which previously was not widely disseminated. Stakeholders are now able to gain timely access to this data in the form that best matches their current information needs. Security requirements are being addressed taking into consideration the following:
    • Data identification
    • Data classification
    • Value of the data
    • Identifying any data security vulnerabilities
    • Identifying data protection measures and associated costs
    • Selection of cost-effective security measures
    • Evaluation of effectiveness of security measures
    At McMaster access to data involves both authentication and authorization. Authentication may be defined as the process of verifying the identity of a person or process within the guidelines of a specific
    security policy (who you are). Authorization is the process of determining which permissions the user has for which resources (permissions). Authentication is also a prerequisite for authorization. At McMaster, business intelligence (BI) services that are not public require a sign-on with a single university-wide login identifier which is currently authenticated using Microsoft Active Directory. After a successful authentication, the university login identifier can be used by the SAS® Metadata Server. No passwords are ever stored in SAS®. Future plans at the university call for this authentication to be done using Kerberos.
    At McMaster aggregate information will be open to all. Granular security is being implemented as required through a combination of SAS® Information Maps and stored processes. SAS® Information Maps consist of metadata that describe a data warehouse in business terms. Through using SAS® Information Map Studio which is an application used to create, edit and manage SAS® Information Maps, we will determine what data our stakeholders will be accessing through either SAS® Web Report Studio (ability to create reports) or SAS® Information Delivery Portal (ability to view only). Previously access to data residing in DB-2 tables was granted by creating views using structured query language (SQL). Information maps are much more powerful as they capture metadata about allowable usage and query generation rules. They also describe what can be done, are database independent and can cross databases and they hide the physical structure of the data from the business user. Since query code is generated in the background, the business user does not need to know structured query language (SQL). As well as using Information Maps, we will also be using SAS® stored processes to implement role based granular security.
    At the university some business intelligence (BI) services are targeted for particular roles such as researchers. The primary investigator role of a research project needs access to current and past research funding data at both the summary and detail levels for their research project. A SAS® stored process (a SAS® program which is hosted on a server) is used to determine the employee number of the login by checking a common university directory and then filtering the research data mart to selectively provide only the data that is relevant for the researcher who has signed onto the decision support portal.
    Other business intelligence (BI) services are targeted for particular roles such as Vice-Presidents, Deans, Chairs, Directors, Managers and their Staff. SAS® stored processes are used as described above with the exception that they filter data on the basis of positions and organizational affiliations. When individuals change jobs or new appointments occur the authorized business intelligence (BI) data will always be correctly presented.
    As the SAS® stored process can be executed from many environments (for example, SAS® Web Report Studio, SAS® Add-In for Microsoft Office, SAS® Enterprise Guide) authorization rules are consistently applied across all environments on a timely basis. There is also potential in the future to automatically customize web portals and event notifications based upon the particular role of the person who has signed onto the SAS® Information Delivery Portal.
    ARCHITECTURE (PRODUCTION ENVIRONMENT)
    We are currently in the planning stages for building a scalable, sustainable infrastructure which will support a scaled deployment of the SAS® Value Chain Analytics. We are considering implementing the following three-tier platform which will allow us to scale horizontally in the future:
    Our development environment consists of a server with 2 x Intel Xeon 2.8GHz processors and 2GB of RAM, and is running Windows 2000 – Service Pack 4.
    We are considering the following for the scaled roll-out of our production environment.
    A. Hardware
    1. Server 1 - SAS® Data Server
    - 4-way 64-bit 1.5 GHz Itanium 2 server
    - 16 GB RAM
    - 2 x 73 GB drives (RAID 1) for the OS
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for Itanium
    2. Mid-Tier (Web) Server
    - 2-way 32-bit 3 GHz Xeon server
    - 4 GB RAM
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for x86
    3. SAN Drive Array (modular and can grow with the warehouse)
    - 6 x 72 GB drives (RAID 5), 360 GB total, for SAS® and data
    B. Software
    1. Server 1 - SAS® Data Server
    - SAS® 9.1.3
    - SAS® Metadata Server
    - SAS® WorkSpace Server
    - SAS® Stored Process Server
    - Platform JobScheduler
    2. Mid-Tier Server
    - SAS® Web Report Studio
    - SAS® Information Delivery Portal
    - BEA Web Logic for future SAS® SPM Platform
    - Xythos Web File System (WFS)
    3. Client-Tier Server
    - SAS® Enterprise Guide
    - SAS® Add-In for Microsoft Office
    REPORTING
    We have created a number of parameterized stored processes using SAS® Enterprise Guide, which our stakeholders will access as both static (HTML as well as PDF documents) and interactive reports (drill-down) through SAS® Web Report Studio and the SAS® Add-In for Microsoft Office. All canned reports along with SAS® Web Report Studio will be accessed through the SAS® Information Delivery Portal.
    NEXT STEPS
    Next steps of the project include development of a financial data mart along with appropriate data quality standards, monthly frozen snapshots and implementation of university-wide financial reporting standards. This will facilitate electronic access to the integrated financial information necessary for the development and maintenance of an integrated, multi-year financial planning framework. Canned reports will include monthly web-based financial statements with drill-down capability, along with budget templates automatically populated with data values and saved in different workbooks for different subgroups (for example, by department). The latter will be accomplished using Microsoft Dynamic Data Exchange (DDE).
    As well, we will begin the implementation of SAS® Strategic Performance Management software to support the performance measurement and monitoring initiative that is a fundamental component of McMaster's strategic plan. This tool will assist in critically assessing and identifying meaningful and statistically relevant measures and indicators. This software can perform causal analyses among various measures within and across areas, providing useful information on inter-relationships between factors and measures. As well as demonstrating how decisions in one area affect other areas, these cause-and-effect analyses can reveal both good performance drivers and possible detractors, and enable 'evidence-based' decision-making. Finally, the tool provides a balanced scorecard reporting format, designed to identify statistically significant trends and results, that can be tailored to the specific goals, objectives and measures of the various operational areas of the University.
    LESSONS LEARNED
    Lessons learned include the importance of taking a consultative approach not only in assessing information needs, but also in building data hierarchies, understanding subject matter, and in prioritizing tasks to best support decision making and inform senior management. We found that a combination of training and mentoring (knowledge transfer) helped us accelerate learning the new tools. It was very important to ensure that time and resources were committed to complete the necessary planning and data quality initiatives prior to initiating the first project. When developing a project plan, it is important to

  • I have an Apple iPhone 4 and I want to download audiobooks from my library onto it. I cannot figure out how to do that. The iPhone does not show up on my Mac laptop as a connected device. Does this kind of download have to run through iTunes? Thanks

    I have an Apple iPhone 4 and I want to download audiobooks from my library onto it. I cannot figure out how to do that. The iPhone does not show up on my Mac laptop as a connected device. Does this kind of download have to run through iTunes? Thanks

    Yes, it is done through iTunes. The iPhone will never show up in the Finder as a device. The following is general information on iTunes sync: http://support.apple.com/kb/HT1386 and the following is a previous discussion where the post by Andreas Junge helped others who had a problem syncing audiobooks: https://discussions.apple.com/message/20052732#20052732

  • My MacBook Pro no longer connects to my TV with the Mini DVI to HDMI adapter. A while ago the computer fell and hit the cable. How can I tell if the computer is working OK? It doesn't show mirroring any more.

    My MacBook Pro no longer connects to my TV with the Mini DVI to HDMI adapter. A while ago the computer fell and hit the cable. How can I tell if the computer is working OK? It doesn't show mirroring any more. It was working while I was watching a movie, but when I pulled the cable out of the computer and put it back in, the system would no longer mirror the screen on the TV.

    The best option is an appointment at an Apple Store Genius Bar. The evaluation will be FREE!
    Ciao.

  • I have a mid-2013 MacBook Pro with 1 Thunderbolt port. I just bought a Mac monitor and want to daisy-chain a PC monitor to it. I use a Mini DVI to VGA adapter from the MBP to the PC monitor. How do I daisy-chain the 2 monitors?

    I have a mid-2013 MacBook Pro with 1 Thunderbolt port. I just bought a Mac monitor and want to daisy-chain a PC monitor to it. I use a Mini DVI to VGA adapter from the MBP to the PC monitor. How do I daisy-chain the 2 monitors?

    Hall Palm Desert,
    if your Mac monitor has Thunderbolt ports, and the PC monitor is on the end of the daisy chain, then you might be able to do it by connecting your PC monitor’s VGA cable to a Mini DisplayPort-to-VGA adapter (e.g. Apple, NewerTech), connecting that cable’s Mini DisplayPort end to one of the Mac monitor’s Thunderbolt ports, and then connecting a Thunderbolt cable between the other port on the Mac monitor and your MacBook Pro.

  • How can I declare the symbol ' (single quote) in the report?

    Hi ,
    Could you please tell me how I can declare the symbol ' (single quote) in the report.
    My requirement is that I have to concatenate the data with a single quote, store the data in an internal table, and then download the data as a text file to the presentation server.
    For example, I want to download the following data to the presentation server as a text file, by storing it in an internal table.
    Assume all are constants:
    1st line: abc'add
    2nd line: def'gef
    Thanks in advance.

    Hi Jyothi,
    Thanks for the quick reply.
    I agree with your point, but my requirement is like this; let me explain it clearly.
    I have declared the internal table like this:
    DATA: BEGIN OF OTAB OCCURS 0,
            LINE(9024),
          END OF OTAB.
    So I have to append each line item to the internal table.
    This is the data I have to append:
    1st line contains:
    'UNBUNOC:2020308' where 020308 is the date I get from the REGUH table.
    2nd line contains:
    'DTM+20020510' where 20020510 is the reference document number from the REGUH table.
    So I want to declare a constant 'UNBUNOC:2, then the date from the REGUH table, and another constant ', so that I can concatenate all three, put the result into a string, append it to the internal table, and download the data to the presentation server.
    Please let me know if you need any more clarification regarding my requirement.
    Thanks in advance.
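    In ABAP a single quote inside a text literal is escaped by doubling it, so the literal '''' denotes exactly one quote character. As a minimal sketch along those lines (the variable names are illustrative; the date would really come from REGUH as described above):

    DATA: LV_DATE(6) TYPE C VALUE '020308',  "example: date taken from REGUH
          LV_LINE    TYPE STRING.

    " '''' yields one single-quote character, because a quote inside
    " a literal is escaped by doubling it.
    CONCATENATE '''UNBUNOC:2' LV_DATE '''' INTO LV_LINE.
    " LV_LINE now holds: 'UNBUNOC:2020308'

    OTAB-LINE = LV_LINE.
    APPEND OTAB.

    The filled table can then be downloaded to the presentation server with the standard function module GUI_DOWNLOAD (or WS_DOWNLOAD on older releases).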
