Granularity regarding scoring

I am trying to find out how granular the scoring results in Captivate 4 can be. I do not want simply pass/fail or complete/incomplete; I would like to be able to pass along the specific questions answered and the result of each. I would also like to know whether we can capture how many attempts were made on any given question. Is this granularity possible out of the box, or is there additional programming that could make it a reality? I am not a programmer, so I would love to hear that this is possible without much difficulty. Does anyone have any experience with this?

I appreciate the swift response. I have several queries:
1.) My girlfriend's project is a text-based game, as previously mentioned, one which features multiple rooms. Upon entering certain rooms, the player loses a life; how exactly would such a thing be implemented, and which classes would be involved? (There are several, including Game, Room, Score, etc.)
2.) Would it be at all possible to associate the above code with the game and its classes, since implementation is proving troublesome?
3.) Is it possible to implement a points system as well as one of lives? For example, the player would receive points upon entering certain rooms, but would lose them upon entering others.
To clarify the above somewhat, we're basically looking to introduce a scoring system into the game. We're unsure how to implement its different aspects, including the player's lives and the rooms which result in the loss or gain of points and lives, etc.
Any help would be much appreciated.
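A minimal sketch of one way the lives-and-points bookkeeping could look. This is illustrative only: the class names Room and Score come from the question above, but the fields and methods below are guesses, not the actual project code.

// Illustrative sketch only; Room and Score are assumed from the question,
// the members below are invented for the example.
class Room {
    final String name;
    final int lifeChange;   // e.g. -1 for a room that costs a life, 0 otherwise
    final int pointChange;  // e.g. +10 or -5 points on entry

    Room(String name, int lifeChange, int pointChange) {
        this.name = name;
        this.lifeChange = lifeChange;
        this.pointChange = pointChange;
    }
}

class Score {
    private int lives = 3;
    private int points = 0;

    // Called by the Game class whenever the player enters a room.
    void applyRoom(Room room) {
        lives += room.lifeChange;
        points = Math.max(0, points + room.pointChange);
    }

    boolean isGameOver() { return lives <= 0; }
    int getLives()  { return lives; }
    int getPoints() { return points; }
}

The Game class's existing "go to room" method would then simply call score.applyRoom(currentRoom) after each move and end the game once isGameOver() returns true.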

Similar Messages

  • Delete file Application Server

    Hi ...
    I have an application written with Forms Developer. ... Is it possible to delete a file on the application server?
    Is it also possible to delete a file on the application server from a stored procedure?
    Thanks in advance

    It is also possible to implement a Java procedure in the database, invoked through a PL/SQL wrapper.
    This way (and if used correctly) you also get granularity in the file-system permissions you grant, and it can be called during a Forms or any other PL/SQL session.
    Since the requirement in this post is to delete a file on the application server, the database instance holding the Java class would need to be installed on the application server itself (which is not recommended anyway). The article below explains a more generic approach to invoking shell commands from within an Oracle instance.
    Be careful with this, because it really works ;)
    Refer to :
    http://www.oracle-base.com/articles/8i/ShellCommandsFromPLSQL.php
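    As a rough sketch of that Java-procedure idea (illustrative only; the class name, the loadjava step and the wrapper shown in the comment are assumptions, not code from the article):

    // Loaded into the database (e.g. with loadjava); deletes a file that is
    // visible to the database server's JVM, so for this post's requirement the
    // instance would have to run on the application server itself.
    public class FileDeleter {
        // Returns 1 on success, 0 otherwise. The schema needs the matching
        // java.io.FilePermission "delete" granted via DBMS_JAVA.GRANT_PERMISSION.
        public static int delete(String path) {
            return new java.io.File(path).delete() ? 1 : 0;
        }
    }
    /* A PL/SQL call specification would then expose it, for example:
       CREATE OR REPLACE FUNCTION delete_file (p_path IN VARCHAR2) RETURN NUMBER
       AS LANGUAGE JAVA NAME 'FileDeleter.delete(java.lang.String) return int';
    */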

  • Call a UNIX shell script from an Oracle stored procedure

    We need to call a UNIX shell script from an Oracle stored procedure,
    i.e. control should come back to the procedure once the script completes. Can anybody help us achieve this?

    There are various ways of achieving this.
    For example, you can call a Pro*C library residing on the database server.
    This requires a PL/SQL library to be generated and some changes to the listener configuration.
    It is also possible to implement a Java procedure in the database, invoked through a PL/SQL wrapper.
    This way (and if used correctly) you also get granularity in the file-system permissions you grant, and it can be called during a Forms or any other PL/SQL session.
    The article below explains a more generic approach to invoking shell commands from within an Oracle instance.
    Be careful with this, because it really works ;)
    Refer to :
    http://www.oracle-base.com/articles/8i/ShellCommandsFromPLSQL.php
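    For illustration, a Java stored procedure along the lines described above might look like this (a sketch under assumptions, not the article's code; the class name and the wrapper in the comment are made up). Because waitFor() blocks, the PL/SQL caller only regains control once the script has finished, which is what the question asks for:

    public class ShellRunner {
        // Runs a shell command/script on the database server and blocks until it completes.
        // The schema needs "execute" FilePermission granted via DBMS_JAVA.GRANT_PERMISSION.
        public static int run(String command) throws java.io.IOException, InterruptedException {
            Process p = Runtime.getRuntime().exec(new String[] { "/bin/sh", "-c", command });
            return p.waitFor();  // exit code of the script
        }
    }
    /* Exposed to PL/SQL with a call specification, for example:
       CREATE OR REPLACE FUNCTION run_shell (p_cmd IN VARCHAR2) RETURN NUMBER
       AS LANGUAGE JAVA NAME 'ShellRunner.run(java.lang.String) return int';
    */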

  • Question regarding Composite granularity in SOA Suite

    I have a question about service granularity in a SOA Suite composite. In your experience, what is the best way to implement an Enterprise Business Service in SOA Suite?
    Say I have a service interface/wsdl that looks something like:
    Customer
    + getCustomer
    + createCustomer
    + updateCustomer
    + .....
    Would the operations on the service be best implemented as BPEL processes each living in their own composite, or as multiple processes in one composite with a mediator (based on the EBS WSDL) in front of them? I'm also using OSB, so I could do the routing there to each of the separate composites if I break the operations up into a composite each. But it seems the number of composites would then become unmaintainably high.
    If I use the mediator to expose the ebs wsdl, then I would just use OSB for virtualization, throttling, etc.
    Not too sure which way to go with this design, but any experienced input is appreciated. I did notice that AIA implements each operation as a separate composite.
    Thanks.

    As per the AIA architecture, we should use a Mediator in the EBS layer, for the reasons below:
    1. OSB is not part of SOA Suite until 11g.
    2. You would have to rewrite code to expose the EBS in the OSB layer, which is a huge amount of work.
    Both Mediator and OSB do essentially the same thing here, although of course there are some differences.
    As for your design, using OSB for virtualization is fine, but you will have one more layer of virtualization before a request hits the actual composites, which will definitely affect performance, so the OSB layer needs extra care.
    In my current project we tried both of the approaches you want to try.
    Approach 1: Exposing the EBS in the OSB layer.
    We tried rewriting the EBS WSDLs because the port information is not available by default, and without it you cannot create proxy services in OSB.
    Then consider how many WSDL files you would have to rework, and you would need to update the MDS with those as well.
    We did a small POC and then decided to leave things as they are. We will probably have an OSB layer in AIA 12g.
    Approach 2: We use an OSB wrapper for third-party web services and adapter services, not for the provider ABCS services. I feel this is overhead.
    Thanks,
    Vijay

  • I need help writing a dive scoring program in C#, Help Please?

    The state diving commission wants to computerize the scoring at its diving competitions. I have to write a program to automate the scoring of dives. Requirements:
    After each dive, the user will be prompted to enter the:
    diver's name;                    
    diver's city;                    
    degree of difficulty (ranges from 1.00 to 1.67); and                     
    scores from five judges (scores can range from 0 to 10).
    If an invalid score is entered, an error message will be displayed. The user will be prompted for the score repeatedly until a valid score is entered.
    The program will then display the following information:
    Diver's name                    
    Diver's city                    
    Dive final score: This is calculated by dropping the highest and lowest of the five judges' scores. The remaining three scores are added together, and the result is divided by 3 and then multiplied by the degree of difficulty.
    The program will then prompt the user if she/he wants to process another dive. The user can type "Y" or "y" to continue, and "N" or "n" to quit.
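    For what it's worth, here is a rough sketch of just the scoring rule described above. It is written in Java rather than the requested C#, purely to illustrate the drop-highest/drop-lowest calculation; the console prompts, the input-validation loop and the Y/N repeat would wrap around it:

    import java.util.Arrays;

    public class DiveScore {
        // scores: exactly five judges' scores (each already validated to 0..10)
        // difficulty: degree of difficulty, 1.00 to 1.67
        static double finalScore(double[] scores, double difficulty) {
            double[] s = scores.clone();
            Arrays.sort(s);                        // s[0] = lowest, s[4] = highest are dropped
            double middleSum = s[1] + s[2] + s[3]; // remaining three scores
            return middleSum / 3.0 * difficulty;   // average them, then multiply by difficulty
        }

        public static void main(String[] args) {
            // 8.0, 8.5, 9.0 remain -> 8.5 average -> 12.75 with difficulty 1.5
            System.out.println(finalScore(new double[] {9.0, 8.5, 8.0, 7.5, 9.5}, 1.5));
        }
    }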

    Hi,
    This forum supports Visual Studio setup and installation, so I think your question is off-topic here.
    Please clarify your project type: is it WebForms or WinForms?
    If it is WebForms, please post your question in the ASP.NET forums; if it is WinForms, please post it in the WinForms forum.
    Regards.

  • MDX Calculate minimum price at a higher level of granularity

    Hi all!
    I can't work out how to calculate the following scenario in an SSRS MDX statement: I need to show the "category best price" of a product category on rows at a lower level of granularity (subcategory). See the example below:
    Category     Subcategory       Price   Category Best Price
    Adventure    Bikes             110     105
    Adventure    Kayak             120     105
    Adventure    Running Coat      105     105
    Fitness      Training Shoes     80      75
    Fitness      T-Shirt            75      75
    I need to understand how I can build the calculated measure to get the "category best price" column. Note that in the real scenario there are many more values in the rows, so the granularity is much finer. Additionally, the attributes don't belong to the same dimension.
    Thanks a ton for your help!

    Hi mts_aa,
    According to your description, you want to show the lowest price of each Category group on each row, right?
    In this scenario, we can generate this column at the MDX query level. Please refer to the query below:
    with member [Measures].[MinChildren]
    as
    min([Product].[Product Categories].currentmember.parent.children,[Measures].[Price])
    select {[Measures].[Price],[Measures].[MinChildren]} on 0,
    [Product].[Category].[Category].members*[Product].[Subcategory].[Subcategory].members on 1
    from
    [MDX]
    You can also do this at the report level. You just need to group the records on Category, then use the expression below in the detail row:
    =Min(Fields!Price.Value,"Category")
    Reference:
    Understanding Groups (Report Builder and SSRS)
    If you have any question, please feel free to ask.
    Best Regards,
    Simon Hou

  • Display drilling path at granular level of report title in BAM 11g

    Is it possible to show the drilling path in the report title at the granular level of a drill-down? Consider that I have a report showing the zone-wise distribution of total price in a bar chart. When I click on, say, the East zone, it shows the office-wise distribution; when I then click on, say, the Canada-office bar, it shows the details of the Canada office. At the Canada-office detail level I want to display the whole drilling path in the report title at run time, so that the title at the granular level reads something like East zone --> Canada office --> ... If you have any solution for this, please help me. Thanks in advance.

    I am sorry this is not available right now.
    Regards,
    Vishal
    BAM Development

  • REGARDING VENDOR QUALITY RATING

    Hi All
    My client wants vendor quality rating.
    For this I maintained the quality score in the client data, maintained the score procedure and assigned it to the inspection type,
    and defined the formula for quality rating in the function module for the Q-score procedure.
    Is this sufficient, or am I missing something? Please tell me.
    regards
    Sachin

    Hi,
    Vendor Evaluation for Purchasing
    Define the weighting keys in SM30 - V_T147J.
    Weighting keys 01 and 02 are defined in the standard system. You combine the following
    weighting shares for the standard main criteria:
    Main criterion     Key 01 Key 02
    Price                   1      5
    Quality                 1      5
    Delivery                1      2
    Gen. service/support    1      1
    Ext. service provision  1      2
    Define the criteria in SM30 - V_T147G - Double click on the line items
    In this step, you define the criteria by which the system computes scores for vendors and
    specify whether the scores for the subcriteria are computed manually, semi-automatically, or
    automatically.
    You can also store your own methods of computation for the scores for subcriteria in the
    form of user exits. The enhancement MM06L001 is available for this purpose.
    Define the scope of list in SM30 - V_T147M - Double click on the line items
    Define Purchasing Organization data for vendor evaluations in transaction OMGL.
    An example :-
    How the system calculates the score for the automatic subcriteria
    "On-Time Delivery Performance"?
    The system uses the statistics-relevant delivery date in the purchase order
    (Items -> Delivery Schedule) and the goods receipt date to calculate date variances.
    You use the statistics-relevant delivery date, for example, if you know that the vendor
    will not deliver the material as scheduled on September 15 but on September 30. Enter the
    delivery date as September 30, but enter the statistics-relevant delivery date as
    September 15.
    In calculating the score for on-time delivery performance, the system will then not use
    the actual delivery date, but the statistics-relevant delivery date. This has a negative
    effect on the score for this goods receipt.
    However, materials planning and control uses the realistic delivery date (September 30)
    which the vendor will actually adhere to.
    The system considers only goods receipts against purchase orders and scheduling agreements
    into stores and the release of GR blocked stock into stores. In the standard system, these
    are the movement types 101 and 105.
    Minimum Delivery Percentage - OMGL in the On-time delivery section
    If you do not want a vendor to receive a very good score if he delivered the goods on time,
    but did not deliver the required quantity, you can maintain a minimum delivery percentage
    in Customizing.
    Assume you set the Min. del. perc. parameter to 60% and the vendor delivers the goods on
    time, but only 55% of the ordered quantity. Although the goods receipt is punctual, it is
    not included in the calculation of the vendor's score for on-time delivery performance. So
    that the non-scoring of the on-time delivery performance criterion in this case does not
    bring an unfair advantage in comparison with a poor score, the vendor is awarded a low score
    for quantity reliability. On-time delivery performance is thus always to be seen in
    conjunction with quantity reliability.
    Standardizing Delivery Date Variance  - OMGL in the On-time delivery section
    To rate delivery date variances in days, maintain the Std.del.time var. parameter.
    If you assign a lower standard value, this means that relatively low date variances produce
    high percentage variances. If you set a higher standard value, this results in a relatively
    low percentage variance:
    The Std.del.time var. parameter has the value 20. The goods receipt took place on Nov. 27;
    the statistical delivery date was Nov. 15. There is thus a difference of 12 days.
    The system calculates the percentage variance as follows:
    12 / 20 x 100 = 60
    If the Std.del.time var. parameter had the value 60, the variance would be 20%
    (12 / 60 x 100 = 20).
    If you do not maintain this parameter, the system calculates the delivery time variance via
    the firm zone in the case of scheduling agreements, and via the order date and the
    statistics-relevant delivery date in the case of purchase orders. 
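    Purely as an illustration of the arithmetic above (not SAP code; the class and method names are made up):

    class DeliveryVariance {
        // Percentage variance = days of variance / Std.del.time var. * 100
        static double percent(int daysVariance, int stdDelTimeVar) {
            return (double) daysVariance / stdDelTimeVar * 100.0;
        }

        public static void main(String[] args) {
            System.out.println(percent(12, 20));  // 60.0, as in the example above
            System.out.println(percent(12, 60));  // 20.0
        }
    }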
    With regards
    Yashodhan

  • Kindly suggest me regarding vendor evaluation (me63) score

    Dear Friends,
    I have executed the vendor evaluation report, but for the Price criterion (subcriteria) scores of 40, 40 are coming, and sometimes 46, 48. For the other criterion, Delivery, it is 68, 70, and so on. Kindly let me know what the maximum score is for delivery, quality and price, and how it is distributed in the standard system when I have selected equal weighting.
    Regards,
    Saurav

    Dear Friend,
    Kindly go through the document below:
    Define the weighting keys in SM30 - V_T147J. Weighting keys 01 and 02 are defined in the standard system. You combine the following weighting shares for the standard main criteria:
    Main criterion          Key 01   Key 02
    Price                   1        5
    Quality                 1        5
    Delivery                1        2
    Gen. service/support    1        1
    Ext. service provision  1        2
    Define the criteria in SM30 - V_T147G - double click on the line items. In this step, you define the criteria by which the system computes scores for vendors and specify whether the scores for the subcriteria are computed manually, semi-automatically, or automatically. You can also store your own methods of computation for the scores for subcriteria in the form of user exits. The enhancement MM06L001 is available for this purpose.
    Define the scope of list in SM30 - V_T147M - double click on the line items.
    Define Purchasing Organization data for vendor evaluations in transaction OMGL.
    An example: how does the system calculate the score for the automatic subcriterion "On-Time Delivery Performance"?
    The system uses the statistics-relevant delivery date in the purchase order (Items -> Delivery Schedule) and the goods receipt date to calculate date variances. You use the statistics-relevant delivery date, for example, if you know that the vendor will not deliver the material as scheduled on September 15 but on September 30. Enter the delivery date as September 30, but enter the statistics-relevant delivery date as September 15. In calculating the score for on-time delivery performance, the system will then not use the actual delivery date, but the statistics-relevant delivery date. This has a negative effect on the score for this goods receipt. However, materials planning and control uses the realistic delivery date (September 30) which the vendor will actually adhere to.
    The system considers only goods receipts against purchase orders and scheduling agreements into stores and the release of GR blocked stock into stores. In the standard system, these are the movement types 101 and 105.
    Minimum Delivery Percentage - OMGL in the On-time delivery section
    If you do not want a vendor to receive a very good score if he delivered the goods on time, but did not deliver the required quantity, you can maintain a minimum delivery percentage in Customizing. Assume you set the Min. del. perc. parameter to 60% and the vendor delivers the goods on time, but only 55% of the ordered quantity. Although the goods receipt is punctual, it is not included in the calculation of the vendor's score for on-time delivery performance. So that the non-scoring of the on-time delivery performance criterion in this case does not bring an unfair advantage in comparison with a poor score, the vendor is awarded a low score for quantity reliability. On-time delivery performance is thus always to be seen in conjunction with quantity reliability.
    Standardizing Delivery Date Variance - OMGL in the On-time delivery section
    To rate delivery date variances in days, maintain the Std.del.time var. parameter. If you assign a lower standard value, this means that relatively low date variances produce high percentage variances. If you set a higher standard value, this results in a relatively low percentage variance. Example: the Std.del.time var. parameter has the value 20. The goods receipt took place on Nov. 27; the statistical delivery date was Nov. 15. There is thus a difference of 12 days. The system calculates the percentage variance as follows: 12 / 20 x 100 = 60. If the Std.del.time var. parameter had the value 60, the variance would be 20% (12 / 60 x 100 = 20).
    If you do not maintain this parameter, the system calculates the delivery time variance via the firm zone in the case of scheduling agreements, and via the order date and the statistics-relevant delivery date in the case of purchase orders.
    http://www.sap-img.com/mm009.htm

  • Please explain this code,this is regarding to ODS activation.

    Hi,
    Please, I am unable to understand this code; it appears during the initial activation of an ODS. Can anyone please explain it to me?
    Job started
    Step 001 started (program RSPROCESS, variant &0000000055152, user ID ALEREMOTE)
    Activation is running: Data target ZYL_O82, from 1.165.349 to 1.165.349
    Data to be activated successfully checked against archiving objects
    SQL: 20.06.2007 05:34:26 ALEREMOTE
    ANALYZE TABLE "/BIC/AZYT_O6240" DELETE STATISTICS
    SQL-END: 20.06.2007 05:34:26 00:00:00
    SQL: 20.06.2007 05:34:26 ALEREMOTE
    BEGIN DBMS_STATS.GATHER_TABLE_STATS ( OWNNAME =>
    'SAPR3', TABNAME => '"/BIC/AZYT_O6240"',
    ESTIMATE_PERCENT => 1 , METHOD_OPT => 'FOR ALL
    INDEXED COLUMNS SIZE 75', DEGREE => 1 ,
    GRANULARITY => 'ALL', CASCADE => TRUE ); END;
    Thanks & Regards,
    Mano

  • SAP GRC 10.0 Risk Management - Forecasting Horizon Scoring Analysis Mode

    Hi everyone,
    In SAP GRC 10.0 Risk Management Support Package 7, we need to assess a corporate risk by performing an automatic analysis aggregation based on a scoring analysis profile.
    The problem is that corporate risks must be created based on a forecasting horizon.
    So, can we create forecasting horizons with scoring analysis mode? How? Must it be enabled through customizing or by applying an SAP note?
    Best Regards,
    Chema Traveso

    Hi,
    I think this is still user-specific, as it was in 5.x. I have checked the new GRC authorization object parameters delivered with the roles and have also tried to see whether an Admin user can see all the variants created by the different users, but so far I have not found a solution.
    It may be worthwhile to raise this on "Idea Place", in the hope that it gets enough votes and SAP's attention for implementation in a future Support Package delivery.

  • Regarding Extended star schema

    Hi Friends,
    In the extended star schema, master data is loaded separately and is connected to the dimension tables through SIDs.
    My question is: can these master data tables be used by anything other than this cube?
    Please tell me, I am confused.
    Thanks in advance,
    Regards,
    ramnaresh.

    Hi
    InfoCubes are made up of a number of InfoObjects. All InfoObjects (characteristics and key figures) are available independent of the InfoCube. Characteristics refer to master data with their attributes and text descriptions.
    An InfoCube consists of several InfoObjects and is structured according to the star schema. This means there is a (large) fact table that contains the key figures for the InfoCube, as well as several (smaller) dimension tables which surround it. The characteristics of the InfoCube are stored in these dimensions.
    An InfoCube fact table only contains key figures, in contrast to a DataStore object, whose data part can also contain characteristics. The characteristics of an InfoCube are stored in its dimensions.
    The dimensions and the fact table are linked to one another using abstract identification numbers (dimension IDs) which are contained in the key part of the particular database table. As a result, the key figures of the InfoCube relate to the characteristics of the dimension. The characteristics determine the granularity (the degree of detail) at which the key figures are stored in the InfoCube.
    Characteristics that logically belong together (for example, district and area belong to the regional dimension) are grouped together in a dimension. By adhering to this design criterion, dimensions are to a large extent independent of each other, and dimension tables remain small with regards to data volume. This is beneficial in terms of performance. This InfoCube structure is optimized for data analysis.
    The fact table and dimension tables are both relational database tables.
    Characteristics refer to the master data with their attributes and text descriptions. All InfoObjects (characteristics with their master data as well as key figures) are available for all InfoCubes, unlike dimensions, which represent the specific organizational form of characteristics in one InfoCube.
    Integration
    You can create aggregates to access data quickly. Here, the InfoCube data is stored redundantly and in an aggregated form.
    You can either use an InfoCube directly as an InfoProvider for analysis and reporting, or use it with other InfoProviders as the basis of a MultiProvider or InfoSet.
    See also:
    Checking the Data Loaded in the InfoCube
    If the above info is useful, please grant me points

  • Quality Management Scoring

    Hi All,
    Can anybody explain to me how the system calculates the quality score (which gets reflected in Vendor Evaluation)? Any formula or principles, etc.?
    Is there any configuration setting on the QM side which helps in controlling that score?
    Requirement - it is intended that one critical defect should not be easily outweighed by success on other, less critical tests. Currently it has been set up in QM such that the quality score comes out as 80 irrespective of whether many of the tests pass or fail.
    Thanks

    Hello,
    The score for Goods Receipt (main criterion "Quality") is calculated as follows:
    1.     If a material is subject to incoming inspection, part of the delivered material is checked by the quality assurance department when goods are received against a purchase order.
    2.     An inspection lot is created. After the inspection or testing, the person responsible in Quality Management then enters a result and makes a decision as to whether the material can be used.
    3.     All the incoming inspection lots are stored in a file with their scores.
    4.     When you run an automatic re-evaluation for a vendor, the system selects all the incoming inspection lots for the vendor that lie within the validity period and calculates the average of the scores.
    The result is the vendor's score for the quality of goods received.
    The Quality Audit (main criterion "Quality") score is calculated as follows:
    A quality audit has taken place in one of your vendor's plants. This means that either your own company has carried out a check to determine how the vendor ensures that his products are of a high quality, or the vendor has carried out the audit himself.
    1.     An audit lot is created by Quality Management. When the audit has been completed, the person responsible determines a score for this audit lot.
    2.     All the audit lots are stored in a file with their scores.
    3.     When you run an automatic re-evaluation of a vendor, the system selects all the audit lots for the vendor within the validity period.
    4.     The parameter Quality audit is read from Customizing.
    5.     If the indicator is set, the system calculates the average score for all the quality audits in the validity period. The result is the vendor's score for the quality audit.
    If the indicator is not set, only the most recent audit lot is included. This then represents the vendor's score for the subcriterion.
    The Evaln. QM system field on the screen for the main criterion "Quality" shows you the score that was awarded to the Quality Management system (e.g. ISO 9003 with certificate) used by the vendor. It is only displayed if the Prev. QM system field is filled in the vendor master record. The score is awarded by the QM system and is for information purposes only. Display the vendor master record to see which quality system the vendor uses. You can use the content of the field for your own program and have this score included in the vendor's quality score.
    Complaints/Rejection Level (Main Criterion "Quality")
    The score for the subcriterion Complaints/Rejection Level is computed by the SAP component QM Quality Management and passed on to the MM Vendor Evaluation component.
    If the scale for the key quality data in the QM system does not correspond to the scoring range in Vendor Evaluation, the Vendor Evaluation system converts the QM data for use in the Vendor Evaluation scoring system.
    The score is calculated as follows:
    9.     On the shop-floor, material A supplied by vendor ACME is being processed in the course of production.
    10.     If the material is OK (that is to say, no quality notifications have been entered for material A during the entire validity period), the vendor is awarded the highest score for the material (100 points).
    11.     If the material is found to be defective, a quality notification (such as a rejection note or nonconformance complaint sent to the vendor) is entered.
    12.     During the vendor evaluation process, the system checks whether the costs associated with the faulty delivery exceed the maximum percentage of business volume defined in the Customizing system.
    In the above example, your business volume with (value of purchases from) ACME amounts to one million dollars annually. The costs associated with defective deliveries may not exceed 0.1 percent of business volume, i.e. $100,000. The costs associated with a quality notification are estimated at $500.
    13.     If the costs that your company incurs as a result of the faulty materials are lower than the defined proportion of business volume, the system calculates a score between 1 and 99 points.
    To do this, the system multiplies your annual business volume with the vendor ($1,000,000) by the parameter Business volume share (0.1).
    The number of accumulated quality notifications (10 in all) multiplied by the cost of each notification ($ 500) is subtracted from this.
    The result is divided by the annual business volume multiplied by the proportion of business volume.
    The result of this is a factor which, multiplied by a hundred, gives a percentage.
    This percentage indicates the quality provided by the vendor in relation to the particular material. The value 95% in the above example means that complaints or rejections involving the vendor ACME only resulted in 5% of the maximum costs allowed, i.e. the quality is very good.
    Note on Scoring Range and Conversion QM - MM:
    Since the minimum points score that can be obtained is not 0 (0 means "not evaluated") but 1, only 99 points are available in the range 1 to 100.
    This is why the system works out how many points out of 99 the value 95% corresponds to.
    The result is 94 points.
    Since the bottom of the scoring range has been moved up by one point (1 point is the worst score not 0), the system adds on 1 point.
    ACME therefore achieves a score of 95 points for the subcriterion Complaints/Rejection Level for material A.
    14.     If the cost of defective or non-conforming deliveries exceeds the share of business volume defined in Customizing (in this case $100,000), the vendor is awarded the lowest score (1 point).
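    As a rough illustration of the arithmetic walked through above (a sketch only, not SAP code; the class and parameter names are invented):

    class ComplaintScore {
        // businessVolume: annual purchase value with the vendor (e.g. 1,000,000)
        // shareOfVolume:  "Business volume share" parameter (0.1 in the example)
        // notifications * costPerNotification: cost of the quality notifications (10 * 500)
        static int score(double businessVolume, double shareOfVolume,
                         int notifications, double costPerNotification) {
            double allowedCost = businessVolume * shareOfVolume;     // 100,000 in the example
            double actualCost  = notifications * costPerNotification;
            if (actualCost >= allowedCost) return 1;                 // lowest score (1 point)
            double percent = (allowedCost - actualCost) / allowedCost * 100.0;  // 95%
            return (int) Math.round(percent / 100.0 * 99.0) + 1;     // map onto 1..100 -> 95 points
        }

        public static void main(String[] args) {
            System.out.println(score(1_000_000, 0.1, 10, 500));      // 95, as in the example
        }
    }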
    The score for the subcriterion Complaints/Rejection Level is computed by the SAP component QM Quality Management and passed on to the MM Vendor Evaluation component.
    You can find this information in help.sap.com
    I hope it helps!
    Best Regards,
    Arminda Jack

  • Vendor Evaluation Scoring methods E & F

    Dear Folks,
    In the vendor evaluation, I see two scoring methods:
    E       Determination of Price Variance from Invoice Documents
    F     Determination of Quantity Variance from Invoice Documents
    Can anyone advise me how these two scoring methods work?
    For example, method E, price variance from invoice documents: what does it mean? The price variance from invoice documents against what - the PO price?
    And for method F, what's the quantity variance compared against? GR quantity or PO quantity?
    Best Regards
    Junwen

    Hi,
    In the case of scoring method E, the price variance in the invoice is calculated with reference to the price in the PO. In Customizing, you maintain the score for each percentage (or range of percentage) variance. The system calculates the variance of the price entered in the invoice against the PO price and then assigns the score from the values entered in Customizing.
    In the case of scoring method F, the quantity variance is calculated with reference to the quantity in the GRN. Here you enter the score in Customizing for the variance. The system calculates the variance between the quantity entered in the invoice and the GRN quantity and assigns the score you maintained in Customizing.
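    As a quick sketch of the two variances just described (illustrative only, not SAP code; the score itself would then come from the Customizing mapping mentioned above):

    class InvoiceVariance {
        // Method E: invoice price against the PO price, as a percentage.
        static double pricePercent(double invoicePrice, double poPrice) {
            return (invoicePrice - poPrice) / poPrice * 100.0;
        }

        // Method F: invoice quantity against the GRN quantity, as a percentage.
        static double quantityPercent(double invoiceQty, double grnQty) {
            return (invoiceQty - grnQty) / grnQty * 100.0;
        }

        public static void main(String[] args) {
            System.out.println(pricePercent(105, 100));    // +5% price variance
            System.out.println(quantityPercent(90, 100));  // -10% quantity variance
        }
    }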
    Hope this is clear.
    Regards,
    Prashant

  • Fact tables with different granularity

    We currently have 3 dimensions (Site, Well, Date) and 2 fact tables (GasEngine, GasField), both having granularity of a day.
    GasEngine is linked to Site and Date
    GasField is linked to Site, Well and Date
    We now have a requirement to make the GasEngine fact table have granularity of an hour but keep
    GasField at a day.
    We therefore must include a new Time dimension, which would only be linked to GasEngine.
    Is it ok to have a DW with these two fact tables having different granularity? 
    And would we therefore require two separate cubes for querying this data?

    Hi Rajendra and Visakh16,
    Based on your input in this thread, I would like to ask a question just to fine-tune my knowledge of data modelling. In Darren's case I guess his date dimension only stores records down to day-level granularity, and the requirement is now for the "GasEngine" fact table to hold data at the granularity of an hour.
    Now, based on Rajendra's input,
    "Yes, you can have. but why you need new time dimension, I recommend, make GasEngine fact to hour granularity."
    how could Darren display data for each hour without a time dimension attached to the GasEngine fact table? With the existing date dimension he can ONLY display aggregated data down to a minimum granularity of the day level.
    One could of course modify the date dimension to hold time records, but that would complicate the date dimension considerably. Instead, why can't Darren have a separate time dimension which holds ONLY time-related data, put a time key in the GasEngine fact table, and relate the two tables using that key? Wouldn't Darren's data model become more readable and simpler this way? Since a time dimension just provides another way of slicing and dicing the data, I do not think it turns Darren's cube into a complex star schema.
    I could be totally wrong, so for the sake of both Darren's knowledge and my own, I am asking the question of both of you.
    Best regards…   
    Chandima Lakmal Fonseka
