Performance - Objective Scoring Calculation

Hi,
I would like to calculate the objective score as weight_of_objective * factor. The factor is defined as follows:

Rating                            Level   Factor
================================================
Partially Meeting Expectations    High    0.7
Partially Meeting Expectations    Mid     0.6
Partially Meeting Expectations    Low     0.5
Meeting Expectations              High    1.0
Meeting Expectations              Mid     0.9
Meeting Expectations              Low     0.8

Where can I define the factor in the application, and how can I use it in a formula? I checked the seeded formula, but it is very simple and has no factor.
Thanks in advance.
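
For illustration only (this is not the seeded formula; the function and table names are hypothetical), the weight_of_objective * factor calculation described above amounts to a simple lookup, e.g. in Python:

# Hypothetical sketch of the objective score: weight_of_objective * factor,
# with the factor looked up from the (rating, level) table above.
FACTORS = {
    ("Partially Meeting Expectations", "High"): 0.7,
    ("Partially Meeting Expectations", "Mid"): 0.6,
    ("Partially Meeting Expectations", "Low"): 0.5,
    ("Meeting Expectations", "High"): 1.0,
    ("Meeting Expectations", "Mid"): 0.9,
    ("Meeting Expectations", "Low"): 0.8,
}

def objective_score(weight_of_objective, rating, level):
    return weight_of_objective * FACTORS[(rating, level)]

# Example: an objective weighted 0.25, rated "Meeting Expectations" at level "Mid"
print(objective_score(0.25, "Meeting Expectations", "Mid"))  # 0.225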


Similar Messages

  • Regarding performance widget (performance objects) for custom dashboard

    Hi,
    I'm making a custom dashboard and trying to use the performance widget for a NetApp volume, but the performance objects are not available and I only see "(All)" in the options (see screenshot). When I create the dashboard, it shows the widget with empty information.
    Thanks for any suggestions.

    Hi all,
    I managed to fix all my dashboards; from what I understand it is not a supported MS fix, as per articles I found on forums etc.
    My issues started when I renamed some of my groups because of requirements. I think SCOM 2012 doesn't like the rename; it's a serious bug that seems to cause the info in the DBs to go out of sync.
    I also found that data from the main OpsMgr DB hadn't been submitted to the DW DB for months, so my reporting was way out; hence the blank stats on dashboards, and uptime reports showing grey.
    I followed this article and fixed it; please ensure that your DBs (the OperationsManager DB and the OperationsManagerDW DB) are fully backed up before doing this.
    The issue in the end is actually with the DW DB.
    http://operationsmanager2012.blogspot.com/2013/02/scom-availability-report-monitoring.html
    Regards
    Simon Craner

  • Performance of MDX Calculation

    Hi All,
    What are the steps and ways to optimize the performance of a calculated member created in the cube?
    I have the below Code for the calculated member:
    CREATE MEMBER CURRENTCUBE.[Measures].[K3001 - Loaded Freight Rate (USD/FFE)]
     AS IIF([Measures].[Actual/Forecast FFE Loaded] = 0, NULL,
        (([Accounts].[LMB Lvl4].&[LMB.4110], [Accounts].[Account Type].&[GR], [Measures].[Actual/Forecast USD])
         +
         ([Accounts].[LMB Lvl5].&[LMB.5157], [Accounts].[Account Type].&[Rest of PnL], [Measures].[Actual/Forecast USD]))
        / [Measures].[Actual/Forecast FFE Loaded]),
    VISIBLE = 1 ,  DISPLAY_FOLDER = 'Revenue' ,  ASSOCIATED_MEASURE_GROUP = 'Key Figures';
    Here Accounts is a dimension.
    Now this calculation shows me the correct value, but when I drill into it with another dimension, say STRING, it takes 40-50 seconds to return the result.
    Here I want to know whether there is some correction needed in my query or any other optimization technique is required.
    Thanks
    Sudipta Ghosh
    Sudipta Ghosh Tata Consultancy Services

    Hi Sudipta,
    IIF is one of the most popular MDX functions. Yet, it can cause significant performance degradation, which is often blamed on other parts of the system. Many times it is simple to rewrite the MDX expression to get rid of IIF altogether, and other times it
    is possible to slightly change the IIF to increase performance.
    http://sqlblog.com/blogs/mosha/archive/2007/01/28/performance-of-iif-function-in-mdx.aspx
    In addition, I'd suggest you enable SQL Server Profiler to monitor the queries fired by the process. Once you find queries that take a very long time to run, consider creating smaller cube partitions, or optimizing the query by adding an index or partition to improve query performance. Here are some links about performance tuning.
    http://www.mssqltips.com/sqlservertip/2565/ssas--best-practices-and-performance-optimization--part-1-of-4/
    http://sqlmag.com/t-sql/top-9-analysis-services-tips
    http://channel9.msdn.com/Events/TechEd/NewZealand/2013/DBI414
    Hope this helps.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Is powerquery the place to perform advanced ETL Calculations?

    I've been trying to calculate a row's cost based on a rules table using PowerPivot, but it has melted my mind. I really don't think PowerPivot is the place to do this. The ETL process is ideally where this price should be calculated, but as I'm only using Excel and flat files, Power Query is my ETL process.
    What I'm doing is:
    I have a flat file of orders, like below.
    I have a manually created table of Stores, and the same for Items.
    To calculate pricing I have a few rules to follow (sketched in code after the sample tables below):
    - If itemID 1 is ordered, and on the same order item 2 or 3 is ordered, I use the special price.
    - If item 3 is ordered from store 2, I use the special price.
    - Otherwise the standard price is used.
    PowerPivot is not the place to do this. Is this more suited to Power Query?
    orderid  itemid  storeid
    1        1       1
    1        2       1
    1        3       1
    1        1       1
    2        1       2
    3        4       3
    3        5       3
    4        1       4
    4        5       4
    4        4       4

    itemid  price  specialprice
    1       10     5
    2       10     5
    3       10     5
    4       10     5
    5       10     5
    6       10     5
    7       10     5
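
    For reference, the three pricing rules above can be written procedurally; a minimal Python sketch over the sample rows (hypothetical structures, not Power Query or DAX):

    # Sketch of the rules: itemid 1 gets the special price when item 2 or 3 is on
    # the same order; item 3 gets the special price when ordered from store 2;
    # otherwise the standard price applies.
    orders = [  # (orderid, itemid, storeid), from the sample data above
        (1, 1, 1), (1, 2, 1), (1, 3, 1), (1, 1, 1),
        (2, 1, 2), (3, 4, 3), (3, 5, 3),
        (4, 1, 4), (4, 5, 4), (4, 4, 4),
    ]
    prices = {i: (10, 5) for i in range(1, 8)}  # itemid -> (price, specialprice)

    def line_cost(orderid, itemid, storeid):
        items_on_order = {i for (o, i, _) in orders if o == orderid}
        price, special = prices[itemid]
        if itemid == 1 and items_on_order & {2, 3}:
            return special
        if itemid == 3 and storeid == 2:
            return special
        return price

    for row in orders:
        print(row, line_cost(*row))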

    In general, it is better if you can do ETL before importing data in Power Pivot.
    However, in this case the result of the Power Query ETL would, in the best case, be a SQL query that pushes the complexity down to the data source engine; in the worst case, it would require a big effort from the Power Query engine, working with an in-memory copy of the data. This would result in a slow process time for the table.
    For this specific case, if you have a large amount of data (hundreds of thousands of rows or more), you might see better performance by creating a calculated column in Power Pivot. I'm not saying this is always the best approach, but the rule you want to implement can be expressed as a DAX calculated column (in the Orders table) that should perform very quickly.
    =
    IF (
        OR (
            Orders[StoreID] = 3,
            AND (
                Orders[OrderID] = 1,
                AND (
                    CONTAINS (
                        Orders,
                        Orders[OrderID], Orders[OrderID],
                        Orders[ItemID], 2
                    ),
                    CONTAINS (
                        Orders,
                        Orders[OrderID], Orders[OrderID],
                        Orders[ItemID], 3
                    )
                )
            )
        ),
        RELATED ( Items[SpecialPrice] ),
        RELATED ( Items[Price] )
    )
    Marco Russo (Blog, Twitter, LinkedIn) - sqlbi.com: Articles, Videos, Tools, Consultancy, Training.
    Format with DAX Formatter and design with DAX Patterns. Learn Power Pivot and SSAS Tabular.

  • Performing a specific calculation in a pivot table with BI Publisher 10g

    I am using BI Publisher 10g (I am new to it), and I need to integrate a specific calculated measure in a pivot table.
    The logic uses two measures that already exist in the report (Demand & PAB), and the required calculation is represented by a new measure/row (Calculated). The pivot time axis is in weeks.
    Week        w0      w1      w2      w3      w4      w5    ...
    Demand      d(w0)   d(w1)   d(w2)   d(w3)   d(w4)   d(w5) ...
    PAB         p(w0)   p(w1)   p(w2)   p(w3)   p(w4)   p(w5) ...
    Calculated  c(w0)   c(w1)   ...     ...     ...     c(w5) ...
    If p(w0) < d(w1), then c(w0) = p(w0) / d(w1).
    If not:
    If p(w0) < d(w1) + d(w2), then c(w0) = p(w0) / ( d(w1) + d(w2) ) * 2.
    If not:
    If p(w0) < d(w1) + d(w2) + d(w3), then c(w0) = p(w0) / ( d(w1) + d(w2) + d(w3) ) * 3.
    If not:
    etc. The same logic applies for c(w1) and so on.
    Is such calculation logic achievable in BI Publisher? If yes, how would you do it?
    Thanks for your time and support on this
    Regards, Samir
    Edited by: skaroui on 02-Mar-2012 05:50
    Edited by: skaroui on 02-Mar-2012 06:02
    Edited by: skaroui on 02-Mar-2012 06:12
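
    To spell out the pattern above: find the smallest number of future weeks whose cumulative demand exceeds PAB, then divide PAB by that cumulative demand and multiply by the number of weeks. A minimal Python sketch of that logic (names from the question; everything else is hypothetical):

    def calculated(pab, future_demand):
        # future_demand is [d(w+1), d(w+2), ...] for the week being computed
        cumulative = 0.0
        for k, d in enumerate(future_demand, start=1):
            cumulative += d
            if pab < cumulative:
                return pab / cumulative * k
        return None  # PAB is never covered by the remaining weeks

    # Example: p(w0) = 25 against d(w1..w3) = 10, 10, 10 -> 25 / 30 * 3 = 2.5
    print(calculated(25, [10, 10, 10]))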

    Can you use the logic that you specified in the data model (query) to generate the required data, and then use it in the pivot table? That would be the easiest way to do it. Tweaking the code to include the conditional logic/calculated measures is not going to be straightforward.
    Thanks,
    Bipuser

  • How do you get a fillable pdf to perform a new calculation if data is changed on form

    I have created a form to calculate payment terms. The forms work great; however, if you make a mistake inputting, or change data after the calculation has been performed, I need the new data to be recalculated. Is there a way to do this?
    thanks

    Hi Gilad,
    Thank you for the information. Unfortunately, the form that I built does not provide the correct number if any data is changed.
    Is there something that I am missing?
    I am using the form to complete an installment loan contract.
    Thanks again

  • Performance Object vs String

    Hello,
    In my program I am considering inserting a string or an object into a HashMap:
    String term;
    map.put(key, term);
    or
    map.put(key, new String(term));
    Which one is more efficient?

    Hi Nilsen,
    I would think that map.put(key, term) is far more efficient; instantiating a new object is always expensive. Java references strings in a way that minimizes memory and CPU use: if one class has a reference to a string "string" and another class has a reference to an exact duplicate "string", the JVM may in fact reference the same String object in memory (string literals are pooled). This is why Strings are immutable; they are potentially referenced by many classes, so changing the string from one class would affect the reference from the next.
    So, to make a short story long, the first option would be much more efficient.
    See ya
    Michael

  • Performance equipment rate calculation

    Dear All,
    Could you explain to me how to handle the performance equipment rate in SAP B1? We have created an equipment card but are still unable to calculate it. TIA
    Rgd
    Mark

    I guess, and am fairly sure, you have developed an add-on. It is probably a UI add-on or a DI add-on. Let me know about it.
    The basic rating of equipment is efficiency. You can create some UDFs in the production order; the UDF is a routing/operation sequence linked to a table. Creating more than one table makes it easy to store machine data per product. Of course, the table is not included in the general ledger journal transaction; you enter this manually in the journal entry. The table consists of machine type, capacity, and efficiency.
    Let me know if you need more detail.
    Rgds,

  • Effect of Restricted Keyfigure & calculated keyfigure in query performance

    Hi,
    What is the effect of restricted key figures and calculated key figures on query performance?
    Regards
    Anil

    As compared to formulas that are evaluated during query execution, calculated key figures are pre-calculated and their definitions are stored in the metadata repository for reuse in queries. The incorporation of business metrics and key performance indicators as calculated key figures, such as gross profit and return on investment (which are frequently used, widely understood, and rarely changed), improves query performance and ensures that calculated key figures are reported consistently by different users. Note that this approach improves query runtime performance but slows InfoCube or ODS object update time. As a rule of thumb, if multiple and frequently used queries use the same formula to compute calculated fields, use calculated key figures instead of formulas.
    RKFs result in additional database processing and complexity in retrieving the query result and therefore should be avoided when possible.
    Other than performance, there might be other considerations that determine which of the options should be used.
    If the RKFs are query-specific and not used in the majority of other queries, I would go for structure selections. From my personal experience, sometimes all the developers end up with so many RKFs and CKFs that you easily get lost in the web, not to mention the duplication.
    If the same structure is needed widely across most of the queries, it might be a good idea to have a global structure available across the provider, which can considerably cut down development time.

  • Performing calculation on stored data

    Hi all, maybe a Lookout developer more experienced than me can help me
    with this one.
    I am storing information in a Citadel database regarding production levels and timing, and I would like to perform some statistical calculations such as average, standard deviation, etc. I am trying to figure out how to pass the information from the database to the Calculation objects. I read in the documentation that there are 35 samples available as a data member in some objects, but I wonder if there is a way to perform the calculations on a larger number of data points. Any ideas?
    Thank you in advance.
    Everardo

    Good Morning Everardo,
    Citadel has built in statistical functions. These are easily experimented
    with using MS Query to access Citadel from MS Excel. You might also refer
    to the reference manual or online help regarding the ODBC compatibility of
    Citadel. With these tools, you can use the SQL object within Lookout to
    make queries on Citadel and invoke the built in statistical functions.
    Good Luck
    Todd Johnson
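
    As a rough illustration of that approach, here is a minimal Python sketch of querying Citadel over ODBC (the DSN, table, and column names are hypothetical placeholders; check the Citadel ODBC documentation for the actual schema and supported aggregate functions):

    import pyodbc  # assumes the Citadel ODBC driver is installed and a DSN is configured

    conn = pyodbc.connect("DSN=CitadelDSN")  # hypothetical DSN name
    cursor = conn.cursor()
    # Standard SQL aggregates can summarize far more than 35 samples at a time.
    cursor.execute(
        "SELECT AVG(ProductionLevel), MIN(ProductionLevel), MAX(ProductionLevel) "
        "FROM Traces"
    )
    avg_value, min_value, max_value = cursor.fetchone()
    print(avg_value, min_value, max_value)
    conn.close()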
    "Everardo Hernandez" wrote in message
    news:[email protected]..
    > Hi all, maybe a Lookout developer more experienced than me can help me
    > with this one.
    > I am storing information in a Citadel database regarding production
    > levels and timing, I would like to perform some statistical calculations
    > su
    ch as Average, Std deviation, etc. I am trying to figure out how to pass
    > the information from the database to the Calculation Objects, I read in
    the
    > documentation that are 35 samples available as a data member in some
    > objects, but I wonder if there is a way to perform the calculations in a
    > larger number of data. Any idea?
    > Thank you in advance.
    >
    > Everardo
    >
    >

  • Check performed on an Auth Object?

    Hi All,
    Consider an object in a role with 5 fields. Suppose we maintain 2 fields and leave 3 fields unmaintained.
    How would the check be done? Will the check be done only for the maintained field values in the object, with the unmaintained fields ignored in the check?
    Or is the check object-dependent?
    Thankyou,
    Ajit

    Hi Ajit,
    >
    Ajit Nadkarni wrote:
    > Hi All,
    >
    > Consider an object in a role with 5 fields. Suppose we maintain 2 fields and leave 3 fields unmaintained.
    I think you mean five instances of the same object.
    An object has only a fixed set of fields, irrespective of the number of times you pull it.
    > How would the check be done? Will the check be done only for the maintained field values in the object, with the unmaintained fields ignored in the check?
    > Or is the check object-dependent?
    > Ajit
    Take any object as an example, say S_TABU_DIS: you have five instances of it, each with different values, and one unmaintained (open). You generate the profile. The checks in SAP happen as an AND operation within one instance.
    So if one S_TABU_DIS instance has activity 02 and auth group 'SS', and a second instance has 03 and 'VS', the fields are checked in an AND operation within each instance: only activity 02 is granted for 'SS'. Instances don't cross-pollinate, so this will not result in 03 for auth group 'SS'. Open authorizations provide no extra authorization, unless explicitly checked for DUMMY / ' '. (Julius, I remember you.)
    Once the user logs in to SAP, the user buffer is loaded; the first successful check returns RC=0, and if no instance matches, the check fails.
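
    To make that concrete, here is a toy Python sketch of the behaviour described above (the S_TABU_DIS field names ACTVT and DICBERCLS are real; everything else is a simplification, not how the kernel implements it):

    # Within one instance, every field the check asks about must be satisfied (AND);
    # across instances, any single matching instance passes the check (RC = 0).
    # An unmaintained (open) field holds only ' ' and matches nothing else.
    def instance_ok(instance, requested):
        return all(value in instance.get(field, {" "})
                   for field, value in requested.items())

    def auth_check(instances, requested):
        return 0 if any(instance_ok(i, requested) for i in instances) else 4

    s_tabu_dis = [
        {"ACTVT": {"02"}, "DICBERCLS": {"SS"}},  # instance 1
        {"ACTVT": {"03"}, "DICBERCLS": {"VS"}},  # instance 2
    ]
    print(auth_check(s_tabu_dis, {"ACTVT": "02", "DICBERCLS": "SS"}))  # 0: allowed
    print(auth_check(s_tabu_dis, {"ACTVT": "03", "DICBERCLS": "SS"}))  # 4: no cross-pollination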
    Hope this clarifies
    Abhishek
    Edited by: Abhishek Belokar on Sep 23, 2008 6:44 PM

  • Getting error "Object variable or with block variable not set" when trying to open a FR report in studio

    Problem Description
    We are on FR 11.1.2.2.305, installed on AIX. A user is getting this error: "Object variable or with block variable not set" when trying to open an FR report from the FR Studio client installed on Windows XP. Initially, we thought it might be an FR client installation issue; we uninstalled, cleaned up the registry, and did a fresh installation of the client, but the issue still persists. The FR server and the client are on the same version.
    The user facing the issue is an LDAP user. We have confirmed with other users, and they don't have any issue accessing FR reports from their own clients; but when they try to connect from the machine of the user who is having issues, the other users also see the above error. All the users are LDAP users and all belong to the same Shared Services groups, so the provisioning is the same.
    Any input will be appreciated.
    Thanks

    OK, in this case of one single computer, please make sure that the settings in the KB document below are in place, and then validate the issue:
    Internet Explorer (IE7, IE8, IE9 and IE10) Recommended Settings for Oracle Hyperion Products (Doc ID 820892.1)
    The information in this document applies to the following Enterprise Performance Management products:
        Calculation Manager
        Data Relationship Management (DRM)
        Enterprise Performance Management Architect (EPMA)
        EPM Workspace
        Essbase Administration Services (EAS)
        Financial Data Quality Management (FDM)
        Financial Management (HFM)
        Financial Reporting
        Foundation Services
        Interactive Reporting
        Planning
        Shared Services
        Web Analysis
    Thanks!

  • Structures Vs RKFs and CKFs In Query performance

    Hi Gurus,
    I am creating a GL query which will return a couple of KFs and some calculations across different GL accounts, and I wanted to know which will be more beneficial: creating restricted key figures and calculated key figures, or just using a structure for all the selections and formula calculations?
    Which option will be better for query performance?
    Thanks in advance

    As compared to formulas that are evaluated during query execution, calculated key figures are pre-calculated and their definitions are stored in the metadata repository for reuse in queries. The incorporation of business metrics and key performance indicators as calculated key figures, such as gross profit and return on investment (which are frequently used, widely understood, and rarely changed), improves query performance and ensures that calculated key figures are reported consistently by different users. Note that this approach improves query runtime performance but slows InfoCube or ODS object update time. As a rule of thumb, if multiple and frequently used queries use the same formula to compute calculated fields, use calculated key figures instead of formulas.
    RKFs result in additional database processing and complexity in retrieving the query result and therefore should be avoided when possible.

  • End Routine ABAP to read from Internal table and do calculation.

    Hi All...
    I have completed some coding in a start routine to extract some fields from a DSO containing master data (stock age) into an internal table (defined in the global declarations area), which will then be read at load time in the end routine and used in a calculation as described below.
    i.e.:
    GLOBAL DATA DECLARATION
    DATA: ITAB1 TYPE TABLE OF /BIC/DSOTAB.
    (DSOTAB has 3 fields: PLANT, STYLE, 1STDATE. 1STDATE is a date field.)
    The start routine has the following code:
    IF ITAB1 IS INITIAL.
      SELECT /BIC/PLANT /BIC/STYLE /BIC/1STDATE
        FROM /BIC/DSOTAB
        INTO CORRESPONDING FIELDS OF TABLE ITAB1.
    ENDIF.
    This works fine when run under simulation, i.e. ITAB1 is filled, no problem.
    I then need to do a calculation in the end routine.
    1. First I have to find the record in the internal table using the key of PLANT AND STYLE from the RESULT_PACKAGE.
    The code i am using now is as follows....
    READ TABLE ITAB1 TRANSPORTING NO FIELDS WITH KEY
      /BIC/PLANT = <result_fields>-/BIC/PLANT
      /BIC/STYLE = <result_fields>-/BIC/STYLE.
    Once this record has been read I then have to perform the following calculation using the following additional fields
    <result_fields>-/BIC/DYS1ST is a NUMC field in the <result_fields> that will be filled by the result of the calculation described below.
    <result_fields>-CALDAY is a date field, already populated in the <result_fields>, which is used in the calculation below.
    The Calculation required is a difference in days between two dates
    DYS1ST = CALDAY - 1STRED.
    The code i am using is
    IF sy-subrc = 0.
      <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY - i_t_1stred_dso-/BIC/1STRED.
    ENDIF.
    So the whole section of code inside the LOOP AT RESULT_PACKAGE looks like this in the end routine:
    READ TABLE ITAB1 TRANSPORTING NO FIELDS WITH KEY
      /BIC/PLANT = <result_fields>-/BIC/PLANT
      /BIC/STYLE = <result_fields>-/BIC/STYLE.
    IF sy-subrc = 0.
      <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY - i_t_1stred_dso-/BIC/1STRED.
    ENDIF.
    I'm getting the error:
    "ITAB1" is a table without a header line and therefore has no component called "/BIC/1STRED"
    Can someone please advise what I need to do to fix this?
    Thanks in advance
    Stevo:)
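
    (A side note on the date arithmetic itself: subtracting two ABAP date fields yields the difference in days, the same behaviour as subtracting dates in, say, Python; the values below are made up.)

    from datetime import date

    calday = date(2024, 3, 15)   # stands in for <result_fields>-CALDAY
    firstred = date(2024, 3, 1)  # stands in for the 1STRED date read from ITAB1
    print((calday - firstred).days)  # 14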

    Hi,
    You will have to make a few changes in your code, as below.
    GLOBAL DATA DECLARATION
    DATA: ITAB1 TYPE STANDARD TABLE OF /BIC/DSOTAB.
    After that, declare a work area to read the values into:
    DATA: i_wa_itab1 TYPE /BIC/DSOTAB.
    (Your start routine and the rest of the logic stay as you posted them.) Then, in the end routine, read the record into the work area instead of using TRANSPORTING NO FIELDS:
    READ TABLE ITAB1 INTO i_wa_itab1 WITH KEY
      /BIC/PLANT = <result_fields>-/BIC/PLANT
      /BIC/STYLE = <result_fields>-/BIC/STYLE.
    IF sy-subrc = 0.
      <result_fields>-/BIC/DYS1ST = <result_fields>-CALDAY - i_wa_itab1-/BIC/1STRED.
    ENDIF.
    Once you make these changes, your code will work fine.
    Regards,
    Durgesh.

  • Problem with Math In Calculated Fields

    I am calculating a group incident rate for data returned from a query. The formula is: number of cases, multiplied by 200000, divided by number of hours worked. Cases in my report is the calculated field calc.CaseSum (the sum of cases for the group); Hours is calc.SumHours (the sum of hours for the group). The actual values for these variables (for the first group) are 48 and 29427171, respectively. When I create a calculated field called rate using the formula (calc.CaseSum * 200000) / calc.SumHours, ColdFusion generates a runtime error:
    Invalid ColdFusion expression in report. If the expression is a string, ensure that it is within quotes. Error: (calc.CaseSum * 200000) / calc.SumHours is not a valid ColdFusion expression.
    If I use the constant value "29427171" as the divisor, the report works, albeit only for the first group. Any ideas; is this a bug, or am I misusing the product?
    Addition: I forgot to mention I am using CF8. Also this
    formula worked fine as a Report Total before I introduced grouping
    and modified the calculated fields to reset on the change of a
    group.
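
    (As a quick sanity check of the formula itself, outside ColdFusion: 200000 is 100 workers x 2000 hours/year, and the values quoted above give a rate of roughly 0.33. A short Python check:)

    cases = 48
    hours = 29427171
    rate = (cases * 200000) / hours
    print(round(rate, 3))  # 0.326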

    Sorry, I've been on another project for a while. This problem will certainly be a "show stopper" for me if I cannot resolve it.
    As I mentioned in my original post, I used a constant in the
    formula in lieu of the variable and the calculation worked. This
    would suggest that CF does not have a problem with a large number.
    In spite of that reasoning, I tried Tony's suggestion (thanks, by the way!) with the identical outcome; the only difference is that the new formula is displayed in the error message.
    Tony, you also suggested that I set the variables using CFSET... how would I do this within the report writer environment?
    I had tried a similar approach: perform half the calculation, i.e. the part within the parentheses, assign that value to a separate calculated field, and then perform the rest of the calculation on that variable, with the same outcome.
    I think that I may be dealing with a CF bug here; I'd like to find a workaround. I've noticed that CF8 has a new patch; perhaps after I apply it, I may be able to get this thing to work. I'm on another project right now, so it will be a few days before I can test this theory; I'll report the result.
    Should this fail, and no one can come up with a workaround, I will report this to Adobe.
