Step-by-step procedure to use the dimension operator

Hi,
Is there a link where I can find a step-by-step procedure for using the dimension operator, i.e. mapping data from the source table to the operator and from the operator to the table?
Which attribute should we map to the dimension key column?
Please help me out with this.
Thanks

The dimension data object in the design tree is all about the semantics of the object - the description of hierarchies, levels and attributes, PLUS how it is bound to some persistence.
The dimension operator in a mapping is all about how that semantic object is loaded with data. You map source to dimension operator; under the covers OWB knows the semantics of the dimension and how it is bound to persistence, so it can generate the expanded mapping.
There is an OBE below that is still pertinent to 11gR2 and covers some of this:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/10g/r2/owb/owb10gr2_obe_series/owb10g.htm
You should only map business key columns (to identify a level), attributes for each level, and hierarchy relationship columns. The surrogate key columns are populated under the hood by OWB.
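As a rough illustration of what "populated under the hood" means, here is a minimal sketch of the shape of the load logic, assuming a dimension bound to a PRODUCT_DIM table with a PRODUCT_DIM_SEQ sequence. All names are illustrative, not actual OWB output - the real generated statement is far more involved, as the SQL quoted further down this page shows.

-- Hypothetical names; the shape of the logic, not OWB's generated code.
MERGE INTO product_dim d
USING (SELECT product_code, product_name FROM stg_product) s
ON (d.product_code = s.product_code)          -- business key identifies the level
WHEN MATCHED THEN
  UPDATE SET d.product_name = s.product_name  -- level attributes
WHEN NOT MATCHED THEN
  INSERT (d.dimension_key, d.product_code, d.product_name)
  VALUES (product_dim_seq.NEXTVAL,            -- surrogate key, managed by OWB
          s.product_code, s.product_name);

You map only the business key and attribute columns; the surrogate key comes from a sequence that OWB maintains for you.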
Cheers
David

Similar Messages

  • How to Use Dimension Operator

    Hi
    I'm trying to implement SCD Type 2. I have done so using conventional methods.
    I have read in some blogs that the dimension operator can be used for SCD. Can anyone provide me material on how to use the dimension operator? I tried the OWB User Guide, but it's not useful.
    I have seen that we need to create levels, but I don't need levels.
    Can somebody please tell me how to use it?
    Regards
    Vibhuti

    Hi Vibhuti,
    using dimensions with OWB 10g R2 isn't that difficult. You just create a number of attributes like an ID, a business key and probably a description, and then associate each of the attributes with a level. You need at least one level that all the attributes are associated with. Then you can use the slowly changing dimension (SCD) wizard to track changes. In the SCD settings you can determine for which attributes you want to trigger history and which attributes contain your effective date and your end date (if you want to use SCD Type 2). Obviously you would need two additional attributes in every level for that purpose.
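    To make this concrete, here is a minimal sketch of the kind of table a single-level SCD Type 2 dimension is typically bound to. All names here are illustrative assumptions, not anything OWB generates:

    CREATE TABLE customer_dim (
      dimension_key  NUMBER        NOT NULL,  -- surrogate ID attribute
      customer_code  VARCHAR2(20)  NOT NULL,  -- business key
      customer_name  VARCHAR2(100),           -- attribute that triggers history
      eff_date       DATE          NOT NULL,  -- SCD2 effective date
      exp_date       DATE                     -- SCD2 end date (open = current row)
    );

    When a triggering attribute changes, the current row's exp_date is closed off and a new row is inserted with a fresh surrogate key and a new eff_date.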
    Regards,
    Jörg

  • Anybody got SCD Type 2s to perform quickly using the dimension operator?

    Hi there,
    We're hitting major performance problems running mappings to populate SCD Type 2s when they have large amounts of pre-existing data.
    Has anybody got this performing acceptably? We tried indexing, but to no avail.
    Many Thanks

    Hi there,
    Thanks for getting back to me - I found the patch, and this patch had already been applied.
    An example of the SQL generated by a really simple mapping with the dimension operator, for small tables, is as follows:
    MERGE
    /*+ APPEND PARALLEL("NS_0") */
    INTO
    "RETAILER_PUBLISHER_NS"
    USING
    (SELECT
    "MERGE_DELTA_ROW_0"."NS_OUTLET_SRC_ID$1" "NS_OUTLET_SRC_ID",
    "MERGE_DELTA_ROW_0"."NS_PUBLISHER_CODE$1" "NS_PUBLISHER_CODE",
    "MERGE_DELTA_ROW_0"."NS_TITLE_CLASSIFICATION_CODE$1" "NS_TITLE_CLASSIFICATION_CODE",
    "MERGE_DELTA_ROW_0"."NS_SUPPLY_FLAG$1" "NS_SUPPLY_FLAG",
    "MERGE_DELTA_ROW_0"."NS_EFF_DATE$1" "NS_EFF_DATE",
    "MERGE_DELTA_ROW_0"."NS_EXP_DATE$1" "NS_EXP_DATE",
    "MERGE_DELTA_ROW_0"."NS_ID$1" "NS_ID"
    FROM
    (SELECT
    "NS_ID" "NS_ID$1",
    "NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID$1",
    "NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE$1",
    "NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CODE$1",
    "NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG$1",
    "NS_EFF_DATE" "NS_EFF_DATE$1",
    "NS_EXP_DATE" "NS_EXP_DATE$1"
    FROM
    (SELECT
    (Case When (("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0" IS NULL) OR ((("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS')) OR ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NOT NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS') AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0", 'J.HH24.MI.SS') >= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS'))) AND (("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" != "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2") OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" != "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0")))) then ("SPLITTER_INPUT_SUBQUERY"."NS_ID_1") else ("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0") end)/* MERGE_DELTA_ROW.OUTGRP1.NS_ID */ "NS_ID",
    "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID",
    "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE",
    "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE",
    "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1"/* MERGE_DELTA_ROW.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG",
    (Case When (("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0" IS NULL)) then ((case when ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1" < SYSDATE ) then ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1") else ( SYSDATE ) end)) when ((("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS')) OR ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NOT NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS') AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0", 'J.HH24.MI.SS') >= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS'))) AND (("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" != "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2") OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" != "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0"))) then ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1") else ("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0") end)/* MERGE_DELTA_ROW.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE",
    (Case When ((ROW_NUMBER() OVER (PARTITION BY "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_1","SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_1" ORDER BY "SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1" DESC)) = 1) then (Case When (("SPLITTER_INPUT_SUBQUERY"."NS_ID_0_0" IS NULL) OR ((("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS')) OR ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0" IS NOT NULL AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_0_0", 'J.HH24.MI.SS') <= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS') AND TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0", 'J.HH24.MI.SS') >= TO_CHAR("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1", 'J.HH24.MI.SS'))) AND (("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_1" != "SPLITTER_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CO_2") OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NOT NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0" IS NULL) OR ("SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_1" != "SPLITTER_INPUT_SUBQUERY"."NS_SUPPLY_FLAG_0_0")))) then ( TO_DATE('31-DEC-4000') ) else ("SPLITTER_INPUT_SUBQUERY"."NS_EXP_DATE_0_0") end) else (("SPLITTER_INPUT_SUBQUERY"."NS_EFF_DATE_1" - INTERVAL '1' SECOND)) end)/* MERGE_DELTA_ROW.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE"
    FROM
    (SELECT
    "INGRP1"."NS_ID" "NS_ID_1",
    "INGRP1"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_1",
    "INGRP1"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_1",
    "INGRP1"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_1",
    "INGRP1"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_1",
    "INGRP1"."NS_EFF_DATE" "NS_EFF_DATE_1",
    "INGRP1"."NS_EXP_DATE" "NS_EXP_DATE_1",
    "INGRP2"."NS_ID" "NS_ID_0_0",
    "INGRP2"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_0_0",
    "INGRP2"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_0_0",
    "INGRP2"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_2",
    "INGRP2"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_0_0",
    "INGRP2"."NS_EFF_DATE" "NS_EFF_DATE_0_0",
    "INGRP2"."NS_EXP_DATE" "NS_EXP_DATE_0_0",
    "INGRP2"."DIMENSION_KEY" "DIMENSION_KEY_0"
    FROM
    ( SELECT
    "RETAILER_PUBLISHER_NS"."NS_ID" "NS_ID",
    "RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID",
    "RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE",
    "RETAILER_PUBLISHER_NS"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CODE",
    "RETAILER_PUBLISHER_NS"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG",
    "RETAILER_PUBLISHER_NS"."NS_EFF_DATE" "NS_EFF_DATE",
    "RETAILER_PUBLISHER_NS"."NS_EXP_DATE" "NS_EXP_DATE",
    "RETAILER_PUBLISHER_NS"."DIMENSION_KEY" "DIMENSION_KEY"
    FROM
    "RETAILER_PUBLISHER_NS" "RETAILER_PUBLISHER_NS"
    WHERE
    ( "RETAILER_PUBLISHER_NS"."DIMENSION_KEY" = "RETAILER_PUBLISHER_NS"."NS_ID" ) AND
    ( "RETAILER_PUBLISHER_NS"."NS_ID" IS NOT NULL ) ) "INGRP2"
    RIGHT OUTER JOIN ( SELECT
    NULL "NS_ID",
    "LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" "NS_OUTLET_SRC_ID",
    "LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" "NS_PUBLISHER_CODE",
    "LOOKUP_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE$2" "NS_TITLE_CLASSIFICATION_CODE",
    "LOOKUP_INPUT_SUBQUERY"."NS_SUPPLY_FLAG$2" "NS_SUPPLY_FLAG",
    "LOOKUP_INPUT_SUBQUERY"."NS_EFF_DATE$2" "NS_EFF_DATE",
    "LOOKUP_INPUT_SUBQUERY"."NS_EXP_DATE$2" "NS_EXP_DATE"
    FROM
    (SELECT
    "DEDUP_SRC"."NS_ID$3" "NS_ID$2",
    "DEDUP_SRC"."NS_OUTLET_SRC_ID$3" "NS_OUTLET_SRC_ID$2",
    "DEDUP_SRC"."NS_PUBLISHER_CODE$3" "NS_PUBLISHER_CODE$2",
    "DEDUP_SRC"."NS_TITLE_CLASSIFICATION_CODE$3" "NS_TITLE_CLASSIFICATION_CODE$2",
    "DEDUP_SRC"."NS_SUPPLY_FLAG$3" "NS_SUPPLY_FLAG$2",
    "DEDUP_SRC"."NS_EFF_DATE$3" "NS_EFF_DATE$2",
    "DEDUP_SRC"."NS_EXP_DATE$3" "NS_EXP_DATE$2"
    FROM
    (SELECT
    NULL/* DEDUP_SRC.OUTGRP1.NS_ID */ "NS_ID$3",
    ("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */)/* DEDUP_SRC.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$3",
    ((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */)/* DEDUP_SRC.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$3",
    ("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */)/* DEDUP_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$3",
    ("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */)/* DEDUP_SRC.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$3",
    MIN(("PUB_AGENT_MATRIX_CC"."PAM_EFFECTIVE_DATE"/* EXPR_SRC.OUTGRP1.NS_EFF_DATE */)) KEEP (DENSE_RANK FIRST ORDER BY NULL/* EXPR_SRC.OUTGRP1.NS_ID */)/* DEDUP_SRC.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$3",
    NULL/* DEDUP_SRC.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$3"
    FROM
    "REFSTG"."PUB_AGENT_MATRIX_CC" "PUB_AGENT_MATRIX_CC"
    WHERE
    ( "PUB_AGENT_MATRIX_CC"."PAM_ADD_REMOVE_FLAG" = 'A' )
    GROUP BY
    ("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */), ((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */),NULL,NULL/* RETAILER_PUBLISHER_NS.DEDUP_SRC */) "DEDUP_SRC") "LOOKUP_INPUT_SUBQUERY"
    WHERE
    ( NOT ( "LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" IS NULL AND "LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" IS NULL ) ) ) "INGRP1" ON ( ( ( "INGRP2"."NS_EFF_DATE" IS NULL OR ( ( "INGRP2"."NS_EXP_DATE" IS NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) OR ( "INGRP2"."NS_EXP_DATE" IS NOT NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) AND TO_CHAR ( "INGRP2"."NS_EXP_DATE" , 'J.HH24.MI.SS' ) >= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) ) ) ) AND ( ( "INGRP2"."NS_PUBLISHER_CODE" = "INGRP1"."NS_PUBLISHER_CODE" ) ) AND ( ( "INGRP2"."NS_OUTLET_SRC_ID" = "INGRP1"."NS_OUTLET_SRC_ID" ) ) )) "SPLITTER_INPUT_SUBQUERY"
    WHERE
    ( ( ( "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_1" = "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_0_0" AND "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_1" = "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_0_0" ) ) OR ( "SPLITTER_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID_0_0" IS NULL AND "SPLITTER_INPUT_SUBQUERY"."NS_PUBLISHER_CODE_0_0" IS NULL ) )
    UNION
    SELECT
    "DEDUP_SCD_SRC"."NS_ID$4" "NS_ID",
    "DEDUP_SCD_SRC"."NS_OUTLET_SRC_ID$4" "NS_OUTLET_SRC_ID",
    "DEDUP_SCD_SRC"."NS_PUBLISHER_CODE$4" "NS_PUBLISHER_CODE",
    "DEDUP_SCD_SRC"."NS_TITLE_CLASSIFICATION_CODE$4" "NS_TITLE_CLASSIFICATION_CODE",
    "DEDUP_SCD_SRC"."NS_SUPPLY_FLAG$4" "NS_SUPPLY_FLAG",
    "DEDUP_SCD_SRC"."NS_EFF_DATE$4" "NS_EFF_DATE",
    "DEDUP_SCD_SRC"."NS_EXP_DATE$4" "NS_EXP_DATE"
    FROM
    (SELECT
    "AGG_INPUT"."NS_ID$5"/* DEDUP_SCD_SRC.OUTGRP1.NS_ID */ "NS_ID$4",
    "AGG_INPUT"."NS_OUTLET_SRC_ID$5"/* DEDUP_SCD_SRC.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$4",
    "AGG_INPUT"."NS_PUBLISHER_CODE$5"/* DEDUP_SCD_SRC.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$4",
    MIN("AGG_INPUT"."NS_TITLE_CLASSIFICATION_CODE$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_TITLE_CLASSIFICATION_CODE$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$4",
    MIN("AGG_INPUT"."NS_SUPPLY_FLAG$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_SUPPLY_FLAG$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$4",
    MIN("AGG_INPUT"."NS_EFF_DATE$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_EFF_DATE$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$4",
    MIN("AGG_INPUT"."NS_EXP_DATE$5") KEEP (DENSE_RANK FIRST ORDER BY "AGG_INPUT"."NS_EXP_DATE$5" NULLS LAST)/* DEDUP_SCD_SRC.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$4"
    FROM
    (SELECT
    "SPLITTER_INPUT_SUBQUERY$1"."NS_ID_0_0$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_ID */ "NS_ID$5",
    "SPLITTER_INPUT_SUBQUERY$1"."NS_OUTLET_SRC_ID_1$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$5",
    "SPLITTER_INPUT_SUBQUERY$1"."NS_PUBLISHER_CODE_1$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$5",
    "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$5",
    "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$5",
    "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_0_0$1"/* UPDATE_DELTA_ROW.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$5",
    ("SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" - INTERVAL '1' SECOND)/* UPDATE_DELTA_ROW.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$5"
    FROM
    (SELECT
    "INGRP1"."NS_ID" "NS_ID_1$1",
    "INGRP1"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_1$1",
    "INGRP1"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_1$1",
    "INGRP1"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_1$1",
    "INGRP1"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_1$1",
    "INGRP1"."NS_EFF_DATE" "NS_EFF_DATE_1$1",
    "INGRP1"."NS_EXP_DATE" "NS_EXP_DATE_1$1",
    "INGRP2"."NS_ID" "NS_ID_0_0$1",
    "INGRP2"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID_0_0$1",
    "INGRP2"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE_0_0$1",
    "INGRP2"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CO_2$1",
    "INGRP2"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG_0_0$1",
    "INGRP2"."NS_EFF_DATE" "NS_EFF_DATE_0_0$1",
    "INGRP2"."NS_EXP_DATE" "NS_EXP_DATE_0_0$1",
    "INGRP2"."DIMENSION_KEY" "DIMENSION_KEY_0$1"
    FROM
    ( SELECT
    "RETAILER_PUBLISHER_NS"."NS_ID" "NS_ID",
    "RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID" "NS_OUTLET_SRC_ID",
    "RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE" "NS_PUBLISHER_CODE",
    "RETAILER_PUBLISHER_NS"."NS_TITLE_CLASSIFICATION_CODE" "NS_TITLE_CLASSIFICATION_CODE",
    "RETAILER_PUBLISHER_NS"."NS_SUPPLY_FLAG" "NS_SUPPLY_FLAG",
    "RETAILER_PUBLISHER_NS"."NS_EFF_DATE" "NS_EFF_DATE",
    "RETAILER_PUBLISHER_NS"."NS_EXP_DATE" "NS_EXP_DATE",
    "RETAILER_PUBLISHER_NS"."DIMENSION_KEY" "DIMENSION_KEY"
    FROM
    "RETAILER_PUBLISHER_NS" "RETAILER_PUBLISHER_NS"
    WHERE
    ( "RETAILER_PUBLISHER_NS"."DIMENSION_KEY" = "RETAILER_PUBLISHER_NS"."NS_ID" ) AND
    ( "RETAILER_PUBLISHER_NS"."NS_ID" IS NOT NULL ) ) "INGRP2"
    RIGHT OUTER JOIN ( SELECT
    NULL "NS_ID",
    "LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" "NS_OUTLET_SRC_ID",
    "LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" "NS_PUBLISHER_CODE",
    "LOOKUP_INPUT_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE$2" "NS_TITLE_CLASSIFICATION_CODE",
    "LOOKUP_INPUT_SUBQUERY"."NS_SUPPLY_FLAG$2" "NS_SUPPLY_FLAG",
    "LOOKUP_INPUT_SUBQUERY"."NS_EFF_DATE$2" "NS_EFF_DATE",
    "LOOKUP_INPUT_SUBQUERY"."NS_EXP_DATE$2" "NS_EXP_DATE"
    FROM
    (SELECT
    "DEDUP_SRC"."NS_ID$3" "NS_ID$2",
    "DEDUP_SRC"."NS_OUTLET_SRC_ID$3" "NS_OUTLET_SRC_ID$2",
    "DEDUP_SRC"."NS_PUBLISHER_CODE$3" "NS_PUBLISHER_CODE$2",
    "DEDUP_SRC"."NS_TITLE_CLASSIFICATION_CODE$3" "NS_TITLE_CLASSIFICATION_CODE$2",
    "DEDUP_SRC"."NS_SUPPLY_FLAG$3" "NS_SUPPLY_FLAG$2",
    "DEDUP_SRC"."NS_EFF_DATE$3" "NS_EFF_DATE$2",
    "DEDUP_SRC"."NS_EXP_DATE$3" "NS_EXP_DATE$2"
    FROM
    (SELECT
    NULL/* DEDUP_SRC.OUTGRP1.NS_ID */ "NS_ID$3",
    ("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */)/* DEDUP_SRC.OUTGRP1.NS_OUTLET_SRC_ID */ "NS_OUTLET_SRC_ID$3",
    ((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */)/* DEDUP_SRC.OUTGRP1.NS_PUBLISHER_CODE */ "NS_PUBLISHER_CODE$3",
    ("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */)/* DEDUP_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */ "NS_TITLE_CLASSIFICATION_CODE$3",
    ("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */)/* DEDUP_SRC.OUTGRP1.NS_SUPPLY_FLAG */ "NS_SUPPLY_FLAG$3",
    MIN(("PUB_AGENT_MATRIX_CC"."PAM_EFFECTIVE_DATE"/* EXPR_SRC.OUTGRP1.NS_EFF_DATE */)) KEEP (DENSE_RANK FIRST ORDER BY NULL/* EXPR_SRC.OUTGRP1.NS_ID */)/* DEDUP_SRC.OUTGRP1.NS_EFF_DATE */ "NS_EFF_DATE$3",
    NULL/* DEDUP_SRC.OUTGRP1.NS_EXP_DATE */ "NS_EXP_DATE$3"
    FROM
    "REFSTG"."PUB_AGENT_MATRIX_CC" "PUB_AGENT_MATRIX_CC"
    WHERE
    ( "PUB_AGENT_MATRIX_CC"."PAM_ADD_REMOVE_FLAG" = 'A' )
    GROUP BY
    ("PUB_AGENT_MATRIX_CC"."PAM_CUSTOMER_ID"/* EXPR_SRC.OUTGRP1.NS_OUTLET_SRC_ID */), ((to_char("PUB_AGENT_MATRIX_CC"."PAM_PUBLISHER_CODE")/* EXP.OUTGRP1.PAM_PUBLISHER_CODE */)/* EXPR_SRC.OUTGRP1.NS_PUBLISHER_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_TITLCLAS_CODE"/* EXPR_SRC.OUTGRP1.NS_TITLE_CLASSIFICATION_CODE */), ("PUB_AGENT_MATRIX_CC"."PAM_SUPPLY_FLAG"/* EXPR_SRC.OUTGRP1.NS_SUPPLY_FLAG */),NULL,NULL/* RETAILER_PUBLISHER_NS.DEDUP_SRC */) "DEDUP_SRC") "LOOKUP_INPUT_SUBQUERY"
    WHERE
    ( NOT ( "LOOKUP_INPUT_SUBQUERY"."NS_OUTLET_SRC_ID$2" IS NULL AND "LOOKUP_INPUT_SUBQUERY"."NS_PUBLISHER_CODE$2" IS NULL ) ) ) "INGRP1" ON ( ( ( "INGRP2"."NS_EFF_DATE" IS NULL OR ( ( "INGRP2"."NS_EXP_DATE" IS NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) OR ( "INGRP2"."NS_EXP_DATE" IS NOT NULL AND TO_CHAR ( "INGRP2"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) AND TO_CHAR ( "INGRP2"."NS_EXP_DATE" , 'J.HH24.MI.SS' ) >= TO_CHAR ( "INGRP1"."NS_EFF_DATE" , 'J.HH24.MI.SS' ) ) ) ) ) AND ( ( "INGRP2"."NS_PUBLISHER_CODE" = "INGRP1"."NS_PUBLISHER_CODE" ) ) AND ( ( "INGRP2"."NS_OUTLET_SRC_ID" = "INGRP1"."NS_OUTLET_SRC_ID" ) ) )) "SPLITTER_INPUT_SUBQUERY$1"
    WHERE
    ( "SPLITTER_INPUT_SUBQUERY$1"."NS_OUTLET_SRC_ID_1$1" = "SPLITTER_INPUT_SUBQUERY$1"."NS_OUTLET_SRC_ID_0_0$1" AND "SPLITTER_INPUT_SUBQUERY$1"."NS_PUBLISHER_CODE_1$1" = "SPLITTER_INPUT_SUBQUERY$1"."NS_PUBLISHER_CODE_0_0$1" ) AND
    ( ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EXP_DATE_0_0$1" IS NULL AND TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_0_0$1" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" , 'J.HH24.MI.SS' ) ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EXP_DATE_0_0$1" IS NOT NULL AND TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_0_0$1" , 'J.HH24.MI.SS' ) <= TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" , 'J.HH24.MI.SS' ) AND TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EXP_DATE_0_0$1" , 'J.HH24.MI.SS' ) >= TO_CHAR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_EFF_DATE_1$1" , 'J.HH24.MI.SS' ) ) ) AND
    ( ( "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_1$1" IS NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1" IS NOT NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_1$1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1" IS NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_1$1" != "SPLITTER_INPUT_SUBQUERY$1"."NS_TITLE_CLASSIFICATION_CO_2$1" ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_1$1" IS NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1" IS NOT NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_1$1" IS NOT NULL AND "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1" IS NULL ) OR ( "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_1$1" != "SPLITTER_INPUT_SUBQUERY$1"."NS_SUPPLY_FLAG_0_0$1" ) )) "AGG_INPUT"
    GROUP BY
    "AGG_INPUT"."NS_ID$5", "AGG_INPUT"."NS_OUTLET_SRC_ID$5", "AGG_INPUT"."NS_PUBLISHER_CODE$5"/* RETAILER_PUBLISHER_NS.DEDUP_SCD_SRC */) "DEDUP_SCD_SRC") ) "MERGE_DELTA_ROW_0"
    ) "MERGE_SUBQUERY"
    ON (
    "RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID" = "MERGE_SUBQUERY"."NS_OUTLET_SRC_ID" AND
    "RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE" = "MERGE_SUBQUERY"."NS_PUBLISHER_CODE" AND
    "RETAILER_PUBLISHER_NS"."NS_EFF_DATE" = "MERGE_SUBQUERY"."NS_EFF_DATE" AND
    "RETAILER_PUBLISHER_NS"."NS_ID" = "MERGE_SUBQUERY"."NS_ID"
    WHEN MATCHED THEN
    UPDATE
    SET
    "NS_TITLE_CLASSIFICATION_CODE" = "MERGE_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE",
    "NS_SUPPLY_FLAG" = "MERGE_SUBQUERY"."NS_SUPPLY_FLAG",
    "NS_EXP_DATE" = "MERGE_SUBQUERY"."NS_EXP_DATE"
    WHEN NOT MATCHED THEN
    INSERT
    ("RETAILER_PUBLISHER_NS"."NS_ID",
    "RETAILER_PUBLISHER_NS"."NS_OUTLET_SRC_ID",
    "RETAILER_PUBLISHER_NS"."NS_PUBLISHER_CODE",
    "RETAILER_PUBLISHER_NS"."NS_TITLE_CLASSIFICATION_CODE",
    "RETAILER_PUBLISHER_NS"."NS_SUPPLY_FLAG",
    "RETAILER_PUBLISHER_NS"."NS_EFF_DATE",
    "RETAILER_PUBLISHER_NS"."NS_EXP_DATE",
    "RETAILER_PUBLISHER_NS"."DIMENSION_KEY")
    VALUES
    ("RETAILER_PUBLISHER_NS_SEQ".NEXTVAL,
    "MERGE_SUBQUERY"."NS_OUTLET_SRC_ID",
    "MERGE_SUBQUERY"."NS_PUBLISHER_CODE",
    "MERGE_SUBQUERY"."NS_TITLE_CLASSIFICATION_CODE",
    "MERGE_SUBQUERY"."NS_SUPPLY_FLAG",
    "MERGE_SUBQUERY"."NS_EFF_DATE",
    "MERGE_SUBQUERY"."NS_EXP_DATE",
    "RETAILER_PUBLISHER_NS_SEQ".CURRVAL)
    Explain plan (operation, owner, object, cost, cardinality, bytes):
    MERGE STATEMENT, GOAL = ALL_ROWS               1412     2     286
    MERGE     DW     RETAILER_PUBLISHER_NS               
    VIEW     DW                    
    SEQUENCE     DW     RETAILER_PUBLISHER_NS_SEQ               
    HASH JOIN OUTER               1412     2     256
    VIEW     DW          940     2     170
    SORT UNIQUE               940     2     218
    UNION-ALL                         
    WINDOW SORT               470     1     133
    FILTER                         
    NESTED LOOPS OUTER               468     1     133
    VIEW     DW          4     1     65
    SORT GROUP BY               4     1     25
    TABLE ACCESS FULL     REFSTG     PUB_AGENT_MATRIX_CC     3     1     25
    VIEW     SYS          464     1     68
    VIEW     DW          464     1     68
    TABLE ACCESS FULL     DW     RETAILER_PUBLISHER_NS     464     1     43
    VIEW     DW          469     1     85
    SORT GROUP BY               469     1     90
    NESTED LOOPS               468     1     90
    VIEW     DW          4     1     37
    SORT GROUP BY               4     1     25
    TABLE ACCESS FULL     REFSTG     PUB_AGENT_MATRIX_CC     3     1     25
    VIEW     SYS          464     1     53
    VIEW     DW          464     1     68
    TABLE ACCESS FULL     DW     RETAILER_PUBLISHER_NS     464     1     43
    TABLE ACCESS FULL     DW     RETAILER_PUBLISHER_NS     467     337417     14508931
    Is this similar to the SQL generated at your end? Do you use special loading hints, or anything special with indexing? We have tried standard indexing.
    Does this look untoward - do you have any other suggestions?
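    One thing we are considering trying, noted here in case it helps: the join and filter predicates in the generated SQL compare dates as strings via TO_CHAR(..., 'J.HH24.MI.SS'), so plain indexes on the date columns cannot serve those predicates. Something like the following function-based indexes, plus an index on the natural key used in the MERGE ON clause, might be worth testing (table and column names as in the generated SQL above); no idea yet whether the optimizer will pick them up, so the explain plan needs re-checking afterwards.

    -- Match the TO_CHAR expressions in the generated predicates.
    CREATE INDEX ns_eff_date_fbi
      ON retailer_publisher_ns (TO_CHAR(ns_eff_date, 'J.HH24.MI.SS'));
    CREATE INDEX ns_exp_date_fbi
      ON retailer_publisher_ns (TO_CHAR(ns_exp_date, 'J.HH24.MI.SS'));
    -- Natural key used in the MERGE ON clause and the lookup join.
    CREATE INDEX ns_bus_key_ix
      ON retailer_publisher_ns (ns_outlet_src_id, ns_publisher_code, ns_eff_date);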
    Thanks for your interest.

  • I just replaced my hard drive and am trying to install the Snow Leopard OS on my Mac mini using the DVDs that came with the unit. Is there a step-by-step procedure that will guide me through this process?

    I just replaced my hard drive due to a failure in my Mac mini. I am trying to load the Snow Leopard OS from the DVDs that came with the unit. Is there a step-by-step procedure somewhere that would help me? This is my first problem with my first Mac. I thought getting to the hard drive and changing it would be the hard part, but I made it through that with little trouble. My lack of software experience on the Mac is really slowing me down. I believe I'm not making the right choices in the Disk Utility program. Any help would be greatly appreciated. Thank you.

    Prep the new drive:
    Drive Preparation
    1. Boot from your OS X Installer Disc. After the installer loads, select your language and click on the Continue button. When the menu bar appears, select Disk Utility from the Utilities menu.
    2. After DU loads, select your hard drive (this is the entry with the mfgr.'s ID and size) from the left side list. Note the SMART status of the drive in DU's status area. If it does not say "Verified" then the drive is failing or has failed and will need replacing. SMART info will not be reported on external drives. Otherwise, click on the Partition tab in the DU main window.
    3. Under the Volume Scheme heading, set the number of partitions from the drop down menu to one. Set the format type to Mac OS Extended (Journaled). Click on the Options button, set the partition scheme to GUID, then click on the OK button. Click on the Partition button and wait until the process has completed.
    4. Select the volume you just created (this is the sub-entry under the drive entry) from the left side list. Click on the Erase tab in the DU main window.
    5. Set the format type to Mac OS Extended (Journaled). Click on the Options button, check the button for Zero Data and click on OK to return to the Erase window.
    6. Click on the Erase button. The format process can take up to several hours depending upon the drive size.
    Upon completion, quit DU and return to the installer. Install OS X. When the installation has completed you can proceed to restore your data from your backups. If you have an existing backup from Time Machine or another hard drive, then upon completing the Setup Assistant you will have the option to restore from another Mac, a TM backup, or another drive. Use the appropriate option.

  • Using a container operation step, can I pass one table to another?

    hi
    I have 2 multiline container elements in the workflow.
    In one of my steps I am sending an email using the recipients from the table LT_RECLIST, which is a multiline container. Now, before this step, I am using a container operation step to pass LT_RECLIST_FINAL to LT_RECLIST. But I guess this is not working? Is this possible?
    The condition in the container operation step is like this:
    Result Element   LT_RECLIST
    Assignment       =     Assign (contents of table are deleted first)
    Expression       &LT_RECLIST_FINAL&
    Operator
    Any idea where I am going wrong?

    Instead of
    =     Assign (contents of table are deleted first)
    try the following option:
    <-    Add only to table (contents are extended)
    Or you can directly use the element LT_RECLIST_FINAL in your mail step instead of assigning it to LT_RECLIST and using that element. If both elements are going to store the same values, then there is no point in using a separate element.

  • Step-by-step procedure to create an entire scenario using CAF

    Dear All,
    I have a scenario to develop using the Composite Application Framework (CAF). After creating the CAF application, I need to display it on the portal. Please give a step-by-step procedure (if possible with screen shots) to create a CAF application, how to find web services, and then how to display this application on the portal.
    Thanks

    Hi Yogi,
    In your scenario, if you are looking to build either an Entity or an Application service, then create it and test that service from the "Service Browser". Once your service is working as per your requirements, you have two options to bring it into the Portal:
    1. Generate a Web Service for your Service (Entity or Application).
    2. Consume that Web Service in a WebDynpro application using Model concepts.
    3. Deploy your WebDynpro application to your portal server.
    4. Create a WebDynpro iView for your application and assign it anywhere you want.
                                                         OR
    1. Generate a WebDynpro model for your CAF application.
    2. Create a public part for the WebDynpro project of your CAF application.
    3. Create a new WebDynpro DC and use the models generated for your CAF application.
    4. Deploy your WebDynpro application to your portal server.
    5. Create a WebDynpro iView for your application and assign it anywhere you want.
    If you can give your complete scenario, that would make it easier for anyone to provide the answer most appropriate for you.
    Thanks,
    Uday.

  • What username and password is used for a Mac on the install of Adobe Reader? (Step 8 in the procedure)

    Is it a Mac or an Adobe password that is used for installing Reader on Mac OS? It doesn't say in step 8 of the procedure. We cannot get past this point. Thanks.

    See https://forums.adobe.com/thread/1619850

  • I have about 10000 images of different persons with dimensions of 640*480. I want to crop the face from these images to dimensions of 200*280. The location of the face varies in different pics. So, please let me know step by step how I can perform this using the Lightroom software.

    I have about 10000 images of different persons with dimensions of 640*480. I want to crop the face from these images to dimensions of 200*280. The location of the face varies in different pics. So, please let me know step by step how I can perform this using the Lightroom software. Also I want to know what the aspect ratio should be to do this. Thank you in advance.

    The aspect ratio should be set to 200x280, or equivalently 5x7.
    I don't think it is possible to do the cropping automatically in Lightroom; you'd have to do this image by image to crop the faces correctly.

  • Step-by-step procedure to control PM order budgets using WBS elements

    HI,
    we need a step-by-step procedure to control PM order budgets using WBS elements; step by step means:
    1. how to create a WBS element
    2. how to assign a budget to a WBS element
    3. how to control this budget through orders
    Regards
    Ganesh

    Hi,
    Please verify this link:
    http://www.sap-img.com/ps006.htm
    Regards
    Keerthi

  • LSMW step-by-step procedure using the batch input method

    hi,
    I need the LSMW step-by-step procedure using the batch input method.

    Step-by-Step Guide for using BAPI in LSMW
    Note! The screen prints in this article are from ECC 5.0. They may differ slightly in other versions.
    Introduction:
    This document details the usage of BAPI in LSMW. We use the example of migrating purchase order data into SAP.
    Pre-requisites:
    It is assumed that the reader of this article has the minimum required knowledge of Business Objects, BAPIs, Message Types and IDoc Types.
    Step-by-Step Procedure:
    Details of the BAPI used in this scenario:
    Business Object: BUS2012
    Method: CreateFromData
    Details of Message Type and Basic IDoc Type:
    Message Type: PORDCR
    Basic IDoc Type: PORDCR02
    Let’s have a look at the BAPI first, before proceeding to the LSMW:
    1. Go to Transaction BAPI
    2. Click on Search Button
    3. Enter the value “BUS2012” and select “Obj.type (Technical Object Name)”
    4. Press ENTER
    5. Following screen appears:
    6. On the left side of the screen, Expand the “PurchaseOrder”.
    7. Select “PurchaseOrder” and double-click on the same for details.
    Building LSMW using BAPI:
    1. Go to Transaction LSMW.
    2. Enter the Project, Subproject and Object information and click on CREATE.
    3. Enter the descriptions for Project, Subproject and Object.
    4. Now select Settings → IDoc Inbound Processing
    5. “IDoc Inbound Processing” screen appears. Enter the required details as shown below:
    6. Click on “Activate IDoc Inbound Processing”.
    7. Click on “Yes” when prompted for “Activate IDoc Inbound?”
    8. Hit on “Back” to return to the main screen.
    9. Click on Continue (F8). Following Screen appears:
    10. Select the Step 1 “Maintain Object Attributes” and select “Execute”.
    11. Select the radio button “Business Object Method” and enter the following details:
    Business Object: BUS2012
    Method: CreateFromData
    Hit ENTER
    12. Save and click on BACK button. Following information message is displayed.
    13. Now select step 2 “Maintain Source Structures” and click “Execute”.
    14. In this step, we need to maintain the source structure. In our example, let's consider a file with 2 structures, header and item data, as shown below:
    Click on Create and name the source structure HEADERDATA. Now select HEADERDATA and click on “Create” again to create the child structure. The following popup appears:
    Select “Lower Level” and click on Continue. Enter the item data structure name.
    Click Save and hit BACK button to go to the main screen.
    15. Select step 3 “Maintain Source Fields” and hit execute.
    16. Enter the fields as shown below:
    17. Click SAVE and return to main screen.
    18. Select step 4 “Maintain Structure Relations” and click Execute.
    Select E1PORDCR and click on Create Relationship. The following screen appears:
    Select HEADERDATA and hit ENTER.
    Similarly, do the same for the structures E1BPEKKOA, E1BPEKPOC and E1BPPEKET.
    Click Save and return to main screen.
    19. Select the step “Maintain Field Mapping and Conversion Rules” and click on execute. Maintain the Field Mapping as seen below:
    20. Select step 7 “Maintain Source Files” and provide the link for the test file created. (Create a test file with the same structure as defined earlier).
    Save and return to main screen.
    21. Select the step “Assign Files” and click on Execute.
    Assign the file provided to the source structure. Here the same file is provided for both the structures.
    Save and return to the main screen.
    22. Select the step “Read Data” and click on Execute.
    Click on Execute.
    Return to the main screen.
    23. Select the step “Display read data” and click on execute.
    Click on the structure name to get the field level values.
    24. Return to main screen and now select “Convert Data”.
    25. Return to the main screen and select “Display Converted data”.
    26. Return to main screen and select “Start IDoc generation”.
    27. Now select the step “Start IDoc Processing” on the main screen.
    28. Return to the main screen and click on “Create IDoc overview”. Here the data records and status records of the IDoc can be viewed.
    This is shown in the screen shot.
    http://www.sapbrainsonline.com/TUTORIALS/TECHNICAL/LSMW_tutorial.html
    http://www.sapbrain.com/TOOLS/LSMW/SAP_LSMW_steps_introduction.html
    http://esnips.com/doc/8e732760-5548-44cc-a0bb-5982c9424f17/lsmw_sp.ppt
    http://esnips.com/doc/f55fef40-fb82-4e89-9000-88316699c323/Data-Transfer-Using-LSMW.zip
    http://esnips.com/doc/1cd73c19-4263-42a4-9d6f-ac5487b0ebcb/LSMW-with-Idocs.ppt
    http://esnips.com/doc/ef04c89f-f3a2-473c-beee-6db5bb3dbb0e/LSMW-with-BAPI.ppt
    http://esnips.com/doc/7582d072-6663-4388-803b-4b2b94d7f85e/LSMW.pdf
    Reward points if useful.

  • Step-by-step procedure to calibrate NI 4130 using Calibration Executive 3.4

    Our company just purchased Calibration Executive and the other hardware necessary to calibrate the PXI-4130, 4110 and PXI-6251/6259.
    We currently have the following setup to calibrate our NI PXI-4130 SMU:
      • NI PXI 1042 Chassis
      • NI PXI 8108 Controller
      • NI PXI-4071 Digital Multimeter
      • NI APS-4100 Auxiliary Power Supply for PXI-4130
      • Fluke 5700a
    Can someone give a step-by-step procedure for doing the calibration using the Cal Exec software? We found this manual http://digital.ni.com/manuals.nsf/websearch/EB53C88E7EB640828625762400519455 but it isn't based on Cal Exec, rather on the LabVIEW driver. Sorry, but we really have no idea how to use it.
    Thanks in Advance!
    Cris Ibarra

    Hi Cris,
    if you have Calibration Executive installed, just start it, go to the Help menu and click Calibration Executive Help (or just press CTRL+F1). This help file will always be a good information source for you; it's good practice to check it if you have never calibrated the device you are about to calibrate.
    Once you have the software running, open the drop-down menu next to the Device Types label just below the menu bar and select PXI-4130 in your case. The Calibrate button will become active; please click it.
    A wizard will pop up. It will ask for different kinds of information about the "customer" of this calibration, then about the standards, then the environment (temperature and humidity), and on the last page about the board itself (serial number and DAQmx ID). You will know that you have filled in everything necessary on a page of the wizard when the Next button becomes active.
    After the wizard, the calibration will start with a warm-up (my recommendation is to wait and not cancel), then CalExec will show you connection diagrams for how you should connect your instruments to the DUT (the 4130); of course you will have to connect the 5700 to the PC over GPIB.
    The software will show you the progress and whether the previous steps passed or failed.
    I hope this helps!
    Best Regards
    Botond

  • LIS Step by Step Procedure

    Hi Experts,
    Can anyone tell me the step-by-step procedure to activate LIS for PP (Process Industries)? Please list the T-codes used and which info structures should be activated for PP.
    My requirement is to activate historical data also; please guide me accordingly.
    Thanks in advance,
    Ganesh

    Dear,
    Go to COR4 for your order type in the case of PP-PI; otherwise go to OPL8 (production order). Tick the Statistics checkbox on the Shop Floor Information System screen in the Implementation tab.
    This will activate the following info structures:
    S021          Production order
    S022          Operation
    S023          Material
    S024          Work center
    S025          RS header
    S026          Material usage
    S027          Product costs
    S028          Rep. point statistics
    S029          Kanban
    S225          Goods receipts: repetitive mfg
    S226          Material usage: repetitive mfg
    S227          Product Costs: Repetitive Mfg

  • Step-by-step procedure with screen shots for BAPI?

    Hi,
    could you tell me a real-time scenario for BAPI?
    How is it done in real time?
    Could anybody tell me the step-by-step procedure, with screen shots please? Could you help?
    I will be waiting for a reply.
    Regards
    Eswar

    Hi
    What is a BAPI?
    BAPI stands for Business API (Application Program Interface).
    A BAPI is a remotely enabled function module, i.e. it can be invoked from remote programs like standalone JAVA programs, web interfaces etc.
    You can make your own function module remotely enabled in the attributes of the function module, but BAPIs are standard SAP function modules provided by SAP for remote access. They are also part of the Business Object Repository (BOR).
    BAPIs are RFC-enabled function modules. The difference between RFCs and BAPIs is business objects: you create business objects and those are then registered in your BOR (Business Object Repository), which can be accessed outside the SAP system by other (non-SAP) applications such as VB or JAVA. In that case you only specify the business object and its method from the external system, so with a BAPI there is no direct system call, while an RFC is a direct system call. Some BAPIs provide basic functions and can be used for most SAP business object types. These BAPIs should be implemented the same for all business object types. Standardized BAPIs are easier to use and prevent users having to deal with a number of different BAPIs. Whenever possible, a standardized BAPI should be used in preference to an individual BAPI.
    The following standardized BAPIs are provided:
    Reading instances of SAP business objects:
    GetList( ) With the BAPI GetList you can select a range of object key values, for example, company codes and material numbers.
    The BAPI GetList() is a class method.
    GetDetail() With the BAPI GetDetail() the details of an instance of a business object type are retrieved and returned to the calling program. The instance is identified via its key. The BAPI GetDetail() is an instance method.
    BAPIs that can create, change or delete instances of a business object type:
    The following BAPIs of the same object type have to be programmed so that they can be called several times within one transaction. For example, if, after sales order 1 has been created, a second sales order 2 is created in the same transaction, the second BAPI call must not affect the consistency of sales order 2. After completing the transaction with a COMMIT WORK, both orders are saved consistently in the database.
    Create( ) and CreateFromData( )
    The BAPIs Create() and CreateFromData() create an instance of an SAP business object type, for example, a purchase order. These BAPIs are class methods.
    Change( )
    The BAPI Change() changes an existing instance of an SAP business object type, for example, a purchase order. The BAPI Change() is an instance method.
    Delete( ) and Undelete( ) The BAPI Delete() deletes an instance of an SAP business object type from the database or sets a deletion flag.
    The BAPI Undelete() removes a deletion flag. These BAPIs are instance methods.
    Cancel( ) Unlike the BAPI Delete(), the BAPI Cancel() cancels an instance of a business object type. The instance to be cancelled remains in the database and an additional instance is created, and this is the one that is actually cancelled. The Cancel() BAPI is an instance method.
    Add<subobject>( ) and Remove<subobject>( ) The BAPI Add<subobject> adds a subobject to an existing object instance and the BAPI Remove<subobject> removes a subobject from an object instance. These BAPIs are instance methods.
    BAPI-step by step
    http://www.sapgenie.com/abap/bapi/example.htm
    just refer to the link below
    http://www.sapmaterial.com/?gclid=CN322K28t4sCFQ-WbgodSGbK2g
    list of all bapis
    http://www.planetsap.com/LIST_ALL_BAPIs.htm
    for BAPI's
    http://www.sappoint.com/abap/bapiintro.pdf
    http://www.sappoint.com/abap/bapiprg.pdf
    http://www.sappoint.com/abap/bapiactx.pdf
    http://www.sappoint.com/abap/bapilst.pdf
    http://www.sappoint.com/abap/bapiexer.pdf
    http://service.sap.com/ale
    http://service.sap.com/bapi
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCMIDAPII/CABFAAPIINTRO.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/CABFABAPIREF/CABFABAPIPG.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCFESDE8/BCFESDE8.pdf
    http://www.planetsap.com/Bapi_main_page.htm
    http://www.topxml.com/sap/sap_idoc_xml.asp
    http://www.sapdevelopment.co.uk/
    http://www.sapdevelopment.co.uk/java/jco/bapi_jco.pdf
    Also refer to the following links..
    www.sap-img.com/bapi.htm
    www.sap-img.com/abap/bapi-conventions.htm
    www.planetsap.com/Bapi_main_page.htm
    www.sapgenie.com/abap/bapi/index.htm
    Checkout !!
    http://searchsap.techtarget.com/originalContent/0,289142,sid21_gci948835,00.html
    http://techrepublic.com.com/5100-6329-1051160.html#
    Example Code
    You need to give the step_nr, item_nr, cond_count and cond_type so the correct condition will be updated. If no condition exists for the given parameters, a new condition will be created.
    You can find these parameters for a particular condition type in table KONV.
    *&---------------------------------------------------------------------*
    *&      Form  saveTransactionJOCR
    *&---------------------------------------------------------------------*
    *       text
    *  -->  p1        text
    *  <--  p2        text
    *----------------------------------------------------------------------*
    FORM saveTransactionJOCR .
    data: salesdocument like BAPIVBELN-VBELN,
    order_header_inx like bapisdh1x,
    order_header_in like bapisdh1,
    return type standard table of bapiret2 with header line,
    conditions_in type standard table of bapicond with header line,
    conditions_inx type standard table of bapicondx with header line,
    logic_switch like BAPISDLS,
    step_nr like conditions_in-cond_st_no,
    item_nr like conditions_in-itm_number,
    cond_count like conditions_in-cond_count,
    cond_type like conditions_in-cond_type.
    salesdocument = wa_order_information-VBELN.
    LOGIC_SWITCH-COND_HANDL = 'X'.
    order_header_inx-updateflag = 'U'.
    * conditions
    clear conditions_in[].
    clear conditions_inx[].
    clear: step_nr,
    item_nr,
    cond_count,
    cond_type.
    step_nr = '710'.
    item_nr = '000000'.
    cond_count = '01'.
    cond_type = 'ZCP2'.
    CONDITIONS_IN-ITM_NUMBER = item_nr.
    conditions_in-cond_st_no = step_nr.
    CONDITIONS_IN-COND_COUNT = cond_count.
    CONDITIONS_IN-COND_TYPE = cond_type.
    CONDITIONS_IN-COND_VALUE = 666.
    CONDITIONS_IN-CURRENCY = 'EUR'.
    append conditions_in.
    CONDITIONS_INX-ITM_NUMBER = item_nr.
    conditions_inx-cond_st_no = step_nr.
    CONDITIONS_INX-COND_COUNT = cond_count.
    CONDITIONS_INX-COND_TYPE = cond_type.
    CONDITIONS_INX-UPDATEFLAG = 'U'.
    CONDITIONS_INX-COND_VALUE = 'X'.
    CONDITIONS_INX-CURRENCY = 'X'.
    append conditions_inx.
    CALL FUNCTION 'BAPI_SALESORDER_CHANGE'
    EXPORTING
    SALESDOCUMENT = salesdocument
    ORDER_HEADER_IN = order_header_in
    ORDER_HEADER_INX = order_header_inx
    LOGIC_SWITCH = logic_switch
    TABLES
    RETURN = return
    CONDITIONS_IN = conditions_in
    CONDITIONS_INX = conditions_inx.
    if return-type ne 'E'.
    commit work and wait.
    endif.
    ENDFORM. " saveTransactionJOCR
    BDC to BAPI
    The steps to be followed are:
    1. Find out the relevant BAPI (BAPI_SALESORDER_CHANGE for VA02).
    [for VA01 use BAPI_SALESORDER_CREATEFROMDAT2]
    2. Create a Z program and call the BAPI (same as a function module call).
    3. Now, if you see this BAPI, it has
    -> Importing structures.
    eg: SALESDOCUMENT: this will take the Sales order header data as input.
    -> Tables parameters:
    eg: ORDER_ITEM_IN: this will take the line item data as input.
    Note :
    Only specify fields that should be changed
    Select these fields by entering an X in the checkboxes
    Enter a U in the UPDATEFLAG field
    Always specify key fields when changing the data, including in the checkboxes
    The configuration is an exception here. If this needs to be changed, you need to complete it again fully.
    Maintain quantities and dates in the schedule line data
    Possible UPDATEFLAGS:
    U = change
    D = delete
    I = add
    Example
    1. Delete the whole order
    2. Delete order items
    3. Change the order
    4. Change the configuration
    Notes
    1. Minimum entry:
    You must enter the order number in the SALESDOCUMENT structure.
    You must always enter key fields for changes.
    You must always specify the update indicator in the ORDER_HEADER_INX.
    2. Commit control:
    The BAPI does not run a database commit, which means that the application must trigger the commit so that the changes are written to the database. To do this, use the BAPI_TRANSACTION_COMMIT BAPI.
    For further details... refer to the Function Module documentation for the BAPI.
    BAPI to VB (Visual Basic)
    Long back I had used the following flow structure to achieve the same:
    Report -> SM59 RFC destination -> COM4ABAP -> VB.exe
    My report uses the RFC destination to create a COM session with COM4ABAP. COM4ABAP calls the VB.exe and manages the flow of data between SAP and the VB exe.
    You need to have com4abap.exe.
    If COM4ABAP is installed you will find it in the SAP GUI installation directory, C:\Program Files\SAPpc\sapgui\RFCSDK\com4abap;
    else refer to OSS note 419822 for installation of COM4ABAP.
    After making the settings in COM4ABAP to point to the VB program and setting up the RFC destination in SM59 to point to the COM4ABAP session, you can use the following function modules to call the VB code.
    For setting up COM4ABAP and the RFC destination, please refer to the documentation for COM4ABAP.
    * Invoke NEW DCOM session
    call function 'BEGIN_COM_SESSION'
    exporting
    service_dest = service_dest "(this will be a RFC destination created in SM59)
    importing
    worker_dest = worker_dest
    exceptions
    connect_to_dcom_service_failed = 1
    connect_to_dcom_worker_failed = 2
    others = 3.
    call function 'CREATE_COM_INSTANCE' destination worker_dest
    exporting
    clsid = g_c_clsid
    typelib = g_c_typelib
    importing
    instid = g_f_oid
    exceptions
    communication_failure = 1 message g_f_msg
    system_failure = 2 message g_f_msg
    invalid_instance_id = 3
    others = 4.
    call function 'COM_INVOKE' destination worker_dest
    exporting
    %instid = g_f_oid
    %method = 'UpdatePDF'
    sntemp = g_v_const_filent
    snsysid = sy-sysid
    snflag = 'N'
    tables
    rssaptable = g_t_pdfdetail1
    %return = g_t_pdfdetail1 "t_test
    exceptions
    communication_failure = 1 message g_f_msg
    system_failure = 2 message g_f_msg
    invalid_instance_id = 3
    others = 4.
    Then close the COM session, using
    FM DELETE_COM_INSTANCE
    FM END_COM_SESSION
    See the sample code:
    REPORT zpo_bapi_purchord_tej.
    * DATA DECLARATIONS *
    TYPE-POOLS slis.
    TYPES: BEGIN OF ty_table,
    v_legacy(8),
    vendor TYPE bapimepoheader-vendor,
    purch_org TYPE bapimepoheader-purch_org,
    pur_group TYPE bapimepoheader-pur_group,
    material TYPE bapimepoitem-material,
    quantity(13),
    delivery_date TYPE bapimeposchedule-delivery_date,
    net_price(23),
    plant TYPE bapimepoitem-plant,
    END OF ty_table.
    TYPES: BEGIN OF ty_alv,
    v_legs(8),
    success(10),
    v_legf(8),
    END OF ty_alv.
    TYPES: BEGIN OF ty_alv1,
    v_legf1(8),
    v_msg(500),
    END OF ty_alv1.
    *-----Work area declarations.
    DATA: x_table TYPE ty_table,
    x_header TYPE bapimepoheader,
    x_headerx TYPE bapimepoheaderx,
    x_item TYPE bapimepoitem,
    x_itemx TYPE bapimepoitemx,
    x_sched TYPE bapimeposchedule,
    x_schedx TYPE bapimeposchedulx,
    x_commatable(255),
    x_alv TYPE ty_alv,
    x_alv1 TYPE ty_alv1,
    x_alv2 TYPE ty_alv1.
    *-----Internal table declarations.
    DATA: it_table TYPE TABLE OF ty_table,
    it_commatable LIKE TABLE OF x_commatable,
    it_item TYPE TABLE OF bapimepoitem,
    it_itemx TYPE TABLE OF bapimepoitemx,
    it_sched TYPE TABLE OF bapimeposchedule,
    it_schedx TYPE TABLE OF bapimeposchedulx,
    it_alv TYPE TABLE OF ty_alv,
    it_alv1 TYPE TABLE OF ty_alv1,
    it_alv2 TYPE TABLE OF ty_alv1.
    DATA: po_number TYPE bapimepoheader-po_number,
    x_return TYPE bapiret2,
    it_return TYPE TABLE OF bapiret2,
    v_file TYPE string,
    v_temp(8),
    v_succsount TYPE i VALUE 0,
    v_failcount TYPE i VALUE 0,
    v_total TYPE i.
    DATA: v_temp1(5) TYPE n VALUE 0.
    DATA: x_event TYPE slis_t_event,
    x_fieldcat TYPE slis_t_fieldcat_alv,
    x_list_header TYPE slis_t_listheader,
    x_event1 LIKE LINE OF x_event,
    x_layout1 TYPE slis_layout_alv,
    x_variant1 TYPE disvariant,
    x_repid2 LIKE sy-repid.
    DATA: it_fieldcat TYPE slis_t_fieldcat_alv.
    * SELECTION-SCREEN *
    SELECTION-SCREEN BEGIN OF BLOCK v_b1 WITH FRAME.
    *-----To fetch the flat file.
    PARAMETERS: p_file TYPE rlgrap-filename.
    SELECTION-SCREEN END OF BLOCK v_b1.
    * AT SELECTION-SCREEN *
    AT SELECTION-SCREEN.
    IF p_file IS INITIAL.
    MESSAGE text-001 TYPE 'E'.
    ENDIF.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
    *-----To use F4 help to find file path.
    CALL FUNCTION 'F4_FILENAME'
    EXPORTING
    program_name = syst-cprog
    dynpro_number = syst-dynnr
    IMPORTING
    file_name = p_file.
    v_file = p_file.
    * START-OF-SELECTION *
    START-OF-SELECTION.
    PERFORM gui_upload.
    LOOP AT it_table INTO x_table.
    PERFORM header_details.
    v_temp = x_table-v_legacy.
    LOOP AT it_table INTO x_table WHERE v_legacy = v_temp.
    PERFORM lineitem.
    PERFORM schedule.
    ENDLOOP.
    DELETE it_table WHERE v_legacy = v_temp.
    PERFORM bapicall.
    MOVE po_number TO x_alv-success.
    APPEND x_alv TO it_alv.
    CLEAR x_alv.
    *-----To clear the item details in internal table after the operation for a header.
    REFRESH: it_item,
    it_itemx,
    it_sched,
    it_schedx.
    CLEAR: v_temp1.
    ENDLOOP.
    v_total = v_succsount + v_failcount.
    PERFORM display_alv.
    * FORM GUI_UPLOAD *
    FORM gui_upload .
    CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
    filename = v_file
    filetype = 'ASC'
    TABLES
    data_tab = it_commatable
    EXCEPTIONS
    file_open_error = 1
    file_read_error = 2
    no_batch = 3
    gui_refuse_filetransfer = 4
    invalid_type = 5
    no_authority = 6
    unknown_error = 7
    bad_data_format = 8
    header_not_allowed = 9
    separator_not_allowed = 10
    header_too_long = 11
    unknown_dp_error = 12
    access_denied = 13
    dp_out_of_memory = 14
    disk_full = 15
    dp_timeout = 16
    OTHERS = 17.
    IF sy-subrc = 0.
    *-----To fetch the comma seperated flat file into an internal table.
    LOOP AT it_commatable INTO x_commatable.
    IF x_commatable IS NOT INITIAL.
    SPLIT x_commatable AT ',' INTO
    x_table-v_legacy
    x_table-vendor
    x_table-purch_org
    x_table-pur_group
    x_table-material
    x_table-quantity
    x_table-delivery_date
    x_table-net_price
    x_table-plant.
    APPEND x_table TO it_table.
    ENDIF.
    CLEAR x_table.
    ENDLOOP.
    ENDIF.
    ENDFORM. " gui_upload
    * FORM HEADER_DETAILS *
    FORM header_details .
    MOVE 'NB' TO x_header-doc_type.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
    input = x_table-vendor
    IMPORTING
output = x_table-vendor.
    MOVE x_table-vendor TO x_header-vendor.
    MOVE x_table-purch_org TO x_header-purch_org.
    MOVE x_table-pur_group TO x_header-pur_group.
    x_headerx-doc_type = 'X'.
    x_headerx-vendor = 'X'.
    x_headerx-purch_org = 'X'.
    x_headerx-pur_group = 'X'.
    ENDFORM. " header_details
*----- FORM lineitem -----*
    FORM lineitem .
    v_temp1 = v_temp1 + 10.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
    input = v_temp1
    IMPORTING
    output = v_temp1.
    MOVE v_temp1 TO x_item-po_item.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
    EXPORTING
    input = x_table-material
    IMPORTING
    output = x_table-material.
    MOVE x_table-material TO x_item-material.
    MOVE x_table-quantity TO x_item-quantity.
    MOVE x_table-net_price TO x_item-net_price.
    MOVE x_table-plant TO x_item-plant.
    x_itemx-po_item = v_temp1.
    x_itemx-material = 'X'.
    x_itemx-quantity = 'X'.
    x_itemx-net_price = 'X'.
    x_itemx-plant = 'X'.
    APPEND x_item TO it_item.
    APPEND x_itemx TO it_itemx.
    CLEAR: x_item, x_itemx.
    ENDFORM. " lineitem1
*----- FORM schedule -----*
    FORM schedule .
    MOVE x_table-delivery_date TO x_sched-delivery_date.
    MOVE v_temp1 TO x_sched-po_item.
    x_schedx-delivery_date = 'X'.
    x_schedx-po_item = v_temp1.
    APPEND x_sched TO it_sched.
    APPEND x_schedx TO it_schedx.
    CLEAR: x_sched, x_schedx.
    ENDFORM. " schedule
*----- FORM bapicall -----*
    FORM bapicall .
    CALL FUNCTION 'BAPI_PO_CREATE1'
    EXPORTING
    poheader = x_header
    poheaderx = x_headerx
    IMPORTING
    exppurchaseorder = po_number
    TABLES
    return = it_return
    poitem = it_item
    poitemx = it_itemx
    poschedule = it_sched
    poschedulex = it_schedx.
    IF po_number IS NOT INITIAL.
    v_succsount = v_succsount + 1.
    MOVE x_table-v_legacy TO x_alv-v_legs.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
    ELSE.
    v_failcount = v_failcount + 1.
    MOVE x_table-v_legacy TO x_alv-v_legf.
    MOVE x_table-v_legacy TO x_alv1-v_legf1.
    LOOP AT it_return INTO x_return.
    IF x_alv1-v_msg IS INITIAL.
    MOVE x_return-message TO x_alv1-v_msg.
    ELSE.
    CONCATENATE x_alv1-v_msg x_return-message INTO x_alv1-v_msg SEPARATED BY space.
    ENDIF.
    ENDLOOP.
    APPEND x_alv1 TO it_alv1.
    CLEAR x_alv1.
    ENDIF.
    ENDFORM. " bapicall
*----- FORM display_alv -----*
    FORM display_alv .
    PERFORM x_list_header.
    PERFORM build_fieldcat CHANGING x_fieldcat.
    x_repid2 = sy-repid.
    x_event1-name = 'TOP_OF_PAGE'.
    x_event1-form = 'TOP_OF_PAGE'.
    APPEND x_event1 TO x_event.
    CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
    EXPORTING
    i_callback_program = x_repid2
    is_layout = x_layout1
    it_fieldcat = x_fieldcat
    i_callback_user_command = 'USER_COMMAND'
    i_callback_top_of_page = 'TOP_OF_PAGE'
    i_save = 'A'
    is_variant = x_variant1
    it_events = x_event
    TABLES
    t_outtab = it_alv
    EXCEPTIONS
    program_error = 1
    OTHERS = 2.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ENDFORM. " display_master_data
*----- FORM user_command -----*
    FORM user_command USING ucomm LIKE sy-ucomm selfield
    TYPE slis_selfield.
    READ TABLE it_alv INTO x_alv INDEX selfield-tabindex.
    CLEAR : x_alv2,it_alv2[].
    LOOP AT it_alv1 INTO x_alv1 WHERE v_legf1 = x_alv-v_legf.
    x_alv2 = x_alv1.
    APPEND x_alv2 TO it_alv2 .
    ENDLOOP.
    DATA : it_fieldcat TYPE slis_t_fieldcat_alv.
    DATA : x3_fieldcat LIKE LINE OF it_fieldcat.
CLEAR : x3_fieldcat, it_fieldcat[].
x3_fieldcat-col_pos = '1'.
x3_fieldcat-fieldname = 'V_LEGF1'.
x3_fieldcat-reptext_ddic = text-111.
x3_fieldcat-ref_tabname = 'IT_ALV2'.
APPEND x3_fieldcat TO it_fieldcat.
CLEAR x3_fieldcat.
x3_fieldcat-col_pos = '2'. "second column of the detail list
    x3_fieldcat-fieldname = 'V_MSG'.
    x3_fieldcat-reptext_ddic = text-112.
    x3_fieldcat-ref_tabname = 'IT_ALV2'.
    APPEND x3_fieldcat TO it_fieldcat.
    CLEAR x3_fieldcat.
    x_layout1-colwidth_optimize = 'X'.
    x_layout1-zebra = 'X'.
    IF it_alv2[] IS NOT INITIAL.
    CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
    EXPORTING
    i_callback_program = x_repid2
    is_layout = x_layout1
    it_fieldcat = it_fieldcat
    i_save = 'A'
    i_callback_top_of_page = 'TOP'
    is_variant = x_variant1
    it_events = x_event
    TABLES
    t_outtab = it_alv2
    EXCEPTIONS
    program_error = 1
    OTHERS = 2.
    ENDIF.
    ENDFORM.
*----- FORM top -----*
    FORM top.
    CALL FUNCTION 'REUSE_ALV_COMMENTARY_WRITE'
    EXPORTING
it_list_commentary = x_list_header. "the parameter expects a slis_t_listheader table, not a literal
    ENDFORM.
*----- FORM build_fieldcat -----*
    FORM build_fieldcat CHANGING et_fieldcat TYPE slis_t_fieldcat_alv.
    DATA: x1_fieldcat TYPE slis_fieldcat_alv.
    CLEAR x1_fieldcat.
    x1_fieldcat-col_pos = '1'.
    x1_fieldcat-fieldname = 'V_LEGS'.
    x1_fieldcat-reptext_ddic = text-108.
    x1_fieldcat-ref_tabname = 'IT_ALV'.
    APPEND x1_fieldcat TO et_fieldcat.
    CLEAR x1_fieldcat.
    x1_fieldcat-col_pos = '2'.
    x1_fieldcat-fieldname = 'SUCCESS'.
    x1_fieldcat-key = 'X'.
    x1_fieldcat-reptext_ddic = text-109.
    x1_fieldcat-ref_tabname = 'IT_ALV'.
    APPEND x1_fieldcat TO et_fieldcat.
    CLEAR x1_fieldcat.
    x1_fieldcat-col_pos = '3'.
    x1_fieldcat-fieldname = 'V_LEGF'.
    x1_fieldcat-key = 'X'.
    x1_fieldcat-reptext_ddic = text-110.
    x1_fieldcat-ref_tabname = 'IT_ALV'.
    APPEND x1_fieldcat TO et_fieldcat.
    CLEAR x1_fieldcat.
    ENDFORM. " build_fieldcat
*----- FORM build_list_header -----*
    FORM x_list_header.
    DATA: x_list_header1 TYPE slis_listheader.
    *-----List Header: type H
    CLEAR x_list_header1 .
    x_list_header1-typ = 'H'.
    x_list_header1-info = text-105.
    APPEND x_list_header1 TO x_list_header.
    *-----List Key: type S
    x_list_header1-typ = 'S'.
    x_list_header1-key = text-106.
    x_list_header1-info = v_total.
    APPEND x_list_header1 TO x_list_header.
    *-----List Key: Type S
    CLEAR x_list_header1 .
    x_list_header1-typ = 'S'.
    x_list_header1-key = text-107.
    x_list_header1-info = v_succsount.
    APPEND x_list_header1 TO x_list_header.
    ENDFORM. " build_list_header
*----- FORM top_of_page -----*
    FORM top_of_page.
    CALL FUNCTION 'REUSE_ALV_COMMENTARY_WRITE'
    EXPORTING
    it_list_commentary = x_list_header.
    ENDFORM. " TOP_OF_PAGE
    <b>Reward points for useful Answers</b>
    Regards
    Anji

• Step-by-step procedure for INBOUND IDOC (VENDOR CREATE / CHANGE)

Hi,
Can anybody provide me the step-by-step procedure for inbound IDocs?
As I'm new to this, I need clarification on the difference between inbound and outbound IDocs.
How can we differentiate the two?
Where do we define outbound, and where do we define inbound?
(If possible, please explain the procedure for vendor creation through inbound IDocs.)
Thanks in advance.

    Hi,
ALE technology is SAP's technology to support distributed yet integrated processes across several SAP systems.
    Outbound Process:
    ALE Outbound Process in SAP sends data to one or more SAP Systems. It involves four steps.
1. Identify the need for an IDoc: This step starts upon creating an application document, or it can relate to a change to a master data object.
    2. Generate the Master IDoc: The document or master data to be sent is read from the database and formatted into an IDoc format. This IDoc is called as a Master IDoc.
    3. Generate the Communication IDoc: The ALE Service layer generates a separate IDoc from the Master IDoc for each recipient who is interested in the data. Separate IDocs are generated because each recipient might demand a different version or a subset of the Master IDoc. These recipient-specific IDocs are called Communication IDocs and are stored in the database.
4. Deliver the Communication IDoc: The IDoc is delivered to the recipients using an asynchronous communication method. This allows the sending system to continue its processing without having to wait for the destination system to receive or process the IDoc.
    Inbound Process:
    The inbound process receives an IDoc and creates a document in the system.
    1. Store the IDoc in the database: The IDoc is received from the sending system and stored in the database. Then the IDoc goes through a basic integrity check and syntax check.
    2. Invoke the Posting Module: The control information in the IDoc and configuration tables are read to determine the posting program. The IDoc is then transferred to its posting program.
    3. Create the Document: The posting program reads the IDoc data and then creates a document in the system. The results are logged in the IDoc.
    Over view of IDocs:
    IDoc is a container that is used to exchange data between any two processes. The document represented in an IDoc is independent of the complex structure SAP uses to store application data. This type of flexibility enables SAP to rearrange its internal structure without affecting the existing interface.
The IDoc interface represents an IDoc type or IDoc data. An IDoc type represents the IDoc's definition, and IDoc data is an instance of the IDoc type.
    IDoc Types:
    IDoc type structure can consist of several segments, and each segment can consist of several data fields. The IDoc structure defines the syntax of the data by specifying a list of permitted segments and arrangement of the segments. Segments define a set of fields and their format.
    An IDoc is an instance of an IDoc Type and consists of three types of records.
    i. One Control record: each IDoc has only one control record. The control record contains all the control information about an IDoc, including the IDoc number, the sender and recipient information, and information such as the message type it represents and IDoc type. The control record structure is same for all IDocs.
    ii. One or Many Data records: An IDoc can have multiple data records, as defined by the IDoc structure. Segments translate into data records, which store application data, such as purchase order header information and purchase order detail lines.
    iii. One or Many Status records: An IDoc can have multiple status records. Status record helps to determine whether an IDoc has any error.
    Message in IDoc Type:
    A Message represents a specific type of document transmitted between two partners.
    Outbound Process in IDocs:
The outbound process uses the following components to generate an IDoc: a customer model, an IDoc structure, selection programs, filter objects, conversion rules, a port definition, an RFC destination, a partner profile, service programs, and configuration tables.
    The Customer Model:
    A customer model is used to model a distribution scenario. In a customer model, you identify the systems involved in a distribution scenario and the message exchanged between the systems.
    Message control:
    Message control is a cross application technology used in pricing, account determination, material determination, and output determination. The output determination technique of Message control triggers the ALE for a business document. Message control separates the logic of generating IDocs from the application logic.
    Change Pointers:
    The change pointers technique is based on the change document technique, which tracks changes made to key documents in SAP, such as the material master, customer master and sales order.
    Changes made to a document are recorded in the change document header table CDHDR, and additional change pointers are written in the BDCP table for the changes relevant to ALE.
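As a small illustration of what RBDMIDOC-style processing starts from, a sketch along these lines reads the unprocessed change pointers for a message type. The parameter names reflect the standard CHANGE_POINTERS_READ interface as best I recall it, so verify them in SE37 before relying on this:
DATA: lt_bdcp TYPE TABLE OF bdcp.
* Read the change pointers for MATMAS that have not been processed yet.
CALL FUNCTION 'CHANGE_POINTERS_READ'
  EXPORTING
    message_type                = 'MATMAS'
    read_not_processed_pointers = 'X'
  TABLES
    change_pointers             = lt_bdcp.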
    IDoc Structure:
    A message is defined for data that is exchanged between two systems. The message type is based on one or more IDoc structures.
    Selection Program:
Selection programs, typically implemented as function modules, are designed to extract application data and create a master IDoc. A selection program exists for each message type. A selection program's design depends on the triggering mechanism used in the process.
Filter Objects:
Filter objects remove unwanted data for each recipient, based on that recipient's requirements.
    Port Definition:
A port is used in an outbound process to define the medium in which documents are transferred to the destination system. ALE uses a transactional RFC port, which transfers data in memory buffers.
    RFC Destination:
    The RFC destination is a logical name used to define the characteristics of a communication link to a remote system on which a function needs to be executed.
    Partner Profile:
A partner profile specifies the components used in an outbound process (logical name of the remote SAP system, IDoc type, message type, tRFC port), an IDoc's packet size, the mode in which the process sends an IDoc (batch versus immediate), and the person to be notified in case of error.
    Service Programs and Configuration Tables:
    The outbound process, being asynchronous, is essentially a sequence of several processes that work together. SAP provides service programs and configuration tables to link these programs and provide customizing options for an outbound process.
    Process flow for Distributing Transactional Data:
    Transactional data is distributed using two techniques: with Message control and without message control.
    Process flow for Distributing Master Data:
    Master data between SAP systems is distributed using two techniques: Stand alone Programs and Change Pointers.
    Triggering the Outbound Process via Stand-Alone Programs:
    Stand-Alone programs are started explicitly by a user to transmit data from one SAP system to another. Standard Programs for several master data objects exist in SAP. Ex. The material master data can be transferred using the RBDSEMAT program or transaction BD10.
    The stand-alone programs provide a selection screen to specify the objects to be transferred and the receiving system. After the stand-alone program is executed, it calls the IDoc selection program with the specified parameters.
    Triggering the Outbound Process via Change Pointers:
    The change pointer technique is used to initiate the outbound process automatically when master data is created or changed.
    A standard program, RBDMIDOC, is scheduled to run on a periodic basis to evaluate the change pointers for a message type and start the ALE process for distributing the master data to the appropriate destination. The RBDMIDOC program reads the table TBDME to determine the IDoc selection program for a message type.
    Processing in the Application Layer:
    The customer distribution model is consulted to make sure that a receiver has been defined for the message to be transmitted. If not, processing ends. If at least one receiver exists, the IDoc selection program reads the master data object from the database and creates a master IDoc from it. The master IDoc is stored in memory. The program then calls the ALE service layer by using the function module MASTER_IDOC_DISTRIBUTE, passing the master IDoc and the receiver information.
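To make that hand-over concrete, a minimal sketch of the end of a selection program might look as follows. The message type ZDEMO, basic IDoc type ZDEMO01, and segment Z1DEMO are invented for the example; EDIDC, EDIDD, and MASTER_IDOC_DISTRIBUTE are the standard structures and function module named above:
DATA: ls_control TYPE edidc,
      lt_control TYPE TABLE OF edidc,
      ls_data    TYPE edidd,
      lt_data    TYPE TABLE OF edidd.
* Fill the control record with message and basic IDoc type.
ls_control-mestyp = 'ZDEMO'.      "hypothetical message type
ls_control-idoctp = 'ZDEMO01'.    "hypothetical basic IDoc type
* Fill one data record: segment name plus the flat segment content.
ls_data-segnam = 'Z1DEMO'.        "hypothetical segment
ls_data-sdata  = 'flat segment content goes here'.
APPEND ls_data TO lt_data.
* Hand the master IDoc over to the ALE service layer, which derives the
* receivers from the distribution model and creates the communication IDocs.
CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
  EXPORTING
    master_idoc_control        = ls_control
  TABLES
    communication_idoc_control = lt_control
    master_idoc_data           = lt_data.
COMMIT WORK.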
    Processing in the ALE Interface Layer:
    Processing in the ALE Layer consists of the following steps:
• Receiver determination: The receiver is determined through the customer distribution model.
• IDoc filtering: If an IDoc filter is specified in the distribution model for a receiver, values in the filter are compared against the values in the IDoc data records. If a data record does not meet the filter criteria, it is dropped.
• Segment filtering: For each sender and receiver combination, a set of segments that are not required can be filtered out.
• Field conversion: Field values in data records are converted by using the conversion rules specified for the segment.
• Version change for segments: Segments are version-controlled. A new version of a segment always contains the fields from the preceding version plus the fields added for the new version. The Release in the IDoc Type field of the partner profile determines the version of the segment to be generated.
• Version change for IDocs: IDocs are also version-controlled. The version is determined from the Basic Type field of the partner profile.
• Communication IDocs generated: The final IDoc generated for a receiver after all the conversion and filtering operations is the communication IDoc. One master IDoc can have multiple communication IDocs, depending on the number of receivers identified and the filter operations performed. The IDoc gets a status record with a status code of 01 (IDoc Created).
• Syntax check performed: The IDoc goes through a syntax check and data integrity validation. If errors are found, the IDoc gets status 26 (Error during syntax check of IDoc - Outbound). If no errors are found, the IDoc gets status 30 (IDoc ready for dispatch - ALE Service).
• IDoc dispatched to the communication layer: In the ALE process, IDocs are dispatched using the asynchronous RFC method, which means that the sending system does not wait for the data to be received or processed on the destination system. After IDocs have been transferred to the communication layer, they get status 03 (Data Passed to Port OK).
    Processing in the Communication Layer:
To dispatch an IDoc to a destination system, the system reads the port definition specified in the partner profile to determine the destination system, which is then used to read the RFC destination. The RFC destination contains communication settings to log on to the remote SAP system. The sending system calls the INBOUND_IDOC_PROCESS function module asynchronously on the destination system and passes the IDoc data via the memory buffers.
    Inbound Process in IDocs:
An inbound process uses an IDoc structure, posting programs, filter objects, conversion rules, a partner profile, service programs, and configuration tables to post an application document from an IDoc.
    Posting Program:
Posting programs, which are implemented as function modules, read data from an IDoc and create an application document from it. A posting program exists for each message. Each posting program is assigned a process code. A process code can point to a function module or a workflow. In the standard system, process codes always point to a function module.
    Ex. The posting program for message type MATMAS is IDOC_INPUT_MATMAS which has a process code MATM.
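As a sketch only: a custom posting function module follows the same fixed ALE inbound interface as IDOC_INPUT_MATMAS. The skeleton below uses the invented name Z_IDOC_INPUT_DEMO and omits the actual posting logic (the standard interface also includes RETURN_VARIABLES and SERIALIZATION_INFO, left out here for brevity):
FUNCTION z_idoc_input_demo.
*"  IMPORTING
*"     VALUE(INPUT_METHOD) LIKE BDWFAP_PAR-INPUTMETHD
*"     VALUE(MASS_PROCESSING) LIKE BDWFAP_PAR-MASS_PROC
*"  EXPORTING
*"     VALUE(WORKFLOW_RESULT) LIKE BDWF_PARAM-RESULT
*"     VALUE(APPLICATION_VARIABLE) LIKE BDWF_PARAM-APPL_VAR
*"     VALUE(IN_UPDATE_TASK) LIKE BDWFAP_PAR-UPDATETASK
*"     VALUE(CALL_TRANSACTION_DONE) LIKE BDWFAP_PAR-CALLTRANS
*"  TABLES
*"      IDOC_CONTRL STRUCTURE EDIDC
*"      IDOC_DATA STRUCTURE EDIDD
*"      IDOC_STATUS STRUCTURE BDIDOCSTAT
  DATA: ls_status TYPE bdidocstat.
  LOOP AT idoc_contrl.
*   Process the data records belonging to this control record and
*   post the application document from the segment contents.
    LOOP AT idoc_data WHERE docnum = idoc_contrl-docnum.
*     ... evaluate idoc_data-segnam / idoc_data-sdata and post ...
    ENDLOOP.
*   Report success (53) or failure (51) back per IDoc.
    ls_status-docnum = idoc_contrl-docnum.
    ls_status-status = '53'.
    APPEND ls_status TO idoc_status.
  ENDLOOP.
ENDFUNCTION.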
    Workflow:
    A workflow represents a sequence of customized steps to be carried out for a process. The workflow management system is used to model the sequence, identify information required to carry out the steps and identify the person responsible for the dialog steps.
Partner Profile:
    A partner profile specifies the components used in an inbound process (partner number, message type, and process code), the mode in which IDocs are processed (batch versus immediate), and the person to be notified in case of errors.
    Process flow for the Inbound process via a Function Module:
    In this process, IDocs are received from another system and passed to the posting function module directly.
    1. Processing in the communication Layer:
The IDOC_INBOUND_ASYNCHRONOUS program, triggered as a result of an RFC from the sending system, acts as the entry point for all inbound ALE processes. The IDoc to be processed is passed as an input parameter. Control is transferred to the ALE/EDI layer.
    2. Processing in the ALE/EDI Interface Layer:
• Basic integrity check: A basic integrity check is performed on the control record.
• Segment filtering and conversion: Unwanted segments are filtered out, and any required conversion of field values is carried out.
• Creation of the application IDoc: The application IDoc is created and stored in the database, and a syntax check is performed. If there are errors, it gets status 60 (Error during syntax check of IDoc - Inbound). At this point a tangible IDoc, which can be monitored via one of the monitoring transactions, exists, and the IDoc gets status 50 (IDoc Added).
• IDoc marked ready for dispatch: The IDoc gets status 64 (IDoc ready to be passed to application).
    u2022 IDoc is passed to the posting program: The partner profile table is read. If the value of the Processing field is set to Process Immediately, the IDoc is passed to the posting program immediately using the program RBDAPP01.
    3. Processing in the Posting Module:
    The process code in the partner profile points to a posting module for the specific message in the IDoc. The posting program implemented as a function module either calls a standard SAP transaction by using the Call Transaction command for posting the document or invokes a direct input function module.
The results of execution are passed back via the function module's output parameters. If the posting is successful, the IDoc gets status 53 (Application Document Posted); otherwise it gets status 51 (Error: Application Document Not Posted).

  • Step by step procedure to create Enhancement spots, points and sections

    Hi all,
Can any one of you please provide a step-by-step procedure to create enhancement spots, enhancement points, and enhancement sections, and also give a brief explanation of them?
    Regards,
    Pramod

    Hi Pramod,
    The enhancement spots are used to manage explicit enhancement options. Enhancement spots carry information about the positions at which enhancement options were created. One enhancement spot can manage several enhancement options of a Repository object. Conversely, several enhancement spots can be assigned to one enhancement option.
    Use
    You create an explicit enhancement option when processing a Repository object with the relevant tool by creating an enhancement spot element definition at a point where this is possible. This enhancement option can then be called at different points using enhancement spot element calls. The enhancement spot element definition and the corresponding enhancement spot element calls make up the definition of an enhancement option. For example, when editing an ABAP program with the ABAP Editor, you can define explicit enhancement options in the form of the ENHANCEMENT-POINT statement, which also represents the element definition and element call.
    Each enhancement spot element definition must be assigned to at least one enhancement spot. For this, an enhancement spot element definition is assigned one or more simple enhancement spots, which in turn are assigned to at least one composite enhancement spot. Simple and composite enhancement spots are Repository objects that form a tree-like structure, where the leaves and branches represent simple and composite enhancement spots respectively. A simple enhancement spot is always assigned to exactly one enhancement technology (ABAP source code enhancement or BAdI).
    Composite enhancement spots are used for the semantic grouping of simple enhancement spots. A composite enhancement spot contains either one or more simple enhancement spots and/or one or more composite enhancement spots of the relevant type. You can use composite enhancement spots to combine simple enhancement spots into meaningful units.
The ENHANCEMENT-POINT statement can either be entered directly or created by choosing Edit → Enhancement Operations → Create Enhancement in the Enhancement Builder.
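For illustration, the statements involved look roughly like this; the spot, point, section, and implementation names below are invented for the example:
* Definition side: an explicit enhancement option in some program,
* managed by the (hypothetical) simple enhancement spot z_demo_spot.
ENHANCEMENT-POINT z_demo_point SPOTS z_demo_spot.
* An enhancement section additionally supplies default code that an
* implementation can replace.
ENHANCEMENT-SECTION z_demo_section SPOTS z_demo_spot.
  WRITE / 'default behaviour'.
END-ENHANCEMENT-SECTION.
* Implementation side (created through the Enhancement Builder): the
* source plug-in executed at the enhancement point.
ENHANCEMENT 1 z_demo_implementation.
  WRITE / 'enhanced behaviour'.
ENDENHANCEMENT.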
    Cheers,
    Chaitanya.
