Issues with a customized extractor running 'Full' loads.

Hi Gurus,
Is it possible to check at what date/time a particular data record has been posted in the VBPA table [Sales Document Partner Function Table]?
My issue is that we have a customized extractor that pulls data from both the VBAP and VBPA tables based on VBELN [document number] and POSNR [item]. We are using an inner join on these two tables, so if either the document number or the item is not present in the VBPA table at the time of extraction, the data record is skipped, and we are seeing several such missing documents in BW.
The loads are 'Full' loads based on ERDAT and AEDAT on the VBAP table for the last 3 days, i.e., if a sales document was either created or changed in the last 3 days, it will be picked up by the extractor. So my reasoning for the missing data records in BW is that the VBPA entries for that particular order were updated at a later date and hence the data was missed during extraction [because of the inner join on document number and item].
But since VBPA does not have a 'created on' or 'changed on' field, I am unable to find out when exactly a data record was updated in the VBPA table.
Any suggestions that you can provide will be highly appreciated!
Thanks

Hi,
I feel your table join logic is not correct. Please discuss it with your functional team and try to revise the logic; otherwise you may keep missing some records. Test the extractor thoroughly in RSA3 before replicating it to BW.
Regards,
Suman
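
One way to act on this suggestion, sketched here as plain SQL (the real extractor would use the equivalent Open SQL left outer join in ABAP; it is assumed that header-level partners are stored in VBPA with POSNR '000000', and :p_from_date is only a placeholder for the 3-day selection):

SELECT p.VBELN,
       p.POSNR,
       p.ERDAT,
       p.AEDAT,
       a.PARVW,                           -- partner function, NULL when no VBPA row exists yet
       a.KUNNR                            -- partner number, NULL when no VBPA row exists yet
FROM   VBAP p
LEFT OUTER JOIN VBPA a
       ON  a.VBELN = p.VBELN
       AND a.POSNR IN (p.POSNR, '000000') -- item-level or header-level partner
WHERE  p.ERDAT >= :p_from_date
   OR  p.AEDAT >= :p_from_date

With a left outer join, every VBAP item selected by the date restriction is extracted even if its partner rows have not been created yet; the partner fields simply arrive empty. Note that this only stops records from being skipped: if VBPA is maintained after the 3-day window has passed and the change does not touch ERDAT/AEDAT on VBAP, the document still has to be reselected before the partner data reaches BW.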

Similar Messages

  • Task fails while running Full load ETL

    Hi All,
    I am running a full load ETL for Oracle R12 (vanilla instance) HR, but 4 tasks are failing: SDE_ORA_JobDimension, SDE_ORA_HRPositionDimension, SDE_ORA_CodeDimension_Pay_level and SDE_ORA_CodeDimensionJob. I changed the parameters for all these tasks as mentioned in the installation guide and rebuilt, but the failures remain. Please help me out.
    The log looks like this for SDE_ORA_JobDimension:
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBFAMILYCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_JobDimension_Full] at [Fri Sep 26 10:52:05 2008]
    DIRECTOR> TM_6683 Repository Name: [Oracle_BI_DW_Base]
    DIRECTOR> TM_6684 Server Name: [Oracle_BI_DW_Base_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_JobDimension_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_JobDimension [version 1]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.1.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_JobDimension_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_JobDimension_Full] is run by 32-bit Integration Service [node01_HSCHBSCGN20031], version [8.1.1], build [0831].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_JobDimension_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Fri Sep 26 10:52:13 2008)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Fri Sep 26 10:52:14 2008)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 1280000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [dev], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [orcl], user [obia], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_JOB_DS :SQL INSERT statement:
    INSERT INTO W_JOB_DS(JOB_CODE,JOB_NAME,JOB_DESC,JOB_FAMILY_CODE,JOB_FAMILY_NAME,JOB_FAMILY_DESC,JOB_LEVEL,W_FLSA_STAT_CODE,W_FLSA_STAT_DESC,W_EEO_JOB_CAT_CODE,W_EEO_JOB_CAT_DESC,AAP_JOB_CAT_CODE,AAP_JOB_CAT_NAME,ACTIVE_FLG,CREATED_BY_ID,CHANGED_BY_ID,CREATED_ON_DT,CHANGED_ON_DT,AUX1_CHANGED_ON_DT,AUX2_CHANGED_ON_DT,AUX3_CHANGED_ON_DT,AUX4_CHANGED_ON_DT,SRC_EFF_FROM_DT,SRC_EFF_TO_DT,DELETE_FLG,DATASOURCE_NUM_ID,INTEGRATION_ID,TENANT_ID,X_CUSTOM) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_JOB_DS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    WRITER_1_*_1> WRT_8005 Writer run started.
    READER_1_1_1> BLKR_16007 Reader run started.
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_JobDimension.Sq_Jobs] User specified SQL Query [SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT,      PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS.  AS JOB_CODE,
      '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID]
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Fri Sep 26 10:53:05 2008
    Target tables:
    W_JOB_DS
    READER_1_1_1> RR_4049 SQL Query issued to database : (Fri Sep 26 10:53:05 2008)
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01747: invalid user.table.column, table.column, or column specification
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_JOB_DS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Fri Sep 26 10:53:06 2008
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_JOB_DS (Instance Name: [W_JOB_DS])
    WRT_8044 No data loaded for this target
    WRITER_1__1> WRT_8043 ****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_JOB_DS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Fri Sep 26 10:53:06 2008)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Fri Sep 26 10:53:06 2008)
    MAPPING> TM_6018 Session [SDE_ORA_JobDimension_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [Sq_Jobs] (Instance Name: [mplt_BC_ORA_JobDimension.Sq_Jobs])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_JOB_DS] (Instance Name: [W_JOB_DS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_JobDimension_Full] completed at [Fri Sep 26 10:53:07 2008]

    To make use of the warehouse you would probably want to connect to an EBS instance in order to populate it, since the execution plan you intend to run is designed for the EBS data model.
    If you really didn't want to connect to the EBS instance to pull data, you could build an execution plan using the Universal Adapter instead. This allows you to load from flat files if you wish, but I wouldn't recommend making that a habit for an actual implementation, as it creates another potential point of failure (populating the flat files).
    Thanks,
    Austin
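
    A side note on the actual ORA-01747: the log shows that the two flexfield mapping parameters defaulted to empty values (VAR_27027 "Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBCODE_FLXFLD_SEGMENT_COL]" and the same for $$JOBFAMILYCODE_FLXFLD_SEGMENT_COL), which leaves the generated query with the dangling expressions "PER_JOBS. AS JOB_FAMILY_CODE" and "PER_JOB_DEFINITIONS. AS JOB_CODE". Once those parameters are set to real column names, the query should resolve to something like the sketch below; SEGMENT1 is only an illustrative placeholder, so substitute whichever flexfield segment columns your configuration actually uses.
    SELECT
        PER_JOBS.JOB_ID,
        PER_JOBS.BUSINESS_GROUP_ID,
        PER_JOBS.JOB_DEFINITION_ID,
        PER_JOBS.DATE_FROM,
        PER_JOBS.DATE_TO,
        PER_JOBS.LAST_UPDATE_DATE            AS CHANGED_ON_DT,
        PER_JOBS.LAST_UPDATED_BY,
        PER_JOBS.CREATED_BY,
        PER_JOBS.CREATION_DATE,
        PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
        PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
        PER_JOBS.NAME,
        PER_JOBS.JOB_INFORMATION1,
        PER_JOBS.JOB_INFORMATION3,
        PER_JOBS.SEGMENT1                    AS JOB_FAMILY_CODE, -- placeholder for $$JOBFAMILYCODE_FLXFLD_SEGMENT_COL
        PER_JOB_DEFINITIONS.SEGMENT1         AS JOB_CODE,        -- placeholder for $$JOBCODE_FLXFLD_SEGMENT_COL
        '0'                                  AS X_CUSTOM
    FROM
        PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
        PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID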

  • Crash issues with elements organizer while running Apple OS 10.9 Mavericks

    Crash issues with Elements Organizer while running Apple OS X 10.9 Mavericks. It crashes, then asks whether you want to reopen or not. Also, when you try to maneuver the page with your mouse, things go crazy, jumping from a full-page edit back to viewing all photos and skipping around from picture to picture. Unusable as it is today!

    It is very critical to delete all preferences when you do a major OS X upgrade. You need to go to your username > Library > Preferences and delete everything pertaining to PSE. To see that Library in 10.9, open a Finder window, click your user account in the list on the left (the little house with your name), then click the gear wheel at the top of the window and choose View Options. At the bottom of the list of checkboxes you'll see Library. Turn it on and then you can easily find the Preferences folder.

  • Performance issue with a Custom view

    Hi ,
    I am pretty new to performance tuning and facing a performance issue with a custom view.
    Execution time for the view query is good, but as soon as I append a WHERE clause to the view query, the execution time increases.
    Below is the view query:
    CREATE OR REPLACE VIEW XXX_INFO_VIEW AS
    SELECT csb.system_id license_id,
    cst.name license_number ,
    csb.system_type_code license_type ,
    csb.attribute3 lac , -- license authorization code
    csb.attribute6 lat , -- license admin token
    csb.attribute12 ols_reg, -- OLS Registration allowed flag
    l.attribute4 license_biz_type ,
    NVL (( SELECT 'Y' l_supp_flag
    FROM csi_item_instances cii,
    okc_k_lines_b a,
    okc_k_items c
    WHERE c.cle_id = a.id
    AND a.lse_id = 9
    AND c.jtot_object1_code = 'OKX_CUSTPROD'
    AND c.object1_id1 = cii.instance_id||''
    AND cii.instance_status_id IN (3, 510)
    AND cii.system_id = csb.system_id
    AND a.sts_code IN ('SIGNED', 'ACTIVE')
    AND NVL (a.date_terminated, a.end_date) > SYSDATE
    AND ROWNUM < 2), 'N') active_supp_flag,
    hp.party_name "Customer_Name" , -- Customer Name
    hca.attribute12 FGE_FLAG,
    (SELECT /*+INDEX (oklt OKC_K_LINES_TL_U1) */
    nvl(max((decode(name, 'eSupport','2','Enterprise','1','Standard','1','TERM RTU','0','TERM RTS','0','Notfound'))),0) covName --TERM RTU and TERM RTS added as per Vijaya's suggestion APR302013
    FROM OKC_K_LINES_B oklb1,
    OKC_K_LINES_TL oklt,
    OKC_K_LINES_B oklb2,
    OKC_K_ITEMS oki,
    CSI_item_instances cii
    WHERE
    OKI.JTOT_OBJECT1_CODE = 'OKX_CUSTPROD'
    AND oklb1.id=oklt.id
    AND OKI.OBJECT1_ID1 =cii.instance_id||''
    AND Oklb1.lse_id=2
    AND oklb1.dnz_chr_id=oklb2.dnz_chr_id
    AND oklb2.lse_id=9
    AND oki.CLE_ID=oklb2.id
    AND cii.system_id=csb.system_id
    AND oklt.LANGUAGE=USERENV ('LANG')) COVERAGE_TYPE
    FROM csi_systems_b csb ,
    csi_systems_tl cst ,
    hz_cust_accounts hca,
    hz_parties hp,
    fnd_lookup_values l
    WHERE csb.system_type_code = l.lookup_code (+)
    AND csb.system_id = cst.system_id
    AND hca.cust_account_id =csb.customer_id
    AND hca.party_id= hp.party_id
    AND cst.language = USERENV ('LANG')
    AND l.lookup_type (+) = 'CSI_SYSTEM_TYPE'
    AND l.language (+) = USERENV ('LANG')
    AND NVL (csb.end_date_active, SYSDATE+1) > SYSDATE
    I have forced an index to avoid a full table scan on OKC_K_LINES_TL and suppressed an index on CSI_ITEM_INSTANCES.INSTANCE_ID to make the view query fast.
    So when I do select * from XXX_INFO_VIEW it executes in a decent time, but when I try to do
    select * from XXX_INFO_VIEW where active_supp_flag='Y' and coverage_type='1'
    it takes a lot of time.
    The execution plan is the same for both queries in terms of cost, but with the WHERE clause the number of bytes increases.
    Below are the execution plans:
    View query:
    SELECT STATEMENT ALL_ROWS Cost: 7,212 Bytes: 536,237 Cardinality: 3,211                                         
         10 COUNT STOPKEY                                    
              9 NESTED LOOPS                               
                   7 NESTED LOOPS Cost: 1,085 Bytes: 101 Cardinality: 1                          
                        5 NESTED LOOPS Cost: 487 Bytes: 17,043 Cardinality: 299                     
                             2 TABLE ACCESS BY INDEX ROWID TABLE CSI.CSI_ITEM_INSTANCES Cost: 22 Bytes: 2,325 Cardinality: 155                
                                  1 INDEX RANGE SCAN INDEX CSI.CSI_ITEM_INSTANCES_N07 Cost: 3 Cardinality: 315           
                             4 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_ITEMS Cost: 3 Bytes: 84 Cardinality: 2                
                                  3 INDEX RANGE SCAN INDEX OKC.OKC_K_ITEMS_N2 Cost: 2 Cardinality: 2           
                        6 INDEX UNIQUE SCAN INDEX (UNIQUE) OKC.OKC_K_LINES_B_U1 Cost: 1 Cardinality: 1                     
                   8 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_B Cost: 2 Bytes: 44 Cardinality: 1                          
         12 TABLE ACCESS BY INDEX ROWID TABLE AR.HZ_CUST_ACCOUNTS Cost: 2 Bytes: 7 Cardinality: 1                                    
              11 INDEX UNIQUE SCAN INDEX (UNIQUE) AR.HZ_CUST_ACCOUNTS_U1 Cost: 1 Cardinality: 1                               
         28 SORT AGGREGATE Bytes: 169 Cardinality: 1                                    
              27 NESTED LOOPS                               
                   25 NESTED LOOPS Cost: 16,549 Bytes: 974,792 Cardinality: 5,768                          
                        23 NESTED LOOPS Cost: 5,070 Bytes: 811,737 Cardinality: 5,757                     
                             20 NESTED LOOPS Cost: 2,180 Bytes: 56,066 Cardinality: 578                
                                  17 NESTED LOOPS Cost: 967 Bytes: 32,118 Cardinality: 606           
                                       14 TABLE ACCESS BY INDEX ROWID TABLE CSI.CSI_ITEM_INSTANCES Cost: 22 Bytes: 3,465 Cardinality: 315      
                                            13 INDEX RANGE SCAN INDEX CSI.CSI_ITEM_INSTANCES_N07 Cost: 3 Cardinality: 315
                                       16 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_ITEMS Cost: 3 Bytes: 84 Cardinality: 2      
                                            15 INDEX RANGE SCAN INDEX OKC.OKC_K_ITEMS_N2 Cost: 2 Cardinality: 2
                                  19 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_B Cost: 2 Bytes: 44 Cardinality: 1           
                                       18 INDEX UNIQUE SCAN INDEX (UNIQUE) OKC.OKC_K_LINES_B_U1 Cost: 1 Cardinality: 1      
                             22 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_B Cost: 5 Bytes: 440 Cardinality: 10                
                                  21 INDEX RANGE SCAN INDEX OKC.OKC_K_LINES_B_N2 Cost: 2 Cardinality: 9           
                        24 INDEX UNIQUE SCAN INDEX (UNIQUE) OKC.OKC_K_LINES_TL_U1 Cost: 1 Cardinality: 1                     
                   26 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_TL Cost: 2 Bytes: 28 Cardinality: 1                          
         43 HASH JOIN Cost: 7,212 Bytes: 536,237 Cardinality: 3,211                                    
              41 NESTED LOOPS                               
                   39 NESTED LOOPS Cost: 7,070 Bytes: 485,792 Cardinality: 3,196                          
                        37 HASH JOIN Cost: 676 Bytes: 341,972 Cardinality: 3,196                     
                             32 HASH JOIN RIGHT OUTER Cost: 488 Bytes: 310,012 Cardinality: 3,196                
                                  30 TABLE ACCESS BY INDEX ROWID TABLE APPLSYS.FND_LOOKUP_VALUES Cost: 7 Bytes: 544 Cardinality: 17           
                                       29 INDEX RANGE SCAN INDEX (UNIQUE) APPLSYS.FND_LOOKUP_VALUES_U1 Cost: 3 Cardinality: 17      
                                  31 TABLE ACCESS FULL TABLE CSI.CSI_SYSTEMS_B Cost: 481 Bytes: 207,740 Cardinality: 3,196           
                             36 VIEW VIEW AR.index$_join$_013 Cost: 187 Bytes: 408,870 Cardinality: 40,887                
                                  35 HASH JOIN           
                                       33 INDEX FAST FULL SCAN INDEX (UNIQUE) AR.HZ_CUST_ACCOUNTS_U1 Cost: 112 Bytes: 408,870 Cardinality: 40,887      
                                       34 INDEX FAST FULL SCAN INDEX AR.HZ_CUST_ACCOUNTS_N2 Cost: 122 Bytes: 408,870 Cardinality: 40,887      
                        38 INDEX UNIQUE SCAN INDEX (UNIQUE) AR.HZ_PARTIES_U1 Cost: 1 Cardinality: 1                     
                   40 TABLE ACCESS BY INDEX ROWID TABLE AR.HZ_PARTIES Cost: 2 Bytes: 45 Cardinality: 1                          
              42 TABLE ACCESS FULL TABLE CSI.CSI_SYSTEMS_TL Cost: 142 Bytes: 958,770 Cardinality: 63,918           
    Execution plan for view query with WHERE clause:
    SELECT STATEMENT ALL_ROWS Cost: 7,212 Bytes: 2,462,837 Cardinality: 3,211                                         
         10 COUNT STOPKEY                                    
              9 NESTED LOOPS                               
                   7 NESTED LOOPS Cost: 1,085 Bytes: 101 Cardinality: 1                          
                        5 NESTED LOOPS Cost: 487 Bytes: 17,043 Cardinality: 299                     
                             2 TABLE ACCESS BY INDEX ROWID TABLE CSI.CSI_ITEM_INSTANCES Cost: 22 Bytes: 2,325 Cardinality: 155                
                                  1 INDEX RANGE SCAN INDEX CSI.CSI_ITEM_INSTANCES_N07 Cost: 3 Cardinality: 315           
                             4 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_ITEMS Cost: 3 Bytes: 84 Cardinality: 2                
                                  3 INDEX RANGE SCAN INDEX OKC.OKC_K_ITEMS_N2 Cost: 2 Cardinality: 2           
                        6 INDEX UNIQUE SCAN INDEX (UNIQUE) OKC.OKC_K_LINES_B_U1 Cost: 1 Cardinality: 1                     
                   8 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_B Cost: 2 Bytes: 44 Cardinality: 1                          
         12 TABLE ACCESS BY INDEX ROWID TABLE AR.HZ_CUST_ACCOUNTS Cost: 2 Bytes: 7 Cardinality: 1                                    
              11 INDEX UNIQUE SCAN INDEX (UNIQUE) AR.HZ_CUST_ACCOUNTS_U1 Cost: 1 Cardinality: 1                               
         28 SORT AGGREGATE Bytes: 169 Cardinality: 1                                    
              27 NESTED LOOPS                               
                   25 NESTED LOOPS Cost: 16,549 Bytes: 974,792 Cardinality: 5,768                          
                        23 NESTED LOOPS Cost: 5,070 Bytes: 811,737 Cardinality: 5,757                     
                             20 NESTED LOOPS Cost: 2,180 Bytes: 56,066 Cardinality: 578                
                                  17 NESTED LOOPS Cost: 967 Bytes: 32,118 Cardinality: 606           
                                       14 TABLE ACCESS BY INDEX ROWID TABLE CSI.CSI_ITEM_INSTANCES Cost: 22 Bytes: 3,465 Cardinality: 315      
                                            13 INDEX RANGE SCAN INDEX CSI.CSI_ITEM_INSTANCES_N07 Cost: 3 Cardinality: 315
                                       16 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_ITEMS Cost: 3 Bytes: 84 Cardinality: 2      
                                            15 INDEX RANGE SCAN INDEX OKC.OKC_K_ITEMS_N2 Cost: 2 Cardinality: 2
                                  19 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_B Cost: 2 Bytes: 44 Cardinality: 1           
                                       18 INDEX UNIQUE SCAN INDEX (UNIQUE) OKC.OKC_K_LINES_B_U1 Cost: 1 Cardinality: 1      
                             22 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_B Cost: 5 Bytes: 440 Cardinality: 10                
                                  21 INDEX RANGE SCAN INDEX OKC.OKC_K_LINES_B_N2 Cost: 2 Cardinality: 9           
                        24 INDEX UNIQUE SCAN INDEX (UNIQUE) OKC.OKC_K_LINES_TL_U1 Cost: 1 Cardinality: 1                     
                   26 TABLE ACCESS BY INDEX ROWID TABLE OKC.OKC_K_LINES_TL Cost: 2 Bytes: 28 Cardinality: 1                          
         44 VIEW VIEW APPS.WRS_LICENSE_INFO_V Cost: 7,212 Bytes: 2,462,837 Cardinality: 3,211                                    
              43 HASH JOIN Cost: 7,212 Bytes: 536,237 Cardinality: 3,211                               
                   41 NESTED LOOPS                          
                        39 NESTED LOOPS Cost: 7,070 Bytes: 485,792 Cardinality: 3,196                     
                             37 HASH JOIN Cost: 676 Bytes: 341,972 Cardinality: 3,196                
                                  32 HASH JOIN RIGHT OUTER Cost: 488 Bytes: 310,012 Cardinality: 3,196           
                                       30 TABLE ACCESS BY INDEX ROWID TABLE APPLSYS.FND_LOOKUP_VALUES Cost: 7 Bytes: 544 Cardinality: 17      
                                            29 INDEX RANGE SCAN INDEX (UNIQUE) APPLSYS.FND_LOOKUP_VALUES_U1 Cost: 3 Cardinality: 17
                                       31 TABLE ACCESS FULL TABLE CSI.CSI_SYSTEMS_B Cost: 481 Bytes: 207,740 Cardinality: 3,196      
                                  36 VIEW VIEW AR.index$_join$_013 Cost: 187 Bytes: 408,870 Cardinality: 40,887           
                                       35 HASH JOIN      
                                            33 INDEX FAST FULL SCAN INDEX (UNIQUE) AR.HZ_CUST_ACCOUNTS_U1 Cost: 112 Bytes: 408,870 Cardinality: 40,887
                                            34 INDEX FAST FULL SCAN INDEX AR.HZ_CUST_ACCOUNTS_N2 Cost: 122 Bytes: 408,870 Cardinality: 40,887
                             38 INDEX UNIQUE SCAN INDEX (UNIQUE) AR.HZ_PARTIES_U1 Cost: 1 Cardinality: 1                
                        40 TABLE ACCESS BY INDEX ROWID TABLE AR.HZ_PARTIES Cost: 2 Bytes: 45 Cardinality: 1                     
                   42 TABLE ACCESS FULL TABLE CSI.CSI_SYSTEMS_TL Cost: 142 Bytes: 958,770 Cardinality: 63,918

    Hi,
    You should always try to use primary index fields in the WHERE clause; if that is not possible, then secondary index fields.
    If you cannot do either of the two, then try this: put the less distinct fields at the top.
    In your case, you can use BUKRS, GJAHR and WERKS at the top of the WHERE condition, followed by the less distinct values.
    Even when you use a secondary index, if the index has 4 fields and you are only using the first two of them, the index is useful only up to those two fields, provided they are in sequence.
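    For the original question, one approach worth testing - sketched under the assumption, visible in the view definition, that the view's license_id column is the csb.system_id that the scalar subqueries correlate on - is to avoid filtering on active_supp_flag directly. That filter forces the correlated NVL subquery to run for every row the view produces before the predicate can be applied; expressing the same condition as an EXISTS against the underlying tables lets the optimizer use it to cut down the driving row set. Compare the execution plans and row counts of both forms on your data before adopting it.
    SELECT v.*
    FROM   XXX_INFO_VIEW v
    WHERE  EXISTS (SELECT 1
                   FROM   csi_item_instances cii,
                          okc_k_lines_b      a,
                          okc_k_items        c
                   WHERE  c.cle_id            = a.id
                   AND    a.lse_id            = 9
                   AND    c.jtot_object1_code = 'OKX_CUSTPROD'
                   AND    c.object1_id1       = cii.instance_id || ''
                   AND    cii.instance_status_id IN (3, 510)
                   AND    cii.system_id       = v.license_id    -- same correlation as active_supp_flag
                   AND    a.sts_code IN ('SIGNED', 'ACTIVE')
                   AND    NVL(a.date_terminated, a.end_date) > SYSDATE)
    AND    v.coverage_type = '1'
    The coverage_type filter is left on the view column here; it still evaluates the second scalar subquery, but only for the rows that survive the EXISTS, which should typically be far fewer.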

  • Issues with Master Data Change Run after upgrade 7.3.1

    We have upgraded to 7.3.1 on both our BW and ECC clients. When we first run our change run in BW, it chooses to drop and rebuild aggregates
    on changes that usually don't cause a drop and rebuild; they usually do a delta.
    Has anyone had this issue after upgrading?
    We looked at our performance settings in SPRO for aggregates, and the settings have not changed.

    I loaded data for 0MATERIAL and 4 fields are not getting populated. When I check the R/3 base table and RSA3, I am able to see data for those 4 fields, but in the PSA and 0MATERIAL I am not seeing data for those fields.
    Those 4 fields are Z fields mapped to standard fields in the MARA table.
    As you say you can see the values of these fields in RSA3, we can rule out that the fields are hidden. Check the mapping on the BI side: do you have all the Z InfoObjects added to 0MATERIAL, and the corresponding mapping of source fields to target fields?
    0CUSTOMER_ATTR has some data. I added the new field in the DataSource as well as in BW. After creating the transformation I executed the DTP, and it throws the same error in every data package:
    "Record filtered in advance as error records with the same key exist".
    The load up to the PSA is an init with data transfer, and the DTP is delta.
    There are some duplicate records in the incoming master data values. You can manually change them in the PSA and load. And if you are sure the duplicate records can be dropped, you can tick the HANDLE DUPLICATE RECORDS option in the DTP, Update tab.

  • How to make LIS Extractor nightly full load

    Hello All,
    We need an LIS Extractor to be a full load.  It appears Full Load only reads setup tables.  How can we do a full load for a LIS Extractor?
    Thanks!

    Hi All,
    Here is our scenario:
    I am referring to the LO Cockpit.
    We have deltas coming over to a DSO that has a key of order number, material and batch. When the order is created, for example, it will have batch NULL because we have not determined which batch to use yet. It could remain like this for days. In the meantime this record comes over to BW just fine and updates. Now say we decide to allocate a batch to this order that exists in BW with a NULL batch. The record comes over to BW and creates a new record, so we have 2 records in BW, one with batch NULL and one with the batch filled in. We really only want to see the record with the batch filled in.
    We also thought about what happens if we assign a batch (after the record came over to BW with batch NULL), which creates 2 records as mentioned above, and then we change this batch to yet another batch. When this record comes over to BW, the order will have 3 records by batch, yet only the latest is valid.
    So we thought a full load would work, which it would. How do you handle such a scenario for deltas? We load to the DSO as mentioned and then push to a cube.
    Thanks

  • IExpenses-12.1.3 Facing Issue with 2 custom text fields on the standard OAF

    Dear All,
    We are facing a problem where custom text fields on a standard OAF page do not retain their values when we navigate back and forth on the page.
    Here are the exact issue details:
    1) We added 2 text fields (Attribute5 and Attribute6) through personalization on the Mileage Line Details screen (standard OAF page) of iExpenses 12.1.3.
    2) The business requirement is that whenever the user enters values into these fields, the difference of the values is populated in a third field, which is a standard field on that page.
    Issue
    When the user enters values into the above 2 fields, the difference is calculated correctly; however, once he clicks to return and then comes back to the detail page,
    all the standard fields retain their values but the 2 custom fields are blank.
    Is there any issue with the personalization, or is it something else? Please suggest.
    Thanks,
    Mahesh

    Thanks, Pratap, for checking.
    There is a button named "Calculate Amount" on the line details page, so it is happening in the below 2 scenarios:
    1) When the user enters values in the 2 fields and clicks the Calculate Amount button, the values disappear from the custom fields.
    2) When the user clicks the return button, goes to the main page and then clicks the detail button (to come back to the same line), all the standard fields have values but the custom ones have disappeared.
    Thanks,
    Mahesh

  • Issue with the SNP Extractor in APO-BI.

    Hi Experts,
    I am facing an issue with the SNP extractor. I have an SNP extractor in the APO system with 4 DataSources connected to it. Out of the 4 DataSources, we get a display of records for relevant selections in only one DataSource; for the other three, whatever the selections, we get the message "0 records available".
    Could you please suggest what the issue might be and indicate a solution? Do we need to apply an SAP Note to resolve it? And where can we see the current Support Package level installed in the system?
    Replies in this regard would be very helpful.
    Thanks and Regards,
    Bhargava

    Hi,
    you can see the current Support Package this way:
    Log on to your BW system --> System --> Status --> Component Information (the magnifying-glass icon) -->
    next to the SAP_BW software component and its release you will find the level (for example, 0006).

  • Issues with OSSO ,custom login module and form based authentication

    Hi:
    We are facing issues with OSSO (Oracle Single Sign-On). Our application uses form-based
    authentication and a custom login module.
    The application goes into an infinite loop when we try to log in using OSSO. From the logs,
    it looks like this: when we try to log in via OSSO, the application goes to the login
    page, gets the remote user from the request and forwards it to the home page; up to this point
    the behaviour is correct. But after that the home page finds that authentication has
    not been done and sends the request back to the login page, and the login page again sends it to the home page because it
    finds that the remote user is not null.
    Our web.xml form authentication entry looks like this :
    <login-config>
    <auth-method>FORM</auth-method>
    <form-login-config>
    <form-login-page>/jsp/login.jsp</form-login-page>
    <form-error-page>/jsp/couldnotlogin.jsp</form-error-page>
    </form-login-config>
    </login-config>
    While entry in orion-application.xml has the following entry for custom login :
    <jazn provider="XML">
         <property name="custom.loginmodule.provider" value="true" />
    <property name="role.mapping.dynamic" value="true" />
    </jazn>
    If I change the authentication type to BASIC and add the following entry
    in orion-application.xml, will it solve the issue?
    <jazn provider="XML">
         <property name="custom.loginmodule.provider" value="true" />
    <property name="role.mapping.dynamic" value="true" />
    <jazn-web-app auth-method="SSO" />
    </jazn>
    Any help regarding this will be appreciated.
    Thanks
    Anil

  • Minor issue with open parameters upon initial PDF load

    Hello, everyone.
    I am experiencing an odd issue with using open parameters to view a PDF in a browser window.
    We are using ColdFusion Server 9.0.1 (soon to upgrade to 10) and the Solr Collection Server that is bundled with it.  The server is updating the collections on a daily basis via Scheduled Tasks.
    When a user (okay.. it's me.. still in testing mode) uses the form to search the collection of PDFs for a specific keyword (let's use "petroleum"), the collections indicate that there are about 31 PDF files that contain the word "petroleum".  Most of them (when opened via "http://domain.com/pdf_file_a.pdf#search="petroleum"&zoom=100") will highlight the word "petroleum" in the document, every time.
    However, there are some PDFs that when opened will indicate that there are ZERO instances of "petroleum" in the document.  But if you refresh the browser, it suddenly finds three instances that it didn't see the first time.
    Is this a bug?  Has anyone else experienced this issue?  Is there a fix or work-around for it?
    Thank you,
    ^_^

    Anyone?

  • Issue with adding custom speed to a behavior

    I am trying to add a custom speed to a write on behavior. The shape I am trying to write on works fine when I have it at any other speed setting but when I set it to custom (which I need to allow it to both write on and write off at the end) the shape disappears. Despite playing with the custom speed percentage, the shape never comes back. Any ideas?

    Ok. I opened your project in my Motion 4 and after it updated it, I got the custom behavior to work fine. However, I had a thought so I decided to open it on my machine that's using Motion 3. There I noticed the problem you're having. I was unable to get the shape write on behavior to work either.
    I believe the problem to be with your project. It appears slightly corrupt. Here's something you might try. Create a new project. Redraw your circle shape. Apply the write on behavior and keyframe it as I mentioned before. Then copy and paste it into your current sequence to replace the circle you have in there.
    If that doesn't work, I'd rebuild your project. Instead of using all those strokes, just make one X and one O and one line. Then use clones for all the others. That should simplify the project quite a bit.

  • I am having issues with a site for work. It loads all the links on the left side of the screen. Is there a way to fix this or get an older version of Firefox?

    It's my office intranet and I was told it's not compatible with this upgrade.

    You can try basic steps like these in case of issues with web pages:
    Reload web page(s) and bypass the cache to refresh possibly outdated or corrupted files.
    *Hold down the Shift key and left-click the Reload button
    *Press "Ctrl + F5" or press "Ctrl + Shift + R" (Windows,Linux)
    *Press "Command + Shift + R" (Mac)
    Clear the cache and the cookies from websites that cause problems.
    "Clear the Cache":
    *Firefox/Tools > Options > Advanced > Network > Cached Web Content: "Clear Now"
    "Remove Cookies" from sites causing problems:
    *Firefox/Tools > Options > Privacy > Cookies: "Show Cookies"
    Start Firefox in Safe Mode to check if one of the extensions (Firefox/Tools > Add-ons > Extensions) or if hardware acceleration is causing the problem (switch to the DEFAULT theme: Firefox/Tools > Add-ons > Appearance).
    *Do NOT click the Reset button on the Safe Mode start window.
    *https://support.mozilla.org/kb/Safe+Mode
    *https://support.mozilla.org/kb/Troubleshooting+extensions+and+themes

  • Issue with SSIS Custom Components - 64 bit SQL 2012

    Hello All,
    Our SSIS packages built on SQL 2008 R2 make use of some custom components. These custom components are installed as part of an MSI. The DLLs are copied to the Windows\assembly [GAC] folder and also to the Program Files (x86)\SQL Server\110\DTS\* folder.
    The installation does not copy the DLLs to the 64-bit Program Files folder:
    X:\Program Files\Microsoft SQL Server\110\DTS
    These packages are executed via SQL Agent jobs on a 64-bit SQL Server and there does not seem to be any issue.
    Now we are upgrading our servers to SQL 2012 and we have a new installer for the custom components as well. The new custom components use .NET Framework 4.0, and when installed, the DLL files get copied to the 32-bit .NET runtime GAC folder and also
    to the SQL Server DTS folder under Program Files (x86). The upgraded packages work only when we set the runtime mode to 32 bit. The packages execute successfully within the 32-bit dtexec utility, but when we try to run the same package using the 64-bit dtexec utility, the
    process errors out with a "component failed to load" message. The package moves data between two SQL Server instances.
    The custom components have always been built for the 32-bit runtime. I can run an older package through the dtexec utility (from the 64-bit folder in Program Files) and it works without any issues. After the upgrade the package will only execute in
    the 32-bit utility. Can someone help me understand this issue?
    Regards, Dinesh

    Thank you, Arthur.
    I think we got the answer as well: because the .NET Framework 3.0 installer copied the files to C:\Windows\assembly, the dtexec utility [32 bit/64 bit] was able to load the components.
    Now, with the new installer, the files are copied to runtime-specific GAC folders as Arthur mentioned. The 64-bit utility does not find the DLLs in the GAC, whereas the 32-bit version will find them.
    Regards, Dinesh

  • Issue With R/3 Extractors 2LIS_13_VDITM and 2LIS_11_VAITM during Delta Update

    Hi all,
    We are using the 2LIS_13_VDITM and 2LIS_11_VAITM DataSources to load the sales overview cube directly, as well as a custom ODS separately, using the Direct Delta update method.
    The first upload (initialization) was perfect, but if any changes are made to the existing data or any new entries are created in the source system, the R/3 extractors (2LIS_13_VDITM and 2LIS_11_VAITM) are not pulling those records.
    Once we tried deleting the setup tables and reloading them, the extractors were able to pull the records, but the issue crops up again for any new entries or changes made in the source system.
    This is stopping us from completing the project, so any suggestions that would help us solve the issue are most welcome.
    Regards,
    Surendhar.

    Hi,
    If you have selected the 'Queued delta' update method for applications 11 and 13, then make sure that the "update" (collective run) job is scheduled to move the data from LBWQ to RSA7. Once the data is available in RSA7, you can execute the InfoPackage with a delta load.
    If you have selected the 'Direct delta' update method for applications 11 and 13, then there is no need for an "update" job to move the data from LBWQ to RSA7. Once the data is available in RSA7, you can execute the InfoPackage with a delta load.
    With rgds,
    Anil Kumar Sharma .P

  • Issues with changing connection at run-time

    Post Author: dmazourick
    CA Forum: Data Connectivity and SQL
    We've tried a lot of different ways to resolve this issue, but we get a different result every time.
    Probably someone has dealt with this issue before and knows how to resolve it correctly.
    We're using the Crystal Reports Runtime Components X+ (X, XI, XI R2) - all of them have this issue.
    We need the client application to connect to multiple data sources: the user chooses a report, chooses a data source, and we show the report for the specified data source.
    The data sources are tables or stored procedures stored in different databases on different servers.
    For sure, every data source for a single report has the same structure, but that doesn't matter.
    The issue is: when the name of the database on one server is the same as the name of the database on the second server, connection caching occurs.
    How we can check that:
    1. We run the report for Server1:<DBN> - the report shows data from Server1.
    2. We open a second report for Server2:<DBN> - the report shows data from Server1.
    3. We close the application and run 1-2 in the opposite order; now both reports show data from Server2.
    We've tried different approaches - below is a code sample that opens the report for a specific connection.
    Just so that no one has to ask "Are you sure you're passing the correct connection info, etc.?" - yes, we are sure, because we have been trying to fix this issue for a long time, have tried a lot of different approaches, and still cannot find the right solution.
    The code looks like below. This is VB6 code, but the same situation was also reproduced with VC++ 6.0.
    We're not looking into a CR .NET solution for now.
    =================================================
    Sub DisplayReport(Server as String, DB as String, UID as String, PWD as String, viewer as Object)
        Dim app As New CRAXDRT.Application
        Dim report As CRAXDRT.report
        Dim database As CRAXDRT.database
        Dim table As CRAXDRT.DatabaseTable
        Dim par As CRAXDRT.ParameterFieldDefinition
        Set report = app.OpenReport("D:\TestReport_X.rpt")
        report.database.LogOnServer "pdssql.dll", Server, DB, UID, PWD
        Set table = report.database.Tables(1)
        table.SetLogOnInfo Server, DB, UID, PWD
        table.Location = table.Name
        report.database.Verify
        viewer.ReportSource = report
        viewer.ViewReport
    end sub
    =================================================
    The result of the above code is the following:
    1. If we pass the same viewer and use a different Server, the report is displayed correctly.
    2. If we pass different viewers and use a different Server, the reports contain the same data.
    The result of the above code also depends on the version of Crystal Reports the report was designed in:
    1. For a report designed in 8.5, passing the same viewer with the same connection info a second time will refresh the report.
    2. For a report designed in X, XI or XI R2, there is no refresh.
    Also, a slight modification of the above code makes reports designed in XI work properly, but not reports designed in X and 8.5:
    1. Before calling LogOnServer, do the following: DB = DB & ";" & Int(Rnd() * 32767)
    That makes a report designed in XI display properly in different viewers, but doesn't have any impact on X or 8.5.
    We're really looking for any help with this question.

    Post Author: fburch
    CA Forum: Data Connectivity and SQL
    I am having similar problems and some successes.
    I have 70+ reports and now suddenly I want to point them at two different servers, but at databases with the same name like you talked about.
    I first just tried the following:
    #1. Load report:
    Dim myReport As New ReportDocument
    myReport.Load(filename)
    #2. Pass in parameter values
    ''Get the collection of parameters from the report
    Dim crParameterFieldDefinitions As ParameterFieldDefinitions = r.DataDefinition.ParameterFields
    ''Access the specified parameter from the collection
    Dim crParameter1 As ParameterFieldDefinition = crParameterFieldDefinitions.Item(ParamName)
    ''Get the current values from the parameter field. At this point
    ''there are zero values set.
    'crParameter1Values = crParameter1.CurrentValues
    ''Set the current values for the parameter field
    Dim crDiscrete1Value As New ParameterDiscreteValue
    If crParameter1.ValueType = FieldValueType.DateField Or crParameter1.ValueType = FieldValueType.DateTimeField Then
    If ParamValue Is System.DBNull.Value Then
    crDiscrete1Value.Value = CDate("1/1/1900")
    ElseIf ParamValue Is Nothing Then
    crDiscrete1Value.Value = CDate("1/1/1900")
    Else
    crDiscrete1Value.Value = ParamValue
    End If
    ElseIf crParameter1.ValueType = FieldValueType.StringField Then
    If ParamValue Is Nothing Then
    crDiscrete1Value.Value = ""
    Else
    crDiscrete1Value.Value = ParamValue
    End If
    ElseIf crParameter1.ValueType = FieldValueType.BooleanField Then
    If ParamValue Is Nothing Then
    crDiscrete1Value.Value = False
    ElseIf ParamValue.ToString.ToUpper = "TRUE" Then
    crDiscrete1Value.Value = True
    Else
    crDiscrete1Value.Value = False
    End If
    ElseIf crParameter1.ValueType = FieldValueType.NumberField Then
    If ParamValue Is Nothing Then
    crDiscrete1Value.Value = 0
    Else
    crDiscrete1Value.Value = ParamValue
    End If
    Else
    If ParamValue Is System.DBNull.Value Then
    crDiscrete1Value.Value = Nothing
    ElseIf ParamValue Is Nothing Then
    crDiscrete1Value.Value = Nothing
    Else
    crDiscrete1Value.Value = ParamValue
    End If
    End If
    ''Add the first current value for the parameter field
    Dim crParameter1Values As New ParameterValues
    crParameter1Values.Add(crDiscrete1Value)
    ''All current parameter values must be applied for the parameter field.
    crParameter1.ApplyCurrentValues(crParameter1Values)
    #3 Set "Table Log in info" (most of my reports using stored procedures, but I guess I still needed this step).
    Dim CrTables As Tables = r.Database.Tables
    Dim CrTable As Table
    Dim crtableLogoninfos As New TableLogOnInfos()
    Dim crtableLogoninfo As New TableLogOnInfo()
    With crConnectionInfo
    .ServerName = connectionParser.GetServerName(connectionString)
    .DatabaseName = connectionParser.GetDatabaseName(connectionString)
    If connectionParser.DoesUseIntegratedSecurity(connectionString) = True Then
    .IntegratedSecurity = True
    Else
    .UserID = connectionParser.GetServerUserName(connectionString)
    .Password = connectionParser.GetServerPassword(connectionString)
    .IntegratedSecurity = False
    End If
    End With
    For Each CrTable In CrTables
    crtableLogoninfo = CrTable.LogOnInfo
    crtableLogoninfo.ConnectionInfo = crConnectionInfo
    CrTable.ApplyLogOnInfo(crtableLogoninfo)
    If InStr(CrTable.Location, ".dbo.") = 0 Then
    CrTable.Location = crConnectionInfo.DatabaseName + ".dbo." + CrTable.Location
    End If
    Next
    If r.Subreports.Count > 0 Then
    Dim crSections As Sections
    Dim crSection As Section
    Dim crReportObjects As ReportObjects
    Dim crReportObject As ReportObject
    Dim crSubreportObject As SubreportObject
    Dim crDatabase As Database
    Dim subRepDoc As New ReportDocument()
    'SUBREPORTS
    'Set the sections collection with report sections
    crSections = r.ReportDefinition.Sections
    'Loop through each section and find all the report objects
    'Loop through all the report objects to find all subreport objects, then set the
    'logoninfo to the subreport
    For Each crSection In crSections
    crReportObjects = crSection.ReportObjects
    For Each crReportObject In crReportObjects
    If crReportObject.Kind = ReportObjectKind.SubreportObject Then
    'If you find a subreport, typecast the reportobject to a subreport object
    crSubreportObject = CType(crReportObject, SubreportObject)
    'Open the subreport
    subRepDoc = crSubreportObject.OpenSubreport(crSubreportObject.SubreportName)
    crDatabase = subRepDoc.Database
    CrTables = crDatabase.Tables
    'Loop through each table and set the connection info
    'Pass the connection info to the logoninfo object then apply the
    'logoninfo to the subreport
    For Each CrTable In CrTables
    crtableLogoninfo = CrTable.LogOnInfo
    crtableLogoninfo.ConnectionInfo = crConnectionInfo
    CrTable.ApplyLogOnInfo(crtableLogoninfo)
    If InStr(CrTable.Location, ".dbo.") = 0 Then
    CrTable.Location = crConnectionInfo.DatabaseName + ".dbo." + CrTable.Location
    End If
    Next
    End If
    Next
    Next
    #4 go get the data
    crv.ReportSource = myReport
    crv.Refresh()
    #5 Call export to disk function.
    This was not changing server - did not realize it was a caching problem as you suggested. That makes sense. So anyway, then of course I threw a verify database statement on there, before I get the data. Now looks like this:
    #1 Load Report
    #2. Pass in parameter values (dummy values that will generate the schema of the table without having to actually run long-running procedures, i.e. select cast(1 as int) as somefield1, cast(2.0 as numeric(10,0)) as somefield2)
    #3 Set "Table Log in info"
    #3b Verify the database which seems to be a necessity:
    myReport.VerifyDatabase()
    #3c Re-populate the report with real parameter values, same as #2 but this time with the ones that will generate the real data
    #4 go get the data
    #5 Call export to disk function.
    This does work, some of the time. When the data source underlying the report is tables, it works. I made a dummy Crystal report with lots of different types of params (with a stored procedure as the underlying data source) - this also worked!
    Unfortunately, when I run this against the majority of my reports, I get this stupid "invalid mapping type value" error, which I have not been able to resolve yet.
    I also tried putting in a myReport.SetDatabaseLogon("","") -- what would this do, clear it out? (I saw this referenced somewhere.)
    Then I tried putting the real connection info in there as well ...
    myReport.SetDatabaseLogon(uid, pwd, serverName, DBname)
    I put this SetDatabaseLogon call before I called VerifyDatabase, which is where the process is bombing out and giving me "invalid mapping type" for the reports that do not run.
    At this point I am still working on a solution. I tried creating a dummy report that used the same parameter types as a report that was failing and voila - the dummy report worked. Anyway, let me know if you get your problem fixed and I will do the same. It looks like you are using a different method that I didn't notice, "LogOnServer".
