Data Package Issue in DTP

Hi gurus,
My data flow is DataSource -> InfoSource -> write-optimized DSO with a semantic key.
In the source I have 10 records, of which 7 are duplicates (same semantic key).
I reduced the DTP data package size from 50,000 to 5.
When I executed the DTP, I got 2 data packages: the first held all 7 records for the same set of keys, and the second held the remaining records.
My doubt is: I defined the data package size as 5, so how can the first data package hold 7 records instead of 5?
Thanks in advance !

Hi ,
It is because of the semantic key setting that you have maintained. Data records that have the same key are combined in a single data package, so the package size is only a guideline for where a package may be cut. Since all 7 of your duplicate records share the same key, they cannot be split across packages, and the first package grows to 7 records even though the size limit is 5. This setting is only relevant for DataStore objects with data fields that are overwritten.
Semantic groups define how the data packages that are read from the source (DataSource or InfoProvider) are built.
This setting also defines the key fields of the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
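To illustrate (a minimal sketch, not SAP's actual DTP implementation; the TY_REC structure and its DOC_NO field are invented for the example): package building with a semantic group only closes a package at a key change, which is exactly why a size limit of 5 can yield a package of 7.

TYPES: BEGIN OF ty_rec,
         doc_no TYPE n LENGTH 10,                 " semantic key (hypothetical)
         amount TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_rec.
DATA: lt_source  TYPE STANDARD TABLE OF ty_rec,   " source records
      lt_package TYPE STANDARD TABLE OF ty_rec,   " current data package
      ls_rec     TYPE ty_rec,
      ls_next    TYPE ty_rec,
      lv_size    TYPE i VALUE 5,                  " requested package size
      lv_index   TYPE sy-tabix.

SORT lt_source BY doc_no.            " grouping requires sorted input
LOOP AT lt_source INTO ls_rec.
  lv_index = sy-tabix + 1.           " index of the following source record
  APPEND ls_rec TO lt_package.
  READ TABLE lt_source INTO ls_next INDEX lv_index.
  " close the package only when it is full AND the next record starts a
  " new semantic key (or the source is exhausted); a run of same-key
  " records therefore stays together and can exceed lv_size
  IF lines( lt_package ) >= lv_size AND
     ( sy-subrc <> 0 OR ls_next-doc_no <> ls_rec-doc_no ).
    " ... hand lt_package over for processing ...
    CLEAR lt_package.
  ENDIF.
ENDLOOP.
" any rows remaining in lt_package form the final (possibly short) package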
Hope it helps.
Thanks
Kamal Mehta

Similar Messages

  • Data Packet issue for DTP

    Hi,
    I have a data packet issue with DTP when loading the General Ledger Account 2009
    cube. The data comes from DataSource 0FI_GL_4; it is first loaded to a DSO
    and then to the cube. Up to that level the data is fine, but when the data is
    loaded to the 2009 cube using a full-load DTP, I have an issue in the report:
    the net balance is not 0.
    However, when I do a manual load with selective company codes as single-value
    selections in the DTP filter condition, my data matches in the report and
    the net balance is 0.
    From this I think there is an issue with the data packets of the DTP.
    Please suggest in this regard.
       Regards,
       prasad.

    Hi Ngendra,
    Yes, there can sometimes be problems with data loads via DTP.
    This can be resolved by setting a semantic group at DTP level; make sure the semantic fields you define keep the related records together.
    I am not sure about the functional side, but going by your post, I assume your problem will be resolved by setting company code (or another suitable unique field) as the semantic field for the data load.
    Hope this helps you.
    Best Regards,
    Maruthi

  • Impact of Changing Data Package Size with DTP

    Hi All,
    We have a delta DTP to load data from a DSO to an InfoCube. The default data package size of the DTP is 50,000 records.
    Due to the huge volume of data, the internal table memory is exhausted and the data load fails.
    We then changed the data package size to 10,000, and the load ran successfully.
    The DTP with a package size of 50,000 ran for 40 minutes and failed, but the DTP with a package size of 10,000 took 15 minutes for the same amount of data.
    Please find my questions below:
    Why does a DTP with a bigger packet size run longer than a DTP with a smaller packet size?
    Also, by reducing the standard data package size from 50,000 to 10,000, will any other data loading be impacted?
    Thanks

    Hi Sri,
    If your DTP is taking more time, then check your transformation:
    1. Transformations with routines always take more time, so if you want to reduce the execution time, the routines should be optimized for good performance.
    2. Also check whether you have filters at DTP level. Filters make the DTP take longer; if the same data is filtered at routine level instead, it takes much less time.
    3. If you cannot change the routine, you can set semantic keys in your DTP. The package data will be sorted by the semantic keys, which may help the routine process the data faster.
    4. Your routine fails due to internal table memory, so check whether a SELECT statement in the routine is missing a FOR ALL ENTRIES IN RESULT_PACKAGE or SOURCE_PACKAGE clause. Using it will reduce the record count (see the sketch after this list).
    5. Wherever possible, delete duplicate records, and filter useless data in the start routine itself.
    6. Refresh internal tables once the data is no longer needed. If your tables are global, the data is present at every routine level, so refreshing helps to reduce memory usage.
    7. The maximum memory that can be occupied by an internal table (including its internal administration) is 2 gigabytes; a more realistic figure is up to 500 megabytes.
    8. Also check the number of jobs running at that time. If lots of jobs are active simultaneously, less memory is available and the DTP may fail.
    Why does a DTP with a bigger packet size run longer than a DTP with a smaller packet size?
    Start and end routines work at package level, so the routine runs for each package one by one. By default, a package contains data sorted by the (non-unique) characteristic keys of the source or target; by setting semantic keys you can change this order. A package holding more data therefore takes more time to process than a package holding less data.
    By reducing the standard data package size from 50,000 to 10,000, will any other data loading be impacted?
    It only impacts the run of that load, but if lots of other loads are running simultaneously, the server can allocate more space to them. So before reducing the package size, check whether it actually helps routine performance (start and end) or just increases overhead.
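    A minimal sketch of point 4 in a start routine, assuming a hypothetical lookup DSO active table /BIC/AZMYDSO00 whose MATERIAL and PLANT fields also exist in SOURCE_PACKAGE; restricting the SELECT with FOR ALL ENTRIES keeps the lookup table no larger than what the current package actually needs:

    TYPES: BEGIN OF ty_lookup,
             material TYPE /bi0/oimaterial,
             plant    TYPE /bi0/oiplant,
             price    TYPE p LENGTH 11 DECIMALS 2,
           END OF ty_lookup.
    DATA: lt_lookup TYPE STANDARD TABLE OF ty_lookup.

    " FOR ALL ENTRIES on an empty table would select the whole table,
    " so guard against an empty package first.
    IF source_package[] IS NOT INITIAL.
      SELECT material plant price
        FROM /bic/azmydso00                  " hypothetical lookup DSO
        INTO CORRESPONDING FIELDS OF TABLE lt_lookup
        FOR ALL ENTRIES IN source_package
        WHERE material = source_package-material
          AND plant    = source_package-plant.
    ENDIF.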
    Hope these points will be helpful.
    Regards,
    Jaya Tiwari

  • Determine the count of data packages in a DTP request

    Hello,
    in a start routine I need to know the count (i.e. the highest number) of all data packages in the current DTP request.
    I have had a look at table RSTSODSREQUESTPG; it contains exactly the fields I need,
    but I do not find every DTP request in this table.
    To me it looks as if this table was only filled in BW 3.5 but not in BI 7.0, or there is another reason why not every DTP request is stored in this table.
    Is there a new table to get this information?
    Thanks
    Armin

    Hello,
    in the table RSBKDATAPAKID I found the field DATAPAKID, which is what I was looking for.
    But the field REQUID is only 6 bytes, like: 123456.
    In the start routine I only have a 30-byte field like: DTPR_4B34567890123456789012345
    In the Administrator Workbench I can see that both numbers refer to the same request.
    Is there a mapping table?
    In the table RSDDSTATDTP the field INSTANCE seems to hold the 30-byte DTPR number,
    and a field DATAPAKID is also in this table.
    That looks good; I will try out whether it works.
    Thanks
    Armin
    Edited by: Armin Batzelt on Sep 17, 2008 4:44 PM
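    A sketch of that second approach, assuming (as the post suggests; not verified here) that INSTANCE in RSDDSTATDTP carries the 30-character DTPR request number known in the start routine; the highest DATAPAKID for the request is then the package count:

    DATA: lv_requnr   TYPE rsddstatdtp-instance,    " 30-byte DTPR request number
          lv_packages TYPE rsddstatdtp-datapakid.

    " lv_requnr must be filled with the request number available in the routine
    SELECT MAX( datapakid )
      FROM rsddstatdtp
      INTO lv_packages
      WHERE instance = lv_requnr.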

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system where I pull data from a source using a Direct Access DTP (it does not extract from the PSA; it extracts directly from the source).
    The source is a table in the Oracle DB; using a DataSource and a Direct Access DTP, I pull data from this table into my BW InfoCube.
    The DTP's package size is set to 100,000, and whenever this load is triggered, the data records from the source table are fetched in several data packages. This has been working fine and works fine now as well.
    But very rarely, the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from the source.
    It ends with the message "No more data records found" even though we have records waiting to be pulled. The DTP in the process chain does not even fail; it continues to the next step with a "Green" status.
    Have you faced a similar situation in any of your systems? What is the cause? How can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your suggestions.
    Unfortunately, I am not able to implement any of them because I am not allowed to change the DTP settings.
    So I am working on finding the root cause of this issue and came across SAP Note 1506944 (Only one package is always extracted during direct access), which says this is a program error.
    Hence I am checking further with SAP and will share their insights once I hear back from them.
    Cheers
    Shiva

  • Issue in Update routine due to Data Package

    We have a peculiar situation.
    The scenario is as follows:
    We have to load data from ODS1 to ODS2.
    The data package size is 9,980 while transferring data from ODS1 to ODS2.
    In the update rule we have some calculations, and we rank the records based on these calculations.
    The ODS key for both ODS1 and ODS2 is the same, i.e. Delivery Number, Delivery Item and Source System.
    For example, a delivery number has 12 delivery items.
    These delivery items land in different data packages, namely data package 1 and data package 4.
    So instead of getting ranks 1 to 10, it calculates ranks 1 to 5 for the first set of items and again 1 to 5 for the second.
    But what we require is ranks 1 to 10.
    This is due to the fact that the items are in different data packages.
    The ABAP routine itself works fine; the data package split is the problem.
    Can anybody suggest an alternative solution to this issue?
    Thanks in advance for your assistance.

    CODE FOR INTER-DATA-PACKAGE TREATMENT
    PROGRAM UPDATE_ROUTINE.
    *$*$ begin of global - insert your declaration only below this line  *-*
    * TABLES: ...
    * DATA:   ...
    DATA: v_packet_nbr TYPE i VALUE 1.
    DATA:
      g_requnr  TYPE rsrequnr.
    DATA:
      l_is        TYPE string VALUE 'G_S_IS-RECNO',
      l_requnr    TYPE string VALUE 'G_S_MINFO-REQUNR'.
    FIELD-SYMBOLS: <g_f1> TYPE ANY,
                   <g_requnr> TYPE ANY.
    TYPES:
      BEGIN OF global_data_package.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: recno   LIKE sy-tabix,
      END OF global_data_package.
    DATA lt_data_package_collect TYPE STANDARD TABLE OF global_data_package.
    DATA ls_datapack TYPE global_data_package.
    * data package enhancement declaration
    TYPES: BEGIN OF datapak.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: END OF datapak.
    DATA: datapak1 TYPE STANDARD TABLE OF datapak,
          wa_datapak1 LIKE LINE OF datapak1.
    * declarations for business rules implementation
    TYPES : BEGIN OF ty_ydbsdppx.
            INCLUDE STRUCTURE /bic/aydbsdppx00.
    TYPES: END OF ty_ydbsdppx.
    DATA : it_ydbsdppx TYPE STANDARD TABLE OF ty_ydbsdppx WITH HEADER LINE,
           wa_ydbsdppx TYPE ty_ydbsdppx,
           temp TYPE /bic/aydbim00100-price,
           lv_tabix TYPE sy-tabix.
    *$$ end of global - insert your declaration only before this line   *-*
    * The following definition is new in BW 3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8YDBIM001.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record no.
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$*$ begin of routine - insert your code only below this line        *-*
    * fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
    * to make monitor entries
      TABLES: rsmonfact.
      TYPES:
        BEGIN OF ls_rsmonfact,
          dp_nr TYPE rsmonfact-dp_nr,
        END OF ls_rsmonfact.
      DATA: k TYPE i,
            v_lines_1 TYPE i,
            v_lines_2 TYPE i,
            v_packet_max TYPE i.
    * declaration of internal tables
      DATA: it_rsmonfact TYPE STANDARD TABLE OF ls_rsmonfact.
    ***************** INTER-PACKAGE COLLECTION TREATMENT *****************
      ASSIGN (l_requnr) TO <g_requnr>.
    * one row per data package of the current request
      SELECT dp_nr FROM rsmonfact
        INTO TABLE it_rsmonfact
        WHERE rnr = <g_requnr>.
      DESCRIBE TABLE it_rsmonfact LINES v_packet_max.
      IF v_packet_nbr < v_packet_max.
    *   not the last package yet: collect it and empty DATA_PACKAGE
        APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
        CLEAR: DATA_PACKAGE.
        REFRESH DATA_PACKAGE.
        v_packet_nbr = v_packet_nbr + 1.
        CLEAR: MONITOR[], MONITOR.
        MONITOR-msgid = '00'.
        MONITOR-msgty = 'I'.
        MONITOR-msgno = '398'.
        MONITOR-msgv1 = 'All data_packages have been gathered in one. '.
        MONITOR-msgv2 = 'The last DATA_PACKAGE contains all records.'.
        APPEND MONITOR.
      ELSE.
    *   last data package => perform the business rules.
        IF v_packet_max > 1.
          APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
          CLEAR: DATA_PACKAGE[], DATA_PACKAGE.
          k = 1.
    *     put all collected packages back into DATA_PACKAGE, renumbering recno.
          LOOP AT lt_data_package_collect INTO ls_datapack.
            ls_datapack-recno = k.
            APPEND ls_datapack TO DATA_PACKAGE.
            k = k + 1.
          ENDLOOP.
          CLEAR : lt_data_package_collect.
          REFRESH : lt_data_package_collect.
        ENDIF.
    *   sort the global data package and keep only the first occurrence of
    *   each record
        SORT DATA_PACKAGE BY material plant calmonth.
        DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
              COMPARING material plant calyear.
        SELECT * FROM /bic/aydbsdppx00
            INTO TABLE it_ydbsdppx
            FOR ALL ENTRIES IN DATA_PACKAGE
              WHERE material = DATA_PACKAGE-material
                AND plant    = DATA_PACKAGE-plant
                AND calyear  = DATA_PACKAGE-calyear.
    *   enhance DATA_PACKAGE with the target's additional fields.
        LOOP AT DATA_PACKAGE.
          CLEAR : wa_datapak1, wa_ydbsdppx.
          MOVE-CORRESPONDING DATA_PACKAGE TO wa_datapak1.
          READ TABLE it_ydbsdppx INTO wa_ydbsdppx
            WITH KEY material = DATA_PACKAGE-material
                        plant = DATA_PACKAGE-plant
                      calyear = DATA_PACKAGE-calyear.
          IF sy-subrc NE 0.       "new product price
            APPEND wa_datapak1 TO datapak1.
          ELSE.                   "a product price already exists
            IF wa_ydbsdppx-calmonth GE DATA_PACKAGE-calmonth.
    *         keep the oldest one (per year), or overwrite the price if same month
              APPEND wa_datapak1 TO datapak1.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    * if ABORT is not equal to zero, the update process will be cancelled
      ABORT = 0.
    *$*$ end of routine - insert your code only before this line         *-*
    ENDFORM.
    Edited by: mansi dandavate on Jun 17, 2010 12:32 PM

  • Data Package 1 ( 0 Data Records ) at DTP with Status RED

    Hi All,
    For 0 records at DTP level it shows the overall status and technical status as RED, with yellow beside Data Package 1 (0 Data Records). There is no short dump and no error message. At PSA level, the status tab shows status 8, which says there is no data on the R/3 side for this particular load. Please help me out.
    Regards,
    Krishna.

    Hi,
    if the traffic light is not highlighted, you are probably running a delta.
    You will have to set the traffic light in the corresponding init
    (and run the init again);
    the setting in the delta will then be the same.
    Udo

  • Error In DTP:Data package 1 / XDateX  XTimeX Status 'Processed with Errors'

    Hi,
    While pushing data from the DSO (0PP_DS01, Planned/Actual Comparison Order/Material View) to cube 0PP_C15 (Backlogged Production Orders),
    I get the error: Data package 1 / XDateX XTimeX status 'Processed with Errors'.
    When I check the long text, it shows the same text, i.e. Data package 1 / XDateX XTimeX status 'Processed with Errors'.
    It does not show any further error message that I could act on.
    If anyone has come across this error, please provide the resolution.
    Thanks,
    Adhvi Rao

    Hi Adhvi Rao,
    Are you using a DTP to load the data to the target? Then you can enable the error stack to collect the erroneous records.
    Also check whether you have activated the request in your source DSO.
    Regards
    Dinesh

  • Similar issue: data package

    Hi all,
    I have to load data from an ODS to a cube. The requirement is that, while loading, records with the same key should be selected into the same data package. I am using a 3.5 update rule and InfoPackage for updating the target.
    For example:
    Article Area
    123    01
    123    02
    123    03
    I need all records for article 123 to be selected into the same data package; a scattered selection shouldn't be allowed. The purpose of selecting the same article into the same data package is that only then will the logic written in the update rule work correctly.
    If any body have idea about this please help me.
    Thanks in advance,

    You can't guarantee that all records with the same key land in the same data package; there will always be a chance that one or more records come in the next data package.
    You could increase the size of the data package in the InfoPackage scheduler
    (DataS default size, or something like this), so that all data comes in one data package.
    Lots of people have this problem, but it is a limitation in BW 3.x. A workaround is to collect all packages into a global table in the update routine and process them in the last package, as in the inter-package collection routine shown above.
    Hope this helps.

  • Copy Data Package (Referencing Target Issue)

    Hello Experts,
    We are trying to run an allocation, but at runtime the application doesn't recognize the target variable of the data package.
    This is the Script Logic we are using:
    *RUNALLOCATION
    *FACTOR = 1
    *DIM VERSAO WHAT=$VF$; WHERE = $VT$
    *ENDALLOCATION
    This is the Data Package Script used to set the parameters:
    PROMPT(COPYMOVEINPUT,%SELECTION%,%TOSELECTION%,"Test","VERSAO")
    INFO(%EQU%,=)
    INFO(%TAB%,;)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,SUSER,%USER%)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,SAPPSET,ORC_GERENC)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,SAPP,Desp_Resp_Oper)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,SELECTION,%SELECTION%)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,REPLACEPARAM,%VF%EQU%%VERSAO_SET%%TAB%VT%EQU%%VERSAO_TO%)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,LOGICFILENAME,TESTE.LGF)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,TAB,%TAB%)
    TASK(ZBPC_DESP_RATEIOS_RUN_1,EQU,%EQU%)
    When we run the data package, we receive a successful status message, but in the log we can see that the SAP BPC routine doesn't recognize our target. It understands that VERSAO_SET is the source of the allocation, but it just doesn't read the target value of the prompt, for which we are using the variable VERSAO_TO.
    LOG BEGIN TIME:2009-08-10 10:39:14
    FILE:\ROOT\WEBFOLDERS\ORC_GERENC\ADMINAPP\Desp_Resp_Oper\TESTE.LGF
    USER:CBD\24613820
    APPSET:ORC_GERENC
    APPLICATION:Desp_Resp_Oper
    FACTOR:1
    ALLOCATION DATA REGION:
    VERSAO:PTPL
    VERSAO:WHAT:PTPL,WHERE:%VERSAO_TO%,USING:,TOTAL:
    In this sample, "PTPL" is the version we selected as the source, and the target we selected is "COM", but the variable can't read it.
    Assuming that "_SET" is used to reference the first variable of the prompt, could you please clarify which tag we should use to reference the second variable of the prompt?
    Thanks in advance!
    Edited by: Adalberto  Vides Barbosa on Aug 10, 2009 7:26 PM

    Hi,
    As discussed by the experts earlier, the problem is getting the variable %VERSAO_TO%.
    The variable %VERSAO_SET% is a dynamic variable used by BPC to get the value from the DM package; the other variable, however, is not recognized by the system.
    Hope this helps.

  • Master data loading issue

    Hi gurus,
    Presently I am working on BI 7.0, and I have a small issue regarding master data loading.
    I have a generic DataSource for master data and have to fetch this data to the BW side; it always has to be a full load to the master data object. The first time, I scheduled the InfoPackage and ran the DTP; no issues, the data loaded successfully. But whenever I run the InfoPackage a second time and then run the DTP, I get an error saying duplicate records.
    How can I handle this?
    Best Regards
    Prasad

    Hi Prasad,
    Following is what happens in your case:
    Loading 1st time:
    1. Data is loaded to PSA through the InfoPackage; it is a full load.
    2. Data is loaded to the InfoObject through the DTP.
    Loading 2nd time:
    1. Data is again loaded to PSA; it is a full load.
    2. At this point, the data in PSA itself is duplicated. So when you run the DTP, it picks up the data of both requests that were loaded to PSA, and hence you get the duplicate record error.
    Please clear the PSA after the data has been loaded to the InfoObject.
    Assign points if helpful.
    Regards,
    Tej Trivedi

  • Update Master Data Attributes yellow in DTP

    Hi All,
    After loading the master data via DTP, the request becomes green, but the "Update Master Data Attributes" step is still yellow. Let me know how to overcome this.
    Thanks,
    Sathya

    Hi,
    This is a master data full load and it updates more than 1 lakh (100,000) records. Technically, the overall and technical status of my DTP request becomes green, but when I go to the details tab and drill down into the data package, the "Update Master Data Attributes" step still remains yellow, so I don't think my load updates the attributes of my InfoObject successfully.
    Is there a solution to overcome this, or did anyone face a similar issue?
    Thanks
    Sathya

  • Unable to load the data into Cube Using DTP in the quality system

    Hi,
    I am unable to load data from PSA to the cube using a DTP in the quality system for the first time.
    I am getting errors like "Data package processing terminated" and "Source TRCS 2LIS_17_NOTIF is not allowed".
    Please suggest.
    Thanks,
    Satyaprasad

    Hi,
    Some InfoObjects were missing while collecting the transport.
    I collected those objects and transported them again; now it is working fine.
    Many thanks to all.
    Regards,
    Satyaprasad

  • Data package is missing in the return structure

    Hi BW Folks,
    I have an issue with ODS activation. While activating the data in the ODS object, I get the following error message:
    Activation of data records from ODS object XXXX terminated.
    Data package XXXXX contains errors with status 9 in table 'XX', but this data package is missing in the return structure.
    In detail: the data package is entered in the return structure as incorrect.
    Can anyone provide me with a solution? Thanks in advance. Have a nice time!
    Regards,
    Nani.

    HI
    Check these links
    Re: Status 9 error when activating an ODS in a Process Chain
    ODS activation error - status 9
    Error while data loading-terminated with Status 9
    Error while data loading-terminated with Status 9
    hope it helps
    regards
    CK
    Assign points if useful

  • Same set of Records not in the same Data package of the extractor

    Hi All,
    I have the following scenario: while extracting records from ECC, based on some condition, I want to add more records, i.e. add additional lines of data via APPEND C_T_DATA.
    For example:
    I have a set of records with the same company code, same contract, same delivery leg and different pricing legs.
    If the delivery leg and pricing leg are both 1, I want to add one line of record.
    There will be several records with the same company code, contract, delivery leg and pricing leg. In the extraction logic I extract with i_t_data[] = c_t_data[], then sort by company code, contract, delivery and pricing leg, then DELETE ADJACENT DUPLICATES to get one record; based on this record, under some condition, I populate the new line that the business needs.
    My concern is:
    if the same set of records overshoots the data package size, how do I handle this? Is there any option?
    My data package size is 50,000. Suppose I get a set of records with the same company code, contract, delivery leg and pricing leg starting at record 49,999, and there are 10 records with these characteristics. The extraction then happens in 2 data packages, the delete-duplicates step misses some of them, and the above logic goes wrong. How can I handle this scenario? Would a delta-enabled function module help me tackle it? I want to do this purely in extraction, as a DataSource enhancement.
    Anil.
    Edited by: Anil on Aug 29, 2010 5:56 AM

    Hi,
    You will have to do the enhancement of the DataSource.
    Please follow the link below.
    You can write your logic to add the additional records in the CASE statement for your DataSource (a sketch follows below).
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035c402-3d1a-2d10-4380-af8f26b5026f?quicklink=index&overridelayout=true
    Hope this will solve your issue.
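    A minimal, hypothetical sketch of such an enhancement in the customer exit include ZXRSAU01 (called from EXIT_SAPLRSAP_001 for transaction data). The DataSource name ZMY_DATASOURCE, the extract structure ZOXEXT001 and its DELIV_LEG/PRICE_LEG fields are invented for illustration, and note that this does not by itself solve the cross-package problem raised above:

    * ZXRSAU01 fragment (hypothetical names): derive one extra record
    * where delivery leg and pricing leg are both 1.
    DATA: ls_ext TYPE zoxext001,                   " extract structure (hypothetical)
          ls_new TYPE zoxext001,
          lt_new TYPE STANDARD TABLE OF zoxext001. " derived lines

    CASE i_datasource.
      WHEN 'ZMY_DATASOURCE'.
        LOOP AT c_t_data INTO ls_ext.
          IF ls_ext-deliv_leg = '1' AND ls_ext-price_leg = '1'.
            ls_new = ls_ext.
            " ... adjust the fields of the derived line as the business requires ...
            APPEND ls_new TO lt_new.
          ENDIF.
        ENDLOOP.
        " append after the loop so the derived lines are not re-processed
        APPEND LINES OF lt_new TO c_t_data.
    ENDCASE.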
