Large number of characteristic values to filter: Excel BEx and BEx Web

Hi, I have an Excel BEx workbook where the users have set up a Customer Payer filter with 75 customers. They are able to add or remove entries in this workbook and simply save the workbook for the next run.
Now I am converting this workbook/query to the Web and trying to figure out how to give them the same functionality in BEx Web. Does anyone know how I would accomplish this?

Hi Kenneth,
We have faced the same issue. We did not manage to solve it using the web interface, so we used the following workaround:
We created a simple input program that the user can execute to upload an Excel file into a write-optimized DSO.
We then created a customer exit variable that filters the query execution by looking up the write-optimized DSO; see the sketch below.
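A minimal sketch of such a customer exit variable (BI 7.0 style, in include ZXRSRU01 of enhancement project RSR00001). The variable ZCUST_PAYER, the DSO active table /BIC/AZPAYER00 and its field /BIC/ZPAYER are hypothetical names; substitute your own objects:

* ZXRSRU01 - sketch only; all object names below are placeholders
data l_s_range like line of e_t_range.

case i_vnam.
  when 'ZCUST_PAYER'.        "exit variable on Customer Payer, not ready for input
    if i_step = 2.           "processed after the variable popup
*     build the filter from the payer list uploaded to the write-optimized DSO
      select /bic/zpayer from /bic/azpayer00 into l_s_range-low.
        l_s_range-sign = 'I'.
        l_s_range-opt  = 'EQ'.
        append l_s_range to e_t_range.
      endselect.
    endif.
endcase.

The query then filters Customer Payer through this variable, so maintaining the list is just a matter of re-uploading the Excel file.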
Regards,

Similar Messages

  • Passing a large number of column values to an Oracle insert procedure

    I am quite new to the ODP space, so can someone please tell me the best and most efficient way to pass a large number of column values to an Oracle procedure from my C# program? Passing a small number of values as parameters seems OK, but when there are many, this seems inelegant.

    "Passing a small number of values as parameters seems OK, but when there are many, this seems inelegant."
    Is it possible that your table with a staggering number of columns, or a method that collapses without so many inputs, is ultimately what is inelegant?
    I once did a database conversion from a VAX RMS system with a "table" with 11,000 columns to a normalized schema in an Oracle database. That was inelegant.
    Michael O
    http://blog.crisatunity.com

  • Conversion of characteristic values before displaying in Bex Analyzer

    Dear all,
    does anybody know whether it is possible to influence the manner in which a characteristic is embedded into a BEx Analyzer sheet (maybe via a user exit, function module exit or something else)?
    In our project we have characteristics (describing production parameters) with numeric values which cannot be used successfully with the Excel AutoFilter functions, because they arrive as strings in Excel. Due to our data warehouse design concept it is not acceptable for us to store these values as key figures. So we are searching for a method to convert these string values into numeric values before they are passed into the Excel sheet.
    Many thanks in advance!
    Dorothea Gebauer.

    Hi Leo Gillian,
    first of all, thank you very much for your answer. Unfortunately your suggestion is not practicable for us. There are hundreds of characteristics, and in the future there may be more than two thousand, which have to be converted before they are embedded into the sheets. Therefore we need an automatic conversion.
    With best regards, Dorothea Gebauer

  • Separating values from an excel file and plotting

    Hello,
    I am using the Read Excel Table VI to read the values from an Excel file, and I see the data in the output.
    I want to take the first two columns, multiply each value by 20m (0.02), and then plot them in an XY graph.
    I am attaching the files below.
    Thanks in advance 
    Attachments:
    test data.xls (436 KB)
    ReadExcelTable[1].vi (23 KB)

    Hello,
    I achieved the multiplication in a different way, but I have another problem. I am only considering two columns in the above Excel sheet. After multiplying each value by 0.02 I get another array for the two columns, with values say a, b, c, d and a1, b1, c1, d1 and so on.
    The plot I require is of (a, a1), (a+b, a1+b1), (a+b+c, a1+b1+c1) and so on, i.e. the running totals of the two columns.
    Any ideas that might help me? I am attaching the VI that solves the problem up to the multiplication.
    Thanks in advance.
    Attachments:
    INS_plot.vi (59 KB)
    test data.txt (246 KB)

  • How to increase the number of rows to be displayed in BEx Web Analyzer

    hi,
    I am viewing reports using the BEx Web Analyzer option. It is displaying only 24 rows of data at a time. How do I increase the number of rows? Do I need to change some kind of setting?
    Please reply.

    Hi,
    I think the standard template in 2004s is named 0ANALYSIS_PATTERN. You can find it in Web Application Designer: choose Open and search for the technical name. If you want to change the number of rows or columns, you can find this setting near the bottom of the XHTML view of the template:
    <bi:BLOCK_COLUMNS_SIZE value="4" />
    <bi:BLOCK_ROWS_SIZE value="17" />
    Then just save, and ignore the error messages.

  • How to move large number of internal table data to excel by program

    Hi,
    I am working on a classical report where my requirement is:
    I have around 25 internal tables which I display using a selection screen. I need to transfer the data of all these internal tables to an Excel file on execution.
    Now, let me know how I can transfer all of those to Excel on execution.
    P.S.: GUI_DOWNLOAD and the other Excel-download-related FMs are used to transfer single or few internal tables.
    But here I need to download the data of 25 internal tables through the program.
    How can I meet the requirement?
    Kindly advise.
    Thanks,
    Shiv.

    Hi,
    Refer to the following code:
    *&---------------------------------------------------------------------*
    *& Report  ZDOWNLOAD_PROGRAM
    *&---------------------------------------------------------------------*
    report zdownload_program.

    parameters p_path type rlgrap-filename default 'C:\Pdata.xls'.

    data: gt_output type standard table of trdirt,
          wa_output type trdirt,
          p_filen   type string.

    * F4 help for the download path
    at selection-screen on value-request for p_path.
      clear p_path.
      call function 'F4_FILENAME'
        importing
          file_name = p_path.

    start-of-selection.
    * example data: the names of all customer programs
      select * from trdirt
               into table gt_output
               where name like 'Z%'
                  or name like 'Y%'.

    end-of-selection.
      p_filen = p_path.

    * download the internal table as a tab-separated file that Excel can open
      call function 'GUI_DOWNLOAD'
        exporting
          filename                = p_filen
          filetype                = 'ASC'
          write_field_separator   = 'X'   "tab between columns
        tables
          data_tab                = gt_output
        exceptions
          file_write_error        = 1
          no_batch                = 2
          gui_refuse_filetransfer = 3
          invalid_type            = 4
          no_authority            = 5
          unknown_error           = 6
          header_not_allowed      = 7
          separator_not_allowed   = 8
          filesize_not_allowed    = 9
          header_too_long         = 10
          dp_error_create         = 11
          dp_error_send           = 12
          dp_error_write          = 13
          unknown_dp_error        = 14
          access_denied           = 15
          dp_out_of_memory        = 16
          disk_full               = 17
          dp_timeout              = 18
          file_not_found          = 19
          dataprovider_exception  = 20
          control_flush_error     = 21
          others                  = 22.
      if sy-subrc <> 0.
        message id sy-msgid type sy-msgty number sy-msgno
                with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      endif.
    Regards
    Rajesh Kumar
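    The report above downloads a single internal table. For the original requirement of 25 tables, one hedged approach (a sketch, not a tested solution) is to call GUI_DOWNLOAD once per table, using its APPEND parameter so that every call after the first appends to the same file. Here gt_tab1/gt_tab2 and the subroutine name are illustrative placeholders:

    * sketch: write several internal tables into one file via APPEND
    perform download_table tables gt_tab1 using ' '.   "first call creates the file
    perform download_table tables gt_tab2 using 'X'.   "subsequent calls append
    * ...repeat for the remaining tables...

    form download_table tables pt_data using p_append type c.
      call function 'GUI_DOWNLOAD'
        exporting
          filename              = p_filen    "global path from the report above
          filetype              = 'ASC'
          append                = p_append
          write_field_separator = 'X'
        tables
          data_tab              = pt_data
        exceptions
          others                = 1.
      if sy-subrc <> 0.
        message id sy-msgid type sy-msgty number sy-msgno
                with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      endif.
    endform.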

  • How to combine large number of key-value pair tables into a single table?

    I have 250+ key-value pair tables with the following characteristics
    1) keys are unique within a table but may or may not be unique across tables
    2) each table has about 2 million rows
    What is the best way to create a single table with all the unique key-values from all these tables? The following two queries work up to about 150 tables:
    with
      t1 as ( select 1 as key, 'a1' as val from dual union all
              select 2 as key, 'a1' as val from dual union all
              select 3 as key, 'a2' as val from dual )
    , t2 as ( select 2 as key, 'b1' as val from dual union all
              select 3 as key, 'b2' as val from dual union all
              select 4 as key, 'b3' as val from dual )
    , t3 as ( select 1 as key, 'c1' as val from dual union all
              select 3 as key, 'c1' as val from dual union all
              select 5 as key, 'c2' as val from dual )
    select coalesce(t1.key, t2.key, t3.key) as key
    ,      max(t1.val) as val1
    ,      max(t2.val) as val2
    ,      max(t3.val) as val3
    from t1
    full join t2 on ( t1.key = t2.key )
    full join t3 on ( t2.key = t3.key )
    group by coalesce(t1.key, t2.key, t3.key)
    with
      master as ( select rownum as key from dual connect by level <= 5 )
    , t1 as ( select 1 as key, 'a1' as val from dual union all
              select 2 as key, 'a1' as val from dual union all
              select 3 as key, 'a2' as val from dual )
    , t2 as ( select 2 as key, 'b1' as val from dual union all
              select 3 as key, 'b2' as val from dual union all
              select 4 as key, 'b3' as val from dual )
    , t3 as ( select 1 as key, 'c1' as val from dual union all
              select 3 as key, 'c1' as val from dual union all
              select 5 as key, 'c2' as val from dual )
    select m.key as key
    ,      t1.val as val1
    ,      t2.val as val2
    ,      t3.val as val3
    from master m
    left join t1 on ( t1.key = m.key )
    left join t2 on ( t2.key = m.key )
    left join t3 on ( t3.key = m.key )
    /

    A couple of questions, then a possible solution.
    Why on earth do you have 250+ key-value pair tables?
    Why on earth do you want to consolidate them into one table with one row per key?
    You could do a pivot of all of the tables, without joining; something like:
    with
      t1 as ( select 1 as key, 'a1' as val from dual union all
              select 2 as key, 'a1' as val from dual union all
              select 3 as key, 'a2' as val from dual )
    , t2 as ( select 2 as key, 'b1' as val from dual union all
              select 3 as key, 'b2' as val from dual union all
              select 4 as key, 'b3' as val from dual )
    , t3 as ( select 1 as key, 'c1' as val from dual union all
              select 3 as key, 'c1' as val from dual union all
              select 5 as key, 'c2' as val from dual )
    select key, max(t1val), max(t2val), max(t3val)
    FROM (select key, val t1val, null t2val, null t3val
          from t1
          union all
          select key, null, val, null
          from t2
          union all
          select key, null, null, val
          from t3)
    group by key
    If you can do this in a single query, unioning all 250+ tables, then you do not need to worry about chaining or migration. It might be necessary to do it in a couple of passes, depending on the resources available on your server. If so, I would be inclined to create the table first, with a larger than normal percent free, then do the first set as a straight insert and the remaining pass or passes as a merge.
    Another alternative might be to use the approach above but limit the range of keys in each pass: pass one would have a predicate like "where key between 1 and 10" in each branch of the union, pass two "where key between 11 and 20", and so on. That way everything would be straight inserts.
    Having said all that, I go back to my second question above, why on earth do you want/need to do this? What is the business requirement you are trying to solve. There might be a much better way to meet the requirement.
    John

  • How to add a large number of keywords to the e-mail filter?

    Hello.
    I would like to know how to add a large number of keywords to a filter.
    What I want to accomplish is to add around 4000 e-mail addresses to a filter list which checks the body of incoming e-mails that get forwarded to me.
    I don't want to outright delete them; I would love it if, when it detects that the forwarded message contains one of the e-mail addresses, it added a tag to the message.
    Is it in any way possible to make a filter like this which doesn't slow Thunderbird down to a crawl?
    I tried to copy the whole list into the small filter tab, but it had no discernible effect on my messages, since some of the previously received ones, which I was sure contained the keywords, were not tagged. All it did was make the program super slow, and I was forced to delete the filter.

    You can look at creating an exclusion crawl rule:
    http://technet.microsoft.com/en-us/library/jj219686(v=office.15).aspx
    You can also modify your content source starting addresses and remove OneDrive:
    http://technet.microsoft.com/en-us/library/jj219808(v=office.15).aspx
    Blog | SharePoint Field Notes Dev Tools | SPFastDeploy | SPRemoteAPIExplorer

  • Hi. I used a function module to change the characteristic values of a sales order

    Hi. I used a function module to change the characteristic values of a sales order,
    but the sales order's characteristic values didn't change,
    and the function module doesn't produce any log message.
    Please tell me what is wrong with the code below and how to solve this problem.
    If my approach is wrong, what data should I pass to change the characteristic values?
    DATA: LT_E1CUVAL    TYPE TABLE OF E1CUVAL.
      DATA: WA_E1CUVAL    TYPE E1CUVAL.
      DATA: LS_CFG_HEAD   LIKE CUXT_CUCFG_S,
            LS_INSTANCES  LIKE CUXT_CUINS_S,
            LS_VALUES     LIKE CUXT_CUVAL_S,
            LS_E1CUCFG    LIKE E1CUCFG,
            LS_E1CUINS    LIKE E1CUINS,
            LS_E1CUVAL    LIKE E1CUVAL,
            LS_PROFILE    LIKE E1CUCOM,
            LS_VBAP       LIKE VBAP,
            L_CUOBJ       LIKE INOB-CUOBJ,
            L_ATINN       LIKE CABN-ATINN.
      DATA: LT_INSTANCES  LIKE CUXT_CUINS_S OCCURS 0,
            LT_PART_OF    LIKE CUXT_CUPRT_S OCCURS 0,
            LT_VALUES     LIKE CUXT_CUVAL_S OCCURS 0,
            LT_VAR_KEYS   LIKE CUXT_CUVK_S  OCCURS 0,
            LT_KSML       LIKE KSML         OCCURS 0 WITH HEADER LINE,
            BEGIN OF LT_CLINT OCCURS 0,
              CLINT  LIKE KSSK-CLINT,
            END OF LT_CLINT.
      DATA: LT_CUIB       LIKE CUIB_CUOBJ_S OCCURS 0 WITH HEADER LINE.
      DATA: E_ROOT_INSTANCE           TYPE     CUXT_INSTANCE_NO.
      DATA: EV_ROOT_PERSIST_ID     TYPE     IBEXTINST_DATA-EXT_INST_ID.
      DATA: EV_CFG_HAS_CHANGED     TYPE     XFELD.
      DATA: EV_HANDLE_APPL_LOG     TYPE     BALLOGHNDL.
      DATA: L_CUOBJ_NEW           TYPE CUOBJ.
      DATA: L_OWNER               TYPE IBXX_BUSINESS_OBJECT.
      REFRESH LT_E1CUVAL.
      CLEAR LS_VBAP.
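    * read the configuration reference (CUOBJ) of the sales order item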
      SELECT SINGLE CUOBJ INTO CORRESPONDING FIELDS OF LS_VBAP
                                FROM VBAP WHERE VBELN = I_VBELN
                                            AND POSNR = I_POSNR.
      IF SY-SUBRC <> 0.
        RAISE INSTANCE_NOT_FOUND.
      ENDIF.
      REFRESH LT_CUIB. CLEAR LT_CUIB.
      LT_CUIB-INSTANCE = LS_VBAP-CUOBJ.
      APPEND LT_CUIB.
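    * load this instance into the configuration buffer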
      CALL FUNCTION 'CUCB_INITIALIZER'
        EXPORTING
          IT_INSTANCES = LT_CUIB[].
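    * read the complete current configuration of the root instance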
      CALL FUNCTION 'CUXI_GET_SINGLE_CONFIGURATION'
        EXPORTING
          I_ROOT_INSTANCE              = LS_VBAP-CUOBJ
        IMPORTING
          E_CFG_HEAD                   = LS_CFG_HEAD
          ES_PROFILE                   = LS_PROFILE
          ET_RETURN                    = ET_RETURN
        TABLES
          E_TAB_INSTANCES              = LT_INSTANCES
          E_TAB_PART_OF                = LT_PART_OF
          E_TAB_VALUES                 = LT_VALUES
          E_TAB_VAR_KEYS               = LT_VAR_KEYS
        EXCEPTIONS
          INVALID_INSTANCE             = 1
          NO_ROOT_INSTANCE             = 2
          INSTANCE_IS_A_CLASSIFICATION = 3
          INTERNAL_ERROR               = 4
          NO_PROFILE_FOUND             = 5
          INVALID_DATA                 = 6
          OTHERS                       = 7.
      IF SY-SUBRC <> 0.
        CASE SY-SUBRC.
          WHEN 1.
            RAISE INSTANCE_NOT_FOUND.
          WHEN 3.
            RAISE INSTANCE_IS_A_CLASSIFICATION.
          WHEN OTHERS.
            RAISE INVALID_DATA.
        ENDCASE.
      ELSE.
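    * keep only the two packaging characteristics and overwrite their values with test data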
        LOOP AT LT_VALUES INTO LS_VALUES.
          IF    LS_VALUES-CHARC = 'SAP_MILLCA_PACKAGING'
             OR LS_VALUES-CHARC = 'PD_CA_PACKING_DM'.
            LS_VALUES-VALUE = '7100010'. "This is test data
            MODIFY LT_VALUES FROM LS_VALUES.
          ELSE.
            DELETE LT_VALUES WHERE CHARC = LS_VALUES-CHARC.
          ENDIF.
          CLEAR LS_VALUES.
        ENDLOOP.
      ENDIF.
    * change the sales order's characteristic values
      CALL FUNCTION 'CUXI_SET_SINGLE_CONFIGURATION'
        EXPORTING
          I_CFG_HEADER                  = LS_CFG_HEAD
          I_ROOT_INSTANCE               = LS_VBAP-CUOBJ
    *     I_PLANT                       =
    *     I_STRUCTURE_EXPLOSION_DATE    =
    *     I_STRUCTURE_EXPLOSION_APPL_ID =
    *     I_LOGSYS                      =
          IS_PROFILE                    = LS_PROFILE
    *     IV_ONLY_SINGLE_LEVEL          =
    *     IV_HANDLE_APPL_LOG            =
    *     IV_OBJECT_APPL_LOG            = 'CIF'
    *     IV_SUBOBJECT_APPL_LOG         = 'T_CNFG'
        IMPORTING
          E_ROOT_INSTANCE               = E_ROOT_INSTANCE
          EV_ROOT_PERSIST_ID            = EV_ROOT_PERSIST_ID
          EV_CFG_HAS_CHANGED            = EV_CFG_HAS_CHANGED
          EV_HANDLE_APPL_LOG            = EV_HANDLE_APPL_LOG
          ET_RETURN                     = ET_RETURN
        TABLES
          I_TAB_INSTANCES               = LT_INSTANCES
          I_TAB_PART_OF                 = LT_PART_OF
          I_TAB_VALUES                  = LT_VALUES
          I_TAB_VAR_KEYS                = LT_VAR_KEYS
    *     I_TAB_BLOB                    =
        EXCEPTIONS
          NO_CONFIGURATION_DATA         = 1
          NO_ROOT_INSTANCE              = 2
          INVALID_INSTANCE              = 3
          INSTANCE_IS_A_CLASSIFICATION  = 4
          INTERNAL_ERROR                = 5
          NO_PROFILE_FOUND              = 6
          INVALID_DATA                  = 7
          OTHERS                        = 8.
      IF SY-SUBRC <> 0.
        CASE SY-SUBRC.
          WHEN 1.
            RAISE NO_CONFIGURATION_DATA.
          WHEN 2.
            RAISE NO_ROOT_INSTANCE.
          WHEN 3.
            RAISE INVALID_INSTANCE.
          WHEN 4.
            RAISE INSTANCE_IS_A_CLASSIFICATION.
          WHEN 5.
            RAISE INTERNAL_ERROR.
          WHEN OTHERS.
            RAISE INVALID_DATA.
        ENDCASE.
      ENDIF.
      COMMIT WORK.
    * save configuration with next commit
      CLEAR: LS_INSTANCES.
      READ TABLE LT_INSTANCES INTO LS_INSTANCES INDEX 1.
    * L_OWNER-OBJECT_TYPE = LS_INSTANCES-OBJ_TYPE.  "overridden by the fixed value below
      L_OWNER-OBJECT_TYPE = 'PVS_POSVAR'.
      L_OWNER-OBJECT_KEY  = LS_INSTANCES-OBJ_KEY.
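    * write the buffered configuration back to the database; it becomes effective with the next COMMIT WORK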
      CALL FUNCTION 'CUCB_CONFIGURATION_TO_DB'
        EXPORTING
          ROOT_INSTANCE         = LS_VBAP-CUOBJ
          ROOT_OBJECT           = L_OWNER
        IMPORTING
          NEW_INSTANCE          = L_CUOBJ_NEW
        EXCEPTIONS
          INVALID_INSTANCE      = 1
          INVALID_ROOT_INSTANCE = 2
          NO_CHANGES            = 3
          OTHERS                = 4.
      IF SY-SUBRC > 1 AND SY-SUBRC <> 3.
        CLEAR LS_VBAP-CUOBJ.
        RAISE INTERNAL_ERROR.
      ELSEIF SY-SUBRC = 1.
        LS_VBAP-CUOBJ = L_CUOBJ_NEW.
      ENDIF.
    What's wrong?
    help me to solve this problem.
    Thanks a lot.

    SD_SALES_DOCUMENT_READ - reads sales document header and business data: tables VBAK, VBKD and VBPA (sold-to (AG), payer (RG) and ship-to (WE) parties)
    SD_SALES_DOCUMENT_READ_POS - reads sales document header and item material: tables VBAK, VBAP-MATNR
    SD_DOCUMENT_PARTNER_READ - partner information including address; calls SD_PARTNER_READ
    SD_PARTNER_READ - all the partner information and addresses
    SD_DETERMINE_CONTRACT_TYPE
    In: at least VBAK-VBELN
    Exceptions: NO_CONTRACT | SERVICE_CONTRACT | QUANTITY_CONTRACT
    SD_SALES_DOCUMENT_COPY
    RV_ORDER_FLOW_INFORMATION - reads the sales document flow after delivery and billing
    SD_SALES_DOCUMENT_SAVE - creates a sales document from the copied document
    SD_SALES_DOCUMENT_ENQUEUE - to dequeue, use DEQUEUE_EVVBAKE
    RV_DELIVERY_PRINT_VIEW - data provision for delivery note printing
    SD_PACKING_PRINT_VIEW
    SD_DELIVERY_VIEW - data collection for printing; called from RV_DELIVERY_PRINT_VIEW and SD_PACKING_PRINT_VIEW
    RV_BILLING_PRINT_VIEW - data provision for billing document print
    Regards,
    Vinod

  • Large number of entries in Queue BW0010EC_PCA_1

    Dear BW experts,
    Our BW 2004s system is extracting data from R/3 700. I am a Basis guy, and I am observing a large number of entries in SMQ1 in the R/3 system under queue BW0010EC_PCA_1. I observe a similar number of entries in RSA7 for 0EC_PCA_1.
    The number of entries for this queue in SMQ1 is 50,000+ every day. The extraction job runs in BW every day at 5:00 AM, but it only clears data lying there from 2 days before (for example, on 10.09.2010 it will extract the data of 08.09.2010).
    My questions:
    1. Is it OK that such a large number of entries is lying in the queue in SMQ1 and that they are extracted once a day by a batch job? Does that mean the scheduler is not pushing these entries?
    2. Any idea why the extraction job only fetches data from 2 days before? Is some setting missing somewhere?
    Many thanks in advance for your valuable comments

    Hi,
    The entries lying in RSA7 and SMQ1 are one and the same. In SMQ1, the BW0010EC_PCA_1 entry means that this data is waiting to be sent across to your BW client 001 system, whereas in RSA7 the same data is displayed as 0EC_PCA_1.
    1. Is it OK that such a large number of entries is lying in the queue in SMQ1 and that they are extracted once a day by a batch job? Does that mean the scheduler is not pushing these entries?
    As I understand from the data lying in your R/3 system in SMQ1, this DataSource uses Direct Delta as its delta mode. SAP recommends Queued Delta if the number of postings for a particular application is greater than 100,000 per day. Since in your system it is in the thousands, the BI guys would have kept it as Direct Delta. So these entries lying in SMQ1 are no problem at all. As for the scheduler from BI, it will pick up these entries every morning to clear the queue of the previous data.
    2. Any idea why the extraction job only fetches data from 2 days before? Is some setting missing somewhere?
    I don't think it is only fetching the data from 2 days before. The delta concept works in such a manner that once you have pulled a delta load from RSA7, the data will still be lying there under the repeat delta section until the next delta load has finished successfully.
    Since in your system data is pulled only once a day, even though today's data load has pulled yesterday's data, it will still be lying in the system until tomorrow's delta load from RSA7 is successful.
    Hope this was helpful.

  • Large number of favorites (maps 3)

    Hi
    I figured out how to convert a large number of POIs (~2000) to an LMX file and import them into Maps. So far so good. Today I tried the Ovi Maps web browser application and the synchronize feature. It turned out that with this number of favorites, Ovi Maps in the browser is practically unusable. Then I tried to delete the extra favorites. No problem on the phone, but the browser version does not seem to allow selecting multiple favorites for deletion... On synchronizing, the phone put back my 2000 favorites... The only solution was to kill my Ovi account and create a new one!
    My question is: will this poor handling of a large number of favorites be resolved soon? Is there anything I can do to make things smoother in the meantime?
    thanks
    klaus

    UPDATE:
    It seems it was a hung process with the AVCMU agent on one of the front-end servers. A simple reboot over the weekend took care of the problem.
    John K. Boslooper | Lync Technical Specialist | Project Leadership Associates | Phone: 312.448.2269 | www.projectleadership.net

  • Problems syncing large number of photos while choosing all photos option

    I have 17,800 photos and have experienced problems trying to sync in iTunes since release day 1.
    If I choose to sync using the "all photos" option, the iPad reboots during the sync, which then appears to finish. The iPad becomes unresponsive and hangs with the Apple logo on screen, and a hard reset is then required.
    When I then launch the Photos app on the iPad, it displays "updating library" for a minute or two and then crashes back to the home screen.
    I have tried many things to solve this, including restoring the iPad software, deleting the photo cache in iTunes, rebuilding the iTunes library, etc.
    The only solution I have found was to break my 17,800 photos down into events and sync events instead of the "all photos" option; that works, getting all 17,800 photos onto the iPad. The photos take about 5.8 GB of storage.
    There is definitely a problem with iTunes here when syncing a large number of photos using the "all photos" option, and I have reported it to AppleCare via two support calls. There does not appear to be any awareness of this issue in tech support's knowledge base, and I am surprised there are not a large number of incidents reported, as a few discussions on photo sync issues have been started here.
    Hopefully this post will help others with the same issue.

    Solved the problem of syncing an iPad with a desktop containing a large photo library (20,000 photos). In summary, the solution was to rename the iPhoto library, copy it to an external hard drive, and reduce the size of the library on the home (desktop) drive.
    In the home directory, rename "iPhoto Library" to "iPhoto Global" (or any other name you want), and copy it onto an external hard drive via a simple drag and drop.
    Then go back to the newly renamed iPhoto Global and rename it again, to iPhoto 2010. Now open iPhoto while holding down the Alt/Option key. This provides the option to choose the library iPhoto 2010. Open the library and eliminate every photo from before 2010. This got us down to a few thousand photos and a much smaller library. Sync this smaller library with the iPad.
    Finally, I suggest downloading the program iPhoto Library Manager and using it to organize and access the two libraries you now have (which could be 2, 10, or however many libraries you want to use). iPhoto doesn't naturally want you to have more than one library, but iPhoto Library Manager allows the user to segregate libraries for different purposes (e.g. personal, work, 2008, 2009, etc.). Google iPhoto Library Manager, download it, and look for links to the video tutorials, which were very helpful.
    I did not experience any problems with iPhoto sequencing, so I can't address that concern.
    Good luck!

  • How can I form a group in Address Book? I need to transfer a large number of email addresses from an Excel spreadsheet to form a group to send emails to.

    I need to transfer a large number of email addresses from an Excel spreadsheet to form a group to send emails to. I can either use Address Book or transfer them to BTYahoo contacts and send from there.

    Hello, if you have the font that Photoshop is supposed to use to write the barcode, and each image is also listed in the spreadsheet, you can use the little-known feature called variables: http://help.adobe.com/en_US/photoshop/cs/using/WSfd1234e1c4b69f30ea53e41001031ab64-7417a.html
    See this video:
    http://www.youtube.com/watch?v=LMAeX5pexNk
    Or this one:
    http://tv.adobe.com/watch/adobe-evangelists-julieanne-kost/pscs5-working-with-variables/

  • Maximum number of filter values in BEx web

    Hi
    Is it possible to change the maximum number of filter values displayed when running a query?
    Example: when the user runs a query and has to enter a material number, he pushes the button which shows him the possible input values. The number of values displayed per page is 14 and the maximum number of pages is 72, so the maximum number of values is 1000. If the user searches, he only searches among those 1000 possible values. Is it possible to change the number of possible values shown to unlimited?
    We use NW04s BI SP09.
    Kind regards
    Erik

    Hi experts,
    I have the same problem: the maximum number of hits is set to 200, and the variable contains over 300 values. It is a variable based on a time characteristic, and the values appear in ascending order, i.e. the first is 01.1900 and the last is 03.2012. I wonder if there is a way to change the sort order to descending, i.e. the first value 03.2012 and the last one 01.1900. I attach the image.
    Thanks for your help.

  • Bex Query - How to capture characteristic value and set it as filter value?

    Dear Experts,
    I would like to create a tricky report that lists sales quantity by company. The companies consist of production plants as well as trading companies.
    Company / Sales Quantity / Sales Quantity 2 (produced by the company itself)
    A       / 100            / 50     (50 were produced by company A)
    B       /  80            /  0     (this is a trading company)
    C       / 150            / 150    (all were produced by company C)
    First I thought of using a variable to capture the company value and then, under Sales Quantity 2, setting producing company = this variable. But this only works if I filter on company values. Any workaround idea so that I can get the above with the full listing of companies?
    Any assistance would be great! Thanks!
    JL

    Hi,
    Have you tried elimination of internal business volume?
    http://help.sap.com/saphelp_bw32/helpdata/en/d5/784d3c596f0b26e10000000a11402f/content.htm
    Implement this solution by adding another key figure, Sales Quantity 2, with reference to Sales Quantity.
    In the characteristic pair you can have Company and Producing Company (a navigational attribute).
    By implementing this solution, the system will eliminate the quantity if it was produced by the company itself. You can create a CKF in the query to get the result you need.
    Jaya

Maybe you are looking for

  • CS6 web design premium trial install error

    after finally able to direct download and install the cs6 web design premium, i received the following warning: Exit Code: 6 Please see specific errors and warnings below for troubleshooting. For example,  ERROR: DW050 ... WARNING: DF029, DS013 ... -

  • Migration SQL Server 7.0 to Oracle 8i

    Hi, I am trying to migrate a SQL Server 7.0 database to Oracle 8i using the latest version of the Migration Workbench tool. The source model is normally generated (I see my tables and other objects) but nothing is generated in the Oracle Model... Wha

  • Any experience using Airport Extreme for dial-up?

    Hi, I just purchased a MacBook, which I love already, but I'm a bit disappointed that they decided to drop the internal modem. Regardless, I'm trying to figure out the best way for me to connect to the internet now (I use dial-up). The $50 external m

  • Using iTunes with Two Libraries Simultaneously.

    In the living room I have a Mac Mini with a large external drive running iTunes that I use as a media server. I recently got a new TV for the bedroom and have hooked my MacBook up to it. My laptop has the iTunes library that is synced with my iPod an

  • Ultiboard crashes during Gerber creation

    Ultiboard crashes during Gerber creation, Does anyone have any advice? I am trying to convert a large pcb design to gerber format I am using gerber RS274D, it takes a very long time but is successful in creating some of the files, then gives me a run