Regarding the data load from the PSA to the InfoCube

Hi Experts,
                       I have a doubt: when I load data into the InfoCube, the data does not update in the InfoCube, even though the data is present in the PSA (about 4 million records). When I run the DTP from the PSA to the InfoCube, the load does not update and the request just shows a yellow signal. The PSA holds 20 packets in total, each containing about 50,000 records. Can you suggest the steps I have to take to load the data into an InfoCube from the PSA through a DTP? I am new to this field.
                                                       Bye.

Hi Vinay,
Check this link; it should address your questions:
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Also,
Performance Tips for Data Transfer Processes  
Request processing, the loading process of a data transfer process (DTP), can run with different degrees of parallelization in the extraction and processing (transformation and update) steps. In accordance with the settings in the DTP maintenance transaction, the system selects the most appropriate and efficient processing for the DTP and derives a processing mode for it.
To further optimize the performance of request processing, there are a number of further measures that you can take:
●      By taking the appropriate measures, you can obtain a processing mode with a higher degree of parallelization.
●      A variety of measures can help to improve performance, in particular the settings in the DTP maintenance transaction. Some of these measures are source and data type specific.
The following sections describe the various measures that can be taken.
Higher Parallelization in the Request Processing Steps
With a (standard) DTP, you can modify the system-derived processing mode by changing the settings for error handling and semantic grouping. The overview below shows how you can optimize the performance of an existing DTP processing mode:
●      From serial extraction and processing of the source packages (P3) to serial extraction with immediate parallel processing (P2): select the grouping fields.
●      From serial extraction and processing of the source packages (P3) to parallel extraction and processing (P1): only possible with the persistent staging area (PSA) as the source; deactivate error handling.
●      From serial extraction with immediate parallel processing (P2) to parallel extraction and processing (P1): only possible with the PSA as the source; deactivate error handling and remove the grouping fields selection.
Further Performance-Optimizing Measures
Setting the number of parallel processes for a DTP during request processing
To optimize the performance of data transfer processes with parallel processing, you can globally set the number of permitted background processes for the process type Data Transfer Process in BI background management.
To further optimize performance for a given data transfer process, you can override the global setting:
In the DTP maintenance transaction, choose Goto → Batch Manager Setting. Under Number of Processes, specify how many background processes should be used to process the DTP. Once you have made this setting, remember to save.
Setting the Size of Data Packets
In the standard setting of the data transfer process, the size of a data packet is set to 50,000 data records, on the assumption that a data record has a width of 1,000 bytes. To improve performance, you can increase the packet size for smaller data records: for example, records that are only about 500 bytes wide allow a packet size of 100,000 records at roughly the same memory footprint per packet.
Enter this value under Packet Size on the Extraction tab in the DTP maintenance transaction.
Avoid too large DTP requests with a large number of source requests: Retrieve the data one request at a time
A DTP request can be very large, since it bundles together all transfer-relevant requests from the source. To improve performance, you can stipulate that a DTP request always reads just one request at a time from the source.
To make this setting, select Get All New Data in Source by Request on the Extraction tab in the DTP maintenance transaction. Once processing is completed, the DTP request checks for further new requests in the source; if it finds any, it automatically creates an additional DTP request.
With DataSources as the source: Avoid too small data packets when using the DTP filter
If you extract from a DataSource without error handling, and a large amount of data is excluded by the filter, this can cause the data packets loaded by the process to be very small. To improve performance, you can modify this behaviour by activating error handling and defining a grouping key.
Select an error handling option on the Updating tab in the DTP maintenance function. Then define a suitable grouping key on the Extraction tab under Semantic Groups. This ensures that all data records belonging to the same grouping key are extracted and processed together in one packet, as the sketch below illustrates.
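In effect, the grouping key keeps all records that share a key value together in one packet, much as if the extraction were sorted by that key. A loose SQL analogy with a hypothetical key column (not what BW executes internally):
  -- Rows with the same document_number become adjacent, so a packet
  -- boundary never splits one document across two packets.
  select * from source_table order by document_number;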
With DataStore objects as the source: Do not extract data before the first delta or during full extraction from the table of active data
The change log grows in proportion to the table of active data, since it stores before- and after-images. To optimize performance during extraction in full mode or with the first delta from the DataStore object, you can read the data from the table of active data instead of from the change log.
To make this setting, select Active Table (with Archive) or Active Table (without Archive) on the Extraction tab under Extraction From… or Delta Extraction From… in the DTP maintenance function.
With InfoCubes as the source: Use extraction from aggregates
With InfoCube extraction, the data is read in the standard setting from the fact table (F table) and the table of compressed data (E table). To improve performance here, you can use aggregates for the extraction.
Select Use Aggregates on the Extraction tab in the DTP maintenance transaction. The system then compares the outgoing set of the transformation with the aggregates. If all InfoObjects in the outgoing set are used in aggregates, the data is read from the aggregates during extraction instead of from the InfoCube tables.
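Conceptually, an aggregate is a pre-summarized copy of the cube, so extraction from it reads far fewer rows. A rough SQL analogy with hypothetical table names (not BW's actual storage):
  -- Without aggregates: sum over the full fact data at extraction time.
  select material, sum(sales_amount) as sales_amount
  from   infocube_fact
  group  by material;
  -- With a matching aggregate: the rollup already holds one row per material.
  select material, sales_amount
  from   infocube_aggregate;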
Note for using InfoProviders as the source
If not all key fields of the source InfoProvider have target fields assigned to them in the transformation, the key figures of the source are aggregated over the unassigned key fields during extraction. You can prevent this automatic aggregation by implementing a start routine or an intermediate InfoSource; note, though, that this affects the performance of the data transfer process.
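A minimal SQL sketch of this aggregation effect, with hypothetical names: the source has key fields material and plant, the target maps only material, and amount is a key figure. Extraction then behaves like this query, summing amounts across plants:
  select material, sum(amount) as amount
  from   source_provider
  group  by material;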
Hope this helps.
VVenkat..

Similar Messages

  • Regarding reading the data from the files without using Streams

    Hi all,
    I have a problem where I have to read the data from files without using any streams.
    Please guide me how to do this, if possible with an example.
    Thanks & Regards,
    M.Ramakrishna

    Simply put, you can't.
    But why do you need to?

  • Sum of the key figure from the InfoCube, to be placed in a process chain

    Hi,
    Before asking your advise, here is my development,
    1. We are implementing retraction through APD and generating CSV files, with Excel file data as the extraction inputs.
    2. So we executed all queries with a customer exit variant, reading this input file from AL11; the result files are also placed in the same location.
    3. Now my question is this. The extractor parameters/inputs Excel file format is:
    Char1     Char2
    100       RK
    Now I need to calculate the sum of the KF (the key figure in the result files) from the generated file or InfoCube, and it needs to be placed beside/concatenated with the extract parameter values:
    Char1     Char2   Total of the KF
    100       RK      1113888
    How do I fetch the total value of the KF from the cube, add it to the extract parameters file, and attach this to an email alert to the user?
    Please advise. Your help is more appreciated.
    Thanks,
    RK

    Hi Ramakrishna,
    Just consider this option: using APD you can store data as per your requirement (read file and cube data), create a report on a direct update DSO, and send it to the business user using information broadcasting.
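    For reference, the total you are after amounts to a simple aggregation over the result data; a sketch with hypothetical table and column names:
    select char1, char2, sum(kf) as total_kf
    from   result_data
    group  by char1, char2;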
    Regards,
    Ganesh Bothe

  • Regarding extracting the images of the signatures from the signed PDF

    We are using a TOPAZ signature pad (Model: T-LBK750SE-BHSB-R) to sign PDF documents using the Acrobat plugin.
    The PDF has more than one signature field to sign.
    All this happens in a web application that uses JavaScript to submit the PDF.
    We also have a requirement to capture the signature as an image.
    However, the customers (one or more) will only sign the PDF, so we will somehow have to extract the images of the signatures from the signed PDF.
    Could you please let me know if this can be done using PDF APIs (like iText etc.) or some other server-side APIs.

    If you set up standard password security so that form filling and signing existing digital signature fields is allowed, you should be OK.

  • The data from the InfoProvider XYZ involved could not be checked

    Hi team,
    I am making one InfoProvider, in which I am joining one DSO with one InfoCube.
    The DSO and InfoCube both have data; however, when I join them, create an InfoSet, and display the data, I receive the following message:
    "The data from the InfoProvider XYZ involved could not be checked", and I can't see any records.
    Please assist if anyone has faced such issues.
    Regards
    Blusky

    Hi,
    Yes, I do have data. I have one question: when the join occurs, does it occur on the SID or on the actual column?
    Meaning:
    Table 1
    C1      SID
    CHN     2
    Table 2
    C1      SID
    CHN     4
    Now when I join them on C1, then CHN = CHN, but the SID <> the SID of Table 2. I am just confused about what exactly happens in the back end.
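    In plain SQL terms, the two options would behave like this (illustrative table and column names only, not the SQL that BW actually generates):
    -- Join on the characteristic value: 'CHN' matches 'CHN'; the SIDs just come along.
    select t1.c1, t1.sid as sid_1, t2.sid as sid_2
    from   table1 t1
    join   table2 t2 on t2.c1 = t1.c1;   -- one row: CHN, 2, 4
    -- Join on the SID: 2 <> 4, so nothing matches.
    select t1.c1
    from   table1 t1
    join   table2 t2 on t2.sid = t1.sid; -- no rows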

  • How to Delete the dimension from the cube ?

    Hi ,
    How do I delete a dimension from a cube?
    I added a new dimension by assigning one characteristic to it.
    Now I want to delete it,
    but the system says "Dimension ZXXX contains InfoObjects; deletion not possible".
    How do I delete it? Any help?
    Thanks

    Make sure you do not have any data in the cube. If the cube contains data, you will not see the Delete option.
    Right-click the cube-->Delete Data.
    Then double-click the cube-->go to Edit mode-->select the InfoObject under that dimension-->right-click-->now you will see the Delete option (provided you have deleted all the data from the cube).
    Then right-click the dimension-->Delete.
    Regards,
    Pavan

  • How can we remove the commas from the Formula value in SAP BW BEx query

    Hi All,
    How can we remove the commas from the formula value in a SAP BW BEx query?
    We are using a formula replaced with a characteristic. The characteristic value needs to be displayed as a number without commas.
    Regards
    Venkat.

    Do you want to remove the commas when you run the query on BEx Web or in RSRT?
    Regards

  • How to find out the user from the Jobs queue in Report server

    Hello All!
    I have a doubt about finding out the user from the scheduled jobs queue. Say I go ahead and schedule a report on the Reports server; how can I find out the user name? When I view the jobs using showjobs, I can see that the DBMS_JOBS table has a column "Job Owner", but it invariably shows "rwuser". So is there a way to find out which user has scheduled which job?
    Regards
    Shobha

    Hi,
    The tables below will give only the name:
    USER_ADDRS
    USER_ADDR
    USER_ADDRP
    USR02
    I think you need the email address.
    You can use transaction SU01D:
    give the user name and execute it.
    I hope it will help you.
    Ram
    Edited by: Ram velanati on Jun 30, 2008 6:57 PM

  • Unable to get the data from the stored procedure

    Hello Folks,
    I have this stored procedure and am trying to get the data from the table stage_bill, but for some reason it is not pulling the data. I am a beginner in PL/SQL; can anyone please help me find out why? The code is below.
    create or replace procedure Load_FADM_Staging_Area_TEST(p_data_load_date date) is
    -- local variables
    v_start_date                date;
    v_end_date                  date;
    -- cursor starting
      CURSOR c_get_data
      IS
      SELECT a.batch_id, a.beginning_service_date, a.bill_id, a.bill_method,
             a.bill_number, a.bill_received_date, a.bill_status, a.bill_type,
             a.change_oltp_by, a.change_oltp_date, a.client_datafeed_code, a.client_id,
             a.created_date, a.date_of_incident, a.date_paid, a.deleted_oltp_by,
             a.deleted_oltp_date, a.duplicate_bill, a.ending_service_date,
             a.event_case_id, a.event_id, a.from_oltp_by, a.oltp_bill_status,
             a.review_status,
             'HRI' schema_name, sysdate Load_date, 'ETLPROCESS001' Load_user,
             v_start_date as Row_Effective_Date, null Row_End_date
      from stage_bill a
    where
    --created_date >= to_date('20101031 235959', 'YYYYMMDD HH24MISS')
    created_date >= v_start_date
    and
    --created_date <= to_date('20101111 235959', 'YYYYMMDD HH24MISS')
      created_date <= v_end_date
    and not exists
    (select b.batch_id, b.beginning_service_date, b.bill_id, b.bill_method,
            b.bill_number, b.bill_received_date, b.bill_status, b.bill_type,
            b.change_oltp_by, b.change_oltp_date, b.client_datafeed_code, b.client_id,
            b.created_date, b.date_of_incident, b.date_paid, b.deleted_oltp_by,
            b.deleted_oltp_date, b.duplicate_bill, b.ending_service_date,
            b.event_case_id, b.event_id, b.from_oltp_by, b.oltp_bill_status,
            b.review_status, b.schema_name, b.Load_date, b.Load_user,
            b.Row_Effective_Date, b.Row_End_Date
     from STG_FADM_HRI_STAGE_BILL_TEST b);  -- the cursor declaration must end with a semicolon
      -- cursor o/p variables
    v_batch_id                  stage_bill.batch_id%TYPE;
    v_beginning_service_date    stage_bill.beginning_service_date%TYPE;
    v_bill_id                   stage_bill.bill_id%TYPE;
    v_bill_method               stage_bill.bill_method%TYPE;
    v_bill_number               stage_bill.bill_number%TYPE;
    v_bill_received_date        stage_bill.bill_received_date%TYPE;
    v_bill_status               stage_bill.bill_status%TYPE;
    v_bill_type                 stage_bill.bill_type%TYPE;
      v_change_oltp_by            stage_bill.change_oltp_by%TYPE;
      v_change_oltp_date          stage_bill.change_oltp_date%TYPE;
      v_client_datafeed_code      stage_bill.client_datafeed_code%TYPE;
      v_client_id               stage_bill.client_id%TYPE;
      v_created_date          stage_bill.created_date%TYPE;
      v_date_of_incident    stage_bill.date_of_incident%TYPE;
      v_date_paid         stage_bill.date_paid%TYPE;
      v_deleted_oltp_by     stage_bill.deleted_oltp_by%TYPE;
      v_deleted_oltp_date    stage_bill.deleted_oltp_date%TYPE;
      v_duplicate_bill     stage_bill.duplicate_bill%TYPE;
      v_ending_service_date   stage_bill.ending_service_date%TYPE;
      v_event_case_id        stage_bill.event_case_id%TYPE;
      v_event_id            stage_bill.event_id%TYPE;
      v_from_oltp_by         stage_bill.from_oltp_by%TYPE;
      v_oltp_bill_status   stage_bill.oltp_bill_status%TYPE;
      v_review_status     stage_bill.review_status%TYPE;   
      v_schema_name        varchar(50);
      v_Load_date          date;
      v_Load_user            varchar(50);
      v_Row_Effective_Date   date;
      v_Row_End_Date         date;     
    Begin
    if  p_data_load_date is null then
        select (sysdate - 7), (sysdate - 1) into v_start_date, v_end_date from dual;
      elsif p_data_load_date is not null then
       select (p_data_load_date - 7), (p_data_load_date - 1) into v_start_date, v_end_date from dual;
      else
        raise_application_error(-20042, 'Data control - GetDataControlAuditData : Date parameter must be a date of this or a previous week.');
      end if;
    -- cursor c_get_data loop begin
    OPEN c_get_data;
        LOOP                                                       -- cursor c_get_data loop begin
      FETCH c_get_data
       INTO v_batch_id, v_beginning_service_date, v_bill_id, v_bill_method,
            v_bill_number, v_bill_received_date, v_bill_status, v_bill_type,
            v_change_oltp_by, v_change_oltp_date, v_client_datafeed_code, v_client_id,
            v_created_date, v_date_of_incident, v_date_paid, v_deleted_oltp_by,
            v_deleted_oltp_date, v_duplicate_bill, v_ending_service_date,
            v_event_case_id, v_event_id, v_from_oltp_by, v_oltp_bill_status,
            v_review_status, v_schema_name, v_Load_date, v_Load_user,
            v_Row_Effective_Date, v_Row_End_Date;
        EXIT WHEN c_get_data%NOTFOUND;
    insert into STG_FADM_HRI_STAGE_BILL_TEST  -- the column list must be parenthesized
    (batch_id, beginning_service_date, bill_id, bill_method, bill_number,
     bill_received_date, bill_status, bill_type, change_oltp_by, change_oltp_date,
     client_datafeed_code, client_id, created_date, date_of_incident, date_paid,
     deleted_oltp_by, deleted_oltp_date, duplicate_bill, ending_service_date,
     event_case_id, event_id, from_oltp_by, oltp_bill_status, review_status,
     schema_name, Load_date, Load_user, Row_Effective_Date, Row_End_Date)
    values
    (v_batch_id, v_beginning_service_date, v_bill_id, v_bill_method, v_bill_number,
     v_bill_received_date, v_bill_status, v_bill_type, v_change_oltp_by, v_change_oltp_date,
     v_client_datafeed_code, v_client_id, v_created_date, v_date_of_incident, v_date_paid,
     v_deleted_oltp_by, v_deleted_oltp_date, v_duplicate_bill, v_ending_service_date,
     v_event_case_id, v_event_id, v_from_oltp_by, v_oltp_bill_status, v_review_status,
     v_schema_name, v_Load_date, v_Load_user, v_Row_Effective_Date, v_Row_End_Date);
      COMMIT;
        END LOOP;                                                 
    CLOSE c_get_data;
    END Load_FADM_Staging_Area_TEST;

    Maybe you need something else, like
    CREATE OR REPLACE PROCEDURE load_fadm_staging_area_test (
      p_data_load_date DATE
    ) IS
      v_start_date   DATE;
      v_end_date     DATE;
    BEGIN
      SELECT NVL (p_data_load_date, SYSDATE) - 7,
             NVL (p_data_load_date, SYSDATE) - 1
      INTO   v_start_date,
             v_end_date
      FROM   DUAL;
      MERGE INTO stg_fadm_hri_stage_bill_test b
      USING      (SELECT *
                  FROM   stage_bill
                  WHERE  created_date BETWEEN v_start_date AND v_end_date) a
      ON         (b.bill_id = a.bill_id)
      WHEN NOT MATCHED THEN
        INSERT     (batch_id,
                    beginning_service_date,
                    bill_id,
                    bill_method,
                    bill_number,
                    bill_received_date,
                    bill_status,
                    bill_type,
                    change_oltp_by,
                    change_oltp_date,
                    client_datafeed_code,
                    client_id,
                    created_date,
                    date_of_incident,
                    date_paid,
                    deleted_oltp_by,
                    deleted_oltp_date,
                    duplicate_bill,
                    ending_service_date,
                    event_case_id,
                    event_id,
                    from_oltp_by,
                    oltp_bill_status,
                    review_status,
                    schema_name,
                    load_date,
                    load_user,
                    row_effective_date,
                    row_end_date)
        VALUES     (a.batch_id,
                    a.beginning_service_date,
                    a.bill_id,
                    a.bill_method,
                    a.bill_number,
                    a.bill_received_date,
                    a.bill_status,
                    a.bill_type,
                    a.change_oltp_by,
                    a.change_oltp_date,
                    a.client_datafeed_code,
                    a.client_id,
                    a.created_date,
                    a.date_of_incident,
                    a.date_paid,
                    a.deleted_oltp_by,
                    a.deleted_oltp_date,
                    a.duplicate_bill,
                    a.ending_service_date,
                    a.event_case_id,
                    a.event_id,
                    a.from_oltp_by,
                    a.oltp_bill_status,
                    a.review_status,
                    'HRI',
                    SYSDATE,
                    'ETLPROCESS001',
                    v_start_date,
                    NULL);
    END load_fadm_staging_area_test;
    Whenever you code a cursor and a loop, ask yourself: do I need that?
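    A side note on why the original cursor returned nothing: its NOT EXISTS subquery is not correlated to the outer query, so as soon as STG_FADM_HRI_STAGE_BILL_TEST contains any row at all, every source row is filtered out. A correlated sketch (assuming bill_id identifies a row, as the MERGE above does):
    and not exists
      (select 1
       from   stg_fadm_hri_stage_bill_test b
       where  b.bill_id = a.bill_id)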
    Regards
    Peter

  • Changed e-mail password in Windows Live account - now cannot send email from iPhone. Have deleted the account from the iPhone and reinstalled it - the password is correct - still no luck - any advice?

    I changed the password in my Windows Live email account. My iPhone now receives but will no longer send emails. I have deleted the account from the iPhone, reinstalled it, and verified the password is correct. I receive a message that says "Cannot Send Mail": "the user name or password ... is incorrect". Any advice?

    Hello DRB1962
    Check to make sure that you validated the device within your account on webmail.
    iOS: Setting up Hotmail, Outlook, Live, or MSN email accounts
    http://support.apple.com/kb/HT1694
    Thanks for using Apple Support Communities.
    Regards,
    -Norm G.

  • Adobe Creative Cloud Muse: Why do I no longer receive the emails from the forms I created on several websites?

    Adobe Creative Cloud Muse: Why do I no longer receive the emails from the forms I created on several websites?

    Hello Sanjit,
    thank you for replying.
    I had the problem on this website; I have two forms, one simple and the other more sophisticated:
    www.oeso.org
    (at the bottom I have a simple contact form), and
    I also created a form on www.oeso.org/new-membership.html
    but I didn't receive any reply, so
    I put the form on my website (architecturevisualdesign.ch) in a hidden form:
    http://architecturevisualdesign.ch/oeso-new-membership.html
    and here I receive the OESO New Membership Form Submission reply.
    Thank you for your help.
    Best regards,
    Nicole
    2014-09-09 11:17 GMT+02:00 Sanjit_Das <[email protected]>:
    Please provide the site url in question , also have you hosted the site on Business Catalyst ?
    Emails can land to spam/junk folders so you should also check the folders.
    Thanks,
    Sanjit

  • I want to get the values after the second hyphen only

    Hey Guys,
    I have one column with data like this:
    col1 = 'AI463-901-001'
    Now,
    I want to get the values after the second hyphen only (any number of values).
    Please can anyone help me with this.
    Thanks in advance!
    Regards,
    -LK

    You have a mistake: your result is -001.
    This is right:
      select substr('AI463-901-001', instr('AI463-901-001','-',1,2)+1) from dual;
    -- @user11928732 - if you are using Oracle Database 11g, try this please:
      with data as
        (select 'AI463-901-001' as str from dual)
      select substr(str, instr(str,'-',1,2)+1) from data;
      select substr(<YOUR COLUMN>, instr(<YOUR COLUMN>,'-',1,2)+1) from <YOUR TABLE>;
    Result is: 001
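    If you are on Oracle 10g or later, a regular-expression alternative gives the same result by taking the third hyphen-delimited token:
      select regexp_substr('AI463-901-001', '[^-]+', 1, 3) from dual;  -- returns 001
      select regexp_substr(<YOUR COLUMN>, '[^-]+', 1, 3) from <YOUR TABLE>;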
    Edited by: Mahir M. Quluzade on May 3, 2011 5:37 PM

  • How to display the values from the table on the screen

    Hi,
    I have created a screen where I enter values for the fields threshold amount and description, and if I press the Update button it updates the new values by overwriting the existing ones.
    Now I have a requirement to create a Show button which will display the existing value from the table; there will always be only one entry in this table.
    Please can someone give me an idea of how to do this,
    or sample code. Thanks in advance.
    regards
    paveeeeee

    Define a function code 'SHOW' for your button. In your PAI module, when you check for various sy-ucomms, check for 'SHOW' also.
    Your code will be like this:
    CASE sy-ucomm.
      WHEN 'SHOW'.
        PERFORM show_details.
    ENDCASE.
    In the perform, you can fetch the data from the table and put it in global variables. In the PBO, move the data from the global variables to the screen fields so that they get displayed on the screen.
    Hope this helps. Reward points for useful answers.
    Regards
    Nithya

  • HT203200 Cannot delete the file from the iTunes folder

    I cannot delete the file from the iTunes folder because it tells me I need permission to do so. I am the admin for the laptop, plus I have checked my permissions under the Security tab in the file's properties. How do I delete this file to start again? It downloaded 1.2 GB of the file before the error message appeared, so it's paid for and all, but it's not looking like I'll get to watch it before the time runs out on the movie being available for me to watch. :-(

    Try deleting the file without iTunes running.
    When you next open iTunes it should then be marked as a missing file and you should be able to delete the entry from iTunes.
    Hope this works
    Regards,
    Colin R.

  • Cisco ASA 5505 performance issues on downloads - data into the ASA from the Internet

    I am having serious issues with performance on my ASA 5505s that I am testing with 9.2.3 code.
    I stripped the config and removed as much as I could - no VPN etc. - and I am ONLY getting about 30-40 Mbps on downloads from sites but 95 Mbps on uploads? Anyone else seeing these problems? If I remove the firewall, my PC can hit 300/300 Mbps to the same sites using the same switch and cable.
    I installed 1 GB of memory on the ASA 5505 but it made no difference. The ASA has a UL IP Security license, but I am only using an inside and an outside address for these tests; no other ports are configured.
    Is anyone else seeing this performance problem with the 9.2.3 code? I went to it from 8.2.5 to try to resolve QoS failure bugs that I found in the 8.2.5 code. I did not expect a performance hit, though, and it is only on downloads TO the ASA from the Internet, on all speed test sites that I try. Upload speeds seem fine. No access lists on my interfaces either... barebones config.
    My FIOS and switch interfaces are fine...no errors on any interfaces and the same switch interface hits 300/300Mbps when my laptop is directly attached. 
    Anyone have a barebones config on their ASA 5505 that flies...I will try it on mine and see if some command somewhere (hidden) is causing the issue. I even cleared the config and started with a clean slate just in case I was missing some command from the older configs that may have impacted performance.

    After replacing the switch with a high-end switch, my performance increased, but I am still not happy with the throughput out of my ASA. I have about 50+ ASA 5505s and a dozen 5510s. Most remote sites have 5505s. All my sites right now run 8.2.5-51, and I wanted to put 9.2.3 out there to solve the QoS issues I have uncovered in the 8.2.5 code.
    I get much better results using the Cisco 3750X attached to the FIOS (right around 300/300 with my laptop directly attached to the 3750X, bypassing the ASA; my FIOS circuit rating is also 300/300). Going through the ASA to the same test site I get download speeds of 35 to 75. It changes randomly, which really bothers me. My upload speeds are ALWAYS faster than my download speeds. Example: the best download I would ever get is 75 Mb, and my upload would usually hit 95 Mb during the same test period.
    I may have to live with it but the inconsistency is what really bothers me.
    Here is the config I am currently using. Nothing is going on during testing, since only a single PC is attached. The VPN tunnel to the main site can be up or down; it doesn't seem to make any difference. The PC goes to sites directly from the outside interface of the ASA... split tunneling. Even when I removed the tunnels and tested with just the ASA as a firewall to the Internet, I was still seeing the same inconsistencies.
    Anything obviously missing - a new command or anything? Xlates causing issues?
