Huge number of unprocessed logging table records found

Hello Experts,
I am facing an issue where a huge number of unprocessed logging table records were found in the SLT system for one table. I have checked all settings and error logs but found no evidence of what is causing the unprocessed records. In the HANA system the table also shows as replicated. Could you please suggest something other than replicating the same table again, as that option is not possible at the moment.

Hi Nilesh,
What are the performance impacts on the SAP ECC system when multiple large SAP tables like BSEG are replicated at the same time? Is there a guideline for a specific volume or kinds of tables?
There is no explicit guideline, since aspects such as server performance and the change rate of the tables are also relevant. As a rule of thumb, one dedicated replication job per large table is recommended.
From the SLT documentation:
How to enable parallel replication before DMIS 2011 SP6 (do not ignore this guide; it also covers SP06)
How to improve the initial load
Regards,
V Srinivasan

Similar Messages

  • Corrupted message causing a huge number of transaction logs

    Hello All
    We have an SBS 2011 client and one user is having an issue with a single message that cannot be synchronized with the server.
    This message attempts to synchronize with the server every minute but fails and is listed in the Sync Issues folder with the messages below.
    8:19:57 Error synchronizing message 'Palm Beach - Updated Lease Plan'
    8:19:57 [8004010B-501-480-324]
    8:19:57 The client operation failed.
    8:19:57 Microsoft Exchange Information Store
    8:19:57 For more information on this failure, click the URL below:
    8:19:57 http://www.microsoft.com/support/prodredirect/outlook2000_us.asp?err=8004010b-501-480-324
    8:19:57 Moved a message that failed synchronization to 'Local Failures'. Message subject -> 'Palm Beach - Updated Lease Plan'. You can view this message in your offline folder file only.
    There were over 2 GB of copies of the same message in the Local Failures folder, so I deleted all of them; currently there is no such message in the Local Failures folder, but the Sync Issues folder still reports a sync failure.
    For this reason transaction logs were being generated in massive numbers, so I had to enable circular logging to make sure disk space is not fully consumed by transaction logs.
    I have searched everywhere in this user's mailbox but am unable to locate the message in question. It is not in the message queue on the Exchange server either.
    How can I locate this message and remove it?
    Thanks,
    Dhanushka

    Hi Dhanushka,
    Sorry for interrupting.
    Would you please confirm whether the sync issue persists when the user logs on via OWA?
    Please refer to the following similar threads and check whether they help:
    Synchronize error for Outlook 2007
    Error synchronizing message
    Regarding 10030 and 10031, please refer to the following KB:
    The Microsoft Exchange Information Store service crashes occasionally when a folder view is corrupted on an Exchange Server 2010 mailbox server
    If I have misunderstood anything, please don't hesitate to let me know. Meanwhile, please provide the complete event message.
    Hope this helps.
    Best regards,
    Justin Gu

  • Slow due to huge number of tables

    Hi,
    unfortunately we have a really huge number of tables in the (Advantage Server) database.
    About 18,000+ tables.
    Firing the ActiveX preview through RDC, or just running a preview in the Designer, slows down to a crawl.
    Any hints? (Besides getting rid of that many tables.)
    Thanks
    Oskar

    Hi Oskar
    The performance of a report is related to:
    External factors:
    1. The amount of time the database server takes to process the SQL query.
        (Crystal Reports sends the SQL query to the database; the database processes it and returns the data set to Crystal Reports.)
    2. Network traffic.
    3. Local computer processor speed.
        (When Crystal Reports receives the data set, it generates a temp file to further filter the data when necessary, as well as to group, sort, process formulas, ...)
    4. The number of records returned.
        (If a SQL query returns a large number of records, it will take longer to format and display than if it were returning a smaller data set.)
    Report design:
    1. Where the Record Selection is evaluated.
        Ensure your Record Selection Formula can be translated into SQL, so the data can be filtered down on the server; otherwise the filtering is done in a temp file on the local machine, which is much slower (see the SQL sketch after this list).
    Many functions cannot be translated into SQL because there may be no standard SQL equivalent for them.
    For example, a control structure like IF THEN ELSE cannot be translated into SQL; it will always be evaluated in Crystal Reports. If you use an IF THEN ELSE on a parameter, the result of the condition is converted to SQL, but as soon as the condition uses database fields it will not be translated into SQL.
    2. How many subreports the report contains and in which sections they are located.
    Minimise the number of subreports used, or avoid using subreports if possible, because subreports are reports within a report. If you have a subreport in a details section and the report returns 100 records, the subreport will be evaluated 100 times and will query the database 100 times. It is often the biggest factor in why a report takes a long time to preview.
    3. How many records will be returned to the report.
       A large number of records will slow down the preview of the report. Ensure you return only the necessary data, by creating a Record Selection Formula, or by basing your report on a Stored Procedure or a Command Object that returns only the desired data set.
    4. Whether you use the special field "Page N of M" or "TotalPageCount".
       When the special field "Page N of M" or "TotalPageCount" is used on a report, every page has to be generated before the first page can be displayed; therefore it will take more time to display the first page of the report.
        If you want to improve the speed of a report, remove the special field "Page N of M" or "Total Page Count", or any formula that uses the function "TotalPageCount". If those aren't used, viewing a report only formats the page requested; it won't format the whole report.
    5. Link tables on indexed fields whenever possible.
    6. Remove unused tables, unused formulas, unused running totals from the report.
    7. Suppress unnecessary sections.
    8. For summaries, use conditional formulas instead of running totals when possible.
    9. Whenever possible, limit records through selection, not suppression.
    10. Use SQL Expressions to convert fields used in record selection instead of using formula functions.
    For example, if you need to concatenate two fields, instead of doing it in a formula you can create a SQL Expression Field. It will concatenate the fields on the database server instead of in Crystal Reports (see the SQL sketch after this list).
    SQL Expression Fields are added to the SELECT clause of the SQL query sent to the database.
    11. Using one Command object as the data source can be faster if you return only the desired data set, i.e. if the SQL query you write returns only the desired data.
    12. Perform grouping on the server.
       This is only relevant if you need to return just the summary to your report, not the details. It will be faster because less data is returned to the report (see the SQL sketch below).
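    To make points 1, 10 and 12 concrete, here is a hedged sketch of the SQL that can be pushed down to the database server; the table and field names (ORDERS, CUSTOMER, AMOUNT, FIRST_NAME, LAST_NAME) are hypothetical:
    -- Point 1: a translatable record selection becomes a server-side WHERE clause,
    -- so only the matching rows leave the database:
    SELECT o.ORDER_ID, o.ORDER_DATE, o.AMOUNT
    FROM   ORDERS o
    WHERE  o.ORDER_DATE >= DATE '2012-01-01'
    -- Point 10: a SQL Expression Field is added to the SELECT list, so the
    -- concatenation happens on the server instead of in a report formula:
    SELECT c.CUSTOMER_ID,
           c.FIRST_NAME || ' ' || c.LAST_NAME AS FULL_NAME
    FROM   CUSTOMER c
    -- Point 12: grouping on the server returns only the summary rows to the report:
    SELECT o.CUSTOMER_ID, SUM(o.AMOUNT) AS TOTAL_AMOUNT
    FROM   ORDERS o
    GROUP  BY o.CUSTOMER_ID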
    Regards
    Girish Bhosale

  • Walk thru all database table records until match is found then delete

    I have a database-connected form. It has a script that checks whether the form's data has already been submitted by searching through the records for the form ID number. I want the matching records to be deleted and new records added, in case the form data has changed. I can make this work once using the script below. Is there a shorter way to make it continue searching and deleting all the records that start with that form ID number, instead of having to copy this script 12 times (the maximum number of records allowed)?
    var oFile = Subform1.FileName.formattedValue;
    // assumes oDB is the form's data connection, as in the original script
    oDB.first();                                  // start at the first record
    while (!oDB.isEOF()) {
        if (xfa.record.DataConnection2.FileName.value == oFile) {
            oDB.delete();                         // remove the matching record
            oDB.update();                         // commit the deletion
            oDB.first();                          // rescan from the top: the cursor
                                                  // position after delete() is not reliable
        } else {
            oDB.next();                           // no match: move to the next record
        }
    }

    types : begin of ty_jtab,
             f1(10) type c,
           end of ty_jtab.
    data : jtab type standard table of ty_jtab with header line,
           itab type standard table of ty_jtab with header line.
    data : wa_jtab type ty_jtab,
           wa_itab type ty_jtab.
    * populate itab (stands in for the database records)
    itab-f1 = '1000'. append itab.
    itab-f1 = '2000'. append itab.
    itab-f1 = '3000'. append itab.
    * populate jtab (the keys to keep)
    jtab-f1 = '1000'. append jtab.
    jtab-f1 = '4000'. append jtab.
    jtab-f1 = '8000'. append jtab.
    jtab-f1 = '9000'. append jtab.
    sort itab by f1.
    sort jtab by f1.
    clear: itab, wa_itab.  " clears the header lines only, not the table bodies
    loop at itab into wa_itab.
      read table jtab into wa_jtab with key f1 = wa_itab-f1
                                   binary search.
      if sy-subrc ne 0.
    *   no match in jtab: remove the row from itab by content
        delete table itab from wa_itab.
        if sy-subrc ne 0.
          write :/ 'fail', wa_itab-f1.
        endif.
      endif.
    endloop.
    * output: rows '2000' and '3000' are deleted from itab; 'fail' is written
    * only if a delete cannot find its row.
    br,
    vijay

  • Query using system parameter LEVEL returns incorrect huge number of records

    We migrated our database from Oracle 9.2.0.6 to 11.2.0.1.
    The query below throws "ORA-01788: CONNECT BY clause required in this query block".
    select * from (
    select a.BOARD_ID, code, description, is_displayable, order_seq, board_parent_id, short_description, IS_SUB_BOARD_DISPLAYABLE, LEVEL child_level, sp_board.get_parent_id(a.board_id) top_parent_id, is_top_selected isTopSelected
    from boards a, ALERT_MESSAGE_BOARD_TARGETS b
    where a.board_id = b.board_id and is_displayable = 'Y' and alert_message_id = 5202) temp
    start with board_parent_id = 0
    connect by prior board_id = board_parent_id
    ORDER SIBLINGS BY order_seq;
    Based on online resources we modified "_allow_level_without_connect_by" by executing the statement:
    alter system set "_allow_level_without_connect_by"=true scope=spfile;
    After performing the above, ORA-01788 is resolved.
    The new issue is that the same query returns 9,015,853 records in 11g, whereas in 9i it returns 64 records. 9i returns the correct number of records, and the cause of 11g returning the greater number of records is the LEVEL pseudocolumn used in the query.
    Why is 11g returning an incorrectly huge number of records?
    Any assistance to address this is greatly appreciated. Thanks!

    The problem lies in the query.
    The Oracle LEVEL pseudocolumn should not be used inside a subquery; it belongs in the query block that contains the CONNECT BY. After LEVEL is moved to the main query, the number of returned records is the same as in 9i (a small runnable illustration follows the corrected query):
    select c.BOARD_ID, c.code, c.description, c.is_displayable, c.order_seq, c.board_parent_id, c.short_description, c.IS_SUB_BOARD_DISPLAYABLE, LEVEL child_level, c.top_parent_id, c.isTopSelected
    from (
    select a.BOARD_ID, code, description, is_displayable, order_seq, board_parent_id, short_description, IS_SUB_BOARD_DISPLAYABLE, sp_board.get_parent_id(a.board_id) top_parent_id, is_top_selected isTopSelected
    from boards a, ALERT_MESSAGE_BOARD_TARGETS b
    where a.board_id = b.board_id and is_displayable = 'Y' and alert_message_id = 5202
    ) c
    start with c.board_parent_id = 0
    connect by prior c.board_id = c.board_parent_id
    ORDER SIBLINGS BY c.order_seq
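    As a self-contained illustration (it uses only DUAL, so it runs on any Oracle instance), LEVEL is only meaningful in the query block that owns the CONNECT BY:
    -- valid: LEVEL and CONNECT BY are in the same query block
    select level as n from dual connect by level <= 5;
    -- raises ORA-01788: LEVEL sits in an inner block without its own CONNECT BY
    select * from (select level as n from dual) connect by level <= 5;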

  • Job Fail:9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
      A job loads from a BW ODS to a BW master data InfoObject and ODS. This job always fails with the same message: "9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL" (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes without error.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • Cut and Paste in Baby Record template, Daily Log table

    Apologies if the answer is obvious, but I just had a baby and my brain is working on very little sleep, and I don't have time to read through the user guide to find the answer, if in fact it's in there.  I've been using the Daily Log table in the Baby Record template, but I can't seem to figure out how to add new categories/lines without literally having to recreate every single category level manually.  That seems incredibly time-consuming and inefficient.  Isn't there a way to copy one date (with all three subcategories of Food, Diaper and Sleep) and paste to create the rows for the next date?  For instance, I just manually created an entry for 1/19/2012, then manually added the three subcategories and had to retype the names "Food," "Diaper," and "Sleep," and then had to manually add lines for each entry under those subcategories.  Instead, I want to copy the entire hierarchy for 1/19/2012 and then paste it and change the date to 1/20/2012, and so on, so I only have to create the hierarchy once and then can just enter my data.
    Thanks in advance for any help you can provide. 

    Not a very intuitive template, to my thinking.
    Let's try to unravel the 'magic.'
    Categories on this template are created automatically from the entries in column B (which is hidden).
    Category rows are created automatically when you check the "Insert Categories from the following" checkbox in the Reorganize dialogue. When the box isn't checked, these Category rows don't exist.
    Start by unchecking the box and unhiding column B when your interest is in adding data.
    To create a new category, you need only make an entry in column B that is different from the entries already there.
    For data entry, I'd forget about immediately creating a full set of categories for each new date. Instead, try this:
    On 7 July, 2009 at 7:15 AM the baby has a bottle of formula. To record that entry:
    Select the row where the most similar event took place on the previous day (6 July, 2009, 6:10 AM, Food, Formula), row 12 in the illustration below.
    Press option-down arrow to insert a new, empty row below the row containing that event. The selection will move down to the first cell in the new row.
    Reselect the previous row, then drag the Fill control handle (the small circle at the bottom right corner of the blue selection rectangle) down to fill the contents of row 12 into row 13.
    Note that the Date in column A increments by 1, and now shows 7 July, 2009, the date of the event to be recorded.
    Time has also incremented by one hour, and will need to be edited.
    The other two values, "Food" and "Formula", are the same as they were in the previous row.
    Edit the time in column C, and the entry is done.
    Make other entries in a similar manner.
    When done, click the Reorganize button and sort the table by Date (column A). You may want to add two sub-sorts, by Activity (column B), then by Time. Each sub-sort is done within the previous category. Note that Time entries actually create Date and Time values in the cell, with the Date part equal to the date the entry was made. If all entries for a specific date are not made during a single day, sorting on time will give unexpected results.
    Regards,
    Barry

  • Record Number of users logged in to Portal and bring the data in to BI

    Hi,
    I have a requirement to identify the number of users logging in to the SRM Portal, bring that information into BI, and report on it.
    Can anyone please tell me how I can find this?
    I checked tables USR01, USR02, USR03, USR41 etc., but none of them gives the proper user/server information needed.
    Transaction SM04 logs users logging in to both the SRM server and the SRM portal, but I need to find the underlying table for SM04, with which I can get the number of users logged in to the portal in a particular date range.
    If not, can anyone please tell me the best way to identify the number of users logged in to the SRM portal on a particular date, capture that information, bring it into BI, and report on it in BI?
    Thanks in advance
    Pooja
    Edited by: Pooja on Nov 25, 2008 5:10 PM

    Please check the below link; it should guide you in identifying the number of users logged into the Portal, which can be used for your reporting.
    /message/319314#319314 (original link is broken)
    Thanks,
    Sruthi

  • Record milli-second into Activity Log table

    Hello,
    Is there any option to make ME log the time stamp in its table(s) with millisecond precision, like 2011-03-24 10:31:08.123?
    [Background]
    We developed a custom program that requests SAP ME to perform signoff, adjust qty, start and complete.
    The four requests are issued one by one in the sequence described.
    When we ran this program against an SFC in active status, we checked how ME's Activity Log Report displayed it.
    The sequence was strange:
    [operation sequence]
    1. SIGNOFF, 2. ADJUST_QTY, 3. START, 4. COMPLETE
    [Activity Log displayed]
    1. ADJUST_QTY, 2. COMPLETE, 3. SIGNOFF, 4. START
    This log will confuse users.
    According to our engineer, all four of these actions are recorded with the same time stamp, like 2011-03-24 10:31:08.000.
    So I think that an option to allow millisecond time stamps in the ME database would be a solution.
    (I guess the other log tables, like production log, resource_time_log and production_comment, are in a similar situation.)
    Best Regards,
    Takahiro Uesugi

    Takahiro,
    Such changes have already been implemented for some customers to meet the requirements of their workload. I am not sure about including them in future releases. You need to discuss this with SAP consultants to submit the change for review by Product Management.
    Regards,
    Sergiy

  • No Matching Records found in OOEI Table

    Can anyone help me clarify the below-mentioned error?
    No matching records found 'G/L Accounts' (OACT) (ODBC -2028) [Message 131-183]
    We get the above error when we add an outgoing excise invoice based on a delivery.
    Regards,
    Venkat.

    Hi Venkat,
    Welcome to the forum.
    You may go through these threads to find a match to your case:
    http://forums.sdn.sap.com/search.jspa?threadID=&q=%22NomatchingrecordsfoundG%2FL+Accounts%22&objID=c44&dateRange=lastyear&numResults=15&rankBy=10001
    Thanks,
    Gordon

  • Payment F110 - No data records found for these selection criteria, FZ208

    Hi all,
    I have done all the configuration for the payment medium for a customer in Norway. We use Telepay and program RFFONO_T. We have not activated the new general ledger, but we are on ECC 6.0, so I do not see why it should not work.
    The invoices were paid with a payment order, but when I try to download the file under Environment > Payment medium > DME Administration I get the following error message:
    "No data records found for these selection criteria
    Message no. FZ208
    Diagnosis
    No data could be accessed for this selection.
    Possible causes are:
    No data exists for the activated selection.
    You have no authorization to display or edit data from this selection.
    Procedure
    First check whether your selection criteria are correct.  You may need to expand the criteria to include a larger search area to check whether data exists in the system.
    Make sure you have the proper authorizations for displaying and editing data.  Read the Release note for DME management for further information on the authorization objects.
    Proceed"
    I have tried different variants, but that doesn't matter. When I look at the payment run log I can see the following:
    "Additional parameter specifications 1400 SAPO02 are missing
    Message no. FR193
    Diagnosis
    Entry 1400 $V2& is missing from the additional company code parameter table.
    System Response
    Processing was terminated.
    Procedure
    Maintain the entry according to the instructions in the program documentation."
    I suppose that's why I can't get a file. Does anyone know why the file is not created? Please help.
    Best regards Lisa

    Hi Lisa,
    I have a similar problem with program RFFONO_T and the Telepay format for a Norwegian customer. The payment medium is not created. In the payment run log there is the following message: "Additional parameter specifications XXXX SAPO02 are missing,
    Message no. FR193. Entry XXXX $V2& is missing from the additional company code parameter table."
    According to the program documentation for RFFONO_T, a company number (11 digits) has to be maintained under company code global data, additional details. The legal organization number with 9 digits is already entered, but I do not understand where to enter an 11-digit company number. A user number (10 digits) is also entered in transaction OB94, but the problem remains.
    Did you find a solution to your problem?
    Regards,
    Agneta

  • Create Error log table

    I am supposed to insert the records that were not uploaded to the main table into an error log table and email the users about the error records that were not inserted. How am I supposed to do that?
    I have a few more questions.
    What is the best way to upload the data from a file?
    1. I could do batch processing, or
    2. I could browse and upload the file through the APEX application and insert the records.
    I want to know which tutorial would be the best to read for the above two methods, and how I create and insert records into the error log table and send the user a CSV or TXT file that contains the error records, in both methods.
    Would following the below method be the right way for the 2nd method?
    http://oraexplorer.blogspot.com/2007/11/apex-to-upload-text-file-and-write-into.html
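    One way to cover the error-log requirement is Oracle's DML error logging; this is a minimal sketch, and MAIN_TAB and STAGE_TAB are hypothetical names for your target and staging tables:
    -- one-time setup: creates ERR$_MAIN_TAB with the error columns
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'MAIN_TAB');
    END;
    /
    -- rows that fail the insert land in ERR$_MAIN_TAB instead of aborting the load
    INSERT INTO main_tab
    SELECT * FROM stage_tab
    LOG ERRORS INTO err$_main_tab ('csv upload') REJECT LIMIT UNLIMITED;
    The rows collected in ERR$_MAIN_TAB can then be exported as CSV and mailed to the users, for example with APEX_MAIL from within the APEX application.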

    OK,
    I am trying to insert the records into an existing table from a CSV file.
    I am using the below post to do so:
    Re: File Browse, File Upload
    I get some errors executing the htmldb_tools package:
    Error at line 27: PLS-00103: Encountered the symbol "/"
    create or replace PACKAGE htmldb_tools
    AS
    -- Utility functions --{{{
    PROCEDURE parse_textarea ( --{{{
    -- Parse a HTML textarea element into the specified HTML DB collection
    -- The c001 element from the collection is used
    -- The parser splits the text into tokens delimited by newlines, spaces
    -- and commas
    p_textarea IN VARCHAR2,
    p_collection_name IN VARCHAR2
    );
    PROCEDURE parse_file( --{{{
    -- Generic procedure to parse an uploaded CSV file into the
    -- specified collection. The first line in the file is expected
    -- to contain the column headings, these are set in session state
    -- for the specified headings item.
    p_file_name IN VARCHAR2,
    p_collection_name IN VARCHAR2,
    p_headings_item IN VARCHAR2,
    p_columns_item IN VARCHAR2,
    p_ddl_item IN VARCHAR2,
    p_table_name IN VARCHAR2 DEFAULT NULL
    );
    END htmldb_tools;
    /
    create or replace PACKAGE BODY htmldb_tools
    AS
    TYPE varchar2_t IS TABLE OF VARCHAR2(32767) INDEX BY binary_integer;
    -- Private functions --{{{
    PROCEDURE delete_collection ( --{{{
    -- Delete the collection if it exists
    p_collection_name IN VARCHAR2
    )
    IS
    BEGIN
    IF (htmldb_collection.collection_exists(p_collection_name))
    THEN
    htmldb_collection.delete_collection(p_collection_name);
    END IF;
    END delete_collection; --}}}
    PROCEDURE csv_to_array ( --{{{
    -- Utility to take a CSV string, parse it into a PL/SQL table
    -- Note that it takes care of some elements optionally enclosed
    -- by double-quotes.
    p_csv_string IN VARCHAR2,
    p_array OUT wwv_flow_global.vc_arr2,
    p_separator IN VARCHAR2 := ','
    )
    IS
    l_start_separator PLS_INTEGER := 0;
    l_stop_separator PLS_INTEGER := 0;
    l_length PLS_INTEGER := 0;
    l_idx BINARY_INTEGER := 0;
    l_quote_enclosed BOOLEAN := FALSE;
    l_offset PLS_INTEGER := 1;
    BEGIN
    l_length := NVL(LENGTH(p_csv_string),0);
    IF (l_length <= 0)
    THEN
    RETURN;
    END IF;
    LOOP
    l_idx := l_idx + 1;
    l_quote_enclosed := FALSE;
    IF SUBSTR(p_csv_string, l_start_separator + 1, 1) = '"'
    THEN
    l_quote_enclosed := TRUE;
    l_offset := 2;
    l_stop_separator := INSTR(p_csv_string, '"', l_start_separator + l_offset, 1);
    ELSE
    l_offset := 1;
    l_stop_separator := INSTR(p_csv_string, p_separator, l_start_separator + l_offset, 1);
    END IF;
    IF l_stop_separator = 0
    THEN
    l_stop_separator := l_length + 1;
    END IF;
    p_array(l_idx) := (SUBSTR(p_csv_string, l_start_separator + l_offset,(l_stop_separator - l_start_separator - l_offset)));
    EXIT WHEN l_stop_separator >= l_length;
    IF l_quote_enclosed
    THEN
    l_stop_separator := l_stop_separator + 1;
    END IF;
    l_start_separator := l_stop_separator;
    END LOOP;
    END csv_to_array; --}}}
    PROCEDURE get_records(p_blob IN blob,p_records OUT varchar2_t) --{{{
    IS
    l_record_separator VARCHAR2(2) := chr(13)||chr(10);
    l_last INTEGER;
    l_current INTEGER;
    BEGIN
    -- Sigh, stupid DOS/Unix newline stuff. If HTMLDB has generated the file,
    -- it will be a Unix text file. If user has manually created the file, it
    -- will have DOS newlines.
    -- If the file has a DOS newline (cr+lf), use that
    -- If the file does not have a DOS newline, use a Unix newline (lf)
    IF (NVL(dbms_lob.instr(p_blob,utl_raw.cast_to_raw(l_record_separator),1,1),0)=0)
    THEN
    l_record_separator := chr(10);
    END IF;
    l_last := 1;
    LOOP
    l_current := dbms_lob.instr( p_blob, utl_raw.cast_to_raw(l_record_separator), l_last, 1 );
    EXIT WHEN (nvl(l_current,0) = 0);
    p_records(p_records.count+1) := utl_raw.cast_to_varchar2(dbms_lob.substr(p_blob,l_current-l_last,l_last));
    l_last := l_current+length(l_record_separator);
    END LOOP;
    END get_records; --}}}
    -- Utility functions --{{{
    PROCEDURE parse_textarea ( --{{{
    p_textarea IN VARCHAR2,
    p_collection_name IN VARCHAR2
    )
    IS
    l_index INTEGER;
    l_string VARCHAR2(32767) := TRANSLATE(p_textarea,chr(10)||chr(13)||' ,','@@@@');
    l_element VARCHAR2(100);
    BEGIN
    l_string := l_string||'@';
    htmldb_collection.create_or_truncate_collection(p_collection_name);
    LOOP
    l_index := instr(l_string,'@');
    EXIT WHEN NVL(l_index,0)=0;
    l_element := substr(l_string,1,l_index-1);
    IF (trim(l_element) IS NOT NULL)
    THEN
    htmldb_collection.add_member(p_collection_name,l_element);
    END IF;
    l_string := substr(l_string,l_index+1);
    END LOOP;
    END parse_textarea; --}}}
    PROCEDURE parse_file( --{{{
    p_file_name IN VARCHAR2,
    p_collection_name IN VARCHAR2,
    p_headings_item IN VARCHAR2,
    p_columns_item IN VARCHAR2,
    p_ddl_item IN VARCHAR2,
    p_table_name IN VARCHAR2 DEFAULT NULL
    )
    IS
    l_blob blob;
    l_records varchar2_t;
    l_record wwv_flow_global.vc_arr2;
    l_datatypes wwv_flow_global.vc_arr2;
    l_headings VARCHAR2(4000);
    l_columns VARCHAR2(4000);
    l_seq_id NUMBER;
    l_num_columns INTEGER;
    l_ddl VARCHAR2(4000);
    BEGIN
    IF (p_table_name is not null)
    THEN
    BEGIN
    execute immediate 'drop table '||p_table_name;
    EXCEPTION
    WHEN OTHERS THEN NULL;
    END;
    l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
    htmldb_util.set_session_state('P149_DEBUG',l_ddl);
    execute immediate l_ddl;
    l_ddl := 'insert into '||p_table_name||' '||
    'select '||v(p_columns_item)||' '||
    'from htmldb_collections '||
    'where seq_id > 1 and collection_name='''||p_collection_name||'''';
    htmldb_util.set_session_state('P149_DEBUG',v('P149_DEBUG')||'/'||l_ddl);
    execute immediate l_ddl;
    RETURN;
    END IF;
    BEGIN
    select blob_content into l_blob from wwv_flow_files
    where name=p_file_name;
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    raise_application_error(-20000,'File not found, id='||p_file_name);
    END;
    get_records(l_blob,l_records);
    IF (l_records.count < 3)
    THEN
    raise_application_error(-20000,'File must have at least 3 ROWS, id='||p_file_name);
    END IF;
    -- Initialize collection
    htmldb_collection.create_or_truncate_collection(p_collection_name);
    -- Get column headings and datatypes
    csv_to_array(l_records(1),l_record);
    csv_to_array(l_records(2),l_datatypes);
    l_num_columns := l_record.count;
    if (l_num_columns > 50) then
    raise_application_error(-20000,'Max. of 50 columns allowed, id='||p_file_name);
    end if;
    -- Get column headings and names
    FOR i IN 1..l_record.count
    LOOP
    l_headings := l_headings||':'||l_record(i);
    l_columns := l_columns||',c'||lpad(i,3,'0');
    END LOOP;
    l_headings := ltrim(l_headings,':');
    l_columns := ltrim(l_columns,',');
    htmldb_util.set_session_state(p_headings_item,l_headings);
    htmldb_util.set_session_state(p_columns_item,l_columns);
    -- Get datatypes
    FOR i IN 1..l_record.count
    LOOP
    l_ddl := l_ddl||','||l_record(i)||' '||l_datatypes(i);
    END LOOP;
    l_ddl := '('||ltrim(l_ddl,',')||')';
    htmldb_util.set_session_state(p_ddl_item,l_ddl);
    -- Save data into specified collection
    FOR i IN 2..l_records.count
    LOOP
    csv_to_array(l_records(i),l_record);
    l_seq_id := htmldb_collection.add_member(p_collection_name,'dummy');
    FOR i IN 1..l_record.count
    LOOP
    htmldb_collection.update_member_attribute(
    p_collection_name=> p_collection_name,
    p_seq => l_seq_id,
    p_attr_number => i,
    p_attr_value => l_record(i)
    );
    END LOOP;
    END LOOP;
    DELETE FROM wwv_flow_files WHERE name=p_file_name;
    END parse_file; --}}}
    END htmldb_tools;
    /

  • Deleting data from a very large log table (custom table in our namespace)

    Hello,
    I have been tasked with clearing a log table in our landscape so that it contains only the most recent entries.  Is it possible to do this given that the table already has 230,000,000 entries and needs to keep around 600,000 recent entries?
    Should I do this via ABAP, and if so, how?  Thanks,
    Samir

    Hi,
    so you are going to keep 0.3% of your data?
    Whether you do it in ABAP or on the database directly is your decision. In my opinion, working on the database directly should be reserved for exceptional cases, e.g. one-time actions or actions that have to be done very rarely and with different parameters/options. Regular, recurring tasks should be done in ABAP, I think.
    In any case, I would not delete the majority of the records. Instead, copy the records you want to keep into an empty table with the same structure, delete the old table as a whole (check clients!), and either "copy" the new table back in ABAP or rename the new table to the old name after dropping the old table on the database.
    If you have only one client, you can copy the data you need into a new table and truncate the old table (fast deletion for all clients). If you have data to keep for other clients as well, check how much data that is per client compared to the total number of lines (if it is only a small fraction, prefer copying it too).
    On the database you can use CTAS (CREATE TABLE ... AS SELECT), DROP TABLE and RENAME TABLE. These commands should be very efficient, but they work client-independently. If you have to consider clients, SELECT, INSERT, DELETE or TRUNCATE (depending on whether you have copied all data with clients in mind) are your friends; a sketch of the CTAS variant follows.
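    A minimal sketch of the CTAS approach, assuming an Oracle database; ZLOG and its LOG_DATE column are hypothetical names standing in for the custom log table:
    -- copy only the ~600,000 rows to keep (note: CTAS is client-independent)
    CREATE TABLE zlog_keep AS
      SELECT * FROM zlog WHERE log_date >= DATE '2012-06-01';
    -- swap the tables
    DROP TABLE zlog;
    RENAME zlog_keep TO zlog;
    -- indexes, constraints and statistics must be recreated afterwards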
    Kind regards,
    Hermann

  • How to find out the top ten Change log tables in BW

    I want to know the top ten change log tables in terms of size. Is there a SAP standard table with the following fields: "change log table name", "number of records", or "data size"?
    Regards,
    Prashant M J

    Hi,
    Click on the change log table in the ODS/DSO; at the top you see the name as /BIC/*, which is the name of the database table and can also be seen in SE11.
    If you want to see the requests in this change log, use table RSTSODSREQUEST,
    and if you want to see the size in KB or MB, use DB02 or ST04:
    enter the table name /BIC/* and use the "Tables and Indexes" option in the History tab.
    DB02 shows you the sizes of the tables present on the database. In case you cannot do this yourself, your Basis team can help; see also the SQL sketch below.
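    On an Oracle-based system you can also get the top ten directly with SQL; this is a hedged sketch that assumes the change-log tables follow the usual /BIC/B* naming:
    SELECT *
    FROM  (SELECT segment_name, ROUND(bytes / 1024 / 1024) AS size_mb
           FROM   dba_segments
           WHERE  segment_type = 'TABLE'
           AND    segment_name LIKE '/BIC/B%'
           ORDER  BY bytes DESC)
    WHERE ROWNUM <= 10;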
    Thanks and regards
    Kiran

  • No matching records found Goods receipt PO (OPDN) (ODBC-2028) [Message 131-183]

    Hello,
    We are having a problem viewing one of our goods receipts. When going into a PO we can see that there is a target document, but when we click on it the following message pops up: 'No matching records found Goods receipt PO (OPDN) (ODBC -2028) [Message 131-183]'. You cannot view the goods receipt by going through the Purchasing module either. You can, however, see it when you go into the item, right-click and choose 'Stock Posting List'. It then shows the goods received from GRPMS at the bottom under the GR number.
    We now have the invoice and need to enter it into SAP, but because you cannot view the GRN it won't allow me to copy from the GRN.
    Could someone offer assistance?
    Regards,
    Juan

    Dear Juan82,
    From what you describe, it looks like an issue that should be reported in a support message in order to be fixed.
    Please log a message on the portal with your S number.
    When you write the description of the issue please attach screenshots that can help us to understand the issue.
    Regards,
    Marcella Rivi
    SAP Business One Forums Team

Maybe you are looking for

  • Vendor code in Stock Transfer Inspection Lot(08)

    Hi Everybody, I have noticed that when a Stock Transfer lot is created, the 'Data for lot origin' tab in the Inspection Lot shows the Vendor code in the 'Data for goods receipt' section at times, and at times the Vendor code is not included in this section. Pleas

  • How to buy a book of ibooks with your Itunes money

    How to buy a book with your iTunes money which is already on your account? I really want to buy a book but don't want to buy it with Visa.

  • I have Apple TV; when I try to use it over Wi-Fi with my computer it doesn't work

    I have Apple TV; when I try to use it over Wi-Fi with my computer it doesn't work. Please help me.

  • Ipod make computer freeze

    It's pretty weird: it works on my friend's computer and others' (I tried about 5 different ones), but on my own computer when I plug the iPod in and synchronise, it starts as normal but then freezes the computer, so I have to restart (every time), and I eve

  • Oracle Text for Traditional Chinese

    I would like to ask how I can specify keywords in Chinese. Whenever I type something in Chinese, the Oracle Text token is indexed very strangely, and the stop word handling is definitely not always rational in Traditional Chinese. Can anyone help? (DR$