How did table SMW3_BDOC become so big?

Hi,
The table SMW3_BDOC, which stores BDocs in my system, has become very big, with several million records. Some BDocs in this table were sent several months ago. I find it very strange that those BDocs were never processed.
If I clean this table, will it cause inconsistencies in the system? And how can I clean out those very old BDocs?
Thanks a lot for your help!

Hi Long,
I faced the same issue recently on our production system; it created a huge performance problem and completely blocked the system with timeout errors.
I was able to clean it up by running the report SMO8_FLOW_REORG in SE38.
If you are sure about cleaning up, first delete all the unnecessary BDocs and then run this report.
At the same time, check whether any CSA* queue is stuck in the CRM inbound queue (SMQ2). If so, select it, manually unlock it, activate it, and then refresh. Also check for any other unnecessary queues stuck there.
Hope this helps.
regards,
kalyan

Similar Messages

  • How does table CE4JJ01_ACCT get populated?

    Hi All,
    Good day.
    I would like to seek help from everyone on how the table CE4JJ01_ACCT is populated. Does it need a program, a transaction code, or a function module to be executed first in order to populate it?
    Your help is very much appreciated.
    Thanks.
    Regards,
    JC

    Hi ,
    Is it related to CO-PA ?
    regards
    Prabhu

  • Database Log File becomes very big, What's the best practice to handle it?

    The log of my production database is getting very big and the hard disk is almost full. I am pretty new to SAP but familiar with SQL Server. Can anybody give me advice on the best practice for handling this issue?
    Should I shrink the database?
    I know that adding hard disk space is needed in the long term.
    Thanks in advance.

    Hi Finke,
    Usually the log file fills up and grows huge due to not having regular transaction log backups. If your database is in FULL recovery mode, every transaction is logged in the transaction log file, and the log is cleared only when you take a log backup. If it is a production system and you don't have regular transaction log backups, the problem is just sitting there waiting to explode when you need a point-in-time restore. Please check your backup/restore strategy.
    Follow these steps to get transactional file back in normal shape:
    1.) Take a transaction log backup.
    2.) Shrink the log file: DBCC SHRINKFILE('logfilename', 10240)
          The above command will shrink the file to 10 GB (a recommended size for high-transaction systems).
    Finke Xie wrote:
    > Should I Shrink the Database?
    "NEVER SHRINK DATA FILES"; shrink only the log file.
    3.) Schedule log backups every 15 minutes.
    Thanks
    Mush

  • How do I open a VERY big file?

    I hope someone can help.
    I did some testing using a LeCroy LT342 in segment mode. Using the
    LabVIEW driver I downloaded the data over GPIB and saved it to a
    spreadsheet file. Unfortunately this created very big files (ranging from
    200 MB to 600 MB). I now need to process them, but LabVIEW doesn't like
    them. I would be very happy to split the files into an individual file
    for each row (I can do this quite easily), but LabVIEW just sits there
    when I try to open the file.
    I don't know enough about computers and memory (my spec is a 1.8 GHz
    Pentium 4 with 384 MB RAM) to figure out whether, if I just leave it
    long enough, it will do the job or not.
    Has anyone any experience or help they could offer?
    Thanks,
    Phil

    When you open (and read) a file, you usually move it from your hard disk (permanent storage) into RAM. This lets you manipulate it at high speed using fast RAM memory; if you don't have enough RAM to hold the whole file, you are forced into virtual memory (which uses swap space on the hard disk as "virtual" RAM), and that is very slow. Since you only have 384 MB of RAM and want to process huge files (200 MB to 600 MB), you could easily and inexpensively upgrade to 1 GB of RAM and see a large speed increase. A better option is to load the file in chunks: read some number of lines at a time, process that amount of data, and repeat until the file is complete. This is more programming, but it lets you use much less RAM at any instant.
    Paul
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA
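    Paul's chunk-at-a-time approach is language-independent. As a rough sketch of the idea in Python rather than LabVIEW (`handle_chunk` is a hypothetical stand-in for the real per-chunk processing):

    ```python
    def handle_chunk(lines):
        # Hypothetical placeholder for the real processing step;
        # here it just counts the rows it was given.
        return len(lines)

    def process_rows(path, chunk_size=10000):
        """Read a huge text file a fixed number of lines at a time,
        so memory use stays bounded regardless of file size."""
        total = 0
        with open(path, "r") as f:
            chunk = []
            for line in f:
                chunk.append(line)
                if len(chunk) == chunk_size:
                    total += handle_chunk(chunk)  # process, then discard
                    chunk = []
            if chunk:  # leftover lines at the end of the file
                total += handle_chunk(chunk)
        return total
    ```

    Only `chunk_size` lines are ever held in memory at once, so even a 600 MB file can be processed on a machine with 384 MB of RAM.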

  • How can I store a very big file in Oracle XML DB?

    Hello,
    I'm looking for a fast method to store an XML file in Oracle 10g XE. I tried to store a 500 KB file in the database as an XMLType or a CLOB, but I keep hitting the same error: "ORA-01704: string literal too long". I have been looking for a long time for a way to store this file, and another one (113 MB), in the database. I searched Google for a solution, and the only one I found was to split the document in a loop (because of the 32 KB limit), but that approach doesn't allow storage against an XML schema and is too slow.
    Here is an example of how I did it (but it didn't work):
    create table Mondial(Nr int, xmldata xmltype);
    INSERT INTO Mondial VALUES (1, 'big xml file');
    I also tried the alternative with a bind variable, like this:
    create or replace PROCEDURE ProcMondial IS
    poXML CLOB;
    BEGIN
    poXML := 'big xml file';
    INSERT INTO Mondial VALUES (1, XMLTYPE(poXML));
    EXCEPTION
    WHEN OTHERS THEN
    raise_application_error(-20101, 'Exception occurred in Mondial procedure :'||SQLERRM);
    END ProcMondial;
    I get the same error: string literal too long!
    I am using SQL Developer for the query.
    Please help me, I'm desperate.
    thanks!
    Michael

    If you use the suggested statement
    create table Mondial(Nr int, xmldaten xmltype) TABLESPACE mybigfile;
    then I hope for your sake that this XML content will not be used for content-driven procedures such as selecting, updating, or deleting parts of the XML data. The default for "xmltype" is a CLOB physical representation which, unless you are on 11g and combine it with an XMLIndex, is only useful for document-driven XML storage. That means you ALWAYS delete, select, update, or insert the WHOLE XML content (per record). If that is not your intent, you will run into performance problems.
    Instead, use Object Relational or Binary XML (11g) storage with the XMLType datatype, or XMLType with CLOB-based storage AND an XMLIndex if you are on 11g. Carefully read the first and/or second chapters of the 10g/11g XML DB Developer's Guide. Choose your XMLType storage (and/or XML design) carefully if you don't want to be disappointed with the end result and/or have to redesign your XML tables.
    A short introduction to the possibilities can be found here (not only 11g-related, btw): http://www.liberidu.com/blog/?p=203
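    The ORA-01704 error in the question comes from pasting the XML into the SQL text as a string literal; passing it as a bind variable sidesteps the literal-length limit entirely. A minimal sketch of that principle in Python, using SQLite in place of Oracle (so the XMLType specifics don't apply, and the table only loosely mirrors the Mondial example):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for the real database
    conn.execute("CREATE TABLE Mondial (Nr INTEGER, xmldata TEXT)")

    # A document far larger than anything you would inline as a literal.
    big_xml = "<mondial>" + "<country/>" * 50000 + "</mondial>"

    # The document travels as a bind parameter (?), never as part of
    # the SQL text, so statement-length limits are irrelevant.
    conn.execute("INSERT INTO Mondial VALUES (?, ?)", (1, big_xml))
    conn.commit()

    stored = conn.execute("SELECT xmldata FROM Mondial WHERE Nr = 1").fetchone()[0]
    assert stored == big_xml  # the full document round-trips intact
    ```

    In Oracle the same idea applies: bind the CLOB/XMLType value from your client (JDBC, OCI, PL/SQL variable, etc.) instead of concatenating it into the statement text.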

  • How does my company become an Authorized Apple Reseller? I've tried everything in the books.

    Tried every which way and keep hitting a brick wall. I even got an Apple Resale Certificate ID assigned to my company, only for it to become "invalid". Does anyone have any idea how to resell (especially the iPad 2) without having to apply to become a reseller through the web? Would it be possible to resell under another reseller? This is pretty frustrating.

    Unfortunately, we are all just users here, so you will need to contact Apple:
    http://www.apple.com/contact/

  • How does table J_1iEWT_ACKN_N  get updated

    Dear All,
    In the TDS certificate printing for India, there is a field to fill in the acknowledgement numbers of all the quarterly TDS statements. How do we update this field? What is the transaction or setting to do this?
    Kindly provide your valuable inputs
    regards
    AK

    You have to apply OSS Note 1046758.
    I am also working on the same issue.
    Reward points if useful.

  • HT1473 How does an organization become a content supplier for iTunes Store?

    I'm seeking clarification on this, as the organization I work for would like to use this opportunity to market its content on the iTunes Store.

    Start here -> iTunes: Sell your content

  • Question about reading a very big file into a buffer.

    Hi, everyone!
    I want to randomly load several characters from the GB2312 charset to form a string.
    I have two questions:
    1. Where can I find the charset table file? I have googled for hours but
    failed to find the GB2312 charset file.
    2. I think the charset table file is very big, and I doubt whether I can
    load it into a String or StringBuffer. Does anyone have a solution?
    How do I load a very big file and randomly select several characters
    from it?
    Have I made myself clear?
    Thanks in advance,
    George

    The following can give the correspondence between GB2312-encoded byte arrays and characters (as hexadecimal integers).
    import java.nio.charset.*;
    import java.io.*;
    public class GBs {
        static String convert() throws UnsupportedEncodingException {
            StringBuffer buffer = new StringBuffer();
            String l_separator = System.getProperty("line.separator");
            Charset chset = Charset.forName("EUC_CN"); // GB2312 is an alias of this encoding
            CharsetEncoder encoder = chset.newEncoder();
            int[] indices = new int[Character.MAX_VALUE + 1];
            // mark every character the encoder can represent
            for (int j = 0; j <= Character.MAX_VALUE; j++) {
                indices[j] = encoder.canEncode((char) j) ? 1 : 0;
            }
            byte[] encoded;
            for (int j = 0; j < indices.length; j++) {
                if (indices[j] == 1) {
                    encoded = Character.toString((char) j).getBytes("EUC_CN");
                    for (int q = 0; q < encoded.length; q++) {
                        buffer.append(Byte.toString(encoded[q]));
                        buffer.append(" ");
                    }
                    buffer.append(": 0x").append(Integer.toHexString(j));
                    buffer.append(l_separator);
                }
            }
            return buffer.toString();
        }
        // the following is for testing
        /*
        public static void main(String[] args) throws Exception {
            String str = GBs.convert();
            System.out.println(str);
        }
        */
    }

  • iPhone fell and there is a very big dent on the left side. The screen is cracked on both sides. Can I send it to Apple and have it repaired for around 160€, or how does it work?

    Hi, my iPhone fell and there is a very big dent on the left side. The screen is cracked on both sides. Can I send it to Apple and have it repaired/replaced for around 160€, or how does it work?

    Service Answer Center - iPhone - http://support.apple.com/kb/index?page=servicefaq&geo=US&product=iphone  <-- enter the correct country in the drop-down menu once on the page.

  • Improve the performance of a stored procedure in SQL Server 2008, especially the WHERE clause on a very big table - Urgent

    Hi,
    I am looking for input on tuning a stored procedure in SQL Server 2008. I am new to performance tuning in SQL, PL/SQL, and Oracle. I am currently facing an issue with a stored procedure and need to improve its performance by optimizing the code and filtering records with a WHERE clause on a large table. The stored procedure generates an audit report that is accessed by approximately 10 admin users, typically 2-3 times a day each.
    It contains a CTE (common table expression) that is referenced twice within the SP. This CTE is very big and fetches records from several tables without a WHERE clause, so many rows are fetched from the database and then processed. The stored procedure runs on a pre-prod server (a virtual server with 6 GB of memory), while the same procedure runs fine on the prod server (a physical server with 64 GB of RAM, about 40 seconds). The execution time in pre-prod is 1 minute 9 seconds, which needs to be reduced to about 10 seconds. The execution time also varies: sometimes 50 seconds, sometimes 1 minute 9 seconds.
    Please advise on the best option/practice for using a WHERE clause to filter the records, and on the tools to use for tuning the procedure, such as the execution plan or SQL Profiler. I am using Toad for SQL Server 5.7; I can see the Execution Plan tab while running the SP, but when I run it, it throws an error. Please help and provide input.
    Thanks,
    Viji

    You've asked a SQL Server question in an Oracle forum.  I'm expecting that this will get locked momentarily when a moderator drops by.
    Microsoft has its own forums for SQL Server; you'll have more luck over there.  When you do go there, however, you'll almost certainly get more help if you can pare down the problem (or at least better explain what your code is doing).  Very few people want to read hundreds of lines of code, guess what it's supposed to do, guess what is slow, and then guess at how to improve things.  Posting query plans, the results of profiling, cutting out any code that is unnecessary to the performance problem, etc. will get you much better answers.
    Justin

  • Optimize delete in a very big database table

    Hi,
    To delete entries from a database table I use the statement:
    DELETE FROM <table> WHERE <zone> = 'X'.
    The delete takes seven hours (the table is very big, and <zone> isn't indexed).
    How can I optimize this to reduce the delete time?
    Thanks in advance for your response.
    Regards.

    What is the size of the table, and how many lines are you going to delete?
    I would recommend deleting only 5,000 to 10,000 records in one step, looping until nothing is left:
    DO 100 TIMES.
      SELECT * FROM <table>
             UP TO 10000 ROWS
             INTO TABLE itab
             WHERE <zone> = 'X'.
      IF itab IS INITIAL.
        EXIT.
      ENDIF.
      DELETE <table> FROM TABLE itab.
      COMMIT WORK.
    ENDDO.
    If this is still too slow, then you should create a secondary index on <zone>.
    You can drop the index after the deletion is finished.
    Siegfried
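    Siegfried's batching pattern carries over to other databases as well. Here is a sketch in Python with SQLite (the table and column names are invented for illustration): delete matching rows in fixed-size committed batches rather than in one huge statement, so each transaction stays small:

    ```python
    import sqlite3

    def delete_in_batches(conn, batch_size=10000):
        """Delete rows with zone = 'X' in small committed batches."""
        deleted = 0
        while True:
            cur = conn.execute(
                "DELETE FROM big_table WHERE rowid IN "
                "(SELECT rowid FROM big_table WHERE zone = 'X' LIMIT ?)",
                (batch_size,),
            )
            conn.commit()  # keep each transaction (and its log volume) small
            if cur.rowcount == 0:  # nothing left to delete
                break
            deleted += cur.rowcount
        return deleted
    ```

    As in the ABAP version, a temporary index on the filter column would speed up the inner SELECT and can be dropped once the cleanup is done.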

  • My iPad 1 has become very sluggish. Why, and how do I fix it?

    My iPad 1 has become very sluggish. Even typing often has small delays in showing letters. It seems variable, alas, so no obvious app or condition is implicated. I have closed all apps and done a cold restart but that does not help. I have the latest OS updates. Would pruning my photo library help - I usually add photos directly from my cameras and have quite a few, and they are large files.
    Thanks,
    Tom

    Actually I was hoping you could tell me how you resolved the leopard Mail problem:
    https://discussions.apple.com/thread/2673343?start=0&tstart=0

  • Very Big Table (36 Indexes, 1000000 Records)

    Hi
    I have a very big table (76 columns, 1,000,000 records). These 76 columns include 36 foreign key columns; each FK has an index on the table, and only one of the FK columns has a value at any given time while all the others are NULL. All the FK columns are of type NUMBER(20,0).
    I am facing a performance problem which I want to resolve, taking into consideration that this table is used for DML (insert, update, delete) as well as query (select) operations, all performed daily. I want to improve this table's performance, and I am considering these scenarios:
    1- Replace all 36 FK columns with 2 columns (ID, TABLE_NAME) (ID for the master table's ID value, and TABLE_NAME for the master table's name) and create only one index on these 2 columns.
    2- Partition the table by its YEAR column, keep all the FK columns, but drop all the indexes on them.
    3- Partition the table by its YEAR column, drop all the FK columns, create (ID, TABLE_NAME) columns, and create an index on (TABLE_NAME, YEAR).
    Which approach is more efficient?
    Do I have to keep master-detail relations in mind when building Forms on this table?
    Are there any other suggestions?
    I am using an Oracle 8.1.7 database.
    Please help.

    Hi everybody
    I would like to thank you for your cooperation, and I will try to answer your questions. Please note that I am a developer first and I am new to Oracle database administration, so please forgive any mistakes.
    Q: Have you gathered statistics on the tables in your database?
    A: No, I have not. If I must do it, should I do it for all database tables or only for this big table?
    Q: Actually, tracing the session with event 10046 level 8 will give a clear idea of where your query is waiting.
    A: I do not know what "10046 level 8" means.
    Q: What OS and what kind of server (hardware) are you using?
    A: I am using the Windows 2000 Server operating system; my server has 2 Intel XEON 500 MHz CPUs, 2.5 GB RAM, and 4 x 36 GB hard disks (on a RAID 5 controller).
    Q: How many concurrent users do you have, and how many transactions per hour?
    A: I have 40 concurrent users and an average of 100 transactions per hour, but the peak can go up to 1000 transactions per hour.
    Q: How fast should your queries execute?
    A: I want the queries to execute in about 10 to 15 seconds, or else everybody here will complain. Because this table is heavily used, there is a very good chance that 2 or more transactions exist at the same time, one performing a query and the other a DML operation. Some of the queries are used in reports and can be long (e.g. retrieving a summary of 50,000 records).
    Q: Please show us the explain plans of these queries.
    A: If I understand correctly, you are asking me to show you the explain plans of those queries. Well, first, I do not know how, and second, I think it is a big ask because I cannot collect every kind of query that has been written against this table (some exist in server packages, and others are performed by Forms or Reports).

  • MacBook Pro has become very slow. How do I analyze and resolve this?

    My MacBook Pro has become very slow. How do I analyze and resolve this?

    First, back up all data immediately, as your boot drive might be failing.
    There are a few other possible causes of generalized slow performance that you can rule out easily.
    Reset the System Management Controller.
    If you have many image or video files on the Desktop with preview icons, move them to another folder.
    If applicable, uncheck all boxes in the iCloud preference pane.
    Disconnect all non-essential wired peripherals and remove aftermarket expansion cards, if any.
    Check your keychains in Keychain Access for excessively duplicated items.
    Boot into Recovery mode, launch Disk Utility, and run Repair Disk.
    If you're booting from an aftermarket SSD, see whether there's a firmware update for it.
    Otherwise, take the steps below when you notice the problem.
    Step 1
    Launch the Activity Monitor application in any of the following ways:
    ☞ Enter the first few letters of its name into a Spotlight search. Select it in the results (it should be at the top.)
    ☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the key combination shift-command-U. The application is in the folder that opens.
    ☞ Open LaunchPad. Click Utilities, then Activity Monitor in the icon grid.
    Select the CPU tab of the Activity Monitor window.
    Select All Processes from the menu in the toolbar, if not already selected.
    Click the heading of the % CPU column in the process table to sort the entries by CPU usage. You may have to click it twice to get the highest value at the top. What is it, and what is the process? Also post the values for % User, % System, and % Idle at the bottom of the window.
    Select the System Memory tab. What values are shown in the bottom part of the window for Page outs and Swap used?
    Next, select the Disk Activity tab. Post the approximate values shown for Reads in/sec and Writes out/sec (not Reads in and Writes out.)
    Step 2
    If you have more than one user account, you must be logged in as an administrator to carry out this step.
    Launch the Console application in the same way you launched Activity Monitor. Make sure the title of the Console window is All Messages. If it isn't, select All Messages from the SYSTEM LOG QUERIES menu on the left. If you don't see that menu, select
    View ▹ Show Log List
    from the menu bar.
    Select the 50 or so most recent entries in the log. Copy them to the Clipboard (command-C). Paste into a reply to this message (command-V). You're looking for entries at the end of the log, not at the beginning.
    When posting a log extract, be selective. Don't post more than is requested.
    Please do not indiscriminately dump thousands of lines from the log into this discussion.
    Important: Some personal information, such as your name, may appear in the log. Anonymize before posting. That should be easy to do if your extract is not too long.
