Why has the ARGUMENT$ segment size increased?

Hello everyone,
I imported a dump file of about 800 MB, and after the import completed my database size grew to more than 10 GB; the ARGUMENT$ segment alone now occupies more than 5 GB.
So I would like to know:
1. Why did it grow this way?
2. Since this is a segment in the SYSTEM tablespace, the SYSTEM tablespace itself has grown; is there any way to reduce the SYSTEM tablespace?
The query result below is for your reference.
Thanks in advance.
select owner, segment_name, segment_type, bytes/(1024*1024) size_m
  from dba_segments
 where tablespace_name = 'SYSTEM'
   and bytes/(1024*1024) > 2
 order by size_m desc;
OWNER  SEGMENT_NAME  SEGMENT_TYPE      SIZE_M
SYS    ARGUMENT$     TABLE               5431
SYS    I_ARGUMENT1   INDEX               4366
SYS    I_ARGUMENT2   INDEX               2453
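
For question 1: ARGUMENT$ holds the data dictionary's per-argument metadata for PL/SQL procedures and functions (as far as I can tell it is the base table behind the DBA_ARGUMENTS view), so importing a dump full of packages populates it heavily. A hedged sketch of how to see which schemas account for those rows:

-- sketch only: count argument metadata rows per schema via the documented view
select owner, count(*) argument_rows
  from dba_arguments
 group by owner
 order by argument_rows desc;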

For reducing the SYSTEM tablespace, this AskTom thread should help:
Shrink/Reduce System Tablespace! System tablespace has grown too large!
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:153612348067
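
Note that ARGUMENT$ and its two indexes account for roughly 12 GB here, and a SYSTEM datafile can normally only be resized down to just above its highest-allocated extent, so the immediate saving may be small; the AskTom thread covers when rebuilding is the realistic option. A minimal sketch of checking and attempting a resize (the datafile path and target size are illustrative only, not from your system):

-- sketch: find the SYSTEM datafile(s) and their current sizes
select file_name, bytes/1024/1024 size_m
  from dba_data_files
 where tablespace_name = 'SYSTEM';

-- attempt to shrink; raises ORA-03297 if used extents sit above the requested size
alter database datafile '/u01/oradata/ORCL/system01.dbf' resize 8192M;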

Similar Messages

  • Why does rotating a layer cause image size to increase?

    I was working on the top layer of a PSD document and did a custom rotation of 45 degrees so I could get a better painting angle.  When I rotated the layer back I noticed my image size had increased from 10 x 8 inches to 18.3 x 18.3.  When I tried to resize the image back to 10 x 8 it made my image fat and squatty.  Any ideas what I did wrong when rotating?  I just wanted a better wrist angle for stroking lines...

    This is an example of where Image>Resize>Reveal All comes in.
    If you rotate a layer, using Free Transform or Image>Rotate>Free Rotate Layer,
    and then select the Move tool, you should see a bounding box well outside the original image.
    That is the part of the photo you can't see after rotating the layer.
    If you then go to Image>Resize>Reveal All, Elements makes those areas visible.
    The only caveat about rotating images or layers at angles other than 90-degree increments
    is that it might degrade the image somewhat.
    MTSTUNER

  • Database SP_SearchApp_CrawlStoreDB log size is growing very large

    Hi,
    In my SharePoint environment I configured a Search service application
    and have a separate search server.
    Every two or three days the database log xyzSP_SearchApp_CrawlStoreDB_32fdb1522c5249088db8b09c1917dbec_log
    grows to 80-100 GB.
    How do I control this?
    Adil

    1) Why do you think that 100 GB is too big?
       The total space of the drive where the log file lives is 100 GB.
    2) Are you shrinking the log file?
       I configured a SQL Server job to shrink it every 30 minutes.
    3) How large is your total data set (all content databases including RBS files), and how large is the crawl database itself (the .mdf file)?
       I did not get your point; why do you ask? Should the space I allocate to the drive where the log file is stored depend on the size of the crawl database?
    Adil
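
    A log that needs shrinking every 30 minutes usually points to the recovery model rather than to search itself: in FULL recovery the log is only truncated after log backups. A hedged T-SQL sketch (assuming the database is named like the log file without the _log suffix, and with an illustrative backup path):

        -- check the recovery model of the crawl store database
        SELECT name, recovery_model_desc
          FROM sys.databases
         WHERE name LIKE 'xyzSP_SearchApp_CrawlStoreDB%';

        -- Option A: keep FULL recovery and schedule frequent log backups instead of shrinking
        BACKUP LOG [xyzSP_SearchApp_CrawlStoreDB_32fdb1522c5249088db8b09c1917dbec]
            TO DISK = N'X:\Backups\CrawlStoreDB.trn';

        -- Option B: if point-in-time recovery of the crawl store is not required
        ALTER DATABASE [xyzSP_SearchApp_CrawlStoreDB_32fdb1522c5249088db8b09c1917dbec]
            SET RECOVERY SIMPLE;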

  • PDF size increases dramatically with every submit.

    I have a PDF form designed using Adobe LiveCycle Designer ES2.
    It has a submit button which will submit the form to the server (IIS and ASP.NET) using this javascript command:
    event.target.submitForm( {cURL: "http://server/ASPNETWebPage.ASPX", aPackets:["datasets","pdf"], cSubmitAs: "XDP"});
    On the server, in ASP.NET, I use the following code to extract the submitted "chunk" element and convert it from Base64 to a binary PDF file:
                ' Create the output file and a binary writer over it
                fs = New System.IO.FileStream(mFormFileNameFolder, IO.FileMode.Create)
                bw = New System.IO.BinaryWriter(fs)
                ' Get the chunk element from the submitted XML
                Dim srChunk As New StringReader(mXML.GetElementsByTagName("chunk")(0).InnerXml)
                ' Decode the Base64 payload line by line and append the bytes to the file
                Do While True
                    Dim theChunkLine As String = srChunk.ReadLine()
                    If String.IsNullOrEmpty(theChunkLine) Then
                        Exit Do
                    End If
                    Dim buffer() As Byte = Convert.FromBase64String(theChunkLine)
                    bw.Write(buffer)
                Loop
                bw.Close()
                fs.Close()
    The above code works fine, and the PDF is generated successfully.
    I have one problem.
    With every submit, the generated PDF size increases dramatically. I reported this to Adobe Support, and they confirmed that this is by design: with every submit, the previous PDF state is saved and the new state is added. That is why I get a huge PDF file.
    I was told that the only way to solve this problem is to submit the form as PDF only, and after I save the PDF file on the file system I must then use the Adobe service/process "exportData" to extract the XML data from the PDF.
    I think this is a really big change for me. I was hoping there is a way to identify the latest PDF state from the chunk element.
    Any help will be greatly appreciated.
    Tarek.

    Thanks a lot C. Myers,
    Your explanation helped me understand what is happening.
    I have been following the same method for the past 4 years, and I was hit by this problem (OutOfMemoryException) only when some users started using images larger than 500 KB. That is when I decided to report it.
    I was able to rewrite the code to convert from Base64 to binary using buffering:
    http://forums.asp.net/t/1662571.aspx/1?URGENT+Exception+OutOfMemoryException+thrown+when+when+converting+to+String+
    So far I am not getting OutOfMemoryExceptions, but the PDF size continues to grow with every submit. However, if all the images are smaller than 50 KB, the increase is not significant.
    Please allow me to ask this question:
    Is there a way to change the above code so that I can extract only the last version of the submitted PDF from the data stream "chunk" element?
    Sooner or later someone will notice that such PDF sizes are not logical. Even when the PDF does not have images, I have noticed in the past that some PDF sizes (for the Staff Profile Data Collection form) were something like 15 MB! I was not able to figure out why, but now I understand: the user must have submitted the form for saving many times.
    Things are OK for now, but I will post back if this problem comes back.
    Tarek.

  • MDS Time Delivery Queue size is increasing

    Hello Everyone
    We are using ICM 8.5.3, and I am getting this message in the Event Viewer of the Rogger A and Rogger B servers.
    Computer:      LHRRGRA.ef.UCCE
    Description:
    MDS Time Delivery Queue size is increasing, current size is 5525, but will continue to send messages.
    Event Xml:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="Cisco Systems, Inc. ICM" />
        <EventID Qualifiers="41220">32810</EventID>
        <Level>3</Level>
        <Task>4</Task>
        <Keywords>0x80000000000000</Keywords>
        <TimeCreated SystemTime="2012-06-08T06:12:08.000000000Z" />
        <EventRecordID>127461</EventRecordID>
        <Channel>Application</Channel>
        <Computer>LHRRGRA.ef.UCCE</Computer>
        <Security />
      </System>
      <EventData>
        <Data>5525</Data>
        <Binary>06010000000000003D9030012A8004A160975AA23D45CD0114000000010041006D6473004C485243444352475241003535323500</Binary>
      </EventData>
    </Event>
    Does anyone know why we are getting this?
    Regards
    Irfan Tariq

    The MDS processes on these Routers are unable to communicate quickly enough over their private connection.
    You'll want to check the network connectivity, QoS configuration, link speed, congestion, etc. Also be sure to follow this document to the letter:
    http://docwiki.cisco.com/wiki/Contact_Center_Networking:_Offload,_Receive_Side_Scaling_and_Chimney
    Cheers,
    Kris

  • Java NIO - TCP segment size abnormally low

    Hi !
    After noticing a weird behaviour on our Linux production server for code that works perfectly on my Windows dev box, I used tcpdump to sniff the packets that are actually sent to our clients.
    The code I use to write the data is as simple as :
    // using NIO - buffer is at most 135 bytes long
    channel.write(buffer);
    if (buffer.hasRemaining()) {
        // this never happens
    }
    When the buffer is 135 bytes long, this systematically results in two TCP segments being sent: one containing 127 bytes of data, the other one containing 8 bytes of data.
    Our client is an embedded system which is poorly implemented and handles TCP packets as whole application messages, which means that the remaining 8 bytes end up being ignored and the operation fails.
    I googled it a bit, but couldn't find any info about the possible culprit (buffer sizes and default max TCP segment sizes are of course way larger than 127 bytes!).
    Any ideas ?
    Thanks !

    NB the fragmentation could also be happening in any intermediate router.
    All I can suggest is that you set the TCP send buffer size to some multiple of the desired segment size, or maybe just set it very large, like 64k-1, so that you can reduce its effect on segmentation.

  • Word document size has increased after migration?

    Hello
    We migrated from SharePoint 2003 to SharePoint 2010 and noticed that the file size of every document in each document library has increased.
    I compared a couple of documents; the content is the same.
    Is this expected behaviour of the new framework, or what is the logic behind it?
    Can any one please explain on this?
    Avi

    I did experience this after a migration, and attributed it to differences in the versions of SQL Server being used. Random checks gave identical results, just as in your case. After using it for many months, we got no end-user complaints about it.
    Needless to say, having documents of different sizes after a migration can be very discomforting for just about all involved parties.
    In my case there was a difference: when migrating via a 3rd-party tool there were file size differences, while when migrating via standard MS tooling (by exporting individual site collections, but also by attaching content DBs) we experienced no differences.
    Kind regards,
    Margriet Bruggeman
    Lois & Clark IT Services
    web site: http://www.loisandclark.eu
    blog: http://www.sharepointdragons.com

  • In CS6, why does the file size remain the same after cropping? And how do I reduce the size after each crop?

    In CS6, why does the file size remain the same after cropping? And how do I reduce the size after each crop? Thx

    Select the Crop Tool and check the box [  ] Delete Cropped Pixels and you should see a reduction in file size.  With the box unchecked, the data is still maintained in the document, you just can't see it.
    -Noel

  • Will the database size increase?

    Hi All,
    I would like to ask: will the datafile size increase if we put our tablespace in backup mode and forget to take it back out with END BACKUP?
    I think it will not, because all the transaction activity is still recorded in the redo logs, and of course there will be a significant increase in redo. Please correct me if I am wrong.
    hare krishna
    Alok

    The size of the datafile itself will not be affected, but as you mentioned, the amount of redo that you generate will increase.
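
    To check for and clear a forgotten backup mode, something along these lines should work (a sketch only; the tablespace name is illustrative):

        -- list datafiles that are still in backup mode
        select d.name, b.status, b.time
          from v$backup b join v$datafile d on d.file# = b.file#
         where b.status = 'ACTIVE';

        -- take the affected tablespace out of backup mode
        alter tablespace users end backup;
        -- or, in recent versions, end backup mode for all datafiles at once
        alter database end backup;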

  • Adobe interactive form (ABAP WD) size is increasing and it does not open from the EP UWL

    Hello All,
    The Adobe interactive form (ABAP Web Dynpro) size is increasing after cleansing data from MDM, and the form does not open from the portal UWL.
    The same application works fine in the Dev and QA environments, where the form size is around 150 KB after cleansing data from MDM. For some reason, in production it is more than 2 MB. FYI, we have multiple app servers in production.
    /Padmanaban

    Hi Babi,
    In the Adobe form layout library, use the Submit button (which internally means "submit to SAP") from the Web Dynpro Native category.
    Only this button action can connect Adobe to Web Dynpro.
    Whenever this button is clicked, the onSubmit event of the InteractiveForm UI element in the Web Dynpro component is triggered; there we can write our ABAP code. Hope this helps.
    Regards,
    Simi A  M
    Edited by: amsimi on Mar 22, 2011 11:37 AM

  • "Limit Capture/Export File Segment Size To" doesn't work

    I set the limit to 4000 MB before I started capturing HD video, but it didn't work. Several of my files are bigger than 4 GB. This is a problem since I use an online backup service that has 5 GB as the maximum file size limit. Any suggestion to fix this problem is highly appreciated.

    I believe, although I am not 100% sure, that the "Limit Capture/Export File Segment Size To" does not apply to Log and Transfer, only Log and Capture.
    Since Log and Capture works with tape sources, when the limit is hit the tape can be stopped, pre-rolled and started again to create another clip.
    In the case of Log and Transfer, it is ingesting and transcoding existing files; the clip lengths (and therefore size) are already determined by the source clip.
    If you are working with very lengthy clips, you may want to set in and out points to divide the clips into smaller segments.
    MtD

  • Segment size problem

    I use Apache HttpClient to upload XML files to a server over HTTPS. It works for some files, but for some of them I cannot even get a response. Using snoop to check the network, I found that in the bad cases there is one frame shown as an "unreassembled packet" with an INCORRECT checksum; the size of the frame is 1514, the total length in the IP header is 1500, but at the beginning of the connection the advertised MSS is 1460.
    So, can someone explain the relationship between these numbers, and where I can configure or control the segment size when uploading, or is that impossible?

    Yes, 1514 is right, and the incorrect checksum is because I am running snoop on the server itself, so the checksum has not yet been calculated by the NIC and is always 0. I have now noticed that the problem happens when the segment reaches 1460 bytes, and then this packet is "unreassembled". So I am wondering where the problem is: SSL, system configuration, or something else. By the way, my server is Solaris 10, and the peer, I am not sure, is IBM, but I think they may use a proxy server.
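
    For what it is worth, the three numbers fit the standard Ethernet/IP/TCP header sizes, so the capture itself looks normal:

        1514-byte frame     = 14-byte Ethernet header + 1500-byte IP packet
        1500-byte IP packet = 20-byte IP header + 20-byte TCP header + 1460 bytes of payload (the advertised MSS)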

  • Why have my monthly payments increased?

    why have my monthly payments increased?

    Hi sallyb89
    You signed up in February 2013 for a promotional price that is valid for the first 12 months.  After the promo period your plan renews at the regular price.  An email is sent 30 days before your annual renewal date with details of the price for the next 12 months - perhaps it went into a spam/junk folder?
    Kind regards
    Bev

  • Email: to facilitate emailing a photo by Gmail or AOL, I reduce the file size. Is it better to reduce the image size or increase the JPEG compression?

    To facilitate emailing a photo by gmail or AOL and avoid overwhelming the recipient's screen, I reduce the file size.  Is it better to reduce the image size or increase the jpg compression?  I have been making a duplicate image of 35 MB and reducing the image size to 8"x12" at 72 resolution giving a file size of 1.4 MB.  Then I SAVE AS a jpg of medium compression giving a  file size of about 111 KB.  Overkill?

    Go to File>Save for Web.
    I usually make the long side 800 px.
    At the bottom of the dialog, check "Constrain Proportions."
    At the top, select JPEG in the dropdown for the file format.
    All the work is done for you! 72 px/in is OK for web work; 240-300 px/in is the recommended range for printing.

  • Segment Size when Packaging

    For a while now, when packaging material for our intranet I
    use a custom setting that works best with our data transfer rates.
    I am getting ready to put some courseware on our outside server for
    other employees to access when they are offsite. My question is
    should I use 56 kbs Modem or DSL/Cable Modem? If I use the higher
    setting, how would it impact 56 kbs users? If I use lower settings,
    would 56 kbs users notice any difference? Would cable modem users
    see slower performance when packaged at lower settings?
    Anyone have some experience in testing these settings?
    Thanks!
    Steve

    > Anyone have some experience in testing these settings?
    I always go for the larger segment size, for a number of reasons, including:
    - The browser has to make fewer calls for segments - each request has a certain overhead; latency waiting for the request to be answered etc. Take the extreme and try 1k segments and see how bad it can be.
    - Larger segments may cause what seem like longer delays on a single request, but they download more code, so there are fewer requests. The user perception can be that the course is more usable.
    - If you have any large graphics or other internal content, they won't be split up anyway, so many segments won't ever be so small as 16k.
    Steve
    EuroTAAC eLearning 2007
    http://www.eurotaac.com
    Adobe Community Expert: Authorware, Flash Mobile and Devices
    My blog -
    http://stevehoward.blogspot.com/
    Authorware tips -
    http://www.tomorrows-key.com
