Inserting large amounts of data

I am not sure if this is the best way of doing this, but I am trying to store icing data that we gather for general aviation purposes in a spatial table in Oracle 8.1.7.
The data set has about 10 altitude levels with about 10,000 points at each level. Each point corresponds to a small rectangular region on the map.
We receive a new data set every hour. The problem is that inserting these points into the database takes a long time. I am using JDBC with Oracle Update Batching and a commit every 100 inserts. The actual insert takes place in a stored procedure.
According to my benchmarking it takes about 10 seconds for every 100 inserts. Here is the insert statement:
insert into ncar_detail
(ncar_detail_id, ncar_id, altitude, grid_value, grid_point)
values
(ncar_detail_seq.nextval, p_ncar_id, p_altitude, p_grid_value,
 mdsys.sdo_geometry(2003, 8307, null,
   mdsys.sdo_elem_info_array(1,1003,3),
   mdsys.sdo_ordinate_array(p_ll_lon, p_ll_lat, p_ur_lon, p_ur_lat)));
If you have any comments on how to improve performance, I would appreciate it. The reason I am storing each point in the database is to be able to run proximity queries for hazard detection. Is this impractical?
Thanks
-Selim
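For reference, here is a rough sketch of how the batched insert could look from plain JDBC, using standard addBatch/executeBatch against the insert statement itself rather than the stored procedure (only the table, sequence, and column names come from the statement above; the data layout and the insertGrid signature are made up for illustration):

import java.sql.Connection;
import java.sql.PreparedStatement;

public class GridInsert {

    private static final String SQL =
        "insert into ncar_detail " +
        "(ncar_detail_id, ncar_id, altitude, grid_value, grid_point) values " +
        "(ncar_detail_seq.nextval, ?, ?, ?, " +
        " mdsys.sdo_geometry(2003, 8307, null, " +
        "  mdsys.sdo_elem_info_array(1,1003,3), " +
        "  mdsys.sdo_ordinate_array(?, ?, ?, ?)))";

    // points[i] = {altitude, gridValue, llLon, llLat, urLon, urLat}
    public static void insertGrid(Connection con, long ncarId, double[][] points)
            throws Exception {
        con.setAutoCommit(false);
        PreparedStatement ps = con.prepareStatement(SQL);
        int count = 0;
        for (int i = 0; i < points.length; i++) {
            double[] p = points[i];
            ps.setLong(1, ncarId);
            ps.setDouble(2, p[0]); // altitude
            ps.setDouble(3, p[1]); // grid value
            ps.setDouble(4, p[2]); // lower-left longitude
            ps.setDouble(5, p[3]); // lower-left latitude
            ps.setDouble(6, p[4]); // upper-right longitude
            ps.setDouble(7, p[5]); // upper-right latitude
            ps.addBatch();
            if (++count % 100 == 0) {
                ps.executeBatch(); // one round trip per 100 rows
                con.commit();
            }
        }
        ps.executeBatch();         // flush the last partial batch
        con.commit();
        ps.close();
    }
}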

Hi,
I don't know enough about SQL*Loader to help with the sequence number questions, but I'll see if I can get someone with more knowledge in that area to respond.
Regarding the speed of inserting:
Updating/adding data in spatial columns that have been indexed using R-trees is relatively slow. It may not be a JDBC issue at all.
There are a few things I would suggest:
Try rebuilding the R-tree index with a very large value for sdo_rtr_pctfree. This leaves room in the index structures for newly inserted data. If inserts are fast after that, rebuild the index periodically (see the sketch below).
Also, how long does it take to build the index in general? If it is fast, try dropping the index and recreating it after the new data is inserted.
You may want to check the performance of queries and inserts using quadtree indexes.
You might want to have separate tables based on each level, or in 9i, partition the data based on the level.
Also, in 9i the R-tree index on data with an SRID of 8307 (or any geodetic SRID) becomes a geodetic index, so it is more accurate. There are restrictions, though; one of them is that the optimized rectangle is not supported. I'm not sure you need full geodetic support, in which case you may want to remove the SRID.
Also, indexing geodetic layers is a bit slower, and some operations take a bit longer: length, distance, and area calculations are accurate on the surface of the earth without having to project the data, but additional CPU cycles are required because the geometry is non-Euclidean.
If you do need to keep the geodetic SRID, and you want the geodetic features in 9i, then you should load the rectangles with all 4 corners defined (so 5 coordinates, as the first and last have to be the same).
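To make the first two suggestions concrete, here is a rough JDBC sketch of both the rebuild-with-pctfree and the drop/recreate approaches; the index name, the indexed column, and the pctfree value are placeholders, not something from Selim's schema:

import java.sql.Connection;
import java.sql.Statement;

public class SpatialIndexMaintenance {

    // Rebuild the existing R-tree with extra free space so that
    // subsequent inserts have room inside the index structures.
    public static void rebuildWithPctfree(Connection con) throws Exception {
        Statement stmt = con.createStatement();
        stmt.execute("alter index ncar_detail_sidx rebuild "
                   + "parameters ('sdo_rtr_pctfree=50')");
        stmt.close();
    }

    // Alternative: drop the index before the hourly load and
    // recreate it once all the new rows are in.
    public static void dropBeforeLoad(Connection con) throws Exception {
        Statement stmt = con.createStatement();
        stmt.execute("drop index ncar_detail_sidx");
        stmt.close();
    }

    public static void recreateAfterLoad(Connection con) throws Exception {
        Statement stmt = con.createStatement();
        stmt.execute("create index ncar_detail_sidx on ncar_detail(grid_point) "
                   + "indextype is mdsys.spatial_index "
                   + "parameters ('sdo_rtr_pctfree=50')");
        stmt.close();
    }
}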
Hope this is useful.
dan

Similar Messages

  • Couldn't copy large amount of data from enterprise DB to Oracle 10g

    Hi,
    I am using iBATIS to copy data from the enterprise DB (EDB) to Oracle and vice versa.
    The datatype of the field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
    I am binding these to a Java String property in a POJO to bind values.
    I could successfully copy a limited amount of data from EDB to Oracle, but if there is more data I get the following exceptions with different Oracle drivers (I could, however, read a large amount of data from EDB):
    --- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
    at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
    at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
    at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
    at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
    com.ibatis.common.jdbc.exception.NestedSQLException:
    --- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
    --- The error occurred while applying a parameter map.
    --- Check the updateOracleFromEDB-InlineParameterMap.
    --- Check the parameter mapping for the 'btlxml' property.
    --- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
    at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
    at com.iba
    I have the latest Oracle 10g JDBC drivers.
    Remember, I could copy any amount of data from Oracle to EDB, but not the other way around.
    Please let me know if you have come across this issue; any recommendation is very much appreciated.
    Thanks,
    CK.

    Hi,
    I finally remembered how I solved this issue previously.
    The JDBC driver isn't able to directly call the insert with an XMLType column. The solution I was using was to build a wrapper procedure in PL/SQL.
    Here it is (for insert, but I suppose that update will be the same):
    create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
    begin
         insert into AQAOST_FILES (file_no, program_no, ost_xml, btl_xml)
         values (file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
    end insertXML;
    Here is the sqlmap file I used:
    <?xml version="1.0" encoding="UTF-8" ?>
    <!DOCTYPE sqlMap
    PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
    "http://ibatis.apache.org/dtd/sql-map-2.dtd">
    <sqlMap>
         <typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
         <insert id="insert" parameterClass="AqAost">
              begin
                   insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
              end;
         </insert>
    </sqlMap>
    And here is a simple test program:
    package com.sg2net.jdbc;
    import java.io.IOException;
    import java.io.Reader;
    import java.io.StringWriter;
    import java.sql.Connection;
    import oracle.jdbc.pool.OracleDataSource;
    import com.ibatis.common.resources.Resources;
    import com.ibatis.sqlmap.client.SqlMapClient;
    import com.ibatis.sqlmap.client.SqlMapClientBuilder;
    public class TestInsertXMLType {

         /**
          * @param args
          */
         public static void main(String[] args) throws Exception {
              String resource = "sql-map-config-xmlt.xml";
              Reader reader = Resources.getResourceAsReader(resource);
              SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
              OracleDataSource dataSource = new OracleDataSource();
              dataSource.setUser("test");
              dataSource.setPassword("test");
              dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
              Connection connection = dataSource.getConnection();
              sqlMap.setUserConnection(connection);
              AqAost aqAost = new AqAost();
              aqAost.setFileNo(3);
              aqAost.setProgramNo("prg");
              Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
              Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
              aqAost.setOstXML(readerToString(ostXMLReader));
              aqAost.setBltXML(readerToString(bltXMLReader));
              sqlMap.insert("insert", aqAost);
              connection.commit();
         }

         public static String readerToString(Reader reader) {
              StringWriter writer = new StringWriter();
              char[] buffer = new char[2048];
              int charsRead = 0;
              try {
                   while ((charsRead = reader.read(buffer)) > 0) {
                        writer.write(buffer, 0, charsRead);
                   }
              } catch (IOException ioe) {
                   throw new RuntimeException("error while converting reader to String", ioe);
              }
              return writer.toString();
         }
    }
    package com.sg2net.jdbc;
    public class AqAost {
         private long fileNo;
         private String programNo;
         private String ostXML;
         private String bltXML;

         public long getFileNo() {
              return fileNo;
         }
         public void setFileNo(long fileNo) {
              this.fileNo = fileNo;
         }
         public String getProgramNo() {
              return programNo;
         }
         public void setProgramNo(String programNo) {
              this.programNo = programNo;
         }
         public String getOstXML() {
              return ostXML;
         }
         public void setOstXML(String ostXML) {
              this.ostXML = ostXML;
         }
         public String getBltXML() {
              return bltXML;
         }
         public void setBltXML(String bltXML) {
              this.bltXML = bltXML;
         }
    }
    I tested the insert and it works correctly
    ciao,
    Giovanni

  • Sorting large amounts of data with treemap

    Hello. I'm doing a project where I have to sort a large amount of data. Each record is formed by a unique number and a location (a string).
    Something like this:
    NUMBER .... CITY
    1000123 BOSTON
    1045333 HOUSTON
    5234222 PARIS
    2343345 PARIS
    6234332 SEATTLE
    I have to sort the data by location and then by unique number.
    I was using a TreeMap to do this: I used the location string as the key, since I wanted to sort the data by that field. But because the location string is not unique, inserting the data into the TreeMap overwrites any entry with the same location string, keeping only the last one that was inserted.
    Is there any Collection that implements sorting in the way that I need? Or, if there isn't such a thing, is there any collection that supports duplicate keys?
    Thanks for your time!
    Regards
    Cesar

    ... or use a SortedSet for the list of numbers (as the associated value for
    the location key). Something like this:
    void addTuple(String location, Integer number) {
       // 'map' is a Map<String, SortedSet<Integer>>, e.g. a TreeMap
       SortedSet<Integer> numbers = map.get(location);
       if (numbers == null)
          map.put(location, numbers = new TreeSet<Integer>());
       numbers.add(number);
    }
    kind regards,
    Jos
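    A small self-contained sketch of that idea, assuming the container is declared as a TreeMap so the locations themselves come out sorted as well (the sample numbers are taken from the question):

    import java.util.Map;
    import java.util.SortedSet;
    import java.util.TreeMap;
    import java.util.TreeSet;

    public class LocationIndex {
         // Locations sorted alphabetically; numbers sorted within each location.
         private final Map<String, SortedSet<Integer>> map =
                   new TreeMap<String, SortedSet<Integer>>();

         public void addTuple(String location, Integer number) {
              SortedSet<Integer> numbers = map.get(location);
              if (numbers == null) {
                   numbers = new TreeSet<Integer>();
                   map.put(location, numbers);
              }
              numbers.add(number);
         }

         public static void main(String[] args) {
              LocationIndex idx = new LocationIndex();
              idx.addTuple("PARIS", 5234222);
              idx.addTuple("PARIS", 2343345);
              idx.addTuple("BOSTON", 1000123);
              // Prints BOSTON first, then PARIS with its numbers in ascending order.
              for (Map.Entry<String, SortedSet<Integer>> e : idx.map.entrySet()) {
                   System.out.println(e.getKey() + " " + e.getValue());
              }
         }
    }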

  • Selecting large amounts of data

    I'm selecting a large amount of data from a database and putting the data into an object file. For some reason the process dies halfway through with an OutOfMemoryException. Any ideas?
    Secondly, does anyone know how to insert carriage returns with a
    prepared statement?

    > I'm selecting a large amount of data from a database and put the data into an object file.
    > For some reason the process dies halfway through with an OutOfMemoryException. Any Ideas?
    Like... you are running out of memory?
    Have you increased the heap size?
    What does "large" mean? 100k or 100gig?
    > Secondly, does anyone know how to insert carriage returns with a prepared statement?
    I suppose you mean how to insert one into a text field.
    Do you already have it in the string? If not, then you use "\\n" or '\n'.
    If you do have it in the string and you inserted it without error, then how do you know it isn't already in the database?
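    For the second question, a minimal sketch of inserting a string that contains a newline through a PreparedStatement; the connection details, table and column names are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class InsertWithNewline {
         public static void main(String[] args) throws Exception {
              Connection con = DriverManager.getConnection(
                        "jdbc:oracle:thin:@localhost:1521:orcl", "scott", "tiger");
              // The '\n' inside the Java string literal is stored as a real
              // line break in the column; no extra escaping is needed.
              PreparedStatement ps = con.prepareStatement(
                        "insert into notes_table (note_text) values (?)");
              ps.setString(1, "first line\nsecond line");
              ps.executeUpdate();
              con.commit();
              ps.close();
              con.close();
         }
    }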

  • Storing large amounts of data

    Hello,
    I'd like to use Berkeley DB for logging large amounts of data, i.e. structures that are ~400KB in size, and I need to store them ~10 times per second for up to several hours, but I run into quite big performance issues the more records I insert into the database. I've set the pagesize to its maximum (64KB; I split my data into several packages so it doesn't get stored on an overflow page) and experimented with several cache sizes (8MB, 64MB, 2GB, 4GB), but I haven't managed to get rid of the performance issues, independent of which access method I use (although I got the "best" results when using DB_QUEUE, but that varies heavily from day to day).
    To get to the point: performance starts at "0" seconds per insert (where 1 "insert" = 7 real inserts because of splitting up the data), between the 16750th and 17000th insertion it takes ~0.00352 seconds per insert, and by the 36000th insertion it already takes about 0.0074 seconds per insert, and so on...
    Does anyone have an idea of how I can increase my performance? When the time needed for each insertion keeps increasing, it is not possible to keep the program running at its intended speed.
    Thanks,
    Thomas

    Hello,
    A good starting point are the suggestions in the Berkeley DB Reference Guide at:
    http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/am_misc_tune.html
    http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/transapp_tune.html
    http://www.oracle.com/technology/documentation/berkeley-db/db/programmer_reference/transapp_throughput.html
    Thanks,
    Sandra

  • MessageExpired because of large amount of data

    Hi Experts, I need some advice for the following scenario.
    I have 50,000 records that need to be inserted into SAP R/3 HR. I am using BPM when calling the RFC and using a sync message to get the response from the RFC. However, I think because of the large amount of data, I get the error "MessageExpired" in my monitoring.
    After reading some documentation, I found that a sync message has a timeout period (xiadapter.inbound.timeout.default = 180000 [ms]). My question is: if 180000 ms is not enough for the RFC to process all 50,000 records, I can increase the timeout, am I right? But then, what maximum value would be most appropriate for the timeout so that it does not affect the performance of XI? And does increasing the timeout value affect anything at all?
    Need the advice and inputs from you experts...Thank you so much in advance...

    Made,
    I posted this answer to a similar request a couple of weeks ago
    I had a similar issue some time back and used the following three parameters
    1. Visual Administrator
    SAP XI Adapter: RFC parameter syncMessageDeliveryTimeoutMsec
    This parameter is explained in SAP Note 730870 Q14 and SAP Note 791379
    2. Alert Configuration
    SA_COMM parameter CHECK_FOR_ASYNC_RESPONSE_TIMEOUT
    This parameter specifies the maximum time in seconds that can pass between the arrival of the synchronous request message on the Integration Server and the asynchronous response message. If this time period is exceeded, a system error is returned to the caller.
    3. Integration Process TimeoutControlStep
    My design is such that timeout parameter 3 which is set in the Integration Process will be triggered first. This will give me control to handle the timeout via an SMTP email prior to either of the other timeouts taking effect
    Regards,
    Mike

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
    I have enabled paging on the component. I use a CachedRowSet in the page bean to get the data. This works very well at the moment in my development environment. At the moment I am testing with a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. So I was wondering, apart from that instance, when viewing in paged mode does the component get all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
    I wonder if it has to do with the logic of the paging. How do you specify which set of 20 records to extract in SQL?
    Thanks for your help!!
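    On the "which set of 20 records" question, the usual Oracle-style approach is a nested ROWNUM query; here is a rough JDBC sketch (the table, columns and ordering in the inner query are placeholders):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class PageFetcher {
         // Fetches one page of pageSize rows, page index starting at 0.
         // The caller is responsible for closing the ResultSet and its statement.
         public static ResultSet fetchPage(Connection con, int page, int pageSize)
                   throws Exception {
              int first = page * pageSize;       // rows to skip
              int last = first + pageSize;       // last row to keep
              String sql =
                   "select * from ( " +
                   "  select t.*, rownum rn from ( " +
                   "    select id, name from my_table order by name " + // placeholder inner query
                   "  ) t where rownum <= ? " +
                   ") where rn > ?";
              PreparedStatement ps = con.prepareStatement(sql);
              ps.setInt(1, last);
              ps.setInt(2, first);
              return ps.executeQuery();
         }
    }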

  • Open Large amount of data

    Hi
    I have a file on the application server in .dat format. It contains a large amount of data, maybe 2 million records or more. I need to open the file to check the record count. Is there any software or any option to open the file? I have tried opening it with Notepad, Excel, etc., but it gives an error.
    please let me know
    Thanks

    Hi,
    Try this..
    Go to AL11.
    Go to the file directory. For the file there will be a field called 'length', which is the total length of the file in characters.
    If you know the length of a single line, divide the length of the file by the length of a single line; I believe you will get the number of records.
    Thanks,
    Naren
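    If you can copy the file off the application server, another option is a small streaming line counter that never loads the whole file into memory (a sketch; the path and the one-record-per-line assumption are placeholders):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class RecordCounter {
         public static void main(String[] args) throws IOException {
              String path = args.length > 0 ? args[0] : "bigfile.dat"; // placeholder path
              BufferedReader in = new BufferedReader(new FileReader(path));
              long count = 0;
              try {
                   while (in.readLine() != null) {
                        count++; // assumes one record per line
                   }
              } finally {
                   in.close();
              }
              System.out.println("records: " + count);
         }
    }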

  • Bex Report Designer - Large amount of data issue

    Hi Experts,
    I am trying to execute (on the Portal) a report made in BEx Report Designer, with about 30,000 pages, and the only thing I am getting is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing such a large amount of data?
    Regards
    Vladimir

    Hi Sauro,
    I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system and it worked OK. If you are on a recent support package, then you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea) and if you don't find one then you should open a support message with SAP, with very specific instructions for recreating the problem from a clean input-schedule.
    Good luck,
    Ethan

  • Advice needed on how to keep large amounts of data

    Hi guys,
    I'm not sure what the best way is to make large amounts of data available to my Android app on the local device.
    For example, records of food ingredients, in the hundreds?
    I have read and successfully created .db files using this tutorial:
    http://help.adobe.com/en_US/AIR/1.5/devappsflex/WS5b3ccc516d4fbf351e63e3d118666ade46-7d49.html
    However, to populate the database I use Flash? So this kind of defeats the purpose of it. There is no point in me shifting a massive array of data from Flash to a SQL database when I could access the data directly from the AS3 array.
    So maybe I could create the .db with an external program? But then how would I include that .db in the APK file and deploy it to the user's Android device?
    Or maybe I create an AS3 class with an XML object in it and use that as a means of data storage?
    Any advice would be appreciated

    You can use any means you like to populate your SQLite database, including using external programs, (temporarily) embedding a text file with SQL statements, executing some SQL from AS3 code etc etc.
    Once you have populated your db, deploy it with your project:
    http://chrisgriffith.wordpress.com/2011/01/11/understanding-bundled-sqlite-databases-in-air-for-mobile/
    Cheers, - Jon -

  • Error in Generating reports with large amount of data using OBIR

    Hi all,
    We have integrated OBIR (Oracle BI Reporting) with OIM (Oracle Identity Management) to generate custom reports. Some of the custom reports contain a large amount of data (approx. 80-90K rows with 7-8 columns), and the queries for these reports primarily use the audit tables and resource form tables. Now when we try to generate a report, it works fine with HTML, where the report is generated directly on the console, but when we try to generate and save the same report as PDF or Excel it fails with the following error:
    [120509_133712190][][STATEMENT] Generating page [1314]
    [120509_133712193][][STATEMENT] Phase2 time used: 3ms
    [120509_133712193][][STATEMENT] Total time used: 41269ms for processing XSL-FO
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Helvetica closed.
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Times-Roman closed.
    [120509_133712848][][PROCEDURE] FO+Gen time used: 41924 msecs
    [120509_133712848][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) is called.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) done. All inputs are cleared.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] End Memory: max=496MB, total=496MB, free=121MB
    [120509_133818606][][EXCEPTION] java.net.SocketException: Socket closed
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
    at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
    at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
    at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
    at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
    at weblogic.servlet.internal.ChunkOutput.write(ChunkOutput.java:304)
    at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
    at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
    at oracle.apps.xdo.servlet.util.IOUtil.readWrite(IOUtil.java:47)
    at oracle.apps.xdo.servlet.CoreProcessor.process(CoreProcessor.java:280)
    at oracle.apps.xdo.servlet.CoreProcessor.generateDocument(CoreProcessor.java:82)
    at oracle.apps.xdo.servlet.ReportImpl.renderBodyHTTP(ReportImpl.java:562)
    at oracle.apps.xdo.servlet.ReportImpl.renderReportBodyHTTP(ReportImpl.java:265)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:270)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:250)
    at oracle.apps.xdo.servlet.XDOServlet.doGet(XDOServlet.java:178)
    at oracle.apps.xdo.servlet.XDOServlet.doPost(XDOServlet.java:201)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at oracle.apps.xdo.servlet.security.SecurityFilter.doFilter(SecurityFilter.java:97)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3496)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(Unknown Source)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    It seems we face this issue when the query processing takes some time. Do I need to perform any additional configuration to generate such reports?

    java.net.SocketException: Socket closed
         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
         at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
         at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
         at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
         at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
         at weblogic.servlet.internal.CharsetChunkOutput.flush(CharsetChunkOutput.java:249)
         at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
         at weblogic.servlet.internal.CharsetChunkOutput.implWrite(CharsetChunkOutput.java:396)
         at weblogic.servlet.internal.CharsetChunkOutput.write(CharsetChunkOutput.java:198)
         at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
         at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
         at com.tej.systemi.util.AroundData.copyStream(AroundData.java:311)
         at com.tej.systemi.client.servlet.servant.Newdownloadsingle.producePageData(Newdownloadsingle.java:108)
         at com.tej.systemi.client.servlet.servant.BaseViewController.serve(BaseViewController.java:542)
         at com.tej.systemi.client.servlet.FrontController.doRequest(FrontController.java:226)
         at com.tej.systemi.client.servlet.FrontController.doPost(FrontController.java:128)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3498)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(Unknown Source)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:17
    (Please help in finding a solution to this issue; it is in production and we need it ASAP.)
    Thanks in Advance
    Edited by: 909601 on Jan 23, 2012 2:05 AM

  • With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?

    For example, in Notes, I have written three notes; however, if I click on 'All On My Mac' in the sidebar, I see about 10 different versions of each note I make; it saves a version every time I add or delete a sentence.
    I also noticed that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
    I understand that all this journaling provides a level of security and prevents data loss, but I was wondering, is there a function to clean up journal logs once in a while?
    Thanks
    Roz

    Are you using Microsoft Word? Microsoft thinks the users are idiots. They put up a lot of pointless messages that annoy & worry users. I have seen this message from Microsoft Word. It's annoying.
    As BDaqua points out...
    When you copy information via Edit > Copy, Command + C, Edit > Cut, or Command + X, you place the information on the clipboard. When you paste information, Edit > Paste or Command + V, you copy information from the clipboard to your data file.
    If you Edit > Cut or Command + X and you do not paste the information and you quit Word, you could be losing information. Microsoft is very worried about this. When you quit Word, Microsoft checks if there is information on the clipboard and, if so, Microsoft puts out this message.
    You should be saving your work more than once a day. I'd save every 5 minutes. Command + S does a save.
    Robert

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas based on best practices for transferring Large Amounts of Data in and out of a Netweaver based application.
    We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC for our primary interface in and out of the system. As it turns out RFC is not good at dealing with this large amount of data being pushed through a single call.
    We have considered a number of possible ideas, however we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also I've noticed that if I move / copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.

  • Azure Cloud service fails when sent large amount of data

    This is the error:
    Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
    Calls with smaller amounts of data work fine. Large amounts of data cause this error.
    How can I fix this??

    Go to the web.config file, look for the <binding> that is being used for your service, and adjust the various parameters that limit the maximum length of the messages, such as maxReceivedMessageSize:
    http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
    Make sure that you specify a size that is large enough to accommodate the amount of data that you are sending (the default is 64 KB).
    Note that even if you set a very large value here, you won't be able to go beyond the maximum request length that is configured in IIS. If I recall correctly, the default limit in IIS is 8 megabytes.

  • DSS problems when publishing large amount of data fast

    Has anyone experienced problems when sending large amounts of data using DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items with a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms), so that is one item on DSS for about 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS be unstable if loaded to much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on DSS? (see attached picture)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Profesional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated on VISA serial comm. It looks so neat and its
    > fantastic what it supposes to do for a develloper, but sometimes one
    > runs into trouble very deep.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at
    VISA.
    Some programs have some issues on serial adapters but run fine on a regular serial port. We've had that problem recently.
    Best, Mark
