Importing Large Amounts of Data
Hello-
I have an old website that I am both redesigning and
redeveloping. It is for a doctor's office.
Currently, they have 71 different HTML pages, one for each of
their 71 doctors. All of the pages are structured exactly the same
way except for the content. I have been able to extract all text
and images out of the 71 pages and put them into a text document.
So, the data is still in order and in a repeating format. Now, I am
trying to figure out how to automate the task of putting this data
into my new design.
So, in a nutshell, I have a text document that has all the
site's physician info. I have a SPRY data region I want to
import this data into (either via XML data sets or HTML table data
sets). I would like to do this without cutting and pasting from the
text document, as it would take forever! Is there a way to automate
this process?
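Just to make it concrete, the kind of one-off conversion script I'm imagining would look something like the Java sketch below (the three-lines-per-record layout and the field names are made up; the real text file may differ):
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;
// Sketch: turn a repeating-format text export into an XML file a Spry XML data set could read.
// Assumes each physician record is exactly three lines (name, specialty, bio);
// adjust LINES_PER_RECORD and the tag names to match the real export.
public class PhysiciansToXml {
    static final int LINES_PER_RECORD = 3;
    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new FileReader("physicians.txt"));
        PrintWriter out = new PrintWriter("physicians.xml", "UTF-8");
        out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
        out.println("<physicians>");
        List<String> record = new ArrayList<String>();
        String line;
        while ((line = in.readLine()) != null) {
            record.add(line.trim());
            if (record.size() == LINES_PER_RECORD) {
                out.println("  <physician>");
                out.println("    <name>" + escape(record.get(0)) + "</name>");
                out.println("    <specialty>" + escape(record.get(1)) + "</specialty>");
                out.println("    <bio>" + escape(record.get(2)) + "</bio>");
                out.println("  </physician>");
                record.clear();
            }
        }
        out.println("</physicians>");
        out.close();
        in.close();
    }
    // Minimal XML escaping so stray & and < in the text don't break the file.
    static String escape(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }
}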
Here is the old page:
Old Page
Here is the new page:
New Page
Finally, here are the text and xml files:
Text File
XML File
Any suggestions on automation would be much appreciated!
Thanks!
"fast and easy"? Ha, ha, ha, ha, ha, ha, ha, ha, ha, ha, ha,
ha, ha, ha!
But seriously, folks...open the rhbag.apj file in Notepad,
see how the baggage file entries are formatted, and format each of
your new entries the same way in another file (you might
need a good Replace tool like FAR), and then add them to the
rhbag.apj file.
An alternate method might be to do the new-entry
formatting in Word, but then filter those results through Notepad
first, then copy the straight text into the rhbag.apj file.
Good luck,
Leon
Similar Messages
-
I have a primary database that needs to import a large amount of data and database objects. 1) Do I shut down the standby? 2) Turn off archive log mode? 3) Perform the import? 4) Rebuild the standby? Or is there a better way or best practice?
Instead of rebuilding the (whole) standby, you can take an incremental (from SCN) backup from the Primary and restore it on the Standby. That way, for example:
a. If only two out of 12 tablespaces are affected by the import, the incremental backup would effectively contain only the blocks changed in those two tablespaces (and some other changes in system and undo), provided that there are no other changes in the other ten tablespaces.
b. If the size of the import is only 15% of the database, the incremental backup to restore to the standby is small.
Hemant K Chitale -
Couldn't copy large amount of data from enterprise DB to Oracle 10g
Hi,
I am using iBATIS to copy data from an enterprise DB (EDB) to Oracle and vice versa.
The datatype of a field on EDB is 'text' and the datatype on Oracle is 'SYS.XMLTYPE'.
I am binding these to a Java String property in a POJO to bind values.
I can successfully copy a limited amount of data from EDB to Oracle, but if there is more data I get the following exceptions with different Oracle drivers (however, I can read large amounts of data from EDB):
--- Cause: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
at com.ibatis.sqlmap.engine.impl.SqlMapExecutorDelegate.update(SqlMapExecutorDelegate.java:457)
at com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl.update(SqlMapSessionImpl.java:90)
at com.ibatis.sqlmap.engine.impl.SqlMapClientImpl.update(SqlMapClientImpl.java:66)
at com.aqa.pojos.OstBtlData.updateOracleFromEdbBtlWebservice(OstBtlData.java:282)
at com.aqa.pojos.OstBtlData.searchEdbAndUpdateOracleBtlWebservice(OstBtlData.java:258)
com.ibatis.common.jdbc.exception.NestedSQLException:
--- The error occurred in com/aqa/sqlmaps/SQLMaps_OSTBTL_Oracle.xml.
--- The error occurred while applying a parameter map.
--- Check the updateOracleFromEDB-InlineParameterMap.
--- Check the parameter mapping for the 'btlxml' property.
--- Cause: java.sql.SQLException: setString can only process strings of less than 32766 chararacters
at com.ibatis.sqlmap.engine.mapping.statement.MappedStatement.executeUpdate(MappedStatement.java:107)
at com.iba
I have the latest Oracle 10g JDBC drivers.
Remember, I can copy any amount of data from Oracle to EDB, but not the other way around.
Please let me know if you have come across this issue; any recommendation is very much appreciated.
Thanks,
CK.
Hi,
I finally remembered how I solved this issue previously.
The JDBC driver isn't able to directly call the insert with an XMLType column. The solution I was using was to build a wrapper procedure in PL/SQL.
Here it is (for insert, but I suppose that update will be the same):
create or replace procedure insertXML(file_no_in in number, program_no_in in varchar2, ost_XML_in in clob, btl_XML_in in clob) is
begin
insert into AQAOST_FILES (file_no,program_no,ost_xml,btl_xml) values(file_no_in, program_no_in, xmltype(ost_XML_in), xmltype(btl_XML_in));
end insertXML;
here is the sqlmap file I used
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE sqlMap
PUBLIC "-//ibatis.apache.org//DTD SQL Map 2.0//EN"
"http://ibatis.apache.org/dtd/sql-map-2.dtd">
<sqlMap>
<typeAlias alias="AqAost" type="com.sg2net.jdbc.AqAost" />
<insert id="insert" parameterClass="AqAost">
begin
insertxml(#fileNo#,#programNo#,#ostXML:CLOB#,#bltXML:CLOB#);
end;
</insert>
</sqlMap>
and here is a simple program
package com.sg2net.jdbc;
import java.io.IOException;
import java.io.Reader;
import java.io.StringWriter;
import java.sql.Connection;
import oracle.jdbc.pool.OracleDataSource;
import com.ibatis.common.resources.Resources;
import com.ibatis.sqlmap.client.SqlMapClient;
import com.ibatis.sqlmap.client.SqlMapClientBuilder;
public class TestInsertXMLType {
    /**
     * @param args
     */
    public static void main(String[] args) throws Exception {
        String resource = "sql-map-config-xmlt.xml";
        Reader reader = Resources.getResourceAsReader(resource);
        SqlMapClient sqlMap = SqlMapClientBuilder.buildSqlMapClient(reader);
        OracleDataSource dataSource = new OracleDataSource();
        dataSource.setUser("test");
        dataSource.setPassword("test");
        dataSource.setURL("jdbc:oracle:thin:@localhost:1521:orcl");
        Connection connection = dataSource.getConnection();
        sqlMap.setUserConnection(connection);
        AqAost aqAost = new AqAost();
        aqAost.setFileNo(3);
        aqAost.setProgramNo("prg");
        Reader ostXMLReader = Resources.getResourceAsReader("ostXML.xml");
        Reader bltXMLReader = Resources.getResourceAsReader("bstXML.xml");
        aqAost.setOstXML(readerToString(ostXMLReader));
        aqAost.setBltXML(readerToString(bltXMLReader));
        sqlMap.insert("insert", aqAost);
        connection.commit();
    }

    public static String readerToString(Reader reader) {
        StringWriter writer = new StringWriter();
        char[] buffer = new char[2048];
        int charsRead = 0;
        try {
            while ((charsRead = reader.read(buffer)) > 0) {
                writer.write(buffer, 0, charsRead);
            }
        } catch (IOException ioe) {
            throw new RuntimeException("error while converting reader to String", ioe);
        }
        return writer.toString();
    }
}
package com.sg2net.jdbc;

public class AqAost {
    private long fileNo;
    private String programNo;
    private String ostXML;
    private String bltXML;

    public long getFileNo() {
        return fileNo;
    }
    public void setFileNo(long fileNo) {
        this.fileNo = fileNo;
    }
    public String getProgramNo() {
        return programNo;
    }
    public void setProgramNo(String programNo) {
        this.programNo = programNo;
    }
    public String getOstXML() {
        return ostXML;
    }
    public void setOstXML(String ostXML) {
        this.ostXML = ostXML;
    }
    public String getBltXML() {
        return bltXML;
    }
    public void setBltXML(String bltXML) {
        this.bltXML = bltXML;
    }
}
I tested the insert and it works correctly
ciao,
Giovanni -
Streaming large amounts of data over a socket causes corruption?
I'm writing an app to transfer large amounts of data via a simple client/server architecture between two machines.
Problem: If I send the data too 'fast', the data arrives corrupted:
- Calls to read() return wrong data (wrong 'crc')
- Subsequent calls to read() do not return -1 but allow me to read e.g. another 60 or 80 KBytes.
- available() always returns '0'; but I'll get rid of that method anyway (as recommended in other forum entries).
The behaviour is somewhat difficult to repeat, but it fails for me reliably when transferring the data between two separate machines and when setting the number of packets (Sender.TM) to 1000 or larger.
Workaround: Reduce the number of packets sent to e.g. 1, or introduce the 'sleep' on the sender side. Another workaround: changing alone to java.nio.* did not help, but when I got rid of the Streams and used solely ByteBuffers, the problem disappeared. Unfortunately the Streams are required by other parts of my application.
I'm running the code on two dual-CPU machines connected via
Below are the code of the Sender and the Listener. Please excuse the style as this is only to demonstrate the problem.
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.channels.Channels;
import java.nio.channels.SocketChannel;
import java.util.Arrays;
public class SenderBugStreams {

    public static void main(String[] args) throws IOException {
        InetSocketAddress targetAdr = new InetSocketAddress(args[0], ListenerBugStreams.DEFAULT_PORT);
        System.out.println("connecting to: " + targetAdr);
        SocketChannel socket = SocketChannel.open(targetAdr);
        sendData(socket);
        socket.close();
        System.out.println("Finished.");
    }

    static final int TM = 10000;
    static final int TM_SIZE = 1000;
    static final int CRC = 2;
    static int k = 5;

    private static void sendData(SocketChannel socket) throws IOException {
        OutputStream out = Channels.newOutputStream(socket);
        byte[] ba = new byte[TM_SIZE];
        Arrays.fill(ba, (byte) (k++ % 127));
        System.out.println("Sending..." + k);
        for (int i = 0; i < TM; i++) {
            out.write(ba);
//            try {
//                Thread.sleep(10);
//            } catch (InterruptedException e) {
//                // TODO Auto-generated catch block
//                e.printStackTrace();
//                throw new RuntimeException(e);
//            }
        }
        out.write(CRC);
        out.flush();
        out.close();
    }
}
import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.channels.Channels;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
public class ListenerBugStreams {

    static int DEFAULT_PORT = 44521;

    /**
     * @param args
     * @throws IOException
     */
    public static void main(String[] args) throws IOException {
        ServerSocketChannel serverChannel = ServerSocketChannel.open();
        serverChannel.socket().bind(new InetSocketAddress(DEFAULT_PORT));
        System.out.print("Waiting...");
        SocketChannel clientSocket = serverChannel.accept();
        System.out.println(" starting, IP=" + clientSocket.socket().getInetAddress() +
            ", Port=" + clientSocket.socket().getLocalPort());
        //read data from socket
        readData(clientSocket);
        clientSocket.close();
        serverChannel.close();
        System.out.println("Closed.");
    }

    private static void readData(SocketChannel clientSocket) throws IOException {
        InputStream in = Channels.newInputStream(clientSocket);
        //read and ingest objects
        byte[] ba = null;
        for (int i = 0; i < SenderBugStreams.TM; i++) {
            ba = new byte[SenderBugStreams.TM_SIZE];
            in.read(ba);
            System.out.print("*");
        }
        //verify checksum
        int crcIn = in.read();
        if (SenderBugStreams.CRC != crcIn) {
            System.out.println("ERROR: Invalid checksum: " + SenderBugStreams.CRC + "/" + crcIn);
            System.out.println(ba[0]);
        }
        int x = in.read();
        int remaining = 0;
        while (x != -1) {
            remaining++;
            x = in.read();
        }
        System.out.println("Remaining:" + in.available() + "/" + remaining);
        System.out.println(" " + SenderBugStreams.TM + " objects ingested.");
        in.close();
    }
}
Here is your trouble:
in.read(ba);
read(byte[]) does not read N bytes, it reads up to N bytes. If one byte has arrived then it reads and returns that one byte. You always need to check the return value of read(byte[]) to see how much you got (and also check for EOF). TCP chops up the written data into whatever packets it feels like, and that makes read(byte[]) pretty random.
You can use DataInputStream, which has a readFully() method; it loops calling read() until it gets the full buffer's worth (see the sketch just below). Or you can write a little static utility readFully(), shown after the sketch:
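For instance (just a sketch, not from the original post, reusing the names from the listener above), readData() reworked around DataInputStream would look roughly like this:
import java.io.DataInputStream;   // plus the imports already in ListenerBugStreams
private static void readData(SocketChannel clientSocket) throws IOException {
    // readFully() keeps calling read() until the whole buffer is filled,
    // or throws EOFException if the stream ends first.
    DataInputStream in = new DataInputStream(Channels.newInputStream(clientSocket));
    byte[] ba = new byte[SenderBugStreams.TM_SIZE];
    for (int i = 0; i < SenderBugStreams.TM; i++) {
        in.readFully(ba);      // a full TM_SIZE block, however TCP chopped it up
        System.out.print("*");
    }
    int crcIn = in.read();     // now this really is the trailing CRC byte
    if (SenderBugStreams.CRC != crcIn) {
        System.out.println("ERROR: Invalid checksum: " + SenderBugStreams.CRC + "/" + crcIn);
    }
    in.close();
}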
// Returns false if hits EOF immediately. Otherwise reads the full buffer's
// worth. If encounters EOF in mid-packet throws an IOException.
public static boolean readFully(InputStream in, byte buf[])
    throws IOException
{
    return readFully(in, buf, 0, buf.length);
}

public static boolean readFully(InputStream in, byte buf[], int pos, int len)
    throws IOException
{
    int got_total = 0;
    while (got_total < len) {
        int got = in.read(buf, pos + got_total, len - got_total);
        if (got == -1) {
            if (got_total == 0)
                return false;
            throw new EOFException("readFully: end of file; expected " +
                len + " bytes, got only " + got_total);
        }
        got_total += got;
    }
    return true;
}
-
Hello fellow Java fans
First, let me point out that I'm a big Java and Linux fan, but somehow I ended up working with .NET and Microsoft.
Right now my software development team is working on a web tool for a very important microchip manufacturer. This tool handles large amounts of data; some of our online reports generate more than 100,000 rows, which need to be displayed in a web client such as Internet Explorer.
We make use of Infragistics, which is a set of controls for .NET. Infragistics allows me to load data fetched from a database into a control they call UltraWebGrid.
Our problem comes up when we load large amounts of data into the UltraWebGrid; sometimes we have to load 100,000+ rows, and during this loading our IIS server's memory gets exhausted, and it can take up to 5 minutes for the server to finish processing and display the 100,000+ row report. We have already proved that the database server (SQL Server) is not the problem; our problem is the IIS web server.
Our team is now considering migrating this web tool to Java and JSP. Can you all help me with some links, information, or past experiences you have had with loading and displaying large amounts of data like the ones we handle, on JSP? Help will be greatly appreciated.
Who in the world actually looks at a 100,000 row report?
Anyway, if I were you and I had to do it because some clueless management person decided it was a good idea... I would write a program that, once a day, week, year, or whatever your time period is, produces the report (maybe as a PDF, but you could do it in HTML if you really must have it that way) and have it as a static file that you link to from your app.
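A minimal sketch of that kind of pre-generation job (the JDBC URL, query, column names and output path are all invented for illustration; run it from cron or Task Scheduler):
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
// Dumps the big query straight into a static HTML file that the web app links to,
// so the application server never has to build the report per request.
public class NightlyReportJob {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://dbhost;databaseName=reports", "user", "password");
        Statement st = con.createStatement();
        ResultSet rs = st.executeQuery("SELECT part_no, qty, station FROM big_report_view");
        PrintWriter out = new PrintWriter("/var/www/static/big_report.html", "UTF-8");
        out.println("<html><body><table>");
        while (rs.next()) {
            out.println("<tr><td>" + rs.getString(1) + "</td><td>"
                    + rs.getInt(2) + "</td><td>" + rs.getString(3) + "</td></tr>");
        }
        out.println("</table></body></html>");
        out.close();
        rs.close();
        st.close();
        con.close();
    }
}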
Then the user will just have to wait while it downloads, but the web server or web application server will not be bogged down trying to produce that monstrosity. -
Hello,
I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
I have enabled paging on the component. I use CachedRowSet on the page's bean for getting the data. This works very well at the moment in my development environment. At the moment I am testing with a small amount of data.
I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. So I was wondering, apart from that instance, when viewing in paged mode does the component get all the results from the database every time?
Which component would be best suited for displaying large amounts of data in a table format?
Thanks in advance!!
Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
I wonder if it has got to do with the logic of paging. How do you specify which set of 20 records to extract via SQL?
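(For reference, one common approach with CachedRowSet is its built-in paging, so only one page of rows is held at a time. The rough sketch below is only an illustration; the connection URL, table and column names are invented, and CachedRowSetImpl is the Sun/Oracle reference implementation.)
import java.sql.Connection;
import java.sql.DriverManager;
import javax.sql.rowset.CachedRowSet;
import com.sun.rowset.CachedRowSetImpl;
public class PagedFetchDemo {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:orcl", "user", "password");
        CachedRowSet crs = new CachedRowSetImpl();
        crs.setCommand("SELECT id, name FROM big_table ORDER BY id");
        crs.setPageSize(20);          // only 20 rows are held in memory per page
        crs.execute(con);             // populates the first page
        do {
            while (crs.next()) {
                System.out.println(crs.getInt("id") + " " + crs.getString("name"));
            }
        } while (crs.nextPage());     // fetches the next 20 rows on demand
        crs.close();
        con.close();
    }
}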
Thanks for your help!! -
Hi
I have a file on the application server in .dat format; it contains a large amount of data, maybe 2 million records or more. I need to open the file to check the record count. Is there any software or any option to open the file? I have tried opening it with Notepad and Excel... both give errors.
please let me know
Thanks
Hi,
Try this:
Go to AL11.
Go to the file's directory. For the file there will be a field called length, which is the total length of the file in characters.
If you know the length of a single line, divide the length of the file by the length of a single line; I believe that will give you the number of records. For example (just to illustrate the arithmetic), a file of 260,000,000 characters with 130-character lines would hold about 2,000,000 records.
Thanks,
Naren -
Bex Report Designer - Large amount of data issue
Hi Experts,
I am trying to execute (on the Portal) a report made in BEx Report Designer, with about 30,000 pages, and the only thing I get is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing of such a large amount of data?
Regards
Vladimir
Hi Sauro,
I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system and it worked OK. If you are on a recent support package, then you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea) and if you don't find one then you should open a support message with SAP, with very specific instructions for recreating the problem from a clean input-schedule.
Good luck,
Ethan -
Advice needed on how to keep large amounts of data
Hi guys,
I'm not sure what the best way is to make large amounts of data available to my Android app on the local device.
For example, records of food ingredients, in the hundreds?
I have read and successfully created .db's using this tutorial:
http://help.adobe.com/en_US/AIR/1.5/devappsflex/WS5b3ccc516d4fbf351e63e3d118666ade46-7d49.html
However, to populate the database I use Flash? So this kind of defeats the purpose. There is no point in me shifting a massive array of data from Flash to a SQL database when I could access the data directly from the AS3 array.
So maybe I could create the .db with an external program? But then how would I include that .db in the APK file and then deploy it to users' Android devices?
Or maybe I create an AS3 class with an XML object in it and use that as a means of data storage?
Any advice would be appreciated.
You can use any means you like to populate your SQLite database, including using external programs, (temporarily) embedding a text file with SQL statements, executing some SQL from AS3 code, etc.
Once you have populated your db, deploy it with your project:
http://chrisgriffith.wordpress.com/2011/01/11/understanding-bundled-sqlite-databases-in-air-for-mobile/
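For the populating step, a throwaway desktop utility along these lines would do (only a sketch, assuming the Xerial sqlite-jdbc driver is on the classpath; the file, table and column names are invented):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;
// One-off utility to pre-populate the SQLite file that later ships inside the package.
public class BuildIngredientsDb {
    public static void main(String[] args) throws Exception {
        Class.forName("org.sqlite.JDBC"); // not needed with newer driver versions, harmless otherwise
        Connection con = DriverManager.getConnection("jdbc:sqlite:ingredients.db");
        Statement st = con.createStatement();
        st.executeUpdate("CREATE TABLE IF NOT EXISTS ingredient (id INTEGER PRIMARY KEY, name TEXT, kcal REAL)");
        PreparedStatement ins = con.prepareStatement("INSERT INTO ingredient (name, kcal) VALUES (?, ?)");
        String[][] rows = { {"Flour", "364"}, {"Sugar", "387"}, {"Butter", "717"} };
        for (String[] r : rows) {
            ins.setString(1, r[0]);
            ins.setDouble(2, Double.parseDouble(r[1]));
            ins.executeUpdate();
        }
        ins.close();
        st.close();
        con.close();
        System.out.println("ingredients.db written; bundle it with the app as described in the link above.");
    }
}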
Cheers, - Jon - -
Error in Generating reports with large amount of data using OBIR
Hi all,
We have integrated OBIR (Oracle BI Reporting) with OIM (Oracle Identity Management) to generate custom reports. Some of the custom reports contain a large amount of data (approx. 80-90K rows with 7-8 columns), and the queries for these reports primarily use the audit tables and resource form tables. Now when we try to generate a report, it works fine with HTML, where the report is generated directly on the console, but when we try to generate the same report and save it as PDF or Excel, it fails with the following error.
[120509_133712190][][STATEMENT] Generating page [1314]
[120509_133712193][][STATEMENT] Phase2 time used: 3ms
[120509_133712193][][STATEMENT] Total time used: 41269ms for processing XSL-FO
[120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Helvetica closed.
[120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Times-Roman closed.
[120509_133712848][][PROCEDURE] FO+Gen time used: 41924 msecs
[120509_133712848][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) is called.
[120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) done. All inputs are cleared.
[120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] End Memory: max=496MB, total=496MB, free=121MB
[120509_133818606][][EXCEPTION] java.net.SocketException: Socket closed
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
at weblogic.servlet.internal.ChunkOutput.write(ChunkOutput.java:304)
at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
at oracle.apps.xdo.servlet.util.IOUtil.readWrite(IOUtil.java:47)
at oracle.apps.xdo.servlet.CoreProcessor.process(CoreProcessor.java:280)
at oracle.apps.xdo.servlet.CoreProcessor.generateDocument(CoreProcessor.java:82)
at oracle.apps.xdo.servlet.ReportImpl.renderBodyHTTP(ReportImpl.java:562)
at oracle.apps.xdo.servlet.ReportImpl.renderReportBodyHTTP(ReportImpl.java:265)
at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:270)
at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:250)
at oracle.apps.xdo.servlet.XDOServlet.doGet(XDOServlet.java:178)
at oracle.apps.xdo.servlet.XDOServlet.doPost(XDOServlet.java:201)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
at oracle.apps.xdo.servlet.security.SecurityFilter.doFilter(SecurityFilter.java:97)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3496)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(Unknown Source)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
It seems that where the query processing takes some time, we face this issue. Do I need to perform any additional configuration to generate such reports?
java.net.SocketException: Socket closed
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
at weblogic.servlet.internal.CharsetChunkOutput.flush(CharsetChunkOutput.java:249)
at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
at weblogic.servlet.internal.CharsetChunkOutput.implWrite(CharsetChunkOutput.java:396)
at weblogic.servlet.internal.CharsetChunkOutput.write(CharsetChunkOutput.java:198)
at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
at com.tej.systemi.util.AroundData.copyStream(AroundData.java:311)
at com.tej.systemi.client.servlet.servant.Newdownloadsingle.producePageData(Newdownloadsingle.java:108)
at com.tej.systemi.client.servlet.servant.BaseViewController.serve(BaseViewController.java:542)
at com.tej.systemi.client.servlet.FrontController.doRequest(FrontController.java:226)
at com.tej.systemi.client.servlet.FrontController.doPost(FrontController.java:128)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3498)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(Unknown Source)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:17
(Please help in finding a solution to this issue; it is in production and we need it ASAP.)
Thanks in Advance
Edited by: 909601 on Jan 23, 2012 2:05 AM -
With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?
For example, in Notes, I have written three notes; however if I click on 'All On My Mac' on the side bar, I see about 10 different versions of each note I make, it saves a version every time I add or delete a sentence.
I also noticed, that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
I understand that all this journaling provides a level of security and prevents data loss; but I was wondering, is there a function to clean up journal logs once in a while?
Thanks
Roz
Are you using Microsoft Word? Microsoft thinks the users are idiots. They put up a lot of pointless messages that annoy and worry users. I have seen this message from Microsoft Word. It's annoying.
As BDaqua points out...
When you copy information via edit > copy, command + c, edit > cut, or command +x, you place the information on the clipboard. When you paste information, edit > paste or command + v, you copy information from the clipboard to your data file.
If you edit > cut or command + x and you do not paste the information and you quit Word, you could be losing information. Microsoft is very worried about this. When you quit Word, Microsoft checks if there is information on the clipboard and, if so, Microsoft puts out this message.
You should be saving your work more than once a day. I'd save every 5 minutes. Command + S does a save.
Robert -
Looking for ideas for transferring large amounts of data between systems
Hello,
I am looking for ideas based on best practices for transferring large amounts of data in and out of a NetWeaver-based application.
We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
Since we need to integrate with so many different systems, we are planning to use RFC for our primary interface in and out of the system. As it turns out RFC is not good at dealing with this large amount of data being pushed through a single call.
We have considered a number of possible ideas; however, we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this, as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.
Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
Also I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
Also run Dolphin from a terminal to try and see what the problem is.
Hope that helps at least a bit.
Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
Because I thought that Dolphin is just a „little" wrapper around the cp/mv/cd/ls applications/commands.
Azure Cloud service fails when sent large amount of data
This is the error;
Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being
aborted by the server (possibly due to the service shutting down). See server logs for more details.
Calls with smaller amounts of data work fine. Large amounts of data cause this error.
How can I fix this??
Go to the web.config file, look for the <binding> that is being used for your service, and adjust the various parameters that limit the maximum length of the messages, such as maxReceivedMessageSize.
http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
Make sure that you specify a size that is large enough to accommodate the amount of data that you are sending (the default is 64 KB).
Note that even if you set a very large value here, you won't be able to go beyond the maximum request length that is configured in IIS. If I recall correctly, the default limit in IIS is 8 megabytes. -
DSS problems when publishing large amount of data fast
Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms, another about 40 items with a 100 ms publishing rate.
I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms). So that is one item on the DSS for about 250 ms. But this data is not seen on my main GUI window that reads the DSS URL.
My questions are:
1. Is there any limit in speed (frequency) for data publishing in DSS?
2. Can DSS be unstable if loaded too much?
3. Can I lose/miss data in any situation?
4. In the DSS Manager I have doubled the MaxItems and MaxConnections. How will this affect my system?
5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on DSS? (see attached picture)
Regards
Idriz Zogaj
Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
Memory Profesional
direct: +46 (0) - 734 32 00 10
http://www.zogaj.se
LuI wrote:
>
> Hi all,
>
> I am frustrated on VISA serial comm. It looks so neat and its
> fantastic what it supposes to do for a develloper, but sometimes one
> runs into trouble very deep.
> I have an app where I have to read large amounts of data streamed by
> 13 µCs at 230kBaud. (They do not necessarily need to stream all at the
> same time.)
> I use either a Moxa multiport adapter C320 with 16 serial ports or -
> for test purposes - a Keyspan serial-2-USB adapter with 4 serial
> ports.
Does it work better if you use the serial port(s) on your motherboard?
If so, then get a better serial adapter. If not, look more closely at
VISA.
Some programs have some issues on serial adapters but run fine on a regular serial port. We've had that problem recently.
Best, Mark -
Freeze when writing large amount of data to iPod through USB
I used to take backups of my PowerBook to my 60G iPod video. Backups are taken with tar in terminal directly to mounted iPod volume.
Now, every time I try to write a big amount of data to the iPod (from the MacBook Pro), the whole system freezes (the mouse cursor moves, but nothing else can be done). When the USB cable is pulled out, the system recovers and acts as it should. This problem happens every time a large amount of data is written to the iPod.
The same iPod works perfectly (when backing up) with the PowerBook, and small amounts of data can easily be written to it (from the MacBook Pro) without problems.
Does anyone else have the same problem? Any ideas why is this and how to resolve the issue?
MacBook Pro, 2.0GHz, 100GB 7200RPM, 1GB RAM, Mac OS X (10.4.5), iPod Video 60G connected through USB
Ex PC user...never had a problem.
Got a MacBook Pro last week...having the same issues...and this is now with an exchanged machine!
I've read elsewhere that it's something to do with the USB timing out. And if you get a new USB port and attach it (and it's powered separately), it should work. Kind of a bummer, but, those folks who tried it say it works.
Me, I can upload to the iPod piecemeal, manually...but even then, it sometimes freezes.
The good news is that once the iPod is loaded, the problem shouldn't happen. It's the large amounts of data.
Apple should DEFINITELY fix this though. Unbelievable.
MacBook Pro 2.0 Mac OS X (10.4.6)