Amount of data over a DB link

How can I measure how much data is being retrieved over a DB link?
I have several queries which are run at regular intervals to pull stats from one database to another so I can report on them (tablespace sizes, memory stats and such like).
But my network team are complaining that it is hitting 7Mb/sec.
(I suspect the networking tool is being interpreted wrongly - I'm not sure how much I am pulling back, but it is only for a second or two; because of the reporting tool they are using, it looks as though I am continually pulling 7Mb/sec, when really I am pulling about 7Mb for just one or two seconds every 5 minutes.)

7Mb/sec would saturate a 100Mb network, not a 1Gb one.
1Gb NICs, switches and cables have been on the market for a couple of years already and are quite affordable.
It is really a shame for your network team to use old technology and limit the business in getting the timely, quality data that you pump through the DB link. ;)
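To actually answer the measurement question: one way to see the traffic from the database side is the SQL*Net db link statistics. A rough sketch follows; the statistic names are the standard counters in v$sysstat/v$sesstat, and sampling twice and subtracting the values is only illustrative of how you would turn them into a rate.
-- Cumulative bytes exchanged over all DB links since instance startup.
-- Sample it twice, a known interval apart, and subtract to get a rate.
SELECT name, value AS bytes
FROM   v$sysstat
WHERE  name IN ('bytes sent via SQL*Net to dblink',
                'bytes received via SQL*Net from dblink');
-- The same counters per session, to pin the traffic on the reporting job.
SELECT s.sid, s.username, n.name, st.value AS bytes
FROM   v$sesstat st
JOIN   v$statname n ON n.statistic# = st.statistic#
JOIN   v$session  s ON s.sid = st.sid
WHERE  n.name IN ('bytes sent via SQL*Net to dblink',
                  'bytes received via SQL*Net from dblink')
AND    st.value > 0;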

Similar Messages

  • Error inserting data over a DB link into a table covered by a materialized view

    Hello everyone,
    I have the following problem:
    I have a table called LOCATION_INFO which is defined as:
    create table LOCATION_INFO (
      LOCATION_ID        VARCHAR2(40) not null,
      PLANT              VARCHAR2(4) not null,
      PRODUCT            VARCHAR2(3),
      AREA               VARCHAR2(1),
      LINE               NUMBER(10),
      STATION            NUMBER(10),
      STATINDEX          NUMBER(10),
      FU                 NUMBER(10),
      WP                 NUMBER(10),
      TP                 NUMBER(10),
      LOCATION_LEVEL     NUMBER(1) not null,
      LOCATION_PARENT_ID VARCHAR2(40),
      TIME_STAMP         TIMESTAMP(6) WITH TIME ZONE not null
    );
    I try to load data via a PL/SQL procedure from another database using a database link:
    INSERT INTO LOCATION_INFO
    (LOCATION_ID,
    PLANT,
    PRODUCT,
    AREA,
    LINE,
    STATION,
    STATINDEX,
    FU,
    WP,
    TP,
    LOCATION_LEVEL,
    LOCATION_PARENT_ID,
    TIME_STAMP)
    SELECT LOCATION_ID,
    PLANT,
    PRODUCT,
    AREA,
    LINE,
    STATION,
    STATINDEX,
    FU,
    WP,
    TP,
    LOCATION_LEVEL,
    LOCATION_PARENT_ID,
    GetUTCDateTime(TIME_STAMP) AS time_Stamp
    FROM LOCATION_INFO@SOURCE_MMPDDB
    WHERE ROWNUM < 100;
    This works fine (if I do select count(*) from location_info the data is present), but if I issue a COMMIT,
    ORA-00603 appears and the session is terminated.
    The point is that I have a materialized view MVIEW_LOCATIONS in another schema of the database, reading the data from my table LOCATION_INFO, and a corresponding mview log:
    create table MLOG$_LOCATION_INFO (
      LOCATION_ID     VARCHAR2(40),
      SNAPTIME$$      DATE,
      DMLTYPE$$       VARCHAR2(1),
      OLD_NEW$$       VARCHAR2(1),
      CHANGE_VECTOR$$ RAW(255)
    );
    CREATE MATERIALIZED VIEW MVIEW_LOCATIONS
    REFRESH FAST ON COMMIT
    ENABLE QUERY REWRITE
    AS
    SELECT "LOCATION_INFO"."LOCATION_ID" "LOCATION_ID","LOCATION_INFO"."PLANT" "PLANT","LOCATION_INFO"."PRODUCT" "PRODUCT","LOCATION_INFO"."AREA" "AREA","LOCATION_INFO"."LINE" "LINE","LOCATION_INFO"."STATION" "STATION","LOCATION_INFO"."STATINDEX" "STATINDEX","LOCATION_INFO"."FU" "FU","LOCATION_INFO"."WP" "WP","LOCATION_INFO"."TP" "TP","LOCATION_INFO"."LOCATION_LEVEL" "LOCATION_LEVEL","LOCATION_INFO"."LOCATION_PARENT_ID" "LOCATION_PARENT_ID","LOCATION_INFO"."TIME_STAMP" "TIME_STAMP" FROM "CP4MMPDNEW"."LOCATION_INFO" "LOCATION_INFO";
    What do I need to do to make the insert work properly without deleting my mviews?
    Can anyone help me?
    Thanks, Matthias

    Hello,
    Can you change this in your materialized view DDL:
    REFRESH FAST ON COMMIT
    to:
    REFRESH FAST ON DEMAND
    Then, if your INSERT and COMMIT work OK, can you try:
    exec DBMS_MVIEW.REFRESH('MVIEW_LOCATIONS')
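    A minimal sketch of that change, assuming the mview can simply be altered in place rather than dropped and recreated (object names are taken from the post above):
    -- Switch the mview from ON COMMIT to ON DEMAND so the remote load can commit cleanly,
    -- then refresh it explicitly once the load has finished.
    ALTER MATERIALIZED VIEW MVIEW_LOCATIONS REFRESH FAST ON DEMAND;
    EXEC DBMS_MVIEW.REFRESH('MVIEW_LOCATIONS');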

  • Import data over a network link in Oracle 11g

    We want to take an export of the OND schema in the production database and
    import it into the OND schema in the UAT database over a network
    link by using Data Pump in Oracle 11g. Kindly share the steps.

    Scenario:
    Directly importing the TEST01 schema from the production database (oraodrmu) into the test database (oraodrmt) over
    the network by using a database link and Data Pump in Oracle 11g.
    Note: When you perform an import over a database link, the import source is a database, not a dump file set, and the data is imported to the connected database instance.
    Because the link can identify a remotely networked database, the terms database link and network link are used interchangeably.
    =================================================================
    STEP-1 (IN PRODUCTION DATABASE - oraodrmu)
    =================================================================
    [root@szoddb01]>su - oraodrmu
    Enter user-name: /as sysdba
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> grant resource to test01;
    Grant succeeded.
    SQL> grant imp_full_database to test01;
    Grant succeeded.
    SQL> select owner,object_type,status,count(*) from dba_objects where owner='TEST01' group by owner,object_type,status;
    OWNER    OBJECT_TYPE    STATUS    COUNT(*)
    TEST01   PROCEDURE      VALID            2
    TEST01   TABLE          VALID          419
    TEST01   SEQUENCE       VALID            3
    TEST01   FUNCTION       VALID            8
    TEST01   TRIGGER        VALID            3
    TEST01   INDEX          VALID          545
    TEST01   LOB            VALID           18
    7 rows selected.
    SQL>
    SQL> set pages 999
    SQL> col "size MB" format 999,999,999
    SQL> col "Objects" format 999,999,999
    SQL> select obj.owner "Owner"
    2 , obj_cnt "Objects"
    3 , decode(seg_size, NULL, 0, seg_size) "size MB"
    4 from (select owner, count(*) obj_cnt from dba_objects group by owner) obj
    5 , (select owner, ceil(sum(bytes)/1024/1024) seg_size
    6 from dba_segments group by owner) seg
    7 where obj.owner = seg.owner(+)
    8 order by 3 desc ,2 desc, 1
    9 /
    Owner     Objects       size MB
    OND         8,097       284,011
    SYS         9,601         1,912
    TEST01        998         1,164
    3 rows selected.
    SQL> exit
    =================================================================
    STEP-2 (IN TEST DATABASE - oraodrmt)
    =================================================================
    [root@szoddb01]>su - oraodrmt
    [oraodrmt@szoddb01]>sqlplus
    SQL*Plus: Release 11.2.0.2.0 Production on Mon Dec 3 18:40:16 2012
    Copyright (c) 1982, 2010, Oracle. All rights reserved.
    Enter user-name: /as sysdba
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> select name,open_mode from v$database;
    NAME OPEN_MODE
    ODRMT READ WRITE
    SQL> create tablespace test_test datafile '/trn_u04/oradata/odrmt/test01.dbf' size 2048m;
    Tablespace created.
    SQL> create user test01 identified by test123 default tablespace test_test;
    User created.
    SQL> grant resource, create session to test01;
    Grant succeeded.
    SQL> grant EXP_FULL_DATABASE to test01;
    Grant succeeded.
    SQL> grant imp_FULL_DATABASE to test01;
    Grant succeeded.
    Note: ODRMU is the DNS host name. We can test the connection with: [oraodrmt@szoddb01]>sqlplus test01/test01@odrmu
    SQL> create directory test_network_dump as '/dbdump/test_exp';
    Directory created.
    SQL> grant read,write on directory test_network_dump to test01;
    Grant succeeded.
    SQL> conn test01/test123
    Connected.
    SQL> create DATABASE LINK remote_test CONNECT TO test01 identified by test01 USING 'ODRMU';
    Database link created.
    To test the database link we can try the SQL below:
    SQL> select count(*) from OA_APVARIABLENAME@remote_test;
    COUNT(*)
    59
    SQL> exit
    [oraodrmt@szoddb01]>impdp test01/test123 network_link=remote_test directory=test_network_dump remap_schema=test01:test01 logfile=impdp__networklink_grms.log;
    [oraodrmt@szoddb01]>
    Import: Release 11.2.0.2.0 - Production on Mon Dec 3 19:42:47 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Starting "TEST01"."SYS_IMPORT_SCHEMA_01": test01/******** network_link=remote_test directory=test_network_dump remap_schema=test01:test01 logfile=impdp_grms_networklink.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 318.5 MB
    Processing object type SCHEMA_EXPORT/USER
    ORA-31684: Object type USER:"TEST01" already exists
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "TEST01"."SY_TASK_HISTORY" 779914 rows
    . . imported "TEST01"."JCR_JNL_JOURNAL" 603 rows
    . . imported "TEST01"."GX_GROUP_SHELL" 1229 rows
    Job "TEST01"."SYS_IMPORT_SCHEMA_01" completed with 1 error(s) at 19:45:19
    [oraodrmt@szoddb01]>sqlplus
    SQL*Plus: Release 11.2.0.2.0 Production on Mon Dec 3 19:46:04 2012
    Copyright (c) 1982, 2010, Oracle. All rights reserved.
    Enter user-name: /as sysdba
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> select owner,object_type,status,count(*) from dba_objects where owner='TEST01' group by owner,object_type,status;
    OWNER    OBJECT_TYPE      STATUS    COUNT(*)
    TEST01   PROCEDURE        VALID            2
    TEST01   TABLE            VALID          419
    TEST01   SEQUENCE         VALID            3
    TEST01   FUNCTION         VALID            8
    TEST01   TRIGGER          VALID            3
    TEST01   INDEX            VALID          545
    TEST01   LOB              VALID           18
    TEST01   DATABASE LINK    VALID            1
    8 rows selected.
    SQL>
    SQL> set pages 999
    SQL> col "size MB" format 999,999,999
    SQL> col "Objects" format 999,999,999
    SQL> select obj.owner "Owner"
    2 , obj_cnt "Objects"
    3 , decode(seg_size, NULL, 0, seg_size) "size MB"
    4 from (select owner, count(*) obj_cnt from dba_objects group by owner) obj
    5 , (select owner, ceil(sum(bytes)/1024/1024) seg_size
    6 from dba_segments group by owner) seg
    7 where obj.owner = seg.owner(+)
    8 order by 3 desc ,2 desc, 1
    9 /
    Owner     Objects       size MB
    OND         8,065       247,529
    SYS         9,554         6,507
    TEST01        999         1,164
    13 rows selected.
    =================================================================
    STEP-3 FOR REMOVING THE DATABASE LINK
    =================================================================
    [oraodrmt@szoddb01]>sqlplus
    SQL*Plus: Release 11.2.0.2.0 Production on Mon Dec 3 19:16:01 2012
    Copyright (c) 1982, 2010, Oracle. All rights reserved.
    Enter user-name: /as sysdba
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL> drop database link remote_test;
    Database link dropped.

  • Sending a large amount of data over a POST

    Hi.
    I'm trying to send a large amount of text (about 400,000 characters) with a POST to a servlet running in Tomcat. For debugging, my servlet just writes the data to standard out and a text file.
    The problem is, I get about 2048 characters printed correctly and then a long stream of garbage.
    Has anyone seen a problem like this, or does anyone know of any limitations on the length of a POST?
    Thanks,
    Jerry

    There might be a limitation on reading the input as text.
    You'd better use an object input stream for reading.
    For any more ideas, mail me:
    [email protected]

  • Data Transfer over DB Link

    Hi Everyone,
    Please help me by providing the best solution for the scenario below.
    I have to migrate data from one DB to another DB. Both DBs are different instances. I've created the DB links and granted SELECT privileges on the required tables to the destination schema.
    There are around 10 tables and each table contains 2-3 million rows approximately.
    Now my question is: for this data transfer should I use
    a direct INSERT .. SELECT statement (which contains ONLY SQL and reduces context switching)
    OR
    copying the source table data into local collections and inserting the data by selecting from the collections (by using LIMIT)?
    Also, I cannot use intermediate COMMITs, so that the entire process either executes successfully or fails.
    I don't want some records to be inserted and then, if the process fails, have to do the analysis and insert the remaining records. This would be a problem when we are migrating data in production.
    And what are the problems we face while accessing data over a DB link?
    Thanks,
    Munna

    INSERT INTO .. SELECT .. FROM .. is the way to go. You can use the APPEND hint to do a direct-path insert, which will speed up the insert. The same thing is being discussed here: Forall insert Vs Direct insert. You can also consider Data Pump by specifying the NETWORK_LINK option. A sketch of the direct-path approach follows.
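    A minimal sketch, assuming one of the ten tables is called BIG_TABLE and the link is named SRC_LINK (both names are placeholders for illustration); keep all ten inserts in one transaction and commit only at the end so the migration succeeds or fails as a unit:
    -- Direct-path insert straight from the remote table; no PL/SQL loop or collections needed.
    INSERT /*+ APPEND */ INTO big_table
    SELECT * FROM big_table@src_link;
    -- Note: after an APPEND insert you cannot query that same table again in this
    -- transaction; repeat the insert for the remaining tables, then commit once at the end.
    COMMIT;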

  • iPhone using extremely high amounts of data when not in use

    I have a 4S on AT&T and am grandfathered in w/ the unlimited data plan. Last month I got the notification that I was in the top 5% of data users (which sounds crazy to me considering what some other people I've read about seem to use). According to AT&T's site I had used 2.09 GB with 16 days left in my billing cycle.
    Once I got that notification I logged on to see if I could find out when I was using such large amounts of data. It turns out that between midnight and 1 am almost every night there were large amounts of data sent. I think this is strange for two reasons: 1) I'm on WiFi at home in my apartment and 2) I wasn't even awake at the times these charges occurred.
    Example: January 6, 2012 at 12:40 am I was charged for 417,327kb, roughly 417MB.
    Similar data usage occurred almost every night going back to 12/28/11. On 12/29 I was charged for 468MB at 12:21am. Very unlikely this was actually data I used since I had work the next day and wasn't awake. It honestly looks like 80-85% of my data usage is coming from these occurrences. It also isn't likely that this is a total of data used throughout the day as there are other entries of smaller amounts spread out throughout the day.
    Now, if I were on a truly unlimited plan and there was no such thing as throttling I really wouldn't care about this. But the fact is that my 3G speeds are being throttled (I just ran a speed test at a location where I used to get over 1Mbps and I was at .07 Mbps). I have spoken to AT&T and they insisted it was an issue w/ my phone, either the hardware or the OS. So I went to Apple and the "Genius" did a DFU restore for me in store and told me that would fix it. It hasn't, and now I'm stuck with this constantly happening and unbearably slow 3G data speeds for the rest of the month.
    That was my last billing cycle where I was throttled down to download speeds of .07mbps. Unusable. This month I decided to closely monitor my data usage. I turned off iCloud and photostream, and turn off data whenever I'm using wifi. I turned off the sending info to apple setting.
    I checked my account on ATT.com today and noticed that yesterday morning at 7:58 am while driving to work I somehow used 247 MB. Don't ask me how; I'm not streaming anything and my phone is either locked or playing music from the iPod app. I could probably stream Netflix for 5 hours and not use that much data.
    I called AT&T to let them know (AGAIN) and got bumped up to a "Data manager" who was incredibly b*tchy and rude. She said there must be some app that's using that much data to update. I don't have any apps that are even that size so I don't see how this could possibly be the case. I tried to talk with her about how that can't really be the case and she just kept repeating that there must be some app using the data blah blah. I've gone to apple with this last month after failing with AT&T and at first they did a DFU restore and we restored the phone as new, then it kept happening so I went back they replaced my phone. AT&T said it must be a hardware or software issue last month, but this month it must be an app that's doing this. I can't get a straight answer from them and they just keep passing the ball to say it's Apple's problem.
    I really don't know what to do. I'm grandfathered in on the unlimited plan and don't really want to change that. Say I change my plan to the 3GB's at the same $30/month. What's to say this won't keep happening and I'll be over that 3GB's in 10 days then get charged an extra $10 for each additional GB?!
    I can't think of any apps I've downloaded in the past few months that would result in this change and I can't live with another 2/3 of my billing cycle being throttled down to unusable speeds.
    Has anybody else had anything similar happen? I have no idea what to do next (and sorry for the long rant but I'm fresh off the phone with AT&T and I'm ******)

    Thanks for the link. I only went through the last three pages and unfortunately it looks like there is no solution. I turn data off every night and turn it on only when I'm not in a wifi area. It seems that as soon as I turn it on, within an hour it charges me data for a backlog of whatever it didn't do when it was connected to wifi.
    I think it's absolutely disgusting that AT&T can just dismiss this and act like it's the user's fault. Apps and email updating in the middle of the night to the tune of 400MB? I highly doubt it, and then when I called to ask them about it they tried to make me sound stupid, like I have an app or two open. Just really ticked me off. The worst part about it is that there doesn't appear to be anything I can do about it aside from never using my data.

  • VISA error when trying to read hex data over serial port

    Hi - Sorry about this. I am posting this question as a new thread as I am still having serious problems and may be able to shed more light now.
    Basically I am trying to read data formatted as a string of comma separated hexadecimal words over the serial link.
    The data is formatted thus: <first word>, <second word>, <third word>, etc., <last word>, <CR>, where <CR> is a carriage return.
    Where each word represents a number, e.g.  AB01 is 43777 in decimal.
    Now the VI is stalling at VISA Close (I attach the VI again to this message): the string data is read across by the VI, but it will not then close the link unless I remove the USB cable from the comms port. When I run the device in HyperTerminal I do not have this problem. The data string comes back fine (when I send out the command message), the carriage return is recognized as the end of the data message, and the comms channel is cleared and ready to accept the next command message.
    LabVIEW is then not happy. I don't know why, but it could be that the amount of data the demo device I have is sending back is more than NI-VISA can cope with. Am I correct in thinking 13 kbytes is the maximum? If this were the case, it would account for my problem. It seems that the number of bytes read by the VI is consistently 12999.
    Hope someone can help or this information will at least be useful to somebody so they don't have this problem.
    Many Thanks
    Ashley.

    Yes there is. If your data contains data from 0x00 to 0xFF then you have to disable the 'termination char' in the VISA configure function. The boolean input at the top. Set it to false. Otherwise your transmission will stop at 0x0A by default. You can also tell what the termination character should be, if you want.
    btw. Still no VI attached. 
    Sorry, replied to the wrong message. No, I did not; we are both replying at the same time.

  • Amount of data read/written in a particular database

    Dear All,
    Hi,
    I am using MS SQL Server 2005 on Windows Server 2008 SP2.
    I want to know how I can check what amount of data is written to or read from a particular database
    in a specific time, in KB or MB.
    Regards,

    One way I can think of to do what you are looking for is to use the DMV sys.dm_io_virtual_file_stats.  By using that DMV you can figure out over a given period of time the amount of disk activity associated with a particular database file.  The
    thing the DMV doesn't show you is total activity because any activity associated with data cached by SQL Server won't be reflected.  It only reflects the activity associated with reads and writes that cannot be satisfied by the SQL Server cache.
    Anyway, try the following code.  It'll tell you how much data SQL Server is accessing from the disk:
    select db_name(f.database_id) as [db_name],
    f.name,
    f.physical_name,
    s.sample_ms,
    s.num_of_bytes_read,
    s.num_of_bytes_written,
    s.num_of_reads,
    s.num_of_writes
    from sys.master_files as f
    cross apply sys.dm_io_virtual_file_stats(f.database_id, f.file_id) as s
    where db_name(f.database_id) = 'MyDatabase'
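    Since those counters are cumulative from the last SQL Server restart, a rough sketch of turning them into activity for a specific time window might look like the following (the database name and the 1-minute wait are only illustrative):
    -- Snapshot the counters, wait for the measurement window, snapshot again, then diff.
    SELECT f.database_id, f.file_id, s.num_of_bytes_read, s.num_of_bytes_written
    INTO #before
    FROM sys.master_files AS f
    CROSS APPLY sys.dm_io_virtual_file_stats(f.database_id, f.file_id) AS s
    WHERE db_name(f.database_id) = 'MyDatabase';
    WAITFOR DELAY '00:01:00';   -- measurement window: 1 minute
    SELECT db_name(f.database_id) AS [db_name],
           (s.num_of_bytes_read    - b.num_of_bytes_read)    / 1024.0 AS kb_read,
           (s.num_of_bytes_written - b.num_of_bytes_written) / 1024.0 AS kb_written
    FROM sys.master_files AS f
    CROSS APPLY sys.dm_io_virtual_file_stats(f.database_id, f.file_id) AS s
    JOIN #before AS b ON b.database_id = f.database_id AND b.file_id = f.file_id
    WHERE db_name(f.database_id) = 'MyDatabase';
    DROP TABLE #before;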

  • Report in Excel format fails for huge amount of data with headers!!

    Hi All,
    I have developed an Oracle report which fetches up to 5,000 records.
    The requirement is to fetch up to 100,000 records.
    This report fetches the data if the headers are removed; if headers are included it is not able to fetch the data.
    Has anyone faced this issue?
    Any ideas on fetching a huge amount of data with an Oracle report in Excel format?
    Thanks & Regards,
    KP.

    Hi Manikant,
    According to your description, the performance is slow when displaying a huge amount of data with more than 3 measures in PowerPivot, so you need the hardware requirements to build a PowerPivot model that displays a huge amount of data with more than 3 measures, right?
    PowerPivot benefits from multi-core processors, large memory and storage capacities, and a 64-bit operating system on the client computer.
    Based on my experience, large memory, multiple processors and even solid state drives benefit PowerPivot performance. Here is a blog about memory considerations for PowerPivot for Excel for your reference.
    http://sqlblog.com/blogs/marco_russo/archive/2010/01/26/memory-considerations-about-powerpivot-for-excel.aspx
    Besides, you can identify which query was taking the time by using tracing; please refer to the link below.
    http://blogs.msdn.com/b/jtarquino/archive/2013/12/27/troubleshooting-slow-queries-in-excel-powerpivot.aspx
    Regards,
    Charlie Liao
    TechNet Community Support

  • Why is my iPhone 5 sending data over cellular while on wifi using Vodafone Germany?

    Hello,
    I know that this is a common issue on the Verizon iPhone 5. I have the same problem with my iPhone 5 in Germany. It keeps sending data over cellular while I am on wifi. The wifi connection itself works though, even when I turn off cellular data, so my wifi is not broken.
    Another thing that makes me curious is that my iPhone keeps sending data somewhere and I don't know what it is sending. This led to the point that I reached my monthly limit of 200MB within 2 days!!! And almost 95% of that was upstream. So I bought another 300MB and turned off cellular. Now I have turned it back on and the megabytes are just running through. I thought it might be diagnostics and iCloud, but I turned both off and it's still the same way. Due to the enormous amount of data that is being sent, the battery empties really, really fast!! I was on a 20-minute car drive and my battery went down 23%!!! When I turn cellular off and use wifi it behaves completely normally.
    So this is really annoying me and I hope that Apple will fix this or one of you guys can help me.
    Best regards from Germany,
    Sitham

    You may have a problem with your WiFi network if your iPhone can't stay connected to it.
    If you are having WiFi problems it is necessary to isolate whether the problem is with your network or your iPhone. Note: Do NOT consider your network to be blameless if some other devices can connect to it.
    First, test your iPhone on some other networks: a friend's, Starbucks, Barnes & Noble, etc.
    If it works well there then the problem is with your network. Try restarting your router by removing power for 30 seconds. If that does not help check for a firmware update for your router. If none exists which corrects the problem consider replacing the router.
    If your iPhone does not function well on other networks it possibly has a hardware problem. Contact Apple Support or visit an Apple store for evaluation. They can provide a replacement if your iPhone is bad.
    If you need more help please give the make, model, and version of your WiFi router and how you have it configured.

  • Airport Extreme Intermittent Network Interruption when Downloading Large Amounts of Data.

    I've had an Airport Extreme Base Station for about 2.5 years and have had no problems until the last 6 months.  I have my iMac and a PC directly connected through ethernet and another PC connected wirelessly.  I occasionally need to download very large data files that max out my download connection speed at about 2.5Mbs.  During these downloads, my entire network loses connection to the internet intermittently for between 2 and 8 seconds with a separation between connection losses at around 20-30 seconds each.  This includes the hard wired machines.  I've tested a download with a direct connection to my cable modem without incident.  The base station is causing the problem.  I've attempted to reset the Base Station with good results after reset, but then the problem simply returns after a while.  I've updated the firmware to latest version with no change. 
    Can anyone help me with the cause of the connection loss and a method of preventing it?  THIS IS NOT A WIRELESS PROBLEM.  I believe it has to do with the massive amount of data being handled.  Any help would be appreciated.

    Ok, did some more sniffing around and found this thread.
    https://discussions.apple.com/thread/2508959?start=0&tstart=0
    It seems that the AEBS has had a serious flaw for the last 6 years that Apple has been unable to address adequately.  Here is a portion of the log file.  It simply repeats the same log entries over and over.
    Mar 07 21:25:17  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 21:25:17  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Mar 07 21:26:17  Severity:5  Disassociated with station 58:55:ca:c7:c2:ae
    Mar 07 21:26:17  Severity:5  Rotated CCMP group key.
    Mar 07 21:30:43  Severity:5  Rotated CCMP group key.
    Mar 07 21:36:41  Severity:5  Clock synchronized to network time server time.apple.com (adjusted +0 seconds).
    Mar 07 21:55:08  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 21:55:08  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Mar 07 21:55:32  Severity:5  Disassociated with station 58:55:ca:c7:c2:ae
    Mar 07 21:55:33  Severity:5  Rotated CCMP group key.
    Mar 07 21:59:47  Severity:5  Rotated CCMP group key.
    Mar 07 22:24:53  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 22:24:53  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Mar 07 22:25:18  Severity:5  Disassociated with station 58:55:ca:c7:c2:ae
    Mar 07 22:25:18  Severity:5  Rotated CCMP group key.
    Mar 07 22:30:43  Severity:5  Rotated CCMP group key.
    Mar 07 22:36:42  Severity:5  Clock synchronized to network time server time.apple.com (adjusted -1 seconds).
    Mar 07 22:54:37  Severity:5  Associated with station 58:55:ca:c7:c2:ae
    Mar 07 22:54:37  Severity:5  Installed unicast CCMP key for supplicant 58:55:ca:c7:c2:ae
    Anyone have any ideas why this is happening?

  • iPad downloading enormous amounts of data

    We set my folks up with an iPad2 a few months ago, and it has been using extraordinary amounts of data. They live in the country and broadband and dsl are not available, so we have them set up on a cellular hotspot for internet access, which the iPad views as wi-fi. All was fine with a desktop and a netbook. Once we added an iPad to the mix, their data use has skyrocketed. Recent monitoring is showing approximately 1G per day. They are retired and are only doing facebook, email and an occasional view of a website for shopping. Have turned off auto-play of videos on facebook and shut down everything I can think of that might be using data. Yesterday's activity still showed just over 1G of data downloaded and about 100M uploaded.
    One note - my mother has Parkinson's, and is the primary user of the iPad. Due to the tremors from PD, she tends to tap the screen repeatedly while using. Is it possible that each tap is causing a refresh and downloading everything all over again? Would that result in the enormous download figures?
    I've also seen threads where there were issues with iPhones on AT&T using large amounts of data. Any chance there is a similar problem with an iPad on cellular hotspot? Provider is US Cellular, not AT&T, but makes me curious.
    Thanks for any thoughts or help you can provide.

    Poke around in the settings. Background app refresh can chew data - that's where the iPad reaches out and constantly updates the weather and other apps so that you have fresh data at any point in time....which you don't really need. If you only check the weather once a day you don't need it refreshed every 15 minutes.
    Not a way to find it, but maybe a way to control it: put the iPad in airplane mode unless you're actively using it. That turns off the antennas and keeps it from accessing data.

  • XY graphing with large amount of data

    I was looking into graphing a fairly substantial amount of data. It arrives in bursts across serial.
    What I have is 30 values corresponding to remote data sensors.  The data for each comes across together, so I have no problem having the data grouped.  It is, effectively, in an array of size 30.  I've been wanting to place this data in a graph.
    The period varies between 1/5 sec and 2 minutes between receptions (it's wireless and mobile, so signal strength varies). This eliminates a waveform graph, as the time isn't constant and there's a lot of data (so no random NaN insertion). This leaves me with an XY graph.
    Primary interest is with the last 10 minutes or so, with a desire to see up to an hour or two back.
    The data is fairly similar, and I'm trying to possibly split it into groups of 4 or 5 sets of ordered pairs per graph.
    Problems:
    1.  If data comes in slowly enough, everything is OK, but the time needed to synchronously update the graph(s) can often exceed the time it would take to fully receive the chunk of data which contains these data points. Going asynchronous is useless, as the graphs need to be reasonably in tune with the most recent data received. I can't have an exponential growth in the delta between the time represented on the graph and the time the last bit of data was received.
    2.  I could use some advice on making older data points more sparse to allow for older data to be viewed, with a sort of 'decay' of old data; I don't value that 1/5-second resolution at all for old data.
    I'm most concerned with solving problem 1, but random suggestions on 2 are most welcome.

    I didn't quite get the first question. Could you try to find out where exactly the time is consumed in your program and then refine your question?
    To the second question (which may also solve the first one): you can store all the data to a file as it arrives. Keep the most recent data in, let's say, a shift register of the update loop. Add a data point corresponding to the start of the measurement to the data and wire it to an XY graph. Make the X scrollbar visible. Handle the event for the XY graph's X scale change. When the scale is changed, i.e. the user scrolls the scroll bar, load data from the file and display it on the XY graph. This is a little simplified; it's a bit more complicated in practice.
    In my project, I am writing an XY graph X Control to handle a somewhat similar issue. I have huge data sets, i.e. multichannel recordings with millisecond resolution over many hours. One data set may be several gigabytes, so the data won't fit in the main memory of the computer at once. I wrote an X Control which contains an XY graph. Each time the time scale is changed, i.e. the scroll bar is scrolled, the X Control generates a user event if it doesn't have enough data to display the time slice requested. The user event is handled outside the X Control by loading the appropriate set of data from the disk and displaying it on the X Control. The user event can be generated a while before the X Control runs out of data. This way the data is loaded a bit in advance, which allows seamless scrolling on the XY graph. One must note that front panel updates must be turned off in the X Control while the data is updated and turned back on after the update has finished; otherwise the XY graph will flicker annoyingly.
    Tomi Maila

  • Is there any way to connect a Time Capsule to a MacBook Pro directly via USB? I have a large amount of data that I want to back up and it is taking a very long time (35GB is taking 3 hrs; I have 2TB of files in total).

    Perhaps via USB? I have a large amount of data that I want to back up and it is taking a very long time (35GB is taking 3 hrs; I have 2TB of files in total). I want to use the Time Capsule as a backup for an archive which is currently stored on a 2 TB WESC HD.

    No, you cannot back up via a direct USB connection.
    But gigabit ethernet is much faster anyway.. are you connected directly by ethernet?
    Is the drive you are backing up from plugged into the TC? That will slow it down something chronic.. plug that drive in by its fastest connection method.. WESC sorry I have no idea. If ethernet use that.. otherwise USB direct to the computer.. always think what way the files come and go.. but since you are copying from the computer everything has to go that way.. it makes things slower if they go over the same cable.. if you catch the drift.

  • Why does my iPhone use excessive amounts of data?

    My dad, step-mom, brother, and I are all on a 40GB data plan through AT&T, so that gives us each 10GB to use throughout our billing cycle each month. Over time my phone has gradually started to use more and more data. I was on a 2GB data plan when I had an Android. After I got my iPhone I kept the same plan, but a few months later I kept going over my data so I got a 4GB data plan. I then began to go over my 4GB data plan, so I thought maybe it was just my iPhone and got an iPhone 5s. It stayed below the 4GB plan until around 6 months ago. It went from 4GB to 6GB to 8GB over time, and around January I was going over 10GB. In February my phone was using 10GB in 2 weeks. This month it used 10GB in 12 days. I am on wifi at school from 8:30-3:30 and I'm on wifi at home, where I usually get around 5:30 or 6. So, I contacted AT&T. They said I must be streaming videos and music constantly, but I'm not. I have turned off data to my apps. I have downloaded a data manager. I turned off my location services and the setting to automatically load videos. When I'm on wifi I turn my cellular data off. I kill my apps as soon as I'm done using them. I have even turned off settings for my apps and iCloud. I just don't know what else to do.

    Hi terrymaclean,
    If you are concerned about the amount of data your phone is using, here are some tips that can help reduce that:
    iOS: About cellular data usage
    http://support.apple.com/kb/HT4146
    Cheers,
    - Ari
