Widget question regarding system usage

I'm a recent convert from the PC world and am finding the Dashboard feature very useful. I did, however, have a question regarding the way widgets in the Dashboard access the iMac's system resources.
Specifically, I was wondering: if a widget is installed and appears in the "Manage Widgets" list but is NOT active on the Dashboard, does that widget still use the system's resources? (i.e. is it still actively updating its information or performing its task?) Or does it "sleep" until you actively enable it in the Dashboard?

Welcome to Discussions!
I believe widgets don't consume resources unless you have them open in Dashboard; just having them installed, but not open, shouldn't consume resources.
Since you're new to the Mac, you may want to check out Mac 101 and Switch 101.

Similar Messages

  • Comprehensible question regarding the usage of a ccBPM in SAP PI 7.31

    Hello experts,
    I have got a basic question about the usage of ccBPM in PI 7.31 dual stack. I haven't created any ccBPM before, so the whole setup process is new for me. We are already using several ccBPMs which were created by our colleagues.
    I just have a question about the basic workings of a ccBPM and whether the designed process can be realised at all.
    The process should be implemented starting from the ERP system with a trigger report (left side). All the offered functions are provided by the called system (right side).
    So can this whole process be realised using ccBPM, so that after the trigger report is called the rest of the process is handled within the PI? The process checkStatus needs to be called within a loop, and at the end the generated file should be stored on the file system.
    Thanks in advance for clearing my thoughts on this problem guys!
    regards
    Christian

    Hello Christian,
    is this a 1:1 correlation? So you have just one sender system and one target system? It might then be better to implement the flow in your sender system and only call single interfaces on PI. While it is technically possible to do everything in PI, there is no real added value in doing so. BPM is most beneficial when you need to aggregate information from different sources and when the process contains decisions that lead to alternative process steps.
    Regards,
    Jörg

  • General question regarding System.out.println

    Just for my knowledge fellas,
    When you have a few System.out.println's in your code, what happens to the lines written to standard out when you run your application from the executable jar file? I know if you run your app in the cmd prompt, the lines are output to the cmd screen, and in NetBeans, in the output section...but where do those lines go when you run the app as an executable? Are they still written, but with nothing to show the written lines?
    Thanks.
    ...DJVege...

    When you have a few System.out.println's in your code, what happens to the lines written to standard out when you run your application from the executable jar file?
    They get written to standard out, usually the console... it doesn't matter at all whether it's a JAR or not.
    I know if you run your app in the cmd prompt, then the lines are output to the cmd screen, and in NetBeans, in the output section...but where do those lines go when you run the app as an executable?
    The cmd screen. Or the Java console if you run javaw.
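    If no console is attached (for example when the jar is launched by double-clicking, or via javaw without the Java console enabled), the output is effectively lost. A minimal sketch of one common workaround, redirecting System.out to a file; the file name app.log is arbitrary:
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.PrintStream;

    public class LogToFile {
        public static void main(String[] args) throws IOException {
            // Send standard out (and err) to a file so println output
            // survives even when no console is attached.
            PrintStream log = new PrintStream(new FileOutputStream("app.log", true), true);
            System.setOut(log);
            System.setErr(log);
            System.out.println("This line ends up in app.log instead of a console.");
        }
    }
    When launching from a console you can get the same effect with plain redirection: java -jar app.jar > app.log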

  • Question regarding the usage of DSP48E1 in Zynq7030

    Hi everyone.
    I'm trying to implement a 3D audio function using the embedded DSP slices in the Zynq.
    I have a DSP library which is compiled with XDK and want to import the library to the DSP48E1 slices in the Zynq.
    The DSP library consists of HRTFs (Head Related Transfer Functions). The HRTF processing consists of some FFTs, phase comparison, phase addition, phase subtraction and amplitude amplification or attenuation.
    So, what I wonder is: can we import the library to the DSP slices and use it?
    Please let me know.
    Thanks,
    S. J. Lee

    Hello,
    Are you talking about a software library? You cannot just take a software library and automatically map it to DSP slices in the FPGA fabric in the same way that you would re-compile it for a different processor architecture, if that's what you mean. Don't think of the DSP slice as a DSP *processor*. That's not what it is.
    The closest thing to this would be to use high-level tools such as HLS or SDSoC to help ease the work of pushing certain portions of the algorithm to hardware. However, you will still be responsible for doing some level of partitioning and hardware design to implement the algorithm.

  • Question regarding roaming and data usage

    I am currently out of my main country of service, and as such I have a question regarding roaming and data usage.
    I am told that airplane mode is sufficient to keep the phone from roaming, but does this also apply to background data usage for applications and such?
    If the phone is in airplane mode, is all use of the phone, including wifi and application use over wifi, free of any extra roaming charges?

    Ann154 wrote:
    If you are getting charged to use the wifi, then it is possible. Otherwise, no.
    Just to elaborate here: Ann154 is referring to access charges for wifi, which have nothing to do with Verizon. Those charges would only apply if you are using wifi in a plane, a hotel, an internet cafe etc. that charges for wifi rather than offering it free. Verizon does not charge you for (or indeed know about!) wifi usage, or any other usage that is not on their cellular network (such as using a foreign SIM in global phones), so these charges, if any, will not show up on the Verizon bill app. Having the phone in airplane mode prevents all cellular data traffic, so you should be fine.

  • System usage per user

    Hi Security Experts,
    Our company has an SAP system, version ECC 5.0, and right now we are doing an analysis of the number of users and the quality of system usage per user. So the question is: is there a way to know how many resources a user consumes (i.e. time connected, transactions used), in order to separate the occasional users from the operating users?
    I'll appreciate a lot your help.
    Best Regards,
    Erik Espinosa

    Hi
    ST03N is your solution.
    There you will find every minute detail about performance in the SAP system.
    You can check User Profiles for days, weeks and months individually to check response times, database executions etc.
    You can also find the processes/transactions with the highest memory consumption.
    Apart from this, an EarlyWatch report from the system would also assist you in such an analysis.
    I hope this helps
    Regards
    CHEN

  • Question regarding DocumentDB RU consumption when inserting documents & write performance

    Hi guys,
    I do have some questions regarding the DocumentDB Public Preview capacity and performance quotas:
    My use case is the following:
    I need to store about 200.000.000 documents per day with a maximum of about 5000 inserts per second. Each document has a size of about 200 bytes.
    According to the documentation (http://azure.microsoft.com/en-us/documentation/articles/documentdb-manage/) I understand that I should be able to store about 500 documents per second with single inserts and about 1000 per second with a batch insert using a stored procedure. This would result in the need of at least 5 CUs just to handle the inserts.
    Since one CU consists of 2000 RUs, I would expect the RU usage to be about 4 RUs per single document insert or 100 RUs for a single SP execution with 50 documents.
    When I look at the actual RU consumption I get values I don’t really understand:
    Batch insert of 50 documents: about 770 RUs
    Single insert: about 17 RUs
    Example document:
    {"id":"5ac00fa102634297ac7ae897207980ce","Type":0,"h":"13F40E809EF7E64A8B7A164E67657C1940464723","aid":4655,"pid":203506,"sf":202641580,"sfx":5662192,"t":"2014-10-22T02:10:34+02:00","qg":3}
    The consistency level is set to “Session”.
    I am using the SP from the example C# project for batch inserts and the following code snippet for single inserts:
    await client.CreateDocumentAsync(documentCollection.DocumentsLink, record);
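    For reference, a minimal sketch of the same single insert using the DocumentDB Java SDK (assuming the com.microsoft.azure.documentdb 1.x client; the endpoint, key and collection link are placeholders), which also reads back the RU charge the service reports per request:
    import com.microsoft.azure.documentdb.ConnectionPolicy;
    import com.microsoft.azure.documentdb.ConsistencyLevel;
    import com.microsoft.azure.documentdb.Document;
    import com.microsoft.azure.documentdb.DocumentClient;
    import com.microsoft.azure.documentdb.DocumentClientException;
    import com.microsoft.azure.documentdb.ResourceResponse;

    public class RuProbe {
        public static void main(String[] args) throws DocumentClientException {
            // Placeholder account endpoint, master key and collection link.
            DocumentClient client = new DocumentClient(
                    "https://myaccount.documents.azure.com:443/", "<master-key>",
                    ConnectionPolicy.GetDefault(), ConsistencyLevel.Session);
            Document doc = new Document("{\"id\":\"5ac00fa1...\",\"Type\":0,\"qg\":3}");
            ResourceResponse<Document> response =
                    client.createDocument("dbs/mydb/colls/mycoll", doc, null, false);
            // The RU cost of this insert, as reported by the service.
            System.out.println("Request charge: " + response.getRequestCharge());
        }
    }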
    Is there any flaw in my assumption (ok…obviously) regarding the throughput calculation, or could you give me some advice on how to achieve the throughput stated in the documentation?
    With the current performance I would need to buy at least 40 CUs, which wouldn’t be an option at all.
    I have another question regarding document retention:
    Since I would need to store a lot of data per day, I would also need to delete as much data per day as I insert:
    The data is valid for at least 7 days (it actually should be 30 days, depending on my options with DocumentDB).
    I guess there is nothing like a retention policy for documents (this document is valid for X days and will automatically be deleted after that period)?
    Since I guess deleting data on a single-document basis is no option at all, I would like to create a document collection per day and delete the collection after a specified retention period.
    Those historic collections would never change but would only receive queries. The only problem I see with creating collections per day is the missing throughput:
    As I understand it, the throughput is split equally according to the number of available collections, which would result in “missing” throughput on the actual hot collection (hot meaning the only collection I would actually insert documents into).
    Example: 
    1 CU -> 2000 RUs
    7 collections -> 2000 / 7 = 286 RUs per collection (per CU)
    Needed throughput for hot collection (values from documentation): 20.000
    => 70 CUs (20.000 / 286)
    vs. 10 CUs when using one collection and batch inserts or 20 CUs when using one collection and single inserts.
    I know that DocumentDB is currently in preview and that it is not possible to handle this use case as is because of the limit of 10 GB per collection at the moment. I am just trying to do a POC so I can switch to DocumentDB when it is publicly available.
    Could you give me any advice on whether this kind of use case can be handled, or should be handled, with DocumentDB? I currently use Table Storage for this case (currently with a maximum of about 2500 inserts per second) but would like to switch to DocumentDB, since with Table Storage I had to optimize for writes per second and I have horrible query execution times because of full table scans.
    Once again my desired setup:
    200.000.000 inserts per day / Maximum of 5000 writes per second
    Collection 1.2 -> Hot Collection: All writes (max 5000 p/s) will go to this collection. Will also be queried.
    Collection 2.2 -> Historic data, will only be queried; no inserts
    Collection 3.2 -> Historic data, will only be queried; no inserts
    Collection 4.2 -> Historic data, will only be queried; no inserts
    Collection 5.2 -> Historic data, will only be queried; no inserts
    Collection 6.2 -> Historic data, will only be queried; no inserts
    Collection 7.2 -> Historic data, will only be queried; no inserts
    Collection 1.1 -> Old, so delete whole collection
    As a matter of fact, the perfect setup would be to have only one (huge) collection with automatic document retention…but I guess this won’t be an option at all?
    I hope you understand my problem and can give me some advice on whether this is at all possible, or will be possible in the future, with DocumentDB.
    Best regards and thanks for your help
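    (A minimal sketch of the collection-per-day rotation described above, again assuming the DocumentDB Java SDK; the names and links are hypothetical:)
    import com.microsoft.azure.documentdb.DocumentClient;
    import com.microsoft.azure.documentdb.DocumentClientException;
    import com.microsoft.azure.documentdb.DocumentCollection;

    public class DailyRotation {
        // Create today's hot collection and drop the collection that aged
        // out of the retention window. Links and names are placeholders.
        static void rotate(DocumentClient client, String databaseLink,
                           String today, String expiredCollectionLink)
                throws DocumentClientException {
            DocumentCollection hot = new DocumentCollection();
            hot.setId("events-" + today); // e.g. "events-2014-10-22"
            client.createCollection(databaseLink, hot, null);
            client.deleteCollection(expiredCollectionLink, null);
        }
    }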

    Hi Aravind,
    first of all thanks for your reply regarding my questions.
    I sent you a mail a few days ago, but since I did not receive a response I am not sure it got through.
    My main question regarding the actual usage of RUs when inserting documents is still my main concern, since I cannot insert nearly as many documents per second and CU as expected.
    According to the documentation (http://azure.microsoft.com/en-us/documentation/articles/documentdb-manage/) I understand that I should be able to store about 500 documents per second with single inserts and about 1000 per second with a batch insert using a stored procedure (20 batches per second containing 50 documents each).
    As described in my post, the actual usage is multiple (actually 6-7) times higher than expected…even when running the C# examples provided at:
    https://code.msdn.microsoft.com/windowsazure/Azure-DocumentDB-NET-Code-6b3da8af/view/SourceCode
    I tried all the ideas Steve posted (manual indexing & lazy indexing mode) but was not able to bring RU consumption down to a point where 500 inserts per second were nearly possible.
    Here again my findings regarding RU consumption for batch inserts:
    Automatic indexing on: 777 RUs for 50 documents
    Automatic indexing off & mandatory path only: 655 RUs for 50 documents
    Automatic indexing off & IndexingMode Lazy & mandatory path only: 645 RUs for 50 documents
    Expected result: approximately 100 RUs (2000 RUs => 20x batch insert of 50 => 100 RUs per batch)
    Since DocumentDB is still in Preview, I understand that it is not yet capable of handling my use case regarding throughput, collection size, number of collections and possible CUs, and I am fine with that.
    If I am able to (at least nearly) reach the stated performance of 500 inserts per second per CU, I am totally fine for now. If not, I have to move on and look for other options…which would also be “fine”. ;-)
    Is there actually any working example code that manages to do 500 single inserts per second with one CU's 2000 RUs, or is this a totally theoretical value? Or is it just because of the Preview, and the stated values are planned to work later?
    Regarding your feedback:
    ...another thing to consider is if you can amortize the request rate over the average of 200 M requests/day = 2000 requests/second, then you'll need to provision 16 capacity units instead of 40 capacity units. You can do this by catching "RequestRateTooLargeExceptions" and retrying after the server specified retry interval…
    Sadly this is not possible for me, because I have to query the data in near real time for my use case…so queuing is not an option.
    We don't support a way to distribute throughput differently across hot and cold collections. We are evaluating a few solutions to enable this scenario, so please do propose it as a feature at http://feedback.azure.com/forums/263030-documentdb as this helps us prioritize feature work. Currently, the best way to achieve this is to create multiple collections for hot data, and shard across them, so that you get more proportionate throughput allocated to it.
    I guess I could circumvent this by not clustering into “hot” and “cold” collections but “hot” and “cold” databases with one or multiple collections each (if 10GB will remain the limit per collection), if there were a way to (automatically?) scale the CUs via an API. Otherwise I would have to manually scale down the DBs holding historic data. I also added a feature request as proposed by you.
    Sorry for the long post, but I am planning the future architecture for one of our core systems and want to be sure I am on the right track.
    So if you would be able to answer just one question, it would be:
    How do I achieve the stated throughput of 500 single inserts per second with one CU's 2000 RUs in reality? ;-)
    Best regards and thanks again

  • Questions regarding customisation/configuration of PS CS4

    Hello
    I have accumulated a list of questions regarding customising certain things in Photoshop. I don't know if these things are doable and if so, how.
    Can I make it so that the list of blending options for a layer is by default collapsed when you first apply any options?
    Can I make it possible to move the canvas even though I'm not zoomed in enough to only have parts of it visible on my screen?
    Is it possible to enable a canvas rotate shortcut, similar to the way you can Alt+RightClick to quickly change brush size?
    Is it possible to lock button positions? Sometimes I accidentally drag them around when I meant to click.
    Is it possible to lock panel sizes? For example, if I have the Navigator and the Layers panels vertically in the same group, can I lock the height of the navigator so that I don't have to re-adjust it all the time? Many panels have a minimum height so I guess what I am asking for is if it's possible to set a maximum height as well.
    Is it possible to disable Photoshop from automatically appending "copy" at the end of layer/folder names when I duplicate them?
    These are things I'd really like to change to my liking as they are problems I run into on a daily basis.
    I hope someone can provide some nice solutions

    NyanPrime wrote:
    <answered above>
    Can I make it possible to move the canvas even though I'm not zoomed in enough to only have parts of it visible on my screen?
    Is it possible to enable a canvas rotate shortcut, similar to the way you can Alt+RightClick to quickly change brush size?
    Is it possible to lock button positions? Sometimes I accidentally drag them around when I meant to click.
    Is it possible to lock panel sizes? For example, if I have the Navigator and the Layers panels vertically in the same group, can I lock the height of the navigator so that I don't have to re-adjust it all the time? Many panels have a minimum height so I guess what I am asking for is if it's possible to set a maximum height as well.
    Is it possible to disable Photoshop from automatically appending "copy" at the end of layer/folder names when I duplicate them?
    These are things I'd really like to change to my liking as they are problems I run into on a daily basis.
    I hope someone can provide some nice solutions
    2.  No.  It's a sore spot that got some forum time when Photoshop CS4 was first released, then again with CS5.  It's said that the rules change slightly when using full-screen mode, though I personally haven't tried it.
    3.  Not sure, since I haven't tried it.  However, you may want to explore the Edit - Keyboard Shortcuts... menu, if you haven't already.
    4.  What buttons are you talking about?  Those you are creating in your document?  If so, choose the layer you want to lock in the LAYERS panel, then look at the little buttons just above the listing of the layers. [screenshot not included]
    5.  There are many, many options for positioning and sizing panels.  Most start with making a panel visible, then dragging it somewhere by its little tab.  One of the important features is that you can save your preferred layout as a named workspace.  Choose the Window - Workspace - New Workspace... to create a new named workspace (or to update one you've already created).  The name of that menu is a little confusing.  Once you have created your workspace, if something gets out of place, choose Window - Workspace - Reset YourNamedWorkspace to bring it back to what was saved.
    You'll find that panels like to "stick together", which helps with arranging them outside of the Photoshop main window.
    As an example, I use two monitors, and this is my preferred layout. [screenshot not included]
    6.  No, it's not possible to affect the layer names Photoshop generates, as far as I know.  I have gotten in the habit of immediately naming them per their usage, so that I don't confuse myself (something that's getting easier and easier to do...).
    Hope this helps!
    -Noel

  • Question regarding ONT connection via Ethernet and Cable cards

    Hi,
    We recently upgraded to the Fios Quantum 150Mbps/65Mbps plan. We are not getting the advertised speeds (we only get about 5Mbps upload), so Verizon is sending a tech to switch the ONT connection from coax to ethernet.
    I have 2 questions regarding this new setup:
    1. If the ONT communicates with the Fios Actiontec router via ethernet instead of coax from now on, how will the set top boxes and the TiVo-esque cable-card-powered device I currently have connected to coax talk to the Verizon system, if coax is taken out of the equation? Will the Fios signal still flow through the internal coax wiring of the house? Moreover, I was under the impression that coax was the way set top boxes communicated and derived independent IP addresses from the Fios router, for on-demand purposes and whatnot. How will this work from now on?
    Question 2.
    Right next to the wall where the ONT sits, there's a basement office where we have a PC that connects to the Fios system via an Actiontec MoCA adapter (ECB2500C), which I assume derives its internet connection from the Fios Actiontec router which sits upstairs in the living room.
    Again, with the coax about to be disabled next Friday in favor of an ethernet connection from the ONT, I assume this PC will be left without internet because of the lack of internet signal on the coax? Is this correct?
    Question 2.5: If my above assumption is correct, since this office is right on the other side of the wall where the ONT sits outside the house, would it be possible to run an ethernet wire through the wall that connects straight from the ONT to an ethernet switch inside the office, from which I would derive a connection for this basement PC (properly firewalled, of course), and then, from said switch, continue running the ethernet wire that would ultimately reach the Actiontec Fios router upstairs from which the rest of the house derives its internet? And would this setup affect in any way the proper functioning of the cable boxes in the house?
    I'd appreciate your input and any help you can provide so I can have a ballpark idea of what to tell the Fios guy to do when he comes on Friday.
    Cheers.

    It's not valid to have two devices connected to the ONT, PC and VZ Router.  Must be a single device. The ONT locks onto the MAC Address of the first device it sees. Since you have TV you should have the VZ router as the internet facing router.
    Other options:
    1.  Have the VZ Router located next to the PC in the basement and then use Wireless for all other PC's.
    2.  Have the VZ Router located next to the PC in the basement but run one wire upstairs and connect a switch where other PCs and devices can connect via a wire.
    Hope that helps.

  • I have some questions regarding setting up a software RAID 0 on a Mac Pro

    I have some questions regarding setting up a software RAID 0 on a Mac pro (early 2009).
    These questions might seem stupid to many of you, but, as my last, in fact my one and only, computer before the Mac Pro was a IICX/4/80 running System 7.5, I am a complete novice regarding this particular matter.
    A few days ago I installed a WD3000HLFS VelociRaptor 300GB in bay 1, and moved the original 640GB HD to bay 2. I now have 2 bootable internal drives, and currently I am using the VR300 as my startup disk. Instead of cloning from the original drive, I have reinstalled the Mac OS, and all my applications & software onto the VR300. Everything is backed up onto a WD SE II 2TB external drive, using Time Machine. The original 640GB has an eDrive partition, which was created some time ago using TechTool Pro 5.
    The system will be used primarily for photo editing, digital imaging, and to produce colour prints up to A2 size. Some of the image files, from scanned imports of film negatives & transparencies, will be 40MB or larger. Next year I hope to buy a high resolution full frame digital SLR, which will also generate large files.
    Currently I am using Apple's bundled iPhoto, Aperture 2, Photoshop Elements 8, Silverfast Ai, ColorMunki Photo, EZcolor and other applications/software. I will also be using Photoshop CS5, when it becomes available, and I will probably change over to Lightroom 3, which is currently in Beta, because I have had problems with Aperture, which, until recent upgrades (HD, RAM & graphics card) to my system, would not even load images for print. All I had was a blank preview page, and a constant, frozen "loading" message - the symbol underneath remained static, instead of revolving!
    It is now possible to print images from within Aperture 2, but I am not happy with the colour fidelity, whereas it is possible to produce excellent, natural colour prints using its "minnow" sibling, iPhoto!
    My intention is to buy another 3 VR300s to form a 4-drive RAID 0 array for optimum performance, and to store the original 640GB drive as an emergency bootable back-up. I would have ordered the additional VR300s already, but for the fact that there appears to have been a run on them, and currently they are out of stock at all but the more expensive UK resellers.
    I should be most grateful to receive advice regarding the following questions:
    QUESTION 1:
    I have had a look at the RAID setting up facility in Disk Utility and it states: "To create a RAID set, drag disks or partitions into the list below".
    If I install another 3 VR300s, can I drag all 4 of them into the "list below" box, without any risk of losing everything I have already installed on the existing VR300?
    Or would I have to reinstall the OS, applications and software again?
    I mention this, because one of the applications, Personal accountz, has a label on its CD wallet stating that the Licence Key can only be used once, and I have already used it when I installed it on the existing VR300.
    QUESTION 2:
    I understand that the failure of just one drive will result in all the data in a Raid 0 array being lost.
    Does this mean that I would not be able to boot up from the 4 drive array in that scenario?
    Even so, it would be worth the risk to gain the optimum performance provide by Raid 0 over the other RAID setup options, and, in addition to the SE II, I will probably back up all my image files onto a portable drive as an additional precaution.
    QUESTION 3:
    Is it possible to create an eDrive partition, using TechTool Pro 5, on the VR300 in bay 1?
    Or would this not be of any use anyway, in the event of a single drive failure?
    QUESTION 4:
    Would there be a significant increase in performance using a 4 x VR300 drive RAID 0 array, compared to only 2 or 3 drives?
    QUESTION 5:
    If I used a 3 x VR300 RAID 0 array, and installed either a cloned VR300 or the original 640GB HD in bay 4, and I left the Startup Disk in System Preferences unlocked, would the system boot up automatically from the 4th. drive in the event of a single drive failure in the 3 drive RAID 0 array which had been selected for startup?
    Apologies if these seem stupid questions, but I am trying to determine the best option without foregoing optimum performance.

    Well said.
    Steps to set up RAID
    Setting up a RAID array in Mac OS X is part of the installation process. This procedure assumes that you have already installed Mac OS 10.1 and the hard drive subsystem (two hard drives and a PCI controller card, for example) that RAID will be implemented on. Follow these steps:
    1. Open Disk Utility (/Applications/Utilities).
    2. When the disks appear in the pane on the left, select the disks you wish to be in the array and drag them to the disk panel.
    3. Choose Stripe or Mirror from the RAID Scheme pop-up menu.
    4. Name the RAID set.
    5. Choose a volume format. The size of the array will be automatically determined based on what you selected.
    6. Click Create.
    Recovering from a hard drive failure on a mirrored array
    1. Open Disk Utility in (/Applications/Utilities).
    2. Click the RAID tab. If an issue has occurred, a dialog box will appear that describes it.
    3. If an issue with the disk is indicated, click Rebuild.
    4. If Rebuild does not work, shut down the computer and replace the damaged hard disk.
    5. Repeat steps 1 and 2.
    6. Drag the icon of the new disk on top of that of the removed disk.
    7. Click Rebuild.
    http://support.apple.com/kb/HT2559
    Drive A + B = VOLUME ONE
    Drive C + D = VOLUME TWO
    What you put on those volumes is of course up to you and easy to do.
    A system really only needs to be backed up "as needed" like before you add or update or install anything.
    /Users can be backed up hourly, daily, weekly schedule
    Media files as needed.
    Things that hurt performance:
    Page outs
    Spotlight - disable this for boot drive and 'scratch'
    SCRATCH: Temporary space; erased between projects and steps.
    http://en.wikipedia.org/wiki/Standard_RAID_levels
    (normally I'd link to Wikipedia but I can't load right now)
    Disk drives are the slowest component, so tackling that has always made sense. Easy way to make a difference. More RAM only if it will be of value and used. Same with more/faster processors, or graphic card.
    To help understand and configure your 2009 Nehalem Mac Pro:
    http://arstechnica.com/apple/reviews/2009/04/266ghz-8-core-mac-pro-review.ars/1
    http://macperformanceguide.com/
    http://www.macgurus.com/guides/storageaccelguide.php
    http://www.macintouch.com/readerreports/harddrives/index.html
    http://macperformanceguide.com/OptimizingPhotoshop-Configuration.html
    http://kb2.adobe.com/cps/404/kb404440.html

  • Questions regarding how I pay my Line Rental Bill ...

    Hi BT, I'm currently a BE Broadband customer. I was a BT Broadband customer back in the days when you had the BT Home Hub 1.0 and 8mb packages, of which I was only getting like 5-6mb, 0.3mb upload, and constant speed caps on my connection to 1mb. You guys didn't have true unlimited usage back then, so being a gamer and someone that frequently downloads, I changed to BE Broadband. They gave me a steady 11mb connection with 1.3mb upload speed, which I've been satisfied with. But now BE Broadband has been sold off to Sky, which I don't think I want to be a part of, as I've heard a lot of bad feedback and I really do not want to be forced into paying their line rental if I were to switch to Sky Unlimited.
    So now I'm considering coming back to BT Broadband, knowing that you've probably improved over the years. I hope so. I know you have this whole BT Infinity going now, but my area doesn't support fibre optic broadband and probably never will; I've looked on the BT Openreach website and it says there are no plans for my area to be installed with fibre optics, which I would have wanted, but oh well. What are the chances of my internet speed being the same as it is now with BE Broadband? I don't see why it would go any slower. Are your upload speeds around the same?
    Also, I have a question regarding the payment methods for your BT line rental. I'm currently already on BT line rental on the Anytime calling plan. I pay the bill quarterly every 3 months, which I'm comfortable with. What confuses me is that when I select the package to order BT Unlimited + Calls, I only have the option to select Monthly Line Rental or Line Rental Saver. Why am I able to pay my phone bill every 3 months if those are the only two options? I'd preferably want to stick to how I pay my phone bill. So, furthermore, how should I go about ordering BT Unlimited broadband without it interfering with my current BT line rental? I just want BT Broadband Unlimited alongside my current BT line rental method of payment. Would I still be eligible for the 6 months free and the gift card? Is the BT Home Hub 4 that comes with it free also? Also, one more question: I'd need my MAC code from BE Broadband, right?

    This is a community forum.  The people on here are other BT customers.
    The speeds have certainly improved but only if BT's equipment at your exchange supports ADSL2/2+.  If not, you'll get what you had before.  While the speeds have improved, the same can't be said for the customer service.  Upload on ADSL2+ can be up to about 1.2M.  On ADSLMax, it's still capped at 448K.
    BT like to combine broadband and phone on one bill.  So if you get line rental and broadband, they will send you one monthly bill for both.  You might be able to get them separately, but not through the web site.  Try sales on 0800 800 150.

  • File system usage report chart generation - automated via reports and email

    Hello,
    I have a requirement where I have to generate a monthly file system usage chart (just like a tablespace total_mb, used_mb, free_mb...) and mail it directly to the client.
    Any ideas or suggestions would be welcome....
    Thanks,
    S.

    It's a pretty open-ended question, because a number of things can be causing slow performance.  You mention needing to look at ODBC connection performance - I found that using ODBC drivers for iSeries was very slow in the past.  You might want to look at using the JDBC driver in the IBM JTOpen Toolkit - it made a big difference for us when we were querying iSeries in the past.
    When you refer to "reports" in this thread's title, do you mean reports from ColdFusion Report Builder?  Or to just ColdFusion .cfm pages?  How complex are these reports?  How much information is displayed, especially in tabular form?  Just rendering huge HTML tables (as in thousands of rows of data) will often cause browsers to become temporarily unresponsive or hang.  If you provide more details, we might better be able to target where the "pain point" really is.
    -Carl V.
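    Coming back to the original question, a minimal sketch of collecting per-file-system totals in Java (class name hypothetical; the MB columns mirror the tablespace-style report mentioned above, and scheduling/mailing, e.g. via cron and JavaMail, are left out):
    import java.io.IOException;
    import java.nio.file.FileStore;
    import java.nio.file.FileSystems;

    public class FsUsageReport {
        public static void main(String[] args) throws IOException {
            // One line per mounted file system: total, used and free space in MB.
            for (FileStore store : FileSystems.getDefault().getFileStores()) {
                long totalMb = store.getTotalSpace() / (1024 * 1024);
                long freeMb = store.getUsableSpace() / (1024 * 1024);
                System.out.printf("%-30s total=%d MB used=%d MB free=%d MB%n",
                        store.toString(), totalMb, totalMb - freeMb, freeMb);
            }
        }
    }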

  • Basic question regarding SSIS 2010 package where source is Microsoft Excel 97-2005 and there is no Microsoft Office or Excel driver installed in production

    Hi all,
    I have one basic question regarding an SSIS 2010 package where the source is Microsoft Excel 97-2005. I wanted to know how this package works in production where there is no Microsoft Office or Excel driver installed. To check whether there is an Excel driver installed or not, I followed these steps: Start --> Administrative Tools --> Data Sources (ODBC) --> Drivers, and I found only 2 drivers: one is SQL Server and the other is SQL Server Native Client 11.0.
    The Windows edition is Windows Server 2008 R2 Enterprise, Service Pack 1, and the system type is a 64-bit operating system.
    We are running this package from SQL Server Agent using 32-bit DTExec (\\Machine_Name\d$\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe /FILE "\\Machine_Name\d$\ Folder_Name\EtL.dtsx" /CONFIGFILE "\\Machine_Name\d$\Folder_Name\Config.dtsConfig" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING E). I opened the package and tried to find out what connection we have used, and found that we have used the "Excel Connection Manager" with ConnectionString=Provider=Microsoft.Jet.OLEDB.4.0;Data Source=F:\Fares.xls;Extended Properties="EXCEL 8.0;HDR=YES"; and the source is ‘Excel Source’.
    I discussed this with my DBA and he said that SSIS has an inbuilt Excel driver, but I am not convinced.
    Could anyone please clear up my confusion/doubt?
    I have gone through various links but my doubt is still not clear.
    Quick Reference:
    SSIS in 32- and 64-bits
    http://toddmcdermid.blogspot.com.au/2009/10/quick-reference-ssis-in-32-and-64-bits.html
    Why do I get "product level is insufficient..." error when I run my SSIS package?
    http://blogs.msdn.com/b/michen/archive/2006/11/11/ssis-product-level-is-insufficient.aspx
    How to run SSIS Packages using 32-bit drivers on 64-bit machine
    http://help.pragmaticworks.com/dtsxchange/scr/FAQ%20-%20How%20to%20run%20SSIS%20Packages%20using%2032bit%20drivers%20on%2064bit%20machine.htm
    Troubleshooting OLE DB Provider Microsoft.ACE.OLEDB.12.0 is not registered Error when importing data from an Excel 2007 file to SQL Server 2008
    http://www.mytechmantra.com/LearnSQLServer/Troubleshoot_OLE_DB_Provider_Error_P1.html
    How Can I Get a List of the ODBC Drivers that are Installed on a Computer?
    http://blogs.technet.com/b/heyscriptingguy/archive/2005/07/07/how-can-i-get-a-list-of-the-odbc-drivers-that-are-installed-on-a-computer.aspx
    Thanks, Shiven :) If the answer is helpful, please vote.

    Hi S Kumar Dubey,
    In SSIS, the Excel Source and Excel Destination natively use the Microsoft Jet 4.0 OLE DB Provider, which is installed by SQL Server. The Microsoft Jet 4.0 OLE DB Provider deals with .xls files created by Excel 97-2003. To deal with .xlsx files created by Excel 2007, we need the Microsoft ACE OLEDB Provider. SQL Server doesn’t install the Microsoft ACE OLEDB Provider; to get it we can install the 2007 Office System Driver: Data Connectivity Components, or the Microsoft Access Database Engine 2010 Redistributable, or the Microsoft Office suite.
    The drivers listed in the ODBC Data Source Administrator are ODBC drivers, not OLEDB drivers; therefore, the Excel Source/Destination in SSIS won’t use the ODBC driver for Excel listed there by default. On a 64-bit Windows platform, there are two versions of the ODBC Data Source Administrator. The 64-bit ODBC Data Source Administrator is C:\Windows\System32\odbcad32.exe, while the 32-bit one is C:\Windows\SysWOW64\odbcad32.exe. The original 32-bit and 64-bit ODBC drivers are installed by the Windows operating system. By default, there are multiple 32-bit ODBC drivers and fewer 64-bit ODBC drivers installed on a 64-bit platform. To get more ODBC drivers, we can install the 2007 Office System Driver: Data Connectivity Components or the Microsoft Access Database Engine 2010 Redistributable.
    Besides, please note that the 2007 Office System Driver: Data Connectivity Components only installs 32-bit ODBC and OLEDB drivers because it only has a 32-bit version, while the Microsoft Access Database Engine 2010 Redistributable has both a 32-bit version and a 64-bit version.
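    To make the provider difference concrete, here is the .xls connection string from the package above next to a hypothetical ACE-based equivalent for .xlsx (the F:\ paths are from the original post and purely illustrative):
    Jet, .xls (32-bit only): Provider=Microsoft.Jet.OLEDB.4.0;Data Source=F:\Fares.xls;Extended Properties="Excel 8.0;HDR=YES";
    ACE, .xls or .xlsx:      Provider=Microsoft.ACE.OLEDB.12.0;Data Source=F:\Fares.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES";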
    If you have any questions, please feel free to ask.
    Regards,
    Mike Yin
    TechNet Community Support

  • Questions regarding creation of vendor in different purchase organisation

    Hi ABAP gurus,
    I have a few questions regarding data transfers.
    1) While creating a vendor, the vendor is specific to a company code, and the vendor can be present in different purchasing organisations within the same company code if the purchasing organisation is present at plant level. My client has vendors in different purchasing organisations; how do we handle the above situation?
    2) I had a few error records while uploading MM01; how do I download the error records? I was using LSMW with predefined programs.
    3) For a few applications there are no predefined programs, so I will have to choose either a predefined BAPI or IDocs. Which is better to go with? I found that BAPIs and IDocs have the same predefined structures, so what is the difference between the two?

    Hi,
    1. Create a BDC program with purchasing organisation as a parameter on the selection screen, then run the same BDC program for the different purchasing organisations so that the vendors are created in each of them.
    2. Check the Action Log in LSMW and see.
    3. See the doc:
    BAPI - BAPIs (Business Application Programming Interfaces) are the standard SAP interfaces. They play an important role in the technical integration and in the exchange of business data between SAP components, and between SAP and non-SAP components. BAPIs enable you to integrate these components and are therefore an important part of developing integration scenarios where multiple components are connected to each other, either on a local network or on the Internet.
    BAPIs allow integration at the business level, not the technical level. This provides for greater stability of the linkage and independence from the underlying communication technology.
    LSMW - No ABAP effort is required for the SAP data migration. However, effort is required to map the data into the structure according to the pre-determined format as specified by the pre-written ABAP upload program of the LSMW.
    The Legacy System Migration Workbench (LSMW) is a tool recommended by SAP that you can use to transfer data once only or periodically from legacy systems into an R/3 System.
    More and more medium-sized firms are implementing SAP solutions, and many of them have their legacy data in desktop programs. In this case, the data is exported in a format that can be read by PC spreadsheet systems. As a result, the data transfer is mere child's play: Simply enter the field names in the first line of the table, and the LSM Workbench's import routine automatically generates the input file for your conversion program.
    The LSM Workbench lets you check the data for migration against the current settings of your customizing. The check is performed after the data migration, but before the update in your database.
    So although it was designed for uploading of legacy data it is not restricted to this use.
    We use it for mass changes, i.e. uploading new/replacement data and it is great, but there are limits on its functionality, depending on the complexity of the transaction you are trying to replicate.
    The SAP transaction code is 'LSMW' for SAP version 4.6x.
    Check your procedure using these links:
    BAPI with LSMW
    http://esnips.com/doc/ef04c89f-f3a2-473c-beee-6db5bb3dbb0e/LSMW-with-BAPI
    For document on using BAPI with LSMW, I suggest you to visit:
    http://www.****************/Tutorials/LSMW/BAPIinLSMW/BL1.htm
    http://esnips.com/doc/1cd73c19-4263-42a4-9d6f-ac5487b0ebcb/LSMW-with-Idocs.ppt
    http://esnips.com/doc/ef04c89f-f3a2-473c-beee-6db5bb3dbb0e/LSMW-with-BAPI.ppt
    Reward points for useful answers
    Regards
    Anji

  • Hi, I want to downgrade from OSX LEOPARD to OSX TIGER but I have a few questions regarding this. My iMac is originally from 2007 it came preloaded with tiger. I have original install tiger discs version 10.4.10. Is it safe to downgrade or not please help

    Hi, I want to downgrade from OS X Leopard to OS X Tiger, but I have a few questions regarding this. My iMac is originally from Sep 2007 and came preloaded with Tiger. I have the original (2) Tiger install discs, version 10.4.10. I want to know if it is safe and what the necessary steps are to do so. Also, regarding downgrading, I'm wondering if a lot of apps nowadays still support Tiger; for example I have Photoshop versions 5 and 4, and these are very important to me. One last question: does anyone know of any reliable virus protection for Mac that doesn't slow down your computer? I have read that a lot of them do. If anyone can help me I would greatly appreciate it! Here are the specs for my iMac:
    Model Name: iMac
    Model Identifier: iMac7,1
    Processor Name: Intel Core 2 Duo
    Processor Speed: 2 GHz
    Number of Processors: 1
    Total Number of Cores: 2
    L2 Cache: 4 MB
    Memory: 2 GB
    Bus Speed: 800 MHz

    Most of the time a perception of general slow performance is the result of installing third party junk alleged to speed up, "clean" or "optimize" your Mac, or to look for viruses that don't exist. Ideally you would know what you installed so you can uninstall it, but if you don't know or aren't sure there are techniques such as Safe Mode and creating a temporary user account to confirm that suspicion.
    If you open Activity Monitor it may show a process, or processes, that occupy a lot of your system's time.
    Slowness confined solely to web browser activity is often the result of an inexorable progress toward websites that demand ever more processor-intensive tasks. If your slow performance is strictly limited to web browsing, you might try disabling Flash by either uninstalling it, or use utilities such as ClickToFlash that allow you to control what Flash content gets loaded. Flash in itself is not inherently evil, but there is nothing to stop websites or the advertisers who pay for them from writing horrible Flash code that can do everything from hogging 100% of your CPU's time to causing random crashes. You can watch Activity Monitor as in the above to correlate these troublesome web pages with performance degradation.
    You are correct; if your computer shipped with Tiger you may certainly revert to it. I forgot that Tiger was shipping on new Macs as recently as five years ago. To downgrade it would be necessary to completely erase your hard disk and boot with the Tiger installation DVD, followed by installing it anew. Such drastic measures are not necessary and you are unlikely to be satisfied with the results anyway.
    Assuming your system is free of third party parasitic junk attached to OS X in an ill-conceived attempt to improve upon it, that your hard disk drive is sound and the boot volume has enough free space to work with, by far the best performance-enhancing improvement would be to add more memory. Buy as much as your computer can use and that you can afford. 2 GB is not that much any more.
    Read the following for some recommended troubleshooting techniques from Apple:
    General purpose Mac troubleshooting guide: Isolating issues in Mac OS X
    Creating a temporary user to isolate user-specific problems: Isolating an issue by using another user account
    Memory limitations: Using Activity Monitor to read System Memory and determine how much RAM is being used
    Identifying resource hogs and other tips: Runaway applications can shorten battery runtime
    Starting the computer in "safe mode": Mac OS X: What is Safe Boot, Safe Mode?

Maybe you are looking for

  • iWeb crashes when I try to start it

    Hey there, I have a 6-month-old iMac 2.33 running the latest Apple software with 2GB RAM, but for the life of me I can't get iMovie (08) to start up. Each time I try, the little coloured spinning thing comes up and then Activity Monitor tells me it has

  • Kernel patches in Solaris

    All, First of all, I am fairly new to Unix... I have an SAP NW 7.0 subscription system running on Solaris 10; I installed all usage types except PI. Now I want to upgrade to SP14. As part of this I have to patch the kernel from 111 to 133 as well. Acc

  • 2 ipods on one account

    I have 2 iPods on my account, one that is mine and the other I gave to my BF, but if I want to delete music off my iPod that he has downloaded to iTunes, how do I do that without deleting it off his iPod too? And how do I keep from getting his down

  • ALV Title not displaying

    I'm using the same user ID on two different PCs. On one PC the ALV title is displayed, and on the other PC the ALV title is not. Can anyone advise me on this? Thanks.

  • Stuck at exportSDK.dll

    Hi, I'm doing an XI R2 client installation on my machine. It is stuck at exportsdk.dll. I forcefully exited the setup, cleared all the registry entries, deleted the folders, restarted the machine and did the installation from scratch. Again it is stuck