Best workaround for using Gmail to filter POP spam?

I have 3 POP email addresses and get hundreds of spam mails per day. Can't handle having to delete them manually on the iPhone. I tried setting my Gmail account to check my POP emails and setting up my iPhone with only my Gmail account. Now I get all my POP email without spam, but then I can only send using my Gmail address (which I never use and don't want to). Any suggestions for a workaround?

And the answer is… according to David Pogue, columnist for the NY Times:
A Smarter Way to Fetch E-Mail
I know everybody's sick to death of hearing about Apple's latest i-product, so I promise not to even mention its name in this newsletter. But as I was trying to get my e-mail set up on that cellphone, I stumbled upon a delicious secret feature of Gmail, Google's fast, free, fantastic Web-based e-mail service. This is a trick that can help everyone, whether you have a cellphone or not.
One big drawback of the Mail program on Apple's phone is that it has no spam filter. That's not a big deal if your e-mail comes from AOL, Yahoo or Gmail, because those services have pretty good spam filters of their own. But if you have some other kind of account, like a standard POP account (provided by your cable company, for example), you may be overrun by junk mail.
I kept hearing from people who told me how they solved this problem: "Oh, I just forward my mail to Gmail," they say. "Then I set up my new phone to check my Gmail account instead of my regular address."
Well, all right; it's easy enough to make your e-mail program auto-forward incoming mail to your Gmail address. But there's a huge problem with that setup: now all of your messages appear to have come from you, the forwarder. If you hit Reply on your phone, the response doesn't go to the original sender; it goes right back to YOU! It gets sent back to your desktop computer (or whatever computer is doing the forwarding).
Clearly, that's no good. So I asked my tech guru, Brian Jepson, if there's any solution, and he told me about Gmail's new Mail Fetcher service. My problem was solved in five minutes.
In essence, this feature tells Gmail to fetch messages from your existing POP account, so that it all shows up at Gmail.com. Better yet, Mail Fetcher offers you the chance to have outgoing messages stamped with your regular e-mail address. In other words, Gmail.com becomes a free, invisible mail processing center, leaving no trace of its involvement. The people you correspond with will never know that their messages, or your responses, went anywhere but straight to your computer and back.
You can still check mail with Outlook, Mail, Entourage, or whatever program you're using now. But now you've solved the spam problem on your phone, and better yet, you can now check your regular POP e-mail (up to five accounts, in fact) at Gmail.com, from any computer in the world! Now, if all you want to do is keep in touch with e-mail while you're on vacation, you can leave your laptop at home.
Here's how you set up this free, no-downsides arrangement. Suppose that your real e-mail address is [email protected].
First, sign up for a free Gmail account at www.gmail.com.
Once your account is active, visit Gmail.com. Click Settings, then Accounts. Under "Get mail from other accounts," click "Add another email account." Fill in the e-mail settings for your main address: name, password, mail server address.
If you like, you can also turn on "Leave a copy of retrieved message on the server." That means that you'll also be able to check your mail from Outlook, Mail, or whatever e-mail program you use, just as you always have. The Gmail account will just be a backup, a secondary, Web-based way to do e-mail.
As you complete the setup process in Gmail, a message says: "You can now retrieve mail from this account. Would you also like to be able to send mail as [email protected]?"
Click "Yes, I want to be able to send mail as [email protected]."
In other words, when you reply, your main e-mail address, not Gmail's, will be the return address. It won't matter whether you send from Gmail.com or from your phone; it will all look like it came from Outlook, Mail, or whatever.
You can add up to five e-mail accounts this way, consolidating them all in one place, a very neat trick. Gmail seems to check for new messages about every five minutes, and there's also a "Check mail now" button.
I know this all sounds much more technical than my usual writings; there's no way around it. The bottom line, though, is that Gmail's Mail Fetcher system solves a big problem for smartphone owners, and, by making your mail available on the Web, another big one for travelers. Nice.

Similar Messages

  • Best practice for use of spatial operators

    Hi All,
    I'm trying to build a .NET toolkit to interact with Oracle's spatial operators. The most common use of this toolkit will be to find results which are within a given geometry, for example selecting parish boundaries within a county.
    Our boundary data is highly detailed, commonly containing upwards of 50,000 vertices for a county-sized polygon.
    I've currently been experimenting with queries such as:
    select a.*
    from
      uk_ward a,
      uk_county b
    where
      UPPER(b.name) = 'DORSET COUNTY' and
      sdo_relate(a.geoloc, b.geoloc, 'mask=coveredby+inside') = 'TRUE';
    However, the speed is unacceptable, especially as most implementations of the toolkit will be web-based. The query above takes around a minute to return.
    Any comments or thoughts on the best practice for use of Oracle spatial in this way will be warmly welcomed. I'm looking for a solution which is as quick and efficient as possible.

    Thanks again for the reply... the query currently takes just under 90 seconds to return. Here are the results from the execution plan, run in SQL*Plus:
    Elapsed: 00:01:24.81
    Execution Plan
    Plan hash value: 598052089
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 156 | 46956 | 76 (0)| 00:00:01 |
    | 1 | NESTED LOOPS | | 156 | 46956 | 76 (0)| 00:00:01 |
    |* 2 | TABLE ACCESS FULL | UK_COUNTY | 2 | 262 | 5 (0)| 00:00:01 |
    | 3 | TABLE ACCESS BY INDEX ROWID| UK_WARD | 75 | 12750 | 76 (0)| 00:00:01 |
    |* 4 | DOMAIN INDEX | UK_WARD_SX | | | | |
    Predicate Information (identified by operation id):
    2 - filter(UPPER("B"."NAME")='DORSET COUNTY')
    4 - access("MDSYS"."SDO_INT2_RELATE"("A"."GEOLOC","B"."GEOLOC",'mask=coveredby+inside')='TRUE')
    Statistics
    20431 recursive calls
    60 db block gets
    22432 consistent gets
    1156 physical reads
    0 redo size
    2998369 bytes sent via SQL*Net to client
    1158 bytes received via SQL*Net from client
    17 SQL*Net roundtrips to/from client
    452 sorts (memory)
    0 sorts (disk)
    125 rows processed
    The wards table has 7545 rows, the county table has 207.
    We are currently on release 10.2.0.3.
    All I want to do with this is generate results which fall in a particular geometry. Most of my testing has been successful; I just seem to run into issues when querying against a county-sized polygon, I guess due to the number of vertices.
    Also looking through the forums now for tuning topics...
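    For what it's worth, here is a minimal sketch of issuing the same kind of query from application code. It is written in Java/JDBC purely for illustration (the toolkit in question is .NET), and the connection details, driver availability, and the NAME column on UK_WARD are assumptions, not taken from this thread:
    // Hypothetical sketch: run the ward-in-county query over JDBC with a bind variable.
    // Assumes the Oracle JDBC driver (ojdbc) is on the classpath.
    import java.sql.*;

    public class WardsInCounty {
        public static void main(String[] args) throws SQLException {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password")) {
                String sql =
                    "select a.* from uk_ward a, uk_county b " +
                    "where upper(b.name) = ? " +
                    "and sdo_relate(a.geoloc, b.geoloc, 'mask=coveredby+inside') = 'TRUE'";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setString(1, "DORSET COUNTY"); // bind the county name instead of hard-coding it
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getString("NAME")); // hypothetical ward name column
                        }
                    }
                }
            }
        }
    }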

  • Workaround for using 1080/25p in FCE

    I shot some video on my Panasonic AGHC 151E in 1080/25p, not knowing it was not supported in FCE. What is the best workaround for converting this footage into a format I can use in FCE? I am planning on importing it into iMovie and then exporting a MOV, then importing that into FCE. Is there a better solution?

    FCE can be made to work with 1080p25 media, assuming you're not mixing media. Make a 1080i sequence and, in Item Properties, set the field dominance to None.
    Do NOT use iMovie. Any export from iMovie will degrade your media.

  • What are the best practices for using the enhancement framework?

    Hello enhancement framework experts,
    Recently, my company upgraded to SAP NW 7.1 EhP6.  This presents us with the capability to use the enhancement framework.
    A couple of senior programmers were asked to deliver a guideline for use of the framework.  They published the following statement:
    "SAP does not guarantee the validity of the enhancement points in future releases/versions. As a result, any implemented enhancement points may require significant work during upgrades. So, enhancement points should essentially be used as an alternative to core modifications, which is a rare scenario.".
    I am looking for confirmation or contradiction of the statement "SAP does not guarantee the validity of enhancement points in future releases/versions...". Is this a true statement for both implicit and explicit enhancement points?
    Is the impact of activated explicit and implicit enhancements on an SAP upgrade much greater than that of BAdIs and user exits?
    Are there any SAP-published guidelines/best practices for use of the enhancement framework?
    Thank you,
    Kimberly

    Found an article that answers this question quite well:
    [How to Get the Most From the Enhancement and Switch Framework as a Customer or Partner - Tips from the Experts|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c0f0373e-a915-2e10-6e88-d4de0c725ab3]
    Thank you Thomas Weiss!

  • Best practice for using messaging in medium to large cluster

    What is the best practice for using messaging in a medium to large cluster, in a system where all the clients need to receive all the messages and some of the messages can be really big (a few megabytes, maybe more)?
    I will be glad to hear any suggestion or to learn from others experience.
    Shimi

    publish/subscribe, right?
    lots of subscribers, big messages == lots of network traffic.
    it's a wide open question, no?
    %
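    As a rough illustration of the publish/subscribe suggestion, here is a minimal sketch using the JMS 2.0 simplified API. The JNDI names below are assumptions rather than anything from this thread, and for multi-megabyte payloads it is usually better to publish a small notification and let subscribers fetch the large object out of band:
    // Minimal JMS 2.0 pub/sub sketch; jms/ConnectionFactory and jms/ClusterUpdates
    // are hypothetical JNDI names.
    import javax.jms.*;
    import javax.naming.InitialContext;

    public class ClusterPublisher {
        public static void main(String[] args) throws Exception {
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Topic topic = (Topic) ctx.lookup("jms/ClusterUpdates");

            try (JMSContext jms = cf.createContext()) {
                // Every subscriber to the topic receives every message (pub/sub).
                // Rather than pushing a huge body to every client, send a small
                // pointer and let each client download the payload on demand.
                jms.createProducer().send(topic, "new-dataset-available: https://example.org/datasets/42");
            }
        }
    }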

  • Best Practice for using multiple models

    Hi Buddies,
    Can you tell me the best practices for using multiple models in a single WD application?
    I mean: I am using 3 RFCs in a single application for my function. Each time I am importing that RFC model under WD -> Models, and I did the model binding separately to the Component Controller. Is this the right way to implement multiple models in a single application?

    It very much depends on your design, but one RFC per model is definitely a no-no.
    Refer to this document to understand how you should use the model in the most efficient way.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/705f2b2e-e77d-2b10-de8a-95f37f4c7022?quicklink=events&overridelayout=true
    Thanks
    Prashant

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in a Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a Project Level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that project-level custom field data in Project views, I have made a task-level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows Supervisor Name in Schedule.aspx
    ============
    Question: I want that project-level custom field "Supervisor Name" in My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx but the data is not present; the column is blank.
    How can I get the data into My Work views?
    Noman Sohail

  • What is the best practice for using the Calendar control with the Dispatcher?

    It seems as if the Dispatcher is restricting access to the Query Builder (/bin/querybuilder.json) as a best practice regarding security.  However, the Calendar relies on this endpoint to build the events for the calendar.  On Author / Publish this works fine but once we place the Dispatcher in front, the Calendar no longer works.  We've noticed the same behavior on the Geometrixx site.
    What is the best practice for using the Calendar control with Dispatcher?
    Thanks in advance.
    Scott

    Not sure what exactly you are asking but Muse handles the different orientations nicely without having to do anything.
    Example: http://www.cariboowoodshop.com/wood-shop.html

  • What is the best way for using C++ in the EJB?

    What is the best way for using C++ in the EJB, i.e.,
    either 1. Socket programming, or
    2. JNI?

    To what purpose?
    To use C++ in the client you could generate IDL from your remote interfaces and run that through your vendor's IDL-to-C++ processor.
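    If the goal is the other direction, calling existing C++ from the bean, a JNI wrapper is the usual route. The sketch below uses assumed names (the "legacypricer" library and the price() signature are hypothetical), and note that many application servers discourage or restrict loading native code from EJBs:
    // Hypothetical JNI wrapper around an existing C++ routine.
    public class LegacyPricer {
        static {
            // Loads liblegacypricer.so (or legacypricer.dll) from java.library.path.
            System.loadLibrary("legacypricer");
        }

        // Implemented in C++ behind a JNI stub (generate headers with javac -h).
        public native double price(String productCode, int quantity);
    }
    A session bean would then delegate to this class, keeping the native boundary in one place.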

  • Best practices for using the knowledge directory

    Anyone know when it is best to store docs in the Knowledge Directory versus Collab? They are both searchable, but I guess you can publish from the Publisher to the KD. Anyone have any best practices for using the KD or setting up taxonomies in the KD?

  • Best practices for using the 'cost details' fields

    Hi
    Please could you advise us on the best practices for using the 'cost details' field within Pricing. Currently I cannot find a way to surface the individual Cost Details fields within the Next Generation UI, even with the tick box for 'display both cost and price' ticked. It seems that these get surfaced when the Next Generation UI is turned off, but I cannot find them when it is turned on. We can see the 'Pricing Summary' field, but this does not fulfill our needs, as some of our services have both recurring and one-off costs.
    Attached are some screenshots to further explain the situation.
    Many thanks,
    Richard Thornton

    Hi Richard,
    If you need to configure dynamic pricing that may vary by tenant and/or if you want to set up cost drivers that are service item attributes, you should configure Billing Tables in the Demand Management module in 10.0. 
    The cost detail functionality in 9.4 will likely be merged with the new pricing feature in 10.0. The current plan is not to bring cost detail into the Service Catalog module.

  • HT1229 What is the best method for using iPhoto with an external hard drive for greater capacity?

    What is the best method for using iPhoto with an external hard drive for greater capacity?

    Moving the iPhoto library is safe and simple: quit iPhoto and drag the iPhoto library, intact, as a single entity to the external drive. Then depress the Option key and launch iPhoto, using the "Select Library" option to point to the new location on the external drive. Fully test it and then trash the old library on the internal drive (test one more time prior to emptying the trash).
    And be sure that the external drive is formatted Mac OS Extended (Journaled) (iPhoto does not work with drives in other formats) and that it is always available prior to launching iPhoto.
    And back up soon and often: having your iPhoto library on an external drive is not a backup, and if you are using Time Machine you need to check and be sure that TM is backing up your external drive.
    LN

  • Best workaround for editing long AVCHD clips in CS6?

    Hi there, I've recently switched to CS6 and have run into incredibly long render times using native AVCHD clips (13 hours to encode a 30-minute m2v). After reading many posts it seems the issue I am running into is a known bug. So, my question is: is there a good workaround for this issue?
    It sounds like this issue only affects long clips. Is it possible to automatically split the .mts files using another program (VirtualDub?) prior to importing?
    If transcoding from the native format is required (until the fix), what would be the best method using Windows 7 64-bit? I have heard Avid's DNxHD is a good lossy codec, but I hate to do this if there is an accepted workflow I should learn instead.
    Thanks for any advice!
    -Stephen

    One of these message threads may have some help
    CS6 Bug AVCHD http://forums.adobe.com/thread/1004369?tstart=0
    -and http://forums.adobe.com/thread/1004369?start=0
    -and LOCK the media http://forums.adobe.com/thread/1077245

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for running Photoshop on a computer, and for conscientious computing in general. I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptible Power Supply (UPS).  Unexpected power outages are TERRIBLE for both computer software and hardware.  Lost files and burned-out hardware are a possibility.  A UPS that will power the computer and monitor can be found at the local high-tech store and doesn't cost much.  The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going.  Again, how much is it worth to you to have a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make Backups.  Back up your computer files, including your Photoshop work, ideally to external media.  Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive.  The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning, and it will be there if/when you have a failure or loss of data.  And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel

  • Best practices for using AUTOARCHIVING in Exchange 2010

    Hi guys!
    Exchange 2010 SP3 environment. We have 150 users. We bought two 2 TB disks in RAID 1, and they will be used for auto-archive DBs. We currently have about 10 production DBs named by department. We have created 10 archive DBs, also named by department, with the word ARCHIVE added at the end of the database name.
    Some of the users (let's say 50) have 20-30 GB of PST files; all the others have about 4-5 GB.
    What would be the best practice for quota restrictions? As far as I can see, by default a user is limited to 50 GB per archive mailbox. In this scenario, where 50 users have 20 GB+ of PST files and all the others less than 4 GB, what would be the best practice for setting quota limits on the newly created 10 ARCHIVE DBs to achieve an optimal solution?
    with best regards,
    bostjanc

    Hi,
    As far as I know, by default in Exchange 2010 SP1 the archive warning quota is set to 45 gigabytes (GB) and the archive quota is set to 50 GB. We can use the following command to set the quotas for every mailbox that has an archive:
    Get-Mailbox -ResultSize Unlimited | Where {$_.ArchiveDatabase -ne $null} | Set-Mailbox -ArchiveQuota 20GB -ArchiveWarningQuota 19GB
    Please note that this is not at database level but mailbox level.
    For more information, you can refer to the following articles:
    http://technet.microsoft.com/en-us/library/dd979795.aspx#AQ
    http://social.technet.microsoft.com/Forums/exchange/en-US/599b2871-6fcc-482f-845b-b59dec342097/usedatabasequotadefaults-for-archive-mailbox?forum=exchange2010
    Thanks,
    Angela Shi
    TechNet Community Support
