Migrating SpamAssassin "Learnings"

How do I teach SpamAssassin on my new Leopard Server what my Tiger Server learned? I do not see that info in the migration or the mail documentation. If I use the tool to export Mail settings, will it migrate what I have taught SpamAssassin or will it even migrate the SpamAssassin settings? If neither, I am sure I can do the settings, but I am not sure how to transfer what it learned. Any ideas?

You can use spamtrainer to back up and restore your Bayes DB. spamtrainer is available here:
http://osx.topicdesk.com/tool/
After installation, see "man spamtrainer".
The SpamAssassin settings on OS X are really the amavisd settings. Do not replace /etc/amavisd.conf with your old file, as the two are not compatible; simply carry over the settings you need.
Also, the learnjunkmail script that comes with Leopard Server is broken. You can use spamtrainer instead. More info here: http://osx.topicdesk.com/content/view/131/84/
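If spamtrainer is not an option, the Bayes files can also be carried over by hand. A minimal sketch, assuming the /var/amavis/.spamassassin location discussed in the thread below; temporary directories stand in for the two servers so the commands can be tried anywhere, and amavisd should be stopped before touching the real files:

```shell
# Stand-ins for the old and new servers' /var/amavis/.spamassassin dirs
old=$(mktemp -d)
new=$(mktemp -d)
# Dummy Bayes files; on a real server these are the actual bayes_toks/bayes_seen
printf 'tokens' > "$old/bayes_toks"
printf 'seen'   > "$old/bayes_seen"
# "Backup" on the old server: archive the two Bayes database files
tar -czf "$old/bayes-backup.tgz" -C "$old" bayes_toks bayes_seen
# Copy bayes-backup.tgz to the new server, then "restore" it in place
tar -xzf "$old/bayes-backup.tgz" -C "$new"
```

After a real restore, fix ownership (e.g. chown -R _amavisd:_amavisd, matching the file listings below) so amavisd can read the restored DB.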

Similar Messages

  • Migrating Spamassassin Bayes DB

    I am migrating a 10.5 mailserver to new hardware running 10.6.
    The old mailserver uses a global spamassassin db stored in /var/amavis/.spamassassin and the files in this dir are as follows:
    -rw------- 1 _amavisd _amavisd 42123264 Dec 8 22:42 auto-whitelist
    -rw------- 1 _amavisd _amavisd 25272 Dec 8 22:42 bayes_journal
    -rw------- 1 _amavisd _amavisd 41549824 Dec 8 22:25 bayes_seen
    -rw------- 1 _amavisd _amavisd 5115904 Dec 8 22:25 bayes_toks
    What is the best way to migrate this bayes db to the new server?
    I thought:
    sa-learn --backup > backup.txt
    might work, but this yields basically an empty output file.
    Tips?
    Rusty
    Message was edited by: Rusty Ross

    Thanks, Alex.
    I used spamtrainer.
    A couple questions on that. The first time I ran spamtrainer -r on the new machine, I got a message from spamtrainer that /var/amavis/.spamassassin did not exist, and then spamtrainer went ahead to create the following two files in /var/amavis:
    -rw------- 1 _amavisd _amavisd 41582592 Dec 9 13:21 .spamassassin_seen
    -rw------- 1 _amavisd _amavisd 4202496 Dec 9 13:21 .spamassassin_toks
So then, I created /var/amavis/.spamassassin and ran spamtrainer -r again.
    This time, spamtrainer created the following two files in /var/amavis/.spamassassin:
    -rw------- 1 _amavisd _amavisd 41582592 Dec 9 13:26 bayes_seen
    -rw------- 1 _amavisd _amavisd 4202496 Dec 9 13:26 bayes_toks
    ...which seems right to me.
    Two questions.
    (1) Can I safely delete .spamassassin_seen and .spamassassin_toks in /var/amavis now?
    (2) Should I be concerned that spamtrainer didn't seem to migrate bayes_journal and auto-whitelist?
    Thanks.

  • New Windows Azure Migration Content

    The guide, Migrating Data-Centric Applications to Windows Azure, provides detailed guidance on how to migrate data-centric applications to Windows Azure Cloud Services, as well as an introduction on how to migrate those same applications to Windows
Azure Virtual Machines. By using this guide, you will have the planning process, migration considerations, and prescriptive how-tos needed for a positive migration experience.
    This guide captures best practices gleaned from the real-world engagements of CAT and the technical expertise of the SQL Database Content team and will be updated regularly to cover new learnings and additional features.
    The full guide is at:
    Migrating Data-Centric Applications to Windows Azure.
    The section on migrating SQL Server databases and migrating data to other data management services such as table, blob, Windows Azure drive is at:
    Migrating with Windows Azure Cloud Services.
    Karthika [MSFT] This posting is provided "AS IS" with no warranties, and confers no rights.

    Hi babarzhr,
For uploading large data files to storage blobs, please try AzCopy, a command-line utility with concurrent operations to upload, download, and copy blobs. You may get the latest version from aka.ms/azcopy, find AzCopy at "<system disk>:\Program Files (x86)\Microsoft SDKs\Windows Azure\AzCopy" after installation, and use the AzCopy command-line pattern below:
       AzCopy <source folder> <destination folder> [filepatterns] [options]
Here are sample command lines for your scenarios.
#1. Upload from local disk to your Azure storage blob.
#1.1 Upload a single file, say "db1.bck":
         AzCopy D:\backup\ https://myaccount.blob.core.windows.net/mycontainer/ /DestKey:key db1.bck
#1.2 Upload multiple db files, say "db1.mdf", "db1.log", "db1.bck": use the file pattern "db*" and option /s to upload recursively:
         AzCopy D:\backup\ https://myaccount.blob.core.windows.net/mycontainer/ /DestKey:key db* /s
#1.3 Choose block or page blob with option /BlobType:Page|Block:
         AzCopy D:\backup\ https://myaccount.blob.core.windows.net/mycontainer/ /DestKey:key db1.bck /BlobType:Block
#2. Download from storage to your VM's local disk.
#2.1 Download a single file, say "db1.bck":
         AzCopy https://myaccount.blob.core.windows.net/mycontainer/ D:\backup\ /SourceKey:key db1.bck
#2.2 Download multiple db files, say "db1.mdf", "db1.log", "db1.bck": use the file pattern "db" and option /s to download recursively. Attention: when the source is a storage blob, the file pattern is treated as a prefix.
         AzCopy https://myaccount.blob.core.windows.net/mycontainer/ D:\backup\ /SourceKey:key db /s
#3. For uploading/downloading large data files, I suggest you enable resume mode directly with option /Z:<journal file>:
         AzCopy D:\backup\ https://myaccount.blob.core.windows.net/mycontainer/ /DestKey:key db1.bck /Z:"d:\test\restart.log"
If your operation is terminated in the middle of a download/upload, rerun the same command line to resume directly.
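The prefix rule called out in #2.2 is easy to trip over, so it is worth checking which blobs a pattern would select before a large download. AzCopy itself is not invoked in this sketch; local files and plain shell globbing stand in for the blob listing:

```shell
# Local stand-in for a blob container listing
dir=$(mktemp -d)
touch "$dir/db1.mdf" "$dir/db1.log" "$dir/db1.bck" "$dir/mydb.bak"
# Prefix "db" selects names that START with "db" (the three db1.* files),
# not mydb.bak, which a substring match would also have caught.
matched=$(cd "$dir" && ls db* | wc -l)
echo "$matched"
```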

  • Validation rules applied to data migration templates at import

    Hi everyone!
    First post here for me, so please bear with me if I missed something.
    My company has just started the initial implementation of ByDesign. We come from a set of disparate and partially home-grown systems that we outgrew a few years ago.
    As part of this initial phase, we are basically re-creating the data on customers, suppliers, etc. since none of our existing systems makes a good source, unfortunately. We will be using the XML templates provided by ByDesign itself to import the relevant data.
    It has become clear that ByDesign applies validation rules on fields like postal codes (zip codes), states (for some countries), and other fields.
It would be really helpful if we could get access to the rules that are applied at import time, so that we can format the data correctly in advance, rather than having to play "trial and error" at import time. For example, if you import address data, the first time it finds a postal code in the Netherlands formatted as "1234AB", it will tell you that there needs to be a space in the 5th position, because it expects the format to be "1234 AB". At that point, you stop the import, go back to the template to fix all the Dutch postal codes, and try the import again, only to run into the next validation issue.
    We work with a couple of very experienced German consultants to help us implement ByDesign, and I have put this question to them, but they are unaware of a documented set of validation rules for ByDesign. Which is why I ask the question here.
So just to be very clear on what we are looking for: the data validation/formatting rules that ByDesign enforces at the time the XML data migration templates are imported.
    Any help would be appreciated!
    Best regards,
    Eelco
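The Dutch postal-code rule described above ("1234AB" must become "1234 AB") is simple enough to apply to the template data in advance. A minimal sed sketch; only this one spacing rule is implemented, not ByDesign's full validation, and fix_nl_postcode is an illustrative name, not a ByDesign tool:

```shell
# Insert the space ByDesign expects into Dutch postal codes of the form 9999XX
fix_nl_postcode() {
  printf '%s\n' "$1" | sed -E 's/^([0-9]{4})([A-Za-z]{2})$/\1 \2/'
}
fix_nl_postcode "1234AB"    # becomes "1234 AB"
fix_nl_postcode "1234 AB"   # already formatted, left unchanged
```

Run the same substitution over the postal-code column of the XML template source data before each import attempt.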

    Hello Eelco,
    welcome to the SAP ByDesign Community Network!
    The checks performed on postal codes are country specific, and represent pretty much the information that you would find in places like e.g. the "Postal Codes" page in Wikipedia.
I recommend starting with small files of 50-100 records, assembled from a representative set of different records, in order to efficiently collect the validation rules that need reactions based on your data. Only once you have caught these generic data issues would I proceed to larger files.
Personally, I prefer to capture such generic work items on my list, then fix the small sample file immediately by editing, and do an immediate re-simulation of the entire file, so that I can drill deeper and collect more generic issues from my data sample. Only after a while, when I have harvested all the learnings in my sample file, would I apply the collected learnings to my actual data and create a new file - still not too large, in order to use my time efficiently.
    Best regards
    Michael  

  • SpamAssassin questions (sa-learn, show score in subject, config file)

I'm currently trying to adjust SpamAssassin, and although I found lots of articles in this forum about it (some of them quite useful), I still have some open questions I'd appreciate assistance with.
    (Manually training SpamAssassin)
    I collected some spam on my local computer's inbox. Uploaded the /Messages directory to the server. Connected to the server via terminal, as root. Executed "sa-learn --showdots --spam /Messages". Fine.
Now I read that I should not use the root user but the user "clamav". Can anyone briefly explain why I need the virus filter's user to train the spam filter? How does this make sense?
If I have to do so, can I simply type "su clamav sa-learn --spam /Messages" while logged in as "root" to execute sa-learn as user "clamav"?
In root's home directory I found a ".spamassassin" folder containing the Bayes learnings. Can I safely delete this folder? I hate having non-required data wasting disk space...
    (Show spam scoring in subject line)
At Server Admin tools / Mail service, I modified the line "attach subject tag". My goal is to display the "spam hits" a message achieved, to allow users to do custom rating in their local mail client (e.g. having rules like "if message has 10 points then it is spam, otherwise not", etc.).
At spamassassin.org I verified (and used it this way on my old server) that the tag is "SCORES" (without quotes).
For example, in the config file it would state "rewrite_header Subject *Spam(SCORE):"
But the server admin tool seems not to accept this. It shows "SCORE" without replacing it. I also tried HITS.
Any ideas?
    (Manual modifications at the config file)
    As indicated above I want to fine-tune SpamAssassin as done on my old server and therefore would need to know where the config file is located.
    At the forum I read that it is /etc/amavisd.conf and that SpamAssassin's config files are ignored widely. But... in amavisd.conf I found nothing about spam.
    So... which file do I need to edit ?
    Thanks to all helping me

    Sorry, this was a double-post. @admin: Please remove it.

  • GTS Migration.

    Hello GTS experts,
We are planning to migrate GTS 7.0 from one server to another, moving from SUSE Linux to Red Hat Linux.
    We are wondering how the GTS 7.0 can be migrated.
    1. Is it enough to do export/import of the database?
    2. Do we need to take export of any configuration or file system level data?
    Thanks in advance.
    Regards,
    Hari
    Edited by: NM Hari Prasad Pasalapudi on Sep 22, 2010 6:02 PM
    Edited by: NM Hari Prasad Pasalapudi on Sep 24, 2010 3:33 PM

I think if you touch base with the Basis folks, it would be helpful. A few things that can be tested after migration:
1. Table space
2. Run time of transactions (ideally 10-30 min) based on the volume of data.
3. There may be a lot of runtime errors.
4. If TREX is available, test it. Check BD64 TCP/IP for its config and other details.
5. RFC connections
6. A load test would also be helpful.
7. Fetching programs
8. Jobs
9. Customized development tests involving Z tables
10. Promote to production
I think having a backup would be a risk-mitigation step and it can be justified as well, but the decision would be based on the volume of the data.
All the best, and let us know how it goes and your key learnings.

  • Asynchronous process learnings: Why are callbacks skipped?

    Just wanted to log my learnings about asynchronous processes and callbacks.
    I was trying to write a .NET client to invoke an asynchronous process and was not getting anywhere. The process was being invoked successfully but the callback wasn't happening. The visual flow in the console kept saying "skipped callback x on partner y". It appears this happens when the process does not recognize that the incoming invocation is conversational - in other words it does not find the WS-Addressing headers or if present does not understand it. You would notice the same behaviour when you invoke an asynchronous process from the console. This is expected behaviour because the console does not expose a callback.
    So, if you are trying to invoke an asynchronous process from another platform and see the "skipped callback" message, the first place to check is your WS-Addressing headers. Is it present and specified properly? Is the WS-Addressing namespace correct? On the BPEL 2.0.11 version I was testing, newer WS-Addressing versions would not work. I had to switch back to the older "http://schemas.xmlsoap.org/ws/2003/03/addressing" version. Another strange problem I noticed was that it was expecting some bpel attributes such as rootID, parentID in the MessageID header. Without these it would throw a server fault.
    It also helps to write a sample client process in BPEL to invoke the asynchronous process and log the request and callback messages (there are technotes on OTN that show how to do this). This can be compared with the messages produced by your client to debug any problems.
    Hope this information is useful for interested people. I'm yet to test the behaviour on the 10g version. Will keep this thread updated when I do that.

    The migration utility only processes transforms that are created by the user.  If a transform was part of the Data Quality installation, it will not be migrated because the equivalent transform will be installed as part of the Data Services installation. 
    Please read the manual "DS Migration Considerations > Data Quality to Data Services Migration > How transforms migrate" for more details.
    Regards,
    George
    Edited by: George Ruan on Nov 22, 2011 9:04 PM

  • Migrate SP2010 site collection to SP2013 site collection with a new URL - How ?

    How to migrate Classic SP2010 site collection (http://OldWebApp.SP2010.com/sites/OldSiteCollection) to Claims SP2013 site collection: (http://NewWebApp.SP2013.com/projects/NewSiteCollection) ?
No custom solutions involved, but the URLs must be different.
    How to accomplish that ?

    Following are the steps:
    Take the backup of the content database in SP 2010.
    Create a classic based web application in SP 2013 using PowerShell.
    New-SPWebApplication -Name "TestApplication" -ApplicationPool "TestApplicationAppPool" -AuthenticationMethod "NTLM" -ApplicationPoolAc
    Create a content database in the web application created above.
    Mount the database 
    Mount-SPContentDatabase WSS_Content_100 -DatabaseServer SQL2012Demo -WebApplication http://sp2013demo:100
    Now convert the web application to use claims authentication:
Convert-SPWebApplication -Identity "http://sp2013demo:100" -To Claims -RetainPermissions -Force
    Finally, migrate the users to use claims identity:
    $w = Get-SPWebApplication "http://sp2013demo:100"
    $w.MigrateUsers($True)
    See these links for more information:
    http://www.sharepointnadeem.com/2013/12/sharepoint-2013-upgrade-from-sharepoint.html
    http://www.sharepointnadeem.com/2014/01/upgrade-from-sharepoint-2010-classic.html
    Blog | SharePoint Learnings CodePlex Tools |
    Export Version History To Excel |
    Autocomplete Lookup Field

  • How-To: Migrate Classic Path-based SP2010 Site Collection with OOTB/SPD Workflows = To: Claims Different Path-based SP2013 Site Collection ?

    Hello,
    I want to move an entire site collection from an old SharePoint 2010 farm to a new SharePoint 2013 farm. But there are a few requirements:
    - The old SP2010 site collection contains multiple SPD Workflows and OOTB Workflows.
- URLs must be changed, as in the example: 
    Source Site Collection URL: http://OldWebApp.SP2010.com/sites/OldSiteCollection 
    Destination Site Collection URL shall be: http://NewWebApp.SP2013.com/projects/NewSiteCollection
    - There are no custom solutions nor web parts.
How to accomplish this successfully?
    My steps would be:
    1) Copying the content database from SP2010 SQL to the SP2013 SQL via: Copy-only backup feature, or the traditional Backup & Restore.
2) I already have the Web Application ready on the new SP2013 farm, so I will just need to create the destination site collection in the SP2013 environment with the new desired URL: (http://NewWebApp.SP2013.com/projects/NewSiteCollection).
    3) Remove the newly created Content Database of the new site collection in the destination environment.
    4) Test-SPContentDatabase -name Old_Content_DB -webapplication http://NewWebApp.SP2013.com > C:\Issues.txt
    My worry is that the SP2010 Web Application is Classic-based and I cannot change the entire Web Application to Claims-based as it hosts other running site collections !
    Equally, My other worry is that the SP2013 Web Application is Claims-based and I cannot change the entire Web Application to Classic-based as it hosts other running site collections !
    First-Question=> I cannot run Convert-SPWebApplication as I don't want to impact the entire source nor destination web applications - How to resolve this?
    5) Mount-SPContentDatabase "Old_Content_DB" -DatabaseServer "DBServer" -WebApplication http://NewWebApp.SP2013.com
Second-Question=> As I mentioned above, my requirement is to migrate this old SP2010 site collection to a totally new URL in SP 2013. My concern is that after completing step 5, the site collection will overwrite the root of the new web application and get placed at (http://NewWebApp.SP2013.com)!  I want it to be (http://NewWebApp.SP2013.com/projects/NewSiteCollection). How to do this trick?
Third-Question=> Will all the SPD Workflows & the OOTB Workflows work correctly after the entire site collection has been moved from SP2010 to SP2013? I am asking this because I have seen this:
(http://blogs.msdn.com/b/chandru/archive/2013/04/14/upgrading-spd-2010-workflows-to-spd-2013-workflows.aspx), and I got scared. What do you think?
    Could you please help me out in my three questions ?

    Following are the steps:
    Take the backup of the content database in SP 2010.
    Create a classic based web application in SP 2013 using PowerShell.
    New-SPWebApplication -Name "TestApplication" -ApplicationPool "TestApplicationAppPool" -AuthenticationMethod "NTLM" -ApplicationPoolAc
    Create a content database in the web application created above.
    Mount the database 
    Mount-SPContentDatabase WSS_Content_100 -DatabaseServer SQL2012Demo -WebApplication http://sp2013demo:100
    Now convert the web application to use claims authentication:
Convert-SPWebApplication -Identity "http://sp2013demo:100" -To Claims -RetainPermissions -Force
    Finally, migrate the users to use claims identity:
    $w = Get-SPWebApplication "http://sp2013demo:100"
    $w.MigrateUsers($True)
    See these links for more information:
    http://www.sharepointnadeem.com/2013/12/sharepoint-2013-upgrade-from-sharepoint.html
    http://www.sharepointnadeem.com/2014/01/upgrade-from-sharepoint-2010-classic.html
    Blog | SharePoint Learnings CodePlex Tools |
    Export Version History To Excel |
    Autocomplete Lookup Field

  • How can I Migrate/Split itunes/icloud account for a large family now that they have family sharing

Ok, now that Apple has finally come out with Family Sharing, I need to get things sorted out and I think it is going to be a mess. I have 6 family members (my spouse and 4 children) all sharing/using one iTunes account. For parental reasons this made perfect sense. Now with the advent of Family Sharing and such, we have a lot of sorting and migrating to do, and I am very unsure how to go about it. Here is what we have.
    Myself - I have a 64GB iPad 3 and a 64GB iPhone 5
    Spouse - 64GB iPhone 5
    Children - x4 16GB iPhone 5C (pink, blue, green, and a yellow)
We have had one iTunes account now for several years; we used to have x4 16GB iPhone 4s and a 16GB iPod touch 4, so we needed it to be shared at the time. We all have different Game Center IDs, and FaceTime/iMessage are all set up with the phone numbers (aside from mine, which also uses the iTunes e-mail). We currently have 1 upgraded iCloud account with 200GB of storage on it (love the new tiers). So here are the questions.
    1) How do I go about using and isolating all the devices so they can all use the iCloud storage? Right now if I try and setup backups by device under the what to backup there is no simple "Full Backup" option, they have all the categories separated. I cannot backup my daughters contacts without forcing a merging of them and so on. Same thing with Documents and Data. Would I need to setup a different iCloud account for each device now?
    2) iTunes Music/App purchases. Now that they have a family sharing function I am assuming it would be a good time to separate all the devices (aside from my personal iPad/iPhone). If I understand correctly we need to setup a new itunes account for each and link them with family sharing, does this then allow us to disperse/move items from a singular to the multiple account (at least a one time option to move around) or would it simply be shared off the original account? My daughter for example will be 18 next summer and as such no longer under our care. How do we move her items to her account? She has a lot of purchased Music. Now keep in mind I am talking about an account for someone who was 12 when she had her first device and as such needed to be under her parents.
    I just want to state that repurchasing things is NOT an option, and to be frank,  according to Canadian laws pertaining to *licensing* we actually own the product if we paid for it and have the right to transfer the ownership thereof. This issue went through the courts regarding used copies of OEM editions of Microsoft Windows several years ago and they closed the legal *license* vs ownership separation.

Go to appleid.apple.com to manage the ID: sign in and select the Name, ID and Email Addresses section, then change the birth date at the bottom right. At the present time you may not be able to enter the correct birth date, but others have had success changing the year to 2000, or the date to 1/1/2001, both of which will have child ID status.

  • AR Open/Closed Invoices Migration Help

    Hello Experts,
    We have a data migration requirement for AR invoices from a legacy system to Oracle.
    We have both open and closed invoices. Can anyone of you help in giving the steps to be followed? This is for India AR Invoices. Expecting a quick response and it would be of great help to us! This is in 11.5.10
    Thanks,
    Janani Sekar
    Edited by: user11981778 on 20-Dec-2012 20:32

    Hi,
Please populate the following open interfaces from your legacy data and then run the 'Autoinvoice Master Program':
(1) RA_INTERFACE_LINES_ALL
(2) RA_INTERFACE_DISTRIBUTIONS_ALL
For open invoices, AR_PAYMENT_SCHEDULES_ALL.STATUS should be equal to 'OP'.
Please also visit the following link for more info.
    http://bhaskarreddyapps.blogspot.in/2011/10/ar-invoice-interface.html
    HTH
    Sanjay

  • Migrate all Open Sales Orders From Legacy System (SAP) To SAP System using

    Hi Experts,
I have to migrate all open sales orders from the legacy system (SAP) to the SAP system using Business Objects, with a new SALES ORDER DOCUMENT NUMBER referencing the older one.
I'll get all the required data, with fields, in an Excel file.
Does any standard transaction exist for it? Or how should I go ahead with it?
    Thanks and regards,
    Jyoti Shankar

    Hi
    If you are checking for CREATE option then Sales Doc Type
For more info, go to transaction SWO1 -> BUS2032 -> Display -> Execute -> there SELECT the method you want to perform... There you can find the MANDATORY parameters also....
Or in DISPLAY mode, PLACE the cursor on the required method and click the PARAMETERS button on the toolbar...
That will show the MANDATORY parameters...
    Reward if helpful....
    Message was edited by:
            Enter the Dragon

  • Migrating open POs and GR/IR clearing account balance

    Hello gurus,
    I've got the following problem concerning migration of open POs:
    For example: In my source system there is a PO for 10 pcs. of some material. There has been a goods receipt for 5 pcs. (200$ each).
    Now, when I import this order and the corresponding purchase order history into my target system (using LSMW), the target system creates the order, a material document for 5 pcs. and an account document.
    But of course, our FI-team also has to migrate the balance of the 'old' GR/IR clearing account.
    So, the balance in the source system is <> 0 (e.g. 1000$), because the 5 pcs. have been delivered but there has been no invoice receipt yet.
    This balance is imported into the new system and then the open orders are migrated, generating an account document and thus, the balance is 2000$ afterwards.
    This obviously is not correct, so I am sure that I am missing something, just what?
    Thanks
    Alicia

    Hi,
    1. There will be an open PO uploaded for 5 qty and price 200$ each
    2. Opening Balance of Material - 5 qty and corresponding value to Stock A/c - 1000$
3. There will also be vendor balances uploaded in the system as 1000$ against the invoice. (If the invoice is still expected, then check with the FI users whether they can get the invoice from the vendor.)
If not, then do not upload the initial stock entry of step 2; create a PO of 10 qty and then do GR in the system (SAP).

  • [Migrating from 6i to 11G] HTML viewer

    Hi,
    There used to be a program to display the output of a report in 6i. When the report was displayed, it used page breaks (ESC characters) to allow the user to view the next page and print a range of pages from a report.
In 11G, these control characters are not interpreted by Internet Explorer:

    height 85
    width 94
    before report esc "&l%%0O" esc "&l8D" esc "(s12H"
    after report esc "E"
    after page control(L)

    So how could I migrate the printer definition files in order to get after page control in 11G?
    Many thanks for your help.

As my old college professor used to say, "The only stupid question is the one that doesn't get asked!" Forms 11g only supports web deployment. This means you will need the OAS with the Forms and Reports Services installed as well. Typically, the setup is three-tiered (1 server = Database, 1 server = Application Server, 1 server = Infrastructure server). You don't have to set up the technology stack multi-tiered; all three could reside on the same machine, but that is not recommended. I would at least have the database on a server separate from OAS.
    Here are a few links to get you started.
    Oracle Documentation
    * Check out: Development Tools - Oracle Forms, Middleware - Applications Server - Oracle Fusion Middleware 11g and Middleware - Data Warehousing and Business Intelligence - Oracle Reports
    Since you will be upgrading from Forms 6i to 11g, be sure to visit the Oracle Forms-Upgrading Forms 6i to Forms 11g documentation.
    BTW, welcome to the forums!
    Craig
    If a response is helpful or correct, please mark it accordingly.
    Edited by: CraigB on May 6, 2010 2:09 PM

  • How can I migrate everything from one account to another on same computer?

    How can I migrate everything from one account to another on same computer?

    Transferring files from one User Account to another
