Is it safe to clear c:\windows\temp folder on Exchange 2010?

Good day.
I'm running out of space on drive C: of my Windows 2008 R2 SP1 server running Exchange 2010. Could you tell me whether it is safe to run the Disk Cleanup wizard on drive C: to free some space while Exchange is working? I have some doubt whether Exchange will be all right afterwards.

I know about circular logging, but I'm not sure whether I can turn it on without dismounting and remounting the database.
Actually, I have a number of questions that I couldn't find answers to, even after experimenting in my test Exchange virtual environment:
1) What happens if I delete logs that have not yet been committed to the database? How does the database behave? Does it dismount right after the deletion, or will it be fine until I decide to dismount and then mount it again?
2) In a previous attempt to fix the problem I dismounted the database (I checked that its state was "Clean Shutdown", so, as far as I can see, it doesn't need the old logs to work properly), but I couldn't delete all the log files: Windows told me they were in use. I rebooted the server with "Do not mount this database at startup" checked, but that didn't help. What could the problem have been?
3) In another previous attempt I deleted some already-replayed logs, then dismounted the database (state "Clean Shutdown"), and when I tried to mount it again I got an error until I cleared all the logs in the folder. After that it created new log files and mounted fine. What could that problem have been?
4) How does backup software understand that some logs were deleted manually, and why doesn't it delete them itself once it already has a complete backup of the database file? Why does the system still need these logs?
5) And the final question: can it happen that a database which was dismounted in the proper way refuses to mount again with some error message?
Sorry for the somewhat basic questions. If you can help me with any of these I will be very, very glad!
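On the circular-logging point, here is a rough Exchange Management Shell sketch (the database name "DB1" is a placeholder; on a standalone, non-DAG database the setting only takes effect after a dismount/mount cycle, so plan for a brief outage):

```powershell
# Enable circular logging so committed logs are discarded automatically.
Set-MailboxDatabase -Identity "DB1" -CircularLoggingEnabled $true

# On a non-replicated database the change applies at mount time,
# so cycle the database once:
Dismount-Database -Identity "DB1" -Confirm:$false
Mount-Database -Identity "DB1"

# Verify the setting and the mount state afterwards.
Get-MailboxDatabase -Identity "DB1" -Status |
    Format-List Name, CircularLoggingEnabled, Mounted
```

The supported way to truncate logs without circular logging remains a successful VSS full backup; deleting log files by hand risks exactly the mount errors described in question 3.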

Similar Messages

  • Windows\Temp folder filling fast - how to Dispose the Crystal Objects?

    Post Author: pontupo
    CA Forum: .NET
    So the Windows\Temp folder is fast filling the disk in my
    deployment. Each time a report is opened, a number of files are created
    here. The problem is, of course, probably that I'm not releasing my
    report objects in my code, as the reports can't even be manually
    deleted without shutting down IIS. Well, fair enough. What I can't
    figure out is where to release my reports objects. I have two pages:
    one performs a pre-pass of the report object and generates a dynamic
    page to prompt the user for parameters. This page, I believe, has no
    problems because the report.Close() command is in line with the code
    and is the final statement, but I could be mistaken and this page may also be leaving memory leaks. The second page, however, has the
    CrystalReportsViewer object and actually displays the report to the
    user after setting up the parameters correctly. On this page, I can't
    figure out how/when to call report.Close(). If I do it at the
    page.Dispose event, there seems to be no effect. If I do it at the
    page.Close event, the report will load fine, but if you try to go to
    another page in the report (in the case of multi-page reports) or
    refresh the data, the report object has already been unloaded and the
    CrystalReportsViewer won't be able to find the report document. If I
    wrap my code in one big Try-Catch-Finally (rather than just having a
    Try-Catch around the file load itself) as I've seen suggested elsewhere and place a report.Close()
    command in the Finally, the Finally is executed before the viewer even
    loads the report so I get a file not found error. So where
    can I unload the report object? What I want is to persist the report
    via Sessions (which I do) so that as the user moves between pages of
    the report/refreshes the report will remain loaded, but when the user
    closes the page or browses to another page, perhaps, I want to close
    the report and free the resources so that the temp files are deleted.
    Following are some code samples:

    Protected Sub Page_Init(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Init
        sessionString = Request.QueryString("report")
        report = Server.MapPath("reports/") & Request.QueryString("report") & ".rpt"
        ConfigureCrystalReport()
    End Sub

    Private Sub ConfigureCrystalReport()
        If (Session(sessionString) Is Nothing) Then
            reportDoc = New ReportDocument()
            'load the report document
            If (IsReportValid()) Then
                reportDoc.Load(report)
                '******************************
                'bunch of other code, authentication
                'parameter handling, etc. here
                '******************************
                Session(sessionString) = reportDoc
            Else
                Response.Redirect("error.aspx")
            End If
        Else
            reportDoc = CType(Session(sessionString), ReportDocument)
        End If
        CrystalReportsViewer.ReportSource = reportDoc
    End Sub

    Private Function IsReportValid() As Boolean
        Dim reportIsValid As Boolean = False
        Try
            If (System.IO.File.Exists(report)) Then 'does the file exist?
                'if it does, try to load it to confirm it's a valid crystal report
                Dim tryReportLoad As New CrystalDecisions.CrystalReports.Engine.ReportDocument()
                tryReportLoad.Load(report)
                tryReportLoad.Close()
                tryReportLoad.Dispose()
                reportIsValid = True
            End If
        Catch ex As Exception
            reportIsValid = False
        End Try
        Return reportIsValid
    End Function

    Currently, I've also tried each of the following:

    Protected Sub Page_Unload(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Unload
        CloseReports(reportDoc)
        CrystalReportsViewer.Dispose()
        CrystalReportsViewer = Nothing
    End Sub

    Private Sub CloseReports(ByVal report As ReportDocument)
        Dim sections As Sections = report.ReportDefinition.Sections
        For Each section As Section In sections
            Dim reportObjects As ReportObjects = section.ReportObjects
            For Each reportObject As ReportObject In reportObjects
                If (reportObject.Kind = ReportObjectKind.SubreportObject) Then
                    Dim subreportObject As SubreportObject = CType(reportObject, SubreportObject)
                    Dim subReportDoc As ReportDocument = subreportObject.OpenSubreport(subreportObject.SubreportName)
                    subReportDoc.Close()
                End If
            Next
        Next
        report.Close()
    End Sub

    This was the solution suggested on another forum. I've also tried:

    Protected Sub Page_Disposed(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Disposed
        reportDoc.Close()
        reportDoc.Dispose()
        CType(Session(sessionString), ReportDocument).Close()
        Session(sessionString) = Nothing
    End Sub

    I've also tried wrapping everything inside of the If statement in the ConfigureCrystalReport() method in code to this effect:

    If (IsReportValid()) Then
        Try
            reportDoc.Load(report)
        Catch e As Exception
            Response.Redirect("error.aspx")
        Finally
            reportDoc.Close()
        End Try
    End If

    Any advice on this is appreciated. Thanks in advance, Pont

    Post Author: sarasew13
    CA Forum: .NET
    Why are you checking whether the report is valid before closing? As long as the report object isn't null you should be able to close it (whether it's open or not). I ran into this same problem when trying to store the report, so now I just store the dataset. Everything seems to work fine and navigate appropriately, so here's more or less how I handle it:
    DataSet myDS;
    ReportDocument myRPT;

    protected void Page_Load(object sender, EventArgs e)
    {
        try
        {
            if (!IsPostBack)
            {
                //pull variables from previous page if available
                //set variables into view state so they'll persist in post backs
            }
            else
            {
                //if postback then pull from view state
            }

            createReport();
        }
        catch (Exception err)
        {
            //handle error
        }
    }

    private void createReport()
    {
        myDS = new DataSet();
        string rpt;

        rpt = "RPTS/report.rpt";
        try
        {
            if (!IsPostBack || Session["data"] == null)
            {
                myDS = getData(); //make data call here
                Session["data"] = myDS;
            }
            else
            {
                myDS = (DataSet)Session["data"];
            }

            if (myDS.Tables.Count > 0) //make sure the dataset isn't empty
            {
                myRPT = new ReportDocument();
                myRPT.Load(Server.MapPath(rpt));
                myRPT.SetDataSource(myDS.Tables[0]);

                if (!IsPostBack)
                {
                    //code to set parameters for report here
                }

                MyViewer.ReportSource = myRPT;
            }
        }
        catch (Exception error)
        {
            //handle error
        }
    }

    protected void Page_Unload(object Sender, EventArgs e)
    {
        try
        {
            if (myRPT != null)
            {
                myRPT.Close();
            }
        }
        catch (Exception error)
        {
            //handle error
        }
    }

  • A question about clearing SM_OBS_DLL in temp folder

    Hi,
    Our users are experiencing issues like "The requested action is not supported for this object [message 131-171]". According to search results, removing the SM_OBS_DLL folder from the temp folder can fix this issue. But the problem is that our users process hundreds of orders every day, and these issues impact their efficiency a lot. Is there any permanent solution for this?
    P.S. our SAP B1 version is 8.8 PL17
    Thanks,
    Lan

    Hi Lan and Marc,
    Unfortunately the DI API has never been particularly stable when processing larger amounts of data over extended periods. The error message you receive may not actually be related to the problem. Some patches are better than others. Are you able to do any testing on a different patch?
    Is there any monitoring you can do (eg track the memory usage of your add-on over time) on a workstation?
    Also, if you are not already doing so, I recommend you garbage collect any DI API objects as soon as you've finished with them:
    public static void DisposeObjects(Object obj)
    {
        System.Runtime.InteropServices.Marshal.ReleaseComObject(obj);
        obj = null;
        GC.Collect();
    }
    Kind Regards,
    Owen
    P.S. Incidentally the way that the DI API runtime files are copied to the workstation is changing in 8.82 so, if you can't find a solution in your current version, you could look at 8.82 to see if that makes an improvement.

  • Event IDs 136 and 137 0x80000000000000 in System Log on Windows 2008 R2 Server, Exchange 2010 in Cluster

    Hi,
    I'm having an issue with one of my Exchange 2010 servers. We had a power outage, and upon recovery I cannot start the Net.Pipe Listener Adapter and Net.Tcp Listener Adapter services (and thus cannot start IIS and provide Exchange client services). This is a physical server (not VMware or Hyper-V).
    The System event log has lots of Event 136s and 137s from Ntfs with the keyword 0x80000000000000. The general messages are: The default transaction resource manager on volume C: encountered an error while starting and its metadata was reset.  The data contains the error code.
    and
    The default transaction resource manager on volume OS encountered a non-retryable error and could not start.  The data contains the error code.
    XML Output as follows:
    - <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
    - <System>
      <Provider Name="Ntfs" />
      <EventID Qualifiers="32772">136</EventID>
      <Level>3</Level>
      <Task>2</Task>
      <Keywords>0x80000000000000</Keywords>
      <TimeCreated SystemTime="2014-11-17T18:10:37.788942300Z" />
      <EventRecordID>315532</EventRecordID>
      <Channel>System</Channel>
      <Computer>server.domain.com</Computer>
      <Security />
      </System>
    - <EventData>
      <Data />
      <Data>C:</Data>
      <Binary>1C00040002003000020000008800048000000000060019C000000000000000000000000000000000060019C0</Binary>
      </EventData>
     </Event>
    - <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
    - <System>
      <Provider Name="Ntfs" />
      <EventID Qualifiers="49156">137</EventID>
      <Level>2</Level>
      <Task>2</Task>
      <Keywords>0x80000000000000</Keywords>
      <TimeCreated SystemTime="2014-11-17T18:10:37.788942300Z" />
      <EventRecordID>315531</EventRecordID>
      <Channel>System</Channel>
      <Computer>server.domain.com</Computer>
      <Security />
      </System>
    - <EventData>
      <Data />
      <Data>OS</Data>
      <Binary>1C0004000200300002000000890004C000000000020100C000000000000000000000000000000000020100C0</Binary>
      </EventData>
      </Event>
    When I attempt to start the services - I get the following errors:
    The Net.Pipe Listener Adapter service depends on the Windows Process Activation Service service which failed to start because of the following error: 
    Transaction support within the specified resource manager is not started or was shut down due to an error.
    The Net.Pipe Listener Adapter service depends on the Windows Process Activation Service service which failed to start because of the following error: 
    Transaction support within the specified resource manager is not started or was shut down due to an error.
    I have tried the "fsutil resource setautoreset true" fix without success.
    Any ideas or direction would be much appreciated. Restoring this server will be extremely difficult.
    Thanks!

    We can close this question.
    From an elevated prompt, I ran 'fsutil resource setautoreset true' and attempted to remove the files with .blf and .regtrans-ms extensions from C:\Windows\System32\config\TxR, but these files were locked by system processes. (They are also tagged with the hidden file attribute, so you may not see them at first.)
    So, I booted the system with a Windows 2008 R2 Install Disk, selected repair OS and selected the command prompt. I then performed a chkdsk /f c: and selected "Y" to unmount the drive. It made some repairs.
    With the system booted from the install disk, and chkdsk executed, the locks were freed and I was able to delete the files from C:\Windows\System32\config\TxR.
    Once the system rebooted, the services came back fine and everything was back to normal.
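For anyone hitting the same TxR corruption, the sequence above looks roughly like this from the recovery environment's command prompt (drive letters can differ when booted from install media, so verify them first; this is a sketch of the steps described, not a verified script):

```powershell
# 1. Check and repair the volume; this also releases the locks on the TxR logs.
chkdsk C: /f

# 2. Reset the transaction resource manager metadata for the volume.
fsutil resource setautoreset true C:\

# 3. Delete the corrupt transaction logs (hidden files, hence -Force).
Remove-Item "C:\Windows\System32\config\TxR\*.blf" -Force
Remove-Item "C:\Windows\System32\config\TxR\*.regtrans-ms" -Force
```

The WinRE recovery prompt is cmd.exe rather than PowerShell; there, use `del /a /q` in place of `Remove-Item -Force` (the chkdsk and fsutil lines are the same in both shells).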

  • 20 Gigs of DAT files in my C:\windows\temp folder

    What are these files? How are they being generated? Are they needed? What can I do to prevent this?
    They all look like __CRX7373799929299388.dat
    I have these on my Staging server and on local dev installs of CQ5
    thanks!

    This happens when you upload a package: the temp file is not cleaned up afterwards. The size of the file will equal the size of the package you are uploading. As a temporary workaround, once the package is uploaded and installed you can delete those temp files, or run a cron job to clean them up periodically. Service Pack 2, once released, may include a fix for this.

  • Free/busy information unavailable in Outlook 2013 on Windows 8.1 on Exchange 2010

    Hi,
    This has been discussed a lot in forums, but unfortunately, having followed all the guides I have found, I can't resolve it. I am the only user with the issue.
    I have primarily followed https://social.technet.microsoft.com/Forums/office/en-US/ee9e1b89-bea1-48aa-9008-cae205c5e816/forum-faq-why-cant-i-retrieve-freebusy-information?forum=outlook 
    including recreating my profile, installing http://support.microsoft.com/kb/2850061/en-us via
    Office 2013 SP1, verifying the function works through OWA, etc.
    I have the full version of Office 2013 Pro not the Click to Run version.  It seems to be the Outlook client itself not the Exchange server.  I am at a loss what to do next... please help! :)

    Hi,
    Please also enable Automatic Replies in Outlook, check if this feature works well.
    Since the issue only occurs to one client and it works on OWA, I suggest you try these steps below to troubleshoot the issue:
    1. Start Outlook in Safe Mode. This is always a helpful way to troubleshoot Outlook client issues: in Safe Mode, Outlook loads without add-ins, so we can check whether the issue persists when add-ins are disabled:
    Press Win + R, type "outlook.exe /safe" in the blank box, press Enter.
    If there's no issue in Safe Mode and you can find free/busy information again, disable the suspicious add-ins one by one to verify which one caused the problem.
    2. Perform a clean boot. This helps determine whether the issue is related to background programs, since any third-party programs will be disabled. If the issue doesn't occur in a clean boot, we may suspect it's related to a third-party program. The detailed steps can be found in the link below:
    http://support.microsoft.com/kb/929135
    Regards,
    Melon Chen
    TechNet Community Support
    It's recommended to download and install the Office Configuration Analyzer Tool (OffCAT), which is developed by Microsoft Support teams. Once the tool is installed, you can run it at any time to scan for hundreds of known issues in Office programs.

  • Windows 2008 R2 SP1 - Exchange 2010 SP2, RU7, 3 node DAG - Possible Port Exhaustion?

    Hey TechNet
    So today at about 10:00 AM the Exchange server dropped all connectivity (except for ping), was not accessible by RDP, and I could not connect to the service control panel, etc., from a remote machine. In addition, I could not log on at the console; when I pressed Ctrl+Alt+Del, nothing happened. (This is running on VMware and I was not able to use the VMware tools to restart the server either; it had to be reset in the console.)
    I found numerous errors in the event viewer, as you might suspect, namely that the server could not locate any domain controllers, global catalogs, etc. Of course all the mailbox databases failed over to the other DAG member and no one noticed (so that worked well).
    In checking some settings, I have determined that the server's memory page file is not set to RAM+10MB (in fact it was never set manually), and static RPC ports are not set. The client was doing mailbox moves today (I am going to confirm this with the client again to make sure).
    My question to you is this: can leaving static RPC ports unset cause port exhaustion? And if so, can you provide a link that says as much?
    Thanks,
    Robert

    Rhoderick,
    Thanks for the response. I reviewed several different links regarding this, especially some related to memory exhaustion; I didn't see RADAR events in the event log. Here is a short list of some of the events that I have seen in the event viewer.
    ESE, 490 (Time Occurred: 9:47 am), several instances.  
    Information Store (5524) db-7: An attempt to open the file "V:\Program Files\Microsoft\Exchange Server\V14\Mailbox\database\E0E.chk" for read / write access failed with system error 1450 (0x000005aa):
    "Insufficient system resources exist to complete the requested service. ".  The open file operation will fail with
    error -1011 (0xfffffc0d).
    ESE 906 (Time Occurred 10:00 am), several instances.
    Information Store (5524) A significant portion of the database buffer cache has been written out to the system paging file.  This may result in severe performance degradation.
    See help link for complete details of possible causes.
    Resident cache has fallen by 78083 buffers (or 38%) in the last 118 seconds.
    Current Total Percent Resident: 16% (124338 of 762960 buffers)
    MSExchangeREPL 2170 (Time occurred, 9:51) several instances.
    A slow file IO operation was encountered on file 'V:\Program Files\Microsoft\Exchange Server\V14\Mailbox\database7\E0E000BB30C.log' for copy 'db-7\CBEXCH1'. The observed latency was 9737.6786
    ms while performing a 'MissingFileNotification'. This may indicate an overloaded system or a storage-related problem.
    MSExchangeADaccess 2070, (Time Occurred, 9:56 am) several instances.
    Process Microsoft.Exchange.EdgeSyncSvc.exe (PID=2864).  Exchange Active Directory Provider lost contact with domain controller dc.contoso.local.  Error was 0x51 (ServerDown) (Active
    directory response: The LDAP server is unavailable.).  Exchange Active Directory Provider will attempt to reconnect with this domain controller when it is reachable. 
    MSExchangeADaccess 2122, (Time Occurred 9:57) Several Instances.
    Process MSEXCHANGEADTOPOLOGYSERVICE.EXE (PID=164). Error 0x8007267c occurred when DNS was queried for the service location (SRV) resource record used to locate a domain controller for domain
    contoso.local
    The query was for the SRV record for _ldap._tcp.dc._dc.contoso.local
    For information about correcting this problem, Type in the command line:
    hh tcpip.chm::/sag_DNS_tro_dcLocator_messageA.htm
    While I do understand that to really nail this down we would need to set up a data collector set using perfmon, as you suggested (and probably procdump, netmon, etc.), I was trying to get the client an answer.
    I do have 2 perfmon collectors from last week and the week before,
    to be clear these captures were not taken when issues occurred; they show some memory pressure on 1 or 2 counters with some high spikes, but in most cases all the averages (with the exception of 2 counters) are completely below the thresholds.
    All captures were done for 4 hours @ 15 secs.
    Page Reads/Sec, has an average of 19.433 with a max of 909.51, There are about 10 spikes over 100
    Pages Input/Sec, Shows an Average of 132.981, Max, 6,695 (I realize the max is bad), I see 13 Instance's of this counter spiking above 1,000.
    Pages Output/Sec, Shows An average of 38.113, Max, 5,221 (I realize the max is bad here), I see about 8 instances of spikes over 1,000.
    Systemcache Resident Bytes remains stable with very minor increases over the entire capture.
    Pool non paged bytes, is stable across the entire capture as well (Stable, meaning straight line)
    Working Set for Store.exe, stable, straight line almost no changes.
    Working set for Total, Stable, Straight line almost no changes.
    % Committed Bytes in use is at 60%, stable across the whole capture.
    This server does not have its page file set to RAM+10MB; it is set to grow dynamically. My plan (before yesterday) was to set the page file correctly, then recapture and see if there were any differences. The problem I have run into here is that none of the counters seem to correlate, or if they do, it's not a complete enough correlation to tell me what's going on.
    Thanks for your help here!!
    Robert

  • Scripting for Windows Server Backup for Exchange 2010

    Hi,
    I'm new to writing scripts and was wondering if anyone could assist with writing a script to back up an Exchange server in a Windows 2008 R2 environment. I currently use Windows Server Backup with a full VSS backup to run Exchange backups; however, each backup overwrites the previous one, and I would really like to have multiple backups. I would also like to specify backup log locations and have emails sent upon successful or failed backups. I am trying to write this in PowerShell but can't seem to get things to work.
    Thanks
    Debbian

    You can use PowerShell in conjunction with the Windows Backup command-line utility, wbadmin.exe. The code below should help get you started. I have included the date in order to help maintain multiple backups, but if you do more than one backup per day, you will need to change the date format to include the time. You cannot control the built-in logging of Windows Backup, but you could collect your own backup information and write that out to a text file. As far as sending emails, look into the Send-MailMessage cmdlet, including the -Attachments parameter, which could send your log file with your mail message. Cheers.
    $Date = Get-Date -Format 'MM-dd-yyyy'
    $Path = "\\server\share\$Date"
    New-Item -Path $Path -ItemType Directory | Out-Null
    Start-Process wbadmin.exe -ArgumentList "start backup -backupTarget:$Path -allCritical -vssFull -quiet"
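Building on that, here is a sketch that adds the time to the folder name and emails a notification when the backup finishes; the SMTP server, addresses, and log path are placeholders you would replace with your own:

```powershell
# Timestamped target folder so multiple backups per day don't collide.
$Date = Get-Date -Format 'MM-dd-yyyy_HH-mm'
$Path = "\\server\share\$Date"
New-Item -Path $Path -ItemType Directory | Out-Null

# Run the backup and wait for it to finish so we can report the outcome.
$proc = Start-Process wbadmin.exe -ArgumentList "start backup -backupTarget:$Path -allCritical -vssFull -quiet" -Wait -PassThru

# Write a simple log entry and mail it; the server and addresses are examples.
$status = if ($proc.ExitCode -eq 0) { 'succeeded' } else { "failed (exit code $($proc.ExitCode))" }
$log = "C:\BackupLogs\$Date.txt"
"Exchange backup to $Path $status at $(Get-Date)" | Out-File $log
Send-MailMessage -SmtpServer 'mail.example.com' -From 'backup@example.com' `
    -To 'admin@example.com' -Subject "Exchange backup $status" -Attachments $log
```

Scheduled via Task Scheduler, this gives you a rotating set of dated backups plus a mail trail, without touching Windows Backup's own logging.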

  • Access denied. Error in File C:\WINDOWS\TEMP\

    I have searched on Google and all over this forum and none of the solutions have fixed my problem.
    Crystal Version: Crsytal.Net for Visual Studio.Net 2005
    Server: Windows Server 2003
    Error:
    Access denied. Error in File C:\WINDOWS\TEMP\JuryDutyReport {D6296178-3E72-483E-B876-2DFC03D00841}.rpt: Access to report file denied. Another program may be using it.
    When I run my app locally through the Web Server that comes with ASP.Net, everything is fine, it is only when I deploy the application to the Windows 2003 Server that I get the error.
    I'm using impersonation in my ASP.Net application.  I have given that domain user full access to 'C:\Windows\Temp'', the export folder and even the folder where the Crystal Report resides on the Server.  When I run the application on the Web Server, I actually see the ".rpt" get created in the "C:\Windows\Temp" folder but yet it still says there is a permissions error.
    What is bizarre is that the code below that just sends the file to the printer automatically works:
    private void PrintJuryDutyReport(DataSet ds)
    {
        //create report document
        ReportDocument crDoc = new ReportDocument();
        //load, set datasource and print options
        crDoc.Load(Server.MapPath("~/Reports/JuryDutyReport.rpt"));
        crDoc.SetDataSource(ds); //set datasource
        crDoc.PrintOptions.PrinterName = ddlPrinters.SelectedValue.ToString(); //set printer name
        crDoc.PrintOptions.PaperOrientation = PaperOrientation.Portrait; //set paper orientation
        crDoc.SetParameterValue("ParamUsername", User.Identity.Name); //set parameter
        crDoc.PrintToPrinter(1, false, 0, 0); //send to printer
    }

    I have to change the code to export to a PDF, and this code doesn't work:

    private void PrintJuryDutyReport(DataSet ds)
    {
        //report document
        ReportDocument crDoc = new ReportDocument();
        string myfile = @"G:\COPFS\COPFSPROD\ReportsTemp\MyPDF.pdf";
        //load, set datasource and print options
        crDoc.Load(Server.MapPath("~/Reports/JuryDutyReport.rpt"));
        crDoc.SetDataSource(ds); //set datasource
        crDoc.SetParameterValue("ParamUsername", User.Identity.Name); //set parameter
        //export through http
        crDoc.ExportToDisk(ExportFormatType.PortableDocFormat, myfile);
        crDoc.Close();
        crDoc.Dispose();
        Response.ClearContent();
        Response.ClearHeaders();
        Response.ContentType = "Application/pdf";
        Response.AppendHeader("content-disposition", "attachment; filename=" + myfile);
        Response.WriteFile(myfile);
        Response.Flush();
        Response.Close();
    }

    Any help is greatly appreciated, as I have to present this to end users tomorrow.

    Don, thanks for the response.
    As a last ditch effort, I granted "modify" to the Network Service Account on C:\Windows\Temp and that fixed the error.
    There are two things that are troubling about this:
    1) I'm impersonating a domain user in my ASP.NET application, and when the PDF is created, the owner is that domain user, so I know impersonation is working. So I wonder whether ASP.NET picks and chooses which account it runs under at different times?
    2) It is a little scary for the Network Service account to have this access, but people seem to be fine with it.
    http://aspadvice.com/blogs/rjdudley/archive/2005/03/14/2566.aspx
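For reference, the equivalent permission grant from an elevated prompt looks like this (the account name assumes an English-language OS; adjust it for localized installs):

```powershell
# Grant Modify (M) on C:\Windows\Temp to NETWORK SERVICE,
# inherited by subfolders (CI) and files (OI).
icacls "C:\Windows\Temp" /grant "NT AUTHORITY\NETWORK SERVICE:(OI)(CI)M"
```

Scripting it this way is repeatable across servers, which beats clicking through the Security tab when you deploy to a new box.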

  • Save as dialog on exit in Temp Folder

    Hi,
    We have made a little program that creates a PDF in the Windows temp folder (C:\Users\username\AppData\Local\Temp). We open it and the user can modify its content.
    Afterwards we check whether they made changes and send the file back to our application.
    My problem is when the user wants to save his changes.
    There is no problem if the user clicks File > Save in Adobe: the file is saved without problem in the temp folder, and afterwards the user clicks to close Adobe.
    But if the user doesn't save and clicks to close Adobe, the Adobe dialog appears asking if you would like to save your changes. If the user clicks Yes,
    a "Save As" dialog box appears and wants to save the document on the desktop.
    If we put our file in another folder (not C:\Users\username\AppData\Local\Temp), the "Save As" dialog doesn't appear after we press Yes in the "Would you like to save your changes?" dialog.
    So could you tell me whether we can disable this "Save As" dialog when the file is in the temporary folder?
    Thanks for your help

    Those of us using Leopard routinely learned very quickly to click on the Save button and briefly hold on it. You will then get the Save As dialog you desire, and other options. It is only the quick click that defaults to use the designated Download folder.
    Ernie

  • Duplicates in temp folder

    I've recently upgraded to PS Elements V7. I've noticed that when I modify the date/time of a picture, it is leaving a duplicate copy of the picture in my temp folder (path is C:\Documents and Settings\username\Local Settings\temp). The only difference between this picture file and the original in my "my pictures" folder, is the date of the file/picture.
    There is also an empty "Adobe" and empty "editor" sub-folder in this temp folder.
    I've noticed that this duplicate file appears only when I modify the date/time of a picture, not if I do a full edit of the picture and save it (then it just overwrites the original picture in the "my pictures" folder).
    I've tried closing PS Elements and rebooting the computer, but the files in the temp folder are still there. I deleted all of these pictures and through Elements modified a picture, checked that it had duplicated in temp (it had), closed element, rebooted...still there.
    As I've never noticed this before (I usually clear out the temp folder about 1x a week and have been an elements user for years), I'm wondering what is causing it? If it is normal? Is there a way to disable it?
    As this temp folder is somewhat hidden (most/many computer users probably don't know how to access it), the fact that it is retaining full duplicate copies of modified pictures that must be manually deleted means that many users hard drives may be filling up unnecessarily with these files.
    I've checked in elements and noted that it points to the original picture in the "my pictures" folder and not to the temp folder duplicate. Deleting the temp folder pictures does not seem to affect elements. I've also tried the folder view option in elements to see if it was "recognizing" these duplicates in temp, but the temp folder does not even come up as accessible.
    Thanks for any help, insight...Mike Villella

    John, thanks for the reply. I wonder if anyone else is missing a lot of disk space due to files in a temp folder that they don't know about. I would think that closing the program and/or rebooting would flush this folder, but apparently not.
    I use Norton Protected Recycle bin. Never noticed a similar problem before. I just did an "unerase" function with the recycle bin and there were a lot of old files that are associated with elements, just not any actual pix.
    I do keep finished photos on a NAS drive, but the photos in question are on the hard drive (C drive).
    I did the process again paying more attention. Started by deleting everything in the temp folder and rebooting.
    The file had a date taken of 12/5/99 a created date of 1/3/09 and a modified date of 1/24/09 and was in "My pictures". Upon adjusting the time/date to 12/15/99, a new file popped up in the temp folder, same name, size, rez, etc. The temp file has a date taken of 12/5/99, a created date of 1/25/09 and a modified date of 1/24/09, same as the old file except the new created date of today.
    The picture in "my pictures" is also same size, name and rez. The date taken has the new date of 12/15/99, and a modified and created date of 1/25/09 (today).
    So apparently, upon adjusting the time/date, the original file is duplicated into temp, while the copy in "my pictures" is updated with the new date entered and today's date as created and modified.
    The duplicated file in temp is basically orphaned. Elements does not appear to link to it or see it. When I delete it manually, it does not affect Elements.
    So, I guess I can understand why a temp file is created during the process (but, by the same token, not sure why the same thing doesn't happen during a full edit), but I'm not sure why it is left in temp and not automatically deleted at some point. Does that make sense or am I missing something?
    Also, I notice an amt, alm, libFNP_events, and swtag file being created in this temp folder (C:\documents and settings\username\local settings\temp). They don't seem to ever auto-delete either.
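    For anyone hit by the same thing, a small script can at least flag which temp files are likely orphaned duplicates, by matching file name and size against the pictures folder (as described above, the copies keep the same name, size, and resolution). This is only a sketch with hypothetical paths, not anything Elements itself provides:

```python
import os

def find_orphaned_duplicates(pictures_dir, temp_dir):
    """Return temp-folder files whose name and size match a file in
    pictures_dir. These are likely the orphaned copies left behind by
    a date/time edit."""
    # Index the pictures folder by (lowercased name, size in bytes).
    originals = {}
    for name in os.listdir(pictures_dir):
        path = os.path.join(pictures_dir, name)
        if os.path.isfile(path):
            originals[(name.lower(), os.path.getsize(path))] = path

    # Any temp file matching both name and size is a duplicate candidate.
    duplicates = []
    for name in os.listdir(temp_dir):
        path = os.path.join(temp_dir, name)
        if os.path.isfile(path) and (name.lower(), os.path.getsize(path)) in originals:
            duplicates.append(path)
    return duplicates
```

    Reviewing the returned list before deleting anything keeps the script safe even if an unrelated temp file happens to share a name.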

  • DMS: Documents opened in edit mode(CV02N) are not deleted from Temp Folder

    Hi All,
    My query: whenever I edit documents such as Word, PDF, or AutoCAD files in CV02N, the system makes a copy in my C:\Windows\Temp folder but does not delete it when I close the transaction. I have set up the PC data carrier with the path %Temp%, and in the profiles I have defined the path as C:\Windows\Temp\.
    Is there any solution by which files opened in edit mode are deleted when I close the transaction, or else removed from the temp folder after some specified time? Is there any standard configuration for this?
    Note: I have tried Note 741388, but it applies only to documents opened in display mode.
    Regards
    Nishant.

    Hi Nishant,
    unfortunately this is the standard behavior: for editing, the application needs a temporary file, and this file is kept after the application is closed so that a re-check-in is still possible later.
    So the only possibility is to delete the temporary files from C:\Windows\Temp manually after some time. There is no standard functionality for this, as the SAP system has no control over Windows folders.
    To achieve a very similar behavior, you can set the "Delete after check-in" flag in transaction DC30 for these applications and see if that meets your requirements.
    This flag means that the original file is deleted after you check it into a storage category. Maybe this could be useful.
    Best regards,
    Christoph
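    Since the system won't clean these files up itself, a periodically scheduled script is the usual workaround. A minimal sketch in Python (the folder path and age threshold are assumptions to adapt, not anything SAP ships):

```python
import os
import time

def delete_old_temp_files(temp_dir, max_age_days=7):
    """Delete files in temp_dir not modified within max_age_days.

    Returns the list of deleted paths. Files still locked by an
    application are skipped rather than raising an error."""
    cutoff = time.time() - max_age_days * 86400
    deleted = []
    for name in os.listdir(temp_dir):
        path = os.path.join(temp_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            try:
                os.remove(path)
                deleted.append(path)
            except OSError:
                pass  # file in use, e.g. still open in the editing application
    return deleted
```

    Keeping an age threshold (rather than deleting everything) preserves the re-check-in window Christoph describes for recently edited documents.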

  • SP1 Failed Update - Update missing file in temp folder

    An update to SP1 failed. The system is now running in its old state and working fine.
    The problem now is that I can't update to SP1. The SP1 installer detects the failed patch installation and tries to continue it, then fails while searching for a file in the Windows temp folder that doesn't exist.
    Is there a way to tell the SP1 installer to ignore the failed installation, and behave as if it's the first time it tries to install the update?

    HKEY_Local_Machine\Software\Microsoft\ExchangeServer\V15\CafeRole   is on a CAS Server.
    HKEY_Local_Machine\Software\Microsoft\ExchangeServer\V15\MailboxRole is on the Mailbox Server.
    In my case the problem was on the Mailbox server. After I deleted the key, setup started normally without noticing an incomplete install.
    Thanks, that key was exactly what I was hoping for, but I didn't know what to search for.
    SP1 installation finished.
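    For reference, the two watermark keys from the posts above can be captured in a small lookup helper. This is only an illustrative sketch (the role labels "cas" and "mailbox" are my own); the actual deletion would still be done with `reg delete` or PowerShell's `Remove-Item`, ideally after backing up the registry:

```python
# Setup watermark keys from the thread above; deleting the key for the
# role whose install failed lets the SP1 installer start fresh.
WATERMARK_KEYS = {
    "cas": r"HKLM\Software\Microsoft\ExchangeServer\V15\CafeRole",
    "mailbox": r"HKLM\Software\Microsoft\ExchangeServer\V15\MailboxRole",
}

def watermark_key_path(role):
    """Return the registry path of the setup watermark key for a role."""
    return WATERMARK_KEYS[role.lower()]
```

    For example, `watermark_key_path("mailbox")` gives the key the poster deleted on the Mailbox server.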

  • Temp folder and Creative Cloud

    Is Creative Cloud or any CC application using the Windows Temp folder? I regularly delete the contents of this folder.

    The temp folder under Windows is used by the CC desktop app to store temporarily available updates or complete application downloads while the download and install process is running. The Local/Temp folder for your user account contains useful logs such as amt3, oobelib, and PDApp, which Adobe technical support can use to investigate application malfunctions. You may clean both Temp folders, but skip the files listed above.
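    Following that advice, a cleanup that spares the support logs could be sketched like this; the prefix list reflects the file names mentioned in the reply, and the folder path is whatever Temp folder you point it at:

```python
import os

# Log-file name prefixes Adobe support may ask for; skip these when cleaning.
KEEP_PREFIXES = ("amt3", "oobelib", "pdapp")

def clean_temp_keeping_logs(temp_dir):
    """Delete files in temp_dir except the Adobe support logs listed above."""
    removed = []
    for name in os.listdir(temp_dir):
        path = os.path.join(temp_dir, name)
        if not os.path.isfile(path):
            continue
        if name.lower().startswith(KEEP_PREFIXES):
            continue  # keep logs useful for support investigations
        try:
            os.remove(path)
            removed.append(path)
        except OSError:
            pass  # file in use, e.g. a download in progress
    return removed
```
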

  • Upgrade Windows 2008 R2 sp1 on an Exchange 2010 SP3 server

    I have looked high and low for an answer to this question:
    1. I have Windows 2008 R2 running Exchange 2010 SP3
    2. Can I upgrade Windows 2008 R2 to SP1 with Exchange 2010 SP3 onboard?
    3. Are there pre-requisites for doing this?
    4. If so, where can I get the documentation?
    I have spent hours looking through Google and TechNet. There is lots of information about the compatibility of 2008 R2 with Exchange SP1 and SP2, but not SP3.
    Your wisdom and experience would be appreciated.  Thank you!!
    Charlie

    Hi,
    "Can I upgrade Windows 2008 R2 to SP1 with Exchange 2010 SP3 onboard?"
    Yes, you can.
    "Are there pre-requisites for doing this?"
    There are no prerequisites; however, as Idan Vexler mentioned above, downtime is required.
    "If so, where can I get the documentation?"
    Here are two related threads for your reference.
    http://social.technet.microsoft.com/Forums/en-US/1093a91d-aed8-4c91-b538-ca82f5860523/upgrading-windows-2008-r2-enterprise-to-sp1-on-exchange-2010?forum=exchange2010
    http://social.technet.microsoft.com/Forums/exchange/en-US/0a7f29a2-04a5-485f-9b0c-061f89b311a5/installing-windows-2008r2-sp1-on-exchange-server-2010?forum=exchange2010
    Hope this helps.
    Best regards,
    Belinda
    Belinda Ma
    TechNet Community Support
