Pfirewall.log file not updating
Hi,
I configured Windows Firewall logging in my DC Firewall GPO. The settings are:
Log dropped packets: enabled
Log successful connections: enabled
Log file path and name:
%systemroot%\system32\logfiles\firewall\pfirewall.log
Size limit: 32767
The pfirewall.log file is located in this area but is not updating itself. Is there something I need to do so the file gets an updated time stamp and overwrites itself after it reaches a certain size? I changed the size limit from 1024 to 32767 today, but it still does not seem to update itself.
My 2003 domain's pfirewall.log IS updating, and it points to the c:\windows directory. Is there a service that needs NTFS permissions on the folder where the log file resides, or a different GPO setting that handles this?
Thanks,
Kevin C.
Hello Kevin,
Please check this link.
http://www.grouppolicy.biz/2010/07/how-to-manage-windows-firewall-settings-using-group-policy/
Before exporting, go to the properties (see the last part of the linked article) where you see the logging options, and configure your logging settings. Then import these settings into your GPO.
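Once the GPO applies, one way to confirm the log is really being written is to parse its entries and report the newest timestamp and the drop count. A minimal Python sketch, assuming the usual pfirewall.log W3C-style header (the field names here come from the typical `#Fields:` line; verify against your own file):

```python
import datetime

def summarize_pfirewall(lines):
    """Parse pfirewall.log-style lines; return (latest timestamp, DROP count).

    Assumes a '#Fields:' header line naming at least date, time, and action,
    as in the standard pfirewall.log layout.
    """
    fields = []
    last_ts = None
    drops = 0
    for line in lines:
        line = line.strip()
        if line.startswith("#Fields:"):
            fields = line.split()[1:]   # e.g. ['date', 'time', 'action', ...]
            continue
        if not line or line.startswith("#") or not fields:
            continue                    # skip other header/blank lines
        rec = dict(zip(fields, line.split()))
        last_ts = datetime.datetime.strptime(
            rec["date"] + " " + rec["time"], "%Y-%m-%d %H:%M:%S")
        if rec.get("action") == "DROP":
            drops += 1
    return last_ts, drops
```

If the latest timestamp never advances while traffic is clearly being dropped, the problem is the logging configuration (or write permissions on the folder), not the parser.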
Similar Messages
-
Cfmail not working properly, Log files not updating
I'm still trying to get this CFMail problem corrected. The problem is that CFMail is only sending the mail out sporadically. In many cases it will send out the mail and still put the message in the undeliverable folder. My SMTP server shows that the connection was opened and closed.
I'm pretty sure that I'm dealing with a CF problem since the mail logs are screwing up as well. In the past when I'd have a problem with CFmail I could go to my mail.log and see what was happening.
Those logs looked like this:
"Error","scheduler-2","05/28/09","11:00:43",,"Invalid Addresses; nested exception is: javax.mail.SendFailedException: 550 No such user (website_notification) Cached lookup "
"Error","scheduler-2","05/28/09","11:00:43",,"Invalid Addresses; nested exception is: javax.mail.SendFailedException: 550 No such user (wayneseattle) -ERR [email protected] not found "
"Error","scheduler-2","05/28/09","11:00:43",,"Invalid Addresses; nested exception is: javax.mail.SendFailedException: 550 No such user (ca-sl-broardmoor) Cached lookup "
Now my logs look like this:
"Error","scheduler-2","12/07/09","10:05:36",,""
"Error","scheduler-4","12/07/09","10:17:22",,""
"Error","scheduler-0","12/07/09","10:33:37",,""
"Error","scheduler-0","12/07/09","10:53:53",,""
Where did the error messages go?
Could this be an SMTP response message formatting issue? Any ideas on how to figure this one out would be very much appreciated.
Which version of CF are you using? You may have already checked this, but under "Server Settings -> Mail -> Mail Logging Settings" there's an option box for "Log all mail messages sent by ColdFusion". If you have it unchecked you may not see all of that data.
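Since the log lines are quoted, comma-separated records, a standard CSV reader can flag exactly which entries lost their message text. A small sketch (column layout assumed from the sample lines above: severity, thread, date, time, blank, message):

```python
import csv
import io

def empty_message_errors(log_text):
    """Return the Error rows whose trailing message field is empty.

    Assumes each line is a quoted CSV record ending with the message text,
    matching the mail.log samples shown above.
    """
    flagged = []
    for row in csv.reader(io.StringIO(log_text)):
        if row and row[0] == "Error" and row[-1] == "":
            flagged.append(row)
    return flagged
```

Running this over old and new logs makes the regression obvious: the 05/28/09 entries carry full SendFailedException text, while the 12/07/09 entries parse with an empty final field.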
-
Alert log is not updated in Oracle 8i
Database is running
The alert log is not updated, not even after a manual log switch.
I have created a job for this, but it is not running by itself, while the same job runs on another DB.
DBA_JOBS shows a null value for this job.
I can run the job manually; then DBA_JOBS shows a value.
After creating the job, it shows up in the invalid objects list.
Thanks in advance
Oracle_2410 wrote:
Please somebody help me!!! It's a 3-month-old problem!!!
For the first problem, "Alert log not getting updated":
Please post the OS version and name, and the output of the query below:
show parameter background
ls -lart (or dir) <path shown above>
Regards
Anurag -
Having a problem with InCopy files not updating after designers in InDesign prepare packages on a second round of revisions.
First, the designers create an InDesign file and prepare a newly created InCopy package. The editors are able to open the package for the first time. Editors make changes, then repackage for the designers. Designers then open the package from the editors, make the necessary changes, and repackage it back to the editors. At this point, the editors receive a message to replace when opening the package. The files open, but not all the changes that were originally made are updated. It is as if it is still using the original assignments from the first newly created package. Also, all the assignments have "?" in the assignment window in InCopy. We are unable to figure out how to update the "?" in InCopy, or relink, or whatever we need to do at this point.
Several things we checked.
1. Unstuffing the package contents from the designers shows all updated file dates, so we know the package contains the newest updated assignment files.
2. If the designers rename all the assignment files in the assignment folder and then repackage, the files open in InCopy with the updated changes, but not if the file names remain the same. Also, if the assignment files are all renamed, InCopy doesn't ask to replace anything.
At this point, our only solution is to rename all the assignment files and repackage. But we shouldn't have to do this, especially since it adds time for the designers to repackage the files.
Designers are using InDesign CS3 on Mac OS 10.6.4.
Editors are using InCopy CS3 on Windows PCs.
Any help would be greatly appreciated.
When the InCopy user opens the assignment package for the first time, it unpacks (it's really like a .zip file) into a folder on their hard drive called InCopy Assignments. The folder is located in My Documents on a PC and in the Home > Documents folder on a Mac.
Every unpacked InCopy assignment package goes into a subfolder within this InCopy Assignments parent folder. Each package's folder is named after the *assignment* name (not the name of the package itself, which is usually the same but not always). So if the assignment name is Chapter1.inca, then the subfolder name will be Chapter1.
When the InCopy user repackages it for the designer, it packages a *copy* of this subfolder. This is the subfolder that Bob is suggesting they delete, after sending off the packaged InDesign file. That's a good idea.
If you can't rely on your editors to do that ... many are very leery of throwing anything away ... then your designers should *rename the assignment* in their Assignments panel before creating an InCopy package for the second or third or whatever time. They can just double-click the assignment in the panel, the Assignment Options dialog box opens, and they can change the name from say "Chapter1" to "Chapter1-v2". They click okay, they choose Package for InCopy, and send it off.
When the editor gets this file and double-clicks it, it unpacks into a different subfolder than before, Chapter1-v2, leaving the Chapter1 subfolder intact. So you avoid the problem of duplicate assignments on the InCopy user's hard drive.
FWIW this was much more of a glitch with CS3. Not sure what Adobe did differently, but assignment packages are much more intelligent in CS4 and CS5.
AM -
BACKGROUND JOB WAS NOT SCHEDULED/LOG FILE NOT YET GENERATED
Hello,
To generate the log report, the /VIRSA/ZVFATBAK program is scheduled on an hourly basis, but sometimes the report doesn't get generated, even though the background job shows as successfully finished.
If we manually view the log report for the FFID, the error message below is displayed.
" BACKGROUND JOB WAS NOT SCHEDULED/LOG FILE NOT YET GENERATED"
Can anyone guide me to solve the issue.
Thanks in advance.
Best Regards,
Prashant Dubey
Hi,
First, check the status of the job by selecting it and checking the job status (Ctrl+Shift+F12).
Since it is a periodically scheduled job, there will be a RELEASED job after every active job.
So try to copy it into another job using the copy option, and give it a new name which you have to remember.
The moment you copy it, you will find the copied job in SCHEDULED status.
From there, try to run it again on an hourly basis.
After copying the job, unschedule the old released job; otherwise two will run at the same time.
rgds, -
Any ideas on creating an app to read a log file when it is updated?
Afternoon all,
I have been asked to write a java app which will read the contents of the server log file every time the log file is updated by the server. The app will be deployed onto WebSphere Application Server.
Can anyone point me in the right direction as I have never written anything like this before and I don't know where to start. Any help will be much appreciated.
Thanks in advance,
A.
alex@work wrote:
I agree with most of what you've said but unfortunately I don't have a say in what the company wants. However, I am interested in the appender idea; perhaps they may go for that if I suggest it.
I'd say it'll take you a day to read up about Log4J and how to write basic appenders, and another day to write your own appender for this problem. Compare that to the effort of writing something to poll a log file, re-read it constantly and update another file, operations which will get slower and slower as they go along. That's a fair amount more code than a single appender would be. There's how to sell it to your company.
Can you give me a brief overview of how it works?
Log4J uses objects called appenders, which take logging info - generated by your container - and do something with it. It ships with some appenders already in it, for writing to stdout, files, sockets and databases. You can also write your own appenders that do something more than these standard ones do. You write logging code in your application - in this case, your container already does this so you don't have to - and the configuration of Log4J decides what happens to these logging messages. That's what you're interested in. You could write an appender - a simple class - that takes raw logging messages in and writes them out to file in whatever format you want.
Come to think of it, depending on how complex the required XML is, you may even be able to do this without writing any code at all. You can write formatting patterns in the Log4J config that existing file appenders will use to write your XML files.
A bit of an abstract explanation, I guess. Your best bet is to first ascertain that Log4J is indeed in use, and then read the documentation, which is surprisingly good for an Apache project :-)
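For comparison, the poll-and-reread approach that the appender replaces boils down to remembering a byte offset and reading only what was appended since. A sketch in Python rather than Java, purely to show the shape (a real WebSphere deployment would run the equivalent from a scheduled task, and the rotation handling here is the simplest possible assumption):

```python
import os

def follow(path, offset=0):
    """Return (new lines, new offset) for content appended to `path`.

    Call repeatedly with the offset returned last time. If the file
    shrank (assumed to mean rotation/truncation), start from the top.
    """
    size = os.path.getsize(path)
    if size < offset:
        offset = 0                 # file was rotated or truncated
    new_lines = []
    if size > offset:
        with open(path, "r") as f:
            f.seek(offset)
            new_lines = f.readlines()
            offset = f.tell()
    return new_lines, offset
```

Even in this tiny form you can see the cost the reply warns about: every consumer of the data re-reads and re-writes, whereas an appender is handed each message once, at the moment it is logged.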
[http://logging.apache.org/log4j/1.2/index.html] -
WSA Webroot Malware Category DAT files Not Updating
I have two WSAs that have not updated their Webroot DAT files since May 16th. I have tried a manual update and it still has not updated. I am not sure if any new updates have been released and are available. Is there a way I can verify what the most recent DAT files are?
Did you ever resolve this issue? I have Webroot failing to update as well. I will log a TAC case, but I'm checking to see if you have had any progress.
Ben -
Hi,
I have successfully used library files in my web pages for years using a past version of Dreamweaver.
I have just started using Dreamweaver MX 2004, and whenever I update a library file (that has already been inserted in an html file) the html file is not updated. I started a new web site from scratch, so am using only files that have been created by this version of Dreamweaver.
In other words, updating the library file only updates the library file itself, and doesn't alter the html document that has the library file inserted into it. This pretty much means that the library file functionality in Dreamweaver MX 2004 is useless.
Was there a bug fix for this that I needed to get?
Cheers
Try this - use Site Manager to export this site definition to a place you can find. Then REMOVE this site definition. Then IMPORT the one you just exported.
Does it work now?
If it does not, can you confirm that your Library items are in a folder named "Library" and that this folder is in the root level of the local site?
Also, are you uploading the local files after you have changed the Library item?
Murray --- ICQ 71997575
Adobe Community Expert
(If you *MUST* email me, don't LAUGH when you do so!)
==================
http://www.dreamweavermx-templates.com - Template Triage!
http://www.projectseven.com/go - DW FAQs, Tutorials & Resources
http://www.dwfaq.com - DW FAQs, Tutorials & Resources
http://www.macromedia.com/support/search/ - Macromedia (MM) Technotes
==================
"chuckee" <[email protected]> wrote in message news:[email protected]...
> Hi there,
> yes I am using 7.0.1.
> I've attached the code of the page that has the Library file inserted in it (support.htm).
> Below, I have also attached the library file - header.lbi - (notice how it has two links that I added to it later, that did not update the file that it was inserted in, after I saved the library file).
>
>
> SUPPORT.HTM
> <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
> <html>
> <head>
> <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
> <title>SMTP2Go - Signup</title>
> <style type="text/css">
> <!--
> .style1 {font-size: 12px}
> -->
> </style>
> <link href="style.css" rel="stylesheet" type="text/css">
> <style type="text/css">
> <!--
> .style2 {font-size: 12}
> -->
> </style>
> </head>
>
> <body>
> <table width="80%" border="0" align="center">
> <tr>
> <td><div align="right"><!-- #BeginLibraryItem "/header.lbi" -->
> <div align="right">
> <p>Log In</p>
> </div>
> <table width="100%" border="0" cellspacing="2" cellpadding="0">
> <tr>
> <td width="16%">Home</td>
> <td width="23%">How It Works </td>
> <td width="21%">Sign Up </td>
> <td width="18%">FAQ</td>
> <td width="22%">Support</td>
> </tr>
> </table>
> <!-- #EndLibraryItem --></div>
> <form name="support" method="post" action="supportresponse.php">
> </p>
> <table width="65%" border="0" align="center">
> <tr>
> <td colspan="2"><p>We welcome all feedback to do with SMTP2Go, and any questions you may have.</p>
> <p> </p></td>
> </tr>
> <tr>
> <td width="28%">Name:</td>
> <td width="72%"><input name="name" type="text" id="name"></td>
> </tr>
> <tr>
> <td> Email Address: </td>
> <td><input name="email" type="text" id="email"></td>
> </tr>
> <tr>
> <td valign="top"><span class="style2">Message::</span></td>
> <td><textarea name="message" cols="45" rows="6" id="message"></textarea><br> </td>
> </tr>
> <tr>
> <td> </td>
> <td><input type="submit" name="Submit" value="Send"></td>
> </tr>
> </table>
> <p> </p>
> </form>
> </td>
> </tr>
> </table>
> </body>
> </html>
>
>
> HEADER.LBI
> <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
> <div align="right">
> <p>Log In</p>
> </div>
> <table width="100%" border="0" cellspacing="2" cellpadding="0">
> <tr>
> <td width="16%">Home </td>
> <td width="23%">How It Works</td>
> <td width="21%"><a href="signup.htm">Sign Up</a> </td>
> <td width="18%">FAQ</td>
> <td width="22%"><a href="support.htm">Support</a></td>
> </tr>
> </table>
> -
Domain.log is not updated anymore
Hi all,
Through a cleaning operation on the disk, I deleted the domain.log of our BPEL domain.
Since then, as no new file has been created, I've tried to create a blank domain.log file, or to copy a similar file from another machine and put it there.
In any case, our BPEL domain stopped updating this file. Our processes are still running, but we don't have logs anymore, even in error cases.
What do I have to do to re-generate this domain.log?
Thank you
Also, please note that nothing new gets logged in the domain logs until and unless you do something in that domain.
Rather, you can see the default_island logs of that container in the opmn/logs folder.
Regards
Anirudh Pucha -
I followed these steps:
1. In the application, set the profile FND: Debug Log Level to "Statement".
2. Restart Apache.
3. Run debug from Help --> Diagnostics --> Debug.
4. Secure the debug log file, which should be in:
select value from v$parameter where name like 'utl_file%'
but there is no log file created, and I don't know why (these steps were provided by an SR).
Thanks
What about the "FND: Debug Log Filename for Middle-Tier" and "FND: Diagnostics" profile options?
Note: 372209.1 - How to Collect an FND Diagnostics Trace (aka FND:Debug)
https://metalink2.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=372209.1
If the above does not help, set the debug log at the user level and check then.
Note: 390881.1 - How To Set The Debug Log At User Level?
https://metalink2.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=390881.1 -
Empty Log files not deleted by Cleaner
Hi,
we have a NoSql database installed on 3 nodes with a replication factor of 3 (see exact topology below).
We ran a test which consisted of the following operations repeated in a loop: store a LOB, read it, delete it.
store.putLOB(key, new ByteArrayInputStream(source),Durability.COMMIT_SYNC, 5, TimeUnit.SECONDS);
store.getLOB(key,Consistency.NONE_REQUIRED, 5, TimeUnit.SECONDS);
store.deleteLOB(key, Durability.COMMIT_SYNC, 5, TimeUnit.SECONDS);
During the test the space occupied by the database continues to grow!
Cleaner threads are running but log these warnings:
2015-02-03 14:32:58.936 UTC WARNING [rg3-rn2] JE: Replication prevents deletion of 12 files by Cleaner. Start file=0x0 holds CBVLSN 1, end file=0xe holds last VLSN 24,393
2015-02-03 14:32:58.937 UTC WARNING [rg3-rn2] JE: Cleaner has 12 files not deleted because they are protected by replication.
2015-02-03 14:32:58.920 UTC WARNING [rg3-rn1] JE: Replication prevents deletion of 12 files by Cleaner. Start file=0x0 holds CBVLSN 1, end file=0xe holds last VLSN 24,393
2015-02-03 14:32:58.921 UTC WARNING [rg3-rn1] JE: Cleaner has 12 files not deleted because they are protected by replication.
2015-02-03 14:32:58.908 UTC WARNING [rg3-rn3] JE: Replication prevents deletion of 12 files by Cleaner. Start file=0x0 holds CBVLSN 1, end file=0xe holds last VLSN 24,393
2015-02-03 14:32:58.909 UTC WARNING [rg3-rn3] JE: Cleaner has 12 files not deleted because they are protected by replication.
2015-02-03 14:33:31.704 UTC INFO [rg3-rn2] JE: Chose lowest utilized file for cleaning. fileChosen: 0xc (adjustment disabled) totalUtilization: 1 bestFileUtilization: 0 isProbe: false
2015-02-03 14:33:32.137 UTC INFO [rg3-rn2] JE: CleanerRun 13 ends on file 0xc probe=false invokedFromDaemon=true finished=true fileDeleted=false nEntriesRead=1129 nINsObsolete=64 nINsCleaned=2 nINsDead=0 nINsMigrated=2 nBINDeltasObsolete=2 nBINDeltasCleaned=0 nBINDeltasDead=0 nBINDeltasMigrated=0 nLNsObsolete=971 nLNsCleaned=88 nLNsDead=0 nLNsMigrated=88 nLNsMarked=0 nLNQueueHits=73 nLNsLocked=0 logSummary=<CleanerLogSummary endFileNumAtLastAdjustment="0xe" initialAdjustments="5" recentLNSizesAndCounts=""> inSummary=<INSummary totalINCount="68" totalINSize="7570" totalBINDeltaCount="2" totalBINDeltaSize="254" obsoleteINCount="66" obsoleteINSize="7029" obsoleteBINDeltaCount="2" obsoleteBINDeltaSize="254"/> estFileSummary=<summary totalCount="2072" totalSize="13069531" totalINCount="68" totalINSize="7570" totalLNCount="1059" totalLNSize="13024352" maxLNSize="102482" obsoleteINCount="66" obsoleteLNCount="971" obsoleteLNSize="12974449" obsoleteLNSizeCounted="971" getObsoleteSize="13019405" getObsoleteINSize="7347" getObsoleteLNSize="12974449" getMaxObsoleteSize="13019405" getMaxObsoleteLNSize="12974449" getAvgObsoleteLNSizeNotCounted="NaN"/> recalcFileSummary=<summary totalCount="2072" totalSize="13069531" totalINCount="68" totalINSize="7570" totalLNCount="1059" totalLNSize="13024352" maxLNSize="0" obsoleteINCount="66" obsoleteLNCount="971" obsoleteLNSize="12974449" obsoleteLNSizeCounted="971" getObsoleteSize="13019405" getObsoleteINSize="7347" getObsoleteLNSize="12974449" getMaxObsoleteSize="13019405" getMaxObsoleteLNSize="12974449" getAvgObsoleteLNSizeNotCounted="NaN"/> lnSizeCorrection=NaN newLnSizeCorrection=NaN estimatedUtilization=0 correctedUtilization=0 recalcUtilization=0 correctionRejected=false
Log files are not deleted even when empty, as seen using the DbSpace utility:
java -jar /mam2g/kv-3.2.5/lib/kvstore.jar com.sleepycat.je.util.DbSpace -h /mam2g/data/sn1/u01/rg2-rn1/env
File Size (KB) % Used
00000000 12743 0
00000001 12785 0
00000002 12725 0
00000003 12719 0
00000004 12703 0
00000005 12751 0
00000006 12795 0
00000007 12725 0
00000008 12752 0
00000009 12720 0
0000000a 12723 0
0000000b 12764 0
0000000c 12715 0
0000000d 12799 0
0000000e 12724 1
0000000f 5717 0
TOTALS 196867 0
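When there are many .jdb files, the DbSpace report above is easier to read after a little post-processing. A small sketch that flags files at or below a utilization threshold (it assumes the three-column layout shown: file name, size in KB, % used, with a header and a TOTALS line to skip):

```python
def empty_log_files(dbspace_output, threshold=0):
    """Return file names from a DbSpace report whose '% Used' <= threshold.

    Assumes rows of the form '<name> <size_kb> <pct_used>'; the header
    and the TOTALS summary line are skipped.
    """
    names = []
    for line in dbspace_output.splitlines():
        parts = line.split()
        if len(parts) != 3 or not parts[1].isdigit():
            continue                      # skip header/blank/malformed lines
        name, _size_kb, pct_used = parts
        if name.lower() != "totals" and int(pct_used) <= threshold:
            names.append(name)
    return names
```

Against the report above this would list every file except 0000000e, which matches the cleaner's complaint: the files are reclaimable by utilization but pinned by replication.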
Here is the configured topology:
kv-> show topology
store=MMS-KVstore numPartitions=90 sequence=106
zn: id=zn1 name=MAMHA repFactor=3 type=PRIMARY
sn=[sn1] zn:[id=zn1 name=MAMHA] 192.168.144.11:5000 capacity=3 RUNNING
[rg1-rn1] RUNNING
single-op avg latency=4.414467 ms multi-op avg latency=0.0 ms
[rg2-rn1] RUNNING
single-op avg latency=1.5962526 ms multi-op avg latency=0.0 ms
[rg3-rn1] RUNNING
single-op avg latency=1.3068943 ms multi-op avg latency=0.0 ms
sn=[sn2] zn:[id=zn1 name=MAMHA] 192.168.144.12:6000 capacity=3 RUNNING
[rg1-rn2] RUNNING
single-op avg latency=1.5670061 ms multi-op avg latency=0.0 ms
[rg2-rn2] RUNNING
single-op avg latency=8.637241 ms multi-op avg latency=0.0 ms
[rg3-rn2] RUNNING
single-op avg latency=1.370075 ms multi-op avg latency=0.0 ms
sn=[sn3] zn:[id=zn1 name=MAMHA] 192.168.144.35:7000 capacity=3 RUNNING
[rg1-rn3] RUNNING
single-op avg latency=1.4707285 ms multi-op avg latency=0.0 ms
[rg2-rn3] RUNNING
single-op avg latency=1.5334034 ms multi-op avg latency=0.0 ms
[rg3-rn3] RUNNING
single-op avg latency=9.05199 ms multi-op avg latency=0.0 ms
shard=[rg1] num partitions=30
[rg1-rn1] sn=sn1
[rg1-rn2] sn=sn2
[rg1-rn3] sn=sn3
shard=[rg2] num partitions=30
[rg2-rn1] sn=sn1
[rg2-rn2] sn=sn2
[rg2-rn3] sn=sn3
shard=[rg3] num partitions=30
[rg3-rn1] sn=sn1
[rg3-rn2] sn=sn2
[rg3-rn3] sn=sn3
Why are empty files not deleted by the cleaner? Why are empty log files protected by replication if all the replicas seem to be aligned with the master?
java -jar /mam2g/kv-3.2.5/lib/kvstore.jar ping -host 192.168.144.11 -port 5000
Pinging components of store MMS-KVstore based upon topology sequence #106
Time: 2015-02-03 13:44:57 UTC
MMS-KVstore comprises 90 partitions and 3 Storage Nodes
Storage Node [sn1] on 192.168.144.11:5000 Zone: [name=MAMHA id=zn1 type=PRIMARY] Status: RUNNING Ver: 12cR1.3.2.5 2014-12-05 01:47:33 UTC Build id: 7ab4544136f5
Rep Node [rg1-rn1] Status: RUNNING,MASTER at sequence number: 24,413 haPort: 5011
Rep Node [rg2-rn1] Status: RUNNING,REPLICA at sequence number: 13,277 haPort: 5012
Rep Node [rg3-rn1] Status: RUNNING,REPLICA at sequence number: 12,829 haPort: 5013
Storage Node [sn2] on 192.168.144.12:6000 Zone: [name=MAMHA id=zn1 type=PRIMARY] Status: RUNNING Ver: 12cR1.3.2.5 2014-12-05 01:47:33 UTC Build id: 7ab4544136f5
Rep Node [rg3-rn2] Status: RUNNING,REPLICA at sequence number: 12,829 haPort: 6013
Rep Node [rg2-rn2] Status: RUNNING,MASTER at sequence number: 13,277 haPort: 6012
Rep Node [rg1-rn2] Status: RUNNING,REPLICA at sequence number: 24,413 haPort: 6011
Storage Node [sn3] on 192.168.144.35:7000 Zone: [name=MAMHA id=zn1 type=PRIMARY] Status: RUNNING Ver: 12cR1.3.2.5 2014-12-05 01:47:33 UTC Build id: 7ab4544136f5
Rep Node [rg1-rn3] Status: RUNNING,REPLICA at sequence number: 24,413 haPort: 7011
Rep Node [rg2-rn3] Status: RUNNING,REPLICA at sequence number: 13,277 haPort: 7012
Rep Node [rg3-rn3] Status: RUNNING,MASTER at sequence number: 12,829 haPort: 7013
Solved by setting the undocumented parameter "je.rep.minRetainedVLSNs".
The solution is described in the NoSQL forum: Store cleaning policy -
Css files not updating in localhost (XAMPP test server)
I've got an issue with a web page I'm developing where the .css files that I attach to the pages will not update on my local host. I can make a copy of the file with a different name and link it to the page. However, if I name any file a name that I've already used then the CSS reverts to whatever was on the original documents. This is making my php page testing a pain when I view the page in a browser. The problem doesn't seem to occur for my other websites, and completely goes away when I select "none" in the testing server options. I'm using XAMPP to run my localhost testing server. I'm using Dreamweaver CS4 on a Vista machine. Any ideas? If nothing else it would be nice to know how to delete the old files that are stuck in the testing server directory.
Strange thing... I turned it off and walked away for a few hours. Upon my return I found the issue resolved. I blame the shoe elves. Thanks, you little guys!
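When stale CSS keeps turning up on a test server, a cache-busting query string on the stylesheet link sidesteps browser and proxy caching entirely. A minimal sketch of the idea, shown in Python (PHP sites often do the same thing with filemtime()); the tag format and parameter name `v` are illustrative, not anything XAMPP or Dreamweaver requires:

```python
import os

def versioned_css_link(css_path, href):
    """Build a <link> tag whose href carries the file's mtime as a
    version query string, so browsers refetch the CSS whenever the
    file on disk changes.
    """
    version = int(os.path.getmtime(css_path))
    return '<link rel="stylesheet" href="%s?v=%d">' % (href, version)
```

Because the URL changes only when the file's modification time changes, unchanged stylesheets stay cached while edited ones are fetched fresh.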
-
Archived log files not registered in the Database
I have Windows Server 2008 R2
I have Oracle 11g R2
I configured primary and standby database in 2 physical servers , please find below the verification:
I am using DG Broker
Recently I did a failover from the primary to the standby database.
Then I did REINSTATE DATABASE to return the old primary to standby mode.
Then I did a switchover again.
The problem is that archive logs are not being registered and not applied.
SQL> select max(sequence#) from v$archived_log;
MAX(SEQUENCE#)
16234
I did alter system switch logfile, then I issued the following statement to check, and I found the same number; on both primary and standby it has not changed:
SQL> select max(sequence#) from v$archived_log;
MAX(SEQUENCE#)
16234
Any body can help please?
Regards
Thanks for the reply.
What I mean is: after I do alter system switch logfile, I can see that the archived log file is generated on the physical disk, but when I run
select MAX(SEQUENCE#) FROM V$ARCHIVED_LOG;
the sequence number has not changed; it should increase by 1 whenever I do a switch logfile.
However, I did as you asked; please find the result below:
SQL> alter system switch logfile;
System altered.
SQL> /
System altered.
SQL> /
System altered.
SQL> /
System altered.
SQL> SELECT DB_NAME,HOSTNAME,LOG_ARCHIVED,LOG_APPLIED_02,LOG_APPLIED_03,APPLIED_TIME,LOG_ARCHIVED - LOG_APPLIED_02 LOG_GAP_02,
2 LOG_ARCHIVED - LOG_APPLIED_03 LOG_GAP_03
3 FROM (SELECT NAME DB_NAME FROM V$DATABASE),
4 (SELECT UPPER(SUBSTR(HOST_NAME, 1, (DECODE(INSTR(HOST_NAME, '.'),0, LENGTH(HOST_NAME),(INSTR(HOST_NAME, '.') - 1))))) HOSTNAME FROM V$INSTANCE),
5 (SELECT MAX(SEQUENCE#) LOG_ARCHIVED FROM V$ARCHIVED_LOG WHERE DEST_ID = 1 AND ARCHIVED = 'YES'),
6 (SELECT MAX(SEQUENCE#) LOG_APPLIED_02 FROM V$ARCHIVED_LOG WHERE DEST_ID = 2 AND APPLIED = 'YES'),
7 (SELECT MAX(SEQUENCE#) LOG_APPLIED_03 FROM V$ARCHIVED_LOG WHERE DEST_ID = 3 AND APPLIED = 'YES'),
8 (SELECT TO_CHAR(MAX(COMPLETION_TIME), 'DD-MON/HH24:MI') APPLIED_TIME FROM V$ARCHIVED_LOG WHERE DEST_ID = 2 AND APPLIED = 'YES');
DB_NAME HOSTNAME LOG_ARCHIVED LOG_APPLIED_02 LOG_APPLIED_03 APPLIED_TIME LOG_GAP_02 LOG_GAP_03
EPPROD CORSKMBBOR01 16252 16253 (null) 15-JAN/12:04 -1 (null) -
Psd files not updating edited smart objects (PS CC as of 6/3/14)
I created a master PSD file and added four different TIF files as 'linked' (not embedded) Smart Objects. If I edit these linked files (they open when double-clicked, or 'Edit Contents') make changes and then save & close the edited master TIF files, the main PSD file does not reflect the edited changes. If I double click the Smart Object, it opens and shows my edits, but to save & close still does not update the master PSD file. I do this all the time without issue, so I am perplexed. Even if I right-click the smart object and select "Update All Modified Content" nothing changes. Closing the file, restarting Photoshop- nothing updates the main PSD file with the changes to the edited TIF files... EXCEPT if I 'replace contents' with the same file: then it updates the PSD.
Any ideas?
I was actually just working in Illustrator, selecting the vector object, then switching to PS and pasting. I'm then asked if I want this to be a smart object. I select yes and it adds the object as a smart object layer. If I double-click the layer it opens in Illustrator; I make a change, save it and close it. Then back in PS the smart object doesn't update.
I haven't used Place Embedded or Place Linked. I don't understand what those functions do; it's just not how I'm used to working.
Actually, I have a large PSD doc. Inside it is a group of objects that I have turned into a smart object. Inside that (by double-clicking) is an AI file that is also a smart object. When I open that up to make a change, it won't update once I save and close.
I have tried a couple of tests with Place Linked and Place Embedded objects and I can't get those to work either, whatever the change was. So far I'm not really liking it. -
CSS files not updating correctly
I'm having some trouble with the CSS files that are uploaded through the application builder. When uploading an update to an existing CSS file, the file is not updated in the browser. Even after closing all browser windows and deleting the temporary internet files, the new version is not loaded. I've even tried deleting the file from the app builder and holding the Control key down when refreshing the browser, and it still loads as if the old file existed. We currently have 3 separate server environments and it works fine on 2 out of the 3. Could this be a setting on the application server when the file is fetched from the database? It is currently calling:
wwv_flow_file_mgr.get_file?p_security_group_id=721404936311769&p_fname=iPRACA.css
I know the correct file is being uploaded to the database because when I view the detailed report in the application builder and click on the file name, the updated version is displayed in the text area. But when I click on the download icon and open the CSS file to view it, it is still the old version.
I'm currently using APEX version 2.3 and OAS 10.1.2
Any help would be greatly appreciated.
Hi YodaHart,
What happens when you rename your css?
Does it work then?
I also guess it's the Oracle AS. However, we should investigate further along the lines you're thinking.
If you query WWV_FLOW_FILE_OBJECTS$, do you still see the files while you don't see them in APEX anymore? If yes, I would create another table as a select from the WWV_FLOW_FILE_OBJECTS$ table, just to have a backup. If the files aren't used anymore, you should be safe deleting the corresponding old records (for example through SQL*Plus). After all, these are files you uploaded...
Thanks,
Dimitri