Best Method: Status Command
I might have the wrong methodology... just seeking some unvarnished ideas. I own the Cisco Press TCL Scripting book, but things have changed some since printing.
First part: We have one router that several mobile routers at various branches dial in to. Very sporadic, long story, but it works well for what we're doing. The problem is that the IP addresses mean nothing to us; the sites have names like San Francisco, Tokyo, etc., and each IP stays the same. Is there a way for scripting to take advantage of a hosts file of sorts, so that a "show users" and/or "show dmvpn" command displays the site name instead of the IP?
Second part: Is it possible to have this EEM/TCL/etc. run continuously? It doesn't need to update continuously, but maybe on trigger of a change, or periodically, like every 30 seconds... nothing crazy. It'd be nice for untrained people to be able to just stroll past and see what's up at the time.
Thank You,
Bryan
TCL string map is one option to replace the IP address with a name in an EEM policy.
Place this in the router config: event manager environment timer * * * * *
#####Start file#######
::cisco::eem::event_register_timer cron name status_update.tcl cron_entry "$timer" maxrun 55

namespace import ::cisco::eem::*
namespace import ::cisco::lib::*

array set _sinfo [sys_reqinfo_routername]
set host $_sinfo(routername)

if {![info exists timer]} {
    set result "Policy cannot be run: variable timer has not been set"
    error $result $errorInfo
}

# Open the CLI
if {[catch {cli_open} result]} {
    error $result $errorInfo
}
array set cli1 $result

# Go into enable mode
if {[catch {cli_exec $cli1(fd) "enable"} result]} {
    error $result $errorInfo
}

# Run the command, then map the known peer IPs to site names
if {[catch {cli_exec $cli1(fd) "show users"} users]} {
    error $users $errorInfo
}
set result [string map {192.168.0.125 Hong-Kong \
                        192.168.0.19 Washington-DC} $users]
puts "$result"

# Close the CLI session
catch {cli_close $cli1(fd) $cli1(tty_id)}
######end file#####
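For reference, a sketch of the registration side on the router. The flash: path is an assumption here; substitute whatever your user policy directory actually is:

```
event manager directory user policy "flash:/policies"
event manager policy status_update.tcl
```

After this, "show event manager policy registered" should list the policy.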
Once the policy is registered, it will run every minute and produce output like this:
%HA_EM-6-LOG: status_update.tcl: Line User Host(s) Idle Location
%HA_EM-6-LOG: status_update.tcl: 3 tty 3 Async interface 00:00:59
%HA_EM-6-LOG: status_update.tcl: 10 vty 0 cpe idle 00:00:08 Hong-Kong
%HA_EM-6-LOG: status_update.tcl: 11 vty 1 cpe idle 00:00:04 Washington-DC
%HA_EM-6-LOG: status_update.tcl: * 12 vty 2 idle 00:00:00 EEM:status_update.tcl
Similar Messages
-
What is the best method to revert back to 'golden' image of database
Hello,
I want to have a 'golden' image of my test database and a way to keep reverting back to it after testers are done with testing. I want to set up a test database where testers can insert, update, and delete on objects as they like; after they have completed their testing, I want to revert the database back to the 'golden' image, i.e. the state of the database before the tests. I need to revert back to this 'golden' image many times over, as we have to perform many different tests.
I thought about export and import (9i database), but it takes 6 hours, which is too long. I tried using RMAN and was successful the first time (set until time; restore database; recover database; open resetlogs;). The second time using RMAN, I got an error saying the set until time was before the resetlogs time. I think I would have to restore the control file from before the resetlogs time frame before I can recover to the set until time again.
What is a straightforward method to have a 'golden' image so I can keep restoring back to it? This database is a test 9i database and it is in archive log mode, but it doesn't need to be. I just need a best-practice method, whether it is rman, exp/imp, or manual backup/restore; something I can use time and again. Thank you.

If it is a test environment, can you just take a COLD Backup (database files plus control files)? How big is the database? What platform is it on?
eg in a COLD Backup
a. If using a script, run multiple "cp" (or cpio/tar) commands in parallel to
copy different database files
b. If using RMAN (with the database in MOUNT), use multiple Channels.
Note : Remember to include ControlFile Backups and Restores as well. -
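As a sketch of point (a) above: with the database shut down cleanly, copy every datafile and control file in parallel. All paths in the usage note are placeholders for your own layout:

```shell
#!/bin/sh
# Cold-backup sketch: one background cp per file, then wait for all of
# them. Run this only after a clean SHUTDOWN IMMEDIATE.
cold_backup() {
    src=$1
    dst=$2
    mkdir -p "$dst" || return 1
    for f in "$src"/*.dbf "$src"/*.ctl; do
        [ -e "$f" ] || continue
        cp "$f" "$dst"/ &          # copy this file in the background
    done
    wait                           # return only when every copy is done
}

# Typical use (placeholder paths):
#   cold_backup /u01/oradata/TESTDB /backup/TESTDB_golden
# Reverting to the golden image is the same copy in the other direction.
```
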
Best method for timestamping? (for later use with perl script)
What is the best method that I can use to timestamp events in Linux for later use with perl script?
I am performing some energy measurements, where I am running several tasks separated by 20 seconds in between. Before I start any execution of tasks, I always place an initial delay so that I can start the script and start the measurement device.
My problem is that I don't know exactly how long that first delay is. So to solve this, I thought I could use date commands to timestamp all tasks, or at least to timestamp the first delay.
Here is example of what I am doing:
1st delay
task 1
20s
task 2
20s
task 3..... etc
What would be the best to use?

logger.
It posts messages straight to the system log. You can see the message, in all its glory using tools like journalctl. You will see the message, the date, time, host name, user name, and the PID of logger when it ran. -
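The date-plus-logger idea can be sketched like this. The task names are placeholders, and the 20-second sleeps between tasks are elided to keep the sketch short:

```shell
#!/bin/sh
# Sketch: stamp each event with epoch seconds so that the exact length
# of the first (unknown) delay can be computed afterwards, e.g. by a
# Perl script reading the output or the system log.
stamp() {
    t=$(date +%s)
    echo "$t $1"                              # keep a copy on stdout
    if command -v logger >/dev/null 2>&1; then
        logger -t energy_run "$1 epoch=$t"    # and one in the system log
    fi
}

stamp run_start     # pressing Enter here ends the unknown first delay
stamp task1_start
# ... run task 1, sleep 20, run task 2, and so on ...
stamp task2_start
```

The difference between the first two epoch values is the exact length of that first delay.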
Best method for archiving .mpp files on a separate server or location?
We want to be able to run a program or job on Project Server 2013 that will export all current published project .mpp files to a separate server or location on our network. What is the best, or suggested, method for something like this? Our managers want
to have this job run on a weekly or bi-monthly basis in order to have backup files of each active project schedule. This would be beyond the Project Server archive database. This would be for Business Continuity purposes of having those schedules available
should our servers ever crash.
Any help would be much appreciated. I am not a developer, but if there is code available for something like this we have developers in-house that can perform the work.
Thank you,
Travis
Travis Long, IT Project Manager Entry, Idaho Power Co., Project Server Admin

Project Server already has an archiving mechanism which backs up project plans on a schedule and maintains versions that can be restored at any point. Check administrative backup in Central Admin under PWA settings.
However, I wouldn't say this is the best method, but you can run a macro which would export all projects and save them at a location (could be a network file share). Something like this (I haven't tested it recently with 2013, but I believe it should work):
Sub Archiving()
Dim Conn As New ADODB.Connection
Dim Cmd As New ADODB.Command
Dim Recs As New ADODB.Recordset
'Connect to Project Server Reporting DB, Get Project Names
Conn.ConnectionString = "Provider=SQLOLEDB;Data Source=servername;Initial Catalog=ProjectServer_Reporting;User ID=; Password=; Trusted_Connection=yes"
Conn.Open
With Cmd
.ActiveConnection = Conn
.CommandText = "Select ProjectName From MSP_EpmProject_UserView"
.CommandType = adCmdText
End With
With Recs
.CursorType = adOpenStatic
.CursorLocation = adUseClient
.LockType = adLockOptimistic
.Open Cmd
End With
Dim prjName As String
Dim ArchivePrjName As String
Application.Alerts (False)
If Recs.EOF = False Then
Recs.MoveFirst
For x = 1 To CInt(Recs.RecordCount)
prjName = "<>\" & Recs.Fields(0) ' "<>" left as posted: your PWA server path goes here
FileOpenEx Name:=prjName, ReadOnly:=True
'Build the archive name from the bare project name rather than the full path
ArchivePrjName = "C:\Temp\" & Recs.Fields(0) & "_4Sep2014_9PM"
FileSaveAs Name:=ArchivePrjName, FormatID:="MSProject.MPP"
FileCloseEx
prjName = ""
Recs.MoveNext
Next x
End If
Recs.Close
Conn.Close
End Sub
Let us know if this helps
Thanks | epmXperts | http://epmxperts.wordpress.com -
Best method for RT communication
Hi all,
I am using an RT OS for my project, connected to the host PC. In both places I create global variables with the same names; for the DMM, for example, I create two global variables, one on the host and one on RT. I pass the info from the host one command at a time, process each command on RT, and send the data back to the host for displaying and reporting.
Can anyone tell me whether there is a better method for host-to-RT communication than global variables, or confirm that this is the best method?
Regards,
Solved!
Go to Solution.

Network Streams are definitely the easiest way to send data back and forth between RT and Windows. You need to create a read and a write endpoint on each side, but then the network stream handles all of the connection and reconnection for you, so it makes your life a lot easier.
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines -
Best method for networking with ubuntu linux
Hi,
I'm setting up an Ubuntu Linux fileserver, and I was wondering what the best method for filesharing with my Mac is. I'm currently running OS X 10.4.11, though I may be upgrading to 10.5 soon. I'll be running SMB networking for a couple of other computers, but I'd prefer something a bit more robust that can handle file permissions etc.

Mac OS X supports NFS out of the box, but the configuration isn't documented.
I recall Apple got rid of NetInfo Manager in Leopard, so the configuration will be different there; perhaps more Unix-like. Mac OS X supports the Unix Network File System (NFS); however, it leaves out the GUI.
This page shows you how to use NetInfo Manager:
http://mactechnotes.blogspot.com/2005/09/mac-os-x-as-nfs-server.html#c116822171340271068
NFS Manager can both setup NFS shares and connect to NFS shares.
http://www.bresink.com/osx/NFSManager.html
Once you figure out how NFS Manager configures the NFS shares, you can
use Applications > Utilities > NetInfo Manager to create more shares.
You will either have to coordinate Unix Userid number and Unix Group Id number or use the mapall option on the share.
To find out your Mac OS X userid and group id do:
applications > utilities > terminal
ls -ln
ls -l
# lists the NFS share on your mac
showmount -e localhost
#list NFS shares on a remote host
showmount -e remote-ip-address
Once you see what NFS Manager does, you will be able to use NetInfo Manager to manage a connection. In Mac OS 10.4 you can configure the /etc/exports control file. See man exports for the details. Before that you had to have the data in NetInfo manager. When Mac OS X came out, many common Unix control files were not present. Instead the data had to be in NetInfo manager. Over time Apple has added more and more standard Unix control files.
======
You do know about the need to match userids & groupids.
# display uid and gid
ls -ln
sudo find / -user short-user-name -exec ls '-l' {} \;
# on Mac OS X
You will need to go into NetInfo Manager, select users, and find your short-user-name. Change the uid and gid.
#on Linux look in
/etc/passwd
/etc/group
# with care...
# change 1000:20 to your values for uid:gid
sudo find / -user short-user-name -exec chown 1000:20 {} \;
The manual for Tenon MachTen UNIX (which Apple consulted when doing Mac OS X) says that one should create the file /etc/exports, which will cause portmap, mountd and nfsd to launch at startup via the /etc/rc file. The file must not contain any blank lines or comments, and each line has the syntax
directory -option[, option] hostlist
where 'directory' is the pathname of the directory that can be exported, and 'hostlist' is a space-separated list of hostnames that can access the directory. For example
/usr -ro foo bar
/etc
/Applications
/User/gladys gladys
The client then uses a command like
/sbin/mount -t type [-rw] -o [options] server:directory mount_point
where 'type' is 'nfs', 'server' the name of the server, 'directory' the
server directory mounted, and 'mount_point' the client mount point. See
'man mount' for details.
I haven't tried the above, but it would be nice to know if it works on Mac OS X.
Hans Aberg
This will give you some hints on NFS. Post back your questions.
Robert -
Best method for controlling Office 365 updates
We're looking for the best method for updating Office 365. We will be testing prior to releasing each version to the rest of the company. We have a couple of methods we're contemplating, but are looking for any pros or cons of each. We are also using SCCM 2012.
1. Run setup.exe setting the version and internal install source in an .xml file run as an SCCM package using distribution points as the install source.
2. Run click2runclient.exe with command lines setting the version and internal install source as an SCCM package using distribution points as the install source.
3. Set the version through group policy, turn on automatic updates, and don't specify an install source.
Option 3 appears to be the most straightforward with the least administrative overhead. Would it be possible to revert back to an earlier version using this method?
I have read various articles but am looking for any input as to what is or isn't working well for others.

Hi,
I would like to share this blog post with you, which provides an example of how to implement a fully automated testing and deployment process for Office 365 updates. This deployment method gives you the ability to test updates before you approve them in your environment.
The process might look like:
Deploy Office 365 in your environment with Office Deployment Tool, configure the "Updates" element in the configuration.xml file so that updates are enabled and the "UpdatePath" attribute points to an internal source.
Download the latest Office 365 build into a different internal source, configure your test machine to pick up builds from it.
After testing the updates, copy the updates to the first internal source.
You should be able to integrate the process with SCCM to reduce your administrative effort.
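As an illustration of step 1, the "Updates" element in configuration.xml might look like this. The UNC paths and product edition here are assumptions; substitute your own shares and SKU:

```xml
<Configuration>
  <Add SourcePath="\\server\O365\deploy" OfficeClientEdition="32">
    <Product ID="O365ProPlusRetail">
      <Language ID="en-us" />
    </Product>
  </Add>
  <!-- Enabled plus an internal UpdatePath means clients only pick up
       builds from the share you control, after you have tested them -->
  <Updates Enabled="TRUE" UpdatePath="\\server\O365\updates" />
</Configuration>
```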
Hope this helps.
Regards,
Ethan Hua
TechNet Community Support
It's also recommended to download and install the Office Configuration Analyzer Tool (OffCAT), which is developed by the Microsoft Support teams. Once the tool is installed, you can run it at any time to scan for hundreds of known issues in Office programs.
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected] -
We have 4000 transports...would like to know the best methods of transports
We have 4000 transports and would like to know the best methods of handling them. I understand that mass transports and consolidation of transports through the Transport Organizer can be of help. However, I would like to understand more about the following points:
1)In both the cases, how dependencies between transports will be managed.
2)How Sequencing will be managed?
3)Which is recommended and under what circumstances?
4)Pros and cons of each?
Thanks
Shamna

Have those transports already been transported (e.g. from Dev to Test)? If so, the transports from Test to Prod must be in the exact same sequence you transported them into Test, because only then will the coding status be exactly the same.
If there was no transport yet, then you usually transport in the sequence of the transport numbers. However, you don't know if there are dependencies between different transports unless you kept a kind of log.
In that case the safest way to transport everything is to combine all objects of all transports into one transport and then let that one run into Test and Prod. That way you would not have to worry about sequence problems where a newer version is overwritten by an older one.
However the big transport only works if all the objects in those 4000 transports are not locked in any open request. You can create that transport by using SE03 and choosing 'Merge Object Lists' - just paste all the transports into the selection field and execute. You don't have to do all 4000 at once but do it in packages.
Hope that helps,
Michael -
XML in Oracle 9i (best method to return a xml file from a table query)
Hello.
What is the best method to query a table, or set of tables (that will return thousands of rows) using xml in oracle? (best performance)
I'm currently using DBMS_XMLGen; is there a better method?

I think, if you're talking about generating XML, that you should use XMLElement, XMLForest, etc. to create your XML.
Let's assume that your base is relational data; then maybe the following great example will give you an idea how to do it: Re: Generate XML Schema from oracle tables
As michaels pointed out (did you read the link/URL given?), the general expectation is that the packages will become less and less important. So also maintenance-wise, the XMLElement etc. way will be the best and most maintainable method, also for the future.
I have the following:
Oracle 8 (8.0.5.2.6) Database running on a machine with Windows NT
And I want to accomplish the following:
Upgrade database to Oracle 9i (9.2.0.5) on a new machine running Windows Server 2003.
I am looking for advice on the best method to accomplish this considering that I am changing three variables: 1)the database version, 2)the operating system, and 3)the physical location of the database.
Thanks in advance for any advice anyone can provide.
Jason

It depends upon how you want to do the export and import: whether a full database export or just the schema containing the data, etc. Based on that, you will have to create the new database and perform the import. Below are some relevant documents:
Oracle® Database Upgrade Guide
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14238/toc.htm
Oracle® Database Upgrade Guide - Export/Import
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14238/preup.htm#sthref67 -
I'm moving to an iMac from a PC. I want to move my iTunes library from the PC to my new iMac. What is the best method, and where can I find instructions?
Move iTunes from PC to Mac: http://www.macworld.com/article/146958/2010/03/move_itunes_windows_mac.html
-
What is the best method for backing up photos in iPhoto?
I have over 10,000 photos in iPhoto and am looking for the best method for doing a backup (or an archive?). I'm now using iCloud, and it appears it's just photo streaming and does not have storage capability. External hard drive, copying to a DVD, other suggestions?
Most Simple Back Up
Drag the iPhoto Library from your Pictures Folder to another Disk. This will make a copy on that disk.
Slightly more complex:
Use an app that will do incremental backups. This is a very good way to work. The first time you run the backup, the app will make a complete copy of the Library. Thereafter it will update the backup with only the changes you have made, which makes subsequent backups much faster. Many of these apps also have scheduling capabilities, so set it up and it will do the backup automatically. Examples of such apps: ChronoSync or DejaVu, but there are many others. Search on MacUpdate.
My Routine
My Library lives on my iMac. It’s Backed up to two external hard disks every day. These disks are permanently attached to the iMac. These back ups run automatically. One is done by Time Machine, one is a bootable back up done by SuperDuper
It’s also backed up to a portable hard disk when ever new photos are added. This hard disk lives in my car. For security, this disk is password protected.
I have a second off-site back up at a relative’s house across town. That’s updated every 3 or 4 months.
My Photos are backed up online. There are many options: Flickr, Picasa, SmugMug etc. However, check the terms of your account carefully. While most sites have free uploading, you will often find that these uploads are limited in terms of the file size or the bandwidth you can use per month. For access that allows you to upload full size pics with no restrictions you may need to pay.
Every couple of months I test the back ups to make sure they are working correctly. It’s very easy to mis-configure a back up application, and the only way to protect against that is to do a trial restore. -
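If you prefer the command line to a dedicated app, the same incremental idea can be sketched with rsync. The paths in the usage note are placeholders, and you should quit iPhoto before running it; on Mac OS X you may also want Apple's -E flag to carry extended attributes along:

```shell
#!/bin/sh
# Incremental backup sketch: after the first full copy, rsync only
# transfers what changed, and --delete mirrors removals too.
backup_library() {
    src=$1
    dst=$2
    mkdir -p "$dst" || return 1
    # -a preserves permissions and timestamps; the trailing slashes
    # mean "copy the contents of src into dst"
    rsync -a --delete "$src/" "$dst/"
}

# Typical use (placeholder paths):
#   backup_library "$HOME/Pictures/iPhoto Library" \
#                  "/Volumes/BackupDisk/iPhoto Library"
```
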
What is the best method or service for backing up my digital files (catalog) in the Organizer from Photoshop Elements 12? Since the automatic Elements sync is no longer available, I do not know what to choose. I have tried to back this up using my external drive, but I cannot find the digital images per se; I see the entire program but not the catalog of pictures. Also, I have a Windows operating system, and Adobe Revel offers no edit capabilities with this OS.
I'm in a similar situation including movies I've purchased from iTunes...
Here's my setup:
I have all my iTunes data (music, movies, etc.) as well as about 10 GB of photos stored on a network storage device that uses RAID-5 with SATA disks. This is basically a little toaster-sized NAS machine with four hard drives in it (www.infrant.com). If one of these drives dies, I get alerted and can insert a new one without missing a beat, since the data is stored redundantly across all drives. I can also just yank out any one of the four drives while the thing is running and my data is still safe (I tried this after I bought it as a test).
That's how I prevent problems with a single disk... Redundancy
Now onto "backups"...
Nightly, my little RAID toaster automatically backs itself up to an attached 250GB USB drive. However, these backups are only of my critical data... documents, photos and home movies... I don't bother backing up my "Hollywood" movies since I can live without them in a disaster.
... I actually don't permanently store anything I care about on a laptop or a desktop. It all goes on the NAS box (Network Attached Storage) and then my other computers use it as a network drive. It's attached via Gigabit to my main computer and via wireless to most everything else.
My Achilles heel is that I don't store anything outside of my house yet. If I were smart, I'd alternate two USB drives attached to my NAS box and always keep one of them somewhere else (safe deposit box?).
...and that's just one way to do it.
-Brian -
2006 MacBook Pro died, best methods to transfer data to new MacBook Pro
So my Rev. A MacBook Pro 2.16 GHz Intel Core Duo finally died; it looks like something on the logic board. So I bought a brand new 2011 2.2 GHz MacBook Pro. I purchased an On-The-Go 2.5 inch enclosure from OWC (it's a USB device); the old hard drive is now in the enclosure, and I'm awaiting delivery tomorrow of the new MacBook Pro.
My question is, what would be the best method to transfer the old files? I want to start fresh with library and system files, as well as email IMAP files. My main concern is applications.
Can I simply drag the actual app from the old app folder to the new app folder? Will it bring along with it preferences and settings? Or do I need all the .DMG files to install the app into the new MBP?
Apps like; Audiohub, Connect360, Adobe Photoshop, Illustrator, Premiere Pro. honestly there are so many more that I can't remember right now.
Thanks in advance.

You can attempt to manually move these files, but if it doesn't work, then just migrating the whole user and deleting the items you don't want might be the way to go.
Copy the following files from the old HD to their matching location on the new computer and then restart the computer before you launch any of these applications:
Safari: Users / Your User / Library / Safari
iCal: Users / Your User / Library / Calendars
Addressbook: Users / Your User / Library / Application Support / AddressBook
iChat : Users / Your User / Library / Preferences / com.apple.ichat (there may be more than one of these files. copy them all over)
I don't use Skype so I don't know exactly where they store their files but checking in Users / Your User / Library / Skype would be a logical place to start. You can certainly ask in the Skype forum: http://forum.skype.com/
Adobe products tend to have bits and pieces scattered throughout. There may be a simple solution but I would suggest you inquire in the Adobe forums: http://forums.adobe.com/community/photoshop/photoshop_macintosh
Best of luck! -
F110: What is best method in voiding an EFT payment after check run?
What is the best method of voiding or reversing an EFT payment after a check run (F110)?
Thanks!

Have you figured it out?
If I remember correctly, we first find the clearing document number in the proposal log, then use FBRA to reset the clearing items, and then reverse the document.
I am also interested in setting a dummy check number for EFT; can anyone throw some ideas out here?
thanks
Keqin