Bash script to remove hidden files
Hello, I wrote this script to clean folders of hidden files:
#!/bin/bash
files=$(find . \( -iname '*~' -o -iname '.*' \) -type f)
# no hidden files found? ("wc -l" would report 1 even for empty output)
if [ -z "$files" ]
then
echo "No hidden file found"
exit 0
fi
# show the user the files that were found
echo "Hidden files found:"
echo "$files"
printf "Remove? [y/N] "
read -r input
input=$(echo "$input" | tr '[:upper:]' '[:lower:]')
# take action if confirmed
if [ "$input" = "y" ] || [ "$input" = "yes" ]
then
find . \( -iname '*~' -o -iname '.*' \) -type f -exec rm '{}' \;
echo "Hidden files removed"
else
echo "Nothing was done"
fi
exit 0
As you can see, I haven't managed to delete by reading from $files; instead I have to call the find command a second time. So the first improvement would be to get rid of that second call (while still handling spaces in file names correctly, of course).
Other suggestions are welcome!
falconindy wrote:Or really... just use one of the several well-formed examples in this thread.
Yeah, there's one solution using arrays, formulated by Trilby and improved by aesiris. Kind of reminds me of the episode where Sheldon Cooper says, "Notify the editors of the Oxford English Dictionary: the word 'plenty' has been redefined to mean 'two'."
As for your earlier post:
falconindy wrote:
No, this simply doesn't work, because the entirety of the find result will be quoted as a single filename. You really should just do this all in find...
find . -name '.*' ! -type d -exec rm -i {} +
The + form is a good example of saving resources; however, the OP doesn't want the user prompted for authorization for every single file (rm -i). Maybe two invocations of find? But by the time the user has finished reading the file list, other files may have been created, and they would be removed without authorization.
falconindy wrote:Manipulating IFS is entirely wrong for the purposes of reading a list out of a simple string variable (which is wrong on its own -- this is why arrays exist).
I believe that calling a functioning, standards-compliant solution "entirely wrong" really means calling it "not as clean as you would like". So, back to the drawing board:
files=$(find . -type f)
echo "$files" | xargs -d $'\n' rm
Although my earlier post proved one can work with IFS, you were right after all: it is not necessary. I hope you don't spot any further problems, or my code will end up being a single word!
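For later readers, here is a sketch of how the array approach mentioned above can satisfy the original request: collect find's matches NUL-delimited into a bash array, show them, then delete exactly those entries. There is no second find call, and names with spaces (or even newlines) survive. This is my own paraphrase, not Trilby's or aesiris's exact code, and it needs bash 4 for ${input,,}:

```shell
clean_hidden() {
    local files=() f input
    # -print0 plus read -d '' keeps arbitrary filenames intact
    while IFS= read -r -d '' f; do
        files+=("$f")
    done < <(find . \( -iname '*~' -o -iname '.*' \) -type f -print0)

    if [ "${#files[@]}" -eq 0 ]; then
        echo "No hidden file found"
        return 0
    fi

    echo "Hidden files found:"
    printf '%s\n' "${files[@]}"
    read -r -p "Remove? [y/N] " input
    case "${input,,}" in               # ${var,,} lowercases (bash 4+)
        y|yes) rm -- "${files[@]}"; echo "Hidden files removed" ;;
        *)     echo "Nothing was done" ;;
    esac
}
```

The array is both the list shown to the user and the exact argument list passed to rm, so nothing created between the listing and the confirmation can be deleted by accident.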
Similar Messages
-
How do I find and remove hidden files
For some reason, after the recent install of SL my 1TB drive is extremely bloated. I have deleted or moved just about everything possible from the disk (excluding system files and applications) and it still shows that I only have 94.08 GB of space available.
Is there a good piece of software out there that will help me find out whether there are several hidden files, and remove what is unnecessary?
Wow ... that was enlightening. I discovered that somehow there is a hidden folder on the 1TB drive entitled "volumes", and within that folder is a subfolder entitled "Mega Drive" (my self-chosen name for the 1TB boot drive), which is an exact duplicate of my previous hard drive.
A little more about how I got here. I had taken the computer in to the Apple Store, after upgrading to SL, for what I initially thought was a hardware issue. The Genius felt it was software and did an erase and install. When I got the machine back I tried to use Migration Assistant to copy files from my Time Machine drive; it took all night, so I wasn't awake when it finished. I suspect that is where this folder came from, as the next day I could not find the information on my computer, so I started manually moving the files from that drive.
The bottom line is this ... is it safe to trash this entire folder? -
Using Bash script to edit config file
This is a really simple question, but given that I'm just learning Bash scripting, having this solved now would be really illustrative for me, so I would appreciate some help here.
I'm using uzbl, and running Tor+Polipo. So, as you will see below in the tail of the config file, there is a line to redirect the requests of uzbl through Polipo.
# === Post-load misc commands ================================================
sync_spawn_exec @scripts_dir/load_cookies.sh
sync_spawn_exec @scripts_dir/load_cookies.sh @data_home/uzbl/session-cookies.txt
# Set the "home" page.
#set uri = https://duckduckgo.com
# Local polipo proxy
set proxy_url = http://127.0.0.1:8123
# vim: set fdm=syntax:
What I want to accomplish is to comment that line in or out with a key shortcut in Awesome. I've thought of writing 2 scripts and using 2 different key shortcuts, but I want to "toggle" the proxy redirection with only 1 shortcut. To do so, I suppose the script should go something like:
if
tool 'set proxy_url = http://127.0.0.1:8123' config_file
then
tool '#set proxy_url = http://127.0.0.1:8123' config_file
else
if
tool '#set proxy_url = http://127.0.0.1:8123' config_file
then
tool 'set proxy_url = http://127.0.0.1:8123' config_file
fi
fi
I know little about sed, but I think it is the tool for this job. The most intriguing part to me is asking sed to print the regular expression when it finds it in the config file, and using that as input to the conditional statement.
Well, this is a mess I have made here. Hope there is a simple answer to this.
Thanks in advance.
You can do this with a single sed command:
sed -i 's/^#set proxy_url/set proxy_url/;
t end;
s/^set proxy_url/#set proxy_url/;
: end' config_file
This edits the file in-place (-i) and first tries to replace the commented with the uncommented line. If that succeeds, sed jumps to the "end" label. If not, it tries to replace the uncommented with the commented line. Thus you don't have to include any logic about the current state: if the first substitution succeeds, the line was obviously commented; if not, it was uncommented, and the second substitution should succeed.
Note that my knowledge of sed is very limited. There might be a simpler way to do this.
EDIT: For the sake of example, here's how to do the same in bash using regular expressions. Note how this script needs to use a temporary file to simulate in-place editing, how it needs to process the file line by line manually, etc. All things that sed does out of the box...
#!/bin/bash
tmp=test.conf.tmp
: > "$tmp"
while IFS= read -r line; do
if [[ "$line" =~ ^#set\ proxy ]]; then
echo "${line/\#/}" >> "$tmp"
elif [[ "$line" =~ ^set\ proxy ]]; then
echo "#$line" >> "$tmp"
else
echo "$line" >> "$tmp"
fi
done < test.conf
mv "$tmp" test.conf
To answer your original question, the line
if [[ "$line" =~ ^#set\ proxy ]]; then
reads: if the line begins with a "#", followed by "set proxy", then...
Last edited by hbekel (2011-03-20 10:40:16) -
How do I see and remove hidden files?
I copied some .jpg files to an SD card to pop into the SD card slot on my new Panasonic plasma tv.
When I set up a slide show, every other file says "cannot read file 01". All the pictures are there, but every other entry is one of these hidden files (I assume). There seems to be one of these extra hidden files for each picture file.
It's really annoying watching a slide show this way.
The files don't show in a finder window.
Is there a way to see these files and delete them so they won't show in my slideshow?
Thanks for any info or help you can provide.
Len in NC
Are you copying these files from the Mac to a PC-formatted SD card? If so, the invisible files to which you refer are the result of filesystem incompatibilities. The structure of a Mac file means that the file's resource fork gets stripped off and saved as a separate file on the PC-formatted disk.
There are utilities that can "clean" files for use on PCs:
GrimReaperCM
Remove MacOS Junk
BlueHarvest
WinFSCleanser
Look for them at VersionTracker or MacUpdate. -
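If you'd rather avoid installing a utility, the same cleanup can usually be done from Terminal. A hedged sketch (the helper name is mine, and the mount point is an assumption; adjust it to your card's actual name):

```shell
# Hypothetical helper: delete the "._*" AppleDouble companion files the Mac
# leaves alongside each photo on a FAT-formatted card.
# $1 is the mount point of the card.
remove_appledouble() {
    find "$1" -name '._*' -type f -delete
}

# e.g.: remove_appledouble /Volumes/SDCARD
```

macOS also ships a dot_clean command that merges or removes these ._* files for a given directory.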
Bash script to dumpstream many files simultaneously with mplayer
Hi guys,
I have a problem which I'm sure can be solved with the power of bash scripting.
Unfortunately I'm no bash scripting guru, and all my experiments have failed so far.
The problem:
I have a file which contains links (streaming links).
mplayer offers the function to dump such a stream by simply issuing
mplayer -dumpstream mms://path/to/video -dumpfile video1
for example.
Now I want mplayer to download the streams specified in the links file automatically.
Basically all that's required is a bash script which goes through the links file, generates a command like mplayer -dumpstream <link> -dumpfile video<n> (where n is the nth link) and executes it. Maybe there are even simpler solutions.
Well, since I'm not that experienced with bash scripting, I can't solve this problem by myself...
I'm grateful for any help.
Hey guys,
thanks for the two scripts.
My approach was nearly the same as yours, kraluz, with the difference that it doesn't work.
But they both have a little blemish:
they download the files sequentially, not simultaneously.
How could that be realised?
Thanks in advance -
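A sketch of the simultaneous version being asked for: start one mplayer dump per link in the background with "&", then wait for all of them to finish. The function name and the links-file layout (one URL per line) are assumptions:

```shell
# Download every stream listed in a links file at the same time.
# $1 is the file of URLs, one per line.
dump_all() {
    local link n=0
    while IFS= read -r link; do
        [ -z "$link" ] && continue                        # skip blank lines
        n=$((n + 1))
        mplayer -dumpstream "$link" -dumpfile "video$n" & # run in background
    done < "$1"
    wait    # block until every background dump has finished
}

# e.g.: dump_all links.txt
```

Be aware that this starts as many mplayer processes as there are links; for a long list you may want to batch them a few at a time.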
Export to thumb drive and remove hidden files for Pandigital frame
I have a Pandigital photo frame. I find from searching online that there is zero support for Mac users. The frame mounts in the Finder, then disappears after a short while. Searching the web, I have found (supposedly) that one must export the files from iPhoto to an MS-DOS-formatted thumb drive. After that, hidden files written by the Mac on the thumb drive must be removed using Terminal. I'm not sure whether the directions I found at these sites are correct or complete. I am a novice at Terminal, so maybe something is not clear to me on these sites. Can anyone help with transfer from iPhoto to a Pandigital frame? The sites are:
http://www.studiolighting.net/pandigital-frames-tout-128mb-internal-memory/
http://discussions.apple.com/thread.jspa?threadID=1276407
This search
http://www.macupdate.com/search.php?keywords=resource+fork&os=mac
has links to many other apps that will remove the resource fork from these files. I don't mean to be obtuse but I don't often have reason to use these apps and so I'm slow to give specific assistance on them. You are best to read the Help on these apps or contact their support.
Regards
TD -
Simple BASH script to update subversion files
This is just a simple BASH script that will update all .svn files in a specified directory. If an update fails, it will attempt to update all the subdirectories in the failed one, so as much will be updated as possible. Theoretically, you should be able to supply this script with only your root directory ( / ), and all the .svn files on your computer will be updated.
#! /bin/bash
# Contributor: Dylon Edwards <[email protected]>
# ================================
# svnup: Updates subversion files.
# ================================
# If the user supplies no arguments
#+ then, update the current directory
#+ else, update each of those specified
[[ $# == 0 ]] \
&& dirs=("$PWD") \
|| dirs=("$@")
# Update the target directories
for target in "${dirs[@]}"; do
# If the target directory contains a .svn entry
if [[ -d $target/.svn ]]; then
# Update the target
svn up "$target" || {
# If the update fails, update each of its subdirectories
for subdir in "$target"/*/; do
[[ -d $subdir ]] &&
( svnup "$subdir" )
done
}
# If the target directory doesn't contain a .svn entry
else
# Update each of its subdirectories
for subdir in "$target"/*/; do
[[ -d $subdir ]] &&
( svnup "$subdir" )
done
fi
done
Cerebral wrote:
To filter out blank lines, you could just modify the awk command:
awk '!/^$/ { print "-", $_ }' stuffigottado.txt
very nice; awk and grep: two commands that never cease to amaze me. -
Bash scripting, check filetype with file command
Hi community,
I've got homework to write a script which uses the file command to check whether a file is really a text file, and if so, echoes its name.
I wrote the following
#!/bin/bash
for filetype in *
do
a= file $filetype | cut -f2 -d' '
b="ASCII"
if [ "$a"="$b" ]
then
echo "$filetype"
fi
done
however, this does not work.
It just echoes the filetype and all the filenames.
directory
Desktop
Bourne-Again
existe
gzip
kernel26-one-2.6.27-3-i686.pkg.tar.gz
Bourne-Again
list
Bourne-Again
pidof
empty
ps
Bourne-Again
realjpg
Bourne-Again
taille
ASCII
temp.txt
directory
themes
Any ideas?
Why is the filetype echoed, when I declare it as a variable?
Why does the condition thing not work?
Basically, what am I missing?
Thanks for the resource.
here's the link
http://tldp.org/LDP/abs/html/
Thanks also for the answers, they were really helpful.
ok, I wanted to post another problem I had but while I was pasting my code in here I realised my mistake. -
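For the record, the two fixes the homework script above needs: command output must be captured with $(...) (writing "a= file ..." instead runs file with a temporarily empty a, which is why the file types were printed), and [ needs spaces around = ("$a"="$b" is a single non-empty word, so the test is always true, which is why every filename was echoed). A corrected sketch as a function:

```shell
# Echo the name of every entry in the current directory that file
# describes as ASCII text.
list_ascii() {
    local f a
    for f in *; do
        a=$(file "$f" | cut -f2 -d' ')   # $(...) captures file's output
        if [ "$a" = "ASCII" ]; then      # spaces around = are required
            echo "$f"
        fi
    done
}
```

Grabbing the second space-separated word of file's "name: description" output is fragile (it breaks on filenames containing spaces), but it matches the approach the exercise started from.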
A bash script to backup system only with modified files
Hi,
I've made a simple bash script to work out which files have been modified and need to be backed up.
It will use:
1. the hash in Pacman's database (/var/lib/pacman/local/<pkg_ver>/files)
2. if there is no hash in the database, calculate it ourselves from the cache (/var/cache/pacman/pkg/<pkg_ver>.pkg.tar.gz)
3. if no cache is found, compare the build date and the last-modified date
4. otherwise, the file had better be backed up.
Currently it will only print which files are needed to be backed up, but no backup actually happens. (dry run)
And it is only in early development stage, please be careful.
USAGE:
<the script name> <where to backup files, a directory> <the files, or directories, separated by space, that are going to be examined>...
Here is the code.
#!/bin/bash
# usage:
# $1: file to backup
function do_backup() {
echo
}
function smart_bak() {
pkg=$(pacman -Qo "$1" 2>/dev/null)
if [ 1 -eq $? ] ; then
# No package owns $1 (locally created)
if [ "$1" != "${1%.pacsave}" ] ; then
echo "skip $1 (left by removed package)"
else
echo "backup $1 (local file)"
fi
else
pkg=${pkg#$1 is owned by }
# evaluate
# by hash
cur=$(md5sum $1)
cur=${cur%% *}
pkg_ver=${pkg/ /-}
org=$(grep "^${1:1}" "/var/lib/pacman/local/$pkg_ver/files")
org_hash=${org##* }
if [ "$org" != "$org_hash" ] ; then
# the org hash is in Pacman's database
if [ "$org_hash" = "$cur" ] ; then
echo "skip $1 (original config)"
else
echo "backup $1 (modified config)"
fi
else
# no hash
# find out hash myself?
ARCH=$(uname -m)
if [ -r "/var/cache/pacman/pkg/$pkg_ver-$ARCH.pkg.tar.gz" ] ; then
org=$(tar -Oxzf "/var/cache/pacman/pkg/$pkg_ver-$ARCH.pkg.tar.gz" "${1:1}" | md5sum -)
org_hash=${org%% *}
if [ "$cur" = "$org_hash" ] ; then
echo "skip $1 (original)"
else
echo "backup $1 (modified)"
fi
else
# no cache, may be a AUR package
# fall back to built date?
date=$(pacman -Qi ${pkg% *} | grep "^Build Date")
date=${date#*: }
date=$(date -d "$date" +%s)
mod=$(ls -l $1)
mod=${mod% $1}
mod=${mod#* * * * * }
mod=$(date -d "$mod" +%s)
tmp1=$(expr $mod "+" 60)
tmp2=$(expr $mod "-" 60)
if [ "$date" -le "$tmp1" -a "$date" -ge "$tmp2" ] ; then
echo "skip $1 (the same date)"
else
echo "backup $1 (unknown)"
fi
fi
fi
fi
}
function smart_bak_dir() {
for i in $(ls -A "$1") ; do
tmp="${1%/}/$i"
if [ -f "$tmp" ] ; then
smart_bak "$tmp"
elif [ -d "$tmp" ] ; then
smart_bak_dir "$tmp"
else
echo "skip $tmp (not a regular file nor a directory)"
fi
done
}
# usage:
# $1: the directory to store this backup
# $2: directory to evalualte for backup
# function main()
# init
target="$1"
shift
# check
if [ ! -d "$target" -o ! -x "$target" -o ! -w "$target" ] ; then
exit 4
fi
for i in "$@" ; do
if [ -f "$i" ] ; then
smart_bak "$i"
elif [ -d "$i" ] ; then
smart_bak_dir "$i"
else
echo "skip $i (unknown argument)"
fi
done
Good luck,
bsdson
Last edited by bsdson.tw (2008-12-30 07:53:05)
Thanks for this script. Nice work.
Can we traverse pacman.log backwards and roll back each operation (treating "installed" the same way)? Something like:
history=$(tail -n +$(grep -m 1 -n "$1" "$LOG" | cut -d ":" -f 1) "$LOG" | tac | grep -E "(upgraded|installed)" | cut -d " " -f 3-)
Last edited by g33k (2008-11-04 15:08:07) -
Need a script to delete user files and run on logout.
Hi folks!
I'm a total Mac novice so forgive me if I'm vague on anything here. I work in a library and we have recently acquired an iMac running Mountain Lion. The unit is primarily for use by Visually Impaired users however sighted folks can use it as well.
What I need is something that will delete any files created/downloaded by the sighted users but that will not delete anyone else's files. I tried just using the Guest Account however that also deletes all the system preferences for that account including the dock setup which I need to stay the same.
If possible I would like this to run on logout.
For additional points anyone who can also find me something that will run automatically when the users sign in and will sign them out after 30 minutes would super spectacularly awesome.
Thanks people!That being said, if you you still really want to do this you can create a logout hook and write a shell script to remove the files. See "About Daemons and Services", Appendix B on writing a logout hook.
But a far easier way is to create Account's login item that will cause a shell script to be launched to delete the files. You can encapsulate the shell script so it runs as an application that can be added to the Login Items. One such encapsulator is Platypus.
Note, rather than delete them, if you want to ensure all the specific plists have specific settings, then create a master set of them and copy the master set into the Preferences directory (being careful to observe ownership and permissions settings of course). -
Bash script run via cron not executing MYSQL command
I have a bash script which is run from cron.
It is executing, as I have it write to a log file, which it does correctly.
I want the bash script to restore a mysqldump file.
When I run the bash script manually, the dump file gets loaded fine. But when I run it through cron, the mysql command appears to be ignored.
The mysqldump file is 54 MB, and I have checked to make sure that mysql is included in the global users' path in /etc/profile.
Does anyone know why this may be?
Here is the bash file
#!/bin/bash
date >> /home/user/crons/crons.log
echo "Started loadbackup" >> /home/user/crons/crons.log
cd /home/user
dbuser=root
dbpass=password
dbname=databasename
filename=backup
mysql -hlocalhost -u"$dbuser" -p"$dbpass" "$dbname" < "$filename" >> /home/user/crons/crons.log
My crontab looks like
02 17 * * * /home/user/crons/loadbackup.sh
Many thanks
Richard
Hi Richard,
Have you tried redirecting the script output in the cron to see if an error is being reported?
I.e.
02 17 * * * /home/user/crons/loadbackup.sh > /tmp/loadbackup.log 2>&1 -
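One likely cause worth checking (an assumption, not confirmed in the thread): cron runs with a minimal environment and does not read /etc/profile, so a PATH set there never reaches the script. Setting PATH explicitly in the crontab rules this out:

```
PATH=/usr/local/bin:/usr/bin:/bin
02 17 * * * /home/user/crons/loadbackup.sh > /tmp/loadbackup.log 2>&1
```

Alternatively, call mysql by its absolute path inside the script (whatever "which mysql" reports in an interactive shell).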
A simple script that removes older packages from cache dir
Hello all,
I wrote a small bash script that removes packages (from the cache dir /var/cache/pacman/pkg) that are older than the ones available in the repositories (e.g. if the current version of pacman is 2.6.3, it will remove pacman-2.6.2 from the cache dir). It implements the functionality of Debian's 'apt-get autoclean'.
Get it from http://www.kegep.tuc.gr/~manolis/archlinux/cleanold.sh
Thanks a lot for sharing this
-
How to download file using ftp in bash script
Hi! I'm running a bash script in Solaris and I want, within the script, to download a file using ftp.
How can I do it?
Thanks a lot.
Hello, evgchech,
please try this way:
1. In the bash script, use the following command:
ftp -n < ftpcmdfile
2. In ftpcmdfile (which is a file), code the interactive commands of FTP, such as:
user anonymous [email protected]
cd /var/sun/download
bi
mget *.*
bye
try it and good luck!
Wang Yu
Developer Technical Support
Sun Microsystems
http://sun.com/developers/support -
File size explodes after removing hidden information
After removing hidden content in Acrobat Pro X (Tools/remove hidden content or Tools/sanitize) the file size increases, up to 10x (2 MB -> 20 MB). Save as optimized PDF only partly reduces the problem, file is still 3-4x the initial size. Document was created with Adobe PDF writer, but this problem occurs in principle also with other PDFs. Bug? Necessary? Workaround? Any help is appreciated. Thanks in advance!
Any one found a solution to this problem?
I'm having it too, acrobat xi pro 11.0.10...
Original file size before cropping ~27MB
Same file saved as optimized PDF before cropping ~12MB
Original file size after cropping ~27MB
Same file saved as optimized PDF after cropping ~12MB
Original file saved as some other file name after removing hidden information ~60MB
Same file saved as optimized PDF after cropping AND after removing hidden information ~60MB
What on earth is going on?