Bash script to dumpstream many files simultaneously with mplayer

hi guys
i have a problem which i'm sure can be solved with the power of bash scripting
unfortunately i'm no bash scripting guru and all my experiments have failed so far
the problem:
i have a file containing links (streaming links)
mplayer offers the function to dump such a stream by simply issuing
mplayer -dumpstream mms://path/to/video -dumpfile video1
for example.
now i want mplayer to download the streams specified in the links file automatically.
basically all that's required is a bash script which goes through the link file and generates a command like mplayer -dumpstream <link> -dumpfile video<n>
(where n is the nth link) and executes it. maybe there are even simpler solutions
well since i'm not that experienced with bash scripting i can't solve this problem by myself...
i'm grateful for any help

hey guys
thx for the two scripts.
my approach was nearly the same as yours, kraluz, with the difference that mine doesn't work
but they both have a little blemish:
they download the files sequentially, not simultaneously
how could that be realised?
thx in advance
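To make the downloads simultaneous, put each mplayer call in the background with `&` and `wait` for them all at the end. A rough, untested sketch; the `DUMPER` variable is only a hypothetical hook so you can dry-run with `DUMPER=echo` before letting the real mplayer loose:

```shell
#!/bin/bash
# Sketch: one background mplayer per link in the links file (one URL per line).
# DUMPER is a hypothetical stand-in hook; it defaults to the real mplayer.
DUMPER=${DUMPER:-mplayer}

dump_all() {    # $1 = file containing the stream links
    local n=0 link
    while IFS= read -r link; do
        [ -z "$link" ] && continue       # skip empty lines
        n=$((n + 1))
        "$DUMPER" -dumpstream "$link" -dumpfile "video$n" &
    done < "$1"
    wait                                 # block until every dump has finished
    echo "$n streams done"
}
```

Then just run `dump_all links.txt`. Note that launching every stream at once can saturate your connection; a job-count limit would need extra logic.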

Similar Messages

  • Using Bash script to edit config file

    This is a really simple question, but given that I'm just learning Bash scripting, having this solved would be really illustrative for me, so I would really appreciate some help here.
    I'm using uzbl and running Tor+Polipo. As you will see below in the tail of the config file, there is a line to redirect the requests of uzbl through Polipo.
    # === Post-load misc commands ================================================
    sync_spawn_exec @scripts_dir/load_cookies.sh
    sync_spawn_exec @scripts_dir/load_cookies.sh @data_home/uzbl/session-cookies.txt
    # Set the "home" page.
    #set uri = https://duckduckgo.com
    # Local polipo proxy
    set proxy_url = http://127.0.0.1:8123
    # vim: set fdm=syntax:
    What I want to accomplish is to comment that line in/out with a key shortcut in Awesome. I've thought of doing 2 scripts and using 2 different key shortcuts, but I want to "toggle" the proxy redirection with only 1 shortcut. To do so, I suppose the script should go something like:
    if
    tool 'set proxy_url = http://127.0.0.1:8123' config_file
    then
    tool '#set proxy_url = http://127.0.0.1:8123' config_file
    else
    if
    tool '#set proxy_url = http://127.0.0.1:8123' config_file
    then
    tool 'set proxy_url = http://127.0.0.1:8123' config_file
    fi
    fi
    I know little about sed, but I think it's the tool for this job. The most intriguing part to me is how to ask sed to print the regular expression when it finds it in the config file, and use that as input in the conditional statement.
    Well, this is a mess I have made here. Hope there is a simple answer to this.
    Thanks in advance.

    You can do this with a single sed command:
    sed -i 's/^#set proxy_url/set proxy_url/;
    t end;
    s/^set proxy_url/#set proxy_url/;
    : end' config_file
    This edits the file in-place (-i) and first tries to replace the commented with the uncommented line. If that succeeds, sed jumps to the "end" label. If not, it tries to replace the uncommented with the commented line. Thus you don't have to include any logic about the current state: if the first substitution succeeds, the line was obviously commented; if not, it was uncommented, and the second substitution should succeed.
    Note that my knowledge of sed is very limited. There might be a simpler way to do this.
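If you want to convince yourself it toggles both ways, here is the same program wrapped in a small function you can run on a scratch copy; the labels are split into separate -e expressions, which some sed builds are pickier about (GNU sed's -i assumed):

```shell
#!/bin/bash
# Same toggle logic as the answer above, applied to the file given as $1.
toggle_proxy() {
    sed -i -e 's/^#set proxy_url/set proxy_url/' -e 't end' \
           -e 's/^set proxy_url/#set proxy_url/' -e ':end' "$1"
}
```

Running it twice on a file should leave the proxy line exactly as it started.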
    EDIT: For the sake of example, here's how to do the same in bash using regular expressions. Note how this script needs to use a temporary file to simulate in-place editing, how it needs to process the file line by line manually, etc. All things that sed does out of the box...
    #!/bin/bash
    tmp=test.conf.tmp
    echo -n "" > "$tmp"
    while IFS= read -r line; do
    if [[ "$line" =~ ^#set\ proxy ]]; then
    echo "${line/\#/}" >> "$tmp"
    elif [[ "$line" =~ ^set\ proxy ]]; then
    echo "#$line" >> "$tmp"
    else
    echo "$line" >> "$tmp"
    fi
    done < test.conf
    mv "$tmp" test.conf
    To answer your original question, the line
    if [[ "$line" =~ ^#set\ proxy ]]; then
    reads: if the line begins with a "#", followed by "set proxy", then...
    Last edited by hbekel (2011-03-20 10:40:16)

  • Simple BASH script to update subversion files

    This is just a simple BASH script that will update all .svn files in a specified directory.  If an update fails, it will attempt to update all the subdirectories in the failed one, so as much will be updated as possible.  Theoretically, you should be able to supply this script with only your root directory ( / ), and all the .svn files on your computer will be updated.
    #! /bin/bash
    # Contributor: Dylon Edwards <[email protected]>
    # ================================
    # svnup: Updates subversion files.
    # ================================
    #  If the user supplies no arguments
    #+ then, update the current directory
    #+ else, update each of those specified
    [[ $# == 0 ]] \
        && dirs=($PWD) \
        || dirs=($@)
    # Update the target directories
    for target in "${dirs[@]}"; do
        # If the target contains a .svn directory
        if [[ -d $target/.svn ]]; then
            # Update the target
            svn up "$target" || {
                # If the update fails, update each of its subdirectories
                for subdir in "$target"/*/; do
                    [[ -d $subdir ]] &&
                        ( svnup "$subdir" )
                done
            }
        # If the target doesn't contain a .svn directory
        else
            # Update each of its subdirectories
            for subdir in "$target"/*/; do
                [[ -d $subdir ]] &&
                    ( svnup "$subdir" )
            done
        fi
    done

    Cerebral wrote:
    To filter out blank lines, you could just modify the awk command:
    ${exec awk '!/^$/ { print "-", $_ }' stuffigottado.txt}
    very nice; awk and grep: two commands that never cease to amaze me.

  • Bash script to remove hidden files

    Hello, I wrote this script to clean hidden files out of folders
    #!/bin/bash
    files=$(find . \( -iname '*~' -o -iname '.*' \) -type f)
    #check for hidden files (empty output means none; wc -l would count the empty line as 1)
    if [ -z "$files" ]
    then
    echo "No hidden file found"
    exit 0
    fi
    #prompt to the user the found files
    echo "Hidden files found:"
    echo "$files"
    printf "Remove? [y/N] "
    read input
    input=$( echo $input | tr '[A-Z]' '[a-z]' )
    #take action if confirmed
    if [ "$input" == "y" ] || [ "$input" == "yes" ]
    then
    find . \( -iname '*~' -o -iname '.*' \) -type f -exec rm '{}' \;
    echo "Hidden files removed"
    else
    echo "Nothing was done"
    fi
    exit 0
    As you can see, I've not been able to do the deleting by reading from $files; I have to call the find command a second time. So the first improvement would be to get rid of that (while still handling spaces in file names correctly, of course).
    Other suggestions are welcome!

    falconindy wrote:Or really... just use one of the several well-formed examples in this thread.
    Yeah there's one solution using arrays formulated by Trilby, and improved by aesiris. Kind of reminds me of the episode Sheldon Cooper says, "Notify the editors of the Oxford English Dictionary: the word 'plenty' has been redefined to mean 'two'."
    As for your earlier post:
    falconindy wrote:
    No, this simply doesn't work because the entirety of the find result will be quoted as a single filename. You really should just do this all in find...
    find . -name '.*' ! -type d -exec rm -i {} +
    The + thing is a good example of saving resources; however, the OP doesn't want to prompt the user for authorization for every single file (rm -i). Maybe two invocations of find? But by the time the user has finished reading the file list, other files may have been created, and they would be removed without authorization.
    falconindy wrote:Manipulating IFS is entirely wrong for the purposes of reading a list out of a simple string variable (which is wrong on its own -- this is why arrays exist).
    I believe that calling a functioning, standards-compliant solution "entirely wrong" means to call it "not as clean as you would desire". So, back to the drawing board:
    files=$(find . -type f)
    echo "$files"|xargs -d$'\n' rm
    Although my earlier post proved one can work with IFS, you were right after all, it is not necessary. Hope you don't spot any further problems or my code will end up being a single word!
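For what it's worth, here is a rough sketch of the array route (bash 4.4+ for `mapfile -d ''`): find runs exactly once, and the very same snapshot is shown to the user and then removed, which also narrows the window in which newly created files could sneak in:

```shell
#!/bin/bash
# Sketch: snapshot the hidden files into an array, prompt once, remove.
clean_hidden() {    # $1 = directory to clean
    local -a files
    local input
    # NUL-delimited read handles spaces and newlines in names
    mapfile -d '' files < <(find "$1" \( -iname '*~' -o -iname '.*' \) -type f -print0)
    if [ "${#files[@]}" -eq 0 ]; then
        echo "No hidden file found"
        return 0
    fi
    echo "Hidden files found:"
    printf '%s\n' "${files[@]}"
    read -r -p "Remove? [y/N] " input
    case "${input,,}" in                 # lowercase the answer
        y|yes) rm -- "${files[@]}"; echo "Hidden files removed" ;;
        *)     echo "Nothing was done" ;;
    esac
}
```

Call it as `clean_hidden .` in place of the original script body.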

  • Script to print a file ending with specific characters

    Hi all
    Can anyone help me with a script that will automatically print to a laser printer any file, within various subfolders on a NAS, that ends in '_IMP.pdf'?
    total newb to scripting!
    thanks

    Folder actions don't handle subfolders well, and I forget whether they can cope with network drives. I'm sure you could make it work, but since we already have the script, here's how to modify it to do what you want. Save the following as a stay-open application (save it as an application and check the "application stays open after run handler" checkbox) and leave it running in the background. It will label files with a color as they are processed, which tells both you and the script which files have already been printed.
    property nameCue : "_IMP.pdf" -- file ending
    property folderPath : "/Path/To/Folder" -- posix path to folder that holds pdfs
    property intervalMinutes : 10 -- time in minutes between iterations
    property labelColor : 3 -- integer between 1 and 7; 3 is yellow by default
    on idle
      -- use an idle handler to run the recursive search and print periodically
              set filesToPrint to recursiveFolderSearch(folderPath)
              tell application "<printer name>" -- set this to the correct printer name
                        activate
                        print filesToPrint
              end tell
              return 60 * intervalMinutes
    end idle
    on recursiveFolderSearch(folderPath)
      -- a handler to do a recursive folder search
              tell application "System Events"
                        set folderItems to disk items of folder folderPath whose visible is true
                        set chosenFiles to {}
                        repeat with thisDiskItem in folderItems
                                  if class of thisDiskItem is folder then
      -- folder: recurse
                                            set chosenFiles to chosenFiles & my recursiveFolderSearch(POSIX path of thisDiskItem)
                                  else
      -- file: check file name for correct ending and then check to see if it's labeled
                                            if name of thisDiskItem ends with nameCue and my checkLabel(path of thisDiskItem) then
                                                      set end of chosenFiles to POSIX path of thisDiskItem
                                            end if
                                  end if
                        end repeat
              end tell
              return chosenFiles
    end recursiveFolderSearch
    on checkLabel(f)
      -- check if the file is labeled, which means it has already been printed.
      -- if it hasn't been printed, label it with the appropriate color and return true
      -- to tell the calling handler the file should be printed
              tell application "Finder"
                        if (label index of file f) = 0 then
                                  set label index of file f to labelColor
                                  return true
                        else
                                  return false
                        end if
              end tell
    end checkLabel

  • Can you utilize Time Capsule to backup Windows 7 files simultaneously with Time Machine backing up your MacBook?

    Can you use the Time Capsule 2TB to backup files from a PC (using the standard backup program in Windows 7) while also using the TC 2TB to run Time Machine to backup a MacBook?

    mackalan wrote:
    Can you use the Time Capsule 2TB to backup files from a PC (using the standard backup program in Windows 7) while also using the TC 2TB to run Time Machine to backup a MacBook?
    Yes, it will work fine, at least until the disk starts to fill up; then the expanding size of the sparsebundle can cause issues with the files from Windows. Pondini has made note of the issues:
    http://pondini.org/TM/TCQ3.html
    But I have to admit to doing precisely what you are talking about without problems, as long as you maintain space, plenty of space. Time Machine is designed to eventually use up all the space on the TC.
    You cannot partition the TC drive without opening the TC and voiding the warranty, but you can create a disk image and put your Windows backups inside it; that might help to prevent issues.

  • A bash script to backup system only with modified files

    Hi,
    I've made a simple bash script to examine which files are modified and need to be backed up.
    It will use:
    1. the hash in Pacman's database (/var/lib/pacman/local/<pkg_ver>/files)
    2. if there is no hash in the database, calculate it ourselves from the cache (/var/cache/pacman/pkg/<pkg_ver>.pkg.tar.gz)
    3. if no cache is found, compare the build date and the last modified date
    4. otherwise, the file had better be backed up.
    Currently it will only print which files need to be backed up; no backup actually happens (dry run).
    And it is only in early development stage, please be careful.
    USAGE:
    <the script name> <where to backup files, a directory> <the files, or directories, separated by space, that are going to be examined>...
    Here is the code.
    #!/bin/bash
    # usage:
    # $1: file to backup
    function do_backup() {
    echo
    }
    function smart_bak() {
    pkg=$(pacman -Qo "$1" 2>/dev/null)
    if [ 1 -eq $? ] ; then
    # No package owns $1 (locally created)
    if [ "$1" != "${1%.pacsave}" ] ; then
    echo "skip $1 (left by removed package)"
    else
    echo "backup $1 (local file)"
    fi
    else
    pkg=${pkg#$1 is owned by }
    # evaluate
    # by hash
    cur=$(md5sum "$1")
    cur=${cur%% *}
    pkg_ver=${pkg/ /-}
    org=$(grep "^${1:1}" "/var/lib/pacman/local/$pkg_ver/files")
    org_hash=${org##* }
    if [ "$org" != "$org_hash" ] ; then
    # the org hash is in Pacman's database
    if [ "$org_hash" = "$cur" ] ; then
    echo "skip $1 (original config)"
    else
    echo "backup $1 (modified config)"
    fi
    else
    # no hash
    # find out hash myself?
    ARCH=$(uname -m)
    if [ -r "/var/cache/pacman/pkg/$pkg_ver-$ARCH.pkg.tar.gz" ] ; then
    org=$(tar -Oxzf "/var/cache/pacman/pkg/$pkg_ver-$ARCH.pkg.tar.gz" "${1:1}" | md5sum -)
    org_hash=${org%% *}
    if [ "$cur" = "$org_hash" ] ; then
    echo "skip $1 (original)"
    else
    echo "backup $1 (modified)"
    fi
    else
    # no cache, may be an AUR package
    # fall back to built date?
    date=$(pacman -Qi ${pkg% *} | grep "^Build Date")
    date=${date#*: }
    date=$(date -d "$date" +%s)
    mod=$(ls -l $1)
    mod=${mod% $1}
    mod=${mod#* * * * * }
    mod=$(date -d "$mod" +%s)
    tmp1=$(expr $mod "+" 60)
    tmp2=$(expr $mod "-" 60)
    if [ "$date" -le "$tmp1" -a "$date" -ge "$tmp2" ] ; then
    echo "skip $1 (the same date)"
    else
    echo "backup $1 (unknown)"
    fi
    fi
    fi
    fi
    }
    function smart_bak_dir() {
    for i in $(ls -A "$1") ; do
    tmp="${1%/}/$i"
    if [ -f "$tmp" ] ; then
    smart_bak "$tmp"
    elif [ -d "$tmp" ] ; then
    smart_bak_dir "$tmp"
    else
    echo "skip $tmp (not a regular file nor a directory)"
    fi
    done
    }
    # usage:
    # $1: the directory to store this backup
    # $2: directory to evaluate for backup
    # function main()
    # init
    target="$1"
    shift
    # check
    if [ ! -d "$target" -o ! -x "$target" -o ! -w "$target" ] ; then
    exit 4
    fi
    for i in "$@" ; do
    if [ -f "$i" ] ; then
    smart_bak "$i"
    elif [ -d "$i" ] ; then
    smart_bak_dir "$i"
    else
    echo "skip $i (unknown argument)"
    fi
    done
    Good luck,
    bsdson
    Last edited by bsdson.tw (2008-12-30 07:53:05)

    Thanks for this script. Nice work.
    Can we traverse pacman.log backwards and roll back each operation (including "installed" ones)? Something like:
    history=$(tail -n +$(grep -m 1 -n "$1" "$LOG" | cut -d ":" -f 1) "$LOG" | tac | grep -E "(upgraded|installed)" | cut -d " " -f 3-)
    Last edited by g33k (2008-11-04 15:08:07)

  • Customize File Browse...upload many files at once

    Hi,
    I've searched the forums and this appears to not be possible using the HTMLDB app builder. Some threads I searched include :
    Upload Multiple Files
    Basically, our testing machine will create a folder for each test, inside of which are many files always with the same name. It'd be nice if the user only had to select either the folder, or one file in the folder, and any other files needed in that folder would be automatically uploaded by either HTMLDB or some custom code.
    There were some links as to how the guts of the File Browse... does its thing in adding files to the flow_files table, but many of the links were dead. JavaScript can do some custom error checking, but how about behind-the-scenes multiple file loading?
    Just curious if this has been implemented. Running Oracle 9.2.0.4, HTMLDB 1.6.1.00.02.
    Thanks much,
    Matt

    Matt,
    I'm not aware of any way to do this using HTML and Javascript only running in a browser. As such, I don't know of a way to do this using HTML DB. Thing is, web browsers take great care to protect your file system from Javascript running behind the scenes.
    Sergio

  • Bash script to pseudo multithread a encode job lame

    Conceptually, I don't see why a bash script couldn't run x simultaneous lame mp3 encodes, where x is the number of cores the user selects. I just don't know how to go about doing it. Would it make sense to read all the .wav files into an array, then divide the array into x sub-arrays and feed the work out, or is there a simpler method?
    EDIT: I found this blog entry but I don't fully understand it. 
    Here is my adaptation of his code:
    #!/bin/bash
    # the number of cores you have on your system
    cores=4
    encode() {
    for file; do
    fileout="${file%.wav}"
    lame -V 2 "$file" "$fileout.mp3"
    done
    }
    export -f encode
    find . -name '*.wav' -print0 | xargs -0 -n 1 -P $cores bash -c 'encode "$@"' --
    Can someone explain to me what the last bit does: 'encode "$@"' --
    Last edited by graysky (2010-03-01 22:17:13)

    The last bit runs the exported encode function in a new bash process, passing along the filenames that xargs feeds it.
    'encode "$@"' calls encode with all the positional parameters the new shell received (the filenames from find).
    The -- fills the $0 slot of bash -c, so the first filename lands in $1 instead of being consumed as the shell's name.
    man bash (look for the -c option) and man xargs for more info.
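You can watch the mechanics with a harmless stand-in for encode (printf instead of lame):

```shell
#!/bin/bash
# Demo: each argument xargs hands over arrives in "$@" of the inner
# shell; the "--" only occupies the $0 slot of bash -c.
demo() { for f; do printf 'got:%s\n' "$f"; done; }
export -f demo
printf 'a.wav\0b.wav\0' | xargs -0 -n 1 -P 2 bash -c 'demo "$@"' --
```

With -n 1 each inner shell gets a single file, and -P 2 lets up to two of them run at once, which is exactly how the lame version spreads work across cores.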

  • Bash script run via cron not executing MYSQL command

    I have a bash script which is run from cron.
    It is executing, as I have it write to a log file, which it does correctly.
    I want the bash script to restore a mysqldump file.
    When I run the bash script manually the dump file gets loaded fine, but when I run it through cron the mysql command appears to be ignored.
    The mysqldump file is 54MB and I have checked to make sure that mysql is included in the global user's path in /etc/profile.
    Does anyone know why this may be?
    Here is the bash file
    #!/bin/bash
    date >> /home/user/crons/crons.log
    echo "Started loadbackup" >> /home/user/crons/crons.log
    cd /home/user
    dbuser=root
    dbpass=password
    dbname=databasename
    filename=backup
    mysql -hlocalhost -u"$dbuser" -p"$dbpass" "$dbname" < "$filename" >> /home/user/crons/crons.log
    My crontab looks like
    02 17 * * * /home/user/crons/loadbackup.sh
    Many thanks
    Richard

    Hi Richard,
    Have you tried redirecting the script output in the cron to see if an error is being reported?
    I.e.
    02 17 * * * /home/user/crons/loadbackup.sh > /tmp/loadbackup.log 2>&1
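Also worth checking: cron runs scripts with a minimal environment (PATH is typically just /usr/bin:/bin, and /etc/profile is not read), so a mysql binary that lives elsewhere is silently not found. A small guard like this (a generic sketch, not from the original thread) makes that failure show up in the log instead of being ignored:

```shell
#!/bin/bash
# Sketch: log and bail out early when a needed command isn't in PATH,
# which is the classic "works manually, fails under cron" failure mode.
require_cmd() {
    command -v "$1" >/dev/null 2>&1 || {
        echo "$(date): $1 not found in PATH=$PATH" >&2
        return 1
    }
}
```

At the top of loadbackup.sh you would then add `require_cmd mysql || exit 1`, or simply set PATH explicitly in the script.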

  • Bash script to trim all filenames with special characters recursively?

    Hi,
    I have a 30 GB directory full of data I recovered from a friend's laptop after her Windows XP crashed. I'd like to burn that data, but I can't, because many of the filenames contain weird characters (spaces, accents, things even worse that my XTerm displays as inverted question marks). So mkisofs exits with an error.
    I'd like to clean that mess up, but it would take months to do that manually. Well, I only know a very little Bash, but I think this problem is already too heavy for my modest knowledge. Here's the problem:
    - check the contents of directory ~/backup recursively
    - find files whose filenames contain characters other than [A-Za-z0-9] and then maybe "-" and "_" and ".".
    - replace these characters either by an "_" or just erase them
    Now how would I translate that into a little Bash script?
    Cheers...

    Heyyyyy... nice idea
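Since no one posted a script, here is a rough, untested sketch of the idea. Try it on a copy first: two different names can collapse to the same cleaned name, file names containing newlines will confuse the read loop, and accented characters are replaced per byte (so a UTF-8 "é" becomes two underscores):

```shell
#!/bin/bash
# Sketch: rename everything under $1 so names only contain [A-Za-z0-9._-].
# -depth processes children before their parent directories, so renaming
# a directory never invalidates paths still waiting in the pipeline.
sanitize() {
    find "$1" -depth -mindepth 1 | while IFS= read -r path; do
        dir=$(dirname "$path")
        base=$(basename "$path")
        clean=$(printf '%s' "$base" | tr -c 'A-Za-z0-9._-' '_')
        if [ "$base" != "$clean" ]; then
            mv -- "$path" "$dir/$clean"
        fi
    done
}
```

Usage: `sanitize ~/backup`, then run mkisofs again.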

  • Bash script to replace markers in file

    Hi,
    I have a file of the form (testfile.txt):
    here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here
    $file1.txt$
    Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more $file2.txt$
    Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here Some text in here and more here
    $file3.txt$
    I also have text files with names of the form "filen.txt" which have Markdown tables in them.
    age   weight   sex 
    15    45       m   
    20    56       f   
    25    65       f   
    30    72       m   
    I want to run a script to search for all the markers, and "inject" the relevant table at the marker by opening the file of that name in the same directory and doing a replace with it.
    I have the "shell" of the script (sorry about the pun). I cannot however get it to do the injecting part.
    I've tried Awk, sed, perl....you name it.
    The script so far is:
    #!/bin/bash
    line=""
    places=""
    while read -r line; do
        places=$(awk '/\$*\$/ { print $0 }')
    done < testfile.txt
    cp testfile.txt try.txt
    echo "$places" | {
        while read line; do
            strip=$(echo $line | sed 's/^\$//')
            strip=$(echo $strip | sed 's/\$$//')
            curtext=$(cat $strip)
             cat try.txt | {
                while read line; do
                # Something in here to take $strip add a $ at each end and replace
                #+with $curtext
                done
                }
            done
        }
    I would really appreciate some help to fill in the area where the comment is. I am successfully getting all the other bits to work.
    Many thanks
    Mike

    sed gives me a headache.  It has its place in the world, but I would never use it unless I had no other option. You do. Here's the same thing in AppleScript:
    -- set main paths
    set master_file to "/path/to/master file.txt"
    set main_folder to "/path/to/folder/"
    -- read master text, and break it down around '$' marks
    set master_text to read master_file
    set text_chunks to tid({input:master_text, delim:"$"})
    -- loop through chunks, replacing file names with file contents
    repeat with this_chunk in text_chunks
              if this_chunk ends with ".txt" then
                        set contents of this_chunk to (read (main_folder & this_chunk))
              end if
    end repeat
    -- write it all back out to the master file
    set fp to open for access master_file with write permission
    write tid({input:text_chunks, delim:""}) to fp
    close access fp
    on tid({input:input, delim:delim})
      -- generic handler for text delimiters
              set {oldTID, my text item delimiters} to {my text item delimiters, delim}
              if class of input is list then
                        set output to input as text
              else
                        set output to text items of input
              end if
              set my text item delimiters to oldTID
              return output
    end tid
    A bit wordier, but so much easier on the brain.
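And for completeness, since the original question asked for bash: the injection itself can be sketched without sed at all, by treating any line that is exactly a $name$ marker as a cue to cat that file. This hypothetical helper assumes the table files sit in the current directory and that markers stand on their own line (like $file1.txt$ and $file3.txt$ in the example; the mid-line $file2.txt$ would need extra handling):

```shell
#!/bin/bash
# Sketch: copy $1 to stdout, replacing lines like $file2.txt$
# with the contents of file2.txt.
inject() {
    while IFS= read -r line; do
        case "$line" in
            \$*.txt\$)
                marker=${line#\$}      # strip leading $
                marker=${marker%\$}    # strip trailing $
                cat -- "$marker"       # inject the table file
                ;;
            *)
                printf '%s\n' "$line"
                ;;
        esac
    done < "$1"
}
```

Usage: `inject testfile.txt > try.txt`.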

  • [SOLVED] problem with spaces and ls command in bash script

    I am going mad with a bash script I am trying to finish. The ls command is driving me mad with spaces in path names. This is the portion of my script that is giving me trouble:
    HOMEDIR="/home/panos/Web Site"
    for file in $(find "$HOMEDIR" -type f)
    do
    if [ "$(dateDiff -d $(ls -lh "$file" | awk '{ print $6 }') "$(date +%F)")" -gt 30 ];
    then echo -e "File $file is $(dateDiff -d $(ls -lh "$file" | awk '{ print $6 }') "$(date +%F)") old\r" >> /home/panos/scripts/temp;
    fi
    done
    The dateDiff() function is defined earlier and the script works fine when I change the HOMEDIR variable to a path where there are no spaces in directory and file names. I have isolated the problem to the ls command, so a simpler code sample that also doesn't work correctly with path names with spaces is this:
    #!/bin/bash
    HOMEDIR="/home/panos/test dir"
    for file in $(find "$HOMEDIR" -type f)
    do
    ls -lh "$file"
    done
    TIA
    Last edited by panosk (2009-11-08 21:55:31)

    oops, brain fart. *flushes with embarrassment*
    -- Edit --
    BTW, for this kind of thing, I usually do something like:
    find "$HOMEDIR" -type f | while read file ; do something with "$file" ; done
    Or put those in an array:
    IFS=$'\n' ; files=($(find "$HOMEDIR" -type f)) ; unset IFS
    for file in "${files[@]}" ; do something with "$file" ; done
    The latter method is useful when the elements of "${files[@]}" will be used multiple times across the script.
    Last edited by lolilolicon (2009-11-09 08:13:07)
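One gotcha with the pipe-into-while form worth knowing: the loop runs in a subshell, so variables you set inside it vanish afterwards. Feeding the loop through process substitution (bash-only) keeps it in the current shell:

```shell
#!/bin/bash
# Demo: a count incremented in a pipe-fed loop is lost to a subshell;
# the process-substitution form keeps the count in the current shell.
count=0
printf 'a\nb\n' | while read -r f; do count=$((count + 1)); done
echo "after pipe: $count"        # still 0

count=0
while read -r f; do count=$((count + 1)); done < <(printf 'a\nb\n')
echo "after procsub: $count"     # 2
```

So if the body of the loop needs to accumulate results, prefer `done < <(find ...)` over `find ... | while read`.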

  • Passing Files/Folders to BASH script...

    I'm trying to pass a list of files and folders selected in finder to a bash script so that I can use VLC to perform some transcoding and concatenation on them.
    I'm using "Get Selected Finder Items" to pass the "Files/Folders" output "As Arguments" to my script.
    Problem is with spaces in the filenames. Apparently Automator delimits the list of files with spaces, so I have spaces in the filenames and spaces between the files. My bash script just sees a list like this:
    /Users/johnt/Documents/3rd Party Audio For Conversion/WFWC - 3.6.2009 3.7.2009/01c Director Report March 6 2009.wav /Users/johnt/Documents/3rd Party Audio For Conversion/WFWC - 3.6.2009 3.7.2009/01b Comm Disc Mar 6 2009.wav
    ...with no way to differentiate each file. Is there a way to tell Automator to escape the spaces in the file paths so that my loop doesn't try to run through $@ as:
    /Users/johnt/Documents/3rd
    Party
    Audio
    for
    ... etc etc
    Thanks

    Thanks Neil, that was helpful. I'll play with that AppleScript and see what happens.
    I figured this was better suited to the Automator forum since I'm trying to format the output of what Finder sends before it ever hits the Bash script. My Bash script itself has no problem, so I didn't see any point going over to the UNIX forums.
    You gotta love the ability of OS X to handle all sorts of technologies and integrate them well. Apparently, there's nothing Apple can do to make its users do the same thing.
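For reference, if the workflow uses a "Run Shell Script" action with "Pass input:" set to "as arguments", each path arrives as its own word in "$@", spaces included, and no escaping is needed. A sketch of the action body, with a printf standing in for the VLC transcoding step:

```shell
#!/bin/bash
# Sketch of a Run Shell Script action body: "$@" preserves each
# path as one word. process() stands in for the real VLC work.
process() {
    for f in "$@"; do
        printf 'processing: %s\n' "$f"
    done
}
```

In the actual action you would write the loop directly over "$@" rather than a function; the function form here just makes it easy to try out.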

  • BASH Script - Launch App with

    I'm trying to write a simple BASH script that will launch a program, but that program needs command line arguments.
    When I put it in quotes it says it can't find the file; if I don't use quotes it won't run the program with the command line arguments. How can I launch a program using a BASH script with command line arguments?
    Thanks in advance

    #!/bin/bash
    /Users/name/Desktop/Directory/app -f configfile
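The trap is quoting the whole command line: the shell then looks for a single file literally named "app -f configfile". Quote only the path (which may contain spaces) and keep each argument its own word. A demonstration with /bin/echo standing in for the app:

```shell
#!/bin/bash
# Demo: $app is a stand-in for /Users/name/Desktop/Directory/app.
# Quote the path, not the whole command line.
app="/bin/echo"
"$app" -n "launched with config"
```

If the path itself has spaces, e.g. "My Directory", the same rule applies: `"/Users/name/Desktop/My Directory/app" -f configfile`.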
