FTP log file generation failed in shell script

Hi All,
I am doing an FTP file transfer in a shell script, and the files are transferred into the corresponding directory. But when I try to check the FTP status through the log files, it gives a problem. Please check the code below.
for file in $FILENAME1
do
    echo "FTP File......$file"
    echo 'FTP the file to AR1 downstream system'
    ret_val=`ftp -n > $file.log <<E
#ret_val=`ftp -n << !
open $ar1_server
user $ar1_uname $ar1_pwd
hash
verbose
cd /var/tmp
put $file
bye
E`
    if [ -f $DATA_OUT/$file.log ]
    then
        grep -i "Transfer complete." $DATA_OUT/$file.log
        if [ $? -eq 0 ]; then
            #mv ${file}.log ${DATA_OUT}/../archive/$file.log_`date +"%m%d%y%H%M%S"`
            echo 'Log file archived to archive directory'
            #mv $file ${DATA_OUT}/../archive/$FILENAME1.log_`date +"%m%d%y%H%M%S"`
            echo 'Data file archived to archive directory'
        else
            echo 'FTP process is not successful'
        fi
    else
        echo 'log file generation failed'
    fi
done
It gives "syntax error: unexpected end of file" without giving the exact line number. Please help me on this.
Regards
Deb

Thanks for your reply.
Actually I made a mistake in the code. I had written these two lines:
ret_val=`ftp -n > $file.log <<E
#ret_val=`ftp -n << !
The backtick in the second (commented-out) line closed the command substitution opened by the first backtick, so the shell treated everything after it as new commands and the '#' line caused the error. I removed the second line and now it is working fine.
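For anyone hitting the same thing: inside backticks the shell still scans comment lines, so a backtick in a commented-out line terminates the substitution early. Using $(...) instead side-steps the problem entirely. A minimal runnable sketch of the corrected pattern, with cat standing in for the ftp session so no server is assumed:

```shell
#!/bin/sh
file="sample.dat"
# $(...) nests safely where backticks do not: the commented line below
# would have ended a `...` substitution, but is harmless here.
ret_val=$(
    # ret_val=`ftp -n << !`   <- the line that broke the original script
    cat <<E
226 Transfer complete.
E
)
# With real ftp this would be:  ret_val=$(ftp -n <<E ... E)
echo "$ret_val" > "$file.log"
if grep -qi "Transfer complete." "$file.log"; then
    echo "FTP successful for $file"
else
    echo "FTP failed for $file"
fi
```

The same grep test against the log file then works exactly as in the original script.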

Similar Messages

  • Log.0000000001: log file open failed

    I have been seeing an error off and on recently where my app will go along just fine writing to dbxml - and then for no apparent reason, blow up with
    log.0000000001: log file open failed: No such file or directory
    PANIC: No such file or directory
    When I go look - there is indeed no log.0000000001 in my dbxml directory.
    What is the story with log.0000000001? When is it created? What would cause this creation to fail? I have seen this problem on both an XP system and a Unix system.
    I think I have made this problem go away by manually creating an empty log.0000000001 file before I start my app - but this seems bogus.
    Any tips appreciated

    Hi,
    If you have multiple applications or processes using Berkeley DB XML (including our utility programs), you may have set up a separate log directory for your log files, or they may simply have been created in another directory. For this reason you may want to consider using a DB_CONFIG file and setting the location of your log files there.
    For more information about this please look at these references:
    http://www.sleepycat.com/docs/ref/env/db_config.html
    DB_CONFIG
    http://www.sleepycat.com/docs/api_c/env_set_lg_dir.html
    http://www.sleepycat.com/docs/api_c/env_set_data_dir.html
    An example for how to insert this information in a DB_CONFIG file is:
    set_data_dir datadir
    set_lg_dir logdir
    Regards,
    Ron Cohen
    Berkeley DB XML Support
    Oracle Corporation

  • Exchange 2010 SP3, RU5 - Massive Transaction Log File Generation

    Hey All,
    I am trying to figure out why one of our databases is generating 30k log files a day! The other one is generating 20k log files a day. The database does not grow in size as the log files are generated; the problem is the log file generation itself.
    I've tried running through some of the various solutions out there and reviewed message tracking logs, RPC Client Access logs, and IIS logs - all of which show important info, but none of which actually provide the answers.
    I stopped the following services to see if that would affect the log file generation in any way, and it has not:
    MS Exchange Transport
    Mail Submission
    IIS (Site Stopped in IIS)
    Mailbox Assistants
    Content Indexing Service
    With the above services stopped, I still see dozens (or more) log files generated in under 10 minutes. I also checked mailbox size reports (top 10) and found that several users' mailboxes were growing: one user's item count increased by about 300, and another's size by about 150MB (over the whole day).
    I am not sure what else to check here? Any ideas?
    Thanks,
    Robert

    Hmm - this sounds like a device is chewing up the logs.
    If you use Log Parser Studio, are there any stand-out devices in terms of the number of hits?
    And for ExMon, was that logged over a period of time? The default 60-second window normally misses a lot of stuff. Just curious!
    Cheers,
    Rhoderick
    Microsoft Senior Exchange PFE
    Blog:
    http://blogs.technet.com/rmilne 
    Rhoderick,
    Thanks for the response. When checking the logs, the highest number of hits were from the (source) load balancers' port 25 VIP. The problems I was experiencing were the following:
    1) I kept expecting the log file generation to drop to an acceptable rate of 10~20 MB per minute (max). We have a large environment and use the Exchange servers as the mail relays for the hated Nagios monitoring environment.
    2) We didn't have our enterprise monitoring system watching SMTP traffic; this is being resolved.
    3) I needed to look closer at the SMTP transport database counters, logs, and log files, and focus less on the database log generation. I did do some of that, but not enough.
    4) My troubleshooting kept getting thrown off because the monitoring notifications seemed to be sent out in batches (or something similar); stopping the transport service for 10~15 minutes several times seemed to finally "stop the transaction logs from growing at a psychotic rate".
    5) I am re-running my data captures now that I have told the "Nagios Team" to quit killing the Exchange servers with their notifications - sometimes as many as 100+ of the same notifications for the same servers and issues. So far, at a quick glance, the log file generation seems to have dropped by about 30%.
    Question: what would be the best counters to review in order to "put it all together"? Also note: our server roles are split, MBX and CAS/HT.
    Robert

  • Exchange 2010 personal archive database massive log file generation

    Exchange Server 2010 SP3 + Update Rollup 4
    Windows Server 2008 R2, all updates
    VMware ESXi 5.5
    Server config: 2 x Xeon Quad Core 2.20GHz, 16GB RAM
    We recently started using personal archives. I created a database for this purpose ("Archive Mailboxes") on the same datastore as our live mailbox database ("Live Mailboxes"). It works great except that the mailbox maintenance generates
    massive amounts of log files, over 220GB per day on average. I need to know why. The Live Mailbox database generates around 70GB of log files every day. The database sizes are: Live = 159.9GB, Archive = 196.8GB. Everything appears to be working fine, there
    are no Error events related to archiving. There are 10025 MSExchangeMailboxAssistant warning events logged every day. I have moved those mailboxes back-and-forth to temp databases (both Live and Archive mailboxes) and the 10025 events have not stopped so I'm
    reasonably certain there is no corruption. Even if there were it still doesn't make sense to me that over 100 log files are generated every single minute of the day for the Archive store. And it's not that the database isn't being fully backed up; it is, every
    day.
    Do I need to disable the 24x7 option for mailbox maintenance to stop this massive log file generation? Should I disable mailbox maintenance altogether for the Archive store? Should I enable circular logging for the Archive store (would prefer to NOT do this,
    though I am 100% certain we have great backups)? It appears to me that mailbox maintenance on the Live store takes around 12 hours to run so I'm not sure it needs the 24x7 option.
    This is perplexing. Need to find a solution. Backup storage space is being rapidly consumed.

    I'm sure it will be fine for maintenance to run only on weekends so I'll do that.
    We use Veeam B&R Enterprise 7.0.0.833. We do not run incremental backups during the day but probably could if necessary. All this is fine and dandy but it still doesn't explain why this process generates so many logs. There are a lot of posts around
    the internet from people with the same issue so it would be nice to hear something from Microsoft, even if this is expected behavior.
    Thank you for the suggestions!

  • iAS failed to start with "OPMN log file open failed" error message.

    Hi,
    My iAS doesn't start and I get this error message. Any advice? Thanks.
    opmnctl: starting opmn and all managed processes...
    OPMN log file open failed: /.../opmn/logs/ons.log (Read-only file system).
    opmnctl: opmn start failed

    Hi,
    Haven't you re-posted your question exactly one year later?
    Take a look here:
    iAS not running due to OPMN log file open failed: opmn/logs/ons.log (Read-o

  • Logging OWB mapping execution in Shell script

    Hi,
    I am executing a OWB mapping from a shell script like this
    $OWB_SQLPLUS MY_WAREHOUSE plsql MY_MAPPING "," ","
    I want to log this mapping execution process into a file.
    Please let me know if this will work:
    $OWB_SQLPLUS MY_WAREHOUSE plsql MY_MAPPING "," "," >> LOGFIL.log
    I will just be using this log file to track all the execution and use it for logging purpose.
    If this wont work, please tell me the proper way to do this...
    Thanks.

    Avatar,
    ">>" is the Unix operator that will redirect output and append to a particular file, so what you have should work if you're executing it from the shell prompt. Although I don't know specifically what OWB_SQLPLUS and MY_WAREHOUSE are.
    In my company, we have the call to the owb script inside another script. For example, file x contains the following line:
    sqlplus repository_user/pwd@database @sqlplus_exec_template.sql repository_owner location task_type task_name custom_params system_params
    Then at the prompt, we enter:
    nohup x > x.log &
    And the mapping or workflow executes.
    Jakdwh,
    Are you redirecting your output to a file so you can see why it's returning a '3'? The log file will usually tell you where the error occurred. I don't know what your input parameters for your mapping is, but the script is pretty picky about the date format. Also, even if you don't have any input parameters, the "," still has to be sent into the script.
    Hope this helps,
    Heather

  • How to write CLOB parameter in a file or XML using shell script?

    I executed an Oracle stored procedure using a shell script. How can I get the OUT parameter of the procedure (a CLOB) and write it to a file or XML in a UNIX environment using a shell script?

    SQL> var c clob
    SQL>
    SQL> begin
      2          select
      3                  DBMS_XMLGEN.getXML(
      4                          'select rownum, object_type, object_name from user_objects where rownum <= 5'
      5                  ) into :c
      6          from    dual;
      7  end;
      8  /
    PL/SQL procedure successfully completed.
    SQL>
    SQL> set long 999999
    SQL> set heading off
    SQL> set pages 0
    SQL> set feedback off
    SQL> set termout off
    SQL> set trimspool on
    // following in the script is not echo'ed to screen
    set echo off
    spool /tmp/x.xml
    select :c from dual;
    spool off
    SQL>
    SQL> --// file size
    SQL> !ls -l /tmp/x.xml
    -rw-rw-r-- 1 billy billy 583 2011-12-22 13:35 /tmp/x.xml
    SQL> --// file content
    SQL> !cat /tmp/x.xml
    <?xml version="1.0"?>
    <ROWSET>
    <ROW>
      <ROWNUM>1</ROWNUM>
      <OBJECT_TYPE>TABLE</OBJECT_TYPE>
      <OBJECT_NAME>BONUS</OBJECT_NAME>
    </ROW>
    <ROW>
      <ROWNUM>2</ROWNUM>
      <OBJECT_TYPE>PROCEDURE</OBJECT_TYPE>
      <OBJECT_NAME>CLOSEREFCURSOR</OBJECT_NAME>
    </ROW>
    <ROW>
      <ROWNUM>3</ROWNUM>
      <OBJECT_TYPE>TABLE</OBJECT_TYPE>
      <OBJECT_NAME>DEPT</OBJECT_NAME>
    </ROW>
    <ROW>
      <ROWNUM>4</ROWNUM>
      <OBJECT_TYPE>TABLE</OBJECT_TYPE>
      <OBJECT_NAME>EMP</OBJECT_NAME>
    </ROW>
    <ROW>
      <ROWNUM>5</ROWNUM>
      <OBJECT_TYPE>TABLE</OBJECT_TYPE>
      <OBJECT_NAME>EMPTAB</OBJECT_NAME>
    </ROW>
    </ROWSET>
    SQL>
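To drive a session like the one above from a shell script, the usual pattern is a heredoc piped into `sqlplus -s`, with the CLOB landing in the spool file. A minimal sketch of the wrapper pattern; `cat > /dev/null` stands in for `sqlplus -s user/pwd@db` so the sketch runs without a database, and the spool output is simulated:

```shell
#!/bin/sh
# Heredoc pattern: the SQL (the set/spool commands shown above) is fed on
# stdin; the CLOB contents end up in the spool file /tmp/x.xml.
# 'cat > /dev/null' is a stand-in for:  sqlplus -s user/pwd@db
cat > /dev/null <<'EOF'
set long 999999 heading off pages 0 feedback off trimspool on
spool /tmp/x.xml
select :c from dual;
spool off
EOF
# Simulate the spool file sqlplus would have produced:
printf '<?xml version="1.0"?>\n<ROWSET/>\n' > /tmp/x.xml
test -s /tmp/x.xml && echo "CLOB written to /tmp/x.xml"
```

With real sqlplus, the quoted heredoc ('EOF') also prevents the shell from expanding anything inside the SQL.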

  • Export dump file and log file  name as sysdate in script

    Hi to All,
    Can anybody help me make the logical backup export dump file name and log file name contain sysdate, so that we can uniquely identify them?
    Regards
    DXB_DBA

    On Windows it gets a bit hairy, as there really is no clean and nice way of doing it. There are a couple of options.
    1. If you can rely on the date format not changing, you can use a static substring expression. For example, the following might work with a Finnish locale: echo %date:~3,2%%date:~6,2%%date:~9,4%. Similarly, when you know the date format, you can tokenize the output of 'date /t' and discard the tokens you don't want.
    2. You can set dateformat to your liking and then just use %date% in your script
    3. You can run a "SELECT to_char(sysdate,..." into a file and then read that file in your script.
    4. Simon Sheppard also has a solution you could use as a basis. I have a slight issue with the approach, but that could just be me.
    5. Use gnuwin32 or similar ;)
    Also note that %date% env var is set automatically from w2k onwards, so some of the solutions might not work w/ older versions.
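On Unix, by contrast, embedding the date in the names is one line of command substitution. A sketch (the exp credentials and options are placeholders, and the command is only echoed, not executed):

```shell
#!/bin/sh
# Build export dump/log file names that contain the current date-time,
# so every run is uniquely identifiable.
STAMP=$(date +%Y%m%d_%H%M%S)
DUMPFILE="exp_full_${STAMP}.dmp"
LOGFILE="exp_full_${STAMP}.log"
# Placeholder credentials/options - only echoed here, not executed:
echo "exp system/*** full=y file=$DUMPFILE log=$LOGFILE" > exp_cmd.txt
cat exp_cmd.txt
```

Each run produces names like exp_full_20240101_120000.dmp, which sort chronologically as a bonus.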

  • How to pass video file name to run shell script

    Trying to make an automator script that will 1. Start a video capture, 2. Stop video capture, 3. Rename video, 4. Run Shell Script. Here is the shell script:
    for f in "$@"
    do
    echo "$f"
    /usr/bin/podcast --server my.podcast.server --user myself --pass mypass --submit --file $f --workflow "my workflow" --metadata /path/to/file
    done
    Question:
    Do I need to create a variable in the automator script to pass it to the script? If I need the variable do I also need to use both Set Value of Variable and Get Value of Variable? If I don't need a variable does $f need quotes around them?

    I'm not familiar with the terminal command /usr/bin/podcast, so I don't know what it does, but what I said is correct: to pass a file to this action you don't need any variables. The previous action in the workflow should output the video file you want to process; that's all. How you arrange that is up to you. And yes, keep the quotes - writing --file "$f" ensures file names containing spaces are passed as a single argument. It's clear that you have to fill in appropriate things in that shell script, like "my workflow" and the path to the metadata. If you have questions about how this shell action works, you should ask the people who made it.
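To see why the quotes matter, here is a minimal runnable sketch of the `for f in "$@"` loop from the question, with echo standing in for the /usr/bin/podcast call:

```shell
#!/bin/sh
# Iterate safely over file arguments, spaces included: quoting "$@" and
# "$f" keeps 'my video.mov' as a single argument throughout.
process() {
    for f in "$@"
    do
        echo "submitting: $f"
        # Real call from the workflow would go here, e.g.:
        # /usr/bin/podcast --server my.podcast.server ... --file "$f" ...
    done
}
process "clip1.mov" "my video.mov" > submitted.txt
```

Without the quotes around "$f", 'my video.mov' would be split into two arguments and the submit would fail.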

  • Moving files into directory using shell script

    Can someone tell me how to move files into directories using a *nix/Linux shell script?
    I have files which created from stored procedures using utl_file. The files name for example:
    DKH_104_12345
    DKE_101_42324242
    DKH_102_32432
    DKE_101_34553
    Then I create directories automatically for example:
    /oradata/apps/dmp/output/101
    /oradata/apps/dmp/output/102
    /oradata/apps/dmp/output/103
    /oradata/apps/dmp/output/104
    Using this procedure :
    CREATE OR REPLACE PROCEDURE Xorganize AS
      v_item     VARCHAR2(5);
      v_DirName  VARCHAR2(50);
      v_FileName VARCHAR2(50) := 'xorganize';
      v_FileExt  VARCHAR2(5)  := '.sh';
      v_ID       UTL_FILE.file_type;
      CURSOR res IS
        -- find the directory names from the table
        SELECT brn_cde FROM vcr_brn_cde ORDER BY 1;
    BEGIN
      -- used by the utl_file functions
      SELECT PRD_DIR INTO v_DirName
      FROM CR_SYS_PRM
      WHERE CLT_CDE = 'FIF';
      SELECT v_FileName || v_FileExt INTO v_FileName FROM dual;
      v_ID := UTL_FILE.FOPEN(v_DirName, v_FileName, 'w');
      utl_file.PUTF(v_ID, '%s\n', '@@echo OFF');
      utl_file.PUTF(v_ID, '%s\n', 'cls');
      utl_file.PUTF(v_ID, '%s\n', 'echo Reorganizing ...');
      OPEN res;
      LOOP
        FETCH res INTO v_item;
        EXIT WHEN res%NOTFOUND;
        utl_file.PUTF(v_ID, '%s\n', 'mkdir ' || v_item);
      END LOOP;
      CLOSE res;
      OPEN res;
      LOOP
        FETCH res INTO v_item;
        EXIT WHEN res%NOTFOUND;
        utl_file.PUTF(v_ID, '%s\n', 'move _' || v_item || '_.* ' || v_item || '\');
      END LOOP;
      CLOSE res;
      utl_file.PUTF(v_ID, '%s\n', 'FOR /F "usebackq delims=" %%1 IN (`dir /b *.`) DO @rd/q %%1');
      utl_file.PUTF(v_ID, '%s\n', 'cls');
      utl_file.PUTF(v_ID, '%s\n', 'echo Reorganizing ...Done');
      utl_file.fclose(v_ID);
    END;
    Everything works fine, BUT the script is generated as DOS/Windows commands.
    Now I need to run the script in a *nix/Linux shell, which I still can't do (because of my knowledge :p).
    And also, if the script were generated in a *nix/Linux shell version, I don't know how I would chmod +x the script from the stored procedure; I can't use the 'host' command in my tools.
    Thanks a lot
    -firman

    If you're using 9i then UTL_FILE.FRENAME() will execute something like a Unix mv command.
    If you want to do a chmod then you'll need to check out how to use a Java Stored Procedure to execute OS calls.
    Cheers, APC
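For what it's worth, a Unix shell version of the generated reorganize script could look like the sketch below. The directory codes and sample file names are illustrative; the real codes come from the vcr_brn_cde cursor, and the touch line only stands in for the files the stored procedures write:

```shell
#!/bin/sh
# Unix equivalent of the generated batch script: create one directory per
# code, then move the matching files into it.
# Sample files standing in for the utl_file output:
touch DKH_104_12345 DKE_101_42324242 DKH_102_32432 DKE_101_34553

echo "Reorganizing ..."
for code in 101 102 104    # illustrative; the real codes come from vcr_brn_cde
do
    mkdir -p "$code"
    # Files look like DKH_104_12345; match on the middle code field.
    mv ./*_"${code}"_* "$code"/ 2>/dev/null || true
done
echo "Reorganizing ... Done"
```

The `|| true` keeps the loop going for codes that happen to have no files yet.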

  • How to configure Log file generation

    Hi,
    I am in a migration project. Currently the OS is Unix; after the migration it is going to be Windows.
    So we want to change the log files from being created on Unix to being created on Windows.
    Can anyone suggest any settings in SAP for the log files?
    Regards,
    Gijoy

    Hi Gijoy,
    can you please reformulate your question for better understanding?
    The log location and tracing severity setup mechanism is platform independent.
    After migration there are no necessary steps to be taken; the logs will be created in the same way on Windows as on Unix, under your current SAP installation folder (e.g. the defaultTrace is on Unix under /usr/sap/.../j2ee/cluster/server<n>/log; on Windows this will be <DRIVE:>\usr\sap\...\j2ee\cluster\server<n>\log).
    I hope this answers your question.
    Best Regards,
    Ervin

  • Huge log file generation

    Hi,
    I have a report server. When I start the report server, the log file located at
    $ORACLE_HOME/opmn/logs/OC4J~Webfile2~default~island~1
    grows to more than 2GB in 24 hrs.
    Please tell me what may be the root cause of this and what a possible solution would be.
    Please, it's urgent.

    Hi Jaap,
    First of all Thanks.
    How do I set debugging off on the container?
    Some lines of the messages in the log are as follows:
    95178969 [AJPRequestHandler-ApplicationServerThread-90] ERROR com.allianz.weo.struts.StoredProcAction - SQLException while calling proc CUSTOMER.AZBJ_WEO_SECURITY.LOAD_MODULE_MENUS: ORA-01013: user requested cancel of current operation
    ORA-06512: at "CUSTOMER.AZBJ_WEO_SECURITY", line 107
    ORA-06512: at line 1
    95178969 [AJPRequestHandler-ApplicationServerThread-90] ERROR com.allianz.weo.struts.StoredProcAction - SQLException while calling proc CUSTOMER.AZBJ_WEO_SECURITY.LOAD_MODULE_MENUS: ORA-01013: user requested cancel of current operation
    ORA-06512: at "CUSTOMER.AZBJ_WEO_SECURITY", line 107
    ORA-06512: at line 1
    95178969 [AJPRequestHandler-ApplicationServerThread-90] ERROR com.allianz.weo.struts.StoredProcAction - SQLException while calling proc CUSTOMER.AZBJ_WEO_SECURITY.LOAD_MODULE_MENUS: ORA-01013: user requested cancel of current operation
    ORA-06512: at "CUSTOMER.AZBJ_WEO_SECURITY", line 107
    ORA-06512: at line 1
    95178969 [AJPRequestHandler-ApplicationServerThread-90] ERROR com.allianz.weo.struts.StoredProcAction - SQLException while calling proc CUSTOMER.AZBJ_WEO_SECURITY.LOAD_MODULE_MENUS: ORA-01013: user requested cancel of current operation
    ORA-06512: at "CUSTOMER.AZBJ_WEO_SECURITY", line 107
    ORA-06512: at line 1
    07/07/12 12:18:32 DriverManagerConnectionPoolConnection not closed, check your code!
    07/07/12 12:18:32 (Use -Djdbc.connection.debug=true to find out where the leaked connection was created)
    Regards,
    Sushama.

  • JDBC driver log file generation on v8i

    I want to know if there is a way to generate log files for JDBC driver transactions, similar to the sqlnet.log file which gets created when an OCI connection is used between the client and server.
    Where should this be done - on the client side or the server side? Is there a way to enable it through a server/client-side setting, without touching any native Java code?
    thanks

    You should ask your question in the JDBC forum found at:
    http://forums.oracle.com/forums/forum.jsp?forum=99

  • Oracle bat scripts on windows log file generation

    I want to generate a logfile/recording of this .bat file. The msglog file is not generating anything. Any idea, or any other suggestion for how I can generate a logfile? Like we can do in a crontab script >> script.log etc.
    set ORACLE_HOME=c:\oracle\10
    set ORACLE_SID=clin
    sqlplus ops/***** @ C:\u06\users\db\oracle\scripts\ops\datastore\scripts\main\main2xwk.sql msglog=opsshell3.log

    DBA2011 wrote:
    "i am hoping to log all the action main2xwk.sql does, in one single logfile"
    And exactly how do you anticipate that passing sqlplus the string 'msglog=opsshell3.log' will accomplish that? What do you expect sqlplus to do with that?
    Let's deconstruct your .bat file.
    set ORACLE_HOME=c:\oracle\10
    set ORACLE_SID=clin
    The above two lines simply set a couple of environment variables, to be used by some process running in the same environment in whatever manner said process chooses. Since your next step executes sqlplus, we know it may choose to use them, and in fact it does.
    sqlplus ops/***** @ C:\u06\users\db\oracle\scripts\ops\datastore\scripts\main\main2xwk.sql msglog=opsshell3.log
    The above line tells the OS to locate an executable file named 'sqlplus' and make available to it the character string 'ops/***** @ C:\u06\users\db\oracle\scripts\ops\datastore\scripts\main\main2xwk.sql msglog=opsshell3.log' to use as it (sqlplus) sees fit. We know from the SQL*Plus docs that sqlplus will parse that out as follows, with a space as the delimiter.
    "ops/*****"
    This will be broken down into username 'ops' and password '*****' and used to issue a connection request to a local database instance identified by the value of the environment variable ORACLE_SID.
    "@ C:\u06\users\db\oracle\scripts\ops\datastore\scripts\main\main2xwk.sql"
    Well, if you had closed the space between '@' and 'C:', sqlplus would have attempted to locate the file 'C:\u06\users\db\oracle\scripts\ops\datastore\scripts\main\main2xwk.sql' and process it. However, since you seem to have a space there (all I did was copy and paste the code you posted), it probably just gave up and did nothing, because it found nothing appended directly to the '@', and the strings that follow the space after '@' have no special meaning at all to sqlplus.
    "msglog=opsshell3.log"
    If you had closed the space after the '@', and sqlplus was able to locate and process 'main2xwk.sql', it would simply have used the string 'msglog=opsshell3.log' as the first command-line substitution parameter, to be used by main2xwk.sql however it was written to use it. Was main2xwk.sql written to accept a command-line parameter? See http://docs.oracle.com/cd/B19306_01/server.102/b14357/ch5.htm#sthref1080
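Separately, if the goal is just one log file containing everything the script prints, redirect at the OS level instead of relying on a msglog parameter. A hedged Unix-flavoured sketch; run_sql is a hypothetical stand-in, and with real sqlplus remember there must be no space between '@' and the script path:

```shell
#!/bin/sh
# Capture everything a command prints (stdout and stderr) in one log file.
# 'run_sql' is a hypothetical stand-in for something like:
#   sqlplus ops/***** @C:\u06\users\db\oracle\scripts\ops\datastore\scripts\main\main2xwk.sql
run_sql() {
    echo "main2xwk.sql: processing complete"
    echo "ORA-00000: sample message" >&2
}
run_sql > main2xwk.log 2>&1
```

On Windows the same `> main2xwk.log 2>&1` redirection works in a .bat file.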

  • PDF File as Attachment thru Shell Script

    Hello there,
    there is a requirement that we need to send the output (PDF format) of an RDF report as an attachment to the concerned users. I have a main concurrent program from which I submit another concurrent program for the RDF. Once the output is generated into its respective path, I pass that path to the UNIX host concurrent program (which is also submitted from the main concurrent program, after the report's).
    The host concurrent program is picking up the file (for e.g. o435678.out) and sending that output to the users' email inboxes as text (in an unreadable format), but not as a PDF attachment. I want to know if there is any way the file can be sent as a PDF attachment. What else needs to be done, either in the shell program or somewhere else? Please let me know.
    Your help appreciated
    Thank you

    What is the application release? OS?
    "The host concurrent program is picking up the file (for e.g. o435678.out) and sending that output to the users' email inboxes as text, but not as a PDF attachment."
    What is the extension of the attached file? Can you open the file if you change the extension to PDF? If yes, just rename the extension to PDF and try again.
    Thanks,
    Hussein
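If renaming alone doesn't help, the classic shell-level approach is to uuencode the PDF into the mail body so it arrives as an attachment rather than inline text. A sketch only: the file name o435678.out comes from the thread, the printf line stands in for the real report output, and the uuencode/mailx line is left commented since mail tool availability varies by system:

```shell
#!/bin/sh
# Give the concurrent request output a .pdf name so mail clients treat it
# as a PDF; the .out file already contains the PDF data.
REQ_OUT="o435678.out"                      # example request output name
printf '%%PDF-1.4 sample\n' > "$REQ_OUT"   # stand-in for the real report output
PDF="${REQ_OUT%.out}.pdf"
cp "$REQ_OUT" "$PDF"
# Then send it as a real attachment instead of inline text, e.g.:
# uuencode "$PDF" "$PDF" | mailx -s "Report output" user@example.com
echo "prepared $PDF"
```

Modern systems may prefer mutt or mailx -a for MIME attachments, but uuencode is the traditional portable route.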
