Write/Read cluster with FTP and DataSocket

I am trying to save a cluster to a file on my RT target from my host machine via FTP and DataSocket. I can use the DS examples between host and target, but when I connect a cluster to either DS Write or DS Read, the VI stops with error 42 (Generic error). The help for DataSocket does not mention any restriction concerning the data type. Is this a bug or a feature?
I found a workaround by saving the cluster to a local file and transferring it to the RT target with the FTP VIs, but the DataSocket solution would be much simpler.
LabVIEW 8.6.1
Attachments:
clusterDS.jpg ‏24 KB
Error42.jpg ‏12 KB

Hi,
I found the reason for the generic error: the file to write or read has to have the extension "dsd" (or "wav"); otherwise you get the error. With a .dsd extension I was able to save the cluster. I have not yet managed to read it back, but at least DS Read no longer aborts with the generic error. The interesting thing is that if you use an extension other than .dsd when writing, an empty file is still created on the target system.
Attachments:
DSWriteCluster.vi ‏7 KB
DSReadCluster.vi ‏10 KB

Similar Messages

  • Error when Write/read Cluster in binary file.

    hi all,
    I implemented code (based on the LabVIEW example) to save and read cluster data (see the attached file). The write part works properly, but when I read the file back there is an error: "Error 4 occurred at Read from Binary File in main_save_data_V2.vi; LabVIEW: End of file encountered".
    However, the data open correctly. Why is there such a message? Any suggestions?
    thank you.
    Cedric
    Attachments:
    main_save_data_V2.vi ‏19 KB

    Instead of predicting how many clusters to read, just set the count in the binary read to -1.  That will read all that it can and I got no error.  Here is all you need in order to read the file.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines
    Attachments:
    Read clusters.png ‏12 KB

  • Slow Connection with FTP and SSH after a migration from 10.3.9 to 10.4.6

    Hi all,
    I just moved my server from 10.3.9 to 10.4.6.
    I imported all my users and settings.
    When I try to connect to my server using FTP or SSH, the connection is slow.
    I have to wait around 45 seconds before connecting to the server (I never had to wait with the old 10.3.9).
    Example of a connection and its delays:
    I enter: ssh myuser@myserverip
    server: Password: (26 seconds after I hit the return key!!)
    server: Last login: Tue May 9 11:57:03 2006 from XX.XX.XX.XX (2 seconds after)
    server: Welcome to Darwin!
    Later logins are faster than the first one, presumably because the server caches some data about recently connected users.
    I have the same problem with FTP.
    Thanks for the help.

    I have this same problem.
    I'm not sure how to set up or tweak DNS. Could someone point to where this is accomplished?
    I'm not using a DNS server inside our firewall because I'm not sure how it works. I have DNS service turned off on the OS X Server.
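    A hedged note, not from the original thread: a fixed pause like this before the Password: prompt is usually sshd doing a reverse DNS lookup on the connecting client and waiting for it to time out. If that is what is happening here, adding reverse (PTR) records for the client addresses to your DNS, or setting "UseDNS no" in /etc/sshd_config on the server and restarting SSH, normally removes the delay; the FTP daemon does a similar lookup, which would explain the same wait there.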

  • Oracle Cluster, Oracle Cluster with RAC, and Oracle 10g

    Is there a difference between Oracle Cluster and Oracle Cluster with RAC? Please explain. Does existing database code run unmodified in a Cluster or Cluster-with-RAC environment? What needs to be modified to make existing SQL code RAC-aware? How can failover be made fully automatic, so that queries from a failed instance are resubmitted to a running instance?
    In a 10g environment, do we need to consider licensing RAC as a separate product? What additional features does 10g provide that are not in Cluster + RAC?
    Your comments and pointers to comparison studies or diagrams would be very helpful.

    An Oracle cluster (like Fail Safe before, Veritas Cluster, or another vendor's cluster) is meant for HA (high availability): two or more nodes can see a shared disk, with one active node. Whenever the active node fails, the other machine learns of it through the heartbeat and takes the database over from there.
    Oracle RAC is for both HA and load balancing. In Oracle RAC, two or more nodes access the database at the same time, so the load is spread across all of these nodes.
    I believe Oracle 10g RAC still needs a separate license, but you should call Oracle or check the product documentation to verify that.
    Besides the improvements in RAC, Oracle 10g's main improvement is the built-in management of the database itself. It can monitor and self-tune itself to a much greater degree than before, and it gives the DBA much more information to determine the cause of a problem. There are also improvements to many utilities, such as RMAN and Data Pump. I don't want to go into too much detail here; check the 10g new-features documentation for a more detailed view.
    Hope this helps. :)

  • Questions about reading/writing with output and input streams

    I'm getting a bit confused on this subject. I have written an array of doubles to a file, and now I need to know how to read them back in to another array. I have also wrapped each double into its own object, but I need to know how to write those to a file, and read the doubles contained in them back into an array.
    Here is what I have so far; the method that writes the array of doubles to a file.
    public void writePrimDoubles() {
        DataOutputStream outStream;
        System.out.println("Now writing the values to file...");
        try {
            outStream = new DataOutputStream(new FileOutputStream("file.dat"));
        } catch (IOException ex) {
            System.out.println("Error opening file");
            return;
        }
        try {
            // write each value as text, one per line
            for (int i = 0; i < arraySize; i++)
                outStream.writeBytes(numberArray[i] + "\n");
            outStream.close();
        } catch (IOException e) {
            System.out.println("Error writing to file");
        }
    }

    /**
     * Writes all doubles in the given array to the file 'file.dat'.
     */
    public void writePrimDoubles(double[] array) throws IOException {
        DataOutputStream out = new DataOutputStream(new FileOutputStream("file.dat"));
        for (int i = 0; i < array.length; i++) {
            out.writeDouble(array[i]);
        }
        out.close();
    }

    /**
     * Reads all stored doubles back from the file 'file.dat'.
     * Because each double is written as an 8-byte value, you get the number
     * of doubles stored in the file by dividing its file size by 8.
     * Alternatively you might first store every value read as a Double in a
     * list (e.g. java.util.ArrayList). After reading all values you could then
     * create a new double[] with the list's size (double[] array = new double[list.size()];).
     * This alternative is shown, commented out, at the end of the method.
     */
    public double[] readPrimDoubles() throws IOException {
        File file = new File("file.dat");
        DataInputStream in = new DataInputStream(new FileInputStream(file));
        double[] array = new double[(int) (file.length() / 8)];
        for (int i = 0; i < array.length; i++) {
            array[i] = in.readDouble();
        }
        in.close();
        return array;

        /* Alternative when the element count is not known in advance:
        ArrayList list = new ArrayList();
        try {
            while (true) {
                list.add(new Double(in.readDouble()));
            }
        } catch (EOFException e) {} // thrown when the end of the file is reached
        in.close();
        double[] array = new double[list.size()];
        for (int i = 0; i < list.size(); i++) {
            array[i] = ((Double) list.get(i)).doubleValue();
        }
        return array;
        */
    }
    For further information on the classes you have to use, have a look at the relevant API documentation (http://java.sun.com/j2se/1.4.2/docs/api/). There you can find all the methods, constructors, and fields accessible for each class.
    have fun...
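    The original question also asks how to write the doubles that were wrapped in objects and read them back into an array; the code above only covers the primitive values. A small, hedged sketch of one way to do that with Java object serialization (the class name, file name, and sample values here are made up for illustration, not taken from the thread):
    import java.io.*;

    public class DoubleObjectFileDemo {
        public static void main(String[] args) throws IOException, ClassNotFoundException {
            Double[] wrapped = { new Double(1.5), new Double(2.25), new Double(-3.0) };

            // write the whole array of Double objects in one call
            ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("objects.dat"));
            out.writeObject(wrapped);
            out.close();

            // read it back and unwrap the values into a primitive array
            ObjectInputStream in = new ObjectInputStream(new FileInputStream("objects.dat"));
            Double[] readBack = (Double[]) in.readObject();
            in.close();

            double[] primitives = new double[readBack.length];
            for (int i = 0; i < readBack.length; i++) {
                primitives[i] = readBack[i].doubleValue();
            }
            System.out.println("read " + primitives.length + " values");
        }
    }
    Because the whole array is serialized in one writeObject call, there is no need to track the element count in the file.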

  • Write/Read Database with "Jeffrey"-VIs

    Hello,
    I want to write/read an SQL database without the NI toolkit.
    While searching here I found the tools from Jeffrey, which are available here: http://jeffreytravis.com/lost/index.html
    Now I have unpacked the files and wanted to try the example "Example - Fetch a Table.vi".
    A demo database called "SampleDatabase.mdb" is also included.
    My current problem is that I don't know what to set for the "connection string" parameter (somehow the filename, but how?).
    Does anyone know how to use these files?
    Thanks for your help.
    Attached is the downloaded zip file with all the data.
    Attachments:
    LabSQL-1.1a.zip ‏1132 KB

    I tried to do what the readme says, but I think there is a problem with my Windows installation or something is missing.
    a. Go to your Windows Control Panel and open "ODBC Data Sources"
    b. Click on the "System DSN" tab
    c. Click on the "Add..." button.
    d. From the list of drivers, choose "Microsoft Access Driver"
    No problem up to this point, but after I select the "Microsoft Access Driver" and press "Fertig stellen" (Finish), the window closes and everything looks as before; no dialog box appears (see the attached screenshot), so I can't continue with points e-g.
    e. At the dialog box, type in "myDB" for the Data Source Name. Then click on the "Select..." button and find the file "Sample Database.mdb" included with the LabSQL examples. Leave everything else as it is, and hit OK.
    f. Close the ODBC control panel
    g. Test the connection by running one of the examples provided.
    Is there a way to reinstall this part of Windows?
    Thanks for your help
    Attachments:
    empty.jpg ‏76 KB
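    A hedged note on the original connection-string question (the path below is illustrative, not from the thread): LabSQL hands the string to a Microsoft ADO Connection underneath, so once the "myDB" data source from the steps above exists, the connection string can simply be
        DSN=myDB
    Alternatively, the Access file can be opened directly, without any ODBC data source, through the Jet OLE DB provider:
        Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\LabSQL\Examples\SampleDatabase.mdb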

  • Scripting with FTP and HTTP

    Hi All,
    To help us with future planning, we would like to get a feel for how many developers are using the FTP and HTTP objects that are available with scripting in CS3 (through webaccesslib). If you are using them could you send me a quick email describing how you use the component? My email address is [email protected]
    Thanks in advance.
    Alan Morris
    Dev Tech Engineer
    Adobe Systems

    Yeah, this is so aggravating!
    Adobe builds all of these cool ideas, then doesn't test them.
    The HTTPConnection object does not do POST at all. I have tried nearly every possibility. The documentation is either way off or the object just does not work. I can see the post in raw form, and the POST variables are not coming across.
    After working on this for a few hours I thought to myself: hey, maybe I should just create a Flash pane instead, load the files into it, and have the Flash object upload. Well, I ran into a big fat wall there too! As it is with PatchPanel and Bridge, these technologies only accept SWF objects. This whole concept of using SWF and cross-scripting has a huge flaw: the SWF security model does not allow local file access for doing simple things like upload. If I can't synchronize file data to web-based clouds, then I can't do much worth talking about.
    I love these products and their possibilities, but I have to have the ability to communicate with the world. HTTP is the way!
    Also a side note: FTP is an insecure/inflexible solution, and it looks like a lot more time was spent on this aspect of the scriptable product.
    PLEASE HELP ADOBE!!!!!

  • ASA 8.0 VPN cluster with WEBVPN and Certificates

    I'm looking for advice from anyone who has implemented or tested ASA 8.0 in a VPN cluster using WebVPN and the AnyConnect client. I have a stand alone ASA configured with a public certificate for SSL as vpn.xxxx.org, which works fine.
    According to the config docs for 8.0, you can use a FQDN redirect for the cluster so that certificates match when a user is sent to another ASA.
    Has anyone done this? It looks like each box will need 2 certificates, the first being vpn.xxxx.org and the second being vpn1.xxxx.org or vpn2.xxxx.org depending on whether this is ASA1 or ASA2. I also need DNS forward and reverse entries, which is no problem.
    I'm assuming the client gets presented the appropriate certificate based on the http GET.
    Has anyone experienced any issues with this? Things to look out for migrating to a cluster? Any issues with replicating the configuration and certificate to a second ASA?
    Example: Assuming ASA1 is the current virtual cluster master and is also vpn1.xxxx.org. ASA 2 is vpn2.xxxx.org. A user browses to vpn.xxxx.org and terminates to ASA1, the current virtual master. ASA1 should present the vpn.xxxx.org certificate. ASA1 determines that it has the lowest load and redirects the user to vpn1.xxxx.org to terminate the WebVPN session. The user should now be presented a certificate that matches vpn1.xxxx.org. ASA2 should also have the certificate for vpn.xxxx.org in case it becomes the cluster master during a failure scenario.
    Thanks,
    Mark

    There is a bug associated with this issue: CSCsj38269. Apparently it is fixed in the interim release 8.0.2.11, but when I upgraded to 8.0.3 this morning the bug was still there.
    Here are the details:
    Symptom:
    ========
    ASA 8.0 load balancing cluster with WEBVPN.
    When connecting with a web browser to the load-balancing IP address or FQDN,
    the certificate sent to the browser is NOT the certificate from the trustpoint
    assigned for load balancing with the "ssl trust-point vpnlb-ip" command.
    Instead, the ssl trust-point certificate assigned to the interface is used.
    This generates a certificate warning in the browser, as the URL entered
    in the browser does not match the CN (common name) in the certificate.
    Other than the warning, there is no functional impact if the end user
    continues by accepting the warning message.
    Condition:
    =========
    webvpn with load balancing is used
    Workaround:
    ===========
    1) downgrade to latest 7.2.2 interim (7.2.2.8 or later)
    Warning: configs are not backward compatible.
    2) upgrade to 8.0.2 interim (8.0.2.11 or later)

  • Can BPM communicate with the FTP and JDBC adapters?

    Hello All Experts,
    I have seen a FILE-to-RFC-to-FILE scenario using BPM, where the BPM (integration process) has a synchronous communication with RFC and then sends the response to FILE.
    So I just want to ask: can I use other adapters (File or JDBC) in BPM for synchronous communication?
    Please help.
    Thanks and regards,
    Vanita Jain

    Raj,
    Let me explain my scenario clearly...
    I want to transfer selected records from a database to another system as a file, and only once that file has been created should I update the file-name field in the database; otherwise not, because there may be cases where the file does not get created on the other system.
    So I created a BPM which receives records from the JDBC adapter asynchronously, then sends them to the other system for file creation synchronously, and then the BPM sends a response to the JDBC adapter for the file-name update.
    But my scenario is not working because, as you said, the FTP adapter can't be used for synchronous communication.
    As Abhishek said, I can use the transport acknowledgement option, but I am not aware of how to implement it.
    Please help and give an example if possible.
    Regards,
    Vanita

  • How to create an intensity waveform graph cluster with t_0 and dt?

    Hi all,
    I would like to know whether it is possible to create an intensity waveform like you can with a 1-D waveform (with "Build Waveform"), so that you get a cluster with the waveform array, the t_0, the dt, and the attributes.
    If not, I would like to know the following: I use references to cluster typedefs to update my controls and indicators on the front panel. Now if I use a property node for the intensity graph to set the offset and multiplier on the x-scale, the x-scale on the graphs in the sub-VI works perfectly, but not on the real front panel, probably because these get updated through a reference. Does anyone have a clue how to fix this?
    Regards, Pieter

    You are only writing the "value" of the type definition via the property node. This does not include properties such as offset and multiplier.
    On a sidenote, you are using way too much code for most operations.  
    For example, the to-I32 can be placed on the cluster, so only one instance is needed.
    Also property nodes are resizeable, so only one instance is needed.
    There are also Rube Goldberg constructs, such as ">0  *AND* TRUE", which is the same as a simple ">0"
    Overall, you are really fragmenting memory by constantly building and then resizing arrays to keep them at a max size of 2880. This can cause performance problems due to the constant need of reallocations. It might be better to use fixed size arrays and do things "in place".
    Message Edited by altenbach on 03-19-2009 09:57 AM
    LabVIEW Champion. Do more with less code and in less time.
    Attachments:
    OneIsEnough.png ‏8 KB
    CombineProperties.png ‏3 KB

  • Read file with nio and flush with servlet

    Can I read a file using java.nio (for example via FileInputStream.getChannel()) and then flush the file content through the servlet?
    I know how to read without java.nio, get a byte array, and then flush it via the HttpServletResponse writer or output stream, but I would like to know if it is possible with java.nio reading.
    thanks

    I'm doing it, but only for file reading:
    protected void processRequest(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html;charset=UTF-8");
        PrintWriter out = response.getWriter();
        FileInputStream fis = null;
        try {
            String path = "/var/hello.txt";
            fis = new FileInputStream(path);
            FileChannel channel = fis.getChannel();
            // size the buffer to the file length and read the whole file at once
            ByteBuffer bb = ByteBuffer.allocate((int) channel.size());
            channel.read(bb);
            byte[] data2 = bb.array();
            channel.close();
            out.write(new String(data2));
        } catch (FileNotFoundException ex) {
            ex.printStackTrace();
        } finally {
            try {
                if (fis != null) {
                    fis.close();
                }
            } catch (IOException ex) {
                ex.printStackTrace();
            }
            out.close();
        }
    }
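    Since the question was specifically about java.nio, here is a hedged variant of the method above (same placeholder path; it assumes java.io.*, java.nio.channels.*, javax.servlet.*, and javax.servlet.http.* are imported) that skips the String conversion and lets the FileChannel push the bytes straight into the servlet's output stream:
    protected void processRequest(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/plain");
        FileInputStream fis = new FileInputStream("/var/hello.txt");
        try {
            FileChannel channel = fis.getChannel();
            // wrap the servlet's output stream in a channel and let NIO copy the bytes
            WritableByteChannel target = Channels.newChannel(response.getOutputStream());
            long position = 0;
            long size = channel.size();
            while (position < size) {
                position += channel.transferTo(position, size - position, target);
            }
        } finally {
            fis.close();
        }
    }
    This avoids building the whole file content as a String in memory and also works for binary files, since no character conversion takes place.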

  • Problem scheduling with FTP and default FTP destination parameters

    Hi,
    My version of BOXI is Release 2 SP2.
    I want to schedule a report with the default parameters found in the FTP destination of the Job Server (in my case: DesktopIntelligenceJobServer).
    I have been using a sample found in BO, but it works with explicit parameters.
    It's in Java but can be changed to JSP:
    Code:
        // before this comes the query that finds the report in oObjects
        IInfoObject infoObject = (IInfoObject) oObjects.get(0);
        ISendable obj = (ISendable) infoObject;

        IDestinationPlugin destinationPlugin = (IDestinationPlugin) iStore.query(
                "SELECT TOP 1 * FROM CI_SYSTEMOBJECTS " +
                "WHERE SI_NAME='CrystalEnterprise.Ftp'").get(0);

        // Retrieve the scheduling options and cast them as IFTPOptions.
        // This interface is the one which allows us to add the file location
        // for the scheduling.
        IFTPOptions ftpOptions = (IFTPOptions) destinationPlugin.getScheduleOptions();
        ftpOptions.setServerName("myip"); // FTP server name or IP address
        ftpOptions.setPort(21);           // port number of the FTP server... default is 21
        ftpOptions.setUserName("gcr");    // can either use an account or logon
        ftpOptions.setPassword("mypass");

        List destFiles = null;
        destFiles = ftpOptions.getDestinationFiles();
        destFiles.add("/BOFTP/MyReport.pdf");
        IDestination destination = obj.getSendToDestination();
        destination.setFromPlugin(destinationPlugin);

        iStore.schedule(oObjects);
    If I comment out some lines so that the default parameters are used, it does not work correctly:
    Code:
        // before this comes the query that finds the report in oObjects
        IInfoObject infoObject = (IInfoObject) oObjects.get(0);
        ISendable obj = (ISendable) infoObject;
        /* begin comment
           ... everything from the iStore.query(...) call down to
           destination.setFromPlugin(destinationPlugin); as in the block above ...
        end comment */
        iStore.schedule(oObjects);
    If my report is set to SMTP by default, the scheduling fails and wants to use the SMTP protocol.
    If my report is set to FTP by default, the scheduling works. I want to force the Job Server's default FTP settings in the code, but I can't find how, and I can't find any explanation in the BOXI SDK.
    Could you offer a solution for this?
    Thank you
    gcr

    I've found a solution ....

  • New user... don't know how to write a DVD with AVI and SRT on a Mac

    Hello, I am a bit troubled because when I had a PC I used ConvertXtoDVD, fed it AVI and SRT files, and got a DVD. Now I don't know what to do. Is there any software that can do the job in an easy way?
    Thank you very much

    Start by clicking on the Desktop, selecting Finder -> Help -> Mac Help, entering "burn" in the search window, and reading +Burning a CD or DVD in the Finder+. Since you're new to Macs, see:
    Switching from Windows to Mac OS X,
    Basic Tutorials on using a Mac,
    Mac 101: Mac Essentials,
    Anatomy of a Mac,
    MacTips, and
    Switching to the Mac: The Missing Manual, Snow Leopard Edition.
    Additionally, *Texas Mac Man* recommends:
    Quick Assist.
    Welcome to the Switch To A Mac Guides,
    Take Control E-books, and
    A guide for switching to a Mac.

  • [SOLVED] wget problem with ftp ( and symlink )

    Hi,
    I tried to download some files recursively over FTP, but it seems that wget can't fetch the files under symlinked folders by itself.
    At first I used
    wget -r -N ftp://ftp.ncbi.nih.gov/snp/database/organism_schema
    (all the folders in organism_schema are symlinks, and I want the files underneath them),
    but this downloads only the symlinked folders and not the files underneath them. I tried --retr-symlinks, but it gave me errors:
    preecha@preecha-laptop:~/dbsnp$ wget -r -N --retr-symlink ftp://ftp.ncbi.nih.gov/snp/database/organism_data | tee error.txt
    --2009-01-26 11:07:52-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data
    => `ftp.ncbi.nih.gov/snp/database/.listing'
    Resolving ftp.ncbi.nih.gov... 130.14.29.30
    Connecting to ftp.ncbi.nih.gov|130.14.29.30|:21... connected.
    Logging in as anonymous ... Logged in!
    ==> SYST ... done. ==> PWD ... done.
    ==> TYPE I ... done. ==> CWD /snp/database ... done.
    ==> PASV ... done. ==> LIST ... done.
    [ <=> ] 843 5.24K/s in 0.2s
    2009-01-26 11:08:02 (5.24 KB/s) - `ftp.ncbi.nih.gov/snp/database/.listing' saved [843]
    Removed `ftp.ncbi.nih.gov/snp/database/.listing'.
    --2009-01-26 11:08:02-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data/organism_data
    => `ftp.ncbi.nih.gov/snp/database/organism_data/.listing'
    ==> CWD /snp/database/organism_data ... done.
    ==> PASV ... done. ==> LIST ... done.
    [ <=> ] 4,955 2.14K/s in 2.3s
    2009-01-26 11:08:06 (2.14 KB/s) - `ftp.ncbi.nih.gov/snp/database/organism_data/.listing' saved [4955]
    Removed `ftp.ncbi.nih.gov/snp/database/organism_data/.listing'.
    --2009-01-26 11:08:06-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data/arabidopsis_3702
    => `ftp.ncbi.nih.gov/snp/database/organism_data/arabidopsis_3702'
    ==> CWD not required.
    ==> PASV ... done. ==> RETR arabidopsis_3702 ...
    No such file `arabidopsis_3702'.
    --2009-01-26 11:08:07-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data/bee_7460
    => `ftp.ncbi.nih.gov/snp/database/organism_data/bee_7460'
    ==> CWD not required.
    ==> PASV ... done. ==> RETR bee_7460 ...
    No such file `bee_7460'.
    --2009-01-26 11:08:08-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data/bison_9901
    => `ftp.ncbi.nih.gov/snp/database/organism_data/bison_9901'
    ==> CWD not required.
    ==> PASV ... done. ==> RETR bison_9901 ...
    No such file `bison_9901'.
    --2009-01-26 11:08:10-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data/blackbird_39638
    => `ftp.ncbi.nih.gov/snp/database/organism_data/blackbird_39638'
    ==> CWD not required.
    ==> PASV ... done. ==> RETR blackbird_39638 ...
    No such file `blackbird_39638'.
    --2009-01-26 11:08:11-- ftp://ftp.ncbi.nih.gov/snp/database/organism_data/bonobo_9597
    => `ftp.ncbi.nih.gov/snp/database/organism_data/bonobo_9597'
    ==> CWD not required.
    ==> PASV ... done. ==> RETR bonobo_9597 ...
    No such file `bonobo_9597'.
    Did I do something wrong here? Any suggestions would help a lot. Thanks!
    Last edited by Tg (2009-01-27 04:00:46)

    Xyne wrote:
    wget man page wrote:When --retr-symlinks is specified, however, symbolic links are
           traversed and the pointed-to files are retrieved.  At this time,
           this option does not cause Wget to traverse symlinks to directories
           and recurse through them, but in the future it should be enhanced
           to do this.
    I think I've found a good workaround:
    #!/bin/bash
    # Mirror an FTP directory, then fetch the targets of any symlinks wget left behind.
    URL=$1
    DIRPATH=$(echo "$URL" | sed s-ftp://--)        # strip the ftp:// prefix
    BASEURL=$(echo "$DIRPATH" | cut -d '/' -f1)    # host part only
    wget -r -N $URL
    # wget saves the symlinks locally as symlinks; find them and download their targets
    SYMLINKS=$(find $DIRPATH -type l)
    for SYMLINK in $SYMLINKS
    do
        TARGET=$(readlink $SYMLINK)
        if [ "${TARGET:0:1}" == "/" ]
        then
            # absolute target: resolve it against the FTP host
            URI="ftp://$BASEURL/$(readlink $SYMLINK)"
        else
            # relative target: resolve it against the requested directory
            URI="ftp://$DIRPATH/$(readlink $SYMLINK)"
        fi
        echo "retrieving $URI"
        wget -r -N $URI
        # ./wget_symlinks $URI
    done
    You can save that as "wget_symlinks", then
    chmod 744 wget_symlinks
    ./wget_symlinks ftp://ftp.ncbi.nih.gov/snp/database/organism_schema
    I didn't download everything so I don't know if there are further symlinks. If there are, comment out the line "wget -r -N $URI" and uncomment "# ./wget_symlinks $URI". It should download everything recursively and then exit.
    I never thought of using a bash script (I've never written more than 5 lines of bash). I guess I'll try that, thanks a lot.
    Last edited by Tg (2009-01-26 07:27:56)

  • Weblogic Cluster with i686 and Itanium

    Hello,
    I want to configure a BEA WebLogic 8.1 cluster that includes an Itanium and an i686 server.
    I have successfully configured a WebLogic cluster with 2 Sun servers.
    Now I want to use an Itanium server together with an Intel i686 server for my WebLogic cluster, both running on Linux with JRockit as the JVM.
    Is it possible to build a WebLogic cluster from two different hardware platforms like this?
    Michael

