Copying data to an external server during data acquisition

I need to copy data out to an external server for backup during an extended data acquisition session (days long).  What is the safest data format for this operation, one that won't corrupt the primary data file: *.lvm, *.tdm, or *.tdms?
Thanks 

Hi Velveeta,
I would think that you would rather save to multiple files and back those up on the server than try to make a copy of a file in the middle of its creation, since copying a file mid-write is what's most likely to cause corruption.  If you are very concerned about this, I'd recommend breaking the data up into a number of files and then backing them up.  So, to answer your questions:
1.  You can set up the Write to Measurement File Express VI to create a new file every X data points.  I'm unsure of how you are planning to programmatically copy this to a backup disk; a rough sketch of one approach follows this list.
2.  Don't copy a file in the middle of its creation (for example, don't create one file for the whole acquisition and then try to back it up partway through).
3.  The file format shouldn't matter for this functionality in particular. 
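For what it's worth, here is a minimal sketch (mine, not part of Marti's answer) of the rotate-then-copy pattern in Java: segments that are already closed get copied to the server, while the file still being written is left alone. The paths, the segment naming scheme, and the growth-check heuristic are all assumptions.

    import java.io.IOException;
    import java.nio.file.*;

    // Hedged sketch: copy finished acquisition segments to a backup server share.
    // Assumes the DAQ loop rotates files (segment_0001.tdms, segment_0002.tdms, ...)
    // and that only the most recent segment may still be open for writing.
    public class SegmentBackup {
        public static void main(String[] args) throws IOException, InterruptedException {
            Path dataDir = Paths.get("C:/daq/data");        // hypothetical local data folder
            Path backupDir = Paths.get("//server/backup");  // hypothetical server share
            try (DirectoryStream<Path> segments =
                    Files.newDirectoryStream(dataDir, "segment_*.tdms")) {
                for (Path segment : segments) {
                    Path target = backupDir.resolve(segment.getFileName());
                    // Skip the segment the writer may still have open,
                    // and anything already copied on a previous pass.
                    if (stillGrowing(segment) || Files.exists(target))
                        continue;
                    Files.copy(segment, target, StandardCopyOption.COPY_ATTRIBUTES);
                }
            }
        }

        // Crude heuristic: a segment that hasn't grown for two seconds is assumed closed.
        private static boolean stillGrowing(Path p) throws IOException, InterruptedException {
            long size = Files.size(p);
            Thread.sleep(2000);
            return Files.size(p) != size;
        }
    }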
Cheers, 
Marti C
Applications Engineer
National Instruments
NI Medical

Similar Messages

  • List of data targets is not visible during data upload

    Hi all,
    I am trying to load user-defined transactional data into an InfoObject. I have done all the necessary customization steps, such as creating the application component, assigning DataSources, creating InfoPackages and then creating update rules in the InfoCubes; moreover, I wrote a routine which calculates sales revenue based on cost and quantity sold.
    My problem is that when I create the InfoPackage, it does not list any data targets. Can anyone give tips in this regard?
    thanks in advance
    regards,
    a.fahrudeen

    Hi Fahrudeen,
    I'm a little confused here... you say you want to load transaction data into an InfoObject? What would that be?
    You can load transaction data only into data targets such as InfoCubes and DataStore Objects. If you are loading data into your InfoObjects, then that would mean you are loading master data, for which you obviously won't have any data targets listed in your InfoPackage. Only when loading transaction data will you have data targets listed in your InfoPackage.
    Regards
    Manick

  • Maxl Error during data load - file size limit?

    Does anyone know if there is a file size limit when importing data into an ASO cube via MaxL? I have tried to execute:

        import database TST_ASO.J_ASO_DB data
        from server text data_file '/XX/xXX/XXX.txt'
        using server rules_file '/XXX/XXX/XXX.rul'
        to load_buffer with buffer_id 1
        on error write to '/XXX.log';

    It errors out after about 10 minutes and gives "unexpected Essbase error 1130610". The file is about 1.5 gigs of data. The file location is right. I have tried the same code with a smaller file and it works. Do I need to increase my cache or anything? I also got "DATAERRORLIMIT reached" and I cannot find the log file for this...? Thanks!

    Have you looked in the data error log to see what kind of errors you are getting? The odds are high that you are trying to load data into calculated members (or upper level members), resulting in errors. It is most likely the former.
    You specify the error file with the

        on error write to '/XXX.log';

    statement. Have you looked for this file to find out why you are getting errors? Do yourself a favor: load the smaller file and look at the error file to see what kind of error you are getting. It is possible that your error file is larger than your load file, since multiple errors on a single load item may result in a restatement of the entire load line for each error.
    This is a starting point for your exploration into the problem.
    DATAERRORLIMIT is set in the config file, default 1000, max 65000.
    NOMSGLOGGINGONDATAERRORLIMIT, if set to TRUE, just stops logging and continues the load when the data error limit is reached. I'd advise using this only in a test environment since it doesn't solve the initial problem of data errors.
    Probably what you'll have to do is ignore some of the columns in the data load that load into calculated fields. If you have some upper level members, you could put them in a skip loading condition.
    Let us know what works for you.
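    For reference, both settings mentioned above live in essbase.cfg on the server. A hedged example (the values are illustrative, and the semicolon comments follow the usual essbase.cfg convention):

        ; essbase.cfg
        ; raise the data error limit from the default of 1000 (max 65000)
        DATAERRORLIMIT 65000
        ; test environments only: keep loading past the limit, but stop logging
        NOMSGLOGGINGONDATAERRORLIMIT TRUE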

  • Retrieving data from document stored in External Server.

    Hello,
    We are working on a PoC requirement in which the data from a document stored on an external server (not using DMS) needs to be read from a report program in SAP CRM. Is this possible to do from CRM? Can you please suggest whether there are any standard function modules available for this?
    Regards,
    Sudharani.

    Yes Bhushan, we would be using XI, but we are open to any other option if available. Please suggest.
    Also, are there any standard web services available for this purpose, where we can pass the URL of the document location and it returns the content of the document?
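    No standard function module or web service is named in this thread, but as a rough sketch of the URL-based idea above (mine, not an SAP-provided service; the URL and class name are hypothetical), plain HTTP retrieval of a document's content looks like this in Java:

        import java.io.IOException;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        // Hedged sketch: read the raw content of a document given its URL.
        // Authentication and SSL handling are omitted.
        public class DocumentFetcher {
            public static byte[] fetch(String documentUrl) throws IOException {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(documentUrl).openConnection();
                conn.setRequestMethod("GET");
                conn.setConnectTimeout(10_000);
                conn.setReadTimeout(30_000);
                try (InputStream in = conn.getInputStream()) {
                    return in.readAllBytes();   // Java 9+
                } finally {
                    conn.disconnect();
                }
            }

            public static void main(String[] args) throws IOException {
                byte[] content = fetch("http://external-server.example.com/docs/report.pdf");
                System.out.println("Fetched " + content.length + " bytes");
            }
        }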

  • Trying to copy iPhoto Library to external hard drive for backup.  Error message:  The Finder can't complete the operation because some data in "iPhoto Library" can't be read or written (Error code -36).


    That code is
    -36
    ioErr
    I/O error (bummers)
    Make sure the EHD is formatted OS X Extended (Journaled) and run Disk Utility on the EHD and repair the disk.  If that fails to help I'd try the following:
    Using iPhoto Library Manager to Rebuild Your iPhoto Library
    1. Download iPhoto Library Manager and launch it.
    2. Click on the Add Library button, navigate to your Home/Pictures folder and select your iPhoto Library folder.
    3. Now that the library is listed in the left hand pane of iPLM, click on your library and go to the Library ➙ Rebuild Library menu option.
    4. In the next window, name the new library and select the external HD as the location of the newly rebuilt library.
    5. Click on the Create button.
    Note 1: This creates a new library based on the LibraryData.xml file in the library and will recover Events, Albums, keywords, titles and comments. However, books, calendars, cards and slideshows will be lost.
    Note 2: Your current library will be left untouched for further attempts at a fix if so desired.
    OT
    OT

  • Can SharePoint 2010 read data from an external SQL Server 2012 data Source?

    Hi,
    I would like to know whether SharePoint can read data from a SQL Server 2012 external data source.
    I am not talking about the SharePoint internal database; I need to get data from a SQL database which is used by some other application. For my SharePoint server I am using SQL 2008 R2 as the internal database engine, but I need to get some other data from another data source which is configured in SQL Server 2012. Can anyone tell me whether there is any problem with accessing data from SQL 2012? If there is no problem, please let me know which SQL data source versions are compatible with SharePoint 2010 and SP 2010 SP1.
    Thanks!
    Regards,
    Henoy 

    Hi Romeo,
    I have already visited this page, but there is no answer to my question there. I just want to know whether we can do the same from a SQL 2012 server.
    Please help me find out whether SharePoint 2010 is able to get data from a SQL 2012 external data source.
    Thanks for your instant reply.
    Regards,
    Henoy TM.
    +919035752610

  • Is it possible to take the CDR data from a v4.2 Call Manager and copy it to a separate server where it would be made available for reporting?

    Is it possible to take the CDR data from a v4.2 Call Manager and copy it to a separate server where it would be made available for reporting? We are not interested in migrating the CDR data to v6 because of the concerns it introduces to the upgrade process. Is it possible to get the raw data and somehow serve it from a different machine? (knowing it would be 'old' data that stops as of a certain date). If so, what would be the complexity involved in doing so?
    It seems like the CDR data lives within MSSQL and the reporting interface is within the web server portion of the Call Manager... that's as far as we've dug so far.

    Hi
    It is absolutely possible to get the data - anyone you have in your org with basic SQL skills can move the data off to a standalone SQL server. This could be done most simply by backing up and restoring the DB using SQL Enterprise Manager.
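    As a rough illustration of reporting against that standalone copy (my sketch, not Aaron's; the host, database, table and column names are guesses at the CCM 4.x CDR schema, so verify them against the restored database):

        import java.sql.*;
        import java.util.Date;

        // Hedged sketch: report on CDR rows from the standalone SQL Server copy.
        // dateTimeConnect is assumed to hold epoch seconds.
        public class CdrReport {
            public static void main(String[] args) throws SQLException {
                String url = "jdbc:sqlserver://standalone-host;databaseName=CDR;user=...;password=...";
                String sql = "SELECT dateTimeConnect, callingPartyNumber, "
                           + "originalCalledPartyNumber, duration "
                           + "FROM CallDetailRecord WHERE duration > 0";
                try (Connection con = DriverManager.getConnection(url);
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery(sql)) {
                    while (rs.next()) {
                        Date when = new Date(rs.getInt(1) * 1000L);   // epoch seconds -> Date
                        System.out.printf("%tF %<tT  %s -> %s  (%ds)%n", when,
                                rs.getString(2), rs.getString(3), rs.getInt(4));
                    }
                }
            }
        }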
    Moving the CAR/ART reporting tool would be more difficult... if you do actually use that for reporting (most people find it doesn't do what they need and don't use it for anything but basic troubleshooting, and get a third party package) then the best option may be to keep your publisher (possibly assigning it a new IP) and leave it running for as long as you need reporting.
    You would then need a new server to run your upgraded V6 CCM; you may find you need this anyway.
    Regards
    Aaron
    Please rate helpful posts...

  • Warning EJB EJB Deployment: Fnv cannot be redeployed while the server is running. de.dr_staedtler.extern.audi.fnv.data.ejb.FnvBean is located in the server's classpath.

    Hi,
    I have developed some EJBs that depend on each other,
    so I set the classpath to a lib directory containing my EJBs.
    When I try to deploy, I get this warning:
    <Warning> <EJB> <EJB Deployment: Fnv cannot be redeployed while the server is
    running. de.dr_staedtler.extern.audi.fnv.data.ejb.FnvBean is located in the server's
    classpath.>
    But the EJBs are deployed and the client can connect to them.
    Why the warning? Is it important, or can I ignore it?
    Thanks for your time.
    dragan-sassler

    It means that bean classes are in the system classpath and therefore
    cannot be reloaded. If you do not plan on hot-redeploying your beans
    you can ignore this message.
    Dimitri

  • Error of data exchange with an external server

    Please help me understand.
    There is a client which connects to a server; data exchange with the server is done through the method sendData(byte[] sendbytes, String code, int resplen):
    OutputStream socketOutputStream = null;

    public void connect() throws SeedLinkException, IOException {
        try {
            String host_name = sladdr.substring(0, sladdr.indexOf(':'));
            int nport = Integer.parseInt(sladdr.substring(sladdr.indexOf(':') + 1));
            // create and connect the socket
            Socket sock = new Socket();
            sock.setReceiveBufferSize(65536);
            sock.setReuseAddress(true);
            sock.setKeepAlive(true);
            sock.connect(new InetSocketAddress(host_name, nport));
            // wait up to 10 seconds for the socket to be connected
            int timeout = 10;
            int i = 0;
            while (i++ < timeout && !sock.isConnected())
                Util.sleep(1000);
            if (!sock.isConnected()) {
                String message = "[" + sladdr + "] socket connect time-out (" + timeout + "s)";
                throw new SeedLinkException(message);
            }
            // socket connected
            sllog.log(false, 1, "[" + sladdr + "] network socket opened");
            this.socket = sock;
            this.socketInputStream = socket.getInputStream();
            this.socketOutputStream = socket.getOutputStream();
        } catch (Exception e) {
            errorLine = "cannot connect to SeedLink server: " + e.getMessage();
            throw new SeedLinkException("[" + sladdr + "] cannot connect to SeedLink server: " + e);
        }
        // everything is connected, say hello
        try {
            sayHello();
        } catch (SeedLinkException sle) {
            try { socket.close(); socket = null; } catch (Exception e1) { }
            throw sle;
        } catch (IOException ioe) {
            try { socket.close(); socket = null; } catch (Exception e1) { }
            throw ioe;
        }
    }     // End of connect()

    public byte[] sendData(byte[] sendbytes, String code, int resplen)
            throws SeedLinkException, IOException {
        socketOutputStream.write(sendbytes);
        if (resplen <= 0)
            return null;                      // no response requested
        // if requested, wait up to 30 seconds for a response
        byte[] bytesread = null;
        int ackcnt = 0;                       // counter for the read loop
        int ackpoll = 50;                     // poll every 0.05 seconds while reading
        int ackcntmax = 30000 / ackpoll;      // 30 second wait
        while ((bytesread = receiveData(resplen, code)) != null && bytesread.length == 0) {
            if (ackcnt > ackcntmax) {
                errorLine = "no response from SeedLink server to "
                        + new String(sendbytes, 0, sendbytes.length - 1);
                throw new SeedLinkException("[" + code + "] no response from SeedLink server to '"
                        + new String(sendbytes) + "'");
            }
            Util.sleep(ackpoll);
            ackcnt++;
        }
        if (bytesread == null)
            throw new SeedLinkException("[" + code + "] bad response to '"
                    + new String(sendbytes) + "'");
        return bytesread;
    }    // End of sendData()
    The code given above is part of a J2EE web application.
    Why does this method work normally under Tomcat,
    but refuse to work at all in Sun Java System Application Server 8 or 9 (no data exchange occurs)?
    I cannot understand what the reason is...
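    For reference, a minimal usage sketch of the two methods above (my illustration; the enclosing client object and the HELLO command are assumptions about this SeedLink client):

        // hypothetical caller; assumes the methods above live in some client class
        client.connect();                                        // open socket, sayHello()
        byte[] resp = client.sendData("HELLO\r\n".getBytes(), "demo", 255);
        System.out.println(new String(resp));                    // server's ident response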

    José,
    The 6036E User Manual gives the best definition of the STARTSCAN, TRIG, and CONVERT* signals, as well as their relationship to each other. See section 4-20, Connecting Timing Signals, of the User Manual for this information as well as timing diagrams.
    NI 6034E/6035E/6036E User Manual
    http://digital.ni.com/manuals.nsf/websearch/B935FC073150374F86256BF10073995A?OpenDocument&node=132100_US
    You are correct that configuring your board for external timing is as simple as connecting your external clock to a PFI line and using it as the STARTSCAN signal. The clock output of your GPS receiver will then define when scans are performed on your 6036E, due to the synchronization between the two devices. The only synchronization issue you may encounter is propagation delay, which will be a factor of the cable length connecting your GPS clock to the 6036E.
    Regards,
    Justin Britten
    Applications Engineer
    National Instruments

  • How to extract R/3 (4.7) data to an external data warehouse server

    Hi,
    What are the methods or steps for extracting R/3 (4.7, no BW module) data to an external database (or flat file) for generating reports in an external BI system?
    Can I use ABAP to do ETL and how?
    Thank you,
    Bruce

    Hi Harsha,
    You cannot extract multiple values using the classification DataSources; this data can be extracted by creating a generic extractor based on a view (with a join on tables AUSP and INOB). This extractor brings the data in the following format:
    OBJEK | ATINN | ATZHL | ATWRT
    1 | HOBBY | 1 | CRICKET
    1 | HOBBY | 2 | PAINTING
    1 | HOBBY | 3 | READING
    2 | HOBBY | 1 | CRICKET
    2 | HOBBY | 2 | PAINTING
    3 | HOBBY | 1 | READING
    i.e. for the field HOBBY these are the values assigned. If you want, you can create 3 InfoObjects in BW and flatten this data by setting flags as below (a sketch of the flattening follows this table):
    OBJEK | HOBBY_CRICKET | HOBBY_PAINTING | HOBBY_READING
    1 | X | X | X |
    2 | X | X |    |
    3 |    |    | X |
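    A hedged sketch (mine, not part of Rk's answer) of that flattening step in Java, assuming the extracted rows arrive as (OBJEK, ATWRT) pairs and the flag columns are known up front:

        import java.util.*;

        // Hedged sketch: pivot multi-value rows (OBJEK, ATWRT) into one flag row
        // per object, as in the HOBBY example above. Input data is hard-coded
        // for illustration.
        public class FlattenValues {
            public static void main(String[] args) {
                String[][] rows = {
                    {"1", "CRICKET"}, {"1", "PAINTING"}, {"1", "READING"},
                    {"2", "CRICKET"}, {"2", "PAINTING"},
                    {"3", "READING"},
                };
                List<String> columns = List.of("CRICKET", "PAINTING", "READING");
                // object -> set of values assigned to it
                Map<String, Set<String>> byObject = new TreeMap<>();
                for (String[] row : rows)
                    byObject.computeIfAbsent(row[0], k -> new HashSet<>()).add(row[1]);
                System.out.println("OBJEK | HOBBY_CRICKET | HOBBY_PAINTING | HOBBY_READING");
                for (Map.Entry<String, Set<String>> e : byObject.entrySet()) {
                    StringBuilder line = new StringBuilder(e.getKey());
                    for (String c : columns)
                        line.append(" | ").append(e.getValue().contains(c) ? "X" : " ");
                    System.out.println(line);
                }
            }
        }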
    Let me know if you need more details.
    Regards,
    Rk

  • I transferred data from my external hard drive to my restored Mac Book Pro via migration assistant and now my external hard drive in time capsule has red minus signs. How do I get rid of that without getting rid of any of my data?

    I used Migration Assistant to transfer my data from my external hard drive via Time Capsule to my restored MacBook Pro. When I now go into Time Capsule, there are red minus signs in the corner of all the folders that are contained in my backup. How do I get rid of these and access my previous back-ups?

    Select the drive in the Finder, choose File > Get Info, and at the bottom check "Ignore ownership on this volume".
    You can copy the files, but the ownership of the files still belongs to the other user account; once you're done copying, perform a #6 Reset Users Permissions and that will set all the ownership to that account.
    ..Step by Step to fix your Mac
    Another method is to copy the entire folder, then change its permissions.

  • How can I remove all Time Machine data from an external HD, while retaining the other folders/files on the disk?

    I recently replaced my old Macbook Pro with a new Macbook Pro Retina.
    The original MBP had Time Machine set up on an external HD, and I also had some manually saved/backed up files (stuff that didn't need incremental backups; very old stuff) on the disk as well, in folders.  This worked fine and well for the life of the original MBP.
    When I set up the MBPr, I did not elect to transfer everything over from a Time Machine disk, as I wanted a Fresh Install; I chose to just re-download/install the apps  I needed from the App Store.  Much cleaner, more stable.  I updated to Yosemite immediately so, it gave me more of a  clean install.
    The problem is, Time Machine now won't read any Time Machine backups from that disk.  Migration assistant can pull data from it, but it's a bit wonky.  It was easier to just manually copy over what I needed.
    Now, I just want to "reset", and completely remove the old Time Machine data from the external hard drive, since it is completely useless to me now (I can't access it on this computer, and it's taking up space).  I want to remove the Time Machine data, as if it were never there, but keep my other folders/files that are on the disk (so, a format is not feasible).
    I will then set the disk up as a new, fresh Time Machine disk for the new MBPr. 
    What is the proper method for removing Time Machine data from an external disk (which doesn't seem to be associated with this machine anyway)?  I know a simple rm -rf will cause problems.
    Thanks for any help provided.

    See the yellow box in #12 of Time Machine - Frequently Asked Questions (or use the link in *User Tips* at the top of this forum).

  • Best way to Import Data from a SQL Server Table?

    Hi,
    Firstly thanks for looking at this question.
    We need to import data into SAP BPC 5.1 on a twice-daily basis and have chosen not to export to a .CSV file but instead to hold all data in a SQL table and import it directly from there.  As part of the import we wish to run the default logic, however only over the data which is imported, as opposed to having to run default logic over the entire database after every import.
    We did some research on this topic and the only thing we could find that would work as described above is using the "Import SQL" package.  However we keep experiencing problems with it and have not yet been able to run it successfully; the errors it gives are not consistent from run to run which makes it difficult to start a thread, though we are getting help from the helpdesk at the moment.
    My question here though is - is there another way that someone knows of to import data into SAP BPC from a SQL table, and being able to run default logic over just the data being imported, or is our only hope getting the "Import SQL" package working?
    Any help much appreciated.
    Regards,
    Iain
    Forgot to mention details of our environment:
    SAP BPC v5.0.495, 2 server environment -
    Server 1 (DB/AS/SSIS/File server) = 64bit Windows 2k3 server with 64bit SQL Server Enterprise Edition
    Server 2 (IIS/App server) = 32bit Windows 2k3 server

    Iain,
    I recently created SSIS packages that need to work with a staging table that does all kinds of manipulations of the data before it is loaded into SAP BPC, because it also has to load the data into the drillthrough table within the same package. From one point in my package the data is also in a SQL table, so it is basically the same as your situation. This also works with an export to CSV within the DTSX file like Alwin said, because in that case you can use the standard transformation and conversion stuff during the load.
    I also needed to limit the data region for logic to the data region that is in the load, for currency conversion purposes. This is not much of a problem. I had a situation with an Accounts Receivable cube containing a daily time dimension for key due dates and a datasrc dimension containing weeks. Every week has a complete overview of the open AR items in that week and needs to be converted for that week (datasrc) only, and not for the whole database every time we load. By default, data would be converted based on the Keyduedates dimension, while I wanted the Weeks dimension to be used as the data region. I solved it by using these rows in the logic:
    *Scope_by=version,Weeks
    *xdim_memberset weeks=%weeks_set%
    *xdim_memberset version = %version_set%
    I can send you the SSIS package and logic if you want. Just send me your details then.
    -Joost

  • How to create a .mdf SQL Server database from a Data-Tier Application file that has data?

    This is a noob question, though I do use SQL Server databases all the time with Entity Framework when I code in C# using Visual Studio 2013.  The development environment is found below at [A].  I am trying to make a clone of a SQL Server 2008 R2 database (.mdf) that exists online.  I can read, connect and work with this database in Visual Studio 2013, but I wish to make a local copy of the database as an .MDF file.  Somewhere in my notes I have a way of creating a local copy from an online database when using Visual Studio, but I forgot how (it seems, reviewing my notes, that it deals with ADO.NET, which is deprecated in Visual Studio 2013 these days, or so it seems).  So I'm looking for another way.  What I did was create (or export) a "Data-Tier Application File" from the online SQL Server database, with data, and it seems to have worked, in that this Data-Tier Application file exists on my hard drive and seems to have data in it ("SQL Server Replication Snapshot" is the format, it seems).  It contains skeleton code to create a database, but when I tried to execute it with SQL Server 2014 Management Studio, I got a bunch of errors.
    So my questions are:
    1) Can I somehow create a .MDF SQL Server database from a Data-Tier Application file that has data?  What tool do I use?  I saw this link, http://social.technet.microsoft.com/wiki/contents/articles/2639.how-to-use-data-tier-application-import-and-export-with-a-windows-azure-sql-database.aspx, and it relates to Azure, but is there a standalone tool for C#/Visual Studio 2013?
    2) Is there an easy way to create a .mdf SQL Server database file from an online file within SQL Server Management Studio?  I don't think so, since it would require administrator permissions on the online server, which I don't have. I have permission to read, update and delete the online database file, but strangely not to download it (the service I use has a tool for backup, but not for download).
    3) Same question as 2), but for Visual Studio 2013?  I don't think so, since I notice none of the templates even mentions ADO.NET anymore; instead they go with Entity Framework.  Using EF I can of course do anything I want with the online database (CRUD), but it remains online.  Maybe there's a switch to make a local copy?  I guess I could write a short program to suck all the data out of the online database and put it into a new, duplicate database having the same tables, that I create on my localhost (a sketch of this brute-force approach follows the environment details below), but my question here is whether there's an easier way, maybe a tool or command I can run from inside Visual Studio.
    Any advice on any of the above questions is appreciated.
    Thank you,
    Paul
    [A] Microsoft Visual Studio Professional 2013
    Version 12.0.21005.1 REL
    Microsoft .NET Framework
    Version 4.5.51641
    Microsoft Web Developer Tools 2013   2.0.40926.0
    SQL Server Data Tools   12.0.30919.1
    Microsoft SQL Server Data Tools
    Windows Azure Mobile Services Tools   1.0
    Windows Azure Mobile Services Tools
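    As a rough sketch of the brute-force copy described in 3): it is written in Java/JDBC here for brevity (Paul would presumably use C# with SqlClient or EF), and the connection strings and the Products table are hypothetical.

        import java.sql.*;

        // Hedged sketch: copy every row of one table from the online database
        // into an identical, pre-created table on localhost.
        public class TableCopy {
            public static void main(String[] args) throws SQLException {
                String src = "jdbc:sqlserver://online-host;databaseName=RemoteDb;user=...;password=...";
                String dst = "jdbc:sqlserver://localhost;databaseName=LocalCopy;integratedSecurity=true";
                try (Connection in = DriverManager.getConnection(src);
                     Connection out = DriverManager.getConnection(dst);
                     Statement read = in.createStatement();
                     ResultSet rs = read.executeQuery("SELECT Id, Name, Price FROM Products");
                     PreparedStatement write = out.prepareStatement(
                         "INSERT INTO Products (Id, Name, Price) VALUES (?, ?, ?)")) {
                    out.setAutoCommit(false);          // batch everything into one commit
                    while (rs.next()) {
                        write.setInt(1, rs.getInt(1));
                        write.setString(2, rs.getString(2));
                        write.setBigDecimal(3, rs.getBigDecimal(3));
                        write.addBatch();
                    }
                    write.executeBatch();
                    out.commit();
                }
            }
        }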

    Thanks but these links are too general to help.
    "2. what do you mean by online file?" - I mean the SQL Server database file is on a remote web server that I rent from, but I am not the administrator of.  I can access my database using SQL Server Authentication, but nothing more.
    Paul
    What do you mean by too general? It explains how you can use a data-tier application to create and deploy databases.
    May be this will help you to understand better
    http://www.databasejournal.com/features/mssql/article.php/3911041/Creating-Data-Tier-Applications--in-SQL-Server-2008-R2.htm
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • I cannot transfer data from my external hard drive to my new mac

    I had to get a new hard drive put into my MacBook Pro because it was failing. So I moved all my data onto an external hard drive so I would not lose anything. But now I cannot get iPhoto or iMovie back onto my computer. It says I cannot open them because they are a Time Machine backup. I don't want to have to purchase iPhoto and iMovie again just because I had to replace my hard drive.

    Copying the files directly from a Time Machine backup may be possible but I don't recommend it. Your new hard disk is essentially a new computer to Time Machine, and it will have problems understanding where to put the individual files. What you're trying to do is just not the right way to go about it.
    To restore from a Time Machine backup, boot Lion Recovery and select "Restore from Time Machine backup".
    See Pondini's explanation here: http://pondini.org/TM/14.html
