Best database for large # of records

LWCVI
SQL Toolkit
I have data logging software that logs to an MS Access database (*.mdb) with 4 internal tables.  As the database gets large (> 100,000 records), the write times become very long, and I have instances of software shutdown.  Even if it does not shut down, the write times stretch to several minutes, which is longer than the data logging interval (every 2 minutes).
I am currently manually archiving and emptying the records on a monthly basis, but it makes it difficult for the user to find data that they need.
Is this an inherent problem with MS Access?  This is the older version of SQL toolkit running under CVI 5.1.  This may be remedied by an upgrade for CVI/SQL or both.  I do not want to spend the $$ if this is a database structure problem.
I can move over to another format, such as MySQL, etc.  Previously, I used a dBase structure/ODBC and did not have these issues.
Any suggestions, help, etc. are welcome.  Thanks in advance,
David E.

Humphrey,
I moved to MS Access because I did not have any way to view the data in the old system remotely (dBase format).  dBase is long gone.  Access will open the dBase database, but it converts it.
I created some nice custom reports in Access to display the data for my customers. 
I actually have found a solution to the large file problems.  Rather than log all the data into a single table (within the database), I create a new table each month.  This allows the table size to stay reasonable, and still allows a search routine to find the data (e.g., May13, June13, etc).
If I keep the number of records in a table < 1M, then the write times are reasonable.
Thanks for the help.  I appreciate the quick reply.
P.S. The optimization link is dead.
David
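For anyone wanting to automate the month-per-table approach David describes, here is a minimal Python sketch of the idea, using an in-memory SQLite database as a stand-in for Access (the table-name scheme matches his May13/June13 examples, but the column layout is an illustrative assumption, not the actual logger's schema):

```python
import sqlite3
from datetime import date

def table_for(when: date) -> str:
    """Route a record to a per-month table, e.g. date(2013, 5, 7) -> 'May13'."""
    return when.strftime("%B%y")

def log_reading(conn, when: date, value: float) -> None:
    """Create the month's table on first use, then append the record.
    Keeping each table under ~1M rows keeps write times reasonable."""
    tbl = table_for(when)
    conn.execute(f'CREATE TABLE IF NOT EXISTS "{tbl}" (ts TEXT, value REAL)')
    conn.execute(f'INSERT INTO "{tbl}" VALUES (?, ?)', (when.isoformat(), value))

conn = sqlite3.connect(":memory:")
log_reading(conn, date(2013, 5, 7), 21.4)
log_reading(conn, date(2013, 6, 1), 22.0)
```

A search routine can then enumerate the month tables and query only those that overlap the requested date range, instead of scanning one huge table.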

Similar Messages

  • Data Model best Practices for Large Data Models

    We are currently rolling out Hyperion IR 11.1.x and are trying to establish best practices for BQYs and how to display these models to our end users.
    So far, we have created an OCE file that limits the selectable tables to only those that are within the model.
    Then, we created a BQY that brings in the tables to a data model, created metatopics for the main tables and integrated the descriptions via lookups in the meta topics.
    This seems to be OK; however, any time I try to add items to a query, as soon as I add columns from different tables, the app freezes up, hogs a bunch of memory, and then closes itself.
    Obviously, this isn't acceptable to give to our end users, so I'm asking for suggestions.
    Are there settings I can change to get around this memory issue? Do I need to use a smaller model?
    And in general, how are you all deploying this tool to your users? Our users are accustomed to a pre-built data model so they can just click and add the fields they want and hit submit. How do I get close to that ideal with this tool?
    thanks for any help/advice.

    I answered my own question. In the case of the large data model, the tool by default was attempting to calculate every possible join path to get from Table A to Table B (even though there is a direct join between them).
    In the data model options, I changed the join setting to use the join path with the least number of topics. This skipped the extraneous steps and allowed me to proceed as normal.
    hope this helps anyone else who may bump into this issue.
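    The "least number of topics" strategy amounts to a shortest-path search over the join graph. A rough Python sketch of the idea (the table names and joins below are made up for illustration, not taken from any real Hyperion model):

```python
from collections import deque

def shortest_join_path(joins, start, goal):
    """joins: list of (table_a, table_b) direct joins. Breadth-first search
    returns the path visiting the fewest tables ("topics"), so a direct
    join between start and goal is always preferred over longer detours."""
    graph = {}
    for a, b in joins:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # tables are not connected
```

    With a direct join present, BFS finds it immediately rather than enumerating every alternative route, which is why the option change stopped the freezes.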

  • Advice on best workflow for large Motion project

    I am a part-time video editor/designer/motion graphics creator/etc. Normally, I work on projects with pieces no longer than 5 minutes, even if the projects themselves might be 30-40 minutes of total material--mostly video support for conferences and awards shows.
    Right now I am embarking upon a much larger project: 10 30-minute segments, each of which is 100% motion graphics. They all involve a speaker against a green screen for the entire segment, with the motion graphics keyed in front of and behind him.
    We recorded this directly to hard drive in a studio that had a VT4 (Video Toaster) system, so the best Mac-compatible codec they could provide me for clean green-screening was full-resolution component video. This is giving me great keys, but I also have about 500 GB of raw footage.
    In this project, I need to first edit all the takes from each episode into a clean 30-minute piece, and then add the motion graphics. And this is where my question comes in. It seems to me FCP is much better for editing the raw video, but that Motion is where I want to do just about everything else. I need to somehow bring the video into Motion, because I want to create "real" shadows against my background from my keyed footage.
    When working with a long project, and with a full-resolution codec, what is my smartest workflow? I am trying to spend the least time possible rendering back and forth, and also avoid generating huge in-between files each step of the way. It seems that any way to approach it has plusses and minuses, so I want to hear from people who have been there which path gets me to my goal with the least hassle.

    I need to somehow bring the video into Motion, because I want to create "real" shadows against my background from my keyed footage.
    "Real" shadows are only faked in Motion. You have many options, including a simple drop shadow, or a copy of your matte layer filled with a gradient, with a gradient blur and a distortion filter applied so it appears to be projected onto the wall. Be sure to take the time to make this a template effect, and to keyframe the shadow angle if the foreground subject moves.
    When working with a long project, and with a full-resolution codec, what is my smartest workflow? I am trying to spend the least time possible rendering back and forth, and also avoid generating huge in-between files each step of the way. It seems that any way to approach it has plusses and minuses, so I want to hear from people who have been there which path gets me to my goal with the least hassle.
    Well, you've got two conflicting interests. One, you have to sync the Motion work with the video of the keyed speaker and, two, you have to edit. But it seems to me that your planning must include lots of design work up front, media you can re-use or modify slightly, text formatting that can be precomped, a large stock of effects you will apply over and over again. Do all of this stuff first.
    You also want to explore working at lower rez through your planning and roughing stages. For instance, there's no reason to pull a full-rez copy of your foreground into Motion if all you need to do is sync to his audio and get rough positioning. You can put him over black, export all of his clips using any medium-to-low-rez codec at reduced frame rates, and just use the Screen blend mode to drop him roughly onto your Motion projects.
    You'll get lots of advice over the next few days. If you're posting to other Motion or motion graphics forums, please do us all a favor and return someday to all of your threads and tell us what you did and what you learned.
    bogiesan

  • Best practice for large form input data.

    I'm developing an HIE system with Flex. I'm looking for a strategy and management layout for close to 20-30 TextInput/ComboBox/Grid controls.
    What is the best way to present this in Flex without a lot of clutter, and without a given panel being way too large and unwieldy?
    The options that I have come up with so far.
    1) Use lots of Tabs combined with lots of accordions, and split panes.
    2) Use popup windows
    3) Use panels that appear, capture the data, then, make the panel go away.
    I'm shying away from the popup windows, as that strategy always results in performance issues.
    Any help is greatly appreciated.
    Thanks.

    In general the Flex navigator containers are the way to go. ViewStack is probably the most versatile, though TabNavigator and Accordion are good. It all depends on your assumed workflow.
    If this post answers your question or helps, please mark it as such.

  • LabView DSC, RSLinx, SLC505 and Data transfer (Best way for large blocks of data?)

    I am currently programming in LabVIEW 6.1 with the Datalogging and Supervisory Control module to transfer 720 floating-point numbers to a SLC505 Allen-Bradley PLC (with Ethernet) using RSLinx as an OPC server. I have found that using the Datasocket Write takes about 30-40 seconds, and making the tags and writing to them takes about the same amount of time. Is it possible to transfer this data faster?
    Thanks,
    Michael Thomson (Surcher)

    Cyril,
    I was just experimenting with different ways to transfer the data from the computer to the PLC. In the past I have built large tag databases with specific tag names. This made the code rather cumbersome when you wanted to write to a large group of tags with descriptive names. I was just using Datasocket Write as a way to transfer the data to the PLC, using code to build the URL and without having the DSC engine running. I have found that importing the tags right from the tag configuration editor, with the names being simply the PLC addresses, and then accessing them with the tag write, is considerably faster (under 5 seconds). I can then build the names in an embedded for/next loop and change them to a tag name before I write to each one. The application is a user interface that allows the machine operator to pick what kind of arch to put on a cabinet door part. With the selections chosen, I calculate the servo moves and download the data to the PLC.
    Thanks for the link!
    Michael Thomson
    "Cyril" wrote in message
    news:[email protected]..
    > Michael,
    >
    > I am a little bit confused about the configuration and the programming
    > here: why are you using Datasocket Write if you are using tags? Are the
    > 720 floating numbers written to 720 different I/O points (registers)?
    > If so, this shouldn't be that slow, especially with DSC.
    > I would strongly encourage you to contact the support at National
    > Instruments for LabVIEW DSC, either by phone or e-mail, and give a
    > detailed description of the issue: www.ni.com/ask.
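    The pattern Michael describes — generating register-style tag names in a loop instead of maintaining a hand-built tag database — can be sketched in Python. The "F8:n" float-file address format and the batch size are assumptions for illustration; substitute whatever addresses your PLC actually uses:

```python
def tag_addresses(count: int, data_file: str = "F8") -> list:
    """Generate SLC-style addresses in a loop, e.g. 'F8:0' ... 'F8:719'."""
    return [f"{data_file}:{i}" for i in range(count)]

def write_batches(addresses, values, batch_size=100):
    """Pair each address with its value and yield bounded write batches,
    mirroring the embedded for/next loop that builds and writes each tag."""
    pairs = list(zip(addresses, values))
    for i in range(0, len(pairs), batch_size):
        yield pairs[i:i + batch_size]
```

    Because the tag names are derived from the addresses, the code never has to look up 720 descriptive names, which is where the original approach lost its time.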

  • Best practices for large ADF projects?

    I've heard mention (for example, in the "ADF Large Projects" thread) of documentation about dealing with large ADF projects. Where exactly is this documentation? I'm interested in questions like whether Fusion web applications can have more than one ViewController project (different names, of course), more than one Model project, the best way to break up applications for ease of maintenance, etc. Thanks.
    Mark

    I'd like to mention something:
    Better to have Unix machines for your development.
    Have at least 3 GB of RAM on Windows machines.
    Create all your commonly used LOVs & VOs first.
    If you use web services extensively, create them as a separate app.
    Make use of popups; they're very user-friendly and fast too, and you don't need to deal with the browser back button.
    If you want to use a common page template, create it at the beginning. It's very difficult to apply one later, after you have developed pages.
    Use declarative components for commonly used forms like address, etc.
    Search the forum; you will see a couple of good util classes.
    When you check in code, watch out: some files, like connections.xml, don't show up in JDev.
    Make use of this forum; you will get answers immediately from great experts.
    http://www.oracle.com/technology/products/jdev/collateral/4gl/papers/Introduction_Best_Practices.pdf

  • XSD validation in database for 1 million records

    Hello All,
    I would like to know the pros and cons of doing XSD validation of a million records in an 11g database, and if possible the processing time taken to do XSD validation for a million records.
    What would be a good datatype to load this XML file of a million records: should it be BLOB/CLOB or varchar2(200000000)?
    Thanks.

    varchar2(200000000)?
    SQL VARCHAR2 is limited to 4000; PL/SQL VARCHAR2 is limited to 32767.
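    The datatype choice falls out of those limits. A small illustrative Python helper (the limits shown are the classic pre-12c defaults; the function name is made up for this sketch):

```python
SQL_VARCHAR2_MAX = 4000      # bytes in the SQL engine (pre-12c default)
PLSQL_VARCHAR2_MAX = 32767   # bytes in PL/SQL

def oracle_text_type(length_bytes: int) -> str:
    """Pick a column type for text of a given size: VARCHAR2 only holds
    small values, so a multi-hundred-MB XML document belongs in a CLOB
    (or XMLType, which also enables schema validation in the database)."""
    if length_bytes <= SQL_VARCHAR2_MAX:
        return f"VARCHAR2({length_bytes})"
    return "CLOB"
```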

  • Vixia HF M50 - Best setting for sports (basketball) recording

    I have been recording for a couple of years with my camcorder.  I think I have been using the default settings; at least I don't remember changing anything.
    Last week, I had to record two games at the same time, so I had someone record with my Nikon digital camera in video mode and I recorded the other game as usual.
    When comparing the two recordings, the video from my digital camera has a lot better color.  The file size is quite a bit larger too.
    Does anyone have a recommended setting for recording indoor basketball games?

    Hi cpuetz!
    Thanks for the post.
    Set the camcorder to the (M) Manual mode, then do the following:
    Press FUNC
    Select [Rec. Programs].
    Select [SCN: Portrait].
    Change the setting to [SCN: Sports].
    Touch [X] to exit.
    Did this answer your question? Please click the Accept as Solution button so that others may find the answer as well.

  • Best compression for large screen presentation?

    I've made a 4-minute movie which is going to be played through a PowerBook during a keynote presentation onto a large screen. Any ideas what sort of compression is best for this purpose?

    Sorenson, MPEG4, and H.264 are each meant for desktop playback.
    Historically, Sorenson is the oldest in this group and H.264 is the newest. Predating Sorenson is Cinepak. Back in the day of QuickTime 2.5, Cinepak was the codec of choice for desktop playback. That, or you would go with a third party hardware based codec like Radius VideoVision or Targa2000.
    If your presentation system will have QuickTime 7 installed, then H.264 is probably your best bet. Have you had a chance to download and play any of the HD trailers from the Apple QuickTime Movie Trailers? You can see H.264 in action as well as get more information about the settings used in those files.
    H.264 takes a good amount of time to compress, so you'll want to be sure to allow for that when transcoding your video. It offers high image quality with low storage requirements and low bandwidth requirements when the right settings are applied.
    What resolution is your source video? In general, try to match your desktop resolution as closely as possible to your video resolution. Also, if viewing the content on a CRT, the refresh rate should be set to a multiple of the frame rate. For example, a refresh rate of 60Hz would be ideal for viewing NTSC material and 50Hz would be ideal for viewing PAL material. Although, from what you've described, you probably won't be concerned with refresh rate, but rather response time (an LCD issue related to the display itself, not display settings).
    If your source video is interlaced, be sure to deinterlace it prior to transcoding your movie.
    If your source video is DV-NTSC or DV-PAL, be sure to conform the footage for square pixel viewing.
    Also, remember that NTSC and PAL are both overscan formats. That is, the outer portion of the frame (typically as little as 4%, but as much as 10%) is scanned over the bevel of the TV frame (in other words, the outer edges are cropped). Depending on your source video, you may want to or need to crop for overscan.
    -Warren
      Mac OS X (10.3.9)  

  • The best AP for large warehouse deployment ?

    Hi,
    Hope you peeps can assist.
    New design / installation at a warehouse....
    What is the best / recommended AP for a warehouse with shelving 10m high and 1.8m apart, and a ceiling at about 11.5m? The warehouse is 129m x 97m; about 60% of the floor will be shelving, and the rest offices with a 3m ceiling.
    My concern is the area where the shelving will be utilized and scanners to be used for stock control. The goods on the shelves will differ from plastics to metal in wooden crates and normal boxes. 
    Thank you in advance for your advice and recommendation. Please also state whether you recommend internal or external antennae.
    David

    It is hard to tell you exactly what you need to do, but for that height I would use an external antenna; the office is fine with internal.  I would use either the 3702e or 2702e for the install.  Most of my clients in manufacturing/distribution have gone with the 3702e's.
    -Scott

  • SolMan CTS+ Best Practices for large WDP Java .SCA files

    As I understand it, CTS+ allows ABAP change management to steward non-ABAP objects.  With ABAP changes, if you have an issue in QA, you simply create a new Transport and correct the issue, eventually moving both Transports to Production (assuming no use of ToC).
    We use ChaRM with CTS+ extensively to transport .SCA files created from NWDI. Some .SCA files can be very large: 300+ MB. Therefore, if we have an issue with a Java WDP application in QA, I assume we are supposed to create a second Transport, attach a new .SCA file, and move it to QA. Eventually, this means moving both Transports (same ChaRM Document) to Production, each one having 300 MB files. Is this SAP's best practice, since all Transports should go to Production? We've seen some issues with Production not being too happy about deploying two 300 MB files in a row.  What about the fact that .SCA files from the same NWDI track are cumulative, so I truly only need the newest one? Any advice?
    FYI - SAP said this was a consulting question and therefore could not address this in my OSS incident.
    Thanks,
    David


  • Best database for JSP

    Does SQL Server work well with JSP applications?
    I assume Access 2000 doesn't work too well, or is not the best choice, with JSP?
    What works the best out of Oracle, MySQL or SQL Server?

    Access isn't designed for high traffic web sites. MySQL is fast and has good support and should be all you need to handle most things.

  • Aperture best practices for large libraries

    Hi,
    I am very new to Aperture and still trying to figure out the best way to take advantage of it.
    I have been using iPhoto for a while, with just under 25,000 images. This amount of images takes up about 53 gig. I recently installed and built an Aperture library, leaving the images in the iPhoto library. Still, the Aperture library is over 23 gig. Is this normal? If I turn off the preview, is the integration with iLife and iWork the only functionality lost?
    Thanks,
    BC
    MacBook Pro   Mac OS X (10.4.10)  

    Still, the Aperture library is over 23 gig. Is this normal?
    If Previews are turned on, yes.
    If I turn off the preview, is the integration with iLife and iWork the only functionality lost?
    Pretty much.
    Ian

  • Best methods for large To: list using cfmail

    I'm getting time-out errors on one of my sites. People can post to a message "corkboard" and site members can subscribe. One of the topics has close to 600 subscribers. The SMTP mail server is IIS's built-in thing, on the same machine as the web server.
    I'd like to continue cflooping through the recipient list so each recipient can get a "Hello Joey, here's an email" at the top of their email, but I'm thinking that if the list gets to >1000 subscribers I'm going to time out after a while.
    What methods, other than giant BCC lists, are people using for high-volume emailing? Perhaps just a better SMTP server, or one on another machine so it's not fighting for processor time with the webserver itself?
    Michael

    Matt Woodward has some sample code on his blog that uses the asynchronous event gateway to send email. The post is here:
    http://mattwoodward.com/blog/index.cfm?commentID=203
    This should take care of your timeout issues...
    Ross Valenti
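    Whichever transport you end up with, the core idea is to keep per-request work bounded: build the personalized bodies up front and hand them off in batches to something asynchronous. A rough Python sketch (names and batch size are illustrative; actual delivery via your SMTP server or event gateway is omitted):

```python
def personalized_messages(subscribers, body):
    """subscribers: list of (name, email). Greet each recipient by name,
    like the 'Hello Joey' line the cfloop adds today."""
    for name, email in subscribers:
        yield email, f"Hello {name}, here's an email\n\n{body}"

def queue_in_batches(messages, batch_size=50):
    """Chunk the messages so each request does a bounded amount of work;
    a background worker (or async event gateway) drains the queue later."""
    batch = []
    for msg in messages:
        batch.append(msg)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch
```

    The request that triggers the mailing only enqueues batches and returns immediately, so list growth past 1000 subscribers no longer pushes the page toward a timeout.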

  • Unity Connection 7.x - Best Practice for Large Report Mailboxes?

    Good morning. We have 150 mailboxes that nurses use to give shift reports. The mailbox quota is 60 MB and the message aging policy is on: deleted messages are deleted after 14 days. The message aging policy is system-wide, and increasing the quota would cause storage issues. Is there a way to keep the message aging policy but reduce it for one group of users? Is there a way to bulk-admin the mailbox quota changes?
    Version 7.1.3ES9.21004-9
    Thanks

    As for UC 8x, you're not alone.  I don't typically recommend going to an 8.0 release (no offense to Cisco).  Let things get vetted a bit and then start looking for the recommended stable version to migrate to.
    As for bulk changes to mailbox store configurations for users, Jeff (Lindborg) may be able to correct me if I am wrong here.  But with the given tools, I don't think there is a way to bulk edit or update the mailbox info for users (i.e., turn the Message Aging Policy on/off).  There's no access to those values via Bulk Edit and no associated fields in the BAT format either.
    Now, with that said - no one knows better than Lindborg when it comes to Unity.  So I defer to him on that point.
    Hailey
    Please rate helpful posts!
