Best practice to combine Lookout 6.5 database files?

I have a year's worth of Lookout 6.5 database log files that need to be combined with the new ones being created. What is the procedure to make this happen and to make the old data available for reports?

You need to use the Archiving feature:
1. Create a new database.
2. Attach the old databases.
3. Archive the old databases into the new database.
Now it is a single database.
Keep in mind that if you have changed machine/process names, you will have more work to do.
Forshock - Consult.Develop.Solve.

Similar Messages

  • Best Practices for deployment of Oracle 10g database.

    Hello,
    Is anyone aware of a whitepaper/document that talks about best practices in deploying a database on Oracle 10g and configuring the database to utilize all the features available in 10g (e.g. ADDM, reports setup, etc.)?
    Thanking you in Advance.
    Cheers..rCube

    Appreciate the input Jaffer. Thanks.
    However, I was referring to a Best Practices whitepaper like the one existing for Data Guard & MAA, available at the following URL: http://www.oracle.com/technology/deploy/availability/htdocs/maa.htm
    Is there something available along the same lines ?
    Cheers..rCube

  • SharePoint 2007 - the best practice to detach/remove a content database from a web app without any Farm impact

    Best practice to remove content databases from SharePoint 2007 without any Farm impact

    Hi,
    To remove a content database from a Web application, you can take the following steps:
    1. On the Manage Content Databases page, click the content database that you want to remove.
    2. On the Manage Content Database Settings page, in the Remove Content Database section, select the Remove content database check box. If any sites are currently using this database, a message box appears. Click OK to indicate that you want to proceed with the removal.
    3. Click OK.
    Reference:
    http://technet.microsoft.com/en-us/library/cc262440(v=office.12).aspx
    Best Regards,
    Eric Tao
    TechNet Community Support

  • Best Practice for Distributed TREX: NFS vs cluster file systems

    Hi,
    We are planning to implement a distributed TREX, using RedHat on x64, but we are wondering what the best practice or approach is for configuring the "file server" used in a TREX distributed environment. The guides mention a file server, which seems to be another server connected to a SAN, exporting or sharing the file systems that must be mounted on all the TREX systems (Master, Backup and Slaves). However, we know that the BI Accelerator uses OCFS2 (a cluster file system) to access the storage; in the case of RedHat we have GFS or even OCFS.
    Basically, we would like to know what the best practice is and how other companies are doing it, for a TREX distributed environment using either network file systems or cluster file systems.
    Thanks in advance,
    Zareh

    I would like to add one more thing: in my previous comment I assumed that it is possible to use a cluster file system with TREX because the BI Accelerator does, but maybe that is not supported; it does not seem to be clear in the TREX guides.
    That should be the initial question:
    Are cluster file system solutions supported on a plain TREX implementation?
    Thanks again,
    Zareh

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (it could be almost ANYTHING: WMV, MP4, FLV).  I of course ask for AVIs when possible, but unfortunately, I have no control over the type of file I'm given.
    And, naturally, Premiere (just upgraded to CS5) has a hard time dealing with these files.  The results are unpredictable, ranging from working fine to not working at all, and everything in between.  Naturally, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods.  Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio.  It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware, and has been replaced by Deskshare with newer versions, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODECs, and does not install its own, like a few others do. This DOES mean that if I get footage with an oddball CODEC, I need to go get it and install it on the System.
    I can batch process AV files of most types/CODECs, and convert to DV-AVI Type II w/ 48 kHz 16-bit PCM/WAV Audio at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS (out of sync), my experience has been that it will be static, so I just have to adjust the sync offset by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    The only oddity that I have observed (mostly with DivX or WMVs) is that occasionally PrPro cannot get the file's Duration correct. I found that if I import those problem files into PrElements and then just do an export to the same exact specs, the resulting file (it seems to be 100% identical, but something has to be different - maybe in the header info?) imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most as a converter, though there are just some CODECs that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt

  • What are Best Practice Recommendations for Java EE 7 Property File Configuration?

    Where does application configuration belong in modern Java EE applications? What best practice(s) recommendations do people have?
    By application configuration, I mean settings like connectivity settings to services on other boxes, including external ones (e.g. Twitter and our internal Cassandra servers... for things such as hostnames, credentials, retry attempts), as well as those relating to business logic (things that one might be tempted to store as constants in classes, e.g. days for something to expire, etc.).
    Assumptions:
    We are deploying to a Java EE 7 server (Wildfly 8.1) using a single EAR file, which contains multiple wars and one ejb-jar.
    We will be deploying to a variety of environments: unit testing, local dev installs, and cloud-based infrastructure for UAT, stress testing, and production. **Many of our properties will vary with each of these environments.**
    We are not opposed to coupling property configuration to a DI framework if that is the best practice people recommend.
    All of this is for new development, so we don't have to comply with legacy requirements or restrictions. We're very focused on the current, modern best practices.
    Does configuration belong inside or outside of an EAR?
    If outside of an EAR, where and how best to reliably access them?
    If inside of an EAR we can store it anywhere in the classpath to ease access during execution. But we'd have to re-assemble (and maybe re-build) with each configuration change. And since we'll have multiple environments, we'd need a means to differentiate the files within the EAR. I see two options here:
    Utilize expected file names (e.g. cassandra.properties) and then build multiple environment-specific EARs (e.g. appxyz-PROD.ear).
    Build one EAR (e.g. appxyz.ear) and put all of our various environment configuration files inside it, appending an environment name to each config file name (e.g. cassandra-PROD.properties), and of course adding an environment variable (to the VM or otherwise) so that the code will know which file to pick up; a sketch of this option follows below.
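    For illustration, here is a minimal sketch of the second option. The APP_ENV property name, the EnvConfig class, and the "cassandra" base name are all hypothetical, not part of any framework:
    ```java
    // Minimal sketch, assuming environment-specific files such as
    // cassandra-PROD.properties are packaged on the EAR classpath and an
    // APP_ENV system property or environment variable selects the environment.
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    public final class EnvConfig {

        public static Properties load(String baseName) throws IOException {
            // Prefer -DAPP_ENV=PROD on the JVM; fall back to the OS environment.
            String env = System.getProperty("APP_ENV", System.getenv("APP_ENV"));
            if (env == null) {
                throw new IllegalStateException("APP_ENV is not set");
            }
            String resource = baseName + "-" + env + ".properties";
            Properties props = new Properties();
            try (InputStream in = EnvConfig.class.getClassLoader()
                    .getResourceAsStream(resource)) {
                if (in == null) {
                    throw new IOException("Missing config resource: " + resource);
                }
                props.load(in);
            }
            return props;
        }
    }
    ```
    A call such as EnvConfig.load("cassandra") would then pick up cassandra-PROD.properties when the server is started with -DAPP_ENV=PROD; a CDI producer could wrap this if you prefer injecting configuration values.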
    What are the best practices people can recommend for solving this common challenge?
    Thanks.

    Hi Bob,
    Sometimes when you create a model using a local WSDL file, instead of referring to the URL mentioned in the WSDL file it refers to, say, the "C:\temp" folder from which you picked up that file. You can check the target address of the logical port. Because of this, when you deploy the application on the server, it tries to search the "C:\temp" path instead of the path specified at the soap:address location in the WSDL file.
    The best way is to re-import your Adaptive Web Services model using the URL specified in the WSDL file as the soap:address location,
    like http://<IP>:<PORT>/XISOAPAdapter/MessageServlet?channel<xirequest>
    Or you can ask your XI developer to give you the URL for the web service and the username/password for the server.

  • Best practice? Store images outside the WAR file?

    I have an EAR project with several thousand images that are constantly changing. I do not want to store the images in the WAR project since it will take an extremely long time to redeploy with every image change. What is the best practice for storing images? Is it proper to put them in the WAR and re-deploy? Or is there a better solution?

    Perryier wrote:
    "Can you expand on this? Where do they get deployed and in what format? How do I point to them on a jsp? I am using Sun Application Server 9.0, and I don't really think this has a 'stand alone' web server. How will this impact it?"
    You could install any web server you want (Apache?). The request comes in, and if it matches something like .jpg or .gif or whatever, you serve up the file. If you have a request for a JSP or whatnot, you forward the request to the app server (Sun App Server in your case), i.e. your web server acts as a content-aware proxy.
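    If adding a separate web server isn't an option, another common approach is a small servlet that streams files from a directory outside the WAR. A minimal sketch, assuming a hypothetical /var/app/images directory and an /images/* mapping declared in web.xml (Sun App Server 9.0 is a Servlet 2.5 container, so no annotations):
    ```java
    // Minimal sketch: stream image files from a directory outside the WAR.
    // Map this servlet to /images/* in web.xml; the directory is an assumption.
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ImageServlet extends HttpServlet {

        private static final File IMAGE_DIR = new File("/var/app/images");

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            // For a request to /images/logo.jpg, getPathInfo() is "/logo.jpg".
            File file = new File(IMAGE_DIR, req.getPathInfo());
            // Reject path traversal and missing files.
            if (!file.getCanonicalPath().startsWith(IMAGE_DIR.getCanonicalPath())
                    || !file.isFile()) {
                resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            resp.setContentType(getServletContext().getMimeType(file.getName()));
            resp.setContentLength((int) file.length());
            InputStream in = new FileInputStream(file);
            try {
                OutputStream out = resp.getOutputStream();
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) != -1; ) {
                    out.write(buf, 0, n);
                }
            } finally {
                in.close();
            }
        }
    }
    ```
    A JSP would then reference the files with something like <img src="<%= request.getContextPath() %>/images/logo.jpg" />. The content-aware proxy described above is still preferable at scale, since a dedicated web server serves static files more efficiently.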

  • Best practice on storing the .as and .mxml files

    I have some custom components, and they use their own .as ActionScript files. The custom components are placed in the "src/component" folder right now. Should I place the associated .as files in the same "src/component" folder? What are the suggested best practices?
    Thanks,

    Not quite following what you mean by "associated .as files," but yes, that sounds fine.
    Tracy

  • Best practice for default location of mailbox database(s) / logs

    Hello,
    I don't recall seeing any options during the Exchange 2013 install to specify an alternate location for either the mailbox database or the log files. I've reviewed the commands for moving the mailbox databases, but before reviewing the options for setting a location for the log files, I thought it best to ask whether it's still advised to keep the log and mailbox databases separate from the OS, given that our Exchange server is a virtualised instance.
    Also, I'm assuming that Exchange still requires the usual backup of transaction logs for them to be cleared?
    Many thanks.

    Hi JH,
    Here is a link to storage options and requirements for the Mailbox server:
    http://technet.microsoft.com/en-us/library/ee832792(v=exchg.150).aspx
    By default, when installing a new Exchange server with the Mailbox role, it will create a default database and log path.
    The recommendation is to keep the database and log files on separate disks. You will have to attach those disks first; then you can create a new database using the ECP.
    Please look at my gallery for a full guide on how to set up a new Exchange server:
    http://gallery.technet.microsoft.com/Install-Exchange-server-b5cce9e4
    Also, for future use, you might need to clean up log files to free up space on your Exchange server:
    http://gallery.technet.microsoft.com/Task-Scheduler-to-cleanup-25047622
    Hope this helps
    Please mark as helpful if you find my contribution useful or as an answer if it does answer your question. That will encourage me - and others - to take time out to help you. Thank you! Off2work

  • Best Practices for data storage in the portal database

    Hi,
    I need to store some structured data which is related to the portal only. This data may grow day by day and may reach a huge amount at some point in time.
    I am wondering which is the best way to handle it from a maintenance point of view.
    I think I can store it in R/3 as a custom table, which is easy to maintain and to read/write using RFC.
    The other option is to store it in the portal database (dictionary), which may not be as easy to handle.
    Suggestions, please?
    Thanks.

    The best way is to maintain the (growing) data on R/3 and use JCA to write a simple portal application (JSPDynPage or Web Dynpro) to get the data back to the portal; a sketch follows below.
    Using JCA won't noticeably affect portal performance.
    It's not advisable to store large data in the dictionary.
    Regards,
    N.
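    For illustration, a minimal sketch of the JCA call the answer describes, using only the generic javax.resource.cci interfaces (the SAP Portal connector framework adds its own wrappers on top). The JNDI name, the function name Z_GET_PORTAL_DATA, and the parameter names are hypothetical, and the concrete InteractionSpec comes from the vendor connector:
    ```java
    // Minimal sketch: call an RFC-style backend function through generic JCA CCI.
    // JNDI/function/parameter names are hypothetical assumptions.
    import javax.naming.InitialContext;
    import javax.resource.cci.Connection;
    import javax.resource.cci.ConnectionFactory;
    import javax.resource.cci.Interaction;
    import javax.resource.cci.InteractionSpec;
    import javax.resource.cci.MappedRecord;
    import javax.resource.cci.Record;
    import javax.resource.cci.RecordFactory;

    public final class PortalDataClient {

        public Record fetchData() throws Exception {
            ConnectionFactory factory = (ConnectionFactory)
                    new InitialContext().lookup("java:comp/env/eis/R3Backend");
            Connection conn = factory.getConnection();
            try {
                Interaction interaction = conn.createInteraction();
                RecordFactory records = factory.getRecordFactory();

                // Input record named after the (hypothetical) remote function.
                MappedRecord input = records.createMappedRecord("Z_GET_PORTAL_DATA");
                input.put("IV_KEY", "SOME_KEY"); // hypothetical import parameter

                // The concrete InteractionSpec is supplied by the vendor's
                // connector; it is left as a placeholder in this sketch.
                InteractionSpec spec = null;
                return interaction.execute(spec, input);
            } finally {
                conn.close();
            }
        }
    }
    ```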

  • Best Practice for Developer update access to database in Production

    I am curious to find out what other organizations are doing about developer access to SYSADM in production. For example, do you use a database account, created like SYSADM, that can be checked out for use and locked when not in use?

    Developers can be provided with read-only access to the SYSADM schema.
    Thanks
    Soundappan
    Edited by: Soundappan on Dec 26, 2011 12:00 PM

  • Best practice idea: PDF forms to an Oracle database?

    I'm working on a checklist type of form.
    Most of the items are "Compliant," "Non Compliant," or "Not Applicable."
    So with three check boxes all named "item2" I get the behavior that I want: specifically, there are three options but only one is allowed.
    Presently the plan is to fill in the document, save, and print and/or email. So far so good.
    BUT...
    In the long haul it would be nice for the data to go via email, in some format, to populate an Oracle database.
    Question:
    What type of logic needs to be attached (and where) such that for "item2" the one and only one choice from a universe of three is [answer]?
    I'm guessing JavaScript (I'm not well versed in it yet), but is there something simpler?
    Thanks

    A question with three choices; only one answer.
    To have the check boxes mutually exclusive, I have given them an identical name.
    This gives me the visual behavior I need.
    The project will evolve next year to pushing the value to a back-end database. Presently, if I give each choice a unique name in order to record the answer, then the mutual exclusivity breaks.
    I hail from Authorware development. From that perspective I'd do an "if then else" conditional statement for each choice.
    So in the PDF realm I think it is some type of script that is needed.

  • Transferring film to video: best practice, and combining PAL and NTSC

    Could anyone help me with the following two questions that I was asked in our small school video lab? I don't really have much experience with negative film and NTSC. Thank you so much.
    1. "I may be going back to the film negative to cut it, based on the FCP EDL. This means that Final Cut has to maintain perfect synch. I know that with AVID, it's more reliable to transfer the film to video at 25 fps rather than 24 fps. Do you have any idea whether this is also the case with Final Cut??"
    2. "Some of my source materials is on PAL and some is on NTSC. Is that going to be a nightmare?? Will I be able to convert from one to the other when I import?? Or will I need to get the NTSC miniDV tapes transfered to PAL so that your PAL deck can read them? "
    We normally use PAL (in the UK).

    1. This is where Cinema Tools comes into play. It can conform your edit list from FCP back to film.
    There is a wealth of information in the Cinema Tools handbook and Help menu item.
    Someone else might be able to contribute more information, my experience with CT is very limited.
    2. Some decks are switchable between PAL and NTSC. If yours can do this, then you can capture your footage in a preliminary project and convert it for free with [JES Deinterlacer|http://www.xs4all.nl/~jeschot/home.html], which does a decent job, or for $100 with [Nattress Standards Conversion|http://www.nattress.com/Products/standardsconversion/standardsconversion.htm], which does a very good job. Both will take some time, so it's best to capture only what you really need.
    The best possible conversion is done with dedicated hardware solutions such as those offered by Snell & Wilcox. Real time with excellent results. This would be the way to go if you have a lot of material or if your deck is not PAL - NTSC switchable.

  • Best practice to reduce size of BIA trace files

    Hi,
    I saw an alert on the BIA monitor that says 'check size of trace files'. Most of my trace files are above 20 MB. I clicked on details and it says: "Check the size of your trace files. Remove or move the trace files with the memory usage that is too high or trace files that are no longer needed."
    I would like to reduce these trace files but am not sure what the safest way to do it is. Any suggestion would be appreciated!
    Thanks.
    Mimosa

    Mimosa,
    Let's be clear here first: the tracing set via SM50 is for tracing on the ABAP side of BI, not the BIA.
    Yes, it is safe to move/delete TrexAlertServer.trc, TrexIndexServer.trc, etc. at the OS level. You can also right-click the individual trace when you enter the "Trace" tab in the TREX Admin Tool (python), and I believe there are options to delete them there, but it is certainly okay to do this at the OS level. They are simply recreated when new traces are generated.
    I would recommend that you simply .zip the files and move the .zip files to another folder, in case SAP support needs them to analyze an issue. As long as they aren't huge, and if hard disk space permits, this shouldn't be an issue. After this you then need to delete the trace files. Note that if a trace file has an open handle registered to it, the system won't let you delete/move it. Therefore it might be a good idea to do this task when system activity is low or non-existent.
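    For illustration, a minimal sketch of the zip-and-move step described above, assuming hypothetical trace and archive directory paths; an OS-level shell script would do equally well:
    ```java
    // Minimal sketch: zip each *.trc file into an archive folder, then delete
    // the original. Paths are assumptions; run when system activity is low,
    // since a trace file with an open handle cannot be deleted.
    import java.io.IOException;
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public final class ArchiveTraces {

        public static void main(String[] args) throws IOException {
            Path traceDir = Paths.get("/usr/sap/TRX/trace");   // assumption
            Path archiveDir = Paths.get("/backup/bia-traces"); // assumption
            Files.createDirectories(archiveDir);

            try (DirectoryStream<Path> traces =
                    Files.newDirectoryStream(traceDir, "*.trc")) {
                for (Path trc : traces) {
                    Path zip = archiveDir.resolve(trc.getFileName() + ".zip");
                    try (ZipOutputStream out =
                            new ZipOutputStream(Files.newOutputStream(zip))) {
                        out.putNextEntry(new ZipEntry(trc.getFileName().toString()));
                        Files.copy(trc, out);
                        out.closeEntry();
                    }
                    Files.delete(trc); // fails if the file still has an open handle
                }
            }
        }
    }
    ```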
    2 things also to check:
    1. Make sure the python trace is not on.
    2. In the python TREXAdmin Tool, check the Alerts tab and click "Alert Server Configuration". Make sure the trace level is set to "error".
    Hope that helps. As always check the TOM for any concerns:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/46e11c10-0e01-0010-1182-b02db2e8bafb
    Edited by: Mike Bestvina on Apr 1, 2008 3:59 AM - revised some statements to be more clear

  • Best practice for reducing the number of XML files in an IDML file for translation?

    Our engineering team is looking for ways to reduce the number of XML files produced when a (relatively) simple 2-page INDD file is saved out as IDML for translation.

    IDML contains quite a few XML files, but I suspect you're only interested in the Stories folder if you're working on a translation. The way to do that is... to reduce the number of stories. If it's a two-pager, chances are that you have a whole bunch of unthreaded text frames. Thread them in logical reading order. This will help the translator(s) as well - by threading frames in logical reading order, they don't have to work to read the document in the same order as the target audience.
