Seeking advice for timer design

Hi,
I am designing a timer application with Flex Builder 3 which
is basically a stopwatch that will remember when it was started if
the page is reloaded (on a different computer, for instance). My
original plan was to use JavaScript and MySQL, but I'd like to give
it a try with Flex first.
Since Flex doesn't interact directly with databases, I'm
wondering if using MySQL to store the timer settings is the best
idea? I think I could manage interfacing it with PHP, but I'm not
sure how much trouble this is going to be since I'm a new Flex
user.
How would you approach the project?
Thanks,
tekfetish

Interfacing with a server part is very simple using Flex; PHP
would be just fine, as would any other server-side
language/system.
Here is an ActionScript idea:
You just create an HTTPService variable:
// gateway: the communication layer with the server-side PHP code
private var gateway:HTTPService = new HTTPService();
Then set the properties of this service:
gateway.url = 'http://www.myServer.com/myScript.php';
gateway.method = "POST";
gateway.useProxy = false;
If you'd like to pass parameters, use something like this:
var parameters:Object = new Object();
parameters['someNameHere'] = 'John Doe';
parameters['whichTimer'] = 'timer 1';
gateway.request = parameters;
Then, define which functions should be called once the
HTTPService returns (resultHandler on success and faultHandler on
error):
gateway.addEventListener(ResultEvent.RESULT, resultHandler);
gateway.addEventListener(FaultEvent.FAULT, faultHandler);
Then do something in these functions:
public function resultHandler(e:ResultEvent):void {
    // your code here...
}
public function faultHandler(e:FaultEvent):void {
    var errorMessage:String = "Connection error: " + e.fault.faultString;
    if (e.fault.faultDetail) {
        errorMessage += "\n\nAdditional detail: " + e.fault.faultDetail;
    }
    Alert.show(errorMessage);
}
Finally, perform your request:
gateway.send();

Similar Messages

  • Mac Pro buying advice for Graphic Design

    I am going to be purchasing a Mac Pro and was hoping I could get some advice on the right set-up for what I do. I use CS4 heavily on the Mac for graphic design spending a good amount of time in Photoshop working with files that can easily go over a few gigs (when working on convention booth displays, etc.). I also use my Mac for casual video editing in iMovie, and use Apple's other great offerings like iWeb, iPhoto, etc.
    My main questions are:
    - I'm leaning towards the Quad. I keep my Macs for at least 5 years. Is that a good decision?
    - If I do go with the Quad should I opt for the 2.66 or 2.93?
    - At the moment I am running 8 Gig on my current Mac, should I configure this new one with 8 Gig?
    - Is the Radeon the correct card choice for a heavy Photoshop user?
    I hope these questions aren't ones that have been asked a thousand times. It's a lot of money to spend (especially right now) and I want to make sure I configure it correctly for what I do.
    Thank you in advance for your input.

    Hi hatter,
    Thank you again for your responses. That really helps clear up the RAM situation. I think that is part of what swayed me towards the 8-Core as opposed to the Quad. Please correct me if I am wrong, but looking at prices of RAM on Crucial I see that if I wanted today to put 16 Gigs of RAM in a Quad-Core it would cost me $1200 (for 4 4-Gig chips using all slots). However if I was going to put 16 in the 8-Core I could do it for $400 (8 2-Gig chips using all slots. Actually it would only be $200 more since I configured my Mac with 4 2-Gig chips from Apple for an extra $100). Seems like having more slots open will save me money when I want to upgrade ram since I don't have to go for the 4-Gig chips. That coupled with the theory that Snow Leopard is around the corner and will take advantage of the 8-Core it seems like opting for a low end 8-Core vs a high end Quad is worth the $300 difference.
    Side note: Are there any reports that say running more 2GB chips is worse than running fewer 4GB chips? Just curious.
    Even though I understand that the processor speed makes a difference I am hoping that the advantages of more RAM slots and future benefits of Snow Leopard, CS5 and other apps that will take advantage of the 8-Core in the near future will make it the right decision. If I could afford a faster 8-Core obviously I would opt for that but with a jump of $1,400 to the next processor speed I am afraid it is out of my reach.
    Regardless I'm sure I will feel very spoiled as soon as I unwrap this beast and get it up and running. Then in a year I'll give into the temptation of slipping a SSD in for the boot drive and I'll have to find a way to contain my excitement. I have a SSD on my laptop and WOW what an amazing difference it really is.
    Wow, these forums are great. I can't believe how helpful all of your responses have been. I feel much more informed and was able to order my Mac today not feeling like I was shopping in the dark.
    Thank you all again!

  • Seeking advice for backing up Xserve G5 w/ RAID PCI card

    Hello all!
    I'm a newbie to Macs and server admin, and I have inherited the job of setting up a server at work for file storage. I'll do my best to give a concise description of our set-up - I'm looking for some advice on the last few odds and ends... mainly how we can backup the system.
    We bought an Xserve G5 with an optional RAID PCI card. We have 3 500GB drives in the Xserve, configured as RAID 5 (giving us effectively 1TB of storage space). We will be using the server for data storage. About 20 computers will access the server over a network; we are using an assortment of Macs and PCs.
    I am seeking advice on backup systems for the server. In the event that the RAID 5 fails, we don't want to lose our data. We just need a snapshot of the server; we don't need to archive data or to take the HD offsite. Our old server just used Retrospect to run incremental backups every night (with a complete clean backup once a month). Our current thought is to attach a large external hard drive to our admin computer (not the server directly) and run nightly backups as before.
    The major points I have are:
    -Any thoughts on reliable 1 TB external drives?
    -Any recommendations on software that can backup from a RAID over a network? I found info for Retrospect Server 6.0 - it seems to do what we want, but is rather pricey.
    Thanks in advance for any advice! I really appreciate it!
    Xserve G5 Mac OS X (10.4.2)

    Greetings-
    We all started out as newbies at one time or another-no worries. That is why we are here.
    I personally use the Lacie branded drives. They are sturdy and reliable. My only thoughts here are to have at least three external drives-one for today's backup, one for a backup stored securely nearby, and one to be securely stored off-site. Rotate the drives daily so that your worst-case scenario will be a catastrophe that requires you to use a three-day old backup, depending upon how old your off-site backup is. Not ideal but better than the alternative. External drives are cheap enough these days to allow you to do this at a reasonable cost. Plus it is easy enough to throw one in a briefcase and tote it home for safety (just don't lose it!)
    I would stay away from Retrospect. If you search these forums you will find several instances of folks having some serious issues with the program. I use a program called Carbon Copy Cloner that does the job nicely for my basement server. There are ways to do the backups via the command line interface as well but I am not so familiar with those commands. You may have to dig a little deeper to find something that works for you.
    One of the other advantages of the FW external drive is that you can share it with other users, so perhaps you can set things up to have your network backups to go to that drive. Tis a thought.
    Luck-
    -DaddyPaycheck

  • Arch workflow design advice for a designer?

    Sorry for the ambiguous title, I couldn't figure out what to call this post.
    I'm new to Arch, though not Linux, and I must say, this is an amazing distro (I'm on the 64bit version). Dead simple, super fast, and nearly as flexible as a Gentoo system (that can use binaries!). Pacman is rockin'.
    I'm a designer by trade: Web, video, and image. And I STILL boot into Windows for important tasks like Flash work, video work, and ftp work. I would obviously like to reduce that dependency, though there is little hope in the video department, right now.
    But for web, I see no reason I couldn't do it all in linux. But I'm not sure how to go about it. Here is the workflow I need, and I was wondering if you could advise how I might set up such a system (I have just a base system with Gnome installed right now):
    * WYSIWYG html and CSS editing (NVU/Kompose is fine for html, but NOT for CSS) for the design phase
    * A way to output image slices with html (does GIMP do this?)
    * Accurate web fonts
    * Reliable ftp, preferably one with drag n' drop functionality (I use filezilla on Windows, but I think the linux version lacks the drag n' drop)
    It's not a real complicated workflow, I just need to save time wherever possible because I need to work very fast. In windows, it's like having a ball and chain strapped to your leg, but it does work. With linux, I will very much appreciate access to terminal and file management advantages.
    I'm not stuck on Gnome, I just like the simplicity. I'm mainly interested in speed and efficiency (NOTE efficiency... I like time savers and fluxbox always seems to add clicks to my tasks). Let me know what you think! I may be able to move my flash work over with a little help from VirtualBox too, but I think I'm stuck when it comes to video . Thanks for any advice you might have!

    No offense, but using WYSIWYG to design web pages doesn't sound very professional imo. They just don't offer the control that one would want over the code. I have tried a few (Frontpage, Dreamweaver, NVU, Bluefish, ...) and they all suck. They just don't do what you want them to. You drag something or add some formatting and it just messes up the code. It's better to just use a text editor and view the results in a browser. Maybe that's slow or inefficient for you, but I find that's the best way to do it.
    As for image slicing, I find that annoying as well. In Photoshop I never really liked the way it worked. I sliced a few images and then trashed most of the others. I tend to go for simple designs and focus on making it mostly CSS, so when I slice images it's usually a 1px wide/high gradient which would get repeated. I don't need image slicing for that. As for graphic intensive sites... well... really, you should review that. People still have slow connections and having a lot of graphics is just bad, even if your client wants it. You might as well go with flash, and waste some more bandwidth
    If you really want to do it though, I think Inkscape is quite a nice tool. I do all my designing in it, and though I don't use slicing, you can do it quite easily (though it's a bit hackish) by adding a layer and creating transparent rectangles around the stuff you want, then just select the rectangle and export it. I'm not sure if there's a more automatic way - there are plenty of tutorials.
    The MS-fonts should be fine, I just want to know that I am looking at an accurate representation of what my Windows customers will see.
    Fonts won't help you much there. You know most people use IE, so you need to view the website in IE regardless, and that means you need Windows (I think wine uses some weird IE version which uses gecko). Maybe there's some good Linux alternative for viewing stuff in IE, but I just view it on Windows. Also the font shouldn't change the general layout of the site... I don't see how that would be a problem unless it's some weird font that not everyone has, in which case you'd use @font-face anyway...

  • Mac Pro drive configuration/expansion advice for Time Machine

    I have the following:
    Mac Pro 1Tb internal drive (2 x 500Gb in software RAID, so 2 drive bays used, 2 free)
    500Gb external drive (Formac XTR platinum - 2 x 250Gb in RAID)
    I use SuperDuper to backup the internal to the external - I'm using about 440Gb on the internal and this fits on the external with about 10-20Gb spare. However, my usage of the internal drive is growing, and as Time Machine backups grow in size, my 500Gb external will very quickly lose the ability to go back in time and shortly afterwards run out of space for even a simple backup.
    My options are:
    1. Buy 2 500Gb IDE drives and upgrade the external to 1Tb - Cost about £140. If I put the redundant 250Gb drives in enclosures and sell them, net cost would come down to about £100
    2. Buy single 750Gb SATA drive and install internally for TM backups - cost £110. Sell external drive and cost comes down to about £40
    3. Buy 2 500Gb SATA drives and install inside Mac Pro for Time Machine - Cost about £130. Sell external and cost about £60
    4. Buy single 1Tb SATA drive and install internally for TM backups - cost £200. Sell external and cost about £130
    Pros & Cons
    Option 1 - Con: Left with untidy external enclosure. Pro - 2 internal bays free for expansion.
    Option 2 - Con: Less internal expansion, Pro: No untidy external box, can add another 750Gb drive when required
    Option 3 - Con: No internal expansion. Pro: Ermmm, not sure.
    Option 4 - Con: Most expensive. Pro: Longer time required before needing to add another drive.
    I know external drives are useful for off-site backups but I do have 160Gb and 320Gb external drives for that purpose (and for TM'ing my MacBook).
    I think I've answered my own question - get a 750Gb SATA drive and stick it inside.
    Regards,
    Steve

    I am in a similar situation. I have a 160gb for my boot/system drive and two 500gb for data, backed up to two other 500gb drives. Both 500gb drives are about 80% full and I bring in 50 or so gigs a week, paring it down quite a bit, then importing new photos and culling them out again. I may replace my 160 backup drive for my boot partition with a 750gig, but I don't see Time Machine being capable of backing up a 500 or 750gig drive that has several hundred gigs of current data and moves gig after gig onto and then out of the hard drive. I think Time Machine will be really great for many many people, and if it allows me to "roll back" my system to an earlier known-good point in time even better than my current method of using SuperDuper when I feel my system is stable, then Time Machine will be great for me too, but I don't see how it's going to be usable for data backup for heavy professional photographic work or for video backup where large amounts of data are entered, deleted, entered again, deleted, etc. There isn't enough anecdotal information on Time Machine to know what size is going to be really required, but I think most folks will be surprised at how much room is going to be needed - especially those that use their machines heavily. Mom and Dad who do nothing but a few emails and surfing the web will have no problem; Mac Pro users involved with graphics and video... not so easy, I'm afraid. Time will tell.

  • Seeking advice for a self-teaching programme of new Java techs

    Hello,
    after a few years of technically poor projects (I learnt a lot in terms of process, quality, and non-Java technologies, by the way), I'd like to catch up with mainstream Java development.
    I am preparing a programme to learn a few technologies I have missed over the past years. I plan to develop a couple of utilities (simple CRUD apps) on my spare time, gradually including bits and bits of the bricks I want to grasp.
    I need some advice though to make this learning effective.
    My first priority is to invest in reusable knowledge, so I will stick to "standard" bricks, and will aim mostly at off-the-shelf tools that work out of the box
    (for example, I don't want to spend hours to fix a DB install&configuration issue, as I know I will probably never be paid for that kind of job).
    Specifically, the technologies I plan to embrace are:
    * Java SE 5 and 6
    * jUnit 4
    * Spring
    * EJB 3 & JPA
    * Web UIs
    Here are the points where you might help me:
    * JDK 1.5 vs 1.6
    No real doubt here, the biggest language gap was introduced by 1.5. If I'm correct, 1.6 essentially brings new APIs; as long as those APIs are not my primary target (I'll try them only after I've completed my first round of catch-up :o), I am not missing much language-wise, am I correct?
    I could also jump directly to 1.6, ignoring its new APIs, and concentrate on
    1.5 language features, but the risk is stability (losing time with a 1.6 beta bug).
    * jUnit 4
    Nothing special, except I have practically read nothing about it on these forums (contrast this with TSS, where there are regular threads disputing the relative merits of jUnit vs TestNG).
    Is v4.0 a non-event? It seems to bring some niceties (mark test components using annotations), but how much comfort did it bring to those of you who have used it over jUnit 3.8.1?
    * Spring
    Leaving aside that it's on my must-do list, I actually wonder which version I should try?
    I also feel I should try it first on a non-EJB app, to avoid mixing concerns. I will skip the JDBC wrapping API though, as I already have a lot to do with other persistence APIs.
    Any feedback?
    * EJB3/JPA
    (I formerly worked with generation 2.1 of the EE specs)
    1) The biggest issue here is to find a reliable EJB3 container.
    * I've heard JBoss is EJB3-ready, but I don't think it's certified yet.
    * Project GlassFish (Sun-driven open-source JEE5 container) is still in the making
    * Sun SASP9 is, well, released, but I don't know how much of JEE it supports, and haven't investigated into the licensing details yet
    Any feedback on what you're using as a JEE5 container?
    2) As far as EJB vs JPA goes, I also plan to use the persistence API outside of EJB (likely, I will develop the same simple CRUD app twice, once as a Webapp and once as a Swing app, both over the same DB schema).
    But maybe this is pointless, as the entity annotations and API are agnostic; have you experienced any differences that deserve thorough learning?
    3) Obviously I will need a DB. If the EJB container includes one, fine, at least for the EJB part; otherwise I'll need to install and configure one that will need to run on a desktop PC on MS Windows, and preferably include a handy SQL client console for manual verification.
    I know enough SQL to build an application schema, even a sub-optimal one. However I know nothing of DB administration, and I don't want to invest in that direction.
    Any advice as to a simple off-the-shelf free DB that runs on Windows?
    * Web UIs
    The dilemma here is two-fold:
    1) In terms of "view" technology, I hesitate between plain JSP and JSF.
    If I understand correctly, JSF is a specification for reusable view components.
    My local job market leaves me no opportunity to be a UI component developer, whereas I am more likely to someday be tasked to integrate 3rd-party UI components into a business project.
    Is my surface understanding of JSF correct? In this case, how much of JSF should I know if I don't have to develop JSF components?
    2) In terms of controller, as part of my Spring learning, I'll probably have a look into Spring MVC.
    My question is, how much of the servlet/JSP API does springMVC show/hide?
    Is it a good idea to use springMVC to learn servlets and JSPs, or should I stick to manual servlet+JSP?
    I should add that I've worked with servlets/JSPs formerly (as of J2EE 2 or 3), I only need to dig into the new servlet/JSP features.
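    For anyone following the same catch-up path: the 1.5 language gap mentioned above mostly comes down to generics, the enhanced for loop, enums, autoboxing and varargs. A compact sketch of all five in one place (class and method names are made up for illustration):

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class Java5Features {
        enum Status { OPEN, CLOSED }              // type-safe enums

        // generics + varargs in one declaration
        static <T> List<T> listOf(T... items) {
            List<T> list = new ArrayList<T>();
            for (T item : items) {                // enhanced for loop
                list.add(item);
            }
            return list;
        }

        public static void main(String[] args) {
            List<Integer> nums = listOf(1, 2, 3); // autoboxing: int -> Integer
            int sum = 0;
            for (int n : nums) {
                sum += n;                         // auto-unboxing
            }
            System.out.println(sum + " " + Status.OPEN); // prints "6 OPEN"
        }
    }
    ```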

    Jim_Found wrote:
    okay, so normally you would write the following if-elseif block in order to achieve this
    if (MessageType.TYPE_A.equals(myMessageType)) {
        // do this
    } else if (MessageType.TYPE_B.equals(myMessageType)) {
        // do that
    } else if ...
    but, if you use <this design pattern> and create this <class>, you can suppress the above block to
    CoolClass coolObj = new CoolClass();
    coolObj.doMagic(myMessage);
    Funny enough, inside the doMagic() method would be something along the lines of
    if (MessageType.TYPE_A.equals(myMessageType)) {
        // do this
    } else if (MessageType.TYPE_B.equals(myMessageType)) {
        // do that
    } else if ...
    Mel
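    A minimal sketch of what the quoted <design pattern> usually boils down to, using a Java 5 enum so each message type carries its own behavior instead of an if-elseif chain (class and method names here are invented for illustration):

    ```java
    public class MessageDispatch {
        // each constant overrides handle(), so callers never branch on the type
        enum MessageType {
            TYPE_A { String handle(String msg) { return "A handled: " + msg; } },
            TYPE_B { String handle(String msg) { return "B handled: " + msg; } };
            abstract String handle(String msg);
        }

        public static void main(String[] args) {
            MessageType type = MessageType.TYPE_A;
            System.out.println(type.handle("hello")); // prints "A handled: hello"
        }
    }
    ```

    The if-elseif block doesn't vanish so much as move: selecting the enum constant replaces it, which is exactly the joke in the quote above when doMagic() hides the same chain internally.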

  • Need advice for future design and hardware I should purchase now.

    I was wondering if someone could assist me in making a decision on where I should take the future design of my network. Currently the design has no redundancy and I've been told recently that my company has a bit of money to spend and that it needs to spend it within the next 2 weeks. I am fairly new to the company so haven't been able to really think about future redundant designs nor have I studied much about designing networks. There are about 200-300 people that may be using my network at once, all users aren't at the location in question but they may be asking for resources from our servers.
    I've included a basic design of the "core" of my network and would like any suggestions for creating redundancy in the future and also optimizing the way data travels to our servers. Do people generally have redundant Layer 3 switches for the core in small networks such as mine? I will be replacing the 2811 since it only has 100Mbps connections and was thinking, perhaps replace this with a Layer 3 switch with the plan to have another identical Layer 3 switch installed to offer redundancy in the future.
    Also, would it be a good idea to move the servers into a vlan on the core? Thanks for any advice on what I should be purchasing now with plans for redundancy being implemented over a year.  -Mark

    40k Can go pretty quick depending on the scope. Your server farm especially should be dual-homed capable of surviving link, hardware, routing, and software failure.
    It's going to be best practice for your server farm to be in its own logical subnet, so the failover mechanism can be controlled by routing protocols, as opposed to FHRPs such as HSRP/VRRP/GLBP - especially since you can adjust routing timers for sub-second convergence.
    Budget will be the primary limitation (as it always is), but ideally dual 6500's running VSS and FWSM would be the way to go. Data centers should be designed with high availability in mind, hence the need for 2x devices.
    Depending on the size of the SAN/Virtual infrastructure Nexus might want to be considered but you will chew up 40k before you know it.
    Also make sure the server farm is scaled properly. Your server farms should be oversubscribed in a much higher ratio compared to your access layer.
    CCNP, CCIP, CCDP, CCNA: Security/Wireless
    Blog: http://ccie-or-null.net/

  • Parsing XML with Java - seeking advice for method to pursue

    Hi guys
    So here's the deal: I want to parse some XML from a server, get all the data I need from it, then shunt that data to the classes that need it. However, I'm really not sure what the best way of parsing the XML is.
    I've just written a class for obtaining the file, and one for building a DOM from the XML file. But looking at org.w3c.dom in the Java API documentation, I'll have to implement shedloads of interfaces to be able to take advantage of the DOM, and I don't even know a lot about it.
    So am I best just writing a simple parser based on regular expressions to get the data I want? There's about 5 attributes in the XML file that I need to get - 5 for each instance of the containing element. So I don't think it'll be hard to do with regular expressions.. plus even if I did decide to implement all the DOM interfaces, the only way I'd know how to do half the stuff is with regular expressions anyway. I mean, how else would you do it? How else could you match up Nodes according to the Strings you're using for the elements, attributes etc?
    I worry that a parser using regular expressions might be too slow... I'm building this as an applet to visually display information from the server. I have nothing to support those fears, I'm just not experienced enough to know whether speed would be a problem if I chose this route... I don't think it would, but really need confirmation of that suspicion being unfounded.
    Any advice would be very, very welcome, as I'm tearing my hair out at the moment, unsure what to do.

    Komodo wrote:But JDOM is not in the core class libraries, is it? So if I want to create an applet embedded into a website, am I able to get people to download that as well?
    Sorry, I don't know anything about applets.
    Komodo wrote:Everyone's advice is appreciated, but my core question remains unanswered - would using regular expressions, considering how simple and unchanging the XML files are, be a viable option in terms of speed?
    Yes! I've done more than my fair share of XML processing with REs. It's not always easy. I often wish I could just use something XPath-like. But it's certainly easy to do and would probably mean that you'd have something up and running quicker than if you spent time investigating pure XML parsing approaches.
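    On the DOM worry above: you don't implement the org.w3c.dom interfaces yourself; the JAXP parser bundled with the JDK hands you ready-made Document and Element objects, so pulling a handful of attributes per element is short. A sketch, with invented element and attribute names:

    ```java
    import java.io.ByteArrayInputStream;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class DomSketch {
        // collect the "name" attribute of every <item> element
        static String[] namesOf(String xml) {
            try {
                DocumentBuilder builder =
                        DocumentBuilderFactory.newInstance().newDocumentBuilder();
                Document doc = builder.parse(
                        new ByteArrayInputStream(xml.getBytes("UTF-8")));
                NodeList items = doc.getElementsByTagName("item");
                String[] names = new String[items.getLength()];
                for (int i = 0; i < items.getLength(); i++) {
                    names[i] = ((Element) items.item(i)).getAttribute("name");
                }
                return names;
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }

        public static void main(String[] args) {
            String xml = "<root><item name='a'/><item name='b'/></root>";
            for (String n : namesOf(xml)) {
                System.out.println(n); // prints "a" then "b"
            }
        }
    }
    ```

    That said, the regex answer stands for simple, stable files; this is just to show the DOM route is smaller than the interface list makes it look.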

  • Seeking advice regarding Time Dimension - Correction and Customization

    Dear all:
    Situation
    1. We have a measure called HTD, Half Year to Date, which aggregates 6 months worth of data. However, I just realized that the previous consultants set HTD at LEVEL = "Week". It works fine but I feel that this naming convention is confusing, and would like to rename HTD's LEVEL.
    2. We have 4 more additional periods after period 12 for adjustment purpose. What is the good practice to have a year with 16 periods?
    Data
    Current LEVEL for Half Year to Date is named "Week". I would like to rename this to something like "HalfYr"
    200x.01 - 12 are regular periods
    200x.13 - 16 are additional periods
    Question
    1. If I go to SQL's back end and rename my LEVEL, and then update my Time dimension, is this going to pose a problem to my historical data?
    2. I would like to include additional 4 more periods to sync with BW, but at the same time not confuse my BPC users in running reports at month of December. In other words, how do I integrate the 16 period time dimension design into the concept of "1 yr = 12 months (periods)"?

    Tim:
    Sorry for the late reply.
    We are running BPC 5.1 SP3 and SQL 2003
    Yes, MONTH, QUARTER, YEAR are used for LEVEL property, and the only irrelevance is HALF YEAR which has WEEK in LEVEL property.
    I think perhaps a separate data source makes sense... Have you handled any case like this before and how did you manage the "additional" periods?
    Thank you! Have a great day!
    Sincerely,
    Brian

  • Seeking advice with concept / design of a Java Web Services Application

    Hi all,
    After a week of searching the internet I'm still not sure which projects, services, etc. I should use to develop my application. Please could someone offer some advice?
    Application outline (Java application running on Linux):
    1. Wait for an instruction from a Windows WCF application. Instruction contains a list of domain names and one or two other parameters.
    2. Perform queries upon those domain names (find if they are registered, etc. - takes 10-20 mins to do complete list)
    3. Send back results to WCF application.
    Solution 1 (first idea):
    1. Create SOAP web service using Java Web Services / Apache to listen for requests. Executes a Java Client Application upon request.
    2. Java Client Application performs the queries...
    3. Java Client Application sends results to the Windows web service.
    Solution 1 Problems:
    a) Using 2 applications, they won't be able to share the same memory.
    b) The whole process will be slow (having to pass the domain names from the service to application and execute the application each time) - A quick response is critical.
    c) Would like is for the whole process to be done under one application, sharing the same memory.
    Solution 2:
    1. Create a Java Daemon from scratch listening for an incoming SOAP message (no web service like Apache/Tomcat involved).
    2. Query the domain names in a new thread inside the Java Daemon
    3. Send the results back via SOAP.
    Solution 2 Problems:
    Cannot find examples of how to create a SOAP service from scratch. E.g. creating a WSDL file based on my application; converting application methods to SOAP-callable methods easily (without writing a framework).
    With all the Java tools and projects out there - Java EE, Glassfish Project - there must be a very easy way to achieve this seemingly simple task. Please can someone offer some advice?
    Many thanks for reading this.
    Richard
    Edited by: jazzinbrazil on Mar 30, 2009 4:58 AM

    You just need an app server like Tomcat.
    If I'm not wrong, Tomcat is a Servlet container. Servlets aren't deactivated when they don't receive any request for some time? How can I deploy an application to Tomcat in order to keep it always active?
    I don't know what you mean. Tomcat is an application that is always running. In what way are the Servlets deactivated?
    Apache Axis: http://ws.apache.org/axis/
    Yes, I'm collecting some info about this... let's see it!
    Finally, to be clearer... I don't want to start a new application at each invocation (something like getting the request, instantiating the necessary classes and executing them) but to call an already running app at each invocation (so, getting the request and invoking, in a manner that I don't know, the running application).
    The container manages this. If you have data that must remain loaded, you can associate it with the class (use a static modifier). This will complicate threading, however.
    I can use Axis to get the request, but does it also guarantee that my app will always be active?
    I think you are just using the wrong terminology here. What I think you are asking is whether the resources will be loaded into memory at all times. If you want to ensure this behavior, you need to associate the data with a class. I'm not 100% positive, but I don't think Tomcat will unload classes in normal circumstances.
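    On the Solution 2 idea of a standalone daemon: since Java 6 the JDK bundles com.sun.net.httpserver, so an always-running listener needs no Tomcat or Axis at all - though you would still have to parse and build the SOAP envelopes yourself. A rough sketch (the context path and response body are placeholders):

    ```java
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import com.sun.net.httpserver.HttpExchange;
    import com.sun.net.httpserver.HttpHandler;
    import com.sun.net.httpserver.HttpServer;

    public class DaemonSketch {
        // start an always-running listener; port 0 means "pick any free port"
        static HttpServer start(int port) {
            try {
                HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
                server.createContext("/query", new HttpHandler() {
                    public void handle(HttpExchange exchange) throws IOException {
                        // real code would read the SOAP request body here and
                        // hand the domain list to a worker thread
                        byte[] reply = "<result>queued</result>".getBytes("UTF-8");
                        exchange.sendResponseHeaders(200, reply.length);
                        OutputStream out = exchange.getResponseBody();
                        out.write(reply);
                        out.close();
                    }
                });
                server.start();
                return server;
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }

        public static void main(String[] args) {
            HttpServer server = start(0);
            System.out.println("Listening on port " + server.getAddress().getPort());
            server.stop(0);
        }
    }
    ```

    Because the process stays up, anything kept in fields survives between requests - which addresses the shared-memory concern in the question without any container.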

  • Using Suitcase Fusion/Advice for a Designer

    For the longest time, I would just add whatever fonts I could find to Fusion and the program would add them without any problem. However, now that I am doing more freelance, I need to be really careful with what fonts I use for my clients. I wouldn't want to use unlicensed fonts for a job. Consequently, I now need to think up the best workflow for using Fusion for freelancing. I know I can add temporary fonts to Fusion. But what if I am doing a job for a month for someone? Would you then add the fonts permanently? Then after the job is over would you remove them? What would be the best way to remove the fonts? How could you be sure you aren't removing the wrong fonts? For example, what if a client sends you a Helvetica to use, and it happens to be the exact same Helvetica that you own, and is consequently already in Fusion? I imagine Fusion would not allow that other Helvetica into the database since it is already in there. My problem is, when you are done with the job and you go to delete all the fonts because you no longer need them, wouldn't you end up deleting your own Helvetica? Any advice on the best workflow would be much appreciated.

    Hi kpdesigns,
    The main thing to remember is that your fonts are never part of Suitcase. Everything listed in the active font lists is a symbolic link only. If you "remove" fonts from Suitcase, you are only deactivating them. The originals are still wherever they originally were on the hard drive.
    It's a bit different if you're using Fusion's option to put activated fonts in the vault. Those are then copies of the fonts you activated, stored in Suitcase's own database. Unless, in a sudden burst of insanity, you turned on the option to delete the originals after adding fonts to the vault. When you deactivate them, they remain in the vault, so you can go to the Closed Fonts panel and reopen them without having the originals on hand.
    There are problems with that workflow, though. What if you need to open a modified copy of Helvetica? Will Suitcase allow you to override the font in the vault? Will it even let you add it, since they have the same name? To me, the vault isn't very useful, so that option is always off on my Macs. It's also downright dangerous if you're silly enough to let it delete your originals when adding fonts.
    Here's the workflow we used to best effect in the prepress shops I've worked at.
    1) Keep your open fonts to the bare minimum on your Mac so client fonts you activate are unlikely to conflict with an already open font of the same name. See my article, Font Management in Mac OS X Tiger and Panther, for a list of minimum fonts.
    2) Keep client fonts with each job they are for. Create a folder for a job when it comes in. Within that folder, create a folder for the fonts that belong with that job. So if the job folder is called Florida Spa Flyer, name the folder within for the fonts Florida Spa Fonts to make the association easy to remember. Once you have all of the fonts for that job in the Fonts folder, drag and drop it into the Suitcase window. A new font set will be created with the same name as the folder the fonts are in, making it easy to see which fonts belong with which job.
    3) Since fonts in Suitcase are only links, you can have multiple sets for all of your separate projects. Just activate whichever set matches the job you're currently on and deactivate the other sets. That way you can have multiple versions of Helvetica, Garamond or whatever in Suitcase. Since they're all separated by sets that activate only from the job folder you added them from, you'll always be using the correct fonts supplied by the client for that job.
    4) When you're done with a project, simply deactivate and delete that set from Suitcase. The actual fonts will remain where they are and can be archived with the final documents so they stay together.
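    The folder-to-set association in step 2 is easy to reason about because it's purely mechanical. As a purely illustrative sketch (none of this is Suitcase's API; the class, paths, and file names are invented), the same grouping of font files by their job folder could be computed like this:

```java
// Sketch: group client font files into per-job "sets" keyed by the name of
// the folder they live in, mirroring the drag-and-drop behavior described
// above. All names and paths here are hypothetical.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FontSets {
    // Map each immediate subfolder name (the "set") to the files inside it.
    static Map<String, List<String>> setsFromJobs(Path jobsRoot) throws IOException {
        try (Stream<Path> jobs = Files.list(jobsRoot)) {
            return jobs.filter(Files::isDirectory)
                       .collect(Collectors.toMap(
                           dir -> dir.getFileName().toString(),
                           FontSets::fontNames));
        }
    }

    // Sorted file names directly inside one job's font folder.
    static List<String> fontNames(Path dir) {
        try (Stream<Path> files = Files.list(dir)) {
            return files.filter(Files::isRegularFile)
                        .map(p -> p.getFileName().toString())
                        .sorted()
                        .collect(Collectors.toList());
        } catch (IOException e) {
            return List.of();
        }
    }

    public static void main(String[] args) throws IOException {
        setsFromJobs(Paths.get(args[0]))
            .forEach((set, fonts) -> System.out.println(set + ": " + fonts));
    }
}
```

    The point of the sketch is the invariant it encodes: every font belongs to exactly one job folder, so deleting a set can never touch fonts from another job.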
    The link, or one of the links above, directs you to my personal web site. While the information is free, it does ask for a contribution. As such, I am required by Apple's rules for these discussions to include the following disclaimer.
    I may receive some form of compensation, financial or otherwise, from my recommendation or link.

  • Seeking advice for a nagging BT related issue

    Dear Forum,
    I would appreciate your advice on a performance issue related to using the BT SW stack.
    *Some information*:
    Non-Toshiba notebook, Intel dual core @ 1.2 GHz, 2 GB RAM, running WinXP pro (5.1.2600 SP2 Build 2600), all MS updates, company computer, MS Office 2003. I got the PC two months ago and separately purchased the 'Wireless 360' internal Bluetooth USB module (Toshiba device, VID 413C, PID 8140, firmware 43.15). I installed the hardware module and BT software stack (Toshiba BT Stack, version v4.41.02(D)). I'm using BT mainly to connect to my mobile phone, for which 'Nokia Suite' v6.85.141 is installed. The Nokia Suite works fine and as expected using the BT link.
    *Problem*:
    When opening a .doc/.xls file by double clicking it in Windows Explorer, the system response is very slow. Like 40 (!) secs to open a 30 KB Word document. This is reproducible. For some other files, like .pdf the response is much better. Starting Word via the start button and using Word's file/open to access the same file is OK. Opening Windows Explorer using Windows-E hotkey is very slow as well, opening Windows Explorer via start menu is normal.
    I do not see any 'heavy' disk access or 100% CPU during the delay.
    For the rest, the notebook seems to run OK.
    *Findings so far*:
    1) Re-installing the Toshiba BT Stack did not improve things
    2) When shutting down the PC, the TosOBEX.exe process frequently needs to be ended 'manually'
    3) Uninstalling the BT SW stack makes the PC respond normally again
    4) Disabling the BT Manager (system tray, right-click BT icon, exit) has the same effect
    5) Stopping the TosBTmng.exe process removed the problematic effect as well
    6) Starting the PC with all non-Microsoft processes disabled - except the BT stack - gives normal responsiveness (!)
    7) Uninstalling the McAfee virus scanning tools does not help
    8) Replacing the BT stack with version 5.10.12(T) solves the issue, but creates another problem: the "services.exe" process continues to consume too much CPU time (~35-40%)
    9) I have not been able (yet) to identify the non-BT process(es) that cause(s) this problem
    *Questions:*
    Anybody familiar with this effect?
    Any suggestions for next steps?
    The forum's advice would be appreciated!
    Wytze

    Hi people. Same thing here.
    My Satellite A100 delays opening word and excel 2003 documents only when bluetooth manager is running.
    I don't see a cpu load on the services process though (or any other process for that matter).
    I had version 4 installed when I detected the problem. I switched to 5.10.12(T) and the wait time is still there.
    Then I followed the previous post's suggestion of wiping out everything BT-related and installing again. But still no luck.
    Even reinstalled office but that didn't help either.
    Some maybe useful information: I'm running Nokia PC Suite 6.85 over bluetooth and Adobe Acrobat Professional 8 which integrates menus in Word and Excel.
    Wait time still occurs when killing all PC Suite processes or changing acrobat installation to not integrate in word.
    This kind of wait resembles some kind of name resolution timeout. But why only Office apps?
    Does anyone have any light to shed on this?
    Regards
    --Fernando

  • Seeking advice on Zone design

    I have a ZCM Zone originally built with ZCM 10.2 that has been updated over the years to 11.2.3 and will soon be updated to 11.3.1.
    The Zone services 15,000 devices with over 30k users (school district). The network topology is central; each school has a 10Gb fiber link to the data center. The entire Zone is built on a VMware vSphere 5 platform.
    The current design consists of 1 dedicated ZCC server, 1 dedicated Inventory server, 2 dedicated image servers, and 12 Authentication/Content/Config servers. The database is SQL 2008 R2 on its own VM. A typical guest machine config uses 2 vCPU and 8GB vRAM. All run on Windows 2008 R2 SP1. The SQL server uses 2 vCPU and 24GB of vRAM. The DB has grown to 43GB in size (and gets up to 60+GB before DB maintenance operations are run).
    In ZCM 10, the closest server rules were setup to split the user traffic among four selected servers for the site. When the closest server rules allowed for groups, it was enabled to get the round robin functionality. Was never able to get the needed data from my customer to fully implement Locations, so Locations Lite is in use. Pretty much set the default closest server rule to group all 12 Auth servers in a single group. It has worked to split the load quite well among the 12 Primary servers.
    Only the ZCC server and image servers had their entire VM memory reserved (per VMware best practice for a Java app). I was unable to reserve memory for all guest machines, since doing so would cause too many performance issues with other guests. Because of this, I am thinking of swapping out the 12 Primary servers for 12 Satellite servers, but I am unsure of the sanity of making such a change. The Satellite servers would run in the same virtual environment as the Primary servers.
    My hope in doing this change is to improve the authentication speed, and satellite servers seem to be faster in getting the job done. Also reduce the amount of work the database server is doing by reducing the amount of Primary servers talking to it.
    The change almost seems pointless, so I wanted to see what others thought about making such a change.
    thank you

    We definitely want all of the VMware Memory Reserved.
    Consider converting the 2 dedicated imaging servers to Satellite Servers
    with the Imaging role. This will consume far fewer resources, and the
    memory for Satellite Servers is not required to be fully dedicated.
    12 Auth/Content/Config servers is far more than what is necessary for
    15,000 devices, especially with 8GB of RAM. As a test, remove a couple
    of these servers from the "Server Group" and test performance.
    You may also be able to reduce the RAM from 8GB to 6GB on the remaining
    10 servers to allow for dedication.
    The key is that assigning RAM above and beyond what can be dedicated can
    lead to stability issues. It is quite common for servers to fail upgrades
    or crash after upgrades when the RAM is not dedicated, because the servers
    then start hitting and trying to use the non-dedicated RAM that was
    previously unused.
    Also Drop an Email to [email protected]
    I want to email you a utility, but will need your email address.
    Note: Location Lite is just fine.
    Craig Wilson - MCNE, MCSE, CCNA
    Novell Technical Support Engineer
    Novell does not officially monitor these forums.
    Suggestions/Opinions/Statements made by me are solely my own.
    These thoughts may not be shared by either Novell or any rational human.

  • Seeking advice for a wookiee video person...

    I'm shooting a lot of videos lately. Typically they just sit in a folder when I transfer them via USB to my Dell laptop (sorry, I don't have a MacBook Pro quite yet!).
    I know there is Windows Movie Maker and some other "free" junk out there.
    I see Camtasia looks pretty fabulous - but I think all I need is something simple to basically do minor editing, then compress and prep videos to share via the internet in various ways with family and friends.
    Would QuickTime Pro give me all I need, or should I look at buying both QuickTime Pro AND Compressor 4?
    (Assuming these are both available for Windows 7).
    Any advice would be appreciated.
    Thanks,
    David Murphy

    Compressor is Mac only
    QT Pro will only act as a cut editor, so it's hardly the application for you; there are no transitions, titles, sound mixing, or effects.
    Adobe Premiere Elements is fantastic value for money for Windows machines if you plan to do decent hobbyist editing. Look at the website here:
    Adobe Premiere Elements

  • Seeking advice for mirroring systems

    Hi all. I'm working on a documentary. Here's the set-up: main editor works on computer connected to full-res versions of footage. She passes sequences off to an assistant, who reconnects them on a second computer to low-res copies of the same footage, works on the sequences and passes them back to the main editor.
    Reconnecting never seems to go right. FC invariably has trouble finding certain clips, even though the names are identical for the low-res and high res versions. There is no logic that we can discern for which clips FC will find and which clips it won't. Sometimes it misses whole groups of clips, and sometimes it finds one clip in the group, and not the others, at least not by itself, and we have to manually find and connect each one, one by one, which takes a long time and is extremely tedious.
    Obviously our set-up is not ideal, but we're stuck with it, and we'd like to make it work better. So I was wondering if someone could give me some tips, or point me to an article, on how to make the reconnecting process more efficient. I've noticed, for example, that FC is more likely to miss clips that are stored at the end of a series of sub-folders, so we've cut back on folders. Any other changes we should make?

    Trying to understand more of your question;
    Sounds like two different computers with FC? Accessing the same Media? But at different times? To do two different functions?
    In general, yes, I agree connecting or reconnecting media can be a crap shoot. Technically, you're supposed to be able to get a good connection process going by asking/choosing to connect all the other related media that may share the same file path.
    But with what ever you're trying to do, it's not working.
    And in general, I've found that using Media Manager to re-connect/re-organize all that media can be helpful sometimes. For me, it put all the media that I may have had in several dozen folders into ONE. This may or may not be helpful, but after using Media Manager, I'm able to open the new project without any media connection problems.
    One other thought, perhaps try 'RENAMING' the sequences in each FC editor, so they're not the same.
    Mike
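    Since the reconnect trouble often comes down to name mismatches between the low-res and high-res trees, a pre-flight cross-check can at least tell you which clips will fail before the hand-off. Here is a small sketch of that idea (the class name and the two-directory layout are assumptions on my part, not a Final Cut feature):

```java
// Illustrative sketch: report clip file names present under the high-res tree
// but missing from the low-res tree, so mismatches are known up front.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Set;
import java.util.TreeSet;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ClipNameCheck {
    // Collect every regular file name (ignoring folder structure) under a root.
    static Set<String> fileNames(Path root) throws IOException {
        try (Stream<Path> s = Files.walk(root)) {
            return s.filter(Files::isRegularFile)
                    .map(p -> p.getFileName().toString())
                    .collect(Collectors.toSet());
        }
    }

    // Names that exist under hiRes but have no same-named file under lowRes.
    static Set<String> missing(Path hiRes, Path lowRes) throws IOException {
        Set<String> diff = new TreeSet<>(fileNames(hiRes));
        diff.removeAll(fileNames(lowRes));
        return diff;
    }

    public static void main(String[] args) throws IOException {
        for (String name : missing(Paths.get(args[0]), Paths.get(args[1]))) {
            System.out.println("No low-res match for: " + name);
        }
    }
}
```

    Anything the check reports would have to be fixed or reconnected by hand regardless, so running it first just moves the tedium to before the hand-off.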
