Most efficient way to work with large projects

I have just purchased Premiere Elements and need to produce a two-hour edited project.
Would it be best to break the task into a few smaller projects and join them up after editing, or just to create one long project?
Any advice, please.

I use Documents to Go. I don't have the premium version, but the premium version does have online syncing. Both the regular and premium versions have a desktop program that lets you sync documents without going through iTunes. Adding online syncing through something like SugarSync or Dropbox might help.
I don't think they update automatically; you have to do it manually. But the 'price' of manually updating the documents is that the program works offline. So type when you're away from the internet, update when you get online again, and move on.
I also like how it interfaces with regular Word documents.
Something you can look into and see if it looks like it may work.

Similar Messages

  • The most efficient way to search a large String

    Hi All,
    Two quick questions.
    QUESTION 1:
    I have about 50 String keywords that I would like to use to search a big String object (between 300 and 3000 characters).
    Is the most efficient way to search it for my keywords something like this?
      if (myBigString.indexOf("string1") != -1 || myBigString.indexOf("string2") != -1 /* ...and so on for 50 strings */)
          System.out.println("it was found");
    QUESTION 2:
    Can someone help me out with a regular expression search for phone numbers in the format NNN-NNN-NNNN?
    I would like it to return all instances of that pattern found on the page.
    I have done regular expressions in JavaScript and VBScript, but I have never done regular expressions in Java.
    Thanks

    Answer 2:
    If you have the option of using Java 1.4, have a look at the new regular expressions library... whose package name I forget :-/ There have been articles published on it, both at JavaWorld and IBM's developerWorks.
    If you can't use Java 1.4, have a look at the Jakarta regular expression projects, of which I think there are two (ORO and a Perl-like one, off the top of my head):
    http://jakarta.apache.org/
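    For Question 2, a minimal sketch using the java.util.regex package that ships with Java 1.4 and later (the sample text here is made up):

      import java.util.regex.*;

      public class PhoneFinder {
          public static void main(String[] args) {
              String page = "Call 555-123-4567 or 800-555-0199 for details.";
              // \d{3}-\d{3}-\d{4} matches the NNN-NNN-NNNN format; the \b
              // word boundaries keep it from matching inside longer digit runs.
              Matcher m = Pattern.compile("\\b\\d{3}-\\d{3}-\\d{4}\\b").matcher(page);
              while (m.find()) {
                  System.out.println(m.group()); // prints each number found
              }
          }
      }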
    Answer 1:
    If you have n search terms and are searching through a string of length l (the haystack, as in looking for a needle in a haystack), then searching for each term in turn will take O(n*l) time. In particular, it will take longer the more terms you add (in a linear fashion, assuming the haystack stays the same length).
    If this is sufficient, then do it! The simplest solution is (almost) always the easiest to maintain.
    An alternative is to create a finite state machine that encodes the search terms (or multiple parallel finite state machines, which would probably be easier). You can then loop over the haystack string a single time to find every search term at once. Such an algorithm will take O(n*k) time to construct the finite state information (given an average search term length of k), and then O(l) for the search. For a large number of search terms, or a very large search string, this method will be faster than the naive method.
    One example of a state-based string search is the Boyer-Moore algorithm:
    http://www-igm.univ-mlv.fr/~lecroq/string/tunedbm.html
    Regards, and have fun,
    -Troy
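    As a minimal sketch, the naive approach Troy describes looks like this in code (the keyword list and haystack are placeholders):

      // Naive O(n*l) scan: try each of the n keywords against the haystack
      // with indexOf. Fine for ~50 terms on a 300-3000 character string.
      static boolean containsAny(String haystack, String[] keywords) {
          for (int i = 0; i < keywords.length; i++) {
              if (haystack.indexOf(keywords[i]) != -1) {
                  return true; // "it was found"
              }
          }
          return false;
      }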

  • What is the efficient way of working with tree information in a table?

    Hi all,
    I have to design a database to store, access, and add or delete tree nodes. What is the most efficient way of accomplishing this?
    Let's assume the information to be stored in the table is parent, child, and type (optional). The queries should be very generic (they should work with any database).
    Does anybody have any suggestions? I have to work with large data.
    A quick response is highly appreciated.
    thanks in advance,
    rahul

    Did you check out this link?
    http://www.intelligententerprise.com/001020/celko1_1.shtml
    Joe Celko has given some really interesting ways to implement trees in an RDBMS.
    Best wishes
    Anubrata
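    As a concrete starting point, here is a minimal adjacency-list sketch over JDBC. The table and column names (tree_nodes, parent_id, child_id, node_type) are hypothetical, and the SQL is kept generic so it should run on any database, as the question requires:

      import java.sql.*;
      import java.util.*;

      public class TreeDao {
          private final Connection conn;

          public TreeDao(Connection conn) { this.conn = conn; }

          // Store one parent -> child edge of the tree.
          public void addEdge(int parentId, int childId, String type) throws SQLException {
              try (PreparedStatement ps = conn.prepareStatement(
                      "INSERT INTO tree_nodes (parent_id, child_id, node_type) VALUES (?, ?, ?)")) {
                  ps.setInt(1, parentId);
                  ps.setInt(2, childId);
                  ps.setString(3, type);
                  ps.executeUpdate();
              }
          }

          // Fetch the direct children of a node.
          public List<Integer> children(int parentId) throws SQLException {
              List<Integer> result = new ArrayList<Integer>();
              try (PreparedStatement ps = conn.prepareStatement(
                      "SELECT child_id FROM tree_nodes WHERE parent_id = ?")) {
                  ps.setInt(1, parentId);
                  try (ResultSet rs = ps.executeQuery()) {
                      while (rs.next()) {
                          result.add(rs.getInt(1));
                      }
                  }
              }
              return result;
          }
      }

    Note that this adjacency-list layout makes inserts and deletes trivial, but walking a whole subtree takes one query per level; Celko's nested-set model in the linked article makes the opposite trade-off.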

  • Best way to work with large image sequences

    Hello,
    I'm relatively new to Adobe CS4. I am learning to create detailed animations with another software package that produces thousands of still images. (I normally save them as .png, 720 x 480, 29.97 fps.) I spent the money to buy CS4 because I was told this was what I needed to turn those images into video, add titles, captions, credits, etc. So far, I've been using Premiere Pro to import the image sequences and add titles. But when I export the sequences, the quality is crap. I've tried all combinations of export formats, but even when I try to go for the greatest quality, my files are 100 MB and the quality is still crap (nice smooth lines are jagged, and details are blurry and pixelated, even in the titles I added). Can someone tell me how I should be working with these still image sequences so that I can retain the best quality, even if the file size stays large? These videos are just played on local computer systems, not over a network or on DVDs. I don't know if I should be bringing the shorter clip sequences into After Effects and somehow exporting those. I just don't know. But nothing is working for me. Help!
    Nikki

    Hello!
    The output frame size has so far been the same as my rendering size (720 x 480), but I was just trying to get the best quality; if I need to render larger, I can. Delivery has so far just been video, no audio, although when I get better at this I will record narrations of the animations and include those. Playback is intended only for computers. Every once in a while it is requested in a format that will be played over our network, but I wanted to worry about that later.
    The preset I chose for the sequence was DV-NTSC Standard 32 kHz.
    The project/sequence settings are:
    General
         Editing Mode: DV NTSC
         Timebase: 29.97 fps
    Video
         Frame size: 720, 480 4:3
         Pixel Aspect Ratio: D1/DV NTSC (0.9091)
         Display Format: 30 fps Non-Drop-Frame Timecode
    Audio
         doesn't matter
    Video Previews
         Preview File Format: NTSC DV
         Codec: DV NTSC
         Maximum Bit Depth
    Hope that was the information you requested.
    Nikki

  • Best way to work with large file?

    OK, I have a huge file (200 MB) that is all sequences of numbers separated by commas. I wish to know if a specific sequence exists in this file. What is the best way of doing that check? I can't load the whole file into a StringBuffer, since it is too large.

    Well, it's not necessarily too large. You might be able to load it all into memory at once.
    But you're right. It's not a good idea.
    What you can do is use a StreamTokenizer set to split on commas.
    Conceptually, you need to have a simple state machine that keeps track of how many consecutive numbers from that sequence you've read. As soon as you hit a number that's not the next one in the sequence, you reset to the beginning of the sequence.
    It's not hard to write it yourself, but there might be a simpler way--one of the IO classes or regex classes might have something for processing a stream that way.
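    As a minimal sketch of that approach (the tokenizer setup assumes the file really is just numbers and commas; note that a target sequence which overlaps itself, like 1,1,2, would need a proper KMP-style reset rather than the simple one used here):

      import java.io.*;

      public class SequenceSearch {
          // Streams the comma-separated file one token at a time, tracking
          // how many consecutive numbers of 'target' have been matched.
          public static boolean contains(File file, long[] target) throws IOException {
              try (BufferedReader in = new BufferedReader(new FileReader(file))) {
                  StreamTokenizer st = new StreamTokenizer(in);
                  st.resetSyntax();
                  st.wordChars('0', '9');
                  st.wordChars('-', '-');       // allow negative numbers
                  st.whitespaceChars(0, ' ');   // skip spaces, newlines, tabs
                  st.whitespaceChars(',', ','); // split on commas
                  int matched = 0;
                  while (st.nextToken() != StreamTokenizer.TT_EOF) {
                      long value = Long.parseLong(st.sval);
                      if (value == target[matched]) {
                          if (++matched == target.length) return true;
                      } else {
                          // reset; this value might still start a new match
                          matched = (value == target[0]) ? 1 : 0;
                      }
                  }
                  return false;
              }
          }
      }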

  • What is the most efficient way of converting a relatively large program to a previous version?

    I am trying to convert a LabVIEW 8.5 program to version 8.2. When I do that, it creates many more subfolders with VIs that are not converted.
    What is the best way to do this? I tried a mass compile, but converting the high-level program creates many folders that link to other areas.
    Am I missing anything?
    Thanks for any help.
    Chetna 

    Hi Chetna,
    How are you confirming that the VIs saved in the sub-folders after conversion have not been converted to the previous version?
    If you open the VI at the top of the hierarchy and 'Save for Previous Version...', it saves the open VI and all its dependencies for the chosen version. The reason you see all the sub-folders is that the relative path between VIs is maintained.
    This KnowledgeBase article: Can I Save VIs in My Current LabVIEW for Use in a Previous Version? describes this process.
    There is also a Community Example: Programmatically Save VIs for Previous Version that could also save time if you have a lot of VIs to convert.
    I hope this helps!
    Tanya V
    National Instruments
    LabVIEW Platform Product Support Engineer

  • Working with long projects

    I have a four-hour-long project that I would prefer to place on a single DVD. Can this be done with Compressor? Everything I have done to date fits within the 150 min. setting for DVD Best Quality.

    I am starting a fairly large project (about 1 hr 15 min total running time) that will have a little bit of special effects (FCP, some Motion) and a few audio tracks. I just threw together a project of the same length and am testing it now. Should I expect Compressor to take 20 hours to encode it into good-quality MPEG-2? (That's what I am seeing on the screen now - I haven't rendered or exported anything.) I am used to long waits when working with larger projects, but that seems a bit much.
    If you're using Export Using Compressor on an unrendered timeline of 75 minutes on a PowerBook then, yes, that's a very normal encoding time (especially for a timeline with many filters and/or motion effects).
    However, when I use this method, I find the MPEG2 output to be substantially better than any other workflow using the Final Cut Studio apps.
    Of course, if you don't need the best quality - say, for a proof disc to be given to your client to check off on - I'd simply drop a QT reference movie into Compressor, using the Fastest Encode possible.

  • What is the most efficient way of passing large amounts of data through several subVIs?

    I am acquiring data at a rate of once every 30 ms. This data is sorted into clusters, with relevant information being grouped together. These clusters are then added to a queue. I have a cluster of queue references to keep track of all the queues. I pass this cluster around to the various subVIs, where I dequeue the data. Is this the most efficient way of moving the data around? I could also use "Obtain Queue" and the queue name to create the reference whenever I need it.
    Or would it be more efficient to create one large cluster which I pass around? Then I can use Unbundle By Index to pick off the values I need. This large cluster could have all the values individually, or it could be composed of the previously mentioned clusters (i.e. a large cluster of clusters).

    It sounds pretty good the way you have it. In general, you want to sort these into groups that make sense to you. Then, if there is a performance problem, you can arrange them so that it is a bit better for the computer - but let's face it, our performance counts too. Anyway, this generally means a smallish number of groups with a reasonable number of references or objects in them. If you need to group them into one to pass somewhere, bundle the clusters together and unbundle them on the other side to minimize the connectors needed. Since the references are four bytes, you don't need to worry about the performance of moving these around anyway.
    Greg McKaskle
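    LabVIEW is graphical, so plain code can only loosely illustrate Greg's point, but the same "one small bundle of cheap queue references" idea might look like this in Java terms (all names made up):

      import java.util.concurrent.*;

      // The "cluster of queue references": one object bundling the queues,
      // passed to every consumer. Copying it only copies the references.
      class AcquisitionQueues {
          final BlockingQueue<double[]> samples = new LinkedBlockingQueue<double[]>();
          final BlockingQueue<String> status = new LinkedBlockingQueue<String>();
      }

      class Consumer {
          // SubVI equivalent: take the bundle, dequeue only what it needs.
          static void process(AcquisitionQueues q) throws InterruptedException {
              double[] data = q.samples.take(); // blocks until data arrives
              System.out.println("got " + data.length + " points");
          }
      }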

  • Just got girlfriend a new iPad2. Her iMac is a PowerPC G5 (Tiger version 10.4.11) with 512 MB RAM. What's the simplest, most efficient way to get her iPad2 up and running and synced to her Mac?

    Just got girlfriend a new iPad2. Her iMac is a PowerPC G5 (Tiger version 10.4.11) with 512 MB RAM. What's the simplest, most efficient way to get her iPad2 up and running and synced to her Mac?

    Most of the Apple Store sales people and some of the Genius Bar people are only knowledgeable about Apple's more recent offerings. They are not very knowledgeable, I found, about older PowerPC-based Apple computers, I'm afraid.
    Here's the real scoop.
    Your girlfriend's G5 can only install up to OS X 10.5 Leopard. This is the last compatible OS X version for PowerPC users.
    OS X 10.6 Snow Leopard and OS X 10.7 Lion are for newer Intel-CPU Apple computers.
    Early iMac G5s can only take up to 2 GB of RAM.
    Later iMac G5s (2005-2006) can take up to 2.5 GB of RAM.
    2 GB of RAM will run OS X 10.5 Leopard just fine.
    The very latest iTunes (10.5.2) can be installed and runs on both PowerPC and Intel-CPU Macs.
    However, there are certain new iTunes features that won't work without an Intel Mac.
    One such iOS 5/iTunes feature is syncing wirelessly over Wi-Fi.
    This will not work unless you have an iDevice running iOS 5 and an Intel Mac running 10.6 Snow Leopard or better.
    Although I was disappointed that I would not be able to do this with my G4 Mac, it's not a big problem for me.
    So, your girlfriend's computer should be fine for what she intends to use it for.
    The Apple people either just plain didn't know, or were trying to get you to think about buying a new Mac.
    At least as of now, that's not truly necessary.
    If Apple, at some later point, drops support for PowerPC users running 10.5, then that would be the time to consider a new or "newer" Intel-CPU Mac.
    My planned Mac upgrade is to seek out a "newer" last-version G5 as my "new" Mac.
    I can't afford, right now, to replace all of my core PowerPC software with Intel versions, so I need to stick with the older PowerPC Macs for the time being. The last of the G5s is what I seek.

  • Most efficient way to delete "removed" photos from hard disk?

    Hello everyone! Glad to have this great community to come to for help. I searched for this question but came up with no hits. If it's already been discussed, I apologize and would love to be directed to the link.
    My wife and I have been using LR for a long time. We're currently on version 4. Unfortunately, she's not as tech-savvy or meticulous as I am, and she has been unknowingly "Removing" photos from the LR catalogues when she really meant to delete them from the hard disk. That means we have hundreds of unwanted raw photo files floating around on our computer and no way to pick them out from the ones we want! As a very organized and space-conscious person, I can't stand the thought. So my question is: what is the most efficient way to permanently delete these unwanted photos from the hard disk?
    I did find one suggestion that said to synchronize the parent folder with the respective catalogue, select all the photos in "Previous Import," and delete those, since they will be all of the photos that were previously removed from the catalogue.
    This is a great suggestion, but it probably wouldn't work for all of my catalogues, since my file structure is organized by date (the default setting for LR). So two catalogues will share the same "parent folder" in the sense that they both have photos from May 2013, but if I synchronize May 2013 with one, then it will get all the duds PLUS the photos that belong in the other catalogue.
    Does anyone have any suggestions? I know there's probably not an easy fix, and I'm willing to put in some time. I just want to know if there is a solution and make sure I'm working as efficiently as possible.
    Thank you!
    Kenneth

    I have to agree with the comment about multiple catalogs referring to images that are mixed in together... and the added difficulty that may have brought here.
    My suggestions (assuming you are prepared to combine the current catalogs into one):
    In each catalog, put a distinctive keyword onto all the images so that you can later tell which particular catalog they were formerly in (just in case this is useful information later).
    As John suggests, use File / "Import from Catalog" to bring all LR images together into one catalog.
    Then, in order to separate the image files that ARE imported to LR from those which either never were or have since been removed, I would duplicate just the imported ones to an entirely separate and dedicated disk location. This may require the temporary use of an external drive with enough space for everything.
    To do this, highlight all the images in the whole catalog, then use File / "Export as Catalog", selecting the option "include negatives". Provide a filename and location for the catalog inside your chosen new saving location. All the image files that are imported to the catalog will be selectively copied into this same location alongside the new catalog. The same relative arrangement of subfolders will be created there for them all to live inside, as is seen currently. But image files that do not feature in LR currently will be left behind by this operation.
    Your new catalog is now functional, referring to the copied image files. Making sure you have a full backup first, you can start deleting image files from the original location that you believe to be unwanted. You can do this safe in the knowledge that anything LR is actively relying on has already been duplicated elsewhere. So you can be quite aggressive about this, only watching out for image files that are required for other purposes (than as master data for Lightroom) - e.g., exported JPG files you may have made.
    IMO it is a good idea to practice a full separation of image files used in your LR image library, from all other image files. This separation means you know where it is safe to manage images freely using the OS, vs where (what I think of as the LR-managed storage area) you need to bear LR's requirements constantly in mind. Better for discrete backup, too.
    In due course, as required, the copied image files plus catalog can be moved bodily to another drive (for example, if they have been temporarily put on an external drive, and you want to store them on your main internal one again). This then just requires a single re-browsing of their parent folder's location, in order to correct LR's records inside this catalog, as to the image files' changed addresses.
    If you don't want to combine the catalogs into one, a similar set of operations as above, can be carried out for each separate catalog you have now. This will create a separate folder structure in each case, containing just those duplicated image files. Once this has been done for all catalogs, you can start to clean up the present image files location. IMO this is very much the laborious and inflexible option, so far as future management of the total body of images is concerned... though there may still be some overriding reason for working that way.
    RP

  • Working with large Artboards/Files in Illustrator

    Hello all!
    I'm currently designing a full size film poster for a client. The dimensions of the poster are 27" x 40" (industry standard film poster).
    I am a little uncertain about working with large files in Illustrator, and I have several problems that have come up across design projects using similar large formats.
    The file size is MASSIVE. This poster uses several large, high-res images that I've embedded. I didn't want them to pixelate, so I made sure they were high quality. After embedding all these images, along with the vector graphics, the entire .ai file is 500 MB. How can I reduce this file size? Can I do something with the images to make the .ai file smaller?
    I made my artboard 27" x 40" - the final size of the poster. Is this standard practice? Or when designing for a large print format, are you supposed to make a smaller, more manageable artboard size and then scale up afterwards to avoid these massive file sizes?
    I need to upload my support files for the project, including .ai and .eps - so it won't work if they're 500 MB. This would be good info to understand for all my projects, I think.
    Any help with this would be appreciated. I can't seem to find any coherent information that addresses my particular issues. Thank you very much!
    Asher

    Hi Asher,
    It's probably those high-res images you've embedded. First, be sure your images are only as large as you need them. Second, a solution would be to use linked images while you're working instead of embedding them in the file.
    Here is a link to a forum with a lot of great discussion about this issue, to get you started: http://www.cartotalk.com/lofiversion/index.php?t126.html
    And another: http://www.graphicdesignforum.com/forum/archive/index.php/t-1907.html
    Here is a great list of tips that someone in the above forum gave:
    -Properly scale files. Do not take a 6x6' file and then use the scaling tool to make it 2x2'. Instead, scale it in Photoshop to 2x2 and reimport it. Make a rule, like: over 20%, bring it back into Photoshop for rescaling.
    -Check resolutions. 600 dpi is not going to be necessary for such-and-such a printer.
    -Delete unused art. Sloppy artists may leave old unused images under another image. The old one is not being used, but it still takes up space, unnecessarily inflating your file.
    -Choose to link instead of embed. This is your choice. Either way you still have to send a large file, but many times linking is less total MB than embedding. Also, linking works well with duplicated images: multiple uses link to one original, whereas embedding would make copies.
    -When you are done, use compression software like ZIP or SIT (StuffIt):
    http://www.maczipit.com/
    Compression can reduce file sizes a lot, depending on the files.
    This business deals with a lot of large files. Generally people use FTP to send large files, or plain old CD. Another option is segmented compression. Something like WinRAR/MacRAR or DropSegment (a piece of StuffIt Deluxe) compresses files, then breaks them up into smaller, manageable pieces. This way you can break up a 50 MB file into, say, 10 x 5 MB pieces and send them 5 MB at a time.
    http://www.rarlab.com/download.htm
    Make sure your client knows how to uncompress those files. You may want to link them to the site to download the software.
    Good luck!

  • What is the best way to work with mixed media in 1080 timeline?

    Hi there,
    I have a project shot mostly in 1080 24p, but with a bit of 720 24p and 4x3 30p footage.
    What is the best way to work with this mixed media?
    Thanks!
    Steven

    Hi Shane,
    OK, just to recap (thanks for being patient, by the way)... I have put some questions in here... feel free to write in caps to respond, and we'll put this baby to rest!
    1) I will work in a 1080p FCP sequence, correct?
    2) HD 1080p footage captured as Apple ProRes HQ I will leave as is and work with?
    3) HD 720p footage was also captured as Apple ProRes HQ. Can I just drop it in the 1080 timeline and let FCP do its work? Or should I run it through Compressor, and if so, what setting should I submit it to?
    4) I don't have a budget for external hardware to convert SD to HD... should I capture my SD Beta through Final Cut and just leave it and let FCP do its work? Or should I run it through Compressor? If so, what setting should I submit it to?
    5) For my master sequence in FCP, am I not using an I-frame format (Apple ProRes) if I am using my Apple ProRes HQ 1080 footage as my sequence format? I am not sure what you mean by GOP.
    6) I also found some 1080p 60p XDCAM footage shot with the same XDCAM camera. I put it in the 1080p 24p timeline, but it was pretty choppy... any ideas about this conform?
    Thanks very much for all your help it has gone along way,
    Steven
    Saying what CODECS you are working with was my question. 1080, 720, 4:3...really says nothing. There are a dozen 1080 codecs, another dozen 720 codecs, and nearly 100 4:3 formats.
    Best to capture all the footage to one uniform codec.
    Second best is to work with one format and let FCP conform the rest to that...IF and only IF that format is an I-Frame format like ProRes. GOP formats as master sequence formats cause TONS of issues.
    What I'd do is work in 1080 ProRes: just add the 720p footage (use Compressor to convert it if you don't have a lot; if you have a lot, just add it), but I'd capture all the SD 4:3 footage via a Kona 3 or Matrox MXO2 as 1080 ProRes. Hardware conversion of SD to HD is much better than anything FCP can do. AE might do well too. But then use Compressor to convert the captured 29.97 footage (you can't convert 29.97 to 23.98 when you capture) to get to 23.98.

  • Most efficient way to open a new TextEdit doc at a specific place

    Suppose I've navigated in the Finder to some deep, dark location in the folder hierarchy... I want a new text file titled "notes" in this folder, i.e. at this path.
    Assuming TextEdit is already open, what is the most efficient way of creating and begin adding text to a new TextEdit file in that location?
    Tried this: maybe the usual Save dialog is aware of the current folder, so I can choose "New Document" in TextEdit's Dock icon pulldown, and then choose that path when I do File --> Save, choose the Save dialog's expanded view, and pull down the Where: selection. Nope. The current path is not there.
    Tried this: keeping an empty TextEdit document named "untitled" on my desktop, drag-copying it to the current folder, renaming the file to "notes", opening it, and starting to edit. That works, except it is clumsy.
    Is there a better way?
    Please forgive me if I'm missing something incredibly obvious.

    The problem with that is that a new file has nothing in it.
    If you save a dummy file on the desktop someplace, or use a real file in each of your locations, you could right-click and duplicate it, then double-click to open it in the program of choice.
    Another option would be to create an AppleScript that takes the current open window's pathname and creates and saves a text file there.
    Save the app in the Dock and you only have to click on it once; it automatically quits when its mission is accomplished.

  • Most efficient way to load XML file data into tables

    I have a complex XML file running into megabytes. I want to load its data into 7-8 tables.
    Which way will be better:
    1) Use SQL*Loader to load directly into the 7-8 tables by modifying the control file.
    Is this really possible and feasible? I am not even sure about it.
    2) Load the data as XMLType in a table and register it, then extract from there to load into the various tables.
    Please help. I have to find the most efficient way of doing it.
    Regards,
    Sudhir

    Yes, it is possible to use SQL*Loader to parse and load XML, but that is not what it was designed for, and so it is not recommended. You also don't need to register a schema just to load/store/parse XML in the DB.
    So where does that leave you?
    Some options
    {thread:id=410714} (see page 2)
    {thread:id=1090681}
    {thread:id=1070213}
    Those talk some about storage options and about reading XML in from disk and parsing it. They should also give you options to consider. Without knowing more about your requirements for the effort, it is difficult to give specific advice. Maybe your 7-8 tables don't exist yet, so using object-relational storage for the XML would be the best solution, as you can query/update the tables that Oracle creates based off the schema associated with the XML. Maybe an external table definition works better for reading the XML into the system, because this process will happen just once. Maybe using WebDAV makes more sense for loading the XML to be parsed (I don't have much experience with this; I just know it is possible from what I've read on the forums). Also, your version makes a difference, as you have different options available depending upon the version of Oracle.
    Hope all that helps as a starter.
    Edited by: A_Non on Jul 8, 2010 4:31 PM
    For a great example, see the answers by mdrake in {thread:id=1096784}.
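    For comparison, the generic outside-the-database baseline that these options compete with is a streaming parse with batched inserts. A minimal sketch; the element name, attribute, and table here are entirely hypothetical:

      import java.io.*;
      import java.sql.*;
      import javax.xml.stream.*;

      public class XmlToTables {
          // Streams an MB-sized XML file and inserts one row per <item>
          // element, batching inserts instead of row-by-row round trips.
          public static void load(File xml, Connection conn) throws Exception {
              XMLStreamReader r = XMLInputFactory.newInstance()
                      .createXMLStreamReader(new FileInputStream(xml));
              try (PreparedStatement ps = conn.prepareStatement(
                      "INSERT INTO items (name) VALUES (?)")) {
                  while (r.hasNext()) {
                      if (r.next() == XMLStreamConstants.START_ELEMENT
                              && "item".equals(r.getLocalName())) {
                          ps.setString(1, r.getAttributeValue(null, "name"));
                          ps.addBatch();
                      }
                  }
                  ps.executeBatch();
              }
              r.close();
          }
      }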

  • Most efficient way to do some string manipulation

    Greetings,
    I need to cleanse some data in a string by replacing unsafe characters with encoded equivalents. (FYI, this is for the purpose of transforming "unsafe" characters into encoded values as data inside an XML document.)
    The following code accomplishes the task.
    Note that a string, "currentValue", contains the data to be cleansed,
    and a string, "encodedValue", contains the result.
      String encodedValue = "";
      for (int counter = 0; counter < currentValue.length(); counter++) {
        String addChar = currentValue.substring(counter, counter + 1);
        if (addChar.equals("<"))
          addChar = "&#60;";
        if (addChar.equals(">"))
          addChar = "&#62;";
        if (addChar.equals("="))
          addChar = "&#61;";
        if (addChar.equals("\""))
          addChar = "&#34;";
        if (addChar.equals("'"))
          addChar = "&#39;";
        if (addChar.equals("/"))
          addChar = "&#47;";
        if (addChar.equals("\\"))
          addChar = "&#92;";
        encodedValue += addChar;
      } // for
    I'm sure there is a way to make this more efficient. I'm not exactly "new" to Java, but I am learning on my own with no formal training and often take a "brute force" approach with my initial effort.
    What would be the most efficient way to re-do the above?
    TIA,
    --Paul Galvin
    Integrated Systems & Services Group

    I'm a C++ programmer, so I'm not totally up on these Java classes either, but... from a C++ standpoint you might want to consider using the if/else statement.
    By using if/else, you only test the character until you find the actual "violating" character and skip the rest of the tests.
    Also, you might try checking for alphanumeric characters first and using the continue keyword when you find one. Since more of your characters are probably safe than unsafe, you can skip all the if/else tests and do only one test on the good characters. (I just looked for a way to test that and I didn't find one. C++ has a function that does it by checking the ASCII number range; I don't think that works in Java, but maybe you can find one. It would probably reduce the number of tests.)
    happy hunting,
    txjump :)
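    For what it's worth, Java does have the alphanumeric test txjump was looking for: Character.isLetterOrDigit. A sketch combining both suggestions (assuming, as above, that the character references should carry the usual '&' prefix):

      // One pass, at most one test for each safe character, and a
      // StringBuilder to avoid the quadratic cost of += on a String.
      static String encode(String currentValue) {
          StringBuilder encoded = new StringBuilder(currentValue.length());
          for (int i = 0; i < currentValue.length(); i++) {
              char c = currentValue.charAt(i);
              if (Character.isLetterOrDigit(c)) { // most characters are safe
                  encoded.append(c);
                  continue;
              }
              switch (c) {
                  case '<':  encoded.append("&#60;"); break;
                  case '>':  encoded.append("&#62;"); break;
                  case '=':  encoded.append("&#61;"); break;
                  case '"':  encoded.append("&#34;"); break;
                  case '\'': encoded.append("&#39;"); break;
                  case '/':  encoded.append("&#47;"); break;
                  case '\\': encoded.append("&#92;"); break;
                  default:   encoded.append(c);
              }
          }
          return encoded.toString();
      }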
