Rate my Rig - Pre-Purchase Jitters

Hi all. I'm about to invest a bundle in a new setup for my office (I work in commercial real estate but enjoy dabbling in lots of technologies, ranging from programming to video editing) and was hoping to get a bit of advice on ways to improve the configuration I'd like to purchase. I'm open to any suggestions for getting more value out of the purchase, whether through upgrading or downgrading components... without further ado, here is what I'm thinking:
Mac Pro: 'Two 2.66GHz Dual-Core Intel Xeon' - My gut tells me this is fast enough, based on how my 2GHz Intel iMac performs... and the Xeon chip is superior, right? Meanwhile, I've heard that the quad-core chips really don't offer a big advantage for people who aren't using very specific programs?
2 or 4GB of RAM (I can't decide; what do you think?) It costs about $400 to upgrade from 2GB... also, is there a good reason why Apple doesn't let people order '3 gigs'?
250GB 7200-rpm drive. I only have 150GB at home and generally don't fill it up... though isn't 7200rpm kinda slow?
ATI Radeon X1900 XT 512MB (2 x dual-link DVI) [Add $249] - In the future I'd like to get a 2nd 30" ACD so I think this is the best card...though realistically, can this card drive two 30" monitors without getting its *** kicked?
Apple Cinema HD Display 30 Inch - From what I've been able to google, there really isn't anything else on the market that is vastly superior in this size. Supposedly I could save a couple hundred by getting a Dell monitor that has basically the same stuff on the inside, but I would prefer to just deal with Apple unless someone else has a way better product. I heard LaCie might have some wicked monitors, but couldn't find a 30-inch?
1 Superdrive
Bluetooth 2.0+EDR and AirPort Extreme
Absolutely any advice on any of these components would be most appreciated!

"Mac Pro: 'Two 2.66GHz Dual-Core Intel Xeon' - My gut tells me this is fast enough, based on how my 2GHz Intel iMac performs... and the Xeon chip is superior, right? Meanwhile, I've heard that the quad-core chips really don't offer a big advantage for people who aren't using very specific programs?"
For non-server applications I believe that the 8-core machine will not give you very good bang for the buck, so this is probably a smart decision. If you find that you need to upgrade in the future, I suspect you can replace the chips with the quad-core Xeons when they are available on the street (hopefully at a lower price).
"2 or 4GB of RAM (I can't decide; what do you think?) It costs about $400 to upgrade from 2GB... also, is there a good reason why Apple doesn't let people order '3 gigs'?"
I agree with the previous poster: two 2GB ECC modules for this computer are available from Crucial.com for just over $300. I have bought many modules from Crucial and have NEVER had any problem with them.
"250GB 7200-rpm drive. I only have 150GB at home and generally don't fill it up... though isn't 7200rpm kinda slow?"
7200rpm SATA drives are the typical "high end" for desktop computers. 10K SATA drives are available, but they are typically very expensive (and run hotter, which may cause a problem if the case was not designed to deal with the additional heat). If you are doing a lot of reading/processing/writing, it is probably more cost-effective to add more memory (assuming the application can take advantage of it).
"ATI Radeon X1900 XT 512MB (2 x dual-link DVI) Add $249 - In the future I'd like to get a 2nd 30" ACD so I think this is the best card...though realistically, can this card drive two 30" monitors without getting its * kicked?"
For most applications, this card should be sufficient for two monitors. If you are not buying the second monitor right away, you will probably be able to get a more powerful card in the future for less than the highest-end card costs now.
"Apple Cinema HD Display 30 Inch - From what I've been able to google, there really isn't anything else on the market that is vastly superior in this size. Supposedly I could save a couple hundred by getting a Dell monitor that has basically the same stuff on the inside, but I would prefer to just deal with Apple unless someone else has a way better product. I heard LaCie might have some wicked monitors, but couldn't find a 30-inch?"
I would still recommend a Dell monitor; Dell just refreshed its line with a higher-colour edition of the 30" model (I have not seen any new releases from Apple). I use the Dell monitors and I am very happy with them (even before the refresh).

Similar Messages

  • Application control methods to prevent jerky graph refresh rate

    Hi all,
    In the simplest terms, what I am trying to achieve is a VI with a button on. When the user presses the button, a subVI is executed which acquires some data which is displayed on 2 graphs. The first graph has cursors which can be moved by the user to define the limits of the data plotted on the second graph.
    I created a VI that did the 2 graphs with the limits and they updated fine (see attached file Stiffness and Damping.vi). However, this was when loading some data from a text file (attached as LabTestData.txt).
    When I replaced the loading section with the DAQ subVI, changing the cursors on the first graph no longer changed the range of data on the second graph (see attached file FIND SAD v1.vi).
    I then changed the structure so the data acquisition was in a different loop to the analysis and this works to a certain extent, apart from the fact that the second graph has a very jerky refresh rate (see attached file FIND SAD.vi).
    If someone has a suggestion on a better way to structure the VI to solve these problems then that would be a great help.
    The second version also has the problem that the VI doesn't stop when the 'Back to main menu' button is pressed, and I can't figure out why this is.
    Many thanks,
    Ian
    Attachments:
    Stiffness and Damping VIs for forum.zip 240 KB
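    Not a LabVIEW snippet, but a minimal Python sketch of the producer/consumer structure being discussed in this thread: the acquisition loop pushes data onto a queue at its own rate, and the analysis/display loop only ever plots the newest block, so a slow graph cannot make the acquisition stutter. The acquire() function and the timings are made-up stand-ins for the DAQ subVI; in LabVIEW the equivalent is a queue or notifier shared between two while loops.

    import queue
    import threading
    import time
    import random

    def acquire():
        # Stand-in for the DAQ subVI: returns one block of samples (hypothetical).
        return [random.random() for _ in range(100)]

    def producer(q, stop):
        # Acquisition loop: runs at its own rate and never blocks on the display.
        while not stop.is_set():
            q.put(acquire())
            time.sleep(0.01)          # acquisition period

    def consumer(q, stop):
        # Analysis/display loop: drains the queue and only processes the newest
        # block, so a slow display cannot back up the acquisition.
        while not stop.is_set():
            try:
                block = q.get(timeout=0.1)
            except queue.Empty:
                continue
            while not q.empty():      # discard stale blocks, keep the latest
                block = q.get_nowait()
            print("plot", len(block), "samples")   # stand-in for updating the graphs

    stop = threading.Event()
    q = queue.Queue()
    threads = [threading.Thread(target=producer, args=(q, stop)),
               threading.Thread(target=consumer, args=(q, stop))]
    for t in threads:
        t.start()
    time.sleep(1)
    stop.set()                        # the equivalent of the 'Back to main menu' button
    for t in threads:
        t.join()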

    Thanks Tom,
    I've attached it along with the subVI it uses as well
    Cheers,
    Ian
    Attachments:
    DATA FROM RIG v2.vi 553 KB
    FRA SubVI.vi 71 KB

  • I'm Near the End of My New Rig Design...Still Open to Changes

    Okay, here's my latest list based on the superior knowledge of all of you and reading many posts. I could still use some help where noted:
    Case: Either the Thermaltake Armor Plus http://www.newegg.com/Product/Product.aspx?Item=N82E16811133055&cm_re=thermaltake_armor_plus-_-11-133-055-_-Product or the Cooler Master HAF X http://www.newegg.com/Product/Product.aspx?Item=N82E16811119225&cm_re=HAF-X-_-11-119-225-_-Product   Thoughts on these two?
    Processor: 980-X
    Processor  Cooler: Noctua NH-D14 120mm & 140mm SSO CPU Cooler
    Moboard (I can use some help deciding between these two): GIGABYTE GA-X58A-UD7 LGA 1366  Intel X58 SATA 6Gb/s USB 3.0 ATX Intel Motherboard or ASUS P6X58D Premium LGA 1366 Intel X58 SATA 6Gb/s USB 3.0 ATX Intel Motherboard
    Power Supply: Thermaltake Toughpower  W0155RU 1000W ATX12V / EPS12V SLI Certified CrossFire Ready 80 PLUS Certified Active PFC Power Supply  http://www.newegg.com/Product/Product.aspx?Item=N82E16817153053
    RAM: I'd like to bite the bullet and get 24GB, meaning 6 x 4gig sticks. I'm wide open to suggestions here. Here's what I've dug up so far: PATRIOT PGS324G1600ELHK DDR3  24GB (6x4GB) 1600MHz   http://www.newegg.com/Product/Product.aspx?Item=N82E16820220478
    GPU: GTX-480
    System Drive: Crucial Sata III SSD 2.5" 256GB SATA II MLC Internal Solid State Drive (SSD). The price just dropped $40 from yesterday. Check out the specs on this baby (again it's SATA III):  http://www.newegg.com/Product/Product.aspx?Item=N82E16820148349&cm_re=crucial_c300-_-20-148-349-_-Product
    Media  Drives: 2 x 2Tb Seagate Sata III HDs. I will move these over from my current rig. I'm also open to getting a third one. I also have a HighPoint RocketRAID 620 PCI-Express 2.0 x1 SATA 6.0Gb/s Controller Card that I was going to move over. A question: do you think this may have been responsible for my less-than-impressive HD performance in my recent PPBM-4 test results? It was only $60! If you have a suggestion for a better SATA III Controller, I'm all eyes!
    External Monitoring Capabilities: Matrox MXO2-LE w/Max   http://www.bhphotovideo.com/c/product/650526-REG/Matrox_MXO2LEMAX_N_D_MXO2_LE_with_MAX.html
    Total System Cost: $5,450
        I'm wide open to critique/suggestions!! Haven't bought yet, but plan on ordering everything by the end of this month.

    Bill: my test results are there. There are a whopping 54 test results that are worse than mine! (And 75 that are higher performing!). I am hung up on SATA III because I'd like to upgrade to the latest and greatest, especially due to the fact that I already have 2 SATA III 2Tb drives. I think I'll add a 3rd one with a better RAID controller. I bet that RocketRaid card is responsible for my disk performance being so low. Again, since I'm starting from scratch, I'd like to go with a SATA III controller card, even if it means a wait. Remember, I'm mostly doing HD :30 sec spots, so I'm not afraid of rendering. It's just that I'd like to see the time it takes to be cut in half. Most of my time is spent dealing with the instability I'm experiencing with CS4, so gaining stability alone is half the battle. I'm fine not OCing. And yes, the Matrox MXO2 will be CS5 ready any day now.
    Baz: I love my Samsung 256Gb SSD that I'm currently using for OS & Adobe Project Files. I like the fact that it's more reliable than an HD, takes up less space than a typical HD, which means more air can flow by it (and not heat up that air as it passes by), it uses less power, it generates no noise, needs no defragging...and is ridiculously expensive per GB! But I'm okay with that. The new Crucial Sata III SSD has Read speeds up to 355MB/s. Writes of 225MB/s. At this rate, I don't anticipate cancelling those all-important life-saving Auto Saves!
    shooternz: Matrox does not make you use their codecs for the MXO2, unlike the Axio LE. Since I output all of my spots to HD mpeg2's for the various media outlets, the Matrox codec issue isn't much of one, even with my current Axio LE. If you want to tell me how I can get HDMI output of the PPro Program Window, while simultaneously outputting to two edit interface monitors, I'm all eyes! The nice thing about the MXO2 is that it has all sorts of I/Os, including SDI.
    Harm: What do you think if I add a 3rd 2Tb SATA III HD and a decent RAID controller? I knew the RocketRaid card I currently have was going to make you vomit! Do you think this is what made my scores bomb on the PPBM4 test? I insist on waiting for a decent SATA III RAID controller, if waiting is required. I don't want to go backwards here. And Harm, what are your thoughts on the Matrox MXO2-LE w/Max? I HAVE to have external monitoring capabilities and multiple format I/Os (composite, component, S, SDI & HDMI) and balanced audio I/O!!

  • External monitor jittering

    I just bought a SAMSUNG SyncMaster 940BW widescreen monitor.
    The ratio is 1.6, but when I go to use 1440x900 it only allows a 60Hz refresh rate and jitters like crazy.
    Anyone else had success with this monitor - keeping with a true ratio of 1.6?

    All LCDs only operate at 60Hz, regardless of what the display settings are. It sounds like a bad connection. Can you use DVI to connect? That might help.
    I've just purchased an LG 22" LCD monitor (L226WTQ-BF) and seem to have the same problem, though very slight, with screen jittering. I have my Powerbook hooked up via a DVI-D cable directly to the monitor. In Screen Prefs I can only choose 59.9 hertz for the Refresh Rate though I know the LG monitor is performing at 60 hertz. The base resolution for the LG is 1680x1050, the only time I'm able to choose 60 hertz as a Refresh Rate is at 1360x768. Is my Powerbook only able to produce 60 hertz at a maximum 750 resolution?

  • Movie jitters/flickers when played from DVD

    I've created a DVD, but when I play it anywhere (DVD Player or Mac) it jitters in some parts of the movie, usually when it's panning fast or following an object quickly.
    When I play the movie from DVD SP it plays perfectly, but after I've burned the DVD or created the .img file, it jitters/flickers. I've played it on 3 different TVs and 2 other Macs and it does the same thing. NOW this is the interesting part... when I play it from my MacBook Pro it doesn't do it, it plays perfectly. What is going on???
    In the DVD SP preferences I've changed the encoding from
    Mode: One Pass VBR, Motion Estimation: Better
    to
    Mode: Two Pass VBR, Motion Estimation: Best
    The rest of the specs stayed the same:
    Aspect ratio: 4:3 (although the movie was shot 16:9)
    Field order: Auto
    Bit rate: 4.0 Mbps
    Max bit rate: 7.0 Mbps
    I've also burned a DVD from the MacBook Pro that doesn't show the problem; when I played it from there it was perfect as usual, but when I went back to the Mac Pro or to the DVD players the jitters appeared again.
    Does anyone have an idea why this is happening, or what I'm missing in the burning process?
    In advance... thanks for your help.
    Claus

    I believe I am experiencing the same problem. The odd thing is that only about the first minute of my video was messed up; after that, the rest of the video played fine.
    - It was the exact same spot on 4 discs that I tried burning
    - I tried re-exporting from fcp - still the same spot messed up
    - Also tried re-encoding in DVDSP - same result
    I can only assume that there is an error occurring sometime during encoding, but I can't figure out a solution. Any ideas are greatly appreciated.
    PS:
    The video described above was shot in 4:3 SD, exported using current settings, and encoded in DVDSP with Best motion estimation, an 8 Mbps bit rate, bottom field order first, etc...
    Right now I am working on another DVD which was shot in HD, exported from Final Cut using current settings, encoded in Compressor using the SD DVD 90 min setting, then imported into DVDSP.
    I'm experiencing something similar here, but this time I have 3 different videos on the DVD. One of them, the shortest, turned out jittery and the other two turned out fine.
    This is my first time using an HD to SD DVD workflow, so I don't know if I could have made a mistake somewhere along that path.

  • NI-DAQmx frequency sampling rate

    Hi there!
    I'm working on setting up a data acquisition LabVIEW VI to measure different signals on a test rig.
    I'm using the NI-DAQmx Assistant (the Express VI?) to continuously measure analog signals (variable current, voltage and temperatures). This is working just fine, and I can change the sampling rate by writing to the Express VI. The idea is that the user can change the sampling rate from around 1 to 500 Hz.
    We do however have a sensor that transmits digital signals (a frequency), and we are using an NI-9423 module to "read" it. As this is a digital signal, another NI-DAQmx Express VI is needed to handle it (that's OK), but so far we can't figure out how to alter the sampling rate - it's apparently locked at 1 kHz.
    Because we want to merge the analog and digital signals into one array, we are receiving overflow errors from the "analog" DAQ if it's not set at exactly 1 kHz.
    So, in short - is it possible to change the sampling rate of a DAQmx task receiving frequencies, so that the two DAQ Assistants have the same sampling rate?
    Help would be greatly appreciated!
    - Nicklas
    Attachments:
    DAQissue.PNG 64 KB

    Unlike voltage measurements, which tend to be (more or less) instantaneous, frequency measurements take a finite (and often variable) amount of time.
    If it is a slow signal, you measure the number of counts of your reference clock that occur in one period of your input signal. As your input signal varies in frequency, so does the measurement rate. If it is a fast signal, you can either measure how long it takes to get n cycles of your input (again variable) or you can count how many cycles of your input occur in a fixed time period.
    The NI help on frequency measurements describes three different ways you can configure a counter to measure frequency.
    The long and short of this is that counter measurements generally come at variable measurement rates, which can be awkward to fit into a fixed-rate logging system. If the measurement period is much smaller than your desired rate, you can wait and trigger a measurement at regular intervals. If not, you can let the counter run at its own rate, placing the latest result on a notifier, and in another loop just read the latest measurement from the notifier each time you want to record a result. Depending on whether your counter is running faster or slower than your desired logging rate, you will end up with either missed samples or repeated samples. There are inherent timing inaccuracies in both approaches because, unlike analog measurements, the counter measurement is not made at 'that exact time, now!' but over a period of time which may be long or short compared to your logging rate.
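    A rough Python sketch of the second approach described above (let the counter free-run and have the fixed-rate logger just copy whatever the latest value is). The read_frequency() function is a made-up stand-in for the NI-9423 counter task; in LabVIEW this is typically a notifier or a single-element queue shared between the two loops.

    import threading
    import time
    import random

    latest = {"freq": None}           # shared "notifier": holds only the newest value
    lock = threading.Lock()
    stop = threading.Event()

    def read_frequency():
        # Stand-in for the counter measurement; takes a variable amount of time.
        time.sleep(random.uniform(0.001, 0.01))
        return 1000.0 + random.uniform(-5, 5)

    def counter_loop():
        # Runs at whatever rate the frequency measurement allows.
        while not stop.is_set():
            f = read_frequency()
            with lock:
                latest["freq"] = f

    def logging_loop(rate_hz=100):
        # Fixed-rate logger: samples the analog task and just copies the latest
        # frequency value; repeated or skipped counter samples are expected.
        period = 1.0 / rate_hz
        for _ in range(20):
            with lock:
                f = latest["freq"]
            print(f"log analog sample + freq={f}")
            time.sleep(period)

    t = threading.Thread(target=counter_loop)
    t.start()
    logging_loop()
    stop.set()
    t.join()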

  • Standard frame rate for 1080p NTSC?

    Hello and thanks for having me! Very new(b) to Adobe Premiere, so I'll throw something basic out; thanks for your advice:
    the short version:
    What is the 'standard' frame rate for NTSC 1080p? Does this frame rate differ from what you would use for output at 720p, and would you choose different frame rates depending on delivery method (web vs. Blu-ray)?
    the long version:
    I have at my disposal a very capable rig to work on, and I'm just doing some basic editing for my audio CV reel. The source material is usually 1080p and sometimes 720p, whether it be streamed via HDMI into my capture card from playback sources, or full production originals which I have scored for clients.
    Therefore, while editing my reel together, I'd like to work at full 1080p and downconvert for smaller formats afterward.
    However, I am very new to video editing and not sure what frame rate to work in. I see NTSC 1080p offered at several frame rates, and I am aware of all the available frame rates in the US and abroad, so my question is:
    What is the standard frame rate for NTSC 1080p used for Blu-ray or high-bandwidth web transmission? Some are telling me 'use 60, it's standard'. Others tell me it's 24. Still others say 24 is only for working with film, or for interpolating other sources to 'make your stuff look like film'.
    Thank you for any answers or resources, and have a nice evening!

    For Blu-ray you can have two frame rates at 1080 resolution, 30 fps interlaced, or 24 fps progressive.  1080p/30 will not go onto a Blu-ray.
    At 720 resolution, you can have 60 fps or 24 fps, both progressive.
    If you want to work at 1080p, your only option is 24 fps.
    Web delivery is a different matter.  There are no defined standards and pretty much anything you like will play on the web.
    (On a side note, drop and non-drop frame refers strictly to the timecode information, and does not regulate frame rates.)
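    For reference, a tiny Python sketch of the combinations listed above, using the exact NTSC rates (29.97 for "30i", 23.976 for "24p"). It summarizes this post only, not the full Blu-ray spec:

    # Legal NTSC Blu-ray video formats per the post above (rates in fps).
    BLURAY_NTSC = {
        1080: [("interlaced", 29.97), ("progressive", 23.976)],
        720:  [("progressive", 59.94), ("progressive", 23.976)],
    }

    def is_bluray_legal(height, scan, fps):
        return (scan, fps) in BLURAY_NTSC.get(height, [])

    print(is_bluray_legal(1080, "progressive", 29.97))   # False: 1080p/30 won't go on Blu-ray
    print(is_bluray_legal(1080, "progressive", 23.976))  # True: the only 1080p option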

  • Is there a way for the tracker to do key frames at the frame rate of the comp setting?

    I was doing some tracking in a 24fps project (yes my comp settings are correct) and I was really annoyed to see that the track looked to be generating key frames at 29.98fps. Of course this means crazy jittering.
    Is there a way I can change the rate that tracker generates key frames?
    Please help. I use CS3.
    Thank you in advance for your help.

    Mylenium said this:
    > Is the footage interpreted correctly? There has been some
    > discussion for CS4 about incorrect frame rate treatments in
    > pretty much all of Adobe's video-centric tools, but I'm not
    > aware of any particular issues with the tracker. Still, maybe
    > it is one of those bugs.
    redmike said this:
    > I was doing some tracking in a 24fps project (yes my comp settings
    > are correct) and I was really annoyed to see that the track looked
    > to be generating key frames at 29.98fps. Of course this means crazy
    > jittering.
    (I assume that you mean 29.97, not 29.98.)
    In all versions of After Effects, the motion-tracking keyframes are set at the frame rate of the footage item on which the layer is based, not at the composition's frame rate.
    Have you checked to see that your footage item is interpreted to be 24fps?
    Also, are you sure that this "jittering" that you're seeing is because of the rate of the motion-tracking keyframes?
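    This isn't an After Effects API, just a conceptual Python sketch of what the rate mismatch means: per-frame track points sampled at 29.97 fps, resampled (nearest frame) onto a 24 fps comp's timebase. The track data here is made up.

    def resample_track(points, src_fps=30000/1001, dst_fps=24.0):
        """Resample a list of per-frame track points from src_fps to dst_fps.

        points: [(x, y), ...] sampled once per source frame.
        Returns one point per destination frame over the same duration, using
        nearest-neighbour picking (linear interpolation would be smoother).
        """
        duration = len(points) / src_fps                 # seconds of tracked footage
        n_dst = int(round(duration * dst_fps))
        out = []
        for i in range(n_dst):
            t = i / dst_fps                              # time of the destination frame
            src_index = min(int(round(t * src_fps)), len(points) - 1)
            out.append(points[src_index])
        return out

    # Hypothetical 1 second of 29.97 fps track data -> 24 keyframes for a 24 fps comp.
    track = [(float(i), float(i) * 0.5) for i in range(30)]
    print(len(resample_track(track)))   # 24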

  • Guitar rig clips on playback

    Hi there,
    Been using Logic for years.  Just got Komplete 7 and everything is beautiful, except one thing.
    When I have Guitar Rig 4 being used as an amp processor on either bass or guitar tracks, it clips in Logic every time I stop and start the playhead.
    I've tried everything; there's no problem with the source audio file when the plug-in is disabled. No problems when playing back the whole audio file, just when I start and stop.
    Any ideas?
    Cheers.

    Hey man,
    Thanks for the response, but yes, I've tried multiple different buffer settings, even tried different sample rates and bit depths just to see what happened.
    It's bizarre; like I said, the clipping only happens when Guitar Rig is in the playback signal chain.
    I've even purged all unused samples from Battery and other sampler instruments in my session.
    It's really annoying because all the other N.I. plugs are great and work fine.
    I did find the exact same problem on another forum, but no resolution was offered beyond simply turning off the plug-in.
    Thanks anyway.

  • All that jitters is not gold . . .

    I have dead-ended.
    I recently purchased a Wacom Intuos3 tablet. When I try to use it, the cursor has a severe case of the jitters. It also randomly causes popups to appear, as if I were holding down the "control" key, even though none of the keys on the tablet or pen are set for "control". It is worst in my 3D modeling & CAD apps (the primary reason I got it), but it will still do it with only the Finder running. In CAD, it's bad enough to cause the coordinates to change at 1/16 inch resolution. The cursor is fine when the tablet is on (USB) but it is not active. It is only when I use the pen or Wacom mouse on the tablet that it occurs. My bluetooth mouse appears to have no effect on it.
    The techs at Wacom have been very helpful, but they're stuck, too. They believe it is some kind of RFI, first suggesting that it was possibly too close to my display (which does not seem to be the case), and now they think it is coming from the computer itself. At this point, all they can suggest is putting a USB card in the computer. I also believe it is some kind of environmental RFI, because how I hold the pen or touch the tablet affects the severity of the jitter. I tried it on a friend's dual G5 (although hers is Intel, mine PPC) and it worked fine. And no, it's not practical to bring hers here to see if it still works.
    I have the most recent driver (as does she). I have sat in the dark with everything in the room turned off. I unplugged the cordless phone. I have turned off the bluetooth and unmounted and turned off my external firewire drives. I have no wireless networking. I have had no mice or keyboards connected. I have plugged the tablet into different USB ports. The only things on in the room are the computer, the Apple cinema display and the dsl modem (about 30" away). I'm in a ranch house sitting on a concrete slab. The nearest commercial radio transmitter is about 6 miles away. Everything in the house that I can think of that could possibly cause RFI has been turned off (including two lights on dimmers at the other end of the house). I plugged it into the front port and moved the tablet 8' away from the desk and it still jitters just as much (if it were the computer causing it, the interference should have dropped by a factor of about 16). The only thing that slows it down, but doesn't eliminate it, is if I put my bare foot on the computer case - I'm assuming I'm draining some of the RFI I'm channeling to the tablet.
    So . . . Does anyone know if the computer can be a source of that much RFI, or if something else in my environment may be causing it?
    Thanks.
    dual 2.5 G5   Mac OS X (10.4.8)  

    Hello, all! The saga continues . . . .
    Well, I'm closer to isolating the problem from my recent machinations, but let me first address your latest round of suggestions.
    Daniel:
    Not ground loops. Right now, my setup is such that the recording equipment is only connected to the computer by optical cable (I/O). The computer analog out runs to an amplifier, which I had pulled. And I know people are checking in, because the "views" count keeps going up. Thanks.
    japamac:
    Manual-read(?) meter. No home security system. Water meter is remote-read, but they drive by & ping it. And it is possible: I felt like I was "probed" the other night. And we refer to my next-door neighbor as "The Alien". He's OK, but he REALLY keeps to himself, and sometimes he does things that are toward the edges of the bell-curve. Whenever we have random interference on the TV, we say he's trying to contact the mothership. As for Big Brother, I'm sure I was on lists in the 60's & 70's as a "known associate". Years ago, when my sister would call, we would sometimes get strange clicks & noises on the line, so we would just start talking to the FBI agents. They might just be using more sophisticated bugs.
    DaddyPaycheck:
    I've tried it in the dark with everything turned off. Today, everything in the room was unplugged except the computer and display, everything was disconnected from the computer, fans & humidifier in the house turned off, fridge unplugged, etc., and I waited until the furnace cycled off. The house was dead. It jittered.
    So here's what I did (after I turned the house back on). I took things to a neighbor's a couple of doors down. Her power comes from a pole about 150' from mine, but it's about 3/4-1 mile away on the grid. That side of the neighborhood goes out about 4x more often than this side. Once last summer, they were out for about 36 hours and we flickered a couple of times. It worked fine. So maybe it's interference on the line.
    I take it to the next-door neighbor's (not the Alien); the house between. The two poles are on the back corners of their property. 2 years ago when they remodeled, they had the service drop moved to "my" pole, and have missed out on maybe half a dozen power outages. I set it up and . . . . it works fine. It's not line interference.
    {SIDEBAR: The neighbor was quite impressed with the Mac (she runs her medical transcription service over the internet), is now quite unenchanted with Dell, and will probably be converted before the year is out.}
    I lug the stuff home and set it up in the workshop, the farthest place in the house away from "the office" (and not really a healthy environment for a computer). The jitter is slight, but it's there. I move the rig to this end of the house, but set up in the hall down from the office. More jitter. I set it back up, reconnect everything, do a little troubleshooting . . . St. Vitus' Dance. It's the room. (And yes, it is the end of the house near the Alien's)
    I pondered a bit and came up with a plausible, yet bizarre, reason for the interference. I'm going to play this one close to the vest, because it will take me awhile to set things up to test my thesis. Hopefully tomorrow.
    So, thanks again to everyone, and I'm still entertaining suggestions. I could very well be dead wrong again.
    dual 2.5 G5   Mac OS X (10.4.8)  

  • Rigging Question - I'm doing something wrong

    I am a bit confused on how to use rigs. I want to control and animate the Birth Rate of three particles from one Rig slider. I connected the Birth Rate parameters to my Rig slider, but the slider doesn't seem to do anything. It doesn't change the value of the Birth Rate parameters. What did I do wrong?

    When you rig a parameter, you usually rig the Minimum and Maximum values you want to have used. To do so you click on the small circles/dots under the Slider bar (they're called "snapshots") -- they will highlight blue when they're active. Once you activate a dot, you use the parameter controls you attached to the rig and set the value they should be at that point. Then activate the other dot and set what its value should be at that point. When the user uses the rig, the slider will automatically interpolate between the two values you set for the dots.
    If you turn down the disclosure triangle at Options, you will see you can set the Slider's Range Minimum and Range Maximum values (what the user sees in FCPX - so you can set -100 to 100 or a range from 50 to 60 if you wanted to - basically anything you like.) The important feature in the Options is the Interpolation. The default is Linear, so when the user slides from the minimum to maximum value, the parameter values will change in a linear manner. You also have the option to have the values Ease (out/in between snapshots) or you can use Constant. Constant keeps the same values across all rigged parameters until a new slider snapshot is crossed.  This means you can add as many snapshots as you want and control how values progress over the range you set.
    You add snapshots by double clicking under the slider bar. You can click on the dot and drag it to any position you like under the Slider bar (absolute precision setting the locations is next to impossible, so don't knock yourself out trying to get exact positioning at locations like 0.25, 0.5 and 0.75 -- it can be done, but you need to widen the inspector pane out as far as you can stretch it and it still takes several attempts to get the values to line up... it CAN be done... you just have to figure out if it's worth it.)
    Anyway, by adding several snapshots and setting their Interpolation to Constant, you can set up  "jumps" from value to value -- you don't have to have a linear progression from your minimum to maximum.
    HTH
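    Not Motion itself, just a minimal Python sketch of how the snapshots described above map a slider position to a parameter value under Linear versus Constant interpolation (the snapshot positions and values are made up):

    import bisect

    # Hypothetical snapshots: (slider position, parameter value) pairs.
    SNAPSHOTS = [(0.0, 0.0), (0.5, 30.0), (1.0, 100.0)]

    def rig_value(pos, mode="linear"):
        """Return the rigged parameter value for a slider position in [0, 1]."""
        xs = [p for p, _ in SNAPSHOTS]
        i = bisect.bisect_right(xs, pos) - 1
        i = max(0, min(i, len(SNAPSHOTS) - 1))
        if mode == "constant" or i == len(SNAPSHOTS) - 1:
            # Constant: hold the last snapshot's value until the next one is crossed.
            return SNAPSHOTS[i][1]
        (x0, v0), (x1, v1) = SNAPSHOTS[i], SNAPSHOTS[i + 1]
        t = (pos - x0) / (x1 - x0)
        return v0 + t * (v1 - v0)        # Linear: interpolate between snapshots

    print(rig_value(0.25))               # 15.0 (halfway between the first two snapshots)
    print(rig_value(0.25, "constant"))   # 0.0  (holds until the 0.5 snapshot is crossed)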

  • Highest audio sampling rate in CS4?

    Hello,
    I apologize if this has already been asked, but I have been searching everywhere and I simply cannot find the answer to this.
    What is the highest audio sampling rate that can be utilized in Premiere Pro CS4? Can it import and export 192kHz 24-bit audio?
    Thanks in advance

    Hey Hacienda,
    I might not have the experience in audio work you have since I've only been doing this for the past 6 years or so.  But I've been a musician for far longer than that, and I've learned A LOT mostly from really smart people in the industry.  So, I'm not gonna lie to you and say that I've done extensive testing in this area because I simply do not have the equipment, nor the money to buy it (WAY too expensive).  But we do share the neophyte status when it comes to video editing :-P
    Anyways, the Nyquist Theorem is not a theory, which is what people are led to believe.  It is a theorem, meaning it's already mathematically proven.  It is proven that, as long as you follow the premise of capturing twice the highest frequency of the sound source, you'll get a perfect reproduction of it.  To capture more than that is a waste of bandwidth, especially because most people won't even hear above 18KHz, nor do they have the equipment to reproduce such frequencies.  Most consumer systems and audio gear, including those found in professional studios, go up to about 22KHz.  You need to spend BIG dollars for anything that goes beyond that.  So, who are we really making music for here?  The super rich?  Dolphins?
    Now, I know you're not just talking about higher frequencies, but the amount of samples needed to reconstruct a perfect copy of the original waveform.  OK, well, this is the kind of snake oil marketing BS I was talking about.  The biggest one being that 1-bit DSD crap that Sony/Philips is pushing.  Adding more samples to the recording will not make any difference in how faithfully you can reproduce a sound.  It'll just make the files bigger for no reason.  Again, the Nyquist Theorem already proves this.  This is FACT!  Here's a link I found interesting regarding these audio industry lies, maybe you will too: http://theaudiocritic.com/back_issues/The_Audio_Critic_26_r.pdf It starts on page 5, but the one pertaining to this discussion is lie #3 on page 6. :-D
    Don't forget that modern converters already sample at much higher frequencies than the target sampling rate.  I believe my RME Fireface 400 samples at 5.6MHz, which is twice the amount of samples compared to DSD technology, before going back down to the target rate.  But, like I said, it does so for other reasons and NOT because it needs that many samples in order to faithfully reproduce a waveform.  Of more importance are the quality of the FIR (Finite Impulse Response) filter and the clock inside the converters.  These components are what make a converter high grade, among others.  The converter chips themselves are very inexpensive (in the tens of dollars), which is why you hear some companies advertising having the same converter chip as a ProTools HD rig (not the best example, I know).
    By the way, I didn't say humans can only hear up to 20KHz.  I'm sure there are people who can hear above that.  My point was that the 20Hz - 20KHz range is what's generally accepted as an average for humans (which implies that there are people who can hear above/below that).  Also, the reason why modern-day pop records cause headaches and sound horrible is because of a totally different issue known as "The Loudness War" (I'm sure you know about it so I won't go into details).  However, I do agree with you as far as compressed audio goes.  Unfortunately, there's a reason for that and there's nothing we can do about it until the day Internet bandwidth becomes more accessible and cheaper.  Eventually it'll get to the point where uncompressed audio can be streamed reliably through the net.  But, until then, we're stuck with MP3, AAC, DTS and other audio compression formats.  As far as digital media distribution goes, it's the future and companies are seeing that.  More and more people download music rather than buying CDs, so I do believe those numbers are accurate.  Just look at sales from iTunes and even games like Guitar Hero and Rock Band.  It's just a matter of time.
    Take care!
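    For what it's worth, here is a small NumPy sketch of the sampling-theorem point above: a band-limited tone sampled at anything above twice its frequency can be rebuilt essentially exactly by sinc interpolation. The remaining error here comes only from truncating the infinite sum to a finite window, which is why the test times are kept away from the edges.

    import numpy as np

    fs = 48000.0                  # sample rate (Hz), comfortably above 2 x 18 kHz
    f0 = 18000.0                  # test tone near the top of the audible range
    n = np.arange(2048)
    samples = np.sin(2 * np.pi * f0 * n / fs)          # the 48 kHz "recording"

    # Reconstruct the continuous signal at arbitrary times via sinc interpolation.
    t = (np.arange(400) + 800) / (4 * fs)              # dense time grid away from the edges
    recon = np.array([np.sum(samples * np.sinc(fs * ti - n)) for ti in t])
    truth = np.sin(2 * np.pi * f0 * t)

    print("max reconstruction error:", np.max(np.abs(recon - truth)))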

  • Using keyframes to automate a rig slider widget in Motion 5.

    I'm trying to automate a rig slider widget with keyframes, but any time I try to set a value for a keyframe, the value changes for the entire object instead of just at that point in time.
    Here's the background:
    I set up the rig to make a bunch of pieces appear to explode. (Could have used "repel from" behavior, but wanted more control.) It uses a slider widget, values ranging from 0 to 100. At value 0, all the pieces are together, and they form the intact, un-exploded object. At value 100, the pieces are all positioned away from each other, and the object appears to have exploded.
    Now, when I move the slider by hand, it performs the intended motion. But using keyframes to do this motion over time does not work. I'm able to change the value of keyframes, but doing so changes the values globally — that is, editing one keyframe makes any other keyframes drawn inherit that value.
    This is what it looks like broken down:
    • First, I'll draw a keyframe at value 0, time 00:01;
    • then I go to draw the keyframe at value 100, time 00:30;
    • it creates a new keyframe at value 100 and time 00:30, but...
    • the 0 keyframe at time 00:01 mysteriously jumps to value 100 (though stays in its place in time), and the entire keyframe graph becomes a horizontal line, thus eliminating the would-be change over time.
    In short, I can't animate the slider because changing a keyframe changes all values to that one value.
    Any ideas on what's going on here?
    I'm thinking it may have something to do with the fact that the parameters modified by the slider are all "ramp" behaviors of objects. The slider controls both the start and end values of the "ramp" behavior, but I set it so the start and end values are always the same. (In effect, using ramp as a means to modify X and Y positions.)

    Widgets allow you to distill complex motions or animations into a simple interface. For instance, you can link together and control the values of numerous related parameters into a single slider. That's what I was doing.
    The problem, I now see, is that I was trying to connect values of one of the prefab behaviors. Those already happen over time, so I think it just tripped Motion up.
    I circumvented that by relinking the widget directly to the position values of the objects, rather than linking it to the behaviors, which in turn controlled the position values.
    To answer your last question more directly, I was animating the position parameters of objects. The objects were bezier shapes that, when pieced together, formed a line of text. The effect I was going for was to have the text start in (apparently) one piece — then have the pieces suddenly move outward from the center of the frame to give the effect of exploding. In the end, the easiest way to do this was to use a slider on a rig (after mapping it to start and end points for each piece). When done correctly, dragging the slider to value 0 brought all the pieces together to form the whole word. Dragging the slider to value 100 made all the pieces "explode" out.
    This also makes it easy to come back and make changes to the rate of the "explosion" motion. Which was important, because there were about 60 pieces. Redrawing each path individually would be too much work.

  • Apogee 16X, Gigas, Sample Rate Conversion, and Summing/Monitoring

    I'm trying to do an update on my home studio rig.
    I've decided to get the new Quad Core 3g G5.
    I've decided on the Apogee 16X/Symphony card combo.
    I need to route the outputs of six PCs into Logic; one PC is running KYMA/Capybara, one PC will be running the Native Instruments Kore/Komplete VST host, and the remaining four are running Gigastudio 3. All the PCs are lightpipe out. At present, I run the lightpipes into a Hammerfall lightpipe-to-MADI converter, and from there onto a MADI card via coax directly into my present G5, the Dual 2g.
    I monitor though a Sony DMX-R100.
    So far, no problem.
    But I've decided to start working at 96k, so my present scenario becomes more difficult, as my sample library (about 1.2 terabytes) is all 48k.
    I'd sort of like to get rid of the DMX, but I'm considering using it as a submixer for the PCs, running analog out of it into 12 channels of one of the Apogees. That would solve the problem of sample rate conversion. But it would still leave me with a pretty big piece of gear that I'm not sure I need.
    The other thing I'm wondering about is summing and monitoring... I'd like to be able to avoid ganging everything through the Logic 2-bus; I like it better when I can spread things about a bit in groups, and there are also a few other things (like the movie audio and a DVD player) that I need to be able to monitor through the same system.
    I also would like to be able to use my Fairchilds and Pultecs on the 2-bus of whatever I'm monitoring through, which would mean somehow returning that through Logic, or using my 2g Dual as a sort of mix&stem storage/archive/networking computer. (Which I'm not wholly opposed to.)
    Any ideas?
    Dual 2Ghz G5   Mac OS X (10.4.7)  
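    Not Logic-specific, but on the sample-rate-conversion point: if the 48k library were ever converted offline instead of submixed in analog, the basic idea looks like the SciPy sketch below. 48 kHz to 96 kHz is an exact 2:1 ratio, so polyphase resampling is clean. The file names are made up, and using the soundfile module for I/O is an assumption.

    import numpy as np
    from scipy.signal import resample_poly
    import soundfile as sf   # assumption: libsndfile-based I/O is available

    # Hypothetical 48 kHz sample from the library.
    audio, sr = sf.read("library_sample_48k.wav")   # audio: (frames, channels), sr == 48000

    # Upsample by 2, downsample by 1: 48 kHz -> 96 kHz with a polyphase anti-imaging filter.
    audio_96k = resample_poly(audio, up=2, down=1, axis=0)

    sf.write("library_sample_96k.wav", audio_96k, sr * 2)
    print(audio.shape, "->", audio_96k.shape)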

    96k (at least) is pretty important, because as well as doing the normal sort of workaday film and pop level audio, I'm also doing this very intimate project for an audiophile vinyl company, and it's stipulated that I must use 96 at the very least. I'll probably be doing that stuff at 192, and I will be keeping almost everything third-party off the processor. The company would prefer that I did everything to a 2" analog 8-track, but that's where I drew the line. Thing has to be aligned every three hours of operation, cause the track widths are so high, and the thing's pretty old, I'm afraid.
    For the film and pop stuff, I still like 96k, cause... I dunno why, I guess, now that you mention it. But even when stuff is downsampled, I still think it sounds better when it's recorded at higher resolution.

  • Help with stuttering/jittering problem AFTER export

    Hi all,
    I'll describe the problem, then the solutions I've tried. First, my system is a Dell Precision 690 3.0Ghz dual core, 2Gb ram, win XP pro service pack 3, Premiere CS4. I also updated to the latest nVidia drivers (8 Jan 09, ver 181.20) which fixed a separate crashing problem with CS4.
    Here's the problem. My video in the timeline previews perfectly. No jittering, stuttering, or flickering. When I export my video, I always get some scenes jitter or stutter (the action in the scene jumps around back and forth). This generally occurs in the same places of a scene. I don't have any stills or freeze frames.
    I've read some of the forum threads on flicker and stutter, but they mainly apply to a frozen frame, or picture. I've played with the following with no effect: anti-flicker, gaussian blur, increasing my bit rate, exporting to different formats (mpeg1, avi, wmv).
    My frame rate is 29.97 drop frame. It wouldn't allow me to change my field order. Other settings are NTSC, multiplexer (Mpeg1) (I'm not sure what that does), multiplexer bit rate=variable, video bit rate tried at CBR= 3.0Mbps, and VBR1 (min=1, avg=3,max=4.5).
    This is driving me crazy as my timeline preview is perfect. Please help!!!
    Thanks very much in advance! Also, is CS3 less problematic than CS4?
    RickyRoma

    Ricky:
    The successful export was based on a time-consuming troubleshooting workaround.
    This was a corporate video that was played at a fairly large meeting. Several months later other departments of the company wanted to use the same video, but with some shot changes that are relevant to their departments. In the meantime, we upgraded to CS4 and had to open the CS3 project with CS4 (We uninstalled CS3).
    I narrowed down the jitter problem to a graduated Gaussian blur added to a PSD on Video layer 1 that had an image above it on Video layer 2 with a transparent background. Gaussian blur and fast blur cause unacceptable image shake when the timeline is rendered, and the shaking exists in the exported AVI. Fortunately, I was able to overlay segments of the original CS3-exported avi over the shaking areas in the new CS4 project because those areas were not changed.
    The overlay presented yet another minor problem. The aspect ratio of a widescreen NTSC video exported from CS3 is about 8 pixels wider than that exported from CS4. So I had to stretch the old CS3 overlaid avi segments to a place on the timeline where the aspect ratio change was not detectable.
    *It is possible that one of the editors stretched the width of the CS3 avi. One person in the forum did confirm though that CS3 exported avi's with a slightly wider aspect ratio than CS4.
    We had a few other weird problems with this CS4 project converted from CS3:
    1.) Certain still image transitions failed to work. They would appear normal until rendered. The exported avi would be missing the transitions.
    I found a workaround... For whatever reason, hitting the "reset" icon in the Effect Controls panel (keyframe added) fixed the missing transition problem.
    2.) Some timelapse, scaled down P2 1080i footage on the timeline exported with an odd unintended effect. The sky with clouds would hold a few frames, then jump ahead. Kind of a stop-start effect that was not in the original footage.
    Didn't find a work-around.
    3.) A still image at the end of the video shook violently in the export.
    I fixed that by being able to overlay the original CS3 avi over this area.
