DRAM Clock Settings and SDRAM CAS Latency

Just got my KT3 Ultra2 Socket A motherboard. When I turn on the computer, the POST screen says:
DRAM Clock = 266MHz
SDRAM Cas Latency = 2
I checked the CMOS setup and the latency is set at 2, but what about the DRAM Clock? Please help, I'm a newbie. Thanks.

If your CPU is an Athlon XP 2700+/2800+, the FSB is 166 MHz and the DRAM frequency is HCLK (166 MHz).
If your CPU is an Athlon XP 2600+ or below, FSB = 133 MHz; then HCLK = 133 MHz and HCLK+33 = 166 MHz.
If your CPU is an older Athlon or Duron, FSB = 100 MHz; then HCLK = 100 MHz, HCLK+33 = 133 MHz, and HCLK+66 = 166 MHz.
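The FSB-to-DRAM-clock options above can be sketched as a small lookup (illustrative only; the option names mirror typical BIOS labels but are assumptions, not this board's exact menu text):

```python
# Map an Athlon-era FSB (HCLK) to the DRAM clock options a KT3-class
# BIOS typically exposes. Values are in MHz; the DDR "effective" rate
# that POST reports is twice the actual clock.
def dram_clock_options(fsb_mhz: int) -> dict:
    options = {"HCLK": fsb_mhz}
    if fsb_mhz <= 133:
        options["HCLK+33"] = fsb_mhz + 33
    if fsb_mhz <= 100:
        options["HCLK+66"] = fsb_mhz + 66
    return options

def effective_ddr_rate(dram_mhz: int) -> int:
    # DDR transfers data on both clock edges, so DDR266 = 133 MHz clock.
    return dram_mhz * 2

print(dram_clock_options(100))  # {'HCLK': 100, 'HCLK+33': 133, 'HCLK+66': 166}
print(dram_clock_options(133))  # {'HCLK': 133, 'HCLK+33': 166}
print(effective_ddr_rate(133))  # 266 -- what POST shows as "DRAM Clock = 266MHz"
```

This is also why the POST screen says 266 MHz while the FSB is 133 MHz: the BIOS reports the doubled DDR data rate, not the memory clock itself.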

Similar Messages

  • DDR 3200 Cas latency Settings

    I have 2x256MB Corsair 3200 Platinum Low Latency TwinX memory. I changed the latency settings to 2-2-3-6 and my PC is now running faster. Does anyone know the optimal settings I should be running them at? Also, what is the burst length in the BIOS? (This is listed right after the CAS latency section in the BIOS.) I have a choice of 8 or 4; which should I choose?

    If your system is running faster, you should be happy. Timings vary from system to system depending on BIOS version and quality of RAM. Ideal settings are 2-2-2-5. In reality this is hard to achieve with stability on the Canterwood/Springdale chipsets, especially in the higher performance modes (Ultra-Turbo). Also, RAS-to-CAS seems to be far more stable on these chipsets when set to 3 (my gut feeling is that PAT is somehow affecting this setting). Memory manufacturers like Mushkin recommend that 6 be used instead of 5 for better stability and less potential data corruption. Finally, from what I remember about burst length, 4 is more stable than 8, but 8 allows for more speed. I do not think there is a big drop-off when using burst length 4 as opposed to 8; you would have to do your own testing. Your settings are fine as long as your system is stable. My one question is: how sure are you that what the BIOS is reporting is what your memory is actually running at? This has been a "known" issue with various MSI BIOSes. Use an independent reporting tool such as CPU-Z to check the validity of your timings. Also, as you tighten your timings it is advisable to raise your Vdimm; 2.65-2.80 V seems to do the trick, especially if overclocking.
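For a sense of scale, timing numbers like these can be converted to wall-clock time (a rough sketch; it assumes DDR400's 200 MHz memory clock and the standard tCL-tRCD-tRP-tRAS ordering):

```python
# At DDR400 the memory clock is 200 MHz, so one cycle is 5 ns.
CYCLE_NS = 1000 / 200

def timing_ns(tcl, trcd, trp, tras):
    """Convert timing values given in clock cycles to nanoseconds."""
    return {name: cycles * CYCLE_NS
            for name, cycles in
            [("tCL", tcl), ("tRCD", trcd), ("tRP", trp), ("tRAS", tras)]}

print(timing_ns(2, 2, 2, 5))  # the "ideal" 2-2-2-5: tCL 10 ns, tRAS 25 ns
print(timing_ns(2, 2, 3, 6))  # the safer 2-2-3-6: tRP 15 ns, tRAS 30 ns
```

The difference between the aggressive and safe settings is on the order of a few nanoseconds per access, which is why the stability margin matters more than the raw numbers.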

  • Syncing and Clock Settings/Adjustment

    Hi Everyone
    I don't know if any of you do this, but I have a habit of setting my clocks forward to make sure I'm on time.
    All my clocks show a different time (MacBook +15 min, iPhone +20 min).
    I know it's a bit silly, but it helps me be on time.
    However, it becomes a problem when all of a sudden the iPhone shows the same time as my laptop.
    It first happened after I synced, so I made sure to turn off Automatic Date & Time on the iPhone
    (General / Date & Time / Set Automatically: OFF).
    But when I recently upgraded to iOS 6, out of nowhere it adjusted to the laptop's clock settings, although the iPhone was still set to OFF.
    What do you think causes that?
    How can I avoid it?
    Is it due to the iOS upgrade?
    Thanks for your help
    David

    Hi, Gothboy -
    If the PMU reset that Tom suggests does not fix the problem, then try a reset of NVRAM -
    - Start the computer and immediately hold down Command-Option-O-F
    - When the computer starts it should open into a prompt screen (Open Firmware).
    - Press Return, you will see OK.
    - Type reset-nvram (include the dash, here and below)
    - Press Return
    - Type set-defaults
    - Press Return
    - Type reset-all
    - Press Return
    The system should restart. If not, try typing exit or shutdown, or you can manually restart the system.
    Note - some machines may return "unknown word" in response to the reset-nvram instruction; if that happens, try init-nvram instead.

  • CAS Latency of ram in 15- and 17-inch 2010

    What is the CAS Latency (CL) of the ram in MacBook Pro 15- and 17-inch 2010 models?
    I can't find any information about this anywhere.
    Thanks!

    Hello DCoin, welcome to the forums! To answer your question: CL = 7.
    You can see some further specs here:
    http://eshop.macsales.com/item/Other%20World%20Computing/8566DDR3S8GP/
    Hope this helps.

  • Memory settings? Corsair XMS Extreme - Cas Latency: 2-3-2-6 T1

    I've been trying to make sense out of the memory settings in the Bios. I read a bunch of posts but I'm still a little confused...
    Here is my memory:
    2x512Mb DDR400 - Corsair XMS Extreme - Cas Latency: 2-3-2-6 T1
    TWINX1024-3200LLPT
    on MB: K8T NEO-FIS2R
    what should be the settings as well as voltage for this?
    bank interleaving
    burst length
    CAS (CL)
    TRCD
    TRAS
    TRP
    right now I have:
    CAS: 2
    TRCD: 4
    TRP 2
    DDR Volts 2.65
    thanks
    TZ

    I'm running the same memory as you are with a different colored heat spreader, Corsair XMS 3200LL.  My system has worked fine from day one with the standard SPD settings that are 2-3-2-6-1T at the default voltage.  Unfortunately, Corsair is really hit or miss since all the BH5 chips are gone.  Our RAM is built on CH6 chips (assuming v1.2 RAM).  There's a great Athlon 64 Special Report on setting RAM timings for the Athlon 64 on Mushkin's website.  Luckily, I haven't had to change anything, but the article may help you out.

  • 6547E Max C dram BIOS settings

    I have one stick of Samsung PC2700 DDR RAM, and can
    only get it to reliably run @ 230 MHz.
    I'm running a P4 2 GHz @ 2.3 GHz without a problem, but when I try any CPU:DRAM ratio besides 1:1 my system
    locks at POST ;( .
    Shouldn't I be able to run PC2700 @ 333 MHz?
    Here are my system/BIOS settings:
    P4 2 GHz CPU, FSB @ 115 for a CPU speed of 2.3 GHz
     (runs @ 60 C and stable)
    Core voltage @ 5.25v
    Samsung 512 MB DDR PC2700
    CPU:DRAM ratio @ 1:1 for a DRAM frequency of 230 MHz
     (anything else and I lock up @ POST)
    CAS latency @ 2.5
    Timing @ turbo
    Host DRAM @ fast
    Graphics window @ 128
    Generic 450W PSU
    So wassup? My comp runs OK at these settings but I'm
    just wondering why I can't reach 333 MHz with the
    RAM I have.

    Quote
    Core voltage @ 5.25v
    I don't think so...
    Is this a Willamette or Northwood CPU?
    Does the memory run well at 333 MHz if you don't overclock the CPU?

  • Does Cas Latency really make a difference?

    I have an MSI K8N Neo2 setup (AMD64 3500+) and I am inquiring whether memory latency effects are really noticeable during heavy gaming or creating home movies for DVDs.  Currently, I have two sticks of Corsair Value Select 512MB DDR PC-3200 (VS512MB400), which has a CAS latency of 2.5.  Is there any noticeable performance reason I should trade up and buy some better-performing memory?  If so, does anyone have a tried-and-true favorite with the K8N Neo2?  Any thoughts?  I have never overclocked; I just run at stock speeds.  Perhaps the performance gain is not worth the money?
    I also have:
    eVGA geforce 6800NU
    2- 200 GB Seagate IDE barracuda
    Sony DVD burner
    Enermax 465 watt PS
    Hauppauge PVR 250
    Netgear WG311v2
    4 case fans
    1 GB PC-3200

    Spread Spectrum Modulation was invented to reduce interference from high-order harmonics of the bus frequency. The theory is that, because every waveform generates higher-order harmonic waves (overtones), accumulation of the latter can result in interference with the original signal. One way to avoid this problem is to subject the base frequency to a slow (ca. 100,000 clock cycles) modulation, meaning that the FSB varies between, e.g., +1% and -1% of the nominal value. In older boards, usually two different settings were available: either centered around the nominal value, or set with the nominal frequency as the maximum (low modulation). Most current boards employ the centered modulation.
    This is, at least, the official version of Spread Spectrum Modulation. In reality, there are different reasons for its implementation. With increasing operating frequency, electronic components emit electromagnetic interference (EMI). EMI, in turn, can cause interference with other devices and is therefore subject to regulation by the FCC, which limits the signal amplitude according to its guidelines. Any device exceeding the maximum allowable signal strength will not gain FCC approval and therefore cannot be marketed.
    In order to understand the reason for SSM, it is necessary to know how the FCC tests EMI. Basically, the testing device is a radio receiver, and the testing is done by sweeping its receiving frequency through the frequency range of interest and measuring the interference with the video and audio signals. The bandwidth sensitivity of the measuring device is on the order of about 1 MHz. If the operating frequency is modulated to spread over a bandwidth of typically 4-5 MHz, the same will happen to the EMI spectrum: instead of showing a sharp peak, the spectrum will be spread into a more or less Gaussian bell shape. In this case, the amplitude will of course be substantially smaller, on the order of 1/3 to 1/4 of the original peak. The energy, however, will be the same. The measuring instrument, with its bandwidth limited to only 1/4 of the spread, will consequently see only 1/3 to 1/4 of the EMI. Therefore, the system will obtain FCC approval even if it exceeds the guidelines.
    Recommended Settings
    If running at stock settings, enabling Spread Spectrum Modulation (SSM) may reduce EMI and cause less interference with wireless communication devices.
    On the other hand, enabling SSM can cause a system to crash. This is especially true when overclocking, simply because with the high multiplier values employed now, even a 0.5% modulation up and down can cause differences of up to as much as 10 MHz in clock speed within one modulation cycle. In other words, if the CPU is already operating at its limit, increasing the clock speed by another 10 MHz may be fatal. Therefore, for any overclocking, SSM should be turned off.
    Another side effect of SSM is that it can interfere with the clock generator. This means that, by merely toggling SSM, it is possible to end up with FSB settings that were never supported by the manufacturer. Examples are the Tyan Trinity 400, where disabling SSM results in an actual bus speed of 90 MHz instead of the selected 117 MHz, or the MSI 6309, where FSB settings of up to 200 MHz become available. The reason is that activating SSM can cancel out the FSB setting, since there can be a pin address overlap on the clock generator chip.
    (from lostcircuits.com)
    Let's say your CPU is putting out EMI; it's all concentrated on one "channel"/frequency of the spectrum.
    What spread spectrum does is broaden that out over multiple channels/frequencies so it isn't as "potent" or interrupting.
    http://www.rojakpot.com/default.aspx?location=8&var1=0&var2=115
    This is interesting:
    FSB spread spectrum enabled can cause internet dropouts.
    http://www.asus.com.tw/support/faq/qanda.aspx?KB_ID=84823
    If I can access it in the BIOS, I always have it disabled on any computer I'm maintaining.
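The peak-spreading effect described above can be simulated in a few lines (a sketch only; the 100 MHz carrier, ±0.5% depth, and 30 kHz modulation rate are illustrative assumptions, not any board's real numbers):

```python
import numpy as np

# Show how spreading a clock's frequency lowers its peak spectral
# amplitude while keeping the total energy roughly constant.
fs = 1e9                       # simulation sample rate: 1 GHz
t = np.arange(0, 2e-4, 1 / fs)  # 200 microseconds of signal
f0 = 100e6                     # nominal clock frequency

# Unmodulated clock: a pure tone, so all energy lands in one FFT bin.
pure = np.sin(2 * np.pi * f0 * t)

# Spread-spectrum clock: crude +/-0.5% square-wave FM at 30 kHz,
# synthesized with a phase accumulator to keep the phase continuous.
mod = 0.005 * f0 * np.sign(np.sin(2 * np.pi * 30e3 * t))
phase = 2 * np.pi * np.cumsum(f0 + mod) / fs
spread = np.sin(phase)

peak_pure = np.abs(np.fft.rfft(pure)).max()
peak_spread = np.abs(np.fft.rfft(spread)).max()
print(f"peak reduced to {peak_spread / peak_pure:.2f} of the original")
```

The modulated clock's energy is smeared across many bins, so the tallest bin (what a narrowband EMI receiver sees) drops well below the unmodulated peak, which is exactly the 1/3-1/4 reduction the quoted article describes.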

  • Cas latency setting

    Why, when I set my memory timings' CAS latency to 2.5, and the BIOS says CAS latency is set to 2.5, does CPU-Z report it as being set to 2?

    Quote
    Originally posted by DOS
    If you set any performance setting other than Slow, the memory timings are set for you. If you want to overclock to 3G, set the RAM voltage to the max of 2.8, set the performance mode to Slow (bad term; it should be 'Normal'), set the DRAM clock to 333 (this will run the RAM at 320 when the FSB is at 200, also referred to as the 5:4 ratio), and set your FSB to 250. This will clock the RAM at 400. You should then be able to go further.
    This is what I have done: the performance setting is set to Slow, and even right now I have CAS 3 set in the BIOS, but CPU-Z still shows it using CAS 2. Using a 333 MHz DRAM clock with a 250 MHz FSB is stable at all default voltages, but anything above that won't work, no matter what the voltages. If I set the FSB to 255, it won't be stable anymore, not even with 1.7 V vcore and 3.3 V memory voltage. Even setting the DRAM clock to 266 MHz won't be stable with a 255 FSB. Also, running a lower FSB with a higher memory clock, so that the FSB is lower but memory goes above 200 MHz, won't be stable no matter what.

  • Mixing Ram with different CAS latencies in K7n2 Delta-L?

    I have been using 512 MB of Kingston Value RAM in my computer, PC3200, with a CAS latency of 3.0.   I just purchased some Corsair Value Select 512 MB PC3200 RAM with a CAS latency of 2.5.   I know you are not supposed to mix RAM with different speeds (like 400 vs. 333), but does it matter if the CAS latencies are different?
    I'm running an AMD XP 1800+ on a K7N2 Delta-L nForce2 motherboard from MSI.  Someone at a computer shop told me they tried mixing Kingston 3.0 CAS with Corsair 2.5 CAS, and said the computer wouldn't even POST.  So I'm wondering whether this is an isolated problem, or whether you will always run into it if you mix RAM with different CAS latencies.
        If anyone has experience with this issue, please share with the board.  Thanks!
    Chris

    The age-old question: will the components I bought be compatible? There are those who have bought identical matching memory that didn't work. In my case, I had an ECS K7S5A that used PC2100 Crucial memory; the sticks didn't match (at one time they did, but one stick died and Crucial sent me the closest thing they had). Neither of these Crucial sticks was identified by a memory configurator as being compatible with the K7N2 Delta-L that I decided to buy. In the end, not only did both sticks work, but I first put them in single-channel mode (RAM slots 1 & 2) and later changed the configuration to dual-channel mode (RAM slots 1 & 3, even 2 & 3). It worked flawlessly in all three configurations. Sorry for the long answer, but in essence, there are no guarantees, and your worries will drive you crazy. When memory is mixed, you reduce the likelihood of compatibility, even when you go beyond the recommended products that were specifically identified.
    That guy at the computer shop: did he ever isolate the true reason why the memory wouldn't work? Maybe, maybe not. I would think that with mismatched memory, the BIOS keys its settings to the worst stick of the bunch. Then again, the sticks you get might overclock to what the better sticks run at. My assumption is that with mixed sticks, the BIOS retards the settings accordingly in "auto" mode. When you force settings manually, it may work or it may fail. One thing you oughta know: each and every stick has the manufacturer's ID encoded onto it at the very least, just like the firmware on other items indicates a manufacturer code and other data that lets these things interoperate. So know that you face that battle going in with any of them. If the BIOS programming doesn't allow a combination to work, from that perspective, there'll be problems.

  • Why DDR3 @ 1333MHz on Core I7 iMac + CAS Latency

    Hello,
    I've just ordered an iMac 27" Core i7 (mid-2010) with 4 GB.
    On Intel's i7 page, I read:
    http://ark.intel.com/Product.aspx?id=37148&processor=i7-940&spec-codes=SLBCK
    Memory Specifications
    Max Memory Size (dependent on memory type) 24 GB
    Memory Types DDR3-800/1066
    # of Memory Channels 3
    Max Memory Bandwidth 25.6 GB/s
    Physical Address Extensions 36-bit
    ECC Memory Supported No
    On the other hand, Apple says that memory upgrades shall match the followings:
    * PC3-10600
    * Unbuffered
    * No parity
    * 204 pins
    * 1333 MHz
    * SDRAM DDR3
    My questions are:
    1° Why shall we use 1333MHz since i7 only supports 1066MHz ?
    2° As regards CAS Latency, if I add 2x2Gb with faster CAS, will they work at their own CAS Latency or at the stock 2X2Gb's ?
    3° If I decide to leave one slot free, I can simply add 1x4Gb. Will this impact the dual channel performance of the stock 2X2Gb or will this perform as before ?
    Thanks a lot for your help !
    Cheers,

    Hi yoms
    Welcome to Apple Support Communities
    {quote:}1° Why shall we use 1333MHz since i7 only supports 1066MHz ?{quote}
    Do yourself a big favor and stick to [Apple's Memory specification|http://support.apple.com/kb/HT4255]; those who have not have had all sorts of problems, ranging from slow performance, kernel panics, and failure to boot to damaged slots and fried memory controllers.
    {quote:}2° As regards CAS Latency, if I add 2x2Gb with faster CAS, will they work at their own CAS Latency or at the stock 2X2Gb's ?{quote}
    When [Installing or replacing memory|http://support.apple.com/kb/HT3918] you can have 2 or 4GB modules in opposing banks, but they all must be the same latency and speed.
    {quote:}3° If I decide to leave one slot free, I can simply add 1x4Gb. Will this impact the dual channel performance of the stock 2X2Gb or will this perform as before ?{quote}
    More ram will help, but [Matched RAM on Intel Macs|http://guides.macrumors.com/MatchedRAM_on_IntelMacs] always provides the best performance.
    Two highly recommended third party ram suppliers are [OWC|http://eshop.macsales.com/shop/memory/iMac/2010/DDR321.527] and [Crucial|http://www.crucial.com/store/listparts.aspx?model=iMac%203.6GHz%20Inte l%20Core%20i5%20%2827-inch%20-%20DDR3%29%20Mid%202010&pl=Apple&cat=RAM]
    Dennis

  • DAQ Assistant: Clock Settings (Samples To Read, Rate) can affect signal readings?

    Dear all,
    I'm totally new to LabVIEW, and recently I've gotten confused by the equipment I'm dealing with.
    Technical details:
    The LabVIEW version is 7.1; the computer's operating system is Windows XP.
    The equipment has an NI PCI-6220 and a 68-pin connector block to read signals from the equipment.
    There are 4 channels in DAQ Assistant (2 pressure readings, 2 temperature readings).
    For the first pressure reading, the Signal Input Range is from 4 mA to 20 mA.
    Clock Settings are Samples To Read = 5, Rate (Hz) = 20.
    Description of the problem:
    I use LabVIEW to monitor and record pressure and temperature readings. The LabVIEW configuration was set up by my advisor several years ago. Recently, I found the pressure reading fluctuated a lot; for example, from 5.01 to 5.05 bar within a second. In order to get a stable pressure reading, my advisor suggested I change the "Clock Settings" in DAQ Assistant from Samples To Read = 5, Rate (Hz) = 20 to Samples To Read = 250, Rate (Hz) = 1000. She believed that since we increase the sample count and the sampling rate, we would have more data, and thus more stable pressure readings.
    At first I got a very stable pressure reading. The last digit (0.01) did not change within 20 seconds. However, somehow after a day the pressure reading became unstable, and even worse than before (the pressure reading fluctuates from 5.01 to 5.30 within a second).
    This is not the worst of it. We found that when we set Clock Settings to Samples To Read = 5, Rate (Hz) = 20, the pressure reading is about 8 bar; however, when we set Samples To Read = 250, Rate (Hz) = 1000, the pressure reading is about 5 bar. So we don't even know which pressure reading is correct.
    LabVIEW records a current and transforms it into a pressure reading. Thus my advisor tried to monitor the current reading in LabVIEW, and she found the current reading changed when she changed the Clock Settings (0.004 Amps (5 bar) when Samples To Read = 5, Rate (Hz) = 20; 0.005 Amps (8 bar) when Samples To Read = 250, Rate (Hz) = 1000).
    Since we only changed the sample count and sampling rate, the average readings should still be similar. However, they are not. That is what confuses me.
    My questions are: can the Clock Settings in DAQ Assistant affect the signal readings? If so, how? What is the effect of "Samples To Read" and "Rate (Hz)", and how should these parameters be chosen to get the true pressure readings?
    Thank you very much for your help. I hope to get some feedback from you.
    Best regards,
    Cheng-Yu
    Energy and Mineral Engineering
    the Pennsylvania State University
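The current-to-pressure conversion involved here can be sketched in Python (a minimal illustration: the 50-ohm shunt is the usual value for a 0-20 mA input, and the 0-10 bar span is an assumption, since the rig's actual transducer range isn't stated in the post):

```python
# Standard 4-20 mA current-loop scaling, read as a voltage across a shunt.
R_SHUNT = 50.0             # ohms, typical for a 0-20 mA signal
P_MIN, P_MAX = 0.0, 10.0   # bar at 4 mA and 20 mA respectively (assumed range)

def volts_to_pressure(v: float) -> float:
    i = v / R_SHUNT                       # Ohm's law: current through the shunt
    frac = (i - 0.004) / (0.020 - 0.004)  # position within the 4-20 mA span
    return P_MIN + frac * (P_MAX - P_MIN)

# A 4 mA signal develops 0.2 V across 50 ohms; 20 mA develops 1.0 V.
print(round(volts_to_pressure(0.2), 3))  # 0.0  (bottom of range)
print(round(volts_to_pressure(1.0), 3))  # 10.0 (top of range)
print(round(volts_to_pressure(0.6), 3))  # 5.0  (midscale)
```

Because the whole span is only 0.8 V, small voltage errors (ground loops, mains pickup) translate into visible pressure jitter, which is consistent with the fluctuations described above.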

    A 6220 cannot read a current; it can only read a voltage, so you'll probably have (or should have) a resistor across the voltage input (normally 50 Ohm for a 0-20 mA signal).
    My first step would be to measure this voltage with a multimeter so you know what the actual voltage should be.
    Then I would read that same voltage with MAX (Measurement and Automation Explorer) to make sure you get the right value.
    Now, about the changing voltage/current/pressure: how have you terminated the other signals? Have you provided good earthing?
    If you sample at a high frequency (1 kHz) and perform an FFT on the acquired data, I can imagine a dominant 50 or 60 Hz component (depending on where you live) in the signal that might be causing your problem.
    Ton
    Free Code Capture Tool! Version 2.1.3 with comments, web-upload, back-save and snippets!
    Nederlandse LabVIEW user groep www.lvug.nl
    My LabVIEW Ideas
    LabVIEW, programming like it should be!
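The FFT check suggested above can be sketched in Python, with synthetic data standing in for the DAQ read (the 50 Hz pickup amplitude is an illustrative assumption):

```python
import numpy as np

# Sample at 1 kHz and look for a dominant mains component (50 or 60 Hz)
# riding on an otherwise steady pressure signal.
fs = 1000.0                      # Rate (Hz) = 1000, as in the changed settings
t = np.arange(0, 1.0, 1 / fs)   # one second of samples
signal = 5.0 + 0.2 * np.sin(2 * np.pi * 50 * t)  # 5 bar reading + 50 Hz pickup

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
spectrum[0] = 0                  # ignore the DC term (the reading itself)
dominant = freqs[np.argmax(spectrum)]
print(f"dominant interference at {dominant:.0f} Hz")  # 50 Hz here
```

If real acquired data shows a spike like this at the local mains frequency, the fix is grounding/shielding rather than changing Samples To Read or Rate.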

  • CAS Latency 2 Reverts To 2.5

    My RAM is designed for 2.5-3-3-7.  I've already changed it to 2.5-3-3-7, but was recommended by someone else with similar memory that I try changing the 2.5 to 2, for more performance.  When I set it to 2 it boots and runs fine, but CPU-Z and Memtest86 report it as 2.5 still.  The BIOS definitely reports it as 2.
    Why?
    Is it true that the CAS Latency makes little difference on my system?  Why is this?
    Specs:
    Intel Pentium 4 2.8e HTT @ 3.26GHz
    Thermaltake Spark 7+ Xaser Edition A1715 CPU Cooler
    MSI 865PE Neo2-PFISR motherboard (BIOS ver. 3.8)
    2x512MB OCZ PC3700 Gold Edition Revision 3 Dual Channel Enhanced Latency DDR @ 466MHz
    Built-by-ATI Radeon 9800 Pro @ XT 128MB DDR 256-bit @ 415/744 (Catalyst 4.12)
    Arctic Cooling VGA Silencer Revision 3
    Creative SoundBlaster Audigy2 ZS Platinum
    Cyber Acoustics CA-4100 4.1 Surround
    Maxtor 6Y120M0 120GB HDD 7200RPM SATA150 8MB cache
    Sony DDU1612 40x/16x DVD-ROM
    Sony CRX230ED 52x/32x CD-RW
    Enermax Noisetaker EG475P 470W PSU(+3.3V = 34A, +5V = 40A, +12V = 33A)
    Ultra Dragon ATX Mid-Tower Case
    Windows XP Pro SP2

    Sounds like CPU-Z is reading the IC Chip settings and not BIOS settings.

  • Adjusting RAM cas latency / timings?

    Hello,
    This is mostly a question of appeasing my insatiable tinkering instinct as I know that any tweaking will have little, if any, visible results in the real world.
    Having said that...is there any way to tweak the cas latency/timing settings of RAM on my Mac? I know over on the Windows side of the world, this can be accomplished either in the BIOS or with third-party utilities so I was thinking maybe something in open firmware or the terminal or maybe a utility is out there somewhere and I just haven't looked in the right places.
    Thanks,
    Christian
    G5 dual 1.8 GHz Mac OS X (10.4.8) 3.5 GB RAM, 2x DELL 2405, Radeon 9800 256MB, 2x250GB 7200.10

    Hi Christian,
    no, you cannot do that. The CAS latency is a hardware specification of the RAM chips you use. Here are 2 articles about that topic:
    CAS latency
    CAS Latency, What Is It?
    If this answered your question please consider granting some stars: Why reward points?

  • Can not receive Mac mail -error Outlook cannot find the server. Verify the server information is entered correctly in the Account Settings, and that your DNS settings in the Network pane of System Preferences are correct.  Account name: "MacMail"

    What are the correct mail account settings and more importantly the correct DNS settings
    Thank you for any help you may be able to provide
    Cheers
    Chris (iMac i7)

    Do not delete the old account yet. Sign up for an iCloud account if you haven't.
    I understand .mac mail will still come through. Again, do not delete the old account yet.
    You cannot use .mac or MobileMe as the account type; you have to choose IMAP when setting up. Otherwise Mail is hard-coded to change imap.mail.me.com to mail.me.com and smtp.mail.me.com to smtp.me.com, no matter what you try to enter.
    For iCloud Mail setup, do not choose .mac or MobileMe as the type, but choose IMAP.
    On the second step, where it asks for "Description", it has to be a unique name, but you can still use your email address.
    IMAP (Incoming Mail Server) information:
              •          Server name: imap.mail.me.com
              •          SSL Required: Yes
              •          Port: 993
              •          Username: [email protected] (use your @me.com address from your iCloud account)
              •          Password: Your iCloud password
    SMTP (outgoing mail server) information:
              •          Server name: smtp.mail.me.com
              •          SSL Required: Yes
              •          Port: 587
              •          SMTP Authentication Required: Yes
              •          Username: [email protected] (use your @me.com address from your iCloud account)
              •          Password: Your iCloud password
    Also, you must upgrade your password to meet the new criteria:  8 characters, including upper and lower case letters and numbers.  If you have an older password that does not meet these criteria, then when you try to set up mail on your Mac using all of the IMAP settings listed above, it will still give a server error message.  Go to   http://appleid.apple.com         then follow the directions to change your password, and then go back to setting up your mail using the IMAP instructions above.
    Thanks to dpepper...
    https://discussions.apple.com/thread/3867171?tstart=0
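The password criteria quoted above can be encoded as a quick check (a sketch of the stated rules only; Apple's actual server-side validation may differ):

```python
import re

# The stated criteria: at least 8 characters, with at least one
# upper-case letter, one lower-case letter, and one digit.
def meets_icloud_criteria(pw: str) -> bool:
    return (len(pw) >= 8
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"[0-9]", pw) is not None)

print(meets_icloud_criteria("oldpassword"))  # False: no upper case or digit
print(meets_icloud_criteria("Summer2012"))   # True: satisfies all three rules
```

An older password that fails this check is exactly the case where the IMAP setup keeps returning a server error despite correct settings.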

  • Buffered event counting. Why can't I explicitly sequence generating the Sample Clock Pulse and reading the counters?

    At irregular occasions I need to grab counts from several counters, and buffering the counts must be done simultaneously for all counters. I'm modeling my approach after zone.ni.com/devzone/cda/tut/p/id/5404 which someone kindly pointed out in an earlier thread. However, that example only uses one counter, and you can't test the synchronization with only one counter, so I am using two counters configured the same way, and they're wired to a single benchtop signal generator (for example at 300 kHz).
    What I want to do, I can test in a loop with a somewhat random wait in it. I want to drive a hardware digital output line high for a few ms and then low again. The hardware line is physically connected to terminals for my timing vi's Sample Clock Source and so will cause them to buffer their counts for later reading. After I pulse this line, when I know new good buffered counts await me, I want to read both my counters. If their bufferings are simultaneous, then each counter will have counted the same number of additional counts since the last loop iteration, which I can check by subtracting the last value sitting in a shift register and then subtracting the two "additional counts" values and displaying this difference as "Diff". It should always be 0, or occasionally +1 followed immediately by -1, or else the reverse, because buffering and a count could happen practically at the same moment.
    When I do this using a flat sequence to control the relative timing of these steps, so the read happens after the pulse, the counters often time out and everything dies. The lengths of time before, during, and after the pulse, and the timeout value for the read vi, and the size of the buffer and various other things, don't seem to change this, even if I make things so long I could do the counting myself holding a clipboard as my buffer. I've attached AfterPulse.vi to illustrate this. If I get 3 or 10 or so iterations before it dies, I observe Diff = 0; at least that much is good.
    When I use two flat sequences running in parallel inside my test loop, one to control the pulse timing, and the other to read the counters and do things with their results, it seems to work. In fact, Diff is always 0 or very occasionally the +/- 1 sequence. But in this case there is nothing controlling the relative timing such that the counters only get read after the pulse fires, though the results seem to show that this is true. I think the reads should be indeterminate with respect to the pulses, which would be unreliable. I don't know why it's working and can't expect it to work in other environments, can I? Moreover, if I set some of the pulse timing numbers to 1 or 2 or 5 ms, timeouts start happening again, too. So I think I have a workaround that I don't understand, shouldn't work, and shouldn't be trusted. See SeparateSequence.vi for this one.
    I also tried other versions of the well-defined, single sequence vi, moving the counter reads to different sequence frames so that they occur with the Sample Clock Source's rising edge, or while it is high, or with the falling edge, and they also often time out. I'll post these if anyone likes but can't post now due to the attachment limit.
    Here's an odd, unexpected observation: I have to sequence the reads of the counters to occur before I use the results I read, or else many of the cycles combine a new count from one counter with the one-back count from the other counter, and Diff takes on values like the number of counts in a loop. I thought the dataflow principle would dictate that current values get used, but apparently not. Sequencing the calculations to happen after the reads fixes this. Any idea why?
    So, why am I not succeeding in taking proper control of the sequence of these events?
    Thanks!!!
    Attachments:
    AfterPulse.vi ‏51 KB
    InSeparateSequence.vi ‏49 KB
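The Diff bookkeeping described above can be sketched in plain Python (a model of the logic only, not LabVIEW code; the count values are made up for illustration):

```python
# Two counters counting the same 300 kHz source are latched by the same
# sample clock, so the counts added between latches should match:
# Diff = 0, with an occasional +1 followed by -1 (or the reverse) when a
# count lands right on the latch edge.
def diff_check(counts_a, counts_b):
    """Return per-iteration Diff values from two buffered count streams."""
    diffs = []
    prev_a = prev_b = 0
    for a, b in zip(counts_a, counts_b):
        added_a = a - prev_a  # counts since the last latch (the shift register)
        added_b = b - prev_b
        diffs.append(added_a - added_b)
        prev_a, prev_b = a, b
    return diffs

# Simultaneous latching: identical increments, so Diff stays 0.
print(diff_check([300, 650, 1000], [300, 650, 1000]))  # [0, 0, 0]
# One count straddles a latch edge: +1 immediately followed by -1.
print(diff_check([300, 650, 1000], [299, 650, 1000]))  # [1, -1, 0]
```

Any sustained nonzero Diff would mean the two counters are not being latched by the same edge, which is the synchronization failure the test is designed to expose.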

    Kevin, thanks for all the work.
    >Have you run with the little execution highlighting lightbulb on? -Yes. In versions of this where there is no enforced timing between the counter and the digital line, and there's a delay inserted before the digital line, it works. There are nearly simultaneous starts on two tracks. Execution proceeds directly along the task wire to the counter. Meanwhile, the execution along the task wire to the digital high gets delayed. Then, when the digital high fires, the counter completes its task, and execution proceeds downstream from the counter. Note, I do have to set the timeout on the counter longer, because the vi runs so slowly when it's painting its progress along the wires. If there is any timing relationship enforced between the counter and the digital transition, it doesn't work. It appears to me that to read a counter, you have to ask it for a result, then drive the line high, and then receive the result, and execution inside the counter has to be ongoing during the rising line edge.
    >from what I remember, there isn't much to it.  There really aren't many candidate places for trouble.  A pulse is generated with DIO, then a single sample is read from each counter.  -Yup, you got it. This should be trivial.
    >A timeout means either that the pulse isn't generated or that the counter tasks don't receive it. - Or it could mean that the counter task must be in the middle of executing when the rising edge of the pulse arrives. Certainly the highlighted execution indicates that. Making a broken vi run by cutting the error wires that sequence the counter read relative to the pulse also seems to support that.
    >Have you verified that the digital pulse happens using a scope? -Verified in some versions by running another loop watching a digital input, and lighting an indicator, or recording how many times the line goes high, etc. Also, in your vi, with highlighting, if I delete the error wire from the last digital output to the first counter to allow parallel execution, I see the counter execution start before the rising edge, and complete when the line high vi executes. Also, if I use separate loops to drive the line high and to read the counter, it works (see TwoLoops.vi or see the screenshot of the block diagram attached below so you don't need a LV box). I could go sign out a scope, but think it's obvious the line is pulsing given that all these things work.
    >Wait!  I think that's it!  If I recall correctly, you're generating the digital pulse on port0/line0...  On a 6259, the lines of port 0 are only for correlated DIO and do not map to PFI. -But I'm not using internal connections, I actually physically wired P0L1 (pin 66) to PFI0 (pin 73). It was port0/line1, by the way. And when running some of these vi's, I also physically jumper this connection to port0/line2 as an analog input to watch it. And, again, the pulse does cause the counter to operate, so it clearly connects - it just doesn't operate the way I think it is described operating.
    For what it's worth, there's another mystery. Some of the docs seem to say that the pulse has to be applied to the counter gate terminal, rather than to the line associated with the sample clock source on the timing vi. I have tried combinations of counter gate and or sample clock source and concluded it seems like the sample clock source is the terminal that matters, and it's what I'm using lately, but for example the document I cited, "Buffered Event Counting", from last September, says "It uses both the source and gate of a counter for its operation. The active edges on the gate of a counter is used to latch the current count register value in a hardware register which is then transferred via Direct Memory Access...". I may go a round of trying those combinations with the latest vi's we've discussed.
    Attachments:
    NestedSequences.png ‏26 KB
