ADC DDR source synchronous timing constraints help

Hi all,
  I am using a Virtex-5 XC5VFX200T-1FF1738 with ISE 12.4 to capture data from an ADC (KAD5512P-50, http://www.intersil.com/content/dam/Intersil/documents/kad5/kad5512p-50.pdf).
I calculate my timing constraints as follows:
TIMEGRP "ADC_1_DP_TIM_GRP" OFFSET = IN 1.713ns VALID 1.487ns BEFORE ADC_1_CLK_P_IN RISING;
TIMEGRP "ADC_1_DP_TIM_GRP" OFFSET = IN 1.823ns VALID 1.629ns BEFORE ADC_1_CLK_P_IN FALLING;
ADC clock period (DDR) = 2ns
ADC clk->q rising max = 0.120ns
ADC clk->q rising min = -0.260ns
ADC clk->q falling max = 0.230ns
ADC clk->q falling min = -0.160ns
The board latency of the ADC clock to the FPGA is 0.424ns.
Each ADC data pin is trace matched between N and P, but sadly not between each bit.
Bit    Delay      Clk->Data delay (data delay - 0.424ns)
0      0.481ns     0.057ns
1      0.480ns     0.056ns
2      0.481ns     0.057ns
3      0.479ns     0.055ns
4      0.481ns     0.057ns
5      0.390ns    -0.034ns
6      0.480ns     0.056ns
7      0.479ns     0.055ns
8      0.430ns     0.006ns
9      0.473ns     0.049ns
10     0.481ns     0.057ns
11     0.480ns     0.056ns
Delta of the minimum and maximum clk->data board latency = ( 0.057ns - -0.034ns ) = 0.023ns    (Right?)
From this I calculate:
Rising edge offset by:
  ADC clock period - clk->q falling max - max board clk->data delay
    = 2ns - 0.230ns - 0.057ns = 1.713ns
Rising edge valid time:
  ADC clock period - clk->q falling max - clk->q rising min - ( Delta of the minimum and maximum clk->data board latency )
    = 2ns - 0.230ns - 0.260ns - 0.023ns = 1.487ns
Falling edge offset by:
  ADC clock period - clk->q rising max - max board clk->data delay
    = 2ns - 0.120ns - 0.057ns = 1.823ns
Falling edge valid time:
  ADC clock period - clk->q rising max - clk->q falling min - ( Delta of the minimum and maximum clk->data board latency )
    = 2ns - 0.120ns - 0.160ns - 0.023ns = 1.629ns
The FPGA needs a window of 1.77ns (1.55ns setup time + 0.16ns hold time according to data sheet DS202 v5.4, Table 93, Tpsdcm0/Tphdcm0), which I obviously do not have given these constraints.  Am I calculating this wrong?
I want to use the phase shift of the DCM to move the clock to the "data eye."  Are my constraints calculated properly?
 

I haven't analyzed your numbers carefully, but at first glance this looks more or less correct. There appears to be a simple math error in calculating your data skew: (0.057ns - -0.034ns) = 0.023ns; it looks like you added the negative number instead of subtracting it (which makes things worse).
Thus your overall window is small - probably around 1.419ns (correcting for your error above). This is too small to capture with global clock capture using the DCM (yes, you are interpreting Tpsdcm_0/Tphdcm_0 correctly).
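Redoing the budget with the skew subtracted rather than added, the arithmetic can be checked with a short sketch (all numbers are the ones quoted above):

```python
# Sketch: redo the OFFSET IN window budget from the numbers above (all ns).
T = 2.0                                   # ADC DDR clock period
TCO_R_MAX, TCO_R_MIN = 0.120, -0.260      # ADC clk->q, rising edge
TCO_F_MAX, TCO_F_MIN = 0.230, -0.160      # ADC clk->q, falling edge

# Board clk->data numbers per bit (from the table in the question)
board = [0.057, 0.056, 0.057, 0.055, 0.057, -0.034,
         0.056, 0.055, 0.006, 0.049, 0.057, 0.056]
skew = max(board) - min(board)            # 0.057 - (-0.034) = 0.091

rise_offset = T - TCO_F_MAX - max(board)              # 1.713
rise_valid  = T - TCO_F_MAX - abs(TCO_R_MIN) - skew   # 1.419
fall_offset = T - TCO_R_MAX - max(board)              # 1.823
fall_valid  = T - TCO_R_MAX - abs(TCO_F_MIN) - skew   # 1.629

print(round(rise_valid, 3), round(fall_valid, 3))     # 1.419 1.629
```

(Interestingly, the falling-edge figure of 1.629ns in the original post is only reproduced with the 0.091ns skew, so the 0.023ns value seems to have crept into the rising-edge calculation only.)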
The only option for static capture is to use ChipSync; bring the clock in on a CCIO and use a BUFIO to capture the data. Done this way your timing is Tpscs/Tphcs, which requires a data window of only 1.370ns (leaving a whopping 49ps of margin). Of course, this is best case - if you have duty cycle errors or other board/signal integrity issues, it will come out of this margin.
There are a few things that make this better. The Tpscs/Tphcs is measured with LVCMOS25 - using LVDS will probably make it better, but we can't really quantify how much.
Also, the board delay is mostly PVT independent - you could compensate for the static error using the IDELAY. In particular, you could add one tap of delay to data bit 5, which will add approximately 78ps to that one bit - bringing it more in line with the rest. Your overall skew would then be 57 - 6 = 51ps, instead of 57 - (-34) = 91ps.
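The effect of that one IDELAY tap can be sketched in a few lines (the 78 ps/tap figure is approximate; the actual tap delay depends on the IDELAYCTRL reference clock):

```python
# Board clk->data skew per bit, in ps (from the table in the question)
board_ps = [57, 56, 57, 55, 57, -34, 56, 55, 6, 49, 57, 56]
print(max(board_ps) - min(board_ps))      # 91 ps before compensation

board_ps[5] += 78                         # one IDELAY tap (~78 ps) on bit 5
print(max(board_ps) - min(board_ps))      # 51 ps after compensation
```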
To center the data eye, you would put IDELAYs on both the clock and the data. Keep the data delays as small as possible (ideally 0 on all bits except bit 5, which is 1), and then adjust the IDELAY on the clock to make the data window work. You may need to adjust your OFFSET IN constraints to match the fact that you are pushing the clock forward (subtract 2ns from the OFFSET IN value to indicate that the data window comes after the clock edge, not before) - thus your constraints will be of the form
TIMEGRP "ADC_1_DP_TIM_GRP" OFFSET = IN <x> ns VALID <y>ns BEFORE ADC_1_CLK_P_IN RISING;
Where <x> will be a small negative number - on the order of 1.172-2.000.
However, this will all only work if the clock is on a CC pair, not a GC pair (on Virtex-5 these were different), and if all the LVDS pairs are in the same I/O bank as the clock pair. If your board is already designed, then you probably can't change this.
If you can't use ChipSync (and even then, timing is really tight), you will have to use some sort of dynamic calibration.
Avrum
 

Similar Messages

  • HDL Interface Node (UsingFilterCore.vi) and "timing constraint" error when compiling

    I'm trying to use the HDL Interface Node in LV8 FPGA with a PCI-5640R and got the "timing constraint" error when compiling my VI; however, the same VI compiled successfully on a cRIO-9104. It seems the FPGA on the PCI-5640R is not as good as the one on the cRIO-9104, or I'm not using it right. Could you please kindly help me out?
    I tested it with the sample code downloaded from NI website
    ( http://zone.ni.com/devzone/conceptd.nsf/webmain/456722DDDE17986A86256E7B0065EE6F ) which demonstrates using an IP core for a filter. To simplify it, I only keep the HDL Interface Node and the While Loop (see "UsingFilterCore.vi" in attached zip file), and then I created 2 projects including this VI (1 for CRIO-9104, in sub folder "CRIO-9104", the other for PCI-5640R, in sub folder "IFRIO 5640"). When opening the 2 projects separately in LV8.0 and selecting the VI for compile, the one for 9104 passed and the other failed. Here I attach the source code, error message screenshot and the NIReport from MAX, hope you can reduplicate the problem.
    Can you help me out? Thanks very much !
    Message Edited by Jerry_L on 03-26-2006 09:28 PM

    Hi Jerry,
    I just tried to follow all these steps myself (http://zone.ni.com/devzone/cda/tut/p/id/3516). I generated a FIR filter using Xilinx ISE and got a *.VHD file, which I was going to use in the HDL Node.
    In the Parameters tab of the HDL Interface Node configuration dialog, double-click in the Names column to add parameters. Create parameters as shown below.
    Next, switch to the Code tab. Notice that your parameters now appear in the entity section. To complete the next two sections of code, you will need to refer to the filt.vhd file that you generated earlier and interface the filter core to the LabVIEW FPGA execution system.
    1. The first problem I met was integrating the VHDL code from the earlier generated *.VHD file into the Code tab in the properties of the HDL Node. The content of the entity section in my *.VHD is not the same as in your attached file. Please check it in the attached files. I'm sure this is the main reason for the problem.
    Next, switch to the External Files tab. Click the Add File button and select the filt.edn file that you created earlier. This is the EDIF netlist file that you generated earlier.
    2. I have no idea where I can get it, or at what point during filter generation in Xilinx ISE it was generated. How can I get it? I had to use your attached file filt.edn.
    3. After that I have made the same schematics like you have in your VI FPGA and try to run. But I've got two error messages:
    HDL Interface node: enable chain not handled. Details: Refer to the documentation for the correct assignments for the enable_out output from your HDL code.
    HDL Interface node: output not handled. Details: Right-click the node, select Configure to open the Configure HDL Interface Node dialog box, and use the Code tab to handle all output parameters.
    Actually I need to model FIR filter:
    Bandwidth 200-600 Hz
    Sampling 8 KHz
    Attenuation 80 dB
    That's why I tried to follow all these steps myself, to understand how it works.
    Thanks a lot.
    Nikita
    Attachments:
    Filter1.vi ‏16 KB

  • Source Synchronous Input: Capture clock/Launch Clock analysis

    Hi, I have a source synchronous LVDS DDR input into a Kintex-7. The launching clock is edge-aligned to the data, and the capture clock should capture on the opposite edge (a launch on the rising edge should be captured by the falling edge). I have designed it to work at 100 MHz by compensating the clock insertion delay with a PLL (to save the MMCM for other purposes) using a BUFH (the timing is not so tight as to need BUFIO/BUFR). The PLL also centers the opposing edge on the data window by shifting -90°. Now the launching clock waveform is {5.0 0.0}, and the waveform generated by the PLL is {2.5 7.5}; this is reported correctly by Vivado. But when Vivado analyses the setup path (I have set the proper set_input_delays) the following happens:
    -Launching clock rising edge is correctly at 5.0ns at the input data PIN.
    -Capture clock falling edge is INCORRECTLY 7.5ns AT THE CLOCK PIN.
    What I don't get is: why does Vivado, which recognizes that the capture clock is a generated clock from the PLL, use the {2.5 7.5} waveform AT THE CLOCK PIN, and not at the output of the PLL BUFH? I mean, the falling edge of the capture clock should be at 7.5ns at the output of the BUFH, not at the input to the FPGA (I see that the PLL correctly shifts this 7.5ns to be 5ns at the BUFH, but this is not what actually happens).
    Doing the calculations manually the interface meets setup/hold with ease. I just want Vivado to make the proper analysis.

    I am not 100% certain I followed all your logic, but I think the issue is that you aren't following how clocks are treated in SDC/XDC.
    In Vivado/SDC/XDC, there are two separate concepts - clock phase, and clock propagation. The launch/capture edge relationship is determined purely by phase - propagation does not factor into it. Regardless of how they are generated, you have two clocks
      - the primary clock (generated by the create_clock command), which has a waveform of {0.0 5.0} (I am not sure why you say the opposite - {5.0 0} is not a meaningful representation in XDC)
      - the automatically generated clock at the output of the PLL, which has edges at {2.5 7.5}. It's a little irrelevant how you defined it (with a +90 or -90 degree shift, since the interface is DDR) - for the sake of argument, I will say it's +90 degrees.
    First, recognize that when you define a DDR input and use an IDDR to capture it, you are defining four static timing paths - you have two startpoints, one from your set_input_delay and one from your set_input_delay -clock_fall -add_delay. You also have two endpoints, one from the rising edge clock at the IDDR and one at the falling edge of the IDDR. This generates 4 paths
      a) rising input -> falling IDDR
      b) falling input -> rising IDDR
      c) rising input -> rising IDDR
      d) falling input -> falling IDDR
    All four paths exist, and all are timed.
    Now you need to understand the rules that Vivado uses to determine launch and capture edges. For this system, it is easy - the capture edge is always the earliest edge of the capture clock that follows the launch edge. So in this case (assuming the launch clock is {0 5.0} and the capture clock is {2.5 7.5}), the edges will be
      a) rise at 0 -> fall at 7.5
      b) fall at 5 -> rise at 12.5
      c) rise at 0 -> rise at 2.5 (this is the most critical one, so a) is irrelevant)
      d) fall at 5 -> fall at 7.5 (this is the most critical one, so b) is irrelevant)
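    The edge-selection rule above can be sketched mechanically (waveforms as in this example; this is just the edge arithmetic, not a timing engine):

```python
# Capture edge = earliest edge of the capture clock, of the required
# polarity, that comes after the launch edge.
PERIOD = 10.0
CAPTURE = {"rise": 2.5, "fall": 7.5}   # PLL output waveform {2.5 7.5}

def capture_edge(launch_time, polarity):
    t = CAPTURE[polarity]
    while t <= launch_time:            # wrap into the next period if needed
        t += PERIOD
    return t

# The four DDR paths (launch clock waveform {0 5.0}):
print(capture_edge(0.0, "fall"))   # a) 7.5
print(capture_edge(5.0, "rise"))   # b) 12.5
print(capture_edge(0.0, "rise"))   # c) 2.5
print(capture_edge(5.0, "fall"))   # d) 7.5
```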
    Now that it has determined the launch and capture edges for all the paths, it starts the propagation at the primary clocks. For your set_input_delay these are the clock pin. For the capture IDDR, the edge starts at the primary clock, propagates through the PLL (which adjusts it for clock propagation but not for the phase shift), and ultimately to the IDDR clock.
    Now, in a real system this is what is going to happen - I am not sure why you think this is incorrect. If, however, there is some reason to believe that c and d are false paths, then you have to declare them as such (which will then leave a and b as the ones that remain). To do this, you would work with a virtual clock - you would define TWO clocks: the primary clock on the clock pin, and an identical virtual clock for your set_input_delays
    # Create the real and virtual clock
    create_clock -period 10 -name real_clk [get_ports clk_pin]
    create_clock -period 10 -name virt_clk
    # Define the input delay with respect to both edges of the virtual clock
    set_input_delay <delay> -clock virt_clk [get_ports data_pin]
    set_input_delay <delay> -clock virt_clk [get_ports data_pin] -clock_fall -add_delay
    # Disable the same-edge (rise-to-rise and fall-to-fall) paths, i.e. c and d
    set_false_path -rise_from [get_clocks virt_clk] -rise_to [get_clocks real_clk]
    set_false_path -fall_from [get_clocks virt_clk] -fall_to [get_clocks real_clk]
    Even though "real_clk" and "virt_clk" are different clocks, all clocks in Vivado are related by default; since they have the same period and starting phase (which defaults to {0 5.0}), they are effectively the same clock.
    The reason for using the virtual clock is to make sure you don't accidentally declare paths inside the FPGA false too - the false paths above match only paths launched by virt_clk (i.e. the input ports), whereas written against real_clk they would also match ordinary register-to-register paths, including the ones between the IDDR and the fabric logic (if it is in OPPOSITE_EDGE mode).
    I hope this is clear... It can be a bit confusing, but it does make sense.
    Avrum

  • How to set input delay and output delay when source Synchronous

    ClkIN is the board clock, which is connected to the FPGA. Clkif is a clock generated from ClkIN. The device's clock comes from Clkif. So, how do I set the input delay and output delay in this scenario (as I understand it, this is source synchronous)?
    In the examples in many documents, the input delay and output delay settings all refer to the board clock (as I understand it, that is system synchronous). In that scenario, input delay max = Tdelay_max + Tco_max; input delay min = Tdelay_min + Tco_min; output delay max = Tdelay_max + Tsu; output delay min = Tdelay_min - Th.
    So, I want to know how to set input/output delay in the Source Synchronous.
    In system synchronous, I set input/output delay such as:
    create_clock -period 20.000 -name ClkIN -waveform {0.000 10.000} [get_ports ClkIN]
    create_generated_clock -name Clkif -source [get_pins cfg_if/clk_tmp_reg/C] -divide_by 2 [get_pins cfg_if/clk_tmp_reg/Q]
    create_clock -period 40.000 -name VIRTUAL_clkif ;# make the virtual clock
    set_input_delay -clock [get_clocks VIRTUAL_clkif] -min 0.530 [get_ports DIN]
    set_input_delay -clock [get_clocks VIRTUAL_clkif] -max 7.700 [get_ports DIN]
    set_output_delay -clock [get_clocks VIRTUAL_clkif] -min -0.030 [get_ports DOUT]
    set_output_delay -clock [get_clocks VIRTUAL_clkif] -max 1.800 [get_ports DOUT]
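    The formulas quoted above can be sanity-checked with a short sketch; the Tco/Tsu/Th and trace-delay numbers below are hypothetical placeholders chosen to reproduce the constraint values listed, not from any datasheet:

```python
# Hypothetical external-device and board numbers (ns) - placeholders only
t_trace_min, t_trace_max = 0.20, 0.50   # board trace delay
tco_min, tco_max = 0.33, 7.20           # device clock-to-out
tsu, th = 1.30, 0.23                    # device setup / hold requirement

in_max  = t_trace_max + tco_max         # set_input_delay  -max
in_min  = t_trace_min + tco_min         # set_input_delay  -min
out_max = t_trace_max + tsu             # set_output_delay -max
out_min = t_trace_min - th              # set_output_delay -min

print(f"set_input_delay  -max {in_max:.3f}")   # 7.700
print(f"set_input_delay  -min {in_min:.3f}")   # 0.530
print(f"set_output_delay -max {out_max:.3f}")  # 1.800
print(f"set_output_delay -min {out_min:.3f}")  # -0.030
```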
    *******************************************************************************************

    So, first. Architecturally, the clock that you forward to your external device should not come directly from the clock tree, but should be output via an ODDR with its D1 input tied to logic 1 and the D2 tied to logic 0. This guarantees minimal skew between the output data and the forwarded clock.
    ODDR #(
       .DDR_CLK_EDGE("OPPOSITE_EDGE"), // "OPPOSITE_EDGE" or "SAME_EDGE"
       .INIT(1'b0),                    // Initial value of Q: 1'b0 or 1'b1
       .SRTYPE("SYNC")                 // Set/Reset type: "SYNC" or "ASYNC"
    ) ODDR_inst (
       .Q  (Clkif_ff),   // 1-bit DDR output
       .C  (ClkIN_BUFG), // 1-bit clock input
       .CE (1'b1),       // 1-bit clock enable input
       .D1 (1'b1),       // 1-bit data input (positive edge)
       .D2 (1'b0),       // 1-bit data input (negative edge)
       .R  (rst),        // 1-bit reset
       .S  (1'b0)        // 1-bit set
    );
    OBUF OBUF_inst (.I (Clkif_ff), .O (Clkif_out));
    This generates an output clock that is the same frequency as your input clock. This is consistent with your drawing, but inconsistent with your constraints - is the forwarded clock a 50MHz clock or a 25MHz clock?
    I will assume your ClkIN goes to a BUFG and generates ClkIN_BUFG.  Your first constraint generates a 50MHz clock on the ClkIN port which will propagate through the BUFG to (among other places) this ODDR.
    create_clock -period 20.000 -name ClkIN -waveform {0.000 10.000} [get_ports ClkIN]
    Assuming your forwarded clock is supposed to be 50MHz, then your 2nd command is close to correct
    create_generated_clock -name Clkif -source [get_pins cfg_if/ODDR_inst/C] -combinational  [get_pins cfg_if/ODDR_inst/Q]
    With this done, you have successfully described the forwarded clock from your design. This is the clock that goes to your device, and hence should be the clock which is used to specify your input and output constraints.
    set_input_delay -clock [get_clocks Clkif] -min 0.530 [get_ports DIN]
    set_input_delay -clock [get_clocks Clkif] -max 7.700 [get_ports DIN]
    set_output_delay -clock [get_clocks Clkif] -min -0.030 [get_ports DOUT]
    set_output_delay -clock [get_clocks Clkif] -max 1.800 [get_ports DOUT]
    If you want to get fancier, you could try adding a set_clock_latency to the forwarded clock to account for the board propagation of the clock
    set_clock_latency -source TDtrace2 [get_clocks Clkif]
    (But I haven't experimented with clock latency on a generated clock and I don't know for a fact that it works).
    Avrum

  • Fail to Meet Timing Constraint with Xilinx 9.2i Suite

    Hi,
    I've been trying to p&r the OpenSPARC T1 v1.6 with either 1 core/1 thread or 1 core/4 threads using the Xilinx 9.2i EDK with the latest updates, to no avail. It looks like the design just refuses to meet a timing constraint. I have no problem doing the p&r using Xilinx ISE/EDK 10.1 (and getting a working bitfile), but since the original EDK project was designed using v9.2 (from looking at the system.xmp file), I was wondering if anyone was able to successfully create a bitfile for the OpenSPARC T1 using v9.2 of the Xilinx tools. I tried using a fresh tarball (with the supplied sparc.edf file) and following the instructions in OpenSPARCT1_DVGuide.pdf with the 9.2i suite, but again I had the same results (the design failed to meet timing). Any help on the issue is greatly appreciated. For reference, here is the constraint not met.
    *TS_clock_generator_0_clock_generator_0_clkgen_core_inst_clkgen_arch_inst_using_dcm_arch_model_dcm_array_1__using_dcm_module_inst_dcm_module_inst_CLK0_BUF   = 
    PERIOD TIMEGRP "clock_generator_0_clock_generator_0_clkgen_core_inst_clkgen_arch_inst_using_dcm_arch_model_dcm_array_1__using_dcm_module_inst_dcm_module_inst_CLK0_BUF"
    TS_clock_generator_0_clock_generator_0_clkgen_core_inst_clkgen_arch_inst_using_dcm_arch_model_dcm_array_0__using_dcm_module_inst_dcm_module_inst_CLKFX_BUF     HIGH 50%
    From looking at the Xilinx Timing Analyzer, it looked like paths from the MicroBlaze processor to the LMB RAMs were the issue.
    Oleg

    We have gotten those errors from time to time as well. One thing we did to get around this was to set the following option in the system.xmp file:
    EnableParTimingError: 1
    If you are having success with ISE/EDK 10.1, that's great. I would stick with it. We will be updating the OpenSPARC T1 project for EDK 10.1 pretty soon anyway.
    formalGuy

  • I am having trouble with iMovie: I finished my project, but when I try to play it, it says "source clip is missing". Help!

    it says "source clip is missing" Help!

    Hi
    Did you move or alter any folder named
    • iMovie Events, or
    • iMovie Projects
    on the Desktop/in the Finder? If so, you must not do that - try to put everything back.
    Did you trash any Events in iMovie? If so, don't - try to get them back by looking in the Trash on the Desktop, and move them out and back into iMovie Events.
    This problem might come from the idea that there is a copy of the movie in the Project - there is not!
    You are using a non-destructive video editing program (they all behave this way), which means that Projects are only a small text document pointing to what material you want to use, where it is located (e.g. in iMovie Events), and how it is meant to be played.
    If you move or delete any of the raw material, the Project will point to an empty space and cannot be played.
    WILDLY GUESSING - PLEASE no Offence ! (I've been here and done this - so I guess in this way)
    Yours Bengt W

  • I inherited a 2003 iPod from my daughter. The controls will not work when it is plugged into a power source. Can anyone help?

    I inherited a 2003 iPod from my daughter. The controls will not work when it is plugged into a power source. Can anyone help?

    But they work normally when not plugged in? What kind of power source are you plugging it into? It may be outputting more than what the iPod can handle. I had this happen with an iPod I plugged into a power strip that was plugged into the cigarette lighter in a vehicle, and I experienced the same thing.
    B-rock

  • I tried to update to iOS 5, but it said my connection timed out. Help!

    I tried to update, but it said the connection timed out. Help!

    Disable or turn off all firewall and anti-virus software and try the download again.
    Stedman

  • iPod won't restore and update. Network connection timed out. Help!?

    I restored my iPod touch 2G 8GB. I went into iTunes and followed the steps. I clicked restore and update, and it goes through the whole process, gets all the way to the end, says "ipod software update (processing file)", and then a pop-up comes up telling me "there was a problem downloading the software for the ipod. the network connection timed out". I've tried it in different USB ports and I've turned off the firewalls. Still nothing. Does anyone have a solution for me? I've been trying for a few days now and I've looked all over the internet to fix this issue. I'm at the point where I'm just annoyed, so I'm posting here for help. Any help would be awesome! Thanks

    When you get anything that says your network connection timed out, it either means you lost your internet connection or something is blocking the connection between the computer and the servers it is trying to reach.
    If you have a solid internet connection and can browse to websites fine it's either an anti-virus program or a firewall on the computer OR a firewall in the router's settings that you use to connect to the internet.
    Reconfigure any security software to allow Apple's servers to pass.
    Useful information for configuring your security software:
    IMPORTANT: iTunes must be allowed to contact Apple using the following ports and servers:
    port 80
    port 443
    phobos.apple.com, deimos3.apple.com, albert.apple.com, gs.apple.com, itunes.apple.com, ax.itunes.apple.com
    For a more complete list of ports used by Apple products see "Well known" TCP and UDP ports used by Apple software products.
    iTunes also contacts VeriSign servers during an iPhone restore and activation:
    evintl-ocsp.verisign.com
    evsecure-ocsp.verisign.com
    For more information see Update and restore alert messages on iPhone and iPod touch
    Temporarily disable or uninstall your security software
    If the current available version of your security software is incompatible with iTunes, you can disable the security software and test if it is the source of the issue. It might also be necessary to temporarily uninstall security software after making sure you can reinstall it and note any required licenses. A temporary deinstallation might be required because parts of some security software remain active in the background though they appear to be deactivated.
    To reduce any security risks that security software is designed to prevent, consider making sure all files needed to reinstall it are available and disconnect network connections before uninstalling security software. Should this not be possible, such as if a download without active security software is required, that required step should be the only process completed without security software. For assistance in checking your configuration, disabling, or uninstalling your security software, please contact the manufacturer.
    Once disabled or uninstalled, perform only the steps required to verify your issue has been resolved. Complete whatever update, restore, syncing, back up, activation, or other task you were troubleshooting. Once that task is complete, install and configure compatible security software to ensure your PC is protected.
    If it isn't a program or software on the computer contact your Internet Service Provider and/or Router Company to configure or disable the firewall in the router settings.
    To isolate the issue, you can also try disconnecting the router and plugging your computer directly into the modem. Note that this only helps if you are syncing or going through iTunes on the computer; it does not help for troubleshooting time-out issues that occur directly on Apple mobile devices.

  • Cannot connect to Store - Network connection timed out - Help!!!

    Please can someone give me a little help, I've recently updated my Itunes and now cannot connect to the store. I have checked Network connections etc.. and all is fine, my internet works as normal.
    I've uninstalled all Anti-virus/spyware software and double checked the Windows Firewall.
    When I run a diagnostic from the itunes help menu, I get the following:
    Microsoft Windows XP Professional Service Pack 2 (Build 2600)
    Sony Corporation VGN-FE21S
    iTunes 7.4.1.2
    Current user is an administrator.
    Network Adapter Information
    Adapter Name: {1CB9EE56-7F9B-4B15-ACB3-28BE159EB621}
    Description: Intel(R) PRO/Wireless 3945ABG Network Connection - Packet Scheduler Miniport
    IP Address: 192.168.1.2
    Subnet Mask: 255.255.255.0
    Default Gateway: 192.168.1.1
    DHCP Enabled: Yes
    DHCP Server: 192.168.1.1
    Lease Obtained: Sat Sep 08 16:02:19 2007
    Lease Expires: Sun Sep 09 16:02:19 2007
    DNS Servers: 192.168.1.1
    Adapter Name: {FD313ACD-BDEE-489A-AB3E-DD931458D876}
    Description: Intel(R) PRO/100 VE Network Connection - Packet Scheduler Miniport
    IP Address: 0.0.0.0
    Subnet Mask: 0.0.0.0
    Default Gateway:
    DHCP Enabled: Yes
    DHCP Server: 61.8.9.254
    Lease Obtained: Sat Jul 22 23:23:22 2006
    Lease Expires: Sun Jul 23 02:23:22 2006
    DNS Servers:
    Adapter Name: {302D7F6B-B995-4E44-99B1-1A5158EF7090}
    Description: Bluetooth Personal Area Network from TOSHIBA - Packet Scheduler Miniport
    IP Address: 0.0.0.0
    Subnet Mask: 0.0.0.0
    Default Gateway:
    DHCP Enabled: Yes
    DHCP Server:
    Lease Obtained: Sun Jul 23 02:23:22 2006
    Lease Expires: Sun Jul 23 02:23:22 2006
    DNS Servers:
    Network Connection Information
    Active Connection: LAN Connection
    Connected: Yes
    Online: Yes
    Using Modem: No
    Using LAN: Yes
    Using Proxy: No
    SSL 3.0 Support: Enabled
    TLS 1.0 Support: Enabled
    Firewall Information
    Windows Firewall is on.
    iTunes is enabled in Windows Firewall.
    Connection attempt to Apple web site was unsuccessful.
    The network connection timed out.
    Connection attempt to iTunes Store was unsuccessful.
    The network connection timed out.
    Secure connection attempt to iTunes Store was unsuccessful.
    The network connection timed out.
    Any advice would be very much appreciated.
    Thanks very much.

    I have been experiencing this as well for about a month, and just about five minutes ago I resolved it! I have done everything! You need to uninstall iTunes and QuickTime. Don't think you will lose your music, as it is on your PC. Trust me. _Follow the links_ on uninstalling everything, download it again with the following link, and all your music automatically loads again. It was amazing!
    How to uninstall QuickTime on a Windows PC
    http://www.info.apple.com/kbnum/n60342
    Removing iTunes For Windows
    http://www.info.apple.com/kbnum/n93698
    Note: Titles you purchased from the iTunes Store or imported from CDs are saved in your iTunes folder by default and are not deleted by uninstalling iTunes.
    If you have difficulty removing iTunes, you may find this helpful:
    Microsoft's Windows Installer CleanUp Utility:
    http://support.microsoft.com/default.aspx?kbid=290301
    After successfully uninstalling iTunes and QuickTime, install the latest version of iTunes for Windows, which comes with QuickTime:
    http://www.apple.com/itunes/download
    If you continue to experience difficulty with this issue, you will need to call Apple technical support (there may be a fee associated with the call). To find the appropriate phone number, please visit:
    http://www.apple.com/support/contact/phone_contacts.html

  • How to filter data from a source XML? Please help!

    Hi Experts,
       I have a source XML as shown below:
        <Inventory>
         <InventoryItem>
           <ItemCode>InTransit</ItemCode>
           <Quantity>1000</Quantity>
         </InventoryItem>
         <InventoryItem>
           <ItemCode>Available</ItemCode>
           <Quantity>1500</Quantity>
         </InventoryItem>
         <InventoryItem>
           <ItemCode>Restricted</ItemCode>
           <Quantity>2500</Quantity>
         </InventoryItem>
        </Inventory>
    My Target XML is as below
        <Inventory>
          <stock>1500</stock>
        </Inventory>
    The stock element contains Quantity value where ItemCode is 'Available'.
    But note that there are 3 InventoryItem nodes.
    So how to get the desired target XML in XI mapping? Basically I have to filter data from source XML based on value of an element. What is the best approach to handle this?
    Kindly help
    Thanks
    Gopal
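    Outside XI's graphical mapping, the filter itself is just a conditional selection; a sketch in plain Python/ElementTree showing the selection logic (in the XI graphical mapping the usual equivalent is an equalsS comparison feeding ifWithoutElse on the Quantity queue):

```python
import xml.etree.ElementTree as ET

src = """<Inventory>
  <InventoryItem><ItemCode>InTransit</ItemCode><Quantity>1000</Quantity></InventoryItem>
  <InventoryItem><ItemCode>Available</ItemCode><Quantity>1500</Quantity></InventoryItem>
  <InventoryItem><ItemCode>Restricted</ItemCode><Quantity>2500</Quantity></InventoryItem>
</Inventory>"""

root = ET.fromstring(src)
# Keep only Quantity values whose sibling ItemCode is 'Available'
stock = [item.findtext("Quantity")
         for item in root.iter("InventoryItem")
         if item.findtext("ItemCode") == "Available"]
print(stock)   # ['1500']
```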

    Hi Venkat,
    Your solution doesn't work. But why are you using collapsecontext and splitbyvalue before putting the value into the stock element?
    Kindly explain your concept.
    My target message is:
    <?xml version="1.0" encoding="UTF-8"?>
    <Inventory>
       <InventoryItem>
          <ItemCode>InTransit</ItemCode>
          <Quantity>1500</Quantity>
       </InventoryItem>
       <InventoryItem>
          <ItemCode>Available</ItemCode>
          <Quantity>1000</Quantity>
       </InventoryItem>
       <InventoryItem>
          <ItemCode>UnRestricted</ItemCode>
          <Quantity>2000</Quantity>
       </InventoryItem>
       <InventoryItem>
          <ItemCode>Available</ItemCode>
          <Quantity>2500</Quantity>
       </InventoryItem>
    </Inventory>
    Even though stock is unbounded and I have used collapsecontext and splitbyvalue, I am getting the output:
    <InvStock>
      <Stock>1000</Stock>
    </InvStock>
    I should get:
    <InvStock>
      <Stock>1000</Stock>
      <Stock>2500</Stock>
    </InvStock>
    Thanks
    Gopal
    Message was edited by:
            gopalkrishna baliga

  • Object Data Source and Oracle connections - Help please!!

    I have a detailsview with objectdatasource as a Data source.
    Every time I edit and save a row in the DetailsView, up to 10 connections to Oracle are created. Because of this, the maximum number of connections is reached quickly.
    I am closing/disposing the connection object in the update method.
    What could be causing this behaviour?
    Need help immediately. Appreciate your time.
    -Thanks.

    That helps quite a bit. I still can't get the app to retrieve data, but I am getting a more useful message in the log:
    [Error in allocating a connection. Cause: Connection could not be allocated because: ORA-01017: invalid username/password; logon denied]
    As you suggested, I removed the <default-resource-principal> stuff from sun-web.xml and modified it to match your example. Additionally, I changed the <res-ref-name> in web.xml from "jdbc/jdbc-simple" to "jdbc/oracle-dev".
    The Connection Pool "Ping" from the Admin Console is successful with the user and password I have set in the parameters. (it fails if I change them, so I am pretty sure that is set up correctly) Is there another place I should check for user/pass information? Do I need to do anything to the samples/database.properties file?
    By the way, this is the 4th quarter 2004 release of app server. Would it be beneficial to move to the Q1 2005 beta?
    Many thanks for your help so far...

  • Prevent viewing of source files - security-constraint

    I'm using JSF and Facelets and I'd like to restrict visibility of the .xhtml source files.
    Currently, if a user types the source file name index.xhtml instead of index.jsf into the URL, they are presented with the raw source file.
    I've got a blanket security constraint that requires authentication of all users and I've added a second constraint that denies access to .xhtml files.
    This doesn't work, I imagine because the first constraint is allowing access to all pages.
    I'd appreciate some suggestions on how I can stop users viewing the .xhtml files while still requiring authentication for all pages.
    <security-constraint>
    <display-name>Secure Pages</display-name>
    <web-resource-collection>
    <web-resource-name>Secure Pages</web-resource-name>
    <description/>
    <url-pattern>/*</url-pattern>
    <http-method>..snip..</http-method>
    </web-resource-collection>
    <auth-constraint>
    <description/>
    <role-name>User</role-name>
    </auth-constraint>
    </security-constraint>
    <security-constraint>
    <display-name>Source Files</display-name>
    <web-resource-collection>
    <web-resource-name>XHTML Source</web-resource-name>
    <description/>
    <url-pattern>*.xhtml</url-pattern>
    <http-method>..snip..</http-method>
    </web-resource-collection>
    </security-constraint>
    -Gianni

    I don't think you can.
    The XHTML pages will always be delivered to the browser. You can prevent caching, use some encoding to encrypt them, and use JavaScript to block right-click, but it would still be possible to view the source. You can only make it difficult.
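    On the server side, though, direct requests for the raw templates can be blocked. The servlet spec treats an empty <auth-constraint/> as "no role is allowed", i.e. deny to everyone, and Facelets reads the .xhtml files internally rather than through an HTTP request. A sketch of the second constraint, to be merged with the existing web.xml, which should keep index.jsf working while refusing index.xhtml:

    ```xml
    <!-- Deny direct client access to raw *.xhtml sources.
         An empty <auth-constraint/> means no role is granted access,
         which the container interprets as deny-to-all. -->
    <security-constraint>
        <display-name>Source Files</display-name>
        <web-resource-collection>
            <web-resource-name>XHTML Source</web-resource-name>
            <url-pattern>*.xhtml</url-pattern>
        </web-resource-collection>
        <auth-constraint/>
    </security-constraint>
    ```

    Note the difference from the original second constraint, which has no <auth-constraint> at all: omitting the element entirely means unrestricted access, while including it empty means no one is authorized.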

  • "The network connection has timed out" -- HELP!

    Ever since I downloaded iTunes 7, I have been getting this message whenever I try to access iTunes:
    "iTunes could not connect to the iTunes Store. The network connection has timed out. Make sure your network settings are correct and your network connection is active, then try again."
    I have tried everything that Apple has suggested in their help and support pages. I have no clue what to do now, since my internet is running perfectly fine.
    Also, whenever I insert a CD into my drive, iTunes won't load the album into its interface or recognize that there is even a CD in the drive. I am thinking this is occurring because iTunes can't access the internet.
    Does anyone have any suggestions? Thanks

    I had exactly the same problem, and finally found a solution this week. If it's the same problem I had, it is to do with your firewalls. I know everyone on here keeps saying go into Control Panel, then Firewalls, and make sure iTunes is selected as an exception, etc. I did all of this and still had the same problem.
    My problem lay in other firewall software that had been installed on my PC when I bought it, not Windows Firewall. I have EZ Armor installed, which is anti-virus and firewall software. I uninstalled the firewall part and everything works fine: I can import music from CDs, Gracenote can find the track names, and I can access the iTunes Store. I am waiting to hear back from Computer Associates, who make EZ Armor, as to how I can reinstall the firewall and make iTunes an exception, as obviously I would prefer to have the firewall protection.
    My suggestion is to check your PC for any anti-virus and firewall software that you might not even know you are running, and uninstall it.
    Hope this helps

  • Design does not meet the timing constraints of the specified clock

    Hi,
    I am using LabVIEW FPGA 2011 and a FlexRIO 7965R (Virtex-5 SX95T). I am trying to use multiple simple dual-port RAMs (generated through the Xilinx CORE Generator tool) in cascade: the data read from one memory block is written to the next memory block, and so on. I want the design to run as fast as possible, so I have used an SCTL rate of 346 MHz in my design. But when I compile the code, a timing violation error occurs, and the maximum clock rate achieved is 291.25 MHz. The timing violation analysis window always shows the Feedback Node and controller as the paths which fail to meet the timing constraints. I have also used DMA and VI-scoped FIFOs in my design for data input and output.
    I have attached the code and images of the Xilinx Block Memory Generator IP configuration settings. Kindly have a look at my code and tell me what I am doing wrong. What are the considerations for pipelining the design? I have tried placing registers after every stage in my code, but this reduces the clock rate to even below 291.25 MHz.
    Thanks
    Attachments:
    Projec_Memory Block.zip ‏309 KB
    Block Memory Generator_GUI.zip ‏236 KB

    Have you taken a look around the forums?
    http://forums.ni.com/t5/LabVIEW/Compile-Error-in-FlexRIO-FPGA-when-using-derived-clock/td-p/1617226
    What is the exact timing violation error you receive? Depending on how your Feedback Nodes are used, they can pull the achievable clock rate below the desired one. This is what I normally use as a reference for making my code operate more quickly:
    http://zone.ni.com/reference/en-XX/help/371599G-01/lvfpgaconcepts/registers/#Register_Timing
    http://zone.ni.com/reference/en-XX/help/371599G-01/lvfpga/fpga_timed_loop/
    http://zone.ni.com/reference/en-XX/help/371599G-01/lvfpgaconcepts/fpga_pipelining/
