NI ADSL Parametric Test Platform

Hi all,
I have downloaded the NI ADSL Parametric Test code from NI's web site. When I try to run the code it asks for "Limit Testing for cluster.vi" under mamon.llb. I have checked my NI directory and found mamon.llb, but not "Limit Testing for cluster.vi" inside. Any help/comments? I have LV 7.0 already installed, on Windows XP.
Regards.

Hi tmm,
For this particular example, Limit Testing for cluster.vi has been renamed to Limit Testing.vi. You can find this VI in mamon.llb. What you must do is ignore the subVI when loading, and then trace back to this VI. You will see it has a ? in place of its icon, with broken lines around it. Go ahead and replace the ? with Limit Testing.vi, and make sure it is wired. You may also have to relink some subVIs. To do this, locate the subVIs in question, right-click on them, and select "Relink to SubVI". This should fix all of your problems!
Jeremy L.
National Instruments

Similar Messages

  • T1 parametric test

    Hi NI engineers, I have a question about your T1 parametric test. For my final degree project I need to develop a system for E1 parametric testing, and my big problem is the two cards used in the T1 parametric test.
    I need to know whether the two cards used in this project (NI-5411, NI-5112) can also be used to design an E1 parametric test, because I live in Ecuador and the standard telephone system is European. I need an answer on this point. Thanks for all.

    The main difference between T1 and E1 is the speed: T1 runs at 1.544 Mb/s vs. E1 at 2.048 Mb/s. Both of these rates are well within the speed specifications of the two cards you are looking at.
    For more information regarding the capabilities of those two cards:
    http://sine.ni.com/apps/we/nioc.vp?lang=US&pc=mn&cid=3630
    http://sine.ni.com/apps/we/nioc.vp?lang=US&pc=mn&cid=1482
    To best answer your question, please just make sure the specs of the board meet your measurement requirements.
    Kevin R
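    As a back-of-the-envelope check of Kevin's point, the sketch below compares the E1 line rate against the cards' maximum sample rates. The card figures are assumptions taken from the public datasheets (NI-5411 arbitrary waveform generator, NI-5112 digitizer), not from this thread:

```python
# Rough feasibility check: are the assumed card sample rates
# comfortably above the E1 line rate? (All rates per second.)
T1_LINE_RATE = 1_544_000        # T1: 1.544 Mb/s
E1_LINE_RATE = 2_048_000        # E1: 2.048 Mb/s

# Assumed maximum sample rates (datasheet values, not from this thread):
NI_5411_MAX = 40_000_000        # arbitrary waveform generator, 40 MS/s
NI_5112_MAX = 100_000_000       # digitizer, 100 MS/s

for card, rate in [("NI-5411", NI_5411_MAX), ("NI-5112", NI_5112_MAX)]:
    # Report the oversampling margin relative to the faster (E1) line rate.
    print(f"{card}: {rate / E1_LINE_RATE:.0f}x the E1 line rate")
```

    Both margins come out well above 10x, which matches Kevin's conclusion that the speed difference between T1 and E1 is not the limiting factor.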

  • Migrate security from production to test platform

    Is there a method by which security that exists on the production platform (groups, users & passwords, and application filters) can be migrated back to the test platform? Thanks

    Hi, there is a security utility available on this website that allows for importing and exporting all security settings. This is the simplest way to do it. Hope this helps. Andy King, www.analitica.biz

  • Parametrized Tests and IRunListener issues

    Hi,
    I'm currently working on fixing FlexUnit 4.1 support for Flexmojos and have run into some problems with the new parametrized reports.
    In Flexmojos we usually counted the number of tests and then decremented that number each time testFinished was called. Unfortunately, with parametrized tests we can't expect the number of times the IRunListener testFinished function is called to be equal to the number of test functions. This resulted in Flexmojos reporting "all is ok" as soon as the number of executed tests equaled the number of counted test functions. All failing tests beyond that point would stay undetected, which is rather undesirable.
    My question now is: with the introduction of parametrized tests, is there a new interface I can use instead of IRunListener that somehow provides callbacks for "testCaseStarted" and "testCaseFinished"? For example, if one test function were executed 2 times I would expect the callback sequence to be as follows:
    testRunStarted
    testCaseStarted
    testStarted
    testFinished
    testStarted
    testFinished
    testCaseFinished
    testRunFinished
    If such an interface doesn't exist ... how could I implement something similar?
    Chris

    OK, in the meantime I solved the problems by restructuring the Flexmojos unit test support in general.
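    The bookkeeping fix Chris describes can be sketched independently of FlexUnit: instead of decrementing one global expected-test counter, key the bookkeeping on the test name, so a parameterized test that runs N times cannot push later failures past the expected count unnoticed. A minimal Python sketch of the idea (the listener class and method names here are invented for illustration; this is not the FlexUnit API):

```python
from collections import defaultdict

class CountingListener:
    """Track per-test outcomes instead of decrementing one global counter,
    so a parameterized test running N times cannot hide failures that
    arrive after the expected total count has been reached."""

    def __init__(self):
        self.finished = defaultdict(int)   # test name -> times finished
        self.failures = []                 # (test name, reason)

    def test_finished(self, name):
        self.finished[name] += 1

    def test_failed(self, name, reason):
        # Failures are recorded unconditionally, regardless of how many
        # tests have already finished -- the bug described above came from
        # stopping the bookkeeping once the expected count was consumed.
        self.failures.append((name, reason))

    def all_ok(self):
        return not self.failures

listener = CountingListener()
# One parameterized test executed twice: first run passes, second fails.
listener.test_finished("testAdd[0]")
listener.test_failed("testAdd[0]", "expected 3, got 4")
listener.test_finished("testAdd[0]")
```

    With this shape, the run is reported as failed even though the "number of executed tests" exceeds the number of test functions.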

  • Crane Aerospace and electronics is looking for Test Engineers with LabVIEW experience - please disregard previous post.

    Here is the correct post:
    Are you detail-oriented, creative, and technically skilled at Engineering design and development?  Come to Crane Aerospace & Electronics and use your excellent Engineering skills to design, improve, and deliver the next generation of products in the aerospace and electronics Industry!
    We have a unique and exciting career opportunity for Engineer II, Test.
    You will be responsible for maximizing new product development and manufacturing performance through the creation and deployment of test strategies, tools, and plans. You will design and implement high-performance hardware and software for test equipment, author test procedures, perform qualification test activities, and ensure high product quality.
    Responsibilities:
    Collaborate with customers and multi-disciplined engineers to establish/clarify test, qualification, verification and validation requirements.
    Write test plans, procedures, requirements and reports in a highly structured environment.
    Analyze, develop and deploy complex and high performance test hardware and software solutions for automated test equipment. 
    Design, develop, debug, validate & verify the fabrication of manual and automated test equipment at the circuit board and system level, and specify and procure COTS test equipment.
    Develop/maintain hardware documentation including block diagrams, schematics, BOMs, wiring diagrams and wiring lists, software documentation, and configuration control of initial release and updates. 
    Perform detailed calculations to establish test equipment specifications and design margins.
    Maintain existing test systems through bug fixes, improvements and modifications.
    Support the estimation of costs and schedules to develop or upgrade test platforms.
    Perform a number of the above responsibilities with limited supervision.
    Minimum Requirements:
    Experience: 2-5 years.  Previous work experience in aerospace, space or medical electronics industry preferred.
    Knowledge: Microprocessor / Microcontroller hardware and firmware design; Analog Circuit and power supply design; Digital Circuit Design including high-speed serial communication design; Firmware programming in C; Schematic Capture, PADS Logic preferred; Circuit Simulation; Fundamentals of magnetic proximity, temperature, and pressure sensing electronics; ESD; Familiarity with testing standards (MIL-810, MIL-704, and DO-160 preferred).  Basic laboratory test equipment; LabVIEW experience, certification preferred; Developing hardware per DO-254 and software per DO-178 preferred; Experience with Adobe FrameMaker, IBM Rational tools, TestStand, Microsoft Project preferred.
    Skills: Good interpersonal and communication skills (verbal and written)- effectively lead and/or participate in multifunctional teams in a dynamic work environment. Ability to manage multiple tasks, flexibility to switch between tasks and prioritize tasks. 
    Education/Certification: Bachelor's degree in electrical engineering, computer science, physics or a related technical discipline.
    Eligibility Requirement: Must be a US Person (under ITAR rules) to be eligible.
    Working Conditions:
    Working conditions are normal for an office/manufacturing environment. Machinery operation requires the use of safety equipment to include but not limited to safety glasses, heel straps, and shop coats.
    Requires lifting 25 lbs
    Apply online today: http://ch.tbe.taleo.net/CH06/ats/careers/requisition.jsp?org=CRANEAE&cws=5&rid=3170
    Crane Aerospace & Electronics offers competitive salaries and outstanding opportunities for career growth and development.  Visit our website at CraneAE.com for more information on our company, benefits and great opportunities.
    In our efforts to maintain a safe and drug-free workplace, Crane Aerospace & Electronics requires that candidates complete a satisfactory background check and pass a drug screen prior to employment.  FAA sensitive positions require employees to participate in a random drug test pool.

    How can you say you are hiring test engineers with LabVIEW, yet the job description doesn't even mention LabVIEW.  All I see in there is CAD design.

  • Load test SharePoint Online

    I'm putting in a critical solution for a company using SharePoint Online. It's pretty much out of the box, but the customer insists that a load-testing exercise takes place to ensure that when they go live with it, it will cope with their number of users without falling over or slowing down too much. Test cases are fairly simple: browse to the site, search, open a document, maybe occasionally edit a document or upload a new one. Usage will be fairly intense from their point of view, but nothing that I can see SPO having any trouble with whatsoever. However, I have to prove it.
    Has anyone found a load-testing application that will work with SharePoint Online? The test platform must run on local machines to simulate requests coming from the company, so cloud-based testing is out. I have been trying with VS2013 Ultimate, and have managed to get it to authenticate just one time, but mostly it refuses.
    An automated test tool would also be useful for future regression testing.

    The problem with "load testing" SPO is that you'll be throttled fairly quickly under a high load (similar to being throttled when attempting to migrate GBs of content to SPO). Since you cannot have any impact on the environment anyway, you would have to contact Microsoft if you encountered a performance issue, but load testing isn't going to be an accurate gauge of that.
    Trevor Seward
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.
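    For reference, the shape of a local load driver is simple, whatever tool ends up doing the real work. A hedged Python sketch is below, where `fetch` stands in for whatever authenticated SharePoint request your tooling can make; authentication (the hard part, per the question) is deliberately not shown:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_load(fetch, users=20, requests_per_user=5):
    """Fire `fetch` from `users` concurrent workers and collect latencies.
    `fetch` is any zero-argument callable performing one test-case action
    (browse to the site, search, open a document, ...)."""
    def one_user():
        latencies = []
        for _ in range(requests_per_user):
            start = time.perf_counter()
            fetch()
            latencies.append(time.perf_counter() - start)
        return latencies

    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(one_user) for _ in range(users)]
        results = [f.result() for f in futures]
    flat = [lat for per_user in results for lat in per_user]
    return {"requests": len(flat),
            "mean_s": sum(flat) / len(flat),
            "max_s": max(flat)}

# Dummy fetch for demonstration; replace with a real authenticated request.
stats = run_load(lambda: time.sleep(0.001), users=4, requests_per_user=3)
```

    Note Trevor's caveat still applies: against SPO itself, throttling kicks in quickly, so numbers from a driver like this are not an accurate capacity gauge.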

  • Unit test cases run twice on TFS2012 build server

    Hi,
    I have enabled the build definition to run the test cases using Visual Studio Test Runner on a TFS 2012 build server. Once it runs, I can see that each test case ran twice: if I have 5 test cases, 5 will be passed and the same 5 will be failed. Why did it run a second time and fail? Please advise.
    Below is my activity log. If you look at the end of the log, the same test cases ran twice: first all passed, then all failed.
    If Not DisableTests00:00:21
    Inputs
    Condition: True
    Run Tests00:00:21
    If Not TestSpecs Is Nothing00:00:21
    Inputs
    Condition: True
    For Each TestSpec in TestSpecs00:00:21
    Inputs
    Values: Run tests in test sources matching **\*test*.dll, Target platform: 'X86'
    Try Run Tests00:00:21
    Handle Test Run Exception00:00:00
    If testException is NOT TestFailureException00:00:00
    Inputs
    Condition: False
    Set TestStatus to Failed00:00:00
    Inputs
    LabelName:
    SourceGetVersion:
    KeepForever: False
    DropLocation:
    Status: None
    LogLocation:
    BuildNumber:
    Quality:
    TestStatus: Failed
    CompilationStatus: Unknown
    If spec.FailBuildOnFailure00:00:00
    Inputs
    Condition: False
    Get Requests Approved for Check In00:00:00
    Outputs
    Result: IQueuedBuild[] Array
    Mark Requests for Retry00:00:00
    Inputs
    Behavior: DoNotBatch
    Requests: IQueuedBuild[] Array
    Force: False
    If spec Is AgileTestPlatformSpec00:00:00
    Inputs
    Condition: True
    Run Visual Studio Test Runner for Test Sources00:00:00
    Assign spec to agileTestPlatformAssembly00:00:00
    Inputs
    Value: Run tests in test sources matching **\*test*.dll, Target platform: 'X86'
    Outputs
    To: Run tests in test sources matching **\*test*.dll, Target platform: 'X86'
    Find Visual Studio Test Platform Test Assemblies00:00:00
    Inputs
    MatchPattern: d:\Builds\8\Test\OfficeLink\src\**\*test*.dll
    Outputs
    Result: System.Linq.OrderedEnumerable`2[System.String,System.String]
    If Visual Studio Test Platform Test Assemblies Found00:00:00
    Inputs
    Condition: True
    If agileTestPlatformAssembly.HasRunSettingsFile00:00:00
    Inputs
    Condition: False
    Run Visual Studio Test Runner for Test Sources00:00:00
    Inputs
    ResultsDirectory:
    TreatTestAdapterErrorsAsWarnings: False
    UpdateFrequencyTimeout: 30
    RunSettings:
    Platform: Any CPU
    TestSources: System.Linq.OrderedEnumerable`2[System.String,System.String]
    KeepAlive: False
    UpdateFrequency: 5000
    TestCaseFilter:
    PublishResults: True
    RunName:
    Flavor: Debug
    ExecutionTimeout: 0
    ExecutionPlatform: X86
    DisableAutoFakes: False
    Passed TestMethod1
    Passed TestMethod1
    Passed RegisteredTypeShouldBeRegistered
    Passed RegisteredTypeLifeTimeManagerIsTryShouldBeRegistered
    Passed RegisteredTypeWhithoutNameShouldBeRegistered
    Passed RegisteredInstanseShouldBeRegistered
    Passed RegisteredInstanseWithoutNameShouldBeRegistered
    Passed RegisteredInstanseShouldBeResolved
    Passed AllRegisteredInstanseShouldBeAllResolved
    Failed RegisteredTypeShouldBeRegistered
    Failed RegisteredTypeLifeTimeManagerIsTryShouldBeRegistered
    Failed RegisteredTypeWhithoutNameShouldBeRegistered
    Failed RegisteredInstanseShouldBeRegistered
    Failed RegisteredInstanseWithoutNameShouldBeRegistered
    Failed RegisteredInstanseShouldBeResolved
    Failed AllRegisteredInstanseShouldBeAllResolved
    Test Run Completed. 16 tests executed.
    Test Run Failed.

    Hi DHegde,  
    Thanks for your post.
    Are you using TFS 2012 Update 4 and VS 2012 Update 2?
    Please manually build your project and run the test project using VS 2012 on your client, then check the test case results in the Test Results window. Are the test cases executed as expected?
    If everything works fine when you manually run your test project using VS 2012, which build process template is selected in your build definition? If it's not the TFS 2012 default build process template, please try using the default build process template in your current build definition, then queue the build and view the test results in the TFS build log again.
    If you're using the default TFS 2012 build process template in your current build definition but still get the same issue, how did you configure the tests in your build definition? Please share detailed configuration screenshot(s) here.

  • Associating a comm port with a TS test socket

    Hello,
    In order to run several test platforms on one machine (parallel model), I must be able to associate each test socket with a unique serial port. Each hardware platform has its own serial connection, which is built in LabVIEW. So my question is: how can I use a single test sequence calling LabVIEW VIs where each test socket must use a different COM port?
    Thank you for helping,
    Adam

    How dynamic are the COM port configurations? Will some testers differ only in the number of ports? If that's the case, then you'll also need to dynamically change how many sockets you're testing. This could be done, probably in a sequence file load callback, with the LabVIEW function VISA Find Resource. Searching for all ASRL resources will return a count and their names. You could populate a station global string array with the names. Right now, I have a fixed count of resources, so I just populate a main sequence local variable with all of the possible names. I assign the resource name for the VI as Locals.VISAResource, and in a pre-expression for the VI, Locals.VISAResource = Locals.AllVISAResources[RunState.TestSockets.MyIndex]
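    The mapping Adam needs is just "socket index -> serial resource name". Outside TestStand the same lookup looks like this in Python (the ASRL resource names below are made up for illustration; in practice they would come from VISA Find Resource, as suggested above):

```python
# Map each parallel test socket to its own serial resource, mirroring the
# TestStand pre-expression:
#   Locals.VISAResource = Locals.AllVISAResources[RunState.TestSockets.MyIndex]
ALL_VISA_RESOURCES = ["ASRL1::INSTR", "ASRL2::INSTR", "ASRL3::INSTR"]

def resource_for_socket(socket_index, resources=ALL_VISA_RESOURCES):
    """Return the serial resource dedicated to this test socket,
    failing loudly if more sockets are running than ports exist."""
    if not 0 <= socket_index < len(resources):
        raise IndexError(f"no serial port configured for socket {socket_index}")
    return resources[socket_index]
```

    The explicit bounds check is the point: if the number of sockets exceeds the number of discovered ports, it is better to fail at sequence load than to have two sockets silently share one port.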

  • 2.2 GPIB driver for Solaris doesn't work with Sol. 2.3 tester

    I upgraded to the 2.2 GPIB driver to fix an issue of the driver being lost on Sun Ultra 60s at shutdown and reboot. The driver works fine with 10 of the 11 tester platforms we use. This tester is a Genesis III running Solaris 2.3. The program will run successfully, but when the machine is shut down and restarted, the drivers are not there and must be reinstalled. What to do? Thanks.

    Thanks for the suggestion to check privileges, but that's the problem--the driver is not present to have any privileges to check. On reboot, there are only links (all owned by root with 777 perm.) at /dev which lead to non-existent directories.
    #/dev> ls -l gpib*
    lrwxrwxrwx 1 root other 43 Apr 18 09:50 gpib -> ../devices/pci@1f,4000/NATI,pci-gpib@2:gpib
    lrwxrwxrwx 1 root other 44 Apr 18 09:50 gpib0 -> ../devices/pci@1f,4000/NATI,pci-gpib@2:gpib0
    lrwxrwxrwx 1 root other 44 Apr 18 09:50 gpib1 -> ../devices/pci@1f,4000/NATI,pci-gpib@4:gpib1
    If you try to use these links to go to these directories, they're not there. Also, ibconf shows no GPIB devices loaded.
    Any more suggestions would be greatly appreciated.
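    The symptom in the listing above (links in /dev whose targets no longer exist) is a dangling-symlink condition, and checking for it can be automated. A small Python sketch; the /dev/gpib* paths are specific to the post, so here the check runs on whatever paths you hand it (the temp-directory example is purely for demonstration):

```python
import os
import tempfile

def dangling_links(paths):
    """Return the subset of `paths` that are symlinks whose target no
    longer exists -- the situation shown in the /dev/gpib* listing."""
    return [p for p in paths
            if os.path.islink(p) and not os.path.exists(p)]

# Demonstration with a deliberately broken link in a temp directory:
tmp = tempfile.mkdtemp()
broken = os.path.join(tmp, "gpib0")
os.symlink(os.path.join(tmp, "no-such-target"), broken)
```

    `os.path.exists` follows symlinks, so the combination "is a link, but does not exist" identifies exactly the dangling case.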

  • Test Suite Validation Failure

    I'm using ADL Test Suite 1.3.3 to validate a SCORM 2004 Captivate-generated PIF, and after 20 seconds (the default time-out for receiving LMSInitialize) I receive "ERROR: Initialize() never invoked", while the wrapper correctly launches the swf.
    I also tried to intercept a call to the Captivate1_DOFSCommand function and it seems it is never used.
    My testing platform is:
    - Windows XP SP1
    - ADL Test Suite 1.3.3
    - JRE 1.5 update 2
    Any suggestions, anyone?
    Thanks in advance!
    Marco

    Andrew,
    First of all, thanks for your offer to help me, but yesterday morning I was playing with some publishing options and found that I had forgotten, good newbie that I am, to check the Include Breeze Metadata option (which seems to have caused the LMS functions not to be invoked) and the Subject in the Manifest option, which caused an error when validating the metadata.
    See you on this forum for my next problem!
    Marco

  • MSI Z97 GAMING 3 Review--Performance Testing

    After the previous hardware and software introduction, I believe the Z97 GAMING 3 will meet gamers' expectations. The Z97 GAMING 3 integrates Killer E2200 LAN, Audio Boost 2, an M.2 interface and the normal array of connections; it is truly a good gaming motherboard. Can all these features offer great performance and a good experience? Today I will test the performance of the Z97 GAMING 3 and see how good it is.
    MSI Z97 GAMING 3 Testing
    My test platform is the MSI Z97 GAMING 3, an Intel® Core i7-4770K and an MSI GeForce GTX 750 graphics card. The test consists of two parts:
    CPU Performance: Super PI, PCMark Vantage and Cinebench R11.5.
    GAMING Performance: 3DMark 11, Resident Evil 6 Benchmark and FFXIV Benchmark.
    Test Part 1
    CPU : Intel Core i7-4770K @ 3.5 GHz
    CPU Cooler : Thermaltake TT-8085A
    Motherboard : MSI Z97 GAMING 3
    RAM : Corsair DDR 3-1600 4GB X 2
    PSU : Cooler Master 350W
    OS : Windows 7 64 bit
    Basic performance testing (CPU setting by default)
    CPU Mark Score : 679.
    Super PI 32M Result – 8m53.897s.
    Graphics Performance Testing:3DMark 11
    3DMark 11 is designed to measure a PC's gaming performance. It makes extensive use of all the new features in DirectX 11, including Tessellation, Compute Shader and Multi-threading.
    With the Intel® HD4600 iGPU, the 3DMark 11 Basic mode result is a score of X385.
    The Performance mode test score is P1511.
    System Performance:PCMark Vantage
    PCMark Vantage is a PC analysis and benchmarking tool consisting of a mix of application-based and synthetic tests that measure system performance.
    From the test results, the score of Z97 GAMING 3 with Intel ® HD4600 iGPU is 11,946.
    MSI  GeForce GTX 750 Testing
    Test  Part 2
    CPU : Intel Core i7-4770K @ 3.5 GHz
    CPU Cooler : Thermaltake TT-8085A
    Motherboard : MSI Z97 GAMING 3
    Graphics Card:MSI GeForce GTX 750
    RAM : Corsair DDR 3-1600 4GB X 2
    PSU : Cooler Master 350W
    OS : Windows 7 64 bit
    Graphics Performance Testing:3DMark 11
    With the GeForce GTX 750, the Z97 GAMING 3 scores X1653 in 3DMark 11 Basic test mode; the Performance mode test score is P5078.
    System Performance:PC Mark Vantage
    From the test results, Z97 GAMING 3 with GeForce GTX 750 scores 11,518.
    System Performance:Cinebench R11.5 
    Cinebench is benchmarking software developed by MAXON, the maker of Cinema 4D. Cinebench can test CPU and GPU performance with different workloads at the same time. For the CPU part, Cinebench tests CPU performance by rendering an HD 3D scene. For the GPU part, Cinebench tests GPU performance based on OpenGL capability.
    Main Processor Performance (CPU) - The test scenario uses all of your system's processing power to render a photorealistic
    3D scene. Graphics Card Performance (OpenGL) - This procedure uses a complex 3D scene depicting a car chase which
    measures the performance of your graphics card in OpenGL mode.
    In Cinebench R11.5 test, MSI Z97 GAMING 3 with GeForce GTX 750 multi-core test is 6.87pts; OpenGL score is 73.48 fps.
    Z97 GAMING 3 with HD 4600 and GeForce GTX 750 in the GAME Benchmark Test
    For game performance testing, I will use Resident Evil 6 and FFXI Benchmark with the same platform.
    Resident Evil 6 Benchmark
    CPU: Core i7-4770K
    Game resolution setting: 1920X1080
    Other setting: Default
    In the Z97 GAMING 3 with Intel® HD4600 iGPU platform, score:1175 (Rank D)
    In the Z97 GAMING 3 with GeForce GTX 750 platform, score: 5874 (Rank A)
    I used the Fraps tool to record FPS during benchmark testing. The Z97 GAMING 3 with GeForce GTX 750 averages 202 FPS; with the Intel® HD4600 iGPU it averages 32 FPS.
    FFXIV Benchmark
    CPU: Core i7-4770K
    Game resolution setting: 1920X1080
    Other setting: Default
    At 1920x1080 resolution, the Intel® HD4600 iGPU score is only 910.
    However, the GeForce GTX 750 test score is 4167. According to the official classification system, a score between 3000 and 4499 means high performance.
    I used the Fraps tool to record FPS during benchmark testing. The GeForce GTX 750 averages 111 FPS; the Intel® HD4600 iGPU averages 19 FPS.
    Test Summary
    The MSI Z97 GAMING 3 is not very expensive. It has many features specially designed for the gaming experience, and it performs well in benchmarks. Even at 1920x1080 resolution with high-quality display settings, the Z97 GAMING 3 with an Intel Core i7-4770K and MSI GeForce GTX 750 can easily handle any kind of game. The FPS of this system is higher than 60, and users will enjoy lag-free gaming. It is really a good and affordable choice for gamers.

    Thanks for sharing; there are not many reviews of the Z97 GAMING 3 yet.

  • Tom's Hardware released USB 3.1 testing results on MSI X99A GAMING 9 ACK

    Tom’s Hardware released a test result of MSI motherboard and USB 3.1 device. It mentioned that there is an ASMedia ASM1352R controller on X99A GAMING 9 ACK to support two USB 3.1 ports on I/O interface.
    http://www.tomshardware.com/reviews/usb-3.1-performance-benchmark,4037.html
    X99A GAMING 9 ACK, expected to launch in Q1 2015.
    USB ports on MSI X99A GAMING 9 ACK
    Testing Platform
    Results: USB 3.1 VS. USB 3.0
    ASMedia is the only company ready with USB 3.1, and Intel isn't planning to build the technology into its chipsets any time soon; neither the Broadwell nor the Skylake platform supports native USB 3.1. Moreover, MSI is the first company to announce a motherboard with USB 3.1. The MSI X99A GAMING 9 ACK will undoubtedly be a $400+ motherboard like the X99S GAMING 9 ACK before it. Any users who want to enjoy the newest technology can choose the MSI X99A motherboard for the best user experience.

    I don't think USB 3.1 drives are ready yet, but it's nice to see an MSI motherboard get this new feature.
    I think Type-C USB 3.1 will be used on mobile devices for sure!

  • How do you switch between web server snapshot jpg and png modes?

    I read in another forum a posting by "NathanK" stating:
    "The snapshot feature of the web server can generate images either as png or jpg. In LabVIEW 8.6 and later, the default which is generated by the web publishing tool is the png format. In this mode everything happens in memory so there is no snapshot image file generated.
    In the jpg mode there is a temporary file, however it is not always the same file and the file is deleted after it is uploaded to the client. I would not recommend trying to use this programmatically.
    If you need more control over a snapshot of a front panel (and you are using 8.6), I would recommend making a custom web service that takes a picture of a VI's front panel and returns it to the client. Then you would have control over the image."
    I am currently using the web publishing tool snapshot feature with an EXE program written in LV 8.6. However, each time a browser accesses the html file, the EXE file's memory grows by 72KB (this happens with my custom EXE program and with LabVIEW.exe when running the VI in development mode). I'm assuming this is the png that is generated each time a snapshot is taken, but the program never de-allocates the memory and eventually crashes. I'm also assuming that if I switch to "jpg mode", the web server will generate a temporary file and eventually delete it, eliminating the memory leak. The problem is, I don't know how to change the mode from png (the default) to jpg.

    I am and have been using the .snap function on many applications since LabVIEW 6.1.  It's very simple, easy to use, does exactly what I want it to do, and is still in the documentation of LabVIEW 8.6, 2009 and 2009 SP1 as being supported. 
    If I am not mistaken, don't you need to have the LabVIEW run-time engine on the target system to use embedded mode? I don't want to force everybody who wants to see a screenshot of my program to have a run-time engine installed. For one, I don't know everybody who wants to monitor our system, and they can literally be anywhere in the world if they VPN to our network. Second, those I do know would have me install it for them. Third, why go "backwards" in capability? Everybody would ask me "Why do we have to do this now? We never had to do it before", and I don't want to use my time and energy at work explaining why NI cannot fix a memory leak that has been identified in at least the last 3 versions of LabVIEW and is still not fixed.
    I had been using either LV 6.1 or 7.1 for a good part of 10 years and had very few problems with them (BTW, DAQmx is the greatest thing ever put on a computer, EVER). I finally made the transition and started using LV 8.6 on a major test platform last year, got to the very end of it, and found this out. <sigh!>
    It is funny (maybe not), but my work around feels like I just changed the floormats in my Toyota car to keep it from "crashing".
    However, if I am mistaken and you do NOT need to use LV runtime engine for embedded mode, ignore everything I wrote above and please let me know that this is the case and I will look into it.

  • Video streaming from a phone to a DLNA TV via a Ho...

    The Network:
    I have a Home Hub 2b ( Software version 4.7.5.1.83.2.11.2.6 (Type B) ) running on Infinity 2. The router is set for b/g/n working.
    I have two devices: a netbook PC set for 'n' working, which connects at 54Mbps at best, and a new WP8 device (Lumia 620), which connects at 39Mbps at best to the same router, even when both devices are physically next to the router.
    A DLNA TV is connected via wired 'power hub adapter' cables, the router claims this is connected at 100Mbps. This setup plays BBC iPlayer HD via an inbuilt TV app through this same wired connection fine.
    A speed test to an ADSL speed-test site usually gives mid-to-high-twenties download speeds for the netbook, around 26Mbps, and mid-to-low-twenties for the phone, say 21Mbps.
    The problem:
    The phone can make 720p or WVGA video which play OK on the phone. But when I stream them via my network to the DLNA TV they stutter really badly. The 720p is worst and the WVGA is not quite as bad, but neither are smooth and you wouldn't want to watch either for long.
    The Questions:
    1 Why should a new 'n' phone only connect to the BT router at 39Mbps (as reported by the router), yet an older 'n' netbook connect at 54Mbps?
    2 For a BT HH2b, what are the expected connection speeds from an 'n' device. i.e. is 54Mb and 39Mb acceptable and expected or is it slow?
    3. Should I be able to stream 720p video wirelessly to the hub and then wired to the TV. (The TV plays BBC iPlayer HD stuff fine through the same wired connection).
    Thanks in advance.
    PS extra info here - http://discussions.nokia.com/t5/Nokia-Lumia/Lumia-620-quot-Play-To-quot-app/td-p/1772318 and here - http://forums.wpcentral.com/nokia-lumia-620/217491-lumia-620-play-app.html I've been asking on other forums as well (Charlie55).
    EDIT - I later found this on the Nokia sites (as yet unconfirmed) FAQ:
    When connecting Nokia mobile device (e.g. N8-00) to a WLAN access point using the IEEE 802.11n mode, connectivity issues have been observed with access points from certain vendors. Problems have been observed with at least the following WLAN products:
    FritzBox 7270
    Livebox 2
    BT HomeHub V2.0 Type B
    Netgear DGN1000
    The issue is related to a WLAN chip used in these access points. To solve the problems, it is recommended to upgrade the access point firmware to the latest version.
    If there is no firmware update available or the update doesn't help, the workaround is to disable the 802.11n mode and force the access points to use just the 802.11b/g mode.
    Is anyone aware if this issue is now fixed and out of date?

    I didn't know BT did that for free?
    I've emailed them via your link.
    Mind you, it would be nice if BT could just get their auto-update system working. I bet there are loads of people with HH2s on the wrong (old) software; I know at least one other. If they can't make an auto-update system that works, why not at least provide a quick and easy manual update method, like a click-<update now> button?
    Thanks

  • How many tasks can I have in NI-DAQmx system?

    I am working out of the office this week - please send your response to [email protected] as well as my normal email [email protected].
    I am designing a new test platform that will run on C# (because we like object oriented text based programming and because the IDE is free).  For hardware we'll use your PXI modules (because your hardware is ultra-reliable).  I'm starting with the following PXI modules 6259, 4072, 6509, 2575, and 2569, but we made add other modules later as future requirements demand.  I am doing the system software architecture now.  I've done some dabbling with Ni_daqmx, and I loathe "tasks" and "channels", but am determined to make them work.
    The difficulty I'm having revolves around the overhead with setting up tasks and channels and with the concept of tasks and channels.  Help hasn't helped much.  Tasks seem to work great if you intend to always collect the same large amount of data from many sources and always will do it in exactly the same way. 
    We want something much more akin to random access memory.  We'll have UUTs of different kinds coming on and off at random times.  Sometimes a test will need a simple DC voltage, and other times another test will need 2 seconds worth of data at 1MS/s.   It can cost around 150msec to make that change (stop the current task, creat another one, configure anohter channel, start up a new task, blah, blah, task this, channle that - for the life of me I can't understand what this damn taks concept does for me but make matters more complicated and bog down my system).
    I can just barely glean from the help that I can have only one analog input task at once, but I can have several tasks of several different kinds running at the same time (a digital output task, another task for digital input, another task for analog input, etc.).  It does seem that I can set relays without having to put them into a channel and a task (thank goodness for that - I just wanna turn on a relay; I have no idea what a relay "channel" might be).
    In the PXI system I described above I'll have two devices that are analog input devices, the DMM and the Daq.  Even with two devices, can I only have one analog input task running?  Does that mean I've got to waste 150 msec just to switch from the DMM to the Daq? (Do you understand my frustration with "tasks"? - I should be able to say "read DMM" and turn around and say "read Daq channel 0" without having to jump thru task and channel hoops and waste a bunch of time doing it.)
    I'll also have Daq digital I/O and two 6509 digital I/O modules.  Do I have to cram all of that into some "task" concept too?  The Daq card has many other kinds of wonderful capability (frequency measurement, counters, analog output).  How many tasks can I have for that stuff?  On the DMM card I can measure capacitance among other things.  Will I have a capacitance task?  The idea of a "capacitance task" hurts my head.
    Golly I hate tasks.  Please help.
    Dave

    Dave,
    Thanks for posting to the NI Forums.
    For the good of the community we like to keep conversations that start on the forums on the forums rather than moving them to email.  If you want to move to email support I recommend contacting NI through the email route.
    I will answer your questions here.  If you need more direct contact please feel free to contact us through [email protected].
    For some the concept of tasks may seem daunting, however, for many applications it makes life a lot more simple.  A task at a very fundamental level is simply a collection of channels with a single type (AI, DI, DO, etc.)  and a single timing configuration (sampling rate, continuous vs. finite, etc.).  It is a way to organize configuration data.  
    The concept of a task is an abstraction, like OO programming.  Like OO programming it may take some time to understand but can be a time saver in the end.  Also like OO programming it does add some initial programming overhead.  It is much simpler to type a printf statement in C than to create a bunch of classes just to output text to the screen.  You can accomplish the same thing without OO programming, but in the end OO programming is extremely useful, because it groups useful information and methods together in one place.
    With very simple applications OO programming sometimes does not make sense.  But as a program gets more and more complex OO programming becomes more and more useful.  It takes some learning but it is worth it.
    I believe the concept of a task does the same thing.  It does not change the actual functionality of the device or add excessive overhead.  It is just an abstraction that pulls configuration data about a specific "task" and methods the task can perform into a single logical place.
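    To make that concrete, here is a minimal sketch of a task in the NI-DAQmx .NET API - just a channel collection plus a timing configuration in one object. This is illustrative only; the device name "Dev1" and the channel/task names are placeholders, and it is not verified against hardware:

    ```csharp
    // Sketch only: assumes the NI-DAQmx .NET assembly is referenced.
    using NationalInstruments.DAQmx;
    using Task = NationalInstruments.DAQmx.Task;   // avoid clash with System.Threading.Tasks.Task

    Task aiTask = new Task("fastScan");

    // The "collection of channels" half of the task.
    aiTask.AIChannels.CreateVoltageChannel(
        "Dev1/ai0", "uutVoltage",
        AITerminalConfiguration.Differential,
        -10.0, 10.0, AIVoltageUnits.Volts);

    // The "single timing configuration" half: 1 MS/s, 2 s of finite samples.
    aiTask.Timing.ConfigureSampleClock(
        "", 1000000.0,
        SampleClockActiveEdge.Rising,
        SampleQuantityMode.FiniteSamples,
        2000000);
    ```

    Everything the driver needs to know about that acquisition now lives in one place, which is the organizational benefit being described.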
    Unlike the entirely abstract concept of an Object, a Task is run on a physical device and therefore has physical limitations.  You can create multiple tasks of the same type, but you can only run one timed task of each type on a single board at a time.  In other words you can have multiple AI tasks running at the same time but they need to run on different boards or only one can be timed (have a rate).  You can also have multiple timed AI tasks configured for a single board but only one can actually be running at a time.
    The reason you can only have a single timed task running at a time is because the M-Series boards (and many other boards as well) have a single timing engine for each type of acquisition or generation.  There is a single timing engine for AI, one for AO, and so forth.  You cannot have channel 1 running at 1 MS/s and another running at 50 kS/s.
    However, tasks can exist even when they are not being actively run.  You can create all the tasks you need at the beginning of your program and simply start and stop them as you need.  After the task is stopped you do not need to clear the task until the end of your program.  You can further increase performance by moving the Task into the latest state possible without actually starting the task.  This can be done by calling myTask.Control() with the appropriate TaskAction, such as myTask.Control(TaskAction.Commit).  The Task states are further explained in the NI-DAQmx Help Manual.
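    One possible shape for the test-platform scenario above (a hedged sketch, not verified against hardware; "Dev1" and the channel names are placeholders): pre-create both AI tasks once at startup, push them as far through the state model as the hardware allows, and then only Start/Stop per UUT instead of rebuilding tasks:

    ```csharp
    // Sketch only: assumes the NI-DAQmx .NET assembly is referenced.
    using NationalInstruments.DAQmx;
    using Task = NationalInstruments.DAQmx.Task;

    // Simple on-demand DC read.
    Task dcTask = new Task("dcVoltage");
    dcTask.AIChannels.CreateVoltageChannel("Dev1/ai0", "",
        AITerminalConfiguration.Rse, -10.0, 10.0, AIVoltageUnits.Volts);
    dcTask.Control(TaskAction.Verify);          // validate configuration up front

    // 2 s burst at 1 MS/s.
    Task burstTask = new Task("burst1MS");
    burstTask.AIChannels.CreateVoltageChannel("Dev1/ai1", "",
        AITerminalConfiguration.Differential, -10.0, 10.0, AIVoltageUnits.Volts);
    burstTask.Timing.ConfigureSampleClock("", 1000000.0,
        SampleClockActiveEdge.Rising, SampleQuantityMode.FiniteSamples, 2000000);
    // Commit reserves and programs the hardware; since both tasks share one
    // board's AI timing engine, only one of them can hold it at a time.
    burstTask.Control(TaskAction.Commit);

    // Per UUT: starting a committed task is much cheaper than recreating it.
    burstTask.Start();
    // ... read here, e.g. with an AnalogSingleChannelReader ...
    burstTask.Stop();                           // returns to Committed, not cleared
    ```

    The point of the sketch is the lifecycle, not the exact numbers: create and verify/commit once, start and stop many times, and clear only at program shutdown.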
    The concept of a  task will need to be used with any device that is being programmed using NI-DAQmx.  The 6259 and 6509 will need to be programmed using DAQmx.  With the 2575 and 2569 you have the choice of either using the NI-DAQmx API or the NI-SWITCH API.  The SWITCH API does not use the concept of tasks.  For the 4072 you will need to use the NI-DMM API.  This API also does not use the concept of tasks.
    Hopefully this information is helpful.  Let me know if you have any additional questions or concerns.
    Regards,
    Neil S.
    Applications Engineer
    National Instruments
