Streaming from two sources

Hi:
I need to stream to Flash Media Server from two sources (a microphone and the sound card) on my PC with FME.
The stream from the microphone connects and starts correctly (config stream=streaming1), but the stream from the sound card (config stream=streaming2) connects but doesn't start.
Why?
Regards.


Similar Messages

  • Need to merge similar characteristics coming from two sources

    Hi Experts,
    I have a requirement to upload data into BI from two sources: business sellout history data from ECC, and market sellout history data from a flat file. One source has the fields
    CALMONTH, Market, Sales Organization, Segment, Super Model, Model, Country Standard, Model Year, ZCOL, Power, Material No., Location, APO Version, Sell-Out History, UNIT
    Other source has fields
    Market,Model,Type Of Model.
    I need to upload this data to BI and merge on Market and Model, as these fields are common to both sources, and display Type of Model for the respective Market and Model.
    Can anyone advise me on how to proceed with designing this?
    Thanks in advance ...
    Santhosh.

    Hi
    An alternative to the above solutions: create two DSOs, whose fields should look like below:
    DSO ECC :
    CALMONTH, Market, Sales Organization, Segment, Super Model, Model, Country Standard, Model Year, ZCOL, Power, Material No., Location, APO Version, Sell-Out History, UNIT, Type of Model
    The bold fields exist in the flat file DSO as well.
    DSO Flat file :
    Market,Model,Type Of Model.
    1. First load the data to the flat file DSO.
    2. While loading data to the DSO ECC, you can populate the model type by writing an end routine that looks up the flat file DSO to find the type of model, using the condition DSO ECC-Market = DSO Flatfile-Market and DSO ECC-Model = DSO Flatfile-Model.
    DSO ECC will have the merged data and DSO Flat file is used as a look up table in the end routine of the transformation.
    Also note that you can use a cube in place of DSO ECC if you don't want to stage the data.
    Hope it helps.
    Regards
    Sadeesh
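The end-routine lookup in step 2 can be sketched in Python (field names and sample values are illustrative only; the real implementation would be an ABAP end routine):

```python
# Sketch of the end-routine lookup: enrich each ECC sellout record with
# Type of Model from the flat-file DSO, keyed on (Market, Model).
# Field names and values are illustrative only.

flat_file_dso = [
    {"Market": "EU", "Model": "X1", "TypeOfModel": "Sedan"},
    {"Market": "US", "Model": "X2", "TypeOfModel": "SUV"},
]
ecc_dso = [
    {"Market": "EU", "Model": "X1", "SellOutHistory": 100},
    {"Market": "US", "Model": "X2", "SellOutHistory": 150},
]

# Build the lookup table once, then fill Type of Model on every ECC row.
lookup = {(r["Market"], r["Model"]): r["TypeOfModel"] for r in flat_file_dso}
for rec in ecc_dso:
    rec["TypeOfModel"] = lookup.get((rec["Market"], rec["Model"]))

print(ecc_dso[0]["TypeOfModel"])  # Sedan
```

This mirrors why the flat-file DSO must be loaded first: the lookup table has to exist before the ECC load runs through the end routine.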

  • CSM 12 hour stickiness from two source addresses

    We have an environment where the traffic only originates from two source IP addresses (shared port forwarders). The server group needs a 12-hour stickiness window for Citrix sessions to keep hitting the same server.
    If we have two servers, ServerA and ServerB, this is the problem.
    When maintenance is performed on ServerA, all traffic is sent to ServerB. When ServerA is back in operation, traffic does not return to it because of the sticky timeout of 720 minutes.
    Is there any way to clear the connections from one source on the CSM so that packet processing will spread the load between the two servers?
    Thanks.

    Ben,
    Clearing the connections without clearing the sticky table is useless, because the sticky entry will simply forward the new connections back to the same server.
    Also, there is no way to clear a particular sticky entry.
    Finally, I think the solution for you would be to create a static sticky entry.
    You can force a client ip to go to a specific server.
    If the server is down for maintenance, the CSM will simply select another one but will go back to the initial one if available.
    To configure static entries, use the following commands
    sticky 66 netmask /32
    static client source x.x.x.x real x.x.x.x
    Gilles.
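The behaviour Gilles describes (a client pinned to a preferred real server, with automatic fallback and return) can be sketched in Python; the server names and client IP are made up for illustration:

```python
# Sketch of static-sticky selection with fallback: a client IP is pinned
# to a preferred real server, falls back to another server while the
# preferred one is down, and returns to it once it is available again.

STATIC_STICKY = {"10.0.0.1": "ServerA"}   # client -> preferred real server
SERVERS = ["ServerA", "ServerB"]

def pick_server(client_ip, up):
    """up: set of servers currently in service."""
    preferred = STATIC_STICKY.get(client_ip)
    if preferred in up:
        return preferred
    # Preferred server down (e.g. maintenance): pick any available server.
    for s in SERVERS:
        if s in up:
            return s
    return None

print(pick_server("10.0.0.1", {"ServerA", "ServerB"}))  # ServerA
print(pick_server("10.0.0.1", {"ServerB"}))             # ServerB (fallback)
print(pick_server("10.0.0.1", {"ServerA", "ServerB"}))  # ServerA again
```

Unlike a learned sticky entry, the static mapping never expires, which is exactly what avoids the 720-minute timeout problem described above.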

  • Acquiring streaming data from two sources

    I'm trying to acquire data from two devices at the same time. I have written two sub VIs, where each one takes the data from one piece of equipment. The equipment is such that both are constantly outputting data. I have been successful in running both of the sub VIs separately at the same time. The trouble occurs when I try to put the sub VIs together in a larger VI. When the two sub VIs are part of a larger VI, both cannot run at the same time. One of the sub VIs tries to read from the serial port and is unable to get anything in response. Is there something I am missing as to why they cannot run at the same time?

    Hello,
    It is possible that you are seeing the consequences of LabVIEW compiling code written in parallel. More specifically, if you have code in parallel (not connected by dataflow, but in the same block diagram) LabVIEW will split execution time between those parts of your code. Previously you were likely manually starting two separate programs, which inherently adds a delay between the start of the programs, allowing the first program to get sufficiently far in its execution; we could be seeing the consequence of this. It would help if you could be more specific about the details of your setup and code (such as 1. which instruments are connected to which ports? 2. are you writing a command to your instruments and then receiving data as a response? 3. do you receive any errors? 4. if you do receive errors, which errors do you see and where in your code do you first see them?).
    Repost with some more information (perhaps a screen shot or your code) and we can get a more definitive answer!
    Thank you,
    Regards,
    JLS
    Applications Engineer
    National Instruments
    Best,
    JLS
    Sixclear

  • To Load Master Data From Two Source System

    Hi All,
    I have a small question :
    - Can we load master data from two different source systems, say from a flat file and R3, or any two different or similar source systems?
       If the answer is "Yes", then how? If possible, step by step.
    Appreciate your valuable points.
    Thanks,
    Niraj Sharma

    Hi,
    I still have a problem. The R3 transformation and DTP are getting activated, but when I execute the DTP for the flat file,
    I am getting the below ERROR:
    Object DTP DTP_d55.......could not be found in version A.
    I have checked the master data locally for the source system in the Compounding tab of the key field.
    Please help..
    Thanks,
    Niraj

  • Master Data from two source systems

    Hi Gurus,
    I need to load master data from two different source systems. What is the best way I could do that ?
    I know one approach is to add the system ID 0LOGSYS as a prefix in the Compounding tab and load it. But the problem is that the master data table will then have two different records, and the report will display 2 records that cannot be summarized. I need one record in the report. What is the best approach?
    Thanks
    Liza

    Hi,
    Create two DataSources, one for each source system.
    Create two separate flows to the master data from these two DataSources.
    Hope this helps!

  • How can I see My Photo Stream from two Macs associated with the same Apple ID?

    I have a MacBook Pro (my personal machine) and an iMac (our family computer).
    The iPhoto library housing all of our family photos lives on the iMac.
    The iPhoto library on my MacBook Pro is currently empty.
    I have an account on both machines.  I have associated the same Apple ID with both machines.
    When I launch iPhoto on my MacBook Pro and attempt to turn on photo sharing (so that I can see My Photo Stream, as well as other shared streams), iPhoto informs me that "iCloud Photos for xxx@xxx is being used with another library named 'iPhoto Library'."
    My iPhone, which is also affiliated with the same Apple ID, shows the streams just fine.
    What can I do to remedy this situation?  I'd like to be able to view my shared streams on my MacBook Pro, while leaving the actual massive photo library on the iMac.

    I found out about this issue when I created a new library with some photos I wanted to sync to my iCloud account. I found the following on the Fat Cat Software page and thought it might be helpful to others:
    Using Photo Stream with multiple libraries
    Starting in iPhoto 9.2, Apple introduced a new feature called Photo Stream, which lets you automatically transfer photos via their iCloud service between your iPhoto library and your other devices such as an iPhone, iPad, or another Mac. This feature works when you have multiple iPhoto libraries set up, but there are a couple things to be aware of.
    First, iPhoto will only allow Photo Stream to be active in one iPhoto library at any given time. If you've already enabled Photo Stream in one library, but then open a second library and enable Photo Stream there, this will cause Photo Stream to be turned off in the first library.
    Any photos downloaded by Photo Stream are actually stored in an entirely separate location from any of your iPhoto libraries. This means that, even if you switch Photo Stream from one library to another, you will not need to redownload all the photos you already have in Photo Stream from the iCloud servers. Switching Photo Stream from one library to another is therefore a relatively inexpensive operation, and you can do it as often as needed without it being much of a hassle.

  • NOW can I sync from two sources?

    I keep my iTunes library on my iMac (40GB, most purchased from iTunes), and my iPhoto library on my MacBook Pro. With the original Apple TV, according to the local Apple store, you can't automatically sync from two different sources. Anyone know if the new update will allow this? I'd really love to have an Apple TV if it will always keep current with my iTunes music/movies from my iMac and always keep current with my photos.
    Anyone know? Thank you.

    It will only sync from one computer. Why not sync your photos and stream your music and everything else from your iMac? That's exactly what I do, except I do it all from one Mac.

  • Related to calculation of value from two source files

    Hi, we have two files, based on volume and costs.
    Time, Item, Site, ASM, and Retail are dimensions.
    The volume one is:
    May-09     item 1     Site 1     ASM 1     Retail     VOL     100
    May-09     item 2     Site 1     ASM 1     Retail     VOL     150
    May-09     item 3     Site 1     ASM 1     Retail     VOL     130
    May-09     item 4     Site 1     ASM 1     Retail     VOL     120
    May-09     item 4     Site 1     ASM 2     Retail     VOL     150
    May-09     item 4     Site 2     ASM 3     Retail     VOL     100
    The Cost one is:
    May-09     item 1     Site 1     1.2
    May-09     item 2     Site 1     1.3
    May-09     item 3     Site 1     1.1
    May-09     item 4     Site 2     1.3
    May-09     item 4     Site 1     1.5
    Note that in the second file, ASM and Retail are missing (this was the problem for us with the source file).
    Here in Essbase we need to calculate VALUE = VOL * COST with respect to Item and Site, such that in the selection criteria
    the value must also be represented with respect to the ASM and Retail dimensions.
    Please post the approach and the solution for how to load this and how to calculate it.

    Hi,
    You would have to transform the second file to include the ASM and Retail dimension members. It is essential that you have one to one relationship between Site, ASM and Retail. You can maintain the mapping in a separate file and then read the cost file and get values for ASM and Retail from mapping file.
    Otherwise in the load rule, you will have to specify one single member from each dimension - ASM, Retail in the header definition.
    Once you have the data in the system, you can run a calculation to compute VALUE, or else define a formula on VALUE.
    Let me know if it helps.
    Cheers
    RS
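Assuming the one-to-one Site-to-(ASM, Retail) relationship RS says is essential, the transformation and the VALUE = VOL * COST calculation can be sketched in Python (sample values taken from the rows above; the real work would happen in the load rule and a calc script):

```python
# Sketch: enrich cost rows with ASM/Retail via a Site mapping, then
# compute VALUE = VOL * COST per (item, site). Assumes each Site maps
# to exactly one (ASM, Retail) pair, as the answer above requires.

site_map = {"Site 1": ("ASM 1", "Retail"), "Site 2": ("ASM 3", "Retail")}

volume = {("item 1", "Site 1"): 100, ("item 2", "Site 1"): 150}
cost = {("item 1", "Site 1"): 1.2, ("item 2", "Site 1"): 1.3}

value = {}
for (item, site), vol in volume.items():
    if (item, site) in cost:
        asm, retail = site_map[site]  # fill in the missing dimensions
        value[(item, site, asm, retail)] = vol * cost[(item, site)]

print(round(value[("item 1", "Site 1", "ASM 1", "Retail")], 2))  # 120.0
```

If a Site can belong to more than one ASM (as item 4 / Site 1 suggests in the sample), this mapping breaks down, which is why the mapping file RS describes has to be maintained carefully.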

  • Organisational determination from two sources

    Hi all,
    When creating quotations we need the following information available:
    Organisational Unit
    Sales org
    Distribution channel
    Division
    Sales Office
    Sales Group
    Most of this data is maintained on our customer masters – except for Organisational Unit. Therefore, we have tried to use standard Organisation Determination Rule 14000161 (“Org. Data Re. Master Data”). This works quite well because all data maintained on the sold-to party is then copied to the quotation. The only problem is the Organisational Unit which is for some reason always set to the same value as the Sales Group. 
    Our new requirement is that we want the Organisational Unit to be determined based on the employee responsible on the quotation and his or her assignment in the org model. Basically, we want his/her department as the Organisational Unit. All other org data (sales org, division, etc.) we want to take from the sold-to party, as we are doing today.
    If we use standard determination rule 10000194 (“User Org Unit”) then we get the Organisational Unit correctly determined (and partially also sales org, distr. chnl. and division). However, Sales Office and group cannot be determined with this rule because we only assign these values to customers in our company.
    So it seems I need a new determination rule where some of the org data is determined based on the logic in 14000161 (“Org. Data Re. Master Data”) and then Organisational Unit must be based on 10000194 (“User Org Unit”).  But how do I go about this? Or is there perhaps a better way to do this?
    I assume that if I want to combine the two existing rules I will need to also build a new function module? Also, I do not know how to feed the rule with two partner numbers, one being the sold-to party and the other being the employee responsible.
    Hope you gurus have some input. I am quite lost here.
    Br,
    Anders

    Hi Anders,
    Yes, you have to achieve that by creating a new function for the organization
    determination rule. This assumes your partners, both Sold To and
    Employee Responsible, are entered before you save the document;
    you then set the organization determination to take place when
    saving the document.
    Hope this helps.
    Gun.

  • [Bug] Writing to the same trace in Citadel from two Sources at the same time

    Howdy
    I am experiencing an issue and am looking to get some NI support:
    I have two distributed applications that can write to a Citadel database at the same time.
    And I am investigating the possibility of what would occur if both of them were writing to the same variable at the same time, with possibly the same timestamp data.
    I did this in the attached VI by using the same timestamps with a constant for the data (either #1 or #2).
    This generates an error, as expected, stating you cannot have two references open at the same time.
    The data file, dumped from Citadel, contains 1000pts with all data from only one of the arrays (in this case #2) and everything looks fine:
    However, what appears to be happening randomly, though, is that the same portion of code can also return no error.
    When the data is dumped from Citadel this file contains 2001pts - 1000pts from each array and one separator (or break) point:
    You cannot write an older timestamp to Citadel without it throwing an error, but from further testing it seems you can write the same timestamp repeatedly. So I am guessing that what is happening above is a race condition that somehow allows two references to be opened to Citadel, which should throw an error but does not, and this allows both sets of data to be written by both loops?
    The data, IMO, is now corrupted.
    Is being able to write to the same timestamp a desirable feature?
    Or is this a known issue or a bug?
    What about acquiring two references - bug?
    Is there a workaround to protect the data from multiple distributed-application writes?
    Cheers
    -JG
    Attached VI was coded and tested using LabVIEW 2009 SP1, DSC 9.0.1, MAX 4.6.2f1
    Certified LabVIEW Architect * LabVIEW Champion
    Attachments:
    dual_databaseWrite.vi ‏21 KB

    Ben S wrote:
    The race condition of opening the reference in two places at once should still apply to two different VIs running at the same time.
    Will a CAR be lodged for this?
    Ben S wrote:
    If you open up the code and save it as a .vit, you don't get any missing VIs. Only when you change the extension without the aid of LabVIEW.
    No VI should have missing members simply from renaming its extension outside of LabVIEW?
    I think there is an issue with the VI.
    Can you replicate the following and verify what I am seeing?
    Attached is a Project with Main VI (contains a call to the Open Trace.vi polymorphic only), a Class and a Build Spec
    If you Build the spec with the Main VI outside of the Class it works
    If you place the VI in the Class, save then Build, the Spec fails with the error:
    Visit the Request Support page at ni.com/ask to learn more about resolving this problem. Use the following information as a reference:
    Error 1003 occurred at AB_Application.lvclass:Open_Top_Level_VIs.vi -> AB_Build.lvclass:Build.vi -> AB_Application.lvclass:Build.vi -> AB_EXE.lvclass:Build.vi -> AB_Engine_Build.vi -> AB_Build_Invoke.vi -> AB_Build_Invoke.vi.ProxyCaller
    Possible reason(s):
    LabVIEW: The VI is not executable. Most likely the VI is broken or one of its subVIs cannot be located. Select File>>Open to open the VI and then verify that you are able to run it.
    Next place a disabled structure around the Polymorphic VI (as Class Member or not, it doesn't matter)
    Save the VI
    Close the VI
    Open the VI - the Open Trace.vi polymorphic is "missing"
    Cheers
    -JG
    Code in LV2009
    Certified LabVIEW Architect * LabVIEW Champion
    Attachments:
    DSC.zip ‏341 KB

  • You Tube Favorites from Two Sources

    I am adding YouTube favorites from both my iPhone 3GS and my Apple TV. When I merge the two, many of my saved favorites do not play. I get a couple of different error messages: account suspended, inappropriate content, and video does not currently support iPhone. I understand the last message, but why the first two? By the way, there is nothing X-rated here. Thanks.


  • Dhcp response from two sources

    Hi,
    We have a network where a particular VLAN (e.g. VLAN 10, wireless) is configured for DHCP IP allocation, and this is done by a third-party authentication box (similar to ACS).
    This vlan10 is only applicable to wireless users.
    Now in the same network, we want to create another VLAN (e.g. VLAN 20, in-house). This VLAN 20 will also have DHCP allocation, although VLAN 20 will be served IPs from a server that will take the role of DHCP server.
    The layer 3 interface for this vlan20 will be on a firewall upstream & will be extended to the switches where users will connect themselves to.
    The query is: having two different DHCP servers on the same big network, will it create any conflict over who answers first?
    My understanding is that since the two DHCP servers are on/serve different VLANs, the users' IP requests will be handled by the respective DHCP server based on the VLAN they are attached to.
    Please help in getting this clarified. Thanks in advance.

    As Alain has noted, when you jump VLANs you normally need a DHCP relay agent.  Without that, the DHCP server won't see the request from a non-local VLAN.  But also don't forget, DHCP servers also won't provide an IP for a remote subnet if they haven't been allocated a pool of IPs for that subnet.
    Conversely, you might intentionally have both DHCP servers set up with non-overlapping blocks of IPs for the same subnet. This way, if one fails, you'll still have DHCP redundancy. A client will normally accept the first DHCP offer it receives.
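The per-subnet pool behaviour can be sketched in Python (VLAN names and addresses are made up; a real DHCP server keys on the relay agent's giaddr rather than a VLAN label):

```python
# Sketch: a DHCP server only answers requests for subnets it holds a
# pool for, so two servers serving different VLANs do not conflict.
# With a relay agent, the request carries the client's subnet, and
# each server checks it against its own pools before offering.

SERVER_POOLS = {
    "auth-box": {"vlan10": ["10.10.0.50", "10.10.0.51"]},   # wireless
    "dhcp-srv": {"vlan20": ["10.20.0.50", "10.20.0.51"]},   # in-house
}

def offer(server, client_vlan):
    """Return an IP offer, or None if this server has no pool for the VLAN."""
    pool = SERVER_POOLS[server].get(client_vlan)
    return pool.pop(0) if pool else None

print(offer("auth-box", "vlan20"))  # None - no conflict with dhcp-srv
print(offer("dhcp-srv", "vlan20"))  # 10.20.0.50
```

This matches the understanding stated in the question: as long as each server only holds pools for its own VLAN, the two servers never compete for the same request.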

  • Direct To Disk Capture From Two Sources?

    I was referred here from the Premiere forum regarding a feature of OnLocation.
    I've discovered I'm unable to use the audio recorded from my camera (Canon XL2) because it is extremely noisy. I suppose I could put it in the shop to fix it and hope the noise goes away (which I've been told is really only 50/50, since direct-ins on cameras are prone to this issue), but it's a moot point because such a repair wouldn't be in my budget at this point anyway.
    So, my second option would be to go with an external interface (I'm looking at the M-Audio line) to take the audio directly into the PC and pull only the video in from the camera. However, this poses an issue with having to re-sync both the audio/video together later on, which is an extra step I'd like to avoid.
    My thought was that if I could record both the audio and video in real time (and at the same time) it would be created and saved together, thus avoiding having to sync anything later (unless there is an easy way to do this I'm not aware of).
    So my first question is this:
    I know OnLocation can capture direct to HDD as you film. Is there a way to set the camera as the video source but set the M-Audio interface as the source for the audio (which I'll connect a lav mic to via XLR cable).
    Any thoughts?

    Not that I know of, I'm afraid. You could try capturing the audio through the PC's generic record utility via line-in.
    The trouble with OL CS5 is that it's VERY fussy about data rate. Whilst CS4 was very happy to record onto a 7,200 rpm HDD, CS5 causes frequent bottlenecks. I had to buy an e-SATA RAID drive running RAID-0 to eliminate this.
    My point being, I don't know whether OL would manage to run in unison with another application. In any case, you would have to re-sync.

  • Displaying information from two sources

    I want to display current Citrix-sessions and the following AD-information for the users:
    Name (From AD), streetAddress (From AD), physicalDeliveryOfficeName (From AD), DeviceId (From Citrix), LaunchedViaHostName (From Citrix)
    These lines get me the Citrix session information I need:
    Asnp Citrix.*
    Import-Module ActiveDirectory
    $users = Get-BrokerSession |select -uniq UserUPN, DeviceId, LaunchedViaHostName
    After those commands I have the following information in an array $users:
    Because I now have the UserUPN information available, I should be able to map it to AD accounts to get more information. I try to get the information displayed with these commands:
    foreach ($i in $users) {
    Get-ADUser -LDAPFilter "(userprincipalname=$i.UserUPN)" -properties physicalDeliveryOfficeName, streetAddress | select name, streetAddress, physicalDeliveryOfficeName, $i.DeviceId, $i.LaunchedViaHostName |sort-object physicalDeliveryOfficeName
    }
    But I’ll get an error:
    select : The value of a parameter was null; one of the following types was expected: {System.String, System.Management.
    Automation.ScriptBlock}.
    At line:2 char:113
    + Get-ADUser -LDAPFilter "(userprincipalname=$i.UserUPN)" -properties physicalDeli ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : InvalidArgument: (:) [Select-Object], NotSupportedException
        + FullyQualifiedErrorId : DictionaryKeyUnknownType,Microsoft.PowerShell.Commands.SelectObjectCommand
    If I use this command, it works, but then I cannot display DeviceId and LaunchedViaHostName information:
    $users.userupn | %{ Get-ADUser -LDAPFilter "(userprincipalname=$_)" -properties physicalDeliveryOfficeName, streetAddress} | select name, streetAddress, physicalDeliveryOfficeName |sort-object physicalDeliveryOfficeName

    There are two problems in your code. First, inside the double-quoted filter string, "$i.UserUPN" expands only $i and then appends ".UserUPN" literally; you need a subexpression, $($i.UserUPN). Second, passing $i.DeviceId and $i.LaunchedViaHostName to select hands it a value (or null) rather than a property name, which is what the error is telling you; use calculated properties instead, so you keep the Citrix columns per row:
    Asnp Citrix.*
    Import-Module ActiveDirectory
    $users = Get-BrokerSession | select -uniq UserUPN, DeviceId, LaunchedViaHostName
    $users | ForEach-Object {
        $s = $_
        Get-ADUser -LDAPFilter "(userprincipalname=$($s.UserUPN))" -Properties physicalDeliveryOfficeName, streetAddress |
            select name, streetAddress, physicalDeliveryOfficeName,
                @{Name='DeviceId'; Expression={$s.DeviceId}},
                @{Name='LaunchedViaHostName'; Expression={$s.LaunchedViaHostName}}
    } | Sort-Object physicalDeliveryOfficeName
    Regards Chen V [MCTS SharePoint 2010]

Maybe you are looking for

  • GL account doesn't exist in company code - error in subcontracting PO

    While doing GR for the subcontracting PO, I am getting this error for line item 4: GL account 490237 does not exist in company code 0600. 3 line items were already received to the GL account 490223, which is correct. For the 4th line item, when I am go

  • What do you use to write Java programs?

    Hi. Do you mind if I ask you what do you use to write your programs in Java? I am a student and we are using NetBeans IDE 3.6. I know it is a bit old but the tutor says we must stick to that one. What is best for writing Java programs? Thanks.

  • Download to CSV format

    Hi guys, What is the best way to download an internal table data to CSV format? Thanks!

  • EE have taken my credit when I tried adding texts on 453.

    EE have taken my credit when I tried adding texts on 453. I tried adding "unlimited" texts via my usual method, dialling 453 and going through the steps. But what happened was, I got to the stage where I had confirmed my purchase, and suddenly, the v

  • Windows xp not responding when browsing leopard shares

    I'm having a problem with Windows XP and file sharing on Leopard Server 10.5.4. I'm wondering if anyone else has experienced this issue or can provide some details on what I could try. When I use Windows XP there is no problem in authenticating to and