Using Logos groups

Hi all,
I've stumbled across a problem using different Logos groups to access a FieldPoint system with the DSC engine.
What I want to do: I've got a FieldPoint system doing some controlling and logging. Most of the data should be logged to a database, and for that a LV DSC 7.0 Run-Time System is running on a PC.
Some of the data isn't time critical at all, so it only needs to be logged once a second or even less often, while there is some data I'd like to log two or four times a second. So I saw the possibility of defining groups for the Logos access and tried it...
What I got:
First I added a Logos group and then browsed for a tag, but it wasn't added. Even adding the URL by hand didn't work.
After that I changed the timing of the original Logos group. That worked fine, and like a miracle I was then able to add another Logos group and add tags to it. The handling is a bit weird, though.
Later I wanted to add another tag to this new group. This worked, but the drop-down list of tags belonging to that group was already empty. Well, I just added the new tag and it worked...
Just for fun I looked at an older tag (which had disappeared from the drop-down list); when I opened it, the tag editor marked it as changed, and since the URL for the connection was missing, the tag no longer worked after saving the .SCF.
Is there an update to the LV DSC 7.0 Run-Time System which corrects this behaviour? I haven't tried it with the development system, though, and I don't want to. Currently I don't have a FieldPoint system at hand just to "play a bit".
Thanks,
Carsten

Hi Carsten,
I tried to reproduce this issue with both the development and run-time systems, but I couldn't observe the behavior you mentioned. Could you give more information about the process? Are you publishing data from the FieldPoint controller? Are the tags you selected all published data?
Regards,
Remzi A.

Similar Messages

  • ACS 5.3 Authorization problem with using Identity Groups in Access Policy Rule

    Hello guys, I have found a problem which I can't solve regarding authorization using Identity Groups in an Access Policy rule.
    ACS version: 5.3.0.40.6 (internal build B.839)
    I have a very simple RADIUS authorization rule which authorizes users based on the right Identity Group.
    The requested Identity Group exists.
    The testing user is created in Internal Users and has the requested Identity Group assigned.
    Radius Access Policy: 
    Authentication against an Identity Store Sequence, where the authentication server is an external RSA SecurID device and additional attribute retrieval is configured from Internal Users.
    Authorization is very simple – one rule with only one condition, which is: Identity Group - in - Requested_Testing_Rule. The Default rule is set to Deny.
    When I try to log in with my testing user, authentication against RSA SecurID is OK, but authorization is denied by the Default rule – it looks like my rule with the Identity Group is totally ignored.
    I am managing several other ACS servers (version 5.3 but with older patches) where similar rules work without problems.
    What I have tested:
    Remove testing user and create his account again.
    Rename Identity Group
    Use another Identity Group
    Remove Access Policy rule and create it again
    Use Compound Condition: System:Identity Group
    Use Compound Condition: System:UserID instead of Identity Group in the rule (this works without problems)
    Do you have any idea where the problem could be?

    OK guys, it started working yesterday without any configuration change. Maybe it was some database inconsistency which was resolved by ACS itself.

  • Single step workflow with multiple approvers (without using a group)

    Hi,
    is it possible to have a single-step workflow with multiple approvers without using a group? This is for a contract document.
    I want to add a number of users based on particular logic. The approvers are random and do not belong to any particular group.
    Do let me know if it is possible or if any of you have done that.
    thanks in advance.
    regards,
    rubio

    Hi Rubio,
    I believe the behavior would be: if individual users are added as approvers, the system requires each approver to approve the document. However, if you use a user group, you can set the role so that it is either ALL or ANY.
    Regards,
    Vikram

  • Overhead calculation not happening after using origin group.

    Overhead calculation is not happening after using an origin group.
    There was a runtime error earlier related to the "define credit" IMG node under the costing sheet component; we applied SAP Note 769946 and that error went away.
    We want to apply/add overhead to SFG/FG materials.
    We are using PP orders with a PCC (product cost collector) as the cost object, i.e. costing by period (system ECC 6).
    But our problem is with the material standard cost estimate process.
    We have assigned overhead keys to the percentage rates in the costing sheet for material standard costing and assigned the origin groups to the credits of the costing sheet. But after running the cost estimate, overhead is not taken into account in the standard cost calculation.
    Strangely, in the define credit entry table the key field is the valid-to date; actually the system should allow more than one entry with the same valid-to date and the same secondary cost element (type 41) for different cost centers.
    But if we go without an overhead key and origin group, the entire cost of that supporting cost center is applied to all materials (SFG/FG) and we cannot distinguish between the different product materials (SFG/FG).
    We have checked all the things mentioned below.
    Firstly, that the correct costing sheet is assigned to the valuation variant.
    That the costing sheet is entered for the appropriate material type: finished and semi-finished, or material components.
    All of the above can be checked and verified via transaction OKKN.
    In addition, make sure that the base value maintained is present in the costing; for example, the base may include an origin group - is that origin group part of the materials being costed?
    Similarly, if the base is found and values exist, how is the overhead rate of the costing sheet set up; is it valid, etc.?
    And finally do a similar check for the credit.
    We suspect this is a program error...
    So, I request all experts to give their feedback.

    Dear,
    Check that your origin group and material unit of measure are the same.
    Sometimes in the costing sheet the origin group is maintained in one unit while the material is maintained in another unit of measure.
    You can see the unit of measure for the material under Additional data - Units of measure.
    Check the BOM component material unit as well.
    Check that the same unit of measure is maintained in KZS2.
    I hope the above is useful.
    GOPAN

  • What are the key fields used to group double orders in t-code SDD1?

    What are the key fields used to group double orders in t-code SDD1?
    Cheers,
    VT

    Hi,
    You can group the duplicate sales documents with the help of the following fields:
    Sold-To-Party, Document type, date and sales area details.
    Regards,
    Gopal.

  • How to create groups in AD using AD group process definition

    Hello,
    I want to create a group in AD when I create a group in OIM. I had achieved this using JNDI and was able to create groups in AD
    successfully; now I want to create groups in AD using the AD Group process definition, which contains the Create AD Group task.
    Can anyone suggest what flow should be followed to achieve this?
    Thanks & Regards,
    Rahul Shah

    I think you can do some R&D to achieve this via an event handler:
    1. Apply an event handler on the Group object form.
    In the event handler class:
    2. Get the parameters from the AD IT resource.
    3. Make the JNDI call.
    4. Create the same group name in AD (see the directory-level sketch below).
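    For reference, the directory operation behind step 4 is simply an LDAP "add" of a group object under an OU. The following is only a minimal sketch of that raw operation using .NET's System.DirectoryServices, not the OIM/JNDI flow itself; the LDAP path, OU and group name are hypothetical placeholders.
    using System.DirectoryServices;
    // Illustration only: create an AD group directly. Inside OIM this would be done
    // by the JNDI call in step 3, with values taken from the AD IT resource.
    using (DirectoryEntry ou = new DirectoryEntry("LDAP://OU=Groups,DC=example,DC=com"))
    using (DirectoryEntry group = ou.Children.Add("CN=ExampleGroup", "group"))
    {
        group.Properties["sAMAccountName"].Value = "ExampleGroup";
        // 0x80000002 = global security group (ADS_GROUP_TYPE_GLOBAL_GROUP | SECURITY_ENABLED).
        group.Properties["groupType"].Value = unchecked((int)0x80000002);
        group.CommitChanges();
    }
    The group name and sAMAccountName shown here would come from the OIM group form data in the event-handler approach described above.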
    There may be an easier approach.
    Which version of OIM are you using?
    Thanks,
    Kuldeep

  • What are the advantages of using FlexConnect groups

    What are the advantages of using FlexConnect groups in a WLC?
    Reg,
    Ezra.

    Please refer to this document for more detail about these features:
    http://www.cisco.com/c/en/us/td/docs/solutions/Enterprise/Mobility/emob73dg/emob73/ch7_HREA.html#wp1091114
    FlexConnect is one mode an AP can operate in, typically deployed in a branch setup where you do not have a controller at the branch site. Those APs can register to a controller at your HQ or main site, so traffic terminates at your branch switch instead of being tunneled back to the HQ WLC.
    If you want roaming among your branch FlexConnect APs, you have to put those APs into a FlexConnect group. Only then is the key information shared among those APs to facilitate fast roaming.
    Please do not forget to rate our responses if you find them useful.
    HTH
    Rasika

  • Aggregate functions cannot be used in group expressions

    Hi, I have a report showing sales by vendor. I need to list all the vendors with a monthly total > 5000 and combine the rest as "OTHER VENDORS".
    Vendor is a group in my report, so I tried to put an expression on the group:
    =IIF(Sum(Fields!Mth_1_Sales.Value)>5000,Fields!Vendor_No.Value,"OTHER VENDORS")
    I got the error: "aggregate functions cannot be used in group expressions".
    How do I get vendors with sales < 5000 into "OTHER VENDORS"?

    Hi,
    You need to group by Month in the group expression,
    and you can use the same expression in the report column as
    =IIF(Sum(Fields!Mth_1_Sales.Value)>5000,Fields!Vendor_No.Value,"OTHER VENDORS")
    Many Thanks
    Please mark the post as answered if it helps solve the problem.

  • Use of GROUP BY in SELECT

    I am considering using GROUP BY in several SELECTs where I will access very big tables: BSIK, BSID, BSAD, BSAK, etc.
    My two concerns are performance and high data volume.
    GROUP BY should reduce the data arriving from the DB, but will it hurt performance?
    What are your considerations?
    Any ideas are welcome. I cannot seem to find anything specific for or against GROUP BY.
    thanks,
    Phillip

    Hello Phillip,
    I would advise that you do not use the GROUP BY clause in the SELECT statements for these tables. As you have rightly said, the data volume in these tables might be a major cause for concern.
    Using the GROUP BY clause will heavily load the database server. While your query might run okay in the development environment, in the productive environment it might crash; the reason is that the cursor to the database would have timed out.
    The GROUP BY operation, as you know, can be simulated after you get the data into the application server. The application server processing can take some time, but you can usually optimize it.
    Regards,
    Anand Mandalika.

  • Photoshop CC 2014 hangs for nearly 15 seconds when using the Layer > Group Layers or Layer Palette > Group from Layers commands.

    Photoshop hangs for nearly 15 seconds when using the Layer > Group Layers or Layer Palette > Group from Layers commands, but NOT when creating a new group and dragging and dropping layers into the group manually in the Layers palette. I have a UI design file with nearly 800 layers and a lot of groups. This doesn't occur in smaller files. Using OS X Mavericks, Photoshop CC 2014.2.2, on a Mid-2014 15-inch Retina MacBook Pro.
    I understand that the solution would be to split the file into smaller files and design each UI view in a separate file, but that would be painful. And grouping layers does not seem like a very hard task.

    Does the document also have a lot of layer comps?
    And can you post that document or send me a copy of it so we can see exactly why it is hanging?

  • Reg. can we display an ALV grid using field groups (extracts)

    Hi,
    can we display an ALV grid using field groups (extracts)? Is this possible? I have to develop a blocked ALV.
    Thanks
    Yerukala Setty

    No, you will need the data in an internal table to use ALV.
    Cheers
    Allan

  • Percentage based on Status using per group

    Hi Gurus,
    Can you help me with this? Do you have any approach for getting the percentage based on status per group?
    Currently I have the code below; it returns no result, but no error either. I am trying to get the % Incomplete.
    <?xdofx:(sum(current-group()/NUMINCOMPLETE)/(sum(NUMSUBMITTED) + sum(NUMCOMPLETED) + sum(NUMINPROGRESS) + sum(NUMINCOMPLETE)))*100?>
    Thanks Much,
    JP
    Edited by: BIPnewbie on Feb 6, 2012 3:05 AM

    Use this:
    <?xdoxslt:div(sum(current-group()/NUMINCOMPLETE), sum(NUMSUBMITTED) + sum(NUMCOMPLETED) + sum(NUMINPROGRESS) + sum(NUMINCOMPLETE)) * 100?>
    Thanks,
    Bipuser

  • How to handle "The specified resource does not exist" exception while using entity group transactions to purge WADLogs table

    Hi,
    We have a requirement to purge the Azure WADLogs table on a periodic basis. We are achieving this by using Entity group transactions to delete the
    records older than 15 days. The logic is like this.
    bool recordDoesNotExistExceptionOccured = false;
    CloudTable wadLogsTable = tableClient.GetTableReference(WADLogsTableName);
    // WADLogs partition keys are "0" + ticks, so this matches all rows older than the cut-off.
    string partitionKey = "0" + DateTime.UtcNow.AddDays(noOfDays).Ticks;
    TableQuery<WadLogsEntity> buildQuery = new TableQuery<WadLogsEntity>().Where(
        TableQuery.GenerateFilterCondition("PartitionKey",
            QueryComparisons.LessThanOrEqual, partitionKey));
    while (!recordDoesNotExistExceptionOccured)
    {
        IEnumerable<WadLogsEntity> result = wadLogsTable.ExecuteQuery(buildQuery).Take(1000);
        // Batch entity delete.
        if (result != null && result.Count() > 0)
        {
            Dictionary<string, TableBatchOperation> batches = new Dictionary<string, TableBatchOperation>();
            foreach (var entity in result)
            {
                TableOperation tableOperation = TableOperation.Delete(entity);
                if (!batches.ContainsKey(entity.PartitionKey))
                    batches.Add(entity.PartitionKey, new TableBatchOperation());
                // A batch operation allows a maximum of 100 entities, which must share the same PartitionKey.
                if (batches[entity.PartitionKey].Count < 100)
                    batches[entity.PartitionKey].Add(tableOperation);
            }
            // Execute batches.
            foreach (var batch in batches.Values)
            {
                try
                {
                    await wadLogsTable.ExecuteBatchAsync(batch);
                }
                catch (Exception exception)
                {
                    // Log exception here.
                    // Set flag.
                    if (exception.Message.Contains(ResourceDoesNotExist))
                    {
                        recordDoesNotExistExceptionOccured = true;
                        break;
                    }
                    else
                    {
                        break;
                    }
                }
            }
        }
    }
    My questions are:
    Is this an efficient way to purge the WADLogs table? If not, what can make this better?
    Is this the correct way to handle the "Specified resource does not exist exception"? If not, how can I make this better?
    Would this logic fail in any particular case?
    How would this approach change if this code is in a worker which has multiple instances deployed?
    I have come up with this code by referencing the solution given
    here by Keith Murray.

    Hi Nikhil,
    Thanks for your posting!
    I tested your and Keith's code on my side, and everything worked fine. When the result is null or result.Count() is 0, the while() loop stops processing. I see your code has some logic to handle the "ResourceDoesNotExist" error.
    It seems that the code works fine. If you keep getting this error, I suggest you debug your code and find which line throws the exception.
    >> Is this an efficient way to purge the WADLogs table? If not, what can make this better?
    Based on my experience, we could either use code (like the logic above) or use a third-party tool to delete the entities manually. In my opinion the code approach is very efficient: it can run automatically and saves us the manual workload.
    >> Is this the correct way to handle the "Specified resource does not exist" exception? If not, how can I make this better?
    In your code you used "recordDoesNotExistExceptionOccured" as a flag to check whether the entities are gone. That is a good choice. When I tried deleting the log table entities, I used a flag to check the result count instead:
    for example, if I expect the query to return 100 rows and the number returned is lower than 100, I set the flag to false and break out of the while loop.
    >> Would this logic fail in any particular case?
    I think it shouldn't fail. But if the result is empty ("0" rows), your while loop will always run and never stop. I think you could add "recordDoesNotExistExceptionOccured = true;" to your "else" block.
    >> How would this approach change if this code is in a worker which has multiple instances deployed?
    You don't need to change anything except the "else" block; it should work fine in the worker role.
    If you have any questions about this issue, please let me know.
    Regards,
    Will

  • IMPOSSIBLE TO USE A GROUPED MAIL  IN A SCHEDULING ACTION

    Hello Everybody,
    I am currently facing a problem while trying to schedule a report.
    In fact, when I use a single email address like '[email protected]', the report is sent fine.
    However, when I use a grouped email address like '[email protected]', the people don't receive the email. In BO everything seems OK: there are no errors in the schedule history of the report and all services are OK.
    When I use any other kind of software (Outlook, Talend, etc.) to send data through this mailing list, it works every time.
    There is no particular configuration in the SMTP server.
    Does anyone here know why it doesn't work with BO?
    I work with BO XI 3.1.
    Thanks for your help!

    Are the users in the CMC? If they are, follow the steps below.
    Create the users, and while creating them add the email address of each respective user.
    Add the users to a group.
    Go to the report, right-click and choose Schedule.
    Add the group in 'Schedule For'.
    Under Schedule > Destinations, select Email.
    Update the information and in 'TO' specify %SI_EMAIL_ADDRESS%
    Schedule the report.

  • Using Customized group name fields in Live Office

    Hello everybody,
    I've created a Crystal Report based on a BEx query, including a cost center hierarchy. Now I want to add this report to an Excel sheet via Live Office. I did this according to the official how-tos provided by Ingo Hilgefort. Now I have the following issue: when I choose the data for the selected fields in Live Office, I can insert the Live Office object into my report, but the display of the cost center hierarchy nodes doesn't match the display in Crystal Reports. There I've defined a group with a customized group name field (cost center medium description); Live Office seems to use only the cost center node ID.
    Crystal Reports:
    Marketing und Vertrieb    63  63  63  63  0  0
    Vertrieb                  54  54  54  54  0  0
    Vertrieb a                 1   1   1   1  0  0
    Vertrieb a                 3   3   3   3  0  0
    Vertrieb a                 4   4   4   4  0  0
    Vertrieb a                 2   2   2   2  0  0
    Vertrieb a                 2   2   2   2  0  0
    Vertrieb a                 3   3   3   3  0  0
    Vertrieb a                 2   2   2   2  0  0
    Marketing                 34  34  34  34  0  0
    Marketing a                3   3   3   3  0  0
    Live Office:
    1000H1.1000H1000.1000H1300                        63  63  63  63  0  0
    1000H1.1000H1000.1000H1300.1000H1310              54  54  54  54  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3100     1   1   1   1  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3105     3   3   3   3  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3110     4   4   4   4  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3120     2   2   2   2  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3125     2   2   2   2  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3130     3   3   3   3  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3135     2   2   2   2  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3140    34  34  34  34  0  0
    1000H1.1000H1000.1000H1300.1000H1310.1000/3150     3   3   3   3  0  0
    How can I use customized group name fields in Live Office?
    Thank you for helping me on my issue.
    Regards,
    Florian

    Live Office uses web services (dswsbobje) to authenticate, which depends on a Java app server having Kerberos enabled in order to log in with AD. Is InfoView working with AD? This would be required. I'm not sure about your other issues.
    Regards,
    Tim

Maybe you are looking for

  • [Solved] Xorg 1.5 screwed up my NVIDIA

    Hi, Today I did a pacman -Syu and I got the newest NVIDIA driver installed and the newest Xorg. Everything broke. Well not exactly everything. But my settings are all screwed up. For a start my screen refuses to load at the normal resolution (1680x10

  • ABZU - Asset Write-Up

    Dear All, I have attempted to perform a write-up for fixed assets in April 2008 which was capitalised in the current year January 2008. However the system does not allow me to post. However if I attempt to perform a write-up for fixed assets in April

  • I have a MSI K9A2 Platinum Motherboard. Bios version 1.8 overclocking

    I have a MSI K9A2 Platinum Motherboard, BIOS version 1.8, AMD Phenom II X4 940 3.0 GHz that I have overclocked to (17.5) 3.4 GHz. However, anytime I go over that, say with the multiplier set to 18 (3.5 GHz), it still runs but crashes when running high end games

  • IPod Does Not Sync Fully

    I have my iPod set to automatically sync every time I plug it in, but for some reason this time in my iTunes I have 18711 items, while on my iPod there is one less, 18710. iTunes never prompted me to say that a particular item couldn't be synced, as

  • Aperture won't import photos from camera?

    I'm a longtime Aperture user and have the very latest update from Aperture V.3.1.1 For some reason, I was able to import 10 photos fine this morning, then wasn't able to do it again. I have my Canon camera importing through a USB cable directly, and