CRMM_ERM_CAT and the relationship between database questions and activities

Hello guys,
Using this transaction it should be possible to link the questions from the solution database to activities, and to locate those activities there. However, the activities are not appearing. Is there something else in this transaction, or an additional transaction I have to execute, to make CRMM_ERM_CAT work correctly?
Cheers,
Luiz David

Already solved, thanks

Similar Messages

  • How to find relationship between database tables

    Hi mates,
    I am new to SAP BW. Please tell me how to find the relationship between database tables (transparent tables).
    Regards,
    Harry

    Hi Harry,
    From your previous postings, I guess you are asking about the relationships between tables in R/3. The following links may help:
    http://www.sapgenie.com/abap/tables.htm
    http://www.sapgenie.com/abap/tables_sd.htm
    http://www.sapgenie.com/abap/tables_mm.htm
    http://www.sapgenie.com/abap/tables_fi.htm
    http://www.sapgenie.com/abap/tables_ps.htm

  • PS: Relationship between the system status in WBS elements and activities

    Relationship between the system status in WBS elements and activities.
    We have the following requirement regarding the system status in PS. We have a project with the Technically Completed (TECO) status, and we want to undo that status on one activity without changing the Technically Completed status on the WBS element or the project.
    The reason is that after setting the Technically Completed status, users have to change values in the P.O. We don't want to "open" the whole project; we only need to "open" the activity that the P.O. belongs to.
    Is that possible? How is the system status connected between activities, WBS elements and the project, so I can change the status at the lower level without changing the higher levels?
    Thanks in advance.

    Hi Bala,
    Yes, you have put it rather correctly.
    We use WBS elements for booking and posting time (a client requirement), so what I am looking for is WBS elements with the system status TERL (Time and Expense Released).
    I mentioned the deletion flag as well because a WBS element flagged for deletion can't be used; it's one of the criteria for screening the WBS elements.
    I hope it's clearer now.
    Vinitha

  • How would I design the relationship between "question", "subquestion", and "answer"?

    Hi all. Consider the following scenario:
    Scenario:
    A Question has an Answer, but some Questions have Subquestions. For example:
    1. Define the following terms: (Question)
    a) Object (1 mark) (Subquestion)
    An instance of a class. (Answer)
    b) ...
    2. Differentiate between a constructor and a destructor (2 marks) (Question)
    A constructor constructs while a destructor destroys. (Answer)
    Question:
    I want to model Question, Subquestion, and Answer as entities with relationships/associations, preferably binary relationships, as I feel ternary relationships will be problematic while programming. Any suggestion on how I would go about this?
    There are never infinite resources.
    For the Question entity, a question has the attributes "QuestionPhrase <String>", "Diagram <Binary>", and "Marks <Decimal>".
    For the SubQuestion entity, a subquestion has the attributes "SubQuestionPhrase <String>", "Diagram <Binary>", and "Marks <Decimal>".
    For the Answer entity, an answer has the attributes "AnswerPhrase <String>" and "Diagram <Binary>".

    Yes, I am in .NET. I sure do hope I did not ask in the wrong forum. :-|
    Hi KCWamuti,
    If you need to design the relationship between the Question table and the Answer table in SQL Server, then, as Uri's and Visakh's posts suggest, you can create foreign keys to establish the relationship between the tables and use joins in your queries to get the desired result. For more information about JOINs in SQL Server, please review this article:
    Different Types of SQL Joins.
    However, if you need to model Question, Subquestion, and Answer as entities in .NET, then the issue concerns data platform development. I suggest you post the question in the Data Platform Development forums at
    http://social.msdn.microsoft.com/Forums/en-US/home?category=dataplatformdev . That is the appropriate place and more experts will assist you there.
    Thanks,
    Lydia Zhang
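    For illustration, here is a minimal T-SQL sketch of one possible foreign-key design. All table and column names are assumptions based on the attributes listed above, not a prescribed schema:
    -- Hypothetical schema: each SubQuestion belongs to one Question (binary 1:N link).
    CREATE TABLE Question (
        QuestionId     INT IDENTITY PRIMARY KEY,
        QuestionPhrase NVARCHAR(MAX)  NOT NULL,
        Diagram        VARBINARY(MAX) NULL,
        Marks          DECIMAL(5,2)   NULL   -- NULL when the marks sit on the subquestions
    );
    CREATE TABLE SubQuestion (
        SubQuestionId     INT IDENTITY PRIMARY KEY,
        QuestionId        INT NOT NULL REFERENCES Question(QuestionId),
        SubQuestionPhrase NVARCHAR(MAX)  NOT NULL,
        Diagram           VARBINARY(MAX) NULL,
        Marks             DECIMAL(5,2)   NOT NULL
    );
    -- An Answer belongs to exactly one Question or one SubQuestion (two binary links,
    -- which avoids a ternary relationship).
    CREATE TABLE Answer (
        AnswerId      INT IDENTITY PRIMARY KEY,
        QuestionId    INT NULL REFERENCES Question(QuestionId),
        SubQuestionId INT NULL REFERENCES SubQuestion(SubQuestionId),
        AnswerPhrase  NVARCHAR(MAX)  NOT NULL,
        Diagram       VARBINARY(MAX) NULL,
        CONSTRAINT CK_Answer_OneParent CHECK (
            (QuestionId IS NOT NULL AND SubQuestionId IS NULL) OR
            (QuestionId IS NULL AND SubQuestionId IS NOT NULL)
        )
    );
    -- Example join: questions answered directly (no subquestion).
    SELECT q.QuestionPhrase, a.AnswerPhrase
    FROM Question q
    JOIN Answer a ON a.QuestionId = q.QuestionId;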

  • Question about the relationship between AR invoice and Outgoing Payment

    Dear All,
    Does anyone know the relationship between the AR invoice and the Outgoing Payment? I would like to print the report for a specific date range. For example, today is 02 July 2009 and I want to print "AR invoice as at 31 Mar 2009" by using SQL. It is like going back in time.
    From Samson

    Hi Samson,
    Even though I haven't done much R&D about this, what I came to know is that there might be a scenario where you want to make a payment to your customer. In that case we can use the Outgoing Payment for the same. The system doesn't stop you from doing so. I have seen this in SAP B1 2005B, PL34.
    Regards,
    Joseph
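    For the reporting part of the question, a minimal SQL sketch along these lines may help. The SAP Business One names used here (OINV for the AR Invoice header, with DocNum, DocDate, CardCode and DocTotal) are assumptions to verify against your own database before use:
    -- List AR invoices posted on or before 31 Mar 2009.
    SELECT T0.DocNum, T0.DocDate, T0.CardCode, T0.DocTotal
    FROM OINV T0
    WHERE T0.DocDate <= '2009-03-31'
    ORDER BY T0.DocDate;
    -- Payments applied against invoices are recorded in the payment documents'
    -- allocation lines; join those to OINV to see which invoices were still open
    -- as at the report date (table names vary by version, so check your schema).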

  • Error while adding a used relationship between the New DC and the Web DC

    Hi Gurus,
    We are getting an error in NWDS while adding a used relationship between the new DC and the Web DC.
    Steps we have done:
    1. Create the custom project from inactive DCs.
    2. Create the project for the component crm/b2b in SHRAPP_1.
    3. After that we changed application.xml and set the context path.
    4. Then we tried to add a dependency to the custom-created DC, and we get an error saying: illegal dependency: the compartment sap.com_CUSTCRMPRJ_1 of DC sap.com/home/b2b_xyz(sap.com.CUSTCRMPRJ_1) must explicitly use compartment sap.com_SAP-SHRWEB_1 of DC sap.com/crm/isa/.
    So we skipped this step and tried to create the build, but it says the build failed.
    Please help us in this regard.
    Awaiting your quick response.
    Regards,
    Satish

    Hi,
    Please ignore my above message.
    Thanks for your response.
    After your valuable inputs we added the required dependencies and successfully created the projects; the builds also completed successfully and an EAR file was created.
    We need to deploy this EAR file to the CRM application server by using NWDI.
    To deploy the EAR through NWDI, we need to check in the activities I created for the EAR. Once I check in the activities, NWDI will deploy the EAR to the CRM application server.
    In the activity log we can see the activities as Succeeded, but the Deployment column is not showing any status.
    When I right-click on my activity ID, the deployment summary is also disabled.
    So finally my question: where can I get the deployment log file, and where can I check the deployment status for my application?
    Any pointers in this regard would be of great help.
    Awaiting your valuable responses.
    Regards,
    Satish

  • Relationship between db_flashback_retention_target and fast_recovery_area

    Hi to all,
    I was doing a test of Flashback Database on Oracle 11gR2 and I would like to seek clarification on the relationship between db_flashback_retention_target and the fast recovery area. Here are my current settings:
    SQL> show parameter db_flashback_retention
    NAME                               TYPE         VALUE
    db_flashback_retention_target      integer      60
    SQL> show parameter recovery_file_dest
    NAME                               TYPE         VALUE
    db_recovery_file_dest              string       L:\app\amosleeyp\fast_recovery_area
    db_recovery_file_dest_size         big integer  20G
    Here comes the question. I know that the db_flashback_retention_target parameter specifies the upper limit (in minutes) on how far back in time the database may be flashed back. I did a test, and below is the sequence of events.
    DELETE FROM SCOTT.EMP WHERE ename = 'KING'; -- *12:13pm*
    FLASHBACK DATABASE TO TIME = "TO_DATE('2013-03-25 12:12:00','YYYY-MM-DD HH24:MI:SS')"; -- *1:30pm*
    Select count(*) from SCOTT.EMP where ename='KING'; -- *1:31pm*
    COUNT(*)
    1
    From this simple test, I was able to flash the database back to more than 60 minutes ago despite my retention window being set at 60 minutes. Is this because I have a large db_recovery_file_dest_size of 20G, and thus I am able to go back in time more than 60 minutes? Thanks for sharing.

    Hi,
    "So do you mean that the retention target can be about 59, 60, 61 minutes or more depending on the fast_recovery_area space? So it actually fluctuates?"
    No, it does not fluctuate. The fast recovery area is just a multipurpose storage area for backups, backup copies, archived redo logs, flashback logs and so on, so don't confuse it with flashback retention.
    The flashback retention time is the time for which Oracle will always ensure you can flash back the database. If it is set to 60 minutes, you will certainly be able to flash back your database at least 60 minutes.
    If your fast recovery area has free space, Oracle will not delete flashback logs (and you might be able to flash back even several days if the flashback logs have not been deleted from the fast recovery area). It only deletes flashback logs when the fast recovery area has little space left.
    "Only when I set guaranteed restore points will it always store at least 60 minutes?" See the following for this concept:
    http://docs.oracle.com/cd/E11882_01/backup.112/e10642/flashdb.htm#autoId8
    Salman
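    To make the retention a hard guarantee rather than a target, a guaranteed restore point can be used. A minimal sketch (the restore point name is just an example):
    -- Flashback logs needed to return to this point are kept in the FRA,
    -- regardless of db_flashback_retention_target, until the point is dropped.
    CREATE RESTORE POINT before_test GUARANTEE FLASHBACK DATABASE;
    -- Use it later (the database must be mounted, not open):
    FLASHBACK DATABASE TO RESTORE POINT before_test;
    -- Check how far back flashback is currently possible:
    SELECT oldest_flashback_time FROM v$flashback_database_log;
    -- Drop it when finished, or the FRA space stays reserved:
    DROP RESTORE POINT before_test;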

  • What is the relationship between iPhoto and my pictures ?

    I hope you don't mind if I pose another question to you: one that's been bothering me for some time. It has to do with iPhoto and storing photos in general. So far I have been placing my photos in >my mac>my pictures and then working through iPhoto to sort them into categories, events, etc. So far, so good, but I don't understand the relationship between the two. For example, I can create new folders in iPhoto, but that doesn't change my original folders in my pictures at all; so where are the iPhoto folders on my mac? And if I delete from >my pictures, does that also automatically delete them from iPhoto, or vice versa? The setup that I would ideally like to have is that everything I do in iPhoto automatically updates my actual photos in >my mac>my pictures. Is that possible? Or am I going about this the wrong way? Your input would be appreciated. Thanks. Adrian

    You're going about it the wrong way.
    iPhoto is a Database. The entire point of it is that you can manage your photos without recourse to your files. This is more powerful and a whole lot more flexible than files in folders.
    iPhoto can run in two modes: Managed and Referenced.
    In Managed (the default mode) your photos are copied into the iPhoto Library when you import them. I would strongly suspect that this is the case in your set-up. This is the best way to run iPhoto.
    It also means that you now have two copies of the photos - one in the iPhoto Library, the other in your Pictures Folder. (And by the way, there's no "my pictures" folder on a Mac. It's just the Pictures Folder. I'm always amused by the "my" prefix on Windows folders. Who else's files would they be? If they belong to someone else, shouldn't you have a "their" Pictures Folder too? Anyway, enough of a pointless digression.)
    In Referenced mode iPhoto does not copy the files into the Library. It simply references them where you have put them. I'll add a bit to the end of this post explaining some of the gotchas with this mode.
    How can you tell if you're running a Managed or a Referenced Library? Go to iPhoto -> Preferences -> Advanced. Is the box at Copy Items to the iPhoto Library checked?
    As iPhoto is a database there is
    a: No relationship between the photos in your Pictures Folder and iPhoto if you have a Managed Library
    b: Minimal relationship if you have a Referenced Library.
    For example, I can create new folders in iPhoto, but that doesn't change my original folders in my pictures at all; so where are the iPhoto folders on my mac?
    They are entries in the iPhoto database.
    And if I delete from >my pictures, then does that also automatically delete them from iPhoto,
    Again, it depends on whether you're Managed or Referenced:
    a: Managed: No it does not; it makes no difference to iPhoto at all.
    b: Referenced: You've just corrupted your iPhoto Library.
    When you use iPhoto it's your "Go To" app for photographs. The point is that you don't manipulate the files in the Finder any more. You use iPhoto to import, iPhoto to organise, share and so on, you use iPhoto to Delete too.
    Or vice versa?
    Managed: Deleting from iPhoto also removes the files from the HD.
    Referenced: Deleting Files from iPhoto only removes those elements within the iPhoto Library. You then need to trash the original file yourself. Why? When you elected to Reference the files you said to iPhoto "I want to manage the files myself" So manage them.
    The set up that I would ideally like to have is that everything I do in iPhoto automatically updates my actual photos in >my mac>my pictures. Is that possible?
    That's not possible. That's file organising, and as I said, iPhoto is about Photo organising. Albums, Folders, Slideshows etc in iPhoto are all virtual, that is they are entries in the database. The advantage of this is that a photo can be in 50 albums and use no extra disk space at all. Also, iPhoto has no ability to manage files outside the iPhoto Library Folder.
    So how do you actually get to your files if you want to email, upload etc?
    There are many, many ways to access your files in iPhoto:
    *For Users of 10.5 and later*
    You can use any Open / Attach / Browse dialogue. On the left there's a Media heading, your pics can be accessed there. Command-Click for selecting multiple pics.
    (Screenshot uploaded with plasq's Skitch: the Open dialogue showing the Media heading.)
    (Note the above illustration is not a Finder Window. It's the dialogue you get when you go File -> Open)
    You can access the Library from the New Message Window in Mail:
    (Screenshot uploaded with plasq's Skitch: the New Message window in Mail.)
    *For users of 10.4 and later* ...
    Many internet sites such as Flickr and SmugMug have plug-ins for accessing the iPhoto Library. If the site you want to use doesn't, then one of these will also work:
    To upload to a site that does not have an iPhoto Export Plug-in the recommended way is to Select the Pic in the iPhoto Window and go File -> Export and export the pic to the desktop, then upload from there. After the upload you can trash the pic on the desktop. It's only a copy and your original is safe in iPhoto.
    This is also true for emailing with Web-based services. However, if you're using Gmail you can use iPhoto2GMail
    If you use Apple's Mail, Entourage, AOL or Eudora you can email from within iPhoto.
    If you use a Cocoa-based Browser such as Safari, you can drag the pics from the iPhoto Window to the Attach window in the browser.
    *If you want to access the files with iPhoto not running*:
    For users of 10.6 and later:
    You can download a free Services component from MacOSXAutomation which will give you access to the iPhoto Library from your Services Menu. Using the Services Preference Pane you can even create a keyboard shortcut for it.
    For Users of 10.4 and later:
    Create a Media Browser using Automator (takes about 10 seconds) or use this free utility Karelia iMedia Browser
    Other options include:
    1. *Drag and Drop*: Drag a photo from the iPhoto Window to the desktop, there iPhoto will make a full-sized copy of the pic.
    2. *File -> Export*: Select the files in the iPhoto Window and go File -> Export. The dialogue will give you various options, including altering the format, naming the files and changing the size. Again, producing a copy.
    3. *Show File*: Right- (or Control-) Click on a pic and in the resulting dialogue choose 'Show File'. A Finder window will pop open with the file already selected.
    *To work with another editor*
    You can set Photoshop (or any image editor) as an external editor in iPhoto. (Preferences -> General -> Edit Photo: Choose from the Drop Down Menu.) This way, when you double click a pic to edit in iPhoto it will open automatically in Photoshop or your Image Editor, and when you save it it's sent back to iPhoto automatically. This is the only way that edits made in another application will be displayed in iPhoto.
    Note that iPhoto sends a copy of the file to Photoshop, so when you save, be sure to use the Save command, not Save As... If you use Save As then you're creating a new file and iPhoto has no way of knowing about this new file. iPhoto is preserving your original anyway.
    Finally, *On Running a Referenced Library:*
    *How to do it:*
    Simply go to iPhoto Menu -> Preferences -> Advanced and uncheck 'Copy Files to the iPhoto Library on Import'.
    *What Happens:*
    Now iPhoto will not copy the files, but rather simply reference them on your HD. To do this it will create an alias in the Originals Folder that points to your file. It will still create a thumbnail and, if you modify the pics, a Modified version within the iPhoto Library Folder.
    *Some things to consider:*
    1. Importing and deleting pics are more complex procedures. You have to put the files where they will be stored before importing them. When you delete them you'll need to remove the files from the HD yourself.
    2. You cannot move or rename the files on your system or iPhoto will lose track of them on systems prior to 10.5 and iPhoto 08. Even with the later versions issues can still arise if you move the referenced files to new volumes or between volumes.
    3. Most importantly, migrating to a new disk or computer can be much more complex.
    4. Because iPhoto has no tools for managing Referenced Files, if, for some reason, the path to the photos changes, then you could find yourself resolving aliases for *each photo in the Library* one by one.
    My own opinion:
    I've yet to see a good reason to run iPhoto in Referenced mode unless you're using two photo organisers.
    I know there's a lot there, so by all means post back if you need more.
    Regards
    TD

  • Getting error "The trust relationship between the primary domain and the trusted domain failed" in SharePoint 2010

    Hi,
    A SharePoint 2010 backup was taken from production and restored through the Semantic Tool on one of the servers. The web application whose backup was taken is working fine.
    But the problem is that SharePoint is not working correctly. We cannot create any new web application, and we cannot navigate to the ServiceApplications.aspx page; it shows an error. Even the Search and User Profile services of the existing web application are not working. Checking
    the SharePoint logs, I found the exception below:
    11/30/2011 12:14:53.78  WebAnalyticsService.exe (0x06D4)         0x2D24 SharePoint Foundation          Database                     
     8u1d High     Flushing connection pool 'Data Source=urasvr139;Initial Catalog=SharePoint_Config;Integrated Security=True;Enlist=False;Connect Timeout=15' 
    11/30/2011 12:14:53.78  WebAnalyticsService.exe (0x06D4)         0x2D24 SharePoint Foundation          Topology                     
     2myf Medium   Enabling the configuration filesystem and memory caches. 
    11/30/2011 12:14:53.79  WebAnalyticsService.exe (0x06D4)         0x12AC SharePoint Foundation          Database                     
     8u1d High     Flushing connection pool 'Data Source=urasvr139;Initial Catalog=SharePoint_Config;Integrated Security=True;Enlist=False;Connect Timeout=15' 
    11/30/2011 12:14:53.79  WebAnalyticsService.exe (0x06D4)         0x12AC SharePoint Foundation          Topology                     
     2myf Medium   Enabling the configuration filesystem and memory caches. 
    11/30/2011 12:14:55.54  mssearch.exe (0x0864)                    0x2B24 SharePoint Server Search       Propagation Manager          
     fo2s Medium   [3b3-c-0 An] aborting all propagation tasks and propagation-owned transactions after waiting 300 seconds (0 indexes)  [indexpropagator.cxx:1607]  d:\office\source\search\native\ytrip\tripoli\propagation\indexpropagator.cxx 
    11/30/2011 12:14:55.99  OWSTIMER.EXE (0x1DF4)                    0x1994 SharePoint Foundation          Topology                     
     75dz High     The SPPersistedObject with
    Name User Profile Service Application, Id 9577a6aa-33ec-498e-b198-56651b53bf27, Parent 13e1ef7d-40c2-4bcb-906c-a080866ca9bd failed to initialize with the following error: System.SystemException: The trust relationship between the primary domain and the trusted
    domain failed.       at System.Security.Principal.SecurityIdentifier.TranslateToNTAccounts(IdentityReferenceCollection sourceSids, Boolean& someFailed)     at System.Security.Principal.SecurityIdentifier.Translate(IdentityReferenceCollection
    sourceSids, Type targetType, Boolean forceSuccess)     at System.Security.Principal.SecurityIdentifier.Translate(Type targetType)     at Microsoft.SharePoint.Administration.SPAce`1.get_PrincipalName()    
    at Microsoft.SharePoint.Administration.SPAcl`1.Add(String princip... 
    11/30/2011 12:14:55.99* OWSTIMER.EXE (0x1DF4)                    0x1994 SharePoint Foundation          Topology                     
     75dz High     ...alName, String displayName, Byte[] securityIdentifier, T grantRightsMask, T denyRightsMask)     at Microsoft.SharePoint.Administration.SPAcl`1..ctor(String persistedAcl)    
    at Microsoft.SharePoint.Administration.SPServiceApplication.OnDeserialization()     at Microsoft.SharePoint.Administration.SPIisWebServiceApplication.OnDeserialization()     at Microsoft.SharePoint.Administration.SPPersistedObject.Initialize(ISPPersistedStoreProvider
    persistedStoreProvider, Guid id, Guid parentId, String name, SPObjectStatus status, Int64 version, XmlDocument state) 
    11/30/2011 12:14:56.00  OWSTIMER.EXE (0x1DF4)                    0x1994 SharePoint Foundation          Topology                     
     8xqx High     Exception in RefreshCache. Exception message :The trust relationship between the primary domain and the trusted domain failed.   
    11/30/2011 12:14:56.00  OWSTIMER.EXE (0x1DF4)                    0x1994 SharePoint Foundation          Timer                        
     2n2p Monitorable The following error occured while trying to initialize the timer: System.SystemException: The trust relationship between the primary domain and the trusted domain failed.       at System.Security.Principal.SecurityIdentifier.TranslateToNTAccounts(IdentityReferenceCollection
    sourceSids, Boolean& someFailed)     at System.Security.Principal.SecurityIdentifier.Translate(IdentityReferenceCollection sourceSids, Type targetType, Boolean forceSuccess)     at System.Security.Principal.SecurityIdentifier.Translate(Type
    targetType)     at Microsoft.SharePoint.Administration.SPAce`1.get_PrincipalName()     at Microsoft.SharePoint.Administration.SPAcl`1.Add(String principalName, String displayName, Byte[] securityIdentifier, T grantRightsMask,
    T denyRightsMask)     at Microsoft.SharePoint.Administrati... 
    11/30/2011 12:14:56.00* OWSTIMER.EXE (0x1DF4)                    0x1994 SharePoint Foundation          Timer                        
     2n2p Monitorable ...on.SPAcl`1..ctor(String persistedAcl)     at Microsoft.SharePoint.Administration.SPServiceApplication.OnDeserialization()     at Microsoft.SharePoint.Administration.SPIisWebServiceApplication.OnDeserialization()    
    at Microsoft.SharePoint.Administration.SPPersistedObject.Initialize(ISPPersistedStoreProvider persistedStoreProvider, Guid id, Guid parentId, String name, SPObjectStatus status, Int64 version, XmlDocument state)     at Microsoft.SharePoint.Administration.SPConfigurationDatabase.GetObject(Guid
    id, Guid parentId, Guid type, String name, SPObjectStatus status, Byte[] versionBuffer, String xml)     at Microsoft.SharePoint.Administration.SPConfigurationDatabase.GetObject(SqlDataReader dr)     at Microsoft.SharePoint.Administration.SPConfigurationDatabase.RefreshCache(Int64
    currentVe...
    Please guide me on the above issue; this will be of great help.
    Thanks.

    I have the same error. I verified the trust and the ports and cleaned up the cache; nothing has helped.
    The problem is caused by the User Profile Synchronization Service:
    UserProfileProperty_WCFLogging :: ProfilePropertyService.GetProfileProperties Exception: System.SystemException:
    The trust relationship between the primary domain and the trusted domain failed.       at System.Security.Principal.SecurityIdentifier.TranslateToNTAccounts(IdentityReferenceCollection sourceSids,
    Boolean& someFailed)     at System.Security.Principal.SecurityIdentifier.Translate(IdentityReferenceCollection sourceSids, Type targetType, Boolean forceSuccess)     at System.Security.Principal.SecurityIdentifier.Translate(Type
    targetType)     at Microsoft.SharePoint.Administration.SPAce`1.get_PrincipalName()     at Microsoft.SharePoint.Administration.SPAcl`1.Add(String principalName, String displayName, SPIdentifierType identifierType, Byte[]
    identifier, T grantRightsMask, T denyRigh...        
    08/23/2014 13:00:20.96*        w3wp.exe (0x2204)                      
            0x293C        SharePoint Portal Server              User Profiles                
            eh0u        Unexpected        ...tsMask)     at Microsoft.SharePoint.Administration.SPAcl`1..ctor(String persistedAcl)    
    at Microsoft.Office.Server.Administration.UserProfileApplication.get_SerializedAdministratorAcl()     at Microsoft.Office.Server.Administration.UserProfileApplication.GetProperties()     at Microsoft.Office.Server.UserProfiles.ProfilePropertyService.GetProfileProperties()
    Please let me know if you have found any solution for this.
    Regards,
    Kunal

  • 1-to-1 Relationship Between UI and subVI Data Cluster

    Discussion continued from here.
    In summary:
    JackDunaway wrote:
    Yes, I can see clear benefits in implementing this Idea - that is, if your underlying datatype elements have a 1:1 relationship with the UI elements.
    I will illustrate this point by showing some potential flaws in your example: "Profile Running" and "Profile Complete" are mutually exclusive, no? Wouldn't it be better to have a single enum named "Profile" with three elements "Idle, Running, and Complete" for the underlying datatype? Having two mutually exclusive pieces of data in an underlying datatype is among my favorite of code smell indicators.
    Also, the underlying datatype probably only needs "Forward Miles" and "Reverse Miles" since "Total Miles" is derived. Exclude "Total Miles" from the underlying cluster and just show the sum for display.
    Another argument against using a 1:1 relationship: the customer now wants to multiply speed by -1 if Direction==Reverse and not show the Direction enum on the UI. The data source (the VI that generates the data) would need to be updated using your 1:1 relationship. Using underlying data different from the display data, only the data client (the UI front panel) needs to change. I would be much more inclined to service the UI FP for a cosmetic upgrade rather than tracing the data source back through the HMI framework, through TCP, back to the RT, back to FPGA...
    Basically... I question a perfectly overlapped Venn Diagram where the set of data shown to the user equals the dataset used for underlying data processing/messaging/storing. The underlying datatype should be as stripped and streamlined as possible, while the display datatype can inherit all the flair and post-processing that Upper Management wants to see in a UI.
    LabBEAN wrote:
    <JackDunaway wrote
    I will illustrate this point by showing some potential flaws in your example...
    <LabBEAN response
    The data you see maps directly to tags on the PLC.
    <JackDunaway wrote
    Yes, I can see clear benefits in implementing this Idea - that is, if your underlying datatype elements have a 1:1 relationship with the UI elements.
    <LabBEAN response
    JackDunaway wrote:
    This is a good indicator that we're both aware at this point that I'm missing something... in all seriousness, could you reply to the 1:1 argument? I really want to understand this Idea and learn how/if I need to apply it to my own style (our last back-and-forth turned out to be an enlightening and introspective exercise for me).
    ***EDIT: By all means, please start a discussion on the LabVIEW board so we're not hindered by the Exchange's interface. ***
    My long delayed response:
    The indicators you see map to tags on the PLC.  That is, we were connecting through OPC to an application on a PLC that was written ~15 years ago.  I have a VI where I read a bunch of SVs (Shared Variables).  Each SV is bound through OPC to a PLC tag.  In the interest of disclosure, two 16-bit tags are required to represent each 32-bit mileage number.  In the same subVI, I read each set of mileage tags, convert, and feed my subVI cluster indicator.  The same is true for wheel size:  three bits get converted to the enum.  Regardless, though, I have one subVI that reads SVs and outputs the same "underlying data" cluster that is seen on the UI.  The UI has a "Faults" cluster full of fault Booleans that follows the same logic.  When the user configures a profile of steps, they do so via an array of "step" clusters (although the cluster look is hidden for aesthetics).  It's the same thing as above except we write tags instead of reading them.
    In my case, each set of 16-bit tags is worthless as two 16-bit numbers.  They are only useful as a 32-bit mileage, so I don't pass around the raw 16-bit data.  The same is true for the wheel size bits. My software can just as easily (in fact, more easily) operate on the enum.  So, the underlying cluster from the subVI is programmatically useful and applicable to the UI.  I would guess that the same is true for a lot of RT applications, where the read VI can have some intelligence to process the data into useful / applicable clusters.
    There are going to be cases where "Upper Management" would like to see "flair and post-processing" as you say.  Your speed illustration is a good example of this.  There are also instances where the cluster works fine on the UI the way it is (like this one and many others that we've seen).
    <JackDunaway wrote
    "Profile Running" and "Profile Complete" are mutually exclusive, no?
    Wouldn't it be better to have a single enum named "Profile" with three
    elements "Idle, Running, and Complete" for the underlying datatype?
    <LabBEAN response
    Did you mean "not" mutually exclusive?  We combined 3 "dependent" (not mutually exclusive) Booleans into an enum for Wheel Size, as I mentioned above.  Not sure now why we went the other way with these two (this was 2 years ago).  In any event, with regard to UI representation, I still pass a cluster out of my read-raw-data-and-process-into-cluster subVI up to the applicable queued state machines and to the UI.
    <JackDunaway wrote
    Having two mutually exclusive pieces of data in an underlying datatype is among my favorite of code smell indicators.
    <LabBEAN response
    Working with applications written in ladder logic, it is not uncommon to see separate Booleans that indicate the same condition.  This seems to be especially true when safety is a concern.  That is, ladder Coil A ON and Coil B OFF == switch open.  Coil A OFF and Coil B ON == switch closed.  If you ever read OPC tags from Coil A and Coil B and the two are the same, you know the ladder is in transition (hasn't updated the tags).  Throw that point out and read again.
    I, too, appreciate our back-and-forths.  Good discussion.
    Certified LabVIEW Architect
    Wait for Flag / Set Flag
    Separate Views from Implementation for Strict Type Defs

    Thanks for replying, Jason. Let me see if I can craft a coherent response after getting back up to speed...
    (...later)
    OK, let's go. I'm going to fully agree with you that LabVIEW imposes a strange constraint, unique from most other languages, where a Typedef defines two things: the underlying data structure, and also the view. A Strict Typedef should be more accurately deemed the Datatype-View-Definition, and a Typedef would be more accurately called the Datatype-Definition-and-View-Suggestion. And to be clear, there are two types of views: the programmer's view (a SubVI look and feel) and the UI view (what the user and Upper Management sees). (Finally, I admit I'm ignorant of whether "view" or "View" is the more appropriate term.)
    Linking the programmer's view to the datatype is perfectly fine with me, and based on your original Idea I think we both agree that's OK. I think we run into a disagreement where you have loosely tied the concept of "Strict TD" to "UI View".
    Historically, I have used Strict Typedefs for the programmer's view (SubVIs), since I like to maintain a "functional UI" at the SubVI level. I don't use type definitions on User Interfaces - only Controls. That's the reason your Idea does not appeal to me, but perhaps if your Idea were implemented, it would appeal to me, since View and Implementation would be divorced as separate entities within the Type Definition. (Does that classify as a Catch-22?) So, your Idea is fundamentally suggesting that Type Definition .ctl files should be more accurately called "a container that holds both a Type Definition and any number of View Definitions as well".
    Fundamentally, I think I finally understand the gist of your Idea: "let's ditch this weird constraint where View and Datatype are inextricably defined together in one file", and for that, I'll give Kudos to the original Idea. I got really tied up with the example you used to present the Idea, and plus I'm still learning a lot.
    Additional thoughts:
    This Idea reminds me of another: Tag XControl as Class View
    We've still got some arguing to do on a 1:1 relationship between underlying datatype and UI presentation, so put your mean face back on: 
    Since our last conversation, interestingly, I have been on an anti-Typedef kick altogether.  Why don't you drop some feedback on my attempt at a completely typedef-free UI framework?

  • Extending DataFinder to include Non-Data Entries and Relationships Between Entries

    I am trying to create custom data centralization and management software for our lab with DataFinder Toolkit.  In addition to storing data from data acquisitions, I want to be able to store additional information with the data such as images, pdf files, word documents that can be easily accessed after finding the data.  My current idea is to handle external files by storing the filepath in a property of an object vs trying to store the data itself in the datafinder database.  My code would know how to open these files so the user can see them with external software or a custom labview app.
    I also want to be able to store non-data objects such as an object that describes a piece of test equipment or a sample through properties and external files.  Images and documents are again important here.  And then create relationships through unique barcoded IDs that are a property in the datafinder entry.  So not only could I look at the data, I could find information about the samples and lab equipment. Or anything else we think needs to be saved.
    The code I am writing around DataFinder would be in charge of maintaining the unique IDs and know how the relationships between those IDs work to generate the proper queries. Like a 'Get Test Equipment Used' button next to a search result.
    For the external non-data objects I was going to use Labview code to create a file with all the information about the object (unique ID, properties and external file paths) and create a plug-in so the datafinder will index the file and make it searchable.
    Does this seem reasonable? I like the idea of working with one system and having everything update based on adding, deleting and modifying files in a folder. Also, I like the idea of not splitting the data management up into multiple technologies and having to make those work together.

    From the description you've given, the system seems plausible. The non-data elements you discussed, when broken down, are just additional data to associate with an object. Storing file paths seems plausible, as done in the community example below:
    Execute String/Numeric Querry Using Data Finder Toolkit
    https://decibel.ni.com/content/docs/DOC-10631
    Regards,
    Isaac S.
    Applications Engineer
    National Instruments

  • How to create the relationship between ESSBASE 11 and DM in OBIEE11

    Hi Experts,
    I have a requirement where there is a property table named 'Store Master' in the DW, and it contains a lot of attributes, such as Open Date, Close Date, IS 24 Hour, etc.
    The other data source is Essbase, and all reports are based on this source.
    In Essbase there is one dimension and hierarchy, Location, and it has four levels: Country (L1), Region (L2), Province (L3), Store (L4).
    So I want to know how to create the relationship between Location (ESSBASE) and Store Master (DM).
    I tried to create a relationship in the physical layer between Gen4,Location and Store, then drag the Open Date and Close Date into the Location dimension in the BMM, and then into the Presentation Layer.
    When I drag the columns 'Open Date', 'Gen4,Location' and 'Sales' into a report, it generates the following error message:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14020] None of the fact tables are compatible with the query request Dim Region.Store Open Date. (HY000)
    However, when I remove the column 'Open Date', it is OK.
    So what steps am I missing? Please help me. Thanks.

    >
    '2. Now, pull the 'Store' column from relational DB onto the Gen5, Location column from Essbase. This action now creates two logical sources for your 'Store' column.'
    "If the length from the different data sources is not the same, such as 1001 (DM) and L_1001 (ESSBASE), can I drag the 'Store' column from the relational DB onto the Gen5, Location column from Essbase? I think it does not work. Right?"
    Hi,
    I am not sure whether you are talking about the length (as in varchar(128)) of the member value being different in the different sources, or the member itself being different in the two sources.
    I am still assuming that you are referring to the members not being the same in both sources. If so, the whole concept of federation is based on conforming dimensions. It requires that the same dimension information is present in both sources; only then can we analyze the numbers based on this dimension. So either the dimension being different in the two sources, or the members not being present in both dimensions, might lead to incorrect numbers.
    "So I select Store Attributes in the relational DB and Location in ESSBASE in the physical layer, then create the physical join, such as right("Hour Sales"."H_Sales".""."H_Sales"."Gen6,Location",4) = "Authorization".""."EDW"."T_EDW_MDM_STORE"."US_CODE", then drag OPEN_DATE and CLOSE_DATE from the relational DB to Location in ESSBASE in the BMM, and finally drag them into the presentation layer."
    We create physical layer relationships in order to send that same relation to the underlying database during querying, so creating a physical relationship between an Essbase cube and a relational database would not help here.
    When you set up this federation, the BI Server sends individual queries to each source and maps the conforming dimension members internally.
    Hope I was clear, and this helps.
    Thank you,
    Dhar

  • Relationship between SAP R/3 and XI

    Hi
    I have a question: what kind of relationship exists between an R/3 system and an XI system? Is it 1:1 or 1:many?
    Let's say I have a requirement where I have one R/3 system and 3 EAI systems. I want to generate an IDoc or RFC from the single R/3 system which has to go through all 3 EAI systems. Is it possible? If so, how do I achieve it?
    If not, why is that so?
    I was having a debate about this with a colleague at my workplace, and I would like to hear from you all what your opinion is about this.
    Regards,
    Sameer

    Hi Sameer,
    It is no problem to implement an n:m relationship between systems. For example, you can have one R/3 system sending a material master IDoc, configure in the receiver determination that this IDoc should go to 3 different receivers, then define for each receiver the interface expected by that receiver (of course you will have to implement the payload mappings), and then choose three different adapters for sending the messages.
    Regards, Frank

  • Relationship between logical system and a client

    Hi,
    Our internal notes say that, in order to create a new client, I must do the following:
    1. Create a logical system using BD54. The logical system name must be SID + "CLNT" + client number. In my case, this would be NSPCLNT100.
    2. Create a client using SCC4.
    Questions:
    1. What is the relationship between the client and the logical system?
    2. How does SAP know that client number 100 must be related to NSPCLNT100? I am not specifying this relationship anywhere.
    Thank you in advance for your help.
    Regards,
    Peter

    You assign the logical system to the client in SCC4; the client's "Logical system" field establishes the relationship.
