Studio Portlet - Data Submission

I have a Data Submission studio portlet and I would like users to create ONLY one record (i.e., submit their address, for example, only once, with the option to change it using the "My record" button). Right now they can create more than one record. Does anyone know how I can stop them from creating multiple records while still allowing them to modify the existing one?
Thank you

I don't think it's possible to meet both of your requirements with the out-of-the-box templates.
My understanding is that the Data Submission portlet doesn't allow you to restrict the number of entries a user can create.
The Poll and Survey portlet templates both support single-entry submission, but they don't support editing of submitted records (as far as I know).
Instead, you could probably create a custom template by writing against the APIs, but this might be significantly more work than it's worth.
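If you do go the custom route, the core check is straightforward: before showing the create form, look up whether the current user already has a row in the portlet's underlying table and, if so, route them to the edit ("My record") view instead. Below is a minimal sketch in C# with ADO.NET; the table and column names (ADDRESS_RECORDS, PORTAL_USER_ID) are placeholders, not the actual Studio schema.

    using System.Data.SqlClient;

    // Minimal sketch: enforce "one record per user" before allowing a new submission.
    // ADDRESS_RECORDS and PORTAL_USER_ID are hypothetical names standing in for the
    // table that backs your Data Submission portlet.
    public static class SingleRecordGuard
    {
        public static bool UserHasRecord(string connectionString, int portalUserId)
        {
            const string sql =
                "SELECT COUNT(*) FROM ADDRESS_RECORDS WHERE PORTAL_USER_ID = @userId";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@userId", portalUserId);
                connection.Open();

                // If a row already exists, the UI should route to the edit view
                // ("My record") instead of the create form.
                return (int)command.ExecuteScalar() > 0;
            }
        }
    }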

Similar Messages

  • Studio Portlets - Workflow - Process Portlets

    Is there any way to tap into the Workflow API for Content Server to provide something similar from Studio? We want to be able to have a form in Studio be submitted to Person A via email with a link to the request. When the request is approved, it's submitted to Person B via email with a link to the request. Person B then fills the request. In this example it's ordering PDAs; Person A is the Cost Center Manager and Person B is from Telecommunications. We also want to send an email back to the person who submitted the request each time the status changes or the order is filled, so they know what the status of the request is. We have several other similar process portlets that Studio is a great tool for entering the data for, but it lacks the workflow. I'm wondering if anyone else has been able to tie workflow into Studio. Any other ideas outside of Studio would be appreciated as well. A workflow framework would be great. We are using 5.02 and C#.

    Here's some information from the Plumtree Deployment Guide:
    Using Studio to Create Workflow Applications: Unlike Plumtree Content Server, Studio Server does not have a mechanism for creating workflow templates that can be associated with a given database table or portlet. It is, however, possible to create portlets in Studio that simulate rudimentary workflows using the basic building blocks that Studio provides. The approach, in general, is to add various status and assignment fields to a Studio database table, and then to create various portlets and reports that filter records based on the values of these fields. Email notification can also be used to alert people when records are created or modified, thus alerting them to items that may require their attention.
    For example, imagine that we wanted to add workflow capability to a work order request system. Before having the operations clerk place the order for supplies, we want to first have the employee's manager approve or reject the request. Only requests that were approved would be subsequently processed. In this case, we would add two fields to the database table used by the portlets in the system - "Manager to Approve" (Portal User data type) and "Work Order Status" (Text data type with a value list containing "New" (the default value), "Manager Approved", "Manager Rejected", "Order Placed", and "Order Delivered").
    In the data submission portlet used by employees to submit requests, the "Manager to Approve" field would appear on the request form, while the "Work Order Status" field would be hidden. We would also want to create a notification rule for the form, so that an email gets sent to the "Manager to Approve" user, alerting them to a new order requiring their attention.
    Next, we would create a new record browser portlet called "Work Orders to Approve" with a report that listed records where the "Work Order Status" field value was equal to "New" and the "Manager to Approve" field was equal to the currently logged in user. We might also set permissions on the portlet so that it is visible only to managers in the company. When a manager views this portlet from a My Page or community page, it would list all the work orders with approval pending. They could then open these records and change the status to either "Manager Approved" or "Manager Rejected" as necessary. Similarly, we could also change the record browser portlet used by the operations clerk who places the orders to only display those records where the status was "Manager Approved."
    In this way, using email notifications and filters in conjunction with assignment and status fields allows us to simulate simple workflows with Studio.
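    To make the filtering step concrete, the "Work Orders to Approve" record browser boils down to a query against the Studio table restricted by the status and assignee fields. Here is a rough illustration in C#; the table and column names (WORK_ORDERS, WORK_ORDER_STATUS, MANAGER_TO_APPROVE) are invented for the example, and in practice the filter would be configured through the Studio report UI rather than hand-written SQL.

    using System.Data;
    using System.Data.SqlClient;

    // Illustrative only: the equivalent of the "Work Orders to Approve" filter.
    // WORK_ORDERS, WORK_ORDER_STATUS and MANAGER_TO_APPROVE are hypothetical
    // names standing in for the fields described above.
    public static class WorkOrderQueries
    {
        public static DataTable GetOrdersAwaitingApproval(string connectionString, string currentUser)
        {
            const string sql =
                "SELECT * FROM WORK_ORDERS " +
                "WHERE WORK_ORDER_STATUS = 'New' AND MANAGER_TO_APPROVE = @user";

            using (var connection = new SqlConnection(connectionString))
            using (var adapter = new SqlDataAdapter(sql, connection))
            {
                adapter.SelectCommand.Parameters.AddWithValue("@user", currentUser);

                var pending = new DataTable();
                adapter.Fill(pending);   // one row per work order still awaiting approval
                return pending;
            }
        }
    }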

  • Site Studio contributor data files with sections

    Is it possible to have sections within Site Studio contributor data files, display a list of section titles as hyperlinks on a Site Studio site, and show only that section's content when its title is clicked?
    The number of sections varies from document to document (so we can't use a single element for each section).
    Thanks...

    I am not aware that out-of-the-box Site Studio has such an option.
    You do have sections (defined in the Region Definition), so most likely you'd have to customize your user experience (so that only sections defined on the fly are shown). Note that a Region Definition corresponds to a logical "object" (you can think of the sections as the object's attributes), so rather than showing/hiding sections you might also want to select a different region definition. I also remember that some time ago Site Studio was able to do "lazy loading" (displaying a section only when the user asked for it; this was meant to address slow page-loading times); unfortunately, I'm not sure how this feature can be turned on.

  • Crystal Reports for Visual Studio 2010 "data source object is not valid" error

    Hello,
    I receive a "data source object is not valid" error when I try to print a Crystal Reports document after passing an ADODB.Recordset to the SetDataSource method of my report.
    On my development station this operation works without a problem, but on the client station I get this error.
    The redistributable package for client is installed on client side (CRRuntime_32bit_13_0_1.msi).
    Can someone help me?
    Thank you.

    Thanks for your answers.
    Dim rsPkLst As ADODB.Recordset = Nothing
    Dim report As New crPickingList
    ' Fill the ADODB.Recordset with a SQL statement (rsPkLst must be
    ' instantiated and opened before RecordCount is read)
    If rsPkLst.RecordCount > 0 Then
        report.SetDataSource(rsPkLst) ' Error: The data source object is invalid
    End If
    The error appears on the "report.SetDataSource(rsPkLst)" instruction.
    ADODB drivers are already installed and my ADODB.Recordset is filled with the expected records.
    This project was upgraded from Visual Studio 2003 to Visual Studio 2010, and the old version was running fine.
    Developer and client station runs under Windows XP SP3.
    On developer side I install CRforVS_13_0_1 (BuildVersion=13.0.1.220.Cortez_CR4VS).
    On client side I install CRRuntime_32bit_13_0_1.msi.
    Both stations use Microsoft .Net Framework 4.
    Moving to ADO.NET is a solution but, for the moment, I do not have the time to change all the applications at my company.
    (I get this error in every application developed since 2005 and updated from VS 2003 to VS 2010.)
    David.
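    A hedged sketch of that eventual ADO.NET migration, in C#: OleDbDataAdapter has a Fill overload that accepts a legacy ADO Recordset and copies it into a DataTable, which ReportDocument.SetDataSource also accepts. The crPickingList class name is taken from the post above, the recordset is assumed to be already populated, and the ADODB interop assembly must be referenced; treat this as an illustration rather than a confirmed fix for the runtime error.

    using System.Data;
    using System.Data.OleDb;
    using CrystalDecisions.CrystalReports.Engine;

    // Sketch: bridge a legacy ADODB.Recordset into Crystal Reports for VS 2010
    // by copying it into an ADO.NET DataTable first.
    public static class PickingListPrinter
    {
        public static ReportDocument LoadReport(ADODB.Recordset rsPkLst)
        {
            // Copy the COM recordset into a DataTable the CR runtime understands.
            var table = new DataTable("PickingList");
            using (var adapter = new OleDbDataAdapter())
            {
                adapter.Fill(table, rsPkLst);
            }

            // crPickingList is the report class from the original post (assumed).
            var report = new crPickingList();
            report.SetDataSource(table);
            return report;
        }
    }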

  • Measurement Studio plot data vs. date/time (Measurement Studio for Visual Basic 6)

    Hi, I'm trying to do something that should be simple and appears totally possible, but I can't get there.
    I'd like to plot some data on the y-axis vs. the date/time on the x-axis. I've looked at the samples and tried modifying the chart properties, but I'm doing something wrong and can't get it working properly.
    Any chance someone has a very simple example of plotting a few points vs. date/time that they can share?
    Thanks for any feedback!!

    Hey Larrymcd,
    What format is your date/time currently in? That might help us find the best way to do this. I was able to find a few examples of plotting a graph with time on the X-axis:
    http://forums.ni.com/t5/Measurement-Studio-for-NET/Measurement-Studio-Graph-to-plot-with-time-scale/...
    http://digital.ni.com/public.nsf/allkb/FFC867DDE42029BA8625760300477BEB
    http://zone.ni.com/devzone/cda/epd/p/id/3334
    Hopefully some of these can point you in the right direction. If you have any more questions after checking those examples out, please let us know!
    Daniel E.
    TestStand Product Support Engineer
    National Instruments

  • Essbase Studio Invalid Data Type

    Hi All,
    I am having an issue when trying to build and load data to a cube using Essbase Studio.
    The environment is Oracle9i Enterprise Edition 9.2.0.6.0 and Essbase 11.1.1.3.
    The cube deployment errors out on the data load and returns the following info in the details box.
    1. [connection : \'FCS3'::'PRODADM.DIM_BUSINESS_UNIT', connection : \'FCS3'::'PRODADM.DIM_SCENARIO', connection : \'FCS3'::'PRODADM.DIM_DEPARTMENT', connection : \'FCS3'::'PRODADM.DIM_TOTALYEAR']
    2. [connection : \'FCS3'::'PRODADM.DIM_ACCOUNT']
    When I try to join the tables in the schema editor I get the following error.
    Invalid Join
    Join columns must have the same data type.
    The data types are all the same.
    Has anyone run into this issue?
    Thanks in advance.

    When you created the connection, were the data types the same? If you change them after making the connection, that version of Studio won't pick up the change. An easy way to check what Studio thinks the data types are is to go to the tables in your connection, right-click, and view the properties. It will show you the data types it has.

  • DBA Studio, change date format

    Hi,
    in DBA Studio, can you tell me where and how I can change the DB date format from DD-MON-YY to DD.MM.YYYY?
    Thank you.

    Is the date format shown in OEM different from the date format of the database?
    Joel Pérez

  • Kinect Studio export data (to .csv, etc.)?

    Hello guys,
    I really like the Kinect Studio application, but I haven't found a way to export data, e.g. the position of body elements, or raw data. Is there a way to do so?
    thanks!

    You can do that from a regular Kinect application that uses the Kinect SDK; KStudio can be thought of as a virtual sensor in this respect. If you can record a regular live stream to .csv, then just play back the clip and that data will be sent through the runtime to any application.
    Carmine Sirignano - MSFT
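    If it helps, below is a rough C# sketch of the kind of application Carmine describes: a console app using the Kinect for Windows SDK 2.0 Body API that appends tracked joint positions to a CSV file. Run it while KStudio plays back a clip and the recorded frames flow through the runtime into this code just as live data would. The file path and column layout are only examples.

    using System;
    using System.IO;
    using Microsoft.Kinect;   // Kinect for Windows SDK 2.0

    // Sketch: log body joint positions to CSV; works with live data or with a
    // Kinect Studio clip being played back through the runtime.
    class BodyCsvLogger
    {
        static Body[] bodies;

        static void Main()
        {
            KinectSensor sensor = KinectSensor.GetDefault();
            sensor.Open();

            bodies = new Body[sensor.BodyFrameSource.BodyCount];
            BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
            reader.FrameArrived += OnFrameArrived;

            Console.WriteLine("Logging... press Enter to stop.");
            Console.ReadLine();

            reader.Dispose();
            sensor.Close();
        }

        static void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null) return;
                frame.GetAndRefreshBodyData(bodies);

                foreach (Body body in bodies)
                {
                    if (!body.IsTracked) continue;

                    // One CSV row per tracked joint: time, joint name, x, y, z (meters).
                    foreach (var joint in body.Joints.Values)
                    {
                        File.AppendAllText("body_data.csv", string.Format(
                            "{0},{1},{2},{3},{4}{5}",
                            frame.RelativeTime, joint.JointType,
                            joint.Position.X, joint.Position.Y, joint.Position.Z,
                            Environment.NewLine));
                    }
                }
            }
        }
    }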

  • Sorting Portlet Data retrieved from BEA Content Management

    We have several portlets which get their data from the local BEA content management repository. I want to know how we can sort the data (ascending and descending) by different columns.
    E.g., a portlet has documents with columns like title, date, size, etc., and we want to sort these documents by these columns in ascending and descending order.
    Thanks
    J

  • Logic Pro 8 with Triton Studio MIDI Data Not Sending

    I have a Korg Triton Studio and recently got a MIDI IN/OUT-to-USB adapter for my MacBook.
    I've never done this before, so I am trying to figure it out:
    My computer shows that the USB adapter is indeed there, and Logic Pro sees the device as well... however, when I press keys on the keyboard, the data does not seem to reach the Logic Pro track.
    I just want to get it set up so I can record onto Logic Pro 8 and create music with this software - playing my board and having the software receive what I am playing. Just piano stuff, really... Any help I can get is appreciated. I have looked through all sorts of forums and manuals, but maybe an explanation would help, or some clarity on what the basic settings should be so it can work; if it still doesn't work after that, then maybe there's another issue.
    Also, when I start Logic Pro 8, it says "Cannot find DAE Folder in "System Folder"" and then "ESB TDM Plug-in not found".
    Are these anything to worry about? This computer is used, so I am not the first owner.

    CCTM wrote:
    Hi
    Pancenter wrote:
    The cable ends are labeled  IN and OUT correct? They go into the opposite MIDI ports on the Keyboard.
    The cable labeled IN goes to the MIDI OUT port on the keyboard....etc.
    You need to use an external MIDI track or a Software (virtual) Instrument track to record MIDI.
    Normally this is correct, but I have seen a USB interface where the cables were labelled the other way round: IN goes to IN, OUT goes to OUT.
    Crazy situation, but might be worth checking?
    CCT
    That's one I've never seen but have no doubt they exist.
    Perhaps a quality control issue.... Nah! 
    To the O.P.
    Does the Korg Triton have to be enabled for MIDI out?
    As I recall some Korg Triton models had both MIDI ports and a built in USB interface and there was a MIDI Menu where either port could be enabled. It's worth checking into even if your model does not have the built-in USB output.
    Are you using 10.6.8 like your tagline says?

  • App Studio - JSON data source

    Hi,
    I would really like to know how to incorporate JSON as a data source for my App Studio apps... Can someone please point me in the right direction?
    I have tried this example: http://blogs.msdn.com/b/quick_thoughts/archive/2014/07/13/rest-apis-in-app-studio-part-2-changing-the-source-code.aspx but unfortunately can only get it to half work (i.e., it shows only the movie titles, with no images, synopsis, etc.),
    even though when I step through the movie collection the relevant data is there.
    So any advice or other tutorials on the same subject would be greatly appreciated...

    Hi,
    Were you able to resolve your issue?
    I came across the same problem. My JSON has ImageUrl and ArticleUrl, and when I added this JSON schema to display in the detail page of the App Studio app, I kept getting an error in MainPage.xaml:
    "Object reference not set to an instance of an object."
    Can I add two or more different Web API sources in App Studio, or am I limited to just one? Below is the detailed error from MainPage.xaml.
     at AppStudio.Data.HighlightRootSchema.GetHashCode()
       at System.Collections.Generic.GenericEqualityComparer`1.GetHashCode(T obj)
       at System.Linq.Set`1.InternalGetHashCode(TElement value)
       at System.Linq.Set`1.Find(TElement value, Boolean add)
       at System.Linq.Enumerable.<IntersectIterator>d__92`1.MoveNext()
       at System.Linq.Enumerable.<OfTypeIterator>d__aa`1.MoveNext()
       at AppStudio.Data.ObservableCollectionExtensions.AddRangeUnique[T](ObservableCollection`1 oldCollection, IEnumerable`1 newItems)
       at AppStudio.Data.DataSourceBase`1.<LoadDataAsync>d__1.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
       at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
       at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
       at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
       at AppStudio.ViewModels.ViewModelBase`1.<LoadItemsAsync>d__0.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
       at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
       at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
       at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
       at AppStudio.ViewModels.MainViewModel.<LoadDataAsync>d__0.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
       at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
       at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
       at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
       at AppStudio.Views.MainPage.<OnNavigatedTo>d__0.MoveNext()
    App Studio Team, please help. 
    Thanks,
    GP
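    Not an official answer, but the top frame of that stack trace (HighlightRootSchema.GetHashCode) suggests the generated schema class is hashing a property that comes back null when a JSON field is missing. One workaround that has worked for similar generated code is to make GetHashCode (and Equals) null-tolerant. The sketch below is a guess at the shape of the class; Title and ImageUrl are assumed property names, so adapt it to your actual generated members.

    using System;

    // Sketch only: a null-tolerant GetHashCode/Equals pair for a schema class
    // shaped like the one in the stack trace. Title and ImageUrl are assumed
    // property names, not the generated code's actual members.
    public class HighlightRootSchema
    {
        public string Title { get; set; }
        public string ImageUrl { get; set; }

        public override int GetHashCode()
        {
            // Coalesce nulls so AddRangeUnique's set lookups don't throw
            // NullReferenceException when a JSON field is missing.
            int hash = 17;
            hash = hash * 31 + (Title ?? string.Empty).GetHashCode();
            hash = hash * 31 + (ImageUrl ?? string.Empty).GetHashCode();
            return hash;
        }

        public override bool Equals(object obj)
        {
            var other = obj as HighlightRootSchema;
            return other != null
                && string.Equals(Title, other.Title, StringComparison.Ordinal)
                && string.Equals(ImageUrl, other.ImageUrl, StringComparison.Ordinal);
        }
    }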

  • Data submission format - Tab missing?

    Using LiveCycle Designer > Working with Objects > Using objects > Using buttons > To insert a button that sends an email that includes XML data
    The help doc that describes how to insert a button that sends an email including XML data references a 'Submit tab' in the Object palette.
    It's not there!
    Is there such a tab? Are there alternate formats for emailing form data?
    Thanks!
    Peter

    When you insert the Button object onto your form and go to its Object properties (Object tab or Shift+F7), you should see a Field tab with a radio-button list of choices (Regular, Submit, Execute). Choose Submit and you'll get the Submit tab, where you can choose the Submit Format for your button. You can choose XML, XML Data, etc., and then in the Submit to URL box type something like mailto:[email protected]
    Ken

  • Struts portlet data binding

    A Struts project using syntax like
    <c:out value="${bindings.RaeCustomersView1.labels['Role']}"/>
    works fine unless it is deployed as a Struts portlet.
    As a portlet, all the bindings are null.
    This is a major pain!!!
    Does anyone have any ideas?

    OK, just to be more specific.
    If it is a bound control that is modified from the front panel, it works fine even from a remote panel. However, if you modify it through a local variable, it won't have any effect on the bound shared variable.
    For this reason, any changes made to bound indicators won't take effect on the corresponding shared variable either.
    A second workaround is to use the Property Node -> Value, and that will affect the shared variable correctly. However, I read somewhere that we should reduce the number of property nodes to a minimum when planning to use the remote panel, so updating the shared variable directly still seems best.
    Regards

  • Essbase Studio: Regarding data sources

    Hi,
    Just wanted to know one thing about Essbase Studio.
    Does my back-end data source have to have tables/views in a star schema format? Is this mandatory when I use Studio to build/deploy cubes?
    Or can I also use generation views/tables?
    thanks to all.
    Avishek!

    It doesn't have to, but it makes life much easier.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • EDQP: knowledge studio - create data lens

    Hello Experts,
    While creating a data lens from a CSV file, Knowledge Studio recommends creating a number of sample files with a certain number of rows each.
    I have 90k rows in my CSV file and it suggested 20 sample files with 100 records each.
    Little documentation is available on managing the sample files (deleting them, combining them, etc.). What I am looking for is:
    1. the methodology for using these sample files to create the data lens;
    2. with 20 sample files of 100 records each, that is only 2K records - how can I have the created data lens (rules) applied to my total number of rows (90k)?
    3. there are two other files created along with the sample files, 1.baseline.xml and 1.reserve.xml - what are these and how are they useful in creating the data lens?
    Thanks a lot
    aps

    aps,
    How and when to work with sample files depends on how you are implementing EDQP: standalone or with Oracle PIM.
    If you are working with EDQP standalone, the number and type of sample files are first determined by the Item Definitions (Categories) that will be handled within each Data lens.  To determine which and how many item definitions will be handled by each data lens, you will need to have some sort of taxonomy defined.
    Identifying the product taxonomy to use for EDQP is one of the first points of consideration when implementing EDQP, either as a standalone solution or with a non-PIM solution.
    Generally, it is ideal to have no more than 40 data lenses; however, many businesses operate successfully with 10-15 data lenses. Often a data lens may correspond to a Level 1 or Level 2 UNSPSC level. The following factors should be considered when setting up the data lens taxonomy:
    Number of categories per data lens (a recommended maximum of 250 item definitions)
    Mapping to categories that correspond to areas of responsibility within the business
    Generally, the concept of product categories is familiar to most businesses. It often makes sense to use these categories as is, rather than identify a new taxonomy for the purposes of the EDQP solution. However, if the taxonomy is not a product taxonomy (i.e., if it is a display taxonomy), then it is advisable to adopt a product taxonomy.
    There are three points of consideration:
    The item definition structure must be a product hierarchy. See the discussion in chapter 2 for details about this.
    Each leaf node in the taxonomy should differ from other leaf nodes based on attributes (not attribute values). E.g., blue ballpoint pens and black ballpoint pens should not be in different leaf-node categories simply because the Color attribute value differs.
    Once you have determined which categories from your taxonomy will be addressed in the data lens you are working with, you are ready for a strategy on sample files. I will now answer your questions in this context, assuming you have done the above:
    1. The methodology for using these sample files to create the data lens
         The methodology is to create enough sample files for the specific categories to be handled by the data lens. We have empirical evidence that randomizing the data from a large corpus into samples of 100 is a good way to test, refine, and augment the rules. The methodology for the randomized sample data is as follows:
         a. Create the recommended number of sample files for the set of categories to be handled by the data lens being refined.
         b. Open the first sample file of 100 items. The lens (depending on how many rules it has, or on the method of generating the lens, through AutoBuild from PIM metadata or manually top-down) will recognize between 0% and maybe 40% of the items in the sample.
         c. Work on refining and augmenting the rules until close to 100% of the items in sample data file 1 are recognized. When done, open sample data file 2.
         d. What you will notice is that sample data file 2 will start at a much higher recognition percentage than sample data file 1. This is because the randomized file will contain phrases that repeat from sample file 1.
         e. Work on sample data file 2 until you reach close to 100% recognition, and then open sample data file 3.
         f. What you will notice is a pattern: sample data file 3 will open with a higher recognition percentage than file 2 and much higher than file 1. You now see the methodology; as you continue through the sample files, the random files will keep increasing in recognition percentage until the improvement flattens out. At this point, knowledge building has reached a stage where you are no longer getting a good ROI on lens building in terms of increased recognition. This usually peaks between 95% and 100% for various reasons, including the variability or sparsity of the attributes in the data.
    2. With 20 sample files of 100 records each, that is 2K records. How can I have the created data lens (rules) applied to my total number of rows (90k)?
         a. If the 90K rows really are all related to the categories intended to be recognized by the data lens, then yes, starting with 20 sample files is recommended to build the rules. Follow the directions above on processing files. If you don't reach the desired recognition percentage, then point to the reserve file and create another 20 samples; repeat this until your recognition pattern flattens between sample files.
         b. To test the theory that, after you flatten out, the lens will recognize the larger population reflected in the sample files, the system creates two additional file types, Baseline and Reserve. See below for an explanation of each:
    3. There are two other files created along with the sample files, 1.baseline.xml and 1.reserve.xml - what are these and how are they useful in creating the data lens?
         Baseline: The baseline file is usually a file of 1000 rows of randomized data from the larger set. It is used to test the recognition percentage of the data lens against a larger population of the data. Use this to test whether recognition has really flattened. For example, suppose you are on sample data file 8 and it opens at 99% recognition, and random sample file 9 also opens at 99% recognition. This would be an indicator that the rules creation has converged. To test this against a larger sample population, open the baseline file; in most cases it will open at 96-98%, for example. If it does not, it may mean that samples 8 and 9 were anomalies, so you should go back, open sample file 10, and continue building rules.
         Reserve: The reserve file stores 10% of the total population that is not contained in the sample files. It is used to create more sample files which you know do not overlap with the original set.
    I hope this helps. Let me know if you have additional questions.
    Luis
