Best Practice: Data Binding
Every non-trivial Swing application needs to bind GUI elements (windows and components) to data. Ten years ago, in my MFC days, the typical solution was to initialize the GUI with fixed data, show the window, and read the modified data back after the window closed. Swing offers more flexibility (data can be dynamic, for example). But what is the best way to handle data binding?
- Writing custom models, bound directly to data objects?
- Initializing once and reading back after window closure?
- Using the non-standard (but rather interesting) beans binding API?
Certainly there are more solutions, but my intention is not to compile a list of possibilities or to poll opinions on one of them. Rather, I'd like to hear from the experienced Swing pros in this forum: is there a best practice for data binding?
Hi,
I'd say it depends on what kind of data you are binding and whether it is refreshed or not. In our "framework" we use a Properties-based approach for dialogs: the server packs an @Entity into a class similar to Properties (basically a Map<String, Object>), sends it to the client, and the values are put into our customized components. We use the component name as the key; most of our controls are simple extensions of the standard JComponents with a handful of methods from a common interface (like load(), save(), ...). This approach seems to work fine.
For refreshed "lists" we again use our custom "framework", based on sending an initial batch and subsequent diffs; a sort of event queue is responsible for telling everyone what has changed. Because this kind of data tends to be quite large, we send a gzipped binary representation of the source data.
With beans there might be a way to implement both simply by calling getters, but then every single value results in one round trip, which was absolutely unacceptable for us. But I can imagine it working fine on a LAN.
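To make the load()/save() idea above concrete, here is a minimal sketch. The interface and class names (Bindable, BoundTextField) are my own invention for illustration, not the poster's actual framework:

```java
import java.util.HashMap;
import java.util.Map;
import javax.swing.JTextField;

// Hypothetical common interface: each bindable component pulls and pushes
// its value from a Map<String, Object> keyed by its component name.
interface Bindable {
    void load(Map<String, Object> values); // pull the value keyed by component name
    void save(Map<String, Object> values); // push the edited value back
}

class BoundTextField extends JTextField implements Bindable {
    BoundTextField(String name) {
        setName(name); // the component name doubles as the map key
    }
    @Override public void load(Map<String, Object> values) {
        Object v = values.get(getName());
        setText(v == null ? "" : v.toString());
    }
    @Override public void save(Map<String, Object> values) {
        values.put(getName(), getText());
    }
}

public class BindingDemo {
    public static void main(String[] args) {
        // The "Properties"-like payload sent by the server
        Map<String, Object> record = new HashMap<>();
        record.put("customerName", "ACME Corp");

        BoundTextField field = new BoundTextField("customerName");
        field.load(record);        // populate the component
        field.setText("ACME Ltd"); // simulate a user edit
        field.save(record);        // read the edited value back

        System.out.println(record.get("customerName")); // prints "ACME Ltd"
    }
}
```

A dialog would then iterate over its Bindable children, calling load() on open and save() on OK; this is essentially the one-shot init/read-back pattern from the original question, formalized behind an interface.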
Similar Messages
-
How to load best practices data into CRM4.0 installation
Hi,
We have successfully installed CRM4.0 on a lab system and now would like to install the CRM best practice data into it.
If I refer to the CRM BP help site http://help.sap.com/bp_crmv340/CRM_DE/index.htm,
it looks like I need to install at least the following in order to run it properly.
C73: CRM Essential Information
B01: CRM Generation
C71: CRM Connectivity
B09: CRM Replication
C10: CRM Master Data
B08: CRM Cross-Topic Functions
I am not sure where to start and where to end. At a minimum, I need CRM Sales to start with.
Is there just one installation CD or several? Also, are they available in the download area of service.sap.com?
Appreciate the response.
<b>Of course</b> you need to install the Best Practices configuration, or do your own config.
Simply installing CRM 4.0 from the distribution CD/DVD will get you a plain-vanilla CRM system with no configuration and obviously no data. The Best Practices guide you through the process of configuring CRM, and even automate some tasks. If you use some of the CATT processes from the Best Practices you can even populate data in your new system (BP data, or replace the input files with your own data).
In 12 years of SAP consulting, I have NEVER come across a situation whereby you simply install SAP from the distribution media, and can start using it without ANY configuration.
My advice is to work through the base configuration modules first, either by importing the BP config/data or by following the manual instructions to create the config/data yourself. Next, look at what your usage of CRM is going to be, for example Internet Sales, Service Management, et cetera, and then install the config for this/these modules. -
OBIEE Best Practice Data Model/Repository Design for Objectives/Targets
Hello World!
We are faced with a design question that has become somewhat difficult and we need some help. We want to be able to compare side-by-side actual measures with their corresponding objectives/targets. Sounds simple. But, our objectives are static (not able to be aggregated) with multi-dimensionality and multi-levels. We need some best practice tips on how to design our data model and repository properly so that we can see the objective/target for a measure regardless of the dimensions that are used in the criteria and regardless of the level.
Here are some more details:
Example of existing objective table.
Dimension1 | Dimension2 | Dimension3 | Obj1 | Obj2 | Quarter
NULL       | NULL       | NULL       | .99  | 1.8  | 1Q13
DIM1VAL1   | NULL       | NULL       | .99  | 2.4  | 1Q13
DIM1VAL1   | DIM2VAL1   | NULL       | .98  | 2.41 | 1Q13
DIM1VAL1   | DIM2VAL1   | DIM3VAL1   | .97  | 2.3  | 1Q13
DIM1VAL1   | NULL       | DIM3VAL1   | .96  | 1.9  | 1Q13
NULL       | DIM2VAL1   | NULL       | .97  | 2.2  | 1Q13
NULL       | DIM2VAL1   | DIM3VAL1   | .95  | 2.0  | 1Q13
NULL       | NULL       | DIM3VAL1   | .94  | 3.1  | 1Q13
- Right now we have quarterly objectives set using 3 different dimensions. So, if an author were to add one or more (or zero) dimensions to their criteria for a given measure, they could get back a different objective. They could add Dimension1 and get 99%. They could add Dimension1 and Dimension2 and get 98%. They could add all three dimensions and get 97%. They could add zero dimensions (highest grain) and get 99%. Using our existing structure, if we were to add a new dimension to the mix, the possible combinations would grow dramatically. (Not flexible.)
- We would like our final solution to be flexible enough so that we could view objectives with altogether different dimensions and possibly get different objectives.
- We currently have 3 fact tables with 3+ conformed dimension tables and a few unique dimension tables.
Could anyone share a similar situation where you have implemented a data model with the proper repository joins to show side-by-side objectives/targets, where the objectives were static and could be displayed at differing levels with flexible dimensions, as described?
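One way to read the objective table above is that NULL dimension values act as wildcards, and the applicable objective is the most specific matching row. A rough sketch of that resolution logic follows; the class and method names are made up for illustration, and a real OBIEE solution would express this through repository joins and level-based measures rather than code:

```java
import java.util.Comparator;
import java.util.List;

public class ObjectiveLookup {
    static class Row {
        final String d1, d2, d3; // NULL in the table = "applies to any value"
        final double obj1;
        Row(String d1, String d2, String d3, double obj1) {
            this.d1 = d1; this.d2 = d2; this.d3 = d3; this.obj1 = obj1;
        }
        // A row matches when each of its non-null dimensions equals the query value
        boolean matches(String q1, String q2, String q3) {
            return (d1 == null || d1.equals(q1))
                && (d2 == null || d2.equals(q2))
                && (d3 == null || d3.equals(q3));
        }
        int specificity() { // more non-null dimensions = more specific row
            return (d1 != null ? 1 : 0) + (d2 != null ? 1 : 0) + (d3 != null ? 1 : 0);
        }
    }

    // Return the objective of the most specific matching row, or null if none match
    static Double lookup(List<Row> rows, String q1, String q2, String q3) {
        return rows.stream()
                .filter(r -> r.matches(q1, q2, q3))
                .max(Comparator.comparingInt(Row::specificity))
                .map(r -> r.obj1)
                .orElse(null);
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row(null, null, null, 0.99),                    // no dimensions
                new Row("DIM1VAL1", null, null, 0.99),              // Dimension1 only
                new Row("DIM1VAL1", "DIM2VAL1", null, 0.98),        // Dimension1 + 2
                new Row("DIM1VAL1", "DIM2VAL1", "DIM3VAL1", 0.97)); // all three
        // Query with Dimension1 and Dimension2 in the criteria:
        System.out.println(lookup(rows, "DIM1VAL1", "DIM2VAL1", null)); // prints 0.98
    }
}
```

The same "most specific match wins" rule is what makes the flat table hard to extend: every new dimension doubles the candidate combinations, which is exactly the flexibility problem described above.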
Any help would be greatly appreciated.
Hi, yes this suggestion is nice. First configure the sensors (activity or variable), then configure the sensor action as a JMS topic, which will in turn insert the data into a DB. Or, when you configure the sensor action as a DB, the data goes to the Oracle Reports schema. Is there any chance of altering that, for example by changing config files, so that the data doesn't go to the Reports schema and instead goes to a custom schema created by a user? I don't know if it can be done. My problem is that when I configure the JMS topic for the sensor actions, I see blank data coming in; for some reason the data is not getting posted. I have used an ESB and a routing service based on the schema which I am monitoring. Can anyone help?
-
Hi,
what is the best practice to migrate master data on a standalone CRM?
any advice will be highly appreciated.
Cheers
Guest
Hi,
Please read the following threads carefully and you will understand the best method by yourself.
Initial Load on standalone CRM
Inital load on standalone CRM
CRM Master Data
<b>Reward if helps</b>,
Regards,
Paul Kondaveeti -
Question - Best practice data source for Vs2008 and Crystal Reports 2008
I have posted a question here
CR2008 using data from .NET data provider (ADO.NET DATASET from a .DLL)
but I think that perhaps I need general community advice on best practice with data sources.
In Crystal reports I can choose the data source location from any number of connection types, eg ado.net(xml), com, oledb, odbc.
Now, with regard to that post: the reports were all created in Crystal Reports 6.3, upgraded to Crystal XI, and now I'm using the latest and greatest. I wrote the Crystal Reports 6.3/XI reports back in the day to do the following: the reports use a function from a COM object which returns an ADO recordset, which is then consumed fine.
So I don't want to rewrite all these reports, of which there are many.
I would like to know if any developers are actually using .NET Class libraries to return ADO.NET datasets via the method call or if you are connecting directly to XML data via whatever source ( disk, web service, http request etc).
I have not been able to eliminate the problem described in the post mentioned above, which is that the Crystal Report calls the .NET class library method twice before displaying the data. I have confirmed this by debugging the class library.
So any guidance or tips are appreciated.
Thanks
This is already being discussed in one of your other threads. Let's close this one out and concentrate on the one I've already replied to.
Thanks -
Upcoming SAP Best Practices Data Migration Training - Chicago
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
SAP America, Downers Grove in Chicago, IL:
November 3 – 5, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
5. Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
Logistics & How to Register
Nov. 3 – 5: SAP America, Downers Grove, IL
Wednesday 10 AM – 5 PM
Thursday 9 AM – 5 PM
Friday 8 AM – 3 PM
Address:
SAP America – Buckingham Room
3010 Highland Parkway
Downers Grove, IL USA 60515
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data Migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts who perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please use the hyperlink below.
http://service.sap.com/~sapidb/011000358700000917382010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Upcoming SAP Best Practices Data Migration Training - Berlin
YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
Berlin, Germany: October 06 – 08, 2010
Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
Agenda
At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
1. Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
2. Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch jobs, validating, tracing, debugging, and data assessment.
3. Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
4. Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
Logistics & How to Register
October 06 – 08: Berlin, Germany
Wednesday 10 AM – 5 PM
Thursday 9 AM – 5 PM
Friday 9 AM – 4 PM
SAP Deutschland AG & Co. KG
Rosenthaler Strasse 30
D-10178 Berlin, Germany
Training room S5 (1st floor)
Partner Requirements: All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
Cost: Partner registration is free of charge
Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
• Data Migration consultants and IDoc experts involved in data migration and integration projects
• Functional experts who perform mapping activities for data migration
• ABAP developers who write load programs for data migration
Trainers
Oren Shatil – SAP Business All-in-One Development
Frank Densborn – SAP Business All-in-One Development
To register please follow the hyperlink below
http://intranet.sap.com/~sapidb/011000358700000940832010E
Hello,
The link does not work. Is this training still available?
Regards,
Romuald -
Best Practices Data Extract from Essbase
Hi there,
I have been looking for information on Best Practices to extract data from Essbase (E).
Using MDX, I initially wanted to bulk-extract data from Essbase, but that process apparently never finished.
As a second choice, I went for a simulation of interactive access and got ODI generating MDX queries requesting smaller data sets.
At the moment more than 2000 MDX queries are generated and sent sequentially to Essbase. It takes some time ...
Has anyone been using other approaches?
Awaiting reaction.
regards
JLD
What method are you using to extract, and what version are you on, including patch level?
Cheers
John
http://john-goodwin.blogspot.com/ -
Best practice data source from ECC 6.0 for legal consolidation in BPC NW7.5
Hi there,
after scanning every message in this forum for "data source", I wonder whether there is any standard approach from SAP to extract consolidation data from ECC 6.0. I have to say that my customer is not using the New G/L so far, and therefore the great guide "how to get balances from ECC 6.0 ..." does not fully work for us.
Coming from the old world of EC-CS, the first option is to go via the GLT3 table. This option requires clever customization and the need to keep both GLT0 and GLT3 in line. Who has experience with maintaining these tables in a production environment?
We therefore plan to use data source 0FI_GL_4, which contains all line items of the posted financial documents. Does this approach make sense, or will it fail because of performance issues?
Any help is appreciated!
Kind regards,
Dierk
Hi Dierk,
Do you have a need for the level of detail provided by the 0FI_GL_4 extractor? Normally I would recommend going for the basic 0FI_GL_6 extractor, which provides a much more manageable data volume since it only gives the periodic activity and balances as well as a much smaller selection of characteristics. Link: [http://help.sap.com/saphelp_nw70/helpdata/en/0a/558cabb2e19a4db3097b81bba4fd0e/frameset.htm]
Despite this initial recommendation, every client I've had has eventually had a need for the level of detail provided by the 0FI_GL_4 extractor (or the New G/L equivalent - 0FI_GL_14). Most BW systems can handle the output of the line-item extractors without much issue, but you should test using production data and make sure your system sizing takes into account the load.
The major problem you can run into with the line-item extractors is that if your delta somehow gets compromised it can take a very long time (days, sometimes weeks) to reinitialize and this can cause a large load in your ECC and BW system. Also, for the first transport to production, it is important to plan time to initialize the delta.
Ethan -
Best Practice Life Science Pharmaceuticals
Dear all,
I was wondering if there is any news about the Life Science Best Practice for Pharma. The latest Pharma Best Practice dates back to 2007.
Maybe with the 7.0 release there will also be an update of the best practice, but there is nothing to be found regarding this subject.
Anybody got some insight?
Thanks.
Regs,
Ruud
Would love to get some insights too!
Agne
-
HCM Best Practice LOAD - Error in Copy Report variants Step
Dear Friends,
We are trying to use/load the HCM Best Practice data for testing purposes. We have applied the HCM Best Practice Add-On, which went in successfully. When we try to execute the preparation step (in the Copy Variant phase) using transaction /HRBPUS1/C004_K01_01, we get the following error:
<b>E00555 Make an entry in all required fields</b>
Request you to provide some solution for the same.
Thanks and regards,
Abhilasha
Hi Sunny and others,
The main error here was that sapinst couldn't find and read cntrlW01.dbf, because this file was not at that location (/oracle/W01/112_64/dbs).
I already solved this issue... what I did was:
1) As user ora, I went into sqlplus as sysdba and ran the following script (the control.sql script that was generated at the beginning of the system copy process):
SQL> @/oracle/CONTROL.SQL
Connected.
ORA-32004: obsolete or deprecated parameter(s) specified for RDBMS instance
ORA-01081: cannot start already-running ORACLE - shut it down first
Control file created.
Database altered.
Tablespace altered.
2) This is very important: you need to know where the .CTL file created by the script is, so I checked the value of the control_files parameter at that moment:
SQL> show parameter control_files;
/oracle/W01/oraflash/W01/controlfile/o1_mf_6pqvl4jt_.ctl
3) Next, logged in as ora, I copied this file to the path from which sapinst needs to read it:
cp /oracle/W01/oraflash/W01/controlfile/o1_mf_6pqvl4jt_.ctl /oracle/W01/112_64/dbs/cntrlW01.dbf
4) I re-ran sapinst from the point where it had stopped.
Thank you for your help.
Kind regards,
João Dimas - Portugal -
How to check version of Best Practice Baseline in existing ECC system?
Hi Expert,
How can I check the version of the Best Practices Baseline in an existing ECC system, such as v1.603 or v1.604?
Any help will be appreciated.
Sayan
Dear,
Please go to https://websmp201.sap-ag.de/bestpractices and click on Baseline Packages; on the right-hand side you will see which version of the SAP Best Practices Baseline package applies to which release.
If you are on EHP4 then you can use the v1.604.
For "How to Get SAP Best Practices Data Files for Installation" (PDF, 278 KB), please refer to this link:
https://websmp201.sap-ag.de/~sapidb/011000358700000421882008E.pdf
Hope it will help you.
Regards,
R.Brahmankar -
Webinar Invitation: Join us for "SiteStudio Best Practices"
<strong>SiteStudio Best Practices</strong>
Join us for a Webinar on November 13
Space is limited.
Reserve your Webinar seat now at:
https://www1.gotomeeting.com/register/789489722
Joe Duane is a senior principal product manager at Oracle, and former senior principal consultant. As a consultant, Joe spent 8 years architecting and deploying Web Content Management solutions for clients. This presentation will focus on best practice architecture and development with the 10gr3 release of Site Studio.
Title:
SiteStudio Best Practices
Date:
Thursday, November 13, 2008
Time:
12:00 PM - 1:00 PM CST
After registering you will receive a confirmation email containing information about joining the Webinar.
System Requirements
PC-based attendees
Required: Windows 2000, XP Home, XP Pro, 2003 Server, Vista
Macintosh-based attendees
Required: Mac OS X 10.4 (Tiger) or newer
Hi,
Was there some kind of document produced at the conclusion of this webinar? I was unable to attend, so I'm wondering if there is anything I can read on best practices for Site Studio; any link to related information would be useful.
Thank you,
G -
Select One Choice attribute's LoV based on two bind variables, best practice
Hello there,
I am in the process of learning the ADF 11g, I have following requirement,
A page must contain a list of school names, which needs to be fetched based on two parameters; the parameters are student information entered on the previous page.
I have defined a read-only view "SchoolNamesViewRO"; its query depends on two bind variables, :stdDegree and :stdCateg.
I added that read-only view as a view accessor to the entity to which the name attribute belongs, then added an LoV for the name attribute using the read-only view,
added the name attribute as a Select One Choice to page2,
and now I need to pass the values of the bind variables of the read-only view.
The information to be passed as the bind variables is entered on the previous page; I can access the data as binding attribute values in the page2 definition.
I have implemented the following two approaches, but both resulted in an empty list:
* I added an ExecuteWithParams action to the bindings of the page, then defined an invoke action (with a refresh condition) in the executables, and set the default values of the parameters to the attributes' input values.
In the trace I could see that the binding fetches the correct values, as expected, but the select list appears empty. Is this execution of the query actually connected to the list?
* I added a method to the read-only view's Impl Java class to set the bind variables, defined it as a method action in the bindings, and then created an invoke action for it; the select list is also empty.
If the query is executed with the passed variables, why is the list empty? Is it reading data from some place other than the page?
and what is the best practice to implement that requirement?
Would the solution be to set the default values of the bind variables to some kind of expression?
Please note that the query execution had the bound variables set to the correct values (I can see this in the trace).
Would you give me some hints or redirect me to a useful link?
Thanks in advance
Regards,
Please give me an example using a backing bean. For example:
<?xml version='1.0' encoding='UTF-8'?>
<jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
xmlns:f="http://java.sun.com/jsf/core"
xmlns:h="http://java.sun.com/jsf/html"
xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
<jsp:directive.page contentType="text/html;charset=UTF-8"/>
<f:view>
<af:document id="d1">
<af:form id="f1">
<af:selectOneChoice label="Label 1" id="soc1" binding="#{Af.l1}"
autoSubmit="true">
<af:selectItem label="A" value="1" id="si1"/>
<af:selectItem label="B" value="2" id="si2"/>
</af:selectOneChoice>
<af:selectOneChoice label="Label 2" id="soc2" disabled="#{Af.l1=='2'}"
partialTriggers="soc1">
<af:selectItem label="C" value="3" id="si3"/>
<af:selectItem label="D" value="4" id="si4"/>
</af:selectOneChoice>
</af:form>
</af:document>
</f:view>
</jsp:root>
package a;
import oracle.adf.view.rich.component.rich.input.RichSelectOneChoice;
public class A {
    private RichSelectOneChoice l1;
    public A() {
    }
    public void setL1(RichSelectOneChoice l1) {
        this.l1 = l1;
    }
    public RichSelectOneChoice getL1() {
        return l1;
    }
}
Is there any mistake? -
Best practice for saving data in SQL server
Hi all
Hoping for a little help on this question.
If I have a list of fields, e.g. (name, address, postal, phone etc.), then I create a webform/task
to gather some of these fields (name, postal), and then I make another webform/task to gather some other fields (address, phone).
What is the best practice for storing the returned values in SQL Server?
Is it:
1. Make a table with all the fields in the list plus a task id. These fields could be in the
correct format (number, date etc.), and all answers to all tasks are inserted into this table.
2. Make a value table for each field, with the correct type, plus the task id, so all name values
are stored in the "name value table" with the task id.
How would I select values from a certain task with this kind of setup?
3. ??
Best regards
Bo
Hi Atul,
Thanks for your reply. Can you elaborate a bit further on this? I am still a little confused.
Let me try to explain my scenario a bit more:
Say instead that there are 50 fields in a table, each with its own unique ID; maybe an answer table would look like this:
taskid | field_1 | field_2 | field_3 | field_4 | field_n
So no matter which fields the user fills out, the answers can be stored in one table.
The question is: is this a good way to do it, and how do I select from this table using a join?
As far as I know you can't name columns in a table with just numbers, which would have been great (the column names could then be the field IDs).
OR
Would you have 50 tables, each with a field_id and a value (of the correct type)?
And could you give me an example of how to bind and select from this kind of structure?
Also, inserting into 50 tables on a save... is that the right way to go? :)
Best regards
Bo
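For what it's worth, the narrow-table idea discussed above (one (task id, field name, value) row per answer, rather than one wide table or 50 tables) can be sketched in memory like this. The AnswerStore class and field names are made up for illustration; in SQL Server the equivalent would be a single answers(task_id, field, value) table:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the entity-attribute-value option: every answer
// becomes one (taskId, field, value) row, whatever fields the form used.
public class AnswerStore {
    static class Answer {
        final int taskId; final String field; final Object value;
        Answer(int taskId, String field, Object value) {
            this.taskId = taskId; this.field = field; this.value = value;
        }
    }

    private final List<Answer> rows = new ArrayList<>();

    void save(int taskId, String field, Object value) {
        rows.add(new Answer(taskId, field, value));
    }

    // Roughly: SELECT field, value FROM answers WHERE task_id = ?
    Map<String, Object> answersForTask(int taskId) {
        Map<String, Object> result = new LinkedHashMap<>();
        for (Answer a : rows)
            if (a.taskId == taskId) result.put(a.field, a.value);
        return result;
    }

    public static void main(String[] args) {
        AnswerStore store = new AnswerStore();
        store.save(1, "name", "Bo");      // webform 1 gathered name + postal
        store.save(1, "postal", "8000");
        store.save(2, "address", "Main St 1"); // webform 2 gathered address
        System.out.println(store.answersForTask(1));
    }
}
```

The trade-off is the usual one for entity-attribute-value designs: selecting "all answers for a task" is one simple query, but reconstructing a wide row per task requires pivoting, and per-field typing is lost unless you add typed value columns.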