Better design approach

Suppose we have to develop 14 to 15 OAF pages under the same component, meaning all of them will share the same AM. Each page will have its own methods, which it will put in the common AM. If all the pages put their methods into that one AM, it will be difficult to maintain. Following is my approach to solving this problem. Please let me know how to improve this design:
package poc.oracle.apps.ak.poc.server;

import oracle.apps.fnd.framework.server.OAApplicationModuleImpl;

// Root AM shared by all pages; each method delegates to a per-page class.
public class PocAMImpl extends OAApplicationModuleImpl {

    public void page1Method1() {
        PRTPage1AM obj = new PRTPage1AM();
        obj.page1Method1();
    }

    public void page1Method2() {
        PRTPage1AM obj = new PRTPage1AM();
        obj.page1Method2();
    }

    public void page2Method1() {
        PRTPage2AM obj = new PRTPage2AM();
        obj.page2Method1();
    }

    public void page2Method2() {
        PRTPage2AM obj = new PRTPage2AM();
        obj.page2Method2();
    }
}
package poc.oracle.apps.ak.poc.server;

public class PRTPage1AM extends PocAMImpl {

    public PRTPage1AM() {}

    public void page1Method1() {
        System.out.println("inside the page1Method1");
    }

    public void page1Method2() {
        System.out.println("inside the page1Method2");
    }
}
package poc.oracle.apps.ak.poc.server;

public class PRTPage2AM extends PocAMImpl {

    public PRTPage2AM() {}

    public void page2Method1() {
        System.out.println("inside the page2Method1");
    }

    public void page2Method2() {
        System.out.println("inside the page2Method2");
    }
}
~Vikram

Vikram,
I agree with Mani that all of your 15 pages lead to a single business flow (transaction), and all of these pages need the VOs assigned to the AM throughout the flow. If there are not many values that you use on each page, you can also use session variables while retaining the root AM.
--Mukul
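
One alternative (a minimal sketch, not from this thread; the helper class and method names are made up) is to keep the single root AM but move each page's logic into a plain Java helper class that receives the AM, instead of subclassing the AM and instantiating the subclasses with new:

package poc.oracle.apps.ak.poc.server;

import oracle.apps.fnd.framework.server.OAApplicationModuleImpl;

public class PocAMImpl extends OAApplicationModuleImpl {

    // Thin entry points only; the per-page logic lives in the helper classes.
    public void page1Method1() {
        new Page1Helper(this).method1();
    }
}

// Plain class, not an AM: it works against whatever VOs/transaction the root AM exposes.
class Page1Helper {

    private final OAApplicationModuleImpl am;

    Page1Helper(OAApplicationModuleImpl am) {
        this.am = am;
    }

    void method1() {
        // e.g. am.findViewObject("Page1VO") and the page-1-specific processing would go here
        System.out.println("inside page1Method1 (delegated)");
    }
}

This keeps one AM per component, so the transaction and VOs are shared across all pages as Mukul describes, while each page's code stays in its own class.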

Similar Messages

  • Design approach help : BIC Mapping Tool Conversion

    Hi All,
    Design approach:
    We know that the BIC mapping tool can be used for EDI-to-XML conversion. I also know that it is an any-to-any converter.
    But we prefer to use this tool only for EDI to EDI-XML conversion and to do the mapping in SAP PI, as it is easier to change a mapping in PI than to redeploy the mapping in the module location every time.
    My question is:
    We know that the sender file adapter supports FCC for a maximum of 3 hierarchy levels.
    Can we make it a practice to accept requirements in SAP PI that have a complex flat file structure (more than 3 levels), and use the BIC mapping tool to convert the complex flat file structure (with more than 3 levels) into the format PI requires, then proceed with the mapping and configuration? (Note: the assumption here is that the organisation has already procured the BIC mapping tool for some EDI message scenarios.)
    Or is it best practice to ask the source system to supply flat files strictly in a header-with-one-line-item format?
    Suggestions would be deeply appreciated.
    Warm Regards,
    Senthilprakash

    >
    > if you are thinking of using only BIC mapping tool for converting flat file to xml and want to proceed with normal configuration without using BIC modules then it is not feasible.
    >
    Dear Suresh,
    I am very well aware of that. We have the Seeburger SFTP adapter installed in our PI system, and for EDI conversion we use BIC to convert to XML and deploy the module at the adapter level.
    Now my question is: is it the right way to start accepting requirements from source systems that have complex flat file structures (non-EDI messages) and use the BIC module in the adapter to convert them into XML and proceed with the configuration, or is it better to ask the source system to send the flat file in a format accepted by our file adapter FCC?
    What drawbacks/bottlenecks do we see in using the BIC module for converting non-EDI complex flat files into XML format in PI?
    Regards,
    Senthilprakash.

  • J2EE Design Approach

    Hi folks,
    I am not new to Java, but I am new to J2EE. I would like to tap your creativity for suggestions on a design approach. Basically, I would like to create a basic JSP/Servlet-based web application that interfaces with an Oracle database. I would like the JSP pages to be almost completely bereft of logic (i.e. use HTML and JSP tags only), with one or two very small snippets. I would like the servlets to serve as "traffic cops"; that is, state machines that merely redirect to JSPs.
    My most pressing question is the following: as with one of my last J2EE projects, I would like to create a 'DatabaseManager' class. This class would basically "wrap" the JDBC calls to establish and maintain a connection, as well as provide convenient methods for querying (i.e. have a method that simply takes in a String representing the query), and return a data structure (another class) that wraps the "ResultSet" object (I'll call it "QueryResults" for now).
    Should the DatabaseManager and/or QueryResults classes be EJBs?
    Thanks!
    Tom
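
    A rough sketch of the DatabaseManager/QueryResults idea described above, in plain JDBC (DatabaseManager and QueryResults are Tom's working names; the method names and internals here are assumptions). Whether these classes should be EJBs is addressed in the reply below.

        import java.sql.*;
        import java.util.*;

        public class DatabaseManager {

            private final Connection conn;

            // Establishes and holds the connection that the manager "wraps".
            public DatabaseManager(String url, String user, String password) throws SQLException {
                this.conn = DriverManager.getConnection(url, user, password);
            }

            // Takes a query string and returns the results detached from the live ResultSet.
            public QueryResults query(String sql) throws SQLException {
                try (Statement st = conn.createStatement();
                     ResultSet rs = st.executeQuery(sql)) {
                    return new QueryResults(rs);
                }
            }

            public void close() throws SQLException {
                conn.close();
            }
        }

        // Wraps the ResultSet contents as a simple list of column-name -> value maps.
        class QueryResults {

            private final List<Map<String, Object>> rows = new ArrayList<>();

            QueryResults(ResultSet rs) throws SQLException {
                ResultSetMetaData md = rs.getMetaData();
                while (rs.next()) {
                    Map<String, Object> row = new LinkedHashMap<>();
                    for (int i = 1; i <= md.getColumnCount(); i++) {
                        row.put(md.getColumnLabel(i), rs.getObject(i));
                    }
                    rows.add(row);
                }
            }

            public List<Map<String, Object>> getRows() {
                return rows;
            }
        }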

    Use of EJBs would depend on the scalability and security needs of your application. If yours is a very small-scale application with 10-15 users, EJBs won't be a better choice. However, if you need to set a variety of security constraints, such as method-level privileges, you can go for them. Also, complicated transactions should encourage you to use EJBs.
    Just a thought.
    thx.

  • Design approach for custom Fiori application

    Dear Experts,
    Good day to all...!
    I am having a query on finalizing design approach for one of my custom Fiori Application development  using SAP UI5.
    Current application design and features:
    As of now we have an application which is used on laptops. The application structure is SAP R3 --> SUP --> UI (using .NET), i.e. back-end --> middleware --> UI.
    The UI is hosted on an IIS server and the application is a desktop type, so the users can use the application offline as well.
    Once it is connected to the internet, they push all the data back to SAP via SUP.
    Proposal:
    We are planning to migrate the same application to Fiori with the same offline features, extending it to mobiles and other devices.
    I have a few queries here.
    What will be the best approach to deploy the application: SUP (the latest version, SMP) or SAP R3?
    If SAP R3 is used to deploy the app:
    If we choose to deploy the application in R3, how do we support offline usage on mobiles and devices?
    Will HTML5 local storage or IndexedDB be sufficient to support offline usage?
    In this case, shall we drop SUP/SMP, since the application is accessed directly from SAP R3?
    If SUP/SMP is used to deploy the app:
    In this case, do I need to create a hybrid application (wrapping the UI5 files into a hybrid application) to support mobiles and devices as a native application? Correct me if I am wrong. :)
    I hope I can use the SUP/SMP local storage options to support my offline usage? Correct me if I am wrong. :)
    What will be the best option to support desktop offline usage?
    We are yet to take a decision on this.. Please provide your valuable inputs , which will help us to take some decisions.
    Thanks & Regards
    Rabin D

    Hi Anusha,
    considering the reusability aspect, the component approach is the much better one (see also the best practices dev guide chapter regarding components in the SAPUI5 SDK - Demo Kit).
    It allows you to reuse that component in different applications or other UI components.
    I also think that the Application.js approach will not work with Fiori, because the Fiori Launchpad loads the Components.js of the Fiori app in a Component Container.
    Best Regards, Florian

  • Display problems in IE, using Netlace Designs approach

    Could someone please look at
    www.tudo.co.uk/testing/magnification/
    I am trying to adjust some CSS code taken from Netlace Design's website for my purposes. The purpose is to enlarge an image on 'hover' and to enlarge some text on 'hover'.
    What I have done works all right in Mozilla/Seamonkey. However, in IE, the 'text enlargement' does not work at all, and when I am enlarging the image there is ***in IE*** a large grey surplus area at the bottom, which looks like a background area which has turned out to have too much height. I want to get rid of that superfluous grey area.
    How can I make sure it works in BOTH Mozilla and in IE?
    Thanks for your help.
    Adrian

    Hi ijmari,
    I checked out that page and don't see any Spry menubar in it.
    I'm assuming you've reverted back or removed it.
    If you still need/want help, please post the URL to the page
    that uses the menubar and demonstrates the problem.
    --== Kin ==--

  • Design Approach For Integrating Third Party API's

    Our application requires a lot of third-party APIs to be integrated with it.
    What would be a good design approach for this?
    Ritesh.

    I think maybe you should post a little bit more info: What kind of APIs? C++? java?

  • Design approach for XML load into DB table

    The question is more related to find a design approach.
    1. I have a few XML files to load data from. There will be more files which will be loaded into the DB periodically.
    2. Each file populates 4 tables.
    3. After loading all the data each file has to be moved into an archived folder.
    4. We have a STAGE & PROD setup in the DB. The usual approach is to create 2 packages for EACH table.
    One to load data into STAGE
    2nd to move data from STAGE to PROD
    Now, I'm not sure if I should create 4 tables * 2 = 8 packages or just 2 packages (1 for stage + 1 for prod).
    Please suggest!! I know the experts here have already come across such scenarios.
    I wanted to know pros and cons of each approach. As I'm doing something like this for the first time, I can't imagine which will be the best solution.
    Thanks!!!

    I'd create one package.
    I do not see why the load to staging needs to be decoupled from the load to prod.
    When it works as one unit, you can roll back the entire load as a whole, since you may not want to end up with some data partially loaded.
    Arthur
    MyBlog
    Twitter

  • In reponse to "A Better designed SwingWorker"

    This is in response to [A Better designed SwingWorker|http://forums.sun.com/thread.jspa?threadID=5160966&tstart=0] since that thread got locked. I have got it working using a bound property on my common model which my child models listen to.
    Any other ways I could have got it to work ?
    Thanks, Bhishma

    I think the problem is not really specific to swing worker, but just general OO concepts. I'll walk through an example.
    There should be controller classes that listen & publish events to/from the view layer classes.
    So consider a set of classes like the following:
    A view class
    - Responsible for creating / arranging the JComponents.
    - Does not expose its JComponents to model or controller classes.
    - Contains a public method to get the top-level JComponent for display.
    - Fires events (to any registered listeners) with event names like "download", which would be fired when the download button is pressed.
    - It handles events such as "disable menu" and "enable menu", which would tell the view classes to enable its menu items, etc.
    - Does not know anything about the controller or model objects.
    A controller class
    - Responsible for the lifecycle of the view and model objects (instantiation, etc).
    - It keeps track of the model and view objects and controls how they interact. It acts as an intermediary between them to prevent them from talking directly to each other.
    - It listens to view and model events. Thus, it handles view events such as "download", which it would respond to by launching a swing worker.
    - It fires events such as "disable menu" and "enable menu", which is easy to dispatch since it has a reference to the model and view objects.
    - It always creates and launches the swing workers.
    A model class
    - Contains the logic for doing the work and stores the data.
    - For example, this class would contain the method to download a file.
    - The model itself is not responsible for calling the download method or even knowing when to call it.
    An easy way to enforce most of the design is to use three separate packages (view, model and controller). So the controller class in the controller package can only see the public methods on classes in the view and model packages.
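
    A compressed sketch of the wiring described above (hypothetical class and event names; the view's events are modelled here as plain callbacks rather than a full listener framework):

        import javax.swing.*;

        // Model: does the work and stores the data; knows nothing about Swing or the controller.
        class DownloadModel {
            String download(String url) {
                return "contents of " + url;   // the real download logic would live here
            }
        }

        // View: creates and arranges the components; exposes only the top-level component and events.
        class DownloadView {
            private final JButton downloadButton = new JButton("Download");
            private final JPanel panel = new JPanel();

            DownloadView() {
                panel.add(downloadButton);
            }

            JComponent getTopComponent() { return panel; }
            void onDownloadRequested(Runnable listener) {
                downloadButton.addActionListener(e -> listener.run());
            }
            void setControlsEnabled(boolean enabled) { downloadButton.setEnabled(enabled); }
        }

        // Controller: owns view and model, listens to view events, and launches the SwingWorker.
        class DownloadController {
            private final DownloadView view = new DownloadView();
            private final DownloadModel model = new DownloadModel();

            DownloadController() {
                view.onDownloadRequested(() -> {
                    view.setControlsEnabled(false);          // "disable menu"
                    new SwingWorker<String, Void>() {
                        @Override protected String doInBackground() {
                            return model.download("http://example.org/file");
                        }
                        @Override protected void done() {
                            view.setControlsEnabled(true);   // "enable menu"
                        }
                    }.execute();
                });
            }

            JComponent getUI() { return view.getTopComponent(); }
        }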

  • Universe Design approach - Dimensional Data model

    We use a dimensional data model which has about 15 different models based on subject areas, e.g. Billing, Claims, Eligibility, etc. Each model has its own fact table linked to dimensions, some of which are conformed dimensions present in multiple models. We want to build Universes on top of this model for creating Crystal Reports and to expose it to the business users so they can create WebI reports through InfoView.
    The client has already built 15 Universes, one for each subject area, each with 1 fact table and many conformed dimensions plus some junk dimensions. When a report needs data from more than one Universe, we have to link the different Universe queries at the report level.
    The major drawback of this approach is change management. As our data model will be expanded in the future, I have to update multiple Universes when, say, a conformed dimension changes, since the conformed dimension table is present in multiple Universes.
    Now we are considering the approaches below to get a better architectural design and an easier user interface.
    1. Create a master Universe for the dimension tables (there may be some effort to modify the data model to suit linking the dimension tables together), then create derived Universes for each fact table. These derived Universes will be linked back to the common dimension Universe.
    Maintenance will be easier in this approach, as whenever a dimension changes I need not update multiple Universes. But as I am linking Universes at the Designer level as master and derived Universes, I am concerned about report development when a report needs data from multiple Universes; then I would be linking "multiple linked Universe" queries at the report level.
    2. The other option is to combine multiple dimension models (subject areas) into one Universe. This way we create as few Universes as possible, maybe ending up with 5 or 6, but we will have a tough time maintaining security of data elements. For instance, at a high level a Universe may have Billing and Eligibility data, where I have to maintain strict security for the user groups and let only specific users see/use all data elements (objects).
    I hope I have summarized my question well. Any inputs from you on the approaches you are aware of, and their pros and cons in terms of the time it takes to build and the performance of reports (creating WebI reports through InfoView), are appreciated!
    We want to see which approach makes it better for creating Crystal Reports and for the business users, who have little patience waiting for a report and need the best possible interface.

    There is no one perfect answer for your question.  Universes are more of an art than a science imo.  I can tell you that we have many conformed dimensions joined to multiple facts in a single universe.  The key to this approach is that for each fact table you will need a context.  The advantage to this approach is the ease in which your WebBI users will be able to build reports.  The disadvantage is that Crystal Reports cannot handle multiple contexts so your Universe is basically useless in CR.  For CR, you will need to build Business Views rather than universes.

  • Looking for best design approach for moving data from one db to another.

    We have a very simple requirement to keep 2 tables synched up that live in 2 different databases. There can be up to 20K rows of data we need to synch up (nightly).
    The current design:
    BPEL process queries Source DB, puts results into memory and inserts into Target DB. Out of memory exception occurs. (no surprise).
    I am proposing a design change to get the data in 1000 row chunks, something like this:
    1. Get next 1000 records from Source DB. (managed through query)
    2. Put into memory (OR save to file).
    3. Read from memory (OR from a file).
    4. Save into Target DB.
    Question is:
    1. Is this a good approach, and if so, does SOA have any built-in mechanisms to handle this? I would think so, since I believe this is a common problem and we don't want to reinvent the wheel.
    2. Is it better to put the records into memory or to write them to a file before inserting into the Target DB?
    The implementation team told me this would have to be done with Java code, but I would think this would be out of the box functionality. Is that correct?
    I am a SOA newby, so please let me know if there is a better approach.
    Thank you very much for your valued input.
    wildeman
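
    Purely to illustrate the chunking idea in plain JDBC (hypothetical table and column names; autocommit is assumed to be off on the target connection, and the exact paging clause depends on the database). As the reply below notes, a DB adapter or an ETL tool would normally do this for you:

        import java.sql.*;

        public class ChunkedCopy {

            private static final int CHUNK = 1000;

            // Copies source_table to target_table in chunks of 1000 rows, keyed on an
            // ascending id so each pass continues where the previous one stopped.
            public static void copy(Connection src, Connection tgt) throws SQLException {
                long lastId = 0;
                while (true) {
                    int copied = 0;
                    try (PreparedStatement sel = src.prepareStatement(
                             "SELECT id, payload FROM source_table WHERE id > ? "
                             + "ORDER BY id FETCH FIRST " + CHUNK + " ROWS ONLY");
                         PreparedStatement ins = tgt.prepareStatement(
                             "INSERT INTO target_table (id, payload) VALUES (?, ?)")) {
                        sel.setLong(1, lastId);
                        try (ResultSet rs = sel.executeQuery()) {
                            while (rs.next()) {
                                lastId = rs.getLong("id");
                                ins.setLong(1, lastId);
                                ins.setString(2, rs.getString("payload"));
                                ins.addBatch();
                                copied++;
                            }
                        }
                        ins.executeBatch();
                        tgt.commit();              // commit per chunk so memory stays bounded
                    }
                    if (copied < CHUNK) {
                        break;                     // last (possibly partial) chunk done
                    }
                }
            }
        }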

    Hi,
    After going through your question, the first thing that came to my mind is what would be the size of the 20K records.
    If this is going to be huge, then even the 1000-row logic might take significant time to do the transfer, and I think even writing it to a file will not be efficient enough.
    If the size is not huge, then your solution will probably work. But I think you will need to decide on the chunk size based on how well your BPEL process performs. Possibly you can try different sizes and test the performance to arrive at an optimal value.
    But in case the size is going to be huge, then you might want to consider using ETL implementations. Oracle ODI does provide such features out of the box with high performance.
    On the other hand, implementing the logic using the DBAdapter should be more efficient than java code.
    Hope this helps. Please do share your thoughts/suggestions.
    Thanks,
    Patrick

  • About The Repository Design Approach

    Hi All,
    I am working on a project on OBIEE in which the tool is used for ad hoc reporting. I need to create a repository that is generic, meaning a repository that I will hand over to the client.
    This repository should be flexible enough that they can create various reports based on their requirements.
    I am not clear whether it is possible to create a repository that is generic. Is this possible?
    Is there any approach to the design that will be both flexible and generic?

    I think you can, because you can build a lot of intelligence into the business model and mapping layer.
    My approach is always to make it as generic as possible first, and once I am ready with that, to look at how to make it more flexible, so the end users can work better with Answers.
    However, have a look on:
    http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/bi_admin/biadmin.html
    Here you can see how to make a repository: first they build the model (generic),
    and after that they add columns like "profit month ago" so the end users have more options in Answers (flexible).
    I think the business model and mapping layer is very powerful, but you must know how to use it ;)
    So building a repository is not hard, but the challenge is in building a good repository!
    Success!

  • Design Approach for 1:N Multimapping scenario with SAP ECC Receiver

    Hi Experts,
    I am trying to find the best approach to implement the following scenario. It is described as follows:
    Legacy Database -> XI -> SAP scenario.
    1. Pick records from the database table with status = 'n'.
    2. The records picked can have one or more RefNos, i.e. if 2000 records are picked, 1500 can have RefNo 1111 and the remaining 500 RefNo 2222.
    3. The 1st condition is to split the records into multiple messages if the reference numbers are different, so for the scenario in point 2 it will be 2 messages. Secondly, if the number of records is more than 1000, then split further into 2 messages. So the 1500 records would be split into messages of 1000 and 500 records. Hence we get 3 messages. I know something similar was achieved in this blog:
    /people/claus.wallacher/blog/2006/06/29/message-splitting-using-the-graphical-mapping-tool
    4. Once the data is forwarded to ECC and successfully updated, we need to update the status of the database table on the sender side to 'y'.
    The points where I am not very clear as of now are:
    1. What alternative do I choose for the PI -> ECC call? It is a 1:N multimapping split scenario, and as far as I know the call has to go via the AE, so is RFC the only option, or is there a way to make the call with a Proxy/IDoc? Or is there an option to limit such a scenario at the JDBC adapter level so that the split is not needed? I am not sure if that can be achieved and whether it would be a better solution.
    2. If I go for either of the approaches mentioned above, what is the best way to achieve point 4, i.e. update the database table at the sender? The JDBC adapter would be async, so we either use an Async-Sync bridge (for a Proxy/RFC receiver) or some other option like triggering an outbound interface from the SAP side with the update data?
    I hope the experts can provide inputs on the best way forward. Let me know if anything is missing from the scenario details.
    Best Regards,
    Pratik
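
    Just to make the split rule in point 3 concrete, here is a standalone illustration with made-up record fields (in PI itself this would be done in the mapping, e.g. as in the blog linked above):

        import java.util.*;
        import java.util.stream.Collectors;

        public class MessageSplitter {

            record Rec(String refNo, String payload) {}

            // Group records by RefNo, then cut each group into messages of at most maxPerMessage.
            static List<List<Rec>> split(List<Rec> records, int maxPerMessage) {
                Map<String, List<Rec>> byRefNo = records.stream()
                        .collect(Collectors.groupingBy(Rec::refNo, LinkedHashMap::new, Collectors.toList()));
                List<List<Rec>> messages = new ArrayList<>();
                for (List<Rec> group : byRefNo.values()) {
                    for (int i = 0; i < group.size(); i += maxPerMessage) {
                        messages.add(group.subList(i, Math.min(i + maxPerMessage, group.size())));
                    }
                }
                return messages;   // the example in point 2 yields 3 messages: 1000 + 500 + 500
            }
        }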

    Pratik,
    2. If I go for either of the approaches mentioned above, what is the best way to achieve point 4, i.e. update the database table at the sender? The JDBC adapter would be async, so we either use an Async-Sync bridge (for a Proxy/RFC receiver) or some other option like triggering an outbound interface from the SAP side with the update data?
    Use the solution # 2, make it like this:-
    DB (async) -> PI -> Proxy (sync) -> PI -> DB (async)
    Just out of curiosity, why are you breaking the records into chunks of 1000 in PI? Why not select only 1000 records when polling the DB? That will help improve the overall performance.
    Regards,
    Neetesh

  • Flash Builder 4 - Beta / Design view sporadic at best

    Okay, so I have installed the beta, and running through the Catalyst / Flash Builder 4 tutorial, all is well, until I try to get the imported fxp file to consistently display in design mode. Not only that, but switching from code view to design view will break it just about 95% of the time, and then it will not display the created items that the tutorial has you put on them (i.e. the data binds to the names, etc.).
    Has anyone else experienced this? Other than a few little things like this, so far so good, guys (ADOBE).
    Thanks,
    Craig Newroth

    eh_adobe wrote:
    Please be aware that using a newer SDK with Flash Builder Beta's Design View may have unpredictable results. I can't say specifically since I don't know what version you tried and what your document contained, but... let's pretend for a second that the SDK changed something in a component - its name, its package, or any method or properties in it.  Design View was compiled with a specific version of the SDK, and would not know about that change, and so might not react very well to the document. Yes this is not ideal, but it's hard to avoid when the SDK is changing rapidly (at some point it will settle down, and this won't be an issue). We can look at ways to mitigate this.
    I have had less unpredictable results with recent builds of the SDK, the latest being build 7988. I basically update from the SVN trunk every 2-3 days, look through the changes, run the checkintests just to make sure all is OK, then see what happens to my small testbed apps; I have been doing this for several months. I don't mind the IDE not dealing with the changes as long as it tells me why it can't, but the code should not break the IDE. Also, I'm getting a little confused, as some of the responses to other issues (not the design mode one) in this forum have suggested seeing what happens with updated SDKs.
    Anyway, I have reverted everything back to a standard FB4 install, and the first thing that happened is that yet another new flaw in design mode rears its head: no errors, but this time a visually definable issue. The component appears as a 1-dimensional line in the top left hand corner of the design editor window, not even on the design view working canvas. Attached is a screen shot and the associated code that caused it. Please note the IDE didn't draw the big red ring around the visual glitch, so that's one less bug to worry about.
         the app.
    <?xml version="1.0" encoding="utf-8"?>
    <s:Application xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" xmlns:mx="library://ns.adobe.com/flex/halo" minWidth="1024" minHeight="768">
         <s:HSlider x="50" y="50" width="200" height="30" skinClass="skins.HSliderSkin"/>     
    </s:Application>
         skin part A.
    <?xml version="1.0" encoding="utf-8"?>
    <!--
         ADOBE SYSTEMS INCORPORATED
         Copyright 2008 Adobe Systems Incorporated
         All Rights Reserved.
         NOTICE: Adobe permits you to use, modify, and distribute this file
         in accordance with the terms of the license agreement accompanying it.
    -->
    <!--- The default skin class for the Spark HSlider component. The thumb and track skins are defined by the
    HSliderThumbSkin and HSliderTrackSkin classes, respectively. -->
    <s:SparkSkin xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark"
           minHeight="11" minWidth="100"
           alpha.disabled="0.5">
        <fx:Metadata>
        <![CDATA[
      /**
       * @copy spark.skins.default.ApplicationSkin#hostComponent
       */
             [HostComponent("spark.components.HSlider")]
        ]]>
        </fx:Metadata>
        <fx:Script>
            /* Define the skin elements that should not be colorized.
               For slider, the skin itself is colorized but the individual parts are not. */
            static private const exclusions:Array = ["track", "thumb"];
            /**
             * @copy spark.skins.SparkSkin#colorizeExclusions
             */
            override public function get colorizeExclusions():Array {return exclusions;}
        </fx:Script>
        <s:states>
             <s:State name="normal" />
             <s:State name="disabled" />
        </s:states>
        <fx:Declarations>
         <!--- Defines the appearance of the Slider's DataTip. To customize the DataTip's appearance, create a custom HSliderSkin class. -->
            <fx:Component id="dataTip">         
                <s:MXMLComponent minHeight="24" minWidth="40" y="-34"> 
                    <s:Rect top="0" left="0" right="0" bottom="0">
                             <s:fill>
                                  <s:SolidColor color="0x000000" alpha=".9"/>
                             </s:fill>
                             <s:filters>
                                 <s:DropShadowFilter angle="90" color="0x999999" distance="3"/>
                            </s:filters>
                        </s:Rect>
                        <s:SimpleText id="labelElement" text="{data}"
                             horizontalCenter="0" verticalCenter="1"
                             left="5" right="5" top="5" bottom="5"
                             textAlign="center" verticalAlign="middle"
                             fontWeight="normal" color="white" fontSize="11">
                    </s:SimpleText>
                </s:MXMLComponent>
            </fx:Component>
         </fx:Declarations>
        <!--- Defines the skin class for the HSliderSkin's track. The default skin class is HSliderTrackSkin. -->
        <s:Button id="track" left="0" right="0" top="0" bottom="0"
                  skinClass="skins.HSliderTrackSkin"/>
        <!--- Defines the skin class for the HSliderSkin's thumb. The default skin class is HSliderThumbSkin. -->
        <s:Button id="thumb" top="1" bottom="1" width="{hostComponent.height*1.5}" skinClass="skins.HSliderThumbSkin"/>
    </s:SparkSkin>
         Skin Part B
    <?xml version="1.0" encoding="utf-8"?>
    <!--
         ADOBE SYSTEMS INCORPORATED
         Copyright 2008 Adobe Systems Incorporated
         All Rights Reserved.
         NOTICE: Adobe permits you to use, modify, and distribute this file
         in accordance with the terms of the license agreement accompanying it.
    -->
    <!--- The default skin class for the thumb of a Spark HSlider component. -->
    <s:SparkSkin xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark">
        <fx:Metadata>
        <![CDATA[
      /**
       * @copy spark.skins.default.ApplicationSkin#hostComponent
       */
             [HostComponent("spark.components.Button")]
        ]]>
        </fx:Metadata>
        <s:states>
            <s:State name="up" />
            <s:State name="over" />
            <s:State name="down" />
            <s:State name="disabled" />
        </s:states>
        <!-- border -->
        <s:Rect left="0" right="0" top="0" bottom="0" radiusX="{hostComponent.height/2}" radiusY="{hostComponent.height/2}">
            <s:fill>
                <s:SolidColor color="0x4F4F4F"  />
            </s:fill>
        </s:Rect>
         <!-- fill -->
         <s:Rect left="0.5" right="0.5" top="0.5" bottom="0.5" radiusX="{hostComponent.height/2}" radiusY="{hostComponent.height/2}">
              <s:stroke>
                   <s:LinearGradientStroke rotation="90" weight="1">
                        <s:GradientEntry color="0x000000" alpha="0" />
                        <s:GradientEntry color="0x000000" alpha="0.33" />
                   </s:LinearGradientStroke>
              </s:stroke>
              <s:fill>
                   <s:LinearGradient rotation="90">
                        <s:GradientEntry color="0xFFFFFF"
                                          color.over="0xE5E5E5"
                                          color.down="0x999999" />
                        <s:GradientEntry color="0xD8D8D8"
                                          color.over="0x7D7D7D"
                                          color.down="0x555555" />
                   </s:LinearGradient>
              </s:fill>
         </s:Rect>
         <!-- highlight -->
    </s:SparkSkin>
         Skin Part C
    <?xml version="1.0" encoding="utf-8"?>
    <!--
         ADOBE SYSTEMS INCORPORATED
         Copyright 2008 Adobe Systems Incorporated
         All Rights Reserved.
         NOTICE: Adobe permits you to use, modify, and distribute this file
         in accordance with the terms of the license agreement accompanying it.
    -->
    <!--- The default skin class for the track of a Spark HSlider component. -->
    <s:SparkSkin xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark">
        <fx:Metadata>
        <![CDATA[
      /**
       * @copy spark.skins.default.ApplicationSkin#hostComponent
       */
             [HostComponent("spark.components.Button")]
        ]]>
        </fx:Metadata>
        <s:states>
            <s:State name="up" />
            <s:State name="down" />
            <s:State name="over" />
            <s:State name="disabled" />
        </s:states>
         <!-- border -->
        <s:Rect left="-1" right="-1" top="0" bottom="-1" radiusX="{hostComponent.height/2}" radiusY="{hostComponent.height/2}">
            <s:fill>
                <s:LinearGradient rotation="90" >
                     <s:GradientEntry color="0x000000" alpha="0.55" />
                     <s:GradientEntry color="0xFFFFFF" alpha="0.55" ratio="0.8" />
                </s:LinearGradient>
            </s:fill>
        </s:Rect>
        <!-- fill -->
        <s:Rect left="0" right="0" top="1" bottom="0" radiusX="{hostComponent.height/2}" radiusY="{hostComponent.height/2}">
            <s:fill>
                <s:SolidColor color="0xCACACA" />
            </s:fill>
        </s:Rect>
         <!-- hit area -->
         <s:Rect left="0" right="0" top="0" bottom="0" radiusX="{hostComponent.height/2}" radiusY="{hostComponent.height/2}">
              <s:fill>
                   <s:SolidColor alpha="0"/>
              </s:fill>
         </s:Rect>
    </s:SparkSkin>

  • Which is better design: writing a file or maintaining a list?

    Hi everybody,
    I'm just looking for a quick recommendation from the design perspective. For a group of 20,000 strings or more, if they need to be edited, is it better for efficiency and speed to save them in a list within a program, or to write a file containing the strings, and then access them from there? If you have an opinion on that I would really appreciate it.
    Thanks!
    Jezzica85

    jezzica85 wrote:
    > So, if I definitely don't need a separate file, and I may not need a list, is there an easy way to read multiple lines with the same file reader?
    The easiest way to read individual lines from a java.io.FileReader is to wrap it in a BufferedReader and use readLine(). Like this:
        BufferedReader in = new BufferedReader(myFileReader);
        String line;
        while ((line = in.readLine()) != null) {
            // do stuff with line
        }
    > Basically, I need to determine which lines are in my final file both by knowing the line I'm reading and the lines before and after it.
    If you can only tell whether you need a line based on the lines that come after it, then you're going to have to store the line while you read the subsequent lines. In that case, a list will be fine. (Well, you can also use a regular scalar String-typed variable, but chances are you're going to be storing multiple lines at a time, thus a List.) If you're only going to make the determination based on lines that came earlier, then you don't need a list. You might want to use boolean or other types of variables to keep track of whether you need to keep a line.
    For example, if the only criterion is that spans of blank lines get replaced by a single blank line, then you could probably get by with a single String variable. Stick the current line into a variable, then get the next line. If both the new line and the previous line are blank, throw away the new line. Otherwise, print the old one and put the new one in the old one's place in the variable.
    It will all depend on your program requirements. Just try to keep things simple and don't build more than you need.
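
    A small self-contained sketch of the blank-line example from the last paragraph (the file name is made up, and a boolean flag is used instead of holding the previous line, which is enough for this particular rule):

        import java.io.*;

        // Copies a file to standard output, collapsing runs of blank lines into a single blank line.
        public class CollapseBlankLines {
            public static void main(String[] args) throws IOException {
                try (BufferedReader in = new BufferedReader(new FileReader("input.txt"))) {
                    boolean previousBlank = false;
                    String line;
                    while ((line = in.readLine()) != null) {
                        boolean blank = line.trim().isEmpty();
                        if (blank && previousBlank) {
                            continue;              // throw away the extra blank line
                        }
                        System.out.println(line);
                        previousBlank = blank;
                    }
                }
            }
        }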

  • Need advice about design approach for query editing tool with JSF

    Hi !
    I would like to offer in my application a way to allow end users to create queries that can be executed on some tables. I suppose this kind of thing is not new, and I would like to know if someone has a good design practice or example for doing this with JSF?
    I think two approaches are possible:
    1/ the user specifies the complete query from the start and gets the final result when the query is executed in the background
    2/ the user specifies the query in an interactive way: they specify one criterion and get its results, then on those results specify another criterion and get new results, then specify another criterion, and so on.
    Also, for information, I use Hibernate as the database framework.
    Any advice or starting point would be very much appreciated.
    Thanks !
    B.D.

    No one could advise on this?
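
    For the interactive option (2/), one possible sketch with the Hibernate Criteria API (the entity and property names are placeholders; a JSF backing bean would call addEqualsCriterion each time the user adds a filter and redisplay the returned list):

        import java.util.List;

        import org.hibernate.Criteria;
        import org.hibernate.Session;
        import org.hibernate.criterion.Restrictions;

        public class InteractiveQuery {

            private final Criteria criteria;

            public InteractiveQuery(Session session, Class<?> entity) {
                this.criteria = session.createCriteria(entity);
            }

            // Each call narrows the query by one criterion and returns the current results.
            public List<?> addEqualsCriterion(String property, Object value) {
                criteria.add(Restrictions.eq(property, value));
                return criteria.list();
            }
        }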
