SOA real-time design question

Hi All,
We are currently working with SOA Suite 11.1.1.4. I have a SOA application requirement to receive a real-time feed for six data tables from an external third party. The implementation consists of five one-way operations in the WSDL that populate the six database tables.
I have a design question. The organization plans to use this data across various departments, which requires replicating or supplying the data to other internal databases.
As I understand it, there are two options:
1) Within the SOA application, fork the data hitting the web service out to the different databases.
My concern with this approach is: what if departments keep coming with such requests and I keep forking the same data out to more and more internal databases? This feed has to be real-time; too much forking will impact performance and create unwanted dependencies on this critical data-supply link.
2) Tell the other internal projects to get the data from the populated main database.
My concern here is that, firstly, the data is pushed into this database flat, without any constraints, so it is difficult to query for specific data. This design was deliberately chosen to facilitate real-time performance. Also, having every internal project read from the main database will affect its performance.
Please suggest which approach I should take (advantages/disadvantages). Apart from the above two solutions, is there any other recommended way to mitigate the risks? This link between our organization and the external party is something of a lifeline for BAU, so I certainly don't want to create more dependencies and overhead.
Thanks

I had tried implementing the JMS publisher/subscriber pattern before; unfortunately, performance was not as good as writing directly through the DB adapter. I suspect the organization's SOA infrastructure is not set up to cope with the number of messages coming through from the external third party. Our current setup consists of three WebLogic servers (Admin, SOA, BAM), all running on one machine with only 8GB of physical RAM. Is there an Oracle guideline for sizing infrastructure for a SOA application receiving roughly 600,000 messages a day? I am using SOA 11.1.1.4. The JMS publisher/subscriber pattern just does not cope, and I see significant performance lag after a few hours of running. The JMS server used was WebLogic JMS.
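For what it's worth, the decoupling that the publish/subscribe pattern buys (adding a new departmental consumer without touching the real-time ingest path) can be sketched language-agnostically. This is only a minimal in-process illustration of the pattern, not the WebLogic JMS API; all names here are hypothetical, and a real JMS topic would deliver asynchronously with persistence:

```python
# Minimal sketch of topic-based fan-out (hypothetical names, not JMS API).
# The ingest path publishes once; each department subscribes independently,
# so adding a new consumer never changes the feed handler itself.

class Topic:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        # register a consumer callback (stand-in for a durable subscriber)
        self.subscribers.append(handler)

    def publish(self, message):
        # deliver to every subscriber; real JMS does this asynchronously
        for handler in self.subscribers:
            handler(message)

feed = Topic()

# stand-ins for per-department database writers
dept_a_db, dept_b_db = [], []
feed.subscribe(dept_a_db.append)
feed.subscribe(dept_b_db.append)

feed.publish({"table": "T1", "row": 42})
```

The point of the sketch is that the cost of a new consumer is one `subscribe` call, not another fork inside the web-service flow; whether the broker keeps up is then purely an infrastructure-sizing question.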
Thanks
Edited by: user5108636 on Jun 13, 2011 4:19 PM
Edited by: user5108636 on Jun 13, 2011 7:03 PM

Similar Messages

  • Real time interview questions in XI

    Hi guys,
Can anyone send me real-time interview questions in XI to my mail id [email protected]? Points will be awarded.
    Thanks in advance

Hi, check some of the FAQs below. Some of them are not answered.
1. Which of the following are components of XI?
MDM, Adapter Framework, RWB, SLD, IS

2. What is an XI Pipeline?
    3. Source element occurs once whereas the target element it is mapped to is produced 3 times when the mapping is executed. Why does this happen?
    4. A context object is used in place of what?
5. What is a UDF? What are the mandatory functions that you use in a Java-based UDF?
a. Init(), Execute(), Destroy(), Run(), SetParameter()
6. ABAP mapping is implemented using what?
7. When you don’t find the ABAP mapping option in IR, what do you do?
8. Any of the different mapping types (Java, MM, XSLT, ABAP) can be called in any order in an Interface Mapping? True or False
9. Is 'true' case-sensitive in a Boolean function, and can 1 be interpreted as Boolean TRUE?
10. What are Context Changes?
11. What are the protocols that the Mail adapter supports?
12. Why is SAP BC used?
    13. WSDL representation of a Message Interface is used to generate what kind of proxies?
    a. ABAP Proxies
    b. Java Proxies
    c. Neither ABAP nor Java
    d. Both ABAP and Java proxies
    14. Would you configure a Sender IDOC communication channel?
    15. You are required to upload additional libraries for the JMS adapter. How would you do it?
    16. QoS that a Sender JDBC communication channel supports
    17. What are the transport protocols a JMS adapter supports?
    18. Would you configure the Integration Server as a Logical system in a scenario where IDOCs are being sent from a SAP R/3 system to XI?
    19. Why do we specify the Logical System name in the SLD?
    20. The pre-requisites for sending IDOCs to an XI system
    a. Connection parameters must be maintained in SWCV
    b. User must have administration rights in XI
    c. The IDOC metadata must be imported into IR
21. You need to post a transaction using an RFC. How would you accomplish this?
a. Use an async BAPI call with implicit Commit
b. Use an async BAPI call with explicit Commit
22. What is the PCK? What is the necessity for a PCK?
23. In a company, the Central Adapter Engine is installed close to the business partner site. Why do you think this is done?
    24. The flow of a message entering the Adapter Engine from Integration Server is--
    a. It is queued, processed using module processors and then posted to the backend application
    b. It is processed using module processors, queued and then posted to the backend application
    25. Is the persistence layer used by the Adapter Engine and the Integration Engine (Integration Server) same?
    26. Is the Message ID specified in the Integration Engine same as the Message ID used during the Message transformation in the Adapter engine?
    27. Would you configure a Sender HTTP adapter?
    28. QoS in case of a RFC Receiver adapter
    29. Sync-Async bridge is used for?
    30. A Business Process is
    a. Executable cross component
    b. Can send and receive messages
    31. What is the purpose of a deadline branch
    32. What is SXI_CACHE used for?
    33. Container elements can be typed to what ?
    34. Why is a Wait step used?
    35. A block can have which of the following?
    a. Multiple Exception branches
    b. Multiple Condition branches
36. For which step types can you use a Correlation?
    37. Which of the following is true?
    a. Blocks can be Nested
    b. Blocks can be Overlapped
    38. You need to collect and club messages in a container element coming from different steps. How would you do this?
    39. In case of a Block, which of the following is true?
    Elements of a super container are visible in sub-containers
    Elements of a subordinate container are not visible in all blocks
    Elements defined in the process container are visible in all blocks
40. What are Correlation & Local Correlation?
41. Where can you use an N:M transformation?
    42. Alert framework uses/leverages CCMS?
    43. If you want to cancel a process and set its status to ‘Logically Deleted’ when a Deadline is reached, do you need to use a Control Step having its Action as ‘CancelProcess’ or is it automatically done?
    44. What are the ways an Exception can be triggered?
    45. What would be the best architecture after implementing SAP XI? Implementing EDI adapter(s).
    46. How to run the Adapter engine as a service?
    47. How SAP Netweaver supports a holistic approach to BPM (Business Process Management)?
    48. What is the role of SAP XI?
    49. How can we differentiate SAP XI from Business Connector (BC)?
    50. How to send mail from SAP XI?
    51. What are the migration steps from XI 2.0 to XI 3.0?
52. Does XI support both synchronous and asynchronous communication?
    A. Yes
    B. No
    53. Integration server contains the following components?
    A. Additional integration services
    B. Integration Engine
    C. Business Process Engine
    D. Integration Repository
    54. Integration Repository provides the following components?
    A. Business processes
    B. Mapping Objects
    C. Components at design time
    D. Imported objects
    55. What is the usage of Web Application server in XI?
    56. What is the use of RFC and IDOC Adapters in XI?
57. How do you convert WSDL (Web Services Description Language) to the target language?
58. What is the component to generate Java classes?
    59. What is ESA (Enterprise Service Architecture)?
    60. What are the key elements of ESA?
    61. How to transport SLD, Integration Directory & Integration Repository objects to the Production system?
    62. Can we import XSD Schemas into XI 2.0?
63. Which API do you use for Java mapping?
64. You use a context object in place of what?
65. To make a non-mandatory node mandatory, what should you do?
66. In case of RFC communication, the sender system sends an RFC call but the target system does not receive it. What do you think went wrong?
67. Difference between an XI business process and a workflow?
68. When you use transaction SXMB_MONI for process monitoring, which field tells you that the entry is for a business process?
69. What are the different XI components?
70. In which places can you use a receive process?
71. In which steps can you activate correlation?
1. How many interfaces have you developed in your project?
2. What is land accepted?
3. What is your team size?
4. What is the work assignment procedure in your organization?
5. What is your complete company object?
6. What is the necessity of developing that scenario?
7. What is the advantage over other integration tools?
8. What is a Sender Agreement?
9. What is a Receiver Agreement?
10. Tell me the steps for Multiple IDoc to File (BPM scenarios).
11. Tell me the steps for File to Multiple IDoc (BPM).
12. How do you create Alerts in BPM?
13. How do you use third-party adapters in your project?
14. How do you use External Objects?
15. What is the use of Node Functions in XI? (example)
16. Examples of RFC lookups?
    regards,
    Brahmaji.

  • Real time interview questions

    Hi gurus
Can any one of you please give answers to the following real-time interview questions.
1. How do you implement a new object required by your client, and what is the process up to the Production server?
2. What is the role played by RFC in SAP systems?
3. What are the initial things done when you are put into the team?
Please do the needful.
Thanking you
Sure, I will assign full points

1. _How do you implement a new object required by your client, and what is the process up to the Production server?_
- Develop the new InfoObject
- Test the InfoObject in Quality (if you have one in your Landscape)
   - To test in Quality, first you need to TRANSPORT the object to Quality (Transport Connection - Object - Transport (assign to a Transport Request) (truck icon))
   - Once the object is tested fine without any bugs/errors,
- Transport the object from DEV to PROD
  - Transport Connection - Object - Transport (truck icon)
   - SE09 (release the particular request)
   - STMS (import the request to PROD) (can be done by BASIS)
2) _What is the role played by RFC in SAP systems?_
    -  RFC (Remote Function Call) is used to help establish the connection between BW and the source system.
    -  You set up the RFC in SM59.
3) _What are the initial things done when you are put into the team?_
    - This literally depends on the type of project you are deployed to (Implementation/Support).
Hope this stuff helps you out.
    cheers

  • Experimenting with FCP/ Real time rendering question

    Hello, I am preparing to purchase a Mac Pro and spent some time at the local mac affiliate store playing with FCP and motion.
    In fcp I took two clips and dropped them in the timeline, adding the 3d cube transition, and some filters to both clips.
I was looking at the performance of real-time rendering on the base config of the MP I plan on purchasing.
    I then added another clip above these two, intending on dropping the opacity to further push the RTR (real time rendering). This clip overlapped the transition of the two other clips in v1 and v2.
    When attempting to play the timeline the viewer window now said "Not Rendered"
    My question is:
    Is there a limit to how many tracks can be run under the RTR? Why did I get this message?
My main reason for the move to a Mac Pro is to get away from constantly pre-rendering.
    Thanks everyone in advance!

    Sounds like you know what you're doing and what you expect. A factor that has not been mentioned yet, besides RAM, prefs and drive speeds, is your video format.
    Also, the new rev of FCP is taking advantage of tons of processing overhead, vastly increasing the amount of stuff you can do in real time. However, real time is largely a preview operation, regardless of the machine or system, unless you are using proprietary codecs, such as Media 100, that might be hardware-assisted. Output is almost always a rendering issue, multiple stacked or processed layers require the creation of new media.
    The only way you're going to be able to decide if Macintosh and FCP are right for you is to demo thoroughly and that requires using your own footage.
    bogiesan

  • Real Time kernel questions

I decided to compile myself a new kernel using a PKGBUILD from AUR, which contains Ingo Molnar's real-time patch. I know that there are some entries in the wiki about this (Kernel Compilation with ABS, Custom Kernel Compilation with ABS, Kernel Patches and Patchsets), but they don't answer all of my doubts.
The major question is related to my graphics card's drivers. I'm a happy Radeon user using the fglrx proprietary drivers. At the moment they are working fine on a vanilla kernel from [core]; the problem is whether I would have to do some voodoo to make them work on an rt-kernel. I know what is written in the wiki about installing Catalyst drivers on a custom kernel, but when on Mandriva I tried to install 8.1 on my rt-kernel, I had to do some tricks and recompile fglrx to make it work. Has anyone tried installing Catalyst drivers ver. 8.2 on a real-time kernel from AUR?
Another thing: I have a .config file from the previous rt-kernel. Would it be suitable for 2.6.24?
Finally, I recently read about the grsecurity patchset. Has anyone tried it on Arch, especially on a real-time kernel? Any problems?
    Waiting for suggestions

    broch wrote:first I hope that you understand that RT kernels on desktops are slower than preempt default or vanilla kernel?
    I'm using one on my Mandriva 2008.0 and didn't notice any slowdown.
    broch wrote:second: I use grsec (or RSBAC) kernels.
grsec will not work with anything messing with memory (even Kolivas' patches had issues with grsec in the past). This can be fixed, but then what is the point of enabling a hardened kernel with disabled features?
I simply patch vanilla with grsec.
Remember that some PaX options are incompatible with X (all explained when the option is selected, so there's no way to make a mistake).
Finally, it works well: lowest overhead, best speed, and easiest to manage among kernel hardening patches (I do not count AppArmor, as that is a rather limited tool).
    If I understood you, then it's a choice between real time and more security, right?

  • SOA - real time proxy to web service scenario

    Hello Experts,
I am working on PI but am really very new to the SOA concept.
But I have to prepare a document which explains all the steps that we need to follow (technically) for a SOA implementation.
We are going to implement one Proof of Concept for SOA before going for the real project.
Can someone please help me and tell me all the things I need to do for a Proxy to Web Service scenario in PI.
I hope I have made myself clear about it.
    Thanks,
    Hetal

    Hi Gaurav,
    Thanks for your advice.
I have one more question; I tried to explore SDN a lot but didn't get a satisfactory answer.
    here is the situation:
    My scenario is like :
    .Net application = consumer
    ECC = Provider
    PI = broker
    SR = Publish ES
    I am using the Outside - In approach, where i am creating proxy structure in ESR and then creating/ Generating Proxy in ECC.
    I am exposing my Inbound Interface on SR, so that the WSDL is available on SR.
Now the situation is, our consumer is asking for the data from us in a different format.
I read on help.sap.com that via PI mapping this is possible.
But I am not understanding: if the consumer is using the ECC WSDL to make the call, then even though the mapping is there, how can they get the response in their required format?
They are even providing us their WSDL. I am totally confused: how can two different WSDLs work for one interface?
Or is there a way around this?
I am even ready to use the same structure on the ECC side as the one provided by the consumer, but then I don't see any benefit in using PI.
I also have a question: for a standard service too, the ECC Enterprise Service structure which we expose is fixed, so how can we use PI to provide consumers the response in their desired format?
I know that I'm asking this question again and again, but it is still not getting clear to me... maybe I'm not understanding how it works in SOA.
If possible, can you tell me the steps that I need to follow technically to develop this interface, so that I can get a better understanding?
    Thanks,
    Hetal

  • OSB and SOA Server (Mediator) Design question

As OSB is the strategic ESB and is used for external services, does it make sense that all communication must go out via the OSB?
For example, if I have a request that comes in, gets routed through the OSB, and then calls a backend composite which contains a BPEL process that sits on the Oracle SOA server and calls multiple external services (CRM, SAP, etc.), would all the calls out from the BPEL then go back through the OSB and out to the external services (via proxy/business services)? This seems to add a lot of network hops to the whole architecture.
I can't really seem to find a diagram to explain this, but an external service consumer could call the Oracle SOA stack, coming in via the OSB (as a security gateway, since OWSM 11g doesn't support a gateway yet), then get transformed into the CDM and passed on to the Oracle SOA server, which has a composite service with BPEL that orchestrates multiple calls to internal and external services, the external ones requiring a transformation back into the service provider's format.
    How does this flow work?

    If you are talking to external B2B systems you should use the Oracle B2B Server, have a look at this link,
    http://download.oracle.com/docs/cd/E14571_01/integration.1111/e10229/b2b_intro.htm#CEGGAGJA
    Section 1.6 in there gives a good example of doing just that. The OSB mediator is responsible for talking between CDM's in this case so it does some internal work, within one CDM set of services and then when ready the mediator translates from CDM to the format needed for the B2B. The B2B Server then does all the clever stuff needed to talk to the B2B partner.
    In the internal case where the call is within the Enterprise the Oracle documentation states that there may be cases when doing this translation from CDM to legacy format may take place within the SOA server but recommends that it is normally done using the mediator in the same way as with partner calls. The 11g developer documentation states "The stated direction by Oracle is for the Oracle Service Bus to be the preferred ESB for interactions outside the SOA Suite. Interactions within the SOA Suite may sometimes be better dealt with by the Oracle Mediator component in the SOA Suite, but we believe that for most cases, the Oracle Service Bus will provide a better solution and so that is what we have focused on within this book."
Just what those exceptions are is an interesting question. If you have a CDM and all of your legacy world has translators, you will spend all of your time hopping up to the mediator and back down; maybe that is the exception. If everything is using the mediator, then the SOA orchestration power is lost. If, on the other hand, most of your system lives within a canonical schema and there is one exception, perhaps you make it pay the price for non-conformance.

  • Work Time design question

    Hello everyone
    I have the following design problem. I have a Work Times table with the following attributes (columns)
    Monday
    Tuesday
    Wednesday
    Thursday
    Friday
    Saturday
    Holidays
In there I would write times like 08-16 for the weekdays, 08-14 for Saturday, and 10-12 for holidays, as strings.
This seems like a really bad way to solve this, so I was hoping you guys had a better solution.
    Thanks

A few questions/comments...
    1) Is the requirement to basically just hold a static schedule for when equipment is scheduled to be in an object, such that, if you queried the information in the table, you'd just get the current schedule? And that's all that needs to be able to be manipulated by the user? If that's truly the only requirement, why can't the user just keep a spreadsheet with the schedule? Using a database seems like a bit overkill for something like that.
    2) If the problem is more complex, or if you absolutely have to use a database, you could do the following:
    You will need a way to identify both the equipment and object in your table. I'm going to assume that each piece of equipment and each object has some unique identifier associated with it. For example, in the tables I've created below, I assume the equipment uses a serial number as its unique identifier, and I assume that each object is a work area, and I just numbered them 1 to 3. If this is not representative enough of your situation, you'll need to clarify it some more.
    Following database normalization practices, you'll want to keep any additional information about the equipment or objects in their own tables. Especially if the unique identifier for the equipment or object is not clear enough by itself for the users, it may be useful to provide a description of the equipment, for example:
CREATE TABLE equip_table
(     equip_id     NUMBER          NOT NULL
,     equip_desc     VARCHAR2(50)
,     CONSTRAINT      pk_equip PRIMARY KEY(equip_id)
);
INSERT INTO equip_table VALUES (12345, 'STARRETT micrometer, SN#12345');
INSERT INTO equip_table VALUES (54321, 'STARRETT calipers, SN#54321');
INSERT INTO equip_table VALUES (98765, 'STARRETT hole gauge, SN#98765');
CREATE TABLE obj_table
(     obj_id          NUMBER          NOT NULL
,     obj_desc     VARCHAR2(50)
,     CONSTRAINT      pk_obj PRIMARY KEY(obj_id)
);
INSERT INTO obj_table VALUES (1, 'some work area #1');
INSERT INTO obj_table VALUES (2, 'some work area #2');
INSERT INTO obj_table VALUES (3, 'some work area #3');
You may want more fields in the tables to store additional information about the equipment and objects, for example, their physical location, that sort of thing, if needed.
    Next, you will need a table to store the work schedule, and you will need to reference the equipment and object each work schedule refers to. This table allows a history to be kept of previous schedules (you'd be able to run a query to find out what the schedule was at any given point). Please note, I only provided sample data for one obj_id...
    CREATE TABLE work_table
    (     equip_id     NUMBER          NOT NULL
    ,     obj_id          NUMBER          NOT NULL
    ,     in_out_flag     VARCHAR2(1)
    ,     in_out_time     DATE
    ,     active_flag     VARCHAR2(1)
    ,     dt_active     DATE
    ,     dt_inactive     DATE
    ,     holiday_flag     VARCHAR2(1)
    ,     CONSTRAINT     fk_equip_id
                   FOREIGN KEY(equip_id)
         REFERENCES     equip_table(equip_id)
    ,     CONSTRAINT     fk_obj_id
                   FOREIGN KEY(obj_id)
         REFERENCES     obj_table(obj_id)
    ,     CONSTRAINT     in_out_flag_ck
         CHECK          (in_out_flag IN ('I','O'))
    ,     CONSTRAINT     active_flag_ck
         CHECK          (active_flag IN ('A','I'))
,     CONSTRAINT     holiday_flag_ck
     CHECK          (holiday_flag IN ('H','N'))
);
--all schedules related to equip_id = 12345
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*60)+((1/24/60)*30),'I',SYSDATE-10,SYSDATE-5,'N');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*8)+((1/24/60)*15),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*8)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'THU'),'DDD')+((1/24)*8)+((1/24/60)*5),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'FRI'),'DDD')+((1/24)*7)+((1/24/60)*45),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'SAT'),'DDD')+((1/24)*12)+((1/24/60)*0),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'SUN'),'DDD')+((1/24)*10)+((1/24/60)*30),'A',SYSDATE-10,NULL,'H');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*15)+((1/24/60)*30),'I',SYSDATE-10,SYSDATE-5,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*14)+((1/24/60)*15),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*14)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'THU'),'DDD')+((1/24)*14)+((1/24/60)*5),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'FRI'),'DDD')+((1/24)*13)+((1/24/60)*45),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'SAT'),'DDD')+((1/24)*14)+((1/24/60)*0),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'SUN'),'DDD')+((1/24)*14)+((1/24/60)*30),'A',SYSDATE-10,NULL,'H');
    INSERT INTO work_table
    VALUES(12345,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*8)+((1/24/60)*30),'A',SYSDATE-5,NULL,'N');
    INSERT INTO work_table
    VALUES(12345,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*16)+((1/24/60)*30),'A',SYSDATE-5,NULL,'N');
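A note on the date arithmetic in the INSERTs above: Oracle DATE arithmetic works in days, so (1/24)*8 adds 8 hours and (1/24/60)*30 adds 30 minutes to the TRUNCated day. A quick sanity check of the fractions (Python used purely as a calculator; the SQL itself is unchanged):

```python
# Oracle adds day-fractions to DATEs: 1/24 is one hour, 1/24/60 is one minute.
fraction = (1 / 24) * 8 + (1 / 24 / 60) * 30   # offset used for an 08:30 clock-in
minutes = round(fraction * 24 * 60)            # convert the day-fraction back to minutes
print(minutes)  # 510 minutes = 8 hours 30 minutes
```

So TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD') + that fraction is midnight of that Monday plus 8h30m.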
    --all schedules related to equip_id = 54321
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*9)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*7)+((1/24/60)*15),'I',SYSDATE-10,SYSDATE-4,'N');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*7)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'THU'),'DDD')+((1/24)*7)+((1/24/60)*5),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'FRI'),'DDD')+((1/24)*6)+((1/24/60)*45),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'SAT'),'DDD')+((1/24)*11)+((1/24/60)*0),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'SUN'),'DDD')+((1/24)*9)+((1/24/60)*30),'A',SYSDATE-10,NULL,'H');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*16)+((1/24/60)*30),'I',SYSDATE-10,SYSDATE-4,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*15)+((1/24/60)*15),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*15)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'THU'),'DDD')+((1/24)*15)+((1/24/60)*5),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'FRI'),'DDD')+((1/24)*14)+((1/24/60)*45),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'SAT'),'DDD')+((1/24)*15)+((1/24/60)*0),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'SUN'),'DDD')+((1/24)*15)+((1/24/60)*30),'A',SYSDATE-10,NULL,'H');
    INSERT INTO work_table
    VALUES(54321,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*5)+((1/24/60)*15),'A',SYSDATE-4,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*18)+((1/24/60)*30),'A',SYSDATE-4,NULL,'N');
    --all schedules related to equip_id = 98765
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*9)+((1/24/60)*45),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*7)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*7)+((1/24/60)*45),'I',SYSDATE-10,SYSDATE-3,'N');
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'THU'),'DDD')+((1/24)*7)+((1/24/60)*20),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'FRI'),'DDD')+((1/24)*6)+((1/24/60)*0),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'SAT'),'DDD')+((1/24)*11)+((1/24/60)*15),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'SUN'),'DDD')+((1/24)*9)+((1/24/60)*45),'A',SYSDATE-10,NULL,'H');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'MON'),'DDD')+((1/24)*16)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*15)+((1/24/60)*15),'I',SYSDATE-10,SYSDATE-3,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*15)+((1/24/60)*30),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'THU'),'DDD')+((1/24)*15)+((1/24/60)*5),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'FRI'),'DDD')+((1/24)*14)+((1/24/60)*45),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'SAT'),'DDD')+((1/24)*15)+((1/24/60)*0),'A',SYSDATE-10,NULL,'N');
    INSERT INTO work_table
    VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'SUN'),'DDD')+((1/24)*15)+((1/24/60)*30),'A',SYSDATE-10,NULL,'H');
    INSERT INTO work_table
    VALUES(98765,1,'I',TRUNC(NEXT_DAY(SYSDATE-10,'WED'),'DDD')+((1/24)*10)+((1/24/60)*45),'A',SYSDATE-3,NULL,'N');
    INSERT INTO work_table
VALUES(54321,1,'O',TRUNC(NEXT_DAY(SYSDATE-10,'TUE'),'DDD')+((1/24)*17)+((1/24/60)*15),'A',SYSDATE-3,NULL,'N');
USER INTERFACE:
    User chooses object, say from a drop-down menu populated by the query:
SELECT     obj_id || ' - ' || obj_desc     AS my_object
FROM     obj_table;
MY_OBJECT
1 - some work area #1
2 - some work area #2
3 - some work area #3
You will need to store just obj_id for use later.
    User chooses equipment, again, possibly from a drop-down menu populated by the query:
    SELECT     equip_id || ' - ' || equip_desc     AS my_equipment
    FROM     equip_table
    MY_EQUIPMENT
    12345 - STARRETT micrometer, SN#12345
    54321 - STARRETT calipers, SN#54321
    98765 - STARRETT hole gauge, SN#98765

    You will need to store just equip_id for use later.
    At this point, the next query either needs to run automatically once the user has chosen both criteria, or the user needs to press a 'Go' button or something. You'll then need to somehow feed over the values the user selected for obj_id and equip_id (I'm not familiar enough with JDeveloper to tell you how), and the following query will pull up the current active schedule (I made the assumption that each day will only have one in and one out time):
    SELECT     day_name
    ,     MIN(in_time)     AS in_time
    ,     MIN(out_time)     AS out_time
    FROM     (
         SELECT     holiday_flag
         ,     CASE
                   WHEN     holiday_flag     = 'H'
                   THEN     'HOLIDAYS'
                   ELSE     TO_CHAR(in_out_time,'DAY')     
              END                         AS day_name
         ,     TO_CHAR(in_out_time,'D')          AS in_out_day
         ,     CASE
                   WHEN     in_out_flag     = 'I'
                   THEN     TO_CHAR(in_out_time,'HH24:MI')
              END                         AS in_time
         ,     CASE
                   WHEN     in_out_flag     = 'O'
                   THEN     TO_CHAR(in_out_time,'HH24:MI')
              END                         AS out_time
         FROM     work_table
         WHERE     active_flag     = 'A'
         AND     equip_id     = :equip_id     --bind variable containing chosen equip_id
         AND     obj_id          = :obj_id     --bind variable containing chosen obj_id
          )
    GROUP BY     day_name
    ,          in_out_day
    ,          holiday_flag
    ORDER BY      CASE
                   WHEN     holiday_flag     = 'H'
                   THEN     '9'
                   WHEN     in_out_day     ='1'
                   THEN     '8'
                   ELSE     in_out_day
              END
    ;

    Given the sample data I provided, this query returns the following results with equip_id=12345 and obj_id=1:
    DAY_NAME                             IN_TI OUT_T
    MONDAY                               08:30 16:30
    TUESDAY                              08:15 14:15
    WEDNESDAY                            08:30 14:30
    THURSDAY                             08:05 14:05
    FRIDAY                               07:45 13:45
    SATURDAY                             12:00 14:00
    HOLIDAYS                             10:30 14:30

    Now, the last bit is to allow the user to edit the schedule (however you want to do that; you'll probably want to include a check that looks for only the values that have changed, so you don't submit queries for every value whether it changed or not) and then submit the following queries for each update to the schedule (a single in or out time that is changed is one update):
    Notes:
    I continue to use :equip_id and :obj_id in the following queries
    I use :new_hr to represent the value (gotten from the user) for the new hour
    I use :new_min to represent the value (gotten from the user) for the new minutes
    I use :day_name to indicate the day of the week the time is for which you also need to get from the entry form and provide
    in the format 'MON','TUE','WED','THU', etc. Please use 'SUN' for holidays.
    Just in case you ever want to include Sundays in your schedules, I've also included the holiday_flag.
    I use :holiday to indicate whether the schedule being edited is for holidays, please feed this as either 'H' for holiday or 'N' if not
    I use :in_out to indicate whether the time being updated is an 'in' time or an 'out' time; this value also needs to be fed from
    the user form in the format of 'I' for in and 'O' for out.
    UPDATE     work_table
    SET     active_flag     = 'I'
    ,     dt_inactive     = SYSDATE
    WHERE     active_flag               = 'A'
    AND     equip_id               = :equip_id
    AND     obj_id                    = :obj_id
    AND     holiday_flag               = :holiday_flag
    AND     TO_CHAR(in_out_time,'DY')     = :day_name
    AND     in_out_flag               = :in_out
    ;
    INSERT INTO work_table
    VALUES     (     :equip_id
         ,     :obj_id
         ,     :in_out
         ,     TRUNC(NEXT_DAY(SYSDATE-1,:day_name),'DDD')+((1/24)*:new_hr)+((1/24/60)*:new_min)
         ,     'A'
         ,     SYSDATE
         ,     NULL
         ,     :holiday
         )
    ;
    commit;

    I don't know if an expert can recommend a better way to do the update to the table, maybe using a trigger or something to update the old 'active' record once a new record is added, but this is how I'd do it.
    Edited by: user11033437 on Dec 10, 2010 11:45 AM - fixed a typo in the SQL
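The inactivate-then-insert pattern above (mark the current 'A' row inactive, then insert its replacement) can be sketched end-to-end with Python's sqlite3 module. This is only an illustration: the table mirrors work_table, but the time column is simplified to a text 'DAY HH:MM' value, which is my simplification, not the thread's Oracle DATE encoding.

```python
import sqlite3

# Minimal sketch of the "inactivate old row, insert new row" pattern.
# Table layout mirrors the thread's work_table; types are simplified.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE work_table (
        equip_id     INTEGER,
        obj_id       INTEGER,
        in_out_flag  TEXT,
        in_out_time  TEXT,   -- simplified 'DAY HH:MM' text, not a DATE
        active_flag  TEXT,
        dt_active    TEXT,
        dt_inactive  TEXT,
        holiday_flag TEXT
    )""")
conn.execute("INSERT INTO work_table VALUES "
             "(12345, 1, 'O', 'MON 16:30', 'A', '2010-12-01', NULL, 'N')")

def update_schedule(conn, equip_id, obj_id, in_out, day_name, new_time, holiday='N'):
    """Mark the current active row inactive, then insert the replacement."""
    conn.execute("""
        UPDATE work_table
           SET active_flag = 'I', dt_inactive = date('now')
         WHERE active_flag = 'A' AND equip_id = ? AND obj_id = ?
           AND holiday_flag = ? AND in_out_flag = ?
           AND in_out_time LIKE ? || '%'""",
        (equip_id, obj_id, holiday, in_out, day_name))
    conn.execute("""
        INSERT INTO work_table
        VALUES (?, ?, ?, ?, 'A', date('now'), NULL, ?)""",
        (equip_id, obj_id, in_out, day_name + ' ' + new_time, holiday))
    conn.commit()

# Same edit as the thread's example: move Monday's out time to 19:15.
update_schedule(conn, 12345, 1, 'O', 'MON', '19:15')
rows = conn.execute(
    "SELECT active_flag, in_out_time FROM work_table ORDER BY rowid").fetchall()
print(rows)  # old row now inactive, new row active
```

A trigger-based variant (as speculated above) would move the UPDATE into an AFTER INSERT trigger, but the two-statement form keeps the history logic visible in the application.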
    Example of an update to the table:
    UPDATE     work_table
    SET     active_flag     = 'I'
    ,     dt_inactive     = SYSDATE
    WHERE     active_flag               = 'A'
    AND     equip_id               = 12345
    AND     obj_id                    = 1
    AND     holiday_flag               = 'N'
    AND     TO_CHAR(in_out_time,'DY')     = 'MON'
    AND     in_out_flag               = 'O'
    ;

    After running the above, the current schedule now looks like:
    DAY_NAME                             IN_TI OUT_T
    MONDAY                               08:30
    TUESDAY                              08:15 14:15
    WEDNESDAY                            08:30 14:30
    THURSDAY                             08:05 14:05
    FRIDAY                               07:45 13:45
    SATURDAY                             12:00 14:00
    HOLIDAYS                             10:30 14:30

    Then, inserting the new time:
    INSERT INTO work_table
    VALUES     (     12345
         ,     1
         ,     'O'
         ,     TRUNC(NEXT_DAY(SYSDATE-1,'MON'),'DDD')+((1/24)*19)+((1/24/60)*15)
         ,     'A'
         ,     SYSDATE
         ,     NULL
         ,     'N'
         )
    ;

    And now the new schedule looks like:
    DAY_NAME                             IN_TI OUT_T
    MONDAY                               08:30 19:15
    TUESDAY                              08:15 14:15
    WEDNESDAY                            08:30 14:30
    THURSDAY                             08:05 14:05
    FRIDAY                               07:45 13:45
    SATURDAY                             12:00 14:00
    HOLIDAYS                             10:30 14:30

    Edited by: user11033437 on Dec 10, 2010 11:51 AM - Added example of updating table with new schedule time
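Throughout this thread, times of day are stored by adding fractions of a day to a truncated DATE: 1/24 per hour and 1/24/60 per minute, as in TRUNC(NEXT_DAY(SYSDATE-1,'MON'),'DDD')+((1/24)*19)+((1/24/60)*15). The same arithmetic can be sketched in Python (function names are mine, not from the thread):

```python
def to_day_fraction(hour, minute):
    """Encode a time of day as a fraction of a day, like the
    ((1/24)*hr)+((1/24/60)*min) expressions in the SQL above."""
    return hour / 24 + minute / (24 * 60)

def from_day_fraction(fraction):
    """Decode a day fraction back into (hour, minute)."""
    total_minutes = round(fraction * 24 * 60)
    return divmod(total_minutes, 60)

# The 19:15 value from the insert example above.
frac = to_day_fraction(19, 15)
assert from_day_fraction(frac) == (19, 15)
print(frac)
```

Oracle applies the same idea natively because DATE arithmetic is defined in days, so adding 0.5 to a DATE moves it forward twelve hours.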

  • Need ABAP real time questions

    hello ABAP gurus,
    I am a fresh ABAPer, technically very sound, but not getting a job as a fresher, so I am showing 2+ years of fake experience.
    I usually clear the ABAP technical round, but I keep being rejected because I lack the real-time ABAP questions that interviewers usually ask to trap fake candidates.
    If anybody has such questions and answers, please share them with the SAP community.
    It would be a really rewarding help to all ABAP freshers.

    You'd think so, but in some countries it can be very messy and costly to get rid of someone even if they did get the job dishonestly or under false pretences.
    It also means that the person who made the decision to hire them has to admit he was wrong ...
    I once saw a guy who'd been on two projects with a consulting firm - kicked off both within a few weeks and spent a year on the bench - get hired as development manager.  Now there were three guys from his ex-firm already on site, and the dork who hired him didn't even think to informally ask them.  When the penny dropped they tried demoting him and he threatened to sue.
    You think Dilbert is fiction?

  • SAP-XI Real time questions

    Could you please provide me some real-time questions on SAP XI?
    Thank you

    1.  What are the Three types of XI Cache?  How are they used?
    2.  Where would you look to find the Logical System in the SLD?
    ANS. The business system wizard.
    3.  What 2 Data Types are automatically created when the Namespace is saved in the Integration Repository?
    ANS. ExchangeFaultData and ExchangeLogData.
    4   Which Development Object in SAP XI forms the "ROOT NODE" of an XML document when an XI message is generated?
    ANS MESSAGE TYPE
    5.  Describe the setting to "Permit Importing of SAP IDOCs or BAPI/ RFCs".  Where is this configured?
    ANS.
    6.  What are the valid types of Message Mappings?  Which is configured with the Graphical User Interface and requires no coding?
    ANS. Message mapping, XSLT mapping, ABAP mapping and Java mapping; message mapping is configured with the graphical user interface and requires no coding.
    7.  What actions should you take if your Business System does not show when attempting to "Transfer from the SLD"?
    8. What is the relationship between Product, Product Version, Software Component and Software Component Version?  Give an example.
    ANS. PRODUCT: represents a collection of all versions of a product. In an SAP environment, a product corresponds to an SAP technical component, e.g. 4.6C, 4.6D, 4.7.
    SOFTWARE COMPONENT: represents a collection of all versions of a software component. Examples of software components are SAP_APPL, SAP_ABA, SAP_HR.
    9.  Describe the Design Repository (DR) objects you created to configure a scenario.
    10.  What are the different design time components used by XI?
    11. What monitoring service does the integration server provide?
    12.  Describe the alert functionality of the runtime workbench.
    ANS. Message alerting allows you to set conditions for triggering alerts. This allows notification of the correct parties for specific classes of errors.
    13.  In BPM, if you have async-sync bridge, does the QOS change?
    14.  What is logging/ trace? List 4 ways to enable logging/ trace?
    15.  Properties of an async message?
    16   Inbound XI message has problem, where do you look for solution?
    17.  Describe XI message format.
    18.  Describe end-to-end monitoring.  What is it?  How is it configured?  What are the different   views?
    ANS. It is a tool for monitoring end-to-end technical processes involving multiple components.
    To configure end-to-end monitoring:
    1. On the initial screen of the Runtime Workbench, select the Configuration tab page.
    2. Enter the logon data for the monitoring server.
    3. Choose Display.
    4. The system displays the components of the correct domain; the Integration Server is selected by default.
    5. Select the other components that you want to use and configure them as sender or receiver, or both, depending on the component type.
    6. Select the monitoring level that you want to use for each of the selected components.
    7. Choose Save Configuration.
    The process overview and the instance view are the two different views.
    19. Different steps to make a Sender IDOC adapter work?
    20. JMS adapter can talk with what type of systems?  Give some examples...
    ANS. Messaging systems; the JMS adapter connects messaging systems to the Integration Engine.
    21.What format can JDBC adapter communicate in?  Should you use native SQL?
    ANS. JDBC adapter converts data base content to XML messages and the other way around.
    22.If communications with JDBC using XML format, what are 4 actions you could do?
    ANS. SELECT, INSERT, DELETE, UPDATE.
    23.  How does PCK and adapter framework engine differ?
    ANS.
    24.  Can JDBC adapter query DB tables?  Can it insert?
    ANS. Yes
    25.  JMS adapter scenario messaging system provider needs to submit what?
    26.  List some properties of receiving IDOC adapter.
    27.  JDBC/ JMS required certain steps before they can work?
    ANS. A vendor-specific JMS driver must be deployed onto the J2EE engine using SDM.
            The appropriate JDBC driver must be deployed on the system.
    28.  What is ALEAUDIT with respect to IDOCS?  When will they use it?
    29.  Properties of the HTTP adapter, does it use or need a sender or receiver communications channel or agreement?
    30.  What does a RFC adapter support?
    ANS sRFCs and tRFCs
    31.  Is EOIO supported by RFC?
    ANS No
    32.  What are the advantages of using a decentralized adapter engine?
    ANS.
    33.  What are the different monitoring statuses?  Where do you find them?
    ANS. The different monitoring statuses are message monitoring, component monitoring, performance analysis, and alerting; we find these in the Runtime Workbench.
    34.  What adapters are not in adapter engine? code you can type in,
    ANS. The HTTP and IDoc adapters are not in the adapter engine, and the IDoc adapter is the one where you can type in code.
    35.  Which ABAP proxy, inbound or outbound, uses which classes with regard to sync/async, and what is the method call?
    36.  With an optional node what would be the cardinality?  How do you make sure the subordinate fields get mapped?
    37.  WSDL what is it?  Where do you find it in Repository?  Is it used in Java or ABAP proxy?
    38.  XI 3.0, what are the supported mapping types?
    ANS Message mapping, ABAP mapping, JAVA mapping and XSLT mapping
    39.  What are the prerequisites for importing customer defined IDOC?
    ANS
    40.  What are the three IDOC transactions in XI?
    ANS  SM59, IDX1and IDX2.
    41.  Context object replace what?
    ANS  Xpath
    42.  Two things can make up a collaboration agreement, what are they?
    ANS Sender agreement and receiver agreement
    43.  What is a logical system, with respect to SLD?  Where would you assign it?
    ANS
    44.  What is the sender communications channel?
    ANS. It specifies the potential senders of messages and the technical communication path.
    45.  If error during inbound or outbound binding, where do you look to solve?
    46.  URI, URL and URN what are they and what are their differences?
    47.  To perform content based (logical) routing, two places it can be done.  What are the two places?
    48.  What is an integration process?  Where is it executed?  What stake?
    49.  In a message mapping you have advanced user defined function, can you test for context changes, if yes how?
    50.  What is multi-mapping?  Where is it used?  What are the advantages?
    ANS. Multi-mapping is any mapping that involves N messages on either the source or the target side. Multi-mapping can be used in ccBPM.
             1. Used to map abstract interfaces.
             2. Development is the same as for message mappings.
             3. n:1 transformation
             4. 1:n transformation
             5. n:m transformation
    51.  What are the two XSLT tags, previously that could not be used in XI?
    ANS
    52.  Using a simple user defined function how can you send trace information to be monitored?
    53.  What is function "exists" in message mapping?
    ANS. We can handle the error by checking whether the source tag exists, and if it does not, we can pass an empty value, which generates the required target field.
    54.  What Jar file is required to perform Java Mapping?
    ANS
    55.  What is context in message mapping and how is it used?
    56.  What is remove context and splitby value?
    ANS. Remove context removes the parent context of an element; split by value is the counterpart of remove context.
    57.  Where can you use user defined function?  What is its scope?
    58.  If you are building Java mapping class which class interface must it implement?
    59.  What is the scope of mapping template?
    60.  How does a Boolean function work in message mappings?
    61.  Source message occurs 3 times, target only once, what is wrong?
    62.  In simple mapping - one source results in 4 identical target messages, with regarding cardinality what is the problem?
    63.  What is a prerequisite to do ABAP mapping for a complex transformation?
    64.  What step can be inserted into an exception branch?
    ANS
    65.  How do you get an error condition to generated an alert?
    ANS
    66.  What actions can you perform in SXI_CACHE?
    67.  What is a wait step and why is it used?
    ANS. It is a process-flow-control step and is used to set the start time for the next step.
    68.  What is a block step and why is it used?
    ANS
    69.  Could multiple instances of an integration process be running at the same time?  If so, how does a message find its way to the correct instance?
    70.  Which XI objects can be used in an integration process?  Which ones from Repository?
    71.  Send message within an integration process to 8 receivers at the same time, how can you do this?
    72. What is a correlation?  What is a local correlation?
    73. What is the relationship between an integration process and business workflow?

  • SOA design question

    Hi All,
    I want to design a SOA system (as per best practices) which receives XML files from a third-party vendor. The system will be responsible for storing the XML metadata in a database and distributing the XML files, as is, to different subsystems. An important requirement is that the data sent to subsystems has to be in real time (i.e. the XML files from the third-party vendor need to be distributed and stored with minimal or no time lapse).
    First approach is to have an external facing webservices to receive the data and forward it to a mediator which in turn uses DB adaptor to store data to database and file adaptor to write to filesystems of subsystems.
    Second approach is to have JMS adaptor to receive the XML files from the third party, then into a JMS topic and have subsystems subscribe to it and have some ETL process to load the XML files to database in real time.
    My Queries
    1) I am just creating a pilot; any comments from experts about the implementation? I have never tried anything like this before and am unsure about the results.
    2) In the second approach, how will I be able to do ETL in real time without much latency.
    3) I want to implement it as per best practices; is there any document or tutorial I can start with?
    4) Which approach should be better, also is there any other better way to do it.
    Help is much appreciated
    Regards
    Thomas
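The topic-based fan-out in the second approach can be illustrated independently of JMS (which is Java-specific). A minimal Python sketch of one publisher feeding several independent subscriber queues, so a slow consumer never blocks the producer; the subscriber names are hypothetical, not from the post:

```python
import queue

class Topic:
    """Minimal publish/subscribe topic: each subscriber gets its own
    queue, so a slow consumer does not block the publisher."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, name):
        q = queue.Queue()
        self.subscribers[name] = q
        return q

    def publish(self, message):
        # Fan the message out to every registered subscriber queue.
        for q in self.subscribers.values():
            q.put(message)

topic = Topic()
db_loader = topic.subscribe("db_loader")      # hypothetical ETL consumer
file_writer = topic.subscribe("file_writer")  # hypothetical file-drop consumer

topic.publish("<order id='1'/>")  # one inbound XML message reaches both
print(db_loader.get(), file_writer.get())
```

In a real deployment the queues would be durable JMS topic subscriptions, which is what lets new subsystems be added later without touching the inbound service.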

    Hello,
    I am using CAPS 6 u 1. I was wondering if it would be faster for an internal application to invoke a web service which was called within a BPEL process. The BPEL would be wrapped with a SOAP service. So the internal app would call the SOAP service to run the BPEL. I don't see the speed if the BPEL has to be accessed via a SOAP service. Wouldn't a web service in the same app server be just as fast (without the BPEL)? Or is there some other optimization that is done on the bus that makes it faster?
    I guess if there was an existing BPEL that needed to use the log service it would be okay to call the log service from within that BPEL. But to have the log service be the only service called within a BPEL process doesn't seem worthwhile.
    What do you think?
    Thanks.
    Paula

  • Can somebody give some real time questions for alv report

    hi guru
    can somebody give some real time questions for alv report.
    answers also.
    regards
    subhasis.

    hi,
    The ALV is a set of function modules and classes (and their methods) which are added to program code. Developers can use the functionality of the ALV in creating new reports, saving time which might otherwise have been spent on report enhancement.
    The common features of a report are column alignment, sorting, filtering, subtotals, totals etc. <b>To implement these, a lot of coding and logic would have to be written. To avoid that we can use a concept called ABAP List Viewer (ALV).</b>
    Using ALV, we can have three types of reports:
       1. Simple Report
       2. Block Report
       3. Hierarchical Sequential Report
    Siva

  • Questions on working in real time

    Hi.  Just have a couple of questions about working in real time, actually on the swf.
    Firstly, I want to create a sort of drag-and-drop interface.  Is it possible to have one stage act as two separate stages?  By this, I mean: is it possible to have the left-hand side of my stage hold all the components, have them dragged to the right-hand side of the stage and placed anywhere, and then, when the user clicks save, only the right-hand side of the stage is saved?
    Secondly, I am testing a few things out.  What would I do to get a type of free transform on an image on the stage in real time, in the swf?  So once the movie is played, the user could then resize any image how they like by dragging a corner of the image..
    Cheers for any advice.

    What I would like is the movie to display to the user.  When they see the movie, it looks like the screen is split in two.  On the left hand side of the screen will be the components which can be dragged and dropped.  The right hand side of the screen will be a birthday card, whereby the users can drop and drag components to personalise the card.  I then want the user to be able to save the card, but a normal save will save the whole screen, including the components side of the screen.  All I want is the right hand side of the screen saved, where their card has been created.  Would this be done with two movies, and communication between the two.  Or is this possible to do on just one stage?

  • BI 7.0 Question on RDA (Real Time Data Acquisition)

    Those who have implemented RDA (Real-Time Data Acquisition) in BI 7.0: could you please tell me if this is limited to only a few of the extractors, or available to all the Business Content extractors? The reason I ask is that I read that if you have PI_BASIS 2005.1 then RDA is possible, but when I go into the ROOSOURCE table in R/3 I do not see the real-time flag checked for any of the transactional Business Content extractors!!
    Also, does this mean that we need to set this flag manually, and if so, is this supported by SAP? Please throw in your expert responses...
    Ram

    Dinesh, thank you... after going through the posts I am still more confused, and not sure if SAP supports any Business Content data sources as RDA-capable yet... also, if someone has successfully implemented RDA for any of the following applications, please do reply back, thank you...
    1. Plant maintenance
    2. GL
    3. EC-PCA
    thank you

  • Abap-hr real time questions

    hi friends
    kindly send me ABAP-HR real-time questions to my mail [email protected]
    Thanks&Regards
    babasish

    Hi
    Logical database
    A logical database is a special ABAP/4 program which combines the contents of certain database tables. Using logical databases facilitates the process of reading database tables.
    HR Logical Database is PNP
    Main Functions of the logical database PNP:
    Standard Selection screen
    Data Retrieval
    Authorization check 
    To use logical database PNP in your program, specify in your program attributes.
    Standard Selection Screen
    Date selection
    Date selection delimits the time period for which data is evaluated. GET PERNR retrieves all records of the relevant infotypes from the database.  When you enter a date selection period, the PROVIDE loop retrieves the infotype records whose validity period overlaps with at least one day of this period.
    Person selection
    Person selection is the 'true' selection of choosing a group of employees for whom the report is to run.
    Sorting Data
    · The standard sort sequence lists personnel numbers in ascending order.
    · SORT function allows you to sort the report data otherwise. All the sorting fields are from infotype 0001.
    Report Class
    · You can suppress input fields which are not used on the selection screen by assigning a report class to your program.
    · If SAP standard delivered report classes do not satisfy your requirements, you can create your own report class through the IMG.
    Data Retrieval from LDB
    1. Create data structures for infotypes.
        INFOTYPES: 0001, "ORG ASSIGNMENT
                            0002, "PERSONAL DATA
                            0008. "BASIC PAY
    2. Fill data structures with the infotype records.
        Start-of-selection.
             GET PERNR.
    End-of-selection. 
        Read Master Data
    Infotype structures (after GET PERNR) are internal tables loaded with data.
    The infotype records (selected within the period) are processed sequentially by the PROVIDE - ENDPROVIDE loop.
              GET PERNR.
                 PROVIDE * FROM Pnnnn BETWEEN PN/BEGDA AND PN/ENDDA
                        If Pnnnn-XXXX = ' '. write:/ Pnnnn-XXXX. endif.
                 ENDPROVIDE.
    Period-Related Data
    All infotype records are time stamped.
    IT0006 (Address infotype)
    01/01/1990   12/31/9999  present
              Which record to be read depends on the date selection period specified on the
              selection screen. PN/BEGDA PN/ENDDA.
    Current Data
    IT0006 Address  -  01/01/1990 12/31/9999   present
    RP-PROVIDE-FROM-LAST retrieves the record which is valid in the data selection period.
    For example, pn/begda = '19990931'    pn/endda = '99991231'
    IT0006 subtype 1 is resident address
    RP-PROVIDE-FROM-LAST P0006 1 PN/BEGDA PN/ENDDA.
    Process Infotypes
    RMAC Modules - An RMAC module, also referred to as a macro, is a special construct of ABAP/4 code. Normally, the program code of these modules is stored in table 'TRMAC'. The table key combines the program code under a given name. Macros can also be defined in programs. An RMAC defined in TRMAC can be used in all reports. When an RMAC is changed, the report has to be regenerated manually to reflect the change.
    Reading Infotypes - by using RMAC (macro) RP-READ-INFOTYPE
              REPORT ZHR00001.
              INFOTYPE: 0002.
              PARAMETERS: PERNR LIKE P0002-PERNR.
              RP-READ-INFOTYPE PERNR 0002 P0002 .
              PROVIDE * FROM P0002
                  if ... then ...endif.
              ENDPROVIDE.
    Changing Infotypes - by using RMAC (macro) RP-READ-INFOTYPE. 
    · Three steps are involved in changing infotypes:
    1. Select the infotype records to be changed;
    2. Make the required changes and store the records in an alternative table;
    3. Save this table to the database;
    The RP-UPDATE macro updates the database. The parameters of this macro are the OLD internal table containing the unchanged records and the NEW internal table containing the changed records. You cannot create or delete data. Only modification is possible.
    INFOTYPES: Pnnnn NAME OLD,
    Pnnnn NAME NEW.
    GET PERNR.
        PROVIDE * FROM OLD
               WHERE .... = ... "Change old record
               *Save old record in alternate table
               NEW = OLD.
        ENDPROVIDE.
        RP-UPDATE OLD NEW. "Update changed record
    Infotype with repeat structures
    · How to identify repeat structures.
    a. On infotype entry screen, data is entered in table form.
        IT0005, IT0008, IT0041, etc.
    b. In the infotype structure, fields are grouped by the same name followed by sequence number.
        P0005-UARnn P0005-UANnn P0005-UBEnn
        P0005-UENnn P0005-UABnn
    Repeat Structures
    · Data is entered on the infotype screen in table format but stored on the database in a linear  
      structure.
    · Each row of the table is stored in the same record on the database.
    · When evaluating a repeat structure, you must define the starting point, the increment and the
      work area which contains the complete field group definition.
    Repeat Structures Evaluation (I)
    · To evaluate the repeat structures
       a. Define work area.
           The work area is a field string. Its structure is identical to that of the field group.
       b. Use a DO LOOP to divide the repeat structure into segments and make it available for  
           processing in the work area, one field group (block) at a time.
    Repeat Structures Evaluation(II)
    Define work area
    DATA: BEGIN OF VACATION,
                  UAR LIKE P0005-UAR01, "Leave type
                  UAN LIKE P0005-UAN01, "Leave entitlement
                  UBE LIKE P0005-UBE01, "Start date
                  UEN LIKE P0005-UEN01, "End date
                  UAB LIKE P0005-UAB01, "Leave accounted
               END OF VACATION.
    GET PERNR.
         RP-PROVIDE-FROM-LAST P0005 SPACE PN/BEGDA PN/ENDDA.
         DO 6 TIMES VARYING VACATION
                 FROM P0005-UAR01 "Starting point
                     NEXT P0005-UAR02. "Increment
                 If p0005-xyz then ... endif.
          ENDDO.
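The DO 6 TIMES VARYING loop above walks a flat infotype record one field group at a time. A rough Python analogy of the same idea (field names follow the P0005 example above; the sample data values are invented):

```python
# Rough analogy of DO 6 TIMES VARYING: the infotype record stores six
# repeated field groups (UAR01..UAR06, UAN01..UAN06, ...) in one flat
# row; each loop pass slices one group into a work area.
FIELDS = ("UAR", "UAN", "UBE", "UEN", "UAB")  # leave type, entitlement, start, end, accounted

def iter_field_groups(flat_record, group_count=6):
    """Yield one work-area dict per field group, like each DO pass."""
    for n in range(1, group_count + 1):
        yield {f: flat_record.get(f"{f}{n:02d}") for f in FIELDS}

# Invented sample: only the first two leave groups are filled.
p0005 = {"UAR01": "01", "UAN01": 20, "UAR02": "02", "UAN02": 5}
groups = [g for g in iter_field_groups(p0005) if g["UAR"] is not None]
print(groups)
```

The starting point and increment in the ABAP version (P0005-UAR01, NEXT P0005-UAR02) play the role of the computed field names here.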
    Processing 'Time Data'.
    · Dependence of time data on validity period
    · Importing time data
    · Processing time data using internal tables
    Time Data and Validity Period
    · Time data always applies to a specific validity period.
    · The validity periods of different types of time data are not always the same as the date selection period specified in the selection screen.
    Date selection period:  |----------|
    Leave:                        |----------|
    (The leave record's validity period only partly overlaps the date selection period.)
    · PROVIDE in this case is therefore not used for time infotypes.
    Importing Time Data
    · GET PERNR reads all time infotypes from the lowest to highest system data, not only those within the date selection period.
    · To prevent memory overload, add MODE N to the infotype declaration. This prevents the logical database from importing all data into infotype tables at GET PERNR.
    · Use macro RP-READ-ALL-TIME-ITY to fill infotype table.
    INFOTYPES: 2001 MODE N.
    GET PERNR.
        RP-READ-ALL-TIME-ITY PN/BEGDA PN/ENDDA.
        LOOP AT P2001.
             If P2001-XYZ = ' '. A=B. Endif.
        ENDLOOP.
    Processing Time Data
    · Once data is imported into infotype tables, you can use an internal table to process the interested data.
    DATA: BEGIN OF ITAB OCCURS 0,
                  BUKRS LIKE P0001-BUKRS, "COMPANY
                  WERKS LIKE P0001-WERKS, "PERSONNEL AREA
                  AWART LIKE P2001-AWART, "ABS./ATTEND. TYPE
                  ASWTG LIKE P2001-ASWTG, "ABS./ATTEND. DAYS
               END OF ITAB.
    GET PERNR.
    RP-PROVIDE-FROM-LAST P0001 SPACE PN/BEGDA PN/ENDDA.
    CLEAR ITAB.
    ITAB-BUKRS = P0001-BUKRS. ITAB-WERKS = P0001-WERKS.
    RP-READ-ALL-TIME-ITY PN/BEGDA PN/ENDDA.
    LOOP AT P2001.
          ITAB-AWART = P2001-AWART. ITAB-ASWTG = P2001-ASWTG.
          COLLECT ITAB. (OR: APPEND ITAB.)
    ENDLOOP.
    Database Tables in HR
    ·  Personnel Administration (PA) - master and time data infotype tables (transparent tables).
       PAnnnn: e.g. PA0001 for infotype 0001
    ·  Personnel Development (PD) - Org Unit, Job, Position, etc. (transparent tables).
       HRPnnnn: e.g. HRP1000 for infotype 1000
    ·  Time/Travel expense/Payroll/Applicant Tracking data/HR work areas/Documents (cluster
       tables).
       PCLn: e.g. PCL2 for time/payroll results.
    Cluster Table
    · Cluster tables combine the data from several tables with identical (or almost identical) keys
      into one physical record on the database.
    . Data is written to a database in compressed form.
    · Retrieval of data is very fast if the primary key is known.
    · Cluster tables are defined in the data dictionary as transparent tables.
    · External programs can NOT interpret the data in a cluster table.
    · Special language elements EXPORT TO DATABASE, IMPORT TO DATABASE and DELETE
      FROM DATABASE are used to process data in the cluster tables.
    PCL1 - Database for HR work area;
    PCL2 - Accounting Results (time, travel expense and payroll);
    PCL3 - Applicant tracking data;
    PCL4 - Documents, Payroll year-end Tax data
    Database Tables PCLn
    · PCLn database tables are divided into subareas known as data clusters.
    · Data Clusters are identified by a two-character code. e.g RU for US payroll result, B2 for
      time evaluation result...
    · Each HR subarea has its own cluster.
    · Each subarea has its own key.
    Database Table PCL1
    · The database table PCL1 contains the following data areas:
      B1 time events/PDC
      G1 group incentive wages
      L1 individual incentive wages
      PC personal calendar
      TE travel expenses/payroll results
      TS travel expenses/master data
      TX infotype texts
      ZI PDC interface -> cost account
    Database Table PCL2
    · The database table PCL2 contains the following data areas:
      B2 time accounting results
      CD cluster directory of the CD manager
      PS generated schemas
      PT texts for generated schemas
      RX payroll accounting results/international
      Rn payroll accounting results/country-specific ( n = HR country indicator )
      ZL personal work schedule
    Database Table PCL3
    · The database table PCL3 contains the following data areas:
      AP action log / time schedule
      TY texts for applicant data infotypes
    Data Management of PCLn
    · The ABAP commands IMPORT and EXPORT are used for management of read/write to
      database tables PCLn.
    · A unique key has to be used when reading data from or writing data to the PCLn.
       Field Name   KEY   Length   Text
       MANDT         X       3     Client
       RELID         X       2     Relation ID (RU, B2, ...)
       SRTFD         X      40     Work area key
       SRTF2         X       4     Sort key for duplicate keys
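     As an illustration of how the work area key SRTFD is filled: for the payroll cluster RU, the
     key field string (structure PC200, shown later in the cluster definition) consists of the
     personnel number and the sequence number. The values below are illustrative:

     ```abap
     * Sketch: building the unique PCLn key for payroll cluster RU.
     * RX-KEY corresponds to structure PC200 (PERNR + SEQNO form SRTFD).
     DATA: BEGIN OF RX-KEY,
             PERNR LIKE PERNR-PERNR,     "personnel number
             SEQNO(5) TYPE N,            "sequence number of the payroll result
           END OF RX-KEY.

     RX-KEY-PERNR = '00001234'.          "illustrative personnel number
     RX-KEY-SEQNO = '00005'.             "illustrative sequence number
     ```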
    Cluster Definition
    · The data definition of a work area for PCLn is specified in separate programs which comply  
       with fixed naming conventions.
    · They are defined as INCLUDE programs (RPCnxxy0). The following naming convention applies:
       n = 1 or 2 (PCL1 or PCL2)
       xx = Relation ID (e.g. RX)
       y = 0 for international clusters or country indicator (T500L) for different country cluster
    Exporting Data (I)
     · The EXPORT command writes one or more data objects to cluster xy under the key xy-KEY.
    · The cluster definition is integrated with the INCLUDE statement.
    REPORT ZHREXPRT.
    TABLES: PCLn.
    INCLUDE: RPCnxxy0. "Cluster definition
     * Fill cluster key
     xy-key-field = .
     * Fill data object
     * Export record
    EXPORT TABLE1 TO DATABASE PCLn(xy) ID xy-KEY.
       IF SY-SUBRC EQ 0.
           WRITE: / 'Update successful'.
       ENDIF.
    Exporting Data (II)
     · Export data using the macro RP-EXP-Cn-xy.
     · When data records are exported using the macro, they are written not to the database but
       to a main memory buffer.
     · To save the data, use the PREPARE_UPDATE routine with the USING parameter 'V'.
    REPORT ZHREXPRT.
    *Buffer definition
    INCLUDE RPPPXD00. INCLUDE RPPPXM00. "Buffer management
    DATA: BEGIN OF COMMON PART 'BUFFER'.
    INCLUDE RPPPXD10.
    DATA: END OF COMMON PART 'BUFFER'.
    RP-EXP-Cn-xy.
    IF SY-SUBRC EQ 0.
         PERFORM PREPARE_UPDATE USING 'V'.
    ENDIF.
    Importing Data (I)
    · The IMPORT command causes data objects with the specified key values to be read from
       PCLn.
    · If the import is successful, SY-SUBRC is 0; if not, it is 4.
    REPORT RPIMPORT.
    TABLES: PCLn.
    INCLUDE RPCnxxy0. "Cluster definition
     * Fill cluster key
     * Import record
    IMPORT TABLE1 FROM DATABASE PCLn(xy) ID xy-KEY.
       IF SY-SUBRC EQ 0.
        " Display the data object
       ENDIF.
    Importing data (II)
     · Import data using the macro RP-IMP-Cn-xy.
     · Check the return code SY-SUBRC: 0 means success, 4 means an error.
     · The buffer management routines in include RPPPXM00 are required.
    REPORT RPIMPORT.
    *Buffer definition
    INCLUDE RPPPXD00.
    DATA: BEGIN OF COMMON PART 'BUFFER'.
    INCLUDE RPPPXD10.
    DATA: END OF COMMON PART 'BUFFER'.
    *import data to buffer
    RP-IMP-Cn-xy.
    *Buffer management routines
    INCLUDE RPPPXM00.
    Cluster Authorization
     · A simple EXPORT/IMPORT statement does not check for cluster authorization.
     · If you EXPORT/IMPORT via the buffer, the buffer management routines check for cluster
       authorization.
    Payroll Results (I)
    · Payroll results are stored in cluster Rn of PCL2 as field string and internal tables.
      n - country identifier.
    · Standard reports read the results from cluster Rn. Report RPCLSTRn lists all payroll results;
      report RPCEDTn0 lists the results on a payroll form.
    Payroll Results (II)
     · The cluster definition of payroll results is stored in two INCLUDE reports:
      include: rpc2rx09. "Definition Cluster Ru (I)
      include: rpc2ruu0. "Definition Cluster Ru (II)
     The first INCLUDE defines the country-independent part; the second defines the country-specific part (US).
    · The cluster key is stored in the field string RX-KEY.
    Payroll Results (III)
    · All the field string and internal tables stored in PCL2 are defined in the ABAP/4 dictionary. This
      allows you to use the same structures in different definitions and nonetheless maintain data
      consistency.
    · The structures for cluster definition comply with the name convention PCnnn. Unfortunately, 
       'nnn' can be any set of alphanumeric characters.
    *Key definition
    DATA: BEGIN OF RX-KEY.
         INCLUDE STRUCTURE PC200.
    DATA: END OF RX-KEY.
    *Payroll directory
    DATA: BEGIN OF RGDIR OCCURS 100.
         INCLUDE STRUCTURE PC261.
    DATA: END OF RGDIR.
    Payroll Cluster Directory
     · To read payroll results, you need two keys: PERNR and SEQNO.
     · You can get SEQNO by importing the cluster directory (CD) first.
    REPORT ZHRIMPRT.
    TABLES: PERNR, PCL1, PCL2.
     INCLUDE: rpc2cd09. "definition cluster CD
    PARAMETERS: PERSON LIKE PERNR-PERNR.
    RP-INIT-BUFFER.
     *Import cluster directory
        CD-KEY-PERNR = PERSON.
     RP-IMP-C2-CU.
        CHECK SY-SUBRC = 0.
     LOOP AT RGDIR.
        RX-KEY-PERNR = PERSON.
        UNPACK RGDIR-SEQNR TO RX-KEY-SEQNO.
        "Import payroll result from PCL2
        RP-IMP-C2-RU.
     ENDLOOP.
     INCLUDE: RPPPXM00. "PCL1/PCL2 buffer handling
    Function Module (I)
      CD_EVALUATION_PERIODS
     · After importing the payroll directory, it is up to the programmer to decide which record to read.
    · Each payroll result has a status.
      'P' - previous result
      'A' - current (actual) result
      'O' - old result
     · Function module CD_EVALUATION_PERIODS restores the status each payroll result had in the
        period when that payroll was originally run. It also selects all the relevant periods to be evaluated.
    Function Module (II)
    CD_EVALUATION_PERIODS
    call function 'CD_EVALUATION_PERIODS'
         exporting
              bonus_date = ref_periods-bondt
              inper_modif = pn-permo
              inper = ref_periods-inper
              pay_type = ref_periods-payty
              pay_ident = ref_periods-payid
         tables
              rgdir = rgdir
              evpdir = evp
              iabkrs = pnpabkrs
         exceptions
              no_record_found = 1.
    Authorization Check
       Authorization for Persons
    ·  In the authorization check for persons, the system determines whether the user has the 
       authorizations required for the organizational features of the employees selected with
       GET PERNR.
    ·  Employees for which the user has no authorization are skipped and appear in a list at the end
       of the report.
    ·  Authorization object: 'HR: Master data'
    Authorization for Data
    · In the authorization check for data, the system determines whether the user is authorized to
      read the infotypes specified in the report.
    · If the authorization for a particular infotype is missing, the evaluation is terminated and an error
      message is displayed.
    Deactivating the Authorization Check
    · In certain reports, it may be useful to deactivate the authorization check in order to improve
      performance. (e.g. when running payroll)
    · You can store this information in the object 'HR: Reporting'.
     These are the main areas they ask questions on.
