Deletion strategy

I'm considering using BDB JE for building an application that needs to store a large number of events per day (on the order of 100,000,000). The record structure is pretty small (I assume it will be less than 100 B each), but I was wondering what the best strategy would be for adding such a large number of records to the database and then removing them after a given amount of time, according to the DB aging/archiving policy. What I'm thinking of is a sort of partitioning: creating a separate DB for each day and removing it from the environment when I'm done with it. Suggestions?
Regards.
Antonio.

Hi Antonio,
You could have a logical grouping of databases, for example an event database for each month, and then simply delete the oldest database when you add a new one.
You could do this by creating a db and then calling Environment.removeDatabase or truncateDatabase. This approach lets you administer and access the db within a single env in the normal way. But when you call removeDatabase or truncateDatabase, the log cleaner has to remove all that data from the intermingled files.
A faster approach is to put each droppable set of dbs into its own environment. When the time to delete comes, you just rm the whole environment directory, so it's really fast; you skip all of the log cleaning work. But the downsides are that:
- each db is in a separate environment, so you can't use txns across environments, if that's a concern;
- each environment has its own cache, so the app has to take on the burden of deciding how much cache to allot to each env.
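The expiry half of the per-day-environment approach boils down to plain directory rotation. Here is a minimal sketch under the assumption that each day's environment lives in its own `dataRoot/yyyy-MM-dd` directory (a layout invented here for illustration, not anything JE mandates):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the "one environment per day" rotation: each day's
// Environment lives in its own directory named by ISO date, so plain
// string comparison finds the partitions older than the cutoff.
public class EnvRotation {

    // Return the day-directory names that fall before the retention cutoff.
    static List<String> expiredDays(List<String> dayDirs, String cutoffDay) {
        List<String> expired = new ArrayList<>();
        for (String day : dayDirs) {
            if (day.compareTo(cutoffDay) < 0) {
                expired.add(day);
            }
        }
        return expired;
    }

    public static void main(String[] args) {
        List<String> days = List.of("2010-10-24", "2010-10-25", "2010-10-26");
        // Everything before 2010-10-25 is droppable:
        System.out.println(expiredDays(days, "2010-10-25"));
        // For each expired day you would close that day's Environment and
        // recursively delete its directory (java.nio.file.Files) -- no log
        // cleaning involved, which is what makes this approach fast.
    }
}
```

Deleting a whole directory sidesteps the log cleaner entirely, which is the point Ron makes above; the trade-off is the per-environment cache and the loss of cross-environment transactions.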
You may also find this forum entry useful:
Multiple envs in one process?
We hear that folks want a single cache across multiple envs, but we don't have that available yet.
Regards,
Ron Cohen
Oracle Corporation

Similar Messages

  • Duplicate processing by DBAdapter when using Distributed Polling with Logical Delete Strategy

    We have DBAdapter-based polling services in OSB running across two active-active clusters (20 managed servers in total across the 2 clusters),
    listening to the same database table. (Both clusters read from the same source DB.) We want to ensure distributed polling without duplication,
    so in the DBAdapter we have selected the Distributed Polling option, meaning we are using "SELECT FOR UPDATE SKIP LOCKED".
    But we see that sometimes the same rows are processed by two different nodes and transactions are processed twice.
    How do we ensure that only one managed server processes a particular row using SELECT FOR UPDATE? We do not want to use the MarkReservedValue option, which was preferred in older versions of the DBAdapter.
    We are using the following values in the DBAdapter configuration; the JDev project for the DBAdapter and the OSB proxy using the DBAdapter are attached.
    LogicalDeletePolling Strategy
    MarkReadValue = Processed
    MarkUnreadValue = Initiate
    MarkReservedValue = <empty as we are using Skip Locking>
    PollingFrequency = 1 second
    maxRaiseSize = 1
    MaxTransactionSize = 10
    DistributionPolling = checked   (adds lock-n-wait in properties file and changes the SQL to SELECT FOR UPDATE SKIP LOCKED)
    Thanks and Regards
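For reference, the shape of the polling statement that the Distributed Polling option produces can be sketched like this. It is only a rough approximation built from the column values listed above; the SQL the adapter actually generates may differ:

```java
// Rough approximation of the polling SQL the DBAdapter issues once
// Distributed Polling is enabled: SKIP LOCKED makes each node skip rows
// another node has already locked, instead of blocking on them, so each
// row should be claimed by exactly one poller.
public class PollQuery {

    static String buildPollQuery(String table, String statusCol,
                                 String unreadValue, int maxRows) {
        return "SELECT * FROM " + table
             + " WHERE " + statusCol + " = '" + unreadValue + "'"
             + " AND ROWNUM <= " + maxRows
             + " FOR UPDATE SKIP LOCKED";
    }

    public static void main(String[] args) {
        // Values taken from the configuration listed in the post above.
        System.out.println(buildPollQuery("job_table", "job_status", "Initiate", 10));
    }
}
```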

    Hi All,
    Actually I'm also facing the same problem.
    Steps I followed:
    1) Created a job table in the database:
    create table job_table(id, job_name, job_desc, job_status)
    2) Created a BPEL process to test the inbound distributed polling.
    3) Configured the DBAdapter for polling:
    a) Update a field in job_table with logical delete.
    b) Select the field name from the drop-down.
    c) Change the read value to Inprogress and the unread value to Ready.
    d) Don't change the reserved value.
    e) Select the check box for "distributed polling".
    f) The query gets appended with "FOR UPDATE NOWAIT".
    g) Click Next and then Finish.
    4) Then I followed the steps below:
    To enable pessimistic locking, run through the wizard once to create an inbound polling query. In the Applications Navigator window, expand Application Sources, then TopLink, and click TopLink Mappings. In the Structure window, click the table name. In Diagram View, click the following tabs: TopLink Mappings, Queries, Named Queries, Options; then the Advanced… button, and then Pessimistic Locking and Acquire Locks. You see the message, "Set Refresh Identity Map Results?" If a query uses pessimistic locking, it must refresh the identity map results. Click OK when you see the message, "Would you like us to set Refresh Identity Map Results and Refresh Remote Identity Map Results to true?" Run the wizard again to regenerate everything. In the new toplink_mappings.xml file, you see something like this for the query: <lock-mode>1</lock-mode>.
    5) lock-mode is not changed to 1 in toplink_mappings.xml.
    Can we edit toplink_mappings.xml manually?
    If yes, what are all the values I need to change in toplink_mappings.xml so that it will not pick up the same record multiple times in a clustered environment?
    Please help me out; this is urgent.
    Thanking you in advance.

  • Poll Database Adapter Physical Delete strategy

    Guys,
    The database adapter should delete the record from the table only if the record was processed successfully in ESB or BPEL. But as far as I have tested, it deletes the polled record regardless of success or fault.
    Is there any setting I need to make to force the delete only when a successful instance is created?

    The DB Adapter should participate in the transaction, so upon failure the delete should be rolled back.
    Can you make sure you aren't using any MCF properties, and also that your process doesn't have any dehydration point (e.g. recv/on-msg)?
    Regards,
    Chintan

  • DBAdapter polling with logical delete x distrib polling x DB rows per trans

    Hi all.
    I'm trying to configure a DBAdapter with the "logical delete" polling strategy, distributed polling (clustered environment), and a defined number of "Database Rows per Transaction".
    When I check the "Distributed Polling" box, the generated SQL gets appended with "FOR UPDATE NOWAIT".
    However, when I set a value for "Database Rows per Transaction", the "FOR UPDATE NOWAIT" clause disappears.
    Is this a bug, or some limitation related to the "logical delete" strategy?
    Thanks
    Denis


  • DB Adapter Polls each record only once.

    Hi All
    I am using the DB adapter logical delete polling strategy in my BPEL process. Whenever a new record is created or updated in the DB table, a BPEL instance gets kicked off.
    With the logical delete strategy, the DB Adapter always updates the table with the read value (given in the wizard) after consuming a newly created record. Now, once a newly created record has been processed/consumed by the DB Adapter, it will not be picked up again if I update the same record. Reason: the DB adapter has already marked it as 'Read', and my application cannot reset the column marked by the DB Adapter.
    Does anybody have another idea to achieve this?
    thanks
    /Mishit

    Why can't your application update the field back to the unread value? It is filled initially with a not-read value, isn't it? A simple update trigger on the table could do the trick. If not, you should create a staging-area table (in another schema, if you want) that gets records whenever an insert or update takes place on your application table. Without triggers on the original table it is not possible. Did you know you can also create a trigger from a staging-area schema on a table in another schema?
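The update-trigger idea can be sketched as DDL like the following. Table and column names here are hypothetical, the DDL is Oracle-flavored, and it is untested against a real schema:

```java
// Sketch of the update-trigger suggestion: when the application updates a
// row the adapter already marked 'Read' (old and new status both 'Read',
// meaning the update came from the application, not the adapter), flip
// the marker back to 'Ready' so the adapter polls the row again.
public class ResetTrigger {

    static String resetTriggerDdl(String table, String statusCol) {
        return "CREATE OR REPLACE TRIGGER " + table + "_reread\n"
             + "BEFORE UPDATE ON " + table + "\n"
             + "FOR EACH ROW\n"
             + "WHEN (old." + statusCol + " = 'Read' AND new." + statusCol + " = 'Read')\n"
             + "BEGIN\n"
             + "  :new." + statusCol + " := 'Ready';\n"
             + "END;";
    }

    public static void main(String[] args) {
        System.out.println(resetTriggerDdl("events", "status"));
    }
}
```

The WHEN clause matters: the adapter's own mark-as-read update changes the status from 'Ready' to 'Read', so old status is not yet 'Read' and the trigger stays quiet; only a later application update of an already-read row triggers the reset.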
    Kind Regards,
    Andre

  • DBAdapters and polling - Exactly how do they work?

    Can someone help me clarify how a DB Adapter polls for changes on a database?
    I'm looking at this as a starting point for an ESB process, so I would be grateful for any other relevant information you could provide.
    1) Is this just based on the last time the row was updated?
    2) Is this going to mean a big table scan for large tables, unless you create a new index to make this more efficient?
    3) What happens if the DB that the adapter is looking at is down? Will entries be lost, or will they be picked up by the last-updated time?
    4) Do the DB adapters work differently for Oracle Transparent Gateways than when using a generic gateway such as MySQL?
    Many thanks
    Chris
    (updated)

    Answering as best I can [I don't work for Oracle]:
    1) Is this just based on the last time the row was updated? Well, the best is to clear the row [i.e. delete it when you're done reading], because otherwise you need a logical deletion strategy based on a sequence key; see the Database Adapter manual for more.
    2) Is this going to mean a big table scan for large tables, unless you create a new index to ensure this is more efficient? No. Again, see the DB Adapter manual.
    3) What happens if the DB that the adapter is looking at is down? Will entries be lost, or will they be picked up with the last updated time? Since [IMHO] the sequence key or whatever will be in the same DB as the source, it will be transactional and picked up then as well.
    4) Do the DB adapters work differently for Oracle Transparent Gateways than when using a generic gateway such as MySQL? My understanding was that it is all JDBC and fairly generic.
    But I agree this very important area is very badly documented indeed.
    My own bugbear is the lack of accurate documentation around once-and-only-once guaranteed delivery [addressed to some extent in the current 10.1.3 SOA best practices, but you have to hunt really hard to find it].
    The same goes for ordered message delivery.
    Out of the box, ESB should provide once-and-only-once, in-order message delivery; the documentation never tells you whether it does or not [i.e. if a message is in the Hospital, all the other messages should stop behind it]. Do they or not? What are the absolute conditions for this? Where are the advanced JMS/queue discussions around this?
    Putting little synch/asynch diagrams doesn't really help at this level of detail.
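The "logical deletion strategy based on a sequence key" from point 1 boils down to remembering the highest key already processed. A minimal in-memory sketch follows; the row shape and names are invented for illustration, not the adapter's actual implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Sequence-key polling in miniature: rows are never deleted or flagged;
// the poller just remembers the highest key it has handed out and only
// returns rows beyond that watermark. With an index on the key column
// this avoids the full table scan asked about in question 2.
public class SequencePoller {
    private long lastReadId = 0;

    // Stand-in for: SELECT * FROM events WHERE id > :lastReadId ORDER BY id
    List<long[]> poll(List<long[]> table) {
        List<long[]> fresh = new ArrayList<>();
        for (long[] row : table) {       // row[0] is the sequence key
            if (row[0] > lastReadId) {
                fresh.add(row);
                lastReadId = row[0];     // in real use, persist this watermark
            }
        }
        return fresh;
    }
}
```

Persisting the watermark in the same DB as the source table is what makes recovery transactional, as point 3 notes.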

  • How to debug DB Adapter

    Hi,
    In our production DB we have a DB Adapter (SOA Suite 10.1.3.5.0) which seems to have suddenly stopped polling a table for records.
    For the OC4J instance I have set the following log levels:
    oracle.tip.esb.server.dispatch =FINEST
    oracle.tip.esb.server.service =FINEST
    oracle.webservices.management.auditing =FINEST
    oracle.webservices.management.logging =FINEST
    oracle.webservices.reliability =FINEST
    oracle.webservices.service =FINEST
    oracle.tip.adapter.db
    oracle.tip.adapter.db.DBManagedConnectionFactory
    oracle.tip.adapter.db.inbound
    oracle.tip.adapter.db.outbound
    oracle.tip.esb.server.service.impl.inadapter
    oracle.tip.esb.server.service.impl.outadapter
    When I disable and enable the DB Adapter "DB_IDCARD_IN", log.xml shows me the following relevant log lines:
    <MSG_TEXT>ESB-Service - Performing Endpoint Deactivation for event 0BAD9DA2CAD011DFBFA5DFF32328B9BE and WSDL location esb:///ESB_Projects/PREABXS_ESB_VNTS_IDCARD_IN/DB_IDCARD_IN.wsdl</MSG_TEXT>
    <MSG_TEXT>Creating IEsbService instance for service "DB_IDCARD_IN".</MSG_TEXT>
    <MSG_TEXT>ESBActivation agent ctor for service : DB_IDCARD_IN</MSG_TEXT>
    <MSG_TEXT>Activating IEsbService for service "DB_IDCARD_IN".</MSG_TEXT>
    <MSG_TEXT>Activating endpoint for inbound adapter service "PREABXS.DB_IDCARD_IN".</MSG_TEXT>
    <MSG_TEXT>Activate Endpoint portType: DB_IDCARD_IN_ptt</MSG_TEXT>
    <MSG_TEXT>ESB-Service - Performing Endpoint Activation for operation PREABXS.DB_IDCARD_IN_RS.receive using WSDL location esb:///ESB_Projects/PREABXS_ESB_VNTS_IDCARD_IN/DB_IDCARD_IN.wsdl</MSG_TEXT>
    WSDL location: "esb:///ESB_Projects/PREABXS_ESB_VNTS_IDCARD_IN/DB_IDCARD_IN.wsdl"
    portType: "DB_IDCARD_IN_ptt"
    <MSG_TEXT>Resolving location for "DBIDCARDIN_toplink_mappings.xml".</MSG_TEXT>
    <MSG_TEXT>Successfully finished endpoint activation for operation "PREABXS.DB_IDCARD_IN_RS.receive".</MSG_TEXT>
    The opmn log shows:
    <2010-10-26 14:48:52,118> <INFO> <dl.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraESB - performing endpointDeactivation for portType=DB_IDCARD_IN_ptt, operation=receive
    10/10/26 14:48:56 following Normal flow1
    <2010-10-26 14:48:57,165> <INFO> <dl.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraESB - endpointActivation for portType=DB_IDCARD_IN_ptt, operation=receive
    <2010-10-26 14:48:57,243> <WARN> <dl.collaxa.cube.activation> <Database Adapter::Inbound> <oracle.tip.adapter.db.inbound.IPAddrPollingService getUniqueMarkReservedValue> The markReservedValue of distributed polling might not be a unique value. To make it so, you can add the last x digits (default = 2) of the host IP address using this format for the MarkReservedValue: [original_val]${IP[-x]}
    <2010-10-26 14:48:57,306> <INFO> <dl.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraESB - successfully completed endpointActivation for portType=DB_IDCARD_IN_ptt, operation=receive
    The DB Adapter uses a logical delete strategy, but I do not see any rows changed to the 'reserved' value. On the MS SQL 2005 DB the DBA does see the SQL statements executed by the adapter, but I cannot see any response (query results). The DB Adapter uses a connection pool based on com.microsoft.sqlserver.jdbc.SQLServerDriver.
    How can I trace the queries and responses executed by the DB Adapter? On the DB side, in SQL Server Profiler, I do see that the SELECT and UPDATE statements from the DB Adapter are executed; I do not know if you can see the actual query results there. But I do not see any results in ESB or in the column used for the logical delete.
    Any hints or tips?
    Cheers,
    Peter

    In the end the cause was an undocumented clause in the adapter; the adapter was working as designed.

  • Regarding BW topics

    Hello Gurus!!!
    Presently I have been working in BW for one month. I will be working on BW archiving.
    I have finished with the basics of BW from Fu Fu.
    I have practically implemented extraction of data from flat files, R/3, etc.
    Could you all please suggest which concept I should go for next?
    Kindly give me a roadmap so that my BW basics become clear.
    Helpful answers will be rewarded.
    Regards,
    Aryan

    Hi
    1. To archive data from an InfoProvider, you first need to create an archiving object for the InfoProvider.
    See also: Archiving Object.
    2. Archiving is carried out in Archive Administration.
    You can call up the following commands from here:
    - Maintain archiving variants.
    - Schedule an archiving run; you can first start a test run before starting the actual archiving run.
    - Schedule a delete run.
    - Store and retrieve from a storage system.
    Also refer to the detailed documentation in Archive Administration.
    Special technical features in the BW system (the original includes a graphic of the BW data archiving architecture):
    Variants maintenance:
    In order to schedule the writing program, you first have to create a variant. Depending on the archiving method that you chose when defining the archiving object, the selection screen is built in different ways. With the archiving of time slots (time periods with beginning and end points), you can only use selection conditions for the selected time characteristic (see also: Time Restrictions). In doing so, maintenance is kept to a minimum.
    Writing to the archive
    Access to data for reading or writing from the archiving processes then occurs by means of the data manager interface. When data is read from InfoCubes, it is transformed from the star-shaped table structure into a flat format that only contains the actual characteristic attributes. Thus, the archive is independent of a possible reorganization of IDs used in the star schema. The disadvantage is an increase in the volume of data that is at least partially cleared again by means of the data compression in the archive file.
    For each archiving object, an executable ABAP program is generated to write to the archive. The program generates and processes archiving requests.
    First of all, it generates an archiving request. The selection criteria and archiving run information are stored in the archiving request, and a status is administered. The system uses the InfoProvider interface to read InfoCube and ODS data according to the grouping characteristics that were set. With InfoCubes, data is stored in the I-structure in the archive (no MDIDs, no navigation attributes or table-enabled field names). For ODS objects, it is written to archive files in the A-table structure.
    Deleting archived data:
    A background process is started for each archive file when the archived data is deleted. Depending on the archiving object settings, it is either automatically started in the write phase, started manually in Archive Administration or is triggered by a particular event. Each productive delete process has three steps:
    1. In the first step, the archive file is verified in test mode. The system checks whether the archive file is complete and whether it can be accessed. Successful verification of the file is stored persistently in the corresponding archive request. The background process ends if the write phase is still running, or if the system has not verified all archive files in the associated archiving run.
    2. The second step begins when the archiving run's write phase has ended successfully and all generated archive files have been successfully verified. In this step, data is deleted from the database using the selection criteria of the archiving run. An optimal delete strategy is displayed for the selection criteria (see also Selective Deletion from an InfoCube and from an ODS Object). For InfoCubes, this step also involves adjusting the aggregates.
    3. When the data has been successfully deleted, all archive files in the archiving run are confirmed. This is the third step. As a result, the data is regarded as successfully deleted with respect to the ADK.
    Here is a schematic display of the deletion process:
    For more information on the strategies that the system chooses for deletion, read the Background Information.
    3. You can use the Extractor Checker (transaction RSA3) to check an archive file.
    Also refer to Check Archive File.
    4. You can extract data from BW archives in order to reload it into the InfoProvider.
    We do not recommend reloading data directly into the original object. As an alternative, you can use the archive connection for the Export DataSource of an InfoCube or ODS object. Archived data can then be extracted from the archive of the original object and loaded into an InfoProvider with the same (or a similar) structure using the data mart interface. Reporting permits a combination of data that has yet to be archived with data that has been extracted from the archive, using the MultiProvider function.
    Also refer to Reload Data from Archive File.
    You can find detailed step-by-step instructions under Archiving Data.
    For more information about possible errors during the archive run and their removal, read the Background Information.
    http://help.sap.com/saphelp_nw04/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    Assign points if it helps you.

  • Database polling adapter within an asynchronous BPEL process

    Hi,
    I have a requirement to poll a database table within an asynchronous process. The reason I want to use a receive within an asynchronous BPEL is that the process needs to be notified many times through polling, so it might need many receive activities.
    I have developed a process as described below:
    1. Created an asynchronous process.
    2. Created a Database adapter for polling a table. Logical delete strategy is being used.
    3. Have put a receive activity from the DB Adapter created.
    4. Created a correlation set consisting of a single property(String)
    5. Created a property alias to refer to
    a. the input string of the asynchronous process. (unique for this process)
    b. the input from the database adapter (unique for this process)
    I initiate this process from the console. Then I inserted a record into the table with proper inputs.
    But the BPEL waits indefinitely at the receive activity.
    When I try to poll the table from an empty BPEL process, it works perfectly fine.
    Could someone please help me here.
    Thanks in advance
    Saptarishi

    Hi,
    Could someone please answer this query?
    Regards,
    Saptarishi

  • Database adapter exception handling

    We are using a Database adapter in an ESB process that polls using the DeletePolling strategy.
    On hitting an exception due to a date format issue, the adapter stops polling, and polling resumes only after identifying and removing the faulty record from the table.
    We have to identify the faulty records manually, since the error is not reported in any of the log files.
    Can we configure logging for this type of error?
    Would any rejection handling work in this case?
    Using Oracle ESB 10.1.3.4

    When records are inserted into the table, the date format will be a SQL type supported by the DB adapter, so this might not be causing the issue. I hope you don't have a scenario where the DB adapter is updating a record through its logical delete strategy while the same record is updated by another service; that would stop DB adapter polling. Provide the log trace for this. You can change the mode to trace-32 at the soa.adapter level in the EM console.
    HTH.
    -Sriharish.

  • DBAdapter polling for new or changed records not issuing callback within an asynchronous BPEL process?

    Hi,
    I have a requirement to poll a database table within an asynchronous process. The reason I want to use a receive within an asynchronous BPEL is that the process needs to be notified many times through polling, so it might need many receive activities.
    I have developed a process as described below:
    1. Created an asynchronous process.
    2. Created a Database adapter for polling a table. Logical delete strategy is being used.
    3. Have put a receive activity from the DB Adapter created.
    4. Created a correlation set consisting of a single property(String)
    5. Created a property alias to refer to
    a. the input string of the asynchronous process. (unique for this process)
    b. the input from the database adapter (unique for this process)
    I initiate this process from the console. Then I inserted a record into the table with proper inputs.
    But the BPEL waits indefinitely at the receive activity.
    When I try to poll the table from an empty BPEL process, it works perfectly fine.
    Regards
    Kabir

    Hi Kabir,
    Please refer to the following sample for your use case.
    Please be aware, though, that one instance of the BPEL process will correlate with only one message if you have one receive activity, not all the messages that the DBAdapter polls.
    bpel-305-InboundCorrelation shows how to perform message correlation within BPEL.
    Also refer to the following in the Developer's Guide:
    http://docs.oracle.com/cd/E28280_01/dev.1111/e10224/bp_correlate.htm#CHDFHAAE
    If you want to keep polling and receiving messages in the same BPEL process, you might have to put the receive activity in a while-loop construct, but this is not a good design.
    Can you elaborate on your requirement so we can understand it better and suggest a better solution?

  • Can we put multiple Database Polling within the asynchronous process

    Hi,
    Can we put multiple database polling operations within the same asynchronous BPEL process?
    There will be multiple Receive activities to receive the data from the poller.
    Saptarishi

    Hi Peter,
    I am using 10.1.3.4 and cannot use ESB/mediator.
    For the 2nd option,
    I have tried to put multiple receives to poll data from a table, but it does not work (it waits at the 2nd Receive). Let me try to elaborate the issue a bit.
    I have created a table for polling.
    The table has 3 columns :- transaction_id, status, is_read
    is_read is kept to implement the logical delete strategy of DB polling.
    What I need to do is maintain a single BPEL instance per transaction (which is uniquely identified by transaction_id).
    The BPEL should track the different stages of the transaction, i.e. whenever a new row is inserted in the table with an updated status for a transaction, it must find the correct instance (by correlating on transaction_id) and go to the next step (wait at the next receive activity) until all the steps of the transaction are completed.
    In the code, I created a correlation set consisting of transaction_id(only). The property transaction_id has an alias to the inbound variable's payload.
    In the first Receive, I checked the "create instance" checkbox. Also the correlation set is initialized.
    From next Receive onwards the correlation set is checked.
    The polling frequency is 5 secs.
    The first Receive works fine and the process is instantiated. But it waits at the 2nd Receive indefinitely.
    It will be very helpful if you can try this once. If you feel this is not the right approach, please guide me.
    Thanks and Regards,
    Saptarishi

  • Can I create a File object without writing it to the disk?

    I need to construct a mechanism where I can "write" instructions to a file and then FTP this file to a remote system. It is basically a Telnet-like integration for a system that does not support Telnet. Once the file is on the remote device, it knows how to read the file and process its instructions.
    I would like to logically create this File object; by that I mean that I would like to use a FileWriter object to write the text instructions to the File, then use the Jakarta Commons NET API to FTP the file to the remote system. My question: can I create this File object without the file actually being written to the file system? Can the File just be memory resident for this creation and FTP?
    There could be tens of thousands of these transactions per day and I would like to avoid any kind of deletion strategy, if I can. Thanks.

    Apart from the questionable decision to try and mess with the File object, I think the answer to (my guess at) your original question is "Yes".
    Jakarta Commons/Net FTP can upload to a server from any input stream you like. It doesn't have to be a FileInputStream. In particular it could be a ByteArrayInputStream, which reads from a byte array in memory.
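A minimal sketch of that approach: build the "file" in a byte array and wrap it in a ByteArrayInputStream. The FTP call itself appears only as a comment, since it needs a live server; Commons NET's FTPClient.storeFile(String, InputStream) accepts any InputStream:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.UncheckedIOException;
import java.io.Writer;
import java.nio.charset.StandardCharsets;

// Build the "file" entirely in memory: write the instruction text into a
// byte array, then expose it as an InputStream. Nothing touches the disk,
// so there is nothing to delete afterwards.
public class InMemoryUpload {

    static ByteArrayInputStream buildInstructionStream(String instructions) {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (Writer w = new OutputStreamWriter(buffer, StandardCharsets.UTF_8)) {
            w.write(instructions);
        } catch (IOException e) {
            throw new UncheckedIOException(e); // cannot happen for in-memory streams
        }
        return new ByteArrayInputStream(buffer.toByteArray());
    }

    public static void main(String[] args) {
        ByteArrayInputStream in = buildInstructionStream("MOVE A TO B\n");
        // With Commons NET this stream goes straight to the server:
        //   ftpClient.storeFile("instructions.txt", in);
        System.out.println(in.available() + " bytes ready to upload");
    }
}
```

The Writer here plays the role the original poster wanted FileWriter for; it just targets a byte buffer instead of a file.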

  • Future plans  of sap

    Hi all,
             Please explain the future plans of SAP.
    regards,
    shyam

    Dear shyamkiran shetty,
    A conference is scheduled for 16-18th June with the following agenda. This will give you a clear-cut view of SAP's future plans.
    Solution Manager
    Why use it and why do I need it?
    SAP's strategy and plans
    Will it become mandatory to use Solution Manager?
    Customer experiences
    Practical uses of the product
    What tools are included in Solution Manager?
    The role of Solution Manager in upgrades and loading data
    Presentations showing and discussing functionality in the areas of System Monitoring, Helpdesk, System Landscape Management, Business Process Management, Change Management, Testing and Document Management
    What other tools are currently available as a comparison? Why choose other tools and what are the consequences?
    User Interfaces
    Current UI's - what are their features, benefits and characteristics?
    Why should I choose one over another?
    What are the pre-requisites and support implications?
    UI's developed using WebDynpro (both ABAP and Java)
    Adobe UI's - Forms and Flex
    GuiXt, Visual Composer, Portals, .NET
    Mobile technology and mobile infrastructure
    Upgrades
    Building a business case
    Examples of a business case and justification for an upgrade
    What are the compelling reasons to upgrade?
    What benefits and ROI can be gained?
    To upgrade or re-implement?
    Hear customer stories from completed upgrades
    Leveraging an upgrade to gain a business benefit
    New functionality contained in the latest versions
    The new features and tools - how to find and use them
    After the upgrade - exploiting new functionality
    Technical aspects of an upgrade
    What are the landscape and sizing implications and how do we manage those issues?
    Virtualisation - what benefits can it deliver and what limitations exist?
    Unicode upgrades
    Preparing your technical staff and resources for the upgrade
    How can I address the training and education requirements?
    Planning for continuous improvement
    TDMS - why use it and what benefits will it bring to us?
    Planning an archiving strategy, as opposed to deletion strategy
    SAP Strategy/Futures
    SAP has stated its strategy for new releases up to 2011. What does this mean to the current customers who need to make strategic plans of their own?
    In light of recent acquisitions what are the implications to customers and partners?
    The interdependencies between the various SAP solutions such as BI, CRM, and APO
    What's coming with each solution and how does this affect customers?
    What is the effect of implementing each component of SAP on other areas and the total system landscape?
    Using existing functionality in industry solutions within your solution
    Business By Design
    SAP's SDN and BPX communities
    SAP training and certification changes and what they mean
    What is coming next?
    XI(PI)/xApps/NetWeaver
    How is XI (PI) currently being used and what are the plans for the future?
    Real solutions, what has been done, how is it used and what is the benefit of replacing existing interfaces with XI?
    How are companies exchanging information between systems using XI (B2B)?
    Customer success stories and examples of what can be achieved with XI and xApps
    The xApps that are available
    The NetWeaver technology strategy
    NetWeaver core components and inter-relationships
    What demand do the inter-relationships place on the system landscape?
    Development/Modelling
    Building WebDynpro applications that are supportable, upgradeable and reusable
    Object oriented ABAP, ABAP shared objects, consuming web services and SAP's enhancement framework
    Discussions and demonstrations of the various modelling and developing tools
    ABAP vs Java/.NET, and Java WebDynpro
    Designing and building in the composite environment
    Comparisons of the various modelling tools in the market, including Visual Composer and some common third party tools such as ARIS etc.
    What skills are required within an organisation for modelling?
    Business Process modelling
    Testing environments - use, requirements and demonstrations of TDMS
    Portals
    The business case
    Justifying the need to implement a portal
    What are the benefits we can expect to achieve?
    The new features and functionality of EP 7.1 over EP 6
    Federated portals and examples of successful implementations
    The technical case
    Remote access and security
    Single sign-on, authentication and authorisations
    System dependencies, administration issues and enterprise search
    Third party portals - what are the options for deploying content?
    Integrating non-SAP technology and portals
    What skills have customers had to employ to support new portal technology?
    System Admin
    Virtualisation and adaptive computing - what options are available?
    System landscape optimisation, managing complex landscapes and sizing
    Unicode conversions
    Server consolidation and performance tuning
    Database comparisons, managing large databases and database migration
    To TDMS or not to TDMS?
    Archiving and archive links
    Document management solutions
    Data integration, workflow and job scheduling
    Infrastructure and architecture platforms
    Installing, monitoring and maintaining SAP's J2EE server
    SAP's support pack strategy
    Monitoring and maintaining the SAP portal
    Security
    Security continues to be of interest as more people work outside the traditional environment and as Portal technology is increasingly used.
    Best practices
    Portal issues
    Role management, identity management across systems and the aspects of a single sign-on environment
    Authentication and authorisations in the mobile workforce
    Access for non-SAP users
    Access for contractors and short-term workers
    Additional security products such as Virsa
    Support
    SAP's support pack strategy and support tools
    What does SAP Active Global Support offer?
    Outsourcing issues and options
    Best practices for support
    The service marketplace - Helpdesk experiences and alternatives
    Building a support team
    Do let me know in case of any queries.
    Hope this helps you.
    Regards,
    Rakesh

  • SAP FUNCTIONAL

    Hi,
    I want to get into the SAP functional side. Please guide me on which area is best in the market.
    I've been in the IT field for more than 7 years.
    Thanks in advance.

    Dear Kobby Bryant,
    It would help if you could give us a brief outline of your academic background and work profile; this will help the forum members guide you better.
    In case you want to know the key and hot areas in SAP, please refer to this list:
    Solution Manager
    Why use it and why do I need it?
    SAP's strategy and plans
    Will it become mandatory to use Solution Manager?
    Customer experiences
    Practical uses of the product
    What tools are included in Solution Manager?
    The role of Solution Manager in upgrades and loading data
    Presentations showing and discussing functionality in the areas of System Monitoring, Helpdesk, System Landscape Management, Business Process Management, Change Management, Testing and Document Management
    What other tools are currently available as a comparison? Why choose other tools and what are the consequences?
    User Interfaces
    Current UI's - what are their features, benefits and characteristics?
    Why should I choose one over another?
    Do let me know in case of any queries.
    Hope this helps you.
    Regards,
    Rakesh
