Maintain history in DW

hi,
what is the best way to maintain history records in a DW?
I think I should create start date and end date columns in the tables.
How would I create the mapping for this?
robson

Hello Malathi,
There are different approaches. The one I described some time ago works with two mappings for each table for which you want to keep history (SCD Type II), although the logic could also be placed in a single mapping.
In the first mapping I use an external table that contains my source data. I use a set operator to subtract all of the valid target records from the records of the external table. The SQL statement OWB generates looks, in principle, like this:
select * from source_table
minus
select * from target_table
where target_table.valid_to_date = to_date('9999-12-31','YYYY-MM-DD')
The output of the set operation contains all the new records from the source table that are not already present in the target table. Therefore I can simply insert these records into the target table, using a sequence for the surrogate key.
In the second mapping I do it just the other way round, so that the SQL statement OWB creates looks, in principle, like this:
select * from target_table
where target_table.valid_to_date = to_date('9999-12-31','YYYY-MM-DD')
minus
select * from source_table
The output of the set operation now contains all the records in the target table that have to be terminated (valid_to_date set to load date - 1). Therefore I perform an update on the target table with a target filter (target filter for update: target_table.valid_to_date = to_date('9999-12-31','YYYY-MM-DD')) to make sure I only update the current version (loading type: update; match by no constraints).
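If you prefer to put the whole SCD II logic into one mapping (or run it as plain SQL), the two steps can be sketched roughly as below. This is only a minimal sketch: target_seq, business_key and attr_1 are placeholder names, not objects from the original mappings, and in practice every tracked column has to be compared, not just one.
-- Step 1: insert new versions - source rows with no identical current row in the target
INSERT INTO target_table (surrogate_key, business_key, attr_1, valid_from_date, valid_to_date)
SELECT target_seq.NEXTVAL, s.business_key, s.attr_1,
       TRUNC(SYSDATE), TO_DATE('9999-12-31','YYYY-MM-DD')
FROM  (SELECT business_key, attr_1 FROM source_table
       MINUS
       SELECT business_key, attr_1 FROM target_table
       WHERE  valid_to_date = TO_DATE('9999-12-31','YYYY-MM-DD')) s;

-- Step 2: terminate current target rows that no longer exist identically in the source
-- (the rows inserted in step 1 do exist in the source, so they stay current)
UPDATE target_table t
SET    t.valid_to_date = TRUNC(SYSDATE) - 1
WHERE  t.valid_to_date = TO_DATE('9999-12-31','YYYY-MM-DD')
AND    (t.business_key, t.attr_1) NOT IN
       (SELECT business_key, attr_1 FROM source_table);  -- beware of NULLs with NOT IN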
That's it.
Regards,
Jörg

Similar Messages

  • Hide URL in PDF output in hyperlink , Maintain History within reports

    hi
    I have some issues to discuss for report output in PDF format.
    We are using a Java application to call reports via AS 10g. My issues are:
    1) Hide the URL in PDF output for the hyperlink created (using username/password in a key is not enough).
    2) Maintain history: when I call a second report from my parent report, I want to be able to go back to the parent report. (Note that go.history or history.go(-1) work in HTML output format but not in PDF.)
    3) Call a child PDF report in a new window (again, target="_blank" works fine for HTML output but not for PDF).
    regards
    abid

    Hi Ram,
    Not sure if this is possible.
    However, one workaround might be the following:
    1. Write a javascript that submits the URL using, say a POST method, and does not show the parameters in the URL. You will have to write this javascript code in the "Before Report" Report Escape. For a generic example on how to use javascript in a report, see Metalink Note 125652.1. This note shows javascript to disable the right-click of the mouse on the report output.
    2. Use the Hyperlink property of the report to call this javascript function, e.g.,
    javascript:myfunction('http://machine:port/reports/rwservlet?report=...+server=...+empno=&empno')
    I am not a javascript expert, so I cannot give you an example of the function, but I hope someone in your team can find out.
    Navneet.

  • How to maintain history in infotype 2 in ECC6 if the start date is DOB defa

    Hi,
    I have to change the status of some female employees as they are now married and their surname has also changed.
    But in ECC6, in infotype 2, the default start date is the date of birth. The Copy option is not working because the start date, which is the date of birth by default, would change; and if I change the record directly, the history won't be maintained.
    How do I go about this?
    Regards

    Hi,
    The default start date of DOB applies only to the first record.
    You can change the data by specifying validity dates; it won't be a problem.
    Regards,
    Shankar.

  • How to maintain History

    Hi, can anyone tell me how to configure XI so that it maintains the history of the last activation and the previous one in both the Integration Directory and the Integration Repository?
    Thanks in Advance.

    Hi!
    I achieve this by making "save" copies before changing any critical objects like processes or mappings.
    Another option (and the more official one) is to use Software Component Versions to achieve a history.
    For more details on this please refer to http://help.sap.com/saphelp_nw04/helpdata/en/94/c461bd75be534c89b5646b6ddc0aff/content.htm
    Follow the link "Version Management".
    Regards,
    Volker

  • Archive guided procedure process and maintain history attributes

    Hi All,
    We have the need for a workflow through portal with several approval steps.
    We want to use guided procedures but cannot find out whether they meet the following requirements:
    - When a process is completed, we want to be able to see some detailed data of each activity.
              -ID of the object created after approval    (which we display in the last step of the procedure).
              -Comments from the requestor               (upon which the approver decided to approve).
              -Date and time of approval
    This is so we can trace back from the object ID why it was created and who approved it (and even when the process is already archived, we would still like to see this).
    - Does anyone know if there are limitations to the archiving process?
    - The exact approver of the process is important to retain as well. In the archive monitor I currently only see Initiator, Owner, Overseer and Administrator kept in history, but we need to know who did our approval (which is not among these roles).
    Thanks for your help
    Jurjen

    Hello,
    We have a similar kind of requirement. I don't know how well the standard archive monitor will suit this need. As you have said, some vital information may not be possible to archive in the standard archive for GP.
    So I'm planning to build my own Z-table in SAP R/3 for archiving all the processing details, agents involved, date and time, etc., so that this table can be updated as and when each step in the GP is completed.
    This way, I can decide which details are archived and code accordingly. Also, this doesn't have any limitation, as it's a transparent table in R/3.
    Hope you can do something similar.
    Cheers,
    Mandrake

  • Firefox 35.0 upgraded and will not maintain history

    Hi, RATHER FRUSTRATING... I just upgraded to v 35 and all my tabs and history is gone. I'm on Mavericks.
    Settings:
    Under Preferences/General, I had selected (and still have selected), show my windows and tabs from last time
    Under Preferences/Privacy, I had and still have UNCHECKED, 'clear history when firefox restarts'.
    Once I did the upgrade, it blew away my history and I only have history older than 6 months. No recent history at ALL! What gives??
    What's worse, with the above-mentioned settings in place, if I open two tabs (say one to Microsoft and the other to Apple), when I exit and restart FF, I get what looks like a clean install. The tabs from the last session DON'T get restored!!!
    It seems restoration after an upgrade is a CRAP SHOOT with FF. I have had this happen in the past and am usually reluctant to do upgrades, which is not a good thing. This time, seeing the focus on security, I elected to do the upgrade after checking that all the settings were in place. Unfortunately, it screwed me right up! What can I do?
    Thanks
    jafirefox

    You can check for problems with the places.sqlite database file in the Firefox profile folder.
    *http://kb.mozillazine.org/Bookmarks_history_and_toolbar_buttons_not_working_-_Firefox
    *https://support.mozilla.org/kb/Bookmarks+not+saved#w_fix-the-bookmarks-file
    *Places Maintenance: https://addons.mozilla.org/firefox/addon/places-maintenance/

  • Doubt on maintaining history of Tickets

    Hi experts
    How do we do daily monitoring of InfoCubes and DSO objects? Is there a separate T-code for daily monitoring, or can we use the RSMON transaction?
    Also, please explain the exact procedure for daily monitoring.

    Hi Deepthi,
    We have process chains scheduled for the loading of infocubes and DSO.
    You can use Tcode: RSMON or RSPCM to monitor the loads and process chains.
    Daily, we monitor the process chains and loads using RSPCM: in transaction RSPCM, add your process chain and you can monitor the status of the process chains.
    How ticket history is maintained depends on the client. A few clients use tools to store the ticket history, like Unicenter, Mercury, COACH, Remedy, CSS, etc. There will be differences from a support and implementation perspective; the differences depend on the vendor who developed the tool.
    A few use an Excel file to store the history of tickets.
    ticketing tools

  • How to maintain previous and current record counts in an audit table in SQL Server 2008 R2?

    Hi Experts,
    Situation:
    In our database we have a few stored procedures that drop and recreate tables, scheduled on a weekly basis. When this job runs, all the stored procedures drop and recreate all the tables. Now we need to create one table that will maintain a history of the record counts.
    My table structure is listed below:
    TableName  CurrentRecordCount  CurrentExecutionDate  PreviousRecordCount  PreviousExecutionDate
    TEST       1000                2014-03-30            NULL                 NULL
    Test       1500                2014-04-10            1000                 2014-03-30
    Test       2000                2014-04-11            1500                 2014-04-10
    How do I achieve this?
    franklinsentil

    You need to create an audit table for this. The table is populated with the COUNT value inside the stored procedure: each time the procedure clears the main table and fills in new data, it also logs the count details to the audit table. You can use COUNT(*) to get the count value and the GETDATE function to get the current execution date.
    So the proc will look like this:
    CREATE PROC procname
    @param ...
    AS
    -- step to drop the existing table
    IF OBJECT_ID('TableName') IS NOT NULL
        DROP TABLE TableName;
    -- step to fill the new table
    SELECT ...
    INTO TableName
    FROM ...;
    -- audit table fill step: log the current count plus the previous run's values
    INSERT INTO AuditTable (TableName, CurrentRecordCount, CurrentExecDate, PrevRecordCount, PrevExecDate)
    SELECT 'TableName',
           (SELECT COUNT(*) FROM TableName),
           GETDATE(),
           prev.CurrentRecordCount,
           prev.CurrentExecDate
    FROM (SELECT TOP 1 CurrentRecordCount, CurrentExecDate
          FROM AuditTable
          WHERE TableName = 'TableName'
          ORDER BY CurrentExecDate DESC) AS prev
    UNION ALL
    -- first run: no previous row exists yet for this table
    SELECT 'TableName', (SELECT COUNT(*) FROM TableName), GETDATE(), NULL, NULL
    WHERE NOT EXISTS (SELECT 1 FROM AuditTable WHERE TableName = 'TableName')
    GO
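    For reference, a quick way to inspect the audit trail after a couple of scheduled runs (table and column names as assumed above):
    SELECT TableName, CurrentRecordCount, CurrentExecDate, PrevRecordCount, PrevExecDate
    FROM AuditTable
    ORDER BY CurrentExecDate DESC;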
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • How to track history on one field

    Hi All,
    I have a table that I created in OWB, but now I have a requirement to add a new field and track history on only that particular field. For example, if any changes take place, it should keep the old records as well. Any idea or link to a document is greatly appreciated.
    Thanks

    hello
    check out this thread
    Re: maintain history in DW
    rgds

  • Track Table History

    Hi,
    I need to maintain the history of TABLE_AA in TABLE_BB. Whenever a column is updated on TABLE_AA, the old record needs to be inserted into TABLE_BB (for history purposes). I am trying to achieve this using a trigger on TABLE_AA, but I am getting 2 records inserted every time the trigger fires. Please help me here.
    Here is the trigger code
    CREATE OR REPLACE TRIGGER APPS.TABLEAA_TRIG
    BEFORE UPDATE
    ON APPS.TABLE_AA
    REFERENCING NEW AS NEW OLD AS OLD
    DECLARE
    V_ERR_CODE NUMBER;
    V_ERR_MSG VARCHAR2(255);
    BEGIN
    -- Insert into history table TABLE_BB --
    INSERT INTO APPS.TABLE_BB
    (COL1,
    COL2,
    COL3,
    COL4,
    COL5,
    COL6,
    COL7,
    COL8,
    COL9,
    COL10)
    VALUES
    (:OLD.COL1,
    :OLD.COL2,
    :OLD.COL3,
    :OLD.COL4,
    :OLD.COL5,
    :OLD.COL6,
    :OLD.COL7,
    :OLD.COL8,
    :OLD.COL9,
    :OLD.COL10);
    EXCEPTION
    WHEN OTHERS
    THEN
    V_ERR_CODE :=SQLCODE;
    V_ERR_MSG := SQLERRM;
    DBMS_OUTPUT.PUT_LINE(' Error : '||V_ERR_CODE||' has occurred. '||V_ERR_MSG);
    END;
    --------------------------------------------------------------------------------------------------------------------------------------------------------------------
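    As a side note: the trigger above has no FOR EACH ROW clause, so it is a statement-level trigger and the :OLD values are not actually available inside it. A minimal row-level sketch of the same history trigger (column list shortened here; the WHEN clause is an optional guard, assuming you only want to log rows whose tracked columns really changed):
    CREATE OR REPLACE TRIGGER APPS.TABLEAA_TRIG
    BEFORE UPDATE ON APPS.TABLE_AA
    REFERENCING NEW AS NEW OLD AS OLD
    FOR EACH ROW
    -- optional: skip updates that change nothing (note that <> does not match NULLs)
    WHEN (OLD.COL2 <> NEW.COL2 OR OLD.COL3 <> NEW.COL3)
    BEGIN
      -- keep the pre-change values in the history table
      INSERT INTO APPS.TABLE_BB (COL1, COL2, COL3)
      VALUES (:OLD.COL1, :OLD.COL2, :OLD.COL3);
    END;
    /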

    I modified it to a row-level trigger. Even after this change I am getting duplicate rows in the history table TABLE_BB whenever I update TABLE_AA through the form, while I get only 1 record in the history table when I update TABLE_AA directly in the database. In the form, I found that the update is done through a button, and the button calls a package procedure.
    Only when I update the record through this push button do I get duplicate rows in the history table; when I use the standard save button (the yellow button on the menu) I do not get the duplicate rows. I have the code of the procedure below.
    Note: This form was created earlier by another developer and was functioning fine.
    Please help me here.
    Here is the code of the package procedure:
    PACKAGE BODY PROCESS_ROW IS
    Procedure update_row
    is
    v_err_code number;
    v_err_msg varchar2(255);
    begin
    :TABLE_AA.last_update_user_i := fnd_global.user_name ;
    :TABLE_AA.creation_user_i := fnd_global.user_name ;
    :TABLE_AA.last_update_ts := sysdate;
    :TABLE_AA.creation_ts := sysdate;
    update TABLE_AA
    set COL2 = :TABLE_AA.COL2
    , COL3 = :TABLE_AA.COL3
    , COL4 = :TABLE_AA.COL4
    , COL5 = :TABLE_AA.COL5
    , COL6 = :TABLE_AA.COL6
    , COL7 = :TABLE_AA.COL7
    , COL8 = :TABLE_AA.COL8
    , COL9 = :TABLE_AA.COL9
    , COL10 = :TABLE_AA.COL10
    , last_update_ts = :TABLE_AA.last_update_ts
    , last_update_user_i = :TABLE_AA.last_update_user_i
    where COL1 = :TABLE_AA.COL1;
    commit;
    exception when others then
    v_err_code :=SQLCODE;
    v_err_msg := SQLERRM;
    fnd_message.debug(' Error : '||v_err_code||' has occurred. '||v_err_msg||' in procedure update_row');
    raise form_trigger_failure;
    end;
    end;

  • Drill down SAP history table on Linux

    You may have noticed (or not noticed) that on some fields in SAP there is a History table that appears once you start entering (typing) information into the field.
    The History table is a field box that appears underneath a field when you start entering information. An example of one of the fields is the Fund field.
    This History table contains the last entries entered (by typing) into the field. Users have discovered that it allows a quick reference to regularly entered information.
    We observe this on the Windows platform, but when we use SAP on Linux Fedora, the same history table is not created.
    Please help us to maintain the history table in SAP on the Linux Fedora Core 3 platform.
    Vijay Vadgaonkar
    India
    [email protected]

    Hi Vijay,
    I assume you are talking about the history table of the SAP GUI? If so, please move your post to a forum such as "ABAP Development". This one is dedicated to KMC in the SAP Enterprise Portal.
    Regards,
    Karsten

  • How i get old history of record

    hi master
    sir
    What I need is a way to get the insert, change and delete date and data for a record.
    For example, I insert a record:
    Acid date amount
    1012 01-12-2006 45824
    A user changes this record after a week, but does not change the date; he changes only the acid and amount:
    Acid date amount
    2312 01-12-2006 428276
    How do I find out when the user made the change and what the old data was?
    2.
    Which record did a user delete and when? My management needs a full history of data activities.
    Please give me an idea of how to get this old history.
    thanks
    aamir

    Yes, Oracle does keep track of records, but in a different way. It saves redo information in the redo logs so that it can recover the database in case of a crash.
    It also keeps undo information in a separate tablespace in case you roll back your transaction.
    But... this is not what I would call maintaining history.
    Moving on, these redo logs are archived to another location if you have archiving enabled.
    You can use LogMiner to mine through these archived logs, but I doubt any human would have the patience to go through all the archive logs that are generated so frequently just to track the updates/deletes done.
    For me, adding a trigger is much simpler.
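    A minimal sketch of such a trigger, assuming a table called accounts with columns acid, trans_date and amount, and a history table accounts_hist with the same columns plus action, changed_by and changed_on columns (all of these names are illustrative, not taken from the post):
    CREATE OR REPLACE TRIGGER accounts_audit_trg
    AFTER INSERT OR UPDATE OR DELETE ON accounts
    FOR EACH ROW
    BEGIN
      IF INSERTING THEN
        INSERT INTO accounts_hist (acid, trans_date, amount, action, changed_by, changed_on)
        VALUES (:NEW.acid, :NEW.trans_date, :NEW.amount, 'I', USER, SYSDATE);
      ELSIF UPDATING THEN
        -- store the OLD values so the pre-change record can be reconstructed
        INSERT INTO accounts_hist (acid, trans_date, amount, action, changed_by, changed_on)
        VALUES (:OLD.acid, :OLD.trans_date, :OLD.amount, 'U', USER, SYSDATE);
      ELSE  -- deleting
        INSERT INTO accounts_hist (acid, trans_date, amount, action, changed_by, changed_on)
        VALUES (:OLD.acid, :OLD.trans_date, :OLD.amount, 'D', USER, SYSDATE);
      END IF;
    END;
    /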

  • Data in DSO and CUBE

    Hi Experts
    Please explain to me in detail, if possible with an example: key fields and dimensions, and how data records are processed in a DSO and a Cube.
    I tried to search but could not find what I am looking for.
    My requirement:
    I am extracting employee benefits records from a non-SAP source system (Oracle tables).
    In Oracle there is no date field available, and there is no history available in Oracle: once the benefits of an employee are changed, the old record is overwritten with the new one, so they can't track the employee benefits history. But in BW I need to store/show the history of employee benefits by calendar day.
    Oracle Table Fields
    Location (Primary Key)
    Emp No (Primary Key)
    Insurance Type
    Call Allowance Type
    Annual Leave (No of day Days)
    Pension Schem
    Company Laptop (Yes/No)

    hi,
    Key fields are the primary keys of your ODS tables. Based on the key field values, the data fields will get overwritten.
    If the key field values are the same, the data fields are overwritten, but the changes are captured in the change log table, so the history of changes is available.
    Dimensions are the primary keys of the cube. In a cube, records are only added by default, not overwritten as in an ODS.
    Dimension IDs are generated based on the SID values of the characteristics.
    A maximum of 16 key fields and 16 dimensions are allowed in an ODS and a cube respectively.
    For your case, include the 0CALDAY field in the key fields and use a routine to fill the field with the system date (SY-DATUM). This keeps track of history; even without a date, the change log maintains the history of each load.
    You also have to add 0RECORDMODE to the communication structure of the ODS (InfoSource).
    Ramesh

  • Issue with OOTB Procurement and Spend report on top of W_AP_INV_DIST_F

    We are implementing OBIEE 11g on top of EBS R12. I observed today that AP_INVOICE_DISTRIBUTIONS_ALL has a lot of records with PO_DISTRIBUTION_ID as NULL (this is expected, as the table holds not only PO invoices but also Expense-type invoices). The mapping used to populate W_AP_INV_DIST_F (SDE_ORA_APInvoiceDistributionFact) uses PO_DISTRIBUTION_ID to derive PRODUCT_ID. Thus, in W_AP_INV_DIST_F a lot of lines exist with a PRODUCT_WID of NULL (which is also understandable).
    My question is: the Procurement and Spend OOTB reports (for example, Spend by Top Categories) show all records from W_AP_INV_DIST_F and do not filter to a Spend_Type of "PO Matched". Thus, most of the records go into the "Unspecified" bucket.
    Any thoughts on why the OOTB reports work like that? (This affects any OOTB report that is built on top of W_AP_INV_DIST_F and involves products or product categories.)

    Out of curiosity, why do you only maintain 13 months of history in PeopleSoft? I have never heard of this. Don't the business users want to look at PeopleSoft data from before those 13 months?
    With regard to your question: given that you are only able to load 13 months on a full load, and that this is impacting the links to the correct suppliers, you can consider keeping a refresh date on the tasks that load the W_SUPPLIER_D tables. This can prevent a reload of the supplier dimension. However, I am not sure how you intend to handle the other dimensions, as I assume you will have the same issue there. If you only have 13 months of history, BI Apps will only load 13 months of history on a full load.
    Also, for the SCD dimensions there should be a CURRENT_FLAG that is set to Y only for the current record, and there should still be a link to the previous records where CURRENT_FLG = N. As long as your SCD is working fine, and you can prevent the reload of the supplier dimension via the refresh-date changes, it may work.
    I would advise your PeopleSoft team to maintain history, especially if they want to take advantage of Analytics and trend analysis.

  • Help me with my survey for my project

    I am doing this survey for my professor; I am at Bethedsa community college. I am already behind on this survey, so I would appreciate it if you guys could quickly look at the survey file. One lucky winner will get a $50 gift coupon at amazon.com.
    I will appreciate your response- I need to get 20-30 responses-
    before my professor will release my stipend... I am sure you were a
    student once- So I will really really appreciate your inputs. You do
    not have to be complimentary or negative to the idea- just be
    unbiased and honest...
    PROFILING AND TRANSACTION TRACKING IN PRODUCTION
    We are building a better version of JVM which will allow profiling
    in production. This technology, if feasible and successful may be
    ported to commercial application servers such as WebLogic,
    WebSphere, JBoss and others, with the cooperation of J2EE vendors.
    In this document, we will be calling this system- "DiagnoseNow".
    The DiagnoseNow project is in research stage, and we want to adjust
    the goals of the project to better meet the requirements of the
    application administrator, designer and IT manager.
    Your inputs will be tremendously appreciated.
    We are giving away one $50 gift certificate at Amazon.com to one
    respondent. We expect about 25-50 responses only- So your odds of
    winning are high.
    The features of this proposed JVM are best explained using an
    example of an ecommerce site.
    Ecommerce site scenario
    Description of system
    Consider a typical ecommerce site with the following features: User
    Validation, Product Search, Shopping Cart Management and Checkout.
    Each user that actually purchases or attempts to purchase a product
    on the website is considered a "Tracking unit" or a "Session". This
    new research JVM will allow the time spent in a
    particular "Session" to be tracked at the method level.
    System administrator can if he / she so wishes get a report with
    the following columns:
    Session ID, MethodName, Time spent in the method(inclusive of
    called methods), Time spent in the method(excluding called methods)
    Put simply, the goal is to support session level tracking and
    profiling at the method level in the JVM itself.
    This will allow faster debugging and application turnaround,
    reducing application maintenance cost.
    A production ecommerce site could have millions of users and many
    more transactions per day. We are optimistic about supporting real
    life production sites.
    User Validation
    1. login: This validates the user using user name and password.
    2. logout: This logs the user off.
    3. passwordVerification: Using a database, the password is
    verified.
    4. retrievalOfLostPassword: If the user has lost his/her
    password the password is emailed to the user.
    Product search
    1. byname: Search product catalog by name.
    2. byDescription: Search product catalog by Description
    3. bySku: Search product catalog by SKU
    4. byBrand: Search product catalog by brand
    5. Shopping Cart
    6. add: Adds items to the cart
    7. remove: Removes Items from the cart.
    8. update: Updates Items from the cart.
    9. checkout: Checkout items from the cart, by asking for credit
    card. .
    10. retrievePastData: Retrieve user information from the past.
    11. validateCreditCard: Validate credit Card by checking with
    the bank.
    12. sendConfirmationEmail: Send Email confirming the purchase.
    13. sendConfirmationEmailShipping: Send Email confirming that
    the product has shipped.
    14. sendSurveyEmail: Send Email checking customer satisfaction.
    What is a "Transaction" or a "Tracking Unit" or a "Session"?
    Each user who searches the product catalog is considered a "Tracking
    unit" or a "Session". A session ends when the user leaves the website.
    Transaction tracking/ reporting data at method level
    Time spent by each user, in each and every method above is tracked
    in two ways: Time spent in a method itself, Time spent in the method
    and the called methods. This will be done for each and every
    invocation of the method.
    Email alerts
    Emails can be generated whenever a particular method(eg.
    sendSurveyEmail) takes longer than a specified threshold.
    Analysis Reports
    Analysis can be done on a variety of topics: Reasons for abandoning
    shopping carts, Slow or underperforming parts of the application.
    Thread dumps
    If a particular method runs slower than a threshold value, then
    optionally a thread dump may be taken and stored. In this way, even
    if the slowdown occurs at midnight, the thread dump will still be
    available.
    Triggered heap profile
    Similarly, if the system is running out of heap memory a heap
    profiler may become active. Heap profile may also be taken
    periodically, to allow analysis of heap growth.
    The system is best explained using a few examples:
    Slowdown in search-books
    Problem: Analysis of sessions shows that a large number of
    customers are browsing for books, but conversion to actual sales is
    slow.
    Analysis: The book database has grown in size, and the server
    running the database has many more apps on it. Moving the database
    to another machine solved the problem.
    How and Why did DiagnoseNow help?
    DiagnoseNow maintains history of all sessions. The history was
    analyzed to show customers who were not converting to purchases- a
    large proportion of them were searching for books, and abandoning
    the site afterwards.
    Slowdown in search-general
    Problem: Search has slowed down for all products.
    Analysis and Resolution: Analysis showed that all search methods
    are taking 250% longer than normal. The slowdown happened after a
    specific date- it was the date on which the OS was upgraded. Rolling
    back the upgrade and reinstalling it properly resolved the issue.
    How and Why did DiagnoseNow help?
    DiagnoseNow maintains a baseline of past search method response
    times. So when a slowdown happens, there is no time
    wasted "apportioning blame" or debating whether a problem exists.
    Credit card authorization- intermittent failure
    Problem: Credit card authorization was failing intermittently after
    hours.
    Analysis/Resolution: Credit card authorization was failing
    intermittently. The automatically generated thread dump showed a
    faulty connection. The problem was traced to the credit card
    authorization end, and was resolved.
    How and Why did DiagnoseNow help?
    Without the thread dump this problem could not have been solved, and
    without DiagnoseNow the problem would not have been detected unless
    the system administrator was able to take thread Dumps- for that to
    happen the problem would have had to occur during normal business
    hours. (The off hours system administrators are Unix technicians
    with no app server or Java knowledge. )
    Slowdown of overall system at midnight
    Problem: The entire system slows down at midnight.
    Analysis/Resolution: All methods show a slowdown around midnight,
    degrading system performance.
    How and Why did DiagnoseNow help?
    DiagnoseNow detected the slowdown, and it was discovered that a
    large admin program ran at midnight. Splitting up the work in the
    admin program into 5 chunks, made the performance impact lot
    smaller.
    Slowdown at 9.05 pm Saturday/Sunday evening
    Problem: The entire system performance degrades by 50% around 9.05
    pm on Saturday/Sunday evening
    Analysis/Resolution: The weekend cleaning crew was unplugging one
    of the appservers to plug in the vacuum cleaner.
    How and Why did DiagnoseNow help?
    DiagnoseNow allows method level historical tracking and system level
    analysis. This allowed quick detection of an overall slowdown.
    Without system level analysis, a variety of alerts may get
    triggered, but the root cause may not be identified.
    SURVEY
    Your Name(Required to get the prize):
    Your role: Developer/IT Administrator/Manager/ App server
    administrator/Software QA
    Email address(Required to get the prize):
    Phone number(optional):
    Place a tick mark against the question:
    1. Reducing application maintenance cost is important to me.
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    2. Triggering a thread dump based on specific conditions is an
    important feature
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    3. Tracking each and every session/transaction down to the
    method level is an important feature
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    4. Rapidly localizing and diagnosing a problem is important to
    me
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    5. Creating alerts based on overall transaction performance is
    important to me
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    6. If DiagnoseNow is proven to be a stable system, then I will
    be willing to pay 1500 dollars per CPU license fee
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    7. I can accept a CPU overhead of 8% for extensive monitoring
    leading to reduced costs.
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    8. If DiagnoseNow is proven to be a stable system, then I will
    be willing to pay 1500 dollars per CPU license fee(inclusive of 2
    business day email support. )
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree
    9. If DiagnoseNow is proven to be a stable system, then I will
    be willing to pay 7500 dollars per CPU license fee(inclusive of
    support with 1 hour response time)
    Disagree Somewhat disagree Neutral Somewhat Agree
    Strongly Agree

    Profiling in production is a pipedream.
    With modern JVMs and processors, hundreds of thousands of methods will be invoked on multi-CPU machines in just one second. Tracking and analyzing this data is beyond the power of the fastest database.
    To analyze one hour of data, 360 million methods will have to be handled for a 4 CPU Pentium/Xeon machine.
    This is clearly impossible to do in real life with current hardware. Maybe 5 to 10 years from now things will be better; however, by then CPUs will be processing many more methods per second...
    I think it is a nice ACADEMIC project!

Maybe you are looking for

  • Multiple Item Codes in a MRP Generated PR

    Dear Sir, We have a requirement to have "Multiple Item Codes" in a single PR . The PR are generated thru MRP . The criterion  to have multiple Items in a single PR is that a PR will contain multiple Items codes pertaining to the same Material-Group .

  • Text disappearing in pdf file

    I received a word file that was converted to pdf using "pdf maker". In the word file there is "shading" behind the text. In the pdf file if I select portions of the text and try to "highlight" it to make comments, some of the text disappears. (The te

  • Creative zen microphoto 4GB big prob

    Since a week or so, when i open the player by moving and holding the power button to on position, the only thing appear is the splash screen of creative logo and copyright 2005 and it stays there until I remove the battery, cause when I tried to clos

  • US robotics external modem

    Yes I have a US Robotics Courier V.Everything external modem I used to use with my PC and I was wondering if this modem would work with a Mac. It's serial so I was wondering if I could get a serial to usb converter. If anyone has used this modem with

  • Icons on bookmarks of N97 browser do not appear

    when I save a link into the bookmarks the icons not appear in the N97 browser. my old n95 shows them correctly. my phone has the firmware 20.0.19.  Navio