What is Averaging Time & Data Reduction Factor in Frequency-Weighted Acceleration?

I have to analyse the ride comfort data of a vehicle (as per ISO 2631-1). It contains the acceleration levels of the driver seat in the X, Y, and Z directions.
I need to plot frequency weighting in dB vs frequency in Hz. The data is recorded at 500 Hz for 9.95 s.
What averaging time and data reduction factor should I take, and how do I convert the data into the graph described above?

ISO 2631-1 says that when analysing the data you should apply weighting factors for the seating condition; for the driver seat the factor is 1. These weightings also appear under Frequency Weighted Averaging in DIAdem's signal analysis as Wk, Wd, etc.
The output should be the frequency-weighted RMS of the individual X, Y, Z channels and the combined RMS of X, Y, Z. For that I need to know the required averaging time constant and the data reduction factor, since changing either of these changes the results.
I need one final RMS value for X, Y, Z representing the ride comfort of that vehicle, and if possible the combined RMS too.
And how do I convert that value to dB? I know how to do it manually: divide by a reference value and take the base-10 logarithm of the ratio.
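For what it's worth, once the X, Y and Z channels have been frequency-weighted (Wd for X and Y, Wk for Z, which is what the DIAdem Wk/Wd options apply), the remaining arithmetic is simple. Below is a minimal sketch in Python; it assumes pre-weighted signals, unity axis multiplying factors (k = 1, per the seated-comfort reading above) and the common 1 µm/s² acceleration reference for dB levels (ISO 1683), all of which should be checked against your copy of the standard:

    # Minimal sketch (not DIAdem's implementation). aw_x, aw_y, aw_z are assumed
    # to already hold frequency-weighted accelerations in m/s^2, sampled at 500 Hz.
    import numpy as np

    FS = 500                        # sampling rate in Hz
    t = np.arange(0, 9.95, 1 / FS)  # 9.95 s record
    rng = np.random.default_rng(0)  # placeholder data; use your measured channels
    aw_x, aw_y, aw_z = (0.1 * rng.standard_normal(t.size) for _ in range(3))

    def overall_rms(a):
        # r.m.s. over the whole record, i.e. averaging time = record length
        return np.sqrt(np.mean(a ** 2))

    def running_rms(a, tau, fs, reduction=1):
        # Linear running r.m.s. with averaging time tau in seconds. A data
        # reduction factor of n keeps every n-th output point: it thins the
        # plotted curve but does not change the overall r.m.s. figures.
        n = int(tau * fs)
        window = np.ones(n) / n
        return np.sqrt(np.convolve(a ** 2, window, mode="valid"))[::reduction]

    ax, ay, az = overall_rms(aw_x), overall_rms(aw_y), overall_rms(aw_z)

    # Combined (vibration total) value, with kx = ky = kz = 1 assumed:
    a_v = np.sqrt(ax ** 2 + ay ** 2 + az ** 2)

    # dB conversion: L = 20 * log10(a / a_ref), a_ref = 1e-6 m/s^2 assumed.
    level_db = 20 * np.log10(a_v / 1e-6)
    print(ax, ay, az, a_v, level_db)

The averaging time mainly matters for the running r.m.s. curve (a short averaging time follows transients, a long one smooths them out); the single overall r.m.s. values use the full 9.95 s record, and the data reduction factor only thins the plotted points.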

Similar Messages

  • Data reduction - setting factor in program

    Is there a way to set the data reduction factor of the Data Reduction Express VI programmatically, and not only in the configuration dialog of the Express VI itself?

    The only way at present to pass a value to an Express VI is to convert it to a normal VI and then add an input pin to the connector terminal.

  • Real-Time Data Acquisition

    What is real-time data acquisition, what are real-time queries, and what is daemon update? How do we extract and load data?
    Please explain in detail.
    regards
    GURU

    Hi,
    Real-Time Data Acquisition - BI 2004s
    Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real time.
    You might have complex reports in your BI system that support decision-making based on data from your transactional system. Sometimes (quarter close, month end, year end...) a single change in the transactional data can change your decision, and it is very important that every record of the company's transactional data is reflected in the BI system as soon as it is updated in the transactional system.
    Using the new Real-Time Data Acquisition (RDA) functionality of the NetWeaver BI 2004s system, we can now load transactional data into the SAP BI system every single minute. If your business demands real-time data in SAP BI, you should start exploring RDA.
    The source system for RDA can be an SAP system or any non-SAP system. SAP provides most of the standard DataSources as real-time enabled.
    The other alternative for RDA is Web Services; although Web Services are usually meant for non-SAP systems, for testing purposes I am implementing a Web Service (RFC) in an SAP source system here.
    An example would be a production line where the business wants information about defective products in real time, so that production can be stopped before more defective goods are produced.
    In the source system, the BI Service API must be at least version Plug-In Basis 2005.1 or, for 4.6C source systems, Plug-In 2004.1 SP10.
    Real-Time Data Acquisition -BI@2004s
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f80a3f6a983ee4e10000000a1553f7/content.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3db14666-0901-0010-99bd-c14a93493e9c
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3cf6a212-0b01-0010-8e8b-fc3dc8e0f5f7
    http://help.sap.com/saphelp_nw04s/helpdata/en/52/777e403566c65de10000000a155106/content.htm
    https://www.sdn.sap.com/irj/sdn/webinar?rid=/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    Thanks,
    JituK

  • TS4000 What are the "correct" date and time settings for iCloud reminders?

    What is the "correct" date and time settings for icloud reminders?

    Sep. 18th. We don't know what time.

  • What is the maximum data a file adapter can send at a time?

    Hi,
    What is the maximum data a file adapter can send at a time? Is there a maximum limit on the data the adapters will send?
    Can anyone help with this?
    regards
    Raghu

    Hi Reddy,
    I raised the same question and got the correct answer.
    Refer to the thread below, which will give you the information:
    Wat is the maximu size of data XI can handle
    Thnx
    Chirag

  • How to calculate Average time from a date field

    Hi All,
    I have a DATE-type field in my table.
    I want to calculate the average time for a given country in a SELECT query. The date has to be excluded; only the time has to be taken into consideration.
    Kindly help me.
    Sample data
    india 25-JUN-09 08:12:45
    india 25-JUN-09 09:01:12

    Take whichever one you want.
    WITH dates AS
      ( SELECT sysdate x FROM dual
        UNION
        SELECT sysdate + 1 + 1/24 FROM dual )
    SELECT TO_CHAR(TO_DATE(ROUND(AVG(TO_NUMBER(TO_CHAR(TO_DATE(TO_CHAR(x,'HH24:MI:SS'),'HH24:MI:SS'),'SSSSS')))),'SSSSS'),'HH24:MI:SS')
    FROM   dates;
    WITH dates AS
      ( SELECT sysdate x FROM dual
        UNION
        SELECT sysdate + 1 + 1/24 FROM dual )
    SELECT FLOOR(24 * AVG(x - TRUNC(x)))
        || ':'
        || FLOOR(MOD(24 * AVG(x - TRUNC(x)), 1) * 60)
        || ':'
        || FLOOR(MOD(MOD(24 * AVG(x - TRUNC(x)), 1) * 60, 1) * 60)
    FROM   dates;
    (The first form averages the seconds past midnight via the 'SSSSS' format; the second works because x - TRUNC(x) is the fraction of a day past midnight, so 24 * AVG(x - TRUNC(x)) is the average time of day in hours, which FLOOR and MOD split into hours, minutes and seconds.)
    By
    Vamsi

  • What is autocorrelation time with respect to a data acquisition system

    What is autocorrelation time with respect to a data acquisition system? The hardware I am using is a PCI-6034E card, an SCXI-1102 module, and a TC-2095 terminal block.

    Suresh;
    Unfortunately, NI doesn't spec that.
    For a complete list of specs for that particular DAQ board and SCXI module, refer to their user manuals.
    Regards
    Filipe A.
    Applications Engineer
    National Instruments

  • TDMS time based reduction method has no end date

    Hello TDMS experts,
    I am in the process of configuring the 'Time based reduction' method of TDMS. In the package, I am in phase "System Analysis" -> "Analyze and specify 'from date'".
    Here it gives me the option to specify the start/from date only. If I want to transfer data for only a month in the past, let's say April 2007, how can I specify both the from and to dates?
    Please advise.
    Thank you,
    kind regards,
    Zaheer Shaikh

    Hi Pankaj,
    Thanks for the reply.
    In the phase "maintain Table Reduction" I have chosen SAP standard reduction, that means the table reduction is handled by SAP itself. Since my sender system database is 4 Terabytes, my question is if I select the 'from date' as 10th April 2009 (present date is 20 April 2009), will the TDMS still copy the whole tables for which I have not manually defined any reduction?
    Please advice.
    Thank you,
    Kind regards,
    Zaheer Shaikh

  • Data migration completed what about next time

    Hi Guys,
    I have successfully migrated the data into a new system from our existing quality server; this was just to test the TDMS implementation. My question is about next time: suppose I have migrated all the data from 01.08.2007 to date and the package has been completed. When we start again, I guess we need to create another package. If we only need to change the data selection date, do we have to go through all the steps again, or is there a selective option for us?
    Regards
    Subhash

    Hi,
    Please see the help file. Here is an extract:
    If no changes regarding the definition of the objects for transfer have been made, you can create a refresh package to replace the data in the receiver system with current data from the related sender system. You can specify a new 'from-date' for the time-related approach and include additional tables in the transfer.
    The refresh package inherits the setting of the related configuration package, which means that you do not have to execute some of the steps that were required for configuration.
    Procedure
    Place the cursor on the related configuration package, and choose Copy for Refresh.
    Regards,
    Masoud

  • Sales orders in TDMS company/time based reduction  are outside the scope

    Guys,
    I have had some issues with TDMS whereby it didn't handle company codes without plants very well. That was fixed by SAP. But I have another problem now. If I do a company code and time based reduction, it doesn't seem to affect my sales orders in VBAK/VBUK as I would have expected. I was hoping it would only copy across sales orders that have a plant assigned to one of the company codes specified in the company code based reduction scenario. That doesn't seem to be the case.
    VBAK is now about one third of the size of the original table (number of records), but I see no logic behind the reduction. I can clearly see plenty of sales documents with a time stamp well before what I specified in my copy procedure, and I can see others with plant entries that should have been excluded from the copy because they belong to different company codes than the ones I specified.
    I was under the impression that TDMS would sort out the correct sales orders for me, but somehow that doesn't seem to be happening. I have to investigate further as to what exactly it did bring across, but just by looking at what's in the target system I can see plenty of "wrong" entries in there, either with a date outside the scope or with a plant outside the scope.
    I can also see that at least the first 10,000 entries in VBAK in the target system have a valid-from and valid-to date of 00.00.0000, which could explain why the time based reduction didn't work?
    Did you have similar experiences with your copies? Do I have to do a more detailed reduction such as specifying tables/fields and values?
    Thanks for any suggestions
    Stefan
    Edited by: Stefan Sinzig on Oct 3, 2011 4:57 AM

    The reduction itself is not based on the date when the order was created; the logic extends it to invoices and offers, basically the complete update process.
    If you see data that definitely shouldn't be there I'd open an OSS call and let the support check what's wrong.
    Markus

  • Report to display "Average time taken for processing payments"

    Hi,
    I have been asked to develop a report to display the average time taken for processing payments.
    Could anyone guide me technically on which tables I need to use to generate the report? This is very urgent. Please provide sample code too...
    Thanks in advance...

    Given below is the setup for credit card payment processing:
    Set Up Credit Control Areas:
    Define Credit Control Area
    Transaction: OB45 
    Tables: T014
    Action: Define a credit control area and its associated currency.  The Update Group should be ‘00012’.  This entry is required so the sales order will calculate the value to authorize
    Assign Company Code to Credit Control Area
    Transaction: OB38
    Tables: T001
    Action: Assign a default credit control area for each company code
    Define Permitted Credit Control Area for a Company Code
    Transaction: 
    Tables: T001CM
    Action: For each company code enter every credit control area that can be used
    Identify Credit Price
    Transaction: V/08
    Tables: T683S
    Action: Towards the end of the pricing procedure, after all pricing and tax determination, create a subtotal line to store the value of the price plus any sales tax.  Make the following entries:
    Sub to:  “A”
    Reqt:  “2”
    AltCTy:  “4”
    Automatic Credit Checking
    Transaction: OVA8
    Tables: T691F
    Action: Select each combination of credit control areas, risk categories and document types for which credit checking should be bypassed.  You need to mark the field “no Credit Check” with the valid number for sales documents.
    Set Up Payment Guarantees
    Define Forms of Payment Guarantee
    Transaction: OVFD
    Tables: T691K
    Action: R/3 is delivered with form “02” defined for payment cards.  Other than the descriptor, the only other entry should be “3” in the column labeled “PymtGuaCat”
    Define Payment Guarantee Procedure
    Transaction: 
    Tables: T691M/T691O
    Action: Define a procedure and a description.
    Under Forms of Payment Guarantee, make the following entries:
    Sequential Number "1"
    Payment Guarantee Form "02"
    Routine Number "0" (the Routine Number can be used to validate payment card presence)
    Define Customer Payment Guarantee Flag
    Transaction: 
    Tables: T691P
    Action: Define a flag to be stored in the table.
    Create Customer Payment Guarantee = “Payment Card Payment Cards (All Customers can use Payment Cards)”.
    Define Sales Document Payment Guarantee Flag
    Transaction: 
    Tables: T691R
    Action: Define the flag that will be associated with sales document types that are relevant for payment cards
    Assign Sales Document Payment Guarantee Flag
    Transaction: 
    Tables: TVAK
    Action: Assign the document flag type the sales documents types that are relevant for payment cards.
    Determine Payment Guarantee Procedure
    Transaction: OVFJ
    Tables: T691U
    Action: Combine the Customer flag and the sales document flag to derive the payment guarantee procedure
    Payment Card Configuration
    Define Card Types
    Transaction: 
    Tables: TVCIN
    Action: Create the different card types plus the routine that validates the card for length and prefix (etc…) 
    Visa , Mastercard, American Express, and Discover 
    Create the following entries for each payment card 
    AMEX  American Express ZCCARD_CHECK_AMEX Month
    DC  Discover Card  ZCCARD_CHECK_DC  Month*****
    MC  Mastercard  ZCCARD_CHECK_MC  Month
    VISA  Visa   ZCCARD_CHECK_VISA  Month
    The Routines can be created based on the original routines delivered by SAP. 
    *****SAP does not deliver a card check for Discover Card. We created our own routine.
    Define Card Categories
    Transaction: 
    Tables: TVCTY
    Action: Define the card category to determine if a payment card is a credit card or a procurement card.
    Create the following two entries:
    Cat Description  One Card  Additional Data
    CC Credit Cards  No-check  No-check
    PC Procurement Cards No-check  Check
    Determine Card Categories
    Transaction: 
    Tables: TVCTD
    Action: For each card category map the account number range to a card category. Multiple ranges are possible for each card category, or a masking technique can be used. Get the card number ranges from the user community. Below is just a sample of what I am aware are the different types of cards.
    Visa Credit  Expires in 7 days. 
        400000   405500
        405505   405549
        405555   415927
        415929   424603
        424606   427532
        427534   428799
        428900   471699
        471700   499999
    Visa Procurement  Expires in 7 days.
        405501   405504
        405550   405554
        415928   415928
        424604   424605
        427533   427533
        428800   428899
    Mastercard Credit Expires in 30 days
        500000   540499
        540600   554999
        557000   599999
    Mastercard Procurement Expires in 30 days
        540500   540599
        555000   556999
    American Express Credit Expires in 30 days
        340000   349999
        370000   379999
    Discover Card Credit Expires in 30 days
        601100   601199
    Set Sales Documents to accept Payment Card Information
    Transaction: 
    Tables: TVAK
    Action: Review the listing of Sales Document types and enter “03” in the column labeled “PT” for each type which can accept a payment card
    Configuration for Authorization Request
    Maintain Authorization Requirements
    Transaction: OV9A
    Tables: TFRM
    Action: Define and activate the ABAP requirement that determines when an authorization is sent. Note that the following tables are available to be used in the ABAP requirement (VBAK, VBAP, VBKD, VBUK, and VBUP).
    Define Checking Group
    Transaction: 
    Tables: CCPGA
    Action: Define a checking group and enter the
    description.  Then follow the below guidelines for the remaining fields to be filled.
    AuthReq  Routine 901 is set here.
    PreAu  If checked, R/3 will request an authorization for 0.01 and the authorization will be flagged as such. (Insight does not use the pre-authorization check.)
    Auth horizon  This is the number of days in the future SAP will use to determine the value to authorize. (Insight does not use the auth horizon period.)
    Valid  You will get a warning message if the payment card expires within 30 days of the order entry date.
    Assign Checking Group to Sales Document
    Transaction: 
    Tables: TVAK
    Action: Assign the checking group to the sales order types relevant for payment cards
    Define Authorization Validity Periods
    Transaction: 
    Tables: TVCIN
    Action: For each card type enter the authorization validity period in days.
    AMEX American Express 30
    DC Discover card  30
    MC Master card  30
    VISA Visa   7
    Configuration for clearing houses
    Create new General Ledger Accounts
    Transaction: FS01
    Tables: 
    Action: Two General Ledger accounts need to be created for each payment card type.  One for A/R reconciliation purposes and one for credit card clearing.
    Maintain Condition Types
    Transaction: OV85
    Tables: T685
    Action: Define a condition type for account determination and assign it to access sequence “A001”
    Define account determination procedure
    Transaction: OV86
    Tables: T683 / T683S
    Action: Define procedure name and select the procedure for control.  Enter the condition type defined in the previous step.
    Assign account determination procedure
    Transaction: 
    Tables:
    Action: Determine which billing type we are using for payment card process.
    Authorization and Settlement Control
    Transaction: 
    Tables: TCCAA
    Action: Define the general ledger accounts for reconciliation and clearing and assign the function modules for authorization and settlement along with the proper RFC destinations for each.
    Enter Merchant IDs
    Transaction: 
    Tables: TCCM
    Action: Create the merchant IDs that the company uses to process payment cards
    Assign Merchant IDs
    Transaction: 
    Tables: TCCAA
    Action: Enter the merchant IDs with each clearinghouse account

  • Undo file having a huge average read time of more than 20 ms

    Please,
    I'm working on oracle 10g, windows server 2003.
    From the file I/O stats, I found that the average read I/O from the undo files is more than 20 ms; this contrasts with other files, where the same I/O metric is less than 0.5 ms.
    Can someone give me a clue?
    thanks a lot

    I assume the 'undo files' you mention are the datafiles of the undo tablespace: yes.
    How did you determine the I/O metric, and what tools did you use? The following script:
    column FILE_NAME format a30
    column PHYRDS format 999999,999
    column PHYWRTS format 999999,999
    column READTIM format 999999,999
    column "READ AVG (ms)" format 999.99
    column "TOTAL I/O" format 999999,999
    select
    FILE_NAME,
    PHYRDS,
    READTIM,
    PHYWRTS,
    READTIM / (PHYRDS + 1) "READ AVG (ms)",
    PHYRDS + PHYWRTS "TOTAL I/O"
    from
    V$FILESTAT a,
    DBA_DATA_FILES b
    where
    a.FILE# = b.FILE_ID
    order by
    6 DESC;
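    (One caveat: V$FILESTAT reports READTIM in hundredths of a second, so the "READ AVG (ms)" column above is really in centiseconds; for true milliseconds the expression would be READTIM * 10 / (PHYRDS + 1).)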
    Are other files on the same storage as the undo datafiles? All the files (datafiles, redo files, control files) are on the same storage.
    How is your disk storage configured? It may be some RAID technology; the level is unknown.
    By default our client cannot accept moving the undo files to other storage, which is why all the data files are stored on the same drive.
    So what can cause the undo average read time to be so high?
    Thanks again

  • TDMS - time based reduction - receiver system deletion

    Experts,
    I'm doing a time based reduction. I'm on the step "Start Deletion of Data in Receiver System". It's been running for over 18 hours.
    But I don't see any jobs running in SM66 on the Central/Control or Sender or Reciever systems.
    When I click on the "Task" button, I see it has completed 8,444 of 12,246 sub-activities. There are 3,802 not yet started.
    We're on all the latest levels of DMIS and ECC.
    Any ideas?
    Thanks
    NICK

    Ashley and Niraj,
    Hey, I'm all for tips/tricks so don't worry about messing up my thread.
    I completely shut down the central/control system via stopsap and restarted.  Still it was in "running" status but no jobs were running on sender/rec or central/control.
    So I tried the troubleshooting, but it was unclear to me what to do.
    I ended up highlighting the phase I referenced earlier, then doing "execute" again. The status changed from the "truck" to a green flag and I started to see jobs run again on the receiver system. Again they have stopped, but I see another job scheduled to run in a few minutes... It's just weird; I didn't run into this on my last time-based copy.
    I'll post a few things I've learned to increase performance:
    RDISP/MAX_WP_RUNTIME = 0
    At LEAST 25 WP and 25 BCK procs
    rec/client = OFF
    RDISP/BTCTIME = 60
    RUN STATS regularly
    TAKE OUT OF ARCHIVELOG MODE
    Read/Impl these notes:
    Read these… update these parameters:
    o TD05X_FILL_VBUK_1 Note 1058864
    o TD05X_FILL_VBUK_2 Note 1054584
    o TD05X_FILL_BKPF Note 1044518
    o TD05X_FILL_EBAN Note 1054583
    o TD05X_FILL_EQUI Note 1037712
    Set these Oracle indexes on the receiver system:
    Table: QMIH
      fields: MANDT, BEQUI
    Table: PRPR
      fields: MANDT, EQUNR
    Table: VBFA
      fields: MANDT, VBELN, VBELV, POSNV
    set parameter 'P_CLU' to 'Y' in the following
    activities before you start the activities for filling internal header tables:
    TD05X_FILL_BKPF
    TD05X_FILL_CE
    TD05X_FILL_EKKO
    TD05X_FILL_VBUK
    TD05X_FILL_VBUK_1
    TD05X_FILL_VBUK_2
    TD05X_FILL_VSRESB
    TD05X_FILL_WBRK_1
    run TCODE CNVMBTACTPAR, specify the project number to do this
    IMPORTANT TCODEs
    CNV_MBT_TDMS_MY  Main TDMS starting point     
    CNVMBTMON  Process Monitor (must know your project number)
    DTLMON  MWB transfer monitor
    CNVMBTACTPAR  activity parameters
    CNVMBTACTDEF  MBT PCL activity maint
    CNVMBTTWB  TDMS workbench to develop scrambling rules
    CNV_TDMS_HCM_SCRAM  run in SENDER system for scrambling functionality
    Reports
    CNV_MBT_PACKAGE_REORG  to reorganize TDMS projects..aka delete
    CNV_MBT_DTL_FUGR_DELETE  deletes function groups associated with old projects
    Tables
    CNVMBTUSEDMTIDS   lists obsolete MTIDs
    IMPORTANT NOTES
    Note 894307 - TDMS: Tips, tricks, general problems, error tracing
    Note 1405597 - All relevant notes for TDMS Service Pack 12 and above
    Note 1402704 - TDMS Composite Note : Support Package Independent
    Note 890797 - SAP TDMS - required and recommended system settings
    Note 894904 - TDMS: Problems during deletion of data in receiver system
    Note 916763 - TDMS performance "composite SAP note"
    Note 1003051 - TDMS 3.0 corrections - Composite SAP Note
    Note 1159279 - Objects that are transferred with TDMS
    Note 939823 - TDMS: General questionnaire for problem specification
    Note 897100 - TDMS: Generating profiles for TDMS user roles
    Note 1068059 - To obtain the optimal deletion method for tables (receiver)
    Note 970531 - Installation and delta upgrade of DMIS 2006_1
    Note 970532 - Installation of DMIS Content 2006_1
    Note 1231203 - TDMS release strategy (Add-on: DMIS, DMIS_CNT, DMIS_EXT...)
    Note 1244346 - Support Packages for TDMS (add-on DMIS, DMIS_CNT, ...)
    I'm doing this for an ECC system running ECC 6.0 EHP6, by the way.
    Still, any help with my issue on the delete would be helpful, but do post tips I don't know about.
    NICK

  • Calculating average time from two records from the same table.

    Hi all
    I need to calculate the average time between two events that are recorded in the same table.
    The table is TMS_MESSAGE_AUDIT_LOG
    MESSAGE_ID VARCHAR2(16 BYTE) NOT NULL,
    MESSAGE_VERSION NUMBER(2) NOT NULL,
    CREATE_TM VARCHAR2(18 BYTE) NOT NULL,
    MESSAGE_STATUS VARCHAR2(30 BYTE),
    TRANSACTION_TYPE_NM VARCHAR2(30 BYTE),
    MESSAGE_TP VARCHAR2(3 BYTE),
    WORKFLOW_OBJECT VARCHAR2(30 BYTE) NOT NULL,
    WORKFLOW_REQUEST VARCHAR2(30 BYTE) NOT NULL,
    WORKFLOW_RETURN_CD VARCHAR2(30 BYTE) NOT NULL,
    AUDIT_ACTION VARCHAR2(255 BYTE),
    LAST_UPDATE_USER_LOGON_ID VARCHAR2(12 BYTE),
    LOCAL_TM VARCHAR2(18 BYTE) NOT NULL,
    LOCAL_TIME_ZN_NM VARCHAR2(70 BYTE) NOT NULL,
    LOCAL_DAYLIGHT_IN CHAR(1 BYTE) NOT NULL,
    FPRINT VARCHAR2(30 BYTE)
    What I now need is:
    When the MESSAGE_ID is the same, I need the average time between when the MESSAGE_STATUS is AA and when it is BB (I need the time out of the CREATE_TM field),
    and this for every 15-minute interval.
    Because this table will become big (millions and millions of records), it needs to be fast.
    Can anybody help me?
    Marcel

    Something like this?
    CREATE TABLE wr_test
    ( message_id                 VARCHAR2(16 BYTE) NOT NULL
    , message_version            NUMBER(2) NOT NULL  -- Assumption: Acknowledged ver > Received ver
    , create_tm                  VARCHAR2(18 BYTE) NOT NULL
    , message_status             VARCHAR2(30 BYTE)
    , transaction_type_nm        VARCHAR2(30 BYTE)
    , workflow_object            VARCHAR2(30 BYTE) DEFAULT 'x' NOT NULL
    , workflow_request           VARCHAR2(30 BYTE) DEFAULT 'x' NOT NULL
    , workflow_return_cd         VARCHAR2(30 BYTE) DEFAULT 'x' NOT NULL
    , audit_action               VARCHAR2(255 BYTE)
    , last_update_user_logon_id  VARCHAR2(12 BYTE)
    , local_tm                   VARCHAR2(18 BYTE) NOT NULL
    , local_time_zn_nm           VARCHAR2(70 BYTE) DEFAULT 'GMT' NOT NULL
    , local_daylight_in          CHAR(1 BYTE) DEFAULT 'x' NOT NULL );
    INSERT ALL
    INTO   wr_test
           ( message_id
           , message_version
           , create_tm
           , message_status
           , local_tm )
    VALUES ( message_id
           , 1
           , create_tm
           , '(Receive)'
           , TO_CHAR(local_tm,'YYYYMMDD HH24:MI:SS') )
    INTO   wr_test
           ( message_id
           , message_version
           , create_tm
           , message_status
           , local_tm )
    VALUES ( message_id
           , 2
           , create_tm
           , 'Wait CLSB Ack'
           , TO_CHAR
           ( local_tm + NUMTODSINTERVAL(DBMS_RANDOM.VALUE(0,2e5),'SECOND')
           , 'YYYYMMDD HH24:MI:SS' ) )
    SELECT ROWNUM AS message_id
         , TO_CHAR(SYSDATE,'YYYYMMDD HH24:MI:SS') AS create_tm
         , DATE '2000-01-01' + DBMS_RANDOM.VALUE(0,3) AS local_tm
    FROM dual CONNECT BY ROWNUM < 100000;
    WITH src AS
         ( SELECT message_id
                , message_status
                , message_version
                , TO_DATE(SUBSTR(local_tm,1,17),'YYYYMMDD HH24:MI:SS') AS dt
                , TO_DATE(SUBSTR(local_tm,1,8),'YYYYMMDD') AS dt_day
                , TO_CHAR(TO_DATE(SUBSTR(local_tm,10,8),'HH24:MI:SS'),'SSSSS') AS dt_sec
           FROM   wr_test
           WHERE  message_status IN ('(Receive)','Wait CLSB Ack') )
    SELECT dt_day + NUMTODSINTERVAL(period,'SECOND') AS dt
         , NUMTODSINTERVAL(AVG(elapsed),'DAY') AS avg_elapsed
         , NUMTODSINTERVAL(MIN(elapsed),'DAY') AS min_elapsed
         , NUMTODSINTERVAL(MAX(elapsed),'DAY') AS max_elapsed
         , COUNT(*)
    FROM   ( SELECT message_id
                  , message_status
                  , dt_day
                  , TRUNC(dt_sec/300)*300 AS period
                  , LEAD(dt) OVER (PARTITION BY message_id ORDER BY message_version) AS ack_dt
                  , LEAD(dt) OVER (PARTITION BY message_id ORDER BY message_version) - dt AS elapsed
             FROM   src ) cal
    WHERE  cal.message_status = '(Receive)'
    GROUP BY dt_day, period
    ORDER BY 1;
    Replace "wr_test" with "tms_message_audit_log" in the WITH subquery to test on your data.

  • Graphing Time Data

    I have a report that pulls data from a phone system database. Each record has the amount of time spent talking, waiting, holding and ringing for each person, as numeric fields holding the total number of seconds for a block of calls. I've been able to create the report and write formulas that convert the seconds for each field into a time field, but of course that converts it to a specific date, which is not what I really want. If they run the report for a large time period like a month, there will likely be more than 24 hours in the hour position, which the function truncates to a single day.
    As a work-around I've been able to use formulas to calculate the number of hours/minutes/seconds and then create more formulas to calculate the summary and grand totals for each person.  Not the best solution, but it works.  My problem is that now they would like to add a graph to the report showing the average total time spent on the phone for each person.
    Of course I can't graph my calculated time since the formulas result in a String value.  And it doesn't seem to want to let me sum a time field so I can graph a true time value either.  I thought maybe I could convert my finished average time per person into a time value at the group level since that wouldn't be over 24 hours, but that didn't work either.
    Was hoping someone would have some experience in working with time data.  Any suggestions would be helpful.

    Hi Jay,
    Thanks for the response, but I don't think I explained my problem clearly.  Here's how my data is stored (sorry about the lack of columns, so I separated them with "|"):
    Phone Rep | Day    | Talk Time  | Hold Time | Total Time
    John Doe  | 1-1-09 | 5,000 sec  | 1,200 sec | 6,200 sec
    John Doe  | 1-2-09 | 6,000 sec  | 2,000 sec | 8,000 sec
    Total     |        | 11,000 sec | 3,200 sec | 14,200 sec
    The read out on the report should read:
    Phone Rep | Total Talk Time | Total Hold Time | Grand Total Time
    John Doe  | 3:03:20         | 0:53:20         | 3:56:40
    Avg/day   | 1:31:40         | 0:26:40         | 1:58:20
    When I convert the time in seconds to a time variable with TimeSerial(hh,mm,ss), it gives me the time in the correct format, but it's listed as simply a time of day. If the report runs for many days, the time values could require the hours to go over 24, such as 155:35:26, to reflect a month's worth of calls. My formulas that calculate the readout as shown above can do this with no problem. The problem occurs when I try to graph the average at the bottom. I can't seem to find a way to chart the average time per day.
    I've tried summing the total time in a group and then finding a way to format the y axis to show the time format. I've tried creating a formula to convert the individual record into a time format and then summing that for the graph, but it doesn't give me the option to sum, only to count or other non-numeric options. I've also tried creating a formula to convert the finished average calculation into a field that can be graphed by itself, but when I insert the chart it doesn't list my formula in the available fields to graph.
    I would even be willing to create the chart in a sub report that somehow pulls in just the total average and then tries to graph that, or use the Microsoft OLE chart object to try and push the data to that and insert the finished chart, but having never done either of these, I was hoping for an easier solution.
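    For reference, the over-24-hour formatting is plain integer arithmetic. Here is a rough sketch (Python standing in as pseudocode for the Crystal formulas; the function name is illustrative):

    def seconds_to_hms(total_seconds):
        # Format a duration as H:MM:SS without wrapping at 24 hours,
        # e.g. 560126 -> '155:35:26'.
        hours, rest = divmod(int(total_seconds), 3600)
        minutes, seconds = divmod(rest, 60)
        return f"{hours}:{minutes:02d}:{seconds:02d}"

    print(seconds_to_hms(14200))       # 3:56:40 (grand total above)
    print(seconds_to_hms(14200 // 2))  # 1:58:20 (average over 2 days)

    As for the chart itself, one option is to graph the plain numeric seconds (which can be summed and averaged) and only apply this kind of formatting to the labels.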
    Thanks
    Edited by: Ken Skinner on Jan 25, 2009 8:55 PM
    Edited by: Ken Skinner on Jan 25, 2009 8:59 PM
