Analysing Task Audit, Data Audit and Process Flow History

Hi,
Internal Audit dept has requested a bunch of information that we need to compile from the Task Audit, Data Audit and Process Flow History logs. We have all the info available, but not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
Many Thanks.

I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
This info below will help you translate some of the information in the tables, etc. You could report on it from the Audit tables directly or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
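For the periodic move, a rough sketch along these lines works (T-SQL; the archive table name is illustrative and is assumed to already exist with the same columns as the audit table):
-- Move task audit rows older than a cutoff into an archive table.
-- StartTime is stored as an OLE automation date (float), so shift the cutoff back by adding 2.
declare @cutoff float
set @cutoff = cast(cast('1/1/2012' as datetime) as float) + 2
insert into YourAppNameHere_task_audit_archive
   select * from YourAppNameHere_task_audit where StartTime < @cutoff
delete from YourAppNameHere_task_audit where StartTime < @cutoff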
For instance, if you wanted to pull Metadata Load, Rules Load and Member List Load entries, you could run a query like this. (NOTE: @strAppName should be equal to the name of your application .... )
The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to the user-friendly name.
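On the time conversion: HFM stores StartTime and EndTime as OLE automation dates (a float day count from 30-Dec-1899), while day zero of a SQL Server datetime is 01-Jan-1900, which is why the queries below subtract 2 before casting. A quick sanity check (the literal is just a sample value):
select cast(41152.5 - 2 as smalldatetime)   -- returns 2012-08-31 12:00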
-- Declare working variables --
declare @dtStartDate as nvarchar(20)
declare @dtEndDate as nvarchar(20)
declare @strAppName as nvarchar(20)
declare @strSQL as nvarchar(4000)
-- Initialize working variables --
set @dtStartDate = '1/1/2012'
set @dtEndDate = '8/31/2012'
set @strAppName = 'YourAppNameHere'
--Get Rules Load, Metadata, Member List
set @strSQL = '
select au.sUserName as [User], ''Rules Load'' as Activity,
      cast(ta.StartTime - 2 as smalldatetime) as [Time Start],
      cast(ta.EndTime - 2 as smalldatetime) as [Time End],
      ta.ServerName, ta.strDescription, ta.strModuleName
   from ' + @strAppName + '_task_audit ta
   join hsv_activity_users au on au.lUserID = ta.ActivityUserID
   where ta.ActivityCode = 1
      and cast(ta.StartTime - 2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select au.sUserName as [User], ''Metadata Load'' as Activity,
      cast(ta.StartTime - 2 as smalldatetime) as [Time Start],
      cast(ta.EndTime - 2 as smalldatetime) as [Time End],
      ta.ServerName, ta.strDescription, ta.strModuleName
   from ' + @strAppName + '_task_audit ta
   join hsv_activity_users au on au.lUserID = ta.ActivityUserID
   where ta.ActivityCode = 21
      and cast(ta.StartTime - 2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
union all
select au.sUserName as [User], ''Memberlist Load'' as Activity,
      cast(ta.StartTime - 2 as smalldatetime) as [Time Start],
      cast(ta.EndTime - 2 as smalldatetime) as [Time End],
      ta.ServerName, ta.strDescription, ta.strModuleName
   from ' + @strAppName + '_task_audit ta
   join hsv_activity_users au on au.lUserID = ta.ActivityUserID
   where ta.ActivityCode = 23
      and cast(ta.StartTime - 2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
-- Run the assembled statement
exec sp_executesql @strSQL

In regards to activity codes, here's a quick breakdown on those ....
ActivityID     ActivityName
0     Idle
1     Rules Load
2     Rules Scan
3     Rules Extract
4     Consolidation
5     Chart Logic
6     Translation
7     Custom Logic
8     Allocate
9     Data Load
10     Data Extract
11     Data Extract via HAL
12     Data Entry
13     Data Retrieval
14     Data Clear
15     Data Copy
16     Journal Entry
17     Journal Retrieval
18     Journal Posting
19     Journal Unposting
20     Journal Template Entry
21     Metadata Load
22     Metadata Extract
23     Member List Load
24     Member List Scan
25     Member List Extract
26     Security Load
27     Security Scan
28     Security Extract
29     Logon
30     Logon Failure
31     Logoff
32     External
33     Metadata Scan
34     Data Scan
35     Extended Analytics Export
36     Extended Analytics Schema Delete
37     Transactions Load
38     Transactions Extract
39     Document Attachments
40     Document Detachments
41     Create Transactions
42     Edit Transactions
43     Delete Transactions
44     Post Transactions
45     Unpost Transactions
46     Delete Invalid Records
47     Data Audit Purged
48     Task Audit Purged
49     Post All Transactions
50     Unpost All Transactions
51     Delete All Transactions
52     Unmatch All Transactions
53     Auto Match by ID
54     Auto Match by Account
55     Intercompany Matching Report by ID
56     Intercompany Matching Report by Acct
57     Intercompany Transaction Report
58     Manual Match
59     Unmatch Selected
60     Manage IC Periods
61     Lock/Unlock IC Entities
62     Manage IC Reason Codes
63     Null
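If you load the codes above into a small lookup table, the hardcoded UNION per activity collapses into a single join. A minimal sketch (the lookup table name is mine; populate it from the list above and swap in your application name):
create table hfm_activity_lookup (ActivityID int primary key, ActivityName nvarchar(50))
insert into hfm_activity_lookup values (1, N'Rules Load')
insert into hfm_activity_lookup values (21, N'Metadata Load')
insert into hfm_activity_lookup values (23, N'Member List Load')
-- ... and so on for the remaining codes ...
select au.sUserName as [User], al.ActivityName as Activity,
      cast(ta.StartTime - 2 as smalldatetime) as [Time Start],
      cast(ta.EndTime - 2 as smalldatetime) as [Time End],
      ta.ServerName, ta.strDescription, ta.strModuleName
   from YourAppNameHere_task_audit ta
   join hsv_activity_users au on au.lUserID = ta.ActivityUserID
   join hfm_activity_lookup al on al.ActivityID = ta.ActivityCode
   where al.ActivityID in (1, 21, 23)   -- optional: restrict to specific activities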

Similar Messages

  • HFM Process Control and Process Flow History

    Hi,
    When I review the Process Flow History for top parent members of the consolidation, the POV shows <Scenario> <Year> <Period> <Entity> <Entity Currency>. In all other applications and entities, the <Entity Currency> member shows as expected, but in a new application I've built it shows as "EUR Total" (EUR is one of the currencies), even though EUR is not the default currency of the application or the entity's value. Could anyone explain why it would show that way or where it could be coming from?
    I'm using HFM v11.1.1.3.
    Thanks in advance.

    Did you add new metadata?

  • FI Functional Audit questionnaire procedure and process

    Hi Friends
    Please provide me the information or related documentation for the FICO Audit Questionnaire procedure and process flow.
    Thanks in advance
    Srini
    Edited by: Srinivas A on May 8, 2009 5:07 PM

    Hello,
    http://www.auditnet.org/sapaudit.htm
    Hope this will be a good resource for you.
    This is a free site providing information on audit. You can register and go through the materials.
    Regards,
    Ravi

  • AIS (Audit Information Systems) and Process controls

    Dear Experts
    Have you seen any customer using AIS (Audit Information Systems) and Process Controls simultaneously for audit purposes? Can we use GRC Process Controls for scheduling regular audits, in addition to the Internal Control Assessments?
    Thanks & Regards
    Swarna

    Hi Swarna,
    Have you seen any customer using AIS (Audit Information Systems) and Process Controls simultaneously for audit purposes?
    Yes. AIS deals purely with SAP users and authorizations, while PC can be used at different levels.
    Can we use GRC Process Controls for scheduling regular audits, in addition to the Internal Control Assessments?
    The data from PC is used in the ICAs, so PC itself can't/may not provide all the information that is required by the auditors.
    Regards,
    Raghu

  • Data Type for Process Flow... Problem with Date?

    I've got a problem passing parameters in a process flow.
    I have a mapping with a parameter DATE_EXEC (data type: DATE) and a default value of TO_DATE('20/01/2007' , 'dd/mm/yyyy'). My mapping works fine when I launch it.
    I have a process flow which contains the mapping. This process flow has a parameter DATE_EXEC (data type: DATE). I bind the 2 DATE_EXEC parameters. But when I launch my mapping the value is not recognized. I tried with:
    - TO_DATE('20/01/2007' , 'dd/mm/yyyy')
    - 20/01/2007
    - 2007.01.20
    - 2007-01-20
    My question is: what are the data types in process flows? They are not Oracle types.
    For example, a parameter in a mapping which is a VARCHAR2 must be input between quotes, but if you bind it to a parameter of a process flow which is a STRING (not an Oracle data type), must you input it without quotes?
    Does anybody have some rules about that?
    I apologize for my English; I'm French.

    Here is some information on the literal quote-or-not-quote question and what I think you need to do at the end; hope it helps. Not exactly intuitive... since the flow designer (you) has to know what is a PL/SQL object and what is not.
    1. Literal = FALSE
    When Literal = FALSE is set, the value entered must be a valid PL/SQL expression, which is evaluated at the Control Center, e.g.
    'Hello World!'
    22 / 7
    2. Literal = TRUE
    When Literal = TRUE, the value is dependent on the type of activity. If the activity is a PL/SQL object, i.e. a Mapping or Transformation, then the value is a PL/SQL snippet. The critical difference here is that the value is macro-substituted into the call for the object. The format of the value is identical to that entered as the default value in the Mapping editor, e.g.
    'Hello World!'
    sysdate()
    If the activity type is not a PL/SQL object, then the value is language independent, e.g.
    Hello World
    3.1427571
    What you should try......
    Check the map activity parameter in your process flow to see if Literal is false (an expression); if not, set it to false and then try using your TO_DATE('20/01/2007' , 'dd/mm/yyyy') expression, deploy your flow and execute. Alternatively, the user guide defines the DATE type for flows with the format YYYY-MM-DD, so you can have the parameter value as '2007-01-20', use Literal equal to true, and remember to quote your value.
    Cheers
    David

  • Ways of creating contract and process flow of contracts

    hi friends
    What are the ways of creating a contract, and what is the process flow of contracts?
    thanks for your help
    regards
    krishna

    hi,
    In the MM Purchasing component, a contract is a type of outline purchase agreement against which release orders (releases) can be issued for agreed materials or services as and when required during a certain overall time-frame.
    Contracts can take the following forms:
    Quantity contracts
    Use this type of contract if the total quantity to be ordered during the validity period of the contract is known in advance. The contract is regarded as fulfilled when release orders totaling a given quantity have been issued.
    Value contracts
    Use this type of contract if the total value of all release orders issued against the contract is not to exceed a certain predefined value. The contract is regarded as fulfilled when release orders totaling a given value have been issued.
    You can also set up corporate buying contracts with your vendors. These are valid for all plants and company codes within a client (see Centrally Agreed Contract).
    You can create a contract as follows:
    Manually
    You enter all data relating to the contract manually.
    Using the referencing technique
    As reference document (the document you copy from), you can use:
    Purchase requisitions
    RFQs/quotations
    Other contracts
    CREATION OF CONTRACT MANUALLY:
    Choose Outline agreement --> Contract --> Create(ME31K)
    The initial screen appears.
    Enter the necessary data. If you make any specifications under the group heading Default data, this data will appear as the default data in each item.
    In the Agreement type field, specify whether you are creating a quantity or value contract, for example.
    Press ENTER .
    The header data screen appears.
    Enter the contract validity period. Check the other fields on this screen and make any necessary changes (e.g. the terms of payment) and define the header conditions.
    Press ENTER .
    The item overview screen appears.
    On this screen, enter the information for each item (material number, target quantity, price, receiving plant, or account assignment, etc.) using the same procedure as with purchase orders.
    Material without a master record: leave the field for the material number empty and enter the following:
    – Short description of the relevant material or service in the Short text field
    – Material group to which the material belongs, in the Material group field
    – Account assignment category
    You can enter U (unknown) or the category of an account assignment.
    – The target quantity and the order unit
    If you specify an account assignment category other than U (field A), you must enter the relevant account assignment data for the item. To do so, choose Item -> Account assignments (see also Account Assignment).
    If necessary, review the details for each item. Select the item(s) to review. Then select Item -> Details.
    Enter the desired conditions.
    Enter further text for the item if any additional instructions to the vendor or Goods Receiving are necessary. Choose Item -> Texts -> Text overview.
    Save the contract.
    Hope it helps..
    Regards
    Priyanka.P
    AWARD IF HELPFUL
    Edited by: Priyanka Paltanwale on Aug 25, 2008 7:20 AM

  • Scheduling Mappings and Process Flows

    Can you please tell me how to schedule mappings and process flows in a little detail, if anyone has done that? I think OEM can be used for this, but I don't have any clear-cut idea of how to do it. Can the mappings/process flows be scheduled without using the OEM console?
    Waiting for you suggestions.
    rgds
    -AP

    Hi
    You can schedule your mappings and process flows with OEM or a database job.
    You will find script templates in the OWB HOME/owb/rtp/sql directory, and you can schedule them.
    If you want to do it in OEM, use the oem_exec_template.sql file, creating an OEM job. Read the OWB documentation about it. If you have any question about it, ask; I have done this procedure many times.
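    If you go the database-job route, one option is an Oracle DBMS_SCHEDULER job that calls a small PL/SQL wrapper built around the logic in those template scripts. A hedged sketch (the schema, wrapper procedure, and schedule are illustrative, not OWB-supplied names):
    begin
      dbms_scheduler.create_job(
        job_name        => 'RUN_MY_PROCESS_FLOW',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'begin my_schema.run_my_flow; end;',  -- hypothetical wrapper around the template logic
        start_date      => systimestamp,
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => true);
    end;
    /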
    Ott Karesz
    http://www.trendo-kft.hu

  • OWB : Runtime Repo for Target and Process Flows on Different M/C

    Hi,
    I have an environment where the Runtime Repositories for the Target Warehouse and Process Flows (Workflows) are on different machines. I have deployed:
    * the mappings and Target Tables on the Target Warehouse
    * Process Flows on the Workflow Server, which resides on a separate M/C.
    When I try executing the Process Flows for the Mappings , it errors out indicating that the objects may not have been deployed. Looks like the Process Flows cannot see the Mappings deployed.
    Can somebody help me here?

    I think the target schema and the runtime repository should be in the same instance.

  • Getting a process-flow audit id in the process flow itself

    Hi,
    I am using OWB 11gR2 and want to capture the audit_id of the process flow itself at the process-flow level.
    I want to use this to pass it through to the mappings in the process flow.
    I know how to get the audit_id when you're in the mapping (get_audit_id), but I want the audit_id of the process flow when I am "inside" the process flow.
    When I have this I can let all of my mappings in the flow receive the same id.
    When I set up a parameter at the process flow and specify get_audit_id there, it errors on me.
    Does anybody know what to specify here?

    ok,
    I figured it out myself and answered myself in another thread.
    Basically it goes like this:
    In the master flow you have sub-process-flow objects to which you want to pass along the audit_id of the master flow.
    That way you can pass the same audit id to every mapping in all of the flows.
    But it would be nice to be able to run sub-flows independently from the master flow (for testing etc.) but still feed the same id (whatever that is) to the mappings in that process flow.
    And at the lowest level this applies to mappings as well: be able to run a mapping on its own and still get an id to store in a field.
    The same applies to feeding a process date to all the mappings in your flows (used for dwh purposes).
    How does this work on the mappings?
    Create a mapping input parameter and put 2 params in it.
    One date, one number. The date part is easy: just put SYSDATE in it.
    The number part would ideally hold get_audit_id as the default value. Unfortunately this generates an error,
    so put a default value of 1 here. Create a constant on the mapping with the value get_audit_id in it.
    Create an expression that tests whether the input parameter still has the default value of 1; if so, no audit_id was fed into the mapping by a process flow, so make the expression use the constant instead. Use the output of the expression in your mapping.
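    In expression form, the test described above amounts to something like this (parameter and constant names are illustrative):
    CASE WHEN INPUT_AUDIT_ID = 1 THEN C_GET_AUDIT_ID ELSE INPUT_AUDIT_ID END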
    How does this work in process flows ?
    You'll have two different parts here:
    -feeding from flow to a sub-flow
    -feeding from flow to mapping
    Flow to subflow:
    You cannot bind a flow parameter to a subprocess parameter, so you'll need to create 2 variables (process date / audit id).
    Create two parameters, one date with sysdate as default, one number with 1 as default.
    Use an assign operator to bind the audit_id parameter to the audit id variable.
    Use another one to bind to the same variable, but specify parent_audit_id as the value instead of binding it to the parameter.
    Use conditional routing on these two assign operators to have exactly one of them executed.
    This will ensure that the variable gets either parent_audit_id or the value of the input parameter of the flow.
    Use another assign op. to bind the date input to the date variable.
    Bind the parameters of the sub-flows to the variables.
    Flow to mapping:
    Use the same procedure as described above. The only difference here is that you can bind a mapping parameter to a flow parameter.
    This means that you don't need the assign stuff for the date parameter, since you can bind the date parameter of the mapping to the input parameter of the flow.
    Hope this helps someone ...

  • TAS and process flow

    Dear All,
    We are trying to use TAS for existing imeplentation.
    We are not able to assign load id to Sales Order manually even config settings are done for manual one.
    Pls. guide whether the process flow is correct for this.
    Create Load ID --> Assign to SO --> Creat Outbound Delviery --> Create Bulk Shipment --> Create Loading Confirmation --> Create Delviery Confirmation.
    Information on SAP help and in PDF file of TAS is quite confusing.Process flow is not clear.
    Thanks and Regards,
    Vijay Mundke

    Hi Vijay,
    the process can have different variations, but the document you want to send to the TAS system has to have the load-id assigned. The load-id triggers the TAS relevance.
    If you want to send the sales order to an external system, you need to assign the load-id to the sales order.
    usually, the sales order is sent to a TPI system (transportation planning system) and the transportation system sends back the delivery and bulk shipment (idoc OILSHL). This means that this order goes to this truck. The shipment represents the truck on which the delivery is scheduled. Each document that is posted after-event has a function module (one for the delivery and one for the bulk shipment).
    Then, you would send the bulk shipment to the terminal (refinery or depot). This means that a load-id needs to be assigned to the bulk shipment (idoc OILSHL). The TAS system receives the shipment, and the loading must be done with reference to this shipment/load-id. Then, the TAS sends back the loading data (OILLDD); mainly, the OILLDD file contains the reference, the material data, the load qty, the unit, the density and the temperature. The OILLDD idoc has many fields, but only a few are mandatory. I can send you a list of the mandatory fields if you can use it.
    Then, the system posts the load confirmation automatically based on these data. Depending on the set-up, you can either have the system post the load confirmation only, or the load confirmation and the delivery confirmation. Both documents have standard function modules. I strongly recommend using the standard function modules. They work. I have used them on several projects, and all you need are some exits. If you post only the load confirmation, the delivery confirmation can be posted manually. If the customers are invoiced based on arrival quantities, the delivery confirmation can have different quantities (temperature can be quite different, especially in India). If invoicing is based on loaded quantities, you can post both the load and delivery confirmation with the same data, or set up the customizing so that a load confirmation will be the end of the process (based on the incoterm).
    Before you can do the customizing, you need to check which processes you have. Whether you have a shipment or a pick-up process determines whether you should send a sales order or a shipment to the terminal. Most oil companies have their own fleet and deliver their fuel via a truck to the customer -> in this case, you need to send a bulk shipment to the terminal. But often the customers also pick up the fuel at the refinery. Then you only need a sales order, and a delivery and goods issue is sufficient. However, if you need to manage the plate number at the refinery, you might want to process both of these variants via a bulk shipment. In all cases, the loadings will come back with an OILLDD idoc.
    Depending on the answer to this question, you can set-up the tas relevance customizing. If you choose to send bulk shipments, you set up tas relevance for shipments. This defines simply in which cases a load-id is required in the shipment. Say you have 2 refineries which are represented in the transportation planning point (ref1 and Ref2). If only one refinery will have an automatic interface, you set up only that one to require a load-id. If you have several product groups (say Bitumen, white fuels) and some of them are automated (typically fuel) and other not (bitumen) and you have set them up as bulk shipment types, then you can further differentiate. Then, you need one load-id type for each process variation.
    I like to use the following:
    PU-ORDER (you send an order and the TAS posts the delivery and goods issue)
    SH-LC (you send a bulk shipment and the TAS posts the load confirmation only)
    SH-LCDC (you send a bulk shipment and the TAS posts load and dlv confirmation)
    SH-DC (the load confirmed shipment is sent to the arrival terminal and the arrival terminal sends back the delivery confirmation)
    The most commonly used is the SH-LC.
    After that, you name the control structures and the LID function groups the same. (Types are A for pick-up, B for load confirmation and E for delivery confirmation.) It makes the whole thing much clearer. Then, you just have to assign to each function group the inbound function modules of the standard.
    Standard load confirmation for SH-LC for example and standard delivery confirmation for SH-DC.  I hope this helps.
    Regards, Petra

  • Load .csv file data with OWB process flow using Web

    Hi,
    I have a file on my local machine (machines of multiple users) and need to load the data through a web user interface.
    Let's say we have a web page with multiple radio buttons corresponding to different sources; clicking a button will pass the path of the .csv file through the application (API or Java programming interface) and execute an OWB process flow, accepting the file path as an input parameter, for loading purposes.
    It should also facilitate viewing and updating data through the web based on user requests.
    I need your guidance on how I can implement this with OWB 11g R2.
    I am assuming web browser functionality; please confirm whether this is possible and, if yes, please shed some light on the steps to implement it.
    Thanks

    Hi David,
    Thanks for your reply.
    I understand your proposed solution, but my requirement is as follows.
    1. We are currently considering a web page, likely to be implemented in Java, allowing users to load .csv file data into the staging area (loading a flat file into a database table).
    Case 1: assuming the OWB software is not installed on the user's machine (I think not).
    Is it possible through the web page (in this case a Java page) to trigger a Java procedure, a PL/SQL procedure, or an integration of both to load data into the staging area? If yes, how would it affect the performance of a data load with a 1 GB file?
    Case 2: the OWB client software is installed on the user's machine; does passing parameters at runtime mean passing them manually?
    In case it is automated, how should I pass the machine name and path to the OWB runtime web browser?
    Could you please give me guidance on how I should achieve this functionality with the APEX customization part?
    Thanks again for your support.
    Anil
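    One database-side option for the staging step discussed above, whatever the front end, is to expose the uploaded .csv as an Oracle external table, so the web tier only has to land the file in a known directory. A sketch (directory path, file name, and columns are illustrative):
    create or replace directory stage_dir as '/u01/app/uploads';
    create table stg_upload_ext (
      id    number,
      name  varchar2(100),
      amt   number
    )
    organization external (
      type oracle_loader
      default directory stage_dir
      access parameters (
        records delimited by newline
        fields terminated by ','
        missing field values are null
      )
      location ('upload.csv')
    )
    reject limit unlimited;
    -- Staging is then a plain insert-select that a process flow or procedure can run:
    -- insert into your_staging_table select * from stg_upload_ext;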

  • How to solve delay in a program with data acquisition and processing

    Hello, I am a starter in LabVIEW programming. I am working on a system which contains a roller, a piston and an A/D card from Data Translation Inc (DT304). I am using LabVIEW to get speed data for the roller (which is voltage first and then converted into speed) from the analog input channels, and then using the analog output channels to control the friction force applied to the roller by a piston.
    I am using a Waveform Chart to show the speed of the roller. However, as I add more components to the program, I get more and more delay in the speed display and also in the output control of the piston. Also, the longer I use the program, e.g. from morning to night, the slower it becomes.
    My question is: how does LabVIEW store data, and how is data stored and released in the buffer? Is it because I am loading too much data into the buffer that it becomes slower? And how do I solve these delays?
    I am using the "save to file" function to write data to a ".lvm" file, but I changed the saving by using a button to enable and disable it. So I don't think the problem is that I am always saving the data to a file.
    Thank you very much.
    Jessie

    I'm betting it's two things:
    1. Program Architecture
    and 2. Dynamically resizing arrays.
    With the program architecture... if you have one do-while loop that acquires data from the DAQ, processes it, opens/indexes/writes/closes a file, then comes back around, then as your arrays and files increase in size, the loop is going to take longer to execute.  A Producer-Consumer loop (a good tutorial on them can be found here) has one loop that acquires data and stuffs the data into a queue.  This will buffer the data while it's in transition between acquisition and processing.  A second loop in parallel takes data in from the queue, processes it, then comes back around.  The two loops operate independently of each other, so even if the consumer loop takes longer as the files or math get more complex, the producer loop continues to run full speed.
    Second is the arrays.  Every time you append data into an array, LabVIEW has to make a copy of the data that's in the array.  If you append small amounts of data to an array over and over and over again, eventually LabVIEW is going to be copying very large amounts of data over and over and over again.  The producer-consumer architecture can alleviate this problem.

  • Difference between service data objects and process data objects

    Hi
    Can anybody tell me the difference between SDOs (Service Data Objects) and PDOs (Process Data Objects)? I am using a 2-port PCI-CAN Series 2 card.
    If anybody knows the answer then please reply.
    Thanking You

    Hi,
    See the online help file from our NI CANopen Library for LabVIEW for some basic information about SDOs and PDOs.
    More information should be available from the web.
    DirkW
    Attachments:
    lvcanopenvhelp.zip (154 KB)

  • Internal Order process config details and process flow

    Hi All,
    Please help me,
    I need the process flow and configuration for assets that we buy from vendors, store, and sell to our stores.
    How do I configure internal orders step by step and post to Asset Management (instead of normal sales to a revenue account)?
    Thanks and regards

    Dear Ravi,
    1. First create an internal order through KO01. The internal order type should be for investment purposes.
    2. Give the relevant details, go to the investment tab and give the investment profile.
    3. Then in the toolbar go to Extras and select "asset under construction". An asset under construction will be created; give the details of this asset.
    4. Then release the internal order in KO02.
    5. Then go to GB60 and post an invoice with this internal order as the account assignment object.
    6. Go to KO02 and under Extras see the order balance; that amount will be reflected in the order.
    7. Now go to KO88 and execute the internal order, through which your internal order will be credited and the AUC which you created will be debited.
    8. Now go to KO02 and see the order balance; it will show 0 as it has moved to the AUC.
    9. Now go to the settlement rule, give the asset to which you want to transfer this amount, take the category as FXA and give the details.
    10. Now go to KO02 and change the status of the internal order to "TECO complete".
    11. Now go to KO88 and execute the order; now the AUC is credited and your asset is debited.
    These are the exhaustive steps which I think will solve your purpose.
    But make sure the configuration of the order type and investment profile is properly maintained.
    Thanks
    sap firdo

  • Is it mandatory to use Business Objects for data transfer and workflow?

    In our enterprise projects we deal with some screens where we add or update some data, and we then retrieve that data and display it to the user in various ways, like showing it on screen as a report or viewing it in Excel. Most of the time, though we identify the business objects based on the nouns in the requirements document, while implementing there will not be any business logic in the objects. Almost all the objects which we identify as business objects may have methods like add, delete, update and retrieve. Still, in these cases, is it required to have business objects, or can we use transfer objects to send the data for saving and retrieving, or is there any other pattern/guideline for dealing with such cases?
    I really appreciate your comments on this topic.
    Also I apologize ahead of time if my explanation is not clear to you.
    Thanks in advance.
    Regards,
    Rizwan.

    In my opinion, the DAO pattern would be suitable for you. Generally the DAO (Data Access Object) will have the CRUD (create, read, update, delete) methods to retrieve data from a data source (e.g. a database). Based on the data it will create the DTOs (Data Transfer Objects) and pass them to the caller. It will also receive DTOs from the caller and save them to the data source. Thus in your case you can
    - remove the CRUD methods from the Business Objects (BO) and make them pure DTOs and use them with DAOs (if you are using JDBC code inside your application and you don't have much validation or processing logic in your BOs).
    http://www.corej2eepatterns.com/Patterns2ndEd/DataAccessObject.htm
    OR
    - use BOs in combination with DAOs (if using the database connection from a pool and the BOs have complicated processing logic).
    http://www.corej2eepatterns.com/Patterns2ndEd/BusinessObject.htm
