TAS and process flow

Dear All,
We are trying to use TAS for an existing implementation.
We are not able to assign a load ID to the sales order manually, even though the configuration settings for manual assignment are done.
Please advise whether the following process flow is correct:
Create Load ID --> Assign to SO --> Create Outbound Delivery --> Create Bulk Shipment --> Create Loading Confirmation --> Create Delivery Confirmation.
The information in SAP Help and in the TAS PDF file is quite confusing; the process flow is not clear.
Thanks and Regards,
Vijay Mundke

Hi Vijay,
The process can have different variations, but the document you want to send to the TAS system has to have the load ID assigned. The load ID triggers the TAS relevance.
If you want to send the sales order to an external system, you need to assign the load ID to the sales order.
Usually, the sales order is sent to a TPI system (transportation planning system), and the transportation system sends back the delivery and bulk shipment (IDoc OILSHL). This means that this order goes on this truck; the shipment represents the truck on which the delivery is scheduled. Each document that is posted after the event has its own standard function module (one for the delivery and one for the bulk shipment).
Then you would send the bulk shipment to the terminal (refinery or depot). This means that a load ID needs to be assigned to the bulk shipment (IDoc OILSHL). The TAS system receives the shipment, and the loading must be done with reference to this shipment/load ID. The TAS then sends back the loading data (IDoc OILLDD); mainly, the OILLDD message contains the reference, the material data, the loaded quantity, the unit, the density and the temperature. The OILLDD IDoc has many fields, but only a few are mandatory. I can send you a list of the mandatory fields if that would be useful.
Then the system posts the load confirmation automatically based on these data. Depending on the set-up, you can have the system post either the load confirmation only, or the load confirmation and the delivery confirmation. Both documents have standard function modules. I strongly recommend using the standard function modules: they work. I have used them on several projects, and all you need are some exits. If you post only the load confirmation, the delivery confirmation can be posted manually. If the customers are invoiced based on arrival quantities, the delivery confirmation can have different quantities (the temperature can differ quite a bit, especially in India). If invoicing is based on loaded quantities, you can post both the load and the delivery confirmation with the same data, or set up the customizing so that the load confirmation ends the process (based on the incoterm).
Before you can do the customizing, you need to check which processes you have. Whether you have a delivered or a pick-up process determines whether you should send a sales order or a shipment to the terminal. Most oil companies have their own fleet and deliver their fuel by truck to the customer; in this case, you need to send a bulk shipment to the terminal. But often the customers also pick up the fuel at the refinery; then you only need a sales order, and a delivery plus goods issue is sufficient. However, if you need to manage the plate number at the refinery, you might want to process both of these variants via a bulk shipment. In all cases, the loadings come back with an OILLDD IDoc.
Depending on the answer to this question, you can set up the TAS relevance customizing. If you choose to send bulk shipments, you set up TAS relevance for shipments. This simply defines in which cases a load ID is required in the shipment. Say you have two refineries, represented by the transportation planning points Ref1 and Ref2. If only one refinery will have an automatic interface, you set up only that one to require a load ID. If you have several product groups (say bitumen and white fuels), some of which are automated (typically fuel) and others not (bitumen), and you have set them up as bulk shipment types, then you can differentiate further. You then need one load-ID type for each process variation.
I like to use the following:
PU-ORDER (you send an order and the tas posts the delivery and goods issue)
SH-LC (you send a bulk shipment and the TAS posts the load confirmation only)
SH-LCDC (you send a bulk shipment and the TAS posts both load and delivery confirmation)
SH-DC (the load confirmed shipment is sent to the arrival terminal and the arrival terminal sends back the delivery confirmation)
The most commonly used is the SH-LC.
After that, give the control structures and the load-ID function groups the same names (the types are A for pick-up, B for load confirmation and E for delivery confirmation). It makes the whole thing much clearer. Then you just have to assign the standard inbound function modules to each function group:
the standard load confirmation for SH-LC, for example, and the standard delivery confirmation for SH-DC. I hope this helps.
Regards, Petra

Similar Messages

  • Ways of creating contract and process flow of contracts

    hi friends,
    what are the ways of creating a contract, and what is the process flow of contracts?
    thanks for your help
    regards
    krishna

    hi,
    In the MM Purchasing component, a contract is a type of outline purchase agreement against which release orders (releases) can be issued for agreed materials or services as and when required during a certain overall time-frame.
    Contracts can take the following forms:
    Quantity contracts
    Use this type of contract if the total quantity to be ordered during the validity period of the contract is known in advance. The contract is regarded as fulfilled when release orders totaling a given quantity have been issued.
    Value contracts
    Use this type of contract if the total value of all release orders issued against the contract is not to exceed a certain predefined value. The contract is regarded as fulfilled when release orders totaling a given value have been issued.
    You can also set up corporate buying contracts with your vendors. These are valid for all plants and company codes within a client (see Centrally Agreed Contract).
    You can create a contract as follows:
    Manually
    You enter all data relating to the contract manually.
    Using the referencing technique
    As reference document (the document you copy from), you can use:
    Purchase requisitions
    RFQs/quotations
    Other contracts
    CREATION OF CONTRACT MANUALLY:
    Choose Outline agreement --> Contract --> Create (ME31K)
    The initial screen appears.
    Enter the necessary data. If you make any specifications under the group heading Default data, this data will appear as the default data in each item.
    In the Agreement type field, specify whether you are creating a quantity or value contract, for example.
    Press ENTER .
    The header data screen appears.
    Enter the contract validity period. Check the other fields on this screen and make any necessary changes (e.g. the terms of payment) and define the header conditions.
    Press ENTER .
    The item overview screen appears.
    On this screen, enter the information for each item (material number, target quantity, price, receiving plant, or account assignment, etc.) using the same procedure as with purchase orders.
    Material without a master record: leave the field for the material number empty and enter the following:
    – Short description of the relevant material or service in the Short text field
    – Material group to which the material belongs, in the Material group field
    – Account assignment category
    You can enter U (unknown) or the category of an account assignment.
    – The target quantity and the order unit
    If you specify an account assignment category other than U (field A), you must enter the relevant account assignment data for the item. To do so, choose Item -> Account assignments (see also Account Assignment).
    If necessary, review the details for each item. Select the item(s) to review. Then select Item -> Details.
    Enter the desired conditions.
    Enter further text for the item if any additional instructions to the vendor or Goods Receiving are necessary. Choose Item -> Texts -> Text overview.
    Save the contract.
    Hope it helps..
    Regards
    Priyanka.P

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    The Internal Audit dept has requested a bunch of information that we need to compile from the Task Audit, Data Audit and Process Flow History logs. We do have all the info available, however not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
    We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message' button, ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
    For instance, if you wanted to pull Metadata Loads, Rules Loads or Member List Loads, you could run a query like this (NOTE: @strAppName should be set to the name of your application):
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL
    Regarding the activity codes, here's a quick breakdown of those:
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
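    If you move the audit rows to a reporting table as described above, one convenient addition is to load these codes into a small lookup table so reports can join on them instead of hard-coding each activity in the query. This is just a sketch: the table name is made up, and the multi-row VALUES syntax needs SQL Server 2008 or later.
    -- Hypothetical lookup table for the activity codes listed above.
    CREATE TABLE hfm_activity_codes (
        activity_id   int          NOT NULL PRIMARY KEY,
        activity_name nvarchar(50) NOT NULL
    );
    INSERT INTO hfm_activity_codes (activity_id, activity_name)
    VALUES (0, 'Idle'), (1, 'Rules Load'), (21, 'Metadata Load'),
           (23, 'Member List Load');  -- ...and so on for the full list
    -- A report can then join instead of repeating UNION branches:
    -- SELECT au.sUserName, ac.activity_name, ta.strDescription
    --   FROM YourApp_task_audit ta
    --   JOIN hsv_activity_users au ON au.lUserID = ta.ActivityUserID
    --   JOIN hfm_activity_codes ac ON ac.activity_id = ta.ActivityCode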

  • Scheduling Mappings and Process Flows

    Can you please tell me how to schedule mappings and process flows in a little detail, if anyone has done that? I think OEM can be used for this, but I don't have a clear-cut idea of how to do it. Can the mappings/process flows be scheduled without using the OEM console?
    Waiting for your suggestions.
    rgds
    -AP

    Hi
    You can schedule your mappings and process flows with OEM or a database job.
    You will find script templates in the OWB_HOME/owb/rtp/sql directory,
    and you can schedule them.
    If you want to do it in OEM, use the oem_exec_template.sql file, creating an OEM job. Read the OWB documentation about it. If you have any questions about it, ask; I have done this procedure many times.
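    For illustration, here is a hedged sketch of invoking the SQL*Plus execution template directly from the command line. The parameter order and values below are assumptions; the header comments of the template file in OWB_HOME/owb/rtp/sql document the real signature.
    -- Hypothetical invocation of the OWB execution template.
    -- Assumed arguments: runtime repository owner, location name,
    -- task type (PLSQL for a mapping, PROCESS for a process flow),
    -- task name, then "," placeholders for the system and custom
    -- parameter lists.
    sqlplus rt_user/rt_password@orcl @sqlplus_exec_template.sql OWB_RT_OWNER MY_WF_LOCATION PROCESS MYPKG/MY_PF "," ","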
    Ott Karesz
    http://www.trendo-kft.hu

  • OWB : Runtime Repo for Target and Process Flows on Different M/C

    Hi,
    I have an environment where the runtime repositories for the target warehouse and the process flows (workflows) are on different machines. I have deployed:
    * the mappings and target tables on the target warehouse
    * the process flows on the workflow server, which resides on a separate machine.
    When I try executing the process flows for the mappings, it errors out indicating that the objects may not have been deployed. It looks like the process flows cannot see the deployed mappings.
    Can somebody help me here?

    I think the target schema and the runtime repository should be in the same instance.

  • Internal Order process config details and process flow

    Hi All,
    Please help me,
    I need the process flow and configuration for assets that we buy from vendors, store, and sell to our stores.
    How do I configure internal orders step by step and post to Asset Management (instead of normal sales to a revenue account)?
    Thanks and regards

    Dear Ravi,
    1. First create an internal order through KO01. The internal order type should be for investment purposes.
    2. Give the relevant details, go to the investment tab, and enter an investment profile.
    3. Then in the toolbar go to Extras and select "Asset under construction". An asset under construction will be created; give the details of this asset.
    4. Then release the internal order in KO02.
    5. Then go to FB60 and post an invoice with this internal order as the account assignment object.
    6. Go to KO02 and under Extras see the order balance; the amount will be reflected in the order.
    7. Now go to KO88 and settle the internal order, through which your internal order will be credited and the AuC you created will be debited.
    8. Now go to KO02 and see the order balance; it will show 0, as the amount has moved to the AuC.
    9. Now go to the settlement rule, enter the asset to which you want to transfer this amount, take the category FXA, and give the details.
    10. Now go to KO02 and change the status of the internal order to "TECO complete".
    11. Now go to KO88 and execute the order; now the AuC is credited and your asset is debited.
    These are the exhaustive steps which I think will solve your purpose.
    But make sure the configuration of the order type and investment profile is properly maintained.
    Thanks
    sap firdo

  • HFM Process Control and Process Flow History

    Hi,
    When I review the Process Flow History for top parent members of the consolidation, the POV shows <Scenario> <Year> <Period> <Entity> <Entity Currency>. For all other applications and entities, the <Entity Currency> member shows, but in a new application I've built the <Entity Currency> shows as "EUR Total" (EUR is one of the currencies), even though EUR is not the default currency of the application or of the entity's value. Could anyone explain why it would show that way, or where it could be coming from?
    I'm using HFM v11.1.1.3.
    Thanks in advance.

    Did you add new metadata?

  • Linkage and process flow for credit memo

    Hi Experts,
    I am creating an outbound Z function module for consignment settlement. I need to select FRBNR, LIFNR, BUDAT, EBELN, WERKS, LGORT and ERFMG
    based on the object key of the NAST table. Kindly advise the flow and the linkage of the fields for the selection.
    Thanks.
    Stazin.

    Hi Charan,
    Has the vendor sent you a credit memo or a subsequent credit?
    Usually a credit memo refers to a goods return, and a subsequent credit to a prior invoice that was set with a higher price than it should have been (for example).
    So you can simply make a credit memo against the PO with the value/quantity from the vendor's credit paper.
    Go to MIRO and choose transaction 2 Credit memo, insert the original PO, and adjust the value/quantity. After saving the document, the PO, stock and value are updated.
    Regards.

  • Where jobs are stored in database after executing mapping and process flow?

    Hi Gurus,
    I am new to OWB; I would like to know where the OWB jobs are stored in the database. I am planning to execute jobs in the database without using OWB. Could anybody give me the steps to take the code into the database and execute it?
    Thanks & Regards
    Vicky

    Hi Vicky,
    read chapter "3 Using SQL*Plus to Schedule and Execute Jobs" from "Oracle Warehouse Builder API and Scripting Reference":
    http://download.oracle.com/docs/cd/B31080_01/doc/owb.102/b28225/api_4sqlforjobs.htm#i1064950
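    For illustration, a hedged sketch of wrapping the runtime API in a database job so a task runs without the OWB client. The run_task call and its argument order follow the pattern used by the template scripts; treat the signature, names and schema as assumptions and verify them against the chapter above.
    -- Hypothetical sketch: schedule an OWB task from the database itself.
    -- Assumes the job is created under a user with access to the runtime
    -- API, and that wb_rt_api_exec.run_task(location, task_type,
    -- task_name, custom_params, system_params, oem_friendly) exists as
    -- described in the scripting reference above.
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'RUN_MY_MAP_DAILY',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN wb_rt_api_exec.run_task(''MY_LOCATION'', ''PLSQL'', ''MY_MAPPING'', '','', '','', 1); END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY;BYHOUR=4',
        enabled         => TRUE);
    END;
    /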
    Hope this helps,
    Oleg

  • Training and Event management.Process Flow..

    Hi, Experts,
    Can anybody give me some notes on Training and Event Management and its process flow, including integration between ECC 6.0 and non-SAP systems, for:
    1. Training requests
    2. Appraisal documents for performance feedback
    3. Tracking feedback processes
    Please help me in configuring these services.
    With Regards,
    San Rao.

    Hi San Rao,
    As per your reply, you are using the new appraisal template and have already released it. Now simply run the BSP application HAP_DOCUMENT and access the appraisal documents, and
    attach these BSP applications in your portal system.
    The following pages of the above application will be helpful:
    documents_todo
    documents_received_open2
    documents_where_participated
    Regards,
    Umesh Chaudhari.

  • Regarding Performance Tunning of Mapping & Process Flow

    Hi,
    I have around 60-70 GB of data in the target database.
    I need to improve the performance of a mapping and process flow.
    I have used a lookup transformation in the mapping.
    Please give me some tips for improving the performance of the process.
    Thanks,

    Please go through a performance tuning book for Oracle 10gR2.
    Most importantly, remember that in Oracle 10g the performance of your mappings can be increased manifold by following these steps:
    1. Do not design a mapping wherein you load a table and then select from that table to load another table, and so on. This is a bad design.
    2. Keep mappings as simple as possible. In other words, if a mapping is complicated in terms of joins or other operators, split the mapping into more than one part.
    3. Also check that all your source tables are analyzed using DBMS_STATS. Ensuring this one single step can make your work very easy.
    4. Put indexes where you find your predicate has a very high selectivity. Also keep in mind the column ordering of the index.
    5. Use set-based operation, since it is always a good idea to achieve the result by running one single query rather than a loop and multiple inserts.
    6. Use the APPEND PARALLEL hint while loading the target tables. This will not generate any redo and saves time (see the sketch below).
    7. Please recheck when using some performance-intensive operators like UNION, DISTINCT and AGGREGATION.
    8. When using a sequence operator to load a large table, check that the sequence is cached with some values.
    9. When loading large data volumes, hash joins are the most appropriate more often than not, so you can use USE_HASH as the hint for selecting from large tables.
    10. Filter out as much unrequired data from a table as early as possible in the mapping, before doing multiple joins.
    I am sure there are many more ... the above is just a random list that I could remember now. Please go through the Oracle Performance Tuning Guide and Tom Kyte's Expert One-on-One Oracle. Knowledge of performance tuning grows with experience. I am also learning each day!
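    To make tips 6, 9 and 10 concrete, here is a minimal set-based sketch; the table and column names are invented for illustration only.
    -- Hypothetical load: direct-path parallel insert (tip 6), hash-join
    -- hint (tip 9), filtering early in the query (tip 10).
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(tgt, 4) */ INTO sales_fact tgt
           (sale_id, amount, customer_name)
    SELECT /*+ USE_HASH(s c) */ s.sale_id, s.amount, c.customer_name
      FROM sales_stg s
      JOIN customers c ON c.customer_id = s.customer_id
     WHERE s.load_date >= DATE '2012-01-01';
    COMMIT;  -- a direct-path insert must be committed before the table is queried again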
    Regards
    AP

  • FI Functional Audit questionnaire procedure and  process

    Hi Friends
    Please provide me the information or related documentation for the FICO audit questionnaire procedure and process flow
    Thanks in advance
    Srini

    Hello,
    http://www.auditnet.org/sapaudit.htm
    Hope this will be a good resource for you.
    It is a free site providing information on auditing. You can register and go through the materials.
    Regards,
    Ravi

  • ORA-01017 invalid username/password on execution of process flow

    Hi, there are one or two similar issues mentioned in the forum, but none are resolved, I think.
    Basically, we have target repositories set up with a single user (not the schema owner) used for the location credentials, e.g.:
    REP1_LOCATION pointing to schema REP1 but username OWBRT
    REP2_LOCATION pointing to schema REP2 but also username OWBRT
    The user OWBRT is defined as a user within OWB and is also defined as a target repository user.
    The objects deploy OK without errors, but when running any process flow or mapping we get an ORA-01017 invalid username/password.
    If we set the location details to use the schema owner instead of OWBRT, mappings and process flows work for that schema.
    Are there any special privileges/grants that need to be set up for the username attached to a location in order for this to work?
    We tried giving DBA privileges as a test (which you would think would work), but it didn't.
    Thanks
    Paul

    Hi Paul,
    Under what user did you try to start the process flows/mappings? Did you use a configuration with split design and runtime OWB repositories (i.e., different databases for the design and runtime OWB repositories)?
    You should start process flows/mappings under the OWB user (created/registered in the target DB), and
    you need to register the target locations (database and workflow locations) under this OWB user.
    Another variant: you can enable the preference options "Persist location password in metadata" and "Share location password during run time" and then register the locations under any OWB user.
    I had a similar problem and resolved it; also look at this thread:
    Re: Complex condition don't work with variables in OWB 10.2
    Regards,
    Oleg

  • How to specify the Process Flow Module with SQLPLUS_EXEC_TEMPLATE.SQL ?

    Hi, we have a couple of process flow modules that have PF Packages and Process Flows with the same name.
    E.g
    PFMOD1 (Module)
    FILELOAD (Package)
    PF1 (Pf)
    PFMOD2 (Module)
    FILELOAD (Package)
    PF1 (Pf)
    Normally we can specify "FILELOAD/PF1" as a parameter to the SQLPLUS_EXEC_TEMPLATE.SQL script in order to initiate the running of an OWB process flow, but how can the system distinguish between the modules?
    Anyone done this ?
    Thanks
    Paul

    If you deployed the packages to the same location then the second deployment replaced the result of the first one (actually a new version of the process flow was created). You can execute only the latest version...
    Regards,
    Robert

  • How to schedule Process Flow in OWB10gR2 ?

    Hi,
    I have a mapping and a corresponding process flow that contains the mapping. Both the mapping and the process flow are deployed successfully, and they also execute successfully when run from the Control Center.
    Now I want to run the process flow daily at 4 am. I have created a schedule and defined the parameters. Then, for the process flow, I added the schedule as the "Referred Calendar". In the Control Center I then got a new object with a _JOB suffix. I deployed that as well, and that was successful. This implies that I now have a job scheduled to run daily at 4 am.
    But the next day, when I checked the target table that the mapping (wrapped in the process flow, which in turn was scheduled to run at 4 am) was supposed to insert records into, there were no records at all! It seemed to me that the job was not kicked off at all at the specified time.
    I am using --
    OWB client version: 10.2.0.1.31
    OWB repository version: 10.2.0.1.0
    Oracle Workflow version: 2.6.4.0.0
    Database version: 10g Enterprise Edition release 10.2.0.1.0
    Does anyone have any idea how I can get this arrangement running?
    Regards,
    Swagata
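    A hedged first check for this kind of problem, assuming the deployed _JOB object ends up as a DBMS_SCHEDULER job in the target schema (the name pattern below is a guess based on the _JOB suffix mentioned above):
    -- Hypothetical diagnostic: confirm the job exists, is enabled, and
    -- has a sensible next run date. The underscore is escaped because it
    -- is a wildcard in LIKE.
    SELECT job_name, enabled, state, last_start_date, next_run_date
      FROM all_scheduler_jobs
     WHERE job_name LIKE '%\_JOB' ESCAPE '\';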

    Chino -
    Once you start the job, do you have to leave the "Job Details" window open in order for the scheduled task to run?
    thanks for the help ....
    txb
