Question on Process Based EBS

Hi,
Does a process-based EBS always need to call an EBF, or can it call a Provider ABCS directly? I would appreciate your inputs.
Thanks/Steve

My use case is that I need to develop a custom EBS to handle some requirements between Siebel and OTM. Since the new operations are not CRUD, the custom EBS would be a process-based EBS (please correct me if I am wrong).
When developing the provider service to interact with OTM, I am confused about whether to call it a Provider ABCS or an EBF. In the provider service, I just need to make a couple of calls to OTM and send the response back to Siebel.
To me there is not much orchestration in the provider service, so I could name it a Provider ABCS, but I read somewhere that a process-based EBS should always call an EBF, so I am somewhat confused. Please let me know your thoughts.

Similar Messages

  • Interview questions in process chain

    hi 
       can anyone send me possible interview questions on process chains and errors, with answers?
    thanks in advance
    pradeep

    Hi Pradeep
    1.Procedure for repeat delta?
    You need to set the request status to Red in the monitor screen and then delete the request from the ODS/Cube. When you open the InfoPackage again, the system will prompt you for a repeat delta.
    also.....
    Go to RSA7 -> F2 -> Update Mode -> Delta Repetition.
    Delta repetition is done based on the type of upload you are carrying out.
    1. If you are loading master data, most of the time you will change the QM status to red and then repeat the delta; the repeat delta is allowed only after you make that change.
    Sometimes the repeat delta is not allowed even after the QM status is set to red, and you need to investigate further.
    If this is not the case, the source system and therefore also the extractor, have not yet received any information regarding the last delta and you must set the request to GREEN in the monitor using a QM action.
    The system then requests a delta again since the last delta request has not yet occurred for the extractor.
    Afterwards, you must reset the old request that you previously set to GREEN to RED since it was incorrect and it would otherwise be requested as a data target by an ODS.
    Caution: If the terminated request was a REPEAT request itself, always set this to RED so that the system tries to carry out a repeat again.
    To determine whether a delta or a repeat are to be requested, the system ONLY uses the status of the monitor.
    It is irrelevant whether the request is updated in a data target somewhere.
    When activating requests in an ODS, the system checks delta repeat requests for completeness and the correct sequence.
    Each green delta/repeat request in the monitor that came from the same DataSource/source system combination must be updated in the ODS before activation, which means that in this case, you must set them back to RED in the monitor using a QM action when using the solution described above.
    If the source of the data is a DataMart, it is not just the DELTARNR field that is relevant (in the roosprmsc table in the system in which the source DataMart is, which is usually your BW system since it is a Myself extraction in this case), rather the status of the request tabstrip control is relevant as well.
    Therefore, after the last delta request has terminated, go to the administration of your data source and check whether the DataMart indicator is set for the request that you wanted to update last.
    If this is NOT the case, you must NOT request a repeat since the system would also retransfer the data of the last delta but one.
    This means, you must NOT start a delta InfoPackage which then would request a repeat because the monitor is still RED. For information about how to correct this problem, refer to the following section.
    For more information about this, see also Note 873401.
    Proceed as follows:
    Delete the rest of this request from ALL updated data targets, set the terminated request to GREEN IN THE MONITOR and request a new DELTA.
    Only if the DataMart indicator is set does the system carry out a repeat correctly and transfers only this data again.
    This means, that only in this case can you leave the monitor status as it is and restart the delta InfoPackage. Then this creates a repeat request
    In addition, you can generally also reset the DATAMART indicator and then work using a delta request after you have set the incorrect request to GREEN in the monitor.
    Simply start the delta InfoPackage after you have reset the DATAMART indicator AND after you have set the last request that was terminated to GREEN in the monitor.
    After the delta request has been carried out successfully, remember to reset the old incorrect request to RED since otherwise the problems mentioned above will occur when you activate the data in a target ODS.
    What is process chain and how you used it?
    A) Process chains are a tool available in BW for automating the upload of master data and transaction data while taking care of the dependencies between processes.
    B) In one of our scenario we wanted to upload wholesale price infoobject which will have wholesale price for all the material. Then we wanted to load transaction data. While loading transaction data to populate wholesale price, there was a look up in the update rule on this InfoObject masterdata table. This dependency of first uploading masterdata and then uploading transaction data was done through the process chain.
    What is process chain and how you used it?
    A) We have used process chains to automate the delta loading process. Once you are finished with your design and testing you can automate the processes listed in RSPC. I have a real time example in the attachment.
    for more detail
    Collecting Process Chain Statistics
    Advice regarding process chains
    creation of process chains
    Message was edited by:
            Kalpana M

  • Questions on process chains

    Hi all,
             I have two questions on Process chains.
    1. I created a process chain for master data and a process chain for data loads (with ODS and cube). As the master data has to be loaded before the transaction data, can I include the master data process chain as a local process chain right after the start process, so that the system proceeds to the transaction data only after the master data is loaded?
    2. I designed a process chain with aggregation and compression of cube data. I forgot to check the option to delete overlapping requests in the InfoPackage that loads the InfoCube. I ran the process chain, and now there are 2 requests and duplicate documents. I want to delete the recent request, check the delete-overlapping-requests option in the InfoPackage, and rerun. The problem is that the system is not allowing me to delete the recent request.
    I appreciate any kind of help.
    Thanks.

    Hi Bhanu,
                Thanks for the reply. I am scheduling the request for deletion in Manage InfoCube, but the request is not getting deleted. Every time I click refresh, the delete icon disappears and the screen becomes as it was before scheduling the deletion. I checked after some time, and it is still the same.
    I wonder whether collapsed requests cannot be deleted?

  • Revenue & COGS recognition process based on Proof Of Delivery (POD) & Inco-Terms

    Hello Experts,
    I have a scenario and I am trying to find the proper solution for my client; I hope you can help me with this.
    Requirement: Revenue & COGS recognition process based on Proof Of Delivery (POD) & Inco-Terms.
    Steps: At present the system posts COGS at the time of the Outbound Delivery and Revenue at the time of the commercial invoice.
    (1) Current Requirement:
    If the Inco-Term is other than Ex-works, then both COGS and Revenue are to be posted (recognized) based on the POD.
    If the Inco-Term is Ex-works, then the system should post COGS at the time of the Outbound Delivery, and Revenue postings should happen once the Commercial Invoice is authorized.
    (2) The following postings should happen sequentially if the Inco-Term is other than Ex-works:
    a. When the Outbound Delivery is authorized, the system should operate the Stock in Transit account for customers instead of the COGS account, as mentioned below:
                     Dr. Stock in Transit-Customers
                     Cr. Respective Stock Account
    Note: We understand that this provision is not available in SAP at this point in time, and this entry should not be considered in costing for COPA.
    b. Based on the Outbound Delivery, a provisional invoice (without postings) has to be generated and the Excise Duty entry has to be posted as mentioned below:
                     Dr. Duty Paid Account
                     Cr. 23A/23C/PLA
    c. Once the POD is received and recorded in the system, the respective Outbound Delivery accounting entry has to be reversed as mentioned below:
                     Dr. COGS
                     Cr. Stock in Transit-Customers
    Note: This entry should be considered in costing for COPA.
    d. Once the outbound delivery entry (above) is reversed, the system should generate the Sales Invoice based on the POD with the entries mentioned below:
                     Dr. Customer
                     Cr. Revenue & Payable accounts
    Note: We understand that this provision is available in SAP.
    Thanks
    Rahul

    Hello Karl,
    The Best Practices document in the Note 1172799 provides details regarding all the SAP supported processes for RR functionality.
    Regards,
    Raghavendra

  • Call a process based on the click of a javascript confirm popup box

    I have created a function to create a javascript confirm popup box which calls an update process called Reactivate_save(), see below:
    function reactivate_save() {
        var r = confirm("Do you wish to save pending changes?");
        if (r == true) {
            document.getElementById('Reactivate_Save').call();
        }
    }
    I want to make the update process conditional on clicking the 'ok' button inside the popup box.....Is this possible?
    I thought that I could reference it by using:
    value in expression 1 = expression 2
    reactivate_save() = true or 1
    Neither of these worked and wondering if there is something else that I can use?
    Thanks,
    Chris

    Hi,
    Your function is in JavaScript while the process is PL/SQL. What you need to do is something like this:
    function reactivate_save() {
        var r = confirm("Do you wish to save pending changes?");
        if (r == true) {
            document.getElementById('Reactivate_Save').call(); // not sure what this does so left it as it is
            doSubmit('MY_REQUEST');
        }
    }
    You can now use the 'MY_REQUEST' request, or whatever else you choose to call it, in the process condition using
    1. the 'Request = Expression 1' condition type, entering MY_REQUEST in Expression 1
    or
    2. the 'PL/SQL Expression' type with :REQUEST = 'MY_REQUEST' in Expression 1.
    Note: In APEX 3 and below you need to add a semicolon at the end of PL/SQL expressions.
    Regards,
    PS : Noticed that this is the same as call a process based on the click of a javascript confirm popup box
    Edited by: Prabodh on Sep 28, 2010 9:05 PM

  • Questions on Custom based queuing

    Hi everybody
    I have a question on custom based queuing.
    Please consider the following example
    q1   400 bytes
    q2   300 bytes
    then  a default queue.
    This is what i understand ( based on this following link)
    http://www.cisco.com/en/US/docs/ios/12_2/qos/configuration/guide/qcfcq_ps1835_TSD_Products_Configuration_Guide_Chapter.html
    q1 will be emptied until its byte counter reaches zero,
    q2 will be emptied until its byte counter reaches zero,
    then the default queue. How long will the default queue be served before the scheduler goes back to q1, since we are not defining any byte count for the default queue? For q1, we know the scheduler will empty the queue until the byte counter reaches zero, but since we do not have any byte counter associated with the default queue, how does the scheduler know it is time to stop serving the default queue and go back to queue 1, and so on?
    Thanks and have a great weekend.
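    To make the byte-count scheduling concrete, here is a rough simulation of one custom-queueing cycle, sketched in Python purely for illustration (not Cisco's implementation). As I recall, in IOS custom queueing a queue with no explicit byte count uses a default of 1500 bytes, which is how the scheduler knows when to leave the default queue and return to q1; the queue contents below are made-up examples.

```python
from collections import deque

def serve_round(queues, byte_counts, default_byte_count=1500):
    """One scheduler cycle: drain each queue until its byte budget
    is exhausted, then move to the next queue (custom-queueing model)."""
    sent = []
    for name, q in queues.items():
        budget = byte_counts.get(name, default_byte_count)
        while q and budget > 0:
            pkt = q.popleft()          # packet size in bytes
            budget -= pkt              # a packet in flight counts fully,
            sent.append((name, pkt))   # even if it overshoots the budget
    return sent

queues = {
    "q1": deque([200, 200, 200]),
    "q2": deque([300]),
    "default": deque([800, 800, 800]),
}
byte_counts = {"q1": 400, "q2": 300}   # default queue: 1500-byte default
round1 = serve_round(queues, byte_counts)
```

    With these example sizes, one cycle sends two packets from q1 (400-byte budget), one from q2, and two from the default queue before its 1500-byte default budget runs out and q1 is served again.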

    Hi Dinesh,
    To try to provide some answers to your questions:
    1. A regular role determines what a user can do in Studio as a whole. An application role determines what a user can do within the context of a specific application.
    For example, a user with the Power User role for Studio can create applications, and automatically has an Application Administrator role for the applications they create. They can also grant Application Member or Application Administrator roles for those applications to other users. They cannot automatically view or edit applications created by other users, unless granted an application role by an Application Administrator for those applications.
    Users with only the User role for Studio cannot create applications (as of 3.1), but may be granted an Application Member or Administrator role for specific applications.
    Neither a Power User nor a User has access to the Control Panel.
    Users with the Administrator role for Studio have access to the Control Panel, and unrestricted access to applications.
    For information on user roles in Studio, see About user roles.
    For information on assigning application roles, see Managing membership in an application.
    2. We don't really support component-specific permissions - the options are deprecated and shouldn't be used, which is why they are not documented. The Configuration option is the only option available for a component.
    3. Add Discussion is also a deprecated function that is not supported.

  • What is the process for EBS when customer directly deposits the payment in

    Hi,
    Please tell me the process for EBS when customer deposits the payment in bank
    1) via Cheque
    2) via Bank Transfer
    Can we clear the open items automatically if yes, then how please tell me how to map it so that it can be cleared when we have EBS?
    Thanks and Regards
    Nitin

    Hi Expert,
    Please select the standard algorithm 001 and don't change any other settings in EBS. Just configure the below steps:
    1. Delete Bank Statement Test Data in T-Code: SE38
    PROGRAM - RFEBKA96
    Execute
    On the Delete Buffer screen, enter the following data:
    Field name     User action and values
    ANWND          0001
    Choose Execute to continue.
    On the Delete Buffer screen, select the bank statement files to be deleted and choose the Delete statements button.
    By this Step the Result is:
    The bank statement previously created is deleted, thus allowing you to re-create the current day's bank statement.
    2. Create BAI File
    Use this step, you create the bank statement input file.
    Prerequisites: You must delete the existing bank statement file to create a new bank statement.
    Procedure
    Go to T-Code:SE38
    Program - RFEBKAT5     
    Choose  Execute to continue.
    On the General test data for BAI bank statement and create open items screen, enter the necessary data.
    Description                             User action and values                                               Comment
    EOD                                             Select     
    EOD File name                             RECON1     
    Company code                        XXXX     
    House Bank                             ABC     
    House Bank Account            1234XXXXX     
    Posting Offset Account            111000(Provide Offset Account)
    Statement date                    Yesterday's date     
    Invoice date                            Yesterday's date     
    Generate items                    Select this field     
    Open items                            2(Provide the Open Items)
    Last w/diff                           Select this field     
    Extrn/Trns                           165     
    Amount                                   1000     
    Customer                                   XYZ(Provide the Customer Account)     
    Increase by                           50     
    Document type                   DR     
    With bank details                   Select this field     
    Debit posting key                   01     
    With ref. data                            Select this field     
    Credit posting key                   50     
    XBLNR                                   Select this field      
    Generate items                   Select this field     
    Debit posting key                   40     
    Document type                   SA     
    Credit posting key                   50     
    Specific                                    Select this button     
    Checks Out                           575 ++++++++07 58.5!                                       Exclamation mark goes in 2nd column
    Funds Out                           495 ++++++++01 1500 200200665757699     
    Funds In                                   398 ++++++++08 150 BANK CHARGE     
    Choose Execute  to continue.
    Result
    The BAI file is displayed.
    3. Execute Bank Statement Reconciliation Program
    T-Code: FF.5
    Field name      Description                                                             User action and values     
                             Import data                                                             Select     
                          Workstation upload                                                  Deselect     
    FEBFORMAT     Elect. bank statement format                                  A     
    FEBAUSZF     Statement file                                                          RECON1     
    FEBFILTER2     XBLNR number interval                                          199900000000000 to 200099999999999     
                         Print bank statement (Output controls tab)          Select     
                         Print posting log(Output controls tab)                  Select     
                         Print statistics(Output controls tab)                          Select     
    Choose Execute  to continue.
    Result:The bank reconciliation program RFEBKA00 uploads the BAI file created in the previous step. As a result, the open items created in the previous step, have been cleared. You can display the journal entries of these postings by using transaction code FB03.
    Regards,
    GK
    SAP
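    For reference, the "Checks Out" / "Funds Out" / "Funds In" test lines above end up as transaction detail records in the generated BAI statement that RFEBKA00 imports. A minimal parser for one such record is sketched below in Python, purely to illustrate the record shape; the field order (record code, type code, amount in minor units, funds type, bank reference, customer reference, free text) follows my reading of the BAI2 layout and the sample values are invented, so verify against your bank's format guide.

```python
def parse_bai2_detail(record):
    """Parse a single BAI2 type-16 transaction detail record
    (assumed field order; continuation '88' records not handled)."""
    fields = record.rstrip("/").split(",")
    if fields[0] != "16":
        raise ValueError("not a transaction detail record: " + record)
    return {
        "type_code": fields[1],           # e.g. '165' as used above
        "amount": int(fields[2]),         # amount with implied decimals
        "funds_type": fields[3],
        "bank_ref": fields[4],
        "customer_ref": fields[5],
        "text": ",".join(fields[6:]),     # free text may contain commas
    }

rec = parse_bai2_detail("16,165,150000,Z,REF001,INV4711,PAYMENT RECEIVED/")
```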

  • Login Process in EBS

    Can anybody explain the login process in EBS?
    How does the authentication process happen, and which files in COMMON_TOP are referred to?
    What is the role of APPLSYSPUB and GUEST during login? Are both accounts used for authentication?

    Hi,
    Can anybody explain the login process in EBS? How does the authentication process happen, and which files in COMMON_TOP are referred to?
    Have a look at the following link:
    Apps 11i login and logout flow
    http://oracleappstechnology.blogspot.com/2007/08/apps-11i-login-flow.html
    What is the role of APPLSYSPUB and GUEST during login? Are both accounts used for authentication?
    This topic was discussed in previous threads, please have a look:
    applsyspub
    applsyspub
    guest account
    guest account
    Regards,
    Hussein

  • File Channel sequential processing based on file names

    Hi,
    I have a requirement in a sender file channel.
    There are multiple files with different file names, and I want these files to be processed in sequence based on the file name.
    For example:
    The files, let's say, are A3_1, A2, A1, A3_2.
    I want to process A1, then A2, and then A3.
    Please suggest methods to implement this. I think an adapter module should help; please throw some light on the same.
    Regards,
    Varun.

    Varun,
    In that case I think you need to consider BPM. The file adapter does not support 'advanced' techniques for the sequence in which it polls files.
    The process would then be:
    - Use the adapter-specific identifiers for the file adapter to include the file name (/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions)
    - Create a message mapping that puts the file name from the dynamic configuration into a field in the message type
    - Create an integration process that receives the files from this adapter inside a loop step (/people/daniel.graversen/blog/2006/09/07/using-a-bpm-to-collect-messages-for-a-set-interval-of-time)
    - You can now create an interface mapping that sorts the messages according to their filename and then send the messages to the receiver one by one (EOIO?).
    Kind regards,
    Koen
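    The sorting step Koen describes (inside the BPM, before sending the messages on) essentially needs a comparator on the file names. A minimal sketch of that ordering logic, written here in Python purely for illustration; the name pattern `A<seq>_<part>` is an assumption taken from the example above.

```python
import re

def processing_order(filenames):
    """Order files like A1, A2, A3_1, A3_2 by their numeric sequence,
    with split files (_1, _2, ...) kept together in part order."""
    def key(name):
        # assumed pattern: letters + sequence number + optional _part
        m = re.match(r"([A-Za-z]+)(\d+)(?:_(\d+))?$", name)
        if not m:
            return (name, 0, 0)   # unknown names sort by raw string
        base, seq, part = m.groups()
        return (base, int(seq), int(part or 0))
    return sorted(filenames, key=key)

order = processing_order(["A3_1", "A2", "A1", "A3_2"])
```

    With the example names this yields A1, A2, A3_1, A3_2, which matches the requested processing sequence.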

  • Credit Management Process based on Terms of Payment.

    Hi All,
    My client wants to use the SAP Credit Management process. SAP is currently being used, but without Credit Management. As per my initial discussion I have enclosed the requirements below. Your kind help will be highly appreciated.
    1.     The Process has to be implemented in a 2 Step Process.
    2.     The Initial Requirement is based on Terms of Payment. If the Number of Days is Overdue for the Invoice for a particular Customer, the system should issue a warning message (in the Form of Mail or any other way) which can be shown as evidence to the Customer for collection of Payment so that further Sales Transactions can take place.
    (Is there the standard Process in Credit Management based on Terms of Payment)
    3.     In the next step a value-based credit check has to be implemented. The value will be set for a particular customer in the credit master. When a sales order is raised and the credit limit is exceeded, the system should issue a warning message. Based on the warning message issued, an email should be triggered to the responsible person in Finance indicating that the credit limit for that customer has been exceeded and needs to be increased.
    Thanks & Regards,
    Sam.

    Dear Mr F Farooq,
    1. The Process has to be implemented in a 2 Step Process.
    Do the following configurations for credit management :
    Refer the following link.
    http://wiki.sdn.sap.com/wiki/display/ERPLO/CreditManagementConfigaration
    2. The Initial Requirement is based on Terms of Payment. If the Number of Days is Overdue for the Invoice for a particular Customer, the system should issue a warning message (in the Form of Mail or any other way) which can be shown as evidence to the Customer for collection of Payment so that further Sales Transactions can take place.
    (Is there the standard Process in Credit Management based on Terms of Payment)
    Now for this requirement: since different billing documents for the same customer can have different payment terms (which means different billing documents will be due on different dates), please make the following change in OVA8:
    Check 'oldest open item'.
    Now follow my reply in the below thread.
    Sales order to be blocked based on customer payment terms and credit limit
    You can use WORKFLOW to trigger mail when such a message is triggered in the sales order , due to credit check.
    Ask your ABAP-er for details on workflow.
    3. In the next step a value-based credit check has to be implemented. The value will be set for a particular customer in the credit master. When a sales order is raised and the credit limit is exceeded, the system should issue a warning message. Based on the warning message issued, an email should be triggered to the responsible person in Finance indicating that the credit limit for that customer has been exceeded and needs to be increased.
    Just configure according to the thread given for 1st POINT.
    CHECK FOR DYNAMIC and set reaction as 'C' and check STATUS /BLOCK.
    Maintain the credit limit in the FD32.
    Now when the sales order value exceeds the credit limit , a message will be triggered.
    Use the concept of WORKFLOW to create mail based on this message .
    When the user gets mail, he can maintain the new credit limit in FD32 and release the document for delivery/billing in VKM3.
    Revert if there are any issues.
    Thanks & Regards,
    Hegal K Charles
    Edited by: Hegal . K . Charles on Aug 7, 2011 1:19 AM

  • Questions on Rules-Based ATP and Purchase Requisitions for STOs

    Hello experts,
    We are working on rules-based ATP configuration and have several questions about the functionality. I'm hoping that some of you are using this functionality and can help give us direction.
    In our environment we have multiple distribution centers and multiple manufacturing plants. We want to confirm sales orders against stock and production orders in any of those plants, depending on the locations that have stock or planned production. For example, we will place a sales order against plant A. If there is not enough stock in plant A then rules-based ATP will use location determination to check in plant B, then C. The scope of check on the ATP check will include stock and released production orders. We will configure plant A as the "consolidation location" so if stock is found in plants B or C then stock transport orders will automatically be created to move the stock to plant A before shipping to the customer.
    We have configured rules-based ATP and this functionality is working well in our Development system. The ATP check is executed and uses the rules-based ATP to find eligible stock in other plants. The system is also creating purchase requisitions to move the stock to the consolidation plant.
    Our first concern is that there doesn't appear to be any firm linkage between the sales order and the resulting purchase requisition. For example, if we create sales order 123 for plant A and the rules-based ATP finds stock in plant B, it automatically creates a purchase requisition 987 to move the stock from plant B to plant A. However, there doesn't appear to be a linkage between sales order 123 and purchase requisition 987. For instance, if we delete sales order 123 the purchase requisition doesn't get deleted.
    Our second concern is that the quantity on the purchase requisition can still be confirmed against later sales orders. For example, say the above scenario resulted in a purchase requisition 987 that consumed all the stock available in plant B. We then create a second sales order 456 for the same product. Plant A is out of stock so the rules-based ATP looks in plant B. We would expect that plant B would also not have any stock because it's all been consumed by the purchase requisition. Instead, the system creates a second purchase requisition to move quantity from plant B to plant A. It's as if the system doesn't realize that purchase requisition 987 is already planning to move stock out of plant B.
    Does anyone have any thoughts or suggestions on these two scenarios? Is there a way to configure the system so there is a hard linkage between the sales order and the purchase requisition, so that if the sales order is deleted then the purchase requisition is also deleted? Should ATP realize that purchase requisitions are consuming inventory and not allow later sales orders to confirm against that same inventory? Any advice or experience would be greatly appreciated.
    Thanks,
    David Eady
    Application Delivery Team Lead
    Propex, Inc.

    Hi,
    The scheduling is done in SCM, and whenever the RBA is triggered, the calculation is always done with the old route in SCM. The route is only determined when you get back to R/3, but the ATP check always uses the original route. So the idea would be to change the values of the route while still in APO; this is possible via the user exit, and should be done in scheduling in APO.
    Hope this information is helpful.
    Regards,
    Tibor

  • How to stop message processing based on validation?

    Hello experts,
    I have a requirement to stop message processing in the graphical mapping based on validation results. Here is the scenario - messages are translated using graphical mapping and sent to the target system. An RFC lookup will be done to ECC to determine if the data in the message is good. If the lookup returns a negative result, message processing should be stopped right there.
    I guess we can throw an exception from the mapping to force a failure and stop further processing, but that will cause the message to show up as failed in SXMB_MONI and cause alert emails to be sent out in PROD. Another option would be to suppress creation of the root node itself, but I think the message would then fail in the subsequent "call adapter" step if the target schema has a minimum occurrence of 1 for the root node (as in the case of IDocs).
    Is it possible to do it without using BPM?
    Thanks,
    Michelle

    Hi Michelle,
       Is your requirement to stop message processing without sending an alert?
    If yes, then you can have an alert rule that does not trigger alerts on a failure (and raise an exception based on the result of the RFC lookup).
    If your requirement is for the message not to fail at all, then you have to go the ccBPM route.
    Best Regards,
    Ravikanth Talagana

  • BPM - triggering process based on Transport Acknowledgement

    hello everyone, have a theoretical type question i would like some advice on.
    scenario --> IDoc from R/3, transformed and sent to the file adapter. In the BPM, I have defined the send step to the file adapter as asynchronous, but have specified an acknowledgement type of 'transport' to be returned.
    As the step is asynchronous and is thus persisted, I cannot trigger an exception if the send does not occur. Question: is it possible to make subsequent steps dependent on the transport acknowledgement being successfully returned from the send step?
    thanks in advance for any light you can shed on this topic!
    /david

    hi,
    we can have the header file as header and the item file as item. Basically, based on our convenience we can suggest the file name to the sender, so the scheme can be anything that fits our requirement.

  • Sender adapter processing based on done file content

    hi
    the sender system creates a done file after creating the actual file. SAP PO 7.4 first needs to read the done file for the list of files to be processed from the same folder.
    for eg:
    source folder files:
    xxx11092014.xml
    yyy11092014.xml
    zzz10092014.xml
    done11092014.xml
    content of done file:
    <files>
    xxx11092014.xml
    zzz10092014.xml
    <files>
    In SAP PO 7.4, we first need to read the done file; based on its content, the files xxx11092014.xml and zzz10092014.xml need to be processed in SAP PO, and
    all three files (xxx11092014.xml, zzz10092014.xml and done11092014.xml) need to be deleted, leaving behind yyy11092014.xml, for which a new done file will be created.
    Could anyone please throw some light on how to implement this scenario?

    Hi,
    Please check whether the below steps help with your requirement, by creating two flows and additional files.
    Interface 1: This interface reads the file names from done*.xml and generates additional files with the same name but a different extension, like xxx11092014.txt and zzz10092014.txt, for interface 2 to pick up.
    1. Configure your sender CC to pick file done*.xml
    2. Use multimapping to create multiple messages based on how many files under <files>
    3. Create additional files in receiver file adapter with same name with different extension by using variable substitution.
    One IDOC to Multiple Files sending to Multiple folders of the FTP using single Communication Channel (SAP XI-PI Process …
    Interface 2: This interface picks up only the files that have a matching additional file with the same name part.
    1. Configure your sender CC to pick files having additional files by enabling Additional Files.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/6d967fbc-0a01-0010-4fb4-91c6d38c5816?QuickLink=index&…
    Regards,
    Praveen
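    Outside of the PO channel configuration, the core done-file logic described in this thread can be sketched in plain Python (the `<files>` wrapper handling and the file names follow the example above; real message processing would replace the placeholder append):

```python
import os

def parse_done_file(content):
    """Return the file names listed between the <files> markers."""
    return [ln.strip() for ln in content.splitlines()
            if ln.strip() and not ln.strip().startswith("<")]

def process_done_file(folder, done_name):
    """Process only the files listed in the done file, then delete
    them together with the done file itself; unlisted files remain."""
    done_path = os.path.join(folder, done_name)
    with open(done_path) as f:
        listed = parse_done_file(f.read())
    processed = []
    for name in listed:
        path = os.path.join(folder, name)
        if os.path.exists(path):
            processed.append(name)   # actual message processing goes here
            os.remove(path)
    os.remove(done_path)             # remove the done file last
    return processed
```

    With the folder contents from the example, this processes and deletes xxx11092014.xml and zzz10092014.xml plus done11092014.xml, and leaves yyy11092014.xml behind for the next done file.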

  • Intra date file processing in EBS

    Hi,
    While processing the intraday file, we are unable to post the opening balance (BAI code 040) in EBS processing. I have configured the 040 external transaction and activated process type 21, but we are still unable to post the opening balance with the 040 value.

    Hi Bhavesh,
    I am using NFS with "By date".
    As Lalit suggested, I set the processing mode to Archive and gave the queue name "Temp". In the archive folder I am able to see the sequence in which the files are getting processed, and it meets my requirement.
    Now, can I view the same file processing in the "Temp" queue? For this I think I need to configure the queues.
    Please guide me on how to configure the queues and view the files in this queue "Temp".
    Thanks,
    anil.
