Parallel processing of files through threads

Greetings for the day guys.
How is parallel processing of files achieved through threads in Java?
Scenario:
Let's say I have a folder containing 50 files. I want to create a thread pool of size 10, and each thread will pick up files one by one and process them simultaneously.
Please provide some hints on how to achieve this functionality ...
Thanks, Good day.

What have you tried so far? This isn't a free coding service.
But don't imagine that processing files with 50 threads is going to be 50 times as fast as processing them with one thread. The disk isn't multi-threaded.
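A minimal sketch of the thread-pool approach described above, using java.util.concurrent.ExecutorService with a fixed pool of 10 worker threads (the folder path and the per-file work in processFile are placeholders):

import java.io.File;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class FolderProcessor {

    public static void main(String[] args) throws InterruptedException {
        File folder = new File("/path/to/folder");   // hypothetical folder with the 50 files
        File[] files = folder.listFiles();
        if (files == null) {
            return;                                   // not a directory or an I/O error
        }

        // Pool of 10 worker threads; each submitted task processes one file
        ExecutorService pool = Executors.newFixedThreadPool(10);
        for (File file : files) {
            pool.submit(() -> processFile(file));
        }

        // Stop accepting new tasks and wait for the queued ones to finish
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    private static void processFile(File file) {
        // Placeholder for the real per-file work
        System.out.println(Thread.currentThread().getName() + " processing " + file.getName());
    }
}

The pool's internal work queue hands the 50 files out to the 10 threads as they become free, so no file is processed twice; as noted above, the speed-up is limited by how much of the work is CPU-bound rather than disk-bound.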

Similar Messages

Process .bat files through the Terminal window?

    Hi,
I have a large number of .bat files which have been generated from rendering out of Maya (a 3D application) on a PC.
Usually I would process the .bat files from within Maya's script editor, which would generate the Photoshop images.
The downside is that it ties my PC up and I can't really do anything while these are processing, and it takes quite a while.
I have a Mac which doesn't have Maya... does anybody know if I can process .bat files through the Terminal window? The problem is there are well over 10,000 images to process.
If it is possible to do this, is there a way I can automate this process from the Terminal for all the files?
    Any help would be much appreciated
    Thanks
    Wayne

    No problem, Kurt. There are several possible solutions for the OP (who seems to have disappeared!):
    You mentioned already running Windows on the Mac. I have found Parallels great, so it should be as simple as setting up his virtual machine, installing his app and copying over the batch files. But he would need to purchase Parallels and have a Windows licence.
    Unix has a much more sophisticated range of scripting facilities than the old DOS batch files, and there are free tools that do a lot of graphics processing.
AppleScript can be used to control Graphic Converter (amongst others) - but the OP refers to rendering, and I don't know if GC can do that (in fact I am fairly sure it can't).
So, to advise wr_uk, we really need to know what those .bat files are doing!
    AK

How Distiller processes PS files through watched folders

    We had an issue with Acrobat Distiller 9.
We usually create a watched folder for every project in Distiller. A single Distiller instance has more than 30 watched folders.
I just wanted to check how Distiller watched folders work:
1. Does Distiller pick up PS files in alphabetical order?
2. Does Distiller pick up PS files by priority as they arrive in the folder?
3. Does Distiller pick up PS files in time order?
In our experience the PS files were picked up in alphabetical order. Is this right?
I need to understand whether the watched folder should pick up the files based on time.
    Kindly clarify.
    JA

    I'm assuming this isn't the discontinued Distiller Server product...
Distiller is for low-volume personal (single-user) use, so Adobe has no documentation and no interest in tuning the performance of multiple queues.
Since each end user MUST have an Acrobat license, you might as well set up individual queues on their own PCs.

Parallel process (ora_pnnn)

Looking for background processes with ps -ef|grep ora_|grep DCC, I found eight ora_pnnn_DCC processes. (Of course, parallel_min_servers=0 and parallel_max_servers=8 are set in the init file.)
I was told that ora_pnnn is a parallel process, and that I should check the parallel_min_servers and parallel_server_idle_time values in the init file...
However, the database we are using is version 8i, where parallel_server_idle_time has become a hidden parameter... How do I check the value of this hidden parameter?
One more thing: what does the degree column in dba_tables/dba_indexes mean? When degree > 1 it seems to be related to parallel processes, but I'm not sure.
Edited by:
sglim79

Checking it is the same as checking any ordinary parameter:
SQL> show parameter parallel_server_idle_time
For reference, if you connect as the sys user and run the SQL below, you can list all of the hidden parameters that are available:
select ksppinm
from x$ksppi
where substr(ksppinm,1,1) = '_'
The degree value is the degree of parallelism. When degree > 1, it is set on the table, so SQL statements against that table run in parallel even without a parallel hint.
However, a parallel hint overrides the degree of parallelism set on the table.
The notes below are things to watch out for when using the parallel_clause at table creation time.
    Notes on the parallel_clause
    If table contains any columns of LOB or user-defined object type, this statement as well as subsequent INSERT, UPDATE, or DELETE operations on table are executed serially without notification. Subsequent queries, however, will be executed in parallel.
    For partitioned index-organized tables, CREATE TABLE ... AS SELECT is executed serially, as are subsequent DML operations. Subsequent queries, however, will be executed in parallel.
    A parallel hint overrides the effect of the parallel_clause.
    DML statements and CREATE TABLE ... AS SELECT statements that reference remote objects can run in parallel. However, the "remote object" must really be on a remote database. The reference cannot loop back to an object on the local database (for example, by way of a synonym on the remote database pointing back to an object on the local database).
For reference, setting the degree of parallelism at table creation time is not recommended; it is better to use a parallel hint in the specific SQL statements that need it.

  • Group data locked error for MM01 using parallel processing

    Hello gurus,
I am using the call transaction method (MM01) with parallel processing (around 9 threads). Sometimes around 10 percent of the materials get locked.
This is happening randomly: one day I don't have any locking errors, the next day I do. Any ideas why this could be? Are there any prerequisites I need to check before executing the parallel processing?
    Thank you in advance..
    sasidhar p

    Hi Sasidhar
I guess you are either extending the Sales Data or the MRP Data. Just make sure that you are processing these transactions in a linear fashion for a single material. We can use parallel processing for different materials, but if we go for parallel processing on a single material we can definitely expect lock object errors.
    Kind Regards
    Eswar

EL09 parallel processing logic

    Hi,
Can anyone help me understand the processing logic, i.e. how the parallel processing job runs through the EL09 transaction?
I could not understand how it is scheduled and processed.
    Regards,
    Deepak

    Hi,
    Check things like
    http://help.sap.com/saphelp_webas620/helpdata/en/e8/df959834ac11d586f90000e82013e8/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/c4/3a7f1f505211d189550000e829fbbd/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/c4/3a7f39505211d189550000e829fbbd/frameset.htm
    Eddy

Communication between threads in the same process using a file interface

    Hi,
I am developing a driver and I need to communicate between two threads.
>Can anyone guide me on implementing communication between two threads in the same process using a file interface? The first thread will be the driver and the second will be the application. I need to send IOCTL-like commands using the file interface, i.e. WriteFile(), ReadFile(),
from the host process to the driver through the file interface (which runs in the driver context). The host process should not be blocked for the duration of the driver processing the command.
>The file interface will run in the driver context and will be responsible for receiving commands from the application and passing them to the driver.
What complexities does this introduce?
>Can anyone also give me a link/reference with more information on this topic?
>How do I replace IOCTL commands, for instance a baud-rate change command, with a file interface, for example with an IRP?

Here is the detailed query:
The Hardware Abstraction Layer will interact with the driver (both currently run in completely different processes). There is an IOCTL for commands and a file interface for read and write.
My requirement is:
Both should run in the same process, so the HAL will run as one thread and the driver as another thread in the same process. I don't want the HAL to wait for completion of the request, and I also don't want the driver to be blocked.
We are planning to use a file interface for communication between the Hardware Abstraction Layer and the driver.
The HAL will send the command or read/write operation to the file interface, and the driver will get the command or read/write request from the file interface.
There is flexibility to change the Hardware Abstraction Layer and also the driver.
Is it possible to use IOCTL between two threads in the same process? If not, what other options do we have?
Can we use a file interface to send commands (like IOCTL) between two threads?

  • Aim to process all files in folders on desktop to run through photoshop and save in multiple locations

Part one:
Gather information from the desktop to get brand names and week numbers from the folders,
excluding folders on the desktop beginning with "2" or "Hot".
I'm not sure about getting the list of folders,
but I have got this bit to work with:
    set folderPath to "Hal 9000:Users:matthew:Desktop:DIVA_WK30_PSD" --<<this would be gained from the items on the desktop
    set {oldTID, my text item delimiters} to {my text item delimiters, ":"}
    set folderName to last text item of folderPath
    set my text item delimiters to "_WK"
    set FolderEndName to last text item of folderName
    set brandName to first text item of folderName
    set my text item delimiters to "_PSD"
    set weekNumber to first text item of FolderEndName
    set my text item delimiters to oldTID
After running this I have enough information to create folders in multiple locations (I need to know where they are so that Photoshop can later save into those multiple locations).
So I need the following folders created:
    Locally
    Hal 9000:Users:matthew:Pictures:2011-2012:"WK" + weekNumber
    Hal 9000:Users:matthew:Pictures:2011-2012:"WK" + weekNumber: brandName
    Hal 9000:Users:matthew:Pictures:2011-2012:"WK" + weekNumber: brandName: brandName + "_WK" + weekNumber + "_LR" --(Set path for Later)PathA
    Hal 9000:Users:matthew:Pictures:2011-2012:"WK" + weekNumber: brandName: brandName + "_WK" + weekNumber + "_HR"--(Set path for Later)PathB
    Network
    Volumes:GEN:Brands:Zoom:Brands - Zoom:Upload Photos:2012:"Week" + weekNumber
    Volumes:GEN:Brands:Zoom:Brands - Zoom:Upload Photos:2012:"Week" + weekNumber:brandName + "_WK" + weekNumber + "_LR"  --(Set path for Later)PathC
    Volumes:GEN:Website_Images --(no need to create folder just set path)PathD
FTP (still mounted as a normal volume, so like another network location)
    Volumes:impulse:"Week" + weekNumber
    Volumes:impulse:"Week" + weekNumber:Brand
    Volumes:impulse:"Week" + weekNumber:Brand:brandName + "_WK" + weekNumber + "_LR"  --(Set path for Later)PathE
    Volumes:impulse:"Week" + weekNumber:Brand:brandName + "_WK" + weekNumber + "_HR"  --(Set path for Later)PathF
I like to think that is the end of Part 1.
Part 2
Take the images (PSDs) from the folders relevant to the brand, then possibly run more AppleScript that opens, flattens and then saves each one in the locations above.
For example….
An image in folder DIVA_WK30_PSD will then run an AppleScript in Photoshop, let's call it DivaProcessImages, within which we then save to PathA, PathB, PathC, PathD, PathE and PathF. The folder path of PathC should therefore look like this:
Volumes:GEN:Brands:Zoom:Brands - Zoom:Upload Photos:2012:Week30:DIVA_WK30_LR and of course save the image with its original filename.
Then from the next folder:
An image in folder Free_WK30_PSD will then run an AppleScript in Photoshop, let's call it FreeProcessImages, within which we then save to PathA, PathB, PathC, PathD, PathE and PathF. The folder path of PathC should therefore look like this:
Volumes:GEN:Brands:Zoom:Brands - Zoom:Upload Photos:2012:Week30:Free_WK30_LR and of course save the image with its original filename.
The Photoshop AppleScript, I'm hoping, will be easier as it should be a clearer step-by-step process without any ifs and buts.
    Now for the coffee!!

    Hi,
MattJayC wrote:
Now to the other part: where each folder was created (and for those that already existed), how do I set them as variables?
For example,
set localBrandFolder_High_Res to my getFolderPath(brandName & "_WK" & weekNumber & "_HR", localBrandFolder)
This line was used to create more than one folder as it ran through the folders on the desktop. The next part is that I will need to reference them to save files to them.
You can use a list of records.
Examples:
If you want the path of localBrandFolder_High_Res of "Diva", and "Diva" is the second folder of the Desktop,
you get the path with this: localBrandFolder_High_Res of record 2 of myRecords
If you want the path of localWeekFolder in the first folder of the Desktop,
you get the path with this: localWeekFolder of record 1 of myRecords
    Here is the script
    set myRecords to {}
    set dtF to paragraphs of (do shell script "ls -F ~/Desktop | grep '/' | cut -d'/' -f1")
    repeat with i from 1 to number of items in dtF
        set this_item to item i of dtF
        if this_item does not start with "2_" and this_item does not start with "Hot" then
            try
                set folderPath to this_item
                set {oldTID, my text item delimiters} to {my text item delimiters, ":"}
                set folderName to last text item of folderPath
                set my text item delimiters to "_WK"
                set FolderEndName to last text item of folderName
                set brandName to first text item of folderName
                set my text item delimiters to "_PSD"
                set weekNumber to first text item of FolderEndName
                set my text item delimiters to oldTID
            end try
            try
                set this_local_folder to "Hal 9000:Users:matthew:Pictures:2011-2012"
                set var1 to my getFolderPath("WK" & weekNumber, this_local_folder)
                set var2 to my getFolderPath(brandName, var1)
                set var3 to my getFolderPath(brandName & "_WK" & weekNumber & "_LR", var2)
                set var4 to my getFolderPath(brandName & "_WK" & weekNumber & "_HR", var2)
                --set up names for destination folders and create them over the network, including an already existing folder
                set this_Network_folder to "DCKGEN:Brands:Zoom:Brand - Zoom:Upload Photos:2012:"
                set var5 to my getFolderPath("WK" & weekNumber, this_Network_folder)
                set var6 to my getFolderPath(brandName, var5)
                set var7 to my getFolderPath(brandName & "_WK" & weekNumber & "_LR", var6)
                set website_images to "DCKGEN:Website_Images:"
                --set up names for destination folders and create them over the network for FTP collection (based on a mounted drive)
                set this_ftp_folder to "Impulse:"
                set var8 to my getFolderPath("Week" & weekNumber, this_ftp_folder)
                set var9 to my getFolderPath(brandName, var8)
                set var10 to my getFolderPath(brandName & "_WK" & weekNumber & "_LR", var9)
                set var11 to my getFolderPath(brandName & "_WK" & weekNumber & "_HR", var9)
                set end of myRecords to ¬
      {localWeekFolder:var1, localBrandFolder:var2, localBrandFolder_Low_Res:var3, localBrandFolder_High_Res:var4, networkWeekFolder:var5, networkBrandFolder:var6, networkBrandFolder_Low_Res:var7, ftpWeekFolder:var8, ftpBrandFolder:var9, ftpBrandFolder_Low_Res:var10, ftpBrandFolder_High_Res:var11}
            end try
        end if
    end repeat
    localBrandFolder_High_Res of record 2 of myRecords -- get full path of localBrandFolder_High_Res in the second folder of Desktop
    on getFolderPath(tName, folderPath)
        tell application "Finder" to tell folder folderPath
            if not (exists folder tName) then
                return (make new folder at it with properties {name:tName}) as string
            else
                return (folder tName) as string
            end if
        end tell
    end getFolderPath

  • Unable to clone File Adapter receiver channel for parallel processing

    Hi Experts,
I am using variable substitution for a File - RFC - File scenario without BPM (using the request-response and one-way beans).
When I placed the file in the sender FTP folder, the file didn't get picked up, and in communication channel monitoring I am getting the error 'Unable to clone File Adapter receiver channel for parallel processing'.
Can anybody provide suggestions to solve this error?
Note: without variable substitution, the interface works fine.
Is it because I am trying to use the source structure field in the response file adapter?

    Hi,
In your CC, do you use some additional parameters?
Like the ones in points 47/48 of [OSS note 821267 - FAQ: XI 3.0 / PI 7.0/ PI 7.1 File Adapter|https://service.sap.com/sap/support/notes/821267]
Maybe there is a conflict between a parallel connection and the bean used to do the async-sync bridge...
    Mickael

How to map expdp parallel processes to output files

How do I map each expdp parallel process to its output file while the job is running?
Say I use expdp dumpfile=test_%U.dmp parallel=5 ...
Each parallel process writes to its related output file; I want to know the mapping at run time.

I'm not sure if this information is reported in the status command, but it's worth a shot. You can get to the status command two ways:
If you are running a Data Pump job from a terminal window, then while it is running, type Ctrl-C and you will get the Data Pump prompt, either IMPORT> or EXPORT>.
IMPORT> status
If you type status, it will tell you a bunch of information about the job and then each process. It may have dump file information in there.
Otherwise, you need to attach to the job. To do this, you need to know the job name. If you don't know it, you can look at sys.dba_datapump_jobs if privileged, or sys.user_datapump_jobs if not. You will see a job name and a schema name. Once you have that, you can:
expdp user/password attach=schema.job_name
This will bring you to the EXPORT>/IMPORT> prompt. Type status there.
    Like I said, I'm not sure if file name information is specified, but it might be. If it is not there, then I don't know of any other way to get it.
    Dean

Processing a large file in multiple threads

    I have a large plain text file (~1GB) that contains lots (~500,000) of chunks of text, separated by the record separator "\n//\n". In my application, I have a class that reads the file and turns the records into objects, which is fairly time-consuming (the bottleneck is definitely in the processing, not the IO). The app is running on a multi-core machine, so I want to split the processing up over several threads.
    My plan is to write a "Dispatcher" class, which reads the file from disk and maintains a queue of records, and a "Processor" class, which requests records from the Dispatcher object, turns the records into objects, and add()s them to an array. Is this the correct way to do it? Can anyone point me in the direction of a tutorial or example of this type of multi-threaded file processing?

    import java.io.*;
    import java.util.Collection;
    import java.util.concurrent.*;

    public class TaskExecutor {
        // Probably want Runtime.getRuntime().availableProcessors() + 1 here
        private static final int NTHREADS = 10;
        private static final ExecutorService exec = Executors.newFixedThreadPool(NTHREADS);
        private static final Collection<String> result = new ConcurrentLinkedQueue<String>();

        public static void main(String[] args) throws IOException {
            BufferedReader reader = new BufferedReader(new FileReader(args[0]));
            StringBuilder record = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.equals("//")) {                  // record separator "\n//\n"
                    final String text = record.toString();
                    exec.execute(new Runnable() {         // hand one record to the pool
                        public void run() { handle(text); }
                    });
                    record = new StringBuilder();
                } else {
                    record.append(line).append('\n');
                }
            }
            reader.close();
            exec.shutdown();
        }

        private static void handle(String text) {
            // turn the record into your object here and collect it
            result.add(text);
        }
    }
    http://www.javaconcurrencyinpractice.com/listings/TaskExecutionWebServer.java
    Edited by: Fireblaze-II on Jun 19, 2008 5:48 AM
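    For comparison, here is a minimal sketch of the Dispatcher/Processor split proposed in the question, using a bounded BlockingQueue between the reader thread and the worker threads (the class name, queue capacity and record parsing are illustrative placeholders):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class DispatcherDemo {
        private static final String POISON = "__END__";   // sentinel telling a worker to stop

        public static void main(String[] args) throws Exception {
            final BlockingQueue<String> queue = new ArrayBlockingQueue<String>(1000);
            int nWorkers = Runtime.getRuntime().availableProcessors();

            // Processor threads: take records off the queue and turn them into objects
            for (int i = 0; i < nWorkers; i++) {
                new Thread(new Runnable() {
                    public void run() {
                        try {
                            String record;
                            while (!(record = queue.take()).equals(POISON)) {
                                process(record);           // the expensive part
                            }
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    }
                }).start();
            }

            // Dispatcher: read the file and feed complete records to the queue
            BufferedReader reader = new BufferedReader(new FileReader(args[0]));
            StringBuilder record = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.equals("//")) {
                    queue.put(record.toString());          // blocks if the workers fall behind
                    record = new StringBuilder();
                } else {
                    record.append(line).append('\n');
                }
            }
            reader.close();
            for (int i = 0; i < nWorkers; i++) {
                queue.put(POISON);                         // one sentinel per worker
            }
        }

        private static void process(String record) {
            // placeholder: parse the record into an object and collect it
        }
    }

    The bounded queue keeps memory use under control when parsing is slower than reading, which matches the situation described above (processing, not IO, is the bottleneck).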

Parallel processes with a reentrant VI have the same value in both threads

    Has it been so long since I programmed LabVIEW that I forgot some basic stuff??
    I have a main VI which originally called dynamic processes in parallel.
    I then called the sub-VIs directly and still run them in parallel, but they now have separate names.
    I use a QSM. Each parallel thread now has its own QSM because, although I was using a queue name for each dynamic queue, the data that was being extracted was shared or split between the two threads. If I confused everyone with that statement, I shall explain.
    Two parallel processes, calling a QSM (re-entrant).  They have the same number of elements and matching sets.
    EX:
    Process 1          Process 2
    Task1                 Task1
    Task2                 Task2
    Task3                 Task3
    Task4                 Task4
    Task5                 Task5
    Task6                 Task6
    I was expecting each thread (each process) to extract from the queue the list of tasks as entered (from Task1 to Task6).  What the processes were getting was the following:
    Process 1          Process 2
    Task1                 Task1
    Task2                 Task3
    Task4                 Task5
    Task6                 default
    Each process was sending a different queue name to the QSM. Each queue should have its own name.
    I need to get this running by tomorrow with no excuses! So I decided to do a lame workaround by also having 2 QSMs. That fixed it.
    In each parallel process (the two are copies of each other with different names) there is a call to open a telnet session. I probed and placed breakpoints in the code. Although each process has a different name and the call to the function that opens the telnet session is re-entrant, the very same telnet reference number is assigned to both processes.
    Why? Why would they get the same reference number? I made all VIs down to (and including) Telnet Open Connection re-entrant (although it was not needed) and it still assigns the same reference number to each telnet session. Why? What am I not seeing? What am I missing?
    Unfortunately, I cannot post the code..  But it is not complicated code.  Just 2 sessions with different IP addresses.  I would expect different telnet session references... 
    As a matter of fact, I need to try something silly.. 

    I should provide more details with the solution....
    I just have to stop saying "D'Oh!!"
    Okay... here goes...
    In the LVOOP, I am using Notifiers and Semaphores to ensure that a race condition cannot occur.  Works well with previously written code.
    In this particular implementation, I have the same / similar object being created more than once (twice at this time).
    Where I went wrong (D'Oh!!!!) was to have a static name for a given object when creating the Notifier and Semaphore references. Since the same name was given, so was the reference. Since the references were the same, so was the data, and so on.
    D'Oh!!!!!!
    D'Oh!!!!!!
    Now I know why a particular bird was called D'Oh-D'Oh bird... 
    D'Oh!!!  Such a silly mistake...
    D'Oh!!!
    Hope it makes a few people laugh...  or help another bird....

  • Parallel process with a queue and file?

    Hello, first of all, sorry for my bad English. ^^
    I have been working for days on my project, where I have to show parallel processing while transferring information in different ways, and the problems involved (like timing and so on).
    I chose to transmit information to a parallel process by (1) a queue, and by (2) a file (.txt). (Other ways are welcome, do you have 1-2 other ideas?)
    To solve this problem I made three while loops: the first one is the original one, where the original information (as a signal) is created and sent by queue and file to the other two while loops, where this information is evaluated to recreate the same signal.
    So in the end you can compare all the signals to see if they are the same, so that you can answer the question about the parallelism of the processes.
    But in my VI file I have some problems:
    the version with the queue works pretty well - it's almost parallel,
    but the version with the file doesn't run in parallel, and I have no idea how I can solve it.
    I'm a newbie. ^^
    Can someone correct my file so both (file and queue versions) run in parallel with the original one, or tell me what I can or must do?
    Attachments:
    Queue_Data_Parallel_FORUM.vi 23 KB

    A queue is technically never parallel, though you can have several if you really need parallelism. Other methods for information transfer between processes include Events, Action Engines, Notifiers (and why not web services).
    Due to limitations in the disk system you can only read/write one file at a time from one process, so I wouldn't recommend that. If you use a RAM disk it might work.
    /Y
    LabVIEW 8.2 - 2014
    "Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
    G# - Free award winning reference based OOP for LV

  • Parallel Processing through ABAP program

    Hi,
    We are trying to do parallel processing through ABAP. As per the SAP documentation, we are using CALL FUNCTION ... STARTING NEW TASK ... DESTINATION.
    We have one Z function module, and as per SAP we are making this function module (FM) a remote-enabled module.
    In this FM we would like to process data which we get from an internal table, and we would like to send the processed data back (through an internal table) to the main program where we use CALL FUNCTION ... STARTING NEW TASK ... DESTINATION.
    Please suggest how to achieve this.
    We tried out the EXPORT/IMPORT option, meaning we used EXPORT of the internal table in the FM with some memory ID, and in the main program IMPORT of the internal table with the same memory ID. But this option is not working, even though the ID and the name of the internal table are the same.
    Also, the SAP documentation says that we can use RECEIVE RESULTS FROM FUNCTION 'RFC_SYSTEM_INFO'
    IMPORTING RFCSI_EXPORT = INFO in conjunction with CALL FUNCTION ... STARTING NEW TASK ... DESTINATION. The documentation also specifies that "RECEIVE is needed to gather IMPORTING and TABLE returns of an asynchronously executed RFC function module". But while creating the remote-enabled FM we can't have EXPORT or IMPORT parameters.
    Please help !
    Thanks in advance
    Santosh

    "We tried out the EXPORT/IMPORT option, meaning we used EXPORT of the internal table in the FM with some memory ID, and in the main program IMPORT of the internal table with the same memory ID. But this option is not working..."
    I think this is not working because that memory does not work across sessions/tasks. I think
    IMPORT FROM SHARED BUFFER and EXPORT TO SHARED BUFFER would work. I have used these in the past and they work pretty well.
    Also,
    here is a quick sample of the "new task" and "receive" functionality. You cannot specify the importing parameters when calling the FM; you specify them at the receiving end.
    report zrich_0001 .
    data: session(1) type c.
    data: ccdetail type bapi0002_2.
    start-of-selection.
    * Call the function module in another session...control will stop
    * in the calling program and wait for a response from the other session
      call function 'BAPI_COMPANYCODE_GETDETAIL'
               starting new task 'TEST' destination 'NONE'
                   performing set_session_done on end of task
        exporting
          companycodeid             = '0010'
    * IMPORTING
    *   COMPANYCODE_DETAIL        = ccdetails
    *   COMPANYCODE_ADDRESS       =
    *   RETURN                    =
    * wait here till the other session is done
      wait until session = 'X'.
      write:/ ccdetail.
    *       FORM SET_session_DONE
    form set_session_done using taskname.
    * Receive results from the function into ccdetail.......
    * this will also close the session
      receive results from function 'BAPI_COMPANYCODE_GETDETAIL'
        importing
           companycode_detail        = ccdetail.
    * Set session as done.
      session = 'X'.
    endform.
    Hope this helps.
    Rich Heilman

  • How to Process flat File in Oracle Apps through Concurrent Program

    Hello Everyone,
    My client has a request to process a bank file (Lockbox), which is a flat file that will be copied onto a UNIX box. I will have to create a new concurrent request that processes this flat file and updates receipt information in the Oracle Apps database tables.
    Could you please suggest whether there are any standard Oracle Apps functions (for example, FND) available that can be used through a concurrent program to open a file from a particular directory, read from the flat file, and close it after processing.
    Please let me know if you have a small example; that would help me a lot.
    Thanks

    There are base concurrent programs in Accounts Receivable that do consume Lockbox flat files. Please see the AR Setup/User Guides at
    http://download.oracle.com/docs/cd/B40089_10/current/html/docset.html
    Srini
