Best approach for this problem

Hi there Experts,
I have an async-sync bridge BPM that goes from delivery IDocs to web service calls; after that I map the response to an ALEAUD IDoc structure and send it back to ERP to update the delivery status.
It is currently working as intended on the production system, but the number of deliveries is growing exponentially, raising the response times for the whole process.
I found a solution for this: instead of making one web service call per delivery, I can group deliveries and send several of them per call. The problem appears when I need to map the bulk response back to individual ALEAUD IDocs.
To update the original delivery IDoc I need its IDoc number in the ALEAUD, but the web service returns the response keyed by delivery number. I therefore need a mechanism, maybe a temporary table or something similar, that lets me store each IDoc number together with its matching delivery number, so I can use it as a reference when mapping the large response to individual ALEAUD IDocs. I'd also like to know whether this is possible with graphical mapping, or what the best way to do it would be.
Thanks for all the input on the matter,
Regards,
Roberto.

Maybe you can write 2 RFC function modules: one to maintain the values in ECC and the other to read them. These RFCs can be called from within the mapping.
I don't know if it will work or not, but it can be tried.
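One way to picture the two-RFC idea: keep a small correlation table keyed by delivery number that stores the original IDoc number when the bulk request is built, then read it back when splitting the bulk response. A minimal sketch of the splitting logic in Python — the field names (DOCNUM, delivery, status) and the response layout are illustrative assumptions, not your actual structures:

```python
# Correlation-table idea: remember which IDoc produced which delivery,
# then use that to split one bulk web-service response into one
# ALEAUD-style record per delivery. All field names are illustrative.

def build_correlation(delivery_idocs):
    """delivery_idocs: list of (idoc_number, delivery_number) pairs
    captured when the bulk request is assembled."""
    return {delivery: docnum for docnum, delivery in delivery_idocs}

def split_bulk_response(response_items, correlation):
    """response_items: per-delivery dicts from the bulk response.
    Returns one ALEAUD-like record per delivery, carrying the original
    IDoc number so ERP can update the right delivery IDoc."""
    records = []
    for item in response_items:
        records.append({
            "DOCNUM": correlation[item["delivery"]],  # original IDoc number
            "STATUS": item["status"],                 # status from the response
        })
    return records

corr = build_correlation([("0000000123", "80001001"), ("0000000124", "80001002")])
records = split_bulk_response(
    [{"delivery": "80001002", "status": "53"},
     {"delivery": "80001001", "status": "51"}],
    corr,
)
```

In XI/PI the dictionary would be the RFC-maintained table in ECC (or a value-mapping table), and the two functions would be the "maintain" and "read" lookups called from a UDF in the graphical mapping.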

Similar Messages

  • TA24002 My 500 GB can't verify nor repair. I have photoshop work that I need to recover. I would like to know which erase option would be the best solution for this problem.

    My 500 GB can't verify nor repair. I have photoshop work that I need to recover. I would like to know what option would be the best solution for this problem?

    You appear to have two issues: 1) a hard drive that is not working properly and 2) files you wish to recover.
    Re 1) you need to answer Kappy's questions.
    Re 2) does the drive load and can you see your photo files? If so can you copy them to another drive?
    Do you not have a backup of the photo files?

  • I have problems starting Encore: when opening, it presents the following error message: "Encore CS6 Cannot Run in Non-Royalty Serialized".... What is the best solution for this problem?

    Help Me.
    What is the best solution for this problem ?

    Encore is activated when you activate Premiere Pro... so, as Stan asked, how did you install P-Pro?
    Ask for serial number http://forums.adobe.com/thread/1234635 has a FAQ link
    -and a fix for Encore http://forums.adobe.com/thread/1421765?tstart=0 in reply #7
    -plus more Encore http://helpx.adobe.com/encore/kb/cant-write-image-fie-larger1.html

  • Design Patterns, best approach for this app

    Hi all,
    i am starting with design patterns, and i would like to hear your opinion on what would be the best approach for this app. 
    this is basically an app for data monitoring, analysis and logging (voltage, temperature & vibration)
    i am using 3 devices for N channels (NI 9211A, NI 9215A, NI PXI 4472) all running at different rates. asynchronous.
    and signals are being processed and monitored for logging at a rate specified by the user and in realtime also. 
    individual devices can be initialized or stopped at any time
    basically i'm using 5 loops.
    *1.- GUI: Stop App, Reload Plot Names  (Event handling)
    *2.- Chart & Log:  Monitors Data and Start/Stop log data at a specified time in the GUI (State Machine)
    *3.- Temperature DAQ monitoring @ 3 S/s  (State Machine)   NI 9211A
    *4.- Voltage DAQ monitoring and scaling @ 1K kS/s (State Machine) NI 9215A
    *5.- Vibration DAQ monitoring and Analysis @ 25.6 kS/s (State Machine) NI PXI 4472
    i have attached the files for review, thanks in advance for taking the time.
    Attachments:
    V-T-G Monitor_Logger.llb ‏355 KB

    mundo wrote:
    thanks Will for your response,
    so, basically, could I apply a producer/consumer architecture to just the vibration analysis loop, or to all data being collected by the Monitor/Logger loop?
    is it ok having individual loops for every DAQ device, as shown?
    thanks.
    You could use the producer/consumer architecture to split the areas where you are doing both the data collection and the analysis in the same state machine. If one of these processes is not time critical, or the data rate is slow enough, you could leave it in a single state machine. I admit that I didn't look through your code, but based purely on the descriptions above I would imagine that you could change the three collection state machines to use a producer/consumer architecture. I would leave your UI processing in its own loop, as well as the logging process. If the logging is time critical you may want to split that as well.
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot
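For readers new to the pattern Mark describes: in a producer/consumer split, the acquisition loop only enqueues samples, and a separate consumer loop does the analysis/logging, so a slow analysis step can never stall the DAQ reads. A language-neutral sketch in Python (in LabVIEW the queue would be a LabVIEW queue connecting two while loops; the doubling below stands in for real analysis):

```python
import queue
import threading

def producer(q, samples):
    # Acquisition loop: only reads data and enqueues it; no analysis here.
    for s in samples:
        q.put(s)
    q.put(None)  # sentinel: acquisition has stopped

def consumer(q, results):
    # Analysis/logging loop: may run slower without blocking acquisition.
    while True:
        s = q.get()
        if s is None:
            break
        results.append(s * 2)  # stand-in for real analysis/logging

q = queue.Queue(maxsize=100)
results = []
t_prod = threading.Thread(target=producer, args=(q, [1, 2, 3]))
t_cons = threading.Thread(target=consumer, args=(q, results))
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
```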

  • My iPhone 4s is struggling to send messages and dial out. Service is going back and forth between Verizon (with a little circle), Roaming, and No Service. Best fix for this problem? Thanks

    My iPhone 4s is struggling to send messages and dial out.. Service is going back and forth between Verizon (with a little circle), Roaming, and No Service.. Best fix for the problem..?
    Thanks


  • Best option for this problem

    I have an iMac at home and a Windows Vista PC, both hooked up to the AEBS. I recently purchased a new hard drive (Seagate 1.5 TB) and am trying to figure out the best way to format the drive to solve my problem. Here is the scenario -
    1) I would like to attach the hard drive to the AEBS so that I can access it from my Window's Vista PC
    2) I would like to store files larger than 4GB on the hard drive
    How do I format the hard drive so that the AEBS can recognize it and allow me to store files larger than 4GB? Anyone else running into this problem?
    Thanks,
    Setta

    When your PC tries to access a network drive, it uses SMB (Samba) to communicate with the file server that is "serving" that drive. In turn, the file server uses the appropriate protocol to actually access the drive. This is true for both reading and writing, so the drive's format shouldn't matter unless it's connected directly to the PC.
    The other issue you may come across with either Vista or Windows 7 is that both default to NTLMv2-only authentication when talking to network drives, which some older file servers don't support. This shows up as a rejection of your file-share credentials.

  • Need suggestions or possible approach for this problem

    Hello,
    I have a scenario and I want to develop an apex application for this scenario.
    The thing is, I have multiple report regions on a page, all querying the same table 'loans'. I have a button named 'Assign loan' at the top of the page, and at the bottom I have the buttons 'Save' and 'Complete'. Initially the report regions should not be displayed; only after the user clicks the button 'Assign loan' should they become visible, and similarly for the buttons 'Save' and 'Complete'. After a loan is assigned to a user, the button 'Assign loan' should be disabled.
    Now consider a situation where a user logs in to the application and clicks the button 'Assign loan'. At this point I will update the table and set a flag, so that the user gets that particular record in the report regions. This record should be locked for that user and shouldn't be available to other users; it has to work in a multi-user environment, i.e. each user should get a different loan when they click 'Assign loan'. Once a user is assigned a loan, he or she makes changes to the report-region editable items and can either save the changes by clicking 'Save' or click 'Complete', which means the loan was reviewed and is completed. A user shouldn't be assigned another loan until he or she reviews and completes the loan already assigned; likewise, if the user makes changes and clicks 'Save' (meaning pending), the changes should be saved and no new loan assigned until the current one is completed.
    Can anyone please help me with their possible suggestions or an approach to this kind of problem.
    thanks,
    Orton

    It looks to me that the trickiest part is preventing more than one user from being assigned the same loan. I've seen DBMS_LOCK used for a situation like this. It's been years so I'm a bit fuzzy on the exact details, but it goes something like this:
    When the user clicks 'Assign a loan', try to get an exclusive named lock (the handle normally comes from dbms_lock.allocate_unique, e.g. for the name 'LOAN_LOCK'):
       dbms_lock.request(lockhandle => l_loan_lock_handle,
                         lockmode   => dbms_lock.x_mode,
                         timeout    => 10);
    If another user already holds the lock, try again after the timeout until the lock is obtained (and probably only try a maximum number of times).
    Once the lock is obtained, get the next available unassigned loan and set the flag in the table. Then release the lock so the next user can get a loan assigned:
       dbms_lock.release(lockhandle => l_loan_lock_handle);
    As long as everyone uses the same process for getting a loan assigned, only the user holding the lock can modify the table. The rest of it (the logic around which buttons to show, requiring an assigned loan to be completed before another is assigned, etc.) should be relatively straightforward.
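The same serialize-then-assign idea, sketched outside the database so the logic is easy to see — here a process-local lock stands in for the DBMS_LOCK named lock (in the real application the lock must live in the database, since each APEX user runs in a separate database session):

```python
import threading

loan_lock = threading.Lock()  # stands in for the DBMS_LOCK named lock
loans = [{"id": 1, "assigned_to": None}, {"id": 2, "assigned_to": None}]

def assign_loan(user):
    # Serialize assignment: only one user at a time may pick a loan,
    # so two users can never be handed the same unassigned row.
    with loan_lock:
        for loan in loans:
            if loan["assigned_to"] is None:
                loan["assigned_to"] = user
                return loan["id"]
    return None  # no unassigned loans left
```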

  • Is an array the best solution for this problem?

    Hi there,
    I'm working up a demo where a couple of little games would show up in a  panel. There is a main menu that you bounce around to select things (the  games as well as other apps.)
    When a single game is running, it takes up the whole panel. When two or  more are running, they run in a "mini" format. Also, when a game is  running, a special "return to game" button appears in the main menu.
    This is a click through dog and pony show demo. It's not a production  app, but it has to work well enough to be played around with by internal  clients. So it has to work well.
    Right now I have some variables set up like so:
    var gameActive:Boolean = false;
    var gameDual:Boolean = false;
    In my launch game and menu function, I am checking these (and one or two  other) variables to see if I should do things like show a mini version  of the game or show the return to game button. As I add features though,  this is becoming slightly unwieldy.
    The key issue is the games. Let's say I have only two. I could make an  array, and then load in the game name when a game is launched. I could  check this array in my functions to see not only if games are launched,  but which game is launched so I can use the full or mini games as  appropriate.
    Is this a good approach, or is there a better way? I'm rusty with my  coding and not super comfortable making objects right now. But I could  go that way if it was the best.

    there's not much to it. Here are the only three things you're likely to need to do with your associative array:
    var yourAA:Object = {};
    function addAAItem(aa:Object, o:DisplayObject):void {
        aa[o.name] = o;
    }
    function removeAAItem(aa:Object, o:DisplayObject):void {
        delete aa[o.name];  // delete (not = null), so the key no longer counts
    }
    function aaLength(aa:Object):int {
        var i:int = 0;
        for (var s:String in aa) {
            i++;
        }
        return i;
    }

  • Best approach for this?

    I have a table variable person_demographics.
    In this table there are a whole bunch of code columns:
    country_code, gender_code, state_code, race_code, etc.,
    whose labels are all in different tables.
    I need to display to the user the real labels, not the codes, which are located in all these different tables.
    The logic is that a user enters a person's id and then the info is displayed on the screen; the info is read-only, enforced by the database (it belongs to a table where only the DBA can write).
    Can you guys suggest a good way to do this? I have already created a block called person_demographics, but again, it displays only the codes.
    Please, if you could be so kind, provide suggestions with actual code if possible,
    thanks!

    great suggestion William,
    but one more question: how would you avoid missing records that have NULLs in the join columns? For instance, where a record in person_demographics does not have a country code, a simple join will exclude that record, since the condition will not be fulfilled, right?
    Now, it seems like an outer join may be the solution, but as you know, you can outer-join only one table per query. Any suggestions?
    thanks a bunch!
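On the outer-join limitation: it applies to Oracle's old (+) syntax; with ANSI join syntax you can LEFT JOIN every lookup table in one query, and rows with NULL codes are kept. A small self-contained illustration using SQLite (the table and column names are invented to mirror the scenario):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE person_demographics (id INTEGER, country_code TEXT, gender_code TEXT);
CREATE TABLE countries (code TEXT, label TEXT);
CREATE TABLE genders   (code TEXT, label TEXT);
INSERT INTO person_demographics VALUES (1, 'US', 'F');
INSERT INTO person_demographics VALUES (2, NULL, 'M');  -- missing country code
INSERT INTO countries VALUES ('US', 'United States');
INSERT INTO genders VALUES ('F', 'Female');
INSERT INTO genders VALUES ('M', 'Male');
""")
# LEFT JOIN keeps person 2 even though its country_code is NULL;
# an inner join would drop that row.
rows = cur.execute("""
    SELECT p.id, co.label, g.label
    FROM person_demographics p
    LEFT JOIN countries co ON co.code = p.country_code
    LEFT JOIN genders   g  ON g.code  = p.gender_code
    ORDER BY p.id
""").fetchall()
```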

  • Best approach for IDOC - JDBC scenario

    Hi,
    In my scenario I am creating a sales order (ORDERS04) in the R/3 system, which needs to be replicated to a SQL Server system. I am sending the order to XI as an IDoc and want to use JDBC to send the data to SQL Server. I need to insert data into two tables (header & details). Is it possible without BPM? Or what is the best approach for this?
    Thanks,
    Sri.

    Yes, this is possible without BPM.
    Just create the corresponding data type for the insertion.
    If the records to be inserted are different, there will be 2 different data types (one for the header and one for the details).
    Do a multi-mapping, where your source is mapped into the header and details data types, and then send the message using the JDBC receiver adapter.
    For the structure of your data type for insertion, just check this link:
    http://help.sap.com/saphelp_nw04/helpdata/en/7e/5df96381ec72468a00815dd80f8b63/content.htm
    To access any Database from XI, you will have to install the corresponding Driver on your XI server.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3867a582-0401-0010-6cbf-9644e49f1a10
    Regards,
    Bhavesh
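The target message for a JDBC receiver insert roughly follows the statement/table-with-action layout shown in the help page above. A sketch that builds one header and one detail INSERT statement — the element names (STATEMENT, TABLE, ORDER_HEADER, ORDER_ITEM) and field values here are assumptions chosen for illustration, not the exact names your data type must use:

```python
import xml.etree.ElementTree as ET

def insert_statement(table, fields):
    # One statement per target table, with action="INSERT" and an
    # access element listing the column values (names are illustrative).
    stmt = ET.Element("STATEMENT")
    tbl = ET.SubElement(stmt, "TABLE", {"action": "INSERT"})
    ET.SubElement(tbl, "table").text = table
    access = ET.SubElement(tbl, "access")
    for name, value in fields.items():
        ET.SubElement(access, name).text = value
    return stmt

root = ET.Element("ORDER_JDBC_MT")
root.append(insert_statement("ORDER_HEADER", {"ORDER_NO": "4711", "CUSTOMER": "C01"}))
root.append(insert_statement("ORDER_ITEM", {"ORDER_NO": "4711", "ITEM_NO": "10"}))
payload = ET.tostring(root, encoding="unicode")
```

The multi-mapping simply has to produce two such statements (header first, then details) in the one target message.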

  • Best approach for IDs mapping..

    Hello,
    I'd like to ask you for your experiences about classical integration problem: mapping of IDs (materials, partners...)
    What is the best approach for integration between SAP and other systems? Can you give me some hints?
    Thanx, Peter

    Hi Peter,
    you have 4 ways to do it:
    1. you can do it inside an integration process:
    an RFC call to check a table with ID -> ID mappings
    (not so good, as you have to use an integration process,
    but very easy to build as this is standard)
    2. a table in R/3, changing the values in a user exit
    (you maintain the data in a table in R/3)
    the fastest way (no calls to other programs),
    but you have to create user exits, and
    this is not why you (your client) bought the XI
    3. you can use this new RFC API:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/uuid/801376c6-0501-0010-af8c-cb69aa29941c
    which seems to be the best approach,
    as you don't need BPM for this and it's standard
    4. value-mapping tables in XI...
    Regards,
    michal
    Message was edited by: Michal Krawczyk
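Conceptually, option 4 (value-mapping tables in XI) is a lookup keyed by agency and scheme: the same logical object has a different representation in each system, and the table converts one representation into the other. A tiny model of that idea in Python (the agencies, schemes, and values below are invented for illustration):

```python
# Model of a value-mapping table:
# (agency, scheme, source value) -> target value. All entries invented.
VALUE_MAP = {
    ("SAP_R3", "MaterialNumber", "000000000000004711"): "MAT-4711",
    ("SAP_R3", "PartnerNumber",  "0000100001"):         "CUST-100001",
}

def map_value(agency, scheme, value, default=None):
    """Return the target-system representation of a source value,
    or the default when no mapping entry exists."""
    return VALUE_MAP.get((agency, scheme, value), default)
```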

  • What's the best approach for handling about 1300 connections in Oracle?

    What's the best approach for handling about 1300 connections in Oracle 9i/10g through a Java application?
    1. Using separate schemas for the various types of users. (We can store only the relevant data in a particular schema, and the number of records per table can be reduced by replicating tables, but we then have to maintain all the data in another schema as well. That means we need to update two schemas in a given session - one schema per user plus another schema holding all the data - so there may be update problems.)
    OR
    2. Using a single schema for all users.
    Note: in this case all users may access the same tables, and there may be many more records than in the previous case.
    Which is the best case?
    Please give your valuable ideas.

    That is true, but I want a solution from you all. I want you to tell me how to fix my friend's car.

  • What are the best approaches for mapping re-start in OWB?

    What are the best approaches for mapping re-start in OWB?
    We are using OWB repository 10.2.0.1.0 and OWB client 10.2.0.1.31. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
    We have number of mappings. We built process flows for mappings as well.
    I'd like to know the best approaches to incorporate re-start options in our process, i.e. handling a failure of a mapping in a process flow.
    How do we re-cycle failed rows?
    Are there any builtin features/best approaches in OWB to implement the above?
    Does runtime audit tables help us to build re-start process?
    If not, do we need to maintain our own tables (custom) to maintain such data?
    How did our forum members handled above situations?
    Any idea ?
    Thanks in advance.
    RI

    Hi RI,
    > How many mappings (range) do you have in a process flow?
    Several hundred (100-300 mappings).
    > If we have three mappings (e.g. m1, m2, m3) in a process flow, what will happen if m2 fails?
    Suppose the mappings are connected sequentially (m1 -> m2 -> m3). When m2 fails, the process flow is suspended (the transition to m3 is not performed). You should remove the cause of the error (modify the mapping and redeploy, correct the data, etc.) and then repeat the execution of m2 from the Workflow monitor: open the diagram with the process flow, select mapping m2, click the Expedite button, and choose the Repeat option.
    > On re-start, will it run m1 again and then m2 and so on, or will it re-start at row 1 of m2?
    You can specify the restart point. "At row 1 of m2" - I don't understand what you mean; all mappings run in set-based mode, so in case of an error all table updates are rolled back. (There are several exceptions - for example, multiple target tables in a mapping without correlated commit, or an error in a post-mapping process - so you must carefully analyze the results of an error.)
    > What will happen if m3 fails?
    The process is suspended and you can restart execution from m3.
    > By having no failover and max. number of errors = 0, do we achieve re-cycling failed rows to zero (0)?
    These settings guarantee only two possible results of a mapping - SUCCESS or ERROR.
    > What is the impact if we have a large volume of data?
    In my opinion, for large volumes the set-based mode is the preferred processing mode: with it you get the full range of enterprise features of the Oracle database - parallel query, parallel DML, nologging, etc.
    Oleg

  • Best approach for RFC call from Adapter module

    What is the best approach for making an RFC call from a receiver file adapter module?
    1. JCo
    2. Is it possible to make use of MappingLookupAPI classes to achieve this or those run in the mapping runtime environment only?
    3. Any other way?
    Has anybody ever tried this? Any pointers????
    Regards,
    Amol

    Hi,
    The JCo lookup is internally the same as the JCo call; the only difference is that you are not hardcoding the system-related data in the code, so it's easier to maintain during transports.
    The JCo lookup code is also more readable.
    Regards
    Vijaya

  • Best Approach for Reporting on SAP HANA Views

    Hi,
    Kindly provide information w.r.t the best approach for the reporting on HANA views for the architecture displayed below:
    We are on a lookout for information mainly around the following points:
    There are two reporting options known to us, listed below:
    1. Reporting on HANA views through SAP BW (View > VirtualProvider > BEx > BI 4.1)
    2. Reporting on HANA views in ECC using BI 4.1 tools
    Which is the best option for reporting (please provide supporting reasons, i.e. advantages and limitations)?
    In case a better approach exists, please let us know.
    Also, what is the best approach for reporting in a mixed scenario where data from BW and from HANA views is to be used together?

    Hi Alston,
    To be honest, I did not understand the architecture that you sketched in your message. As far as I understood, you have a HANA instance with ERP and BW running on it - or there might be two HANA instances, with ERP and BW running independently.
    Anyway, if you have HANA you have many options to present data using analytic views. You also have BW on HANA as the EDW, so in both cases you can use BO, and Lumira as well, for presenting data.
    Check this document as well: http://scn.sap.com/docs/DOC-34403
