Help with DTP with processing mode "No Data Transfer, Delta Status in Source"

Hi, I am trying to test a cube-to-cube extraction where I run a full DTP extraction for a specific account to get the data I need, and then a DTP delta init using processing mode "No Data Transfer, Delta Status in Source".
When I run the delta init with the "No Data" processing mode, it executes in dialog mode but takes an extraordinarily long time, as it appears to read back through all of the source InfoCube requests to determine the delta marker. The source InfoCube does have a lot of requests, but I would expect setting the marker to be a very quick process.
Any ideas why this DTP delta init with no data transfer would take so long?

Hi, thanks for the information, but I see in the documentation that this type of process runs in dialog mode. Could you point out where this process type would run in the background only? Can anyone tell me why this runs so long?
No data transfer; delta status in source: fetched
With this processing mode you execute a delta without transferring data.
This is analogous to simulating the delta initialization with the InfoPackage.
In this case, you execute the DTP directly in dialog mode.

Similar Messages

  • First load with DTP with mode Delta

    Hello Experts,
    I'd like some suggestions based on the following scenario.
    DSO-A
    - total records = 44,000,000
    - with criteria X = 230,000
    DSO-B (New DSO)
    - because this DSO is a new object, it is required to retrieve data from DSO-A with criteria X
    - this means that 230,000 target records are to be added
    - DTP for extracting data from DSO-A to DSO-B --> use Extraction Mode = Delta
    Questions
    - For the first-time load, how do I extract data in small chunks from DSO-A to DSO-B when Extraction Mode is set to Delta?
      I have concerns about performance if the data were loaded all at once with Extraction Mode = Delta. I do not want to interrupt other existing scheduled jobs much.
      After the first load, this DTP will be set to run daily. From the next day on, data will not be loaded in huge amounts again.
    Any best practices on this?
    All suggestions would be appreciated.
    Thank you very much.
    -WJ-

    *- For the first-time load, how do I extract data in small chunks from DSO-A to DSO-B when Extraction Mode is set to Delta?*
    When you load for the first time from DSO-A to DSO-B using a DTP, it acts like a full load even if you keep it in delta mode; further loads are then treated as delta loads. If suitable selections exist, you can load small chunks using the Filter option on the Extraction tab of the DTP.
    *I have concerns about performance if the data were loaded all at once with Extraction Mode = Delta. I do not want to interrupt other existing scheduled jobs much.*
    You have just 230,000 records to load; I don't think that will cause any performance issues. You can load it in one go.
    *After the first load, this DTP will be set to run daily. From the next day on, data will not be loaded in huge amounts again.*
    Yes, further loads from this DTP pick up only the delta records, which should hopefully be fewer.
    Hope this helps.
    Veerendra.

  • Time Capsule only runs at 1.1 Mbit/s up/down during data transfer; internet speed is OK; TC is used to extend an existing network

    Hello Everyone,
    I have a new 2TB Time Capsule running with a MacBook Air. I want to use the TC to access data (Aperture library, iTunes library, other files) and to back up my MacBook Air via Time Machine. I set it up and integrated the TC into my network (FritzBox) by choosing "extend existing network", and it works, BUT only at 1.1 Mbit/second while writing or reading data to/from it. Internet speed is very good, just as before. While 1.1 Mbit/second seems to be OK for Aperture and iTunes (streaming music and movies), it is still painfully slow and I'm not really happy with that.
    I've read that the 1.1 Mbit problem is pretty common when a Time Capsule is integrated and used to extend a network (bridge mode); it also loses all Ethernet connectivity then (which would explain why connecting it by Ethernet didn't help during initial setup and data transfer).
    Is there a solution besides ditching the FritzBox and using the TC to build the network? Basically I still need the FritzBox for its DECT features.
    Thank you for any help in advance and have a nice day,
    Olli

    You have double-hop wireless with this setup.
    You should bridge the TC (that is, router bridge, not wireless bridge) and plug the TC into the FritzBox by Ethernet.
    Then you can set up the TC wireless to reinforce the FritzBox:
    Same wireless name, as the SSID.
    Same security settings (really should be WPA2 AES, i.e. WPA2 Personal on the TC).
    Same password.
    Just a different wireless channel.
    Then you can use 5 GHz on the TC if you are up close and personal.

  • Help me find ghost files after botched data transfer

    I just bought a new MacBook. While setting up the OS on first boot, I hooked up my old machine (which had a dead LCD, hence the new machine) for the data transfer. Well, after a solid hour-plus of data transfer over FireWire, there was an error, after which the old machine no longer seemed to be a valid source of data transfer for the OS.
    So I boot up the new machine and none of my files appear to have transferred. OK, I can transfer the important things by hand, so it's no problem. I boot the old machine holding down T, and it appears as a mounted volume, from which I copy my files.
    Here's the issue: that first, failed transfer attempt left "ghost" files somewhere eating up my HD space. I.e., if I sum up the sizes of all my files and folders in the / directory, they add up to around 60 GB, which seems reasonable given all my music and such. But if I look at my main partition volume in Disk Utility, it claims I'm using a full 179 GB, which is totally unreasonable.
    So there's around 120 GB of "ghost" usage lying around. How do I track it down?

    Log out and then log in to the newly created user account. To transfer the files, choose Go to Folder from the Finder's Go menu, provide /Users/Shared/ as the path, and drag them there.

  • Urgent help with simple BPEL process for reading data from database

    Hello there,
    I need help with a BPEL project.
    I have created a table Employee in the database.
    I created the application, the BPEL project, and the connection to the database properly using the Database Adapter.
    I need to read the records from the database, convert them into XML format, and send them for approval to the BPM worklist.
    Can someone please describe, step by step, what I need to do?
    Thx,
    Dps

    I have created a table in the database with data like Empno, Name, Salary, Comments.
    I created a database connection in a JSP page and connect from there to the BPEL process.
    It initiates the process, and it goes automatically for approval.
    Please refer to the code I created below.
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">
    <%@page import="com.oracle.bpel.client.Locator" %>
    <%@page import="com.oracle.bpel.client.NormalizedMessage" %>
    <%@page import="com.oracle.bpel.client.delivery.IDeliveryService" %>
    <%@page import="javax.naming.Context" %>
    <%@page import="java.util.Hashtable" %>
    <%@ page import="java.sql.*"%>
    <%@ page import="jspprj.DBCon"%>
    <html>
    <head>
    <title>Invoke CreditRatingService</title>
    </head>
    <body>
    <%
    DBCon dbcon = new DBCon();
    Connection conn = dbcon.createConnection();
    Statement st = null;
    ResultSet rs = null;
    Hashtable env = new Hashtable();
    try {
        // Connection settings for the BPEL server.
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.evermind.server.rmi.RMIInitialContextFactory");
        env.put(Context.PROVIDER_URL, "opmn:ormi://localhost:port:home/orabpel"); // BPEL server
        env.put("java.naming.security.principal", "username");
        env.put("java.naming.security.credentials", "password"); // BPEL console
        Locator locator = new Locator("default", "password", env);
        IDeliveryService deliveryService =
            (IDeliveryService) locator.lookupService(IDeliveryService.SERVICE_NAME);
        st = conn.createStatement();
        out.println("connected");
        String query1 = "Select * from EMPLOYEE";
        rs = st.executeQuery(query1);
        // Read the data from the database and convert it into XML format,
        // so there is no need to go to the BPEL console and enter the details.
        while (rs.next()) {
            String xml1 = "<AsynchBPELProcess1ProcessRequest xmlns='http://xmlns.oracle.com/AsynchBPELProcess1'>" +
                "<Empno>" + rs.getString(1) + "</Empno>" +
                "<EmpName>" + rs.getString(2) + "</EmpName>" +
                "<Salary>" + rs.getString(3) + "</Salary>" +
                "<Comments>" + rs.getString(4) + "</Comments>" +
                "</AsynchBPELProcess1ProcessRequest>";
            out.println(xml1);
            // Construct the normalized message and send it to Oracle BPEL Process Manager.
            NormalizedMessage nm = new NormalizedMessage();
            nm.addPart("payload", xml1);
            // EmployeeApprovalProcess is the BPEL process in which the human task is implemented.
            deliveryService.post("EmployeeApprovalProcess", "initiate", nm);
            out.println("BPELProcess CreditRatingService executed!<br>");
        }
    } catch (Exception ee) {
        // In case there is an exception while invoking the first server, invoke the second server, i.e. lsgpas13.
        out.println("BPEL Server lsgpas14 invoking error.<br>" + ee.toString());
    } finally {
        try { if (rs != null) rs.close(); } catch (SQLException ignore) {}
        try { if (st != null) st.close(); } catch (SQLException ignore) {}
        try { if (conn != null) conn.close(); } catch (SQLException ignore) {}
    }
    %>
    </body>
    </html>
    It's working fine, and I want it for bulk approvals. Please help me with a step-by-step procedure, or any other way to implement this.
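    One possible direction for bulk approvals, sketched below under assumptions: instead of calling deliveryService.post() once per row, collect all rows into a single XML payload and post it once, so that one BPEL instance (and one human task) covers all employees. The <EmployeeBulkApprovalRequest> and <Employee> element names here are hypothetical; the request schema of the BPEL process would have to be changed to match, and the Locator/IDeliveryService/ResultSet setup is assumed to be the same as in the JSP above.
    <%
    // Sketch only: batch all rows into one payload and post once.
    // Assumes the BPEL request schema was extended with a (hypothetical)
    // <EmployeeBulkApprovalRequest> wrapper holding one <Employee> per row.
    StringBuilder xml = new StringBuilder(
        "<EmployeeBulkApprovalRequest xmlns='http://xmlns.oracle.com/AsynchBPELProcess1'>");
    while (rs.next()) {
        xml.append("<Employee>")
           .append("<Empno>").append(rs.getString(1)).append("</Empno>")
           .append("<EmpName>").append(rs.getString(2)).append("</EmpName>")
           .append("<Salary>").append(rs.getString(3)).append("</Salary>")
           .append("<Comments>").append(rs.getString(4)).append("</Comments>")
           .append("</Employee>");
    }
    xml.append("</EmployeeBulkApprovalRequest>");
    NormalizedMessage nm = new NormalizedMessage();
    nm.addPart("payload", xml.toString());
    // One post now carries all employees; the human task can present them as a list.
    deliveryService.post("EmployeeApprovalProcess", "initiate", nm);
    %>
    Note that building XML by string concatenation, as in the JSP above, breaks as soon as a column value contains characters such as & or <; escaping the values, or generating the XML with a DOM writer, would be safer.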

  • How to use DTP in a process chain (Master data)

    Hi,
    for a master data loading process chain, I am using DTPs as below:
    Delete PSA
    Load InfoPackage
    DTP
    Is this the correct process or not?
    Is Delete PSA (master data) correct or not?
    What is the correct way?

    Hi,
    for this, the steps everyone has already described are perfectly OK.
    But if you use a delta DTP, it will also solve your problem,
    i.e.:
    Load data up to the PSA through an InfoPackage >> then use a delta DTP to load it to the InfoObject (it will always bring the latest request) >> then run the attribute change run.
    Regards,
    Debjani
    Edited by: Debjani  Mukherjee on Sep 10, 2008 1:44 PM

  • DTPs - Delta Init without Data transfer.

    Hi All,
    I am implementing DTPs in my project, where we replicate the old 3.x extractions with the BW 7.x functionality.
    Can anyone tell me if there is a way to implement the "Delta Init without Data Transfer" scenario with DTPs, for
    both R/3-to-BW (ODS) extraction and for the data marts (ODS to cube) within BW?
    Thanks in advance!
    Cheers
    Rao

    Hi Rao,
    Yes, you can achieve initialization without data transfer with BI 7 DTPs.
    Go to your DTP's Execute tab, open the Processing Mode drop-down list, and select "No Data Transfer; Delta Status in Source: Fetched".
    Assign points if helpful.

  • Activation of Objects with Type Data Transfer Process

    Hi Experts,
    I am stuck on a problem with the activation of a DTP. I loaded the data from the DataSource to the PSA, and there are about 200,000 records.
    Now I wanted to take this load to the DSO new table, so I created a DTP, and when I try to activate the DTP I get this error:
    Activation of Objects with Type Data Transfer Process
        Internal Activation (Data Transfer Process )
             Post Processing/Checking the Activation for Data Transfer Process DTP_49Z7OSAHFAR9O8335ED6
                  Error when activating Data Transfer Process DTP_49Z7OSAHFAR9O8335ED67X11C.
    I tried to activate the DataSource, but it still didn't help. I looked on SDN for related threads, but most of them talk about going to SP 11; we are already on
    *SAP_BW 700 SAPKW70014*
    Your suggestion will be appreciated with maximum points
    Thanks

    Hello Experts,
    Could you please help me out with this?
    I deleted the DTP and created it again,
    logged out of BW and logged back in again.
    In fact, I found SAP Note 1086877 and applied it, but it still didn't help.
    While loading the master data from the PSA via DTP, it worked fine.
    While loading the transaction data, I went to transaction RSBATCH, selected DTP activation in the drop-down, and assigned 3 background processes. Ever since then it has been giving me problems.
    Your help will be appreciated; my data loads are stalled, I mean I can't move forward.
    Thanks

  • About DTP with real-time access.

    Hello Gurus,
    "The setting for error handling only has an impact while repairing a DTP request (repair) and during the conversion of the DTP to a standard DTP (for example, to correct an error during extraction)."
    Will you please give a simple scenario to explain the above words?
    Thank you very much.
    Fran

    Hi,
    Use:
    With SAP NetWeaver 7.0, SPS 14, the following changes and enhancements are available for real-time data acquisition:
    ●      Changes to the menu and the context menu of the monitor for real-time data acquisition
    The menu and the context menus for the individual objects in the monitor for real-time data acquisition have been standardized and enhanced. In particular, it is now possible to assign daemons and data transfer processes on various levels using context menu entries. For a complete overview of the functions in the menu and in the context menus, see Monitor for Real-Time Data Acquisition.
    ●      Assignment of daemons for InfoPackages and data transfer processes (DTP)
    To assign InfoPackages and data transfer processes to a daemon, you can call the monitor for real-time data acquisition in the following ways:
    ○       In the Data Warehousing Workbench using the respective context menu entry Assign RDA Daemon.
    ○       In InfoPackage maintenance using the Schedule tab page, and in data transfer process maintenance using the Execute tab page.
    The button names for jumping to the monitor for real-time data acquisition have changed here. To jump to the monitor, choose Assign Daemon.
    ●      Repair process chains for repairing a broken data transfer or a missing delta in the DataStore object
    In certain situations, a gap in the delta of the DataStore object can occur if there is a closed request in the PSA but there is no corresponding request in the DataStore object. For example, this is the case if a DTP request has terminated due to an error in the transformation. Here, you can create a repair process chain to repair a missing or broken update from the PSA. The repair process chain contains a standard DTP as well as any further processes required for subsequent processing (such as activating the data in the DataStore object or subsequent process chains). When the repair process chain is executed, the complete delta is loaded from the source into the DataStore object and processes for activation and further processing are executed if required.
    ●      Process types for starting and stopping real-time data acquisition
    You can use process chains to control real-time data acquisition using process types Start Real-Time Data Acquisition (RDA) Load Process and Stop Real-Time Data Acquisition (RDA) Load Process.
    More Info :
    http://help.sap.com/saphelp_nw70/helpdata/en/47/2731751c2a2dede10000000a1553f7/frameset.htm
    Regards
    Ram.

  • Processing Mode : DTP

    Hi All,
    We have a DSO which is being populated using two DTPs. For one of the DTPs, the processing mode is:
    Serial Extraction and Processing of Source Package
    For the second DTP, the processing mode is:
    Serial Extraction and Immediate Parallel Processing.
    I tried changing the processing mode of the first DTP to "Serial Extraction and Immediate Parallel Processing", but I don't get that option in the drop-down. I was wondering whether the processing option is source-dependent. Could someone help me out in this regard?

    Hi Dharmendra,
    I am also facing the same issue. My source is the PSA and my target is the DSO, and I am unable to get a parallel processing option.
    Let me know if you get some answers.
    Best Wishes
    Pralay Ahluwalia

  • Clean up "Reports with FIN Interrupted processing"

    Hi All,
    Kindly let us know if there is any function to clean up "Reports with FIN Interrupted processing". Even though "Transfer Restart" is available, there are expense reports which are not posted to ERP, or which you do not wish to post to ERP.
    Can the expense reports be set to status "Reimbursed" in the front end by the administrator?
    We know that the reimbursement notification is sent to C4TE from ERP.
    Kindly Suggest.
    Sincerely,
    Manasa Anantapur

    Hi All,
    The development counterpart would like to know the use case behind the scenario.
    The scenario must be explained in detail before development can take a call.
    Sincerely,
    Manasa

  • Bulletin on the RECEIVING PROCESSING MODE

    Product: MFG_PO
    Date written: 2004-02-10
    BULLETIN ON THE RECEIVING PROCESSING MODE
    ======================================
    PURPOSE
    When you run PO receipts, you may see different behavior depending on the RCV: Processing Mode: problems may occur, processing may complete normally, or debug output may be generated. This note explains the differences between the Receiving Processing Modes to help with future troubleshooting.
    Explanation
    ## Creating Receipts ##
    When receiving transactions are executed, the data is stored with pending status in the RCV_TRANSACTIONS_INTERFACE table for the processing that follows, according to the value of RCV: Processing Mode (On-line, Immediate, or Batch).
    This profile value is only considered when the transaction is executed through the PO form. This profile option is normally set at the system level, but users can change the value in their personal profiles.
    ## Processing Mode values ##
    On-line:
    On-line mode is the most commonly used processing mode.
    When the user saves a transaction, no other application activity can be performed until this process finishes.
    The number of successfully saved transactions, or an error message, is displayed on the screen.
    It uses the $PO_TOP/bin/RCVOLTM Receiving Transaction Manager executable program.
    Immediate:
    When the user saves a transaction, the Receiving Transaction Processor concurrent program starts automatically and processes the data created in the RCV_TRANSACTIONS_INTERFACE table for the receipt.
    Control of the form returns to the user, who can carry out other application activities.
    If an error occurs in the transaction, it is stored in the PO_INTERFACE_ERRORS table, and the error can be viewed in the Transaction Status Summary form.
    Additional information about the error can be found in the log file of the Receiving Transaction Processor concurrent request under View/Requests.
    It uses the $PO_TOP/bin/RCVTP Receiving Transaction Manager executable program.
    Batch:
    When the user saves a transaction, the data remains in the RCV_TRANSACTIONS_INTERFACE table until the Receiving Transaction Processor concurrent program is run via report submission.
    Control of the form returns to the user, who can carry out other application activities.
    When the Receiving Transaction Processor runs, it processes all pending records whose processing_mode is 'BATCH'.
    If an error occurs while the transaction is executed, it is stored in the PO_INTERFACE_ERRORS table and can be viewed on the Transaction Status Summary screen.
    ## Miscellaneous ##
    The Receipt form and its related libraries also perform validation before saving data to the RCV_TRANSACTIONS_INTERFACE table.
    These validation errors are displayed on the form regardless of which processing mode is used.
    The Receiving Transaction Summary form and the Transaction Status Summary form are used to check whether a transaction succeeded or failed.
    In addition to errors, the Receiving Transaction Summary form also shows transactions whose status is 'Pending'.
    When RCV: Processing Mode is 'On-Line', saved transactions should not remain in the RCV_TRANSACTIONS_INTERFACE table in Pending status (although this does happen from time to time).
    Such transactions can be deleted using the Delete icon on the form, and must be processed again after deletion.
    If RCV: Processing Mode is 'Immediate' and the Receiving Transaction Processor has not yet processed the transaction, it is possible for a transaction to exist in Pending status. Usually such transactions are processed within a short time. Occasionally such records remain in RCV_TRANSACTIONS_INTERFACE in Pending status; in this case, delete them using the Delete icon on the form and process them again.
    Example
    Reference Documents
    Note 197860.1

    It has been answered.

  • DataS. Default Data Transfer options

    Hi
    What do the terms below mean in the InfoPackage, under Scheduler -> DataS. Default Data Transfer?
    1. Maximum size of a data packet in KB
    2. Maximum number of dialog processes for sending data
    3. Number of data packets per Info IDoc
    Please help me with an example, and explain what effect increasing or decreasing these three values has: are the three interconnected, or are they independent?
    Thanks
    Puneet

    Hello Puneet,
    These are some standard BW Settings done in transaction SPRO.
    SPRO ->SAP Customizing Implementation Guide->SAP NetWeaver->SAP Business Information Warehouse->Links to Other Systems->Maintain Control Parameters for the data transfer
    Maximum size of a data packet in kilobytes:
    The individual records are sent in packages of varying sizes during the data transfer to the Business Information Warehouse. With this parameter you determine the maximum size of such a package, and therefore how much main memory may be used to create the data package.
    SAP recommends a data package size between 10 and 50 MB.
    Frequency with which status IDocs are sent:
    With this frequency you establish how many data IDocs are described by one Info IDoc.
    Maintain Control Parameters for the data transfer
    Standard settings
    For SAP source systems, you change the control parameter settings in the transaction SBIW (Customizing for Extractors), under Business Information Warehouse -> General Settings -> Control Parameters -> Maintain Control Parameters for Data Transfer .
    Activities
    1. Maximum size of data packages
    For data transfer into BW, the individual data records are sent in packages of variable size. You use this parameter to control how large such a data package typically is. If no entry is maintained, the data is transferred with a standard setting of 10,000 KB per data package. The memory requirement depends not only on the setting for the data package size, but also on the width of the transfer structure and the memory requirement of the relevant extractor.
    2. Frequency
    With the specified frequency, you determine after how many data IDocs an Info IDoc is sent, i.e. how many data IDocs are described by one Info IDoc.
    The frequency is set to 1 by default. This means that an Info IDoc follows every data IDoc. Generally, choose a frequency of between 5 and 10, but not greater than 20.
    The larger the package size of a data IDoc, the lower you must set the frequency. In this way you ensure that, when loading data, you receive information on the current data load status at relatively short intervals.
    In the BW Monitor you can use each Info IDoc to see whether the loading process is running without errors. If this is the case for all the data IDocs in an Info IDoc, then the traffic light in the Monitor is green. One of the things the Info IDocs contain information on, is whether the current data IDocs have been loaded correctly.
    3. Size of a PSA partition
    Here, you can set the number of records at which a new partition is generated. This value is set to 1,000,000 records as standard.
    When you are integrating with Other SAP Components then
    SPRO ->SAP Customizing Implementation Guide->Integration with Other SAP Components->Data Transfer to the SAP Business Information Warehouse->General Settings->Maintain Control Parameters for the data transfer
    Maintain Control Parameters for Data Transfer
    Activities
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry was maintained then the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement not only depends on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred in the package. With this parameter you control the maximum number of data records that the data package may contain.
    By default, a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 × 'Max. Rows' × 1000 bytes; with the default of 100,000 rows, for example, that is roughly 200 MB.
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, or how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. This way, when you upload, you obtain information on the current data load at relatively short intervals.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, where, despite sufficiently large data package sizes, more than 1,000 data packages could result in a single request.
    With an initial value, or when the value is 0, there is no restriction; only a value larger than 0 restricts the number of data packages. For reasons of consistency, this number is not always adhered to exactly. Depending on how much the data is compressed in the qRFC queue, the actual restriction can deviate from the given limit by up to 100.
    Thanks
    Chandran

  • Inventory management and physical inventory data transfer

    hi all,
    can anyone please provide me with an inventory management and physical inventory data transfer tutorial or link?
    Points are guaranteed.
    rgds

    The information behind the blue button for MI34 and MI38 (as you mentioned) does not have enough detail; it's basically one page. Is there another instructional source available? How is the logical file MMIM_PHYSICAL_INVENTORY_DOCUMENTS tied to the physical file, can you clarify? I am not sure how to determine where the sequential file being processed needs to be located. Thanks!

  • WDA-Flash Data transfer Frequency

    Hi
    I have a WDA program with a Flash Island inside. I share some objects (tables) from WDA to Flash, and Flash renders graphics from them.
    So in the WDA context I have the required nodes (one for each table), which I fill in WDA and share with Flash.
    My doubt is about the data transfer frequency between WDA and Flash:
    If I change a node's content, i.e. I fill the node again with new table content, the node is shared automatically with Flash; that's good when I change all the records of that node (table).
    1. If I have a NODE_A with 100 records and I change the content of only one specific record, is all the node content (all 100 records) transferred to Flash? I guess yes.
    2. But if I have NODE_A (100 records) and another NODE_B (150 records), and I change the NODE_A content, is the NODE_B content also transferred to Flash even though that node has not changed?
    3. Suppose I have another node (not a table) with an attribute for entering a date, which is shared with Flash. If I change that date, are NODE_A and NODE_B also transferred to Flash even though they have not changed?
    4. If I make no change to the nodes shared with Flash, but I trigger an event (press a matchcode) or change some node or attribute which is not shared with Flash, are NODE_A, NODE_B, and the date node transferred to Flash anyway?
    I know these are several questions. I need to know this because, if the nodes are always transferred with their full content, then I will need to keep minimal data in the nodes shared with Flash, and perhaps even clear/refresh (object->invalidate) those nodes once Flash receives the data, in order to keep good performance and reasonable server load for the application.
    Can somebody give me some light on this, please?
    Best Regards
    Frank

    Hi Frank,
    From my past experience with FlashIslands, what I understood is:
    The context data is transferred from WDA to Flash whenever there is a change in any of the context nodes shared with Flash,
    and whenever an event is fired from Flash to WDA.
    Please find the answers to your questions below:
    1. Yes, all the node content will be transferred. Actually, not only the node where you made the change, but the data of all the nodes shared with Flash will be transferred.
    2. Yes, see point 1.
    3. Yes.
    4. As far as I know, no. As the change is in the content of a node which is not shared with Flash, there won't be any data transfer from WDA to Flash.
    I hope this is clear!
    Best Regards,
    Srilatha
