Data is not getting posted in ABAP Proxy.

Hi,
I am working on a File-to-ABAP-Proxy scenario, and the data is not reaching the proxy.
In PI, SXMB_MONI shows the message as successful, and in the R/3 SXMB_MONI the status is also successful, but the data is not getting posted to the tables by the ABAP proxy. I have checked the inbound and outbound queues. With the same input data, the ABAPers tested their proxy code directly and the table was updated; when we trigger the message from PI, the tables are not updated. Please help me.
Thanks,
Pragathi.

Hi,
The problem may be:
Case 1: The cache is not getting refreshed (check SXI_CACHE).
Case 2: A queue is blocked (check SMQ1 and SMQ2).
Regards,
Sainath
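
For reference, here is a minimal sketch of what the asynchronous inbound proxy implementation on the R/3 side might look like. All names (interface, input components, Z-table) are hypothetical and depend on the proxy generated from your own service interface:

  METHOD zii_mi_vendor_async_in~execute_asynchronous.
    " Hypothetical inbound proxy implementation that writes the payload to a table.
    DATA ls_vendor TYPE zvendor_data.

    ls_vendor-vendor_code = input-mt_vendor-vendor_code.
    ls_vendor-vendor_name = input-mt_vendor-vendor_name.

    " MODIFY inserts the row if it does not exist yet, otherwise updates it.
    MODIFY zvendor_data FROM ls_vendor.

    " For asynchronous inbound proxies the proxy framework normally triggers the
    " database commit after the method returns, so an explicit COMMIT WORK is
    " usually not needed here; raising a fault on errors makes failures visible
    " in SXMB_MONI instead of leaving the message flagged as successful.
  ENDMETHOD.

If the same payload updates the table when the ABAPers run this code directly but not when it arrives from PI, comparing the after-mapping payload in SXMB_MONI with the data used in the direct test is usually the next step.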

Similar Messages

  • Data is not getting posted to R/3 from CRM

    Hi to All,
    When I create a sales order in CRM, the data is not getting replicated to R/3. The system shows that the BDoc is created, posted, and validated. In the CRM system an outbound queue entry is generated, but in R/3 there is no data in the inbound queue.
    Can anybody tell me the reason behind this?
    Regards,
    Anurag

    Hi,
    Could you please check whether the R/3 site in SMOEAC has the subscription All Business Transactions (MESG)? If not, please create that subscription for the R/3 site.
    If this doesn't work please update the post.
    Hope this helps.
    Thanks,
    Karuna.

  • Files are not getting posted in the destination directory - how to trace this in XI

    Hi,
    Our scenario is proxy to file.
    We post files in .TRG and .DAT format to the destination directory. Because of insufficient space in the destination directory, only the .TRG files are getting posted; the .DAT files are not. But when I check SXMB_MONI in the XI system, no error message appears and the message is processed successfully.
    Why is this not visible, and how can we monitor that the files were not saved at the destination?
    Is it normal in XI that we cannot see whether the files were really saved?

    Hi,
    > It should generate two files as per the scenario, but only one file is getting generated due to lack of space.
    No. If there is no space, the error will pop up in the RWB (Runtime Workbench).
    Check the directory configured in the file adapter you are using.
    Regards
    Agasthuri Doss

  • Idoc not getting posted into XI

    Hello Experts,
    I am trying an IDoc-XI-File scenario.
    I have configured ALE on the backend system and configured IDX1 and IDX2 in the XI system.
    On the sender side the IDoc has been sent without any errors.
    But when I check in XI using SXMB_MONI there are no messages; the IDoc is not getting posted into XI.
    Please suggest, as I am stuck at this point. I am using XI 7.0.
    Thanks
    Suma

    Hi,
    Kindly verify the ALE steps that you have performed...
    Steps for ALE settings:-
    Steps for XI
    Step 1)
         Goto SM59.
         Create new RFC destination of type 3(Abap connection).
         Give a suitable name and description.
         Give the Ip address of the R3 system.
         Give the system number.
         Give the gateway host name and gateway service (3300 + system number).
         Go to the logon security tab.
         Give the lang, client, username and password.
         Test connection and remote logon.
    Step 2)
         Goto IDX1.
         Create a new port.
         Give the port name.
         Give the client number for the R3 system.
         Select the created Rfc Destination.
    Step 3)
         Goto IDX2
         Create a new Meta data.
         Give the Idoc type.
         Select the created port.
    Steps for R3.
    Step 1)
         Goto SM59.
         Create new RFC destination of type 3(Abap connection).
         Give a suitable name and description.
         Give the Ip address of the XI system.
         Give the system number.
         Give the gateway host name and gateway service (3300 + system number).
         Go to the logon security tab.
         Give the lang, client, username and password.
         Test connection and remote logon.
    Step 2)
         Goto WE21.
         Create a port under transactional RFC.(R3->XI)
         Designate the RFC destination created in prev step.
    Step 3)
         Goto SALE.
         Basic settings->Logical Systems->Define logical system.
         Create two logical systems(one for XI and the other for R3)
         Basic settings->Logical Systems->Assign logical system.
         Assign the R3 logical system to respective client.
    Step 4)
         Goto WE20.
         Partner type LS.
         Create two partner profile(one for XI the other for R3).
         Give the outbound or inbound message type based on the direction.
    Step 5)
         Not mandatory.
         Goto BD64.
         Click on Create model view.
         Add message type.
    Step 6)
         Goto WE19
         Give the basic type and execute.
         fill in the required fields.
         Goto IDOC->edit control records.
         Give the following values.(Receiver port,partner no.,part type and sender Partner no. and type)
         Click outbound processing.
    Step 7)
         Go to SM58
         if there are any messages then there is some error in execution.
         Goto WE02.
         Check the status of the IDOC.
         Goto WE47.
         TO decode the status code.
    Use BD87 to check the status of the IDoc.
    If there is an authorization problem, go to the target system, check SU53 for the missing authorization object,
    and assign it to the user.
    SAP R/3:
    SM58 (status check - no error entries should appear)
    WE02 (IDoc status check)
    WE05 (IDoc status check)
    BD87 (IDoc status check)
    XI:
    IDX5 (IDoc check)
    SU53 (authorization check)
    Reward points if helpful...
    PrasHaNt

  • FI - Interface triggering multiple IDocs that are not getting posted

    Hi all,
    I have a scenario where I receive data from SAP PI, which calls the standard IDoc ACC_DOCUMENT03 to post an FI document.
    The problem is that the interface triggers two IDocs with separate numbers but the same message type and IDoc type, and both IDocs fail to post with errors like 'No currency line exists for line item 0000000001', and so on.
    But when I process the same IDoc individually, it gets posted successfully.
    I have also made an implementation in function module IDOC_INPUT_ACC_DOCUMENT.
    As an ABAPer, I am not able to pinpoint the problem in order to take corrective steps.
    Thanks & Regards
    shashikant

    hi,
    If the IDoc is stuck in a qRFC queue, it means the message did not reach the pipeline to be sent to R/3, so something is wrong on the PI side.
    Please go to SMQ2 and check what is happening. To find the queue name, check SXMB_MONI, select the relevant message, and scroll right to the appropriate column.
    let us know.
    Rodrigo P-.

  • Data is not getting updated in table using RFC

    Hi Experts,
    In my scenario I am calling an RFC using an RFC receiver channel. After running the scenario, the channel shows that the RFC executed successfully. But when I check the tables in the R/3 system, the data is not getting updated.
    Moreover, when we executed the RFC manually in the R/3 system, the data was uploaded into the table successfully.
    Could anybody tell me what the reason would be that the data is not uploaded into the table when we send it through XI?
    Regards,
    Sari

    HI Sari,
    As you have a scenario with an RFC receiver, and the tables are not updated when you run it through PI but are updated when you execute the RFC manually, check the following options:
    -- If the RFC communication channel looks fine, the RFC is being triggered successfully. Since the tables are still not updated, go to SXMB_MONI, take the payload after mapping from the log, and compare it with the input you use when executing the RFC manually. Most likely the manual input and the input sent to the RFC through PI differ, and that is causing the problem; the issue is the data, not the RFC communication channel. By comparing the two you will see the difference.
    -- Otherwise, if possible, configure your own user ID in the RFC receiver channel in PI and set a breakpoint on the ABAP side (see the sketch below). When PI calls the RFC you will land in debug mode and can see what is going wrong.
    Thanks,
    Bhupesh
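
    A minimal sketch, assuming a custom RFC-enabled function module with hypothetical names, of where such a breakpoint would go. For calls arriving from PI you need an external (user-specific) breakpoint, since a normal session breakpoint will not fire for an RFC call from another session:

      FUNCTION z_rfc_update_vendor.
        " Hypothetical RFC-enabled function module called by the PI RFC receiver
        " channel. Interface (defined in SE37): IMPORTING iv_vendor_code TYPE char10,
        "                                                 iv_vendor_name TYPE char40.
        DATA ls_vendor TYPE zvendor_data.    " hypothetical Z-table

        " Set an external breakpoint here for the user configured in the RFC
        " receiver channel, then compare the incoming iv_* values with the
        " after-mapping payload shown in SXMB_MONI.
        ls_vendor-vendor_code = iv_vendor_code.
        ls_vendor-vendor_name = iv_vendor_name.

        MODIFY zvendor_data FROM ls_vendor.
      ENDFUNCTION.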

  • Invoice is not getting posted to accounts - it is showing the export error

    The invoice is not getting posted to accounting; it is showing an error about foreign trade / export data, whereas the transaction is within the country. It is maintained for the plants-abroad stock transport order.

    A stock transport order to a location overseas entails crossing national borders and a change of legal regulations, hence you need to maintain foreign trade (FT) data. Please check the FT data in the billing header and items and maintain them; they are simple items like customs offices, means of transport, and so on.
    This will turn the initial red status to green, and then you can post the invoice.
    Regards

  • COPA Documents not getting posted - Doc with unauthorized busi.trans SD00

    Dear Experts,
    CO-PA documents are not getting posted, although billing/accounting documents are getting posted.
    We are getting the error message below when KE4ST is executed.
    Kindly share your inputs for this issue
    Document with unauthorized business transaction "SD00"
    Message no. KE/AD604
    Diagnosis
    Document 6000000026 cannot be transferred to Profitability Analysis (CO-PA) because the business transaction "SD00" (billing document) is not profit-related.
    System Response
    The document is not posted to CO-PA.
    Procedure
    If you decide that the document does contain data relevant to CO-PA and that it therefore should be posted to CO-PA, change your Customizing settings accordingly. Otherwise you can ignore this message.
    Advance Thanks,
    Sanjai

    Hi,
       1. First please check if COPA is active in the system i.e. tcode KEKE
       2. Then check if operating concern is active in tcode KEA0
       3. If so then check if billing document has an account assignment to PA segment. If not then maybe you have changed the account assignment in OKC9.
       4. If all the 3 conditions above are satisfied, then you need to check whether this is a made-to-order (MTO) scenario. A made-to-order scenario can be identified in CO-PA in that the billing document will have an account assignment to a sales order in addition to a PA segment (field vbap-kzvbr = 'E' and vbap-vbelv = sales order number).
      If the above conditions are satisfied, then from a CO-PA point of view it is an MTO. In this case the invoice will be assigned to the sales order and will not be transferred directly to CO-PA. With the order settlement the costs and the revenues (invoice) will then be posted to CO-PA.
    regards
    Waman

  • Please help: IDoc not getting posted

    In my requirement, some of the IDocs are not getting posted. The error message is "Balance not zero" and the IDoc type is INVOIC01. I was told to look into the standard SAP program SAPLMRMC, specifically the function module MRM_AMOUNT_COMPARISON. Please give me suggestions on how to proceed.

    Hi Hema,
    Please check the invoice data that you are trying to send. From the error it seems that the IDoc is failing the data validations in the program/function module that is used to post the IDoc; a rough illustration of the kind of check involved is sketched below.
    Regards,
    Ashwinee
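
    Purely as an illustration - this is not the actual SAP coding of SAPLMRMC / MRM_AMOUNT_COMPARISON, and all values are made up - "Balance not zero" essentially means that the header gross amount in the invoice IDoc does not equal the sum of the item amounts plus tax:

      " Illustrative balance check with hypothetical amounts
      TYPES ty_amount TYPE p LENGTH 13 DECIMALS 2.

      DATA: lv_gross   TYPE ty_amount VALUE '1190.00',   " header gross amount from the IDoc
            lv_tax     TYPE ty_amount VALUE '190.00',    " tax amount from the IDoc
            lv_items   TYPE ty_amount,
            lv_balance TYPE ty_amount,
            lt_amounts TYPE STANDARD TABLE OF ty_amount,
            lv_amount  TYPE ty_amount.

      APPEND '600.00' TO lt_amounts.                     " invoice item amounts
      APPEND '350.00' TO lt_amounts.

      LOOP AT lt_amounts INTO lv_amount.
        lv_items = lv_items + lv_amount.
      ENDLOOP.

      lv_balance = lv_gross - lv_items - lv_tax.         " 1190 - 950 - 190 = 50

      IF lv_balance <> 0.
        " This is the situation that ends in IDoc status 51 with 'Balance not
        " zero': check the amount, currency and tax segments of the failing IDocs.
        WRITE: / 'Balance not zero:', lv_balance.
      ENDIF.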

  • Data is not getting displayed in the report from an Infoset.

    Hi All,
    I have a report based on an InfoSet. This report displays the data in the Dev environment. After it was transported to QA, it does not display the data in BEx or in RSRT in the QA environment. The patch levels of Dev and QA are the same, and the queries are the same in Dev and QA as well.
    When I try to display the data directly from the InfoSet (right-click - display data), I am able to view the data in QA.
    Could anyone please suggest why the data is not getting displayed in the query designer?
    Thanks & Regards,
    A.V.N.Rao

    Hi Ashish,
    I ran the "ZPS/!ZPS" in RSRT where ZPS is the infoset name. In Dev, it displayed the values. In QA, it displayed the below messages:
    ECharacteristic 0TCAKYFNM does not exist. Check authorizations
    WThere are calculated elements. These results are bracketed [  ]
    and below that, it displayed the values for Number of records. But, it has not displayed the values for the other figures.
    Does this has any impact in QA.
    Thanks & Regards,
    AVN Rao.

  • Data is not getting refreshed in Dashboard

    Hi, I have implemented Xcelsius on top of a SAP BI system.
    I have the following architecture:
    SAP BI cube --> BO XI 3.1 universe based on the cube --> QaaWS on the universe --> Xcelsius dashboard on the QaaWS.
    When I created a dashboard and ran it, it worked fine and showed me the data lying in the cube.
    But when the data is updated in the cube and I refresh the dashboard again, it still shows me the old data; the data is not getting refreshed in the dashboard.
    What could be the possible cause?
    Does QaaWS always fetch the data at run time?
    Thanks

    A QaaWS web service call needs to be invoked to get the data updates from the data source. Xcelsius provides several options to control how frequently web service connections are called/refreshed ('refresh on load', 'refresh every X seconds', trigger cells...).
    You should also keep in mind that, since QaaWS hits the data source each time the web service is called, this can bring some performance issues (the web service calls the WebI server to run the corresponding query, which performs a refresh). To mitigate this, since XI 3.0, QaaWS features a cache mechanism that stores the data results from each query refresh (corresponding to a web service call), in order to serve them again more efficiently if the same request is performed during a specific time interval (which typically corresponds to dashboard interactions).
    Cache sessions are keyed by user name and prompt values, and the cache is emptied after a default duration without any interaction (request from the same session).
    Cache timeout duration is set for each QaaWS query, and can be tuned from QaaWS Designer when modifying/creating the query : go to Advanced... button on first step of query edition wizard, cache lifetime corresponds to timeout value (in seconds) displayed on Advanced parameter panel (default value being set to 60 seconds).
    Cache lifetime may also be an explanation why you do not see data refreshed (if you are with QaaWS XI 3.0 or later).
    Hope that helps,
    David.

  • Data is not getting replicated to the destination DB

    I have set up Streams replication on two databases running Oracle 10.1.0.2 on Windows.
    I followed the Metalink document's steps for setting up one-way replication between two Oracle databases using Streams at schema level.
    I entered a few records in the source DB, but the data is not getting replicated to the destination DB. Could you please guide me on how to analyse this problem and reach a solution?
    The configuration steps followed from the Metalink document are listed below.
    ==================
    Set up ARCHIVELOG mode.
    Set up the Streams administrator.
    Set initialization parameters.
    Create a database link.
    Set up source and destination queues.
    Set up supplemental logging at the source database.
    Configure the capture process at the source database.
    Configure the propagation process.
    Create the destination table.
    Grant object privileges.
    Set the instantiation system change number (SCN).
    Configure the apply process at the destination database.
    Start the capture and apply processes.
    Section 2 : Create user and grant privileges on both Source and Target
    2.1 Create Streams Administrator :
    connect SYS/password as SYSDBA
    create user STRMADMIN identified by STRMADMIN;
    2.2 Grant the necessary privileges to the Streams Administrator :
    GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE,DBA to STRMADMIN;
    In 10g :
    GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE,DBA to STRMADMIN;
    execute DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE('STRMADMIN');
    2.3 Create streams queue :
    connect STRMADMIN/STRMADMIN
    BEGIN
    DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'STREAMS_QUEUE_TABLE',
    queue_name => 'STREAMS_QUEUE',
    queue_user => 'STRMADMIN');
    END;
    Section 3 : Steps to be carried out at the Destination Database PLUTO
    3.1 Add apply rules for the Schema at the destination database :
    BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name => 'SCOTT',
    streams_type => 'APPLY ',
    streams_name => 'STRMADMIN_APPLY',
    queue_name => 'STRMADMIN.STREAMS_QUEUE',
    include_dml => true,
    include_ddl => true,
    source_database => 'REP2');
    END;
    3.2 Specify an 'APPLY USER' at the destination database:
    This is the user who would apply all DML statements and DDL statements.
    The user specified in the APPLY_USER parameter must have the necessary
    privileges to perform DML and DDL changes on the apply objects.
    BEGIN
    DBMS_APPLY_ADM.ALTER_APPLY(
    apply_name => 'STRMADMIN_APPLY',
    apply_user => 'SCOTT');
    END;
    3.3 Start the Apply process :
    DECLARE
    v_started number;
    BEGIN
    SELECT decode(status, 'ENABLED', 1, 0) INTO v_started
    FROM DBA_APPLY WHERE APPLY_NAME = 'STRMADMIN_APPLY';
    if (v_started = 0) then
    DBMS_APPLY_ADM.START_APPLY(apply_name => 'STRMADMIN_APPLY');
    end if;
    END;
    Section 4 :Steps to be carried out at the Source Database REP2
    4.1 Move LogMiner tables from SYSTEM tablespace:
    By default, all LogMiner tables are created in the SYSTEM tablespace.
    It is a good practice to create an alternate tablespace for the LogMiner
    tables.
    CREATE TABLESPACE LOGMNRTS DATAFILE 'logmnrts.dbf' SIZE 25M AUTOEXTEND ON
    MAXSIZE UNLIMITED;
    BEGIN
    DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS');
    END;
    4.2 Turn on supplemental logging for DEPT and EMPLOYEES table :
    connect SYS/password as SYSDBA
    ALTER TABLE scott.dept ADD SUPPLEMENTAL LOG GROUP dept_pk(deptno) ALWAYS;
    ALTER TABLE scott.EMPLOYEES ADD SUPPLEMENTAL LOG GROUP dep_pk(empno) ALWAYS;
    Note: If the number of tables is large, supplemental logging can be set at the database level.
    4.3 Create a database link to the destination database :
    connect STRMADMIN/STRMADMIN
    CREATE DATABASE LINK PLUTO connect to
    STRMADMIN identified by STRMADMIN using 'PLUTO';
    Test the database link to be working properly by querying against the
    destination database.
    Eg : select * from global_name@PLUTO;
    4.4 Add capture rules for the schema SCOTT at the source database:
    BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name => 'SCOTT',
    streams_type => 'CAPTURE',
    streams_name => 'STREAM_CAPTURE',
    queue_name => 'STRMADMIN.STREAMS_QUEUE',
    include_dml => true,
    include_ddl => true,
    source_database => 'REP2');
    END;
    4.5 Add propagation rules for the schema SCOTT at the source database.
    This step will also create a propagation job to the destination database.
    BEGIN
    DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
    schema_name => 'SCOTT',
    streams_name => 'STREAM_PROPAGATE',
    source_queue_name => 'STRMADMIN.STREAMS_QUEUE',
    destination_queue_name => 'STRMADMIN.STREAMS_QUEUE@PLUTO',
    include_dml => true,
    include_ddl => true,
    source_database => 'REP2');
    END;
    Section 5 : Export, import and instantiation of tables from
    Source to Destination Database
    5.1 If the objects are not present in the destination database, perform
    an export of the objects from the source database and import them
    into the destination database
    Export from the Source Database:
    Specify the OBJECT_CONSISTENT=Y clause on the export command.
    By doing this, an export is performed that is consistent for each
    individual object at a particular system change number (SCN).
    exp USERID=SYSTEM/manager@rep2 OWNER=SCOTT FILE=scott.dmp
    LOG=exportTables.log OBJECT_CONSISTENT=Y STATISTICS = NONE
    Import into the Destination Database:
    Specify STREAMS_INSTANTIATION=Y clause in the import command.
    By doing this, the streams metadata is updated with the appropriate
    information in the destination database corresponding to the SCN that
    is recorded in the export file.
    imp USERID=SYSTEM@pluto FULL=Y CONSTRAINTS=Y FILE=scott.dmp IGNORE=Y
    COMMIT=Y LOG=importTables.log STREAMS_INSTANTIATION=Y
    5.2 If the objects are already present in the destination database, there
    are two ways of instantiating the objects at the destination site.
    1. By means of metadata-only export/import:
    Specify ROWS=N during export.
    Specify IGNORE=Y during import along with the above import parameters.
    2. By manually instantiating the objects:
    Get the Instantiation SCN at the source database:
    connect STRMADMIN/STRMADMIN@source
    set serveroutput on
    DECLARE
    iscn NUMBER; -- Variable to hold instantiation SCN value
    BEGIN
    iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
    DBMS_OUTPUT.PUT_LINE ('Instantiation SCN is: ' || iscn);
    END;
    Instantiate the objects at the destination database with
    this SCN value. The SET_TABLE_INSTANTIATION_SCN procedure
    controls which LCRs for a table are to be applied by the
    apply process. If the commit SCN of an LCR from the source
    database is less than or equal to this instantiation SCN,
    then the apply process discards the LCR. Else, the apply
    process applies the LCR.
    connect STRMADMIN/STRMADMIN@destination
    BEGIN
    DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN(
    SOURCE_SCHEMA_NAME => 'SCOTT',
    source_database_name => 'REP2',
    instantiation_scn => &iscn );
    END;
    Enter value for iscn:
    <Provide the value of SCN that you got from the source database>
    Note: In 9i, you must instantiate each table individually.
    In 10g, the recursive => true parameter of DBMS_APPLY_ADM.SET_SCHEMA_INSTANTIATION_SCN
    is used for instantiation.
    Section 6 : Start the Capture process
    begin
    DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'STREAM_CAPTURE');
    end;
    /

    Same problem here - the data is not replicated.
    It is captured and propagated from the source, but not applied.
    There are also no apply errors in DBA_APPLY_ERROR. It looks like the LCRs propagated from the source DB do not reach the target queue. Can I get any help on this?
    The queried results are as follows:
    1. At the source (capture process):
       Capture process: CP01, session ID: 16, serial number: 7, state: CAPTURING CHANGES,
       total redo entries scanned: 1010143, total LCRs enqueued: 72
    2. Data propagated from the source:
       Total time executing in seconds: 7, total events propagated: 13, total bytes propagated: 6731
    3. Apply at the target (nothing is applied):
       Coordinator process name: A001, session ID: 154, serial number: 33, state: APPLYING,
       total transactions received: 0, total transactions applied: 0, total apply errors: 0
    4. At the target (nothing in the buffer):
       Queue owner: STRMADMIN, queue name: STREAMS_QUEUE, LCRs in memory: 0, spilled LCRs: 0,
       total captured LCRs in buffered queue: 0

  • Data is not getting updated in DB table

    Hi all,
    I am doing an IDoc-to-JDBC scenario.
    I am triggering the IDoc from R/3 and the data goes into a DB table.
    sender side:       ZVendorIdoc
    receiver side:
    DT_testVendor
      Table
        tblVendor
          action   UPDATE_INSERT
          access   1:unbounded
            cVendorName  1
            cVendorCode  1
            fromdate     1
            todate       1
          Key
            cVendorName  1
    If I trigger the IDoc for, for example, vendors 2005, 2006 and 2010, the data is updated in the table.
    But if I trigger the IDoc again for the same vendor numbers, the data does not get updated in the DB table, although the message is successful in both MONI and RWB.
    Please suggest whether any change needs to be done to update the data.
    Regards
    sandeep sharma

    Hi Ravi
    You are right, the vendor number is my key field. The problem is that when I send the data again it should update the records, but it does not update them.
    I did an experiment with this: I deleted all the records from the table and then triggered the IDoc for vendors 2005, 2006 and 2010; after this the data was updated in the table. I then deleted the rows for vendors 2006 and 2010 and kept the row for vendor 2005.
    Then I triggered the IDoc again for vendors 2005, 2006 and 2010. This should update the row for 2005 and insert rows for 2006 and 2010, but I am surprised that it is not updating the data.
    Thanks
    sandeep
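
    For what it's worth, the UPDATE_INSERT action of the JDBC receiver behaves roughly like the Open SQL sketch below (the table and fields are hypothetical ABAP mirrors of tblVendor above): an UPDATE is attempted first using the Key columns in the WHERE clause, and an INSERT is only performed when no row was updated, so resending vendor 2005 should update its row while 2006 and 2010 should be inserted again:

      " Logical equivalent of UPDATE_INSERT for one row, sketched in Open SQL
      DATA ls_vendor TYPE ztblvendor.            " hypothetical mirror of tblVendor

      ls_vendor-cvendorname = '2006'.
      ls_vendor-cvendorcode = 'V2006'.
      ls_vendor-fromdate    = sy-datum.
      ls_vendor-todate      = '99991231'.

      UPDATE ztblvendor
         SET cvendorcode = ls_vendor-cvendorcode
             fromdate    = ls_vendor-fromdate
             todate      = ls_vendor-todate
       WHERE cvendorname = ls_vendor-cvendorname.   " Key column of the structure

      IF sy-subrc <> 0.                             " no existing row was updated
        INSERT ztblvendor FROM ls_vendor.
      ENDIF.

    The sketch only illustrates the update-then-insert order; the actual SQL is generated by the JDBC adapter on the database side.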

  • Table PA0045 is getting updated but data is not getting displayed in Infty

    Hi,
    When I save entries for any employee in infotype 0045, the table PA0045 is updated with the record.
    But for a few users who created the record, the data is not displayed in one tab of the tabstrip, while the rest of the tabs are filled with data. This is the case with only a few users.
    Regards,
    Jyoti

    Check whether there is proper data to be shown in the tabstrip,
    e.g. Loan Amount Granted in the Basic Data tab.

  • Data is not getting loaded in the Cube

    Hi Experts,
    I had a cube for which I created aggregates; then I deleted the data of the cube, made some changes to it, and am trying to load the data again from the PSA.
    But the data is not getting loaded into the cube.
    Can anyone tell me whether I have to delete the aggregates before I load fresh data into my cube? If yes, can you tell me how to delete them and load the data again?
    If it is something else, please help me resolve my issue.
    Thanks

    Hi,
    Deactivate the aggregates and then load the data from the PSA to the cube.
    Then reactivate the aggregates; while reactivating them, you can also have them filled.
    Regards,
    Anil Kumar Sharma. P
