Data stuck in R/3

Hi gurus,
I am extracting data from R/3. Some records were missed in a data packet, but the delta load completed successfully. The report is now giving wrong results because of the missing records. How can I extract those particular records from R/3 into the target?
please treat as urgent.
priya.

Hi Priya,
It seems that you have missed the records on the R/3 side.
A repair full request works fine when the data is present in the setup tables, but since you missed the records in the delta, a repair full for that selection will return zero records.
So a repair full does not help in your case, because the missing data is delta-related.
As for selective setup loads, you need to handle this activity carefully: it requires deleting the setup tables and then running a selective setup load for the missed records, which in turn needs approval.
thanks
Assign points if this helps..

Similar Messages

  • Data stuck in ALE inbox

    Hi,
    While trying to load master data from ECC, the data got stuck in the ALE inbox (E1RSSH -> E1RSHIE) and process monitoring pops up an error that node IDs 21, 35 and 65 are missing. Please help me with the process of eliminating those records and loading the data into the InfoObject.

    Hi,
    Is the job in the source system still active? The job name is BI_reqnumber.
    Also check SM66 in both BW and the source system. If you are loading a hierarchy, it generally takes time to load.
    Regards,
    Pravin
    Edited by: Pravin Karkhanis on May 28, 2010 3:19 AM

  • Administration Activities

    Hi,
    1. Can anyone tell me the list of administration activities in XI?
    2. What are the ABAP and Java proxies in XI?

    Hi,
    Administration activities:
    1) Installation of the XI box, which includes installation of the ABAP stack, Java stack, Oracle DB, etc.
    2) Post-installation configuration activities.
    3) Creation of products, SLD business systems and technical systems.
    4) Exchange profile configuration.
    5) Assigning roles to users.
    6) IDoc and proxy configuration.
    7) Cache handling.
    8) Transport handling.
    9) Queue management.
    And many more. Refer to the links below:
    XI Administration
    A Beginner’s Guide to SAP XI Settings, Part II
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6bd6f69a-0701-0010-a88b-adbb6ee89b34
    A Beginner’s Guide to SAP XI Settings, Part I
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/73527b2c-0501-0010-5398-c4ac372c9692
    General Configuration Steps
    http://help.sap.com/saphelp_nw04/helpdata/en/cf/230240d981e469e10000000a155106/content.htm
    SAP Exchange Infrastructure (XI) : Installation & CONFIGURATION GUIDE
    http://help.sap.com/saphelp_nw04/helpdata/en/d7/f01a403233dd5fe10000000a155106/frameset.htm
    Personal Settings
    http://help.sap.com/saphelp_erp2004/helpdata/en/e9/c4cc9b03a422428603643ad3e8a5aa/content.htm
    Roles and Tool Access : Administration, Technical Configuration,Design,Configuration,Monitoring
    http://help.sap.com/saphelp_nw04/helpdata/en/89/05793c05f0807be10000000a11405a/content.htm
    http://www.forumtopics.com/busobj/viewtopic.php?t=59586&start=15&postdays=0&postorder=asc
    SLD:
    How To…Handle the SLD for SAP XI
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9e76e511-0d01-0010-5c9d-9f768d644808
    How To…Handle Caches in SAP XI 3.0
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1a69ea11-0d01-0010-fa80-b47a79301290
    Queue:
    /people/sap.india5/blog/2006/01/03/xi-asynchronous-message-processing-understanding-xi-queues-part-i
    SMQ2 - Inbound queue.
    Entries here, if they are due to an HTTP response or some other temporary issue that is now over, can be reprocessed: select the queue, right-click it and choose "Execute LUW".
    SMQ1 - Outbound queue.
    SXMB_MONI - find the message, go to the right-hand side and check the queue option; it will show whether the message has blocked the queue.
    SMQ1-outbound queue
    SMQ2-inbound queue
    SMQ1 – qRFC monitor for the outbound queue. You use this transaction to monitor the status of the LUWs in the outbound queue and to restart any hanging queues manually.
    SMQ2 – qRFC monitor for the inbound queue. You use this transaction to monitor the status of the LUWs in the inbound queue.
    See here for details:
    http://help.sap.com/saphelp_nw04/helpdata/en/76/e12041c877f623e10000000a155106/content.htm
    Outbound Queue Administration
    http://help.sap.com/saphelp_nw04/helpdata/en/4c/cf5c3c3b067331e10000000a114084/content.htm
    Inbound Queue Administration
    http://help.sap.com/saphelp_nw04/helpdata/en/b0/df5f3c8dde1c67e10000000a114084/content.htm
    Queue status for SMQ1:
    http://help.sap.com/saphelp_nw04/helpdata/en/ad/7b623c6374a865e10000000a11402f/content.htm
    queue status for SMQ2
    http://help.sap.com/saphelp_nw04/helpdata/en/d9/b9f2407b937e7fe10000000a1550b0/content.htm
    As part of monitoring you can also view these messages in the Adapter Engine, but in error status.
    Monitoring through various components:
    Runtime workbench -
    http://help.sap.com/saphelp_nw04/helpdata/en/88/21bc3ff6beeb0ce10000000a114084/content.htm
    Trace and logs -
    http://help.sap.com/saphelp_nw04/helpdata/en/64/0b59010a65de44be4f26cb57b9580d/content.htm
    Data archiving -
    http://help.sap.com/saphelp_nw04/helpdata/en/f5/d347ddec72274ca2abb0d7682c800b/content.htm
    Software components monitoring -
    http://help.sap.com/saphelp_nw04/helpdata/en/9d/f984809f41b74ba010d15e1ed49065/content.htm
    Try transaction SMQR if monitoring shows data stuck in a queue.
    /people/community.user/blog/2006/09/21/xi-rfcadapter-reconnect-issue
    Proxies:
    There are two types of proxies:
    1) ABAP proxy
    2) Java proxy
    ABAP proxy:
    For an ABAP proxy you first need to do the configuration in XI as well as in R/3.
    ABAP Proxy configuration:
    /people/vijaya.kumari2/blog/2006/01/26/how-do-you-activate-abap-proxies
    A client proxy is an outbound proxy from R/3:
    Client Proxy -
    /people/ravikumar.allampallam/blog/2005/03/14/abap-proxies-in-xiclient-proxy
    Please check this PDF:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3dfac358-0d01-0010-5598-d4900e81a30c
    Server proxy:
    It is the inbound proxy to the R/3 system:
    Server Proxy -
    /people/siva.maranani/blog/2005/04/03/abap-server-proxies
    File to Inbound Proxy:
    /people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy
    Debugging Inbound Proxy:
    /people/stefan.grube/blog/2006/07/28/xi-debug-your-inbound-abap-proxy-implementation
    SPROXY not working:
    /people/vijaya.kumari2/blog/2006/01/26/how-do-you-activate-abap-proxies
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83deb90-0201-0010-189c-8d3ff87572f8
    java proxy:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    chirag

  • Demantra 7.3 Installation-Need Help

    Dear Demantra Guru's,
    I installed Demantra Demand Planning 7.3.1 standalone without any integrations, so when I open any worksheet there is no data.
    I am not able to view any data. How can I import data into Demantra so that I can see some data in the worksheets? Oracle suggested Doc ID 1444437.1.
    For dummy data, please download files.zip archive .
    This files.zip contains:
    t_src_item_tmpl
    t_src_loc_tmpl
    t_src_sales_tmpl
    I don't understand how to upload this data. I am stuck at the final stage, post installation.
    Greatly appreciate your help!!
    Thank You
    Madhav

    Hi Madhav,
    The  t_src_item_tmpl, t_src_loc_tmpl, t_src_sales_tmpl are staging tables in Demantra. You need to populate these staging tables with the data in the files that you have downloaded from metalink. Once the data is present in the staging tables, you need to run EP_LOAD_MAIN procedure (or the individual EP_LOADs available for items, location, levels etc. ) that validates the data and moves it to the core Demantra tables. You should then be able to see useful data in the worksheets.
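    For illustration only, here is a minimal SQL*Plus sketch of that flow. The staging-table column names and the exact EP_LOAD_MAIN signature are assumptions, so please verify them in your own Demantra schema before running anything:
    -- Hypothetical sketch: first load the downloaded template files into the staging tables,
    -- for example with SQL*Loader or external tables (the column list below is assumed).
    -- INSERT INTO t_src_sales_tmpl (item_code, location_code, sales_date, quantity) VALUES (...);
    -- Confirm the staging tables actually contain rows before running the load.
    SELECT COUNT(*) FROM t_src_item_tmpl;
    SELECT COUNT(*) FROM t_src_loc_tmpl;
    SELECT COUNT(*) FROM t_src_sales_tmpl;
    -- Validate the staged data and move it into the core Demantra tables.
    BEGIN
      EP_LOAD_MAIN;  -- parameter-less call assumed; check the procedure signature in your installation
    END;
    /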
    Hope this helps. Good luck!
    Regards,
    Ruby Rajakumar

  • PSA request stuck in yellow state if no data is available

    If there is no data available in the source system, the PSA request for the DataSource is not set to success; instead it stays stuck in yellow state.
    However, if data is available then there is no issue.
    We are running a delta load through a process chain, and every day we have to set the status to success manually for the chain to continue its execution.
    Any help would be appreciated

    Time out time (TOT)
    This setting is useful when we have huge data volumes and the load takes a long time. By default the time-out is maintained at some value (for example 7 hours).
    When that time is exceeded we get a time-out error for the InfoPackage; in that case we need to increase the wait time for the InfoPackage.
    Treatment for warnings (TFW) - if we receive any warning during the load, by default the request will be red. In general warnings are allowed; once you check what the warning is, you can set the request to green accordingly.

  • Stuck uploading Flash data into HTML

    Hi all,
    I am new to Flash. I know animation and a little bit of ActionScript 3.0, and I got stuck in this way:
    Presently, I am working with Flash CS3 and ActionScript 3.0.
    Suppose I have a button, a check box and a combo box within a Flash application. When I click on the image selected from the combo box, a small dot appears at the pixel position that was clicked.
    Now, how can I pass these pixel values from Flash to the HTML page through a URL, so that we can store the pixel values in the database? The pixel values are held in Flash inside a flash.geom.Point object.
    Please help me... Thanks in advance.
    Srihari.Ch

    You can use "ExternalInterface" class to pass this data to
    javascript (if you want to pass this data to html page). If you
    want to send this info to a server, you can use "URLLoader"
    class.

  • Data packets stuck in yellow

    Hi ,
    We are on BW 3.5 and have had daily loads for a long time.
    Nowadays a lot of data packets are getting stuck in yellow and we are pushing them manually, which delays the availability of the data to users.
    We don't have any recent changes in the system, and we are on BW support pack 12.
    How can we get rid of this?
    Regards,
    RajNi

    Sometimes the packets get stuck because indices prevent them from getting loaded properly.
    Go to the Performance tab of the cube (what I am saying is relevant only for cube loads) -> Create Index (Batch) -> there will be checkboxes to create and drop indices on load -> check both of them and then save -> it will ask if you want to transport the setting -> select no and then try the load again.

  • How can we show data in rows when it is in Columns........

    I am stuck with a simple but complex problem.
    I have a report with data in vertical rows; the need of the hour is how to show that data in a column structure.
    here is the table sample which i have in Database..
    Location      chrg_type    Effective_date
    xxxxxxx        xxxx          xx-xx
    This is the structure: the Effective_date column holds several months in a single column, and I want to show the effective date (or effective month) in separate columns instead of a single column.
    my required table is....
                                Effective_date     Effective_date
    Location      chrg_type     xx-xx              xx-xx
    xxxxxxx        xxxx          xx-xx
    any tough heads?????

    Hi Sunil,
    Your first problem is that you are going to need one more field to accomplish your goal. What type of data do you want under your new date columns?
    Anyway, once you have that you need to move on to how to move the data out into columns based on date.
    Try something like this.
    0_Date = IF DATEPART('m',{tbl.Effective_date}) = DATEPART('m',CurrentDate) AND
    DATEPART('yyyy',{tbl.Effective_date}) = DATEPART('yyyy',CurrentDate)
    THEN {tbl.DataField}
    1_Date = IF DATEPART('m',{tbl.Effective_date}) = DATEPART('m',DATEADD('m', -1, CurrentDate)) AND
    DATEPART('yyyy',{tbl.Effective_date}) = DATEPART('yyyy',DATEADD('m', -1, CurrentDate))
    THEN {tbl.DataField}
    2_Date = IF DATEPART('m',{tbl.Effective_date}) = DATEPART('m',DATEADD('m', -2, CurrentDate)) AND
    DATEPART('yyyy',{tbl.Effective_date}) = DATEPART('yyyy',DATEADD('m', -2, CurrentDate))
    THEN {tbl.DataField}
    ... Repeat this process until you have all of the columns you need, following this format.
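    If you would rather do the pivot in the database instead of in report formulas, here is a minimal Oracle-style sketch of the same idea; the table and column names are assumptions, not taken from your post:
    -- Hypothetical pivot by month using conditional aggregation
    -- (charge_table, location, chrg_type and effective_date are assumed names).
    SELECT location,
           chrg_type,
           MAX(CASE WHEN TRUNC(effective_date, 'MM') = TRUNC(SYSDATE, 'MM')
                    THEN effective_date END) AS month_0,
           MAX(CASE WHEN TRUNC(effective_date, 'MM') = TRUNC(ADD_MONTHS(SYSDATE, -1), 'MM')
                    THEN effective_date END) AS month_1,
           MAX(CASE WHEN TRUNC(effective_date, 'MM') = TRUNC(ADD_MONTHS(SYSDATE, -2), 'MM')
                    THEN effective_date END) AS month_2
      FROM charge_table
     GROUP BY location, chrg_type;
    Each MAX(CASE ...) column plays the same role as one of the n_Date formulas above.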
    Hope this works for you,
    Jason

  • Report is not getting data from remote cube through MultiProvider

    Hi SAPians
    I am stuck with a problem in the reconciliation report in BW 3.5.
    The report was built on a MultiProvider, which was created on a basic cube and a remote cube.
    Both cubes have the same DataSource, and all the objects are in active version and look good.
    When I execute the report, I only get data from the basic cube and no data comes from the remote cube.
    I have checked the characteristic 0INFOPROV in the MultiProvider and it is assigned to both cubes.
    What might be the problem?
    Please help me in this regard
    Thanks in advance
    Regards
    Arjun

    Hi
    In the reconciliation MultiProvider, include 0INFOPROVIDER = remote cube.
    If data is still not coming, you can be sure that connectivity with the source system of the remote cube is the issue.
    Check with Basis to solve the connectivity issue.
    Ensure the remote cube is consistent.
    Bye

  • Script to collect information on stuck records in the RECEIVING OPEN INTERFACE TABLE

    Product: MFG_PO
    Date written: 2005-01-27
    Script to collect information on stuck records in the RECEIVING OPEN INTERFACE TABLE
    =================================================================
    PURPOSE
    Collects information about stuck receipt records in the RCV_TRANSACTIONS_INTERFACE table, to help identify the cause of the stuck records and resolve them.
    (Oracle Purchasing - Version: 11.5.2 to 11.5.9)
    Explanation
    How can you collect additional information about the source document of a receipt stuck in the Receiving Open Interface tables (RCV_TRANSACTIONS_INTERFACE)?
    The source document can be one of the following:
    * Purchase Order
    * Return Material Authorization (RMA)
    * Internal Requisition
    The scripts introduced below are useful when there are many records in the interface tables and you want to collect information on a large number of records at once.
    If there are only a few records in the interface tables, it is better to use the appropriate diagnostic script introduced in Note 171257.1 - Oracle Procurement (Purchasing)/iProcurement Diagnostic Tests.
    Example
    1. Script to collect additional information on Purchase Order receipts that have errored:
    SELECT SUBSTR(POH.SEGMENT1,1,10) "PO Number",
    POR.RELEASE_NUM "Release Num",
    POL.LINE_NUM "Line Num",
    RTI.INTERFACE_TRANSACTION_ID "Intf Trx Id",
    RTI.TRANSACTION_DATE "Trx Date",
    RTI.PROCESSING_MODE_CODE "Proc Mode",
    RTI.TRANSACTION_STATUS_CODE "Trx Status",
    RTI.PROCESSING_STATUS_CODE "Proc Status",
    SUBSTR(PIE.ERROR_MESSAGE,1,75) "Error Message",
    POH.ORG_ID "Op Unit Id",
    RTI.PO_HEADER_ID "PO Hdr Id",
    RTI.PO_RELEASE_ID "Rel Id",
    RTI.PO_LINE_ID "PO Line Id",
    RTI.PO_LINE_LOCATION_ID "Line Loc Id",
    RTI.QUANTITY "Intf Qty",
    POLL.QUANTITY "Order Qty",
    POLL.QUANTITY_RECEIVED "Qty recvd",
    POD.QUANTITY_DELIVERED "Qty Delv",
    NVL(POLL.CLOSED_CODE,'OPEN') "Closed Code",
    OOD.ORGANIZATION_CODE "To Inv Org",
    RTI.DESTINATION_TYPE_CODE "Dest Type",
    RTI.TRANSACTION_TYPE "Intf Trx Type",
    NVL(POLL.QTY_RCV_EXCEPTION_CODE,'NONE') "Qty Exception",
    POLL.QTY_RCV_TOLERANCE "Qty Tolerance",
    POLL.RECEIVE_CLOSE_TOLERANCE "Receipt Tolerance"
    FROM RCV_TRANSACTIONS_INTERFACE RTI,
    PO_HEADERS_ALL POH,
    PO_RELEASES_ALL POR,
    PO_LINES_ALL POL,
    PO_LINE_LOCATIONS_ALL POLL,
    ORG_ORGANIZATION_DEFINITIONS OOD,
    PO_DISTRIBUTIONS_ALL POD,
    PO_INTERFACE_ERRORS PIE
    WHERE POH.PO_HEADER_ID = RTI.PO_HEADER_ID
    AND POR.PO_RELEASE_ID(+) = RTI.PO_RELEASE_ID
    AND POL.PO_LINE_ID = RTI.PO_LINE_ID
    AND OOD.ORGANIZATION_ID(+) = RTI.TO_ORGANIZATION_ID
    AND POLL.LINE_LOCATION_ID = RTI.PO_LINE_LOCATION_ID
    AND POD.LINE_LOCATION_ID = POLL.LINE_LOCATION_ID
    AND RTI.INTERFACE_TRANSACTION_ID = PIE.INTERFACE_TRANSACTION_ID(+);
    2. Script to collect additional information on RMA receipts that have errored:
    SELECT RTI.INTERFACE_TRANSACTION_ID "Interface Transaction ID",
    RTI.PROCESSING_STATUS_CODE "Processing Status",
    RTI.PROCESSING_MODE_CODE "Processing Mode",
    RTI.TRANSACTION_STATUS_CODE "Transaction Status",
    RTI.TRANSACTION_TYPE "Transaction Type",
    RTI.TRANSACTION_DATE "Transaction Date",
    RTI.OE_ORDER_HEADER_ID "Order header Id",
    OOH.ORDER_NUMBER "Order Number",
    OOH.ORG_ID "Oper Unit Id",
    RTI.OE_ORDER_LINE_ID "Order Line Id",
    OOL.INVENTORY_ITEM_ID "Order Item",
    OOL.FLOW_STATUS_CODE "Line Flow Status Code",
    RTI.QUANTITY "Intf Qty",
    RTI.UNIT_OF_MEASURE "Intf UOM",
    OOL.ORDERED_QUANTITY "Qty Ordered",
    OOL.SHIPPED_QUANTITY "Qty Shipped",
    OOL.FULFILLED_QUANTITY "Qty Fulfilled",
    OOL.OPEN_FLAG "Open Flag",
    OOL.CANCELLED_FLAG "Cancelled",
    WH.ORGANIZATION_CODE||' - '||WH.ORGANIZATION_NAME
    "Ship To Organization",
    WH1.ORGANIZATION_CODE||' - '||WH1.ORGANIZATION_NAME
    "Ship From Organization",
    OOL.BOOKED_FLAG "Booked",
    SUBSTR(PIE.ERROR_MESSAGE,1,75) "Error Message"
    FROM RCV_TRANSACTIONS_INTERFACE RTI,
    OE_ORDER_LINES_ALL OOL,
    OE_ORDER_HEADERS_ALL OOH,
    ORG_ORGANIZATION_DEFINITIONS WH,
    ORG_ORGANIZATION_DEFINITIONS WH1,
    PO_INTERFACE_ERRORS PIE
    WHERE OOL.HEADER_ID = RTI.OE_ORDER_HEADER_ID
    AND OOL.LINE_CATEGORY_CODE = 'RETURN'
    AND OOL.LINE_ID = RTI.OE_ORDER_LINE_ID
    AND OOH.HEADER_ID = RTI.OE_ORDER_HEADER_ID
    AND OOL.SHIP_TO_ORG_ID = WH.ORGANIZATION_ID(+)
    AND OOL.SHIP_FROM_ORG_ID = WH1.ORGANIZATION_ID(+)
    AND RTI.INTERFACE_TRANSACTION_ID = PIE.INTERFACE_TRANSACTION_ID(+)
    ORDER BY RTI.INTERFACE_TRANSACTION_ID;
    3. Script to collect additional information on Internal Requisition receipts that have errored:
    SELECT RTI.RECEIPT_SOURCE_CODE "Receipt Source Code",
    RTI.REQUISITION_LINE_ID "Intf Req Line Id",
    RTI.SHIPMENT_HEADER_ID "Ship Hdr Id",
    RTI.SHIPMENT_LINE_ID "Ship Line Id",
    RTI.ITEM_ID "Item Id",
    RTI.QUANTITY "Intf Qty",
    RTI.UNIT_OF_MEASURE "UOM",
    RTI.FROM_ORGANIZATION_ID "From Org Id",
    RTI.TO_ORGANIZATION_ID "To Org id",
    PRH.REQUISITION_HEADER_ID "Req Hdr Id",
    PRH.SEGMENT1 "Req Num",
    PRL.REQUISITION_LINE_ID "Req Line Id",
    NVL(PRL.QUANTITY_DELIVERED,0) "Qty Delv",
    NVL(PRL.QUANTITY_CANCELLED,0) "Qty Canc",
    PRL.QUANTITY "Qty Ordered",
    PRL.QUANTITY - (nvl(PRL.QUANTITY_CANCELLED,0) +
    NVL(PRL.QUANTITY_DELIVERED,0))"Qty Remaining",
    PRL.SOURCE_TYPE_CODE "Source Type",
    PRH.TRANSFERRED_TO_OE_FLAG "XFR to OE Flag",
    NVL(PRL.CANCEL_FLAG,'N') "Cancelled",
    NVL(PRL.CLOSED_CODE,'OPEN') "Closed Code",
    LIN.LINE_ID "ISO Line Id",
    NVL(LIN.ORDERED_QUANTITY,0) "ISO Line Qty",
    NVL(LIN.SHIPPED_QUANTITY,0) "ISO Ship Qty",
    NVL(FULFILLED_QUANTITY,0) "ISO Fulfilled Qty",
    NVL(LIN.CANCELLED_QUANTITY,0) "ISO Cancelled Qty",
    NVL(LIN.OPEN_FLAG,'N') "ISO Line Open",
    NVL(LIN.BOOKED_FLAG,'N') "ISO Line Booked",
    NVL(LIN.CANCELLED_FLAG,'N') "ISO Line Cancelled",
    OOH.ORDER_NUMBER "ISO Order Number",
    PIE.COLUMN_NAME "Intf Column",
    SUBSTR(PIE.ERROR_MESSAGE,1,75) "Error Message"
    FROM RCV_TRANSACTIONS_INTERFACE RTI,
    PO_REQUISITION_HEADERS_ALL PRH,
    PO_REQUISITION_LINES_ALL PRL,
    PO_INTERFACE_ERRORS PIE,
    OE_ORDER_LINES_ALL LIN,
    OE_ORDER_HEADERS_ALL OOH
    WHERE PRL.REQUISITION_HEADER_ID = PRH.REQUISITION_HEADER_ID
    AND RTI.REQUISITION_LINE_ID = PRL.REQUISITION_LINE_ID
    AND LIN.SOURCE_DOCUMENT_ID = PRH.REQUISITION_HEADER_ID
    AND LIN.SOURCE_DOCUMENT_LINE_ID = PRL.REQUISITION_LINE_ID
    AND LIN.SOURCE_DOCUMENT_TYPE_ID = 10 --Internal Requisition
    AND OOH.HEADER_ID = LIN.HEADER_ID
    AND PIE.INTERFACE_TRANSACTION_ID(+) = RTI.INTERFACE_TRANSACTION_ID
    ORDER by PRH.SEGMENT1;
    Reference Documents
    Note 263368.1

    Any suggestions/advice please!
    Thanks,
    Genoo

  • Issue with Date Comparison

    Hi Buddies,
    I explain the scenario,
    I have a dunning letter statement report which basically runs for the three reminders (1, 2, 3) sent to the customer.
    If reminder 3 runs, I want to display the previous reminder dates of 1 and 2 in my report.
    If reminder 2 runs, I want to display the previous reminder date of 1 in my report.
    I have a date column in the dunning letter history table called Dunning_Date.
    How can we achieve this in SQL? I am getting stuck here; please give your advice on how to do it.
    Regards
    Ram

    Is this really an SQL question, or are you looking for a solution in Reports? If so, you should ask this in the Reports forum.
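    If it is a SQL question, one minimal sketch of the idea uses an analytic function to pull the earlier reminder dates onto every row. The table and column names below are assumptions, not taken from the post:
    -- Hypothetical example: dunning_letter_history(customer_id, reminder_level, dunning_date) is assumed.
    SELECT customer_id,
           reminder_level,
           dunning_date,
           MAX(CASE WHEN reminder_level = 1 THEN dunning_date END)
               OVER (PARTITION BY customer_id) AS reminder1_date,
           MAX(CASE WHEN reminder_level = 2 THEN dunning_date END)
               OVER (PARTITION BY customer_id) AS reminder2_date
      FROM dunning_letter_history;
    When reminder 3 runs, both reminder1_date and reminder2_date are populated; when reminder 2 runs, only reminder1_date is.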

  • Date comparison issue in the report

    Hi Buddies,
    I explain the scenario,
    I have a dunning letter statement report which basically runs for the three reminders (1, 2, 3) sent to the customer.
    If reminder 3 runs, I want to display the previous reminder dates of 1 and 2 in my report.
    If reminder 2 runs, I want to display the previous reminder date of 1 in my report.
    I have a date column in the dunning letter history table called Dunning_Date.
    How can we achieve this in SQL? I am getting stuck here; please give your advice on how to do it.
    Regards
    Ram

    I think your date format differs from the date format stored in the presentation variable, so check the date format in the CASE function you wrote for your variable; that should solve the problem.

  • Data mismatch

    Hi all,
    My project manager showed me a screenshot of a data mismatch between the R/3 side and a BW report. Now I have to analyse where it got stuck. Please give me the steps to follow to find that error; you surely have an idea of all the places where data can get stuck, so please give your suggestions so that I can find the problem easily.
    thanks & regards,
    R. Saravanan

    Hi Saravanan,
    Go to the R/3 side -> transaction RSA3 -> enter the DataSource name -> give some restriction with respect to some characteristics -> extract the data.
    Come to the BW side -> transaction RSA1 -> start checking either the InfoCube contents or the MultiProvider contents -> right-click -> Display data -> give the same restrictions as in RSA3 -> execute.
    Compare the field values to see whether you are getting the same data or not.
    If the data matches here, then there might be some filtering in the query; check that filter and check the data accordingly.
    If it does not match at the cube/MultiProvider level, then compare the data at PSA level; before that, check all the routines/update rules/transformations, because the data might be filtered at any of these levels.
    Hope it helps!
    Regards,
    Pavan

  • Data flows are getting started but not completing successfully while extracting/loading the data

    Hello People,
    We are facing abnormal behavior with the dataflows in the Data Services job.
    Scenario:
    We are extracting the data from the CRM end in parallel. Please refer to the build:
    a. We have 5 main workflows, i.e.:
       => Main WF1 has 6 sub-WFs in it, and each sub-WF has 1 or 2 DFs running in parallel.
       => Main WF2 has 21 DFs and 1 WFa -> with a DF and a WFb. WFb has 1 DF in parallel.
       => Main WF3 has 1 DF in parallel.
       => Main WF4 has 3 DFs in parallel.
       => Main WF5 has 1 WF and a DF in sequence.
    b. Regularly the job works perfectly fine, but sometimes it gets stuck at a DF without any error logs.
    c. The job does not get stuck at a specific dataflow or on a specific day; many times it gets stuck at different DFs.
    d. Observations in the monitor log:
    Dataflow        State      RowCnt    LT        AT
    +DF1/ZABAPDF    PROCEED    234000    8.113     394.164
    /DF1/Query      PROCEED    234000    8.159     394.242
    -DF1/Query_2    PROCEED    234000    8.159     394.242
    Where LT: Lapse Time and AT: Absolute time
    If you check the monitor log, the state of dataflow DF1 remains PROCEED till the end; ideally it should complete.
    In successful jobs, the status for DF1 is STOP. This DF takes approx. 2 min to execute.
    The row count for the DF1 extraction is 234,204, but it got stuck at 234,000.
    Then we terminate the job after some time, but surprisingly it executes successfully the next day.
    e. From analysis of all the failed jobs, the same thing was observed across the different data flows that got stuck during execution. The logic of the data flows is perfectly fine.
    Observations in the Trace log:
    DATAFLOW: Process to execute data flow <DF1> is started.
    DATAFLOW: Data flow <DF1> is started.
    ABAP: ABAP flow <ZABAPDF> is started.
    ABAP: ABAP flow <ZABAPDF> is completed.
    Cache statistics determined that data flow <DF1>
    uses <0>caches with a total size of <0> bytes. This is less than(or equal to) the virtual memory <1609564160> bytes available for caches.
    Statistics is switching the cache type to IN MEMORY.
    DATAFLOW: Data flow <DF1> using IN MEMORY Cache.
    DATAFLOW: <DF1> is completed successfully.
    The final lines of the trace log above (the cache statistics and the completion message) do not appear for the unsuccessful job, but they do appear for the successful one.
    Note: The cache type is pageable cache, DS ver is 3.2.
    Please suggest.
    Regards,
    Santosh

    Hi Santosh,
    just a wild guess.
    Would you be able to replicate all the DFs/WFs, delete the original DFs/WFs, rename the replicated objects back to the original DF/WF names (for convenience) and execute the job?
    Sometimes the references do not work properly.
    Hope this works.
    Regards,
    Shiva Sahu

  • How to create multiple editable PDFs from data merged Indesign file

    Hi
    This is quite complicated but I'm hoping someone can help me and hoping I can explain it too!
    I am working on a project for a client. She wants to send 150 people a 6 page PDF with fields to complete and return to her.
    She has an Excel spreadsheet with some of the fields completed already so these will be Data Merged into InDesign CS6.
    The short version of the question goes like this:
    Can you automate the process so that once all the data is in the InDesign file, it can then create a PDF every 6 pages all in one go?
    For those still interested, the long version of the question goes like this:
    I have created 3 pages in InDesign with fields to complete and used the 'Data Merge' facility to add the Excel info.
    I have created 3 more pages with blank fields to fill in.
    Then I have made all fields editable using the 'Buttons and Forms' facility in the Interactive menu.
    Then I have created an interactive PDF. Then I have opened that PDF in Adobe Acrobat X Pro and saved it as 'Reader Extended PDF' and 'Enable Additional Features'
    I now have an Editable PDF with half the fields filled in and half blank to be filled in and this can be saved and returned.
    Now, this would be great if it was only one or two. But I need to create 150 different versions, and this is just for the pilot study. If all goes well, this will go to 5,000 people and I can't be creating 5,000 separate PDFs!
    At this stage, I am just focusing on the 150. So again, going back to the original question: is there a script that could be written to break up the first PDF every 6 pages into 150 different PDFs? Is there even a way of extending that script to Save as Reader Extended, etc.? Is there a simple idea that I'm missing? Are there any other ways to do this? I can use 'Create Merged Document' to make 150 x 6 pages in InDesign, so can anything be done from there?
    Any advice greatly appreciated
    Thanks
    G

    Hi Akash,
    1) Pass the three header data records in the header table. 2) Pass all the item data in the item table. Please explain where you are stuck.
    Regards,
    Madhu.
